References
  • Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26(3): 223–261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • Information Theory and Statistical Mechanics. II. Edwin T. Jaynes - 1957 - Physical Review 108(2): 171.
  • Bell's Theorem without Inequalities. Daniel M. Greenberger, Michael A. Horne, Abner Shimony & Anton Zeilinger - 1990 - American Journal of Physics 58(12): 1131–1143.
  • Probability, Frequency, and Reasonable Expectation. Richard Threlkeld Cox - 1946 - American Journal of Physics 14(2): 1–13.
  • Information theory and statistical mechanics. Edwin T. Jaynes - 1957 - Physical Review 106: 620–630.