  • The constraint rule of the maximum entropy principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...) (A worked sketch of this constraint rule follows at the end of this list.)
  • On Jaynes’s Unbelievably Short Proof of the Second Law. Daniel Parker - 2011 - Philosophy of Science 78 (5):1058-1069.
    This paper investigates Jaynes’s “unbelievably short proof” of the second law of thermodynamics. It assesses published criticisms of the proof and concludes that these criticisms miss the mark by demanding results that either import expectations of a proof not consistent with an information-theoretic approach, or would require assumptions not employed in the proof itself, as it looks only to establish a weaker conclusion. Finally, a weakness in the proof is identified and illustrated. This weakness stems from the fact that Jaynes’s (...) (A schematic reconstruction of the proof follows at the end of this list.)
  • Information-Theoretic Statistical Mechanics without Landauer’s Principle. Daniel Parker - 2011 - British Journal for the Philosophy of Science 62 (4):831-856.
    This article distinguishes two different senses of information-theoretic approaches to statistical mechanics that are often conflated in the literature: those relating to the thermodynamic cost of computational processes and those that offer an interpretation of statistical mechanics where the probabilities are treated as epistemic. This distinction is then investigated through Earman and Norton’s ([1999]) ‘sound’ and ‘profound’ dilemma for information-theoretic exorcisms of Maxwell’s demon. It is argued that Earman and Norton fail to countenance a ‘sound’ information-theoretic interpretation, and this paper (...)
  • Entropy - A Guide for the Perplexed. Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions (...) (A compact list of some standard entropy definitions follows at the end of this list.)
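For readers who want the formal backdrop to Uffink's entry, here is a minimal sketch of the maximum entropy principle together with the usual constraint rule, in standard textbook notation (the symbols are ours, not Uffink's). Given possible outcomes x_1, ..., x_n, a quantity f, and an observed sample average f-bar, the constraint rule equates the expectation value of f with the sample average, and one maximizes the Shannon entropy subject to that constraint:

```latex
% Maximize the Shannon entropy subject to normalization and the
% constraint-rule condition (expectation of f = observed average):
\[
  \max_{p}\; H(p) = -\sum_{i} p_i \ln p_i
  \quad \text{subject to} \quad
  \sum_{i} p_i = 1, \qquad \sum_{i} p_i\, f(x_i) = \bar{f}.
\]
% Lagrange multipliers give the familiar exponential-family solution,
% with the multiplier \lambda fixed by the data via the last equation:
\[
  p_i = \frac{e^{-\lambda f(x_i)}}{Z(\lambda)}, \qquad
  Z(\lambda) = \sum_{i} e^{-\lambda f(x_i)}, \qquad
  -\frac{\partial \ln Z}{\partial \lambda} = \bar{f}.
\]
```

Uffink's discussion, on this notation, concerns the status of the step that licenses setting the theoretical expectation value equal to the empirical average in the first place.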
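As a schematic reconstruction of the argument Parker discusses (our notation, not Parker's), Jaynes's short proof can be read as a chain of two equalities and one inequality relating the experimental entropy S_e to the Gibbs entropy S_G:

```latex
\[
  S_e(t_0) \;=\; S_G(t_0) \;=\; S_G(t_1) \;\le\; S_e(t_1).
\]
```

The first equality holds because the initial equilibrium distribution maximizes S_G subject to the measured macroscopic constraints; the second because Liouville's theorem keeps S_G constant under Hamiltonian evolution; the final inequality because S_e(t_1) is the maximum of S_G over all distributions compatible with the final macrostate. The resulting conclusion, that S_e(t_1) is at least S_e(t_0) between equilibrium states, is arguably the "weaker conclusion" at issue in the entry above.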
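Finally, as a compact reminder of four of the notions Frigg and Werndl compare (standard textbook definitions; the selection is ours):

```latex
\[
  \begin{aligned}
    &\text{Clausius (thermodynamic):} \quad && dS = \frac{\delta Q_{\mathrm{rev}}}{T},\\
    &\text{Boltzmann:} && S_B = k_B \ln W,\\
    &\text{Gibbs:} && S_G = -k_B \int \rho \ln \rho \, d\Gamma,\\
    &\text{Shannon:} && H = -\sum_i p_i \log p_i.
  \end{aligned}
\]
```

Only the last two are defined directly in terms of probabilities, which is part of the "complicated picture" the chapter maps out.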