References
  • Bertrand’s Paradox and the Principle of Indifference.Nicholas Shackel - 2024 - Abingdon: Routledge.
    Events between which we have no epistemic reason to discriminate have equal epistemic probabilities. Bertrand’s chord paradox, however, appears to show this to be false, and thereby poses a general threat to probabilities for continuum sized state spaces. Articulating the nature of such spaces involves some deep mathematics and that is perhaps why the recent literature on Bertrand’s Paradox has been almost entirely from mathematicians and physicists, who have often deployed elegant mathematics of considerable sophistication. At the same time, the (...)
  • Bertrand's Paradox and the Maximum Entropy Principle.Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that is unique (...)
  • The constraint rule of the maximum entropy principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
  • Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions (...)
  • Philosophy of statistical mechanics.Lawrence Sklar - 2008 - Stanford Encyclopedia of Philosophy.
  • A field guide to recent work on the foundations of statistical mechanics.Roman Frigg - 2008 - In Dean Rickles (ed.), The Ashgate Companion to Contemporary Philosophy of Physics. Ashgate. pp. 99-196.
    This is an extensive review of recent work on the foundations of statistical mechanics.
  • Can the maximum entropy principle be explained as a consistency requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • Objective Bayesianism, Bayesian conditionalisation and voluntarism.Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)