
Citations of:

Entropy and uncertainty

Philosophy of Science 53 (4):467-491 (1986)

  • Modelling mechanisms with causal cycles.Brendan Clarke, Bert Leuridan & Jon Williamson - 2014 - Synthese 191 (8):1-31.
    Mechanistic philosophy of science views a large part of scientific activity as engaged in modelling mechanisms. While science textbooks tend to offer qualitative models of mechanisms, there is increasing demand for models from which one can draw quantitative predictions and explanations. Casini et al. (Theoria 26(1):5–33, 2011) put forward the Recursive Bayesian Networks (RBN) formalism as well suited to this end. The RBN formalism is an extension of the standard Bayesian net formalism, an extension that allows for modelling the hierarchical (...)
    24 citations
  • Degeneration and Entropy.Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  • Foundations of Probability.Rachael Briggs - 2015 - Journal of Philosophical Logic 44 (6):625-640.
    The foundations of probability are viewed through the lens of the subjectivist interpretation. This article surveys conditional probability, arguments for probabilism, probability dynamics, and the evidential and subjective interpretations of probability.
    4 citations
  • Two dogmas of strong objective bayesianism.Prasanta S. Bandyopadhyay & Gordon Brittan - 2010 - International Studies in the Philosophy of Science 24 (1):45-65.
    We introduce a distinction, unnoticed in the literature, between four varieties of objective Bayesianism. What we call 'strong objective Bayesianism' is characterized by two claims: that all scientific inference is 'logical' and that, given the same background information, two agents will ascribe a unique probability to their priors. We think that neither of these claims can be sustained; in this sense, they are 'dogmatic'. The first fails to recognize that some scientific inference, in particular that concerning evidential relations, is (...)
    5 citations
  • A Theory of Epistemic Risk.Boris Babic - 2019 - Philosophy of Science 86 (3):522-550.
    I propose a general alethic theory of epistemic risk according to which the riskiness of an agent’s credence function encodes her relative sensitivity to different types of graded error. After motivating and mathematically developing this approach, I show that the epistemic risk function is a scaled reflection of expected inaccuracy. This duality between risk and information enables us to explore the relationship between attitudes to epistemic risk, the choice of scoring rules in epistemic utility theory, and the selection of priors (...)
    13 citations
  • Not enough there there evidence, reasons, and language independence.Michael G. Titelbaum - 2010 - Philosophical Perspectives 24 (1):477-528.
    Begins by explaining then proving a generalized language dependence result similar to Goodman's "grue" problem. I then use this result to cast doubt on the existence of an objective evidential favoring relation (such as "the evidence confirms one hypothesis over another," "the evidence provides more reason to believe one hypothesis over the other," "the evidence justifies one hypothesis over the other," etc.). Once we understand what language dependence tells us about evidential favoring, our options are an implausibly strong conception of (...)
    63 citations
  • Probabilistic stability, AGM revision operators and maximum entropy.Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows one to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum (...)
    4 citations
  • The Principal Principle, admissibility, and normal informal standards of what is reasonable.Jürgen Landes, Christian Wallmann & Jon Williamson - 2021 - European Journal for Philosophy of Science 11 (2):1-15.
    This paper highlights the role of Lewis’ Principal Principle and certain auxiliary conditions on admissibility as serving to explicate normal informal standards of what is reasonable. These considerations motivate the presuppositions of the argument that the Principal Principle implies the Principle of Indifference, put forward by Hawthorne et al. They also suggest a line of response to recent criticisms of that argument, due to Pettigrew and to Titelbaum and Hart (2020, 621–632). The paper also shows that related concerns of Hart and (...)
    6 citations
  • Peirce, Pedigree, Probability.Rush T. Stewart & Tom F. Sterkenburg - 2022 - Transactions of the Charles S. Peirce Society 58 (2):138-166.
    An aspect of Peirce’s thought that may still be underappreciated is his resistance to what Levi calls _pedigree epistemology_, to the idea that a central focus in epistemology should be the justification of current beliefs. Somewhat more widely appreciated is his rejection of the subjective view of probability. We argue that Peirce’s criticisms of subjectivism, to the extent they grant such a conception of probability is viable at all, revert back to pedigree epistemology. A thoroughgoing rejection of pedigree in the (...)
  • Objective Bayesianism, Bayesian conditionalisation and voluntarism.Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)
    18 citations
  • From Bayesian epistemology to inductive logic.Jon Williamson - 2013 - Journal of Applied Logic 11 (4):468-486.
    Inductive logic admits a variety of semantics (Haenni et al., 2011, Part 1). This paper develops semantics based on the norms of Bayesian epistemology (Williamson, 2010, Chapter 7). §1 introduces the semantics and then, in §2, the paper explores methods for drawing inferences in the resulting logic and compares the methods of this paper with the methods of Barnett and Paris (2008). §3 then evaluates this Bayesian inductive logic in the light of four traditional critiques of inductive logic, arguing (i) (...)
    5 citations
  • Goals and the Informativeness of Prior Probabilities.Olav Benjamin Vassend - 2017 - Erkenntnis:1-24.
    I argue that information is a goal-relative concept for Bayesians. More precisely, I argue that how much information is provided by a piece of evidence depends on whether the goal is to learn the truth or to rank actions by their expected utility, and that different confirmation measures should therefore be used in different contexts. I then show how information measures may reasonably be derived from confirmation measures, and I show how to derive goal-relative non-informative and informative priors given background (...)
    1 citation
  • Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
    3 citations
  • Updating Probability: Tracking Statistics as Criterion.Bas C. van Fraassen & Joseph Y. Halpern - 2016 - British Journal for the Philosophy of Science:axv027.
    For changing opinion, represented by an assignment of probabilities to propositions, the criterion proposed is motivated by the requirement that the assignment should have, and maintain, the possibility of matching in some appropriate sense statistical proportions in a population. This ‘tracking’ criterion implies limitations on policies for updating in response to a wide range of types of new input. Satisfying the criterion is shown equivalent to the principle that the prior must be a convex combination of the possible posteriors. (...)
    4 citations
  • The Brandeis Dice Problem and Statistical Mechanics.Steven J. van Enk - 2014 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 48 (1):1-6.
  • Probability in Classical Statistical Mechanics.J. H. van Lith - 2003 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 34 (1):143-150.
    2 citations
  • Can the maximum entropy principle be explained as a consistency requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
    27 citations
  • The constraint rule of the maximum entropy principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
    21 citations
  • The Application of Constraint Semantics to the Language of Subjective Uncertainty.Eric Swanson - 2016 - Journal of Philosophical Logic 45 (2):121-146.
    This paper develops a compositional, type-driven constraint semantic theory for a fragment of the language of subjective uncertainty. In the particular application explored here, the interpretation function of constraint semantics yields not propositions but constraints on credal states as the semantic values of declarative sentences. Constraints are richer than propositions in that constraints can straightforwardly represent assessments of the probability that the world is one way rather than another. The richness of constraints helps us model communicative acts in essentially the (...)
    49 citations
  • Learning and Pooling, Pooling and Learning.Rush T. Stewart & Ignacio Ojea Quintana - 2018 - Erkenntnis 83 (3):1-21.
    We explore which types of probabilistic updating commute with convex IP pooling. Positive results are stated for Bayesian conditionalization, imaging, and a certain parameterization of Jeffrey conditioning. This last observation is obtained with the help of a slight generalization of a characterization of externally Bayesian pooling operators due to Wagner (2009, 336–345). These results strengthen the case that pooling should go by imprecise probabilities since no precise pooling method is as versatile.
    2 citations
  • A unifying framework of probabilistic reasoning: Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler and Jon Williamson: Probabilistic logic and probabilistic networks. Dordrecht: Springer, 2011, xiii+155pp, €59.95 HB. [REVIEW]Jan Sprenger - 2011 - Metascience 21 (2):459-462.
    Book review. DOI 10.1007/s11016-011-9573-x.
  • A contrast between two decision rules for use with (convex) sets of probabilities: Γ-maximin versus E-admissibility.T. Seidenfeld - 2004 - Synthese 140 (1-2):69-88.
    26 citations
  • Identification in Games: Changing Places.Darrell Patrick Rowbottom - 2012 - Erkenntnis 77 (2):197-206.
    This paper offers a novel ‘changing places’ account of identification in games, where the consequences of role swapping are crucial. First, it illustrates how such an account is consistent with the view, in classical game theory, that only outcomes (and not pathways) are significant. Second, it argues that this account is superior to the ‘pooled resources’ alternative when it comes to dealing with some situations in which many players identify. Third, it shows how such a ‘changing places’ account can be (...)
    1 citation
  • Précis and replies to contributors for book symposium on accuracy and the laws of credence.Richard Pettigrew - 2017 - Episteme 14 (1):1-30.
    This book symposium on Accuracy and the Laws of Credence consists of an overview of the book’s argument by the author, Richard Pettigrew, together with four commentaries on different aspects of that argument. Ben Levinstein challenges the characterisation of the legitimate measures of inaccuracy that plays a central role in the arguments of the book. Julia Staffel asks whether the arguments of the book are compatible with an ontology of doxastic states that includes full beliefs as well as credences. Fabrizio Cariani raises (...)
    5 citations
  • Uncertainty, credal sets and second order probability.Jonas Clausen Mork - 2013 - Synthese 190 (3):353-378.
    The last 20 years or so has seen an intense search carried out within Dempster–Shafer theory, with the aim of finding a generalization of the Shannon entropy for belief functions. In that time, there has also been much progress made in credal set theory—another generalization of the traditional Bayesian epistemic representation—albeit not in this particular area. In credal set theory, sets of probability functions are utilized to represent the epistemic state of rational agents instead of the single probability function of (...)
    1 citation
  • Regression to the Mean and Judy Benjamin.Randall G. McCutcheon - 2020 - Synthese 197 (3):1343-1355.
    Van Fraassen's Judy Benjamin problem asks how one ought to update one's credence in A upon receiving evidence of the sort "A may or may not obtain, but B is k times likelier than C", where {A,B,C} is a partition. Van Fraassen's solution, in the limiting case of increasing k, recommends a posterior for A converging to P(A | A ∪ B), where P is one's prior probability function. Grove and Halpern, and more recently Douven and Romeijn, have (...)
  • Determining Maximal Entropy Functions for Objective Bayesian Inductive Logic.Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2022 - Journal of Philosophical Logic 52 (2):555-608.
    According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a limit in (...)
    1 citation
  • Non-Measurability, Imprecise Credences, and Imprecise Chances.Yoaav Isaacs, Alan Hájek & John Hawthorne - 2021 - Mind 131 (523):892-916.
    We offer a new motivation for imprecise probabilities. We argue that there are propositions to which precise probability cannot be assigned, but to which imprecise probability can be assigned. In such cases the alternative to imprecise probability is not precise probability, but no probability at all. And an imprecise probability is substantially better than no probability at all. Our argument is based on the mathematical phenomenon of non-measurable sets. Non-measurable propositions cannot receive precise probabilities, but there is a natural (...)
    3 citations
  • On Indeterminate Updating of Credences.Leendert Huisman - 2014 - Philosophy of Science 81 (4):537-557.
    The strategy of updating credences by minimizing the relative entropy has been questioned by many authors, most strongly by means of the Judy Benjamin puzzle. I present a new analysis of Judy Benjamin–like forms of new information and defend the thesis that in general the rational posterior is indeterminate, meaning that a family of posterior credence functions rather than a single one is the rational response when that type of information becomes available. The proposed thesis extends naturally to all cases (...)
    2 citations
  • Indifference to Anti-Humean Chances.J. Dmitri Gallow - 2022 - Canadian Journal of Philosophy 52 (5):485-501.
    An indifference principle says that your credences should be distributed uniformly over each of the possibilities you recognise. A chance deference principle says that your credences should be aligned with the chances. My thesis is that, if we are anti-Humeans about chance, then these two principles are incompatible. Anti-Humeans think that it is possible for the actual frequencies to depart from the chances. So long as you recognise possibilities like this, you cannot both spread your credences evenly and defer to (...)
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
    6 citations
  • Are non-accidental regularities a cosmic coincidence? Revisiting a central threat to Humean laws.Aldo Filomeno - 2019 - Synthese 198 (6):5205-5227.
    If the laws of nature are as the Humean believes, it is an unexplained cosmic coincidence that the actual Humean mosaic is as extremely regular as it is. This is a strong and well-known objection to the Humean account of laws. Yet, as reasonable as this objection may seem, it is nowadays sometimes dismissed. The reason: its unjustified implicit assignment of equiprobability to each possible Humean mosaic; that is, its assumption of the principle of indifference, which has been attacked on (...)
    11 citations
  • Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions (...)
    21 citations
  • Philosophy of statistical mechanics.Lawrence Sklar - 2008 - Stanford Encyclopedia of Philosophy.
    10 citations
  • Interpretations of probability.Alan Hájek - 2007 - Stanford Encyclopedia of Philosophy.
    159 citations
  • Objective Bayesianism and the maximum entropy principle.Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism (...)
    17 citations
  • A field guide to recent work on the foundations of statistical mechanics.Roman Frigg - 2008 - In Dean Rickles (ed.), The Ashgate Companion to Contemporary Philosophy of Physics. London, U.K.: Ashgate. pp. 99-196.
    This is an extensive review of recent work on the foundations of statistical mechanics.
    92 citations
  • Maximum Entropy and Probability Kinematics Constrained by Conditionals.Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in (...)
    1 citation
  • What the "Equal Weight View" is.Randall G. McCutcheon - manuscript
    Dawid, DeGroot and Mortera showed, a quarter century ago, that any agent who regards a fellow agent as a peer--in particular, defers to the fellow agent's prior credences in the same way that she defers to her own--and updates by split-the-difference is prone to diachronic incoherence. On the other hand one may show that there are special scenarios in which Bayesian updating approximates difference splitting, so it is an important question whether difference splitting remains a viable response to "generic" peer update. (...)
  • Symmetry and evidential support.Michael G. Titelbaum - 2011 - Symmetry 3 (3):680-698.
    2 citations
  • Precise Credences.Michael Titelbaum - 2019 - In Richard Pettigrew & Jonathan Weisberg (eds.), The Open Handbook of Formal Epistemology. PhilPapers Foundation. pp. 1-55.
    10 citations
  • Essay review: Probability in classical statistical physics.Janneke van Lith - 2001 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33:143–50.
    Review article of Y.M. Guttmann, The Concept of Probability in Statistical Physics, Cambridge: Cambridge University Press, 1999.