  • Uniform probability in cosmology. Sylvia Wenmackers - 2023 - Studies in History and Philosophy of Science Part A 101 (C):48-60.
  • The gradation puzzle of intellectual assurance. Xiaoxing Zhang - 2021 - Analysis 81 (3):488-496.
    The Cartesian thesis that some justifications are infallible faces a gradation puzzle. On the one hand, infallible justification tolerates absolutely no possibility for error. On the other hand, infallible justifications can vary in evidential force: e.g. two persons can both be infallible regarding their pains while the one with stronger pain is nevertheless more justified. However, if a type of justification is gradable in strength, why can it always be absolute? This paper explores the potential of this gradation challenge by (...)
  • Conglomerability, disintegrability and the comparative principle. Rush T. Stewart & Michael Nielsen - 2021 - Analysis 81 (3):479-488.
    Our aim here is to present a result that connects some approaches to justifying countable additivity. This result allows us to better understand the force of a recent argument for countable additivity due to Easwaran. We have two main points. First, Easwaran’s argument in favour of countable additivity should have little persuasive force on those permissive probabilists who have already made their peace with violations of conglomerability. As our result shows, Easwaran’s main premiss – the comparative principle – is strictly (...)
  • A Dutch Book Theorem and Converse Dutch Book Theorem for Kolmogorov Conditionalization. Michael Rescorla - 2018 - Review of Symbolic Logic 11 (4):705-735.
    This paper discusses how to update one’s credences based on evidence that has initial probability 0. I advance a diachronic norm, Kolmogorov Conditionalization, that governs credal reallocation in many such learning scenarios. The norm is based upon Kolmogorov’s theory of conditional probability. I prove a Dutch book theorem and converse Dutch book theorem for Kolmogorov Conditionalization. The two theorems establish Kolmogorov Conditionalization as the unique credal reallocation rule that avoids a sure loss in the relevant learning scenarios.
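The Dutch book device invoked in the Rescorla entry above can be illustrated in its simplest synchronic form, where an agent's credences in a proposition and its negation sum to more than 1. A minimal sketch, with all numbers illustrative:

```python
# Illustration of a synchronic Dutch book: an agent whose credences
# violate the probability axioms accepts a set of bets that guarantees
# a sure loss. The numbers below are purely illustrative.
def bet_payoff(stake, credence, outcome):
    """Agent pays stake * credence for a bet that pays stake if outcome holds."""
    return (stake if outcome else 0.0) - stake * credence

# Incoherent credences: P(A) = 0.6 and P(not-A) = 0.6 (they sum to 1.2).
totals = []
for a_true in (True, False):
    total = bet_payoff(1.0, 0.6, a_true) + bet_payoff(1.0, 0.6, not a_true)
    totals.append(total)
# Whatever the outcome, the agent's net payoff is -0.2: a sure loss.
```

The diachronic Dutch books of the entry above concern betting across an update, but the sure-loss structure is the same.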
  • The Maxim of Probabilism, with special regard to Reichenbach. Miklós Rédei & Zalán Gyenis - 2021 - Synthese 199 (3-4):8857-8874.
    It is shown that by realizing the isomorphism features of the frequency and geometric interpretations of probability, Reichenbach comes very close to the idea of identifying mathematical probability theory with measure theory in his 1949 work on foundations of probability. Some general features of Reichenbach’s axiomatization of probability theory are pointed out as likely obstacles that prevented him making this conceptual move. The role of isomorphisms of Kolmogorovian probability measure spaces is specified in what we call the “Maxim of Probabilism”, (...)
  • Having a look at the Bayes Blind Spot. Miklós Rédei & Zalán Gyenis - 2019 - Synthese 198 (4):3801-3832.
    The Bayes Blind Spot of a Bayesian Agent is, by definition, the set of probability measures on a Boolean σ-algebra that are absolutely continuous with respect to the background probability measure of a Bayesian Agent on the algebra and which the Bayesian Agent cannot learn by a single conditionalization no matter what evidence he has about the elements in the Boolean σ (...)
  • You say you want a revolution: two notions of probabilistic independence. Alexander Meehan - 2021 - Philosophical Studies 178 (10):3319-3351.
    Branden Fitelson and Alan Hájek have suggested that it is finally time for a “revolution” in which we jettison Kolmogorov’s axiomatization of probability, and move to an alternative like Popper’s. According to these authors, not only did Kolmogorov fail to give an adequate analysis of conditional probability, he also failed to give an adequate account of another central notion in probability theory: probabilistic independence. This paper defends Kolmogorov, with a focus on this independence charge. I show that Kolmogorov’s sophisticated theory (...)
  • The Borel-Kolmogorov Paradox Is Your Paradox Too: A Puzzle for Conditional Physical Probability. Alexander Meehan & Snow Zhang - 2021 - Philosophy of Science 88 (5):971-984.
    The Borel-Kolmogorov paradox is often presented as an obscure problem that certain mathematical accounts of conditional probability must face. In this article, we point out that the paradox arises in the physical sciences, for physical probability or chance. By carefully formulating the paradox in this setting, we show that it is a puzzle for everyone, regardless of one’s preferred probability formalism. We propose a treatment that is inspired by the approach that scientists took when confronted with these cases.
  • Jeffrey Meets Kolmogorov: A General Theory of Conditioning. Alexander Meehan & Snow Zhang - 2020 - Journal of Philosophical Logic 49 (5):941-979.
    Jeffrey conditionalization is a rule for updating degrees of belief in light of uncertain evidence. It is usually assumed that the partitions involved in Jeffrey conditionalization are finite and only contain positive-credence elements. But there are interesting examples, involving continuous quantities, in which this is not the case. Q1 Can Jeffrey conditionalization be generalized to accommodate continuous cases? Meanwhile, several authors, such as Kenny Easwaran and Michael Rescorla, have been interested in Kolmogorov’s theory of regular conditional distributions as a possible (...)
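The finite, positive-credence form of Jeffrey conditionalization that the entry above generalizes can be sketched directly. This is a minimal illustration of the standard rule, not the continuous generalization the paper develops; all names and numbers are illustrative:

```python
# Finite Jeffrey conditionalization: redistribute credence so that each
# partition cell gets its new probability, while credences within a cell
# keep their prior proportions.
def jeffrey_update(prior, partition, new_cell_probs):
    """prior: dict mapping worlds to probabilities.
    partition: list of disjoint sets of worlds with positive prior mass.
    new_cell_probs: the new credence assigned to each partition cell."""
    posterior = {}
    for cell, q in zip(partition, new_cell_probs):
        mass = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] = prior[w] / mass * q
    return posterior

# Four equiprobable worlds; uncertain evidence shifts the cell {1, 2}
# to credence 0.7 and the cell {3, 4} to credence 0.3.
prior = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
post = jeffrey_update(prior, [{1, 2}, {3, 4}], [0.7, 0.3])
# post[1] == 0.35 and post[3] == 0.15, and the posterior sums to 1.
```

When a cell's new probability is 1, the rule collapses to ordinary conditionalization on that cell.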
  • Kolmogorov Conditionalizers Can Be Dutch Booked. Alexander Meehan & Snow Zhang - forthcoming - Review of Symbolic Logic:1-36.
    A vexing question in Bayesian epistemology is how an agent should update on evidence which she assigned zero prior credence. Some theorists have suggested that, in such cases, the agent should update by Kolmogorov conditionalization, a norm based on Kolmogorov’s theory of regular conditional distributions. However, it turns out that in some situations, a Kolmogorov conditionalizer will plan to always assign a posterior credence of zero to the evidence she learns. Intuitively, such a plan is irrational and easily Dutch bookable. (...)
  • Non-Measurability, Imprecise Credences, and Imprecise Chances. Yoaav Isaacs, Alan Hájek & John Hawthorne - 2021 - Mind 131 (523):892-916.
    We offer a new motivation for imprecise probabilities. We argue that there are propositions to which precise probability cannot be assigned, but to which imprecise probability can be assigned. In such cases the alternative to imprecise probability is not precise probability, but no probability at all. And an imprecise probability is substantially better than no probability at all. Our argument is based on the mathematical phenomenon of non-measurable sets. Non-measurable propositions cannot receive precise probabilities, but there is a natural (...)
  • On the Modal Logic of Jeffrey Conditionalization. Zalán Gyenis - 2018 - Logica Universalis 12 (3-4):351-374.
    We continue the investigations initiated in the recent papers where Bayes logics have been introduced to study the general laws of Bayesian belief revision. In Bayesian belief revision a Bayesian agent revises his prior belief by conditionalizing the prior on some evidence using the Bayes rule. In this paper we take the more general Jeffrey formula as a conditioning device and study the corresponding modal logics that we call Jeffrey logics, focusing mainly on the countable case. The containment relations among (...)
  • Conditioning using conditional expectations: the Borel–Kolmogorov Paradox. Zalán Gyenis, Gabor Hofer-Szabo & Miklós Rédei - 2017 - Synthese 194 (7):2595-2630.
    The Borel–Kolmogorov Paradox is typically taken to highlight a tension between our intuition that certain conditional probabilities with respect to probability zero conditioning events are well defined and the mathematical definition of conditional probability by Bayes’ formula, which loses its meaning when the conditioning event has probability zero. We argue in this paper that the theory of conditional expectations is the proper mathematical device to conditionalize and that this theory allows conditionalization with respect to probability zero events. The conditional probabilities (...)
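In the finite case, the conditional-expectation approach to conditioning defended in the entry above reduces to the familiar ratio formula on positive-measure cells: E[1_A | F] is constant on each cell of the generating partition. A minimal sketch under that finite assumption, with illustrative names and numbers:

```python
# Finite-case sketch of conditioning via conditional expectation. For a
# sigma-algebra generated by a finite partition, E[1_event | partition]
# is constant on each cell; on positive-measure cells it equals the
# ratio-form conditional probability P(event & cell) / P(cell).
def cond_expectation_indicator(prob, partition, event):
    """Return E[1_event | partition] as a dict mapping world -> value."""
    values = {}
    for cell in partition:
        mass = sum(prob[w] for w in cell)
        val = sum(prob[w] for w in cell & event) / mass if mass > 0 else 0.0
        for w in cell:
            values[w] = val
    return values

prob = {1: 0.5, 2: 0.25, 3: 0.25}
f = cond_expectation_indicator(prob, [{1, 2}, {3}], {1})
# On the cell {1, 2}, f takes the value 0.5 / 0.75, i.e. ordinary P(A|B).
```

The interest of the paper lies in the measure-zero cells that this finite picture cannot reach; there the conditional expectation is fixed only almost everywhere, which is the source of the Borel–Kolmogorov Paradox.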
  • General properties of Bayesian learning as statistical inference determined by conditional expectations. Zalán Gyenis & Miklós Rédei - 2017 - Review of Symbolic Logic 10 (4):719-755.
    We investigate the general properties of general Bayesian learning, where “general Bayesian learning” means inferring a state from another that is regarded as evidence, and where the inference is conditionalizing the evidence using the conditional expectation determined by a reference probability measure representing the background subjective degrees of belief of a Bayesian Agent performing the inference. States are linear functionals that encode probability measures by assigning expectation values to random variables via integrating them with respect to the probability measure. If (...)
  • Bayesian Epistemology. William Talbott - 2006 - Stanford Encyclopedia of Philosophy.
    ‘Bayesian epistemology’ became an epistemological movement in the 20th century, though its two main features can be traced back to the eponymous Reverend Thomas Bayes (c. 1701-61). Those two features are: (1) the introduction of a formal apparatus for inductive logic; (2) the introduction of a pragmatic self-defeat test (as illustrated by Dutch Book Arguments) for epistemic rationality as a way of extending the justification of the laws of deductive logic to include a justification for the laws of inductive logic. (...)
  • Conditional Probabilities. Kenny Easwaran - 2019 - In Richard Pettigrew & Jonathan Weisberg (eds.), The Open Handbook of Formal Epistemology. PhilPapers Foundation. pp. 131-198.