
Citations of:

Bayesian rules of updating

Erkenntnis 45 (2-3):195-208 (1996)

  • Bayesian Belief Revision Based on Agent’s Criteria. Yongfeng Yuan - 2021 - Studia Logica 109 (6):1311-1346.
    In the literature of belief revision, it is widely accepted that there is only one revision phase in belief revision, which is well characterized by Bayes’ Rule, Jeffrey’s Rule, etc. However, as I argue in this article, there are at least four successive phases in belief revision, namely first/second order evaluation and first/second order revision. To characterize these phases, I propose mainly four rules of belief revision based on agent’s criteria, and make one composition rule to characterize belief revision (...)
  • Having a look at the Bayes Blind Spot. Miklós Rédei & Zalán Gyenis - 2019 - Synthese 198 (4):3801-3832.
    The Bayes Blind Spot of a Bayesian Agent is, by definition, the set of probability measures on a Boolean σ-algebra that are absolutely continuous with respect to the background probability measure of a Bayesian Agent on the algebra and which the Bayesian Agent cannot learn by a single conditionalization no matter what evidence he has about the elements in the Boolean σ (...)
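The definition quoted above can be made concrete on a toy finite outcome space. The sketch below is an illustration of my own (made-up numbers, not the paper's construction): a single conditionalization can only carry the prior P to P(·|E) for an event E of positive prior probability, so only finitely many posteriors are reachable, while the measures absolutely continuous with respect to P form a continuum; everything unreachable lies in the Blind Spot.

```python
from fractions import Fraction
from itertools import chain, combinations

# Toy finite-space illustration of the Bayes Blind Spot (assumed example, not
# the paper's construction): enumerate every posterior a single Bayesian
# conditionalization can produce from a fixed prior.

outcomes = ["w1", "w2", "w3"]
prior = {"w1": Fraction(1, 2), "w2": Fraction(1, 3), "w3": Fraction(1, 6)}

def events(space):
    """All non-empty subsets of the outcome space."""
    return chain.from_iterable(combinations(space, r) for r in range(1, len(space) + 1))

def conditionalize(p, event):
    """Posterior P(.|E); defined only when P(E) > 0."""
    p_e = sum(p[w] for w in event)
    return {w: (p[w] / p_e if w in event else Fraction(0)) for w in p}

reachable = []
for e in events(outcomes):
    if sum(prior[w] for w in e) > 0:
        posterior = conditionalize(prior, e)
        if posterior not in reachable:
            reachable.append(posterior)

print(f"{len(reachable)} posteriors reachable by one conditionalization")
# The measure (2/5, 2/5, 1/5) has full support, hence is absolutely continuous
# with respect to the prior, yet it is not in `reachable`: it sits in the Blind Spot.
```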
  • General properties of bayesian learning as statistical inference determined by conditional expectations. Zalán Gyenis & Miklós Rédei - 2017 - Review of Symbolic Logic 10 (4):719-755.
    We investigate the general properties of general Bayesian learning, where “general Bayesian learning” means inferring a state from another that is regarded as evidence, and where the inference is conditionalizing the evidence using the conditional expectation determined by a reference probability measure representing the background subjective degrees of belief of a Bayesian Agent performing the inference. States are linear functionals that encode probability measures by assigning expectation values to random variables via integrating them with respect to the probability measure. If (...)
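The conditional expectation that does the updating in the entry above can be shown in miniature. This is a finite-space sketch of my own (the paper's setting is far more general): for a sub-σ-algebra generated by a partition, E[f | G] averages the random variable f over each cell with respect to the reference measure, and conditionalizing on a cell is the special case where f is an indicator function.

```python
from fractions import Fraction

# Finite-space sketch (assumed example, not the paper's general setting) of the
# conditional expectation E[f | G] determined by a reference measure P, where
# the sub-algebra G is generated by a partition of the outcome space.

outcomes = [1, 2, 3, 4, 5, 6]                      # a die roll
P = {w: Fraction(1, 6) for w in outcomes}          # reference probability measure
partition = [{1, 2}, {3, 4}, {5, 6}]               # cells generating G
f = {w: w for w in outcomes}                       # a random variable: the face value

def conditional_expectation(f, P, partition):
    """E[f | G]: a G-measurable function, constant on each cell of the partition."""
    result = {}
    for cell in partition:
        p_cell = sum(P[w] for w in cell)
        cell_value = sum(P[w] * f[w] for w in cell) / p_cell
        for w in cell:
            result[w] = cell_value
    return result

print({w: str(v) for w, v in conditional_expectation(f, P, partition).items()})
# {1: '3/2', 2: '3/2', 3: '7/2', 4: '7/2', 5: '11/2', 6: '11/2'}
# With f the indicator of an event A, the value on a cell E is exactly P(A | E).
```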
  • Bayesian Epistemology and Having Evidence. Jeffrey Dunn - 2010 - Dissertation, University of Massachusetts, Amherst
    Bayesian Epistemology is a general framework for thinking about agents who have beliefs that come in degrees. Theories in this framework give accounts of rational belief and rational belief change, which share two key features: (i) rational belief states are represented with probability functions, and (ii) rational belief change results from the acquisition of evidence. This dissertation focuses specifically on the second feature. I pose the Evidence Question: What is it to have evidence? Before addressing this question we must have (...)
  • Bayesianism and language change. Jon Williamson - 2003 - Journal of Logic, Language and Information 12 (1):53-97.
    Bayesian probability is normally defined over a fixed language or event space. But in practice language is susceptible to change, and the question naturally arises as to how Bayesian degrees of belief should change as language changes. I argue here that this question poses a serious challenge to Bayesianism. The Bayesian may be able to meet this challenge however, and I outline a practical method for changing degrees of belief over changes in finite propositional languages.
  • Bayes and beyond. Geoffrey Hellman - 1997 - Philosophy of Science 64 (2):191-221.
    Several leading topics outstanding after John Earman's Bayes or Bust? are investigated further, with emphasis on the relevance of Bayesian explication in epistemology of science, despite certain limitations. (1) Dutch Book arguments are reformulated so that their independence from utility and preference in epistemic contexts is evident. (2) The Bayesian analysis of the Quine-Duhem problem is pursued; the phenomenon of a "protective belt" of auxiliary statements around reasonably successful theories is explicated. (3) The Bayesian approach to understanding the superiority of (...)
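For reference, the device Hellman's abstract invokes can be shown with textbook numbers. The following is the standard synchronic Dutch Book in miniature (an assumed illustration, not Hellman's reformulation): credences that violate additivity make a package of individually "fair" bets a guaranteed loss.

```python
# Standard synchronic Dutch Book illustration (assumed numbers, not Hellman's
# reformulation): the agent's credences in A and not-A sum to more than 1.

cr_A, cr_not_A = 0.6, 0.6        # incoherent: 0.6 + 0.6 = 1.2 > 1
stake = 1.0                      # each bet pays `stake` if it wins, 0 otherwise

cost = cr_A * stake + cr_not_A * stake   # price the agent deems fair: 1.20
payout = stake                           # exactly one of A, not-A wins: 1.00
print(f"guaranteed net loss: {cost - payout:.2f}")   # 0.20, however the world turns out
```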
  • Indifference, neutrality and informativeness: generalizing the three prisoners paradox. Sergio Wechsler, L. G. Esteves, A. Simonis & C. Peixoto - 2005 - Synthese 143 (3):255-272.
    The uniform prior distribution is often seen as a mathematical description of noninformativeness. This paper uses the well-known Three Prisoners Paradox to examine the impossibility of maintaining noninformativeness throughout hierarchization. The Paradox has been solved by Bayesian conditioning over the choice made by the Warder when asked to name a prisoner who will be shot. We generalize the paradox to situations of N prisoners, k executions and m announcements made by the Warder. We then extend the consequences of hierarchically (...)
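The classical N = 3 case that the generalization above starts from is itself a small Bayesian computation. The sketch below is a worked example of my own (not the paper's N-prisoner, k-execution, m-announcement generalization), under the usual assumption that the Warder picks at random when both of the other prisoners are eligible to be named.

```python
from fractions import Fraction

# Three Prisoners, classical case (assumed example, not the paper's generalization).
# Prisoner A asks the Warder to name one of B, C who will be executed; the
# Warder answers "B".  Conditioning is done on the Warder's announcement,
# not merely on the event "B will be executed".

prior = {p: Fraction(1, 3) for p in "ABC"}   # each prisoner equally likely to be pardoned

def prob_warder_says_B(pardoned):
    """Likelihood of the announcement "B", given who is pardoned.
    If A is pardoned, B and C are both eligible and the Warder picks at random."""
    return {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}[pardoned]

evidence = sum(prior[p] * prob_warder_says_B(p) for p in prior)      # P(announcement) = 1/2
posterior = {p: prior[p] * prob_warder_says_B(p) / evidence for p in prior}
print({p: str(q) for p, q in posterior.items()})
# {'A': '1/3', 'B': '0', 'C': '2/3'} -- A's chance of pardon is unchanged, C's doubles.
```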
  • Logic and probability. Colin Howson - 1997 - British Journal for the Philosophy of Science 48 (4):517-531.
    This paper argues that Ramsey's view of the calculus of subjective probabilities as, in effect, logical axioms is the correct view, with powerful heuristic value. This heuristic value is seen particularly in the analysis of the role of conditionalization in the Bayesian theory, where a semantic criterion of synchronic coherence is employed as the test of soundness, which the traditional formulation of conditionalization fails. On the other hand, there is a generally sound rule which supports conditionalization in appropriate contexts, though (...)
  • The kinematics of belief and desire. Richard Bradley - 2007 - Synthese 156 (3):513-535.
    Richard Jeffrey regarded the version of Bayesian decision theory he floated in ‘The Logic of Decision’ and the idea of a probability kinematics—a generalisation of Bayesian conditioning to contexts in which the evidence is ‘uncertain’—as his two most important contributions to philosophy. This paper aims to connect them by developing kinematical models for the study of preference change and practical deliberation. Preference change is treated in a manner analogous to Jeffrey’s handling of belief change: not as mechanical outputs of combinations (...)
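The probability kinematics referred to above is Jeffrey conditioning: for an evidence partition {E_i} whose cells receive new weights q_i, the updated probability is P_new(X) = Σ_i P_old(X | E_i) · q_i. The sketch below is a toy example of my own (not Bradley's kinematical models for preference change), with ordinary Bayesian conditioning recovered when some q_i = 1.

```python
from fractions import Fraction

# Jeffrey conditioning on a finite space (assumed toy example):
#   P_new(X) = sum_i P_old(X | E_i) * q_i
# for an evidence partition {E_i} with new, uncertain weights q_i.

P_old = {"rain&mud": Fraction(3, 10), "rain&dry": Fraction(1, 10),
         "sun&mud": Fraction(1, 10), "sun&dry": Fraction(5, 10)}

partition = {"rain": ["rain&mud", "rain&dry"], "sun": ["sun&mud", "sun&dry"]}
q = {"rain": Fraction(7, 10), "sun": Fraction(3, 10)}   # uncertain evidence about rain

def jeffrey_update(P, partition, q):
    """Rescale P within each evidence cell so that the cell receives its new weight."""
    P_new = {}
    for cell, members in partition.items():
        p_cell = sum(P[w] for w in members)
        for w in members:
            P_new[w] = P[w] / p_cell * q[cell]
    return P_new

print({w: str(p) for w, p in jeffrey_update(P_old, partition, q).items()})
# Setting q = {"rain": 1, "sun": 0} recovers ordinary conditioning on "rain".
```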
  • Radical probabilism and bayesian conditioning. Richard Bradley - 2005 - Philosophy of Science 72 (2):342-364.
    Richard Jeffrey espoused an antifoundationalist variant of Bayesian thinking that he termed ‘Radical Probabilism’. Radical Probabilism denies both the existence of an ideal, unbiased starting point for our attempts to learn about the world and the dogma of classical Bayesianism that the only justified change of belief is one based on the learning of certainties. Probabilistic judgment is basic and irreducible. Bayesian conditioning is appropriate when interaction with the environment yields new certainty of belief in some proposition but leaves one’s (...)
  • On the Modal Logic of Jeffrey Conditionalization. Zalán Gyenis - 2018 - Logica Universalis 12 (3-4):351-374.
    We continue the investigations initiated in the recent papers where Bayes logics have been introduced to study the general laws of Bayesian belief revision. In Bayesian belief revision a Bayesian agent revises his prior belief by conditionalizing the prior on some evidence using the Bayes rule. In this paper we take the more general Jeffrey formula as a conditioning device and study the corresponding modal logics that we call Jeffrey logics, focusing mainly on the countable case. The containment relations among (...)
  • A Puzzle About Stalnaker’s Hypothesis. Igor Douven & Richard Dietz - 2011 - Topoi 30 (1):31-37.
    According to Stalnaker’s Hypothesis, the probability of an indicative conditional, $\Pr(\varphi \rightarrow \psi)$, equals the probability of the consequent conditional on its antecedent, $\Pr(\psi | \varphi)$. While the hypothesis is generally taken to have been conclusively refuted by Lewis’ and others’ triviality arguments, its descriptive adequacy has been confirmed in many experimental studies. In this paper, we consider some possible ways of resolving the apparent tension between the analytical and the empirical results relating to Stalnaker’s Hypothesis and we argue (...)
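The triviality arguments mentioned above can be stated compactly. The derivation below is a standard simplified reconstruction of Lewis's argument (my summary, not the authors' presentation), assuming Stalnaker's Hypothesis holds both for $\Pr$ and for the functions obtained from $\Pr$ by conditionalizing on $\psi$ and on $\lnot\psi$, with all the relevant probabilities positive:

```latex
\begin{align*}
\Pr(\varphi \rightarrow \psi)
  &= \Pr(\varphi \rightarrow \psi \mid \psi)\,\Pr(\psi)
   + \Pr(\varphi \rightarrow \psi \mid \lnot\psi)\,\Pr(\lnot\psi)
     && \text{(total probability)} \\
  &= \Pr(\psi \mid \varphi \wedge \psi)\,\Pr(\psi)
   + \Pr(\psi \mid \varphi \wedge \lnot\psi)\,\Pr(\lnot\psi)
     && \text{(Hypothesis for the conditionalized functions)} \\
  &= 1 \cdot \Pr(\psi) + 0 \cdot \Pr(\lnot\psi)
   = \Pr(\psi).
\end{align*}
```

Since the Hypothesis also gives $\Pr(\varphi \rightarrow \psi) = \Pr(\psi | \varphi)$, it follows that $\Pr(\psi | \varphi) = \Pr(\psi)$ whenever the relevant conjunctions have positive probability, which only trivial probability functions can satisfy across the board.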
  • Objective Bayesianism, Bayesian conditionalisation and voluntarism. Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)
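The divergence the entry above turns on can be exhibited on a three-point outcome space. The numbers below are a toy illustration of my own (not Williamson's example): re-maximizing entropy over the accumulated constraints and conditionalizing the earlier maximum-entropy function give different belief functions.

```python
from fractions import Fraction

# Toy divergence between objective Bayesian updating (re-maximize entropy over
# all current constraints) and Bayesian conditionalisation (assumed example,
# not Williamson's own).  Outcome space {w1, w2, w3}.

# Evidence at t1: the constraint P(w1) >= 1/2.  Entropy is maximized at the
# boundary, with the remainder spread evenly:
P1 = {"w1": Fraction(1, 2), "w2": Fraction(1, 4), "w3": Fraction(1, 4)}

# Evidence at t2, in addition: E = {w1, w3} is certain.

# (a) Conditionalising P1 on E:
p_E = P1["w1"] + P1["w3"]
conditioned = {"w1": P1["w1"] / p_E, "w2": Fraction(0), "w3": P1["w3"] / p_E}

# (b) Objective Bayesian updating: maximize entropy subject to P(w1) >= 1/2 and
#     P(E) = 1.  On {P(w2) = 0, P(w1) + P(w3) = 1} the maximum is the even split,
#     which already satisfies the first constraint:
maxent = {"w1": Fraction(1, 2), "w2": Fraction(0), "w3": Fraction(1, 2)}

print({w: str(p) for w, p in conditioned.items()})   # {'w1': '2/3', 'w2': '0', 'w3': '1/3'}
print({w: str(p) for w, p in maxent.items()})        # {'w1': '1/2', 'w2': '0', 'w3': '1/2'}
```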
  • General properties of general Bayesian learning. Miklós Rédei & Zalán Gyenis - unknown
    We investigate the general properties of general Bayesian learning, where "general Bayesian learning" means inferring a state from another that is regarded as evidence, and where the inference is conditionalizing the evidence using the conditional expectation determined by a reference probability measure representing the background subjective degrees of belief of a Bayesian Agent performing the inference. States are linear functionals that encode probability measures by assigning expectation values to random variables via integrating them with respect to the probability measure. If (...)
  • Dutch-Book Arguments against using Conditional Probabilities for Conditional Bets. Keith Hutchison - 2012 - Open Journal of Philosophy 2 (3):195.
    We consider here an important family of conditional bets, those that proceed to settlement if and only if some agreed evidence is received that a condition has been met. Despite an opinion widespread in the literature, we observe that when the evidence is strong enough to generate certainty as to whether the condition has been met or not, using traditional conditional probabilities for such bets will NOT preserve a gambler from having a synchronic Dutch Book imposed upon him. On the (...)