  • Learning Conditional Information by Jeffrey Imaging on Stalnaker Conditionals.Mario Günther - 2018 - Journal of Philosophical Logic 47 (5):851-876.
    We propose a method of learning indicative conditional information. An agent learns conditional information by Jeffrey imaging on the minimally informative proposition expressed by a Stalnaker conditional. We show that the predictions of the proposed method align with the intuitions in Douven's (2012) benchmark examples. Jeffrey imaging on Stalnaker conditionals can also capture the learning of uncertain conditional information, which we illustrate by generating predictions for the Judy Benjamin Problem.
  • Ramsey’s conditionals.Mario Günther & Caterina Sisti - 2022 - Synthese 200 (2):1-31.
    In this paper, we propose a unified account of conditionals inspired by Frank Ramsey. Most contemporary philosophers agree that Ramsey’s account applies to indicative conditionals only. We observe against this orthodoxy that his account covers subjunctive conditionals as well—including counterfactuals. In light of this observation, we argue that Ramsey’s account of conditionals resembles Robert Stalnaker’s possible worlds semantics supplemented by a model of belief. The resemblance suggests reinterpreting the notion of conditional degree of belief in order to overcome a (...)
  • Objective Bayesianism, Bayesian conditionalisation and voluntarism.Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)
  • Updating, Undermining, and Independence.Jonathan Weisberg - 2015 - British Journal for the Philosophy of Science 66 (1):121-159.
    Sometimes appearances provide epistemic support that gets undercut later. In an earlier paper I argued that standard Bayesian update rules are at odds with this phenomenon because they are ‘rigid’. Here I generalize and bolster that argument. I first show that the update rules of Dempster–Shafer theory and ranking theory are rigid too, hence also at odds with the defeasibility of appearances. I then rebut three Bayesian attempts to solve the problem. I conclude that defeasible appearances pose a more difficult (...)
  • Commutativity or Holism? A Dilemma for Conditionalizers.Jonathan Weisberg - 2009 - British Journal for the Philosophy of Science 60 (4):793-812.
    Conditionalization and Jeffrey Conditionalization cannot simultaneously satisfy two widely held desiderata on rules for empirical learning. The first desideratum is confirmational holism, which says that the evidential import of an experience is always sensitive to our background assumptions. The second desideratum is commutativity, which says that the order in which one acquires evidence shouldn't affect what conclusions one draws, provided the same total evidence is gathered in the end. (Jeffrey) Conditionalization cannot satisfy either of these desiderata without violating the other. (...)
  • On the Ecological and Internal Rationality of Bayesian Conditionalization and Other Belief Updating Strategies.Olav Benjamin Vassend - forthcoming - British Journal for the Philosophy of Science.
  • Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditionalized event a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
  • Updating Probability: Tracking Statistics as Criterion.Bas C. van Fraassen & Joseph Y. Halpern - 2016 - British Journal for the Philosophy of Science:axv027.
    ABSTRACT For changing opinion, represented by an assignment of probabilities to propositions, the criterion proposed is motivated by the requirement that the assignment should have, and maintain, the possibility of matching in some appropriate sense statistical proportions in a population. This ‘tracking’ criterion implies limitations on policies for updating in response to a wide range of types of new input. Satisfying the criterion is shown equivalent to the principle that the prior must be a convex combination of the possible posteriors. (...)
  • Reflection and conditionalization: Comments on Michael Rescorla.Bas C. van Fraassen - 2023 - Noûs 57 (3):539-552.
    Rescorla explores the relation between Reflection, Conditionalization, and Dutch book arguments in the presence of a weakened concept of sure loss and weakened conditions of self‐transparency for doxastic agents. The literature about Reflection and about Dutch Book arguments, though overlapping, are distinct, and its history illuminates the import of Rescorla's investigation. With examples from a previous debate in the 70s and results about Reflection and Conditionalization in the 80s, I propose a way of seeing the epistemic enterprise in the light (...)
  • Conditional Learning Through Causal Models.Jonathan Vandenburgh - 2020 - Synthese (1-2):2415-2437.
    Conditional learning, where agents learn a conditional sentence ‘If A, then B,’ is difficult to incorporate into existing Bayesian models of learning. This is because conditional learning is not uniform: in some cases, learning a conditional requires decreasing the probability of the antecedent, while in other cases, the antecedent probability stays constant or increases. I argue that how one learns a conditional depends on the causal structure relating the antecedent and the consequent, leading to a causal model of conditional learning. (...)
  • Can the maximum entropy principle be explained as a consistency requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • The constraint rule of the maximum entropy principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation (...)
  • The Application of Constraint Semantics to the Language of Subjective Uncertainty.Eric Swanson - 2016 - Journal of Philosophical Logic 45 (2):121-146.
    This paper develops a compositional, type-driven constraint semantic theory for a fragment of the language of subjective uncertainty. In the particular application explored here, the interpretation function of constraint semantics yields not propositions but constraints on credal states as the semantic values of declarative sentences. Constraints are richer than propositions in that constraints can straightforwardly represent assessments of the probability that the world is one way rather than another. The richness of constraints helps us model communicative acts in essentially the (...)
  • Updating, supposing, and maxent.Brian Skyrms - 1987 - Theory and Decision 22 (3):225-246.
  • Expert deference as a belief revision schema.Joe Roussos - 2020 - Synthese (1-2):1-28.
    When an agent learns of an expert's credence in a proposition about which they are an expert, the agent should defer to the expert and adopt that credence as their own. This is a popular thought about how agents ought to respond to (ideal) experts. In a Bayesian framework, it is often modelled by endowing the agent with a set of priors that achieves this result. But this model faces a number of challenges, especially when applied to non-ideal agents (who (...)
  • Living in a Material World: A Critical Notice of Suppose and Tell: The Semantics and Heuristics of Conditionals by Timothy Williamson.Daniel Rothschild - 2023 - Mind 132 (525):208-233.
    Barristers in England are obliged to follow the ‘cab rank rule’, according to which they must take any case offered to them, as long as they have time in their (...)
  • Peer Disagreement: A Call for the Revision of Prior Probabilities.Sven Rosenkranz & Moritz Schulz - 2015 - Dialectica 69 (4):551-586.
    The current debate about peer disagreement has so far mainly focused on the question of whether peer disagreements provide genuine counterevidence to which we should respond by revising our credences. By contrast, comparatively little attention has been devoted to the question by which process, if any, such revision should be brought about. The standard assumption is that we update our credences by conditionalizing on the evidence that peer disagreements provide. In this paper, we argue that non-dogmatist views have good reasons (...)
  • A new resolution of the Judy Benjamin Problem.Igor Douven & Jan-Willem Romeijn - 2011 - Mind 120 (479):637 - 670.
    A paper on how to adapt your probabilistic beliefs when learning a conditional.
  • Bayesian updating when what you learn might be false.Richard Pettigrew - 2023 - Erkenntnis 88 (1):309-324.
    Rescorla (Erkenntnis, 2020) has recently pointed out that the standard arguments for Bayesian Conditionalization assume that whenever I become certain of something, it is true. Most people would reject this assumption. In response, Rescorla offers an improved Dutch Book argument for Bayesian Conditionalization that does not make this assumption. My purpose in this paper is two-fold. First, I want to illuminate Rescorla’s new argument by giving a very general Dutch Book argument that applies to many cases of updating beyond those (...)
  • Updating as Communication.Sarah Moss - 2012 - Philosophy and Phenomenological Research 85 (2):225-248.
    Traditional procedures for rational updating fail when it comes to self-locating opinions, such as your credences about where you are and what time it is. This paper develops an updating procedure for rational agents with self-locating beliefs. In short, I argue that rational updating can be factored into two steps. The first step uses information you recall from your previous self to form a hypothetical credence distribution, and the second step changes this hypothetical distribution to reflect information you have genuinely (...)
  • Dynamic inference and everyday conditional reasoning in the new paradigm.Mike Oaksford & Nick Chater - 2013 - Thinking and Reasoning 19 (3-4):346-379.
  • Subjunctive Credences and Semantic Humility.Sarah Moss - 2012 - Philosophy and Phenomenological Research 87 (2):251-278.
    This paper argues that several leading theories of subjunctive conditionals are incompatible with ordinary intuitions about what credences we ought to have in subjunctive conditionals. In short, our theory of subjunctives should intuitively display semantic humility, i.e. our semantic theory should deliver the truth conditions of sentences without pronouncing on whether those conditions actually obtain. In addition to describing intuitions about subjunctive conditionals, I argue that we can derive these ordinary intuitions from justified premises, and I answer a possible worry (...)
  • Moral Encroachment.Sarah Moss - 2018 - Proceedings of the Aristotelian Society 118 (2):177-205.
    This paper develops a precise understanding of the thesis of moral encroachment, which states that the epistemic status of an opinion can depend on its moral features. In addition, I raise objections to existing accounts of moral encroachment. For instance, many accounts fail to give sufficient attention to moral encroachment on credences. Also, many accounts focus on moral features that fail to support standard analogies between pragmatic and moral encroachment. Throughout the paper, I discuss racial profiling as a case study, (...)
  • Epistemology Formalized.Sarah Moss - 2013 - Philosophical Review 122 (1):1-43.
    This paper argues that just as full beliefs can constitute knowledge, so can properties of your credence distribution. The resulting notion of probabilistic knowledge helps us give a natural account of knowledge ascriptions embedding language of subjective uncertainty, and a simple diagnosis of probabilistic analogs of Gettier cases. Just like propositional knowledge, probabilistic knowledge is factive, safe, and sensitive. And it helps us build knowledge-based norms of action without accepting implausible semantic assumptions or endorsing the claim that knowledge is interest-relative.
  • Contemporary Approaches to Statistical Mechanical Probabilities: A Critical Commentary - Part I: The Indifference Approach.Christopher J. G. Meacham - 2010 - Philosophy Compass 5 (12):1116-1126.
    This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The first of these articles provides a brief sketch of statistical mechanics, and discusses the indifference approach (...)
  • Regression to the Mean and Judy Benjamin.Randall G. McCutcheon - 2020 - Synthese 197 (3):1343-1355.
    Van Fraassen's Judy Benjamin problem asks how one ought to update one's credence in A upon receiving evidence of the sort "A may or may not obtain, but B is k times likelier than C", where {A,B,C} is a partition. Van Fraassen's solution, in the limiting case of increasing k, recommends a posterior converging to P(A | A ∪ B), where P is one's prior probability function. Grove and Halpern, and more recently Douven and Romeijn, have (...)
  • Independence Day?Matthew Mandelkern & Daniel Rothschild - 2019 - Journal of Semantics 36 (2):193-210.
    Two recent and influential papers, van Rooij 2007 and Lassiter 2012, propose solutions to the proviso problem that make central use of related notions of independence—qualitative in the first case, probabilistic in the second. We argue here that, if these solutions are to work, they must incorporate an implicit assumption about presupposition accommodation, namely that accommodation does not interfere with existing qualitative or probabilistic independencies. We show, however, that this assumption is implausible, as updating beliefs with conditional information does not (...)
  • The principle of maximum entropy and a problem in probability kinematics.Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of maxent. This article shows that an intuitive approach to Judy Benjamin’s case supports maxent. This (...)
  • An Objective Justification of Bayesianism II: The Consequences of Minimizing Inaccuracy.Hannes Leitgeb & Richard Pettigrew - 2010 - Philosophy of Science 77 (2):236-272.
    One of the fundamental problems of epistemology is to say when the evidence in an agent’s possession justifies the beliefs she holds. In this paper and its prequel, we defend the Bayesian solution to this problem by appealing to the following fundamental norm: Accuracy An epistemic agent ought to minimize the inaccuracy of her partial beliefs. In the prequel, we made this norm mathematically precise; in this paper, we derive its consequences. We show that the two core tenets of Bayesianism (...)
  • On Indeterminate Updating of Credences.Leendert Huisman - 2014 - Philosophy of Science 81 (4):537-557.
    The strategy of updating credences by minimizing the relative entropy has been questioned by many authors, most strongly by means of the Judy Benjamin puzzle. I present a new analysis of Judy Benjamin–like forms of new information and defend the thesis that in general the rational posterior is indeterminate, meaning that a family of posterior credence functions rather than a single one is the rational response when that type of information becomes available. The proposed thesis extends naturally to all cases (...)
  • Bayesians Still Don’t Learn from Conditionals.Mario Günther & Borut Trpin - 2022 - Acta Analytica 38 (3):439-451.
    One of the open questions in Bayesian epistemology is how to rationally learn from indicative conditionals (Douven, 2016). Eva et al. (Mind 129(514):461–508, 2020) propose a strategy to resolve this question. They claim that their strategy provides a “uniquely rational response to any given learning scenario”. We show that their updating strategy is neither very general nor always rational. Even worse, we generalize their strategy and show that it still fails. Bad news for the Bayesians.
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
  • Monty hall drives a wedge between Judy Benjamin and the sleeping beauty: A reply to Bovens.Luc Bovens & Jose-Luis Ferreira - 2010 - Analysis 70 (3):473 - 481.
    In “Judy Benjamin is a Sleeping Beauty” (2010) Bovens recognises a certain similarity between the Sleeping Beauty (SB) and the Judy Benjamin (JB). But he does not recognise the dissimilarity between the underlying protocols (as spelled out in Shafer (1985)). Protocols are expressed in conditional probability tables that spell out the probability of coming to learn various propositions conditional on the actual state of the world. The principle of total evidence requires that we not update on the content of the proposition (...)
  • Learning from Conditionals.Benjamin Eva, Stephan Hartmann & Soroush Rafiee Rad - 2020 - Mind 129 (514):461-508.
    In this article, we address a major outstanding question of probabilistic Bayesian epistemology: how should a rational Bayesian agent update their beliefs upon learning an indicative conditional? A number of authors have recently contended that this question is fundamentally underdetermined by Bayesian norms, and hence that there is no single update procedure that rational agents are obliged to follow upon learning an indicative conditional. Here we resist this trend and argue that a core set of widely accepted Bayesian norms is (...)
  • Expected Accuracy Supports Conditionalization—and Conglomerability and Reflection.Kenny Easwaran - 2013 - Philosophy of Science 80 (1):119-142.
    Expected accuracy arguments have been used by several authors (Leitgeb and Pettigrew, and Greaves and Wallace) to support the diachronic principle of conditionalization, in updates where there are only finitely many possible propositions to learn. I show that these arguments can be extended to infinite cases, giving an argument not just for conditionalization but also for principles known as ‘conglomerability’ and ‘reflection’. This shows that the expected accuracy approach is stronger than has been realized. I also argue that we should (...)
  • Learning Conditional Information.Igor Douven - 2012 - Mind and Language 27 (3):239-263.
    Some of the information we receive comes to us in an explicitly conditional form. It is an open question how to model the accommodation of such information in a Bayesian framework. This paper presents data suggesting that there may be no strictly Bayesian account of updating on conditionals. Specifically, the data seem to indicate that such updating at least sometimes proceeds on the basis of explanatory considerations, which famously have no home in standard Bayesian epistemology. The paper also proposes a (...)
  • Rational constraints and the Simple View.E. di Nucci - 2010 - Analysis 70 (3):481-486.
  • Ramsey’s test, Adams’ thesis, and left-nested conditionals.Richard Dietz & Igor Douven - 2010 - Review of Symbolic Logic 3 (3):467-484.
    Adams famously suggested that the acceptability of any indicative conditional whose antecedent and consequent are both factive sentences amounts to the subjective conditional probability of the consequent given the antecedent. The received view has it that this thesis offers an adequate partial explication of Ramsey’s test, which characterizes graded acceptability for conditionals in terms of hypothetical updates on the antecedent. Some results in van Fraassen may raise hope that this explicatory approach to Ramsey’s test is extendible to left-nested conditionals, that (...)
  • Belief revision generalized: A joint characterization of Bayes's and Jeffrey's rules.Franz Dietrich, Christian List & Richard Bradley - 2016 - Journal of Economic Theory 162:352-371.
    We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and Jeffrey's rules can be characterized in terms of the same axioms: "responsiveness", which requires that revised beliefs (...)
  • Higher-Order Beliefs and the Undermining Problem for Bayesianism.Lisa Cassell - 2019 - Acta Analytica 34 (2):197-213.
    Jonathan Weisberg has argued that Bayesianism’s rigid updating rules make Bayesian updating incompatible with undermining defeat. In this paper, I argue that when we attend to the higher-order beliefs we must ascribe to agents in the kinds of cases Weisberg considers, the problem he raises disappears. Once we acknowledge the importance of higher-order beliefs to the undermining story, we are led to a different understanding of how these cases arise. And on this different understanding of things, the rigid nature of (...)
  • Proposition-valued random variables as information.Richard Bradley - 2010 - Synthese 175 (1):17 - 38.
    The notion of a proposition as a set of possible worlds or states occupies central stage in probability theory, semantics and epistemology, where it serves as the fundamental unit both of information and meaning. But this fact should not blind us to the existence of prospects with a different structure. In the paper I examine the use of random variables—in particular, proposition-valued random variables— in these fields and argue that we need a general account of rational attitude formation with respect (...)
  • Aggregating Causal Judgments.Richard Bradley, Franz Dietrich & Christian List - 2014 - Philosophy of Science 81 (4):491-515.
    Decision-making typically requires judgments about causal relations: we need to know the causal effects of our actions and the causal relevance of various environmental factors. We investigate how several individuals' causal judgments can be aggregated into collective causal judgments. First, we consider the aggregation of causal judgments via the aggregation of probabilistic judgments, and identify the limitations of this approach. We then explore the possibility of aggregating causal judgments independently of probabilistic ones. Formally, we introduce the problem of causal-network aggregation. (...)
  • Judy Benjamin is a Sleeping Beauty.Luc Bovens - 2010 - Analysis 70 (1):23-26.
    I argue that van Fraassen's Judy Benjamin Problem and Elga's Sleeping Beauty Problem have the same structure.
  • Normativity, Epistemic Rationality, and Noisy Statistical Evidence.Boris Babic, Anil Gaba, Ilia Tsetlin & Robert Winkler - forthcoming - British Journal for the Philosophy of Science.
    Many philosophers have argued that statistical evidence regarding group characteristics (particularly stereotypical ones) can create normative conflicts between the requirements of epistemic rationality and our moral obligations to each other. In a recent paper, Johnson-King and Babic argue that such conflicts can usually be avoided: what ordinary morality requires, they argue, epistemic rationality permits. In this paper, we show that as data gets large, Johnson-King and Babic’s approach becomes less plausible. More constructively, we build on their project and develop (...)
  • What the "Equal Weight View" is.Randall G. McCutcheon - manuscript
    Dawid, DeGroot and Mortera showed, a quarter century ago, that any agent who regards a fellow agent as a peer--in particular, defers to the fellow agent's prior credences in the same way that she defers to her own--and updates by split-the-difference is prone to diachronic incoherence. On the other hand one may show that there are special scenarios in which Bayesian updating approximates difference splitting, so it remains an important question whether it remains a viable response to "generic" peer update. (...)
  • How to Learn Concepts, Consequences, and Conditionals.Franz Huber - 2015 - Analytica: an electronic, open-access journal for philosophy of science 1 (1):20-36.
    In this brief note I show how to model conceptual change, logical learning, and revision of one's beliefs in response to conditional information such as indicative conditionals that do not express propositions.
  • Vasudevan on Judy Benjamin.Randall G. McCutcheon - manuscript
    Anubav Vasudevan characterized van Fraassen’s “Infomin” solution to the Judy Benjamin Problem (i.e. the solution by way of minimizing the Kullback-Leibler divergence between the posterior and prior) as an implementation of a “brand of epistemic charity” taking “the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information”. After an analysis of the example that led Vasudevan to this way of thinking about Infomin, as well as of a new one (...)
  • Eva, Hartmann and Rad on Kullback-Leibler Minimization.Randall G. McCutcheon - manuscript
    We address problems (that have since been addressed) in a proofs-version of a paper by Eva, Hartmann and Rad, who were attempting to justify the Kullback-Leibler divergence minimization solution to van Fraassen’s Judy Benjamin problem.