  • Measuring the overall incoherence of credence functions. Julia Staffel - 2015 - Synthese 192 (5):1467-1493.
    Many philosophers hold that the probability axioms constitute norms of rationality governing degrees of belief. This view, known as subjective Bayesianism, has been widely criticized for being too idealized. It is claimed that the norms on degrees of belief postulated by subjective Bayesianism cannot be followed by human agents, and hence have no normative force for beings like us. This problem is especially pressing since the standard framework of subjective Bayesianism only allows us to distinguish between two kinds of credence (...)
    18 citations
  • The puzzle of the changing past. L. Barlassina & F. Del Prete - 2015 - Analysis 75 (1):59-67.
    If you utter sentence (1) ‘Obama was born in 1961’ now, you say something true about the past. Since the past will always be such that the year 1961 has the property of being a time in which Obama was born, it seems impossible that (1) could ever be false in a future context of utterance. We shall consider the case of a sentence about the past exactly like (1), but which was true when uttered a few years ago and is (...)
    13 citations
  • Accuracy, Risk, and the Principle of Indifference. Richard Pettigrew - 2016 - Philosophy and Phenomenological Research 92 (1):35-59.
    In Bayesian epistemology, the problem of the priors is this: How should we set our credences (or degrees of belief) in the absence of evidence? That is, how should we set our prior or initial credences, the credences with which we begin our credal life? David Lewis liked to call an agent at the beginning of her credal journey a superbaby. The problem of the priors asks for the norms that govern these superbabies. The Principle of Indifference gives a (...)
    72 citations
  • Rational Probabilistic Incoherence. Michael Caie - 2013 - Philosophical Review 122 (4):527-575.
    Probabilism is the view that a rational agent's credences should always be probabilistically coherent. It has been argued that Probabilism follows, given the assumption that an epistemically rational agent ought to try to have credences that represent the world as accurately as possible. The key claim in this argument is that the goal of representing the world as accurately as possible is best served by having credences that are probabilistically coherent. This essay shows that this claim is false. In certain (...)
    51 citations
  • Epistemic Utility Theory. Richard Pettigrew - 2010.
    Beliefs come in different strengths. What are the norms that govern these strengths of belief? Let an agent's belief function at a particular time be the function that assigns, to each of the propositions about which she has an opinion, the strength of her belief in that proposition at that time. Traditionally, philosophers have claimed that an agent's belief function at any time ought to be a probability function, and that she ought to update her belief function upon obtaining new (...)
    2 citations
  • Formal Representations of Belief. Franz Huber - 2008 - Stanford Encyclopedia of Philosophy.
    Epistemology is the study of knowledge and justified belief. Belief is thus central to epistemology. It comes in a qualitative form, as when Sophia believes that Vienna is the capital of Austria, and a quantitative form, as when Sophia's degree of belief that Vienna is the capital of Austria is at least twice her degree of belief that tomorrow it will be sunny in Vienna. Formal epistemology, as opposed to mainstream epistemology (Hendricks 2006), is epistemology done in a formal way, (...)
    26 citations
  • Three conceptions of explaining how possibly—and one reductive account. Johannes Persson - 2011 - In Henk W. de Regt (ed.), EPSA Philosophy of Science: Amsterdam 2009. Springer. pp. 275-286.
    Philosophers of science have often favoured reductive approaches to how-possibly explanation. This article identifies three alternative conceptions making how-possibly explanation an interesting phenomenon in its own right. The first variety approaches “how possibly X?” by showing that X is not epistemically impossible. This can sometimes be achieved by removing misunderstandings concerning the implications of one’s current belief system but characteristically involves a modification of this belief system so that acceptance of X does not result in contradiction. The second variety offers (...)
    8 citations
  • (1 other version) Interpretations of probability. Alan Hájek - 2007 - Stanford Encyclopedia of Philosophy.
    162 citations
  • An accuracy-based approach to quantum conditionalization. Alexander Meehan & Jer Steeger - forthcoming - British Journal for the Philosophy of Science.
    A core tenet of Bayesian epistemology is that rational agents update by conditionalization. Accuracy arguments in favour of this norm are well known. Meanwhile, scholars working in quantum probability and quantum state estimation have proposed multiple updating rules, all of which look prima facie like analogues of Bayesian conditionalization. The most common are Lüders conditionalization and Bayesian mean estimation (BME). Some authors also endorse a lesser-known alternative that we call retrodiction. We show how one can view Lüders and BME as (...)
  • Infinite Opinion Sets and Relative Accuracy. Ilho Park & Jaemin Jung - 2023 - Journal of Philosophy 120 (6):285-313.
    We can have credences in an infinite number of propositions—that is, our opinion set can be infinite. Accuracy-first epistemologists have devoted themselves to evaluating credal states with the help of the concept of ‘accuracy’. Unfortunately, under several innocuous assumptions, infinite opinion sets yield several undesirable results, some of which are even fatal, to accuracy-first epistemology. Moreover, accuracy-first epistemologists cannot circumvent these difficulties in any standard way. In this regard, we will suggest a non-standard approach, called a relativistic approach, to accuracy-first (...)
  • The material theory of induction. John D. Norton - 2021 - Calgary, Alberta, Canada: University of Calgary Press.
    The inaugural title in the new, Open Access series BSPS Open, The Material Theory of Induction will initiate a new tradition in the analysis of inductive inference. The fundamental burden of a theory of inductive inference is to determine which are the good inductive inferences or relations of inductive support and why it is that they are so. The traditional approach is modeled on that taken in accounts of deductive inference. It seeks universally applicable schemas or rules or a single (...)
    42 citations
  • Accuracy and Probabilism in Infinite Domains. Michael Nielsen - 2023 - Mind 132 (526):402-427.
    The best accuracy arguments for probabilism apply only to credence functions with finite domains, that is, credence functions that assign credence to at most finitely many propositions. This is a significant limitation. It reveals that the support for the accuracy-first program in epistemology is a lot weaker than it seems at first glance, and it means that accuracy arguments cannot yet accomplish everything that their competitors, the pragmatic (Dutch book) arguments, can. In this paper, I investigate the extent to which (...)
    4 citations
  • A Dilemma for Solomonoff Prediction. Sven Neth - 2023 - Philosophy of Science 90 (2):288-306.
    The framework of Solomonoff prediction assigns prior probability to hypotheses inversely proportional to their Kolmogorov complexity. There are two well-known problems. First, the Solomonoff prior is relative to a choice of Universal Turing machine. Second, the Solomonoff prior is not computable. However, there are responses to both problems. Different Solomonoff priors converge with more and more data. Further, there are computable approximations to the Solomonoff prior. I argue that there is a tension between these two responses. This is because computable (...)
    3 citations
  • Radical Pooling and Imprecise Probabilities. Ignacio Ojea Quintana - forthcoming - Erkenntnis:1-28.
    This paper focuses on radical pooling, or the question of how to aggregate credences when there is a fundamental disagreement about which is the relevant logical space for inquiry. The solution advanced is based on the notion of consensus as common ground, where agents can find it by suspending judgment on logical possibilities. This is exemplified with cases of scientific revolution. On a formal level, the proposal uses algebraic joins and imprecise probabilities, which is shown to be compatible with the (...)
  • Downwards Propriety in Epistemic Utility Theory. Alejandro Pérez Carballo - 2023 - Mind 132 (525):30-62.
    Epistemic Utility Theory is often identified with the project of *axiology-first epistemology*—the project of vindicating norms of epistemic rationality purely in terms of epistemic value. One of the central goals of axiology-first epistemology is to provide a justification of the central norm of Bayesian epistemology, Probabilism. The first part of this paper presents a new challenge to axiology-first epistemology: I argue that in order to justify Probabilism in purely axiological terms, proponents of axiology-first epistemology need to justify a (...)
    1 citation
  • Symmetry and partial belief geometry. Stefan Lukits - 2021 - European Journal for Philosophy of Science 11 (3):1-24.
    When beliefs are quantified as credences, they are related to each other in terms of closeness and accuracy. The “accuracy first” approach in formal epistemology wants to establish a normative account for credences based entirely on the alethic properties of the credence: how close it is to the truth. To pull off this project, there is a need for a scoring rule. There is widespread agreement about some constraints on this scoring rule, but not whether a unique scoring rule stands (...)
  • Updating incoherent credences - Extending the Dutch strategy argument for conditionalization. Glauber De Bona & Julia Staffel - 2021 - Philosophy and Phenomenological Research 105 (2):435-460.
    In this paper, we ask: how should an agent who has incoherent credences update when they learn new evidence? The standard Bayesian answer for coherent agents is that they should conditionalize; however, this updating rule is not defined for incoherent starting credences. We show how one of the main arguments for conditionalization, the Dutch strategy argument, can be extended to devise a target property for updating plans that can apply to them regardless of whether the agent starts out with coherent (...)
  • On the Best Accuracy Arguments for Probabilism. Michael Nielsen - 2022 - Philosophy of Science 89 (3):621-630.
    In a recent paper, Pettigrew reports a generalization of the celebrated accuracy-dominance theorem due to Predd et al., but Pettigrew’s proof is incorrect. I will explain the mistakes and provide a correct proof.
    4 citations
  • Accuracy-dominance and conditionalization. Michael Nielsen - 2021 - Philosophical Studies 178 (10):3217-3236.
    Epistemic decision theory produces arguments with both normative and mathematical premises. I begin by arguing that philosophers should care about whether the mathematical premises (1) are true, (2) are strong, and (3) admit simple proofs. I then discuss a theorem that Briggs and Pettigrew (2020) use as a premise in a novel accuracy-dominance argument for conditionalization. I argue that the theorem and its proof can be improved in a number of ways. First, I present a counterexample that shows that one (...)
    8 citations
  • (1 other version) On the Accuracy of Group Credences. Richard Pettigrew - 2019 - Oxford Studies in Epistemology 6.
    We often ask for the opinion of a group of individuals. How strongly does the scientific community believe that the rate at which sea levels are rising has increased over the last 200 years? How likely does the UK Treasury think it is that there will be a recession if the country leaves the European Union? What are these group credences that such questions request? And how do they relate to the individual credences assigned by the members of the particular (...)
    16 citations
  • Epistemic Decision Theory's Reckoning. Conor Mayo-Wilson & Gregory Wheeler - manuscript.
    Epistemic decision theory (EDT) employs the mathematical tools of rational choice theory to justify epistemic norms, including probabilism, conditionalization, and the Principal Principle, among others. Practitioners of EDT endorse two theses: (1) epistemic value is distinct from subjective preference, and (2) belief and epistemic value can be numerically quantified. We argue the first thesis, which we call epistemic puritanism, undermines the second.
    4 citations
  • Updating for Externalists. J. Dmitri Gallow - 2021 - Noûs 55 (3):487-516.
    The externalist says that your evidence could fail to tell you what evidence you do or do not have. In that case, it could be rational for you to be uncertain about what your evidence is. This is a kind of uncertainty which orthodox Bayesian epistemology has difficulty modeling. For, if externalism is correct, then the orthodox Bayesian learning norms of conditionalization and reflection are inconsistent with each other. I recommend that an externalist Bayesian reject conditionalization. In its stead, I (...)
    17 citations
  • An Accuracy-Dominance Argument for Conditionalization. R. A. Briggs & Richard Pettigrew - 2020 - Noûs 54 (1):162-181.
    Epistemic decision theorists aim to justify Bayesian norms by arguing that these norms further the goal of epistemic accuracy—having beliefs that are as close as possible to the truth. The standard defense of Probabilism appeals to accuracy dominance: for every belief state that violates the probability calculus, there is some probabilistic belief state that is more accurate, come what may. The standard defense of Conditionalization, on the other hand, appeals to expected accuracy: before the evidence is in, one should expect (...)
    62 citations
  • Learning and Value Change. J. Dmitri Gallow - 2019 - Philosophers' Imprint 19:1-22.
    Accuracy-first accounts of rational learning attempt to vindicate the intuitive idea that, while rationally-formed belief need not be true, it is nevertheless likely to be true. To this end, they attempt to show that the Bayesian's rational learning norms are a consequence of the rational pursuit of accuracy. Existing accounts fall short of this goal, for they presuppose evidential norms which are not and cannot be vindicated in terms of the single-minded pursuit of accuracy. I propose an alternative account, according (...)
    5 citations
  • Epistemic Utility and the Normativity of Logic. Richard Pettigrew - 2017 - Logos and Episteme 8 (4):455-492.
    How does logic relate to rational belief? Is logic normative for belief, as some say? What, if anything, do facts about logical consequence tell us about norms of doxastic rationality? In this paper, we consider a range of putative logic-rationality bridge principles. These purport to relate facts about logical consequence to norms that govern the rationality of our beliefs and credences. To investigate these principles, we deploy a novel approach, namely, epistemic utility theory. That is, we assume that doxastic attitudes (...)
    10 citations
  • Lockeans Maximize Expected Accuracy. Kevin Dorst - 2019 - Mind 128 (509):175-211.
    The Lockean Thesis says that you must believe p iff you’re sufficiently confident of it. On some versions, the 'must' asserts a metaphysical connection; on others, it asserts a normative one. On some versions, 'sufficiently confident' refers to a fixed threshold of credence; on others, it varies with proposition and context. Claim: the Lockean Thesis follows from epistemic utility theory—the view that rational requirements are constrained by the norm to promote accuracy. Different versions of this theory generate different versions of (...)
    98 citations
  • A flaw in the Stich–Plantinga challenge to evolutionary reliabilism. Michael J. Deem - 2018 - Analysis 78 (2):216-225.
    Evolutionary reliabilism is the view that natural selection likely favoured reliable cognitive faculties in humans. While ER enjoys some plausibility, Stephen Stich and Alvin Plantinga have presented well-known challenges to the view. Their arguments rely on a common premiss; namely, that natural selection is indifferent to truth. This article shows that this premiss is both imprecise and too weak to support their conclusions and, therefore, that their challenges to ER fail.
    3 citations
  • Jamesian epistemology formalised: An explication of ‘the will to believe’. Richard Pettigrew - 2016 - Episteme 13 (3):253-268.
    Famously, William James held that there are two commandments that govern our epistemic life: Believe truth! Shun error! In this paper, I give a formal account of James' claim using the tools of epistemic utility theory. I begin by giving the account for categorical doxastic states – that is, full belief, full disbelief, and suspension of judgment. Then I will show how the account plays out for graded doxastic states – that is, credences. The latter part of the paper thus (...)
    22 citations
  • Strictly Proper Scoring Rules. Juergen Landes - unknown.
    Epistemic scoring rules are the en vogue tool for justifications of the probability norm and further norms of rational belief formation. They are different in kind and application from statistical scoring rules from which they arose. In the first part of the paper I argue that statistical scoring rules, properly understood, are in principle better suited to justify the probability norm than their epistemic brethren. Furthermore, I give a justification of the probability norm applying statistical scoring rules. In the second (...)
  • Accuracy and infinity: a dilemma for subjective Bayesians. Mikayla Kelley & Sven Neth - 2023 - Synthese 201 (12):1-14.
    We argue that subjective Bayesians face a dilemma: they must offend against the spirit of their permissivism about rational credence or reject the principle that one should avoid accuracy dominance.
    2 citations
  • Generalized Immodesty Principles in Epistemic Utility Theory. Alejandro Pérez Carballo - 2023 - Ergo: An Open Access Journal of Philosophy 10 (31):874-907.
    Epistemic rationality is typically taken to be immodest at least in this sense: a rational epistemic state should always take itself to be doing at least as well, epistemically and by its own lights, as any alternative epistemic state. If epistemic states are probability functions and their alternatives are other probability functions defined over the same collection of propositions, we can capture the relevant sense of immodesty by claiming that epistemic utility functions are (strictly) proper. In this paper I examine (...)
  • Towards the Inevitability of Non-Classical Probability. Giacomo Molinari - 2023 - Review of Symbolic Logic 16 (4):1053-1079.
    This paper generalises an argument for probabilism due to Lindley [9]. I extend the argument to a number of non-classical logical settings whose truth-values, seen here as ideal aims for belief, are in the set $\{0,1\}$, and where logical consequence $\models $ is given the “no-drop” characterization. First I will show that, in each of these settings, an agent’s credence can only avoid accuracy-domination if its canonical transform is a (possibly non-classical) probability function. In other words, if an agent values (...)
  • On the Expected Utility Objection to the Dutch Book Argument for Probabilism. Richard Pettigrew - 2021 - Noûs (1):23-38.
    The Dutch Book Argument for Probabilism assumes Ramsey's Thesis (RT), which purports to determine the prices an agent is rationally required to pay for a bet. Recently, a new objection to Ramsey's Thesis has emerged (Hedden 2013, Wronski & Godziszewski 2017, Wronski 2018)--I call this the Expected Utility Objection. According to this objection, it is Maximise Subjective Expected Utility (MSEU) that determines the prices an agent is required to pay for a bet, and this often disagrees with Ramsey's Thesis. I (...)
    5 citations
  • Accuracy, Verisimilitude, and Scoring Rules. Jeffrey Dunn - 2019 - Australasian Journal of Philosophy 97 (1):151-166.
    Suppose that beliefs come in degrees. How should we then measure the accuracy of these degrees of belief? Scoring rules are usually thought to be the mathematical tool appropriate for this job. But there are many scoring rules, which lead to different ordinal accuracy rankings. Recently, Fallis and Lewis [2016] have given an argument that, if sound, rules out many popular scoring rules, including the Brier score, as genuine measures of accuracy. I respond to this argument, in part by noting (...)
    7 citations
  • Advanced modalizing de dicto and de re. John Divers & John J. Parry - 2018 - Analysis 78 (3):415-425.
    Lewis’ analysis of modality faces a problem in that it appears to confer unintended truth values to certain modal claims about the pluriverse: e.g. ‘It is possible that there are many worlds’ is false when we expect truth. This is the problem of advanced modalizing. Divers presents a principled solution to this problem by treating modal modifiers as semantically redundant in some such cases. However, this semantic move does not deal adequately with advanced de re modal claims. Here, we motivate (...)
    3 citations
  • Aggregating incoherent agents who disagree. Richard Pettigrew - 2019 - Synthese 196 (7):2737-2776.
    In this paper, we explore how we should aggregate the degrees of belief of a group of agents to give a single coherent set of degrees of belief, when at least some of those agents might be probabilistically incoherent. There are a number of ways of aggregating degrees of belief, and there are a number of ways of fixing incoherent degrees of belief. When we have picked one of each, should we aggregate first and then fix, or fix first and (...)
    8 citations
  • Agreement and Updating For Self-Locating Belief. Michael Caie - 2018 - Journal of Philosophical Logic 47 (3):513-547.
    In this paper, I argue that some plausible principles concerning which credences are rationally permissible for agents given information about one another’s epistemic and credal states have some surprising consequences for which credences an agent ought to have in light of self-locating information. I provide a framework that allows us to state these constraints and draw out these consequences precisely. I then consider and assess the prospects for rejecting these prima facie plausible principles.
    1 citation
  • The Population Ethics of Belief: In Search of an Epistemic Theory X. Richard Pettigrew - 2018 - Noûs 52 (2):336-372.
    Consider Phoebe and Daphne. Phoebe has credences in 1 million propositions. Daphne, on the other hand, has credences in all of these propositions, but she's also got credences in 999 million other propositions. Phoebe's credences are all very accurate. Each of Daphne's credences, in contrast, are not very accurate at all; each is a little more accurate than it is inaccurate, but not by much. Whose doxastic state is better, Phoebe's or Daphne's? It is clear that this question is analogous (...)
    16 citations
  • Epistemic Utility and Norms for Credences. Richard Pettigrew - 2013 - Philosophy Compass 8 (10):897-908.
    Beliefs come in different strengths. An agent's credence in a proposition is a measure of the strength of her belief in that proposition. Various norms for credences have been proposed. Traditionally, philosophers have tried to argue for these norms by showing that any agent who violates them will be led by her credences to make bad decisions. In this article, we survey a new strategy for justifying these norms. The strategy begins by identifying an epistemic utility function and a decision-theoretic (...)
    40 citations
  • Improving Aggregated Forecasts of Probability. Guanchun Wang, Sanjeev Kulkarni & Daniel N. Osherson - unknown.
    The Coherent Approximation Principle (CAP) is a method for aggregating forecasts of probability from a group of judges by enforcing coherence with minimal adjustment. This paper explores two methods to further improve the forecasting accuracy within the CAP framework and proposes practical algorithms that implement them. These methods allow flexibility to add fixed constraints to the coherentization process and compensate for the psychological bias present in probability estimates from human judges. The algorithms were tested on a data set of nearly (...)
  • Accuracy Across Doxastic Attitudes: Recent Work on the Accuracy of Belief. Robert Weston Siscoe - 2022 - American Philosophical Quarterly 59 (2):201-217.
    James Joyce's article “A Nonpragmatic Vindication of Probabilism” introduced an approach to arguing for credal norms by appealing to the epistemic value of accuracy. The central thought was that credences ought to accurately represent the world, a guiding thought that has gone on to generate an entire research paradigm on the rationality of credences. Recently, a number of epistemologists have begun to apply this same thought to full beliefs, attempting to explain and argue for norms of belief in terms of (...)
    1 citation
  • Ideal counterpart theorizing and the accuracy argument for probabilism. Clinton Castro & Olav Vassend - 2018 - Analysis 78 (2):207-216.
    One of the main goals of Bayesian epistemology is to justify the rational norms credence functions ought to obey. Accuracy arguments attempt to justify these norms from the assumption that the source of value for credences relevant to their epistemic status is their accuracy. This assumption and some standard decision-theoretic principles are used to argue for norms like Probabilism, the thesis that an agent’s credence function is rational only if it obeys the probability axioms. We introduce an example that shows (...)
    1 citation
  • Formal Epistemology Meets Mechanism Design. Jürgen Landes - 2023 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 54 (2):215-231.
    This article connects recent work in formal epistemology to work in economics and computer science. Analysing the Dutch Book Arguments, Epistemic Utility Theory and Objective Bayesian Epistemology we discover that formal epistemologists employ the same argument structure as economists and computer scientists. Since similar approaches often have similar problems and have shared solutions, opportunities for cross-fertilisation abound.
  • (1 other version) Accuracy, Conditionalization, and Probabilism. Peter J. Lewis & Don Fallis - manuscript.
    Accuracy-based arguments for conditionalization and probabilism appear to have a significant advantage over their Dutch Book rivals. They rely only on the plausible epistemic norm that one should try to decrease the inaccuracy of one's beliefs. Furthermore, it seems that conditionalization and probabilism follow from a wide range of measures of inaccuracy. However, we argue that among the measures in the literature, there are some from which one can prove conditionalization, others from which one can prove probabilism, and none from (...)
    3 citations
  • Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search. Vincenzo Crupi, Jonathan D. Nelson, Björn Meder, Gustavo Cevolani & Katya Tentori - 2018 - Cognitive Science 42 (5):1410-1456.
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the (...)
    17 citations
  • (1 other version) On the accuracy of group credences. Richard Pettigrew - 2016 - In Oxford Studies in Epistemology Vol. 6. Oxford University Press.
    7 citations
  • Why be coherent? Glauber De Bona & Julia Staffel - 2018 - Analysis 78 (3):405-415.
    Bayesians defend norms of ideal rationality such as probabilism, which they claim should be approximated by non-ideal thinkers. Yet, it is not often discussed exactly in what sense it is beneficial for an agent’s credence function to approximate probabilistic coherence. Some existing research indicates that approximating coherence leads to improvements in accuracy, whereas other research suggests that it decreases Dutch book vulnerability. Yet, the existing results don’t settle whether there is a way of approximating coherence that delivers both benefits at (...)
    17 citations
  • Judging the Probability of Hypotheses Versus the Impact of Evidence: Which Form of Inductive Inference Is More Accurate and Time-Consistent? Katya Tentori, Nick Chater & Vincenzo Crupi - 2016 - Cognitive Science 40 (3):758-778.
    Inductive reasoning requires exploiting links between evidence and hypotheses. This can be done focusing either on the posterior probability of the hypothesis when updated on the new evidence or on the impact of the new evidence on the credibility of the hypothesis. But are these two cognitive representations equally reliable? This study investigates this question by comparing probability and impact judgments on the same experimental materials. The results indicate that impact judgments are more consistent in time and more accurate than (...)
    7 citations
  • The value of cost-free uncertain evidence. Patryk Dziurosz-Serafinowicz & Dominika Dziurosz-Serafinowicz - 2021 - Synthese 199 (5-6):13313-13343.
    We explore the question of whether cost-free uncertain evidence is worth waiting for in advance of making a decision. A classical result in Bayesian decision theory, known as the value of evidence theorem, says that, under certain conditions, when you update your credences by conditionalizing on some cost-free and certain evidence, the subjective expected utility of obtaining this evidence is never less than the subjective expected utility of not obtaining it. We extend this result to a type of update method, (...)
  • (1 other version) The Foundations of Epistemic Decision Theory. Jason Konek & Benjamin A. Levinstein - 2019 - Mind 128 (509):69-107.
    25 citations