  • An accuracy-based approach to quantum conditionalization.Alexander Meehan & Jer Alex Steeger - forthcoming - British Journal for the Philosophy of Science.
    A core tenet of Bayesian epistemology is that rational agents update by conditionalization. Accuracy arguments in favour of this norm are well known. Meanwhile, scholars working in quantum probability and quantum state estimation have proposed multiple updating rules, all of which look prima facie like analogues of Bayesian conditionalization. The most common are Lüders conditionalization and Bayesian mean estimation (BME). Some authors also endorse a lesser-known alternative that we call retrodiction. We show how one can view Lüders and BME as (...)
  • Towards the Inevitability of Non-Classical Probability.Giacomo Molinari - 2023 - Review of Symbolic Logic 16 (4):1053-1079.
    This paper generalises an argument for probabilism due to Lindley [9]. I extend the argument to a number of non-classical logical settings whose truth-values, seen here as ideal aims for belief, are in the set $\{0,1\}$, and where logical consequence $\models $ is given the “no-drop” characterization. First I will show that, in each of these settings, an agent’s credence can only avoid accuracy-domination if its canonical transform is a (possibly non-classical) probability function. In other words, if an agent values (...)
  • Infinite Opinion Sets and Relative Accuracy.Ilho Park & Jaemin Jung - 2023 - Journal of Philosophy 120 (6):285-313.
    We can have credences in an infinite number of propositions—that is, our opinion set can be infinite. Accuracy-first epistemologists have devoted themselves to evaluating credal states with the help of the concept of ‘accuracy’. Unfortunately, under several innocuous assumptions, infinite opinion sets yield several undesirable results, some of which are even fatal, to accuracy-first epistemology. Moreover, accuracy-first epistemologists cannot circumvent these difficulties in any standard way. In this regard, we will suggest a non-standard approach, called a relativistic approach, to accuracy-first (...)
  • Accuracy-dominance and conditionalization.Michael Nielsen - 2021 - Philosophical Studies 178 (10):3217-3236.
    Epistemic decision theory produces arguments with both normative and mathematical premises. I begin by arguing that philosophers should care about whether the mathematical premises (1) are true, (2) are strong, and (3) admit simple proofs. I then discuss a theorem that Briggs and Pettigrew (2020) use as a premise in a novel accuracy-dominance argument for conditionalization. I argue that the theorem and its proof can be improved in a number of ways. First, I present a counterexample that shows that one (...)
  • A flaw in the Stich–Plantinga challenge to evolutionary reliabilism.Michael J. Deem - 2018 - Analysis 78 (2):216-225.
    Evolutionary reliabilism is the view that natural selection likely favoured reliable cognitive faculties in humans. While ER enjoys some plausibility, Stephen Stich and Alvin Plantinga have presented well-known challenges to the view. Their arguments rely on a common premiss; namely, that natural selection is indifferent to truth. This article shows that this premiss is both imprecise and too weak to support their conclusions and, therefore, that their challenges to ER fail.
  • Why be coherent?Glauber De Bona & Julia Staffel - 2018 - Analysis 78 (3):405-415.
    Bayesians defend norms of ideal rationality such as probabilism, which they claim should be approximated by non-ideal thinkers. Yet, it is not often discussed exactly in what sense it is beneficial for an agent’s credence function to approximate probabilistic coherence. Some existing research indicates that approximating coherence leads to improvements in accuracy, whereas other research suggests that it decreases Dutch book vulnerability. Yet, the existing results don’t settle whether there is a way of approximating coherence that delivers both benefits at (...)
  • Updating incoherent credences ‐ Extending the Dutch strategy argument for conditionalization.Glauber De Bona & Julia Staffel - 2021 - Philosophy and Phenomenological Research 105 (2):435-460.
    In this paper, we ask: how should an agent who has incoherent credences update when they learn new evidence? The standard Bayesian answer for coherent agents is that they should conditionalize; however, this updating rule is not defined for incoherent starting credences. We show how one of the main arguments for conditionalization, the Dutch strategy argument, can be extended to devise a target property for updating plans that can apply to them regardless of whether the agent starts out with coherent (...)
  • Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.Vincenzo Crupi, Jonathan D. Nelson, Björn Meder, Gustavo Cevolani & Katya Tentori - 2018 - Cognitive Science 42 (5):1410-1456.
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the (...)
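    (As a rough illustrative gloss, not taken from the paper itself: Shannon entropy over hypotheses with probabilities $p_1,\dots,p_n$ is $H = -\sum_{i=1}^{n} p_i \log p_i$, and entropy-based models of test selection value a test by the expected reduction in $H$ that its possible outcomes would bring about.)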
  • Ideal counterpart theorizing and the accuracy argument for probabilism.Clinton Castro & Olav Vassend - 2018 - Analysis 78 (2):207-216.
    One of the main goals of Bayesian epistemology is to justify the rational norms credence functions ought to obey. Accuracy arguments attempt to justify these norms from the assumption that the source of value for credences relevant to their epistemic status is their accuracy. This assumption and some standard decision-theoretic principles are used to argue for norms like Probabilism, the thesis that an agent’s credence function is rational only if it obeys the probability axioms. We introduce an example that shows (...)
  • Generalized Immodesty Principles in Epistemic Utility Theory.Alejandro Pérez Carballo - 2023 - Ergo: An Open Access Journal of Philosophy 10 (31):874–907.
    Epistemic rationality is typically taken to be immodest at least in this sense: a rational epistemic state should always take itself to be doing at least as well, epistemically and by its own light, as any alternative epistemic state. If epistemic states are probability functions and their alternatives are other probability functions defined over the same collection of propositions, we can capture the relevant sense of immodesty by claiming that epistemic utility functions are (strictly) proper. In this paper I examine (...)
  • Downwards Propriety in Epistemic Utility Theory.Alejandro Pérez Carballo - 2023 - Mind 132 (525):30-62.
    Epistemic Utility Theory is often identified with the project of axiology-first epistemology—the project of vindicating norms of epistemic rationality purely in terms of epistemic value. One of the central goals of axiology-first epistemology is to provide a justification of the central norm of Bayesian epistemology, Probabilism. The first part of this paper presents a new challenge to axiology-first epistemology: I argue that in order to justify Probabilism in purely axiological terms, proponents of axiology-first epistemology need to justify a (...)
  • Rational Probabilistic Incoherence.Michael Caie - 2013 - Philosophical Review 122 (4):527-575.
    Probabilism is the view that a rational agent's credences should always be probabilistically coherent. It has been argued that Probabilism follows, given the assumption that an epistemically rational agent ought to try to have credences that represent the world as accurately as possible. The key claim in this argument is that the goal of representing the world as accurately as possible is best served by having credences that are probabilistically coherent. This essay shows that this claim is false. In certain (...)
  • Agreement and Updating For Self-Locating Belief.Michael Caie - 2018 - Journal of Philosophical Logic 47 (3):513-547.
    In this paper, I argue that some plausible principles concerning which credences are rationally permissible for agents given information about one another’s epistemic and credal states have some surprising consequences for which credences an agent ought to have in light of self-locating information. I provide a framework that allows us to state these constraints and draw out these consequences precisely. I then consider and assess the prospects for rejecting these prima facie plausible principles.
  • An Accuracy‐Dominance Argument for Conditionalization.R. A. Briggs & Richard Pettigrew - 2020 - Noûs 54 (1):162-181.
    Epistemic decision theorists aim to justify Bayesian norms by arguing that these norms further the goal of epistemic accuracy—having beliefs that are as close as possible to the truth. The standard defense of Probabilism appeals to accuracy dominance: for every belief state that violates the probability calculus, there is some probabilistic belief state that is more accurate, come what may. The standard defense of Conditionalization, on the other hand, appeals to expected accuracy: before the evidence is in, one should expect (...)
  • The puzzle of the changing past.L. Barlassina & F. Del Prete - 2015 - Analysis 75 (1):59-67.
    If you utter sentence (1) ‘Obama was born in 1961’ now, you say something true about the past. Since the past will always be such that the year 1961 has the property of being a time in which Obama was born, it seems impossible that (1) could ever be false in a future context of utterance. We shall consider the case of a sentence about the past exactly like (1), but which was true when uttered a few years ago and is (...)
  • A Dilemma for Solomonoff Prediction.Sven Neth - 2023 - Philosophy of Science 90 (2):288-306.
    The framework of Solomonoff prediction assigns prior probability to hypotheses inversely proportional to their Kolmogorov complexity. There are two well-known problems. First, the Solomonoff prior is relative to a choice of Universal Turing machine. Second, the Solomonoff prior is not computable. However, there are responses to both problems. Different Solomonoff priors converge with more and more data. Further, there are computable approximations to the Solomonoff prior. I argue that there is a tension between these two responses. This is because computable (...)
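    (For background only, not a claim of the paper: the Solomonoff prior assigns a hypothesis $h$ weight roughly proportional to $2^{-K(h)}$, where $K(h)$ is the Kolmogorov complexity of $h$ relative to a fixed universal Turing machine; the dependence on that machine is the first problem noted above, and the uncomputability of $K$ underlies the second.)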
  • Accuracy and infinity: a dilemma for subjective Bayesians.Mikayla Kelley & Sven Neth - 2023 - Synthese 201 (12):1-14.
    We argue that subjective Bayesians face a dilemma: they must offend against the spirit of their permissivism about rational credence or reject the principle that one should avoid accuracy dominance.
  • Accuracy, probabilism and Bayesian update in infinite domains.Alexander R. Pruss - 2022 - Synthese 200 (6):1-29.
    Scoring rules measure the accuracy or epistemic utility of a credence assignment. A significant literature uses plausible conditions on scoring rules on finite sample spaces to argue for both probabilism—the doctrine that credences ought to satisfy the axioms of probability—and for the optimality of Bayesian update as a response to evidence. I prove a number of formal results regarding scoring rules on infinite sample spaces that impact the extension of these arguments to infinite sample spaces. A common condition in the (...)
  • Accuracy, Risk, and the Principle of Indifference.Richard Pettigrew - 2016 - Philosophy and Phenomenological Research 92 (1):35-59.
    In Bayesian epistemology, the problem of the priors is this: How should we set our credences (or degrees of belief) in the absence of evidence? That is, how should we set our prior or initial credences, the credences with which we begin our credal life? David Lewis liked to call an agent at the beginning of her credal journey a superbaby. The problem of the priors asks for the norms that govern these superbabies. The Principle of Indifference gives a (...)
  • The material theory of induction.John D. Norton - 2021 - Calgary, Alberta, Canada: University of Calgary Press.
    The inaugural title in the new, Open Access series BSPS Open, The Material Theory of Induction will initiate a new tradition in the analysis of inductive inference. The fundamental burden of a theory of inductive inference is to determine which are the good inductive inferences or relations of inductive support and why it is that they are so. The traditional approach is modeled on that taken in accounts of deductive inference. It seeks universally applicable schemas or rules or a single (...)
  • Accuracy Across Doxastic Attitudes: Recent Work on the Accuracy of Belief.Robert Weston Siscoe - 2022 - American Philosophical Quarterly 59 (2):201-217.
    James Joyce's article “A Nonpragmatic Vindication of Probabilism” introduced an approach to arguing for credal norms by appealing to the epistemic value of accuracy. The central thought was that credences ought to accurately represent the world, a guiding thought that has gone on to generate an entire research paradigm on the rationality of credences. Recently, a number of epistemologists have begun to apply this same thought to full beliefs, attempting to explain and argue for norms of belief in terms of (...)
  • Updating: Learning versus supposing.Jiaying Zhao, Vincenzo Crupi, Katya Tentori, Branden Fitelson & Daniel Osherson - 2012 - Cognition 124 (3):373-378.
  • Justifying the Norms of Inductive Inference.Olav Benjamin Vassend - 2022 - British Journal for the Philosophy of Science 73 (1):135-160.
    Bayesian inference is limited in scope because it cannot be applied in idealized contexts where none of the hypotheses under consideration is true and because it is committed to always using the likelihood as a measure of evidential favouring, even when that is inappropriate. The purpose of this article is to study inductive inference in a very general setting where finding the truth is not necessarily the goal and where the measure of evidential favouring is not necessarily the likelihood. I (...)
  • Judging the Probability of Hypotheses Versus the Impact of Evidence: Which Form of Inductive Inference Is More Accurate and Time‐Consistent?Katya Tentori, Nick Chater & Vincenzo Crupi - 2016 - Cognitive Science 40 (3):758-778.
    Inductive reasoning requires exploiting links between evidence and hypotheses. This can be done focusing either on the posterior probability of the hypothesis when updated on the new evidence or on the impact of the new evidence on the credibility of the hypothesis. But are these two cognitive representations equally reliable? This study investigates this question by comparing probability and impact judgments on the same experimental materials. The results indicate that impact judgments are more consistent in time and more accurate than (...)
  • Should I pretend I'm perfect?Julia Staffel - 2017 - Res Philosophica 94 (2):301-324.
    Ideal agents are role models whose perfection in some normative domain we try to approximate. But which form should this striving take? It is well known that following ideal rules of practical reasoning can have disastrous results for non-ideal agents. Yet, this issue has not been explored with respect to rules of theoretical reasoning. I show how we can extend Bayesian models of ideally rational agents in order to pose and answer the question of whether non-ideal agents should form new (...)
  • Measuring the overall incoherence of credence functions.Julia Staffel - 2015 - Synthese 192 (5):1467-1493.
    Many philosophers hold that the probability axioms constitute norms of rationality governing degrees of belief. This view, known as subjective Bayesianism, has been widely criticized for being too idealized. It is claimed that the norms on degrees of belief postulated by subjective Bayesianism cannot be followed by human agents, and hence have no normative force for beings like us. This problem is especially pressing since the standard framework of subjective Bayesianism only allows us to distinguish between two kinds of credence (...)
  • Radical Pooling and Imprecise Probabilities.Ignacio Ojea Quintana - forthcoming - Erkenntnis:1-28.
    This paper focuses on radical pooling, or the question of how to aggregate credences when there is a fundamental disagreement about which is the relevant logical space for inquiry. The solution advanced is based on the notion of consensus as common ground, where agents can find it by suspending judgment on logical possibilities. This is exemplified with cases of scientific revolution. On a formal level, the proposal uses algebraic joins and imprecise probabilities; which is shown to be compatible with the (...)
  • The Population Ethics of Belief: In Search of an Epistemic Theory X.Richard Pettigrew - 2018 - Noûs 52 (2):336-372.
    Consider Phoebe and Daphne. Phoebe has credences in 1 million propositions. Daphne, on the other hand, has credences in all of these propositions, but she's also got credences in 999 million other propositions. Phoebe's credences are all very accurate. Each of Daphne's credences, in contrast, is not very accurate at all; each is a little more accurate than it is inaccurate, but not by much. Whose doxastic state is better, Phoebe's or Daphne's? It is clear that this question is analogous (...)
  • Précis and replies to contributors for book symposium on accuracy and the laws of credence.Richard Pettigrew - 2017 - Episteme 14 (1):1-30.
    This book symposium on Accuracy and the Laws of Credence consists of an overview of the book’s argument by the author, Richard Pettigrew, together with four commentaries on different aspects of that argument. Ben Levinstein challenges the characterisation of the legitimate measures of inaccuracy that plays a central role in the arguments of the book. Julia Staffel asks whether the arguments of the book are compatible with an ontology of doxastic states that includes full beliefs as well as credences. Fabrizio Cariani raises (...)
  • On the Accuracy of Group Credences.Richard Pettigrew - 2019 - Oxford Studies in Epistemology 6.
    We often ask for the opinion of a group of individuals. How strongly does the scientific community believe that the rate at which sea levels are rising has increased over the last 200 years? How likely does the UK Treasury think it is that there will be a recession if the country leaves the European Union? What are these group credences that such questions request? And how do they relate to the individual credences assigned by the members of the particular (...)
  • On the Expected Utility Objection to the Dutch Book Argument for Probabilism.Richard Pettigrew - 2021 - Noûs (1):23-38.
    The Dutch Book Argument for Probabilism assumes Ramsey's Thesis (RT), which purports to determine the prices an agent is rationally required to pay for a bet. Recently, a new objection to Ramsey's Thesis has emerged (Hedden 2013, Wronski & Godziszewski 2017, Wronski 2018)--I call this the Expected Utility Objection. According to this objection, it is Maximise Subjective Expected Utility (MSEU) that determines the prices an agent is required to pay for a bet, and this often disagrees with Ramsey's Thesis. I (...)
  • Jamesian epistemology formalised: An explication of ‘the will to believe’.Richard Pettigrew - 2016 - Episteme 13 (3):253-268.
    Famously, William James held that there are two commandments that govern our epistemic life: Believe truth! Shun error! In this paper, I give a formal account of James' claim using the tools of epistemic utility theory. I begin by giving the account for categorical doxastic states – that is, full belief, full disbelief, and suspension of judgment. Then I will show how the account plays out for graded doxastic states – that is, credences. The latter part of the paper thus (...)
  • Epistemic Utility and Norms for Credences.Richard Pettigrew - 2013 - Philosophy Compass 8 (10):897-908.
    Beliefs come in different strengths. An agent's credence in a proposition is a measure of the strength of her belief in that proposition. Various norms for credences have been proposed. Traditionally, philosophers have tried to argue for these norms by showing that any agent who violates them will be led by her credences to make bad decisions. In this article, we survey a new strategy for justifying these norms. The strategy begins by identifying an epistemic utility function and a decision-theoretic (...)
  • Epistemic Utility and the Normativity of Logic.Richard Pettigrew - 2017 - Logos and Episteme 8 (4):455-492.
    How does logic relate to rational belief? Is logic normative for belief, as some say? What, if anything, do facts about logical consequence tell us about norms of doxastic rationality? In this paper, we consider a range of putative logic-rationality bridge principles. These purport to relate facts about logical consequence to norms that govern the rationality of our beliefs and credences. To investigate these principles, we deploy a novel approach, namely, epistemic utility theory. That is, we assume that doxastic attitudes (...)
  • Aggregating incoherent agents who disagree.Richard Pettigrew - 2019 - Synthese 196 (7):2737-2776.
    In this paper, we explore how we should aggregate the degrees of belief of a group of agents to give a single coherent set of degrees of belief, when at least some of those agents might be probabilistically incoherent. There are a number of ways of aggregating degrees of belief, and there are a number of ways of fixing incoherent degrees of belief. When we have picked one of each, should we aggregate first and then fix, or fix first and (...)
  • Three conceptions of explaining how possibly—and one reductive account.Johannes Persson - 2009 - In Henk W. de Regt (ed.), EPSA Philosophy of Science: Amsterdam 2009. Springer. pp. 275-286.
    Philosophers of science have often favoured reductive approaches to how-possibly explanation. This article identifies three alternative conceptions making how-possibly explanation an interesting phenomenon in its own right. The first variety approaches “how possibly X?” by showing that X is not epistemically impossible. This can sometimes be achieved by removing misunderstandings concerning the implications of one’s current belief system but involves characteristically a modification of this belief system so that acceptance of X does not result in contradiction. The second variety offers (...)
  • On the Best Accuracy Arguments for Probabilism.Michael Nielsen - 2022 - Philosophy of Science 89 (3):621-630.
    In a recent paper, Pettigrew reports a generalization of the celebrated accuracy-dominance theorem due to Predd et al., but Pettigrew’s proof is incorrect. I will explain the mistakes and provide a correct proof.
  • Accuracy and Probabilism in Infinite Domains.Michael Nielsen - 2023 - Mind 132 (526):402-427.
    The best accuracy arguments for probabilism apply only to credence functions with finite domains, that is, credence functions that assign credence to at most finitely many propositions. This is a significant limitation. It reveals that the support for the accuracy-first program in epistemology is a lot weaker than it seems at first glance, and it means that accuracy arguments cannot yet accomplish everything that their competitors, the pragmatic (Dutch book) arguments, can. In this paper, I investigate the extent to which (...)
  • Symmetry and partial belief geometry.Stefan Lukits - 2021 - European Journal for Philosophy of Science 11 (3):1-24.
    When beliefs are quantified as credences, they are related to each other in terms of closeness and accuracy. The “accuracy first” approach in formal epistemology wants to establish a normative account for credences based entirely on the alethic properties of the credence: how close it is to the truth. To pull off this project, there is a need for a scoring rule. There is widespread agreement about some constraints on this scoring rule, but not whether a unique scoring rule stands (...)
  • Formal Epistemology Meets Mechanism Design.Jürgen Landes - 2023 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 54 (2):215-231.
    This article connects recent work in formal epistemology to work in economics and computer science. Analysing the Dutch Book Arguments, Epistemic Utility Theory and Objective Bayesian Epistemology we discover that formal epistemologists employ the same argument structure as economists and computer scientists. Since similar approaches often have similar problems and have shared solutions, opportunities for cross-fertilisation abound.
  • The Foundations of Epistemic Decision Theory.Jason Konek & Benjamin A. Levinstein - 2019 - Mind 128 (509):69-107.
  • Measuring inaccuracy of uncertain doxastic states in many-valued logical systems.Pavel Janda - 2016 - Journal of Applied Logic 14:95-112.
  • What probability probably isn't.C. Howson - 2015 - Analysis 75 (1):53-59.
    Joyce and others have claimed that degrees of belief are estimates of truth-values and that the probability axioms are conditions of admissibility for these estimates with respect to a scoring rule penalizing inaccuracy. In this article, I argue that the claim that the rules of probability are truth-directed in this way depends on an assumption that is both implausible and lacks any supporting evidence, strongly suggesting that the probability axioms have nothing intrinsically to do with truth-directedness.
  • Updating for Externalists.J. Dmitri Gallow - 2021 - Noûs 55 (3):487-516.
    The externalist says that your evidence could fail to tell you what evidence you do or do not have. In that case, it could be rational for you to be uncertain about what your evidence is. This is a kind of uncertainty which orthodox Bayesian epistemology has difficulty modeling. For, if externalism is correct, then the orthodox Bayesian learning norms of conditionalization and reflection are inconsistent with each other. I recommend that an externalist Bayesian reject conditionalization. In its stead, I (...)
  • The value of cost-free uncertain evidence.Patryk Dziurosz-Serafinowicz & Dominika Dziurosz-Serafinowicz - 2021 - Synthese 199 (5-6):13313-13343.
    We explore the question of whether cost-free uncertain evidence is worth waiting for in advance of making a decision. A classical result in Bayesian decision theory, known as the value of evidence theorem, says that, under certain conditions, when you update your credences by conditionalizing on some cost-free and certain evidence, the subjective expected utility of obtaining this evidence is never less than the subjective expected utility of not obtaining it. We extend this result to a type of update method, (...)
  • Chance, Resiliency, and Humean Supervenience.Patryk Dziurosz-Serafinowicz - 2019 - Erkenntnis 84 (1):1-19.
    This paper shows how a particular resiliency-centered approach to chance lends support for two conditions characterizing chance. The first condition says that the present chance of some proposition A conditional on the proposition about some later chance of A should be set equal to that later chance of A. The second condition requires the present chance of some proposition A to be equal to the weighted average of possible later chances of A. I first introduce, motivate, and make precise a (...)
  • Accuracy, Verisimilitude, and Scoring Rules.Jeffrey Dunn - 2019 - Australasian Journal of Philosophy 97 (1):151-166.
    Suppose that beliefs come in degrees. How should we then measure the accuracy of these degrees of belief? Scoring rules are usually thought to be the mathematical tool appropriate for this job. But there are many scoring rules, which lead to different ordinal accuracy rankings. Recently, Fallis and Lewis [2016] have given an argument that, if sound, rules out many popular scoring rules, including the Brier score, as genuine measures of accuracy. I respond to this argument, in part by noting (...)
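    (For reference, a standard definition rather than a quotation from the paper: the Brier score of a credence function $c$ at a world $w$, over propositions $X_1,\dots,X_n$, is $\sum_{i=1}^{n}(c(X_i) - v_w(X_i))^2$, where $v_w(X_i)$ is 1 if $X_i$ is true at $w$ and 0 otherwise; lower scores indicate greater accuracy.)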
  • Lockeans Maximize Expected Accuracy.Kevin Dorst - 2019 - Mind 128 (509):175-211.
    The Lockean Thesis says that you must believe p iff you’re sufficiently confident of it. On some versions, the 'must' asserts a metaphysical connection; on others, it asserts a normative one. On some versions, 'sufficiently confident' refers to a fixed threshold of credence; on others, it varies with proposition and context. Claim: the Lockean Thesis follows from epistemic utility theory—the view that rational requirements are constrained by the norm to promote accuracy. Different versions of this theory generate different versions of (...)
  • Advanced modalizing de dicto and de re.John Divers & John J. Parry - 2018 - Analysis 78 (3):415-425.
    Lewis’ analysis of modality faces a problem in that it appears to confer unintended truth values to certain modal claims about the pluriverse: e.g. ‘It is possible that there are many worlds’ is false when we expect truth. This is the problem of advanced modalizing. Divers presents a principled solution to this problem by treating modal modifiers as semantically redundant in some such cases. However, this semantic move does not deal adequately with advanced de re modal claims. Here, we motivate (...)
  • Formal Representations of Belief.Franz Huber - 2008 - Stanford Encyclopedia of Philosophy.
    Epistemology is the study of knowledge and justified belief. Belief is thus central to epistemology. It comes in a qualitative form, as when Sophia believes that Vienna is the capital of Austria, and a quantitative form, as when Sophia's degree of belief that Vienna is the capital of Austria is at least twice her degree of belief that tomorrow it will be sunny in Vienna. Formal epistemology, as opposed to mainstream epistemology (Hendricks 2006), is epistemology done in a formal way, (...)