This book explores a question central to philosophy: what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. This book rejects that view and replaces it with another: in order for one to have justification for believing a proposition, one's evidence must normically support it; roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional If A, then C equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake but differs for modal and conditional sentences.
Many have argued that a rational agent's attitude towards a proposition may be better represented by a probability range than by a single number. I show that in such cases an agent will have unstable betting behaviour, and so will behave in an unpredictable way. I use this point to argue against a range of responses to the ‘two bets’ argument for sharp probabilities.
This paper is about teaching probability to students of philosophy who don’t aim to do primarily formal work in their research. These students are unlikely to seek out classes about probability or formal epistemology for various reasons, for example because they don’t realize that this knowledge would be useful for them or because they are intimidated by the material. However, most areas of philosophy now contain debates that incorporate probability, and basic knowledge of it is essential even for philosophers whose work isn’t primarily formal. In this paper, I explain how to teach probability to students who are not already enthusiastic about formal philosophy, taking into account the common phenomena of math anxiety and the lack of reading skills for formal texts. I address course design, lesson design, and assignment design. Most of my recommendations also apply to teaching formal methods other than probability theory.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
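To make the kind of sequential two-bet case at issue concrete, here is a minimal Python sketch; the payoff numbers and the [0.2, 0.8] credence interval are illustrative assumptions rather than details taken from Elga's paper or from this one.

```python
# Minimal sketch of a sequential two-bet case of the sort at issue; the payoff
# numbers and the [0.2, 0.8] credence interval are illustrative assumptions.
# Bet A: lose 10 if H is true, win 15 if H is false.
# Bet B: win 15 if H is true, lose 10 if H is false.
# Accepting both bets guarantees a net gain of 5 whatever H's truth value.

def ev_a(p):
    """Expected value of Bet A for a sharp credence p in H."""
    return -10 * p + 15 * (1 - p)

def ev_b(p):
    """Expected value of Bet B for a sharp credence p in H."""
    return 15 * p - 10 * (1 - p)

# A sharp expected-value maximizer never rejects both bets:
assert all(ev_a(i / 100) > 0 or ev_b(i / 100) > 0 for i in range(101))

# With an imprecise credence spanning [0.2, 0.8], each bet is rejected by some
# credence in the set, so on some imprecise decision rules rejecting both is
# permissible, thereby forgoing the sure gain of 5.
print(ev_a(0.8), ev_b(0.2))   # roughly -5 and -5: each bet looks bad at one endpoint
```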
Subjective probability plays an increasingly important role in many fields concerned with human cognition and behavior. Yet there have been significant criticisms of the idea that probabilities could actually be represented in the mind. This paper presents and elaborates a view of subjective probability as a kind of sampling propensity associated with internally represented generative models. The resulting view answers to some of the most well-known criticisms of subjective probability, and is also supported by empirical work in neuroscience and behavioral psychology. The repercussions of the view for how we conceive of many ordinary instances of subjective probability, and how it relates to more traditional conceptions of subjective probability, are discussed in some detail.
I argue that when we use ‘probability’ language in epistemic contexts—e.g., when we ask how probable some hypothesis is, given the evidence available to us—we are talking about degrees of support, rather than degrees of belief. The epistemic probability of A given B is the mind-independent degree to which B supports A, not the degree to which someone with B as their evidence believes A, or the degree to which someone would or should believe A if they had B as their evidence. My central argument is that the degree-of-support interpretation lets us better model good reasoning in certain cases involving old evidence. Degree-of-belief interpretations make the wrong predictions not only about whether old evidence confirms new hypotheses, but about the values of the probabilities that enter into Bayes’ Theorem when we calculate the probability of hypotheses conditional on old evidence and new background information.
John Maynard Keynes’s A Treatise on Probability is the seminal text for the logical interpretation of probability. According to his analysis, probabilities are evidential relations between a hypothesis and some evidence, just like the relations of deductive logic. While some philosophers had suggested similar ideas prior to Keynes, it was not until his Treatise that the logical interpretation of probability was advocated in a clear, systematic and rigorous way. I trace Keynes’s influence in the philosophy of probability through a heterogeneous sample of thinkers who adopted his interpretation. This sample consists of Frederick C. Benenson, Roy Harrod, Donald C. Williams, Henry E. Kyburg and David Stove. The ideas of Keynes prove to be adaptable to their diverse theories of probability. My discussion indicates both the robustness of Keynes’s probability theory and the importance of its influence on the philosophers whom I describe. I also discuss the Problem of the Priors. I argue that none of those I discuss have obviously improved on Keynes’s theory with respect to this issue.
The role of probability is one of the most contested issues in the interpretation of contemporary physics. In this paper, I’ll be reevaluating some widely held assumptions about where and how probabilities arise. Larry Sklar voices the conventional wisdom about probability in classical physics in a piece in the Stanford Online Encyclopedia of Philosophy, when he writes that “Statistical mechanics was the first foundational physical theory in which probabilistic concepts and probabilistic explanation played a fundamental role.” And the conventional wisdom about quantum probabilities is that they are basic, not reducible to the types of probabilities we see in statistical mechanics. In the first section of this paper, I’ll argue that in fact classical physics was steeped in probability long before statistical mechanics came on the scene, specifically, that an objective measure over phase space is an indispensable component of any informative physical theory. In the next section, I’ll argue that this objective measure is the fundamental form of physical probability and that quantum probabilities can be defined in terms of it. In the last, I’ll raise some questions about the metaphysical status of the fundamental measure.
The article is a plea for ethicists to regard probability as one of their most important concerns. It outlines a series of topics of central importance in ethical theory in which probability is implicated, often in a surprisingly deep way, and lists a number of open problems. Topics covered include: interpretations of probability in ethical contexts; the evaluative and normative significance of risk or uncertainty; uses and abuses of expected utility theory; veils of ignorance; Harsanyi’s aggregation theorem; population size problems; equality; fairness; giving priority to the worse off; continuity; incommensurability; nonexpected utility theory; evaluative measurement; aggregation; causal and evidential decision theory; act consequentialism; rule consequentialism; and deontology.
Modern scientific cosmology pushes the boundaries of knowledge and the knowable. This is prompting questions on the nature of scientific knowledge. A central issue is what defines a 'good' model. When addressing global properties of the Universe or its initial state this becomes a particularly pressing issue. How to assess the probability of the Universe as a whole is empirically ambiguous, since we can examine only part of a single realisation of the system under investigation: at some point, data will run out. We review the basics of applying Bayesian statistical explanation to the Universe as a whole. We argue that a conventional Bayesian approach to model inference generally fails in such circumstances, and cannot resolve, e.g., the so-called 'measure problem' in inflationary cosmology. Implicit and non-empirical valuations inevitably enter model assessment in these cases. This undermines the possibility of performing Bayesian model comparison. One must therefore either stay silent, or pursue a more general form of systematic and rational model assessment. We outline a generalised axiological Bayesian model inference framework, based on mathematical lattices. This extends inference based on empirical data (evidence) to additionally consider the properties of model structure (elegance) and model possibility space (beneficence). We propose this as a natural and theoretically well-motivated framework for introducing an explicit, rational approach to theoretical model prejudice and inference beyond data.
Enjoying great popularity in decision theory, epistemology, and philosophy of science, Bayesianism as understood here is fundamentally concerned with epistemically ideal rationality. It assumes a tight connection between evidential probability and ideally rational credence, and usually interprets evidential probability in terms of such credence. Timothy Williamson challenges Bayesianism by arguing that evidential probabilities cannot be adequately interpreted as the credences of an ideal agent. From this and his assumption that evidential probabilities cannot be interpreted as the actual credences of human agents either, he concludes that no interpretation of evidential probabilities in terms of credence is adequate. I argue to the contrary. My overarching aim is to show on behalf of Bayesians how one can still interpret evidential probabilities in terms of ideally rational credence and how one can maintain a tight connection between evidential probabilities and ideally rational credence even if the former cannot be interpreted in terms of the latter. By achieving this aim I illuminate the limits and prospects of Bayesianism.
There is a trade-off between specificity and accuracy in existing models of belief. Descriptions of agents in the tripartite model, which recognizes only three doxastic attitudes—belief, disbelief, and suspension of judgment—are typically accurate, but not sufficiently specific. The orthodox Bayesian model, which requires real-valued credences, is perfectly specific, but often inaccurate: we often lack precise credences. I argue, first, that a popular attempt to fix the Bayesian model by using sets of functions is also inaccurate, since it requires us to have interval-valued credences with perfectly precise endpoints. We can see this problem as analogous to the problem of higher order vagueness. Ultimately, I argue, the only way to avoid these problems is to endorse Insurmountable Unclassifiability. This principle has some surprising and radical consequences. For example, it entails that the trade-off between accuracy and specificity is in-principle unavoidable: sometimes it is simply impossible to characterize an agent’s doxastic state in a way that is both fully accurate and maximally specific. What we can do, however, is improve on both the tripartite and existing Bayesian models. I construct a new model of belief—the minimal model—that allows us to characterize agents with much greater specificity than the tripartite model, and yet which remains, unlike existing Bayesian models, perfectly accurate.
Stalnaker's Thesis about indicative conditionals is, roughly, that the probability one ought to assign to an indicative conditional equals the probability that one ought to assign to its consequent conditional on its antecedent. The thesis seems right. If you draw a card from a standard 52-card deck, how confident are you that the card is a diamond if it's a red card? To answer this, you calculate the proportion of red cards that are diamonds; that is, you calculate the probability of drawing a diamond conditional on drawing a red card. Skyrms' Thesis about counterfactual conditionals is, roughly, that the probability that one ought to assign to a counterfactual equals one's rational expectation of the chance, at a relevant past time, of its consequent conditional on its antecedent. This thesis also seems right. If you decide not to enter a 100-ticket lottery, how confident are you that you would have won had you bought a ticket? To answer this, you calculate the prior chance, that is, the chance just before your decision not to buy a ticket, of winning conditional on entering the lottery. The central project of this article is to develop a new uniform theory of conditionals that allows us to derive a version of Skyrms' Thesis from a version of Stalnaker's Thesis, together with a chance-deference norm relating rational credence to beliefs about objective chance.
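Spelled out for the deck example (my arithmetic, following the abstract's own setup):

$$P(\text{diamond} \mid \text{red}) = \frac{P(\text{diamond} \wedge \text{red})}{P(\text{red})} = \frac{13/52}{26/52} = \frac{1}{2},$$

so, on Stalnaker's Thesis, the probability to assign to 'if the card is red, it is a diamond' is likewise 1/2.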
Probability can be used to measure degree of belief in two ways: objectively and subjectively. The objective measure is a measure of the rational degree of belief in a proposition given a set of evidential propositions. The subjective measure is the measure of a particular subject’s dispositions to decide between options. In both measures, certainty is a degree of belief of 1. I will show, however, that there can be cases where one belief is stronger than another yet both beliefs are plausibly measurable as objectively and subjectively certain. In ordinary language, we can say that while both beliefs are certain, one belief is more certain than the other. I will then propose a second, non-probabilistic dimension of measurement, which tracks this variation in certainty in such cases where the probability is 1. A general principle of rationality is that one’s subjective degree of belief should match the rational degree of belief given the evidence available. In this paper I hope to show that it is also a rational principle that the maximum stake size at which one should remain certain should match the rational weight of certainty given the evidence available. Neither objective nor subjective measures of certainty conform to the axioms of probability, but instead are measured in utility. This has the consequence that, although it is often rational to be certain to some degree, there is no such thing as absolute certainty.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent’s degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. In particular, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent’s prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent’s prior beliefs. Quantum certainty is therefore always some agent’s certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.
According to the Lockean thesis, a proposition is believed just in case it is highly probable. While this thesis enjoys strong intuitive support, it is known to conflict with seemingly plausible logical constraints on our beliefs. One way out of this conflict is to make probability 1 a requirement for belief, but most have rejected this option for entailing what they see as an untenable skepticism. Recently, two new solutions to the conflict have been proposed that are alleged to be non-skeptical. We compare these proposals with each other and with the Lockean thesis, in particular with regard to the question of how much we gain by adopting any one of them instead of the probability 1 requirement, that is, of how likely it is that one believes more than the things one is fully certain of.
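The standard lottery-style illustration of the conflict, sketched in Python (the 1,000-ticket setup and the 0.99 threshold are assumed for illustration, not drawn from the paper):

```python
# Standard lottery-style illustration (assumed example): with a Lockean
# threshold below 1, each "ticket i loses" is highly probable and so believed,
# yet the resulting beliefs jointly conflict with believing "some ticket wins".
n = 1000
threshold = 0.99                       # Lockean threshold for belief

p_ticket_loses = 1 - 1 / n             # 0.999 for each individual ticket
p_some_ticket_wins = 1.0               # the lottery is known to have a winner

assert p_ticket_loses > threshold      # each "ticket i loses" is believed
assert p_some_ticket_wins > threshold  # and "some ticket wins" is believed
# But {"ticket 1 loses", ..., "ticket n loses", "some ticket wins"} is jointly
# inconsistent, so closure under conjunction would force belief in a
# contradiction.  Requiring probability 1 for belief avoids this, at the cost
# of the skepticism the abstract mentions.
```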
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
I present new counterexamples to the asymmetry of grounding: we have prima facie reason to think that some conditional probabilities partially ground their inverse conditional probabilities, and vice versa. These new counterexamples may require that we reject the asymmetry of grounding, or alternatively may require that we reject one or more of the assumptions which enable the counterexamples. Either way, by reflecting on these purported counterexamples to grounding asymmetry we learn something important, either about the formal properties of grounding, or about the nature of probability.
In finite probability theory, events are subsets S⊆U of the outcome set. Subsets can be represented by one-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or `measurements' of all these events can be determined in a manner exactly like in quantum mechanics (QM) using density matrices. Moreover, the transformation of the density matrices induced by the experiments or `measurements' is the Lüders mixture operation as in QM. And finally, by moving the machinery into the n-dimensional vector space over ℤ₂, different basis sets become different outcome sets. That `non-commutative' extension of finite probability theory yields the pedagogical model of quantum mechanics over ℤ₂ that can model many characteristic non-classical results of QM.
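A small numerical sketch of the classical-event case (my own conventions, assuming equiprobable outcomes; the paper's notation may differ): an event S is encoded as a normalized indicator vector, its density matrix is the outer product, and the probability of an event T in a measurement is tr(P_T ρ(S)), which recovers the classical conditional probability |T∩S|/|S|.

```python
import numpy as np

# Sketch under assumed conventions (equiprobable outcomes on U = {0, 1, 2, 3}):
# the event S becomes a normalized indicator vector, its density matrix is the
# outer product, and Pr(T | S) = tr(P_T rho_S) = |T ∩ S| / |S|.
U = 4
S = [0, 1, 2]
T = [1, 2, 3]

s = np.zeros(U)
s[S] = 1 / np.sqrt(len(S))                  # normalized indicator vector of S
rho_S = np.outer(s, s)                      # density matrix of the classical event S

P_T = np.diag([1.0 if i in T else 0.0 for i in range(U)])   # projection onto T

prob = float(np.trace(P_T @ rho_S))
print(prob)                                 # 0.666..., i.e. |T ∩ S| / |S|
assert np.isclose(prob, len(set(T) & set(S)) / len(S))
```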
This paper argues that the technical notion of conditional probability, as given by the ratio analysis, is unsuitable for dealing with our pretheoretical and intuitive understanding of both conditionality and probability. The alternative is an ontological account of conditionals that includes an irreducible dispositional connection between the antecedent and consequent conditions and on which the conditional has to be treated as an indivisible whole rather than as compositional. The relevant type of conditionality is found in some well-defined group of conditional statements. As an alternative, therefore, we briefly offer grounds for what we would call an ontological reading of both conditionality and conditional probability in general. It is not offered as a fully developed theory of conditionality but can be used, we claim, to explain why calculations according to the RATIO scheme do not coincide with our intuitive notion of conditional probability. What it shows us is that for an understanding of the whole range of conditionals we will need what John Heil (2003), in response to Quine (1953), calls an ontological point of view.
Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper’s Paradox of Ideal Evidence.
Early work on the frequency theory of probability made extensive use of the notion of randomness, conceived of as a property possessed by disorderly collections of outcomes. Growing out of this work, a rich mathematical literature on algorithmic randomness and Kolmogorov complexity developed through the twentieth century, but largely lost contact with the philosophical literature on physical probability. The present chapter begins with a clarification of the notions of randomness and probability, conceiving of the former as a property of a sequence of outcomes, and the latter as a property of the process generating those outcomes. A discussion follows of the nature and limits of the relationship between the two notions, with largely negative verdicts on the prospects for any reduction of one to the other, although the existence of an apparently random sequence of outcomes is good evidence for the involvement of a genuinely chancy process.
Recent years have witnessed a proliferation of attempts to apply the mathematical theory of probability to the semantics of natural language probability talk. These sorts of “probabilistic” semantics are often motivated by their ability to explain intuitions about inferences involving “likely” and “probably”—intuitions that Angelika Kratzer’s canonical semantics fails to accommodate through a semantics based solely on an ordering of worlds and a qualitative ranking of propositions. However, recent work by Wesley Holliday and Thomas Icard has been widely thought to undercut this motivation: they present a world-ordering semantics that yields essentially the same logic as probabilistic semantics. In this paper, I argue that the challenge remains: defenders of world-ordering semantics have yet to offer a plausible semantics that captures the logic of comparative likelihood. Holliday & Icard’s semantics yields an adequate logic only if models are restricted to Noetherian pre-orders. But I argue that the Noetherian restriction faces problems in cases involving infinitely large domains of epistemic possibilities. As a result, probabilistic semantics remains the better explanation of the data.
Bayesianism is the position that scientific reasoning is probabilistic and that probabilities are adequately interpreted as an agent's actual subjective degrees of belief, measured by her betting behaviour. Confirmation is one important aspect of scientific reasoning. The thesis of this paper is the following: if scientific reasoning is at all probabilistic, the subjective interpretation has to be given up in order to get confirmation right—and thus scientific reasoning in general. Section overview: The Bayesian approach to scientific reasoning; Bayesian confirmation theory; The example; The less reliable the source of information, the higher the degree of Bayesian confirmation; Measure sensitivity; A more general version of the problem of old evidence; Conditioning on the entailment relation; The counterfactual strategy; Generalizing the counterfactual strategy; The desired result, and a necessary and sufficient condition for it; Actual degrees of belief; The common knock-down feature, or ‘anything goes’; The problem of prior probabilities.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well-understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
If we add as an extra premise that the agent does know H, then it is possible for her to know E H, we get the conclusion that the agent does not really know H. But even without that closure premise, or something like it, the conclusion seems quite dramatic. One possible response to the argument, floated by both Descartes and Hume, is to accept the conclusion and embrace scepticism. We cannot know anything that goes beyond our evidence, so we do not know very much at all. This is a remarkably sceptical conclusion, so we should resist it if at all possible. A more modern response, associated with externalists like John McDowell and Timothy Williamson, is to accept the conclusion but deny it is as sceptical as it first appears. The Humean argument, even if it works, only shows that our evidence and our knowledge are more closely linked than we might have thought. Perhaps that’s true because we have a lot of evidence, not because we have very little knowledge. There’s something right about this response, I think. We have more evidence than Descartes or Hume thought we had. But I think we still need the idea of ampliative knowledge. It stretches the concept of evidence to breaking point to suggest that all of our knowledge, including knowledge about the future, is part of our evidence. So the conclusion really is unacceptable. Or, at least, I think we should try to see what an epistemology that rejects the conclusion looks like.
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks what probabilities are not determined in this way—these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
Epistemic closure under known implication is the principle that knowledge of "p" and knowledge of "p implies q", together, imply knowledge of "q". This principle is intuitive, yet several putative counterexamples have been formulated against it. This paper addresses the question, why is epistemic closure both intuitive and prone to counterexamples? In particular, the paper examines whether probability theory can offer an answer to this question based on four strategies. The first probability-based strategy rests on the accumulation of risks. The problem with this strategy is that risk accumulation cannot accommodate certain counterexamples to epistemic closure. The second strategy is based on the idea of evidential support, that is, a piece of evidence supports a proposition whenever it increases the probability of the proposition. This strategy makes progress and can accommodate certain putative counterexamples to closure. However, this strategy also gives rise to a number of counterintuitive results. Finally, there are two broadly probabilistic strategies, one based on the idea of resilient probability and the other on the idea of assumptions that are taken for granted. These strategies are promising but are prone to some of the shortcomings of the second strategy. All in all, I conclude that each strategy fails. Probability theory, then, is unlikely to offer the account we need.
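To see how the first, risk-accumulation strategy works, note that since p together with p → q entails q (the numerical illustration below is mine, not the paper's):

$$P(q) \;\ge\; P(p \wedge (p \to q)) \;\ge\; P(p) + P(p \to q) - 1.$$

So if, say, knowledge required probability of at least 0.95, an agent could have P(p) = 0.96 and P(p → q) = 0.96 while the guaranteed lower bound on P(q) is only 0.92; the small risks attached to the premises can accumulate in the conclusion.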
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
A probability distribution is regular if no possible event is assigned probability zero. While some hold that probabilities should always be regular, three counter-arguments have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but we see here that their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Having failed to refute the symmetry arguments on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.
Recently there have been several attempts in formal epistemology to develop an adequate probabilistic measure of coherence. There is much to recommend probabilistic measures of coherence. They are quantitative and render formally precise a notion—coherence—notorious for its elusiveness. Further, some of them do very well, intuitively, on a variety of test cases. Siebel, however, argues that there can be no adequate probabilistic measure of coherence. Take some set of propositions A, some probabilistic measure of coherence, and a probability distribution such that all the probabilities on which A’s degree of coherence depends (according to the measure in question) are defined. Then, the argument goes, the degree to which A is coherent depends solely on the details of the distribution in question and not at all on the explanatory relations, if any, standing between the propositions in A. This is problematic, the argument continues, because, first, explanation matters for coherence, and, second, explanation cannot be adequately captured solely in terms of probability. We argue that Siebel’s argument falls short.
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values. The main technical problem studied in this paper is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list includes, among other things, that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria and various (counter) examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
Many epistemologists hold that an agent can come to justifiably believe that p is true by seeing that it appears that p is true, without having any antecedent reason to believe that visual impressions are generally reliable. Certain reliabilists think this, at least if the agent’s vision is generally reliable. And it is a central tenet of dogmatism (as described by Pryor (2000) and Pryor (2004)) that this is possible. Against these positions it has been argued (e.g. by Cohen (2005) and White (2006)) that this violates some principles from probabilistic learning theory. To see the problem, let’s note what the dogmatist thinks we can learn by paying attention to how things appear. (The reliabilist says the same things, but we’ll focus on the dogmatist.) Suppose an agent receives an appearance that p, and comes to believe that p. Letting Ap be the proposition that it appears to the agent that p, and → be the material implication, we can say that the agent learns that p, and hence is in a position to infer Ap → p, once they receive the evidence Ap. This is surprising, because we can prove the following.
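The provable result being alluded to is presumably the familiar observation from this literature (stated here for the material conditional, as my reconstruction rather than a quotation): conditionalizing on Ap can never raise the probability of Ap → p, since

$$P(Ap \to p \mid Ap) = P(p \mid Ap) \;\le\; P(\neg Ap) + P(p \mid Ap)\,P(Ap) = P(Ap \to p),$$

with strict inequality whenever P(¬Ap) > 0 and P(p | Ap) < 1. That is what generates the tension with the dogmatist's claim that receiving the evidence Ap can put one in a position to infer Ap → p.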
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
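One familiar way to relativize the Kolmogorov axioms to a consequence relation ⊢, in the spirit the abstract describes (stated as an illustration; the paper's exact axiom set may differ): require that P(A) = 1 whenever ⊢ A; P(A) = 0 whenever A ⊢ ⊥; P(A) ≤ P(B) whenever A ⊢ B; and P(A) + P(B) = P(A ∨ B) + P(A ∧ B). Taking ⊢ to be classical consequence recovers the standard finitely additive probability functions; taking it to be intuitionistic consequence yields intuitionistic probability functions of the sort the abstract discusses.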
In this paper, I will attempt to develop and defend a common form of intuitive resistance to the companions in guilt argument. I will argue that one can reasonably believe there are promising solutions to the access problem for mathematical realism that don’t translate to moral realism. In particular, I will suggest that the structuralist project of accounting for mathematical knowledge in terms of some form of logical knowledge offers significant hope of success while no analogous approach offers such hope for moral realism.
In probability discounting (or probability weighting), one multiplies the value of an outcome by one's subjective probability that the outcome will obtain when making decisions. The broader import of defending probability discounting is to help justify cost-benefit analyses in contexts such as climate change. This chapter defends probability discounting under risk both negatively, against arguments by Simon Caney (2008, 2009), and positively, with a new argument. First, in responding to Caney, I argue that small costs and benefits need to be evaluated, and that viewing practices at the social level is too coarse-grained. Second, I argue for probability discounting using a distinction between causal responsibility and moral responsibility. Moral responsibility can be cashed out in terms of blameworthiness and praiseworthiness, while causal responsibility obtains in full for any effect which is part of a causal chain linked to one's act. With this distinction in hand, unlike causal responsibility, moral responsibility can be seen as coming in degrees. My argument is that, given that we can limit our deliberation and consideration to that for which we are morally responsible, and that our moral responsibility for outcomes is limited by our subjective probabilities, our subjective probabilities can ground probability discounting.
Conditional probability is often used to represent the probability of the conditional. However, triviality results suggest that the thesis that the probability of the conditional always equals conditional probability leads to untenable conclusions. In this paper, I offer an interpretation of this thesis in a possible worlds framework, arguing that the triviality results make assumptions at odds with the use of conditional probability. I argue that these assumptions come from a theory called the operator theory and that the rival restrictor theory can avoid these problematic assumptions. In doing so, I argue that recent extensions of the triviality arguments to restrictor conditionals fail, making assumptions which are only justified on the operator theory.
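For orientation, the canonical Lewis-style triviality derivation runs roughly as follows (a sketch of the standard argument; its assumptions are exactly the ones at issue here). Suppose P(A → C) = P(C | A) holds for every P in a class closed under conditionalization, and that P(A ∧ C) > 0 and P(A ∧ ¬C) > 0. Then

$$P(A \to C \mid C) = P(C \mid A \wedge C) = 1, \qquad P(A \to C \mid \neg C) = P(C \mid A \wedge \neg C) = 0,$$

so by the law of total probability P(A → C) = 1·P(C) + 0·P(¬C) = P(C), and hence P(C | A) = P(C) for all such A and C, an untenable conclusion.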
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raised not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
Given a few assumptions, the probability of a conjunction is raised, and the probability of its negation is lowered, by conditionalising upon one of the conjuncts. This simple result appears to bring Bayesian confirmation theory into tension with the prominent dogmatist view of perceptual justification – a tension often portrayed as a kind of ‘Bayesian objection’ to dogmatism. In a recent paper, David Jehle and Brian Weatherson observe that, while this crucial result holds within classical probability theory, it fails within intuitionistic probability theory. They conclude that the dogmatist who is willing to take intuitionistic logic seriously can make a convincing reply to the Bayesian objection. In this paper, I argue that this conclusion is premature – the Bayesian objection can survive the transition from classical to intuitionistic probability, albeit in a slightly altered form. I shall conclude with some general thoughts about what the Bayesian objection to dogmatism does and doesn’t show.
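The simple result in question, in its classical form (a standard derivation, included for orientation): if 0 < P(A) < 1 and P(A ∧ B) > 0, then

$$P(A \wedge B \mid A) = \frac{P(A \wedge B)}{P(A)} > P(A \wedge B), \qquad P(\neg(A \wedge B) \mid A) < P(\neg(A \wedge B)).$$

In the dogmatism debate, A is the proposition that it appears that p and B is ¬p, so conditionalizing on the appearance raises the probability of A ∧ ¬p and lowers that of the material conditional 'if it appears that p, then p'; that is the source of the alleged tension.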
In this paper, I will claim that fictional works apparently about utterly immigrant objects, i.e., real individuals imported into fiction from reality, are instead about fictional individuals that intentionally resemble those real individuals in a significant manner: fictional surrogates of such individuals. Since I also share the realists’ conviction that the remaining fictional works concern native characters, i.e., full-fledged fictional individuals that originate in fiction itself, I will here defend a hyperrealist position according to which fictional works only concern fictional individuals.