References in:
Imprecise Probabilities
In Richard Pettigrew & Jonathan Weisberg (eds.), The Open Handbook of Formal Epistemology. PhilPapers Foundation. pp. 107-130 (2019)


When making decisions, people naturally face uncertainty about the potential consequences of their actions, due in part to limits in their capacity to represent, evaluate, or deliberate. Nonetheless, they aim to make the best decisions possible. In Decision Theory with a Human Face, Richard Bradley develops new theories of agency and rational decision-making, offering guidance on how 'real' agents who are aware of their bounds should represent the uncertainty they face, how they should revise their opinions as a result of (...)

Bayesian models typically assume that agents are rational, logically omniscient and opinionated. The last of these has little descriptive or normative appeal, however, and limits our ability to describe how agents make up their minds (as opposed to changing them) or how they can suspend or withdraw their opinions. To address these limitations, this paper represents the attitudinal states of non-opinionated agents by sets of (permissible) probability and desirability functions. Several basic ways in which such states of mind can be (...)

Vague subjective probability may be modeled by means of a set of probability functions, so that the represented opinion has only a lower and upper bound. The standard rule of conditionalization can be straightforwardly adapted to this. But this combination has difficulties which, though well known in the technical literature, have not been given sufficient attention in probabilist or Bayesian epistemology. Specifically, updating on apparently irrelevant bits of news can be destructive of one’s explicitly prior expectations. Stability of vague subjective (...) 
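The set-of-probabilities model described in this abstract is easy to make concrete. The sketch below is an illustration only (the three-state space and the particular credal set are invented for the example, not taken from the paper): each function in the set is conditionalized separately, and the lower and upper bounds are read off before and after the update.

```python
from fractions import Fraction as F

# Hypothetical three-state space and a credal set of two permissible
# probability functions over it (values invented for illustration).
credal_set = [
    {"a": F(1, 2), "b": F(1, 4), "c": F(1, 4)},
    {"a": F(1, 4), "b": F(1, 2), "c": F(1, 4)},
]

def bounds(event, functions):
    """Lower and upper probability of an event across a set of functions."""
    vals = [sum(p[s] for s in event) for p in functions]
    return min(vals), max(vals)

def conditionalize(p, evidence):
    """Standard conditionalization of a single probability function."""
    pe = sum(p[s] for s in evidence)
    return {s: (p[s] / pe if s in evidence else F(0)) for s in p}

# Update every member of the set on the evidence {a, b}.
updated = [conditionalize(p, {"a", "b"}) for p in credal_set]

print(bounds({"a"}, credal_set))  # prior interval for "a"
print(bounds({"a"}, updated))     # posterior interval for "a"
```

Updating pointwise like this is the straightforward adaptation of conditionalization the abstract mentions; the instability it discusses concerns how such intervals can behave under seemingly irrelevant evidence.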

We use probability-matching variations on Ellsberg’s single-urn experiment to assess three questions: (1) How sensitive are ambiguity attitudes to changes from a gain to a loss frame? (2) How sensitive are ambiguity attitudes to making ambiguity easier to recognize? (3) What is the relation between subjects’ consistency of choice and the ambiguity attitudes their choices display? Contrary to most other studies, we find that a switch from a gain to a loss frame does not lead to a switch from ambiguity (...)

_Epistemology and Inference_ was first published in 1983. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; they are published unaltered from the original University of Minnesota Press editions. Henry Kyburg has developed an original and important perspective on probabilistic and statistical inference. Unlike much contemporary writing by philosophers on these topics, Kyburg's work is informed by issues that have arisen in statistical theory and practice as well as issues familiar to professional philosophers. In two (...)

There has been much recent interest in imprecise probabilities, models of belief that allow unsharp or fuzzy credence. There have also been some influential criticisms of this position. Here we argue, chiefly against Elga, that subjective probabilities need not be sharp. The key question is whether the imprecise probabilist can make reasonable sequences of decisions. We argue that she can. We outline Elga's argument and clarify the assumptions he makes and the principles of rationality he is implicitly committed to. We (...) 

Traditional Bayesianism requires that an agent’s degrees of belief be represented by a real-valued, probabilistic credence function. However, in many cases it seems that our evidence is not rich enough to warrant such precision. In light of this, some have proposed that we instead represent an agent’s degrees of belief as a set of credence functions. This way, we can respect the evidence by requiring that the set, often called the agent’s credal state, includes all credence functions that are in (...)

We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable (...) 

Decisions are made under uncertainty when there are distinct outcomes of a given action, and one is uncertain to which the act will lead. Decisions are made under indeterminacy when there are distinct outcomes of a given action, and it is indeterminate to which the act will lead. This paper develops a theory of (synchronic and diachronic) decision-making under indeterminacy that portrays the rational response to such situations as inconstant. Rational agents have to capriciously and randomly choose how to resolve (...)

Dilation occurs when an interval probability estimate of some event E is properly included in the interval probability estimate of E conditional on every event F of some partition, which means that one’s initial estimate of E becomes less precise no matter how an experiment turns out. Critics maintain that dilation is a pathological feature of imprecise probability models, while others have thought the problem is with Bayesian updating. However, two points are often overlooked: (1) knowing that E is stochastically (...) 
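Dilation can be reproduced in a few lines. The construction below is a standard textbook example (not taken from the paper): toss a fair coin, and let a second event agree with the first toss with unknown chance a. The unconditional estimate of the second event is sharply 1/2 for every member of the credal set, yet conditioning on either outcome of the toss dilates it to the whole unit interval.

```python
# Credal set indexed by the unknown agreement chance a in [0, 1]:
# P(H1) = 1/2 exactly; P(H2 | H1) = a; P(H2 | T1) = 1 - a.
def p_h2(a):
    """Unconditional probability of H2, by total probability."""
    return 0.5 * a + 0.5 * (1 - a)

def p_h2_given_h1(a):
    """Probability of H2 after learning the first toss landed heads."""
    return a

grid = [i / 10 for i in range(11)]  # sample the credal set

prior = {p_h2(a) for a in grid}
posterior = [p_h2_given_h1(a) for a in grid]

print(prior)                           # the prior is the sharp value 0.5
print(min(posterior), max(posterior))  # the posterior dilates to [0, 1]
```

The same dilation occurs conditional on tails, so the estimate becomes less precise no matter how the toss comes out, which is exactly the feature critics find pathological.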

According to the Imprecise Credence Framework (ICF), a rational believer's doxastic state should be modelled by a set of probability functions rather than a single probability function, namely, the set of probability functions allowed by the evidence (Joyce [2005]). Roger White ([2010]) has recently given an arresting argument against the ICF, which has garnered a number of responses. In this article, I attempt to cast doubt on his argument. First, I point out that it's not an (...)

The pragmatic character of the Dutch book argument makes it unsuitable as an "epistemic" justification for the fundamental probabilist dogma that rational partial beliefs must conform to the axioms of probability. To secure an appropriately epistemic justification for this conclusion, one must explain what it means for a system of partial beliefs to accurately represent the state of the world, and then show that partial beliefs that violate the laws of probability are invariably less accurate than they could be otherwise. (...) 

According to Bayesian epistemology, the epistemically rational agent updates her beliefs by conditionalization: that is, her posterior subjective probability after taking account of evidence X, pnew, is to be set equal to her prior conditional probability pold(· | X). Bayesians can be challenged to provide a justification for their claim that conditionalization is recommended by rationality—whence the normative force of the injunction to conditionalize? There are several existing justifications for conditionalization, but none directly addresses the idea that conditionalization will be epistemically rational (...)
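The update rule at issue is simple to state computationally: pnew(A) = pold(A and X) / pold(X). A minimal sketch (the weather worlds and their prior values are invented for illustration):

```python
def conditionalize(prior, evidence):
    """Posterior over worlds after learning `evidence` (a set of worlds)."""
    p_evidence = sum(p for w, p in prior.items() if w in evidence)
    return {w: (p / p_evidence if w in evidence else 0.0)
            for w, p in prior.items()}

# Hypothetical prior over four weather worlds.
prior = {"sun-warm": 0.4, "sun-cold": 0.2, "rain-warm": 0.1, "rain-cold": 0.3}

# Evidence X: it is sunny.
posterior = conditionalize(prior, {"sun-warm", "sun-cold"})
print(posterior["sun-warm"])  # 0.4 / 0.6, roughly 0.667
```

The justificatory question the abstract raises is why this renormalization, rather than some other redistribution of probability over the worlds consistent with X, is what rationality demands.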

(...) the symmetry of our evidential situation. If our confidence is best modeled by a standard probability function, this means that we are to distribute our subjective probability or credence sharply and evenly over possibilities among which our evidence does not discriminate. Once thought to be the central principle of probabilistic reasoning by great (...)

Orthodox Bayesian decision theory requires that an agent’s beliefs be representable by a real-valued function, ideally a probability function. Many theorists have argued that this is too restrictive; it can be perfectly reasonable to have indeterminate degrees of belief, so doxastic states are ideally representable by a set of probability functions. One consequence of this is that the expected value of a gamble will be imprecise. This paper looks at the attempts to extend Bayesian decision theory to deal with such cases, and concludes (...)

We review de Finetti’s two coherence criteria for determinate probabilities: coherence1, defined in terms of previsions for a set of events that are undominated by the status quo – previsions immune to a sure loss – and coherence2, defined in terms of forecasts for events undominated in Brier score by a rival forecast. We propose a criterion of IP-coherence2 based on a generalization of Brier score for IP-forecasts that uses one-sided, lower and upper, probability forecasts. However, whereas Brier score is a strictly (...)
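The coherence2 idea can be illustrated numerically with a stock example (not taken from the paper): a forecast that violates the probability axioms, such as announcing 0.7 for both an event and its negation, is dominated in Brier score by a coherent rival, here (0.5, 0.5), in every state of the world.

```python
def brier(forecast, outcome):
    """Squared-error (Brier) score of a forecast vector against a 0/1 outcome vector."""
    return sum((f - o) ** 2 for f, o in zip(forecast, outcome))

incoherent = (0.7, 0.7)  # 0.7 for E and 0.7 for not-E: violates additivity
coherent = (0.5, 0.5)    # a probabilistically coherent rival forecast

# The two possible states: E true -> (1, 0); E false -> (0, 1).
for outcome in [(1, 0), (0, 1)]:
    print(brier(incoherent, outcome), brier(coherent, outcome))
# In both states the coherent forecast scores 0.5 against 0.58: strict dominance.
```

This is the determinate-probability case; the paper's IP-coherence2 proposal generalizes the score to one-sided lower and upper forecasts, where strict propriety can fail.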

Recently many have argued that agents must sometimes have credences that are imprecise, represented by a set of probability measures. But opponents claim that fans of imprecise credences cannot provide a decision theory that protects agents who follow it from foregoing sure money. In particular, agents with imprecise credences appear doomed to act irrationally in diachronic cases, where they are called to make decisions at earlier and later times. I respond to this claim on behalf of imprecise credence fans. Once (...) 

Jim Joyce has presented an argument for Probabilism based on considerations of epistemic utility [Joyce, 1998]. In a recent paper, I adapted this argument to give an argument for Probabilism and the Principal Principle based on similar considerations [Pettigrew, 2012]. Joyce’s argument assumes that a credence in a true proposition is better the closer it is to maximal credence, whilst a credence in a false proposition is better the closer it is to minimal credence. By contrast, my argument in that (...)

Those who model doxastic states with a set of probability functions, rather than a single function, face a pressing challenge: can they provide a plausible decision theory compatible with their view? Adam Elga and others claim that they cannot, and that the set of functions model should be rejected for this reason. This paper aims to answer this challenge. The key insight is that the set of functions model can be seen as an instance of the supervaluationist approach to vagueness (...) 

A number of Bayesians claim that, if one has no evidence relevant to a proposition P, then one's credence in P should be spread over the interval [0, 1]. Against this, I argue: first, that it is inconsistent with plausible claims about comparative levels of confidence; second, that it precludes inductive learning in certain cases. Two motivations for the view are considered and rejected. A discussion of alternatives leads to the conjecture that there is an in-principle limitation on formal representations (...)

In a recent, thought-provoking paper Adam Elga argues against unsharp – e.g., indeterminate, fuzzy and unreliable – probabilities. Rationality demands sharpness, he contends, and this means that decision theories like Levi's, Gärdenfors and Sahlin's, and Kyburg's, though they employ different decision rules, face a common, and serious, problem. This article defends the rule to maximize minimum expected utility against Elga's objection.
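The rule defended here, maximize minimum expected utility (often called Gamma-maximin), can be sketched as follows; the two-member credal set and the gambles are invented for the example: compute each option's expected utility under every probability function in the set, then pick the option whose worst-case expectation is highest.

```python
def expected_utility(p, gamble):
    """Expected utility of a gamble under one probability function."""
    return sum(p[s] * gamble[s] for s in p)

def gamma_maximin(credal_set, options):
    """Pick the option maximizing its minimum expected utility over the set."""
    return max(options, key=lambda name: min(
        expected_utility(p, options[name]) for p in credal_set))

# Hypothetical credal set over two states, and two gambles to choose between.
credal_set = [{"s1": 0.3, "s2": 0.7}, {"s1": 0.6, "s2": 0.4}]
options = {
    "risky": {"s1": 10.0, "s2": -5.0},  # expectation ranges from -0.5 to 4.0
    "safe":  {"s1": 1.0, "s2": 1.0},    # expectation is 1.0 under every p
}
print(gamma_maximin(credal_set, options))  # picks "safe": worst case 1.0 > -0.5
```

Elga's objection targets how rules like this behave across sequences of choices; the single-shot computation above is only the rule's basic form.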

Jim Joyce argues for two amendments to probabilism. The first is the doctrine that credences are rational, or not, in virtue of their accuracy or “closeness to the truth” (1998). The second is a shift from a numerically precise model of belief to an imprecise model represented by a set of probability functions (2010). We argue that both amendments cannot be satisfied simultaneously. To do so, we employ a (slightly generalized) impossibility theorem of Seidenfeld, Schervish, and Kadane (2012), who show that (...)

Is Bayesian decision theory a panacea for many of the problems in epistemology and the philosophy of science, or is it philosophical snake oil? For years a debate has been waged amongst specialists regarding the import and legitimacy of this body of theory. Mark Kaplan has written the first accessible and non-technical book to address this controversy. Introducing a new variant on Bayesian decision theory, the author offers a compelling case that, while no panacea, decision theory does in fact have the (...)

Many have claimed that unspecific evidence sometimes demands unsharp, indeterminate, imprecise, vague, or interval-valued probabilities. Against this, a variant of the diachronic Dutch Book argument shows that perfectly rational agents always have perfectly sharp probabilities.

It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out (...)

