Many have argued that a rational agent's attitude towards a proposition may be better represented by a probability range than by a single number. I show that in such cases an agent will have unstable betting behaviour, and so will behave in an unpredictable way. I use this point to argue against a range of responses to the ‘two bets’ argument for sharp probabilities.
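The instability at issue can be made concrete with a small sketch (assumed numbers, not the paper's own example). An agent whose credence in E is the interval [0.4, 0.6] will pay at most 0.4 for a $1 bet on E but demand at least 0.6 to sell one, so at any price in between she declines both sides of the bet:

```python
# Hypothetical sketch of interval-valued betting dispositions.
# An agent with credence interval [lo, hi] in a proposition E takes a
# $1-stake bet on E only at prices every member of her credal set favours.

def disposition(lo, hi, price):
    """The agent's attitude to a $1 bet on E offered at `price`."""
    if price < lo:
        return "buy"      # positive expected value on every credence in [lo, hi]
    if price > hi:
        return "sell"     # negative expected value on every credence in [lo, hi]
    return "abstain"      # the credal set disagrees, so neither side is mandated

interval = (0.4, 0.6)
print(disposition(*interval, 0.30))  # buy
print(disposition(*interval, 0.50))  # abstain: refuses both sides of the bet
print(disposition(*interval, 0.70))  # sell
```

On this sketch, nothing in the agent's credal state settles what she does at intermediate prices, which is roughly the gap the 'two bets' argument exploits.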
There is a trade-off between specificity and accuracy in existing models of belief. Descriptions of agents in the tripartite model, which recognizes only three doxastic attitudes—belief, disbelief, and suspension of judgment—are typically accurate, but not sufficiently specific. The orthodox Bayesian model, which requires real-valued credences, is perfectly specific, but often inaccurate: we often lack precise credences. I argue, first, that a popular attempt to fix the Bayesian model by using sets of functions is also inaccurate, since it requires us to have interval-valued credences with perfectly precise endpoints. We can see this problem as analogous to the problem of higher order vagueness. Ultimately, I argue, the only way to avoid these problems is to endorse Insurmountable Unclassifiability. This principle has some surprising and radical consequences. For example, it entails that the trade-off between accuracy and specificity is in-principle unavoidable: sometimes it is simply impossible to characterize an agent’s doxastic state in a way that is both fully accurate and maximally specific. What we can do, however, is improve on both the tripartite and existing Bayesian models. I construct a new model of belief—the minimal model—that allows us to characterize agents with much greater specificity than the tripartite model, and yet which remains, unlike existing Bayesian models, perfectly accurate.
Sometimes different partitions of the same space each seem to divide that space into propositions that call for equal epistemic treatment. Famously, equal treatment in the form of equal point-valued credence leads to incoherence. Some have argued that equal treatment in the form of equal interval-valued credence solves the puzzle. This paper shows that, once we rule out intervals with extreme endpoints, this proposal also leads to incoherence.
Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper’s Paradox of Ideal Evidence.
In his entry on "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy, Alexander Wilce (2012) writes that "it is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the 'quantum logic' of projection operators on a Hilbert space." For a long time, Patrick Suppes has opposed this view (see, for example, the papers collected in Suppes and Zanotti (1996)). Instead of changing the logic and moving from a Boolean algebra to a non-Boolean algebra, one can also 'save the phenomena' by weakening the axioms of probability theory and working instead with upper and lower probabilities. However, it is fair to say that despite Suppes' efforts, upper and lower probabilities are not particularly popular in physics or in the foundations of physics, at least so far. Instead, quantum logic is booming again, especially since quantum information and computation became hot topics. Interestingly, however, imprecise probabilities are becoming more and more popular in formal epistemology, as recent work by authors such as James Joyce (2010) and Roger White (2010) demonstrates.
Evidentialists say that a necessary condition of sound epistemic reasoning is that our beliefs reflect only our evidence. This thesis arguably conflicts with standard Bayesianism, due to the importance of prior probabilities in the latter. Some evidentialists have responded by modelling belief-states using imprecise probabilities (Joyce 2005). However, Roger White (2010) and Aron Vallinder (2018) argue that this Imprecise Bayesianism is incompatible with evidentialism due to “inertia”, where Imprecise Bayesian agents become stuck in a state of ambivalence towards hypotheses. Additionally, escapes from inertia apparently only create further conflicts with evidentialism. This dilemma gives evidentialist imprecise probabilists a reason to look for alternatives without inertia. I shall argue that Henry E. Kyburg’s approach offers an evidentialist-friendly imprecise probability theory without inertia, and that its relevant anti-inertia features are independently justified. I also connect the traditional epistemological debates concerning the “ethics of belief” more systematically with formal epistemology than has hitherto been done.
Jim Joyce argues for two amendments to probabilism. The first is the doctrine that credences are rational, or not, in virtue of their accuracy or “closeness to the truth” (1998). The second is a shift from a numerically precise model of belief to an imprecise model represented by a set of probability functions (2010). We argue that these amendments cannot both be satisfied simultaneously. To do so, we employ a (slightly generalized) impossibility theorem of Seidenfeld, Schervish, and Kadane (2012), who show that there is no strictly proper scoring rule for imprecise probabilities. The question then is what should give way. Joyce, who is well aware of this no-go result, thinks that a quantifiability constraint on epistemic accuracy should be relaxed to accommodate imprecision. We argue instead that another Joycean assumption—called strict immodesty—should be rejected, and we prove a representation theorem that characterizes all “mildly” immodest measures of inaccuracy.
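For contrast with the imprecise case, strict propriety for a single precise credence is easy to exhibit numerically. The sketch below (assumed numbers, using the Brier score) shows the expected score being uniquely minimised by reporting the true chance itself; the Seidenfeld–Schervish–Kadane result cited above says that no scoring rule achieves the analogous uniqueness for sets of probability functions.

```python
# Strict propriety of the Brier score for one binary event (a minimal sketch).

def expected_brier(p, q):
    """Expected Brier score of forecast q when the true chance of E is p."""
    return p * (1 - q) ** 2 + (1 - p) * q ** 2

p = 0.3
grid = [i / 1000 for i in range(1001)]          # candidate forecasts 0.000 .. 1.000
best = min(grid, key=lambda q: expected_brier(p, q))
print(best)  # 0.3: the unique minimiser is the true chance itself
```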
A number of Bayesians claim that, if one has no evidence relevant to a proposition P, then one's credence in P should be spread over the interval [0, 1]. Against this, I argue: first, that it is inconsistent with plausible claims about comparative levels of confidence; second, that it precludes inductive learning in certain cases. Two motivations for the view are considered and rejected. A discussion of alternatives leads to the conjecture that there is an in-principle limitation on formal representations of belief: they cannot be both fully accurate and maximally specific.
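The learning worry can be illustrated numerically (a sketch with assumed numbers, not the author's own example). When the credal set contains priors of every degree of dogmatism in both directions, the spread of posterior expectations for a coin's bias barely narrows even after overwhelming evidence:

```python
from itertools import product

def posterior_mean(a, b, heads, tails):
    """Posterior mean of a coin's bias under a Beta(a, b) prior."""
    return (a + heads) / (a + b + heads + tails)

# Credal set: Beta(a, b) priors with parameters ranging over many scales,
# approximating a credal state spread over the whole unit interval.
scales = [10 ** k for k in range(-3, 8)]
priors = list(product(scales, scales))

heads, tails = 900, 100   # strong evidence for a bias near 0.9
means = [posterior_mean(a, b, heads, tails) for a, b in priors]
print(round(min(means), 3), round(max(means), 3))  # 0.0 1.0: almost no narrowing
```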
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
Rational credence should be coherent in the sense that your attitudes should not leave you open to a sure loss. Rational credence should be such that you can learn when confronted with relevant evidence. Rational credence should not be sensitive to irrelevant differences in the presentation of the epistemic situation. We explore the extent to which orthodox probabilistic approaches to rational credence can satisfy these three desiderata and find them wanting. We demonstrate that an imprecise probability approach does better. Along the way we shall demonstrate that the problem of “belief inertia” is not an issue for a large class of IP credences, and provide a solution to van Fraassen’s box factory puzzle.
Does the strength of a particular belief depend upon the significance we attach to it? Do we move from one context to another, remaining in the same doxastic state concerning p yet holding a stronger belief that p in one context than in the other? For that to be so, a doxastic state must have a certain sort of context-sensitive complexity. So the question is about the nature of belief states, as we understand them, or as we think a theory should model them. I explore the idea and how it relates to work on imprecise probabilities and second-order confidence.
John Maynard Keynes’s A Treatise on Probability is the seminal text for the logical interpretation of probability. According to his analysis, probabilities are evidential relations between a hypothesis and some evidence, just like the relations of deductive logic. While some philosophers had suggested similar ideas prior to Keynes, it was not until his Treatise that the logical interpretation of probability was advocated in a clear, systematic and rigorous way. I trace Keynes’s influence in the philosophy of probability through a heterogeneous sample of thinkers who adopted his interpretation. This sample consists of Frederick C. Benenson, Roy Harrod, Donald C. Williams, Henry E. Kyburg and David Stove. The ideas of Keynes prove to be adaptable to their diverse theories of probability. My discussion indicates both the robustness of Keynes’s probability theory and the importance of its influence on the philosophers whom I describe. I also discuss the Problem of the Priors. I argue that none of those I discuss have obviously improved on Keynes’s theory with respect to this issue.
It is well known that there are, at least, two sorts of cases where one should not prefer a direct inference based on a narrower reference class, in particular: cases where the narrower reference class is gerrymandered, and cases where one lacks an evidential basis for forming a precise-valued frequency judgment for the narrower reference class. I here propose (1) that the preceding exceptions exhaust the circumstances where one should not prefer direct inference based on a narrower reference class, and (2) that minimal frequency information for a narrower (non-gerrymandered) reference class is sufficient to yield the defeat of a direct inference for a broader reference class. By the application of a method for inferring relatively informative expected frequencies, I argue that the latter claim does not result in an overly incredulous approach to direct inference. The method introduced here permits one to infer a relatively informative expected frequency for a reference class R', given frequency information for a superset of R' and/or frequency information for a sample drawn from R'.
Moss (2018) argues that rational agents are best thought of not as having degrees of belief in various propositions but as having beliefs in probabilistic contents, or probabilistic beliefs. Probabilistic contents are sets of probability functions. Probabilistic belief states, in turn, are modeled by sets of probabilistic contents, or sets of sets of probability functions. We argue that this Mossean framework is of considerable interest quite independently of its role in Moss’ account of probabilistic knowledge or her semantics for epistemic modals and probability operators. It is an extremely general model of uncertainty. Indeed, it is at least as general and expressively powerful as every other current imprecise probability framework, including lower probabilities, lower previsions, sets of probabilities, sets of desirable gambles, and choice functions. In addition, we partially answer an important question that Moss leaves open, viz., why should rational agents have consistent probabilistic beliefs? We show that an important subclass of Mossean believers avoid Dutch bookability iff they have consistent probabilistic beliefs.
The theory of lower previsions is designed around the principles of coherence and sure-loss avoidance, and thus steers clear of all the updating anomalies highlighted in Gong and Meng's "Judicious Judgment Meets Unsettling Updating: Dilation, Sure Loss, and Simpson's Paradox" except dilation. In fact, the traditional problem with the theory of imprecise probability is that coherent inference is too complicated rather than unsettling. Progress has been made in simplifying coherent inference by demoting sets of probabilities from fundamental building blocks to secondary representations that are derived or discarded as needed.
When do probability distribution functions (PDFs) about future climate misrepresent uncertainty? How can we recognise when such misrepresentation occurs and thus avoid it in reasoning about or communicating our uncertainty? And when we should not use a PDF, what should we do instead? In this paper we address these three questions. We start by providing a classification of types of uncertainty and using this classification to illustrate when PDFs misrepresent our uncertainty in a way that may adversely affect decisions. We then discuss when it is reasonable and appropriate to use a PDF to reason about or communicate uncertainty about climate. We consider two perspectives on this issue. On one, which we argue is preferable, available theory and evidence in climate science basically excludes using PDFs to represent our uncertainty. On the other, PDFs can legitimately be provided when resting on appropriate expert judgement and recognition of associated risks. Once we have specified the border between appropriate and inappropriate uses of PDFs, we explore alternatives to their use. We briefly describe two formal alternatives, namely imprecise probabilities and possibilistic distribution functions, as well as informal possibilistic alternatives. We suggest that the possibilistic alternatives are preferable.
Bayesians often appeal to “merging of opinions” to rebut charges of excessive subjectivity. But what happens in the short run is often of greater interest than what happens in the limit. Seidenfeld and coauthors use this observation as motivation for investigating the counterintuitive short run phenomenon of dilation, since, they allege, dilation is “the opposite” of asymptotic merging of opinions. The measure of uncertainty relevant for dilation, however, is not the one relevant for merging of opinions. We explicitly investigate the short run behavior of the metric relevant for merging, and show that dilation is independent of the opposite of merging.
Dilation occurs when an interval probability estimate of some event E is properly included in the interval probability estimate of E conditional on every event F of some partition, which means that one’s initial estimate of E becomes less precise no matter how an experiment turns out. Critics maintain that dilation is a pathological feature of imprecise probability models, while others have thought the problem is with Bayesian updating. However, two points are often overlooked: (1) knowing that E is stochastically independent of F (for all F in a partition of the underlying state space) is sufficient to avoid dilation, but (2) stochastic independence is not the only independence concept at play within imprecise probability models. In this paper we give a simple characterization of dilation formulated in terms of deviation from stochastic independence, propose a measure of dilation, and distinguish between proper and improper dilation. Through this we revisit the most sensational examples of dilation, which play up independence between dilator and dilatee, and find the sensationalism undermined by either fallacious reasoning with imprecise probabilities or improperly constructed imprecise probability models.
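A toy instance (assumed numbers, not drawn from the paper) makes the definition vivid. Let E be "the second toss lands heads", let the partition be the outcome of the first toss, and let the credal set contain, for each parameter a, a function with P(first toss heads) = 1/2, P(E | heads) = a and P(E | tails) = 1 − a. Every member then agrees that P(E) = 1/2, yet conditioning on either cell dilates the estimate to the whole range of a:

```python
from fractions import Fraction

def p_E(a):
    """Unconditional probability of E under the member with parameter a."""
    return Fraction(1, 2) * a + Fraction(1, 2) * (1 - a)

def p_E_given_heads(a):
    """Probability of E conditional on the first toss landing heads."""
    return a

credal_set = [Fraction(k, 10) for k in range(1, 10)]   # a in {1/10, ..., 9/10}

unconditional = {p_E(a) for a in credal_set}
conditional = [p_E_given_heads(a) for a in credal_set]
print(unconditional)                       # {Fraction(1, 2)}: a sharp half
print(min(conditional), max(conditional))  # 1/10 9/10: the dilated interval
```

By symmetry P(E | tails) = 1 − a spans the same interval, so the sharp estimate dilates on every cell of the partition, exactly as the definition requires.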
We call something a paradox if it strikes us as peculiar in a certain way, if it strikes us as something that is not simply nonsense, and yet it poses some difficulty in seeing how it could make sense. When we examine paradoxes more closely, we find that for some the peculiarity is relieved and for others it intensifies. Some are peculiar because they jar with how we expect things to go, but the jarring is to do with imprecision and misunderstandings in our thought, failures to appreciate the breadth of possibility consistent with our beliefs. Other paradoxes, however, pose deep problems. Closer examination does not explain them away. Instead, they challenge the coherence of certain conceptual resources and hence challenge the significance of beliefs which deploy those resources. I shall call the former kind weak paradoxes and the latter, strong paradoxes. Whether a particular paradox is weak or strong is sometimes a matter of controversy—sometimes it has been realised that what was thought strong is in fact weak, and vice versa—but the distinction between the two kinds is generally thought to be worth drawing. In this chapter, I shall cover both weak and strong probabilistic paradoxes.
We provide counterexamples to some purported characterizations of dilation due to Pedersen and Wheeler (2014: 1305–1342; ISIPTA ’15: Proceedings of the 9th International Symposium on Imprecise Probability: Theories and Applications, 2015).
Merging of opinions results underwrite Bayesian rejoinders to complaints about the subjective nature of personal probability. Such results establish that sufficiently similar priors achieve consensus in the long run when fed the same increasing stream of evidence. Initial subjectivity, the line goes, is of mere transient significance, giving way to intersubjective agreement eventually. Here, we establish a merging result for sets of probability measures that are updated by Jeffrey conditioning. This generalizes a number of different merging results in the literature. We also show that such sets converge to a shared, maximally informed opinion. Convergence to a maximally informed opinion is a (weak) Jeffrey conditioning analogue of Bayesian “convergence to the truth” for conditional probabilities. Finally, we demonstrate the philosophical significance of our study by detailing applications to the topics of dynamic coherence, imprecise probabilities, and probabilistic opinion pooling.
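The flavour of such results can be conveyed with a minimal simulation (ordinary conditionalization, i.e. the special case of Jeffrey conditioning on certain evidence; the priors and true bias are assumed for illustration). Two agents with sharply different Beta priors about a coin's bias are fed the same stream of tosses, and their posterior expectations all but coincide:

```python
import random

def update(a, b, heads, tails):
    """Conditionalize a Beta(a, b) prior on observed tosses (conjugate update)."""
    return a + heads, b + tails

def mean(a, b):
    """Posterior expectation of the coin's bias under Beta(a, b)."""
    return a / (a + b)

random.seed(0)
agent1, agent2 = (1, 9), (9, 1)       # prior means 0.1 and 0.9
for _ in range(2000):
    flip = random.random() < 0.7      # true bias 0.7
    agent1 = update(*agent1, flip, not flip)
    agent2 = update(*agent2, flip, not flip)

# The initial disagreement of 0.8 has shrunk to under 0.01.
print(abs(mean(*agent1) - mean(*agent2)) < 0.01)  # True
```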
The desirable gambles framework offers the most comprehensive foundations for the theory of lower previsions, which in turn affords the most general account of imprecise probabilities. Nevertheless, for all its generality, the theory of lower previsions rests on the notion of linear utility. This commitment to linearity is clearest in the coherence axioms for sets of desirable gambles. This paper considers two routes to relaxing this commitment. The first preserves the additive structure of the desirable gambles framework and the machinery for coherent inference but detaches the interpretation of desirability from the multiplicative scale invariance axiom. The second strays from the additive combination axiom to accommodate repeated gambles that return rewards by a non-stationary process that is not necessarily additive. Unlike the first approach, which is a conservative amendment to the desirable gambles framework, the second is a radical departure. Yet common to both is a method for describing rewards called discounted utility.
Epistemic states of uncertainty play important roles in ethical and political theorizing. Theories that appeal to a “veil of ignorance,” for example, analyze fairness or impartiality in terms of certain states of ignorance. It is important, then, to scrutinize proposed conceptions of ignorance and explore promising alternatives in such contexts. Here, I study Lerner’s probabilistic egalitarian theorem in the setting of imprecise probabilities. Lerner’s theorem assumes that a social planner tasked with distributing income to individuals in a population is “completely ignorant” about which utility functions belong to which individuals. Lerner models this ignorance with a certain uniform probability distribution, and shows that, under certain further assumptions, income should be equally distributed. Much of the criticism of the relevance of Lerner’s result centers on the representation of ignorance involved. Imprecise probabilities provide a general framework for reasoning about various forms of uncertainty including, in particular, ignorance. To what extent can Lerner’s conclusion be maintained in this setting?
In attempting to form rational personal probabilities by direct inference, it is usually assumed that one should prefer frequency information concerning more specific reference classes. While the preceding assumption is intuitively plausible, little energy has been expended in explaining why it should be accepted. In the present article, I address this omission by showing that, among the principled policies that may be used in setting one’s personal probabilities, the policy of making direct inferences with a preference for frequency information for more specific reference classes yields personal probabilities whose accuracy is optimal, according to all proper scoring rules, in situations where all of the relevant frequency information is point-valued. Assuming that frequency information for narrower reference classes is preferred when the relevant frequency statements are point-valued, a dilemma arises when choosing whether to make a direct inference based upon relatively precise-valued frequency information for a broad reference class, R, or upon relatively imprecise-valued frequency information for a more specific reference class, R*. I address such cases by showing that it is often possible to make a precise-valued frequency judgment regarding R* based on precise-valued frequency information for R, using standard principles of direct inference. Having made such a frequency judgment, the dilemma of choosing between the two is removed, and one may proceed by using the precise-valued frequency estimate for the more specific reference class as a premise for direct inference.
Dogmatism is sometimes thought to be incompatible with Bayesian models of rational learning. I show that the best model for updating imprecise credences is compatible with dogmatism.
Why should we refrain from doing things that, taken collectively, are environmentally destructive, if our individual acts seem almost certain to make no difference? According to the expected consequences approach, we should refrain from doing these things because our individual acts have small risks of causing great harm, which outweigh the expected benefits of performing them. Several authors have argued convincingly that this provides a plausible account of our moral reasons to do things like vote for policies that will reduce our countries’ greenhouse gas emissions, adopt plant-based diets, and otherwise reduce our individual emissions. But this approach has recently been challenged by authors like Bernward Gesang and Julia Nefsky. Gesang contends that it may be genuinely impossible for our individual emissions to make a morally relevant difference. Nefsky argues more generally that the expected consequences approach cannot adequately explain our reasons not to do things if there is no precise fact of the matter about whether their outcomes are harmful. In the following chapter, author Howard Nye defends the expected consequences approach against these objections. Nye contends that Gesang has shown at most that our emissions could have metaphysically indeterministic effects that lack precise objective chances. He argues, moreover, that the expected consequences approach can draw upon existing extensions to cases of indeterminism and imprecise probabilities to deliver the result that we have the same moral reasons to reduce our emissions in Gesang’s scenario as in deterministic scenarios. Nye also shows how the expected consequences approach can draw upon these extensions to handle Nefsky’s concern about the absence of precise facts concerning whether the outcomes of certain acts are harmful.
The author concludes that the expected consequences approach provides a fully adequate account of our moral reasons to take both political and personal action to reduce our ecological footprints. (shrink)
In some severely uncertain situations, exemplified by climate change and novel pandemics, policymakers lack a reasoned basis for assigning probabilities to the possible outcomes of the policies they must choose between. I outline and defend an uncertainty averse, egalitarian approach to policy evaluation in these contexts. The upshot is a theory of distributive justice which offers especially strong reasons to guard against individual and collective misfortune.
It is a consequence of the theory of imprecise credences that there exist situations in which rational agents inevitably become less opinionated toward some propositions as they gather more evidence. The fact that an agent's imprecise credal state can dilate in this way is often treated as a strike against the imprecise approach to inductive inference. Here, we show that dilation is not a mere artifact of this approach by demonstrating that opinion loss is countenanced as rational by a substantially broader class of normative theories than has been previously recognised. Specifically, we show that dilation-like phenomena arise even when one abandons the basic assumption that agents have (precise or imprecise) credences of any kind, and follow directly from bedrock norms for rational comparative confidence judgements of the form ‘I am at least as confident in p as I am in q’. We then use the comparative confidence framework to develop a novel understanding of what exactly gives rise to dilation-like phenomena. By considering opinion loss in this more general setting, we are able to provide a novel assessment of the prospects for an account of inductive inference that is not saddled with the inevitability of rational opinion loss.
If there are fundamental laws of nature, can they fail to be exact? In this paper, I consider the possibility that some fundamental laws are vague. I call this phenomenon 'fundamental nomic vagueness.' I characterize fundamental nomic vagueness as the existence of borderline lawful worlds and the presence of several other accompanying features. Under certain assumptions, such vagueness prevents the fundamental physical theory from being completely expressible in the mathematical language. Moreover, I suggest that such vagueness can be regarded as 'vagueness in the world.' For a case study, we turn to the Past Hypothesis, a postulate that (partially) explains the direction of time in our world. We have reasons to take it seriously as a candidate fundamental law of nature. Yet it is vague: it admits borderline (nomologically) possible worlds. An exact version would lead to an untraceable arbitrariness absent in any other fundamental laws. However, the dilemma between fundamental nomic vagueness and untraceable arbitrariness is dissolved in a new quantum theory of time's arrow.
One recent topic of debate in Bayesian epistemology has been the question of whether imprecise credences can be rational. I argue that one account of imprecise credences, the orthodox treatment as defended by James M. Joyce, is untenable. Despite Joyce’s claims to the contrary, a puzzle introduced by Roger White shows that the orthodox account, when paired with Bas C. van Fraassen’s Reflection Principle, can lead to inconsistent beliefs. Proponents of imprecise credences, then, must either provide a compelling reason to reject Reflection or admit that the rational credences in White’s case are precise.
It is often suggested that when opinions differ among individuals in a group, the opinions should be aggregated to form a compromise. This paper compares two approaches to aggregating opinions, linear pooling and what I call opinion agglomeration. In evaluating both strategies, I propose a pragmatic criterion, No Regrets, entailing that an aggregation strategy should prevent groups from buying and selling bets on events at prices regretted by their members. I show that only opinion agglomeration is able to satisfy the demand. I then give normative and empirical arguments that support the pragmatic criterion for opinion aggregation and that ultimately favor opinion agglomeration.
According to the traditional Bayesian view of credence, its structure is that of precise probability, its objects are descriptive propositions about the empirical world, and its dynamics are given by conditionalization. Each of the three essays that make up this thesis deals with a different variation on this traditional picture. The first variation replaces precise probability with sets of probabilities. The resulting imprecise Bayesianism is sometimes motivated on the grounds that our beliefs should not be more precise than the evidence calls for. One known problem for this evidentially motivated imprecise view is that in certain cases, our imprecise credence in a particular proposition will remain the same no matter how much evidence we receive. In the first essay I argue that the problem is much more general than has been appreciated so far, and that it’s difficult to avoid without compromising the initial evidentialist motivation. The second variation replaces descriptive claims with moral claims as the objects of credence. I consider three standard arguments for probabilism with respect to descriptive uncertainty—representation theorem arguments, Dutch book arguments, and accuracy arguments—in order to examine whether such arguments can also be used to establish probabilism with respect to moral uncertainty. In the second essay, I argue that by and large they can, with some caveats. First, I don’t examine whether these arguments can be given sound non-cognitivist readings, and any conclusions therefore only hold conditional on cognitivism. Second, decision-theoretic representation theorems are found to be less convincing in the moral case, because there they implausibly commit us to thinking that intertheoretic comparisons of value are always possible. Third and finally, certain considerations may lead one to think that imprecise probabilism provides a more plausible model of moral epistemology.
The third variation considers whether, in addition to conditionalization, agents may also change their minds by becoming aware of propositions they had not previously entertained, and therefore not previously assigned any probability. More specifically, I argue that if we wish to make room for reflective equilibrium in a probabilistic moral epistemology, we must allow for awareness growth. In the third essay, I sketch the outline of such a Bayesian account of reflective equilibrium. Given that this account gives a central place to awareness growth, and that the rationality constraints on belief change by awareness growth are much weaker than those on belief change by conditionalization, it follows that the rationality constraints on the credences of agents who are seeking reflective equilibrium are correspondingly weaker.
Formal epistemologists often claim that our credences should be representable by a probability function. Complete probabilistic coherence, however, is only possible for ideal agents, raising the question of how this requirement relates to our everyday judgments concerning rationality. One possible answer is that being rational is a contextual matter, that the standards for rationality change along with the situation. Just as who counts as tall changes depending on whether we are considering toddlers or basketball players, perhaps what counts as rational shifts according to whether we are considering ideal agents or creatures more like ourselves. Even though a number of formal epistemologists have endorsed this type of solution, I will argue that there is no way to spell out this contextual account that can make sense of our everyday judgments about rationality. Those who defend probabilistic coherence requirements will need an alternative account of the relationship between real and ideal rationality.
Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
This article explores the main similarities and differences between Derek Parfit’s notion of imprecise comparability and a related notion I have proposed of parity. I argue that the main difference between imprecise comparability and parity can be understood by reference to ‘the standard view’. The standard view claims that 1) differences between cardinally ranked items can always be measured by a scale of units of the relevant value, and 2) all rankings proceed in terms of the trichotomy of ‘better than’, ‘worse than’, and ‘equally good’. Imprecise comparability, which can be understood in terms of the more familiar notions of cardinality and incommensurability, rejects only the first claim, while parity rejects both claims of the standard view.

I then argue that insofar as those attracted to imprecise comparability assume that all rankings are trichotomous, as Parfit appears to, the view should be rejected. This is because imprecise equality is not a form of equality but is a sui generis ‘fourth’ basic way in which items can be ranked. We should, I argue, understand imprecise equality as parity, and imprecise comparability as entailing ‘tetrachotomy’ – that if two items are comparable, one must be better than, worse than, equal to, or on a par with the other. Thus those attracted to the idea that cardinality can be imprecise should abandon trichotomy and accept parity and tetrachotomy instead.

Finally, I illustrate the difference between Parfit’s trichotomous notion of imprecise comparability and parity by examining how each notion might be employed in different solutions to the problem posed by the Repugnant Conclusion in population ethics. I suggest that parity provides the arguably more ecumenical solution to the problem.
Traditional Bayesianism requires that an agent’s degrees of belief be represented by a real-valued, probabilistic credence function. However, in many cases it seems that our evidence is not rich enough to warrant such precision. In light of this, some have proposed that we instead represent an agent’s degrees of belief as a set of credence functions. This way, we can respect the evidence by requiring that the set, often called the agent’s credal state, includes all credence functions that are in some sense compatible with the evidence. One known problem for this evidentially motivated imprecise view is that in certain cases, our imprecise credence in a particular proposition will remain the same no matter how much evidence we receive. In this article I argue that the problem is much more general than has been appreciated so far, and that it’s difficult to avoid it without compromising the initial evidentialist motivation. 1. Introduction; 2. Precision and Its Problems; 3. Imprecise Bayesianism and Respecting Ambiguous Evidence; 4. Local Belief Inertia; 5. From Local to Global Belief Inertia; 6. Responding to Global Belief Inertia; 7. Conclusion.
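The belief-inertia phenomenon this abstract describes can be sketched in a few lines. The likelihoods below (0.9 and 0.2) are invented purely for illustration; the point is only that when the credal state contains a credence function for every prior in (0, 1), conditioning on the evidence barely narrows the resulting interval.

```python
# Sketch (invented numbers): belief inertia for a credal state
# represented as a set of priors over a hypothesis H, given
# evidence E. Likelihoods 0.9 and 0.2 are purely illustrative.

def posterior(prior_h, lik_e_given_h=0.9, lik_e_given_not_h=0.2):
    """Bayes' theorem: P(H | E) for a single credence function."""
    num = prior_h * lik_e_given_h
    return num / (num + (1 - prior_h) * lik_e_given_not_h)

# A credal state containing a prior for (almost) every value in (0, 1),
# approximated here by a fine grid.
priors = [i / 1000 for i in range(1, 1000)]
posteriors = [posterior(p) for p in priors]

# Conditioning on E barely narrows the imprecise credence:
print(min(posteriors), max(posteriors))  # still spans nearly all of (0, 1)
```

Each member of the set updates by conditionalization, yet the interval the set spans stays close to (0, 1); this is the inertia that the article argues generalizes.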
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent’s degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. In particular, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent’s prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent’s prior beliefs. Quantum certainty is therefore always some agent’s certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.
A probability distribution is regular if no possible event is assigned probability zero. While some hold that probabilities should always be regular, three counter-arguments have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but we see here that their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Having failed to refute the symmetry arguments on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.
Many have claimed that epistemic rationality sometimes requires us to have imprecise credal states (i.e. credal states representable only by sets of credence functions) rather than precise ones (i.e. credal states representable by single credence functions). Some writers have recently argued that this claim conflicts with accuracy-centered epistemology, i.e., the project of justifying epistemic norms by appealing solely to the overall accuracy of the doxastic states they recommend. But these arguments are far from decisive. In this essay, we prove some new results, which show that there is little hope for reconciling the rationality of credal imprecision with accuracy-centered epistemology.
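For readers unfamiliar with accuracy-centered epistemology, the overall accuracy of a doxastic state is standardly measured by a scoring rule such as the Brier score. A minimal sketch, with invented credences and an invented world:

```python
# Sketch (invented example): the Brier score, a standard measure of
# the inaccuracy of a credence function at a world, widely used in
# accuracy-centered epistemology.

def brier_inaccuracy(credences, world):
    """Sum of squared distances between credences and truth values."""
    return sum((credences[p] - (1.0 if world[p] else 0.0)) ** 2
               for p in credences)

# Invented credences over two propositions, and an invented world:
credences = {"rain": 0.7, "wind": 0.2}
world = {"rain": True, "wind": False}
print(brier_inaccuracy(credences, world))  # 0.09 + 0.04 = 0.13, up to rounding
```

A credence function matching the truth values exactly scores 0; accuracy-centered arguments proceed by comparing such scores across the doxastic states a norm recommends.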
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional If A, then C equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake but differs for modal and conditional sentences.
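The principle linking probability and modality mentioned in this abstract, that the probability of If A, then C equals the probability of C updated with A, can be illustrated on a toy probability space. The four worlds and their weights below are invented for illustration, and the update rule shown is standard Conditionalization (the paper's Hyperconditionalization differs from it only for modal and conditional sentences).

```python
# Sketch (invented toy example): the probability of "if A, then C"
# as P(C) after updating with A, i.e. P(C | A), computed by
# standard Conditionalization on a four-world space.

# Worlds are (A, C) truth-value pairs; weights are invented.
worlds = {
    (True, True): 0.3,
    (True, False): 0.1,
    (False, True): 0.2,
    (False, False): 0.4,
}

def prob(pred):
    """Probability of the event picked out by pred."""
    return sum(w for world, w in worlds.items() if pred(world))

def conditionalize(pred_a):
    """Conditionalization: zero out non-A worlds and renormalize."""
    pa = prob(pred_a)
    return {world: (w / pa if pred_a(world) else 0.0)
            for world, w in worlds.items()}

updated = conditionalize(lambda world: world[0])  # update with A
p_c_given_a = sum(w for world, w in updated.items() if world[1])
print(p_c_given_a)  # 0.3 / 0.4 = 0.75, up to float rounding
```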
Recently, Derek Parfit has offered a novel solution to the “Repugnant Conclusion” that compared with the existence of many people whose quality of life would be very high, there is some much larger number of people whose existence would be better but whose lives would be barely worth living. On this solution, qualitative differences between two populations will often entail that the populations are merely “imprecisely” comparable. According to Parfit, this fact allows us to avoid the Repugnant Conclusion without violating the transitivity of better than. In this paper, I argue that Parfit’s view nevertheless implies two objectionable conclusions. The first is an alternative version of the Repugnant Conclusion that, Parfit suggests, may not be all that repugnant. The second is a revised version of the first that is nearly identical to the Repugnant Conclusion. I conclude that Parfit’s view offers no escape from repugnance.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. The second is that we ought not to take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. The third, and most controversial, is that in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
This Open Access book addresses the age-old problem of infinite regresses in epistemology. How can we ever come to know something if knowing requires having good reasons, and reasons can only be good if they are backed by good reasons in turn? The problem has puzzled philosophers ever since antiquity, giving rise to what is often called Agrippa's Trilemma. The current volume approaches the old problem in a provocative and thoroughly contemporary way. Taking seriously the idea that good reasons are typically probabilistic in character, it develops and defends a new solution that challenges venerable philosophical intuitions and explains why they were mistakenly held. Key to the new solution is the phenomenon of fading foundations, according to which distant reasons are less important than those that are nearby. The phenomenon takes the sting out of Agrippa's Trilemma; moreover, since the theory that describes it is general and abstract, it is readily applicable outside epistemology, notably to debates on infinite regresses in metaphysics.