

Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness (...) 
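The idea of quantifying weight by imprecision can be sketched in a toy setting. This is not Kyburg's actual system of Evidential Probability; it is an illustrative stand-in in which evidence is a sequence of binary observations, the probability of a claim is estimated by a simple frequency interval, and the narrowness of that interval serves as a proxy for weight. All names and numbers here are invented for illustration.

```python
from math import sqrt

def probability_interval(successes: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation interval for a frequency estimate (toy stand-in
    for an evidential-probability interval)."""
    p = successes / trials
    half = z * sqrt(p * (1 - p) / trials)
    return (max(0.0, p - half), min(1.0, p + half))

def weight(interval: tuple[float, float]) -> float:
    """Illustrative weight measure: narrower interval, higher weight."""
    low, high = interval
    return 1.0 - (high - low)
```

On this sketch, accumulating more evidence with the same observed frequency narrows the interval and so increases weight, even though the point estimate of the probability is unchanged.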

This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and (...) 

Gordon Belot has recently developed a novel argument against Bayesianism. He shows that there is an interesting class of problems that, intuitively, no rational belief forming method is likely to get right. But a Bayesian agent’s credence, before the problem starts, that she will get the problem right has to be 1. This is an implausible kind of immodesty on the part of Bayesians. My aim is to show that while this is a good argument against traditional, precise Bayesians, the (...) 

In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate: the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two (...) 

What should one believe about the unobserved? My thesis is a collection of four papers, each of which addresses this question. In the first paper, “Why Subjectivism?”, I consider the standing of a position called radical subjective Bayesianism, or subjectivism. The view is composed of two claims—that agents ought to be logically omniscient, and that there is no further norm of rationality—both of which are subject to seemingly conclusive objections. In this paper, I seek, if not to rehabilitate subjectivism, at (...) 

Many have claimed that epistemic rationality sometimes requires us to have imprecise credal states (i.e. credal states representable only by sets of credence functions) rather than precise ones (i.e. credal states representable by single credence functions). Some writers have recently argued that this claim conflicts with accuracy-centered epistemology, i.e., the project of justifying epistemic norms by appealing solely to the overall accuracy of the doxastic states they recommend. But these arguments are far from decisive. In this essay, we prove some (...) 

Sometimes epistemologists theorize about belief, a tripartite attitude on which one can believe, withhold belief, or disbelieve a proposition. In other cases, epistemologists theorize about credence, a fine-grained attitude that represents one’s subjective probability or confidence level toward a proposition. How do these two attitudes relate to each other? This article explores the relationship between belief and credence in two categories: descriptive and normative. It then explains the broader significance of the belief-credence connection and concludes with general lessons from the (...) 
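One familiar candidate bridge between the two attitudes, often discussed in this literature though not asserted by the abstract above, is the Lockean thesis: belief is credence above a threshold, disbelief is credence below the complementary threshold, and withholding lies in between. A minimal sketch, with an illustrative threshold of 0.9:

```python
def tripartite_attitude(credence: float, t: float = 0.9) -> str:
    """Map a fine-grained credence to the tripartite attitude via a
    Lockean threshold t (0.9 here is an arbitrary illustrative choice)."""
    if credence >= t:
        return "believe"
    if credence <= 1 - t:
        return "disbelieve"
    return "withhold"
```

The choice of threshold, and whether any such threshold bridge is descriptively or normatively adequate, is precisely what is at issue in the belief-credence debate.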

Two compelling principles, the Reasonable Range Principle and the Preservation of Irrelevant Evidence Principle, are necessary conditions that any response to peer disagreements ought to abide by. The Reasonable Range Principle maintains that a resolution to a peer disagreement should not fall outside the range of views expressed by the peers in their dispute, whereas the Preservation of Irrelevant Evidence Principle maintains that a resolution strategy should be able to preserve unanimous judgments of evidential irrelevance among the peers. No standard (...) 

The basic Bayesian model of credence states, where each individual’s belief state is represented by a single probability measure, has been criticized as psychologically implausible, unable to represent the intuitive distinction between precise and imprecise probabilities, and normatively unjustifiable due to a need to adopt arbitrary, unmotivated priors. These arguments are often used to motivate a model on which imprecise credal states are represented by sets of probability measures. I connect this debate with recent work in Bayesian cognitive science, where (...) 

One implication of Bell’s theorem is that there cannot in general be hidden variable models for quantum mechanics that both are noncontextual and retain the structure of a classical probability space. Thus, some hidden variable programs aim to retain noncontextuality at the cost of using a generalization of the Kolmogorov probability axioms. We generalize a theorem of Feintzeig to show that such programs are committed to the existence of a finite null cover for some quantum mechanical experiments, i.e., a finite (...) 

This dissertation aims to determine the optimal level of granularity for the variables used in probabilistic causal models. These causal models are useful for generating explanations in a number of scientific contexts. In Chapter 1, I argue that there is rarely a unique level of granularity at which a given phenomenon can be causally explained, thereby rejecting various causal exclusion arguments. In Chapter 2, I consider several recent proposals for measuring the explanatory power of causal explanations, and show that these (...) 

De Finetti is one of the founding fathers of the subjective school of probability. He held that probabilities are subjective, coherent degrees of expectation, and he argued that none of the objective interpretations of probability make sense. While his theory has been influential in science and philosophy, it has encountered various objections. I argue that these objections overlook central aspects of de Finetti’s philosophy of probability and are largely unfounded. I propose a new interpretation of de Finetti’s theory that highlights (...) 

According to the traditional Bayesian view of credence, its structure is that of precise probability, its objects are descriptive propositions about the empirical world, and its dynamics are given by conditionalization. Each of the three essays that make up this thesis deals with a different variation on this traditional picture. The first variation replaces precise probability with sets of probabilities. The resulting imprecise Bayesianism is sometimes motivated on the grounds that our beliefs should not be more precise than the evidence (...) 

Cosmology raises novel philosophical questions regarding the use of probabilities in inference. This work aims at identifying and assessing lines of arguments and problematic principles in probabilistic reasoning in cosmology. The first, second, and third papers deal with the intersection of two distinct problems: accounting for selection effects, and representing ignorance or indifference in probabilistic inferences. These two problems meet in the cosmology literature when anthropic considerations are used to predict cosmological parameters by conditionalizing the distribution of, e.g., the (...) 



Traditional Bayesianism requires that an agent’s degrees of belief be represented by a real-valued, probabilistic credence function. However, in many cases it seems that our evidence is not rich enough to warrant such precision. In light of this, some have proposed that we instead represent an agent’s degrees of belief as a set of credence functions. This way, we can respect the evidence by requiring that the set, often called the agent’s credal state, includes all credence functions that are in (...) 
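The set-of-functions picture described above can be made concrete in a small sketch: a credal state is a set of credence functions, and the agent's "imprecise" credence in a proposition is the interval spanned by the set, with lower and upper probabilities as its endpoints. The propositions and numbers below are invented for illustration.

```python
# A credal state as a set (here, a list) of credence functions, each a
# mapping from propositions to probabilities.
credal_state = [
    {"rain": 0.3, "no_rain": 0.7},
    {"rain": 0.4, "no_rain": 0.6},
    {"rain": 0.55, "no_rain": 0.45},
]

def lower_probability(state, proposition):
    """Infimum of the credences the member functions assign."""
    return min(c[proposition] for c in state)

def upper_probability(state, proposition):
    """Supremum of the credences the member functions assign."""
    return max(c[proposition] for c in state)
```

On this representation the agent's credence in "rain" is the interval [0.3, 0.55] rather than a single number, which is how the precise/imprecise distinction gets modeled.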

The question of how the probabilistic opinions of different individuals should be aggregated to form a group opinion is controversial. But one assumption seems to be pretty much common ground: for a group of Bayesians, the representation of group opinion should itself be a unique probability distribution (Bordley, Management Science 28: 1137–1148; Genest et al., The Annals of Statistics: 487–501; Genest and Zidek, Statistical Science: 114–135; Mongin, Journal of Economic Theory 66: 313–351; Clemen and (...) 
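The best-known way of producing such a single group distribution is linear pooling: the group opinion is a weighted average of the individual probability functions, which is itself a probability distribution. A minimal sketch, with invented weights and distributions:

```python
def linear_pool(distributions, weights):
    """Linear opinion pool: weighted average of probability distributions
    over a shared outcome space. Weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    outcomes = distributions[0].keys()
    return {o: sum(w * d[o] for d, w in zip(distributions, weights))
            for o in outcomes}

p1 = {"H": 0.8, "T": 0.2}
p2 = {"H": 0.4, "T": 0.6}
pooled = linear_pool([p1, p2], [0.5, 0.5])  # {"H": 0.6, "T": 0.4}
```

Whether this is the right aggregation rule, and whether the group opinion should be a unique distribution at all, is what the debate referenced above is about.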

This article considers the extent to which Bayesian networks with imprecise probabilities, which are used in statistics and computer science for predictive purposes, can be used to represent causal structure. It is argued that the adequacy conditions for causal representation in the precise context—the Causal Markov Condition and Minimality—do not readily translate into the imprecise context. Crucial to this argument is the fact that the independence relation between random variables can be understood in several different ways when the joint probability (...) 