In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of nonlocality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest. 

If chances are propensities, what reason do we have to expect them to be probabilities? I will offer a new answer to this question. It comes in two parts. First, I will defend an accuracy-centred account of what it is for a causal system to have precise propensities in the first place. Second, I will prove that, given some pretty weak assumptions about the nature of comparative causal dispositions, and some fairly standard assumptions about reasonable measures of inaccuracy, propensities must (...) 

Pragmatic factors encroach on epistemic predicates not solely because the threshold for actionable belief may shift with an epistemic agent’s context, as an evidential Bayesian might insist, but also because what our inductive policy should be may shift with that context. I argue for this thesis in the context of imprecise probabilities, maintaining that the choice of the defining hyperparameter for an Imprecise Dirichlet Model for nonparametric predictive inference may correspond to the choice of an eager versus reticent inductive policy (...) 

The disjunction problem and the distality problem each presents a challenge that any theory of mental content must address. Here we consider their bearing on purely probabilistic causal theories. In addition to considering these problems separately, we consider a third challenge—that a theory must solve both. We call this “the hard problem.” We consider 8 basic ppc theories along with 240 hybrids of them, and show that some can handle the disjunction problem and some can handle the distality problem, but (...) 

The principle of indifference has fallen from grace in contemporary philosophy, yet some papers have recently sought to vindicate its plausibility. This paper follows suit. In it, I articulate a version of the principle and provide what appears to be a novel argument in favour of it. The argument relies on a thought experiment where, intuitively, an agent’s confidence in any particular outcome being true should decrease with the addition of outcomes to the relevant space of possible outcomes. Put simply: (...) 

Logic is widely held to be a normative discipline. Various claims have been offered in support of this view, but they all revolve around the idea that logic is concerned with how one ought to reason. I argue that most of these claims—while perhaps correct—only entail that logic is normative in a way that many, if not all, intellectual disciplines are normative. I also identify some claims whose correctness would make logic normative in a way that sets it apart from (...) 

This paper critically assesses existing accounts of the nature of difficulty, finds them wanting, and proposes a new account. The concept of difficulty is routinely invoked in debates regarding degrees of moral responsibility, and the value of achievement. Until recently, however, there has not been any sustained attempt to provide an account of the nature of difficulty itself. This has changed with Gwen Bradford’s Achievement, which argues that difficulty is a matter of how much intense effort is expended. But while (...) 

Single-case and long-run propensity theories are among the main objective interpretations of probability. There have been various objections to these theories, e.g. that it is difficult to explain why propensities should satisfy the probability axioms and, worse, that propensities are at odds with these axioms, that the explication of propensities is circular and accordingly not informative, and that single-case propensities are metaphysical and accordingly non-scientific. We consider various propensity theories of probability and their prospects in light of these objections. We (...) 

Recent literature has paid attention to a demarcation problem for evolutionary debunking arguments. This is the problem of asking in virtue of what regulative metaepistemic norm evolutionary considerations either render a belief justified, or debunk it as unjustified. I examine the so-called ‘Milvian Bridge principle’ (A New Science of Religion, Routledge, New York, 2012; Sloan, McKenny, Eggelson, Darwin in the 21st Century: Nature, Humanity, and God, University Press, Notre Dame, 2015), which offers exactly such a called-for regulative metaepistemic norm. (...) 

I argue that the propensity interpretation of fitness, properly understood, not only solves the explanatory circularity problem and the mismatch problem, but can also withstand the Pandora’s box full of problems that have been thrown at it. Fitness is the propensity (i.e., probabilistic ability, based on heritable physical traits) for organisms or types of organisms to survive and reproduce in particular environments and in particular populations for a specified number of generations; if greater than one generation, “reproduction” includes descendants of (...) 

Quantum mechanics portrays the universe as involving nonlocal influences that are difficult to reconcile with relativity theory. By postulating backward causation, retrocausal interpretations of quantum mechanics could circumvent these influences and accordingly reconcile quantum mechanics with relativity. The postulation of backward causation poses various challenges for the retrocausal interpretations of quantum mechanics and for the existing conceptual frameworks for analyzing counterfactual dependence, causation and causal explanation. In this chapter, we analyze the nature of time, causation and explanation in a local, (...) 

There are two central questions concerning probability. First, what are its formal features? That is a mathematical question, to which there is a standard, widely (though not universally) agreed upon answer. This answer is reviewed in the next section. Second, what sorts of things are probabilities – what, that is, is the subject matter of probability theory? This is a philosophical question, and while the mathematical theory of probability certainly bears on it, the answer must come from elsewhere. To see why, observe (...) 

In October 2009 I decided to stop doing philosophy. This meant, in particular, stopping work on the book that I was writing on the nature of probability. At that time, I had no intention of making my unfinished draft available to others. However, I recently noticed how many people are reading the lecture notes and articles on my web site. Since this draft book contains some important improvements on those materials, I decided to make it available to anyone who wants (...) 

This volume focuses on various questions concerning the interpretation of probability and probabilistic reasoning in biology and physics. It is inspired by the idea that philosophers of biology and philosophers of physics who work on the foundations of their disciplines encounter similar questions and problems concerning the role and application of probability, and that interaction between the two communities will be both interesting and fruitful. In this introduction we present the background to the main questions that the volume focuses on (...) 

We survey a variety of possible explications of the term “Individual Risk.” These in turn are based on a variety of interpretations of “Probability,” including classical, enumerative, frequency, formal, metaphysical, personal, propensity, chance and logical conceptions of probability, which we review and compare. We distinguish between “groupist” and “individualist” understandings of probability, and explore both “group to individual” and “individual to group” approaches to characterising individual risk. Although in the end that concept remains subtle and elusive, some pragmatic suggestions for (...) 

Both I and Belnap motivated the "Belnap–Dunn 4-valued logic" by talk of the reasoner being simply "told true" (T) and simply "told false" (F), which leaves the options of being neither "told true" nor "told false" (N), and being both "told true" and "told false" (B). Belnap motivated these notions by consideration of unstructured databases that allow for negative information as well as positive information (even when they conflict). We now experience this on a daily basis with the Web. But (...) 

Concrete Causation centers on theories of causation, their interpretation, and their embedding in metaphysical-ontological questions, as well as the application of such theories in the context of science and decision theory. The dissertation is divided into four chapters, which first undertake the historical-systematic localization of central problems (chapter 1) and then give a rendition of the concepts and the formalisms underlying David Lewis’ and Judea Pearl’s theories (chapter 2). After philosophically motivated conceptual deliberations, Pearl’s mathematical-technical framework is drawn on for (...) 

The paper’s target is the historically influential betting interpretation of subjective probabilities due to Ramsey and de Finetti. While there are several classical and well-known objections to this interpretation, the paper focuses on just one fundamental problem: There is a sense in which degrees of belief cannot be interpreted as betting rates. The reasons differ in different cases, but there’s one crucial feature that all these cases have in common: The agent’s degree of belief in a proposition A does not (...) 

I describe a realist, ontologically objective interpretation of probability, "far-flung frequency (FFF) mechanistic probability". FFF mechanistic probability is defined in terms of facts about the causal structure of devices and certain sets of frequencies in the actual world. Though defined partly in terms of frequencies, FFF mechanistic probability avoids many drawbacks of well-known frequency theories and helps causally explain stable frequencies, which will usually be close to the values of mechanistic probabilities. I also argue that it's a virtue rather than (...) 

In this paper, we offer an alternative interpretation for the claim that ‘S is justified in believing that φ’. First, we present what seems to be a common way of interpreting this claim: as an attribution of propositional justification. According to this interpretation, being justified is just a matter of having confirming evidence. We present a type of case that does not fit well with the standard concept, where considerations about cognition are made relevant. The concept of cognitive algorithm is (...) 

The main objective of the paper is to propose a frequentist interpretation of probability in the context of model-based induction, anchored on the Strong Law of Large Numbers (SLLN) and justifiable on empirical grounds. It is argued that the prevailing views in philosophy of science concerning induction and the frequentist interpretation of probability are unduly influenced by enumerative induction, and the von Mises rendering, both of which are at odds with frequentist model-based induction that dominates current practice. The differences between (...) 

In Darwinian Populations and Natural Selection, Peter Godfrey-Smith brought the topic of natural selection back to the forefront of philosophy of biology, highlighting different issues surro... 

Bayesian confirmation theory is our best formal framework for describing inductive reasoning. The problem of old evidence is a particularly difficult one for confirmation theory, because it suggests that this framework fails to account for central and important cases of inductive reasoning and scientific inference. I show that we can appeal to the fragmentation of doxastic states to solve this problem for confirmation theory. This fragmentation solution is independently well-motivated because of the success of fragmentation in solving other problems. I (...) 

Enjoying great popularity in decision theory, epistemology, and philosophy of science, Bayesianism as understood here is fundamentally concerned with epistemically ideal rationality. It assumes a tight connection between evidential probability and ideally rational credence, and usually interprets evidential probability in terms of such credence. Timothy Williamson challenges Bayesianism by arguing that evidential probabilities cannot be adequately interpreted as the credences of an ideal agent. From this and his assumption that evidential probabilities cannot be interpreted as the actual credences of human (...) 

While there has been much discussion about what makes some mathematical proofs more explanatory than others, and what are mathematical coincidences, in this article I explore the distinct phenomenon of mathematical facts that call for explanation. The existence of mathematical facts that call for explanation stands in tension with virtually all existing accounts of “calling for explanation”, which imply that necessary facts cannot call for explanation. In this paper I explore what theoretical revisions are needed in order to accommodate this (...) 

Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions (...) 

In order to tackle the question posed by the title – notoriously answered in the positive, among others, by Heisenberg, Margenau, Popper and Redhead – I first discuss some attempts at distinguishing dispositional from non-dispositional properties, and then relate the distinction to the formalism of quantum mechanics. Since any answer to the question titling the paper must be interpretation-dependent, I review some of the main interpretations of quantum mechanics in order to argue that the ontology of theories regarding “wave collapse” (...) 

I argue that a pragmatic scepticism about metaphysical modality is a perfectly reasonable position to maintain. I then illustrate the difficulties and limitations associated with some strategies for defeating such scepticism. These strategies appeal to associations between metaphysical modality and the following: objective probability, counterfactuals and distinctive explanatory value. 



I argue for two claims: that the ordinary English truth predicate is a gradable adjective and that truth is a property that comes in degrees. The first is a semantic claim, motivated by the linguistic evidence and the similarity of the truth predicate’s behavior to other gradable terms. The second is a claim in natural language metaphysics, motivated by interpreting the best semantic analysis of gradable terms as applied to the truth predicate. In addition to providing arguments for these two (...) 

Sherrilyn Roush’s Tracking Truth (2005) is an impressive, precision-crafted work. Although it sets out to rehabilitate the epistemological theory of Robert Nozick’s Philosophical Explanations (1981), its departures from Nozick’s line are extensive and original enough that it should be regarded as a distinct form of epistemological externalism. Roush’s mission is to develop an externalism that averts the problems and counterexamples encountered not only by Nozick’s theory but by other varieties of externalism as well. Roush advances both a theory of knowledge (...) 

In this paper, I present a solution to the Doomsday argument based on a third type of solution, in contrast to the Carter–Leslie view on the one hand and the Eckhardt et al. analysis on the other. I begin by strengthening both competing models by highlighting some variations of their original models, which renders them less vulnerable to several objections. I then describe a third line of solution, which incorporates insights from both Leslie’s and Eckhardt’s models and fits more (...) 

Spontaneous collapse theories of quantum mechanics turn the usual Schrödinger equation into a stochastic dynamical law. In particular, in this paper, I will focus on the GRW theory. Two philosophical issues that can be raised about GRW concern (i) the ontology of the theory, in particular the nature of the wave function and its role within the theory, and (ii) the interpretation of the objective probabilities involved in the dynamics of the theory. In recent years, it has been claimed (...) 

In a recent paper, Appley and Stoutenburg present two new objections to Explanationist Evidentialism (EE): the Regress Objection and the Threshold Objection. In this paper, I develop a version of EE that is independently plausible and empirically grounded, and show that it can meet Appley and Stoutenburg’s objections. 

The supporters of Indeterminate Futurism Theory [IFT] suggest three different reasons for preferring their view over Growing Block Theory [GBT]. Compared to GBT, IFT offers a better account of the open future problem, our cognitive attitudes towards future contingents, and how open the future is. Michael Tze-Sung Longenecker disagrees with them, stating that the advantages suggested by IFT's supporters are not advantages at all and/or can be accommodated by GBT. This means that, if he is right, there is no (...) 

The anticausal prophecies of the last century have been disproved. Causality is neither a ‘relic of a bygone age’ nor ‘another fetish of modern science’; it still occupies a large part of the current debate in philosophy and the sciences. This investigation into causal modelling presents the rationale of causality, i.e. the notion that guides causal reasoning in causal modelling. It is argued that causal models are regimented by a rationale of variation, not of regularity nor of invariance, thus breaking down the dominant (...) 



Deepfakes are realistic videos created using new machine learning techniques rather than traditional photographic means. They tend to depict people saying and doing things that they did not actually say or do. In the news media and the blogosphere, the worry has been raised that, as a result of deepfakes, we are heading toward an “infopocalypse” where we cannot tell what is real from what is not. Several philosophers have now issued similar warnings. In this paper, I offer an analysis (...) 

‘If I were to toss a coin 1000 times, then it would land heads exactly n times’. Is there a specific value of n that renders this counterfactual true? According to an increasingly influential view, there is. A precursor of the view goes back to the Molinists; more recently it has been inspired by Stalnaker, and versions of it have been advocated by Hawthorne, Bradley, Moss, Schulz, and Stefánsson. More generally, I attribute to these authors what I call Counterfactual Plenitude: For (...) 

Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted to contexts of risk depending on whether we have probabilities or not. Against this view, I argue that the risk–uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, (...) 

The paper suggests a modal predicate logic that deals with classical quantification and modalities as well as intermediate operators, like “most” and “mostly”. Following the theory of generalized quantifiers, we will understand them as two-place operators and call them determiners. Quantifiers as well as modal operators will be constructed from them. Besides the classical deduction, we discuss a weaker probabilistic inference “therefore, probably” defined by symmetrical probability measures in Carnap’s style. The given probabilistic inference relates intermediate quantification to singular (...) 

In this contribution, I comment on Raffaella Campaner’s defense of explanatory pluralism in psychiatry (in this volume). In her paper, Campaner focuses primarily on explanatory pluralism in contrast to explanatory reductionism. Furthermore, she distinguishes between pluralists who consider pluralism to be a temporary state on the one hand and pluralists who consider it to be a persisting state on the other hand. I suggest that it would be helpful to distinguish more than those two versions of pluralism – different understandings (...) 

The Problem of Old Evidence is a perennial issue for Bayesian confirmation theory. Garber famously argues that the problem can be solved by conditionalizing on the proposition that a hypothesis deductively implies the existence of the old evidence. In recent work, Hartmann and Fitelson (2015: 712–717) and Sprenger (2015: 383–401) aim for similar, but more general, solutions to the Problem of Old Evidence. These solutions are more general because they allow the explanatory relationship between a new hypothesis and old evidence (...) 

Over the last 40 years or so, there has been an explosion of cultural evolution research in anthropology and archaeology. In each discipline, cultural evolutionists investigate how interactions between individuals translate into group-level patterns, with the aim of explaining the diachronic dynamics and diversity of cultural traits. However, while much attention has been given to deterministic processes, we contend that current evolutionary accounts of cultural change are limited because they do not adopt a systematic stochastic approach. First, we show (...) 

From Leibniz to Krauss philosophers and scientists have raised the question as to why there is something rather than nothing. Why-questions request a type of explanation and this is often thought to include a deductive component. With classical logic in the background only trivial answers are forthcoming. With free logics in the background, be they of the negative, positive or neutral variety, only question-begging answers are to be expected. The same conclusion is reached for the modal version of the Question, (...) 

This paper introduces a new infinite paradox. The main novelty is that it poses problems of causality in a very different form from the one in use until now. By means of a probabilistic generalization, the paradox shows that the disposition to act according to a specific plan is not always necessary to derive causal effects in Benardete-type contexts involving infinity. It also suggests that, in such cases, the explanation for those causal effects requires a propensity interpretation of probability. 

Drift is often characterized in statistical terms. Yet such a purely statistical characterization is ambiguous, for it can accept multiple physical interpretations. Because of this ambiguity it is important to distinguish what sorts of processes can lead to this statistical phenomenon. After presenting a physical interpretation of drift originating from the most popular interpretation of fitness, namely the propensity interpretation, I propose a different one starting from an analysis of the concept of drift made by Godfrey-Smith. Further on, I show (...) 

When members of a group doxastically disagree with each other, decisions in the group are often hard to make. The members are supposed to find an epistemic compromise. How do members of a group reach a rational epistemic compromise on a proposition when they have different (rational) credences in the proposition? I answer the question by suggesting the Fine-Grained Method of Aggregation, which was introduced in Brössel and Eder (2014) and is further developed here. I show how this method faces (...) 