Are counterfactuals with true antecedents and consequents automatically true? That is, is Conjunction Conditionalization (if (X & Y), then (X > Y)) valid? Stalnaker and Lewis think so, but many others disagree. We note here that the extant arguments for Conjunction Conditionalization are unpersuasive, before presenting a family of more compelling arguments. These arguments rely on some standard theorems of the logic of counterfactuals as well as a plausible and popular semantic claim about certain semifactuals. Denying Conjunction Conditionalization, then, requires rejecting other aspects of the standard logic of counterfactuals, or else our intuitive picture of semifactuals.
This discussion note examines a recent argument for the principle that any counterfactual with true components is itself true. That argument rests upon two widely accepted principles of counterfactual logic to which the paper presents counterexamples. The conclusion speculates briefly upon the wider lessons that philosophers should draw from these examples for the semantics of counterfactuals.
Greaves and Wallace argue that conditionalization maximizes expected accuracy. In this paper I show that their result only applies to a restricted range of cases. I then show that the update procedure that maximizes expected accuracy in general is one in which, upon learning P, we conditionalize, not on P, but on the proposition that we learned P. After proving this result, I provide further generalizations and show that much of the accuracy-first epistemology program is committed to KK-like iteration principles and to the existence of a class of propositions that rational agents will be certain of if and only if they are true.
At the heart of Bayesianism is a rule, Conditionalization, which tells us how to update our beliefs. Typical formulations of this rule are underspecified. This paper considers how, exactly, this rule should be formulated. It focuses on three issues: when a subject’s evidence is received, whether the rule prescribes sequential or interval updates, and whether the rule is narrow or wide scope. After examining these issues, it argues that there are two distinct and equally viable versions of Conditionalization to choose from. And which version we choose has interesting ramifications, bearing on issues such as whether Conditionalization can handle continuous evidence, and whether Jeffrey Conditionalization is really a generalization of Conditionalization.
Seeing a red hat can (i) increase my credence in the hat is red, and (ii) introduce a negative dependence between that proposition and potential undermining defeaters such as the light is red. The rigidity of Jeffrey Conditionalization makes this awkward, as rigidity preserves independence. The picture is less awkward given ‘Holistic Conditionalization’, or so it is claimed. I defend Jeffrey Conditionalization’s consistency with underminable perceptual learning and its superiority to Holistic Conditionalization, arguing that the latter is merely a special case of the former, is itself rigid, and is committed to implausible accounts of perceptual confirmation and of undermining defeat.
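The rigidity condition at issue here says that a Jeffrey update on a partition leaves probabilities conditional on each partition cell unchanged. A minimal numerical sketch of this (the worlds, propositions, and numbers are invented purely for illustration):

```python
from itertools import product

# Worlds are pairs (hat_is_red, light_is_red); the prior is an arbitrary joint.
worlds = list(product([True, False], repeat=2))
prior = {(True, True): 0.2, (True, False): 0.4,
         (False, True): 0.1, (False, False): 0.3}

def prob(p, event):
    return sum(p[w] for w in p if event(w))

def cond(p, a, b):
    return prob(p, lambda w: a(w) and b(w)) / prob(p, b)

E = lambda w: w[0]        # "the hat is red": the partition cell shifted by experience
notE = lambda w: not w[0]
A = lambda w: w[1]        # "the light is red"

# Jeffrey update: move P(E) from 0.6 to 0.9, rescaling uniformly within each cell.
new_E = 0.9
posterior = {w: prior[w] * (new_E / prob(prior, E) if E(w)
                            else (1 - new_E) / prob(prior, notE))
             for w in worlds}

# Rigidity: the probability of A conditional on each cell is preserved.
print(round(cond(prior, A, E), 4), round(cond(posterior, A, E), 4))  # → 0.3333 0.3333
```

Since the update rescales every world within a cell by the same factor, ratios within a cell, and hence all conditional probabilities given that cell, cannot change; this is the independence-preservation the abstract refers to.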
The applicability of Bayesian conditionalization in setting one’s posterior probability for a proposition, α, is limited to cases where the value of a corresponding prior probability, P_PRI(α|∧E), is available, where ∧E represents one’s complete body of evidence. In order to extend probability updating to cases where the prior probabilities needed for Bayesian conditionalization are unavailable, I introduce an inference schema, defeasible conditionalization, which allows one to update one’s personal probability in a proposition by conditioning on a proposition that represents a proper subset of one’s complete body of evidence. While defeasible conditionalization has wider applicability than standard Bayesian conditionalization (since it may be used when the value of a relevant prior probability, P_PRI(α|∧E), is unavailable), there are circumstances under which some instances of defeasible conditionalization are unreasonable. To address this difficulty, I outline the conditions under which instances of defeasible conditionalization are defeated. To conclude the article, I suggest that the prescriptions of direct inference and statistical induction can be encoded within the proposed system of probability updating, by the selection of intuitively reasonable prior probabilities.
Conditionalization is one of the central norms of Bayesian epistemology. But there are a number of competing formulations, and a number of arguments that purport to establish it. In this paper, I explore which formulations of the norm are supported by which arguments. In their standard formulations, each of the arguments I consider here depends on the same assumption, which I call Deterministic Updating. I will investigate whether it is possible to amend these arguments so that they no longer depend on it. As I show, whether this is possible depends on the formulation of the norm under consideration.
Conditionalization is a widely endorsed rule for updating one’s beliefs. But a sea of complaints has been raised about it, including worries regarding how the rule handles error correction, changing desiderata of theory choice, evidence loss, self-locating beliefs, learning about new theories, and confirmation. In light of such worries, a number of authors have suggested replacing Conditionalization with a different rule — one that appeals to what I’ll call “ur-priors”. But different authors have understood the rule in different ways, and these different understandings solve different problems. In this paper, I aim to map out the terrain regarding these issues. I survey the different problems that might motivate the adoption of such a rule, flesh out the different understandings of the rule that have been proposed, and assess their pros and cons. I conclude by suggesting that one particular batch of proposals, proposals that appeal to what I’ll call “loaded evidential standards”, are especially promising.
Colin Howson (1995) offers a counter-example to the rule of conditionalization. I will argue that the counter-example doesn't hit its target. The problem is that Howson mis-describes the total evidence the agent has. In particular, Howson overlooks how the restriction that the agent learn 'E and nothing else' interacts with the de se evidence 'I have learnt E'.
In this paper we discuss the extent to which conjunction and disjunction can be rightfully regarded as such, in the context of infectious logics. Infectious logics are peculiar many-valued logics whose underlying algebra has an absorbing or infectious element, which is assigned to a compound formula whenever it is assigned to one of its components. To discuss these matters, we review the philosophical motivations for infectious logics due to Bochvar, Halldén, Fitting, Ferguson and Beall, noticing that none of them discusses our main question. This is why we finally turn to the analysis of the truth-conditions for conjunction and disjunction in infectious logics, employing the framework of plurivalent logics, as discussed by Priest. In doing so, we arrive at the interesting conclusion that, in the context of infectious logics, conjunction is conjunction, whereas disjunction is not disjunction.
This paper shows that any view of future contingent claims that treats such claims as having indeterminate truth values or as simply being false implies probabilistic irrationality. This is because such views of the future imply violations of reflection, special reflection and conditionalization.
The central topic of this inquiry is a cross-linguistic contrast in the interaction of conjunction and negation. In Hungarian (Russian, Serbian, Italian, Japanese), in contrast to English (German), negated definite conjunctions are naturally and exclusively interpreted as `neither’. It is proposed that in Hungarian-type languages, conjunctions simply replicate the behavior of plurals, their closest semantic relatives. More puzzling is why English-type languages present a different range of interpretations. By teasing out finer distinctions in focus on connectives, syntactic structure, and context, the paper tracks down missing readings and argues that it is eventually not necessary to postulate a radical cross-linguistic semantic difference. In the course of making that argument it is observed that negated conjunctions on the `neither’ reading carry the expectation that the predicate hold of both conjuncts. The paper investigates several hypotheses concerning the source of this expectation.
Rodriguez-Pereyra (2006) argues for the disjunction thesis but against the conjunction thesis. I argue that accepting the disjunction thesis undermines his argument against the conjunction thesis.
How do temporal and eternal beliefs interact? I argue that acquiring a temporal belief should have no effect on eternal beliefs for an important range of cases. Thus, I oppose the popular view that new norms of belief change must be introduced for cases where the only change is the passing of time. I defend this position from the purported counter-examples of the Prisoner and Sleeping Beauty. I distinguish two importantly different ways in which temporal beliefs can be acquired and draw some general conclusions about their impact on eternal beliefs.
There are two ways of understanding the notion of a contradiction: as a conjunction of a statement and its negation, or as a pair of statements one of which is the negation of the other. Correspondingly, there are two ways of understanding the Law of Non-Contradiction (LNC), i.e., the law that says that no contradictions can be true. In this paper I offer some arguments to the effect that on the first (collective) reading LNC is non-negotiable, but on the second (distributive) reading it is perfectly plausible to suppose that LNC may, in some rather special and perhaps undesirable circumstances, fail to hold.
This paper is a response to replies by Dan López de Sa and Mark Jago to my ‘Truthmaking, Entailment, and the Conjunction Thesis’. In that paper, my main aim was to argue against the Entailment Principle by arguing against the Conjunction Thesis, which is entailed by the Entailment Principle. In the course of so doing, although not essential for my project in that paper, I defended the Disjunction Thesis. López de Sa has objected both to my defence of the Disjunction Thesis and my case against the Conjunction Thesis. I shall show that his objections are unfounded and based on serious misunderstandings of my position, what the relevant debate is, and some fundamental notions of Truthmaker Theory. Jago argues that accepting the Disjunction Thesis and rejecting the Conjunction Thesis is hard to maintain. But I show that Jago has not shown that accepting the Disjunction Thesis while rejecting the Conjunction Thesis is impossible or even hard to maintain. Jago believes that, to accept the Disjunction Thesis while rejecting the Conjunction Thesis, one needs to reject his axiom (T3), which says that all the truthmakers for <P&P> are truthmakers for <P>. I argue that there are reasons to reject such a principle, and the version of it that says that what makes <P&P> true makes <P> true.
It has been argued that if the rigidity condition is satisfied, a rational agent operating with uncertain evidence should update her subjective probabilities by Jeffrey conditionalization or else a series of bets resulting in a sure loss could be made against her. We show, however, that even if the rigidity condition is satisfied, it is not always safe to update probability distributions by JC, because there exist sequences of non-misleading uncertain observations such that it may be foreseen that an agent who updates her subjective probabilities by JC will end up nearly certain that a false hypothesis is true. We analyze the features of JC that lead to this problem, specify the conditions in which it arises and respond to potential objections.
Epistemic decision theory produces arguments with both normative and mathematical premises. I begin by arguing that philosophers should care about whether the mathematical premises (1) are true, (2) are strong, and (3) admit simple proofs. I then discuss a theorem that Briggs and Pettigrew (2020) use as a premise in a novel accuracy-dominance argument for conditionalization. I argue that the theorem and its proof can be improved in a number of ways. First, I present a counterexample that shows that one of the theorem’s claims is false. As a result of this, Briggs and Pettigrew’s argument for conditionalization is unsound. I go on to explore how a sound accuracy-dominance argument for conditionalization might be recovered. In the course of doing this, I prove two new theorems that correct and strengthen the result reported by Briggs and Pettigrew. I show how my results can be combined with various normative premises to produce sound arguments for conditionalization. I also show that my results can be used to support normative conclusions that are stronger than the one that Briggs and Pettigrew’s argument supports. Finally, I show that Briggs and Pettigrew’s proofs can be simplified considerably.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) to be more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
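The probabilistic principle the fallacy violates is simply that a conjunction can never be more probable than either of its conjuncts: P(B & F) ≤ P(B) on any probability distribution. A minimal sketch checking this over random joint distributions (the distributions are arbitrary, for illustration only):

```python
import random

random.seed(0)
for _ in range(1000):
    # A random joint distribution over the four truth-value combinations of B and F.
    masses = [random.random() for _ in range(4)]
    total = sum(masses)
    p_bf, p_bnotf, p_notbf, p_notbnotf = (m / total for m in masses)
    p_b = p_bf + p_bnotf
    # The conjunction never beats its conjunct, whatever the distribution.
    assert p_bf <= p_b + 1e-12
print("P(B & F) <= P(B) held in all 1000 random trials")
```

The inequality is immediate from additivity (P(B) = P(B & F) + P(B & ¬F) and probabilities are non-negative), which is why the experimental judgments conflict with any probabilistic reading of "more probable".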
Boghossian’s (2003) proposal to conditionalize concepts as a way to secure their legitimacy in disputable cases applies well, not just to pejoratives – for which Boghossian first proposed it – but also to thick ethical concepts. It actually has important advantages when dealing with some worries raised by the application of thick ethical terms, and the truth and facticity of corresponding statements. In this paper, I will try to show, however, that thick ethical concepts present a specific case, whose analysis requires a somewhat different reconstruction from that which Boghossian offers. A proper account of thick ethical concepts should be able to explain how ‘evaluated’ and ‘evaluation’ are connected.
Church and Fitch have argued that from the verificationist thesis “for every proposition, if this proposition is true, then it is possible to know it” we can derive that for every truth there is someone who knows that truth. Moreover, Humberstone has shown that from the latter proposition we can derive that someone knows every truth, hence that there is an omniscient being. In his article “Omnificence”, John Bigelow adapted these arguments in order to argue that from the assumption "every contingent proposition is such that if it is true something brought it about that it is true" we can derive that there is an omnificent being: a being that brings it about that every true contingent proposition is true. In my reply to his article, I show that Bigelow’s argument is flawed because there is some formal property that the knowledge operator has but that the bringing about operator lacks. This is the property of distributing over conjunctions. I explain why what brings it about that some conjunctive proposition is true need not bring it about that its conjuncts are true.
Starting from a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then we analyze it in the setting of coherence. We give a representation of the conjoined conditional and we show that this new object is a conditional random quantity, whose set of possible values normally contains the probabilities assessed for the two conditional events. We examine some cases of logical dependencies, where the conjunction is a conditional event; moreover, we give the lower and upper bounds on the conjunction. We also examine an apparent paradox concerning stochastic independence which can actually be explained in terms of uncorrelation. We briefly introduce the notions of disjunction and iterated conditioning and we show that the usual probabilistic properties still hold.
Let f(1)=2, f(2)=4, and let f(n+1)=f(n)! for every integer n≥2. Edmund Landau's conjecture states that the set P(n^2+1) of primes of the form n^2+1 is infinite. Landau's conjecture implies the following unproven statement Φ: card(P(n^2+1))<ω ⇒ P(n^2+1)⊆(-∞,f(7)]. Let B denote the system of equations: {x_i!=x_k: i,k∈{1,...,9}} ∪ {x_i⋅x_j=x_k: i,j,k∈{1,...,9}}. We write down a system U⊆B of 9 equations which has exactly two solutions in positive integers, namely (1,...,1) and (f(1),...,f(9)). Let Ψ denote the statement: if a system S⊆B has at most finitely many solutions in positive integers x_1,...,x_9, then each such solution (x_1,...,x_9) satisfies x_1,...,x_9≤f(9). We write down a system A⊆B of 8 equations. The statement Ψ restricted to the system A is equivalent to the statement Φ. This heuristically proves the statement Φ. This proof does not yield that card(P(n^2+1))=ω. Algorithms always terminate. We explain the distinction between "existing algorithms" (i.e. algorithms whose existence is provable in ZFC) and "known algorithms" (i.e. algorithms whose existence is constructive and currently known to us). Conditions (1)-(5) concern sets X⊆N. *** (1) There are many elements of X and it is conjectured that X is infinite. (2) No known algorithm with no input returns the logical value of the statement card(X)=ω. (3) A known algorithm for every k∈N decides whether or not k∈X. (4) A known algorithm with no input returns an integer n satisfying card(X)<ω ⇒ X⊆(-∞,n]. (5) There is a known condition C, which can be formalized in ZFC, such that for all except at most finitely many k∈N, k satisfies the condition C if and only if k∈X. The simplest known such condition C defines in N the set X. *** The set X={k∈N: (f(7)<k) ⇒ (f(7),k)∩P(n^2+1)≠∅} satisfies conditions (1)-(4). A more complicated set X⊆N satisfies conditions (1)-(5).
No set X⊆N will satisfy conditions (1)-(4) forever, if for every algorithm with no inputs, at some future day, a computer will be able to execute this algorithm in 1 second or less. The physical limits of computation disprove this assumption. The statement Φ implies that conditions (1)-(5) hold for X={1}∪P(n^2+1). To define the condition Γ(X) from the title, we formulate condition (4) for n=(((24!)!)!)! and take the conjunction of conditions (1)-(4) or the conjunction of conditions (1)-(5).
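For concreteness, the recurrence defining f above can be evaluated for its first few values; beyond f(4) = 24! the values are far too large to compute (f(5) = (24!)! already is, which is the point of using f(7) as a bound). A minimal sketch:

```python
from math import factorial

def f(n):
    # f(1)=2, f(2)=4, and f(n+1)=f(n)! for every integer n>=2.
    if n == 1:
        return 2
    value = 4
    for _ in range(n - 2):
        value = factorial(value)
    return value

print(f(3))  # 4! = 24
print(f(4))  # 24! = 620448401733239439360000
```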
Is the basic mechanism behind presupposition projection fundamentally asymmetric or symmetric? This is a basic question for the theory of presupposition, which also bears on broader issues concerning the source of asymmetries observed in natural language: are these simply rooted in superficial asymmetries of language use (language use unfolds in time, which we experience as fundamentally asymmetric) or can they be, at least in part, directly referenced in linguistic knowledge and representations? In this paper we aim to make progress on these questions by exploring presupposition projection across conjunction, which has typically been taken as a central piece of evidence that presupposition projection is asymmetric. As a number of authors have recently pointed out, however, whether or not this conclusion is warranted is not clear once we take into account independent issues of redundancy. Building on previous work by Chemla & Schlenker (2012) and Schwarz (2015), we approach this question experimentally by using an inference task which controls for redundancy and presupposition suspension. We find strong evidence for left-to-right filtering across conjunctions, but no evidence for right-to-left filtering, suggesting that, at least as a default, presupposition projection across conjunction is indeed asymmetric.
Human agents happen to judge that a conjunction of two terms is more probable than one of the terms, in contradiction with the rules of classical probabilities—this is the conjunction fallacy. One of the most discussed accounts of this fallacy is currently the quantum-like explanation, which relies on models exploiting the mathematics of quantum mechanics. The aim of this paper is to investigate the empirical adequacy of major quantum-like models which represent beliefs with quantum states. We first argue that they can be tested in three different ways, in a question order effect configuration which is different from the traditional conjunction fallacy experiment. We then carry out our proposed experiment, with varied methodologies from experimental economics. The experimental results we get are at odds with the predictions of the quantum-like models. This strongly suggests that this quantum-like account of the conjunction fallacy fails. Future possible research paths are discussed.
The word 'and' can be used both intersectively, as in 'John lies and cheats', and collectively, as in 'John and Mary met'. Research has tried to determine which one of these two meanings is basic. Focusing on coordination of nouns ('liar and cheat'), this article argues that the basic meaning of 'and' is intersective. This theory has been successfully applied to coordination of other kinds of constituents (Partee & Rooth 1983; Winter 2001). Certain cases of noun coordination ('men and women') challenge this view, and have therefore been argued to favor the collective theory (Heycock & Zamparelli 2005). The main result of this article is that the intersective theory actually predicts the collective behavior of 'and' in 'men and women'. 'And' leads to collectivity by interacting with silent operators involving set minimization and choice functions, which have been postulated to account for phenomena involving indefinites, collective predicates and coordinations of noun phrases (Winter 2001). This article also shows that the collective theory does not generalize to coordinations of noun phrases in the way it has been previously suggested.
In this paper, I provide an accuracy-based argument for conditionalization (via reflection) that does not rely on norms of maximizing expected accuracy. (This is a draft of a paper that I wrote in 2013. It stalled for no very good reason. I still believe the content is right.)
This article addresses a debate in Descartes scholarship over the mind-dependence or -independence of time by turning to Merleau-Ponty’s "Nature" and "The Visible and the Invisible." In doing so, it shows that both sides of the debate ignore that time for Descartes is a measure of duration in general. The consequences of remembering what time is are that the future is shown to be the invisible of an intertwining of past and future, and that historicity is the invisible of God.
The causal and simulation theories are often presented as very distinct views about declarative memory, their major difference lying in the causal condition. The causal theory states that remembering involves an accurate representation causally connected to an earlier experience (the causal condition). In the simulation theory, remembering involves an accurate representation generated by a reliable memory process (no causal condition). I investigate how to construe detailed versions of these theories that correctly classify memory errors (DRM, “lost in the mall”, and memory-conjunction errors) as misremembering or confabulation. Neither causalists nor simulationists have paid attention to memory-conjunction errors, which is unfortunate because both theories have problems with these cases. The source of the difficulty is the background assumption that an act of remembering has one (and only one) target. I fix these theories for those cases. The resulting versions are closely related when implemented using tools of information theory, differing only on how memory transmits information about the past. The implementation provides us with insights about the distinction between confabulatory and non-confabulatory memory, where memory-conjunction errors have a privileged position.
A theory of truth is an explanation of the nature of truth together with a set of rules that true things obey. When we recall a memory, analyze a statement, or make an evaluation, we are in effect in quest of truth. Different theories of truth try to understand it from different perspectives. Attempts to analyze truth down the history can neatly be divided into two: classical and contemporary theories. The classical, or traditional, theories of truth are the Correspondence theory, the Coherence theory, and the Pragmatic theory. Contemporary, or modern, theories of truth are mostly deflationary theories. Over and above these two categories we shall discuss a new theory of truth known as the Functional theory of truth. In analyzing contemporary theories of truth, in particular the functionalist theory and specifically alethic pluralism, we take up the problem of mixed compounds and suggest a solution to it.
The idea for this special issue came from a paper published as an updated and abridged version of an older memorial lecture given by Brian D. Josephson and Michael Conrad at the Gujarat Vidyapith University in Ahmedabad, India on March 2, 1984. The title of this paper was “Uniting Eastern Philosophy and Western Science” (1992). We thought that this topic deserved to be revisited after 25 years to demonstrate to the scientific community what new insights and achievements were attained in this fairly broad field during this period.
A handful of well-known arguments (the 'diachronic Dutch book arguments') rely upon theorems establishing that, in certain circumstances, you are immune from sure monetary loss (you are not 'diachronically Dutch bookable') if and only if you adopt the strategy of conditionalizing (or Jeffrey conditionalizing) on whatever evidence you happen to receive. These theorems require non-trivial assumptions about which evidence you might acquire---in the case of conditionalization, the assumption is that, if you might learn that e, then it is not the case that you might learn something else that is consistent with e. These assumptions may not be relaxed. When they are, not only will non-(Jeffrey) conditionalizers be immune from diachronic Dutch bookability, but (Jeffrey) conditionalizers will themselves be diachronically Dutch bookable. I argue: 1) that there are epistemic situations in which these assumptions are violated; 2) that this reveals a conflict between the premise that susceptibility to sure monetary loss is irrational, on the one hand, and the view that rational belief revision is a function of your prior beliefs and the acquired evidence alone, on the other; and 3) that this inconsistency demonstrates that diachronic Dutch book arguments for (Jeffrey) conditionalization are invalid.
I advocate Time-Slice Rationality, the thesis that the relationship between two time-slices of the same person is not importantly different, for purposes of rational evaluation, from the relationship between time-slices of distinct persons. The locus of rationality, so to speak, is the time-slice rather than the temporally extended agent. This claim is motivated by consideration of puzzle cases for personal identity over time and by a very moderate form of internalism about rationality. Time-Slice Rationality conflicts with two proposed principles of rationality, Conditionalization and Reflection. Conditionalization is a diachronic norm saying how your current degrees of belief should fit with your old ones, while Reflection is a norm enjoining you to defer to the degrees of belief that you expect to have in the future. But they are independently problematic and should be replaced by improved, time-slice-centric principles. Conditionalization should be replaced by a synchronic norm saying what degrees of belief you ought to have given your current evidence and Reflection should be replaced by a norm which instructs you to defer to the degrees of belief of agents you take to be experts. These replacement principles do all the work that the old principles were supposed to do while avoiding their problems. In this way, Time-Slice Rationality puts the theory of rationality on firmer foundations and yields better norms than alternative, non-time-slice-centric approaches.
According to an increasingly popular epistemological view, people need outright beliefs in addition to credences to simplify their reasoning. Outright beliefs simplify reasoning by allowing thinkers to ignore small error probabilities. What is outright believed can change between contexts. It has been claimed that thinkers manage shifts in their outright beliefs and credences across contexts by an updating procedure resembling conditionalization, which I call pseudo-conditionalization (PC). But conditionalization is notoriously complicated. The claim that thinkers manage their beliefs via PC is thus in tension with the view that the function of beliefs is to simplify our reasoning. I propose to resolve this puzzle by rejecting the view that thinkers employ PC. Based on this solution, I furthermore argue for a descriptive and a normative claim. The descriptive claim is that the available strategies for managing beliefs and credences across contexts that are compatible with the simplifying function of outright beliefs can generate synchronic and diachronic incoherence in a thinker’s attitudes. Moreover, I argue that the view of outright belief as a simplifying heuristic is incompatible with the view that there are ideal norms of coherence or consistency governing outright beliefs that are too complicated for human thinkers to comply with.
How should a group with different opinions (but the same values) make decisions? In a Bayesian setting, the natural question is how to aggregate credences: how to use a single credence function to naturally represent a collection of different credence functions. An extension of the standard Dutch-book arguments that apply to individual decision-makers recommends that group credences should be updated by conditionalization. This imposes a constraint on what aggregation rules can be like. Taking conditionalization as a basic constraint, we gather lessons from the established work on credence aggregation, and extend this work with two new impossibility results. We then explore contrasting features of two kinds of rules that satisfy the constraints we articulate: one kind uses fixed prior credences, and the other uses geometric averaging, as opposed to arithmetic averaging. We also prove a new characterisation result for geometric averaging. Finally we consider applications to neighboring philosophical issues, including the epistemology of disagreement.
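The contrast between geometric and arithmetic averaging noted here can be illustrated numerically: geometric pooling commutes with conditionalization (pooling the credences and then updating gives the same result as updating each member first and then pooling), while arithmetic pooling in general does not. A minimal sketch, with invented credence functions and an invented piece of evidence:

```python
from math import sqrt, isclose

def normalize(p):
    total = sum(p)
    return [x / total for x in p]

def geometric_pool(p, q):
    # Geometric average of two credence functions, renormalized.
    return normalize([sqrt(a * b) for a, b in zip(p, q)])

def arithmetic_pool(p, q):
    return [(a + b) / 2 for a, b in zip(p, q)]

def conditionalize(p, event):
    # Zero out the worlds ruled out by the evidence, then renormalize.
    return normalize([x if i in event else 0.0 for i, x in enumerate(p)])

p1 = [0.5, 0.3, 0.2]   # one member's credences over three worlds
p2 = [0.2, 0.3, 0.5]   # another member's credences
E = {0, 1}             # evidence ruling out the third world

geo_pool_first = conditionalize(geometric_pool(p1, p2), E)
geo_update_first = geometric_pool(conditionalize(p1, E), conditionalize(p2, E))
print(all(isclose(a, b) for a, b in zip(geo_pool_first, geo_update_first)))  # True

ari_pool_first = conditionalize(arithmetic_pool(p1, p2), E)
ari_update_first = arithmetic_pool(conditionalize(p1, E), conditionalize(p2, E))
print(all(isclose(a, b) for a, b in zip(ari_pool_first, ari_update_first)))  # False
```

This commuting property of geometric pooling (sometimes called external Bayesianity in the pooling literature) is what makes it a natural fit for the conditionalization constraint the abstract describes; the example above is only a two-member, three-world illustration of it.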
In this paper I present a new way of understanding Dutch Book Arguments: the idea is that an agent is shown to be incoherent iff he would accept as fair a set of bets that would result in a loss under any interpretation of the claims involved. This draws on a standard definition of logical inconsistency. On this new understanding, the Dutch Book Arguments for the probability axioms go through, but the Dutch Book Argument for Reflection fails. The question of whether we have a Dutch Book Argument for Conditionalization is left open.
Dutch Book arguments have been presented for static belief systems and for belief change by conditionalization. An argument is given here that a rule for belief change which under certain conditions violates probability kinematics will leave the agent open to a Dutch Book.
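To illustrate the kind of vulnerability these arguments trade on, here is a hypothetical numerical sketch of a diachronic Dutch book against a planned update that violates conditionalization. The prior, the planned posterior q, and the bet construction are illustrative assumptions in the style of Lewis's construction, not drawn from the paper: the agent's prior gives P(E) = 0.6 and P(H | E) = 0.5, but she plans to adopt credence q = 0.7 in H upon learning E. Each bet is fair by her own lights when it is offered, yet she loses in every world.

```python
x, q, p_e = 0.5, 0.7, 0.6   # P(H|E), planned posterior in H, P(E)

def agent_net(h, e):
    """Agent's net payoff in the world where H has truth value h
    and E has truth value e."""
    total = 0.0
    # t0: agent sells a conditional bet on H given E at its fair price x
    # (she receives x; pays 1 if H-and-E; refunds x if not-E).
    total += x
    if e and h:
        total -= 1
    if not e:
        total -= x
    # t0: agent buys a bet paying (q - x) if E, at fair price P(E)*(q - x).
    total -= p_e * (q - x)
    if e:
        total += q - x
    # t1 (only if E): with her planned credence q, she regards a bet on H
    # at price q as fair, so she buys it.
    if e:
        total -= q
        if h:
            total += 1
    return round(total, 10)

losses = {agent_net(h, e) for h in (True, False) for e in (True, False)}
print(losses)  # a single negative value: the same sure loss in every world
```

The point of the sketch is that the guaranteed loss, here q minus x scaled by the side bet, exists exactly because the planned posterior departs from the conditional probability; setting q = x makes every net payoff zero.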
The externalist says that your evidence could fail to tell you what evidence you do or do not have. In that case, it could be rational for you to be uncertain about what your evidence is. This is a kind of uncertainty which orthodox Bayesian epistemology has difficulty modeling. For, if externalism is correct, then the orthodox Bayesian learning norms of conditionalization and reflection are inconsistent with each other. I recommend that an externalist Bayesian reject conditionalization. In its stead, I provide a new theory of rational learning for the externalist. I defend this theory by arguing that its advice will be followed by anyone whose learning dispositions maximize expected accuracy. I then explore some of this theory's consequences for the rationality of epistemic akrasia, peer disagreement, undercutting defeat, and uncertain evidence.
The Reflection Principle can be defended with a Diachronic Dutch Book Argument (DBA), but it is also defeated by numerous compelling counter-examples. It seems then that Diachronic DBAs can lead us astray. Should we reject them en masse—including Lewis's Diachronic DBA for Conditionalization? Rachael Briggs's "suppositional test" is supposed to differentiate between Diachronic DBAs that we can safely ignore (including the DBA for Reflection) and Diachronic DBAs that we should find compelling (including the DBA for Conditionalization). I argue that Briggs's suppositional test is wrong: it sets the bar for coherence too high and places certain cases of self-doubt on the wrong side of the divide. Given that the suppositional test is unsatisfactory, we are left without any justification for discriminating between Diachronic DBAs and ought to reject them all—including the DBA for Conditionalization.
Weisberg introduces a phenomenon he terms perceptual undermining. He argues that it poses a problem for Jeffrey conditionalization, and Bayesian epistemology in general. This is Weisberg’s paradox. Weisberg argues that perceptual undermining also poses a problem for ranking theory and for Dempster-Shafer theory. In this note I argue that perceptual undermining does not pose a problem for any of these theories: for true conditionalizers Weisberg’s paradox is a false alarm.
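Since Jeffrey conditionalization recurs throughout these abstracts, a minimal sketch may help fix ideas. The joint prior and the partition shift below are made-up numbers: Jeffrey conditionalization rescales each cell of a partition to its new, experience-given probability while preserving the conditional probabilities within each cell, and strict conditionalization falls out as the limiting case where the learned cell gets probability 1.

```python
def jeffrey_update(p, partition_shift):
    """Jeffrey conditionalization on the partition {E, ~E}.
    p: dict mapping worlds (a_value, e_value) to prior probabilities.
    partition_shift: dict mapping e_value to its new probability."""
    new = {}
    for world, prob in p.items():
        e_val = world[1]
        # prior probability of this world's partition cell
        old_cell = sum(q for w, q in p.items() if w[1] == e_val)
        # rescale the cell, preserving within-cell conditional probabilities
        new[world] = prob * partition_shift[e_val] / old_cell
    return new

# Hypothetical joint prior over A and E; P(E) = 0.4, P(A|E) = 0.8:
prior = {('A', 'E'): 0.32, ('A', '~E'): 0.08,
         ('~A', 'E'): 0.08, ('~A', '~E'): 0.52}

# Experience makes E very likely but not certain:
post = jeffrey_update(prior, {'E': 0.9, '~E': 0.1})

# Strict conditionalization on E is the special case with certainty:
strict = jeffrey_update(prior, {'E': 1.0, '~E': 0.0})
```

In the strict case the new credence in A collapses to the old conditional credence P(A|E); in the Jeffrey case it is a mixture of P(A|E) and P(A|~E) weighted by the new cell probabilities.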
From the dictum "ought implies can", it has been argued that no account of belief's normativity can avoid the unpalatable result that, for unbelievable propositions such as "It is raining and nobody believes that it is raining", one ought not to believe them even if true. In this article, I argue that this move only succeeds on a faulty assumption about the conjunction of doxastic "oughts".
Import-Export says that a conditional 'If p, if q, r' is always equivalent to the conditional 'If p and q, r'. I argue that Import-Export does not sit well with a classical approach to conjunction: given some plausible and widely accepted principles about conditionals, Import-Export together with classical conjunction leads to absurd consequences. My main goal is to draw out these surprising connections. In concluding I argue that the right response is to reject Import-Export and adopt instead a limited version which better fits natural language data; accounts for all the intuitions that motivate Import-Export in the first place; and fits better with a classical conjunction.
In this essay, I cast doubt on an apparent truism: namely, that if evidence is available for gathering and use at a negligible cost, then it's always instrumentally rational for us to gather that evidence and use it for making decisions. Call this thesis Value of Information. I show that Value of Information conflicts with two other plausible theses. The first is the view that an agent's evidence can entail non-trivial propositions about the external world. The second is the view that epistemic rationality requires us to update our credences by conditionalization. These two theses, given some plausible assumptions, make room for rationally biased inquiries where Value of Information fails. I go on to argue that this is bad news for defenders of Value of Information.
Section 1 provides a brief summary of the pair-list literature, singling out some points that are particularly relevant for the coming discussion. Section 2 shows that the dilemma of quantification versus domain restriction arises only in extensional complement interrogatives. In matrix questions and in intensional complements only universals support pair-list readings, whence the simplest domain restriction treatment suffices. Related data including conjunction, disjunction, and cumulative readings are discussed. Section 3 argues that in the case of extensional complements the domain restriction treatment is inadequate for at least two independent reasons. One has to do with the fact that not only upward monotonic quantifiers support pair-list readings, and the other with the derivation of apparent scope-out readings. The reasoning is supplemented with some discussion of the semantic properties of layered quantifiers. The above will establish the need for quantification, so the question arises how the objections explicitly enlisted in the literature against quantification can be answered. Section 4 considers the de dicto reading of the quantifier's restriction, quantificational variability, and the absence of pair-list readings with whether questions, and argues that they need not militate against the quantificational analysis. Section 5 summarizes the emergent proposal. Finally, section 6 discusses the significance of the above findings for the behavior of weak islands.
Accuracy-first accounts of rational learning attempt to vindicate the intuitive idea that, while rationally formed belief need not be true, it is nevertheless likely to be true. To this end, they attempt to show that the Bayesian's rational learning norms are a consequence of the rational pursuit of accuracy. Existing accounts fall short of this goal, for they presuppose evidential norms which are not and cannot be vindicated in terms of the single-minded pursuit of accuracy. I propose an alternative account, according to which learning experiences rationalize changes in the way you value accuracy, which in turn rationalize changes in belief. I show that this account is capable of vindicating the Bayesian's rational learning norms in terms of the single-minded pursuit of accuracy, so long as accuracy is rationally valued.
According to Bayesian orthodoxy, an agent should update---or at least should plan to update---her credences by conditionalization. Some have defended this claim by means of a diachronic Dutch book argument. They say: an agent who does not plan to update her credences by conditionalization is vulnerable (by her own lights) to a diachronic Dutch book, i.e., a sequence of bets which, when accepted, guarantee loss of utility. Here, I show that this argument is in tension with evidence externalism, i.e., the view that an agent's evidence can entail non-trivial propositions about the external world. I argue that this tension casts doubt on the idea that diachronic Dutch books can be used to justify or vindicate updating plans.
The Sleeping Beauty problem has attracted considerable attention in the literature as a paradigmatic example of how self-locating uncertainty creates problems for the Bayesian principles of Conditionalization and Reflection. Furthermore, it is also thought to raise serious issues for diachronic Dutch Book arguments. I show that, contrary to what is commonly accepted, it is possible to represent the Sleeping Beauty problem within a standard Bayesian framework. Once the problem is correctly represented, the 'thirder' solution satisfies standard rationality principles, vindicating why it is not vulnerable to diachronic Dutch Book arguments. Moreover, the diachronic Dutch Books against the 'halfer' solutions fail to undermine the standard arguments for Conditionalization. The main upshot that emerges from my discussion is that the disagreement between different solutions does not challenge the applicability of Bayesian reasoning to centered settings, nor the commitment to Conditionalization, but is instead an instance of the familiar problem of choosing the priors.
Michael Rescorla (2020) has recently pointed out that the standard arguments for Bayesian Conditionalization assume that whenever you take yourself to learn something with certainty, it's true. Most people would reject this assumption. In response, Rescorla offers an improved Dutch Book argument for Bayesian Conditionalization that does not make this assumption. My purpose in this paper is two-fold. First, I want to illuminate Rescorla's new argument by giving a very general Dutch Book argument that applies to many cases of updating beyond those covered by Conditionalization, and then showing how Rescorla's version follows as a special case of that. Second, I want to show how to generalise Briggs and Pettigrew's Accuracy Dominance argument to avoid the assumption that Rescorla has identified (Briggs & Pettigrew 2018).
This paper develops a framework for second-order conditionalization on statements about one's own epistemic reliability. It generalizes the framework of "Second-Guessing" (2009) to the case where the subject is uncertain about her reliability. See also "Epistemic Self-Doubt" (2017).