We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams' Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well understood. It is (ii) that holds the key to improving our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
This paper argues that the technical notion of conditional probability, as given by the ratio analysis, is unsuitable for dealing with our pretheoretical and intuitive understanding of both conditionality and probability. The relevant type of conditionality is found in some well-defined group of conditional statements. As an alternative, therefore, we briefly offer grounds for what we would call an ontological reading of both conditionality and conditional probability in general. This is an ontological account of conditionals that includes an irreducible dispositional connection between the antecedent and consequent conditions, and on which the conditional has to be treated as an indivisible whole rather than compositionally. It is not offered as a fully developed theory of conditionality, but it can be used, we claim, to explain why calculations according to the RATIO scheme do not coincide with our intuitive notion of conditional probability. What it shows us is that for an understanding of the whole range of conditionals we will need what John Heil (2003), in response to Quine (1953), calls an ontological point of view.
Conditional probability is often used to represent the probability of the conditional. However, triviality results suggest that the thesis that the probability of the conditional always equals conditional probability leads to untenable conclusions. In this paper, I offer an interpretation of this thesis in a possible worlds framework, arguing that the triviality results make assumptions at odds with the use of conditional probability. I argue that these assumptions come from a theory called the operator theory and that the rival restrictor theory can avoid these problematic assumptions. In doing so, I argue that recent extensions of the triviality arguments to restrictor conditionals fail, making assumptions which are only justified on the operator theory.
The standard treatment of conditional probability leaves conditional probability undefined when the conditioning proposition has zero probability. Nonetheless, some find the option of extending the scope of conditional probability to include zero-probability conditions attractive or even compelling. This article reviews some of the pitfalls associated with this move, and concludes that, for the most part, probabilities conditional on zero-probability propositions are more trouble than they are worth.
A theory of cognitive systems individuation is presented and defended. The approach has some affinity with Leonard Talmy's Overlapping Systems Model of Cognitive Organization, and the paper's first section explores aspects of Talmy's view that are shared by the view developed herein. According to the view on offer -- the conditional probability of co-contribution account (CPC) -- a cognitive system is a collection of mechanisms that contribute, in overlapping subsets, to a wide variety of forms of intelligent behavior. Central to this approach is the idea of an integrated system. A formal characterization of integration is laid out in the form of a conditional-probability-based measure of the clustering of causal contributors to the production of intelligent behavior. I relate the view to the debate over extended and embodied cognition and respond to objections that have been raised in print by Andy Clark, Colin Klein, and Felipe de Brigard.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
There are many scientific and everyday cases where each of Pr(H1 | E) and Pr(H2 | H1) is high and it seems that Pr(H2 | E) is high. But high probability is not transitive, and so it might be in such cases that each of Pr(H1 | E) and Pr(H2 | H1) is high and in fact Pr(H2 | E) is not high. There is no issue in the special case where the following condition, which I call “C1”, holds: H1 entails H2. This condition is sufficient for transitivity in high probability. But many of the scientific and everyday cases referred to above are cases where it is not the case that H1 entails H2. I consider whether there are additional conditions sufficient for transitivity in high probability. I consider three candidate conditions. I call them “C2”, “C3”, and “C2&3”. I argue that C2&3, but neither C2 nor C3, is sufficient for transitivity in high probability. I then set out some further results and relate the discussion to the Bayesian requirement of coherence.
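The failure of transitivity for high probability can be checked with a toy model. The following sketch is my own construction, not an example from the paper: a uniform space of 100 worlds in which H1 fails to entail H2, so condition C1 does not hold and transitivity breaks down.

```python
from fractions import Fraction

# Hypothetical toy model (not from the paper): 100 equally likely worlds.
# E = worlds 1-10, H1 = worlds 1-100, H2 = worlds 11-100.
# Note H1 does not entail H2 (worlds 1-10 are in H1 but not H2),
# so the sufficient condition C1 fails.
worlds = set(range(1, 101))
E = set(range(1, 11))
H1 = set(range(1, 101))
H2 = set(range(11, 101))

def pr(A, given=None):
    """Conditional probability Pr(A | given) under the uniform measure."""
    base = worlds if given is None else given
    return Fraction(len(A & base), len(base))

print(pr(H1, given=E))   # 1    -- Pr(H1 | E) is high
print(pr(H2, given=H1))  # 9/10 -- Pr(H2 | H1) is high
print(pr(H2, given=E))   # 0    -- yet Pr(H2 | E) is not high
```

Both premise probabilities are at least 0.9, yet the "conclusion" probability is zero, illustrating why an extra condition such as C1 (or the paper's C2&3) is needed.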
This paper explores the interaction of well-motivated (if controversial) principles governing the probability of conditionals with accounts of what it is for a sentence to be indefinite. The conclusion can be played in a variety of ways. It could be regarded as a new reason to be suspicious of the intuitive data about the probability of conditionals; or, holding fixed the data, it could be used to gain traction on the philosophical analysis of a contentious notion: indefiniteness. The paper outlines the various options, and shows that ‘rejectionist’ theories of indefiniteness are incompatible with the results. Rejectionist theories include popular accounts such as supervaluationism, non-classical truth-value gap theories, and accounts of indeterminacy that centre on rejecting the law of excluded middle. An appendix compares the results obtained here with the ‘impossibility’ results descending from Lewis (1976).
A few purported counterexamples to the Adams thesis have cropped up in the literature in the last few decades. I propose a theory that accounts for them, in a way that makes the connections between indicative conditionals and counterfactuals clearer.
The chapter is devoted to the probability and acceptability of indicative conditionals. Focusing on three influential theses, the Equation, Adams’ thesis, and the qualitative version of Adams’ thesis, Sikorski argues that none of them is well supported by the available empirical evidence. In the most controversial case of the Equation, the results of many studies which support it are, at least to some degree, undermined by some recent experimental findings. Sikorski discusses the Ramsey Test, and Lewis’s triviality proof, with special attention dedicated to the popular ways of blocking it. Sikorski concludes that the role of the three theses in future studies of conditionals should be re-thought, and he presents alternative proposals.
Dorothy Edgington’s work has been at the centre of a range of ongoing debates in philosophical logic, philosophy of mind and language, metaphysics, and epistemology. This work has focused, although by no means exclusively, on the overlapping areas of conditionals, probability, and paradox. In what follows, I briefly sketch some themes from these three areas relevant to Dorothy’s work, highlighting how some of Dorothy’s work and some of the contributions of this volume fit in to these debates.
This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional If A, then C equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake but differs for modal and conditional sentences.
While we cannot ensure the occurrence of serendipity due to its nature of unexpectedness, we can try to prepare the optimal conditions to improve the possibility. This chapter first describes two types of unexpected information: within or from beyond one’s perceivable range. Next, we describe four stages of the serendipity attainment process: navigation, noticing, evaluation, and implementation. On this basis, we discuss six scenarios in the order of serendipity encounter and attainment probability, which are determined by information availability in the environment and the mindset in terms of information processing. The serendipity attainment process has a higher success rate when acquiring precise navigation and employing the 3D principles of creativity (best expertise within discipline, the best expertise out of discipline, and discipline process).
*This work is no longer under development* Two major themes in the literature on indicative conditionals are that the content of indicative conditionals typically depends on what is known (e.g. Nolan; Weatherson; Gillies), and that conditionals are intimately related to conditional probabilities (e.g. Stalnaker; McGee; Adams). In possible world semantics for counterfactual conditionals, a standard assumption is that conditionals whose antecedents are metaphysically impossible are vacuously true (Lewis; see Nolan for criticism). This aspect has recently been brought to the fore, and defended by Tim Williamson, who uses it to characterize alethic necessity by exploiting such equivalences as: □A ⇔ (¬A □→ A). One might wish to postulate an analogous connection for indicative conditionals, with indicatives whose antecedents are epistemically impossible being vacuously true (‘epistemically impossible’ here means incompatible with what is known): and indeed, the modal account of indicative conditionals of Brian Weatherson has exactly this feature. This allows one to characterize an epistemic modal by the equivalence □A ⇔ (¬A → A). For simplicity, in what follows we write □A as KA and think of it as expressing that subject S knows that A. (This idea was suggested to me in conversation by John Hawthorne; I do not know of it being explored in print. The plausibility of this characterization will depend on the exact sense of ‘epistemically possible’ in play: if it is compatibility with what a single subject knows, then KA can be read ‘the relevant subject knows that A’; if it is more delicately formulated, we might be able to read KA as the epistemic modal ‘must’.) The connection to probability has received much attention. Stalnaker suggested, as a way of articulating the ‘Ramsey Test’, the following very general schema for indicative conditionals relative to some probability function P: P(A → B) = P(B | A).
Why are conditional degrees of belief in an observation E, given a statistical hypothesis H, aligned with the objective probabilities expressed by H? After showing that standard replies are not satisfactory, I develop a suppositional analysis of conditional degree of belief, transferring Ramsey’s classical proposal to statistical inference. The analysis saves the alignment, explains the role of chance-credence coordination, and rebuts the charge of arbitrary assessment of evidence in Bayesian inference. Finally, I explore the implications of this analysis for Bayesian reasoning with idealized models in science.
Stalnaker's Thesis about indicative conditionals is, roughly, that the probability one ought to assign to an indicative conditional equals the probability that one ought to assign to its consequent conditional on its antecedent. The thesis seems right. If you draw a card from a standard 52-card deck, how confident are you that the card is a diamond if it's a red card? To answer this, you calculate the proportion of red cards that are diamonds -- that is, you calculate the probability of drawing a diamond conditional on drawing a red card. Skyrms' Thesis about counterfactual conditionals is, roughly, that the probability that one ought to assign to a counterfactual equals one's rational expectation of the chance, at a relevant past time, of its consequent conditional on its antecedent. This thesis also seems right. If you decide not to enter a 100-ticket lottery, how confident are you that you would have won had you bought a ticket? To answer this, you calculate the prior chance -- that is, the chance just before your decision not to buy a ticket -- of winning conditional on entering the lottery. The central project of this article is to develop a new uniform theory of conditionals that allows us to derive a version of Skyrms' Thesis from a version of Stalnaker's Thesis, together with a chance-deference norm relating rational credence to beliefs about objective chance.
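The two calculations in the abstract's examples can be checked directly. A minimal sketch; the counts are just the standard ones for a 52-card deck and a fair 100-ticket lottery, nothing added beyond the abstract:

```python
from fractions import Fraction

# Stalnaker's Thesis, card example: Pr(diamond | red) in a standard 52-card deck.
red_cards = 26  # hearts + diamonds
diamonds = 13
pr_diamond_given_red = Fraction(diamonds, red_cards)
print(pr_diamond_given_red)  # 1/2

# Skyrms' Thesis, lottery example: prior chance of winning a fair 100-ticket
# lottery conditional on entering (holding exactly one ticket, as in the example).
pr_win_given_enter = Fraction(1, 100)
print(pr_win_given_enter)  # 1/100
```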
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks what probabilities are not determined in this way—these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
A study is reported testing two hypotheses about a close parallel relation between indicative conditionals, if A then B, and conditional bets, I bet you that if A then B. The first is that both the indicative conditional and the conditional bet are related to the conditional probability, P(B|A). The second is that de Finetti's three-valued truth table has psychological reality for both types of conditional: true, false, or void for indicative conditionals and win, lose, or void for conditional bets. The participants were presented with an array of chips in two different colours and two different shapes, and an indicative conditional or a conditional bet about a random chip. They had to make judgements in two conditions: either about the chances of making the indicative conditional true or false or about the chances of winning or losing the conditional bet. The observed distributions of responses in the two conditions were generally related to the conditional probability, supporting the first hypothesis. In addition, a majority of participants in further conditions chose the third option, “void”, when the antecedent of the conditional was false, supporting the second hypothesis.
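De Finetti's three-valued scheme described in the abstract can be written out as a small lookup: the conditional is true or false only when its antecedent holds, and void otherwise, with the bet reading obtained by relabelling the outcomes.

```python
# De Finetti's three-valued truth table for 'if A then B':
# the conditional is evaluated only when the antecedent A holds; otherwise it
# is void. For a conditional bet, read true -> win, false -> lose,
# void -> the bet is called off.
def de_finetti(a, b):
    if not a:
        return "void"
    return "true" if b else "false"

for a in (True, False):
    for b in (True, False):
        print(a, b, de_finetti(a, b))
```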
In my article, I present a new version of a probabilistic truth prescribing semantics for natural language indicative conditionals. The proposed truth conditions can be paraphrased as follows: an indicative conditional is true if the corresponding conditional probability is high and the antecedent is positively probabilistically relevant for the consequent or the probability of the antecedent of the conditional equals 0. In the paper, the truth conditions are defended and some of the logical properties of the proposed semantics are described.
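The paraphrased truth conditions can be sketched as a predicate over point probabilities. The 0.9 threshold for "high" is my illustrative assumption, not a value fixed by the article; positive probabilistic relevance is rendered in the standard way as p(C|A) > p(C).

```python
# Sketch of the paraphrased truth conditions for an indicative 'if A then C':
# true iff p(C|A) is high AND A is positively relevant to C (p(C|A) > p(C)),
# OR p(A) = 0. The 'high' threshold below is a hypothetical choice.
HIGH = 0.9

def true_indicative(p_a, p_c, p_c_given_a):
    if p_a == 0:
        return True  # zero-probability antecedent clause
    return p_c_given_a >= HIGH and p_c_given_a > p_c

print(true_indicative(0.5, 0.4, 0.95))   # True: high and positively relevant
print(true_indicative(0.5, 0.95, 0.95))  # False: no positive relevance
print(true_indicative(0.0, 0.4, 0.0))    # True: antecedent has probability 0
```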
This paper discusses and relates two puzzles for indicative conditionals: a puzzle about indeterminacy and a puzzle about triviality. Both puzzles arise because of Ramsey's Observation, which states that the probability of a conditional is equal to the conditional probability of its consequent given its antecedent. The puzzle of indeterminacy is the problem of reconciling this fact about conditionals with the fact that they seem to lack truth values at worlds where their antecedents are false. The puzzle of triviality is the problem of reconciling Ramsey's Observation with various triviality proofs which establish that Ramsey's Observation cannot hold in full generality. In the paper, I argue for a solution to the indeterminacy puzzle and then apply the resulting theory to the triviality puzzle. On the theory I defend, the truth conditions of indicative conditionals are highly context dependent and such that an indicative conditional may be indeterminate in truth value at each possible world throughout some region of logical space and yet still have a nonzero probability throughout that region.
Should we understand implicit attitudes on the model of belief? I argue that implicit attitudes are (probably) members of a different psychological kind altogether, because they seem to be insensitive to the logical form of an agent’s thoughts and perceptions. A state is sensitive to logical form only if it is sensitive to the logical constituents of the content of other states (e.g., operators like negation and conditional). I explain sensitivity to logical form and argue that it is a necessary condition for belief. I appeal to two areas of research that seem to show that implicit attitudes fail spectacularly to satisfy this condition—although persistent gaps in the empirical literature leave matters inconclusive. I sketch an alternative account, according to which implicit attitudes are sensitive merely to spatiotemporal relations in thought and perception, i.e., the spatial and temporal orders in which people think, see, or hear things.
Bayesianism is the position that scientific reasoning is probabilistic and that probabilities are adequately interpreted as an agent's actual subjective degrees of belief, measured by her betting behaviour. Confirmation is one important aspect of scientific reasoning. The thesis of this paper is the following: if scientific reasoning is at all probabilistic, the subjective interpretation has to be given up in order to get right confirmation—and thus scientific reasoning in general. Sections: The Bayesian approach to scientific reasoning; Bayesian confirmation theory; The example; The less reliable the source of information, the higher the degree of Bayesian confirmation; Measure sensitivity; A more general version of the problem of old evidence; Conditioning on the entailment relation; The counterfactual strategy; Generalizing the counterfactual strategy; The desired result, and a necessary and sufficient condition for it; Actual degrees of belief; The common knock-down feature, or ‘anything goes’; The problem of prior probabilities.
We present a puzzle about knowledge, probability and conditionals. We show that in certain cases some basic and plausible principles governing our reasoning come into conflict. In particular, we show that there is a simple argument that a person may be in a position to know a conditional the consequent of which has a low probability conditional on its antecedent, contra Adams’ Thesis. We suggest that the puzzle motivates a very strong restriction on the inference of a conditional from a disjunction.
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
There is a long tradition in formal epistemology and in the psychology of reasoning to investigate indicative conditionals. In psychology, the propositional calculus was taken for granted to be the normative standard of reference. Experimental tasks, evaluation of the participants’ responses and psychological model building, were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue why neither logic nor standard probability theory provide appropriate rationality norms for uncertain conditionals. I advocate coherence based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
Dutch Book arguments have been presented for static belief systems and for belief change by conditionalization. An argument is given here that a rule for belief change which under certain conditions violates probability kinematics will leave the agent open to a Dutch Book.
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values. The main technical problem studied in this paper is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria and various (counter) examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
In this study we investigate the influence of reason-relation readings of indicative conditionals and ‘and’/‘but’/‘therefore’ sentences on various cognitive assessments. According to the Frege-Grice tradition, a dissociation is expected. Specifically, differences in the reason-relation reading of these sentences should affect participants’ evaluations of their acceptability but not of their truth value. In two experiments we tested this assumption by introducing a relevance manipulation into the truth-table task as well as in other tasks assessing the participants’ acceptability and probability evaluations. Across the two experiments a strong dissociation was found. The reason-relation reading of all four sentences strongly affected their probability and acceptability evaluations, but hardly affected their respective truth evaluations. Implications of this result for recent work on indicative conditionals are discussed.
Rollback arguments focus on long sequences of actions with identical initial conditions in order to explicate the luck problem that indeterminism poses for libertarian free will theories (i.e. the problem that indeterministic actions appear arbitrary in a free-will undermining way). In this paper, I propose a rollback argument for probability incompatibilism, i.e. for the thesis that free will is incompatible with all world-states being governed by objective probabilities. Other than the most prominently discussed rollback arguments, this argument explicitly focusses on the ability to act otherwise. It argues that the negligible probability of the relative frequencies in overall rollback patterns being relevantly different indicates that even the ability to act otherwise with regard to individual actions is not free-will enabling. My proposed argument provides probability incompatibilists with a tool to argue against a classical event-causal response to the luck problem, while it can still motivate an agent-causal response to it.
“Negative probability” in practice. Quantum communication: very small phase space regions turn out to be thermodynamically analogous to those of superconductors. Macro-bodies or signals might exist in coherent or entangled states; such physical objects, having unusual properties, could be the basis of quantum communication channels or even normal physical ones. Questions and a few answers about negative probability: Why does it appear in quantum mechanics? It appears in phase-space formulated quantum mechanics, in quantum correlations, and for wave-particle dualism. Its meaning: mathematically, a ratio of two measures (of sets) which are not collinear; physically, the ratio of the measurements of two physical quantities which are not simultaneously measurable. The main innovation is in the mapping between phase and Hilbert space, since both are sums: phase space is a sum of cells, and Hilbert space is a sum of qubits. The mapping is reduced to the mapping of a cell into a qubit and vice versa. Negative probability helps quantum mechanics to be represented quasi-statistically by quasi-probabilistic distributions. Pure states of negative probability cannot exist, but, where the conditions for their expression exist, they decrease the sum probability of the integrally positive regions of the distributions. They reflect the immediate interaction (interference) of probabilities common in quantum mechanics.
Formalization of the semantics of generics has been considered extremely challenging because of their inherent vagueness and context-dependence, which hinder a single fixed truth condition. The present study suggests a way to formalize the semantics of generics by constructing flexible acceptance conditions with comparative probabilities. Findings from our in-depth psycholinguistic experiment show that two comparative probabilities—cue validity and prevalence—indeed construct the flexible acceptance conditions for generics in a systematic manner that can be applied to diverse types of generics: acceptability of IS_A relational generics is mostly determined by prevalence without interaction with cue validity; feature-describing generics are endorsed as acceptable with high cue validity, albeit mediated by prevalence; and acceptability of feature-describing generics with low cue validity is mostly determined by prevalence irrespective of cue validity. Such systematic patterns indicate a great potential for the formalization of the semantics of generics.
This paper calls for a re-appraisal of McGee's analysis of the semantics, logic and probabilities of indicative conditionals presented in his 1989 paper Conditional probabilities and compounds of conditionals. The probabilistic measures introduced by McGee are given a new axiomatisation built on the principle that the antecedent of a conditional is probabilistically independent of the conditional, and a more transparent method of constructing such measures is provided. McGee's Dutch book argument is restructured to more clearly reveal that it introduces a novel contribution to the epistemology of semantic indeterminacy, and shows that its more controversial implications are unavoidable if we want to maintain the Ramsey Test along with the standard laws of probability. Importantly, it is shown that the counterexamples that have been levelled at McGee's analysis (generating a rather wide consensus that it yields ‘unintuitive’ or ‘wrong’ probabilities for compounds) fail to strike at their intended target; for to honour the intuitions of the counterexamples one must either give up the Ramsey Test or the standard laws of probability. It will be argued that we need to give up neither if we take the counterexamples as further evidence that the indicative conditional sometimes allows for a non-epistemic ‘causal’ interpretation alongside its usual epistemic interpretation.
This paper explores the possibility that causal decision theory can be formulated in terms of probabilities of conditionals. It is argued that a generalized Stalnaker semantics in combination with an underlying branching time structure not only provides the basis for a plausible account of the semantics of indicative conditionals, but also that the resulting conditionals have properties that make them well-suited as a basis for formulating causal decision theory. Decision theory (at least if we omit the frills) is not an esoteric science, however unfamiliar it may seem to an outsider. Rather it is a systematic exposition of the consequences of certain well-chosen platitudes about belief, desire, preference and choice. It is the very core of our common-sense theory of persons, dissected out and elegantly systematized. (David Lewis, Synthese 23:331–344, 1974, p. 337). A small distortion in the analysis of the conditional may create spurious problems with the analysis of other concepts. So if the facts about usage favor one among a number of subtly different theories, it may be important to determine which one it is. (Robert Stalnaker, A Defense of Conditional Excluded Middle, pp. 87–104, 1980, p. 87).
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic/thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities from the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
The material interpretation of conditionals is commonly recognized as involving some paradoxical results. I here argue that the truth-functional approach to natural language is the reason for the inadequacy of this material interpretation, since the truth or falsity of some pair of statements 'p' and 'q' cannot per se be decisive for the truth or falsity of a conditional relation 'if p then q'. This inadequacy also affects the ability of the overall formal system to establish whether or not arguments involving conditionals are valid. I also demonstrate that the Paradox of Indicative Conditionals does not actually involve a paradox, but instead contains some paralogistic elements that make it appear to be a paradox. The discussion of the paradox in this paper further reveals that the material interpretation of conditionals adversely affects the treatment of disjunctions. Much has been said in the literature about these matters that points in the same direction. However, there seems to be some reluctance to comply fully with the arguments against the truth-functional account of conditionals, since many of the alternative accounts rely on the material conditional, or at least on an understanding of the conditional as a function of antecedent and consequent in a sense similar to the material conditional. My argument against truth functionality indicates that it may in general involve similar problems to treat conditionals as such functions, whether one deals with theories of truth, assertability or probability.
The logic of indicative conditionals remains the topic of deep and intractable philosophical disagreement. I show that two influential epistemic norms—the Lockean theory of belief and the Ramsey test for conditional belief—are jointly sufficient to ground a powerful new argument for a particular conception of the logic of indicative conditionals. Specifically, the argument demonstrates, contrary to the received historical narrative, that there is a real sense in which Stalnaker's semantics for the indicative did succeed in capturing the logic of the Ramseyan indicative conditional.
Wittgenstein did not write very much on the topic of probability. The little we have comes from a few short pages of the Tractatus, some 'remarks' from the 1930s, and the informal conversations which went on during that decade with the Vienna Circle. Nevertheless, Wittgenstein's views were highly influential in the later development of the logical theory of probability. This paper will attempt to clarify and defend Wittgenstein's conception of probability against some oft-cited criticisms that stem from a misunderstanding of his views. Max Black, for instance, criticises Wittgenstein for formulating a theory of probability that is capable of being used only against the backdrop of the ideal language of the Tractatus. I argue that on the contrary, by appealing to the 'hypothetical laws of nature', Wittgenstein is able to make sense of probability statements involving propositions that have not been completely analysed. G.H. von Wright criticises Wittgenstein's characterisation of these very hypothetical laws. He argues that by introducing them Wittgenstein makes what is distinctive about his theory superfluous, for the hypothetical laws are directly inspired by statistical observations and hence these observations indirectly determine the mechanism by which the logical theory of probability operates. I argue that this is not the case at all, and that while statistical observations play a part in the formation of the hypothetical laws, these observations are only necessary, not sufficient, conditions for the introduction of these hypotheses.
Over the past two decades, gamblers have begun taking mathematics into account more seriously than ever before. Probability theory is the only rigorous theory modeling uncertainty, even if under idealized conditions, and numerical probabilities are viewed not merely as mathematical information but also as a decision-making criterion, especially in gambling. This book presents the mathematics underlying the major games of chance and provides a precise account of the odds associated with all gaming events. It begins by explaining in simple terms the meaning of the concept of probability for the layman, and goes on to become an enlightening journey through the mathematics of chance, randomness and risk. It then continues with the basics of discrete probability, combinatorics and counting arguments for those interested in the supporting mathematics. These mathematical sections may be skipped by readers who do not have a minimal background in mathematics; such readers can skip directly to the Guide to Numerical Results to pick the odds and recommendations they need for the desired gaming situation. Doing so is possible due to the organization of that chapter, in which the results are listed at the end of each section, mostly in the form of tables. The chapter titled The Mathematics of Games of Chance presents these games not only as a good application field for probability theory, but also in terms of human actions where probability-based strategies can be tried to achieve favorable results. Through suggestive examples, the reader can see what the experiments, events and probability fields in games of chance are, and how probability calculus works there. The main portion of this work is a collection of probability results for each type of game. Each game's section is packed with formulas and tables, and also contains a description of the game, a classification of the gaming events, and the applicable probability calculations.
The primary goal of this work is to allow the reader to quickly find the odds for a specific gaming situation, in order to improve his or her betting or gaming decisions. Every type of gaming event is tabulated in a logical, consistent and comprehensive manner. The full methodology, with complete or partial calculations, is shown in order to teach players how to calculate the probability of any situation, at every stage of any game. Here, readers can find the real odds, returned by precise mathematical formulas and not by the partial simulations that most software uses. Collections of odds are presented, as well as strategic recommendations based on those odds, where necessary, for each type of gaming situation. The book contains much new and original material that has not been published previously, and provides great coverage of probabilities for the following games of chance: Dice, Slots, Roulette, Baccarat, Blackjack, Texas Hold'em Poker, Lottery and Sports Bets. Most games of chance are predisposed to probability-based decisions. This is why the approach is not exclusively statistical but analytical: every gaming event is taken as an individual applied probability problem to solve. A special chapter defines probability-based strategy and shows mathematically why such a strategy is theoretically optimal.
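As an illustration of the kind of exact calculation such a book favours over software simulation (the specific example below is ours, not drawn from the book), the classic de Méré dice comparison can be computed with exact rational arithmetic:

```python
from fractions import Fraction

def at_least_one_six(n_rolls: int) -> Fraction:
    """Exact probability of rolling at least one six in n independent
    throws of a fair die: one minus the chance of no six in any throw."""
    return 1 - Fraction(5, 6) ** n_rolls

# De Mere's comparison: at least one six in 4 throws of one die versus
# at least one double-six in 24 throws of two dice. The first bet is
# favourable, the second is not.
p4 = at_least_one_six(4)            # 671/1296, about 0.518
p24 = 1 - Fraction(35, 36) ** 24    # about 0.491
```

Working with `Fraction` rather than floats mirrors the book's emphasis on real odds from precise formulas rather than approximate simulation.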
Disjunctive antecedent conditionals (DACs), conditionals of the form 'if A or B, C', sometimes seem to entail both of their simplifications and sometimes seem not to. I argue that this behavior reveals a genuine ambiguity in DACs. Along the way, I discuss a new observation about the role of focal stress in distinguishing the two interpretations of DACs. I propose a new theory, according to which the surface form of a DAC underdetermines its logical form: on one possible logical form, 'if A or B, C' does entail both of its simplifications, while on the other, it does not.
Starting from a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then analyze it in the setting of coherence. We give a representation of the conjoined conditional and show that this new object is a conditional random quantity, whose set of possible values normally contains the probabilities assessed for the two conditional events. We examine some cases of logical dependencies, where the conjunction is a conditional event; moreover, we give the lower and upper bounds on the conjunction. We also examine an apparent paradox concerning stochastic independence which can actually be explained in terms of uncorrelation. We briefly introduce the notions of disjunction and iterated conditioning and show that the usual probabilistic properties still hold.
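As a sketch of what such bounds look like, suppose the two conditional events receive point assessments x = p(A|H) and y = p(B|K); a minimal illustration, assuming the bounds take the familiar Fréchet–Hoeffding form (the paper itself derives the bounds in the full coherence setting), is:

```python
def conjunction_bounds(x: float, y: float) -> tuple:
    """Lower and upper bounds for the prevision of the conjunction of two
    conditional events assessed at x = p(A|H) and y = p(B|K).
    Illustrative sketch only, assuming the Frechet-Hoeffding form."""
    lower = max(0.0, x + y - 1.0)
    upper = min(x, y)
    return lower, upper
```

For instance, assessments of 0.75 and 0.5 constrain the conjunction to the interval [0.25, 0.5], strictly narrower than [0, 1].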
In the following we investigate whether von Mises' frequency interpretation of probability can be modified to make it philosophically acceptable. We reject certain elements of von Mises' theory but retain others. In the interpretation we propose, we do not use von Mises' often-criticized 'infinite collectives', but we retain two essential claims of his interpretation: that probability can only be defined for events that can be repeated under similar conditions, and that exhibit frequency stabilization. The central idea of the present article is that the mentioned 'conditions' should be well-defined and 'partitioned'. More precisely, we divide probabilistic systems into object, initializing, and probing subsystems, and show that such partitioning allows one to solve problems. Moreover, we argue that a key idea of the Copenhagen interpretation of quantum mechanics (the determinant role of the observing system) can be seen as deriving from an analytic definition of probability as frequency. Thus a secondary aim of the article is to illustrate the virtues of the analytic definition of concepts, which consists in making explicit what is implicit.
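Frequency stabilization, the second of the two retained claims, can be illustrated with a small simulation (our sketch, not the article's formal apparatus): under fixed, well-defined conditions the relative frequency of an outcome settles down as trials accumulate.

```python
import random

def relative_frequency(trials: int, p: float = 0.5, seed: int = 0) -> float:
    """Repeat the 'same' experiment under fixed conditions and return the
    relative frequency of success; for large trial counts it stabilizes
    near the underlying probability p."""
    rng = random.Random(seed)  # a fixed seed stands in for fixed conditions
    hits = sum(rng.random() < p for _ in range(trials))
    return hits / trials
```

With many trials the returned frequency lies close to p, which is the stabilization the proposed definition treats as constitutive of probability.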
In Counterfactual Conditionals, Daniel Dohrn discusses the standard account of counterfactuals, conditionals of the form 'If A had been the case, then B would have been the case'. According to the standard account, a counterfactual is true if the then-sentence is true in all closest worlds in which the if-sentence is true, where closeness is spelled out in terms of an ordering of worlds by their similarity. Dohrn explores resources for defending the standard account against several challenges. In particular, he defends the standard logics for counterfactuals. He discusses exemplary doubts as to whether conditionals have truth conditions. He inquires into the interaction between the truth and the probability of counterfactuals. He tackles problems with the similarity ordering. He addresses the interaction between counterfactuals and normalcy conditions. He closes by elaborating on peculiarities of future-directed counterfactuals.
Evidentialists say that a necessary condition of sound epistemic reasoning is that our beliefs reflect only our evidence. This thesis arguably conflicts with standard Bayesianism, due to the importance of prior probabilities in the latter. Some evidentialists have responded by modelling belief-states using imprecise probabilities (Joyce 2005). However, Roger White (2010) and Aron Vallinder (2018) argue that this Imprecise Bayesianism is incompatible with evidentialism due to "inertia", where Imprecise Bayesian agents become stuck in a state of ambivalence towards hypotheses. Additionally, escapes from inertia apparently only create further conflicts with evidentialism. This dilemma gives a reason for evidentialist imprecise probabilists to look for alternatives without inertia. I shall argue that Henry E. Kyburg's approach offers an evidentialist-friendly imprecise probability theory without inertia, and that its relevant anti-inertia features are independently justified. I also connect the traditional epistemological debates concerning the "ethics of belief" more systematically with formal epistemology than has hitherto been done.
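The inertia problem can be made vivid with a toy pointwise-updating model (ours, not Kyburg's or Joyce's machinery): if each prior in the agent's set is updated by Bayes' rule, the vacuous interval [0, 1] is mapped to itself no matter how strong the evidence, so the agent never leaves ambivalence.

```python
def update_interval(lo: float, hi: float, lik_h: float, lik_not_h: float):
    """Update a set of priors, represented by its endpoints, pointwise by
    Bayes' rule (likelihoods assumed nonzero). The posterior is monotone
    in the prior, so endpoints map to endpoints. With the vacuous prior
    (0, 1) the posterior interval is again (0, 1): 'inertia'."""
    def posterior(p: float) -> float:
        return lik_h * p / (lik_h * p + lik_not_h * (1 - p))
    return posterior(lo), posterior(hi)
```

A non-vacuous interval does respond to evidence, which is why escapes from inertia require retreating from the vacuous prior in the first place.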
Bayesian confirmation theory is rife with confirmation measures. Zalabardo focuses on the probability difference measure, the probability ratio measure, the likelihood difference measure, and the likelihood ratio measure. He argues that the likelihood ratio measure is adequate, but each of the other three measures is not. He argues for this by setting out three adequacy conditions on confirmation measures and arguing in effect that all of them are met by the likelihood ratio measure but not by any of the other three measures. Glass and McCartney, hereafter "G&M," accept the conclusion of Zalabardo's argument along with each of the premises in it. They nonetheless try to improve on Zalabardo's argument by replacing his third adequacy condition with a weaker condition. They do this because of a worry to the effect that Zalabardo's third adequacy condition runs counter to the idea behind his first adequacy condition. G&M have in mind confirmation in the sense of increase in probability: the degree to which E confirms H is a matter of the degree to which E increases H's probability. I call this sense of confirmation "IP." I set out four ways of precisifying IP. I call them "IP1," "IP2," "IP3," and "IP4." Each of them is based on the assumption that the degree to which E increases H's probability is a matter of the distance between p and a certain other probability involving H. I then evaluate G&M's argument in light of them.
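For concreteness, the four measures can be computed from a toy probability assignment (the numbers below are ours; the measures are the standard ones the abstract names):

```python
def confirmation_measures(p_h: float, p_e_h: float, p_e_not_h: float) -> dict:
    """Compute the four candidate measures from p(H), p(E|H) and
    p(E|not-H), deriving p(E) by total probability and p(H|E) by
    Bayes' theorem."""
    p_e = p_e_h * p_h + p_e_not_h * (1 - p_h)
    p_h_e = p_e_h * p_h / p_e
    return {
        "probability difference": p_h_e - p_h,
        "probability ratio": p_h_e / p_h,
        "likelihood difference": p_e_h - p_e_not_h,
        "likelihood ratio": p_e_h / p_e_not_h,
    }
```

On the same assignment the four measures generally disagree in magnitude even when they agree in sign, which is what makes the choice among them substantive.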
IBE ('Inference to the best explanation' or abduction) is a popular and highly plausible theory of how we should judge the evidence for claims about past events on the basis of present evidence. It has been notably developed and supported recently by Meyer, following Lipton. I believe this theory is essentially correct. This paper supports IBE from a probability perspective, and argues that the retrodictive probabilities involved in such inferences should be analysed in terms of predictive probabilities and a priori probability ratios of initial events. The key point is to separate these two features. Disagreements over evidence can be traced to disagreements over either the a priori probability ratios or the predictive conditional ratios. In many cases, in real science, judgements of the former are necessarily subjective. The principles of iterated evidence are also discussed. The Sceptic's position is criticised as ignoring iteration of evidence, and as characteristically failing to adjust a priori probability ratios in response to empirical evidence.
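The separation of the two features can be expressed in the odds form of Bayes' theorem (a standard identity, offered here as our gloss on the abstract rather than the paper's own formalism): the retrodictive ratio p(H1|E)/p(H2|E) factors into the a priori ratio p(H1)/p(H2) times the predictive ratio p(E|H1)/p(E|H2).

```python
def posterior_ratio(prior_ratio: float, predictive_ratio: float) -> float:
    """Odds form of Bayes' theorem: the posterior ratio of two competing
    hypotheses equals the a priori probability ratio times the predictive
    (likelihood) ratio. Any disagreement over the evidence must therefore
    enter through one factor or the other."""
    return prior_ratio * predictive_ratio
```

For example, a hypothesis judged a priori half as likely as its rival but predicting the evidence eight times better ends up four times more probable on the evidence.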