After pinpointing a conceptual confusion (TCC), a Reality preconception (TRP1), and a fallacious dichotomy (TFD), the famous EPR/EPRB argument for correlated ‘particles’ is studied in the light of the Ontic Probability Interpretation (TOPI) of Quantum Theory (QT). Another Reality preconception (TRP2) is identified, showing that EPR used and ignored QT predictions in a single paralogism. Employing TFD and TRP2, EPR unveiled a contradiction veiled in its premises. By removing nonlocality from QT’s Ontology by fiat, EPR preordained its incompleteness. The Petitio Principii fallacy was at work from the outset. Einstein surmised the solution to his incompleteness/nonlocality dilemma in 1949, but never abandoned his philosophical stance. It is concluded that there are no definitions of Reality: we have to accept that Reality may not conform to our prejudices and, if an otherwise successful theory predicts what we do not believe in, no gedankenexperiment will help, because our biases may slither through. Only actual experiments could assist in solving Einstein’s dilemma, as proven in the last 50 years. Notwithstanding, EPR is one of the most influential papers in history and has sparked immense conceptual and technological progress. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical advance.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
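Popper's observation can be verified with elementary algebra: writing P(A) = a and P(A∧B) = c, the excess P(A→B) − P(B|A) equals (1−a)(a−c)/a, which is never negative. A minimal sketch with a hypothetical uniform joint distribution (the numbers and the rearranged identity in the final assertion are my own illustration, not notation from the paper):

```python
from fractions import Fraction

# Toy joint distribution over the four atoms A&B, A&~B, ~A&B, ~A&~B
# (hypothetical uniform numbers, chosen only for illustration).
atoms = {("A", "B"): Fraction(1, 4), ("A", "nB"): Fraction(1, 4),
         ("nA", "B"): Fraction(1, 4), ("nA", "nB"): Fraction(1, 4)}

p_A = atoms[("A", "B")] + atoms[("A", "nB")]
p_AB = atoms[("A", "B")]

# Material conditional 'If A, then B' read as 'Not (A and not B)'.
p_material = 1 - atoms[("A", "nB")]
# Conditional probability of B given A.
p_cond = p_AB / p_A

excess = p_material - p_cond
# Algebraic identity: excess = (1 - P(A)) * (P(A) - P(A&B)) / P(A) >= 0.
assert excess == (1 - p_A) * (p_A - p_AB) / p_A
print(p_material, p_cond, excess)  # 3/4 1/2 1/4
```

Since each factor of the identity is non-negative whenever P(A) > 0, the material conditional's probability can equal the conditional probability only in degenerate cases (P(A) = 1 or P(A∧¬B) = 0), which is the sense in which the former "normally exceeds" the latter.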
Evolutionary theory (ET) is teeming with probabilities. Probabilities exist at all levels: the level of mutation, the level of microevolution, and the level of macroevolution. This uncontroversial claim raises a number of contentious issues. For example, is the evolutionary process (as opposed to the theory) indeterministic, or is it deterministic? Philosophers of biology have taken different sides on this issue. Millstein (1997) has argued that we are not currently able to answer this question, and that even scientific realists ought to remain agnostic concerning the determinism or indeterminism of evolutionary processes. If this argument is correct, it suggests that, whatever we take probabilities in ET to be, they must be consistent with either determinism or indeterminism. This raises some interesting philosophical questions: How should we understand the probabilities used in ET? In other words, what is meant by saying that a certain evolutionary change is more or less probable? Which interpretation of probability is the most appropriate for ET? I argue that the probabilities used in ET are objective in a realist sense, if not in an indeterministic sense. Furthermore, there are a number of interpretations of probability that are objective and would be consistent with ET under determinism or indeterminism. However, I argue that evolutionary probabilities are best understood as propensities of population-level kinds.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as into epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
In finite probability theory, events are subsets S⊆U of the outcome set. Subsets can be represented by one-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or 'measurements' of all these events can be determined exactly as in quantum mechanics (QM), using density matrices. Moreover, the transformation of the density matrices induced by the experiments or 'measurements' is the Lüders mixture operation, as in QM. Finally, by moving the machinery into the n-dimensional vector space over ℤ₂, different basis sets become different outcome sets. This 'non-commutative' extension of finite probability theory yields the pedagogical model of quantum mechanics over ℤ₂ that can model many characteristic non-classical results of QM.
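The contrast between a classical mixture and a "superposition event" can be illustrated with the standard trace formula Pr(i) = Tr(P_i ρ). The following toy numbers are my own sketch of the general density-matrix machinery, not the paper's ℤ₂ construction:

```python
# Classical events as diagonal density matrices vs. a 'superposition event'
# with off-diagonal terms; measurement probabilities via Pr(i) = Tr(P_i rho).

def trace(m):
    return sum(m[i][i] for i in range(len(m)))

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Classical fifty-fifty mixture over outcomes {0, 1}: diagonal rho.
rho_classical = [[0.5, 0.0], [0.0, 0.5]]

# 'Superposition event' blending outcomes 0 and 1: off-diagonal 0.5 terms.
rho_super = [[0.5, 0.5], [0.5, 0.5]]

# Projector onto outcome 0.
P0 = [[1.0, 0.0], [0.0, 0.0]]

# Both assign the same measurement probability to outcome 0 ...
print(trace(matmul(P0, rho_classical)))  # 0.5
print(trace(matmul(P0, rho_super)))      # 0.5

# ... but only the superposition is a pure state: Tr(rho^2) = 1.
print(trace(matmul(rho_super, rho_super)))          # 1.0 (pure)
print(trace(matmul(rho_classical, rho_classical)))  # 0.5 (mixed)
```

The diagonal entries alone cannot distinguish the two density matrices; the off-diagonal terms are what carry the "superposition" information, which is the feature the extension from column vectors to matrices makes available.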
Epistemic closure under known implication is the principle that knowledge of "p" and knowledge of "p implies q", together, imply knowledge of "q". This principle is intuitive, yet several putative counterexamples have been formulated against it. This paper addresses the question, why is epistemic closure both intuitive and prone to counterexamples? In particular, the paper examines whether probability theory can offer an answer to this question based on four strategies. The first probability-based strategy rests on the accumulation of risks. The problem with this strategy is that risk accumulation cannot accommodate certain counterexamples to epistemic closure. The second strategy is based on the idea of evidential support, that is, a piece of evidence supports a proposition whenever it increases the probability of the proposition. This strategy makes progress and can accommodate certain putative counterexamples to closure. However, this strategy also gives rise to a number of counterintuitive results. Finally, there are two broadly probabilistic strategies, one based on the idea of resilient probability and the other on the idea of assumptions that are taken for granted. These strategies are promising but are prone to some of the shortcomings of the second strategy. All in all, I conclude that each strategy fails. Probability theory, then, is unlikely to offer the account we need.
Decision theory has at its core a set of mathematical theorems that connect rational preferences to functions with certain structural properties. The components of these theorems, as well as their bearing on questions surrounding rationality, can be interpreted in a variety of ways. Philosophy’s current interest in decision theory represents a convergence of two very different lines of thought, one concerned with the question of how one ought to act, and the other concerned with the question of what action consists in and what it reveals about the actor’s mental states. As a result, the theory has come to have two different uses in philosophy, which we might call the normative use and the interpretive use. It also has a related use that is largely within the domain of psychology, the descriptive use. This essay examines the historical development of decision theory and its uses; the relationship between the norm of decision theory and the notion of rationality; and the interdependence of the uses of decision theory.
We call something a paradox if it strikes us as peculiar in a certain way, if it strikes us as something that is not simply nonsense, and yet it poses some difficulty in seeing how it could make sense. When we examine paradoxes more closely, we find that for some the peculiarity is relieved and for others it intensifies. Some are peculiar because they jar with how we expect things to go, but the jarring is to do with imprecision and misunderstandings in our thought, failures to appreciate the breadth of possibility consistent with our beliefs. Other paradoxes, however, pose deep problems. Closer examination does not explain them away. Instead, they challenge the coherence of certain conceptual resources and hence challenge the significance of beliefs which deploy those resources. I shall call the former kind weak paradoxes and the latter, strong paradoxes. Whether a particular paradox is weak or strong is sometimes a matter of controversy—sometimes it has been realised that what was thought strong is in fact weak, and vice versa—but the distinction between the two kinds is generally thought to be worth drawing. In this chapter, I shall cover both weak and strong probabilistic paradoxes.
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or "toy" model of quantum mechanics over sets (QM/sets). There are two parts. The notion of an "event" is reinterpreted from being an epistemological state of indefiniteness to being an objective state of indefiniteness. And the mathematical framework of finite probability theory is recast as the quantum probability calculus for QM/sets. The point is not to clarify finite probability theory but to elucidate quantum mechanics itself by seeing some of its quantum features in a classical setting.
In this text the ancient philosophical question of determinism (“Does every event have a cause?”) will be re-examined. In the philosophy of science and physics communities the orthodox position states that the physical world is indeterministic: quantum events would have no causes but happen by irreducible chance. Arguably the clearest theorem that leads to this conclusion is Bell’s theorem. The commonly accepted ‘solution’ to the theorem is ‘indeterminism’, in agreement with the Copenhagen interpretation. Here it is recalled that indeterminism is not really a physical but rather a philosophical hypothesis, and that it has counterintuitive and far-reaching implications. At the same time another solution to Bell’s theorem exists, often termed ‘superdeterminism’ or ‘total determinism’. Superdeterminism appears to be a philosophical position that is centuries and probably millennia old: it is for instance Spinoza’s determinism. If Bell’s theorem has both indeterministic and deterministic solutions, choosing between determinism and indeterminism is a philosophical question, not a matter of physical experimentation, as is widely believed. If it is impossible to use physics to decide between the two positions, it is legitimate to ask which philosophical theories are of help. Here it is argued that probability theory – more precisely the interpretation of probability – is instrumental for advancing the debate. It appears that the hypothesis of determinism allows us to answer a series of precise questions from probability theory, while indeterminism remains silent on these questions. From this point of view determinism appears to be the more reasonable assumption, after all.
That uses of language not only can, but even normally do, have the character of actions was a fact largely unrealised by those engaged in the study of language before the present century, at least in the sense that there was lacking any attempt to come to terms systematically with the action-theoretic peculiarities of language use. Where the action-character of linguistic phenomena was acknowledged, it was normally regarded as a peripheral matter, relating to derivative or nonstandard aspects of language which could afford to be ignored.
Preliminary version of “Towards a History of Speech Act Theory”, in A. Burkhardt (ed.), Speech Acts, Meanings and Intentions. Critical Approaches to the Philosophy of John R. Searle, Berlin/New York: de Gruyter, 1990, 29–61.
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
In the following we will investigate whether von Mises’ frequency interpretation of probability can be modified to make it philosophically acceptable. We will reject certain elements of von Mises’ theory, but retain others. In the interpretation we propose we do not use von Mises’ often criticized ‘infinite collectives’, but we retain two essential claims of his interpretation, stating that probability can only be defined for events that can be repeated in similar conditions, and that exhibit frequency stabilization. The central idea of the present article is that the mentioned ‘conditions’ should be well defined and ‘partitioned’. More precisely, we will divide probabilistic systems into object, initializing, and probing subsystems, and show that such partitioning allows us to solve problems. Moreover, we will argue that a key idea of the Copenhagen interpretation of quantum mechanics (the determining role of the observing system) can be seen as deriving from an analytic definition of probability as frequency. Thus a secondary aim of the article is to illustrate the virtues of analytic definition of concepts, consisting of making explicit what is implicit.
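Frequency stabilization, the empirical property the frequency interpretation rests on, is easy to exhibit numerically: under repeated trials in similar conditions the relative frequency of an event settles toward a fixed value. A minimal simulation (the event and its underlying probability of 0.25 are hypothetical choices of mine):

```python
import random

# Simulate 100,000 repetitions of an event with underlying probability 0.25
# and observe that the relative frequency stabilizes near that value.
random.seed(1)
n = 100_000
hits = sum(random.random() < 0.25 for _ in range(n))
print(hits / n)  # settles near 0.25
```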
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
Ignited by Einstein and Bohr a century ago, the philosophical struggle about Reality is yet unfinished, with no signs of a swift resolution. Despite vast technological progress fueled by the iconic EPR paper (EPR), the intricate link between ontic and epistemic aspects of Quantum Theory (QT) has greatly hindered our grip on Reality and further progress in physical theory. Fallacies concealed by tortuous logical negations made EPR comprehension much harder than it could have been had Einstein written it himself in German. It is plagued with preconceptions about what a physical property is, the 'Uncertainty Principle', and the Principle of Locality. Numerous interpretations of QT vis-à-vis Reality exist and are keenly disputed. This is the first of a series of articles arguing for a physical interpretation called ‘The Ontic Probability Interpretation’ (TOPI). A gradual explanation of TOPI is given intertwined with a meticulous logico-philosophical scrutiny of EPR. Part I focuses on the meaning of Einstein’s ‘Incompleteness’ claim. A conceptual confusion, a preconception about Reality, and a flawed dichotomy are shown to be severe obstacles for the EPR argument to succeed. Part II analyzes Einstein’s ‘Incompleteness/Nonlocality Dilemma’. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical progress.
The value of knowledge can vary in that knowledge of important facts is more valuable than knowledge of trivialities. This variation in the value of knowledge is mirrored by a variation in evidential standards. Matters of greater importance require greater evidential support. But all knowledge, however trivial, needs to be evidentially certain. So on the one hand we have a variable evidential standard that depends on the value of the knowledge, and on the other, we have the invariant standard of evidential certainty. This paradox in the concept of knowledge runs deep in the history of philosophy. We approach this paradox by proposing a bet settlement theory of knowledge. Degrees of belief can be measured by the expected value of a bet divided by stake size, with the highest degree of belief being probability 1, or certainty. Evidence sufficient to settle the bet makes the expectation equal to the stake size and therefore has evidential probability 1. This gives us the invariant evidential certainty standard for knowledge. The value of knowledge relative to a bet is given by the stake size. We propose that evidential probability can vary with stake size, so that evidential certainty at low stakes does not entail evidential certainty at high stakes. This solves the paradox by allowing that certainty is necessary for knowledge at any stakes, but that the evidential standards for knowledge vary according to what is at stake. We give a Stake Size Variation Principle that calculates evidential probability from the value of evidence and the stakes. Stake size variant degrees of belief are probabilistically coherent and explain a greater range of preferences than orthodox expected utility theory, namely the Ellsberg and Allais preferences. The resulting theory of knowledge gives an empirically adequate, rationally grounded, unified account of evidence, value and probability.
Karl Popper (1902–1994) was one of the most influential philosophers of science of the 20th century. He made significant contributions to debates concerning general scientific methodology and theory choice, the demarcation of science from non-science, the nature of probability and quantum mechanics, and the methodology of the social sciences. His work is notable for its wide influence both within the philosophy of science, within science itself, and within a broader social context. Popper’s early work attempts to solve the problem of demarcation and offer a clear criterion that distinguishes scientific theories from metaphysical or mythological claims. Popper’s falsificationist methodology holds that scientific theories are characterized by entailing predictions that future observations might reveal to be false. When theories are falsified by such observations, scientists can respond by revising the theory, by rejecting the theory in favor of a rival, or by maintaining the theory as is and changing an auxiliary hypothesis. In any case, however, this process must aim at the production of new, falsifiable predictions. While Popper recognizes that scientists can and do hold onto theories in the face of failed predictions when there are no predictively superior rivals to turn to, he holds that scientific practice is characterized by its continual effort to test theories against experience and make revisions based on the outcomes of these tests. By contrast, theories that are permanently immunized from falsification by the introduction of untestable ad hoc hypotheses can no longer be classified as scientific. Among other things, Popper argues that his falsificationist proposal allows for a solution of the problem of induction, since inductive reasoning plays no role in his account of theory choice.
Along with his general proposals regarding falsification and scientific methodology, Popper is notable for his work on probability and quantum mechanics and on the methodology of the social sciences. Popper defends a propensity theory of probability, according to which probabilities are interpreted as objective, mind-independent properties of experimental setups. Popper then uses this theory to provide a realist interpretation of quantum mechanics, though its applicability goes beyond this specific case. With respect to the social sciences, Popper argued against the historicist attempt to formulate universal laws covering the whole of human history and instead argued in favor of methodological individualism and situational logic.
This proposal serves to enhance scientific and technological literacy, by promoting STEM (Science, Technology, Engineering, and Mathematics) education with particular reference to contemporary physics. The study is presented in the form of a repertoire, and it gives the reader a glimpse of the conceptual structure and development of quantum theory along a rational line of thought, whose understanding might be the key to introducing young generations of students to physics.
An essay review of Richard Johns "A Theory of Physical Probability" (University of Toronto Press, 2002). Forthcoming in Studies in History and Philosophy of Science.
Using “brute reason” I will show why there can be only one valid interpretation of probability. The valid interpretation turns out to be a further refinement of Popper’s Propensity interpretation of probability. Via some famous probability puzzles and new thought experiments I will show how all other interpretations of probability fail, in particular the Bayesian interpretations, while these puzzles do not present any difficulties for the interpretation proposed here. In addition, the new interpretation casts doubt on some concepts often taken as basic and unproblematic, like rationality, utility and expectation. This in turn has implications for decision theory, economic theory and the philosophy of physics.
The major competing statistical paradigms share a common remarkable but unremarked thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.
Early work on the frequency theory of probability made extensive use of the notion of randomness, conceived of as a property possessed by disorderly collections of outcomes. Growing out of this work, a rich mathematical literature on algorithmic randomness and Kolmogorov complexity developed through the twentieth century, but largely lost contact with the philosophical literature on physical probability. The present chapter begins with a clarification of the notions of randomness and probability, conceiving of the former as a property of a sequence of outcomes, and the latter as a property of the process generating those outcomes. A discussion follows of the nature and limits of the relationship between the two notions, with largely negative verdicts on the prospects for any reduction of one to the other, although the existence of an apparently random sequence of outcomes is good evidence for the involvement of a genuinely chancy process.
The article is a plea for ethicists to regard probability as one of their most important concerns. It outlines a series of topics of central importance in ethical theory in which probability is implicated, often in a surprisingly deep way, and lists a number of open problems. Topics covered include: interpretations of probability in ethical contexts; the evaluative and normative significance of risk or uncertainty; uses and abuses of expected utility theory; veils of ignorance; Harsanyi’s aggregation theorem; population size problems; equality; fairness; giving priority to the worse off; continuity; incommensurability; nonexpected utility theory; evaluative measurement; aggregation; causal and evidential decision theory; act consequentialism; rule consequentialism; and deontology.
Bolzano set out his theory of probability in 15 points in § 14 of the second part of his Religionswissenschaft, and in 20 points in § 161 of the second volume of his Wissenschaftslehre. (I refer to the Religionswissenschaft as 'RW II' and to the Wissenschaftslehre as 'WL II'.) In RW II (cf. p. 37) his theory of probability is embedded in his discussion "On the nature of historical knowledge, especially with regard to miracles", and the theorems he assembles there serve the explicit purpose of countering, with mathematical tools, doctrines according to which reports of miracles can have no credibility. In WL II (cf. p. 171) Bolzano presents by and large the same theorems as in RW II, but now develops the theory of probability within his theory of propositions in themselves. In doing so he follows the theorems of the contemporary "writings on the calculus of probability" (cf. WL II, p. 190), but corrects them where it seems necessary to him (cf. WL II, pp. 187–191), and thus in effect reformulates the elementary part of the probability theory of his time within his logical theory of propositions in themselves. — My aim here is not a historical study of Bolzano's theory of probability, although it may be of interest to work out where Bolzano agrees with which probability theorists of his time and where he does not, and in particular which weaknesses of Bolzano's theory of probability were weaknesses of all probability theories of his day. An important systematic study of Bolzano's theory of probability would consist — as begun in outline by Berg (1962, pp. 148–149) — in an exact reconstruction of his theory of probability within a consistent logical system of propositions in themselves.
In what follows I shall attempt something far more modest, yet possibly quite fruitful: to translate the theorems of Bolzano's theory of probability into the language of a present-day probability theory and to derive the translated theorems there, so far as this is possible. In a second step, not undertaken here, one could then examine to what extent those theses that survive this derivation test fulfil the purpose Bolzano originally intended for them: to serve as mathematical tools for his arguments against the view that reports of miracles cannot be credible.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) to be more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
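The "uncontroversial principle" at issue is the conjunction rule, P(B & F) ≤ P(B), which follows from P(B & F) = P(B)·P(F | B). A brute-force grid check of that inequality (my own illustration; the grid granularity is arbitrary):

```python
from itertools import product

# Check the conjunction rule over a grid of values for P(B) and P(F | B):
# P(B & F) = P(B) * P(F | B) can never exceed P(B).
step = 0.1
violations = 0
for p_B, p_F_given_B in product([i * step for i in range(11)], repeat=2):
    p_BF = p_B * p_F_given_B      # P(B & F) = P(B) * P(F | B)
    if p_BF > p_B + 1e-12:
        violations += 1
print(violations)  # 0
```

Since P(F | B) ≤ 1, no assignment of probabilities can rank the conjunction above its conjunct, which is why the majority response in the Linda experiment counts as a fallacy about probability even if, as the analysis above suggests, it may track verisimilitude.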
Many epistemologists hold that an agent can come to justifiably believe that p is true by seeing that it appears that p is true, without having any antecedent reason to believe that visual impressions are generally reliable. Certain reliabilists think this, at least if the agent’s vision is generally reliable. And it is a central tenet of dogmatism (as described by Pryor (2000) and Pryor (2004)) that this is possible. Against these positions it has been argued (e.g. by Cohen (2005) and White (2006)) that this violates some principles from probabilistic learning theory. To see the problem, let’s note what the dogmatist thinks we can learn by paying attention to how things appear. (The reliabilist says the same things, but we’ll focus on the dogmatist.) Suppose an agent receives an appearance that p, and comes to believe that p. Letting Ap be the proposition that it appears to the agent that p, and → be the material implication, we can say that the agent learns that p, and hence is in a position to infer Ap → p, once they receive the evidence Ap. This is surprising, because we can prove the following.
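The result alluded to is that, for any probability function with P(Ap) > 0, conditionalizing on Ap can never raise the probability of the material conditional Ap → p: P(Ap → p | Ap) = P(p | Ap) ≤ 1 − P(Ap ∧ ¬p) = P(Ap → p). A numerical search over random joint distributions (my own check of the inequality, not the paper's proof) finds no counterexample:

```python
import random

# Random search over joint distributions on the four cells
# (Ap&p, Ap&~p, ~Ap&p, ~Ap&~p): the material conditional Ap -> p
# never becomes more probable after conditionalizing on Ap.
random.seed(0)
for _ in range(10_000):
    cells = [random.random() for _ in range(4)]
    total = sum(cells)
    ap_p, ap_notp, _, _ = (c / total for c in cells)

    p_Ap = ap_p + ap_notp
    prior = 1 - ap_notp              # P(Ap -> p), material conditional
    posterior = ap_p / p_Ap          # P(Ap -> p | Ap) = P(p | Ap)
    assert posterior <= prior + 1e-12
print("no counterexample in 10,000 random distributions")
```

So an agent who starts out less than certain of Ap → p and updates by conditionalization cannot come to believe it more strongly by receiving the evidence Ap, which is what makes the dogmatist's claimed learning pattern probabilistically surprising.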
This Open Access book addresses the age-old problem of infinite regresses in epistemology. How can we ever come to know something if knowing requires having good reasons, and reasons can only be good if they are backed by good reasons in turn? The problem has puzzled philosophers ever since antiquity, giving rise to what is often called Agrippa's Trilemma. The current volume approaches the old problem in a provocative and thoroughly contemporary way. Taking seriously the idea that good reasons are typically probabilistic in character, it develops and defends a new solution that challenges venerable philosophical intuitions and explains why they were mistakenly held. Key to the new solution is the phenomenon of fading foundations, according to which distant reasons are less important than those that are nearby. The phenomenon takes the sting out of Agrippa's Trilemma; moreover, since the theory that describes it is general and abstract, it is readily applicable outside epistemology, notably to debates on infinite regresses in metaphysics.
According to the antirealist argument known as the pessimistic induction, the history of science is a graveyard of dead scientific theories and abandoned theoretical posits. Support for this pessimistic picture of the history of science usually comes from a few case histories, such as the demise of the phlogiston theory and the abandonment of caloric as the substance of heat. In this article, I wish to take a new approach to examining the ‘history of science as a graveyard of theories’ picture. Using JSTOR Data for Research and Springer Exemplar, I present new lines of evidence that are at odds with this pessimistic picture of the history of science. When rigorously tested against the historical record of science, I submit, the pessimistic picture of the history of science as a graveyard of dead theories and abandoned posits may turn out to be no more than a philosophers’ myth.
The history of science is often conceptualized through 'paradigm shifts,' where the accumulation of evidence leads to abrupt changes in scientific theories. Experimental evidence suggests that this kind of hypothesis revision occurs in more mundane circumstances, such as when children learn concepts and when adults engage in strategic behavior. In this paper, I argue that the model of hypothesis testing can explain how people learn certain complex, theory-laden propositions such as conditional sentences ('If A, then B') and probabilistic constraints ('The probability that A is p'). Theories are formalized as probability distributions over a set of possible outcomes and theory change is triggered by a constraint which is incompatible with the initial theory. This leads agents to consult a higher order probability function, or a 'prior over priors,' to choose the most likely alternative theory which satisfies the constraint. The hypothesis testing model is applied to three examples: a simple probabilistic constraint involving coin bias, the sundowners problem for conditional learning, and the Judy Benjamin problem for learning conditional probability constraints. The model of hypothesis testing is contrasted with the more conservative learning theory of relative information minimization, which dominates current approaches to learning conditional and probabilistic information.
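The revision mechanism described here can be sketched for the coin-bias example. Everything in the snippet below (the theory names, the numbers, and the `revise` function) is an illustrative assumption of mine, not the paper's own formalism:

```python
# Highly simplified sketch of the hypothesis-testing model described
# above: theories are probability distributions (here, coin biases),
# and a constraint incompatible with the current theory triggers a
# switch to the most plausible alternative under a "prior over priors".
# All names and numbers are illustrative assumptions.

# Candidate theories: P(heads) under each theory.
theories = {"fair": 0.5, "mild_bias": 0.6, "strong_bias": 0.8}

# Second-order "prior over priors" across the candidate theories.
prior_over_priors = {"fair": 0.7, "mild_bias": 0.2, "strong_bias": 0.1}

def revise(current, constraint):
    """Return the most plausible theory satisfying the constraint.

    `constraint` is a predicate on P(heads); if the current theory
    already satisfies it, no revision is triggered.
    """
    if constraint(theories[current]):
        return current
    candidates = [t for t, p in theories.items() if constraint(p)]
    return max(candidates, key=lambda t: prior_over_priors[t])

# Learning the constraint "P(heads) >= 0.55" rules out 'fair'; among
# the surviving theories, the prior over priors favours 'mild_bias'.
print(revise("fair", lambda p: p >= 0.55))
```

The contrast with relative information minimization, on this sketch, is that the agent jumps to a whole new distribution chosen by the second-order prior rather than minimally adjusting the old one.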
A polemical account of Australian philosophy up to 2003, emphasising its unique aspects (such as commitment to realism) and the connections between philosophers' views and their lives. Topics include early idealism, the dominance of John Anderson in Sydney, the Orr case, Catholic scholasticism, Melbourne Wittgensteinianism, philosophy of science, the Sydney disturbances of the 1970s, Francofeminism, environmental philosophy, the philosophy of law and Mabo, ethics and Peter Singer. Realist theories especially praised are David Armstrong's on universals, David Stove's on logical probability and the ethical realism of Rai Gaita and Catholic philosophers. In addition to strict philosophy, the book treats non-religious moral traditions to train virtue, such as Freemasonry, civics education and the Greek and Roman classics.
Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use the sampling distribution (conditional SP function) to test or confirm the fuzzy truth function (conditional LP function). The Semantic Information Measure (SIM) proposed is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the less the LP of a predicate is and the larger the true value of the proposition is, the more information there is. So the SIM can be used as Popper's information criterion for falsification or test. The SIM also allows us to optimize the true value of counterexamples or degrees of disbelief in a hypothesis to get the optimized degree of belief, i.e. Degree of Confirmation (DOC). To explain confirmation, this paper 1) provides the calculation method of the DOC of universal hypotheses; 2) discusses how to resolve the Raven Paradox with the new DOC and its increment; 3) derives the DOC of rapid HIV tests: DOC of “+” = 1 - (1 - specificity)/sensitivity, which is similar to the Likelihood Ratio (= sensitivity/(1 - specificity)) but has the upper limit 1; 4) discusses negative DOC for excessive affirmations, wrong hypotheses, or lies; and 5) discusses the DOC of general hypotheses with GPS as an example.
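The abstract's formula for a positive test result can be checked numerically against the likelihood ratio it is compared with. The sensitivity and specificity values below are illustrative only:

```python
# Numerical check of the abstract's degree-of-confirmation formula for
# a positive result, DOC(+) = 1 - (1 - specificity)/sensitivity, against
# the likelihood ratio LR(+) = sensitivity/(1 - specificity).
# The sensitivity/specificity values are illustrative, not from the paper.

def doc_positive(sensitivity, specificity):
    return 1 - (1 - specificity) / sensitivity

def likelihood_ratio_positive(sensitivity, specificity):
    return sensitivity / (1 - specificity)

sens, spec = 0.99, 0.98
print(doc_positive(sens, spec))               # just below 1
print(likelihood_ratio_positive(sens, spec))  # roughly 49.5, unbounded
```

As specificity approaches 1 the DOC approaches its upper limit of 1, while the likelihood ratio grows without bound, which is the contrast the abstract draws.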
This paper is concerned with representations of belief by means of nonadditive probabilities of the Dempster-Shafer (DS) type. After surveying some foundational issues and results in the DS theory, including Suppes's related contributions, the paper proceeds to analyze the connection of the DS theory with some of the work currently pursued in epistemic logic. A preliminary investigation of the modal logic of belief functions à la Shafer is made. There it is shown that the Alchourrón-Gärdenfors-Makinson (AGM) logic of belief change is closely related to the DS theory. The final section compares the critique of Bayesianism which underlies the present paper with some important objections raised by Suppes against this doctrine.
As stochastic independence is essential to the mathematical development of probability theory, it seems that any foundational work on probability should be able to account for this property. Bayesian decision theory appears to be wanting in this respect. Savage’s postulates on preferences under uncertainty entail a subjective expected utility representation, and this asserts only the existence and uniqueness of a subjective probability measure, regardless of its properties. What is missing is a preference condition corresponding to stochastic independence. To fill this significant gap, the article axiomatizes Bayesian decision theory afresh and proves several representation theorems in this novel framework.
PHILOSOPHY OF SCIENCE, vol. 52, number 1, pp. 44-63. R.M. Nugayev, Kazan State University, USSR. THE HISTORY OF QUANTUM THEORY AS A DECISIVE ARGUMENT FAVORING EINSTEIN OVER LORENTZ. Abstract. Einstein’s papers on relativity, quantum theory and statistical mechanics were all part of a single research programme; the aim was to unify mechanics and electrodynamics. It was this broader programme – which eventually split into relativistic physics and quantum mechanics – that superseded Lorentz’s theory. The argument of this paper is partly historical and partly methodological. A notion of “crossbred objects” – theoretical objects with contradictory properties which are part of the domain of application of two different research programmes – is developed that explains the dynamics of revolutionary theory change.
Intellectual history still quite commonly distinguishes between the episode we know as the Scientific Revolution, and its successor era, the Enlightenment, in terms of the calculatory and quantifying zeal of the former—the age of mechanics—and the rather scientifically lackadaisical mood of the latter, more concerned with freedom, public space and aesthetics. It is possible to challenge this distinction in a variety of ways, but the approach I examine here, which focuses on an emerging scientific field or cluster of disciplines—the ‘life sciences’, particularly natural history, medicine, and physiology—is not Romantically anti-scientific, but resolutely anti-mathematical. Diderot bluntly states, in his Thoughts on the interpretation of nature, that “We are on the verge of a great revolution in the sciences. Given the taste people seem to have for morals, belles-lettres, the history of nature and experimental physics, I dare say that before a hundred years, there will not be more than three great geometricians remaining in Europe. The science will stop short where the Bernoullis, the Eulers, the Maupertuis, the Clairauts, the Fontaines and the D’Alemberts will have left it....
We will not go beyond.” Similarly, Buffon in the first discourse of his Histoire naturelle speaks of the “over-reliance on mathematical sciences,” given that mathematical truths are merely “definitional” and “demonstrative,” and thereby “abstract, intellectual and arbitrary.” Earlier in the Thoughts, Diderot judges “the thing of the mathematician” to have “as little existence in nature as that of the gambler.” Significantly, this attitude—taken by great scientists who also translated Newton or wrote careful papers on probability theory, as well as by others such as Mandeville—participates in the effort to conceptualize what we might call a new ontology for the emerging life sciences, very different from both the ‘iatromechanism’ and the ‘animism’ of earlier generations, which either failed to account for specifically living, goal-directed features of organisms, or accounted for them in supernaturalistic terms by appealing to an ‘anima’ as explanatory principle. Anti-mathematicism here is then a key component of a naturalistic, open-ended project to give a successful reductionist model of explanation in ‘natural history’, a model which is no more vitalist than it is materialist—but which is fairly far removed from early modern mechanism.
My aims in this essay are two. First (§§1-4), I want to get clear on the very idea of a theory of the history of philosophy, the idea of an overarching account of the evolution of philosophical reflection since the inception of written philosophy. And secondly (§§5-8), I want to actually sketch such a global theory of the history of philosophy, which I call the two-streams theory.
Given a few assumptions, the probability of a conjunction is raised, and the probability of its negation is lowered, by conditionalising upon one of the conjuncts. This simple result appears to bring Bayesian confirmation theory into tension with the prominent dogmatist view of perceptual justification – a tension often portrayed as a kind of ‘Bayesian objection’ to dogmatism. In a recent paper, David Jehle and Brian Weatherson observe that, while this crucial result holds within classical probability theory, it fails within intuitionistic probability theory. They conclude that the dogmatist who is willing to take intuitionistic logic seriously can make a convincing reply to the Bayesian objection. In this paper, I argue that this conclusion is premature – the Bayesian objection can survive the transition from classical to intuitionistic probability, albeit in a slightly altered form. I shall conclude with some general thoughts about what the Bayesian objection to dogmatism does and doesn’t show.
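The classical result this abstract opens with can be verified on a toy example. The four-world distribution below is invented for illustration; the required assumptions are that P(A) lies strictly between 0 and 1 and that P(A & B) is positive:

```python
# Minimal numerical illustration (invented numbers) of the result
# stated above: conditionalising on a conjunct A raises the probability
# of the conjunction A & B and lowers the probability of its negation,
# given 0 < P(A) < 1 and P(A & B) > 0.

worlds = {("A", "B"): 0.3, ("A", "not-B"): 0.2,
          ("not-A", "B"): 0.25, ("not-A", "not-B"): 0.25}

p_A = sum(p for (a, _), p in worlds.items() if a == "A")  # P(A) = 0.5
p_AB = worlds[("A", "B")]                                 # P(A & B) = 0.3
p_AB_given_A = p_AB / p_A                                 # = 0.6

print(p_AB, p_AB_given_A)          # 0.3 -> 0.6: conjunction raised
print(1 - p_AB, 1 - p_AB_given_A)  # 0.7 -> 0.4: negation lowered
```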
Abstract In this paper I consider an easier-to-read and somewhat improved version of the causal chance-based analysis of counterfactuals that I proposed and argued for in my A Theory of Counterfactuals. Sections 2, 3 and 4 form Part I: in it, I survey the analysis of the core counterfactuals (in which, very roughly, the antecedent is compatible with history prior to it). In section 2 I go through the three main aspects of this analysis, which are the following. First, it is a causal analysis, in that it requires that intermediate events to which the antecedent event is not a cause be preserved in the main truth-condition schema. Second, it highlights the notion central to the semantics of counterfactuals on the account presented here – the notion of the counterfactual probability of a given counterfactual, which is the probability of the consequent given the following: the antecedent, the prior history, and the preserved intermediate events. Third, it considers the truth conditions for counterfactuals of this sort as consisting in this counterfactual probability being higher than a threshold. In section 3, I re-formulate the analysis of preservational counterfactuals in terms of the notion of being a cause, which ends up being quite compact. In section 4 I illustrate this analysis by showing how it handles two examples that have been considered puzzling – Morgenbesser's counterfactual and Edgington's counterfactual. Sections 5 and on constitute Part II: its main initial thrust is provided in section 5, where I present the main lines of the extension of the theory from the core counterfactuals (analyzed in Part I) to counterfactuals whose antecedents (roughly) are not compatible with their prior history. In this Part II, I elaborate on counterfactuals that don't belong to the core, and more specifically on so-called reconstructional counterfactuals (as opposed to the preservational counterfactuals, which constitute the core counterfactual-type).
The heart of the analysis is formulated in terms of processes leading to the antecedent (event/state), and more specifically in terms of processes likely to have led to the antecedent, a notion which is analyzed entirely in terms of chance. It covers so-called reconstructional counterfactuals as opposed to the core, so-called preservational counterfactuals, which are analyzed in sections 2 and 3 of Part I. The counterfactual probability of such reconstructional counterfactuals is determined via the probability of possible processes leading to the antecedent, weighted, primarily and roughly, by the conditional probability of the antecedent given such a process: the counterfactual probability is thus, very roughly, a weighted sum over all processes most likely to have led to the antecedent, diverging at a fixed time. In section 6 I explain and elaborate further on the main points in section 5. In section 7 I illustrate the reconstructional analysis. I specify counterfactuals which are so-called process-pointers, since their consequents specify stages in processes likely to have led to their antecedents. I argue that so-called backtracking counterfactuals are process-pointer counterfactuals, which fit into the reconstructional analysis and do not call for a separate reading. I then illustrate cases where a speaker unwittingly employs a certain counterfactual while charitably construable as intending to assert (or ‘having in mind’) another. Here I also cover the issue of how to construe what one can take as backtracking counterfactuals, or counterfactuals of the reconstructional sort, and more specifically, which divergence point they should be taken as alluding to (prior to which the history is held fixed). Some such cases also give rise to what one can take as a dual reading of a counterfactual between preservational and reconstructional readings. Such cases may yield an ambiguity, where in many cases one construal is dominant.
In section 8 I illustrate the analysis by applying it to the famous Bizet-Verdi counterfactuals. This detailed analysis of counterfactuals (designed for the indeterministic case) has three main distinctive elements: its being chance-based, its causal aspect, and the use it makes of processes most likely to have led to the antecedent-event. This analysis is couched in a very different conceptual base from, and is an alternative account to, analyses in terms of the standard notion of closeness or distance of possible worlds, which is the main feature of the Stalnaker-Lewis-type analyses of counterfactuals. This notion of closeness or distance plays no role whatsoever in the analysis presented here. (This notion of closeness was left open by Stalnaker, and to a significant extent also by Lewis's second account.)
Modern scientific cosmology pushes the boundaries of knowledge and the knowable. This is prompting questions on the nature of scientific knowledge. A central issue is what defines a 'good' model. When addressing global properties of the Universe or its initial state this becomes a particularly pressing issue. How to assess the probability of the Universe as a whole is empirically ambiguous, since we can examine only part of a single realisation of the system under investigation: at some point, data will run out. We review the basics of applying Bayesian statistical explanation to the Universe as a whole. We argue that a conventional Bayesian approach to model inference generally fails in such circumstances, and cannot resolve, e.g., the so-called 'measure problem' in inflationary cosmology. Implicit and non-empirical valuations inevitably enter model assessment in these cases. This undermines the possibility of performing Bayesian model comparison. One must therefore either stay silent, or pursue a more general form of systematic and rational model assessment. We outline a generalised axiological Bayesian model inference framework, based on mathematical lattices. This extends inference based on empirical data (evidence) to additionally consider the properties of model structure (elegance) and model possibility space (beneficence). We propose this as a natural and theoretically well-motivated framework for introducing an explicit, rational approach to theoretical model prejudice and inference beyond data.
This brief article is a discussion-starter on the question of the role and use of theories and philosophies of history. In the last few decades, theories of history typically intended to transform the practice of historical studies through a straightforward application of their insights. Contrary to this, I argue that they either bring about particular historiographical innovations in terms of methodology but leave the entirety of historical studies intact, or change the way we think about the entirety of historical studies merely by describing and explaining it in fresh and novel ways, without the need of application. In the former case, theories appear as internal to historical studies. In the latter case, they appear as theories about history. Such theories about history are no longer limited to studying history understood as historical writing. In reflecting on the historical condition of the ever-changing world, they foster a more fruitful cooperative relationship with the discipline of history. Discussing the scope and use of such theories of history is inevitable today, when a younger generation sets out to theorize history against the backdrop of the experiential horizon of their own times.
Book review of Paul Horwich, Probability and Evidence (Cambridge Philosophy Classics edition), Cambridge: Cambridge University Press, 2016, 147pp, £14.99 (paperback).
The history of the relationship between Christian theology and the natural sciences has been conditioned by the initial decision of the masters of the "first scientific revolution" to disregard any necessary explanatory premiss to account for the constituting organization and the framing of naturally occurring entities. Not paying any attention to hierarchical control, they ended up disseminating a vision and understanding in which it was no longer possible for a theology of nature to send questions in the direction of the experimental sciences, as was done in the past between theology and many philosophically-based thought-systems. Presenting the history of some hinge-periods in the development of the Western-world sciences, this book first sets out to consider the conceptual revolution which has, in the 20th century, related consciousness, physical laws and levels of organization, in order to show that a new chance existed then for theology. This discourse was invited to revise its language to open it up to the quest for meaning which we find on the periphery of the project of the experimental sciences. The century-old reflection on the foundations of probability had prepared the ground for the introduction of the concept of information, at first linked to an effort aimed at maximizing the efficiency of electromagnetic communications. Taking the full measure of the questions that information theory poses to the biological sciences, this work attempts to identify the areas of convergence setting the stage for general systems theory, while it also tries to identify the insufficiencies of this recent vision and to highlight the questions left unanswered.
Re-reading some of the traditional proofs of God's existence from the order of the world, relying on some pioneering insights of Ludwig von Bertalanffy and Norbert Wiener, the author brings those proofs and insights in contact with the fascinating initial project of cybernetics and the elements of a "mythical" nature which, from its inception, it could never entirely eliminate. This book ends with the confrontation between the conceptually most extended regulation factors in the history of Western thought. It articulates the poetic utopia concerned with an immediate grasp of the world in its "deictic" character with the concurrent one aimed at the domination over matter and energy expressed by technology's driving rational utopia.
In this paper, I will reread the history of molecular genetics from a psychoanalytical angle, analysing it as a case history. Building on the developmental theories of Freud and his followers, I will distinguish four stages, namely: (1) oedipal childhood, notably the epoch of model building (1943–1953); (2) the latency period, with a focus on the development of basic skills (1953–1989); (3) adolescence, exemplified by the Human Genome Project, with its fierce conflicts, great expectations and grandiose claims (1989–2003); and (4) adulthood (2003–present), during which revolutionary research areas such as molecular biology and genomics have achieved a certain level of normalcy—have evolved into a normal science. I will indicate how a psychoanalytical assessment conducted in this manner may help us to interpret and address some of the key normative issues that have been raised with regard to molecular genetics over the years, such as ‘relevance’, ‘responsible innovation’ and ‘promise management’.
Stalnaker's Thesis about indicative conditionals is, roughly, that the probability one ought to assign to an indicative conditional equals the probability that one ought to assign to its consequent conditional on its antecedent. The thesis seems right. If you draw a card from a standard 52-card deck, how confident are you that the card is a diamond if it's a red card? To answer this, you calculate the proportion of red cards that are diamonds -- that is, you calculate the probability of drawing a diamond conditional on drawing a red card. Skyrms' Thesis about counterfactual conditionals is, roughly, that the probability that one ought to assign to a counterfactual equals one's rational expectation of the chance, at a relevant past time, of its consequent conditional on its antecedent. This thesis also seems right. If you decide not to enter a 100-ticket lottery, how confident are you that you would have won had you bought a ticket? To answer this, you calculate the prior chance -- that is, the chance just before your decision not to buy a ticket -- of winning conditional on entering the lottery. The central project of this article is to develop a new uniform theory of conditionals that allows us to derive a version of Skyrms' Thesis from a version of Stalnaker's Thesis, together with a chance-deference norm relating rational credence to beliefs about objective chance.
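The card calculation in the abstract can be reproduced directly; the suit counts below are just the standard 52-card deck, nothing specific to the paper:

```python
# The abstract's card example computed directly: the probability that a
# drawn card is a diamond, conditional on its being red, equals the
# proportion of red cards that are diamonds (standard 52-card deck).

from fractions import Fraction

suits = {"diamonds": 13, "hearts": 13, "spades": 13, "clubs": 13}
red = suits["diamonds"] + suits["hearts"]  # 26 red cards

p_diamond_given_red = Fraction(suits["diamonds"], red)
print(p_diamond_given_red)  # 1/2
```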
Critics of the computational connectionism of the last decade suggest that it shares undesirable features with earlier empiricist or associationist approaches, and with behaviourist theories of learning. To assess the accuracy of this charge the works of earlier writers are examined for the presence of such features, and brief accounts of those found are given for Herbert Spencer, William James and the learning theorists Thorndike, Pavlov and Hull. The idea that cognition depends on associative connections among large networks of neurons is indeed one with precedents, although the implications of this for psychological issues have been interpreted variously — not all versions of connectionism are alike.
Leibniz’s account of probability has come into better focus over the past decades. However, less attention has been paid to a certain domain of application of that account, that is, its application to the moral or ethical domain—the sphere of action, choice and practice. This is significant, as Leibniz had some things to say about applying probability theory to the moral domain, and thought the matter quite relevant. Leibniz’s work in this area is conducted at a high level of abstraction. It establishes a proof of concept, rather than concrete guidelines for how to apply calculations to specific cases. Still, this highly abstract material does allow us to begin to construct a framework for thinking about Leibniz’s approach to the ethical side of probability.