This paper is a response to Tyler Wunder's 'The modality of theism and probabilistic natural theology: a tension in Alvin Plantinga's philosophy' (this journal). In his article, Wunder argues that if the proponent of the Evolutionary Argument Against Naturalism (EAAN) holds theism to be non-contingent and frames the argument in terms of objective probability, then either the EAAN is unsound or theism is necessarily false. I argue that a modest revision of the EAAN renders Wunder's objection irrelevant, and that this revision actually widens the scope of the argument.
The major competing statistical paradigms share a remarkable but unremarked common thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively "subjective" and "objective" interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an "objective" probability.
The objective Bayesian view of proof (or logical probability, or evidential support) is explained and defended: that the relation of evidence to hypothesis (in legal trials, science, etc.) is a strictly logical one, comparable to deductive logic. This view is distinguished from the thesis, which had some popularity in law in the 1980s, that legal evidence ought to be evaluated using numerical probabilities and formulas. While numbers are not always useful, a central role is played in uncertain reasoning by the 'proportional syllogism', or argument from frequencies, such as 'nearly all aeroplane flights arrive safely, so my flight is very likely to arrive safely'. Such arguments raise the 'problem of the reference class', arising from the fact that an individual case may be a member of many different classes in which frequencies differ. For example, if 15 per cent of swans are black and 60 per cent of fauna in the zoo is black, what should I think about the likelihood of a swan in the zoo being black? The nature of the problem is explained, and legal cases where it arises are given. It is explained how recent work in data mining on the relevance of features for prediction provides a solution to the reference class problem.
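The abstract's zoo example can be made concrete in a few lines. This is a minimal sketch of the reference-class conflict using the abstract's own frequencies; the class names and data structure are mine, purely for illustration:

```python
# Reference class problem: the same individual (a swan in the zoo)
# belongs to classes with different frequencies of being black, so
# naive frequency-to-probability transfer gives conflicting answers.
black_frequency = {
    "swans": 0.15,      # 15 per cent of swans are black (from the abstract)
    "zoo_fauna": 0.60,  # 60 per cent of fauna in the zoo is black
}

classes_of_individual = ["swans", "zoo_fauna"]  # the zoo swan is in both

estimates = {c: black_frequency[c] for c in classes_of_individual}
print(estimates)  # {'swans': 0.15, 'zoo_fauna': 0.6}
# The two estimates disagree, and the frequency data alone do not say
# which reference class is the right one: that is the problem the
# data-mining work on feature relevance is meant to solve.
```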
We offer a new argument for the claim that there can be non-degenerate objective chance ("true randomness") in a deterministic world. Using a formal model of the relationship between different levels of description of a system, we show how objective chance at a higher level can coexist with its absence at a lower level. Unlike previous arguments for the level-specificity of chance, our argument shows, in a precise sense, that higher-level chance does not collapse into epistemic probability, despite higher-level properties supervening on lower-level ones. We show that the distinction between objective chance and epistemic probability can be drawn, and operationalized, at every level of description. There is, therefore, not a single distinction between objective and epistemic probability, but a family of such distinctions.
In the contemporary philosophical debate about probability, one of the main problems concerns the relation between objective probability and determinism. Is it possible for objective probability and determinism to co-exist? This is one of the questions this dispute tries to answer. The discussion is conducted between advocates of a positive answer (compatibilists) and opponents of co-existence (incompatibilists). In the early twentieth century, many logicians also developed topics regarding probability and determinism. One of them was the outstanding Polish logician and philosopher Jan Łukasiewicz. The general purpose of this paper is to analyse Łukasiewicz's views regarding determinism and probability and to situate them in the contemporary field of this problem. I will try to show the relation between his interpretations of these concepts and, in consequence, his attempt to confront them. As a result of the above analysis, I present some different positions (located in the fields of logic and semantics) in the contemporary discourse about the relation between objective probability and determinism. Moreover, I will present Łukasiewicz's views about this relation and the consequences of these solutions in the field of logic.
We investigate the conflict between the ex ante and ex post criteria of social welfare in a new framework of individual and social decisions, which distinguishes between two sources of uncertainty, here interpreted as an objective and a subjective source respectively. This framework makes it possible to endow the individuals and society not only with ex ante and ex post preferences, as is usually done, but also with interim preferences of two kinds, and correspondingly, to introduce interim forms of the Pareto principle. After characterizing the ex ante and ex post criteria, we present a first solution to their conflict that extends the former as far as possible in the direction of the latter. Then, we present a second solution, which goes in the opposite direction, and is also maximally assertive. Both solutions translate the assumed Pareto conditions into weighted additive utility representations, and both attribute to the individuals common probability values on the objective source of uncertainty, and different probability values on the subjective source. We discuss these solutions in terms of two conceptual arguments, i.e., the by now classic spurious unanimity argument and a novel informational argument labelled complementary ignorance. The paper complies with the standard economic methodology of basing probability and utility representations on preference axioms, but for the sake of completeness, also considers a construal of objective uncertainty based on the assumption of an exogenously given probability measure. JEL classification: D70; D81.
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition (the Past Hypothesis) to account for the temporal asymmetry. In this paper, I show that the Past Hypothesis also contains enough information to simplify the quantum ontology and define a unique initial condition in such a world. First, I introduce Density Matrix Realism, the thesis that the quantum universe is described by a fundamental density matrix that represents something objective. This stands in sharp contrast to Wave Function Realism, the thesis that the quantum universe is described by a wave function that represents something objective. Second, I suggest that the Past Hypothesis is sufficient to determine a unique and simple density matrix. This is achieved by what I call the Initial Projection Hypothesis: the initial density matrix of the universe is the normalized projection onto the special low-dimensional Hilbert space. Third, because the initial quantum state is unique and simple, we have a strong case for the Nomological Thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, including for the harmony between statistical mechanics and quantum mechanics, the dynamic unity of the universe and its subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
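A minimal formal gloss of the Initial Projection Hypothesis, with notation assumed rather than quoted from the paper (I_PH for the projection onto the Past-Hypothesis subspace):

```latex
% The initial density matrix as the normalized projection onto the
% special low-dimensional Past-Hypothesis subspace (notation mine):
\rho(t_0) \;=\; \frac{I_{PH}}{\dim \mathcal{H}_{PH}}
```

Normalizing by the dimension makes the initial state maximally mixed over the Past-Hypothesis subspace, which is what makes it simple enough to be a candidate law.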
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
In this paper I provide a frequentist philosophical-methodological solution to the stopping rule problem presented by Lindley & Phillips in 1976, which is set in the ecological context of testing koalas' sex ratio. I deliver criteria for discerning a stopping rule, evidence, and a model that are epistemically more appropriate for testing the hypothesis of the case studied, by appealing to the physical notion of probability and by analyzing the content of possible formulations of evidence, the assumptions of models, and the meaning of the ecological hypothesis. First, I show the difference in the evidence taken into account in the different frequentist sampling procedures presented in the problem. Next, I discuss the inapplicability of the Carnapian principle of total evidence for deciding which formulation of evidence associated with a given sampling procedure and statistical model is epistemically more appropriate for testing the hypothesis in question. Then I propose a double-perspective (evidence and model) frequentist solution based on the choice of the evidence which better corresponds to the investigated ecological hypothesis, as well as on the choice of the model that embraces less unrealistic ontological assumptions. Finally, I discuss two perspectives on stopping rule dependence.
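As a sketch of the structure of the puzzle, the code below reproduces the stopping-rule effect on the numbers usually associated with Lindley & Phillips (12 observations, 9 of the focal sex); whether these match the koala case in the paper exactly is my assumption, not a quotation:

```python
# Two frequentist analyses of the same data under the same null
# hypothesis differ because they assume different stopping rules.
from scipy.stats import binom, nbinom

n, k = 12, 9   # 12 animals observed, 9 of the focal sex
p0 = 0.5       # null hypothesis: balanced sex ratio

# Design A: sample size fixed at 12 in advance (binomial model).
p_fixed_n = binom.sf(k - 1, n, p0)  # P(X >= 9 | n = 12) ~ 0.073

# Design B: sample until the 3rd animal of the other sex is seen
# (negative binomial model). scipy's nbinom counts failures before
# the r-th success; at p0 = 0.5 this is symmetric with counting
# successes before the 3rd failure, which is what Design B needs.
p_stop_at_3 = nbinom.sf(k - 1, n - k, p0)  # P(>= 9) ~ 0.033

print(f"fixed-n p-value:   {p_fixed_n:.4f}")
print(f"stop-at-3 p-value: {p_stop_at_3:.4f}")
# Same data, same null, different p-values: this is the stopping rule
# dependence the paper's two closing perspectives address.
```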
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or "toy" model of quantum mechanics over sets (QM/sets). There are two parts. The notion of an "event" is reinterpreted from being an epistemological state of indefiniteness to being an objective state of indefiniteness. And the mathematical framework of finite probability theory is recast as the quantum probability calculus for QM/sets. The point is not to clarify finite probability theory but to elucidate quantum mechanics itself by seeing some of its quantum features in a classical setting.
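As a hedged illustration of the recasting (my reading of the QM/sets setup, with invented variable names), the Born-rule analogue for a subset-state reduces to classical conditional probability with equiprobable outcomes:

```python
# In QM/sets, a "state" is a nonempty subset S of the universe U, read
# as an objectively indefinite state, and the probability of an event
# T given S is the overlap ratio |T & S| / |S|.
def prob(event: set, state: set) -> float:
    """Probability of `event` given the indefinite subset-state."""
    return len(event & state) / len(state)

U = {1, 2, 3, 4, 5, 6}
S = {1, 2, 3}              # indefinite between outcomes 1, 2, and 3
print(prob({1}, S))        # 1/3: equiprobable outcomes within S
print(prob({2, 3, 6}, S))  # 2/3: ordinary conditional probability
```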
Classical physics and quantum physics suggest two metaphysical types of reality: the classical notion of an objectively definite reality with properties "all the way down," and the quantum notion of an objectively indefinite type of reality. The problem of interpreting quantum mechanics (QM) is essentially the problem of making sense out of an objectively indefinite reality. These two types of reality can be respectively associated with the two mathematical concepts of subsets and quotient sets (or partitions), which are category-theoretically dual to one another and which are developed in two mathematical logics, the usual Boolean logic of subsets and the more recent logic of partitions. Our sense-making strategy is to "follow the math" by showing how the logic and mathematics of set partitions can be transported in a natural way to Hilbert spaces, where it yields the mathematical machinery of QM--which shows that the mathematical framework of QM is a type of logical system over ℂ. We then show how the machinery of QM can be transported the other way down to the set-like vector spaces over ℤ₂, showing how the classical logical finite probability calculus (in a "non-commutative" version) is a type of "quantum mechanics" over ℤ₂, i.e., over sets. In this way, we try to make sense out of objective indefiniteness and thus to interpret quantum mechanics.
Evolutionary theory (ET) is teeming with probabilities. Probabilities exist at all levels: the level of mutation, the level of microevolution, and the level of macroevolution. This uncontroversial claim raises a number of contentious issues. For example, is the evolutionary process (as opposed to the theory) indeterministic, or is it deterministic? Philosophers of biology have taken different sides on this issue. Millstein (1997) has argued that we are not currently able to answer this question, and that even scientific realists ought to remain agnostic concerning the determinism or indeterminism of evolutionary processes. If this argument is correct, it suggests that, whatever we take probabilities in ET to be, they must be consistent with either determinism or indeterminism. This raises some interesting philosophical questions: How should we understand the probabilities used in ET? In other words, what is meant by saying that a certain evolutionary change is more or less probable? Which interpretation of probability is the most appropriate for ET? I argue that the probabilities used in ET are objective in a realist sense, if not in an indeterministic sense. Furthermore, there are a number of interpretations of probability that are objective and would be consistent with ET under determinism or indeterminism. However, I argue that evolutionary probabilities are best understood as propensities of population-level kinds.
Probability can be used to measure degree of belief in two ways: objectively and subjectively. The objective measure is a measure of the rational degree of belief in a proposition given a set of evidential propositions. The subjective measure is the measure of a particular subject's dispositions to decide between options. In both measures, certainty is a degree of belief of 1. I will show, however, that there can be cases where one belief is stronger than another yet both beliefs are plausibly measurable as objectively and subjectively certain. In ordinary language, we can say that while both beliefs are certain, one belief is more certain than the other. I will then propose a second, non-probabilistic dimension of measurement, which tracks this variation in certainty in cases where the probability is 1. A general principle of rationality is that one's subjective degree of belief should match the rational degree of belief given the evidence available. In this paper I hope to show that it is also a rational principle that the maximum stake size at which one should remain certain should match the rational weight of certainty given the evidence available. Neither objective nor subjective measures of certainty conform to the axioms of probability, but instead are measured in utility. This has the consequence that, although it is often rational to be certain to some degree, there is no such thing as absolute certainty.
In this paper we discuss the new Tweety puzzle. The original Tweety puzzle was addressed by approaches in non-monotonic logic, which aim to adequately represent the Tweety case, namely that Tweety is a penguin and, thus, an exceptional bird, which cannot fly, although in general birds can fly. The new Tweety puzzle is intended as a challenge for probabilistic theories of epistemic states. In the first part of the paper we argue against monistic Bayesians, who assume that epistemic states can at any given time be adequately described by a single subjective probability function. We show that monistic Bayesians cannot provide an adequate solution to the new Tweety puzzle, because this requires one to refer to a frequency-based probability function. We conclude that monistic Bayesianism cannot be a fully adequate theory of epistemic states. In the second part we describe an empirical study, which provides support for the thesis that monistic Bayesianism is also inadequate as a descriptive theory of cognitive states. In the final part of the paper we criticize Bayesian approaches in cognitive science, insofar as their monistic tendency cannot adequately address the new Tweety puzzle. We further argue against monistic Bayesianism in cognitive science by means of a case study. In this case study we show that Oaksford and Chater's (2007, 2008) model of conditional inference—contrary to the authors' theoretical position—has to refer also to a frequency-based probability function.
In standard probability theory, probability zero is not the same as impossibility. But many have suggested that only impossible events should have probability zero. This can be arranged if we allow infinitesimal probabilities, but infinitesimals do not solve all of the problems. We will see that regular probabilities are not invariant over rigid transformations, even for simple, bounded, countable, constructive, and disjoint sets. Hence, regular chances cannot be determined by space-time invariant physical laws, and regular credences cannot satisfy seemingly reasonable symmetry principles. Moreover, the examples here are immune to the objections against Williamson's infinite coin flips.
Mathematicians often speak of conjectures, yet unproved, as probable or well-confirmed by evidence. The Riemann Hypothesis, for example, is widely believed to be almost certainly true. There seems no initial reason to distinguish such probability from the same notion in empirical science. Yet it is hard to see how there could be probabilistic relations between the necessary truths of pure mathematics. The existence of such logical relations, short of certainty, is defended using the theory of logical probability (or objective Bayesianism, or non-deductive logic), and some detailed examples of its use in mathematics are surveyed. Examples of inductive reasoning in experimental mathematics are given, and it is argued that the problem of induction is best appreciated in the mathematical case.
There is widespread belief in a tension between quantum theory and special relativity, motivated by the idea that quantum theory violates J. S. Bell's criterion of local causality, which is meant to implement the causal structure of relativistic space-time. This paper argues that if one takes the essential intuitive idea behind local causality to be that probabilities in a locally causal theory depend only on what occurs in the backward light cone and if one regards objective probability as what imposes constraints on rational credence along the lines of David Lewis' Principal Principle, then one arrives at the view that whether or not Bell's criterion holds is irrelevant for whether or not local causality holds. The assumptions on which this argument rests are highlighted, and those that may seem controversial are motivated.
In a recent article, Gordon Belot uses the so-called undermining phenomenon to try to raise a new difficulty for reductive accounts of objectiveprobability, such as Humean Best System accounts. In this paper I will give a critical discussion of Belot’s paper and argue that, in fact, there is no new difficulty here for chance reductionists to address.
There is widespread excitement in the literature about the method of arbitrary functions: many take it to show that it is from the dynamics of systems that the objectivity of probabilities emerges. In this paper, I differentiate three ways in which a probability function might be objective, and I argue that the method of arbitrary functions cannot help us show that dynamics objectivise probabilities in any of these senses.
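To fix ideas about what the method of arbitrary functions claims, here is a small simulation (my construction, not the paper's): a wheel with many thin alternating sectors, spun with initial conditions drawn from quite different smooth distributions, yields P(red) close to 1/2 in every case:

```python
# Method of arbitrary functions: fast-oscillating dynamics washes out
# differences between smooth input densities over initial conditions.
import numpy as np

rng = np.random.default_rng(0)
n_sectors = 100  # 50 red and 50 black alternating sectors

def p_red(total_rotation: np.ndarray) -> float:
    angle = total_rotation % (2 * np.pi)                   # rest angle
    sector = (angle / (2 * np.pi) * n_sectors).astype(int)
    return float(np.mean(sector % 2 == 0))                 # even = red

inputs = {
    "uniform(10, 50)": rng.uniform(10, 50, 100_000),
    "normal(30, 5)":   rng.normal(30, 5, 100_000),
    "gamma(9, 3)":     rng.gamma(9.0, 3.0, 100_000),
}
for name, rotations in inputs.items():
    print(f"{name:16s} -> P(red) = {p_red(rotations):.3f}")  # all ~0.5
```

Whether this convergence shows that the resulting probabilities are objective in any of the three senses the paper distinguishes is exactly what the paper denies.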
There is good reason to believe that scientific realism requires a commitment to the objective modal structure of the physical world. Causality, equilibrium, laws of nature, and probability all feature prominently in scientific theory and explanation, and each one is a modal notion. If we are committed to the content of our best scientific theories, we must accept the modal nature of the physical world. But what does the scientific realist's commitment to physical modality require? We consider whether scientific realism is compatible with Humeanism about the laws of nature, and we conclude that it is not. We specifically identify three major problems for the best-systems account of lawhood: its central concept of strength cannot be formulated non-circularly, it cannot offer a satisfactory account of the laws of the special sciences, and it can offer no explanation of the success of inductive inference. In addition, Humeanism fails to be naturalistically motivated. For these reasons, we conclude that the scientific realist must embrace natural necessity.
I argue that riskier killings of innocent people are, other things equal, objectively worse than less risky killings. I ground these views in considerations of disrespect and security. Killing someone more riskily shows greater disrespect for him by more grievously undervaluing his standing and interests, and more seriously undermines his security by exposing a disposition to harm him across all counterfactual scenarios in which the probability of killing an innocent person is that high or less. I argue that the salient probabilities are the agent's sincere, sane, subjective probabilities, and that this thesis is relevant whether your risk-taking pertains to the probability of killing a person or to the probability that the person you kill is not liable to be killed. I then defend the view's relevance to intentional killing; show how it differs from an account of blameworthiness; and explain its significance for all-things-considered justification and justification under uncertainty.
Karl Popper (1902-1994) was one of the most influential philosophers of science of the 20th century. He made significant contributions to debates concerning general scientific methodology and theory choice, the demarcation of science from non-science, the nature of probability and quantum mechanics, and the methodology of the social sciences. His work is notable for its wide influence within the philosophy of science, within science itself, and within a broader social context. Popper's early work attempts to solve the problem of demarcation and offer a clear criterion that distinguishes scientific theories from metaphysical or mythological claims. Popper's falsificationist methodology holds that scientific theories are characterized by entailing predictions that future observations might reveal to be false. When theories are falsified by such observations, scientists can respond by revising the theory, by rejecting the theory in favor of a rival, or by maintaining the theory as is and changing an auxiliary hypothesis. In any case, however, this process must aim at the production of new, falsifiable predictions. While Popper recognizes that scientists can and do hold onto theories in the face of failed predictions when there are no predictively superior rivals to turn to, he holds that scientific practice is characterized by its continual effort to test theories against experience and make revisions based on the outcomes of these tests. By contrast, theories that are permanently immunized from falsification by the introduction of untestable ad hoc hypotheses can no longer be classified as scientific. Among other things, Popper argues that his falsificationist proposal allows for a solution of the problem of induction, since inductive reasoning plays no role in his account of theory choice. Along with his general proposals regarding falsification and scientific methodology, Popper is notable for his work on probability and quantum mechanics and on the methodology of the social sciences. Popper defends a propensity theory of probability, according to which probabilities are interpreted as objective, mind-independent properties of experimental setups. Popper then uses this theory to provide a realist interpretation of quantum mechanics, though its applicability goes beyond this specific case. With respect to the social sciences, Popper argued against the historicist attempt to formulate universal laws covering the whole of human history and instead argued in favor of methodological individualism and situational logic. Table of Contents: 1. Background 2. Falsification and the Criterion of Demarcation a. Popper on Physics and Psychoanalysis b. Auxiliary and Ad Hoc Hypotheses c. Basic Sentences and the Role of Convention d. Induction, Corroboration, and Verisimilitude 3. Criticisms of Falsificationism 4. Realism, Quantum Mechanics, and Probability 5. Methodology in the Social Sciences 6. Popper's Legacy 7. References and Further Reading a. Primary Sources b. Secondary Sources
The conspicuous similarities between interpretive strategies in classical statistical mechanics and in quantum mechanics may be grounded on their employment of common implementations of probability. The objective probabilities which represent the underlying stochasticity of these theories can be naturally associated with three of their common formal features: initial conditions, dynamics, and observables. Various well-known interpretations of the two theories line up with particular choices among these three ways of implementing probability. This perspective has significant application to debates on primitive ontology and to the quantum measurement problem.
The question I am addressing in this paper is the following: how is it possible to empirically test, or confirm, counterfactuals? After motivating this question in Section 1, I will look at two approaches to counterfactuals, and at how counterfactuals can be empirically tested, or confirmed, if at all, on these accounts in Section 2. I will then digress into the philosophy of probability in Section 3. The reason for this digression is that I want to use the way observable absolute and relative frequencies, two empirical notions, are used to empirically test, or confirm, hypotheses about objective chances, a metaphysical notion, as a role-model. Specifically, I want to use this probabilistic account of the testing of chance hypotheses as a role-model for the account of the testing of counterfactuals, another metaphysical notion, that I will present in Sections 4 to 8. I will conclude by comparing my proposal to one non-probabilistic and one probabilistic alternative in Section 9.
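The role-model case can be illustrated with a toy Bayesian comparison of two chance hypotheses against an observed relative frequency (the numbers and hypotheses are mine, purely illustrative):

```python
# Observed frequencies confirming hypotheses about objective chance:
# H1 says chance(heads) = 0.5, H2 says chance(heads) = 0.7; an observed
# relative frequency updates credence over the two chance hypotheses.
from math import comb

def likelihood(chance: float, heads: int, n: int) -> float:
    return comb(n, heads) * chance**heads * (1 - chance)**(n - heads)

heads, n = 58, 100                 # observed relative frequency 0.58
prior = {0.5: 0.5, 0.7: 0.5}       # equal prior credence in each chance
unnorm = {c: prior[c] * likelihood(c, heads, n) for c in prior}
z = sum(unnorm.values())
posterior = {c: round(v / z, 3) for c, v in unnorm.items()}
print(posterior)  # ~{0.5: 0.87, 0.7: 0.13}: frequency 0.58 favors H1
```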
Alvin Plantinga has famously argued that the naturalist who accepts evolutionary theory has a defeater for all of her beliefs, including her belief in naturalism and evolution. Hence, he says, naturalism, when conjoined with evolution, is self-defeating and cannot be rationally accepted. This is known as the evolutionary argument against naturalism (EAAN). However, Tyler Wunder (Religious Studies 51:391–399, 2015) has recently shown that if the EAAN is framed in terms of objective probability and theism is assumed to be non-contingent, then either theism is necessarily false or the EAAN is unsound. Neither option is attractive to the proponent of the EAAN. Perry Hendricks (Religious Studies 1–5, 2018) has responded to Wunder's criticism, showing that the EAAN can be salvaged and, indeed, strengthened, by framing it in terms not of naturalism (N), but of a proposition that is entailed by N and is also consistent with theism. We will show that once Hendricks' solution to Wunder's objection is accepted, a puzzle ensues: if the EAAN provides the naturalist with a defeater for all of her beliefs, then an extension of it appears to provide God with a defeater for all of his beliefs. After bringing out this puzzle, we suggest several ways in which the proponent of the EAAN might solve it, but also show some potential weaknesses in these purported solutions. Whether the solutions to the puzzle that we consider ultimately succeed is unclear to us. (Translation: the authors disagree. One author thinks that the solutions (or, at least, some of them) that we consider do solve the puzzle while the other author does not.) However, it is clear to us that this is an issue that proponents of the EAAN need to address.
In the classic Miners case, an agent subjectively ought to do what they know is objectively wrong. This case shows that the subjective and objective 'oughts' are somewhat independent. But there remains a powerful intuition that the guidance of objective 'oughts' is more authoritative—so long as we know what they tell us. We argue that this intuition must be given up in light of a monotonicity principle, which undercuts the rationale for saying that objective 'oughts' are an authoritative guide for agents and advisors.
The concept of agent-responsibility for an outcome (that is, of the outcome reflecting the autonomous choice of the agent) is central to both ethics and political philosophy. The concept, however, remains radically under-explored. In particular, the issue of partial responsibility for an outcome needs further development. I propose an account of partial responsibility based on partial causal contribution. Agents who choose autonomously in full knowledge of the consequences are agent-responsible, I claim, for the shift in the objective probability of the outcome in question that their choice induces. Thus, agents will typically be only partially agent-responsible (that is, for a shift of less than 100 percent) for any given outcome. The model has an implication that is generally rejected: that agents who purchase lottery tickets and win are agent-responsible for only part of the winnings.
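The lottery implication can be stated in one line of arithmetic, in the account's own terms, with n for the number of tickets:

```latex
% Buying one of n tickets shifts the objective probability of winning
% from 0 to 1/n, so the agent is agent-responsible for a 1/n share:
\Delta p \;=\; \Pr(\text{win} \mid \text{buy}) - \Pr(\text{win} \mid \text{not buy})
         \;=\; \tfrac{1}{n} - 0 \;=\; \tfrac{1}{n}
```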
You have a crystal ball. Unfortunately, it's defective. Rather than predicting the future, it gives you the chances of future events. Is it then of any use? It certainly seems so. You may not know for sure whether the stock market will crash next week; but if you know for sure that it has an 80% chance of crashing, then you should be 80% confident that it will, and you should plan accordingly. More generally, given that the chance of a proposition A is x%, your conditional credence in A should be x%. This is a chance-credence principle: a principle relating chance (objective probability) with credence (subjective probability, degree of belief). Let's call it the Minimal Principle (MP).
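Stated as a formula, with Cr for credence and Ch for objective chance (symbols mine; the abstract gives the principle in words):

```latex
% The Minimal Principle (MP): conditional on the chance of A being x,
% one's credence in A should be x.
Cr\big(A \mid Ch(A) = x\big) \;=\; x
```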
The article is a plea for ethicists to regard probability as one of their most important concerns. It outlines a series of topics of central importance in ethical theory in which probability is implicated, often in a surprisingly deep way, and lists a number of open problems. Topics covered include: interpretations of probability in ethical contexts; the evaluative and normative significance of risk or uncertainty; uses and abuses of expected utility theory; veils of ignorance; Harsanyi's aggregation theorem; population size problems; equality; fairness; giving priority to the worse off; continuity; incommensurability; nonexpected utility theory; evaluative measurement; aggregation; causal and evidential decision theory; act consequentialism; rule consequentialism; and deontology.
This chapter is divided into three parts. First I outline what makes something an objective list theory of well-being. I then go on to look at the motivations for holding such a view before turning to objections to these theories of well-being.
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality, and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
I offer a new theory of faultless disagreement, according to which truth is absolute (non-relative) but can still be non-objective. What's relative is truth-aptness: a sentence like 'Vegemite is tasty' (V) can be truth-accessible and bivalent in one context but not in another. Within a context in which V fails to be bivalent, we can affirm that there is no issue of truth or falsity about V; still, the disputants affirming and denying V were not at fault, since in their context of assertion V was bivalent. This theory requires a theory of assertion that is a form of cognitive expressivism.
In this study we investigate the influence of reason-relation readings of indicative conditionals and 'and'/'but'/'therefore' sentences on various cognitive assessments. According to the Frege-Grice tradition, a dissociation is expected. Specifically, differences in the reason-relation reading of these sentences should affect participants' evaluations of their acceptability but not of their truth value. In two experiments we tested this assumption by introducing a relevance manipulation into the truth-table task as well as in other tasks assessing the participants' acceptability and probability evaluations. Across the two experiments a strong dissociation was found. The reason-relation reading of all four sentences strongly affected their probability and acceptability evaluations, but hardly affected their respective truth evaluations. Implications of this result for recent work on indicative conditionals are discussed.
The notion of comparative probability defined in Bayesian subjectivist theory stems from an intuitive idea that, for a given pair of events, one event may be considered "more probable" than the other. Yet it is conceivable that there are cases where it is indeterminate as to which event is more probable, due to, e.g., lack of robust statistical information. We take it that these cases involve indeterminate comparative probabilities. This paper provides a Savage-style decision-theoretic foundation for indeterminate comparative probabilities.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the probability of the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) as more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
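The uncontroversial principle at issue is the conjunction rule; a toy calculation (with made-up numbers) shows why B & F can never be more probable than B, however plausible the feminist detail makes the story:

```python
# Conjunction rule: P(B & F) = P(B) * P(F | B) <= P(B) for any values.
p_B = 0.05          # P(Linda is a bank teller) -- illustrative number
p_F_given_B = 0.10  # P(feminist | bank teller) -- illustrative number
p_BF = p_B * p_F_given_B

assert p_BF <= p_B
print(f"P(B & F) = {p_BF:.4f} <= P(B) = {p_B:.4f}")
# The verisimilitudinarian diagnosis: participants rank B & F above B
# as the more truthlike hypothesis about Linda, not the more probable.
```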
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
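The quantitative step the abstract describes, repeating Boole's move for partitions, yields the logical entropy of a partition; a minimal sketch, following my understanding of the construction:

```python
# Logical entropy h(pi) = 1 - sum of squared block proportions: the
# probability that two independent equiprobable draws from U land in
# different blocks, i.e., are distinguished by the partition.
def logical_entropy(partition: list[set]) -> float:
    n = sum(len(block) for block in partition)
    return 1 - sum((len(block) / n) ** 2 for block in partition)

blocks = [{1, 2}, {3}, {4, 5, 6}]  # a partition of {1, ..., 6}
print(logical_entropy(blocks))     # 1 - 14/36 = 22/36 ~ 0.611
# Refining the partition raises h: more distinctions, more definiteness.
```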
Many philosophers argue that Keynes's concept of the "weight of arguments" is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of "Evidential Probability" to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper's Paradox of Ideal Evidence.
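One way to formalize the suggestion, offered here as my gloss rather than Kyburg's or the paper's own notation: evidential probability assigns an interval, and weight varies inversely with the interval's width:

```latex
% If the evidence E fixes Pr(H | E) only up to an interval [l, u],
% weight can be measured by the interval's precision:
\Pr(H \mid E) \in [l, u], \qquad W(E) \;=\; 1 - (u - l)
% More relevant evidence narrows [l, u] and so increases W(E), even
% when the interval's midpoint stays the same.
```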
This paper draws together as many as possible of the clues and pieces of the puzzle surrounding T. S. Eliot's "infamous" literary term "objective correlative". Many different scholars have claimed many different sources for the term, in Pound, Whitman, Baudelaire, Washington Allston, Santayana, Husserl, Nietzsche, Newman, Walter Pater, Coleridge, Russell, Bradley, Bergson, Bosanquet, Schopenhauer and Arnold. This paper aims to rewrite this list by surveying those individuals who, in different ways, either offer the truest claim to being the source of the term, or contributed the most to Eliot's development of it: Allston, Husserl, Bradley and Bergson. What the paper will argue is that Eliot's possible inspiration for the term is more indebted to the idealist tradition, and Bergson's aesthetic development of it, than to the phenomenology of Husserl.
We think we have lots of substantial knowledge about the future. But contemporary wisdom has it that indeterminism prevails in such a way that just about any proposition about the future has a non-zero objective chance of being false. What should one do about this? One, pessimistic, reaction is scepticism about knowledge of the future. We think this should be something of a last resort, especially since this scepticism is likely to infect alleged knowledge of the present and past. One anti-sceptical strategy is to pin our hopes on determinism, conceding that knowledge of the future is unavailable in an indeterministic world. This is not satisfying either: we would rather not be hostage to empirical fortune in the way that this strategy recommends. A final strategy, one that we shall explore in this paper, is one of reconciliation: knowledge of a proposition is compatible with a subject's belief having a non-zero objective chance of error. Following Williamson, we are interested in tying knowledge to the presence or absence of error in close cases, and so we shall explore the connections between knowledge and objective chance within such a framework. We don't want to get tangled up here in complications involved in attempting to formulate a necessary and sufficient condition for knowledge in terms of safety. Instead, we will assume the following rough and ready necessary condition: a subject knows P only if she could not easily have falsely believed P. Assuming that easiness is to be spelt…
This paper defends David Hume's "Of Miracles" from John Earman's (2000) Bayesian attack by showing that Earman misrepresents Hume's argument against believing in miracles and misunderstands Hume's epistemology of probable belief. It argues, moreover, that Hume's account of evidence is fundamentally non-mathematical and thus cannot be properly represented in a Bayesian framework. Hume's account of probability is shown to be consistent with a long and laudable tradition of evidential reasoning going back to ancient Roman law.
We expound an alternative to the Copenhagen interpretation of the formalism of nonrelativistic quantum mechanics. The basic difference is that the new interpretation is formulated in the language of epistemological realism. It involves a change in some basic physical concepts. The ψ function is no longer interpreted as a probability amplitude of the observed behaviour of elementary particles but as an objective physical field representing the particles themselves. The particles are thus extended objects whose extension varies in time according to the variation of ψ. They are considered as fundamental regions of space with some kind of nonlocality. Special consideration is given to the Heisenberg relations, the Einstein-Podolsky-Rosen correlations, the reduction process, the problem of measurement, and the quantum-statistical distributions.
A probability distribution is regular if no possible event is assigned probability zero. While some hold that probabilities should always be regular, three counter-arguments have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but we see here that their objections fail. Howson says that Williamson's (2007) "isomorphic" events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson's physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Having failed to refute the symmetry arguments on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.
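For reference, the regularity thesis under discussion, in symbols (notation mine):

```latex
% Regularity: only the impossible event receives probability zero.
\forall E \in \mathcal{F}: \quad \Pr(E) = 0 \;\Longrightarrow\; E = \emptyset
```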
In probability discounting (or probability weighting), one multiplies the value of an outcome by one's subjective probability that the outcome will obtain in decision-making. The broader import of defending probability discounting is to help justify cost-benefit analyses in contexts such as climate change. This chapter defends probability discounting under risk both negatively, from arguments by Simon Caney (2008, 2009), and with a new positive argument. First, in responding to Caney, I argue that small costs and benefits need to be evaluated, and that viewing practices at the social level is too coarse-grained. Second, I argue for probability discounting, using a distinction between causal responsibility and moral responsibility. Moral responsibility can be cashed out in terms of blameworthiness and praiseworthiness, while causal responsibility obtains in full for any effect which is part of a causal chain linked to one's act. With this distinction in hand, unlike causal responsibility, moral responsibility can be seen as coming in degrees. My argument is that, given that we can limit our deliberation and consideration to that which we are morally responsible for, and that our moral responsibility for outcomes is limited by our subjective probabilities, our subjective probabilities can ground probability discounting.
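The chapter's core operation is simple enough to state in code. A minimal sketch with invented numbers (the climate framing is the chapter's, the figures are not):

```python
# Probability discounting: weight each outcome's value by one's
# subjective probability that it obtains, then sum.
def discounted_value(outcomes: list[tuple[float, float]]) -> float:
    """Sum of value * subjective probability over (value, prob) pairs."""
    return sum(value * prob for value, prob in outcomes)

mitigation_policy = [(-1_000.0, 1.0),     # certain present cost
                     (50_000.0, 0.03)]    # small chance of a large benefit
print(discounted_value(mitigation_policy))  # 500.0 > 0: favored on balance
```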
A definition of causation as probability-raising is threatened by two kinds of counterexample: first, when a cause lowers the probability of its effect; and second, when the probability of an effect is raised by a non-cause. In this paper, I present an account that deals successfully with problem cases of both these kinds. In doing so, I also explore some novel implications of incorporating into the metaphysical investigation considerations of causal psychology.
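The definition under threat, and the two counterexample patterns, in symbols (my compression of the abstract):

```latex
% Probability-raising definition of causation:
C \text{ causes } E \;\iff\; \Pr(E \mid C) > \Pr(E)
% Counterexample type 1: a genuine cause with \Pr(E \mid C) < \Pr(E).
% Counterexample type 2: a non-cause with \Pr(E \mid C) > \Pr(E).
```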
There is a plethora of confirmation measures in the literature. Zalabardo considers four such measures: PD, PR, LD, and LR. He argues for LR and against each of PD, PR, and LD. First, he argues that PR is the better of the two probability measures. Next, he argues that LR is the better of the two likelihood measures. Finally, he argues that LR is superior to PR. I set aside LD and focus on the trio of PD, PR, and LR. The question I address is whether Zalabardo succeeds in showing that LR is superior to each of PD and PR. I argue that the answer is negative. I also argue, though, that measures such as PD and PR, on one hand, and measures such as LR, on the other hand, are naturally understood as explications of distinct senses of confirmation.
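For orientation, here are the four measures under their standard definitions (assumed here; Zalabardo's own notation may differ), computed for one toy case:

```python
# PD(H,E) = P(H|E) - P(H)      probability difference
# PR(H,E) = P(H|E) / P(H)      probability ratio
# LD(H,E) = P(E|H) - P(E|~H)   likelihood difference
# LR(H,E) = P(E|H) / P(E|~H)   likelihood ratio
def confirmation_measures(p_h: float, p_e_h: float, p_e_noth: float) -> dict:
    p_e = p_e_h * p_h + p_e_noth * (1 - p_h)  # law of total probability
    p_h_e = p_e_h * p_h / p_e                 # Bayes' theorem
    return {"PD": p_h_e - p_h, "PR": p_h_e / p_h,
            "LD": p_e_h - p_e_noth, "LR": p_e_h / p_e_noth}

print(confirmation_measures(p_h=0.1, p_e_h=0.8, p_e_noth=0.2))
# {'PD': ~0.208, 'PR': ~3.08, 'LD': 0.6, 'LR': 4.0}
```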
Objective: In this essay, I will try to track some historical and modern stages of the discussion on the Gettier problem, and point out the interrelations of the questions that this problem raises for epistemologists, with sceptical arguments, and a so-called problem of relevance. Methods: historical analysis, induction, generalization, deduction, discourse, intuition. Results: although contextual theories of knowledge, the use of different definitions of knowledge, and the different uses of knowledge do not resolve all the issues that the sceptic can put forward, they can be productive in giving clarity to a concept of knowledge for us. On the other hand, our knowledge will always have an element of intuition and subjectivity, which nevertheless does not equate to epistemic luck and probability. Significance/novelty: the approach to the context in general, not giving up being a Subject, may give us clarity about the sense of what it means to say "I know".
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
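One standard logic-relative presentation of the constraints (my reconstruction of the "essentially Kolmogorovian" idea, relative to the consequence relation of the chosen logic):

```latex
% Probability functions relative to a logic with consequence relation |-:
P(\bot) = 0, \qquad P(\top) = 1,
\qquad A \vdash B \;\Rightarrow\; P(A) \le P(B),
\qquad P(A) + P(B) = P(A \wedge B) + P(A \vee B)
% With classical consequence these clauses pin down the usual
% Kolmogorovian functions; with intuitionistic consequence they carve
% out the intuitionistic class, without abandoning additivity.
```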
Background: how mind functions is subject to continuing scientific discussion. A simplistic approach says that, since no convincing way has been found to model subjective experience, mind cannot exist. A second holds that, since mind cannot be described by classical physics, it must be described by quantum physics. Another perspective concerns mind's hypothesized ability to interact with the world of quanta: it should be responsible for reduction of quantum wave packets; physics producing 'Objective Reduction' is postulated to form the basis for mind-matter interactions. This presentation describes results derived from a new approach to these problems. It is based on well-established biology involving physics not previously applied to the fields of mind or consciousness studies: that of critical feedback instability.

Methods: 'self-organized criticality' in complexity biology places system loci of control at critical instabilities, whose physical properties, including information properties, are presented. Their elucidation shows that they can model hitherto unexplained properties of experience.

Results: all results depend on physical properties of critical instabilities. First, at least one feed-back or feed-forward loop must have feedback gain g = 1: information flows round the loop impress perfect images of system states back on themselves; they represent processes of perfect self-observation. This annihilates system quanta: system excitations are instability fluctuations, which cannot be quantized. Major results follow:

1. Information vectors representing criticality states must include at least one attached information loop denoting self-observation.
2. Such loop structures are attributed a function, 'registering the state's own existence', explaining
a. Subjective 'awareness of one's own presence'
b. How content-free states of awareness can be remembered (Jon Shear)
c. Subjective experience of time duration (Immanuel Kant)
d. The 'witness' property of experience, often mentioned by athletes 'in the zone'
e. The natural association between consciousness and intelligence

This novel, physically and biologically sound approach seems to satisfactorily model subjectivity. Further significant results follow:

1. Registration of external information in excited states of systems at criticality reduces external wave-packets: the new model exhibits 'Objective Reduction' of wave packets.
2. High internal coherence (postulated by Domash & Penrose), leading to a. non-separable information vector bundles; b. non-reductive states (Chalmers's criterion for experience).
3. Information that is: a. encoded in coherence negentropy; b. non-digitizable, and therefore c. computationally without digital equivalent (posited by Penrose).

Discussion and Conclusions: instability physics implies anharmonic motion, preventing excitation quantization, and is totally different from the quantum physics of simple harmonic motion at stability. Instability excitations are different from anything hitherto conceived in information science. They can model aspects of mind never previously treated, including genuine subjectivity, objective reduction of wave-packets, and inter alia all the properties given above.
Jakob Friedrich Fries (1773-1843): A Philosophy of the Exact Sciences. Shortened version of the article of the same name in: Tabula Rasa. Jenenser magazine for critical thinking, 6th of November 1994 edition.

1. Biography

Jakob Friedrich Fries was born on the 23rd of August, 1773 in Barby on the Elbe. Because Fries' father had little time, on account of his journeying, he gave up both his sons, of whom Jakob Friedrich was the elder, to the Herrnhut Teaching Institution in Niesky in 1778. In autumn 1792, Fries entered the three-year theological seminar in Niesky, where he (secretly) began to study Kant. The reading of Kant's works led Fries, for the first time, to a deep philosophical satisfaction. His enthusiasm for Kant is to be understood against the background that a considerable measure of Kant's philosophy is based on a firm foundation of what happens in an analogous and similar manner in mathematics.

During this period he also read Heinrich Jacobi's novels, as well as works of the awakening classic German literature, in particular Friedrich Schiller's works. In 1795, Fries arrived at Leipzig University to study law. During his time in Leipzig he became acquainted with Fichte's philosophy. In autumn of the same year he moved to Jena to hear Fichte at first hand, but was soon disappointed.

During his first sojourn in Jena (1796), Fries got to know the chemist A. N. Scherer, who was very influenced by the work of the chemist A. L. Lavoisier. Fries discovered, at Scherer's suggestion, the law of stoichiometric composition. Because he felt that his work still needed some time before completion, he withdrew as a private tutor to Zofingen (in Switzerland). There Fries worked on his main critical work, and studied Newton's "Philosophiae naturalis principia mathematica". He remained a lifelong admirer of Newton, whom he praised as a perfectionist of astronomy. Fries saw the final aim of his mathematical natural philosophy in the union of Newton's Principia with Kant's philosophy.

With the aim of qualifying as a lecturer, he returned to Jena in 1800. By now Fries was known from his independent writings, such as "Reinhold, Fichte and Schelling" (1st edition in 1803), and "Systems of Philosophy as an Evident Science" (1804). The relationship between G. W. F. Hegel and Fries did not develop favourably. Hegel speaks of "the leader of the superficial army", and elsewhere he writes: "he is an extremely narrow-minded bragger". Fries, in turn, took an unfavourable view of Hegel. He writes of the "Redundancy of the Hegelian dialectic" (1828). In his History of Philosophy (1837/40) he writes of Hegel, amongst other things: "Your way of philosophising seems just to give expression to nonsense in the shortest possible way". In this work, Fries appears to argue with Hegel in an objective manner, and expresses a positive attitude to his work.

In 1805, Fries was appointed professor for philosophy in Heidelberg. During his time in Heidelberg, he married Caroline Erdmann. He also sealed his friendships with W. M. L. de Wette and F. H. Jacobi. Jacobi was amongst the contemporaries who most impressed Fries during this period. In Heidelberg, Fries wrote, amongst other things, his three-volume main work New Critique of Reason (1807).

In 1816 Fries returned to Jena. When the Wartburg festival took place in 1817, Fries was among the guests and made a small speech.
1819 was the so-called "Great Year" for Fries: his wife Caroline died, and Karl Sand, a member of a student fraternity and one of Fries' former students, stabbed the author August von Kotzebue to death. Fries was punished with a ban on teaching philosophy but still received a professorship for physics and mathematics. Only after a period of years, and under restrictions, was he again allowed to teach philosophy. From then on, Fries was excluded from political influence. For the rest of his life he devoted himself once again to philosophical and natural studies. During this period, he wrote "Mathematical Natural Philosophy" (1822) and the "History of Philosophy" (1837/40).

Fries suffered a stroke on New Year's Day 1843, and a second stroke, on the 10th of August 1843, ended his life.

2. Fries' Work

Fries left an extensive body of work. A look at the subject areas he worked on makes us aware of the universality of his thinking. Amongst these subjects are: psychic anthropology, psychology, pure philosophy, logic, metaphysics, ethics, politics, religious philosophy, aesthetics, natural philosophy, mathematics, physics and medical subjects, to which, e.g., the text "Regarding the optical centre in the eye together with general remarks about the theory of seeing" (1839) bears witness. With popular philosophical writings like the novel "Julius and Evagoras" (1822), or the arabesque "Longing, and a Trip to the Middle of Nowhere" (1820), he tried to make his philosophy accessible to a broader public. Anthropological considerations shape the methodical basis of his philosophy, and to this end, he provides the following didactic instruction for the study of his work: "If somebody wishes to study philosophy on the basis of this guide, I would recommend that after studying natural philosophy, a strict study of logic should follow in order to peruse metaphysics and its applied teachings more rapidly, followed by a strict study of criticism, followed once again by a return to an even closer study of metaphysics and its applied teachings."

3. Continuation of Fries' work through the Friesian School

Fries' ideas found general acceptance amongst scientists and mathematicians. A large part of the followers of the "Fries School of Thought" had a scientific or mathematical background. Amongst them were the biologist Matthias Jakob Schleiden, the mathematics and science specialist philosopher Ernst Friedrich Apelt, the zoologist Oscar Schmidt, and the mathematician Oscar Xavier Schlömilch. Between the years 1847 and 1849, the treatises of the "Fries School of Thought" appeared, with which the publishers aimed to pursue philosophy according to the model of the natural sciences. In the Kant-Fries philosophy, they saw the realisation of this ideal. The history of the "New Fries School of Thought" began in 1903. It was in this year that the philosopher Leonard Nelson gathered together a small discussion circle in Goettingen. Amongst the founding members of this circle were: A. Rüstow, C. Brinkmann and H. Goesch. In 1904 L. Nelson, A. Rüstow, H. Goesch and the student W. Mecklenburg travelled to Thuringia to find the missing Fries writings. In the same year, G. Hessenberg, K. Kaiser and Nelson published the first pamphlet of the first volume of the "Treatises of the Fries School of Thought, New Edition".

The school set out with the aim of searching for the missing Fries texts, and re-publishing them with a view to re-opening discussion of Fries' brand of philosophy.
The members of the circle met regularly for discussions, and larger conferences took place as well, mostly during the holidays. Among the speakers were Otto Apelt, Otto Berg, Paul Bernays, G. Fraenkel, K. Grelling, G. Hessenberg, A. Kronfeld, O. Meyerhof, L. Nelson and R. Otto. On the 1st of March 1913, the Jakob-Friedrich-Fries Society was founded. While the Friesian school worked continuously on the advancement of the Kant-Fries philosophy, the main task of the Jakob-Friedrich-Fries Society was the dissemination of the school's publications. In May/June 1914, the two organisations took part in their last common conference before the outbreak of the First World War. Several members died during the war; others returned disabled. The next conference took place in 1919, and a second followed in 1921, but work as intensive as that undertaken between 1903 and 1914 was no longer possible.

Leonard Nelson died in October 1927. In the 1930s, the sixth and final volume of the "Treatises of the Friesian School, New Series" was published; Franz Oppenheimer, Otto Meyerhof, Minna Specht and Grete Hermann were involved in its publication.

4. About Mathematical Natural Philosophy

In 1822, Fries' "Mathematical Natural Philosophy" appeared. In it, Fries rejects the speculative natural philosophy of his time, above all Schelling's. A study of nature founded on speculative philosophy ends with the mere collection, arrangement and ordering of well-known facts; only a mathematical natural philosophy can deliver the necessary explanatory grounds. The basic dictum of his mathematical natural philosophy is: "All natural theories must be definable using purely mathematically determinable reasons of explanation." Fries is of the opinion that science can attain completeness only by subordinating the empirical facts to the metaphysical categories and mathematical laws.

The crux of Fries' natural philosophy is the thought that mathematics must be made fruitful for the natural sciences. Pure mathematics, however, displays only empty abstraction; to be applicable to the sensory world it requires an intermediary connection: mathematics must be connected to metaphysics. Pure mechanics consists of three parts: a) a theory of geometrical motion, which considers solely the direction of the motion; b) a theory of kinematics, which in addition considers velocity; c) a theory of dynamics, which incorporates mass and force as well as direction and velocity.

Fries' natural philosophy is of great interest for its methodology, particularly with regard to the doctrine of "leading maxims". Fries calls these leading maxims "heuristic", "because they are principal rules for scientific invention".

Fries' philosophy found great recognition with Carl Friedrich Gauss, among others. Fries asked for Gauss's opinion on his work "An Attempt at a Criticism based on the Principles of the Probability Calculus" (1842), and Gauss also gave his opinion on "Mathematical Natural Philosophy" (1822) and on Fries' "History of Philosophy".
Gauss acknowledged Fries' philosophy and wrote in a letter to him: "I have always had a great predilection for philosophical speculation, and now I am all the more happy to have a reliable teacher in you in the study of the destinies of science, from the most ancient up to the latest times, as I have not always found the desired satisfaction in my own reading of the writings of some of the philosophers. In particular, the writings of several famous (maybe better, so-called famous) philosophers who have appeared since Kant have reminded me of the sieve of a goat-milker, or to use a modern image instead of an old-fashioned one, of Münchhausen's plait, with which he pulled himself out of the water. These amateurs would not dare make such a confession before their masters; it would not happen were they to consider the case upon its merits. I have often regretted not living in your locality, so as to be able to glean much pleasurable entertainment from philosophical verbal discourse."

The starting point of the new adoption of Fries was Nelson's article "The Critical Method and the Relation of Psychology to Philosophy" (1904). Nelson devotes special attention to Fries' re-interpretation of Kant's concept of deduction. Fries gives Kant's critique an anthropological reading: he is guided by the idea that one can investigate psychologically which knowledge we possess "a priori" and how it arises, so that we come to recognise our "a priori" knowledge in an empirical way. Fries understands deduction as the demonstration that the basic metaphysical principles reside darkly in our awareness and are opened up only through conscious reflection.

Nelson pointed to an analogy between Fries' deduction and modern metamathematics: just as, in the anthropological deduction, the contents of metaphysics become the object of critical investigation, so, in David Hilbert's view, the contents of mathematics become the object of metamathematics.