This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or "toy" model of quantum mechanics over sets (QM/sets). There are two parts. The notion of an "event" is reinterpreted from being an epistemological state of indefiniteness to being an objective state of indefiniteness. And the mathematical framework of finite probability theory is recast as the quantum probability calculus for QM/sets. The point is not to clarify finite probability theory but to elucidate quantum mechanics itself by seeing some of its quantum features in a classical setting.
Contrary to Bell’s theorem, it is demonstrated that the quantum correlation can be approximated using classical probability theory. Hence one may not conclude from experiment that all local hidden-variable theories are ruled out by the violation of an inequality.
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
Epistemic closure under known implication is the principle that knowledge of p and knowledge of p → q, together, imply knowledge of q. This principle is intuitive, yet several putative counterexamples have been formulated against it. This paper addresses the question, why is epistemic closure both intuitive and prone to counterexamples? In particular, the paper examines whether probability theory can offer an answer to this question based on four strategies. The first probability-based strategy rests on the accumulation of risks. The problem with this strategy is that risk accumulation cannot accommodate certain counterexamples to epistemic closure. The second strategy is based on the idea of evidential support, that is, a piece of evidence supports a proposition whenever it increases the probability of the proposition. This strategy makes progress and can accommodate certain putative counterexamples to closure. However, this strategy also gives rise to a number of counterintuitive results. Finally, there are two broadly probabilistic strategies, one based on the idea of resilient probability and the other on the idea of assumptions that are taken for granted. These strategies are promising but are prone to some of the shortcomings of the second strategy. All in all, I conclude that each strategy fails. Probability theory, then, is unlikely to offer the account we need.
In this text the ancient philosophical question of determinism (“Does every event have a cause?”) will be re-examined. In the philosophy of science and physics communities the orthodox position states that the physical world is indeterministic: quantum events would have no causes but happen by irreducible chance. Arguably the clearest theorem that leads to this conclusion is Bell’s theorem. The commonly accepted ‘solution’ to the theorem is ‘indeterminism’, in agreement with the Copenhagen interpretation. Here it is recalled that indeterminism is not really a physical but rather a philosophical hypothesis, and that it has counterintuitive and far-reaching implications. At the same time another solution to Bell’s theorem exists, often termed ‘superdeterminism’ or ‘total determinism’. Superdeterminism appears to be a philosophical position that is centuries and probably millennia old: it is for instance Spinoza’s determinism. If Bell’s theorem has both indeterministic and deterministic solutions, choosing between determinism and indeterminism is a philosophical question, not a matter of physical experimentation, as is widely believed. If it is impossible to use physics for deciding between both positions, it is legitimate to ask which philosophical theories are of help. Here it is argued that probability theory – more precisely the interpretation of probability – is instrumental for advancing the debate. It appears that the hypothesis of determinism allows one to answer a series of precise questions from probability theory, while indeterminism remains silent on these questions. From this point of view determinism appears to be the more reasonable assumption, after all.
This dissertation is an analysis of the development of dialectic and argumentation theory in post-classical Islamic intellectual history. The central concerns of the thesis are: treatises on the theoretical understanding of the concept of dialectic and argumentation theory, and how, in practice, the concept of dialectic, as expressed in the Greek classical tradition, was received and used by five communities in the Islamic intellectual camp. It shows how dialectic as an argumentative discourse diffused into five communities (theologians, poets, grammarians, philosophers and jurists) and how these local dialectics that the individual communities developed fused into a single system to form a general argumentation theory (adab al-bahth) applicable to all fields. I evaluate a treatise by Shams al-Din Samarqandi (d.702/1302), the founder of this general theory, and the treatises that were written after him as a result of his work. I concentrate specifically on work by 'Adud al-Din al-Iji (d.756/1355), Sayyid Sharif al-Jurjani (d.816/1413), Taşköprüzâde (d.968/1561), Saçaklızâde (d.1150/1737) and Gelenbevî (d.1205/1791) and analyze how each writer (from Samarqandi to Gelenbevî) altered the shape of argumentative discourse and how later intellectuals in the post-classical Islamic world responded to that discourse bequeathed by their predecessors. What is striking about the period that this dissertation investigates (from 1300-1800) is the persistence of what could be called the linguistic turn in argumentation theory. After a centuries-long run, the jadal-based dialectic of the classical period was displaced by a new argumentation theory, which was dominantly linguistic in character. This linguistic turn in argumentation dates from the final quarter of the fourteenth century in Iji's impressively prescient work on 'ilm al-wad'. This idea, which finally surfaced in the post-classical period, that argumentation is about definition and that, therefore, defining is the business of language—even perhaps, that language is the only available medium for understanding and being understood—affected the way that argumentation theory was processed throughout most of the period in question. The argumentative discourse that started with Ibn al-Rawandi in the third/ninth century left a permanent imprint on Islamic intellectual history, which was then full of concepts, terminology and objectives from this discourse up until the late nineteenth century. From this perspective, Islamic intellectual history can be read as the tension between two languages: the "language of dialectic" (jadal) and the "language of demonstration" (burhan), each of which refers not only to a significant feature of that history, but also to a feature that could dramatically alter the interpretation of that history.
This paper is concerned with representations of belief by means of nonadditive probabilities of the Dempster-Shafer (DS) type. After surveying some foundational issues and results in the DS theory, including Suppes's related contributions, the paper proceeds to analyze the connection of the DS theory with some of the work currently pursued in epistemic logic. A preliminary investigation of the modal logic of belief functions à la Shafer is made. There it is shown that the Alchourrón-Gärdenfors-Makinson (AGM) logic of belief change is closely related to the DS theory. The final section compares the critique of Bayesianism which underlies the present paper with some important objections raised by Suppes against this doctrine.
In The Mind Doesn’t Work that Way, Jerry Fodor argues that mental representations have context sensitive features relevant to cognition, and that, therefore, the Classical Computational Theory of Mind (CTM) is mistaken. We call this the Globality Argument. This is an in principle argument against CTM. We argue that it is self-defeating. We consider an alternative argument constructed from materials in the discussion, which avoids the pitfalls of the official argument. We argue that it is also unsound and that, while it is an empirical issue whether context sensitive features of mental representations are relevant to cognition, it is empirically implausible.
It is shown that mature scientific economic theory is a set of propositions that describe the relationship between theoretical objects of two types - basic objects and derivative ones. The set of basic objects makes up the aggregate of initial idealizations (the Fundamental Theoretical Scheme or FTS) with no direct reference to experimental data. The derivative theoretical objects are formed from the basic ones according to certain rules. The sets of derivative objects form partial theoretical schemes or PTS. Any mature scientific economic theory grows due to transitions, in order to describe each new experimental situation, from FTS to PTS. Each PTS construction from the FTS represents a problem that cannot be reduced to a strict algorithm.
Conceptual Metaphor Theory makes some strong claims against so-called Classical Theory, which spans the accounts of metaphors from Aristotle to Davidson. Most of these theories, because of their traditional literal-metaphorical distinction, fail to take into account the phenomenon of conceptual metaphor. I argue that the underlying mechanism for explaining metaphor bears some striking resemblances among all of these theories. A mapping between two structures is always expressed. Conceptual Metaphor Theory insists, however, that the literal-metaphorical distinction of Classical Theories is empirically wrong. I claim that this criticism is based rather on terminological decisions than on empirical issues. Conceptual Metaphor Theory focusses primarily on conventional metaphors and struggles to extend its mechanism to novel metaphors, whereas Classical Theories focus on novel metaphors and struggle to extend their mechanisms to conventional metaphors. Since all of these theories study metaphors from the synchronic point of view, they are unable to take into account any semantic change. A diachronic perspective is what we need here, one which would allow us to explain the role of metaphor in semantic change and the development of language in general.
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.
This work is a conceptual analysis of certain recent developments in the mathematical foundations of Classical and Quantum Mechanics which have made it possible to formulate both theories in a common language. From the algebraic point of view, the set of observables of a physical system, be it classical or quantum, is described by a Jordan-Lie algebra. From the geometric point of view, the space of states of any system is described by a uniform Poisson space with transition probability. Both these structures are here perceived as formal translations of the fundamental twofold role of properties in Mechanics: they are at the same time quantities and transformations. The question then becomes how to understand the precise articulation between these two roles. The analysis will show that Quantum Mechanics can be thought of as distinguishing itself from Classical Mechanics by a compatibility condition between properties-as-quantities and properties-as-transformations. Moreover, this dissertation shows the existence of a tension between a certain "abstract way" of conceiving mathematical structures, used in the practice of mathematical physics, and the necessary capacity to specify particular states or observables. It then becomes important to understand how, within the formalism, one can construct a labelling scheme. The “Chase for Individuation” is the analysis of different mathematical techniques which attempt to overcome this tension. In particular, we discuss how group theory furnishes a partial solution.
In the paper we will employ set theory to study the formal aspects of quantum mechanics without explicitly making use of space-time. It is demonstrated that von Neumann and Zermelo numeral sets, previously effectively used in the explanation of Hardy’s paradox, follow a Heisenberg quantum form. Here monadic union plays the role of time derivative. The logical counterpart of monadic union plays the part of the Hamiltonian in the commutator. The use of numerals and monadic union in the classical probability resolution of Hardy’s paradox [1] is supported with the present derivation of a commutator for sets.
Evolutionary theory (ET) is teeming with probabilities. Probabilities exist at all levels: the level of mutation, the level of microevolution, and the level of macroevolution. This uncontroversial claim raises a number of contentious issues. For example, is the evolutionary process (as opposed to the theory) indeterministic, or is it deterministic? Philosophers of biology have taken different sides on this issue. Millstein (1997) has argued that we are not currently able to answer this question, and that even scientific realists ought to remain agnostic concerning the determinism or indeterminism of evolutionary processes. If this argument is correct, it suggests that, whatever we take probabilities in ET to be, they must be consistent with either determinism or indeterminism. This raises some interesting philosophical questions: How should we understand the probabilities used in ET? In other words, what is meant by saying that a certain evolutionary change is more or less probable? Which interpretation of probability is the most appropriate for ET? I argue that the probabilities used in ET are objective in a realist sense, if not in an indeterministic sense. Furthermore, there are a number of interpretations of probability that are objective and would be consistent with ET under determinism or indeterminism. However, I argue that evolutionary probabilities are best understood as propensities of population-level kinds.
We evaluate classical probability in relation to the random generation of a Shakespearean sonnet by a typing monkey and the random generation of universes in a World Ensemble based on various multiverse models involving eternal inflation. We calculate that it would take a monkey roughly 10^942 years to type a Shakespearean sonnet, which pushes the scenario into a World Ensemble. The evaluation of a World Ensemble based on various models of eternal inflation suggests that there is no middle ground between eternal Poincaré-Zermelo recurrence and a probability of 0 as regards the natural generation of the initial conditions of the universe.
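The order of magnitude in the monkey calculation can be sanity-checked with a few lines of arithmetic. The following is a minimal sketch; the keyboard size, sonnet length, and typing speed are illustrative assumptions rather than the paper's exact parameters, so the exponent lands in the same broad regime as the quoted 10^942 figure rather than reproducing it exactly.

```python
import math

# Illustrative assumptions (not the paper's exact parameters):
KEYS = 35             # keys on the monkey's typewriter
SONNET_CHARS = 620    # characters in a 14-line Shakespearean sonnet
STROKES_PER_SEC = 10  # the monkey's typing speed

# Each attempt succeeds with probability (1/KEYS)**SONNET_CHARS, so the
# expected number of attempts is KEYS**SONNET_CHARS. Work in log10 to
# avoid floating-point overflow.
log10_attempts = SONNET_CHARS * math.log10(KEYS)

seconds_per_attempt = SONNET_CHARS / STROKES_PER_SEC
seconds_per_year = 3600 * 24 * 365
log10_years = log10_attempts + math.log10(seconds_per_attempt / seconds_per_year)

print(f"expected attempts ~ 10^{log10_attempts:.0f}")    # ~10^957
print(f"expected wait     ~ 10^{log10_years:.0f} years")  # ~10^952
```

Modest changes to the assumed keyboard or sonnet length shift the exponent by tens of orders of magnitude, which is exactly why the scenario gets pushed into a World Ensemble.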
With his influence on the development of physiology, physics and geometry, Hermann von Helmholtz – like few scientists of the second half of the 19th century – is representative of the research in natural science in Germany. The development of his understanding of science is not less representative. Until the late sixties, he emphatically claimed the truth of science; later on, he began to see the conditions for the validity of scientific knowledge in relative terms, and this can, in summary, be referred to as hypothesizing. Already in the past century, Helmholtz made first approaches to an understanding of science which were incompatible with his own former position and which pointed to the modern age to an astonishingly large extent. A comparison with Karl R. Popper's logic of research will illustrate how closely he nevertheless approached modern understanding of science. In Popper's logic of research, hypothesizing of scientific knowledge is definitely much more advanced than in Helmholtz's theory of science. What begins vaguely to emerge with Helmholtz has already become an explicitly formulated programme with Popper. Although Helmholtz and Popper are not on a direct line of epistemological development and Popper refers to Helmholtz only rarely and casually, there are in fact surprising points of contact which have not been taken notice of so far and which appear above all if one looks at Helmholtz's understanding of science against the background of Popper's logic of research.
Given a few assumptions, the probability of a conjunction is raised, and the probability of its negation is lowered, by conditionalising upon one of the conjuncts. This simple result appears to bring Bayesian confirmation theory into tension with the prominent dogmatist view of perceptual justification – a tension often portrayed as a kind of ‘Bayesian objection’ to dogmatism. In a recent paper, David Jehle and Brian Weatherson observe that, while this crucial result holds within classical probability theory, it fails within intuitionistic probability theory. They conclude that the dogmatist who is willing to take intuitionistic logic seriously can make a convincing reply to the Bayesian objection. In this paper, I argue that this conclusion is premature – the Bayesian objection can survive the transition from classical to intuitionistic probability, albeit in a slightly altered form. I shall conclude with some general thoughts about what the Bayesian objection to dogmatism does and doesn’t show.
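The "simple result" driving the objection can be verified in two steps, assuming only the ratio definition of conditional probability and 0 < P(A) ≤ 1:

$$
P(A \wedge B \mid A) = \frac{P(A \wedge B)}{P(A)} \;\geq\; P(A \wedge B),
$$
$$
P(\neg(A \wedge B) \mid A) = 1 - P(A \wedge B \mid A) \;\leq\; 1 - P(A \wedge B) = P(\neg(A \wedge B)).
$$

The second line leans on the classical complementation principle P(¬X) = 1 − P(X), which is among the principles intuitionistic probability theory weakens; this is why the result can fail in the intuitionistic setting that Jehle and Weatherson exploit.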
Many epistemologists hold that an agent can come to justifiably believe that p is true by seeing that it appears that p is true, without having any antecedent reason to believe that visual impressions are generally reliable. Certain reliabilists think this, at least if the agent’s vision is generally reliable. And it is a central tenet of dogmatism (as described by Pryor (2000) and Pryor (2004)) that this is possible. Against these positions it has been argued (e.g. by Cohen (2005) and White (2006)) that this violates some principles from probabilistic learning theory. To see the problem, let’s note what the dogmatist thinks we can learn by paying attention to how things appear. (The reliabilist says the same things, but we’ll focus on the dogmatist.) Suppose an agent receives an appearance that p, and comes to believe that p. Letting Ap be the proposition that it appears to the agent that p, and → be the material implication, we can say that the agent learns that p, and hence is in a position to infer Ap → p, once they receive the evidence Ap. This is surprising, because we can prove the following.
These notes were folded into the published paper "Probability and nonclassical logic". Revising semantics and logic has consequences for the theory of mind. Standard formal treatments of rational belief and desire make classical assumptions. If we are to challenge the presuppositions, we must indicate what kind of theory is going to take their place. Consider probability theory interpreted as an account of ideal partial belief. But if some propositions are neither true nor false, or are half true, or whatever—then it’s far from clear that our degrees of belief in such a proposition and its negation should sum to 1, as classical probability theory requires. There are extant proposals in the literature for generalizing (categorical) probability theory to a non-classical setting, and we will use these below. But subjective probabilities themselves stand in functional relations to other mental states, and we need to trace the knock-on consequences of revisionism for this interrelationship (arguably, degrees of belief only count as kinds of belief in virtue of standing in these functional relationships).
For languages which conform to classical logic, extensions are constructed such that they possess a consistent theory of truth. Every language whose sentences have meanings which make them true or false is shown to have an extension possessing a consistent theory of truth, when that extension is interpreted by the meanings of its sentences.
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets, so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
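The "normalized counting measure on distinctions" is easy to compute directly. Here is a minimal sketch (the universe and partition are illustrative); it also checks the pair count against the closed form 1 − Σ_B (|B|/|U|)², which follows from |dit(π)| = |U|² − Σ_B |B|².

```python
from itertools import product

def block_of(partition, element):
    """Index of the block of the partition containing the element."""
    return next(i for i, block in enumerate(partition) if element in block)

def logical_entropy(partition, universe):
    """Normalized counting measure on distinctions: the fraction of
    ordered pairs (u, v) whose elements lie in different blocks."""
    distinctions = sum(
        1 for u, v in product(universe, repeat=2)
        if block_of(partition, u) != block_of(partition, v)
    )
    return distinctions / len(universe) ** 2

# Illustrative example: U = {a,b,c,d} with partition {{a,b},{c},{d}}.
U = {"a", "b", "c", "d"}
pi = [{"a", "b"}, {"c"}, {"d"}]
print(logical_entropy(pi, U))                           # 0.625
# Closed form: 1 - (4 + 1 + 1)/16 = 10/16 = 0.625
print(1 - sum((len(B) / len(U)) ** 2 for B in pi))      # 0.625
```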
Throughout this paper, we are trying to show how and why our mathematical framework seems inappropriate to solve problems in the Theory of Computation. More exactly, the concept of turning back in time in paradoxes causes inconsistency in modeling of the concept of Time in some semantic situations. As we see in the first chapter, by introducing a version of the “Unexpected Hanging Paradox”, we first attempt to open a new explanation for some paradoxes. In the second step, by applying this paradox, it is demonstrated that any formalized system for the Theory of Computation based on Classical Logic and the Turing Model of Computation leads us to a contradiction. We conclude that our mathematical framework is inappropriate for the Theory of Computation. Furthermore, the result provides us a reason that many problems in Complexity Theory resist solution. (This work was completed on 2017-05-02, posted on viXra on 2017-05-14, and presented at UNILOG 2018, Vichy.)
Some variants of quantum theory theorize dogmatic "unimodal" states-of-being, and are based on a hodge-podge classical-quantum language. They are based on ontic syntax, but pragmatic semantics. This error was termed semantic inconsistency [1]. Measurement seems to be the central problem of these theories, and is widely discussed in their interpretation. Copenhagen theory deviates from this prescription, which is modeled on experience. A complete quantum experiment is "bimodal". An experimenter creates the system-under-study in the initial mode of the experiment, and annihilates it in the final. The experimental intervention lies beyond the theory. I theorize the most rudimentary bimodal quantum experiments studied by Finkelstein [2], and deduce a "bimodal probability density" P=|In><Fin| to represent complete quantum experiments. It resembles core insights of the Copenhagen theory.
Stochastic independence (SI) has a complex status in probability theory. It is not part of the definition of a probability measure, but it is nonetheless an essential property for the mathematical development of this theory, hence a property that any theory on the foundations of probability should be able to account for. Bayesian decision theory, which is one such theory, appears to be wanting in this respect. In Savage's classic treatment, postulates on preferences under uncertainty are shown to entail a subjective expected utility (SEU) representation, and this permits asserting only the existence and uniqueness of a subjective probability, regardless of its properties. What is missing is a preference postulate that would specifically connect with the SI property. The paper develops a version of Bayesian decision theory that fills this gap. In a framework of multiple sources of uncertainty, we introduce preference conditions that jointly entail the SEU representation and the property that the subjective probability in this representation treats the sources of uncertainty as being stochastically independent. We give two representation theorems of graded complexity to demonstrate the power of our preference conditions. Two sections of comments follow, one connecting the theorems with earlier results in Bayesian decision theory, and the other connecting them with the foundational discussion on SI in probability theory and the philosophy of probability. Appendices offer more technical material.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
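In the Kolmogorov setting, Popper's observation reduces to a one-line identity. Writing A ⊃ B for the material conditional ¬(A ∧ ¬B) and assuming P(A) > 0:

$$
P(A \supset B) - P(B \mid A)
= \bigl(1 - P(A)\,(1 - P(B \mid A))\bigr) - P(B \mid A)
= (1 - P(A))\,(1 - P(B \mid A)) \;\geq\; 0,
$$

so the material conditional's probability exceeds the conditional probability except in the degenerate cases P(A) = 1 or P(B | A) = 1.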
Stochastic independence has a complex status in probability theory. It is not part of the definition of a probability measure, but it is nonetheless an essential property for the mathematical development of this theory. Bayesian decision theorists such as Savage can be criticized for being silent about stochastic independence. From their current preference axioms, they can derive no more than the definitional properties of a probability measure. In a new framework of twofold uncertainty, we introduce preference axioms that entail not only these definitional properties, but also the stochastic independence of the two sources of uncertainty. This goes some way towards filling a curious lacuna in Bayesian decision theory.
This paper explores the implications of empirical theories of migration for normative accounts of migration and distributive justice. It examines neo-classical economics, world-systems theory, dual labor market theory, and feminist approaches to migration and contends that neo-classical economic theory in isolation provides an inadequate understanding of migration. Other theories provide a fuller account of how national and global economic, political, and social institutions cause and shape migration flows by actively affecting people's opportunity sets in source countries and by admitting people according to social categories such as class and gender. These empirical theories reveal the causal impact of institutions regulating migration and clarify moral obligations frequently overlooked by normative theorists.
Recent developments in pure mathematics and in mathematical logic have uncovered a fundamental duality between "existence" and "information." In logic, the duality is between the Boolean logic of subsets and the logic of quotient sets, equivalence relations, or partitions. The analogue to an element of a subset is the notion of a distinction of a partition, and that leads to a whole stream of dualities or analogies--including the development of new logical foundations for information theory parallel to Boole's development of logical finite probability theory. After outlining these dual concepts in mathematical terms, we turn to a more metaphysical speculation about two dual notions of reality, a fully definite notion using Boolean logic and appropriate for classical physics, and the other objectively indefinite notion using partition logic which turns out to be appropriate for quantum mechanics. The existence-information duality is used to intuitively illustrate these two dual notions of reality. The elucidation of the objectively indefinite notion of reality leads to the "killer application" of the existence-information duality, namely the interpretation of quantum mechanics.
The singularities that result from solving Einstein's equations in general relativity were, and still are, the subject of many scientific debates: Are there singularities in spacetime, or not? Was the Big Bang an initial singularity? If singularities exist, what is their ontology? Is the general theory of relativity a theory that has shown its limits in this case?
Is quantum mechanics about ‘states’? Or is it basically another kind of probability theory? It is argued that the elementary formalism of quantum mechanics operates as a well-justified alternative to ‘classical’ instantiations of a probability calculus. Its providing a general framework for prediction accounts for its distinctive traits, which one should be careful not to mistake for reflections of any strange ontology. The suggestion is also made that quantum theory unwittingly emerged, in Schrödinger’s formulation, as a ‘lossy’ by-product of a quantum-mechanical variant of the Hamilton-Jacobi equation. As it turns out, the effectiveness of quantum theory qua predictive algorithm makes up for the computational impracticability of that master equation.
Bayesian confirmation theory is rife with confirmation measures. Many of them differ from each other in important respects. It turns out, though, that all the standard confirmation measures in the literature run counter to the so-called “Reverse Matthew Effect” (“RME” for short). Suppose, to illustrate, that H1 and H2 are equally successful in predicting E in that p(E | H1)/p(E) = p(E | H2)/p(E) > 1. Suppose, further, that initially H1 is less probable than H2 in that p(H1) < p(H2). Then by RME it follows that the degree to which E confirms H1 is greater than the degree to which it confirms H2. But by all the standard confirmation measures in the literature, in contrast, it follows that the degree to which E confirms H1 is less than or equal to the degree to which it confirms H2. It might seem, then, that RME should be rejected as implausible. Festa (2012), however, argues that there are scientific contexts in which RME holds. If Festa’s argument is sound, it follows that there are scientific contexts in which none of the standard confirmation measures in the literature is adequate. Festa’s argument is thus interesting, important, and deserving of careful examination. I consider five distinct respects in which E can be related to H, use them to construct five distinct ways of understanding confirmation measures, which I call “Increase in Probability”, “Partial Dependence”, “Partial Entailment”, “Partial Discrimination”, and “Popper Corroboration”, and argue that each such way runs counter to RME. The result is that it is not at all clear that there is a place in Bayesian confirmation theory for RME.
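The abstract's setup is concrete enough to check numerically. A minimal sketch, with illustrative probability assignments and just two of the standard measures (the probability difference and probability ratio measures):

```python
# Illustrative probabilities satisfying the abstract's setup:
# equal predictive success, p(E|H1)/p(E) = p(E|H2)/p(E) = 2 > 1,
# but H1 initially less probable than H2.
pH1, pH2 = 0.1, 0.3
k = 2.0                            # common likelihood ratio p(E|Hi)/p(E)
pH1_E, pH2_E = k * pH1, k * pH2    # posteriors p(Hi|E), by Bayes' theorem

# Two standard confirmation measures:
difference = lambda prior, post: post - prior
ratio      = lambda prior, post: post / prior

print(difference(pH1, pH1_E), difference(pH2, pH2_E))  # 0.1 < 0.3
print(ratio(pH1, pH1_E), ratio(pH2, pH2_E))            # 2.0 == 2.0

# RME would demand that E confirm the initially less probable H1 *more*;
# both measures instead deliver "less" or "equal", as the abstract says.
```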
Bayesian confirmation theory is rife with confirmation measures. Zalabardo focuses on the probability difference measure, the probability ratio measure, the likelihood difference measure, and the likelihood ratio measure. He argues that the likelihood ratio measure is adequate, but each of the other three measures is not. He argues for this by setting out three adequacy conditions on confirmation measures and arguing in effect that all of them are met by the likelihood ratio measure but not by any of the other three measures. Glass and McCartney, hereafter “G&M,” accept the conclusion of Zalabardo’s argument along with each of the premises in it. They nonetheless try to improve on Zalabardo’s argument by replacing his third adequacy condition with a weaker condition. They do this because of a worry to the effect that Zalabardo’s third adequacy condition runs counter to the idea behind his first adequacy condition. G&M have in mind confirmation in the sense of increase in probability: the degree to which E confirms H is a matter of the degree to which E increases H’s probability. I call this sense of confirmation “IP.” I set out four ways of precisifying IP. I call them “IP1,” “IP2,” “IP3,” and “IP4.” Each of them is based on the assumption that the degree to which E increases H’s probability is a matter of the distance between p(H | E) and a certain other probability involving H. I then evaluate G&M’s argument in light of them.
Many recent theories of epistemic discourse exploit an informational notion of consequence, i.e. a notion that defines entailment as preservation of support by an information state. This paper investigates how informational consequence fits with probabilistic reasoning. I raise two problems. First, all informational inferences that are not also classical inferences are, intuitively, probabilistically invalid. Second, all these inferences can be exploited, in a systematic way, to generate triviality results. The informational theorist is left with two options, both of them radical: they can either deny that epistemic modal claims have probability at all, or they can move to a nonstandard probability theory.
The author’s studies in the philosophy of science, culminating in this book, were inspired by his previous research in the domains of classical and quantum gravity. In fact it was the need to bring some order into the family of modern classical theories of gravitation and to build up the appropriate conceptual foundations of quantum gravity that forced the author to create his own methodological model of theory change, which he applies rather successfully to the most controversial case study, the Lorentz-Einstein transition.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or toy model of quantum mechanics over sets (QM/sets). There have been several previous attempts to develop a quantum-like model with the base field of ℂ replaced by ℤ₂. Since there are no inner products on vector spaces over finite fields, the problem is to define the Dirac brackets and the probability calculus. The previous attempts all required the brackets to take values in ℤ₂. But the usual QM brackets <ψ|ϕ> give the "overlap" between states ψ and ϕ, so for subsets S,T⊆U, the natural definition is <S|T>=|S∩T| (taking values in the natural numbers). This allows QM/sets to be developed with a full probability calculus that turns out to be a non-commutative extension of classical Laplace-Boole finite probability theory. The pedagogical model is illustrated by giving simple treatments of the indeterminacy principle, the double-slit experiment, Bell's Theorem, and identical particles in QM/Sets. A more technical appendix explains the mathematics behind carrying some vector space structures between QM over ℂ and QM/Sets over ℤ₂.
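The bracket definition quoted in the abstract is concrete enough to compute with. Below is a minimal sketch; the universe and subsets are illustrative, and the conditional probability form Pr(T|S) = |T∩S|/|S| is the normalization one would expect from the brackets, matching the Laplace-Boole rule for equiprobable outcomes.

```python
def bracket(S, T):
    """QM/sets bracket <S|T> = |S ∩ T|: the 'overlap' of two states
    (subsets of the universe U), taking values in the naturals."""
    return len(S & T)

def prob(T, S):
    """Conditional probability Pr(T|S) = <T|S>/<S|S> = |T ∩ S|/|S|,
    the Laplace-Boole probability of event T given state S."""
    return bracket(T, S) / bracket(S, S)

# Illustrative universe, state, and event:
U = {1, 2, 3, 4, 5, 6}
S = {1, 2, 3, 4}        # an indefinite 'state' over U
T = {3, 4, 5}           # an event

print(bracket(S, T))    # 2, the overlap |S ∩ T|
print(prob(T, S))       # 0.5, recovering classical finite probability
# Measuring S in the U-basis gives outcome {u} with probability
# |{u} ∩ S|/|S| — equiprobable over the elements of S:
print({u: prob({u}, S) for u in U})
```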
Modal logic is one of philosophy’s many children. As a mature adult it has moved out of the parental home and is nowadays straying far from its parent. But the ties are still there: philosophy is important to modal logic, modal logic is important for philosophy. Or, at least, this is a thesis we try to defend in this chapter. Limitations of space have ruled out any attempt at writing a survey of all the work going on in our field—a book would be needed for that. Instead, we have tried to select material that is of interest in its own right or exemplifies noteworthy features in interesting ways. Here are some themes that have guided us throughout the writing: • The back-and-forth between philosophy and modal logic. There has been a good deal of give-and-take in the past. Carnap tried to use his modal logic to throw light on old philosophical questions, thereby inspiring others to continue his work and still others to criticise it. He certainly provoked Quine, who in his turn provided—and continues to provide—a healthy challenge to modal logicians. And Kripke’s and David Lewis’s philosophies are connected, in interesting ways, with their modal logic. Analytic philosophy would have been a lot different without modal logic! • The interpretation problem. The problem of providing a certain modal logic with an intuitive interpretation should not be conflated with the problem of providing a formal system with a model-theoretic semantics. An intuitively appealing model-theoretic semantics may be an important step towards solving the interpretation problem, but only a step. One may compare this situation with that in probability theory, where definitions of concepts like ‘outcome space’ and ‘random variable’ are orthogonal to questions about “interpretations” of the concept of probability. • The value of formalisation. Modal logic sets standards of precision, which are a challenge to—and sometimes a model for—philosophy. Classical philosophical questions can be sharpened and seen from a new perspective when formulated in a framework of modal logic. On the other hand, representing old questions in a formal garb has its dangers, such as simplification and distortion. • Why modal logic rather than classical (first or higher order) logic? The idioms of modal logic—today there are many!—seem better to correspond to human ways of thinking than ordinary extensional logic. (Cf. Chomsky’s conjecture that the NP + VP pattern is wired into the human brain.) In his An Essay in Modal Logic (1951) von Wright distinguished between four kinds of modalities: alethic (modes of truth: necessity, possibility and impossibility), epistemic (modes of being known: known to be true, known to be false, undecided), deontic (modes of obligation: obligatory, permitted, forbidden) and existential (modes of existence: universality, existence, emptiness). The existential modalities are not usually counted as modalities, but the other three categories are exemplified in three sections into which this chapter is divided. Section 1 is devoted to alethic modal logic and reviews some main themes at the heart of philosophical modal logic. Sections 2 and 3 deal with topics in epistemic logic and deontic logic, respectively, and are meant to illustrate two different uses that modal logic or indeed any logic can have: it may be applied to already existing (non-logical) theory, or it can be used to develop new theory.
An inequality in quantum mechanics, which does not appear to be well known, is derived by elementary means and shown to be quite useful. The inequality applies to 'all' operators and 'all' pairs of quantum states, including mixed states. It generalizes the rule of the orthogonality of eigenvectors for distinct eigenvalues and is shown to imply all the Robertson generalized uncertainty relations. It severely constrains the difference between probabilities obtained from 'close' quantum states and the different responses they can have to unitary transformations. Thus, it is dubbed a master inequality. With appropriate definitions the inequality also holds throughout general probability theory and appears not to be well known there either. That classical inequality is obtained here in an appendix. The quantum inequality can be obtained from the classical version but a more direct quantum approach is employed here. A similar but weaker classical inequality has been reported by Uffink and van Lith.
In his classic book “the Foundations of Statistics” Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: Any preference relation that satisfies certain arguably acceptable postulates determines a (finitely additive) probability distribution over the states and a utility assignment to the consequences, such that the preferences among acts are determined by their expected utilities. Additional problematic assumptions are however required in Savage's proofs. First, there is a Boolean algebra of events (sets of states) which determines the richness of the set of acts. The probabilities are assigned to members of this algebra. Savage's proof requires that this be a σ-algebra (i.e., closed under infinite countable unions and intersections), which makes for an extremely rich preference relation. On Savage's view we should not require subjective probabilities to be σ-additive. He therefore finds the insistence on a σ-algebra peculiar and is unhappy with it. But he sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every consequence there is a constant act, which produces that consequence in every state. This assumption is known to be highly counterintuitive. The present work contains two mathematical results. The first, and the more difficult one, shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the more plausible and much weaker assumption that there are at least two non-equivalent constant acts. The second result also employs a novel way of deriving utilities in Savage-style systems -- without appealing to von Neumann-Morgenstern lotteries. The paper discusses the notion of “idealized agent" that underlies Savage's approach, and argues that the simplified system, which is adequate for all the actual purposes for which the system is designed, involves a more realistic notion of an idealized agent.
Using “brute reason” I will show why there can be only one valid interpretation of probability. The valid interpretation turns out to be a further refinement of Popper’s Propensity interpretation of probability. Via some famous probability puzzles and new thought experiments I will show how all other interpretations of probability fail, in particular the Bayesian interpretations, while these puzzles do not present any difficulties for the interpretation proposed here. In addition, the new interpretation casts doubt on some concepts often taken as basic and unproblematic, like rationality, utility and expectation. This in turn has implications for decision theory, economic theory and the philosophy of physics.
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, α is an acceptable theory for evidence β if and only if α is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.