We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by typicality inclusions of the form p :: T(C) ⊑ D, whose intuitive meaning is that “we believe with degree p that typical Cs are Ds”. As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as it is for the underlying standard Description Logic ALC.
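The distributed-semantics machinery described above can be sketched in a few lines: each typicality inclusion p :: T(C) ⊑ D is independently either kept (with probability p) or discarded (with probability 1 - p), and a scenario's probability is the product of its choices. The inclusions below are hypothetical placeholders, and independence is an assumption carried over from the distributed semantics.

```python
from itertools import product

# Hypothetical typicality inclusions "p :: T(C) <= D", here just (label, p).
inclusions = [("T(Bird) <= Flies", 0.9), ("T(Penguin) <= NotFlies", 0.8)]

def scenarios(inclusions):
    """Enumerate all scenarios: each inclusion is kept or dropped, and a
    scenario's probability is the product of p (kept) or 1 - p (dropped)."""
    for choices in product([True, False], repeat=len(inclusions)):
        kept = [label for (label, _), c in zip(inclusions, choices) if c]
        prob = 1.0
        for (_, p), c in zip(inclusions, choices):
            prob *= p if c else 1 - p
        yield kept, prob

# The scenario probabilities form a distribution: they sum to 1.
total = sum(prob for _, prob in scenarios(inclusions))
```

With n inclusions there are 2^n scenarios, which is where the typical properties of a combined concept are then read off in the proposed logic.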
Many epistemologists hold that an agent can come to justifiably believe that p is true by seeing that it appears that p is true, without having any antecedent reason to believe that visual impressions are generally reliable. Certain reliabilists think this, at least if the agent’s vision is generally reliable. And it is a central tenet of dogmatism (as described by Pryor (2000) and Pryor (2004)) that this is possible. Against these positions it has been argued (e.g. by Cohen (2005) and White (2006)) that this violates some principles from probabilistic learning theory. To see the problem, let’s note what the dogmatist thinks we can learn by paying attention to how things appear. (The reliabilist says the same things, but we’ll focus on the dogmatist.) Suppose an agent receives an appearance that p, and comes to believe that p. Letting Ap be the proposition that it appears to the agent that p, and → be material implication, we can say that the agent learns that p, and hence is in a position to infer Ap → p, once they receive the evidence Ap. This is surprising, because we can prove the following.
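The provable result alluded to at the end can be checked numerically: since every ¬Ap-world verifies the material conditional Ap → p, conditionalising on Ap can only lower (never raise) its probability. A minimal check over four possible worlds, with an assumed toy prior not taken from the paper:

```python
# Worlds assign truth values to (Ap, p); the prior over worlds is assumed.
worlds = {(True, True): 0.4, (True, False): 0.1,
          (False, True): 0.2, (False, False): 0.3}

def pr(pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(q for w, q in worlds.items() if pred(w))

# Prior probability of the material conditional Ap -> p, i.e. not-Ap or p.
prior = pr(lambda w: (not w[0]) or w[1])                       # 0.9
# Posterior after conditionalising on Ap: Pr(Ap -> p | Ap) = Pr(p & Ap)/Pr(Ap).
posterior = pr(lambda w: w[0] and w[1]) / pr(lambda w: w[0])   # 0.8

assert posterior <= prior  # learning Ap cannot raise Pr(Ap -> p)
```

The inequality holds for any prior that gives ¬Ap-worlds positive probability, which is what generates the tension with dogmatism.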
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values. The main technical problem studied in this paper is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria and various (counter) examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
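Requirement (vi), confirmation of universally quantified hypotheses, can be illustrated with a minimal Bayesian toy model: give the universal hypothesis non-zero prior mass and its posterior grows with the evidence. The two-hypothesis setup and all numbers below are illustrative assumptions, not the paper's construction.

```python
# Toy setup: hypothesis H = "every observed instance is positive" has prior
# mass lam; under the alternative, instances are i.i.d. positive at rate theta.
lam, theta = 0.1, 0.5

def posterior_H(n):
    """P(H | n positive instances) by Bayes' theorem: under H the likelihood
    of n positives is 1; under the alternative it is theta**n."""
    return lam / (lam + (1 - lam) * theta ** n)

# A non-dogmatic prior lets the universal hypothesis be confirmed by data.
probs = [posterior_H(n) for n in (0, 5, 20)]   # strictly increasing toward 1
```

A dogmatic prior (lam = 0) would leave the posterior at 0 forever, which is exactly why the paper asks for least dogmatic priors.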
This paper is concerned with representations of belief by means of nonadditive probabilities of the Dempster-Shafer (DS) type. After surveying some foundational issues and results in the DS theory, including Suppes's related contributions, the paper proceeds to analyze the connection of the DS theory with some of the work currently pursued in epistemic logic. A preliminary investigation of the modal logic of belief functions à la Shafer is made. There it is shown that the Alchourrón-Gärdenfors-Makinson (AGM) logic of belief change is closely related to the DS theory. The final section compares the critique of Bayesianism which underlies the present paper with some important objections raised by Suppes against this doctrine.
We discuss the relationship between logic, geometry and probability theory in the light of a novel approach to quantum probabilities which generalizes the method developed by R. T. Cox to the quantum logical approach to physical theories.
I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
Mathematicians often speak of conjectures, yet unproved, as probable or well-confirmed by evidence. The Riemann Hypothesis, for example, is widely believed to be almost certainly true. There seems no initial reason to distinguish such probability from the same notion in empirical science. Yet it is hard to see how there could be probabilistic relations between the necessary truths of pure mathematics. The existence of such logical relations, short of certainty, is defended using the theory of logical probability (or objective Bayesianism or non-deductive logic), and some detailed examples of its use in mathematics are surveyed. Examples of inductive reasoning in experimental mathematics are given and it is argued that the problem of induction is best appreciated in the mathematical case.
John Maynard Keynes’s A Treatise on Probability is the seminal text for the logical interpretation of probability. According to his analysis, probabilities are evidential relations between a hypothesis and some evidence, just like the relations of deductive logic. While some philosophers had suggested similar ideas prior to Keynes, it was not until his Treatise that the logical interpretation of probability was advocated in a clear, systematic and rigorous way. I trace Keynes’s influence in the philosophy of probability through a heterogeneous sample of thinkers who adopted his interpretation. This sample consists of Frederick C. Benenson, Roy Harrod, Donald C. Williams, Henry E. Kyburg and David Stove. The ideas of Keynes prove to be adaptable to their diverse theories of probability. My discussion indicates both the robustness of Keynes’s probability theory and the importance of its influence on the philosophers whom I describe. I also discuss the Problem of the Priors. I argue that none of those I discuss have obviously improved on Keynes’s theory with respect to this issue.
In this paper, I will attempt to develop and defend a common form of intuitive resistance to the companions in guilt argument. I will argue that one can reasonably believe there are promising solutions to the access problem for mathematical realism that don’t translate to moral realism. In particular, I will suggest that the structuralist project of accounting for mathematical knowledge in terms of some form of logical knowledge offers significant hope of success while no analogous approach offers such hope for moral realism.
Enjoying great popularity in decision theory, epistemology, and philosophy of science, Bayesianism as understood here is fundamentally concerned with epistemically ideal rationality. It assumes a tight connection between evidential probability and ideally rational credence, and usually interprets evidential probability in terms of such credence. Timothy Williamson challenges Bayesianism by arguing that evidential probabilities cannot be adequately interpreted as the credences of an ideal agent. From this and his assumption that evidential probabilities cannot be interpreted as the actual credences of human agents either, he concludes that no interpretation of evidential probabilities in terms of credence is adequate. I argue to the contrary. My overarching aim is to show on behalf of Bayesians how one can still interpret evidential probabilities in terms of ideally rational credence and how one can maintain a tight connection between evidential probabilities and ideally rational credence even if the former cannot be interpreted in terms of the latter. By achieving this aim I illuminate the limits and prospects of Bayesianism.
According to the Lockean thesis, a proposition is believed just in case it is highly probable. While this thesis enjoys strong intuitive support, it is known to conflict with seemingly plausible logical constraints on our beliefs. One way out of this conflict is to make probability 1 a requirement for belief, but most have rejected this option for entailing what they see as an untenable skepticism. Recently, two new solutions to the conflict have been proposed that are alleged to be non-skeptical. We compare these proposals with each other and with the Lockean thesis, in particular with regard to the question of how much we gain by adopting any one of them instead of the probability 1 requirement, that is, of how likely it is that one believes more than the things one is fully certain of.
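The conflict the abstract refers to is the familiar lottery case: under any Lockean threshold below 1, each individual "ticket i loses" claim is believed, yet their conjunction is not, so belief fails to be closed under conjunction. A minimal sketch, with an assumed threshold of 0.8 and a 10-ticket fair lottery:

```python
from fractions import Fraction

# Lockean threshold t: believe whatever has probability >= t (t assumed here).
t = Fraction(4, 5)
n = 10  # fair n-ticket lottery with exactly one winner

# For each ticket i, P("ticket i loses") = (n-1)/n = 0.9, which crosses the
# threshold, so each such proposition is believed; but the conjunction
# "every ticket loses" has probability 0 and contradicts the certainty
# that some ticket wins.
p_single_loss = Fraction(n - 1, n)
p_all_lose = Fraction(0)

assert p_single_loss >= t   # each "loses" claim is believed
assert p_all_lose < t       # ...but their conjunction is not
```

Raising t toward 1 only postpones the problem (use a bigger lottery), which is why probability 1 is the one threshold immune to it.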
We generalize the Kolmogorov axioms for probability calculus to obtain conditions defining, for any given logic, a class of probability functions relative to that logic, coinciding with the standard probability functions in the special case of classical logic but allowing consideration of other classes of "essentially Kolmogorovian" probability functions relative to other logics. We take a broad view of the Bayesian approach as dictating inter alia that from the perspective of a given logic, rational degrees of belief are those representable by probability functions from the class appropriate to that logic. Classical Bayesianism, which fixes the logic as classical logic, is only one version of this general approach. Another, which we call Intuitionistic Bayesianism, selects intuitionistic logic as the preferred logic and the associated class of probability functions as the right class of candidate representations of epistemic states (rational allocations of degrees of belief). Various objections to classical Bayesianism are, we argue, best met by passing to intuitionistic Bayesianism—in which the probability functions are taken relative to intuitionistic logic—rather than by adopting a radically non-Kolmogorovian, for example, nonadditive, conception of (or substitute for) probability functions, in spite of the popularity of the latter response among those who have raised these objections. The interest of intuitionistic Bayesianism is further enhanced by the availability of a Dutch Book argument justifying the selection of intuitionistic probability functions as guides to rational betting behavior when due consideration is paid to the fact that bets are settled only when/if the outcome bet on becomes known.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well-understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
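The two-part acceptability condition can be sketched numerically. The paper's relevance component is topic-theoretic, so the probabilistic-difference check below is only a crude stand-in for it, and the joint distribution is an assumed toy example:

```python
# Toy model: an assumed joint distribution over truth values of (phi, psi).
worlds = {(True, True): 0.35, (True, False): 0.05,
          (False, True): 0.35, (False, False): 0.25}

def pr(pred):
    return sum(q for w, q in worlds.items() if pred(w))

p_psi_given_phi = pr(lambda w: w[0] and w[1]) / pr(lambda w: w[0])  # 0.875
p_psi = pr(lambda w: w[1])                                          # 0.70

# (i) high conditional probability; (ii) as a crude stand-in for the topical
# relevance constraint, check that phi makes a probabilistic difference to psi.
acceptable = p_psi_given_phi >= 0.8 and p_psi_given_phi != p_psi
```

In the paper's own framework (ii) is not probabilistic at all, which is exactly the point: high p(ψ|φ) alone, as in Adams' Thesis, lets in irrelevant antecedents.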
Should we understand implicit attitudes on the model of belief? I argue that implicit attitudes are (probably) members of a different psychological kind altogether, because they seem to be insensitive to the logical form of an agent’s thoughts and perceptions. A state is sensitive to logical form only if it is sensitive to the logical constituents of the content of other states (e.g., operators like negation and conditional). I explain sensitivity to logical form and argue that it is a necessary condition for belief. I appeal to two areas of research that seem to show that implicit attitudes fail spectacularly to satisfy this condition—although persistent gaps in the empirical literature leave matters inconclusive. I sketch an alternative account, according to which implicit attitudes are sensitive merely to spatiotemporal relations in thought and perception, i.e., the spatial and temporal orders in which people think, see, or hear things.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
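The classical "two-draw" interpretation mentioned at the end is easy to make concrete: the logical entropy of a partition with block probabilities p_1, ..., p_k is 1 - Σ p_i², the probability that two independent draws land in different blocks, i.e. yield a distinction. A minimal sketch:

```python
# Logical entropy h(pi) = 1 - sum(p_i^2): the probability that two
# independent draws fall in *different* blocks of the partition.
def logical_entropy(block_probs):
    return 1 - sum(p * p for p in block_probs)

h = logical_entropy([0.5, 0.25, 0.25])   # 1 - (0.25 + 0.0625 + 0.0625)
trivial = logical_entropy([1.0])         # one block: no distinctions, h = 0
```

Contrast with Shannon entropy -Σ p_i log p_i: both vanish on the one-block partition and are maximal on the uniform partition, but logical entropy is itself a probability.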
Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper’s Paradox of Ideal Evidence.
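The core idea, weight as inverse imprecision, can be illustrated outside Kyburg's actual Evidential Probability machinery. The sketch below uses Walley's imprecise-Dirichlet-style interval [k/(n+s), (k+s)/(n+s)] as an assumed stand-in: more evidence narrows the interval even when the evidential ratio stays fixed.

```python
# Crude stand-in for imprecision-based weight (not Kyburg's actual system):
# with k successes in n trials and caution parameter s, take the probability
# interval [k/(n+s), (k+s)/(n+s)], whose width s/(n+s) shrinks with n.
def interval(k, n, s=2):
    return (k / (n + s), (k + s) / (n + s))

def width(k, n, s=2):
    lo, hi = interval(k, n, s)
    return hi - lo

# Same ratio k/n = 0.5, very different weight of evidence.
w_small, w_large = width(5, 10), width(500, 1000)
```

This is in the spirit of the answer to Popper's Paradox of Ideal Evidence: the point probability can stay at 0.5 while the imprecision, and hence the weight, changes dramatically.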
The logic of indicative conditionals remains the topic of deep and intractable philosophical disagreement. I show that two influential epistemic norms—the Lockean theory of belief and the Ramsey test for conditional belief—are jointly sufficient to ground a powerful new argument for a particular conception of the logic of indicative conditionals. Specifically, the argument demonstrates, contrary to the received historical narrative, that there is a real sense in which Stalnaker’s semantics for the indicative did succeed in capturing the logic of the Ramseyan indicative conditional.
In his entry on "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy, Alexander Wilce (2012) writes that "it is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the 'quantum logic' of projection operators on a Hilbert space." For a long time, Patrick Suppes has opposed this view (see, for example, the papers collected in Suppes and Zanotti (1996)). Instead of changing the logic and moving from a Boolean algebra to a non-Boolean algebra, one can also 'save the phenomena' by weakening the axioms of probability theory and work instead with upper and lower probabilities. However, it is fair to say that, despite Suppes' efforts, upper and lower probabilities are not particularly popular in physics or in the foundations of physics, at least so far. Instead, quantum logic is booming again, especially since quantum information and computation became hot topics. Interestingly, however, imprecise probabilities are becoming more and more popular in formal epistemology, as recent work by authors such as James Joyce (2010) and Roger White (2010) demonstrates.
Modal logic is one of philosophy’s many children. As a mature adult it has moved out of the parental home and is nowadays straying far from its parent. But the ties are still there: philosophy is important to modal logic, modal logic is important for philosophy. Or, at least, this is a thesis we try to defend in this chapter. Limitations of space have ruled out any attempt at writing a survey of all the work going on in our field—a book would be needed for that. Instead, we have tried to select material that is of interest in its own right or exemplifies noteworthy features in interesting ways. Here are some themes that have guided us throughout the writing:
• The back-and-forth between philosophy and modal logic. There has been a good deal of give-and-take in the past. Carnap tried to use his modal logic to throw light on old philosophical questions, thereby inspiring others to continue his work and still others to criticise it. He certainly provoked Quine, who in his turn provided—and continues to provide—a healthy challenge to modal logicians. And Kripke’s and David Lewis’s philosophies are connected, in interesting ways, with their modal logic. Analytic philosophy would have been a lot different without modal logic!
• The interpretation problem. The problem of providing a certain modal logic with an intuitive interpretation should not be conflated with the problem of providing a formal system with a model-theoretic semantics. An intuitively appealing model-theoretic semantics may be an important step towards solving the interpretation problem, but only a step. One may compare this situation with that in probability theory, where definitions of concepts like ‘outcome space’ and ‘random variable’ are orthogonal to questions about “interpretations” of the concept of probability.
• The value of formalisation. Modal logic sets standards of precision, which are a challenge to—and sometimes a model for—philosophy. Classical philosophical questions can be sharpened and seen from a new perspective when formulated in a framework of modal logic. On the other hand, representing old questions in a formal garb has its dangers, such as simplification and distortion.
• Why modal logic rather than classical (first or higher order) logic? The idioms of modal logic—today there are many!—seem better to correspond to human ways of thinking than ordinary extensional logic. (Cf. Chomsky’s conjecture that the NP + VP pattern is wired into the human brain.)
In his An Essay in Modal Logic (1951) von Wright distinguished between four kinds of modalities: alethic (modes of truth: necessity, possibility and impossibility), epistemic (modes of being known: known to be true, known to be false, undecided), deontic (modes of obligation: obligatory, permitted, forbidden) and existential (modes of existence: universality, existence, emptiness). The existential modalities are not usually counted as modalities, but the other three categories are exemplified in three sections into which this chapter is divided. Section 1 is devoted to alethic modal logic and reviews some main themes at the heart of philosophical modal logic. Sections 2 and 3 deal with topics in epistemic logic and deontic logic, respectively, and are meant to illustrate two different uses that modal logic or indeed any logic can have: it may be applied to already existing (non-logical) theory, or it can be used to develop new theory.
Given a few assumptions, the probability of a conjunction is raised, and the probability of its negation is lowered, by conditionalising upon one of the conjuncts. This simple result appears to bring Bayesian confirmation theory into tension with the prominent dogmatist view of perceptual justification – a tension often portrayed as a kind of ‘Bayesian objection’ to dogmatism. In a recent paper, David Jehle and Brian Weatherson observe that, while this crucial result holds within classical probability theory, it fails within intuitionistic probability theory. They conclude that the dogmatist who is willing to take intuitionistic logic seriously can make a convincing reply to the Bayesian objection. In this paper, I argue that this conclusion is premature – the Bayesian objection can survive the transition from classical to intuitionistic probability, albeit in a slightly altered form. I shall conclude with some general thoughts about what the Bayesian objection to dogmatism does and doesn’t show.
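The opening result is a one-line consequence of the ratio definition of conditional probability, and can be verified on a toy classical model (the distribution below is assumed for illustration):

```python
# Assumed joint distribution over truth values of conjuncts (A, B).
worlds = {(True, True): 0.3, (True, False): 0.2,
          (False, True): 0.25, (False, False): 0.25}

def pr(pred):
    return sum(q for w, q in worlds.items() if pred(w))

p_conj = pr(lambda w: w[0] and w[1])          # P(A & B) = 0.30
p_conj_given_A = p_conj / pr(lambda w: w[0])  # P(A & B | A) = 0.60

assert p_conj_given_A >= p_conj               # conjunction raised...
assert 1 - p_conj_given_A <= 1 - p_conj       # ...its negation lowered
```

The interest of the Jehle-Weatherson observation is precisely that this ratio argument is not available once the underlying logic, and hence the probability theory, goes intuitionistic.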
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
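The basic operation behind "attributes become more definite as the partition becomes more refined" is the join, or common refinement, of two partitions, formed by intersecting blocks. A minimal sketch over a four-element universe:

```python
# Partitions on a set, represented as collections of frozenset blocks.
def join(pi, sigma):
    """Common refinement of two partitions: nonempty block intersections."""
    return frozenset(b & c for b in pi for c in sigma if b & c)

def refines(tau, pi):
    """tau refines pi iff every block of tau sits inside some block of pi."""
    return all(any(b <= c for c in pi) for b in tau)

U = frozenset
pi = {U({1, 2}), U({3, 4})}
sigma = {U({1, 3}), U({2, 4})}
tau = join(pi, sigma)   # here: the discrete partition {{1},{2},{3},{4}}
```

The join refines both of its arguments but not conversely, which is the order-theoretic skeleton of the indefinite-to-definite transitions the abstract describes.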
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, a is an acceptable theory for evidence β if and only if a is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory. (shrink)
Wittgenstein did not write very much on the topic of probability. The little we have comes from a few short pages of the Tractatus, some 'remarks' from the 1930s, and the informal conversations which went on during that decade with the Vienna Circle. Nevertheless, Wittgenstein's views were highly influential in the later development of the logical theory of probability. This paper will attempt to clarify and defend Wittgenstein's conception of probability against some oft-cited criticisms that stem from a misunderstanding of his views. Max Black, for instance, criticises Wittgenstein for formulating a theory of probability that is capable of being used only against the backdrop of the ideal language of the Tractatus. I argue that on the contrary, by appealing to the 'hypothetical laws of nature', Wittgenstein is able to make sense of probability statements involving propositions that have not been completely analysed. G.H. von Wright criticises Wittgenstein's characterisation of these very hypothetical laws. He argues that by introducing them Wittgenstein makes what is distinctive about his theory superfluous, for the hypothetical laws are directly inspired by statistical observations and hence these observations indirectly determine the mechanism by which the logical theory of probability operates. I argue that this is not the case at all, and that while statistical observations play a part in the formation of the hypothetical laws, these observations are only necessary, but not sufficient conditions for the introduction of these hypotheses.
The paper argues that the two best known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC) are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes’ theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies become to a reasoning that is not only informationally useful but also logically valid.
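The "degraded Bayes" reading can be made concrete for AC: from "p → q" and the observation q, concluding p is deductively invalid, yet Bayes' theorem shows when q genuinely raises the probability of p. The numbers below are assumed for illustration:

```python
# Affirming the consequent as a Bayesian shortcut. Assumed toy values:
# the conditional gives P(q|p) = 1; prior P(p) = 0.3; false-positive
# rate P(q|~p) = 0.2.
p_p, p_q_given_p, p_q_given_not_p = 0.3, 1.0, 0.2

p_q = p_q_given_p * p_p + p_q_given_not_p * (1 - p_p)  # total probability
p_p_given_q = p_q_given_p * p_p / p_q                  # Bayes' theorem
```

Here observing q raises P(p) from 0.3 to roughly 0.68; and as P(q|~p) goes to 0, the inference approaches logical validity, which is the paper's limiting case.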
Over the past two decades, gamblers have begun taking mathematics into account more seriously than ever before. While probability theory is the only rigorous theory modeling uncertainty, albeit under idealized conditions, numerical probabilities are viewed not only as mere mathematical information, but also as a decision-making criterion, especially in gambling. This book presents the mathematics underlying the major games of chance and provides a precise account of the odds associated with all gaming events. It begins by explaining in simple terms the meaning of the concept of probability for the layman and goes on to become an enlightening journey through the mathematics of chance, randomness and risk. It then continues with the basics of discrete probability, combinatorics and counting arguments for those interested in the supporting mathematics. These mathematical sections may be skipped by readers who do not have a minimal background in mathematics; these readers can skip directly to the Guide to Numerical Results to pick the odds and recommendations they need for the desired gaming situation. Doing so is possible due to the organization of that chapter, in which the results are listed at the end of each section, mostly in the form of tables. The chapter titled The Mathematics of Games of Chance presents these games not only as a good application field for probability theory, but also in terms of human actions where probability-based strategies can be tried to achieve favorable results. Through suggestive examples, the reader can see what the experiments, events and probability fields in games of chance are and how probability calculus works there. The main portion of this work is a collection of probability results for each type of game. Each game’s section is packed with formulas and tables. Each section also contains a description of the game, a classification of the gaming events and the applicable probability calculations.
The primary goal of this work is to allow the reader to quickly find the odds for a specific gaming situation, in order to improve his or her betting/gaming decisions. Every type of gaming event is tabulated in a logical, consistent and comprehensive manner. The complete methodology and complete or partial calculations are shown, to teach players how to calculate the probability of any situation, at every stage of the game, for any game. Here, readers can find the real odds, returned by precise mathematical formulas and not by the partial simulations that most software uses. Collections of odds are presented, as well as strategic recommendations based on those odds, where necessary, for each type of gaming situation. The book contains much new and original material that has not been published previously and provides great coverage of probabilities for the following games of chance: Dice, Slots, Roulette, Baccarat, Blackjack, Texas Hold ’em Poker, Lottery and Sport Bets. Most games of chance are predisposed to probability-based decisions. This is why the approach is not an exclusively statistical one, but analytical: every gaming event is taken as an individual applied probability problem to solve. A special chapter defines the probability-based strategy and mathematically shows why such a strategy is theoretically optimal.
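A representative odds calculation of the kind the book tabulates is the expected value of a single-number ("straight up") roulette bet, computed exactly with rational arithmetic:

```python
from fractions import Fraction

# European roulette: 37 pockets, straight-up bet pays 35 to 1, unit stake.
p_win = Fraction(1, 37)
ev = p_win * 35 + (1 - p_win) * (-1)   # = -1/37, about -2.7% of the stake

# The same calculation for American roulette (38 pockets) gives -2/38,
# about -5.26%, which is why the extra zero matters to the player.
```

Every bet on the layout has the same exact-fraction treatment, which is the "real odds from precise formulas, not simulations" point of the blurb.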
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
In this paper the strategy for the eliminative reduction of the alethic modalities suggested by John Venn is outlined and shown to anticipate certain related contemporary empiricistic and nominalistic projects. Venn attempted to reduce the alethic modalities to probabilities, and thus suggested a promising solution to the nagging issue of the inclusion of modal statements in empiricistic philosophical systems. However, despite the promise that this suggestion held for laying the 'ghost of modality' to rest, this general approach, tempered modal eliminativism, is shown to be inadequate for that task.
The most widespread models of rational reasoners (the model based on modal epistemic logic and the model based on probability theory) exhibit the problem of logical omniscience. The most common strategy for avoiding this problem is to interpret the models as describing the explicit beliefs of an ideal reasoner, but only the implicit beliefs of a real reasoner. I argue that this strategy faces serious normative issues. In this paper, I present the more fundamental problem of logical omnipotence, which highlights the normative content of the problem of logical omniscience. I introduce two developments of the notion of implicit belief (accessible and stable belief) and use them in two versions of the most common strategy applied to the problem of logical omnipotence.
We provide a 'verisimilitudinarian' analysis of the well-known Linda paradox or conjunction fallacy, i.e., the fact that most people judge the conjunctive statement "Linda is a bank teller and is active in the feminist movement" (B & F) as more probable than the isolated statement "Linda is a bank teller" (B), contrary to an uncontroversial principle of probability theory. The basic idea is that experimental participants may judge B & F a better hypothesis about Linda as compared to B because they evaluate B & F as more verisimilar than B. In fact, the hypothesis "feminist bank teller", while less likely to be true than "bank teller", may well be a better approximation to the truth about Linda.
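The probabilistic point can be checked directly: by the product rule, no assignment of probabilities can make the conjunction more probable than either conjunct. A minimal sketch in Python, with purely hypothetical numbers for the Linda case:

```python
# Hypothetical values -- any choice in [0, 1] exhibits the same inequality.
p_B = 0.05          # P(Linda is a bank teller)
p_F_given_B = 0.2   # P(feminist | bank teller)

p_B_and_F = p_B * p_F_given_B   # product rule: P(B & F) = P(B) * P(F | B)

# Conjunction rule: P(B & F) <= P(B), since P(F | B) <= 1.
assert p_B_and_F <= p_B
print(p_B, p_B_and_F)
```

Whatever values are plugged in, the conjunction can at best tie with its conjunct, which is the uncontroversial principle the experimental participants violate.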
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + TR, based on the notion of rational closure, by inclusions p :: T(C) v D (“we have probability p that typical Cs are Ds”), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete as in ALC.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
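The core definition is easy to make concrete. A minimal sketch, assuming a partition is given as a list of blocks over a finite set: the logical entropy is the number of ordered pairs whose elements lie in distinct blocks, normalized by the square of the set's size.

```python
from itertools import product

def logical_entropy(blocks):
    """Normalized counting measure of the distinctions (dits) of a partition:
    ordered pairs of elements lying in distinct blocks, divided by n**2."""
    n = sum(len(b) for b in blocks)
    block_of = {x: i for i, b in enumerate(blocks) for x in b}
    dits = [(x, y) for x, y in product(block_of, repeat=2)
            if block_of[x] != block_of[y]]
    return len(dits) / n**2

# Partition of {0, 1, 2, 3} into two blocks {0, 1} and {2, 3}:
# 8 of the 16 ordered pairs are distinctions, so the entropy is 0.5.
print(logical_entropy([[0, 1], [2, 3]]))  # 0.5
print(logical_entropy([[0, 1, 2, 3]]))    # 0.0 (one block, no distinctions)
```

Equivalently, with p_B = |B|/n the block probabilities, the same quantity is 1 − Σ p_B², which connects the counting definition to the probabilistic one.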
I argue that any broadly dispositional analysis of probability will either fail to give an adequate explication of probability, or else will fail to provide an explication that can be gainfully employed elsewhere (for instance, in empirical science or in the regulation of credence). The diversity and number of arguments suggests that there is little prospect of any successful analysis along these lines.
This paper explores the interaction of well-motivated (if controversial) principles governing the probability of conditionals with accounts of what it is for a sentence to be indefinite. The conclusion can be played in a variety of ways. It could be regarded as a new reason to be suspicious of the intuitive data about the probability of conditionals; or, holding fixed the data, it could be used to gain traction on the philosophical analysis of a contentious notion—indefiniteness. The paper outlines the various options, and shows that 'rejectionist' theories of indefiniteness are incompatible with the results. Rejectionist theories include popular accounts such as supervaluationism, non-classical truth-value gap theories, and accounts of indeterminacy that centre on rejecting the law of excluded middle. An appendix compares the results obtained here with the 'impossibility' results descending from Lewis (1976).
This paper outlines a formal recursive wager resolution calculus (WRC) that provides a novel conceptual framework for sentential logic via bridge rules that link wager resolution with truth values. When paired with a traditional truth-centric criterion of logical soundness, WRC generates a sentential logic that is broadly truth-conditional but not truth-functional, supports the rules of proof employed in standard mathematics, and is immune to the most vexing features of their traditional implementation. WRC also supports a novel probabilistic criterion of logical soundness, the fair betting probability criterion (FBP). It guarantees that the conclusion of an FBP-valid argument is at least as credible as the conjunction of its premises, and also that the conclusion is true if the premises are. In addition, WRC provides a platform for a novel non-probabilistic, computationally simpler criterion of logical soundness – the criterion of Super-validity – that issues the same logical appraisals as FBP, and hence the same guarantees.
In this paper I present a new way of understanding Dutch Book Arguments: the idea is that an agent is shown to be incoherent iff he would accept as fair a set of bets that would result in a loss under any interpretation of the claims involved. This draws on a standard definition of logical inconsistency. On this new understanding, the Dutch Book Arguments for the probability axioms go through, but the Dutch Book Argument for Reflection fails. The question of whether we have a Dutch Book Argument for Conditionalization is left open.
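The structure of a Dutch Book can be made concrete with a toy example. A minimal sketch, assuming a hypothetical agent whose credences in A and not-A sum to less than 1, violating additivity: selling a $1 bet on each claim at the price the agent regards as fair produces a loss under every interpretation of A.

```python
# Hypothetical incoherent credences: cr(A) + cr(not-A) = 0.6 < 1.
cr = {"A": 0.3, "not-A": 0.3}

stake = 1.0
# The agent regards selling a $1 bet on X at price cr[X] as fair,
# so a bookie can buy both bets from the agent.
receipts = cr["A"] + cr["not-A"]   # agent collects both prices up front
for world in ("A true", "A false"):
    payout = stake                 # exactly one of the two bets wins per world
    net = receipts - payout
    print(world, net)              # negative (about -0.4) in every world
```

The guaranteed loss mirrors logical inconsistency as described in the abstract: there is no interpretation of A under which the agent comes out even.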
Structuralism has recently moved center stage in philosophy of mathematics. One of the issues discussed is the underlying logic of mathematical structuralism. In this paper, I want to look at the dual question, namely the underlying structures of logic. Indeed, from a mathematical structuralist standpoint, it makes perfect sense to try to identify the abstract structures underlying logic. We claim that one answer to this question is provided by categorical logic. In fact, we claim that the latter can be seen—and probably should be seen—as being a structuralist approach to logic and it is from this angle that categorical logic is best understood.
The paper argues that the two best known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC), are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes’ theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies become to a reasoning that is not only informationally useful but also logically valid.
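The Bayesian reading can be illustrated numerically. A minimal sketch with hypothetical probabilities: affirming the consequent (from "if p then q" and q, infer p) approaches a sound inference as P(q | not-p) shrinks, i.e. as q rarely occurs without p.

```python
def posterior(prior_p, q_given_p, q_given_not_p):
    """Bayes' theorem: P(p | q) from P(p), P(q | p) and P(q | not-p)."""
    pq = q_given_p * prior_p + q_given_not_p * (1 - prior_p)
    return q_given_p * prior_p / pq

# "If p then q" is modelled as P(q | p) = 1; q is then observed.
# AC concludes p outright; Bayes grades how good that shortcut is.
print(posterior(0.5, 1.0, 0.5))   # about 0.67: AC is a weak bet
print(posterior(0.5, 1.0, 0.01))  # about 0.99: AC is a useful shortcut
```

As P(q | not-p) goes to 0 the posterior goes to 1, which is the sense in which the fallacy degrades gracefully into something close to valid reasoning.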
Although cognition is significant in strategic reasoning, its role has been weakly analyzed, because only average intelligence is usually considered. For example, the prisoner's dilemma in game theory would have different outcomes for persons with different intelligence. I show how various levels of intelligence influence the quality of reasoning, decision-making, or the probability of psychosis. I explain my original methodology, developed for my MA thesis in clinical psychology in 1998 and grant research in 1999, demonstrating the bias of the classic IQ method and how intelligence limits thinking. Based on that I defined a Personality Model, providing insight into the understanding of psychosis (schizophrenia, bipolar disorder), which has not yet been explained by psychology or psychiatry. In addition, it enables one to analyze and assess non-linear problems, utilizable in computer programming, visualization (animation) or other fields including the game of Baduk. I have already applied some principles in the complex information system www.each.co.uk, and in video animations exhibited in London, Germany and Tokyo. I should also mention my experience in chess composition between 1994 and 2000, winning a few international prizes and inventing a special class of fairy rules redefining the mate. Chess composition principles or patterns show the way to organize logical series into higher advanced mechanisms (like calculus), applicable to other fields. One such principle is a logical aesthetic innovation: new strategy, defined by Italian composers. Finally I show how a simple redefinition of the classic utility concept links economics and psychology to explain irrational/destructive behavior. All presented results (from the research) can be repeated.
This paper presents a new analysis of C.G. Hempel’s conditions of adequacy for any relation of confirmation [Hempel C. G. (1945). Aspects of scientific explanation and other essays in the philosophy of science. New York: The Free Press, pp. 3–51.], differing from the one Carnap gave in §87 of his [1962. Logical foundations of probability (2nd ed.). Chicago: University of Chicago Press.]. Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true hypotheses and another aiming at informative hypotheses. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative hypotheses. I then show that one can have Hempel’s cake and eat it too. There is a logic that takes into account both of these two conflicting aspects. According to this logic, a sentence H is an acceptable hypothesis for evidence E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap’s analysis.
Karl Popper discovered in 1938 that the unconditional probability of a conditional of the form ‘If A, then B’ normally exceeds the conditional probability of B given A, provided that ‘If A, then B’ is taken to mean the same as ‘Not (A and not B)’. So it was clear (but presumably only to him at that time) that the conditional probability of B given A cannot be reduced to the unconditional probability of the material conditional ‘If A, then B’. I describe how this insight was developed in Popper’s writings and I add to this historical study a logical one, in which I compare laws of excess in Kolmogorov probability theory with laws of excess in Popper probability theory.
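The excess is easy to exhibit numerically. A minimal sketch with a hypothetical joint distribution over A and B, comparing the probability of the material conditional, 1 − P(A and not-B), with the conditional probability P(B | A):

```python
# Hypothetical joint distribution over the four (A, B) cases.
p = {("A", "B"): 0.1, ("A", "nB"): 0.1,
     ("nA", "B"): 0.4, ("nA", "nB"): 0.4}

# Material conditional 'If A, then B' read as 'Not (A and not B)'.
p_material = 1 - p[("A", "nB")]
# Conditional probability of B given A.
p_cond = p[("A", "B")] / (p[("A", "B")] + p[("A", "nB")])

print(p_material, p_cond)  # 0.9 versus 0.5: the material conditional exceeds
```

The excess here comes from the cases where A is false, which all count in favor of the material conditional but are simply discarded when conditionalizing on A.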
A historical review and philosophical look at the introduction of “negative probability” as well as “complex probability” is suggested. The generalization of “probability” is forced by mathematical models in physical or technical disciplines. Initially, these quantities are involved only as an auxiliary tool, to complete mathematical models with respect to the corresponding operations. Afterwards, they acquire ontological status, especially in quantum mechanics and its formulation as a natural information theory, as “quantum information”, after the experimental confirmation of the phenomena of “entanglement”. Philosophical interpretations appear. A generalization of them is suggested: ontologically, they correspond to a relevant generalization of the relation of a part and its whole, where the whole is a subset of the part rather than vice versa. The structure of “vector space” is necessarily involved in order to distinguish the part “by itself” from the part in relation to the whole, as a projection within it. That difference is reflected in the new dimension of the vector space, both mathematically and conceptually. Then “negative or complex probability” is interpreted as a quantity corresponding to the generalized case, where the part can be “bigger” than the whole and is in general represented only partly within the whole.
By the lights of a central logical positivist thesis in modal epistemology, for every necessary truth that we know, we know it a priori and for every contingent truth that we know, we know it a posteriori. Kripke attacks on both flanks, arguing that we know necessary a posteriori truths and that we probably know contingent a priori truths. In a reflection of Kripke's confidence in his own arguments, the first of these Kripkean claims is far more widely accepted than the second. Contrary to received opinion, the paper argues, the considerations Kripke adduces concerning truths purported to be necessary a posteriori do not disprove the logical positivist thesis that necessary truth and a priori truth are co-extensive.
Neuroscience has studied deductive reasoning over the last 20 years under the assumption that deductive inferences are not only de jure but also de facto distinct from other forms of inference. The objective of this research is to verify whether logically valid deductions leave any cerebral electrical trait that is distinct from the trait left by non-valid deductions. 23 subjects with an average age of 20.35 years were recorded with MEG and placed into a two-condition paradigm (100 trials for each condition), each condition presenting exactly the same relational complexity (same variables and content) but distinct logical complexity. Both conditions show the same electromagnetic components (P3, N4) in the early temporal window (250–525 ms) and P6 in the late temporal window (500–775 ms). The significant activity in both valid and invalid conditions is found in sensors from medial prefrontal regions, probably corresponding to the ACC or to the medial prefrontal cortex. The amplitude and intensity of valid deductions are significantly lower in both temporal windows (p = 0.0003). The reaction time was 54.37% slower in the valid condition. Validity leaves a minimal but measurable hypoactive electrical trait in brain processing. The minor electrical demand is attributable to the recursive and automatable character of valid deductions, suggesting a physical indicator of computational deductive properties. It is hypothesized that all valid deductions are recursive and hypoactive.
Although we often see references to Carnap’s inductive logic even in the modern literature, its confusing style has seemingly long obstructed its correct understanding. So instead of Carnap, in this paper, I devote myself to a necessary and sufficient commentary on it. In the beginning part (Sections 2-5), I explain why Carnap began the study of inductive logic and how he related it to our thought on probability (Sections 2-4). Therein, I trace Carnap’s thought back to Wittgenstein’s Tractatus as well (Section 5). In the succeeding sections, I attempt the simplest exhibition of Carnap’s earlier system, where his original thought was thoroughly provided. For this purpose, minor concepts to which researchers have not paid attention are highlighted, for example, m-function (Section 8), in-correlation (Section 10), C-correlate (Section 10), statistical distribution (Section 12), and fitting sequence (Section 17). The climax of this paper is the proof of theorem (56). Through this theorem, we will be able to overview Carnap’s whole system.
This paper discusses an almost sixty year old problem in the philosophy of science -- that of a logic of confirmation. We present a new analysis of Carl G. Hempel's conditions of adequacy (Hempel 1945), differing from the one Carnap gave in §87 of his Logical Foundations of Probability (1962). Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. We then show that one can have Hempel's cake and eat it, too: There is a (rank-theoretic and genuinely nonmonotonic) logic of confirmation -- or rather, theory assessment -- that takes into account both of these two conflicting aspects. According to this logic, a statement H is an acceptable theory for the data E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap's analysis (and solves another problem of confirmation theory).
Ignited by Einstein and Bohr a century ago, the philosophical struggle about Reality is yet unfinished, with no signs of a swift resolution. Despite vast technological progress fueled by the iconic EPR paper (EPR), the intricate link between ontic and epistemic aspects of Quantum Theory (QT) has greatly hindered our grip on Reality and further progress in physical theory. Fallacies concealed by tortuous logical negations made EPR comprehension much harder than it could have been had Einstein written it himself in German. It is plagued with preconceptions about what a physical property is, the 'Uncertainty Principle', and the Principle of Locality. Numerous interpretations of QT vis à vis Reality exist and are keenly disputed. This is the first of a series of articles arguing for a physical interpretation called ‘The Ontic Probability Interpretation’ (TOPI). A gradual explanation of TOPI is given intertwined with a meticulous logico-philosophical scrutiny of EPR. Part I focuses on the meaning of Einstein’s ‘Incompleteness’ claim. A conceptual confusion, a preconception about Reality, and a flawed dichotomy are shown to be severe obstacles for the EPR argument to succeed. Part II analyzes Einstein’s ‘Incompleteness/Nonlocality Dilemma’. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical progress.
Dorothy Edgington’s work has been at the centre of a range of ongoing debates in philosophical logic, philosophy of mind and language, metaphysics, and epistemology. This work has focused, although by no means exclusively, on the overlapping areas of conditionals, probability, and paradox. In what follows, I briefly sketch some themes from these three areas relevant to Dorothy’s work, highlighting how some of Dorothy’s work and some of the contributions of this volume fit in to these debates.
We study the modal logic MLr of the countable random frame, which is contained in and 'approximates' the modal logic of almost sure frame validity, i.e. the logic of those modal principles which are valid with asymptotic probability 1 in a randomly chosen finite frame. We give a sound and complete axiomatization of MLr and show that it is not finitely axiomatizable. Then we describe the finite frames of that logic and show that it has the finite frame property and that its satisfiability problem is in EXPTIME. All these results easily extend to temporal and other multi-modal logics. Finally, we show that there are modal formulas which are almost surely valid in the finite, yet fail in the countable random frame, and hence do not follow from the extension axioms. Therefore the analog of Fagin's transfer theorem for almost sure validity in first-order logic fails for modal logic.
We extend the framework of Inductive Logic to Second Order languages and introduce Wilmers' Principle, a rational principle for probability functions on Second Order languages. We derive a representation theorem for functions satisfying this principle and investigate its relationship to the first order principles of Regularity and Super Regularity.