Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917–1923. The aim of this paper is to describe these results, focusing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
The basic argument forms Modus Ponens, Modus Tollens, Hypothetical Syllogism, and Dilemma all contain ‘if–then’ conditionals. Conclusions from arguments containing ‘if–then’ conditionals can be deduced easily, without significant memorization, by applying Raval’s method. Method: in Raval’s method, ‘If P then Q’ is written as P (2$) – Q (1$) and viewed numerically, in currency form, i.e. P is viewed as 2$ and Q as 1$; the implications of this notation are the valid conclusions. If one has 2$, one definitely has 1$. If one does not have 2$, one may not have 1$. If one has 1$, one may not have 2$. If one does not have 1$, one definitely does not have 2$.
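The classical validity facts behind these currency readings can be checked mechanically by brute-force truth tables. The sketch below is my own illustration, not Raval's method itself; it confirms that modus ponens and modus tollens are valid while the converse error ("affirming the consequent") is not.

```python
from itertools import product

def implies(p, q):
    return (not p) or q

def valid(premises, conclusion):
    # An argument form is valid iff the conclusion is true on every
    # row of the truth table on which all the premises are true.
    return all(conclusion(*vals)
               for vals in product([True, False], repeat=2)
               if all(prem(*vals) for prem in premises))

modus_ponens = valid([lambda p, q: implies(p, q), lambda p, q: p],
                     lambda p, q: q)
modus_tollens = valid([lambda p, q: implies(p, q), lambda p, q: not q],
                      lambda p, q: not p)
# "If one has 1$, one may not have 2$": affirming the consequent is invalid.
affirming_consequent = valid([lambda p, q: implies(p, q), lambda p, q: q],
                             lambda p, q: p)

print(modus_ponens, modus_tollens, affirming_consequent)  # True True False
```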
Propositional identity is not expressed by a predicate, so its logic is not given by the ordinary first-order axioms for identity. What, then, are the logical axioms governing this concept? Axioms in addition to those put forward by Arthur Prior are proposed.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
ABSTRACT: An introduction to Stoic logic. Stoic logic can in many respects be regarded as a forerunner of modern propositional logic. I discuss: 1. the Stoic notion of sayables or meanings (lekta); the Stoic assertibles (axiomata) and their similarities and differences to modern propositions; the time-dependency of their truth; 2.-3. assertibles with demonstratives and quantified assertibles and their truth-conditions; truth-functionality of negations and conjunctions; non-truth-functionality of disjunctions and conditionals; language regimentation and ‘bracketing’ devices; Stoic basic principles of propositional logic; 4. Stoic modal logic; 5. Stoic theory of arguments: the two-premisses requirement; validity and soundness; 6. Stoic syllogistic or theory of formally valid arguments: a reconstruction of the Stoic deductive system, which consisted of accounts of five types of indemonstrable syllogisms, which function as nullary argumental rules that identify indemonstrables or axioms of the system, and four deductive rules (themata) by which certain complex arguments can be reduced to indemonstrables and thus shown to be formally valid themselves; 7. arguments that were considered as non-syllogistically valid (subsyllogistic and unmethodically concluding arguments). Their validity was explained by recourse to formally valid arguments.
forall x: Calgary is a full-featured textbook on formal logic. It covers key notions of logic such as consequence and validity of arguments, the syntax of truth-functional propositional logic TFL and truth-table semantics, the syntax of first-order (predicate) logic FOL with identity (first-order interpretations), translating (formalizing) English in TFL and FOL, and Fitch-style natural deduction proof systems for both TFL and FOL. It also deals with some advanced topics such as truth-functional completeness and modal logic. Exercises with solutions are available. It is provided in PDF (for screen reading, printing, and a special version for dyslexics) and in LaTeX source code.
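Truth-table semantics of the kind covered for TFL can be sketched as a small recursive evaluator. The tuple-based formula encoding and operator names below are my own choices for illustration, not the textbook's notation.

```python
# Formulas are either sentence letters (strings) or tuples
# ('not', A), ('and', A, B), ('or', A, B), ('if', A, B).
def evaluate(formula, valuation):
    if isinstance(formula, str):  # sentence letter: look up its truth value
        return valuation[formula]
    op, *args = formula
    if op == 'not':
        return not evaluate(args[0], valuation)
    if op == 'and':
        return evaluate(args[0], valuation) and evaluate(args[1], valuation)
    if op == 'or':
        return evaluate(args[0], valuation) or evaluate(args[1], valuation)
    if op == 'if':
        # material conditional: false only when antecedent true, consequent false
        return (not evaluate(args[0], valuation)) or evaluate(args[1], valuation)
    raise ValueError(f"unknown connective: {op}")

# (A ∧ ¬A) is false on every valuation, i.e. a contradiction:
print(evaluate(('and', 'A', ('not', 'A')), {'A': True}))   # False
```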
The Frege point to the effect that e.g. the clauses of conditionals are not asserted and therefore cannot be assertions is often taken to establish a dichotomy between the content of a speech act, which is propositional and belongs to logic and semantics, and its force, which belongs to pragmatics. Recently this dichotomy has been questioned by philosophers such as Peter Hanks and Francois Recanati, who propose act-theoretic accounts of propositions, argue that we can’t account for propositional unity independently of the forceful acts of speakers, and respond to the Frege point by appealing to a notion of force cancellation. I argue that the notion of force cancellation is faced with a dilemma and offer an alternative response to the Frege point, which extends the act-theoretic account to logical acts such as conditionalizing or disjoining. Such higher-level acts allow us to present forceful acts while suspending commitment to them. In connecting them, a subject rather commits to an affirmation function of such acts. In contrast, the Frege point confuses a lack of commitment to what is put forward with a lack of commitment or force in what is put forward.
It is shown that Gqp↑, the quantified propositional Gödel logic based on the truth-value set V↑ = {1 - 1/n : n≥1}∪{1}, is decidable. This result is obtained by reduction to Büchi's theory S1S. An alternative proof based on elimination of quantifiers is also given, which yields both an axiomatization and a characterization of Gqp↑ as the intersection of all finite-valued quantified propositional Gödel logics.
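The truth-value set V↑ and the standard Gödel truth functions over it can be sketched as follows; this is my own illustration of the well-known semantics, not code connected to the decidability proof in the paper.

```python
from fractions import Fraction

# Standard Gödel truth functions, here applied to elements of
# V↑ = {1 - 1/n : n ≥ 1} ∪ {1}.
def g_and(a, b): return min(a, b)
def g_or(a, b):  return max(a, b)
def g_imp(a, b): return Fraction(1) if a <= b else b

# The first few values of V↑: 0, 1/2, 2/3, 3/4, ...
v_up = [1 - Fraction(1, n) for n in range(1, 5)] + [Fraction(1)]

# A → B is fully true iff the antecedent's value does not exceed the
# consequent's; otherwise the conditional collapses to the consequent:
print(g_imp(Fraction(1, 2), Fraction(2, 3)))  # 1
print(g_imp(Fraction(2, 3), Fraction(1, 2)))  # 1/2
```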
The first learning game to be developed to help students to develop and hone skills in constructing proofs in both the propositional and first-order predicate calculi. It comprises an autotelic (self-motivating) learning approach to assist students in developing skills and strategies of proof in the propositional and predicate calculus. The text of VALIDITY consists of a general introduction that describes earlier studies made of autotelic learning games, paying particular attention to work done at the Law School of Yale University, called the ALL Project (Accelerated Learning of Logic). Following the introduction, the game of VALIDITY is described, first with reference to the propositional calculus, and then in connection with the first-order predicate calculus with identity. Sections in the text are devoted to discussions of the various rules of derivation employed in both calculi. Three appendices follow the main text; these provide a catalogue of sequents and theorems that have been proved for the propositional calculus and for the predicate calculus, and include suggestions for the classroom use of VALIDITY in university-level courses in mathematical logic.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought: only monotonicity has thus far proved indispensable. Section 5 explains another consequence of general relevance: despite the "infinitary" nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
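The Kripke-structure treatment of common belief admits a familiar reachability reading: φ is commonly believed at a world w iff φ holds at every world reachable from w via the union of the agents' accessibility relations (iterated, i.e. its transitive closure). The sketch below is a standard illustration of that fixpoint characterization, not code from the paper.

```python
from collections import deque

def common_belief(world, relations, holds_phi):
    # BFS over the union of the agents' accessibility relations:
    # every world reachable in one or more steps must satisfy phi.
    seen, frontier = set(), deque()
    for rel in relations:
        frontier.extend(rel.get(world, []))
    while frontier:
        v = frontier.popleft()
        if v in seen:
            continue
        seen.add(v)
        if not holds_phi(v):
            return False
        for rel in relations:
            frontier.extend(rel.get(v, []))
    return True

# Two agents over worlds {0, 1, 2}; phi holds everywhere except world 2.
# Agent 1 considers 1 possible at 0; agent 2 considers 2 possible at 1,
# so world 2 is reachable in two steps and common belief of phi fails at 0.
r1 = {0: [1]}
r2 = {1: [2]}
print(common_belief(0, [r1, r2], lambda w: w != 2))  # False
```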
Carnap’s result about classical proof-theories not ruling out non-normal valuations of propositional logic formulae has seen renewed philosophical interest in recent years. In this note I contribute some considerations which may be helpful in its philosophical assessment. I suggest a vantage point from which to see the way in which classical proof-theories do, at least to a considerable extent, encode the meanings of the connectives (not by determining a range of admissible valuations, but in their own way), and I demonstrate a kind of converse to Carnap’s result.
The “sign of consequence” is a notation for propositional logic that Peirce invented in 1886 and used at least until 1894. It replaced the “copula of inclusion”, which he had been using since 1870.
The Genuine Process Logic described here (abbreviation: GPL) places the object-bound process itself at the center of the formalism. It is intended to be suitable for everyday use, i.e. it is not primarily aimed at the formalization of computer programs but is conceived instead as a counter-conception to the classical state logics. The new and central operator of the GPL is an action symbol replacing the classical state symbols, e.g. those of equivalence or identity. The complete renunciation of object-language state expressions also results in a completely new metalinguistic framework, both as regards the axioms and the expressive possibilities of this system. A mixture with state-logical terms is readily possible.
We present epistemic multilateral logic, a general logical framework for reasoning involving epistemic modality. Standard bilateral systems use propositional formulae marked with signs for assertion and rejection. Epistemic multilateral logic extends standard bilateral systems with a sign for the speech act of weak assertion (Incurvati and Schlöder 2019) and an operator for epistemic modality. We prove that epistemic multilateral logic is sound and complete with respect to the modal logic S5 modulo an appropriate translation. The logical framework developed provides the basis for a novel, proof-theoretic approach to the study of epistemic modality. To demonstrate the fruitfulness of the approach, we show how the framework allows us to reconcile classical logic with the contradictoriness of so-called Yalcin sentences and to distinguish between various inference patterns on the basis of the epistemic properties they preserve.
We introduce a number of logics to reason about collective propositional attitudes that are defined by means of the majority rule. It is well known that majoritarian aggregation is subject to irrationality, as the results in social choice theory and judgment aggregation show. The proposed logics for modelling collective attitudes are based on a substructural propositional logic that allows for circumventing inconsistent outcomes. Individual and collective propositional attitudes, such as beliefs, desires, obligations, are then modelled by means of minimal modalities to ensure a number of basic principles. In this way, a viable consistent modelling of collective attitudes is obtained.
In the present paper we propose a system of propositional logic for reasoning about justification, truthmaking, and the connection between justifiers and truthmakers. The logic of justification and truthmaking is developed according to the fundamental ideas introduced by Artemov. Justifiers and truthmakers are treated in a similar way, exploiting the intuition that justifiers provide epistemic grounds for propositions to be considered true, while truthmakers provide ontological grounds for propositions to be true. This system of logic is then applied both for interpreting the well-known definition of knowledge as justified true belief and for advancing a new solution to Gettier counterexamples to this standard definition.
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as previously had been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: first, that it presupposed no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system.
In this paper I introduce a sequent system for the propositional modal logic S5. Derivations of valid sequents in the system are shown to correspond to proofs in a novel natural deduction system of circuit proofs (reminiscent of proofnets in linear logic, or multiple-conclusion calculi for classical logic). The sequent derivations and proofnets are both simple extensions of sequents and proofnets for classical propositional logic, in which the new machinery—to take account of the modal vocabulary—is directly motivated in terms of the simple, universal Kripke semantics for S5. The sequent system is cut-free and the circuit proofs are normalising.
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
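The subset/partition duality can be made concrete with the common refinement of two partitions, a partition-side analogue of intersecting subsets. Which lattice operation this counts as (join or meet) depends on the ordering convention chosen, so the sketch below is only an illustration of the block-wise construction, not the paper's formalism.

```python
def common_refinement(p1, p2):
    # Each block of the refinement is a nonempty intersection of a
    # block of p1 with a block of p2.
    blocks = [b1 & b2 for b1 in p1 for b2 in p2]
    return [b for b in blocks if b]

# Two partitions of the universe set {1, 2, 3, 4}:
p1 = [{1, 2}, {3, 4}]
p2 = [{1, 3}, {2, 4}]
print(common_refinement(p1, p2))  # [{1}, {2}, {3}, {4}]
```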
In this paper the propositional logic LTop is introduced as an extension of classical propositional logic by adding a paraconsistent negation. This logic has a very natural interpretation in terms of topological models. The logic LTop is nothing more than an alternative presentation of modal logic S4, but in the language of a paraconsistent logic. Moreover, LTop is a logic of formal inconsistency in which the consistency and inconsistency operators have a nice topological interpretation. This constitutes a new proof of S4 as being "the logic of topological spaces", but now under the perspective of paraconsistency.
This paper is concerned with a propositional modal logic with operators for necessity, actuality and apriority. The logic is characterized by a class of relational structures defined according to ideas of epistemic two-dimensional semantics, and can therefore be seen as formalizing the relations between necessity, actuality and apriority according to epistemic two-dimensional semantics. We can ask whether this logic is correct, in the sense that its theorems are all and only the informally valid formulas. This paper gives outlines of two arguments that jointly show that this is the case. The first is intended to show that the logic is informally sound, in the sense that all of its theorems are informally valid. The second is intended to show that it is informally complete, in the sense that all informal validities are among its theorems. In order to give these arguments, a number of independently interesting results concerning the logic are proven. In particular, the soundness and completeness of two proof systems with respect to the semantics is proven (Theorems 2.11 and 2.15), as well as a normal form theorem (Theorem 3.2), an elimination theorem for the actuality operator (Corollary 3.6), and the decidability of the logic (Corollary 3.7). It turns out that the logic invalidates a plausible principle concerning the interaction of apriority and necessity; consequently, a variant semantics is briefly explored on which this principle is valid. The paper concludes by assessing the implications of these results for epistemic two-dimensional semantics.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
Sentences about logic are often used to show that certain embedding expressions, including attitude verbs, conditionals, and epistemic modals, are hyperintensional. Yet it is not clear how to regiment “logic talk” in the object language so that it can be compositionally embedded under such expressions. This paper does two things. First, it argues against a standard account of logic talk, viz., the impossible worlds semantics. It is shown that this semantics does not easily extend to a language with propositional quantifiers, which are necessary for regimenting some logic talk. Second, it develops an alternative framework based on logical expressivism, which explains logic talk using shifting conventions. When combined with the standard S5π+ semantics for propositional quantifiers, this framework results in a well-behaved system that does not face the problems of the impossible worlds semantics. It can also be naturally extended with hybrid operators to regiment a broader range of logic talk, e.g., claims about what laws hold according to other logics. The resulting system, called hyperlogic, is therefore a better framework for modeling logic talk than previous accounts.
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names": a special kind of propositional variable ranging over singleton sets of worlds. The obtained language ℒ_c proves to have great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ(⍯) of ℒ, where ⍯ is an additional modality with the semantics x ⊧ ⍯φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further we consider deductive systems in ℒ_c. Strong completeness of the normal ℒ_c logics is proved with respect to models in which all worlds are named. Every ℒ_c-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning transfer of properties ([in]completeness, filtration, finite model property, etc.) from ℒ to ℒ_c are discussed. Finally, further perspectives for names in a multimodal environment are briefly sketched.
The purpose of the present paper is to provide a way of understanding systems of the logic of essence by introducing a new semantic framework for them. Three central results are achieved: first, the now standard Fitting semantics for the propositional logic of evidence is adapted in order to provide a new, simplified semantics for the propositional logic of essence; secondly, we show how it is possible to construe the concept of necessary truth explicitly by using the concept of essential truth; finally, Fitting semantics is adapted in order to present a simplified semantics for the quantified logic of essence.
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
This book treats ancient logic: the logic that originated in Greece with Aristotle and the Stoics, mainly in the hundred-year period beginning about 350 BCE. Ancient logic was never completely ignored by modern logic from its Boolean origin in the middle 1800s: it was prominent in Boole’s writings and it was mentioned by Frege and by Hilbert. Nevertheless, the first century of mathematical logic did not take it seriously enough to study the ancient logic texts. A renaissance in ancient logic studies occurred in the early 1950s with the publication of the landmark Aristotle’s Syllogistic by Jan Łukasiewicz, Oxford UP 1951, 2nd ed. 1957. Despite its title, it treats the logic of the Stoics as well as that of Aristotle. Łukasiewicz was a distinguished mathematical logician. He had created many-valued logic and the parenthesis-free prefix notation known as Polish notation. He co-authored with Alfred Tarski an important paper on the metatheory of propositional logic, and he was one of Tarski’s three main teachers at the University of Warsaw. Łukasiewicz’s stature was just short of that of the giants: Aristotle, Boole, Frege, Tarski and Gödel. No mathematical logician of his caliber had ever before quoted the actual teachings of ancient logicians. Not only did Łukasiewicz inject fresh hypotheses, new concepts, and imaginative modern perspectives into the field; his enormous prestige and that of the Warsaw School of Logic reflected on the whole field of ancient logic studies. Suddenly, this previously somewhat dormant and obscure field became active and gained in respectability and importance in the eyes of logicians, mathematicians, linguists, analytic philosophers, and historians. Next to Aristotle himself and perhaps the Stoic logician Chrysippus, Łukasiewicz is the most prominent figure in ancient logic studies. A huge literature traces its origins to Łukasiewicz.
This book, Ancient Logic and Its Modern Interpretations, is based on the 1973 Buffalo Symposium on Modernist Interpretations of Ancient Logic, the first conference devoted entirely to critical assessment of the state of ancient logic studies.
In this paper, we investigate the expressiveness of the variety of propositional interval neighborhood logics (PNL), we establish their decidability on linearly ordered domains and some important subclasses, and we prove the undecidability of a number of extensions of PNL with additional modalities over interval relations. Altogether, we show that the PNL family forms a quite expressive and nearly maximal decidable fragment of Halpern–Shoham’s interval logic HS.
Entailment in propositional Gödel logics can be defined in a natural way. While all infinite sets of truth values yield the same sets of tautologies, the entailment relations differ. It is shown that there is a rich structure of infinite-valued Gödel logics, only one of which is compact. It is also shown that the compact infinite-valued Gödel logic is the only one which interpolates, and the only one with an r.e. entailment relation.
In this paper we focus our attention on tableau methods for propositional interval temporal logics. These logics provide a natural framework for representing and reasoning about temporal properties in several areas of computer science. However, while various tableau methods have been developed for linear and branching time point-based temporal logics, not much work has been done on tableau methods for interval-based ones. We develop a general tableau method for Venema's \cdt\ logic interpreted over partial orders (\nsbcdt\ for short). It combines features of the classical tableau method for first-order logic with those of explicit tableau methods for modal logics with constraint label management, and it can be easily tailored to most propositional interval temporal logics proposed in the literature. We prove its soundness and completeness, and we show how it has been implemented.
The conventional wisdom has it that between 1905 and 1919 Russell was critical of pragmatism. In particular, in two essays written in 1908–9, he sharply attacked the pragmatist theory of truth, emphasizing that truth is not relative to human practice. In fact, however, Russell was much more indebted to the pragmatists, in particular to William James, than is usually believed. For example, he borrowed from James two key concepts of his new epistemology: sense-data, and the distinction between knowledge by acquaintance and knowledge by description. A reasonable explanation is that, historically, Russell’s logical realism and James’s pragmatism have the same roots—the German philosopher Rudolph Hermann Lotze (1817–1881). In this paper we explore the fact that in 1905, under Lotze’s influence, Russell married propositions with beliefs. A few years later this step also made Russell prone to embrace the theory of truth-making that has its roots in James. In contrast to the concept of sense-data and to the distinction between knowledge by acquaintance and knowledge by description, however, the understanding that we believe propositions—and not, for example, simply grasp them—was in tension with Russell’s Principle of Extensionality, according to which propositions can be logically connected with other propositions only as truth-functions. The point is that when we judge a mind-relation (for example, a relation of belief) to a proposition, the latter cannot be determined as true or false. The two most talented pupils of Russell, Wittgenstein and Ramsey, severely criticized the central place propositional attitudes play in Russell’s logic. Wittgenstein analyzed “A believes that p” as “ ‘p’ says p” (5.542). Ramsey criticized Russell’s belief in propositions the other way round: he stressed that belief is an ambiguous term that can be interpreted for the better in the sense of pragmatism.
Prima facie surprisingly, Ramsey maintained that his “pragmatism is derived from Mr Russell” (1927: 51).
Epistemic two-dimensional semantics is a theory in the philosophy of language that provides an account of meaning which is sensitive to the distinction between necessity and apriority. While this theory is usually presented in an informal manner, I take some steps in formalizing it in this paper. To do so, I define a semantics for a propositional modal logic with operators for the modalities of necessity, actuality, and apriority that captures the relevant ideas of epistemic two-dimensional semantics. I also describe some properties of the logic that are interesting from a philosophical perspective, and apply it to the so-called nesting problem.
Benchmarking automated theorem proving (ATP) systems using standardized problem sets is a well-established method for measuring their performance. However, the availability of such libraries for non-classical logics is very limited. In this work we propose a library for benchmarking Girard's (propositional) intuitionistic linear logic. For a quick bootstrapping of the collection of problems, and for discussing the selection of relevant problems and understanding their meaning as linear logic theorems, we use translations of the collection of Kleene's intuitionistic theorems in the traditional monograph "Introduction to Metamathematics". We analyze four different translations of intuitionistic logic into linear logic and compare their proofs using a linear logic based prover with focusing. In order to enhance the set of problems in our library, we apply the three provability-preserving translations to the propositional benchmarks in the ILTP Library. Finally, we generate a comprehensive set of reachability problems for Petri nets and encode such problems as linear logic sequents, thus enlarging our collection of problems.
Suppose that the members of a group each hold a rational set of judgments on some interconnected questions, and imagine that the group itself has to form a collective, rational set of judgments on those questions. How should it go about dealing with this task? We argue that the question raised is subject to a difficulty that has recently been noticed in discussion of the doctrinal paradox in jurisprudence. And we show that there is a general impossibility theorem that that difficulty illustrates. Our paper describes this impossibility result and provides an exploration of its significance. The result naturally invites comparison with Kenneth Arrow's famous theorem (Arrow, 1963 and 1984; Sen, 1970) and we elaborate that comparison in a companion paper (List and Pettit, 2002). The paper is in four sections. The first section documents the need for various groups to aggregate their members' judgments; the second presents the discursive paradox; the third gives an informal statement of the more general impossibility result; the formal proof is presented in an appendix. The fourth section, finally, discusses some escape routes from that impossibility.
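The difficulty the abstract alludes to is easiest to see in the standard three-member illustration of the discursive paradox. The following sketch is a hypothetical illustration (not taken from the paper): proposition-wise majority voting on two premises and their conjunction can produce a collectively inconsistent judgment set even though each member is individually consistent.

```python
# Three members each hold a consistent set of judgments on p, q,
# and (implicitly) on the conjunction p & q.
judges = [
    {"p": True,  "q": True},   # accepts p & q
    {"p": True,  "q": False},  # rejects p & q
    {"p": False, "q": True},   # rejects p & q
]

def majority(values):
    """Proposition-wise majority over a list of booleans."""
    return sum(values) > len(values) / 2

maj_p = majority([j["p"] for j in judges])                 # accepted 2-1
maj_q = majority([j["q"] for j in judges])                 # accepted 2-1
maj_conj = majority([j["p"] and j["q"] for j in judges])   # rejected 2-1

# The collective set {p, q, not (p & q)} is inconsistent:
consistent = (maj_p and maj_q) == maj_conj
print(maj_p, maj_q, maj_conj, consistent)  # True True False False
```

The group accepts both premises by majority yet rejects their conjunction by majority, which is the inconsistency the general impossibility theorem builds on.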
The syllogistic figures and moods can be taken to be argument schemata, as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski's 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano's second-order Induction Axiom is approximated by Herbrand's Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo's second-order Separation Axiom is approximated by Fraenkel's first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components, one of which is a template-text or scheme-template, a syntactic string composed of one or more "blanks" and also possibly significant words and/or symbols. In accordance with a side condition, the template-text of a schema is used as a "template" to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may but need not be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions 'schema' and 'scheme' in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as the ontological and epistemic presuppositions engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective, or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
We approach the topic of solution equivalence of propositional problems from the perspective of a non-constructive procedural theory of problems based on Transparent Intensional Logic (TIL). The answer we put forward is that two solutions are equivalent if and only if they have equivalent solution concepts. Solution concepts can be understood as a generalization of the notion of proof objects from the Curry-Howard isomorphism.
The verb 'to know' can be used both in ascriptions of propositional knowledge and ascriptions of knowledge of acquaintance. In the formal epistemology literature, the former use of 'know' has attracted considerable attention, while the latter is typically regarded as derivative. This attitude may be unsatisfactory for those philosophers who, like Russell, are not willing to think of knowledge of acquaintance as a subsidiary or dependent kind of knowledge. In this paper we outline a logic of knowledge of acquaintance in which ascriptions like 'Mary knows Smith' are regarded as formally interesting in their own right, remaining neutral on their relation to ascriptions of propositional knowledge. The resulting logical framework, which is based on Hintikka's modal approach to epistemic logic, provides a fresh perspective on various issues and notions at play in the philosophical debate on acquaintance.
Propositional logics in general, considered as sets of sentences, can be undecidable even if they have "nice" representations, e.g., are given by a calculus. Even decidable propositional logics can be computationally complex (e.g., already intuitionistic logic is PSPACE-complete). On the other hand, finite-valued logics are computationally relatively simple—at worst NP. Moreover, finite-valued semantics are simple, and general methods for theorem proving exist. This raises the question of the extent to which, and under what circumstances, propositional logics represented in various ways can be approximated by finite-valued logics. It is shown that the minimal m-valued logic for which a given calculus is strongly sound can be calculated. It is also investigated under which conditions propositional logics can be characterized as the intersection of (effectively given) sequences of finite-valued logics.
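The computational simplicity of finite-valued semantics that the abstract points to can be made concrete with a small sketch. This is my illustration, not the paper's construction: a brute-force tautology check for Łukasiewicz 3-valued logic, whose truth values are 0, 1/2 and 1, with 1 the only designated value.

```python
from itertools import product

VALUES = (0.0, 0.5, 1.0)

def neg(a):
    """Lukasiewicz negation: ~a = 1 - a."""
    return 1.0 - a

def imp(a, b):
    """Lukasiewicz implication: a -> b = min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

def tautology(formula, arity):
    """True iff `formula` takes the designated value 1 under every
    assignment of the 3 truth values to its `arity` variables."""
    return all(formula(*vs) == 1.0 for vs in product(VALUES, repeat=arity))

# Excluded middle p v ~p (with a v b = max(a, b)) fails at p = 1/2 ...
print(tautology(lambda p: max(p, neg(p)), 1))  # False
# ... while p -> p holds in every valuation.
print(tautology(lambda p: imp(p, p), 1))       # True
```

For m truth values and n variables, the check visits m**n assignments, which is why tautologyhood in any fixed finite-valued logic stays within NP.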
This article presents modal versions of resource-conscious logics. We concentrate on extensions of variants of linear logic with one minimal non-normal modality. In earlier work, where we investigated agency in multi-agent systems, we have shown that the results scale up to logics with multiple non-minimal modalities. Here, we start with the language of propositional intuitionistic linear logic without the additive disjunction, to which we add a modality. We provide an interpretation of this language on a class of Kripke resource models extended with a neighbourhood function: modal Kripke resource models. We propose a Hilbert-style axiomatisation and a Gentzen-style sequent calculus. We show that the proof theories are sound and complete with respect to the class of modal Kripke resource models. We show that the sequent calculus admits cut elimination and that proof-search is in PSPACE. We then show how to extend the results when non-commutative connectives are added to the language. Finally, we put the l…
Simple partial logic (=SPL) is, broadly speaking, an extensional logic which allows for the truth-value gap. First I give a system of propositional SPL by partializing classical logic, as well as extending it with several non-classical truth-functional operators. Second I show a way based on SPL to construct a system of tensed ontology, by representing tensed statements as two kinds of necessary statements in a linear model that consists of the present and future worlds. Finally I compare that way with two other ways based on Łukasiewicz's three-valued logic and branching temporal logic.
ABSTRACT: Summary presentation of the surviving logic theories of Philo the Dialectician (aka Philo of Megara) and Diodorus Cronus, including some general remarks on propositional logical elements in their logic, a presentation of their theories of the conditional and a presentation of their modal theories, including a brief suggestion for a solution of the Master Argument.
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts). Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable.
The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction. Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items). This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. 
This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation. Phase III: Software Assisted Analysis. The approach in the second phase was very 'manual' and time-consuming; the third phase is intended to 'mechanize' much of the work involved by means of spreadsheets (to begin with). This increases reliability of calculations (though no errors were found, in fact) – but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, now each modus can be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish: its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, has been truly fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions 'mechanically' (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done 'manually', and therefore not exhaustively and with certainty.
It took over 72,000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has now definitely been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
Scroggs's theorem on the extensions of S5 is an early landmark in the modern mathematical studies of modal logics. From it, we know that the lattice of normal extensions of S5 is isomorphic to the inverse order of the natural numbers with infinity and that all extensions of S5 are in fact normal. In this paper, we consider extending Scroggs's theorem to modal logics with propositional quantifiers governed by the axioms and rules analogous to the usual ones for ordinary quantifiers. We call them Π-logics. Taking S5Π, the smallest normal Π-logic extending S5, as the natural counterpart to S5 in Scroggs's theorem, we show that all normal Π-logics extending S5Π are complete with respect to their complete simple S5 algebras, that they form a lattice that is isomorphic to the lattice of the open sets of the disjoint union of two copies of the one-point compactification of N, that they have arbitrarily high Turing-degrees, and that there are non-normal Π-logics extending S5Π.
Well-known results due to David Makinson show that there are exactly two Post complete normal modal logics, that in both of them the modal operator is truth-functional, and that every consistent normal modal logic can be extended to at least one of them. Lloyd Humberstone has recently shown that a natural analog of this result fails in congruential modal logics, by showing that not every congruential modal logic can be extended to one in which the modal operator is truth-functional. As Humberstone notes, the issue of Post completeness in congruential modal logics is not well understood. The present article shows that, in contrast to normal modal logics, the extent of the property of Post completeness among congruential modal logics depends on the background set of logics. Some basic results on the corresponding properties of Post completeness are established, in particular that although a congruential modal logic is Post complete among all modal logics if and only if its modality is truth-functional, there are continuum many modal logics Post complete among congruential modal logics.
This paper discusses the reports in Diogenes Laertius and in Sextus Empiricus concerning the classification of propositions. It is argued that the material in Sextus uses a source going back to the Dialectical school whose most prominent members were Diodorus Cronus and Philo of Megara. The material preserved in Diogenes Laertius, on the other hand, goes back to Chrysippus.