John Corcoran and George Boger. Aristotelian logic and Euclidean geometry. Bulletin of Symbolic Logic. 20 (2014) 131. -/- By an Aristotelian logic we mean any system of direct and indirect deductions, chains of reasoning linking conclusions to premises—complete syllogisms, to use Aristotle’s phrase—1) intended to show that their conclusions follow logically from their respective premises and 2) resembling those in Aristotle’s Prior Analytics. Such systems presuppose the existence of cases where it is not obvious that the conclusion follows (...) from the premises: there must be something deductions can show. Corcoran calls a proposition that follows from given premises a hidden consequence of those premises if it is not obvious that the proposition follows from those premises. By a Euclidean geometry we mean an extended discourse beginning with basic premises—axioms, postulates, definitions—1) treating a universe of geometrical figures and 2) resembling Euclid’s Elements. There were Euclidean geometries before Euclid (fl. 300 BCE), even before Aristotle (384–322 BCE). Bochenski, Lukasiewicz, Patzig, and others never knew this, or if they did, they found it inconvenient to mention. Euclid shows no awareness of Aristotle. It is obvious today—as it should have been obvious in Euclid’s time, if anyone knew both—that Aristotle’s logic was insufficient for Euclid’s geometry: few if any geometrical theorems can be deduced from Euclid’s premises by means of Aristotle’s deductions. Aristotle’s writings don’t say whether his logic is sufficient for Euclidean geometry, and there is not even one fully-presented example. However, Aristotle’s writings do make clear that he endorsed the goal of a sufficient system. Nevertheless, incredible as it seems today, many logicians after Aristotle claimed that Aristotelian logics are sufficient for Euclidean geometries. This paper reviews and analyses such claims by Mill, Boole, De Morgan, Russell, Poincaré, and others. 
It also examines early contrary statements by Hintikka, Mueller, Smith, and others. Special attention is given to the argumentations pro or con and especially to their logical, epistemic, and ontological presuppositions. What methodology is necessary or sufficient to show that a given logic is adequate or inadequate to serve as the underlying logic of a given science? (shrink)
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as previously had been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: First, that it presupposed no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the (...) system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system. (shrink)
We begin with an introductory overview of contributions made by more than twenty scholars associated with the Philosophy Department at the University of Buffalo during the last half-century to our understanding and evaluation of Aristotle's logic. Better-known developments are merely mentioned in…
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate (...) imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible. (shrink)
This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
Prior Analytics by the Greek philosopher Aristotle (384 – 322 BCE) and Laws of Thought by the English mathematician George Boole (1815 – 1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not (...) discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology. (shrink)
I present a reconstruction of the logical system of the Tractatus, which differs from classical logic in two ways. It includes an account of Wittgenstein’s “form-series” device, which suffices to express some effectively generated countably infinite disjunctions. And its attendant notion of structure is relativized to the fixed underlying universe of what is named. -/- There follow three results. First, the class of concepts definable in the system is closed under finitary induction. Second, if the universe of objects (...) is countably infinite, then the property of being a tautology is Π¹₁-complete. But third, it is only granted the assumption of countability that the class of tautologies is Σ₁-definable in set theory. -/- Wittgenstein famously urges that logical relationships must show themselves in the structure of signs. He also urges that the size of the universe cannot be prejudged. The results of this paper indicate that there is no single way in which logical relationships could be held to make themselves manifest in signs, which does not prejudge the number of objects. (shrink)
JUNE 2015 UPDATE: A BIBLIOGRAPHY: JOHN CORCORAN’S PUBLICATIONS ON ARISTOTLE 1972–2015 By John Corcoran -/- This presentation includes a complete bibliography of John Corcoran’s publications relevant to his research on Aristotle’s logic. Sections I, II, III, and IV list 21 articles, 44 abstracts, 3 books, and 11 reviews. It starts with two watershed articles published in 1972: the Philosophy & Phenomenological Research article from Corcoran’s Philadelphia period that antedates his Aristotle studies and the Journal of Symbolic Logic article (...) from his Buffalo period first reporting his original results; it ends with works published in 2015. A few of the items are annotated as listed or with endnotes connecting them with other work and pointing out passages that in retrospect are seen to be misleading and in a few places erroneous. In addition, Section V, “Discussions”, is a nearly complete secondary bibliography of works describing, interpreting, extending, improving, supporting, and criticizing Corcoran’s work: 8 items published in the 1970s, 23 in the 1980s, 42 in the 1990s, 56 in the 2000s, and 69 in the current decade. The secondary bibliography is also annotated as listed or with endnotes: some simply quoting from the cited item, but several answering criticisms and identifying errors. Section VI, “Alternatives”, lists recent works on Aristotle’s logic oblivious of Corcoran’s research and, more generally, of the Lukasiewicz-initiated tradition. As is evident from Section VII, “Acknowledgements”, Corcoran’s publications benefited from consultation with other scholars, most notably Timothy Smiley, Michael Scanlan, Roberto Torretti, and Kevin Tracy. All of Corcoran’s Greek translations were done in collaboration with two or more classicists. Corcoran never published a sentence without discussing it with his colleagues and students. -/- REQUEST: Please send errors, omissions, and suggestions. 
I am especially interested in citations made in non-English publications. Also, let me know of passages I should comment on. (shrink)
For more than fifty years, taxonomists have proposed numerous alternative definitions of species while they searched for a unique, comprehensive, and persuasive definition. This monograph shows that these efforts have been unnecessary, and indeed have provably been a pursuit of a will o’ the wisp because they have failed to recognize the theoretical impossibility of what they seek to accomplish. A clear and rigorous understanding of the logic underlying species definition leads both to a recognition of the inescapable (...) ambiguity that affects the definition of species, and to a framework-relative approach to species definition that is logically compelling, i.e., cannot be rejected without inconsistency. An appendix reflects upon the conclusions reached, applying them in an intellectually whimsical taxonomic thought experiment that conjectures the possibility of an emerging new human species. (shrink)
This review places this translation and commentary on Book A of Prior Analytics in historical, logical, and philosophical perspective. In particular, it details the author’s positions on current controversies. The author of this translation and commentary is a prolific and respected scholar, a leading figure in a large and still rapidly growing area of scholarship: Prior Analytics studies (PAS). PAS treats many aspects of Aristotle’s Prior Analytics: historical context, previous writings that influenced it, preservation and transmission of its manuscripts, editions (...) of its manuscripts, interpretations, commentaries, translations, and its influence on subsequent logic, philosophy, and mathematics. All this attention is warranted because Prior Analytics marks the origin of logic: the field that, among other things, asks of a given proposition whether it follows from a given set of propositions; and, if it follows, how we determine that it follows; and, if it does not follow, how we determine that it does not follow. This translation and commentary is not suitable for use in an undergraduate course. It has too many quirks that the teacher would want to warn against. A copy editor should have dealt with these things and with other matters such as incorrect punctuation and improper end-of-line divisions. The prose is heavily laden with glaring clichés. The one-page preface contains “longer than I care to remember”, “more than I can possibly list here”, “first and foremost”, and “last and by no means least”—a sentence later is devoted to thanking the “incredibly meticulous and helpful copy-editor”. A few pages later the translator reveals the need “to find a path between the Scylla … and the Charybdis …”. Moreover, the index is far from meeting the needs of undergraduate students. The attention to scholarly detail is not what one hoped for from Oxford University Press. 
At 26b10-15, this translation reads “let swan and white be chosen as white things” for what Smith correctly translates “let swan and snow be selected from among those white things”. At 41b16, “angles AB and CD” should read “angles AC and BD”. Despite this book’s flaws, it will be found useful if not indispensable for those currently engaged in Prior Analytics studies. The alternatives suggested to Robin Smith’s translation choices are often worth consideration. It is to be emphasized, however, that this book is unsuitable for those entering Prior Analytics studies. (shrink)
We provide a direct method for proving Craig interpolation for a range of modal and intuitionistic logics, including those containing a "converse" modality. We demonstrate this method for classical tense logic, its extensions with path axioms, and for bi-intuitionistic logic. These logics do not have straightforward formalisations in the traditional Gentzen-style sequent calculus, but have all been shown to have cut-free nested sequent calculi. The proof of the interpolation theorem uses these calculi and is purely syntactic, without resorting (...) to embeddings, semantic arguments, or interpreted connectives external to the underlying logical language. A novel feature of our proof includes an orthogonality condition for defining duality between interpolants. (shrink)
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by (...) typicality inclusions of the form p :: T(C) v D, whose intuitive meaning is that “we believe with degree p that typical Cs are Ds”. As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as it is for the underlying standard Description Logic ALC. (shrink)
It is hereby claimed that, against a current view of logic as a theory of consequence, opposition is a basic logical concept that can be used to define consequence itself. This requires some substantial changes in the underlying framework, including: a non-Fregean semantics of questions and answers, instead of the usual truth-conditional semantics; an extension of opposition to a relation between arbitrary structured objects; and a definition of oppositions in terms of basic negation. Objections to this claim will be (...) reviewed. (shrink)
In their paper ‘Nothing but the Truth’, Andreas Pietz and Umberto Rivieccio present Exactly True Logic (ETL), an interesting variation upon the four-valued logic for first-degree entailment (FDE) that was given by Belnap and Dunn in the 1970s. Pietz & Rivieccio provide this logic with a Hilbert-style axiomatisation and write that finding a nice sequent calculus for the logic will presumably not be easy. But a sequent calculus can be given and in this paper we will show (...) that a calculus for the Belnap-Dunn logic we have defined earlier can in fact be reused for the purpose of characterising ETL, provided a small alteration is made—initial assignments of signs to the sentences of a sequent to be proved must be different from those used for characterising FDE. While Pietz & Rivieccio define ETL on the language of classical propositional logic we also study its consequence relation on an extension of this language that is functionally complete for the underlying four truth values. On this extension the calculus gets a multiple-tree character—two proof trees may be needed to establish one proof. (shrink)
Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator εx. Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions (...) of existential theorems obtained by this elimination procedure. (shrink)
Commentators have not said much regarding Berkeley and Stoicism. Even when they do, they generally limit their remarks to Berkeley’s Siris (1744) where he invokes characteristically Stoic themes about the World Soul, “seminal reasons,” and the animating fire of the universe. The Stoic heritage of other Berkeleian doctrines (e.g., about mind or the semiotic character of nature) is seldom recognized, and when it is, little is made of it in explaining his other doctrines (e.g., immaterialism). None of this is surprising, (...) considering how Stoics are considered arch-materialists and determinists. My aim is to suggest that our understanding of Berkeley’s philosophy is improved significantly by acknowledging its underlying Stoic character. I argue that Berkeley proposes not only a semantic ontology based on assumptions of Stoic logic but also a doctrine in which perceptions or ideas are intelligible precisely because they are always embedded in the propositions of a discourse or language. (shrink)
Arthur Danto’s recent book, Andy Warhol, leads the reader through the story of the iconic American’s artistic life highlighted by a philosophical commentary, a commentary that merges Danto’s aesthetic theory with the artist himself. Inspired by Warhol’s Brillo Box installation, art that in Danto’s eyes was indiscernible from the everyday boxes it represented, Danto developed a theory that is able to differentiate art from non-art by employing the body of conceptual art theory manifest in what he termed the ‘artworld’. The (...) strength of Danto’s theory is found in its ability to explain the art of the post-modern era. His body of work weaves philosophy, art history and art criticism together, merging his aesthetic philosophy with his extensive knowledge of the world of art. Danto’s essentialist theory of embodied meaning provides him with a critical tool that succeeds in explaining the currents of contemporary art, a task that many great thinkers of art history were unable to do. If Warhol inspired Danto to create a philosophy of art, it is appropriate that Danto write a tribute to Warhol that traces how Warhol brought philosophy into art. Danto’s account of ‘Warhol as philosopher’ positions him as a pivotal figure in the history of twentieth-century art, effecting a sea change in how art was made and viewed. Warhol achieved this by conceiving of works that embodied the answers to a series of philosophical puzzles surrounding the nature of art. Warhol, as Danto describes him, manifests himself in his art because he had transformed himself, in a way, into an icon of the times. This pragmatist notion that art should undermine the dichotomies that exist between art and life would, by some accounts, position Warhol to be the philosopher that Danto claims him to be, for he dissolved the philosophical questions posed by late modern aesthetic thinkers by creating art that imploded the accepted notions of art at the time. 
One of Danto’s greatest contributions to aesthetics is his theory’s ability to distinguish art from non-art, recognizing that it is the artist’s intention that levels the sublimity of art into the commonplace, thereby transfiguring the everyday. However, acknowledging this achievement, I argue that Warhol’s philosophical contribution actually manifests itself in a manner different from that proposed by Danto. Danto maintains that the internal drive of art leads to the unfolding of art theoretical concepts that ineluctably shift the terrain of the world of art. I would agree with Danto that Warhol, almost as Hegel viewed Napoleon as Geist on a horse, pushed forward the boundaries of art through the actualization of art’s internal drive. But I would disagree that the conceptual nature of art is one that unfolds merely as a relation of concepts that artists trace through a connection to the meaning of history they forge using their unmediated grasp of style. Rather, I would argue that the artist’s style is not bound so narrowly to the meanings they express. Through their aesthetic articulations, artists initiate a process of social interaction. This process employs, indirectly, the philosophical logic which Danto attributes to Warhol, and through it transfigures the vocabulary of art—the concepts of the artworld—by superseding the language of modernism. Warhol’s philosophical contribution is seen in his mastery of both the medium of art and the underlying logic of the medium’s expression and reception. (shrink)
There has been a recent surge of work on deontic modality within philosophy of language. This work has put the deontic logic tradition in contact with natural language semantics, resulting in a significant increase in sophistication on both ends. This chapter surveys the main motivations, achievements, and prospects of this work.
Composition as identity, as I understand it, is a theory of the composite structure of reality. The theory’s underlying logic is irreducibly plural; its fundamental primitive is a generalized identity relation that takes either plural or singular arguments. Strong versions of the theory that incorporate a generalized version of the indiscernibility of identicals are incompatible with the framework of plural logic, and should be rejected. Weak versions of the theory that are based on the idea that composition (...) is merely analogous to identity are too weak to be interesting, lacking in metaphysical consequence. I defend a moderate version according to which composition is a kind of identity, and argue that the difference is metaphysically substantial, not merely terminological. I then consider whether the notion of generalized identity, though fundamental, can be elucidated in modal terms by reverse engineering Hume’s Dictum. Unfortunately, for realists about possible worlds, such as myself,... (shrink)
We introduce an effective translation from proofs in the display calculus to proofs in the labelled calculus in the context of tense logics. We identify the labelled calculus proofs in the image of this translation as those built from labelled sequents whose underlying directed graph possesses certain properties. For the basic normal tense logic Kt, the image is shown to be the set of all proofs in the labelled calculus G3Kt.
Debunking arguments—also known as etiological arguments, genealogical arguments, access problems, isolation objections, and reliability challenges—arise in philosophical debates about a diverse range of topics, including causation, chance, color, consciousness, epistemic reasons, free will, grounding, laws of nature, logic, mathematics, modality, morality, natural kinds, ordinary objects, religion, and time. What unifies the arguments is the transition from a premise about what does or doesn't explain why we have certain mental states to a negative assessment of their epistemic status. I (...) examine the common, underlying structure of the arguments and the different strategies for motivating and resisting the premises of debunking arguments. (shrink)
Kant’s account of the sublime makes frequent appeals to infinity, appeals which have been extensively criticised by commentators such as Budd and Crowther. This paper examines the costs and benefits of reconstructing the account in finitist terms. On the one hand, drawing on a detailed comparison of the first and third Critiques, I argue that the underlying logic of Kant’s position is essentially finitist. I defend the approach against longstanding objections, as well as addressing recent infinitist work by (...) Moore and Smith. On the other hand, however, I argue that finitism faces distinctive problems of its own: whilst the resultant theory is a coherent and interesting one, it is unclear in what sense it remains an analysis of the sublime. I illustrate the worry by juxtaposing the finitist reading with analytical cubism. (shrink)
I develop and defend a truthmaker semantics for the relevant logic R. The approach begins with a simple philosophical idea and develops it in various directions, so as to build a technically adequate relevant semantics. The central philosophical idea is that truths are true in virtue of specific states. Developing the idea formally results in a semantics on which truthmakers are relevant to what they make true. A very natural notion of conditionality is added, giving us relevant implication. I (...) then investigate ways to add conjunction, disjunction, and negation; and I discuss how to justify contraposition and excluded middle within a truthmaker semantics. (shrink)
Two systems of belief change based on paraconsistent logics are introduced in this article by means of AGM-like postulates. The first one, AGMp, is defined over any paraconsistent logic which extends classical logic such that the law of excluded middle holds w.r.t. the paraconsistent negation. The second one, AGMo, is specifically designed for paraconsistent logics known as Logics of Formal Inconsistency (LFIs), which have a formal consistency operator that allows one to recover all the classical inferences. Besides the (...) three usual operations over belief sets, namely expansion, contraction and revision (which is obtained from contraction by the Levi identity), the underlying paraconsistent logic allows us to define additional operations involving (non-explosive) contradictions. Thus, external revision (which is obtained from contraction by the reverse Levi identity), consolidation, and semi-revision are defined, all of them over belief sets. It is worth noting that the latter operations, introduced by S. Hansson, involve the temporary acceptance of contradictory beliefs, and so they were originally defined only for belief bases. Unlike previous proposals in the literature, which are defined only for specific paraconsistent logics, the present approach can be applied to a general class of paraconsistent logics which are supraclassical, thus preserving the spirit of AGM. Moreover, representation theorems w.r.t. constructions based on selection functions are obtained for all the operations. (shrink)
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established (...) to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. -/- Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately. 
-/- However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion of Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. -/- The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. From the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, also second-order logic and Saul Kripke's approach using Kleenean logic will be shown to fail in a similar fashion. (shrink)
According to Kwame Nkrumah, the conscience of the African society is plagued with three strands of influences which have competing and conflicting ideologies: “African society has one segment which comprises our traditional way of life; it has a second segment which is filled by the presence of the Islamic tradition in Africa; it has a final segment which represents the infiltration of the Christian tradition and culture of Western Europe into Africa, using colonialism and neocolonialism as its primary vehicles.” When (...) these three segments with their conflicting ideologies are allowed to co-exist, the African society “will be racked by the most malignant schizophrenia.” Nkrumah’s solution, philosophical consciencism, presents an ideology aimed at achieving a harmony among the three segments in such a way that is “in tune with the original humanist principles underlying African society.” I do two main things in this paper: first, I present an analysis and critique of Nkrumah’s understanding of how the harmony is to be achieved in African societies; and second, I show how the theoretical ideas of philosophical consciencism – materialism, dialectical change, categorial conversion, socialism – are given actual form and content on the social-political scene through an analysis of Nkrumah's set-theoretic terms. (shrink)
“Slingshot arguments” are a family of arguments underlying the Fregean view that if sentences have reference at all, their references are their truth-values. Usually seen as a kind of collapsing argument, the slingshot consists in proving that, once you suppose that there are some items that are references of sentences (such as facts or situations), these items collapse into just two items: The True and The False. This dissertation treats the slingshot dubbed “Gödel’s slingshot”. Gödel argued that there is a deep connection between these arguments and definite descriptions. More precisely, according to Gödel, if one adopts Russell’s interpretation of definite descriptions (which clashes with Frege’s view that definite descriptions are singular terms), it is possible to evade the slingshot. We challenge Gödel’s view in two ways: first, by presenting a slingshot even under a Russellian interpretation of definite descriptions; and second, by presenting a slingshot even when we change from singular terms to plural terms in the light of new developments in the so-called Plural Logic. The text is divided into three chapters: in the first, we present the discussion between Russell and Frege regarding definite descriptions; in the second, we present Gödel’s position and reconstructions of Gödel’s argument; and in the third, we prove our slingshot argument for Plural Logic. In light of these results we conclude that we can maintain the validity of slingshot arguments even within a Russellian interpretation of definite descriptions or in the context of Plural Logic.
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts—all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives—explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion—which is, I argue, substantially correct—that imperatives could not have propositions, or truth conditions, as their meanings.
ABSTRACT: Intuitionistic logic provides an elegant solution to the Sorites Paradox. Its acceptance has been hampered by two factors. First, the lack of an accepted semantics for languages containing vague terms has led even philosophers sympathetic to intuitionism to complain that no explanation has been given of why intuitionistic logic is the correct logic for such languages. Second, switching from classical to intuitionistic logic, while it may help with the Sorites, does not appear to offer any advantages when dealing with the so-called paradoxes of higher-order vagueness. We offer a proposal that makes strides on both issues. We argue that the intuitionist’s characteristic rejection of any third alethic value alongside true and false is best elaborated by taking the normal modal system S4M to be the sentential logic of the operator ‘it is clearly the case that’. S4M opens the way to an account of higher-order vagueness which avoids the paradoxes that have been thought to infect the notion. S4M is one of the modal counterparts of the intuitionistic sentential calculus (IPC) and we use this fact to explain why IPC is the correct sentential logic to use when reasoning with vague statements. We also show that our key results go through in an intuitionistic version of S4M. Finally, we deploy our analysis to reply to Timothy Williamson’s objections to intuitionistic treatments of vagueness.
The five English words—sentence, proposition, judgment, statement, and fact—are central to coherent discussion in logic. However, each is ambiguous in that logicians use each with multiple normal meanings. Several of their meanings are vague in the sense of admitting borderline cases. In the course of displaying and describing the phenomena discussed using these words, this paper juxtaposes, distinguishes, and analyzes several senses of these and related words, focusing on a constellation of recommended senses. One of the purposes of this paper is to demonstrate that ordinary English properly used has the resources for intricate and philosophically sound investigation of rather deep issues in logic and philosophy of language. No mathematical, logical, or linguistic symbols are used. Meanings need to be identified and clarified before being expressed in symbols. We hope to establish that clarity is served by deferring the extensive use of formalized or logically perfect languages until a solid “informal” foundation has been established. Questions of “ontological status”—e.g., whether propositions or sentences, or for that matter characters, numbers, truth-values, or instants, are “real entities”, are “idealizations”, or are “theoretical constructs”—play no role in this paper. As is suggested by the title, this paper is written to be read aloud.

I hope that reading this aloud in groups will unite people in the enjoyment of the humanistic spirit of analytic philosophy.
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality—which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
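The logical notion of entropy mentioned in this abstract admits a compact illustration. The sketch below (function and variable names are ours, not the paper's) computes the logical entropy of a partition as the probability that two independent equiprobable draws from the universe set fall in distinct blocks:

```python
from fractions import Fraction

def logical_entropy(partition, universe):
    """Logical entropy h(pi) = 1 - sum((|B|/|U|)^2): the probability that
    two independent equiprobable draws from U land in different blocks."""
    n = len(universe)
    return 1 - sum(Fraction(len(block), n) ** 2 for block in partition)

U = {1, 2, 3, 4}
indiscrete = [{1, 2, 3, 4}]        # one block: no distinctions made
discrete = [{1}, {2}, {3}, {4}]    # all singletons: every pair distinguished

assert logical_entropy(indiscrete, U) == 0
assert logical_entropy(discrete, U) == Fraction(3, 4)
```

On the indiscrete partition no pair of draws is ever distinguished, so the entropy is 0; on the discrete partition of a four-element set it is 1 - 4·(1/4)² = 3/4. Refining a partition can only add distinctions, so entropy never decreases under refinement.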
An exact truthmaker for A is a state which, as well as guaranteeing A’s truth, is wholly relevant to it. States with parts irrelevant to whether A is true do not count as exact truthmakers for A. Giving semantics in this way produces a very unusual consequence relation, on which conjunctions do not entail their conjuncts. This feature makes the resulting logic highly unusual. In this paper, we set out formal semantics for exact truthmaking and characterise the resulting notion of entailment, showing that it is compact and decidable. We then investigate the effect of various restrictions on the semantics. We also formulate a sequent-style proof system for exact entailment and give soundness and completeness results.
We reconsider the pragmatic interpretation of intuitionistic logic [21] regarded as a logic of assertions and their justifications and its relations with classical logic. We recall an extension of this approach to a logic dealing with assertions and obligations, related by a notion of causal implication [14, 45]. We focus on the extension to co-intuitionistic logic, seen as a logic of hypotheses [8, 9, 13], and on polarized bi-intuitionistic logic as a logic of assertions and conjectures: looking at the S4 modal translation, we give a definition of a system AHL of bi-intuitionistic logic that correctly represents the duality between intuitionistic and co-intuitionistic logic, correcting a mistake in previous work [7, 10]. A computational interpretation of co-intuitionism as a distributed calculus of coroutines is then used to give an operational interpretation of subtraction. Work on linear co-intuitionism is then recalled, a linear calculus of co-intuitionistic coroutines is defined, and a probabilistic interpretation of linear co-intuitionism is given as in [9]. We also remark that by extending the language of intuitionistic logic we can express the notion of expectation, an assertion that in all situations the truth of p is possible, and that in a logic of expectations the law of double negation holds. Similarly, extending co-intuitionistic logic, we can express the notion of conjecture that p, defined as a hypothesis that in some situation the truth of p is epistemically necessary.
This paper is concerned with a propositional modal logic with operators for necessity, actuality and apriority. The logic is characterized by a class of relational structures defined according to ideas of epistemic two-dimensional semantics, and can therefore be seen as formalizing the relations between necessity, actuality and apriority according to epistemic two-dimensional semantics. We can ask whether this logic is correct, in the sense that its theorems are all and only the informally valid formulas. This paper gives outlines of two arguments that jointly show that this is the case. The first is intended to show that the logic is informally sound, in the sense that all of its theorems are informally valid. The second is intended to show that it is informally complete, in the sense that all informal validities are among its theorems. In order to give these arguments, a number of independently interesting results concerning the logic are proven. In particular, the soundness and completeness of two proof systems with respect to the semantics is proven (Theorems 2.11 and 2.15), as well as a normal form theorem (Theorem 3.2), an elimination theorem for the actuality operator (Corollary 3.6), and the decidability of the logic (Corollary 3.7). It turns out that the logic invalidates a plausible principle concerning the interaction of apriority and necessity; consequently, a variant semantics is briefly explored on which this principle is valid. The paper concludes by assessing the implications of these results for epistemic two-dimensional semantics.
The rather unrestrained use of second-order logic in the neo-logicist program is critically examined. It is argued in some detail that it brings with it genuine set-theoretical existence assumptions and that the mathematical power that Hume’s Principle seems to provide, in the derivation of Frege’s Theorem, comes largely from the ‘logic’ assumed rather than from Hume’s Principle. It is shown that Hume’s Principle is in reality not stronger than the very weak Robinson Arithmetic Q. Consequently, only a few rudimentary facts of arithmetic are logically derivable from Hume’s Principle. And that hardly counts as a vindication of logicism.
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names": a special kind of propositional variables ranging over singleton sets of worlds. The resulting language ℒc turns out to have great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ([≠]) of ℒ, where [≠] is an additional "difference" modality with the semantics x ⊧ [≠]φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further we consider deductive systems in ℒc. Strong completeness of the normal ℒc-logics is proved with respect to models in which all worlds are named. Every ℒc-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning transfer of properties ([in]completeness, filtration, finite model property, etc.) from ℒ to ℒc are discussed. Finally, further perspectives for names in multimodal environments are briefly sketched.
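The two semantic clauses in this abstract can be put side by side in a minimal model-checking sketch (the toy model and all function names below are illustrative, not notation from the paper): the universal modality quantifies over all worlds, while the difference modality skips only the evaluation world:

```python
# A toy three-world Kripke model for the two modalities.
worlds = {"w1", "w2", "w3"}
val = {"p": {"w1", "w2", "w3"}, "q": {"w2", "w3"}}

def holds(prop, w):
    return w in val[prop]

def box_universal(prop, w):
    """x |= [u]phi  iff  phi holds at every world (w plays no role)."""
    return all(holds(prop, y) for y in worlds)

def box_difference(prop, w):
    """x |= [!=]phi  iff  phi holds at every world y distinct from x."""
    return all(holds(prop, y) for y in worlds if y != w)

assert box_universal("p", "w1")       # p is true everywhere
assert not box_universal("q", "w1")   # q fails at w1
assert box_difference("q", "w1")      # but q holds at every world other than w1
assert not box_difference("q", "w2")  # q fails at w1, a world distinct from w2
```

Note that the universal modality is definable from the difference modality: ■φ is true at x exactly when φ ∧ [≠]φ is, which is one direction of the comparability in definability mentioned above.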
George Boole emerged from the British tradition of the “New Analytic”, known for the view that the laws of logic are laws of thought. Logicians in the New Analytic tradition were influenced by the work of Immanuel Kant, and by the German logicians Wilhelm Traugott Krug and Wilhelm Esser, among others. In his 1854 work An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities, Boole argues that the laws of thought acquire normative force when constrained to mathematical reasoning. Boole’s motivation is, first, to address issues in the foundations of mathematics, including the relationship between arithmetic and algebra, and the study and application of differential equations (Durand-Richard, van Evra, Panteki). Second, Boole intended to derive the laws of logic from the laws of the operation of the human mind, and to show that these laws were valid of algebra and of logic both, when applied to a restricted domain. Boole’s thorough and flexible work in these areas influenced the development of model theory (see Hodges, forthcoming), and has much in common with contemporary inferentialist approaches to logic (found in, e.g., Peregrin and Resnik).
“Second-order Logic” in Anderson, C.A. and Zeleny, M., eds., Logic, Meaning, and Computation: Essays in Memory of Alonzo Church. Dordrecht: Kluwer, 2001. Pp. 61–76.

Abstract. This expository article focuses on the fundamental differences between second-order logic and first-order logic. It is written entirely in ordinary English without logical symbols. It employs second-order propositions and second-order reasoning in a natural way to illustrate the fact that second-order logic is actually a familiar part of our traditional intuitive logical framework and that it is not an artificial formalism created by specialists for technical purposes. To illustrate some of the main relationships between second-order logic and first-order logic, this paper introduces basic logic, a kind of zero-order logic, which is more rudimentary than first-order and which is transcended by first-order in the same way that first-order is transcended by second-order. The heuristic effectiveness and the historical importance of second-order logic are reviewed in the context of the contemporary debate over the legitimacy of second-order logic. Rejection of second-order logic is viewed as radical: an incipient paradigm shift involving radical repudiation of a part of our scientific tradition, a tradition that is defended by classical logicians. But it is also viewed as reactionary: as being analogous to the reactionary repudiation of symbolic logic by supporters of “Aristotelian” traditional logic. But even if “genuine” logic comes to be regarded as excluding second-order reasoning, which seems less likely today than fifty years ago, its effectiveness as a heuristic instrument will remain and its importance for understanding the history of logic and mathematics will not be diminished. Second-order logic may someday be gone, but it will never be forgotten.
Technical formalisms have been avoided entirely in an effort to reach a wide audience, but every effort has been made to limit the inevitable sacrifice of rigor. People who do not know second-order logic cannot understand the modern debate over its legitimacy and are cut off from the heuristic advantages of second-order logic. And, what may be worse, they are cut off from an understanding of the history of logic and are thus constrained to have distorted views of the nature of the subject. As Aristotle first said, we do not understand a discipline until we have seen its development. It is a truism that a person's conceptions of what a discipline is and of what it can become are predicated on their conception of what it has been.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
In the paper we present a formal system motivated by a specific methodology of creating norms. According to the methodology, a norm-giver, before establishing a set of norms, should create a picture of the agent by creating his repertoire of actions. Then, knowing what the agent can do in particular situations, the norm-giver regulates these actions by assigning deontic qualifications to each of them. The set of norms created for each situation should respect (1) generally valid deontic principles, being the theses of our logic, and (2) facts from the ontology of action whose relevance for the systems of norms we postulate.
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
In the present paper we propose a system of propositional logic for reasoning about justification, truthmaking, and the connection between justifiers and truthmakers. The logic of justification and truthmaking is developed according to the fundamental ideas introduced by Artemov. Justifiers and truthmakers are treated in a similar way, exploiting the intuition that justifiers provide epistemic grounds for propositions to be considered true, while truthmakers provide ontological grounds for propositions to be true. This system of logic is then applied both for interpreting the notorious definition of knowledge as justified true belief and for advancing a new solution to Gettier counterexamples to this standard definition.
Epistemic logics based on the possible worlds semantics suffer from the problem of logical omniscience, whereby agents are described as knowing all logical consequences of what they know, including all tautologies. This problem is doubly challenging: on the one hand, agents should be treated as logically non-omniscient, and on the other hand, as moderately logically competent. Many responses to logical omniscience fail to meet this double challenge because the concepts of knowledge and reasoning are not properly separated. In this paper, I present a dynamic logic of knowledge that models an agent's epistemic state as it evolves over the course of reasoning. I show that the logic does not sacrifice logical competence on the altar of logical non-omniscience.
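The idea of an epistemic state that evolves over the course of reasoning can be pictured in a few lines. The following is a toy illustration of gradual closure only, not the paper's actual system; the formula encoding and all names are ours. Each reasoning step applies modus ponens once to the formulas currently known, so the agent is logically competent without being omniscient:

```python
def reason_step(known):
    """Apply modus ponens once to everything currently known.
    Implications are encoded as ("->", antecedent, consequent)."""
    derived = set(known)
    for f in known:
        if isinstance(f, tuple) and f[0] == "->" and f[1] in known:
            derived.add(f[2])
    return derived

known = {"p", ("->", "p", "q"), ("->", "q", "r")}
step1 = reason_step(known)
step2 = reason_step(step1)

assert "q" in step1 and "r" not in step1   # competent, but not yet omniscient
assert "r" in step2                        # a further step closes the gap
```

After the first step the agent knows q but not yet r, even though r follows from what it knows; only a second step derives r. Iterating to a fixed point would recover full closure under modus ponens.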
Priest has provided a simple tableau calculus for Chellas's conditional logic Ck. We provide rules which, when added to Priest's system, result in tableau calculi for Chellas's CK and Lewis's VC. Completeness of these tableaux, however, relies on the cut rule.
It is often said that ‘every logical truth is obvious’ (Quine 1970: 82), that the ‘axioms and rules of logic are true in an obvious way’ (Murawski 2014: 87), or that ‘logic is a theory of the obvious’ (Sher 1999: 207). In this chapter, I set out to test empirically how the idea that logic is obvious is reflected in the scholarly work of logicians and philosophers of logic. My approach is data-driven. That is to say, I propose that systematically searching for patterns of usage in databases of scholarly works, such as JSTOR, can provide new insights into the ways in which the idea that logic is obvious is reflected in logical and philosophical practice, i.e., in the arguments that logicians and philosophers of logic actually make in their published work.
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
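The subset/partition duality described here can be made concrete in a small sketch (all names below are ours, not the paper's notation): where subsets of a universe are ordered by inclusion, partitions are ordered by refinement, and the join of two partitions is formed from the nonempty intersections of their blocks:

```python
# Partitions are represented as sets of disjoint blocks (frozensets).
def join(pi, sigma):
    """The join of pi and sigma: blocks are the nonempty intersections B & C."""
    return {B & C for B in pi for C in sigma if B & C}

def refines(pi, sigma):
    """pi refines sigma iff every block of pi fits inside some block of sigma."""
    return all(any(B <= C for C in sigma) for B in pi)

pi = {frozenset({1, 2}), frozenset({3, 4})}
sigma = {frozenset({1, 3}), frozenset({2, 4})}

# The join makes every distinction that either partition makes.
assert join(pi, sigma) == {frozenset({1}), frozenset({2}),
                           frozenset({3}), frozenset({4})}
assert refines(join(pi, sigma), pi) and refines(join(pi, sigma), sigma)
assert not refines(pi, sigma)   # pi and sigma are incomparable
```

Here two incomparable two-block partitions join to the discrete partition, the top of the refinement order, mirroring how the union of two subsets is their least upper bound under inclusion.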
In this paper I introduce a sequent system for the propositional modal logic S5. Derivations of valid sequents in the system are shown to correspond to proofs in a novel natural deduction system of circuit proofs (reminiscent of proofnets in linear logic, or multiple-conclusion calculi for classical logic).

The sequent derivations and proofnets are both simple extensions of sequents and proofnets for classical propositional logic, in which the new machinery—to take account of the modal vocabulary—is directly motivated in terms of the simple, universal Kripke semantics for S5. The sequent system is cut-free and the circuit proofs are normalising.