forall x: Calgary is a full-featured textbook on formal logic. It covers key notions of logic such as consequence and validity of arguments, the syntax of truth-functional propositional logic TFL and truth-table semantics, the syntax of first-order (predicate) logic FOL with identity (first-order interpretations), translating (formalizing) English in TFL and FOL, and Fitch-style natural deduction proof systems for both TFL and FOL. It also deals with some advanced topics such as truth-functional completeness and modal logic. Exercises with solutions are available. It is provided in PDF (for screen reading, printing, and a special version for dyslexics) and in LaTeX source code.
While not focusing on the history of classical logic, this book discusses and quotes central passages on its origins and development, mainly from a philosophical perspective. While not a book in mathematical logic, it takes formal logic from an essentially mathematical perspective. Biased towards a computational approach, with SAT and VAL as its backbone, this is an introduction to logic that covers essential aspects of the three branches of logic, to wit, philosophical, mathematical, and computational.
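The computational backbone mentioned here, the decision problems SAT (satisfiability) and VAL (validity), can be illustrated with a brute-force truth-table check. The tuple encoding of formulas below is a hypothetical illustration, not the book's own notation:

```python
from itertools import product

def atoms(formula):
    """Collect the propositional atoms occurring in a formula.
    Formulas are nested tuples: ('atom', 'p'), ('not', f),
    ('and', f, g), ('or', f, g), ('implies', f, g)."""
    if formula[0] == 'atom':
        return {formula[1]}
    return set().union(*(atoms(sub) for sub in formula[1:]))

def evaluate(formula, valuation):
    """Compute the truth value of a formula under a valuation (dict atom -> bool)."""
    op = formula[0]
    if op == 'atom':
        return valuation[formula[1]]
    if op == 'not':
        return not evaluate(formula[1], valuation)
    if op == 'and':
        return evaluate(formula[1], valuation) and evaluate(formula[2], valuation)
    if op == 'or':
        return evaluate(formula[1], valuation) or evaluate(formula[2], valuation)
    if op == 'implies':
        return (not evaluate(formula[1], valuation)) or evaluate(formula[2], valuation)
    raise ValueError(f"unknown connective: {op}")

def sat(formula):
    """SAT: is the formula true under at least one valuation?"""
    names = sorted(atoms(formula))
    return any(evaluate(formula, dict(zip(names, row)))
               for row in product([False, True], repeat=len(names)))

def val(formula):
    """VAL: is the formula true under every valuation?
    A formula is valid exactly when its negation is unsatisfiable."""
    return not sat(('not', formula))

p = ('atom', 'p')
print(sat(('and', p, ('not', p))))   # contradiction: False
print(val(('or', p, ('not', p))))    # excluded middle: True
```

Note how `val` is defined via `sat`: a formula is valid exactly when its negation has no satisfying valuation.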
An introduction to sentential logic and first-order predicate logic with identity, logical systems that significantly influenced twentieth-century analytic philosophy. After working through the material in this book, a student should be able to understand most quantified expressions that arise in their philosophical reading.

This book treats symbolization, formal semantics, and proof theory for each language. The discussion of formal semantics is more direct than in many introductory texts. Although forall x does not contain proofs of soundness and completeness, it lays the groundwork for understanding why these are things that need to be proven.

The book highlights the choices involved in developing sentential and predicate logic. Students should realize that these two are not the only possible formal languages. In translating to a formal language, we simplify and gain clarity. The simplification comes at a cost, and different formal languages are suited to translating different parts of natural language.
A dialectical contradiction can be appropriately described within the framework of classical formallogic. It is in harmony with the law of noncontradiction. According to our definition, two theories make up a dialectical contradiction if each of them is consistent and their union is inconsistent. It can happen that each of these two theories has an intended model. Plenty of examples are to be found in the history of science.
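The definition can be checked mechanically in a toy case: let T1 assert p and T2 assert not-p. Each theory has a model (so each is consistent), while their union has none. A minimal brute-force sketch, with an illustrative clause encoding not drawn from the paper:

```python
from itertools import product

# A "theory" is a list of clauses; a clause is a set of literals.
# A literal is (atom_name, polarity): ('p', True) means p,
# ('p', False) means not-p.

def consistent(theory):
    """Brute-force check: does some valuation satisfy every clause?"""
    names = sorted({name for clause in theory for name, _ in clause})
    for row in product([False, True], repeat=len(names)):
        valuation = dict(zip(names, row))
        if all(any(valuation[name] == polarity for name, polarity in clause)
               for clause in theory):
            return True
    return False

T1 = [{('p', True)}]    # theory asserting p
T2 = [{('p', False)}]   # theory asserting not-p

print(consistent(T1))        # True: p has a model
print(consistent(T2))        # True: not-p has a model
print(consistent(T1 + T2))   # False: the union is inconsistent
```

Here T1 and T2 play the role of the two theories in the definition: each is satisfiable on its own, but their union is not, so the pair forms a dialectical contradiction in the stated sense.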
Revised version of chapter in J. N. Mohanty and W. McKenna (eds.), Husserl’s Phenomenology: A Textbook, Lanham: University Press of America, 1989, 29–67.

Logic for Husserl is a science of science, a science of what all sciences have in common in their modes of validation. Thus logic deals with universal laws relating to truth, to deduction, to verification and falsification, and with laws relating to theory as such, and to what makes for theoretical unity, both on the side of the propositions of a theory and on the side of the domain of objects to which these propositions refer. This essay presents a systematic overview of Husserl’s views on these matters as put forward in his Logical Investigations. It shows how Husserl’s theory of linguistic meanings as species of mental acts, his formal ontology of part, whole and dependence, his theory of meaning categories, and his theory of categorial intuition combine with his theory of science to form a single whole. Finally, it explores the ways in which Husserl’s ideas on these matters can be put to use in solving problems in the philosophy of language, logic and mathematics in a way which does justice to the role of mental activity in each of these domains while at the same time avoiding the pitfalls of psychologism.
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelean syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities.
With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
A collection of material on Husserl's Logical Investigations, and specifically on Husserl's formal theory of parts, wholes and dependence and its influence in ontology, logic and psychology. Includes translations of classic works by Adolf Reinach and Eugenie Ginsberg, as well as original contributions by Wolfgang Künne, Kevin Mulligan, Gilbert Null, Barry Smith, Peter M. Simons, Roger A. Simons and Dallas Willard. Documents work on Husserl's ontology arising out of early meetings of the Seminar for Austro-German Philosophy.
Basic Formal Ontology (BFO) is a top-level ontology used in hundreds of active projects in scientific and other domains. BFO has been selected to serve as top-level ontology in the Industrial Ontologies Foundry (IOF), an initiative by representatives of several branches of the advanced manufacturing industries to create a suite of ontologies to support digital manufacturing. We here present a first draft set of axioms and definitions of an IOF upper ontology descending from BFO. The axiomatization is designed to capture the meanings of terms commonly used in manufacturing and to serve as a starting point for the construction of the IOF ontology suite.
The current resurgence of interest in cognition and in the nature of cognitive processing has brought with it also a renewed interest in the early work of Husserl, which contains one of the most sustained attempts to come to grips with the problems of logic from a cognitive point of view. Logic, for Husserl, is a theory of science; but it is a theory which takes seriously the idea that scientific theories are constituted by the mental acts of cognitive subjects. The present essay begins with an exposition of Husserl's act-based conception of what a science is, and goes on to consider his account of the role of linguistic meanings, of the ontology of scientific objects, and of evidence and truth. The essay concentrates almost exclusively on the Logical Investigations of 1900/01. This is not only because this work, which is surely Husserl's single most important masterpiece, has been overshadowed first of all by his Ideas I and then later by the Crisis. It is also because the Investigations contain, in a peculiarly clear and pregnant form, a whole panoply of ideas on logic and cognitive theory which either simply disappeared in Husserl's own later writings or became obfuscated by an admixture of that great mystery which is 'transcendental phenomenology'.
This textbook has developed over the last few years of teaching introductory symbolic logic and critical thinking courses. It has been truly a pleasure to have benefited from such great students and colleagues over the years. As we have become increasingly frustrated with the costs of traditional logic textbooks (though many of them deserve high praise for their accuracy and depth), the move to open source has become more and more attractive. We're happy to provide it free of charge for educational use. With that being said, there are always improvements to be made here and we would be most grateful for constructive feedback and criticism. We have chosen to write this text in LaTeX and have adopted certain conventions with symbols. Certainly many important aspects of critical thinking and logic have been omitted here, including historical developments and key logicians, and for that we apologize. Our goal was to create a textbook that could be provided to students free of charge and still contain some of the more important elements of critical thinking and introductory logic. To that end, an additional benefit of providing this textbook as an Open Education Resource (OER) is that we will be able to provide newer updated versions of this text more frequently, and without any concern about increased charges each time. We are particularly looking forward to expanding our examples and adding student exercises. We will additionally aim to continually improve the quality and accessibility of our text for students and faculty alike. We have included a bibliography that includes many admirable textbooks, all of which we have benefited from. The interested reader is encouraged to consult these texts for further study and clarification. These texts have been a great inspiration for us and provide features to students that this concise textbook does not.
We would both like to thank the philosophy students at numerous schools in the Puget Sound region for their patience and helpful suggestions. In particular, we would like to thank our colleagues at Green River College, who have helped us immensely in numerous different ways. Please feel free to contact us with comments and suggestions. We will strive to correct errors when pointed out, add necessary material, and make other additional and needed changes as they arise. Please check back for the most up-to-date version.
Kant claims that Aristotle’s logic is complete. I explain the historical and philosophical considerations that commit him to proving the completeness claim, and sketch the proof based on materials from his logic corpus. The proof will turn out to be an integral part of Kant’s larger reform of formal logic in response to a foundational crisis facing it.
The main purpose of the paper is to outline the formal-logical, general theory of language treated as a particular ontological being. The theory itself is called the ontology of language, because it is motivated by the fact that language plays a special role: it reflects ontology, and ontology reflects the world. Language expressions are considered to have a dual ontological status. They are understood as either concretes, that is, tokens (material, physical objects), or types (classes of tokens, which are abstract objects). Such a duality is taken into account in the presented logical theory of syntax, semantics and pragmatics. We point to the possibility of building it on two different levels: one stems from concretes, language tokens of expressions, whereas the other stems from their classes, types conceived as abstract, ideal beings. The aim of this work is not only to outline this theory as taking into account the functional approach to language, with respect to the dual ontological nature of its expressions, but also to show that the logic based on it is ontologically neutral in the sense that it abstracts from accepting existential assumptions related to the ontological nature of these linguistic expressions and their extra-linguistic ontological counterparts (objects).
This article discusses a relation between the formal science of logical semantics and some monotheistic, polytheistic and Trinitarian Christian notions. This relation appears in the use of the existential quantifier and of logical-modal notions when some monotheistic and polytheistic concepts and, principally, the concept of Trinity Dogma are analyzed. Thus, some presupposed modal notions will appear in some monotheistic propositions, such as the notion of “logically necessary”. From this, it will be shown how the term “God” is a polysemic term and is often treated as both subject and predicate. This will make it clear that there is no plausible intellectual justification for believing that the term “God” can only be used as a name and never as a predicate, and vice versa. After that analysis, I will show that the conjunction of the “Trinity Dogma” with some type of “monotheistic position” would necessarily imply some class of absurdity and/or semantic “oddity”.
In this paper the propositional logic LTop is introduced, as an extension of classical propositional logic by adding a paraconsistent negation. This logic has a very natural interpretation in terms of topological models. The logic LTop is nothing more than an alternative presentation of modal logic S4, but in the language of a paraconsistent logic. Moreover, LTop is a logic of formal inconsistency in which the consistency and inconsistency operators have a nice topological interpretation. This constitutes a new proof of S4 as being "the logic of topological spaces", but now under the perspective of paraconsistency.
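The topological reading can be sketched in modal terms. In topological models for S4, necessity is interpreted as interior; a paraconsistent negation can then be read as the closure of the complement. This is a standard construction offered as illustration, not necessarily the exact semantic clauses of LTop:

```latex
% A valuation v assigns each formula a subset of a topological space X.
\[
  v(\Box A) = \operatorname{int} v(A), \qquad
  v({\sim} A) = \operatorname{cl}\bigl(X \setminus v(A)\bigr),
\]
% where int is topological interior (the S4 necessity) and cl is closure.
% Since cl(X \ v(A)) can overlap v(A) on its boundary, a point may satisfy
% both A and ~A, so this negation is paraconsistent: explosion fails.
```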
One of the most expected properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of the deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics are uniquely characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.

This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, and thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis’s S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of “a proof of a sentence from a set of premises” where today “a deduction of a sentence from a set of premises” would be preferable.

1. Corcoran, John. 1969. Three Logical Theories. Philosophy of Science 36, 153–77.

2. Corcoran, John and George Weaver. 1969.
Logical Consequence in Modal Logic: Natural Deduction in S5. Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).

3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4. Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we argue that an intuitive reading of the bivalued semantics for the logic mbC, a logic of formal inconsistency based on classical logic, fits in well with the basic ideas of an intuitive interpretation of contradictions. On this interpretation, the acceptance of a pair of propositions A and ¬A does not mean that A is simultaneously true and false, but rather that there is conflicting evidence about the truth value of A.
John Venn has the “uneasy suspicion” that the stagnation in mathematical logic between J. H. Lambert and George Boole was due to Kant’s “disastrous effect on logical method,” namely the “strictest preservation [of logic] from mathematical encroachment.” Kant’s actual position is more nuanced, however. In this chapter, I tease out the nuances by examining his use of Leonhard Euler’s circles and comparing it with Euler’s own use. I do so in light of the developments in logical calculus from G. W. Leibniz to Lambert and Gottfried Ploucquet. While Kant is evidently open to using mathematical tools in logic, his main concern is to clarify what mathematical tools can be used to achieve. For without such clarification, all efforts at introducing mathematical tools into logic would be blind, if not a complete waste of time. In the end, Kant would stress, the means provided by formal logic at best help us to express and order what we already know in some sense. No matter how much mathematical notations may enhance the precision of this function of formal logic, it does not change the fact that no truths can, strictly speaking, be revealed or established by means of those notations.
We reconsider the pragmatic interpretation of intuitionistic logic [21] regarded as a logic of assertions and their justifications, and its relations with classical logic. We recall an extension of this approach to a logic dealing with assertions and obligations, related by a notion of causal implication [14, 45]. We focus on the extension to co-intuitionistic logic, seen as a logic of hypotheses [8, 9, 13], and on polarized bi-intuitionistic logic as a logic of assertions and conjectures: looking at the S4 modal translation, we give a definition of a system AHL of bi-intuitionistic logic that correctly represents the duality between intuitionistic and co-intuitionistic logic, correcting a mistake in previous work [7, 10]. A computational interpretation of co-intuitionism as a distributed calculus of coroutines is then used to give an operational interpretation of subtraction. Work on linear co-intuitionism is then recalled, a linear calculus of co-intuitionistic coroutines is defined, and a probabilistic interpretation of linear co-intuitionism is given as in [9]. We also remark that by extending the language of intuitionistic logic we can express the notion of expectation, an assertion that in all situations the truth of p is possible, and that in a logic of expectations the law of double negation holds. Similarly, extending co-intuitionistic logic, we can express the notion of a conjecture that p, defined as a hypothesis that in some situation the truth of p is epistemically necessary.
It is a received view that Kant’s formal logic (or what he calls “pure general logic”) is thoroughly intensional. On this view, even the notion of logical extension must be understood solely in terms of the concepts that are subordinate to a given concept. I grant that the subordination relation among concepts is an important theme in Kant’s logical doctrine of concepts. But I argue that it is both possible and important to ascribe to Kant an objectual notion of logical extension according to which the extension of a concept is the multitude of objects falling under it. I begin by defending this ascription in response to three reasons that are commonly invoked against it. First, I explain that this ascription is compatible with Kant’s philosophical reflections on the nature and boundary of a formal logic. Second, I show that the objectual notion of extension I ascribe to Kant can be traced back to many of the early modern works of logic with which he was more or less familiar. Third, I argue that such a notion of extension makes perfect sense of a pivotal principle in Kant’s logic, namely the principle that the quantity of a concept’s extension is inversely proportional to that of its intension. In the process, I tease out two important features of the Kantian objectual notion of logical extension in terms of which it markedly differs from the modern one. First, on the modern notion the extension of a concept is the sum of the objects actually falling under it; on the Kantian notion, by contrast, the extension of a concept consists of the multitude of possible objects—not in the metaphysical sense of possibility, though—to which a concept applies in virtue of being a general representation. While the quantity of the former extension is finite, that of the latter is infinite—as is reflected in Kant’s use of a plane-geometrical figure (e.g., circle, square), which is a continuum as opposed to a discretum, to represent the extension in question.
Second, on the modern notion of extension, a concept that signifies exactly one object has a one-member extension; on the Kantian notion, however, such a concept has no extension at all—for a concept is taken to have extension only if it signifies a multitude of things. This feature of logical extension is manifested in Kant’s claim that a singular concept (or a concept in its singular use) can, for lack of extension, be figuratively represented only by a point—as opposed to an extended figure like a circle, which is reserved for a general concept (or a concept in its general use). Precisely on account of these two features, the Kantian objectual extension proves vital to Kant’s theory of logical quantification (in universal, particular and singular judgments, respectively) and to his view regarding the formal truth of analytic judgments.
Argumentation theory underwent a significant development in the Fifties and Sixties: its revival is usually connected to Perelman's criticism of formal logic and the development of informal logic. Interestingly enough, it was during this period that Artificial Intelligence was developed, which defended the following thesis (from now on referred to as the AI-thesis): human reasoning can be emulated by machines. The paper suggests a reconstruction of the opposition between formal and informal logic as a move against a premise of an argument for the AI-thesis, and suggests making a distinction between a broad and a narrow notion of algorithm that might be used to reformulate the question as a foundational problem for argumentation theory.
C. I. Lewis (1883–1964) was the first major figure in history and philosophy of logic, a field that has come to be recognized as a separate specialty after years of work by Ivor Grattan-Guinness and others (Dawson 2003, 257). Lewis was among the earliest to accept the challenges offered by this field; he was the first who had the philosophical and mathematical talent, the philosophical, logical, and historical background, and the patience and dedication to objectivity needed to excel. He was blessed with many fortunate circumstances, not least of which was entering the field when mathematical logic, after only six decades of toil, had just reaped one of its most important harvests with publication of the monumental Principia Mathematica. It was a time of joyful optimism which demanded an historical account and a sober philosophical critique. Lewis was one of the first to apply to mathematical logic the Aristotelian dictum that we do not understand a living institution until we see it growing from its birth.
ABSTRACT: In its strongest unqualified form, the principle of wholistic reference is that in any given discourse, each proposition refers to the whole universe of that discourse, regardless of how limited the referents of its non-logical or content terms may be. According to this principle every proposition of number theory, even an equation such as "5 + 7 = 12", refers not only to the individual numbers that it happens to mention but to the whole universe of numbers. This principle, its history, and its relevance to some of Oswaldo Chateaubriand's work are discussed in my 2004 paper "The Principle of Wholistic Reference" in Essays on Chateaubriand's "Logical Forms". In Chateaubriand's réplica (reply), which is printed with my paper, he raised several important additional issues including the three I focus on in this tréplica (reply to his reply): truth-values, universes of discourse, and formal ontology. This paper is self-contained: it is not necessary to have read the above-mentioned works. The principle of wholistic reference (PWR) was first put forth by George Boole in 1847 when he espoused a monistic fixed-universe viewpoint similar to the one Frege and Russell espoused throughout their careers. Later, Boole elaborated PWR in 1854 from the pluralistic multiple-universes perspective.
We present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language. We shall defend the view according to which logics of formal inconsistency are theories of logical consequence of normative and epistemic character. This approach not only allows us to make inferences in the presence of contradictions, but also offers a philosophically acceptable account of paraconsistency.
Work on the nature and scope of formal logic has focused unduly on the distinction between logical and extra-logical vocabulary; which argument forms a logical theory countenances depends not only on its stock of logical terms, but also on its range of grammatical categories and modes of composition. Furthermore, there is a sense in which logical terms are unnecessary. Alexandra Zinke has recently pointed out that propositional logic can be done without logical terms. By defining a logical-term-free language with the full expressive power of first-order logic with identity, I show that this is true of logic more generally. Furthermore, having, in a logical theory, non-trivial valid forms that do not involve logical terms is not merely a technical possibility. As the case of adverbs shows, issues about the range of argument forms logic should countenance can quite naturally arise in such a way that they do not turn on whether we countenance certain terms as logical.
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based, definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic'; whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'. We then adopt what may be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. We then consider the argument that Tarski's classic definitions permit an intelligence, whether human or mechanistic, to admit finitary, evidence-based, definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways. We show that the two definitions correspond to two distinctly different, not necessarily evidence-based but complementary, assignments of satisfaction and truth to the compound formulas of PA over N. We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both the complementary interpretations; and conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.
Just started a new book. The aim is to establish a science of knowledge in the same way that we have a science of physics or a science of materials. This might appear as an overly ambitious, possibly arrogant, objective, but bear with me. On the day I am beginning to write it, June 7th, 2020, I think I am in possession of a few things that will help me to achieve this objective. Again, bear with me. My aim is well reflected in the title I chose (just now) for this book: Knowledge & Logic: Towards a science of knowledge. Its most important feature is that I shall take logic to be to knowledge science as calculus is to physics or to materials science. I do not intend to reclaim knowledge from the bosom of philosophy, in which, known as epistemology, its erudite discussion has hardly progressed since Plato first defined it as true belief with logos. With only a few adjustments, it will actually provide me with the right, science-bound start. More recently, knowledge has been reclaimed by the field of BA, a reclaim that has opened Pandora's box: Among the evils, and perhaps at the head of the list, is an overly lay, essentially naive, notion of knowledge. But the very idea that one can have something like “knowledge (management) software” puts us on the right track.
Prior Analytics by the Greek philosopher Aristotle (384 – 322 BCE) and Laws of Thought by the English mathematician George Boole (1815 – 1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology.
Neo-Fregean approaches to set theory, following Frege, have it that sets are the extensions of concepts, where concepts are the values of second-order variables. The idea is that, given a second-order entity $X$, there may be an object $\varepsilon X$, which is the extension of $X$. Other writers have also claimed a similar relationship between second-order logic and set theory, where sets arise from pluralities. This paper considers two interpretations of second-order logic—as being either extensional or intensional—and whether either is more appropriate for this approach to the foundations of set theory. Although there seems to be a case for the extensional interpretation resulting from modal considerations, I show how there is no obstacle to starting with an intensional second-order logic. I do so by showing how the $\varepsilon$ operator can have the effect of “extensionalizing” intensional second-order entities.
We study a new formal logic LD introduced by Prof. Grzegorczyk. The logic is based on so-called descriptive equivalence, corresponding to the idea of shared meaning rather than shared truth value. We construct a semantics for LD based on a new type of algebras and prove its soundness and completeness. We further show several examples of classical laws that hold for LD as well as laws that fail. Finally, we list a number of open problems.
There is a long tradition in formal epistemology and in the psychology of reasoning to investigate indicative conditionals. In psychology, the propositional calculus was taken for granted to be the normative standard of reference. Experimental tasks, the evaluation of participants’ responses, and psychological model building were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue that neither logic nor standard probability theory provides appropriate rationality norms for uncertain conditionals. I advocate coherence-based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
The logics of formal inconsistency (LFIs, for short) are paraconsistent logics (that is, logics containing contradictory but non-trivial theories) having a consistency connective which allows one to recover the ex falso quodlibet principle in a controlled way. The aim of this paper is to consider a novel semantical approach to first-order LFIs based on Tarskian structures defined over swap structures, a special class of multialgebras. The proposed semantical framework generalizes previous approaches to quantified LFIs presented in the literature. The case of QmbC, the simplest quantified LFI expanding classical logic, will be analyzed in detail. An axiomatic extension of QmbC called QLFI1o is also studied, which is equivalent to the quantified version of da Costa and D'Ottaviano's 3-valued logic J3. The semantical structures for this logic turn out to be Tarskian structures based on twist structures. The expansion of QmbC and QLFI1o with a standard equality predicate is also considered.
This book serves as a concise introduction to some main topics in modern formal logic for undergraduates who already have some familiarity with formal languages. There are chapters on sentential and quantificational logic, modal logic, elementary set theory, a brief introduction to the incompleteness theorem, and a modern development of traditional Aristotelian logic.
An exact truthmaker for A is a state which, as well as guaranteeing A’s truth, is wholly relevant to it. States with parts irrelevant to whether A is true do not count as exact truthmakers for A. Giving semantics in this way produces a very unusual consequence relation, on which conjunctions do not entail their conjuncts. This feature makes the resulting logic highly unusual. In this paper, we set out formal semantics for exact truthmaking and characterise the resulting notion of entailment, showing that it is compact and decidable. We then investigate the effect of various restrictions on the semantics. We also formulate a sequent-style proof system for exact entailment and give soundness and completeness results.
This paper introduces new logical systems which axiomatize a formal representation of inconsistency (here taken to be equivalent to contradictoriness) in classical logic. We start from an intuitive semantical account of inconsistent data, fixing some basic requirements, and provide two distinct sound and complete axiomatics for such semantics, LFI1 and LFI2, as well as their first-order extensions, LFI1* and LFI2*, depending on which additional requirements are considered. These formal systems are examples of what we dub Logics of Formal Inconsistency (LFI) and form part of a much larger family of similar logics. We also show that there are translations from classical and paraconsistent first-order logics into LFI1* and LFI2*, and back. Hence, despite their status as subsystems of classical logic, LFI1* and LFI2* can codify any classical or paraconsistent reasoning.
In the paper we present a formal system motivated by a specific methodology of creating norms. According to the methodology, a norm-giver, before establishing a set of norms, should create a picture of the agent by specifying his repertoire of actions. Then, knowing what the agent can do in particular situations, the norm-giver regulates these actions by assigning deontic qualifications to each of them. The set of norms created for each situation should respect (1) generally valid deontic principles, which are theses of our logic, and (2) facts from the ontology of action whose relevance for the systems of norms we postulate.
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as had previously been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: First, that it presupposed no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Information-theoretic approaches to formal logic analyze the "common intuitive" concepts of implication, consequence, and validity in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; one given proposition is a consequence of a second if the latter contains all of the information contained by the former; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically, and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyze validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
It is argued, on the basis of ideas derived from Wittgenstein's Tractatus and Husserl's Logical Investigations, that the formal comprehends more than the logical. More specifically: that there exist certain formal-ontological constants (part, whole, overlapping, etc.) which do not fall within the province of logic. A two-dimensional directly depicting language is developed for the representation of the constants of formal ontology, and means are provided for the extension of this language to enable the representation of certain materially necessary relations. The paper concludes with a discussion of the relationship between formal logic, formal ontology, and mathematics.
This paper reviews the central points and presents some recent developments of the epistemic approach to paraconsistency in terms of the preservation of evidence. Two formal systems are surveyed, the basic logic of evidence (BLE) and the logic of evidence and truth (LETJ), designed to deal, respectively, with evidence and with evidence and truth. While BLE is equivalent to Nelson’s logic N4, it has been conceived for a different purpose. Adequate valuation semantics that provide decidability are given for both BLE and LETJ. The meanings of the connectives of BLE and LETJ, from the point of view of preservation of evidence, are explained with the aid of an inferential semantics. A formalization of the notion of evidence for BLE as proposed by M. Fitting is also reviewed here. As a novel result, the paper shows that LETJ is semantically characterized through the so-called Fidel structures. Some opportunities for further research are also discussed.
Peer Instruction is a simple and effective technique you can use to make lectures more interactive, more engaging, and more effective learning experiences. Although well known in science and mathematics, the technique appears to be little known in the humanities. In this paper, we explain how Peer Instruction can be applied in philosophy lectures. We report the results from our own experience of using Peer Instruction in undergraduate courses in philosophy, formal logic, and critical thinking. We have consistently found it to be a highly effective method of improving the lecture experience for both students and the lecturer.
Traditionally transcendental logic has been set apart from formal logic. Transcendental logic had to deal with the conditions of possibility of judgements, which were presupposed by formal logic. Defined as a purely philosophical enterprise, transcendental logic was considered a priori, delivering either analytic or even synthetic a priori results. In this paper it is argued that this separation from the (empirical) cognitive sciences should be given up. Transcendental logic should be understood as focusing on specific questions. These do not, as some recent analytic philosophy has it, include a refutation of scepticism. And they are not to be separated from meta-logical investigations. Transcendental logic properly understood, and redefined along these theses, should concern itself with the (formal) reconstruction of the presupposed necessary conditions and rules of linguistic communication in general. It aims at universality and reflexive closure.
The program put forward in von Wright's last works defines deontic logic as ``a study of conditions which must be satisfied in rational norm-giving activity'' and thus introduces the perspective of logical pragmatics. In this paper a formal explication for von Wright's program is proposed within the framework of the set-theoretic approach and extended to a two-sets model which allows for the separate treatment of obligation-norms and permission-norms. The three translation functions connecting the language of deontic logic with the language of the extended set-theoretic approach are introduced, and used in proving the correspondence between the deontic theorems, on one side, and the perfection properties of the norm-set and the ``counter-set'', on the other side. In this way the possibility of reinterpretation of standard deontic logic as the theory of perfection properties that ought to be achieved in norm-giving activity has been formally proved. The extended set-theoretic approach is applied to the problem of rationality of principles of completion of normative systems. The paper concludes with a plea for the logical-pragmatics turn envisaged in the late phase of von Wright's work in deontic logic.
This article is part of a larger project in which I attempt to show that Western formal logic, from its inception in Aristotle onward, has both been partially constituted by, and partially constitutive of, what has become known as racism. In contrast to this trend, the present article concerns the major philosopher whose contribution to logic has been perhaps the most derided and marginalized, and yet whose character and politics are, from a contemporary perspective, drastically superior—John Stuart Mill. My approach to my core concern will be one of narrowing concentric circles. I will begin with Mill’s occasional political writings that bear on the issue of racism, including “The Negro Question.” From there, the core of the article will explore the political dimensions of Mill’s A System of Logic.
This special issue of the Logic Journal of the IGPL includes revised and updated versions of the best work presented at the fourth edition of the workshop Formal Approaches to Multi-Agent Systems, FAMAS'09, which took place in Turin, Italy, from 7 to 11 September, 2009, under the umbrella of the Multi-Agent Logics, Languages, and Organisations Federated Workshops (MALLOW). Just like its predecessors, the research reported in this FAMAS 2009 special issue is very much inspired by practical concerns. This time the authors of all five selected papers are concerned with knowledge and beliefs in multi-agent settings: How to create a group belief in a fair way from individual plausibility orderings? How to close gaps and resolve ambiguities in a tractable way, when information comes from multiple sources? How to reason about a spatial environment? How to compare the strengths of an agent's beliefs in a principled way? How to decide as efficiently as possible whether a given formula concerning group beliefs is valid? These questions and their answers lead to a multi-faceted and at the same time coherent special issue. We concisely introduce the five articles.