Girolamo Saccheri (1667–1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697), to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa, which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55–57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51–60; Zbl 10, 386; ibid. 3 (1935), 143–152; Zbl 11, 193; ibid. 3 (1935), 221–233; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid, whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p. 237, lines 3–15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions": indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion, and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED March 14, 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th-century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently.
It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition. In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction, one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion, and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P, Q / R becomes P, ~R / ~Q or ~R, Q / ~P (see the illustration below). Saccheri’s work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
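A worked instance of the schema may help; the following illustration is mine, not the reviewer's, and uses a standard textbook syllogism. Indirectly reducing Barbara yields (a relettering of) Baroco:

\[
\frac{\text{All } M \text{ are } L \qquad \text{All } S \text{ are } M}{\text{All } S \text{ are } L}
\quad\leadsto\quad
\frac{\text{All } M \text{ are } L \qquad \text{Some } S \text{ are not } L}{\text{Some } S \text{ are not } M}
\]

Here the second premise has been replaced by the contradictory opposite of the old conclusion, and the new conclusion is the contradictory opposite of the replaced premise. No new assumption is made and nothing is deduced; one valid argument is merely transformed into another.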
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
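One conventional way to make the prudential variety precise (a decision-theoretic sketch in the spirit of error management theory, not necessarily the authors' exact formulation): let C_FP be the cost of wrongly accepting a claim and C_FN the cost of wrongly rejecting it. Acting on a claim of probability p is prudent when

\[
(1 - p)\,C_{FP} < p\,C_{FN}, \quad\text{i.e.}\quad p > \frac{C_{FP}}{C_{FP} + C_{FN}},
\]

so asymmetric error costs shift the threshold of acceptance and, with it, which side carries the heavier burden of proof.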
Kant's A-Edition objective deduction is naturally (and has traditionally been) divided into two arguments: an "argument from above" and one that proceeds "von unten auf" ("from below"). This would suggest a picture of Kant's procedure in the objective deduction as first descending and then ascending the same ladder, the better, perhaps, to test its durability or to thoroughly convince the reader of its soundness. There are obvious obstacles to such a reading, however; and in this chapter I will argue that the arguments from above and below constitute different, albeit importantly inter-related, proofs. Rather than drawing on the differences in their premises, however, I will highlight what I take to be the different concerns addressed and, correspondingly, the distinct conclusions reached by each. In particular, I will show that both arguments can be understood to address distinct specters, with the argument from above addressing an internal concern generated by Kant’s own transcendental idealism, and the argument from below seeking to dispel a more traditional, broadly Humean challenge to the understanding’s role in experience. These distinct concerns also imply that these arguments yield distinct conclusions, though I will show that they are in fact complementary.
The fundamental assumption of Dummett’s and Prawitz’ proof-theoretic justification of deduction is that ‘if we have a valid argument for a complex statement, we can construct a valid argument for it which finishes with an application of one of the introduction rules governing its principal operator’. I argue that the assumption is flawed in this general version, but should be restricted, not to apply to arguments in general, but only to proofs. I also argue that Dummett’s and Prawitz’ project of providing a logical basis for metaphysics only relies on the restricted assumption.
ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind—they must be thought of either as disguised linear theories or theories of a third kind (see postscript below). The second purpose of this part is to develop some of the main ideas needed in constructing a comprehensive theory of proof. The reason for choosing the linear and suppositional theories for this purpose is that the linear theory includes only rules of a very simple nature, and the suppositional theory can be seen as the result of making the linear theory more comprehensive. CORRECTION: At the time these articles were written the word ‘proof’, especially in the phrase ‘proof from hypotheses’, was widely used to refer to what were earlier and are now called deductions. I ask your forgiveness. I have forgiven Church and Henkin, who misled me.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. REPRINTED 1976. Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article of the series is: 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link .
In the proof-theoretic semantics approach to meaning, harmony, requiring a balance between introduction-rules (I-rules) and elimination-rules (E-rules) within a meaning-conferring natural-deduction proof-system, is a central notion. In this paper, we consider two notions of harmony that were proposed in the literature: 1. GE-harmony, requiring a certain form of the E-rules, given the form of the I-rules. 2. Local intrinsic harmony, imposing the existence of certain transformations of derivations, known as reduction and expansion. We propose a construction of the E-rules (in GE-form) from given I-rules, and prove that the constructed rules also satisfy local intrinsic harmony. The construction is based on a classification of I-rules, and constitutes an implementation of Gentzen’s (and Prawitz’s) remark that E-rules can be “read off” I-rules.
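To illustrate the shape of the rules at issue (my example, not the paper's): for conjunction, the I-rule and its E-rule in GE-form are

\[
\frac{A \qquad B}{A \wedge B}\,(\wedge I)
\qquad\qquad
\frac{A \wedge B \qquad \begin{array}{c}[A]\;[B]\\ \vdots \\ C\end{array}}{C}\,(\wedge GE)
\]

where the GE-rule discharges the assumptions A and B. Local intrinsic harmony then demands, among other things, a reduction step: a derivation in which (∧I) is immediately followed by (∧GE) converts into the derivation of C with the subderivations of A and B substituted for the discharged assumptions.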
Argumentations are at the heart of the deductive and the hypothetico-deductive methods, which are involved in attempts to reduce currently open problems to problems already solved. These two methods span the entire spectrum of problem-oriented reasoning from the simplest and most practical to the most complex and most theoretical, thereby uniting all objective thought whether ancient or contemporary, whether humanistic or scientific, whether normative or descriptive, whether concrete or abstract. Analysis, synthesis, evaluation, and function of argumentations are described. Perennial philosophic problems, epistemic and ontic, related to argumentations are put in perspective. So much of what has been regarded as logic is seen to be involved in the study of argumentations that logic may be usefully defined as the systematic study of argumentations, which is virtually identical to the quest of objective understanding of objectivity.
KEY WORDS: hypothesis, theorem, argumentation, proof, deduction, premise-conclusion argument, valid, inference, implication, epistemic, ontic, cogent, fallacious, paradox, formal, validation.
A Mathematical Review by John Corcoran, SUNY/Buffalo. Macbeth, Danielle. Diagrammatic reasoning in Frege's Begriffsschrift. Synthese 186 (2012), no. 1, 289–314. ABSTRACT: This review begins with two quotations from the paper: its abstract and the first paragraph of the conclusion. The point of the quotations is to make clear by the “give-them-enough-rope” strategy how murky, incompetent, and badly written the paper is. I know I am asking a lot, but I have to ask you to read the quoted passages—aloud if possible. Don’t miss the silly attempt to recycle Kant’s quip “Concepts without intuitions are empty; intuitions without concepts are blind”. What the paper was aiming at includes the absurdity: “Proofs without definitions are empty; definitions without proofs are, if not blind, then dumb.” But the author even bollixed this. The editor didn’t even notice. The copy-editor missed it. And the author’s proof-reading did not catch it. In order not to torment you I will quote the sentence as it appears: “In a slogan: proofs without definitions are empty, merely the aimless manipulation of signs according to rules; and definitions without proofs are, if no blind, then dumb.” [sic] The rest of my review discusses the paper’s astounding misattribution to contemporary logicians of the information-theoretic approach. This approach was cruelly trashed by Quine in his 1970 Philosophy of Logic, and thereafter ignored by every text I know of. The paper under review attributes generally to modern philosophers and logicians views that were never espoused by any of the prominent logicians—such as Hilbert, Gödel, Tarski, Church, and Quine—apparently in an attempt to distance them from Frege: the focus of the article. On page 310 we find the following paragraph. “In our logics it is assumed that inference potential is given by truth-conditions. Hence, we think, deduction can be nothing more than a matter of making explicit information that is already contained in one’s premises. If the deduction is valid then the information contained in the conclusion must be contained already in the premises; if that information is not contained already in the premises […], then the argument cannot be valid.” Although the paper is meticulous in citing supporting literature for less questionable points, no references are given for this. In fact, the view that deduction is the making explicit of information that is only implicit in premises has not been espoused by any standard symbolic logic books. It has only recently been articulated by a small number of philosophical logicians from a younger generation, for example, in the prize-winning essay by J. Sagüillo, Methodological practice and complementary concepts of logical consequence: Tarski’s model-theoretic consequence and Corcoran’s information-theoretic consequence, History and Philosophy of Logic, 30 (2009), pp. 21–48. The paper omits definitions of key terms including ‘ampliative’, ‘explicatory’, ‘inference potential’, ‘truth-condition’, and ‘information’. The definition of prime number on page 292 is as follows: “To say that a number is prime is to say that it is not divisible without remainder by another number”. This would make one the only prime number. The paper being reviewed had the benefit of two anonymous referees who contributed “very helpful comments on an earlier draft”. Could these anonymous referees have read the paper? J. Corcoran, U of Buffalo, SUNY.
PS By the way, if anyone has a paper that has been turned down by other journals, any journal that would publish something like this might be worth trying.
A textbook on proof in mathematics, inspired by an Aristotelian point of view on mathematics and proof. The book expounds the traditional view of proof as deduction of theorems from evident premises via obviously valid steps. It deals with the proof of "all" statements, "some" statements, multiple quantifiers and mathematical induction.
This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’ philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and prospects for an account of the meanings of modal operators in terms of rules of inference.
In the transcendental deduction, the central argument of the Critique of Pure Reason, Kant seeks to secure the objective validity of our basic categories of thought. He distinguishes objective and subjective sides of this argument. The latter side, the subjective deduction, is normally understood as an investigation of our cognitive faculties. It is identified with Kant’s account of a threefold synthesis involved in our cognition of objects of experience, and it is said to precede and ground Kant’s proof of the validity of the categories in the objective deduction. I challenge this standard reading of the subjective deduction, arguing, first, that there is little textual evidence for it, and, second, that it encourages a problematic conception of how the deduction works. In its place, I present a new reading of the subjective deduction. Rather than being a broad investigation of our cognitive faculties, it should be seen as addressing a specific worry that arises in the course of the objective deduction. The latter establishes the need for a necessary connection between our capacities for thinking and being given objects, but Kant acknowledges that his readers might struggle to comprehend how these seemingly independent capacities are coordinated. Even worse, they might well believe that in asserting this necessary connection, Kant’s position amounts to an implausible subjective idealism. The subjective deduction is meant to allay these concerns by showing that they rest on a misunderstanding of the relation between these faculties. This new reading of the subjective deduction offers a better fit with Kant’s text. It also has broader implications, for it reveals the more philosophically plausible account of our relation to the world as thinkers that Kant is defending – an account that is largely obscured by the standard reading of the subjective deduction.
We argue that the need for commentary in commonly used linear calculi of natural deduction is connected to the “deletion” of illocutionary expressions that express the role of propositions as reasons, assumptions, or inferred propositions. We first analyze the formalization of an informal proof in some common calculi which do not formalize natural language illocutionary expressions, and show that in these calculi the formalizations of the example proof rely on commentary devices that have no counterpart in the original proof. We then present a linear natural deduction calculus that makes use of formal illocutionary expressions in such a way that unique readability for derivations is guaranteed – thus showing that formalizing illocutionary expressions can eliminate the need for commentary.
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
Harold Hodes in [1] introduces an extension of first-order modal logic featuring a backtracking operator, and provides a possible worlds semantics, according to which the operator is a kind of device for ‘world travel’; he does not provide a proof theory. In this paper, I provide a natural deduction system for modal logic featuring this operator, and argue that the system can be motivated in terms of a reading of the backtracking operator whereby it serves to indicate modal scope. I prove soundness and completeness theorems with respect to Hodes’ semantics, as well as semantics with fewer restrictions on the accessibility relation.
In this paper, I consider a family of three-valued regular logics: the well-known strong and weak S.C. Kleene’s logics and two intermediate logics, where one was discovered by M. Fitting and the other one by E. Komendantskaya. All these systems were originally presented semantically and based on the theory of recursion. However, their proof theory is still not fully developed. Thus, natural deduction systems have been built only for strong Kleene’s logic, both with one (A. Urquhart, G. Priest, A. Tamminga) and two designated values (G. Priest, B. Kooi, A. Tamminga). The purpose of this paper is to provide natural deduction systems for weak and intermediate regular logics, both with one and two designated values.
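For readers unfamiliar with the semantic side, here is a minimal sketch (mine, not the paper's) of the truth functions that distinguish strong from weak Kleene logic; the natural deduction systems themselves are not reproduced here.

    # Truth functions of the strong and weak Kleene three-valued logics.
    T, U, F = 1.0, 0.5, 0.0   # true, undefined, false

    def neg(a):
        return 1.0 - a        # negation is the same in both logics

    def strong_and(a, b):
        # Strong Kleene: a falsehood settles a conjunction even if the
        # other conjunct is undefined.
        return min(a, b)

    def weak_and(a, b):
        # Weak Kleene: any undefined component infects the whole formula.
        return U if U in (a, b) else min(a, b)

    # The two logics diverge exactly where one argument is undefined and
    # the other is classically decisive:
    assert strong_and(F, U) == F
    assert weak_and(F, U) == U

The two intermediate regular logics of Fitting and Komendantskaya lie between these extremes; see the paper for their tables and for the deduction systems built on top of them.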
The paper briefly surveys the sentential proof-theoretic semantics for a fragment of English. Then, appealing to a version of Frege’s context-principle (specified to fit type-logical grammar), a method is presented for deriving proof-theoretic meanings for sub-sentential phrases, down to lexical units (words). The sentential meaning is decomposed according to the function-argument structure as determined by the type-logical grammar. In doing so, the paper presents a novel proof-theoretic interpretation of simple types, replacing Montague’s model-theoretic type interpretation (in arbitrary Henkin models). The domains are collections of derivations in the associated “dedicated” natural-deduction proof-system, and functions therein (with no appeal to models, truth-values and elements of a domain). The compositionality of the semantics is analyzed.
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
Euclid's classic proof of the infinitude of prime numbers has been a standard model of reasoning in student textbooks and books of elementary number theory. It has withstood scrutiny for over 2000 years, but we shall prove that, despite the deceptive appearance of its analytical reasoning, it is tautological in nature. We shall argue that the proof is more of an observation about a general property of prime numbers than an expository style of natural deduction of the proof of their infinitude.
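Whatever one makes of that charge, the construction at the heart of Euclid's proof is easy to state computationally; the following sketch (mine, for orientation only) exhibits the key step:

    from math import prod

    def euclid_witness(primes):
        # Euclid's construction: n = (product of the given primes) + 1.
        # Every listed prime divides the product, so each leaves remainder 1
        # when dividing n; hence any prime factor of n is a new prime.
        n = prod(primes) + 1
        d = 2
        while d * d <= n and n % d != 0:
            d += 1
        if n % d != 0:
            d = n                 # no factor up to sqrt(n): n itself is prime
        assert d not in primes
        return d

    print(euclid_witness([2, 3, 5, 7, 11, 13]))   # 30031 = 59 * 509, prints 59

Note that the construction does not claim n itself is prime (30031 is not); it only guarantees that n has some prime factor outside the given list.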
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
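For orientation (a standard presentation, not specific to this paper): in Gödel logic over [0,1], a valuation v interprets the connectives by

\[
v(A \wedge B) = \min(v(A), v(B)), \qquad v(A \vee B) = \max(v(A), v(B)),
\]
\[
v(A \to B) = \begin{cases} 1 & \text{if } v(A) \le v(B) \\ v(B) & \text{otherwise,} \end{cases}
\qquad
v(\neg A) = \begin{cases} 1 & \text{if } v(A) = 0 \\ 0 & \text{otherwise,} \end{cases}
\]

with a formula valid iff it takes value 1 under every valuation; in the first-order case the universal quantifier is interpreted as an infimum and the existential as a supremum.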
In this paper, I argue that, other things being equal, simpler arguments are better. In other words, I argue that, other things being equal, it is rational to prefer simpler arguments over less simple ones. I sketch three arguments in support of this claim: an argument from mathematical proofs, an argument from scientific theories, and an argument from the conjunction rule.
In this paper, I'll present a general way of "reading off" introduction/elimination rules from elimination/introduction rules, and define notions of harmony and stability on the basis of it.
I am saying farewell after more than forty happy years of teaching logic at the University of Buffalo. But this is only a partial farewell. I will no longer be at UB to teach classroom courses or seminars. But nothing else will change. I will continue to be available for independent study. I will continue to write abstracts and articles with people who have taken courses or seminars with me. And I will continue to honor the LogicLifetimeGuarantee™, which is earned by taking one of my logic courses or seminars. As you know, according to the terms of the LogicLifetimeGuarantee™, I stand behind everything I teach. If you find anything to be unsatisfactory, I am committed to fixing it. If you forget anything, I will remind you. If you have questions, I will answer them or ask more questions. And if you need more detail on any topic we discussed, I will help you to broaden and deepen your knowledge—and maybe write an abstract or article. Stay in touch.
SEMANTIC ARITHMETIC: A PREFACE. John Corcoran. ABSTRACT: Number theory, or pure arithmetic, concerns the natural numbers themselves, not the notation used, and in particular not the numerals. String theory, or pure syntax, concerns the numerals as strings of «uninterpreted» characters without regard to the numbers they may be used to denote. Number theory is purely arithmetic; string theory is purely syntactical... in so far as the universe of discourse alone is considered. Semantic arithmetic is a broad subject which begins when numerals are mentioned (not just used) and mentioned as names of numbers (not just as syntactic objects). Semantic arithmetic leads to many fascinating and surprising algorithms and decision procedures; it reveals in a vivid way the experiential import of mathematical propositions and the predictive power of mathematical knowledge; it provides an interesting perspective for philosophical, historical, and pedagogical studies of the growth of scientific knowledge and of the role of metalinguistic discourse in scientific thought.
Critical thinking involves deliberate application of tests and standards to beliefs per se and to methods used to arrive at beliefs. Pedagogical license is authorization accorded to teachers permitting them to use otherwise illicit means in order to achieve pedagogical goals. Pedagogical license is thus analogous to poetic license or, more generally, to artistic license. Pedagogical license will be found to be pervasive in college teaching. This presentation suggests that critical thinking courses emphasize two topics: first, the nature and usefulness of critical thinking; second, the nature and pervasiveness of pedagogical license. Awareness of pedagogical license alerts the student to the need for critical thinking. Indoctrination is done to students; education is done by students.
Articles by Ian Mueller, Ronald Zirin, Norman Kretzmann, John Corcoran, John Mulhern, Mary Mulhern, Josiah Gould, and others. Topics: Aristotle's Syllogistic, Stoic Logic, Modern Research in Ancient Logic.
Because formal systems of symbolic logic inherently express and represent the deductive inference model, formal proofs to theorem consequences can be understood to represent sound deductive inference to deductive conclusions without any need for other representations.
In this paper, I develop a quasi-transcendental argument to justify Kant’s infamous claim “man is evil by nature.” The cornerstone of my reconstruction lies in drawing a systematic distinction between the seemingly identical concepts of “evil disposition” (böse Gesinnung) and “propensity to evil” (Hang zum Bösen). The former, I argue, Kant reserves to describe the fundamental moral outlook of a single individual; the latter, the moral orientation of the whole species. Moreover, the appellative “evil” ranges over two different types of moral failure: while an “evil disposition” is a failure to realize the good (i.e., to adopt the motive of duty as limiting condition for all one’s desires), an “evil propensity” is a failure to realize the highest good (i.e., to engage in the collective project of transforming the legal order into an ethical community). This correlation between units of moral analysis and types of obligation suggests a way to offer a deduction of the universal propensity on behalf of Kant. It consists in tracing the source of radical evil to the same subjective necessity that gives rise to the doctrine of the highest good. For, at the basis of Kant’s two doctrines lies the same natural dialectic between happiness and morality. While the highest good brings about the critically acceptable resolution of this dialectic, the propensity to evil perpetuates and aggravates it. Instead of connecting happiness and morality in an objective relation, the human will subordinates morality to the pursuit of happiness according to the subjective order of association. If this reading is correct, it would explain why prior attempts at a transcendental deduction have failed: interpreters have looked for the key to the deduction in the body of Kant’s text, where it is not to be found, for it is tucked, instead, in the Preface to the first edition.
Kant’s argument in § 38 of the *Critique of Judgment* is subject to a dilemma: if the subjective condition of cognition is the sufficient condition of the pleasure of taste, then every object of experience must produce that pleasure; if not, then the universal communicability of cognition does not entail the universal communicability of the pleasure. Kant’s use of an additional premise in § 21 may get him out of this difficulty, but the premises themselves hang in the air and have no independent plausibility. What Kant offers as a proof of our right to make judgments of taste is more charitably construed as an indirect argument for the adequacy of a speculative explanation of a *presumed* right to make judgments of taste.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers’ knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, No whale flies is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, No whale flies entails No blue whale flies and No whale flies high. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, Under no circumstances would a whale fly is grammatical, whereas Under some circumstances would a whale fly is not, corresponding to the downward vs. upward monotonic features of the preposed phrases.
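A minimal executable rendering of (A) and (B) (my sketch; the toy model and its sets are invented for illustration):

    # The determiner 'no' as a relation between predicate extensions (sets).
    def no(A, B):
        # "No A is B" is true iff the extensions of A and B are disjoint.
        return A.isdisjoint(B)

    whales      = {"moby", "willy"}
    fliers      = {"tweety", "eagle"}        # no whale among them
    blue_whales = {"moby"}                   # a subset of the whales
    high_fliers = {"eagle"}                  # a subset of the fliers

    # (A) truth conditions: "No whale flies"
    assert no(whales, fliers)

    # (B) inference: 'no' is downward monotone in both arguments, so the
    # sentence entails "No blue whale flies" and "No whale flies high".
    assert blue_whales <= whales and no(blue_whales, fliers)
    assert high_fliers <= fliers and no(whales, high_fliers)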
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.
This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, and thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis’s S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of “a proof of a sentence from a set of premises” where today “a deduction of a sentence from a set of premises” would be preferable.
1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77.
2. Corcoran, John and George Weaver. 1969. Logical Consequence in Modal Logic: Natural Deduction in S5, Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).
3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.
For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along.
To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
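The proofs-as-programs correspondence is usually stated in typed lambda calculus; the following loose Python rendering (mine, not the author's) may convey the flavor:

    from typing import Callable, TypeVar

    A = TypeVar("A"); B = TypeVar("B"); C = TypeVar("C")

    # Propositions-as-types: a total function of type A -> B is a proof
    # of the implication "A implies B".

    def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
        # Transitivity of implication: from A -> B and B -> C, get A -> C.
        # The program is the proof: it converts evidence for A into
        # evidence for C.
        return lambda a: g(f(a))

    def pair(a: A, b: B) -> tuple[A, B]:
        return (a, b)        # conjunction introduction

    def fst(p: tuple[A, B]) -> A:
        return p[0]          # conjunction elimination

    # Simplifying a proof corresponds to executing a program: the detour
    # fst(pair(a, b)), an introduction immediately followed by an
    # elimination, reduces to a.
    assert fst(pair(1, "x")) == 1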
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence.
In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternative accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards.
I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
Nelson's Proof of the Impossibility of the Theory of Knowledge.
In addressing the possibility of a theory of knowledge, Leonard Nelson noted the contradiction inherent in an epistemological criterion that one would require in order to differentiate between valid and invalid knowledge. Nelson concluded that the inconsistency of such a criterion proves the impossibility of the theory of knowledge.
Were the epistemological criterion itself knowledge, it would presume to adjudicate on its own truth (an epistemological circular argument). However, if one were to assume that the criterion is not knowledge, one would then have to justify how it can be a criterion for truth; yet this would only be possible if it were considered as an object of knowledge. One would equally have had to predetermine the criterion in order to determine the truth of this knowledge, thereby producing another circular argument. Ostensibly, every criterion of truth fails at its very own test, since it cannot guarantee its own truth, just as Munchausen, contrary to his assertion, could not draw himself out of the swamp by tugging on a tuft of his own hair.
Nelson proposed a solution to the epistemological problem (the question of the differentiation between valid and invalid knowledge) based on Jakob Friedrich Fries' differentiation between proof and deduction. Proof, according to Nelson (in reference to Fries), can be defined as the derivation of the truth of one statement from the truth of other statements. Thus, from the truth of the statements "all men are mortal" and "Socrates is a man" one can derive the truth of the statement "Socrates is mortal." If knowledge is taken to be judgmental (expressed in a statement), then an attempt at proof (i.e. recourse to previous judgments) inevitably leads to an infinite regress in justification, since each judgment would necessitate a further justification from another judgment. Every attempt to prove an epistemological criterion is thus also confronted by this regress in justification.
Nelson's attempt at a solution rests on the assumption of the existence of an immediate knowledge as a justification of the truth of (mediate) knowledge. Nelson considers immediate knowledge to be non-judgmental knowledge. This includes intuitions (e.g. seeing-the-red-roof) and also philosophical knowledge that, in his opinion, pre-exists before any judgmental reflection (immediately) in our reason (e.g. the principle of causality).
Proof of the truth of mediate knowledge can be effected by showing its compliance with the attendant immediate knowledge (rational truth = correspondence of mediate knowledge with its immediate knowledge). Nelson considered this a resolution of the circular epistemological argument. Philosophical knowledge, Nelson holds, is subject to deduction, not proof. The following example illustrates the goal of deduction:
An approach for deducing the principle of causality: A) Every change has a cause. (The principle of causality.) A´) A is a reiteration of an immediate knowledge. (Meta-assertion following A.)
"A" may not be provable, but A´ may be justified, and thus Nelson identified it as a deduction following from A. Reference: http://www.friesian.com/nelproof.htm.
According to non-conceptualist interpretations, Kant held that the application of concepts is not necessary for perceptual experience. Some have motivated non-conceptualism by noting the affinities between Kant's account of perception and contemporary relational theories of perception. In this paper I argue (i) that non-conceptualism cannot provide an account of the Transcendental Deduction and thus ought to be rejected; and (ii) that this has no bearing on the issue of whether Kant endorsed a relational account of perceptual experience.
In order to perform certain actions – such as incarcerating a person or revoking parental rights – the state must establish certain facts to a particular standard of proof. These standards – such as preponderance of evidence and beyond reasonable doubt – are often interpreted as likelihoods or epistemic confidences. Many theorists construe them numerically; beyond reasonable doubt, for example, is often construed as 90 to 95% confidence in the guilt of the defendant.
A family of influential cases suggests standards of proof should not be interpreted numerically. These ‘proof paradoxes’ illustrate that purely statistical evidence can warrant high credence in a disputed fact without satisfying the relevant legal standard. In this essay I evaluate three influential attempts to explain why merely statistical evidence cannot satisfy legal standards.
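A stock example of the kind at issue (the 'gatecrasher' case, after L. Jonathan Cohen; the numbers are purely illustrative): 1,000 people attend a rodeo but only 10 paid for admission, so for any given attendee

\[
P(\text{gatecrashed} \mid \text{attendance figures}) = \frac{990}{1000} = 0.99,
\]

which numerically exceeds even a 0.95 gloss of beyond reasonable doubt; yet intuitively no particular attendee can be held liable on the base-rate evidence alone.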
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He extensively extended Hilbert's program in the sense that he formulated Gentzen's sequent calculus, conjectured that cut-elimination holds for it (Takeuti's conjecture), and obtained several stunning results in the 1950–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese where he expressed his philosophical thoughts. In particular, he used several keywords such as "active intuition" and "self-reflection" from Nishida's philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti's philosophy of mathematics. In particular, after reviewing Takeuti's proof-theoretic results briefly, we describe some key elements in Takeuti's texts. By explaining these texts, we point out the connection between Takeuti's proof theory and Nishida's philosophy and explain the future goals of our project.
This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
James Van Cleve has argued that Kant’s Transcendental Deduction of the categories shows, at most, that we must apply the categories to experience. And this falls short of Kant’s aim, which is to show that they must so apply. In this discussion I argue that once we have noted the differences between the first and second editions of the Deduction, this objection is less telling. But Van Cleve’s objection can help illuminate the structure of the B Deduction, and it suggests an interesting reason why the rewriting might have been thought necessary.
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely, because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because generally impure proofs are simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.
I prove the nonexistence of gods. The proof is based on three axioms: Ockham’s razor (OR); religiosity is endogenous in humans; and there are no miracles. The OR is formulated operationally, to remove improper postulates, such that it yields not only a plausible argument but truth. The validity of the second and the third axiom is established empirically by inductive reasoning relying on a thorough analysis of the psychiatric literature and skeptical publications. With these axioms I prove that gods are not necessary for our universe. Applying OR yields that gods do not exist. The implications of this article are enormous. Mankind’s understanding of the world is elevated to a higher level: to a unified view of the world as nature, with mankind a part of it.
forall x: Calgary is a full-featured textbook on formal logic. It covers key notions of logic such as consequence and validity of arguments, the syntax of truth-functional propositional logic TFL and truth-table semantics, the syntax of first-order (predicate) logic FOL with identity (first-order interpretations), translating (formalizing) English in TFL and FOL, and Fitch-style natural deduction proof systems for both TFL and FOL. It also deals with some advanced topics such as truth-functional completeness and modal logic. Exercises with solutions are available. It is provided in PDF (for screen reading, printing, and a special version for dyslexics) and in LaTeX source code.
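To give the flavor of a Fitch-style TFL derivation (an illustration of the general style; the book's own rule labels and layout may differ in detail), here is modus tollens derived from more basic rules, with the indented column marking a subproof:

\[
\begin{array}{r@{\quad}l@{\quad}l}
1 & P \to Q & \text{premise} \\
2 & \neg Q & \text{premise} \\
3 & \quad P & \text{assumption} \\
4 & \quad Q & \to\text{E } 1, 3 \\
5 & \quad \bot & \neg\text{E } 2, 4 \\
6 & \neg P & \neg\text{I } 3\text{–}5
\end{array}
\]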
For deductive reasoning to be justified, it must be guaranteed to preserve truth from premises to conclusion; and for it to be useful to us, it must be capable of informing us of something. How can we capture this notion of information content, whilst respecting the fact that the content of the premises, if true, already secures the truth of the conclusion? This is the problem I address here. I begin by considering and rejecting several accounts of informational content. I then develop an account on which informational contents are indeterminate in their membership. This allows there to be cases in which it is indeterminate whether a given deduction is informative. Nevertheless, on the picture I present, there are determinate cases of informative (and determinate cases of uninformative) inferences. I argue that the model I offer is the best way for an account of content to respect the meaning of the logical constants and the inference rules associated with them without collapsing into a classical picture of content, unable to account for informative deductive inferences.
One of the strongest motivations for conceptualist readings of Kant is the belief that the Transcendental Deduction is incompatible with nonconceptualism. In this article, I argue that this belief is simply false: the Deduction and nonconceptualism are compatible at both an exegetical and a philosophical level. Placing particular emphasis on the case of non-human animals, I discuss in detail how and why my reading diverges from those of Ginsborg, Allais, Gomes and others. I suggest ultimately that it is only by embracing nonconceptualism that we can fully recognise the delicate calibration of the trap which the Critique sets for Hume.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
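For readers unfamiliar with backward proof search, a generic Gentzen-style illustration (mine, not a reconstruction of the Stoic calculus itself): to establish the sequent corresponding to the first indemonstrable, p, p → q ⊢ q, one reads the left implication rule from conclusion to premises, decomposing the goal into two axiom leaves:

\[
\frac{p \vdash p \qquad p,\, q \vdash q}{p,\ p \to q \,\vdash\, q}\ (\to L)
\]

Search proceeds bottom-up on such a tree and succeeds when every leaf is an instance of an axiom.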
It is tempting to think that multi premise closure creates a special class of paradoxes having to do with the accumulation of risks, and that these paradoxes could be escaped by rejecting the principle, while still retaining single premise closure. I argue that single premise deduction is also susceptible to risks. I show that what I take to be the strongest argument for rejecting multi premise closure is also an argument for rejecting single premise closure. Because of the symmetry between the principles, they come as a package: either both will have to be rejected or both will have to be revised.
According to Jim Pryor’s dogmatism, if you have an experience as if P, you acquire immediate prima facie justification for believing P. Pryor contends that dogmatism validates Moore’s infamous proof of a material world. Against Pryor, I argue that if dogmatism is true, Moore’s proof turns out to be non-transmissive of justification according to one of the senses of non-transmissivity defined by Crispin Wright. This type of non-transmissivity doesn’t deprive dogmatism of its apparent antisceptical bite.
Smith argues that, unlike other forms of evidence, naked statistical evidence fails to satisfy normic support. This is his solution to the puzzles of statistical evidence in legal proof. This paper focuses on Smith’s claim that DNA evidence in cold-hit cases does not satisfy normic support. I argue that if this claim is correct, virtually no other form of evidence used at trial can satisfy normic support. This is troublesome. I discuss a few ways in which Smith can respond.