In this paper, I consider a family of three-valued regular logics: the well-known strong and weak logics of S.C. Kleene and two intermediate logics, one discovered by M. Fitting and the other by E. Komendantskaya. All these systems were originally presented semantically and grounded in recursion theory. However, their proof theory is still not fully developed: natural deduction systems have been built only for strong Kleene logic, both with one designated value (A. Urquhart, G. Priest, A. Tamminga) and with two (G. Priest, B. Kooi, A. Tamminga). The purpose of this paper is to provide natural deduction systems for the weak and intermediate regular logics, both with one and with two designated values.
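For orientation, the strong and weak Kleene conjunctions differ in how they propagate the third value u; this is standard background rather than anything specific to the paper (the Fitting and Komendantskaya intermediate operations are not shown):

\[
\begin{array}{c|ccc}
\wedge_{\text{strong}} & t & u & f \\ \hline
t & t & u & f \\
u & u & u & f \\
f & f & f & f
\end{array}
\qquad
\begin{array}{c|ccc}
\wedge_{\text{weak}} & t & u & f \\ \hline
t & t & u & f \\
u & u & u & u \\
f & f & u & f
\end{array}
\]

In the strong tables f dominates u; in the weak (Bochvar-style) tables u is infectious.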
We present a sound and complete Fitch-style natural deduction system for an S5 modal logic containing an actuality operator, a diagonal necessity operator, and a diagonal possibility operator. The logic is two-dimensional: we evaluate sentences with respect to both an actual world (first dimension) and a world of evaluation (second dimension). The diagonal necessity operator behaves as a quantifier over every point on the diagonal between actual worlds and worlds of evaluation, while the diagonal possibility operator quantifies over some point on the diagonal. Thus, they are just like the epistemic operators for apriority and its dual. We take this extension of Fitch’s familiar derivation system to be a very natural one, since the new rules and labeled lines hereby introduced preserve the structure of Fitch’s own rules for the modal case.
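As a hedged illustration of the diagonal quantification the abstract describes (the notation, including M, W, and the pair ⟨w, v⟩, is mine, not necessarily the paper's):

\[
\mathcal{M}, \langle w, v \rangle \models \Box_d \varphi \ \text{ iff for every } u \in W,\ \mathcal{M}, \langle u, u \rangle \models \varphi
\]
\[
\mathcal{M}, \langle w, v \rangle \models \Diamond_d \varphi \ \text{ iff for some } u \in W,\ \mathcal{M}, \langle u, u \rangle \models \varphi
\]

Both clauses look only at diagonal points ⟨u, u⟩, which is what makes the operators behave like apriority and its dual.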
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic, including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
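As a hedged sketch of what intelim rules for the Sheffer stroke might look like (my reconstruction from the general idea that A | B says A and B are not both true; the paper's own rules may differ):

\[
\mid\!I:\ \frac{\begin{array}{c}[A,\ B]\\ \vdots\\ \bot\end{array}}{A \mid B}
\qquad
\mid\!E:\ \frac{A \mid B \qquad A \qquad B}{\bot}
\]

The introduction rule discharges both A and B at once; the elimination rule produces absurdity from the stroke together with both components.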
This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
Harold Hodes in [1] introduces an extension of first-order modal logic featuring a backtracking operator, and provides a possible worlds semantics, according to which the operator is a kind of device for ‘world travel’; he does not provide a proof theory. In this paper, I provide a natural deduction system for modal logic featuring this operator, and argue that the system can be motivated in terms of a reading of the backtracking operator whereby it serves to indicate modal scope. I prove soundness and completeness theorems with respect to Hodes’ semantics, as well as semantics with fewer restrictions on the accessibility relation.
A construction principle for natural deduction systems for arbitrary, finitely-many-valued first order logics is exhibited. These systems are systematically obtained from sequent calculi, which in turn can be automatically extracted from the truth tables of the logics under consideration. Soundness and cut-free completeness of these sequent calculi translate into soundness, completeness, and normal-form theorems for the natural deduction systems.
We argue that the need for commentary in commonly used linear calculi of natural deduction is connected to the “deletion” of illocutionary expressions that express the role of propositions as reasons, assumptions, or inferred propositions. We first analyze the formalization of an informal proof in some common calculi which do not formalize natural language illocutionary expressions, and show that in these calculi the formalizations of the example proof rely on commentary devices that have no counterpart in the original proof. We then present a linear natural deduction calculus that makes use of formal illocutionary expressions in such a way that unique readability for derivations is guaranteed, thus showing that formalizing illocutionary expressions can eliminate the need for commentary.
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
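For orientation, and only as a hedged gloss (the paper works in intuitionist negative free logic, where the classical equivalence need not hold as stated): read classically, ιx[F, G] has the familiar Russellian truth-condition for ‘The F is G’:

\[
\iota x[F, G] \ \leftrightarrow\ \exists x\,\big(F(x) \wedge \forall y\,(F(y) \rightarrow y = x) \wedge G(x)\big)
\]

There is exactly one F, and it is G; the paper's intro and elim rules govern the binary quantifier directly rather than via this paraphrase.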
My goal here is to provide a detailed analysis of the methods of inference that are employed in De prospectiva pingendi. For this purpose, a method of natural deduction is proposed. The treatise by Piero della Francesca is a manifestation of a union between the fine arts and the mathematical sciences of arithmetic and geometry. He defines painting as a part of perspective and, speaking precisely, as a branch of geometry, which is why we find advanced geometrical exercises here.
PARC is an "appended numeral" system of natural deduction that I learned as an undergraduate and have taught for many years. Despite its considerable pedagogical strengths, PARC appears never to have been published. The system features explicit "tracking" of premises and assumptions throughout a derivation, the collapsing of indirect proofs into conditional proofs, and a very simple set of quantificational rules without the long list of exceptions that bedevil students learning existential instantiation and universal generalization. The system can be used with any Copi-style set of inference rules, so it is quite adaptable to many mainstream symbolic logic textbooks. Consequently, PARC may be especially attractive to logic teachers who find Jaskowski/Gentzen-style introduction/elimination rules to be far less "natural" than Copi-style rules. The PARC system is also keyboard-friendly in comparison to the widely adopted Jaskowski-style graphical subproof systems of natural deduction, viz., Fitch diagrams and Copi "bent arrow" diagrams.
This paper presents two systems of natural deduction for the rejection of non-tautologies of classical propositional logic. The first system is sound and complete with respect to the body of all non-tautologies, the second system is sound and complete with respect to the body of all contradictions. The second system is a subsystem of the first. Starting with Jan Łukasiewicz's work, we describe the historical development of theories of rejection for classical propositional logic. Subsequently, we present the two systems of natural deduction and prove them to be sound and complete. We conclude with a ‘Theorem of Inversion’.
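For background, Łukasiewicz's classical rejection apparatus, which the historical part of the abstract alludes to, is usually driven by two rules (writing ⊣ for rejection; the paper's own natural deduction rules may well differ):

\[
\frac{\vdash A \rightarrow B \qquad \dashv B}{\dashv A}\ \ \text{(reverse detachment)}
\qquad
\frac{\dashv A\sigma}{\dashv A}\ \ \text{(reverse substitution, } \sigma \text{ a substitution)}
\]

Together with the rejection of a single propositional variable as a rejection axiom, these suffice to reject every non-tautology of classical propositional logic.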
Building on the work of Peter Hinst and Geo Siegwart, we develop a pragmatised natural deduction calculus, i.e. a natural deduction calculus that incorporates illocutionary operators at the formal level, and prove its adequacy. In contrast to other linear calculi of natural deduction, derivations in this calculus are sequences of object-language sentences which do not require graphical or other means of commentary in order to keep track of assumptions or to indicate subproofs. (Translation of our German paper "Ein Redehandlungskalkül. Ein pragmatisierter Kalkül des natürlichen Schließens nebst Metatheorie"; available online at http://philpapers.org/rec/CORERE.)
Some logic students falter at the transition from the mechanical method of truth tables to the less mechanical method of natural deduction. This short paper introduces a word game intended to ease that transition.
This paper examines Hobbes’s criticisms of Robert Boyle’s air-pump experiments in light of Hobbes’s account in _De Corpore_ and _De Homine_ of the relationship of natural philosophy to geometry. I argue that Hobbes’s criticisms rely upon his understanding of what counts as “true physics.” Instead of seeing Hobbes as defending natural philosophy as “a causal enterprise … [that] as such, secured total and irrevocable assent,” I argue that, in his disagreement with Boyle, Hobbes relied upon his understanding of natural philosophy as a mixed mathematical science. In a mixed mathematical science one can mix facts from experience with causal principles borrowed from geometry. Hobbes’s harsh criticisms of Boyle’s philosophy, especially in the _Dialogus Physicus, sive De natura aeris_, should thus be understood as Hobbes advancing his view of the proper relationship of natural philosophy to geometry in terms of mixing principles from geometry with facts from experience. Understood in this light, Hobbes need not be taken to reject or diminish the importance of experiment/experience; nor should Hobbes’s criticisms in _Dialogus Physicus_ be understood as rejecting experimenting as ignoble and not befitting a philosopher. Instead, Hobbes’s viewpoint is that experiment/experience must be understood within its proper place – it establishes the ‘that’ for a mixed mathematical science explanation.
This essay partly builds on and partly criticizes a striking idea of Dieter Henrich. Henrich argues that Kant's distinction in the first Critique between the question of fact (quid facti) and the question of law (quid juris) provides clues to the argumentative structure of a philosophical "Deduction". Henrich suggests that the unity of apperception plays a role analogous to a legal factum. By contrast, I argue, first, that the question of fact in the first Critique is settled by the Metaphysical Deduction, which establishes the purity of origin of the Categories, and, second, that in the second Critique, the relevant factum is the Fact of Reason, which amounts to the fact that the Moral Law is pure in origin.
Deductive inference is usually regarded as being “tautological” or “analytical”: the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of “depth” or “informativeness” of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure “intelim logic”, which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is “analytic” in a particularly strict sense, in that it rules out any use of “virtual information”, which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
Future Logic is an original and wide-ranging treatise of formal logic. It deals with the deduction and induction of categorical and conditional propositions, involving the natural, temporal, extensional, and logical modalities. Traditional and modern logic have covered in detail only formal deduction from actual categoricals, or from logical conditionals (conjunctives, hypotheticals, and disjunctives). Deduction from modal categoricals has also been considered, though very vaguely and roughly; whereas deduction from natural, temporal and extensional forms of conditioning has been all but totally ignored. As for induction, apart from the elucidation of adductive processes (the scientific method), almost no formal work has been done. This is the first work ever to strictly formalize the inductive processes of generalization and particularization, through the novel methods of factorial analysis, factor selection and formula revision. It is also the first work ever to develop a formal logic of the natural, temporal and extensional types of conditioning (as distinct from logical conditioning), including their production from modal categorical premises. Future Logic contains a great many other new discoveries, organized into a unified, consistent and empirical system, with precise definitions of the various categories and types of modality (including logical modality), and full awareness of the epistemological and ontological issues involved. Though strictly formal, it uses ordinary language wherever symbols can be avoided. Among its other contributions: a full list of the valid modal syllogisms (which is more restrictive than previous lists); the main formalities of the logic of change (which introduces a dynamic instead of merely static approach to classification); the first formal definitions of the modal types of causality; a new theory of class logic, free of the Russell Paradox; as well as a critical review of modern metalogic. But it is impossible to list briefly all the innovations in logical science (and therefore in epistemology and ontology) this book presents; it has to be read for its scope to be appreciated.
This paper raises obvious questions undermining any residual confidence in Mates' work and revealing our embarrassing ignorance of the true nature of Stoic deduction. It was inspired by the challenging exploratory work of Josiah Gould.
I offer an alternative account of the relationship of Hobbesian geometry to natural philosophy by arguing that mixed mathematics provided Hobbes with a model for thinking about it. In mixed mathematics, one may borrow causal principles from one science and use them in another science without there being a deductive relationship between those two sciences. Natural philosophy for Hobbes is mixed because an explanation may combine observations from experience (the ‘that’) with causal principles from geometry (the ‘why’). My argument shows that Hobbesian natural philosophy relies upon suppositions that bodies plausibly behave according to these borrowed causal principles from geometry, acknowledging that bodies in the world may not actually behave this way. First, I consider Hobbes's relation to Aristotelian mixed mathematics and to Isaac Barrow's broadening of mixed mathematics in Mathematical Lectures (1683). I show that for Hobbes maker's knowledge from geometry provides the ‘why’ in mixed-mathematical explanations. Next, I examine two explanations from De corpore Part IV: (1) the explanation of sense in De corpore 25.1-2; and (2) the explanation of the swelling of parts of the body when they become warm in De corpore 27.3. In both explanations, I show Hobbes borrowing and citing geometrical principles and mixing these principles with appeals to experience.
Kant's A-Edition objective deduction is naturally (and has traditionally been) divided into two arguments: an "argument from above" and one that proceeds "von unten auf". This would suggest a picture of Kant's procedure in the objective deduction as first descending and then ascending the same ladder, the better, perhaps, to test its durability or to thoroughly convince the reader of its soundness. There are obvious obstacles to such a reading, however; and in this chapter I will argue that the arguments from above and below constitute different, albeit importantly inter-related, proofs. Rather than drawing on the differences in their premises, however, I will highlight what I take to be the different concerns addressed and, correspondingly, the distinct conclusions reached by each. In particular, I will show that both arguments can be understood to address distinct specters, with the argument from above addressing an internal concern generated by Kant’s own transcendental idealism, and the argument from below seeking to dispel a more traditional, broadly Humean challenge to the understanding’s role in experience. These distinct concerns also imply that these arguments yield distinct conclusions, though I will show that they are in fact complementary.
Medical diagnosis has been traditionally recognized as a privileged field of application for so-called probabilistic induction. Consequently, Bayes' theorem, which mathematically formalizes this form of inference, has been seen as the most adequate tool for quantifying the uncertainty surrounding a diagnosis, by providing probabilities of different diagnostic hypotheses given symptomatic or laboratory data. On the other hand, it has also been remarked that differential diagnosis rather works by exclusion, e.g. by modus tollens, i.e. deductively. By drawing on a case history, this paper aims at clarifying some points on the issue. Namely: 1) medical diagnosis does not represent, strictly speaking, a form of induction, but a type of what, in Peircean terms, should be called ‘abduction’ (identifying a case as the token of a specific type); 2) in performing the single diagnostic steps, however, different inferential methods of both inductive and deductive nature are used: modus tollens, the hypothetico-deductive method, abduction; 3) Bayes' theorem is a probabilized form of abduction which uses mathematics in order to justify the degree of confidence which can be entertained in a hypothesis given the available evidence; 4) although theoretically irreconcilable, in practice both the hypothetico-deductive method and the Bayesian one are used in the same diagnosis with no serious compromise to its correctness; 5) medical diagnosis, especially differential diagnosis, also uses a kind of “probabilistic modus tollens”, in that signs (symptoms or laboratory data) are taken as strong evidence that a given hypothesis is not true: the focus is not on hypothesis confirmation, but instead on its refutation [Pr(¬H | E1, E2, …, En)]. Especially at the beginning of a complicated case, the odds are between the hypothesis that is potentially being excluded and a vague “other”. This procedure has the advantage of providing a clue as to what evidence to look for and of eventually reducing the set of candidate hypotheses if conclusive negative evidence is found; 6) Bayes' theorem in its hypothesis-confirmation form can more faithfully, although idealistically, represent medical diagnosis when the diagnostic itinerary has come to a reduced set of plausible hypotheses after a process of progressive elimination of candidate hypotheses; 7) Bayes' theorem is, however, indispensable in the case of litigation in order to assess a doctor's responsibility for medical error by taking into account the weight of the evidence at his disposal.
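For reference, the two uses of Bayes' theorem the abstract contrasts can be written out; the first is the standard confirmation form, the second the refutation-oriented posterior sketched in point 5:

\[
\Pr(H \mid E) = \frac{\Pr(E \mid H)\,\Pr(H)}{\Pr(E)},
\qquad
\Pr(\neg H \mid E_1, \ldots, E_n) = 1 - \Pr(H \mid E_1, \ldots, E_n)
\]

On the refutation reading, diagnosis proceeds by driving the right-hand posterior toward 1 for candidate hypotheses, eliminating them from the differential.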
The aim of this paper is to examine the nature, scope and importance of philosophy in the light of its relation to other disciplines. The work focuses on various fundamental problems of philosophy relating to Ethics, Metaphysics, Epistemology and Logic, and on philosophy's association with scientific realism. It also highlights the various facets of these problems and the role of philosophers in pointing out issues relating to human concerns. It is widely agreed that philosophy is a multi-dimensional subject that shows affinity to other fields such as the Philosophy of Science, the Humanities, Physics and Mathematics; this paper also takes a philosophical approach to the universal problems of nature. It evaluates the contributions and sacrifices of the great sages of philosophy in promoting clarity and progress in the field.
Few of Kant’s doctrines are as difficult to understand as that of self-affection. Its brief career in the published literature consists principally in its unheralded introduction in the Transcendental Aesthetic and its unexpected re-appearance at a key moment in the Deduction chapter in the B edition of the first Critique. Kant’s commentators, confronted with the difficulty of this doctrine, have naturally resorted to various strategies of clarification, ranging from distinguishing between empirical and transcendental self-affection, to divorcing self-affection from the claims of self-knowledge with which Kant explicitly connects it, and, perhaps least justified of all, to ignoring the doctrine altogether. Yet the connection between self-affection and central Critical doctrines (such as the transcendental synthesis of the imagination) marks all of these strategies as last resorts. In this paper, I seek to provide a clearer outline of the constellation of issues which inform Kant’s discussion of self-affection. More particularly, I intend to explain the crucial role played by self-affection in the account of the transcendental conditions of perception provided late in the B Deduction.
In this paper, I discuss the debate on Kant and nonconceptual content. Inspired by Kant’s account of the intimate relation between intuition and concepts, McDowell (1996) has forcefully argued that the relation between sensible content and concepts is such that sensible content does not severally contribute to cognition but always only in conjunction with concepts. This view is known as conceptualism. Recently, Kantians Robert Hanna and Lucy Allais, among others, have brought against this view the charge that it neglects the possibility of the existence of essentially nonconceptual content that is not conceptualised or subject to conceptualisation. Their critique of McDowell amounts to nonconceptualism. However, both views, conceptualist and nonconceptualist, share the assumption that intuition is synthesised content in Kant’s sense. My interest is not in the validity of the philosophical positions of conceptualism or nonconceptualism per se. I am particularly interested in the extent to which the views that McDowell and Hanna and Allais respectively advance are true to Kant, or can validly be seen as Kantian positions. I argue that although McDowell is right that intuition is only epistemically relevant in conjunction with concepts, Hanna and Allais are right with regard to the existence of essentially nonconceptual content (intuitions) independently of the functions of the understanding, but that they are wrong with regard to non-conceptualised intuition being synthesised content in Kant’s sense. Kantian conceptualists (Bowman 2011; Griffith 2012; Gomes 2014) have responded to the recent nonconceptualist offensive, with reference to A89ff./B122ff. (§13) (which, confusingly, the nonconceptualists also cite as evidence for their contrary reading), by arguing that the nonconceptualist view conflicts with the central goal of the Transcendental Deduction, namely, to argue that all intuitions are subject to the categories. I contend that the conceptualist reading of A89ff./B122ff. is unfounded, but also that the nonconceptualists are wrong to believe that intuitions as such refer strictly to objects independently of the functions of the understanding, and that they are mistaken about the relation between figurative synthesis and intellectual synthesis. I argue that Kant is a conceptualist, albeit not in the sense that standard conceptualists assume. Perceptual knowledge is always judgemental, though without this resulting in the standard conceptualist claim that, necessarily, all intuitions or all perceptions per se stand under the categories (strong conceptualism). I endorse the nonconceptualist view that, for Kant, perception per se, i.e. any mere or ‘blind’ intuition of objects (i.e. objects as indeterminate appearances) short of perceptual knowledge, does not necessarily stand under the categories. Perception is not yet perceptual knowledge. In this context, I point out the common failure in the literature on the Transcendental Deduction, both of the conceptualist and nonconceptualist stripe, to take account of the modal nature of Kant’s argument for the relation between intuition and concept insofar as cognition should arise from it.
In this paper we focus on the logicality of language, i.e. the idea that the language system contains a deductive device to exclude analytic constructions. Puzzling evidence for the logicality of language comes from acceptable contradictions and tautologies. The standard response in the literature involves assuming that the language system only accesses analyticities that are due to skeletons as opposed to standard logical forms. In this paper we submit evidence in support of alternative accounts of logicality, which reject the stipulation of a natural logic and assume instead the meaning modulation of nonlogical terms.
Considering how Kant’s synthetic unity of apperception could be “naturalized,” this paper seeks to liberate the Kantian theory of experience from any foundationalist renderings that blur the lines between the empirical and transcendental, without compromising Kant’s attempt to investigate how the invariant structures of experience condition and supply rules for our knowledge of the world. This paper begins with an overview of the Transcendental Deduction’s apperceptive “I think.” We then consider Sellars’ Myth of Jones and Sellars’ notion of noumenal reality as a “limit concept”, understood not along metaphysical but pragmatist lines, where the “in-itself” is schematized as a regulatory ideal that normatively orients science as a self-correcting enterprise. Providing a successor-account to Sellars’ naturalization of Kant’s ‘I think,’ we seek to develop hard-transcendental and soft-transcendental pragmatic conditions to describe protocols for revision and integration, proffering an anti-dogmatic metaphysical stance that, true to Kant, expands our understanding of perception and linguistic licensing to include the kind of sensory and conceptual capacities associated with sapient experience.
In the 17th century, Hobbes stated that we reason by addition and subtraction. Historians of logic note that Hobbes thought of reasoning as “a ‘species of computation’” but point out that “his writing contains in fact no attempt to work out such a project.” Though Leibniz mentions the plus/minus character of the positive and negative copulas, neither he nor Hobbes says anything about a plus/minus character of other common logical words that drive our deductive judgments, words like ‘some’, ‘all’, ‘if’, and ‘and’, each of which actually turns out to have an oppositive character that allows us, “in our silent reasoning,” to ignore its literal meaning and to reckon with it as one reckons with a plus or a minus operator in elementary algebra or arithmetic. These ‘logical constants’ of natural language figure crucially in our everyday reasoning. Because Hobbes and Leibniz did not identify them as the plus and minus words we reason with, their insight into what goes on in ‘ratiocination’ did not provide a guide for a research program that could develop a +/- logic that actually describes how we reason deductively. I will argue that such a +/- logic provides a way back from modern predicate logic (the logic of quantifiers and bound variables that is now ‘standard logic’) to an Aristotelian term logic of natural language that had been the millennial standard logic.
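As a hedged illustration of the kind of +/- reckoning at issue (the transcription follows Sommers-style term logic; the paper's own system may refine it): ‘Every S is P’ is written -S + P and ‘Some S is P’ is written +S + P, and a syllogism is valid when its premises sum algebraically to its conclusion. Barbara, for instance:

\[
\underbrace{(-M + P)}_{\text{Every } M \text{ is } P} \;+\; \underbrace{(-S + M)}_{\text{Every } S \text{ is } M} \;=\; \underbrace{-S + P}_{\text{Every } S \text{ is } P}
\]

The middle term M cancels exactly as +M and -M would in elementary algebra, which is the sense in which one "reckons" with ‘all’ and ‘some’ as plus and minus operators.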
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts). Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable. The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction. Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items).
This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation. Phase III: Software Assisted Analysis. The approach in the second phase was very ‘manual’ and time consuming; the third phase is intended to ‘mechanize’ much of the work involved by means of spreadsheets (to begin with). This increases the reliability of calculations (though no errors were found, in fact), but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, each modus can now be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish: its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, has been truly fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions ‘mechanically’ (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done ‘manually’, and therefore not exhaustively and with certainty. It took over 72,000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has definitely now been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
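A minimal sketch of the kind of tabulation described, on one natural reading of the abstract (a 'modus' taken as one logically conceivable selection of truth-combinations of the items; the names rows and moduses are mine, and the paper's spreadsheets are far more elaborate):

import Data.List (subsequences)

-- Truth-value rows for two items P and Q: each row is one conjunction
-- of the items or their negations (PQ, P~Q, ~PQ, ~P~Q).
rows :: [(Bool, Bool)]
rows = [(p, q) | p <- [True, False], q <- [True, False]]

-- A "modus": one conceivable combination of rows marked as possible.
-- For 2 items there are 2^4 = 16 such combinations, numbered by position.
moduses :: [[(Bool, Bool)]]
moduses = subsequences rows

main :: IO ()
main = do
  putStrLn ("Number of 2-item moduses: " ++ show (length moduses))
  mapM_ print (zip [0 :: Int ..] moduses)

Scaling the same enumeration to 3 or 4 items gives the "grand matrices" whose size motivates the move to software in Phase III.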
“Nature” is one of the most challenging concepts in philosophy, and notoriously difficult to define. In ancient Greece, two strategies for coming to terms with nature were developed. On the one hand, nature was seen as a perfect geometrical order, analysable with the help of geometry and deductive reasoning. On the other hand, a more Dionysian view emerged, stressing nature’s unpredictability, capriciousness and fluidity. This view was exemplified by De Rerum Natura, a philosophical masterpiece in verse. In a philosophy course for science students, participants use both approaches. They are asked to give a definition of nature, and subsequently to capture nature in a poem. Quite consistently, their poetry proves more convincing than their definitions. In this paper, an anthology of student poetry is presented and analysed. To what extent may verse-writing as a philosophical assignment enable science students to come to terms with their understanding of nature?
The vision of natural kinds that is most common in the modern philosophy of biology, particularly with respect to the question whether species and other taxa are natural kinds, is based on a revision of the notion by Mill in A System of Logic. However, there was another conception that Whewell had previously captured well, which taxonomists have always employed, of kinds as being types that need not have necessary and sufficient characters and properties, or essences. These competing views employ different approaches to scientific methodologies: Mill’s class-kinds are not formed by induction but by deduction, while Whewell’s type-kinds are inductive. More recently, phylogenetic kinds (clades, or monophyletic-kinds) are inductively projectible, and escape Mill’s strictures. Mill’s version represents a shift in the notions of kinds from the biological to the physical sciences.
Contemporary Humeans treat laws of nature as statements of exceptionless regularities that function as the axioms of the best deductive system. Such ‘Best System Accounts’ marry realism about laws with a denial of necessary connections among events. I argue that Hume’s predecessor, George Berkeley, offers a more sophisticated conception of laws, equally consistent with the absence of powers or necessary connections among events in the natural world. On this view, laws are not statements of regularities but the most general rules God follows in producing the world. Pace most commentators, I argue that Berkeley’s view is neither instrumentalist nor reductionist. More important, the Berkeleyan Best System can solve some of the problems afflicting its Humean rivals, including the problems of theory choice and Nancy Cartwright’s ‘facticity’ dilemma. Some of these solutions are available in the contemporary context, without any appeal to God. Berkeley’s account deserves to be taken seriously in its own right.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. REPRINTED 1976. Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with the application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition, we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is: 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link .
In the proof-theoretic semantics approach to meaning, harmony, requiring a balance between introduction rules (I-rules) and elimination rules (E-rules) within a meaning-conferring natural-deduction proof system, is a central notion. In this paper, we consider two notions of harmony that have been proposed in the literature: 1. GE-harmony, requiring a certain form of the E-rules, given the form of the I-rules; 2. local intrinsic harmony, which imposes the existence of certain transformations of derivations, known as reduction and expansion. We propose a construction of the E-rules (in GE-form) from given I-rules, and prove that the constructed rules also satisfy local intrinsic harmony. The construction is based on a classification of I-rules, and constitutes an implementation of Gentzen’s (and Prawitz’s) remark that E-rules can be “read off” I-rules.
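For orientation, the general-elimination (GE) pattern can be illustrated with conjunction; this is the standard textbook form, not necessarily the paper's exact formulation. The two projection rules are replaced by a single rule that discharges both conjuncts:

\[
\frac{A \wedge B \qquad \begin{array}{c}[A,\ B]\\ \vdots\\ C\end{array}}{C}\ \wedge E_{\mathrm{GE}}
\]

The usual projections are recovered by taking C to be A or B, and the rule's shape is "read off" the ∧-introduction rule: whatever grounds suffice to introduce A ∧ B (namely A and B together) are exactly what the elimination makes available.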
The main objective of this PhD thesis is to present a method of obtaining strong normalization via natural ordinals, applicable to natural deduction systems and typed lambda calculus. The method includes (a) the definition of a numerical assignment that associates each derivation (or lambda term) with a natural number, and (b) the proof that this assignment decreases with reductions of maximal formulas (or redexes). Moreover, because the numerical assignment coincides with the length of a specific reduction sequence (the worst reduction sequence), it is the least upper bound on the length of reduction sequences. The main virtue of the method is that it is constructive and elementary, obtained only by analyzing structural and combinatorial properties of derivations and lambda terms, without appeal to any sophisticated mathematical tools. Together with the exposition of the method, a comparative study is presented of some articles in the literature that also obtain strong normalization by means we can identify with natural ordinal methods. Among them we highlight Howard [1968], which performs an ordinal analysis of Gödel’s Dialectica interpretation for intuitionistic first-order arithmetic. We reveal a fact about this article not noted by the author himself: a syntactic proof of the strong normalization theorem for the typed lambda calculus λ⊃ is a consequence of its results. This would be the first strong normalization proof in the literature. (Written in Portuguese.)
Universal Generalization is, if not the most poorly understood inference rule in natural deduction, then the least well explained or justified. The inference rule is, prima facie, quite ambitious: on the basis of a fact established of one thing, I may infer that the fact holds of every thing in the class to which the one belongs, a class which may contain indefinitely many things. How can such an inference be made with any confidence as to its validity or ability to preserve truth from premise to conclusion? My goal in this paper is to explain how Universal Generalization works in a way that makes sense of its ability to preserve truth. In doing so, I shall review common accounts of Universal Generalization and explain why they are inadequate or explanatorily unsatisfying. Happily, my account makes no ontological or epistemological presumptions and therefore should be compatible with whichever ontological or epistemological schemes the reader prefers.
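For reference, the rule under discussion in its standard form; the eigenvariable side condition is precisely what calls for the explanation the paper promises:

\[
\frac{\Gamma \vdash \varphi(a)}{\Gamma \vdash \forall x\,\varphi(x)}\ \text{UG} \qquad \text{provided } a \text{ does not occur in } \Gamma \text{ or in } \forall x\,\varphi(x)
\]

Because a occurs nowhere in the assumptions, nothing established of a depends on which thing a is, and that arbitrariness is what licenses generalizing to every member of the domain.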
In this paper I try to explain a strange omission in Hume’s methodological descriptions in his first Enquiry. In the course of this explanation I reveal a kind of rationalistic tendency in the latter work. It seems to contrast with the “experimental method” of his early Treatise of Human Nature, but, since I show that there is no discrepancy between the actual methods of the two works, I make an attempt to explain the change in Hume’s characterization of his own methods. This attempt leads to the question of his interpretation of the science of human nature. I argue that his view of this science was not a constant one and that initially he identified this science with his account of the passions. As this presupposes the primacy of Book 2 of his Treatise, I try to find new confirmations of the old hypothesis that this Book had been written before Book 1, which deals with the understanding. Finally, I show that this discussion of Hume’s methodology may be of some interest to proponents of conceptual analysis.
Proceedings on the Automatic Deduction System developed at the Philosophy Faculty of the UNAM in Mexico City (the Deduktor Mexican Group of Logics, working under the direction of Professor Hugo Padilla Chacón). Conference presented in the Mexican city of Guadalajara at the Universidad de Guadalajara, Jalisco, by invitation of the Latin American philosophical association SOPHIA. An early stage of the deductional systems for 2-valued logic. This work embodies the implementation of the first complete and standalone arithmetization of bivalent logic, within the theoretical framework of Hugo Padilla Chacón published in 1984.
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system, as had previously been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: first, that it presupposes no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system.
This paper and its sequel “look under the hood” of the usual sorts of proof-theoretic systems for certain well-known intuitionistic and classical propositional modal logics. Section 1 is preliminary. Of most importance: a marked formula will be the result of prefixing a formula in a propositional modal language with a step-marker, for this paper either 0 or 1. Think of 1 as indicating the taking of “one step away from 0.” Deductions will be constructed using marked formulas. Section 2 presents the model-theoretic concepts, based on those in [7], that guide the rest of this paper. Section 3 presents Natural Deduction systems IK and CK, formalizations of intuitionistic and classical one-step versions of K. In these systems, occurrences of step-markers allow deductions to display deductive structure that is covered over in familiar “no step” proof-theoretic systems for such logics. Box and Diamond are governed by Introduction and Elimination rules; the familiar K rule and Necessitation are derived (i.e. admissible) rules. CK is the result of adding the 0-version of the Rule of Excluded Middle to the rules which generate IK. Note: IK is the result of merely dropping that rule from those generating CK, without addition of further rules or axioms (as was needed in [7]). These proof-theoretic systems yield intuitionistic and classical consequence relations by the obvious definition. Section 4 provides some examples of what can be deduced in IK. Section 5 defines some proof-theoretic concepts that are used in Section 6 to prove the soundness of the consequence relation for IK (relative to the class of models defined in Section 2). Section 7 proves its completeness (relative to that class). Section 8 extends these results to the consequence relation for CK. (Looking ahead: Part 2 will investigate one-step proof-theoretic systems formalizing intuitionistic and classical one-step versions of some familiar logics stronger than K.)
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux, as used for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
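As a hedged illustration of the usual format behind such calculi (details of the particular report may vary): for a logic with truth values v1, ..., vn, sequents are many-sided, with one component per truth value,

\[
\Gamma_{v_1} \mid \Gamma_{v_2} \mid \cdots \mid \Gamma_{v_n}
\]

and such a sequent holds under a valuation just in case some formula A in some component Γ_{v_i} actually takes the value v_i. The two-sided classical sequent Γ ⊢ Δ is the special case n = 2, and the rules for each connective can be read off its truth table row by row.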
In the 19th century, a transition took place from the classical to the modern ideal of science: science would no longer be regarded as a categorical-deductive system of absolute truths, but instead as a hypothetical-deductive system of problematically conditional propositions. In this process, the synthetic a priori also took on more and more the status of something problematically conditional, which could be discovered and corrected empirically, and was itself even ultimately contingent upon experience. Along the way, it lost its original purpose, namely to formulate the conditions for the possibility of objective knowledge. To the extent that one continues to attribute objectivity to scientific knowledge, however, the question of the synthetic a priori remains current. The present volume aims to trace the historical roots and varied interpretations of the synthetic a priori while also seeking new approaches toward a contemporary reinterpretation of this fundamental concept.
Recent work in formal semantics suggests that the language system includes not only a structure-building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the ‘logicality of language’, accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is ‘blind’ to open-class terms, to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical, indeed quite exotic, kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-à-vis its open-class terms and employs a deductive system that is basically classical.
This presentation includes a complete bibliography of John Corcoran’s publications relevant to Aristotle’s logic. Sections I, II, III, and IV list respectively 23 articles, 44 abstracts, 3 books, and 11 reviews. Section I starts with two watershed articles published in 1972: the Philosophy & Phenomenological Research article, from Corcoran’s Philadelphia period that antedates his discovery of Aristotle’s natural deduction system, and the Journal of Symbolic Logic article, from his Buffalo period, first reporting his original results. It ends with works published in 2015. Some items are annotated as listed or with endnotes connecting them with other work and pointing out passages that, in retrospect, are seen to be misleading and in a few places erroneous. In addition, Section V, “Discussions”, is a nearly complete secondary bibliography of works describing, interpreting, extending, improving, supporting, and criticizing Corcoran’s work: 10 items published in the 1970s, 24 in the 1980s, 42 in the 1990s, 60 in the 2000s, and 70 in the current decade. The secondary bibliography is also annotated as listed or with endnotes: some simply quote from the cited item, but several answer criticisms and identify errors. Section VI, “Alternatives”, lists recent works on Aristotle’s logic oblivious of Corcoran’s research and, more generally in some cases, even of the Łukasiewicz-initiated tradition. As is evident from Section VII, “Acknowledgements”, Corcoran’s publications benefited from consultation with other scholars, most notably George Boger, Charles Kahn, John Mulhern, Mary Mulhern, Anthony Preus, Timothy Smiley, Michael Scanlan, Roberto Torretti, and Kevin Tracy. All of Corcoran’s Greek translations were done in collaboration with two or more classicists. Corcoran never published a sentence without discussing it with his colleagues and students.
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century. For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along. To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
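To make the proofs-as-programs correspondence concrete, here is a minimal sketch in Haskell; the example and the names proofK, modusPonens, and andElimL are mine, not the author's. A function type is an implication, a program inhabiting it is a proof, and running the program corresponds to normalizing the proof:

-- Proofs as programs, propositions as types (Curry-Howard).
-- The type  a -> (b -> a)  reads as the tautology  A -> (B -> A);
-- the term below is its proof (two ->-introductions in Gentzen's sense).
proofK :: a -> (b -> a)
proofK x _ = x

-- Modus ponens is just function application (->-elimination).
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x

-- Conjunction is the pair type; a projection is one of its E-rules.
andElimL :: (a, b) -> a
andElimL (x, _) = x

main :: IO ()
main = print (modusPonens andElimL (1 :: Int, "proof"))

Evaluating main reduces the applications away, which is exactly the proof-simplification (normalization) the text goes on to describe.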
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians: that between logic as formal ontology and logic as formal epistemology.
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system, and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.