One semantic and two syntactic decision procedures are given for determining the validity of Aristotelian assertoric and apodeictic syllogisms. Results are obtained by using the Aristotelian deductions that necessarily have an even number of premises.
It is commonly held that Kant ventured to derive morality from freedom in Groundwork III. It is also believed that he reversed this strategy in the second Critique, attempting to derive freedom from morality instead. In this paper, I set out to challenge these familiar assumptions: Kant’s argument in Groundwork III rests on a moral conception of the intelligible world, one that plays a role similar to that of the ‘fact of reason’ in the second Critique. Accordingly, I argue, there is no reversal in the proof-structure of Kant’s two works.
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.
It is one thing for a given proposition to follow or not to follow from a given set of propositions and it is quite another thing for it to be shown either that the given proposition follows or that it does not follow.* Using a formal deduction to show that a conclusion follows and using a countermodel to show that a conclusion does not follow are both traditional practices recognized by Aristotle and used down through the history of logic. These practices presuppose, respectively, a criterion of validity and a criterion of invalidity, each of which has been extended and refined by modern logicians: deductions are studied in formal syntax (proof theory) and countermodels are studied in formal semantics (model theory). The purpose of this paper is to compare these two criteria to the corresponding criteria employed in Boole’s first logical work, The Mathematical Analysis of Logic (1847). In particular, this paper presents a detailed study of the relevant metalogical passages and an analysis of Boole’s symbolic derivations. It is well known, of course, that Boole’s logical analysis of compound terms (involving ‘not’, ‘and’, ‘or’, ‘except’, etc.) contributed to the enlargement of the class of propositions and arguments formally treatable in logic. The present study shows, in addition, that Boole made significant contributions to the study of deductive reasoning. He identified the role of logical axioms (as opposed to inference rules) in formal deductions, and he conceived of the idea of an axiomatic deductive system (which yields logical truths by itself and which yields consequences when applied to arbitrary premises). Nevertheless, surprisingly, Boole’s attempt to implement his idea of an axiomatic deductive system involved striking omissions: Boole does not use his own formal deductions to establish validity.
Boole does give symbolic derivations, several of which are vitiated by “Boole’s Solutions Fallacy”: the fallacy of supposing that a solution to an equation is necessarily a logical consequence of the equation. This fallacy seems to have led Boole to confuse equational calculi (i.e., methods for generating solutions) with deduction procedures (i.e., methods for generating consequences). The methodological confusion is closely related to the fact, shown in detail below, that Boole had adopted an unsound criterion of validity. It is also shown that Boole totally ignored the countermodel criterion of invalidity. Careful examination of the text does not reveal with certainty a test for invalidity which was adopted by Boole. However, we have isolated a test that he seems to use in this way and we show that this test is ineffectual in the sense that it does not serve to identify invalid arguments. We go beyond the simple goal stated above. Besides comparing Boole’s earliest criteria of validity and invalidity with those traditionally (and still generally) employed, this paper also investigates the framework and details of The Mathematical Analysis of Logic.
This interesting and imaginative monograph is based on the author’s PhD dissertation supervised by Saul Kripke. It is dedicated to Timothy Smiley, whose interpretation of PRIOR ANALYTICS informs its approach. As suggested by its title, this short work demonstrates conclusively that Aristotle’s syllogistic is a suitable vehicle for fruitful discussion of contemporary issues in logical theory. Aristotle’s syllogistic is represented by Corcoran’s 1972 reconstruction. The review studies Lear’s treatment of Aristotle’s logic, his appreciation of the Corcoran-Smiley paradigm, and his understanding of modern logical theory. In the process Corcoran and Scanlan present new, previously unpublished results. Corcoran regards this review as an important contribution to contemporary study of PRIOR ANALYTICS: both the book and the review deserve to be better known.
As noted in 1962 by Timothy Smiley, if Aristotle’s logic is faithfully translated into modern symbolic logic, the fit is exact. If categorical sentences are translated into many-sorted logic MSL according to Smiley’s method or the two other methods presented here, an argument with arbitrarily many premises is valid according to Aristotle’s system if and only if its translation is valid according to modern standard many-sorted logic. As William Parry observed in 1973, this result can be proved using my 1972 proof of the completeness of Aristotle’s syllogistic.
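To make the translation idea concrete, one standard sorted rendering of the four categorical forms runs as follows. This is an illustrative sketch only: the clauses below are assumptions in the general spirit of many-sorted translation, not a quotation of Smiley's method or of the two alternatives the abstract mentions.

```latex
% Categorical sentences with subject term S and predicate term P,
% rendered in MSL with a sorted variable x ranging over the sort S.
\begin{align*}
\text{Every $S$ is $P$}    &\quad\rightsquigarrow\quad \forall x{:}S\; P(x)\\
\text{No $S$ is $P$}       &\quad\rightsquigarrow\quad \forall x{:}S\; \neg P(x)\\
\text{Some $S$ is $P$}     &\quad\rightsquigarrow\quad \exists x{:}S\; P(x)\\
\text{Some $S$ is not $P$} &\quad\rightsquigarrow\quad \exists x{:}S\; \neg P(x)
\end{align*}
% Since sorts are nonempty in standard MSL semantics, such a translation
% secures the existential import Aristotle's universal sentences carry.
```

Under clauses of this kind, the exactness of fit the abstract describes amounts to: an argument is valid in Aristotle's system if and only if its sorted translation is MSL-valid.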
The Neo-Moorean Deduction (I have a hand, so I am not a brain-in-a-vat) and the Zebra Deduction (the creature is a zebra, so it isn’t a cleverly disguised mule) are notorious. Crispin Wright, Martin Davies, Fred Dretske, and Brian McLaughlin, among others, argue that these deductions are instances of transmission failure. That is, they argue that these deductions cannot transmit justification to their conclusions. I contend, however, that the notoriety of these deductions is undeserved. My strategy is to clarify, attack, defend, and apply. I clarify what transmission and transmission failure really are, thereby exposing two questionable but quotidian assumptions. I attack existing views of transmission failure, especially those of Crispin Wright. I defend a permissive view of transmission failure, one which holds that deductions of a certain kind fail to transmit only because of premise circularity. Finally, I apply this account to the Neo-Moorean and Zebra Deductions and show that, given my permissive view, these deductions transmit in an intuitively acceptable way—at least if either a certain type of circularity is benign or a certain view of perceptual justification is false.
Published in 1903, this book was the first comprehensive treatise on the logical foundations of mathematics written in English. It sets forth, as far as possible without mathematical and logical symbolism, the grounds in favour of the view that mathematics and logic are identical. It proposes simply that what is commonly called mathematics consists merely of later deductions from logical premises. It provided the thesis for which _Principia Mathematica_ provided the detailed proof, and introduced the work of Frege to a wider audience. In addition to the new introduction by John Slater, this edition contains Russell's introduction to the 1937 edition in which he defends his position against his formalist and intuitionist critics.
In this paper, I investigate whether we can use a world-involving framework to model the epistemic states of non-ideal agents. The standard possible-world framework falters in this respect because of a commitment to logical omniscience. A familiar attempt to overcome this problem centers around the use of impossible worlds where the truths of logic can be false. As we shall see, if we admit impossible worlds where “anything goes” in modal space, it is easy to model extremely non-ideal agents that are incapable of performing even the most elementary logical deductions. A much harder, and considerably less investigated, challenge is to ensure that the resulting modal space can also be used to model moderately ideal agents that are not logically omniscient but nevertheless logically competent. Intuitively, while such agents may fail to rule out subtly impossible worlds that verify complex logical falsehoods, they are nevertheless able to rule out blatantly impossible worlds that verify obvious logical falsehoods. To model moderately ideal agents, I argue, the job is to construct a modal space that contains only possible and non-trivially impossible worlds where it is not the case that “anything goes”. But I prove that it is impossible to develop an impossible-world framework that can do this job and that satisfies certain standard conditions. Effectively, I show that attempts to model moderately ideal agents in a world-involving framework collapse to modeling either logically omniscient agents or extremely non-ideal agents.
Reason is the tool of our knowledge, but in philosophy this tool encounters difficulties, especially when it is faced with the big questions, the source of philosophy's deep disagreements. Another difficulty arises from the fact that philosophy and religion cross each other's path: the first draws deductions from rational principles in its approach to religion, while the second does not remain firm on its terrain; it keeps looking for rational answers. In essence, this is what this article deals with. If for thousands of years philosophers have been taking turns in arguing and disagreeing with one another on some big questions, there comes a time when the circle has to be broken. In fact, that time has come; this article offers a novel approach to end the age-old debate on, for instance, God talk.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
It’s often thought that the phenomenon of risk aggregation poses a problem for multi-premise closure but not for single-premise closure. But recently, Lasonen-Aarnio and Schechter have challenged this thought. Lasonen-Aarnio argues that, insofar as risk aggregation poses a problem for multi-premise closure, it poses a similar problem for single-premise closure. For she thinks that, there being such a thing as deductive risk, risk may aggregate over a single premise and the deduction itself. Schechter argues that single-premise closure succumbs to risk aggregation outright. For he thinks that there could be a long sequence of competent single-premise deductions such that, even though we are justified in believing the initial premise of the sequence, intuitively, we are not justified in believing the final conclusion. This intuition, Schechter thinks, vitiates single-premise closure. In this paper, I defend single-premise closure against the arguments offered by Lasonen-Aarnio and Schechter.
A new proof style adequate for modal logics is defined from the polynomial ring calculus. The new semantics not only expresses truth conditions of modal formulas by means of polynomials, but also permits deductions to be performed through polynomial handling. This paper also investigates relationships among the PRC here defined, the algebraic semantics for modal logics, equational logics, the Dijkstra-Scholten equational-proof style, and rewriting systems. The method proposed is thoroughly exemplified for S5, and can be easily extended to other modal logics.
The paper revisits the rationality principle from the particular perspective of the unity of social sciences. It has been argued that the principle was the unique law of the social sciences and that accordingly there are no deep differences between them (Popper). It has also been argued that the rationality principle was specific to economics as opposed to the other social sciences, especially sociology (Pareto). The paper rejects these opposite views on the grounds that the rationality principle is strictly metaphysical and does not have the logical force required to deliver interesting deductions. Explanation in the social sciences takes place at a level of specialization that is always higher than that of the principle itself. However, what is peculiar about economics is that it specializes the explanatory rational schemes to a degree unparalleled in history and sociology. As a consequence, there is a backward-and-forward move between specific and general formulations of rationality that takes place in economics and has no analogue in the other social sciences.
Chapin reviewed this 1972 ZEITSCHRIFT paper that proves the completeness theorem for the logic of variable-binding-term operators created by Corcoran and his student John Herring in the 1971 LOGIQUE ET ANALYSE paper in which the theorem was conjectured. This leveraging proof extends completeness of ordinary first-order logic to the extension with vbtos. Newton da Costa independently proved the same theorem about the same time using a Henkin-type proof. This 1972 paper builds on the 1971 “Notes on a Semantic Analysis of Variable Binding Term Operators” (co-author John Herring), Logique et Analyse 55, 646–57. MR0307874 (46 #6989). A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague 1964 proposed using vbtos to formalize definite descriptions “the x: x+x=2”, set abstracts {x: F}, minimization in recursive function theory “the least x: x+x>2”, etc. However, they gave no semantics for vbtos. Hatcher 1968 gave a semantics but one that has flaws described in the 1971 paper and admitted by Hatcher. In 1971 we give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture, proved in this paper with Hatcher’s help, was proved independently about the same time by Newton da Costa.
A variable binding term operator (vbto) is a non-logical constant, say v, which combines with a variable y and a formula F containing y free to form a term (vy:F) whose free variables are exactly those of F, excluding y. Kalish-Montague proposed using vbtos to formalize definite descriptions, set abstracts {x: F}, minimization in recursive function theory, etc. However, they gave no semantics for vbtos. Hatcher gave a semantics but one that has flaws. We give a correct semantic analysis of vbtos. We also give axioms for using them in deductions. And we conjecture strong completeness for the deductions with respect to the semantics. The conjecture was later proved independently by the authors and by Newton da Costa. The expression (vy:F) is called a variable bound term (vbt). In case F has only y free, (vy:F) has the syntactic properties of an individual constant; and under a suitable interpretation of the language (vy:F) denotes an individual. By a semantic analysis of vbtos we mean a proposal for amending the standard notions of (1) "an interpretation of a first-order language" and (2) "the denotation of a term under an interpretation and an assignment", such that (1') an interpretation of a first-order language associates a set-theoretic structure with each vbto and (2') under any interpretation and assignment each vbt denotes an individual.
This short paper has two loosely connected parts. In the first part, I discuss the difference between classical and intuitionist logic in relation to the different roles hypotheses play in each logic. Harmony is normally understood as a relation between two ways of manipulating formulas in systems of natural deduction: their introduction and elimination. I argue, however, that there is at least a third way of manipulating formulas, namely the discharge of assumptions, and that the difference between classical and intuitionist logic can be characterised as a difference in the conditions under which discharge is allowed. Harmony, as ordinarily understood, has nothing to say about discharge. This raises the question whether the notion of harmony can be suitably extended. This requires there to be a suitable fourth way of manipulating formulas that discharge can stand in harmony to. The question is whether there is such a notion: what might it be that stands to discharge of formulas as introduction stands to elimination? One that immediately comes to mind is the making of assumptions. I leave it as an open question for further research whether the notion of harmony can be fruitfully extended in the way suggested here. In the second part, I discuss bilateralism, which proposes a wholesale revision of what it is that is assumed and manipulated by rules of inference in deductions: rules apply to speech acts – assertions and denials – rather than propositions. I point out two problems for bilateralism. First, bilateralists cannot, contrary to what they claim to be able to do, draw a distinction between the truth and assertibility of a proposition. Secondly, it is not clear what it means to assume an expression such as '+ A' that is supposed to stand for an assertion. Worse than that, it is plausible that making an assumption is a particular speech act, as argued by Dummett (Frege: Philosophy of Language, p.309ff).
Bilateralists accept that speech acts cannot be embedded in other speech acts. But then it is meaningless to assume + A or − A.
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697) to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions", indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED 14 March 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th Century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P,Q/R becomes P,~R/~Q or ~R, Q/~P. Saccheri’s work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
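The contrast between the two operations can be displayed schematically, using the review's own premises/conclusion notation; this is only a restatement of the distinction drawn above, not an addition to Saccheri.

```latex
% Indirect reduction: transform one argument into another with the
% same number of premises; no new assumption is made.
\[
P,\; Q \;/\; R
\quad\text{becomes}\quad
P,\; \neg R \;/\; \neg Q
\quad\text{or}\quad
\neg R,\; Q \;/\; \neg P
\]
% Indirect deduction: keep the conclusion R, augment the premises P, Q
% with the assumption \neg R, and derive a contradiction.
\[
P,\; Q,\; \neg R \;\vdash\; \bot
\quad\Longrightarrow\quad
P,\; Q \;\vdash\; R
\]
```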
John Corcoran and George Boger. Aristotelian logic and Euclidean geometry. Bulletin of Symbolic Logic. 20 (2014) 131. By an Aristotelian logic we mean any system of direct and indirect deductions, chains of reasoning linking conclusions to premises—complete syllogisms, to use Aristotle’s phrase—1) intended to show that their conclusions follow logically from their respective premises and 2) resembling those in Aristotle’s Prior Analytics. Such systems presuppose existence of cases where it is not obvious that the conclusion follows from the premises: there must be something deductions can show. Corcoran calls a proposition that follows from given premises a hidden consequence of those premises if it is not obvious that the proposition follows from those premises. By a Euclidean geometry we mean an extended discourse beginning with basic premises—axioms, postulates, definitions—1) treating a universe of geometrical figures and 2) resembling Euclid’s Elements. There were Euclidean geometries before Euclid (fl. 300 BCE), even before Aristotle (384–322 BCE). Bochenski, Lukasiewicz, Patzig and others never knew this, or if they did, they found it inconvenient to mention. Euclid shows no awareness of Aristotle. It is obvious today—as it should have been obvious in Euclid’s time, if anyone knew both—that Aristotle’s logic was insufficient for Euclid’s geometry: few if any geometrical theorems can be deduced from Euclid’s premises by means of Aristotle’s deductions. Aristotle’s writings don’t say whether his logic is sufficient for Euclidean geometry, and there is not even one fully presented example. However, Aristotle’s writings do make clear that he endorsed the goal of a sufficient system. Nevertheless, incredible as this is today, many logicians after Aristotle claimed that Aristotelian logics are sufficient for Euclidean geometries. This paper reviews and analyses such claims by Mill, Boole, De Morgan, Russell, Poincaré, and others.
It also examines early contrary statements by Hintikka, Mueller, Smith, and others. Special attention is given to the argumentations pro or con and especially to their logical, epistemic, and ontological presuppositions. What methodology is necessary or sufficient to show that a given logic is adequate or inadequate to serve as the underlying logic of a given science?
Ordinarily counterfactuals are seen as making statements about states of affairs, albeit ones that hold in merely possible or alternative worlds. Thus analyzed, nearly all counterfactuals turn out to be incoherent. Any counterfactual, thus analyzed, requires that there be a metaphysically (not just epistemically) possible world w where the laws are the same as here, and where almost all of the facts are the same as here. (The factual differences relate to the antecedent and consequent of the counterfactual.) But, as (...) I show, this requirement typically involves the positing of worlds whose necessary non-existence can be shown by fairly elementary deductions. Further, the possible-worlds analysis of counterfactuals is guilty of covert circularity. For, thus analyzed, counterfactuals can only be understood in terms of laws of nature (the laws that apply here are assumed in the hypothetical world, except in the atypical case where the counterfactual is also a counter-nomic). But the concept of a law cannot itself be defined except in terms of the notion of a counterfactual (a law is given by a counterfactual-supporting proposition). I give a purely epistemic analysis of counterfactuals, arguing that they are crypto-probability propositions. I also argue that the relevant kind of probability can be defined wholly in terms of what has happened (not what would happen and not even what must happen in a nomic sense). So my analysis isn’t guilty of any kind of circularity. (shrink)
The Philosopher Queen: Feminist Essays on War, Love, and Knowledge. By Chris Cuomo. Lanham, Md.: Rowman and Littlefield Publishers, Inc., 2003. The Philosopher Queen is a powerful illustration of what Cherríe Moraga calls a "theory in the flesh." That is, theorizing from a place where "physical realities of our lives—our skin color, the land or concrete we grow up on, our sexual longings—all fuse to create a politic [and, I would add, an ethics, spirituality, and epistemology] born out of necessity" (...) (Moraga 21). Cuomo's theory in the flesh combines standard philosophical essays with personal narratives and invites us to do philosophy from this joyful and witty place. Readers are invited to reframe and reexamine war, science, gender, sexuality, race, ecology, knowledge, and politics in a voice that is fearless, funny, faithful, and feminist—one that disrupts common understandings of how philosophy ought to be done. Instead philosophy should help us to "negotiate a wild, wicked world, and to provide some understanding of being and existence. The best philosophy aims to promote good and to produce knowledge, and therefore enable flourishing" (xi). Accepted philosophical approaches alone are inadequate. Life's challenges resist formulaic solutions. Knowledge is not always produced through neat deductions: truths are partial, power divides, stomachs growl, hearts are broken, and emotions influence... (shrink)
A truth-preservation fallacy is using the concept of truth-preservation where some other concept is needed. For example, in certain contexts saying that consequences can be deduced from premises using truth-preserving deduction rules is a fallacy if it suggests that all truth-preserving rules are consequence-preserving. The arithmetic additive-associativity rule that yields 6 = (3 + (2 + 1)) from 6 = ((3 + 2) + 1) is truth-preserving but not consequence-preserving. As noted in James Gasser’s dissertation, Leibniz has been criticized for (...) using that rule in attempting to show that arithmetic equations are consequences of definitions. -/- A system of deductions is truth-preserving if each of its deductions having true premises has a true conclusion—and consequence-preserving if, for any given set of sentences, each deduction having premises that are consequences of that set has a conclusion that is a consequence of that set. Consequence-preserving amounts to: in each of its deductions the conclusion is a consequence of the premises. The same definitions apply to deduction rules considered as systems of deductions. Every consequence-preserving system is truth-preserving. It is not as well-known that the converse fails: not every truth-preserving system is consequence-preserving. Likewise for rules: not every truth-preserving rule is consequence-preserving. There are many famous examples. In ordinary first-order Peano-Arithmetic, the induction rule yields the conclusion ‘every number x is such that: x is zero or x is a successor’—which is not a consequence of the null set—from two tautological premises, which are consequences of the null set, of course. The arithmetic induction rule is truth-preserving but not consequence-preserving. Truth-preserving rules that are not consequence-preserving are non-logical or extra-logical rules. 
Such rules are unacceptable to persons espousing traditional truth-and-consequence conceptions of demonstration: a demonstration shows its conclusion is true by showing that its conclusion is a consequence of premises already known to be true. The 1965 Preface in Benson Mates (1972, vii) contains the first occurrence of truth-preservation fallacies in the book. (shrink)
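The entry's arithmetic example can be made concrete in a short sketch (my illustration, not from the paper): the additive-associativity rewrite that takes 6 = ((3 + 2) + 1) to 6 = (3 + (2 + 1)) is truth-preserving because regrouping never changes a sum's value, even though, as the entry explains, such a rule need not be consequence-preserving. The tuple encoding of expressions below is a hypothetical representation of mine.

```python
# Sketch (illustrative, not from the paper): the additive-associativity
# rewrite is truth-preserving. Sums are encoded as nested pairs, so
# ((3 + 2) + 1) is ((3, 2), 1).

def regroup(expr):
    """Rewrite ((a + b) + c) as (a + (b + c)), recursively."""
    if isinstance(expr, tuple):
        left, right = expr
        if isinstance(left, tuple):
            a, b = left
            return (regroup(a), (regroup(b), regroup(right)))
        return (left, regroup(right))
    return expr

def value(expr):
    """Evaluate a nested-pair sum to a number."""
    if isinstance(expr, tuple):
        return value(expr[0]) + value(expr[1])
    return expr

before = ((3, 2), 1)      # ((3 + 2) + 1)
after = regroup(before)   # (3, (2, 1)), i.e. (3 + (2 + 1))
assert value(before) == value(after) == 6   # truth is preserved
```

The sketch only exhibits truth-preservation; the entry's point is precisely that no such numerical check can establish consequence-preservation, which concerns what follows from a premise-set rather than what is true.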
ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind--they (...) must be thought of either as disguised linear theories or theories of a third kind (see postscript below). The second purpose of this part is to develop some of the main ideas needed in constructing a comprehensive theory of proof. The reason for choosing the linear and suppositional theories for this purpose is that the linear theory includes only rules of a very simple nature, and the suppositional theory can be seen as the result of making the linear theory more comprehensive. CORRECTION: At the time these articles were written the word ‘proof’, especially in the phrase ‘proof from hypotheses’, was widely used to refer to what were earlier and are now called deductions. I ask your forgiveness. I have forgiven Church and Henkin who misled me. (shrink)
I argue 1) That in his celebrated Is/Ought passage, Hume employs ‘deduction’ in the strict sense, according to which if a conclusion B is justly or evidently deduced from a set of premises A, it cannot be that the premises A are true and the conclusion B false. 2) That Hume was following the common custom of his times, which sometimes employed ‘deduction’ in a strict sense to denote inferences in which, in the words of Dr Watts’ Logick, ‘the (...) premises, according to the reason of things, do really contain the conclusion that is deduced from them’; that although Hume sometimes uses ‘demonstrative argument’ as a synonym for ‘deduction’, like most of his contemporaries, he generally reserves the word ‘demonstration’ for deductive inferences in which the premises are both necessary and self-evident. 3) That Mr Hume did indeed mean to suggest that deductions from IS to OUGHT were ‘altogether inconceivable’ since if ought represents a new relation or affirmation, it cannot, in the strict sense, be justly deduced from premises which do not really contain it. 4) That in a large and liberal (or perhaps loose and promiscuous) sense Hume does deduce oughts and ought nots from observations concerning human affairs, but that the deductions in question are not inferences but explanations, since in another sense of ‘deduce’, common in the Eighteenth Century, to deduce B from A is to trace B back to A or to explain B in terms of A; 5) That a small attention to the context of Hume’s remarks and to the logical notions on which they are based would subvert those vulgar systems of philosophy which exaggerate the distinction between fact and value; for just because it is ‘altogether inconceivable’ that the new relation or affirmation OUGHT should be a deduction from others that are entirely different from it, it does not follow that the facts represented by IS and IS NOT are at bottom any different from the values represented by OUGHT and OUGHT NOT. (shrink)
The teaching of the Aquinas Academy in its first thirty years was based on the scholastic philosophy of Thomas Aquinas, then regarded as the official philosophy of the Catholic Church. That philosophy has not been so much heard of in the last thirty years, but it has a strong presence below the surface. Its natural law theory of ethics, especially, still informs Vatican pronouncements on moral topics such as contraception and euthanasia. It has also been important in Australia in the (...) High Court’s deliberations on the Mabo case. It is argued that some officially-sanctioned deductions on particular cases have not been correct, but that any attempt to do without a natural law foundation of ethics would throw out the baby with the bathwater. The sense of the basic objective worth of persons that is the centre of natural law ethics is essential to any ethics better than a simple “might is right” approach. (shrink)
In this paper we prove the completeness of three logical systems IL1, IL2 and IL3. IL1 deals solely with identities (a = b), and its deductions are the direct deductions constructed with the three traditional rules: (T) from a = b and b = c infer a = c, (S) from a = b infer b = a and (A) infer a = a (from anything). IL2 deals solely with identities and inidentities (a ≠ b) and its (...) deductions include both the direct and the indirect deductions constructed with the three traditional rules. IL3 is a hybrid of IL1 and IL2: its deductions are all direct as in IL1 but it deals with identities and inidentities as in IL2. IL1 and IL2 have a high degree of naturalness. Although the hybrid system IL3 was constructed as an artifact useful in the mathematical study of IL1 and IL2, it nevertheless has some intrinsically interesting aspects. The main motivation for describing and studying such simple systems is pedagogical. In teaching beginning logic one would like to present a system of logic which has the following properties. First, it exemplifies the main ideas of logic: implication, deduction, non-implication, counterargument (or countermodel), logical truth, self-contradiction, consistency, satisfiability, etc. Second, it exemplifies the usual general metaprinciples of logic: contraposition and transitivity of implication, cut laws, completeness, soundness, etc. Third, it is simple enough to be thoroughly grasped by beginners. Fourth, it is obvious enough so that its rules do not appear to be arbitrary or purely conventional. Fifth, it does not invite confusions which must be unlearned later. Sixth, it involves a minimum of presuppositions which are no longer accepted in mainstream contemporary logic. (shrink)
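The identity system IL1 described in this entry lends itself to a short sketch (my illustration, not from the paper): the closure of a set of identities under rules (T), (S), and (A) is exactly an equivalence relation on terms, so a union-find structure decides whether an identity is deducible from given identity premises. The function name `follows` is my own choice.

```python
# Sketch (not from the paper): deciding whether an identity follows from
# identity premises under IL1's three rules:
#   (T) from a = b and b = c infer a = c
#   (S) from a = b infer b = a
#   (A) infer a = a (from anything)
# Closure under T, S, A is an equivalence relation, so union-find decides it.

def follows(premises, conclusion):
    """premises: iterable of (a, b) identities; conclusion: a pair (a, b)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # Merging classes captures (S) symmetry and (T) transitivity at once.
    for a, b in premises:
        parent[find(a)] = find(b)

    a, b = conclusion
    # a == b covers rule (A): a = a is deducible from anything.
    return a == b or find(a) == find(b)
```

Under this sketch, `follows([("a", "b"), ("b", "c")], ("c", "a"))` returns `True`, mirroring a chain of (T) and (S) steps, while `follows([("a", "b")], ("a", "c"))` returns `False`, corresponding to a countermodel in which a and c denote distinct objects.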
Following Quine [] and others we take deductions to produce knowledge of implications: a person gains knowledge that a given premise-set implies a given conclusion by deducing—producing a deduction of—the conclusion from those premises. How does this happen? How does a person recognize their desire for knowledge of a certain implication, or recognize that they lack it? How do they produce a suitable deduction? And most importantly, how does their production of that deduction provide them with knowledge of the (...) implication? What experienceable sign reveals to the reasoner that they have achieved the desired knowledge? If a deduction is an array of inscriptions constructed by following syntactical—mechanical, machine-performable—rules as suggested by Tarski, Carnap, Church, and others, the epistemic question becomes even more pressing and more challenging. Moreover, deduction, the ability to produce deductions and to recognize them when produced, is operational knowledge that presupposes other component operations such as recognizing characters, making assumptions, inferring conclusions from premises, and chaining inferences [AL]. (shrink)
To eliminate incompleteness, undecidability and inconsistency from formal systems, we need only convert the formal proofs of symbolic logic to theorem consequences that conform to the sound deductive inference model. -/- Within the sound deductive inference model there is a connected sequence of valid deductions from true premises to a true conclusion; thus, unlike the formal proofs of symbolic logic, provability cannot diverge from truth.
Since string theory has not been able to explain phenomena to date, it may seem that this confirms Feyerabend's view that there is no "method" of science. And yet, string theory is still the most active research program for quantum gravity. Compared to other non-falsifiable theories, it has something extra: a mathematical language with a clear logic of deductions. Up to a point it can reproduce classical gauge theories and general relativity. And there is hope that in the (...) not too distant future experiments can be developed to test the theory. DOI: 10.13140/RG.2.2.28892.33920. (shrink)
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; (...) but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought. -/- This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, and thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis’s S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems--as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of “a proof of a sentence from a set of premises” where today “a deduction of a sentence from a set of premises” would be preferable. 1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77. -/- 2. Corcoran, John and George Weaver. 1969.
Logical Consequence in Modal Logic: Natural Deduction in S5 Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524). 3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253). (shrink)
In this paper, I address the negative side effects on face-to-face communication and well-being resulting from our continual use of mobile-mediated technology. I consider these consequences by drawing on Søren Kierkegaard's deductions on deficient communication, and discuss one remedy he suggests: a closer relationship with nature. However, technology is so ubiquitous in the modern age that the prospect of escaping it is nearly futile. In response, I offer a solution from the ideology of friluftsliv, which views a regular relationship (...) with nature as a way of getting in touch with one's natural human identity and restoring balance in life. I draw parallels between friluftsliv and Kierkegaard's ideas on nature and walking for curative purposes. I argue that the answer to our problem is not to shun technology, but to experience a regular relationship with nature as a way of offsetting its harmful effects. (shrink)