In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.
For deductive reasoning to be justified, it must be guaranteed to preserve truth from premises to conclusion; and for it to be useful to us, it must be capable of informing us of something. How can we capture this notion of information content, whilst respecting the fact that the content of the premises, if true, already secures the truth of the conclusion? This is the problem I address here. I begin by considering and rejecting several accounts of informational content. I then develop an account on which informational contents are indeterminate in their membership. This allows there to be cases in which it is indeterminate whether a given deduction is informative. Nevertheless, on the picture I present, there are determinate cases of informative (and determinate cases of uninformative) inferences. I argue that the model I offer is the best way for an account of content to respect the meaning of the logical constants and the inference rules associated with them without collapsing into a classical picture of content, unable to account for informative deductive inferences.
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
It’s often thought that the phenomenon of risk aggregation poses a problem for multi-premise closure but not for single-premise closure. But recently, Lasonen-Aarnio and Schechter have challenged this thought. Lasonen-Aarnio argues that, insofar as risk aggregation poses a problem for multi-premise closure, it poses a similar problem for single-premise closure. For she thinks that, there being such a thing as deductive risk, risk may aggregate over a single premise and the deduction itself. Schechter argues that single-premise closure succumbs to risk aggregation outright. For he thinks that there could be a long sequence of competent single-premise deductions such that, even though we are justified in believing the initial premise of the sequence, intuitively, we are not justified in believing the final conclusion. This intuition, Schechter thinks, vitiates single-premise closure. In this paper, I defend single-premise closure against the arguments offered by Lasonen-Aarnio and Schechter.
Following Quine [] and others we take deductions to produce knowledge of implications: a person gains knowledge that a given premise-set implies a given conclusion by deducing—producing a deduction of—the conclusion from those premises. How does this happen? How does a person recognize their desire for knowledge of a certain implication, or recognize that they lack it? How do they produce a suitable deduction? And most importantly, how does their production of that deduction provide them with knowledge of the implication? What experienceable sign reveals to the reasoner that they achieved the desired knowledge? If a deduction is an array of inscriptions constructed by following syntactical—mechanical, machine-performable—rules as suggested by Tarski, Carnap, Church, and others, the epistemic question becomes even more pressing and more challenging. Moreover, deduction, the ability to produce deductions and to recognize them when produced, is operational knowledge that presupposes other component operations such as recognizing characters, making assumptions, inferring conclusions from premises, chaining inferences [AL].
Deductive inference is usually regarded as being “tautological” or “analytical”: the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of “depth” or “informativeness” of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure “intelim logic”, which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is “analytic” in a particularly strict sense, in that it rules out any use of “virtual information”, which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
Deductive reasoning is the kind of reasoning in which, roughly, the truth of the input propositions (the premises) logically guarantees the truth of the output proposition (the conclusion), provided that no mistake has been made in the reasoning. The premises may be propositions that the reasoner believes or assumptions that the reasoner is exploring. Deductive reasoning contrasts with inductive reasoning, the kind of reasoning in which the truth of the premises need not guarantee the truth of the conclusion.
This paper describes a cubic water tank equipped with a movable partition receiving various amounts of liquid used to represent joint probability distributions. This device is applied to the investigation of deductive inferences under uncertainty. The analogy is exploited to determine by qualitative reasoning the limits in probability of the conclusion of twenty basic deductive arguments (such as Modus Ponens, And-introduction, Contraposition, etc.) often used as benchmark problems by the various theoretical approaches to reasoning under uncertainty. The probability bounds imposed by the premises on the conclusion are derived on the basis of a few trivial principles such as "a part of the tank cannot contain more liquid than its capacity allows", or "if a part is empty, the other part contains all the liquid". This stems from the equivalence between the physical constraints imposed by the capacity of the tank and its subdivisions on the volumes of liquid, and the axioms and rules of probability. The device materializes de Finetti's coherence approach to probability. It also suggests a physical counterpart of Dutch book arguments to assess individuals' rationality in probability judgments in the sense that individuals whose degrees of belief in a conclusion are out of the bounds of coherence intervals would commit themselves to executing physically impossible tasks.
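As an illustration of the kind of bound described above (this sketch is my own, not the paper's device), the coherence interval for probabilistic Modus Ponens can be computed directly from the tank principles: the premises P(A) and P(B|A) fix the liquid in the two A-compartments, while the remaining mass can be split freely between the not-A compartments. The function name `mp_bounds` is illustrative.

```python
import random

def mp_bounds(p_a, p_b_given_a):
    """Coherence interval for P(B), given P(A) and P(B|A) (probabilistic Modus Ponens)."""
    p_ab = p_a * p_b_given_a          # liquid forced into the A-and-B compartment
    # The premises say nothing about how the remaining 1 - P(A) is split between
    # the not-A compartments, so P(B) = P(A and B) + P(not-A and B) ranges freely:
    return p_ab, p_ab + (1 - p_a)

lo, hi = mp_bounds(0.9, 0.8)
# Sanity check: every coherent joint distribution satisfying the premises
# puts P(B) inside the interval.
for _ in range(1000):
    split = random.random()           # fraction of the not-A mass assigned to B
    p_b = 0.9 * 0.8 + (1 - 0.9) * split
    assert lo - 1e-12 <= p_b <= hi + 1e-12
print(round(lo, 2), round(hi, 2))     # 0.72 0.82
```

A degree of belief in B outside this interval is the numerical analogue of demanding a physically impossible distribution of liquid.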
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697) to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions", indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED March 14, 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th Century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P,Q/R becomes P,~R/~Q or ~R,Q/~P. Saccheri’s work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
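The reduction schema above can be checked mechanically for a concrete instance. The following sketch is my own illustration, taking Modus Ponens as the argument P,Q/R and verifying by brute force over truth assignments that the original argument and its indirect reduction P,~R/~Q are both valid.

```python
from itertools import product

def valid(premises, conclusion):
    """Brute-force propositional validity over the atoms p and q."""
    for p, q in product([False, True], repeat=2):
        env = {"p": p, "q": q}
        if all(f(env) for f in premises) and not conclusion(env):
            return False
    return True

P = lambda e: e["p"]                      # premise P: p
Q = lambda e: (not e["p"]) or e["q"]      # premise Q: p -> q
R = lambda e: e["q"]                      # conclusion R: q
notQ = lambda e: not Q(e)
notR = lambda e: not R(e)

# Original argument P, Q / R (Modus Ponens) ...
assert valid([P, Q], R)
# ... and its indirect reduction P, ~R / ~Q: the premise Q is replaced by ~R
# and the new conclusion is the contradictory of the replaced premise.
assert valid([P, notR], notQ)
print("argument and its reduction are both valid")
```

Note that the check confirms the point in the text: the reduction is a transformation of one two-premise argument into another two-premise argument, with no new assumption made.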
The idea that knowledge can be extended by inference from what is known seems highly plausible. Yet, as shown by familiar preface-paradox and lottery-type cases, the possibility of aggregating uncertainty casts doubt on its tenability. We show that these considerations go much further than previously recognized and significantly restrict the kinds of closure ordinary theories of knowledge can endorse. Meeting the challenge of uncertainty aggregation requires either restricting knowledge-extending inferences to single premises, or eliminating epistemic uncertainty in known premises. The first strategy, while effective, retains little of the original idea—conclusions even of modus ponens inferences from known premises are not always known. We then look at the second strategy, inspecting the most elaborate and promising attempt to secure the epistemic role of basic inferences, namely Timothy Williamson’s safety theory of knowledge. We argue that while it indeed has the merit of allowing basic inferences such as modus ponens to extend knowledge, Williamson’s theory faces formidable difficulties. These difficulties, moreover, arise from the very feature responsible for its virtue: the infallibilism of knowledge.
This paper raises obvious questions undermining any residual confidence in Mates' work and revealing our embarrassing ignorance of the true nature of Stoic deduction. It was inspired by the challenging exploratory work of Josiah Gould.
Is it possible to give a justification of our own practice of deductive inference? The purpose of this paper is to explain what such a justification might consist in and what its purpose could be. On the conception that we are going to pursue, to give a justification for a deductive practice means to explain in terms of an intuitively satisfactory notion of validity why the inferences that conform to the practice coincide with the valid ones. That is, a justification should provide an analysis of the notion of validity and show that the inferences that conform to the practice are just the ones that are valid. Moreover, a complete justification should also explain the purpose, or point, of our inferential practice. We are first going to discuss the objection that any justification of our deductive practice must use deduction and therefore be circular. Then we will consider a particular model of justificatory explanation, building on Georg Kreisel’s concept of informal rigour. Finally, in the main part of the paper, we will discuss three ideas for defining the notion of validity: (i) the classical conception according to which the notion of (bivalent) truth is taken as basic and validity is defined in terms of the preservation of truth; (ii) the constructivist idea of starting instead with the notion of (a canonical) proof (or verification) and defining validity in terms of this notion; (iii) the idea of taking the notions of rational acceptance and rejection as given and defining an argument to be valid just in case it is irrational to simultaneously accept its premises and reject its conclusion (or conclusions, if we allow for multiple conclusions). Building on work by Dana Scott, we show that the last conception may be viewed as being, in a certain sense, equivalent to the first one. Finally, we discuss the so-called paradox of inference and the informativeness of deductive arguments.
The present PhD thesis is concerned with the question whether good reasoning requires that the subject has some cognitive grip on the relation between premises and conclusion. One consideration in favor of such a requirement goes as follows: In order for my belief-formation to be an instance of reasoning, and not merely a causally related sequence of beliefs, the process must be guided by my endorsement of a rule of reasoning. Therefore I must have justified beliefs about the relation between my premises and my conclusion.

The rationality of a belief often depends on whether it is rightly connected to other beliefs, or more generally to other mental states—the states capable of providing a reason to hold the belief in question. For instance, some rational beliefs are connected to other beliefs by being inferred from them. It is often accepted that the connection implies that the subject in some sense ‘takes the mental states in question to be reason-providing’. But views on how exactly this is to be understood differ widely. They range from interpretations according to which ‘taking a mental state to be reason-providing’ imposes a mere causal sustaining relation between belief and reason-providing state to interpretations according to which one ‘takes a mental state to be reason-providing’ only if one believes that the state is reason-providing. The most common worry about the latter view is that it faces a vicious regress. In this thesis a different but in some respects similar interpretation of ‘taking something as reason-providing’ is given. It is argued to consist of a disposition to react in certain ways to information that challenges the reason-providing capacity of the allegedly reason-providing state. For instance, that one has inferred A from B partly consists in being disposed to suspend judgment about A if one obtains a reason to believe that B does not render A probable. The account is defended against regress objections and the suspicion of explanatory circularity.
Kant's A-Edition objective deduction is naturally (and has traditionally been) divided into two arguments: an "argument from above" and one that proceeds "von unten auf." This would suggest a picture of Kant's procedure in the objective deduction as first descending and then ascending the same ladder, the better, perhaps, to test its durability or to thoroughly convince the reader of its soundness. There are obvious obstacles to such a reading, however; and in this chapter I will argue that the arguments from above and below constitute different, albeit importantly inter-related, proofs. Rather than drawing on the differences in their premises, however, I will highlight what I take to be the different concerns addressed and, correspondingly, the distinct conclusions reached by each. In particular, I will show that both arguments can be understood to address distinct specters, with the argument from above addressing an internal concern generated by Kant’s own transcendental idealism, and the argument from below seeking to dispel a more traditional, broadly Humean challenge to the understanding’s role in experience. These distinct concerns also imply that these arguments yield distinct conclusions, though I will show that they are in fact complementary.
A truth-preservation fallacy is using the concept of truth-preservation where some other concept is needed. For example, in certain contexts saying that consequences can be deduced from premises using truth-preserving deduction rules is a fallacy if it suggests that all truth-preserving rules are consequence-preserving. The arithmetic additive-associativity rule that yields 6 = (3 + (2 + 1)) from 6 = ((3 + 2) + 1) is truth-preserving but not consequence-preserving. As noted in James Gasser’s dissertation, Leibniz has been criticized for using that rule in attempting to show that arithmetic equations are consequences of definitions.

A system of deductions is truth-preserving if each of its deductions having true premises has a true conclusion—and consequence-preserving if, for any given set of sentences, each deduction having premises that are consequences of that set has a conclusion that is a consequence of that set. Consequence-preserving amounts to: in each of its deductions the conclusion is a consequence of the premises. The same definitions apply to deduction rules considered as systems of deductions. Every consequence-preserving system is truth-preserving. It is not as well known that the converse fails: not every truth-preserving system is consequence-preserving. Likewise for rules: not every truth-preserving rule is consequence-preserving. There are many famous examples. In ordinary first-order Peano Arithmetic, the induction rule yields the conclusion ‘every number x is such that: x is zero or x is a successor’—which is not a consequence of the null set—from two tautological premises, which are consequences of the null set, of course. The arithmetic induction rule is truth-preserving but not consequence-preserving. Truth-preserving rules that are not consequence-preserving are non-logical or extra-logical rules. Such rules are unacceptable to persons espousing traditional truth-and-consequence conceptions of demonstration: a demonstration shows its conclusion is true by showing that its conclusion is a consequence of premises already known to be true. The 1965 Preface in Benson Mates (1972, vii) contains the first occurrence of truth-preservation fallacies in the book.
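The associativity example can be dramatized computationally (the demonstration below is my own, not Gasser's or the abstract's): logical consequence requires truth under every interpretation of the non-logical symbols, so reinterpreting '+' exposes the gap between preserving truth in the intended model and preserving consequence.

```python
import itertools
import operator

def rule_preserves_truth(op):
    """Check the rule 'from x = (a op b) op c infer x = a op (b op c)'
    on a small grid of integer instances under a given interpretation of op."""
    for a, b, c in itertools.product(range(-3, 4), repeat=3):
        x = op(op(a, b), c)              # choose x so that the premise is true
        if x != op(a, op(b, c)):         # does the conclusion remain true?
            return False
    return True

# Under the intended interpretation (op = addition) the rule preserves truth:
assert rule_preserves_truth(operator.add)
# Reinterpret op as subtraction and a true premise can yield a false conclusion,
# e.g. (3 - 2) - 1 = 0 but 3 - (2 - 1) = 2; so the conclusion is not a logical
# consequence of the premise, only a truth of the intended interpretation:
assert not rule_preserves_truth(operator.sub)
```

The analogue for the induction rule is the same in spirit: the rule is sound in the standard model of arithmetic but its conclusion fails in non-standard interpretations of the premises, which is why it is extra-logical.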
The conventional notion of a formal system is adapted to conform to the sound deductive inference model operating on finite strings. Finite strings stipulated to have the semantic value of Boolean true provide the sound deductive premises. Truth-preserving finite-string transformation rules provide the valid deductive inference. Sound deductive conclusions are the result of these finite-string transformation rules.
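A minimal sketch of such a system (the particular rule and strings are my illustration, not the author's): strings stipulated to be true serve as premises, and a truth-preserving rewrite rule, here double-negation elimination, generates the sound conclusions.

```python
def eliminate_double_negation(s):
    """One truth-preserving transformation rule: delete the first '~~' pair."""
    i = s.find("~~")
    return s[:i] + s[i + 2:] if i != -1 else None

# Finite strings stipulated to have the semantic value true: the sound premises.
premises = {"~~p", "~~~~q"}

# Close the premises under the rule; every derived string is a sound conclusion.
conclusions = set(premises)
frontier = set(premises)
while frontier:
    new = {eliminate_double_negation(s) for s in frontier} - {None} - conclusions
    conclusions |= new
    frontier = new
print(sorted(conclusions))  # ['p', 'q', '~~p', '~~q', '~~~~q']
```

Because the rule maps true strings to true strings, every member of the closure inherits the stipulated truth of the premises, which is the soundness property the model requires.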
John Corcoran and George Boger. Aristotelian logic and Euclidean geometry. Bulletin of Symbolic Logic. 20 (2014) 131.

By an Aristotelian logic we mean any system of direct and indirect deductions, chains of reasoning linking conclusions to premises—complete syllogisms, to use Aristotle’s phrase—1) intended to show that their conclusions follow logically from their respective premises and 2) resembling those in Aristotle’s Prior Analytics. Such systems presuppose the existence of cases where it is not obvious that the conclusion follows from the premises: there must be something deductions can show. Corcoran calls a proposition that follows from given premises a hidden consequence of those premises if it is not obvious that the proposition follows from those premises. By a Euclidean geometry we mean an extended discourse beginning with basic premises—axioms, postulates, definitions—1) treating a universe of geometrical figures and 2) resembling Euclid’s Elements. There were Euclidean geometries before Euclid (fl. 300 BCE), even before Aristotle (384–322 BCE). Bochenski, Lukasiewicz, Patzig and others never knew this, or if they did, they found it inconvenient to mention. Euclid shows no awareness of Aristotle. It is obvious today—as it should have been obvious in Euclid’s time, if anyone knew both—that Aristotle’s logic was insufficient for Euclid’s geometry: few if any geometrical theorems can be deduced from Euclid’s premises by means of Aristotle’s deductions. Aristotle’s writings don’t say whether his logic is sufficient for Euclidean geometry, and there is not even one fully presented example. However, Aristotle’s writings do make clear that he endorsed the goal of a sufficient system. Nevertheless, incredible as this is today, many logicians after Aristotle claimed that Aristotelian logics are sufficient for Euclidean geometries. This paper reviews and analyses such claims by Mill, Boole, De Morgan, Russell, Poincaré, and others.
It also examines early contrary statements by Hintikka, Mueller, Smith, and others. Special attention is given to the argumentations pro or con and especially to their logical, epistemic, and ontological presuppositions. What methodology is necessary or sufficient to show that a given logic is adequate or inadequate to serve as the underlying logic of a given science?
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
It is tempting to think that multi-premise closure creates a special class of paradoxes having to do with the accumulation of risks, and that these paradoxes could be escaped by rejecting the principle, while still retaining single-premise closure. I argue that single-premise deduction is also susceptible to risks. I show that what I take to be the strongest argument for rejecting multi-premise closure is also an argument for rejecting single-premise closure. Because of the symmetry between the principles, they come as a package: either both will have to be rejected or both will have to be revised.
Closure for justification is the claim that thinkers are justified in believing the logical consequences of their justified beliefs, at least when those consequences are competently deduced. Many have found this principle to be very plausible. Even more attractive is the special case of Closure known as Single-Premise Closure. In this paper, I present a challenge to Single-Premise Closure. The challenge is based on the phenomenon of rational self-doubt – it can be rational to be less than fully confident in one's beliefs and patterns of reasoning. In rough outline, the argument is as follows: Consider a thinker who deduces a conclusion from a justified initial premise via an incredibly long sequence of small competent deductions. Surely, such a thinker should suspect that he has made a mistake somewhere. And surely, given this, he should not believe the conclusion of the deduction even though he has a justified belief in the initial premise.
In his book The Things We Mean, Stephen Schiffer advances a subtle defence of what he calls the ‘face-value’ analysis of attributions of belief and reports of speech. Under this analysis, ‘Harold believes that there is life on Venus’ expresses a relation between Harold and a certain abstract object, the proposition that there is life on Venus. The present essay first proposes an improvement to Schiffer’s ‘pleonastic’ theory of propositions. It then challenges the face-value analysis. There will be such things as propositions only if they possess conditions of identity and distinctness. By analyzing Frege’s theory of propositions (Gedanken), I argue that such conditions may be found for the special case of beliefs and sayings advanced as premises and conclusions of deductive arguments. These conditions, however, are not applicable to most ordinary beliefs and sayings. Ordinary attributions and reports, then, do not place thinkers and speakers in relations to propositions. A bonus is exposure of the fallacy in the Putnam-Taschek objection to Frege’s theory of sense and reference.
The argument diagramming method developed by Monroe C. Beardsley in his (1950) book Practical Logic, which has since become the gold standard for diagramming arguments in informal logic, makes it possible to map the relation between premises and conclusions of a chain of reasoning in relatively complex ways. The method has since been adapted and developed in a number of directions by many contemporary informal logicians and argumentation theorists. It has proved useful in practical applications and especially pedagogically in teaching basic logic and critical reasoning skills at all levels of scientific education. I propose in this essay to build on Beardsley’s diagramming techniques, refining and supplementing their structural tools for visualizing logical relationships in a number of categories not originally accommodated by Beardsley’s method, including circular reasoning, reductio ad absurdum arguments, and efforts to dispute and contradict arguments, with applications and analysis.
It is one thing for a given proposition to follow or not to follow from a given set of propositions and it is quite another thing for it to be shown either that the given proposition follows or that it does not follow.* Using a formal deduction to show that a conclusion follows and using a countermodel to show that a conclusion does not follow are both traditional practices recognized by Aristotle and used down through the history of logic. These practices presuppose, respectively, a criterion of validity and a criterion of invalidity each of which has been extended and refined by modern logicians: deductions are studied in formal syntax (proof theory) and countermodels are studied in formal semantics (model theory). The purpose of this paper is to compare these two criteria to the corresponding criteria employed in Boole’s first logical work, The Mathematical Analysis of Logic (1847). In particular, this paper presents a detailed study of the relevant metalogical passages and an analysis of Boole’s symbolic derivations. It is well known, of course, that Boole’s logical analysis of compound terms (involving ‘not’, ‘and’, ‘or’, ‘except’, etc.) contributed to the enlargement of the class of propositions and arguments formally treatable in logic. The present study shows, in addition, that Boole made significant contributions to the study of deductive reasoning. He identified the role of logical axioms (as opposed to inference rules) in formal deductions, and he conceived of the idea of an axiomatic deductive system (which yields logical truths by itself and which yields consequences when applied to arbitrary premises). Nevertheless, surprisingly, Boole’s attempt to implement his idea of an axiomatic deductive system involved striking omissions: Boole does not use his own formal deductions to establish validity.
Boole does give symbolic derivations, several of which are vitiated by “Boole’s Solutions Fallacy”: the fallacy of supposing that a solution to an equation is necessarily a logical consequence of the equation. This fallacy seems to have led Boole to confuse equational calculi (i.e., methods for generating solutions) with deduction procedures (i.e., methods for generating consequences). The methodological confusion is closely related to the fact, shown in detail below, that Boole had adopted an unsound criterion of validity. It is also shown that Boole totally ignored the countermodel criterion of invalidity. Careful examination of the text does not reveal with certainty a test for invalidity which was adopted by Boole. However, we have isolated a test that he seems to use in this way and we show that this test is ineffectual in the sense that it does not serve to identify invalid arguments. We go beyond the simple goal stated above. Besides comparing Boole’s earliest criteria of validity and invalidity with those traditionally (and still generally) employed, this paper also investigates the framework and details of The Mathematical Analysis of Logic.
The Neo-Moorean Deduction (I have a hand, so I am not a brain-in-a-vat) and the Zebra Deduction (the creature is a zebra, so it isn’t a cleverly disguised mule) are notorious. Crispin Wright, Martin Davies, Fred Dretske, and Brian McLaughlin, among others, argue that these deductions are instances of transmission failure. That is, they argue that these deductions cannot transmit justification to their conclusions. I contend, however, that the notoriety of these deductions is undeserved. My strategy is to clarify, attack, defend, and apply. I clarify what transmission and transmission failure really are, thereby exposing two questionable but quotidian assumptions. I attack existing views of transmission failure, especially those of Crispin Wright. I defend a permissive view of transmission failure, one which holds that deductions of a certain kind fail to transmit only because of premise circularity. Finally, I apply this account to the Neo-Moorean and Zebra Deductions and show that, given my permissive view, these deductions transmit in an intuitively acceptable way—at least if either a certain type of circularity is benign or a certain view of perceptual justification is false.
Argumentations are at the heart of the deductive and the hypothetico-deductive methods, which are involved in attempts to reduce currently open problems to problems already solved. These two methods span the entire spectrum of problem-oriented reasoning from the simplest and most practical to the most complex and most theoretical, thereby uniting all objective thought whether ancient or contemporary, whether humanistic or scientific, whether normative or descriptive, whether concrete or abstract. Analysis, synthesis, evaluation, and function of argumentations are described. Perennial philosophic problems, epistemic and ontic, related to argumentations are put in perspective. So much of what has been regarded as logic is seen to be involved in the study of argumentations that logic may be usefully defined as the systematic study of argumentations, which is virtually identical to the quest of objective understanding of objectivity. -/- KEY WORDS: hypothesis, theorem, argumentation, proof, deduction, premise-conclusion argument, valid, inference, implication, epistemic, ontic, cogent, fallacious, paradox, formal, validation.
A Mathematical Review by John Corcoran, SUNY/Buffalo -/- Macbeth, Danielle Diagrammatic reasoning in Frege's Begriffsschrift. Synthese 186 (2012), no. 1, 289–314. ABSTRACT This review begins with two quotations from the paper: its abstract and the first paragraph of the conclusion. The point of the quotations is to make clear by the “give-them-enough-rope” strategy how murky, incompetent, and badly written the paper is. I know I am asking a lot, but I have to ask you to read the quoted passages—aloud if possible. Don’t miss the silly attempt to recycle Kant’s quip “Concepts without intuitions are empty; intuitions without concepts are blind”. What the paper was aiming at includes the absurdity: “Proofs without definitions are empty; definitions without proofs are, if not blind, then dumb.” But the author even bollixed this. The editor didn’t even notice. The copy-editor missed it. And the author’s proof-reading did not catch it. In order not to torment you I will quote the sentence as it appears: “In a slogan: proofs without definitions are empty, merely the aimless manipulation of signs according to rules; and definitions without proofs are, if no blind, then dumb.”[sic] The rest of my review discusses the paper’s astounding misattribution to contemporary logicians of the information-theoretic approach. This approach was cruelly trashed by Quine in his 1970 Philosophy of Logic, and thereafter ignored by every text I know of. The paper under review attributes generally to modern philosophers and logicians views that were never espoused by any of the prominent logicians—such as Hilbert, Gödel, Tarski, Church, and Quine—apparently in an attempt to distance them from Frege: the focus of the article. On page 310 we find the following paragraph. “In our logics it is assumed that inference potential is given by truth-conditions. Hence, we think, deduction can be nothing more than a matter of making explicit information that is already contained in one’s premises.
If the deduction is valid then the information contained in the conclusion must be contained already in the premises; if that information is not contained already in the premises […], then the argument cannot be valid.” Although the paper is meticulous in citing supporting literature for less questionable points, no references are given for this. In fact, the view that deduction is the making explicit of information that is only implicit in premises has not been espoused by any standard symbolic logic books. It has only recently been articulated by a small number of philosophical logicians from a younger generation, for example, in the prize-winning essay by J. Sagüillo, Methodological practice and complementary concepts of logical consequence: Tarski’s model-theoretic consequence and Corcoran’s information-theoretic consequence, History and Philosophy of Logic, 30 (2009), pp. 21–48. The paper omits definitions of key terms including ‘ampliative’, ‘explicatory’, ‘inference potential’, ‘truth-condition’, and ‘information’. The definition of prime number on page 292 is as follows: “To say that a number is prime is to say that it is not divisible without remainder by another number”. This would make one be the only prime number. The paper being reviewed had the benefit of two anonymous referees who contributed “very helpful comments on an earlier draft”. Could these anonymous referees have read the paper? -/- J. Corcoran, U of Buffalo, SUNY -/- PS By the way, if anyone has a paper that has been turned down by other journals, any journal that would publish something like this might be worth trying.
According to the “paradox of knowability”, the moderate thesis that all truths are knowable – ... – implies the seemingly preposterous claim that all truths are actually known – ... –, i.e. that we are omniscient. If Fitch’s argument were successful, it would amount to a knockdown rebuttal of anti-realism by reductio. In the paper I defend the nowadays rather neglected strategy of intuitionistic revisionism. Employing only intuitionistically acceptable rules of inference, the conclusion of the argument is, firstly, not ..., but .... Secondly, even if there were an intuitionistically acceptable proof of ..., i.e. an argument based on a different set of premises, the conclusion would have to be interpreted in accordance with Heyting semantics, and read in this way, the apparently preposterous conclusion would be true on conceptual grounds and acceptable even from a realist point of view. Fitch’s argument, understood as an immanent critique of verificationism, fails because in a debate dealing with the justification of deduction there can be no interpreted formal language on which realists and anti-realists could agree. Thus, the underlying problem is that a satisfactory solution to the “problem of shared content” is not available. I conclude with some remarks on the proposals by J. Salerno and N. Tennant to reconstruct certain arguments in the debate on anti-realism by establishing aporias.
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that: whether a premise is justified depends on the notion of probability considered; none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
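The classical theorem under discussion can be illustrated numerically. Under its two controversial premises (independent voters, each correct with the same probability p > 1/2), the probability that a majority votes correctly rises with group size; this is a standard textbook computation, not the paper's refined analysis.

```python
from math import comb

def majority_correct(p: float, n: int) -> float:
    """P(majority of n voters is correct), n odd, assuming the classical
    Condorcet premises: independent voters, each correct with probability p."""
    assert n % 2 == 1, "use an odd jury size to avoid ties"
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

# With p = 0.6, majority competence grows toward certainty as n grows.
for n in (1, 11, 101):
    print(n, round(majority_correct(0.6, n), 4))
```

The monotone growth toward 1 is precisely the "optimistic conclusion" that fails once the independence premise is weakened, as the abstract argues.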
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
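The Sheffer stroke mentioned above is truth-functionally complete: the usual connectives are all definable from NAND alone. The sketch below verifies the standard definitions by exhaustive truth tables (the abstract's intelim rules themselves are a separate, proof-theoretic matter).

```python
def nand(a: bool, b: bool) -> bool:
    """The Sheffer stroke: true except when both inputs are true."""
    return not (a and b)

# Classical definitions of NOT, AND, OR from the Sheffer stroke.
def NOT(a): return nand(a, a)
def AND(a, b): return nand(nand(a, b), nand(a, b))
def OR(a, b): return nand(nand(a, a), nand(b, b))

# Exhaustive check over all truth-value assignments.
for a in (False, True):
    for b in (False, True):
        assert NOT(a) == (not a)
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
print("NOT, AND, OR all recovered from the Sheffer stroke")
```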
According to Aristotle's Posterior Analytics, scientific expertise is composed of two different cognitive dispositions. Some propositions in the domain can be scientifically explained, which means that they are known by "demonstration", a deductive argument in which the premises are explanatory of the conclusion. Thus, the kind of cognition that apprehends those propositions is called "demonstrative knowledge". However, not all propositions in a scientific domain are demonstrable. Demonstrations are ultimately based on indemonstrable principles, whose knowledge is called "comprehension". If the knowledge of all scientific propositions were...
The contemporary theory of epistemic democracy often draws on the Condorcet Jury Theorem to formally justify the ‘wisdom of crowds’. But this theorem is inapplicable in its current form, since one of its premises – voter independence – is notoriously violated. This premise carries responsibility for the theorem's misleading conclusion that ‘large crowds are infallible’. We prove a more useful jury theorem: under defensible premises, ‘large crowds are fallible but better than small groups’. This theorem rehabilitates the importance of deliberation and education, which appear inessential in the classical jury framework. Our theorem is related to Ladha's (1993) seminal jury theorem for interchangeable (‘indistinguishable’) voters based on de Finetti's Theorem. We also prove a more general and simpler such jury theorem.
Consider the following. The first is a one-premise argument; the second has two premises. The question mark marks the conclusions as such. -/- Matthew, Mark, Luke, and John wrote Greek. ? Every evangelist wrote Greek. -/- Matthew, Mark, Luke, and John wrote Greek. Every evangelist is Matthew, Mark, Luke, or John. ? Every evangelist wrote Greek. -/- The above pair of premise-conclusion arguments is of a sort familiar to logicians and philosophers of science. In each case the first premise is logically equivalent to the set of four atomic propositions: “Matthew wrote Greek”, “Mark wrote Greek”, “Luke wrote Greek”, and “John wrote Greek”. The universe of discourse is the set of evangelists. We presuppose standard first-order logic. -/- As many logic texts teach, the first of these two premise-conclusion arguments—sometimes called a complete enumerative induction—is invalid in the sense that its conclusion does not follow from its premises. To get a counterargument, replace ‘Matthew’, ‘Mark’, ‘Luke’, and ‘John’ by ‘two’, ‘four’, ‘six’, and ‘eight’; replace ‘wrote Greek’ by ‘are even’; and replace ‘evangelist’ by ‘number’. This replacement converts the first argument into one having true premises and a false conclusion. -/- But the same replacement performed on the second argument does no such thing: it converts the second premise into the falsehood “Every number is two, four, six, or eight”. As many logic texts teach, there is no replacement that converts the second argument into one with all true premises and false conclusion. The second is valid; its conclusion is deducible from its two premises using an instructive natural deduction. -/- This paper “does the math” behind the above examples. The theorem could be stated informally: the above examples are typical.
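The contrast between the two arguments can be checked mechanically over small finite models (my illustration, not the paper's proof): interpret the four names as elements of a small domain and "wrote Greek" as an arbitrary subset. The enumerative induction has countermodels; adding the exhaustiveness premise eliminates them.

```python
from itertools import product

# Brute-force countermodel search over small interpretations: names denote
# domain elements, G ("wrote Greek") is an arbitrary subset of the domain.
def countermodel_exists(with_exhaustive_premise: bool, domain_size: int = 4) -> bool:
    domain = range(domain_size)
    for names in product(domain, repeat=4):          # denotations of the 4 names
        for bits in product((False, True), repeat=domain_size):
            G = {x for x in domain if bits[x]}
            premise1 = all(n in G for n in names)    # the four atomic premises
            premise2 = all(x in set(names) for x in domain)  # "every evangelist is one of them"
            conclusion = (G == set(domain))          # "every evangelist wrote Greek"
            premises_true = premise1 and (premise2 or not with_exhaustive_premise)
            if premises_true and not conclusion:
                return True                          # countermodel found
    return False

print(countermodel_exists(False))  # first argument: invalid
print(countermodel_exists(True))   # second argument: no countermodel here
```

Finding no countermodel over small models does not by itself prove validity, but for the second argument the paper's point is that none exists at all.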
This document diagrams the forms OOA, OOE, OOI, and OOO, including all four figures. Each form and figure has the following information: (1) Premises as stated: Venn diagram showing what the premises say; (2) Purported conclusion: diagram showing what the purported conclusion says; (3) Relation of premises to conclusion: intended to describe how the premises and conclusion relate to each other, such as validity or contradiction. Used in only a few examples; (4) Distribution: intended to create a system in which each syllogism has a unique code. In each premise and conclusion, the terms are each assigned a one or a zero, based on whether the term is distributed; (5) Rules: lists the rules of the syllogism and shows whether that particular syllogism follows, violates, or is unaffected by, each rule.
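The one-and-zero coding in item (4) can be sketched from the standard distribution doctrine for categorical propositions (A distributes its subject, E both terms, I neither, O its predicate). The coding below is my reconstruction of such a scheme, not necessarily the document's actual codes.

```python
# Standard distribution doctrine: (subject, predicate), 1 = distributed.
DISTRIBUTION = {
    "A": (1, 0),  # All S are P: subject distributed, predicate not
    "E": (1, 1),  # No S is P: both terms distributed
    "I": (0, 0),  # Some S is P: neither term distributed
    "O": (0, 1),  # Some S is not P: predicate distributed only
}

def syllogism_code(major: str, minor: str, conclusion: str) -> str:
    """Concatenate the distribution bits of the three propositions,
    giving each mood a fixed six-bit code."""
    return "".join(f"{s}{p}" for s, p in
                   (DISTRIBUTION[major], DISTRIBUTION[minor], DISTRIBUTION[conclusion]))

print(syllogism_code("O", "O", "O"))  # the OOO mood
```

Note that the mood alone fixes this code; distinguishing the four figures would require additionally recording which terms occupy the subject and predicate positions.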
In this paper I draw attention to a peculiar epistemic feature exhibited by certain deductively valid inferences. Certain deductively valid inferences are unable to enhance the reliability of one's belief that the conclusion is true—in a sense that will be fully explained. As I shall show, this feature is demonstrably present in certain philosophically significant inferences—such as G. E. Moore's notorious 'proof' of the existence of the external world. I suggest that this peculiar epistemic feature might be correlated with the much discussed phenomenon that Crispin Wright and Martin Davies have called 'transmission failure'—the apparent failure, on the part of some deductively valid inferences, to transmit one's justification for believing the premises.
Conspiracy theories should be neither believed nor investigated - that is the conventional wisdom. I argue that it is sometimes permissible both to investigate and to believe. Hence this is a dispute in the ethics of belief. I defend epistemic ‘oughts’ that apply in the first instance to belief-forming strategies that are partly under our control. I argue that the policy of systematically doubting or disbelieving conspiracy theories would be both a political disaster and the epistemic equivalent of self-mutilation, since it leads to the conclusion that history is bunk and the nightly news unbelievable. In fact (of course) the policy is not employed systematically but is only wheeled on to do down theories that the speaker happens to dislike. I develop a deductive argument from hard-to-deny premises that if you are not a ‘conspiracy theorist’ in my anodyne sense of the word then you are an ‘idiot’ in the Greek sense of the word, that is, someone so politically purblind as to have no opinions about either history or public affairs. The conventional wisdom can only be saved (if at all) if ‘conspiracy theory’ is given a slanted definition. I discuss some slanted definitions apparently presupposed by proponents of the conventional wisdom (including, amongst others, Tony Blair) and conclude that even with these definitions the conventional wisdom comes out as deeply unwise. I finish up with a little harmless fun at the expense of David Aaronovitch whose abilities as a rhetorician and a popular historian are not perhaps matched by a corresponding capacity for logical thought.
Deduction is important to scientific inquiry because it can extend knowledge efficiently, bypassing the need to investigate everything directly. The existence of closure failure—where one knows the premises and that the premises imply the conclusion but nevertheless does not know the conclusion—is a problem because it threatens this usage. It means that we cannot trust deduction for gaining new knowledge unless we can identify such cases ahead of time so as to avoid them. For philosophically engineered examples we have “inner alarm bells” to detect closure failure, but in scientific investigation we would want to use deduction for extension of our knowledge to matters we don’t already know that we couldn’t know. Through a quantitative treatment of how fast probabilistic sensitivity is lost over steps of deduction, I identify a condition that guarantees that the growth of potential error will be gradual; thus, dramatic closure failure is avoided. Whether the condition is fulfilled is often obvious, but sometimes it requires substantive investigation. I illustrate that not only safe deduction but the discovery of dramatic closure failures can lead to scientific advances.
The paper discusses the origin of dark matter and dark energy from the concepts of time and the totality in the final analysis. Though both seem to be rather philosophical, nonetheless they are postulated axiomatically and interpreted physically, and the corresponding philosophical transcendentalism serves heuristically. The exposition of the article means to outline the “forest for the trees”, however, in an absolutely rigorous mathematical way, which is to be explicated in detail in a future paper. The “two deductions” are two successive stages of a single conclusion mentioned above. The concept of “transcendental invariance”, meaning ontologically and physically interpreting the mathematical equivalence of the axiom of choice and the well-ordering “theorem”, is utilized again. Then, time arrow is a corollary from that transcendental invariance, and in turn, it implies quantum information conservation as the Noether correlate of the linear “increase of time” after time arrow. Quantum information conservation implies a few fundamental corollaries such as the “conservation of energy conservation” in quantum mechanics from reasons quite different from those in classical mechanics and physics as well as the “absence of hidden variables” (versus Einstein’s conjecture) in it. However, the paper is concentrated only on the inference of another corollary from quantum information conservation, namely, dark matter and dark energy being due to entanglement, and thus, in the final analysis, to the conservation of quantum information, however observed experimentally only on the “cognitive screen” of “Mach’s principle” in Einstein’s general relativity, therefore excluding any other source of gravitational field than mass and gravity.
Then, if quantum information by itself would generate a certain nonzero gravitational field, it will be depicted on the same screen as certain masses and energies distributed in space-time, and most presumably observable as those dark energy and dark matter predominating in the universe as about 96% of its energy and matter, quite unexpectedly for physics and the scientific worldview nowadays. Besides on the cognitive screen of general relativity, entanglement is available necessarily on one more “cognitive screen” (namely, that of quantum mechanics), being furthermore “flat”. Most probably, that projection is confinement, a mysterious and ad hoc added interaction along with the fundamental three ones of the Standard model, being even inconsistent with them conceptually, insofar as it needs to distinguish the local space from the global space, being definable only as a relation between them (similar to entanglement). So, entanglement is able to link the gravity of general relativity to the confinement of the Standard model as its projections on the “cognitive screens” of those two fundamental physical theories.
The premise-fact confusion in Aristotle’s PRIOR ANALYTICS. -/- The premise-fact fallacy is talking about premises when the facts are what matters or talking about facts when the premises are what matters. It is not useful to put too fine a point on this pencil. -/- In one form it is thinking that the truth-values of premises are relevant to what their consequences in fact are, or relevant to determining what their consequences are. Thus, e.g., someone commits the premise-fact fallacy if they think that a proposition has different consequences were it true than it would have if false. C. I. Lewis said that confusing logical consequence with material consequence leads to this fallacy. See Corcoran’s 1973 “Meanings of implication” [available on Academia.edu]. -/- The premise-fact confusion occurs in a written passage that implies the premise-fact fallacy or that suggests that the writer isn’t clear about the issues involved in the premise-fact fallacy. Here are some examples. -/- E1: If Abe is Ben and Ben swims, then it would follow that Abe swims. -/- Comment: The truth is that from “Abe is Ben and Ben swims”, the proposition “Abe swims” follows. Whether in fact Abe is Ben and Ben swims is irrelevant to whether “Abe swims” follows from “Abe is Ben and Ben swims”. -/- E1 suggests that maybe “Abe swims” wouldn’t follow from “Abe is Ben and Ben swims” if the latter were false. -/- E2: The truth of “Abe is Ben and Ben swims” implies that Abe swims. -/- E3: Indirect deduction requires assuming something false. -/- Comment: If the premises of an indirect deduction are true the conclusion is true and thus the “reductio” assumption is false. But deduction, whether direct or indirect, does not require true premises. In fact, indirect deduction is often used to determine that the premises are not all true. -/- Anyway, the one-page paper accompanying this abstract reports one of dozens of premise-fact errors in PRIOR ANALYTICS.
In the session, people can add their own examples and comment on them. For example, is the one at 25b32 the first? What is the next premise-fact error after 25b32? Which translators or commentators discuss this?
Timothy Williamson has famously argued that the principle should be rejected. We analyze Williamson's argument and show that its key premise is ambiguous, and that when it is properly stated this premise no longer supports the argument against. After canvassing possible objections to our argument, we reflect upon some conclusions that suggest significant epistemological ramifications pertaining to the acquisition of knowledge from prior knowledge by deduction.
One semantic and two syntactic decision procedures are given for determining the validity of Aristotelian assertoric and apodeictic syllogisms. Results are obtained by using the Aristotelian deductions that necessarily have an even number of premises.
Kant’s argument in § 38 of the *Critique of Judgment* is subject to a dilemma: if the subjective condition of cognition is the sufficient condition of the pleasure of taste, then every object of experience must produce that pleasure; if not, then the universal communicability of cognition does not entail the universal communicability of the pleasure. Kant’s use of an additional premise in § 21 may get him out of this difficulty, but the premises themselves hang in the air and have no independent plausibility. What Kant offers as a proof of our right to make judgments of taste is more charitably construed as an indirect argument for the adequacy of a speculative explanation of a *presumed* right to make judgments of taste.
In this fragment of Opuscula Logica, an arithmetical treatment of the Aristotelian syllogisms is displayed, built upon the previous interpretations of Christine Ladd-Franklin and Jean Piaget. For the first time, the whole deductive corpus for each syllogism is presented in the two innovative modalities first proposed by Hugo Padilla Chacón: A. the projection method (all the possible expressions that can be deduced through the conditional from a logical expression) and B. the retrojection method (all the possible valid antecedents or premise conjunctions for an expression proposed as a conclusion). The results are numerically expressed, with their equivalents in the propositional language of bivalent logic.
To eliminate incompleteness, undecidability and inconsistency from formal systems we only need to convert the formal proofs to theorem consequences of symbolic logic to conform to the sound deductive inference model. -/- Within the sound deductive inference model there is a connected sequence of valid deductions from true premises to a true conclusion; thus, unlike the formal proofs of symbolic logic, provability cannot diverge from truth.
In an attempt to answer the question of the origin of deductive proofs, I argue that Aristotle’s philosophy of math is more accurate than a Platonic philosophy of math, given the evidence of how mathematics began. Aristotle says that mathematical knowledge is a posteriori, known through induction; but once knowledge has become unqualified it can grow into deduction. Two pieces of recent scholarship on Greek mathematics propose new ways of thinking about how mathematics began in the Greek culture. Both claimed there was a close relationship between the culture and mathematicians; mathematics was understood through imaginative processes, experiencing the proofs in tangible ways, and establishing a consistent unified form of argumentation. These pieces of evidence provide the context in which Aristotle worked, and their contributions lend support to the argument that taking mathematical premises to be inductively available is a better way of understanding the origins of deductive practices than the Platonic tradition offers.
I argue 1) That in his celebrated Is/Ought passage, Hume employs ‘deduction’ in the strict sense, according to which if a conclusion B is justly or evidently deduced from a set of premises A, A cannot be true and B false, or B false and the premises A true. 2) That Hume was following the common custom of his times which sometimes employed ‘deduction’ in a strict sense to denote inferences in which, in the words of Dr Watts’ Logick, ‘the premises, according to the reason of things, do really contain the conclusion that is deduced from them’; that although Hume sometimes uses ‘demonstrative argument’ as a synonym for ‘deduction’, like most of his contemporaries, he generally reserves the word ‘demonstration’ for deductive inferences in which the premises are both necessary and self-evident. 3) That Mr Hume did indeed mean to suggest that deductions from IS to OUGHT were ‘altogether inconceivable’ since if ought represents a new relation or affirmation, it cannot, in the strict sense, be justly deduced from premises which do not really contain it. 4) That in a large and liberal (or perhaps loose and promiscuous) sense Hume does deduce oughts and ought nots from observations concerning human affairs, but that the deductions in question are not inferences, but explanations, since in another sense of ‘deduce’, common in the Eighteenth Century, to deduce B from A is to trace B back to A or to explain B in terms of A; 5) That a small attention to the context of Hume’s remarks and to the logical notions on which they are based would subvert those vulgar systems of philosophy which exaggerate the distinction between fact and value; for just because it is ‘altogether inconceivable’ that the new relation or affirmation OUGHT should be a deduction from others that are entirely different from it, it does not follow that the facts represented by IS and IS NOT are at bottom any different from the values represented by OUGHT and OUGHT NOT.
We are much better equipped to let the facts reveal themselves to us instead of blinding ourselves to them or stubbornly trying to force them into preconceived molds. We no longer embarrass ourselves in front of our students, for example, by insisting that “Some Xs are Y” means the same as “Some X is Y”, and lamely adding “for purposes of logic” whenever there is pushback. Logic teaching in this century can exploit the new spirit of objectivity, humility, clarity, observationalism, contextualism, and pluralism. Besides the new spirit there have been quiet developments in logic and its history and philosophy that could radically improve logic teaching. This lecture expands points which apply equally well in first, second, and third courses, i.e. in “critical thinking”, “deductive logic”, and “symbolic logic”.
Information-theoretic approaches to formal logic analyze the "common intuitive" concepts of implication, consequence, and validity in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; one given proposition is a consequence of a second if the latter contains all of the information contained by the former; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically, and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyze validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
The paper defends a variant of the material implication approach to the meaning of conditional sentences against some arguments that are widely subscribed to and/or considered important in the philosophical, psychological, and linguistic literature. These arguments are shown to be wrong, debatable, or to miss their aim if the truth conditions defining material implication are viewed as determining nothing but the denotation of conditional sentences, and if the focus is on the function of conditional sentences in deduction (logic) rather than in inferencing (reasoning). It is shown that some ‘paradoxes of material implication’ are due to inconsistent premises of deductions introduced by semantic relations between clauses constituting the premises, a fact which does not invalidate the approach. Other ‘paradoxes’ are shown to arise because they are based on uninformative deductions, violating a basic pragmatic principle. In addition, the paper introduces the distinction between the set of possible states of a mental model of the actual world and that of alternative worlds. It is argued that material implication determines the denotation of an indicative conditional as a subset of the former set and the denotation of a subjunctive conditional as a subset of the latter set, thus unifying these two types of conditionals.
Could the intersection of [formal proofs of mathematical logic] and [sound deductive inference] specify formal systems having [deductively sound formal proofs of mathematical logic]? All that we have to do to provide [deductively sound formal proofs of mathematical logic] is select the subset of conventional [formal proofs of mathematical logic] that have true premises; the result is precisely [deductively sound formal proofs of mathematical logic].
How is moral knowledge possible? This paper defends the anti-Humean thesis that we can acquire moral knowledge by deduction from wholly non-moral premises. According to Hume’s Law, as it has become known, we cannot deduce an ‘ought’ from an ‘is’, since it is “altogether inconceivable how this new relation can be a deduction from others, which are entirely different from it” (Hume, 1739, 3.1.1). This paper explores the prospects for a deductive theory of moral knowledge that rejects Hume’s Law.