The starting point of this paper concerns the apparent difference between what we might call absolute truth and truth in a model, following Donald Davidson. The notion of absolute truth is the one familiar from Tarski’s T-schema: ‘Snow is white’ is true if and only if snow is white. Instead of being a property of sentences as absolute truth appears to be, truth in a model, that is relative truth, is evaluated in terms of the relation between sentences and models. I wish to examine the apparent dual nature of logical truth (without dwelling on Davidson), and suggest that we are dealing with a distinction between a metaphysical and a linguistic interpretation of truth. I take my cue from John Etchemendy, who suggests that absolute truth could be considered as being equivalent to truth in the ‘right model’, i.e., the model that corresponds with the world. However, the notion of ‘model’ is not entirely appropriate here as it is closely associated with relative truth. Instead, I propose that the metaphysical interpretation of truth may be illustrated in modal terms, by metaphysical modality in particular. One of the tasks that I will undertake in this paper is to develop this modal interpretation, partly building on my previous work on the metaphysical interpretation of the law of non-contradiction (Tahko 2009). After an explication of the metaphysical interpretation of logical truth, a brief study of how this interpretation connects with some recent important themes in philosophical logic follows. In particular, I discuss logical pluralism and propose an understanding of pluralism from the point of view of the metaphysical interpretation.
Classical logic counts sentences such as ‘Alice is identical with Alice’ as logically true. A standard objection to classical logic is that Alice’s self-identity, for instance, is not a matter of logic because the identity of particular objects is not a matter of logic. For this reason, many philosophers argue that classical logic is not the right logic, and that it should be abandoned in favour of free logic — logic free of existential commitments with respect to singular terms. In most standard free logics, sentences such as ‘Alice is identical with Alice’ are not logically true. This paper argues that this objection from existential commitments is somewhat superficial and that there is a deeper reason why ‘Alice is identical with Alice’ should not be considered a logical truth. Indeed, a key fundamental thought about the nature of logic is that a logical truth is true in virtue of its logical form. The fundamental problem I raise is that a sentence such as ‘Alice is identical with Alice’ appears not even to be true in virtue of its logical form. Thus this paper argues that given that such a sentence is not true in virtue of its logical form, it should not be counted as logically true. It moreover argues, on the same grounds, that even the sentences which free logicians regard as logically true shouldn’t be regarded as logically true. So in this sense free logic is no repair to classical logic.
Monists say that the nature of truth is invariant, whichever sentence you consider; pluralists say that the nature of truth varies between different sets of sentences. The orthodoxy is that logic and logical form favour monism: there must be a single property that is preserved in any valid inference; and any truth-functional complex must be true in the same way as its components. The orthodoxy, I argue, is mistaken. Logic and logical form impose only structural constraints on a metaphysics of truth. Monistic theories are not guaranteed to satisfy these constraints, and there is a pluralistic theory that does so.
Aristotle’s words in the Metaphysics: “to say of what is that it is, or of what is not that it is not, is true” are often understood as indicating a correspondence view of truth: a statement is true if it corresponds to something in the world that makes it true. Aristotle’s words can also be interpreted in a deflationary, i.e., metaphysically less loaded, way. According to the latter view, the concept of truth is contained in platitudes like: ‘It is true that snow is white iff snow is white’, ‘It is true that neutrinos have mass iff neutrinos have mass’, etc. Our understanding of the concept of truth is exhausted by these and similar equivalences. This is all there is to truth. In his book Truth (Second edition 1998), Paul Horwich develops minimalism, a special variant of the deflationary view. According to Horwich’s minimalism, truth is an indefinable property of propositions characterized by what he calls the minimal theory, i.e., all (nonparadoxical) propositions of the form: It is true that p if and only if p. Although the idea of minimalism is simple and straightforward, the proper formulation of Horwich’s theory is no simple matter. In this paper, I shall discuss some of the difficulties of a logical nature that arise. First, I discuss problems that arise when we try to give a rigorous characterization of the theory without presupposing a prior understanding of the notion of truth. Next I turn to Horwich’s treatment of the Liar paradox and a paradox about the totality of all propositions that was first formulated by Russell (1903). My conclusion is that Horwich’s minimal theory cannot deal with these difficulties in an adequate way, and that it has to be revised in fundamental ways in order to do so. Once such revisions have been carried out the theory may, however, have lost some of its appealing simplicity.
Postmodernists claim that there is no truth. However, the statement 'there is no truth' is self-contradictory. This essay shows the following: One cannot state the idea 'there is no truth' universally without creating a paradox. In contrast, the statement 'there is truth' does not produce such a paradox. Therefore, it is more logical that truth exists.
This study concerns logical systems considered as theories. By searching for the problems which the traditionally given systems may reasonably be intended to solve, we clarify the rationales for the adequacy criteria commonly applied to logical systems. From this point of view there appear to be three basic types of logical systems: those concerned with logical truth; those concerned with logical truth and with logical consequence; and those concerned with deduction per se as well as with logical truth and logical consequence. Adequacy criteria for systems of the first two types include: effectiveness, soundness, completeness, Post completeness, "strong soundness" and strong completeness. Consideration of a logical system as a theory of deduction leads us to attempt to formulate two adequacy criteria for systems of proofs. The first deals with the concept of rigor or "gaplessness" in proofs. The second is a completeness condition for a system of proofs. An historical note at the end of the paper suggests a remarkable parallel between the above hierarchy of systems and the actual historical development of this area of logic.
I argue that conventional implicatures embed in logical compounds, and are non-truth-conditional contributors to sentence meaning. This, I argue, has significant implications for how we understand truth, truth-conditional content, and truth-bearers.
An interesting question is whether deflationism about truth (and falsity) extends to related properties and relations on truthbearers. Lionel Shapiro (2011) answers affirmatively by arguing that a certain deflationism about truth is as plausible as an analogous version of deflationism about logical consequence. I argue that the argument fails on two counts. First, it trivializes to any relation between truthbearers, including substantive ones; in other words, his argument can be used to establish that deflationism about truth is as plausible as deflationism about an arbitrary sentential relation. Second, the alleged analogy between the arguments for deflationism about truth and deflationism about consequence fails. Along the way I consider what implications the failure of the equiplausibility thesis has for deflationism about falsity.
This interesting and imaginative monograph is based on the author’s PhD dissertation supervised by Saul Kripke. It is dedicated to Timothy Smiley, whose interpretation of PRIOR ANALYTICS informs its approach. As suggested by its title, this short work demonstrates conclusively that Aristotle’s syllogistic is a suitable vehicle for fruitful discussion of contemporary issues in logical theory. Aristotle’s syllogistic is represented by Corcoran’s 1972 reconstruction. The review studies Lear’s treatment of Aristotle’s logic, his appreciation of the Corcoran-Smiley paradigm, and his understanding of modern logical theory. In the process Corcoran and Scanlan present new, previously unpublished results. Corcoran regards this review as an important contribution to contemporary study of PRIOR ANALYTICS: both the book and the review deserve to be better known.
Bertrand Russell, in the second of his 1914 Lowell lectures, Our Knowledge of the External World, asserted famously that ‘every philosophical problem, when it is subjected to the necessary analysis and purification, is found either to be not really philosophical at all, or else to be, in the sense in which we are using the word, logical’ (Russell 1993, p. 42). He went on to characterize that portion of logic that concerned the study of forms of propositions, or, as he called them, ‘logical forms’. This portion of logic he called ‘philosophical logic’. Russell asserted that ... some kind of knowledge of logical forms, though with most people it is not explicit, is involved in all understanding of discourse. It is the business of philosophical logic to extract this knowledge from its concrete integuments, and to render it explicit and pure. (p. 53) Perhaps no one still endorses quite this grand a view of the role of logic and the investigation of logical form in philosophy. But talk of logical form retains a central role in analytic philosophy. Given its widespread use in philosophy and linguistics, it is rather surprising that the concept of logical form has not received more attention by philosophers than it has. The concern of this paper is to say something about what talk of logical form comes to, in a tradition that stretches back to (and arguably beyond) Russell’s use of that expression. This will not be exactly Russell’s conception. For we do not endorse Russell’s view that propositions are the bearers of logical form, or that appeal to propositions adds anything to our understanding of what talk of logical form comes to. But we will be concerned to provide an account responsive to the interests expressed by Russell in the above quotations, though one clarified of extraneous elements, and expressed precisely.
For this purpose, it is important to note that the concern expressed by Russell in the above passages, as the surrounding text makes clear, is a concern not just with logic conceived narrowly as the study of logical terms, but with propositional form more generally, which includes, e.g., such features as those that correspond to the number of argument places in a propositional function, and the categories of objects which propositional....
I discuss Quine's claim that anyone denying what we now take to be a logical truth would be using logical words in a novel way. I trace this to a confusion between outright denial and failure to assert, and assertion of a negation. (This abstract is written from memory decades after the article.)
If logical truth is necessitated by sheer syntax, mathematics is categorially unlike logic even if all mathematics derives from definitions and logical principles. This contrast gets obscured by the plausibility of the Synonym Substitution Principle implicit in conceptions of analyticity: synonym substitution cannot alter sentence sense. The Principle obviously fails with intercepting: nonuniform term substitution in logical sentences. 'Televisions are televisions' and 'TVs are televisions' neither sound alike nor are used interchangeably. Interception synonymy gets assumed because logical sentences and their synomic interceptions have identical factual content, which seems to exhaust semantic content. However, intercepting alters syntax by eliminating term recurrence, the sole strictly syntactic means of ensuring necessary term coextension, and thereby syntactically securing necessary truth. Interceptional necessity is lexical, a notational artifact. The denial of interception nonsynonymy and the disregard of term recurrence in logic link with many misconceptions about propositions, logical form, conventions, and metalanguages. Mathematics is distinct from logic: its truth is not syntactic; it is transmitted by synonym substitution; term recurrence has no essential role. The '=' of mathematics is an objectual relation between numbers; the '=' of logic marks a syntactic relation of coreferring terms.
This book is best regarded as a concise essay developing the personal views of a major philosopher of logic and as such it is to be welcomed by scholars in the field. It is not (and does not purport to be) a treatment of a significant portion of those philosophical problems generally thought to be germane to logic. It would be easy to list many popular topics in philosophy of logic which it does not mention. Even its "definition" of logic, "the systematic study of logical truth", is peculiar to the author and would be regarded as inappropriately restrictive by many logicians. There are several standard ways of defining truth using sequences. Quine’s discussions in the 1970 first printing of Philosophy of Logic and in previous lectures were vitiated by mixing two. Quine’s logical Two-Method Error, which eluded Quine’s colleagues, was corrected in the 1978 sixth printing. But Quine never explicitly acknowledged, described, or even mentioned the error in print, although in correspondence he did thank Corcoran for bringing it to his attention. In regard to style one may note that the book is rich in metaphorical and sometimes even cryptic passages, one of the more remarkable of which occurs in the Preface and seems to imply that deductive logic does not warrant distinctive philosophical treatment. Moreover, the author's sesquipedalian performances sometimes subvert perspicuity.
This paper claims that there is no such thing as the correct answer to the question of what is logical form: two significantly different notions of logical form are needed to fulfil two major theoretical roles that pertain respectively to logic and semantics. The first part of the paper outlines the thesis that a unique notion of logical form fulfils both roles, and argues that the alleged best candidate for making it true is unsuited for one of the two roles. The second part spells out a considerably different notion which is free from that problem, although it does not fit the other role. As it will be suggested, each of the two notions suits at most one role, so the uniqueness thesis is ungrounded.
Benjamin Schnieder has argued that several traditional definitions of truth-functionality fail to capture a central intuition that informal characterizations of the notion often capture. The intuition is that the truth-value of a sentence that employs a truth-functional operator depends upon the truth-values of the sentences upon which the operator operates. Schnieder proposes an alternative definition of truth-functionality that is designed to accommodate this intuition. We argue that one traditional definition of ‘truth-functionality’ is immune from the counterexamples that Schnieder proposes and is preferable to Schnieder’s alternative.
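The intuition at issue here can be made concrete computationally. The sketch below is my own illustration, not Schnieder's definition or the traditional one discussed in the paper: it treats a binary connective as truth-functional just in case any two pairs of component sentences with matching truth values yield compounds with the same truth value. All names (`is_truth_functional`, the toy sentences) are hypothetical.

```python
from itertools import product

def is_truth_functional(connective, sentences, truth):
    """A connective is truth-functional iff the compound's truth value is
    fixed by the components' truth values: any two pairs of sentences with
    matching truth values must yield compounds with the same truth value."""
    for (a, b), (c, d) in product(product(sentences, repeat=2), repeat=2):
        if truth[a] == truth[c] and truth[b] == truth[d]:
            if connective(a, b, truth) != connective(c, d, truth):
                return False
    return True

# Hypothetical toy sentences with stipulated truth values.
truth = {"snow is white": True, "grass is green": True, "pigs fly": False}
sentences = list(truth)

# 'and' looks only at the truth values of its components.
conj = lambda a, b, t: t[a] and t[b]

# An operator that also cares *which* sentences occur (here: whether the
# two components are the same sentence) is not truth-functional.
same = lambda a, b, t: a == b

print(is_truth_functional(conj, sentences, truth))  # True
print(is_truth_functional(same, sentences, truth))  # False
```

The second operator fails the test because 'snow is white' and 'grass is green' have the same truth value, yet compounding a sentence with itself and with the other gives different results; this is the dependence-on-truth-values intuition in executable form.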
The problem analysed in this paper is whether we can gain knowledge by using valid inferences, and how we can explain this process from a model-theoretic perspective. According to the paradox of inference (Cohen & Nagel 1936/1998, 173), it is logically impossible for an inference to be both valid and its conclusion to possess novelty with respect to the premises. I argue in this paper that valid inference has an epistemic significance, i.e., it can be used by an agent to enlarge his knowledge, and this significance can be accounted for in model-theoretic terms. I will argue first that the paradox is based on an equivocation, namely, it arises because logical containment, i.e., logical implication, is identified with epistemological containment, i.e., the knowledge of the premises entails the knowledge of the conclusion. Second, I will argue that a truth-conditional theory of meaning has the necessary resources to explain the epistemic significance of valid inferences. I will explain this epistemic significance starting from Carnap’s semantic theory of meaning and Tarski’s notion of satisfaction. In this way I will counter Prawitz’s (2012b) claim that a truth-conditional theory of meaning is not able to account for the legitimacy of valid inferences, i.e., their epistemic significance.
This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections show consequence to be a primitive, indefinable notion, and that we must reject the claim that valid arguments necessarily preserve truth. We suggest that the accounts in question have the resources to meet the objections standardly thought to herald their demise and make two main claims: (i) that consequence, as opposed to logical consequence, is the epistemologically significant relation philosophers should be mainly interested in; and (ii) that consequence is a paradoxical notion if truth is.
In the early 20th century, scepticism was common among philosophers about the very meaningfulness of the notion of truth – and of the related notions of denotation, definition etc. (i.e., what Tarski called semantical concepts). Awareness was growing of the various logical paradoxes and anomalies arising from these concepts. In addition, more philosophical reasons were being given for this aversion. The atmosphere changed dramatically with Alfred Tarski’s path-breaking contribution. What Tarski did was to show that, assuming that the syntax of the object language is specified exactly enough, and that the metatheory has a certain amount of set-theoretic power, one can explicitly define truth in the object language. And what can be explicitly defined can be eliminated. It follows that the defined concept cannot give rise to any inconsistencies (that is, paradoxes). This gave new respectability to the concept of truth and related notions. Nevertheless, philosophers’ judgements on the nature and philosophical relevance of Tarski’s work have varied. It is my aim here to review and evaluate some threads in this debate.
The starting point of this paper is the idea that linguistic representation is the result of a global process: a process of interaction of a community of cognitive-linguistic agents, with one another and with the environment. I maintain that the study of truth, meaning and related notions should be addressed without losing perspective of this process, and I oppose the ‘static’ or ‘analytic’ approach, which is fundamentally based on our own knowledge of the conventional meaning of words and sentences, and the ability of using them that we have as competent speakers. I argue that the analytic perspective is responsible for five recurring difficulties in truthmaker theory: (1) the lack of attention to the difference of explanatory role between the distinct notions proposed as primary truthbearer; (2) the adscription of purely extra-linguistic truthmakers to ‘synthetic truths’, ignoring the contribution of the linguistic factor; (3) the adscription of purely linguistic truthmakers to ‘logical’ and ‘analytic truths’, ignoring the contribution of the worldly factor; (4) the difficulties in the search for minimal truthmakers; (5) the problems in the treatment of ‘negative facts’ and of other ‘logically complex facts’. I do not provide an account of how to solve these difficulties, but I do show how the ‘process model’ helps to clear up confusion regarding them.
Proposition and sentence are two separate entities, each with its own purpose, definition and problems. A proposition is a logical entity. A proposition asserts that something is or is not the case; any proposition may be affirmed or denied, and all propositions are either true (1) or false (0). All propositions are sentences, but not all sentences are propositions. A proposition is factual, contains three terms (subject, predicate and copula), and is always in the indicative or declarative mood. A sentence, by contrast, is a grammatical entity, a unit of language that expresses a complete thought; a sentence may express a proposition, but is distinct from the proposition it may be used to express. Sentences fall into categories: declarative, exclamatory, imperative and interrogative. A sentence is a proposition only when it bears a truth value, i.e., is true or false. We use English sentences, governed by imprecise rules, to state the precise rules of propositions. In logic we use sentences as logical entities having a propositional function, but grammatical sentences differ from logical sentences: the former have only two divisions, namely subject and predicate, may express wishes, orders, surprise or facts, and may have multiple subjects and predicates; the latter must be in a propositional form which states the quantity of the subject and the quality of the proposition, and multiple subjects and multiple predicates make the proposition multiple.
In True Enough, Catherine Elgin (2017) argues against veritism, which is the view that truth is the paramount epistemic objective. Elgin’s argument against veritism proceeds from considering the role that models, idealizations, and thought experiments play in science to the conclusion that veritism is unacceptable. In this commentary, I argue that Elgin’s argument fails as an argument against veritism. I sketch a refutation by logical analogy of Elgin’s argument. Just as one can aim at gold medals and still find approximations to gold, such as silver and bronze medals, to be acceptable and honest achievements in competitive sports, one can aim at full truths as the paramount epistemic objective and still find approximations to truth, such as models and idealizations, to be acceptable and honest achievements in scientific inquiry.
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.

This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, and thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis’s S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of “a proof of a sentence from a set of premises” where today “a deduction of a sentence from a set of premises” would be preferable.

1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77.

2. Corcoran, John and George Weaver. 1969. Logical Consequence in Modal Logic: Natural Deduction in S5, Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).

3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
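The semantic side of the coincidence with S5 can be illustrated with a small model checker. This is my own sketch, not the systems of the paper: it uses the standard fact that S5 can be evaluated over frames where every world accesses every world, and checks candidate formulas in one atom over all valuations on a small frame (which suffices for the formulas shown, though finite frames do not decide validity in general). All names (`holds`, `s5_valid`) are hypothetical.

```python
from itertools import product

# Formulas as nested tuples over a single atom 'p':
#   ('p',), ('not', f), ('imp', f, g), ('box', f), ('dia', f)

def holds(f, w, val, worlds):
    """Evaluate formula f at world w. With universal accessibility (S5),
    box = 'true at every world', dia = 'true at some world'."""
    op = f[0]
    if op == 'p':
        return val[w]
    if op == 'not':
        return not holds(f[1], w, val, worlds)
    if op == 'imp':
        return (not holds(f[1], w, val, worlds)) or holds(f[2], w, val, worlds)
    if op == 'box':
        return all(holds(f[1], v, val, worlds) for v in worlds)
    if op == 'dia':
        return any(holds(f[1], v, val, worlds) for v in worlds)
    raise ValueError(op)

def s5_valid(f, n=3):
    """True if f holds at every world under every valuation of 'p'
    on an n-world universal-accessibility frame."""
    worlds = range(n)
    return all(holds(f, w, dict(zip(worlds, vs)), worlds)
               for vs in product([True, False], repeat=n)
               for w in worlds)

p = ('p',)
print(s5_valid(('imp', ('box', p), p)))                    # T axiom: True
print(s5_valid(('imp', ('dia', p), ('box', ('dia', p)))))  # 5 axiom: True
print(s5_valid(('imp', p, ('box', p))))                    # not a theorem: False
```

The last formula fails on any valuation making p true at one world and false at another, which is the semantic counterpart of the divergence between truth and necessity that these modal systems track.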
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the ‘logicality of language’, accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is ‘blind’ to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical—indeed quite exotic—kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-à-vis its open class terms and employs a deductive system that is basically classical.
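The notion of 'trivial truth-conditions' invoked above can be sketched mechanically for the propositional case. The code below is an illustrative toy, not the natural deductive system the paper hypothesizes: it classifies a formula as tautology, contradiction, or contingent by brute-force truth tables. All names (`triviality`, `evaluate`, `atoms`) are hypothetical.

```python
from itertools import product

# Propositional formulas: ('atom', name), ('not', f), ('and', f, g), ('or', f, g)

def evaluate(f, assignment):
    op = f[0]
    if op == 'atom':
        return assignment[f[1]]
    if op == 'not':
        return not evaluate(f[1], assignment)
    if op == 'and':
        return evaluate(f[1], assignment) and evaluate(f[2], assignment)
    if op == 'or':
        return evaluate(f[1], assignment) or evaluate(f[2], assignment)
    raise ValueError(op)

def atoms(f):
    """Collect the atom names occurring in f."""
    return {f[1]} if f[0] == 'atom' else set().union(*(atoms(g) for g in f[1:]))

def triviality(f):
    """Classify f by its truth table: 'tautology', 'contradiction',
    or 'contingent'."""
    names = sorted(atoms(f))
    values = [evaluate(f, dict(zip(names, vs)))
              for vs in product([True, False], repeat=len(names))]
    if all(values):
        return 'tautology'
    if not any(values):
        return 'contradiction'
    return 'contingent'

rain = ('atom', 'it rains')
print(triviality(('or', rain, ('not', rain))))   # tautology
print(triviality(('and', rain, ('not', rain))))  # contradiction
print(triviality(rain))                          # contingent
```

Note that the 'radically underspecified' view mentioned above would treat the two tokens of 'it rains' as independent atoms, on which reading both compounds come out contingent; this is exactly the design choice the paper's alternative account of logical form rejects.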
Kuhn's alleged taxonomic interpretation of incommensurability is grounded on an ill-defined notion of untranslatability and is hence radically incomplete. To supplement it, I reconstruct Kuhn's taxonomic interpretation on the basis of a logical-semantic theory of taxonomy, a semantic theory of truth-value, and a truth-value conditional theory of cross-language communication. According to the reconstruction, two scientific languages are incommensurable when core sentences of one language, which have truth values when considered within its own context, lack truth values when considered within the context of the other due to the unmatchable taxonomic structures underlying them. So constructed, Kuhn's mature interpretation of incommensurability does not depend upon the notion of truth-preserving translatability, but rather depends on the notion of truth-value-status-preserving cross-language communication. The reconstruction makes Kuhn's notion of incommensurability a well grounded, tenable and integrated notion. Author Keywords: Incommensurability; Thomas Kuhn; Taxonomic structures; Lexicons; Truth-value; Untranslatability; Cross-language communication.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
Donald Davidson contributed to the discussion of logical form in two ways. On the one hand, he made several influential suggestions on how to give the logical forms of certain constructions of natural language. His account of adverbial modification and so called action-sentences is nowadays, in some form or other, widely employed in linguistics (Harman (forthcoming) calls it "the standard view"). Davidson's approaches to indirect discourse and quotation, while not as influential, also still attract attention today. On the other hand, Davidson provided a general account of what logical form is. This paper is concerned with this general account. Its foremost aim is to give a faithful and detailed picture of what, according to Davidson, it means to give the logical form of a sentence. The structure of the paper is as follows. (1) I will first informally introduce a notion of logical form as the form that matters in certain kinds of entailments, and indicate why philosophers have taken an interest in such a notion. (2) The second section develops constraints that we should arguably abide by in giving an account of logical form. (3) I then turn to Davidson’s view of what is involved in giving such an account. To this end, I will try to reconstruct Davidson’s view of the connection between an assignment of logical forms, a truth theory and a meaning theory. (4) Finally, I will briefly discuss possible problems of Davidson’s account as developed in this paper.
In matters of personal taste, faultless disagreement occurs between people who disagree over what is tasty, fun, etc., in those cases when each of these people seems equally far from the objective truth. Faultless disagreement is often taken as evidence that truth is relative. This article aims to help us avoid the truth-relativist conclusion. The article, however, does not argue directly against relativism; instead, the article defends non-relative truth constructively, aiming to explain faultless disagreement with the resources of semantic contextualism. To this end the article describes and advocates a contextualist solution inspired by supervaluationist truth-value gap approaches. The solution presented here, however, does not require truth-value gaps; it preserves both logical bivalence and non-relative truth, even while it acknowledges and explains the possibility of faultless disagreement. The solution is motivated by the correlation between assertions’ being true and their being useful. This correlation, furthermore, is used not only to tell which assertions are true, but also to determine which linguistic intuitions are reliable.
This paper deals with the logical form of quantified sentences. Its purpose is to elucidate one plausible sense in which quantified sentences can adequately be represented in the language of first-order logic. Section 1 introduces some basic notions drawn from general quantification theory. Section 2 outlines a crucial assumption, namely, that logical form is a matter of truth-conditions. Section 3 shows how the truth-conditions of quantified sentences can be represented in the language of first-order logic consistently with some established undefinability results. Section 4 sketches an account of vague quantifier expressions along the lines suggested. Finally, section 5 addresses the vexed issue of logicality.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first-order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
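The determination relation described in this abstract can be illustrated with a schematic first-order rendering. The formula below is an assumption for illustration only — it follows the familiar Davies–Russell-style formulation, whereas the paper's own schemata, which add polar variables for truth values, may differ:

```latex
% Illustrative determination rule (assumed rendering, not the paper's own):
% "an object's P-value determines its Q-value" -- any two objects that
% agree on P must agree on Q.
\[
  P \succ Q \;\equiv\;
  \forall w\,\forall x\,\forall y\,\forall z\,
  \bigl[\bigl(P(w,x) \land Q(w,y) \land P(z,x)\bigr) \rightarrow Q(z,y)\bigr]
\]
% Given one observed instance a with P(a,x_0) and Q(a,y_0), the rule
% licenses projecting Q(b,y_0) onto any b for which P(b,x_0) holds,
% without presupposing the value y_0 before inspecting the instance.
```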
Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization—both semantic and proof-theoretical—of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic—formality, strong modal force, generality, topic neutrality, basicness, and (quasi-)apriority, (v) the veridicality of logic and its applicability to science, (vi) the normativity of logic, (vii) error, revision, and expansion in/of logic, and (viii) the relation between logic and mathematics. The high explanatory power of the truth-theoretic approach does not rule out an equal or even higher explanatory power of the proof-theoretic approach. But to the extent that the truth-theoretic approach is shown to be highly explanatory, it sets a standard for other approaches to logic, including the proof-theoretic approach.
Sharon Street’s 2006 article “A Darwinian Dilemma for Realist Theories of Value” challenges the epistemological pretensions of the moral realist, of the nonnaturalist in particular. Given that “Evolutionary forces have played a tremendous role in shaping the content of human evaluative attitudes” – why should one suppose such attitudes and concomitant beliefs would track an independent moral reality? Especially since, on a nonnaturalist view, moral truth is causally inert. I abstract a logical skeleton of Street’s argument and, with its aid, focus on problematic assumptions regarding the (a)causality of moral truth. It emerges that there are acquired causal powers that compensate for the intrinsic impotence of moral truth, as well as two distinct levels at which truth-tracking might occur. I argue that while evolution’s selective forces do not track moral truth, that does not imply individual organisms could not have evolved that capability.
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the 'logicality of language', accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is 'blind' to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical---indeed quite exotic---kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-a-vis its open class terms and employs a deductive system that is basically classical.
This paper attempts to address the question of what logical strength theories of truth have by considering such questions as: If you take a theory T and add a theory of truth to it, how strong is the resulting theory, as compared to T? Once the question has been properly formulated, the answer turns out to be about as elegant as one could want: Adding a theory of truth to a finitely axiomatized theory T is more or less equivalent to a kind of abstract consistency statement. A large part of the interest of the paper lies in the way syntactic theories are 'disentangled' from object theories.
This paper attempts to address the question of what logical strength theories of truth have by considering such questions as: If you take a theory T and add a theory of truth to it, how strong is the resulting theory, as compared to T? It turns out that, in a wide range of cases, we can get some nice answers to this question, but only if we work in a framework that is somewhat different from those usually employed in discussions of axiomatic theories of truth. These results are then used to address a range of philosophical questions connected with truth, such as what Tarski meant by "essential richness" and the so-called conservativeness argument against deflationism. This draft dates from about 2009, with some significant updates having been made around 2011. Around then, however, I decided that the paper was becoming unmanageable and that I was trying to do too many things in it. I have therefore exploded the paper into several pieces, which will be published separately. These include "Disquotationalism and the Compositional Principles", "The Logical Strength of Compositional Principles", "Consistency and the Theory of Truth", and "What Is Essential Richness?" You should probably read those instead, since this draft remains a bit of a mess. Terminology and notation are inconsistent, and some of the proofs aren't quite right. So, caveat lector. I make it public only because it has been cited in a few places now.
It is a received view that Kant’s formal logic (or what he calls “pure general logic”) is thoroughly intensional. On this view, even the notion of logical extension must be understood solely in terms of the concepts that are subordinate to a given concept. I grant that the subordination relation among concepts is an important theme in Kant’s logical doctrine of concepts. But I argue that it is both possible and important to ascribe to Kant an objectual notion of logical extension according to which the extension of a concept is the multitude of objects falling under it. I begin by defending this ascription in response to three reasons that are commonly invoked against it. First, I explain that this ascription is compatible with Kant’s philosophical reflections on the nature and boundary of a formal logic. Second, I show that the objectual notion of extension I ascribe to Kant can be traced back to many of the early modern works of logic with which he was more or less familiar. Third, I argue that such a notion of extension makes perfect sense of a pivotal principle in Kant’s logic, namely the principle that the quantity of a concept’s extension is inversely proportional to that of its intension. In the process, I tease out two important features of the Kantian objectual notion of logical extension in terms of which it markedly differs from the modern one. First, on the modern notion the extension of a concept is the sum of the objects actually falling under it; on the Kantian notion, by contrast, the extension of a concept consists of the multitude of possible objects—not in the metaphysical sense of possibility, though—to which a concept applies in virtue of being a general representation. While the quantity of the former extension is finite, that of the latter is infinite—as is reflected in Kant’s use of a plane-geometrical figure (e.g., circle, square), which is a continuum as opposed to a discretum, to represent the extension in question.
Second, on the modern notion of extension, a concept that signifies exactly one object has a one-member extension; on the Kantian notion, however, such a concept has no extension at all—for a concept is taken to have extension only if it signifies a multitude of things. This feature of logical extension is manifested in Kant’s claim that a singular concept (or a concept in its singular use) can, for lack of extension, be figuratively represented only by a point—as opposed to an extended figure like a circle, which is reserved for a general concept (or a concept in its general use). Precisely on account of these two features, the Kantian objectual extension proves vital to Kant’s theory of logical quantification (in universal, particular and singular judgments, respectively) and to his view regarding the formal truth of analytic judgments.
This paper argues that the obvious validity of certain inferences involving indirect speech reports as premises and truth or falsity ascriptions as conclusions is incompatible with Davidson's so-called "paratactic" analysis of the logical form of indirect discourse. Besides disqualifying that analysis, this problem is also claimed to indicate that the analysis is doubly in tension with Davidson's metasemantic views. Specifically, it can be reconciled neither with one of Davidson's key assumptions regarding the adequacy of the kind of semantic theory he recommends nor with one of his key assumptions regarding the inadequacy of a kind of semantic theory he rejects.
The aim of this paper is to show that the account of objective truth taken for granted by logicians at least since the publication in 1933 of Tarski’s “The Concept of Truth in Formalized Languages” arose out of a tradition of philosophical thinking initiated by Bolzano and Brentano. The paper shows more specifically that certain investigations of states of affairs and other objectual correlates of judging acts, investigations carried out by Austrian and Polish philosophers around the turn of the century, formed part of the background of views that led to standard current accounts of the objectivity of truth. It thus lends support to speculations on the role of Brentano and his heirs in contemporary logical philosophy advanced by Jan Wolenski in his 1989 masterpiece Logic and Philosophy in the Lvov-Warsaw School.
By the lights of a central logical positivist thesis in modal epistemology, for every necessary truth that we know, we know it a priori and for every contingent truth that we know, we know it a posteriori. Kripke attacks on both flanks, arguing that we know necessary a posteriori truths and that we probably know contingent a priori truths. In a reflection of Kripke's confidence in his own arguments, the first of these Kripkean claims is far more widely accepted than the second. Contrary to received opinion, the paper argues, the considerations Kripke adduces concerning truths purported to be necessary a posteriori do not disprove the logical positivist thesis that necessary truth and a priori truth are co-extensive.
Gómez-Torrente’s papers have made important contributions to vindicating Tarski’s model-theoretic account of the logical properties in the face of Etchemendy’s criticisms. However, at some points his vindication depends on interpreting the Tarskian account as purportedly modally deflationary, i.e., as not intended to capture the intuitive modal element in the logical properties, that logical consequence is (epistemic or alethic) necessary truth-preservation. Here it is argued that the views expressed in Tarski’s seminal work do not support this modally deflationary interpretation, even if Tarski himself was sceptical about modalities.
We present Logical Description Grammar (LDG), a model of grammar and the syntax-semantics interface based on descriptions in elementary logic. A description may simultaneously describe the syntactic structure and the semantics of a natural language expression, i.e., the describing logic talks about the trees and about the truth-conditions of the language described. Logical Description Grammars offer a natural way of dealing with underspecification in natural language syntax and semantics. If a logical description (up to isomorphism) has exactly one tree plus truth-conditions as a model, it completely specifies that grammatical object. More common is the situation, corresponding to underspecification, in which there is more than one model. A situation in which there are no models corresponds to an ungrammatical input.
A truth-preservation fallacy is using the concept of truth-preservation where some other concept is needed. For example, in certain contexts saying that consequences can be deduced from premises using truth-preserving deduction rules is a fallacy if it suggests that all truth-preserving rules are consequence-preserving. The arithmetic additive-associativity rule that yields 6 = (3 + (2 + 1)) from 6 = ((3 + 2) + 1) is truth-preserving but not consequence-preserving. As noted in James Gasser’s dissertation, Leibniz has been criticized for using that rule in attempting to show that arithmetic equations are consequences of definitions. A system of deductions is truth-preserving if each of its deductions having true premises has a true conclusion—and consequence-preserving if, for any given set of sentences, each deduction having premises that are consequences of that set has a conclusion that is a consequence of that set. Consequence-preserving amounts to: in each of its deductions the conclusion is a consequence of the premises. The same definitions apply to deduction rules considered as systems of deductions. Every consequence-preserving system is truth-preserving. It is not as well-known that the converse fails: not every truth-preserving system is consequence-preserving. Likewise for rules: not every truth-preserving rule is consequence-preserving. There are many famous examples. In ordinary first-order Peano Arithmetic, the induction rule yields the conclusion ‘every number x is such that: x is zero or x is a successor’—which is not a consequence of the null set—from two tautological premises, which are consequences of the null set, of course. The arithmetic induction rule is truth-preserving but not consequence-preserving. Truth-preserving rules that are not consequence-preserving are non-logical or extra-logical rules.
Such rules are unacceptable to persons espousing traditional truth-and-consequence conceptions of demonstration: a demonstration shows its conclusion is true by showing that its conclusion is a consequence of premises already known to be true. The 1965 Preface in Benson Mates (1972, vii) contains the first occurrence of truth-preservation fallacies in the book.
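The contrast drawn in this abstract can be put schematically; the notation below is mine, a sketch of the definitions as the abstract states them rather than the paper's own formalism:

```latex
% Truth-preservation vs. consequence-preservation (notation assumed).
% A rule R is truth-preserving iff every R-deduction with true premises
% has a true conclusion; R is consequence-preserving iff the conclusion
% of each R-deduction is a consequence of its premises.
% Additive associativity, per the abstract, is truth-preserving:
\[
  6 = ((3 + 2) + 1) \;\vdash_{R}\; 6 = (3 + (2 + 1))
\]
% but not consequence-preserving. Likewise, the induction rule of
% first-order Peano Arithmetic yields, from two tautological premises,
\[
  \forall x\,\bigl(x = 0 \lor \exists y\,(x = s(y))\bigr)
\]
% a conclusion that is not a logical consequence of the null set,
% though the premises are.
```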
In explaining the notion of a fundamental property or relation, metaphysicians will often draw an analogy with languages. The fundamental properties and relations stand to reality as the primitive predicates and relations stand to a language: the smallest set of vocabulary God would need in order to write the `book of the world'. In this paper I attempt to make good on this metaphor. In order to do this I introduce a modality that, put informally, stands to propositions as logical truth stands to sentences. The resulting theory, formulated in higher-order logic, also vindicates the Humean idea that fundamental properties and relations are freely recombinable and a variant of the structural idea that propositions can be decomposed into their fundamental constituents via logical operations. Indeed, it is seen that, although these ideas are seemingly distinct, they are not independent, and fall out of a natural and general theory about the granularity of reality.
The paper proposes two logical analyses of (the norms of) justification. In a first, realist-minded case, truth is logically independent from justification and leads to a pragmatic logic LP including two epistemic and pragmatic operators, namely, assertion and hypothesis. In a second, antirealist-minded case, truth is not logically independent from justification and results in two logical systems of information and justification: AR4 and AR4′, respectively, provided with a question-answer semantics. The latter proposes many more epistemic agents, each corresponding to a wide variety of epistemic norms. After comparing the different norms of justification involved in these logical systems, two hexagons expressing Aristotelian relations of opposition will be gathered in order to clarify how (a fragment of) pragmatic formulas can be interpreted in a fuzzy-based question-answer semantics.
Russell claims that an examination of beliefs is indispensable for defining our everyday reasoning and for understanding what philosophers mean by the notion of truth. That said, the author considers that a study of these beliefs has no bearing on logic, which concerns only the true and the false. In other words, Russell associates belief with psychology while reserving the domain of logic for the theme of the proposition, true or false by definition. A certain theory of truth underlies his fundamental rejection of an epistemic logic; if belief has philosophical interest, it is because Russell uses its examination to discern the theory of truth defended by several philosophers: the coherence theory of truth is a principal target of the Briton, who for his part defends a correspondence theory of truth by characterizing the proposition through its relation of correspondence or non-correspondence with a fact.
Quine argues that if sentences that are set-theoretically equivalent are interchangeable salva veritate, then all transparent operators are truth-functional. Criticisms of this argument fail to take into account the conditional character of the conclusion. Quine also argues that, for any person P with minimal logical acuity, if ‘belief’ has a sense in which it is a transparent operator, then, in that sense of the word, P believes everything if P believes anything. The suggestion is made that he intends that result to show us that ‘believes’ has no transparent sense. Criticisms of this argument are either based on unwarranted assertions or on definitions of key terms that depart from Quine’s usage of those terms.