2nd edition. The theory of logical consequence is central to modern logic and its applications. However, it is mostly dispersed across an abundance of often difficult-to-access papers, and is rarely treated with applications in mind. This book collects the most fundamental aspects of this theory and offers the reader the basics of its applications in computer science, artificial intelligence, and cognitive science, to name but the most important fields where this notion finds its many applications.
An interesting question is whether deflationism about truth (and falsity) extends to related properties and relations on truthbearers. Lionel Shapiro (2011) answers affirmatively by arguing that a certain deflationism about truth is as plausible as an analogous version of deflationism about logical consequence. I argue that the argument fails on two counts. First, it trivializes to any relation between truthbearers, including substantive ones; in other words, his argument can be used to establish that deflationism about truth is as plausible as deflationism about an arbitrary sentential relation. Second, the alleged analogy between the arguments for deflationism about truth and deflationism about consequence fails. Along the way I consider what implications the failure of the equiplausibility thesis has for deflationism about falsity.
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.
This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis's S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of "a proof of a sentence from a set of premises" where today "a deduction of a sentence from a set of premises" would be preferable.
1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77.
2. Corcoran, John and George Weaver. 1969. Logical Consequence in Modal Logic: Natural Deduction in S5, Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).
3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
There is a profound, but frequently ignored, relationship between the classical notion of logical consequence (formal implication) and material implication. The former repeats the patterns of the latter, but with a wider modal reach. It is argued that this kinship between formal and material implication simply means that they express the same variety of implication, but differ in scope. Formal implication is unrestricted material implication. This apparently innocuous observation has some significant corollaries: (1) conditionals are not connectives, but arguments; (2) the traditional examples of valid argumentative forms are metalogical principles that express the properties of logical consequence; (3) formal logic is not a useful guide to detect valid arguments in the real world; (4) it is incoherent to propose alternatives to the material implication while accepting the classical properties of formal implication; (5) the counter-examples to classical argumentative forms and conditional puzzles are unsound.
In a recent article, "Logical Consequence and Natural Language" (2015), Michael Glanzberg claims that there is no relation of logical consequence in natural language. The present paper counters that claim. I shall discuss Glanzberg's arguments and show why they don't hold. I further show how Glanzberg's claims may rather be used to support the existence of logical consequence in natural language.
Gómez-Torrente’s papers have made important contributions to vindicating Tarski’s model-theoretic account of the logical properties in the face of Etchemendy’s criticisms. However, at some points his vindication depends on interpreting the Tarskian account as purportedly modally deflationary, i.e., as not intended to capture the intuitive modal element in the logical properties: that logical consequence is (epistemic or alethic) necessary truth-preservation. Here it is argued that the views expressed in Tarski’s seminal work do not support this modally deflationary interpretation, even if Tarski himself was sceptical about modalities.
When some P implies some Q, this should have some impact on what attitudes we take to P and Q. In other words: logical consequence has a normative import. I use this idea, recently explored by a number of scholars, as a stepping stone to a bolder view: that relations of logical consequence can be identified with norms on our propositional attitudes, or at least that our talk of logical consequence can be explained in terms of such norms. I investigate the prospects of such a cognitive norm account of logical consequence. I go over the challenges involved in finding a plausible bridge principle connecting logical consequence to cognitive norms, in particular a biconditional principle that gives us not only necessary but sufficient conditions for logical consequence in terms of norms on propositional attitudes. Then, on the assumption that an adequate norm can be found, I consider what the philosophical merits of such a cognitive norm account would be, and what theoretical commitments it would generate.
In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of information for the formulae of these languages and introduce the concept of informational logical consequence, identifying some important results, among them: certain arguments that have traditionally been considered valid, such as modus ponens, are not valid from the informational perspective; the logic underlying informational logical consequence is not classical, and is at least paraconsistent sensu lato; informational logical consequence is not a Tarskian logical consequence.
Intermediary metabolism molecules are orchestrated into logical pathways stemming from history (L-amino acids, D-sugars) and dynamic constraints (hydrolysis of pyrophosphate or amide groups is the driving force of anabolism). Besides essential metabolites, numerous variants derive from programmed or accidental changes. Broken down, variants enter standard pathways, producing further variants. Macromolecule modification alters the specificity of enzyme reactions. Metabolism conforms to thermodynamic laws, precluding strict accuracy. Hence, for each regular pathway, a wealth of variants inputs and produces metabolites that are similar to, but not exact replicas of, core metabolites. As a corollary, a shadow, paralogous metabolism is associated with standard metabolism. We focus on a logic of paralogous metabolism based on diversion of the core metabolic mimics into pathways where they are modified to minimize their input into the core pathways where they create havoc. We propose that a significant proportion of paralogues of well-characterized enzymes have evolved as the natural way to cope with paralogous metabolites. A second type of denouement uses a process where protecting/deprotecting unwanted metabolites - conceptually similar to the procedure used in the laboratory of an organic chemist - is used to enter a completely new catabolic pathway.
Consequence relations over sets of "judgments" are defined by using "overdetermined" as well as "underdetermined" valuations. Some of these relations are shown to be categorical, and generalized soundness and completeness results are given for both multiple- and single-conclusion consequence relations.
Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to adapt Popper's falsificationism and extend it not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
In his new book, Logical Form, Andrea Iacona distinguishes between two different roles that have been ascribed to the notion of logical form: the logical role and the semantic role. These two roles entail a bifurcation of the notion of logical form. Both notions of logical form, according to Iacona, are descriptive, having to do with different features of natural language sentences. I agree that the notion of logical form bifurcates, but not that the logical role is merely descriptive. In this paper, I focus on formalization, a process by which logical form, on its logical role, is attributed to natural language sentences. According to some, formalization is a form of explication, and it involves normative, pragmatic, as well as creative aspects. I present a view by which formalization involves explicit commitments on behalf of a reasoner or an interpreter, which serve the normative grounds for the evaluation of a given text. In previous work, I proposed the framework of semantic constraints for the explication of logical consequence. Here, I extend the framework to include formalization constraints. The various constraints then serve the role of commitments. I discuss specific issues raised by Iacona concerning univocality, co-reference and equivocation, and I show how our views on these matters diverge as a result of our different starting assumptions.
This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections show consequence to be a primitive, indefinable notion, and that we must reject the claim that valid arguments necessarily preserve truth. We suggest that the accounts in question have the resources to meet the objections standardly thought to herald their demise, and make two main claims: (i) that consequence, as opposed to logical consequence, is the epistemologically significant relation philosophers should be mainly interested in; and (ii) that consequence is a paradoxical notion if truth is.
We present a framework for epistemic logic, modeling the logical aspects of System 1 and System 2 cognitive processes, as per dual process theories of reasoning. The framework combines non-normal worlds semantics with the techniques of Dynamic Epistemic Logic. It models non-logically-omniscient, but moderately rational agents: their System 1 makes fast sense of incoming information by integrating it on the basis of their background knowledge and beliefs. Their System 2 allows them to slowly, step-wise unpack some of the logical consequences of such knowledge and beliefs, by paying a cognitive cost. The framework is applied to three instances of limited rationality, widely discussed in cognitive psychology: Stereotypical Thinking, the Framing Effect, and the Anchoring Effect.
Edelcio G. de Souza is a Brazilian logician and philosopher whose research spans the domains of abstract logic, non-classical systems, philosophy of science and the foundations of mathematics. This book is in his honor, with the purpose of celebrating his 60th birthday. It contains articles connected with the above topics and other subjects in logical investigations.
This paper proposes substitutional definitions of logical truth and consequence in terms of relative interpretations that are extensionally equivalent to the model-theoretic definitions for any relational first-order language. Our philosophical motivation for considering substitutional definitions is the hope of simplifying the meta-theory of logical consequence. We discuss to what extent our definitions can contribute to that.
Logical pluralism is the view that there is more than one correct logic. This very general characterization gives rise to a whole family of positions. I argue that not all of them are stable. The main argument in the paper is inspired by considerations known as the “collapse problem”, and it aims at the most popular form of logical pluralism advocated by JC Beall and Greg Restall. I argue that there is a more general argument available that challenges all variants of logical pluralism that meet the following three conditions: that there are at least two correct logical systems characterized in terms of different consequence relations, that there is some sort of rivalry among the correct logics, and that logical consequence is normative. The hypothesis I argue for amounts to a conditional claim: if a position satisfies all these conditions, then that position is unstable in the sense that it collapses into competing positions.
This study concerns logical systems considered as theories. By searching for the problems which the traditionally given systems may reasonably be intended to solve, we clarify the rationales for the adequacy criteria commonly applied to logical systems. From this point of view there appear to be three basic types of logical systems: those concerned with logical truth; those concerned with logical truth and with logical consequence; and those concerned with deduction per se as well as with logical truth and logical consequence. Adequacy criteria for systems of the first two types include: effectiveness, soundness, completeness, Post completeness, "strong soundness" and strong completeness. Consideration of a logical system as a theory of deduction leads us to attempt to formulate two adequacy criteria for systems of proofs. The first deals with the concept of rigor or "gaplessness" in proofs. The second is a completeness condition for a system of proofs. An historical note at the end of the paper suggests a remarkable parallel between the above hierarchy of systems and the actual historical development of this area of logic.
Since at least the 1960s, deontic logicians and ethicists have worried about whether there can be normative systems that allow conflicting obligations. Surprisingly, however, little direct attention has been paid to questions about how we may reason with conflicting obligations. In this paper, I present a problem for making sense of reasoning with conflicting obligations and argue that no deontic logic can solve this problem. I then develop an account of reasoning based on the popular idea in ethics that reasons explain obligations and show that it solves this problem.
Contrastivists view ought-sentences as expressing comparisons among alternatives. Deontic actualists believe that the value of each alternative in such a comparison is determined by what would actually happen if that alternative were to be the case. One of the arguments that motivates actualism is a challenge to the principle of agglomeration over conjunction—the principle according to which if you ought to run and you ought to jump, then you ought to run and jump. I argue that there is no way of developing the actualist insight into a logic that invalidates the agglomeration principle without also invalidating other desirable patterns of inference. After doing this, I extend the analysis to other contrastive views that challenge agglomeration in the way that the actualist does. This motivates skepticism about the actualist’s way of challenging agglomeration.
Substructural logics and their application to logical and semantic paradoxes have been extensively studied, but non-reflexive systems have been somewhat neglected. Here, we aim to fill this lacuna, at least in part, by presenting a non-reflexive logic and theory of naive consequence (and truth). We also investigate the semantics and the proof-theory of the system. Finally, we develop a compositional theory of truth (and consequence) in our non-reflexive framework.
The Bounds of Logic presents a new philosophical theory of the scope and nature of logic based on critical analysis of the principles underlying modern Tarskian logic and inspired by mathematical and linguistic development. Extracting central philosophical ideas from Tarski’s early work in semantics, Sher questions whether these are fully realized by the standard first-order system. The answer lays the foundation for a new, broader conception of logic. By generally characterizing logical terms, Sher establishes a fundamental result in semantics. Her development of the notion of logicality for quantifiers and her work on branching are of great importance for linguistics. Sher outlines the boundaries of the new logic and points out some of the philosophical ramifications of the new view of logic for such issues as the logicist thesis, ontological commitment, the role of mathematics in logic, and the metaphysical underpinning of logic. She proposes a constructive definition of logical terms, reexamines and extends the notion of branching quantification, and discusses various linguistic issues and applications.
The result of combining classical quantificational logic with modal logic proves necessitism – the claim that necessarily everything is necessarily identical to something. This problem is reflected in the purely quantificational theory by theorems such as ∃x t=x; it is a theorem, for example, that something is identical to Timothy Williamson. The standard way to avoid these consequences is to weaken the theory of quantification to a certain kind of free logic. However, it has often been noted that in order to specify the truth conditions of certain sentences involving constants or variables that don’t denote, one has to apparently quantify over things that are not identical to anything. In this paper I defend a contingentist, non-Meinongian metaphysics within a positive free logic. I argue that although certain names and free variables do not actually refer to anything, in each case there might have been something they actually refer to, allowing one to interpret the contingentist claims without quantifying over mere possibilia.
Does rationality require logical omniscience? Our best formal theories of rationality imply that it does, but our ordinary evaluations of rationality seem to suggest otherwise. This paper aims to resolve the tension by arguing that our ordinary evaluations of rationality are not only consistent with the thesis that rationality requires logical omniscience, but also provide a compelling rationale for accepting this thesis in the first place. This paper also defends an account of apriori justification for logical beliefs that is designed to explain the rational requirement of logical omniscience. On this account, apriori justification for beliefs about logic has its source in logical facts, rather than psychological facts about experience, reasoning, or understanding. This account has important consequences for the epistemic role of experience in the logical domain. In a slogan, the epistemic role of experience in the apriori domain is not a justifying role, but rather an enabling and disabling role.
I discuss Greg Restall’s attempt to generate an account of logical consequence from the incoherence of certain packages of assertions and denials. I take up his justification of the cut rule and argue that, in order to avoid counterexamples to cut, he needs, at least, to introduce a notion of logical form. I then suggest a few problems that will arise for his account if a notion of logical form is assumed. I close by sketching what I take to be the most natural minimal way of distinguishing content and form and suggest further problems arising for this route.
One of the open problems in the philosophy of information is whether there is an information logic (IL), different from epistemic (EL) and doxastic logic (DL), which formalises the relation “a is informed that p” (Iap) satisfactorily. In this paper, the problem is solved by arguing that the axiom schemata of the normal modal logic (NML) KTB (also known as B or Br or Brouwer’s system) are well suited to formalise the relation of “being informed”. After having shown that IL can be constructed as an informational reading of KTB, four consequences of a KTB-based IL are explored: information overload; the veridicality thesis (Iap → p); the relation between IL and EL; and the Kp → Bp principle or entailment property, according to which knowledge implies belief. Although these issues are discussed later in the article, they are the motivations behind the development of IL.
There is a natural story about what logic is that sees it as tied up with two operations: a ‘throw things into a bag’ operation and a ‘closure’ operation. In a pair of recent papers, Jc Beall has fleshed out the account of logic this leaves us with in more detail. Using Beall’s exposition as a guide, this paper points out some problems with taking the second operation to be closure in the usual sense. After pointing out these problems, I then turn to fixing them in a restricted case and modulo a few simplifying assumptions. In a follow-up paper, the simplifications and restrictions will be removed.
This paper explores the logical consequences of the thesis that all of the essential properties of consciousness can be known introspectively (Completeness, called "Strong Transparency" in the paper, following D.M. Armstrong's older terminology). It is argued that it can be known introspectively that consciousness does not have complete access to its essential properties; and it is shown how this undermines conceivability arguments for dualism.
Conditional excluded middle (CEM) is the following principle of counterfactual logic: either, if it were the case that φ, it would be the case that ψ, or, if it were the case that φ, it would be the case that not-ψ. I will first show that CEM entails the identity of indiscernibles, the falsity of physicalism, and the failure of the modal to supervene on the categorical and of the vague to supervene on the precise. I will then argue that we should accept these startling conclusions, since CEM is valid.
According to traditional logical expressivism, logical operators allow speakers to explicitly endorse claims that are already implicitly endorsed in their discursive practice — endorsed in virtue of that practice’s having instituted certain logical relations. Here, I propose a different version of logical expressivism, according to which the expressive role of logical operators is explained without invoking logical relations at all, but instead in terms of the expression of discursive-practical attitudes. In defense of this alternative, I present a deflationary account of the expressive role of vocabulary by which we ascribe logical relations.
In some recent works, Crupi and Iacona have outlined an analysis of ‘if’ based on Chrysippus’ idea that a conditional holds whenever the negation of its consequent is incompatible with its antecedent. This paper presents a sound and complete system of conditional logic that accommodates their analysis. The soundness and completeness proofs that will be provided rely on a general method elaborated by Raidl, which applies to a wide range of systems of conditional logic.
I explore the logic of ground. I first develop a logic of weak ground. This logic strengthens the logic of weak ground presented by Fine in his ‘Guide to Ground.’ This logic, I argue, generates many plausible principles which Fine’s system leaves out. I then derive from this a logic of strict ground. I argue that there is a strong abductive case for adopting this logic. It’s elegant, parsimonious and explanatorily powerful. Yet, so I suggest, adopting it has important consequences. First, it means we should think of ground as a type of identity. Second, it means we should reject much of Fine’s logic of strict ground. I also show how the logic I develop connects to other systems in the literature. It is definitionally equivalent both to Angell’s logic of analytic containment and to Correia’s system G.
Demonstrative logic, the study of demonstration as opposed to persuasion, is the subject of Aristotle's two-volume Analytics. Many examples are geometrical. Demonstration produces knowledge (of the truth of propositions). Persuasion merely produces opinion. Aristotle presented a general truth-and-consequence conception of demonstration meant to apply to all demonstrations. According to him, a demonstration, which normally proves a conclusion not previously known to be true, is an extended argumentation beginning with premises known to be truths and containing a chain of reasoning showing by deductively evident steps that its conclusion is a consequence of its premises. In particular, a demonstration is a deduction whose premises are known to be true. Aristotle's general theory of demonstration required a prior general theory of deduction presented in the Prior Analytics. His general immediate-deduction-chaining conception of deduction was meant to apply to all deductions. According to him, any deduction that is not immediately evident is an extended argumentation that involves a chaining of intermediate immediately evident steps that shows its final conclusion to follow logically from its premises. To illustrate his general theory of deduction, he presented an ingeniously simple and mathematically precise special case traditionally known as the categorical syllogistic.
The reasoning process of analogy is characterized by a strict interdependence between a process of abstraction of a common feature and the transfer of an attribute of the Analogue to the Primary Subject. The first reasoning step is regarded as an abstraction of a generic characteristic that is relevant for the attribution of the predicate. The abstracted feature can be considered from a logic-semantic perspective as a functional genus, in the sense that it is contextually essential for the attribution of the predicate, i.e. that is pragmatically fundamental (i.e. relevant) for the predication, or rather the achievement of the communicative intention. While the transfer of the predicate from the Analogue to the analogical genus and from the genus to the Primary Subject is guaranteed by the maxims (or rules of inference) governing the genus-species relation, the connection between the genus and the predicate can be complex, characterized by various types of reasoning patterns. The relevance relation can hide implicit arguments, such as an implicit argument from classification, an evaluation based on values, consequences or rules, a causal relation, or an argument from practical reasoning.
Epistemic logics based on the possible worlds semantics suffer from the problem of logical omniscience, whereby agents are described as knowing all logical consequences of what they know, including all tautologies. This problem is doubly challenging: on the one hand, agents should be treated as logically non-omniscient, and on the other hand, as moderately logically competent. Many responses to logical omniscience fail to meet this double challenge because the concepts of knowledge and reasoning are not properly separated. In this paper, I present a dynamic logic of knowledge that models an agent’s epistemic state as it evolves over the course of reasoning. I show that the logic does not sacrifice logical competence on the altar of logical non-omniscience.
This paper discusses the history of the confusion and controversies over whether the definition of consequence presented in the 11-page 1936 Tarski consequence-definition paper is based on a monistic fixed-universe framework, like Begriffsschrift and Principia Mathematica. Monistic fixed-universe frameworks, common in pre-WWII logic, keep the range of the individual variables fixed as the class of all individuals. The contrary alternative is that the definition is predicated on a pluralistic multiple-universe framework, like the 1931 Gödel incompleteness paper. A pluralistic multiple-universe framework recognizes multiple universes of discourse serving as different ranges of the individual variables in different interpretations, as in post-WWII model theory. In the early 1960s, many logicians held, mistakenly, as we show, the "contrary alternative" that Tarski 1936 had already adopted a Gödel-type, pluralistic, multiple-universe framework. We explain that Tarski had not yet shifted out of the monistic, Frege-Russell, fixed-universe paradigm. We further argue that between his Principia-influenced pre-WWII Warsaw period and his model-theoretic post-WWII Berkeley period, Tarski's philosophy underwent many other radical changes.
The need to distinguish between logical and extra-logical varieties of inference, entailment, validity, and consistency has played a prominent role in meta-ethical debates between expressivists and descriptivists. But, to date, the importance of logical form to these distinctions has been overlooked. That’s a mistake, given the foundational place that logical form occupies in our understanding of the difference between the logical and the extra-logical. This essay argues that descriptivists are better positioned than their expressivist rivals to provide the needed account of logical form, and so are better able to capture the needed distinctions. This finding is significant for several reasons: first, it provides a new argument against expressivism; second, it reveals that descriptivists can make use of this new argument only if they are willing to take a controversial but plausible stand on claims about the nature and foundations of logic.
It is claimed here that, against the current view of logic as a theory of consequence, opposition is a basic logical concept that can be used to define consequence itself. This requires some substantial changes to the underlying framework, including: a non-Fregean semantics of questions and answers, instead of the usual truth-conditional semantics; an extension of opposition to a relation between arbitrary structured objects; and a definition of oppositions in terms of basic negation. Objections to this claim will be reviewed.
This paper discusses three relevant logics that obey Component Homogeneity, a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity: that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the other accomplishments of the paper, we generalize the semantics from Bochvar, Hallden, Deutsch and Daniels, we provide a general recipe to define containment logics, we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde, and the connections between crossS*fde and the logic Eq of equality by Epstein. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results, and overview some open issues.
The paper argues that the two best known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC), are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes’ theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies come to reasoning that is not only informationally useful but also logically valid.
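The idea that AC approximates Bayes' theorem can be illustrated with a toy calculation (a minimal sketch with invented numbers, not an example from the paper): on observing the consequent C, the probabilistically licensed conclusion about the antecedent A is the posterior P(A|C), which AC simply rounds up to certainty.

```python
def posterior(p_c_given_a, p_a, p_c):
    # Bayes' theorem: P(A|C) = P(C|A) * P(A) / P(C)
    return p_c_given_a * p_a / p_c

# "If it rains (A), the street is wet (C)"; we observe a wet street.
# Affirming the consequent concludes "it rained" outright; Bayes
# instead weighs how often wet streets have other causes.
# With the illustrative values below, the posterior is 0.75: the
# shortcut is often right, but never deductively guaranteed.
print(posterior(1.0, 0.3, 0.4))
```

The closer P(C|A) is to 1 and P(C) is to P(A), the closer the shortcut's verdict comes to the Bayesian one, which mirrors the paper's point about "stripped" probabilities.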
Hyperlogic is a hyperintensional system designed to regiment metalogical claims (e.g., "Intuitionistic logic is correct" or "The law of excluded middle holds") into the object language, including within embedded environments such as attitude reports and counterfactuals. This paper is the first of a two-part series exploring the logic of hyperlogic. This part presents a minimal logic of hyperlogic and proves its completeness. It consists of two interdefined axiomatic systems: one for classical consequence (truth preservation under a classical interpretation of the connectives) and one for "universal" consequence (truth preservation under any interpretation). The sequel to this paper explores stronger logics that are sound and complete over various restricted classes of models as well as languages with hyperintensional operators.
The traditional possible-worlds model of belief describes agents as ‘logically omniscient’ in the sense that they believe all logical consequences of what they believe, including all logical truths. This is widely considered a problem if we want to reason about the epistemic lives of non-ideal agents who—much like ordinary human beings—are logically competent, but not logically omniscient. A popular strategy for avoiding logical omniscience centers around the use of impossible worlds: worlds that, in one way or another, violate the laws of logic. In this paper, we argue that existing impossible-worlds models of belief fail to describe agents who are both logically non-omniscient and logically competent. To model such agents, we argue, we need to ‘dynamize’ the impossible-worlds framework in a way that allows us to capture not only what agents believe, but also what they are able to infer from what they believe. In light of this diagnosis, we go on to develop the formal details of a dynamic impossible-worlds framework, and show that it successfully models agents who are both logically non-omniscient and logically competent.
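The contrast between non-omniscience and competence can be made concrete with a toy model (a hedged sketch of the general idea, not the authors' formalism): beliefs are an explicit set of formulas, and inference is bounded closure under modus ponens, so what an agent believes depends on how many reasoning steps it has taken.

```python
def step(beliefs):
    """One round of modus ponens over the explicit belief set.
    Conditionals are encoded as tuples ("->", antecedent, consequent)."""
    new = set(beliefs)
    for f in beliefs:
        if isinstance(f, tuple) and f[0] == "->" and f[1] in beliefs:
            new.add(f[2])
    return new

def believes_after(beliefs, formula, steps):
    # An agent believes a formula after n steps iff bounded
    # closure reaches it; with few steps it is non-omniscient,
    # with enough steps it is competent.
    for _ in range(steps):
        beliefs = step(beliefs)
    return formula in beliefs

b = {"p", ("->", "p", "q"), ("->", "q", "r")}
print(believes_after(b, "r", 1))  # False: r needs two inference steps
print(believes_after(b, "r", 2))  # True: reachable given enough steps
```

The dynamic frameworks discussed above are far richer (impossible worlds, arbitrary inference rules), but they share this basic shape: belief states indexed by the reasoning the agent has actually performed.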
Some theorists have developed formal approaches to truth that depend on counterexamples to the structural rules of contraction. Here, we study such approaches, with an eye to helping them respond to a certain kind of objection. We define a contractive relative of each noncontractive relation, for use in responding to the objection in question, and we explore one example: the contractive relative of multiplicative-additive affine logic with transparent truth, or MAALT.
Although the invariance criterion of logicality first emerged as a criterion of a purely mathematical interest, it has developed into a criterion of considerable linguistic and philosophical interest. In this paper I compare two different perspectives on this criterion. The first is the perspective of natural language. Here, the invariance criterion is measured by its success in capturing our linguistic intuitions about logicality and explaining our logical behavior in natural-linguistic settings. The second perspective is more theoretical. Here, the invariance criterion is used as a tool for developing a theoretical foundation of logic, focused on a critical examination, explanation, and justification of its veridicality and modal force.