This paper has two aims. First, it sets out an interpretation of the relevant logic E of relevant entailment based on the theory of situated inference. Second, it uses this interpretation, together with Anderson and Belnap’s natural deduction system for E, to generalise E to a range of other systems of strict relevant implication. Routley–Meyer ternary relation semantics for these systems are produced and completeness theorems are proven.
This paper discusses three relevant logics that obey Component Homogeneity - a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity - that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the other accomplishments of the paper, we generalize the semantics from Bochvar, Hallden, Deutsch and Daniels, we provide a general recipe to define containment logics, and we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde, as well as the connections between crossS*fde and the logic Eq of equality by Epstein. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results, and survey some open issues.
The aim of this paper is to explore what insights relevant logics may provide for the understanding of literary fictional narrative. To date, hardly anyone has reflected on the intersection of relevant logics and narratology, and some could think that there is good reason for this. On the one hand, relevance has been a prominent issue in pragmatics, in the tradition of Grice, and Sperber and Wilson; thus framed, relevance is highly context-sensitive, so it seems unsuitable for formal analysis. On the other hand, the very idea of a logic of narrative has been criticized, on the grounds that logic brings the time of human action to a stasis (Ricœur, II: 29-60), or that its emphasis on rules misses the creative, unpredictable character of literature (De Man). First, I will briefly introduce relevant logics, with an eye to showing their interest for narratological concerns, rather than providing a coherent (let alone comprehensive) survey. Secondly, lest I get drawn into purely abstract discussion, I will analyse several stories in order to give some instances of the kind of topics congenial to narratology that may be addressed with a relevantist toolkit. Thirdly (and lastly), I will expand in more theoretical fashion on certain issues raised in the second section and bring them into connection with pragmatic relevance theory.
I develop and defend a truthmaker semantics for the relevant logic R. The approach begins with a simple philosophical idea and develops it in various directions, so as to build a technically adequate relevant semantics. The central philosophical idea is that truths are true in virtue of specific states. Developing the idea formally results in a semantics on which truthmakers are relevant to what they make true. A very natural notion of conditionality is added, giving us relevant implication. I then investigate ways to add conjunction, disjunction, and negation; and I discuss how to justify contraposition and excluded middle within a truthmaker semantics.
The system R, or more precisely its pure implicational fragment R→, is considered by the relevance logicians as the most important. Another central system of relevance logic has been the logic E of entailment, which was supposed to capture strict relevant implication. A further system of relevance logic is RM, or R-mingle. The question is whether adding the mingle axiom to R→ yields RM→, the pure implicational fragment of that system. As concerns the weak systems, there are at least two approaches to the problem. First of all, it is possible to restrict the validity of some theorems. On another approach, we can investigate even weaker logics which have no theorems and are characterized only by rules of deducibility.
This paper sets out to evaluate the claim that Aristotle’s Assertoric Syllogistic is a relevance logic or shows significant similarities with it. I prepare the grounds for a meaningful comparison by extracting the notion of relevance employed in the most influential work on modern relevance logic, Anderson and Belnap’s Entailment. This notion is characterized by two conditions imposed on the concept of validity: first, that some meaning content is shared between the premises and the conclusion, and second, that the premises of a proof are actually used to derive the conclusion. Turning to Aristotle’s Prior Analytics, I argue that there is evidence that Aristotle’s Assertoric Syllogistic satisfies both conditions. Moreover, Aristotle at one point explicitly addresses the potential harmfulness of syllogisms with unused premises. Here, I argue that Aristotle’s analysis allows for a rejection of such syllogisms on formal grounds established in the foregoing parts of the Prior Analytics. In a final section I consider the view that Aristotle distinguished between validity on the one hand and syllogistic validity on the other. Following this line of reasoning, Aristotle’s logic might not be a relevance logic, since relevance is part of syllogistic validity and not, as modern relevance logic demands, of general validity. I argue that the reasons to reject this view are more compelling than the reasons to accept it and that we can, cautiously, uphold the result that Aristotle’s logic is a relevance logic.
In 1942 Haskell B. Curry presented what is now called Curry's paradox, which can be found in a logic independently of its stand on negation. In recent years there has been a revitalised interest in non-classical solutions to the semantic paradoxes. In this article a non-classical resolution of Curry’s paradox and Shaw-Kwei's paradox is proposed, without rejecting any contraction postulate. In addition, the relevant paraconsistent logics Č_n^#, 1 ≤ n < ω, in fact provide an effective way of circumventing the triviality of da Costa’s paraconsistent set theories NF_n^C.
This interesting and imaginative monograph is based on the author’s PhD dissertation supervised by Saul Kripke. It is dedicated to Timothy Smiley, whose interpretation of PRIOR ANALYTICS informs its approach. As suggested by its title, this short work demonstrates conclusively that Aristotle’s syllogistic is a suitable vehicle for fruitful discussion of contemporary issues in logical theory. Aristotle’s syllogistic is represented by Corcoran’s 1972 reconstruction. The review studies Lear’s treatment of Aristotle’s logic, his appreciation of the Corcoran-Smiley paradigm, and his understanding of modern logical theory. In the process Corcoran and Scanlan present new, previously unpublished results. Corcoran regards this review as an important contribution to contemporary study of PRIOR ANALYTICS: both the book and the review deserve to be better known.
Formal symptoms of relevance usually concern the propositional variables shared between the antecedent and the consequent of provable conditionals. Among the most famous results about such symptoms are Belnap’s early results showing that for sublogics of the strong relevant logic R, provable conditionals share a signed variable between antecedent and consequent. For logics weaker than R, stronger variable sharing results are available. In 1984, Ross Brady gave one well-known example of such a result. As a corollary to the main result of the paper, we give a very simple proof of a related but strictly stronger result.
Relevant logics infamously have the property that they only validate a conditional when some propositional variable is shared between its antecedent and consequent. This property has been strengthened in a variety of ways over the last half-century. Two of the more famous of these strengthenings are the strong variable sharing property and the depth relevance property. In this paper I demonstrate that an appropriate class of relevant logics has a property that might naturally be characterized as the supremum of these two properties. I also show how to use this fact to demonstrate that these logics seem to be constructive in previously unknown ways.
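The basic variable sharing property mentioned in the two abstracts above is easy to state mechanically. The following is a minimal sketch of my own (not drawn from either paper), assuming formulas are strings whose propositional variables are single letters; it checks the necessary condition that a relevantly valid conditional's antecedent and consequent share a variable.

```python
def variables(formula: str) -> set:
    """Collect the propositional variables of a formula, assumed to be
    single letters p, q, r, ...; connectives are non-letter symbols."""
    return {ch for ch in formula if ch.isalpha()}

def shares_variable(antecedent: str, consequent: str) -> bool:
    """Necessary condition, in relevant logics, for the validity of
    'antecedent -> consequent': some propositional variable occurs in both."""
    return bool(variables(antecedent) & variables(consequent))

# 'p -> (q -> q)' is classically valid, but 'p' and '(q -> q)'
# share no variable, so a relevant logic will not validate it:
print(shares_variable("p", "(q -> q)"))       # False
print(shares_variable("(p & q)", "(q | r)"))  # True: q is shared
```

Strengthenings such as strong variable sharing and depth relevance refine this test (e.g. by tracking the sign or nesting depth of the shared variable), but the simple intersection check above is the common core.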
What does it mean for the laws of logic to fail? My task in this paper is to answer this question. I use the resources that Routley/Sylvan developed with his collaborators for the semantics of relevant logics to explain a world where the laws of logic fail. I claim that the non-normal worlds that Routley/Sylvan introduced are exactly such worlds. To disambiguate different kinds of impossible worlds, I call such worlds logically impossible worlds. At a logically impossible world, the laws of logic fail. In this paper, I provide a definition of logically impossible worlds. I then show that there is nothing strange about admitting such worlds.
A. J. Ayer’s empiricist criterion of meaning was supposed to have sorted all statements into nonsense on the one hand, and tautologies or genuinely factual statements on the other. Unfortunately for Ayer, it follows from classical logic that his criterion is trivial—it classifies all statements as either tautologies or genuinely factual, but none as nonsense. However, in this paper, I argue that Ayer’s criterion of meaning can be defended from classical proofs of its triviality by the adoption of a relevant logic—an idea which is motivated because, according to Ayer, the genuinely factual statements are those which observation is relevant to.
Relevant logics provide an alternative to classical implication that is capable of accounting for the relationship between the antecedent and the consequent of a valid implication. Relevant implication is usually explained in terms of information required to assess a proposition. By doing so, relevant implication introduces a number of cognitively relevant aspects in the definition of logical operators. In this paper, we aim to take a closer look at the cognitive features of relevant implication. For this purpose, we develop a cognitively-oriented interpretation of the semantics of relevant logics. In particular, we provide an interpretation of Routley-Meyer semantics in terms of conceptual spaces and we show that it meets the constraints of the algebraic semantics of relevant logic.
Relevance logic has become ontologically fertile. No longer is the idea of relevance restricted in its application to purely logical relations among propositions, for as Dunn has shown in his (1987), it is possible to extend the idea in such a way that we can distinguish also between relevant and irrelevant predications, as for example between “Reagan is tall” and “Reagan is such that Socrates is wise”. Dunn shows that we can exploit certain special properties of identity within the context of standard relevance logic in a way which allows us to discriminate further between relevant and irrelevant properties, as also between relevant and irrelevant relations. The idea yields a family of ontologically interesting results concerning the different ways in which attributes and objects may hang together. Because of certain notorious peculiarities of relevance logic, however, Dunn’s idea breaks down where the attempt is made to have it bear fruit in application to relations among entities which are of homogeneous type.
There is a natural story about what logic is that sees it as tied up with two operations: a ‘throw things into a bag’ operation and a ‘closure’ operation. In a pair of recent papers, Jc Beall has fleshed out the account of logic this leaves us with in more detail. Using Beall’s exposition as a guide, this paper points out some problems with taking the second operation to be closure in the usual sense. After pointing out these problems, I then turn to fixing them in a restricted case and modulo a few simplifying assumptions. In a followup paper, the simplifications and restrictions will be removed.
Logic arguably plays a role in the normativity of reasoning. In particular, there are plausible norms of belief/disbelief whose antecedents are constituted by claims about what follows from what. But is logic also relevant to the normativity of agnostic attitudes? The question here is whether logical entailment also puts constraints on what kinds of things one can suspend judgment about. In this paper I address that question and I give a positive answer to it. In particular, I advance two logical norms of agnosticism, where the first one allows us to assess situations in which the subject is agnostic about the conclusion of a valid argument and the second one allows us to assess situations in which the subject is agnostic about one of the premises of a valid argument.
The paper presents an exhaustive menu of nonmonotonic logics. The options are individuated in terms of the principles they reject. I locate, e.g., cumulative logics and relevance logics on this menu. I highlight some frequently neglected options, and I argue that these neglected options are particularly attractive for inferentialists.
A logic is called 'paraconsistent' if it rejects the rule called 'ex contradictione quodlibet', according to which any conclusion follows from inconsistent premises. While logicians have proposed many technically developed paraconsistent logical systems and contemporary philosophers like Graham Priest have advanced the view that some contradictions can be true, and advocated a paraconsistent logic to deal with them, until recent times these systems have been little understood by philosophers. This book presents a comprehensive overview of paraconsistent logical systems to change this situation. The book includes almost every major author currently working in the field. The papers are on the cutting edge of the literature: some discuss current debates and others present important new ideas. The editors have avoided papers about technical details of paraconsistent logic, but instead concentrated upon works that discuss more 'big picture' ideas. Different treatments of paradoxes take centre stage in many of the papers, but there are also several papers on how to interpret paraconsistent logic and some on how it can be applied to philosophy of mathematics, the philosophy of language, and metaphysics.
One logic or many? I say—many. Or rather, I say there is one logic for each way of specifying the class of all possible circumstances, or models, i.e., all ways of interpreting a given language. But because there is no unique way of doing this, I say there is no unique logic except in a relative sense. Indeed, given any two competing logical theories T1 and T2 (in the same language) one could always consider their common core, T, and settle on that theory. So, given any language L, one could settle on the minimal logic T0 corresponding to the common core shared by all competitors. That would be a way of resisting relativism, as long as one is willing to redraw the bounds of logic accordingly. However, such a minimal theory T0 may be empty if the syntax of L contains no special ingredients the interpretation of which is independent of the specification of the relevant L-models. And generally—I argue—this is indeed the case.
The notions of types of dialogue and dialectical relevance are central themes in Walton’s work and the grounds for a dialectical approach to many fallacies. After outlining the dialogue models constituting the background of Walton’s account, this article presents the concepts of dialectical relevance and dialogue shifts in their application to biased argumentation, fallacious moves, and illicit argumentative strategies. By showing how the different dialectical proposals Walton advanced in several studies on argumentation develop a dialogical system, it proves possible to highlight the fundamental aspects of his theory in a comprehensive model of communication and interaction.
ABSTRACT: A detailed presentation of Stoic theory of arguments, including truth-value changes of arguments, Stoic syllogistic, Stoic indemonstrable arguments, Stoic inference rules (themata), including cut rules and antilogism, argumental deduction, elements of relevance logic in Stoic syllogistic, the question of completeness of Stoic logic, Stoic arguments valid in the specific sense, e.g. "Dio says it is day. But Dio speaks truly. Therefore it is day." A more formal and more detailed account of the Stoic theory of deduction can be found in S. Bobzien, Stoic Syllogistic, OSAP 1996.
We introduce a number of logics to reason about collective propositional attitudes that are defined by means of the majority rule. It is well known that majoritarian aggregation is subject to irrationality, as the results in social choice theory and judgment aggregation show. The proposed logics for modelling collective attitudes are based on a substructural propositional logic that allows for circumventing inconsistent outcomes. Individual and collective propositional attitudes, such as beliefs, desires, obligations, are then modelled by means of minimal modalities to ensure a number of basic principles. In this way, a viable consistent modelling of collective attitudes is obtained.
Val Plumwood’s 1993 paper, “The politics of reason: towards a feminist logic” (henceforth POR) attempted to set the stage for what she hoped would begin serious feminist exploration into formal logic – not merely its historical abuses, but, more importantly, its potential uses. This work offers us: (1) a case for there being feminist logic; and (2) a sketch of what it should resemble. The former goal of Plumwood’s paper encourages feminist theorists to reject anti-logic feminist views. The paper’s latter aim is even more challenging. Plumwood’s critique of classical negation (and classical logic) as a logic of domination asks us to recognize that particular logical systems are weapons of oppression. Against anti-logic feminist theorists, Plumwood argues that there are other logics besides classical logic, such as relevant logics, which are suited for feminist theorizing. Some logics may oppress while others may liberate. We provide details about the sources and context for her rejection of classical logic and motivation for promoting relevant logics as feminist.
We provide a logical matrix semantics and a Gentzen-style sequent calculus for the first-degree entailments valid in W. T. Parry’s logic of Analytic Implication. We achieve the former by introducing a logical matrix closely related to that inducing paracomplete weak Kleene logic, and the latter by presenting a calculus where the initial sequents and the left and right rules for negation are subject to linguistic constraints.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well-understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals. (shrink)
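Component (i) of the account above, Adams' Thesis, admits a simple numerical illustration. The following toy sketch is my own, not the authors' Popper-function formalism: it assumes a hypothetical probability distribution over four possible worlds and computes the acceptability of 'If it rains, the street is wet' as the conditional probability p(wet | rain).

```python
# Hypothetical distribution over four worlds; each world fixes the
# truth values of 'rain' and 'wet', with the weights summing to 1.
worlds = [
    ({"rain": True,  "wet": True},  0.45),
    ({"rain": True,  "wet": False}, 0.05),
    ({"rain": False, "wet": True},  0.10),
    ({"rain": False, "wet": False}, 0.40),
]

def prob(pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(weight for world, weight in worlds if pred(world))

def conditional_prob(psi, phi):
    """p(psi | phi) = p(psi & phi) / p(phi); undefined when p(phi) = 0."""
    p_phi = prob(phi)
    if p_phi == 0:
        raise ValueError("antecedent has zero probability")
    return prob(lambda w: phi(w) and psi(w)) / p_phi

# Acceptability, per Adams' Thesis, of 'If it rains, the street is wet':
acc = conditional_prob(lambda w: w["wet"], lambda w: w["rain"])
print(round(acc, 2))  # 0.9
```

On the paper's proposal, a high value like this would license acceptance only given the further condition (ii) that the antecedent is relevant, on-topic, for the consequent; that relevance filter is not modelled here.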
Section 1 reviews Strawson’s logic of presuppositions. Strawson’s justification is critiqued and a new justification proposed. Section 2 extends the logic of presuppositions to cases when the subject class is necessarily empty, such as (x)((Px & ~Px) → Qx). The strong similarity of the resulting logic with Richard Diaz’s truth-relevant logic is pointed out. Section 3 further extends the logic of presuppositions to sentences with many variables, and a certain valuation is proposed. It is noted that, given this valuation, Gödel’s sentence becomes neither true nor false. The similarity of this outcome with Goldstein and Gaifman’s solution of the Liar paradox, which is discussed in section 4, is emphasized. Section 5 returns to the definition of meaningfulness; the meaninglessness of certain sentences with empty subjects and of the Liar sentence is discussed. The objective of this paper is to show how all of the above-mentioned concepts are interrelated.
It is natural to think that our ordinary practices in giving explanations for our actions, for what we do, commit us to claiming that content properties are causally relevant to physical events such as the movements of our limbs and bodies, and events which these in turn cause. If you want to know why my body ambulates across the street, or why my arm went up before I set out, we suppose I have given you an answer when I say that I wanted to greet a friend on the other side of the street, and thought that my arm's going up would be interpreted by him as a signal to stop for a moment. This widely held view might be disputed, but I shall not argue for it in this paper. I want to start with the view that our beliefs and desires and other propositional attitudes are causally relevant, in virtue of their modes and particular contents, to our movements, in order to investigate the consequences for analyses of thought content. For this purpose, I argue, in sec. II, for three necessary conditions on causal relevance: (a) a nomic sufficiency condition, (b) a logical independence condition, and (c) a screening-off condition. In sec. III, I apply these conditions to relational and functional theories of thought content, arguing that these theories cannot accommodate the causal relevance of content properties to our behaviour. I argue further that, on two plausible assumptions, one about the dependence of the mental on the physical, and the other about the availability in principle of causal explanations of our movements in terms of our non-relational physical properties, content properties can be causally relevant only if they are nomically type-correlated, relative to certain circumstances, with non-relational physical properties of our bodies. In sec. IV, I respond to a number of objections that might be advanced against this conclusion.
Epistemic two-dimensional semantics is a theory in the philosophy of language that provides an account of meaning which is sensitive to the distinction between necessity and apriority. While this theory is usually presented in an informal manner, I take some steps in formalizing it in this paper. To do so, I define a semantics for a propositional modal logic with operators for the modalities of necessity, actuality, and apriority that captures the relevant ideas of epistemic two-dimensional semantics. I also describe some properties of the logic that are interesting from a philosophical perspective, and apply it to the so-called nesting problem.
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts—all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives—explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion—which is, I argue, substantially correct—that imperatives could not have propositions, or truth conditions, as their meanings.
This chapter focuses on alternative logics. It discusses a hierarchy of logical reform. It presents case studies that illustrate particular aspects of the logical revisionism discussed in the chapter. The first case study is of intuitionistic logic. The second case study turns to quantum logic, a system proposed on empirical grounds as a resolution of the antinomies of quantum mechanics. The third case study is concerned with systems of relevance logic, which have been the subject of an especially detailed reform program. Finally, the fourth case study is paraconsistent logic, perhaps the most controversial of serious proposals.
This paper uses argument diagrams, argumentation schemes, and some tools from formal argumentation systems developed in artificial intelligence to build a graph-theoretic model of relevance, which is shown to be applicable as a practical method for helping a third party judge issues of relevance or irrelevance of an argument in real examples. Examples used to illustrate how the method works are drawn from disputes about relevance in natural language discourse, including a criminal trial and a parliamentary debate.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
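The determination-rule idea above can be illustrated with a small sketch of my own (the rule, the attributes, and the instance are all hypothetical, not taken from the paper). A rule like 'nationality determines native language' licenses projecting an attribute from a single known instance onto a new case that agrees on the determining attribute, without fixing in advance what the projected value will be.

```python
# One known instance; the determination rule does not say what the
# determined value is, only that matching determinants share it.
known_instances = {
    "Pierre": {"nationality": "French", "language": "French"},
}

def infer_by_determination(new_case, determinant, determined):
    """Project the determined attribute onto new_case from any known
    instance that agrees with it on the determinant attribute."""
    for facts in known_instances.values():
        if facts[determinant] == new_case[determinant]:
            return facts[determined]
    return None  # no instance to project from; the rule stays silent

# A single instance suffices: a new French case inherits the language.
print(infer_by_determination({"nationality": "French"},
                             "nationality", "language"))  # French
```

The point mirrored here is the paper's: the rule supplies the warrant for the projection, while the content of the conclusion comes only from inspecting the instance.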
Imagine a dog tracing a scent to a crossroads, sniffing all but one of the exits, and then proceeding down the last without further examination. According to Sextus Empiricus, Chrysippus argued that the dog effectively employs disjunctive syllogism, concluding that since the quarry left no trace on the other paths, it must have taken the last. The story has been retold many times, with at least four different morals: (1) dogs use logic, so they are as clever as humans; (2) dogs use logic, so using logic is nothing special; (3) dogs reason well enough without logic; (4) dogs reason better for not having logic. This paper traces the history of Chrysippus's dog, from antiquity up to its discussion by relevance logicians in the twentieth century.
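The inference Chrysippus attributes to the dog, disjunctive syllogism (from 'A or B' and 'not A', conclude 'B'), can be sketched mechanically. The snippet below is my own illustration, with the exits at the crossroads playing the role of the disjuncts; notably, this is the very inference form that relevance logicians in the twentieth century came to reject.

```python
def disjunctive_syllogism(exits, ruled_out):
    """Given an exhaustive list of alternatives (the disjunction) and the
    set eliminated by sniffing (the negated disjuncts), return the unique
    remaining alternative, or None if the premises leave more than one."""
    remaining = [e for e in exits if e not in ruled_out]
    if len(remaining) == 1:
        return remaining[0]
    return None  # the premises do not settle a unique conclusion

# The dog sniffs two of three paths, finds nothing, and takes the third:
print(disjunctive_syllogism(["left", "middle", "right"],
                            {"left", "middle"}))  # right
```
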
Agents require a constant flow, and a high level of processing, of relevant semantic information, in order to interact successfully among themselves and with the environment in which they are embedded. Standard theories of information, however, are silent on the nature of epistemic relevance. In this paper, a subjectivist interpretation of epistemic relevance is developed and defended. It is based on a counterfactual and metatheoretical analysis of the degree of relevance of some semantic information i to an informee/agent a, as a function of the accuracy of i understood as an answer to a query q, given the probability that q might be asked by a. This interpretation of epistemic relevance vindicates a strongly semantic theory of information, according to which semantic information encapsulates truth. It accounts satisfactorily for several important applications and interpretations of the concept of relevant information in a variety of philosophical areas. And it interfaces successfully with current philosophical interpretations of causal and logical relevance.
Logics of joint strategic ability have recently received attention, with arguably the most influential being those in a family that includes Coalition Logic (CL) and Alternating-time Temporal Logic (ATL). Notably, both CL and ATL bypass the epistemic issues that underpin Schelling-type coordination problems, by apparently relying on the meta-level assumption of (perfectly reliable) communication between cooperating rational agents. Yet such epistemic issues arise naturally in settings relevant to ATL and CL: these logics are standardly interpreted on structures where agents move simultaneously, opening the possibility that an agent cannot foresee the concurrent choices of other agents. In this paper we introduce a variant of CL we call Two-Player Strategic Coordination Logic (SCL2). The key novelty of this framework is an operator for capturing coalitional ability when the cooperating agents cannot share strategic information. We identify significant differences in the expressive power and validities of SCL2 and CL2, and present a sound and complete axiomatization for SCL2. We briefly address conceptual challenges when shifting attention to games with more than two players and stronger notions of rationality.
This article is primarily concerned with the articulation of a defensible position on the relevance of phenomenological analysis to the current epistemological edifice, as the latter has evolved since the rupture with the classical scientific paradigm of the Newtonian-Leibnizian tradition which took place around the beginning of the 20th century. My approach is generally based on the reduction of the objects-contents of natural sciences, abstracted in the form of ideal objectivities in the corresponding logical-mathematical theories, to the content of meaning-acts ultimately referring to a specific being-within-the-world experience. This is a position that finds itself in line with Husserl’s gradual departure from the psychologistic interpretations of his earlier works on the philosophy of logic and mathematics, and culminates in a properly meant phenomenological foundation of natural sciences in his last major published work, namely the Crisis of European Sciences and Transcendental Phenomenology. Further, this article tries to set up a context of discourse in which to found both physical and formal objects in parallel terms, as essentially temporal-noematic objects, to the extent that they may be considered as invariants of the constitutional modes of a temporal consciousness.
The reasoning process of analogy is characterized by a strict interdependence between a process of abstraction of a common feature and the transfer of an attribute of the Analogue to the Primary Subject. The first reasoning step is regarded as an abstraction of a generic characteristic that is relevant for the attribution of the predicate. The abstracted feature can be considered, from a logic-semantic perspective, as a functional genus, in the sense that it is contextually essential for the attribution of the predicate, i.e. pragmatically fundamental (relevant) for the predication, or rather for the achievement of the communicative intention. While the transfer of the predicate from the Analogue to the analogical genus and from the genus to the Primary Subject is guaranteed by the maxims (or rules of inference) governing the genus-species relation, the connection between the genus and the predicate can be complex, characterized by various types of reasoning patterns. The relevance relation can hide implicit arguments, such as an implicit argument from classification, an evaluation based on values, consequences or rules, a causal relation, or an argument from practical reasoning.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen's negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall's connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
Recently several papers have reported relevance effects on the cognitive assessments of indicative conditionals, which pose an explanatory challenge to the Suppositional Theory of conditionals advanced by David Over, influential in the psychology of reasoning. Some of these results concern the "Equation" (P(if A, then C) = P(C|A)), others the de Finetti truth table, and yet others the uncertain and-to-if inference task. The purpose of this chapter is to take a bird's-eye view of the debate and investigate some of the open theoretical issues posed by the empirical results. Central among these is whether to count these effects as belonging to pragmatics or semantics.
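The Equation mentioned in this abstract can be illustrated with a toy probability model. The following sketch is purely illustrative (the joint distribution is made up, not drawn from the chapter): under the Suppositional Theory, the probability assigned to "if A, then C" is identified with the conditional probability P(C|A).

```python
# Toy illustration of the Equation: P(if A, then C) = P(C | A).
# We enumerate a small finite probability space over (A, C) outcomes.
# All numbers here are invented for illustration only.

joint = {
    (True, True): 0.30,   # P(A and C)
    (True, False): 0.10,  # P(A and not-C)
    (False, True): 0.20,
    (False, False): 0.40,
}

p_A = sum(p for (a, _), p in joint.items() if a)          # P(A) = 0.40
p_A_and_C = joint[(True, True)]                           # P(A and C) = 0.30

# The Suppositional Theory identifies P(if A, then C) with P(C | A).
p_C_given_A = p_A_and_C / p_A

print(p_C_given_A)  # 0.75
```

Note that P(C|A) here (0.75) differs from the probability of the material conditional, P(not-A or C) = 0.90 in the same model, which is one way the Equation acquires empirical content.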
This paper defines the form of prior knowledge that is required for sound inferences by analogy and single-instance generalizations, in both logical and probabilistic reasoning. In the logical case, the first-order determination rule defined in Davies (1985) is shown to solve both the justification and non-redundancy problems for analogical inference. The statistical analogue of determination that is put forward is termed 'uniformity'. Based on the semantics of determination and uniformity, a third notion of "relevance" is defined, both logically and probabilistically. The statistical relevance of one function in determining another is put forward as a way of defining the value of information: The statistical relevance of a function F to a function G is the absolute value of the change in one's information about the value of G afforded by specifying the value of F. This theory provides normative justifications for conclusions projected by analogy from one case to another, and for generalization from an instance to a rule. The soundness of such conclusions, in either the logical or the probabilistic case, can be identified with the extent to which the corresponding criteria (determination and uniformity) actually hold for the features being related.
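One way to make the quoted definition of statistical relevance concrete is to read "information about the value of G" as Shannon entropy, so that the relevance of F to G becomes |H(G) - H(G|F)|. This reading is an assumption for illustration (the paper's own formalization may differ); the function names and data below are hypothetical.

```python
import math
from collections import Counter, defaultdict

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {value: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def relevance(pairs):
    """Statistical relevance of F to G, read here (one possible
    formalization) as |H(G) - H(G|F)|: the absolute change in
    uncertainty about G afforded by specifying the value of F.
    `pairs` is a list of observed (f, g) pairs, taken as equiprobable."""
    n = len(pairs)
    g_counts = Counter(g for _, g in pairs)
    g_dist = {g: c / n for g, c in g_counts.items()}

    # Conditional entropy H(G | F) = sum over f of P(f) * H(G | F = f).
    by_f = defaultdict(list)
    for f, g in pairs:
        by_f[f].append(g)
    h_g_given_f = 0.0
    for gs in by_f.values():
        cond = Counter(gs)
        cond_dist = {g: c / len(gs) for g, c in cond.items()}
        h_g_given_f += (len(gs) / n) * entropy(cond_dist)

    return abs(entropy(g_dist) - h_g_given_f)

# F fully determines G: relevance equals the full entropy of G (1 bit here).
print(relevance([(0, 'a'), (0, 'a'), (1, 'b'), (1, 'b')]))  # 1.0
# F carries no information about G: relevance is 0.
print(relevance([(0, 'a'), (0, 'b'), (1, 'a'), (1, 'b')]))  # 0.0
```

On this reading, full determination of G by F maximizes relevance, and probabilistic independence makes it vanish, matching the paper's idea that determination is the limiting case that licenses analogical projection.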
I argue against abductivism about logic, which is the view that rational theory choice in logic happens by abduction. Abduction cannot serve as a neutral arbiter in many foundational disputes in logic because, in order to use abduction, one must first identify the relevant data. Which data one deems relevant depends on what I call one's conception of logic. One's conception of logic is, however, not independent of one's views regarding many of the foundational disputes that one may hope to solve by abduction.
Though there have been productive interactions between moral philosophers and deontic logicians, there has also been a tradition of neglecting the insights that the fields can offer one another. The most sustained interactions between moral philosophers and deontic logicians have not been systematic but instead have been scattered across a number of distinct and often unrelated topics. This chapter primarily focuses on three topics. First, we discuss the "actualism/possibilism" debate which, very roughly, concerns the relevance of what one will do at some future time to what one ought to do at present (§2). This topic is also used to introduce various modal deontic logics. Second, we discuss the particularism debate which, very roughly, concerns whether there can be any systematic general theory of what we ought to do (§3). This topic is also used to introduce various non-modal deontic logics. Third, we discuss collective action problems, which concern the connection between the obligations of individuals and the behavior and obligations of groups of individuals (§4). This topic is also used to discuss formal systems that allow us to study the relationship between individuals and groups. The chapter also contains a general discussion of the relation between ethical theory and deontic logic (§1) and a brief consideration of other miscellaneous topics (§5).
In this article I attempt to overcome extant obstacles in deriving fundamental, objective and logically deduced definitions of personhood and their rights, by introducing an a priori paradigm of beings and morality. I do so by drawing a distinction between entities that are sought as ends and entities that are sought as means to said ends. The former entities, I offer, are the essence of personhood and are considered precious by observers possessing a logical system of valuation. The latter entities, those sought only as a means to an end, I term 'materials.' Materials are sought for their conditional value: important for achieving sought ends, they are not considered precious in and of themselves. A normative system for how this dichotomy of entities should interact is consequently derived and introduced. This paradigm has applicability for modern humanism and beyond. Assuming societal technological progression whereby human bodies and their surrounding infrastructures continue to evolve and integrate, the distinction between beings and their supporting materials, and a moral code for their interactions, will become ever more relevant.
This paper starts from the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue, contra Carnap (1962, Section 87), that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts conflict, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too, in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects, plausibility and informativeness. According to the semantics of this logic, α is an acceptable theory for evidence β if and only if α is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought: only monotonicity has thus far proved indispensable. Section 5 explains a further point of general relevance: despite the "infinitary" nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
Our question is: can we embed minimal negation in implicative logics weaker than I→? Previous results show how to define minimal negation in the positive fragment of the logic of relevance R and in contractionless intuitionistic logic. Is it possible to endow weaker positive logics with minimal negation? This paper proves that minimal negation can be embedded in even such a weak system as Anderson and Belnap's minimal positive logic.
The paper discusses approaches to Epistemic Contextualism that model the satisfaction of the predicate 'know' in a given context C in terms of the notion of belief/fact-matching throughout a contextually specified similarity sphere of worlds that is centred on actuality. The paper offers three counterexamples to approaches of this type and argues that they lead to insurmountable difficulties. I conclude that what contextualists (and Subject-Sensitive Invariantists) have traditionally called the 'epistemic standards' of a given context C cannot be explicated in terms of a contextually specified similarity sphere that is centred on actuality. The mentioned accounts of epistemic relevance, and thus the corresponding accounts of the context-sensitivity (or subject-sensitivity) of 'knows', are to be rejected.