In this article, we redefine classical notions of theory reduction in such a way that model-theoretic preferential semantics becomes part of a realist depiction of this aspect of science. We offer a model-theoretic reconstruction of science in which theory succession or reduction is often better interpreted - at a finer level of analysis - as the result of model succession or reduction. This analysis leads to 'defeasible reduction', defined as follows: The conjunction of the assumptions of a reducing theory T with the definitions translating the vocabulary of a reduced theory T' to the vocabulary of T defeasibly entails the assumptions of the reduced T'. This relation of defeasible reduction offers, when additional knowledge becomes available, a more flexible kind of reduction in theory development than in the classical case. Defeasible reduction is also shown to solve the problems of entailment that classical homogeneous reduction encounters. Reduction in the defeasible sense is a practical device for studying the processes of science, since it is about highlighting different aspects of the same theory at different times of application, rather than about naive dreams concerning a metaphysical unity of science.
In this paper, by suggesting a formal representation of science based on recent advances in logic-based Artificial Intelligence (AI), we show how three serious concerns around the realisation of traditional scientific realism (the theory/observation distinction, underdetermination of theories by data, and theory revision) can be overcome, such that traditional realism is given a new guise as ‘naturalised’. We contend that such issues can be dealt with (in the context of scientific realism) by developing a formal representation of science based on the application of the following tools from Knowledge Representation: the family of Description Logics, an enrichment of classical logics via defeasible statements, and an application of the preferential interpretation of the approach to Belief Revision.
We present two defeasible logics of norm-propositions (statements about norms) that (i) consistently allow for the possibility of normative gaps and normative conflicts, and (ii) map each premise set to a sufficiently rich consequence set. In order to meet (i), we define the logic LNP, a conflict- and gap-tolerant logic of norm-propositions capable of formalizing both normative conflicts and normative gaps within the object language. Next, we strengthen LNP within the adaptive logic framework for non-monotonic reasoning in order to meet (ii). This results in the adaptive logics LNPr and LNPm, which interpret a given set of premises in such a way that normative conflicts and normative gaps are avoided ‘whenever possible’. LNPr and LNPm are equipped with a preferential semantics and a dynamic proof theory.
In this inaugural lecture I offer, against the background of a discussion of knowledge representation and its tools, an overview of my research in the philosophy of science. I defend a relational model-theoretic realism as the meta-stance most congruent with the model-theoretic view of science as a form of human engagement with the world. Making use of logics with preferential semantics within a model-theoretic paradigm, I give an account of science as process and product. I demonstrate the power of the full-blown employment of this paradigm in the philosophy of science by discussing the main applications of model-theoretic realism to traditional problems in the philosophy of science. I discuss my views of the nature of logic and of its role in the philosophy of science today. I also offer a brief discussion of the future of cognitive philosophy in South Africa. My conclusion is a general look at the nature of philosophical inquiry and its significance for philosophers today. South African Journal of Philosophy Vol. 25 (4) 2006: pp. 275-289.
I explore the logic of ground. I first develop a logic of weak ground. This logic strengthens the logic of weak ground presented by Fine in his ‘Guide to Ground.’ This logic, I argue, generates many plausible principles which Fine’s system leaves out. I then derive from this a logic of strict ground. I argue that there is a strong abductive case for adopting this logic. It’s elegant, parsimonious and explanatorily powerful. Yet, so I suggest, adopting it has important consequences. First, it means we should think of ground as a type of identity. Second, it means we should reject much of Fine’s logic of strict ground. I also show how the logic I develop connects to other systems in the literature. It is definitionally equivalent both to Angell’s logic of analytic containment and to Correia’s system G.
In this paper I argue that pluralism at the level of logical systems requires a certain monism at the meta-logical level, and so, in a sense, there cannot be pluralism all the way down. The adequate alternative logical systems bottom out in a shared basic meta-logic, and as such, logical pluralism is limited. I argue that the content of this basic meta-logic must include analogues of the logical rules Modus Ponens (MP) and Universal Instantiation (UI). I show this through a detailed analysis of the ‘adoption problem’, which manifests something special about MP and UI. It appears that MP and UI underwrite the very nature of a logical rule of inference, since all rules of inference are conditional and universal in their structure. As such, all logical rules presuppose MP and UI, making MP and UI self-governing, basic, unadoptable, and required in the meta-logic for the adequacy of any logical system.
Many philosophers take purportedly logical cases of ground to be obvious cases, and indeed such cases have been used to motivate the existence and importance of ground. I argue against this. I do so by motivating two kinds of semantic determination relations. Intuitions of logical ground track these semantic relations. Moreover, our knowledge of the semantics for first-order logic can explain why we have such intuitions. And, I argue, neither semantic relation can be a species of ground even on a quite broad conception of what ground is. Hence, without a positive argument for taking so-called ‘logical ground’ to be something distinct from a semantic determination relation, we should cease treating logical cases as cases of ground.
I consider the question of the possibility of the coexistence of neighborly love (love for strangers) and preferential love (love for persons because of or despite their attributes). This question has long perplexed interpreters of Kierkegaard. I make a threefold intervention into this interpretive debate. First, I aim to show that we shouldn’t privilege preferential love over neighborly love. Second, I reformulate preferential and neighborly love on a ‘topological’ model, so as to get a better grip on them. And third, I argue that preferential love can coexist with neighborly love insofar as the latter is granted primacy over the former.
At least since Aristotle’s famous 'sea-battle' passages in On Interpretation 9, some substantial minority of philosophers has been attracted to the doctrine of the open future--the doctrine that future contingent statements are not true. But, prima facie, such views seem inconsistent with the following intuition: if something has happened, then (looking back) it was the case that it would happen. How can it be that, looking forwards, it isn’t true that there will be a sea battle, while also being true that, looking backwards, it was the case that there would be a sea battle? This tension forms, in large part, what might be called the problem of future contingents. A dominant trend in temporal logic and semantic theorizing about future contingents seeks to validate both intuitions. Theorists in this tradition--including some interpretations of Aristotle, but paradigmatically Thomason (1970), as well as more recent developments in Belnap et al. (2001) and MacFarlane (2003, 2014)--have argued that the apparent tension between the intuitions is in fact merely apparent. In short, such theorists seek to maintain both of the following two theses: (i) the open future: future contingents are not true, and (ii) retro-closure: from the fact that something is true, it follows that it was the case that it would be true. It is well known that reflection on the problem of future contingents has in many ways been inspired by importantly parallel issues regarding divine foreknowledge and indeterminism. In this paper, we take up this perspective, and ask what accepting both the open future and retro-closure predicts about omniscience. When we theorize about a perfect knower, we are theorizing about what an ideal agent ought to believe. Our contention is that there is no acceptable view of ideally rational belief given the assumptions of the open future and retro-closure, and thus this casts doubt on the conjunction of those assumptions.
When discussing Logical Pluralism, several critics argue that such an open-minded position is untenable. The key to this conclusion is that, given a number of widely accepted assumptions, the pluralist view collapses into Logical Monism. In this paper we show that the arguments usually employed to arrive at this conclusion do not work. The main reason for this is the existence of certain substructural logics which have the same set of valid inferences as Classical Logic—although they are, in a clear sense, non-identical to it. We argue that this phenomenon can be generalized, given the existence of logics which coincide with Classical Logic regarding a number of metainferential levels—although they are, again, clearly different systems. We claim this highlights the need to arrive at a more refined version of the Collapse Argument, which we discuss at the end of the paper.
While not a history of classical logic, this book discusses and quotes central passages on its origins and development, from a philosophical perspective. While not a book in mathematical logic, it treats formal logic from an essentially mathematical perspective. Biased towards a computational approach, with SAT and VAL as its backbone, this is an introduction to logic that covers essential aspects of the three branches of logic, to wit, philosophical, mathematical, and computational.
Logical monism is the view that there is ‘One True Logic’. This is the default position, against which pluralists react. If there were not ‘One True Logic’, it is hard to see how there could be one true theory of anything. A theory is closed under a logic! But what is logical monism? In this article, I consider semantic, logical, modal, scientific, and metaphysical proposals. I argue that on no ‘factualist’ analysis (according to which ‘there is One True Logic’ expresses a factual claim, rather than an attitude like approval) does the doctrine have both metaphysical and methodological import. Metaphysically, logics abound. Methodologically, what to infer from what is not settled by the facts, even the normative ones. I conclude that the only interesting sense in which there could be One True Logic is noncognitive. The same may be true of monism about normative areas generally, such as the moral, epistemic, and prudential.
What does it mean for the laws of logic to fail? My task in this paper is to answer this question. I use the resources that Routley/Sylvan developed with his collaborators for the semantics of relevant logics to explain a world where the laws of logic fail. I claim that the non-normal worlds that Routley/Sylvan introduced are exactly such worlds. To disambiguate different kinds of impossible worlds, I call such worlds logically impossible worlds. At a logically impossible world, the laws of logic fail. In this paper, I provide a definition of logically impossible worlds. I then show that there is nothing strange about admitting such worlds.
This paper discusses three relevant logics that obey Component Homogeneity - a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity - that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the paper's other accomplishments, we generalize the semantics from Bochvar, Hallden, Deutsch and Daniels, we provide a general recipe to define containment logics, and we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde, as well as the connections between crossS*fde and the logic Eq of equality by Epstein. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results, and overview some open issues.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism for reasoning in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but perhaps also welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require the practical knowledge of how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both the mathematical foundation and the practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing not only to learn about, but also to do something with, many-valued logics.
It would be good to have a Bayesian decision theory that assesses our decisions and thinking according to everyday standards of rationality — standards that do not require logical omniscience (Garber 1983, Hacking 1967). To that end we develop a “fragmented” decision theory in which a single state of mind is represented by a family of credence functions, each associated with a distinct choice condition (Lewis 1982, Stalnaker 1984). The theory imposes a local coherence assumption guaranteeing that as an agent's attention shifts, successive batches of "obvious" logical information become available to her. A rule of expected utility maximization can then be applied to the decision of what to attend to next during a train of thought. On the resulting theory, rationality requires ordinary agents to be logically competent and to often engage in trains of thought that increase the unification of their states of mind. But rationality does not require ordinary agents to be logically omniscient.
It has been known for a few years that no more than Pi-1-1 comprehension is needed for the proof of "Frege's Theorem". One can at least imagine a view that would regard Pi-1-1 comprehension axioms as logical truths but deny that status to any that are more complex—a view that would, in particular, deny that full second-order logic deserves the name. Such a view would serve the purposes of neo-logicists. It is, in fact, no part of my view that, say, Delta-3-1 comprehension axioms are not logical truths. What I am going to suggest, however, is that there is a special case to be made on behalf of Pi-1-1 comprehension. Making the case involves investigating extensions of first-order logic that do not rely upon the presence of second-order quantifiers. A formal system for so-called "ancestral logic" is developed, and it is then extended to yield what I call "Arché logic".
I show that intuitive and logical considerations do not justify introducing Leibniz’s Law of the Indiscernibility of Identicals in more than a limited form, as applying to atomic formulas. Once this is accepted, it follows that Leibniz’s Law generalises to all formulas of the first-order Predicate Calculus but not to modal formulas. Among other things, identity turns out to be logically contingent.
A general theory of logical oppositions is proposed by abstracting these from the Aristotelian background of quantified sentences. Opposition is a relation that goes beyond incompatibility (not being true together), and a question-answer semantics is devised to investigate the features of oppositions and opposites within a functional calculus. Finally, several theoretical problems about its applicability are considered.
Suppose a person lives in a sub-Saharan country that has won its independence from colonial powers in the last 50 years or so. Suppose also that that person has become a high-ranking government official who makes decisions on how to allocate goods, such as civil service jobs and contracts with private firms. Should such a person refrain from considering any particulars about potential recipients, or might it be appropriate to consider, for example, family membership, party affiliation, race or revolutionary stature as reasons to benefit certain individuals at some cost to the general public? Which of these factors should be considered unjust, or even corrupt, as a basis on which to allocate state goods, and which should not? This chapter outlines an attractive moral theory with African content that forbids both impartialism and a strong form of partialism that would permit government officials to favour members of their families or political parties. Between these two extremes, a moderate partialism is prescribed. This permits government agents to occasionally favour veterans and victims of state injustices at some cost to the general public. This chapter seeks to provide a new, unified explanation of why sub-Saharan values permit some forms of partiality, such as the preferential hiring of those who struggled against colonialism, but prohibit other, nepotistic forms of partiality.
A natural and increasingly popular account of how to revise our logical beliefs treats revision of logic analogously to the revision of scientific theories. I investigate this approach and argue that simple applications of abductive methodology to logic result in revision-cycles, developing a detailed case study of an actual dispute with this property. This is problematic if we take abductive methodology to provide justification for revising our logical framework. I then generalize the case study, pointing to similarities with more recent and popular heterodox logics such as naïve logics of truth. I use this discussion to motivate a constraint on the uses of such methodology, which I call logical partisanhood: roughly, both the proposed alternative and our actual background logic must be able to agree that moving to the alternative logic is no worse than staying put.
In previous work, I introduced a complete axiomatization of classical non-tautologies based essentially on Łukasiewicz’s rejection method. The present paper provides a new, Hilbert-type axiomatization (along with related systems that axiomatize classical contradictions, non-contradictions, contingencies and non-contingencies respectively). This new system is mathematically less elegant, but the format of the inferential rules and the structure of the completeness proof possess some intrinsic interest and suggest instructive comparisons with the logic of tautologies.
2nd edition. The theory of logical consequence is central in modern logic and its applications. However, it is mostly dispersed in an abundance of often difficult-to-access papers, and is rarely treated with applications in mind. This book collects the most fundamental aspects of this theory and offers the reader the basics of its applications in computer science, artificial intelligence, and cognitive science, to name but the most important fields where this notion finds its many applications.
I distinguish two ways of developing anti-exceptionalist approaches to logical revision. The first emphasizes comparing the theoretical virtuousness of developed bodies of logical theory, such as classical and intuitionistic logic. I'll call this whole-theory comparison. The second attempts local repairs to problematic bits of our logical theories, such as dropping excluded middle to deal with intuitions about vagueness. I'll call this the piecemeal approach. I then briefly discuss a problem I've developed elsewhere for comparisons of logical theories. Essentially, the problem is that a pair of logics may each evaluate the alternative as superior to themselves, resulting in oscillation between logical options. The piecemeal approach offers a way out of this problem and might thereby seem preferable to whole-theory comparisons. I go on to show that reflective equilibrium, the best-known piecemeal method, has deep problems of its own when applied to logic.
This book is written for those who wish to learn some basic principles of formal logic but, more importantly, to learn some easy methods to unpick arguments and assess their value for truth and validity. The first section explains the ideas behind traditional logic, which was formed well over two thousand years ago by the ancient Greeks. Terms such as ‘categorical syllogism’, ‘premise’, ‘deduction’ and ‘validity’ may appear at first sight to be inscrutable, but will easily be understood with examples bringing the subjects to life. Traditionally, Venn diagrams have been employed to test arguments. These are very useful, but their application is limited and they are not open to quantification. The mid-section of this book introduces a methodology that makes the analysis of arguments accessible through a new form of diagram, modified from those of the mathematician Leonhard Euler. These new diagrammatic methods will be employed to demonstrate an addition to the basic form of syllogism, including a refined definition of the terms ‘most’ and ‘some’ within propositions. This may seem a little obscure at the moment, but one will readily apprehend these new methods and principles of a more modern logic.
This is the 3rd edition. Although a number of new technological applications require classical deductive computation with non-classical logics, many key technologies still do well—or exclusively, for that matter—with classical logic. In this first volume, we elaborate on classical deductive computing with classical logic. The objective of the main text is to provide the reader with a thorough elaboration on both classical computing – a.k.a. formal languages and automata theory – and classical deduction with the classical first-order predicate calculus, with a view to computational implementations, namely in automated theorem proving and logic programming. The present third edition improves on the previous ones by providing an altogether more algorithmic approach: there is now a wholly new section on algorithms, and there are in total fourteen clearly isolated algorithms designed in pseudo-code. Other improvements include, for instance, an emphasis on functions in Chapter 1 and more exercises with Turing machines.
Inductive Logic is a ‘thematic compilation’ by Avi Sion. It collects in one volume many (though not all) of the essays that he has written on this subject over a period of some 23 years, all of which demonstrate the possibility and conditions of validity of human knowledge, and the utility and reliability of human cognitive means when properly used, contrary to the skeptical assumptions that are nowadays fashionable.
Logical Criticism of Buddhist Doctrines is a ‘thematic compilation’ by Avi Sion. It collects in one volume the essays that he has written on this subject over a period of some 15 years after the publication of his first book on Buddhism, Buddhist Illogic. It comprises expositions and empirical and logical critiques of many (though not all) Buddhist doctrines, such as impermanence, interdependence, emptiness, and the denial of self or soul. It includes his most recent essay, regarding the five skandhas doctrine.
This essay examines the philosophical significance of $\Omega$-logic in Zermelo-Fraenkel set theory with choice (ZFC). The duality between coalgebra and algebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The modal profile of $\Omega$-logical validity can then be countenanced within a coalgebraic logic, and $\Omega$-logical validity can be defined via deterministic automata. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic and modal profiles of $\Omega$-logical validity correspond to those of second-order logical consequence, $\Omega$-logical validity is genuinely logical, and thus vindicates a neo-logicist conception of mathematical truth in the set-theoretic multiverse. Second, the foregoing provides a modal-computational account of the interpretation of mathematical vocabulary, adducing in favor of a realist conception of the cumulative hierarchy of sets.
According to certain normative theories in epistemology, rationality requires us to be logically omniscient. Yet this prescription clashes with our ordinary judgments of rationality. How should we resolve this tension? In this paper, I focus particularly on the logical omniscience requirement in Bayesian epistemology. Building on a key insight by Hacking (1967, pp. 311–325), I develop a version of Bayesianism that permits logical ignorance. This includes: an account of the synchronic norms that govern a logically ignorant individual at any given time; an account of how we reduce our logical ignorance by learning logical facts and how we should update our credences in response to such evidence; and an account of when logical ignorance is irrational and when it isn’t. At the end, I explain why the requirement of logical omniscience remains true of ideal agents with no computational, processing, or storage limitations.
This article aims to introduce a new solution to the Logical Problem of the Trinity. This solution is provided by utilising a number of theses within the field of contemporary metaphysics in order to establish a conceptual basis for a novel account and model of the doctrine of the Trinity termed Monarchical Aspectivalism, which will provide the means for proposing an alternative reading of the Athanasian Creed that is free from any consistency problems.
Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to adapt Popper's falsificationism and extend it not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
Work on the nature and scope of formal logic has focused unduly on the distinction between logical and extra-logical vocabulary; which argument forms a logical theory countenances depends not only on its stock of logical terms, but also on its range of grammatical categories and modes of composition. Furthermore, there is a sense in which logical terms are unnecessary. Alexandra Zinke has recently pointed out that propositional logic can be done without logical terms. By defining a logical-term-free language with the full expressive power of first-order logic with identity, I show that this is true of logic more generally. Furthermore, having, in a logical theory, non-trivial valid forms that do not involve logical terms is not merely a technical possibility. As the case of adverbs shows, issues about the range of argument forms logic should countenance can quite naturally arise in such a way that they do not turn on whether we countenance certain terms as logical.
Logic in the Torah is a ‘thematic compilation’ by Avi Sion. It collects in one volume essays that he has written on this subject in Judaic Logic (1995) and A Fortiori Logic (2013), in which traces of logic in the Torah and related religious documents (the Nakh, the Christian Bible, and the Koran and Hadiths) are identified and analyzed.
Logic in the Talmud is a ‘thematic compilation’ by Avi Sion. It collects in one volume essays that he has written on this subject in Judaic Logic (1995) and A Fortiori Logic (2013), in which traces of logic in the Talmud (the Mishna and Gemara) are identified and analyzed. While this book does not constitute an exhaustive study of logic in the Talmud, it is a ground-breaking and extensive study.
We explore the view that Frege's puzzle is a source of straightforward counterexamples to Leibniz's law. Taking this seriously requires us to revise the classical logic of quantifiers and identity; we work out the options, in the context of higher-order logic. The logics we arrive at provide the resources for a straightforward semantics of attitude reports that is consistent with the Millian thesis that the meaning of a name is just the thing it stands for. We provide models to show that some of these logics are non-degenerate.
Philosophy of biology is often said to have emerged in the last third of the twentieth century. Prior to this time, it has been alleged that the only authors who engaged philosophically with the life sciences were either logical empiricists who sought to impose the explanatory ideals of the physical sciences onto biology, or vitalists who invoked mystical agencies in an attempt to ward off the threat of physicochemical reduction. These schools paid little attention to actual biological science, and as a result philosophy of biology languished in a state of futility for much of the twentieth century. The situation, we are told, only began to change in the late 1960s and early 1970s, when a new generation of researchers began to focus on problems internal to biology, leading to the consolidation of the discipline. In this paper we challenge this widely accepted narrative of the history of philosophy of biology. We do so by arguing that the most important tradition within early twentieth-century philosophy of biology was neither logical empiricism nor vitalism, but the organicist movement that flourished between the First and Second World Wars. We show that the organicist corpus is thematically and methodologically continuous with the contemporary literature in order to discredit the view that early work in the philosophy of biology was unproductive, and we emphasize the desirability of integrating the historical and contemporary conversations into a single, unified discourse.
The result of combining classical quantificational logic with modal logic proves necessitism – the claim that necessarily everything is necessarily identical to something. This problem is reflected in the purely quantificational theory by theorems such as ∃x t=x; it is a theorem, for example, that something is identical to Timothy Williamson. The standard way to avoid these consequences is to weaken the theory of quantification to a certain kind of free logic. However, it has often been noted that in order to specify the truth conditions of certain sentences involving constants or variables that don’t denote, one has to apparently quantify over things that are not identical to anything. In this paper I defend a contingentist, non-Meinongian metaphysics within a positive free logic. I argue that although certain names and free variables do not actually refer to anything, in each case there might have been something they actually refer to, allowing one to interpret the contingentist claims without quantifying over mere possibilia.
The informal logic movement began as an attempt to develop – and teach – an alternative logic which can account for the real life arguing that surrounds us in our daily lives – in newspapers and the popular media, political and social commentary, advertising, and interpersonal exchange. The movement was rooted in research and discussion in Canada and especially at the University of Windsor, and has become a branch of argumentation theory which intersects with related traditions and approaches (notably formal logic, rhetoric and dialectics in the form of pragma-dialectics). In this volume, some of the best known contributors to the movement discuss their views and the reasoning and argument which is informal logic’s subject matter. Many themes and issues are explored in a way that will fuel the continued evolution of the field. Federico Puppo adds an insightful essay which considers the origins and development of informal logic and whether informal logicians are properly described as a “school” of thought. In considering that proposition, Puppo introduces readers to a diverse range of essays, some of them previously published, others written specifically for this volume.
We argue that the extant evidence for Stoic logic provides all the elements required for a variable-free theory of multiple generality, including a number of remarkably modern features that straddle logic and semantics, such as the understanding of one- and two-place predicates as functions, the canonical formulation of universals as quantified conditionals, a straightforward relation between elements of propositional and first-order logic, and the roles of anaphora and rigid order in the regimented sentences that express multiply general propositions. We consider and reinterpret some ancient texts that have been neglected in the context of Stoic universal and existential propositions and offer new explanations of some puzzling features in Stoic logic. Our results confirm that Stoic logic surpasses Aristotle’s with regard to multiple generality, and are a reminder that focusing on multiple generality through the lens of Frege-inspired variable-binding quantifier theory may hamper our understanding and appreciation of pre-Fregean theories of multiple generality.
An introductory textbook on metalogic. It covers naive set theory, first-order logic, sequent calculus and natural deduction, the completeness, compactness, and Löwenheim-Skolem theorems, Turing machines, and the undecidability of the halting problem and of first-order logic. The audience is undergraduate students with some background in formal logic.
What is a logical constant? The question is addressed in the tradition of Tarski's definition of logical operations as operations which are invariant under permutation. The paper introduces a general setting in which invariance criteria for logical operations can be compared and argues for invariance under potential isomorphism as the most natural characterization of logical operations.
In spite of its significance for everyday and philosophical discourse, the explanatory connective has not received much treatment in the philosophy of logic. The present paper develops a logic for this connective based on systematic connections between it and the truth-functional connectives.
Logic arguably plays a role in the normativity of reasoning. In particular, there are plausible norms of belief/disbelief whose antecedents are constituted by claims about what follows from what. But is logic also relevant to the normativity of agnostic attitudes? The question here is whether logical entailment also puts constraints on what kinds of things one can suspend judgment about. In this paper I address that question and I give a positive answer to it. In particular, I advance two logical norms of agnosticism, where the first one allows us to assess situations in which the subject is agnostic about the conclusion of a valid argument and the second one allows us to assess situations in which the subject is agnostic about one of the premises of a valid argument.
In this article, we will present a number of technical results concerning Classical Logic, ST and related systems. Our main contribution consists in offering a novel identity criterion for logics in general and, therefore, for Classical Logic. In particular, we will firstly generalize the ST phenomenon, thereby obtaining a recursively defined hierarchy of strict-tolerant systems. Secondly, we will prove that the logics in this hierarchy are progressively more classical, although not entirely classical. We will claim that a logic is to be identified with an infinite sequence of consequence relations holding between increasingly complex relata: formulae, inferences, metainferences, and so on. As a result, the present proposal allows us to differentiate Classical Logic not only from ST, but also from other systems sharing with it their valid metainferences. Finally, we show how these results have interesting consequences for some topics in the philosophical logic literature, among them for the debate around Logical Pluralism. This is because the discussion of that topic is usually carried out employing a rivalry criterion for logics that will need to be modified in light of the present investigation, according to which two logics can be non-identical even if they share the same valid inferences.
Articles by Ian Mueller, Ronald Zirin, Norman Kretzmann, John Corcoran, John Mulhern, Mary Mulhern, Josiah Gould, and others. Topics: Aristotle's Syllogistic, Stoic Logic, Modern Research in Ancient Logic.
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the `logicality of language', accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is `blind' to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical---indeed quite exotic---kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-a-vis its open class terms and employs a deductive system that is basically classical.