Citations of:


There are two ways of understanding the notion of a contradiction: as a conjunction of a statement and its negation, or as a pair of statements one of which is the negation of the other. Correspondingly, there are two ways of understanding the Law of Non-Contradiction (LNC), i.e., the law that says that no contradictions can be true. In this paper I offer some arguments to the effect that on the first (collective) reading LNC is non-negotiable, but on the second (...) 
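The contrast between the two readings can be made explicit in a schematic formalisation; the truth predicate T used here is an illustrative assumption, not the paper's own notation:

```latex
% Collective reading: no conjunction of a statement and its negation is true.
\neg T(A \wedge \neg A)
% Pair (distributive) reading: no statement and its negation are both true.
\neg\,(T(A) \wedge T(\neg A))
```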

This is a companion paper to Braüner, where a natural deduction system for propositional hybrid logic is given. In the present paper we generalize the system to the first-order case. Our natural deduction system for first-order hybrid logic can be extended with additional inference rules corresponding to conditions on the accessibility relations and the quantifier domains expressed by so-called geometric theories. We prove soundness and completeness and we prove a normalisation theorem. Moreover, we give an axiom system for first-order hybrid logic. 

We introduce a simple inference system based on two primitive relations between terms, namely, inclusion and exclusion relations. We present a normalization theorem, and then provide a characterization of the structure of normal proofs. Based on this, inferences in a syllogistic fragment of natural language are reconstructed within our system. We also show that our system can be embedded into a fragment of propositional minimal logic. 

In the present paper I wish to regard constructive logic as a self-contained system for the treatment of epistemological issues; the explanations of the constructivist logical notions are cast in an epistemological mold already from the outset. The discussion offered here intends to make explicit this implicit epistemic character of constructivism. Particular attention will be given to the intended interpretation laid down by Heyting. This interpretation, especially as refined in the type-theoretical work of Per Martin-Löf, puts the system on par with the early efforts of Frege and Whitehead-Russell. This quite (...) 

If proofs are nothing more than truth makers, then there is no force in the standard argument against classical logic (there is no guarantee that there is either a proof for A or a proof for not-A). The standard intuitionistic conception of a mathematical proof is stronger: there are epistemic constraints on proofs. But the idea that proofs must be recognizable as such by us, with our actual capacities, is incompatible with the standard intuitionistic explanations of the meanings of the logical (...) 

This paper presents rules of inference for a binary quantifier I for the formalisation of sentences containing definite descriptions within intuitionist positive free logic. I binds one variable and forms a formula from two formulas. Ix[F, G] means ‘The F is G’. The system is shown to have desirable proof-theoretic properties: it is proved that deductions in it can be brought into normal form. The discussion is rounded off by comparisons between the approach to the formalisation of definite descriptions recommended (...) 

This paper considers proof-theoretic semantics for necessity within Dummett's and Prawitz's framework. Inspired by a system of Pfenning's and Davies's, the language of intuitionist logic is extended by a higher-order operator which captures a notion of validity. A notion of relative necessity is defined in terms of it, which expresses a necessary connection between the assumptions and the conclusion of a deduction. 

The focus of this paper is Dummett's meaning-theoretical arguments against classical logic, based on considerations about the meaning of negation. Using Dummettian principles, I shall outline three such arguments, of increasing strength, and show that they are unsuccessful by giving responses to each argument on behalf of the classical logician. What is crucial is that in responding to these arguments a classicist need not challenge any of the basic assumptions of Dummett's outlook on the theory of meaning. In particular, I (...) 

This short paper has two loosely connected parts. In the first part, I discuss the difference between classical and intuitionist logic in relation to the different roles hypotheses play in each logic. Harmony is normally understood as a relation between two ways of manipulating formulas in systems of natural deduction: their introduction and elimination. I argue, however, that there is at least a third way of manipulating formulas, namely the discharge of assumptions, and that the difference between classical and (...) 

This paper formulates a bilateral account of harmony, which is an alternative to the one proposed by Francez. It builds on an account of harmony for unilateral logic proposed by Kürbis and the observation that reading some of the rules for the connectives of bilateral logic bottom up gives the grounds and consequences of formulas with the opposite speech act. Thus the consequences of asserting a formula give grounds for denying it, namely if the opposite speech act is applied to (...) 

Bilateralists hold that the meanings of the connectives are determined by rules of inference for their use in deductive reasoning with asserted and denied formulas. This paper presents two bilateral connectives comparable to Prior's tonk, for which, unlike for tonk, there are reduction steps for the removal of maximal formulas arising from introducing and eliminating formulas with those connectives as main operators. Adding either of them to bilateral classical logic results in an incoherent system. One way around this problem is (...) 

This paper considers a formalisation of classical logic using general introduction rules and general elimination rules. It proposes a definition of ‘maximal formula’, ‘segment’ and ‘maximal segment’ suitable to the system, and gives reduction procedures for them. It is then shown that deductions in the system convert into normal form, i.e. deductions that contain neither maximal formulas nor maximal segments, and that deductions in normal form satisfy the subformula property. Tarski’s Rule is treated as a general introduction rule for implication. (...) 

At the end of the 1980s, Tennant invented a logical system that he called “intuitionistic relevant logic”. Now he calls this same system “Core logic.” In Section 1, by reference to the rules of natural deduction for $\mathbf{IR}$, I explain why $\mathbf{IR}$ is a relevant logic in a subtle way. Sections 2, 3, and 4 give three reasons to assert that $\mathbf{IR}$ cannot be a core logic. 

We develop a bottom-up approach to truth-value semantics for classical logic of partial terms based on equality and apply it to prove the conservativity of the addition of partial description and selection functions, independently of any strictness assumption. 

Kalmaric. We set out a system T, consisting of normal proofs constructed by means of elegantly symmetrical introduction and elimination rules. In the system T there are two requirements on applications of discharge rules. T is sound and complete for Kalmaric arguments. The first requirement demands non-vacuous discharge of assumptions; the second requires that the assumption discharged be the sole one available of highest degree. We then consider a 'Duhemian' extension T*, obtained simply by dropping the requirement (...) 

In order to explicate Gentzen’s famous remark that the introduction rules for logical constants give their meaning, the elimination rules being simply consequences of the meaning so given, we develop natural deduction rules for Sheffer’s stroke, alternative denial. The first system turns out to lack Double Negation. Strengthening the introduction rules by allowing the introduction of Sheffer’s stroke into a disjunctive context produces a complete system of classical logic, one which preserves the harmony between the rules which Gentzen wanted: all indirect proof reduces (...) 
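For orientation, the stroke's classical truth conditions, and the standard definitions of the other connectives from it, can be stated as follows (these are textbook facts, not details taken from the abstract):

```latex
% Alternative denial: A | B is true iff A and B are not both true.
A \mid B \;\equiv\; \neg(A \wedge B)
% The usual connectives are definable from the stroke alone:
\neg A \;\equiv\; A \mid A \qquad
A \wedge B \;\equiv\; (A \mid B) \mid (A \mid B) \qquad
A \vee B \;\equiv\; (A \mid A) \mid (B \mid B)
```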

The geometric system of deduction called N-Graphs was introduced by de Oliveira in 2001. The proofs in this system are represented by means of digraphs and, while its derivations are mostly based on Gentzen's sequent calculus, the system gets its inspiration from geometrically based systems, such as the Kneales' tables of development, Statman's proofs-as-graphs, Buss' logical flow graphs, and Girard's proof-nets. Given that all these geometric systems appeal to the classical symmetry between premises and conclusions, providing an intuitionistic version of (...) 

Information is a notion of wide use and great intuitive appeal, and hence, not surprisingly, different formal paradigms claim part of it, from Shannon channel theory to Kolmogorov complexity. Information is also a widely used term in logic, but a similar diversity repeats itself: there are several competing logical accounts of this notion, ranging from semantic to syntactic. In this chapter, we will discuss three major logical accounts of information. 

It is often claimed that nominalistic programmes to reconstruct mathematics fail, since they will at some point involve the notion of logical consequence which is unavailable to the nominalist. In this paper we use an idea of Goodman and Quine to develop a nominalistically acceptable explication of logical consequence. 

According to the species of neologicism advanced by Hale and Wright, mathematical knowledge is essentially logical knowledge. Their view is found to be best understood as a set of related though independent theses: (1) neo-Fregeanism: a general conception of the relation between language and reality; (2) the method of abstraction: a particular method for introducing concepts into language; (3) the scope of logic: second-order logic is logic. The criticisms of Boolos, Dummett, Field and Quine (amongst others) of these theses are explicated and assessed. (...) 

Quine's argument for a naturalized epistemology is routinely perceived as an argument from despair: traditional epistemology must be abandoned because all attempts to deduce our scientific theories from sense experience have failed. In this paper, I will show that this picture is historically inaccurate and that Quine's argument against first philosophy is considerably stronger and subtler than the standard conception suggests. For Quine, the first philosopher's quest for foundations is inherently incoherent; the very idea of a self-sufficient sense datum language (...) 

Is it possible to give a justification of our own practice of deductive inference? The purpose of this paper is to explain what such a justification might consist in and what its purpose could be. On the conception that we are going to pursue, to give a justification for a deductive practice means to explain in terms of an intuitively satisfactory notion of validity why the inferences that conform to the practice coincide with the valid ones. That is, a justification (...) 

Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization (both semantic and proof-theoretical) of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic: formality, strong modal force, generality, (...) 

In the paper a decision procedure for S5 is presented which uses a cut-free sequent calculus with additional rules allowing a reduction to normal modal forms. It utilizes the fact that in S5 every formula is equivalent to some 1-degree formula, i.e. a modally flat formula with modal functors having only boolean formulas in their scope. In contrast to many sequent calculi for S5, the presented system does not introduce any extra devices. Thus it is a standard version of SC but (...) 
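The equivalence of every formula to a 1-degree formula rests on S5's modal reduction laws; the familiar collapse principles (standard S5 theorems, not spelled out in the abstract) are:

```latex
\Box\Box A \leftrightarrow \Box A \qquad
\Diamond\Diamond A \leftrightarrow \Diamond A \qquad
\Diamond\Box A \leftrightarrow \Box A \qquad
\Box\Diamond A \leftrightarrow \Diamond A
```

Applying these equivalences inside a formula reduces any nested string of modal operators to its innermost one, leaving a modally flat normal form.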

This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form. 
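For orientation, the intended reading of ιx[F, G] corresponds to the familiar Russellian truth conditions, though on the paper's approach it is the introduction and elimination rules, not a semantic clause like this one, that fix the quantifier's meaning:

```latex
% Russellian gloss of 'The F is G': exactly one F exists, and it is G.
\iota x[F, G] \;\approx\; \exists x\,(Fx \wedge \forall y\,(Fy \rightarrow y = x) \wedge Gx)
```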

This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’ philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and prospects for an account of the meanings of (...) 

In the domain of ontology design, as well as in Knowledge Representation, modeling universals is a challenging problem. Most approaches that have addressed this problem rely on Description Logics (DLs), but many difficulties remain, due to under-constrained representation, which reduces the inferences that can be drawn and further causes problems in expressiveness. In mathematical logic and program checking, type theories have proved to be appealing but, so far, they have not been applied in the formalization of ontologies. To bridge this gap, (...) 

We present an inferentialist account of the epistemic modal operator ‘might’. Our starting point is the bilateralist programme. A bilateralist explains the operator ‘not’ in terms of the speech act of rejection; we explain the operator ‘might’ in terms of weak assertion, a speech act whose existence we argue for on the basis of linguistic evidence. We show that our account of ‘might’ provides a solution to certain well-known puzzles about the semantics of modal vocabulary whilst retaining classical logic. (...) 

Dummett’s justification procedures are revisited. They are used as background for the discussion of some conceptual and technical issues in proof-theoretic semantics, especially the role played by assumptions in proof-theoretic definitions of validity. 

The standard approach to what I call “proof-theoretic semantics”, which is mainly due to Dummett and Prawitz, attempts to give a semantics of proofs by defining what counts as a valid proof. After a discussion of the general aims of proof-theoretic semantics, this paper investigates in detail various notions of proof-theoretic validity and offers certain improvements of the definitions given by Prawitz. Particular emphasis is placed on the relationship between semantic validity concepts and validity concepts used in normalization theory. It (...) 

Semantic justifications of the classical rules of logical inference typically make use of a notion of bivalent truth, understood as a property guaranteed to attach to a sentence or its negation regardless of the prospects for speakers to determine it as so doing. For want of a convincing alternative account of classical logic, some philosophers suspicious of such recognition-transcending bivalence have seen no choice but to declare classical deduction unwarranted and settle for a weaker system; intuitionistic logic in particular, buttressed (...) 

After some motivating remarks in Section 1, in Section 2 we show how to replace an axiomatic basis for any one of a broad range of sentential logics having finitely many axiom schemes and Modus Ponens as the sole proper rule, by a basis with the same axiom schemes and finitely many one-premiss rules. Section 3 mentions some questions arising from this replacement procedure, explores another such procedure, and discusses some aspects of the consequence relations associated with the different (...) 

Peter Milne (2007) poses two challenges to the inferential theorist of meaning. This study responds to both. First, it argues that the method of natural deduction idealizes the essential details of correct informal deductive reasoning. Secondly, it explains how rules of inference in free logic can determine unique senses for the existential quantifier and the identity predicate. The final part of the investigation brings out an underlying order in a basic family of free logics. 

Natural deduction is the type of logic most familiar to current philosophers, and indeed is all that many modern philosophers know about logic. Yet natural deduction is a fairly recent innovation in logic, dating from Gentzen and Jaśkowski in 1934. This article traces the development of natural deduction from the view that these founders embraced to the widespread acceptance of the method in the 1960s. I focus especially on the different choices made by writers of elementary textbooks, the standard conduits of (...) 

What is an inference? Logicians and philosophers have proposed various conceptions of inference. I shall first highlight seven features that contribute to distinguish these conceptions. I shall then compare three conceptions to see which of them best explains the special force that compels us to accept the conclusion of an inference, if we accept its premises. 

Inferentialism is the conviction that to be meaningful in the distinctively human way, or to have a 'conceptual content', is to be governed by a certain kind of inferential rules. The term was coined by Robert Brandom as a label for his theory of language; however, it is also naturally applicable (and is growing increasingly common) within the philosophy of logic. 

This paper responds to recent work in the philosophy of Homotopy Type Theory by James Ladyman and Stuart Presnell. They consider one of the rules for identity, path induction, and justify it along ‘pre-mathematical’ lines. I give an alternate justification based on the philosophical framework of inferentialism. Accordingly, I construct a notion of harmony that allows the inferentialist to say when a connective or concept is meaning-bearing and this conception unifies most of the prominent conceptions of harmony through category theory. (...) 

If correct, Christopher Peacocke’s [20] “manifestationism without verificationism,” would explode the dichotomy between realism and inferentialism in the contemporary philosophy of language. I first explicate Peacocke’s theory, defending it from a criticism of Neil Tennant’s. This involves devising a recursive definition for grasp of logical contents along the lines Peacocke suggests. Unfortunately though, the generalized account reveals the Achilles’ heel of the whole theory. By inventing a new logical operator with the introduction rule for the existential quantifier and the elimination (...) 

David Nelson’s constructive logics with strong negation may be viewed as alternative paraconsistent logics. These logics were developed before da Costa’s works. We address some philosophical aspects of Nelson’s logics and give technical results concerning Kripke models and tableau calculi. We also suggest possible applications of paraconsistent constructive logics. 
