Citations of:
This short paper has two loosely connected parts. In the first part, I discuss the difference between classical and intuitionist logic in relation to the different roles hypotheses play in each logic. Harmony is normally understood as a relation between two ways of manipulating formulas in systems of natural deduction: their introduction and elimination. I argue, however, that there is at least a third way of manipulating formulas, namely the discharge of assumptions, and that the difference between classical and (...)

In the present paper I wish to regard constructive logic as a self-contained system for the treatment of epistemological issues; the explanations of the constructivist logical notions are cast in an epistemological mold from the outset. The discussion offered here intends to make explicit this implicit epistemic character of constructivism. Particular attention will be given to the intended interpretation laid down by Heyting. This interpretation, especially as refined in the type-theoretical work of Per Martin-Löf, puts the system on a par with the early efforts of Frege and Whitehead–Russell. This quite (...)

Kosta Došen argued in his papers Inferential Semantics (in Wansing, H. (ed.) Dag Prawitz on Proofs and Meaning, pp. 147–162. Springer, Berlin 2015) and On the Paths of Categories (in Piecha, T., Schroeder-Heister, P. (eds.) Advances in Proof-Theoretic Semantics, pp. 65–77. Springer, Cham 2016) that the propositions-as-types paradigm is less suited for general proof theory because, unlike proof theory based on category theory, it emphasizes categorical proofs over hypothetical inferences. One specific instance of this, Došen points out, is that the Curry–Howard isomorphism (...)

Proofs from assumptions are amongst the most fundamental reasoning techniques. Yet the precise nature of assumptions is still an open topic. One of the most prominent conceptions is the placeholder view of assumptions generally associated with natural deduction for intuitionistic propositional logic. It views assumptions essentially as holes in proofs, either to be filled with closed proofs of the corresponding propositions via substitution or withdrawn as a side effect of some rule, thus in effect making them an auxiliary notion subservient (...) 

The hypothetical notion of consequence is normally understood as the transmission of a categorical notion from premisses to conclusion. In model-theoretic semantics this categorical notion is 'truth', in standard proof-theoretic semantics it is 'canonical provability'. Three underlying dogmas, (I) the priority of the categorical over the hypothetical, (II) the transmission view of consequence, and (III) the identification of consequence and correctness of inference are criticized from an alternative view of proof-theoretic semantics. It is argued that consequence is a basic semantical (...)

From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-well-founded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in the form of the cut rule is seen as a crucial (...)

The main aim of the present paper is to use a proof system for hybrid modal logic to formalize what are called false-belief tasks in cognitive psychology, thereby investigating the interplay between cognition and logical reasoning about belief. We consider two different versions of the Smarties task, involving respectively a shift of perspective to another person and to another time. Our formalizations disclose that despite this difference, the two versions of the Smarties task have exactly the same underlying logical structure. (...)

I interpret Mill's view on logic as the instrumentalist view that logical inferences, complex statements, and logical operators are not necessary for reasoning itself, but are useful only for our remembering and communicating the results of the reasoning. To defend this view, I first show that we can transform all the complex statements in the language of classical first-order logic into what I call material inference rules and reduce logical inferences to inferences which involve only atomic statements and the material (...)

This paper provides finitary jointly necessary and sufficient acceptance and rejection conditions for the logical constants of a first-order quantificational language. By introducing the notion of making an assignment as a distinct object-level practice (something you do with a sentence, as opposed to a meta-level semantic notion) and combining this with the practice of (hypothetical and categorical) acceptance and rejection and the practice of making suppositions, one gains a structure that is sufficiently rich to fully characterize the class of classical (...)

I here argue for a particular formulation of truth-deflationism, namely, the propositionally quantified formula, (Q) “For all p, <p> is true iff p”. The main argument consists of an enumeration of the other (five) possible formulations and criticisms thereof. Notably, Horwich’s Minimal Theory is found objectionable in that it cannot be accepted by finite beings. Other formulations err in not providing non-question-begging, sufficiently direct derivations of the T-schema instances. I end by defending (Q) against various objections. In particular, I argue (...)

The representational nature of human cognition and thought in general has been a source of controversies. This is particularly so in the context of studies of unconscious cognition, in which representations tend to be ontologically and structurally segregated with regard to their conscious status. However, it appears evolutionarily and developmentally unwarranted to posit such segregations, as, otherwise, artifact structures and ontologies must be concocted to explain them from the viewpoint of the human cognitive architecture. Here, from a by-and-large Classical cognitivist viewpoint, (...)



Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization, both semantic and proof-theoretical, of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic: formality, strong modal force, generality, (...)

We discuss relationships among the existence property, the disjunction property, and their weak variants in the setting of intermediate predicate logics. We deal with the weak and sentential existence properties, and the Z-normality, which is a weak variant of the disjunction property. These weak variants were presented in the author’s previous paper [16]. In the present paper, the Kripke sheaf semantics is used.

This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’ philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and prospects for an account of the meanings of (...)

The focus of this paper is Dummett's meaning-theoretical arguments against classical logic based on considerations about the meaning of negation. Using Dummettian principles, I shall outline three such arguments, of increasing strength, and show that they are unsuccessful by giving responses to each argument on behalf of the classical logician. What is crucial is that in responding to these arguments a classicist need not challenge any of the basic assumptions of Dummett's outlook on the theory of meaning. In particular, I (...)



We develop a novel solution to the negation version of the Frege–Geach problem by taking up recent insights from the bilateral programme in logic. Bilateralists derive the meaning of negation from a primitive *B-type* inconsistency involving the attitudes of assent and dissent. Some may demand an explanation of this inconsistency in simpler terms, but we argue that bilateralism’s assumptions are no less explanatory than those of *A-type* semantics that only require a single primitive attitude, but must stipulate inconsistency elsewhere. Based (...)

We present a proof of the equivalence between two deductive systems for constructive necessity, namely an axiomatic characterisation inspired by Hakli and Negri's system of derivations from assumptions for modal logic, a Hilbert-style formalism designed to ensure the validity of the deduction theorem, and the judgmental reconstruction given by Pfenning and Davies by means of a natural deduction approach that makes a distinction between valid and true formulae, constructively. Both systems and the proof of their equivalence are formally verified (...)

According to the "manifestation argument", a realist truth-conditional semantics contradicts the idea that understanding sentences is an ability that must be capable of manifesting itself in action. The paper shows that realists as well as anti-realists have to give up the thesis that understanding a sentence consists in grasping its truth conditions. The realist assumption of the existence of verification-transcendent truths contradicts a truth-conditional semantics independently of the manifestation principle. The theories of understanding advocated by present-day anti-realists are (...)

We approach the topic of solution equivalence of propositional problems from the perspective of the non-constructive procedural theory of problems based on Transparent Intensional Logic (TIL). The answer we put forward is that two solutions are equivalent if and only if they have equivalent solution concepts. Solution concepts can be understood as a generalization of the notion of proof objects from the Curry–Howard isomorphism.
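The notion of proof objects from the Curry–Howard isomorphism, which the abstract above generalizes, can be illustrated with a minimal sketch. This is an editorial illustration in Haskell, not the paper's own formalism (the paper works in TIL): propositions are read as types, and a proof of a proposition is a program of that type.

```haskell
-- Under Curry–Howard: conjunction is the pair type, implication is the
-- function type, disjunction is Either. A program of a type is a proof
-- object for the corresponding proposition.

-- Proof object for A ∧ B → B ∧ A (commutativity of conjunction):
andComm :: (a, b) -> (b, a)
andComm (x, y) = (y, x)

-- Proof object for A → (A ∨ B) (left disjunction introduction):
orIntroL :: a -> Either a b
orIntroL = Left

main :: IO ()
main = print (andComm (1 :: Int, "two"))  -- prints ("two",1)
```

Two proofs of the same proposition can then be compared as programs, which is the kind of identity-of-proofs question the paper's "solution concepts" are meant to answer.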

We investigate the development of terms during cut-elimination in first-order logic and Peano arithmetic for proofs of existential formulas. The form of witness terms in cut-free proofs is characterized in terms of structured combinations of basic substitutions. Based on this result, a regular tree grammar computing witness terms is given and a class of proofs is shown to have only elementary cut-elimination.

This paper explores the main philosophical approaches of David Hilbert’s theory of proof. Specifically, it focuses on his ideas regarding logic, the concept of proof, the axiomatic method, the concept of truth, metamathematics, a priori knowledge, and the general nature of scientific knowledge. The aim is to show and characterize his epistemological approach to the foundation of knowledge, where logic appears as a guarantee of that foundation. Hilbert supposes that the propositional apriorism, proposed by him to support mathematics, sustains (...)

In the domain of ontology design as well as in Knowledge Representation, modeling universals is a challenging problem. Most approaches that have addressed this problem rely on Description Logics (DLs), but many difficulties remain due to under-constrained representation, which reduces the inferences that can be drawn and further causes problems in expressiveness. In mathematical logic and program checking, type theories have proved to be appealing, but so far they have not been applied in the formalization of ontologies. To bridge this gap, (...)

We present an inferentialist account of the epistemic modal operator 'might'. Our starting point is the bilateralist programme. A bilateralist explains the operator 'not' in terms of the speech act of rejection; we explain the operator 'might' in terms of weak assertion, a speech act whose existence we argue for on the basis of linguistic evidence. We show that our account of 'might' provides a solution to certain well-known puzzles about the semantics of modal vocabulary whilst retaining classical logic. (...)

I here defend a theory consisting of four claims about ‘property’ and properties, and argue that they form a coherent whole that can solve various serious problems. The claims are (1): ‘property’ is defined by the principles (PR): ‘Fness/Being F/etc. is a property of x iff F’ and (PA): ‘Fness/Being F/etc. is a property’; (2) the function of ‘property’ is to increase the expressive power of English, roughly by mimicking quantification into predicate position; (3) property talk should be understood at (...) 

Hilbert’s choice operators τ and ε, when added to intuitionistic logic, strengthen it. In the presence of certain extensionality axioms they produce classical logic, while in the presence of weaker decidability conditions for terms they produce various superintuitionistic intermediate logics. In this thesis, I argue that there are important philosophical lessons to be learned from these results. To make the case, I begin with a historical discussion situating the development of Hilbert’s operators in relation to his evolving program in the (...) 

This article offers an overview of inferential role semantics. We aim to provide a map of the terrain as well as to challenge some of the inferentialist’s standard commitments. We begin by introducing inferentialism and placing it in the wider context of contemporary philosophy of language. §2 focuses on what is standardly considered both the most important test case for and the most natural application of inferential role semantics: the case of the logical constants. We discuss some of the (alleged) benefits (...)

Dummett’s justification procedures are revisited. They are used as background for the discussion of some conceptual and technical issues in prooftheoretic semantics, especially the role played by assumptions in prooftheoretic definitions of validity. 

The main objective of this PhD Thesis is to present a method of obtaining strong normalization via natural ordinal, which is applicable to natural deduction systems and typed lambda calculus. The method includes (a) the definition of a numerical assignment that associates each derivation (or lambda term) with a natural number and (b) the proof that this assignment decreases with reductions of maximal formulas (or redexes). Besides, because the numerical assignment used coincides with the length of a specific sequence of (...)

Second-order abstract categorial grammars (de Groote in Association for computational linguistics, 39th annual meeting and 10th conference of the European chapter, proceedings of the conference, pp. 148–155, 2001) and hyperedge replacement grammars (Bauderon and Courcelle in Math Syst Theory 20:83–127, 1987; Habel and Kreowski in STACS 87: 4th Annual symposium on theoretical aspects of computer science. Lecture notes in computer science, vol 247, Springer, Berlin, pp 207–219, 1987) are two natural ways of generalizing “context-free” grammar formalisms for string and tree (...)

The standard approach to what I call “proof-theoretic semantics”, which is mainly due to Dummett and Prawitz, attempts to give a semantics of proofs by defining what counts as a valid proof. After a discussion of the general aims of proof-theoretic semantics, this paper investigates in detail various notions of proof-theoretic validity and offers certain improvements of the definitions given by Prawitz. Particular emphasis is placed on the relationship between semantic validity concepts and validity concepts used in normalization theory. It (...)

I argue that there are two ways of construing Wittgenstein’s slogan that meaning is use. One accepts the view that the notion of meaning must be explained in terms of truth-theoretic notions and is committed to the epistemic conception of truth. The other keeps the notion of meaning and the truth-theoretic notions apart and is not committed to the epistemic conception of truth. I argue that Dummett endorses the first way of construing Wittgenstein’s slogan. I address the issue by discussing (...)

We examine the sense in which logic is a priori, and explain how mathematical theories can be dichotomized nontrivially into analytic and synthetic portions. We argue that Core Logic contains exactly the a-priori-because-analytically-valid deductive principles. We introduce the reader to Core Logic by explaining its relationship to other logical systems, and stating its rules of inference. Important metatheorems about Core Logic are reported, and its important features noted. Core Logic can serve as the basis for a foundational program that could (...)

A proof employing no semantic terms is offered in support of the claim that there can be truths without truthmakers. The logical resources used in the proof are weak but do include the structural rule Contraction. 

Semantic justifications of the classical rules of logical inference typically make use of a notion of bivalent truth, understood as a property guaranteed to attach to a sentence or its negation regardless of the prospects for speakers to determine it as so doing. For want of a convincing alternative account of classical logic, some philosophers suspicious of such recognition-transcending bivalence have seen no choice but to declare classical deduction unwarranted and settle for a weaker system; intuitionistic logic in particular, buttressed (...)

Various possibilities, that one is dreaming, that one is being deceived by a deceitful demon, that one is a brain in the vat being stimulated to think one has a body and is in a regular world, have been invoked to show that all one's experiencebased beliefs might be false. Descartes in Meditation I advises that in order not to lapse into his careless everyday view of things he, or at least his meditator, should pretend that all his experiencebased beliefs, (...) 

I present a new syntactical method for proving the Interpolation Theorem for the implicational fragment of intuitionistic logic and its substructural subsystems. This method, like Prawitz’s, works on natural deductions rather than sequent derivations, and, unlike existing methods, always finds a ‘strongest’ interpolant under a certain restricted but reasonable notion of what counts as an ‘interpolant’. 

The interplay of introduction and elimination rules for propositional connectives is often seen as suggesting a distinguished role for intuitionistic logic. We prove three formal results concerning intuitionistic propositional logic that bear on that perspective, and discuss their significance. First, for a range of connectives including both negation and the falsum, there are no classically or intuitionistically correct introduction rules. Second, irrespective of the choice of negation or the falsum as a primitive connective, classical and intuitionistic consequence satisfy exactly the (...) 

In this paper, we provide a general setting under which results of normalization of proof trees such as, for instance, the logicality result in equational reasoning and the cut-elimination property in sequent or natural deduction calculi, can be unified and generalized. This is achieved by giving simple conditions which are sufficient to ensure that such normalization results hold, and which can be automatically checked since they are syntactical. These conditions are based on basic properties of elementary combinations of inference rules (...)

Peter Milne (2007) poses two challenges to the inferential theorist of meaning. This study responds to both. First, it argues that the method of natural deduction idealizes the essential details of correct informal deductive reasoning. Secondly, it explains how rules of inference in free logic can determine unique senses for the existential quantifier and the identity predicate. The final part of the investigation brings out an underlying order in a basic family of free logics. 

A new formalism for predicate logic is introduced, with a non-standard method of binding variables, which allows a compositional formalization of certain anaphoric constructions, including donkey sentences and cross-sentential anaphora. A proof system in natural deduction format is provided, and the formalism is compared with other accounts of this type of anaphora, in particular Dynamic Predicate Logic.

Various natural deduction formulations of classical, minimal, intuitionist, and intermediate propositional and first-order logics are presented and investigated with respect to satisfaction of the separation and subformula properties. The technique employed is, for the most part, semantic, based on general versions of the Lindenbaum and Lindenbaum–Henkin constructions. Careful attention is paid to which properties of theories result in the presence of which rules of inference, and to restrictions on the sets of formulas to which the rules may be employed, restrictions (...)

Inferentialism claims that expressions are meaningful by virtue of rules governing their use. In particular, logical expressions are autonomous if given meaning by their introduction rules, rules specifying the grounds for assertion of propositions containing them. If the elimination rules do no more, and no less, than is justified by the introduction rules, the rules satisfy what Prawitz, following Lorenzen, called an inversion principle. This connection between rules leads to a general form of elimination rule, and when the rules have this form, they may (...)

I discuss Prawitz’s claim that a non-reliabilist answer to the question “What is a proof?” compels us to reject the standard Bolzano–Tarski account of validity, and to account for the meaning of a sentence in broadly verificationist terms. I sketch what I take to be a possible way of resisting Prawitz’s claim, one that concedes the anti-reliabilist assumption from which Prawitz’s argument proceeds.

This paper aims to unite two seemingly disparate themes in the philosophy of mathematics and language respectively, namely ante rem structuralism and inferentialism. My analysis begins with describing both frameworks in accordance with their genesis in the work of Hilbert. I then draw comparisons between these philosophical views in terms of their similar motivations and similar objections to the referential orthodoxy. I specifically home in on two points of comparison, namely the role of norms and the relation of ontological dependence (...) 

This is a companion paper to Braüner, where a natural deduction system for propositional hybrid logic is given. In the present paper we generalize the system to the first-order case. Our natural deduction system for first-order hybrid logic can be extended with additional inference rules corresponding to conditions on the accessibility relations and the quantifier domains expressed by so-called geometric theories. We prove soundness and completeness and we prove a normalisation theorem. Moreover, we give an axiom system for first-order hybrid logic.

This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections (...)

