Citations of:
String theory
Journal of Symbolic Logic 39 (4): 625–637 (1974)


We seek means of distinguishing logical knowledge from other kinds of knowledge, especially mathematics. The attempt is restricted to classical two-valued logic and assumes that the basic notion in logic is the proposition. First, we explain the distinction between the parts and the moments of a whole, and theories of 'sortal terms', two theories that will feature prominently. Second, we propose that logic comprises four 'momental sectors': the propositional and the functional calculi, the calculus of asserted propositions, and rules for (...)

Defending or attacking either functionalism or computationalism requires clarity on what they amount to and what evidence counts for or against them. My goal here is not to evaluate their plausibility. My goal is to formulate them and their relationship clearly enough that we can determine which type of evidence is relevant to them. I aim to dispel some sources of confusion that surround functionalism and computationalism, recruit recent philosophical work on mechanisms and computation to shed light on them, and clarify (...)

We study first-order concatenation theory with bounded quantifiers. We give axiomatizations with interesting properties, and we prove some normal-form results. Finally, we prove a number of decidability and undecidability results.

This paper explores the relationship between Hume's Principle and Basic Law V, investigating the question whether we really do need to suppose that, already in Die Grundlagen, Frege intended that HP should be justified by its derivation from Law V.



This paper attempts to address the question of what logical strength theories of truth have by considering such questions as: If you take a theory T and add a theory of truth to it, how strong is the resulting theory, as compared to T? Once the question has been properly formulated, the answer turns out to be about as elegant as one could want: Adding a theory of truth to a finitely axiomatized theory T is more or less equivalent to a (...)

After a short preface, the first of the three sections of this paper is devoted to historical and philosophic aspects of categoricity. The second section is a self-contained exposition, including detailed definitions, of a proof that every mathematical system whose domain is the closure of its set of distinguished individuals under its distinguished functions is categorically characterized by its induction principle together with its true atoms (atomic sentences and negations of atomic sentences). The third section deals with applications, especially those (...)

This paper discusses the history of the confusion and controversies over whether the definition of consequence presented in the 11-page 1936 Tarski consequence-definition paper is based on a monistic fixed-universe framework, like Begriffsschrift and Principia Mathematica. Monistic fixed-universe frameworks, common in pre-WWII logic, keep the range of the individual variables fixed as the class of all individuals. The contrary alternative is that the definition is predicated on a pluralistic multiple-universe framework, like the 1931 Gödel incompleteness paper. A pluralistic multiple-universe framework recognizes multiple (...)

The conservativeness argument poses a dilemma to deflationism about truth, according to which a deflationist theory of truth must be conservative but no adequate theory of truth is conservative. The debate on the conservativeness argument has so far been framed in a specific formal setting, where theories of truth are formulated over arithmetical base theories. I will argue that the appropriate formal setting for evaluating the conservativeness argument is provided not by theories of truth over arithmetic but by those over (...) 

Corcoran, J. 2007. Syntactics. In American Philosophy: An Encyclopedia, eds. John Lachs and Robert Talisse. New York: Routledge, pp. 745–6. / Syntactics, semantics, and pragmatics are the three levels of investigation into semiotics, or the comprehensive study of systems of communication, as described in 1938 by the American philosopher Charles Morris (1903–1979). Syntactics studies signs themselves and their interrelations in abstraction from their meanings and from their uses and users. Semantics studies signs in relation to their meanings, but still in abstraction (...)

SEMANTIC ARITHMETIC: A PREFACE. John Corcoran. Abstract: Number theory, or pure arithmetic, concerns the natural numbers themselves, not the notation used, and in particular not the numerals. String theory, or pure syntax, concerns the numerals as strings of «uninterpreted» characters without regard to the numbers they may be used to denote. Number theory is purely arithmetic; string theory is purely syntactical... in so far as the universe of discourse alone is considered. Semantic arithmetic is a broad subject which begins when (...)

The five English words—sentence, proposition, judgment, statement, and fact—are central to coherent discussion in logic. However, each is ambiguous in that logicians use each with multiple normal meanings. Several of their meanings are vague in the sense of admitting borderline cases. In the course of displaying and describing the phenomena discussed using these words, this paper juxtaposes, distinguishes, and analyzes several senses of these and related words, focusing on a constellation of recommended senses. One of the purposes of this paper (...) 

Guided by questions of scope, this paper provides an overview of what is known about both the scope and, consequently, the limits of Gödel’s famous first incompleteness theorem. 

This paper offers an account of what it is for a physical system to be a computing mechanism—a system that performs computations. A computing mechanism is a mechanism whose function is to generate output strings from input strings and (possibly) internal states, in accordance with a general rule that applies to all relevant strings and depends on the input strings and (possibly) internal states for its application. This account is motivated by reasons endogenous to the philosophy of computing, namely, doing (...) 

History witnesses alternative approaches to “the proposition”. The proposition has been referred to as the object of belief, disbelief, and doubt: generally as the object of propositional attitudes, that which can be said to be believed, disbelieved, understood, etc. It has also been taken to be the object of grasping, judging, assuming, affirming, denying, and inquiring: generally as the object of propositional actions, that which can be said to be grasped, judged true or false, assumed for reasoning purposes, etc. The (...) 

The aim of this paper is to present a new logic-based understanding of the connection between classical kinematics and relativistic kinematics. We show that the axioms of special relativity can be interpreted in the language of classical kinematics. This means that there is a logical translation function from the language of special relativity to the language of classical kinematics which translates the axioms of special relativity into consequences of classical kinematics. We will also show that if we distinguish a class (...)

I offer an explication of the notion of computer, grounded in the practices of computability theorists and computer scientists. I begin by explaining what distinguishes computers from calculators. Then, I offer a systematic taxonomy of kinds of computer, including hardwired versus programmable, general-purpose versus special-purpose, analog versus digital, and serial versus parallel, giving explicit criteria for each kind. My account is mechanistic: which class a system belongs in, and which functions are computable by which system, depends on the system's mechanistic (...)



This paper is the first in a two-part series in which we discuss several notions of completeness for systems of mathematical axioms, with special focus on their interrelations and historical origins in the development of the axiomatic method. We argue that, both from historical and logical points of view, higher-order logic is an appropriate framework for considering such notions, and we consider some open questions in higher-order axiomatics. In addition, we indicate how one can fruitfully extend the usual set-theoretic semantics (...)

This article discusses two coextensive concepts of logical consequence that are implicit in the two fundamental logical practices of establishing validity and invalidity for premise-conclusion arguments. The premises and conclusion of an argument have information content (they 'say' something), and they have subject matter (they are 'about' something). The asymmetry between establishing validity and establishing invalidity has long been noted: validity is established through an information-processing procedure exhibiting a step-by-step deduction of the conclusion from the premise set. Invalidity is established by (...)

According to pancomputationalism, everything is a computing system. In this paper, I distinguish between different varieties of pancomputationalism. I find that although some varieties are more plausible than others, only the strongest variety is relevant to the philosophy of mind, but only the most trivial varieties are true. As a side effect of this exercise, I offer a clarified distinction between computational modelling and computational explanation.

This paper discusses the neologicist approach to the foundations of mathematics by highlighting an issue that arises from looking at the Bad Company objection from an epistemological perspective. For the most part, our issue is independent of the details of any resolution of the Bad Company objection and, as we will show, it concerns other foundational approaches in the philosophy of mathematics. In the first two sections, we give a brief overview of the "Scottish" neologicist school, present a generic form (...) 

A common assumption among philosophers is that every language has at most denumerably many expressions. This assumption plays a prominent role in many philosophical arguments. Recently, formal systems with indenumerably many elements have been developed. These systems are similar to the more familiar denumerable first-order languages. This similarity makes it appear that the assumption is false. We argue that the assumption is true.

The syllogistic figures and moods can be taken to be argument schemata, as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom (...)

A Dedekind algebra is an ordered pair (B, h), where B is a nonempty set and h is a similarity transformation on B. Among the Dedekind algebras is the sequence of the positive integers. From a contemporary perspective, Dedekind established that the second-order theory of the sequence of the positive integers is categorical and finitely axiomatizable. The purpose here is to show that this seemingly isolated result is a consequence of more general results in the model theory of second-order languages. (...)

I propose an account of the metaphysics of the expressions of a mathematical language which brings together the structuralist construal of a mathematical object as a place in a structure, the semantic notion of indexicality and Kit Fine's ontological theory of qua objects. By contrasting this indexical qua objects account with several other accounts of the metaphysics of mathematical expressions, I show that it does justice both to the abstractness that mathematical expressions have because they are mathematical objects and to (...) 

The purpose of this article is to examine aspects of the development of the concept and theory of computability through the theory of recursive functions. Following a brief introduction, Section 2 is devoted to the presuppositions of computability. It focuses on certain concepts, beliefs and theorems necessary for a general property of computability to be formulated and developed into a mathematical theory. The following two sections concern situations in which the presuppositions were realized and the theory of computability was developed. (...) 

The existence of a close connection between results on axiomatic truth and the analysis of truth-theoretic deflationism is nowadays widely recognized. The first attempt to make such a link precise can be traced back to the so-called conservativeness argument due to Leon Horsten, Stewart Shapiro and Jeffrey Ketland: by employing standard Gödelian phenomena, they concluded that deflationism is untenable, as any adequate theory of truth leads to consequences that were not achievable by the base theory alone. In the paper I highlight, (...)

/ A schema (plural: schemata, or schemas), also known as a scheme (plural: schemes), is a linguistic template or pattern together with a rule for using it to specify a potentially infinite multitude of phrases, sentences, or arguments, which are called instances of the schema. Schemas are used in logic to specify rules of inference, in mathematics to describe theories with infinitely many axioms, and in semantics to give adequacy conditions for definitions of truth. / 1. What is a Schema? (...) 