The syllogistic figures and moods can be taken to be argument schemata, as can the rules of Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory, where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo’s second-order Separation Axiom is approximated by Fraenkel’s first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components, one of which is a template-text or scheme-template: a syntactic string composed of one or more “blanks” and possibly also significant words and/or symbols. In accordance with a side condition, the template-text of a schema is used as a “template” to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may, but need not, be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions ‘schema’ and ‘scheme’ in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as those engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata.
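The component structure just described (a template-text with blanks, a side condition, and the instances obtained by filling the blanks) can be sketched concretely. The following toy Python fragment is an illustration only; the excluded-middle template and the crude side condition are invented for the example and are not drawn from the article.

```python
# A minimal sketch of the schema components described above: a template-text
# containing a blank ('___'), a side condition, and the instances obtained
# by filling every blank. The excluded-middle template is a made-up example.

def instances(template, candidates, side_condition):
    """Fill every blank in the template with each candidate passing the side condition."""
    return [template.replace("___", c) for c in candidates if side_condition(c)]

# Template-text with two occurrences of the blank.
template = "___ or it is not the case that ___"

# Side condition (a crude stand-in): the filling must be a nonempty string
# that is not itself a blank.
def side_condition(s):
    return bool(s) and "___" not in s

for inst in instances(template, ["snow is white", "grass is green"], side_condition):
    print(inst)
```

Each candidate that passes the side condition yields one instance; an infinite candidate supply would yield the "multitude, often infinite" of instances mentioned above.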
Review of Karel Lambert, Meinong and the Principle of Independence: Its Place in Meinong's Theory of Objects and Its Significance in Contemporary Philosophical Logic.
In this paper I sketch some arguments that underlie Hegel’s chapter on judgment, and I attempt to place them within a broad tradition in the history of logic. Focusing on his analysis of simple predicative assertions or ‘positive judgments’, I first argue that Hegel supplies an instructive alternative to the classical technique of existential quantification. The main advantage of his theory lies in his treatment of the ontological implications of judgments, implications that are inadequately captured by quantification. The second concern of this paper is the manner in which Hegel makes logic not only dependent on ontology generally, but also variant in regard to domains of objects. In other words, he offers a domain-specific logical theory, according to which the form of judgment or inference is specific to the subject of judgment. My third concern lies with the metaphilosophical consequences of this theory, and this includes some more familiar Hegelian themes. It is well known that Hegel frequently questioned the adequacy of the sentential form for expressing higher-order truths. My reading of his theory of predication explains and contextualizes this tendency by demystifying notions like the so-called speculative proposition.
Corcoran, John. 2005. Meanings of word: type-occurrence-token. Bulletin of Symbolic Logic 11(2005) 117.

Once we are aware of the various senses of ‘word’, we realize that self-referential statements use ambiguous sentences. If a statement is made using the sentence ‘this is a pronoun’, is the speaker referring to an interpreted string, a string-type, a string-occurrence, a string-token, or what? The listeners can wonder “this what?”.

John Corcoran, Meanings of word: type-occurrence-token. Philosophy, University at Buffalo, Buffalo, NY 14260-4150. E-mail: corcoran@buffalo.edu. The four-letter written-English expression ‘word’, which plays important roles in applications and expositions of logic and philosophy of logic, is ambiguous (multisense, or polysemic) in that it has multiple normal meanings (senses, or definitions). Several of its meanings are vague (imprecise, or indefinite) in that they admit of borderline (marginal, or fringe) cases. This paper juxtaposes, distinguishes, and analyses several senses of ‘word’, focusing on a constellation of senses analogous to constellations of senses of other expression words such as ‘expression’, ‘symbol’, ‘character’, ‘letter’, ‘term’, ‘phrase’, ‘formula’, ‘sentence’, ‘derivation’, ‘paragraph’, and ‘discourse’. Consider, e.g., the word ‘letter’. In one sense there are exactly twenty-six letters (letter-types or ideal letters) in the English alphabet and there are exactly four letters in the word ‘letter’. In another sense, there are exactly six letters (letter-repetitions or letter-occurrences) in the word-type ‘letter’. In yet another sense, every new inscription (act of writing or printing) of ‘letter’ brings into existence six new letters (letter-tokens or ink-letters) and one new word that had not previously existed. The number of letter-occurrences (occurrences of a letter-type) in a given word-type is the same as the number of letter-tokens (tokens of a letter-type) in a single token of the given word.
Many logicians fail to distinguish “token” from “occurrence” and a few actually confuse the two concepts. Epistemological and ontological problems concerning word-types, word-occurrences, and word-tokens are described in philosophically neutral terms. This paper presents a theoretical framework of concepts and principles concerning logicography, including the use of English in logic. The framework is applied to analytical exposition and critical evaluation of classic passages in the works of philosophers and logicians including Boole, Peirce, Frege, Russell, Tarski, Church and Quine. This paper is intended as a philosophical sequel to Corcoran et al., “String Theory”, Journal of Symbolic Logic 39(1974) 625-637. https://www.academia.edu/s/cdfa6c854e?source=link
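The arithmetic of the ‘letter’ example above (four letter-types, six letter-occurrences) can be checked mechanically. Letter-tokens, being physical inscriptions, are not things a program can count, so this sketch models only the first two senses.

```python
# The 'letter' example worked through above: four letter-types but six
# letter-occurrences in the word-type 'letter'. Letter-tokens (physical
# inscriptions) fall outside what a program can model.

word = "letter"
letter_types = sorted(set(word))   # distinct letter-types occurring in the word-type
occurrences = len(word)            # letter-occurrences in the word-type

print(letter_types)                # ['e', 'l', 'r', 't']
print(len(letter_types), occurrences)
```

The distinction the paper presses is visible in the two counts: `set` collapses repeated occurrences into types, while `len` of the string counts occurrences.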
Tarski’s Convention T—presenting his notion of an adequate definition of truth (sic)—contains two conditions: alpha and beta. Alpha requires that all instances of a certain T schema be provable. Beta requires in effect the provability of ‘every truth is a sentence’. Beta formally recognizes the fact, repeatedly emphasized by Tarski, that sentences (devoid of free variable occurrences)—as opposed to pre-sentences (having free occurrences of variables)—exhaust the range of significance of ‘is true’. In Tarski’s preferred usage, it is part of the meaning of ‘true’ that attribution of being true to a given thing presupposes the thing is a sentence. Beta’s importance is further highlighted by the fact that alpha can be satisfied using the recursively definable concept of being satisfied by every infinite sequence, which Tarski explicitly rejects. Moreover, in Definition 23, the famous truth-definition, Tarski supplements “being satisfied by every infinite sequence” by adding the condition “being a sentence”. Even where truth is undefinable and treated by Tarski axiomatically, he adds as an explicit axiom a sentence to the effect that every truth is a sentence. Surprisingly, the sentence just before the presentation of Convention T seems to imply that alpha alone might be sufficient. Even more surprising is the sentence just after Convention T saying beta “is not essential”. Why include a condition if it is not essential? Tarski says nothing about this dissonance. Considering the broader context, the Polish original, the German translation from which the English was derived, and other sources, we attempt to determine what Tarski might have intended by the two troubling sentences which, as they stand, are contrary to the spirit, if not the letter, of several other passages in Tarski’s corpus.
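Condition alpha concerns instances of a T schema, and a toy generator makes the shape of those instances vivid. The sketch below is a simplification for illustration only: it names sentences by simple quotation rather than by Tarski’s structural-descriptive names, and the example sentence is the stock one, not taken from the paper.

```python
def t_sentence(s):
    """Form a T-schema instance for the sentence s, naming it by simple
    quotation (a simplification of Tarski's structural-descriptive names)."""
    return f"'{s}' is true if and only if {s}"

print(t_sentence("snow is white"))
# 'snow is white' is true if and only if snow is white
```

Alpha requires that every such biconditional be provable in the metatheory; beta adds, over and above this, that nothing outside the sentences falls in the range of ‘is true’.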
This paper takes up two tasks. One is to elaborate on the relationship of inductive logic to decision theory, to which the later Carnap planned to apply his system (§§1-7); this is the survey side of the article. The other is to reveal a property of our predictions of the future, namely subjectivity (§§8-11); this is its philosophical aspect. Both are discussed under the name of belief in causation. Belief in causation is a kind of “degree of belief” formed about the causal effect of an action. As such, it admits of analysis by inductive logic.
In the author’s previous contribution to this journal (Rosen 2015), a phenomenological string theory was proposed based on qualitative topology and hypercomplex numbers. The current paper takes this further by delving into the ancient Chinese origin of phenomenological string theory. First, we discover a connection between the Klein bottle, which is crucial to the theory, and the Ho-t’u, a Chinese number archetype central to Taoist cosmology. The two structures are seen to mirror each other in expressing the psychophysical (phenomenological) action pattern at the heart of microphysics. But tackling the question of quantum gravity requires that a whole family of topological dimensions be brought into play. What we find in engaging with these structures is a closely related family of Taoist forebears that, in concert with their successors, provide a blueprint for cosmic evolution. Whereas conventional string theory accounts for the generation of nature’s fundamental forces via a notion of symmetry breaking that is essentially static and thus unable to explain cosmogony successfully, phenomenological/Taoist string theory entails the dialectical interplay of symmetry and asymmetry inherent in the principle of synsymmetry. This dynamic concept of cosmic change is elaborated on in the three concluding sections of the paper. Here, a detailed analysis of cosmogony is offered, first in terms of the theory of dimensional development and its Taoist (yin-yang) counterpart, then in terms of the evolution of the elemental force particles through cycles of expansion and contraction in a spiraling universe. The paper closes by considering the role of the analyst per se in the further evolution of the cosmos.
This article had its start with another article, concerned with measuring the speed of gravitational waves - "The Measurement of the Light Deflection from Jupiter: Experimental Results" by Ed Fomalont and Sergei Kopeikin (2003), The Astrophysical Journal 598 (1): 704–711. This starting-point led to many other topics that required explanation or naturally seemed to follow on – Unification of gravity with electromagnetism and the 2 nuclear forces, Speed of electromagnetic waves, Energy of cosmic rays and UHECRs, Digital string theory, Dark energy+gravity+binary digits, Cosmic strings and wormholes from Figure-8 Klein bottles, Massless and massive photons and gravitons, Inverse square+quantum entanglement = God+evolution, Binary digits projected to make Prof. Greene’s cosmic holographic movie, Renormalization of infinity, Colliding subuniverses, Unifying cosmic inflation, TOE (emphasizing “EVERYthing”) = Bose-Einstein renormalized. The text also addresses (in a nonmathematical way) the wavelength of electromagnetic waves, the frequency of gravitational waves, gravitational and electromagnetic waves having identical speed, the gamma-ray burst designated GRB 090510, the smoothness of space, and includes these words – “Gravity produces electromagnetism. Retrocausally (by means of humans travelling into the past of this subuniverse with their electronics), this “Cosmic EM Background” produces base-2 mathematics, which produces gravity. EM interacts with gravity to produce particles, mass – gravity/EM could be termed “the Higgs field” - and the nuclear forces associated with those particles. It makes gravity using BITS that copy the principle of magnetism attracting and repelling, before pasting it into what we call the strong force and dark energy.”
In this paper, I offer an analysis of the radical disagreement over the adequacy of string theory. The prominence of string theory despite its notorious lack of empirical support is sometimes explained as a troubling case of science gone awry, driven largely by sociological mechanisms such as groupthink (e.g. Smolin 2006). Others, such as Dawid (2013), explain the controversy by positing a methodological revolution of sorts, according to which string theorists have quietly turned to non-empirical methods of theory assessment given the technological inability to directly test the theory. The appropriate response, according to Dawid, is to acknowledge this development and widen the canons of acceptable scientific methods. As I’ll argue, however, the current situation in fundamental physics does not require either of these responses. Rather, as I’ll suggest, much of the controversy stems from a failure to properly distinguish the “context of justification” from the “context of pursuit”. Both those who accuse string theorists of betraying the scientific method and those who advocate an enlarged conception of scientific methodology objectionably conflate epistemic justification with judgements of pursuit-worthiness. Once we get clear about this distinction and about the different norms governing the two contexts, the current situation in fundamental physics becomes much less puzzling. After defending this diagnosis of the controversy, I’ll show how the argument patterns that have been posited by Dawid as constituting an emergent methodological revolution in science are better off if reworked as arguments belonging to the context of pursuit.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
For each positive integer n, two alternative axiomatizations of the theory of strings over n alphabetic characters are presented. One class of axiomatizations derives from Tarski’s system of the Wahrheitsbegriff and uses the n characters and concatenation as primitives. The other class involves using n character-prefixing operators as primitives and derives from Hermes’ Semiotik. All underlying logics are second order. It is shown that, for each n, the two theories are definitionally equivalent [or synonymous in the sense of de Bouvère]. It is further shown that each member of one class is synonymous with each member of the other class; thus all of the theories are definitionally equivalent with each other and with Peano arithmetic. Categoricity of Peano arithmetic then implies categoricity of each of the above theories.
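One way to make the equivalence with Peano arithmetic vivid is the standard bijection between strings over n characters and the natural numbers, known as bijective base-n numeration. The sketch below is an independent illustration of that well-known bijection, not the definitional interpretation constructed in the paper.

```python
# Bijective base-n numeration: a bijection between strings over an n-letter
# alphabet and the natural numbers. This illustrates how string theories and
# number theories can encode one another.

def encode(s, alphabet):
    """Map a string over the alphabet to a natural number
    ('' -> 0, first letter -> 1, second letter -> 2, ...)."""
    n = len(alphabet)
    value = 0
    for ch in s:
        value = value * n + alphabet.index(ch) + 1
    return value

def decode(value, alphabet):
    """Inverse of encode: recover the string from its number."""
    n = len(alphabet)
    chars = []
    while value > 0:
        value, r = divmod(value - 1, n)
        chars.append(alphabet[r])
    return "".join(reversed(chars))

alphabet = "ab"  # n = 2
# '' -> 0, 'a' -> 1, 'b' -> 2, 'aa' -> 3, 'ab' -> 4, 'ba' -> 5, 'bb' -> 6, ...
assert all(decode(encode(s, alphabet), alphabet) == s
           for s in ["", "a", "b", "ab", "bba"])
```

Under this bijection, concatenation and character-prefixing on strings correspond to arithmetically definable operations on numbers, which is the intuition behind the definitional equivalence with Peano arithmetic.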
This paper is a contribution to graded model theory, in the context of mathematical fuzzy logic. We study characterizations of classes of graded structures in terms of the syntactic form of their first-order axiomatization. We focus on classes given by universal and universal-existential sentences. In particular, we prove two amalgamation results using the technique of diagrams in the setting of structures valued on a finite MTL-algebra, from which analogues of the Łoś–Tarski and the Chang–Łoś–Suszko preservation theorems follow.
Alfred Tarski (1901–1983) is widely regarded as one of the two giants of twentieth-century logic and also as one of the four greatest logicians of all time (Aristotle, Frege and Gödel being the other three). Of the four, Tarski was the most prolific as a logician. The four volumes of his collected papers, which exclude most of his 19 monographs, span over 2500 pages. Aristotle’s writings are comparable in volume, but most of the Aristotelian corpus is not about logic, whereas virtually everything written by Tarski concerns logic more or less directly. There is no doubt that Tarski wrote more on logic than any other author; he started publishing on logic in 1921 at the age of 20 and continued until his death at the age of 82. Two of his works appeared posthumously [Hist. Philos. Logic 7 (1986), no. 2, 143–154; MR0868748 (88b:03010); Tarski and Givant, A formalization of set theory without variables, Amer. Math. Soc., Providence, RI, 1987; MR0920815 (89g:03012)]. Tarski’s voluminous writings were widely scattered in numerous journals, some quite rare. It has been extremely difficult to study the development of Tarski’s thought and to trace the interconnections and interdependence of his various papers. Thanks to the present collection all this has changed, and it is likely that the increased accessibility of Tarski’s papers will have the effect of increasing Tarski’s already enormous influence.
How does logic relate to rational belief? Is logic normative for belief, as some say? What, if anything, do facts about logical consequence tell us about norms of doxastic rationality? In this paper, we consider a range of putative logic-rationality bridge principles. These purport to relate facts about logical consequence to norms that govern the rationality of our beliefs and credences. To investigate these principles, we deploy a novel approach, namely, epistemic utility theory. That is, we assume that doxastic attitudes have different epistemic value depending on how accurately they represent the world. We then use the principles of decision theory to determine which of the putative logic-rationality bridge principles we can derive from considerations of epistemic utility.
This paper is concerned with representations of belief by means of non-additive probabilities of the Dempster-Shafer (DS) type. After surveying some foundational issues and results in the DS theory, including Suppes’s related contributions, the paper proceeds to analyze the connection of the DS theory with some of the work currently pursued in epistemic logic. A preliminary investigation of the modal logic of belief functions à la Shafer is made. There it is shown that the Alchourrón-Gärdenfors-Makinson (AGM) logic of belief change is closely related to the DS theory. The final section compares the critique of Bayesianism which underlies the present paper with some important objections raised by Suppes against this doctrine.
DEFINING OUR TERMS

A “paradox” is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory” is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis” is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis” is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since ‘appears’, ‘believes’ and ‘knows’ all make elliptical reference to a participant, it is clear that ‘paradox’, ‘indirect proof’ and ‘direct proof’ are all participant-relative.

PARTICIPANT RELATIVITY

In normal mathematical writing the participant is presumed to be “the community of mathematicians” or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof.

WHAT IS A PARADOX’S SOLUTION?

Of the above four sorts of argumentation only the paradox invites “solution” or “resolution”, and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning” of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true.
Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy”, as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant’s state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time.

The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell’s Principles of mathematics [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell’s article on type theory as well as Ernst Zermelo’s two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal”, “the largest ordinal”, the well-ordering principle, “the well-ordering of the continuum”, denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion.

RECOMMENDATION

This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading.
AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and Logic” [249], which covers much of the same ground in much more detail. https://www.academia.edu/14089432/Argumentations_and_Logic
Philosophers are divided on whether the proof- or truth-theoretic approach to logic is more fruitful. The paper demonstrates the considerable explanatory power of a truth-based approach to logic by showing that and how it can provide (i) an explanatory characterization—both semantic and proof-theoretical—of logical inference, (ii) an explanatory criterion for logical constants and operators, (iii) an explanatory account of logic’s role (function) in knowledge, as well as explanations of (iv) the characteristic features of logic—formality, strong modal force, generality, topic neutrality, basicness, and (quasi-)apriority, (v) the veridicality of logic and its applicability to science, (vi) the normativity of logic, (vii) error, revision, and expansion in/of logic, and (viii) the relation between logic and mathematics. The high explanatory power of the truth-theoretic approach does not rule out an equal or even higher explanatory power of the proof-theoretic approach. But to the extent that the truth-theoretic approach is shown to be highly explanatory, it sets a standard for other approaches to logic, including the proof-theoretic approach.
Principles are central to physical reasoning, particularly in the search for a theory of quantum gravity (QG), where novel empirical data is lacking. One principle widely adopted in the search for QG is UV completion: the idea that a theory should (formally) hold up to all possible high energies. We argue, contra standard scientific practice, that UV completion is poorly motivated as a guiding principle in theory-construction, and cannot be used as a criterion of theory-justification in the search for QG. To this end, we explore the reasons for expecting, or desiring, a UV-complete theory, as well as analyse how UV completion is used, and how it should be used, in various specific approaches to QG.
In times of crisis, when current theories are revealed as inadequate to the task and new physics is thought to be required, physics turns to re-evaluate its principles and to seek new ones. This paper explores the various types and roles of principles that feature in the problem of quantum gravity as a current crisis in physics. I illustrate the diversity of the principles being appealed to, and show that principles serve in a variety of roles in all stages of the crisis, including in motivating the need for a new theory and in defining what this theory should be like. In particular, I consider: the generalised correspondence principle, UV completion, background independence, and the holographic principle. I also explore how the current crisis fits with Friedman’s view on the roles of principles in revolutionary theory-change, finding that while many key aspects of this view are not represented in quantum gravity, the view could potentially offer a useful diagnostic and prescriptive strategy. This paper is intended to be relatively non-technical, and to bring some of the philosophical issues from the search for quantum gravity to a more general philosophical audience interested in the roles of principles in scientific theory-change.
Arthur Danto’s recent book, Andy Warhol, leads the reader through the story of the iconic American’s artistic life highlighted by a philosophical commentary, a commentary that merges Danto’s aesthetic theory with the artist himself. Inspired by Warhol’s Brillo Box installation, art that in Danto’s eyes was indiscernible from the everyday boxes it represented, Danto developed a theory that is able to differentiate art from non-art by employing the body of conceptual art theory manifest in what he termed the ‘artworld’. The strength of Danto’s theory is found in its ability to explain the art of the post-modern era. His body of work weaves philosophy, art history and art criticism together, merging his aesthetic philosophy with his extensive knowledge of the world of art. Danto’s essentialist theory of embodied meaning provides him with a critical tool that succeeds in explaining the currents of contemporary art, a task that many great thinkers of art history were unable to do. If Warhol inspired Danto to create a philosophy of art, it is appropriate that Danto write a tribute to Warhol that traces how Warhol brought philosophy into art. Danto’s account of ‘Warhol as philosopher’ positions him as a pivotal figure in the history of twentieth-century art, effecting a sea change in how art was made and viewed. Warhol achieved this by conceiving of works that embodied the answers to a series of philosophical puzzles surrounding the nature of art. Warhol, as Danto describes him, manifests himself in his art because he had transformed himself, in a way, into an icon of the times. This pragmatist notion that art should undermine the dichotomies that exist between art and life would, by some accounts, position Warhol to be the philosopher that Danto claims him to be, for he dissolved the philosophical questions posed by late modern aesthetic thinkers by creating art that imploded the accepted notions of art at the time.
One of Danto’s greatest contributions to aesthetics is his theory’s ability to distinguish art from non-art, recognizing that it is the artist’s intention that levels the sublimity of art into the commonplace, thereby transfiguring the everyday. While acknowledging this achievement, I argue that Warhol’s philosophical contribution actually manifests itself in a manner different from that proposed by Danto. Danto maintains that the internal drive of art leads to the unfolding of art-theoretical concepts that ineluctably shift the terrain of the world of art. I would agree with Danto that Warhol, almost as Hegel viewed Napoleon as Geist on a horse, pushed forward the boundaries of art through the actualization of art’s internal drive. But I would disagree that the conceptual nature of art is one that unfolds merely as a relation of concepts that artists trace through a connection to the meaning of history they forge using their unmediated grasp of style. Rather, I would argue that the artist’s style is not bound so narrowly to the meanings they express. Through their aesthetic articulations, artists initiate a process of social interaction. This process indirectly employs the philosophical logic which Danto attributes to Warhol, and through it is able to transfigure the vocabulary of art—the concepts of the artworld—by superseding the language of modernism. Warhol’s philosophical contribution is seen in his mastery of both the medium of art and the underlying logic of the medium’s expression and reception.
The previously introduced algorithm SQEMA computes first-order frame equivalents for modal formulae and also proves their canonicity. Here we extend SQEMA with an additional rule based on a recursive version of Ackermann’s lemma, which enables the algorithm to compute local frame equivalents of modal formulae in the extension of first-order logic with monadic least fixed points (MFFO). This computation operates by transforming input formulae into locally frame-equivalent ones in the pure fragment of the hybrid mu-calculus. In particular, we prove that the recursive extension of SQEMA succeeds on the class of ‘recursive formulae’. We also show that a certain version of this algorithm guarantees the canonicity of the formulae on which it succeeds.
CORCORAN RECOMMENDS COCCHIARELLA ON TYPE THEORY. The 1983 review in Mathematical Reviews 83e:03005 of: Cocchiarella, Nino, “The development of the theory of logical types and the notion of a logical subject in Russell’s early philosophy: Bertrand Russell’s early philosophy, Part I”. Synthese 45 (1980), no. 1, 71-115.
A collection of material on Husserl's Logical Investigations, and specifically on Husserl's formal theory of parts, wholes and dependence and its influence in ontology, logic and psychology. Includes translations of classic works by Adolf Reinach and Eugenie Ginsberg, as well as original contributions by Wolfgang Künne, Kevin Mulligan, Gilbert Null, Barry Smith, Peter M. Simons, Roger A. Simons and Dallas Willard. Documents work on Husserl's ontology arising out of early meetings of the Seminar for Austro-German Philosophy.
The theory of imperatives is philosophically relevant since in building it some long-standing problems need to be addressed, and presumably some new ones are waiting to be discovered. The relevance of the theory of imperatives for philosophical research is remarkable, but usually recognized only within the field of practical philosophy. Nevertheless, the emphasis can be put on problems of theoretical philosophy. Proper understanding of imperatives is likely to raise doubts about some of our deeply entrenched and tacit presumptions. In philosophy of language it is the presumption that declaratives provide the paradigm for sentence form; in philosophy of science it is the belief that theory construction is independent of language practice; in logic it is the conviction that logical meaning relations are constituted out of logical terminology; in ontology it is the view that language use is free from ontological commitments. The list is not exhaustive; it includes only those presumptions that this paper concerns.
I present a reconstruction of the logical system of the Tractatus, which differs from classical logic in two ways. It includes an account of Wittgenstein’s “form-series” device, which suffices to express some effectively generated countably infinite disjunctions. And its attendant notion of structure is relativized to the fixed underlying universe of what is named. There follow three results. First, the class of concepts definable in the system is closed under finitary induction. Second, if the universe of objects is countably infinite, then the property of being a tautology is Π¹₁-complete. But third, it is only granted the assumption of countability that the class of tautologies is Σ₁-definable in set theory. Wittgenstein famously urges that logical relationships must show themselves in the structure of signs. He also urges that the size of the universe cannot be prejudged. The results of this paper indicate that there is no single way in which logical relationships could be held to make themselves manifest in signs, which does not prejudge the number of objects.
This paper starts from the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too, in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, α is an acceptable theory for evidence β if and only if α is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
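The ranking-function semantics invoked above (Spohn, 1988) can be illustrated with a small toy model. The sketch below is not from the paper: the worlds, the particular ranking, the thresholds, and the crude plausibility/informativeness measures are illustrative assumptions only.

```python
from itertools import product

INF = float("inf")

def rank(kappa, prop):
    """Rank of a proposition: the minimum rank of its worlds (INF if empty)."""
    return min((kappa[w] for w in prop), default=INF)

def cond_rank(kappa, prop, given):
    """Conditional rank: kappa(prop | given) = kappa(prop & given) - kappa(given)."""
    return rank(kappa, prop & given) - rank(kappa, given)

# Worlds are truth assignments (h, e) to a hypothesis atom and an evidence atom.
worlds = set(product([0, 1], repeat=2))
# A hypothetical ranking function (normalized: some world has rank 0).
kappa = {(0, 0): 1, (0, 1): 1, (1, 0): 2, (1, 1): 0}

H = {w for w in worlds if w[0]}  # proposition expressed by the hypothesis
E = {w for w in worlds if w[1]}  # proposition expressed by the evidence

def acceptable(H, E, t=1):
    # Toy stand-in for the assessment relation: H must be believed given E
    # (plausibility) and disbelieved given not-E (a crude proxy for
    # informativeness about E). The threshold t is an assumption.
    plausible = cond_rank(kappa, worlds - H, E) >= t
    informative = cond_rank(kappa, H, worlds - E) >= t
    return plausible and informative
```

On this toy ranking, `acceptable(H, E)` holds: the negation of the hypothesis gets positive rank given the evidence, and the hypothesis gets positive rank given the evidence's negation.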
The program put forward in von Wright's last works defines deontic logic as “a study of conditions which must be satisfied in rational norm-giving activity” and thus introduces the perspective of logical pragmatics. In this paper a formal explication of von Wright's program is proposed within the framework of the set-theoretic approach and extended to a two-sets model which allows for the separate treatment of obligation-norms and permission-norms. Three translation functions connecting the language of deontic logic with the language of the extended set-theoretic approach are introduced and used in proving the correspondence between the deontic theorems, on one side, and the perfection properties of the norm-set and the “counter-set”, on the other. In this way the possibility of reinterpreting standard deontic logic as the theory of perfection properties that ought to be achieved in norm-giving activity is formally proved. The extended set-theoretic approach is applied to the problem of the rationality of principles of completion of normative systems. The paper concludes with a plaidoyer for the logical-pragmatics turn envisaged in the late phase of von Wright's work in deontic logic.
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values. The main technical problem studied in this paper is the following: given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit, and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria, and various (counter)examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
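The core consistency requirement, that a knowledge base of graded sentences constrains without fully determining the probability of a query, can be illustrated with a brute-force propositional toy. This is not the paper's higher-order construction: the two atoms, the numbers P(a) = 0.8 and P(a → b) = 0.9, and the tenths-grid search are illustrative assumptions.

```python
from itertools import product

# Worlds: truth assignments to the atoms (a, b), in the order FF, FT, TF, TT.
WORLDS = list(product([False, True], repeat=2))

def prob(mass, sentence):
    """Probability of a sentence = total mass of the worlds satisfying it."""
    return sum(m for w, m in zip(WORLDS, mass) if sentence(*w))

def compositions(total, parts):
    """All ways to split `total` tenths of probability mass over `parts` worlds."""
    if parts == 1:
        yield (total,)
        return
    for first in range(total + 1):
        for rest in compositions(total - first, parts - 1):
            yield (first,) + rest

a = lambda a_, b_: a_
a_imp_b = lambda a_, b_: (not a_) or b_
b = lambda a_, b_: b_

# Hypothetical knowledge base: P(a) = 0.8 and P(a -> b) = 0.9.
feasible = []
for comp in compositions(10, 4):
    mass = [c / 10 for c in comp]
    if abs(prob(mass, a) - 0.8) < 1e-6 and abs(prob(mass, a_imp_b) - 0.9) < 1e-6:
        feasible.append(prob(mass, b))

# The KB bounds the query P(b) to an interval rather than fixing a value.
lo, hi = min(feasible), max(feasible)
```

On this grid the query probability P(b) is pinned to the interval [0.7, 0.9]: the lower bound matches the deductive limit P(a) + P(a → b) − 1, and as the knowledge-base probabilities go to 1 the interval collapses to the modus ponens conclusion, which is the content of wish-list item (iii) in miniature.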
Russell’s initial project in philosophy (1898) was to make mathematics rigorous by reducing it to logic. Before August 1900, however, Russell’s logic was nothing but mereology. First, his acquaintance with Peano’s ideas in August 1900 led him to discard the part-whole logic and accept a kind of intensional predicate logic instead. Among other things, the predicate logic helped Russell embrace a technique of treating the paradox of infinite numbers with the help of a singular concept, which he called ‘denoting phrase’. Unfortunately, a new paradox emerged soon: that of classes. The main contention of this paper is that Russell’s new conception only transferred the paradox of infinity from the realm of infinite numbers to that of class-inclusion. Russell’s long-elaborated solution to his paradox, developed between 1905 and 1908, was nothing but to set aside some of the ideas he had adopted with his turn of August 1900: (i) With the Theory of Descriptions, he reintroduced the complexes we are acquainted with into logic. In this way, he partly restored the pre-August 1900 mereology of complexes and simples. (ii) The elimination of classes, with the help of the ‘substitutional theory’, and of propositions, by means of the Multiple Relation Theory of Judgment, completed this process.
This paper investigates Wittgenstein’s account of the relation between elementary and molecular propositions (and thus, also, the propositions of logic) in the Tractatus Logico-Philosophicus. I start by sketching a natural reading of that relation – which I call the “bipartite reading” – holding that the Tractatus gives an account of elementary propositions, based on the so-called picture theory, and a different account of molecular ones, based on the principle of truth-functionality. I then show that such a reading cannot be attributed to Wittgenstein, because he holds the view that an explanation of logical complexity is already given by a correct account of the (pictorial) nature of elementary propositions; this is implied in his claim that “an elementary proposition contains all logical constants/operations in itself”. After clarifying Wittgenstein’s notion of an operation from the Notes on Logic to the Tractatus, I finally explain why Wittgenstein claims that an elementary proposition contains all logical operations in itself, and hence why he can be said to provide a unified (and thus not bipartite) account of language and logic.
Mathematicians often speak of conjectures, yet unproved, as probable or well-confirmed by evidence. The Riemann Hypothesis, for example, is widely believed to be almost certainly true. There seems no initial reason to distinguish such probability from the same notion in empirical science. Yet it is hard to see how there could be probabilistic relations between the necessary truths of pure mathematics. The existence of such logical relations, short of certainty, is defended using the theory of logical probability (or objective Bayesianism or non-deductive logic), and some detailed examples of its use in mathematics are surveyed. Examples of inductive reasoning in experimental mathematics are given, and it is argued that the problem of induction is best appreciated in the mathematical case.
CORCORAN REVIEWS THE 4 VOLUMES OF TARSKI’S COLLECTED PAPERS. Alfred Tarski (1901–1983) is widely regarded as one of the two giants of twentieth-century logic and also as one of the four greatest logicians of all time (Aristotle, Frege and Gödel being the other three). Of the four, Tarski was the most prolific as a logician. The four volumes of his collected papers, which exclude most of his 19 monographs, span over 2500 pages. Aristotle's writings are comparable in volume, but most of the Aristotelian corpus is not about logic, whereas virtually everything written by Tarski concerns logic more or less directly. There is no doubt that Tarski wrote more on logic than any other author; he started publishing on logic in 1921 at the age of 20 and continued until his death at the age of 82.
Photomechanical reprint of papers from 1970 to 1992, mostly in English, some in German or French: Foreword 1–4; LAW AS PRACTICE ‘La formation des concepts en sciences juridiques’ 7–33, ‘Geltung des Rechts – Wirksamkeit des Rechts’ 35–42, ‘Macrosociological Theories of Law’ 43–76, ‘Law & its Inner Morality’ 77–89, ‘The Law & its Limits’ 91–96; LAW AS TECHNIQUE ‘Domaine »externe« & domaine »interne« en droit’ 99–117, ‘Die ministerielle Begründung’ 119–139, ‘The Preamble’ 141–167, ‘Presumption & Fiction’ 169–185, ‘Legal Technique’ 187–198; LAW AS LOGIC ‘Moderne Staatlichkeit und modernes formales Recht’ 201–207, ‘Heterogeneity & Validity of Law’ 209–218, ‘Leibniz & die Frage der rechtlichen Systembildung’ 219–232, ‘Law & its Approach as a System’ 233–255, ‘Logic of Law & Judicial Activity’ 258–288, ‘Kelsen’s Pure Theory of Law’ 289–293, ‘The Nature of the Judicial Application of Norms’ 295–314; LAW AS EXPERIENCE ‘The Socially Determined Nature of Legal Reasoning’ 317–374, ‘The Ontological Foundation of Law’ 375–390, ‘Is Law a System of Enactments?’ 391–398, ‘The Uniqueness of National Legal Cultures’ 399–411, ‘Institutions as Systems’ 413–424; LAW AS HISTORY ‘From Legal Customs to Legal Folkways’ 427–436, ‘Anthropological Jurisprudence?’ 437–457, ‘Law as a Social Issue’ 459–475, ‘Law as History?’ 477–484, ‘Rechtskultur – Denkkultur’ 485–489; with Curriculum Vitae & Bibliography, as well as Index & Indexes of normative materials & of names.
The notion of consciousness has been studied in many ways, among which there could be a scientific approach that studies it using string theory, which enjoys mathematical coherence. The paper aims to study consciousness using string theory, in which the role of the graviton will be discussed. The notion of parallel universes given by string theory will be tackled to understand consciousness and to attempt to clarify the notions of another world or universe and of parallel universes, the problems of time travel, and soul and death, in a universe, or universes, of strings that get twisted and compactified.
This introduction contextualizes and evaluates the accompanying, previously untranslated review by Herbert Marcuse of John Dewey's Logic: The Theory of Inquiry. Marcuse's critique of pragmatism is indebted to Max Horkheimer's claim that pragmatism is an example of “traditional” theory and reduces thought to a mere instrument in service of external ends. Unlike Horkheimer, Marcuse concedes that Dewey, unlike the logical positivists, attempted to develop a material logic of ends. However, he concludes that the attempt was ultimately unsuccessful. I place this conclusion in the context of Marcuse's critique of technological reason. Lastly, I defend Dewey from the charge of crude instrumentalism and delineate Marcuse's and Dewey's critical disagreement on science's capacity for self-reflection.
In “Proof-Theoretic Justification of Logic”, building on work by Dummett and Prawitz, I show how to construct use-based meaning-theories for the logical constants. The assertability-conditional meaning-theory takes the meaning of the logical constants to be given by their introduction rules; the consequence-conditional meaning-theory takes the meaning of the logical constants to be given by their elimination rules. I then consider the question: given a set of introduction rules, what are the strongest elimination rules that are validated by an assertability-conditional meaning-theory based on those introduction rules? I prove that the intuitionistic elimination rules are the strongest rules that are validated by the intuitionistic introduction rules. I then prove that intuitionistic logic is the strongest logic that can be given either an assertability-conditional or a consequence-conditional meaning-theory. In “Grounding Grounding” I discuss the notion of grounding. My discussion revolves around the problem of iterated grounding-claims. Suppose that one fact grounds another; what grounds the fact that it does? I argue that unless we can get a satisfactory answer to this question, the notion of grounding will be useless. I discuss and reject some proposed accounts of iterated grounding-claims. I then develop a new way of expressing grounding, propose an account of iterated grounding-claims, and show how we can develop logics for grounding. In “Is the Vagueness Argument Valid?” I argue that the Vagueness Argument in favor of unrestricted composition isn't valid. However, if the premisses of the argument are true and the conclusion false, mereological facts fail to supervene on non-mereological facts. I argue that this failure of supervenience is an artifact of the interplay between the necessity and determinacy operators and that it does not mean that mereological facts fail to depend on non-mereological facts. I sketch a deflationary view of ontology to establish this.
The precondition of any feminist politics – a usable category of ‘woman’ – has proved to be difficult to construct, and has even been proposed to be impossible, given the ‘problem of exclusion’. This is the inevitable exclusion of at least some women, as their lives or experiences do not fit into the necessary and sufficient condition(s) that denote group membership. In this paper, I propose that the problem of exclusion arises not because of inappropriate category membership criteria, but because of the presumption that categories can only be organised by identity relations or shared properties among their members. This criterion of sameness, as well as the characterisation of this exclusion as essentialism, attests to a metaphysics that is not conducive to resistance and liberatory projects. Following a strain of hybrid thinking in feminist and post-colonial theory, I outline an alternative pluralist logic that confronts oppressive binaries that impede theory work in gender, sexuality, and race theory, and limit political action and resistance. The problem of exclusion is neither irresolvable nor is it essentialism. Instead it is a denial of subjectivity due to pseudodualistic self/Other dichotomies that can be resisted by adopting a new categorial logic. While this paper focuses on the specific problem of formulating a category of ‘woman’, it has implications for other areas of gender, critical race, and postcolonial theory. Rather than working toward an inclusive category founded on sameness, theorists need to develop independent and positive categories grounded in difference. Our current categorial logic does not permit such a project, and therefore a new metaphysics must be adopted.
Direct reference theory faces serious prima facie counterexamples which must be explained away (e.g., that it is possible to know a priori that Hesperus = Phosphorus). This is done by means of various forms of pragmatic explanation. But when those explanations that provisionally succeed are generalized to deal with analogous prima facie counterexamples concerning the identity of propositions, a fatal dilemma results. Either identity must be treated as a four-place relation (contradicting what just about everyone, including direct reference theorists, takes to be essential to identity), or direct reference theorists must incorporate a view that was rejected in pretty much our first lesson about identity—namely, that Hesperus at twilight is not identical to Hesperus at dawn. One way or the other, the direct reference theory is thus inconsistent with basic principles concerning the logic of identity, which nearly everyone, including direct reference theorists, takes as starting points.
In a previous work we introduced the algorithm SQEMA for computing first-order equivalents and proving canonicity of modal formulae, and thus established a very general correspondence and canonical completeness result. SQEMA is based on transformation rules, the most important of which employs a modal version of a result by Ackermann that enables elimination of an existentially quantified predicate variable in a formula, provided a certain negative polarity condition on that variable is satisfied. In this paper we develop several extensions of SQEMA where that syntactic condition is replaced by a semantic one, viz. downward monotonicity. For the first, and most general, extension SSQEMA we prove correctness for a large class of modal formulae containing an extension of the Sahlqvist formulae, defined by replacing polarity with monotonicity. By employing a special modal version of Lyndon's monotonicity theorem and imposing additional requirements on the Ackermann rule, we obtain restricted versions of SSQEMA which guarantee canonicity, too.
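The Ackermann step named above can be illustrated in miniature. The toy below works in propositional logic, not the modal or first-order setting of the paper, and the particular formulas A and B are assumptions chosen for illustration: it checks by brute force that an existential over a variable P occurring only negatively in B can be eliminated by substituting P's lower bound A.

```python
from itertools import product

def ackermann_demo():
    """Brute-force check of a propositional instance of Ackermann's lemma:
    exists P ((A -> P) and B(P))  iff  B(P := A),
    provided P occurs only negatively in B and P does not occur in A."""
    A = lambda p, q: p and q      # a P-free lower bound for P
    B = lambda P: not P           # P occurs only negatively in B

    for p, q in product([False, True], repeat=2):
        # Left side: some value of P satisfies both (A -> P) and B(P).
        lhs = any(((not A(p, q)) or P) and B(P) for P in [False, True])
        # Right side: P eliminated by substituting its lower bound A.
        rhs = B(A(p, q))
        if lhs != rhs:
            return False
    return True
```

The negativity condition matters: since B is downward monotone in P, making P as weak as its lower bound A allows can only help satisfy B, which is why the substitution is equivalence-preserving. The extensions described above replace this syntactic polarity check with downward monotonicity itself.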
This paper discusses an almost sixty-year-old problem in the philosophy of science -- that of a logic of confirmation. We present a new analysis of Carl G. Hempel's conditions of adequacy (Hempel 1945), differing from the one Carnap gave in §87 of his Logical Foundations of Probability (1962). Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. We then show that one can have Hempel's cake and eat it, too: There is a (rank-theoretic and genuinely nonmonotonic) logic of confirmation -- or rather, theory assessment -- that takes into account both of these two conflicting aspects. According to this logic, a statement H is an acceptable theory for the data E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap's analysis (and solves another problem of confirmation theory).
This thesis discusses some central aspects of Wittgenstein's conception of language and logic in his Tractatus Logico-Philosophicus and brings them into relation with the philosophies of Frege and Russell. The main contention is that a fruitful way of understanding the Tractatus is to see it as responding to tensions in Frege's conception of logic and Russell's theory of judgement. In the thesis the philosophy of the Tractatus is presented as developing from these two strands of criticism and thus as the culmination of the philosophy of logic and language developed in the early analytic period. Part one examines relevant features of Frege's philosophy of logic. Besides shedding light on Frege's philosophy in its own right, it aims at preparing the ground for a discussion of those aspects of the Tractatus' conception of logic which derive from Wittgenstein's critical response to Frege. Part two first presents Russell's early view on truth and judgement, before considering several variants of the multiple relation theory of judgement, devised in opposition to it. Part three discusses the development of Wittgenstein's conception of language and logic, beginning with Wittgenstein's criticism of the multiple relation theory and his early theory of sense, seen as containing the seeds of the picture theory of propositions presented in the Tractatus. I then consider the relation between Wittgenstein's pictorial conception of language and his conception of logic, arguing that Wittgenstein's understanding of sense in terms of bipolarity grounds his view of logical complexity and of the essence of logic as a whole. This view, I show, is free from the internal tensions that affect Frege's understanding of the nature of logic.
The philosophy of mathematics has been accused of paying insufficient attention to mathematical practice: one way to cope with the problem, the one we will follow in this paper on extensive magnitudes, is to combine the 'history of ideas' and the 'philosophy of models' in a logical and epistemological perspective. The history of ideas allows the reconstruction of the theory of extensive magnitudes as a theory of ordered algebraic structures; the philosophy of models allows an investigation into the way epistemology might affect relevant mathematical notions. The article takes two historical examples as a starting point for the investigation of the role of numerical models in the construction of a system of non-Archimedean magnitudes. A brief exposition of the theories developed by Giuseppe Veronese and by Rodolfo Bettazzi at the end of the 19th century will throw new light on the role played by magnitudes and numbers in the development of the concept of a non-Archimedean order. Different ways of introducing non-Archimedean models will be compared and the influence of epistemological models will be evaluated. Particular attention will be devoted to the comparison between the models that oriented Veronese's and Bettazzi's works and the mathematical theories they developed, but also to the analysis of the way epistemological beliefs affected the concepts of continuity and measurement.
Noun phrases with overt determiners, such as 'some apples' or 'a quantity of milk', differ from bare noun phrases like 'apples' or 'milk' in their contribution to aspectual composition. While this has been attributed to syntactic or algebraic properties of these noun phrases, such accounts have explanatory shortcomings. We suggest instead that the relevant property that distinguishes between the two classes of noun phrases derives from two modes of existential quantification, one of which holds the values of a variable fixed throughout a quantificational context while the other allows them to vary. Inspired by Dynamic Plural Logic and Dependence Logic, we propose Plural Predicate Logic as an extension of Predicate Logic to formalize this difference. We suggest that temporal 'for'-adverbials are sensitive to aspect because of the way they manipulate quantificational contexts, and that analogous manipulations occur with spatial 'for'-adverbials, habituals, and the quantifier 'all'.
Reinach’s essay of 1911 establishes an ontological theory of logic, based on the notion of Sachverhalt or state of affairs. He draws on the theory of meaning and reference advanced in Husserl’s Logical Investigations and at the same time anticipates both Wittgenstein’s Tractatus and later speech act theorists’ ideas on performative utterances. The theory is used by Reinach to draw a distinction between two kinds of negative judgment: the simple negative judgment, which is made true by a negative state of affairs; and the polemical negative judgment, which is a performative utterance in which the truth of some earlier judgment – typically a judgment made by some other person – is denied.
Focusing on the work of Friedrich von Hayek and Vernon Smith, we discuss some conceptual links between Austrian economics and recent work in behavioral game theory and experimental economics. After a brief survey of the main methodological aspects of Austrian and experimental economics, we suggest that common views on subjectivism, individualism, and the role of qualitative explanations and predictions in social science may favour a fruitful interaction between these two research programs.
I consider the first-order modal logic which counts as valid those sentences which are true on every interpretation of the non-logical constants. Based on the assumptions that it is necessary what individuals there are and that it is necessary which propositions are necessary, Timothy Williamson has tentatively suggested an argument for the claim that this logic is determined by a possible-world structure consisting of an infinite set of individuals and an infinite set of worlds. He notes that only the cardinalities of these sets matter, and that not all pairs of infinite sets determine the same logic. I use so-called two-cardinal theorems from model theory to investigate the space of logics and consequence relations determined by pairs of infinite sets, and show how to eliminate the assumption that worlds are individuals from Williamson’s argument.