The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux, as developed for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He extensively extended Hilbert's program in the sense that he formulated Gentzen's sequent calculus for higher-order logics, conjectured that cut-elimination holds for it (Takeuti's conjecture), and obtained several stunning results in the 1950–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese where he expressed his philosophical thoughts. In particular, he used several keywords such as "active intuition" and "self-reflection" from Nishida's philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti's philosophy of mathematics. In particular, after reviewing Takeuti's proof-theoretic results briefly, we describe some key elements in Takeuti's texts. By explaining these texts, we point out the connection between Takeuti's proof theory and Nishida's philosophy and explain the future goals of our project.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers' knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, No whale flies is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, No whale flies entails No blue whale flies and No whale flies high. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, Under no circumstances would a whale fly is grammatical, whereas Under some circumstances would a whale fly is not, corresponding to the downward vs. upward monotonic features of the preposed phrases.
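To make the model-theoretic picture in (A) and (B) concrete, the truth condition and the entailments can be written out in standard generalized-quantifier notation; the following rendering is a textbook sketch rather than the paper's own formalism.

```latex
% Truth condition for "No whale flies": the determiner "no" denotes
% disjointness of its two set arguments.
\[
\llbracket \text{No whale flies} \rrbracket = 1
\quad\Longleftrightarrow\quad
\llbracket \text{whale} \rrbracket \cap \llbracket \text{flies} \rrbracket = \emptyset .
\]
% "No" is downward monotone in both arguments, which licenses the
% entailments in (B): shrinking either set preserves disjointness.
\[
\llbracket \text{blue whale} \rrbracket \subseteq \llbracket \text{whale} \rrbracket
\;\text{ and }\;
\llbracket \text{whale} \rrbracket \cap \llbracket \text{flies} \rrbracket = \emptyset
\;\Longrightarrow\;
\llbracket \text{blue whale} \rrbracket \cap \llbracket \text{flies} \rrbracket = \emptyset .
\]
```

The same downward monotonicity is what the preposed phrases in (C) are sensitive to.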
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth-value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
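For orientation, the truth-value set [0,1] mentioned here carries the standard Gödel truth functions; the following summary is textbook material on Gödel logics, not a quotation from the paper.

```latex
% Standard [0,1]-valued Gödel semantics: conjunction and disjunction are
% min and max; implication is 1 whenever the antecedent's value does not
% exceed the consequent's.
\[
v(A \wedge B) = \min\bigl(v(A), v(B)\bigr), \qquad
v(A \vee B) = \max\bigl(v(A), v(B)\bigr),
\]
\[
v(A \rightarrow B) =
\begin{cases}
1 & \text{if } v(A) \le v(B), \\
v(B) & \text{otherwise,}
\end{cases}
\qquad
v(\neg A) =
\begin{cases}
1 & \text{if } v(A) = 0, \\
0 & \text{otherwise.}
\end{cases}
\]
```

A formula is valid when it takes value 1 under every such valuation; the density rule mentioned at the end reflects the density of the ordering on [0,1].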
This paper presents a sequent calculus and a dual domain semantics for a theory of definite descriptions in which these expressions are formalised in the context of complete sentences by a binary quantifier I. I forms a formula from two formulas. Ix[F, G] means ‘The F is G’. This approach has the advantage of incorporating scope distinctions directly into the notation. Cut elimination is proved for a system of classical positive free logic with I, and the system is shown to be sound and complete for the semantics. The system has a number of novel features and is briefly compared to the usual approach of formalising ‘the F’ by a term-forming operator. It does not coincide with Hintikka’s and Lambert’s preferred theories, but the divergence is well-motivated and attractive.
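As an illustration of how scope distinctions show up directly in the notation, consider negated description sentences; the rendering below follows the reading ‘Ix[F, G] means "The F is G"’ given above, and the gloss on free logic is a hedged sketch, not the paper's own exposition.

```latex
% 'The F is not G': negation takes narrow scope, inside the description.
\[
I x [F x, \neg G x]
\]
% 'It is not the case that the F is G': negation takes wide scope.
\[
\neg\, I x [F x, G x]
\]
% On a Russellian reading in positive free logic, the two come apart when
% there is no unique F: the wide-scope negation is then true, while the
% narrow-scope sentence is not.
```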
We introduce translations between display calculus proofs and labeled calculus proofs in the context of tense logics. First, we show that every derivation in the display calculus for the minimal tense logic Kt extended with general path axioms can be effectively transformed into a derivation in the corresponding labeled calculus. Concerning the converse translation, we show that for Kt extended with path axioms, every derivation in the corresponding labeled calculus can be put into a special form that is translatable to a derivation in the associated display calculus. A key insight in this converse translation is a canonical representation of display sequents as labeled polytrees. Labeled polytrees, which represent equivalence classes of display sequents modulo display postulates, also shed light on related correspondence results for tense logics.
In this dissertation, we investigate whether Tennant's criterion for paradoxicality (TCP) can be a correct criterion for genuine paradoxes and whether the requirement of a normal derivation (RND) can be a proof-theoretic solution to the paradoxes. Tennant's criterion faces two types of counterexamples. One raises the problem of overgeneration: TCP renders a non-paradoxical derivation paradoxical. The other raises the problem of undergeneration: TCP makes a paradoxical derivation non-paradoxical. Chapter 2 deals with the problem of undergeneration and Chapter 3 concerns the problem of overgeneration. Chapter 2 argues that Tennant's diagnosis of the counterexample which applies the CR-rule and causes the undergeneration problem is incorrect, and presents a solution to the problem of undergeneration. Chapter 3 argues that Tennant's diagnosis of the counterexample raising the overgeneration problem is wrong and provides a solution to the problem. Finally, Chapter 4 addresses what must be explicated in order for RND to be a proof-theoretic solution to the paradoxes.
The traditional view of evidence in mathematics is that evidence is just proof and proof is just derivation. There are good reasons for thinking that this view should be rejected: it misrepresents both historical and current mathematical practice. Nonetheless, evidence, proof, and derivation are closely intertwined. This paper seeks to tease these concepts apart. It emphasizes the role of argumentation as a context shared by evidence, proofs, and derivations. The utility of argumentation theory, in general, and argumentation schemes, in particular, as a methodology for the study of mathematical practice is thereby demonstrated. Argumentation schemes represent an almost untapped resource for mathematics education. Notably, they provide a consistent treatment of rigorous and non-rigorous argumentation, thereby working to exhibit the continuity of reasoning in mathematics with reasoning in other areas. Moreover, since argumentation schemes are a comparatively mature methodology, there is a substantial body of existing work to draw upon, including some increasingly sophisticated software tools. Such tools have significant potential for the analysis and evaluation of mathematical argumentation. The first four sections of the paper address the relationships of evidence to proof, proof to derivation, argument to proof, and argument to evidence, respectively. The final section directly addresses some of the educational implications of an argumentation scheme account of mathematical reasoning.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. Reprinted 1976 in Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263.

This is the second of a series of three articles dealing with the application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for the improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition, we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis the improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link .
We introduce an effective translation from proofs in the display calculus to proofs in the labelled calculus in the context of tense logics. We identify the labelled calculus proofs in the image of this translation as those built from labelled sequents whose underlying directed graph possesses certain properties. For the basic normal tense logic Kt, the image is shown to be the set of all proofs in the labelled calculus G3Kt.
Recent years have seen fresh impetus brought to debates about the proper role of statistical evidence in the law. Recent work largely centres on a set of puzzles known as the ‘proof paradox’. While these puzzles may initially seem academic, they have important ramifications for the law: raising key conceptual questions about legal proof, and practical questions about DNA evidence. This article introduces the proof paradox, why we should care about it, and new work attempting to resolve it.
Restall set forth a "consecution" calculus in his An Introduction to Substructural Logics. This is a natural-deduction-type sequent calculus in which the structural rules play an important role. This paper looks at different ways of extending Restall's calculus. It is shown that Restall's weak soundness and completeness result with respect to a Hilbert calculus can be extended to a strong one so as to encompass what Restall calls proofs from assumptions. It is also shown how to extend the calculus so as to validate the metainferential rule of reasoning by cases, as well as certain theory-dependent rules.
This work provides proof-search algorithms and automated counter-model extraction for a class of STIT logics. With this, we answer an open problem concerning syntactic decision procedures and cut-free calculi for STIT logics. A new class of cut-free complete labelled sequent calculi G3Ldm^m_n, for multi-agent STIT with at most n-many choices, is introduced. We refine the calculi G3Ldm^m_n through the use of propagation rules and demonstrate the admissibility of their structural rules, resulting in auxiliary calculi Ldm^m_nL. In the single-agent case, we show that the refined calculi Ldm^m_nL derive theorems within a restricted class of (forestlike) sequents, allowing us to provide proof-search algorithms that decide single-agent STIT logics. We prove that the proof-search algorithms are correct and terminate.
A question, long discussed by legal scholars, has recently provoked a considerable amount of philosophical attention: ‘Is it ever appropriate to base a legal verdict on statistical evidence alone?’ Many philosophers who have considered this question reject legal reliance on bare statistics, even when the odds of error are extremely low. This paper develops a puzzle for the dominant theories concerning why we should eschew bare statistics. Namely, there seem to be compelling scenarios in which there are multiple sources of incriminating statistical evidence. As we conjoin together different types of statistical evidence, it becomes increasingly incredible to suppose that a positive verdict would be impermissible. I suggest that none of the dominant views in the literature can easily accommodate such cases, and close by offering a diagnosis of my own.
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted.

Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just as consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately.

However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to the arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion of Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth.

The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. Of the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, second-order logic and Saul Kripke's approach using Kleenean logic will also be shown to fail in a similar fashion.
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
Most human actions are complex, but some of them are basic. Which are these? In this paper, I address this question by invoking slips, a common kind of mistake. The proposal is this: an action is basic if and only if it is not possible to slip in performing it. The argument discusses some well-established results from the psychology of language production in the context of a philosophical theory of action. In the end, the proposed criterion is applied to discuss some well-known theories of basic actions.
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely, because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because generally impure proofs are simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.
ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind: they must be thought of either as disguised linear theories or as theories of a third kind (see postscript below). The second purpose of this part is to develop some of the main ideas needed in constructing a comprehensive theory of proof. The reason for choosing the linear and suppositional theories for this purpose is that the linear theory includes only rules of a very simple nature, and the suppositional theory can be seen as the result of making the linear theory more comprehensive. CORRECTION: At the time these articles were written, the word ‘proof’, especially in the phrase ‘proof from hypotheses’, was widely used to refer to what were earlier and are now called deductions. I ask your forgiveness. I have forgiven Church and Henkin, who misled me.
This paper considers logics which are formally dual to intuitionistic logic in order to investigate a co-constructive logic for proofs and refutations. This is philosophically motivated by a set of problems regarding the nature of constructive truth and its relation to falsity. It is well known both that intuitionism cannot deal constructively with negative information, and that defining falsity by means of intuitionistic negation leads, under widely-held assumptions, to a justification of bivalence. For example, we do not want to equate falsity with the non-existence of a proof, since this would render a statement such as “pi is transcendental” false prior to 1882. In addition, the intuitionist account of negation as shorthand for the derivation of absurdity is inadequate, particularly outside of purely mathematical contexts. To deal with these issues, I investigate the dual of intuitionistic logic, co-intuitionistic logic, as a logic of refutation, alongside the intuitionistic logic of proofs. Direct proof and refutation are dual to each other, and are constructive, whilst there also exist syntactic, weak negations within both logics. In this respect, the logic of refutation is weakly paraconsistent in the sense that it allows for statements for which neither they, nor their negation, are refuted. I provide a proof theory for the co-constructive logic, a formal dualizing map between the logics, and a Kripke-style semantics. This is given an intuitive philosophical rendering in a re-interpretation of Kolmogorov’s logic of problems.
Infectious logics are systems featuring a truth-value that is assigned to a compound formula whenever it is assigned to one of its components. This paper studies four-valued infectious logics as the basis of transparent theories of truth. This take is motivated as a way to treat different pathological sentences differently, namely, by allowing some of them to be truth-value gluts and some others to be truth-value gaps, and as a way to treat the semantic pathology suffered by at least some of these sentences as infectious. This leads us to consider four distinct four-valued logics: one where truth-value gaps are infectious, but gluts are not; one where truth-value gluts are infectious, but gaps are not; and two logics where both gluts and gaps are infectious, in some sense. Additionally, we focus on the proof theory of these systems, by offering a discussion of two related topics. On the one hand, we prove some limitations regarding the possibility of providing standard Gentzen sequent calculi for these systems, by dualizing and extending some recent results for infectious logics. On the other hand, we provide sound and complete four-sided sequent calculi, arguing that the most important technical and philosophical features usually taken into account to prefer standard calculi are, indeed, enjoyed by the four-sided systems.
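To picture what "infectious" means here, the standard weak Kleene tables are the classical reference point: the third value contaminates every compound it enters. The tables below are that textbook three-valued case, not the paper's four-valued systems, which extend this behaviour to gaps and gluts.

```latex
% Weak Kleene conjunction and disjunction with infectious value e:
% any compound with an e-component takes value e.
\[
\begin{array}{c|ccc}
\wedge & t & e & f \\ \hline
t & t & e & f \\
e & e & e & e \\
f & f & e & f
\end{array}
\qquad
\begin{array}{c|ccc}
\vee & t & e & t \\ \hline
t & t & e & t \\
e & e & e & e \\
f & t & e & f
\end{array}
\]
```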
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternatives accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
Theism and its cousins, atheism and agnosticism, are seldom taken to task for logical-epistemological incoherence. This paper provides a condensed proof that not only theism, but atheism and agnosticism as well, are all of them conceptually self-undermining, and for the same reason: all attempt to make use of the concept of “transcendent reality,” which here is shown not only to lack meaning, but to preclude the very possibility of meaning. In doing this, the incoherence of theism, atheism, and agnosticism is secondary to the more general incoherence of any attempt to refer to so-called “transcendent realities.” A recognition of the conceptually fundamental incoherence of theism, atheism, and agnosticism compels our rational assent to a position the author names “paratheism”.
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
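One way to picture the evidential side of this Bayesian cashing-out is a toy posterior calculation; the numbers below are illustrative assumptions, not figures from the paper.

```latex
% Bayes' theorem for a hypothesis H and evidence E:
\[
P(H \mid E) = \frac{P(E \mid H)\, P(H)}
                   {P(E \mid H)\, P(H) + P(E \mid \neg H)\, P(\neg H)} .
\]
% With a low prior for an extraordinary claim, P(H) = 0.01, even fairly
% strong evidence (P(E|H) = 0.9, P(E|¬H) = 0.1) leaves the posterior low:
\[
P(H \mid E) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.1 \times 0.99}
\approx 0.08 ,
\]
% which is why the burden of producing further evidence plausibly stays
% with the claimant rather than the skeptic.
```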
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
Born’s rule, which interprets the square of the wave function as the probability of getting a specific value in a measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically using different approaches, such as the frequency operator approach, many-worlds theory, Bayesian probability, and envariance, the literature shows that the arguments in each of these methods are circular. In view of the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al, Science, 329, 418 [2010]). But a convincing analytical proof of Born’s rule would make us understand the basic process responsible for the exact square dependence of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience into quantum mechanics, we prove Born’s rule for the statistical interpretation of the wave function.
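For reference, the rule under discussion has the following standard formulation (this is the usual textbook statement, not the paper's derivation).

```latex
% Born rule: expand a normalized state in the eigenbasis of the measured
% observable; the probability of each outcome is the squared amplitude.
\[
|\psi\rangle = \sum_i c_i\, |a_i\rangle,
\qquad c_i = \langle a_i | \psi \rangle,
\]
\[
P(a_i) = |c_i|^2 = \bigl|\langle a_i | \psi \rangle\bigr|^2 .
\]
```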
According to one of Leibniz's theories of contingency, a proposition is contingent if and only if it cannot be proved in a finite number of steps. It has been argued that this faces the Problem of Lucky Proof, namely that we could begin by analysing the concept ‘Peter’ by saying that ‘Peter is a denier of Christ and …’, thereby having proved the proposition ‘Peter denies Christ’ in a finite number of steps. It also faces a more general but related problem that we dub the Problem of Guaranteed Proof. We argue that Leibniz has an answer to these problems, since for him one has not proved that ‘Peter denies Christ’ unless one has also proved that ‘Peter’ is a consistent concept, an impossible task since it requires the full decomposition of the infinite concept ‘Peter’. We defend this view from objections found in the literature and maintain that for Leibniz all truths about created individual beings are contingent.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility, and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation, and probability aggregation. The present review, illustrative rather than exhaustive, is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
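The flavour of the motivating paradoxes can be seen in the classic discursive dilemma (a standard example in this literature, reproduced here for orientation): three individuals judge two propositions and their conjunction.

```latex
% Propositionwise majority voting can yield an inconsistent collective
% judgment set: the majority accepts p and q but rejects p ∧ q.
\[
\begin{array}{l|ccc}
 & p & q & p \wedge q \\ \hline
\text{Individual 1} & \text{Yes} & \text{Yes} & \text{Yes} \\
\text{Individual 2} & \text{Yes} & \text{No}  & \text{No}  \\
\text{Individual 3} & \text{No}  & \text{Yes} & \text{No}  \\ \hline
\text{Majority}     & \text{Yes} & \text{Yes} & \text{No}
\end{array}
\]
```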
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.

For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along.

To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
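A minimal sketch of the proofs-as-programs correspondence the essay describes can be given in Haskell (an illustrative fragment, not code from the essay): propositions become types, proofs become programs, and proof simplification becomes program execution.

```haskell
-- Under the Curry-Howard correspondence, conjunction A ∧ B is the pair
-- type (a, b), and implication A → B is the function type a -> b.

-- A proof that conjunction commutes is the program that swaps a pair:
swapProof :: (a, b) -> (b, a)
swapProof (x, y) = (y, x)

-- Modus ponens (from A → B and A, infer B) is function application:
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x

-- Simplifying the proof `modusPonens swapProof (1, 'c')` corresponds to
-- beta-reducing the program, which executes to ('c', 1):
main :: IO ()
main = print (modusPonens swapProof (1 :: Int, 'c'))
```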
An adequate semantics for generic sentences must stake out positions across a range of contested territory in philosophy and linguistics. For this reason the study of generic sentences is a venue for investigating different frameworks for understanding human rationality as manifested in linguistic phenomena such as quantification, classification of individuals under kinds, defeasible reasoning, and intensionality. Despite the wide variety of semantic theories developed for generic sentences, to date these theories have been almost universally model-theoretic and representational. This essay outlines a range of proof-theoretic analyses for characterizing generics. Particular attention is given to an expressivist proof theory that can be traced to (1) work on logical syntax that Carnap undertook prior to his turn toward truth-conditional model theory in the late 1930s, and (2) research on sequent calculi and natural deduction systems that originates in work from Gentzen and Prawitz.
This is part one of a two-part paper, in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. This allows us to connect theories of partial ground with axiomatic theories of truth. In this part of the paper, we develop an axiomatization of the relation of partial ground over the truths of arithmetic and show that the theory is a proof-theoretically conservative extension of the theory PT of positive truth. We construct models for the theory and draw some conclusions for the semantics of conceptualist ground.
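To give a feel for what axiomatizing ground with a binary predicate looks like, here is a hedged sketch of principles standardly associated with partial ground, written with a predicate ≺ between (codes of) sentences and a truth predicate T; the paper's own axioms may differ in detail.

```latex
% Factivity: only truths stand in the grounding relation.
\[
x \prec y \;\rightarrow\; (T x \wedge T y)
\]
% Partial ground is a strict partial order: irreflexive and transitive.
\[
\neg (x \prec x), \qquad
(x \prec y \wedge y \prec z) \;\rightarrow\; x \prec z
\]
% Sample interaction with truth, stated schematically for sentences A, B:
% a true conjunct partially grounds the conjunction.
\[
(T\ulcorner A\urcorner \wedge T\ulcorner B\urcorner)
\;\rightarrow\;
\ulcorner A\urcorner \prec \ulcorner A \wedge B\urcorner
\]
```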
Until recently, discussion of virtues in the philosophy of mathematics has been fleeting and fragmentary at best. But in the last few years this has begun to change. As virtue theory has grown ever more influential, not just in ethics where virtues may seem most at home, but particularly in epistemology and the philosophy of science, some philosophers have sought to push virtues out into unexpected areas, including mathematics and its philosophy. But there are some mathematicians already there, ready to meet them, who have explicitly invoked virtues in discussing what is necessary for a mathematician to succeed. In both ethics and epistemology, virtue theory tends to emphasize character virtues, the acquired excellences of people. But people are not the only sort of thing whose excellences may be identified as virtues. Theoretical virtues have attracted attention in the philosophy of science as components of an account of theory choice. Within the philosophy of mathematics, and mathematics itself, attention to virtues has emerged from a variety of disparate sources. Theoretical virtues have been put forward both to analyse the practice of proof and to justify axioms; intellectual virtues have found multiple applications in the epistemology of mathematics; and ethical virtues have been offered as a basis for understanding the social utility of mathematical practice. Indeed, some authors have advocated virtue epistemology as the correct epistemology for mathematics (and perhaps even as the basis for progress in the metaphysics of mathematics). This topical collection brings together several of the researchers who have begun to study mathematical practices from a virtue perspective, with the intention of consolidating and encouraging this trend.
We report on an exploratory study of the way eight mid-level undergraduate mathematics majors read and reflected on four student-generated arguments purported to be proofs of a single theorem. The results suggest that mid-level undergraduates tend to focus on surface features of such arguments and that their ability to determine whether arguments are proofs is very limited -- perhaps more so than either they or their instructors recognize. We begin by discussing arguments (purported proofs) regarded as texts and validations of those arguments, i.e., reflections of individuals checking whether such arguments really are proofs of theorems. We relate the way the mathematics research community views proofs and their validations to ideas from reading comprehension and literary theory. We then give a detailed analysis of the four student-generated arguments and finally analyze the eight students' validations of them.
This article presents an ontological proof that God is impossible. I define an ‘impossibility’ as a condition which is inconceivable due to its a priori characteristics (e.g. a ‘square circle’). Accordingly, said conditions will not ever become conceivable, as they could in instances of a posteriori inconceivability (e.g. the notion that someone could touch a star without being burned). As the basis of this argument, I refer to an a priori observation (Primus, 2019) regarding our inability to imagine inconsistency (difference) within any point of space. This observation renders the notion of absolute power inconceivable, a priori. I briefly discuss the moral implications of religious faith in the context of Purism: a moral rationalist paradigm. I conclude that whilst belief in God can be aesthetically expressed, it should not be possessed as a material purpose, due to the illogicality of the latter category of belief and/or expression. With this article I provide conceptual delineation between harmless religious belief and expression—which, I argue, should be protected from persecution, as per any other artistic expression—and religious belief and expression which is materially harmful to society. Whilst I aim to protect religious freedom of expression on one hand, I duly aim to reduce instances of material faith in God(s) on the other. Finally, I aim to bring hope in the possibility of human salvation via technology—such that humans could exist indefinitely as ‘demigods’, defined by conditional, relative power over their environment.
This paper considers proof-theoretic semantics for necessity within Dummett's and Prawitz's framework. Inspired by a system of Pfenning's and Davies's, the language of intuitionist logic is extended by a higher-order operator which captures a notion of validity. A notion of relative necessity is defined in terms of it, which expresses a necessary connection between the assumptions and the conclusion of a deduction.
I discuss two types of evidential problems with the most widely touted experiments in evolutionary psychology, those performed by Leda Cosmides and interpreted by Cosmides and John Tooby. First, and despite Cosmides and Tooby's claims to the contrary, these experiments don't fulfil the standards of evidence of evolutionary biology. Second, Cosmides and Tooby claim to have performed a crucial experiment, and to have eliminated rival approaches. Though they claim that their results are consistent with their theory but contradictory to the leading non-evolutionary alternative, Pragmatic Reasoning Schemas theory, I argue that this claim is unsupported. In addition, some of Cosmides and Tooby's interpretations arise from misguided and simplistic understandings of evolutionary biology. While I endorse the incorporation of evolutionary approaches into psychology, I reject the claim of Cosmides and Tooby that a modular approach is the only one supported by evolutionary biology. Lewontin's critical examinations of the applications of adaptationist thinking provide a background of evidentiary standards against which to view the currently fashionable claims of evolutionary psychology.
Review of Dowek, Gilles, Computation, Proof, Machine, Cambridge University Press, Cambridge, 2015. Translation of Les Métamorphoses du calcul, Le Pommier, Paris, 2007. Translation from the French by Pierre Guillot and Marion Roman.
Gaisi Takeuti extended Gentzen's work to the higher-order case in the 1950s–1960s and proved the consistency of impredicative subsystems of analysis. He has been chiefly known as a successor of Hilbert's school, but we pointed out in a previous paper that Takeuti aimed to investigate the relationships between "minds" by carrying out his proof-theoretic project, rather than to prove the "reliability" of such impredicative subsystems of analysis. Moreover, as briefly explained there, his philosophical ideas can be traced back to Nishida's philosophy in the Kyoto school. For proving the consistency of such systems, it is crucial to prove the well-foundedness of the ordinals called "ordinal diagrams" developed for this purpose. Takeuti presented such arguments several times in order to show that they are admitted from his standpoint. As a starting point for investigating his finitist standpoint, we formulate the system of ordinal notations up to ε0 and reconstruct the well-foundedness arguments for them.
Background theories in science are used both to prove and to disprove that theory choice is underdetermined by data. The alleged proof appeals to the fact that experiments to decide between theories typically require auxiliary assumptions from other theories. If this generates a kind of underdetermination, it shows that standards of scientific inference are fallible and must be appropriately contextualized. The alleged disproof appeals to the possibility of suitable background theories to show that no theory choice can be timelessly or noncontextually underdetermined: foreground theories might be distinguished against different backgrounds. Philosophers have often replied to such a disproof by focussing their attention not on theories but on Total Sciences. If empirically equivalent Total Sciences were at stake, then there would be no background against which they could be differentiated. I offer several reasons to think that Total Science is a philosophers' fiction. No respectable underdetermination can be based on it.
Introduction to the Scientific Proof of the Natural Moral Law. This paper proves that Aquinas has a means of demonstrating and deriving both moral goodness and the natural moral law from human nature alone. Aquinas scientifically proves the existence of the natural moral law as the natural rule of human operations from human nature alone. The distinction between moral goodness and transcendental goodness is affirmed. This provides the intellectual tools to refute the G.E. Moore (Principia Ethica) attack against the natural law as committing a "naturalistic fallacy". This article proves that Moore instead commits the fallacy of equivocation between moral goodness and transcendental goodness in his very assertion of a "naturalistic fallacy" by the proponents of the natural moral law. In the process, the new deontological/Kantian theory of natural law as articulated by John Finnis, Robert George, and Germain Grisez is shown to be false historically and philosophically. Ethical naturalism is affirmed as a result.
Presumption is a complex concept in law, affecting the dialogue setting. However, it is not clear how presumptions work in everyday argumentation, in which the concept of “plausible argumentation” seems to encompass all kinds of inferences. By analyzing the legal notion of presumption, it appears that this type of reasoning combines argument schemes with reasoning from ignorance. Presumptive reasoning can be considered a particular form of reasoning, which needs positive or negative evidence to carry a probative weight on the conclusion. For this reason, presumptions shift the burden of providing evidence or explanations onto the interlocutor. The latter can provide new information or fail to do so: whereas in the first case the new information rebuts the presumption, in the second case the absence of information that the interlocutor could reasonably provide strengthens the conclusion of the presumptive reasoning. In both cases the result of the presumption is to strengthen the conclusion of the reasoning from lack of evidence. As shown in the legal cases, the effect of presumption is to shift the burden of proof to the interlocutor; however, the shift a presumption effects is only the shift of the evidential burden, or the burden of completing the incomplete knowledge from which the conclusion was drawn. The burden of persuasion remains on the proponent of the presumption. On the contrary, reasoning from definition in law is a conclusive proof, and shifts to the other party the burden to prove the contrary. This crucial difference can be applied to everyday argumentation: natural arguments can be divided into dialectical and presumptive arguments, leading to conclusions materially different in strength.
This is part two of a two-part paper in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. In this part of the paper, we extend the base theory of the first part of the paper with hierarchically typed truth-predicates and principles about the interaction of partial ground and truth. We show that our theory is a proof-theoretically conservative extension of the ramified theory of positive truth up to ε0.
Euclid's classic proof of the infinitude of prime numbers has been a standard model of reasoning in student textbooks and books of elementary number theory. It has withstood scrutiny for over 2000 years, but we shall prove that, despite the deceptive appearance of its analytical reasoning, it is tautological in nature. We shall argue that the proof is more of an observation about a general property of prime numbers than a natural-deduction-style exposition of the proof of their infinitude.
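For reference, here is the argument in its usual modern form (the form whose classification the paper disputes).

```latex
% Given any finite list of primes p_1, ..., p_n, consider
\[
N = p_1 p_2 \cdots p_n + 1 .
\]
% Each p_i divides the product, so N leaves remainder 1 on division by
% every p_i; hence no p_i divides N. Any prime factor of N is therefore
% a prime not on the list, so no finite list contains all the primes.
```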
This paper attempts to address the question of what logical strength theories of truth have by considering such questions as: If you take a theory T and add a theory of truth to it, how strong is the resulting theory, as compared to T? It turns out that, in a wide range of cases, we can get some nice answers to this question, but only if we work in a framework that is somewhat different from those usually employed in discussions of axiomatic theories of truth. These results are then used to address a range of philosophical questions connected with truth, such as what Tarski meant by "essential richness" and the so-called conservativeness argument against deflationism.

This draft dates from about 2009, with some significant updates having been made around 2011. Around then, however, I decided that the paper was becoming unmanageable and that I was trying to do too many things in it. I have therefore exploded the paper into several pieces, which will be published separately. These include "Disquotationalism and the Compositional Principles", "The Logical Strength of Compositional Principles", "Consistency and the Theory of Truth", and "What Is Essential Richness?" You should probably read those instead, since this draft remains a bit of a mess. Terminology and notation are inconsistent, and some of the proofs aren't quite right. So, caveat lector. I make it public only because it has been cited in a few places now.
Interpretations are generally regarded as the formal representation of the concept of translation. We do not subscribe to this view. A translation method must indeed establish relative consistency or have some uniformity. These are requirements of a translation. Yet one can be either more strict or more flexible than interpretations are. In this article, we will define a general scheme of translation. It should incorporate interpretations but also be compatible with more flexible methods. By doing so, we want to account for methods that seem to imply a sense of translation but are not reducible to interpretations. The main example will be the relative consistency proof between ZF and NBG given by Novak (1950). Further, we will explore a way of combining interpretations. This should account for truth conditions discarded by interpretations in translated theories.
For Aristotle, the shape of a physical body is perceptible per se (DA II.6, 418a8-9). As I read his position, shape is thus a causal power, as a physical body can affect our sense organs simply in virtue of possessing it. But this invites a challenge. If shape is an intrinsically powerful property, and indeed an intrinsically perceptible one, then why are the objects of geometrical reasoning, as such, inert and imperceptible? I here address Aristotle’s answer to that problem, focusing on the version of it that he presents in De caelo III.8. I argue that if we grant that Aristotle conceived of the shape of a sensible body as some kind of causal power, then the satisfactory resolution of that challenge pushes us to interpret him as having conceived of it as being, more specifically, an impure power—that is, as a property that is not only intrinsically powerful but also, in some way, intrinsically non-powerful as well. This is a notable result not only insofar as it illuminates Aristotle’s conception of shape but also insofar as it contributes to our knowledge of Aristotle’s theory of dunameis and his ontology more broadly.
The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ k: Is there a proof of length ≤ l?" When restricted to proofs with universal or existential cuts, this problem is shown to be (1) undecidable for linear or tree-like LK-proofs (corresponding to the undecidability of second-order unification), (2) undecidable for linear LKB-proofs (corresponding to the undecidability of semi-unification), and (3) decidable for tree-like LKB-proofs (corresponding to a decidable subproblem of semi-unification).