The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
This paper presents a sequent calculus and a dual domain semantics for a theory of definite descriptions in which these expressions are formalised in the context of complete sentences by a binary quantifier I. I forms a formula from two formulas. Ix[F, G] means ‘The F is G’. This approach has the advantage of incorporating scope distinctions directly into the notation. Cut elimination is proved for a system of classical positive free logic with I and it is shown to be sound and complete for the semantics. The system has a number of novel features and is briefly compared to the usual approach of formalising ‘the F’ by a term forming operator. It does not coincide with Hintikka’s and Lambert’s preferred theories, but the divergence is well-motivated and attractive.
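The way the binary quantifier incorporates scope distinctions directly into the notation can be illustrated with the classic ambiguity of a negated description; the formulas below are an illustrative sketch, not taken from the paper:

```latex
% 'The F is not G' comes apart into two non-equivalent readings,
% marked by where the negation sits relative to the binary quantifier:
\neg Ix[Fx, Gx]    % wide scope: it is not the case that the F is G
Ix[Fx, \neg Gx]    % narrow scope: the F is something that is not G
```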
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He extensively extended Hilbert's program in the sense that he formulated Gentzen's sequent calculus, conjectured that cut-elimination holds for it (Takeuti's conjecture), and obtained several stunning results in the 1950s–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese where he expressed his philosophical thoughts. In particular, he used several keywords such as "active intuition" and "self-reflection" from Nishida's philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti's philosophy of mathematics. In particular, after reviewing Takeuti's proof-theoretic results briefly, we describe some key elements in Takeuti's texts. By explaining these texts, we point out the connection between Takeuti's proof theory and Nishida's philosophy and explain the future goals of our project.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers' knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model theoretic interpretation in a compositional fashion. For example, *No whale flies* is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, *No whale flies* entails *No blue whale flies* and *No whale flies high*. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, *Under no circumstances would a whale fly* is grammatical, whereas *Under some circumstances would a whale fly* is not, corresponding to the downward vs. upward monotonic features of the preposed phrases. It is usually assumed that once a compositional model theoretic interpretation is assigned to all expressions, its fruits can be freely enjoyed by inferencing and syntax. What place might proof theory have in this picture?
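The model-theoretic clause in (A) and the downward-monotonicity inference in (B) can be sketched computationally; the toy model and its members are illustrative assumptions, not part of the abstract:

```python
# Truth clause for 'No F is G': the extensions of F and G do not overlap.
def no_F_is_G(F: set, G: set) -> bool:
    return F.isdisjoint(G)

# A toy model: whales and fliers share no member, so 'No whale flies' is true.
whales = {"moby", "willy"}
fliers = {"tweety", "daedalus"}
assert no_F_is_G(whales, fliers)

# The entailment in (B) reflects downward monotonicity: any subset of the
# whales (e.g. the blue whales) also fails to intersect the fliers.
blue_whales = {"moby"}  # a subset of whales
assert blue_whales <= whales
assert no_F_is_G(blue_whales, fliers)
```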
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
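The truth value set [0,1] mentioned here comes with the standard Gödel truth functions; the following is a minimal sketch of those textbook definitions, not of the hypersequent calculus itself:

```python
# Standard truth functions of Goedel logic on [0,1]:
# conjunction is minimum, disjunction is maximum, and
# implication is the residuum of minimum.

def g_and(a: float, b: float) -> float:
    return min(a, b)

def g_or(a: float, b: float) -> float:
    return max(a, b)

def g_impl(a: float, b: float) -> float:
    # Fully true when the antecedent does not exceed the consequent;
    # otherwise the value of the consequent.
    return 1.0 if a <= b else b

def g_not(a: float) -> float:
    # Negation as implication into 0: only falsity negates to truth.
    return g_impl(a, 0.0)

assert g_impl(0.3, 0.7) == 1.0
assert g_impl(0.7, 0.3) == 0.3
assert g_not(0.0) == 1.0 and g_not(0.5) == 0.0
```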
The traditional view of evidence in mathematics is that evidence is just proof and proof is just derivation. There are good reasons for thinking that this view should be rejected: it misrepresents both historical and current mathematical practice. Nonetheless, evidence, proof, and derivation are closely intertwined. This paper seeks to tease these concepts apart. It emphasizes the role of argumentation as a context shared by evidence, proofs, and derivations. The utility of argumentation theory, in general, and argumentation schemes, in particular, as a methodology for the study of mathematical practice is thereby demonstrated. Argumentation schemes represent an almost untapped resource for mathematics education. Notably, they provide a consistent treatment of rigorous and non-rigorous argumentation, thereby working to exhibit the continuity of reasoning in mathematics with reasoning in other areas. Moreover, since argumentation schemes are a comparatively mature methodology, there is a substantial body of existing work to draw upon, including some increasingly sophisticated software tools. Such tools have significant potential for the analysis and evaluation of mathematical argumentation. The first four sections of the paper address the relationships of evidence to proof, proof to derivation, argument to proof, and argument to evidence, respectively. The final section directly addresses some of the educational implications of an argumentation scheme account of mathematical reasoning.
I introduce the implantation argument, a new argument for the existence of God. Spatiotemporal extensions believed to exist outside of the mind, composing an external physical reality, cannot be composed either of atomlessness or of Democritean atoms; therefore the inner experience of an external reality containing spatiotemporal extensions believed to exist outside of the mind does not represent the external reality, and the mind is a mere cinematic-like mindscreen, implanted into the mind by a creator-God. It will be shown that only a creator-God can be the implanting creator of the mindscreen simulation, and that other simulation theories, such as Bostrom’s famous account, which do not involve a creator-God as the mindscreen simulation creator, involve a reification fallacy.
We introduce translations between display calculus proofs and labeled calculus proofs in the context of tense logics. First, we show that every derivation in the display calculus for the minimal tense logic Kt extended with general path axioms can be effectively transformed into a derivation in the corresponding labeled calculus. Concerning the converse translation, we show that for Kt extended with path axioms, every derivation in the corresponding labeled calculus can be put into a special form that is translatable to a derivation in the associated display calculus. A key insight in this converse translation is a canonical representation of display sequents as labeled polytrees. Labeled polytrees, which represent equivalence classes of display sequents modulo display postulates, also shed light on related correspondence results for tense logics.
This paper considers proof-theoretic semantics for necessity within Dummett's and Prawitz's framework. Inspired by a system of Pfenning's and Davies's, the language of intuitionist logic is extended by a higher-order operator which captures a notion of validity. A notion of relative necessity is defined in terms of it, which expresses a necessary connection between the assumptions and the conclusion of a deduction.
In this dissertation, we shall investigate whether Tennant's criterion for paradoxicality (TCP) can be a correct criterion for genuine paradoxes and whether the requirement of a normal derivation (RND) can be a proof-theoretic solution to the paradoxes. Tennant’s criterion has two types of counterexamples. One is a case which raises the problem of overgeneration, in that TCP renders a non-paradoxical derivation paradoxical. The other is a case which generates the problem of undergeneration, in that TCP makes a paradoxical derivation non-paradoxical. Chapter 2 deals with the problem of undergeneration and Chapter 3 concerns the problem of overgeneration. Chapter 2 argues that Tennant’s diagnosis of the counterexample which applies the CR-rule and causes the undergeneration problem is not correct and presents a solution to the problem of undergeneration. Chapter 3 argues that Tennant’s diagnosis of the counterexample raising the overgeneration problem is wrong and provides a solution to the problem. Finally, Chapter 4 addresses what should be explicated in order for RND to be a proof-theoretic solution to the paradoxes.
A question, long discussed by legal scholars, has recently provoked a considerable amount of philosophical attention: ‘Is it ever appropriate to base a legal verdict on statistical evidence alone?’ Many philosophers who have considered this question reject legal reliance on bare statistics, even when the odds of error are extremely low. This paper develops a puzzle for the dominant theories concerning why we should eschew bare statistics. Namely, there seem to be compelling scenarios in which there are multiple sources of incriminating statistical evidence. As we conjoin different types of statistical evidence, it becomes increasingly incredible to suppose that a positive verdict would be impermissible. I suggest that none of the dominant views in the literature can easily accommodate such cases, and close by offering a diagnosis of my own.
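The multiplicative effect of conjoining statistical evidence that drives the puzzle can be sketched as follows; the error rates and the independence assumption are illustrative, not figures from the paper:

```python
# Probability that the defendant is innocent despite several independent
# sources of incriminating statistical evidence, each with the stated
# chance of pointing to an innocent person.
def prob_innocent(error_rates: list) -> float:
    p = 1.0
    for r in error_rates:
        p *= r  # independence assumed: errors must co-occur
    return p

one_source = prob_innocent([0.01])              # a single 1%-error statistic
three_sources = prob_innocent([0.01, 0.05, 0.02])
assert one_source == 0.01
assert three_sources < 1e-4   # conjoined, the odds of error become minuscule
```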
Restall set forth a "consecution" calculus in his "An Introduction to Substructural Logics." This is a natural deduction type sequent calculus where the structural rules play an important role. This paper looks at different ways of extending Restall's calculus. It is shown that Restall's weak soundness and completeness result with regard to a Hilbert calculus can be extended to a strong one so as to encompass what Restall calls proofs from assumptions. It is also shown how to extend the calculus so as to validate the metainferential rule of reasoning by cases, as well as certain theory-dependent rules.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. REPRINTED 1976. Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is: 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link.
Recent years have seen fresh impetus brought to debates about the proper role of statistical evidence in the law. Recent work largely centres on a set of puzzles known as the ‘proof paradox’. While these puzzles may initially seem academic, they have important ramifications for the law: raising key conceptual questions about legal proof, and practical questions about DNA evidence. This article introduces the proof paradox, why we should care about it, and new work attempting to resolve it.
We introduce an effective translation from proofs in the display calculus to proofs in the labelled calculus in the context of tense logics. We identify the labelled calculus proofs in the image of this translation as those built from labelled sequents whose underlying directed graph possesses certain properties. For the basic normal tense logic Kt, the image is shown to be the set of all proofs in the labelled calculus G3Kt.
This work provides proof-search algorithms and automated counter-model extraction for a class of STIT logics. With this, we answer an open problem concerning syntactic decision procedures and cut-free calculi for STIT logics. A new class of cut-free complete labelled sequent calculi G3Ldm^m_n, for multi-agent STIT with at most n-many choices, is introduced. We refine the calculi G3Ldm^m_n through the use of propagation rules and demonstrate the admissibility of their structural rules, resulting in auxiliary calculi Ldm^m_nL. In the single-agent case, we show that the refined calculi Ldm^m_nL derive theorems within a restricted class of (forestlike) sequents, allowing us to provide proof-search algorithms that decide single-agent STIT logics. We prove that the proof-search algorithms are correct and terminate.
Until recently, discussion of virtues in the philosophy of mathematics has been fleeting and fragmentary at best. But in the last few years this has begun to change. As virtue theory has grown ever more influential, not just in ethics where virtues may seem most at home, but particularly in epistemology and the philosophy of science, some philosophers have sought to push virtues out into unexpected areas, including mathematics and its philosophy. But there are some mathematicians already there, ready to meet them, who have explicitly invoked virtues in discussing what is necessary for a mathematician to succeed. In both ethics and epistemology, virtue theory tends to emphasize character virtues, the acquired excellences of people. But people are not the only sort of thing whose excellences may be identified as virtues. Theoretical virtues have attracted attention in the philosophy of science as components of an account of theory choice. Within the philosophy of mathematics, and mathematics itself, attention to virtues has emerged from a variety of disparate sources. Theoretical virtues have been put forward both to analyse the practice of proof and to justify axioms; intellectual virtues have found multiple applications in the epistemology of mathematics; and ethical virtues have been offered as a basis for understanding the social utility of mathematical practice. Indeed, some authors have advocated virtue epistemology as the correct epistemology for mathematics (and perhaps even as the basis for progress in the metaphysics of mathematics). This topical collection brings together several of the researchers who have begun to study mathematical practices from a virtue perspective with the intention of consolidating and encouraging this trend.
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because generally impure proofs are simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.
Gaisi Takeuti extended Gentzen's work to the higher-order case in the 1950s–1960s and proved the consistency of impredicative subsystems of analysis. He has been chiefly known as a successor of Hilbert's school, but we pointed out in the previous paper that Takeuti aimed to investigate the relationships between "minds" by carrying out his proof-theoretic project rather than proving the "reliability" of such impredicative subsystems of analysis. Moreover, as briefly explained there, his philosophical ideas can be traced back to Nishida's philosophy in the Kyoto school. For proving the consistency of such systems, it is crucial to prove the well-foundedness of the ordinals called "ordinal diagrams" developed for that purpose. Takeuti presented such arguments several times in order to show that they are admitted from his standpoint. As a starting point for investigating his finitist standpoint, we formulate the system of ordinal notations up to ε0 and reconstruct the well-foundedness arguments for it.
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just as consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately.
However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion of Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. From the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, second-order logic and Saul Kripke's approach using Kleenean logic will also be shown to fail in a similar fashion.
I explain why model theory is unsatisfactory as a semantic theory and has drawbacks as a tool for proofs on logic systems. I then motivate and develop an alternative, truth-valuational substitutional approach (TVS), and prove with it the soundness and completeness of the first order Predicate Calculus with identity and of Modal Propositional Calculus. Modal logic is developed without recourse to possible worlds. Along the way I answer a variety of difficulties that have been raised against TVS and show that, as applied to several central questions, model-theoretic semantics can be considered TVS in disguise. The conclusion is that the truth-valuational substitutional approach is an adequate tool for many of our logic inquiries, conceptually preferable over model-theoretic semantics. Another conclusion is that formal logic is independent of semantics, apart from its use of the notion of truth, but that even with respect to it its assumptions are minimal.
ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind--they must be thought of either as disguised linear theories or theories of a third kind (see postscript below). The second purpose of this part is to develop some of the main ideas needed in constructing a comprehensive theory of proof. The reason for choosing the linear and suppositional theories for this purpose is that the linear theory includes only rules of a very simple nature, and the suppositional theory can be seen as the result of making the linear theory more comprehensive. CORRECTION: At the time these articles were written the word ‘proof’, especially in the phrase ‘proof from hypotheses’, was widely used to refer to what were earlier and are now called deductions. I ask your forgiveness. I have forgiven Church and Henkin who misled me.
The work provides comprehensively definitive, unconditional proofs of Riemann's hypothesis, Goldbach's conjecture, the 'twin primes' conjecture, the Collatz conjecture, the Newcomb-Benford theorem, and the Quine-Putnam Indispensability thesis. The proofs validate holonomic metamathematics, meta-ontology, new number theory, new proof theory, new philosophy of logic, and unconditional disproof of the P/NP problem. The proofs, metatheory, and definitions are also confirmed and verified with graphic proof of intrinsic enabling and sustaining principles of reality.
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternative accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
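The threshold conception criticised here can be made concrete with a minimal sketch; the 0.50 and 0.90 cutoffs follow the glosses above, while the 0.75 figure for clear and convincing evidence is an assumed illustration:

```python
# The 'common conception': each legal standard of proof is a bare numerical
# threshold on the likelihood of the claim given the evidence.
THRESHOLDS = {
    "preponderance of evidence": 0.50,
    "clear and convincing evidence": 0.75,  # assumed for illustration
    "beyond reasonable doubt": 0.90,
}

def meets_standard(likelihood: float, standard: str) -> bool:
    # On the threshold view, a burden is satisfied exactly when the
    # likelihood given the evidence reaches the standard's cutoff.
    return likelihood >= THRESHOLDS[standard]

assert meets_standard(0.6, "preponderance of evidence")
assert not meets_standard(0.6, "beyond reasonable doubt")
```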
Infectious logics are systems in which some truth-value is assigned to a compound formula whenever it is assigned to one of its components. This paper studies four-valued infectious logics as the basis of transparent theories of truth. This take is motivated as a way to treat different pathological sentences differently, namely, by allowing some of them to be truth-value gluts and some others to be truth-value gaps, and as a way to treat the semantic pathology suffered by at least some of these sentences as infectious. This leads us to consider four distinct four-valued logics: one where truth-value gaps are infectious, but gluts are not; one where truth-value gluts are infectious, but gaps are not; and two logics where both gluts and gaps are infectious, in some sense. Additionally, we focus on the proof theory of these systems, by offering a discussion of two related topics. On the one hand, we prove some limitations regarding the possibility of providing standard Gentzen sequent calculi for these systems, by dualizing and extending some recent results for infectious logics. On the other hand, we provide sound and complete four-sided sequent calculi, arguing that the most important technical and philosophical features taken into account to usually prefer standard calculi are, indeed, enjoyed by the four-sided systems.
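The defining behaviour of an infectious value can be sketched with a toy four-valued conjunction in which the gap is infectious; this illustrates the general idea only, not the exact matrices of the paper's four systems:

```python
# Four values: true, glut (both), gap (neither), false.
T, B, N, F = "t", "b", "n", "f"
# A truth order on the non-infectious values, matching an FDE-style
# conjunction restricted to {f, b, t}.
ORDER = {F: 0, B: 1, T: 2}

def conj(x: str, y: str) -> str:
    if N in (x, y):
        return N  # the infectious clause: a gap component spreads
    return x if ORDER[x] <= ORDER[y] else y  # otherwise: take the meet

assert conj(T, N) == N  # one gap component infects the whole compound
assert conj(B, F) == F
assert conj(T, B) == B
```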
Most human actions are complex, but some of them are basic. Which are these? In this paper, I address this question by invoking slips, a common kind of mistake. The proposal is this: an action is basic if and only if it is not possible to slip in performing it. The argument discusses some well-established results from the psychology of language production in the context of a philosophical theory of action. In the end, the proposed criterion is applied to discuss some well-known theories of basic actions.
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
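The Bayesian cashing-out of the evidential burden can be sketched with an odds-form update: a claimant with an improbable hypothesis carries the burden of supplying evidence with a high likelihood ratio. The priors and ratios below are illustrative assumptions:

```python
# Bayes' theorem in odds form: posterior odds = likelihood ratio * prior odds.
def posterior(prior: float, likelihood_ratio: float) -> float:
    prior_odds = prior / (1.0 - prior)
    post_odds = likelihood_ratio * prior_odds
    return post_odds / (1.0 + post_odds)

# A paranormal claim with a low prior needs strong evidence (LR >> 1)
# before it becomes more probable than not.
p = posterior(prior=0.01, likelihood_ratio=10.0)
assert 0.09 < p < 0.10   # still improbable after modest evidence

p2 = posterior(prior=0.01, likelihood_ratio=1000.0)
assert p2 > 0.9          # only very strong evidence discharges the burden
```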
Review of Dowek, Gilles, Computation, Proof, Machine, Cambridge University Press, Cambridge, 2015. Translation of Les Métamorphoses du calcul, Le Pommier, Paris, 2007. Translation from the French by Pierre Guillot and Marion Roman.
This paper considers logics which are formally dual to intuitionistic logic in order to investigate a co-constructive logic for proofs and refutations. This is philosophically motivated by a set of problems regarding the nature of constructive truth, and its relation to falsity. It is well known both that intuitionism cannot deal constructively with negative information, and that defining falsity by means of intuitionistic negation leads, under widely-held assumptions, to a justification of bivalence. For example, we do not want to equate falsity with the non-existence of a proof since this would render a statement such as “pi is transcendental” false prior to 1882. In addition, the intuitionist account of negation as shorthand for the derivation of absurdity is inadequate, particularly outside of purely mathematical contexts. To deal with these issues, I investigate the dual of intuitionistic logic, co-intuitionistic logic, as a logic of refutation, alongside intuitionistic logic of proofs. Direct proof and refutation are dual to each other, and are constructive, whilst there also exist syntactic, weak, negations within both logics. In this respect, the logic of refutation is weakly paraconsistent in the sense that it allows for statements for which, neither they, nor their negation, are refuted. I provide a proof theory for the co-constructive logic, a formal dualizing map between the logics, and a Kripke-style semantics. This is given an intuitive philosophical rendering in a re-interpretation of Kolmogorov’s logic of problems.
Theism and its cousins, atheism and agnosticism, are seldom taken to task for logical-epistemological incoherence. This paper provides a condensed proof that not only theism, but atheism and agnosticism as well, are all of them conceptually self-undermining, and for the same reason: All attempt to make use of the concept of “transcendent reality,” which here is shown not only to lack meaning, but to preclude the very possibility of meaning. In doing this, the incoherence of theism, atheism, and agnosticism is secondary to the more general incoherence of any attempts to refer to so-called “transcendent realities.” A recognition of the conceptually fundamental incoherence of theism, atheism, and agnosticism compels our rational assent to a position the author names “paratheism”.
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
Introduction to the Scientific Proof of the Natural Moral Law. This paper argues that Aquinas has a means of demonstrating and deriving both moral goodness and the natural moral law from human nature alone. Aquinas scientifically proves the existence of the natural moral law as the natural rule of human operations from human nature alone. The distinction between moral goodness and transcendental goodness is affirmed. This provides the intellectual tools to refute G.E. Moore’s attack (in Principia Ethica) against the natural law as committing a “naturalistic fallacy”. This article argues that instead Moore commits the fallacy of equivocation between moral goodness and transcendental goodness in his very assertion of a “naturalistic fallacy” by the proponents of the natural moral law. In the process, the new deontological/Kantian theory of natural law as articulated by John Finnis, Robert George, and Germain Grisez is shown to be false historically and philosophically. Ethical naturalism is affirmed as a result.
The work provides comprehensively definitive, unconditional proofs of Riemann's hypothesis, Goldbach's conjecture, the 'twin primes' conjecture, the Collatz conjecture, the Newcomb-Benford theorem, and the Quine-Putnam Indispensability thesis. The proofs validate holonomic metamathematics, meta-ontology, new number theory, new proof theory, new philosophy of logic, and unconditional disproof of the P/NP problem. The proofs, metatheory, and definitions are also confirmed and verified with graphic proof of intrinsic enabling and sustaining principles of reality.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
According to one of Leibniz's theories of contingency, a proposition is contingent if and only if it cannot be proved in a finite number of steps. It has been argued that this faces the Problem of Lucky Proof, namely that we could begin by analysing the concept ‘Peter’ by saying that ‘Peter is a denier of Christ and …’, thereby having proved the proposition ‘Peter denies Christ’ in a finite number of steps. It also faces a more general but related problem that we dub the Problem of Guaranteed Proof. We argue that Leibniz has an answer to these problems since for him one has not proved that ‘Peter denies Christ’ unless one has also proved that ‘Peter’ is a consistent concept, an impossible task since it requires the full decomposition of the infinite concept ‘Peter’. We defend this view from objections found in the literature and maintain that for Leibniz all truths about created individual beings are contingent.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. This review, illustrative rather than exhaustive, is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
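The majority-voting paradox that motivated this field can be illustrated with the classic discursive dilemma; the profile below is a standard textbook example, not taken from the paper itself.

```python
# Discursive dilemma: propositionwise majority voting over consistent
# individual judgment sets can produce an inconsistent collective set.
# Three judges vote on p, q, and the conjunction p & q; each individual
# set below is internally consistent.
judgments = [
    {"p": True,  "q": True,  "p&q": True},   # judge 1
    {"p": True,  "q": False, "p&q": False},  # judge 2
    {"p": False, "q": True,  "p&q": False},  # judge 3
]

def majority(profile, prop):
    """Return the propositionwise majority verdict on prop."""
    yes = sum(j[prop] for j in profile)
    return yes > len(profile) / 2

collective = {prop: majority(judgments, prop) for prop in ("p", "q", "p&q")}
print(collective)  # {'p': True, 'q': True, 'p&q': False}

# The collective accepts p and accepts q but rejects p & q: logically
# inconsistent, even though every individual judgment set was consistent.
assert collective["p"] and collective["q"] and not collective["p&q"]
```

Escape routes discussed in the literature correspond to relaxing one of the conditions this sketch silently assumes, e.g. propositionwise (issue-by-issue) aggregation or universal domain.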
An adequate semantics for generic sentences must stake out positions across a range of contested territory in philosophy and linguistics. For this reason the study of generic sentences is a venue for investigating different frameworks for understanding human rationality as manifested in linguistic phenomena such as quantification, classification of individuals under kinds, defeasible reasoning, and intensionality. Despite the wide variety of semantic theories developed for generic sentences, to date these theories have been almost universally model-theoretic and representational. This essay outlines a range of proof-theoretic analyses for characterizing generics. Particular attention is given to an expressivist proof theory that can be traced to 1) work on logical syntax that Carnap undertook prior to his turn toward truth-conditional model theory in the late 1930s, and 2) research on sequent calculi and natural deduction systems that originates in work from Gentzen and Prawitz.
Born’s rule, which interprets the square of the wave function as the probability of getting a specific value in measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically using different approaches, such as the frequency operator approach, many-worlds theory, Bayesian probability and envariance, the literature shows that the arguments in each of these methods are circular. In view of the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al, Science, 329, 418 [2010]). But a convincing analytical proof of Born’s rule will make us understand the basic process responsible for the exact square dependency of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience into quantum mechanics, we prove Born’s rule for the statistical interpretation of the wave function.
Writing strategic documents is a major practice of many actors striving to see their educational ideas realised in the curriculum. In these documents, arguments are systematically developed to establish the legitimacy of a new educational goal and the competence to make claims about it. Through a qualitative analysis of the writing strategies used in these texts, I show how two of the main actors in Czech educational discourse have developed a proof that a new educational goal is needed. I draw on the connection between the relational approach in the sociology of education and Lyotard’s analytical semantics of instances in the event. The comparison of the writing strategies in the two documents reveals differences in the formation of a particular pattern of justification. In one case the texts function as a herald of pure reality, and in the other case as a messenger of other witnesses. This reveals different regimens of proof, although both were written as prescriptive directives: normative models of the educational world.
The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ k: Is there a proof of length ≤ l?" When restricted to proofs with universal or existential cuts, this problem is shown to be (1) undecidable for linear or tree-like LK-proofs (corresponding to the undecidability of second-order unification), (2) undecidable for linear LKB-proofs (corresponding to the undecidability of semi-unification), and (3) decidable for tree-like LKB-proofs (corresponding to a decidable subproblem of semi-unification).
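The decision problem at the heart of this abstract can be restated in notation; the formulation below is a paraphrase of the quoted question, not a quotation from the paper.

```latex
% k-l-compressibility, restated (paraphrase of the abstract's question):
\[
  \textsc{Compress}(k,l):\quad
  \text{given a cut-free \textbf{LK}-proof } \pi \text{ of a sequent }
  \Gamma \vdash \Delta \text{ with } |\pi| = k \text{ and } l \le k,
\]
\[
  \text{decide whether there exists a proof } \pi' \text{ of }
  \Gamma \vdash \Delta \text{, possibly with cuts, such that } |\pi'| \le l .
\]
```

The three (un)decidability results then classify this problem by calculus (LK vs. LKB) and proof shape (linear vs. tree-like), with quantified cuts as the source of the hardness.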
This is part one of a two-part paper, in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. This allows us to connect theories of partial ground with axiomatic theories of truth. In this part of the paper, we develop an axiomatization of the relation of partial ground over the truths of arithmetic and show that the theory is a proof-theoretically conservative extension of the theory PT of positive truth. We construct models for the theory and draw some conclusions for the semantics of conceptualist ground.
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.

For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along.

To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
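The correspondence the essay describes can be glimpsed in a single minimal instance; this is standard material on propositions as types, not a quotation from the essay.

```latex
% Propositions as types, smallest instance: the natural deduction proof
% of A -> A corresponds to the identity function of the lambda calculus.
\[
  \dfrac{[A]^{x}}{A \to A}\;{\to}\mathrm{I}^{x}
  \qquad\longleftrightarrow\qquad
  \lambda x.\,x \;:\; A \to A
\]
% Simplifying a proof corresponds to executing a program: eliminating a
% detour (an introduction immediately followed by an elimination) is
% exactly beta-reduction.
\[
  (\lambda x.\,x)\,M \;\longrightarrow_{\beta}\; M
\]
```

Here implication introduction discharges the hypothesis $x : A$, just as lambda abstraction binds the variable $x$; implication elimination (modus ponens) is function application.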
Hegel endorsed proofs of the existence of God, and also believed God to be a person. Some of his interpreters ignore these apparently retrograde tendencies, shunning them in favor of the philosopher's more forward-looking contributions. Others embrace Hegel's religious thought, but attempt to recast his views as less reactionary than they appear to be. Robert Williams's latest monograph belongs to a third category: he argues that Hegel's positions in philosophical theology are central to his philosophy writ large. The book is diligently researched, and marshals an impressive amount of textual evidence concerning Hegel's view of the proofs, his theory of personhood, and his views on religious community. Many of...
Transfinite ordinal numbers enter mathematical practice mainly via the method of definition by transfinite recursion. Outside of axiomatic set theory, there is a significant mathematical tradition of works recasting proofs by transfinite recursion in other terms, mostly with the intention of eliminating the ordinals from the proofs. Leaving aside the different motivations that lead to each specific case, we investigate the mathematics of this practice of proof transformation and address the problem of formalising the philosophical notion of elimination which characterises this move.
Interesting as they are by themselves in philosophy and mathematics, paradoxes can be made even more fascinating when turned into proofs and theorems. For example, Russell’s paradox, which overthrew Frege’s logical edifice, is now a classical theorem in set theory, to the effect that no set contains all sets. Paradoxes can be used in proofs of some other theorems—thus the Liar paradox has been used in the classical proof of Tarski’s theorem on the undefinability of truth in sufficiently rich languages. This paradox (as well as Richard’s paradox) appears implicitly in Gödel’s proof of his celebrated first incompleteness theorem. In this paper, we study Yablo’s paradox from the viewpoint of first- and second-order logics. We prove that a formalization of Yablo’s paradox (which is second order in nature) is non-first-orderizable in the sense of George Boolos (1984).
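Yablo's paradox, in its standard presentation, runs as follows; the paper's own second-order formalization may differ in detail.

```latex
% Yablo's sequence: infinitely many sentences, each saying that all
% later sentences are untrue -- no sentence refers to itself.
\[
  S_n :\quad \forall k > n \;\; \neg\,\mathrm{True}(S_k)
  \qquad (n \in \mathbb{N})
\]
% If some S_n were true, then S_{n+1} would be untrue; but S_n also
% makes every S_k with k > n+1 untrue, which is exactly what S_{n+1}
% asserts, so S_{n+1} would be true: contradiction. Hence every S_n is
% untrue -- but then each S_n asserts something true: contradiction
% again, without any self-reference.
```

The universal quantification over the infinite tail of the sequence is what makes the natural formalization second order, and hence a candidate for non-first-orderizability in Boolos's sense.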
We report on an exploratory study of the way eight mid-level undergraduate mathematics majors read and reflected on four student-generated arguments purported to be proofs of a single theorem. The results suggest that mid-level undergraduates tend to focus on surface features of such arguments and that their ability to determine whether arguments are proofs is very limited -- perhaps more so than either they or their instructors recognize. We begin by discussing arguments (purported proofs) regarded as texts and validations of those arguments, i.e., reflections of individuals checking whether such arguments really are proofs of theorems. We relate the way the mathematics research community views proofs and their validations to ideas from reading comprehension and literary theory. We then give a detailed analysis of the four student-generated arguments and finally analyze the eight students' validations of them.
This is part two of a two-part paper in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. In this part of the paper, we extend the base theory of the first part with hierarchically typed truth-predicates and principles about the interaction of partial ground and truth. We show that our theory is a proof-theoretically conservative extension of the ramified theory of positive truth up to.