The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers’ knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, No whale flies is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, No whale flies entails No blue whale flies and No whale flies high. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, Under no circumstances would a whale fly is grammatical, whereas Under some circumstances would a whale fly is not, corresponding to the downward vs. upward monotonic features of the preposed phrases.
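To make the model-theoretic reading in (A) and the monotonicity pattern in (B) concrete, here is a minimal sketch (not from the paper; the sets and names are invented for illustration) that treats "No A is B" as the disjointness of A and B and checks that the determiner is downward monotonic in both arguments:

```python
def no(A: set, B: set) -> bool:
    """Truth of 'No A is B' in a toy model: A and B share no members."""
    return A.isdisjoint(B)

# Hypothetical toy model in which no whale flies.
whales = {"moby", "willy"}
blue_whales = {"moby"}            # a subset of whales
fliers = {"tweety", "woodstock"}
high_fliers = {"tweety"}          # a subset of fliers ("flies high" entails "flies")

assert no(whales, fliers)         # "No whale flies" is true in the model
assert no(blue_whales, fliers)    # downward monotone in the first argument
assert no(whales, high_fliers)    # downward monotone in the second argument
```

The two assertions after the first illustrate the entailments in (B): shrinking either set preserves the truth of a no-sentence, which is the downward monotonicity that the preposing facts in (C) are sensitive to.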
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
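For background (these are the standard truth functions for Gödel logic over [0,1]; they are not quoted from the paper), the semantics the abstract refers to evaluates connectives and quantifiers as follows:

```latex
\[
  v(A \wedge B) = \min\{v(A), v(B)\}, \qquad
  v(A \vee B) = \max\{v(A), v(B)\},
\]
\[
  v(A \rightarrow B) =
    \begin{cases}
      1    & \text{if } v(A) \le v(B),\\
      v(B) & \text{otherwise,}
    \end{cases}
  \qquad
  v(\forall x\, A(x)) = \inf_{d \in D} v(A(d)), \qquad
  v(\exists x\, A(x)) = \sup_{d \in D} v(A(d)).
\]
```

A formula is valid when it takes value 1 under every such valuation; it is the choice of the full interval [0,1] as the truth value set that distinguishes the Takeuti-Titani logic from Gödel logics based on other truth value sets.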
Gaisi Takeuti (1926–2017) is one of the most distinguished logicians in proof theory after Hilbert and Gentzen. He extensively extended Hilbert's program in the sense that he formulated Gentzen's sequent calculus, conjectured that cut-elimination holds for it (Takeuti's conjecture), and obtained several stunning results in the 1950–60s towards the solution of his conjecture. Though he has been known chiefly as a great mathematician, he wrote many papers in English and Japanese where he expressed his philosophical thoughts. In particular, he used several keywords such as "active intuition" and "self-reflection" from Nishida's philosophy. In this paper, we aim to describe a general outline of our project to investigate Takeuti's philosophy of mathematics. In particular, after reviewing Takeuti's proof-theoretic results briefly, we describe some key elements in Takeuti's texts. By explaining these texts, we point out the connection between Takeuti's proof theory and Nishida's philosophy and explain the future goals of our project.
Recent years have seen fresh impetus brought to debates about the proper role of statistical evidence in the law. Recent work largely centres on a set of puzzles known as the ‘proof paradox’. While these puzzles may initially seem academic, they have important ramifications for the law: raising key conceptual questions about legal proof, and practical questions about DNA evidence. This article introduces the proof paradox, why we should care about it, and new work attempting to resolve it.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. Reprinted 1976 in Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with the application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for the improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is: 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link .
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because generally impure proofs are simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.
Most human actions are complex, but some of them are basic. Which are these? In this paper, I address this question by invoking slips, a common kind of mistake. The proposal is this: an action is basic if and only if it is not possible to slip in performing it. The argument discusses some well-established results from the psychology of language production in the context of a philosophical theory of action. In the end, the proposed criterion is applied to discuss some well-known theories of basic actions.
Theism and its cousins, atheism and agnosticism, are seldom taken to task for logical-epistemological incoherence. This paper provides a condensed proof that not only theism, but atheism and agnosticism as well, are all of them conceptually self-undermining, and for the same reason: All attempt to make use of the concept of “transcendent reality,” which here is shown not only to lack meaning, but to preclude the very possibility of meaning. In doing this, the incoherence of theism, atheism, and agnosticism is secondary to the more general incoherence of any attempts to refer to so-called “transcendent realities.” A recognition of the conceptually fundamental incoherence of theism, atheism, and agnosticism compels our rational assent to a position the author names “paratheism.”
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
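One common way to make an evidential burden of proof precise in Bayesian terms (offered here only as an illustration of the general idea, not as the authors' own formulation) is via posterior odds: the party advancing claim H bears the burden of supplying evidence E whose likelihood ratio pushes the odds of H past an agreed threshold k.

```latex
\[
  \underbrace{\frac{P(H \mid E)}{P(\neg H \mid E)}}_{\text{posterior odds}}
  \;=\;
  \underbrace{\frac{P(E \mid H)}{P(E \mid \neg H)}}_{\text{likelihood ratio}}
  \cdot
  \underbrace{\frac{P(H)}{P(\neg H)}}_{\text{prior odds}}
  \;>\; k.
\]
```

On such a reading, extraordinary claims carry a heavier burden simply because their prior odds are low, so a larger likelihood ratio is needed to clear the same threshold.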
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternative accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately. However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion of Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. From the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, also second-order logic and Saul Kripke's approach using Kleenean logic will be shown to fail in a similar fashion.
This work provides proof-search algorithms and automated counter-model extraction for a class of STIT logics. With this, we answer an open problem concerning syntactic decision procedures and cut-free calculi for STIT logics. A new class of cut-free complete labelled sequent calculi G3LdmL^m_n, for multi-agent STIT with at most n-many choices, is introduced. We refine the calculi G3LdmL^m_n through the use of propagation rules and demonstrate the admissibility of their structural rules, resulting in auxiliary calculi Ldm^m_nL. In the single-agent case, we show that the refined calculi Ldm^m_nL derive theorems within a restricted class of (forestlike) sequents, allowing us to provide proof-search algorithms that decide single-agent STIT logics. We prove that the proof-search algorithms are correct and terminate.
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
In this dissertation, we shall investigate whether Tennant's criterion for paradoxicality (TCP) can be a correct criterion for genuine paradoxes and whether the requirement of a normal derivation (RND) can be a proof-theoretic solution to the paradoxes. Tennant's criterion has two types of counterexamples. One is a case which raises the problem of overgeneration, in that TCP makes a paradoxical derivation non-paradoxical. The other is one which generates the problem of undergeneration, in that TCP renders a non-paradoxical derivation paradoxical. Chapter 2 deals with the problem of undergeneration and Chapter 3 concerns the problem of overgeneration. Chapter 2 argues that Tennant's diagnosis of the counterexample which applies the CR-rule and causes the undergeneration problem is not correct and presents a solution to the problem of undergeneration. Chapter 3 argues that Tennant's diagnosis of the counterexample raising the overgeneration problem is wrong and provides a solution to the problem. Finally, Chapter 4 addresses what should be explicated in order for RND to be a proof-theoretic solution to the paradoxes.
This paper considers proof-theoretic semantics for necessity within Dummett's and Prawitz's framework. Inspired by a system of Pfenning's and Davies's, the language of intuitionist logic is extended by a higher-order operator which captures a notion of validity. A notion of relative necessity is defined in terms of it, which expresses a necessary connection between the assumptions and the conclusion of a deduction.
Review of Dowek, Gilles, Computation, Proof, Machine, Cambridge University Press, Cambridge, 2015. Translation of Les Métamorphoses du calcul, Le Pommier, Paris, 2007. Translation from the French by Pierre Guillot and Marion Roman.
I discuss two types of evidential problems with the most widely touted experiments in evolutionary psychology, those performed by Leda Cosmides and interpreted by Cosmides and John Tooby. First, and despite Cosmides and Tooby's claims to the contrary, these experiments don't fulfil the standards of evidence of evolutionary biology. Second, Cosmides and Tooby claim to have performed a crucial experiment, and to have eliminated rival approaches. Though they claim that their results are consistent with their theory but contradictory to the leading non-evolutionary alternative, Pragmatic Reasoning Schemas theory, I argue that this claim is unsupported. In addition, some of Cosmides and Tooby's interpretations arise from misguided and simplistic understandings of evolutionary biology. While I endorse the incorporation of evolutionary approaches into psychology, I reject the claims of Cosmides and Tooby that a modular approach is the only one supported by evolutionary biology. Lewontin's critical examinations of the applications of adaptationist thinking provide a background of evidentiary standards against which to view the currently fashionable claims of evolutionary psychology.
Introduction to the Scientific Proof of the Natural Moral Law. This paper proves that Aquinas has a means of demonstrating and deriving both moral goodness and the natural moral law from human nature alone. Aquinas scientifically proves the existence of the natural moral law as the natural rule of human operations from human nature alone. The distinction between moral goodness and transcendental goodness is affirmed. This provides the intellectual tools to refute the G.E. Moore (Principles of Ethics) attack against the natural law as committing a "naturalistic fallacy". This article proves that instead Moore commits the fallacy of equivocation between moral goodness and transcendental goodness in his very assertion of a "naturalistic fallacy" by the proponents of the natural moral law. In the process, the new deontological/Kantian theory of natural law as articulated by John Finnis, Robert George, and Germain Grisez is shown to be false historically and philosophically. Ethical naturalism is affirmed as a result.
Writing strategic documents is a major practice of many actors striving to see their educational ideas realised in the curriculum. In these documents, arguments are systematically developed to create the legitimacy of a new educational goal and competence to make claims about it. Through a qualitative analysis of the writing strategies used in these texts, I show how two of the main actors in the Czech educational discourse have developed a proof that a new educational goal is needed. I draw on the connection of the relational approach in the sociology of education with Lyotard’s analytical semantics of instances in the event. The comparison of the writing strategies in the two documents reveals differences in the formation of a particular pattern of justification. In one case the texts function as a herald of pure reality, and in the other case as a messenger of other witnesses. This reveals different regimens of proof, although both of them were written as prescriptive directives – normative models of the educational world.
ABSTRACT This part of the series has a dual purpose. In the first place we will discuss two kinds of theories of proof. The first kind will be called a theory of linear proof. The second has been called a theory of suppositional proof. The term "natural deduction" has often and correctly been used to refer to the second kind of theory, but I shall not do so here because many of the theories so-called are not of the second kind--they must be thought of either as disguised linear theories or theories of a third kind (see postscript below). The second purpose of this part is to develop some of the main ideas needed in constructing a comprehensive theory of proof. The reason for choosing the linear and suppositional theories for this purpose is that the linear theory includes only rules of a very simple nature, and the suppositional theory can be seen as the result of making the linear theory more comprehensive. CORRECTION: At the time these articles were written the word ‘proof’, especially in the phrase ‘proof from hypotheses’, was widely used to refer to what were earlier and are now called deductions. I ask your forgiveness. I have forgiven Church and Henkin who misled me.
Euclid's classic proof of the infinitude of prime numbers has been a standard model of reasoning in student textbooks and books of elementary number theory. It has withstood scrutiny for over 2000 years, but we shall prove that despite the deceptive appearance of its analytical reasoning it is tautological in nature. We shall argue that the proof is more of an observation about a general property of prime numbers than an expository style of natural deduction of the proof of their infinitude.
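For reference, here is the standard textbook form of Euclid's argument that the abstract discusses (a reconstruction for the reader, not the paper's own rendering):

```latex
\[
  \text{Given primes } p_1, \dots, p_k, \ \text{let } N = p_1 p_2 \cdots p_k + 1.
\]
% N > 1, so N has some prime divisor q. But no p_i divides N, since dividing
% N by p_i leaves remainder 1. Hence q is a prime not among p_1, ..., p_k,
% so no finite list of primes can be complete.
```

The abstract's claim concerns the epistemic status of this step-by-step reasoning rather than its correctness.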
Nearly a decade has passed since Grove gave a semantics for the AGM postulates. The semantics, called sphere semantics, provided a new perspective on the area of study, and has been widely used in the context of theory or belief change. However, the soundness proof that Grove gives in his paper contains an error. In this note, we will point this out and give two ways of repairing it.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. The present illustrative rather than exhaustive review is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
Born's rule, which interprets the square of the wave function as the probability of getting a specific value in measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically using different approaches such as the frequency operator approach, many-worlds theory, Bayesian probability and envariance, the literature shows that the arguments in each of these methods are circular. In view of the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al, Science, 329, 418 [2010]). But a convincing analytical proof of Born's rule will make us understand the basic process responsible for the exact square dependency of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience into quantum mechanics, we prove Born's rule for the statistical interpretation of the wave function.
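For concreteness, the rule under discussion is standardly stated as follows (textbook form, not the paper's derivation): if a system is in state |ψ⟩ and a measured observable has eigenstates |a_i⟩ with eigenvalues a_i, then

```latex
\[
  P(a_i) \;=\; \bigl|\langle a_i \mid \psi \rangle\bigr|^{2},
\]
```

i.e. the probability of obtaining outcome a_i is the squared modulus of the corresponding amplitude. The paper's question is why the dependence is exactly quadratic in the amplitude rather than some other function of it.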
I present and discuss three previously unpublished manuscripts written by Bertrand Russell in 1903, not included with similar manuscripts in Volume 4 of his Collected Papers. One is a one-page list of basic principles for his “functional theory” of May 1903, in which Russell partly anticipated the later Lambda Calculus. The next, catalogued under the title “Proof That No Function Takes All Values”, largely explores the status of Cantor’s proof that there is no greatest cardinal number in the variation of the functional theory holding that only some but not all complexes can be analyzed into function and argument. The final manuscript, “Meaning and Denotation”, examines how his pre-1905 distinction between meaning and denotation is to be understood with respect to functions and their arguments. In them, Russell seems to endorse an extensional view of functions not endorsed in other works prior to the 1920s. All three manuscripts illustrate the close connection between his work on the logical paradoxes and his work on the theory of meaning.
The “four-color” theorem seems to be generalizable as follows. The four-letter alphabet is sufficient to encode unambiguously any set of well-orderings, including a geographical map, the “map” of any logic (and thus that of all logics), or the DNA plan of any living being. The corresponding maximally generalizing conjecture would then state: anything in the universe or mind can be encoded unambiguously by four letters. This admits of being formulated as a “four-letter theorem”, and thus one can search for a properly mathematical proof of the statement. It would imply the “four-color theorem”, the proof of which many philosophers and mathematicians believe not to be entirely satisfactory, for it is not a “human proof” but is unavoidably mediated by computers, since the necessary calculations exceed human capabilities fundamentally. It is furthermore rather unsatisfactory because it consists in enumerating and proving all cases one by one. Sometimes a more general theorem turns out to be much easier to prove, including by a general “human” method, and the particular theorem that is too difficult to prove is then implied as a corollary under certain simple conditions. The same approach will be followed as to the four-color theorem, i.e. it is to be deduced more or less trivially from the “four-letter theorem” if the latter is proved. References are only to classical and thus very well-known papers; their complete bibliographic description is omitted.
Wittgenstein's paradoxical theses that unproved propositions are meaningless, proofs form new concepts and rules, and contradictions are of limited concern, led to a variety of interpretations, most of them centered on rule-following skepticism. We argue, with the help of C. S. Peirce's distinction between corollarial and theorematic proofs, that his intuitions are better explained by resistance to what we call conceptual omniscience, treating meaning as fixed content specified in advance. We interpret the distinction in the context of modern epistemic logic and semantic information theory, and show how removing conceptual omniscience helps resolve Wittgenstein's paradoxes and explain the puzzle of deduction, its ability to generate new knowledge and meaning.
This is part one of a two-part paper, in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. This allows us to connect theories of partial ground with axiomatic theories of truth. In this part of the paper, we develop an axiomatization of the relation of partial ground over the truths of arithmetic and show that the theory is a proof-theoretically conservative extension of the theory PT of positive truth. We construct models for the theory and draw some conclusions for the semantics of conceptualist ground.
In his classic book “The Foundations of Statistics” Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: Any preference relation that satisfies certain arguably acceptable postulates determines a (finitely additive) probability distribution over the states and a utility assignment to the consequences, such that the preferences among acts are determined by their expected utilities. Additional problematic assumptions are however required in Savage's proofs. First, there is a Boolean algebra of events (sets of states) which determines the richness of the set of acts. The probabilities are assigned to members of this algebra. Savage's proof requires that this be a σ-algebra (i.e., closed under infinite countable unions and intersections), which makes for an extremely rich preference relation. On Savage's view we should not require subjective probabilities to be σ-additive. He therefore finds the insistence on a σ-algebra peculiar and is unhappy with it. But he sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every consequence there is a constant act, which produces that consequence in every state. This assumption is known to be highly counterintuitive. The present work contains two mathematical results. The first, and the more difficult one, shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the more plausible and much weaker assumption that there are at least two non-equivalent constant acts. The second result also employs a novel way of deriving utilities in Savage-style systems -- without appealing to von Neumann-Morgenstern lotteries. The paper discusses the notion of “idealized agent” that underlies Savage's approach, and argues that the simplified system, which is adequate for all the actual purposes for which the system is designed, involves a more realistic notion of an idealized agent.
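The representation theorem described above has the familiar form (stated here schematically as background, not in Savage's own notation): there exist a finitely additive probability P on the event algebra and a utility u on consequences such that, for all acts f and g,

```latex
\[
  f \succsim g
  \quad\Longleftrightarrow\quad
  \int_{S} u\bigl(f(s)\bigr)\, dP(s) \;\ge\; \int_{S} u\bigl(g(s)\bigr)\, dP(s),
\]
```

so the qualitative preference relation over acts is recovered from expected utilities. The paper's two results then concern which structural assumptions (the σ-algebra requirement and the constant act assumption) are really needed to obtain P and u.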
Much mainstream analytic epistemology is built around a sceptical treatment of modality which descends from Hume. The roots of this scepticism are argued to lie in Hume’s (nominalist) theory of perception, which is excavated, studied and compared with the very different (realist) theory of perception developed by Peirce. It is argued that Peirce’s theory not only enables a considerably more nuanced and effective epistemology, it also (unlike Hume’s theory) does justice to what happens when we appreciate a proof in mathematics.
Minimal Type Theory (MTT) is based on type theory in that it is agnostic about Predicate Logic level and expressly disallows the evaluation of incompatible types. It is called Minimal because it has the fewest possible number of fundamental types, and has all of its syntax expressed entirely as the connections in a directed acyclic graph.
In order to explain Wittgenstein’s account of the reality of completed infinity in mathematics, a brief overview of Cantor’s initial injection of the idea into set theory, its trajectory and the philosophic implications he attributed to it will be presented. Subsequently, we will first expound Wittgenstein’s grammatical critique of the use of the term ‘infinity’ in common parlance and its conversion into a notion of an actually existing infinite ‘set’. Secondly, we will delve into Wittgenstein’s technical critique of the concept of ‘denumerability’ as it is presented in set theory as well as his philosophic refutation of Cantor’s Diagonal Argument and the implications of such a refutation for the problems of the Continuum Hypothesis and Cantor’s Theorem. Throughout, the discussion will be placed within the historical and philosophical framework of the Grundlagenkrise der Mathematik and Hilbert’s problems.
Information Theory, Evolution and The Origin of Life: The Origin and Evolution of Life as a Digital Message: How Life Resembles a Computer, Second Edition. Hubert P. Yockey, 2005, Cambridge University Press, Cambridge: 400 pages, index; hardcover, US $60.00; ISBN: 0-521-80293-8. "The reason that there are principles of biology that cannot be derived from the laws of physics and chemistry lies simply in the fact that the genetic information content of the genome for constructing even the simplest organisms is much larger than the information content of these laws." Yockey in his previous book (1992, 335). In this new book, Information Theory, Evolution and The Origin of Life, Hubert Yockey points out that the digital, segregated, and linear character of the genetic information system has a fundamental significance. If inheritance blended and did not segregate, Darwinian evolution would not occur. If inheritance were analog instead of digital, evolution would also be impossible, because it would be impossible to remove the effect of noise. In this way, life is guided by information, and so information is a central concept in molecular biology. The author presents a picture of how the main concepts of the genetic code were developed. He was able to show that despite Francis Crick's belief that the Central Dogma is only a hypothesis, the Central Dogma of Francis Crick is a mathematical consequence of the redundant nature of the genetic code. The redundancy arises from the fact that the DNA and mRNA alphabet is formed by triplets of 4 nucleotides, and so the number of letters (triplets) is 64, whereas the proteome alphabet has only 20 letters (20 amino acids), and so the translation from the larger alphabet to the smaller one is necessarily redundant. Except for Tryptophan and Methionine, all amino acids are coded by more than one triplet; therefore, it is undecidable which source code letter was actually sent from mRNA. This proof has a corollary telling that there are no such mathematical constraints for protein-protein communication. With this clarification, Yockey contributes to diminishing the widespread confusion related to such a central concept as the Central Dogma. Thus the Central Dogma prohibits the origin of life "proteins first." Proteins cannot be generated by "self-organization." Understanding this property of the Central Dogma will have a serious impact on research on the origin of life.
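The counting behind the redundancy claim is simple (a restatement for the reader, not a quotation from the book):

```latex
\[
  4^{3} = 64 \ \text{codons}
  \;\longrightarrow\;
  20 \ \text{amino acids} \ (+\ \text{stop signals}),
\]
```

so the coding map from codons to amino acids is many-to-one. Because a many-to-one map has no inverse, one cannot recover from a protein sequence which particular codons were used, which is the information-theoretic sense in which back-translation from protein to mRNA is treated as blocked.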
CORCORAN RECOMMENDS COCCHIARELLA ON TYPE THEORY. The 1983 review in Mathematical Reviews 83e:03005 of: Cocchiarella, Nino “The development of the theory of logical types and the notion of a logical subject in Russell's early philosophy: Bertrand Russell's early philosophy, Part I”. Synthese 45 (1980), no. 1, 71–115.
Easy to understand philosophy papers in all areas. Table of contents: Three Short Philosophy Papers on Human Freedom The Paradox of Religions Institutions Different Perspectives on Religious Belief: O’Reilly v. Dawkins. v. James v. Clifford Schopenhauer on Suicide Schopenhauer’s Fractal Conception of Reality Theodore Roszak’s Views on Bicameral Consciousness Philosophy Exam Questions and Answers Locke, Aristotle and Kant on Virtue Logic Lecture for Erika Kant’s Ethics Van Cleve on Epistemic Circularity Plato’s Theory of Forms Can we trust our senses? Yes we can Descartes on What He Believes Himself to Be The Role of Values in Science Modern Science Kant’s Moral Philosophy Plato’s Republic as Pol Potist Bureaucracy Schopenhauer on Human Suffering Bertrand Russell on the Value of Philosophy The Philosophical Value of Uncertainty Logic Homework: Theorems and Models Searle vs. Turing on the Imitation Game Hume, Frankfurt, and Holbach on Personal Freedom Manifesto of the University of Wisconsin, Madison Secular Society Michael’s Analysis of the Limits of Civil Protections Bentham and Mill on Different Types of Pleasure Set Theory Homework Aristotle on Virtue Nagel On the Hard Problem Wittgenstein on Language and Thought Camus and Schopenhauer on the Meaning of Life Camus’ Hero as Rebel without a Cause My Little Finger: Camus’ Absurdism Illustrated Are Late-term Abortions Ethical? Does Mathematics Assume the Truth of Platonism? The Self-defeating Nature of Utilitarianism and Consequentialism Generally What is The Good Life? Bentham and Mill regarding types of pleasures Kant’s Moral Philosophy Five Short Papers on Mind-body Dualism Tracy Latimer’s Father had the Right to Kill Her: Towards a doctrine of generalized self-defense Arguments Concerning God and Morality Goldman, Rousseau and von Hayek on the Ideal State J.S Mill on Liberty and Personal Freedom A Kantian Analysis of a Borderline Date-rape Situation Living Well as Flourishing: Aristotle’s Conception of the Good Life Three Essays on Medical Ethics: Answers to Exam Questions on Elective Amputation, Vaccination, and Informed Consent Hobbes, Marx, Rousseau, Nietzsche: Their Central Themes De Tocqueville on Egoism Mill vs. Hobbes on Liberty Exam-Essays on the Moral Systems of Mill, Bentham, and Kant Kant’s Moral System Aristotle on Virtue Plato’s Cave Allegory An Ethical Quandary Superorganisms The Tuskegee Experiment A Rawlsian Analysis Why Moore’s Proof of an External World Fails A Defense of Nagel’s Argument Against Materialism A Utilitarian Analysis of a Case of Theft The Paradox of the Self-aware Wretch: An Analysis of Pascal’s Moral Philosophy Jean-Paul Sartre: Decline and Fall of a Marxist Sell-out A One Page Proof of Plato’s Theory of Forms Plato’s Republic as Pol Potist Bureaucracy The problem of the one and the many Four Short Essays on Truth and Knowledge What is ‘the Good Life?’ The Ontological Argument Different Political Philosophies: Plato, Locke, Madison, Rousseau, Hayek, and Mill on the State What do I know with certainty? Skepticism about skepticism Neuroscience and Freewill Operant Conditioning What makes us special? Are Late-term Abortions Ethical? No Two Papers on Epistemology: Gettier and Bostrom Examination Nietzsche on Punishment God’s Foreknowledge and Moral Responsibility.
We approach the topic of solution equivalence of propositional problems from the perspective of non-constructive procedural theory of problems based on Transparent Intensional Logic (TIL). The answer we put forward is that two solutions are equivalent if and only if they have equivalent solution concepts. Solution concepts can be understood as a generalization of the notion of proof objects from the Curry-Howard isomorphism.
The Elementary Process Theory (EPT) is a collection of seven elementary process-physical principles that describe the individual processes by which interactions have to take place for repulsive gravity to exist. One of the two main problems of the EPT is that there is no proof that the four fundamental interactions (gravitational, electromagnetic, strong, and weak) as we know them can take place in the elementary processes described by the EPT. This paper sets forth the method by which it can be proven that the EPT agrees with the knowledge that derives from the successful predictions of a modern interaction theory T. This determines a fundamentally new research program in theoretical physics.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum mechanical uncertainty principle and a proof of monotheism.
Scientific inquiry takes its onward course from the point that previous scientists had reached. But philosophical analysis starts from scratch. Philosophy questions everything and chooses a starting point for itself only after having ruled out all the unsubstantiated and doubtful elements of the topic under study. Secondly, known realities must make sense. If a theory is officially 'counterintuitive', then either it is mere fiction or, at most, a distorted form of truth. This book's analysis is based on the philosophical principle that knowledge is empirical and does not arise magically in the absence of observational grounds. With a philosophical approach, it was doubtful to accept that Georges Lemaître already knew Hubble's law in the year 1927, a law that was yet to be found by Edwin Hubble in the year 1929. Therefore this book started with a denial of the claim that Lemaître already knew this law. But the analysis of section I.III forced the author to look at the matter from the original source, and it came to the surface that Lemaître did know this law in 1927. Contrary to the mainstream claim, however, Lemaître had not derived that law from the general relativity (GR) equations but rather had deduced it from a method given by Hubble himself. The whole case of the Big Bang Theory rests on the misleading claim that Lemaître had derived this law solely from GR equations. The basis of this claim happened to be a manipulated translation (1931) of Lemaître's original 1927 article. People regard the Big Bang Theory as truth because authoritative sources deceived them by presenting a manipulated translation in the year 1931. This book is a philosophical analysis of the original papers of Alexander Friedmann (1922), Georges Lemaître (1927), Edwin Hubble (1929) and Albert Einstein (1917), and thus covers the actual roots and origins of the Big Bang Model. In this book, only the core elements of the Big Bang Model, i.e. 'Expansion of Universe' and 'CMBR', are covered. It has been sufficiently shown that 'expansion' is an illusion, whereas CMBR is a proof that we live in a non-expanding infinite universe. If these two core elements of the standard Big Bang Model are precisely refuted, then there is nothing crucial left of the standard model. For readers of this book at least, the Big Bang Theory shall become a story of past mistakes. The author is not an authoritative source on science topics; therefore readers must download all the above-mentioned original papers and check all the points outlined in this book against the relevant original papers. Unlike reading from an authoritative source, which makes readers relaxed and careless but enables authorities to deceive them in the worst way possible, this book requires readers to remain alert on all the points discussed in the book and to verify everything from original sources, whose links are given at the end of this description and also provided in the footnotes section of the book. This book is not a judgment of the topic; rather it is like a case presented by an advocate, while readers are the judges. Readers are required to apply their own critical judgment to conclude the matter by themselves. After carefully reading this book, readers will also start taking 'authoritative sources' with due care, and it will become difficult for the 'authorities' to deceive them again.
DEFINING OUR TERMS A “paradox” is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory” is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis” is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis” is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since ‘appears’, ‘believes’ and ‘knows’ all make elliptical reference to a participant, it is clear that ‘paradox’, ‘indirect proof’ and ‘direct proof’ are all participant-relative. PARTICIPANT RELATIVITY In normal mathematical writing the participant is presumed to be “the community of mathematicians” or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof. WHAT IS A PARADOX’S SOLUTION? Of the above four sorts of argumentation only the paradox invites “solution” or “resolution”, and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning” of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true. Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy”, as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant's state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time. The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell's Principles of mathematics [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell's article on type theory as well as Ernst Zermelo's two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal”, “the largest ordinal”, the well-ordering principle, “the well-ordering of the continuum”, denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion.
RECOMMENDATION This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading. AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and logic” [249], which covers much of the same ground in much more detail. https://www.academia.edu/14089432/Argumentations_and_Logic .
Issue 202011211 includes an additional chapter about the appearance of the form. At ordinary scales, the ontological model proposed by the Ontology of Knowledge (OK) does not call into question the representation of the world elaborated by common sense or science. It is not the world as it appears to us and as science describes it that is challenged by the OK, but the way it appears to the knowing subject and to science. In spite of the efforts made to separate scientific reasoning and metaphysical considerations, and in spite of the rigorous construction of mathematics, these are not, in their very foundations, independent of modalities, of laws of representation of the world. The OK shows that logical facts Exist neither more nor less than the facts of the World, which are Facts of Knowledge. Mathematical facts are facts of representation. Indeed, by experimental proof only the laws of representation are proved persistent/consistent, because what science foresees and verifies with precision is not the facts of the world but the facts of the representation of the world. Beyond the laws of representation, nothing proves to us that there are laws of the world. Remember, however, that mathematics "is worth itself" and cannot be called into question "for itself" by an ontology. The only question is the process of creating meaning that provides mathematics with its intuitions a priori. The first objective of this article will therefore be to identify and clarify which ruptures proposed by the OK could affect the a priori intuitions that found mathematics, but also could explain the remarkable ability of mathematics to represent the world. For this, three major intuitions of form will be analyzed, namely: the intuition of the One, the intuition of time and the intuition of space. Then, considering mathematics in two major classes, {logic, arithmetic, set theory ...} on the one hand and geometry on the other, we will ask: How does the OK affect their premises and rules of inference? In case of incompatibility, under what conditions can such a mathematical theory be made compatible with the OK? Can we deduce a possible extension of the theory?
A principle according to which any scientific theory can be mathematized is investigated. Social science, the liberal arts, history, and philosophy are meant first of all. That kind of theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As used here, the term "theory" includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should be accepted rather as a metamathematical axiom about the relation of mathematics and reality. The main statement is formulated as follows: any scientific theory admits an isomorphism to some mathematical structure in a constructive way. Its investigation needs philosophical means. Husserl's phenomenology is what is used, and then the conception of "bracketing reality" is modelled to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction, as shown below. The sketch of the proof is organized in five steps: a generalization of epoché; involving transfinite induction in the transition between Peano arithmetic and set theory; discussing the finiteness of Peano arithmetic; applying transfinite induction to Peano arithmetic; and discussing an arithmetical model of reality. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and the principle is an independent axiom. The present paper follows a pathway grounded on Husserl's phenomenology and "bracketing reality" to achieve the generalized arithmetic necessary for the principle to be founded in an alternative ontology, in which there is no reality external to mathematics: reality is included within mathematics. That latter mathematics is able to found itself and can be called Hilbert mathematics in honour of Hilbert's program for self-founding mathematics on the base of arithmetic. The principle of universal mathematizability is consistent with Hilbert mathematics, but not with Gödel mathematics. Consequently, its validity or rejection would resolve the problem of which mathematics refers to our being; and vice versa: the choice between them for different reasons would confirm or refute the principle as to being. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. The Schrödinger equation in quantum mechanics is involved to illustrate that ontology.
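For readers unfamiliar with the device just mentioned, the contrast is between the ordinary induction schema of Peano arithmetic and the transfinite induction schema over a well-ordering (both given here in textbook form, independently of the paper):

```latex
\[
  \bigl(\varphi(0) \wedge \forall n\,(\varphi(n) \rightarrow \varphi(n+1))\bigr)
  \;\rightarrow\; \forall n\, \varphi(n)
  \qquad\text{(induction)}
\]
\[
  \forall \alpha\,\bigl(\forall \beta < \alpha\; \varphi(\beta) \rightarrow \varphi(\alpha)\bigr)
  \;\rightarrow\; \forall \alpha\, \varphi(\alpha)
  \qquad\text{(transfinite induction)}
\]
```

Replacing the first schema by the second, for ordinals beyond the natural numbers, is the generalization of Peano arithmetic that the abstract refers to.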
Thus the problem of which of the two mathematics is more relevant to our being is discussed again in a new way. A few directions for future work can be: a rigorous formal proof of the principle as an independent axiom; the further development of an information ontology consistent with both kinds of mathematics, but much more natural for Hilbert mathematics; the development of the information interpretation of quantum mechanics as a mathematical one for information ontology and thus Hilbert mathematics; and the description of consciousness in terms of information ontology.
Analysis is given of the Omega Point cosmology, an extensively peer-reviewed proof (i.e., mathematical theorem) published in leading physics journals by professor of physics and mathematics Frank J. Tipler, which demonstrates that in order for the known laws of physics to be mutually consistent, the universe must diverge to infinite computational power as it collapses into a final cosmological singularity, termed the Omega Point. The theorem is an intrinsic component of the Feynman-DeWitt-Weinberg quantum gravity/Standard Model Theory of Everything (TOE) describing and unifying all the forces in physics, which is itself also required by the known physical laws. With infinite computational resources, the dead can be resurrected--never to die again--via perfect computer emulation of the multiverse from its start at the Big Bang. Miracles are also physically allowed via electroweak quantum tunneling controlled by the Omega Point cosmological singularity. The Omega Point is a different aspect of the Big Bang cosmological singularity--the first cause--and the Omega Point has all the haecceities claimed for God in the traditional religions. From this analysis, conclusions are drawn regarding the social, ethical, economic and political implications of the Omega Point cosmology.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv dot org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism since there cannot be any entity more complex than the physical universe and, from the Wittgensteinian viewpoint, ‘more complex’ is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a ‘God’ (i.e., a ‘device’ with limitless time/space and energy) cannot determine whether a given ‘number’ is ‘random’, nor find a certain way to show that a given ‘formula’, ‘theorem’ or ‘sentence’ or ‘device’ (all these being complex language games) is part of a particular ‘system’. Those wishing a comprehensive up-to-date framework for human behavior from the modern two systems view may consult my book ‘The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle’ 2nd ed (2019). Those interested in more of my writings may see ‘Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019’ 2nd ed (2019) and ‘Suicidal Utopian Delusions in the 21st Century’ 4th ed (2019).
The paper re-expresses arguments against the normative validity of expected utility theory in Robin Pope (1983, 1991a, 1991b, 1985, 1995, 2000, 2001, 2005, 2006, 2007). These concern the neglect of the evolving stages of knowledge ahead (stages of what the future will bring). Such evolution is fundamental to an experience of risk, yet not consistently incorporated even in axiomatised temporal versions of expected utility. Its neglect entails a disregard of emotional and financial effects on well-being before a particular risk is resolved. These arguments are complemented with an analysis of the essential uniqueness property in the context of temporal and atemporal expected utility theory and a proof of the absence of a limit property natural in an axiomatised approach to temporal expected utility theory. Problems of the time structure of risk are investigated in a simple temporal framework restricted to a subclass of temporal lotteries in the sense of David Kreps and Evan Porteus (1978). This subclass is narrow but wide enough to discuss basic issues. It will be shown that there are serious objections against the modification of expected utility theory axiomatised by Kreps and Porteus (1978, 1979). By contrast the umbrella theory proffered by Pope that she has now termed SKAT, the Stages of Knowledge Ahead Theory, offers an epistemically consistent framework within which to construct particular models to deal with particular decision situations. A model by Caplin and Leahy (2001) will also be discussed and contrasted with the modelling within SKAT (Pope, Leopold and Leitner 2007).