The paper introduces and utilizes a few new concepts: “nonstandard Peano arithmetic”, “complementary Peano arithmetic”, and “Hilbert arithmetic”. It identifies the foundations of both mathematics and physics by demonstrating the equivalence of the newly introduced Hilbert arithmetic and the separable complex Hilbert space of quantum mechanics, which in turn underlies physics and the world as a whole. That shared mathematical and physical ground can be recognized as information, complemented and generalized by quantum information. A few fundamental mathematical problems of the present, such as Fermat’s last theorem, the four-color theorem together with its newly formulated generalization as a “four-letter theorem”, Poincaré’s conjecture, and “P vs NP”, are reconsidered, from and within the newly founded conceptual reference frame of information, as illustrations. Simple or crucially simplifying solutions and proofs are demonstrated. A link between the consistent completeness of the system mathematics-physics on the ground of information and all the great mathematical problems of the present (rather than only the enumerated ones) is suggested.
One of the philosophical uses of Dedekind’s categoricity theorem for Peano Arithmetic is to provide support for semantic realism. To this end, the logical framework in which the proof of the theorem is conducted becomes highly significant. I examine different proposals regarding these logical frameworks and focus on the philosophical benefits of adopting open-ended schemas, in contrast to second-order logic, as the logical medium of the proof. I investigate Pederson and Rossberg’s critique of the ontological advantages of open-ended arithmetic when it comes to establishing the categoricity of Peano Arithmetic and show that the critique is highly problematic. I argue that Pederson and Rossberg’s ontological criterion delivers the bizarre result that certain first-order subsystems of Peano Arithmetic have a second-order ontology. As a consequence, the application of the ontological criterion proposed by Pederson and Rossberg assigns a certain type of ontology to a theory, and a different, richer ontology to one of its subtheories.
The paper considers a generalization of Peano arithmetic, Hilbert arithmetic, as the basis of the world in a Pythagorean manner. Hilbert arithmetic unifies the foundations of mathematics (Peano arithmetic and set theory), the foundations of physics (quantum mechanics and information), and philosophical transcendentalism (Husserl’s phenomenology) into a formal theory and mathematical structure, literally following Husserl’s trace of “philosophy as a rigorous science”. On the pathway to that objective, Hilbert arithmetic by itself identifies information, related to finite sets and series, with quantum information, referring to infinite ones, both appearing in three “hypostases”: correspondingly, mathematical, physical, and ontological, each of which is able to generate a relevant science and area of cognition. Scientific transcendentalism is a falsifiable counterpart of philosophical transcendentalism. The underlying concept of totality can accordingly be interpreted both mathematically, as consistent completeness, and physically, as the universe defined not empirically or experimentally, but as that ultimate wholeness containing its externality within itself.
Second-order Peano Arithmetic minus the Successor Axiom is developed from first principles through Quadratic Reciprocity and a proof of self-consistency. This paper combines four other papers of the author in a self-contained exposition.
The previous two parts of the paper demonstrate that the interpretation of Fermat’s last theorem (FLT) in Hilbert arithmetic, meant both in a narrow sense and in a wide sense, can suggest a proof by induction in Part I and by means of the Kochen–Specker theorem in Part II. The same interpretation can serve also for a proof of FLT based on Gleason’s theorem and partly similar to that in Part II. The concept of the (probabilistic) measure of a subspace of Hilbert space, and especially its uniqueness, can be unambiguously linked to that of partial algebra or incommensurability, or interpreted as a relation of the two dual branches of Hilbert arithmetic in a wide sense. The investigation of the last relation allows for FLT and Gleason’s theorem to be equated in a sense, as two dual counterparts, and the former to be inferred from the latter, as well as vice versa under an additional condition relevant to the Gödel incompleteness of arithmetic with respect to set theory. The qubit Hilbert space itself in turn can be interpreted by the unity of FLT and Gleason’s theorem. The proof of such a fundamental result in number theory as FLT by means of Hilbert arithmetic in a wide sense can be generalized to an idea of a “quantum number theory”, able to research mathematically the origin of Peano arithmetic from Hilbert arithmetic by mediation of the “nonstandard bijection” and its two dual branches inherently linking it to information theory. Then, infinitesimal analysis and its revolutionary application to physics can also be re-realized in that wider context, for example, as an exploration of the way for the physical quantity of time (respectively, for the time derivative in any temporal process considered in physics) to appear at all. Finally, the result admits a philosophical reflection on how any hierarchy arises or changes itself only thanks to its dual and idempotent counterpart.
A practical viewpoint links reality, representation, and language to calculation by the concept of the Turing (1936) machine, the mathematical model of our computers. After the Gödel incompleteness theorems (1931) and the insolvability of the so-called halting problem (Turing 1936; Church 1936) for a classical Turing machine, one of the simplest hypotheses is to suggest completeness for a pair of machines. That is consistent with the provability of completeness by means of two independent Peano arithmetics discussed in Section I. Many modifications of Turing machines, including quantum ones, are investigated in Section II with respect to the halting problem and completeness, and the model of two independent Turing machines seems to generalize them. Then, that pair can be postulated as the formal definition of reality, therefore being complete, unlike either of them standalone, which remains incomplete without its complementary counterpart. Representation is formally defined as a one-to-one mapping between the two Turing machines, and the set of all those mappings can be considered as “language”, therefore including metaphors as mappings different from representation. Section III investigates that formal relation of “reality”, “representation”, and “language” modeled by (at least two) Turing machines. The independence of (two) Turing machines is interpreted by means of game theory, and especially of the Nash equilibrium, in Section IV. Choice, and information as the quantity of choices, are involved. That approach seems to be equivalent to the one based on set theory and the concept of actual infinity in mathematics, while allowing for practical implementations.
The systems of arithmetic discussed in this work are non-elementary theories. In this paper, natural numbers are characterized axiomatically in two different ways. We begin by recalling the classical set P of axioms of Peano’s arithmetic of natural numbers proposed in 1889 (including such primitive notions as: set of natural numbers, zero, successor of a natural number) and compare it with the set W of axioms of this arithmetic (including primitive notions like: set of natural numbers and relation of inequality) proposed by Witold Wilkosz, a Polish logician, philosopher and mathematician, in 1932. The axioms W are those of ordered sets without a largest element, in which every non-empty set has a least element, and every set bounded from above has a greatest element. We show that P and W are equivalent, and also that the systems of arithmetic based on W or on P are categorical and consistent. There follows a set of intuitive axioms PI of the arithmetic of integers, modelled on P and proposed by B. Iwanuś, as well as a set of axioms WI of this arithmetic, modelled on the W axioms, PI and WI being also equivalent, categorical and consistent. We also discuss the problem of independence of sets of axioms, which was dealt with earlier.
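For orientation, the two axiomatizations just summarized, Peano’s P and Wilkosz’s W, can be sketched schematically; the formulation follows the abstract’s description, not the original 1889 and 1932 texts:

```latex
% Peano's axioms P (primitives: the set N, zero 0, successor S):
%   (P1) 0 \in N                     (P2) n \in N \Rightarrow S(n) \in N
%   (P3) S(n) \neq 0                 (P4) S(m) = S(n) \Rightarrow m = n
%   (P5) induction: any A with 0 \in A, closed under S, has N \subseteq A
%
% Wilkosz's axioms W (primitives: the set N and the order <):
% < is a linear order on N such that
%   (W1) N has no largest element;
%   (W2) every non-empty subset of N has a least element;
%   (W3) every subset of N bounded from above has a greatest element.
```

The equivalence claimed in the abstract amounts to defining 0 as the least element, S(n) as the least element above n, and conversely recovering < from iterated S.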
In a previous paper, an elementary and thoroughly arithmetical proof of Fermat’s last theorem by induction has been demonstrated, provided the case “n = 3” is granted as proved only arithmetically (which has been a fact for a long time), furthermore in a way accessible to Fermat himself, though without being absolutely and precisely correct. The present paper elucidates the contemporary mathematical background from which an inductive proof of FLT can be inferred, since its proof for the case “n = 3” has been known for a long time. It needs “Hilbert mathematics”, which is inherently complete unlike the usual “Gödel mathematics”, and is based on “Hilbert arithmetic”, generalizing Peano arithmetic in a way that unifies it with the qubit Hilbert space of quantum information. An “epoché to infinity” (similar to Husserl’s “epoché to reality”) is necessary to map Hilbert arithmetic into Peano arithmetic in order to be relevant to Fermat’s age. Furthermore, the two linked semigroups, originating from addition and multiplication and, in the final analysis, from the Peano axioms, can be postulated algebraically as independent of each other in a “Hamilton” modification of arithmetic supposedly equivalent to Peano arithmetic. The inductive proof of FLT can be deduced absolutely precisely in that Hamilton arithmetic and then transferred as a corollary into standard Peano arithmetic, furthermore in a way accessible in Fermat’s epoch and thus, in principle, to himself. A future, second part of the paper is outlined, directed at an eventual proof of the case “n = 3” based on the qubit Hilbert space and the Kochen–Specker theorem inferable from it.
I here investigate the sense in which diagonalization allows one to construct sentences that are self-referential. Truly self-referential sentences cannot be constructed in the standard language of arithmetic: there is a simple theory of truth that is intuitively inconsistent but is consistent with Peano arithmetic, as standardly formulated. True self-reference is possible only if we expand the language to include function symbols for all primitive recursive functions. This language is therefore the natural setting for investigations of self-reference.
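The diagonalization referred to is the standard diagonal lemma; a schematic reminder of the construction the discussion presupposes:

```latex
% Diagonal lemma: for any formula \varphi(x) of arithmetic with one
% free variable, there is a sentence \psi such that
\mathsf{PA} \vdash \psi \leftrightarrow \varphi(\ulcorner \psi \urcorner)
% Here \ulcorner\psi\urcorner is the numeral of the Gödel code of \psi.
% The sentence \psi is provably equivalent to a claim about its own
% code, which is weaker than being literally "about itself"; that gap
% is what a notion of true self-reference turns on.
```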
Hilbert arithmetic in a wide sense, including Hilbert arithmetic in a narrow sense, consisting of two dual and anti-isometric Peano arithmetics, on the one hand, and the qubit Hilbert space (originating from the standard separable complex Hilbert space of quantum mechanics), on the other hand, allows for an arithmetic version of Gentzen’s cut elimination and quantum measurement to be described uniformly as two processes occurring correspondingly in those two branches. A philosophical reflection also justifying that unity by quantum neo-Pythagoreanism links it to the opposition of propositional logic, to which Gentzen’s cut rule refers immediately, on the one hand, and the linguistic and mathematical theory of metaphor, sharing the same structure borrowed from Hilbert arithmetic in a wide sense, on the other. An example, the hermeneutical circle modeled as a dual pair of a syllogism (accomplishable also by a Turing machine) and a relevant metaphor (being a formal and logical mistake and thus fundamentally inaccessible to any Turing machine), visualizes human understanding corresponding also to Gentzen’s cut elimination and the Gödel dichotomy about the relation of arithmetic to set theory: either incompleteness or contradiction. The metaphor, as the complementing “half” of any understanding in the hermeneutical circle, is what allows for that Gödel-like incompleteness to be overcome in human thought.
Peano arithmetic cannot serve as the ground of mathematics, for it is inconsistent with infinity, and infinity is necessary for its foundation. Though Peano arithmetic cannot be complemented by any axiom of infinity, there exists at least one (logical) axiomatics consistent with infinity. What is meant here is nothing else than a new reading and comparative interpretation of Gödel’s papers (1930; 1931). Peano arithmetic anyway admits generalizations consistent with infinity and thus with some addable axiom(s) of infinity. The most utilized example of those generalizations is the complex Hilbert space. Any generalization of Peano arithmetic consistent with infinity, e.g. the complex Hilbert space, can serve as a foundation for mathematics to found itself by itself.
According to Augustine, abstract objects are ideas in the Mind of God. Because numbers are a type of abstract object, it would follow that numbers are ideas in the Mind of God. Let us call such a view the Augustinian View of Numbers (AVN). In this paper, I present a formal theory for AVN. The theory stems from the symmetry conception of God as it appears in Studtmann (2021). I show that Robinson’s Arithmetic, Q, can be interpreted by the theory in Studtmann’s paper. The interpretation is made possible by identifying the set of natural numbers with God, 0 with Being, and the successor function with the essence function. The resulting theory can then be augmented to include Peano Arithmetic by adding a set-theoretic version of induction and a comprehension schema restricted to arithmetically definable properties. In addition to these formal matters, the paper provides a characterization of the mind of God. According to the characterization, the Being essences that constitute God’s mind act as both numbers and representations – each has all the properties of some number and encodes all the properties of that number’s predecessor. The conception of God that emerges by the end of the discussion is a conception of an infinite, ineffable, axiologically and metaphysically ultimate entity that contains objects that not only serve as numbers but also encode information about each other.
Gentzen’s approach by transfinite induction and that of intuitionist Heyting arithmetic to completeness and the self-foundation of mathematics are compared and opposed to the Gödel incompleteness results concerning Peano arithmetic. Quantum mechanics involves infinity through Hilbert space, but it is finitist, as is any experimental science. The absence of hidden variables in it, interpretable as its completeness, should resurrect Hilbert’s finitism at the cost of a relevant modification of the latter, already hinted at by intuitionism and by Gentzen’s approaches to completeness. This paper investigates both the conditions and the philosophical background necessary for that modification. The main conclusion is that the concept of infinity underlying contemporary mathematics cannot be reduced to a single Peano arithmetic, but to at least two, independent of each other. Intuitionism, quantum mechanics, and Gentzen’s approaches to completeness, and even Hilbert’s finitism, can be unified from that viewpoint. Mathematics may found itself by way of finitism complemented by choice. The concept of information as the quantity of choices underlies that viewpoint. Quantum mechanics, interpretable in terms of information and quantum information, is inseparable from mathematics and its foundation.
The concept of quantum information is introduced as both the normed superposition of two orthogonal subspaces of the separable complex Hilbert space and the invariance of the Hamilton and Lagrange representations of any mechanical system. The basis is the isomorphism of the standard introduction of a qubit and its representation on a 3D unit ball, in which two points are chosen. The separable complex Hilbert space is considered as the free variable of quantum information and any point in it (a wave function describing a state of a quantum system) as its value as the bound variable. A qubit is equivalent to the generalization of ‘bit’ from the set of two equally probable alternatives to an infinite set of alternatives. Then, that Hilbert space is considered as a generalization of Peano arithmetic where any unit is substituted by a qubit, and thus the set of natural numbers is mappable within any qubit as the complex internal structure of the unit or a different state of it. Thus, any mathematical structure reducible to set theory is representable as a set of wave functions and a subspace of the separable complex Hilbert space, and it can be identified as the category of all categories, for any functor represents an operator transforming a set (or subspace) of the separable complex Hilbert space into another. Thus, category theory is isomorphic to the Hilbert-space representation of set theory and Peano arithmetic as above. Given any value of quantum information, i.e. a point in the separable complex Hilbert space, it always admits two equally acceptable interpretations: one physical, the other mathematical. The former is a wave function as the exhaustive description of a certain state of a certain quantum system. The latter chooses a certain mathematical structure among a certain category. Thus there is no way to distinguish a mathematical structure from a physical state, for both are described exhaustively as a value of quantum information.
This statement in turn can be utilized to define quantum information by the identity of any mathematical structure to a physical state, and also vice versa. Further, that definition is equivalent both to the standard definition as the normed superposition and to the invariance of the Hamilton and Lagrange interpretations of mechanical motion introduced at the beginning of the paper. Then, the concept of information symmetry can be involved as the symmetry between three elements, or two pairs of elements: the Lagrange representation and each counterpart of the pair of the Hamilton representation. The sense and meaning of information symmetry may be visualized by a single (quantum) bit and its interpretation as both a (privileged) reference frame and the symmetries of the Standard Model.
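The “normed superposition” and the 3D-unit-ball representation invoked above are the standard qubit picture; as a reminder:

```latex
% A qubit is a normed superposition of two orthogonal basis states:
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \alpha, \beta \in \mathbb{C}, \qquad
\lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
% Up to a global phase, such a state is a point of the Bloch ball
% (the "3D unit ball" representation mentioned above); the bit's two
% equally probable alternatives generalize to a continuum of
% alternatives parametrized by \alpha and \beta.
```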
Can the so-called first incompleteness theorem refer to itself? Many, or maybe even all, of the paradoxes in mathematics are connected with some kind of self-reference. Gödel built his proof on the ground of self-reference: a statement which claims its own unprovability. So, he demonstrated that undecidable propositions exist in any sufficiently rich axiomatics (i.e. one which contains Peano arithmetic in some sense). What about the decidability of the very first incompleteness theorem? We can show that it fulfills its own conditions. That is why it can be applied to itself, proving that it is an undecidable statement. It seems to be a rather strange kind of proposition: its validity implies its undecidability. If the validity of a statement implies its untruth, then it is either untrue (reductio ad absurdum) or an antinomy (if also its negation implies its validity). A theory that contains a contradiction implies any statement. The appearance of a proposition whose validity implies its undecidability is due to the statement that claims its own unprovability. Obviously, it is a proposition of self-referential type. By Gödel’s words, it is correlative with Richard’s or the liar paradox, or even with any other semantic or mathematical one. What is the cost if a proposition of that special kind is used in a proof? In our opinion, the price is analogous to “applying” a contradiction in a theory: any statement turns out to be undecidable. If the first incompleteness theorem is an undecidable theorem, then it is impossible to prove that the very completeness of Peano arithmetic is also an undecidable statement (the second incompleteness theorem). Hilbert’s program for an arithmetical self-foundation of mathematics is partly rehabilitated: only partly, because it is not decidable and true, but undecidable; that is why both it and its negation may be accepted as true, however not simultaneously.
The first incompleteness theorem gains the status of an axiom of a very special, semi-philosophical kind: it divides mathematics as a whole into two parts: either Gödel mathematics or Hilbert mathematics. Hilbert’s program of the self-foundation of mathematics is valid only as to the latter.
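The self-referential statement at the center of the discussion is Gödel’s sentence; schematically:

```latex
% For a theory T containing Peano arithmetic, the Gödel sentence G
% "claims its own unprovability":
T \vdash G \leftrightarrow \neg\, \mathrm{Prov}_{T}(\ulcorner G \urcorner)
% If T is consistent, T does not prove G; if T is \omega-consistent,
% T does not prove \neg G either, so G is undecidable in T.
```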
We consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways: (1) in terms of classical algorithmic verifiability; and (2) in terms of finitary algorithmic computability. We then show that the two definitions correspond to two distinctly different assignments of satisfaction and truth to the compound formulas of PA over N---I_PA(N; SV) and I_PA(N; SC). We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both I_PA(N; SV) and I_PA(N; SC). We then show: (a) that if we assume the satisfaction and truth of the compound formulas of PA are always non-finitarily decidable under I_PA(N; SV), then this assignment corresponds to the classical non-finitary putative standard interpretation I_PA(N; S) of PA over the domain N; and (b) that the satisfaction and truth of the compound formulas of PA are always finitarily decidable under the assignment I_PA(N; SC), from which we may finitarily conclude that PA is consistent. We further conclude that the appropriate inference to be drawn from Goedel's 1931 paper on undecidable arithmetical propositions is that we can define PA formulas which---under interpretation---are algorithmically verifiable as always true over N, but not algorithmically computable as always true over N. We conclude from this that Lucas' Goedelian argument is validated if the assignment I_PA(N; SV) can be treated as circumscribing the ambit of human reasoning about `true' arithmetical propositions, and the assignment I_PA(N; SC) as circumscribing the ambit of mechanistic reasoning about `true' arithmetical propositions.
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based, definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic'; whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'. We then adopt what may be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. We then consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based, definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways. We show that the two definitions correspond to two distinctly different---not necessarily evidence-based but complementary---assignments of satisfaction and truth to the compound formulas of PA over N. We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both the complementary interpretations; and conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.
Classical interpretations of Goedel's formal reasoning, and of his conclusions, implicitly imply that mathematical languages are essentially incomplete, in the sense that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is both non-algorithmic and essentially unverifiable. However, a language of general scientific discourse, which intends to mathematically express, and unambiguously communicate, intuitive concepts that correspond to scientific investigations, cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical, Tarskian, truth, and of Goedel's reasoning, under which any formal system of Peano Arithmetic---classically accepted as the foundation of all our mathematical languages---is verifiably complete in the above sense. We show how some paradoxical concepts of quantum mechanics can then be expressed, and interpreted, naturally under a constructive definition of mathematical truth.
Husserl (a mathematician by education) left behind a few famous and notable philosophical “slogans” along with his innovative doctrine of phenomenology, directed to transcend “reality” toward a more general essence underlying both “body” and “mind” (after Descartes) and sometimes called “ontology” (terminologically following his notorious assistant Heidegger). Then, Husserl’s tradition can be tracked as an idea for philosophy to be reinterpreted in a way both generalized and mathematizable in the final analysis. The paper offers a pattern borrowed from the theory of information and quantum information (therefore relating philosophy to both mathematics and physics) to formalize logically a few key concepts of Husserl’s phenomenology, such as “epoché” and the “eidetic, phenomenological, and transcendental reductions”, as well as the identification of the “phenomenological, transcendental, and psychological reductions”, in a way allowing for that identification to be continued to the “eidetic reduction” (and thus to mathematics). The approach is tested on an independent and earlier idea of Husserl’s, “logical arithmetic” (implemented in parallel in mathematics by Whitehead and Russell’s Principia), as which “Hilbert arithmetic”, generalizing Peano arithmetic, is interpreted. A basic conclusion states that the unification of philosophy, mathematics, and physics in their foundations and fundamentals is the Husserl tradition, both tracked to its origin (in being itself, after Heidegger, or after Husserl’s “zu den Sachen selbst”) and embodied in the development of human cognition in the third millennium.
The paper concentrates on the specific changes to the conception of causality from quantum mechanics to quantum information, against the background of the revolution brought by the former to classical physics and science after Max Born’s probabilistic reinterpretation of the wave function. Those changes can be enumerated as follows: (1) quantum information describes the general case of the relation of two wave functions, and particularly, the causal amendment of a single one; (2) it keeps the physical description causal through the conservation of quantum information and in accordance with Born’s interpretation; (3) it introduces inverse causality, “backwards in time”, observable “forwards in time” as the fundamentally random probability density distribution of all possible measurements of any physical quantity in quantum mechanics; (4) it involves a kind of “bidirectional causality” unifying (4.1) the classical determinism of cause and effect, (4.2) the probabilistic causality of quantum mechanics, and (4.3) the reversibility of any coherent state; (5) it identifies determinism with the successor function in Peano arithmetic, and its properly generalized causality with the information successor function in Hilbert arithmetic.
For each positive n, two alternative axiomatizations of the theory of strings over n alphabetic characters are presented. One class of axiomatizations derives from Tarski's system of the Wahrheitsbegriff and uses the n characters and concatenation as primitives. The other class involves using n character-prefixing operators as primitives and derives from Hermes' Semiotik. All underlying logics are second-order. It is shown that, for each n, the two theories are definitionally equivalent [or synonymous in the sense of de Bouvere]. It is further shown that each member of one class is synonymous with each member of the other class; thus all of the theories are definitionally equivalent with each other and with Peano arithmetic. Categoricity of Peano arithmetic then implies categoricity of each of the above theories.
Those incompleteness theorems concern the relation of (Peano) arithmetic and (ZFC) set theory, or philosophically, the relation of arithmetical finiteness and actual infinity. The same is managed in the framework of set theory by the axiom of choice (respectively, by the equivalent well-ordering "theorem"). One may discuss that incompleteness from the viewpoint of set theory, via the axiom of choice, rather than from the usual viewpoint meant in the proof of the theorems. The logical corollaries from that "nonstandard" viewpoint on the relation of set theory and arithmetic are demonstrated.
Tarski's Undefinability of Truth Theorem comes in two versions: that no consistent theory which interprets Robinson's Arithmetic (Q) can prove all instances of the T-Scheme and hence define truth; and that no such theory, if sound, can even express truth. In this note, I prove corresponding limitative results for validity. While Peano Arithmetic already has the resources to define a predicate expressing logical validity, as Jeff Ketland has recently pointed out (2012, Validity as a primitive. Analysis 72: 421–30), no theory which interprets Q and is closed under the standard structural rules can define or express validity, on pain of triviality. The results put pressure on the widespread view that there is an asymmetry between truth and validity, viz. that while the former cannot be defined within the language, the latter can. I argue that Vann McGee's and Hartry Field's arguments for the asymmetry view are problematic.
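Ketland's definability point rests on Gödel's completeness theorem, under which logical validity coincides with derivability in pure logic; a schematic sketch (the notation Val and Prov_FOL is illustrative, not the note's own):

```latex
% A first-order sentence is logically valid iff it is derivable in the
% predicate calculus (Gödel completeness), so PA can define:
\mathrm{Val}(x) \;:\equiv\; \mathrm{Prov}_{\mathrm{FOL}}(x)
% where Prov_FOL is a \Sigma_1 provability predicate for pure logic,
% arithmetizable in PA. The limitative results above concern instead a
% disquotational validity predicate governed by the analogue of the
% T-scheme; combined with the standard structural rules, it trivializes
% the theory.
```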
This paper is part of a project that is based on the notion of a dialectical system, introduced by Magari as a way of capturing trial and error mathematics. In Amidei et al. (2016, Rev. Symb. Logic, 9, 1–26) and Amidei et al. (2016, Rev. Symb. Logic, 9, 299–324), we investigated the expressive and computational power of dialectical systems, and we compared them to a new class of systems, that of quasi-dialectical systems, that enrich Magari’s systems with a natural mechanism of revision. In the present paper we consider a third class of systems, that of p-dialectical systems, that naturally combine features coming from the two other cases. We prove several results about p-dialectical systems and the sets that they represent. Then we focus on the completions of first-order theories. In doing so, we consider systems with connectives, i.e. systems that encode the rules of classical logic. We show that any consistent system with connectives represents the completion of a given theory. We prove that dialectical and q-dialectical systems coincide with respect to the completions that they can represent. Yet, p-dialectical systems are more powerful; we exhibit a p-dialectical system representing a completion of Peano Arithmetic that is neither dialectical nor q-dialectical.
By a classical result of Kotlarski, Krajewski and Lachlan, pathological satisfaction classes can be constructed for countable, recursively saturated models of Peano arithmetic. In this paper we consider the question of whether the pathology can be eliminated; we ask in effect what generalities involving the notion of truth can be obtained in a deflationary truth theory (a theory of truth which is conservative over its base). It is shown that the answer depends on the notion of pathology we adopt. It turns out in particular that a certain natural closure condition imposed on a satisfaction class—namely, closure of truth under sentential proofs—generates a nonconservative extension of a syntactic base theory (Peano arithmetic).
According to Augustine, abstract objects are ideas in the Mind of God. Because numbers are a type of abstract object, it would follow that numbers are ideas in the Mind of God. Let us call such a view the Augustinian View of Numbers (AVN). In this paper, I present a formal theory for AVN. The theory stems from the symmetry conception of God as it appears in Studtmann (2021). I show that Robinson’s Arithmetic is a conservative extension of the axioms in Studtmann’s original paper. The extension is made possible by identifying the set of natural numbers with God, 0 with Being, and the successor function with the essence function. The resulting theory can then be augmented to include Peano Arithmetic by adding a set-theoretic version of induction and a comprehension schema restricted to arithmetically definable properties. In addition to these formal matters, the paper provides a characterization of the mind of God. According to the characterization, the Being essences that constitute God’s mind act as both numbers and representations – each (except for Being itself) has all the properties of some number and encodes all the properties of that number’s predecessor. The conception of God that emerges by the end of the discussion is a conception of an infinite, ineffable, axiologically and metaphysically ultimate entity that contains objects that not only serve as numbers but also encode information about each other.
We present an overview of typed and untyped disquotational truth theories with the emphasis on their (non)conservativity over the base theory of syntax. Two types of conservativity are discussed: syntactic and semantic. We observe in particular that TB—one of the most basic disquotational theories—is not semantically conservative over its base; we show also that an untyped disquotational theory PTB is a syntactically conservative extension of Peano Arithmetic.
The “four-color” theorem seems to be generalizable as follows. A four-letter alphabet is sufficient to encode unambiguously any set of well-orderings, including a geographical map, the “map” of any logic (and thus of all logics), or the DNA plan of any living being. The corresponding maximally general conjecture would then state: anything in the universe or in the mind can be encoded unambiguously by four letters. This can be formulated as a “four-letter theorem”, and one can then search for a properly mathematical proof of the statement. It would imply the “four-color theorem”, whose existing proof many philosophers and mathematicians believe not to be entirely satisfactory, since it is not a “human proof” but is unavoidably mediated by computers, the necessary calculations fundamentally exceeding human capabilities. It is furthermore rather unsatisfactory because it consists in enumerating and proving all cases one by one. Sometimes a more general theorem turns out to be much easier to prove, admitting a general “human” method, while the particular theorem, too difficult to prove directly, follows as a corollary under certain simple conditions. The same approach is followed here for the four-color theorem: it is to be deduced more or less trivially from the “four-letter theorem”, should the latter be proved. References are only to classical and thus very well-known papers; their complete bibliographic description is omitted.
Is a logicist bound to the claim that as a matter of analytic truth there is an actual infinity of objects? If Hume’s Principle is analytic then in the standard setting the answer appears to be yes. Hodes’s work pointed to a way out by offering a modal picture in which only a potential infinity was posited. However, this project was abandoned due to apparent failures of cross-world predication. We re-explore this idea and discover that in the setting of the potential infinite one can interpret first-order Peano arithmetic, but not second-order Peano arithmetic. We conclude that in order for the logicist to weaken the metaphysically loaded claim of necessary actual infinities, they must also weaken the mathematics they recover.
A truth-preservation fallacy is using the concept of truth-preservation where some other concept is needed. For example, in certain contexts saying that consequences can be deduced from premises using truth-preserving deduction rules is a fallacy if it suggests that all truth-preserving rules are consequence-preserving. The arithmetic additive-associativity rule that yields 6 = (3 + (2 + 1)) from 6 = ((3 + 2) + 1) is truth-preserving but not consequence-preserving. As noted in James Gasser’s dissertation, Leibniz has been criticized for using that rule in attempting to show that arithmetic equations are consequences of definitions.

A system of deductions is truth-preserving if each of its deductions having true premises has a true conclusion—and consequence-preserving if, for any given set of sentences, each deduction having premises that are consequences of that set has a conclusion that is a consequence of that set. Consequence-preserving amounts to: in each of its deductions the conclusion is a consequence of the premises. The same definitions apply to deduction rules considered as systems of deductions. Every consequence-preserving system is truth-preserving. It is not as well-known that the converse fails: not every truth-preserving system is consequence-preserving. Likewise for rules: not every truth-preserving rule is consequence-preserving. There are many famous examples. In ordinary first-order Peano Arithmetic, the induction rule yields the conclusion ‘every number x is such that: x is zero or x is a successor’—which is not a consequence of the null set—from two tautological premises, which are consequences of the null set, of course. The arithmetic induction rule is truth-preserving but not consequence-preserving. Truth-preserving rules that are not consequence-preserving are non-logical or extra-logical rules.
Such rules are unacceptable to persons espousing traditional truth-and-consequence conceptions of demonstration: a demonstration shows its conclusion is true by showing that its conclusion is a consequence of premises already known to be true. The 1965 Preface in Benson Mates (1972, vii) contains the first occurrence of truth-preservation fallacies in the book.
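The distinction drawn in the abstract above can be made concrete with a toy check (an editorial illustration, not part of the paper; the function names are invented for the sketch): the additive-associativity rewrite always preserves the truth value of an equation, even though, as the abstract argues, truth-preservation alone does not guarantee consequence-preservation.

```python
# Toy illustration: the additive-associativity rewrite is truth-preserving.
# It maps the equation 6 == ((3 + 2) + 1) to 6 == (3 + (2 + 1)) and, more
# generally, rewrites ((a + b) + c) on the right-hand side to (a + (b + c)).

def reassociate(a, b, c):
    """Return the values of ((a + b) + c) and (a + (b + c))."""
    return (a + b) + c, a + (b + c)

def rule_is_truth_preserving(lhs, a, b, c):
    """Check: if 'lhs == ((a+b)+c)' is true, then so is 'lhs == (a+(b+c))'."""
    before, after = reassociate(a, b, c)
    return (lhs == before) <= (lhs == after)  # material implication on bools

# The example from the abstract:
assert rule_is_truth_preserving(6, 3, 2, 1)

# Spot-check the rule over a small grid of integer instances.
assert all(
    rule_is_truth_preserving(lhs, a, b, c)
    for lhs in range(10) for a in range(4) for b in range(4) for c in range(4)
)
```

The check passes on every instance, as associativity guarantees; the paper's point is that this kind of truth-preservation is still weaker than consequence-preservation.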
The paper investigates the understanding of quantum indistinguishability after quantum information, in comparison with the “classical” quantum mechanics based on the separable complex Hilbert space. The two oppositions implicit in the concept of quantum indistinguishability, correspondingly “distinguishability / indistinguishability” and “classical / quantum”, can be interpreted as the two “missing” bits of classical information which must be added after teleportation of quantum information in order to restore the initial state unambiguously. That new understanding of quantum indistinguishability is linked to the distinction between classical (Maxwell-Boltzmann) and quantum (either Fermi-Dirac or Bose-Einstein) statistics. The latter can be generalized to classes of wave functions (“empty” qubits) and represented exhaustively in Hilbert arithmetic, and is therefore connectible to the foundations of mathematics, more precisely to the interrelations of propositional logic and set theory, which share the structure of Boolean algebra, and two anti-isometric copies of Peano arithmetic.
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis: * Hilbert's epsilon-calculus * Goedel's omega-consistency * The Law of the Excluded Middle * Hilbert's omega-Rule * An Algorithmic omega-Rule * Gentzen's Rule of Infinite Induction * Rosser's Rule C * Markov's Principle * The Church-Turing Thesis * Aristotle's particularisation * Wittgenstein's perspective of constructive mathematics * An evidence-based perspective of quantification. By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus; and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences, of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
The concept of formal transcendentalism is utilized. The fundamental and definitive property of the totality is that “the totality is all”; thus, its externality (unlike that of any other entity) is contained within it. This generates a fundamental (or philosophical) “doubling” of anything referred to the totality, i.e. considered philosophically. That doubling, as well as the transcendentalism underlying it, can be interpreted formally as an elementary choice, such as a bit of information, together with a quantity corresponding to the number of elementary choices to be defined. This is the quantity of information, defined both transcendentally and formally, and thus both philosophically and mathematically. If one defines information specifically as an elementary choice between finiteness (mathematically, any natural number of Peano arithmetic) and infinity (i.e. an actually infinite set in the meaning of set theory), the quantity of quantum information is defined. One can demonstrate that the so-defined quantum information and quantum information as standardly defined by quantum mechanics are equivalent to each other. The equivalence of the axiom of choice and the well-ordering “theorem” is involved. It can be justified transcendentally as well, in virtue of the transcendental equivalence implied by the totality. Thus, all can be considered as temporal insofar as anything necessarily possesses such a temporal counterpart. Formally defined, the frontier of time is the current choice now, a bit of information, furthermore interpretable as a qubit of quantum information.
A set theory model of reality, representation and language based on the relation of completeness and incompleteness is explored. The problem of completeness of mathematics is linked to its counterpart in quantum mechanics. That model includes two Peano arithmetics or Turing machines independent of each other. The complex Hilbert space underlying quantum mechanics as the base of its mathematical formalism is interpreted as a generalization of Peano arithmetic: it is a doubled infinite set of doubled Peano arithmetics having a remarkable symmetry with respect to the axiom of choice. The quantity of information is interpreted as the number of elementary choices (bits). Quantum information is seen as the generalization of information to infinite sets or series. The equivalence of that model to a quantum computer is demonstrated. The condition for the Turing machines to be independent of each other is reduced to the state of Nash equilibrium between them. Two relative models of language, as game in the sense of game theory and as ontology of metaphors (all mappings which are not one-to-one, i.e. not representations of reality in a formal sense), are deduced.
The concepts of choice, negation, and infinity are considered jointly. The link is the quantity of information, interpreted as the quantity of choices measured in units of elementary choice: a bit is an elementary choice between two equally probable alternatives. “Negation” supposes a choice between it and confirmation. Thus the quantity of information can also be interpreted as a quantity of negations. The disjunctive choice between confirmation and negation as to infinity can in turn be chosen or not: this corresponds to the set-theoretic or the intuitionist approach to the foundation of mathematics, and to Peano or Heyting arithmetic. Quantum mechanics can be reformulated in terms of information by introducing the concept and quantity of quantum information. A qubit can be equivalently interpreted as that generalization of “bit” where the choice is among an infinite set or series of alternatives. The complex Hilbert space can be represented as both a series of qubits and a value of quantum information. The complex Hilbert space is that generalization of Peano arithmetic where any natural number is substituted by a qubit. “Negation”, “choice”, and “infinity” can be inherently linked to each other both in the foundation of mathematics and in quantum mechanics by the mediation of “information” and “quantum information”.
A principle according to which any scientific theory can be mathematized is investigated. That theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and the conception of “bracketing reality” is then modelled so as to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic obtained by replacing the axiom of induction with that of transfinite induction. A comparison with Mach’s doctrine reveals the fundamental and philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics into Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed.
An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
A principle according to which any scientific theory can be mathematized is investigated. Social science, liberal arts, history, and philosophy are meant first of all. That kind of theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. The main statement is formulated as follows: any scientific theory admits isomorphism to some mathematical structure in a constructive way. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and the conception of “bracketing reality” is then modelled so as to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic obtained by replacing the axiom of induction with that of transfinite induction. The sketch of the proof is organized in five steps: a generalization of epoché; involving transfinite induction in the transition between Peano arithmetic and set theory; discussing the finiteness of Peano arithmetic; applying transfinite induction to Peano arithmetic; discussing an arithmetical model of reality. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics.
The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics into Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. The present paper follows a pathway grounded on Husserl’s phenomenology and “bracketing reality” to achieve the generalized arithmetic necessary for the principle to be founded in an alternative ontology, in which there is no reality external to mathematics: reality is included within mathematics. That latter mathematics is able to found itself and can be called Hilbert mathematics in honour of Hilbert’s program for self-founding mathematics on the basis of arithmetic. The principle of universal mathematizability is consistent with Hilbert mathematics, but not with Gödel mathematics. Consequently, its validity or rejection would resolve the problem of which mathematics refers to our being; and vice versa: the choice between them, for whatever reasons, would confirm or refute the principle as to our being. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. The Schrödinger equation in quantum mechanics is involved to illustrate that ontology. Thus the problem of which of the two mathematics is more relevant to our being is discussed again in a new way. A few directions for future work can be: a rigorous formal proof of the principle as an independent axiom; the further development of an information ontology consistent with both kinds of mathematics, but much more natural for Hilbert mathematics; the development of the information interpretation of quantum mechanics as a mathematical one for information ontology and thus Hilbert mathematics; the description of consciousness in terms of information ontology.
The quantum information introduced by quantum mechanics is equivalent to a certain generalization of classical information: from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The “qubit” can be interpreted as that generalization of “bit” which is a choice among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of results in time after measurement. The quantity of quantum information is the transfinite ordinal number corresponding to the infinite series in question. The transfinite ordinal numbers can be defined as correspondingly generalized “transfinite natural numbers”, extending the natural numbers of Peano arithmetic to “Hilbert arithmetic” and allowing for the unification of the foundations of mathematics and quantum mechanics.
It is commonly thought that such topics as Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason are disparate scientific physical or mathematical issues having little or nothing in common. I suggest that they are largely standard philosophical problems (i.e., language games) which were resolved by Wittgenstein over 80 years ago.

Wittgenstein also demonstrated the fatal error in regarding mathematics or language or our behavior in general as a unitary coherent logical ‘system,’ rather than as a motley of pieces assembled by the random processes of natural selection. “Gödel shows us an unclarity in the concept of ‘mathematics’, which is indicated by the fact that mathematics is taken to be a system”, and we can say (contra nearly everyone) that this is all that Gödel and Chaitin show. Wittgenstein commented many times that ‘truth’ in math means axioms or the theorems derived from axioms, and ‘false’ means that one made a mistake in using the definitions, and this is utterly different from empirical matters where one applies a test. Wittgenstein often noted that to be acceptable as mathematics in the usual sense, it must be usable in other proofs and it must have real world applications, but neither is the case with Gödel’s Incompleteness. Since it cannot be proved in a consistent system (here Peano Arithmetic, but a much wider arena for Chaitin), it cannot be used in proofs and, unlike all the ‘rest’ of PA, it cannot be used in the real world either.
As Rodych notes, “…Wittgenstein holds that a formal calculus is only a mathematical calculus (i.e., a mathematical language-game) if it has an extra-systemic application in a system of contingent propositions (e.g., in ordinary counting and measuring or in physics) …” Another way to say this is that one needs a warrant to apply our normal use of words like ‘proof’, ‘proposition’, ‘true’, ‘incomplete’, ‘number’, and ‘mathematics’ to a result in the tangle of games created with ‘numbers’ and ‘plus’ and ‘minus’ signs etc., and with ‘Incompleteness’ this warrant is lacking. Rodych sums it up admirably: “On Wittgenstein’s account, there is no such thing as an incomplete mathematical calculus because ‘in mathematics, everything is algorithm [and syntax] and nothing is meaning [semantics]…’”

I make some brief remarks which note the similarities of these ‘mathematical’ issues to economics, physics, game theory, and decision theory.

Those wishing further comments on philosophy and science from a Wittgensteinian two systems of thought viewpoint may consult my other writings -- Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019 3rd ed (2019), The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019), Suicide by Democracy 4th ed (2019), The Logical Structure of Human Behavior (2019), The Logical Structure of Consciousness (2019), Understanding the Connections between Science, Philosophy, Psychology, Religion, Politics, and Economics and Suicidal Utopian Delusions in the 21st Century 5th ed (2019), Remarks on Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason in Chaitin, Wittgenstein, Hofstadter, Wolpert, Doria, da Costa, Godel, Searle, Rodych, Berto, Floyd, Moyal-Sharrock and Yanofsky (2019), and The Logical Structure of Philosophy, Psychology,
Sociology, Anthropology, Religion, Politics, Economics, Literature and History (2019).
Andrew Wiles' analytic proof of Fermat's Last Theorem FLT, which appeals to geometrical properties of real and complex numbers, leaves two questions unanswered: (i) What technique might Fermat have used that led him to, even if only briefly, believe he had ‘a truly marvellous demonstration’ of FLT? (ii) Why is x^n+y^n=z^n solvable only for n<3? In this inter-disciplinary perspective, we offer insight into, and answers to, both queries; yielding a pre-formal proof of why FLT can be treated as a true arithmetical proposition (one which, moreover, might not be provable formally in the first-order Peano Arithmetic PA), where we admit only elementary (i.e., number-theoretic) reasoning, without appeal to analytic properties of real and complex numbers. We cogently argue, further, that any formal proof of FLT needs---as is implicitly suggested by Wiles' proof---to appeal essentially to formal geometrical properties of formal arithmetical propositions.
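As a small numerical companion to question (ii) above (an editorial sketch, not part of the paper's argument; the function name is invented): a brute-force search over a bounded range illustrates the contrast the abstract points at, namely that x^n + y^n = z^n has positive-integer solutions for n = 1 and n = 2, while none turn up for n = 3 or n = 4. A finite search of course proves nothing about FLT; it only exhibits the pattern.

```python
# Editorial sketch: search for positive-integer solutions of x**n + y**n == z**n
# within a small bound. This proves nothing about FLT in general; it merely
# illustrates the contrast between n < 3 and n >= 3 raised in the abstract.

def solutions(n, bound):
    """All (x, y, z) with 1 <= x <= y < z <= bound and x**n + y**n == z**n."""
    return [
        (x, y, z)
        for x in range(1, bound + 1)
        for y in range(x, bound + 1)
        for z in range(y + 1, bound + 1)
        if x ** n + y ** n == z ** n
    ]

assert solutions(1, 10)                # trivially solvable for n = 1 (e.g. 1+1=2)
assert (3, 4, 5) in solutions(2, 25)   # Pythagorean triples exist for n = 2
assert not solutions(3, 50)            # no solutions found for n = 3 ...
assert not solutions(4, 50)            # ... nor for n = 4, within the bound
```

The bound of 50 keeps the search instantaneous; raising it never changes the n ≥ 3 outcome, which is exactly what FLT asserts.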
The central topic of this article is de re knowledge about natural numbers and its relation with names for numbers. It is held by several prominent philosophers that numerals are eligible for existential quantification in epistemic contexts, whereas other names for natural numbers are not. In other words, numerals are intimately linked with de re knowledge about natural numbers, whereas the other names for natural numbers are not. In this article I am looking for an explanation of this phenomenon. It is argued that the standard induction scheme plays a key role.
Philosophers of science since Nagel have been interested in the links between intertheoretic reduction and explanation, understanding and other forms of epistemic progress. Although intertheoretic reduction is widely agreed to occur in pure mathematics as well as empirical science, the relationship between reduction and explanation in the mathematical setting has rarely been investigated in a similarly serious way. This paper examines an important particular case: the reduction of arithmetic to set theory. I claim that the reduction is unexplanatory. In defense of this claim, I offer evidence from mathematical practice, and I respond to contrary suggestions due to Steinhart, Maddy, Kitcher and Quine. I then show how, even if set-theoretic reductions are generally not explanatory, set theory can nevertheless serve as a legitimate foundation for mathematics. Finally, some implications of my thesis for philosophy of mathematics and philosophy of science are discussed. In particular, I suggest that some reductions in mathematics are probably explanatory, and I propose that differing standards of theory acceptance might account for the apparent lack of unexplanatory reductions in the empirical sciences.
The paper explores the idea that some singular judgements about the natural numbers are immune to error through misidentification by pursuing a comparison between arithmetic judgements and first-person judgements. By doing so, the first part of the paper offers a conciliatory resolution of the Coliva-Pryor dispute about so-called “de re” and “which-object” misidentification. The second part of the paper draws some lessons about what it takes to explain immunity to error through misidentification. The lessons are: First, the so-called Simple Account of which-object immunity to error through misidentification, to the effect that a judgement is immune to this kind of error just in case its grounds do not feature any identification component, fails. Secondly, wh-immunity can be explained by a Reference-Fixing Account to the effect that a judgement is immune to this kind of error just in case its grounds are constituted by the facts whereby the reference of the concept of the object which the judgement concerns is fixed. Thirdly, a suitable revision of the Simple Account explains the de re immunity of those arithmetic judgements which are not wh-immune. These three lessons point towards the general conclusion that there is no unifying explanation of de re and wh-immunity.
Is calculation possible without language? Or is the human ability for arithmetic dependent on the language faculty? To clarify the relation between language and arithmetic, we studied numerical cognition in speakers of Mundurukú, an Amazonian language with a very small lexicon of number words. Although the Mundurukú lack words for numbers beyond 5, they are able to compare and add large approximate numbers that are far beyond their naming range. However, they fail in exact arithmetic with numbers larger than 4 or 5. Our results imply a distinction between a nonverbal system of number approximation and a language-based counting system for exact number and arithmetic.
Orthodoxy holds that there is a determinate fact of the matter about every arithmetical claim. Little argument has been supplied in favour of orthodoxy, and work of Field, Warren and Waxman, and others suggests that the presumption in its favour is unjustified. This paper supports orthodoxy by establishing the determinacy of arithmetic in a well-motivated modal plural logic. Recasting this result in higher-order logic reveals that even the nominalist who thinks that there are only finitely many things should think that there is some sense in which arithmetic is true and determinate.
An arithmetic theory of oppositions is devised by comparing expressions, Boolean bitstrings, and integers. This leads to a set of correspondences between three domains of investigation, namely logic, geometry, and arithmetic. The structural properties of each area are investigated in turn, before justifying the procedure as a whole. To finish, I show how this helps to improve the logical calculus of oppositions, through the consideration of corresponding operations between integers.
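A minimal sketch of the integer/bitstring correspondence the abstract above describes (an editorial illustration: the encodings and relation names follow the standard bitstring semantics for the square of opposition, not necessarily the author's exact calculus). Each proposition is an integer whose bits record its truth value across a fixed set of cases; the classical opposition relations then become bitwise tests on those integers.

```python
# Hedged sketch: standard bitstring semantics for the square of opposition.
# Three cases for a non-empty subject term: (1) all S are P, (2) some but not
# all S are P, (3) no S is P. Bit i of a proposition is 1 iff it holds in case i.

WIDTH = 3
FULL = (1 << WIDTH) - 1  # 0b111: true in every case

def contradictory(p, q):
    """Never true together, and jointly exhaustive: q is p's complement."""
    return (p & q) == 0 and (p | q) == FULL

def contrary(p, q):
    """Never true together, but can both fail."""
    return (p & q) == 0 and (p | q) != FULL

def subcontrary(p, q):
    """Can both hold, but never both fail."""
    return (p & q) != 0 and (p | q) == FULL

def subaltern(p, q):
    """q follows from p: every case verifying p also verifies q."""
    return (p & ~q & FULL) == 0 and p != q

# Aristotle's square with the usual three-case encodings:
A, E, I, O = 0b100, 0b001, 0b110, 0b011   # all / no / some / some-not

assert contradictory(A, O) and contradictory(E, I)
assert contrary(A, E)
assert subcontrary(I, O)
assert subaltern(A, I) and subaltern(E, O)
```

The point of the sketch is the abstract's correspondence: once expressions are coded as integers, geometric relations on the square reduce to arithmetic (bitwise) operations.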
It is well known that Peano had a reluctant attitude towards philosophy, including the philosophy of mathematics. Some scholars have suggested the existence of an 'implicit' philosophy, without being able to describe it. In this paper a first attempt is made to reconstruct, if not a general philosophy of mathematics, at least Peano's epistemology of mathematics and its relation to contemporary positions.
Frege and Peano started in 1896 a debate in which they contrasted their respective conceptions of the theory and practice of mathematical definitions. What influence (if any) did the Frege-Peano debate have on the two authors' conceptions of defining in mathematics, and what role did this debate play in the broader context of their scientific interaction?
Neo-Fregean logicists claim that Hume's Principle (HP) may be taken as an implicit definition of cardinal number, true simply by fiat. A longstanding problem for neo-Fregean logicism is that HP is not deductively conservative over pure axiomatic second-order logic. This seems to preclude HP from being true by fiat. In this paper, we study Richard Kimberly Heck's Two-sorted Frege Arithmetic (2FA), a variation on HP which has been thought to be deductively conservative over second-order logic. We show that it isn't. In fact, 2FA is not conservative over $n$-th order logic, for all $n \geq 2$. It follows that in the usual one-sorted setting, HP is not deductively Field-conservative over second- or higher-order logic.