The Goedelian approach is discussed as a prime example of a science directed towards its origins. While merely self-referential objectification locks into its own byproducts, self-releasing objectification informs the formation of the objects at hand and their different levels of interconnection. Guided by the spirit of Goedel's work, a self-reflective science can open the road where old tenets see only blocked paths. "This is, as it were, an analysis of the analysis itself, but if that is done it forms the fundamental (...) of human science, as far as this kind of things is concerned." G. W. Leibniz, 'Methodus Nova ...', 1673.
Classical interpretations of Goedel's formal reasoning, and of his conclusions, imply that mathematical languages are essentially incomplete, in the sense that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is both non-algorithmic and essentially unverifiable. However, a language of general scientific discourse, which intends to express mathematically, and communicate unambiguously, intuitive concepts that correspond to scientific investigations, cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical Tarskian truth, and of Goedel's reasoning, under which any formal system of Peano Arithmetic---classically accepted as the foundation of all our mathematical languages---is verifiably complete in the above sense. We then show how some paradoxical concepts of quantum mechanics can be expressed, and interpreted, naturally under a constructive definition of mathematical truth.
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis Hilbert's epsilon-calculus; Goedel's omega-consistency; the Law of the Excluded Middle; Hilbert's omega-rule; an algorithmic omega-rule; Gentzen's rule of infinite induction; Rosser's Rule C; Markov's principle; the Church-Turing Thesis; Aristotle's particularisation; Wittgenstein's perspective of constructive mathematics; and an evidence-based perspective of quantification. By showing how these are formally inter-related, we highlight the fragility both of the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and of the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences, for mathematics, mathematics education, philosophy, and the natural sciences, of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
We consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two hitherto unsuspected and essentially different ways: (1) in terms of classical algorithmic verifiability; and (2) in terms of finitary algorithmic computability. We then show that the two definitions correspond to two distinctly different assignments of satisfaction and truth to the compound formulas of PA over N---I_PA(N; SV) and I_PA(N; SC). We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both I_PA(N; SV) and I_PA(N; SC). We then show: (a) that if we assume the satisfaction and truth of the compound formulas of PA are always non-finitarily decidable under I_PA(N; SV), then this assignment corresponds to the classical, non-finitary, putative standard interpretation I_PA(N; S) of PA over the domain N; and (b) that the satisfaction and truth of the compound formulas of PA are always finitarily decidable under the assignment I_PA(N; SC), from which we may finitarily conclude that PA is consistent. We further conclude that the appropriate inference to be drawn from Goedel's 1931 paper on undecidable arithmetical propositions is that we can define PA formulas which---under interpretation---are algorithmically verifiable as always true over N, but not algorithmically computable as always true over N. We conclude from this that Lucas' Goedelian argument is validated if the assignment I_PA(N; SV) can be treated as circumscribing the ambit of human reasoning about 'true' arithmetical propositions, and the assignment I_PA(N; SC) as circumscribing the ambit of mechanistic reasoning about 'true' arithmetical propositions.
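The abstract's central distinction---instance-wise algorithmic verifiability versus algorithmic computability of a truth value---can be loosely illustrated in code. The following Python sketch is an analogy only, not the paper's formal construction; the Goldbach-style predicate and the function name are illustrative choices: each numerical instance of a universally quantified claim is checkable by a terminating routine, yet no finite run of such checks computes the truth value of the universal claim itself.

```python
def is_sum_of_two_primes(n: int) -> bool:
    """Terminating check of a single instance of a Goldbach-style claim."""
    def is_prime(k: int) -> bool:
        if k < 2:
            return False
        return all(k % d for d in range(2, int(k ** 0.5) + 1))
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n - 1))

# Each individual instance is algorithmically verifiable by a halting check:
print(all(is_sum_of_two_primes(n) for n in range(4, 1000, 2)))  # True

# But no finite run of this loop decides the universal statement
# "every even n >= 4 is a sum of two primes": verifying each instance
# is weaker than computing the truth value of the quantified formula.
```

The gap between the two notions is exactly that the loop above halts for every fixed instance, while the universal claim would require surveying infinitely many instances.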
This thesis describes two classes of Dialectica categories. Chapter one introduces Dialectica categories based on Goedel's Dialectica interpretation and shows that they constitute a model of Girard's Intuitionistic Linear Logic. Chapter two shows that, with extra assumptions, we can provide a comonad that interprets Girard's 'of course' modality !. Chapter three presents the second class of Dialectica categories, a simplification suggested by Girard, which models (classical) Linear Logic, and chapter four shows how to provide the modalities ! and ? for this second class of constructions.
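The Dialectica construction the thesis builds on can be sketched concretely over finite sets. The following Python snippet is a minimal, assumed encoding (the function name and toy relations are mine, not the thesis's): an object is a relation alpha on U x X, and a morphism from (U, X, alpha) to (V, Y, beta) is a pair of maps f: U -> V and F: U x Y -> X satisfying, for all u and y, alpha(u, F(u, y)) implies beta(f(u), y).

```python
def is_dialectica_morphism(U, X, alpha, V, Y, beta, f, F) -> bool:
    """Check the Dialectica morphism condition on finite carriers."""
    return all(
        (not alpha(u, F(u, y))) or beta(f(u), y)  # alpha => beta, pointwise
        for u in U for y in Y
    )

# Toy example over small carriers: alpha(u, x) = "u < x",
# beta(v, y) = "v <= y"; f is the identity and F projects out y.
U = X = V = Y = [0, 1, 2]
alpha = lambda u, x: u < x
beta = lambda v, y: v <= y
f = lambda u: u
F = lambda u, y: y

# u < y implies u <= y, so (f, F) satisfies the morphism condition:
print(is_dialectica_morphism(U, X, alpha, V, Y, beta, f, F))  # True
```

Swapping the two relations (a strict conclusion from a non-strict hypothesis) makes the condition fail, which is the finite-set shadow of why not every pair of maps is a Dialectica morphism.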
The aim of this paper is to argue that the (alleged) indeterminism of quantum mechanics, claimed by adherents of the Copenhagen interpretation since Born (1926), can be proved from Chaitin's follow-up to Goedel's (first) incompleteness theorem. In comparison, Bell's (1964) theorem, as well as the so-called free will theorem (originally due to Heywood and Redhead (1983)), left two loopholes for deterministic hidden variable theories: giving up either locality (more precisely: local contextuality, as in Bohmian mechanics) or free choice (i.e. uncorrelated measurement settings, as in 't Hooft's cellular automaton interpretation of quantum mechanics). The main point is that Bell and others did not exploit the full empirical content of quantum mechanics, which consists of long series of outcomes of repeated measurements (idealized as infinite binary sequences): their arguments used only the long-run relative frequencies derived from such series, and hence merely asked hidden variable theories to reproduce single-case Born probabilities defined by certain entangled bipartite states. If we idealize the binary outcome strings of a fair quantum coin flip as infinite sequences, quantum mechanics predicts that these typically (i.e. almost surely) have a property called 1-randomness in logic, which is much stronger than uncomputability. This is the key to my claim, which is admittedly based on a stronger (yet compelling) notion of determinism than what is common in the literature on hidden variable theories.
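The contrast the abstract draws between long-run frequencies and 1-randomness can be made concrete: any bit stream generated by a program is computable, and a computable infinite sequence is never 1-random (Martin-Löf random), even when it passes simple frequency tests. A minimal Python sketch, where the seed, sample size, and fairness tolerance are illustrative assumptions:

```python
import random

def pseudo_coin_flips(seed: int, n: int) -> list:
    """A computable 'coin': the whole sequence is determined by its seed."""
    rng = random.Random(seed)
    return [rng.randrange(2) for _ in range(n)]

bits = pseudo_coin_flips(seed=42, n=100_000)

# The long-run relative frequency looks fair, which is all that
# frequency-based (Bell-type) arguments inspect:
freq = sum(bits) / len(bits)
print(freq)

# Yet every bit is computable from (seed, index), so the infinite
# extension of this sequence cannot be 1-random: passing frequency
# tests is far weaker than the almost-sure 1-randomness the paper
# attributes to outcome sequences of a fair quantum coin.
```

The design point is that determinism in the paper's strong sense rules out exactly this kind of seeded reproducibility for genuinely quantum outcome strings.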
In this paper we analyze methodological and philosophical implications of algorithmic aspects of unconventional computation. First, we describe how the classical algorithmic universe developed and analyze why it became closed in the conventional approach to computation. Then we explain how new models of algorithms turned the classical closed algorithmic universe into the open world of algorithmic constellations, allowing higher flexibility and expressive power, and supporting constructivism and creativity in mathematical modeling. As Goedel's undecidability theorems demonstrate, the closed algorithmic universe restricts essential forms of mathematical cognition. In contrast, the open algorithmic universe, and even more the open world of algorithmic constellations, removes such restrictions and enables a new, richer understanding of computation.
This review concludes that if the authors know what mathematical logic is, they have not shared that knowledge with their readers. This highly praised book is replete with errors and incoherence.
The viewpoint that consciousness, including feeling, could be fully expressed by a computational device is known as strong artificial intelligence, or strong AI. Here I offer a defense of strong AI based on machine-state functionalism at the quantum level, or quantum-state functionalism. I consider arguments against strong AI, then summarize some counterarguments I find compelling, including Torkel Franzén's work, which challenges Roger Penrose's claim, based on Gödel incompleteness, that mathematicians have nonalgorithmic levels of "certainty". Some consequences of strong AI are then considered. A resolution is offered of some problems, including John Searle's Chinese Room problem and the problem of consciousness propagation under isomorphism.
Thinking about time travel is an entertaining way to explore how to understand time and its location in the broad conceptual landscape that includes causation, fate, action, possibility, experience, and reality. It is uncontroversial that time travel towards the future exists, and time travel to the past is generally recognized as permitted by Einstein's general theory of relativity, though no one yet knows whether nature truly allows it. Coherent time travel stories have added flair to traditional debates over the metaphysical status of the past, the reality of temporal passage, and the existence of free will. Moreover, plausible models of time travel and time machines can be used to investigate the subtle relation between space-time structure and causality. -/- This entry surveys some philosophical issues concerning time travel and should serve as a quick introduction. It includes a new and improved way to define a time machine.
The foundations of mathematics and physics no longer start with fundamental entities and their properties, such as spatial extension, points, lines, or the billiard-ball-like particles of Newtonian physics. Mathematics has abolished these from its foundations in set theory by making all assumptions explicit and structural. Particle physics has become completely mathematical, connecting to physical reality only through experimental technique. Applying the principles guiding the foundations of mathematics and physics to philosophical analysis underscores that only conscious experience has an intrinsic nature. This leads to a version of realistic monism in which the essence and totality of the existence of physical structure is immediate experience in some form. Identifying physical structure with conscious experience allows the application of mathematics to the evolution of consciousness. Some of the implications of Goedel's Incompleteness Theorem are connected to creativity and ethics.
What is so special and mysterious about the Continuum, this ancient, always topical, and, alongside the concept of integers, most intuitively transparent and omnipresent conceptual and formal medium for mathematical constructions and battlefield of mathematical inquiries? And why does it resist the century-long siege by the best mathematical minds of all times committed to penetrating, once and for all, its set-theoretical enigma? -/- The double-edged purpose of the present study is to save from the transfinite deadlock of higher set theory the jewel of the mathematical Continuum -- this genuine, even if mostly forgotten today, raison d'etre of all set-theoretical enterprises to Infinity and beyond, from Georg Cantor to W. Hugh Woodin to Buzz Lightyear -- by simultaneously exhibiting the limits and pitfalls of all old and new reductionist foundational approaches to mathematical truth: be it Cantor's or post-Cantorian Idealism, Brouwer's or post-Brouwerian Constructivism, Hilbert's or post-Hilbertian Formalism, Goedel's or post-Goedelian Platonism. -/- In the spirit of Zeno's paradoxes, but with the enormous historical advantage of hindsight, we claim that Cantor's set-theoretical methodology, powerful and rich in proof-theoretic and similar applications as it might be, is inherently limited by its epistemological framework of transfinite local causality, and neither can be held accountable for the properties of the Continuum already acquired through geometrical, analytical, and arithmetical studies, nor can it be used for an adequate, conceptually sensible, operationally workable, and axiomatically sustainable re-creation of the Continuum.
-/- From a strictly mathematical point of view, this intrinsic limitation of the constative and explicative power of higher set theory finds its explanation in the ultimate phenomenological obstacle to Cantor's transfinite construction identified in this study, similar to topological obstacles in homotopy theory and theoretical physics: the entanglement capacity of the mathematical Continuum.
The full text of this essay is available in an English translation (also on PhilPapers) under: Alfred Gierer, Science, religion, and basic biological issues that are open to interpretation. The article forms the closing chapter of the book "Alfred Gierer: Wissenschaftliches Denken, das Rätsel Bewusstsein und pro-religiöse Ideen", Königshausen & Neumann, Würzburg 2019. The range and limits of scientific explanation follow, on the one hand, from the universal validity of physical laws and, on the other, from principled, intrinsic limits of determinability and computability, especially for self-referential questions. This essay concerns basic questions, open to interpretation, connected with the relation between science and religion: the distinction between animal and human, the emergence of the mental capabilities of the biological species "man", the preconditions in natural law for a "life-friendly" physical universe, and the range and limits of a scientific explanation of human consciousness. At the philosophical, cultural, and religious level, science cannot resolve the ambiguity of the world. Agnostic and religious basic outlooks will coexist in the long run, and the choice between them is not only a question of knowledge but above all one of wisdom and of the art of living.
The enigma of the Emergence of Natural Languages, coupled or not with the closely related problem of their Evolution, is perceived today as one of the most important scientific problems. The purpose of the present study is to outline a solution to our problem which is epistemologically consonant with the Big Bang solution of the problem of the Emergence of the Universe. Such an outline, however, becomes articulable, understandable, and workable only in a drastically extended epistemic and scientific oecumene, where known and habitual approaches to the problem, both theoretical and experimental, become distant, isolated, even if to some degree still hospitable, conceptual and methodological islands. The guiding light of our inquiry will be Eugene Paul Wigner's metaphor of "the unreasonable effectiveness of mathematics in natural sciences", i.e., "the miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics", steadily unfolding before our eyes since at least the seventeenth century. Kurt Goedel's incompleteness and undecidability theory will be our guardian discerner against the logical fallacies of otherwise apparently plausible explanations. John Bell's "unspeakableness" and the commonplace counterintuitive character of quantum phenomena will be our encouragers. And the radical novelty of the Big Bang epistemological paradigm introduced here and adapted to our purposes will be an appropriate, even if probably shocking, response to our equally shocking discovery, in the oldest among well-preserved linguistic fossils, of perfect mathematical structures outdoing the best artifactual Assemblers.
This is an English translation of my essay: Alfred Gierer, Wissenschaft, Religion und die deutungsoffenen Grundfragen der Biologie. MPI for the History of Science, preprint 388, 1-21, also on PhilPapers. The range and limits of science are given by the universal validity of physical laws, and by intrinsic limitations, especially in self-referential contexts. In particular, neurobiology should not be expected to provide a full understanding of consciousness and the mind. Science cannot provide, by itself, an unambiguous interpretation of the natural order at the philosophical, cultural, and religious level. The diversity of interpretations, however, appears as a positive factor of cultural dynamics. Historically, the revival of the philosophy of nature in the Middle Ages included remarkable biological thoughts such as those of Eriugena and Thierry of Chartres. In this essay, emphasis is placed on basic issues of modern biology: the distinction between man and animal, the evolution of human mental capabilities, the physics of the universe as a precondition for biological evolution, and the intricacies of the brain-mind relation. They are open to agnostic as well as religious interpretations, the individual choice being mainly a matter of wisdom and not just of knowledge.
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. -/- We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled 'theistic', whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled 'atheistic'. -/- We then adopt what may be labelled a finitary, evidence-based, 'agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. -/- We then consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two hitherto unsuspected and essentially different ways. -/- We show that the two definitions correspond to two distinctly different---not necessarily evidence-based but complementary---assignments of satisfaction and truth to the compound formulas of PA over N. -/- We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both of the complementary interpretations; and we conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.