Throughout this paper, we try to show how and why our mathematical framework seems inappropriate for solving problems in the Theory of Computation. More precisely, the concept of turning back in time in paradoxes causes inconsistency in the modeling of the concept of Time in some semantic situations. As we see in the first chapter, by introducing a version of the “Unexpected Hanging Paradox”, we first attempt to open a new explanation for some paradoxes. In the second step, by applying this paradox, it is demonstrated that any formalized system for the Theory of Computation based on Classical Logic and the Turing Model of Computation leads us to a contradiction. We conclude that our mathematical framework is inappropriate for the Theory of Computation. Furthermore, the result provides a reason why many problems in Complexity Theory resist solution. (This work was completed on 2017-05-02, posted to viXra on 2017-05-14, and presented at UNILOG 2018, Vichy.)
The aim of this paper is to describe and analyze the epistemological justification of a proposal initially made by the biomathematician Robert Rosen in 1958. In this theoretical proposal, Rosen suggests using the mathematical concept of “category” and the correlative concept of “natural equivalence” in mathematical modeling applied to living beings. Our questions are the following: According to Rosen, to what extent does the mathematical notion of category give access to more “natural” formalisms in the modeling of living beings? Is the so-called “naturalness” of some kinds of equivalences (which the mathematical notion of category makes it possible to generalize and to put at the forefront) analogous to the naturalness of living systems? Rosen appears to answer “yes” and to ground this transfer of the concept of “natural equivalence” into biology on such an analogy. But this hypothesis, although fertile, remains debatable. Finally, this paper gives a brief account of the later evolution of Rosen’s arguments on this topic. In particular, it sheds light on the new role played by the notion of “category” in his more recent objections to the computational models that have pervaded almost every domain of biology since the 1990s.
The Computational Theory of Mind (CTM) holds that cognitive processes are essentially computational, and hence computation provides the scientific key to explaining mentality. The Representational Theory of Mind (RTM) holds that representational content is the key feature in distinguishing mental from non-mental systems. I argue that there is a deep incompatibility between these two theoretical frameworks, and that the acceptance of CTM provides strong grounds for rejecting RTM. The focal point of the incompatibility is the fact that representational content is extrinsic to formal procedures as such, and the intended interpretation of syntax makes no difference to the execution of an algorithm. So the unique 'content' postulated by RTM is superfluous to the formal procedures of CTM. And once these procedures are implemented in a physical mechanism, it is exclusively the causal properties of the physical mechanism that are responsible for all aspects of the system's behaviour. So once again, postulated content is rendered superfluous. To the extent that semantic content may appear to play a role in behaviour, it must be syntactically encoded within the system, and just as in a standard computational artefact, so too with the human mind/brain - it's pure syntax all the way down to the level of physical implementation. Hence 'content' is at most a convenient meta-level gloss, projected from the outside by human theorists, which itself can play no role in cognitive processing.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. REPRINTED 1976. Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. -/- This is the second of a series of three articles dealing with the application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for the improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition, we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis the improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is: 1971. Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74. https://www.academia.edu/s/fb081b1886?source=link .
CORCORAN RECOMMENDS COCCHIARELLA ON TYPE THEORY. The 1983 review in Mathematical Reviews 83e:03005 of: Cocchiarella, Nino, “The development of the theory of logical types and the notion of a logical subject in Russell's early philosophy: Bertrand Russell's early philosophy, Part I”. Synthese 45 (1980), no. 1, 71–115.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox, and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism.
DEFINING OUR TERMS A “paradox" is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory" is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis" is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis" is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since `appears', `believes' and `knows' all make elliptical reference to a participant, it is clear that `paradox', `indirect proof' and `direct proof' are all participant-relative. PARTICIPANT RELATIVITY In normal mathematical writing the participant is presumed to be “the community of mathematicians" or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof. WHAT IS A PARADOX’S SOLUTION? Of the above four sorts of argumentation only the paradox invites “solution" or “resolution", and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning" of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true.
Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy", as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant's state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time. -/- The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell's Principles of mathematics, [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell's article on type theory as well as Ernst Zermelo's two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal", “the largest ordinal", the well-ordering principle, “the well-ordering of the continuum", denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion. -/- RECOMMENDATION This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading. 
AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and logic” [249], which covers much of the same ground in much more detail. https://www.academia.edu/14089432/Argumentations_and_Logic .
This essay uses a mental files theory of singular thought—a theory saying that singular thought about, and reference to, a particular object requires possession of a mental store of information taken to be about that object—to explain how we could have such thoughts about abstract mathematical objects. After showing why we should want an explanation of this, I argue that none of three main contemporary mental files theories of singular thought—acquaintance theory, semantic instrumentalism, and semantic cognitivism—can give it. I argue for two claims intended to advance our understanding of singular thought about mathematical abstracta. First, that the conditions for possession of a file for an abstract mathematical object are the same as the conditions for possessing a file for an object perceived in the past—namely, that the agent retains information about the object. Thus, insofar as we are able to have memory-based files for objects perceived in the past, we ought to be able to have files for abstract mathematical objects too. Second, at least one recently articulated condition on a file’s being a device for singular thought—that it be capable of surviving a certain kind of change in the information it contains—can be satisfied by files for abstract mathematical objects.
In this paper a class of languages which are formal enough for mathematical reasoning is introduced. Its languages are called mathematically agreeable (MA). Languages containing a given MA language L, and being sublanguages of L augmented by a monadic predicate, are constructed. A mathematical theory of truth (MTT, for short) is formulated for some of those languages. MTT makes them fully interpreted MA languages which possess their own truth predicates. MTT is shown to conform well with the eight norms formulated for theories of truth in the paper 'What Theories of Truth Should be Like (but Cannot be)' by Hannes Leitgeb. MTT is also free from infinite regress, providing a proper framework to study the regress problem. The main tools used in proofs are Zermelo-Fraenkel (ZF) set theory and classical logic.
In this paper a class of so-called mathematically acceptable (MA, for short) languages is introduced. First-order formal languages containing natural numbers and numerals belong to that class. MA languages which are contained in a given fully interpreted MA language augmented by a monadic predicate are constructed. A mathematical theory of truth (MTT, for short) is formulated for some of these languages. MTT makes them fully interpreted MA languages which possess their own truth predicates, yielding consequences for the philosophy of mathematics. MTT is shown to conform well with the eight norms presented for theories of truth in the paper 'What Theories of Truth Should be Like (but Cannot be)' by Hannes Leitgeb. MTT is also free from infinite regress, providing a proper framework to study the regress problem.
According to the computational theory of mind (CTM), to think is to compute. But what is meant by the word 'compute'? The generally given answer is this: every case of computing is a case of manipulating symbols, but not vice versa—a manipulation of symbols must be driven exclusively by the formal properties of those symbols if it is to qualify as a computation. In this paper, I will present the following argument. Words like 'form' and 'formal' are ambiguous, as they can refer to form in either the syntactic or the morphological sense. CTM fails on each disambiguation, and the arguments for CTM immediately cease to be compelling once we register that ambiguity. The terms 'mechanical' and 'automatic' are comparably ambiguous. Once these ambiguities are exposed, it turns out that there is no possibility of mechanizing thought, even if we confine ourselves to domains where all problems can be settled through decision procedures. The impossibility of mechanizing thought thus has nothing to do with recherché mathematical theorems, such as those proven by Gödel and Rosser. A related point is that CTM involves, and is guilty of reinforcing, a misunderstanding of the concept of an algorithm.
This work is a conceptual analysis of certain recent developments in the mathematical foundations of Classical and Quantum Mechanics which have made it possible to formulate both theories in a common language. From the algebraic point of view, the set of observables of a physical system, be it classical or quantum, is described by a Jordan-Lie algebra. From the geometric point of view, the space of states of any system is described by a uniform Poisson space with transition probability. Both these structures are here perceived as formal translations of the fundamental twofold role of properties in Mechanics: they are at the same time quantities and transformations. The question then becomes to understand the precise articulation between these two roles. The analysis will show that Quantum Mechanics can be thought of as distinguishing itself from Classical Mechanics by a compatibility condition between properties-as-quantities and properties-as-transformations. -/- Moreover, this dissertation shows the existence of a tension between a certain "abstract way" of conceiving mathematical structures, used in the practice of mathematical physics, and the necessary capacity to specify particular states or observables. It then becomes important to understand how, within the formalism, one can construct a labelling scheme. The “Chase for Individuation” is the analysis of different mathematical techniques which attempt to overcome this tension. In particular, we discuss how group theory furnishes a partial solution.
The paper investigates how the mathematical languages used to describe and to observe automatic computations influence the accuracy of the obtained results. In particular, we focus our attention on Single and Multi-tape Turing machines which are described and observed through the lens of a new mathematical language which is strongly based on three methodological ideas borrowed from Physics and applied to Mathematics, namely: the distinction between the object (we speak here about a mathematical object) of an observation and the instrument used for this observation; interrelations holding between the object and the tool used for the observation; the accuracy of the observation determined by the tool. Results of the observation executed by the traditional and new languages are compared and discussed.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox, and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism since there cannot be any entity more complex than the physical universe, and from the Wittgensteinian viewpoint, ‘more complex’ is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a ‘God’ (i.e., a ‘device’ with limitless time/space and energy) cannot determine whether a given ‘number’ is ‘random’, nor find a certain way to show that a given ‘formula’, ‘theorem’ or ‘sentence’ or ‘device’ (all these being complex language games) is part of a particular ‘system’.
-/- Those wishing a comprehensive, up-to-date framework for human behavior from the modern two systems view may consult my book ‘The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle’, 2nd ed. (2019). Those interested in more of my writings may see ‘Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019’, 2nd ed. (2019), and ‘Suicidal Utopian Delusions in the 21st Century’, 4th ed. (2019).
The relationship between abstract formal procedures and the activities of actual physical systems has proved to be surprisingly subtle and controversial, and there are a number of competing accounts of when a physical system can be properly said to implement a mathematical formalism and hence perform a computation. I defend an account wherein computational descriptions of physical systems are high-level normative interpretations motivated by our pragmatic concerns. Furthermore, the criteria of utility and success vary according to our diverse purposes and pragmatic goals. Hence there is no independent or uniform fact of the matter, and I advance the ‘anti-realist’ conclusion that computational descriptions of physical systems are not founded upon deep ontological distinctions, but rather upon interest-relative human conventions. Hence physical computation is a ‘conventional’ rather than a ‘natural’ kind.
Lipsey and Lancaster's “general theory of second best” is widely thought to have significant implications for applied theorizing about the institutions and policies that most effectively implement abstract normative principles. It is also widely thought to have little significance for theorizing about which abstract normative principles we ought to implement. Contrary to this conventional wisdom, I show how the second best theorem can be extended to myriad domains beyond applied normative theorizing, and in particular to more abstract theorizing about the normative principles we should aim to implement. I start by separating the mathematical model used to prove the second best theorem from its familiar economic interpretation. I then develop an alternative normative-theoretic interpretation of the model, which yields a novel second best theorem for idealistic normative theory. My method for developing this interpretation provides a template for developing additional interpretations that can extend the reach of the second best theorem beyond normative theoretical domains. I also show how, within any domain, the implications of the second best theorem are more specific than is typically thought. I conclude with some brief remarks on the value of mathematical models for conceptual exploration.
The mind-body relationship has vexed philosophers of mind for quite a long time. Different theories of mind have offered different points of view about the interaction between the two, but none of them seem free of ambiguities and questions. This paper attempts to use a mathematical model for the mind-body relationship. The model may generate some questions for thinking about this relationship from the viewpoint of operator theory.
In most accounts of the realization of computational processes by physical mechanisms, it is presupposed that there is a one-to-one correspondence between the causally active states of the physical process and the states of the computation. Yet such proposals either stipulate that only one model of computation is implemented, or they do not reflect upon the variety of models that could be implemented physically. In this paper, I claim that mechanistic accounts of computation should allow for a broad variation of models of computation. In particular, some non-standard models should not be excluded a priori. The relationship between mathematical models of computation and mechanistically adequate models is studied in more detail.
[Müller, Vincent C. (ed.), (2013), Philosophy and theory of artificial intelligence (SAPERE, 5; Berlin: Springer). 429 pp.] --- Can we make machines that think and act like humans or other natural intelligent agents? The answer to this question depends on how we see ourselves and how we see the machines in question. Classical AI and cognitive science had claimed that cognition is computation, and can thus be reproduced on other computing machines, possibly surpassing the abilities of human intelligence. This consensus has now come under threat and the agenda for the philosophy and theory of AI must be set anew, re-defining the relation between AI and Cognitive Science. We can re-claim the original vision of general AI from the technical AI disciplines; we can reject classical cognitive science and replace it with a new theory (e.g. embodied); or we can try to find new ways to approach AI, for example from neuroscience or from systems theory. To do this, we must go back to the basic questions on computing, cognition and ethics for AI. The 30 papers in this volume provide cutting-edge work from leading researchers that define where we stand and where we should go from here.
Instead of the half-century old foundational feud between set theory and category theory, this paper argues that they are theories about two different complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
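The point about negated self-predication can be made explicit with the standard derivation of Russell's antinomy (a textbook sketch, not drawn from the paper summarized above): instantiating the universal u_{F} at the property F(x) := x∉x yields a contradiction, and the culprit is the negation inside F, not self-membership as such.

```latex
% Russell's antinomy, written with the abstract's notation u_{F} = {x | F(x)}.
% Take F(x) to be the negated self-predication  x \notin x,
% and let R be the corresponding universal.
\[
  R \;=\; u_{F} \;=\; \{\, x \mid x \notin x \,\}.
\]
% Instantiating the comprehension scheme at x := R gives
\[
  R \in R \;\longleftrightarrow\; R \notin R,
\]
% a contradiction. A non-negated self-predication, by contrast,
% is not contradictory on its face, which is the room the paper's
% category-theoretic self-predicative universals exploit.
```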
By the end of his life Plato had rearranged the theory of ideas into his teaching about ideal numbers, but no written records have been left. The Ideal mathematics of Plato is present in all his dialogues. It can be clearly grasped in relation to the effective use of mathematical modelling. Many problems of mathematical modelling were laid in the foundation of the method by cutting the three-level idealism of Plato down to the single-level “ideism” of Aristotle. The real, ideal numbers of Plato's Ideal mathematics eliminate many long-standing mathematical problems, extend the capabilities of modelling, and improve mathematics.
Kant's special metaphysics is intended to provide the a priori foundation for Newtonian science, which is to be achieved by exhibiting the a priori content of Newtonian concepts and laws. Kant envisions a two-step mathematical construction of the dynamical concept of matter involving a geometrical construction of matter’s bulk and a symbolic construction of matter’s density. Since Newton himself defines quantity of matter in terms of bulk and density, there is no reason why we shouldn’t interpret Kant’s Dynamics as a defence of a Newtonian concept of matter. When Kant’s reasoning is understood in relation to his criteria for mathematical construction, it is possible to maintain that matter theory is central to the Metaphysical Foundations, but that this does not undermine Kant’s stated aim of giving an a priori foundation for Newtonian science.
This paper is a contribution to graded model theory, in the context of mathematical fuzzy logic. We study characterizations of classes of graded structures in terms of the syntactic form of their first-order axiomatization. We focus on classes given by universal and universal-existential sentences. In particular, we prove two amalgamation results using the technique of diagrams in the setting of structures valued on a finite MTL-algebra, from which analogues of the Łoś–Tarski and the Chang–Łoś–Suszko preservation theorems follow.
This book reports on the results of the third edition of the premier conference in the field of philosophy of artificial intelligence, PT-AI 2017, held on November 4-5, 2017 at the University of Leeds, UK. It covers: advanced knowledge on key AI concepts, including complexity, computation, creativity, embodiment, representation and superintelligence; cutting-edge ethical issues, such as the AI impact on human dignity and society, responsibilities and rights of machines, as well as AI threats to humanity and AI safety; and cutting-edge developments in techniques to achieve AI, including machine learning, neural networks, dynamical systems. The book also discusses important applications of AI, including big data analytics, expert systems, cognitive architectures, and robotics. It offers a timely, yet very comprehensive snapshot of what is going on in the field of AI, especially at the interfaces between philosophy, cognitive science, ethics and computing.
This note is based on a lecture delivered at the Conference on the Scientific Research of the Mathematical Center of Opole, Turawa, May 10-11th, 1980. A somewhat extended version will be published in the Proceedings of the Conference. At the same time it is an abstract of a part of a planned larger paper, which will involve the theory of label-tokens. The theory is included in the author's monograph in Polish "Teorie Języków Syntaktycznie Kategorialnych", PWN, Warszawa-Wrocław 1985, and in its English version: Theory of Language Syntax. Categorial Approach, Kluwer Academic Publishers, Boston-London-Dordrecht 1991.
What is so special and mysterious about the Continuum, this ancient, always topical, and, alongside the concept of integers, most intuitively transparent and omnipresent conceptual and formal medium for mathematical constructions and battlefield of mathematical inquiries? And why does it resist the century-long siege by the best mathematical minds of all times committed to penetrating once and for all its set-theoretical enigma? -/- The double-edged purpose of the present study is to save from the transfinite deadlock of higher set theory the jewel of the mathematical Continuum -- this genuine, even if mostly forgotten today, raison d'être of all set-theoretical enterprises to Infinity and beyond, from Georg Cantor to W. Hugh Woodin to Buzz Lightyear -- by simultaneously exhibiting the limits and pitfalls of all old and new reductionist foundational approaches to mathematical truth: be it Cantor's or post-Cantorian Idealism, Brouwer's or post-Brouwerian Constructivism, Hilbert's or post-Hilbertian Formalism, Gödel's or post-Gödelian Platonism. -/- In the spirit of Zeno's paradoxes, but with the enormous historical advantage of hindsight, we claim that Cantor's set-theoretical methodology, powerful and rich in proof-theoretic and similar applications as it might be, is inherently limited by its epistemological framework of transfinite local causality, and neither can be held accountable for the properties of the Continuum already acquired through geometrical, analytical, and arithmetical studies, nor can it be used for an adequate, conceptually sensible, operationally workable, and axiomatically sustainable re-creation of the Continuum.
-/- From a strictly mathematical point of view, this intrinsic limitation of the constative and explicative power of higher set theory finds its explanation in the ultimate phenomenological obstacle to Cantor's transfinite construction identified in this study, similar to topological obstacles in homotopy theory and theoretical physics: the entanglement capacity of the mathematical Continuum.
This paper is on Aristotle's conception of the continuum. It is argued that although Aristotle did not have the modern conception of real numbers, his account of the continuum does mirror the topology of the real number continuum in modern mathematics especially as seen in the work of Georg Cantor. Some differences are noted, particularly as regards Aristotle's conception of number and the modern conception of real numbers. The issue of whether Aristotle had the notion of open versus closed intervals is discussed. Finally, it is suggested that one reason there is a common structure between Aristotle's account of the continuum and that found in Cantor's definition of the real number continuum is that our intuitions about the continuum have their source in the experience of the real spatiotemporal world. A plea is made to consider Aristotle's abstractionist philosophy of mathematics anew.
The paper argues against defending realism about numbers on the basis of realism about instantiated structural universals. After presenting Armstrong’s theory of structural properties as instantiated universals and Lewis’s devastating criticism of it, I argue that several responses to this criticism are unsuccessful, and that one possible construal of structural universals via non-well-founded sets should be resisted by the mathematical realist.
This essay is a contribution to the historical phenomenology of science, taking as its point of departure Husserl's later philosophy of science and Jacob Klein's seminal work on the emergence of the symbolic conception of number in European mathematics during the late sixteenth and seventeenth centuries. Since neither Husserl nor Klein applied their ideas to actual theories of modern mathematical physics, this essay attempts to do so through a case study of the concept of "spacetime." In §1, I sketch Klein's account of the emergence of the symbolic conception of number, beginning with Vieta in the late sixteenth century. In §2, through a series of historical illustrations, I show how the principal impediment to assimilating the new symbolic algebra to mathematical physics, namely, the dimensionless character of symbolic number, is overcome via the translation of the traditional language of ratio and proportion into the symbolic language of equations. In §§3–4, I critically examine the concept of "Minkowski spacetime," specifically, the purported analogy between the Pythagorean distance formula and the Minkowski "spacetime interval." Finally, in §5, I address the question of whether the concept of Minkowski spacetime is, as generally assumed, indispensable to Einstein's general theory of relativity.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox, and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov, and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism since there cannot be any entity more complex than the physical universe, and from the Wittgensteinian viewpoint, ‘more complex’ is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a ‘God’ (i.e., a ‘device’ with limitless time/space and energy) cannot determine whether a given ‘number’ is ‘random’, nor find a certain way to show that a given ‘formula’, ‘theorem’, ‘sentence’, or ‘device’ (all these being complex language games) is part of a particular ‘system’.
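The diagonal, liar-style structure that the summary above credits Wolpert with generalizing can be made concrete in a few lines. The toy below is my own illustrative construction, not Wolpert's formalism: for any purported predictor of a device's output, one can build a "contrarian" device that consults the predictor and does the opposite, so no predictor can be correct about every device.

```python
def make_contrarian(predict):
    """Given any predictor (a function from a device to a guessed Boolean
    output), build a device that outputs the opposite of its prediction."""
    def contrarian():
        return not predict(contrarian)
    return contrarian

# Whatever the predictor does, it is wrong about its own contrarian device:
predictor = lambda device: True       # an arbitrary (here constant) predictor
c = make_contrarian(predictor)
print(predictor(c), c())              # → True False: prediction and output differ
```

The same diagonal move works for any predictor, constant or not, which is why the argument is independent of the predicting device's internal workings.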
Those wishing a comprehensive up-to-date framework for human behavior from the modern two systems view may consult my article The Logical Structure of Philosophy, Psychology, Mind and Language as Revealed in Wittgenstein and Searle 59p (2016). For all my articles on Wittgenstein and Searle see my e-book The Logical Structure of Philosophy, Psychology, Mind and Language in Wittgenstein and Searle 367p (2016). Those interested in all my writings in their most recent versions may consult my e-book Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2016 662p (2016).

All of my papers and books have now been published in revised versions both in ebooks and in printed books.

Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B071HVC7YP

The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle--Articles and Reviews 2006-2016 (2017) https://www.amazon.com/dp/B071P1RP1B

Suicidal Utopian Delusions in the 21st century: Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B0711R5LGX
This is the first of a two-volume work combining two fundamental components of contemporary computing into classical deductive computing, a powerful form of computation, highly adequate for programming and automated theorem proving, which, in turn, have fundamental applications in areas of high complexity and/or high security such as mathematical proof, software specification and verification, and expert systems. Deductive computation is concerned with truth-preservation: This is the essence of the satisfiability problem, or SAT, the central computational problem in computability and complexity theory. The Turing machine provides the classical version of this theory—classical computing—with its standard model, which is physically concretized—and thus spatial-temporally limited and restricted—in the von Neumann, or digital, computer. Although a number of new technological applications require classical deductive computation with non-classical logics, many key technologies still do well—or exclusively, for that matter—with classical logic. In this first volume, we elaborate on classical deductive computing with classical logic. The objective of the main text is to provide the reader with a thorough elaboration on both classical computing and classical deduction with the classical first-order predicate calculus with a view to computational implementations. As a complement to the mathematical-based exposition of the topics we offer the reader a very large selection of exercises. This selection aims at not only practice of discussed material, but also creative approaches to problems, for both discussed and novel contents, as well as at research into further relevant topics.
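Since the blurb above singles out SAT as the central computational problem, a minimal brute-force satisfiability checker may help fix ideas. This sketch is my own illustration, not material from the book; the function name is an assumption, and clauses are encoded DIMACS-style as lists of non-zero integers, where k means variable k is true and -k that it is false.

```python
from itertools import product

def brute_sat(clauses, n_vars):
    """Return a satisfying assignment (tuple of booleans; index i holds
    variable i+1), or None if unsatisfiable. Exponential in n_vars."""
    for bits in product([False, True], repeat=n_vars):
        # A clause is satisfied when at least one of its literals is true.
        if all(any((lit > 0) == bits[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            return bits
    return None

# (x1 or x2) and (not x1): forces x1 = False, x2 = True.
print(brute_sat([[1, 2], [-1]], 2))   # → (False, True)
# x1 and (not x1): unsatisfiable.
print(brute_sat([[1], [-1]], 1))      # → None
```

The exhaustive loop over all 2^n assignments is exactly what makes SAT the benchmark problem of complexity theory: no known algorithm does essentially better in the worst case.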
Scientists use models to know the world. It is usually assumed that mathematicians doing pure mathematics do not. Mathematicians doing pure mathematics prove theorems about mathematical entities like sets, numbers, geometric figures, spaces, etc.; they compute various functions and solve equations. In this paper, I want to exhibit models built by mathematicians to study the fundamental components of spaces and, more generally, of mathematical forms. I focus on one area of mathematics where models occupy a central role, namely homotopy theory. I argue that mathematicians introduce genuine models and I offer a rough classification of these models.
The concept of similarity has had a rather mixed reputation in philosophy and the sciences. On the one hand, philosophers such as Goodman and Quine emphasized the "logically repugnant" and "insidious" character of the concept of similarity that allegedly renders it inaccessible for a proper logical analysis. On the other hand, a philosopher such as Carnap assigned a central role to similarity in his constitutional theory. Moreover, the importance and perhaps even indispensability of the concept of similarity for many empirical sciences can hardly be denied. The aim of this paper is to show that Quine's and Goodman's harsh verdicts about this notion are mistaken. The concept of similarity is susceptible to a precise logico-mathematical analysis through which its place in the conceptual landscape of modern mathematical theories such as order theory, topology, and graph theory becomes visible. Thereby it can be shown that a quasi-analysis of a similarity structure S can be conceived of as a sheaf (étale space) over S.
This book is an attempt "to give a systematic account of the development of Plato's theory of knowledge" (page vii). Thus it focuses on the dialogues in which epistemological issues come to the fore. These dialogues are the "Meno", "Phaedo", "Symposium", "Republic", "Cratylus", "Theaetetus", "Phaedrus", "Timaeus", "Sophist", "Politicus", "Philebus", and "Laws". Issues discussed include the theory of recollection, perception, the difference between belief and knowledge, and mathematical knowledge. (Staff).
Maimon's theory of the differential has proved to be a rather enigmatic aspect of his philosophy. By drawing upon mathematical developments that had occurred earlier in the century, and of which, by virtue of the arguments presented in the Essay and comments elsewhere in his writing, I suggest Maimon would have been aware, this paper offers a study of the differential and the role that it plays in the Essay on Transcendental Philosophy (1790). In order to do so, this paper focuses upon Maimon's criticism of the role played by mathematics in Kant's philosophy, to which Maimon offers a Leibnizian solution based on the infinitesimal calculus. The main difficulties that Maimon has with Kant's system, the second of which will be the focus of this paper, include the presumption of the existence of synthetic a priori judgments, i.e. the question quid facti, and the question of whether the fact of our use of a priori concepts in experience is justified, i.e. the question quid juris. Maimon deploys mathematics, specifically arithmetic, against Kant to show how it is possible to understand objects as having been constituted by the very relations between them, and he proposes an alternative solution to the question quid juris, which relies on the concept of the differential. However, despite these arguments, Maimon remains sceptical with respect to the question quid facti.
The philosophy of mathematics has been accused of paying insufficient attention to mathematical practice: one way to cope with the problem, the one we will follow in this paper on extensive magnitudes, is to combine the 'history of ideas' and the 'philosophy of models' in a logical and epistemological perspective. The history of ideas allows the reconstruction of the theory of extensive magnitudes as a theory of ordered algebraic structures; the philosophy of models allows an investigation into the way epistemology might affect relevant mathematical notions. The article takes two historical examples as a starting point for the investigation of the role of numerical models in the construction of a system of non-Archimedean magnitudes. A brief exposition of the theories developed by Giuseppe Veronese and by Rodolfo Bettazzi at the end of the 19th century will throw new light on the role played by magnitudes and numbers in the development of the concept of a non-Archimedean order. Different ways of introducing non-Archimedean models will be compared and the influence of epistemological models will be evaluated. Particular attention will be devoted to the comparison between the models that oriented Veronese's and Bettazzi's works and the mathematical theories they developed, but also to the analysis of the way epistemological beliefs affected the concepts of continuity and measurement.
The paper addresses the formation of striking patterns within originally near-homogeneous tissue, a process prototypical for embryology and represented in particularly purist form by cut sections of regenerating Hydra, which, by internal reorganisation of the pre-existing tissue, form a complete animal with head and foot. The essential requirements are autocatalytic, self-enhancing activation, combined with inhibitory or depletion effects of wider range – "lateral inhibition". Not only de novo pattern formation, but also well-known, striking features of developmental regulation such as induction, inhibition, and proportion regulation can be explained on this basis. The theory provides a mathematical recipe for the construction of molecular models with criteria for the necessary non-linear interactions. It has since been widely applied to different developmental processes.
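The "mathematical recipe" mentioned in the abstract above amounts, in the simplest two-component case, to the textbook conditions for diffusion-driven (Turing) instability: kinetics that are stable without diffusion become destabilized when the inhibitor diffuses sufficiently faster than the activator. The checker below encodes those standard conditions; the Jacobian entries and diffusion values in the example are my own illustrative numbers for a Gierer-Meinhardt-type system, not values from the paper.

```python
import math

def turing_unstable(fa, fh, ga, gh, d_a, d_h):
    """Check the standard two-species Turing conditions for kinetics with
    Jacobian [[fa, fh], [ga, gh]] at the homogeneous steady state, with
    diffusion coefficients d_a (activator) and d_h (inhibitor)."""
    det = fa * gh - fh * ga
    # (1) Stable without diffusion: negative trace, positive determinant.
    if not (fa + gh < 0 and det > 0):
        return False
    # (2) Diffusion destabilizes some wavenumber (pattern-forming band exists).
    return d_h * fa + d_a * gh > 2 * math.sqrt(d_a * d_h * det)

# Self-enhancing activator (fa > 0) with a wide-ranging inhibitor (d_h >> d_a):
print(turing_unstable(1.0, -1.0, 3.0, -1.5, d_a=1.0, d_h=50.0))  # → True
# Equal ranges: no "lateral inhibition", hence no pattern.
print(turing_unstable(1.0, -1.0, 3.0, -1.5, d_a=1.0, d_h=1.0))   # → False
```

The contrast between the two calls is the quantitative core of "short-range activation, long-range inhibition": the same kinetics pattern or fail to pattern depending solely on the ratio of diffusion ranges.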
This 4-page review-essay—which is entirely reportorial and philosophically neutral as are my other contributions to MATHEMATICAL REVIEWS—starts with a short introduction to the philosophy known as mathematical structuralism. The history of structuralism traces back to George Boole (1815–1864). By reference to a recent article various features of structuralism are discussed with special attention to ambiguity and other terminological issues. The review-essay includes a description of the recent article. The article's 4-sentence summary is quoted in full and then analyzed. The point of the quotation is to make clear how murky, incompetent, and badly written the paper is. There is no way to determine from the article whether the editor or referees suggested improvements.
Philosophers of science since Nagel have been interested in the links between intertheoretic reduction and explanation, understanding and other forms of epistemic progress. Although intertheoretic reduction is widely agreed to occur in pure mathematics as well as empirical science, the relationship between reduction and explanation in the mathematical setting has rarely been investigated in a similarly serious way. This paper examines an important particular case: the reduction of arithmetic to set theory. I claim that the reduction is unexplanatory. In defense of this claim, I offer evidence from mathematical practice, and I respond to contrary suggestions due to Steinhart, Maddy, Kitcher and Quine. I then show how, even if set-theoretic reductions are generally not explanatory, set theory can nevertheless serve as a legitimate foundation for mathematics. Finally, some implications of my thesis for philosophy of mathematics and philosophy of science are discussed. In particular, I suggest that some reductions in mathematics are probably explanatory, and I propose that differing standards of theory acceptance might account for the apparent lack of unexplanatory reductions in the empirical sciences.
The paper begins with an argument against eliminativism with respect to the propositional attitudes. There follows an argument that concepts are sui generis ante rem entities. A nonreductionist view of concepts and propositions is then sketched. This provides the background for a theory of concept possession, which forms the bulk of the paper. The central idea is that concept possession is to be analyzed in terms of a certain kind of pattern of reliability in one's intuitions regarding the behavior of the concept. The challenge is to find an analysis that is at once noncircular and fully general. Environmentalism, anti-individualism, holism, analyticity, etc. provide additional hurdles. The paper closes with a discussion of the theory's implications for the Wittgenstein-Kripke puzzle about rule-following and the Benacerraf problem concerning mathematical knowledge.
In most accounts of the realization of computational processes by physical mechanisms, it is presupposed that there is a one-to-one correspondence between the causally active states of the physical process and the states of the computation. Yet such proposals either stipulate that only one model of computation is implemented, or they do not reflect upon the variety of models that could be implemented physically.

In this paper, I claim that mechanistic accounts of computation should allow for a broad variation of models of computation. In particular, some non-standard models should not be excluded a priori. The relationship between mathematical models of computation and mechanistically adequate models is studied in more detail.
A possible world is a junky world if and only if each thing in it is a proper part. The possibility of junky worlds contradicts the principle of general fusion. Bohn (2009) argues for the possibility of junky worlds; Watson (2010) suggests that Bohn's arguments are flawed. This paper shows that the arguments of both authors leave much to be desired. First, relying on the classical results of Cantor, Zermelo, Fraenkel, and von Neumann, this paper proves the possibility of junky worlds for certain weak set theories. Second, the paradox of Burali-Forti shows that according to the Zermelo-Fraenkel set theory ZF, junky worlds are possible. Finally, it is shown that set theories are not the only sources for designing plausible models of junky worlds: Topology (and possibly other "algebraic" mathematical theories) may be used to construct models of junky worlds. In sum, junkiness is a relatively widespread feature among possible worlds.
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it isn't an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.
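The idea of checking syllogistic validity semantically can itself be made concrete with a tiny model-checker: interpret terms as non-empty subsets of a small domain (respecting Aristotle's assumption of non-empty terms) and test whether any assignment makes the premises true and the conclusion false. This is my own illustrative sketch, not the model constructed in the paper:

```python
from itertools import combinations

DOMAIN = (0, 1, 2)
# All non-empty subsets of the domain, as candidate extensions of terms.
NONEMPTY = [set(c) for r in range(1, 4) for c in combinations(DOMAIN, r)]

def all_are(x, y):
    """Categorical 'All X are Y': the extension of X is included in Y."""
    return x <= y

def valid(argument):
    """argument(A, B, C) -> (premises_hold, conclusion_holds); the form is
    valid iff no assignment makes the premises true and the conclusion false."""
    return all(not (prem and not concl)
               for A in NONEMPTY for B in NONEMPTY for C in NONEMPTY
               for prem, concl in [argument(A, B, C)])

# Barbara: All A are B, All B are C; therefore All A are C.
barbara = lambda A, B, C: (all_are(A, B) and all_are(B, C), all_are(A, C))
# An invalid form: All A are B, All C are B; therefore All A are C.
bad = lambda A, B, C: (all_are(A, B) and all_are(C, B), all_are(A, C))
print(valid(barbara), valid(bad))   # → True False
```

A three-element domain already suffices to refute the invalid form (take A = {0}, B = {0, 1}, C = {1}), while Barbara holds by the transitivity of inclusion.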
In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as Gelfand representation, guarantees that autonomously specified algebras of quantities can be interpreted as algebras of observables on a suitable state space. Such an interpretation is then used to support (i) a realist conception of quantities as objective characteristics of natural systems, and (ii) a realist conception of measurement results (evaluations of quantities) as determined by and descriptive of the states of a target natural system. As a way of motivating the analytic approach to measurement, we begin with a discussion of some serious philosophical and theoretical problems facing the well-known representational theory of measurement. We then explain why we consider the analytic approach, which avoids all these problems, to be far more attractive on both philosophical and theoretical grounds.
A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessments and optimizations of pattern recognition, predictions, and detection with the generalized information criterion are discussed. For the economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is a theory for data compression and also a theory for matching objective channels with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
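The slogan "the more precise and the more unexpected a forecast is, the more information it conveys" can be given a simple quantitative gloss. The measure below is an illustrative stand-in of my own (log of forecast probability over prior probability), not the paper's actual formula; it does, however, reproduce the limiting case the abstract states, collapsing to Shannon's self-information when a forecast is certain and correct.

```python
import math

def forecast_information(q, p):
    """Bits conveyed when a forecast assigned probability q to the outcome
    that actually occurred, against prior probability p for that outcome.
    (Illustrative measure, not the formula from the paper under discussion.)"""
    return math.log2(q / p)

print(forecast_information(1.00, 0.25))  # → 2.0: certain forecast of a 1-in-4 event, the Shannon value
print(forecast_information(0.50, 0.25))  # → 1.0: a vaguer forecast conveys less
print(forecast_information(0.25, 0.25))  # → 0.0: no better than the prior
```

On this gloss, unexpectedness (small p) and precision (large q) each increase the value, and a forecast no sharper than the prior conveys nothing.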
Contemporary philosophy and theoretical psychology are dominated by an acceptance of content-externalism: the view that the contents of one's mental states are constitutively, as opposed to causally, dependent on facts about the external world. In the present work, it is shown that content-externalism involves a failure to distinguish between semantics and pre-semantics---between, on the one hand, the literal meanings of expressions and, on the other hand, the information that one must exploit in order to ascertain their literal meanings. It is further shown that, given the falsity of content-externalism, the falsity of the Computational Theory of Mind (CTM) follows. It is also shown that CTM involves a misunderstanding of terms such as "computation," "syntax," "algorithm," and "formal truth." Novel analyses of the concepts expressed by these terms are put forth. These analyses yield clear, intuition-friendly, and extensionally correct answers to the questions "what are propositions?", "what is it for a proposition to be true?", and "what are the logical and psychological differences between conceptual (propositional) and non-conceptual (non-propositional) content?" Naively taking literal meaning to be in lockstep with cognitive content, Burge, Salmon, Falvey, and other semantic externalists have wrongly taken Kripke's correct semantic views to justify drastic and otherwise contraindicated revisions of commonsense. (Salmon: What is non-existent exists; at a given time, one can rationally accept a proposition and its negation. Burge: Somebody who is having a thought may be psychologically indistinguishable from somebody who is thinking nothing. Falvey: somebody who rightly believes himself to be thinking about water is psychologically indistinguishable from somebody who wrongly thinks himself to be doing so and who, indeed, isn't thinking about anything.)
Given a few truisms concerning the differences between thought-borne and sentence-borne information, the data is easily modeled without conceding any legitimacy to any one of these rationality-dismantling atrocities. (It thus turns out, ironically, that no one has done more to undermine Kripke's correct semantic points than Kripke's own followers!)
Information Theory, Evolution and The Origin of Life: The Origin and Evolution of Life as a Digital Message: How Life Resembles a Computer, Second Edition. Hubert P. Yockey, 2005, Cambridge University Press, Cambridge: 400 pages, index; hardcover, US $60.00; ISBN: 0-521-80293-8. "The reason that there are principles of biology that cannot be derived from the laws of physics and chemistry lies simply in the fact that the genetic information content of the genome for constructing even the simplest organisms is much larger than the information content of these laws." Yockey in his previous book (1992, 335). In this new book, Information Theory, Evolution and The Origin of Life, Hubert Yockey points out that the digital, segregated, and linear character of the genetic information system has a fundamental significance. If inheritance blended and did not segregate, Darwinian evolution would not occur. If inheritance were analog, instead of digital, evolution would also be impossible, because it would be impossible to remove the effect of noise. In this way, life is guided by information, and so information is a central concept in molecular biology. The author presents a picture of how the main concepts of the genetic code were developed. He was able to show that despite Francis Crick's belief that the Central Dogma is only a hypothesis, the Central Dogma of Francis Crick is a mathematical consequence of the redundant nature of the genetic code. The redundancy arises from the fact that the DNA and mRNA alphabet is formed by triplets of 4 nucleotides, and so the number of letters (triplets) is 64, whereas the proteome alphabet has only 20 letters (20 amino acids), and so the translation from the larger alphabet to the smaller one is necessarily redundant. Except for Tryptophan and Methionine, all amino acids are coded by more than one triplet; therefore, it is undecidable which source code letter was actually sent from mRNA. This proof has a corollary telling us that there are no such mathematical constraints for protein-protein communication. With this clarification, Yockey contributes to diminishing the widespread confusion related to such a central concept as the Central Dogma. Thus the Central Dogma prohibits the origin of life "proteins first." Proteins cannot be generated by "self-organization." Understanding this property of the Central Dogma will have a serious impact on research on the origin of life.
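The counting argument behind the redundancy claim is easy to make executable: 64 codons are translated onto only 20 amino acids, so the translation map is many-to-one and cannot be uniquely inverted. The excerpt of the standard genetic code below is real but deliberately partial; the function name and the three-letter abbreviations are my own conventions for the sketch.

```python
from itertools import product

# All 64 mRNA codons: triplets over the 4-letter alphabet {U, C, A, G}.
codons = [''.join(t) for t in product('UCAG', repeat=3)]
print(len(codons))   # → 64, versus a proteome alphabet of only 20 amino acids

# A small, real excerpt of the standard genetic code (not the full table):
partial_code = {
    'AUG': 'Met',  # methionine: a single codon
    'UGG': 'Trp',  # tryptophan: a single codon
    # leucine: six synonymous codons
    'UUA': 'Leu', 'UUG': 'Leu', 'CUU': 'Leu',
    'CUC': 'Leu', 'CUA': 'Leu', 'CUG': 'Leu',
}

def preimage(amino_acid):
    """Reverse translation: all codons that could have produced this residue."""
    return sorted(c for c, aa in partial_code.items() if aa == amino_acid)

print(preimage('Met'))   # → ['AUG']: a unique source codon
print(preimage('Leu'))   # six candidates: the source codon is undecidable
```

The six-way preimage for leucine is exactly the "undecidability" the review describes: given the protein, the mRNA message cannot be recovered, which is the information-theoretic content of the Central Dogma's one-way arrow.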
While many different mechanisms contribute to the generation of spatial order in biological development, the formation of morphogenetic fields which in turn direct cell responses giving rise to pattern and form are of major importance and essential for embryogenesis and regeneration. Most likely the fields represent concentration patterns of substances produced by molecular kinetics. Short range autocatalytic activation in conjunction with longer range "lateral" inhibition or depletion effects is capable of generating such patterns (Gierer and Meinhardt, 1972). Non-linear reactions are required, and mathematical criteria were derived to design molecular models capable of pattern generation. The classical embryological feature of proportion regulation can be incorporated into the models. The conditions are mathematically necessary for the simplest two-factor case, and are likely to be a fair approximation in multi-component systems in which activation and inhibition are systems parameters subsuming the action of several agents. Gradients, symmetric and periodic patterns, in one or two dimensions, stable or pulsing in time, can be generated on this basis. Our basic concept of autocatalysis in conjunction with lateral inhibition accounts for self-regulatory biological features, including the reproducible formation of structures from near-uniform initial conditions as required by the logic of the generation cycle. Real tissue form, for instance that of budding Hydra, may often be traced back to local curvature arising within an initially relatively flat cell sheet, the position of evagination being determined by morphogenetic fields. Shell theory developed for architecture may also be applied to such biological processes.
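The mechanism summarized above, short-range autocatalytic activation plus long-range inhibition producing a stable pattern from near-uniform initial conditions, can be sketched in a one-dimensional simulation. The equations follow the classic two-factor activator-inhibitor form, but the parameter values, grid, time-stepping, and function names are illustrative assumptions of mine, not taken from the paper.

```python
import numpy as np

def laplacian(u):
    """Discrete 1-D Laplacian with zero-flux (reflecting) boundaries."""
    padded = np.pad(u, 1, mode='edge')
    return padded[:-2] - 2.0 * u + padded[2:]

def activator_inhibitor(n=100, steps=30000, dt=0.002,
                        d_a=1.0, d_h=50.0, mu=1.0, nu=1.5, seed=0):
    """Explicit-Euler integration of a Gierer-Meinhardt-type system:
         da/dt = a^2/h - mu*a + d_a * a_xx   (self-enhancing activation)
         dh/dt = a^2   - nu*h + d_h * h_xx   (wide-ranging inhibition)
    started at the homogeneous steady state plus small random noise."""
    rng = np.random.default_rng(seed)
    a = nu / mu + 0.01 * rng.standard_normal(n)   # a* = nu/mu
    h = np.full(n, nu / mu**2)                    # h* = nu/mu^2
    for _ in range(steps):
        la, lh = laplacian(a), laplacian(h)
        a, h = (a + dt * (a * a / h - mu * a + d_a * la),
                h + dt * (a * a - nu * h + d_h * lh))
    return a, h

a, h = activator_inhibitor()
# Near-uniform noise has resolved into a stable spatial pattern of activator peaks:
print(a.max() - a.min() > 0.5)
```

The inhibitor's diffusion coefficient is fifty times the activator's; with equal coefficients the same kinetics would relax back to homogeneity, which is the sense in which "lateral inhibition" of wider range is the essential requirement.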
Schaffner's model of theory reduction has played an important role in philosophy of science and philosophy of biology. Here, the model is found to be problematic because of an internal tension. Indeed, standard antireductionist external criticisms concerning reduction functions and laws in biology do not provide a full picture of the limits of Schaffner's model. However, despite the internal tension, his model usefully highlights the importance of regulative ideals associated with the search for derivational, and embedding, deductive relations among mathematical structures in theoretical biology. A reconstructed Schaffnerian model could therefore shed light on mathematical theory development in the biological sciences and on the epistemology of mathematical practices more generally. *Received November 2006; revised March 2009. †To contact the author, please write to: Philosophy Department, University of California, Santa Cruz, 1156 High St., Santa Cruz, CA 95064; e-mail: rgw@ucsc.edu.
Invited papers from PT-AI 2011. - Vincent C. Müller: Introduction: Theory and Philosophy of Artificial Intelligence - Nick Bostrom: The Superintelligent Will: Motivation and Instrumental Rationality in Advanced Artificial Agents - Hubert L. Dreyfus: A History of First Step Fallacies - Antoni Gomila, David Travieso and Lorena Lobo: Wherein is Human Cognition Systematic - J. Kevin O'Regan: How to Build a Robot that Is Conscious and Feels - Oron Shagrir: Computation, Implementation, Cognition.