This paper describes a new computational methodology for executing calculations with infinite and infinitesimal quantities. It is based on the principle ‘The part is less than the whole’, introduced by the Ancient Greeks and applied here to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). It is shown that finite, infinite, and infinitesimal numbers can all be written down with a finite number of symbols, as particular cases of a single unified framework. The new methodology has allowed us to introduce the Infinity Computer, which works with such numbers (its simulator has already been realized). Examples dealing with divergent series, infinite sets, and limits are given.
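The claim that finite, infinite, and infinitesimal numbers can all be recorded with finitely many symbols can be illustrated, roughly, by arithmetic on formal expressions that combine powers of a single infinite unit. The sketch below is my own toy illustration of that idea, not the actual Infinity Computer or its simulator; the class and symbol names are invented, and comparison simply inspects the leading power of the infinite unit.

```python
class Grossone:
    """Toy number of the form sum of c * U**p, where U is an infinite unit.

    `terms` maps exponent p -> coefficient c; exponent 0 is the finite part,
    positive exponents are infinite parts, negative ones infinitesimal parts.
    """

    def __init__(self, terms):
        self.terms = {p: c for p, c in terms.items() if c != 0}

    @classmethod
    def finite(cls, c):
        return cls({0: c})

    def __add__(self, other):
        out = dict(self.terms)
        for p, c in other.terms.items():
            out[p] = out.get(p, 0) + c
        return Grossone(out)

    def __mul__(self, other):
        out = {}
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                out[p1 + p2] = out.get(p1 + p2, 0) + c1 * c2
        return Grossone(out)

    def __gt__(self, other):
        # ordering is decided by the sign of the leading (highest-power) term
        diff = self + (other * Grossone.finite(-1))
        if not diff.terms:
            return False
        return diff.terms[max(diff.terms)] > 0

G = Grossone({1: 1})     # the infinite unit itself
EPS = Grossone({-1: 1})  # its infinitesimal inverse
```

With this representation the whole really does exceed the part: `G + Grossone.finite(1) > G` holds, and the product of the infinite unit with its infinitesimal inverse is exactly the finite unit.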
In his 1903 Principles of Mathematics, Bertrand Russell offers an apparently devastating criticism of the neo-Kantian Hermann Cohen's Principle of the Infinitesimal Method and its History (PIM). Russell's criticism is motivated by his concern that Cohen's account of the foundations of calculus saddles mathematics with the paradoxes of the infinitesimal and continuum, and thus threatens the very idea of mathematical truth. This paper defends Cohen against that objection of Russell's, and argues that, properly understood, Cohen's views of limits and infinitesimals do not entail the paradoxes of the infinitesimal and continuum. Essential to that defense is an interpretation, developed in the paper, of Cohen's positions in the PIM as deeply rationalist. The interest in developing this interpretation is not just that it reveals how Cohen's views in the PIM avoid the paradoxes of the infinitesimal and continuum. It also reveals some of what is at stake, both historically and philosophically, in Russell's criticism of Cohen.
Leibniz is well known for his formulation of the infinitesimal calculus. Nevertheless, the nature and logic of his discovery are seldom questioned: does it belong more to mathematics or metaphysics, and how is it connected to his physics? This book, composed of fourteen essays, investigates the nature and foundation of the calculus, its relationship to the physics of force and the principle of continuity, and its overall method and metaphysics. The Leibnizian calculus is presented in its origin and context together with its main contributors: Archimedes, Cavalieri, Wallis, Hobbes, Pascal, Huygens, Bernoulli, and Nieuwentijt. Many of us know and probably have used the Leibnizian formula…
In Hegel ou Spinoza,1 Pierre Macherey challenges the influence of Hegel’s reading of Spinoza by stressing the degree to which Spinoza eludes the grasp of the Hegelian dialectical progression of the history of philosophy. He argues that Hegel provides a defensive misreading of Spinoza, and that he had to “misread him” in order to maintain his subjective idealism. The suggestion is that Spinoza’s philosophy represents, not a moment that can simply be sublated and subsumed within the dialectical progression of the history of philosophy, but rather an alternative point of view for the development of a philosophy that overcomes Hegelian idealism. Gilles Deleuze also considers Spinoza’s philosophy to resist the totalising effects of the dialectic. Indeed, Deleuze demonstrates, by means of Spinoza, that a more complex philosophy antedates Hegel’s, which cannot be supplanted by it. Spinoza therefore becomes a significant figure in Deleuze’s project of tracing an alternative lineage in the history of philosophy, which, by distancing itself from Hegelian idealism, culminates in the construction of a philosophy of difference. It is Spinoza’s role in this project that will be demonstrated in this paper by differentiating Deleuze’s interpretation of the geometrical example of Spinoza’s Letter XII (on the problem of the infinite) in Expressionism in Philosophy, Spinoza,2 from that which Hegel presents in the Science of Logic.3
The infinitesimal calculus developed by Leibniz in the second half of the seventeenth century had, as one would expect, many adherents but also important critics. One might think that four centuries after it was presented in the journals, academies, and societies of the time, there would be little left to say about it; however, when one approaches Leibniz's calculus, as happened to me some time ago, it is easy to see that the debate surrounding the Leibnizian calculus has crossed the temporal boundaries of the seventeenth and eighteenth centuries and remains alive today, at least in part and on certain specific points. This is somewhat unsettling, among other reasons because it implies that we must revisit the Leibnizian calculus to try to understand why it is still being debated. The purpose of this article is not to present the main theses of the calculus, nor to defend it, much less to develop a critique of its mathematical, epistemic, or ontological foundations. My aim is more modest. I intend to show, first, two of the earliest objections raised against the infinitesimal calculus, formulated by two of Leibniz's contemporaries, namely Rolle and Nieuwentijt; second, drawing on the foregoing, I want to present two current commentators who address the topic of the calculus. The goal is to illustrate the state of the question, that is, where the debate on the infinitesimal calculus now stands and what commentators currently focus on when discussing it, for in doing so, sometimes even without intending to, they keep open the debate over the successes and failures of the infinitesimal method.
This book posits that a singular paradigm in social theory can be discovered by reconstructing the conceptual grammar of Gabriel Tarde’s micro-sociology and by understanding the ways in which Gilles Deleuze’s micro-politics and Michel Foucault’s micro-physics have engaged with it. This is articulated in the infinite social multiplicity-invention-imitation-opposition-open system. Guided by infinitist ontology and an epistemology of infinitesimal difference, this paradigm offers a micro-socio-logic capable of producing new ways of understanding social life and its vicissitudes. In the field of social theory, this can be called the infinitesimal revolution.
To explore the extent of embeddability of Leibnizian infinitesimal calculus in first-order logic (FOL) and modern frameworks, we propose to set aside ontological issues and focus on procedural questions. This would enable an account of Leibnizian procedures in a framework limited to FOL with a small number of additional ingredients, such as the relation of infinite proximity. If, as we argue here, first-order logic is indeed suitable for developing modern proxies for the inferential moves found in Leibnizian infinitesimal calculus, then modern infinitesimal frameworks are more appropriate for interpreting Leibnizian infinitesimal calculus than modern Weierstrassian ones.
In standard probability theory, probability zero is not the same as impossibility. But many have suggested that only impossible events should have probability zero. This can be arranged if we allow infinitesimal probabilities, but infinitesimals do not solve all of the problems. We will see that regular probabilities are not invariant over rigid transformations, even for simple, bounded, countable, constructive, and disjoint sets. Hence, regular chances cannot be determined by space-time invariant physical laws, and regular credences cannot satisfy seemingly reasonable symmetry principles. Moreover, the examples here are immune to the objections against Williamson’s infinite coin flips.
The role of mathematics in the development of Gilles Deleuze's (1925-95) philosophy of difference, as an alternative to the dialectical philosophy determined by Hegelian dialectical logic, is demonstrated in this paper by differentiating Deleuze's interpretation of the problem of the infinitesimal in Difference and Repetition from that which G. W. F. Hegel (1770-1831) presents in the Science of Logic. Each deploys the operation of integration, as conceived at different stages in the development of the infinitesimal calculus, in his treatment of the problem of the infinitesimal. Against the role that Hegel assigns to integration as the inverse transformation of differentiation in the development of his dialectical logic, Deleuze strategically redeploys Leibniz's account of integration as a method of summation in the form of a series in the development of his philosophy of difference. By demonstrating the relation between the differential point of view of the Leibnizian infinitesimal calculus and the differential calculus of contemporary mathematics, I argue that Deleuze effectively bypasses the methods of the differential calculus which Hegel uses to support the development of the dialectical logic, and by doing so, sets up the critical perspective from which to construct an alternative logic of relations characteristic of a philosophy of difference. The mode of operation of this logic is then demonstrated by drawing upon the mathematical philosophy of Albert Lautman (1908-44), which plays a significant role in Deleuze's project of constructing a philosophy of difference. Indeed, the logic of relations that Deleuze constructs is dialectical in the Lautmanian sense.
There exist many numerical methods that iteratively construct approximations to the solution y(x) of an ordinary differential equation (ODE) y′(x) = f(x,y), starting from an initial value y_0 = y(x_0) and using a finite approximation step h that influences the accuracy of the obtained approximation. In this paper, a new framework for solving ODEs is presented for a new kind of computer – the Infinity Computer (it has been patented and its working prototype exists). The new computer is able to work numerically with finite, infinite, and infinitesimal numbers, thus making it possible to use different infinitesimals numerically and, in particular, to take advantage of infinitesimal values of h. To show the potential of the new framework, a number of results are established. It is proved that the Infinity Computer is able to calculate derivatives of the solution y(x) and to reconstruct its Taylor expansion to a desired order numerically, without finding the respective derivatives analytically (or symbolically) by successive differentiation of the ODE, as is usually done when the Taylor method is applied. Methods using approximations of derivatives obtained through infinitesimals are discussed, and a technique for automatic control of rounding errors is introduced. Numerical examples are given.
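The payoff described here, access to derivatives of y(x) for building higher-order one-step methods, can be imitated with ordinary floating-point arithmetic: for y′ = f(x, y), the chain rule gives y″ = f_x + f_y·f. The snippet below is a conventional sketch of a second-order Taylor step built on that identity (not the Infinity Computer's infinitesimal technique; here the partial derivatives are supplied by hand), compared against plain Euler on the test problem y′ = y.

```python
import math

def taylor2_step(f, fx, fy, x, y, h):
    """One second-order Taylor step for y' = f(x, y).

    By the chain rule y'' = f_x + f_y * f, so
    y(x + h) ~ y + h*y' + (h**2 / 2) * y''.
    """
    yp = f(x, y)
    ypp = fx(x, y) + fy(x, y) * yp
    return y + h * yp + 0.5 * h * h * ypp

def euler_step(f, x, y, h):
    """One first-order (forward Euler) step."""
    return y + h * f(x, y)

# Test problem: y' = y, y(0) = 1, exact solution y(x) = e**x.
f = lambda x, y: y
fx = lambda x, y: 0.0
fy = lambda x, y: 1.0

h, x, y_euler, y_taylor = 0.1, 0.0, 1.0, 1.0
while x < 1.0 - 1e-12:           # integrate from 0 to 1 in steps of h
    y_euler = euler_step(f, x, y_euler, h)
    y_taylor = taylor2_step(f, fx, fy, x, y_taylor, h)
    x += h

err_euler = abs(y_euler - math.e)
err_taylor = abs(y_taylor - math.e)
```

With the same finite step h, the second-order step is markedly more accurate; the paper's point is that infinitesimal h and numerically obtained derivatives would remove both the truncation error and the hand-supplied partials.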
In this paper, a number of traditional models related to percolation theory are considered by means of a new computational methodology that does not use Cantor's ideas and describes infinite and infinitesimal numbers in accordance with the principle ‘The part is less than the whole’. It makes it possible to work with finite, infinite, and infinitesimal quantities numerically by using a new kind of computer, the Infinity Computer, introduced recently in [18]. The new approach does not contradict Cantor. On the contrary, it can be viewed as an evolution of his deep ideas regarding the existence of different infinite numbers, in a more applied direction. Site percolation and gradient percolation have been studied by applying the new computational tools. It is established that in an infinite system the phase transition point is not really a point, as it is in the traditional approach. In the light of the new arithmetic it appears as a critical interval rather than a critical point. Depending on the “microscope” we use, this interval can be regarded as finite, infinite, or infinitesimally short. Using the new approach, we observe that in the vicinity of the percolation threshold there are many different infinite clusters instead of the single infinite cluster that appears in the traditional treatment.
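For readers who want the classical baseline the paper builds on: in ordinary site percolation, each cell of an n×n grid is occupied independently with probability p, and one asks whether an occupied cluster spans the system. The following is a minimal finite, standard-arithmetic sketch (function names are mine; no infinitesimal machinery is involved).

```python
import random
from collections import deque

def spans(grid):
    """True if an occupied cluster connects the top row to the bottom row."""
    n = len(grid)
    seen = set()
    # breadth-first search seeded with every occupied cell of the top row
    q = deque((0, j) for j in range(n) if grid[0][j])
    seen.update(q)
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                q.append((ni, nj))
    return False

def site_percolation(n, p, rng=random):
    """An n x n grid whose cells are occupied independently with probability p."""
    return [[rng.random() < p for _ in range(n)] for _ in range(n)]
```

On the square lattice, spanning becomes likely for p above roughly 0.593 as n grows; the paper's claim is that with infinitesimal-resolution arithmetic this single threshold value resolves into a critical interval.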
There exist many applications where it is necessary to numerically approximate the derivatives of a function given by a computer procedure. In particular, all fields of optimization have a special interest in this kind of information. In this paper, a new way to do this is presented for a new kind of computer – the Infinity Computer – able to work numerically with finite, infinite, and infinitesimal numbers. It is proved that the Infinity Computer is able to calculate values of derivatives of higher order for a wide class of functions represented by computer procedures. It is shown that the ability to compute derivatives of arbitrary order automatically and accurately to working precision is an intrinsic property of the Infinity Computer related to its way of functioning. Numerical examples illustrating the new concepts and numerical tools are given.
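On a conventional machine, the closest widely used analogue of differentiating a "function given by a computer procedure" without any symbolic step is forward-mode automatic differentiation with dual numbers, in which a formal ε with ε² = 0 plays the role of an infinitesimal. The following is a minimal first-order sketch of that standard technique, not Sergeyev's method; it recovers f′ to machine precision for procedures built from the overloaded operations.

```python
import math

class Dual:
    """Number of the form val + der * eps, with eps**2 = 0."""

    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule, carried along automatically
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def derivative(f, x):
    """Evaluate f at x + eps and read off the derivative coefficient."""
    return f(Dual(x, 1.0)).der

# the "computer procedure": f(x) = x*sin(x) + 3x, so f'(x) = sin(x) + x*cos(x) + 3
proc = lambda x: x * x.sin() + 3 * x
```

Extending the coefficient pair to a truncated Taylor series of length k gives derivatives up to order k in the same style, which is the conventional counterpart of the higher-order capability claimed in the abstract.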
Many biological processes and objects can be described by fractals. The paper uses a new type of object – blinking fractals – that is not covered by traditional theories of the dynamics of self-similar processes. It is shown that both traditional and blinking fractals can be successfully studied by a recent approach that allows one to work numerically with infinite and infinitesimal numbers. It is shown that blinking fractals can be applied to model complex growth processes of biological systems, including their seasonal changes. The new approach allows one to give various quantitative characteristics of the resulting blinking-fractal models of biological systems.
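As a toy illustration of a fractal process that alternates between two self-similarity rules at successive steps (capturing only the flavor of a "blinking" construction, not its formal definition in the paper), consider alternately removing the middle third and then the middle fifth of every remaining interval of [0, 1], tracking the total remaining length exactly with rational arithmetic.

```python
from fractions import Fraction

def blinking_lengths(steps):
    """Total remaining length of [0, 1] after `steps` removal rounds.

    Odd-numbered rounds remove the middle third of every interval
    (leaving 2/3 of it); even-numbered rounds remove the middle fifth
    (leaving 4/5 of it). Returns lengths after 0, 1, ..., steps rounds.
    """
    length = Fraction(1)
    out = [length]
    for k in range(1, steps + 1):
        removed = Fraction(1, 3) if k % 2 == 1 else Fraction(1, 5)
        length *= 1 - removed
        out.append(length)
    return out
```

The length tends to zero, but through a sequence that no single self-similar (Cantor-type) construction produces, which is the kind of alternating dynamics the "seasonal change" modeling alludes to.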
The purpose of this note is to contrast a Cantorian outlook with a non-Cantorian one and to present a picture that provides support for the latter. In particular, I suggest that: i) infinite hyperreal numbers are the (actual, determined) infinite numbers, ii) ω is merely potentially infinite, and iii) infinitesimals should not be used in the de Finetti lottery. Though most Cantorians will likely maintain a Cantorian outlook, the picture is meant to motivate the obvious nature of the non-Cantorian outlook.
For Anaxagoras, both before the beginning of the world and in the present, “all is together” and “everything is in everything.” Various modern interpretations abound regarding the identity of this “mixture.” It has been explained as an aggregation of particles or as a continuous “fusion” of different sorts of ingredients. However—even though they are not usually recognized as a distinct group—there are a number of other scholars who, without seemingly knowing each other, have offered a different interpretation: Anaxagoras’ mixture as an “interpenetration” of different ingredients, which are as far-extended as the whole mixture is. As a result, there are different entities occupying the same place at the same time. This explanation assigns to Anaxagoras the same model of mixture which was later used by the Stoics. A new book by Marmodoro helps us to clarify this position.
A probability distribution is regular if no possible event is assigned probability zero. While some hold that probabilities should always be regular, three counter-arguments have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but we see here that their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Having failed to refute the symmetry arguments on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.
The Special and General theories of relativity may be considered the most significant examples of integrative thinking. From these works we see that Albert Einstein attached great importance to how we understand geometry and dimensions. It is shown that physics powered by the new multidimensional elastic geometry is a reliable basis for science integration. Instead of searching for braneworlds (elastic membranes - EM) in higher dimensions, we will start by searching for them in our 3+1 dimensional world. The cornerstone of the new philosophy is the idea that lower-dimensional EMs are an essential component of living matter: they are responsible for our perceptions, intellect, pattern recognition, and high-speed signal propagation. According to this theory, each EM has both physical and perceptive (psychological) meanings: it exists as a Universe-like physical reality for its inner objects and at the same time plays a perceptive (psychological) role in the external bulk space-time. This philosophy may help us to build up a science which explains not only inanimate, unconscious phenomena, but consciousness as well.
We examine some of Connes’ criticisms of Robinson’s infinitesimals starting in 1995. Connes sought to exploit the Solovay model S as ammunition against non-standard analysis, but the model tends to boomerang, undercutting Connes’ own earlier work in functional analysis. Connes described the hyperreals as both a “virtual theory” and a “chimera”, yet acknowledged that his argument relies on the transfer principle. We analyze Connes’ “dart-throwing” thought experiment, but reach an opposite conclusion. In S, all definable sets of reals are Lebesgue measurable, suggesting that Connes views a theory as being “virtual” if it is not definable in a suitable model of ZFC. If so, Connes’ claim that a theory of the hyperreals is “virtual” is refuted by the existence of a definable model of the hyperreal field due to Kanovei and Shelah. Free ultrafilters aren’t definable, yet Connes exploited such ultrafilters both in his own earlier work on the classification of factors in the 1970s and 80s, and in Noncommutative Geometry, raising the question whether the latter may not be vulnerable to Connes’ criticism of virtuality. We analyze the philosophical underpinnings of Connes’ argument based on Gödel’s incompleteness theorem, and detect an apparent circularity in Connes’ logic. We document the reliance on non-constructive foundational material, and specifically on the Dixmier trace −∫ (featured on the front cover of Connes’ magnum opus) and the Hahn–Banach theorem, in Connes’ own framework. We also note an inaccuracy in Machover’s critique of infinitesimal-based pedagogy.
We seek to elucidate the philosophical context in which one of the most important conceptual transformations of modern mathematics took place, namely the so-called revolution in rigor in infinitesimal calculus and mathematical analysis. Some of the protagonists of the said revolution were Cauchy, Cantor, Dedekind, and Weierstrass. The dominant current of philosophy in Germany at the time was neo-Kantianism. Among its various currents, the Marburg school (Cohen, Natorp, Cassirer, and others) was the one most interested in matters scientific and mathematical. Our main thesis is that Marburg neo-Kantian philosophy formulated a sophisticated position towards the problems raised by the concepts of limits and infinitesimals. The Marburg school neither clung to the traditional approach of logically and metaphysically dubious infinitesimals, nor whiggishly subscribed to the new orthodoxy of the “great triumvirate” of Cantor, Dedekind, and Weierstrass that declared infinitesimals conceptus non grati in mathematical discourse. Rather, following Cohen’s lead, the Marburg philosophers sought to clarify Leibniz’s principle of continuity, and to exploit it in making sense of infinitesimals and related concepts.
Maimon’s theory of the differential has proved to be a rather enigmatic aspect of his philosophy. What I propose to offer in this paper is a study of the differential and the role that it plays in the Essay on Transcendental Philosophy (1790), drawing upon mathematical developments that had occurred earlier in the century and that, by virtue of the arguments presented in the Essay and comments elsewhere in his writing, I suggest Maimon would have been aware of. In order to do so, this paper focuses upon Maimon’s criticism of the role played by mathematics in Kant’s philosophy, to which Maimon offers a Leibnizian solution based on the infinitesimal calculus. The main difficulties that Maimon has with Kant’s system, the second of which will be the focus of this paper, include the presumption of the existence of synthetic a priori judgments, i.e. the question quid facti, and the question of whether the fact of our use of a priori concepts in experience is justified, i.e. the question quid juris. Maimon deploys mathematics, specifically arithmetic, against Kant to show how it is possible to understand objects as having been constituted by the very relations between them, and he proposes an alternative solution to the question quid juris, which relies on the concept of the differential. However, despite these arguments, Maimon remains sceptical with respect to the question quid facti.
In this article we provide a mathematical model of Kant’s temporal continuum that satisfies the (not obviously consistent) synthetic a priori principles for time that Kant lists in the Critique of Pure Reason (CPR), the Metaphysical Foundations of Natural Science (MFNS), the Opus Postumum, and the notes and fragments published after his death. The continuum so obtained has some affinities with the Brouwerian continuum, but it also has ‘infinitesimal intervals’ consisting of nilpotent infinitesimals, which capture Kant’s theory of rest and motion in MFNS. While constructing the model, we establish a concordance between the informal notions of Kant’s theory of the temporal continuum and formal correlates to these notions in the mathematical theory. Our mathematical reconstruction of Kant’s theory of time allows us to understand what ‘faculties and functions’ must be in place for time to satisfy all the synthetic a priori principles for time mentioned. We have presented here a mathematically precise account of Kant’s transcendental argument for time in the CPR and of the relation between the categories, the synthetic a priori principles for time, and the unity of apperception; the most precise account of this relation to date. We focus our exposition on a mathematical analysis of Kant’s informal terminology, but for reasons of space, most theorems are explained but not formally proven; formal proofs are available in (Pinosio, 2017). The analysis presented in this paper is related to the more general project of developing a formalization of Kant’s critical philosophy (Achourioti & van Lambalgen, 2011). A formal approach can shed light on the most controversial concepts of Kant’s theoretical philosophy, and is a valuable exegetical tool in its own right.
However, we wish to make clear that mathematical formalization cannot displace traditional exegetical methods; rather, it is an exegetical tool in its own right, which works best when coupled with a keen awareness of the subtleties involved in understanding the philosophical issues at hand. In this case, a virtuous ‘hermeneutic circle’ between mathematical formalization and philosophical discourse arises.
This dissertation is a contribution to formal and computational philosophy.

In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as into epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’.

The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
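The flavor of the second part can be conveyed with a toy Monte Carlo model (entirely my own construction, far cruder than the dissertation's analytical treatment): agents hold individually consistent 0/1 judgments on the agenda (p, q, p∧q), the group averages each judgment, and the averages are rounded back to 0/1, which can yield an inconsistent group judgment.

```python
import random

# the four judgment triples on (p, q, p and q) that respect the conjunction
CONSISTENT = {(a, b, a and b) for a in (0, 1) for b in (0, 1)}

def inconsistency_rate(n_agents, trials, rng):
    """Fraction of trials in which averaging-then-rounding the judgments of
    individually consistent agents yields an inconsistent group judgment."""
    bad = 0
    for _ in range(trials):
        profile = [rng.choice(sorted(CONSISTENT)) for _ in range(n_agents)]
        avg = [sum(agent[i] for agent in profile) / n_agents for i in range(3)]
        rounded = tuple(int(v >= 0.5) for v in avg)
        if rounded not in CONSISTENT:
            bad += 1
    return bad / trials
```

For instance, three agents judging (1,0,0), (0,1,0), and (1,1,1) average to (2/3, 2/3, 1/3), which rounds to the inconsistent triple (1,1,0). This is a discursive-dilemma-style effect, not the dissertation's exact update rule or sample space.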
Explications of the reconstruction of Leibniz’s metaphysics that Deleuze undertakes in 'The Fold: Leibniz and the Baroque' focus predominantly on the role of the infinitesimal calculus developed by Leibniz.1 While not underestimating the importance of the infinitesimal calculus and the law of continuity as reflected in the calculus of infinite series to any understanding of Leibniz’s metaphysics and to Deleuze’s reconstruction of it in The Fold, what I propose to examine in this paper is the role played by other developments in mathematics that Deleuze draws upon, including those made by a number of Leibniz’s near contemporaries – the projective geometry that has its roots in the work of Desargues (1591–1661) and the ‘proto-topology’ that appears in the work of Dürer (1471–1528) – and a number of the subsequent developments in these fields of mathematics. Deleuze brings this elaborate conjunction of material together in order to set up a mathematical idealization of the system that he considers to be implicit in Leibniz’s work. The result is a thoroughly mathematical explication of the structure of Leibniz’s metaphysics. What is provided in this paper is an exposition of the very mathematical underpinnings of this Deleuzian account of the structure of Leibniz’s metaphysics, which, I maintain, subtends the entire text of The Fold.
In his book The Boundary Stones of Thought, Ian Rumfitt considers five arguments in favour of intuitionistic logic over classical logic. Two of these arguments are based on reflections concerning the meaning of statements in general, due to Michael Dummett and John McDowell. The remaining three are more specific, concerning statements about the infinite and the infinitesimal, statements involving vague terms, and statements about sets. Rumfitt is sympathetic to the premisses of many of these arguments, and takes some of them to be effective challenges to Bivalence, the following principle: Each statement is either true or false. However, he argues that counterexamples to Bivalence do not immediately lead to counterexamples to Excluded Middle, and so do not immediately refute classical logic; here, Excluded Middle is taken to be the following principle: For each statement A, ‘Either A or not A’ is true. Much...
In my dissertation, I present Hermann Cohen's foundation for the history and philosophy of science. My investigation begins with Cohen's formulation of a neo-Kantian epistemology. I analyze Cohen's early work, especially his contributions to 19th century debates about the theory of knowledge. I conclude by examining Cohen's mature theory of science in two works, The Principle of the Infinitesimal Method and its History of 1883, and Cohen's extensive 1914 Introduction to Friedrich Lange's History of Materialism. In the former, Cohen gives an historical and philosophical analysis of the foundations of the infinitesimal method in mathematics. In the latter, Cohen presents a detailed account of Heinrich Hertz's Principles of Mechanics of 1894. Hertz considers a series of possible foundations for mechanics, in the interest of finding a secure conceptual basis for mechanical theories. Cohen argues that Hertz's analysis can be completed, and his goal achieved, by means of a philosophical examination of the role of mathematical principles and fundamental concepts in scientific theories.
In the paper “Math Anxiety,” Aden Evens explores the manner by means of which concepts are implicated in the problematic Idea according to the philosophy of Gilles Deleuze. The example that Evens draws from Difference and Repetition in order to demonstrate this relation is a mathematics problem, the elements of which are the differentials of the differential calculus. What I would like to offer in the present paper is an historical account of the mathematical problematic that Deleuze deploys in his philosophy, and an introduction to the role that this problematic plays in the development of his philosophy of difference. One of the points of departure that I will take from the Evens paper is the theme of “power series.”2 This will involve a detailed elaboration of the mechanism by means of which power series operate in the differential calculus deployed by Deleuze in Difference and Repetition. Deleuze actually constructs an alternative history of mathematics that establishes an historical continuity between the differential point of view of the infinitesimal calculus and modern theories of the differential calculus. It is in relation to the differential point of view of the infinitesimal calculus that Deleuze determines a differential logic which he deploys, in the form of a logic of different/ciation, in the development of his project of constructing a philosophy of difference.
Undoubtedly the Penrose-Hameroff Orch OR model may be considered a good theory for describing information-processing mechanisms and holistic phenomena in the human brain, but it does not give us a satisfactory explanation of human perception. In this work a new approach to explaining our perception is introduced, which is in good agreement with the Orch OR model and other mainstream science theories such as string theory, loop quantum gravity, and the holographic principle. It is shown that human perception cannot be explained in terms of elementary particles, and that we should introduce new indivisible holistic objects with a geometry based on smooth infinitesimal analysis: elastic membranes. An example of such a membrane is our Universe, which is an indivisible whole. It is shown that our perception may be considered the result of elastic oscillations of two-dimensional (2D) elastic membranes with closed topology embedded in our bodies. To a given organism there corresponds only one elastic membrane responsible for its perceptions, but there may be other membranes, even at the cell level. In other words, reality may be considered as the process of time evolution of holistic, energetically very weak macro objects: elastic membranes with a geometry based on smooth infinitesimal analysis. An embedded membrane in this multidimensional world will look different to external and internal observers: from the outside it will look like a material object with smooth infinitesimal geometry, while from the inside it will look like our Universe-like space-time fabric. When interacting with elementary particles and other membranes, a membrane will transform their energy into its elastic energy (a new form of energy): the energy of stretching of its infinitesimal segments. The theory postulates that these elastic deformations will not be observable from the point of view of the internal observer.
Heisenberg’s uncertainty principle will work in this physics only from the point of view of the internal observer. For the external observer, each embedded elastic membrane may be stretched, so that even a very small region becomes observable. For example, living organisms play the role of internal observers of the Universe, and at the same time they serve as external observers for 2D membranes embedded in our Universe. We can observe our 2D self-membranes through our perceptions, which are encoded in elastic oscillations of the membrane. According to the theory, elastic membranes occupy energetically favorable positions around the microtubules involved in Orch OR. The theory not only gives a genuinely multidimensional holistic picture of reality, but also provides a new method for understanding such phenomena as perception, self-awareness and will.
Our visual experience seems to suggest that no continuous curve can cover every point of the unit square, yet in the late nineteenth century Giuseppe Peano proved that such a curve exists. Examples like this, particularly in analysis (in the sense of the infinitesimal calculus), received much attention in the nineteenth century. They helped instigate what Hans Hahn called a “crisis of intuition”, wherein visual reasoning in mathematics came to be regarded as epistemically problematic. Hahn described this “crisis” as follows: “Mathematicians had for a long time made use of supposedly geometric evidence as a means of proof in much too naive and much too uncritical a way, till the unclarities and mistakes that arose as a result forced a turnabout. Geometrical intuition was now declared to be inadmissible as a means of proof...” (p. 67). Avoiding geometrical evidence, Hahn continued, mathematicians aware of this crisis pursued what he called “logicization”, “when the discipline requires nothing but purely logical fundamental concepts and propositions for its development.” On this view, an epistemically ideal mathematics would minimize, or avoid altogether, appeals to visual representations. This would be a radical reformation of past practice, necessary, according to its advocates, for avoiding “unclarities and mistakes” like the one exposed by Peano.
In this survey, a recent computational methodology paying special attention to the separation of mathematical objects from the numeral systems involved in their representation is described. It has been introduced with the intention of allowing one to work with infinities and infinitesimals numerically in a single computational framework in all situations requiring these notions. The methodology does not contradict Cantor's views or those of non-standard analysis, and is based on Euclid's Common Notion no. 5, “The whole is greater than the part”, applied to all quantities (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). The methodology uses a computational device called the Infinity Computer (patented in the USA and EU) working numerically (recall that traditional theories work with infinities and infinitesimals only symbolically) with infinite and infinitesimal numbers that can be written in a positional numeral system with an infinite radix. It is argued that the numeral systems involved in computations limit our capability to compute and lead to ambiguities in theoretical assertions as well. The introduced methodology makes it possible to use the same numeral system for measuring infinite sets and for working with divergent series, probability, fractals, optimization problems, numerical differentiation, ODEs, etc. (recall that traditionally different numerals, such as the lemniscate ∞, aleph-zero, etc., are used in different situations related to infinity). Numerous numerical examples and theoretical illustrations are given. The accuracy of the achieved results is continuously compared with that obtained by traditional tools used to work with infinities and infinitesimals. In particular, it is shown that the new approach allows one to observe the mathematical objects involved in the Continuum Hypothesis and the Riemann zeta function with a higher accuracy than is possible with traditional tools. It is stressed that the hardness of both problems is not related to their nature but is a consequence of the weakness of the traditional numeral systems used to study them. It is shown that the introduced methodology and numeral system change our perception of the mathematical objects studied in the two problems.
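The positional representation with an infinite radix described in the survey above can be imitated, very roughly, in ordinary software. The sketch below is only an illustration under stated assumptions: it is not the patented Infinity Computer, and the placeholder symbol X for the infinite base, the class name Gross, and the chosen operations are mine. A number is stored as a finite sum of terms c·X^p, where positive powers of X are infinite parts, power 0 is the finite part, and negative powers are infinitesimal parts:

```python
from fractions import Fraction

class Gross:
    """Toy number of the form sum of c_p * X^p, where X stands for an
    infinite radix. An illustrative sketch only, not the Infinity Computer."""

    def __init__(self, terms=None):
        self.terms = {}  # exponent -> nonzero coefficient
        for p, c in (terms or {}).items():
            if c:
                self.terms[Fraction(p)] = Fraction(c)

    def __add__(self, other):
        # add coefficients of matching powers of X
        out = dict(self.terms)
        for p, c in other.terms.items():
            out[p] = out.get(p, 0) + c
        return Gross(out)

    def __mul__(self, other):
        # multiply term by term; exponents of X add
        out = {}
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                out[p1 + p2] = out.get(p1 + p2, 0) + c1 * c2
        return Gross(out)

    def __repr__(self):
        if not self.terms:
            return "0"
        return " + ".join(f"{c}*X^{p}"
                          for p, c in sorted(self.terms.items(), reverse=True))

X = Gross({1: 1})                      # the infinite unit itself
eps = Gross({-1: 1})                   # an infinitesimal: X^-1
half_X = Gross({1: Fraction(1, 2)})    # an infinite number: X/2
print((half_X + Gross({0: 1})) * eps)  # prints 1/2*X^0 + 1*X^-1
```

Multiplying a finite-plus-infinite number by the infinitesimal X^-1 shifts every exponent down by one, which is why the product above has both a finite and an infinitesimal part, rather than being flattened to 0 or ∞ as in the traditional numeral systems the survey criticizes.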
The Koch snowflake is one of the first fractals to be described mathematically. It is interesting because its perimeter is infinite in the limit while its limit area is finite. In this paper, a recently proposed computational methodology allowing one to execute numerical computations with infinities and infinitesimals is applied to study the Koch snowflake at infinity. Numerical computations with actual infinite and infinitesimal numbers can be executed on the Infinity Computer, a new supercomputer patented in the USA and EU. It is revealed in the paper that at infinity the snowflake is not unique, i.e., different snowflakes can be distinguished for different infinite numbers of steps executed during the process of their generation. It is then shown that for any given infinite number n of steps it becomes possible to calculate the exact infinite number, Nn, of sides of the snowflake, the exact infinitesimal length, Ln, of each side, and the exact infinite perimeter, Pn, of the Koch snowflake as the product of the infinite Nn and the infinitesimal Ln. It is established that for different infinite n and k the infinite perimeters Pn and Pk are also different, and the difference can be infinite. It is shown that the finite areas An and Ak of the snowflakes can also be calculated exactly (up to infinitesimals) for different infinite n and k, and that the difference An − Ak turns out to be infinitesimal. Finally, snowflakes constructed starting from different initial conditions are also studied and their quantitative characteristics at infinity are computed.
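The quantities Nn, Ln, Pn and An mentioned above have simple closed forms in the classical finite-step setting. The sketch below only reproduces those standard finite-n recurrences for a unit initial side (the paper's contribution is precisely to evaluate them for infinite n on the Infinity Computer, which this does not do):

```python
from fractions import Fraction

def koch_quantities(n):
    """Classical quantities of the Koch snowflake after n generation steps,
    for a unit initial side; the area is in units of the initial triangle."""
    N = 3 * 4**n           # each step replaces every side by 4 smaller sides
    L = Fraction(1, 3**n)  # each step shrinks the side length by a factor of 3
    P = N * L              # perimeter 3*(4/3)^n: grows without bound
    A = Fraction(1)        # start from the initial triangle
    for k in range(1, n + 1):
        # step k adds 3*4^(k-1) new triangles, each of relative area (1/9)^k
        A += 3 * 4**(k - 1) * Fraction(1, 9**k)
    return N, L, P, A

N, L, P, A = koch_quantities(5)
```

For finite n the area converges to 8/5 of the initial triangle while the perimeter diverges; in the methodology above, fixing an infinite n instead makes the product Nn·Ln an exact infinite number rather than a divergent limit.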
The principle of least action, which has been so successfully applied to diverse fields of physics, looks back on three centuries of philosophical and mathematical discussion and controversy. These could not explain why nature applies the principle, or why scalar energy quantities succeed in describing dynamic motion. When the least-action integral is subdivided into infinitesimally small sections, each one has to retain the ability to minimise. This, however, has the mathematical consequence that the Lagrange function at a given point of the trajectory, the dynamic, available energy generating motion, must itself have a fundamental property of minimising. Since a scalar quantity, a pure number, cannot do that, energy must fundamentally be dynamic and time-oriented for a consistent understanding. It must have vectorial properties, aiming at a decrease of free energy per state (which would also allow derivation of the second law of thermodynamics). Present physics ignores this and applies the calculus of variations as a formal mathematical tool to impose a minimisation of energy quantities assumed to be scalar in order to obtain dynamic motion. When, however, the dynamic property of energy is taken seriously, it is fundamental and must also be applied to quantum processes. A consequence is that particle and wave are not equivalent; rather, the wave (distributed energy) follows from the particle (concentrated energy). Information, provided from the beginning, an informational self-image of matter, is additionally needed to recreate the particle from the wave, shaping a “dynamic” particle-wave duality. It is shown that this new concept of a “dynamic” quantum state rationally explains quantization, the double-slit experiment and quantum correlation, which has not been possible before. Some more general considerations on the link between quantum processes, gravitation and cosmological phenomena are also advanced.
This paper argues that the principle of continuity that underlies Benjamin’s understanding of what makes the reality of a thing thinkable, which in the Kantian context implies a process of “filling time” with an anticipatory structure oriented to the subject, is of a different order than that of the infinitesimal calculus, and that a “discontinuity” constitutive of the continuity of experience and (merely) counterposed to the image of actuality as an infinite gradation of ultimately thetic acts cannot be the principle on which Benjamin bases the structure of becoming. Tracking the transformation of the process of “filling time” from its logical to its historical iteration, or from what Cohen called the “fundamental acts of time” in Logik der reinen Erkenntnis to Benjamin’s image of a language of language (qua language touching itself), the paper suggests that for Benjamin, moving from 0 to 1 is anything but paradoxical, and instead relies on the possibility for a mathematical function to capture the nature of historical occurrence beyond paradoxes of language or phenomenality.
General Relativity generated various early philosophical interpretations. Its adherents highlighted the "relativization of inertia" and the concept of simultaneity; Kantians and Neo-Kantians underlined the approach of certain synthetic "intellectual forms" (especially the principle of general covariance); and logical empiricists emphasized the philosophical-methodological significance of the theory. Reichenbach approached GR through the "relativity of geometry" thesis, trying to build a "constructive axiomatization" of relativity based on "elementary matters of fact" (Elementartatbestände) about the observable behavior of light rays, rods and clocks. The mathematician Hermann Weyl attempted a reconstruction of Einstein's theory based on the epistemology of a "pure infinitesimal geometry", an extended geometry with additional terms that he formally identified with the potential of the electromagnetic field. DOI: 10.13140/RG.2.2.11641.93281.
The four antinomies of Zeno of Elea, especially Achilles and the tortoise, continue to be provocative issues which do not always receive adequate treatment. Aristotle himself used this antinomy to develop his understanding of movement: it is a fluent continuum that he considers to be a whole. The parts, if any, are only potentially present. The claim of quantum mechanics is precisely that: movement is quantized; things move or change in non-reducible steps, the so-called quanta. This view is in contrast to classical mechanics, where arbitrarily small infinitesimal steps are permitted. The objective of the present study is to show the merits of the Aristotelian approach. It is a suitable candidate for providing a philosophical framework for understanding fundamental aspects of quantum mechanics. One may especially mention the influence of the final state in quantum mechanics, which in philosophical terms relates to the final cause. As in the work of Aristotle, examples from science are also presented in the present study. They serve to illustrate the philosophical statements. However, in contrast to ancient Greece, the examples now relate to issues which are only fully accessible to the scientifically trained reader. It may therefore happen that certain parts of the present study lack clarity for the philosopher and other parts for the scientist. One conclusion, therefore, could be that an open dialogue between scientists and philosophers is needed to reach a better understanding of the challenging issues at the crossroads of both disciplines.
Eyal Shahar’s essay review [1] of James Penston’s remarkable book [2] seems more inspired, playful academic provocation than review or essay, expressing dramatic views of impossible validity. The account given of modern biostatistical causation reveals the slide from science into the intellectual confusion and non-science that RCTs have created: “…. the purpose of medical research is to estimate the magnitude of the effect of a causal contrast, for example the probability ratio of a binary outcome …” But Shahar’s world is simultaneously not probabilistic but one of absolute uncertainty: “We should have no confidence in any type of evidence ….. We should have no confidence at all”. Shahar’s “causal contrast” is attractive. It seems to make sense, but it bypasses in two words the means of establishing causation by the scientific method. The phrase assumes that a numeric, statistically significant “contrast” is causal rather than a potential correlation requiring further investigation. The concept of “causal contrast” is a slippery slope from sense into biostatistical non-science. This can be illustrated with a hypothetical RCT in which 100% of interventions exhibit a posited treatment effect and 0% of placebo controls do. Internal validity is seemingly, and quite reasonably, assumed satisfied (common sense dictating that the likelihood of a fraud, bias or plain error of the magnitude required is infinitesimal). The scientific method appears satisfied. The RCT demonstrates: (1) strict regularity of outcome in the presence of the posited cause; (2) the absence of the outcome in its absence; and (3) an intervention (experiment) showing that the direction of causation runs from posited cause to posited effect. Now travel further down the slope from science. Assume 50% of interventions and 0% of controls are positive. We compromise the scientific method, but justify this by assuming a large subgroup which, we say, surely must on these figures be exhibiting the posited treatment effect.
But what of 10% of interventions and 9% of placebo controls exhibiting the posited treatment effect? Our biostatistician says the 1% “causal contrast” is statistically significant. But we have: (1) minimal evidence of regularity; (2) the posited outcome irrespective of the presence of the posited cause; and (3) an intervention that is at best equivocal in demonstrating any form of causation. This is not science. It is, however, where biostatistics has unthinkingly taken us, as Penston has shown comprehensively [2]. We, the audience of published medical research, are now, for the 10%/9% example, well down the slope from science. An unattractive hypothesis results, requiring numerous assumptions similar to these: “There is a ‘contrast’ which is ‘causal’, albeit the method employed is not scientific. An effect of the intervention has been observed in a very small subgroup. This subgroup is susceptible to treatment. The similar number of placebo controls exhibiting the outcome sought is irrelevant, because the 1% difference between interventions and controls is statistically significant. The statistical analysis is valid and reliable. The RCT’s internal validity is sufficiently satisfied. No funding or bias or fraud has affected the results or their analysis.” As Penston notes: “Confirming and refuting the results of research is crucial to science …. But … there’s no way of testing the results of any particular large-scale RCT or epidemiological study. Each study … is left hanging in the air, unsupported.” It gets worse. To identify a rare serious adverse reaction with a frequency of 1:10,000 can require a trial of 200,000 or more, split between controls and interventions. This is not done. But for every 100 who prospectively benefit from the intervention, 9,900 others also receive it. And for every 100 benefiting, one person (who likely gains no benefit) will suffer a serious unidentified adverse reaction.
This is also without taking account of more common adverse reactions, whether serious or otherwise. References: [1] Shahar, E. (2011). Research and medicine: human conjectures at every turn. International Journal of Person Centered Medicine 1(2), 250-253. [2] Penston, J. (2010). stats.con: How we’ve been fooled by statistics-based research in medicine. London: The London Press.
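The arithmetic behind the 10%/9% example and the 1:10,000 adverse-reaction figure above can be checked in a few lines. This is a sketch using the abstract's hypothetical figures, not real trial data:

```python
from fractions import Fraction

# Hypothetical figures from the argument above (not real trial data)
p_treat = Fraction(10, 100)      # outcome rate in the intervention arm
p_placebo = Fraction(9, 100)     # outcome rate in the placebo arm
adr_rate = Fraction(1, 10_000)   # serious adverse reaction frequency

arr = p_treat - p_placebo        # absolute risk reduction: 1%
nnt = 1 / arr                    # number needed to treat: 100

cohort = 10_000                  # patients actually given the intervention
benefit = cohort * arr           # 100 benefit per 10,000 treated
no_benefit = cohort - benefit    # 9,900 receive it without benefiting
adrs = cohort * adr_rate         # 1 serious adverse reaction per 10,000
```

So for every 100 patients who benefit, 9,900 others are exposed without benefit, and one patient among the 10,000 suffers the serious reaction, which is the ratio the abstract relies on.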
In the present paper I argue that the formalism of Newtonian mechanics stems directly from a general principle, to be called the principle of microlevel reducibility, which physical systems obey in the realm of classical physics. This principle assumes, first, that all the properties of physical systems must be determined by their states at the current moment of time; in slogan form, "only the present matters to physics." Second, it postulates that any physical system is nothing but an ensemble of structureless particles arranged in some way, whose interaction obeys the superposition principle. I substantiate this statement and demonstrate directly how the formalism of differential equations, the notion of forces in Newtonian mechanics, the concept of phase space and initial conditions, the principle of least action, etc. result from the principle of microlevel reducibility. The philosophical concept of thick presentism and the introduction of two-dimensional time (physical time and meta-time, which are mutually independent on infinitesimal scales) are the pivot points in these constructions.
Survival of the fittest is an oversimplification, because the creature is tuned to the value in the environment. All variation is constrained, and sophistication emerges as a consequence of value. If we could zoom out until the entire universe appeared as an infinitesimal point, it would become one thing, undifferentiated, a system in search of a narrative. The system is a form of a dance: without the dancers, there is no system. The narrative unfolds as an expression in a context. The meaning branches out as an increasingly rich and complex expression in which the dancers enrich the environment and the environment enriches the dancers. As the deeper layers are expressed, more and more subtlety and sophistication is created, and the process continues ad infinitum. Value is the measure of an enriching principle of organization.
Cognitive Set Theory is a mathematical model of cognition which equates sets with concepts and uses mereological elements. It has a holistic emphasis, as opposed to a reductionistic one, and it therefore begins with a single universe (as opposed to an infinite collection of infinitesimal points).