The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
The concept of ‘ideas’ plays a central role in philosophy. This research analyzes the genesis of the idea of continuity and its essential role in intellectual history. The main question is how the idea of continuity came to the human cognitive system, and in this context we analyze the epistemological function of this idea. In intellectual history, the idea of continuity was first introduced by Leibniz. After him, this idea, as a paradigm, formed the base of several fundamental scientific conceptions. It also allowed mathematicians to justify the nature of real numbers, which was one of the central questions and intellectual discussions in the history of mathematics. For this reason, we analyze how Dedekind’s idea of continuity was used for this justification. As a result, it can be said that several fundamental conceptions in intellectual history, philosophy and mathematics could not have arisen without the existence of the idea of continuity. However, this idea is neither a purely philosophical nor a purely mathematical one; it is an interdisciplinary concept. For this reason, we classify it as a mathematical and philosophical invariance.
A strongly independent preorder on a possibly infinite-dimensional convex set that satisfies two of the following conditions must satisfy the third: (i) the Archimedean continuity condition; (ii) mixture continuity; and (iii) comparability under the preorder is an equivalence relation. In addition, if the preorder is nontrivial (has nonempty asymmetric part) and satisfies two of the following conditions, it must satisfy the third: (i') a modest strengthening of the Archimedean condition; (ii') mixture continuity; and (iii') completeness. Applications to decision making under conditions of risk and uncertainty are provided.
In this paper, I examine the relation between Henri Poincaré’s definition of mathematical continuity and Sartre’s discussion of temporality in Being and Nothingness. Poincaré states that a series A, B, and C is continuous when A=B, B=C and A is less than C. I explicate Poincaré’s definition and examine the arguments that he uses to arrive at this definition. I argue that Poincaré’s definition is applicable to temporal series, and I show that this definition of continuity provides a logical basis for Sartre’s psychological explanation of temporality. Specifically, I demonstrate that Poincaré’s definition allows the for-itself to be understood both as connected to a past and future and as distinct from itself. I conclude that the gap between two terms in a temporal series comprises the present and being-for-itself, since it is this gap that occasions the radical freedom to reshape the past into a distinct and different future.
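Poincaré’s seemingly paradoxical definition of the physical continuum can be set out in symbols (a gloss drawing on his well-known weight-discrimination illustration from Science and Hypothesis, not on this paper’s text):

```latex
% Poincaré's physical continuum: adjacent terms are perceptually
% indistinguishable, yet the endpoints are distinguishable.
\[
  A = B, \qquad B = C, \qquad A < C
\]
% His classic illustration: weights of 10, 11, and 12 grams, where a
% 1-gram difference falls below the threshold of sensation, so
% 10\,\mathrm{g} \sim 11\,\mathrm{g} and 11\,\mathrm{g} \sim 12\,\mathrm{g},
% and yet 10\,\mathrm{g} < 12\,\mathrm{g} is perceptible.
```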
The reconstruction of Leibniz’s metaphysics that Deleuze undertakes in The Fold provides a systematic account of the structure of Leibniz’s metaphysics in terms of its mathematical foundations. However, in doing so, Deleuze draws not only upon the mathematics developed by Leibniz—including the law of continuity as reflected in the calculus of infinite series and the infinitesimal calculus—but also upon developments in mathematics made by a number of Leibniz’s contemporaries—including Newton’s method of fluxions. He also draws upon a number of subsequent developments in mathematics, the rudiments of which can be more or less located in Leibniz’s own work—including the theory of functions and singularities, the Weierstrassian theory of analytic continuity, and Poincaré’s theory of automorphic functions. Deleuze then retrospectively maps these developments back onto the structure of Leibniz’s metaphysics. While the Weierstrassian theory of analytic continuity serves to clarify Leibniz’s work, Poincaré’s theory of automorphic functions offers a solution to overcome and extend the limits that Deleuze identifies in Leibniz’s metaphysics. Deleuze brings this elaborate conjunction of material together in order to set up a mathematical idealization of the system that he considers to be implicit in Leibniz’s work. The result is a thoroughly mathematical explication of the structure of Leibniz’s metaphysics. This essay is an exposition of the very mathematical underpinnings of this Deleuzian account of the structure of Leibniz’s metaphysics, which, I maintain, subtends the entire text of The Fold.
We propose that all actual causes are simultaneous with their direct effects, as illustrated by both everyday examples and the laws of physics. We contrast this view with the sequential conception of causation, according to which causes must occur prior to their effects. The key difference between the two views of causation lies in differing assumptions about the mathematical structure of time.
The term ‘continuous’ in real analysis wasn’t given an adequate formal definition until 1817. However, important theorems about continuity were proven long before that. How was this possible? In this paper, I introduce and refine a proposed answer to this question, derived from the work of Frank Jackson, David Lewis and other proponents of the ‘Canberra plan’. In brief, the proposal is that before 1817 the meaning of the term ‘continuous’ was determined by a number of ‘platitudes’ which had some special epistemic status.
In the chapter of Difference and Repetition entitled ‘Ideas and the synthesis of difference,’ Deleuze mobilizes mathematics to develop a ‘calculus of problems’ that is based on the mathematical philosophy of Albert Lautman. Deleuze explicates this process by referring to the operation of certain conceptual couples in the field of contemporary mathematics: most notably the continuous and the discontinuous, the infinite and the finite, and the global and the local. The two mathematical theories that Deleuze draws upon for this purpose are the differential calculus and the theory of dynamical systems, and Galois’ theory of polynomial equations. For the purposes of this paper I will only treat the first of these, which is based on the idea that the singularities of vector fields determine the local trajectories of solution curves, or their ‘topological behaviour’. These singularities can be described in terms of the given mathematical problematic, that is for example, how to solve two divergent series in the same field, and in terms of the solutions, as the trajectories of the solution curves to the problem. What actually counts as a solution to a problem is determined by the specific characteristics of the problem itself, typically by the singularities of this problem and the way in which they are distributed in a system. Deleuze understands the differential calculus essentially as a ‘calculus of problems’, and the theory of dynamical systems as the qualitative and topological theory of problems, which, when connected together, are determinative of the complex logic of different/ciation. (DR 209). Deleuze develops the concept of a problematic idea from the differential calculus, and following Lautman considers the concept of genesis in mathematics to ‘play the role of model ... with respect to all other domains of incarnation’.
While Lautman explicated the philosophical logic of the actualization of ideas within the framework of mathematics, Deleuze (along with Guattari) follows Lautman’s suggestion and explicates the operation of this logic within the framework of a multiplicity of domains, including for example philosophy, science and art in What is Philosophy?, and the variety of domains which characterise the plateaus in A Thousand Plateaus. While for Lautman, a mathematical problem is resolved by the development of a new mathematical theory, for Deleuze, it is the construction of a concept that offers a solution to a philosophical problem; even if this newly constructed concept is characteristic of, or modelled on the new mathematical theory.
Modern philosophy of mathematics has been dominated by Platonism and nominalism, to the neglect of the Aristotelian realist option. Aristotelianism holds that mathematics studies certain real properties of the world – mathematics is neither about a disembodied world of “abstract objects”, as Platonism holds, nor is it merely a language of science, as nominalism holds. Aristotle’s theory that mathematics is the “science of quantity” is a good account of at least elementary mathematics: the ratio of two heights, for example, is a perceivable and measurable real relation between properties of physical things, a relation that can be shared by the ratio of two weights or two time intervals. Ratios are an example of continuous quantity; discrete quantities, such as whole numbers, are also realised as relations between a heap and a unit-making universal. For example, the relation between foliage and being-a-leaf is the number of leaves on a tree, a relation that may equal the relation between a heap of shoes and being-a-shoe. Modern higher mathematics, however, deals with some real properties that are not naturally seen as quantity, so that the “science of quantity” theory of mathematics needs supplementation. Symmetry, topology and similar structural properties are studied by mathematics, but are about pattern, structure or arrangement rather than quantity.
In the philosophy of the analytical tradition, set theory and formal logic are familiar formal tools. I think there is no deep reason why the philosopher’s tool kit should be restricted to just these theories. It might well be the case—to generalize a dictum of Suppes concerning philosophy of science—that the appropriate formal device for doing philosophy is mathematics in general; it may be set theory, algebra, topology, or any other realm of mathematics. In this paper I want to employ elementary topological considerations to shed new light on the intricate problem of the relation of qualities and similarity. Thereby I want to make plausible the general thesis that topology might be a useful device for matters epistemological.
Mathematics clearly plays an important role in scientific explanation. Debate continues, however, over the kind of role that mathematics plays. I argue that if pure mathematical explananda and physical explananda are unified under a common explanation within science, then we have good reason to believe that mathematics is explanatory in its own right. The argument motivates the search for a new kind of scientific case study, a case in which pure mathematical facts and physical facts are explanatorily unified. I argue that it is possible for there to be such cases, and provide some toy examples to demonstrate this. I then identify a potential source of scientific case studies as a guide for future work.
This paper argues that the principle of continuity that underlies Benjamin’s understanding of what makes the reality of a thing thinkable, which in the Kantian context implies a process of “filling time” with an anticipatory structure oriented to the subject, is of a different order than that of infinitesimal calculus—and that a “discontinuity” constitutive of the continuity of experience and (merely) counterposed to the image of actuality as an infinite gradation of ultimately thetic acts cannot be the principle on which Benjamin bases the structure of becoming. Tracking the transformation of the process of “filling time” from its logical to its historical iteration, or from what Cohen called the “fundamental acts of time” in Logik der reinen Erkenntnis to Benjamin’s image of a language of language (qua language touching itself), the paper will suggest that for Benjamin, moving from 0 to 1 is anything but paradoxical, and instead relies on the possibility for a mathematical function to capture the nature of historical occurrence beyond paradoxes of language or phenomenality.
This thesis articulates the resonances between J. M. Coetzee's lifelong engagement with mathematics and his practice as a novelist, critic, and poet. Though the critical discourse surrounding Coetzee's literary work continues to flourish, and though the basic details of his background in mathematics are now widely acknowledged, his inheritance from that background has not yet been the subject of a comprehensive and mathematically-literate account. In providing such an account, I propose that these two strands of his intellectual trajectory not only developed in parallel, but together engendered several of the characteristic qualities of his finest work. The structure of the thesis is essentially thematic, but is also broadly chronological. Chapter 1 focuses on Coetzee's poetry, charting the increasing involvement of mathematical concepts and methods in his practice and poetics between 1958 and 1979. Chapter 2 situates his master's thesis alongside archival materials from the early stages of his academic career, and thus traces the development of his philosophical interest in the migration of quantificatory metaphors into other conceptual domains. Concentrating on his doctoral thesis and a series of contemporaneous reviews, essays, and lecture notes, Chapter 3 details the calculated ambivalence with which he therein articulates, adopts, and challenges various statistical methods designed to disclose objective truth. Chapter 4 explores the thematisation of several mathematical concepts in Dusklands and In the Heart of the Country. Chapter 5 considers Waiting for the Barbarians and Foe in the context provided by Coetzee's interest in the attempts of Isaac Newton to bridge the gap between natural language and the supposedly transparent language of mathematics.
Finally, Chapter 6 locates in Elizabeth Costello and Diary of a Bad Year a cognitive approach to the use of mathematical concepts in ethics, politics, and aesthetics, and, by analogy, a central aspect of the challenge Coetzee's late fiction poses to the contemporary literary landscape.
The idea behind this special theme journal issue was to continue the work we started with the INBIOSA initiative (www.inbiosa.eu) and our small inter-disciplinary scientific community. The result of this EU funded project was a white paper (Simeonov et al., 2012a) defining a new direction for future research in theoretical biology, which we called Integral Biomathics, and a volume (Simeonov et al., 2012b) with contributions from two workshops and our first international conference in this field in 2011. The initial impulse for this effort was given a year earlier by a publication of one of the guest editors of this issue (Simeonov, 2010) in this journal. This time we wish to provide a broader forum and more space to elaborate in detail some of the most interesting concepts we have encountered in our discussions, as well as to invite some new contributions of particular interest in the field. Another goal we had in mind was to collect and review as many provocative perspectives as possible on the key topic we are interested in before deciding to follow a more focused notion that would lead to a funded research program. Therefore we welcomed the generous suggestion of Professor Denis Noble, FRS, who is also an editor of this journal, to prepare a special theme issue entitled “Can biology create a profoundly new mathematics and computation?” It has taken a while to invite and collect the contributions. Most of them went through a couple of revision cycles and adjustments after having been thoroughly discussed with colleagues, including the editors of this issue. We think that the result is a satisfactory one, since we succeeded in integrating a diversity of original, though sometimes controversial and mutually exclusive, concepts organized within chapters of a self-contained volume. The task of compiling all this was not easy at all.
Despite our efforts to position the articles of different authors and themes in a way allowing their easy comprehension and relation to each other within the individual chapters, some of them still require a sort of introduction to dissolve possible ambiguities. This is what we are going to do in the following few paragraphs with the hope that the reader (and some of the authors) would excuse our failures.
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697), to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa, which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid, whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions": indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED 14 March 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P,Q/R becomes P,~R/~Q or ~R,Q/~P. Saccheri’s work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
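The contrast between the two procedures can be displayed in symbols (a notational restatement of the review’s own schema, nothing added to it):

```latex
% Indirect reduction: transform one argument form into another by
% swapping a premise with the contradictory of the conclusion.
\[
  \frac{P,\; Q}{R}
  \;\Longrightarrow\;
  \frac{P,\; \lnot R}{\lnot Q}
  \quad\text{or}\quad
  \frac{\lnot R,\; Q}{\lnot P}
\]
% Indirect deduction: augment the premises with the contradictory of
% the conclusion and derive a contradiction.
\[
  P,\, Q,\, \lnot R \vdash \bot
  \quad\text{licenses}\quad
  P,\, Q \vdash R
\]
```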
In recent years, problem-solving has become a central topic discussed by educators and researchers in mathematics education. It is treated not only as an ability or as a method of teaching; comparatively little attention has been given to the components that support success in problem-solving, such as students' beliefs and attitudes towards mathematics, algebraic thinking skills, and resources and teaching materials. This paper examines algebraic thinking skills as a foundation for problem-solving, and the learning cycle as a breath of continuous learning. The learning cycle used here is a modified type of 5E based on beliefs.
When mathematicians think of the philosophy of mathematics, they probably think of endless debates about what numbers are and whether they exist. Since plenty of mathematical progress continues to be made without taking a stance on either of these questions, mathematicians feel confident they can work without much regard for philosophical reflections. In his sharp-toned, sprawling book, David Corfield acknowledges the irrelevance of much contemporary philosophy of mathematics to current mathematical practice, and proposes reforming the subject accordingly.
Our visual experience seems to suggest that no continuous curve can cover every point of the unit square, yet in the late nineteenth century Giuseppe Peano proved that such a curve exists. Examples like this, particularly in analysis (in the sense of the infinitesimal calculus), received much attention in the nineteenth century. They helped instigate what Hans Hahn called a “crisis of intuition”, wherein visual reasoning in mathematics came to be thought epistemically problematic. Hahn described this “crisis” as follows: Mathematicians had for a long time made use of supposedly geometric evidence as a means of proof in much too naive and much too uncritical a way, till the unclarities and mistakes that arose as a result forced a turnabout. Geometrical intuition was now declared to be inadmissible as a means of proof... (p. 67) Avoiding geometrical evidence, Hahn continued, mathematicians aware of this crisis pursued what he called “logicization”, “when the discipline requires nothing but purely logical fundamental concepts and propositions for its development.” On this view, an epistemically ideal mathematics would minimize, or avoid altogether, appeals to visual representations. This would be a radical reformation of past practice, necessary, according to its advocates, for avoiding “unclarities and mistakes” like the one exposed by Peano.
We attribute three major insights to Hegel: first, an understanding of the real numbers as the paradigmatic kind of number; second, a recognition that a quantitative relation has three elements, which is embedded in his conception of measure; and third, a recognition of the phenomenon of divergence of measures, such as in second-order or continuous phase transitions in which correlation length diverges. For ease of exposition, we will refer to these three insights as the R First Theory, Tripartite Relations, and Divergence of Measures. Given the constraints of space, we emphasize the first and the third in this paper.
Brentano’s theory of continuity is based on his account of boundaries. The core idea of the theory is that boundaries and coincidences thereof belong to the essence of continua. Brentano is confident that he developed a full-fledged, boundary-based theory of continuity; and scholars often concur: whether or not they accept Brentano’s take on continua, they consider it a clear contender. My impression, on the contrary, is that, although it is infused with invaluable insights, several aspects of Brentano’s account of continuity remain inchoate. To be clear, the theory of boundaries on which it relies, as well as the account of ontological dependence that Brentano develops alongside his theory of boundaries, constitute splendid achievements. However, the passage from the theory of boundaries to the account of continuity is rather sketchy. This paper pinpoints some chief problems raised by this transition, and proposes some solutions to them which, if not always faithful to the letter of Brentano’s account of continua, are, I believe, faithful to its spirit. §1 presents Brentano’s critique of the mathematical account of the continuous. §2 introduces Brentano’s positive account of continua. §3 raises three worries about Brentano’s account of continuity. §4 proposes a Neo-Brentanian approach to continua that handles these worries.
We seek to elucidate the philosophical context in which one of the most important conceptual transformations of modern mathematics took place, namely the so-called revolution in rigor in infinitesimal calculus and mathematical analysis. Some of the protagonists of the said revolution were Cauchy, Cantor, Dedekind, and Weierstrass. The dominant current of philosophy in Germany at the time was neo-Kantianism. Among its various currents, the Marburg school (Cohen, Natorp, Cassirer, and others) was the one most interested in matters scientific and mathematical. Our main thesis is that Marburg neo-Kantian philosophy formulated a sophisticated position towards the problems raised by the concepts of limits and infinitesimals. The Marburg school neither clung to the traditional approach of logically and metaphysically dubious infinitesimals, nor whiggishly subscribed to the new orthodoxy of the “great triumvirate” of Cantor, Dedekind, and Weierstrass that declared infinitesimals conceptus non grati in mathematical discourse. Rather, following Cohen’s lead, the Marburg philosophers sought to clarify Leibniz’s principle of continuity, and to exploit it in making sense of infinitesimals and related concepts.
Does consciousness collapse the quantum wave function? This idea was taken seriously by John von Neumann and Eugene Wigner but is now widely dismissed. We develop the idea by combining a mathematical theory of consciousness (integrated information theory) with an account of quantum collapse dynamics (continuous spontaneous localization). Simple versions of the theory are falsified by the quantum Zeno effect, but more complex versions remain compatible with empirical evidence. In principle, versions of the theory can be tested by experiments with quantum computers. The upshot is not that consciousness-collapse interpretations are clearly correct, but that there is a research program here worth exploring.
The INBIOSA project brings together a group of experts across many disciplines who believe that science requires a revolutionary transformative step in order to address many of the vexing challenges presented by the world. It is INBIOSA’s purpose to enable the focused collaboration of an interdisciplinary community of original thinkers. This paper sets out the case for support for this effort. The focus of the transformative research program proposal is biology-centric. We admit that biology to date has been more fact-oriented and less theoretical than physics. However, the key leverageable idea is that careful extension of the science of living systems can be more effectively applied to some of our most vexing modern problems than the prevailing scheme, derived from abstractions in physics. While these have some universal application and demonstrate computational advantages, they are not theoretically mandated for the living. A new set of mathematical abstractions derived from biology can now be similarly extended. This is made possible by leveraging new formal tools to understand abstraction and enable computability. [The latter has a much expanded meaning in our context from the one known and used in computer science and biology today, that is "by rote algorithmic means", since it is not known if a living system is computable in this sense (Mossio et al., 2009).] Two major challenges constitute the effort. The first challenge is to design an original general system of abstractions within the biological domain. The initial issue is descriptive leading to the explanatory. There has not yet been a serious formal examination of the abstractions of the biological domain. What is used today is an amalgam; much is inherited from physics (via the bridging abstractions of chemistry) and there are many new abstractions from advances in mathematics (incentivized by the need for more capable computational analyses).
Interspersed are abstractions, concepts and underlying assumptions “native” to biology and distinct from the mechanical language of physics and computation as we know them. A pressing agenda should be to single out the most concrete and at the same time the most fundamental process-units in biology and to recruit them into the descriptive domain. Therefore, the first challenge is to build a coherent formal system of abstractions and operations that is truly native to living systems. Nothing will be thrown away, but many common methods will be philosophically recast, just as in physics relativity subsumed and reinterpreted Newtonian mechanics.

This step is required because we need a comprehensible, formal system to apply in many domains. Emphasis should be placed on the distinction between multi-perspective analysis and synthesis and on what could be the basic terms or tools needed. The second challenge is relatively simple: the actual application of this set of biology-centric ways and means to cross-disciplinary problems. In its early stages, this will seem to be a “new science”. This White Paper sets out the case for continuing support of Information and Communication Technology (ICT) for transformative research in biology and information processing centered on paradigm changes in the epistemological, ontological, mathematical and computational bases of the science of living systems. Today, curiously, living systems cannot be said to be anything more than dissipative structures organized internally by genetic information. There is not anything substantially different from abiotic systems other than the empirical nature of their robustness. We believe that there are other new and unique properties and patterns comprehensible at this bio-logical level. The report lays out a fundamental set of approaches to articulate these properties and patterns, and is composed as follows.
-/- Sections 1 through 4 (preamble, introduction, motivation and major biomathematical problems) are incipient. Section 5 describes the issues affecting Integral Biomathics, and Section 6 the aspects of the Grand Challenge we face with this project. Section 7 contemplates the effort to formalize a General Theory of Living Systems (GTLS) from what we have today. The goal is to have a formal system, equivalent to that which exists in the physics community. Here we define how to perceive the role of time in biology. Section 8 describes the initial efforts to apply this general theory of living systems in many domains, with special emphasis on cross-disciplinary problems and multiple domains spanning both “hard” and “soft” sciences. The expected result is a coherent collection of integrated mathematical techniques. Section 9 discusses the first two test cases, project proposals, of our approach. They are designed to demonstrate the ability of our approach to address “wicked problems” which span physics, chemistry, biology, societies and societal dynamics. The solutions require integrated measurable results at multiple levels, known as “grand challenges” to existing methods. Finally, Section 10 makes an appeal for action, advocating the necessity of further long-term support for the INBIOSA program. -/- The report concludes with a preliminary, non-exclusive list of challenging research themes to address, as well as required administrative actions. The efforts described in the ten sections of this White Paper will proceed concurrently. Collectively, they describe a program that can be managed and measured as it progresses.
Iterability, the repetition which alters the idealization it reproduces, is the engine of deconstructive movement. The fact that all experience is transformative-dissimulative in its essence does not, however, mean that the momentum of change is the same for all situations. Derrida adapts Husserl's distinction between a bound and a free ideality to draw up a contrast between mechanical mathematical calculation, whose in-principle infinite enumerability is supposedly meaningless, empty of content, and therefore not in itself subject to alteration through contextual change, and idealities such as spoken or written language which are directly animated by a meaning-to-say and are thus immediately affected by context. Derrida associates the dangers of cultural stagnation, paralysis and irresponsibility with the emptiness of programmatic, mechanical, formulaic thinking. This paper endeavors to show that enumerative calculation is not context-independent in itself but is instead immediately infused with alteration, thereby making incoherent Derrida's claim to distinguish between a free and a bound ideality. Along with the presumed formal basis of numeric infinitization, Derrida's non-dialectical distinction between forms of mechanical or programmatic thinking (the Same) and truly inventive experience (the absolute Other) loses its justification. In place of a distinction between bound and free idealities, a distinction between two poles of novelty is proposed: the first form of novel experience would be characterized by affectivities of unintelligibility, confusion and vacuity, and the second by affectivities of anticipatory continuity and intimacy.
Pythagoras’s number doctrine had a great effect on the development of science. Number was the key to the highest reality, and this approach allowed Pythagoras to transform mathematics from a craft into a science, which continues the implementation of its project of the “digitization of being”. Pythagoras’s project underwent considerable transformation, but this only means that in knowledge the plan is often far from the result.
The chapter explains why evolutionary genetics – a mathematical body of theory developed since the 1910s – eventually got to deal with culture: the frequency dynamics of genes like “the lactase gene” in populations cannot be correctly modeled without including social transmission. While the body of theory requires specific justifications, for example meticulous legitimations of describing culture in terms of traits, the body of theory is an immensely valuable scientific instrument, not only for its modeling power but also for the amount of work that has been necessary to build, maintain, and expand it. A brief history of evolutionary genetics is told to demonstrate such patrimony, and to emphasize the importance and accumulation of statistical knowledge therein. The probabilistic nature of genotypes, phenogenotypes and population phenomena is also touched upon. Although evolutionary genetics is actually composed of distinct and partially independent traditions, the most important mathematical object of evolutionary genetics is the Mendelian space, and evolutionary genetics is mostly the daring study of trajectories of alleles in a population that explores that space. The ‘body’ is scientific wealth that can be invested in studying every situation that happens to turn out suitable to be modeled as a Mendelian population, or as a modified Mendelian population, or as a population of continuously varying individuals with an underlying Mendelian basis. Mathematical tinkering and justification are two halves of the mutual adjustment between the body of theory and the new domain of culture. Some works in the current literature overstate justification, misrepresenting the relationship between body of theory and domain, and hindering interdisciplinary dialogue.
Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to adapt Popper's falsificationism and extend it not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
We develop a simple framework called ‘natural topology’, which can serve as a theoretical and applicable basis for dealing with real-world phenomena. Natural topology is tailored to make pointwise and pointfree notions go together naturally. As a constructive theory in BISH, it gives a classical mathematician a faithful idea of important concepts and results in intuitionism. -/- Natural topology is well-suited for practical and computational purposes. We give several examples relevant for applied mathematics, such as the decision-support system Hawk-Eye, and various real-number representations. -/- We compare classical mathematics (CLASS), intuitionistic mathematics (INT), recursive mathematics (RUSS), Bishop-style mathematics (BISH) and formal topology, aiming to reduce the mutual differences to their essence. To do so, our mathematical foundation must be precise and simple. There are links with physics, regarding the topological character of our physical universe. -/- Any natural space is isomorphic to a quotient space of Baire space, which therefore is universal. We develop an elegant and concise ‘genetic induction’ scheme, and prove its equivalence on natural spaces to a formal-topological induction style. The inductive Heine-Borel property holds for ‘compact’ or ‘fanlike’ natural subspaces, including the real interval [g, h]. Inductive morphisms respect this Heine-Borel property, inversely. This partly solves the continuous-function problem for BISH, yet pointwise problems persist in the absence of Brouwer’s Thesis. -/- By inductivizing the definitions, a direct correspondence with INT is obtained which allows for a translation of many intuitionistic results into BISH. We thus prove a constructive star-finitary metrization theorem which parallels the classical metrization theorem for strongly paracompact spaces. We also obtain non-metrizable Silva spaces, in infinite-dimensional topology. 
Natural topology gives a solid basis, we think, for further constructive study of topological lattice theory, algebraic topology and infinite-dimensional topology. The final section reconsiders the question of which mathematics to choose for physics. Compactness issues also play a role here, since the question ‘can Nature produce a non-recursive sequence?’ finds a negative answer in CTphys. CTphys, if true, would seem at first glance to point to RUSS as the mathematics of choice for physics. To discuss this issue, we wax more philosophical. We also present a simple model of INT in RUSS, in the two-player game LIfE.
The relevance of analytic metaphysics has come under criticism: Ladyman & Ross, for instance, have suggested discontinuing the field. French & McKenzie have argued in defense of analytic metaphysics that it develops tools that could turn out to be useful for philosophy of physics. In this article, we show first that this heuristic defense of metaphysics can be extended to the scientific field of applied ontology, which uses constructs from analytic metaphysics. Second, we elaborate on a parallel by French & McKenzie between mathematics and metaphysics to show that the whole field of analytic metaphysics, being useful not only for philosophy but also for science, should continue to exist as a largely autonomous field.
Abstract. In Dynamics of Reason Michael Friedman proposes a kind of synthesis between the neokantianism of Ernst Cassirer, the logical empiricism of Rudolf Carnap, and the historicism of Thomas Kuhn. Cassirer and Carnap are to take care of the Kantian legacy of modern philosophy of science, encapsulated in the concept of a relativized a priori and the globally rational or continuous evolution of scientific knowledge, while Kuhn’s role is to ensure that the historicist character of scientific knowledge is taken seriously. More precisely, Carnapian linguistic frameworks guarantee that the evolution of science proceeds in a rational manner locally, while Cassirer’s concept of an internally defined conceptual convergence of empirical theories provides the means to maintain the global continuity of scientific reason. In this paper it is argued that Friedman’s neokantian account of scientific reason based on the concept of the relativized a priori underestimates the pragmatic aspects of the dynamics of scientific reason. To overcome this shortcoming, I propose to reconsider C.I. Lewis’s account of a pragmatic a priori, recently modernized and elaborated by Hasok Chang. This may be … -/- Keywords: Dynamics of reason, Paradigms, Logical Empiricism, Neokantianism, Pragmatism, Mathematics, Communicative Rationality.
In continuation of what has been said in the first part of this two-part paper, herein we present further considerations on symbolism, reconsider some related psychodynamic case reports with some possible variants on their interpretations, and apply what is said to some further speculations on mathematical symbolism and thought. In this second part, we continue the numbering of the first part, Σύμβολου, 1.
The traditional view of evidence in mathematics is that evidence is just proof and proof is just derivation. There are good reasons for thinking that this view should be rejected: it misrepresents both historical and current mathematical practice. Nonetheless, evidence, proof, and derivation are closely intertwined. This paper seeks to tease these concepts apart. It emphasizes the role of argumentation as a context shared by evidence, proofs, and derivations. The utility of argumentation theory, in general, and argumentation schemes, in particular, as a methodology for the study of mathematical practice is thereby demonstrated. Argumentation schemes represent an almost untapped resource for mathematics education. Notably, they provide a consistent treatment of rigorous and non-rigorous argumentation, thereby working to exhibit the continuity of reasoning in mathematics with reasoning in other areas. Moreover, since argumentation schemes are a comparatively mature methodology, there is a substantial body of existing work to draw upon, including some increasingly sophisticated software tools. Such tools have significant potential for the analysis and evaluation of mathematical argumentation. The first four sections of the paper address the relationships of evidence to proof, proof to derivation, argument to proof, and argument to evidence, respectively. The final section directly addresses some of the educational implications of an argumentation scheme account of mathematical reasoning.
In this chapter, one considers finance at its very foundations, namely, at the place where assumptions are being made about the ways to measure the two key ingredients of finance: risk and return. It is well known that returns for a large class of assets display a number of stylized facts that cannot be squared with the traditional views of 1960s financial economics (normality and continuity assumptions, i.e. the Brownian representation of market dynamics). Despite the empirical counterevidence, normality and continuity assumptions were part and parcel of financial theory and practice, embedded in all financial practices and beliefs. Our aim is to build on this puzzle to extract some clues revealing the use of one research strategy in the academic community: model tinkering, defined as a particular research habit. We choose to focus on one specific moment of the scientific controversies in academic finance: the ‘leptokurtic crisis’ opened by Mandelbrot in 1962. The profoundness of the crisis came from the angle of Mandelbrot’s attack: not only did he emphasize an empirical inadequacy of the Brownian representation, but he also argued for an inadequate grounding of this representation. We give some insights into this crisis and display the model tinkering strategies of the financial academic community in the 1970s and the 1980s.
In this chapter, I consider the largely overlooked influence of E. W. von Tschirnhaus' treatise on method, the Medicina mentis, on Wolff's early philosophical project (in both its conception and execution). As I argue, part of Tschirnhaus' importance for Wolff lies in the use he makes of principles gained from experience as a foundation for the scientific enterprise in the context of his broader philosophical rationalism. I will show that this lesson from Tschirnhaus runs through Wolff's earliest philosophical discussions, and indeed continues to inform his major texts in logic and mathematics just before the publication of the German Metaphysics. In the end, my discussion has the effect of revealing Tschirnhaus to be an exceptionally important influence on Wolff's development, perhaps even as important (or so I suggest) as Leibniz.
A framework is developed for understanding what is “taken for granted” both in philosophy and in life generally, which may serve to orient philosophical inquiry and make it more effective. The framework takes in language and its development, as well as mathematics, logic, and the empirical sphere with particular reference to the exigencies of life. It is evaluated through consideration of seven philosophical issues concerned with such topics as solipsism, sense data as the route to knowledge, the possible reduction of geometry to logic, and the existence and status of human rights. Various dichotomies and the notion of continuity are evidently highly strategic.
The paper explores Hermann Weyl’s turn to intuitionism through a philosophical prism of normative framework transitions. It focuses on three central themes that occupied Weyl’s thought: the notion of the continuum, logical existence, and the necessity of intuitionism, constructivism, and formalism to adequately address the foundational crisis of mathematics. The analysis of these themes reveals Weyl’s continuous endeavor to deal with such fundamental problems and suggests a view that provides a different perspective concerning Weyl’s wavering foundational positions. Building on a philosophical model of scientific framework transitions and the special role that normative indecision or ambivalence plays in the process, the paper examines Weyl’s motives for considering such a radical shift in the first place. It concludes by showing that Weyl’s shifting stances should be regarded as symptoms of a deep, convoluted intrapersonal process of self-deliberation induced by exposure to external criticism.
Most advocates of the so-called “neologicist” movement in the philosophy of mathematics identify themselves as “Neo-Fregeans” (e.g., Hale and Wright), presenting an updated and revised version of Frege’s form of logicism. Russell’s form of logicism is scarcely discussed in this literature, and when it is, it is often dismissed as not really logicism at all (in light of its assumption of axioms of infinity, reducibility and so on). In this paper I have three aims: firstly, to identify more clearly the primary metaontological and methodological differences between Russell’s logicism and the more recent forms; secondly, to argue that Russell’s form of logicism offers more elegant and satisfactory solutions to a variety of problems that continue to plague the neo-logicist movement (the bad company objection, the embarrassment of riches objection, worries about a bloated ontology, etc.); thirdly, to argue that Neo-Russellian forms of neologicism remain viable positions for current philosophers of mathematics.
This volume tells the story of the legacy and impact of the great German polymath Gottfried Wilhelm Leibniz (1646-1716). Leibniz made significant contributions to many areas, including philosophy, mathematics, political and social theory, theology, and various sciences. The essays in this volume explore the effects of Leibniz’s profound insights on subsequent generations of thinkers by tracing the ways in which his ideas have been defended and developed in the three centuries since his death. Each of the 11 essays is concerned with Leibniz’s legacy and impact in a particular area, and between them they show not just the depth of Leibniz’s talents but also the extent to which he shaped the various domains to which he contributed, and in some cases continues to shape them today. With essays written by experts such as Nicholas Jolley, Pauline Phemister, and Philip Beeley, this volume is essential reading not just for students of Leibniz but also for those who wish to understand the game-changing impact made by one of history’s true universal geniuses.
This collection of articles was written over the last 10 years and edited to bring them up to date (2019). All the articles are about human behavior (as are all articles by anyone about anything), and so about the limitations of having a recent monkey ancestry (8 million years or much less depending on viewpoint) and manifest words and deeds within the framework of our innate psychology as presented in the table of intentionality. As famous evolutionist Richard Leakey says, it is critical to keep in mind not that we evolved from apes, but that in every important way, we are apes. If everyone was given a real understanding of this (i.e., of human ecology and psychology to actually give them some control over themselves), maybe civilization would have a chance. As things are, however, the leaders of society have no more grasp of things than their constituents, and so collapse into anarchy is inevitable. -/- The first group of articles attempts to give some insight into how we behave that is reasonably free of theoretical delusions. In the next three groups, I comment on three of the principal delusions preventing a sustainable world— technology, religion and politics (cooperative groups). People believe that society can be saved by them, so I provide some suggestions in the rest of the book as to why this is unlikely via short articles and reviews of recent books by well-known writers. -/- It is critical to understand why we behave as we do and so the first section presents articles that try to describe (not explain as Wittgenstein insisted) behavior. I start with a brief review of the logical structure of rationality, which provides some heuristics for the description of language (mind, rationality, personality) and gives some suggestions as to how this relates to the evolution of social behavior. 
This centers around the two writers I have found the most important in this regard, Ludwig Wittgenstein and John Searle, whose ideas I combine and extend within the dual system (two systems of thought) framework that has proven so useful in recent thinking and reasoning research. As I note, there is in my view essentially complete overlap between philosophy, in the strict sense of the enduring questions that concern the academic discipline, and the descriptive psychology of higher order thought (behavior). Once one has grasped Wittgenstein’s insight that there is only the issue of how the language game is to be played, one determines the Conditions of Satisfaction (what makes a statement true or satisfied etc.) and that is the end of the discussion. No neurophysiology, no metaphysics, no postmodernism, no theology. -/- It is my contention that the table of intentionality (rationality, mind, thought, language, personality etc.) that features prominently here describes more or less accurately, or at least serves as an heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology, but everything else (history, literature, mathematics, politics etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view it, includes both conscious deliberative System 2 and unconscious automated System 1 actions or reflexes. -/- The next section describes the digital delusions, which confuse the language games of System 2 with the automatisms of System 1, and so cannot distinguish biological machines (i.e., people) from other kinds of machines (i.e., computers). The ‘reductionist’ claim is that one can ‘explain’ behavior at a ‘lower’ level, but what actually happens is that one does not explain human behavior but a ‘stand in’ for it. Hence the title of Searle’s classic review of Dennett’s book (“Consciousness Explained”)— “Consciousness Explained Away”. 
In most contexts ‘reduction’ of higher level emergent behavior to brain functions, biochemistry, or physics is incoherent. Even for ‘reduction’ of chemistry or physics, the path is blocked by chaos and uncertainty. Anything can be ‘represented’ by equations, but when they ‘represent’ higher order behavior, it is not clear (and cannot be made clear) what the ‘results’ mean. Reductionist metaphysics is a joke, but most scientists and philosophers lack the appropriate sense of humor. -/- The last section describes The One Big Happy Family Delusion, i.e., that we are selected for cooperation with everyone, and that the euphonious ideals of Democracy, Diversity and Equality will lead us into utopia, if we just manage things correctly (the possibility of politics). Again, the No Free Lunch Principle ought to warn us it cannot be true, and we see throughout history and all over the contemporary world, that without strict controls, selfishness and stupidity gain the upper hand and soon destroy any nation that embraces these delusions. In addition, the monkey mind steeply discounts the future, and so we cooperate in selling our descendants’ heritage for temporary comforts, greatly exacerbating the problems. The only major change in this edition is the addition in the last article of a short discussion of China, a threat to peace and freedom as great as overpopulation and climate change and one to which even most professional scholars and politicians are oblivious, so I regarded it as sufficiently important to warrant a new edition. -/- I describe versions of this delusion (i.e., that we are basically ‘friendly’ if just given a chance) as it appears in some recent books on sociology/biology/economics. Even Sapolsky’s otherwise excellent “Behave” (2017) embraces leftist politics and group selection and gives space to a discussion of whether humans are innately violent. 
I end with an essay on the great tragedy playing out in America and the world, which can be seen as a direct result of our evolved psychology manifested as the inexorable machinations of System 1. Our psychology, eminently adaptive and eugenic on the plains of Africa from ca. 6 million years ago, when we split from chimpanzees, to ca. 50,000 years ago, when many of our ancestors left Africa (i.e., in the EEA or Environment of Evolutionary Adaptation), is now maladaptive and dysgenic and the source of our Suicidal Utopian Delusions. So, like all discussions of behavior (philosophy, psychology, sociology, biology, anthropology, politics, law, literature, history, economics, soccer strategies, business meetings, etc.), this book is about evolutionary strategies, selfish genes and inclusive fitness (kin selection, natural selection). -/- The great mystic Osho said that the separation of God and Heaven from Earth and Humankind was the most evil idea that ever entered the Human mind. In the 20th century an even more evil notion arose, or at least became popular with leftists—that humans are born with rights, rather than having to earn privileges. The idea of human rights is an evil fantasy created by leftists to draw attention away from the merciless destruction of the earth by unrestrained 3rd world motherhood. Thus, every day the population increases by 200,000, who must be provided with resources to grow and space to live, and who soon produce another 200,000 etc. And one almost never hears it noted that what they receive must be taken from those already alive, and their descendants. Their lives diminish those already here in both major obvious and countless subtle ways. Every new baby destroys the earth from the moment of conception. In a horrifically overcrowded world with vanishing resources, there cannot be human rights without destroying the earth and our descendants’ futures. 
It could not be more obvious, but it is rarely mentioned in a clear and direct way, and one will never see the streets full of protesters against motherhood. -/- The most basic facts, almost never mentioned, are that there are not enough resources in America or the world to lift a significant percentage of the poor out of poverty and keep them there. Even the attempt to do this is already bankrupting America and destroying the world. The earth’s capacity to produce food decreases daily, as does our genetic quality. And now, as always, by far the greatest enemy of the poor is other poor and not the rich. -/- America and the world are in the process of collapse from excessive population growth, most of it for the last century, and now all of it, due to 3rd world people. Consumption of resources and the addition of 4 billion more ca. 2100 will collapse industrial civilization and bring about starvation, disease, violence and war on a staggering scale. The earth loses about 2% of its topsoil every year, so as it nears 2100, most of its food growing capacity will be gone. Billions will die and nuclear war is all but certain. In America, this is being hugely accelerated by massive immigration and immigrant reproduction, combined with abuses made possible by democracy. Depraved human nature inexorably turns the dream of democracy and diversity into a nightmare of crime and poverty. China will continue to overwhelm America and the world, as long as it maintains the dictatorship which limits selfishness. The root cause of collapse is the inability of our innate psychology to adapt to the modern world, which leads people to treat unrelated persons as though they had common interests (which I suggest may be regarded as an unrecognized -- but the commonest and most serious -- psychological problem -- Inclusive Fitness Disorder). 
This, plus ignorance of basic biology and psychology, leads to the social engineering delusions of the partially educated who control democratic societies. Few understand that if you help one person you harm someone else—there is no free lunch and every single item anyone consumes destroys the earth beyond repair. Consequently, social policies everywhere are unsustainable and one by one all societies without stringent controls on selfishness will collapse into anarchy or dictatorship. Without dramatic and immediate changes, there is no hope for preventing the collapse of America, or any country that follows a democratic system. Hence my concluding essay “Suicide by Democracy”. -/- Those wishing to read my other writings may see Talking Monkeys 2nd ed (2019), The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019), Suicide by Democracy 3rd ed (2019), The Logical Structure of Human Behavior (2019) and Suicidal Utopian Delusions in the 21st Century 4th ed (2019).
This collection of articles was written over the last 10 years and edited to bring them up to date (2017). All the articles are about human behavior (as are all articles by anyone about anything), and so about the limitations of having a recent monkey ancestry (8 million years or much less depending on viewpoint) and manifest words and deeds within the framework of our innate psychology as presented in the table of intentionality. As famous evolutionist Richard Leakey says, it is critical to keep in mind not that we evolved from apes, but that in every important way, we are apes. If everyone was given a real understanding of this (i.e., of human ecology and psychology to actually give them some control over themselves), maybe civilization would have a chance. As things are, however, the leaders of society have no more grasp of things than their constituents, and so collapse into anarchy is inevitable. -/- It is critical to understand why we behave as we do and so the first section presents articles that try to describe (not explain as Wittgenstein insisted) behavior. Section one starts with a brief review of the logical structure of rationality, which provides some heuristics for the description of language (mind) and gives some suggestions as to how this relates to the evolution of social behavior. This centers around the two writers I have found the most important in this regard, Ludwig Wittgenstein and John Searle, whose ideas I combine and extend within the dual system (two systems of thought) framework that has proven so useful in recent thinking and reasoning research. As I note, there is in my view essentially complete overlap between philosophy, in the strict sense of the enduring questions that concern the academic discipline, and the descriptive psychology of higher order thought (behavior). 
Once one has grasped Wittgenstein’s insight that there is only the issue of how the language game is to be played, one determines the Conditions of Satisfaction (what makes a statement true or satisfied etc.) and that is the end of the discussion. -/- Since philosophical problems are the result of our innate psychology, or, as Wittgenstein put it, due to the lack of perspicuity of language, they run throughout human discourse, so there is endless need for philosophical analysis, not only in the ‘human sciences’ of philosophy, sociology, anthropology, political science, psychology, history, literature, religion, etc., but in the ‘hard sciences’ of physics, mathematics, and biology. It is universal to mix language game questions with the real scientific ones as to what the empirical facts are. Scientism is ever present, and the master laid it before us long ago, i.e., Wittgenstein (hereafter W), beginning with the Blue and Brown Books in the early 1930s. -/- "Philosophers constantly see the method of science before their eyes and are irresistibly tempted to ask and answer questions in the way science does. This tendency is the real source of metaphysics and leads the philosopher into complete darkness." (BBB p18) -/- The key to everything about us is biology, and it is obliviousness to it that leads millions of smart educated people like Obama, Chomsky, Clinton and the Pope to espouse suicidal utopian ideals that inexorably lead straight to Hell on Earth. As W noted, it is what is always before our eyes that is the hardest to see. We live in the world of conscious, deliberative, linguistic System 2, but it is unconscious, automatic, reflexive System 1 that rules. This is the source of the universal blindness described by Searle’s The Phenomenological Illusion (TPI), Pinker’s Blank Slate and Tooby and Cosmides’ Standard Social Science Model.
-/- The astute may wonder why we cannot see System 1 at work, but it is clearly counterproductive for an animal to be thinking about or second-guessing every action, and in any case there is no time for the slow, massively integrated System 2 to be involved in the constant stream of split-second ‘decisions’ we must make. As W noted, our ‘thoughts’ (T1, or the ‘thoughts’ of System 1) must lead directly to actions. -/- It is my contention that the table of intentionality (rationality, mind, thought, language, personality etc.) that features prominently here describes more or less accurately, or at least serves as a heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology, but everything else (history, literature, mathematics, politics etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view it includes both conscious deliberative System 2 and unconscious automated System 1 actions or reflexes. -/- Thus all the articles, like all behavior, are intimately connected if one knows how to look at them. As I note, The Phenomenological Illusion (oblivion to our automated System 1) is universal and extends not merely throughout philosophy but throughout life. I am sure that Chomsky, Obama, Zuckerberg and the Pope would be incredulous if told that they suffer from the same problem as Hegel, Husserl and Heidegger (or that they differ only in degree from drug and sex addicts in being motivated by stimulation of their frontal cortices by the delivery of dopamine via the ventral tegmentum and the nucleus accumbens), but it’s clearly true. While the phenomenologists only wasted a lot of people’s time, they are wasting the earth and their descendants’ future. Section one continues with other views of behavior which my reviews attempt to correct and put in context with minimal theory.
-/- The next section describes the digital delusions, which confuse the language games of System 2 with the automatisms of System 1, and so cannot distinguish biological machines (i.e., people) from other kinds of machines (i.e., computers). The ‘reductionist’ claim is that one can ‘explain’ behavior at a ‘lower’ level, but what actually happens is that one does not explain human behavior but a ‘stand in’ for it. Hence the title of Searle’s classic review of Dennett’s book (“Consciousness Explained”)—“Consciousness Explained Away”. In most contexts ‘reduction’ of higher level emergent behavior to brain functions, biochemistry, or physics is incoherent. Even for chemistry or physics the path is blocked by chaos and uncertainty. Anything can be ‘represented’ by equations, but when they ‘represent’ higher order behavior, it is not clear what the ‘results’ mean. Reductionist metaphysics is a joke, but most scientists and philosophers lack the appropriate sense of humor. -/- Another hi-tech delusion is that we will be saved from the pure evil (selfishness) of System 1 by the computers/AI/robotics/nanotech/genetic engineering created by System 2. The No Free Lunch principle tells us there will be serious and possibly fatal consequences. The adventurous may regard this principle as a higher order emergent expression of the Second Law of Thermodynamics. -/- The last section describes The One Big Happy Family Delusion, i.e., that we are selected for cooperation with everyone and that the euphonious ideals of Democracy, Diversity and Equality will lead us into utopia. Again the No Free Lunch principle ought to warn us it cannot be true, and we see throughout history and all over the contemporary world that, without strict controls, selfishness and stupidity gain the upper hand and soon destroy any nation that embraces this delusion.
In addition, the monkey mind steeply discounts the future, and so we sell our descendants’ heritage for temporary comforts, greatly exacerbating the problems. -/- I describe versions of this delusion (i.e., that we are basically ‘friendly’ if just given a chance) as it appears in some recent books on sociology/biology/economics. I end with an essay on the great tragedy playing out in America and the world, which can be seen as a direct result of our evolved psychology manifested as the inexorable machinations of System 1. Our evolved psychology was eminently adaptive and eugenic on the plains of Africa from ca. 6 million years ago, when we split from chimpanzees, to ca. 50,000 years ago, when many of our ancestors left Africa (i.e., in the EEA or Environment of Evolutionary Adaptation), but it is now maladaptive and dysgenic and the source of our Suicidal Utopian Delusions. So, like all discussions of behavior, this book is about evolutionary strategies, selfish genes and inclusive fitness. (shrink)
This dissertation examines aspects of the interplay between computing and scientific practice. The appropriate foundational framework for such an endeavour is real computability rather than classical computability theory. This is so because the physical sciences, engineering, and applied mathematics mostly employ functions defined on continuous domains. But, contrary to the case of computation over the natural numbers, there is no universally accepted framework for real computation; rather, there are two incompatible approaches, computable analysis and the BSS model, both claiming to formalise algorithmic (...) computation and to offer foundations for scientific computing. -/- The dissertation consists of three parts. In the first part, we examine what notion of 'algorithmic computation' underlies each approach and how it is respectively formalised. It is argued that the very existence of the two rival frameworks indicates that 'algorithm' is not one unique concept in mathematics but is used in more than one way. We test this hypothesis for consistency with mathematical practice, as well as with key foundational works that aim to define the term. As a result, new connections between certain subfields of mathematics and computer science are drawn, and a distinction between 'algorithms' and 'effective procedures' is proposed. -/- In the second part, we focus on the second goal of the two rival approaches to real computation; namely, to provide foundations for scientific computing. We examine both frameworks in detail, what idealisations they employ, and how they relate to the floating-point arithmetic systems used in real computers. We explore the limitations and advantages of both frameworks, and answer questions about which one is preferable for computational modelling and which for addressing general computability issues. -/- In the third part, analog computing and its relation to analogue (physical) modelling in science are investigated.
Based on some paradigmatic cases of the former, a certain view about the nature of computation is defended, and the indispensable role of representation in it is emphasized and accounted for. We also propose a novel account of the distinction between analog and digital computation and, based on it, we compare analog computational modelling to physical modelling. It is concluded that the two practices, despite their apparent similarities, are orthogonal. (shrink)
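The gap the dissertation examines between idealised real computation and the floating-point systems of actual computers can be made concrete with a small sketch. This is a minimal illustration in standard Python, my own example rather than anything drawn from the thesis: exact rational arithmetic behaves like the idealised models, while hardware floats round, and under a chaotic map the rounding error compounds until the two trajectories part company.

```python
from fractions import Fraction

# 1) Floating point only approximates most rationals:
print(0.1 + 0.2 == 0.3)                                      # False: rounding
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True: exact

# 2) Rounding error compounds. Iterating the chaotic logistic map
#    x -> 4x(1-x) amplifies the tiny error in representing 0.1,
#    so the float trajectory soon diverges from the exact one:
x_f, x_q = 0.1, Fraction(1, 10)
for _ in range(20):
    x_f = 4 * x_f * (1 - x_f)
    x_q = 4 * x_q * (1 - x_q)
print(x_f == float(x_q))  # False: the trajectories have drifted apart

# Note the cost of exactness: the denominator of x_q roughly squares
# at each step, one reason scientific computing accepts rounding.
```

The point of the sketch is only the trade-off itself: the exact computation is faithful but its representations grow explosively, while the floating-point computation is fast but only approximately tracks the real-number function it is meant to compute.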
Quantum mechanics was reformulated as an information theory involving a generalized kind of information, namely quantum information, at the end of the last century. Quantum mechanics is the most fundamental physical theory, referring to everything that claims to be physical. Any physical entity turns out to be quantum information in the final analysis. A quantum bit is the unit of quantum information; it is a generalization of the unit of classical information, the bit, just as quantum information itself (...) is a generalization of classical information. Classical information refers to finite series or sets, while quantum information refers to infinite ones. Quantum information, like classical information, is a dimensionless quantity. Quantum information can be considered a “bridge” between the mathematical and the physical. The standard and common scientific epistemology takes for granted the gap between mathematical models and physical reality. The conception of truth as adequacy is what is able to transfer “over” that gap. One should explain how quantum information, being a continuous transition between the physical and the mathematical, may refer to truth as adequacy and thus to the usual scientific epistemology and methodology. If it is the overall substance of anything claiming to be physical, one can ask how different, dimensional physical quantities appear. Quantum information can be discussed as the counterpart of action: quantum information is what is conserved, action is what is changed, in virtue of Emmy Noether's fundamental theorems (1918). The gap between mathematical models and physical reality, needing truth as adequacy to be overcome, is replaced by the openness of choice. That openness in turn can be interpreted as the openness of the present, a different concept of truth recalling Heidegger's “unconcealment” (ἀλήθεια). Quantum information as what is conserved can be thought of as the conservation of that openness. (shrink)
Jakob Friedrich Fries (1773-1843): A Philosophy of the Exact Sciences -/- Shortened version of the article of the same name in: Tabula Rasa. Jenenser magazine for critical thinking. 6th of November 1994 edition -/- 1. Biography -/- Jakob Friedrich Fries was born on the 23rd of August, 1773 in Barby on the Elbe. Because Fries' father had little time, on account of his journeying, he gave up both his sons, of whom Jakob Friedrich was the elder, to the Herrnhut Teaching (...) Institution in Niesky in 1778. Fries attended the theological seminar in Niesky in autumn 1792, which lasted for three years. There he (secretly) began to study Kant. The reading of Kant's works led Fries, for the first time, to a deep philosophical satisfaction. His enthusiasm for Kant is to be understood against the background that a considerable measure of Kant's philosophy rests on a firm foundation of what happens in an analogous and similar manner in mathematics. -/- During this period he also read Heinrich Jacobi's novels, as well as works of the awakening classic German literature, in particular Friedrich Schiller's works. In 1795, Fries arrived at Leipzig University to study law. During his time in Leipzig he became acquainted with Fichte's philosophy. In autumn of the same year he moved to Jena to hear Fichte at first hand, but was soon disappointed. -/- During his first sojourn in Jena (1796), Fries got to know the chemist A. N. Scherer, who was much influenced by the work of the chemist A. L. Lavoisier. Fries discovered, at Scherer's suggestion, the law of stoichiometric composition. Because he felt that his work still needed some time before completion, he withdrew as a private tutor to Zofingen (in Switzerland). There Fries worked on his main critical work, and studied Newton's "Philosophiae naturalis principia mathematica". He remained a lifelong admirer of Newton, whom he praised as a perfecter of astronomy.
Fries saw the final aim of his mathematical natural philosophy in the union of Newton's Principia with Kant's philosophy. -/- With the aim of qualifying as a lecturer, he returned to Jena in 1800. By now Fries was known for his independent writings, such as "Reinhold, Fichte and Schelling" (1st edition in 1803) and "Systems of Philosophy as an Evident Science" (1804). The relationship between G. W. F. Hegel and Fries did not develop favourably. Hegel speaks of "the leader of the superficial army", and elsewhere he remarks: "he is an extremely narrow-minded bragger". On the other hand, Fries also took an unfavourable view of Hegel. He writes of the "Redundancy of the Hegelistic dialectic" (1828). In his History of Philosophy (1837/40) he writes of Hegel, amongst other things: "Your way of philosophising seems just to give expression to nonsense in the shortest possible way". In this work, Fries nevertheless appears to argue with Hegel in an objective manner, and expresses a positive attitude to his work. -/- In 1805, Fries was appointed professor for philosophy in Heidelberg. During his time in Heidelberg, he married Caroline Erdmann. He also sealed his friendships with W. M. L. de Wette and F. H. Jacobi. Jacobi was amongst the contemporaries who most impressed Fries during this period. In Heidelberg, Fries wrote, amongst other things, his three-volume main work New Critique of Reason (1807). -/- In 1816 Fries returned to Jena. When the Wartburg festival took place in 1817, Fries was among the guests and made a short speech. 1819 was the so-called "Great Year" for Fries: his wife Caroline died, and Karl Sand, a member of a student fraternity and one of Fries' former students, stabbed the author August von Kotzebue to death. Fries was punished with a ban on teaching philosophy, but still received a professorship for physics and mathematics. Only after a period of years, and under restrictions, was he again allowed to lecture on philosophy.
From now on, Fries was excluded from political influence. For the rest of his life he devoted himself once again to philosophical and natural studies. During this period, he wrote "Mathematical Natural Philosophy" (1822) and the "History of Philosophy" (1837/40). -/- Fries suffered a stroke on New Year's Day 1843, and a second stroke, on the 10th of August 1843, ended his life. -/- 2. Fries' Work -/- Fries left an extensive body of work. A look at the subject areas he worked on makes us aware of the universality of his thinking. Amongst these subjects are: psychic anthropology, psychology, pure philosophy, logic, metaphysics, ethics, politics, religious philosophy, aesthetics, natural philosophy, mathematics, physics and medical subjects, to which, e.g., the text "Regarding the optical centre in the eye together with general remarks about the theory of seeing" (1839) bears witness. With popular philosophical writings like the novel "Julius and Evagoras" (1822), or the arabesque "Longing, and a Trip to the Middle of Nowhere" (1820), he tried to make his philosophy accessible to a broader public. Anthropological considerations shape the methodical basis of his philosophy, and to this end he provides the following didactic instruction for the study of his work: "If somebody wishes to study philosophy on the basis of this guide, I would recommend that after studying natural philosophy, a strict study of logic should follow in order to peruse metaphysics and its applied teachings more rapidly, followed by a strict study of criticism, followed once again by a return to an even closer study of metaphysics and its applied teachings." -/- 3. Continuation of Fries' work through the Friesian School -/- Fries' ideas found general acceptance amongst scientists and mathematicians. A large part of the followers of the "Fries School of Thought" had a scientific or mathematical background.
Amongst them were the biologist Matthias Jakob Schleiden, the mathematics and science specialist philosopher Ernst Friedrich Apelt, the zoologist Oscar Schmidt, and the mathematician Oscar Xavier Schlömilch. Between 1847 and 1849 appeared the treatises of the "Fries School of Thought", with which the publishers aimed to pursue philosophy according to the model of the natural sciences. In the Kant-Fries philosophy, they saw the realisation of this ideal. The history of the "New Fries School of Thought" began in 1903. It was in this year that the philosopher Leonard Nelson gathered together a small discussion circle in Goettingen. Amongst the founding members of this circle were: A. Rüstow, C. Brinkmann and H. Goesch. In 1904 L. Nelson, A. Rüstow, H. Goesch and the student W. Mecklenburg travelled to Thuringia to find the missing Fries writings. In the same year, G. Hessenberg, K. Kaiser and Nelson published the first pamphlet of their first volume of the "Treatises of the Fries School of Thought, New Edition". -/- The school set out with the aim of searching for the missing Fries texts, and re-publishing them with a view to re-opening discussion of Fries' brand of philosophy. The members of the circle met regularly for discussions. Additionally, larger conferences took place, mostly during the holidays. Featuring as speakers were: Otto Apelt, Otto Berg, Paul Bernays, G. Fraenkel, K. Grelling, G. Hessenberg, A. Kronfeld, O. Meyerhof, L. Nelson and R. Otto. On the 1st of March 1913, the Jakob-Friedrich-Fries society was founded. Whilst the Fries school of thought occupied itself with the advancement of the Kant-Fries philosophy, the main task of the members of the Jakob-Friedrich-Fries society was the dissemination of the school's publications. In May/June 1914, the organisations took part in their last common conference before the gulf created by the outbreak of the First World War. Several members died during the war. Others returned disabled.
The next conference took place in 1919. A second conference followed in 1921. Nevertheless, such intensive work as had been undertaken between 1903 and 1914 was no longer possible. -/- Leonard Nelson died in October 1927. In the 1930s, the 6th and final volume of "Treatises of the Fries School of Thought, New Edition" was published. Franz Oppenheimer, Otto Meyerhof, Minna Specht and Grete Hermann were involved in its publication. -/- 4. About Mathematical Natural Philosophy -/- In 1822, Fries' "Mathematical Natural Philosophy" appeared. Fries rejects the speculative natural philosophy of his time - above all Schelling's natural philosophy. A natural study founded on speculative philosophy ends with the collection, arrangement and ordering of well-known facts. Only a mathematical natural philosophy can deliver the necessary explanatory reasoning. The basic dictum of his mathematical natural philosophy is: "All natural theories must be definable using purely mathematically determinable reasons of explanation." Fries is of the opinion that science can attain completeness only by the subordination of the empirical facts to the metaphysical categories and mathematical laws. -/- The crux of Fries' natural philosophy is the thought that mathematics must be made fertile for use by the natural sciences. However, pure mathematics displays solely empty abstraction. To be able to apply it to the sensory world, an intermediary connection is required: mathematics must be connected to metaphysics. Pure mechanics consists of three parts: a) a study of geometrical movement, which considers solely the direction of the movement; b) a study of kinematics, which considers velocity in addition; c) a study of dynamic movement, which incorporates mass and force as well as direction and velocity. -/- Fries' natural philosophy is of great interest in view of its methodology, particularly with regard to the doctrine of "leading maxims".
Fries calls these "leading maxims" "heuristic", "because they are principal rules for scientific invention". -/- Fries' philosophy found great recognition with Carl Friedrich Gauss, amongst others. Fries asked for Gauss's opinion on his work "An Attempt at a Criticism based on the Principles of the Probability Calculus" (1842). Gauss also provided his opinions on "Mathematical Natural Philosophy" (1822) and on Fries' "History of Philosophy". Gauss acknowledged Fries' philosophy and wrote in a letter to Fries: "I have always had a great predilection for philosophical speculation, and now I am all the more happy to have a reliable teacher in you in the study of the destinies of science, from the most ancient up to the latest times, as I have not always found the desired satisfaction in my own reading of the writings of some of the philosophers. In particular, the writings of several famous (maybe better, so-called famous) philosophers who have appeared since Kant have reminded me of the sieve of a goat-milker, or, to use a modern image instead of an old-fashioned one, of Münchhausen's plait, with which he pulled himself out of the water. These amateurs would not dare make such a confession before their masters; it would not happen were they to consider the case upon its merits. I have often regretted not living in your locality, so as to be able to glean much pleasurable entertainment from philosophical verbal discourse." -/- The starting point of the new adoption of Fries was Nelson's article "The critical method and the relation of psychology to philosophy" (1904). Nelson dedicates special attention to Fries' re-interpretation of Kant's deduction concept. Fries grants Kant's criticism an anthropological rationale, in that he is guided by the idea that one can examine in a psychological way which knowledge we have "a priori" and how it arises, so that we can recognise our own "a priori" knowledge in an empirical way.
Fries understands deduction to mean an "awareness residing darkly in us, and only open to basic metaphysical principles through conscious reflection". -/- Nelson has pointed to an analogy between Fries' deduction and modern metamathematics. Just as the anthropological deduction makes the content of the critical investigation into a metaphysical object of study, so the content of mathematics becomes, in David Hilbert's view, the object of metamathematics. (shrink)
Explications of the reconstruction of Leibniz’s metaphysics that Deleuze undertakes in 'The Fold: Leibniz and the Baroque' focus predominantly on the role of the infinitesimal calculus developed by Leibniz.1 While not underestimating the importance of the infinitesimal calculus and the law of continuity as reflected in the calculus of infinite series to any understanding of Leibniz’s metaphysics and to Deleuze’s reconstruction of it in The Fold, what I propose to examine in this paper is the role played by (...) other developments in mathematics that Deleuze draws upon, including those made by a number of Leibniz’s near contemporaries – the projective geometry that has its roots in the work of Desargues (1591–1661) and the ‘proto-topology’ that appears in the work of Dürer (1471–1528) – and a number of the subsequent developments in these fields of mathematics. Deleuze brings this elaborate conjunction of material together in order to set up a mathematical idealization of the system that he considers to be implicit in Leibniz’s work. The result is a thoroughly mathematical explication of the structure of Leibniz’s metaphysics. What is provided in this paper is an exposition of the very mathematical underpinnings of this Deleuzian account of the structure of Leibniz’s metaphysics, which, I maintain, subtends the entire text of The Fold. (shrink)
The late scholastics, from the fourteenth to the seventeenth centuries, contributed to many fields of knowledge other than philosophy. They developed a method of conceptual analysis that was very productive in those disciplines in which theory is relatively more important than empirical results. That includes mathematics, where the scholastics developed the analysis of continuous motion, which fed into the calculus, and the theory of risk and probability. The method came to the fore especially in the social sciences. In legal theory (...) they developed, for example, the ethical analyses of the conditions of validity of contracts, and natural rights theory. In political theory, they introduced constitutionalism and the thought experiment of a “state of nature”. Their contributions to economics included concepts still regarded as basic, such as demand, capital, labour, and scarcity. Faculty psychology and semiotics are other areas of significance. In such disciplines, later developments rely crucially on scholastic concepts and vocabulary. (shrink)
The text is a continuation of the article of the same name published in the previous issue of Philosophical Alternatives. The philosophical interpretations of the Kochen-Specker theorem (1967) are considered. Einstein's principle regarding the "consubstantiality of inertia and gravity" (1918) allows a parallel between descriptions of a physical micro-entity in relation to the macro-apparatus, on the one hand, and of physical macro-entities in relation to astronomical mega-entities on the other. The Bohmian interpretation (1952) of quantum mechanics proposes (...) that all quantum systems be interpreted as dissipative ones and that the theorem be thus understood. The conclusion is that the continual representation of a system, by force or (gravitational) field between parts interacting by means of it, is equivalent to the mutual entanglement of those parts if the representation is discrete. Gravity (force field) and entanglement are two different, correspondingly continual and discrete, images of a single common essence. General relativity can be interpreted as a superluminal generalization of special relativity. The postulate exists of an alleged obligatory difference between a model and reality in science and philosophy. It can also be deduced by interpreting a corollary of the theorem. On the other hand, quantum mechanics, on the basis of this theorem and of von Neumann's (1932), introduces the option that a model be entirely identified with the modeled reality and, therefore, that absolute reality be recognized: this is a non-standard hypothesis in the epistemology of science. Thus, true reality begins to be understood mathematically, i.e. in a Pythagorean manner, through its identification with its mathematical model. A few linked problems are highlighted: the role of the axiom of choice for correctly interpreting the theorem; whether the theorem can be considered an axiom; whether the theorem can be considered equivalent to the negation of the axiom. (shrink)
Guided by key insights of the four great philosophers mentioned in the title, here, in review of and expanding on our earlier work (Burchard, 2005, 2011), we present an exposition of the role played by language, & in the broader sense λόγος, the Logos, in how the CNS, the brain, is running the human being. Evolution by neural Darwinism has been forcing the linguistic nature of mind, enabling it to overcome & exploit the cognitive gap between an animal and its (...) world by recognizing environmental structures. Our work was greatly influenced by Heidegger’s lecture notes on metaphysics (Heidegger, 1935). We found agreement with recent progress in neuroscience, but also with the mathematical foundations of language theory, equating Logos with the mathematical concept of structure. The mystery of perception across the gap is analyzed as radiation and molecules impinging on sensory neurons that carry linguistic information about gross environmental structures, and only remotely about the physical reality of elementary particles. The most important logical brain function is Ego or Self, guiding the workings of the brain as a logos machine. Ego or Self operates from neurons in frontopolar cortex with global receptive fields. The logos machine can function only by availing itself of global context, its internally stored noumenal cosmos NK, and the categorical-conceptual apparatus CCA, updated continually through the neural default mode network (Raichle, 2005). In the Transcendental Deduction, Immanuel Kant discovered that Ego or Self is responsible for conscious control in perception, relying on concepts & categories for a fitting percept to be incorporated into NK. The entire CNS runs as a “movie-in-the-brain” (Parvizi & Damasio, 2001), at peak speed processing simultaneously, in a series of cortical centers, a stack of up to twelve frames in gamma rhythm at 25 ms intervals.
We equate global context, or NK, with our human world, Heidegger’s Dasein, being-in-the-world, and are able to demonstrate that the great philosopher in EM parallels neuroscience concerning the human mind. (shrink)
Quantity is the first category that Aristotle lists after substance. It has extraordinary epistemological clarity: "2+2=4" is the model of a self-evident and universally known truth. Continuous quantities such as the ratio of circumference to diameter of a circle are as clearly known as discrete ones. The theory that mathematics was "the science of quantity" was once the leading philosophy of mathematics. The article looks at puzzles in the classification and epistemology of quantity.
Few metaphors in biology are more enduring than the idea of Adaptive Landscapes, originally proposed by Sewall Wright (1932) as a way to visually present, to an audience of typically non-mathematically savvy biologists, his ideas about the relative role of natural selection and genetic drift in the course of evolution. The metaphor, however, was born troubled, not the least reason for which is the fact that Wright presented different diagrams in his original paper that simply cannot refer (...) to the same concept and are therefore hard to reconcile with each other (Pigliucci 2008). For instance, in some usages, the landscape’s non-fitness axes represent combinations of individual genotypes (which cannot sensibly be aligned on a linear axis, and accordingly were drawn by Wright as polyhedrons of increasing dimensionality). In other usages, however, the points on the diagram represent allele or genotypic frequencies, and so are actually populations, not individuals (and these can indeed be coherently represented along continuous axes). (shrink)
The introductory personal remarks refer to my motivations for choosing research projects, and for moving from physics to molecular biology and then to development, with Hydra as a model system. Historically, Trembley’s discovery of Hydra regeneration in 1744 was the beginning of developmental biology as we understand it, with passionate debates about preformation versus de novo generation, mechanisms versus organisms. In fact, seemingly conflicting bottom-up and top-down concepts are both required in combination to understand development. In modern terms, this means (...) analysing the molecules involved, as well as searching for physical principles underlying development within systems of molecules, cells and tissues. During the last decade, molecular biology has provided surprising and impressive evidence that the same types of molecules and molecular systems are involved in pattern formation in a wide range of organisms, including coelenterates like Hydra, and thus appear to have been “invented” early in evolution. Likewise, the features of certain systems, especially those of developmental regulation, are found in many different organisms. This includes the generation of spatial structures by the interplay of self-enhancing activation and “lateral” inhibitory effects of wider range, which is a main topic of my essay. Hydra regeneration is a particularly clear model for the formation of defined patterns within initially near-uniform tissues. In conclusion, this essay emphasizes the analysis of development in terms of physical laws, including the application of mathematics, and insists that Hydra was, and will continue to be, a rewarding model for understanding general features of embryogenesis and regeneration. (shrink)