There exists a huge number of numerical methods that iteratively construct approximations to the solution y(x) of an ordinary differential equation (ODE) y′(x) = f(x,y) starting from an initial value y_0 = y(x_0) and using a finite approximation step h that influences the accuracy of the obtained approximation. In this paper, a new framework for solving ODEs is presented for a new kind of computer, the Infinity Computer (it has been patented and its working prototype exists). The new computer is able to work numerically with finite, infinite, and infinitesimal numbers, giving the possibility to use different infinitesimals numerically and, in particular, to take advantage of infinitesimal values of h. To show the potential of the new framework, a number of results are established. It is proved that the Infinity Computer is able to calculate derivatives of the solution y(x) and to reconstruct its Taylor expansion to a desired order numerically, without finding the respective derivatives analytically (or symbolically) by successive differentiation of the ODE, as is usually done when the Taylor method is applied. Methods using approximations of derivatives obtained thanks to infinitesimals are discussed, and a technique for automatic control of rounding errors is introduced. Numerical examples are given.
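The derivative-reconstruction idea can be mimicked on an ordinary computer with truncated power-series arithmetic, a finite stand-in for the infinitesimal arithmetic described above. The sketch below is purely illustrative: the function names are hypothetical and this is not the Infinity Computer's actual interface.

```python
# Hypothetical sketch: recovering Taylor coefficients of the solution of
# y' = f(x, y), y(x0) = y0, by arithmetic on truncated power series.
# This stands in for infinitesimal arithmetic; it is NOT the Infinity
# Computer's API, just a classical analogue of the technique.

def taylor_coeffs(f_series, y0, n):
    """Coefficients c[0..n-1] of y(x0 + h) = sum c[k] h^k for y' = f(x, y).

    f_series(x, y, k) must return the first k Taylor coefficients of
    f(x0 + h, y(x0 + h)), given the series x of (x - x0) and the series y."""
    c = [y0]
    for k in range(1, n):
        x = [0.0, 1.0] + [0.0] * (k - 1)   # series of (x - x0)
        F = f_series(x, c, k)              # series of f along the current y
        c.append(F[k - 1] / k)             # from y' = f: c[k] = F[k-1] / k
    return c

# Example: y' = y, y(0) = 1 gives the exponential series 1, 1, 1/2, 1/6, ...
coeffs = taylor_coeffs(lambda x, y, k: (y + [0.0] * k)[:k], 1.0, 6)
```

The recurrence c[k] = F[k-1]/k is just the Taylor method's successive differentiation carried out numerically, which is the point the abstract makes.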
The global/local contrast is ubiquitous in mathematics. This paper explains it with straightforward examples. It is possible to build a circular staircase that is rising at any point (locally), but impossible to build one that rises at all points and comes back to where it started (a global restriction). Differential equations describe the local structure of a process; their solution describes the global structure that results. The interplay between global and local structure is one of the great themes of mathematics, but rarely discussed explicitly.
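The local-to-global relation described here can be made concrete with the simplest numerical scheme: Euler's method applies only the local rule y' = f(x, y) at each step, and the global shape of the solution emerges from accumulating those steps. A minimal sketch:

```python
# The local rule y' = f(x, y) is all Euler's method ever sees; the global
# structure of the solution emerges from accumulating local steps.
import math

def euler(f, x0, y0, h, steps):
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)   # local information only: the slope at (x, y)
        x += h
    return y

# The local rule y' = y produces, globally, the exponential function.
approx = euler(lambda x, y: y, 0.0, 1.0, 1e-4, 10_000)
error = abs(approx - math.exp(1.0))   # small: y(1) is close to e
```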
In the present paper I argue that the formalism of Newtonian mechanics stems directly from a general principle, to be called the principle of microlevel reducibility, which physical systems obey in the realm of classical physics. This principle assumes, first, that all the properties of physical systems must be determined by their states at the current moment of time; in slogan form, "only the present matters to physics." Second, it postulates that any physical system is nothing but an ensemble of structureless particles arranged in some way, whose interaction obeys the superposition principle. I substantiate this statement and demonstrate directly how the formalism of differential equations, the notion of forces in Newtonian mechanics, the concept of phase space and initial conditions, the principle of least action, etc. result from the principle of microlevel reducibility. The philosophical concept of thick presentism and the introduction of two-dimensional time (physical time and meta-time, which are mutually independent on infinitesimal scales) are the pivot points in these constructions.
I examine to what extent accounts of mechanisms based on formal interventionist theories of causality can adequately represent biological mechanisms with complex dynamics. Using a differential equation model for a circadian clock mechanism as an example, I first show that there exists an iterative solution that can be interpreted as a structural causal model. Thus, in principle it is possible to integrate causal difference-making information with dynamical information. However, the differential equation model itself lacks the right modularity properties for a full integration. A formal mechanistic model will therefore either have to leave out non-causal or causal explanatory relations.
While philosophers of science debate General Relativity, mathematical physicists do not question it; hence there is a conflict. From the theoretical point of view, "the question of precisely what Einstein discovered remains unanswered, for we have no consensus over the exact nature of the theory's foundations. Is this the theory that extends the relativity of motion from inertial motion to accelerated motion, as Einstein contended? Or is it just a theory that treats gravitation geometrically in the spacetime setting?" "The voices of dissent proclaim that Einstein was mistaken over the fundamental ideas of his own theory and that their basic principles are simply incompatible with this theory. Many newer texts make no mention of the principles Einstein listed as fundamental to his theory; they appear as neither axiom nor theorem. At best, they are recalled as ideas of purely historical importance in the theory's formation. The very name General Relativity is now routinely condemned as a misnomer and its use often zealously avoided in favour of, say, Einstein's theory of gravitation. What has complicated an easy resolution of the debate are the alterations of Einstein's own position on the foundations of his theory" (Norton, 1993). On the other hand, from the mathematical point of view, "General Relativity had been formulated as a messy set of partial differential equations in a single coordinate system. People were so pleased when they found a solution that they didn't care that it probably had no physical significance" (Hawking and Penrose, 1996). So, for a time, the declaration of quantum theorists, "I take the positivist viewpoint that a physical theory is just a mathematical model and that it is meaningless to ask whether it corresponds to reality.
All that one can ask is that its predictions should be in agreement with observation" (Hawking and Penrose, 1996), seemed to solve the problem. But results recently achieved with the help of the tightly and collectively synchronized clocks in orbit frontally contradict fundamental assumptions of the theory of Relativity; these observations disagree with the predictions of the theory of Relativity (Hatch, 2004a, 2004b, 2007). The mathematical model was developed first by Grossmann, who presented it in 1913 as the mathematical part of the Entwurf theory, still referred to a curved Minkowski spacetime. Einstein completed the mathematical model in 1915, formulated for Riemann's spacetimes. In this paper, we argue that of General Relativity only the mathematical model currently remains, darkened by the results of Hatch, and, of course, we conclude that Einstein's gravity theory does not exist.
In contemporary mathematics, a Colombeau algebra of Colombeau generalized functions is an algebra of a certain kind containing the space of Schwartz distributions. While in classical distribution theory a general multiplication of distributions is not possible, Colombeau algebras provide a rigorous framework for it. Remark 1.1.1. Such a multiplication of distributions was for a long time mistakenly believed to be impossible because of Schwartz's impossibility result, which basically states that there cannot be a differential algebra containing the space of distributions and preserving the product of continuous functions. However, if one only wants to preserve the product of smooth functions instead, such a construction becomes possible, as demonstrated first by J. F. Colombeau [1], [2]. As a mathematical tool, Colombeau algebras can be said to combine a treatment of singularities, differentiation and nonlinear operations in one framework, lifting the limitations of distribution theory. These algebras have found numerous applications in the fields of partial differential equations, geophysics, microlocal analysis and general relativity so far.
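The obstruction behind the remark can be illustrated by a well-known heuristic argument (reconstructed here from standard folklore, not taken from the cited texts). Suppose an associative differential algebra extended the distributions, preserved the products H^n = H satisfied by the Heaviside step function H as a locally integrable function, and obeyed the Leibniz rule, with H' = δ:

```latex
\begin{align*}
  H^2 = H \;&\Rightarrow\; 2HH' = H' \;\Rightarrow\; 2H\delta = \delta,\\
  H^3 = H \;&\Rightarrow\; 3H^2H' = H' \;\Rightarrow\; 3H\delta = \delta
  \quad (\text{using } H^2 = H).
\end{align*}
```

Subtracting the two conclusions gives Hδ = 0, hence δ = 2Hδ = 0, a contradiction. Colombeau's construction escapes this by preserving only the product of smooth functions, of which H is not one.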
Attempts to ‘naturalize’ phenomenology challenge both traditional phenomenology and traditional approaches to cognitive science. They challenge Edmund Husserl’s rejection of naturalism and his attempt to establish phenomenology as a foundational transcendental discipline, and they challenge efforts to explain cognition through mainstream science. While appearing to be a retreat from the bold claims made for phenomenology, it is really its triumph. Naturalized phenomenology is spearheading a successful challenge to the heritage of Cartesian dualism. This converges with the reaction against Cartesian thought within science itself. Descartes divided the universe between res cogitans, thinking substances, and res extensa, the mechanical world. The latter won with Newton and we have, in most of objective science since, literally lost our mind, hence our humanity. Despite Darwin, biologists remain children of Newton, and dream of a grand theory that is epistemologically complete and would allow lawful entailment of the evolution of the biosphere. This dream is no longer tenable. We now have to recognize that science and scientists are within and part of the world we are striving to comprehend, as proponents of endophysics have argued, and that physics, biology and mathematics have to be reconceived accordingly. Interpreting quantum mechanics from this perspective is shown to both illuminate conscious experience and reveal new paths for its further development. In biology we must now justify the use of the word “function”. As we shall see, we cannot prestate the ever new biological functions that arise and constitute the very phase space of evolution. Hence, we cannot mathematize the detailed becoming of the biosphere, nor write differential equations for functional variables we do not know ahead of time, nor integrate those equations, so no laws “entail” evolution. The dream of a grand theory fails.
In place of entailing laws, a post-entailing-law explanatory framework is proposed, in which Actuals arise in evolution that constitute new boundary conditions that are enabling constraints creating new, typically unprestatable, Adjacent Possible opportunities for further evolution, in which new Actuals arise, in a persistent becoming. Evolution flows into a typically unprestatable succession of Adjacent Possibles. Given the concept of function, the concept of the functional closure of an organism making a living in its world becomes central. Implications for patterns in evolution include historical reconstruction, and statistical laws such as the distribution of extinction events, or species per genus, and the use of formal-cause, not efficient-cause, laws.
George Boole emerged from the British tradition of the “New Analytic”, known for the view that the laws of logic are laws of thought. Logicians in the New Analytic tradition were influenced by the work of Immanuel Kant, and by the German logicians Wilhelm Traugott Krug and Wilhelm Esser, among others. In his 1854 work An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities, Boole argues that the laws of thought acquire normative force when constrained to mathematical reasoning. Boole’s motivation is, first, to address issues in the foundations of mathematics, including the relationship between arithmetic and algebra, and the study and application of differential equations (Durand-Richard, van Evra, Panteki). Second, Boole intended to derive the laws of logic from the laws of the operation of the human mind, and to show that these laws were valid of algebra and of logic both, when applied to a restricted domain. Boole’s thorough and flexible work in these areas influenced the development of model theory (see Hodges, forthcoming), and has much in common with contemporary inferentialist approaches to logic (found in, e.g., Peregrin and Resnik).
The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
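The discrete/continuous interaction mentioned here, a computer model of a continuous system, can be shown in miniature: a continuous process (the heat equation u_t = u_xx) simulated by a purely discrete update rule on a grid. The scheme below is the standard explicit finite-difference method; the specific grid and initial data are illustrative.

```python
# Discrete simulation of a continuous process: the heat equation u_t = u_xx
# approximated by explicit finite differences on a grid with fixed ends.

def heat_step(u, r):
    """One explicit step; r = dt/dx^2 (the scheme is stable for r <= 1/2)."""
    return ([u[0]] +
            [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
             for i in range(1, len(u) - 1)] +
            [u[-1]])

u = [0.0] * 10 + [1.0] + [0.0] * 10   # a concentrated bump of heat
for _ in range(50):
    u = heat_step(u, 0.25)
# the discrete updates reproduce continuous diffusion: the bump spreads out,
# stays symmetric, and its peak decays
```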
It has been argued that the fundamental laws of physics do not face a ‘problem of provisos’ equivalent to that found in other scientific disciplines (Earman, Roberts and Smith 2002), and that there is only the appearance of exceptions to physical laws if they are confused with differential equations of evolution type (Smith 2002). In this paper I argue that even if this is true, fundamental laws in physics still pose a major challenge to standard Humean approaches to lawhood, as they are not in any obvious sense about regularities in behaviour. A Humean approach to physical laws with exceptions is possible, however, if we adopt a view of laws that takes them to be the algorithms in the algorithmic compressions of empirical data. When this is supplemented with a distinction between lossy and lossless compression, we can explain exceptions in terms of compression artefacts present in the application of the lossy laws.
Spatially situated opinions that can be held with different degrees of conviction lead to spatiotemporal patterns such as clustering (homophily), polarization, and deadlock. Our goal is to understand how sensitive these patterns are to changes in the local nature of interactions. We introduce two different mixing mechanisms, spatial relocation and nonlocal interaction (“telephoning”), to an earlier fully spatial model (no mixing). Interestingly, the mechanisms that create deadlock in the fully spatial model have the opposite effect when there is a sufficient amount of mixing. With telephoning, not only are polarization and deadlock broken up, but consensus is hastened. The effects of mixing by relocation are even more pronounced. Further insight into these dynamics is obtained for selected parameter regimes via comparison to the mean-field differential equations.
We show that in the Maxwell–Lorentz theory of classical electrodynamics most initial values for fields and particles lead to an ill-defined dynamics, as they exhibit singularities or discontinuities along light-cones. This phenomenon suggests that the Maxwell equations and the Lorentz force law ought rather to be read as a system of delay differential equations, that is, differential equations that relate a function and its derivatives at different times. This mathematical reformulation, however, leads to physical and philosophical consequences for the ontological status of the electromagnetic field. In particular, fields cannot be taken as independent degrees of freedom, which suggests that one should not add them to the ontology.
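What "delay differential equation" means can be shown with a toy example unrelated to electrodynamics: the derivative at time t depends on the state at an earlier time t − τ. The sketch below solves y'(t) = −y(t − 1) with history y(t) = 1 for t ≤ 0 by Euler stepping, looking up the delayed value in the stored trajectory; the equation and step size are illustrative choices, not the Maxwell–Lorentz system.

```python
# Toy delay differential equation: y'(t) = -y(t - 1), history y = 1 for t <= 0.
# The derivative depends on the state one time unit in the past, so the
# solver must keep the whole trajectory, not just the current state.

def solve_dde(T, h):
    delay_steps = round(1.0 / h)
    ys = [1.0]                        # y(0); history y = 1 for t <= 0
    for n in range(round(T / h)):
        delayed = ys[n - delay_steps] if n >= delay_steps else 1.0
        ys.append(ys[-1] - h * delayed)
    return ys

ys = solve_dde(2.0, 0.001)
# On [0, 1] the delayed value is the constant history, so y(t) = 1 - t
# exactly; past t = 1 the solution starts feeding back on itself.
```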
In this article, it is argued that the Gibbs-Liouville theorem is a mathematical representation of the statement that closed classical systems evolve deterministically. From the perspective of an observer of the system, whose knowledge about the degrees of freedom of the system is complete, the statement of deterministic evolution is equivalent to the notion that the physical distinctions between the possible states of the system (in other words, the information possessed by the observer about the system) are never lost. Thus, it is proposed that the Gibbs-Liouville theorem is a statement about the dynamical evolution of a closed classical system valid in such situations where information about the system is conserved in time. Furthermore, in this article it is shown that the Hamilton equations and the Hamilton principle on phase space follow directly from the differential representation of the Gibbs-Liouville theorem, i.e. from the statement that the divergence of the Hamiltonian phase-flow velocity vanishes. Thus, considering that the Lagrangian and Hamiltonian formulations of classical mechanics are related via the Legendre transformation, it is obtained that these two standard formulations are both logical consequences of the statement of deterministic evolution, or, equivalently, of information conservation.
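The divergence claim can be checked numerically: the phase-flow velocity is v = (∂H/∂p, −∂H/∂q), so div v = H_qp − H_pq, which vanishes for any smooth H by equality of mixed partials. The Hamiltonian below is an arbitrary illustrative choice.

```python
# Numerical check that the Hamiltonian phase-flow velocity
# v = (dH/dp, -dH/dq) is divergence-free: div v = H_qp - H_pq = 0.

def div_flow(H, q, p, h=1e-4):
    """Central-difference estimate of div(dH/dp, -dH/dq) at (q, p)."""
    dHdp = lambda q_, p_: (H(q_, p_ + h) - H(q_, p_ - h)) / (2 * h)
    dHdq = lambda q_, p_: (H(q_ + h, p_) - H(q_ - h, p_)) / (2 * h)
    d_q_of_dHdp = (dHdp(q + h, p) - dHdp(q - h, p)) / (2 * h)
    d_p_of_dHdq = (dHdq(q, p + h) - dHdq(q, p - h)) / (2 * h)
    return d_q_of_dHdp - d_p_of_dHdq

H = lambda q, p: 0.5 * p ** 2 + q ** 4 + q * p ** 3   # any smooth H works
residual = abs(div_flow(H, q=0.7, p=-1.3))            # ~ 0 up to rounding
```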
Let us start with some general definitions of the concept of complexity. We take a complex system to be one composed of a large number of parts, and whose properties are not fully explained by an understanding of its component parts. Studies of complex systems recognized the importance of “wholeness”, defined as problems of organization (and of regulation), phenomena not resolvable into local events, dynamic interactions manifest in the difference of behaviour of parts when isolated or in a higher configuration, etc.; in short, systems of various orders (or levels) not understandable by investigation of their respective parts in isolation. In a complex system it is essential to distinguish between ‘global’ and ‘local’ properties. Theoretical physicists in the last two decades have discovered that the collective behaviour of a macro-system, i.e. a system composed of many objects, does not change qualitatively when the behaviour of single components is modified slightly. Conversely, it has also been found that the behaviour of single components does change when the overall behaviour of the system is modified. There are many universality classes which describe the collective behaviour of the system, and each class has its own characteristics; the universality classes do not change when we perturb the system. The most interesting and rewarding work consists in finding these universality classes and in spelling out their properties. This conception has been followed in studies done in the last twenty years on second-order phase transitions. The objective, which has been mostly achieved, was to classify all possible types of phase transitions in different universality classes and to compute the parameters that control the behaviour of the system near the transition (or critical, or bifurcation) point as a function of the universality class. This point of view is not very different from the one expressed by Thom in the introduction of Structural Stability and Morphogenesis (1975).
It differs from Thom’s program because there is no a priori idea of the mathematical framework which should be used. Indeed, Thom considers only a restricted class of models (ordinary differential equations in low-dimensional spaces), while we do not have any prejudice regarding which models should be accepted. One of the most interesting and surprising results obtained by studying complex systems is the possibility of classifying the configurations of the system taxonomically. It is well known that a well-founded taxonomy is possible only if the objects we want to classify have some unique properties, i.e. species may be introduced in an objective way only if it is impossible to go continuously from one species to another; in a more mathematical language, we say that the objects must have the property of ultrametricity. More precisely, it was discovered that there are conditions under which a class of complex systems may only exist in configurations that have the ultrametricity property, and consequently they can be classified in a hierarchical way. Indeed, it has been found that this ultrametricity property is shared by the near-optimal solutions of many optimization problems of complex functions, i.e. corrugated landscapes in Kauffman’s language. These results are derived from the study of spin glass models, but they have wider implications. It is possible that the kind of structures that arise in these cases is present in many other apparently unrelated problems. Before going on with our considerations, we have to keep in mind two main complementary ideas about complexity. (i) According to the prevalent and usual point of view, the essence of complex systems lies in the emergence of complex structures from the non-linear interaction of many simple elements that obey simple rules. Typically, these rules consist of 0–1 alternatives selected in response to the input received, as in many prototypes like cellular automata, Boolean networks, spin systems, etc.
Quite intricate patterns and structures can occur in such systems. However, it must also be said that these are toy systems, and the systems occurring in reality rather consist of elements that individually are quite complex themselves. (ii) This brings in a new aspect that seems essential and indispensable to the emergence and functioning of complex systems, namely the coordination of individual agents or elements that themselves are complex at their own scale of operation. This coordination dramatically reduces the degrees of freedom of the participating agents. Even the constituents of molecules, i.e. the atoms, are rather complicated conglomerations of subatomic particles, perhaps ultimately excitations of patterns of superstrings. Genes, the elementary biochemical coding units, are very complex macromolecular strings, as are the metabolic units, the proteins. Neurons, the basic elements of cognitive networks, are themselves cells. In these and in other complex systems, it is an important feature that the potential complexity of the behaviour of the individual agents gets dramatically simplified through the global interactions within the system. The individual degrees of freedom are drastically reduced, or, in a more formal terminology, the factual space of the system is much smaller than the product of the state spaces of the individual elements. That is one key aspect. The other is that on this basis, that is, utilizing the coordination between the activities of its members, the system becomes able to develop and express a coherent structure at a higher level, that is, an emergent behaviour (and emergent properties) that transcends what each element is individually capable of.
Filtration combustion is described by Laplacian growth without surface tension. These equations have elegant analytical solutions that replace the complex integro-differential motion equations by simple differential equations of pole motion in a complex plane. The main problem with such a solution is the existence of finite-time singularities. To prevent such singularities, nonzero surface tension is usually used. However, nonzero surface tension does not exist in filtration combustion, and this destroys the analytical solutions. A more elegant approach exists for solving the problem. First, we can introduce a small amount of pole noise to the system. Second, for regularisation of the problem, we throw out all new poles that could produce a finite-time singularity. It can be strictly proved that the asymptotic solution for such a system is a single finger. Moreover, qualitative considerations demonstrate that a finger of 1/2 of the channel width is statistically stable. Therefore, all properties of such a solution are exactly the same as those of the solution with nonzero surface tension under numerical noise. The solution of the Saffman–Taylor problem without surface tension is similar to the solution of the equation of cellular flames in the case of the combustion of gas mixtures.
In 1916, Johannes Droste independently found an exact (vacuum) solution of Einstein's (gravitational) field equations in empty space. Droste's solution is quasi-comparable to Schwarzschild's. Droste published his paper entitled “The field of a single centre in Einstein's theory of gravitation, and the motion of a particle in that field”. The paper was communicated (at the meeting of May 27, 1916) by Prof. H. A. Lorentz, and published in Proceedings of the Royal Netherlands Academy of Arts and Science, 19 (1): 197–215 (1917). In the present article, Droste's solution is scrutinized and proven to be invalid, purely and simply because the procedure used by Droste is mathematically questionable: he systematically, deliberately, and without any justification removed the constant coefficient ‘2’ from the differential term (v'w') in Eq. (6) and added the differential term (wv'') to the same Eq. (6) in order to obtain Eq. (7), which was and is his principal objective, that is, the desired solution. Consequently, Eqs. (6, 7) had clearly been falsified.
We evaluated the reliability, validity, and differential item functioning (DIF) of a shorter version of the Defining Issues Test-1 (DIT-1), the behavioral DIT (bDIT), measuring the development of moral reasoning. 353 college students (81 males, 271 females, 1 not reported; age M = 18.64 years, SD = 1.20 years) who were taking introductory psychology classes at a public university in a suburban area of the Southern United States participated in the present study. First, we examined the reliability of the bDIT using Cronbach’s α and its concurrent validity with the original DIT-1 using disattenuated correlation. Second, we compared the test duration between the two measures. Third, we tested the DIF of each question between males and females. Findings indicated, first, that the bDIT showed acceptable reliability and good concurrent validity. Second, the test duration could be significantly shortened by employing the bDIT. Third, DIF results indicated that the bDIT items did not favour either gender. Practical implications of the present study based on the reported findings are discussed.
Recent accounts of actual causation are stated in terms of extended causal models. These extended causal models contain two elements representing two seemingly distinct modalities. The first element is the set of structural equations, which represent the causal mechanisms of the model, just as in ordinary causal models. The second element is a ranking function, which represents normality or typicality. The aim of this paper is to show that these two modalities can be unified. I do so by formulating two constraints under which extended causal models with their two modalities can be subsumed under models which contain just one modality. These two constraints will be formally precise versions of Lewis’s “system of weights or priorities” governing overall similarity between possible worlds.
Prior Analytics by the Greek philosopher Aristotle (384–322 BCE) and Laws of Thought by the English mathematician George Boole (1815–1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later, intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’ (tautologies and laws such as excluded middle and non-contradiction) Boole became the founder of logic as formal ontology.
The existence of singularities alerts that one of the highest priorities of a centennial perspective on general relativity should be a careful re-thinking of the validity domain of Einstein’s field equations. We address the problem of constructing distinguishable extensions of the smooth spacetime manifold model, which can incorporate singularities, while retaining the form of the field equations. The sheaf-theoretic formulation of this problem is tantamount to extending the algebra sheaf of smooth functions to a distribution-like algebra sheaf in which the former may be embedded, satisfying the pertinent cohomological conditions required for the coordinatization of all of the tensorial physical quantities, such that the form of the field equations is preserved. We present in detail the construction of these distribution-like algebra sheaves in terms of residue classes of sequences of smooth functions modulo the information of singular loci encoded in suitable ideals. Finally, we consider the application of these distribution-like solution sheaves in geometrodynamics by modeling topologically-circular boundaries of singular loci in three-dimensional space in terms of topological links. It turns out that the Borromean link represents higher order wormhole solutions.
The derivative is a basic concept of differential calculus. However, if we calculate the derivative as change in distance over change in time, the result at any instant is 0/0, which seems meaningless. Hence, Newton and Leibniz used the limit to determine the derivative. Their method is valid in practice, but it is not easy to intuitively accept. Thus, this article describes the novel method of differential calculus based on the double contradiction, which is easier to accept intuitively. Next, the geometrical meaning of the double contradiction is considered as follows. A tangent at a point on a convex curve is iterated. Then, the slope of the tangent at the point is sandwiched by two kinds of lines. The first kind of line crosses the curve at the original point and a point to the right of it. The second kind of line crosses the curve at the original point and a point to the left of it. Then, the double contradiction can be applied, and the slope of the tangent is determined as a single value. Finally, the meaning of this method for the foundation of mathematics is considered. We reflect on Dehaene’s notion that the foundation of mathematics is based on the intuitions, which evolve independently. Hence, there may be gaps between intuitions. In fact, the Ancient Greeks identified inconsistency between arithmetic and geometry. However, Eudoxus developed the theory of proportion, which is equivalent to the Dedekind Cut. This allows the iteration of an irrational number by rational numbers as precisely as desired. Simultaneously, we can define the irrational number by the double contradiction, although its existence is not guaranteed. Further, an area of a curved figure is iterated and defined by rectilinear figures using the double contradiction.
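The sandwiching of the tangent slope described above can be seen numerically: for a convex curve, secants through the point and a point to its right overestimate the tangent slope, secants through a point to its left underestimate it, and both close in on a single value. This is only an illustrative computation, not the paper's logical "double contradiction" argument itself.

```python
# For a convex f, right secants overestimate and left secants underestimate
# the tangent slope, squeezing it to a single value (here f(x) = x^2 at x = 1,
# where the tangent slope is 2).

def secant(f, x, h):
    """Slope of the line through (x, f(x)) and (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

f = lambda x: x * x
right = [secant(f, 1.0, 10 ** -k) for k in range(1, 6)]   # approach from above
left = [secant(f, 1.0, -(10 ** -k)) for k in range(1, 6)]  # approach from below
# right stays > 2 and decreases toward 2; left stays < 2 and increases toward 2
```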
An analysis of the classical-quantum correspondence shows that it needs to identify a preferred class of coordinate systems, which defines a torsionless connection. One such class is that of the locally-geodesic systems, corresponding to the Levi-Civita connection. Another class, thus another connection, emerges if a preferred reference frame is available. From the classical Hamiltonian that rules geodesic motion, the correspondence yields two distinct Klein-Gordon equations and two distinct Dirac-type equations in a general metric, depending on the connection used. Each of these two equations is generally covariant, transforms the wave function as a four-vector, and differs from the gravitational Dirac equation of Dirac, Fock, and Weyl (the DFW equation). One obeys the equivalence principle in an often-accepted sense, whereas the DFW equation obeys that principle only in an extended sense.
Geometry was a main source of inspiration for Carnap’s conventionalism. Taking Poincaré as his witness, Carnap asserted in his dissertation Der Raum (Carnap 1922) that the metrical structure of space is conventional while the underlying topological structure describes "objective" facts. With only minor modifications he stuck to this account throughout his life. The aim of this paper is to disprove Carnap's contention by invoking some classical theorems of differential topology. By this means his metrical conventionalism turns out to be indefensible for mathematical reasons. This implies that the relation between topology and geometry cannot be conceptualized as analogous to the relation between the meaning of a proposition and its expression in some language, as logical empiricists used to say.
As is well known, Einstein was dissatisfied with the foundation of quantum theory and sought to find a basis for it that would have satisfied his need for a causal explanation. In this paper this abandoned idea is investigated. It is found that it is mathematically not dead at all. More particularly: a quantum mechanical U(1) gauge invariant Dirac equation can be derived from Einstein's gravity field equations. We ask ourselves what this means for physics, the history of physics, and the current discussion on foundations.
In Against the Day, Pynchon is obsessed with twoness, double worlds, and dual realities, and like Deleuze’s concept of repetition, these duplications and twinships are not merely repetitions of the same; rather, they allow for creativity, reinvention, and becoming. Pynchon’s duplication of fictional and spectral characters intends to critique the notion of identity, as does the Deleuzian concept of repetition. Not attached to the representational concept of identity as the recurrence of the same, Pynchon’s duplications decenter the transcendental concept in favor of a perpetual becoming and reproduce difference and singularity. Like Deleuze, Pynchon eschews an identity that is always guaranteed, and shows that the repetition of an object or a subject is not the recurrence of the original self-identical object or person. Moreover, Iceland spar, the mystifying calcite, with its doubling effect provides the reader with a view of a world beyond the ordinary, actual world, which is quite similar to what Pynchon’s novel does per se.
In Hegel ou Spinoza, Pierre Macherey challenges the influence of Hegel’s reading of Spinoza by stressing the degree to which Spinoza eludes the grasp of the Hegelian dialectical progression of the history of philosophy. He argues that Hegel provides a defensive misreading of Spinoza, and that he had to “misread him” in order to maintain his subjective idealism. The suggestion is that Spinoza’s philosophy represents, not a moment that can simply be sublated and subsumed within the dialectical progression of the history of philosophy, but rather an alternative point of view for the development of a philosophy that overcomes Hegelian idealism. Gilles Deleuze also considers Spinoza’s philosophy to resist the totalising effects of the dialectic. Indeed, Deleuze demonstrates, by means of Spinoza, that a more complex philosophy antedates Hegel’s, one which cannot be supplanted by it. Spinoza therefore becomes a significant figure in Deleuze’s project of tracing an alternative lineage in the history of philosophy, which, by distancing itself from Hegelian idealism, culminates in the construction of a philosophy of difference. It is Spinoza’s role in this project that will be demonstrated in this paper by differentiating Deleuze’s interpretation of the geometrical example of Spinoza’s Letter XII (on the problem of the infinite) in Expressionism in Philosophy, Spinoza, from that which Hegel presents in the Science of Logic.
At the heart of Rik Peels’s Responsible Belief: A Theory in Ethics and Epistemology is the idea that responsibility for belief ought to be understood on the model of responsibility for states of affairs that are subject to our influence but not under our intentional control, or what he calls derivative responsibility. In this article, I argue that reflection on the nature and scope of derivative responsibility reveals important lacunae in Peels’s account of responsible belief and his account of responsibility for belief.
The biochemistry of geotropism in plants and gravisensing in e.g. cyanobacteria or paramecia is still not well understood today [1]. Perhaps there is more than one way for organisms to sense gravity. The two best-known, relatively old explanations for gravity sensing are sensing through the redistribution of cellular starch statoliths and sensing through the redistribution of auxin. The starch-containing statoliths in a gravity field produce pressure on the endoplasmic reticulum of the cell. This enables the cell to sense direction. Alternatively, there is the redistribution of auxin under the action of gravity. This is known as the Cholodny-Went hypothesis [2], [3]. The latter redistribution coincides with a redistribution of electrical charge in the cell. The aim of the present study is to add a mathematical unified field explanation to gravisensing.
Since its final version and publication in 1916, it has been widely reported in several specialized textbooks and research articles that general relativity theory may be reduced to Newton's gravity theory in the limit of a weak gravitational field and slow motion of the material bodies. In the present paper, the so-called reducibility of Einstein's geodesic and field equations to Newton's equation of motion and Poisson's gravitational potential equation, respectively, is scrutinized and argued to be mathematically, physically and dimensionally wrong; it is further argued that the geometrization of gravity is not really necessary.
Saturated free fatty acid-induced hepatocyte lipoapoptosis plays a pivotal role in non-alcoholic steatohepatitis. The activation of endoplasmic reticulum (ER) stress is involved in hepatocyte lipoapoptosis induced by the saturated free fatty acid palmitate (PA). However, the underlying mechanisms of the role of ER stress in hepatocyte lipoapoptosis remain largely unclear. In this study, we showed that PA and tunicamycin (Tun), a classic ER stress inducer, resulted in differential activation of ER stress pathways. Our data revealed that PA induced a chronic and persistent ER stress response, whereas Tun induced an acute and transient ER stress response. Compared with Tun treatment, PA induced much lower accumulation of glucose-regulated protein 78 (GRP78), a central regulator of ER homeostasis. It is noteworthy that GRP78 over-expression not only inhibited PA-induced ER stress but also decreased PA-induced apoptosis. Taken together, our data suggest that the differential activation of ER stress signaling plays an important role in PA-induced hepatocyte lipoapoptosis. More detailed studies on the mechanisms by which PA represses the accumulation of GRP78 will contribute to the understanding of the molecular mechanisms of lipoapoptosis.
We start from previous studies of G.N. Ord and A.S. Deakin showing that both the classical diffusion equation and the Schrödinger equation of quantum mechanics have a common stump. This result is obtained in rigorous terms, since it is demonstrated that both the diffusion and Schrödinger equations are manifestations of the same mathematical axiomatic set of the Clifford algebra. By using both such Clifford algebras, A(Si) and N(i,±1), it is evidenced, however, that possibly the two basic equations of physics cannot be reconciled.
I present and defend the generalized selected effects theory (GSE) of function. According to GSE, the function of a trait consists in the activity that contributed to its bearer’s differential reproduction, or differential retention, within a population. Unlike the traditional selected effects (SE) theory, it does not require that the functional trait helped its bearer reproduce; differential retention is enough. Although the core theory has been presented previously, I go significantly beyond those presentations by providing a new argument for GSE and defending it from a recent objection. I also sketch its implications for teleosemantics and philosophy of medicine.
With the advent of computers in the experimental labs, dynamic systems have become a new tool for research on problem solving and decision making. A short review of this research is given and the main features of these systems (connectivity and dynamics) are illustrated. To allow systematic approaches to the influential variables in this area, two formal frameworks (linear structural equations and finite state automata) are presented. Besides the formal background, the article sets out how the task demands of system identification and system control can be realised in these environments, and how psychometrically acceptable dependent variables can be derived.
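The linear structural equations framework mentioned in this abstract can be sketched in a few lines. This is an illustrative toy system, not one from the cited research; the matrices A and B and the initial state are assumptions:

```python
# A two-variable dynamic system given by linear structural equations,
# x[t+1] = A @ x[t] + B @ u[t]. Off-diagonal entries of A encode
# connectivity (one variable influences another); diagonal entries encode
# dynamics (a variable influences its own next state even with no input).

A = [[0.9, 0.2],
     [0.0, 0.5]]
B = [[1.0], [0.0]]

def step(x, u):
    return [A[0][0]*x[0] + A[0][1]*x[1] + B[0][0]*u,
            A[1][0]*x[0] + A[1][1]*x[1] + B[1][0]*u]

x = [0.0, 1.0]          # initial state: only the second variable is non-zero
for t in range(3):
    x = step(x, 0.0)    # zero control input: the system still evolves
print(x)
```

Running the system with no input shows the "dynamics" feature: the state keeps changing on its own, and the non-zero A[0][1] entry lets the second variable drive the first, which is the "connectivity" feature a participant must identify.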
Maimon’s theory of the differential has proved to be a rather enigmatic aspect of his philosophy. By drawing upon mathematical developments that had occurred earlier in the century and that, by virtue of the arguments presented in the Essay and comments elsewhere in his writing, I suggest Maimon would have been aware of, what I propose to offer in this paper is a study of the differential and the role that it plays in the Essay on Transcendental Philosophy (1790). In order to do so, this paper focuses upon Maimon’s criticism of the role played by mathematics in Kant’s philosophy, to which Maimon offers a Leibnizian solution based on the infinitesimal calculus. The main difficulties that Maimon has with Kant’s system, the second of which will be the focus of this paper, include the presumption of the existence of synthetic a priori judgments, i.e. the question quid facti, and the question of whether the fact of our use of a priori concepts in experience is justified, i.e. the question quid juris. Maimon deploys mathematics, specifically arithmetic, against Kant to show how it is possible to understand objects as having been constituted by the very relations between them, and he proposes an alternative solution to the question quid juris, which relies on the concept of the differential. However, despite these arguments, Maimon remains sceptical with respect to the question quid facti.
While structural equations modeling is increasingly used in philosophical theorizing about causation, it remains unclear what it takes for a particular structural equations model to be correct. To the extent that this issue has been addressed, the consensus appears to be that it takes the truth of a certain family of causal counterfactuals. I argue that this account faces difficulties in securing the independent manipulability of the structural determination relations represented in a correct structural equations model. I then offer an alternate understanding of structural determination, and I demonstrate that this theory guarantees that structural determination relations are independently manipulable. The account provides a straightforward way of understanding hypothetical interventions, as well as a criterion for distinguishing hypothetical changes in the values of variables which constitute interventions from those which do not. It additionally affords a semantics for causal counterfactual conditionals which is able to yield a clean solution to a problem case for the standard ‘closest possible world’ semantics.
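The independent manipulability at issue in this abstract can be made concrete with a toy structural equations model. A minimal sketch; the variables U, X, Y and their equations are illustrative assumptions, not the paper's examples:

```python
# A toy structural equations model: exogenous U, endogenous X and Y with
# structural equations X := U and Y := X + 1. An intervention do(X = 0)
# replaces the equation for X with a constant while the equation for Y is
# left untouched: the two determination relations are manipulated independently.

def solve(equations, exogenous):
    values = dict(exogenous)
    for var, eq in equations.items():   # equations listed in causal order
        values[var] = eq(values)
    return values

equations = {
    "X": lambda v: v["U"],
    "Y": lambda v: v["X"] + 1,
}

actual = solve(equations, {"U": 3})
intervened = dict(equations)
intervened["X"] = lambda v: 0           # the intervention do(X = 0)
counterfactual = solve(intervened, {"U": 3})

print(actual["Y"])          # Y under the actual equations: 4
print(counterfactual["Y"])  # Y under do(X = 0): 1
```

The hypothetical change to X here counts as an intervention precisely because only X's own equation is overwritten; a change that also rewrote Y's equation would not.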
In the paper “Math Anxiety,” Aden Evens explores the manner by means of which concepts are implicated in the problematic Idea according to the philosophy of Gilles Deleuze. The example that Evens draws from Difference and Repetition in order to demonstrate this relation is a mathematics problem, the elements of which are the differentials of the differential calculus. What I would like to offer in the present paper is an historical account of the mathematical problematic that Deleuze deploys in his philosophy, and an introduction to the role that this problematic plays in the development of his philosophy of difference. One of the points of departure that I will take from the Evens paper is the theme of “power series.” This will involve a detailed elaboration of the mechanism by means of which power series operate in the differential calculus deployed by Deleuze in Difference and Repetition. Deleuze actually constructs an alternative history of mathematics that establishes an historical continuity between the differential point of view of the infinitesimal calculus and modern theories of the differential calculus. It is in relation to the differential point of view of the infinitesimal calculus that Deleuze determines a differential logic which he deploys, in the form of a logic of different/ciation, in the development of his project of constructing a philosophy of difference.
The Einstein field equations were originally derived by Einstein in 1915 within the canonical formalism of Riemannian geometry, i.e. by using the classical sufficiently smooth metric tensor, smooth Riemann curvature tensor, smooth Ricci tensor, smooth scalar curvature, etc. However, singular solutions of the Einstein field equations, with degenerate and singular metric tensors and singular Riemann curvature tensors, were soon found. These degenerate and singular solutions of the Einstein field equations were formally accepted by a large part of the scientific community, despite lying beyond the rigorous canonical formalism of Riemannian geometry.
According to general relativity, the gravitational force is a manifestation of the local geometry of spacetime. GR is a metric theory of gravity. It is based on Einstein's equations, which describe the relationship between the geometry of a four-dimensional pseudo-Riemannian manifold, representing spacetime, and the energy-momentum contained in that spacetime. Gravity corresponds to changes in spatial and temporal properties, which in turn modify the paths of objects. Curvature is caused by the energy-momentum of matter. According to John Archibald Wheeler, spacetime tells matter how to move, and matter tells spacetime how to curve. DOI: 10.13140/RG.2.2.23756.26249.
The paper substantiates the inevitability of an irreversibility arising in the equations of kinetics and hydrodynamics that is not contained in the original equations of classical mechanics. It is established that the transfer of information about the direction of system evolution from the initial conditions to the resulting equations is a consequence of losing information about the position of an individual particle in space, which takes place when the description is roughened. It is shown that roughening with respect to the impact parameters of colliding particles is responsible for the appearance of the irreversibility in the resulting equations. Direct equations of kinetics and hydrodynamics are the result of roughening distribution functions with respect to the impact parameters of particles which have not yet reached the domain of their interaction. The direct equations are valid for the progressive direction of timing on the time axis pointing from the past to the future. Reverse equations of kinetics and hydrodynamics are the result of roughening distribution functions with respect to the impact parameters of particles which have already left the domain of their interaction. The reverse equations are valid for the progressive direction of timing on the time axis pointing from the future to the past.
I provide a theory of causation formulated within the causal modeling framework. In contrast to its predecessors, this theory is model-invariant in the following sense: if the theory says that C caused (didn't cause) E in a causal model, M, then it will continue to say that C caused (didn't cause) E once we've removed an inessential variable from M. I suggest that, if this theory is true, then we should understand a cause as something which transmits deviant or non-inertial behavior to its effect.
An actual cause of some token effect is itself a token event that helped to bring about that effect. The notion of an actual cause is different from that of a potential cause – for example a pre-empted backup – which had the capacity to bring about the effect, but which wasn't in fact operative on the occasion in question. Sometimes actual causes are also distinguished from mere background conditions: as when we judge that the struck match was a cause of the fire, while the presence of oxygen was merely part of the relevant background against which the struck match operated. Actual causation is also to be distinguished from type causation: actual causation holds between token events in a particular, concrete scenario; type causation, by contrast, holds between event kinds in scenario kinds.
Structural models of systems of causal connections have become a common tool in the analysis of the concept of causation. In the present paper I offer a general argument to show that one of the most powerful definitions of the concept of actual cause, provided within the structural models framework, is not sufficient to grant a full account of our intuitive judgements about actual causation, so that we are still waiting for a comprehensive definition. This is done not simply by focusing on a set of case studies, but by arguing that our intuitions about two different kinds of causal patterns, i.e., overdetermination and counterdetermination, cannot be addressed using that definition.
This book provides the first detailed account of Gramsci's work in the context of current critical and socio-cultural debates. Renate Holub argues that Gramsci was ahead of his time in offering a theory of art, politics and cultural production. Gramsci's achievement is discussed particularly in relation to the Frankfurt School (Adorno, Horkheimer, Benjamin, Bloch, Habermas), to Brecht's theoretical writings and to thinkers in the phenomenological tradition, especially Merleau-Ponty. She argues for Gramsci's continuing relevance at a time of retreat from Marxist positions on the postmodern left. Antonio Gramsci is distinguished by its range of philosophical grasp, its depth of specialized historical scholarship, and its keen sense of Gramsci's position as a crucial figure in the politics of contemporary cultural theory.
Let G be an additive subgroup of ℂ, let Wn = {xi = 1, xi + xj = xk : i, j, k ∈ {1, …, n}}, and define En = {xi = 1, xi + xj = xk, xi · xj = xk : i, j, k ∈ {1, …, n}}. We discuss two conjectures. If a system S ⊆ En is consistent over ℝ, then S has a real solution which consists of numbers whose absolute values belong to [0, 2^(2^(n−2))]. If a system S ⊆ Wn is consistent over G, then S has a solution (x1, …, xn) ∈ G^n in which |xj| ≤ 2^(n−1) for each j.
This paper offers a new interpretation for Wittgenstein's treatment of mathematical identities. As is widely known, Wittgenstein's mature philosophy of mathematics includes a general rejection of abstract objects. On the other hand, the traditional interpretation of mathematical identities involves precisely the idea of a single abstract object – usually a number – named by both sides of an equation.
Using Peirce as a guide, this paper explores the way in which light mediates finitude through the relational process of semiosis. Embodying the triadic logic of identity, difference and return, light creates space, time and matter. Attention is on simple bodily forms and the meta-physics of their relationality. The first section introduces the mathematical and metaphysical contours of Peirce’s approach. The second section motivates Peirce’s three categories as interwoven process. In the third section, Peirce’s formalism of the sign is presented and applied to simple physical and biological bodies.