Quantum indeterminism is usually considered an immediate corollary of the theorems on the absence of hidden variables in quantum mechanics, first of all the Kochen–Specker theorem. The basic postulate of quantum mechanics formulated by Niels Bohr, namely that it studies the system of an investigated microscopic quantum entity and a macroscopic apparatus described by the smooth equations of classical mechanics, by means of the readings of the latter, implies the absence of hidden variables as a necessary condition of quantum mechanics, and thus quantum indeterminism. Consequently, the objectivity of quantum mechanics, and even its very possibility and ability to study its objects as they are by themselves, imply quantum indeterminism. The so-called free-will theorems in quantum mechanics elucidate that the “valuable commodity” of free will is not a privilege of experimenters and human beings: it is shared by anything in the physical universe once the experimenter is granted to possess free will. The analogous idea that, e.g., an electron might possess free will to “decide” what to do scandalized Einstein and forced him to exclaim (in a 1924 letter to Max Born) that he would rather be a shoemaker or a croupier than a physicist if this were true. In any case, many experiments have confirmed the absence of hidden variables, and thus quantum indeterminism, in virtue of the objectivity and completeness of quantum mechanics. Once quantum mechanics is complete and thus an objective science, one can ask what this would mean in relation to classical physics and its objectivity. In fact, it divides disjunctively what possesses free will from what does not. Properly speaking, all physical objects belong to the latter domain according to it, and their “behavior” is necessary and deterministic. All possible decisions, on the contrary, are concentrated in the experimenters (or in human beings generally), i.e. in the former domain, which does not intersect the latter.
One may say that the cost of the determinism and unambiguous laws of classical physics is the indeterminism and free will of the experimenters and researchers (human beings), which therefore necessarily lie outside the scope and objectivity of classical physics. This is what is meant by the “deterministic subjectivity of classical physics”, as opposed to the “indeterminist objectivity of quantum mechanics”.
To make out in what way Einstein’s manifold 1905 ‘annus mirabilis’ writings hang together, one has to take into consideration Einstein’s striving for unity, evinced in his persistent attempts to reconcile the basic research traditions of classical physics. The light-quanta hypothesis and the special theory of relativity turn out to be the contours of a more profound design, mere milestones in the implementation of a programme reconciling Maxwellian electrodynamics, statistical mechanics and thermodynamics. The conception of the luminiferous ether was an insurmountable obstacle for Einstein’s statistical thermodynamics, in which the leading role was played by the light-quanta paper. In his critical stand against the entrenched research traditions of classical physics, Einstein was apparently influenced by David Hume and Ernst Mach. However, as regards its creative momenta, Einstein’s 1905 unificationist modus operandi drew upon Mach’s principle of economy of thought, taken in the context of his ‘instinctive knowledge’ doctrine, together with promising inclinations of Kantian epistemology presuming the coincidence of constructing theory and integrating intuition of principle.
Information can be considered the most fundamental philosophical, physical and mathematical concept, originating from the totality by means of physical and mathematical transcendentalism (the counterpart of philosophical transcendentalism). Classical and quantum information, particularly through their units, the bit and the qubit, correspond to and unify the finite and the infinite. Classical information is relevant to finite series and sets; quantum information, to infinite ones. A fundamental joint relativity of the finite and the infinite, of the external and the internal, is to be investigated. The corresponding invariance is able to define physical action and its quantity only on the basis of information, and specifically on the relativity of classical and quantum information. The concept of transcendental time, an epoché in relation to the direction of the arrow of time, can be defined. Its correlate is that information invariant to the finite and the infinite, therefore unifying both classical and quantum information.
It is usual to identify the initial conditions of classical dynamical systems with mathematical real numbers. However, almost all real numbers contain an infinite amount of information. I argue that a finite volume of space cannot contain more than a finite amount of information, hence that the mathematical real numbers are not physically relevant. Moreover, a better terminology for the so-called real numbers is “random numbers”, as their series of bits are truly random. I propose an alternative classical mechanics, which is empirically equivalent to classical mechanics but uses only finite-information numbers. This alternative classical mechanics is non-deterministic, despite the use of deterministic equations, in a way similar to quantum theory. Interestingly, both alternative classical mechanics and quantum theory can be supplemented by additional variables in such a way that the supplemented theory is deterministic. Most physicists straightforwardly supplement classical theory with real numbers, to which they attribute physical existence, while most physicists reject Bohmian mechanics as supplemented quantum theory, arguing that Bohmian positions have no physical reality.
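The point that the infinite digit tail of a real-valued initial condition does physical work can be illustrated with a standard chaotic-dynamics sketch (a minimal illustration of the general idea, not code from the paper; the map and the precision cutoff are chosen for the example):

```python
# Illustration (not from the paper): in a chaotic system, the digits of a real
# initial condition beyond any finite precision end up dictating the outcome.
# Two logistic-map trajectories agreeing to 15 decimal digits are compared.

def trajectory(x, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), returning the whole orbit."""
    orbit = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

x_a = 0.123456789012345
x_b = x_a + 1e-15          # differs only past the 15th decimal digit

orbit_a = trajectory(x_a, 100)
orbit_b = trajectory(x_b, 100)
max_sep = max(abs(a - b) for a, b in zip(orbit_a, orbit_b))
print(max_sep)             # order-one separation: the unobservable tail digits decided the outcome
```

Truncating the state to finitely many digits, as finite-information numbers do, therefore leaves the long-run behaviour genuinely open, which is the sense in which the deterministic equations yield a non-deterministic mechanics.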
This is a short, nontechnical introduction to features of time in classical and relativistic physics and their representation in the four-dimensional geometry of spacetime. Topics discussed include: the relativity of simultaneity in special and general relativity; the ‘twin paradox’ and differential aging effects in special and general relativity; and time travel in general relativity.
“Physical premotion” is a concept associated with Baroque Catholic theological debates concerning grace and freedom. In this paper, I present an argument that the entities identified in this debate, physical premotions, are necessary for any classical theist’s account of divine causality. A “classical theist” is a theist who holds both that God is simple, that is, without inhering properties, and that humans and God are both free in the incompatibilist sense. In fact, not only does the acceptance of physical premotions not entail determinism; physical premotions are the only way for classical theists to preserve the aforementioned two commitments. Nevertheless, the theory of premotions cannot help theologians resolve questions of how God causes human free acts without violating their freedom.
We think of kinetic energy (KE) as a quantity possessed by rest mass in motion. But somehow electromagnetic (EM) radiation transports KE across space without any rest mass. In addition, a single photon passing through a double slit diffracts into multiple paths in space without affecting its KE. This is hard to explain. Quantum theories that confront the double-slit problem do not address these two issues directly. The ontology of radiation KE is examined, which leads to some new ideas about the nature of KE transmission by radiation.
Physics occupies a preeminent position among the sciences for two reasons: on the one hand, due to its recognized position as a fundamental science, and on the other, due to the obvious certainty of its knowledge. For both reasons it is regarded as the paradigm of scientificity par excellence. With its focus on the issue of epistemic certainty, philosophy of science follows in the footsteps of classical epistemology, and this is also the basis of its 'judicial' pretension vis-à-vis physics. At the same time, physics stands in a strong competitive relationship to philosophy and epistemology with respect to its position as a fundamental science – even on the subject of cognition, as the pretension of 'reductionism' shows. It is the thematic focus on epistemic certainty itself, however, that becomes the root of a profound epistemological misunderstanding of physics. The reason for this is twofold: first, the idea of epistemic certainty as a criterion of 'demarcation' between physics and metaphysics obscures the view of the much deeper heuristic differences between the two kinds of knowledge. The second, related, reason is that epistemology does not ask the question of the reason for the epistemic certainty of physics; instead, it sets itself the task of 'legitimating' physical knowledge, and this, crucially, with reference to the interpretation of the process of cognition. Thus, as a matter of course, all epistemological assumptions about this process – including the common descriptive understanding of knowledge and its ontological premises – flow into the interpretation of physics as a science.
Consequently, this undertaking is not only doubtful from the ground up, because it presupposes for its meaningfulness nothing less than certainty of knowledge concerning (the interpretation of) the process of knowledge, thereby relying on mere convictions; moreover, by projecting the descriptive, 'metaphysical' concept of knowledge onto physics, it leads to unsolvable epistemological problems and correspondingly resigned conclusions concerning physics' claim to knowledge. In other words, epistemology itself, due to its basic assumptions, erects a major obstacle to an adequate understanding of physics. Physics' cross-object, deconstructive approach to knowledge implies a completely different, non-descriptive understanding of its concepts, with consequences that extend far beyond physics itself due to its status as a basic science.
Symmetries play a major role in physics, in particular since the work of E. Noether and H. Weyl in the first half of the last century. Herein, we briefly review their role by recalling how symmetry changes allow one to move conceptually from classical to relativistic and quantum physics. We then introduce our ongoing theoretical analysis in biology and show that symmetries play a radically different role in this discipline when compared to their role in current physics. By this comparison, we stress that symmetries must be understood in relation to conservation and stability properties, as represented in the theories. We posit that the dynamics of biological organisms, at their various levels of organization, are not just processes, but permanent (extended, in our terminology) critical transitions and, thus, symmetry changes. Within the limits of a relative structural stability (or interval of viability), variability is at the core of these transitions.
The impetus theory of motion states that to be in motion is to have a non-zero velocity. The at-at theory of motion states that to be in motion is to be at different places at different times, which in classical physics is naturally understood as the reduction of velocities to position developments. I first defend the at-at theory against the criticism raised by Arntzenius that it renders determinism impossible. I then develop a novel impetus theory of motion that reduces positions to velocity developments. As this impetus theory of motion is by construction a mirror image of the at-at theory of motion, I claim that the two theories of motion are in fact epistemically on a par, despite the unfamiliar metaphysical picture of the world furnished by the impetus version.
The nature of time is yet to be fully grasped and finally agreed upon among physicists, philosophers, psychologists and scholars from various disciplines. The present paper takes its cue from the known assumptions about time, namely movement, change and becoming, and the nature of time will be thoroughly discussed. The real and unreal existences of time will be pointed out and presented. A complex-number notation for the nature of time will be put forward. Natural scientific systems and various cosmic processes will be identified as constructing the physical form of time, and the physical existence of time will be described. The finite and infinite forms of physical time, as well as classical, quantum and cosmic times, will be delineated, and their mathematical constructions and loci will be set out. Thus the physics behind time-construction, time-creation and time-measurement will be given. Based on these developments, the physics of timelessness will be developed and presented.
Wigner’s quantum-mechanical classification of particle-types in terms of irreducible representations of the Poincaré group has a classical analogue, which we extend in this paper. We study the compactness properties of the resulting phase spaces at fixed energy, and show that in order for a classical massless particle to be physically sensible, its phase space must feature a classical-particle counterpart of electromagnetic gauge invariance. By examining the connection between massless and massive particles in the massless limit, we also derive a classical-particle version of the Higgs mechanism.
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently "potent" to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident "yes" has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
Quantum mechanics was reformulated as an information theory involving a generalized kind of information, namely quantum information, at the end of the last century. Quantum mechanics is the most fundamental physical theory, referring to everything claiming to be physical. Any physical entity turns out to be quantum information in the final analysis. A quantum bit is the unit of quantum information, and it is a generalization of the unit of classical information, the bit, just as quantum information itself is a generalization of classical information. Classical information refers to finite series or sets, while quantum information refers to infinite ones. Quantum information, like classical information, is a dimensionless quantity. Quantum information can be considered as a “bridge” between the mathematical and the physical. The standard and common scientific epistemology takes for granted the gap between mathematical models and physical reality. The conception of truth as adequacy is what is able to transfer “over” that gap. One should explain how quantum information, being a continuous transition between the physical and the mathematical, may refer to truth as adequacy and thus to the usual scientific epistemology and methodology. If it is the overall substance of anything claiming to be physical, one can ask how different, dimensional physical quantities appear. Quantum information can be discussed as the counterpart of action. Quantum information is what is conserved, while action is what is changed, in virtue of the fundamental theorems of Emmy Noether (1918). The gap between mathematical models and physical reality, needing truth as adequacy to be overcome, is substituted by the openness of choice. That openness in turn can be interpreted as the openness of the present, a different concept of truth recalling Heidegger’s “unconcealment” (ἀλήθεια). Quantum information as what is conserved can be thought of as the conservation of that openness.
We start from previous studies of G.N. Ord and A.S. Deakin showing that the classical diffusion equation and the Schrödinger equation of quantum mechanics have a common stump. This result is obtained in rigorous terms, since it is demonstrated that both the diffusion and the Schrödinger equations are manifestations of the same mathematical axiomatic set of the Clifford algebra. By using both such Clifford algebras, however, it is evidenced that the two basic equations of physics possibly cannot be reconciled.
In this paper, we review a general technique for converting the standard Lagrangian description of a classical system into a formulation that puts time on an equal footing with the system's degrees of freedom. We show how the resulting framework anticipates key features of special relativity, including the signature of the Minkowski metric tensor and the special role played by theories that are invariant under a generalized notion of Lorentz transformations. We then use this technique to revisit a classification of classical particle-types that mirrors Wigner's classification of quantum particle-types in terms of irreducible representations of the Poincaré group, including the cases of massive particles, massless particles, and tachyons. Along the way, we see gauge invariance naturally emerge in the context of classical massless particles with nonzero spin, as well as study the massless limit of a massive particle and derive a classical-particle version of the Higgs mechanism.
This second volume is a continuation of the first volume’s treatment of the 20th-century conceptual foundations of quantum physics, extending its view to the principles and research fields of the 21st century. It offers a summary of the standard concepts, from modern advanced experimental tests of ‘quantum ontology’ to the interpretations of quantum mechanics, the standard model of particle physics, and the mainstream quantum gravity theories. It is a state-of-the-art treatise that reports on the recent developments in quantum computing, classical and quantum information theory, the black hole information paradox and the holographic principle, and quantum cosmology, with some attention to contemporary themes such as Bose–Einstein condensates, as well as to the more speculative areas of quantum biology and quantum consciousness. A final chapter on the connections between the quantum realm and philosophical idealism concludes the volume. Considering how the media (sometimes also physicists) present quantum theory, focusing only on highly dubious ideas and speculations backed by no evidence or, worse, promoting pseudo-scientific hypes that fall regularly in and out of fashion, this is a ‘vademecum’ for those who look for a serious introduction and deeper understanding of 21st-century quantum theory. All topics are explained in a concise but rigorous intermediate-level style which may, at times, require some effort. However, you will finally acquire an unparalleled background in the conceptual foundations of quantum physics, enabling you to distinguish between real science backed by experimental facts and mere speculative interpretations.
The paper takes up Bell's “Everett theory” and develops it further. The resulting theory is about the system of all particles in the universe, each located in ordinary, 3-dimensional space. This many-particle system as a whole performs random jumps through 3N-dimensional configuration space – hence “Tychistic Bohmian Mechanics” (TBM). The distribution of its spontaneous localisations in configuration space is given by the Born Rule probability measure for the universal wavefunction. Contra Bell, the theory is argued to satisfy the minimal desiderata for a Bohmian theory within the Primitive Ontology framework. TBM's formalism is that of ordinary Bohmian Mechanics, without the postulate of continuous particle trajectories and their deterministic dynamics. This “rump formalism” receives, however, a different interpretation. We defend TBM as an empirically adequate and coherent quantum theory. Objections voiced by Bell and Maudlin are rebutted. The “for all practical purposes”-classical, Everettian worlds exist sequentially in TBM. In a temporally coarse-grained sense, they quasi-persist. By contrast, the individual particles themselves cease to persist.
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE, Vol. 10, No. 2, 1996, pp. 127-140. R.M. Nugayev. Why did the new physics force out the old? Abstract. The aim of my paper is to demonstrate that special relativity and the early quantum theory were created within the same programme of reconciling statistical mechanics, thermodynamics and Maxwellian electrodynamics. I’ll try to explain why classical mechanics and classical electrodynamics were “refuted” almost simultaneously or, in other words, why the quantum revolution and the relativistic one both took place at the beginning of the 20th century. I’ll argue that the quantum and relativistic revolutions were simultaneous because they had a common origin: the clash between the mature theories of the second half of the 19th century that constituted the “body” of classical physics. The revolution’s most dramatic point was Einstein’s 1905 photon paper, which laid the foundations of both special relativity and the old quantum theory. Hence the dialectic of the old theories is crucial for theory change. Later, classical physics was forced out by the joint development of the quantum and relativistic subprogrammes. The title of my paper can be reformulated in Bruno Latour’s terms: The Einstein Revolution, or Drawing Models Together.
This report offers a modern perspective on the question of time directionality as it arises in a classical and quantum mechanical context, based on key developments in the field of gravitational physics. Important clarifications are achieved regarding, in particular, the concepts of time reversal, negative energy and causality. From this analysis emerges an improved understanding of the general relativistic concept of stress-energy of matter as being a manifestation of local variations in the energy density of zero-point vacuum fluctuations. Based on those developments, a set of axioms is proposed from which are derived generalized gravitational field equations which actually constitute a simplification of relativity theory in the presence of negative energy matter and vacuum energy. Those results are then applied to provide original solutions to several long-standing problems in theoretical cosmology and concerning the foundations of quantum theory, including the problem of the nature of dark matter and dark energy, that of the origin of thermodynamic time asymmetry and several other issues traditionally approached using inflation theory. Significant new insights are also provided concerning gravitational entropy, the problem of quantum non-locality, that of the emergence of time in quantum cosmology as well as the problem of the persistence of quasiclassicality following decoherence.
In the quest for a physical theory of everything, from macroscopic large-body matter to microscopic elementary particles, with strange and weird concepts springing from the discovery of quantum physics, irreconcilable positions and inconvenient facts have complicated physics, from Newtonian physics to quantum science; the question is: how do we close the gap? Indeed, there are scientific and mathematical fireworks when the issue of quantum uncertainties and entanglements cannot be explained with classical physics. The Copenhagen interpretation is an expression of a few wise men on quantum physics, largely formulated from 1925 to 1927, namely by Niels Bohr and Werner Heisenberg. From this point on, there is a divergence of quantum science into the realms of indeterminacy, complementarity and entanglement, which are principles expounded in Yijing, an ancient Chinese body of knowledge constructed on symbols, with a vintage of at least three millennia, using broken and unbroken lines to form a stacked six-line structure called the hexagram. It is premised on the probabilistic development of the hexagram in a space-time continuum. The discovery of the quantization of action meant that the principles of classical physics could not convincingly explain quantum physics. This paper will trace the great departure from classical physics into the realm of probabilistic realities. The probabilistic nature and reality interpretation had a significant influence on Bohr’s line of thought. Apparently, Bohr realized that speaking of disturbance seemed to indicate that atomic objects were classical particles with definite inherent kinematic and dynamic properties (Hanson, 1959). Disturbances, energy excitation and entanglements are processual evolutionary phases in Yijing.
This paper will explore the similarities between quantum physics and the methodological ways in which Yijing is used to interpret observable realities involving interactions that are uncontrollable and probabilistic and that form an inseparable unity due to entanglement and superposition. Transgressing disciplinary boundaries by bringing together, in one discussion, Yijing, which originated in the Western Zhou period (1000-750 BC) and, over the Warring States period and the early imperial period (500-200 BC), was compiled, transcribed and transformed into a cosmological text with philosophical commentaries known as the “Ten Wings”, closely associated with Confucius (551-479 BC), with the Copenhagen Interpretation (1925-1927) of the few wise men including Niels Bohr and Werner Heisenberg, would seem like a subversive undertaking. It is subversive in that the interpretations from Yijing are based on wisdom derived over thousands of years from ancient China, applied to recently discovered quantum concepts. The undertaking does seem to violate the sanctuaries of accepted ways of looking at Yijing principles, classical physics and quantum science because of the fortified boundaries that have been erected between Yijing and the sciences. Subversive as this paper may be, it is an attempt to re-cast an ancient framework in which indeterminism, complementarity, non-linearity, entanglement, superposition and the probability interpretation are seen in today’s quantum realities.
Do scientific theories limit human knowledge? In other words, are there physical variables hidden by essence, forever? We argue for negative answers and illustrate our point with chaotic classical dynamical systems. We emphasize parallels with quantum theory and conclude that the common real numbers are, de facto, the hidden variables of classical physics. Consequently, real numbers should not be considered as “physically real”, and classical mechanics, like quantum physics, is indeterministic.
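The sense in which a real number can act as a hidden variable admits a textbook illustration (a sketch of the general idea, not code from the paper; the doubling map and the 64-bit cutoff are chosen for the example). Under the doubling map, the coarse-grained outcome at step n, left or right half of the unit interval, is exactly the n-th binary digit of the initial condition, so the bit string of the initial "real number" pre-determines every future observation:

```python
# Illustration (not from the paper): the binary expansion of the initial
# condition x0 is a hidden variable for the doubling map x -> 2x mod 1.
from fractions import Fraction
import random

random.seed(0)
bits = [random.randint(0, 1) for _ in range(64)]      # the "hidden" binary expansion of x0
x = Fraction(int("".join(map(str, bits)), 2), 2**64)  # x0 = 0.b1 b2 ... b64 in binary, exactly

revealed = []
for _ in range(64):
    revealed.append(0 if x < Fraction(1, 2) else 1)   # coarse observation: left or right half?
    x = (2 * x) % 1                                   # exact deterministic doubling map

print(revealed == bits)  # True: each step merely reveals one more pre-existing bit of x0
```

If those bits are truly random, as the abstract argues the bits of typical reals are, the deterministic dynamics is observationally indistinguishable from a random process.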
This physics note summarizes an extended form of the Eccles–Cartesian interactive-dualism mind-body-multiverse paradigm called Noetic Field Theory: The Quantization of Mind (NFT), distinguished as a paradigm because it is comprehensive and empirically testable. NFT posits not only that the brain is not the seat of awareness but also that neither classical nor quantum mechanics is sufficient to describe mind, as the required regime entails the new physics associated with Unified Field (UF) Mechanics. This means that the brain is merely a transducer (a form of quantum computer) mediating mentation, sensory data and metabolic function. The so-called ‘Hard Problem’ of cognitive science arises from a category error in philosophy of mind, i.e. the incorrectly posed question “what processes in the brain give rise to awareness”, which should instead simply be “what processes give rise to awareness”. In the history of science, whenever a hard problem has arisen it has later been shown that the underlying principles had not been understood. NFT posits these underlying principles in a comprehensive, empirically testable manner, solving the ancient mind-body problem in a way that enables breaking the 1st-person/3rd-person barrier and explaining Psi phenomena. The premise is that the mind-body is a naturally occurring form of ‘conscious quantum computer’ (QC); not that the QC is conscious, but that it is modelled after those principles. Such QC technologies could lead to routine telepathic-effect devices.
An ontology of Leibnizian relationalism, consisting in distance relations among sparse matter points and their change only, is well recognized as a serious option in the context of classical mechanics. In this paper, we investigate how this ontology fares when it comes to general relativistic physics. Using a Humean strategy, we regard the gravitational field as a means to represent the overall change in the distance relations among point particles in a way that achieves the best combination of being simple and being informative.
This paper examines the almost ineradicable misconception of Wittgenstein's alleged antagonism to science, as evidenced by some characteristic disparaging comments by world-renowned scientists, notably Anton Zeilinger. Above all, he criticizes Wittgenstein on the basis of the opening sentence of the Tractatus Logico-Philosophicus, "The world is all that is the case", which he regards as expressing "the naive world-view" of a "typical philosopher of classical physics". He proposes an extension in agreement with the findings of quantum theory, namely by the clause "… and all that can be the case" (Zeilinger 2003, 231). It will become apparent, however, that this amplification is redundant, that Wittgenstein was in tune with modern physics, that a surprising number of his philosophical concepts are in agreement with it, and that various quantum pundits consider them to be relevant.
With his influence on the development of physiology, physics and geometry, Hermann von Helmholtz is, like few scientists of the second half of the 19th century, representative of research in natural science in Germany. The development of his understanding of science is no less representative. Until the late sixties, he emphatically claimed the truth of science; later on, he began to see the conditions for the validity of scientific knowledge in relative terms, and this can, in summary, be referred to as hypothesizing. Already in that century, Helmholtz made first approaches to an understanding of science which were incompatible with his own former position and which pointed to the modern age to an astonishingly large extent. A comparison with Karl R. Popper's logic of research will illustrate how closely he nevertheless approached the modern understanding of science. In Popper's logic of research, the hypothesizing of scientific knowledge is definitely much more advanced than in Helmholtz's theory of science. What begins vaguely to emerge with Helmholtz has already become an explicitly formulated programme with Popper. Although Helmholtz and Popper are not on a direct line of epistemological development, and Popper refers to Helmholtz only rarely and casually, there are in fact surprising points of contact which have not been noticed so far and which appear above all if one looks at Helmholtz's understanding of science against the background of Popper's logic of research.
Since the early days of physics, space has called for means to represent, experiment on, and reason about it. Apart from physicists, the concept of space has also intrigued philosophers, mathematicians and, more recently, computer scientists. This longstanding interest has left us with a plethora of mathematical tools developed to represent and work with space. Here we take a special look at this evolution by considering the perspective of Logic. From the initial axiomatic efforts of Euclid, we revisit the major milestones in the logical representation of space and investigate current trends. In doing so, we consider not only classical logic, but also indulge ourselves with modal logics. These present themselves naturally by providing simple axiomatizations of different geometries, topologies, space-time causality, and vector spaces.
Following the proposal of a new kind of selective structural realism that uses as a basis the distinction between framework and interaction theories, this work discusses relevant applications in fundamental physics. An ontology for the different entities and properties of well-known theories is thus consistently built. The case of classical field theories—including general relativity as a classical theory of gravitation—is examined in detail, as well as the implications of the classification scheme for issues of realism in quantum mechanics. These applications also shed light on the different range of applicability of the ontic and epistemic versions of structural realism.
The notions of conservation and relativity lie at the heart of classical mechanics, and were critical to its early development. However, in Newton’s theory of mechanics, these symmetry principles were eclipsed by domain-specific laws. In view of the importance of symmetry principles in elucidating the structure of physical theories, it is natural to ask to what extent conservation and relativity determine the structure of mechanics. In this paper, we address this question by deriving classical mechanics—both nonrelativistic and relativistic—using relativity and conservation as the primary guiding principles. The derivation proceeds in three distinct steps. First, conservation and relativity are used to derive the asymptotically conserved quantities of motion. Second, in order that energy and momentum be continuously conserved, the mechanical system is embedded in a larger energetic framework containing a massless component that is capable of bearing energy. Imposition of conservation and relativity then results, in the nonrelativistic case, in the conservation of mass and in the frame-invariance of massless energy; and, in the relativistic case, in the rules for transforming massless energy and momentum between frames. Third, a force framework for handling continuously interacting particles is established, wherein Newton’s second law is derived on the basis of relativity and a staccato model of motion-change. Finally, in light of the derivation, we elucidate the structure of mechanics by classifying the principles and assumptions that have been employed according to their explanatory role, distinguishing between symmetry principles and other types of principles that are needed to build up the theoretical edifice.
In classical electrodynamics, electromagnetic effects are calculated from the solution of the wave equation formed by combining the four Maxwell equations. However, along with the retarded solution, this wave equation admits an advanced solution, in which case the effect happens before the cause. So, to preserve causality in natural events, the retarded solution is intentionally chosen and the advanced part is simply ignored. But an equation or method cannot be called fundamental if it admits a wrong result (one that violates the principle of causality) in addition to the correct result. Since it is Maxwell’s form of the equations that gives birth to this acausal advanced potential, we rewrite these equations in a different form using the recent theory of reaction at a distance (Biswaranjan Dikshit, Physics Essays, 24(1), 4-9, 2011) so that the process of calculation does not generate any advanced effects. Thus, the long-standing causality problem in electrodynamics is solved.
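For orientation, the retarded and advanced solutions contrasted in this abstract are the two standard Green-function solutions of the electromagnetic wave equation. The following is textbook material (SI units, Lorenz gauge, scalar potential), quoted for context rather than taken from the paper itself:

```latex
% Inhomogeneous wave equation for the scalar potential (Lorenz gauge, SI units)
\nabla^{2}\varphi - \frac{1}{c^{2}}\,\frac{\partial^{2}\varphi}{\partial t^{2}}
  = -\,\frac{\rho}{\varepsilon_{0}}
% Its two Green-function solutions: retarded (upper sign) and advanced (lower sign)
\varphi_{\mp}(\mathbf{r},t)
  = \frac{1}{4\pi\varepsilon_{0}}
    \int \frac{\rho\!\left(\mathbf{r}',\, t \mp |\mathbf{r}-\mathbf{r}'|/c\right)}
              {|\mathbf{r}-\mathbf{r}'|}\; \mathrm{d}^{3}r'
```

In the retarded solution the potential at time t depends on the source at the earlier time t − |r − r′|/c; in the advanced solution it depends on the later time t + |r − r′|/c, which is the acausal case the paper aims to eliminate.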
If the concept of “free will” is reduced to that of “choice”, the entire physical world shares the latter quality. Still, “free will” can be distinguished from “choice”: free will implicitly involves a certain goal, while choice is only the means by which the aim can be achieved (or not) by the one who determines the target. Thus, for example, an electron always has a choice but not free will, unlike a human, who possesses both. Consequently, and paradoxically, the determinism of classical physics is more subjective and more anthropomorphic than the indeterminism of quantum mechanics, for the former implicitly presupposes a certain deterministic goal, following the model of human free-will behavior. Quantum mechanics introduces choice into the foundation of the physical world, involving a generalized case of choice that can be called “subjectless”: there is a certain choice which originates from the transition of the future into the past. That kind of choice is thus shared by everything that exists and does not need any subject: it can be considered a law of nature. There are a few theorems in quantum mechanics directly relevant to the topic; two of them are called “free will theorems” by their authors (Conway and Kochen 2006; 2009). Any quantum system, whether a human, an electron, or anything else, always has a choice: its behavior is not predetermined by its past. This is a physical law. It implies that a form of information, quantum information, underlies everything that exists, for the unit of the quantity of information is an elementary choice: either a bit or a quantum bit (qubit).
The force law of Maxwell’s classical electrodynamics does not agree with Newton’s third law of motion (N3LM) in the case of open-circuit magnetostatics. Initially, a generalized magnetostatics theory is presented that includes two additional physical fields B_Φ and B_l, defined by scalar functions. The scalar magnetic field B_l mediates a longitudinal Ampère force that balances the transverse Ampère force (aka the magnetic field force), such that the sum of the two forces agrees with N3LM for all stationary current distributions. Secondary field induction laws are derived; a secondary curl-free electric field E_l is induced by a time-varying scalar magnetic field B_l, which is not described by Maxwell’s electrodynamics. The Helmholtz decomposition is applied to exclude E_l from the total electric field E, resulting in a simpler Maxwell theory. Decoupled inhomogeneous potential equations and their solutions follow directly from this theory, without having to apply a gauge condition. Field expressions are derived from the potential functions that are simpler than, and far-field consistent with, the Jefimenko fields. However, our simple version of Maxwell’s theory does not satisfy N3LM. Therefore we combine the generalized magnetostatics with the simple version of Maxwell’s electrodynamics, via a generalization of Maxwell’s speculative displacement current. The resulting electrodynamics describes three types of vacuum waves: the Φ wave, the longitudinal electromagnetic (LEM) wave and the transverse electromagnetic (TEM) wave, with phase velocities a, b and c respectively. Power and force theorems are derived, and the force law agrees with Newton’s third law only if the phase velocities satisfy the following condition: a ≫ b and b = c. The retarded potential functions can be found without gauge conditions, and four retarded field expressions are derived that have three near-field terms and six far-field terms.
All six far-field terms are explained as the mutual induction of two free fields. Our theory supports Rutherford’s solution of the 4/3 problem of electromagnetic mass, which requires an extra longitudinal electromagnetic momentum. Our generalized classical electrodynamics might spawn new physics experiments and electrical engineering applications, such as new photoelectric effects based on Φ or LEM radiation, and the conversion of natural Φ or LEM radiation into useful electricity, in the footsteps of Dr. N. Tesla and Dr. T. H. Moray.
The paper discusses the possibility of constructing economic models using the methodology of model construction in classical mechanics. At the same time, unlike the "econophysical" approach, the properties of economic models are derived without invoking any equivalent physical properties, but by taking into account the types of symmetry existing in the economic system. It is shown that with this approach practically all known mechanical variables have their "economic twins". The variational principle is formulated on the basis of a formal mathematical construction, without involving the subjective factor common to the majority of models in economics. The dynamics of the interaction of two companies is studied in detail, on the basis of which we can proceed to the modeling of more complex and realistic economic systems. A prediction is made of the possibility of constructing economic theory from first principles, analogously to physics.
This work is a conceptual analysis of certain recent developments in the mathematical foundations of Classical and Quantum Mechanics which have allowed both theories to be formulated in a common language. From the algebraic point of view, the set of observables of a physical system, be it classical or quantum, is described by a Jordan-Lie algebra. From the geometric point of view, the space of states of any system is described by a uniform Poisson space with transition probability. Both these structures are here perceived as formal translations of the fundamental twofold role of properties in Mechanics: they are at the same time quantities and transformations. The question then becomes to understand the precise articulation between these two roles. The analysis will show that Quantum Mechanics can be thought of as distinguishing itself from Classical Mechanics by a compatibility condition between properties-as-quantities and properties-as-transformations.

Moreover, this dissertation shows the existence of a tension between a certain "abstract way" of conceiving mathematical structures, used in the practice of mathematical physics, and the necessary capacity to specify particular states or observables. It then becomes important to understand how, within the formalism, one can construct a labelling scheme. The “Chase for Individuation” is the analysis of different mathematical techniques which attempt to overcome this tension. In particular, we discuss how group theory furnishes a partial solution.
Maxwell’s Classical Electrodynamics (MCED) suffers several inconsistencies: (1) the Lorentz force law of MCED violates Newton’s Third Law of Motion (N3LM) in case of stationary and divergent or convergent current distributions; (2) the general Jefimenko electric field solution of MCED shows two longitudinal far fields that are not waves; (3) the ratio of the electrodynamic energy-momentum of a charged sphere in uniform motion has an incorrect factor of 4/3. A consistent General Classical Electrodynamics (GCED) is presented that is based on Whittaker’s reciprocal force law that satisfies N3LM. The Whittaker force is expressed as a scalar magnetic field force, added to the Lorentz force. GCED is consistent only if it is assumed that the electric potential velocity in vacuum, ’a’, is much greater than ’c’ (a ≫ c); GCED reduces to MCED, in case we assume a = c. Longitudinal electromagnetic waves and superluminal longitudinal electric potential waves are predicted. This theory has been verified by seemingly unrelated experiments, such as the detection of superluminal Coulomb fields and longitudinal Ampère forces, and has a wide range of electrical engineering applications.
In this paper, I present a new framework supporting the claim that some elements in science play a constitutive function, with the aim of overcoming some limitations of Friedman's (2001) account. More precisely, I focus on what I consider to be the gradualism implicit in Friedman's interpretation of the constitutive a priori, that is, the fact that it seems to allow for degrees of 'constitutivity'. I tease out such gradualism by showing that the constitutive character Friedman aims to track can be captured by three features - namely, quasi-axiomaticity (QA), generative potential (GP), and empirical shielding (ES) - which are exhibited to a maximal degree by the examples Friedman deploys, particularly in his analysis of Newtonian mechanics. I argue that not all varieties of 'constitutivity' can be captured by the kind of gradualism implicit in Friedman's view, although developing the gradualism itself might provide useful insights. To show this, I analyse the function of the Hardy-Weinberg principle (HWP) in population genetics in terms of its QA, GP, and ES. Whereas the HWP does not count as constitutive in classical philosophical interpretations (Sober 1984), nor does it within Friedman's framework, it does nonetheless perform a minimally constitutive function. By means of historical details and considerations on the prospects of replacing the HWP, I show that the HWP is minimally constitutive by being a counterfactual instantiation of a paradigmatically constitutive stability principle, where the latter might itself be regarded as an enabling condition for a variety of modelling practices across the sciences.
It is shown by means of general principles and specific examples that, contrary to a long-standing misconception, the modern mathematical physics of compressible fluid dynamics provides a generally consistent and efficient language for describing many seemingly fundamental physical phenomena. It is shown to be appropriate for describing electric and gravitational force fields, the quantized structure of charged elementary particles, the speed of light propagation, relativistic phenomena, the inertia of matter, the expansion of the universe, and the physical nature of time. New avenues and opportunities for fundamental theoretical research are thereby illuminated.
Propensities are presented as a generalization of classical determinism. They describe a physical reality intermediate between Laplacian determinism and pure randomness, such as that of quantum mechanics. They are characterized by the fact that their values are determined by the collection of all actual properties. It is argued that they do not satisfy the Kolmogorov axioms; other axioms are proposed.
The original proposal of H. H. Pattee (1971) of basing quantum theoretical measurement theory on the theory of the origin of life, and its far reaching consequences, is discussed in the light of a recently emerging biological paradigm of internal measurement. It is established that the "measurement problem" of quantum physics can, in principle, be traced back to the internal material constraints of the biological organisms, where choice is a fundamental attribute of the self-measurement of matter. In this light, which is shown to be a consequence of Pattee's original suggestion, it is proposed that biological evolution is a gradual liberation from the inert unity of "subject" and "object" of inanimate matter (as "natural law" and "initial conditions"), to a split biological existence of them and, as a consequence, the "message of evolution" is freedom, rather than complexity in itself. Some classical philosophical systems are brought into context to show that the epistemologies of several strictly philosophical systems of the social sciences are well acquainted with the problem and their solutions support our conclusions.
This paper centers on the implicit metaphysics behind the Theory of Relativity and the Principle of Indeterminacy – two revolutionary theories that changed 20th-century physics – using the perspective of Husserlian Transcendental Phenomenology. Albert Einstein (1879-1955) and Werner Heisenberg (1901-1976) abolished the theoretical framework of Classical (Galilean-Newtonian) physics that had been complemented and strengthened by Cartesian metaphysics. René Descartes (1596-1650) introduced a separation between subject and object (as two different and self-enclosed substances) while Galileo and Newton carried out the “mathematization” of the world. Newtonian physics, however, had an inexplicable postulate of absolute space and absolute time – a kind of geometrical framework, independent of all matter, for the explication of locality and acceleration. Thus, Cartesian modern metaphysics and Galilean-Newtonian physics go hand in hand, resulting in socio-ethical problems, materialism and environmental destruction. Einstein got rid of the Newtonian absolutes and was able to provide a new foundation for our notions of space and time: the four (4) dimensional space-time; simultaneity and the constancy of the velocity of light; and the relativity of all systems of reference. Heisenberg, following the theory of quanta of Max Planck, told us of our inability to know sub-atomic phenomena, thus blurring the line of the Cartesian separation of object and subject and initiating the crisis of the foundations of Classical Physics. But the real crisis, according to Edmund Husserl (1859-1938), is that Modern (Classical) Science had “idealized” the world, severing nature from what he calls the Lebenswelt (life-world), the world that is simply there even before it has been reduced to mere mathematical-logical equations.
Husserl thus aims to establish a new science that returns to the “pre-scientific” and “non-mathematized” world of rich and complex phenomena: phenomena as they “appear to human consciousness”. To overcome the Cartesian opposition of subject vs. object (man versus environment), Husserl brackets the external reality of Newtonian Science (epoché = to put in brackets, to suspend judgment) and emphasizes (1) the meaning of “world” as different from the “world” of Classical Physics, and (2) the intentionality of consciousness (L. in + tendere = to tend towards, to be essentially related or connected to), which means that even before any scientific-logical description of external reality, there is always already a relation between consciousness and an external reality. The world is the equiprimordial existence of consciousness and of external reality. My paper aims to look at this new science of pre-idealized phenomena started by Husserl (a science of phenomena as they appear to conscious, human, lived experience, hence he calls it phenomenology), centering on the life-world and the intentionality of consciousness, as providing a new way of looking at ourselves and the world, in short, as providing a new metaphysics (an antidote to Cartesian metaphysics) that grounds the revolutionary findings of Einstein and Heisenberg. The environmental destruction, technocracy, and socio-ethical problems of the modern world are all rooted in this Galilean-Newtonian-Cartesian interpretation of the relationship between humans and the world after the crumbling of the European Middle Ages. Friedrich Nietzsche (1844-1900) commented that the modern world was heading toward nihilism (L. nihil = nothingness) at the turn of the century.
Now, after two World Wars and the dropping of the atomic bomb, with capitalism and imperialism on the one hand, and on the other the poverty and hunger of the non-industrialized countries alongside the destruction of nature (i.e., global warming), Nietzsche might be correct, unless humanity changes the way it looks at humanity and the kosmos. The works of Einstein, Heisenberg and Husserl seem to point the way for us humans to escape nihilism by a “great existential transformation.” What these thinkers of post-modernity (after Cartesian/Newtonian/Galilean modernity) point to are: a) a new therapeutic way of looking at ourselves and our world (metaphysics) and b) a new and corrective notion of “rationality” (different from the objectivist, mathematico-logical way of thinking). This paper is divided into four parts: 1) a summary of Classical Physics and a short history of Quantum Theory; 2) Einstein’s Special and General Relativity and Heisenberg’s Indeterminacy Principle; 3) Husserl’s discussion of the Crisis of Europe, the life-world and the intentionality of consciousness; 4) a Metaphysics of Relativity and Indeterminacy and a corrective notion of Rationality in Husserl’s Phenomenology.
This thesis is a study of the notion of time in modern physics, consisting of two parts. Part I takes seriously the doctrine that modern physics should be treated as the primary guide to the nature of time. To this end, it offers an analysis of the various conceptions of time that emerge in the context of various physical theories and, furthermore, an analysis of the relation between these conceptions of time and the more orthodox philosophical views on the nature of time. In Part II I explore the interpretation of nonrelativistic quantum mechanics in light of the suggestion that an overly Newtonian conception of time might be contributing to some of the difficulties that we face in interpreting the quantum mechanical formalism. In particular, I argue in favour of introducing backwards-in-time causal influences as part of an alternative conception of time that is consistent with the picture of reality that arises in the context of the quantum formalism. Moreover, I demonstrate that this conception of time can already be found in a particular formulation of classical mechanics. One might see that one of the central themes of Part II originates from a failure to heed properly the doctrine of Part I: study into the nature of time should be guided by modern physics and thus we should be careful not to insert a preconceived Newtonian conception of time unwittingly into our interpretation of the quantum mechanical formalism. Thus, whereas Part I is intended as a demonstration of methodology with respect to the study of time, Part II in a sense explores a confusion that can be seen as arising in the absence of this methodology.
The problem of emergence in physical theories makes it necessary to build a general theory of the relationships between the observed system and the observing system. It can be shown that there exists a correspondence between classical systems and computational dynamics according to the Shannon-Turing model. A classical system is an informationally closed system with respect to the observer; this characterizes the emergent processes in classical physics as phenomenological emergence. In quantum systems, the analysis based on computation theory fails. It is here shown that a quantum system is an informationally open system with respect to the observer, able to exhibit processes of observational, radical emergence. Finally, we take into consideration the role of computation in describing the physical world.
While many different mechanisms contribute to the generation of spatial order in biological development, the formation of morphogenetic fields which in turn direct cell responses giving rise to pattern and form are of major importance and essential for embryogenesis and regeneration. Most likely the fields represent concentration patterns of substances produced by molecular kinetics. Short range autocatalytic activation in conjunction with longer range “lateral” inhibition or depletion effects is capable of generating such patterns (Gierer and Meinhardt, 1972). Non-linear reactions are required, and mathematical criteria were derived to design molecular models capable of pattern generation. The classical embryological feature of proportion regulation can be incorporated into the models. The conditions are mathematically necessary for the simplest two-factor case, and are likely to be a fair approximation in multi-component systems in which activation and inhibition are systems parameters subsuming the action of several agents. Gradients, symmetric and periodic patterns, in one or two dimensions, stable or pulsing in time, can be generated on this basis. Our basic concept of autocatalysis in conjunction with lateral inhibition accounts for self-regulatory biological features, including the reproducible formation of structures from near-uniform initial conditions as required by the logic of the generation cycle. Real tissue form, for instance that of budding Hydra, may often be traced back to local curvature arising within an initially relatively flat cell sheet, the position of evagination being determined by morphogenetic fields. Shell theory developed for architecture may also be applied to such biological processes.
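For reference, the two-factor activator-inhibitor mechanism cited in this abstract (Gierer and Meinhardt, 1972) is commonly written as a pair of coupled nonlinear reaction-diffusion equations. The following is a standard textbook form, quoted for orientation rather than taken from the abstract itself:

```latex
% Activator a (autocatalytic, short range) and inhibitor h (long range)
\frac{\partial a}{\partial t}
  = \rho\,\frac{a^{2}}{h} - \mu_{a}\,a + D_{a}\,\nabla^{2}a
\qquad
\frac{\partial h}{\partial t}
  = \rho\,a^{2} - \mu_{h}\,h + D_{h}\,\nabla^{2}h
% Patterns emerge from near-uniform initial conditions when D_h >> D_a,
% i.e. the inhibitor diffuses much faster than the activator.
```

The condition D_h ≫ D_a expresses the short-range activation and long-range inhibition described above; the autocatalytic term a²/h supplies the required non-linearity.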
In this thesis I argue against unrestricted mereological hybridism, the view that there are absolutely no constraints on wholes having parts from many different logical or ontological categories, an exemplar of which I take to be ‘mixed fusions’. These are composite entities which have parts from at least two different categories – the membered (as in classes) and the non-membered (as in individuals). As a result, mixed fusions can also be understood to represent a variety of cross-category summation, such as the abstract with the concrete, the physical with the non-physical, and the possible with the impossible, just to name a few.

Proposed by David Lewis (1991) alongside his defence of classical mereology (the major theory of parthood which permits such transcategorial composites through its principle of unrestricted composition), it is my contention that mixed fusions are an under-examined consequence of indiscriminate mereological fusion which harbour a multitude of complications. In my attempt to discern their substantive character, throughout this thesis I make a case study of mixed fusions and uncover several problematic consequences which I think follow from their most plausible assessment.
These include: (1) that mixed fusions’ probable membership relations may lead to dubious foundational loops in the mereological Universe, or (2) otherwise that mixed fusions oblige an implausible ontological priority of the mereological Universe as a whole; (3) that mixed fusions contradict the reductive account of set theory they are proposed within, by plausibly being seen to have the same members as their class parts, and (4) that mixed fusions therefore confound a mereological thesis of Composition as Identity, which some (including Lewis) use to support classical mereology – a consequence which is potentially self-defeating; (5) that mixed fusions as sums of abstract and concrete entities both subvert Lewis’s (1986) system of modal realism, while (6) also undermining less expansive theories of possible worlds; and finally, (7) that even where some of the foregoing is resisted, it remains implausible that mixed fusions are ontologically innocent, because their supposed distinction from their parts in this case ensures that they need to be counted as additional entities in one’s ontology.

To be clear, I do not advance a theory of mereological hybrid nihilism in the sense of denying all cases of transcategorial composition. (I only cover a few select instances of mereological hybridism via mixed fusions after all.) Rather, I deny that mereological hybridism is plausible in full generality, by demonstrating that any cases of it are at least limited by the constraints that I identify. This in turn vindicates a call for a restriction on parthood theories and composition principles which allow certain types of categorially mixed entities – including restricting classical mereology with its principle of unrestricted composition.
Although theories of parthood like standard classical mereology are not ordinarily developed for the sake of mereological hybrids like mixed fusions, these and other transcategorial composites are still among the logical consequences of such parthood systems operating with sufficient generality. The significance of my thesis, then, comes from showcasing how some of these kinds of entities do not conform to the systems in which they are included as required, and hence I argue for the rejection of unrestricted mereological hybridism as well as any mereological principles which support it.
In this paper I present and critically discuss the main strategies that Bohr used and could have used to fend off the charge that his interpretation does not provide a clear-cut distinction between the classical and the quantum domain. In particular, in the first part of the paper I reassess the main arguments used by Bohr to advocate the indispensability of a classical framework to refer to quantum phenomena. In this respect, by using a distinction coming from an apparently unrelated philosophical corner, we could say that Bohr is not a revisionist philosopher of physics but rather a descriptivist one in the sense of Strawson. I will then go on to discuss the nature of the holistic link between classical measurement apparatuses and the observed system that he also advocated. The oft-repeated conclusion that Bohr’s interpretation of the quantum formalism is untenable can only be established by giving his arguments as much force as possible, which is what I will try to do in the following by remaining as faithful as possible to his published work.
As figurational sociologists and sociolinguists, we need to know that we currently find support from other fields in our efforts to construct a sociocultural science focused on interdependencies and processes, creating a multidimensional picture of human beings, one in which the brain and its mental and emotional processes are properly recognized. The paradigmatic revolutions in 20th-century physics, the contributions made by biology to our understanding of living beings, the conceptual constructions built around the theories of systems, self-organization and complexity – all these urge us to reflect on social science paradigms in the light of the great changes in these other disciplines. The application of metaphors or theoretical images of complexity and figurational sociology to the understanding of language and socio-communication phenomena is of great use, since language is not an ‘object’ but a ‘complex’; it exists simultaneously in and among different domains. ‘Languaging’ and interaction are co-phenomena. The former exists within the latter, and the latter within the former. By visualizing, for instance, the different levels of linguistic structure not as separate entities but rather as united and integrated within the same theoretical frame, by seeing their functional interdependencies, by situating them in a greater multidimensionality that includes what for a long time was considered ‘external’ – the individual and his or her mind-brain, the sociocultural system, the physical world, etc. – and expanding in this way our classical view, we should be able to make important, if not essential, theoretical and practical advances.
The thesis that follows proffers a solution to the mind-matter problem, the problem as to how mind and matter relate. The proposed solution herein is a variant of panpsychism – the theory that all (pan) has minds (psyche) – that we name pansentient monism. By defining the suffix 'psyche' of panpsychism, i.e. by analysing what 'mind' is (Chapter 1), we thereby initiate the effacement of the distinction between mind and matter, and thus advance a monism. We thereafter critically examine the prevalent view, antithetical to a pansentient monism, that mind is not identical to matter but emergent therefrom (Chapter 2). This anti-emergentist critique acts also as a fortification of the Genetic Argument for panpsychism: if mind is not emergent (nor distinct) from matter, mind must always have existed with matter. But what is 'matter'? Chapter 3 investigates what we understand by 'matter', or 'the physical', and exposes it as a highly deficient concept and percept that in concreto points to its identity with that denoted by 'mind'. This also acts as a fortification of the Abstraction Argument for panpsychism, employing a new taxonomy of physicalism and a new taxonomy of the varieties of abstraction. Thus do we reach a monism that is a parsimonious psycho-physical identity theory. But here we face what can be called The Identity Problem for Panpsychism: if our panpsychism is a psycho-physical identity theory, how can it respond to the powerful objections that beset the identity theory of the twentieth century? In Chapter 4 it will be argued that, like emergentism, this psycho-neural identity theory presupposed a deficient concept of 'matter', down to which mind was reduced away, let alone identified. But to identify down phenomena to what is actually an abstraction is to commit failure of explanation.
When the theory is amended accordingly, we move from a psycho-neural identity theory to a genuine psycho-physical identity theory that as such can overcome the aforementioned identity problem. Furthermore, as Chapter 5 clarifies, our pansentient monism has, in addition to parsimony, the explanatory power to resolve the problem of mental causation that afflicts both the reductive physicalism of psycho-neural identity theory and the non-reductive physicalism of emergentism, by genuinely identifying physical and mental causation. Jaegwon Kim considers the place of consciousness in a physical world and the nature of mental causation to be the two key components of the mind-matter problem. Through the critical analysis of our prosaic understanding of mind and matter in this thesis, which incorporates the thought of both classical and contemporary thinkers through a novel fusion, it is hoped that both components are addressed and redressed. That is to say that I present this pansentient monism as a plausible, parsimonious, explanatory, and thus, I think, powerful position towards this ever-perplexing mind-matter mystery. -/- [This thesis was passed in January 2019 with viva examination from Galen Strawson and Joel Krueger.]
Two seemingly contradictory tendencies have accompanied the development of the natural sciences in the past 150 years. On the one hand, the natural sciences have been instrumental in effecting a thoroughgoing transformation of social structures and have made a permanent impact on the conceptual world of human beings. This historical period has, on the other hand, also brought to light the merely hypothetical validity of scientific knowledge. As late as the middle of the 19th century, the truth-pathos in the natural sciences was still unbroken. Yet in the succeeding years these claims to certain knowledge underwent a fundamental crisis. For scientists today, of course, the fact that their knowledge can possess only relative validity is a matter of self-evidence. The present analysis investigates the early phase of this fundamental change in the concept of science through an examination of Hermann von Helmholtz's conception of science and his mechanistic interpretation of nature. Helmholtz (1821-1894) was one of the most important natural scientists in Germany. The development of his thought offers an impressive but, until now, relatively little considered report from the field of the experimental sciences chronicling the erosion of certainty.