From the exponential function of Euler’s equation to the geometry of a fundamental form, a calculation of the fine-structure constant and its relationship to the proton-electron mass ratio is given. Equations are found for the fundamental constants of the four forces of nature: electromagnetism, the weak force, the strong force and the force of gravitation. Symmetry principles are then associated with traditional physical measures.
The present crisis of foundations in Fundamental Science is manifested as a comprehensive conceptual crisis: a crisis of understanding, a crisis of interpretation and representation, a crisis of methodology, a loss of certainty. Fundamental Science "rested" on the understanding of matter, space, the nature of the "laws of nature", fundamental constants, number, time, information, consciousness. The question "What is fundamental?" pushes the mind to other questions → Is Fundamental Science fundamental? → What is the most fundamental in the Universum?.. Physics, do not be afraid of Metaphysics! Levels of fundamentality. Problem №1 of Fundamental Science is the ontological justification (basification) of mathematics. To understand is to "grasp" Structure ("La Structure mère"). Key ontological ideas for emerging from the crisis of understanding: total unification of matter across all levels of the Universum, one ontological superaxiom, one ontological superprinciple. The ontological construction method of the knowledge basis (framework, carcass, foundation). The triune (absolute, ontological) space of eternal generation of new structures and meanings. The super concept of the scientific world picture of the Information era is Ontological (structural, cosmic) memory as the "soul of matter", the measure of the being of the Universum as a holistic generating process. The result of the ontological construction of the knowledge basis: the primordial (absolute) generating structure is the most fundamental in the Universum.
Research into ancient physical structures, some having been known as the seven wonders of the ancient world, inspired new developments in the early history of mathematics. At the other end of this spectrum of inquiry the research is concerned with the minimum of observations from physical data as exemplified by Eddington's Principle. Current discussions of the interplay between physics and mathematics revive some of this early history of mathematics and offer insight into the fine-structure constant. Arthur Eddington's work leads to a new calculation of the inverse fine-structure constant giving the same approximate value as ancient geometry combined with the golden ratio structure of the hydrogen atom. The hyperbolic function suggested by Alfred Landé leads to another result, involving the Laplace limit of Kepler's equation, with the same approximate value and related to the aforementioned results. The accuracy of these results is consistent with the standard reference. Relationships between the four fundamental coupling constants are also found.
Arnold Sommerfeld introduced the fine-structure constant that determines the strength of the electromagnetic interaction. Following Sommerfeld, Wolfgang Pauli left several clues to calculating the fine-structure constant with his research on Johannes Kepler's view of nature and Pythagorean geometry. The Laplace limit of Kepler's equation in classical mechanics, the Bohr-Sommerfeld model of the hydrogen atom and Julian Schwinger's research enable a calculation of the electron magnetic moment anomaly. Considerations of fundamental lengths such as the charge radius of the proton and mass ratios suggest some further foundational interpretations of quantum electrodynamics.
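The two abstracts above both invoke the Laplace limit of Kepler's equation. As a point of reference, that limit is the unique positive root λ ≈ 0.6627434 of λ·exp(√(1+λ²))/(1+√(1+λ²)) = 1, which a short numerical sketch can recover (the bisection bracket below is an assumption for illustration, not taken from either paper):

```python
import math

def laplace_condition(x: float) -> float:
    """f(x) = x*exp(sqrt(1+x^2))/(1+sqrt(1+x^2)) - 1; the Laplace limit is its positive root."""
    s = math.sqrt(1.0 + x * x)
    return x * math.exp(s) / (1.0 + s) - 1.0

def bisect(f, lo, hi, tol=1e-12):
    """Plain bisection root-finder; assumes f(lo) < 0 < f(hi) on the bracket."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# f(0) = -1 and f(1) > 0, so [0, 1] brackets the root.
laplace_limit = bisect(laplace_condition, 0.0, 1.0)
print(f"Laplace limit ≈ {laplace_limit:.7f}")  # ≈ 0.6627434
```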
In his [1937, 1938], Paul Dirac proposed his “Large Number Hypothesis” (LNH), as a speculative law, based upon what we will call the “Large Number Coincidences” (LNCs), which are essentially “coincidences” in the ratios of about six large dimensionless numbers in physics. Dirac’s LNH postulates that these numerical coincidences reflect a deeper set of law-like relations, pointing to a revolutionary theory of cosmology. This led to substantial work, including the development of Dirac’s later [1969/74] cosmology, and other alternative cosmologies, such as the Brans-Dicke modification of GTR, and to extensive empirical tests. We may refer to the generic hypothesis of “Large Number Relations” (LNRs) as the proposal that there are lawlike relations of some kind between the dimensionless numbers, not necessarily those proposed in Dirac’s early LNH. Such relations would have a profound effect on our concepts of physics, but they remain shrouded in mystery. Although Dirac’s specific proposals for LNR theories have been largely rejected, the subject retains interest, especially among cosmologists seeking to test possible variations in fundamental constants, and to explain dark energy or the cosmological constant. In the first two sections here we briefly summarize the basic concepts of LNRs. We then introduce an alternative LNR theory, using a systematic formalism to express variable transformations between conventional measurement variables and the true variables of the theory. We demonstrate some consistency results and review the evidence for changes in the gravitational constant G. The theory adopted in the strongest tests of Ġ/G, by the Lunar Laser Ranging (LLR) experiments, assumes Ġ/G = 3(dr/dt)/r − 2(dP/dt)/P − (dm/dt)/m as a fundamental relationship. Experimental measurements show the RHS to be close to zero, so it is inferred that significant changes in G are ruled out.
However, when the relation is derived in our alternative theory it gives: Ġ/G = 3(dr/dt)/r − 2(dP/dt)/P − (dm/dt)/m − (dR/dt)/R. The extra final term (which is the Hubble constant) is not taken into account in conventional derivations. This means the LLR experiments are consistent with our LNR theory (and others), and they do not really test for a changing value of G at all. This failure to transform predictions of LNR theories correctly is a serious conceptual flaw in current experiment and theory.
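To give a sense of scale for the extra term: if the omitted −(dR/dt)/R term equals the Hubble constant, then a conventional RHS measured near zero is compatible with Ġ/G of order −H₀. A rough back-of-the-envelope conversion (the H₀ value is an assumed illustrative input, not taken from the paper):

```python
# Convert an assumed Hubble constant H0 = 70 km/s/Mpc into per-year units,
# to estimate the size of the (dR/dt)/R term omitted from the Gdot/G relation.
KM_PER_MPC = 3.0857e19       # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7   # seconds in one year

H0_km_s_Mpc = 70.0           # assumed illustrative value
H0_per_s = H0_km_s_Mpc / KM_PER_MPC
H0_per_yr = H0_per_s * SECONDS_PER_YEAR

# If the conventional RHS is measured to be ~0, the alternative relation gives
# Gdot/G ≈ -(dR/dt)/R = -H0.
Gdot_over_G = -H0_per_yr
print(f"Gdot/G ≈ {Gdot_over_G:.2e} per year")  # order -7e-11 per year
```

So the change in G implied by this LNR theory would be roughly one part in 10^10 per year, which is why the sign convention and the extra term matter for interpreting the LLR bounds.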
An introduction is given to the geometry and harmonics of the Golden Apex in the Great Pyramid, with the metaphysical and mathematical determination of the fine-structure constant of electromagnetic interactions. Newton's gravitational constant is also presented in harmonic form and other fundamental physical constants are then found related to the quintessential geometry of the Golden Apex in the Great Pyramid.
After a brief review of the golden ratio in history and our previous exposition of the fine-structure constant and equations with the exponential function, the fine-structure constant is studied in the context of other research calculating the fine-structure constant from the golden ratio geometry of the hydrogen atom. This research is extended and the fine-structure constant is then calculated in powers of the golden ratio to an accuracy consistent with the most recent publications. The mathematical constants associated with the golden ratio are also involved in both the calculation of the fine-structure constant and the proton-electron mass ratio. These constants are included in symbolic geometry of historical relevance in the science of the ancients.
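For concreteness, one published golden-ratio expression of the kind this abstract discusses is Heyrovská's approximation from the golden-ratio geometry of the hydrogen atom; it is cited here as a representative formula, not necessarily the exact one used in the paper:

```python
import math

phi = (1.0 + math.sqrt(5.0)) / 2.0   # golden ratio ≈ 1.6180339887

# Heyrovská-style approximation in powers of the golden ratio:
# 1/alpha ≈ 360/phi^2 - 2/phi^3
inverse_alpha_approx = 360.0 / phi**2 - 2.0 / phi**3

# CODATA-style reference value for comparison
inverse_alpha_measured = 137.035999

print(f"approximation: {inverse_alpha_approx:.6f}")   # ≈ 137.035628
print(f"measured:      {inverse_alpha_measured:.6f}")
```

The two values agree to about three parts in 10^6, which is the flavor of "accuracy consistent with the most recent publications" claimed above, though the paper's own expression may differ.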
The fine-structure constant, which determines the strength of the electromagnetic interaction, is briefly reviewed beginning with its introduction by Arnold Sommerfeld and also includes the interest of Wolfgang Pauli, Paul Dirac, Richard Feynman and others. Sommerfeld was very much a Pythagorean and sometimes compared to Johannes Kepler. The archetypal Pythagorean triangle has long been known as a hiding place for the golden ratio. More recently, the quartic polynomial has also been found to be a hiding place for the golden ratio. The Kepler triangle, with its golden ratio proportions, is also a Pythagorean triangle. Combining classical harmonic proportions derived from Kepler’s triangle with quartic equations determines an approximate value for the fine-structure constant that is the same as that found in our previous work with the golden ratio geometry of the hydrogen atom. These results make further progress toward an understanding of the golden ratio as the basis for the fine-structure constant.
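The Kepler-triangle claim above is easy to verify: a right triangle with legs 1 and √φ and hypotenuse φ is Pythagorean precisely because the golden ratio satisfies φ² = φ + 1. A two-line numerical check:

```python
import math

phi = (1.0 + math.sqrt(5.0)) / 2.0   # golden ratio, the positive root of x^2 = x + 1

# Kepler triangle: sides 1 : sqrt(phi) : phi satisfy the Pythagorean relation,
# since 1^2 + sqrt(phi)^2 = 1 + phi = phi^2.
lhs = 1.0**2 + math.sqrt(phi)**2     # sum of squared legs
rhs = phi**2                         # squared hypotenuse
assert abs(lhs - rhs) < 1e-12
print("1 + phi =", lhs, "= phi^2 =", rhs)
```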
The mathematical structure of realist quantum theories has given rise to a debate about how our ordinary 3-dimensional space is related to the 3N-dimensional configuration space on which the wave function is defined. Which of the two spaces is our (more) fundamental physical space? I review the debate between 3N-Fundamentalists and 3D-Fundamentalists and evaluate it based on three criteria. I argue that when we consider which view leads to a deeper understanding of the physical world, especially given the deeper topological explanation from the unordered configurations to the Symmetrization Postulate, we have strong reasons in favor of 3D-Fundamentalism. I conclude that our evidence favors the view that our fundamental physical space in a quantum world is 3-dimensional rather than 3N-dimensional. I outline lines of future research where the evidential balance can be restored or reversed. Finally, I draw lessons from this case study to the debate about theoretical equivalence.
The cosmological constant problem arises because the magnitude of the vacuum energy density predicted by Quantum Field Theory is about 120 orders of magnitude larger than the value implied by cosmological observations of accelerating cosmic expansion. We point out that the fractal nature of quantum space-time with negative Hausdorff-Colombeau dimensions can resolve this tension. Canonical Quantum Field Theory is widely believed to break down at some fundamental high-energy cutoff Λ∗, and therefore the quantum fluctuations in the vacuum can be taken seriously only up to this high-energy cutoff. In this paper we argue that Quantum Field Theory in fractal space-time with negative Hausdorff-Colombeau dimensions gives a high-energy cutoff in a natural way. We argue that there exists a hidden physical mechanism which cancels divergences in canonical QED4, QCD4, Higher-Derivative Quantum Gravity, etc. In fact we argue that the corresponding supermassive Pauli-Villars ghost fields really exist. This means that there exists a ghost-driven acceleration of the universe hidden in the cosmological constant. In order to obtain the desired physical result we apply the canonical Pauli-Villars regularization up to Λ∗. This fits the observed value of the dark energy needed to explain the accelerated expansion of the universe if we choose a highly symmetric mass distribution between standard matter and ghost matter below the scale Λ∗, i.e. one in which the effective mass distributions of standard and ghost fields match below the cutoff. The small value of the cosmological constant is explained by a tiny violation of the symmetry between standard matter and ghost matter. The nature of dark matter is also explained using a common origin of the dark energy and dark matter phenomena.
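The quoted discrepancy can be reproduced roughly by comparing the Planck-scale vacuum energy density with the observed dark-energy density. The inputs below (CODATA-style constants and a Planck-2018-style dark-energy density) are assumed illustrative values; the exact "orders of magnitude" figure depends on the cutoff chosen:

```python
import math

# Assumed illustrative constants (SI units)
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J*s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2

# Planck energy density: E_Planck / l_Planck^3 = c^7 / (hbar * G^2)
rho_planck = c**7 / (hbar * G**2)   # ~1e113 J/m^3

# Observed dark-energy density (illustrative), roughly 0.69 of the
# critical density for H0 = 70 km/s/Mpc, expressed as an energy density.
rho_lambda = 5.9e-10                # J/m^3

orders = math.log10(rho_planck / rho_lambda)
print(f"discrepancy ≈ 10^{orders:.0f}")  # ~10^123 with these inputs
```

With a Planck-scale cutoff the ratio comes out near 10^123; the commonly quoted "120 orders of magnitude" corresponds to a slightly lower cutoff.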
In a preceding publication a fundamentally oriented and irreversible world was shown to be derivable from the important principle of least action. A consequence of such a paradigm change is avoidance of paradoxes within a “dynamic” quantum physics. This becomes essentially possible because fundamental irreversibility allows consideration of the “entropy” concept in elementary processes. For this reason, and for a compensation of entropy in the spread out energy of the wave, the duality of particle and wave has to be mediated via an information self-image of matter. In this publication considerations are extended to irreversible thermodynamics, to gravitation and cosmology with its dependence on quantum interpretations. The information self-image of matter around particles could be identified with gravitation. Because information can also impose an always constant light velocity there is no need any more to attribute such a property to empty space, as done in relativity theory. In addition, the possibility is recognized to consider entropy generation by expanding photon fields in the universe. Via a continuous activation of information on matter photons can generate entropy and release small energy packages without interacting with matter. This facilitates a new interpretation of galactic redshift, emphasizes an information link between quantum and cosmological phenomena, and evidences an information-triggered origin of the universe. Self-organized processes approach maximum entropy production within their constraints. In a far from equilibrium world also information, with its energy content, can self-organize to a higher hierarchy of computation. It is here identified with consciousness. This appears to explain evolution of spirit and intelligence on a materialistic basis. Also gravitation, here identified as information on matter, could, under special conditions, self-organize to act as a super-gravitation, offering an alternative to dark matter. Time is not an illusion, but has to be understood as flux of action, which is the ultimate reality of change. The concept of an irreversible physical world opens a route towards a rational understanding of complex contexts in nature.
The golden ratio is found to be related to the fine-structure constant, which determines the strength of the electromagnetic interaction. The golden ratio and classical harmonic proportions with quartic equations give an approximate value for the inverse fine-structure constant the same as that discovered previously in the geometry of the hydrogen atom. With the former golden ratio results, relationships are also shown between the four fundamental forces of nature: electromagnetism, the weak force, the strong force, and the force of gravitation.
Modern physics describes the mechanics of the Universe. We have discovered a new foundation for physics, which explains the components of the Universe with precision and depth. We quantify the existence of Aether, subatomic particles, and the force laws. Some aspects of the theory derive from the Standard Model, but much is unique. A key discovery from this new foundation is a mathematically correct Unified Force Theory. Other fundamental discoveries follow, including the origin of the fine structure constant and subatomic particle g-factors, a slight correction of neutron magnetic moment, a geometrical structure for charge, the quantification of electromagnetic charge as separate from electrostatic charge, a more precise meaning of spin, the quantification of space-resonance in five dimensions, and a new system of quantum units. The Aether quantifies as a fabric of quantum rotating magnetic fields with electromagnetic, electrostatic, and gravitational dipole structures. Subatomic particles quantify as angular momentum encapsulated in a quantum, rotating magnetic field. All quantum, atomic, and molecular processes can be precisely modeled, leading to discrete physics with new understandings and insights.
A measurement result is never absolutely accurate: it is affected by an unknown “measurement error” which characterizes the discrepancy between the obtained value and the “true value” of the quantity intended to be measured. As a consequence, to be acceptable a measurement result cannot take the form of a unique numerical value, but has to be accompanied by an indication of its “measurement uncertainty”, which enunciates a state of doubt. What, though, is the value of measurement uncertainty? What is its numerical value: how does one calculate it? What is its epistemic value: how should one interpret a measurement result? Firstly, we describe the statistical models that scientists make use of in contemporary metrology to perform an uncertainty analysis, and we show that the issue of the interpretation of probabilities is vigorously debated. This debate brings out epistemological issues about the nature and function of physical measurements, metrologists insisting in particular on the subjective aspect of measurement. Secondly, we examine the philosophical elaboration of metrologists in their technical works, where they criticize the use of the notion of the “true value” of a physical quantity. We then challenge this elaboration and defend such a notion. The third part turns to a specific use of measurement uncertainty in order to address our thematic from the perspective of precision physics, considering the activity of the adjustment of physical constants. In the course of this activity, physicists have developed a dynamic conception of the accuracy of their measurement results, oriented towards future progress of knowledge, and underlining the epistemic virtues of a never-ending process of identification and correction of measurement errors.
I provide a comprehensive metaphysics of causation based on the idea that fundamentally things are governed by the laws of physics, and that derivatively difference-making can be assessed in terms of what fundamental laws of physics imply for hypothesized events. Highlights include a general philosophical methodology, the fundamental/derivative distinction, and my mature account of causal asymmetry.
In the beginning God created the elementary particles. Bosons, electrons, protons, quarks and the rest he created them. And they were without form and void, so God created the fundamental laws of physics - the laws of mechanics, electromagnetism, thermodynamics and the rest - and assigned values to the fundamental physical constants: the gravitational constant, the speed of light, Planck's constant and the rest. God then set the Universe in motion. And God looked at what he had done, and saw that it was physicalistically acceptable.
For a long time it was believed that it was impossible to be realist about quantum mechanics. It took quite a while for researchers in the foundations of physics, beginning with John Stewart Bell [Bell 1987], to convince others that such an alleged impossibility had no foundation. Nowadays there are several quantum theories that can be interpreted realistically, among them Bohmian mechanics, the GRW theory, and the many-worlds theory. The debate, though, is far from being over: in what respect should we be realist regarding these theories? Two different proposals have been made: on the one hand, there are those who insist on a direct ontological interpretation of the wave function as representing physical bodies, and on the other hand there are those who claim that quantum mechanics is not really about the wave function. In this paper we will present and discuss one proposal of the latter kind, which focuses on the notion of primitive ontology.
In the introduction I argue that the basic element (or primitive) for constructing the physical universe is "displacement from a prior level", and the basic structure is "a sequence of such displacements" (summarized as postulates 1 and 2). The displacements are then defined as one-dimensional objects with a direction (postulate 3). The relations between these displacements are stated in postulate 4. In section 2 we discuss basic consequences of the postulates, and in section 3 we use the postulates to derive a (3+1)-dimensional structure, interpreted as ordinary space and time. We then derive further properties of space --- isotropy, homogeneity, and a rapid early expansion (i.e. inflation). Time, comporting with experience, is shown to be a one-dimensional stream --- with a direction. In section 4 we associate energy with the displacements, and find that the same factors that construct ordinary space (and make it isotropic and homogeneous) also smear the locations of entities/particles across that space --- thereby providing a mechanism/explanation for that iconic and enigmatic aspect of quantum mechanics. We also determine that there must be a continual, uniformly-distributed stream of (non-zero-point) energy coming into the system that constructs new space (i.e. dark energy). The streaming natures of both time and dark energy are shown to have the same basic cause: the processes that input dark energy into the system, and that construct time, are themselves independent of time --- and so they are continual processes. Further consequences follow from the model, including an explanation for why the presence of energy affects space and time, and why quantum vacuum energy is an exception to this rule (i.e. does not gravitate) --- thereby eliminating the cosmological constant problem. A key benefit of the model is that it liberates us from always having to think about the construction of the universe in terms of spatio-temporal relations and evolution (e.g. 
the big bang model), which is problematic because presumably (and as we will indeed see) space and time are products of the fundamental construction process, not things that govern it.
Following the proposal of a new kind of selective structural realism that uses as a basis the distinction between framework and interaction theories, this work discusses relevant applications in fundamental physics. An ontology for the different entities and properties of well-known theories is thus consistently built. The case of classical field theories—including general relativity as a classical theory of gravitation—is examined in detail, as well as the implications of the classification scheme for issues of realism in quantum mechanics. These applications also shed light on the different range of applicability of the ontic and epistemic versions of structural realism.
This paper is a brief (and hopelessly incomplete) non-standard introduction to the philosophy of space and time. It is an introduction because I plan to give an overview of what I consider some of the main questions about space and time: Is space a substance over and above matter? How many dimensions does it have? Is space-time fundamental or emergent? Does time have a direction? Does time even exist? Nonetheless, this introduction is not standard because I conclude the discussion by presenting the material with an original spin, guided by a particular understanding of fundamental physical theories, the so-called primitive ontology approach.
The file on this site provides the slides for a lecture given in Hangzhou in May 2018, and the lecture itself is available at the URL beginning 'sms' in the set of links provided in connection with this item. It is commonly assumed that regular physics underpins biology. Here it is proposed, in a synthesis of ideas by various authors, that in reality structures and mechanisms of a biological character underpin the world studied by physicists, in principle supplying detail in the domain that according to regular physics is of an indeterminate character. In regular physics mathematical equations are primary, but this constraint leads to problems with reconciling theory and reality. Biology on the other hand typically does not characterise nature in quantitative terms, instead investigating in detail important complex interrelationships between parts, leading to an understanding of the systems concerned that is in some respects beyond that which prevails in regular physics. It makes contact with quantum physics in various ways, for example in that both involve interactions between observer and observed, an insight that explains what is special about processes involving observation, justifying in the quantum physics context the replacement of the unphysical many-worlds picture by one involving collapse. The link with biology furthermore clarifies Wheeler’s suggestion that a multiplicity of observations can lead to the ‘fabrication of form’, including the insight that this process depends on very specific ‘structures with power’ related to the 'semiotic scaffolding' of the application of sign theory to biology known as biosemiotics.
The observer-observed 'circle' of Wheeler and Yardley is a special case of a more general phenomenon, oppositional dynamics, related to the 'intra-action' of Barad's Agential Realism, involving cooperating systems such as mind and matter, abstract and concrete, observer and observed, that preserve their identities while interacting with one another in such a way as to act as a unit. A third system may also be involved, the mediating system of Peirce linking the two together. Such a situation of changing connections and separations may plausibly lead in the future to an understanding of how complex systems are able to evolve to produce 'life, the universe and everything'. (Added 1 July 2018) The general structure proposed here as an alternative to a mathematics-based physics can be usefully characterised by relating it to different disciplines and the specialised concepts utilised therein. In theoretical physics, the test for the correctness of a theory typically involves numerical predictions, corresponding to which theories are expressed in terms of equations, that is to say assertions that two quantities have identical values. Equations have a lesser significance in biology which typically talks in terms of functional mechanisms, dependent for example on details of chemistry and concepts such as genes, natural selection, signals and geometrical or topologically motivated concepts such as the interconnections between systems and the unfolding of DNA. Biosemiotics adds to this the concept of signs and their interpretation, implying novel concepts such as semiotic scaffolding and the semiosphere, code duality, and appreciation of the different types of signs, including symbols and their capacity for abstraction and use in language systems. Circular Theory adds to this picture, as do the ideas of Barad, considerations such as the idea of oppositional dynamics.
The proposals in this lecture can be regarded as the idea that concepts such as those deriving from biosemiotics have more general applicability than just conventional biology and may apply, in some circumstances, to nonlinear systems generally, including the domain new to science hypothesised to underlie the phenomena of present-day physics. The task then has to be to restore the mathematical aspect presumed, in this picture, not to be fundamental as it is in conventional theory. Deacon has invoked a complex sequence of evolutionary steps to account for the emergence over time of human language systems, and correspondingly mathematical behaviour can be subsumed under the general evolutionary mechanisms of biosemiotics (cf. also the proposals of Davis and Hersh regarding the nature of mathematics), so that the mathematical behaviour of physical systems is consistent with the proposed scheme. In conclusion, it is suggested that theoretical physicists should cease expecting to find some universal mathematical ‘theory of everything’, and focus instead on understanding in more detail complex systems exhibiting behaviour of a biological character, extending existing understanding. This may in time provide a more fruitful understanding of the natural world than does the regular approach. The essential concepts have an observational basis from both biology and the little-known discipline of cymatics (a discipline concerned with the remarkable patterns that specific waveforms can give rise to), while again computer simulations also offer promise in providing insight into the complex behaviours involved in the above proposals. References: Jesper Hoffmeyer, Semiotic Scaffolding of Living Systems. Commens, a Digital Companion to C. S. Peirce (on the Commens web site). Terrence Deacon, The Symbolic Species, W. W. Norton & Co. Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning, Duke University Press.
Philip Davis and Reuben Hersh, The Mathematical Experience, Penguin. Ilexa Yardley, Circular Theory.
In modern theoretical physics, the laws of physics are directly represented with axioms (e.g., the Dirac--Von Neumann axioms, the Wightman axioms, Newton's laws of motion, etc.). Although axioms in logic are held to be true merely by definition, the laws of physics on the other hand are entailed by laboratory measurements. This difference is sufficient to warrant a more appropriate mathematical structure than axioms to represent the laws of physics. This paper presents this structure and demonstrates its supremacy. Specifically, an optimization problem based on the entropy of all possible measurements will be introduced. As its solution yields the laws of physics, the entailment from measurements to laws is respected. The solution not only recovers the Dirac--Von Neumann axioms along with the Born rule, but further improves upon them by automatically restricting the observables to no more than the standard model group symmetry SU(3) x SU(2) x U(1), while simultaneously extending the probability measure exactly enough to support general relativity in the form of a general linear gauge theory. As the solution to an optimization problem, it is arguable that this solution constitutes, in this present sense, the optimal formulation of physics. Finally, our approach strengthens the foundation of physics with the group of all measurements as its new and sole axiom, and all “theoretical artefacts” (Born rule, probability amplitude, Hilbert space, observables, etc.) are now promoted to theorems, thus providing a rigorous deductive account for their previously postulated origin.
The purpose of this article is to investigate the role that the "miraculous" – that is, everything that goes beyond the "natural" – plays in the worldview of Western man. Methodology. I do not consider "miracles" as facts of nature, but as facts of culture, so in this article I am not talking about specific cases of violation of the "laws of nature", but about the place of the "miraculous" in the worldview of Western man and the transformations that this element of the worldview undergoes under the influence of the development of information technologies. Novelty. It has been shown that miracles should be sought in the mind, because it is our attitude towards an event that makes it "miraculous". Moreover, it is impossible to identify a "true miracle" while we are "inside" the reality. It has been demonstrated that, as society develops, representations of gods and miracles are transformed. A fundamental shift has been discovered, associated with the transition from mythology-as-faith to mythology-as-show. However, even within the latter a need for miracles remains, though it moves to a completely different level. Conclusions. The term "miracle" has no meaning outside of accepted socio-cultural settings. The latter set both the "natural" conditions and the admissibility (or inadmissibility) of their violation. And these settings are formed by social institutions which, in this age and at this time, have sufficient weight to impose their views on the whole congregation or the greater part of it. Any extraordinary event can be explained by the action of internal agents unknown to us, whose ability to alter reality does not necessarily exceed even our own capabilities. The only thing we cannot do is change the source code of being (e.g., the fundamental physical constants). This could be done only by creatures that are not spelled out in these source codes. However, people have never seen anything like this, and will scarcely become witnesses of such events.
‘Space does not exist fundamentally: it emerges from a more fundamental non-spatial structure.’ This intriguing claim appears in various research programs in contemporary physics. Philosophers of physics tend to believe that this claim entails either that spacetime does not exist, or that it is derivatively real. In this article, I introduce and defend a third metaphysical interpretation of the claim: reductionism about space. I argue that, as a result, there is no need to subscribe to fundamentality, layers of reality and emergence in order to analyse the constitution of space by non-spatial entities. It follows that space constitution, if borne out, does not provide empirical evidence in favour of a stratified, Aristotelian in spirit, metaphysics. The view will be described in relation to two particular research programs in contemporary physics: wave function realism and loop quantum gravity.
There is a standard way of interpreting physicalism: as a completeness thesis of some kind. Completeness physicalists believe there is, or in principle could be, some future physics that provides a complete explanatory or ontological basis for our universe. This provides a sense in which physics is special among the sciences, the sense in which it is fundamental. This paper contrasts this standard completeness physicalism with a more plausible maximality physicalism. Maximality physicalists believe physics is special only in providing an epistemic framework that is ontologically or explanatorily superior in some respect. The paper shows how completeness physicalists cannot, while maximality physicalists can, provide an adequate explanation of the empirical support for and the pragmatic usefulness of physicalism. It also shows how maximality physicalism is better supported in light of several developments from late twentieth-century philosophy of science.
In their recent book Every Thing Must Go, Ladyman and Ross claim: (i) physics is analytically complete, since it is the only science that cannot be left incomplete; (ii) there might not be an ontologically fundamental level; (iii) we should not admit anything into our ontology unless it has explanatory and predictive utility. In this discussion note I aim to show that the ontological commitment in (iii) implies that the completeness of no science can be achieved where no fundamental level exists. Therefore, if claim (i) requires a science to actually be complete in order to be considered physics, and if Ladyman and Ross's "tentative metaphysical hypothesis ... that there is no fundamental level" is true, then there simply is no physics. Ladyman and Ross can, however, avoid this unwanted result if they merely require physics to ever strive for completeness rather than to already be complete.
The fundamental building block of loop quantum gravity (LQG) is the spin network, which is used to quantize physical space-time. Recently, a novel notion of quantum spin has been proposed using the basic concepts of the spin network. This perspective redefines the notion of quantum spin and also introduces a novel definition of the reduced Planck constant. The implications of this perspective are not limited to quantum gravity; they are also found in quantum mechanics. Using this perspective, we also propose the quantization of the mind-stuff. The similarity between physical space-time and the space-time of the mind-stuff provides novel notions for studying space-time scientifically as well as philosophically. The comparison between physical space-time and the space-time of the mind-stuff is also studied.
The cosmic time dependencies of $G$, $\alpha$, $h$ and of Standard Model parameters like the Higgs vev and elementary particle masses are studied in the framework of a new dark energy interpretation. Due to the associated time variation of rulers, many effects turn out to be invisible. However, a rather large time dependence is claimed to arise in association with dark energy measurements, and smaller ones in connection with the Standard Model.
A synthesis of trending topics in pancomputationalism. I introduce the notion that "strange loops" engender the most atomic levels of physical reality, and introduce a mechanism for global non-locality. Written in a simple and accessible style, it seeks to draw research in fundamental physics back to realism, and to have a bit of fun in the process.
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE Vol. 5, number 1, Autumn 1991, pp. 79-87. R.M. Nugayev. The fundamental laws of physics can tell the truth. Abstract. Nancy Cartwright's arguments in favour of phenomenological laws and against fundamental ones are discussed. Her criticisms of the standard covering-law account are extended using Vyacheslav Stepin's analysis of the structure of fundamental theories. It is argued that Cartwright's thesis (that the laws of physics lie) is too radical to accept. A model of theory change is proposed which demonstrates how the fundamental laws of physics can, in fact, be confronted with experience.
Our concept of the universe and the material world is foundational for our thinking and our moral lives. In an earlier contribution to the URAM project I presented what I called 'the ultimate organizational principle' of the universe. In that article (Grandpierre 2000, pp. 12-35) I took as an adversary the wide-spread system of thinking which I called 'materialism'. According to those who espouse this way of thinking, the universe consists of inanimate units or sets of material such as atoms or elementary particles. Against this point of view on reality, I argued that it is 'logic', which exists in our inner world as a function of our mind, that is the universal organizing power of the universe. The present contribution builds upon this insight. Then I focussed on rationality; now I am interested in the responsibility that is the driving force behind our effort to find coherence and ultimate perspectives in our cosmos. It is shown that biology fundamentally differs from physics. Biology has its own fundamental principle, which was formulated for the first time in history in a scientific manner by Ervin Bauer. This fundamental principle is the cosmic life principle. I show that if one considers the physical laws as corresponding to reality, as in scientific realism, then physicalism becomes fundamentally spiritual, because the physical laws are not material. I point out that the physical laws originate from the fundamental principle of physics, which is the least action principle. I show that the fundamental principle of physics can be considered as the "instinct of atoms". Our research has found deep and meaningful connections between the basic principle of physics and the ultimate principles of the universe: matter, life and reason. Therefore, the principle of least action is not necessarily an expression of sterile inanimateness.
On the contrary, the principle of physics is related to the life principle of the universe, to the world of instincts behind the atomic world, in which the principles of physics, biology, and psychology arise from the same ultimate principle. Our research sheds new light on the sciences of physics, biology, and psychology in close relation to these basic principles. These ultimate principles have a primary importance in our understanding of the nature of Man and the Universe, together with the relations between Man and Nature, Man and the Universe. The results offer new foundations for understanding our own role on Earth, in Nature and in the Universe. Even the apparently inanimate world of physics shows itself to be animate on long timescales and to have a kind of pre-human consciousness in its basic organisation. This hypothesis offers a way to understand when and how the biological laws may direct physical laws and, moreover, offers a new perspective for studying and understanding the conditions under which self-consciousness can govern the laws of biology and physics. This point of view offers living beings and humans the possibility of strengthening our natural identity, and of recognising the wide perspective arising from having access to the deepest ranges of our own human resources and realising the task for which human and individual life has been created. (shrink)
The original conception of atomism suggests "atoms" which cannot be divided further into constituent parts. The name "atom" in physics, however, is reserved for entities which can be divided into electrons, protons, neutrons and other "elementary particles", some of which are in turn compounded of other, "more elementary" ones. Instead, quantum mechanics is grounded on the actually indivisible quanta of action limited by the fundamental Planck constant. It thus resolves the problem of how both the discrete and the continuous (even the smooth) can be described uniformly and invariantly. Quantum mechanics can be interpreted in terms of quantum information. The qubit is the indivisible unit ("atom") of quantum information. The imagery of atomism in modern physics thus moves from atoms of matter (or energy) via "atoms" (quanta) of action to "atoms" (qubits) of quantum information. This is a conceptual shift in the cognition of reality towards the terms of information, choice, and time.
A generalized and unifying viewpoint on both general relativity and quantum mechanics and information is investigated. It may be described as a generalization of the concept of reference frame from mechanics to thermodynamics: from a reference frame linked to an element of a system, and thus within it, to another reference frame linked to the whole of the system or to any of other similar systems, and thus outside it. The former is the viewpoint of general relativity; the latter is that of quantum mechanics and information. Cyclicity in the manner of Nicolas Cusanus (Nicholas of Cusa) is complemented as a fundamental and definitive property of any totality, e.g. physically, that of the universe: being the totality, it has to contain its externality within it somehow. This implies a seemingly paradoxical (in fact, paradoxical only to common sense rather than logically and mathematically) viewpoint, on which the universe is represented within itself as each single quantum of action according to the fundamental Planck constant. That approach implies the unification of gravity and entanglement, corresponding to the former or latter class of reference frames. An invariance more general than Einstein's general covariance is to be involved as to both classes of reference frames, unifying them. Its essence is the unification of the discrete and the continuous (smooth). That idea implicitly underlies quantum mechanics through Bohr's principle that it studies the system of quantum microscopic entities and the macroscopic apparatus described uniformly by the smooth equations of classical physics.
The cognition of quantum processes raises a series of questions about ordering and information connecting the states of one and the same system before and after measurement. Quantum measurement, quantum invariance and the non-locality of quantum information are considered in the paper from an epistemological viewpoint. The adequate generalization of 'measurement' is discussed so as to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. A set-theory corollary is the curious invariance to the axiom of choice: any coherent state excludes any well-ordering and thus also excludes the axiom of choice. However, the above equivalence requires it to be equated to a well-ordered set after measurement, and thus requires the axiom of choice for it to be obtainable. Quantum invariance underlies quantum information and reveals it as the relation of an unordered quantum "much" (i.e. a coherent state) and a well-ordered "many" of the measured results (i.e. a statistical ensemble). It opens up a new horizon, in which all physical processes and phenomena can be interpreted as quantum computations realizing relevant operations and algorithms on quantum information. All phenomena of entanglement can be described in terms of the so-defined quantum information. Quantum invariance elucidates the link between general relativity and quantum mechanics and thus the problem of quantum gravity. The non-locality of quantum information unifies the exact position of any space-time point of a smooth trajectory and the common possibility of all space-time points due to a quantum leap. This is deduced from quantum invariance.
Epistemology involves the relation of ordering, and thus a generalized, quantum kind of information, to explain the special features of cognition in quantum mechanics.
In seeking an answer to the question of what it means for a theory to be fundamental, it is enlightening to ask why the current best theories of physics are not generally believed to be fundamental. This reveals a set of conditions that a theory of physics must satisfy in order to be considered fundamental. Physics aspires to describe ever deeper levels of reality, which may be without end. Ultimately, at any stage we may not be able to tell whether we've reached rock bottom, or even if there is a base level – nevertheless, I draft a checklist to help us identify when to stop digging, in the case where we may have reached a candidate for a final theory. Given that the list is – according to (current) mainstream belief in high-energy physics – complete, and each criterion well-motivated, I argue that a physical theory that satisfies all the criteria can be assumed to be fundamental in the absence of evidence to the contrary (i.e., I argue that the necessary conditions are jointly sufficient for a claim of fundamentality in physics).
Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. The adequate generalization of 'measurement' is discussed so as to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. A set-theory corollary is the curious invariance to the axiom of choice: any coherent state excludes any well-ordering and thus also excludes the axiom of choice. However, it should be equated to a well-ordered set after measurement, and thus requires the axiom of choice. Quantum invariance underlies quantum information and reveals it as the relation of an unordered quantum "much" (i.e. a coherent state) and a well-ordered "many" of the measured results (i.e. a statistical ensemble). It opens up a new horizon, in which all physical processes and phenomena can be interpreted as quantum computations realizing relevant operations and algorithms on quantum information. All phenomena of entanglement can be described in terms of the so-defined quantum information. Quantum invariance elucidates the link between general relativity and quantum mechanics and thus the problem of quantum gravity.
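The contrast between an unordered quantum "much" (a coherent state) and a well-ordered "many" (the statistical ensemble of measured results) can be sketched numerically. The following toy simulation is only an illustration of that distinction, not the paper's formal apparatus: a qubit superposition is specified by amplitudes, while measurement yields an ensemble of definite outcomes whose frequencies recover the Born probabilities but lose the phase relation.

```python
import random
from collections import Counter

random.seed(0)

# A coherent qubit state a|0> + b|1>: a single unordered "much".
a, b = (1 / 2) ** 0.5, (1 / 2) ** 0.5   # equal superposition (assumed example)
probs = {0: abs(a) ** 2, 1: abs(b) ** 2}

# Measurement replaces the coherent state with a well-ordered "many":
# a statistical ensemble of definite outcomes, here simulated by sampling.
ensemble = [0 if random.random() < probs[0] else 1 for _ in range(10000)]
counts = Counter(ensemble)

# The ensemble's relative frequencies approximate the Born probabilities,
# but any phase relation between a and b is lost in the process.
print(counts[0] / len(ensemble), counts[1] / len(ensemble))
```

The simulation is classical: it reproduces only the outcome statistics, which is precisely why the ensemble carries strictly less structure than the coherent state it replaces.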
A physical model of the electron is suggested according to the basic structures of matter (BSM) hypothesis. BSM is based on an alternative concept of the physical vacuum, assuming that space contains an underlying grid structure of nodes formed of superdense subelementary particles, which are also involved in the structure of the elementary particles. The proposed grid structure is formed of vibrating nodes that possess quantum features and an energy well. It is admitted that this hypothetical structure could account for the missing "dark matter" in the universe. The signature of this dark matter is apparent in the galactic rotational curves and in the relation between the masses of the supermassive black hole in the galactic center and the host galaxy. The suggested model of the electron possesses oscillation features with anomalous magnetic moment and embedded signatures of the Compton wavelength and the fine-structure constant. The analysis of the interactions between the oscillating electron and the nodes of the vacuum grid structure allows us to obtain physical meaning for some fundamental constants.
I explore some ways in which one might base an account of the fundamental metaphysics of geometry on the mathematical theory of Linear Structures recently developed by Tim Maudlin (2010). Having considered some of the challenges facing this approach, I develop an alternative approach, according to which the fundamental ontology includes concrete entities structurally isomorphic to functions from space-time points to real numbers.
The fundamental constants that are involved in the laws of physics which describe our universe are finely-tuned for life, in the sense that if some of the constants had slightly different values life could not exist. Some people hold that this provides evidence for the existence of God. I will present a probabilistic version of this fine-tuning argument which is stronger than all other versions in the literature. Nevertheless, I will show that one can have reasonable opinions such that the fine-tuning argument doesn't lead to an increase in one's probability for the existence of God.
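The mechanics of a probabilistic fine-tuning argument can be sketched with a toy Bayesian update. All numbers below are hypothetical stand-ins, not the paper's actual priors or likelihoods; the point is only to show how the posterior for a design hypothesis depends on the likelihoods one assigns, so that "reasonable opinions" about those likelihoods can leave the posterior unchanged.

```python
# Toy Bayesian update for a fine-tuning-style argument. Every number here
# is an assumed illustration, chosen only to exhibit the update mechanics.

def posterior(prior_h: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """P(H|E) by Bayes' theorem for a binary hypothesis H and evidence E."""
    num = p_e_given_h * prior_h
    den = num + p_e_given_not_h * (1 - prior_h)
    return num / den

prior = 0.5  # agnostic prior for the design hypothesis (assumed)

# If the evidence E (life-permitting constants) is judged much likelier
# under design, the posterior rises well above the prior...
raised = posterior(prior, 0.9, 0.1)

# ...but an agent who judges the likelihoods equal, one way of holding the
# "reasonable opinions" the abstract mentions, sees no increase at all.
flat = posterior(prior, 0.5, 0.5)

print(raised, flat)  # raised exceeds the prior; flat stays at the prior
```

The entire force of the argument is thus carried by the likelihood ratio, which is exactly where the abstract locates the room for reasonable disagreement.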
In metaphysics, fundamentality is a central theme involving debates on the nature of existents as wholes. These debates are largely object-oriented in their standpoint and engage with composites or wholes through the mereological notion of compositionality. The ontological significance of the parts overrides that of the wholes, since the existence and identity of the latter are dependent on that of the former. Broadly, the candidates for fundamental entities are considered to be the elementary particles of modern physics (since they appear to play the role of ultimate parts to all phenomena). The paper intends to show the inadequacy of the object-oriented notion of conditionality by pointing out that parts and wholes possess varying conditions of existence. To allege that only the parts are ontologically significant is to conflate such conditions and to neglect the spectrum of conditions which exist in our world. A proposal for a revised notion of compositionality in terms of structural relatedness is also put forward.
In the interpretation of canonical quantum gravity (CQG), gravity appears as a geometric pseudoforce: it is reduced to spacetime geometry and becomes a simple effect of spacetime curvature. The scale at which quantum gravitational effects occur is determined by the different physical constants of fundamental physics: h, c and G, which characterize quantum, relativistic and gravitational phenomena. By combining these constants, we obtain the Planck units at which the effects of quantum gravity must manifest themselves. Loop quantum gravity attempts to unify gravity with the other three fundamental forces, starting with relativity and adding quantum traits. DOI: 10.13140/RG.2.2.10368.58889.
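The combination of h, c and G mentioned above can be made concrete. A minimal sketch, using standard CODATA-style values and the reduced constant ħ (as is conventional for Planck units), recovers the Planck length, time and mass:

```python
import math

# CODATA-style values (standard SI figures, quoted here for illustration):
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light in vacuum, m/s
G = 6.67430e-11         # Newtonian constant of gravitation, m^3 kg^-1 s^-2

# Combining the three constants yields the Planck units, which set the
# scale at which quantum-gravitational effects are expected to appear.
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 s
planck_mass = math.sqrt(hbar * c / G)       # ~2.2e-8 kg

print(f"l_P = {planck_length:.3e} m")
print(f"t_P = {planck_time:.3e} s")
print(f"m_P = {planck_mass:.3e} kg")
```

Note that each combination is fixed up to dimensionless factors by dimensional analysis alone, which is why these three constants suffice to determine the scale.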
Since the publication of David Lewis's ''New Work for a Theory of Universals,'' the distinction between properties that are fundamental – or perfectly natural – and those that are not has become a staple of mainstream metaphysics. Plausible candidates for perfect naturalness include the quantitative properties posited by fundamental physics. This paper argues for two claims: (1) the most satisfying account of quantitative properties employs higher-order relations, and (2) these relations must be perfectly natural, for otherwise the perfectly natural properties cannot play the roles in metaphysical theorizing as envisaged by Lewis.
If there are fundamental laws of nature, can they fail to be exact? In this paper, I consider the possibility that some fundamental laws are vague. I call this phenomenon 'fundamental nomic vagueness.' I characterize fundamental nomic vagueness as the existence of borderline lawful worlds and the presence of several other accompanying features. Under certain assumptions, such vagueness prevents the fundamental physical theory from being completely expressible in the mathematical language. Moreover, I suggest that such vagueness can be regarded as 'vagueness in the world.' For a case study, we turn to the Past Hypothesis, a postulate that (partially) explains the direction of time in our world. We have reasons to take it seriously as a candidate fundamental law of nature. Yet it is vague: it admits borderline (nomologically) possible worlds. An exact version would lead to an untraceable arbitrariness absent in any other fundamental laws. However, the dilemma between fundamental nomic vagueness and untraceable arbitrariness is dissolved in a new quantum theory of time's arrow.
It has been argued that the fundamental laws of physics do not face a ‘problem of provisos’ equivalent to that found in other scientific disciplines (Earman, Roberts and Smith 2002), and that there is only the appearance of exceptions to physical laws if they are confused with differential equations of evolution type (Smith 2002). In this paper I argue that even if this is true, fundamental laws in physics still pose a major challenge to standard Humean approaches to lawhood, as they are not in any obvious sense about regularities in behaviour. A Humean approach to physical laws with exceptions is possible, however, if we adopt a view of laws that takes them to be the algorithms in the algorithmic compressions of empirical data. When this is supplemented with a distinction between lossy and lossless compression, we can explain exceptions in terms of compression artefacts present in the application of the lossy laws.
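The picture of laws as algorithms in compressions of empirical data can be illustrated with a toy example. The data, the "law" and the byte counts below are all hypothetical, chosen only to exhibit the lossless/lossy contrast: lossless compression must preserve every noisy digit, whereas a lossy "law" stores only the generating algorithm and discards the residuals, which then reappear as apparent exceptions.

```python
import random
import zlib

random.seed(1)

# Data generated by a simple assumed "law" plus noise (the exceptions).
xs = list(range(200))
ys = [3 * x + 7 + random.gauss(0, 0.4) for x in xs]

# The raw record of the measurements, noise and all.
raw = ",".join(f"{y:.6f}" for y in ys).encode()

# Lossless compression: every digit, including the noise, must survive.
lossless = zlib.compress(raw, 9)

# A "lossy law": store only the algorithm y = 3x + 7 and drop the residuals.
lossy_program = b"y = 3*x + 7 for x in range(200)"

print(len(raw), len(lossless), len(lossy_program))
# The lossy description is far shorter; the discarded residuals are the
# "compression artefacts" that show up as apparent exceptions to the law.
```

The design choice matters: incompressible noise bounds how well any lossless scheme can do, so the dramatic savings come only from accepting loss, exactly the trade-off the Humean account exploits.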
It is shown by means of general principles and specific examples that, contrary to a long-standing misconception, the modern mathematical physics of compressible fluid dynamics provides a generally consistent and efficient language for describing many seemingly fundamental physical phenomena. It is shown to be appropriate for describing electric and gravitational force fields, the quantized structure of charged elementary particles, the speed of light propagation, relativistic phenomena, the inertia of matter, the expansion of the universe, and the physical nature of time. New avenues and opportunities for fundamental theoretical research are thereby illuminated.
The question of what ontological insights can be gained from the knowledge of physics (keyword: ontic structural realism) cannot obviously be separated from the view of physics as a science from an epistemological perspective. This is also visible in the debate about 'scientific realism'. This debate makes it evident, in the form of the importance of perception as a criterion for the assertion of existence in relation to the 'theoretical entities' of physics, that epistemology itself is 'ontologically laden'. This is in the form of the assumption that things (or entities) in themselves exist as such and such determined ones (independent of cognition, autonomously). This ontological assumption is not only the basis of our naïve understanding of cognition, but also its indispensable premise, insofar as this understanding is a fundamentally passive, 'receptive' one. Accordingly, just as 'perception' is the foundation, ('objective') description is the aim of cognition, that which cognition is about. In this sense, our idea of cognition and our idea of the things are inseparably linked. Without the ontological premise mentioned we just would not know what cognition is, but it is basically just a kind of image that we have in our minds (an assumption that helps us understand 'cognition'). Epistemology not only shares this basic assumption (which it also shares with metaphysics), but it revolves (unlike metaphysics) entirely around it by making the idea and demand of 'certainty' a condition of 'real' knowledge. As 'certainty' is a subjective criterion this entails the 'remodelling' of the real, holistic cognitive situation (to which metaphysics adheres) into a linear subject-object-relation (which results in the strict 'transcendence' of the objects). And it also establishes, due to its 'expertise' in matters of cognition, the 'primacy of epistemology' over all other sciences.
Now, on closer inspection, however, the expertise of epistemology seems not all that dependable, because it basically consists only of paradigms which, from the point of view of the holism of the real cognitive situation itself, are nothing more than relatively simplistic interpretations of this situation. However, we do not yet know what another conception of cognition might look like (which is not surprising given the high rank of the phenomenon of cognition in the hierarchy of phenomena according to their complexity). 'Certainty' as a criterion of cognition is thus excluded from the outset, and thus the linear relational model of cognition appears as what it is, a gross distortion of the real, holistic cognitive situation. The significance of this argumentation with regard to physics is that the linear epistemological model of cognition itself is a major obstacle to an adequate epistemological understanding of physics. This is because it is fixed 'a priori' to an object-related concept of cognition, and to 'description' as the only mode of ('real') cognition. But physics (without questioning our naïve notion of cognition on the level of epistemology) simply works past it and its basic assumptions. Its cognitive concept (alias heuristic) is fundamentally different from that of metaphysics. The acceptance of the real, holistic cognitive situation is, in my opinion, the condition for an adequate understanding of physics' heuristic access to objects, its transcendental, generalizing cognitive concept, as well as its ontological relevance and dimension of its own.
Our ordinary causal concept seems to fit poorly with how our best physics describes the world. We think of causation as a time-asymmetric dependence relation between relatively local events. Yet fundamental physics describes the world in terms of dynamical laws that are, possible small exceptions aside, time symmetric and that relate global time slices. My goal in this paper is to show why we are successful at using local, time-asymmetric models in causal explanations despite this apparent mismatch with fundamental physics. In particular, I will argue that there is an important connection between time asymmetry and locality, namely: understanding the locality of our causal models is the key to understanding why the physical time asymmetries in our universe give rise to time asymmetry in causal explanation. My theory thus provides a unified account of why causation is local and time asymmetric and thereby enables a reply to Russell's famous attack on causation.
The principle of least action, which has been so successfully applied to diverse fields of physics, looks back at three centuries of philosophical and mathematical discussions and controversies. They could not explain why nature applies the principle and why scalar energy quantities succeed in describing dynamic motion. When the least action integral is subdivided into infinitesimally small sections, each one has to maintain the ability to minimise. This, however, has the mathematical consequence that the Lagrange function at a given point of the trajectory, the dynamic, available energy generating motion, must itself have a fundamental property to minimize. Since a scalar quantity, a pure number, cannot do that, energy must fundamentally be dynamic and time oriented for a consistent understanding. It must have vectorial properties in aiming at a decrease of free energy per state (which would also allow derivation of the second law of thermodynamics). Present physics ignores that and applies variation calculus as a formal mathematical tool to impose a minimisation of assumed scalar energy quantities for obtaining dynamic motion. When, however, the dynamic property of energy is taken seriously, it is fundamental and has also to be applied to quantum processes. A consequence is that particle and wave are not equivalent, but the wave (distributed energy) follows from the former (concentrated energy). Information, provided from the beginning, an information self-image of matter, is additionally needed to recreate the particle from the wave, shaping a "dynamic" particle-wave duality. It is shown that this new concept of a "dynamic" quantum state rationally explains quantization, the double-slit experiment and quantum correlation, which has not been possible before. Some more general considerations on the link between quantum processes, gravitation and cosmological phenomena are also advanced.
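The subdivision of the action integral into small sections can be sketched numerically. The free-particle Lagrangian L = (m/2)v² used below is an assumed toy case, not the abstract's proposed dynamics; the sketch only shows the standard variational fact the discussion starts from, namely that the uniform-velocity path minimizes the discretized action against perturbations with the same endpoints.

```python
import math

# Discretized action S = sum over segments of L * dt, with L = (m/2) v^2
# for a free particle. The path is subdivided into small sections,
# echoing the subdivision of the action integral discussed above.

def action(path, dt=0.01, m=1.0):
    """Total action of a piecewise-linear path sampled at time step dt."""
    s = 0.0
    for q0, q1 in zip(path, path[1:]):
        v = (q1 - q0) / dt  # segment velocity
        s += 0.5 * m * v * v * dt
    return s

n = 100
# Straight path q(t) from 0 to 1 in unit time: uniform velocity.
straight = [i / n for i in range(n + 1)]
# Same endpoints, but perturbed in between.
wiggly = [q + 0.05 * math.sin(math.pi * i / n) for i, q in enumerate(straight)]

# The straight line minimizes the action; any perturbation with the
# same endpoints raises it.
print(action(straight), action(wiggly))
```

For the straight path each segment velocity is 1, so S = ½ · 1² · 1 = 0.5 exactly, and the sinusoidal perturbation can only add a positive quadratic term.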