Statistical mechanics is a strange theory. Its aims are debated, its methods are contested, its main claims have never been fully proven, and their very truth is challenged; yet at the same time, it enjoys huge empirical success and gives us the feeling that we understand important phenomena. What is this weird theory, exactly? Statistical mechanics is the name of the ongoing attempt to apply mechanics, together with some auxiliary hypotheses, to explain and predict certain phenomena, above all those described by thermodynamics. This paper shows what parts of this objective can be achieved with mechanics by itself. It thus clarifies what roles remain for the auxiliary assumptions that are needed to achieve the rest of the desiderata. Those auxiliary hypotheses are described in another paper in this journal, "Foundations of statistical mechanics: The auxiliary hypotheses."
I examine here whether Kant's metaphysics of matter can support any late-modern versions of classical mechanics. I argue that in principle it can, by two different routes. I assess the interpretive costs of each approach and recommend the most promising strategy: a mass-point approach.
Statistical mechanics is often taken to be the paradigm of a successful inter-theoretic reduction, which explains the high-level phenomena (primarily those described by thermodynamics) by using the fundamental theories of physics together with some auxiliary hypotheses. In my view, the scope of statistical mechanics is wider, since it is the type-identity physicalist account of all the special sciences. But in this chapter, I focus on the more traditional and less controversial domain of this theory, namely, that of explaining the thermodynamic phenomena. What are the fundamental theories that are taken to explain the thermodynamic phenomena? The lively research into the foundations of classical statistical mechanics suggests that using classical mechanics to explain the thermodynamic phenomena is fruitful. Strictly speaking, in contemporary physics, classical mechanics is considered to be false. Since it preserves certain explanatory and predictive aspects of the true fundamental theories, however, it can be successfully applied in certain cases. In other circumstances, classical mechanics has to be replaced by quantum mechanics. In this chapter I ask the following two questions: (I) How does quantum statistical mechanics differ from classical statistical mechanics? How are the well-known differences between the two fundamental theories reflected in the statistical mechanical account of high-level phenomena? (II) How does quantum statistical mechanics differ from quantum mechanics simpliciter? To make our main points I need only consider non-relativistic quantum mechanics. Most of the ideas described and addressed in this chapter hold irrespective of the choice of a (so-called) interpretation of quantum mechanics, and so I will mention interpretations only when the differences between them matter to the issue discussed.
We present an axiomatization of non-relativistic Quantum Mechanics for a system with an arbitrary number of components. The interpretation of our system of axioms is realistic and objective. The EPR paradox and its relation with realism is discussed in this framework. It is shown that there is no contradiction between realism and recent experimental results.
With the advent of quantum mechanics in the early 20th century, a great revolution took place in science. The philosophical foundations of classical physics collapsed, and controversial conceptual issues arose: can the quantum mechanical description of physical reality be considered complete? Are the objects of nature inseparable? Do objects not have a specific location before measurement, and are there non-causal quantum jumps? As time passed, not only did the controversies not diminish, but with the decline of positivism, they got more attention. This book, written in Persian, attempts to explain these issues and controversies and their philosophical foundations as simply and critically as possible for those students interested in the philosophical foundations of quantum mechanics.
This chapter looks at Euler’s relation to Newton, and at his role in the rise of ‘Newtonian’ mechanics. It aims to give a sense of Newton’s complicated legacy for Enlightenment science, and to raise awareness that some key ‘Newtonian’ results really come from Euler.
The four antinomies of Zeno of Elea continue to be provocative issues that remain relevant for the foundations of science. Aristotle used these antinomies to arrive at a deeper understanding of movement: it is a fluent continuum that he considers to be a whole. The parts, if any, are only potentially present. Similarly, quantum mechanics states that movement is quantized; things move or change in nonreducible steps, the so-called quanta. This view is in contrast to classical mechanics, where infinitesimally small steps are permitted. The objective of the present study is to show the merits of the Aristotelian approach. Examples from modern science serve to illustrate the philosophical statements.
We review a recent approach to the foundations of quantum mechanics inspired by quantum information theory. The approach is based on a general framework, which allows one to address a large class of physical theories which share basic information-theoretic features. We first illustrate two very primitive features, expressed by the axioms of causality and purity-preservation, which are satisfied by both classical and quantum theory. We then discuss the axiom of purification, which expresses a strong version of the Conservation of Information and captures the core of a vast number of protocols in quantum information. Purification is a highly non-classical feature and leads directly to the emergence of entanglement at the purely conceptual level, without any reference to the superposition principle. Supplemented by a few additional requirements, satisfied by classical and quantum theory, it provides a complete axiomatic characterization of quantum theory for finite dimensional systems.
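To make the purification axiom concrete for readers outside quantum information, here is its standard formal statement (a sketch in conventional notation; the symbols are mine, not the authors'):

% Purification: every mixed state of a system A extends to a pure state
% on a larger system AB whose marginal on A recovers it:
\forall \rho_A \;\; \exists\, |\Psi\rangle_{AB} \;:\quad
\mathrm{Tr}_B\, |\Psi\rangle\!\langle\Psi|_{AB} = \rho_A ,
% and the purification is essentially unique: any two purifications of the
% same \rho_A are related by a reversible (unitary) map on B alone,
|\Psi'\rangle_{AB} = (\mathbb{1}_A \otimes U_B)\,|\Psi\rangle_{AB} .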
We review our approach to quantum mechanics, adding some new interesting results. We start by proving two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. The latter algebra yields a proof of von Neumann's basic postulates on quantum measurement, thus explaining in an algebraic manner the wave-function collapse postulated in standard quantum theory. In this manner we reach the objective of presenting a self-consistent version of quantum mechanics. In detail, we construct a bare-bones skeleton of quantum mechanics, recovering all the basic foundations of this theory in an algebraic framework. We prove the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition, we demonstrate the well-known phenomenon of quantum Mach-Zehnder interference using the same algebraic framework, and we give an algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the time evolution of quantum systems, changes in spatial location and in momentum, and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics by using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics: in detail, we demonstrate the Kochen-Specker theorem, and we give an algebraic formulation and explanation of the EPR paradox using only the Clifford algebra. Using the same approach we also derive the Bell inequalities. Our formulation relies strongly on the idempotents contained in the Clifford algebra. Their counterpart in quantum mechanics is represented by the projection operators that, as is well known, are interpreted as logical statements, following von Neumann's basic results. Von Neumann constructed a matrix logic on the basis of quantum mechanics; using the Clifford algebra we are able to invert this result. Following results previously obtained by Orlov in 1994, we give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in the logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory: it looks from one side to "matter per se", thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: there are stages of our reality in which we can no longer separate the logic (and thus cognition, and thus conceptual entities) from the features of "matter per se". In quantum mechanics the logic, and thus cognition and cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality in which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link is established from the start between physics and conceptual entities. Finally, in this approach there is no absolute definition of logical truths: transformations, and thus "redefinitions", of truth values are permitted in this scheme, as the well-established invariance principles clearly indicate.
This paper examines the origin, range and meaning of the Principle of Action and Reaction in Kant's mechanics. On the received view, it is a version of Newton's Third Law. I argue that Kant meant his principle as a foundation for a Leibnizian mechanics. To find a 'Newtonian' law of action and reaction, we must look to Kant's 'dynamics,' or theory of matter. I begin, in Part I, by noting marked differences between Newton's and Kant's laws of action and reaction. I argue that these are explainable by Kant's allegiance to a Leibnizian mechanics. I show (in Part II) that Leibniz too had a model of action and reaction, at odds with Newton's. Then I reconstruct how Jakob Hermann and Christian Wolff received Leibniz's model. I present (in Part III) Kant's early law of action and reaction for mechanics. I show that he devised it so as to solve extant problems in the Hermann-Wolff account. I reconstruct Kant's views on 'mechanical' action and reaction in the 1780s, and highlight strong continuities with his earlier, pre-Critical stance. I use these continuities, and Kant's earlier engagement with post-Leibnizians, to explain the un-Newtonian features of his law of action and reaction.
This paper is intended to persuade an uncommitted audience that free will is illusory. I examine free will through the lens of three interpretations of quantum theory: dynamical collapse theories, hidden variable theories, and many-worlds theories. Dynamical collapse theories, hereafter called collapse theories, are the primary focus of this work, since they are the most widely accepted in the current philosophy of physics climate. The core postulations and mechanics of the collapse theories are articulated. Accompanying these postulations are a few assumptions regarding the role quantum mechanics may have in one's decisions. The postulations and assumptions together lead to the conclusion that agents are not free in the collapse theory framework. Then, I anticipate and respond to the following objections: first, that agents are at least partially free through their ability to control and change personal dispositions; second, that the psychological level is the most appropriate scale for discussions regarding freedom. Finally, I extend my argument against free will to the hidden variables and many-worlds theories.
This paper aims to assess current theoretical findings on the origin of coordination by salience and suggests a way to clarify the existing framework. The main concern is to reveal how different coordination mechanisms rely on specific epistemic aspects of reasoning. The paper highlights the fact that the basic epistemic assumptions of these theories diverge in ways that make them essentially distinct. Consequently, the recommendations and predictions of the traditional views of coordination by salience are, in principle, based on processes related to the agent's presumptions regarding the cognitive abilities of a co-player. This finding implies that we should consider these theories as complementary, not competing, explanations of the same phenomenon.
The aim of this paper is to make a step towards a complete description of the genesis and acceptance of Special Relativity, bringing some light to the intertheoretic relations between Special Relativity and other physical theories of the day. I try to demonstrate that Special Relativity and the Early Quantum Theory were created within the same programme of reconciling statistical mechanics, thermodynamics and Maxwellian electrodynamics, i.e., of eliminating the contradictions between the consequences of these theories. The approach proposed enables us to explain why classical mechanics and classical electrodynamics were "refuted" almost simultaneously or, in terms more suitable for the present discussion, why the quantum and the relativistic revolutions both took place at the beginning of the 20th century. I argue that the quantum and the relativistic revolutions were simultaneous because they had a common origin: the clash between the fundamental theories of the second half of the 19th century that constituted the "body" of Classical Physics. The revolution's most dramatic turning point was Einstein's 1905 light quantum paper, which laid the foundations of the Old Quantum Theory and influenced the fate of the special theory of relativity too. Hence, the following two main interrelated theses are defended: (1) Einstein's 1905 special relativity paper can be considered as a subprogramme of a general research programme that had its pivot in the quantum; (2) one of the reasons for Einstein's victory over Lorentz is the following: special relativity superseded Lorentz's theory when the general programme imposed itself and, in so doing, made the ether concept untenable. Key words: A. Einstein; H. Lorentz; I. Yu. Kobzarev; context of discovery; context of justification.
A description of consciousness leads to a contradiction with the postulate from special relativity that there can be no connections between simultaneous events. This contradiction points to consciousness involving quantum-level mechanisms. The quantum-level description of the universe is re-evaluated in the light of what is observed in consciousness, namely 4-dimensional objects. A new, improved interpretation of quantum-level observations is introduced. From this vantage point the following axioms of consciousness are presented. Consciousness consists of two distinct components, the observed U and the observer I. The observed U consists of all the events I is aware of. A vast majority of these occur simultaneously. Now if I were an entity within the space-time continuum, all of these events of U together with I would have to occur at one point in space-time. However, U is distributed over a definite region of space-time (a region in the brain). Thus, I is aware of a multitude of space-like separated events. It is seen that this awareness necessitates I to be an entity outside the space-time continuum. With I taken as such, a new concept, called Concept A, is introduced. With the help of Concept A a very important axiom of consciousness, namely free will, is explained. Libet's experiment, which was originally seen to contradict free will, is shown in the light of Concept A to support it. A variation on Libet's experiment is suggested that would give conclusive proof of Concept A and free will.
The emerging contemporary natural philosophy provides a common ground for an integrative view of natural, artificial, and human-social knowledge and practices. The learning process is central for acquiring, maintaining, and managing knowledge, both theoretical and practical. This paper explores the relationships between the present advances in the understanding of learning in the sciences of the artificial, the natural sciences, and philosophy. The question is what, at this stage of development, inspiration from nature, specifically its computational models such as info-computation through morphological computing, can contribute to machine learning and artificial intelligence, and how much, on the other hand, models and experiments in machine learning and robotics can motivate, justify, and inform research in computational cognitive science, neurosciences, and computing nature. We propose that one contribution can be an understanding of the mechanisms of 'learning to learn', as a step towards deep learning with a symbolic layer of computation/information processing in a framework linking connectionism with symbolism. As all natural systems possessing intelligence are cognitive systems, we describe the evolutionary arguments for the necessity of learning to learn for a system to reach human-level intelligence through evolution and development. The paper thus presents a contribution to the epistemology of the contemporary philosophy of nature.
This is the introduction to the special issue of Crítica on the metaphysics of physics, featuring papers by Valia Allori, Tim Maudlin and Gustavo Esteban Romero.
This is the first book in a two-volume series. The present volume introduces the basics of the conceptual foundations of quantum physics. It appeared first as a series of video lectures on the online learning platform Udemy. There is probably no science that is as confusing as quantum theory. There's so much misleading information on the subject that for most people it is very difficult to separate science facts from pseudoscience. The goal of this book is to enable you to separate facts from fiction with a comprehensive introduction to the scientific principles of a complex topic in which meaning and interpretation never cease to puzzle and surprise. It is an A-Z guide to the weirdness and paradoxes of quantum physics, neither too advanced nor oversimplified, explained from first principles up to modern state-of-the-art experiments, and complete with figures and graphs that illustrate the deeper meaning of concepts you are unlikely to find elsewhere. It is a guide for the autodidact or philosopher of science looking for general knowledge about quantum physics at an intermediate level, furnishing the most rigorous account that such an exposition can provide and only occasionally, in a few special chapters, resorting to mathematics that goes no further than high school level. It will save you a ton of time that you would have spent searching elsewhere, trying to piece together a variety of information. The author tried to span an 'arch of knowledge' without giving in to the temptation of offering an excessively one-sided account of the subject. What is this strange thing called quantum physics? What is its impact on our understanding of the world? What is 'reality' according to quantum physics? This book addresses these and many other questions through a step-by-step journey. The central mystery of the double-slit experiment and the wave-particle duality, the fuzzy world of Heisenberg's uncertainty principle, the weird Schrödinger's cat paradox, the 'spooky action at a distance' of quantum entanglement, the EPR paradox and much more are explained, without neglecting such main contributors as Planck, Einstein, Bohr, Feynman and others who themselves struggled to come to grips with the mysterious quantum realm. We also take a look at the experiments conducted in recent decades, such as the surprising "which-way" and "quantum-erasure" experiments. Some considerations on why and how quantum physics suggests a worldview based on philosophical idealism conclude this first volume. This treatise goes, at times, into technical details that demand some effort and therefore requires some basics of high school math (calculus, algebra, trigonometry, elementary statistics). However, the final payoff will be invaluable: your knowledge of, and grasp on, the subject of the conceptual foundations of quantum physics will be deep, wide, and outstanding. Additionally, because schools, colleges, and universities teach quantum physics using a dry, mostly technical approach which furnishes only superficial insight into its foundations, this manual is recommended for all those students, physicists or philosophers of science who would like to look beyond the mere formal aspect and delve deeper into the meaning and essence of quantum mechanics. The manual is a primer that the public deserves.
Gentzen's approach by transfinite induction and that of intuitionist Heyting arithmetic to completeness and the self-foundation of mathematics are compared and opposed to Gödel's incompleteness results for Peano arithmetic. Quantum mechanics involves infinity via Hilbert space, but it is finitist, as is any experimental science. The absence of hidden variables in it, interpretable as its completeness, should resurrect Hilbert's finitism at the cost of a relevant modification of the latter, already hinted at by intuitionism and by Gentzen's approaches to completeness. This paper investigates both the conditions and the philosophical background necessary for that modification. The main conclusion is that the concept of infinity underlying contemporary mathematics cannot be reduced to a single Peano arithmetic, but to at least two such arithmetics independent of each other. Intuitionism, quantum mechanics, Gentzen's approaches to completeness, and even Hilbert's finitism can be unified from that viewpoint. Mathematics may found itself by way of finitism complemented by choice. The concept of information as the quantity of choices underlies that viewpoint. Quantum mechanics, interpretable in terms of information and quantum information, is inseparable from mathematics and its foundation.
The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. I then assess the history of informational thermophysics during the second half of the twentieth century. Firstly, I analyse the intellectual factors that gave rise to this current in the late forties (e.g., the popularization of Shannon's theory, interest in a naturalized epistemology of science), then study its consolidation in the Brillouinian and Jaynesian programs, and finally show how Carnap (1977) and his disciples tried to criticize this tendency within the scientific community. Then, I evaluate how informational physics became a predominant intellectual current in the scientific community in the nineties, made possible by the convergence of Jaynesianism and Brillouinism in proposals such as those of Tribus and McIrvine (1971) or Bekenstein (1973) and by the application of algorithmic information theory to the thermophysical domain. As a sign of its radicality at this historical stage, I explore the main proposals to include information as part of our physical reality, such as Wheeler's (1990), Stonier's (1990) or Landauer's (1991), detailing the main philosophical arguments (e.g., Timpson, 2013; Lombardi et al. 2016a) against those inflationary attitudes towards information. Following this historical assessment, I systematically analyze whether the descriptive exploitation of informational concepts has historically contributed to providing us with knowledge of thermophysical reality via (i) explaining thermal processes such as the approach to equilibrium, (ii) advantageously predicting thermal phenomena, or (iii) enabling understanding of thermal properties such as thermodynamic entropy. I argue that these epistemic shortcomings make it impossible to draw justified ontological conclusions about the physical nature of information. In conclusion, I argue that the historical exploitation of informational concepts has not contributed significantly to the epistemic progress of thermophysics. This would lead us to characterize informational proposals as 'degenerate science' (à la Lakatos 1978a) with regard to classical thermostatistical physics, or as theoretically underdeveloped with regard to the study of the cognitive dynamics of scientists in this physical domain.
The paper justifies the following theses: the totality can found time if the latter is axiomatically represented by its "arrow" as a well-ordering; time can in turn found choice and thus information. Quantum information and its units, the quantum bits, can be interpreted as a generalization of bits to infinity, underlying the physical world as well as the ultimate substance of the world, both subjective and objective. Thus a pathway of interpretation is constructed from the totality, via time, order, choice, and information, to the substance of the world. The article is based only on well-known facts and definitions and in this sense has no premises. Nevertheless it is naturally situated among the works and ideas of Husserl and Heidegger, linked to the foundation of mathematics by the axiom of choice, and to the philosophy of quantum mechanics and information.
Cognitive Behavioral Therapy has become the dominant form of psychotherapy in North America. The CBT model is theoretically based on the idea that all external and internal stimuli are filtered through meaning-making, consciously accessible cognitive schemas. The goal of CBT is to identify dysfunctional or maladaptive thoughts and beliefs, and replace them with more adaptive cognitive interpretations. While CBT is clearly effective as a treatment, there is good reason to be skeptical that its efficacy is due to the causal mechanisms posited by the CBT model. This paper will argue that the specific cognitive schemas posited by the CBT model likely do not play a direct role in the development or treatment of psychological illness. Cognitive schemas, as identified in CBT interventions, are likely to be the result of patient confabulation and epistemically under-supported practitioner-based identification. CBT interventions appear to impose coherence on patients' psychological states, rather than identifying and modifying preexistent causally efficacious core beliefs.
In this essay, we draw on John Haugeland's work in order to argue that Burge is wrong to think that exercises of perceptual constancy mechanisms suffice for perceptual representation. Although Haugeland did not live to read or respond to Burge's Origins of Objectivity, we think that his work contains resources that can be developed into a critique of the very foundation of Burge's approach. Specifically, we identify two related problems for Burge. First, if (what Burge calls) mere sensory responses are not representational, then neither are exercises of constancy mechanisms, since the differences between them do not suffice to imply that one is representational and the other is not. Second, taken by themselves, exercises of constancy mechanisms are only derivatively representational, so merely understanding how they work is not sufficient for understanding what is required for something, in itself, to be representational (and thereby provide a full solution to the problem of perceptual representation).
The INBIOSA project brings together a group of experts across many disciplines who believe that science requires a revolutionary transformative step in order to address many of the vexing challenges presented by the world. It is INBIOSA's purpose to enable the focused collaboration of an interdisciplinary community of original thinkers. This paper sets out the case for support for this effort. The focus of the transformative research program proposal is biology-centric. We admit that biology to date has been more fact-oriented and less theoretical than physics. However, the key leverageable idea is that careful extension of the science of living systems can be more effectively applied to some of our most vexing modern problems than the prevailing scheme, derived from abstractions in physics. While these have some universal application and demonstrate computational advantages, they are not theoretically mandated for the living. A new set of mathematical abstractions derived from biology can now be similarly extended. This is made possible by leveraging new formal tools to understand abstraction and enable computability. [The latter has a much expanded meaning in our context from the one known and used in computer science and biology today, that is "by rote algorithmic means", since it is not known if a living system is computable in this sense (Mossio et al., 2009).] Two major challenges constitute the effort. The first challenge is to design an original general system of abstractions within the biological domain. The initial issue is descriptive leading to the explanatory. There has not yet been a serious formal examination of the abstractions of the biological domain. What is used today is an amalgam; much is inherited from physics (via the bridging abstractions of chemistry) and there are many new abstractions from advances in mathematics (incentivized by the need for more capable computational analyses). Interspersed are abstractions, concepts and underlying assumptions "native" to biology and distinct from the mechanical language of physics and computation as we know them. A pressing agenda should be to single out the most concrete and at the same time the most fundamental process-units in biology and to recruit them into the descriptive domain. Therefore, the first challenge is to build a coherent formal system of abstractions and operations that is truly native to living systems. Nothing will be thrown away, but many common methods will be philosophically recast, just as in physics relativity subsumed and reinterpreted Newtonian mechanics. This step is required because we need a comprehensible, formal system to apply in many domains. Emphasis should be placed on the distinction between multi-perspective analysis and synthesis and on what could be the basic terms or tools needed. The second challenge is relatively simple: the actual application of this set of biology-centric ways and means to cross-disciplinary problems. In its early stages, this will seem to be a "new science". This White Paper sets out the case of continuing support of Information and Communication Technology (ICT) for transformative research in biology and information processing centered on paradigm changes in the epistemological, ontological, mathematical and computational bases of the science of living systems. Today, curiously, living systems cannot be said to be anything more than dissipative structures organized internally by genetic information.
There is not anything substantially different from abiotic systems other than the empirical nature of their robustness. We believe that there are other new and unique properties and patterns comprehensible at this bio-logical level. The report lays out a fundamental set of approaches to articulate these properties and patterns, and is composed as follows. Sections 1 through 4 (preamble, introduction, motivation and major biomathematical problems) are incipient. Section 5 describes the issues affecting Integral Biomathics, and Section 6 the aspects of the Grand Challenge we face with this project. Section 7 contemplates the effort to formalize a General Theory of Living Systems (GTLS) from what we have today. The goal is to have a formal system, equivalent to that which exists in the physics community. Here we define how to perceive the role of time in biology. Section 8 describes the initial efforts to apply this general theory of living systems in many domains, with special emphasis on cross-disciplinary problems and multiple domains spanning both "hard" and "soft" sciences. The expected result is a coherent collection of integrated mathematical techniques. Section 9 discusses the first two test cases, project proposals, of our approach. They are designed to demonstrate the ability of our approach to address "wicked problems" which span across physics, chemistry, biology, societies and societal dynamics. The solutions require integrated measurable results at multiple levels, known as "grand challenges" to existing methods. Finally, Section 10 adheres to an appeal for action, advocating the necessity of further long-term support for the INBIOSA program. The report is concluded with a preliminary, non-exclusive list of challenging research themes to address, as well as required administrative actions. The efforts described in the ten sections of this White Paper will proceed concurrently. Collectively, they describe a program that can be managed and measured as it progresses.
This book argues that the Enlightenment was a golden age for the philosophy of body, and for efforts to integrate coherently a philosophical concept of body with a mathematized theory of mechanics. Thereby, it articulates a new framing for the history of 18th-century philosophy and science. It explains why, more than a century after Newton, physics broke away from philosophy to become an autonomous domain. And it casts fresh light on the structure and foundations of classical mechanics. Among the figures studied are Malebranche, Leibniz, Du Châtelet, Boscovich, and Kant, alongside d'Alembert, Euler, Lagrange, Laplace and Cauchy.
A non-relativistic quantum mechanical theory is proposed that describes the universe as a continuum of worlds whose mutual interference gives rise to quantum phenomena. A logical framework is introduced to properly deal with propositions about objects in a multiplicity of worlds. In this logical framework, the continuum of worlds is treated in analogy to the continuum of time points; both "time" and "world" are considered as mutually independent modes of existence. The theory combines elements of Bohmian mechanics and of Everett's many-worlds interpretation; it has a clear ontology and a set of precisely defined postulates from where the predictions of standard quantum mechanics can be derived. Probability as given by the Born rule emerges as a consequence of insufficient knowledge of observers about which world it is that they live in. The theory describes a continuum of worlds rather than a single world or a discrete set of worlds, so it is similar in spirit to many-worlds interpretations based on Everett's approach, without being actually reducible to these. In particular, there is no splitting of worlds, which is a typical feature of Everett-type theories. Altogether, the theory explains (1) the subjective occurrence of probabilities, (2) their quantitative value as given by the Born rule, and (3) the apparently random "collapse of the wavefunction" caused by the measurement, while still being an objectively deterministic theory.
I pedagogically show that the momentum operator in quantum mechanics, in the position representation, commonly known to be a derivative with respect to a spatial x-coordinate, can be derived by identifying momentum as the generator of space translations.
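For readers who want the punchline, the derivation sketched in this abstract runs along the following standard lines (my notation; conventions may differ from the paper's):

% Space translations act on position-representation wave functions as
% (T(a)\psi)(x) = \psi(x - a). Writing the translation operator in terms of
% its generator P (identified with momentum), T(a) = e^{-iaP/\hbar}, and
% expanding both sides to first order in an infinitesimal a:
\psi(x) - \frac{ia}{\hbar}\,(P\psi)(x) + O(a^{2})
  = \psi(x) - a\,\frac{\partial\psi}{\partial x} + O(a^{2}),
% so comparing the terms linear in a recovers the familiar operator
P = -\,i\hbar\,\frac{\partial}{\partial x}\,.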
The presumptions underlying quantum mechanics make it relevant to a limited range of situations only; furthermore, its statistical character means that it provides no answers to the question 'what is really going on?'. Following Barad, I hypothesise that the underlying mechanics has parallels with human activities, as used by Barad to account for the way quantum measurements introduce definiteness into previously indefinite situations. We are led to consider a subtle type of order, different from those commonly encountered in the discipline of physics, and yet comprehensible in terms of concepts considered by Barad and Yardley such as oppositional dynamics or 'intra-actions'. The emergent organisation implies that nature is no longer fundamentally meaningless. Agencies can be viewed as dynamical systems, so we are dealing with models involving interacting dynamical systems. The 'congealing of agencies' to which Barad refers can be equated to the presence of regulatory mechanisms restricting the range of possibilities open to the agencies concerned.
Gravity remains the most elusive field. Its relationship with the electromagnetic field is poorly understood. Relativity and quantum mechanics describe the aforementioned fields, respectively. Bosons and fermions are often credited with responsibility for the interactions of force and matter. It is shown here that fermions factually determine the gravitational structure of the universe, while bosons are responsible for the three established and described forces. Underlying the relationships of the gravitational and electromagnetic fields is a symmetrical probability distribution of fermions and bosons. Werner Heisenberg's assertion that the Schrödinger wave function and the Heisenberg matrices do not describe one thing is confirmed. It is asserted that the conscious observation of Schrödinger's wave function never causes its collapse, but invariably produces the classical space described by the Heisenberg picture. As a result, the Heisenberg picture can be explained and substantiated only in terms of conscious observation of the Schrödinger wave function. Schrödinger's picture is defined as information space, while Heisenberg's picture is defined as classical space. B-theory postulates that although the Schrödinger picture and the Heisenberg picture are mathematically connected, the former is eternal while the latter is discrete, existing only as a sequence of discrete conscious moments. Inferences related to information-based congruence between physical and mental phenomena have long been discussed in the literature. Moreover, John Wheeler suggested that information is fundamental to the physics of the universe. However, there is a great deal of uncertainty about how the physical and the mental complement each other. Bishop Berkeley and Ernst Mach, to name two who have addressed the subject, simply reject the concept of the material world altogether. Professor Hardy defined physical reality as 'dubious and elusive'. It is proposed in this paper that physical reality, or physical instantiation in the classical space as described by the Heisenberg picture, is one and the same thing as consciousness.
The notions of conservation and relativity lie at the heart of classical mechanics, and were critical to its early development. However, in Newton's theory of mechanics, these symmetry principles were eclipsed by domain-specific laws. In view of the importance of symmetry principles in elucidating the structure of physical theories, it is natural to ask to what extent conservation and relativity determine the structure of mechanics. In this paper, we address this question by deriving classical mechanics—both nonrelativistic and relativistic—using relativity and conservation as the primary guiding principles. The derivation proceeds in three distinct steps. First, conservation and relativity are used to derive the asymptotically conserved quantities of motion. Second, in order that energy and momentum be continuously conserved, the mechanical system is embedded in a larger energetic framework containing a massless component that is capable of bearing energy. Imposition of conservation and relativity then results, in the nonrelativistic case, in the conservation of mass and in the frame-invariance of massless energy; and, in the relativistic case, in the rules for transforming massless energy and momentum between frames. Third, a force framework for handling continuously interacting particles is established, wherein Newton's second law is derived on the basis of relativity and a staccato model of motion-change. Finally, in light of the derivation, we elucidate the structure of mechanics by classifying the principles and assumptions that have been employed according to their explanatory role, distinguishing between symmetry principles and other types of principles that are needed to build up the theoretical edifice.
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.
Born's rule, interpreting the square of the wave function as the probability of getting a specific value in a measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically, using approaches such as the frequency operator approach, many-worlds theory, Bayesian probability and envariance, the literature shows that the arguments in each of these methods are circular. In the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al., Science, 329, 418 [2010]). But a convincing analytical proof of Born's rule would make us understand the basic process responsible for the exact square dependence of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience into quantum mechanics, we prove Born's rule for the statistical interpretation of the wave function.
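For reference, the rule whose derivation is at issue can be stated compactly (standard formulation, in my notation):

% Born's rule: for a normalized state \psi and an observable with
% orthonormal eigenstates \phi_k and eigenvalues a_k, the probability of
% obtaining outcome a_k in a measurement is
P(a_k) = \left| \langle \phi_k \mid \psi \rangle \right|^{2},
% and in the position representation the probability density for finding
% the particle at x is the squared modulus of the wave function,
p(x) = |\psi(x)|^{2}.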
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate: the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities, according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic/thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
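A compact gloss on the "conditional uniqueness" claim may help (my formalization, assuming the intended density matrix is the normalized projection, which the abstract's wording suggests but does not state explicitly):

% Given the Past Hypothesis subspace \mathcal{H}_{PH} of the total Hilbert
% space, with projection operator P_{PH}, the natural initial density
% matrix is the normalized projection
\rho(t_0) = \frac{P_{PH}}{\mathrm{Tr}\,P_{PH}}
          = \frac{P_{PH}}{\dim \mathcal{H}_{PH}}\,,
% so fixing the subspace fixes \rho(t_0) uniquely: no further "typicality"
% choice among initial states is needed.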
A case study of quantum mechanics is investigated in the framework of the philosophical opposition "mathematical model versus reality". All classical science obeys the postulate of the fundamental difference between model and reality, thus distinguishing epistemology from ontology fundamentally. The theorems about the absence of hidden variables in quantum mechanics imply that it is "complete" (contrary to Einstein's opinion). That consistent completeness (unlike that of arithmetic with respect to set theory in the foundations of mathematics, in Gödel's opinion) can furthermore be interpreted as the coincidence of model and reality. The paper discusses the option and fact of that coincidence at its base: the fundamental postulate formulated by Niels Bohr about what quantum mechanics studies (unlike all classical science). Quantum mechanics involves and develops further both the identification and the disjunctive distinction of the global space of the apparatus and the local space of the investigated quantum entity as complementary to each other. This results in the analogous complementarity of model and reality in quantum mechanics. The apparatus turns out to be both absolutely "transparent" and simultaneously identically coinciding with the reflected quantum reality. Thus, the coincidence of model and reality is postulated as a necessary condition for cognition in quantum mechanics by Bohr's postulate and is further embodied in its formalism of the separable complex Hilbert space, in turn implying the theorems of the absence of hidden variables (or the "conservation of energy conservation" in quantum mechanics, which is equivalent to them). What the apparatus and the measured entity exchange cannot be energy (given their different exponents of energy), but quantum information (as a certain, unambiguously determined wave function); therefore a generalized law of conservation holds, of which the conservation of energy conservation is a corollary. In particular, the local and the global space (rigorously justified in the Standard model) share a complementarity isomorphic to that of model and reality in the foundation of quantum mechanics. On that background, one can think of the troubles of "quantum gravity" as fundamental, direct corollaries of the postulates of quantum mechanics. Gravity can be defined only as a relation, or by a pair of non-orthogonal separable complex Hilbert spaces attachable either to two "parts" or to a whole and its parts. On the contrary, all three fundamental interactions in the Standard model are "flat" and only "properties": they need only a single separable complex Hilbert space to be defined.
The universality assumption ("U") that quantum wave states only evolve by linear or unitary dynamics has led to a variety of paradoxes in the foundations of physics. U is not directly supported by empirical evidence but is rather an inference from data obtained from microscopic systems. The inference of U conflicts with empirical observations of macroscopic systems, giving rise to the century-old measurement problem and subjecting the inference of U to a higher standard of proof, the burden of which lies with its proponents. This burden remains unmet because the intentional choice by scientists to perform interference experiments that only probe the microscopic realm disqualifies the resulting data from supporting an inference that wave states always evolve linearly in the macroscopic realm. Further, the nature of the physical world creates an asymptotic size limit above which interference experiments, and verification of U in the realm in which it causes the measurement problem, seem impossible for all practical purposes even if possible in principle. This apparent natural limit serves as evidence against an inference of U, providing a further hurdle to the proponents' currently unmet burden of proof. The measurement problem should never have arisen because the inference of U is entirely unfounded, logically and empirically.
Currently there are at least four sizeable projects underway to establish the gravitational acceleration of massive antiparticles on Earth. While general relativity and modern quantum theories strictly forbid any repulsive gravity, it has not yet been established experimentally that gravity is attraction only. With that in mind, the Elementary Process Theory (EPT) is a rather abstract theory that has been developed from the hypothesis that massive antiparticles are repulsed by the gravitational field of a body of ordinary matter: the EPT essentially describes the elementary processes by which the smallest massive systems have to interact with their environments for repulsive gravity to exist. In this paper we model a nonrelativistic, one-component massive system that evolves in time by the processes described by the EPT in an environment described by classical fields: the main result is a semi-classical model of a process at the Planck scale by which a non-relativistic one-component system interacts with its environment, such that the interaction has both gravitational and electromagnetic aspects. Some worked-out examples are provided, among which the repulsion of an antineutron by the gravitational field of the earth. The general conclusion is that the semi-classical model of the EPT corresponds to non-relativistic classical mechanics. Further research is aimed at demonstrating that the EPT has a model that reproduces the successful predictions of general relativity.
The main algebraic foundations of quantum mechanics are quickly reviewed. They have been suggested from the birth of this theory up to recent years. They are the following: Heisenberg-Born-Jordan's (1925), Weyl's (1928), Dirac's (1930), von Neumann's (1936), Segal's (1947), T.F. Jordan's (1986), Morchio and Strocchi's (2009) and Buchholz and Fredenhagen's (2019). Four cases are stressed: 1) the misinterpretation of Dirac's algebraic foundation; 2) von Neumann's 'conversion' from the analytic approach of Hilbert space to the algebraic approach of the rings of operators; 3) Morchio and Strocchi's improvement of Dirac's analogy between commutators and Poisson brackets into an exact equivalence; 4) the recent foundation of quantum mechanics upon the algebra of perturbations. Some considerations on the alternating theoretical importance of the algebraic approach in the history of QM are offered. The level of formalism has increased from the mere introduction of matrices to group theory and C*-algebras, but this has not led to a definition of the foundations of physics; in particular, an algebraic formulation of QM organized as a problem-based theory and making exclusive use of constructive mathematics is still to be discovered.
This paper investigates the question of whether, and the degree to which, Newton's theory of space constitutes a third way between the traditional substantivalist and relationist ontologies, i.e., whether Newton judged that space is neither a type of substance/entity nor purely a relation among such substances. A non-substantivalist reading of Newton has been famously defended by Howard Stein, among others; but, as will be demonstrated, these claims are problematic on various grounds, especially as regards Newton's alleged rejection of the traditional substance/accident dichotomy concerning space. Nevertheless, our analysis of the metaphysical foundations of Newton's spatial theory will strive to uncover its unique and innovative characteristics, most notably, the distinctive role that Newton's "immaterialist" spatial ontology plays in his dynamics.
The ontological models framework distinguishes ψ-ontic from ψ-epistemic wave-functions. It is, in general, quite straightforward to categorize the wave-function of a certain quantum theory. Nevertheless, there has been a debate about the ontological status of the wave-function in the statistical interpretation of quantum mechanics: is it ψ-epistemic and incomplete or ψ-ontic and complete? I will argue that the wave-function in this interpretation is best regarded as ψ-ontic and incomplete.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is of limited scope, restricted to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his view of the thermodynamic concepts of heat, work, and entropy: on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
The explicit history of the "hidden variables" problem is well-known and established. The main events of its chronology are traced. An implicit context of that history is suggested. It links the problem with the "conservation of energy conservation" in quantum mechanics. Bohr, Kramers, and Slater (1924) admitted its violation as being due to the "fourth Heisenberg uncertainty", that of energy in relation to time. Wolfgang Pauli rejected the conjecture and even forecast the existence of a then-unknown elementary particle, the neutrino, on the ground of energy conservation in quantum mechanics, afterwards confirmed experimentally. Bohr recognized his defeat and Pauli's correctness: the paradigm of elementary particles (furthermore underlying the Standard model) dominates nowadays. However, the reason for energy conservation in quantum mechanics is quite different from that in classical mechanics (the Lie group of all translations in time). Even more, if the reason were the latter, Bohr, Kramers, and Slater's argument would be valid. The link between the "conservation of energy conservation" and the problem of hidden variables is the following: the former is equivalent to their absence. The same can be verified historically by the unification of Heisenberg's matrix mechanics and Schrödinger's wave mechanics into contemporary quantum mechanics by means of the separable complex Hilbert space. The Heisenberg version relies on the vector interpretation of Hilbert space, and the Schrödinger one on the wave-function interpretation. However, the two are equivalent to each other only under the additional condition that a certain well-ordering is equivalent to the corresponding ordinal number (as in von Neumann's definition of "ordinal number"). The same condition, interpreted in the proper terms of quantum mechanics, means its "unitarity", and therefore the "conservation of energy conservation". In other words, the "conservation of energy conservation" is postulated in the foundations of quantum mechanics by means of the concept of the separable complex Hilbert space, which furthermore is equivalent to postulating the absence of hidden variables in quantum mechanics (directly deducible from the properties of that Hilbert space). Further, the lesson of that unification (of Heisenberg's approach and Schrödinger's version) can be directly interpreted in terms of the unification of general relativity and quantum mechanics in the cherished "quantum gravity", as well as a "manual" of how one can do this, considering them as isomorphic to each other in a new mathematical structure corresponding to quantum information. Even more, the condition of the unification is analogous to that in the historical precedent of the unifying mathematical structure (namely the separable complex Hilbert space of quantum mechanics) and consists in the equivalence class of all smooth deformations of the pseudo-Riemannian space of general relativity: each element of that class is a wave function and vice versa. Thus, quantum mechanics can be considered as a "thermodynamic version" of general relativity, in which the universe is observed as if from "outside" (similarly to a phenomenological thermodynamic system, observable only from "outside" as a whole).
The statistical approach to that “phenomenological thermodynamics” of quantum mechanics implies Gibbs classes of equivalence of all states of the universe, furthermore representable in Boltzmann's manner, implying general relativity properly … The meta-lesson is that the historical lesson can serve for future discoveries.
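As a gloss on the “unitarity” claim in the abstract above (a standard textbook fact, added as background rather than quoted from the paper): unitary time evolution in a separable complex Hilbert space is generated by a self-adjoint Hamiltonian $H$, and the Heisenberg equation of motion then yields energy conservation directly,
\[
U(t) = e^{-iHt/\hbar}, \qquad \frac{d}{dt}\langle H \rangle = \frac{i}{\hbar}\langle [H, H] \rangle = 0 ,
\]
so postulating unitary evolution on Hilbert space is, in this sense, postulating the “conservation of energy conservation”.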
This paper presents and critically discusses the “logos approach to quantum mechanics” from the point of view of the current debates concerning the relation between metaphysics and science. Due to its alleged direct connection with the quantum formalism, the logos approach presents itself as a better alternative for understanding quantum mechanics than other available views. However, we present metaphysical and methodological difficulties that seem to point clearly to a different conclusion: the logos approach is on an equal epistemic footing with alternative realist approaches to quantum mechanics.
This invited article is a response to the paper “Quantum Misuse in Psychic Literature,” by Jack A. Mroczkowski and Alexis P. Malozemoff, published in this issue of the Journal of Near-Death Studies. While I sympathize with Mroczkowski's and Malozemoff's cause and goals, and I recognize the problem they attempted to tackle, I argue that their criticisms often overshoot the mark and end up adding to the confusion. I address nine specific technical points that Mroczkowski and Malozemoff accused popular writers in the fields of health care and parapsychology of misunderstanding and misrepresenting. I argue that, by and large, and contrary to Mroczkowski's and Malozemoff's claims, the statements made by these writers are often reasonable and generally consistent with the current state of play in the foundations of quantum mechanics.
The recent use of typicality in statistical mechanics for foundational purposes has stirred an important debate involving both philosophers and physicists. While this debate customarily focuses on technical issues, in this paper I try to approach the problem from an epistemological angle. The discussion is driven by two questions: (1) What does typicality add to the concept of measure? (2) What kind of explanation, if any, does typicality yield? By distinguishing the notions of “typicality-as-vast-majority” and “typicality-as-best-exemplar”, I argue that the former goes beyond the concept of measure. Furthermore, I also argue that typicality aims at providing us with a form of causal explanation of equilibrium.
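For readers unfamiliar with the formal notion at issue (a standard definition, supplied as background rather than drawn from the abstract): “typicality-as-vast-majority” says that a property $P$ is typical in a measure space $(\Gamma, \mu)$ when its exceptions form a vanishingly small set,
\[
\mu\bigl(\{x \in \Gamma : \neg P(x)\}\bigr) < \varepsilon, \qquad \varepsilon \ll 1 ,
\]
the epistemological question being what, beyond this measure-theoretic statement, the typicality concept contributes.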
This paper investigates the possibility of developing a fully micro-realistic version of elementary quantum mechanics. I argue that it is highly desirable to develop such a version of quantum mechanics, and that the failure of all current versions and interpretations of quantum mechanics to constitute micro-realistic theories is at the root of many of the interpretative problems associated with quantum mechanics, in particular the problem of measurement. I put forward a propensity micro-realistic version of quantum mechanics, and suggest how it might be possible to discriminate, on experimental grounds, between this theory and other versions of quantum mechanics.
In (Weaver 2021), I showed that Boltzmann's H-theorem does not face a significant threat from the reversibility paradox. I argue that my defense of the H-theorem against that paradox can be used again to resolve the recurrence paradox, without having to endorse heavy-duty statistical assumptions beyond the hypothesis of molecular chaos. As in (Weaver 2021), lessons from the history and foundations of physics reveal precisely how such a resolution is achieved.
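For context, here is the standard statement of the theorem under discussion (background only, not quoted from the paper): Boltzmann's H function for a velocity distribution $f$, and the monotonicity that the hypothesis of molecular chaos (the Stosszahlansatz) licenses, are
\[
H(t) = \int f(\vec{v}, t) \ln f(\vec{v}, t)\, d^3 v, \qquad \frac{dH}{dt} \le 0 ,
\]
with equality exactly when $f$ is the Maxwell–Boltzmann distribution. The reversibility and recurrence paradoxes both press on how this monotonic decrease can coexist with time-reversible, recurrent microdynamics.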
The purpose of this paper is to show that the mathematics of quantum mechanics is the mathematics of set partitions linearized to vector spaces, particularly Hilbert spaces. That is, the math of QM is the Hilbert space version of the math that describes objective indefiniteness, which at the set level is the math of partitions. The key analytical concepts are definiteness versus indefiniteness, distinctions versus indistinctions, and distinguishability versus indistinguishability. The key machinery for going from indefinite to more definite states is the partition join operation at the set level, which prefigures, at the quantum level, projective measurement as well as the formation of maximally definite state descriptions by Dirac's Complete Sets of Commuting Operators. This development is measured quantitatively by logical entropy at the set level and by quantum logical entropy at the quantum level. This follow-the-math approach supports the Literal Interpretation of QM, as advocated by Abner Shimony among others, which sees a reality of objective indefiniteness that is quite different from the common-sense and classical view of reality as being “definite all the way down”.
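A minimal sketch of the set-level machinery the abstract invokes (standard definitions from the logic of partitions, given as background rather than quoted from the paper): for a partition $\pi = \{B_1, \ldots, B_k\}$ of an $n$-element set, the logical entropy is the probability that two independent uniform draws are distinguished by $\pi$, and the join of two partitions consists of the nonempty intersections of their blocks:
\[
h(\pi) = 1 - \sum_{i=1}^{k} \left(\frac{|B_i|}{n}\right)^{2},
\qquad
\pi \vee \sigma = \{\, B \cap C \neq \varnothing : B \in \pi,\; C \in \sigma \,\}.
\]
Since the join refines both partitions, it can only add distinctions, so $h(\pi \vee \sigma) \ge \max\bigl(h(\pi), h(\sigma)\bigr)$; this monotone gain of definiteness is what the abstract says prefigures projective measurement.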
In this paper, possible objections to the propensity micro-realistic version of quantum mechanics proposed in Part I are answered. This version of quantum mechanics is compared with the statistical, particle micro-realistic viewpoint, and a crucial experiment is proposed, designed to distinguish between these two micro-realistic versions of quantum mechanics.
Early modern foundations for mechanics came in two kinds, nomic and material. I examine here the dynamical laws and pictures of matter given respectively by Newton, Leibniz, and Kant. I argue that they fall short of their foundational task, viz. to represent enough kinematic behavior, or at least to explain it. In effect, for the true foundations of classical mechanics we must look beyond Newton, Leibniz, and Kant.