I have long been working on the basic laws that govern existence, and on some mathematical problems still awaiting a solution. I count myself lucky that I was able to draw some important inferences during this time, which I published in part as propositions in a few papers. This work aims to explain and discuss these inferences together, relating them to one another through some additional material, corrections, and explanations, with physical phenomena taking priority. There are many motivating instruments for exact physical inferences.
In this paper, we review a general technique for converting the standard Lagrangian description of a classical system into a formulation that puts time on an equal footing with the system's degrees of freedom. We show how the resulting framework anticipates key features of special relativity, including the signature of the Minkowski metric tensor and the special role played by theories that are invariant under a generalized notion of Lorentz transformations. We then use this technique to revisit a classification of classical particle-types that mirrors Wigner's classification of quantum particle-types in terms of irreducible representations of the Poincaré group, including the cases of massive particles, massless particles, and tachyons. Along the way, we see gauge invariance naturally emerge in the context of classical massless particles with nonzero spin, as well as study the massless limit of a massive particle and derive a classical-particle version of the Higgs mechanism.
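As a reminder of the two relativistic structures the abstract refers to, here are the standard conventions (this block is ours, not the authors'): the Minkowski metric in mostly-plus signature, and the defining condition for a Lorentz transformation Λ, namely that it preserve that metric.

```latex
\eta_{\mu\nu} = \operatorname{diag}(-1,\,+1,\,+1,\,+1),
\qquad
\Lambda^{\mu}{}_{\alpha}\,\eta_{\mu\nu}\,\Lambda^{\nu}{}_{\beta} = \eta_{\alpha\beta}
```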
In many languages, the same particles that form quantifier words also serve as connectives, additive and scalar particles, question markers, roots of existential verbs, and so on. Do these have a unified semantics, or do they merely bear a family resemblance? Are they aided by silent operators in their varied roles―if so, which operators? I dub the particles “quantifier particles” and refer to them generically with capitalized versions of the Japanese morphemes. I argue that both MO and KA can be assigned a stable semantics across their various roles. The specific analysis I offer is motivated by the fact that MO and KA often combine with just one argument; I propose that this is their characteristic behavior. Their role is to impose semantic requirements that are satisfied when the immediately larger context is interpreted as the meet/join of their host’s semantic contribution with something else. They do not perform meet/join themselves. The obligatory vs. optional appearance of the particles depends on whether the meet/join interpretations arise by default in the given constellation. I explicate the proposal using the toolkit of basic Inquisitive Semantics.
A new version of quantum theory is proposed, according to which probabilistic events occur whenever new stationary or bound states are created as a result of inelastic collisions. The new theory recovers the experimental success of orthodox quantum theory, but differs from the orthodox theory for as yet unperformed experiments.
In many languages, the same particles build quantifier words and serve as connectives, additive and scalar particles, question markers, existential verbs, and so on. Do the roles of each particle form a natural class with a stable semantics? Are the particles aided by additional elements, overt or covert, in fulfilling their varied roles? I propose a unified analysis, according to which the particles impose partial ordering requirements (glb and lub) on the interpretations of their hosts and the immediate larger contexts, but do not embody algebraic operations themselves.
A fully micro realistic, propensity version of quantum theory is proposed, according to which fundamental physical entities - neither particles nor fields - have physical characteristics which determine probabilistically how they interact with one another. The version of quantum "smearon" theory proposed here does not modify the equations of orthodox quantum theory: rather, it gives a radically new interpretation to these equations. It is argued that there are strong general reasons for preferring quantum "smearon" theory to orthodox quantum theory; the proposed change in physical interpretation leads quantum "smearon" theory to make experimental predictions subtly different from those of orthodox quantum theory. Some possible crucial experiments are considered.
The assertion that an experiment by Afshar et al. demonstrates violation of Bohr’s Principle of Complementarity is based on the faulty assumption that which-way information in a double-slit interference experiment can be retroactively determined from a future measurement.
We propose an approach that treats information as an intermediary (or medium). This intermediary can be thought of as an information particle, which has three main properties.
Wigner’s quantum-mechanical classification of particle-types in terms of irreducible representations of the Poincaré group has a classical analogue, which we extend in this paper. We study the compactness properties of the resulting phase spaces at fixed energy, and show that in order for a classical massless particle to be physically sensible, its phase space must feature a classical-particle counterpart of electromagnetic gauge invariance. By examining the connection between massless and massive particles in the massless limit, we also derive a classical-particle version of the Higgs mechanism.
In this paper I put forward a new micro realistic, fundamentally probabilistic, propensiton version of quantum theory. According to this theory, the entities of the quantum domain - electrons, photons, atoms - are neither particles nor fields, but a new kind of fundamentally probabilistic entity, the propensiton - entities which interact with one another probabilistically. This version of quantum theory leaves the Schroedinger equation unchanged, but reinterprets it to specify how propensitons evolve when no probabilistic transitions occur. Probabilistic transitions occur when new "particles" are created as a result of inelastic interactions. All measurements are just special cases of this. This propensiton version of quantum theory, I argue, solves the wave/particle dilemma, is free of conceptual problems that plague orthodox quantum theory, recovers all the empirical success of orthodox quantum theory, and at the same time yields as yet untested predictions that differ from those of orthodox quantum theory.
The physics literature contains many claims that elementary particles have been observed: such observational claims are, of course, important for the development of existential knowledge. Regarding claimed observations of short-lived unstable particles in particular, the use of the word 'observation' is based on the convention in physics that the observation of a short-lived unstable particle can be claimed when its predicted decay products have been observed with a significance of 5 sigma. This paper, however, shows that this 5 sigma convention is inconsistent with existing concepts of observation by showing that unstable particles with a lifetime of less than 0.01 attosecond are fundamentally unobservable both from the perspective of Fox's recent concepts of direct and indirect observation, and from the perspective of Van Fraassen's notion of observability. This cognitive inaccessibility of parts of the subatomic world has far-reaching implications for physics, not the least of which is that the aforementioned convention is untenable: claims that such short-lived unstable particles have been observed will thus have to be retracted. The main implications are two incompleteness theorems for physics, respectively stating (i) that experiments cannot prove completeness of a physical theory predicting short-lived unstable particles, and (ii) that experiments cannot prove correctness of such a theory: one can at most test its empirical adequacy. On a general note, the conclusion is that the importance of philosophical arguments for particle physics is herewith demonstrated: it is, thus, a widespread misconception that philosophical arguments can be completely avoided.
Why do photons and speeding electrons have both wave features and particle features when common sense tells us that they should be either particle or wave and not an amalgam of both? Part I of this paper deals with photons and argues that there are flaws in the assumptions we have made regarding their particle nature. The argument depends upon distinguishing between two identities of the photon, namely unstored energy and its stored (relativistic) mass. Part II extends these arguments to the case of the speeding electron and argues that current ontological assumptions made about projectiles have exacerbated our confusion about the nature of moving electrons. When regarded ontologically, projectile motion is not as simple as has been assumed by both classical and modern physics.
In quantum mechanics, the wave function of an N-body system is a mathematical function defined in a 3N-dimensional configuration space. We argue that wave function realism implies particle ontology when assuming: (1) the wave function of an N-body system describes N physical entities; (2) each triple of the 3N coordinates of a point in configuration space that relates to one physical entity represents a point in ordinary three-dimensional space. Moreover, the motion of particles is random and discontinuous.
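The coordinate bookkeeping in assumption (2) can be sketched as follows. This is a minimal illustration of ours, not from the paper, and the function name is hypothetical:

```python
# Map a point in 3N-dimensional configuration space to N points in
# ordinary three-dimensional space, one per physical entity, as in
# assumption (2) above.
def configuration_to_positions(q):
    """Split a 3N-tuple of configuration-space coordinates into N (x, y, z) triples."""
    if len(q) % 3 != 0:
        raise ValueError("configuration point must have 3N coordinates")
    return [tuple(q[i:i + 3]) for i in range(0, len(q), 3)]

# Example: a two-body system lives in a 6-dimensional configuration space.
print(configuration_to_positions((0.0, 0.0, 0.0, 1.0, 2.0, 3.0)))
# → [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)]
```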
This paper puts forward the hypothesis that the distinctive features of quantum statistics are exclusively determined by the nature of the properties it describes. In particular, all statistically relevant properties of identical quantum particles in many-particle systems are conjectured to be irreducible, ‘inherent’ properties only belonging to the whole system. This allows one to explain quantum statistics without endorsing the ‘Received View’ that particles are non-individuals, or postulating that quantum systems obey peculiar probability distributions, or assuming that there are primitive restrictions on the range of states accessible to such systems. With this, the need for an unambiguously metaphysical explanation of certain physical facts is acknowledged and satisfied.
Throughout this paper, we try, in a nutshell, to show a way to check Fuzzy time in general, and the Fuzzy time-Particle interpretation of Quantum Mechanics in particular, experimentally.
An analysis of the physical implications of abstractness reveals the reality of three interconnected modes of existence: abstract, virtual and concrete, corresponding in physics to information, energy and matter. This triple-aspect monism clarifies the ontological status of subatomic quantum particles. It also provides a non-spooky solution to the weirdness of quantum physics and a new outlook for the mind-body problem. The ontological implications are profound for both physics and philosophy.
The major point in [1] chapter 2 is the following claim: “Any formalized system for the Theory of Computation based on Classical Logic and Turing Model of Computation leads us to a contradiction.” So, if we wish to save Classical Logic, we should change our Computational Model. As we see in chapter two, the mentioned contradiction is about and around the concept of time, as in the contradiction of the modified version of the paradox. It is natural to try to construct the paradox not with time but with some other linear ordering, or with the concept of space. Interestingly, attempts to obtain a similar contradiction from other concepts, such as space and linear orderings, fail. It is remarkable that the paradox has traditionally been considered either epistemological or logical, but on the new considerations the new version of the paradox should be considered either a logical or a physical paradox. Hence, in order to change our Computational Model, it is natural to change the concept of time, but how? We start from some models that differ from the classical one but are intuitively plausible. The idea of the model was to some extent introduced by Brouwer and Husserl [3]. This model does not refute the paradox, since the paradox and the associated contradiction would be repeated in this new model. The model is introduced in [2]. Here we give some further explanations.
Rutherford’s α-particle scattering experiment was a milestone for the physics community, as it provided insight into the structure of the atom and thereby discarded the previously prevailing Thomson model. In this article we examine the theoretical formulation of Rutherford’s experiment and how it helped shape modern physics.
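For reference, the key quantitative result behind the experiment's interpretation is Rutherford's differential cross-section for Coulomb scattering (standard textbook form in SI units, added here for context; E is the kinetic energy of the incident α-particle, θ the scattering angle, and Z₁e, Z₂e the charges of projectile and nucleus):

```latex
\frac{d\sigma}{d\Omega}
= \left(\frac{Z_{1} Z_{2} e^{2}}{16\pi\varepsilon_{0} E}\right)^{2}
  \frac{1}{\sin^{4}(\theta/2)}
```

The \(\sin^{-4}(\theta/2)\) dependence, with its appreciable rate of large-angle deflections, is precisely what Thomson's diffuse-charge model could not reproduce.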
According to Steiner (1998), in contemporary physics new important discoveries are often obtained by means of strategies which rely on purely formal mathematical considerations. In such discoveries, mathematics seems to have a peculiar and controversial role, which apparently cannot be accounted for by means of standard methodological criteria. M. Gell-Mann and Y. Ne'eman's prediction of the Ω− particle is usually considered a typical example of application of this kind of strategy. According to Bangu (2008), this prediction is apparently based on the employment of a highly controversial principle—what he calls the “reification principle”. Bangu himself takes this principle to be methodologically unjustifiable, but still indispensable to make the prediction logically sound. In the present paper I will offer a new reconstruction of the reasoning that led to this prediction. By means of this reconstruction, I will show that we do not need to postulate any “reificatory” role of mathematics in contemporary physics and I will contextually clarify the representative and heuristic role of mathematics in science.
This paper discusses the issue of the identity and individuality (or lack thereof) of quantum mechanical particles. It first reconstructs, on the basis of the extant literature, a general argument in favour of the conclusion that such particles are not individual objects. Then, it critically assesses each one of the argument’s premises. The upshot is that, in fact, there is no compelling reason for believing that quantum particles are not individual objects.
Photons deliver their energy and momentum to a point on a material target. It is commonplace to attribute this to particle impact. But since the in-flight photon also has a wave nature, we are stuck with the paradox of wave-particle duality. It is argued here that the photon’s wave nature is indisputable, but its particle nature is open to question. Photons deliver energy. The problem with invoking impact as a means of delivery is that energy becomes a payload which in turn requires a particle. This assumes that energy is always a payload and there is but one mode of energy delivery; surely two unsupported assumptions. It should be possible to explain photon termination without invoking particle impact. One approach offered here is to question the assumption that the photon is a unitary object. Perhaps the photon has two linked-but-distinct identities: one supporting wave behavior and the other supporting discrete behavior. It is the latter that might imitate particle impact.
According to our understanding of the everyday physical world, observable phenomena are underpinned by persistent objects that can be reidentified across time by observation of their distinctive properties. This understanding is reflected in classical mechanics, which posits that matter consists of persistent, reidentifiable particles. However, the mathematical symmetrization procedures used to describe identical particles within the quantum formalism have led to the widespread belief that identical quantum particles lack either persistence or reidentifiability. Yet it has proved difficult to reconcile these assertions with the fact that identical particles are routinely assumed to be reidentifiable in particular circumstances. For example, when two electrons move through a bubble chamber, each is said to generate a sequence of bubbles that is caused by one and the same particle. Moreover, neither of these assertions accounts for the mathematical form of the symmetrization procedures used to describe identical particles within the quantum framework, leaving open theoretical possibilities other than bosonic and fermionic behavior, such as paraparticles, which do not appear to be realized in nature. Here we propose the novel idea that both persistence and nonpersistence models must be employed in order to fully account for the behaviour of identical particles. Thus, identical particles are neither persistent nor nonpersistent. We prove the viability of this viewpoint by showing how Feynman's and Dirac's symmetrization procedures arise through a synthesis of a quantum treatment of these models, and by showing how reidentifiability emerges in a context-dependent manner.
We further show that the persistence and nonpersistence models satisfy the key characteristics of Bohr's concept of complementarity, and thereby propose that the behavior of identical particles is a manifestation of a persistence-nonpersistence complementarity, analogous to Bohr's wave-particle complementarity for individual particles. Finally, we construct a precise parallel between these two complementarities, and detail their conceptual similarities and dissimilarities.
The cosmic time dependencies of $G$, $\alpha$, $h$ and of Standard Model parameters like the Higgs vev and elementary particle masses are studied in the framework of a new dark energy interpretation. Due to the associated time variation of rulers, many effects turn out to be invisible. However, a rather large time dependence is claimed to arise in association with dark energy measurements, and smaller ones in connection with the Standard Model.
The properties of angular momentum and its connection to magnetic momentum are explored, based on a reconsideration of the Stern-Gerlach experiment and gauge invariance. A possible way to solve the so-called spin crisis is proposed. The separation of angular momentum of a quantum system of particles into orbital angular momentum plus intrinsic angular momentum is reconsidered, within the limits of the Schrödinger theory. A proof is given that, for systems of more than two particles, unless all of them have the same mass, the possibility of having eigenvalues of the form (n + 1/2)ħ is not excluded.
Experiments in particle physics have hitherto failed to produce any significant evidence for the many explicit models of physics beyond the Standard Model (BSM) that had been proposed over the past decades. As a result, physicists have increasingly turned to model-independent strategies as tools in searching for a wide range of possible BSM effects. In this paper, we describe the Standard Model Effective Field Theory (SM-EFT) and analyse it in the context of the philosophical discussions about models, theories, and (bottom-up) effective field theories. We find that while the SM-EFT is a quantum field theory, assisting experimentalists in searching for deviations from the SM, in its general form it lacks some of the characteristic features of models. Those features only come into play if put in by hand or prompted by empirical evidence for deviations. Employing different philosophical approaches to models, we argue that the case study suggests not to take a view on models that is overly permissive because it blurs the lines between the different stages of the SM-EFT research strategies and glosses over particle physicists' motivations for undertaking this bottom-up approach in the first place. Looking at EFTs from the perspective of modelling does not require taking a stance on some specific brand of realism or taking sides in the debate between reduction and emergence into which EFTs have recently been embedded.
The Higgs mechanism is an essential but elusive component of the Standard Model of particle physics. Without it Yang‐Mills gauge theories would have been little more than a warm‐up exercise in the attempt to quantize gravity rather than serving as the basis for the Standard Model. This article focuses on two problems related to the Higgs mechanism clearly posed in Earman’s recent papers (Earman 2003, 2004a, 2004b): what is the gauge‐invariant content of the Higgs mechanism, and what does it mean to break a local gauge symmetry?
We address a long-standing debate over whether classical magnetic forces can do work, ultimately answering the question in the affirmative. In detail, we couple a classical particle with intrinsic spin and elementary dipole moments to the electromagnetic field, derive the appropriate generalization of the Lorentz force law, show that the particle’s dipole moments must be collinear with its spin axis, and argue that the magnetic field does mechanical work on the particle’s elementary magnetic dipole moment. As consistency checks, we calculate the overall system’s energy-momentum and angular momentum, and show that their local conservation equations lead to the same force law and therefore the same conclusions about magnetic forces and work. We also compute the system’s Belinfante–Rosenfeld energy–momentum tensor.
Mereological nihilism is the philosophical position that there are no items that have parts. If there are no items with parts then the only items that exist are partless fundamental particles, such as the true atoms (also called philosophical atoms) theorized to exist by some ancient philosophers, some contemporary physicists, and some contemporary philosophers. With several novel arguments I show that mereological nihilism is the correct theory of reality. I will also discuss strong similarities that mereological nihilism has with empirical results in quantum physics. And I will discuss how mereological nihilism vindicates a few other theories, such as a very specific theory of philosophical atomism, which I will call quantum abstract atomism. I will show that mereological nihilism also is an interpretation of quantum mechanics that avoids the problems of other interpretations, such as the widely known, metaphysically generated, quantum paradoxes of quantum physics, which ironically are typically accepted as facts about reality. I will also show why it is very surprising that mereological nihilism is not a widely held theory, and not the premier theory in philosophy.
This article is a rebuttal to Robert G. Cavin and Carlos A. Colombetti’s article, “Assessing the Resurrection Hypothesis: Problems with Craig’s Inference to the Best Explanation,” which argues that the Standard Model of current particle physics entails that non-physical things (like a supernatural God or a supernaturally resurrected body) can have no causal contact with the physical universe. As such, they argue that William Lane Craig’s resurrection hypothesis is not only incompatible with the notion of Jesus physically appearing to the disciples, but the resurrection hypothesis is significantly limited in both its explanatory scope and explanatory power. This article seeks to demonstrate why their use of the Standard Model does not logically entail a rejection of the physical resurrection of Jesus when considering the scope and limitations of science itself.
The paper addresses the problem that quantum mechanics in fact resolves. Its viewpoint suggests that the crucial link of time and its course is omitted in understanding the problem. The common interpretation, underlain by the history of quantum mechanics, sees discreteness only on the Planck scale, which is transformed into continuity and even smoothness on the macroscopic scale. That approach is fraught with a series of seeming paradoxes. It suggests that the present mathematical formalism of quantum mechanics is only partly relevant to its problem, which is ostensibly known. The paper accepts just the opposite: the mathematical solution is absolutely relevant and serves as an axiomatic base, from which the real and yet hidden problem is deduced. Wave-particle duality, Hilbert space, both probabilistic and many-worlds interpretations of quantum mechanics, quantum information, and the Schrödinger equation are included in that base. The Schrödinger equation is understood as a generalization of the law of energy conservation to past, present, and future moments of time. The deduced real problem of quantum mechanics is: “What is the universal law describing the course of time in any physical change, therefore including any mechanical motion?”
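The equation being reinterpreted is the time-dependent Schrödinger equation, in its standard form (added here for reference; the reading of it as an energy-conservation law extended over past, present, and future is the paper's, not ours):

```latex
i\hbar\,\frac{\partial}{\partial t}\,\Psi(\mathbf{r}, t) = \hat{H}\,\Psi(\mathbf{r}, t)
```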
To this day, a hundred and fifty years after Mendeleev's discovery, the overall structure of the periodic system remains unaccounted for in quantum-mechanical terms. Given this dire situation, a handful of scientists in the 1970s embarked on a quest for the symmetries that lie hidden in the periodic table. Their goal was to explain the table's structure in group-theoretical terms. We argue that this symmetry program required an important paradigm shift in the understanding of the nature of chemical elements. The idea, in essence, consisted of treating the chemical elements, not as particles, but as states of a superparticle. We show that the inspiration for this came from elementary particle physics, and in particular from Heisenberg's suggestion to treat the proton and neutron as different states of the nucleon. We provide a careful study of Heisenberg's last paper on the nature of elementary particles, and explain why the Democritean picture of matter no longer applied in modern physics and a Platonic symmetry-based picture was called for instead. We show how Heisenberg's Platonic philosophy came to dominate the field of elementary particle physics, and how it found its culmination point in Gell-Mann's classification of the hadrons in the eightfold way. We argue that it was the success of Heisenberg's approach in elementary particle physics that sparked the group-theoretical approach to the periodic table. We explain how it was applied to the set of chemical elements via a critical examination of the work of the Russian mathematician Abram Ilyich Fet and the Turkish-American physicist Asim Orhan Barut, before giving some final reflections.
In my 2013 article, “A New Theory of Free Will”, I argued that several serious hypotheses in philosophy and modern physics jointly entail that our reality is structurally identical to a peer-to-peer (P2P) networked computer simulation. The present paper outlines how quantum phenomena emerge naturally from the computational structure of a P2P simulation. §1 explains the P2P Hypothesis. §2 then sketches how the structure of any P2P simulation realizes quantum superposition and wave-function collapse (§2.1.), quantum indeterminacy (§2.2.), wave-particle duality (§2.3.), and quantum entanglement (§2.4.). Finally, §3 argues that although this is by no means a philosophical proof that our reality is a P2P simulation, it provides ample reasons to investigate the hypothesis further using the methods of computer science, physics, philosophy, and mathematics.
In this paper I outline my propensiton version of quantum theory (PQT). PQT is a fully micro-realistic version of quantum theory that provides us with a very natural possible solution to the fundamental wave/particle problem, and is free of the severe defects of orthodox quantum theory (OQT) as a result. PQT makes sense of the quantum world. PQT recovers all the empirical success of OQT and is, furthermore, empirically testable (although not as yet tested). I argue that Einstein almost put forward this version of quantum theory in 1916/17 in his papers on spontaneous and induced radiative transitions, but retreated from doing so because he disliked the probabilistic character of the idea. Subsequently, the idea was overlooked because debates about quantum theory polarised into the Bohr/Heisenberg camp, which argued for the abandonment of realism and determinism, and the Einstein/Schrödinger camp, which argued for the retention of realism and determinism, no one, as a result, pursuing the most obvious option of retaining realism but abandoning determinism. It is this third, overlooked option that leads to PQT. PQT has implications for quantum field theory, the standard model, string theory, and cosmology. The really important point, however, is that it is experimentally testable. I indicate two experiments in principle capable of deciding between PQT and OQT.
We investigate the meaning of the wave function by analyzing the mass and charge density distributions of a quantum system. According to protective measurement, a charged quantum system has effective mass and charge density distributing in space, proportional to the square of the absolute value of its wave function. In a realistic interpretation, the wave function of a quantum system can be taken as a description of either a physical field or the ergodic motion of a particle. The essential difference between a field and the ergodic motion of a particle lies in the property of simultaneity; a field exists throughout space simultaneously, whereas the ergodic motion of a particle exists throughout space in a time-divided way. If the wave function is a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist gravitational and electrostatic self-interactions of its wave function. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Thus the wave function cannot be a description of a physical field but a description of the ergodic motion of a particle. For the latter there is only a localized particle with mass and charge at every instant, and thus there will not exist any self-interaction for the wave function. Which kind of ergodic motion of particles, then? It is argued that the classical ergodic models, which assume continuous motion of particles, cannot be consistent with quantum mechanics. Based on the negative result, we suggest that the wave function is a description of the quantum motion of particles, which is random and discontinuous in nature. On this interpretation, the square of the absolute value of the wave function not only gives the probability of the particle being found in certain locations, but also gives the probability of the particle being there.
We show that this new interpretation of the wave function provides a natural realistic alternative to the orthodox interpretation, and its implications for other realistic interpretations of quantum mechanics are also briefly discussed.
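The claim that effective charge density is proportional to |ψ|² can be illustrated with a toy one-dimensional discretization. This sketch is our own, not from the paper, and all names and grid parameters are hypothetical; the point is simply that for a normalized wave function the density Q|ψ(x)|² integrates to the total charge Q.

```python
import math

def charge_density(psi, dx, charge):
    """Discretized effective density Q|psi|^2 and its integral over the grid."""
    rho = [charge * abs(a) ** 2 for a in psi]
    total = sum(r * dx for r in rho)
    return rho, total

# Normalized Gaussian wave function psi(x) = pi^(-1/4) exp(-x^2/2),
# sampled on [-10, 10]; |psi|^2 integrates to ~1, so the density
# integrates to approximately the total charge Q.
dx = 0.01
xs = [i * dx for i in range(-1000, 1001)]
psi = [math.pi ** -0.25 * math.exp(-x * x / 2) for x in xs]
rho, total = charge_density(psi, dx, charge=-1.0)  # total is close to -1.0
```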
We show that the physical meaning of the wave function can be derived based on the established parts of quantum mechanics. It turns out that the wave function represents the state of random discontinuous motion of particles, and its modulus square determines the probability density of the particles appearing in certain positions in space.
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, and the infinite particles case.
The theoretical prediction of the Higgs boson was arguably one of the most important contributions in particle physics in the 20th century, with significant implications for modern cosmology. Its reported discovery in 2012 was celebrated as one of the most significant scientific achievements of all time. The fierce public discourse that followed was largely ignited by the media-hyped nickname "God particle" attributed to the Higgs boson. The debate regarding the science-religion relation was reinvigorated once again, and plenty of theologically informed views were expressed. In this paper, I take into consideration the authoritative views expressed by the Catholic Church and the Greek-Orthodox Church and discuss them in comparison with each other, as well as in juxtaposition with other views expressed in the public discussion on the issue, in an attempt to draw philosophically interesting inferences.
We propose a semantic analysis of the particles afinal (European Portuguese) and alla fine (Italian) in terms of the notion of truth unpersistence, which combines both epistemic modality and constraints on discourse structure. We argue that the felicitous use of these modal particles requires that the truth of a proposition p* fail to persist through a temporal succession of epistemic states, where p* is incompatible with the proposition modified by afinal/alla fine, and that the interlocutors share knowledge of a previous epistemic attitude toward p*. We analyze two main cases, that of plan-related propositions and that of propositions without plans. We also discuss the connections between truth unpersistence and evidentiality.
We investigate the validity of the field explanation of the wave function by analyzing the mass and charge density distributions of a quantum system. It is argued that a charged quantum system has effective mass and charge densities distributed in space, proportional to the square of the absolute value of its wave function. This is also a consequence of protective measurement. If the wave function is a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist a remarkable electrostatic self-interaction of its wave function, though the gravitational self-interaction is too weak to be detected presently. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Thus we conclude that the wave function cannot be a description of a physical field. In the second part of this paper, we further analyze the implications of these results for the main realistic interpretations of quantum mechanics, especially for de Broglie-Bohm theory. It has been argued that de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions, which turns out to be wrong according to the above results. For a charged quantum system, both the Ψ-field and the Bohmian particle have charge density distributions. This then results in the existence of an electrostatic self-interaction of the field and an electromagnetic interaction between the field and the Bohmian particle, which contradicts both the predictions of quantum mechanics and experimental observations. Therefore, de Broglie-Bohm theory as a realistic interpretation of quantum mechanics is probably wrong. Lastly, we suggest that the wave function is a description of some sort of ergodic motion (e.g.
random discontinuous motion) of particles, and we also briefly analyze the implications of this suggestion for other realistic interpretations of quantum mechanics, including the many-worlds interpretation and dynamical collapse theories.
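On the field reading that the abstract above argues against, the electrostatic self-interaction would add a nonlinear term to the Schrödinger equation. A sketch in Gaussian units for one particle of mass $m$ and charge $Q$, assuming the effective charge density $Q\lvert\psi\rvert^2$ (the precise form is illustrative, not the paper's own equation):

```latex
i\hbar\,\frac{\partial \psi(\mathbf{x},t)}{\partial t}
  = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi(\mathbf{x},t)
  + Q^{2}\!\left[\int \frac{\lvert\psi(\mathbf{x}',t)\rvert^{2}}
                           {\lvert\mathbf{x}-\mathbf{x}'\rvert}\,d^{3}x'\right]\psi(\mathbf{x},t).
```

The bracketed integral term is nonlinear in $\psi$, which is why such a self-interaction would conflict with the superposition principle.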
Quantum mechanics makes some very significant observations about nature. Unfortunately, these observations remain mysterious because they do not fit into, and cannot be explained through, classical mechanics. However, we can still explore the philosophical and practical implications of these observations. This article aims to explain the philosophical and practical implications of one of the most important observations of quantum mechanics: uncertainty, or the apparent arbitrariness in the behavior of particles.
The Mach-Zehnder Interferometer (MZI) is chosen to illustrate the long-standing wave-particle duality problem. Why is which-way (welcher Weg) information incompatible with wave interference? How do we explain Wheeler's delayed-choice experiment? Most crucially, how can the photon divide at the first beam splitter and yet terminate on either arm with its undiminished energy? The position advanced is that the photon has two identities, one supporting particle features and the other wave features. There is photon kinetic energy that never splits (on half-silvered mirrors) or diffracts (in pinholes or slits). Then there are photon probability waves that do diffract and can reinforce or cancel. Photon kinetic energy is oscillatory; its cycles require/occupy time. E = mc² suggests that kinetic energy is physically real as occurrence in time just as rest mass is physically real as existence in space; both are quantized and both occupy/require a dimension for their occurrence or existence. Photon kinetic energy (KE) thus resides in time, but is still present/available for interactions (events) in space; rest mass (e.g., your desk) resides in space but is still present/available for interactions (events) in time. While photon probability waves progress in space and diffract there, photon KE resides in time and never diffracts in space; at reception it always arrives whole and imitates particle impact without being a particle. Photon probability waves are real; they diffract in space. Acknowledging that the photon has two identities (residing energy and progressing probability) explains the photon's dual nature. And wave-particle duality is central to quantum mechanics. Understanding it leads to new insights into entanglement, nonlocality, and the measurement problem. A 30-minute video on nonlocality and photon dualism can be found by searching "youtube klevgard nonlocality" on Google.
The meaning of the wave function and its evolution are investigated. First, we argue that the wave function in quantum mechanics is a description of random discontinuous motion of particles, and the modulus square of the wave function gives the probability density of the particles being in certain locations in space. Next, we show that the linear non-relativistic evolution of the wave function of an isolated system obeys the free Schrödinger equation due to the requirements of spacetime translation invariance and relativistic invariance. Thirdly, we argue that the random discontinuous motion of particles may lead to a stochastic, nonlinear collapse evolution of the wave function. A discrete model of energy-conserved wave-function collapse is proposed and shown to be consistent with existing experiments and our macroscopic experience. In addition, we give a critical analysis of the de Broglie-Bohm theory, the many-worlds interpretation, and other dynamical collapse theories, and briefly discuss the issues of unifying quantum mechanics and relativity.
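For reference, the free Schrödinger equation invoked in the abstract above, in its standard form for a particle of mass $m$ with no external potential:

```latex
i\hbar\,\frac{\partial \psi(\mathbf{x},t)}{\partial t}
  = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi(\mathbf{x},t).
```

The abstract's claim is that this linear evolution law can be derived from spacetime translation invariance together with relativistic invariance, rather than simply postulated.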