Quantum computing is of high interest because it promises to perform at least some kinds of computations much faster than classical computers. Arute et al. (2019) (informally, “the Google Quantum Team”) report the results of experiments that purport to demonstrate “quantum supremacy” – the claim that the performance of some quantum computers is better than that of classical computers on some problems. Do these results close the debate over quantum supremacy? We argue that they do not. In the following, we provide an overview of the Google Quantum Team’s experiments, then identify some open questions in the quest to demonstrate quantum supremacy.
Rita Floyd’s "The Morality of Security: A Theory of Just Securitization" is an important and insightful book that delineates a theory of just securitization (modified from the jus ad bellum and jus in bello criteria in just war theory) involving three sets of principles governing the just initiation of securitization, just conduct of securitization, and just desecuritization. This book is a much-needed addition to the security studies and just war scholarship. Here, I explore the potential of Floyd’s just securitization theory (JST) to provide insights into the moral justifiability of non-state groups that are not political entities engaging in resistance against forms of structural violence that pose an existential threat to those groups. Using the case study of the Black Lives Matter (BLM) movement and the threat of white supremacy to African Americans as an illustrative example, I argue that structural forms of violence can meet Floyd’s definition of an objective existential threat, justifying the resort to securitization by groups such as BLM.
David Lewis is a natural target for those who believe that findings in quantum physics threaten the tenability of traditional metaphysical reductionism. Such philosophers point to allegedly holistic entities they take both to be the subjects of some claims of quantum mechanics and to be incompatible with Lewisian metaphysics. According to one popular argument, the non-separability argument from quantum entanglement, any realist interpretation of quantum theory is straightforwardly inconsistent with the reductive conviction that the complete physical state of the world supervenes on the intrinsic properties of and spatio-temporal relations between its point-sized constituents. Here I defend Lewis's metaphysical doctrine, and traditional reductionism more generally, against this alleged threat from quantum holism. After presenting the non-separability argument from entanglement, I show that Bohmian mechanics, an interpretation of quantum mechanics explicitly recognized as a realist one by proponents of the non-separability argument, plausibly rejects a key premise of that argument. Another holistic worry for Humeanism persists, however, the trouble being the apparently holistic character of the Bohmian pilot wave. I present a Humean strategy for addressing the holistic threat from the pilot wave by drawing on resources from the Humean best system account of laws.
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition to account for the temporal asymmetry. In this paper, I show that the Past Hypothesis also contains enough information to simplify the quantum ontology and define a unique initial condition in such a world. First, I introduce Density Matrix Realism, the thesis that the quantum universe is described by a fundamental density matrix that represents something objective. This stands in sharp contrast to Wave Function Realism, the thesis that the quantum universe is described by a wave function that represents something objective. Second, I suggest that the Past Hypothesis is sufficient to determine a unique and simple density matrix. This is achieved by what I call the Initial Projection Hypothesis: the initial density matrix of the universe is the normalized projection onto the special low-dimensional Hilbert space. Third, because the initial quantum state is unique and simple, we have a strong case for the Nomological Thesis: the initial quantum state of the universe is on a par with the laws of nature. This new package of ideas has several interesting implications, including for the harmony between statistical mechanics and quantum mechanics, the dynamic unity of the universe and its subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
This report reviews what quantum physics and information theory have to tell us about the age-old question, How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit.
A growing literature is premised on the claim that quantum mechanics provides evidence for metaphysical indeterminacy. But does it? None of the currently fashionable realist interpretations involve fundamental indeterminacy and the ‘standard interpretation’, to the extent that it can be made out, doesn't require indeterminacy either.
In the United States, Protestant Christian identity is the dominant religious identity. Protestant Christian identity confers status privileges, yet also creates objectionable status inequalities. Historical and contemporary evidence includes the unfair treatment of Mormons, Native Americans, Muslims, and other religious minorities. Protestant Christian supremacy also plays a significant role in bolstering anti-LGBTQ prejudice, xenophobia, and white supremacy. The ways that Protestant Christian identity correlates with objectionable status inequalities are often neglected in contemporary political philosophy. This paper aims to make a modest contribution towards filling that gap. Some forms of inequality linked to Protestant Christian supremacy can be characterized as domination and oppression. Other instances include barriers to fair equality of opportunity for self-determination. Adapting ideas from egalitarian political philosophy, I propose an analysis of objectionable status inequality rooted in Protestant Christian supremacy. Alan Patten’s defense of an egalitarian principle for assessing the effects of law and policy is helpful for this task.
I maintain that quantum mechanics is fundamentally about a system of N particles evolving in three-dimensional space, not the wave function evolving in 3N-dimensional space.
THE PRINCIPLE OF SUPERPOSITION. The need for a quantum theory: Classical mechanics has been developed continuously from the time of Newton and applied to an ...
We expound an alternative to the Copenhagen interpretation of the formalism of nonrelativistic quantum mechanics. The basic difference is that the new interpretation is formulated in the language of epistemological realism. It involves a change in some basic physical concepts. The ψ function is no longer interpreted as a probability amplitude of the observed behaviour of elementary particles but as an objective physical field representing the particles themselves. The particles are thus extended objects whose extension varies in time according to the variation of ψ. They are considered as fundamental regions of space with some kind of nonlocality. Special consideration is given to the Heisenberg relations, the Einstein-Podolsky-Rosen correlations, the reduction process, the problem of measurement, and the quantum-statistical distributions.
The Preamble to the Charter of Rights and Freedoms claims "Canada is grounded upon principles that recognize the supremacy of God." This claim is hopelessly confused and it has no place in our constitution. This is true, moreover, whether you are a Christian, a Jew, a Muslim, a Pantheist, an atheist, or someone who has never given one moment's thought to "the supremacy of God" -- much less "recognized" it.
It is widely known that Black people are significantly more likely to be killed by the police in the United States of America than white people. What is less widely known is that nearly half of all people killed by the police are disabled people. The aim of this paper is to better understand the intersection of racism and ableism in the USA. Contributing to the growing literature at the intersection of philosophy of disability and critical philosophy of race, I argue that theories concerning white supremacy should take more seriously the ways in which it functions as a process and apparatus of making abled and disabled. I conclude by discussing how understanding white supremacy in this manner is a valuable coalitional tool in fights for social justice more generally.
In this paper I put forward a new micro-realistic, fundamentally probabilistic, propensiton version of quantum theory. According to this theory, the entities of the quantum domain - electrons, photons, atoms - are neither particles nor fields, but a new kind of fundamentally probabilistic entity, the propensiton - entities which interact with one another probabilistically. This version of quantum theory leaves the Schrödinger equation unchanged, but reinterprets it to specify how propensitons evolve when no probabilistic transitions occur. Probabilistic transitions occur when new "particles" are created as a result of inelastic interactions. All measurements are just special cases of this. This propensiton version of quantum theory, I argue, solves the wave/particle dilemma, is free of the conceptual problems that plague orthodox quantum theory, recovers all the empirical success of orthodox quantum theory, and at the same time yields as yet untested predictions that differ from those of orthodox quantum theory.
In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom. Consequently, this account extends Hartry Field’s program outlined in Science Without Numbers (1980), responds to David Malament’s long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer’s (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about “wave function realism.” In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation). Along the way, I axiomatize the quantum phase structure as what I shall call a “periodic difference structure” and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure. (For a more recent version of this paper, please see "The Intrinsic Structure of Quantum Mechanics" available on PhilPapers.)
What is the quantum state of the universe? Although there have been several interesting suggestions, the question remains open. In this paper, I consider a natural choice for the universal quantum state arising from the Past Hypothesis, a boundary condition that accounts for the time-asymmetry of the universe. The natural choice is given not by a wave function but by a density matrix. I begin by classifying quantum theories into two types: theories with a fundamental wave function and theories with a fundamental density matrix. The Past Hypothesis is compatible with infinitely many initial wave functions, none of which seems to be particularly natural. However, once we turn to density matrices, the Past Hypothesis provides a natural choice: the normalized projection onto the Past Hypothesis subspace in the Hilbert space. Nevertheless, the two types of theories can be empirically equivalent. To provide a concrete understanding of the empirical equivalence, I provide a novel subsystem analysis in the context of Bohmian theories. Given the empirical equivalence, it seems empirically underdetermined whether the universe is in a pure state or a mixed state. Finally, I discuss some theoretical payoffs of the density-matrix theories and present some open problems for future research. (Bibliographic note: the thesis was submitted for the Master of Science in mathematics at Rutgers University.)
Spacetime functionalism is the view that spacetime is a functional structure implemented by a more fundamental ontology. Lam and Wüthrich have recently argued that spacetime functionalism helps to solve the epistemological problem of empirical coherence in quantum gravity and suggested that it also (dis)solves the hard problem of spacetime, namely the problem of offering a picture consistent with the emergence of spacetime from a non-spatio-temporal structure. First, I will deny that spacetime functionalism solves the hard problem by showing that it comes in various species, each entailing a different attitude towards, or answer to, the hard problem. Second, I will argue that the existence of an explanatory gap, which grounds the hard problem, has not been correctly taken into account in the literature.
Under so-called primitive ontology approaches, in fully describing the history of a quantum system, one thereby attributes interesting properties to regions of spacetime. Primitive ontology approaches, which include some varieties of Bohmian mechanics and spontaneous collapse theories, are interesting in part because they hold out the hope that it should not be too difficult to make a connection between models of quantum mechanics and descriptions of histories of ordinary macroscopic bodies. But such approaches are dualistic, positing a quantum state as well as ordinary material degrees of freedom. This paper lays out and compares some options that primitive ontologists have for making sense of the quantum state.
We discuss the no-go theorem of Frauchiger and Renner based on an "extended Wigner's friend" thought experiment which is supposed to show that any single-world interpretation of quantum mechanics leads to inconsistent predictions if it is applicable on all scales. We show that no such inconsistency occurs if one considers a complete description of the physical situation. We then discuss implications of the thought experiment that have not been clearly addressed in the original paper, including a tension between relativity and nonlocal effects predicted by quantum mechanics. Our discussion applies in particular to Bohmian mechanics.
Statistical mechanics is often taken to be the paradigm of a successful inter-theoretic reduction, which explains the high-level phenomena (primarily those described by thermodynamics) by using the fundamental theories of physics together with some auxiliary hypotheses. In my view, the scope of statistical mechanics is wider, since it is the type-identity physicalist account of all the special sciences. But in this chapter, I focus on the more traditional and less controversial domain of this theory, namely, that of explaining the thermodynamic phenomena. What are the fundamental theories that are taken to explain the thermodynamic phenomena? The lively research into the foundations of classical statistical mechanics suggests that using classical mechanics to explain the thermodynamic phenomena is fruitful. Strictly speaking, in contemporary physics, classical mechanics is considered to be false. Since classical mechanics preserves certain explanatory and predictive aspects of the true fundamental theories, it can be successfully applied in certain cases. In other circumstances, classical mechanics has to be replaced by quantum mechanics. In this chapter I ask the following two questions: (I) How does quantum statistical mechanics differ from classical statistical mechanics? How are the well-known differences between the two fundamental theories reflected in the statistical mechanical account of high-level phenomena? (II) How does quantum statistical mechanics differ from quantum mechanics simpliciter? To make our main points I need only consider non-relativistic quantum mechanics. Most of the ideas described and addressed in this chapter hold irrespective of the choice of a (so-called) interpretation of quantum mechanics, and so I will mention interpretations only when the differences between them are important to the matter discussed.
We put forward a new, ‘coherentist’ account of quantum entanglement, according to which entangled systems are characterized by symmetric relations of ontological dependence among the component particles. We compare this coherentist viewpoint with the two most popular alternatives currently on offer—structuralism and holism—and argue that it is essentially different from, and preferable to, both. In the course of this article, we point out how coherentism might be extended beyond the case of entanglement and further articulated.
Within the field of quantum gravity, there is an influential research program developing the connection between quantum entanglement and spatiotemporal distance. Quantum information theory gives us highly refined tools for quantifying quantum entanglement, such as the entanglement entropy. Through a series of well-confirmed results, it has been shown how these facts about the entanglement entropy of component systems may be connected to facts about spatiotemporal distance. Physicists see these results as yielding promising methods for better understanding the emergence of (the dynamical) spacetime (of general relativity) from more fundamental quantum theories, and moreover, as promising for the development of a nonperturbative theory of quantum gravity. However, to what extent does the case for the entanglement entropy-distance link provide evidence that spacetime structure is nonfundamental and emergent from nongravitational degrees of freedom? I will show that a closer look at the results lends support only to a weaker conclusion: that the facts about quantum entanglement are constrained by facts about spatiotemporal distance, and not that they are the basis from which facts about spatiotemporal distance emerge.
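For readers unfamiliar with the quantitative tool mentioned in this abstract: the entanglement entropy of a subsystem A is the von Neumann entropy of its reduced density matrix, and one influential instance of the entanglement-distance link is the Ryu-Takayanagi formula, which ties that entropy to the area of a minimal surface in a dual spacetime. This is an editorial gloss of standard material, not the paper's own formulation:

```latex
% Entanglement entropy of subsystem A: the von Neumann entropy
% of the reduced state obtained by tracing out subsystem B.
S_A = -\operatorname{Tr}\left( \rho_A \log \rho_A \right),
\qquad
\rho_A = \operatorname{Tr}_B \, \rho_{AB} .
% Ryu-Takayanagi: S_A equals the area of the minimal bulk surface
% \gamma_A homologous to the boundary region A, in units of 4 G_N.
S_A = \frac{\mathrm{Area}(\gamma_A)}{4 G_N} .
```

The second equation is the kind of "well-confirmed result" connecting entanglement entropy to geometric distance that the abstract alludes to.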
I offer an account of how the quantum theory we have helps us explain so much. The account depends on a pragmatist interpretation of the theory: this takes a quantum state to serve as a source of sound advice to physically situated agents on the content and appropriate degree of belief about matters concerning which they are currently inevitably ignorant. The general account of how to use quantum states and probabilities to explain otherwise puzzling regularities is then illustrated by showing how we can explain single-particle interference phenomena, the stability of matter, and interference of Bose–Einstein condensates. Finally, I note some open problems and relate this account to alternative approaches to explanation that emphasize the importance of causation, of unification, and of structure. Contents: 1. Introduction; 2. Two Requirements on Explanations in Physics; 3. What We Can Use Quantum Theory to Explain; 4. The Function of Quantum States and Born Probabilities; 5. How These Functions Contribute to the Explanatory Task; 6. Example One: Single-Particle Interference; 7. Example Two: Explanation of the Stability of Matter; 8. Example Three: Bose Condensation; 9. Conclusion.
The paper addresses the problem that quantum mechanics in fact resolves. Its viewpoint suggests that the crucial link between time and its course is omitted in understanding the problem. The common interpretation, underlain by the history of quantum mechanics, sees discreteness only on the Planck scale, which is transformed into continuity and even smoothness on the macroscopic scale. That approach is fraught with a series of seeming paradoxes. It suggests that the present mathematical formalism of quantum mechanics is only partly relevant to its problem, which is ostensibly known. The paper accepts just the opposite: the mathematical solution is absolutely relevant and serves as an axiomatic base, from which the real and yet hidden problem is deduced. Wave-particle duality, Hilbert space, both the probabilistic and many-worlds interpretations of quantum mechanics, quantum information, and the Schrödinger equation are included in that base. The Schrödinger equation is understood as a generalization of the law of energy conservation to past, present, and future moments of time. The deduced real problem of quantum mechanics is: "What is the universal law describing the course of time in any physical change, therefore including any mechanical motion?"
Mereotopology faces problems when its methods are extended to deal with time and change. We offer a new solution to these problems, based on a theory of partitions of reality which allows us to simulate (and also to generalize) aspects of set theory within a mereotopological framework. This theory is extended to a theory of coarse- and fine-grained histories (or finite sequences of partitions evolving over time), drawing on machinery developed within the framework of the so-called ‘consistent histories’ interpretation of quantum mechanics.
A number of recent theories of quantum gravity lack a one-dimensional structure of ordered temporal instants. Instead, according to many of these views, our world is either best represented as a single three-dimensional object, or as a configuration space composed of such three-dimensional objects, none of which bear temporal relations to one another. Such theories will be empirically self-refuting unless they can accommodate the existence of conscious beings capable of representation. For if representation itself is impossible in a timeless world, then no being in such a world could entertain the thought that a timeless theory is true, let alone believe such a theory or rationally believe it. This paper investigates the options for understanding representation in a three-dimensional, timeless, world. Ultimately it concludes that the only viable option is one according to which representation is taken to be deeply non-naturalistic. Ironically then we are left with two seemingly very unattractive options. Either a very naturalistic motivation—taking seriously a live view in fundamental physics—leads us to a very non-naturalistic view of the mental, or else views in the philosophy of mind partly dictate what is an acceptable theory in physics.
It has been argued that the transition from classical to quantum mechanics is an example of a Kuhnian scientific revolution, in which there is a shift from the simple, intuitive, straightforward classical paradigm to the convoluted, counterintuitive, amazing new quantum paradigm. In this paper, after having clarified what these quantum paradigms are supposed to be, I analyze whether they constitute a radical departure from the classical paradigm. Contrary to what is commonly maintained, I argue that, in addition to radical quantum paradigms, there are also legitimate ways of understanding the quantum world that do not require any substantial change to the classical paradigm.
Eternalism, the view that what we regard locally as being located in the past, the present and the future equally exists, is the best ontological account of temporal existence in line with special and general relativity. However, special and general relativity are not fundamental theories and several research programs aim at finding a more fundamental theory of quantum gravity weaving together all we know from relativistic physics and quantum physics. Interestingly, some of these approaches assert that time is not fundamental. If time is not fundamental, what does it entail for eternalism and the standard debate over existence in time? First, I will argue that the non-fundamentality of time to be found in string theory entails standard eternalism. Second, I will argue that the non-fundamentality of time to be found in loop quantum gravity entails atemporal eternalism, namely a novel position in the spirit of standard eternalism.
Quantum Counterfactual Communication is the recently proposed idea of using quantum physics to send messages between two parties, without any matter/energy transfer associated with the bits sent. While this has excited massive interest, both for potential ‘unhackable’ communication and for insight into the foundations of quantum mechanics, it has been asked whether this process is essentially quantum, or could be performed classically. We examine counterfactual communication, both classical and quantum, and show that the protocols proposed so far for sending signals that don’t involve matter/energy transfer associated with the bits sent must be quantum, insofar as they require wave-particle duality.
Quantum theory offers mathematical descriptions of measurable phenomena with great facility and accuracy, but it provides absolutely no understanding of why any particular quantum outcome is observed. It is the province of genuine explanations to tell us how things actually work—that is, why such descriptions hold and why such predictions are true. Quantum theory is long on the what, both mathematically and observationally, but almost completely silent on the how and the why. What is even more interesting is that, in some sense, this state of affairs seems to be a necessary consequence of the empirical adequacy of quantum descriptions. One of the most noteworthy achievements of quantum theory is the accurate prediction of phenomena that, on pain of experimental contradiction, have no physical explanation. It is the purpose of this essay to make clear why quantum mechanics and quantum field theory are complete physical descriptions that describe the metaphysical incompleteness of the physical world, then to press the negative implications of this fact for naturalistic metaphysics.
Husserl (a mathematician by education) left a few famous and notable philosophical “slogans” along with his innovative doctrine of phenomenology, directed at transcending “reality” toward a more general essence underlying both “body” and “mind” (after Descartes) and sometimes called “ontology” (terminologically following his notorious assistant Heidegger). Husserl’s tradition can thus be tracked as an idea for philosophy to be reinterpreted in a way that is both generalized and, in the final analysis, mathematizable. The paper offers a pattern borrowed from the theory of information and quantum information (thereby relating philosophy to both mathematics and physics) to formalize logically a few key concepts of Husserl’s phenomenology, such as “epoché” and the “eidetic, phenomenological, and transcendental reductions,” as well as the identification of the “phenomenological, transcendental, and psychological reductions,” in a way that allows this identification to be continued to the “eidetic reduction” (and thus to mathematics). The approach is tested against an independent and earlier idea of Husserl’s, “logical arithmetic” (implemented in parallel in mathematics by Whitehead and Russell’s Principia), interpreted as what “Hilbert arithmetic,” generalizing Peano arithmetic, amounts to. A basic conclusion is that the unification of philosophy, mathematics, and physics in their foundations and fundamentals continues the Husserl tradition, both tracked to its origin (in the being itself, after Heidegger, or after Husserl’s “zu Sache selbst”) and embodied in the development of human cognition in the third millennium.
What would it take to vindicate folk temporal error theory? This question is significant against a backdrop of new views in quantum gravity—so-called timeless physical theories—that claim to eliminate time by eliminating a one-dimensional substructure of ordered temporal instants. Ought we to conclude that if these views are correct, nothing satisfies the folk concept of time and hence that folk temporal error theory is true? In light of evidence we gathered, we argue that physical theories that entirely eliminate an ordered substructure vindicate folk temporal error theory.
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.
Panpsychism is often thought to be an obviously mistaken doctrine, because it is considered to be completely inconceivable how the elementary particles of physics could possibly have proto-mental properties. This paper points out that quantum theory implies that elementary particles are far more subtle and strange than most contemporary physicalist philosophers assume. The paper discusses David Bohm’s famous “pilot wave” theory, which implies that, say, an electron is a particle guided by a field carrying active information, the latter of which can be seen as a primitive mind-like quality.
I discuss the quantum mechanical theory of consciousness and free will offered by Stapp (1993, 1995, 2000, 2004). First, I show that decoherence-based arguments do not work against this theory. Then I discuss a number of problems with the theory: Stapp's separate accounts of consciousness and free will are incompatible; the interpretations of QM they are tied to are questionable; the Zeno effect could not enable free will as he suggests, because weakness of will would then be ubiquitous; and the holism of measurement in QM is not a good explanation of the unity of consciousness, for essentially the same reason that local interactions may seem incapable of accounting for it.
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article will explore concept combinations, and will argue that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory and another on comparing a joint probability distribution with a distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
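The second method mentioned in this abstract can be sketched concretely. The counts below are hypothetical, purely illustrative, and not the paper's data: under the separability assumption, the joint distribution over interpretations of the two concepts factorizes into its marginals, and a chi-square statistic measures departure from that factorization.

```python
def chi_square_independence(observed):
    """Chi-square statistic for an r x c contingency table of counts,
    testing the observed joint counts against the product of marginals
    (the 'separability' / independence assumption)."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (obs - expected) ** 2 / expected
    return chi2

# Hypothetical counts: rows = interpretations of the first concept,
# columns = interpretations of the second concept in the combination.
table = [[40, 10],
         [15, 35]]
stat = chi_square_independence(table)
# df = (2-1)*(2-1) = 1; the 5% critical value is about 3.84, so a large
# statistic suggests the combination is not separable into its parts.
print(stat > 3.84)
```

The same test is available as `scipy.stats.chi2_contingency`; it is computed by hand here only to keep the sketch self-contained.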
We investigate whether standard counterfactual analyses of causation (CACs) imply that the outcomes of space-like separated measurements on entangled particles are causally related. Although it has sometimes been claimed that standard CACs imply such a causal relation, we argue that a careful examination of David Lewis’s influential counterfactual semantics casts doubt on this. We discuss ways in which Lewis’s semantics and standard CACs might be extended to the case of space-like correlations.
The essential biological processes that sustain life are catalyzed by protein nano-engines, which maintain living systems in far-from-equilibrium ordered states. To investigate energetic processes in proteins, we have analyzed the system of generalized Davydov equations that govern the quantum dynamics of multiple amide I exciton quanta propagating along the hydrogen-bonded peptide groups in α-helices. Computational simulations have confirmed the generation of moving Davydov solitons by applied pulses of amide I energy for protein α-helices of varying length. The stability and mobility of these solitons depended on the uniformity of dipole-dipole coupling between amide I oscillators and the isotropy of the exciton-phonon interaction. Davydov solitons were also able to quantum tunnel through massive barriers, or to quantum interfere at collision sites. The results presented here support a nontrivial role of quantum effects in biological systems that lies beyond the mechanistic support of covalent bonds as binding agents of macromolecular structures. Quantum tunneling and interference of Davydov solitons provide catalytically active macromolecular protein complexes with a physical mechanism allowing highly efficient transport, delivery, and utilization of free energy, alongside an evolutionary mandate for the biological order that supports the existence of such genuine quantum phenomena, and may indeed demarcate the quantum boundaries of life.
This paper shows how classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or toy model of quantum mechanics over sets (QM/Sets). There have been several previous attempts to develop a quantum-like model with the base field ℂ replaced by ℤ₂. Since there are no inner products on vector spaces over finite fields, the problem is to define the Dirac brackets and the probability calculus. The previous attempts all required the brackets to take values in ℤ₂. But the usual QM brackets <ψ|ϕ> give the "overlap" between states ψ and ϕ, so for subsets S,T⊆U, the natural definition is <S|T>=|S∩T| (taking values in the natural numbers). This allows QM/Sets to be developed with a full probability calculus that turns out to be a non-commutative extension of classical Laplace-Boole finite probability theory. The pedagogical model is illustrated by giving simple treatments of the indeterminacy principle, the double-slit experiment, Bell's Theorem, and identical particles in QM/Sets. A more technical appendix explains the mathematics behind carrying some vector space structures between QM over ℂ and QM/Sets over ℤ₂.
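The bracket definition quoted above can be sketched directly: for subsets S, T of a finite universe U, <S|T> = |S ∩ T|, and measuring a state S in the U-basis yields each outcome {u} with probability <{u}|S>/<S|S> = |{u} ∩ S|/|S|. The universe and state below are arbitrary examples chosen for illustration.

```python
# Minimal sketch of the QM/Sets bracket and its equiprobable
# probability rule. U and S are arbitrary example choices.
U = {"a", "b", "c", "d"}

def bracket(S, T):
    """Natural-number-valued Dirac bracket: the overlap |S ∩ T|."""
    return len(S & T)

def outcome_probability(u, S):
    """Probability of outcome {u} when the state S is measured in
    the U-basis: |{u} ∩ S| / |S| (the Laplace-Boole rule)."""
    return bracket({u}, S) / bracket(S, S)

S = {"a", "b", "c"}
probs = {u: outcome_probability(u, S) for u in U}
print(probs)  # each element of S is equiprobable; "d" gets probability 0
```

The probabilities over elements of S are uniform and sum to one, recovering the classical equiprobable finite probability calculus as the abstract describes.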
Our conscious minds exist in the Universe; therefore they should be identified with physical states that are subject to physical laws. In classical theories of mind, mental states are identified with brain states that satisfy the deterministic laws of classical mechanics. This approach, however, leads to insurmountable paradoxes such as epiphenomenal minds and illusory free will. Alternatively, one may identify mental states with quantum states realized within the brain and try to resolve the above paradoxes using the standard Hilbert space formalism of quantum mechanics. In this essay, we first show that identification of mind states with quantum states within the brain is biologically feasible, and then, elaborating on the mathematical proofs of two quantum mechanical no-go theorems, we explain why quantum theory might have profound implications for the scientific understanding of one's mental states, self-identity, beliefs and free will.
Effective Field Theory (EFT) is the successful paradigm underlying modern theoretical physics, including the "Core Theory" of the Standard Model of particle physics plus Einstein's general relativity. I will argue that EFT grants us a unique insight: each EFT model comes with a built-in specification of its domain of applicability. Hence, once a model is tested within some domain (of energies and interaction strengths), we can be confident that it will continue to be accurate within that domain. Currently, the Core Theory has been tested in regimes that include all of the energy scales relevant to the physics of everyday life (biology, chemistry, technology, etc.). Therefore, we have reason to be confident that the laws of physics underlying the phenomena of everyday life are completely known.
The paper considers the symmetries of a bit of information corresponding to one, two, or three qubits of quantum information, identifiable as the three basic symmetries of the Standard Model, U(1), SU(2), and SU(3) respectively. They refer to “empty qubits” (or the free variable of quantum information), i.e. those in which no point is chosen (recorded). The choice of a certain point violates those symmetries. It can furthermore be represented as the choice of a privileged reference frame (e.g. that of the Big Bang), which can be described exhaustively by means of 16 numbers (4 for position, 4 for velocity, and 8 for acceleration) independently of time, though within the space-time continuum, and still one more, a 17th number, is necessary for the rest mass of the observer in it. The same 17 numbers, exhaustively describing a privileged reference frame (thus granted to be “zero”), respectively a certain violation of all three symmetries of the Standard Model or the “record” in a qubit in general, can be represented as 17 elementary wave functions (or classes of wave functions) after the bijection of natural and transfinite natural (ordinal) numbers in Hilbert arithmetic, and further identified as those corresponding to the 17 elementary particles of the Standard Model. Two generalizations of the relevant concepts of general relativity are introduced: (1) the “discrete reference frame”, generalizing the class of all arbitrarily accelerated reference frames constituting a smooth manifold; (2) a still more general principle of relativity, generalizing the general principle of relativity and meaning the conservation of quantum information across all discrete reference frames as well as across the smooth manifold of all reference frames of general relativity.
Then the bijective transition from an accelerated reference frame to the 17 elementary wave functions of the Standard Model can be interpreted, by the still more general principle of relativity, as the equivalent redescription of a privileged reference frame: smooth into a discrete one. The conservation of quantum information related to the generalization of the concept of reference frame can be interpreted as restoring the concept of the ether, the absolutely immovable medium and reference frame of Newtonian mechanics, relative to which motion can be interpreted as absolute, or logically: relations as properties. The new ether is to consist of qubits (or quantum information). One can track the conceptual pathway of the “ether” from Newtonian mechanics via special relativity, via general relativity, via quantum mechanics to the theory of quantum information (or “quantum mechanics and information”). The identification of entanglement and gravity can also be considered a “byproduct” implied by the transition from the smooth “ether of special and general relativity” to the “flat” ether of quantum mechanics and information. The qubit ether is outside the “temporal screen” in general and is depicted on it as both matter and energy, both dark and visible.
Practical quantum computing devices and their applications to AI in particular are presently mostly speculative. Nevertheless, questions about whether this future technology, if achieved, presents any special ethical issues are beginning to take shape. As with any novel technology, one can be reasonably confident that the challenges presented by "quantum AI" will be a mixture of something new and something old. Other commentators (Sevilla & Moreno 2019) have emphasized continuity, arguing that quantum computing does not substantially affect approaches to value alignment methods for AI, although they allow that further questions arise concerning governance and verification of quantum AI applications. In this brief paper, we turn our attention to the problem of identifying as-yet-unknown discontinuities that might result from quantum AI applications. Wise development, introduction, and use of any new technology depends on successfully anticipating new modes of failure for that technology. This requires rigorous efforts to break systems in protected sandboxes, and it must be conducted at all stages of technology design, development, and deployment. Such testing must also be informed by technical expertise but cannot be left solely to experts in the technology, because of the history of failures to predict how non-experts will use or adapt to new technologies. This interplay between experts and non-experts may be particularly acute for quantum AI because quantum mechanics is notoriously difficult to understand. (As Richard Feynman quipped, "Anyone who claims to understand quantum mechanics is either lying or crazy.") We will discuss the extent to which the difficulties in understanding the physics underlying quantum computing challenge attempts to anticipate new failure modes that might be introduced in AI applications intended for unsupervised operation in the public sphere.
There is a consistent and simple interpretation of the quantum theory of isolated systems. The interpretation suffers no measurement problem and provides a quantum explanation of state reduction, which is usually postulated. Quantum entanglement plays an essential role in the construction of the interpretation.
The theme of phenomenology and quantum physics is here tackled by examining some basic interpretational issues in quantum physics. One key issue in quantum theory from the very beginning has been whether it is possible to provide a quantum ontology of particles in motion in the same way as in classical physics, or whether we are restricted to a more limited view of quantum systems, in terms of complementary but mutually exclusive phenomena. In phenomenological terms we could describe the situation by saying that, according to the usual interpretation of quantum theory, quantum phenomena require a kind of epoche. However, there are other interpretations that seem to re-establish the possibility of a mind-independent ontology at the quantum level. We will show that even such ontological interpretations contain novel, non-classical features, which require them to give a special role to “phenomena” or “appearances”, a role not encountered in classical physics. We will conclude that while ontological interpretations of quantum theory are possible, quantum theory implies the need for a certain kind of epoche even for interpretations of this type. While different from the epoche connected to phenomenological description, the “quantum epoche” nevertheless points to a potentially interesting parallel between phenomenology and quantum philosophy.
Two of the most difficult problems in the foundations of physics are (1) what gives rise to the arrow of time and (2) what the ontology of quantum mechanics is. I propose a unified 'Humean' solution to the two problems. Humeanism allows us to incorporate the Past Hypothesis and the Statistical Postulate into the best system, which we then use to simplify the quantum state of the universe. This enables us to confer nomological status on the quantum state in a way that adds no significant complexity to the best system and solves the "supervenient-kind problem" facing the original version of the Past Hypothesis. We call the resultant theory the Humean unification. It provides a unified explanation of time asymmetry and quantum entanglement. On this theory, what gives rise to time's arrow is also responsible for quantum phenomena. The new theory has a separable mosaic, a best system that is simple and non-vague, less tension between quantum mechanics and special relativity, and a higher degree of theoretical and dynamical unity. The Humean unification leads to new insights that can be useful to Humeans and non-Humeans alike.
The conspicuous similarities between interpretive strategies in classical statistical mechanics and in quantum mechanics may be grounded on their employment of common implementations of probability. The objective probabilities which represent the underlying stochasticity of these theories can be naturally associated with three of their common formal features: initial conditions, dynamics, and observables. Various well-known interpretations of the two theories line up with particular choices among these three ways of implementing probability. This perspective has significant application to debates on primitive ontology and to the quantum measurement problem.
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-inspired model of the human mental lexicon. This model is currently being experimentally investigated, and we present a preliminary set of pilot data suggesting that concept combinations can indeed behave non-separably.
Saunders' recent arguments in favour of the weak discernibility of (certain) quantum particles seem to be grounded in the 'generalist' view that science only provides general descriptions of the world. In this paper, I introduce the 'generalist' perspective and consider its possible justification and philosophical basis, and then look at the notion of weak discernibility. I expand on the criticisms formulated by Hawley (2006) and Dieks and Versteegh (2008) and explain what I take to be the basic problem: that the properties invoked by Saunders cannot be pointed to as 'individuators' of otherwise indiscernible (and thus numerically identical) entities, because their ontological status remains underdetermined by the evidence and the established interpretation of the theory. In addition to this, I suggest that Saunders does not deal adequately with bosons, and cannot do so exactly because he subscribes to PII and the generalist picture. The last part of the paper contains a critical examination of the claim (or at least implicit assumption) that the generalist picture should be regarded as obviously compelling by the modern-day empiricist.
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2.
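The smallest case of this combinatorics can be sketched concretely: enumerating the proper direct-sum decompositions of the two-dimensional vector space over ℤ₂. The bit-pair representation of vectors is a choice of convenience, and the enumeration below is an illustration, not the Appendix's general GF(q) computation.

```python
from itertools import combinations

# Vectors of the two-dimensional space over Z_2, as bit pairs.
# Over Z_2 the line (one-dimensional subspace) spanned by a nonzero
# vector v is just {0, v}.
V = [(0, 0), (0, 1), (1, 0), (1, 1)]
ZERO = (0, 0)

def add(u, v):
    """Componentwise vector addition modulo 2."""
    return ((u[0] + v[0]) % 2, (u[1] + v[1]) % 2)

lines = [frozenset({ZERO, v}) for v in V if v != ZERO]

# A pair of distinct lines is a direct-sum decomposition of V when
# they meet only in the zero vector and their sums a + b exhaust V.
decompositions = [
    {A, B}
    for A, B in combinations(lines, 2)
    if A & B == {ZERO} and {add(a, b) for a in A for b in B} == set(V)
]
print(len(lines), len(decompositions))  # 3 lines, 3 proper decompositions
```

The three two-block decompositions here are the vector-space analogues of the partitions of a two-element set's "blocks", the dual objects that the logic of direct-sum decompositions is built on.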
The capacity of conscious agents to perform genuine choices among future alternatives is a prerequisite for moral responsibility. The determinism that pervades classical physics, however, forbids free will, undermines the foundations of ethics, and precludes meaningful quantification of personal biases. To resolve that impasse, we utilize the characteristic indeterminism of quantum physics and derive a quantitative measure of the amount of free will manifested by the brain cortical network. The interaction between the central nervous system and the surrounding environment is shown to perform a quantum measurement upon the neural constituents, which actualizes a single measurement outcome selected from the resulting quantum probability distribution. Inherent biases in the quantum propensities for alternative physical outcomes provide varying amounts of free will, which can be quantified by the expected information gain from learning the actual course of action chosen by the nervous system. For example, neuronal electric spikes evoke deterministic synaptic vesicle release in the synapses of sensory or somatomotor pathways, with no free will manifested. In cortical synapses, however, vesicle release is triggered indeterministically with a probability of 0.35 per spike. This grants the brain cortex, with its over 100 trillion synapses, an amount of free will exceeding 96 terabytes per second. Although reliable deterministic transmission of sensory or somatomotor information ensures robust adaptation of animals to their physical environment, the unpredictability of behavioral responses initiated by decisions of the brain cortex is evolutionarily advantageous for avoiding predators. Thus, free will may have survival value and could be optimized through natural selection.
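The per-synapse quantification step can be illustrated with a back-of-envelope computation: the expected information gain from a binary release event is the binary Shannon entropy of its probability. Scaling the per-synapse figure up to the cortex-wide terabytes-per-second estimate depends on synapse counts and firing rates not reproduced here, so only the entropy itself is computed.

```python
import math

def binary_entropy(p):
    """Expected information gain (in bits) from learning the outcome
    of a binary event that occurs with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic event carries no surprise
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Deterministic vesicle release (sensory/somatomotor pathways):
# no information gained, hence no free will on this measure.
print(binary_entropy(1.0))

# Indeterministic cortical vesicle release at p = 0.35 per spike.
h = binary_entropy(0.35)
print(round(h, 3))  # about 0.934 bits per synapse per spike
```

Multiplying this per-spike entropy by the number of cortical synapses and their average firing rate (physiological parameters taken from the paper, not derived here) is what yields a cortex-wide figure on the order of terabytes per second.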