Although expected utility theory has proven a fruitful and elegant theory in the finite realm, attempts to generalize it to infinite values have resulted in many paradoxes. In this paper, we argue that the use of John Conway's surreal numbers can provide a firm mathematical foundation for transfinite decision theory. To that end, we prove a surreal representation theorem and show that our surreal decision theory respects dominance reasoning even in the case of infinite values. We then bring our theory to bear on one of the more venerable decision problems in the literature: Pascal's Wager. Analyzing the wager showcases our theory's virtues and advantages. In particular, we analyze two objections against the wager: Mixed Strategies and Many Gods. After formulating the two objections in the framework of surreal utilities and probabilities, our theory correctly predicts that (1) the pure Pascalian strategy beats all mixed strategies, and (2) what one should do in a Pascalian decision problem depends on what one's credence function is like. Our analysis therefore suggests that although Pascal's Wager is mathematically coherent, it does not deliver what it purports to: a rationally compelling argument that people should lead a religious life regardless of how confident they are in theism and its alternatives.
A century after the discovery of quantum mechanics, the meaning of quantum mechanics still remains elusive. This is largely due to the puzzling nature of the wave function, the central object in quantum mechanics. If we are realists about quantum mechanics, how should we understand the wave function? What does it represent? What is its physical meaning? Answering these questions would improve our understanding of what it means to be a realist about quantum mechanics. In this survey article, I review and compare several realist interpretations of the wave function. They fall into three categories: ontological interpretations, nomological interpretations, and the sui generis interpretation. For simplicity, I will focus on non-relativistic quantum mechanics.
The mathematical structure of realist quantum theories has given rise to a debate about how our ordinary 3-dimensional space is related to the 3N-dimensional configuration space on which the wave function is defined. Which of the two spaces is our (more) fundamental physical space? I review the debate between 3N-Fundamentalists and 3D-Fundamentalists and evaluate it based on three criteria. I argue that when we consider which view leads to a deeper understanding of the physical world, especially given the deeper topological explanation from the unordered configurations to the Symmetrization Postulate, we have strong reasons in favor of 3D-Fundamentalism. I conclude that our evidence favors the view that our fundamental physical space in a quantum world is 3-dimensional rather than 3N-dimensional. I outline lines of future research where the evidential balance can be restored or reversed. Finally, I draw lessons from this case study to the debate about theoretical equivalence.
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition to account for the temporal asymmetry. In this paper, I show that the Past Hypothesis also contains enough information to simplify the quantum ontology and define a unique initial condition in such a world. First, I introduce Density Matrix Realism, the thesis that the quantum universe is described by a fundamental density matrix that represents something objective. This stands in sharp contrast to Wave Function Realism, the thesis that the quantum universe is described by a wave function that represents something objective. Second, I suggest that the Past Hypothesis is sufficient to determine a unique and simple density matrix. This is achieved by what I call the Initial Projection Hypothesis: the initial density matrix of the universe is the normalized projection onto the special low-dimensional Hilbert space. Third, because the initial quantum state is unique and simple, we have a strong case for the Nomological Thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, including for the harmony between statistical mechanics and quantum mechanics, the dynamic unity of the universe and the subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic / thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
If there are fundamental laws of nature, can they fail to be exact? In this paper, I consider the possibility that some fundamental laws are vague. I call this phenomenon 'fundamental nomic vagueness.' I characterize fundamental nomic vagueness as the existence of borderline lawful worlds and the presence of several other accompanying features. Under certain assumptions, such vagueness prevents the fundamental physical theory from being completely expressible in the mathematical language. Moreover, I suggest that such vagueness can be regarded as 'vagueness in the world.' For a case study, we turn to the Past Hypothesis, a postulate that (partially) explains the direction of time in our world. We have reasons to take it seriously as a candidate fundamental law of nature. Yet it is vague: it admits borderline (nomologically) possible worlds. An exact version would lead to an untraceable arbitrariness absent in any other fundamental laws. However, the dilemma between fundamental nomic vagueness and untraceable arbitrariness is dissolved in a new quantum theory of time's arrow.
If the Past Hypothesis underlies the arrows of time, what is the status of the Past Hypothesis? In this paper, I examine the role of the Past Hypothesis in the Boltzmannian account and defend the view that the Past Hypothesis is a candidate fundamental law of nature. Such a view is known to be compatible with Humeanism about laws, but as I argue it is also supported by a minimal non-Humean "governing" view. Some worries arise from the non-dynamical and time-dependent character of the Past Hypothesis as a boundary condition, the intrinsic vagueness in its specification, and the nature of the initial probability distribution. I show that these worries do not have much force, and in any case they become less relevant in a new quantum framework for analyzing time's arrows---the Wentaculus. Hence, the view that the Past Hypothesis is a candidate fundamental law should be more widely accepted than it is now.
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
What is the quantum state of the universe? Although there have been several interesting suggestions, the question remains open. In this paper, I consider a natural choice for the universal quantum state arising from the Past Hypothesis, a boundary condition that accounts for the time-asymmetry of the universe. The natural choice is given not by a wave function but by a density matrix. I begin by classifying quantum theories into two types: theories with a fundamental wave function and theories with a fundamental density matrix. The Past Hypothesis is compatible with infinitely many initial wave functions, none of which seems to be particularly natural. However, once we turn to density matrices, the Past Hypothesis provides a natural choice---the normalized projection onto the Past Hypothesis subspace in the Hilbert space. Nevertheless, the two types of theories can be empirically equivalent. To provide a concrete understanding of the empirical equivalence, I provide a novel subsystem analysis in the context of Bohmian theories. Given the empirical equivalence, it seems empirically underdetermined whether the universe is in a pure state or a mixed state. Finally, I discuss some theoretical payoffs of the density-matrix theories and present some open problems for future research. (Bibliographic note: the thesis was submitted for the Master of Science in mathematics at Rutgers University.)
The Great Divide in metaphysical debates about laws of nature is between Humeans, who think that laws merely describe the distribution of matter, and non-Humeans, who think that laws govern it. The metaphysics can place demands on the proper formulations of physical theories. It is sometimes assumed that the governing view requires a fundamental / intrinsic direction of time: to govern, laws must be dynamical, producing later states of the world from earlier ones, in accord with the fundamental direction of time in the universe. In this paper, we propose a minimal primitivism about laws of nature (MinP) according to which there is no such requirement. On our view, laws govern by constraining the physical possibilities. Our view captures the essence of the governing view without taking on extraneous commitments about the direction of time or dynamic production. Moreover, as a version of primitivism, our view requires no reduction / analysis of laws in terms of universals, powers, or dispositions. Our view accommodates several potential candidates for fundamental laws, including the principle of least action, the Past Hypothesis, the Einstein equation of general relativity, and even controversial examples found in the Wheeler-Feynman theory of electrodynamics and retrocausal theories of quantum mechanics. By understanding governing as constraining, non-Humeans who accept MinP have the same freedom to contemplate a wide variety of candidate fundamental laws as Humeans do.
Two of the most difficult problems in the foundations of physics are (1) what gives rise to the arrow of time and (2) what the ontology of quantum mechanics is. I propose a unified 'Humean' solution to the two problems. Humeanism allows us to incorporate the Past Hypothesis and the Statistical Postulate into the best system, which we then use to simplify the quantum state of the universe. This enables us to confer the nomological status to the quantum state in a way that adds no significant complexity to the best system and solves the "supervenient-kind problem" facing the original version of the Past Hypothesis. We call the resultant theory the Humean unification. It provides a unified explanation of time asymmetry and quantum entanglement. On this theory, what gives rise to time's arrow is also responsible for quantum phenomena. The new theory has a separable mosaic, a best system that is simple and non-vague, less tension between quantum mechanics and special relativity, and a higher degree of theoretical and dynamical unity. The Humean unification leads to new insights that can be useful to Humeans and non-Humeans alike.
What exists at the fundamental level of reality? On the standard picture, the fundamental reality contains (among other things) fundamental matter, such as particles, fields, or even the quantum state. Non-fundamental facts are explained by facts about fundamental matter, at least in part. In this paper, I introduce a non-standard picture called the "cosmic void" in which the universe is devoid of any fundamental material ontology. Facts about tables and chairs are recovered from a special kind of laws that satisfy strong determinism. All non-fundamental facts are completely explained by nomic facts. I discuss a concrete example of this picture in a strongly deterministic version of the many-worlds theory of quantum mechanics. I discuss some philosophical and scientific challenges to this view, as well as some connections to ontological nihilism.
We expect the laws of nature that describe the universe to be exact, but what if that isn't true? In this popular science article, I discuss the possibility that some candidate fundamental laws of nature, such as the Past Hypothesis, may be vague. This possibility is in conflict with the idea that the fundamental laws of nature can always and faithfully be described by classical mathematics. [Bibliographic note: this article is featured on the magazine website under a different title as "The fuzzy law that could break the idea of a mathematical universe" and on the magazine cover as "The Flaw at the Heart of Reality: Why precise mathematical laws can never fully explain the universe." It is a popular version of the article "Nomic Vagueness," which can be found on arXiv: 2006.05298.]
A strongly deterministic theory of physics is one that permits exactly one possible history of the universe. In the words of Penrose (1989), "it is not just a matter of the future being determined by the past; the entire history of the universe is fixed, according to some precise mathematical scheme, for all time." Such an extraordinary feature may appear unattainable in any realistic and simple theory of physics. In this paper, I propose a definition of strong determinism and contrast it with those of standard determinism and super-determinism. Next, I discuss its consequences for explanation, causation, prediction, fundamental properties, free will, and modality. Finally, I present the first example of a realistic, simple, and strongly deterministic physical theory---the Everettian Wentaculus. As a consequence of physical laws, the history of the Everettian multiverse could not have been different. If the Everettian Wentaculus is empirically equivalent to other quantum theories, we can never empirically find out whether or not our world is strongly deterministic. Even if strong determinism fails to be true, it is closer to the actual world than we have presumed, with implications for some of the central topics in philosophy and foundations of physics.
In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom.

Consequently, this account extends Hartry Field's program outlined in Science Without Numbers (1980), responds to David Malament's long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer's (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about "wave function realism." In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation).

Along the way, I axiomatize the quantum phase structure as what I shall call a "periodic difference structure" and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure.

(For a more recent version of this paper, please see "The Intrinsic Structure of Quantum Mechanics" available on PhilPapers.)
One of the most difficult problems in the foundations of physics is what gives rise to the arrow of time. Since the fundamental dynamical laws of physics are (essentially) symmetric in time, the explanation for time's arrow must come from elsewhere. A promising explanation introduces a special cosmological initial condition, now called the Past Hypothesis: the universe started in a low-entropy state. Unfortunately, in a universe where there are many copies of us (in the distant "past" or the distant "future"), the Past Hypothesis is not enough; we also need to postulate self-locating (de se) probabilities. However, I show that we can similarly use self-locating probabilities to strengthen its rival---the Fluctuation Hypothesis, leading to in-principle empirical underdetermination and radical epistemological skepticism. The underdetermination is robust in the sense that it is not resolved by the usual appeal to 'empirical coherence' or 'simplicity.' That is a serious problem for the vision of providing a completely scientific explanation of time's arrow.
What is the proper metaphysics of quantum mechanics? In this dissertation, I approach the question from three different but related angles. First, I suggest that the quantum state can be understood intrinsically as relations holding among regions in ordinary space-time, from which we can recover the wave function uniquely up to an equivalence class (by representation and uniqueness theorems). The intrinsic account eliminates certain conventional elements (e.g. overall phase) in the representation of the quantum state. It also dispenses with first-order quantification over mathematical objects, which goes some way towards making the quantum world safe for a nominalistic metaphysics suggested in Field (1980, 2016). Second, I argue that the fundamental space of the quantum world is the low-dimensional physical space and not the high-dimensional space isomorphic to the "configuration space." My arguments are based on considerations about dynamics, empirical adequacy, and symmetries of quantum mechanics. Third, I show that, when we consider quantum mechanics in a time-asymmetric universe (with a large entropy gradient), we obtain new theoretical and conceptual possibilities. In such a model, we can use the low-entropy boundary condition known as the Past Hypothesis (Albert, 2000) to pin down a natural initial quantum state of the universe. However, the universal quantum state is not a pure state but a mixed state, represented by a density matrix that is the normalized projection onto the Past Hypothesis subspace. This particular choice has interesting consequences for Humean supervenience, statistical mechanical probabilities, and theoretical unity.
In this thought-provoking book, Richard Healey proposes a new interpretation of quantum theory inspired by pragmatist philosophy. Healey puts forward the interpretation as an alternative, on the one hand, to realist quantum theories such as Bohmian mechanics, spontaneous collapse theories, and many-worlds interpretations, which are different proposals for describing what the quantum world is like and what the basic laws of physics are, and, on the other hand, to non-realist interpretations such as quantum Bayesianism, which proposes to understand quantum theory as describing agents' subjective epistemic states. The central idea of Healey's proposal is to understand quantum theory as providing not a description of the physical world but a set of authoritative and objectively correct prescriptions about how agents should act. The book provides a detailed development and defense of that idea, and it contains interesting discussions about a wide range of philosophical issues such as representation, probability, explanation, causation, objectivity, meaning, and fundamentality. Healey's project is at the intersection of physics and philosophy. The book is divided into two parts. Part I of the book discusses the foundational questions in quantum theory from the perspective of the prescriptive interpretation. In Part II, Healey discusses the philosophical implications of the view. Both parts are written in a way that is largely accessible to non-specialists. In this brief book review, I will focus on two questions: (1) How does Healey's idea work? (2) What reasons are there to believe in it?