The idea that there could be spatially extended mereological simples has recently been defended by a number of metaphysicians (Markosian 1998, 2004; Simons 2004; Parsons (2000) also takes the idea seriously). Peter Simons (2004) goes further, arguing not only that spatially extended mereological simples (henceforth just extended simples) are possible, but that it is more plausible that our world is composed of such simples than that it is composed of either point-sized simples or of atomless gunk. The difficulty for these views lies in explaining why the various sub-volumes of space occupied by such simples are not occupied by proper parts of those simples. Intuitively at least, many of us find compelling the idea that spatially extended objects have proper parts at every sub-volume of the region they occupy. It seems that the defender of extended simples must reject a seemingly plausible claim, what Simons calls the geometric correspondence principle (GCP): that any (spatially) extended object has parts that correspond to the parts of the region that it occupies (Simons 2004: 371). We disagree. We think that GCP is a plausible principle. We also think it is plausible that our world is composed of extended simples. We reconcile these two notions by two means. On the one hand we pay closer attention to the physics of our world. On the other hand, we consider what happens when our concept of something, in this case space, contains elements not all of which are realized in any one thing; instead, key components are realized in different features of the world.
We discuss the fate of the correspondence principle beyond quantum mechanics, specifically in quantum field theory and quantum gravity, in connection with the intrinsic limitations of the human ability to observe the external world. We conclude that the best correspondence principle is made of unitarity, locality, proper renormalizability (a refinement of strict renormalizability), combined with fundamental local symmetries and the requirement of having a finite number of fields. Quantum gravity is identified in an essentially unique way. The gauge interactions are uniquely identified in form. By contrast, the matter sector remains basically unrestricted. The major prediction is the violation of causality at small distances.
In a recent revision (chapter 4 of Nowakowa and Nowak 2000) of an older article Leszek Nowak (1992) has attempted to rebut Niiniluoto’s 1990 critical suggestion that proponents of the Poznań idealizational approach to the sciences have committed a rather elementary logical error in the formal machinery that they advocate for use in the analysis of scientific methodology. In this paper I criticize Nowak’s responses to Niiniluoto’s suggestion, and, subsequently, work out some of the consequences of that criticism for understanding the role that idealization plays in scientific methodology.
We recently presented our Efimov K-theory of Diamonds, proposing a pro-diamond, a large stable (∞,1)-category of diamonds (D^{diamond}), and a localization sequence for diamond spectra. Commensurate with the localization sequence, we now detail four potential applications of the Efimov K-theory of D^{diamond}: emergent time as a pro-emergence (v-stack time) in a diamond holographic principle using Scholze’s six operations in the étale cohomology of diamonds; a pro-Generative Adversarial Network and v-stack perceptron; D^{diamond} cryptography; and diamond nonlocality in perfectoid quantum physics.
The correspondence principle made of unitarity, locality and renormalizability has been very successful in quantum field theory. Among other things, it helped us build the standard model. However, it also showed important limitations. For example, it failed to restrict the gauge group and the matter sector in a powerful way. After discussing its effectiveness, we upgrade it to make room for quantum gravity. The unitarity assumption is better understood, since it allows for the presence of physical particles as well as fake particles (fakeons). The locality assumption is applied to an interim classical action, since the true classical action is nonlocal and emerges from the quantization and a later process of classicization. The renormalizability assumption is refined to single out the special role of the gauge couplings. We show that the upgraded principle leads to an essentially unique theory of quantum gravity. In particular, in four dimensions, a fakeon of spin 2, together with a scalar field, is able to make the theory renormalizable while preserving unitarity. We offer an overview of quantum field theories of particles and fakeons in various dimensions, with and without gravity.
This paper explores the issue of the unification of three languages of physics: the geometric language of forces, the geometric language of fields or 4-dimensional space-time, and the probabilistic language of quantum mechanics. On the one hand, equations in each language may be derived from the Principle of Least Action (PLA). On the other hand, Feynman's path integral method could explain the physical meaning of PLA. The axioms of classical and relativistic mechanics can be considered as consequences of Feynman's formulation of quantum mechanics.
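For orientation, the Principle of Least Action in its standard textbook form (stated here for reference, not quoted from the abstract above): the physical trajectory makes the action stationary, and stationarity yields the Euler–Lagrange equation.

```latex
\[
  S[q] = \int_{t_1}^{t_2} L\bigl(q(t), \dot{q}(t), t\bigr)\, dt,
  \qquad
  \delta S = 0
  \;\Longrightarrow\;
  \frac{d}{dt}\,\frac{\partial L}{\partial \dot{q}}
  - \frac{\partial L}{\partial q} = 0 .
\]
```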
Supra-Bayesianism is the Bayesian response to learning the opinions of others. Probability pooling constitutes an alternative response. One natural question is whether there are cases where probability pooling gives the supra-Bayesian result. This has been called the problem of Bayes-compatibility for pooling functions. It is known that in a common prior setting, under standard assumptions, linear pooling cannot be non-trivially Bayes-compatible. We show by contrast that geometric pooling can be non-trivially Bayes-compatible. Indeed, we show that, under certain assumptions, geometric and Bayes-compatible pooling are equivalent. Granting supra-Bayesianism its usual normative status, one upshot of our study is thus that, in a certain class of epistemic contexts, geometric pooling enjoys a normative advantage over linear pooling as a social learning mechanism. We discuss the philosophical ramifications of this advantage, which we show to be robust to variations in our statement of the Bayes-compatibility problem.
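As a quick illustration of the two pooling rules this abstract compares (a sketch under our own conventions, not code from the paper; the function names and the two-outcome example are ours): linear pooling takes a weighted arithmetic mean of the individual probabilities, while geometric pooling takes a weighted geometric mean and renormalizes.

```python
# Sketch of linear vs geometric opinion pooling over a finite
# outcome space (illustrative only).

def linear_pool(dists, weights):
    # Weighted arithmetic mean of the probabilities, outcome by outcome.
    return [sum(w * d[i] for d, w in zip(dists, weights))
            for i in range(len(dists[0]))]

def geometric_pool(dists, weights):
    # Weighted geometric mean of the probabilities, then renormalize.
    raw = [1.0] * len(dists[0])
    for d, w in zip(dists, weights):
        raw = [r * (p ** w) for r, p in zip(raw, d)]
    total = sum(raw)
    return [r / total for r in raw]

# Two agents' credences over a binary outcome, pooled with equal weight.
p1 = [0.8, 0.2]
p2 = [0.4, 0.6]
w = [0.5, 0.5]
print(linear_pool([p1, p2], w))     # approximately [0.6, 0.4]
print(geometric_pool([p1, p2], w))  # roughly [0.62, 0.38]
```

Note that the geometric pool is not simply the average: the renormalization step is what allows it to behave like a Bayesian update in the settings the paper studies.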
The paper delineates a new approach to truth that falls under the category of “Pluralism within the bounds of correspondence”, and illustrates it with respect to mathematical truth. Mathematical truth, like all other truths, is based on correspondence, but the route of mathematical correspondence differs from other routes of correspondence in (i) connecting mathematical truths to a special aspect of reality, namely, its formal aspect, and (ii) doing so in a complex, indirect way, rather than in a simple and direct way. The underlying idea is that an intricate mind is capable of creating intricate routes from language to reality, and this enables it to apply correspondence principles in areas for which correspondence is traditionally thought to be problematic.
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.
Martin Peterson’s The Ethics of Technology: A Geometric Analysis of Five Moral Principles offers a welcome contribution to the ethics of technology, understood by Peterson as a branch of applied ethics that attempts ‘to identify the morally right courses of action when we develop, use, or modify technological artifacts’ (3). He argues that problems within this field are best treated by the use of five domain-specific principles: the Cost-Benefit Principle, the Precautionary Principle, the Sustainability Principle, the Autonomy Principle, and the Fairness Principle. These principles are, in turn, to be understood and applied with reference to the geometric method. This method is perhaps the most interesting and novel part of Peterson’s book, and I’ll devote the bulk of my review to it.
We investigate a lattice of conditional logics described by a Kripke-type semantics suggested by Chellas and Segerberg (Chellas–Segerberg (CS) semantics) plus 30 further principles. We (i) present a non-trivial frame-based completeness result, (ii) give a translation procedure which yields corresponding trivial frame conditions for arbitrary formula schemata, and (iii) provide non-trivial frame conditions in CS semantics which correspond to the 30 principles.
In this monumental work by Jesper Lützen on the mechanics of Heinrich Hertz we find a magnificent exposition of the life and work of this distinguished German physicist. An interesting account of the intellectual influences that shaped his scientific thought culminates in an exhaustive analysis of the reformulation of classical mechanics that Hertz proposed shortly before his premature death.
In times of crisis, when current theories are revealed as inadequate to the task and new physics is thought to be required, physics turns to re-evaluate its principles and to seek new ones. This paper explores the various types and roles of principles that feature in the problem of quantum gravity as a current crisis in physics. I illustrate the diversity of the principles being appealed to, and show that principles serve in a variety of roles in all stages of the crisis, including in motivating the need for a new theory and defining what this theory should be like. In particular, I consider: the generalised correspondence principle, UV-completion, background independence, and the holographic principle. I also explore how the current crisis fits with Friedman’s view on the roles of principles in revolutionary theory-change, finding that while many key aspects of this view are not represented in quantum gravity, the view could potentially offer a useful diagnostic and prescriptive strategy. This paper is intended to be relatively non-technical, and to bring some of the philosophical issues from the search for quantum gravity to a more general philosophical audience interested in the roles of principles in scientific theory-change.
Formulations of Mill's principle of utility are examined, and it is shown that Mill did not recognize a moral obligation to maximize the good, as is often assumed. His was neither a maximizing act nor rule utilitarianism. It was a distinctive minimizing utilitarianism which morally obligates us only to abstain from inflicting harm, to prevent harm, to provide for others minimal essentials of well-being (to which rights correspond), and to be occasionally charitable or benevolent.
When talking about truth, we ordinarily take ourselves to be talking about one-and-the-same thing. Alethic monists suggest that theorizing about truth ought to begin with this default or pre-reflective stance, and, subsequently, parlay it into a set of theoretical principles that are aptly summarized by the thesis that truth is one. Foremost among them is the invariance principle.
While philosophers of science discuss General Relativity, mathematical physicists do not question it. Therefore, there is a conflict. From the theoretical point of view, “the question of precisely what Einstein discovered remains unanswered, for we have no consensus over the exact nature of the theory's foundations. Is this the theory that extends the relativity of motion from inertial motion to accelerated motion, as Einstein contended? Or is it just a theory that treats gravitation geometrically in the spacetime setting?”. “The voices of dissent proclaim that Einstein was mistaken over the fundamental ideas of his own theory and that their basic principles are simply incompatible with this theory. Many newer texts make no mention of the principles Einstein listed as fundamental to his theory; they appear as neither axiom nor theorem. At best, they are recalled as ideas of purely historical importance in the theory's formation. The very name General Relativity is now routinely condemned as a misnomer and its use often zealously avoided in favour of, say, Einstein's theory of gravitation. What has complicated an easy resolution of the debate are the alterations of Einstein's own position on the foundations of his theory” (Norton, 1993). On the other hand, from the mathematical point of view, “General Relativity had been formulated as a messy set of partial differential equations in a single coordinate system. People were so pleased when they found a solution that they didn't care that it probably had no physical significance” (Hawking and Penrose, 1996). So, for a time, the declaration of quantum theorists, “I take the positivist viewpoint that a physical theory is just a mathematical model and that it is meaningless to ask whether it corresponds to reality. All that one can ask is that its predictions should be in agreement with observation” (Hawking and Penrose, 1996), seemed to solve the problem, but results recently achieved with the help of tightly and collectively synchronized clocks in orbit frontally contradict fundamental assumptions of the theory of Relativity. These observations disagree with the predictions of the theory of Relativity (Hatch, 2004a, 2004b, 2007). The mathematical model was developed first by Grossmann, who presented it in 1913 as the mathematical part of the Entwurf theory, still referred to a curved Minkowski spacetime. Einstein completed the mathematical model in 1915, formulated for Riemann's spacetimes. In this paper, we argue that of General Relativity only the mathematical model currently remains, darkened by the results of Hatch, and, of course, we conclude that an Einstein gravity theory does not exist.
Our concept of the universe and the material world is foundational for our thinking and our moral lives. In an earlier contribution to the URAM project I presented what I called 'the ultimate organizational principle' of the universe. In that article (Grandpierre 2000, pp. 12-35) I took as an adversary the wide-spread system of thinking which I called 'materialism'. According to those who espouse this way of thinking, the universe consists of inanimate units or sets of material such as atoms or elementary particles. Against this point of view on reality, I argued that it is 'logic', which exists in our inner world as a function of our mind, that is the universal organizing power of the universe. The present contribution builds upon this insight. Then I focussed on rationality; now I am interested in the responsibility that is the driving force behind our effort to find coherence and ultimate perspectives in our cosmos. It is shown that biology fundamentally differs from physics. Biology has its own fundamental principle, which was formulated for the first time in history in a scientific manner by Ervin Bauer. This fundamental principle is the cosmic life principle. I show that if one considers the physical laws as corresponding to reality, as in scientific realism, then physicalism becomes fundamentally spiritual, because the physical laws are not material. I point out that the physical laws originate from the fundamental principle of physics, which is the least action principle. I show that the fundamental principle of physics can be considered as the "instinct of atoms". Our research has found deep and meaningful connections between the basic principle of physics and the ultimate principles of the universe: matter, life and reason. Therefore, the principle of least action is not necessarily an expression of sterile inanimateness.
On the contrary, the principle of physics is related to the life principle of the universe, to the world of instincts behind the atomic world, in which the principles of physics, biology, and psychology arise from the same ultimate principle. Our research sheds new light on the sciences of physics, biology, and psychology in close relation to these basic principles. These ultimate principles have a primary importance in our understanding of the nature of Man and the Universe, together with the relations between Man and Nature, and between Man and the Universe. The results offer new foundations for understanding our own role on the Earth, in Nature and in the Universe. Even the apparently inanimate world of physics shows itself to be animate on long timescales, having a kind of pre-human consciousness in its basic organisation. This hypothesis offers a way to understand when and how the biological laws may direct physical laws, and, moreover, offers a new perspective from which to study and understand under which conditions self-consciousness can govern the laws of biology and physics. This point of view offers living beings and humans the possibility of strengthening our natural identity, and of recognising the wide perspective arising from having access to the deepest ranges of our own human resources and realising the task for which human and individual life has been created.
Recent work has defended “Euclidean” theories of set size, in which Cantor’s Principle (two sets have equally many elements if and only if there is a one-to-one correspondence between them) is abandoned in favor of the Part-Whole Principle (if A is a proper subset of B then A is smaller than B). It has also been suggested that Gödel’s argument for the unique correctness of Cantor’s Principle is inadequate. Here we see from simple examples, not that Euclidean theories of set size are wrong, but that they must be either very weak and narrow or largely arbitrary and misleading.
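The clash the abstract turns on can be seen in a toy, finite-horizon form (our illustration, not one of the paper's examples): the doubling map pairs the naturals one-to-one with the evens, even though the evens form a proper subset of the naturals.

```python
# Toy illustration of the tension between Cantor's Principle and the
# Part-Whole Principle, using finite stand-ins for the infinite sets.
naturals = list(range(10))           # 0..9, standing in for the naturals
evens = [2 * n for n in naturals]    # the bijection n -> 2n

# Cantor's Principle pulls one way: the map n -> 2n is a one-to-one
# correspondence, so the two collections are matched element for element.
assert len(evens) == len(naturals)

# The Part-Whole Principle pulls the other way: the evens produced are
# a proper subset of the numbers 0..19, so they should count as smaller.
assert set(evens) < set(range(20))

print("paired one-to-one:", list(zip(naturals, evens))[:3])
```

In the infinite case both pulls apply to the very same pair of sets, which is exactly what forces a choice between the two principles.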
In a first investigation, a Lacan-motivated template of the Poe story is fitted to the data. A segmentation of the storyline is used in order to map out the diachrony. Based on this, it will be shown how synchronous aspects, potentially related to Lacanian registers, can be sought. This demonstrates the effectiveness of an approach based on a model template of the storyline narrative. In a second and more comprehensive investigation, we develop an approach for revealing, that is, uncovering, Lacanian register relationships. Objectives of this work include the wide and general application of our methodology. This methodology is strongly based on the “letting the data speak” Correspondence Analysis analytics platform of Jean-Paul Benzécri, that is, the geometric data analysis, both qualitative and quantitative, developed by Pierre Bourdieu.
The file on this site provides the slides for a lecture given in Hangzhou in May 2018, and the lecture itself is available at the URL beginning 'sms' in the set of links provided in connection with this item.

It is commonly assumed that regular physics underpins biology. Here it is proposed, in a synthesis of ideas by various authors, that in reality structures and mechanisms of a biological character underpin the world studied by physicists, in principle supplying detail in the domain that according to regular physics is of an indeterminate character. In regular physics mathematical equations are primary, but this constraint leads to problems with reconciling theory and reality. Biology on the other hand typically does not characterise nature in quantitative terms, instead investigating in detail important complex interrelationships between parts, leading to an understanding of the systems concerned that is in some respects beyond that which prevails in regular physics. It makes contact with quantum physics in various ways, for example in that both involve interactions between observer and observed, an insight that explains what is special about processes involving observation, justifying in the quantum physics context the replacement of the unphysical many-worlds picture by one involving collapse. The link with biology furthermore clarifies Wheeler’s suggestion that a multiplicity of observations can lead to the ‘fabrication of form’, including the insight that this process depends on very specific ‘structures with power’ related to the 'semiotic scaffolding' of the application of sign theory to biology known as biosemiotics.
The observer-observed 'circle' of Wheeler and Yardley is a special case of a more general phenomenon, oppositional dynamics, related to the 'intra-action' of Barad's Agential Realism, involving cooperating systems such as mind and matter, abstract and concrete, observer and observed, that preserve their identities while interacting with one another in such a way as to act as a unit. A third system may also be involved, the mediating system of Peirce linking the two together. Such a situation of changing connections and separations may plausibly lead in the future to an understanding of how complex systems are able to evolve to produce 'life, the universe and everything'.

(Added 1 July 2018) The general structure proposed here as an alternative to a mathematics-based physics can be usefully characterised by relating it to different disciplines and the specialised concepts utilised therein. In theoretical physics, the test for the correctness of a theory typically involves numerical predictions, corresponding to which theories are expressed in terms of equations, that is to say assertions that two quantities have identical values. Equations have a lesser significance in biology, which typically talks in terms of functional mechanisms, dependent for example on details of chemistry and concepts such as genes, natural selection, signals and geometrical or topologically motivated concepts such as the interconnections between systems and the unfolding of DNA. Biosemiotics adds to this the concept of signs and their interpretation, implying novel concepts such as semiotic scaffolding and the semiosphere, code duality, and appreciation of the different types of signs, including symbols and their capacity for abstraction and use in language systems. Circular Theory, like the ideas of Barad, adds to this picture considerations such as the idea of oppositional dynamics.
The proposals in this lecture can be regarded as the idea that concepts such as those deriving from biosemiotics have more general applicability than just conventional biology and may apply, in some circumstances, to nonlinear systems generally, including the domain new to science hypothesised to underlie the phenomena of present-day physics.

The task then has to be to restore the mathematical aspect presumed, in this picture, not to be fundamental as it is in conventional theory. Deacon has invoked a complex sequence of evolutionary steps to account for the emergence over time of human language systems, and correspondingly mathematical behaviour can be subsumed under the general evolutionary mechanisms of biosemiotics (cf. also the proposals of Davis and Hersh regarding the nature of mathematics), so that the mathematical behaviour of physical systems is consistent with the proposed scheme. In conclusion, it is suggested that theoretical physicists should cease expecting to find some universal mathematical ‘theory of everything’, and focus instead on understanding in more detail complex systems exhibiting behaviour of a biological character, extending existing understanding. This may in time provide a more fruitful understanding of the natural world than does the regular approach. The essential concepts have an observational basis from both biology and the little-known discipline of cymatics (a discipline concerned with the remarkable patterns that specific waveforms can give rise to), while again computer simulations also offer promise in providing insight into the complex behaviours involved in the above proposals.

References:
Jesper Hoffmeyer, Semiotic Scaffolding of Living Systems. Commens, a Digital Companion to C. S. Peirce (on the Commens web site).
Terrence Deacon, The Symbolic Species. W. W. Norton & Co.
Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press.
Philip Davis and Reuben Hersh, The Mathematical Experience. Penguin.
Ilexa Yardley, Circular Theory.
Robert Batterman’s ontological insights (2002, 2004, 2005) are apt: Nature abhors singularities. “So should we,” responds the physicist. However, Batterman’s epistemic assessments of the matter prove to be less clear, for in the same vein he writes that singularities play an essential role in certain classes of physical theories referring to certain types of critical phenomena. I devise a procedure (“methodological fundamentalism”) which exhibits how singularities, at least in principle, may be avoided within the same classes of formalisms discussed by Batterman. I show that we need not accept some divergence between explanation and reduction (Batterman 2002), or between epistemological and ontological fundamentalism (Batterman 2004, 2005). Though I remain sympathetic to the ‘principle of charity’ (Frisch 2005), which appears to favor a pluralist outlook, I nevertheless call into question some of the forms such pluralist implications take in Robert Batterman’s conclusions. It is difficult to reconcile some of the pluralist assessments that he and some of his contemporaries advocate with what appears to be a countervailing trend in a burgeoning research tradition known as Clifford (or geometric) algebra. In my critical chapters (2 and 3) I use some of the demonstrated formal unity of Clifford algebra to argue that Batterman (2002) conflates a physical theory’s ontology with its purely mathematical content. Carefully distinguishing the two, and employing Clifford algebraic methods, reveals a symmetry between reduction and explanation that Batterman overlooks. I refine this point by indicating that geometric algebraic methods are an active area of research in computational fluid dynamics, and, applied in modeling the behavior of droplet formation, appear to instantiate a “methodologically fundamental” approach.
I argue in my introductory and concluding chapters that the model of inter-theoretic reduction and explanation offered by Fritz Rohrlich (1988, 1994) provides the best framework for accommodating the burgeoning pluralism in philosophical studies of physics, with the presumed claims of formal unification demonstrated by physicists’ choices of mathematical formalisms such as Clifford algebra. I show how Batterman’s insights can be reconstructed in Rohrlich’s framework, preserving Batterman’s important philosophical work, minus what I consider to be his incorrect conclusions.
In this article I develop an elementary system of axioms for Euclidean geometry. On the one hand, the system is based on the symmetry principles which express our a priori ignorant approach to space: all places are the same to us, all directions are the same to us, and all units of length we use to create geometric figures are the same to us. On the other hand, through a process of algebraic simplification, this system of axioms directly yields Weyl’s system of axioms for Euclidean geometry. The system of axioms, together with its a priori interpretation, offers new views for the philosophy and pedagogy of mathematics: it supports the thesis that Euclidean geometry is a priori, it supports the thesis that in modern mathematics Weyl’s system of axioms is preferred to Euclid’s system because it reflects the a priori underlying symmetries, and it gives a new and promising approach to learning geometry which, through Weyl’s system of axioms, leads from the essential geometric symmetry principles of a mathematical nature directly to modern mathematics.
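For readers unfamiliar with it, Weyl's axiomatization in its standard form (stated here for orientation, not quoted from the paper) characterizes a Euclidean space as a point set acted on by an inner-product space:

```latex
% Weyl's system (standard form): a Euclidean space is a nonempty set of
% points $P$ together with a finite-dimensional real inner-product space
% $V$ and a map $P \times P \to V$, $(A,B) \mapsto \vec{AB}$, such that:
\begin{itemize}
  \item $\vec{AB} + \vec{BC} = \vec{AC}$ for all points $A, B, C$;
  \item for every point $A$ and every vector $v \in V$ there is a
        unique point $B$ with $\vec{AB} = v$.
\end{itemize}
% Distance is then recovered as $d(A,B) = \lVert \vec{AB} \rVert$.
```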
The Principle of Ariadne, formulated in 1988 by Walter Carnielli and Carlos Di Prisco and later published in 1993, is an infinitary principle that is independent of the Axiom of Choice in ZF, although it can be consistently added to the remaining ZF axioms. The present paper surveys, and motivates, the foundational importance of the Principle of Ariadne and proposes the Ariadne Game, showing that the Principle of Ariadne corresponds precisely to a winning strategy for the Ariadne Game. Some relations to other alternative set-theoretical principles are also briefly discussed.
Cyclic mechanics is intended as a suitable generalization of both quantum mechanics and general relativity, apt to unify them. It is founded on a few principles, which can be enumerated approximately as follows: 1. Actual infinity, or the universe, can be considered as a physical and experimentally verifiable entity. It allows mechanical motion to exist. 2. A new law of conservation has to be involved to generalize and comprise the separate laws of conservation of classical and relativistic mechanics, and especially that of conservation of energy: this is the conservation of action or information. 3. Time is not a uniformly flowing time in general. It can have a speed, an acceleration, more than one dimension, or be discrete. 4. The following principle of cyclicity: the universe returns to any point of it. The return can be only kinematic, i.e. per a unit of energy (or mass), or thermodynamic, i.e. considering the universe as a thermodynamic whole. 5. The kinematic return, which is per a unit of energy (or mass), is the counterpart of conservation of energy, which can be interpreted as the particular case of conservation of action "per a unit of time". The kinematic return per a unit of energy (or mass) can in turn be interpreted as another particular law of conservation in the framework of conservation of action (or information), namely conservation of wave period (or time). These two counterpart laws of conservation correspond exactly to the particle "half" and to the wave "half" of wave-particle duality. 6. The principle of quantum invariance is introduced. It means that all physical laws have to be invariant under discrete and continuous (smooth) morphisms (motions), or mathematically, under the axiom of choice. The list is not intended to be exhaustive or disjoint, but only to give an introductory idea.
Three distinct turning points (“bottleneck breakings”) in universal evolution are discussed at some length in terms of “self-reference” and (corresponding) “Reality Principles.” The first (the origin and evolution of animate Nature) and second (human consciousness) are shown to necessarily precede a third one, that of Marxist philosophy. It is pointed out that while the previous two could occupy a natural (so in a sense neutral) place as parts of human science, the self-reference of Marxism, as a _social_ human phenomenon, through its direct bearings on the _practice_ of society, did have a stormy history. I conclude that the fall of Bolshevism was unavoidable, and still we might uphold our hope for a truly free society of humankind, on the very basis of what we have learned of the fate of Marxist philosophy as such, as a _recursively evolving_ social _practice_: the freedom of humankind from its own ideological burdens (constraints).
In Newton’s correspondence with Richard Bentley, Newton rejected the possibility of remote action, even though he had accepted it in the Principia. In practice, Newton’s natural philosophy is indissolubly linked to his conception of God. The knowledge of God seems to be essentially immutable, unlike the laws of nature, which can be subjected to procedures of refinement, revision and rejection. As Newton later states in Opticks, the cause of gravity is an active principle in matter, but this active principle is not an essential aspect of matter; rather, it is something that must have been added to matter by God, and in the same Query of Opticks Newton even argues for the need for divine intervention. DOI: 10.13140/RG.2.2.16732.44162.
Interpersonal comparisons of well-being (ICWs) confront the longstanding unsolved epistemic problem of other minds (EPOM): the problem of how to achieve objective knowledge of people's subjective mental states. The intractability of the EPOM may lead to the hope that Rational Choice Theory (RCT) can show that information about how people would choose over goods and gambles is sufficient--and information about subjective mental states therefore unnecessary--for interpersonal comparisons of levels and changes in well-being, thereby bypassing the EPOM. I argue that this hope cannot be fulfilled. Our most plausible theories of value--whether anti-realist or realist--and theories of what makes a life go best--whether preference hedonism, success theory, or objective list theory--tie well-being to our evaluative attitudes towards our lives. These are distinct from and only contingently related to motivational attitudes to choose or behave in certain ways, and therefore to choices and behaviors themselves. Interpersonal comparisons of the evaluative attitudes are therefore necessary, though perhaps insufficient, for ICWs. Preference theory's zero-one rule ignores these attitudes and is therefore implausible. Its extended preference approach assumes that our preferences are perfectly sympathetic and therefore begs the question of the EPOM. I argue that a principled solution to the EPOM, and to interpersonal comparisons of the evaluative attitudes, is provided by type correspondence between these attitudes and brain states. It remains an open and difficult question whether there exists a summary evaluative attitude whose intensity can serve as an index of an individual's overall well-being, and which is the appropriate target of all efforts aimed at promoting the personal good, or whether the self and therefore well-being are too fragmented for this.
The success of a few theories in statistical thermodynamics can be correlated with their selectivity to reality. These are the theories of Boltzmann, Gibbs, and Einstein. The starting point is Carnot’s theory, which implicitly defines the general selection of reality relevant to thermodynamics. The three other theories share this selection, but specify it further in detail. Each of them separates a few main aspects within the scope of the implicit thermodynamic reality. Their success is grounded in that selection. Those aspects can be represented by corresponding oppositions: macroscopic – microscopic; elements – states; relational – non-relational; and observable – theoretical. They can be interpreted as axes of independent qualities constituting a common qualitative reference frame shared by those theories, each of which occupies a different place in it. This reference frame can be interpreted as an additional selection of reality within Carnot’s initial selection, describable as macroscopic and both observable and theoretical. The deduced reference frame refers implicitly to many scientific theories independent of their subject, therefore defining a general and common space or subspace for scientific theories (though not for all). The immediate conclusion is: the examples of a few statistical thermodynamic theories demonstrate that the concept of “reality” is changed or generalized, or even exemplified (i.e. “de-generalized”), from one theory to another. Still, a few more general suggestions bearing on the scientific realism debate can be added: one can admit that reality in scientific theories is some partially shared common qualitative space or subspace, describable by relevant oppositions and rather independent of their subjects, which in general are quite different. Many or maybe all theories can be situated in that space of reality, which should develop by adding new dimensions for ever newer theories.
Its division into independent subspaces can represent the many-realities conception. The subject of a theory determines some relevant subspace of reality. This represents a selection within reality relevant to the theory in question. The success of that theory correlates essentially with the selection within reality relevant to its subject.
A principle, according to which any scientific theory can be mathematized, is investigated. Social science, liberal arts, history, and philosophy are meant first of all. That kind of theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure in a constructive way. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. The main statement is formulated as follows: any scientific theory admits an isomorphism to some mathematical structure in a constructive way. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and then the conception of “bracketing reality” is modelled to generalize Peano arithmetic in its relation to set theory in the foundations of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. The sketch of the proof is organized in five steps: a generalization of epoché; involving transfinite induction in the transition between Peano arithmetic and set theory; discussing the finiteness of Peano arithmetic; applying transfinite induction to Peano arithmetic; and discussing an arithmetical model of reality. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself, in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics.
The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is no less consistent than the latter, and that the principle is an independent axiom. The present paper follows a pathway grounded in Husserl’s phenomenology and “bracketing reality” to achieve the generalized arithmetic necessary for the principle to be founded in an alternative ontology, in which there is no reality external to mathematics: reality is included within mathematics. That latter mathematics is able to found itself and can be called Hilbert mathematics in honour of Hilbert’s program for self-founding mathematics on the basis of arithmetic. The principle of universal mathematizability is consistent with Hilbert mathematics, but not with Gödel mathematics. Consequently, its validity or rejection would resolve the problem of which mathematics refers to our being; and vice versa: the choice between them, for whatever reasons, would confirm or refute the principle as to our being. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. The Schrödinger equation in quantum mechanics is invoked to illustrate that ontology. Thus the problem of which of the two mathematics is more relevant to our being is discussed again in a new way. A few directions for future work can be: a rigorous formal proof of the principle as an independent axiom; the further development of an information ontology consistent with both kinds of mathematics, but much more natural for Hilbert mathematics; the development of the information interpretation of quantum mechanics as a mathematical one for information ontology and thus Hilbert mathematics; and the description of consciousness in terms of information ontology.
This paper centers on the implicit metaphysics behind the Theory of Relativity and the Principle of Indeterminacy – two revolutionary theories that changed 20th Century Physics – using the perspective of Husserlian Transcendental Phenomenology. Albert Einstein (1879-1955) and Werner Heisenberg (1901-1976) abolished the theoretical framework of Classical (Galilean-Newtonian) physics, which had been complemented and strengthened by Cartesian metaphysics. René Descartes (1596-1650) introduced a separation between subject and object (as two different and self-enclosed substances), while Galileo and Newton carried out the “mathematization” of the world. Newtonian physics, however, had an inexplicable postulate of absolute space and absolute time – a kind of geometrical framework, independent of all matter, for the explication of locality and acceleration. Thus, Cartesian modern metaphysics and Galilean-Newtonian physics go hand in hand, resulting in socio-ethical problems, materialism and environmental destruction. Einstein got rid of the Newtonian absolutes and was able to provide a new foundation for our notions of space and time: four-dimensional space-time, simultaneity and the constancy of the velocity of light, and the relativity of all systems of reference. Heisenberg, following Max Planck’s theory of quanta, told us of our inability to know sub-atomic phenomena, thus blurring the line of the Cartesian separation of object and subject and initiating the crisis of the foundations of Classical Physics. But the real crisis, according to Edmund Husserl (1859-1938), is that Modern (Classical) Science had “idealized” the world, severing nature from what he calls the Lebenswelt (life-world), the world that is simply there even before it has been reduced to mere mathematical-logical equations.
Husserl thus aims to establish a new science that returns to the “pre-scientific” and “non-mathematized” world of rich and complex phenomena: phenomena as they “appear to human consciousness”. To overcome the Cartesian equation of subject vs. object (man versus environment), Husserl brackets the external reality of Newtonian Science (epoché = to put in brackets, to suspend judgment) and emphasizes (1) the meaning of “world” as different from the “world” of Classical Physics, and (2) the intentionality of consciousness (L. in + tendere = to tend towards, to be essentially related to or connected to), which means that even before any scientific-logical description of external reality, there is always already a relation between consciousness and an external reality. The world is the equiprimordial existence of consciousness and of external reality. My paper aims to look at this new science of pre-idealized phenomena started by Husserl (a science of phenomena as they appear to conscious, human, lived experience, hence he calls it phenomenology), centering on the life-world and the intentionality of consciousness, as providing a new way of looking at ourselves and the world – in short, as providing a new metaphysics (an antidote to Cartesian metaphysics) that grounds the revolutionary findings of Einstein and Heisenberg. The environmental destruction, technocracy, and socio-ethical problems of the modern world are all rooted in this Galilean-Newtonian-Cartesian interpretation of the relationship between humans and the world after the crumbling of the European Middle Ages. Friedrich Nietzsche (1844-1900) comments that the modern world was heading toward nihilism (L. nihil = nothingness) at the turn of the century.
Now, after two World Wars and the dropping of the atomic bomb, with capitalism and imperialism on the one hand, and on the other the poverty and hunger of the non-industrialized countries alongside the destruction of nature (i.e., global warming), Nietzsche might be correct – unless humanity changes the way it looks at humanity and the kosmos. The works of Einstein, Heisenberg and Husserl seem to point the way for us humans to escape nihilism by a “great existential transformation.” What these thinkers of post-modernity (after Cartesian/Newtonian/Galilean modernity) point to are: a) a new therapeutic way of looking at ourselves and our world (metaphysics), and b) a new and corrective notion of “rationality” (different from the objectivist, mathematico-logical way of thinking). This paper is divided into four parts: 1) a summary of Classical Physics and a short history of Quantum Theory; 2) Einstein’s Special and General Relativity and Heisenberg’s Indeterminacy Principle; 3) Husserl’s discussion of the Crisis of Europe, the life-world and the intentionality of consciousness; and 4) a Metaphysics of Relativity and Indeterminacy and a corrective notion of Rationality in Husserl’s Phenomenology.
Which way does causation proceed? The pattern in the material world seems to be upward: particles to molecules to organisms to brains to mental processes. In contrast, the principles of quantum mechanics allow us to see a pattern of downward causation. These new ideas describe sets of multiple levels in which each level influences the levels below it through generation and selection. Top-down causation makes exciting sense of the world: we can find analogies in psychology, in the formation of our minds, in locating the source of consciousness, and even in the possible logic of belief in God.
The paper follows the track of a previous paper, “Natural cybernetics of time”, in relation to history, searching for ways in which history can be mathematized despite being a descriptive humanitarian science that investigates unique events and thus rejects any repeatability. The pathway of classical experimental science, mathematized gradually and smoothly by more and more relevant mathematical models, seems to be inapplicable. However, quantum mechanics suggests another pathway for mathematization: considering the historical reality as dual or “complementary” to its model. The historical reality by itself can be seen as mathematical if one considers it in Hegel’s manner, as a specific interpretation of the totality in permanent self-movement due to being just the totality, i.e. by means of the “speculative dialectics” of history, realized however as a theory both mathematical and empirical, and thus falsifiable both by logical contradictions within itself and by empirical discrepancies with the facts. No less, a Husserlian kind of “historical phenomenology” is possible alongside Hegel’s historical dialectics, sharing the postulate of the totality (and thus that of transcendentalism). One move would be to suggest the transcendental counterpart: an “eternal”, i.e. atemporal and aspatial, history alongside the usual, descriptive temporal history, and to equate the real course of history both with its alternative branches actually realized in different regions of the world and with merely imaginable, counterfactual histories. That universal and transcendental history is properly mathematical by itself, even in a neo-Pythagorean model. It is only represented on the temporal screen of standard historiography as a discrete series of unique events. An analogy to the readings of the apparatus in quantum mechanics can be useful.
Even more, that analogy is considered rigorously and logically as implied by the mathematical transcendental history, sharing with it the same quantity of information as an invariant of all possible alternative or counterfactual histories. One can invoke a hypothetical external viewpoint on history (as if from outside history, or from “God’s viewpoint”), from which all alternative or counterfactual histories can be regarded as a class of equivalence sharing the same information (i.e. the same number of choices, but realized in a different sequence, or with redundant ones added in each branch), similar and even mathematically isomorphic to Feynman trajectories in quantum mechanics. In particular, a fundamental law of mathematical history, the law of least choice of the real historical pathway, is deducible from the same approach. Its counterpart in physics is the well-known and confirmed law of least action, insofar as the quantity of action corresponds unequivocally to the quantity of information, or to the number of elementary historical choices.
Based on various documents from 1989-2002, through the original texts and with the author’s own contributions, this paper presents the refutation by the mathematicians and physicists A. Logunov and M. Mestvirishvili of A. Einstein’s “general relativity”, from the standpoint of these authors’ relativistic theory of gravitation; applying the fundamental physical principle of conservation of energy-momentum and using the absolute differential calculus, they rigorously carry out their mathematical proofs. It is conclusively shown that, from the Einstein-Grossmann-Hilbert equations, gravity is absurdly a metric field devoid of physical reality, unlike all other fields in nature, which are material fields, thereby interrupting the chain of transformations between the different existing fields. Also, in Einstein’s theory the proven equality of “inertial mass” and gravitational mass has no physical meaning. Therefore, “general relativity” does not obey the correspondence principle with Newton’s gravity.
Gideon Rosen and Robert Schwartzkopff have independently suggested (variants of) the following claim, which is a variant of Hume’s Principle: when the number of Fs is identical to the number of Gs, this fact is grounded by the fact that there is a one-to-one correspondence between the Fs and the Gs. My paper is a detailed critique of the proposal. I don’t find any decisive refutation of the proposal. At the same time, it has some consequences which many will find objectionable.
The numerous and diverse roles of theory reduction in science have been insufficiently explored in the philosophy literature on reduction. Part of the reason for this has been a lack of attention paid to reduction₂ (successional reduction)---although I here argue that this sense of reduction is closer to reduction₁ (explanatory reduction) than is commonly recognised, and I use an account of reduction that is neutral between the two. This paper draws attention to the utility---and incredible versatility---of theory reduction. A non-exhaustive list of various applications of reduction in science is presented, some of which are drawn from a particular case-study, being the current search for a new theory of fundamental physics. This case-study is especially interesting because it employs both senses of reduction at once, and because of the huge weight being put on reduction by the different research groups involved; additionally, it presents some unique uses for reduction---revealing, I argue, the fact that reduction can be of specialised and unexpected service in particular scientific cases. The paper makes two other general findings: that the functions of reduction that are typically assumed to characterise the different forms of the relation may instead be understood as secondary consequences of some other roles; and that most of the roles that reduction plays in science can actually also be fulfilled by a weaker relation than (the typical understanding of) reduction.
Gila Sher approaches knowledge from the perspective of the basic human epistemic situation—the situation of limited yet resourceful beings, living in a complex world and aspiring to know it in its full complexity. What principles should guide them? Two fundamental principles of knowledge are epistemic friction and freedom. Knowledge must be substantially constrained by the world (friction), but without active participation of the knower in accessing the world (freedom) theoretical knowledge is impossible. This requires a grounding of all knowledge, empirical and abstract, in both mind and world, but the fall of traditional foundationalism has led many to doubt the viability of this ‘classical’ project. Sher challenges this skepticism, charting a new foundational methodology, foundational holism, that differs from others in being holistic, world-oriented, and universal (i.e., applicable to all fields of knowledge). Using this methodology, Epistemic Friction develops an integrated theory of knowledge, truth, and logic. This includes (i) a dynamic model of knowledge, incorporating some of Quine’s revolutionary ideas while rejecting his narrow empiricism, (ii) a substantivist, non-traditional correspondence theory of truth, and (iii) an outline of a joint grounding of logic in mind and world. The model of knowledge subjects all disciplines to demanding norms of both veridicality and conceptualization. The correspondence theory is robust and universal yet not simplistic or naive, admitting diverse forms of correspondence. Logic’s grounding in the world brings it in line with other disciplines while preserving, and explaining, its strong formality, necessity, generality, and normativity.
The predominant approaches to understanding how quantum theory and General Relativity are related to each other implicitly assume that both theories use the same concept of mass. Given that despite great efforts such approaches have not yet produced a consistent falsifiable quantum theory of gravity, this paper entertains the possibility that the concepts of mass in the two theories are in fact distinct. It points out that if the concept of mass in quantum mechanics is defined such that it always exists in a superposition and is not a gravitational source, then this sharply segregates the domains of quantum theory and of general relativity. This concept of mass violates the equivalence principle applied to active gravitational mass, but may still produce effects consistent with the equivalence principle when applied to passive gravitational mass (in agreement with observations) by the correspondence principle applied to a weak field in the appropriate limit. An experiment that successfully measures the gravity field of quantum objects in a superposition, and in particular of photons, would not only falsify this distinction but also constitute the first direct empirical test that gravity must in fact be described fundamentally by a quantum theory.
The aim of this paper is to argue that the adoption of an unrestricted principle of bivalence is compatible with a metaphysics that (i) denies that the future is real, (ii) adopts nomological indeterminism, and (iii) exploits a branching structure to provide a semantics for future contingent claims. To this end, we elaborate what we call Flow Fragmentalism, a view inspired by Kit Fine’s (2005) non-standard tense realism, according to which reality is divided up into maximally coherent collections of tensed facts. In this way, we show how to reconcile a genuinely A-theoretic branching-time model with the idea that there is a branch corresponding to the thin red line, that is, the branch that will turn out to be the actual future history of the world.
We investigate the conflict between the ex ante and ex post criteria of social welfare in a new framework of individual and social decisions, which distinguishes between two sources of uncertainty, here interpreted as an objective and a subjective source respectively. This framework makes it possible to endow the individuals and society not only with ex ante and ex post preferences, as is usually done, but also with interim preferences of two kinds, and correspondingly, to introduce interim forms of the Pareto principle. After characterizing the ex ante and ex post criteria, we present a first solution to their conflict that extends the former as much as possible in the direction of the latter. Then, we present a second solution, which goes in the opposite direction, and is also maximally assertive. Both solutions translate the assumed Pareto conditions into weighted additive utility representations, and both attribute to the individuals common probability values on the objective source of uncertainty, and different probability values on the subjective source. We discuss these solutions in terms of two conceptual arguments, i.e., the by now classic spurious unanimity argument and a novel informational argument labelled complementary ignorance. The paper complies with the standard economic methodology of basing probability and utility representations on preference axioms, but for the sake of completeness, also considers a construal of objective uncertainty based on the assumption of an exogenously given probability measure. JEL classification: D70; D81.
This paper argues that two widely accepted principles about the indicative conditional jointly presuppose the falsity of one of the most prominent arguments against epistemological iteration principles. The first principle about the indicative conditional, which has close ties both to the Ramsey test and the “or-to-if” inference, says that knowing a material conditional suffices for knowing the corresponding indicative. The second principle says that conditional contradictions cannot be true when their antecedents are epistemically possible. Taken together, these principles entail that it is impossible to be in a certain kind of epistemic state: namely, a state of ignorance about which of two partially overlapping bodies of knowledge corresponds to one’s actual one. However, some of the more popular “margin for error” style arguments against epistemological iteration principles suggest that such states are not only possible, but commonplace. I argue that the tension between these views runs deep, arising just as much for non-factive attitudes like belief, presupposition, and certainty. I also argue that this is worse news for those who accept the principles about the indicative conditional than it is for those who reject epistemological iteration principles.
Human freedom is in tension with nomological determinism and with statistical determinism. The goal of this paper is to answer both challenges. Four contributions are made to the free-will debate. First, we propose a classification of scientific theories based on how much freedom they allow. We take into account that indeterminism comes in different degrees and that both the laws and the auxiliary conditions can place constraints. A scientific worldview pulls towards one end of this classification, while libertarianism pulls towards the other end of the spectrum. Second, inspired by Hoefer, we argue that an interval of auxiliary conditions corresponds to a region in phase space, and to a bundle of possible block universes. We thus make room for a form of non-nomological indeterminism. Third, we combine crucial elements from the works of Hoefer and List; we attempt to give a libertarian reading of this combination. On our proposal, throughout spacetime, there is a certain amount of freedom that can be interpreted as the result of agential choices. Fourth, we focus on the principle of alternative possibilities throughout and propose three ways of strengthening it.
Despite the centrality of the idea of history to Dewey's overall philosophical outlook, his brief treatment of philosophical issues in history has never attracted much attention, partly because of the dearth of the available material. Nonetheless, as argued in this essay, what we do have provides for the outlines of a comprehensive pragmatist view of history distinguished by an emphasis on methodological pluralism and a principled opposition to thinking of historical knowledge in correspondence terms. The key conceptions of Dewey's philosophy of history outlined in this paper -- i.e. historical constitution of human nature, constructivist ontology of historical events, as well as the belief that the proper form of historical judgments is underwritten by the category of continual change -- are discussed with a view to the current challenges in philosophy of history, e.g. the contest between naturalism and rationalism, objectivity and relativism, questions surrounding the function of narrative in history, and the relationship of history to the problems of identity and self-knowledge. The intended upshot of the essay is to suggest that Dewey's brief yet substantial analysis may be capable of supplying the guiding principles for articulating a viable and promising pragmatist (and naturalist) conception of historical knowledge.
In this paper I provide a frequentist philosophical-methodological solution for the stopping rule problem presented by Lindley & Phillips in 1976, which is set in the ecological context of testing koalas’ sex ratio. I deliver criteria for discerning a stopping rule, evidence, and a model that are epistemically more appropriate for testing the hypothesis of the case studied, by appealing to the physical notion of probability and by analyzing the content of possible formulations of evidence, the assumptions of models, and the meaning of the ecological hypothesis. First, I show the difference in the evidence taken into account in the different frequentist sampling procedures presented in the problem. Next, I discuss the inapplicability of the Carnapian principle of total evidence in deciding which formulation of evidence associated with a given sampling procedure and statistical model is epistemically more appropriate for testing the hypothesis in question. Then I propose a double-perspective (evidence and model) frequentist solution based on the choice of evidence which better corresponds to the investigated ecological hypothesis, as well as on the choice of a model that embraces less unrealistic ontological assumptions. Finally, I discuss two perspectives on the stopping rule dependence.
This article seeks the origin, in the theories of Ibn al-Haytham (Alhazen), Descartes, and Berkeley, of two-stage theories of spatial perception, which hold that visual perception involves both an immediate representation of the proximal stimulus in a two-dimensional “sensory core” and also a subsequent perception of the three-dimensional world. The works of Ibn al-Haytham, Descartes, and Berkeley already frame the major theoretical options that guided visual theory into the twentieth century. The field of visual perception was the first area of what we now call psychology to apply mathematics, through geometrical models as used by Euclid, Ptolemy, Ibn al-Haytham, and Descartes (among others). The article shows that Kepler’s discovery of the retinal image, which revolutionized visual anatomy and entailed fundamental changes in visual physiology, did not alter the basic structure of theories of spatial vision. These changes in visual physiology are advanced especially in Descartes’ Dioptrics and his L’Homme. Berkeley develops a radically empiricist theory of vision, according to which visual perception of depth is learned through associative processes that rely on the sense of touch. But Descartes and Berkeley share the assertion that there is a two-dimensional sensory core that is in principle available to consciousness. They also share the observation that we don’t usually perceive this core, but find depth and distance to be phenomenally immediate, a point they struggle to accommodate theoretically. If our interpretation is correct, it was not a change in the theory of the psychology of vision that engendered the idea of a sensory core, but rather the introduction of the theory into a new metaphysical context.
An analysis of the classical-quantum correspondence shows that it needs to identify a preferred class of coordinate systems, which defines a torsionless connection. One such class is that of the locally-geodesic systems, corresponding to the Levi-Civita connection. Another class, thus another connection, emerges if a preferred reference frame is available. From the classical Hamiltonian that rules geodesic motion, the correspondence yields two distinct Klein-Gordon equations and two distinct Dirac-type equations in a general metric, depending on the connection used. Each of these two equations is generally covariant, transforms the wave function as a four-vector, and differs from the Fock-Weyl gravitational Dirac equation (DFW equation). One obeys the equivalence principle in an often-accepted sense, whereas the DFW equation obeys that principle only in an extended sense.
We introduce a realist, unextravagant interpretation of quantum theory that builds on the existing physical structure of the theory and allows experiments to have definite outcomes but leaves the theory's basic dynamical content essentially intact. Much as classical systems have specific states that evolve along definite trajectories through configuration spaces, the traditional formulation of quantum theory permits assuming that closed quantum systems have specific states that evolve unitarily along definite trajectories through Hilbert spaces, and our interpretation extends this intuitive picture of states and Hilbert-space trajectories to the more realistic case of open quantum systems despite the generic development of entanglement. We provide independent justification for the partial-trace operation for density matrices, reformulate wave-function collapse in terms of an underlying interpolating dynamics, derive the Born rule from deeper principles, resolve several open questions regarding ontological stability and dynamics, address a number of familiar no-go theorems, and argue that our interpretation is ultimately compatible with Lorentz invariance. Along the way, we also investigate a number of unexplored features of quantum theory, including an interesting geometrical structure—which we call subsystem space—that we believe merits further study. We conclude with a summary, a list of criteria for future work on quantum foundations, and further research directions. We include an appendix that briefly reviews the traditional Copenhagen interpretation and the measurement problem of quantum theory, as well as the instrumentalist approach and a collection of foundational theorems not otherwise discussed in the main text.
Gauss's quadratic reciprocity theorem is among the most important results in the history of number theory. It's also among the most mysterious: since its discovery in the late 18th century, mathematicians have regarded reciprocity as a deeply surprising fact in need of explanation. Intriguingly, though, there's little agreement on how the theorem is best explained. Two quite different kinds of proof are most often praised as explanatory: an elementary argument that gives the theorem an intuitive geometric interpretation, due to Gauss and Eisenstein, and a sophisticated proof using algebraic number theory, due to Hilbert. Philosophers have yet to look carefully at such explanatory disagreements in mathematics. I do so here. According to the view I defend, there are two important explanatory virtues—depth and transparency—which different proofs (and other potential explanations) possess to different degrees. Although not mutually exclusive in principle, the packages of features associated with the two stand in some tension with one another, so that very deep explanations are rarely transparent, and vice versa. After developing the theory of depth and transparency and applying it to the case of quadratic reciprocity, I draw some morals about the nature of mathematical explanation.