Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2.
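The combinatorics mentioned at the end can be checked by brute force in the smallest case. The following sketch (my own illustration, not taken from the paper; the enumeration strategy and all names are assumptions) counts the direct-sum decompositions of GF(2)², the two-dimensional QM/Sets space:

```python
from itertools import combinations, product

q, n = 2, 2                            # GF(2)^2, the smallest QM/Sets space
vectors = list(product(range(q), repeat=n))
zero = (0,) * n

def span(gens):
    """Closure of the generators under componentwise mod-q addition."""
    s = {zero}
    grew = True
    while grew:
        grew = False
        for g in gens:
            for v in list(s):
                w = tuple((a + b) % q for a, b in zip(g, v))
                if w not in s:
                    s.add(w)
                    grew = True
    return frozenset(s)

# All nonzero subspaces, found by brute force over generating sets.
subspaces = {span(c) for r in range(1, n + 1)
             for c in combinations(vectors, r)} - {frozenset({zero})}

def is_dsd(blocks):
    """Direct sum: block sizes multiply to q^n and the blocks jointly span."""
    size = 1
    for B in blocks:
        size *= len(B)
    joint = span([v for B in blocks for v in B])
    return size == q ** n and joint == frozenset(vectors)

# Unordered decompositions into at least two proper subspaces.
dsds = [c for r in range(2, n + 1)
        for c in combinations(subspaces, r) if is_dsd(c)]
print(len(subspaces), len(dsds))       # 4 subspaces, 3 decompositions
```

The three decompositions found are exactly the three unordered pairs of distinct one-dimensional subspaces of GF(2)², matching the hand count for q=2, n=2.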
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
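The “two-draw” interpretation can be illustrated numerically (a sketch of my own; the distribution is an arbitrary example): the logical entropy 1 − Σ pᵢ² of a distribution is the probability that two independent draws yield distinct outcomes, which a simulation confirms:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.25])          # an arbitrary example distribution

shannon = -np.sum(p * np.log2(p))        # Shannon entropy: 1.5 bits here
logical = 1 - np.sum(p ** 2)             # logical entropy: 1 - sum p_i^2

# "Two-draw" interpretation: the probability that two independent
# draws from p land on different outcomes.
rng = np.random.default_rng(0)
draws = rng.choice(len(p), size=(200_000, 2), p=p)
estimate = np.mean(draws[:, 0] != draws[:, 1])

print(logical, estimate)                 # both close to 0.625
```

For this distribution the logical entropy is 0.625, and the Monte Carlo estimate of the two-draw distinction probability agrees to within sampling error.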
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
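The main result can be verified numerically on a small example (a sketch of my own, with an arbitrarily chosen state): for a projective measurement in the computational basis, the post-measurement density matrix is the diagonal of the pure-state density matrix, and the increase in the logical entropy 1 − tr(ρ²) equals the sum of the absolute squares of the zeroed off-diagonal entries:

```python
import numpy as np

# An arbitrarily chosen 3-dimensional pure state (a hypothetical example).
psi = np.array([0.6, 0.48 + 0.36j, 0.52j])
psi = psi / np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())          # pure-state density matrix

def logical_entropy(r):
    # h(rho) = 1 - tr(rho^2): the probability that two independent
    # measurements of the same state give different results.
    return 1 - np.trace(r @ r).real

# A projective measurement in the computational basis zeroes the
# off-diagonal entries ("coherences") of the density matrix.
rho_post = np.diag(np.diag(rho))

increase = logical_entropy(rho_post) - logical_entropy(rho)
decohered = np.sum(np.abs(rho) ** 2) - np.sum(np.abs(np.diag(rho)) ** 2)

print(np.isclose(increase, decohered))   # the two quantities agree
```

Since tr(ρ²) = Σᵢⱼ |ρᵢⱼ|², zeroing the off-diagonal entries removes exactly Σᵢ≠ⱼ |ρᵢⱼ|² from the purity, which is why the two computed quantities coincide.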
I construct a quantum-logical model of the type of situation that seems to be at the root of the problem of interpreting the projection postulate (Lüders' rule) as a criterion of minimal disturbance. It is shown that the most natural way of characterizing minimal disturbance leads to contradictory conclusions concerning the final state.
This paper reviews some of the literature on the philosophy of quantum mechanics. The publications involved tend to follow similar patterns of first identifying the mysteries, puzzles or paradoxes of the quantum world, and then discussing the existing interpretations of these matters, before the authors produce their own interpretations, or side with one of the existing views. The paper will show that all interpretations of quantum mechanics involve elements of apparent weirdness. They suggest that the quantum world, and possibly our macro world, exists or behaves in a way quite contrary to the way we normally imagine they should. The paper will also show how many of the writers on quantum mechanics misunderstand idealism in the macro world as proposed by philosophers such as George Berkeley, David Hume, Immanuel Kant and John Stuart Mill, and misunderstand the concept of the observer-dependent universe. The paper concludes by examining the similarities between the idealist view of the macro world and the Copenhagen Interpretation of the quantum world, and suggests that since the Copenhagen Interpretation provides a view of the quantum world that is consistent with the macro world, it should be the preferred view of the quantum world.
Quantum information is discussed as the universal substance of the world. It is interpreted as that generalization of classical information which includes both finite and transfinite ordinal numbers. On the other hand, any wave function and thus any state of any quantum system is just one value of quantum information. Information and its generalization as quantum information are considered as quantities of elementary choices. Their units are correspondingly a bit and a qubit. The course of time is what generates choices by itself, and thus quantum information and any item in the world in the final analysis. The course of time necessarily generates choices in this way: the future is absolutely unorderable in principle, while the past is always well-ordered and thus unchangeable. The present, as the mediation between them, needs the well-ordering theorem, equivalent to the axiom of choice. The latter guarantees the choice even among the elements of an infinite set, which is the case of quantum information. The concrete and abstract objects share information as their common base, which is quantum as to the former and classical as to the latter. The general quantities of matter in physics, mass and energy, can be considered as particular cases of quantum information. The link between choice and abstraction in set theory allows “Hume’s principle” to be interpreted in terms of quantum mechanics as the equivalence of “many” and “much” underlying quantum information. Quantum information as the universal substance of the world calls for the unity of physics and mathematics rather than that of the concrete and abstract objects, and thus for a form of quantum neo-Pythagoreanism in the final analysis.
Any logic is represented as a certain collection of well-orderings admitting or not some algebraic structure such as a generalized lattice. Then universal logic should refer to the class of all subclasses of all well-orderings. One can construct a mapping between Hilbert space and the class of all logics. Thus there exists a correspondence between universal logic and the world if the latter is considered a collection of wave functions, as which the points in Hilbert space can be interpreted. The correspondence can be further extended to the foundation of mathematics by set theory and arithmetic, and thus to all mathematics.
Resolving the main problem of quantum mechanics, namely how a quantum leap and a smooth motion can be uniformly described, also resolves the problem of how a distribution of reliable data and a sequence of deductive conclusions can be uniformly described by means of a relevant wave function “Ψdata”.
One can construct a mapping between Hilbert space and the class of all logics if the latter is defined as the set of all well-orderings of some relevant set (or class). That mapping can be further interpreted as a mapping between all states of all quantum systems, on the one hand, and all logics, on the other hand. The collection of all states of all quantum systems is equivalent to the world (the universe) as a whole. Thus that mapping establishes a fundamentally philosophical correspondence between the physical world and universal logic by the mediation of a special and fundamental structure, that of Hilbert space, and therefore, between quantum mechanics and logic by mathematics. Furthermore, Hilbert space can be interpreted as the free variable of "quantum information" and any point in it as a value of that variable, already "bound" by the axiom of choice.
Barbour shows that time does not exist in the physical world, and similar conclusions are reached by others such as Deutsch, Davies and Woodward. Every possible configuration of a physical environment simply exists in the universe. The system is objectively static. Observation, however, is an inherently transtemporal phenomenon, involving actual or effective change of the configuration: collapse. Since, in a static environment, all possible configurations exist, transtemporal reality is of the logical type of a movie. The frame of a movie film is of one logical type, an element of a set of frames; the movie itself is of a second logical type. In a static no-collapse universe, the configurations are of the first logical type, transtemporal reality of the second. To run, the movie requires iteration, a third logical type. Phenomenal consciousness is subjectively experienced as of this third logical type with regard to physical configurations. Everett's formulation clearly describes the transtemporal reality of an observer, which follows the physical in the linear dynamics, but departs from it on observation, giving rise to the appearance of collapse, and the alternation of dynamics defined in the standard von Neumann-Dirac formulation. Since there is no physical collapse, his formulation is disputed. Given an iterator of the third logical type, the appearance of collapse is simply evidence of iteration. Chalmers demonstrates that phenomenal consciousness is of this logical type, an emergent property of the unitary system as a whole. Given an iterative function of this nature, one contextual to the physical configurations, paradoxes of time are resolved. Subjectively, meaning from the perspective of the iterative process, time passes in an objectively static universe, and the appearance of collapse is effected.
Husserl (a mathematician by education) left a few famous and notable philosophical “slogans” along with his innovative doctrine of phenomenology, directed to transcend “reality” in a more general essence underlying both “body” and “mind” (after Descartes) and sometimes called “ontology” (terminologically following his notorious assistant Heidegger). Husserl’s tradition can thus be tracked as an idea for philosophy to be reinterpreted in a way that is both generalized and mathematizable in the final analysis. The paper offers a pattern borrowed from the theory of information and quantum information (therefore relating philosophy to both mathematics and physics) to formalize logically a few key concepts of Husserl’s phenomenology, such as “epoché” and the “eidetic, phenomenological, and transcendental reductions”, as well as the identification of “phenomenological, transcendental, and psychological reductions” in a way allowing for that identification to be continued to “eidetic reduction” (and thus to mathematics). The approach is tested by an independent and earlier idea of Husserl, “logical arithmetic” (parallelly implemented in mathematics by Whitehead and Russell’s Principia), as which “Hilbert arithmetic”, generalizing Peano arithmetic, is interpreted. A basic conclusion states the unification of philosophy, mathematics, and physics in their foundations and fundamentals to be the Husserl tradition, both tracked to its origin (in the being itself, after Heidegger, or after Husserl’s “zu Sache selbst”) and embodied in the development of human cognition in the third millennium.
The paper considers the symmetries of a bit of information corresponding to one, two or three qubits of quantum information and identifiable as the three basic symmetries of the Standard model, U(1), SU(2), and SU(3) accordingly. They refer to “empty qubits” (or the free variable of quantum information), i.e. those in which no point is chosen (recorded). The choice of a certain point violates those symmetries. It can be represented furthermore as the choice of a privileged reference frame (e.g. that of the Big Bang), which can be described exhaustively by means of 16 numbers (4 for position, 4 for velocity, and 8 for acceleration) independently of time, but in the space-time continuum, and one more, 17th number is necessary for the rest mass of the observer in it. The same 17 numbers, describing exhaustively a privileged reference frame thus granted to be “zero”, respectively a certain violation of all the three symmetries of the Standard model or the “record” in a qubit in general, can be represented as 17 elementary wave functions (or classes of wave functions) after the bijection of natural and transfinite natural (ordinal) numbers in Hilbert arithmetic, and further identified as those corresponding to the 17 elementary particles of the Standard model. Two generalizations of the relevant concepts of general relativity are introduced: (1) “discrete reference frame”, generalizing the class of all arbitrarily accelerated reference frames constituting a smooth manifold; (2) a still more general principle of relativity, generalizing the general principle of relativity and meaning the conservation of quantum information as to all discrete reference frames just as to the smooth manifold of all reference frames of general relativity.
Then, the bijective transition from an accelerated reference frame to the 17 elementary wave functions of the Standard model can be interpreted by the still more general principle of relativity as the equivalent redescription of a privileged reference frame: a smooth one into a discrete one. The conservation of quantum information related to the generalization of the concept of reference frame can be interpreted as restoring the concept of the ether, an absolutely immovable medium and reference frame in Newtonian mechanics, relative to which the relative motion can be interpreted as an absolute one, or logically: the relations, as properties. The new ether is to consist of qubits (or quantum information). One can track the conceptual pathway of the “ether” from Newtonian mechanics via special relativity, via general relativity, via quantum mechanics to the theory of quantum information (or “quantum mechanics and information”). The identification of entanglement and gravity can be considered also as a “byproduct” implied by the transition from the smooth “ether of special and general relativity” to the “flat” ether of quantum mechanics and information. The qubit ether is out of the “temporal screen” in general and is depicted on it as both matter and energy, both dark and visible.
The notion of equality between two observables plays many important roles in the foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum theory to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
We review a rough scheme of quantum mechanics using the Clifford algebra. Following the steps previously published in a paper by another author [31], we demonstrate that quantum interference arises in a Clifford algebraic formulation of quantum mechanics. In 1932 J. von Neumann showed that projection operators and, in particular, quantum density matrices can be interpreted as logical statements. In accord with a previously obtained result by V. F. Orlov, in this paper we invert von Neumann's result. Instead of constructing logic from quantum mechanics, we construct quantum mechanics from an extended classical logic. It follows that the origins of the two most fundamental quantum phenomena, the indeterminism and the interference of probabilities, lie not in the traditional physics by itself but in the logical structure as realized here by the Clifford algebra.
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article will explore concept combinations, and will argue that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory and another on comparing a joint probability distribution with a distribution based on a separability assumption using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
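The second method of analysis can be sketched as follows (a hypothetical illustration; the counts and the 2×2 table layout are my own assumptions, not the study's data): expected frequencies are derived from the marginals under the separability assumption and compared with the observed joint frequencies by a chi-square statistic:

```python
import numpy as np

# Hypothetical 2x2 table of response counts for a concept combination
# (the numbers are invented for illustration, not the study's data).
observed = np.array([[40.0, 10.0],
                     [12.0, 38.0]])
n = observed.sum()

# Expected counts under the separability (independence) assumption:
# the joint distribution is the product of the two marginals.
row = observed.sum(axis=1) / n
col = observed.sum(axis=0) / n
expected = n * np.outer(row, col)

# Chi-square goodness-of-fit statistic; a large value indicates that
# the joint distribution is poorly modelled by the separable one.
chi2 = np.sum((observed - expected) ** 2 / expected)
print(round(chi2, 2))
```

A large statistic relative to the chi-square distribution's critical value would count as evidence against separability, i.e. for a non-separable ("entangled") concept combination.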
The emerging field of quantum mereology considers part-whole relations in quantum systems. Entangled quantum systems pose a peculiar problem in the field, since their total states are not reducible to those of their parts. While there exist several established proposals for modelling entangled systems, like monistic holism or relational holism, there is considerable unclarity about which further positions are available. Using the lambda operator and plural logic as formal tools, we review and develop conceivable models and evaluate their consistency and distinctness. The main result is an exhaustive taxonomy of six distinct and precise models that provide information both about the mereological features and about the entangled property. The taxonomy is well-suited to serve as the basis for future systematic investigations.
Topos quantum theory (TQT) is standardly portrayed as a kind of ‘neo-realist’ reformulation of quantum mechanics. In this article, I study the extent to which TQT can really be characterized as a realist formulation of the theory, and examine the question of whether the kind of realism that is provided by TQT satisfies the philosophical motivations that are usually associated with the search for a realist reformulation of quantum theory. Specifically, I show that the notion of the quantum state is problematic for those who view TQT as a realist reformulation of quantum theory.
This paper is divided into four parts. In the first part we introduce the method of internal critique of philosophical theories by examination of their external consistency with scientific theories. In the second part two metaphysical and one epistemological postulate of Wittgenstein's Tractatus are made explicit and formally expressed. In the third part we examine whether the Tractarian metaphysical and epistemological postulates (the independence of simple states of affairs, the unique mode of their composition, the possibility of complete empirical knowledge) are externally consistent with the theory of quantum mechanics. The result of the inquiry is negative: the Tractarian postulates ought to be revised. Relying on this result, we approach the question of the empirical character of logic in the fourth part. The description of theoretical transformations of the notion of disjunction, in its ontological, epistemological, and logical sense, is a common element in all parts of the text. The conjecture on the existence of different types of disjunctive connectives in the language of quantum mechanics concludes the paper.
Which way does causation proceed? The pattern in the material world seems to be upward: particles to molecules to organisms to brains to mental processes. In contrast, the principles of quantum mechanics allow us to see a pattern of downward causation. These new ideas describe sets of multiple levels in which each level influences the levels below it through generation and selection. Top-down causation makes exciting sense of the world: we can find analogies in psychology, in the formation of our minds, in locating the source of consciousness, and even in the possible logic of belief in God.
We discuss the relationship between logic, geometry and probability theory in the light of a novel approach to quantum probabilities which generalizes the method developed by R. T. Cox to the quantum logical approach to physical theories.
In his entry on "Quantum Logic and Probability Theory" in the Stanford Encyclopedia of Philosophy, Alexander Wilce (2012) writes that "it is uncontroversial (though remarkable) that the formal apparatus of quantum mechanics reduces neatly to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the 'quantum logic' of projection operators on a Hilbert space." For a long time, Patrick Suppes has opposed this view (see, for example, the papers collected in Suppes and Zanotti (1996)). Instead of changing the logic and moving from a Boolean algebra to a non-Boolean algebra, one can also 'save the phenomena' by weakening the axioms of probability theory and working instead with upper and lower probabilities. However, it is fair to say that despite Suppes' efforts, upper and lower probabilities are not particularly popular in physics or in the foundations of physics, at least so far. Instead, quantum logic is booming again, especially since quantum information and computation became hot topics. Interestingly, however, imprecise probabilities are becoming more and more popular in formal epistemology, as recent work by authors such as James Joyce (2010) and Roger White (2010) demonstrates.
In the paper we will employ set theory to study the formal aspects of quantum mechanics without explicitly making use of space-time. It is demonstrated that von Neumann and Zermelo numeral sets, previously effectively used in the explanation of Hardy's paradox, follow a Heisenberg quantum form. Here monadic union plays the role of the time derivative. The logical counterpart of monadic union plays the part of the Hamiltonian in the commutator. The use of numerals and monadic union in the classical probability resolution of Hardy's paradox [1] is supported by the present derivation of a commutator for sets.
A non-relativistic quantum mechanical theory is proposed that describes the universe as a continuum of worlds whose mutual interference gives rise to quantum phenomena. A logical framework is introduced to properly deal with propositions about objects in a multiplicity of worlds. In this logical framework, the continuum of worlds is treated in analogy to the continuum of time points; both “time” and “world” are considered as mutually independent modes of existence. The theory combines elements of Bohmian mechanics and of Everett's many-worlds interpretation; it has a clear ontology and a set of precisely defined postulates from which the predictions of standard quantum mechanics can be derived. Probability as given by the Born rule emerges as a consequence of insufficient knowledge of observers about which world it is that they live in. The theory describes a continuum of worlds rather than a single world or a discrete set of worlds, so it is similar in spirit to many-worlds interpretations based on Everett's approach, without being actually reducible to these. In particular, there is no splitting of worlds, which is a typical feature of Everett-type theories. Altogether, the theory explains (1) the subjective occurrence of probabilities, (2) their quantitative value as given by the Born rule, and (3) the apparently random “collapse of the wavefunction” caused by the measurement, while still being an objectively deterministic theory.
The potential for scalable quantum computing depends on the viability of fault tolerance and quantum error correction, by which the entropy of environmental noise is removed during a quantum computation to maintain the physical reversibility of the computer's logical qubits. However, the theory underlying quantum error correction applies a linguistic double standard to the words “noise” and “measurement” by treating environmental interactions during a quantum computation as inherently reversible, and environmental interactions at the end of a quantum computation as irreversible measurements. Specifically, quantum error correction theory models noise as interactions that are uncorrelated or that result in correlations that decay in space and/or time, thus embedding no permanent information in the environment. I challenge this assumption both on logical grounds and by discussing a hypothetical quantum computer based on “position qubits.” The technological difficulties of producing a useful scalable position-qubit quantum computer parallel the overwhelming difficulties in performing a double-slit interference experiment on an object comprising a million to a billion fermions.
This chapter focuses on alternative logics. It discusses a hierarchy of logical reform. It presents case studies that illustrate particular aspects of the logical revisionism discussed in the chapter. The first case study is of intuitionistic logic. The second case study turns to quantum logic, a system proposed on empirical grounds as a resolution of the antinomies of quantum mechanics. The third case study is concerned with systems of relevance logic, which have been the subject of an especially detailed reform program. Finally, the fourth case study is paraconsistent logic, perhaps the most controversial of the serious proposals.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
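The quantitative step from partition logic to logical entropy can be made concrete (a minimal sketch of my own; the universe set and partition are arbitrary examples): logical entropy is the number of distinctions, i.e. ordered pairs of elements in different blocks, normalized by the total number of ordered pairs:

```python
from itertools import product

U = ['a', 'b', 'c', 'd', 'e']                  # an arbitrary universe set
partition = [{'a', 'b'}, {'c'}, {'d', 'e'}]    # an arbitrary partition on U

def block_of(x):
    # The unique block of the partition containing x.
    return next(B for B in partition if x in B)

# dit(pi): the ordered pairs of elements lying in different blocks.
dits = [(x, y) for x, y in product(U, U)
        if block_of(x) is not block_of(y)]

# Logical entropy: the normalized counting measure on distinctions.
h = len(dits) / len(U) ** 2
print(len(dits), h)                            # 16 distinctions, h = 0.64
```

Equivalently, h equals 1 − Σ (|B|/|U|)² over the blocks: the "two-draw" probability that two independent draws from U land in different blocks.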
The paper investigates the understanding of quantum indistinguishability after quantum information in comparison with the “classical” quantum mechanics based on the separable complex Hilbert space. The two oppositions, correspondingly “distinguishability / indistinguishability” and “classical / quantum”, available implicitly in the concept of quantum indistinguishability, can be interpreted as two “missing” bits of classical information, which have to be added after teleportation of quantum information in order to restore the initial state unambiguously. That new understanding of quantum indistinguishability is linked to the distinction between classical (Maxwell-Boltzmann) and quantum (either Fermi-Dirac or Bose-Einstein) statistics. The latter can be generalized to classes of wave functions (“empty” qubits) and represented exhaustively in Hilbert arithmetic, therefore connectible to the foundations of mathematics, more precisely, to the interrelations of propositional logic and set theory sharing the structure of Boolean algebra, and two anti-isometric copies of Peano arithmetic.
Triadic (systemical) logic can provide an interpretive paradigm for understanding how quantum indeterminacy is a consequence of the formal nature of light in relativity theory. This interpretive paradigm is coherent and constitutionally open to ethical and theological interests.

In this statement:

(1) Triadic logic refers to a formal pattern that describes systemic (collaborative) processes involving signs that mediate between interiority (individuation) and exteriority (generalized worldview or Umwelt). It is also called systemical logic or the logic of relatives. The term "triadic logic" emphasizes that this logic involves mediation of dualities through an irreducibly triadic formalism. The term "systemical logic" emphasizes that this logic applies to systems, in contrast to traditional binary logic, which applies to classes. The term "logic of relatives" emphasizes that this logic is background independent (in the sense discussed by Smolin).

(2) An interpretive paradigm refers to a way of thinking that generates an understanding through concepts, their inter-relationships and their connections with experience.

(3) Coherence refers to holistic integrity or continuity in the meaning of concepts that form an interpretation or understanding.

(4) Constitutionally open refers to an inherent dependence in principle of an interpretation or understanding on something outside of a specific discipline's discourse or domain of inquiry (epistemic system). Interpretations that are constitutionally open are incomplete in themselves and open to responsive, interdisciplinary discourse and collaborative learning.
The paper discusses the philosophical conclusions which the interrelation between quantum mechanics and general relativity implies by means of quantum measure. Quantum measure is three-dimensional, both universal as the Borel measure and complete as the Lebesgue one. Its unit is a quantum bit (qubit) and can be considered as a generalization of the unit of classical information, a bit. It allows quantum mechanics to be interpreted in terms of quantum information, and all physical processes to be seen as informational in a generalized sense. This implies a fundamental connection between the physical and material, on the one hand, and the mathematical and ideal, on the other. Quantum measure unifies them by a common and joint informational unit. Furthermore, the approach clears up philosophically how quantum mechanics and general relativity can be understood correspondingly as the holistic and the temporal aspect of one and the same thing, the state of a quantum system, e.g. that of the universe as a whole. The key link between them is the notion of the Bekenstein bound, as well as that of quantum temperature. General relativity can be interpreted as a special particular case of quantum gravity. All the principles laid down by Einstein (1918) reduce the latter to the former. Consequently, their generalization and therefore violation addresses directly a theory of quantum gravity. Quantum measure offers a new interpretation of the "Big Bang" theories about the beginning of the universe. It measures jointly any quantum leap and smooth motion, complementary to each other, and thus the jump-like initiation of anything and the corresponding continuous process of its appearance. Quantum measure unifies the "Big Bang" and the whole visible expansion of the universe as two complementary "halves" of one and the same thing, the set of all states of the universe as a whole. It offers a scientific viewpoint on "creation from nothing".
Although Fuzzy logic and Fuzzy Mathematics constitute a widespread subject with a vast literature, the use of Fuzzy notions such as Fuzzy sets and Fuzzy numbers has been relatively rare in the concept of time. This can be seen in Fuzzy time series. In addition, some attempts have been made at fuzzifying Turing Machines, but seemingly none at fuzzifying time itself. Throughout this article, we try to change this picture and show why it is helpful to consider the instants of time as Fuzzy numbers. In physics, although there are revolutionary ideas about the concept of time, such as the B-theories in contrast to the A-theory, and although central concepts like space and momentum were revised long ago, time is treated classically in all well-known and established physical theories. Seemingly, we stick to the classical concept of time in all fields of science, and there is vast inertia against changing it. Our goal in this article is to provide some grounds for why it is rational and reasonable to change and modify this picture. Here, the central point is the modified version of the "Unexpected Hanging" paradox as described in "Is classical Mathematics appropriate for theory of Computation". This modified version leads to a contradiction, and on that basis it is shown there why some problems in the Theory of Computation are not solved yet. To resolve the difficulties arising there, we have two choices: either "choosing" a new type of Logic, such as Paraconsistent Logic, to tolerate the contradiction, or changing and improving the concept of time and consequently modifying the Turing Computational Model. Throughout this paper, we select the second way, for the benefit of saving some aspects of Classical Logic. In chapter 2, by applying quantum Mechanics and the Schrödinger equation, we compute the fuzzy number associated with time.
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
The Schrödinger's Cat (SC) and Wigner's Friend (WF) thought experiments, which logically follow from the universality of quantum mechanics at all scales, have been repeatedly characterized as possible in principle, if perhaps difficult or impossible for all practical purposes. I show in this paper why these experiments, and interesting macroscopic superpositions in general, are actually impossible in principle. First, no macroscopic superposition can be created via the slow process of natural quantum packet dispersion, because all macroscopic objects are inundated with decohering interactions that constantly localize them. Second, the SC/WF thought experiments depend on von Neumann-style amplification to achieve quickly what quantum dispersion achieves slowly. Finally, I show why such amplification cannot produce a macroscopic quantum superposition of an object relative to an external observer, no matter how well isolated the object is from the observer, because the object and observer are already well correlated with each other, and reducing their correlations to allow the object to achieve a macroscopic superposition relative to the observer is just as impossible in principle as creating a macroscopic superposition via the process of natural quantum dispersion.
The paper addresses Leon Henkin's proposition as a "lighthouse" which can elucidate a vast territory of knowledge uniformly: logic, set theory, information theory, and quantum mechanics. Two strategies toward infinity are equally relevant to it, for it is as universal and thus complete as it is open and thus incomplete. Henkin's, Gödel's, Robert Jeroslow's, and Hartley Rogers' propositions are reformulated so that completeness and incompleteness are unified and thus reduced to a joint property of infinity and of all infinite sets. However, only Henkin's proposition, equivalent to an internal position toward infinity, is consistent. This can be retraced back to set theory and its axioms, where the axiom of choice is key. Quantum mechanics is forced to introduce infinity implicitly through Hilbert space, on which its formalism is founded. One can demonstrate that some essential properties of quantum information, entanglement, and the quantum computer originate directly from infinity once it is involved in quantum mechanics. Thus, these phenomena can be elucidated as both complete and incomplete, whereby choice is the border between them. A special kind of invariance to the axiom of choice, shared by quantum mechanics, is discussed as involving that border between the completeness and incompleteness of infinity in a consistent way. The so-called paradox of Albert Einstein, Boris Podolsky, and Nathan Rosen is interpreted entirely in the same terms, only those of set theory. The quantum computer can demonstrate especially clearly the privilege of the internal position, or "observer", or "user", toward infinity implied by Henkin's proposition as the only one consistent as to infinity. An essential area of contemporary knowledge may thus be synthesized from a single viewpoint.
The universality assumption (“U”) that quantum wave states only evolve by linear or unitary dynamics has led to a variety of paradoxes in the foundations of physics. U is not directly supported by empirical evidence but is rather an inference from data obtained from microscopic systems. The inference of U conflicts with empirical observations of macroscopic systems, giving rise to the century-old measurement problem and subjecting the inference of U to a higher standard of proof, the burden of which lies with its proponents. This burden remains unmet because the intentional choice by scientists to perform interference experiments that only probe the microscopic realm disqualifies the resulting data from supporting an inference that wave states always evolve linearly in the macroscopic realm. Further, the nature of the physical world creates an asymptotic size limit above which interference experiments, and verification of U in the realm in which it causes the measurement problem, seem impossible for all practical purposes if nevertheless possible in principle. This apparent natural limit serves as evidence against an inference of U, providing a further hurdle to the proponent’s currently unmet burden of proof. The measurement problem should never have arisen because the inference of U is entirely unfounded, logically and empirically.
Some variants of quantum theory theorize dogmatic "unimodal" states-of-being, and are based on a hodge-podge classical-quantum language. They are based on ontic syntax, but pragmatic semantics. This error was termed semantic inconsistency [1]. Measurement seems to be the central problem of these theories, and is widely discussed in their interpretation. Copenhagen theory, which is modeled on experience, deviates from this prescription. A complete quantum experiment is "bimodal". An experimenter creates the system-under-study in the initial mode of the experiment, and annihilates it in the final mode. The experimental intervention lies beyond the theory. I theorize the most rudimentary bimodal quantum experiments studied by Finkelstein [2], and deduce a "bimodal probability density" P=|In><Fin| to represent complete quantum experiments. It resembles core insights of the Copenhagen theory.
The concepts of choice, negation, and infinity are considered jointly. The link is the quantity of information, interpreted as the quantity of choices measured in units of elementary choice: a bit is an elementary choice between two equally probable alternatives. “Negation” supposes a choice between it and confirmation. Thus the quantity of information can also be interpreted as a quantity of negations. The disjunctive choice between confirmation and negation as to infinity can in turn be chosen or not: this corresponds to the set-theoretic or the intuitionist approach to the foundation of mathematics, and to Peano or Heyting arithmetic. Quantum mechanics can be reformulated in terms of information by introducing the concept and quantity of quantum information. A qubit can be equivalently interpreted as that generalization of “bit” where the choice is among an infinite set or series of alternatives. The complex Hilbert space can be represented as both a series of qubits and a value of quantum information. The complex Hilbert space is that generalization of Peano arithmetic where any natural number is substituted by a qubit. “Negation”, “choice”, and “infinity” can be inherently linked to each other, both in the foundation of mathematics and in quantum mechanics, by the mediation of “information” and “quantum information”.
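The abstract's reading of a qubit as a generalization of the bit's two-way choice to "an infinite set or series of alternatives" can be illustrated numerically: a bit picks one of two alternatives, while a qubit state a|0> + b|1> ranges over a continuum of normalized complex amplitude pairs. A minimal sketch; the Bloch-sphere parametrization is our illustrative choice, not the paper's:

```python
import cmath
import math
import random

def random_bit():
    """A bit: an elementary choice between two equally probable alternatives."""
    return random.choice([0, 1])

def random_qubit():
    """A qubit state a|0> + b|1> with |a|^2 + |b|^2 = 1.
    The 'choice' ranges over a continuum of amplitude pairs
    (parametrized here by Bloch-sphere angles theta, phi),
    not just two alternatives."""
    theta = random.uniform(0.0, math.pi)
    phi = random.uniform(0.0, 2.0 * math.pi)
    a = math.cos(theta / 2.0)
    b = cmath.exp(1j * phi) * math.sin(theta / 2.0)
    return a, b

a, b = random_qubit()
# Normalization: the infinite family of alternatives is constrained to the unit sphere.
assert abs(abs(a) ** 2 + abs(b) ** 2 - 1.0) < 1e-12
assert random_bit() in (0, 1)
```

Every call to `random_qubit` yields a different point of the continuum, yet all satisfy the same normalization constraint, which is the sense in which the single "unit" generalizes the bit.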
Non-commuting quantities and hidden parameters – Wave-corpuscular dualism and hidden parameters – Local or nonlocal hidden parameters – Phase space in quantum mechanics – Weyl, Wigner, and Moyal – Von Neumann’s theorem about the absence of hidden parameters in quantum mechanics and Hermann – Bell’s objection – Quantum-mechanical and mathematical incommensurability – Kochen and Specker’s idea about their equivalence – The notion of partial algebra – Embeddability of a qubit into a bit – Quantum computer is not a Turing machine – Is continuality universal? – Diffeomorphism and velocity – Einstein’s general principle of relativity – "Mach’s principle" – The Skolemian relativity of the discrete and the continuous – The counterexample in § 6 of their paper – About the classical tautology which is untrue being replaced by the statements about commeasurable quantum-mechanical quantities – Logical hidden parameters – The undecidability of the hypothesis about hidden parameters – Wigner’s work and Weyl’s previous one – Lie groups, representations, and psi-function – From a qualitative to a quantitative expression of relativity – psi-function, or the discrete by the random – Bartlett’s approach – psi-function as the characteristic function of a random quantity – Discrete and/or continual description – Quantity and its “digitalized projection” – The idea of “velocity–probability” – The notion of probability and the light speed postulate – Generalized probability and its physical interpretation – A quantum description of the macro-world – The period of the associated de Broglie wave and the length of now – Causality equivalently replaced by chance – The philosophy of quantum information and religion – Einstein’s thesis about “the consubstantiality of inertia and weight” – Again about the interpretation of complex velocity – The speed of time – Newton’s law of inertia and Lagrange’s formulation of mechanics – Force and effect – The theory of tachyons and general relativity – 
Riesz’s representation theorem – The notion of covariant world line – Encoding a world line by psi-function – Spacetime and qubit – psi-function by qubits – About the physical interpretation of both the complex axes of a qubit – The interpretation of the self-adjoint operators’ components – The world line of an arbitrary quantity – The invariance of the physical laws towards quantum object and apparatus – Hilbert space and that of Minkowski – The relationship between the coefficients of the psi-function and the qubits – World line = psi-function + self-adjoint operator – Reality and description – Does a “curved” Hilbert space exist? – The axiom of choice, or when is a flattening of Hilbert space possible? – But why not also flatten pseudo-Riemannian space? – The commutator of conjugate quantities – Relative mass – The strokes of self-movement and its philosophical interpretation – The self-perfection of the universe – The generalization of quantity in quantum physics – An analogy of the Feynman formalism – Feynman and the many-world interpretation – The psi-function of various objects – Countable and uncountable basis – Generalized continuum and arithmetization – Field and entanglement – Function as coding – The idea of a “curved” Cartesian product – The environment of a function – Another view of the notion of velocity-probability – Reality and description – Hilbert space as a model both of object and description – The notion of holistic logic – Physical quantity as the information about it – Cross-temporal correlations – The forecasting of the future – Description in separable and inseparable Hilbert space – “Forces” or “miracles” – Velocity or time – The notion of non-finite set – Dasein or Dazeit – The trajectory of the whole – Ontological and onto-theological difference – An analogy of the Feynman and many-world interpretation – psi-function as physical quantity – Things in the world and instances in time – The generation of the physical by the mathematical – The generalized notion of observer – Subjective or objective probability – Energy as the change of probability per unit of time – The generalized principle of least action from a new viewpoint – The exception of two dimensions and Fermat’s last theorem.
The consistent histories (CH) reformulation of quantum mechanics was developed by Robert Griffiths, given a formal logical systematization by Roland Omnès, and, under the label "decoherent histories", was independently developed by Murray Gell-Mann and James Hartle and extended to quantum cosmology. Criticisms of CH involve issues of meaning, truth, objectivity, and coherence, a mixture of philosophy and physics. We will briefly consider the original formulation of CH and some basic objections. The reply to these objections, like the objections themselves, involves a mixture of physics and philosophy. These replies support an evaluation of the CH formulation as a replacement for the measurement, or orthodox, interpretation.
This book has been written for eighteen year olds (or anyone who will listen) as an honest attempt to face their justified questionings and to offer them a metaphysical framework with which to confront the twenty-first century. It is vitally important that certain modes of thought are uprooted and new modes put in their place if mankind and planet Earth are not soon to suffer an historic global catastrophe. Apart from the continuing world-wide proliferation of conventional, chemical, biological and nuclear weaponry, the temperature of the planet has risen more rapidly in the last twenty-five years than it has since the year 900 AD, which confirms global warming by some cause or other. These and the many intensifying human conflicts and natural disasters demand that some fundamental changes to our thinking be made as soon as possible. These 120 pages begin with an original examination of the quantum-theoretical understanding of reason (logic) and reality (existence) and find both to be at odds with common sense. A theory of everything is a reduction to a single grand idea! A quantum theory of everything is a mathematical theory of a consciousness realizing this grand idea! That is what this book tries to comprehend by means of five indisputable propositions.
The laws of classical logic are taken to be logical truths, which in turn are taken to hold objectively. However, we might question our faith in these truths: why are they true? One general approach, proposed by Putnam [8] and more recently by Dickson [3] and Maddy [5], is to adopt empiricism about logic. On this view, logical truths are true because they are true of the world alone – this gives logical truths an air of objectivity. Putnam and Dickson both take logical truths to be true in virtue of the world’s structure, given by our best empirical theory, quantum mechanics. This assumes a determinate logical structure of the world given by quantum mechanics. Here, I argue that this assumption is false, and that the world’s logical structure, and hence the related ‘true’ logic, is underdetermined. This leads to what I call empirical conventionalism.
The major point in [1], chapter 2, is the following claim: “Any formalized system for the Theory of Computation based on Classical Logic and the Turing Model of Computation leads us to a contradiction.” So, in case we wish to save Classical Logic, we should change our Computational Model. As we see in chapter two, the mentioned contradiction is about and around the concept of time, as in the contradiction of the modified version of the paradox. It is natural to try fabricating the paradox not with time but with some other linear ordering, or with the concept of space. Interestingly, the attempts to obtain a similar contradiction from other concepts, like space and linear orderings, have failed. It is remarkable that the paradox has traditionally been considered either Epistemological or Logical, but in light of the new considerations the new version of the paradox should be considered either a Logical or a Physical paradox. Hence, in order to change our Computational Model, it is natural to change the concept of time, but how? We start from some models that are different from the classical one but are intuitively plausible. The idea of the model is somewhat introduced by Brouwer and Husserl [3]. This model does not refute the paradox, since the paradox and the associated contradiction would be repeated in this new model. The model is introduced in [2]. Here we give some more explanations.
A principle, according to which any scientific theory can be mathematized, is investigated. That theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As used here, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and then the conception of “bracketing reality” is modelled in order to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. A comparison to Mach’s doctrine is used to reveal the fundamental and philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed.
An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
Ignited by Einstein and Bohr a century ago, the philosophical struggle about Reality is yet unfinished, with no signs of a swift resolution. Despite vast technological progress fueled by the iconic EPR paper (EPR), the intricate link between ontic and epistemic aspects of Quantum Theory (QT) has greatly hindered our grip on Reality and further progress in physical theory. Fallacies concealed by tortuous logical negations made EPR comprehension much harder than it could have been had Einstein written it himself in German. It is plagued with preconceptions about what a physical property is, the 'Uncertainty Principle', and the Principle of Locality. Numerous interpretations of QT vis à vis Reality exist and are keenly disputed. This is the first of a series of articles arguing for a physical interpretation called ‘The Ontic Probability Interpretation’ (TOPI). A gradual explanation of TOPI is given intertwined with a meticulous logico-philosophical scrutiny of EPR. Part I focuses on the meaning of Einstein’s ‘Incompleteness’ claim. A conceptual confusion, a preconception about Reality, and a flawed dichotomy are shown to be severe obstacles for the EPR argument to succeed. Part II analyzes Einstein’s ‘Incompleteness/Nonlocality Dilemma’. Future articles will further explain TOPI, demonstrating its soundness and potential for nurturing theoretical progress.
The assumption that known physical laws are sufficient for explaining mental phenomena is flawed from the outset. Qualities such as phenomenal redness do not exist within the known physical laws, so by definition they are incomplete. Now, assuming a new law were added that could explain how some physical property or vibration causes or is associated with phenomenal redness, it would not be enough, because it still wouldn't explain how different qualities are bound together into a subjective unity. Assuming more additional laws could now explain the popping into existence of subjective selves, this also would still not be enough, because a subjective self that can be neither directly observed nor indirectly observed via its effects vanishes into non-existence. This implies that subjective selves must have causal efficacy. But this would require still more physical laws, or perhaps an accommodated interpretation of quantum physics, because there is no current understanding of how a mind can change the probability distribution of matter. But even if an expanded quantum physics were understood, this would still not be enough, because no general law can determine the indexical fact that I am me and you are you.

The hard problem of consciousness is therefore real and non-trivial, and when taken seriously it appears to actually be several separate problems. 1. How are qualia generated from purely physical activity? This appears to be a strictly deterministic process. 2. The so-called binding problem: how is it that I hear a sound and see a visual scene and feel a bodily sensation, all separate and unrelated modalities, and yet I experience them all at the same time? 3. Can it be that I am causally efficacious? It certainly seems that the actual feelings of pain and pleasure serve some real purpose. Since conscious entities can never be directly observed and are only known through their effects, if they have no effects they disappear altogether. 4. The so-called problem of indexicality: why am I me and not you? This seems to be a real fact above and beyond any objective fact. Eliminative materialism ignores the empirical evidence that phenomenal qualities exist and in doing so attains logical consistency. Other forms of emergence that try to explain qualia and give them a meaning while being restricted to physical closure are illogical. I explore these hard problems and arrive at a justification for a panpsychic interactive dualism.
We review our approach to quantum mechanics, adding also some new interesting results. We start by giving proof of two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. This last algebra gives proof of the von Neumann basic postulates on quantum measurement, thus explaining in an algebraic manner the wave function collapse postulated in standard quantum theory. In this manner we reach the objective of exposing a self-consistent version of quantum mechanics. In detail, we realize a bare bone skeleton of quantum mechanics, recovering all the basic foundations of this theory in an algebraic framework. We give proof of the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition, we demonstrate the well known phenomenon of quantum Mach-Zehnder interference using the same algebraic framework, and we give algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the problem of the time evolution of quantum systems, as well as changes in space location and in momentum and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics. In detail, we give a demonstration of the Kochen-Specker theorem, and we also give an algebraic formulation and explanation of the EPR paradox using only the Clifford algebra. Using the same approach we also derive the Bell inequalities. Our formulation is strongly based on the use of the idempotents that are contained in the Clifford algebra.
Their counterpart in quantum mechanics is represented by the projection operators that, as is well known, are interpreted as logical statements, following the basic von Neumann results. Von Neumann realized a matrix logic on the basis of quantum mechanics. Using the Clifford algebra we are able to invert this result. Following the results previously obtained by Orlov in 1994, we are able to give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in the logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory, in the sense that it looks from one side to “matter per se”, thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: there are stages of our reality at which we can no longer separate the logic (and thus cognition and the conceptual entity) from the features of “matter per se”. In quantum mechanics the logic, and thus cognition and the conceptual entity-cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality at which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link between physics and conceptual entities is established from the very start of this theory. Finally, in this approach there is no absolute definition of logical truths. Transformations, and thus “redefinitions”, of truth values are permitted in such a scheme, as the well-established invariance principles clearly indicate.
On the grounds that the Einstein-Podolsky-Rosen argument is an example of reasoning by reductio ad absurdum, and that a counterexample is unacceptable, unless all its elements meet all the necessary conditions, its conclusions are invalidated. The arguments in this paper are strictly logical. Einstein, Podolsky and Rosen made a mathematical assumption that is incompatible with quantum mechanics.
We comment on some recent results obtained by using a Clifford bare bone skeleton of quantum mechanics in order to formulate the conclusion that quantum mechanics has its origin in the logic, and relates conceptual entities. Such results touch directly the basic problem about the structure of our cognitive and conceptual dynamics, and thus of our mind. The problem of exploring consciousness consequently turns out to be strongly linked to it. This is the reason why studies on quantum mechanics applied to this matter are so important for neurologists and psychologists. Under this profile we present some experimental results showing violation of the Bell inequality during the MBTI test in an investigation of C. G. Jung's theory of personality.