Harsanyi claimed that his Aggregation and Impartial Observer Theorems provide a justification for utilitarianism. This claim has been strongly resisted, notably by Sen and Weymark, who argue that while Harsanyi has perhaps shown that overall good is a linear sum of individuals’ von Neumann-Morgenstern utilities, he has done nothing to establish any connection between the notion of von Neumann-Morgenstern utility and that of well-being, and hence that utilitarianism does not follow. The present article defends Harsanyi against the Sen-Weymark critique. I argue that, far from being a term with precise and independent quantitative content whose relationship to von Neumann-Morgenstern utility is then a substantive question, terms such as ‘well-being’ suffer (or suffered) from indeterminacy regarding precisely which quantity they refer to. If so, then (on the issue that this article focuses on) Harsanyi has gone as far towards defending ‘utilitarianism in the original sense’ as could coherently be asked.
In spite of the many efforts made to clarify von Neumann’s methodology of science, one crucial point seems to have been disregarded in recent literature: his closeness to Hilbert’s spirit. In this paper I shall claim that the scientific methodology adopted by von Neumann in his later foundational reflections originates in the attempt to revaluate Hilbert’s axiomatics in the light of Gödel’s incompleteness theorems. Indeed, axiomatics continues to be pursued by the Hungarian mathematician in the spirit of Hilbert’s school. I shall argue this point by examining four basic ideas embraced by von Neumann in his foundational considerations: a) the conservative attitude to assume in mathematics; b) the role that mathematics and the axiomatic approach have to play in all that is science; c) the notion of success as an alternative methodological criterion to follow in scientific research; d) the empirical and, at the same time, abstract nature of mathematical thought. Once these four basic ideas have been accepted, Hilbert’s spirit in von Neumann’s methodology of science will become clear.
Although expected utility theory has proven a fruitful and elegant theory in the finite realm, attempts to generalize it to infinite values have resulted in many paradoxes. In this paper, we argue that the use of John Conway's surreal numbers can provide a firm mathematical foundation for transfinite decision theory. To that end, we prove a surreal representation theorem and show that our surreal decision theory respects dominance reasoning even in the case of infinite values. We then bring our theory to bear on one of the more venerable decision problems in the literature: Pascal's Wager. Analyzing the wager showcases our theory's virtues and advantages. To that end, we analyze two objections against the wager: Mixed Strategies and Many Gods. After formulating the two objections in the framework of surreal utilities and probabilities, our theory correctly predicts that (1) the pure Pascalian strategy beats all mixed strategies, and (2) what one should do in a Pascalian decision problem depends on what one's credence function is like. Our analysis therefore suggests that although Pascal's Wager is mathematically coherent, it does not deliver what it purports to: a rationally compelling argument that people should lead a religious life regardless of how confident they are in theism and its alternatives.
We show that the respective oversights in von Neumann's general theorem against all hidden variable theories and in Bell's theorem against their local-realistic counterparts are homologous. When the latter oversight is rectified, the bounds on the CHSH correlator work out to be ±2√2 instead of ±2.
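For reference, the standard definition of the CHSH quantity and the two bounds at issue (the textbook statements, not the paper's rectified derivation) are:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad
|S| \le 2 \ \text{(Bell-CHSH, local-realistic)}, \qquad
|S| \le 2\sqrt{2} \ \text{(Tsirelson, quantum)}.
```

Here $E(a,b)$ denotes the correlation of outcomes for detector settings $a$ and $b$; the paper's claim is that the local-realistic bound, correctly derived, coincides with the Tsirelson bound.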
In his classic book “The Foundations of Statistics” Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: Any preference relation that satisfies certain arguably acceptable postulates determines a (finitely additive) probability distribution over the states and a utility assignment to the consequences, such that the preferences among acts are determined by their expected utilities. Additional problematic assumptions are however required in Savage's proofs. First, there is a Boolean algebra of events (sets of states) which determines the richness of the set of acts. The probabilities are assigned to members of this algebra. Savage's proof requires that this be a σ-algebra (i.e., closed under infinite countable unions and intersections), which makes for an extremely rich preference relation. On Savage's view we should not require subjective probabilities to be σ-additive. He therefore finds the insistence on a σ-algebra peculiar and is unhappy with it. But he sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every consequence there is a constant act, which produces that consequence in every state. This assumption is known to be highly counterintuitive. The present work contains two mathematical results. The first, and the more difficult one, shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the more plausible and much weaker assumption that there are at least two non-equivalent constant acts.
The second result also employs a novel way of deriving utilities in Savage-style systems -- without appealing to von Neumann-Morgenstern lotteries. The paper discusses the notion of “idealized agent” that underlies Savage's approach, and argues that the simplified system, which is adequate for all the actual purposes for which the system is designed, involves a more realistic notion of an idealized agent.
People with the kind of preferences that give rise to the St. Petersburg paradox are problematic---but not because there is anything wrong with infinite utilities. Rather, such people cannot assign the St. Petersburg gamble any value that any kind of outcome could possibly have. Their preferences also violate an infinitary generalization of Savage's Sure Thing Principle, which we call the *Countable Sure Thing Principle*, as well as an infinitary generalization of von Neumann and Morgenstern's Independence axiom, which we call *Countable Independence*. In violating these principles, they display foibles like those of people who deviate from standard expected utility theory in more mundane cases: they choose dominated strategies, pay to avoid information, and reject expert advice. We precisely characterize the preference relations that satisfy Countable Independence in several equivalent ways: a structural constraint on preferences, a representation theorem, and the principle we began with, that every prospect has a value that some outcome could have.
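The divergence driving the St. Petersburg paradox can be sketched numerically (a minimal illustration, not the authors' formalism): the gamble pays 2^n if the first heads lands on toss n, which has probability 2^-n, so every toss contributes exactly 1 to the expectation and the partial sums grow without bound.

```python
# Minimal sketch of the St. Petersburg gamble's diverging expectation.
# Each term contributes (0.5 ** n) * (2 ** n) == 1, so the partial
# expectation over the first n_tosses tosses equals n_tosses.
def st_petersburg_partial_expectation(n_tosses: int) -> float:
    return sum((0.5 ** n) * (2 ** n) for n in range(1, n_tosses + 1))

print(st_petersburg_partial_expectation(10))   # 10.0
print(st_petersburg_partial_expectation(100))  # 100.0 -- no finite value bounds it
```

This is why no value that any outcome could have can be assigned to the gamble: the expectation exceeds every finite candidate.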
Non-commuting quantities and hidden parameters – Wave-corpuscular dualism and hidden parameters – Local or nonlocal hidden parameters – Phase space in quantum mechanics – Weyl, Wigner, and Moyal – Von Neumann’s theorem about the absence of hidden parameters in quantum mechanics and Hermann – Bell’s objection – Quantum-mechanical and mathematical incommeasurability – Kochen – Specker’s idea about their equivalence – The notion of partial algebra – Embeddability of a qubit into a bit – Quantum computer is not Turing machine – Is continuality universal? – Diffeomorphism and velocity – Einstein’s general principle of relativity – „Mach’s principle“ – The Skolemian relativity of the discrete and the continuous – The counterexample in § 6 of their paper – About the classical tautology which is untrue being replaced by the statements about commeasurable quantum-mechanical quantities – Logical hidden parameters – The undecidability of the hypothesis about hidden parameters – Wigner’s work and Weyl’s previous one – Lie groups, representations, and psi-function – From a qualitative to a quantitative expression of relativity − psi-function, or the discrete by the random – Bartlett’s approach − psi-function as the characteristic function of random quantity – Discrete and/or continual description – Quantity and its “digitalized projection“ – The idea of „velocity−probability“ – The notion of probability and the light speed postulate – Generalized probability and its physical interpretation – A quantum description of macro-world – The period of the associated de Broglie wave and the length of now – Causality equivalently replaced by chance – The philosophy of quantum information and religion – Einstein’s thesis about “the consubstantiality of inertia and weight“ – Again about the interpretation of complex velocity – The speed of time – Newton’s law of inertia and Lagrange’s formulation of mechanics – Force and effect – The theory of tachyons and general relativity –
Riesz’s representation theorem – The notion of covariant world line – Encoding a world line by psi-function – Spacetime and qubit − psi-function by qubits – About the physical interpretation of both the complex axes of a qubit – The interpretation of the self-adjoint operators components – The world line of an arbitrary quantity – The invariance of the physical laws towards quantum object and apparatus – Hilbert space and that of Minkowski – The relationship between the coefficients of the psi-function and the qubits – World line = psi-function + self-adjoint operator – Reality and description – Does a „curved“ Hilbert space exist? – The axiom of choice, or when is possible a flattening of Hilbert space? – But why not to flatten also pseudo-Riemannian space? – The commutator of conjugate quantities – Relative mass – The strokes of self-movement and its philosophical interpretation – The self-perfection of the universe – The generalization of quantity in quantum physics – An analogy of the Feynman formalism – Feynman and many-world interpretation – The psi-function of various objects – Countable and uncountable basis – Generalized continuum and arithmetization – Field and entanglement – Function as coding – The idea of „curved“ Descartes product – The environment of a function – Another view to the notion of velocity-probability – Reality and description – Hilbert space as a model both of object and description – The notion of holistic logic – Physical quantity as the information about it – Cross-temporal correlations – The forecasting of future – Description in separable and inseparable Hilbert space – „Forces“ or „miracles“ – Velocity or time – The notion of non-finite set – Dasein or Dazeit – The trajectory of the whole – Ontological and onto-theological difference – An analogy of the Feynman and many-world interpretation − psi-function as physical quantity – Things in the world and instances in time – The generation of the physical by mathematical – The generalized
notion of observer – Subjective or objective probability – Energy as the change of probability per unit of time – The generalized principle of least action from a new viewpoint – The exception of two dimensions and Fermat’s last theorem.
The paper summarizes expected utility theory, both in its original von Neumann-Morgenstern version and its later developments, and discusses the normative claims to rationality made by this theory.
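For readers unfamiliar with the formalism, the core of the von Neumann-Morgenstern result the paper summarizes is that preferences satisfying the axioms are represented by expected utility, EU(L) = Σᵢ pᵢ·u(xᵢ). A minimal sketch (the lotteries and the concave utility function are illustrative assumptions, not taken from the paper):

```python
# Ranking two lotteries by von Neumann-Morgenstern expected utility.
def expected_utility(lottery, utility):
    """lottery: list of (probability, outcome) pairs summing to 1."""
    return sum(p * utility(x) for p, x in lottery)

u = lambda x: x ** 0.5  # a risk-averse (concave) utility function

safe = [(1.0, 100)]             # $100 for certain
risky = [(0.5, 0), (0.5, 200)]  # fair coin: $0 or $200

print(expected_utility(safe, u))   # 10.0
print(expected_utility(risky, u))  # ~7.07: the risk-averse agent prefers the sure thing
```

Both lotteries have the same expected monetary value ($100); the concavity of u is what makes the sure option rank higher, which is the sense in which the representation separates utility from money.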
REVIEW OF: Automated Development of Fundamental Mathematical Theories by Art Quaife. (1992: Kluwer Academic Publishers) 271pp. Using the theorem prover OTTER, Art Quaife has proved four hundred theorems of von Neumann-Bernays-Gödel set theory; twelve hundred theorems and definitions of elementary number theory; dozens of Euclidean geometry theorems; and Gödel's incompleteness theorems. It is an impressive achievement. To gauge its significance and to see what prospects it offers, this review looks closely at the book and the proofs it presents.
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, and the infinite particles case.
I argue that prioritarianism cannot be assessed in abstraction from an account of the measure of utility. Rather, the soundness of this view crucially depends on what counts as a greater, lesser, or equal increase in a person’s utility. In particular, prioritarianism cannot accommodate a normatively compelling measure of utility that is captured by the axioms of John von Neumann and Oskar Morgenstern’s expected utility theory. Nor can it accommodate a plausible and elegant generalization of this theory that has been offered in response to challenges to von Neumann and Morgenstern. This is, I think, a theoretically interesting and unexpected source of difficulty for prioritarianism, which I explore in this article.
In this essay a quantum-dualistic, perspectival and synchronistic interpretation of quantum mechanics is further developed in which the classical world-from-decoherence which is perceived (decoherence) and the perceived world-in-consciousness which is classical (collapse) are not necessarily identified. Thus, Quantum Reality or "unus mundus" is seen as both i) a physical non-perspectival causal Reality where the quantum-to-classical transition is operated by decoherence, and as ii) a quantum linear superposition of all classical psycho-physical perspectival Realities which are governed by synchronicity as well as causality (corresponding to classical first-person observers who actually populate the world). This interpretation is termed the Nietzsche-Jung-Pauli interpretation and is a re-imagining of the Wigner-von Neumann interpretation which is also consistent with some reading of Bohr's quantum philosophy.
John von Neumann's proof that quantum mechanics is logically incompatible with hidden variables has been the object of extensive study both by physicists and by historians. The latter have concentrated mainly on the way the proof was interpreted, accepted and rejected between 1932, when it was published, and 1966, when J.S. Bell published the first explicit identification of the mistake it involved. What is proposed in this paper is an investigation into the origins of the proof rather than the aftermath. In the first section, a brief overview of his personal life and his proof is given to set the scene. There follows a discussion on the merits of using here the historical method employed elsewhere by Andrew Warwick. It will be argued that a study of the origins of von Neumann's proof shows how there is an interaction between the following factors: the broad issues within a specific culture, the learning process of the theoretical physicist concerned, and the conceptual techniques available. In our case, the ‘conceptual technology’ employed by von Neumann is identified as the method of axiomatisation.
Whereas many others have scrutinized the Allais paradox from a theoretical angle, we study the paradox from an historical perspective and link our findings to a suggestion as to how decision theory could make use of it today. We emphasize that Allais proposed the paradox as a normative argument, concerned with ‘the rational man’ and not the ‘real man’, to use his words. Moreover, and more subtly, we argue that Allais had an unusual sense of the normative, being concerned not so much with the rationality of choices as with the rationality of the agent as a person. These two claims are buttressed by a detailed investigation – the first of its kind – of the 1952 Paris conference on risk, which set the context for the invention of the paradox, and a detailed reconstruction – also the first of its kind – of Allais’s specific normative argument from his numerous but allusive writings. The paper contrasts these interpretations of what the paradox historically represented, with how it generally came to function within decision theory from the late 1970s onwards: that is, as an empirical refutation of the expected utility hypothesis, and more specifically of the condition of von Neumann–Morgenstern independence that underlies that hypothesis. While not denying that this use of the paradox was fruitful in many ways, we propose another use that turns out also to be compatible with an experimental perspective. Following Allais’s hints on ‘the experimental definition of rationality’, this new use consists in letting the experiment itself speak of the rationality or otherwise of the subjects. In the 1970s, a short sequence of papers inspired by Allais implemented original ways of eliciting the reasons guiding the subjects’ choices, and claimed to be able to draw relevant normative consequences from this information.
We end by reviewing this forgotten experimental avenue not simply historically, but with a view to recommending it for possible use by decision theorists today.
This paper reviews some major episodes in the history of the spatial isomorphism problem of dynamical systems theory. In particular, by analysing, both systematically and in historical context, a hitherto unpublished letter written in 1941 by John von Neumann to Stanislaw Ulam, this paper clarifies von Neumann's contribution to discovering the relationship between spatial isomorphism and spectral isomorphism. The main message of the paper is that von Neumann's argument described in his letter to Ulam is the very first proof that spatial isomorphism and spectral isomorphism are not equivalent because spectral isomorphism is weaker than spatial isomorphism: von Neumann shows that spectrally isomorphic ergodic dynamical systems with mixed spectra need not be spatially isomorphic.
The anxiety generated by the possible creation of artificial general intelligence, something prophesied since the founding of the research field (i.e., Dartmouth's Summer Research Project on Artificial Intelligence) (MCCARTHY et al., 1955), is commonly investigated within transhumanist and singularitarian circles (KURZWEIL, 2005; CHALMERS, 2010; TEGMARK, 2016; 2017; CORRÊA; DE OLIVEIRA, 2021). For example, in his book “Superintelligence: Paths, Dangers, Strategies”, Nick Bostrom (2014) presents a series of arguments to justify the idea that we need to be cautious about the creation of advanced artificial intelligence or artificial general intelligence. However, many of the arguments raised, in their original versions (i.e., instrumental convergence and orthogonality (OMOHUNDRO, 2008)), rest on “vague” notions of complex concepts (i.e., “intelligence”, “rationality”, “agency”). The aim of this abstract (presentation) is to introduce the reader/listener to the field of AI Safety. In this field, questions such as the “Control Problem” (i.e., avoiding inadvertently building an artificial general intelligence that would harm its creators) and the “Alignment Problem” (i.e., creating artificial intelligence systems that remain safe even when acting autonomously in the real environment), which are often debated only within Philosophy or Science Fiction, take on a much more real and formal character (AMODEI et al., 2016; HUBINGER et al., 2019). The creation of some of the most advanced artificial intelligence models ever built (e.g., AlphaGo, GPT-3) (SILVER et al., 2016; BROWN et al., 2020) is tied to two paradigms of Machine Learning: Deep Learning and Reinforcement Learning.
Within this context, we can define “an intelligent autonomous system” as “a system that implements an optimal policy (technically 'ϵ-greedy optimal') for some reward function” (ORSEAU et al., 2018). We can also define such systems as “expected utility maximizers” (VON NEUMANN; MORGENSTERN, 1944). An AI system maximizing a misspecified reward function (i.e., objective) can cause real harm when interacting in the real environment, whether through specification errors (i.e., Negative Side Effects, Reward Hacking, Safe Interruptibility) or when the training domain differs from the deployment domain (i.e., Safe Exploration and Domain Shift). At the same time, specifying complex objectives (e.g., human preferences and values) is an extremely difficult task, and when we contemplate the possibility of advanced artificial intelligence acting in the real world, it is imperative that such systems be: (1) aligned with our goals; and (2) built to a design that avoids the problems listed above. These problems, however, remain open. Finally, I would like to stress that at the heart of this problem lie not only technical machine-learning problems but an important “Moral” question: “What is the 'right' thing to do?” “What is the correct objective to pursue?” “How can we specify it?” Questions like these promise a fertile field for interdisciplinary research involving artificial intelligence, machine learning, computer science, and moral philosophy.
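The ϵ-greedy action rule mentioned above (ORSEAU et al., 2018) can be sketched as follows; the action-value estimates are illustrative assumptions, not part of the cited definition:

```python
import random

def epsilon_greedy(q_values, epsilon=0.1, rng=random):
    """With probability epsilon pick a uniformly random action
    (exploration); otherwise pick the action with the highest
    estimated value (exploitation)."""
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])

# Illustrative action-value estimates for three candidate actions.
q = [0.2, 0.8, 0.5]
print(epsilon_greedy(q, epsilon=0.0))  # 1: pure exploitation picks the best-valued action
```

The safety worry in the abstract is visible even in this toy: the rule faithfully optimizes whatever q encodes, so if the reward behind q is misspecified, the agent competently pursues the wrong objective.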
We review our approach to quantum mechanics, adding also some new interesting results. We start by giving proof of two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. This last algebra gives proof of the von Neumann basic postulates on the quantum measurement, explaining thus in an algebraic manner the wave function collapse postulated in standard quantum theory. In this manner we reach the objective to expose a self-consistent version of quantum mechanics. In detail we realize a bare bone skeleton of quantum mechanics recovering all the basic foundations of this theory on an algebraic framework. We give proof of the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition we demonstrate the well known phenomenon of quantum Mach-Zehnder interference using the same algebraic framework, as well as we give algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the problem of time evolution of quantum systems as well as the changes in space location, in momentum and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics by using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics. In detail, we give demonstration of the Kochen-Specker theorem, and also we give an algebraic formulation and explanation of the EPR paradox only using the Clifford algebra. By using the same approach we also derive Bell inequalities. Our formulation is strongly based on the use of idempotents that are contained in Clifford algebra.
Their counterpart in quantum mechanics is represented by the projection operators that, as is well known, are interpreted as logical statements, following the basic von Neumann results. Von Neumann realized a matrix logic on the basis of quantum mechanics. Using the Clifford algebra we are able to invert such result. According to the results previously obtained by Orlov in 1994, we are able to give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in the logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory in the sense that it looks from one side to “matter per se”, thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: There are stages of our reality in which we no more can separate the logic (and thus cognition and thus conceptual entity) from the features of “matter per se”. In quantum mechanics the logic, and thus the cognition and thus the conceptual entity-cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality in which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link is established from its starting in this theory between physics and conceptual entities. Finally, in this approach there is not an absolute definition of logical truths. Transformations, and thus “redefinitions”, of truth values are permitted in such a scheme, as the well-established invariance principles clearly indicate.
The bi-polar confrontation between the Soviet Union and the USA involved many leading game theorists from both sides of the Iron Curtain: Oskar Morgenstern, John von Neumann, Michael Intriligator, John Nash, Thomas Schelling and Steven Brams from the United States and Nikolay Vorob’ev, Leon A. Petrosyan, Elena B. Yanovskaya and Olga N. Bondareva from the Soviet Union. The formalization of game theory took place prior to the Cold War but the geopolitical confrontation hastened and shaped its evolution. In our article we outline four similarities and differences between Western GT and Soviet GT: 1) the Iron Curtain resulted in a lagged evolution of GT in the Soviet Union; 2) Soviet GT focused more on operations research and issues of centralized planning; 3) the contemporary Western view on Soviet GT was biased and Soviet contributions, including works on dynamic stability, non-emptiness of the core and many refinements, suggest that Soviet GT was able to catch up to the Western level relatively fast; 4) international conferences, including Vilnius, 1971, fostered interaction between Soviet game theorists and their Western colleagues. In general, we consider the Cold War to be a positive environment for GT in the West and in the Soviet Union.
The Humean conception of the self consists in the belief-desire model of motivation and the utility-maximizing model of rationality. This conception has dominated Western thought in philosophy and the social sciences ever since Hobbes’ initial formulation in Leviathan and Hume’s elaboration in the Treatise of Human Nature. Bentham, Freud, Ramsey, Skinner, Allais, von Neumann and Morgenstern and others have added further refinements that have brought it to a high degree of formal sophistication. Late twentieth century moral philosophers such as Rawls, Brandt, Frankfurt, Nagel and Williams have taken it for granted, and have made use of it to supply metaethical foundations for a wide variety of normative moral theories. But the Humean conception of the self also leads to seemingly insoluble problems about moral motivation, rational final ends, and moral justification. Can it be made to work?
This monographic chapter explains how expected utility (EU) theory arose in von Neumann and Morgenstern, how it was called into question by Allais and others, and how it gave way to non-EU theories, at least among the specialized quarters of decision theory. I organize the narrative around the idea that the successive theoretical moves amounted to resolving Duhem-Quine underdetermination problems, so they can be assessed in terms of the philosophical recommendations made to overcome these problems. I actually follow Duhem's recommendation, which was essentially to rely on the passing of time to make many experiments and arguments available, and eventually strike a balance between competing theories on the basis of this improved knowledge. Although Duhem's solution seems disappointingly vague, relying as it does on "bon sens" to bring an end to the temporal process, I do not think there is any better one in the philosophical literature, and I apply it here for what it is worth. In this perspective, EU theorists were justified in resisting the first attempts at refuting their theory, including Allais's in the 50s, but they would have lacked "bon sens" in not acknowledging their defeat in the 80s, after the long process of pros and cons had sufficiently matured. This primary Duhemian theme is actually combined with a secondary theme - normativity. I suggest that EU theory was normative at its very beginning and has remained so all along, and I express dissatisfaction with the orthodox view that it could be treated as a straightforward descriptive theory for purposes of prediction and scientific test. This view is usually accompanied by a faulty historical reconstruction, according to which EU theorists initially formulated the VNM axioms descriptively and retreated to a normative construal once they felt threatened by empirical refutation.
From my historical study, things did not evolve in this way, and the theory was both proposed and rebutted on the basis of normative arguments already in the 1950s. The ensuing, major problem was to make choice experiments compatible with this inherently normative feature of the theory. Compatibility was obtained in some experiments, but implicitly and somewhat confusingly, for instance by excluding overtly incoherent subjects or by creating strong incentives for the subjects to reflect on the questions and provide answers they would be able to defend. I also claim that Allais had an intuition of how to combine testability and normativity, unlike most later experimenters, and that it would have been more fruitful to work from his intuition than to make choice experiments of the naively empirical style that flourished after him. In sum, it can be said that the underdetermination process accompanying EUT was resolved in a Duhemian way, but this was not without major inefficiencies. To embody explicit rationality considerations into experimental schemes right from the beginning would have limited the scope of empirical research, avoided wasting resources to get only minor findings, and speeded up the Duhemian process of groping towards a choice among competing theories.
The text is a continuation of the article of the same name published in the previous issue of Philosophical Alternatives. The philosophical interpretations of the Kochen-Specker theorem (1967) are considered. Einstein's principle regarding the “consubstantiality of inertia and gravity” (1918) allows of a parallel between descriptions of a physical micro-entity in relation to the macro-apparatus on the one hand, and of physical macro-entities in relation to the astronomical mega-entities on the other. The Bohmian interpretation (1952) of quantum mechanics proposes that all quantum systems be interpreted as dissipative ones and that the theorem be thus understood. The conclusion is that the continual representation, by force or (gravitational) field between parts interacting by means of it, of a system is equivalent to their mutual entanglement if representation is discrete. Gravity (force field) and entanglement are two different, correspondingly continual and discrete, images of a single common essence. General relativity can be interpreted as a superluminal generalization of special relativity. The postulate exists of an alleged obligatory difference between a model and reality in science and philosophy. It can also be deduced by interpreting a corollary of the theorem. On the other hand, quantum mechanics, on the basis of this theorem and of von Neumann's (1932), introduces the option that a model be entirely identified as the modeled reality and, therefore, that absolute reality be recognized: this is a non-standard hypothesis in the epistemology of science. Thus, the true reality begins to be understood mathematically, i.e. in a Pythagorean manner, through its identification with its mathematical model. A few linked problems are highlighted: the role of the axiom of choice for correctly interpreting the theorem; whether the theorem can be considered an axiom; whether the theorem can be considered equivalent to the negation of the axiom.
The latest sermon from the Church of Fundamentalist Naturalism by Pastor Hofstadter. Like his much more famous (or notorious, for its relentless philosophical errors) work Gödel, Escher, Bach, it has a superficial plausibility, but if one grasps that this is rampant scientism which mixes real scientific issues with philosophical ones (i.e., the only real issues are which language games we should play), then almost all its interest disappears. I provide a framework for analysis based on evolutionary psychology and the work of Wittgenstein (updated in my more recent writings). Those who wish a comprehensive, up-to-date framework for human behavior from the modern two-systems view may consult my book 'The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle' 2nd ed (2019). Those interested in more of my writings may see 'Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019' 3rd ed (2019), 'Suicidal Utopian Delusions in the 21st Century' 4th ed (2019) and others. (shrink)
I give a detailed review of 'The Outer Limits of Reason' by Noson Yanofsky from a unified perspective of Wittgenstein and evolutionary psychology. I indicate that the difficulty with such issues as paradox in language and mathematics, incompleteness, undecidability, computability, the brain and the universe as computers, etc., all arises from the failure to look carefully at our use of language in the appropriate context, and hence the failure to separate issues of scientific fact from issues of how language works. I discuss Wittgenstein's views on incompleteness, paraconsistency and undecidability, and Wolpert's work on the limits of computation. To sum it up: The Universe According to Brooklyn---Good Science, Not So Good Philosophy. Those who wish a comprehensive, up-to-date framework for human behavior from the modern two-systems view may consult my book 'The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle' 2nd ed (2019). Those interested in more of my writings may see 'Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019' 3rd ed (2019), 'Suicidal Utopian Delusions in the 21st Century' 4th ed (2019) and others. (shrink)
A possible world is a junky world if and only if each thing in it is a proper part. The possibility of junky worlds contradicts the principle of general fusion. Bohn (2009) argues for the possibility of junky worlds; Watson (2010) suggests that Bohn's arguments are flawed. This paper shows that the arguments of both authors leave much to be desired. First, relying on the classical results of Cantor, Zermelo, Fraenkel, and von Neumann, this paper proves the possibility of junky worlds for certain weak set theories. Second, the paradox of Burali-Forti shows that according to the Zermelo-Fraenkel set theory ZF, junky worlds are possible. Finally, it is shown that set theories are not the only sources for designing plausible models of junky worlds: topology (and possibly other "algebraic" mathematical theories) may be used to construct models of junky worlds. In sum, junkiness is a relatively widespread feature among possible worlds. (shrink)
In "Gödel's Way" three eminent scientists discuss issues such as undecidability, incompleteness, randomness, computability and paraconsistency. I approach these issues from the Wittgensteinian view that there are two basic kinds of questions with completely different solutions. There are the scientific or empirical questions, which are facts about the world to be investigated observationally, and there are philosophical questions about how language can be used intelligibly (which include certain questions in mathematics and logic), which have to be decided by looking at how we actually use words in particular contexts. When we become clear about which language game we are playing, these topics are seen to be ordinary scientific and mathematical questions like any others. Wittgenstein's insights have seldom been surpassed and are as pertinent today as they were 80 years ago when he dictated the Blue and Brown Books. Despite its failings--really a series of notes rather than a finished book--this is a unique source for the work of these three famous scholars, who have been working at the bleeding edges of physics, mathematics and philosophy for over half a century. Da Costa and Doria are cited by Wolpert (see below or my articles on Wolpert and my review of Yanofsky's 'The Outer Limits of Reason') since they wrote on universal computation, and among his many achievements Da Costa is a pioneer in paraconsistency. Those who wish a comprehensive, up-to-date framework for human behavior from the modern two-systems view may consult my book 'The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle' 2nd ed (2019). Those interested in more of my writings may see 'Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019' 3rd ed (2019), 'Suicidal Utopian Delusions in the 21st Century' 4th ed (2019) and others. (shrink)
Does consciousness collapse the quantum wave function? This idea was taken seriously by John von Neumann and Eugene Wigner but is now widely dismissed. We develop the idea by combining a mathematical theory of consciousness (integrated information theory) with an account of quantum collapse dynamics (continuous spontaneous localization). Simple versions of the theory are falsified by the quantum Zeno effect, but more complex versions remain compatible with empirical evidence. In principle, versions of the theory can be tested by experiments with quantum computers. The upshot is not that consciousness-collapse interpretations are clearly correct, but that there is a research program here worth exploring. (shrink)
The syllogistic figures and moods can be taken to be argument schemata as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo’s second-order Separation Axiom is approximated by Fraenkel’s first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components one of which is a template-text or scheme-template, a syntactic string composed of one or more “blanks” and also possibly significant words and/or symbols. In accordance with a side condition the template-text of a schema is used as a “template” to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may but need not be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions ‘schema’ and ‘scheme’ in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as the ontological and epistemic presuppositions engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata. (shrink)
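The three-part anatomy described in the abstract above (template-text, side condition, instances) can be sketched in a toy program. This is a minimal illustration under stated assumptions, not anything from the article itself: the template is a first-order induction schema with the schematic letter 'P', the side condition is deliberately simplistic, and all names are invented for the example.

```python
# A toy "schema" in the abstract's sense: a scheme-template (a text with a
# schematic letter standing in for the "blanks"), a side condition, and the
# instances obtained by substitution. Names here are illustrative only.

TEMPLATE = "If P(0) and for all n (P(n) -> P(n+1)), then for all n P(n)"

def side_condition(symbol):
    """Toy side condition: the filling must be a predicate symbol made of
    letters, and must not itself contain the schematic letter 'P'."""
    return symbol.isalpha() and "P" not in symbol

def instance(symbol):
    """Generate one instance of the schema by substituting for 'P'.
    The collection of all such instances is infinite: one per predicate."""
    if not side_condition(symbol):
        raise ValueError("side condition violated")
    return TEMPLATE.replace("P", symbol)

print(instance("Even"))
# If Even(0) and for all n (Even(n) -> Even(n+1)), then for all n Even(n)
```

The point of the sketch is only structural: the template-text and the side condition are separate components, and the (potentially infinite) set of instances is generated rather than listed.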
I follow standard mathematical practice and theory to argue that the natural numbers are the finite von Neumann ordinals. I present the reasons standardly given for identifying the natural numbers with the finite von Neumann ordinals. I give a detailed mathematical demonstration that 0 is {} and that, for every natural number n, n is the set of all natural numbers less than n. Natural numbers are sets. They are the finite von Neumann ordinals.
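The identification argued for above (0 is {}, and each n is the set of all naturals below n) can be made concrete with a small sketch. This is my illustration, not the paper's demonstration: it models finite von Neumann ordinals as Python frozensets, using the successor operation s(x) = x ∪ {x}.

```python
# Finite von Neumann ordinals modeled with frozensets:
# 0 is the empty set, and each successor is n ∪ {n},
# so n ends up being exactly the set of all ordinals below n.

def ordinal(n):
    """Return the finite von Neumann ordinal for the natural number n."""
    result = frozenset()            # 0 = {}
    for _ in range(n):
        result = result | {result}  # successor: s(x) = x ∪ {x}
    return result

three = ordinal(3)
# 3 = {0, 1, 2}: it contains exactly the ordinals less than 3
assert three == frozenset({ordinal(0), ordinal(1), ordinal(2)})
# Cardinality matches: |n| = n, and m < n iff ordinal(m) ∈ ordinal(n)
assert len(ordinal(5)) == 5
assert ordinal(2) in ordinal(4)
```

On this encoding "less than" simply is set membership, which is one of the standard reasons the paper alludes to for preferring the von Neumann construction.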
Since John von Neumann proposed a relationship between quantum mechanics and the brain in 1932, different perspectives and proposals have evolved (Tarlaci, 2010). Hu & Wu (2006) suggest that the seat of consciousness is the spin within the membranes of neurons and proteins in the brain. Sieb (2016) applied the theory of relativity to spatiotemporal consciousness and found correlations with aspects of brain functioning. Another suggestion is that consciousness emerges because of the Orchestrated Objective Reduction in microtubules (Hameroff & Penrose, 2003). However, few studies exist on the psychological implications of the relationship between quantum mechanics and the brain and its application to individual psychology. (shrink)
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2. (shrink)
In 1957, Feyerabend delivered a paper titled “On the quantum‐theory of measurement” at the Colston Research Symposium in Bristol to sketch a completion of von Neumann’s measurement scheme without collapse, using only unitary quantum dynamics and well‐motivated statistical assumptions about macroscopic quantum systems. Feyerabend’s paper has been recognized as an early contribution to quantum measurement, anticipating certain aspects of decoherence. Our paper reassesses the physical and philosophical content of Feyerabend’s contribution, detailing the technical steps as well as its overall philosophical motivations and consequences. Summarizing our results: Feyerabend interpreted collapse as a positivist assumption in quantum mechanics leading to a strict distinction between the uninterpreted formalism of unitary evolution in quantum mechanics and the classically interpreted observational language describing post‐measurement outcomes. Thus, Feyerabend took his no‐collapse completion of the von Neumann measurement scheme to show the dispensability of the positivist assumption, leading the way to a realistic interpretation of quantum theory. We note, however, that there are substantial problems with his account of measurement that bring into question its viability as a legitimate foil to the orthodox view. We further argue that his dissatisfaction with the von Neumann measurement scheme is indicative of his early views on theoretical pluralism. (shrink)
In this article we reconstruct Hilbert's original Program before the rise of the limitative theorems of the third decade of the last century. For this reconstruction we begin by presenting what Torretti calls Hilbert's first formal hesitations, that is, the defense of the axiomatic method as a foundational approach. Next, we show how these formal hesitations become established as a genuine logico-mathematical research program, and how within that program the concern for the decidability of mathematical problems, and specifically the decidability of first-order logic, gains weight. We then analyze how the concern for decidability takes its place within Hilbert's philosophical-mathematical thought, presenting itself as one of the great problems to which metamathematics must find a solution; we do this by contrasting with authors, such as John von Neumann and Roberto Torretti, who in one way or another do not interpret the problem of the decidability of first-order logic as a weighty problem within Hilbert's original program. Finally, we argue that Church's meta-theoretical result can be understood as a refutation of the intellectual optimism that permeates Hilbert's entire original program. (shrink)
All researchers interested in the foundations of quantum theory agree that it has profoundly modified our conception of reality. There, however, the consensus ends. The formalism of the theory, which is itself unproblematic, gives rise to several very different interpretations, each of which has consequences for the notion of reality. This article analyzes how the Copenhagen interpretation, von Neumann's collapse of the state vector, the Bohm-de Broglie pilot wave, and Everett's many worlds each modify, in their own way, the classical conception of reality, whose local character, in particular, requires revision. (shrink)
contents -/- ONT vol 1 i. short review: Beyond the Black Rainbow ii. as you die, hold one thought iii. short review: LA JETÉE -/- ONT vol 2 i. maya means ii. short review: SANS SOLEIL iii. vocab iv. eros has an underside v. short review: In the Mood for Love -/- ONT vol 3 i. weed weakens / compels me ii. an Ender's Game after-party iii. playroom is a realm of the dead iv. a precise german History v. short review: STATUES ALSO DIE vi. Kenneth Clark, curator for Fascism vii. a protest poem, in industry lit viii. Lawrence and the English Romance -/- ONT vol 4 i. short review: The Eyes of Tammy Faye ii. vR is efficient R iii. all thru Asia, robes for monks iv. same of God, and of the one God sent v. i thought of the Messiah / muse would be vi. conscience is strong vii. a monk's exalted end -/- ONT vol 5 -/- i. for Shakespeare's Richard the Third ii. the truth is i pass over so many words iii. the boori nazar / nadhar iv. i've awe for jihaad v. short review: Hail, Caesar! vi. a minute of Nothing, gone from YouTube vii. we were rivalrous friends, again viii. my bardo pdf ix. within i'm a weak old mandarin -/- ONT vol 6 i. short review: The Intern ii. the confusion of Chinatown iii. we'll remember water, in Theology iv. Respironics versus ResMed v. i'd bet my life for what vi. the Mad Max deity vii. they'd kill my rat, not heal him -/- ONT vol 7 i. Austen would eroticize all life ii. Merchant/Ivory, a name oddly right iii. Ellie Arroway / Agent Starling iv. abattoir / l’abattoir / laboratoire v. von Neumann's brain an anomaly vi. was terrified of death, delighted in the a-bomb vii. the Greatest Brain is variously named -/- ONT vol 8 i. the day they shot the sacrifice ii. Yay or Nay, on Animal Testing iii. an ought is an is / an is is an ought iv. Behaviorism is for zombies v. a finding from the neuro-lab, on empathy vi. i’ve never had discernible abs vii. a cowardice i'm assenting to perpetually -/- ONT vol 9 i. Day of the Locust / Triffids ii.
we're wide on a Paramount soundstage iii. HOLLYWOOD, an ecologic history iv. yet one more site of end-time art v. he's "a bookworm with bulging lobes" vi. apartment is my state of being apart vii. enlightenment means a weight's release -/- ONT lates and xtras i. re Gödel's ontological argument ii. deep in pi's numeric noise iii. from Nothing, something iv. endless in the wrong direction, tragic v. they give you all Eternity to answer vi. what of God's mercy? vii. informed consent and prayer viii. i won't live on. a deed i've done may ix. my selective memory x. Janus means: in close-up foam, two faces xi. a liveable world is a readable world xii. what Supervenes from this? xiii. at each extreme our naming is anachronism xiv. Cat is a collapsing of the wave-function xv. diminishing returns in the history of Experiment xvi. all those undershared Nobels xvii. ice preserves the Cold from heat xviii. a desert spreads xix. Pinker's wit, on jokes xx. Rome surrounds St. Paul / Paul is now the center xxi. each is a gathering Ministry xxii. white boy shot execution-style xxiii. the McDonald's Statement of Claim xxiv. first & last: Don Quixote / Ulysses xxv. The Summer of Rave xxvi. this electro is intrinsically anonymous xxvii. all thru Asia, Drake-Rihanna xxviii. WHO IS BETTER: PLATON OR KANT? (shrink)
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets, so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement. (shrink)
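The classical half of the construction summarized above is easy to compute directly. The sketch below (my illustration, not the paper's code) counts distinctions of a partition as a fraction of all ordered pairs, and checks that this agrees with the equivalent formula h(π) = 1 − Σ pᵢ², where pᵢ is the proportion of elements in block i.

```python
from itertools import product

# Logical entropy of a partition on a finite set U: the fraction of
# ordered pairs (u, v) from U x U whose elements lie in different
# blocks, i.e. the normalized counting measure on "distinctions".
# (This is the classical notion only, not its quantum extension.)

def logical_entropy(partition):
    universe = [u for block in partition for u in block]
    n = len(universe)
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    dits = sum(1 for u, v in product(universe, universe)
               if block_of[u] != block_of[v])
    return dits / (n * n)

# Partition of {1,2,3,4} into blocks {1,2} and {3,4}:
h = logical_entropy([{1, 2}, {3, 4}])
# Equivalently h = 1 - sum(p_i^2) = 1 - (0.5^2 + 0.5^2) = 0.5
assert h == 0.5
```

The two extreme cases behave as the abstract's duality suggests: the indiscrete partition (one block) makes no distinctions and has entropy 0, while the discrete partition (all singletons) maximizes the distinction count.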
What is a physical object according to the theory of quantum mechanics? The first answer to be considered is that given by Bohr in terms of the concept of complementarity. This interpretation is illustrated by way of an example, the two slit experiment, which highlights some of the associated problems of ontology. One such problem is the so-called problem of measurement or observation. Various interpretations of measurement in Quantum Theory, including those of Heisenberg, von Neumann, Everett and Bohr, are compared and contrasted. A second problem concerns whether or not QT can be considered complete and therefore satisfactory as a basis for physics. Various attempts to complete QT by means of the addition of ‘hidden variables’ to the quantum mechanical state function are considered and their aims and achievements assessed. Finally, we investigate some of the characteristic ontological problems for the orthodox interpretation of Relativistic Quantum Theory. (shrink)
“There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final states. On the other hand, as von Neumann already conjectured, irreversible information processing is expensive: to erase a single bit of information costs ~3 × 10−21 joules at room temperature. Information entropy is a thermodynamic cost, to be paid in non-computational energy dissipation. This paper addresses the problem drawing on Edward Fredkin’s Finite Nature hypothesis: the ultimate nature of the universe is discrete and finite, satisfying the axioms of classical, atomistic mereology. The chosen model is a cellular automaton with reversible dynamics, capable of retaining memory of the information present at the beginning of the universe. Such a CA can implement the Boolean logical operations and the other building bricks of computation: it can develop and host all-purpose computers. The model is a candidate for the realization of computational systems, capable of exploiting the resources of the physical world in an efficient way, for they can host logical circuits with negligible internal energy dissipation. (shrink)
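The "~3 × 10−21 joules" figure quoted above is the Landauer bound, k_B·T·ln 2, evaluated near room temperature. A quick arithmetic check (the exact temperature chosen is an assumption of this sketch):

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # "room temperature" in kelvin (an assumed value)

cost_per_bit = k_B * T * math.log(2)
print(f"{cost_per_bit:.2e} J per erased bit")
# Comes out near 2.9e-21 J, i.e. the ~3 × 10^-21 J quoted in the abstract.
assert 2.5e-21 < cost_per_bit < 3.5e-21
```

Reversible circuits of the kind Fredkin studied evade this cost precisely because they never erase: every gate output determines its input.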
The Schrödinger's Cat and Wigner's Friend thought experiments, which logically follow from the universality of quantum mechanics at all scales, have been repeatedly characterized as possible in principle, if perhaps difficult or impossible for all practical purposes. I show in this paper why these experiments, and interesting macroscopic superpositions in general, are actually impossible in principle. First, no macroscopic superposition can be created via the slow process of natural quantum packet dispersion because all macroscopic objects are inundated with decohering interactions that constantly localize them. Second, the SC/WF thought experiments depend on von Neumann-style amplification to achieve quickly what quantum dispersion achieves slowly. Finally, I show why such amplification cannot produce a macroscopic quantum superposition of an object relative to an external observer, no matter how well isolated the object from the observer, because: the object and observer are already well correlated to each other; and reducing their correlations to allow the object to achieve a macroscopic superposition relative to the observer is equally impossible, in principle, as creating a macroscopic superposition via the process of natural quantum dispersion. (shrink)
The 20th Century is the starting point for the most ambitious attempts to extrapolate human life into artificial systems. Norbert Wiener’s Cybernetics, Claude Shannon’s Information Theory, John von Neumann’s Cellular Automata and Universal Constructor, the Turing Test, Artificial Intelligence, and Maturana and Varela’s Autopoietic Organization all shared the goal of understanding in what sense humans resemble a machine. This scientific and technological movement has embraced all disciplines without exception, not only mathematics and physics but also biology, sociology, psychology, economics etc. New terms were developed, such as “information”, “organization”, “entropy”, “communication”, “encryption”, “computation” and “algorithmics” to mention a few, all of which had an enormous impact on artificial systems. Our work follows this historical track but with a reversed order of priorities. Instead of deducing a reduced group of human acts within an artificial environment, we deduce the human aspects of machines by extrapolating algorithmic methodologies into everyday life acts. The present informational theories were insufficiently powerful to achieve this objective without developing a new theoretical approach. Hence, our research is directed towards an expansion of the current informational theories. (shrink)
Theorists of artificial intelligence and their companions in the philosophy of mind have responded in various ways to criticism of AI's original theoretical goal. One such response is the retraction of that goal in favor of pursuing smaller-scale projects. Another is the promotion of connectionist systems, whose decentralized mode of operation is supposed to simulate the neural networks of the human brain more faithfully. Yet another is the so-called robot reply. The robot reply consists of two elements. It contains (a) the concession that the system behavior of a conventional digital computer with von Neumann architecture, however programmed, does not by itself exhibit human-like intelligence, and (b) the claim that for certain kinds of machines it nevertheless suffices for intelligence. Machines could ascend into the league of intelligent beings precisely when they are robots: that is, when they possess perceptual components (receptors) and action components (effectors) by means of which they can actively enter into causal interactions with their environment. This contribution argues for the thesis that the robot reply rests on a correct intuition, but one from which the friends of robots let themselves be led to a fallacious conclusion. It is right to tie mental states and the capacity for action closely together. A being to which one ascribes agency cannot be denied mental states. But being able to act and having a mind are not sufficiently independent of one another for the one to be used as justification for ascribing the other. Robots should be denied both. (shrink)
The main algebraic foundations of quantum mechanics are quickly reviewed. They have been proposed from the birth of the theory up to recent years: Heisenberg-Born-Jordan's (1925), Weyl's (1928), Dirac's (1930), von Neumann's (1936), Segal's (1947), T.F. Jordan's (1986), Morchio and Strocchi's (2009), and Buchholz and Fredenhagen's (2019). Four cases are stressed: 1) the misinterpretation of Dirac's algebraic foundation; 2) von Neumann's 'conversion' from the analytic approach of Hilbert space to the algebraic approach of the rings of operators; 3) Morchio and Strocchi's improvement of Dirac's analogy between commutators and Poisson brackets into an exact equivalence; 4) the recent foundation of quantum mechanics upon the algebra of perturbations. Some considerations on the alternating theoretical importance of the algebraic approach in the history of QM are offered. The level of formalism has increased from the mere introduction of matrices to group theory and C*-algebras but has not led to a definition of the foundations of physics; in particular, an algebraic formulation of QM organized as a problem-based theory making exclusive use of constructive mathematics is still to be discovered. (shrink)
A version of the growing block theory of time is developed based on the choice of both consciousness and mathematics as fundamental substances, while dismissing the reality/semantics distinction usually assumed by works on time theory. The well-analyzable growing block structure of mathematical ontology revealed by mathematical logic is used as a model for a possible deeper working of conscious time. Physical reality is explained as emerging from a combination of both substances, with a proposed specific version of the Consciousness Causes Collapse interpretation. This leads to new solutions to old problems, including the epistemic problem and issues with Relativity. (shrink)
We review a rough scheme of quantum mechanics using the Clifford algebra. Following the steps previously published in a paper by another author [31], we demonstrate that quantum interference arises in a Clifford algebraic formulation of quantum mechanics. In 1932 J. von Neumann showed that projection operators and, in particular, quantum density matrices can be interpreted as logical statements. In accord with a result previously obtained by V. F. Orlov, in this paper we invert von Neumann's result. Instead of constructing logic from quantum mechanics, we construct quantum mechanics from an extended classical logic. It follows that the origins of the two most fundamental quantum phenomena, indeterminism and the interference of probabilities, lie not in traditional physics itself but in the logical structure as realized here by the Clifford algebra. (shrink)
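Von Neumann's 1932 observation invoked above can be seen in miniature: a projection operator is idempotent with eigenvalues 0 and 1, so it behaves like a yes/no proposition about a state. The sketch below is my own toy illustration with plain-list 2×2 matrices (a real treatment would use numpy or a Clifford-algebra package), not anything from the paper.

```python
# A projection operator P satisfies P @ P = P, so "asking the question
# twice" is the same as asking it once, and its eigenvalues 0 and 1 play
# the role of false/true. Matrices are plain nested lists here.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Projector onto the first basis vector: the proposition
# "the system is in state |0>".
P = [[1, 0],
     [0, 0]]
I = [[1, 0],
     [0, 1]]
# The complementary projector I - P plays the role of negation.
NOT_P = [[I[i][j] - P[i][j] for j in range(2)] for i in range(2)]

assert matmul(P, P) == P                     # idempotent: a yes/no question
assert matmul(P, NOT_P) == [[0, 0], [0, 0]]  # P and not-P exclude each other
```

Density matrices then assign probabilities to such propositions, which is the direction the paper inverts: starting from the logical structure and recovering the quantum formalism.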
Barbour shows that time does not exist in the physical world, and similar conclusions are reached by others such as Deutsch, Davies and Woodward. Every possible configuration of a physical environment simply exists in the universe. The system is objectively static. Observation, however, is an inherently transtemporal phenomenon, involving actual or effective change of the configuration: collapse. Since, in a static environment, all possible configurations exist, transtemporal reality is of the logical type of a movie. The frame of a movie film is of one logical type, an element of a set of frames; the movie itself is of a second logical type. In a static no-collapse universe, the configurations are of the first logical type, transtemporal reality of the second. To run, the movie requires iteration, a third logical type. Phenomenal consciousness is subjectively experienced as of this third logical type with regard to physical configurations. Everett's formulation clearly describes the transtemporal reality of an observer, which follows the physical in the linear dynamics but departs from it on observation, giving rise to the appearance of collapse and the alternation of dynamics defined in the standard von Neumann-Dirac formulation. Since there is no physical collapse, his formulation is disputed. Given an iterator of the third logical type, the appearance of collapse is simply evidence of iteration. Chalmers demonstrates that phenomenal consciousness is of this logical type, an emergent property of the unitary system as a whole. Given an iterative function of this nature, one contextual to the physical configurations, paradoxes of time are resolved. Subjectively, meaning from the perspective of the iterative process, time passes in an objectively static universe, and the appearance of collapse is effected. (shrink)