The incompleteness theorems concern the relation of (Peano) arithmetic and (ZFC) set theory, or philosophically, the relation of arithmetical finiteness and actual infinity. The same relation is managed within set theory by the axiom of choice (respectively, by the equivalent well-ordering "theorem"). One may discuss incompleteness from the viewpoint of set theory and the axiom of choice rather than from the usual viewpoint meant in the proofs of the theorems. The logical corollaries of that "nonstandard" viewpoint for the relation of set theory and arithmetic are demonstrated.
In the early 1900s, Russell began to recognize that he, and many other mathematicians, had been using assertions like the Axiom of Choice implicitly, and without explicitly proving them. In working with the Axioms of Choice, Infinity, and Reducibility, and his and Whitehead’s Multiplicative Axiom, Russell came to take the position that some axioms are necessary for recovering certain results of mathematics, but may not be proven to be true absolutely. The essay traces historical roots of, and motivations for, Russell’s method of analysis, which are intended to shed light on his view about the status of mathematical axioms. I describe the position Russell develops in consequence as “immanent logicism,” in contrast to what Irving (1989) describes as “epistemic logicism.” Immanent logicism allows Russell to avoid the logocentric predicament, and to propose a method for discovering structural relationships of dependence within mathematical theories.
The concept of formal transcendentalism is utilized. The fundamental and definitive property of the totality is that “the totality is all”; thus, its externality (unlike that of any other entity) is contained within it. This generates a fundamental (or philosophical) “doubling” of anything being referred to the totality, i.e. considered philosophically. Thus, that doubling, as well as the transcendentalism underlying it, can be interpreted formally as an elementary choice, such as a bit of information, and a quantity corresponding to the number of elementary choices to be defined. This is the quantity of information defined both transcendentally and formally and thus, philosophically and mathematically. If one defines information specifically, as an elementary choice between finiteness (or mathematically, any natural number of Peano arithmetic) and infinity (i.e. an actually infinite set in the meaning of set theory), the quantity of quantum information is defined. One can demonstrate that the so-defined quantum information and quantum information as standardly defined by quantum mechanics are equivalent to each other. The equivalence of the axiom of choice and the well-ordering “theorem” is involved. It can be justified transcendentally as well, in virtue of the transcendental equivalence implied by the totality. Thus, all can be considered as temporal insofar as anything necessarily possesses such a temporal counterpart. Formally defined, the frontier of time is the current choice now, a bit of information, furthermore interpretable as a qubit of quantum information.
An isomorphism is built between the separable complex Hilbert space (quantum mechanics) and Minkowski space (special relativity) by the mediation of quantum information (i.e. qubit by qubit). That isomorphism can be interpreted physically as the invariance between a reference frame within a system and its unambiguous counterpart outside the system. The same idea can be applied to Poincaré’s conjecture (proved by G. Perelman), hinting at another way of proving it, more concise and physically meaningful. Mathematically, the isomorphism means invariance to choice: the axiom of choice, well-ordering, and the well-ordering “theorem” (or “principle”); it can be defined generally as “information invariance”.
The link between high-order metaphysics and abstractions, on the one hand, and choice in the foundation of set theory, on the other hand, can distinguish unambiguously the “good” principles of abstraction from the “bad” ones and thus resolve the “bad company problem” as to set theory. It correspondingly implies a more precise definition of the relation between the axiom of choice and the “all company” of axioms in set theory concerning abstraction directly or indirectly: the principle of abstraction, the axiom of comprehension, the axiom scheme of specification, the axiom scheme of separation, the subset axiom scheme, the axiom scheme of replacement, the axiom of unrestricted comprehension, the axiom of extensionality, etc.
The Principle of Ariadne, formulated in 1988 by Walter Carnielli and Carlos Di Prisco and later published in 1993, is an infinitary principle that is independent of the Axiom of Choice in ZF, although it can be consistently added to the remaining ZF axioms. The present paper surveys, and motivates, the foundational importance of the Principle of Ariadne and proposes the Ariadne Game, showing that the Principle of Ariadne corresponds precisely to a winning strategy for the Ariadne Game. Some relations to other alternative set-theoretical principles are also briefly discussed.
We present an elementary system of axioms for the geometry of Minkowski spacetime. It strikes a balance between a simple and streamlined set of axioms and the attempt to give a direct formalization in first-order logic of the standard account of Minkowski spacetime in [Maudlin 2012] and [Malament, unpublished]. It is intended for future use in the formalization of physical theories in Minkowski spacetime. The choice of primitives is in the spirit of [Tarski 1959]: a predicate of betweenness and a four-place predicate to compare the squares of the relativistic intervals. Minkowski spacetime is described as a four-dimensional ‘vector space’ that can be decomposed everywhere into a spacelike hyperplane - which obeys the Euclidean axioms in [Tarski and Givant, 1999] - and an orthogonal timelike line. The lengths of other ‘vectors’ are calculated according to Pythagoras’ theorem. We conclude with a Representation Theorem relating models of our system that satisfy second-order continuity to the mathematical structure called ‘Minkowski spacetime’ in physics textbooks.
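The “Pythagorean” computation of interval lengths described in the abstract above can be made explicit. In a decomposition into an orthogonal timelike line t and a spacelike hyperplane with Euclidean coordinates x, y, z (the signature convention here is an assumption, not fixed by the abstract), the squared relativistic interval combines the Euclidean Pythagorean length of the spacelike component with the timelike component taken with opposite sign:

```latex
s^2 \;=\; (\Delta t)^2 \;-\; \underbrace{\left[(\Delta x)^2 + (\Delta y)^2 + (\Delta z)^2\right]}_{\text{squared Euclidean length in the spacelike hyperplane}}
```

On this reading, a four-place predicate comparing squared intervals, say one read as "the interval from a to b is no longer than that from c to d", suffices for the comparison of intervals without quantifying over real numbers.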
Gentzen’s approach by transfinite induction and that of intuitionist Heyting arithmetic to completeness and the self-foundation of mathematics are compared and opposed to the Gödel incompleteness results as to Peano arithmetic. Quantum mechanics involves infinity by Hilbert space, but it is finitist as any experimental science. The absence of hidden variables in it, interpretable as its completeness, should resurrect Hilbert’s finitism at the cost of a relevant modification of the latter, already hinted at by intuitionism and Gentzen’s approaches to completeness. This paper investigates both the conditions and the philosophical background necessary for that modification. The main conclusion is that the concept of infinity underlying contemporary mathematics cannot be reduced to a single Peano arithmetic, but only to at least two Peano arithmetics independent of each other. Intuitionism, quantum mechanics, and Gentzen’s approaches to completeness, and even Hilbert’s finitism, can be unified from that viewpoint. Mathematics may found itself by way of finitism complemented by choice. The concept of information as the quantity of choices underlies that viewpoint. Quantum mechanics, interpretable in terms of information and quantum information, is inseparable from mathematics and its foundation.
We examine some of Connes’ criticisms of Robinson’s infinitesimals starting in 1995. Connes sought to exploit the Solovay model S as ammunition against non-standard analysis, but the model tends to boomerang, undercutting Connes’ own earlier work in functional analysis. Connes described the hyperreals as both a “virtual theory” and a “chimera”, yet acknowledged that his argument relies on the transfer principle. We analyze Connes’ “dart-throwing” thought experiment, but reach an opposite conclusion. In S, all definable sets of reals are Lebesgue measurable, suggesting that Connes views a theory as being “virtual” if it is not definable in a suitable model of ZFC. If so, Connes’ claim that a theory of the hyperreals is “virtual” is refuted by the existence of a definable model of the hyperreal field due to Kanovei and Shelah. Free ultrafilters aren’t definable, yet Connes exploited such ultrafilters both in his own earlier work on the classification of factors in the 1970s and 80s, and in Noncommutative Geometry, raising the question whether the latter may not be vulnerable to Connes’ criticism of virtuality. We analyze the philosophical underpinnings of Connes’ argument based on Gödel’s incompleteness theorem, and detect an apparent circularity in Connes’ logic. We document the reliance on non-constructive foundational material, and specifically on the Dixmier trace −∫ (featured on the front cover of Connes’ magnum opus) and the Hahn–Banach theorem, in Connes’ own framework. We also note an inaccuracy in Machover’s critique of infinitesimal-based pedagogy.
The quantum information introduced by quantum mechanics is equivalent to a certain generalization of classical information: from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The “qubit” can be interpreted as that generalization of “bit” which is a choice among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of results in time after measurement. The quantity of quantum information is the transfinite ordinal number corresponding to the infinite series in question. The transfinite ordinal numbers can be defined as ambiguously corresponding “transfinite natural numbers” generalizing the natural numbers of Peano arithmetic to “Hilbert arithmetic”, allowing for the unification of the foundations of mathematics and quantum mechanics.
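The generalization from bit to qubit invoked above can be sketched numerically. A minimal illustration (the amplitude values are hypothetical, chosen only for the example): a qubit is a normed pair of complex amplitudes, and a classical bit is the degenerate case in which one amplitude is 1 and the other 0.

```python
import math

# A qubit: a normed superposition a|0> + b|1> of two orthogonal states.
# The amplitudes below are illustrative, not taken from the text.
a = complex(1, 0) / math.sqrt(2)
b = complex(0, 1) / math.sqrt(2)

norm = abs(a) ** 2 + abs(b) ** 2         # must equal 1 for a valid qubit
print(round(norm, 10))                   # -> 1.0

# A classical bit is the degenerate case: one amplitude 1, the other 0.
bit_zero = (complex(1), complex(0))
print(abs(bit_zero[0]) ** 2 + abs(bit_zero[1]) ** 2)  # -> 1.0
```

The continuum of alternatives mentioned in the abstract corresponds to the continuum of normed pairs (a, b), of which the two classical bit values are isolated points.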
The cognition of quantum processes raises a series of questions about ordering and information connecting the states of one and the same system before and after measurement: quantum measurement, quantum invariance and the non-locality of quantum information are considered in the paper from an epistemological viewpoint. The adequate generalization of ‘measurement’ is discussed so as to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. A set-theory corollary is the curious invariance to the axiom of choice: any coherent state excludes any well-ordering and thus excludes also the axiom of choice. However, the above equivalence requires it to be equated to a well-ordered set after measurement and thus requires the axiom of choice in order to be obtained. Quantum invariance underlies quantum information and reveals it as the relation of an unordered quantum “much” (i.e. a coherent state) and a well-ordered “many” of the measured results (i.e. a statistical ensemble). It opens up a new horizon, in which all physical processes and phenomena can be interpreted as quantum computations realizing relevant operations and algorithms on quantum information. All phenomena of entanglement can be described in terms of the so defined quantum information. Quantum invariance elucidates the link between general relativity and quantum mechanics and thus the problem of quantum gravity. The non-locality of quantum information unifies the exact position of any space-time point of a smooth trajectory and the common possibility of all space-time points due to a quantum leap. This is deduced from quantum invariance.
Epistemology involves the relation of ordering, and thus a generalized kind of information, the quantum one, to explain the special features of cognition in quantum mechanics.
Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. The adequate generalization of ‘measurement’ is discussed so as to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. A set-theory corollary is the curious invariance to the axiom of choice: any coherent state excludes any well-ordering and thus excludes also the axiom of choice. It should be equated to a well-ordered set after measurement and thus requires the axiom of choice. Quantum invariance underlies quantum information and reveals it as the relation of an unordered quantum “much” (i.e. a coherent state) and a well-ordered “many” of the measured results (i.e. a statistical ensemble). It opens up a new horizon, in which all physical processes and phenomena can be interpreted as quantum computations realizing relevant operations and algorithms on quantum information. All phenomena of entanglement can be described in terms of the so defined quantum information. Quantum invariance elucidates the link between general relativity and quantum mechanics and thus the problem of quantum gravity.
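The transformation of a coherent state into a well-ordered series of measured results can be mimicked by a toy simulation (the two-outcome state and its amplitudes are illustrative assumptions, not from the text): repeated measurement yields a sequence well-ordered by trial time, whose statistics approximate the Born probabilities of the unordered superposition.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical coherent state: real amplitudes for outcomes 0 and 1.
amp0, amp1 = 0.6, 0.8          # amp0**2 + amp1**2 == 1 (Born rule weights)
p1 = amp1 ** 2                 # probability of measuring outcome 1

# Repeated measurement turns the (unordered) superposition into a
# well-ordered series of results, indexed by the time of each trial.
series = [1 if random.random() < p1 else 0 for _ in range(10_000)]

frequency = sum(series) / len(series)
# The statistical ensemble of results approximates the Born probability.
print(abs(frequency - p1) < 0.02)
```

The list index plays the role of the "well-ordering in time" of the measured results: the ensemble carries an order that the coherent state itself lacks.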
Peano arithmetic cannot serve as the ground of mathematics, for it is inconsistent to infinity, and infinity is necessary for its foundation. Though Peano arithmetic cannot be complemented by any axiom of infinity, there exists at least one (logical) axiomatics consistent to infinity. What is meant here is nothing else than a new reading and comparative interpretation of Gödel’s papers (1930; 1931). Peano arithmetic anyway admits generalizations consistent to infinity and thus to some addable axiom(s) of infinity. The most utilized example of those generalizations is the complex Hilbert space. Any generalization of Peano arithmetic consistent to infinity, e.g. the complex Hilbert space, can serve as a foundation for mathematics to found itself and by itself.
A principle, according to which any scientific theory can be mathematized, is investigated. That theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and then the conception of “bracketing reality” is modelled in order to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. A comparison to Mach’s doctrine is used to reveal the fundamental and philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics in Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed.
An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
Arthur C. Clarke and Michael Kube-McDowell (“The Trigger”, 2000) suggested the sci-fi idea of the direct transformation of one chemical substance into another by the action of a new physical “Trigger” field. Karl Brohier, a Nobel Prize winner who is a dramatic persona in the novel, elaborates a new theory, re-reading and re-writing Pauling’s “The Nature of the Chemical Bond”; according to Brohier: “Information organizes and differentiates energy. It regularizes and stabilizes matter. Information propagates through matter-energy and mediates the interactions of matter-energy.” Dr Horton, his collaborator in the novel, replies: “If the universe consists of energy and information, then the Trigger somehow alters the information envelope of certain substances –“. “Alters it, scrambles it, overwhelms it, destabilizes it,” Brohier adds. There is a scientific debate whether, or how far, chemistry is fundamentally reducible to quantum mechanics. Nevertheless, the fact that many essential chemical properties and reactions are at least partly representable in terms of quantum mechanics is doubtless. Since quantum mechanics itself has been reformulated as a theory of a special kind of information, quantum information, chemistry might in turn be interpreted in the same terms. The wave function, the fundamental concept of quantum mechanics, can be equivalently defined as a series of qubits, eventually infinite. A qubit, defined as the normed superposition of two orthogonal subspaces of the complex Hilbert space, can be interpreted as a generalization of the standard bit of information to infinite sets or series. All “forces” in the Standard Model, which are furthermore essential for chemical transformations, are groups [U(1), SU(2), SU(3)] of transformations of the complex Hilbert space and thus of series of qubits. One can suggest that any chemical substances and changes are fundamentally representable as quantum information and its transformations.
If entanglement is interpreted as a physical field, though any of the groups above seems to be unattachable to it, it might be identified as the “Trigger field”. It might cause a direct transformation of any chemical substance from a remote distance. Is this possible in principle?
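The claim that the Standard Model groups act as transformations of series of qubits can be illustrated at the single-qubit level. A minimal sketch (the state and rotation angle are illustrative assumptions): an SU(2) element is a 2x2 unitary matrix acting on the two amplitudes, and unitarity preserves the norm, hence the probabilistic interpretation.

```python
import math

# A qubit as a pair of complex amplitudes; here the state |0>.
psi = (complex(1, 0), complex(0, 0))

# An SU(2) element: rotation by angle theta about the y-axis,
# U = [[cos(t/2), -sin(t/2)], [sin(t/2), cos(t/2)]].
theta = math.pi / 2
c, s = math.cos(theta / 2), math.sin(theta / 2)
U = ((complex(c), complex(-s)),
     (complex(s), complex(c)))

def apply(U, psi):
    """Apply a 2x2 matrix to a 2-vector of amplitudes."""
    return (U[0][0] * psi[0] + U[0][1] * psi[1],
            U[1][0] * psi[0] + U[1][1] * psi[1])

phi = apply(U, psi)
norm = abs(phi[0]) ** 2 + abs(phi[1]) ** 2
print(round(norm, 10))   # unitarity preserves the norm -> 1.0
```

A transformation of a whole series of qubits would act qubit by qubit (or on entangled combinations), but the norm-preservation checked here is the invariant feature shared at every scale.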
Hilbert’s choice operators τ and ε, when added to intuitionistic logic, strengthen it. In the presence of certain extensionality axioms they produce classical logic, while in the presence of weaker decidability conditions for terms they produce various superintuitionistic intermediate logics. In this thesis, I argue that there are important philosophical lessons to be learned from these results. To make the case, I begin with a historical discussion situating the development of Hilbert’s operators in relation to his evolving program in the foundations of mathematics and in relation to philosophical motivations leading to the development of intuitionistic logic. This sets the stage for a brief description of the relevant part of Dummett’s program to recast debates in metaphysics, and in particular disputes about realism and anti-realism, as closely intertwined with issues in philosophical logic, with the acceptance of classical logic for a domain reflecting a commitment to realism for that domain. Then I review extant results about what is provable and what is not when one adds epsilon to intuitionistic logic, largely due to Bell and DeVidi, and I give several new proofs of intermediate logics from intuitionistic logic+ε without identity. With all this in hand, I turn to a discussion of the philosophical significance of choice operators.
Among the conclusions I defend are that these results provide a finer-grained basis for Dummett’s contention that commitment to classically valid but intuitionistically invalid principles reflects metaphysical commitments, by showing those principles to be derivable from certain existence assumptions; that Dummett’s framework is improved by these results, as they show that questions of realism and anti-realism are not an “all or nothing” matter, but that there are plausibly metaphysical stances between the poles of anti-realism and realism, because different sorts of ontological assumptions yield intermediate rather than classical logic; and that these intermediate positions between classical and intuitionistic logic link up in interesting ways with our intuitions about issues of objectivity and reality, and do so usefully by linking to questions around intriguing everyday concepts such as “is smart,” which I suggest involve a number of distinct dimensions which might themselves be objective, but because of their multivalent structure are themselves intermediate between being objective and not. Finally, I discuss the implications of these results for ongoing debates about the status of arbitrary and ideal objects in the foundations of logic, showing among other things that much of the discussion is flawed because it does not recognize the degree to which the claims being made depend on the presumption that one is working with a very strong logic.
The principle of maximal entropy (further abbreviated as “MaxEnt”) can be founded on the formal mechanism in which the future transforms into the past by the mediation of the present. This allows MaxEnt to be investigated by the theory of quantum information. MaxEnt can be considered as an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is that the relevant data known so far are postulated as a sufficient fundament for the conclusion. That axiom is the kind of choice grounding both principles. Popper’s falsifiability (1935) can be discussed as a complement to them: that axiom (or axiom scheme) is always a sufficient but never a necessary condition of the conclusion, therefore postulating the choice at the base of MaxEnt. Furthermore, the abstraction axiom (or axiom scheme) relevant to set theory (e.g. the axiom scheme of specification in ZFC) involves choice analogously.
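The reading of MaxEnt above, postulating the data known so far as a sufficient fundament, has a standard formal core: among all distributions consistent with what is known, choose the one of maximal Shannon entropy. A minimal sketch (the outcome count and the random sampling are illustrative assumptions): when only normalization is known, the maximum-entropy distribution over n outcomes is the uniform one, with entropy log2(n) bits.

```python
import math
import random

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 6
uniform = [1 / n] * n   # the MaxEnt solution when only normalization is known

# Sample random candidate distributions; none should beat the uniform one.
random.seed(1)
best_rival = 0.0
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    best_rival = max(best_rival, entropy([x / total for x in w]))

print(round(entropy(uniform), 3))        # log2(6) ≈ 2.585 bits
print(best_rival < entropy(uniform))     # no sampled rival exceeds it
```

Adding constraints (e.g. a known mean) would cut down the feasible set, and the MaxEnt choice would then be the least committal distribution consistent with exactly the postulated data.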
The text is a continuation of the article of the same name published in the previous issue of Philosophical Alternatives. The philosophical interpretations of the Kochen-Specker theorem (1967) are considered. Einstein’s principle regarding the “consubstantiality of inertia and gravity” (1918) allows of a parallel between descriptions of a physical micro-entity in relation to the macro-apparatus, on the one hand, and of physical macro-entities in relation to the astronomical mega-entities, on the other. The Bohmian interpretation (1952) of quantum mechanics proposes that all quantum systems be interpreted as dissipative ones and that the theorem be thus understood. The conclusion is that the continual representation of a system, by force or (gravitational) field between parts interacting by means of it, is equivalent to the mutual entanglement of those parts if the representation is discrete. Gravity (force field) and entanglement are two different, correspondingly continual and discrete, images of a single common essence. General relativity can be interpreted as a superluminal generalization of special relativity. There exists a postulate of an alleged obligatory difference between a model and reality in science and philosophy. It can also be deduced by interpreting a corollary of the theorem. On the other hand, quantum mechanics, on the basis of this theorem and of von Neumann’s (1932), introduces the option that a model be entirely identified with the modeled reality and, therefore, that reality be recognized absolutely: this is a non-standard hypothesis in the epistemology of science. Thus, true reality begins to be understood mathematically, i.e. in a Pythagorean manner, through its identification with its mathematical model. A few linked problems are highlighted: the role of the axiom of choice for correctly interpreting the theorem; whether the theorem can be considered an axiom; whether the theorem can be considered equivalent to the negation of the axiom.
One can construct a mapping between Hilbert space and the class of all logics if the latter is defined as the set of all well-orderings of some relevant set (or class). That mapping can be further interpreted as a mapping of all states of all quantum systems, on the one hand, onto all logics, on the other hand. The collection of all states of all quantum systems is equivalent to the world (the universe) as a whole. Thus the mapping establishes a fundamentally philosophical correspondence between the physical world and universal logic by the mediation of a special and fundamental structure, that of Hilbert space, and therefore between quantum mechanics and logic by mathematics. Furthermore, Hilbert space can be interpreted as the free variable of “quantum information”, and any point in it as a value of that variable, already “bound” by the axiom of choice.
Quantum information is discussed as the universal substance of the world. It is interpreted as that generalization of classical information which includes both finite and transfinite ordinal numbers. On the other hand, any wave function and thus any state of any quantum system is just one value of quantum information. Information and its generalization as quantum information are considered as quantities of elementary choices. Their units are correspondingly a bit and a qubit. The course of time is what generates choices by itself, and thus quantum information and any item in the world in the final analysis. The course of time necessarily generates choices so: the future is absolutely unorderable in principle, while the past is always well-ordered and thus unchangeable. The present, as the mediation between them, needs the well-ordering theorem, equivalent to the axiom of choice. The latter guarantees the choice even among the elements of an infinite set, which is the case of quantum information. Concrete and abstract objects share information as their common base, which is quantum as to the former and classical as to the latter. The general quantities of matter in physics, mass and energy, can be considered as particular cases of quantum information. The link between choice and abstraction in set theory allows “Hume’s principle” to be interpreted in terms of quantum mechanics as the equivalence of “many” and “much” underlying quantum information. Quantum information as the universal substance of the world calls for the unity of physics and mathematics rather than that of concrete and abstract objects, and thus for a form of quantum neo-Pythagoreanism in the final analysis.
If the concept of “free will” is reduced to that of “choice”, all of the physical world shares the latter quality. Nevertheless, “free will” can be distinguished from “choice”: “free will” implicitly involves a certain goal, while choice is only the means by which the aim can be achieved or not by the one who determines the target. Thus, for example, an electron always has a choice but not free will, unlike a human, who possesses both. Consequently, and paradoxically, the determinism of classical physics is more subjective and more anthropomorphic than the indeterminism of quantum mechanics, for the former implicitly presupposes a certain deterministic goal, following the model of human free-will behavior. Quantum mechanics introduces choice into the fundament of the physical world, involving a generalized case of choice, which can be called “subjectless”: there is a certain choice which originates from the transition of the future into the past. Thus that kind of choice is shared by all that exists and does not need any subject: it can be considered as a law of nature. There are a few theorems in quantum mechanics directly relevant to the topic: two of them are called “free will theorems” by their authors (Conway and Kochen 2006; 2009). Any quantum system, whether a human or an electron or whatever else, always has a choice: its behavior is not predetermined by its past. This is a physical law. It implies that a form of information, quantum information, underlies all that exists, for the unit of the quantity of information is an elementary choice: either a bit or a quantum bit (qubit).
DEFINING OUR TERMS A “paradox" is an argumentation that appears to deduce a conclusion believed to be false from premises believed to be true. An “inconsistency proof for a theory" is an argumentation that actually deduces a negation of a theorem of the theory from premises that are all theorems of the theory. An “indirect proof of the negation of a hypothesis" is an argumentation that actually deduces a conclusion known to be false from the hypothesis alone or, more commonly, from the hypothesis augmented by a set of premises known to be true. A “direct proof of a hypothesis" is an argumentation that actually deduces the hypothesis itself from premises known to be true. Since `appears', `believes' and `knows' all make elliptical reference to a participant, it is clear that `paradox', `indirect proof' and `direct proof' are all participant-relative. PARTICIPANT RELATIVITY In normal mathematical writing the participant is presumed to be “the community of mathematicians" or some more or less well-defined subcommunity and, therefore, omission of explicit reference to the participant is often warranted. However, in historical, critical, or philosophical writing focused on emerging branches of mathematics such omission often invites confusion. One and the same argumentation has been a paradox for one mathematician, an inconsistency proof for another, and an indirect proof to a third. One and the same argumentation-text can appear to one mathematician to express an indirect proof while appearing to another mathematician to express a direct proof. WHAT IS A PARADOX’S SOLUTION? Of the above four sorts of argumentation only the paradox invites “solution" or “resolution", and ordinarily this is to be accomplished either by discovering a logical fallacy in the “reasoning" of the argumentation or by discovering that the conclusion is not really false or by discovering that one of the premises is not really true.
Resolution of a paradox by a participant amounts to reclassifying a formerly paradoxical argumentation either as a “fallacy", as a direct proof of its conclusion, as an indirect proof of the negation of one of its premises, as an inconsistency proof, or as something else depending on the participant's state of knowledge or belief. This illustrates why an argumentation which is a paradox to a given mathematician at a given time may well not be a paradox to the same mathematician at a later time.

The present article considers several set-theoretic argumentations that appeared in the period 1903-1908. The year 1903 saw the publication of B. Russell's Principles of mathematics, [Cambridge Univ. Press, Cambridge, 1903; Jbuch 34, 62]. The year 1908 saw the publication of Russell's article on type theory as well as Ernst Zermelo's two watershed articles on the axiom of choice and the foundations of set theory. The argumentations discussed concern “the largest cardinal", “the largest ordinal", the well-ordering principle, “the well-ordering of the continuum", denumerability of ordinals and denumerability of reals. The article shows that these argumentations were variously classified by various mathematicians and that the surrounding atmosphere was one of confusion and misunderstanding, partly as a result of failure to make or to heed distinctions similar to those made above. The article implies that historians have made the situation worse by not observing or not analysing the nature of the confusion.

RECOMMENDATION This well-written and well-documented article exemplifies the fact that clarification of history can be achieved through articulation of distinctions that had not been articulated (or were not being heeded) at the time. The article presupposes extensive knowledge of the history of mathematics, of mathematics itself (especially set theory) and of philosophy. It is therefore not to be recommended for casual reading.
AFTERWORD: This review was written at the same time Corcoran was writing his signature “Argumentations and logic”[249] that covers much of the same ground in much more detail. https://www.academia.edu/14089432/Argumentations_and_Logic . (shrink)
Within the social sciences, much controversy exists about which status should be ascribed to the rationality assumption that forms the core of rational choice theories. Whilst realists argue that the rationality assumption is an empirical claim which describes real processes that cause individual action, instrumentalists maintain that it amounts to nothing more than an analytically set axiom or ‘as if’ hypothesis which helps in the generation of accurate predictions. In this paper, I argue that this realist-instrumentalist debate about (...) rational choice theory can be overcome once it is realised that the rationality assumption is neither an empirical description nor an ‘as if’ hypothesis, but a normative claim. (shrink)
The quantum information introduced by quantum mechanics is equivalent to the generalization of classical information from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The qubit can be interpreted as that generalization of the bit which is a choice among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of (...) results in time after measurement. The quantity of quantum information is the ordinal corresponding to the infinite series in question. Number and being (by the mediation of time), the natural and the artificial, turn out to be no more than different hypostases of a single common essence. This implies a kind of neo-Pythagorean ontology relating mathematics, physics, and technics immediately, by an explicit mathematical structure. (shrink)
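The claim that a qubit generalizes the bit from two alternatives to a continuum of alternatives can be made concrete in the standard notation of quantum mechanics (a textbook definition, added for orientation, not the paper's own formalism):

```latex
% A bit is a choice between two alternatives 0 and 1; a qubit is their
% normed superposition, i.e. a choice among a continuum of alternatives:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle ,
  \qquad \alpha, \beta \in \mathbb{C}, \qquad
  \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
```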
The paper justifies the following theses: The totality can found time if the latter is axiomatically represented by its “arrow” as a well-ordering. Time can found choice and thus information in turn. Quantum information and its units, the quantum bits, can be interpreted as their generalization as to infinity and as underlying the physical world, as well as the ultimate substance of the world both subjective and objective. Thus a pathway of interpretation from the totality via time, order, choice, (...) and information to the substance of the world is constructed. The article is based only on well-known facts and definitions and has no premises in this sense. Nevertheless it is naturally situated among the works and ideas of Husserl and Heidegger, linked to the foundation of mathematics by the axiom of choice, and to the philosophy of quantum mechanics and information. (shrink)
This dissertation examines several of the problems that Hilbert discovered in the foundations of mathematics, from a metalogical perspective. The problems manifest themselves in four different aspects of Hilbert’s views: (i) Hilbert’s axiomatic approach to the foundations of mathematics; (ii) His response to criticisms of set theory; (iii) His response to intuitionist criticisms of classical mathematics; (iv) Hilbert’s contribution to the specification of the role of logical inference in mathematical reasoning. This dissertation argues that Hilbert’s axiomatic approach was guided primarily (...) by model theoretical concerns. Accordingly, the ultimate aim of his consistency program was to prove the model-theoretical consistency of mathematical theories. It turns out that for the purpose of carrying out such consistency proofs, a suitable modification of the ordinary first-order logic is needed. To effect this modification, independence-friendly logic is needed as the appropriate conceptual framework. It is then shown how the model theoretical consistency of arithmetic can be proved by using IF logic as its basic logic. Hilbert’s other problems, manifesting themselves as aspects (ii), (iii), and (iv)—most notably the problem of the status of the axiom of choice, the problem of the role of the law of excluded middle, and the problem of giving an elementary account of quantification—can likewise be approached by using the resources of IF logic. It is shown that by means of IF logic one can carry out Hilbertian solutions to all these problems. The two major results concerning aspects (ii), (iii) and (iv) are the following: (a) The axiom of choice is a logical principle; (b) The law of excluded middle divides metamathematical methods into elementary and non-elementary ones. It is argued that these results show that IF logic helps to vindicate Hilbert’s nominalist philosophy of mathematics. 
On the basis of an elementary approach to logic, which enriches the expressive resources of ordinary first-order logic, this dissertation shows how the different problems that Hilbert discovered in the foundations of mathematics can be solved. (shrink)
Cyclic mechanics is intended as a generalization of both quantum mechanics and general relativity, apt to unify them. It is founded on a few principles, which can be enumerated approximately as follows: 1. Actual infinity, or the universe, can be considered as a physical and experimentally verifiable entity. It allows mechanical motion to exist. 2. A new law of conservation has to be involved to generalize and comprise the separate laws of conservation of classical and relativistic mechanics, and (...) especially that of conservation of energy: this is the conservation of action or information. 3. Time is not a uniformly flowing time in general. It can have some speed, acceleration, more than one dimension, or be discrete. 4. The following principle of cyclicity: the universe returns in any point of it. The return can be only kinematic, i.e. per a unit of energy (or mass), or thermodynamic, i.e. considering the universe as a thermodynamic whole. 5. The kinematic return, which is per a unit of energy (or mass), is the counterpart of conservation of energy, which can be interpreted as the particular case of conservation of action “per a unit of time”. The kinematic return per a unit of energy (or mass) can be interpreted in turn as another particular law of conservation in the framework of conservation of action (or information), namely conservation of wave period (or time). These two counterpart laws of conservation correspond exactly to the particle “half” and to the wave “half” of wave-particle duality. 6. The principle of quantum invariance is introduced. It means that all physical laws have to be invariant to discrete and continuous (smooth) morphisms (motions), or mathematically, to the axiom of choice. The list is not intended to be exhaustive or disjunctive, but only to give an introductory idea. (shrink)
We discuss the philosophical implications of formal results showing the consequences of adding the epsilon operator to intuitionistic predicate logic. These results are related to Diaconescu's theorem, a result originating in topos theory that, translated to constructive set theory, says that the axiom of choice (an “existence principle”) implies the law of excluded middle (which purports to be a logical principle). As a logical choice principle, epsilon allows us to translate that result to a logical setting, (...) where one can get an analogue of Diaconescu's result, but also can disentangle the roles of certain other assumptions that are hidden in mathematical presentations. It is our view that these results have not received the attention they deserve: logicians are unlikely to read a discussion because the results considered are “already well known,” while the results are simultaneously unknown to philosophers who do not specialize in what most philosophers will regard as esoteric logics. This is a problem, since these results have important implications for and promise significant illumination of contemporary debates in metaphysics. The point of this paper is to make the nature of the results clear in a way accessible to philosophers who do not specialize in logic, and in a way that makes clear their implications for contemporary philosophical discussions. To make the latter point, we will focus on Dummettian discussions of realism and anti-realism. Keywords: epsilon, axiom of choice, metaphysics, intuitionistic logic, Dummett, realism, antirealism. (shrink)
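For readers unfamiliar with Diaconescu's theorem, the standard set-theoretic argument (not reproduced in the abstract itself) runs as follows. For an arbitrary proposition $P$, form

```latex
\[
  A = \{\, x \in \{0,1\} : x = 0 \lor P \,\}, \qquad
  B = \{\, x \in \{0,1\} : x = 1 \lor P \,\}.
\]
% A choice function f gives f(A) \in A and f(B) \in B, i.e.
%   (f(A) = 0 \lor P)  and  (f(B) = 1 \lor P).
% Equality on \{0,1\} is decidable, so: if f(A) = 1 or f(B) = 0, then P.
% Otherwise f(A) = 0 \neq 1 = f(B); but P would imply A = B = \{0,1\}
% and hence f(A) = f(B), a contradiction, so \lnot P.
% Either way, P \lor \lnot P -- excluded middle follows from choice.
```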
Two strategies toward infinity are equally relevant, for it is as universal and thus complete as it is open and thus incomplete. Quantum mechanics is forced to introduce infinity implicitly by Hilbert space, on which its formalism is founded. One can demonstrate that essential properties of quantum information, entanglement, and the quantum computer originate directly from infinity once it is involved in quantum mechanics. Thus, these phenomena can be elucidated as both complete and incomplete, after which choice is the border between them. (...) A special kind of invariance to the axiom of choice, shared by quantum mechanics, is discussed as involving that border between the completeness and incompleteness of infinity in a consistent way. The so-called paradox of Albert Einstein, Boris Podolsky, and Nathan Rosen is interpreted entirely in the same terms, only of set theory. The quantum computer can demonstrate especially clearly the privilege of the internal position, or “observer”, or “user”, to infinity implied by Henkin's proposition as the only consistent one as to infinity. (shrink)
The way in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity is investigated. Quantum information is defined as the generalization of the concept of information to the choice among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit, is interpreted as a relevant elementary choice among an infinite set of alternatives, generalizing that of a bit. The invariance (...) to the axiom of choice shared by quantum mechanics is introduced: it constitutes quantum information as the relation of any state unorderable in principle (e.g. any coherent quantum state before measurement) and the same state already well-ordered (e.g. the well-ordered statistical ensemble of the measurement of the quantum system at issue). This allows equating the classical and quantum time correspondingly as the well-ordering of any physical quantity or quantities and their coherent superposition. That equating is interpretable as the isomorphism of Minkowski space and Hilbert space. Quantum information is the structure interpretable in both ways and thus underlying their unification. Its deformation is representable correspondingly as gravitation in the deformed pseudo-Riemannian space of general relativity and as the entanglement of two or more quantum systems. The standard model studies a single quantum system and thus privileges a single reference frame, turning out to be inertial for the generalized symmetry [U(1)]X[SU(2)]X[SU(3)] “gauging” the standard model. As the standard model refers to a single quantum system, it is necessarily linear, and thus the corresponding privileged reference frame is necessarily inertial. The Higgs mechanism U(1) → [U(1)]X[SU(2)], already sufficiently confirmed experimentally, describes exactly the choice of the initial position of a privileged reference frame as the corresponding breaking of the symmetry.
The standard model defines ‘mass at rest’ linearly and absolutely, but general relativity does so non-linearly and relatively. The “Big Bang” hypothesis is additional, interpreting that position as that of the “Big Bang”. It serves also to reconcile the linear standard model in the singularity of the “Big Bang” with the observed nonlinearity of the further expansion of the universe, described very well by general relativity. Quantum information links the standard model and general relativity in another way, by the mediation of entanglement. The linearity and absoluteness of the former and the nonlinearity and relativeness of the latter can be considered as the relation of a whole and the same whole divided into parts entangled in general. (shrink)
Contemporary Humeans treat laws of nature as statements of exceptionless regularities that function as the axioms of the best deductive system. Such ‘Best System Accounts’ marry realism about laws with a denial of necessary connections among events. I argue that Hume’s predecessor, George Berkeley, offers a more sophisticated conception of laws, equally consistent with the absence of powers or necessary connections among events in the natural world. On this view, laws are not statements of regularities but the most general rules (...) God follows in producing the world. Pace most commentators, I argue that Berkeley’s view is neither instrumentalist nor reductionist. More important, the Berkeleyan Best System can solve some of the problems afflicting its Humean rivals, including the problems of theory choice and Nancy Cartwright’s ‘facticity’ dilemma. Some of these solutions are available in the contemporary context, without any appeal to God. Berkeley’s account deserves to be taken seriously in its own right. (shrink)
The concept of quantum information is introduced as both the normed superposition of two orthogonal subspaces of the separable complex Hilbert space and the invariance of the Hamilton and Lagrange representations of any mechanical system. The base is the isomorphism of the standard introduction and the representation of a qubit to a 3D unit ball, in which two points are chosen. The separable complex Hilbert space is considered as the free variable of quantum information and any point in it (a wave function describing (...) a state of a quantum system) as its value as the bound variable. A qubit is equivalent to the generalization of ‘bit’ from the set of two equally probable alternatives to an infinite set of alternatives. Then, that Hilbert space is considered as a generalization of Peano arithmetic where any unit is substituted by a qubit, and thus the set of natural numbers is mappable within any qubit as the complex internal structure of the unit or a different state of it. Thus, any mathematical structure reducible to set theory is representable as a set of wave functions and a subspace of the separable complex Hilbert space, and it can be identified as the category of all categories, for any functor represents an operator transforming a set (or subspace) of the separable complex Hilbert space into another. Thus, category theory is isomorphic to the Hilbert-space representation of set theory & Peano arithmetic as above. Given any value of quantum information, i.e. a point in the separable complex Hilbert space, it always admits two equally acceptable interpretations: the one is physical, the other is mathematical. The former is a wave function as the exhaustive description of a certain state of a certain quantum system. The latter chooses a certain mathematical structure among a certain category. Thus there is no way to distinguish a mathematical structure from a physical state, for both are described exhaustively as a value of quantum information.
This statement in turn can be utilized to define quantum information by the identity of any mathematical structure to a physical state, and also vice versa. Further, that definition is equivalent to both the standard definition as the normed superposition and the invariance of the Hamilton and Lagrange interpretations of mechanical motion introduced at the beginning of the paper. Then, the concept of information symmetry can be involved as the symmetry between three elements, or two pairs of elements: the Lagrange representation and each counterpart of the pair of the Hamilton representation. The sense and meaning of information symmetry may be visualized by a single (quantum) bit and its interpretation as both a (privileged) reference frame and the symmetries of the Standard model. (shrink)
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based, definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. -/- We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic'; whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'. -/- We then adopt what may (...) be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. -/- We then consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based, definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways. -/- We show that the two definitions correspond to two distinctly different---not necessarily evidence-based but complementary---assignments of satisfaction and truth to the compound formulas of PA over N. -/- We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both the complementary interpretations; and conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences. (shrink)
Until recently, discussion of virtues in the philosophy of mathematics has been fleeting and fragmentary at best. But in the last few years this has begun to change. As virtue theory has grown ever more influential, not just in ethics where virtues may seem most at home, but particularly in epistemology and the philosophy of science, some philosophers have sought to push virtues out into unexpected areas, including mathematics and its philosophy. But there are some mathematicians already there, ready to (...) meet them, who have explicitly invoked virtues in discussing what is necessary for a mathematician to succeed. In both ethics and epistemology, virtue theory tends to emphasize character virtues, the acquired excellences of people. But people are not the only sort of thing whose excellences may be identified as virtues. Theoretical virtues have attracted attention in the philosophy of science as components of an account of theory choice. Within the philosophy of mathematics, and mathematics itself, attention to virtues has emerged from a variety of disparate sources. Theoretical virtues have been put forward both to analyse the practice of proof and to justify axioms; intellectual virtues have found multiple applications in the epistemology of mathematics; and ethical virtues have been offered as a basis for understanding the social utility of mathematical practice. Indeed, some authors have advocated virtue epistemology as the correct epistemology for mathematics (and perhaps even as the basis for progress in the metaphysics of mathematics). This topical collection brings together several of the researchers who have begun to study mathematical practices from a virtue perspective with the intention of consolidating and encouraging this trend. (shrink)
The problem of indeterminism in quantum mechanics, usually considered as a generalization of the determinism of classical mechanics and physics to the case of discrete (quantum) changes, is interpreted as a purely mathematical problem referring to the relation of a set of independent choices to a well-ordered series, therefore regulated by the equivalence of the axiom of choice and the well-ordering “theorem”. The former corresponds to quantum indeterminism, and the latter to classical determinism. No other premises (besides the above (...) only mathematical equivalence) are necessary to explain how the probabilistic causation of quantum mechanics refers to the unambiguous determinism of classical physics. The same equivalence underlies the mathematical formalism of quantum mechanics. It merged the well-ordered components of the vectors of Heisenberg's matrix mechanics and the non-ordered members of the wave functions of Schrödinger's undulatory mechanics. The mathematical condition of that merging is just the equivalence of the axiom of choice and the well-ordering theorem, implying in turn Max Born's probabilistic interpretation of quantum mechanics. Particularly, energy conservation is justified differently than in classical physics. It is due to the equivalence at issue rather than to the principle of least action. One may involve two forms of energy conservation corresponding either to the smooth changes of classical physics or to the discrete changes of quantum mechanics. Further, both kinds of changes can be equated to each other under the unified energy conservation, and the conditions for the violation of energy conservation can be investigated, therefore directing toward a certain generalization of energy conservation. (shrink)
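The equivalence invoked throughout can be stated explicitly: over ZF, the following two assertions are interderivable (a standard formulation, added here for reference, not quoted from the paper):

```latex
% Axiom of choice: every family of nonempty sets admits a choice function.
\[
  \forall \mathcal{F}\,\bigl(\varnothing \notin \mathcal{F}
    \;\rightarrow\; \exists f\,\forall S \in \mathcal{F}\; f(S) \in S\bigr)
\]
% Well-ordering theorem: every set can be well-ordered.
\[
  \forall X\,\exists R\,\bigl(R \text{ is a well-ordering of } X\bigr)
\]
```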
A principle, according to which any scientific theory can be mathematized, is investigated. Social science, liberal arts, history, and philosophy are meant first of all. That kind of theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should be accepted rather (...) as a metamathematical axiom about the relation of mathematics and reality. The main statement is formulated as follows: any scientific theory admits isomorphism to some mathematical structure in a constructive way. Its investigation needs philosophical means. Husserl's phenomenology is what is used, and then the conception of “bracketing reality” is modelled to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. The sketch of the proof is organized in five steps: a generalization of epoché; involving transfinite induction in the transition between Peano arithmetic and set theory; discussing the finiteness of Peano arithmetic; applying transfinite induction to Peano arithmetic; discussing an arithmetical model of reality. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper correspondingly Hilbert mathematics and Gödel mathematics.
The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. The present paper follows a pathway grounded on Husserl's phenomenology and “bracketing reality” to achieve the generalized arithmetic necessary for the principle to be founded in an alternative ontology, in which there is no reality external to mathematics: reality is included within mathematics. That latter mathematics is able to found itself and can be called Hilbert mathematics in honour of Hilbert's program for self-founding mathematics on the base of arithmetic. The principle of universal mathematizability is consistent with Hilbert mathematics, but not with Gödel mathematics. Consequently, its validity or rejection would resolve the problem of which mathematics refers to our being; and vice versa: the choice between them for different reasons would confirm or refute the principle as to the being. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. The Schrödinger equation in quantum mechanics is involved to illustrate that ontology. Thus the problem of which of the two mathematics is more relevant to our being is discussed again in a new way. A few directions for future work can be: a rigorous formal proof of the principle as an independent axiom; the further development of an information ontology consistent with both kinds of mathematics, but much more natural for Hilbert mathematics; the development of the information interpretation of quantum mechanics as a mathematical one for information ontology and thus Hilbert mathematics; the description of consciousness in terms of information ontology. (shrink)
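The replacement of induction by transfinite induction mentioned in the five-step sketch can be displayed side by side (standard formulations, assumed here rather than quoted from the paper):

```latex
% Peano induction, over the natural numbers:
\[
  \bigl[\varphi(0) \wedge \forall n\,(\varphi(n) \rightarrow \varphi(n+1))\bigr]
  \;\rightarrow\; \forall n\,\varphi(n)
\]
% Transfinite induction, over the ordinals: no base case is needed,
% since \alpha = 0 has no predecessors and the antecedent holds vacuously.
\[
  \forall \alpha\,\bigl[(\forall \beta < \alpha\;\varphi(\beta))
    \rightarrow \varphi(\alpha)\bigr]
  \;\rightarrow\; \forall \alpha\,\varphi(\alpha)
\]
```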
Drawing on insights from Imre Lakatos' seminal work on theories of rationality, Leslie Allan develops seven criteria for rational theory choice that avoid presuming the rationality of the scientific enterprise. He shows how his axioms of rationality follow from the general demands of an objectivist epistemology. Allan concludes by considering two weighty objections to his framework.
We propose an axiomatic approach to constructing the dynamics of systems in which one of the main elements is the consciousness of a subject. The main axiom is the statement that the state of consciousness is completely determined by the results of measurements performed on it. In the case of economic systems we propose to consider an offer of transaction as a fundamental measurement. Transactions with delayed choice, discussed in this paper, represent a logical generalization of incomplete transactions and (...) allow for a more rigorous approach to studying the properties of the algebra of economic measurements. Considering the behavior of an object as a sequence of actions and choices allows extending the obtained results to random systems, which include the subject's consciousness as one of their elements. The specifics on which the description of fundamental measurements is based allow easily transferring the formalism of Schwinger's theory of selective measurements (and the consequent quantum-mechanical formalism) to economic and social systems. (shrink)
Assuming that votes are independent, the epistemically optimal procedure in a binary collective choice problem is known to be a weighted supermajority rule with weights given by personal log-likelihood-ratios. It is shown here that an analogous result holds in a much more general model. Firstly, the result follows from a more basic principle than expected-utility maximisation, namely from an axiom (Epistemic Monotonicity) which requires neither utilities nor prior probabilities of the ‘correctness’ of alternatives. Secondly, a person’s input need (...) not be a vote for an alternative, it may be any type of input, for instance a subjective degree of belief or probability of the correctness of one of the alternatives. The case of a profile of subjective degrees of belief is particularly appealing, since here no parameters such as competence parameters need to be known. (shrink)
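The classical rule that this abstract generalizes can be sketched concretely. Below is a minimal illustration of the log-likelihood-ratio weighting for independent binary votes; the competence values are hypothetical, chosen only to show that one highly competent voter can outweigh a simple majority:

```python
from math import log

def optimal_weights(competences):
    """Log-likelihood-ratio weights for independent voters: if voter i
    is correct with probability p_i, the optimal weight is log(p_i / (1 - p_i))."""
    return [log(p / (1 - p)) for p in competences]

def weighted_majority(votes, competences):
    """Decide between alternatives +1 and -1 by the sign of the weighted
    vote sum, the epistemically optimal rule for independent votes."""
    total = sum(w * v for w, v in zip(optimal_weights(competences), votes))
    return 1 if total > 0 else -1

# Hypothetical competences: one expert (0.95) against two weaker voters (0.6).
# The expert's weight log(19) exceeds the two weights log(1.5) combined.
print(weighted_majority([1, -1, -1], [0.95, 0.6, 0.6]))  # prints 1
```

With equal competences the rule reduces to ordinary majority voting, since all weights coincide.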
The paper justifies the following theses: The totality can found time if the latter is axiomatically represented by its “arrow” as a well-ordering. Time can found choice and thus information in turn. Quantum information and its units, the quantum bits, can be interpreted as their generalization as to infinity and as underlying the physical world, as well as the ultimate substance of the world both subjective and objective. Thus a pathway of interpretation from the totality via time, order, choice, (...) and information to the substance of the world is constructed. The article is based only on well-known facts and definitions and has no premises in this sense. Nevertheless it is naturally situated among the works and ideas of Husserl and Heidegger, linked to the foundation of mathematics by the axiom of choice, and to the philosophy of quantum mechanics and information. (shrink)
The paper discusses the origin of dark matter and dark energy from the concepts of time and the totality in the final analysis. Though both seem to be rather philosophical, nonetheless they are postulated axiomatically and interpreted physically, and the corresponding philosophical transcendentalism serves heuristically. The exposition of the article means to outline the “forest for the trees”, however, in an absolutely rigorous mathematical way, which is to be explicated in detail in a future paper. The “two deductions” are two successive (...) stages of the single conclusion mentioned above. The concept of “transcendental invariance”, meaning an ontological and physical interpretation of the mathematical equivalence of the axiom of choice and the well-ordering “theorem”, is utilized again. Then, the time arrow is a corollary from that transcendental invariance, and in turn, it implies quantum information conservation as the Noether correlate of the linear “increase of time” after the time arrow. Quantum information conservation implies a few fundamental corollaries such as the “conservation of energy conservation” in quantum mechanics, from reasons quite different from those in classical mechanics and physics, as well as the “absence of hidden variables” (versus Einstein's conjecture) in it. However, the paper is concentrated only on the inference of another corollary from quantum information conservation, namely, dark matter and dark energy being due to entanglement, and thus, in the final analysis, to the conservation of quantum information, however observed experimentally only on the “cognitive screen” of “Mach's principle” in Einstein's general relativity, therefore excluding any other source of gravitational field than mass and gravity.
Then, if quantum information by itself would generate a certain nonzero gravitational field, it would be depicted on the same screen as certain masses and energies distributed in space-time, and most presumably observable as those dark energy and dark matter predominating in the universe, as about 96% of its energy and matter, quite unexpectedly for physics and the scientific worldview nowadays. Besides on the cognitive screen of general relativity, entanglement is available necessarily on one more “cognitive screen” (namely, that of quantum mechanics), being furthermore “flat”. Most probably, that projection is confinement, a mysterious and ad hoc added interaction along with the fundamental three ones of the Standard model, being even inconsistent with them conceptually, as far as it needs to differentiate the local space from the global space, being definable only as a relation between them (similar to entanglement). So, entanglement is able to link the gravity of general relativity to the confinement of the Standard model as its projections on the “cognitive screens” of those two fundamental physical theories. (shrink)
In this paper, partly historical and partly theoretical, after having shortly outlined the development of meta-ethics in the 1900s starting from the Tractatus of Wittgenstein, I argue it is possible to sustain that emotivism and intuitionism are unsatisfactory ethical conceptions, while on the contrary reason (intended in a logical-deductive sense) plays an effective role both in ethical discussions and in choices. There are some characteristics of the ethical language (prescriptivity, universalizability and predominance) that cannot be eluded (on pain of the non (...) significativity of the same language) by those who want to reason morally, i.e. by those who intend to regulate their own behaviour on the basis of acknowledged and coherent principles. These characteristics can be found whether or not all possible ontological-metaphysical foundations of morals are taken into account. Furthermore the deontic logic systems allow the formalization of ethical theories and - at least in principle - a rigorous critical discussion of the same, but obviously nothing can be affirmed on the truth value of the axioms of a system. In the deontic logic systems Hume's law is assumed as an implicit result of inferential (conventional) rules, and the acceptance of Hume's law as a logical-linguistic thesis does not involve the cancellation of values (nihilism) or ethical relativism or indifferentism. (shrink)
In 1922, Thoralf Skolem introduced the term «relativity» with regard to infinity in set theory. He demonstrated, by Zermelo's axiomatics of set theory (incl. the axiom of choice), that there exist unintended interpretations of any infinite set. Thus, the notion of set was also «relative». We can apply his argumentation to Gödel's incompleteness theorems (1931) as well as to his completeness theorem (1930). Then, both the incompleteness of Peano arithmetic and the completeness of first-order logic turn out to be also «relative» in Skolem's sense. Skolem's «relativity» argumentation of that kind can be applied to a very wide range of problems, and one can speak of the relativity of discreteness and continuity, or of finiteness and infinity, or of Cantor's kinds of infinities, etc. Relativity of the Skolemian type helps us generalize Einstein's principle of relativity from the invariance of the physical laws under diffeomorphisms to their invariance under any morphisms (including, and especially, the discrete ones). Such a generalization from diffeomorphisms (where the notion of velocity always makes sense) to any kind of morphism (where 'velocity' may or may not make sense) is an extension of the general Skolemian type of relativity between discreteness and continuity, or between finiteness and infinity. In particular, Lorentz invariance is not valid in general, because the notion of velocity is limited to diffeomorphisms. In the case of entanglement, the physical interaction is discrete; 'velocity' and, consequently, 'Lorentz invariance' do not make sense. That is the simplest explanation of the EPR argument, which turns into a paradox only if the universal validity of 'velocity' and 'Lorentz invariance' is implicitly accepted.
Group decisions must often obey exogenous constraints. While in a preference aggregation problem constraints are modelled by restricting the set of feasible alternatives, this paper discusses the modelling of constraints when aggregating individual yes/no judgments on interconnected propositions. For example, court judgments in breach-of-contract cases should respect the constraint that action and obligation are necessary and sufficient for liability, and judgments on budget items should respect budgetary constraints. In this paper, we make constraints in judgment aggregation explicit by relativizing the rationality conditions of consistency and deductive closure to a constraint set, whose variation yields more or less strong notions of rationality. This approach of modelling constraints explicitly contrasts with that of building constraints as axioms into the logic, which turns compliance with constraints into a matter of logical consistency and thereby conflates requirements of ordinary logical consistency and requirements dictated by the environment. We present some general impossibility results on constrained judgment aggregation; they are immediate corollaries of known results on judgment aggregation.
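The relativized rationality condition can be sketched concretely (a minimal illustration under my own encoding, not the paper's notation): a judgment set is consistent relative to a constraint set iff some truth assignment satisfies both, so the same majority judgments can be consistent in plain logic yet inconsistent relative to the breach-of-contract doctrine.

```python
from itertools import product

def consistent(judgments, constraints, atoms):
    """Constraint-relative consistency: some truth assignment over the
    atoms satisfies every judgment and every constraint simultaneously."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(f(v) for f in judgments) and all(f(v) for f in constraints):
            return True
    return False

# Breach-of-contract example from the abstract: liability iff action and
# obligation (the exogenous legal-doctrine constraint).
atoms = ["action", "obligation", "liable"]
legal_doctrine = [lambda v: v["liable"] == (v["action"] and v["obligation"])]

# A majority accepts "action" and "obligation" but rejects "liable":
majority = [lambda v: v["action"], lambda v: v["obligation"],
            lambda v: not v["liable"]]

print(consistent(majority, [], atoms))              # True: logically fine
print(consistent(majority, legal_doctrine, atoms))  # False: violates doctrine
```

Varying the constraint set from empty to the full doctrine is exactly what yields the weaker or stronger notions of rationality the abstract describes.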
Any computer can create a model of reality. The hypothesis that a quantum computer can generate a model, designated as quantum, which coincides with the modeled reality, is discussed. Its grounds are the theorems about the absence of “hidden variables” in quantum mechanics. Quantum modeling requires the axiom of choice. The following conclusions are deduced from the hypothesis. A quantum model, unlike a classical model, can coincide with reality. Reality can be interpreted as a quantum computer. Physical processes represent computations of the quantum computer. Quantum information is the real foundation of the world. The conception of the quantum computer unifies physics and mathematics, and thus the material and the ideal world. A quantum computer is a non-Turing machine in principle. Any quantum computation can be interpreted as an infinite classical computational process of a Turing machine. The quantum computer introduces the notion of an “actually infinite computational process”. The discussed hypothesis is consistent with all of quantum mechanics. The conclusions address a form of neo-Pythagoreanism: unifying the mathematical and the physical, the quantum computer is situated in an intermediate domain of their mutual transformation.
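The contrast between a quantum state and any classical model of it can be made tangible with a toy sketch (my own assumed example, not the paper's construction): a qubit represented by two amplitudes, measured by sampling from the Born-rule probabilities, so any classical description recovers only the statistics, never the individual outcomes.

```python
import random
from math import sqrt

def measure(amplitudes, rng=random.random):
    """Sample one measurement outcome of a qubit (a, b) by the Born rule:
    outcome 0 with probability |a|^2, outcome 1 with probability |b|^2."""
    a, b = amplitudes
    return 0 if rng() < abs(a) ** 2 else 1

# Equal superposition of |0> and |1>: individual outcomes are irreducibly
# random (no "hidden variable" fixes them), only the frequencies converge.
plus = (1 / sqrt(2), 1 / sqrt(2))
samples = [measure(plus) for _ in range(10000)]
print(abs(samples.count(0) / len(samples) - 0.5) < 0.05)  # roughly fifty-fifty
```

The classical simulation above only reproduces measurement statistics by external randomness; on the abstract's hypothesis, it is the quantum model itself that can coincide with the modeled reality.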
A set-theoretic model of reality, representation and language based on the relation of completeness and incompleteness is explored. The problem of the completeness of mathematics is linked to its counterpart in quantum mechanics. The model includes two Peano arithmetics, or Turing machines, independent of each other. The complex Hilbert space underlying quantum mechanics as the base of its mathematical formalism is interpreted as a generalization of Peano arithmetic: it is a doubled infinite set of doubled Peano arithmetics having a remarkable symmetry with respect to the axiom of choice. The quantity of information is interpreted as the number of elementary choices (bits). Quantum information is seen as the generalization of information to infinite sets or series. The equivalence of that model to a quantum computer is demonstrated. The condition for the Turing machines to be independent of each other is reduced to a state of Nash equilibrium between them. Two relative models of language, as a game in the sense of game theory and as an ontology of metaphors (all mappings which are not one-to-one, i.e. not representations of reality in a formal sense), are deduced.
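The interpretation of the quantity of information as a number of elementary choices can be spelled out in a one-line example (a standard counting fact, not the paper's own formalism): singling out one of n equiprobable alternatives takes log2(n) elementary binary choices, each an individual bit.

```python
from math import log2

def bits_needed(n_alternatives):
    """Number of elementary yes/no choices (bits) needed to single out
    one of n equiprobable alternatives."""
    return log2(n_alternatives)

print(bits_needed(2))  # one elementary choice -> 1.0 bit
print(bits_needed(8))  # three nested choices  -> 3.0 bits
```

Quantum information, as the abstract has it, then generalizes this count from finite sets of alternatives to infinite sets or series.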