We show that the respective oversights in von Neumann's general theorem against all hidden variable theories and Bell's theorem against their local-realistic counterparts are homologous. When the latter oversight is rectified, the bounds on the CHSH correlator work out to be ±2√2 instead of ±2.
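The ±2√2 bound mentioned above can be checked numerically. The following sketch assumes the standard singlet-state correlator E(a, b) = -cos(a - b) and the textbook CHSH angle settings; these are generic illustrations, not details drawn from the paper.

```python
import math

def E(a, b):
    # Singlet-state correlator for measurement angles a (Alice) and b (Bob)
    return -math.cos(a - b)

# Textbook CHSH settings that saturate the quantum (Tsirelson) bound
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, 3 * math.pi / 4

# CHSH combination of the four correlators
S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(abs(S))  # ≈ 2.8284, i.e. 2*sqrt(2), rather than the local-realistic 2
```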
Although expected utility theory has proven a fruitful and elegant theory in the finite realm, attempts to generalize it to infinite values have resulted in many paradoxes. In this paper, we argue that the use of John Conway's surreal numbers can provide a firm mathematical foundation for transfinite decision theory. To that end, we prove a surreal representation theorem and show that our surreal decision theory respects dominance reasoning even in the case of infinite values. We then bring our theory to bear on one of the more venerable decision problems in the literature: Pascal's Wager. Analyzing the wager showcases our theory's virtues and advantages. To that end, we analyze two objections against the wager: Mixed Strategies and Many Gods. After formulating the two objections in the framework of surreal utilities and probabilities, our theory correctly predicts that (1) the pure Pascalian strategy beats all mixed strategies, and (2) what one should do in a Pascalian decision problem depends on what one's credence function is like. Our analysis therefore suggests that although Pascal's Wager is mathematically coherent, it does not deliver what it purports to: a rationally compelling argument that people should lead a religious life regardless of how confident they are in theism and its alternatives.
I argue that prioritarianism cannot be assessed in abstraction from an account of the measure of utility. Rather, the soundness of this view crucially depends on what counts as a greater, lesser, or equal increase in a person’s utility. In particular, prioritarianism cannot accommodate a normatively compelling measure of utility that is captured by the axioms of John von Neumann and Oskar Morgenstern’s expected utility theory. Nor can it accommodate a plausible and elegant generalization of this theory that has been offered in response to challenges to von Neumann and Morgenstern. This is, I think, a theoretically interesting and unexpected source of difficulty for prioritarianism, which I explore in this article.
The paper summarizes expected utility theory, both in its original von Neumann-Morgenstern version and its later developments, and discusses the normative claims to rationality made by this theory.
Harsanyi claimed that his Aggregation and Impartial Observer Theorems provide a justification for utilitarianism. This claim has been strongly resisted, notably by Sen and Weymark, who argue that while Harsanyi has perhaps shown that overall good is a linear sum of individuals’ von Neumann-Morgenstern utilities, he has done nothing to establish any connection between the notion of von Neumann-Morgenstern utility and that of well-being, and hence that utilitarianism does not follow. The present article defends Harsanyi against the Sen-Weymark critique. I argue that, far from being a term with precise and independent quantitative content whose relationship to von Neumann-Morgenstern utility is then a substantive question, terms such as ‘well-being’ suffer (or suffered) from indeterminacy regarding precisely which quantity they refer to. If so, then (on the issue that this article focuses on) Harsanyi has gone as far towards defending ‘utilitarianism in the original sense’ as could coherently be asked.
People with the kind of preferences that give rise to the St. Petersburg paradox are problematic---but not because there is anything wrong with infinite utilities. Rather, such people cannot assign the St. Petersburg gamble any value that any kind of outcome could possibly have. Their preferences also violate an infinitary generalization of Savage's Sure Thing Principle, which we call the *Countable Sure Thing Principle*, as well as an infinitary generalization of von Neumann and Morgenstern's Independence axiom, which we call *Countable Independence*. In violating these principles, they display foibles like those of people who deviate from standard expected utility theory in more mundane cases: they choose dominated strategies, pay to avoid information, and reject expert advice. We precisely characterize the preference relations that satisfy Countable Independence in several equivalent ways: a structural constraint on preferences, a representation theorem, and the principle we began with, that every prospect has a value that some outcome could have.
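The pathology the abstract describes is easiest to see in the gamble's truncated expected values, which grow without bound. A minimal sketch, assuming the standard doubling-payoff formulation of the St. Petersburg gamble (flip until heads; round k pays 2^k with probability 2^-k), which is background to the paper rather than taken from it:

```python
def truncated_expectation(n):
    # Expected payoff of the St. Petersburg gamble truncated at n rounds:
    # round k pays 2**k with probability 2**-k, so each round contributes 1.
    return sum((2 ** -k) * (2 ** k) for k in range(1, n + 1))

print(truncated_expectation(10))   # 10.0
print(truncated_expectation(100))  # 100.0: no finite value bounds the gamble
```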
This paper reviews some major episodes in the history of the spatial isomorphism problem of dynamical systems theory. In particular, by analysing, both systematically and in historical context, a hitherto unpublished letter written in 1941 by John von Neumann to Stanislaw Ulam, this paper clarifies von Neumann's contribution to discovering the relationship between spatial isomorphism and spectral isomorphism. The main message of the paper is that von Neumann's argument described in his letter to Ulam is the very first proof that spatial isomorphism and spectral isomorphism are not equivalent because spectral isomorphism is weaker than spatial isomorphism: von Neumann shows that spectrally isomorphic ergodic dynamical systems with mixed spectra need not be spatially isomorphic.
In his classic book “The Foundations of Statistics” Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: any preference relation that satisfies certain arguably acceptable postulates determines a (finitely additive) probability distribution over the states and a utility assignment to the consequences, such that the preferences among acts are determined by their expected utilities. Additional problematic assumptions are however required in Savage's proofs. First, there is a Boolean algebra of events (sets of states) which determines the richness of the set of acts. The probabilities are assigned to members of this algebra. Savage's proof requires that this be a σ-algebra (i.e., closed under countably infinite unions and intersections), which makes for an extremely rich preference relation. On Savage's view we should not require subjective probabilities to be σ-additive. He therefore finds the insistence on a σ-algebra peculiar and is unhappy with it. But he sees no way of avoiding it. Second, the assignment of utilities requires the constant act assumption: for every consequence there is a constant act, which produces that consequence in every state. This assumption is known to be highly counterintuitive. The present work contains two mathematical results. The first, and the more difficult one, shows that the σ-algebra assumption can be dropped. The second states that, as long as utilities are assigned to finite gambles only, the constant act assumption can be replaced by the more plausible and much weaker assumption that there are at least two non-equivalent constant acts.
The second result also employs a novel way of deriving utilities in Savage-style systems -- without appealing to von Neumann-Morgenstern lotteries. The paper discusses the notion of “idealized agent” that underlies Savage's approach, and argues that the simplified system, which is adequate for all the actual purposes for which the system is designed, involves a more realistic notion of an idealized agent.
In spite of the many efforts made to clarify von Neumann’s methodology of science, one crucial point seems to have been disregarded in recent literature: his closeness to Hilbert’s spirit. In this paper I shall claim that the scientific methodology adopted by von Neumann in his later foundational reflections originates in the attempt to revaluate Hilbert’s axiomatics in the light of Gödel’s incompleteness theorems. Indeed, axiomatics continues to be pursued by the Hungarian mathematician in the spirit of Hilbert’s school. I shall argue this point by examining four basic ideas embraced by von Neumann in his foundational considerations: a) the conservative attitude to assume in mathematics; b) the role that mathematics and the axiomatic approach have to play in all that is science; c) the notion of success as an alternative methodological criterion to follow in scientific research; d) the empirical and, at the same time, abstract nature of mathematical thought. Once these four basic ideas have been accepted, Hilbert’s spirit in von Neumann’s methodology of science will become clear.
We review our approach to quantum mechanics, adding also some new interesting results. We start by giving proof of two important theorems on the existence of the A(S_i) and N_{i,±1} Clifford algebras. This last algebra gives proof of von Neumann's basic postulates on quantum measurement, thus explaining in an algebraic manner the wave function collapse postulated in standard quantum theory. In this manner we reach the objective of exposing a self-consistent version of quantum mechanics. In detail we realize a bare-bones skeleton of quantum mechanics, recovering all the basic foundations of this theory in an algebraic framework. We give proof of the quantum-like Heisenberg uncertainty relations using only the basic support of the Clifford algebra. In addition we demonstrate the well-known phenomenon of quantum Mach-Zehnder interference using the same algebraic framework, and we give algebraic proof of quantum collapse in some cases of physical interest by direct application of the theorem that we derive to elaborate the N_{i,±1} algebra. We also discuss the problem of time evolution of quantum systems as well as changes in spatial location and momentum and the linked invariance principles. We are also able to re-derive the basic wave function of standard quantum mechanics by using only the Clifford algebraic approach. In this manner we obtain a full exposition of standard quantum mechanics using only the basic axioms of Clifford algebra. We also discuss more advanced features of quantum mechanics. In detail, we give a demonstration of the Kochen-Specker theorem, and we give an algebraic formulation and explanation of the EPR paradox using only the Clifford algebra. By using the same approach we also derive Bell inequalities. Our formulation is strongly based on the use of idempotents that are contained in the Clifford algebra.
Their counterpart in quantum mechanics is represented by the projection operators which, as is well known, are interpreted as logical statements, following von Neumann's basic results. Von Neumann realized a matrix logic on the basis of quantum mechanics. Using the Clifford algebra we are able to invert this result. Following the results previously obtained by Orlov in 1994, we are able to give proof that quantum mechanics derives from logic. We show that indeterminism and quantum interference have their origin in logic. Therefore, it seems that we may conclude that quantum mechanics, as it appears when investigated by the Clifford algebra, is a two-faced theory in the sense that it looks from one side to “matter per se”, thus to objects, but simultaneously also to conceptual entities. We advance the basic conclusion of the paper: there are stages of our reality in which we can no longer separate logic (and thus cognition and conceptual entities) from the features of “matter per se”. In quantum mechanics logic, and thus cognition and the conceptual entity-cognitive performance, assume the same importance as the features of what is being described. We are at levels of reality in which the truths of logical statements about dynamic variables become dynamic variables themselves, so that a profound link is established from the start in this theory between physics and conceptual entities. Finally, in this approach there is no absolute definition of logical truths. Transformations, and thus “redefinitions”, of truth values are permitted in this scheme, as the well-established invariance principles clearly indicate.
REVIEW OF: Automated Development of Fundamental Mathematical Theories by Art Quaife (1992: Kluwer Academic Publishers), 271pp. Using the theorem prover OTTER, Art Quaife has proved four hundred theorems of von Neumann-Bernays-Gödel set theory; twelve hundred theorems and definitions of elementary number theory; dozens of Euclidean geometry theorems; and Gödel's incompleteness theorems. It is an impressive achievement. To gauge its significance and to see what prospects it offers, this review looks closely at the book and the proofs it presents.
Whereas many others have scrutinized the Allais paradox from a theoretical angle, we study the paradox from an historical perspective and link our findings to a suggestion as to how decision theory could make use of it today. We emphasize that Allais proposed the paradox as a normative argument, concerned with ‘the rational man’ and not the ‘real man’, to use his words. Moreover, and more subtly, we argue that Allais had an unusual sense of the normative, being concerned not so much with the rationality of choices as with the rationality of the agent as a person. These two claims are buttressed by a detailed investigation – the first of its kind – of the 1952 Paris conference on risk, which set the context for the invention of the paradox, and a detailed reconstruction – also the first of its kind – of Allais’s specific normative argument from his numerous but allusive writings. The paper contrasts these interpretations of what the paradox historically represented, with how it generally came to function within decision theory from the late 1970s onwards: that is, as an empirical refutation of the expected utility hypothesis, and more specifically of the condition of von Neumann–Morgenstern independence that underlies that hypothesis. While not denying that this use of the paradox was fruitful in many ways, we propose another use that turns out also to be compatible with an experimental perspective. Following Allais’s hints on ‘the experimental definition of rationality’, this new use consists in letting the experiment itself speak of the rationality or otherwise of the subjects. In the 1970s, a short sequence of papers inspired by Allais implemented original ways of eliciting the reasons guiding the subjects’ choices, and claimed to be able to draw relevant normative consequences from this information.
We end by reviewing this forgotten experimental avenue not simply historically, but with a view to recommending it for possible use by decision theorists today.
This monographic chapter explains how expected utility (EU) theory arose in von Neumann and Morgenstern, how it was called into question by Allais and others, and how it gave way to non-EU theories, at least among the specialized quarters of decision theory. I organize the narrative around the idea that the successive theoretical moves amounted to resolving Duhem-Quine underdetermination problems, so they can be assessed in terms of the philosophical recommendations made to overcome these problems. I actually follow Duhem's recommendation, which was essentially to rely on the passing of time to make many experiments and arguments available, and eventually strike a balance between competing theories on the basis of this improved knowledge. Although Duhem's solution seems disappointingly vague, relying as it does on "bon sens" to bring an end to the temporal process, I do not think there is any better one in the philosophical literature, and I apply it here for what it is worth. In this perspective, EU theorists were justified in resisting the first attempts at refuting their theory, including Allais's in the 50s, but they would have lacked "bon sens" in not acknowledging their defeat in the 80s, after the long process of pros and cons had sufficiently matured. This primary Duhemian theme is actually combined with a secondary theme - normativity. I suggest that EU theory was normative at its very beginning and has remained so all along, and I express dissatisfaction with the orthodox view that it could be treated as a straightforward descriptive theory for purposes of prediction and scientific test. This view is usually accompanied by a faulty historical reconstruction, according to which EU theorists initially formulated the VNM axioms descriptively and retreated to a normative construal once they felt threatened by empirical refutation.
My historical study shows that things did not evolve in this way: the theory was both proposed and rebutted on the basis of normative arguments already in the 1950s. The ensuing, major problem was to make choice experiments compatible with this inherently normative feature of the theory. Compatibility was obtained in some experiments, but implicitly and somewhat confusingly, for instance by excluding overtly incoherent subjects or by creating strong incentives for the subjects to reflect on the questions and provide answers they would be able to defend. I also claim that Allais had an intuition of how to combine testability and normativity, unlike most later experimenters, and that it would have been more fruitful to work from his intuition than to make choice experiments of the naively empirical style that flourished after him. In sum, it can be said that the underdetermination process accompanying EUT was resolved in a Duhemian way, but this was not without major inefficiencies. To build explicit rationality considerations into experimental schemes right from the beginning would have limited the scope of empirical research, avoided wasting resources to get only minor findings, and speeded up the Duhemian process of groping towards a choice among competing theories.
The Humean conception of the self consists in the belief-desire model of motivation and the utility-maximizing model of rationality. This conception has dominated Western thought in philosophy and the social sciences ever since Hobbes’ initial formulation in Leviathan and Hume’s elaboration in the Treatise of Human Nature. Bentham, Freud, Ramsey, Skinner, Allais, von Neumann and Morgenstern and others have added further refinements that have brought it to a high degree of formal sophistication. Late twentieth century moral philosophers such as Rawls, Brandt, Frankfurt, Nagel and Williams have taken it for granted, and have made use of it to supply metaethical foundations for a wide variety of normative moral theories. But the Humean conception of the self also leads to seemingly insoluble problems about moral motivation, rational final ends, and moral justification. Can it be made to work?
This is the first of a two-volume work combining two fundamental components of contemporary computing into classical deductive computing, a powerful form of computation, highly adequate for programming and automated theorem proving, which, in turn, have fundamental applications in areas of high complexity and/or high security such as mathematical proof, software specification and verification, and expert systems. Deductive computation is concerned with truth-preservation: this is the essence of the satisfiability problem, or SAT, the central computational problem in computability and complexity theory. The Turing machine provides the classical version of this theory—classical computing—with its standard model, which is physically concretized—and thus spatio-temporally limited and restricted—in the von Neumann, or digital, computer. Although a number of new technological applications require classical deductive computation with non-classical logics, many key technologies still do well—or exclusively, for that matter—with classical logic. In this first volume, we elaborate on classical deductive computing with classical logic. The objective of the main text is to provide the reader with a thorough elaboration on both classical computing and classical deduction with the classical first-order predicate calculus with a view to computational implementations. As a complement to the mathematical-based exposition of the topics we offer the reader a very large selection of exercises. This selection aims at not only practice of discussed material, but also creative approaches to problems, for both discussed and novel contents, as well as at research into further relevant topics.
A possible world is a junky world if and only if each thing in it is a proper part. The possibility of junky worlds contradicts the principle of general fusion. Bohn (2009) argues for the possibility of junky worlds; Watson (2010) suggests that Bohn’s arguments are flawed. This paper shows that the arguments of both authors leave much to be desired. First, relying on the classical results of Cantor, Zermelo, Fraenkel, and von Neumann, this paper proves the possibility of junky worlds for certain weak set theories. Second, the paradox of Burali-Forti shows that according to the Zermelo-Fraenkel set theory ZF, junky worlds are possible. Finally, it is shown that set theories are not the only sources for designing plausible models of junky worlds: topology (and possibly other "algebraic" mathematical theories) may be used to construct models of junky worlds. In sum, junkiness is a relatively widespread feature among possible worlds.
The notion of equality between two observables plays many important roles in the foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of equality between two arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the standard probabilistic interpretation of quantum theory to define the probability of equality between two arbitrary observables in an arbitrary state. We apply this new interpretation to quantum measurement theory, and establish a logical basis for the difference between simultaneous measurability and simultaneous determinateness.
Psychopathy refers to a range of complex behaviors and personality traits, including callousness and antisocial behavior, typically studied in criminal populations. Recent studies have used self-reports to examine psychopathic traits among noncriminal samples. The goal of the current study was to examine the underlying factor structure of the Self-Report of Psychopathy Scale–Short Form (SRP-SF) across complementary samples and examine the impact of gender on factor structure. We examined the structure of the SRP-SF among 2,554 young adults from three undergraduate samples and a high-risk young adult sample. Using confirmatory factor analysis, a four-correlated factor model and a four-bifactor model showed good fit to the data. Evidence of weak invariance was found for both models across gender. These findings highlight that the SRP-SF is a useful measure of low-level psychopathic traits in noncriminal samples, although the underlying factor structure may not fully translate across men and women.
Holt argues that Rawls’s first principle of justice requires democratic control of the economy and that property owning democracy fails to satisfy this requirement; only liberal socialism is fully democratic. However, the notion of democratic control is ambiguous, and Holt has to choose between the weaker notion of democratic control that Rawls is committed to and the stronger notion that property owning democracy fails to satisfy. It may be that there is a tension between capitalism and democracy, so that only liberal socialism can be fully democratic, but if so, we should reject, rather than argue from, the theory of democracy we find in justice as fairness.
The syllogistic figures and moods can be taken to be argument schemata as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo’s second-order Separation Axiom is approximated by Fraenkel’s first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components one of which is a template-text or scheme-template, a syntactic string composed of one or more “blanks” and also possibly significant words and/or symbols. In accordance with a side condition the template-text of a schema is used as a “template” to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may but need not be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions ‘schema’ and ‘scheme’ in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as the ontological and epistemic presuppositions engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata.
I follow standard mathematical practice and theory to argue that the natural numbers are the finite von Neumann ordinals. I present the reasons standardly given for identifying the natural numbers with the finite von Neumann ordinals. I give a detailed mathematical demonstration that 0 is {} and, for every natural number n, n is the set of all natural numbers less than n. Natural numbers are sets. They are the finite von Neumann ordinals.
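The identification argued for above is easy to exhibit concretely. A minimal sketch (not code from the paper) that builds the finite von Neumann ordinals as nested frozensets, with 0 = {} and successor n ∪ {n}:

```python
def von_neumann(n):
    """The finite von Neumann ordinal n: the set of all smaller ordinals."""
    ordinal = frozenset()              # 0 is the empty set {}
    for _ in range(n):
        ordinal = ordinal | {ordinal}  # successor: n + 1 = n ∪ {n}
    return ordinal

zero, one, two = von_neumann(0), von_neumann(1), von_neumann(2)
assert zero == frozenset()            # 0 = {}
assert one == frozenset({zero})       # 1 = {0}
assert two == frozenset({zero, one})  # 2 = {0, 1}
assert len(von_neumann(5)) == 5       # n has exactly n elements
```

Each ordinal is literally the set of all smaller ordinals, so n has exactly n elements, matching the paper's claim that n is the set of all natural numbers less than n.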
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2.
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
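The "normalized counting measure on distinctions" has a simple closed form in the classical case: for a partition whose blocks have probabilities p_i, the logical entropy is h(p) = 1 - Σ p_i², the chance that two independent draws are distinguished by the partition. A small sketch of this standard formula (an illustration, not code from the paper):

```python
def logical_entropy(probs):
    # Probability that two independent draws fall in different blocks:
    # h(p) = 1 - sum(p_i ** 2), the normalized measure of distinctions.
    return 1.0 - sum(p * p for p in probs)

# Uniform partition into 4 blocks: h = 1 - 4 * (1/4)**2 = 0.75
print(logical_entropy([0.25] * 4))  # 0.75

# The trivial one-block partition makes no distinctions at all
print(logical_entropy([1.0]))       # 0.0
```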
contents

i. Austen would eroticize all life
ii. Merchant/Ivory, a name oddly right
iii. Ellie Arroway / Agent Starling
iv. abattoir / l’abattoir / laboratoire
v. von Neumann's brain an anomaly
vi. was terrified of death, delighted in the a-bomb
vii. the Greatest Brain is variously named.
“There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s seminal 1959 lecture at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at the microphysical level, distinct initial states of a system leading to distinct final states. On the other hand, as von Neumann already conjectured, irreversible information processing is expensive: to erase a single bit of information costs ~3 × 10⁻²¹ joules at room temperature. Information entropy is a thermodynamic cost, to be paid in non-computational energy dissipation. This paper addresses the problem drawing on Edward Fredkin’s Finite Nature hypothesis: the ultimate nature of the universe is discrete and finite, satisfying the axioms of classical, atomistic mereology. The chosen model is a cellular automaton with reversible dynamics, capable of retaining memory of the information present at the beginning of the universe. Such a CA can implement the Boolean logical operations and the other building blocks of computation: it can develop and host all-purpose computers. The model is a candidate for the realization of computational systems, capable of exploiting the resources of the physical world in an efficient way, for they can host logical circuits with negligible internal energy dissipation.
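The "~3 × 10⁻²¹ joules" figure is Landauer's bound, kT ln 2, evaluated at room temperature. A quick check (physical constants only; nothing here is specific to the paper's cellular-automaton model):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # room temperature in kelvin

# Landauer's bound: minimum energy dissipated to erase one bit of information
E_bit = k_B * T * math.log(2)
print(E_bit)        # ≈ 2.87e-21 J, the "~3 × 10⁻²¹ joules" quoted above
```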
Theorists of artificial intelligence and their fellow travellers in the philosophy of mind have responded in different ways to criticism of AI's original theoretical goal. One reaction is the withdrawal of that goal in favor of pursuing smaller-scale projects. Another is the promotion of connectionist systems, whose decentralized mode of operation is supposed to simulate the neural networks of the human brain more faithfully. A further one is the so-called robot reply. The robot reply consists of two elements: (a) the concession that the system behavior of a conventional digital computer with von Neumann architecture, however it is programmed, does not by itself amount to human-like intelligence, and (b) the claim that for certain kinds of machines it nevertheless suffices for intelligence. Machines could join the league of intelligent beings exactly when they are robots, that is, when they possess perceptual components (receptors) and action components (effectors) by means of which they can actively enter into causal interactions with their environment. The paper argues for the thesis that the robot reply rests on a correct intuition, from which the friends of robots are, however, led to a hasty conclusion. It is right to tie mental states and the capacity for action closely together: a being to which one grants agency cannot be denied mental states. But being able to act and having a mind are not sufficiently independent of each other for one to serve as a justification for ascribing the other. Robots should be denied both.
What is a physical object according to the theory of quantum mechanics? The first answer to be considered is that given by Bohr in terms of the concept of complementarity. This interpretation is illustrated by way of an example, the two slit experiment, which highlights some of the associated problems of ontology. One such problem is the so-called problem of measurement or observation. Various interpretations of measurement in Quantum Theory, including those of Heisenberg, von Neumann, Everett and Bohr, are compared and contrasted. A second problem concerns whether or not QT can be considered complete and therefore satisfactory as a basis for physics. Various attempts to complete QT by means of the addition of ‘hidden variables’ to the quantum mechanical state function are considered and their aims and achievements assessed. Finally, we investigate some of the characteristic ontological problems for the orthodox interpretation of Relativistic Quantum Theory.
This paper generalises the classical Condorcet jury theorem from majority voting over two options to plurality voting over multiple options. The paper further discusses the debate between epistemic and procedural democracy and situates its formal results in that debate. The paper finally compares a number of different social choice procedures for many-option choices in terms of their epistemic merits. An appendix explores the implications of some of the present mathematical results for the question of how probable majority cycles (as in Condorcet's paradox) are in large electorates.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
Psychopathic individuals display a callous-coldhearted approach to interpersonal and affective situations and engage in impulsive and antisocial behaviors. Despite early conceptualizations suggesting that psychopathy is related to enhanced cognitive functioning, research examining executive functioning (EF) in psychopathy has yielded few such findings. It is possible that some psychopathic trait dimensions are more related to EF than others. Research using a 2-factor or 4-facet model of psychopathy highlights some dimension-specific differences in EF, but this research is limited in scope. Another complicating factor in teasing apart the EF–psychopathy relationship is the tendency to use different psychopathy assessments for incarcerated versus community samples. In this study, an EF battery and multiple measures of psychopathic dimensions were administered to a sample of male prisoners.
Exploratory factor analysis (EFA) of the Psychopathic Personality Inventory (PPI; S. O. Lilienfeld, 1990; S. O. Lilienfeld & B. P. Andrews, 1996) with a community sample has suggested that the PPI subscales may comprise 2 higher order factors (S. D. Benning, C. J. Patrick, B. M. Hicks, D. M. Blonigen, & R. F. Krueger, 2003). However, substantive and structural evidence raises concerns about the viability of this 2-factor model, particularly in offender populations. The authors attempted to replicate the S. D. Benning et al. 2-factor solution using a large (N = 1,224) incarcerated male sample. Confirmatory factor analysis of this model resulted in poor model fit. Similarly, using the same EFA procedures as did S. D. Benning et al., the authors found little evidence for a 2-factor model. When they followed the recommendations of J.-W. van Prooijen and W. A. van der Kloot (2001) for recovering EFA solutions, model fit results provided some evidence that a 3-factor EFA solution could be recovered via confirmatory factor analysis.
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that: whether a premise is justified depends on the notion of probability considered; none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
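The theorem's optimistic conclusion is easy to exhibit numerically: under the competence and independence premises, the probability that a simple majority is correct rises toward 1 as the electorate grows. A sketch (the competence value p = 0.55 is an assumed example, not taken from the paper):

```python
from math import comb

def majority_correct(n, p):
    """Probability that a simple majority of n independent voters,
    each correct with probability p, picks the right one of two
    options (n odd) -- the classical jury-theorem setting."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# The probability climbs with the size of the electorate:
for n in (1, 11, 101, 1001):
    print(n, round(majority_correct(n, 0.55), 4))
```

The paper's point is that this conclusion stands or falls with how the probability p and the independence assumption are interpreted, not with the arithmetic itself.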
Why have human beings throughout the millennia so often thought about a doomsday? Could believing that doomsday is nearing yield a profit for our inner equilibrium of pleasure and pain, an idea suggested by Sigmund Freud? An analogous instinctive dynamic was envisaged by Nietzsche, who wrote that human beings prefer to will nothingness rather than not to will at all. In this essay, 'Melancholia', a movie by Lars von Trier, is taken as an exquisite masterpiece, a grandiose exposition of the philosophies of Schopenhauer and Nietzsche.
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously “Arrow’s Theorem”. Once the analogy between amalgamating evidence and amalgamating preferences is tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
The National Library of Finland and the Von Wright and Wittgenstein Archives at the University of Helsinki keep the collected correspondence of Georg Henrik von Wright, Wittgenstein’s friend and successor at Cambridge and one of the three literary executors of Wittgenstein’s Nachlass. Among von Wright’s correspondence partners, Elizabeth Anscombe and Rush Rhees are of special interest to Wittgenstein scholars as the two other trustees of the Wittgenstein papers. Thus, von Wright’s collections held in Finland promise to shed light on the context of decades of editorial work that made Wittgenstein’s later philosophy available to all interested readers. In this text, we present the letters which von Wright received from Anscombe and Rhees during the first nine months after Wittgenstein’s death. This correspondence provides a vivid picture of the literary executors as persons and of their developing relationships. The presented letters are beautiful examples of what the correspondence as a whole has to offer; it depicts – besides facts of editing – the story of three philosophers, whose conversing voices unfold the human aspects of inheriting Wittgenstein’s Nachlass. Their story does not only deal with editing the papers of an eminent philosopher, but with the attempt to do justice to the man they knew, to his philosophy and to his wishes for publication.
It is sometimes alleged that the study of emotion and the study of value are currently pursued as relatively autonomous disciplines. As Kevin Mulligan notes, “the philosophy and psychology of emotions pays little attention to the philosophy of value and the latter pays only a little more attention to the former.” (2010b, 475). Arguably, the last decade has seen more of a rapprochement between these two domains than used to be the norm (cf. e.g. Roeser & Todd 2014). But there still seems to be considerable potential for exchange and dialogue if the situation is compared with their intimate relationship in central strands of early realist phenomenology. The philosopher perhaps most representative of this ecumenical approach is Husserl’s early student Dietrich von Hildebrand (1889-1977). From the very early stages of his philosophical career, Hildebrand developed one of the most original, comprehensive and nuanced accounts of emotions, at whose core is a detailed examination of their connection to value. While his central concern with the ethical significance of our affective life is in many ways continuous with Scheler’s work and draws crucially on Reinach’s philosophy of mind, Hildebrand’s own reflections considerably expand on and substantially modify the picture of the ontology and normative role of emotions defended by these authors. In this article, I reconstruct Hildebrand’s view of emotions with a particular focus on those aspects which represent his most distinctive contribution to this subject.
Sebastian Franck translated and commented on parts of Agrippa's De Vanitate Scientiarum, so Agrippa's philosophical influence on Franck is well known. There have, however, been no detailed studies of that influence; this essay attempts to close the gap. A comparison of the metaphysical systems of Franck and Agrippa reveals significant influences in the doctrine of the soul and in Christology. Both Agrippa and Franck adhere to the Platonic doctrine of the three parts of the soul: on this doctrine the human being consists of spirit, soul, and body, the spirit being the immortal, divine part in man. The main difference between Agrippa and Franck is that Agrippa's syncretism places greater emphasis on Neoplatonism, while Franck's emphasizes Gnosticism. The world-soul makes little sense within Franck's conception, whereas for Agrippa it is absolutely central. Franck rejects the idea that the world is guided by a reason; his pessimistic picture of the world and of man is incompatible with it. Since there is no such reason, astrology, divination, and Kabbalah are likewise unimportant for Franck. It is therefore not surprising that Franck nowhere cites the Occulta philosophia, but only De Vanitate Scientiarum. Agrippa's syncretism encompasses astrology, divination, Kabbalah, and Neoplatonism; in Franck, at most Hermeticism still plays a role. Franck holds that Christ communicated with Hermes Trismegistus in the Pimander, a thought found not only in Franck but also in Agrippa. The doctrine of the inner Christ, so important for Franck, is likewise part of Agrippa's doctrine of the soul: when man comes to know his spirit, or his inner Christ, he comes to know God. The study shows that Franck took over from the "wise Agrippa" only those things that fit his own conception.
The standard representation theorem for expected utility theory tells us that if a subject’s preferences conform to certain axioms, then she can be represented as maximising her expected utility given a particular set of credences and utilities—and, moreover, that having those credences and utilities is the only way that she could be maximising her expected utility. However, the kinds of agents these theorems seem apt to tell us anything about are highly idealised, being always probabilistically coherent with infinitely precise degrees of belief and full knowledge of all a priori truths. Ordinary subjects do not look very rational when compared to the kinds of agents usually talked about in decision theory. In this paper, I will develop an expected utility representation theorem aimed at the representation of those who are neither probabilistically coherent, logically omniscient, nor expected utility maximisers across the board—that is, agents who are frequently irrational. The agents in question may be deductively fallible, have incoherent credences, limited representational capacities, and fail to maximise expected utility for all but a limited class of gambles.
Through his influence on the development of physiology, physics, and geometry, Hermann von Helmholtz is representative of natural science in Germany in the second half of the nineteenth century like scarcely any other scientist. No less representative is the development of his conception of science. Whereas until the late 1860s he advanced an emphatic claim to scientific truth, in the subsequent period he began to subject the conditions of validity of scientific knowledge to a relativization that may be summed up as hypothetization. Already in the nineteenth century, Helmholtz thereby developed the beginnings of a conception of science that point to a surprising extent toward modernity. How close he came to later conceptions of science can be illustrated by a comparison with Karl R. Popper's logic of scientific discovery. In Popper's Forschungslogik the hypothetization of scientific knowledge has advanced decidedly further than in Helmholtz's theory of science: what only vaguely begins to take shape in Helmholtz has in Popper become an explicitly formulated programme. Although Helmholtz and Popper stand in no direct line of development in the philosophy of science, and Popper refers to Helmholtz only rarely and in passing in his writings, there are nevertheless surprising and hitherto unnoticed points of contact, which emerge especially when Helmholtz's conception of science is viewed against the background of Popper's logic of scientific discovery.
I argue that Composition as Identity blocks the plural version of Cantor's Theorem, and that therefore the plural version of Cantor's Theorem can no longer be uncritically appealed to. As an example, I show how this result blocks a recent argument by Hawthorne and Uzquiano.
This paper shows that the common opinion according to which it was Heidegger who introduced Jacob von Uexküll into the philosophical debate is incorrect: it was Scheler, two decades earlier, who discovered and brought out the philosophical import of Uexküll. Likewise, the distinction between world (Welt) and environment (Umwelt), like that between openness to the world and environmental closure, was not introduced by Heidegger in 1929 (cf. Marco Mazzeo's introduction to Uexküll's Ambienti animali e ambienti umani, p. 18 ff.), but is already present in Scheler's writings of the period 1909-1913.
The Four-Colour Theorem (4CT) proof, presented to the mathematical community in a pair of papers by Appel and Haken in the late 1970s, provoked a series of philosophical debates. Many conceptual points of these disputes still require some elucidation. After a brief presentation of the main ideas of Appel and Haken’s procedure for the proof and a reconstruction of Thomas Tymoczko’s argument for the novelty of 4CT’s proof, we shall formulate some questions regarding the connections between the points raised by Tymoczko and some Wittgensteinian topics in the philosophy of mathematics, such as the importance of surveyability as a criterion for distinguishing mathematical proofs from empirical experiments. Our aim is to show that the “characteristic Wittgensteinian invention” (Mühlhölzer 2006) – the strong distinction between proofs and experiments – can shed some light on the conceptual confusions surrounding the Four-Colour Theorem.
In searching for the origins of current conceptions of science in the history of physics, one encounters a remarkable phenomenon. A typical view today is that theoretical knowledge-claims have only relativized validity. Historically, however, this thesis was supported by proponents of a conception of nature that today is far from typical, a mechanistic conception within which natural phenomena were to be explained by the action of mechanically moved matter. Two of these proponents, Hermann von Helmholtz and his pupil Heinrich Hertz, contributed significantly to the modernization of the conception of science. Paradigmatic for their common contribution to this development is the way in which they employed the concept of image. By considering the origin and the different meanings of this concept we may trace a line of development which begins with Helmholtz's original claim that a universally and forever valid theory provides a unique representation of nature. It continues with the realization that the status of scientific knowledge is capable of revision; and it arrives at Hertz's admission that a variety of theories over a domain of objects is possible, at least at times.
The complex world of thought and sensitivity in the sphere of contemporary art has entailed the revision and exclusion of disciplines aimed at providing a model to explain and conceptualize reality. Art history, as one such discipline, has had many of its contributions questioned, from Gombrich’s epistemological reformulation to the postmodern discourses, which extol the death of the author, the post-structuralist idea of tradition as a textual phenomenon, and the declaration of the death of history as a consequence of the hybridization of disciplines and of other branches of human knowledge. Nevertheless, it can be demonstrated that proposals such as those by Julius von Schlosser and Giulio Carlo Argan enclose reflections and methodological aspects which can help us face the task of understanding and visualizing the mediating role of historians in the culture of sensitivity, and the art modulations that have resulted from the blows of history and that, in turn, have shaped both art and art history into what they are or can be to us today.
In the following we will investigate whether von Mises’ frequency interpretation of probability can be modified to make it philosophically acceptable. We will reject certain elements of von Mises’ theory, but retain others. In the interpretation we propose we do not use von Mises’ often criticized ‘infinite collectives’, but we retain two essential claims of his interpretation: that probability can only be defined for events that can be repeated in similar conditions, and that exhibit frequency stabilization. The central idea of the present article is that the mentioned ‘conditions’ should be well-defined and ‘partitioned’. More precisely, we will divide probabilistic systems into object, initializing, and probing subsystems, and show that such partitioning allows one to solve problems. Moreover we will argue that a key idea of the Copenhagen interpretation of quantum mechanics (the determinant role of the observing system) can be seen as deriving from an analytic definition of probability as frequency. Thus a secondary aim of the article is to illustrate the virtues of analytic definition of concepts, consisting of making explicit what is implicit.
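The frequency stabilization the abstract appeals to can be illustrated with a toy simulation: as an experiment is repeated under similar conditions, the relative frequency of an outcome settles near a fixed value. The success probability 0.3 and the trial counts below are assumed example values:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def relative_frequency(p, n):
    """Relative frequency of 'success' in n repetitions of an
    experiment performed under similar conditions, where each
    repetition succeeds with probability p."""
    return sum(random.random() < p for _ in range(n)) / n

# Frequency stabilization: the relative frequency wanders for small n
# but settles near 0.3 as n grows -- the property von Mises takes as
# the empirical basis for defining probability.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(0.3, n))
```

On the proposal sketched in the abstract, it is the well-defined, partitioned conditions of repetition (object, initializing, and probing subsystems) that make such stabilization a meaningful basis for a probability assignment.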