This paper critically engages Philip Mirowski's essay, "The scientific dimensions of social knowledge and their distant echoes in 20th-century American philosophy of science." It argues that although the Cold War context of anti-democratic elitism, deemed best suited for making decisions about engaging in nuclear war, may seem politically and ideologically motivated, we nevertheless need to consider carefully the arguments underlying the new rational-choice-based political philosophies of the post-WWII era, typified by Arrow's impossibility theorem. A distrust of democratic decision-making principles may be developed by social scientists whose leanings are toward either the left or the right of the spectrum of political practice.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously “Arrow’s Theorem”. Once the analogy between amalgamating evidence and amalgamating preferences is tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
Riker (1982) famously argued that Arrow’s impossibility theorem undermined the logical foundations of “populism”, the view that in a democracy, laws and policies ought to express “the will of the people”. In response, his critics have questioned the use of Arrow’s theorem on the grounds that not all configurations of preferences are likely to occur in practice; the critics allege, in particular, that majority preference cycles, whose possibility the theorem exploits, rarely happen. In this essay, I argue that the critics’ rejoinder to Riker misses the mark even if its factual claim about preferences is correct: Arrow’s theorem and related results threaten the populist’s principle of democratic legitimacy even if majority preference cycles never occur. In this particular context, the assumption of an unrestricted domain is justified irrespective of the preferences citizens are likely to have.
According to conciliatory views about the epistemology of disagreement, when epistemic peers have conflicting doxastic attitudes toward a proposition and fully disclose to one another the reasons for their attitudes toward that proposition (and neither has independent reason to believe the other to be mistaken), each peer should always change his attitude toward that proposition to one that is closer to the attitudes of those peers with which there is disagreement. According to pure higher-order evidence views, higher-order evidence for a proposition always suffices to determine the proper rational response to disagreement about that proposition within a group of epistemic peers. Using an analogue of Arrow's Impossibility Theorem, I shall argue that no conciliatory and pure higher-order evidence view about the epistemology of disagreement can provide a true and general answer to the question of what disagreeing epistemic peers should do after fully disclosing to each other the (first-order) reasons for their conflicting doxastic attitudes.
We argue that a semantics for counterfactual conditionals in terms of comparative overall similarity faces a formal limitation due to Arrow’s impossibility theorem from social choice theory. According to Lewis’s account, the truth-conditions for counterfactual conditionals are given in terms of the comparative overall similarity between possible worlds, which is in turn determined by various aspects of similarity between possible worlds. We argue that a function from aspects of similarity to overall similarity should satisfy certain plausible constraints while Arrow’s impossibility theorem rules out that such a function satisfies all the constraints simultaneously. We argue that a way out of this impasse is to represent aspectual similarity in terms of ranking functions instead of representing it in a purely ordinal fashion. Further, we argue against the claim that the determination of overall similarity by aspects of similarity faces a difficulty in addition to the Arrovian limitation, namely the incommensurability of different aspects of similarity. The phenomena that have been cited as evidence for such incommensurability are best explained by ordinary vagueness.
In this paper, I investigate the relationship between preference and judgment aggregation, using the notion of ranking judgment introduced in List and Pettit. Ranking judgments were introduced in order to state the logical connections between the impossibility theorem of aggregating sets of judgments and Arrow’s theorem. I present a proof of the theorem concerning ranking judgments as a corollary of Arrow’s theorem, extending the translation between preferences and judgments defined in List and Pettit to the conditions on the aggregation procedure.
Suppose that the members of a group each hold a rational set of judgments on some interconnected questions, and imagine that the group itself has to form a collective, rational set of judgments on those questions. How should it go about dealing with this task? We argue that the question raised is subject to a difficulty that has recently been noticed in discussion of the doctrinal paradox in jurisprudence. And we show that there is a general impossibility theorem that that difficulty illustrates. Our paper describes this impossibility result and provides an exploration of its significance. The result naturally invites comparison with Kenneth Arrow's famous theorem (Arrow, 1963 and 1984; Sen, 1970) and we elaborate that comparison in a companion paper (List and Pettit, 2002). The paper is in four sections. The first section documents the need for various groups to aggregate their members' judgments; the second presents the discursive paradox; the third gives an informal statement of the more general impossibility result; the formal proof is presented in an appendix. The fourth section, finally, discusses some escape routes from that impossibility.
The ``doctrinal paradox'' or ``discursive dilemma'' shows that propositionwise majority voting over the judgments held by multiple individuals on some interconnected propositions can lead to inconsistent collective judgments on these propositions. List and Pettit (2002) have proved that this paradox illustrates a more general impossibility theorem showing that there exists no aggregation procedure that generally produces consistent collective judgments and satisfies certain minimal conditions. Although the paradox and the theorem concern the aggregation of judgments rather than preferences, they invite comparison with two established results on the aggregation of preferences: the Condorcet paradox and Arrow's impossibility theorem. We may ask whether the new impossibility theorem is a special case of Arrow's theorem, or whether there are interesting disanalogies between the two results. In this paper, we compare the two theorems, and show that they are not straightforward corollaries of each other. We further suggest that, while the framework of preference aggregation can be mapped into the framework of judgment aggregation, there exists no obvious reverse mapping. Finally, we address one particular minimal condition that is used in both theorems – an independence condition – and suggest that this condition points towards a unifying property underlying both impossibility results.
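The discursive dilemma described in the abstract above is easy to exhibit concretely. The following is a minimal sketch (a hypothetical three-judge profile, not taken from the paper) in which propositionwise majority voting on p, q, and their conjunction yields an inconsistent collective judgment set:

```python
# Discursive dilemma sketch: three judges vote on p, q, and p & q.
# Each judge's row of judgments is individually consistent.

def majority(votes):
    """True iff a strict majority of the boolean votes is True."""
    return sum(votes) > len(votes) / 2

judges = [
    (True,  True,  True),   # judge A: p, q, hence p & q
    (True,  False, False),  # judge B: p, not q, hence not (p & q)
    (False, True,  False),  # judge C: not p, q, hence not (p & q)
]

# Propositionwise majority voting: aggregate each column separately.
p, q, p_and_q = (majority(col) for col in zip(*judges))

print(p, q, p_and_q)         # → True True False
print((p and q) == p_and_q)  # → False: the collective set {p, q, not (p & q)} is inconsistent
```

Each judge is individually consistent, yet the majority accepts p and accepts q while rejecting their conjunction, which is exactly the inconsistency the paradox turns on.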
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow’s and arguably enhances its paradoxical value.
Juries, committees and expert panels commonly appraise things of one kind or another on the basis of grades awarded by several people. When everybody's grading thresholds are known to be the same, the results can sometimes be counted on to reflect the graders’ opinion. Otherwise, they often cannot. Under certain conditions, Arrow's ‘impossibility’ theorem entails that judgements reached by aggregating grades do not reliably track any collective sense of better and worse at all. These claims are made by adapting the Arrow–Sen framework for social choice to study grading in groups.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility, and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. The review is illustrative rather than exhaustive, and is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
Can we design a perfect democratic decision procedure? Condorcet famously observed that majority rule, our paradigmatic democratic procedure, has some desirable properties, but sometimes produces inconsistent outcomes. Revisiting Condorcet’s insights in light of recent work on the aggregation of judgments, I show that there is a conflict between three initially plausible requirements of democracy: “robustness to pluralism”, “basic majoritarianism”, and “collective rationality”. For all but the simplest collective decision problems, no decision procedure meets these three requirements at once; at most two can be met together. This “democratic trilemma” raises the question of which requirement to give up. Since different answers correspond to different views about what matters most in a democracy, the trilemma suggests a map of the “logical space” in which different conceptions of democracy are located. It also sharpens our thinking about other impossibility problems of social choice and how to avoid them, by capturing a core structure many of these problems have in common. More broadly, it raises the idea of “cartography of logical space” in relation to contested political concepts.
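Condorcet's observation that majority rule sometimes produces inconsistent outcomes can be reproduced with the textbook three-voter profile; the sketch below is purely illustrative and assumes nothing specific to the paper's framework:

```python
# Minimal Condorcet cycle: the standard three-voter profile over
# alternatives a, b, c (illustrative; not data from the paper).
profile = [
    ['a', 'b', 'c'],  # voter 1: a > b > c
    ['b', 'c', 'a'],  # voter 2: b > c > a
    ['c', 'a', 'b'],  # voter 3: c > a > b
]

def majority_prefers(x, y):
    """True iff a strict majority of voters rank x above y."""
    return sum(r.index(x) < r.index(y) for r in profile) > len(profile) / 2

# Each pairwise contest has a clear 2-to-1 winner, yet the majority
# relation is cyclic: a beats b, b beats c, and c beats a.
print(majority_prefers('a', 'b'), majority_prefers('b', 'c'), majority_prefers('c', 'a'))
# → True True True
```

Every individual ranking is perfectly rational, but the collective "ranking" produced by pairwise majority voting is intransitive, so no alternative is a stable winner.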
The paper discusses the sense in which the changes undergone by normative economics in the twentieth century can be said to be progressive. A simple criterion is proposed to decide whether a sequence of normative theories is progressive. This criterion is put to use on the historical transition from the new welfare economics to social choice theory. The paper reconstructs this classic case, and eventually concludes that the latter theory was progressive compared with the former. It also briefly comments on the recent developments in normative economics and their connection with the previous two stages. (Published Online April 18 2006) Footnote 1: This paper supersedes an earlier one entitled “Is There Progress in Normative Economics?” (Mongin 2002). I thank the organizers of the Fourth ESHET Conference (Graz 2000) for the opportunity they gave me to lecture on this topic. Thanks are also due to J. Alexander, K. Arrow, A. Bird, R. Bradley, M. Dascal, W. Gaertner, N. Gravel, D. Hausman, B. Hill, C. Howson, N. McClennen, A. Trannoy, J. Weymark, J. Worrall, two anonymous referees of this journal, and especially the editor M. Fleurbaey, for helpful comments. The editor's suggestions contributed to determining the final orientation of the paper. The author is grateful to the LSE and the Lachmann Foundation for their support at the time when he was writing the initial version.
Shows how, as a consequence of the Arrow Impossibility Theorem, objectivity in grading is chimerical, given a sufficiently knowledgeable teacher (of his students, not his subject) in a sufficiently small class.

PDF available from JSTOR only; the permission to post the full version, previously granted by the journal editors and publisher, has expired.

Unpublished reply posted gratis.
What is the relationship between degrees of belief and binary beliefs? Can the latter be expressed as a function of the former—a so-called “belief-binarization rule”—without running into difficulties such as the lottery paradox? We show that this problem can be usefully analyzed from the perspective of judgment-aggregation theory. Although some formal similarities between belief binarization and judgment aggregation have been noted before, the connection between the two problems has not yet been studied in full generality. In this paper, we seek to fill this gap. The paper is organized around a baseline impossibility theorem, which we use to map out the space of possible solutions to the belief-binarization problem. Our theorem shows that, except in limiting cases, there exists no belief-binarization rule satisfying four initially plausible desiderata. Surprisingly, this result is a direct corollary of the judgment-aggregation variant of Arrow’s classic impossibility theorem in social choice theory.
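The lottery paradox mentioned above can be made concrete with a simple threshold ("Lockean") binarization rule; the numbers below are hypothetical and the threshold rule is only one candidate binarization, not the authors' own proposal:

```python
# Lottery-paradox illustration (hypothetical numbers): a threshold rule
# maps credences to beliefs, but the resulting belief set is not closed
# under conjunction.
THRESHOLD = 0.8
n_tickets = 10  # a fair lottery with exactly one winning ticket

cred_ticket_i_loses = 1 - 1 / n_tickets  # credence 0.9 that any given ticket loses
cred_all_lose = 0.0                      # some ticket certainly wins

believe_each_loses = cred_ticket_i_loses >= THRESHOLD  # believed, for every ticket
believe_all_lose = cred_all_lose >= THRESHOLD          # the conjunction is not believed

# Belief in every conjunct without belief in the conjunction:
# deductive closure fails for any threshold below 1.
print(believe_each_loses, believe_all_lose)  # → True False
```

This is the kind of difficulty a belief-binarization rule must negotiate, and it is structurally parallel to the conjunction failures of propositionwise judgment aggregation.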
I propose a relevance-based independence axiom on how to aggregate individual yes/no judgments on given propositions into collective judgments: the collective judgment on a proposition depends only on people’s judgments on propositions which are relevant to that proposition. This axiom contrasts with the classical independence axiom: the collective judgment on a proposition depends only on people’s judgments on the same proposition. I generalize the premise-based rule and the sequential-priority rule to an arbitrary priority order of the propositions, instead of a dichotomous premise/conclusion order or a linear priority order, respectively. I prove four impossibility theorems on relevance-based aggregation. One theorem simultaneously generalizes Arrow’s Theorem (in its general and indifference-free versions) and the well-known Arrow-like theorem in judgment aggregation.
Standard impossibility theorems on judgment aggregation over logically connected propositions either use a controversial systematicity condition or apply only to agendas of propositions with rich logical connections. Are there any serious impossibilities without these restrictions? We prove an impossibility theorem without requiring systematicity that applies to most standard agendas: Every judgment aggregation function (with rational inputs and outputs) satisfying a condition called unbiasedness is dictatorial (or effectively dictatorial if we remove one of the agenda conditions). Our agenda conditions are tight. When applied illustratively to (strict) preference aggregation represented in our model, the result implies that every unbiased social welfare function with universal domain is effectively dictatorial.
Several recent results on the aggregation of judgments over logically connected propositions show that, under certain conditions, dictatorships are the only propositionwise aggregation functions generating fully rational (i.e., complete and consistent) collective judgments. A frequently mentioned route to avoid dictatorships is to allow incomplete collective judgments. We show that this route does not lead very far: we obtain oligarchies rather than dictatorships if instead of full rationality we merely require that collective judgments be deductively closed, arguably a minimal condition of rationality, compatible even with empty judgment sets. We derive several characterizations of oligarchies and provide illustrative applications to Arrowian preference aggregation and Kasher and Rubinstein's group identification problem.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum mechanical uncertainty principle and a proof of monotheism.
In normative political theory, it is widely accepted that democracy cannot be reduced to voting alone, but that it requires deliberation. In formal social choice theory, by contrast, the study of democracy has focused primarily on the aggregation of individual opinions into collective decisions, typically through voting. While the literature on deliberation has an optimistic flavour, the literature on social choice is more mixed. It is centred around several paradoxes and impossibility results identifying conflicts between different intuitively plausible desiderata. In recent years, there has been a growing dialogue between the two literatures. This paper discusses the connections between them. Important insights are that (i) deliberation can complement aggregation and open up an escape route from some of its negative results; and (ii) the formal models of social choice theory can shed light on some aspects of deliberation, such as the nature of deliberation-induced opinion change.
The aim of this article is to introduce the theory of judgment aggregation, a growing interdisciplinary research area. The theory addresses the following question: How can a group of individuals make consistent collective judgments on a given set of propositions on the basis of the group members' individual judgments on them? I begin by explaining the observation that initially sparked the interest in judgment aggregation, the so-called "doctrinal" and "discursive" paradoxes. I then introduce the basic formal model of judgment aggregation, which allows me to present some illustrative variants of a generic impossibility result. I subsequently turn to the question of how this impossibility result can be avoided, going through several possible escape routes. Finally, I relate the theory of judgment aggregation to other branches of aggregation theory. Rather than offering a comprehensive survey of the theory of judgment aggregation, I hope to introduce the theory in a succinct and pedagogical way, providing an illustrative rather than exhaustive coverage of some of its key ideas and results.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism, since there cannot be any entity more complex than the physical universe, and from the Wittgensteinian viewpoint, ‘more complex’ is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a ‘God’ (i.e., a ‘device’ with limitless time/space and energy) cannot determine whether a given ‘number’ is ‘random’, nor find a certain way to show that a given ‘formula’, ‘theorem’, ‘sentence’ or ‘device’ (all these being complex language games) is part of a particular ‘system’.

Those wishing a comprehensive, up-to-date framework for human behavior from the modern two systems view may consult my article “The Logical Structure of Philosophy, Psychology, Mind and Language as Revealed in Wittgenstein and Searle” 59p (2016). For all my articles on Wittgenstein and Searle, see my e-book “The Logical Structure of Philosophy, Psychology, Mind and Language in Wittgenstein and Searle” 367p (2016). Those interested in all my writings in their most recent versions may consult my e-book “Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2016” 662p (2016). All of my papers and books have now been published in revised versions, both as e-books and as printed books: Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B071HVC7YP; The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle - Articles and Reviews 2006-2016 (2017) https://www.amazon.com/dp/B071P1RP1B; Suicidal Utopian Delusions in the 21st Century: Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B0711R5LGX.
Aumann’s theorem states that, under a range of assumptions, individuals cannot rationally agree to disagree. Political liberalism appears to presuppose these assumptions with the idealized conditions of public reason. We argue that Aumann’s theorem shows these assumptions nevertheless cannot be held simultaneously with what is arguably political liberalism’s most central tenet: reasonable pluralism, which implies that we can rationally agree to disagree over conceptions of the good. We finish by elaborating a way of relaxing one of the theorem’s axioms, namely the condition of indexical independence, that arguably lends itself to a coherent account of political liberalism.
Bell inequalities are usually derived by assuming locality and realism, and therefore violations of the Bell-CHSH inequality are usually taken to imply violations of either locality or realism, or both. But, after reviewing an oversight by Bell, in the Corollary below we derive the Bell-CHSH inequality by assuming only that Bob can measure along vectors b and b' simultaneously while Alice measures along either a or a', and likewise that Alice can measure along vectors a and a' simultaneously while Bob measures along either b or b', without assuming locality. The violations of the Bell-CHSH inequality therefore only mean the impossibility of measuring along b and b' simultaneously.
An analogue of Arrow’s theorem has been thought to limit the possibilities for multi-criterial theory choice. Here, an example drawn from Toy Science, a model of theories and choice criteria, suggests that it does not. Arrow’s assumption that domains are unrestricted is inappropriate in connection with theory choice in Toy Science. There are, however, variants of Arrow’s theorem that do not require an unrestricted domain. They require instead that domains are, in a technical sense, ‘rich’. Since there are rich domains in Toy Science, such theorems do constrain theory choice to some extent—certainly in the model and perhaps also in real science.
Time in electromagnetism shares many features with time in other physical theories. But there is one aspect of electromagnetism's relationship with time that has always been controversial, yet has not always attracted the limelight it deserves: the electromagnetic arrow of time. Beginning with a re-analysis of a famous argument between Ritz and Einstein over the origins of the radiation arrow, this chapter frames the debate between modern Einsteinians and neo-Ritzians. It tries to find a clean statement of what the arrow is and then explains how it relates to the cosmological and thermodynamic arrows, representing the most developed and sophisticated attack yet, in either the physics or philosophy literature, on the electromagnetic arrow of time.
This paper applies ideas and tools from social choice theory (such as Arrow's theorem and related results) to linguistics. Specifically, the paper investigates the problem of constraint aggregation in optimality theory from a social-choice-theoretic perspective.
In the theory of judgment aggregation, it is known for which agendas of propositions it is possible to aggregate individual judgments into collective ones in accordance with the Arrow-inspired requirements of universal domain, collective rationality, unanimity preservation, non-dictatorship and propositionwise independence. But it is only partially known (e.g., only in the monotonic case) for which agendas it is possible to respect additional requirements, notably non-oligarchy, anonymity, no individual veto power, or implication preservation. We fully characterize the agendas for which there are such possibilities, thereby answering the most salient open questions about propositionwise judgment aggregation. Our results build on earlier results by Nehring and Puppe (2002), Nehring (2006), Dietrich and List (2007a) and Dokow and Holzman (2010a).
Majority cycling and related social choice paradoxes are often thought to threaten the meaningfulness of democracy. But deliberation can prevent majority cycles – not by inducing unanimity, which is unrealistic, but by bringing preferences closer to single-peakedness. We present the first empirical test of this hypothesis, using data from Deliberative Polls. Comparing preferences before and after deliberation, we find increases in proximity to single-peakedness. The increases are greater for lower versus higher salience issues and for individuals who seem to have deliberated more versus less effectively. They are not merely a byproduct of increased substantive agreement. Our results both refine and support the idea that deliberation, by increasing proximity to single-peakedness, provides an escape from the problem of majority cycling.
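Single-peakedness, the structural condition the study above measures proximity to, can be sketched as follows. This is the textbook formulation on a three-point left-right axis, not the authors' own proximity measure; the axis labels are hypothetical:

```python
# Single-peakedness check on a fixed left-right axis (textbook notion).
AXIS = ['left', 'centre', 'right']

def is_single_peaked(ranking, axis=AXIS):
    """ranking lists alternatives best-to-worst. True iff, moving along the
    axis, the voter's preference rises to a single peak and falls after it."""
    score = {x: -ranking.index(x) for x in axis}  # higher score = more preferred
    peak = axis.index(ranking[0])
    rising = all(score[axis[i]] < score[axis[i + 1]] for i in range(peak))
    falling = all(score[axis[i]] > score[axis[i + 1]] for i in range(peak, len(axis) - 1))
    return rising and falling

# Rankings with a peak and monotone slopes on each side pass the test;
# a ranking like right > left > centre "dips" at the centre and fails.
print(is_single_peaked(['left', 'centre', 'right']))   # → True
print(is_single_peaked(['centre', 'left', 'right']))   # → True
print(is_single_peaked(['right', 'left', 'centre']))   # → False
```

When every voter's ranking is single-peaked on a common axis, pairwise majority voting over an odd number of voters cannot cycle (Black's median voter result), which is why increased proximity to single-peakedness offers an escape from majority cycling.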
A platitude that took hold with Kuhn is that there can be several equally good ways of balancing theoretical virtues for theory choice. Okasha recently modelled theory choice using technical apparatus from the domain of social choice: famously, Arrow showed that no method of social choice can jointly satisfy four desiderata, and each of the desiderata in social choice has an analogue in theory choice. Okasha suggested that one can avoid the Arrow analogue for theory choice by employing a strategy used by Sen in social choice, namely, to enhance the information made available to the choice algorithms. I argue here that, despite Okasha’s claims to the contrary, the information-enhancing strategy is not compelling in the domain of theory choice.
(This is for the Cambridge Handbook of Analytic Philosophy, edited by Marcus Rossberg) In this handbook entry, I survey the different ways in which formal mathematical methods have been applied to philosophical questions throughout the history of analytic philosophy. I consider: formalization in symbolic logic, with examples such as Aquinas’ third way and Anselm’s ontological argument; Bayesian confirmation theory, with examples such as the fine-tuning argument for God and the paradox of the ravens; foundations of mathematics, with examples such as Hilbert’s programme and Gödel’s incompleteness theorems; social choice theory, with examples such as Condorcet’s paradox and Arrow’s theorem; ‘how possibly’ results, with examples such as Condorcet’s jury theorem and recent work on intersectionality theory; and the application of advanced mathematics in philosophy, with examples such as accuracy-first epistemology.
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a ''typical'' choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two (...) sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given by not a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic / thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe. (shrink)
A proof of Fermat’s last theorem is demonstrated. It is very brief, simple, elementary, and absolutely arithmetical. The necessary premises for the proof are only: the three definitive properties of the relation of equality (identity, symmetry, and transitivity), modus tollens, axiom of induction, the proof of Fermat’s last theorem in the case of.
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin's famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good (...) measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental. (shrink)
In this paper we distinguish between various kinds of doxastic theories. One distinction is between informal and formal doxastic theories. AGM-type theories of belief change are of the former kind, while Hintikka’s logic of knowledge and belief is of the latter. Then we distinguish between static theories that study the unchanging beliefs of a certain agent and dynamic theories that investigate not only the constraints that can reasonably be imposed on the doxastic states of a rational agent but also rationality (...) constraints on the changes of doxastic state that may occur in such agents. An additional distinction is that between non-introspective theories and introspective ones. Non-introspective theories investigate agents that have opinions about the external world but no higher-order opinions about their own doxastic states. Standard AGM-type theories as well as the currently existing versions of Segerberg’s dynamic doxastic logic (DDL) are non-introspective. Hintikka-style doxastic logic is of course introspective but it is a static theory. Thus, the challenge remains to devise doxastic theories that are both dynamic and introspective. We outline the semantics for truly introspective dynamic doxastic logic, i.e., a dynamic doxastic logic that allows us to describe agents who have both the ability to form higher-order beliefs and to reflect upon and change their minds about their own (higher-order) beliefs. This extension of DDL demands that we give up the Preservation condition on revision. We make some suggestions as to how such a non-preservative revision operation can be constructed. We also consider extending DDL with conditionals satisfying the Ramsey test and show that Gärdenfors’ well-known impossibility result applies to such a framework. Also in this case, Preservation has to be given up. (shrink)
Agents are often assumed to have degrees of belief (“credences”) and also binary beliefs (“beliefs simpliciter”). How are these related to each other? A much-discussed answer asserts that it is rational to believe a proposition if and only if one has a high enough degree of belief in it. But this answer runs into the “lottery paradox”: the set of believed propositions may violate the key rationality conditions of consistency and deductive closure. In earlier work, we showed that this problem (...) generalizes: there exists no local function from degrees of belief to binary beliefs that satisfies some minimal conditions of rationality and non-triviality. “Locality” means that the binary belief in each proposition depends only on the degree of belief in that proposition, not on the degrees of belief in others. One might think that the impossibility can be avoided by dropping the assumption that binary beliefs are a function of degrees of belief. We prove that, even if we drop the “functionality” restriction, there still exists no local relation between degrees of belief and binary beliefs that satisfies some minimal conditions. Thus functionality is not the source of the impossibility; its source is the condition of locality. If there is any non-trivial relation between degrees of belief and binary beliefs at all, it must be a “holistic” one. We explore several concrete forms this “holistic” relation could take. (shrink)
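The lottery paradox invoked in this abstract can be made concrete with a small sketch of the threshold ("Lockean") rule it discusses; the lottery size and threshold below are illustrative:

```python
n = 4                 # tickets in a fair lottery (illustrative)
threshold = 0.7       # acceptance threshold for binary belief (illustrative)

# Worlds: exactly one ticket wins; uniform credence 1/n per world.
worlds = list(range(n))

def credence(prop):
    """Probability of a proposition, modelled as a set of worlds."""
    return len(prop) / n

# Propositions believed under the threshold rule:
believed = []
for i in range(n):
    loses_i = {w for w in worlds if w != i}   # "ticket i loses", credence 3/4
    if credence(loses_i) >= threshold:
        believed.append(loses_i)
believed.append(set(worlds))                  # "some ticket wins", credence 1

# Consistency check: is there a world where every believed proposition holds?
consistent = any(all(w in p for p in believed) for w in worlds)
print(consistent)  # False: the believed set is jointly inconsistent
```

Each "ticket i loses" clears the threshold, yet their conjunction contradicts "some ticket wins", which is the violation of consistency and deductive closure the abstract generalizes beyond threshold functions.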
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that: whether a premise is justified depends on the notion of probability considered; none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
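The theorem's optimistic conclusion, granted both premises, can be sketched numerically: with independent voters each correct with probability p > 1/2, majority correctness rises toward certainty as the group grows (the competence value below is illustrative):

```python
from math import comb

def majority_correct_prob(n, p):
    """P(a majority of n independent voters is correct),
    each correct with probability p; n assumed odd."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

p = 0.6  # individual competence just above chance (illustrative)
for n in [1, 11, 101]:
    print(n, round(majority_correct_prob(n, p), 4))
```

This is exactly the binomial calculation the two premises license; the abstract's point is that on no single notion of probability are both premises justified at once.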
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or (...) an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory. (shrink)
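Linear pooling, which the abstract characterizes for general agendas, is an event-wise weighted average of the individual assignments. A minimal sketch, using the abstract's own rain/interest-rate agenda (the expert numbers and equal weights are illustrative):

```python
def linear_pool(assignments, weights):
    """Event-wise weighted average of individual probability assignments.

    assignments: list of dicts mapping each agenda event to a probability.
    weights: nonnegative weights summing to one.
    """
    events = assignments[0].keys()
    return {e: sum(w * a[e] for w, a in zip(weights, assignments))
            for e in events}

# A general agenda with two events, not closed under conjunction/disjunction:
expert1 = {"rain": 0.7, "rate_hike": 0.2}
expert2 = {"rain": 0.3, "rate_hike": 0.6}

pooled = linear_pool([expert1, expert2], weights=[0.5, 0.5])
print({e: round(v, 3) for e, v in pooled.items()})  # {'rain': 0.5, 'rate_hike': 0.4}
```

Because the agenda need not be a sigma-algebra, no probability is assigned to "rain or rate hike", which is precisely the relaxation the paper explores.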
In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom. Consequently, this account extends Hartry Field’s program outlined in Science Without Numbers (...) (1980), responds to David Malament’s long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer’s (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about “wave function realism.” In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation). Along the way, I axiomatize the quantum phase structure as what I shall call a “periodic difference structure” and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure. (shrink)
Comparative overall similarity lies at the basis of a lot of recent metaphysics and epistemology. It is a poor foundation. Overall similarity is supposed to be an aggregate of similarities and differences in various respects. But there is no good way of combining them all.
My aim in this paper is to explain what Condorcet’s jury theorem is, and to examine its central assumptions, its significance to the epistemic theory of democracy and its connection with Rousseau’s theory of general will. In the first part of the paper I will analyze an epistemic theory of democracy and explain how its connection with Condorcet’s jury theorem is twofold: the theorem is at the same time a contributing historical source, and the model used by (...) the authors to this day. In the second part I will specify the purposes of the theorem itself, and examine its underlying assumptions. The third part will be about an interpretation of Rousseau’s theory, which is given by Grofman and Feld relying on Condorcet’s jury theorem, and about criticisms of such interpretation. In the fourth, and last, part I will focus on one particular assumption of Condorcet’s theorem, which proves to be especially problematic if we would like to apply the theorem under real-life conditions; namely, the assumption that voters choose between two options only. (shrink)
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raised not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
Apart from his critique of Kant, Maimon’s significance for the history of philosophy lies in his crucial role in the rediscovery of Spinoza by the German Idealists. Specifically, Maimon initiated a change from the common eighteenth-century view of Spinoza as the great ‘atheist’ to the view of Spinoza as an ‘acosmist’, i.e., a thinker who propounded a deep, though unorthodox, religious view denying the reality of the world and taking God to be the only real being. I have discussed this (...) aspect of Maimon’s philosophy in other places, and though the topic of the current paper has an interesting relation to certain doctrines of Spinoza, I will not develop this issue here. Neither of these two issues -- Maimon’s criticism of Kant or his original interpretation of Spinoza -- was considered by Maimon as his main contribution to philosophy. There is little doubt that if Maimon were asked to point out his single most important innovation he would have picked his doctrine of the Principle of Determinability [Satz der Bestimmbarkeit]. Regarding this doctrine Maimon writes: ... [T]he principle of determinability laid down in this work is a principle of all objectively real thought, and consequently of philosophy as a whole too. All the propositions of philosophy can be derived from, and be determined by it [woraus sich alle Sätze herleiten und wodurch sie sich bestimmen lassen]. … I have made available a supreme principle of all objectively real thought, viz., the principle of determinability... and have established as the ground of the whole of pure philosophy -- a principle which, if it is ever grasped, will, I hope, withstand every scrutiny. These claims may strike the reader as somewhat presumptuous, to say the least. But, if we pay attention to the last sentence of the passage, we can see that Maimon doubts whether his great finding will ever be understood. 
It is not unlikely that in this phrase (“wenn er nur einmal eingesehen werden wird”) Maimon was reacting to his own repeatedly unsuccessful attempts to explain the principle. The fate of Maimon’s principle has not been much better in the few works written on Maimon’s philosophy, and though almost all commentators agree that the principle of determinability is the linchpin of the positive philosophy Maimon was trying to develop, we do not yet have a clear explanation of this principle, or of the reason why Maimon assigns such importance to it. Recently, Oded Schechter developed an excellent reading of this principle, and in most aspects my view agrees with his (primarily, in its rejecting the attempt to explain the principle as a version of Leibniz’s predicate-in-subject [Praedicatum inest subjecto] containment thesis). My paper consists of two parts. The first is expository in nature. In this part, I spell out briefly the main aspects of Maimon’s principle of determinability and its aims. In the second part, I examine Maimon’s surprising claim that once we accept the principle of determinability, we have to deny the possibility of two subjects sharing the same predicate. Maimon provides several proofs for this highly counterintuitive claim, and I will try to clarify and evaluate these proofs. (shrink)
In solving judgment aggregation problems, groups often face constraints. Many decision problems can be modelled in terms of the acceptance or rejection of certain propositions in a language, and constraints as propositions that the decisions should be consistent with. For example, court judgments in breach-of-contract cases should be consistent with the constraint that action and obligation are necessary and sufficient for liability; judgments on how to rank several options in an order of preference with the constraint of transitivity; and judgments on (...) budget items with budgetary constraints. Often more or less demanding constraints on decisions are imaginable. For instance, in preference ranking problems, the transitivity constraint is often contrasted with the weaker acyclicity constraint. In this paper, we make constraints explicit in judgment aggregation by relativizing the rationality conditions of consistency and deductive closure to a constraint set, whose variation yields more or less strong notions of rationality. We review several general results on judgment aggregation in light of such constraints. (shrink)
Evolution's Arrow argues that evolution is directional and progressive, and that this has major consequences for humanity. Without resort to teleology, the book demonstrates that evolution moves in the direction of producing cooperative organisations of greater scale and evolvability - evolution has organised molecular processes into cells, cells into organisms, and organisms into societies. The book founds this position on a new theory of the evolution of cooperation. It shows that self-interest at the level of the genes does not prevent (...) cooperation from increasing as evolution unfolds. Evolution progresses by discovering ways to build cooperative organisations out of self-interested individuals. The book also shows that evolution itself has evolved. Evolution has progressively improved the ability of evolutionary mechanisms to discover effective adaptations. And it has produced new and better mechanisms. Evolution's Arrow uses this understanding of the direction of evolution to identify the next great steps in the evolution of life on earth - the steps that humanity must take if we are to continue to be successful in evolutionary terms. A key step for humanity is to increase the scale and evolvability of our societies, eventually forming a unified and cooperative society on the scale of the planet. We must also transform ourselves psychologically to become self-evolving organisms - organisms that are able to escape their biological and cultural past by adapting in whatever directions are necessary to achieve future evolutionary success. (shrink)
In the context of EPR-Bohm type experiments and spin detections confined to spacelike hypersurfaces, a local, deterministic and realistic model within a Friedmann-Robertson-Walker spacetime with a constant spatial curvature (S^3 ) is presented that describes simultaneous measurements of the spins of two fermions emerging in a singlet state from the decay of a spinless boson. Exact agreement with the probabilistic predictions of quantum theory is achieved in the model without data rejection, remote contextuality, superdeterminism or backward causation. A singularity-free Clifford-algebraic (...) representation of S^3 with vanishing spatial curvature and non-vanishing torsion is then employed to transform the model in a more elegant form. Several event-by-event numerical simulations of the model are presented, which confirm our analytical results with the accuracy of 4 parts in 10^4 . Possible implications of our results for practical applications such as quantum security protocols and quantum computing are briefly discussed. (shrink)