This paper critically engages Philip Mirowski's essay, "The scientific dimensions of social knowledge and their distant echoes in 20th-century American philosophy of science." It argues that although the Cold War context of anti-democratic elitism, held best suited for making decisions about engaging in nuclear war, may seem politically and ideologically motivated, we in fact need to consider carefully the arguments underlying the new rational-choice-based political philosophies of the post-WWII era, typified by Arrow's impossibility theorem. A distrust of democratic decision-making principles may be developed by social scientists whose leanings fall on either the left or the right of the spectrum of political practice.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow's theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow's theorem, our main aim is to identify the analogue of Arrow's theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals' preferences into a group preference. The latter faces well-known impossibility theorems, most famously "Arrow's Theorem". Once the analogy between amalgamating evidence and amalgamating preferences is made tight, it is apparent that amalgamating evidence might face a theorem similar to Arrow's. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
Riker (1982) famously argued that Arrow's impossibility theorem undermined the logical foundations of "populism", the view that in a democracy, laws and policies ought to express "the will of the people". In response, his critics have questioned the use of Arrow's theorem on the grounds that not all configurations of preferences are likely to occur in practice; the critics allege, in particular, that majority preference cycles, whose possibility the theorem exploits, rarely happen. In this essay, I argue that the critics' rejoinder to Riker misses the mark even if its factual claim about preferences is correct: Arrow's theorem and related results threaten the populist's principle of democratic legitimacy even if majority preference cycles never occur. In this particular context, the assumption of an unrestricted domain is justified irrespective of the preferences citizens are likely to have.
In this paper, I investigate the relationship between preference and judgment aggregation, using the notion of ranking judgment introduced in List and Pettit. Ranking judgments were introduced in order to state the logical connections between the impossibility theorem of aggregating sets of judgments and Arrow's theorem. I present a proof of the theorem concerning ranking judgments as a corollary of Arrow's theorem, extending the translation between preferences and judgments defined in List and Pettit to the conditions on the aggregation procedure.
We argue that a semantics for counterfactual conditionals in terms of comparative overall similarity faces a formal limitation due to Arrow's impossibility theorem from social choice theory. According to Lewis's account, the truth-conditions for counterfactual conditionals are given in terms of the comparative overall similarity between possible worlds, which is in turn determined by various aspects of similarity between possible worlds. We argue that a function from aspects of similarity to overall similarity should satisfy certain plausible constraints, while Arrow's impossibility theorem rules out that such a function satisfies all the constraints simultaneously. We argue that a way out of this impasse is to represent aspectual similarity in terms of ranking functions instead of representing it in a purely ordinal fashion. Further, we argue against the claim that the determination of overall similarity by aspects of similarity faces a difficulty in addition to the Arrovian limitation, namely the incommensurability of different aspects of similarity. The phenomena that have been cited as evidence for such incommensurability are best explained by ordinary vagueness.
According to conciliatory views about the epistemology of disagreement, when epistemic peers have conflicting doxastic attitudes toward a proposition and fully disclose to one another the reasons for their attitudes toward that proposition (and neither has independent reason to believe the other to be mistaken), each peer should always change his attitude toward that proposition to one that is closer to the attitudes of those peers with which there is disagreement. According to pure higher-order evidence views, higher-order evidence for a proposition always suffices to determine the proper rational response to disagreement about that proposition within a group of epistemic peers. Using an analogue of Arrow's Impossibility Theorem, I shall argue that no conciliatory and pure higher-order evidence view about the epistemology of disagreement can provide a true and general answer to the question of what disagreeing epistemic peers should do after fully disclosing to each other the (first-order) reasons for their conflicting doxastic attitudes.
Suppose that the members of a group each hold a rational set of judgments on some interconnected questions, and imagine that the group itself has to form a collective, rational set of judgments on those questions. How should it go about dealing with this task? We argue that the question raised is subject to a difficulty that has recently been noticed in discussion of the doctrinal paradox in jurisprudence. And we show that there is a general impossibility theorem that that difficulty illustrates. Our paper describes this impossibility result and provides an exploration of its significance. The result naturally invites comparison with Kenneth Arrow's famous theorem (Arrow, 1963 and 1984; Sen, 1970), and we elaborate that comparison in a companion paper (List and Pettit, 2002). The paper is in four sections. The first section documents the need for various groups to aggregate their members' judgments; the second presents the discursive paradox; the third gives an informal statement of the more general impossibility result, the formal proof of which is presented in an appendix. The fourth section, finally, discusses some escape routes from that impossibility.
The ``doctrinal paradox'' or ``discursive dilemma'' shows that propositionwise majority voting over the judgments held by multiple individuals on some interconnected propositions can lead to inconsistent collective judgments on these propositions. List and Pettit (2002) have proved that this paradox illustrates a more general impossibility theorem showing that there exists no aggregation procedure that generally produces consistent collective judgments and satisfies certain minimal conditions. Although the paradox and the theorem concern the aggregation of judgments rather than preferences, they invite comparison with two established results on the aggregation of preferences: the Condorcet paradox and Arrow's impossibility theorem. We may ask whether the new impossibility theorem is a special case of Arrow's theorem, or whether there are interesting disanalogies between the two results. In this paper, we compare the two theorems, and show that they are not straightforward corollaries of each other. We further suggest that, while the framework of preference aggregation can be mapped into the framework of judgment aggregation, there exists no obvious reverse mapping. Finally, we address one particular minimal condition that is used in both theorems – an independence condition – and suggest that this condition points towards a unifying property underlying both impossibility results.
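The propositionwise inconsistency that the discursive dilemma describes can be made concrete in a few lines. A minimal, hypothetical sketch (the three judges and the propositions p, q, and their conjunction are invented for illustration): each individual judgment set is internally consistent, yet the propositionwise majorities are jointly inconsistent.

```python
# Three judges, each with a logically consistent judgment set on
# p, q, and the conjunction (p and q).
judges = [
    {"p": True,  "q": True,  "p_and_q": True},
    {"p": True,  "q": False, "p_and_q": False},
    {"p": False, "q": True,  "p_and_q": False},
]

def majority(prop):
    """True iff a strict majority of judges accepts the proposition."""
    yes = sum(judge[prop] for judge in judges)
    return yes > len(judges) / 2

collective = {prop: majority(prop) for prop in ["p", "q", "p_and_q"]}
print(collective)  # {'p': True, 'q': True, 'p_and_q': False}

# The collective accepts p and accepts q, but rejects their conjunction:
consistent = collective["p_and_q"] == (collective["p"] and collective["q"])
print("collectively consistent?", consistent)  # False
```

Each row above passes the consistency check individually; only the aggregate fails it, which is exactly the dilemma the entries above analyze.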
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow's and arguably enhances its paradoxical value.
Juries, committees and expert panels commonly appraise things of one kind or another on the basis of grades awarded by several people. When everybody's grading thresholds are known to be the same, the results can sometimes be counted on to reflect the graders' opinion. Otherwise, they often cannot. Under certain conditions, Arrow's 'impossibility' theorem entails that judgements reached by aggregating grades do not reliably track any collective sense of better and worse at all. These claims are made by adapting the Arrow–Sen framework for social choice to study grading in groups.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility, and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. The review, illustrative rather than exhaustive, is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
Can we design a perfect democratic decision procedure? Condorcet famously observed that majority rule, our paradigmatic democratic procedure, has some desirable properties, but sometimes produces inconsistent outcomes. Revisiting Condorcet's insights in light of recent work on the aggregation of judgments, I show that there is a conflict between three initially plausible requirements of democracy: "robustness to pluralism", "basic majoritarianism", and "collective rationality". For all but the simplest collective decision problems, no decision procedure meets these three requirements at once; at most two can be met together. This "democratic trilemma" raises the question of which requirement to give up. Since different answers correspond to different views about what matters most in a democracy, the trilemma suggests a map of the "logical space" in which different conceptions of democracy are located. It also sharpens our thinking about other impossibility problems of social choice and how to avoid them, by capturing a core structure many of these problems have in common. More broadly, it raises the idea of "cartography of logical space" in relation to contested political concepts.
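The inconsistent outcomes Condorcet observed are easy to reproduce. A minimal sketch with three hypothetical voters over three alternatives (the ballots are invented for illustration): pairwise majority voting yields a cycle, so no alternative is a collectively rational "best" choice.

```python
# Each ballot lists alternatives from most to least preferred.
ballots = [
    ["A", "B", "C"],  # voter 1: A > B > C
    ["B", "C", "A"],  # voter 2: B > C > A
    ["C", "A", "B"],  # voter 3: C > A > B
]

def majority_prefers(x, y):
    """True iff a strict majority of ballots ranks x above y."""
    wins = sum(ballot.index(x) < ballot.index(y) for ballot in ballots)
    return wins > len(ballots) / 2

print(majority_prefers("A", "B"))  # True
print(majority_prefers("B", "C"))  # True
print(majority_prefers("C", "A"))  # True: A > B > C > A, a majority cycle
```

Each pairwise contest is decided 2–1, yet the three verdicts together violate transitivity, which is the clash between "basic majoritarianism" and "collective rationality" that the trilemma above generalizes.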
The paper discusses the sense in which the changes undergone by normative economics in the twentieth century can be said to be progressive. A simple criterion is proposed to decide whether a sequence of normative theories is progressive. This criterion is put to use on the historical transition from the new welfare economics to social choice theory. The paper reconstructs this classic case, and eventually concludes that the latter theory was progressive compared with the former. It also briefly comments on the recent developments in normative economics and their connection with the previous two stages. (Published Online April 18 2006) Footnotes: 1. This paper supersedes an earlier one entitled "Is There Progress in Normative Economics?" (Mongin 2002). I thank the organizers of the Fourth ESHET Conference (Graz 2000) for the opportunity they gave me to lecture on this topic. Thanks are also due to J. Alexander, K. Arrow, A. Bird, R. Bradley, M. Dascal, W. Gaertner, N. Gravel, D. Hausman, B. Hill, C. Howson, N. McClennen, A. Trannoy, J. Weymark, J. Worrall, two anonymous referees of this journal, and especially the editor M. Fleurbaey, for helpful comments. The editor's suggestions contributed to determining the final orientation of the paper. The author is grateful to the LSE and the Lachmann Foundation for their support at the time when he was writing the initial version.
Arrhenius’s impossibility theorems purport to demonstrate that no population axiology can satisfy each of a small number of intuitively compelling adequacy conditions. However, it has recently been pointed out that each theorem depends on a dubious assumption: Finite Fine-Grainedness. This assumption states that there exists a finite sequence of slight welfare differences between any two welfare levels. Denying Finite Fine-Grainedness makes room for a lexical population axiology which satisfies all of the compelling adequacy conditions in each theorem. Therefore, Arrhenius’s theorems fail to prove that there is no satisfactory population axiology. In this paper, I argue that Arrhenius’s theorems can be repurposed. Since all of our population-affecting actions have a non-zero probability of bringing about more than one distinct population, it is population prospect axiologies that are of practical relevance, and amended versions of Arrhenius’s theorems demonstrate that there is no satisfactory population prospect axiology. These impossibility theorems do not depend on Finite Fine-Grainedness, so lexical views do not escape them.
Shows how, as a consequence of the Arrow Impossibility Theorem, objectivity in grading is chimerical, given a sufficiently knowledgeable teacher (of his students, not his subject) in a sufficiently small class. PDF available from JSTOR only; permission to post the full version, previously granted by the journal editors and publisher, has expired. Unpublished reply posted gratis.
This thesis argues for the fundamental importance of the opposition between holistic and reductionistic world-views in economics. Both reductionism and holism may nevertheless underpin laissez-faire policy prescriptions. Scrutiny of the nature of the articulation between micro and macro levels in the writings of economists suggests that invisible hand theories play a key role in reconciling reductionist policy prescriptions with a holistic world. An examination of the prisoners' dilemma in game theory and Arrow's impossibility theorem in social choice theory sets the scene. The prisoners' dilemma epitomises the collective irrationality coordination problems lead to. The source of the dilemma is identified as the combination of interdependence in content and independence in form of the decision making process. Arrovian impossibility has been perceived as challenging traditional views of the relationship between micro and macro levels in economics. Conservative arguments against the possibility in principle of a social welfare function are criticised here as depending on an illicit dualism. The thesis then reviews the standpoints of Smith, Hayek and Keynes. For Smith, the social desirability of individual self-seeking activity is ensured by the 'invisible hand' of a god who has moulded us so to behave, that the quantity of happiness in the world is always maximised. Hayek seeks to re-establish the invisible hand in a secular age, replacing the agency of a deity with an evolutionary mechanism. Hayek's evolutionary theory, criticised here as being based on the exploded notion of group selection, cannot underpin the desirability of spontaneous outcomes. I conclude by arguing that Keynes shares the holistic approach of Smith and Hayek, but without their reliance on invisible hand mechanisms.
If spontaneous processes cannot be relied upon to generate desirable social outcomes then we have to take responsibility for achieving this ourselves by establishing the appropriate institutional framework to eliminate macroeconomic prisoners' dilemmas.
What is the relationship between degrees of belief and binary beliefs? Can the latter be expressed as a function of the former—a so-called “belief-binarization rule”—without running into difficulties such as the lottery paradox? We show that this problem can be usefully analyzed from the perspective of judgment-aggregation theory. Although some formal similarities between belief binarization and judgment aggregation have been noted before, the connection between the two problems has not yet been studied in full generality. In this paper, we seek to fill this gap. The paper is organized around a baseline impossibility theorem, which we use to map out the space of possible solutions to the belief-binarization problem. Our theorem shows that, except in limiting cases, there exists no belief-binarization rule satisfying four initially plausible desiderata. Surprisingly, this result is a direct corollary of the judgment-aggregation variant of Arrow’s classic impossibility theorem in social choice theory.
I propose a relevance-based independence axiom on how to aggregate individual yes/no judgments on given propositions into collective judgments: the collective judgment on a proposition depends only on people’s judgments on propositions which are relevant to that proposition. This axiom contrasts with the classical independence axiom: the collective judgment on a proposition depends only on people’s judgments on the same proposition. I generalize the premise-based rule and the sequential-priority rule to an arbitrary priority order of the propositions, instead of a dichotomous premise/conclusion order or a linear priority order, respectively. I prove four impossibility theorems on relevance-based aggregation. One theorem simultaneously generalizes Arrow’s Theorem (in its general and indifference-free versions) and the well-known Arrow-like theorem in judgment aggregation.
Standard impossibility theorems on judgment aggregation over logically connected propositions either use a controversial systematicity condition or apply only to agendas of propositions with rich logical connections. Are there any serious impossibilities without these restrictions? We prove an impossibility theorem without requiring systematicity that applies to most standard agendas: Every judgment aggregation function (with rational inputs and outputs) satisfying a condition called unbiasedness is dictatorial (or effectively dictatorial if we remove one of the agenda conditions). Our agenda conditions are tight. When applied illustratively to (strict) preference aggregation represented in our model, the result implies that every unbiased social welfare function with universal domain is effectively dictatorial.
Several recent results on the aggregation of judgments over logically connected propositions show that, under certain conditions, dictatorships are the only propositionwise aggregation functions generating fully rational (i.e., complete and consistent) collective judgments. A frequently mentioned route to avoid dictatorships is to allow incomplete collective judgments. We show that this route does not lead very far: we obtain oligarchies rather than dictatorships if instead of full rationality we merely require that collective judgments be deductively closed, arguably a minimal condition of rationality, compatible even with empty judgment sets. We derive several characterizations of oligarchies and provide illustrative applications to Arrowian preference aggregation and Kasher and Rubinstein's group identification problem.
In normative political theory, it is widely accepted that democracy cannot be reduced to voting alone, but that it requires deliberation. In formal social choice theory, by contrast, the study of democracy has focused primarily on the aggregation of individual opinions into collective decisions, typically through voting. While the literature on deliberation has an optimistic flavour, the literature on social choice is more mixed. It is centred around several paradoxes and impossibility results identifying conflicts between different intuitively plausible desiderata. In recent years, there has been a growing dialogue between the two literatures. This paper discusses the connections between them. Important insights are that (i) deliberation can complement aggregation and open up an escape route from some of its negative results; and (ii) the formal models of social choice theory can shed light on some aspects of deliberation, such as the nature of deliberation-induced opinion change.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism.
The aim of this article is to introduce the theory of judgment aggregation, a growing interdisciplinary research area. The theory addresses the following question: How can a group of individuals make consistent collective judgments on a given set of propositions on the basis of the group members' individual judgments on them? I begin by explaining the observation that initially sparked the interest in judgment aggregation, the so-called "doctrinal" and "discursive" paradoxes. I then introduce the basic formal model of judgment aggregation, which allows me to present some illustrative variants of a generic impossibility result. I subsequently turn to the question of how this impossibility result can be avoided, going through several possible escape routes. Finally, I relate the theory of judgment aggregation to other branches of aggregation theory. Rather than offering a comprehensive survey of the theory of judgment aggregation, I hope to introduce the theory in a succinct and pedagogical way, providing an illustrative rather than exhaustive coverage of some of its key ideas and results.
Bell inequalities are usually derived by assuming locality and realism, and therefore violations of the Bell-CHSH inequality are usually taken to imply violations of either locality or realism, or both. But, after reviewing an oversight by Bell, in the Corollary below we derive the Bell-CHSH inequality by assuming only that Bob can measure along vectors b and b' simultaneously while Alice measures along either a or a', and likewise Alice can measure along vectors a and a' simultaneously while Bob measures along either b or b', without assuming locality. The violations of the Bell-CHSH inequality therefore only mean impossibility of measuring along b and b' simultaneously.
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008; see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism, since there cannot be any entity more complex than the physical universe, and from the Wittgensteinian viewpoint, 'more complex' is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a 'God' (i.e., a 'device' with limitless time/space and energy) cannot determine whether a given 'number' is 'random', nor can it find a certain way to show that a given 'formula', 'theorem', 'sentence' or 'device' (all these being complex language games) is part of a particular 'system'.
Those wishing a comprehensive up-to-date framework for human behavior from the modern two systems view may consult my article "The Logical Structure of Philosophy, Psychology, Mind and Language as Revealed in Wittgenstein and Searle" 59p (2016). For all my articles on Wittgenstein and Searle see my e-book "The Logical Structure of Philosophy, Psychology, Mind and Language in Wittgenstein and Searle" 367p (2016). Those interested in all my writings in their most recent versions may consult my e-book "Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2016" 662p (2016). All of my papers and books have now been published in revised versions both in ebooks and in printed books: Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B071HVC7YP; The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle - Articles and Reviews 2006-2016 (2017) https://www.amazon.com/dp/B071P1RP1B; Suicidal Utopian Delusions in the 21st century: Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2017 (2017) https://www.amazon.com/dp/B0711R5LGX.
Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 in order to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens the account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made throughout the years, but none of them were successful. We have shown (in a number of publications) by a general state-space argument that Maxwell’s Demon is compatible with classical mechanics, and that the most recent solutions, based on Landauer’s thesis, are not general. In this paper we demonstrate that Maxwell’s Demon is also compatible with quantum mechanics. We do so by analyzing a particular (but highly idealized) experimental setup and proving that it violates the Second Law. Our discussion is in the framework of standard quantum mechanics; we give two separate arguments in the framework of quantum mechanics with and without the projection postulate. We address in our analysis the connection between measurement and erasure interactions and we show how these notions are applicable in the microscopic quantum mechanical structure. We discuss what might be the quantum mechanical counterpart of the classical notion of “macrostates”, thus explaining why our Quantum Demon setup works not only at the micro level but also at the macro level, properly understood. One implication of our analysis is that the Second Law cannot provide a universal lawlike basis for an account of the arrow of time; this account has to be sought elsewhere.
Aumann’s theorem states that, under a range of assumptions, rational agents cannot agree to disagree. Political liberalism appears to presuppose these assumptions with the idealized conditions of public reason. We argue that Aumann’s theorem demonstrates they nevertheless cannot be simultaneously held with what is arguably political liberalism’s most central tenet: the tenet of reasonable pluralism, which implies we can rationally agree to disagree over conceptions of the good. We finish by elaborating a way of relaxing one of the theorem’s axioms, namely the condition of indexical independence, that arguably lends itself to a coherent account of political liberalism.
An analogue of Arrow’s theorem has been thought to limit the possibilities for multi-criterial theory choice. Here, an example drawn from Toy Science, a model of theories and choice criteria, suggests that it does not. Arrow’s assumption that domains are unrestricted is inappropriate in connection with theory choice in Toy Science. There are, however, variants of Arrow’s theorem that do not require an unrestricted domain. They require instead that domains are, in a technical sense, ‘rich’. Since there are rich domains in Toy Science, such theorems do constrain theory choice to some extent—certainly in the model and perhaps also in real science.
Majority cycling and related social choice paradoxes are often thought to threaten the meaningfulness of democracy. But deliberation can prevent majority cycles – not by inducing unanimity, which is unrealistic, but by bringing preferences closer to single-peakedness. We present the first empirical test of this hypothesis, using data from Deliberative Polls. Comparing preferences before and after deliberation, we find increases in proximity to single-peakedness. The increases are greater for lower versus higher salience issues and for individuals who seem to have deliberated more versus less effectively. They are not merely a byproduct of increased substantive agreement. Our results both refine and support the idea that deliberation, by increasing proximity to single-peakedness, provides an escape from the problem of majority cycling.
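The contrast between a cycling profile and a single-peaked one can be made concrete in a few lines of Python. This is a minimal illustrative sketch (the profiles and function names are my own, not drawn from the paper): pairwise majority voting over the classic three-voter rotated profile produces a cycle, while a profile that is single-peaked on the axis A–B–C yields a Condorcet winner.

```python
def condorcet_cycle(profile):
    """Check whether pairwise majority voting over a profile of strict
    rankings (lists, best first) cycles among alternatives A, B, C."""
    def majority_prefers(x, y):
        wins = sum(1 for r in profile if r.index(x) < r.index(y))
        return wins > len(profile) - wins
    # With three alternatives, a cycle must run A>B>C>A or A>C>B>A.
    return (majority_prefers("A", "B") and majority_prefers("B", "C")
            and majority_prefers("C", "A")) or \
           (majority_prefers("B", "A") and majority_prefers("C", "B")
            and majority_prefers("A", "C"))

# The classic cycling profile: three voters with rotated rankings.
cycling = [["A", "B", "C"], ["B", "C", "A"], ["C", "A", "B"]]

# A profile single-peaked on the axis A < B < C: each voter's ranking
# falls off monotonically on both sides of their peak. B wins outright.
single_peaked = [["A", "B", "C"], ["B", "A", "C"], ["B", "C", "A"]]

print(condorcet_cycle(cycling))        # True: A beats B, B beats C, C beats A
print(condorcet_cycle(single_peaked))  # False: B is a Condorcet winner
```

Black's classic result that single-peaked preferences guarantee a Condorcet winner (for an odd number of voters) is what makes "proximity to single-peakedness" the relevant quantity in the abstract above.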
Population axiology is the study of the conditions under which one state of affairs is better than another, when the states of affairs in question may differ over the numbers and the identities of the persons who ever live. Extant theories include totalism, averagism, variable value theories, critical level theories, and “person-affecting” theories. Each of these theories is open to objections that are at least prima facie serious. A series of impossibility theorems shows that this is no coincidence: it can be proved, for various sets of prima facie intuitively compelling desiderata, that no axiology can simultaneously satisfy all the desiderata on the list. One’s choice of population axiology appears to be a choice of which intuition one is least unwilling to give up.
Time in electromagnetism shares many features with time in other physical theories. But there is one aspect of electromagnetism's relationship with time that has always been controversial, yet has not always attracted the limelight it deserves: the electromagnetic arrow of time. Beginning with a re-analysis of a famous argument between Ritz and Einstein over the origins of the radiation arrow, this chapter frames the debate between modern Einsteinians and neo-Ritzians. It tries to find a clean statement of what the arrow is and then explains how it relates to the cosmological and thermodynamic arrows, representing the most developed and sophisticated attack yet, in either the physics or philosophy literature, on the electromagnetic arrow of time.
Judgment aggregation theory, or rather, as we conceive of it here, logical aggregation theory generalizes social choice theory by having the aggregation rule bear on judgments of all kinds instead of merely preference judgments. It derives from Kornhauser and Sager’s doctrinal paradox and List and Pettit’s discursive dilemma, two problems that we distinguish emphatically here. The current theory has developed from the discursive dilemma, rather than the doctrinal paradox, and the final objective of the paper is to give the latter its own theoretical development along the line of recent work by Dietrich and Mongin. However, the paper also aims at reviewing logical aggregation theory as such, and it covers impossibility theorems by Dietrich, Dietrich and List, Dokow and Holzman, List and Pettit, Mongin, Nehring and Puppe, Pauly and van Hees, providing a uniform logical framework in which they can be compared with each other. The review goes through three historical stages: the initial paradox and dilemma, the scattered early results on the independence axiom, and the so-called canonical theorem, a collective achievement that provided the theory with its specific method of analysis. The paper goes some way towards philosophical logic, first by briefly connecting the aggregative framework of judgment with the modern philosophy of judgment, and second by thoroughly discussing and axiomatizing the ‘general logic’ built in this framework.
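The discursive dilemma that drives this literature can be reproduced in a short Python sketch (an illustrative toy example under my own naming, not code from the paper): three judges vote on two premises p, q and on the conclusion p-and-q, and propositionwise majority voting yields a collectively inconsistent judgment set.

```python
def majority(votes):
    """Majority verdict on a list of booleans (odd-sized group assumed)."""
    return sum(votes) > len(votes) / 2

# Each judge holds an individually consistent judgment set over p and q;
# the conclusion c is defined as p AND q.
judges = [
    {"p": True,  "q": True},   # accepts the conclusion
    {"p": True,  "q": False},  # rejects the conclusion
    {"p": False, "q": True},   # rejects the conclusion
]

maj_p = majority([j["p"] for j in judges])             # True  (2 of 3)
maj_q = majority([j["q"] for j in judges])             # True  (2 of 3)
maj_c = majority([j["p"] and j["q"] for j in judges])  # False (1 of 3)

# The collective set {p, q, not-(p and q)} is logically inconsistent:
print(maj_p, maj_q, maj_c)         # True True False
print((maj_p and maj_q) == maj_c)  # False: premise-based and
                                   # conclusion-based routes diverge
```

The premise-based procedure (aggregate p and q, derive c) and the conclusion-based procedure (aggregate c directly) give opposite verdicts here, which is exactly the tension the doctrinal paradox and discursive dilemma articulate.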
In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom. -/- Consequently, this account extends Hartry Field’s program outlined in Science Without Numbers (...) (1980), responds to David Malament’s long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer’s (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about “wave function realism.” In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation). -/- Along the way, I axiomatize the quantum phase structure as what I shall call a “periodic difference structure” and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure. (shrink)
A proof of Fermat’s last theorem is demonstrated. It is very brief, simple, elementary, and absolutely arithmetical. The necessary premises for the proof are only: the three definitive properties of the relation of equality (identity, symmetry, and transitivity), modus tollens, axiom of induction, the proof of Fermat’s last theorem in the case of.
As I hope to show in this paper, Fichte’s rejection of traditional social contractarian accounts of human social relations is related to his rejection of the search for a criterion, or external standard, by which we might measure our knowledge in epistemology. More specifically, Fichte’s account of the impossibility of a normative social contract (as traditionally construed) is related to his account of the impossibility of our knowing things as they might be “in themselves,” separate from and independent of our own activity in knowing them. Addressing the question of whether we finite human knowers can ever transcend the limits of our own consciousness, Fichte argues that Hume was not sufficiently critical: “…the Humean system holds open the possibility that we might someday be able to go beyond the boundary of the human mind, whereas the Critical system proves that such progress is absolutely impossible, and it shows that the thought of a thing possessing existence and specific properties in itself and apart from any faculty of representation is a piece of whimsy, a pipe dream, a nonthought.” In a very real sense, then, Fichte aims to “out-Hume” Hume on the question of whether we can ever know “things-in-themselves” or an external criterion for testing our knowledge. That is, Fichte goes beyond Hume and insists on the necessary – and not merely contingent – character of our ignorance of so-called things-in-themselves (i.e., things that supposedly exist antecedent to and independent of our consciousness of them). But unlike Hume, Fichte argues that radical skepticism regarding all possible knowledge of things-in-themselves does not undermine – but actually confirms and sustains – our belief in the emancipatory power of reason.
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin's famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
In the theory of judgment aggregation, it is known for which agendas of propositions it is possible to aggregate individual judgments into collective ones in accordance with the Arrow-inspired requirements of universal domain, collective rationality, unanimity preservation, non-dictatorship and propositionwise independence. But it is only partially known (e.g., only in the monotonic case) for which agendas it is possible to respect additional requirements, notably non-oligarchy, anonymity, no individual veto power, or implication preservation. We fully characterize the agendas for which there are such possibilities, thereby answering the most salient open questions about propositionwise judgment aggregation. Our results build on earlier results by Nehring and Puppe (2002), Nehring (2006), Dietrich and List (2007a) and Dokow and Holzman (2010a).
Agents are often assumed to have degrees of belief (“credences”) and also binary beliefs (“beliefs simpliciter”). How are these related to each other? A much-discussed answer asserts that it is rational to believe a proposition if and only if one has a high enough degree of belief in it. But this answer runs into the “lottery paradox”: the set of believed propositions may violate the key rationality conditions of consistency and deductive closure. In earlier work, we showed that this problem generalizes: there exists no local function from degrees of belief to binary beliefs that satisfies some minimal conditions of rationality and non-triviality. “Locality” means that the binary belief in each proposition depends only on the degree of belief in that proposition, not on the degrees of belief in others. One might think that the impossibility can be avoided by dropping the assumption that binary beliefs are a function of degrees of belief. We prove that, even if we drop the “functionality” restriction, there still exists no local relation between degrees of belief and binary beliefs that satisfies some minimal conditions. Thus functionality is not the source of the impossibility; its source is the condition of locality. If there is any non-trivial relation between degrees of belief and binary beliefs at all, it must be a “holistic” one. We explore several concrete forms this “holistic” relation could take.
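The lottery paradox that motivates the abstract above is easy to exhibit computationally. The sketch below (my own illustrative construction, with a hypothetical threshold of 0.9) forms binary beliefs by accepting every proposition whose probability exceeds the threshold, in a fair 100-ticket lottery with exactly one winner.

```python
def threshold_beliefs(n_tickets=100, threshold=0.9):
    """Accept every proposition whose probability exceeds the threshold,
    in an n-ticket fair lottery with exactly one winning ticket."""
    beliefs = []
    for i in range(1, n_tickets + 1):
        p_loses = 1 - 1 / n_tickets  # P("ticket i loses") = 0.99 here
        if p_loses > threshold:
            beliefs.append(f"ticket {i} loses")
    # "Some ticket wins" has probability 1, so it clears any threshold.
    beliefs.append("some ticket wins")
    return beliefs

b = threshold_beliefs()
print(len(b))  # 101: every "ticket i loses" claim plus "some ticket wins"
```

All 100 propositions "ticket i loses" are individually believed, yet together with "some ticket wins" the belief set is jointly inconsistent under deductive closure: this is the local, threshold-based belief formation that the paper's impossibility results generalize.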
The determinism-free will debate is perhaps as old as philosophy itself and has been engaged in from a great variety of points of view including those of scientific, theological, and logical character. This chapter focuses on two arguments from logic. First, there is an argument in support of determinism that dates back to Aristotle, if not farther. It rests on acceptance of the Law of Excluded Middle, according to which every proposition is either true or false, no matter whether the proposition is about the past, present or future. In particular, the argument goes, whatever one does or does not do in the future is determined in the present by the truth or falsity of the corresponding proposition. The second argument coming from logic is much more modern and appeals to Gödel's incompleteness theorems to make the case against determinism and in favour of free will, insofar as that applies to the mathematical potentialities of human beings. The claim more precisely is that as a consequence of the incompleteness theorems, those potentialities cannot be exactly circumscribed by the output of any computing machine even allowing unlimited time and space for its work. The chapter concludes with some new considerations that may be in favour of a partial mechanist account of the mathematical mind.
A platitude that took hold with Kuhn is that there can be several equally good ways of balancing theoretical virtues for theory choice. Okasha recently modelled theory choice using technical apparatus from the domain of social choice: famously, Arrow showed that no method of social choice can jointly satisfy four desiderata, and each of the desiderata in social choice has an analogue in theory choice. Okasha suggested that one can avoid the Arrow analogue for theory choice by employing a strategy used by Sen in social choice, namely, to enhance the information made available to the choice algorithms. I argue here that, despite Okasha’s claims to the contrary, the information-enhancing strategy is not compelling in the domain of theory choice.
Condorcet's famous jury theorem reaches an optimistic conclusion about the correctness of majority decisions, based on two controversial premises about voters: they are competent and they vote independently, in a technical sense. I carefully analyse these premises and show that whether a premise is justified depends on the notion of probability considered, and that no notion renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
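The optimistic conclusion the abstract refers to is quantitative: if each of n independent voters is correct with probability p > 1/2, the probability that the majority is correct grows toward 1 as n increases. A minimal sketch of that calculation (my own illustrative code, using the standard binomial formula rather than anything from the paper):

```python
from math import comb

def majority_correct(n, p):
    """Probability that a strict majority of n independent voters, each
    correct with probability p, reaches the correct verdict (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With individual competence p = 0.6, majority accuracy rises with
# group size, as the jury theorem predicts:
for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 4))
```

The paper's point is that this calculation is only as good as its two premises: competence (p > 1/2) and probabilistic independence of the votes, and whether either holds depends on how the probabilities are interpreted.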
(This is for the Cambridge Handbook of Analytic Philosophy, edited by Marcus Rossberg) In this handbook entry, I survey the different ways in which formal mathematical methods have been applied to philosophical questions throughout the history of analytic philosophy. I consider: formalization in symbolic logic, with examples such as Aquinas’ third way and Anselm’s ontological argument; Bayesian confirmation theory, with examples such as the fine-tuning argument for God and the paradox of the ravens; foundations of mathematics, with examples such as Hilbert’s programme and Gödel’s incompleteness theorems; social choice theory, with examples such as Condorcet’s paradox and Arrow’s theorem; ‘how possibly’ results, with examples such as Condorcet’s jury theorem and recent work on intersectionality theory; and the application of advanced mathematics in philosophy, with examples such as accuracy-first epistemology.
This paper applies ideas and tools from social choice theory (such as Arrow's theorem and related results) to linguistics. Specifically, the paper investigates the problem of constraint aggregation in optimality theory from a social-choice-theoretic perspective.
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a “typical” choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic/thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
It is a widespread intuition that the coherence of independent reports provides a powerful reason to believe that the reports are true. Formal results by Huemer (1997), Olsson (2002, 2005), and Bovens and Hartmann (2003) prove that, under certain conditions, coherence cannot increase the probability of the target claim. These formal results, known as ‘the impossibility theorems’, have been widely discussed in the literature. They are taken to have significant epistemic upshot. In particular, they are taken to show that reports must first individually confirm the target claim before the coherence of multiple reports offers any positive confirmation. In this paper, I dispute this epistemic interpretation. The impossibility theorems are consistent with the idea that the coherence of independent reports provides a powerful reason to believe that the reports are true even if the reports do not individually confirm prior to coherence. Once we see that the formal discoveries do not have this implication, we can recover a model of coherence justification consistent with Bayesianism and these results. This paper, thus, seeks to turn the tide of the negative findings for coherence reasoning by defending coherence as a unique source of confirmation.
My aim in this paper is to explain what Condorcet’s jury theorem is, and to examine its central assumptions, its significance to the epistemic theory of democracy and its connection with Rousseau’s theory of the general will. In the first part of the paper I analyze an epistemic theory of democracy and explain how its connection with Condorcet’s jury theorem is twofold: the theorem is at once a contributing historical source and the model used by the authors to this day. In the second part I specify the purposes of the theorem itself and examine its underlying assumptions. The third part concerns an interpretation of Rousseau’s theory, given by Grofman and Feld relying on Condorcet’s jury theorem, and criticisms of that interpretation. In the fourth and last part I focus on one particular assumption of Condorcet’s theorem, which proves to be especially problematic if we would like to apply the theorem under real-life conditions; namely, the assumption that voters choose between two options only.