This is a chapter of a collective volume on Rawls's and Harsanyi's theories of distributive justice. It focuses on Harsanyi's important Social Aggregation Theorem and technically reconstructs it as a theorem in welfarist social choice.
This paper is about the role of interpersonal comparisons in Harsanyi's aggregation theorem. Harsanyi interpreted his theorem to show that a broadly utilitarian theory of distribution must be true even if there are no interpersonal comparisons of well-being. How is this possible? The orthodox view is that it is not. Some argue that the interpersonal comparability of well-being is hidden in Harsanyi's premises. Others argue that it is a surprising conclusion of Harsanyi's theorem, which is not presupposed by any one of the premises. I argue instead that Harsanyi was right: his theorem and its weighted-utilitarian conclusion do not require interpersonal comparisons of well-being. The key to making sense of this possibility is to treat Harsanyi's weights as dimensional constants rather than dimensionless numbers.
We present an abstract social aggregation theorem. Society, and each individual, has a preorder that may be interpreted as expressing values or beliefs. The preorders are allowed to violate both completeness and continuity, and the population is allowed to be infinite. The preorders are only assumed to be represented by functions with values in partially ordered vector spaces whose product has convex range. This includes all preorders that satisfy strong independence. Any Pareto indifferent social preorder is then shown to be represented by a linear transformation of the representations of the individual preorders. Further Pareto conditions on the social preorder correspond to positivity conditions on the transformation. When all the Pareto conditions hold and the population is finite, the social preorder is represented by a sum of individual preorder representations. We provide two applications. The first yields an extremely general version of Harsanyi's social aggregation theorem. The second generalizes a classic result about linear opinion pooling.
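For orientation, the classical finite-population result that these abstracts generalize can be stated in sketch form; the notation below is illustrative and not taken from any of the papers.

```latex
% Sketch of Harsanyi's social aggregation theorem (finite population).
% If each individual $i$ ranks lotteries by an expected utility
% representation $U_i$, society ranks them by an expected utility
% representation $W$, and Pareto indifference holds (whenever every
% $U_i$ is indifferent between two lotteries, so is $W$), then there
% are weights $a_1,\dots,a_n$ and a constant $b$ such that
W(L) = \sum_{i=1}^{n} a_i \, U_i(L) + b \qquad \text{for every lottery } L.
```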
Can a group be an orthodox rational agent? This requires the group's aggregate preferences to follow expected utility (static rationality) and to evolve by Bayesian updating (dynamic rationality). Group rationality is possible, but the only preference aggregation rules which achieve it (and are minimally Paretian and continuous) are the linear-geometric rules, which combine individual values linearly and combine individual beliefs geometrically. Linear-geometric preference aggregation contrasts with classic linear-linear preference aggregation, which combines both values and beliefs linearly, but achieves only static rationality. Our characterisation of linear-geometric preference aggregation has two corollaries: a characterisation of linear aggregation of values (Harsanyi's Theorem) and a characterisation of geometric aggregation of beliefs.
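The contrast between linear and geometric belief aggregation can be checked numerically. The sketch below is my own illustration (the distributions, equal weights, and likelihood are arbitrary choices): equal-weight geometric pooling commutes with Bayesian updating, while equal-weight linear pooling in general does not.

```python
# Equal-weight pooling of two agents' probability distributions over
# three states, followed by a Bayesian update on a shared likelihood.

def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def linear_pool(p, q):
    # arithmetic mean of credences, renormalized
    return normalize([(a + b) / 2 for a, b in zip(p, q)])

def geometric_pool(p, q):
    # geometric mean of credences, renormalized
    return normalize([(a * b) ** 0.5 for a, b in zip(p, q)])

def update(p, likelihood):
    # Bayesian conditioning: posterior proportional to prior * likelihood
    return normalize([a * l for a, l in zip(p, likelihood)])

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
likelihood = [0.9, 0.5, 0.1]

geo_pool_then_update = update(geometric_pool(p, q), likelihood)
geo_update_then_pool = geometric_pool(update(p, likelihood), update(q, likelihood))
lin_pool_then_update = update(linear_pool(p, q), likelihood)
lin_update_then_pool = linear_pool(update(p, likelihood), update(q, likelihood))

same = lambda a, b: all(abs(x - y) < 1e-12 for x, y in zip(a, b))
print(same(geo_pool_then_update, geo_update_then_pool))  # True
print(same(lin_pool_then_update, lin_update_then_pool))  # False
```

The identity behind the first result: geometrically pooling the posteriors gives values proportional to sqrt(p_i * q_i) * l_i, which is exactly the updated geometric pool; no analogous identity holds for the arithmetic mean.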
An important objection to preference-satisfaction theories of well-being is that they cannot make sense of interpersonal comparisons. A tradition dating back to Harsanyi (1953) attempts to solve this problem by appeal to people’s so-called extended preferences. This paper presents a new problem for the extended preferences program, related to Arrow’s celebrated impossibility theorem. We consider three ways in which the extended-preference theorist might avoid this problem, and recommend that she pursue one: developing aggregation rules that violate Arrow’s Independence of Irrelevant Alternatives condition.
This chapter of the Handbook of Utility Theory aims at covering the connections between utility theory and social ethics. The chapter first discusses the philosophical interpretations of utility functions, then explains how social choice theory uses them to represent interpersonal comparisons of welfare in either utilitarian or non-utilitarian representations of social preferences. The chapter also contains an extensive account of John Harsanyi's formal reconstruction of utilitarianism and its developments in the later literature, especially when society faces uncertainty rather than probabilistic risk.
The article is a plea for ethicists to regard probability as one of their most important concerns. It outlines a series of topics of central importance in ethical theory in which probability is implicated, often in a surprisingly deep way, and lists a number of open problems. Topics covered include: interpretations of probability in ethical contexts; the evaluative and normative significance of risk or uncertainty; uses and abuses of expected utility theory; veils of ignorance; Harsanyi’s aggregation theorem; population size problems; equality; fairness; giving priority to the worse off; continuity; incommensurability; nonexpected utility theory; evaluative measurement; aggregation; causal and evidential decision theory; act consequentialism; rule consequentialism; and deontology.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
Risky prospects represent policies that impose different types of risks on multiple people. I present an example from food safety. A utilitarian following Harsanyi's Aggregation Theorem ranks such prospects according to their mean expected utility, that is, the expectation of the social utility. Such a ranking is not sensitive to any of four types of distributional concerns. I develop a model that lets the policy analyst rank prospects relative to the distributional concerns that she considers fitting in the context at hand. I name this model ‘the Distribution View’, posing it as an alternative to Parfit's Priority View for risky prospects.
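The kind of insensitivity at issue can be made concrete with a toy calculation of my own (not the paper's food-safety example): two risk-imposing policies with the same mean expected utility but very different distributions of risk come out exactly tied under the utilitarian criterion.

```python
# Two policies, each imposing risks of death on two people; utility is 1
# if a person survives and 0 otherwise, so a person's expected utility
# equals their survival probability.
# Policy A spreads a 10% risk over both people; policy B concentrates a
# 20% risk on person 1 alone.

def mean_expected_utility(risks):
    # average of each person's survival probability
    return sum(1 - r for r in risks) / len(risks)

policy_a = [0.10, 0.10]
policy_b = [0.20, 0.00]

mu_a = mean_expected_utility(policy_a)
mu_b = mean_expected_utility(policy_b)

# The utilitarian criterion ranks the two policies as exactly equal,
# even though B places the whole burden of risk on one person.
print(abs(mu_a - mu_b) < 1e-12)  # True
```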
We give two social aggregation theorems under conditions of risk, one for constant population cases, the other an extension to variable populations. Intra- and interpersonal welfare comparisons are encoded in a single ‘individual preorder’. The theorems give axioms that uniquely determine a social preorder in terms of this individual preorder. The social preorders described by these theorems have features that may be considered characteristic of Harsanyi-style utilitarianism, such as indifference to ex ante and ex post equality. However, the theorems are also consistent with the rejection of all of the expected utility axioms, completeness, continuity, and independence, at both the individual and social levels. In that sense, expected utility is inessential to Harsanyi-style utilitarianism. In fact, the variable population theorem imposes only a mild constraint on the individual preorder, while the constant population theorem imposes no constraint at all. We then derive further results under the assumption of our basic axioms. First, the individual preorder satisfies the main expected utility axiom of strong independence if and only if the social preorder has a vector-valued expected total utility representation, covering Harsanyi’s utilitarian theorem as a special case. Second, stronger utilitarian-friendly assumptions, like Pareto or strong separability, are essentially equivalent to strong independence. Third, if the individual preorder satisfies a ‘local expected utility’ condition popular in non-expected utility theory, then the social preorder has a ‘local expected total utility’ representation. Fourth, a wide range of non-expected utility theories nevertheless lead to social preorders of outcomes that have been seen as canonically egalitarian, such as rank-dependent social preorders. Although our aggregation theorems are stated under conditions of risk, they are valid in more general frameworks for representing uncertainty or ambiguity.
We introduce a ranking of multidimensional alternatives, including uncertain prospects as a particular case, when these objects can be given a matrix form. This ranking is separable in terms of rows and columns, and continuous and monotonic in the basic quantities. Owing to the theory of additive separability developed here, we derive very precise numerical representations over a large class of domains (i.e., typically not of the Cartesian product form). We apply these representations to (1) streams of commodity baskets through time, (2) uncertain social prospects, (3) uncertain individual prospects. Concerning (1), we propose a finite-horizon variant of Koopmans’s (1960) axiomatization of infinite discounted utility sums. The main results concern (2). We push the classic comparison between the ex ante and ex post social welfare criteria one step further by avoiding any expected utility assumptions, and as a consequence obtain what appears to be the strongest existing form of Harsanyi’s (1955) Aggregation Theorem. Concerning (3), we derive a subjective probability for Anscombe and Aumann’s (1963) finite case by merely assuming that there are two epistemically independent sources of uncertainty.
Suppose that the members of a group each hold a rational set of judgments on some interconnected questions, and imagine that the group itself has to form a collective, rational set of judgments on those questions. How should it go about dealing with this task? We argue that the question raised is subject to a difficulty that has recently been noticed in discussion of the doctrinal paradox in jurisprudence. And we show that there is a general impossibility theorem that that difficulty illustrates. Our paper describes this impossibility result and provides an exploration of its significance. The result naturally invites comparison with Kenneth Arrow's famous theorem (Arrow, 1963 and 1984; Sen, 1970) and we elaborate that comparison in a companion paper (List and Pettit, 2002). The paper is in four sections. The first section documents the need for various groups to aggregate their members' judgments; the second presents the discursive paradox; the third gives an informal statement of the more general impossibility result; the formal proof is presented in an appendix. The fourth section, finally, discusses some escape routes from that impossibility.
Harsanyi claimed that his Aggregation and Impartial Observer Theorems provide a justification for utilitarianism. This claim has been strongly resisted, notably by Sen and Weymark, who argue that while Harsanyi has perhaps shown that overall good is a linear sum of individuals’ von Neumann-Morgenstern utilities, he has done nothing to establish any connection between the notion of von Neumann-Morgenstern utility and that of well-being, and hence that utilitarianism does not follow. The present article defends Harsanyi against the Sen-Weymark critique. I argue that, far from being a term with precise and independent quantitative content whose relationship to von Neumann-Morgenstern utility is then a substantive question, terms such as ‘well-being’ suffer (or suffered) from indeterminacy regarding precisely which quantity they refer to. If so, then (on the issue that this article focuses on) Harsanyi has gone as far towards defending ‘utilitarianism in the original sense’ as could coherently be asked.
Several recent results on the aggregation of judgments over logically connected propositions show that, under certain conditions, dictatorships are the only propositionwise aggregation functions generating fully rational (i.e., complete and consistent) collective judgments. A frequently mentioned route to avoid dictatorships is to allow incomplete collective judgments. We show that this route does not lead very far: we obtain oligarchies rather than dictatorships if instead of full rationality we merely require that collective judgments be deductively closed, arguably a minimal condition of rationality, compatible even with empty judgment sets. We derive several characterizations of oligarchies and provide illustrative applications to Arrowian preference aggregation and Kasher and Rubinstein's group identification problem.
In solving judgment aggregation problems, groups often face constraints. Many decision problems can be modelled in terms of the acceptance or rejection of certain propositions in a language, and constraints as propositions that the decisions should be consistent with. For example, court judgments in breach-of-contract cases should be consistent with the constraint that action and obligation are necessary and sufficient for liability; judgments on how to rank several options in an order of preference with the constraint of transitivity; and judgments on budget items with budgetary constraints. Often more or less demanding constraints on decisions are imaginable. For instance, in preference ranking problems, the transitivity constraint is often contrasted with the weaker acyclicity constraint. In this paper, we make constraints explicit in judgment aggregation by relativizing the rationality conditions of consistency and deductive closure to a constraint set, whose variation yields more or less strong notions of rationality. We review several general results on judgment aggregation in light of such constraints.
I propose a relevance-based independence axiom on how to aggregate individual yes/no judgments on given propositions into collective judgments: the collective judgment on a proposition depends only on people’s judgments on propositions which are relevant to that proposition. This axiom contrasts with the classical independence axiom: the collective judgment on a proposition depends only on people’s judgments on the same proposition. I generalize the premise-based rule and the sequential-priority rule to an arbitrary priority order of the propositions, instead of a dichotomous premise/conclusion order or a linear priority order, respectively. I prove four impossibility theorems on relevance-based aggregation. One theorem simultaneously generalizes Arrow’s Theorem (in its general and indifference-free versions) and the well-known Arrow-like theorem in judgment aggregation.
The ``doctrinal paradox'' or ``discursive dilemma'' shows that propositionwise majority voting over the judgments held by multiple individuals on some interconnected propositions can lead to inconsistent collective judgments on these propositions. List and Pettit (2002) have proved that this paradox illustrates a more general impossibility theorem showing that there exists no aggregation procedure that generally produces consistent collective judgments and satisfies certain minimal conditions. Although the paradox and the theorem concern the aggregation of judgments rather than preferences, they invite comparison with two established results on the aggregation of preferences: the Condorcet paradox and Arrow's impossibility theorem. We may ask whether the new impossibility theorem is a special case of Arrow's theorem, or whether there are interesting disanalogies between the two results. In this paper, we compare the two theorems, and show that they are not straightforward corollaries of each other. We further suggest that, while the framework of preference aggregation can be mapped into the framework of judgment aggregation, there exists no obvious reverse mapping. Finally, we address one particular minimal condition that is used in both theorems – an independence condition – and suggest that this condition points towards a unifying property underlying both impossibility results.
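The doctrinal paradox is easy to reproduce. A minimal sketch, with an illustrative profile of three judges voting on two premises and their conjunction:

```python
# Three judges each hold a logically consistent view on the premises
# p, q and the conclusion p&q, yet propositionwise majority voting
# yields an inconsistent collective judgment set.

judges = [
    {"p": True,  "q": True,  "p&q": True},
    {"p": True,  "q": False, "p&q": False},
    {"p": False, "q": True,  "p&q": False},
]

# accept a proposition iff a strict majority of judges accepts it
majority = {
    prop: sum(j[prop] for j in judges) > len(judges) / 2
    for prop in ("p", "q", "p&q")
}

print(majority)
# {'p': True, 'q': True, 'p&q': False}: the majority accepts both
# conjuncts while rejecting the conjunction.
```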
The aim of this article is to introduce the theory of judgment aggregation, a growing interdisciplinary research area. The theory addresses the following question: How can a group of individuals make consistent collective judgments on a given set of propositions on the basis of the group members' individual judgments on them? I begin by explaining the observation that initially sparked the interest in judgment aggregation, the so-called "doctrinal" and "discursive" paradoxes. I then introduce the basic formal model of judgment aggregation, which allows me to present some illustrative variants of a generic impossibility result. I subsequently turn to the question of how this impossibility result can be avoided, going through several possible escape routes. Finally, I relate the theory of judgment aggregation to other branches of aggregation theory. Rather than offering a comprehensive survey of the theory of judgment aggregation, I hope to introduce the theory in a succinct and pedagogical way, providing an illustrative rather than exhaustive coverage of some of its key ideas and results.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. The present illustrative rather than exhaustive review is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
The purpose of this paper is to illustrate, formally, an ambiguity in the exercise of political influence. To wit: A voter might exert influence with an eye toward maximizing the probability that the political system (1) obtains the correct (e.g. just) outcome, or (2) obtains the outcome that he judges to be correct (just). And these are two very different things. A variant of Condorcet's Jury Theorem which incorporates the effect of influence on group competence and interdependence is developed. Analytic and numerical results are obtained, the most important of which is that it is never optimal--from the point-of-view of collective accuracy--for a voter to exert influence without limit. He ought to either refrain from influencing other voters or else exert a finite amount of influence, depending on circumstance. Philosophical lessons are drawn from the model, to include a solution to Wollheim's "Paradox in the Theory of Democracy".
In the theory of judgment aggregation, it is known for which agendas of propositions it is possible to aggregate individual judgments into collective ones in accordance with the Arrow-inspired requirements of universal domain, collective rationality, unanimity preservation, non-dictatorship and propositionwise independence. But it is only partially known (e.g., only in the monotonic case) for which agendas it is possible to respect additional requirements, notably non-oligarchy, anonymity, no individual veto power, or implication preservation. We fully characterize the agendas for which there are such possibilities, thereby answering the most salient open questions about propositionwise judgment aggregation. Our results build on earlier results by Nehring and Puppe (2002), Nehring (2006), Dietrich and List (2007a) and Dokow and Holzman (2010a).
Judgment aggregation theory, or rather, as we conceive of it here, logical aggregation theory generalizes social choice theory by having the aggregation rule bear on judgments of all kinds instead of merely preference judgments. It derives from Kornhauser and Sager’s doctrinal paradox and List and Pettit’s discursive dilemma, two problems that we distinguish emphatically here. The current theory has developed from the discursive dilemma, rather than the doctrinal paradox, and the final objective of the paper is to give the latter its own theoretical development along the line of recent work by Dietrich and Mongin. However, the paper also aims at reviewing logical aggregation theory as such, and it covers impossibility theorems by Dietrich, Dietrich and List, Dokow and Holzman, List and Pettit, Mongin, Nehring and Puppe, Pauly and van Hees, providing a uniform logical framework in which they can be compared with each other. The review goes through three historical stages: the initial paradox and dilemma, the scattered early results on the independence axiom, and the so-called canonical theorem, a collective achievement that provided the theory with its specific method of analysis. The paper goes some way towards philosophical logic, first by briefly connecting the aggregative framework of judgment with the modern philosophy of judgment, and second by thoroughly discussing and axiomatizing the ‘general logic’ built in this framework.
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow’s and arguably enhances its paradoxical value.
This paper applies ideas and tools from social choice theory (such as Arrow's theorem and related results) to linguistics. Specifically, the paper investigates the problem of constraint aggregation in optimality theory from a social-choice-theoretic perspective.
Standard impossibility theorems on judgment aggregation over logically connected propositions either use a controversial systematicity condition or apply only to agendas of propositions with rich logical connections. Are there any serious impossibilities without these restrictions? We prove an impossibility theorem without requiring systematicity that applies to most standard agendas: Every judgment aggregation function (with rational inputs and outputs) satisfying a condition called unbiasedness is dictatorial (or effectively dictatorial if we remove one of the agenda conditions). Our agenda conditions are tight. When applied illustratively to (strict) preference aggregation represented in our model, the result implies that every unbiased social welfare function with universal domain is effectively dictatorial.
What is the relationship between degrees of belief and binary beliefs? Can the latter be expressed as a function of the former—a so-called “belief-binarization rule”—without running into difficulties such as the lottery paradox? We show that this problem can be usefully analyzed from the perspective of judgment-aggregation theory. Although some formal similarities between belief binarization and judgment aggregation have been noted before, the connection between the two problems has not yet been studied in full generality. In this paper, we seek to fill this gap. The paper is organized around a baseline impossibility theorem, which we use to map out the space of possible solutions to the belief-binarization problem. Our theorem shows that, except in limiting cases, there exists no belief-binarization rule satisfying four initially plausible desiderata. Surprisingly, this result is a direct corollary of the judgment-aggregation variant of Arrow’s classic impossibility theorem in social choice theory.
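The lottery paradox mentioned above can be sketched with a minimal example; the threshold and lottery size below are my own illustrative choices.

```python
# Threshold belief binarization: believe a proposition iff its credence
# exceeds t. Applied to a fair three-ticket lottery, where credences are
# the obvious probabilities.

t = 0.6
n_tickets = 3

cr_ticket_i_loses = 1 - 1 / n_tickets   # 2/3 for each individual ticket
cr_all_tickets_lose = 0.0               # some ticket certainly wins

believes_each_ticket_loses = cr_ticket_i_loses > t
believes_all_tickets_lose = cr_all_tickets_lose > t

# Each member of {ticket 1 loses, ticket 2 loses, ticket 3 loses} is
# believed, but their conjunction is disbelieved: the binarized belief
# set is not deductively closed, and it is jointly inconsistent with
# "some ticket wins", which has credence 1 and is also believed.
print(believes_each_ticket_loses, believes_all_tickets_lose)  # True False
```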
This paper critically engages Philip Mirowski's essay, "The scientific dimensions of social knowledge and their distant echoes in 20th-century American philosophy of science." It argues that although the cold-war context of anti-democratic elitism, regarded as best suited for making decisions about engaging in nuclear war, may seem to be politically and ideologically motivated, in fact we need to consider carefully the arguments underlying the new rational-choice-based political philosophies of the post-WWII era, typified by Arrow's impossibility theorem. A distrust of democratic decision-making principles may be developed by social scientists whose leanings may be toward the left or the right side of the spectrum of political practices.
We investigate the conflict between the ex ante and ex post criteria of social welfare in a new framework of individual and social decisions, which distinguishes between two sources of uncertainty, here interpreted as an objective and a subjective source respectively. This framework makes it possible to endow the individuals and society not only with ex ante and ex post preferences, as is usually done, but also with interim preferences of two kinds, and correspondingly, to introduce interim forms of the Pareto principle. After characterizing the ex ante and ex post criteria, we present a first solution to their conflict that extends the former as much as possible in the direction of the latter. Then, we present a second solution, which goes in the opposite direction, and is also maximally assertive. Both solutions translate the assumed Pareto conditions into weighted additive utility representations, and both attribute to the individuals common probability values on the objective source of uncertainty, and different probability values on the subjective source. We discuss these solutions in terms of two conceptual arguments, i.e., the by now classic spurious unanimity argument and a novel informational argument labelled complementary ignorance. The paper complies with the standard economic methodology of basing probability and utility representations on preference axioms, but for the sake of completeness, also considers a construal of objective uncertainty based on the assumption of an exogenously given probability measure. JEL classification: D70; D81.
My aim in this paper is to explain what Condorcet’s jury theorem is, and to examine its central assumptions, its significance for the epistemic theory of democracy and its connection with Rousseau’s theory of the general will. In the first part of the paper I will analyze an epistemic theory of democracy and explain how its connection with Condorcet’s jury theorem is twofold: the theorem is at the same time a contributing historical source, and the model used by the authors to this day. In the second part I will specify the purposes of the theorem itself, and examine its underlying assumptions. The third part will be about an interpretation of Rousseau’s theory, which is given by Grofman and Feld relying on Condorcet’s jury theorem, and about criticisms of such an interpretation. In the fourth and last part I will focus on one particular assumption of Condorcet’s theorem, which proves to be especially problematic if we would like to apply the theorem under real-life conditions; namely, the assumption that voters choose between two options only.
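The theorem's core calculation is simple to reproduce. A minimal sketch under the standard assumptions (two options, independent voters of equal competence p > 1/2, odd group size n):

```python
from math import comb

def majority_correct(n, p):
    """Probability that a strict majority of n independent voters, each
    correct with probability p, picks the correct one of two options
    (n odd, so ties are impossible)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With individual competence p = 0.6, collective accuracy rises with n
# and approaches 1, as the jury theorem predicts.
for n in (1, 11, 101):
    print(n, majority_correct(n, 0.6))
```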
A proof of Fermat’s last theorem is demonstrated. It is very brief, simple, elementary, and absolutely arithmetical. The necessary premises for the proof are only: the three definitive properties of the relation of equality (identity, symmetry, and transitivity), modus tollens, axiom of induction, the proof of Fermat’s last theorem in the case of.
In a previous paper, an elementary and thoroughly arithmetical proof of Fermat’s last theorem by induction has been demonstrated if the case for “n = 3” is granted as proved only arithmetically (which has been a fact for a long time), furthermore in a way accessible to Fermat himself though without being absolutely and precisely correct. The present paper elucidates the contemporary mathematical background, from which an inductive proof of FLT can be inferred since its proof for the case for “n = 3” has been known for a long time. It needs “Hilbert mathematics”, which is inherently complete unlike the usual “Gödel mathematics”, and based on “Hilbert arithmetic” to generalize Peano arithmetic in a way to unify it with the qubit Hilbert space of quantum information. An “epoché to infinity” (similar to Husserl’s “epoché to reality”) is necessary to map Hilbert arithmetic into Peano arithmetic in order to be relevant to Fermat’s age. Furthermore, the two linked semigroups originating from addition and multiplication and from the Peano axioms in the final analysis can be postulated algebraically as independent of each other in a “Hamilton” modification of arithmetic supposedly equivalent to Peano arithmetic. The inductive proof of FLT can be deduced absolutely precisely in that Hamilton arithmetic and then transferred as a corollary into standard Peano arithmetic, furthermore in a way accessible in Fermat’s epoch and thus, to himself in principle. A future, second part of the paper is outlined, getting directed to an eventual proof of the case “n = 3” based on the qubit Hilbert space and the Kochen-Specker theorem inferable from it.
The previous two parts of the paper demonstrate that the interpretation of Fermat’s last theorem (FLT) in Hilbert arithmetic, meant both in a narrow sense and in a wide sense, can suggest a proof by induction in Part I and by means of the Kochen-Specker theorem in Part II. The same interpretation can serve also for a proof of FLT based on Gleason’s theorem and partly similar to that in Part II. The concept of (probabilistic) measure of a subspace of Hilbert space and especially its uniqueness can be unambiguously linked to that of partial algebra or incommensurability, or interpreted as a relation of the two dual branches of Hilbert arithmetic in a wide sense. The investigation of the last relation allows for FLT and Gleason’s theorem to be equated in a sense, as two dual counterparts, and the former to be inferred from the latter, as well as vice versa under an additional condition relevant to the Gödel incompleteness of arithmetic to set theory. The qubit Hilbert space itself in turn can be interpreted by the unity of FLT and Gleason’s theorem. The proof of such a fundamental result in number theory as FLT by means of Hilbert arithmetic in a wide sense can be generalized to an idea about “quantum number theory”. It is able to research mathematically the origin of Peano arithmetic from Hilbert arithmetic by mediation of the “nonstandard bijection” and its two dual branches inherently linking it to information theory. Then, infinitesimal analysis and its revolutionary application to physics can be also re-realized in that wider context, for example, as an exploration of the way for the physical quantity of time (respectively, for the time derivative in any temporal process considered in physics) to appear at all. Finally, the result admits a philosophical reflection of how any hierarchy arises or changes itself only thanks to its dual and idempotent counterpart.
In this paper, I investigate the relationship between preference and judgment aggregation, using the notion of ranking judgment introduced in List and Pettit. Ranking judgments were introduced in order to state the logical connections between the impossibility theorem of aggregating sets of judgments and Arrow’s theorem. I present a proof of the theorem concerning ranking judgments as a corollary of Arrow’s theorem, extending the translation between preferences and judgments defined in List and Pettit to the conditions on the aggregation procedure.
Classical interpretations of Gödel's formal reasoning, and of his conclusions, implicitly imply that mathematical languages are essentially incomplete, in the sense that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is both non-algorithmic and essentially unverifiable. However, a language of general, scientific discourse, which intends to mathematically express, and unambiguously communicate, intuitive concepts that correspond to scientific investigations, cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical, Tarskian truth, and of Gödel's reasoning, under which any formal system of Peano Arithmetic---classically accepted as the foundation of all our mathematical languages---is verifiably complete in the above sense. We show how some paradoxical concepts of quantum mechanics can then be expressed, and interpreted, naturally under a constructive definition of mathematical truth.
The paper is a continuation of another paper published as Part I. Now, the case of n = 3 is inferred as a corollary from the Kochen–Specker theorem (1967): any eventual solution of Fermat’s equation for n = 3 would correspond to an admissible disjunctive division of a qubit into two absolutely independent parts, contradicting the contextuality of any qubit implied by the Kochen–Specker theorem. Incommensurability (implied by the absence of hidden variables) is considered as dual to quantum contextuality. The relevant mathematical structure is Hilbert arithmetic in a wide sense, in the framework of which Hilbert arithmetic in a narrow sense and the qubit Hilbert space are dual to each other. A few cases involving set theory are possible: (1) only within the case n = 3 and, implicitly, within any next level of n in Fermat’s equation; (2) the identification of the case n = 3 and the general case utilizing the axiom of choice rather than the axiom of induction. If the former is the case, the application of set theory and arithmetic can remain disjunctively divided: set theory, “locally”, within any level; and arithmetic, “globally”, across all levels. If the latter is the case, the proof is thoroughly within set theory. Thus, the relevance of Yablo’s paradox to the statement of Fermat’s last theorem is avoided in both cases. The idea of “arithmetic mechanics” is sketched: it might deduce the basic physical dimensions of mechanics (mass, time, distance) from the axioms of arithmetic after a relevant generalization. Furthermore, a future Part III of the paper is suggested: FLT, by mediation of Hilbert arithmetic in a wide sense, can be considered as another expression of Gleason’s theorem in quantum mechanics: the exclusion of the cases n = 1, 2 in both theorems, as well as the validity for all remaining values of n, can be unified in light of the theory of quantum information.
The availability (respectively, non-availability) of solutions of Fermat’s equation can then be proved as equivalent to the non-availability (respectively, availability) of a single probabilistic measure in the sense of Gleason’s theorem.
The determinism-free will debate is perhaps as old as philosophy itself and has been engaged in from a great variety of points of view, including those of a scientific, theological, and logical character. This chapter focuses on two arguments from logic. First, there is an argument in support of determinism that dates back to Aristotle, if not farther. It rests on acceptance of the Law of Excluded Middle, according to which every proposition is either true or false, no matter whether the proposition is about the past, present or future. In particular, the argument goes, whatever one does or does not do in the future is determined in the present by the truth or falsity of the corresponding proposition. The second argument coming from logic is much more modern and appeals to Gödel’s incompleteness theorems to make the case against determinism and in favour of free will, insofar as that applies to the mathematical potentialities of human beings. The claim, more precisely, is that as a consequence of the incompleteness theorems, those potentialities cannot be exactly circumscribed by the output of any computing machine, even allowing unlimited time and space for its work. The chapter concludes with some new considerations that may be in favour of a partial mechanist account of the mathematical mind.
The aggregation of individual judgments over interrelated propositions is a newly emerging field of social choice theory. I introduce several independence conditions on judgment aggregation rules, each of which protects against a specific type of manipulation by agenda setters or voters. I derive impossibility theorems whereby these independence conditions are incompatible with certain minimal requirements. Unlike earlier impossibility results, the main result here holds for any (non-trivial) agenda. However, independence conditions arguably undermine the logical structure of judgment aggregation. I therefore suggest restricting independence to premises, which leads to a generalised premise-based procedure. This procedure is proven to be possible if the premises are logically independent.
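The contrast between proposition-wise aggregation and the premise-based procedure can be illustrated with the classic “discursive dilemma”. The following sketch is our own illustration, not code from the paper; the judge profile and helper names are invented for the example.

```python
# Discursive dilemma: proposition-wise majority voting on interrelated
# propositions can yield an inconsistent collective judgment set, while
# a premise-based procedure over logically independent premises cannot.

def majority(votes):
    """True iff a strict majority of the boolean votes is True."""
    return sum(votes) > len(votes) / 2

# Three judges vote on premises p, q and the conclusion (p and q).
judges = [
    {"p": True,  "q": True,  "p_and_q": True},
    {"p": True,  "q": False, "p_and_q": False},
    {"p": False, "q": True,  "p_and_q": False},
]

# Proposition-wise majority: decide each proposition by majority vote.
prop_wise = {
    prop: majority([j[prop] for j in judges])
    for prop in ("p", "q", "p_and_q")
}
# p and q each pass 2-1, yet the conclusion fails 2-1: the collective
# set {p, q, not (p and q)} is logically inconsistent.
assert prop_wise == {"p": True, "q": True, "p_and_q": False}

# Premise-based procedure: vote only on the premises, then derive the
# conclusion logically from the collective premise judgments.
premise_based = {
    "p": majority([j["p"] for j in judges]),
    "q": majority([j["q"] for j in judges]),
}
premise_based["p_and_q"] = premise_based["p"] and premise_based["q"]
assert premise_based["p_and_q"] is True  # consistent by construction
```

Each judge’s individual judgment set is consistent; only the proposition-wise collective set is not, which is the manipulation-prone structure the independence conditions interact with.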
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin’s famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories; rather, the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
Can experimental philosophy help us answer central questions about the nature of moral responsibility, such as the question of whether moral responsibility is compatible with determinism? Specifically, can folk judgments in line with a particular answer to that question provide support for that answer? Based on reasoning familiar from Condorcet’s Jury Theorem, such support could be had if individual judges track the truth of the matter independently and with some modest reliability: such reliability quickly aggregates as the number of judges goes up. In this chapter, however, I argue, partly on the basis of empirical evidence, that although non-specialist judgments might on average be more likely than not to get things right, their individual likelihoods fail to aggregate because they do not track truth with sufficient independence.
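Why correlated judgments fail to aggregate can be shown with a small simulation. All parameters below (competence 0.6, the shared-cue model, the thresholds) are our own illustrative choices, not data from the chapter.

```python
# Sketch: majority accuracy for independent vs. correlated judges.
# Each judge either copies a common (possibly wrong) cue with
# probability shared_weight, or judges independently with competence p.
import random

def majority_accuracy(n_judges, p, shared_weight, trials=5000, seed=1):
    """Estimate P(majority of judges is correct) by Monte Carlo."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        common = rng.random() < p          # the shared cue's verdict
        votes = 0
        for _ in range(n_judges):
            if rng.random() < shared_weight:
                vote = common              # follow the common cue
            else:
                vote = rng.random() < p    # judge independently
            votes += vote
        correct += votes > n_judges / 2
    return correct / trials

independent = majority_accuracy(101, p=0.6, shared_weight=0.0)
correlated = majority_accuracy(101, p=0.6, shared_weight=0.9)

# Independence lets modest reliability aggregate; correlation blocks it:
# the correlated majority is barely better than a single judge.
assert independent > 0.95
assert correlated < 0.75
```

With independence, 101 judges at 60% reliability yield a majority that is right roughly 98% of the time; with heavy correlation, the majority simply echoes the shared cue and stays near 60%.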
Condorcet’s famous jury theorem reaches an optimistic conclusion about the correctness of majority decisions, based on two controversial premises about voters: they are competent and they vote independently, in a technical sense. I carefully analyse these premises and show, first, that whether a premise is justified depends on the notion of probability considered, and second, that none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
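The formal core of the jury theorem is a binomial tail computation, which can be checked directly. The competence value and population sizes below are our own illustrative choices.

```python
# Condorcet's jury theorem, numerically: with independent voters each
# correct with probability p > 1/2, the probability that a majority of
# n voters (n odd) is correct is a binomial tail that grows toward 1.
from math import comb

def p_majority_correct(n, p):
    """P(more than n/2 of n independent voters are correct), n odd."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(k_min, n + 1))

p = 0.55  # modest individual competence
probs = [p_majority_correct(n, p) for n in (1, 11, 101, 1001)]

assert abs(probs[0] - 0.55) < 1e-12          # one voter: just p
assert probs[0] < probs[1] < probs[2] < probs[3]  # monotone in n
assert probs[3] > 0.99  # competence aggregates under independence
```

This is exactly the optimistic conclusion the paper scrutinises: the computation is valid only given the two premises, and the independence premise is the one the analysis suggests weakening.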
Can we design a perfect democratic decision procedure? Condorcet famously observed that majority rule, our paradigmatic democratic procedure, has some desirable properties, but sometimes produces inconsistent outcomes. Revisiting Condorcet’s insights in light of recent work on the aggregation of judgments, I show that there is a conflict between three initially plausible requirements of democracy: “robustness to pluralism”, “basic majoritarianism”, and “collective rationality”. For all but the simplest collective decision problems, no decision procedure meets these three requirements at once; at most two can be met together. This “democratic trilemma” raises the question of which requirement to give up. Since different answers correspond to different views about what matters most in a democracy, the trilemma suggests a map of the “logical space” in which different conceptions of democracy are located. It also sharpens our thinking about other impossibility problems of social choice and how to avoid them, by capturing a core structure many of these problems have in common. More broadly, it raises the idea of a “cartography of logical space” in relation to contested political concepts.
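Condorcet’s original observation of inconsistent majority outcomes, the cyclical majority, is easy to reproduce. The three-voter profile below is the standard textbook example, encoded in our own notation.

```python
# Condorcet's paradox: three perfectly rational individual rankings
# whose pairwise majorities form a cycle, so the collective
# "preference" violates transitivity (collective rationality).

# Each voter submits a strict ranking, best alternative first.
rankings = [
    ("A", "B", "C"),
    ("B", "C", "A"),
    ("C", "A", "B"),
]

def majority_prefers(x, y):
    """True iff a majority of voters rank x above y."""
    wins = sum(r.index(x) < r.index(y) for r in rankings)
    return wins > len(rankings) / 2

# A beats B (2-1), B beats C (2-1), and yet C beats A (2-1).
assert majority_prefers("A", "B")
assert majority_prefers("B", "C")
assert majority_prefers("C", "A")  # the cycle closes
```

Every individual ranking is transitive, yet pairwise majority voting yields A > B > C > A: this is the kind of decision problem for which the three requirements of the trilemma cannot all be met.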
Juries, committees and expert panels commonly appraise things of one kind or another on the basis of grades awarded by several people. When everybody’s grading thresholds are known to be the same, the results can sometimes be counted on to reflect the graders’ opinion. Otherwise, they often cannot. Under certain conditions, Arrow’s ‘impossibility’ theorem entails that judgements reached by aggregating grades do not reliably track any collective sense of better and worse at all. These claims are made by adapting the Arrow–Sen framework for social choice to study grading in groups.
Fermat’s least time principle has a long history. The world’s foremost academies of the day, championed by their most prestigious philosophers, competed for the glory and prestige that went with the solution of the refraction problem of light. The controversy, known as the Descartes–Fermat controversy, was due to the contradictory views held by Descartes and Fermat regarding the relative speeds of light in different media. Descartes, with his mechanical philosophy, insisted that every natural phenomenon must be explained by mechanical principles. Fermat, on the other hand, insisted on an end purpose for every motion: for example, the least time of travel, not the least distance of travel, is the end purpose of the motion of light. This implied a thinking nature, which Descartes rejected. Surprisingly, starting from contradictory assumptions regarding the relative speeds of light in different media, both Descartes and Fermat arrived at the same result: the ratio of the sines of the angles of incidence and refraction is a constant. Fermat’s result came to be known as ‘Fermat’s least time principle’. We show in this article that Fermat’s least time principle violates a fundamental theorem in geometry, Ptolemy’s theorem. This leads to the invalidity of Fermat’s principle.
Leibniz proposed the ‘Most Determined Path Principle’ in the seventeenth century. According to it, ‘ease’ of travel is the end purpose of motion. Using this principle and his calculus method, he demonstrated Snell’s laws of reflection and refraction. This method shows that light follows an extremal (local minimum or maximum) time path in going from one point to another, either directly along a straight-line path or along a broken-line path when it undergoes reflection or refraction at plane or spherical (concave or convex) surfaces. The extremal time path avoided the criticism to which Fermat’s least time path was subjected by Cartesians, who cited examples of reflections at spherical surfaces where light took the path of longest time. It thereby became the standard method of demonstration of Snell’s laws. Ptolemy’s theorem is a fundamental theorem in geometry. A special case of it offers a method of finding the minimum sum of the distances of a point from two given fixed points. We show in this paper that Leibniz’s calculus proof of Snell’s laws violates Ptolemy’s theorem, whereby Leibniz’s proof becomes invalid.
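For reference, the standard least-time derivation of refraction that these two papers dispute can be checked numerically. The geometry and speeds below are our own choices; the sketch minimizes travel time across a flat interface and recovers the constant sine ratio sin(i)/sin(r) = v1/v2.

```python
# Least-time refraction at a flat interface (y = 0): minimize total
# travel time over the crossing point x and verify Snell's ratio.
from math import sqrt

v1, v2 = 1.0, 0.75                   # speeds above and below the interface
src, dst = (0.0, 1.0), (1.0, -1.0)   # endpoints on either side of y = 0

def travel_time(x):
    """Total time via the interface point (x, 0)."""
    d1 = sqrt((x - src[0]) ** 2 + src[1] ** 2)
    d2 = sqrt((dst[0] - x) ** 2 + dst[1] ** 2)
    return d1 / v1 + d2 / v2

# travel_time is convex in x, so a ternary search finds the minimum.
lo, hi = 0.0, 1.0
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if travel_time(m1) < travel_time(m2):
        hi = m2
    else:
        lo = m1
x = (lo + hi) / 2

# Sines of the angles of incidence and refraction (from the normal).
sin_i = (x - src[0]) / sqrt((x - src[0]) ** 2 + src[1] ** 2)
sin_r = (dst[0] - x) / sqrt((dst[0] - x) ** 2 + dst[1] ** 2)
assert abs(sin_i / sin_r - v1 / v2) < 1e-6  # Snell's constant ratio
```

This is only the textbook computation, stated so the claim under dispute is concrete; the papers’ argument is that the underlying variational principle conflicts with Ptolemy’s theorem.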