Jury theorems are mathematical theorems about the ability of collectives to make correct decisions. Several jury theorems carry the optimistic message that, in suitable circumstances, ‘crowds are wise’: many individuals together (using, for instance, majority voting) tend to make good decisions, outperforming fewer or just one individual. Jury theorems form the technical core of epistemic arguments for democracy, and provide probabilistic tools for reasoning about the epistemic quality of collective decisions. The popularity of jury theorems spans various disciplines such as economics, political science, philosophy, and computer science. This entry reviews and critically assesses a variety of jury theorems. It first discusses Condorcet's initial jury theorem, and then progressively introduces jury theorems with more appropriate premises and conclusions. It explains the philosophical foundations, and relates jury theorems to diversity, deliberation, shared evidence, shared perspectives, and other phenomena. It finally connects jury theorems to their historical background and to democratic theory, social epistemology, and social choice theory.
We give a review and critique of jury theorems from a social-epistemology perspective, covering Condorcet’s (1785) classic theorem and several later refinements and departures. We assess the plausibility of the conclusions and premises featuring in jury theorems and evaluate the potential of such theorems to serve as formal arguments for the ‘wisdom of crowds’. In particular, we argue (i) that there is a fundamental tension between voters’ independence and voters’ competence, hence between the two premises of most jury theorems; (ii) that the (asymptotic) conclusion that ‘huge groups are infallible’, reached by many jury theorems, is an artifact of unjustified premises; and (iii) that the (nonasymptotic) conclusion that ‘larger groups are more reliable’, also reached by many jury theorems, is not an artifact and should be regarded as the more adequate formal rendition of the ‘wisdom of crowds’.
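The non-asymptotic claim that ‘larger groups are more reliable’ can be checked directly: under Condorcet's classical premises (independent votes, a common competence p > 1/2), the probability of a correct majority is a simple binomial sum. A minimal Python sketch, with the competence level 0.6 chosen purely for illustration:

```python
from math import comb

def majority_correct_prob(n, p):
    """Probability that a simple majority of n independent voters, each
    correct with probability p, picks the correct one of two options.
    Assumes n is odd, so ties cannot occur."""
    k = n // 2 + 1  # smallest number of correct votes that forms a majority
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

for n in (1, 11, 101, 1001):
    print(n, round(majority_correct_prob(n, 0.6), 4))
```

With independence and fixed competence the accuracy grows monotonically in n and approaches 1, which is exactly the asymptotic ‘infallibility’ conclusion the abstract above criticizes as resting on unjustified premises.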
This paper generalises the classical Condorcet jury theorem from majority voting over two options to plurality voting over multiple options. The paper further discusses the debate between epistemic and procedural democracy and situates its formal results in that debate. The paper finally compares a number of different social choice procedures for many-option choices in terms of their epistemic merits. An appendix explores the implications of some of the present mathematical results for the question of how probable majority cycles (as in Condorcet's paradox) are in large electorates.
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that: whether a premise is justified depends on the notion of probability considered; none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
Democratic decision-making is often defended on grounds of the ‘wisdom of crowds’: decisions are more likely to be correct if they are based on many independent opinions, or so a typical argument in social epistemology goes. But what does it mean to have independent opinions? Opinions can be probabilistically dependent even if individuals form their opinion in causal isolation from each other. We distinguish four probabilistic notions of opinion independence. Which of them holds depends on how individuals are causally affected by environmental factors such as commonly perceived evidence. In a general theorem, we identify causal conditions guaranteeing each kind of opinion independence. These results have implications for whether and how ‘wisdom of crowds’ arguments are possible, and how truth-conducive institutions can be designed.
One might think that if the majority of virtue signallers judge that a proposition is true, then there is significant evidence for the truth of that proposition. Given the Condorcet Jury Theorem, individual virtue signallers need not be very reliable for the majority judgment to be very likely to be correct. Thus, even people who are skeptical of the judgments of individual virtue signallers should think that if a majority of them judge that a proposition is true, then that provides significant evidence that the proposition is true. We argue that this is mistaken. Various empirical studies converge on the following point: humans are very conformist in the contexts in which virtue signalling occurs. And stereotypical virtue signallers are even more conformist in such contexts. So we should be skeptical of the claim that virtue signallers are sufficiently independent for the Condorcet Jury Theorem to apply. We do not seek to decisively rule out the relevant application of the Condorcet Jury Theorem. But we do show that careful consideration of the available evidence should make us very skeptical of that application. Consequently, a defense of virtue signalling would need to engage with these findings and show that despite our strong tendencies for conformism, our judgments are sufficiently independent for the Condorcet Jury Theorem to apply. This suggests new directions for the debate about the epistemology of virtue signalling.
My aim in this paper is to explain what Condorcet’s jury theorem is, and to examine its central assumptions, its significance to the epistemic theory of democracy and its connection with Rousseau’s theory of general will. In the first part of the paper I will analyze an epistemic theory of democracy and explain how its connection with Condorcet’s jury theorem is twofold: the theorem is at the same time a contributing historical source, and the model used by the authors to this day. In the second part I will specify the purposes of the theorem itself, and examine its underlying assumptions. The third part will address an interpretation of Rousseau’s theory, which is given by Grofman and Feld relying on Condorcet’s jury theorem, and criticisms of that interpretation. In the fourth, and last, part I will focus on one particular assumption of Condorcet’s theorem, which proves to be especially problematic if we would like to apply the theorem under real-life conditions; namely, the assumption that voters choose between two options only.
Under the independence and competence assumptions of Condorcet’s classical jury model, the probability of a correct majority decision converges to certainty as the jury size increases, a seemingly unrealistic result. Using Bayesian networks, we argue that the model’s independence assumption requires that the state of the world (guilty or not guilty) is the latest common cause of all jurors’ votes. But often – arguably in all courtroom cases and in many expert panels – the latest such common cause is a shared ‘body of evidence’ observed by the jurors. In the corresponding Bayesian network, the votes are direct descendants not of the state of the world, but of the body of evidence, which in turn is a direct descendant of the state of the world. We develop a model of jury decisions based on this Bayesian network. Our model permits the possibility of misleading evidence, even for a maximally competent observer, which cannot easily be accommodated in the classical model. We prove that (i) the probability of a correct majority verdict converges to the probability that the body of evidence is not misleading, a value typically below 1; (ii) depending on the required threshold of ‘no reasonable doubt’, it may be impossible, even in an arbitrarily large jury, to establish guilt of a defendant ‘beyond any reasonable doubt’.
The contemporary theory of epistemic democracy often draws on the Condorcet Jury Theorem to formally justify the ‘wisdom of crowds’. But this theorem is inapplicable in its current form, since one of its premises – voter independence – is notoriously violated. This premise carries responsibility for the theorem's misleading conclusion that ‘large crowds are infallible’. We prove a more useful jury theorem: under defensible premises, ‘large crowds are fallible but better than small groups’. This theorem rehabilitates the importance of deliberation and education, which appear inessential in the classical jury framework. Our theorem is related to Ladha's (1993) seminal jury theorem for interchangeable (‘indistinguishable’) voters based on de Finetti's Theorem. We also prove a more general and simpler such jury theorem.
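The exchangeable-voter setting invoked here can be illustrated numerically: by de Finetti's theorem, exchangeable votes behave as if a common competence level p were drawn first, with votes independent conditional on p. In the sketch below the distribution of p is an arbitrary illustrative choice (not Ladha's own specification); majority accuracy then converges to the probability that p exceeds 1/2, not to 1:

```python
import random

def exchangeable_majority(n_voters, trials=20000, seed=1):
    """Majority accuracy for exchangeable (interchangeable) voters.
    Per de Finetti, model the votes as i.i.d. conditional on a common
    competence p drawn afresh each trial; here p is uniform over
    {0.45, 0.55, 0.65}, an illustrative assumption only."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        p = rng.choice([0.45, 0.55, 0.65])
        correct_votes = sum(rng.random() < p for _ in range(n_voters))
        if correct_votes > n_voters / 2:
            correct += 1
    return correct / trials

for n in (1, 11, 101):
    print(n, exchangeable_majority(n))
```

Accuracy rises with group size but levels off near P(p > 1/2) = 2/3 under this toy distribution: large crowds are fallible but still better than small groups, matching the non-asymptotic conclusion of the abstract.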
It has been argued that an epistemically rational agent’s evidence is subjectively mediated through some rational epistemic standards, and that there are incompatible but equally rational epistemic standards available to agents. This supports Permissiveness, the view according to which one or multiple fully rational agents are permitted to take distinct incompatible doxastic attitudes towards P (relative to a body of evidence). In this paper, I argue that the above claims entail the existence of a unique and more reliable epistemic standard. My strategy relies on Condorcet’s Jury Theorem. This gives rise to an important problem for those who argue that epistemic standards are permissive, since the reliability criterion is incompatible with such a type of Permissiveness.
In his 2010 paper “Philosophical Naturalism and Intuitional Methodology”, Alvin I. Goldman invokes the Condorcet Jury Theorem in order to defend the reliability of intuitions. The present note argues that the original conditions of the theorem are all unrealistic when analysed in connection to the case of intuitions. Alternative conditions are discussed.
(This is for the Cambridge Handbook of Analytic Philosophy, edited by Marcus Rossberg) In this handbook entry, I survey the different ways in which formal mathematical methods have been applied to philosophical questions throughout the history of analytic philosophy. I consider: formalization in symbolic logic, with examples such as Aquinas’ third way and Anselm’s ontological argument; Bayesian confirmation theory, with examples such as the fine-tuning argument for God and the paradox of the ravens; foundations of mathematics, with examples such as Hilbert’s programme and Gödel’s incompleteness theorems; social choice theory, with examples such as Condorcet’s paradox and Arrow’s theorem; ‘how possibly’ results, with examples such as Condorcet’s jury theorem and recent work on intersectionality theory; and the application of advanced mathematics in philosophy, with examples such as accuracy-first epistemology.
Can experimental philosophy help us answer central questions about the nature of moral responsibility, such as the question of whether moral responsibility is compatible with determinism? Specifically, can folk judgments in line with a particular answer to that question provide support for that answer? Based on reasoning familiar from Condorcet’s Jury Theorem, such support could be had if individual judges track the truth of the matter independently and with some modest reliability: such reliability quickly aggregates as the number of judges goes up. In this chapter, however, I argue, partly based on empirical evidence, that although non-specialist judgments might on average be more likely than not to get things right, their individual likelihoods fail to aggregate because they do not track truth with sufficient independence.
Epistemically immodest agents take their own epistemic standards to be among the most truth-conducive ones available to them. Many philosophers have argued that immodesty is epistemically required of agents, notably because being modest entails a problematic kind of incoherence or self-distrust. In this paper, I argue that modesty is epistemically permitted in some social contexts. I focus on social contexts where agents with limited cognitive capacities cooperate with each other (like juries).
Recent political developments cast doubt on the wisdom of democratic decision-making. Brexit, the Colombian people's (initial) rejection of peace with the FARC, and the election of Donald Trump suggest that the time is right to explore alternatives to democracy. In this essay, I describe and defend the epistocratic system of government which is, given current theoretical and empirical knowledge, most likely to produce optimal political outcomes—or at least better outcomes than democracy produces. To wit, we should expand the suffrage as wide as possible and weight citizens’ votes in accordance with their competence. As it turns out, the optimal system is closely related to J. S. Mill's plural voting proposal. I also explain how voters’ competences can be precisely determined, without reference to an objective standard of correctness and without generating invidious comparisons between voters.
Epistemic justifications for democracy have been offered in terms of two different aspects of decision-making: voting and deliberation, or ‘votes’ and ‘talk.’ The Condorcet Jury Theorem is appealed to as a justification in terms of votes, and the Hong-Page “Diversity Trumps Ability” result is appealed to as a justification in terms of deliberation. Both of these, however, are most plausibly construed as models of direct democracy, with full and direct participation across the population. In this paper, we explore how these results hold up if we vary the model so as to reflect the more familiar democratic structure of a representative hierarchy. We first recount extant analytic work that shows that representation inevitably weakens the voting results of the Condorcet Jury Theorem, but we question the ability of that result to shine light on real representative systems. We then show that, when we move from votes to talk, as modeled in Hong-Page, representation holds its own and even has a slight edge.
Juries, committees and experts panels commonly appraise things of one kind or another on the basis of grades awarded by several people. When everybody's grading thresholds are known to be the same, the results sometimes can be counted on to reflect the graders’ opinion. Otherwise, they often cannot. Under certain conditions, Arrow's ‘impossibility’ theorem entails that judgements reached by aggregating grades do not reliably track any collective sense of better and worse at all. These claims are made by adapting the Arrow–Sen framework for social choice to study grading in groups.
How can democratic governments be relied upon to achieve adequate political knowledge when they turn over their authority to those of no epistemic distinction whatsoever? This deep and longstanding concern is one that any proponent of epistemic conceptions of democracy must take seriously. While Condorcetian responses have recently attracted substantial interest, they are largely undermined by a fundamental neglect of agenda-setting. I argue that the apparent intractability of the problem of epistemic adequacy in democracy stems in large part from a failure to appreciate the social character of political knowledge. A social point of view brings into focus a number of vital factors that bear on our understanding of democratic epistemology and our assessment of its prospects: the essential role of inclusive deliberation, the public's agenda-setting function, institutional provisions for policy feedback, the independence of expert communities, and the knowledge-pooling powers of markets.
One branch of the contemporary debate on the legitimacy of democracy explores whether democracy offers distinctively epistemic advantages over other political alternatives. Defenders of the epistemic democracy thesis hold that democracy is instrumentally superior or comparable to other forms of political organization when it comes to obtaining various epistemic goods. In this essay I present two (groups of) arguments in favor of epistemic democracy, inspired by formal results: Condorcet's jury theorem [CJT] and the ‘diversity trumps ability’ theorem [DTA]. Despite their great appeal, I hold that these arguments cannot support that thesis: they provide no reason to regard democracy as epistemically superior (or comparable) to some non-democratic political alternatives. Instead, I suggest that, without requiring a radical change in our forms of political organization, democratic epistemology (the study of the ‘epistemic circumstances of democracy’) can offer valuable lessons about how to optimize, in our situation, institutions and decision-making procedures. To that end, I first distinguish several ways of evaluating collective decision-making procedures. I argue that, when considering them as forms of political organization, an important factor in evaluating such procedures involves factual matters regarding which one can hope to obtain or promote certain epistemic goods. In this context, I present some of the most important arguments in favor of epistemic democracy. I then collect some objections concerning the applicability of those arguments and offer independent reasons to doubt that they support the epistemic democracy thesis. Finally, I argue that democratic epistemology can play a significant role in legitimizing forms of collective organization that could be called ‘democratic’.
In this paper, we argue that computer simulations can provide valuable insights into the performance of voting methods on different collective decision problems. This could improve institutional design, even when there is no general theoretical result to support the optimality of a voting method. To support our claim, we first describe a decision problem that has not received much theoretical attention in the literature. We outline different voting methods to address that collective decision problem. Under certain criteria of assessment akin to extensions of the Condorcet Jury Theorem, we run simulations for the methods using MATLAB, in order to compare their performance under various conditions. We consider and respond to concerns about the use of simulations in the assessment of voting procedures for policymaking.
There is a substantial class of collective decision problems whose successful solution requires interdependence among decision makers at the agenda-setting stage and independence at the stage of choice. We define this class of problems and describe and apply a search-and-decision mechanism theoretically modeled in the context of honeybees and identified in earlier empirical work in biology. The honeybees’ mechanism has useful implications for mechanism design in human institutions, including courts, legislatures, executive appointments, research and development in firms, and basic research in the sciences. Our paper offers a fresh perspective on the idea of “biomimicry” in institutional design and raises the possibility of comparative institutional analysis across species.
In this study I analyse the performance of a democratic decision-making rule: the weighted majority rule. It assigns to each voter a number of votes that is proportional to her stakes in the decision. It has been shown that, for collective decisions with two options, the weighted majority rule in combination with self-interested voters maximises the common good when the latter is understood in terms of either the sum-total or prioritarian sum of the voters’ well-being. The main result of my study is that this argument for the weighted majority rule (that it maximises the common good) can be improved along the following three main lines. (1) The argument can be adapted to other criteria of the common good, such as sufficientarian, maximin, leximin or non-welfarist criteria. I propose a generic argument for the collective optimality of the weighted majority rule that works for all of these criteria. (2) The assumption of self-interested voters can be relaxed. First, common-interest voters can be accommodated. Second, even if voters are less than fully competent in judging their self-interest or the common interest, the weighted majority rule is weakly collectively optimal, that is, it almost certainly maximises the common good given large numbers of voters. Third, even for smaller groups of voters, the weighted majority rule still has some attractive features. (3) The scope of the argument can be extended to decisions with more than two options. I state the conditions under which the weighted majority rule maximises the common good even in multi-option contexts. I also analyse the possibility and the detrimental effects of strategic voting. Furthermore, I argue that self-interested voters have reason to accept the weighted majority rule.
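The two-option case of the weighted majority rule is easy to state concretely. A minimal sketch (with made-up stakes, assuming self-interested voters whose stakes equal the well-being they have at risk between the two options):

```python
def weighted_majority(stakes, votes):
    """Two-option weighted majority: voter i casts votes[i] ('A' or 'B')
    with weight stakes[i]; the option with greater total weight wins
    ('A' wins ties, an arbitrary convention for this sketch)."""
    w_a = sum(s for s, v in zip(stakes, votes) if v == 'A')
    w_b = sum(s for s, v in zip(stakes, votes) if v == 'B')
    return 'A' if w_a >= w_b else 'B'

# Illustrative example: option A gives voter 0 a well-being gain of 5;
# option B gives voters 1 and 2 a gain of 1 each. Self-interested voters
# vote their interest, and each stake equals the voter's gain at risk.
stakes = [5, 1, 1]
votes = ['A', 'B', 'B']
print(weighted_majority(stakes, votes))                      # stake-weighted outcome
print('A' if votes.count('A') > votes.count('B') else 'B')   # simple-majority outcome
```

In this example the weighted rule selects A, the option that maximises sum-total well-being (5 versus 2), whereas an unweighted simple majority would select B; this is the basic sense in which the rule tracks the common good under the stated assumptions.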
Faced with collective decision problems of a certain complexity, different voting methods can be regarded as equally democratic. Given this situation, I argue that it is possible to investigate which of these methods produce better epistemic results regarding factual matters. I begin by illustrating the relationship between democracy and voting methods with a simple example. I show how the use of idealized models makes it possible to discover certain properties of voting methods; several of these discoveries show that, for problems of a certain complexity, there is no clear answer as to what the outcome of a democratic election is. In light of this, I suggest that we should take into account an instrumental epistemic feature of various voting methods: their capacity to generate correct answers across a range of situations. This insight offers important lessons for the design of electoral institutions.
Suppose the members of a group (e.g., committee, jury, expert panel) each form a judgment on which worlds in a given set are possible, subject to the constraint that at least one world is possible but not all are. The group seeks to aggregate these individual judgments into a collective judgment, subject to the same constraint. I show that no judgment aggregation rule can solve this problem in accordance with three conditions: "unanimity," "independence" and "non-dictatorship." Although the result is a variant of an existing theorem on "group identification" (Kasher and Rubinstein, Logique et Analyse 160:385–395, 1997), the aggregation of judgments on which worlds are possible (or permissible, desirable, etc.) appears not to have been studied yet. The result challenges us to take a stance on which of its conditions to relax.
Representation theorems are often taken to provide the foundations for decision theory. First, they are taken to characterize degrees of belief and utilities. Second, they are taken to justify two fundamental rules of rationality: that we should have probabilistic degrees of belief and that we should act as expected utility maximizers. We argue that representation theorems cannot serve either of these foundational purposes, and that recent attempts to defend the foundational importance of representation theorems are unsuccessful. As a result, we should reject these claims, and lay the foundations of decision theory on firmer ground.
Any intermediate propositional logic (i.e., a logic including intuitionistic logic and contained in classical logic) can be extended to a calculus with epsilon- and tau-operators and critical formulas. For classical logic, this results in Hilbert’s ε-calculus. The first and second ε-theorems for classical logic establish conservativity of the ε-calculus over its classical base logic. It is well known that the second ε-theorem fails for the intuitionistic ε-calculus, as prenexation is impossible. The paper investigates the effect of adding critical ε- and τ-formulas and using the translation of quantifiers into ε- and τ-terms to intermediate logics. It is shown that conservativity over the propositional base logic also holds for such intermediate ετ-calculi. The “extended” first ε-theorem holds if the base logic is finite-valued Gödel-Dummett logic, fails otherwise, but holds for certain provable formulas in infinite-valued Gödel logic. The second ε-theorem also holds for finite-valued first-order Gödel logics. The methods used to prove the extended first ε-theorem for infinite-valued Gödel logic suggest applications to theories of arithmetic.
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously “Arrow’s Theorem”. Once the analogy between amalgamating evidence and amalgamating preferences is tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
A proof of Fermat’s last theorem is demonstrated. It is very brief, simple, elementary, and absolutely arithmetical. The necessary premises for the proof are only: the three definitive properties of the relation of equality (identity, symmetry, and transitivity), modus tollens, axiom of induction, the proof of Fermat’s last theorem in the case of.
The standard representation theorem for expected utility theory tells us that if a subject’s preferences conform to certain axioms, then she can be represented as maximising her expected utility given a particular set of credences and utilities—and, moreover, that having those credences and utilities is the only way that she could be maximising her expected utility. However, the kinds of agents these theorems seem apt to tell us anything about are highly idealised, being always probabilistically coherent with infinitely precise degrees of belief and full knowledge of all a priori truths. Ordinary subjects do not look very rational when compared to the kinds of agents usually talked about in decision theory. In this paper, I will develop an expected utility representation theorem aimed at the representation of those who are neither probabilistically coherent, logically omniscient, nor expected utility maximisers across the board—that is, agents who are frequently irrational. The agents in question may be deductively fallible, have incoherent credences, limited representational capacities, and fail to maximise expected utility for all but a limited class of gambles.
This paper begins with a puzzle regarding Lewis' theory of radical interpretation. On the one hand, Lewis convincingly argued that the facts about an agent's sensory evidence and choices will always underdetermine the facts about her beliefs and desires. On the other hand, we have several representation theorems—such as those of (Ramsey 1931) and (Savage 1954)—that are widely taken to show that if an agent's choices satisfy certain constraints, then those choices can suffice to determine her beliefs and desires. In this paper, I will argue that Lewis' conclusion is correct: choices radically underdetermine beliefs and desires, and representation theorems provide us with no good reasons to think otherwise. Any tension with those theorems is merely apparent, and relates ultimately to the difference between how 'choices' are understood within Lewis' theory and the problematic way that they're represented in the context of the representation theorems. For the purposes of radical interpretation, representation theorems like Ramsey's and Savage's just aren't very relevant after all.
The scientificity of research should be evaluated according to the methodology used in the study. In practice, however, it is usually whole research areas or institutions that get classified as scientific or non-scientific. For various reasons, it may turn out that the scientific institutions are not producing science, while the “non-scientists” are doing real science. In the extreme case, the official science system is entirely corrupt, consisting of fraudsters, while the real scientists have been expelled from academic institutions. Since 2016-2017, there has been much talk about the “post-truth era” and the politicians who are “denying science”. At the same time, however, many complaints about the corruption of science have appeared. The outsider cannot tell who is telling the truth, as it may be the case that the science fraudsters are defending themselves and these politicians are aware of the corruption. It is also untrue that the censoring or suppression of science started in 2016-2017. Suppression of science for political and ideological reasons was present long ago, and during the last few years it has been increasing. The picture is highly complicated, as there are many pretenders, false accusations, etc. For example, for political reasons someone may be set up as a pseudoscientist, a real scientist may be expelled using political accusations, justified criticism may be labelled as political pressure, etc. There is something like an inner information war ongoing in and around science. The classical philosophy of science seems unable to handle it, because every formal rule can be misapplied. Science, as a whole, may be unable to persist.
What is the Quid Juris in Kant's Deduction? Chapter 3 from my book on the Deduction (Kant's Deduction From Apperception) provides an answer to that question, and also contains an extensive discussion of the relevant literature on this topic (Henrich, Proops, Seeberg & Longuenesse).
The aggregation of individual judgments over interrelated propositions is an emerging field of social choice theory. I introduce several independence conditions on judgment aggregation rules, each of which protects against a specific type of manipulation by agenda setters or voters. I derive impossibility theorems whereby these independence conditions are incompatible with certain minimal requirements. Unlike earlier impossibility results, the main result here holds for any (non-trivial) agenda. However, independence conditions arguably undermine the logical structure of judgment aggregation. I therefore suggest restricting independence to premises, which leads to a generalised premise-based procedure. This procedure is proven to be possible if the premises are logically independent.
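The contrast between proposition-wise majority voting and a premise-based procedure can be illustrated with the familiar "discursive dilemma". The agenda {p, q, p∧q} and the three-judge profile below are illustrative assumptions for this sketch, not an example taken from the paper itself:

```python
# Toy illustration of judgment aggregation on the agenda {p, q, p&q}.
# Proposition-wise majority voting can produce a logically inconsistent
# collective judgment set, while the premise-based procedure votes only
# on the premises p and q and derives the conclusion by logic.

def majority(votes):
    """True iff a strict majority of the Boolean votes is True."""
    return sum(votes) > len(votes) / 2

# Each judge is individually consistent: her view on p&q follows
# logically from her views on p and q.
judges = [
    {"p": True,  "q": True},   # accepts p&q
    {"p": True,  "q": False},  # rejects p&q
    {"p": False, "q": True},   # rejects p&q
]

# Proposition-wise majority on all three agenda items.
maj_p = majority([j["p"] for j in judges])
maj_q = majority([j["q"] for j in judges])
maj_conj = majority([j["p"] and j["q"] for j in judges])

# Premise-based procedure: majority on the premises, conclusion inferred.
pb_conj = maj_p and maj_q

print(maj_p, maj_q, maj_conj)  # True True False -> inconsistent set
print(pb_conj)                 # True -> consistent with the premises
```

The collective set {p, q, not-(p∧q)} returned by proposition-wise majority is inconsistent even though every individual judge is consistent; the premise-based procedure avoids this by construction.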
We note that a plural version of logicism about arithmetic is suggested by the standard reading of Hume's Principle in terms of 'the number of Fs/Gs'. We lay out the resources needed to prove a version of Frege's principle in plural, rather than second-order, logic. We sketch a proof of the theorem and comment philosophically on the result, which sits well with a metaphysics of natural numbers as plural properties.
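In its usual second-order form, Hume's Principle can be stated as follows (a standard textbook rendering given for orientation, not a formula quoted from the paper; the plural version would replace the second-order variables F, G with plural terms):

```latex
% Hume's Principle: the number of the Fs equals the number of the Gs
% iff the Fs and the Gs are equinumerous.
\forall F\,\forall G\;\bigl(\#F = \#G \;\leftrightarrow\; F \approx G\bigr)
% where $F \approx G$ abbreviates the claim that there is a
% one-to-one correspondence between the Fs and the Gs.
```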
Pettit (2012) presents a model of popular control over government, according to which it consists in the government being subject to those policy-making norms that everyone accepts. In this paper, I provide a formal statement of this interpretation of popular control, which illuminates its relationship to other interpretations of the idea with which it is easily conflated, and which gives rise to a theorem similar to the famous Gibbard–Satterthwaite theorem. The theorem states that if government policy is subject to popular control, as Pettit interprets it, and policy responds positively to changes in citizens' normative attitudes, then there is a single individual whose normative attitudes unilaterally determine policy. I use the model and theorem as an illustrative example to discuss the role of mathematics in normative political theory.
This paper deals with propositional calculi with strong negation (N-logics) in which the Craig interpolation theorem holds. N-logics are defined to be axiomatic strengthenings of the intuitionistic calculus enriched with a unary connective called strong negation. There exists a continuum of N-logics, but the Craig interpolation theorem holds in only 14 of them.
Our conscious minds exist in the Universe; therefore they should be identified with physical states that are subject to physical laws. In classical theories of mind, mental states are identified with brain states that satisfy the deterministic laws of classical mechanics. This approach, however, leads to insurmountable paradoxes such as epiphenomenal minds and illusory free will. Alternatively, one may identify mental states with quantum states realized within the brain and try to resolve the above paradoxes using the standard Hilbert space formalism of quantum mechanics. In this essay, we first show that the identification of mind states with quantum states within the brain is biologically feasible, and then, elaborating on the mathematical proofs of two quantum mechanical no-go theorems, we explain why quantum theory might have profound implications for the scientific understanding of one's mental states, self-identity, beliefs and free will.
The text is a continuation of the article of the same name published in the previous issue of Philosophical Alternatives. The philosophical interpretations of the Kochen–Specker theorem (1967) are considered. Einstein's principle regarding the “consubstantiality of inertia and gravity” (1918) allows a parallel between descriptions of a physical micro-entity in relation to the macro-apparatus, on the one hand, and of physical macro-entities in relation to astronomical mega-entities, on the other. The Bohmian interpretation (1952) of quantum mechanics proposes that all quantum systems be interpreted as dissipative ones and that the theorem be understood accordingly. The conclusion is that the continual representation of a system, by a force or (gravitational) field between parts interacting by means of it, is equivalent to the mutual entanglement of those parts if the representation is discrete. Gravity (force field) and entanglement are two different, correspondingly continual and discrete, images of a single common essence. General relativity can be interpreted as a superluminal generalization of special relativity. There exists a postulate in science and philosophy of an alleged obligatory difference between a model and reality. It can also be deduced by interpreting a corollary of the theorem. On the other hand, quantum mechanics, on the basis of this theorem and of von Neumann's (1932), introduces the option that a model be entirely identified with the modeled reality and, therefore, that absolute reality be recognized: this is a non-standard hypothesis in the epistemology of science. Thus true reality begins to be understood mathematically, i.e. in a Pythagorean manner, through its identification with its mathematical model. A few linked problems are highlighted: the role of the axiom of choice for correctly interpreting the theorem; whether the theorem can be considered an axiom; and whether the theorem can be considered equivalent to the negation of the axiom.
How, if at all, should race figure in criminal trials with a jury? How far should attorneys be allowed or encouraged to probe the racial sensitivities of jurors, and what does this mean for the appropriate way to present cases which involve racial profiling and are therefore likely to pit the words and actions of a white policeman against those of a young black man?
In the context of EPR–Bohm type experiments and spin detections confined to spacelike hypersurfaces, a local, deterministic and realistic model within a Friedmann–Robertson–Walker spacetime with a constant spatial curvature (S^3) is presented that describes simultaneous measurements of the spins of two fermions emerging in a singlet state from the decay of a spinless boson. Exact agreement with the probabilistic predictions of quantum theory is achieved in the model without data rejection, remote contextuality, superdeterminism or backward causation. A singularity-free Clifford-algebraic representation of S^3 with vanishing spatial curvature and non-vanishing torsion is then employed to transform the model into a more elegant form. Several event-by-event numerical simulations of the model are presented, which confirm our analytical results to an accuracy of 4 parts in 10^4. Possible implications of our results for practical applications such as quantum security protocols and quantum computing are briefly discussed.
The proliferation of criminal laws in different legal systems has made legal practitioners and scholars deliberate upon the present-day relevance of age-old principles and concepts. The maxim ignorantia juris non excusat (hereinafter ignorantia juris) also falls into this category. The application of criminal law is said to rest on the maxim ignorantia juris, meaning ignorance of the law is no excuse. The application of the maxim has from time immemorial been defended on grounds of convenience, utility, and community interests. At the same time, it has been challenged on grounds of legality and morality. The application of the maxim in criminal matters is secured through penal laws. These laws are further validated through judicial pronouncements. Penal statutes and judicial pronouncements have largely been the basis for determining the exceptions to and scope of the maxim. Breaking it down to a single case at hand, several factors come into the picture, including the nature of the crime in question, the mental state of the accused, the conduct of the accused, the objective or public utility of the criminal legislation or provision, and the reasons or justifications offered for its rigid or qualified application. The single case is viewed in light of the general provisions of the criminal law of the legal system in which it is being tried. The uniqueness of each case and the differing criminal statutes of legal systems make it difficult to generalize or draw conclusions about the true import and application of the maxim ignorantia juris.
According to a long-standing philosophical tradition, impartiality is a distinctive and determining feature of moral judgments, especially in matters of distributive justice. This broad ethical tradition was revived in welfare economics by Vickrey and, above all, Harsanyi, in the form of the so-called Impartial Observer Theorem. The paper offers an analytical reconstruction of this argument and a stepwise philosophical critique of its premises. It eventually provides a new formal version of the theorem based on subjective probability.
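Harsanyi's impartial-observer argument can be summarized in a single formula (a standard textbook rendering given here for orientation, not the paper's own formalism): an observer who imagines having an equal chance of being any of the n individuals evaluates a social alternative x by the expected utility of that lottery over positions.

```latex
% The impartial observer's evaluation of alternative x: an equal-chance
% lottery over the n individual positions yields average utilitarianism.
W(x) \;=\; \sum_{i=1}^{n} \frac{1}{n}\, U_i(x)
% where $U_i(x)$ is the von Neumann--Morgenstern utility of occupying
% individual $i$'s position under alternative $x$.
```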
The determinism–free will debate is perhaps as old as philosophy itself and has been engaged in from a great variety of points of view, including those of a scientific, theological, and logical character. This chapter focuses on two arguments from logic. First, there is an argument in support of determinism that dates back to Aristotle, if not earlier. It rests on acceptance of the Law of Excluded Middle, according to which every proposition is either true or false, no matter whether the proposition is about the past, present or future. In particular, the argument goes, whatever one does or does not do in the future is determined in the present by the truth or falsity of the corresponding proposition. The second argument from logic is much more modern and appeals to Gödel's incompleteness theorems to make the case against determinism and in favour of free will, insofar as that applies to the mathematical potentialities of human beings. The claim, more precisely, is that as a consequence of the incompleteness theorems, those potentialities cannot be exactly circumscribed by the output of any computing machine, even allowing unlimited time and space for its work. The chapter concludes with some new considerations that may be in favour of a partial mechanist account of the mathematical mind.
If the concept of “free will” is reduced to that of “choice”, the whole physical world shares the latter quality. Nevertheless, free will can be distinguished from choice: free will implicitly involves a certain preliminary goal, while choice is only the means by which the goal can be achieved, or not, by the one who determines it. Thus, for example, an electron always has a choice but no free will, unlike a human, who possesses both. Consequently, and paradoxically, the determinism of classical physics is more subjective and more anthropomorphic than the indeterminism of quantum mechanics, for the former implicitly presupposes a certain deterministic goal, following the model of human free-will behavior. Choice is usually linked to very complicated systems such as the human brain or society, and is even often associated with consciousness. Against this background, the material world is deterministic and absolutely devoid of choice. However, quantum mechanics introduces choice into the foundation of the physical world, in the only way in which it can exist: everything exists in the “phase transition” of the present between the uncertain future and the well-ordered past. Thus the present is forced to choose in order to be able to transform the coherent state of the future into the well-ordering of the past. The concept of choice seems to suggest that there is someone who chooses. However, quantum mechanics involves a generalized case of choice, which can be called “subjectless”: there is a certain choice that originates from the transition of the future into the past. That kind of choice is shared by all that exists and does not need any subject: it can be considered a law of nature. There are a few theorems in quantum mechanics directly relevant to the topic: two of them are called “free will theorems” by their authors, Conway and Kochen, who write: “Do we really have free will, or, as a few determined folk maintain, is it all an illusion?
We don’t know, but will prove in this paper that if indeed there exist any experimenters with a modicum of free will, then elementary particles must have their own share of this valuable commodity.” And further: “The import of the free will theorem is that it is not only current quantum theory, but the world itself that is non-deterministic, so that no future theory can return us to a clockwork universe.” These theorems can be considered a continuation of the so-called theorems about the absence of “hidden variables” in quantum mechanics.