In formal epistemology, we use mathematical methods to explore the questions of epistemology and rational choice. What can we know? What should we believe, and how strongly? How should we act based on our beliefs and values? We begin by modelling phenomena like knowledge, belief, and desire using mathematical machinery, just as a biologist might model the fluctuations of a pair of competing populations, or a physicist might model the turbulence of a fluid passing through a small aperture. Then we explore, discover, and justify the laws governing those phenomena, using the precision that mathematical machinery affords. For example, we might represent a person by the strengths of their beliefs, and we might measure these using real numbers, which we call credences. Having done this, we might ask what norms govern that person when we represent them in that way. How should those credences hang together? How should the credences change in response to evidence? And how should those credences guide the person’s actions? This is the approach of the first six chapters of this handbook. In the second half, we consider different representations—the set of propositions a person believes; their ranking of propositions by their plausibility. And in each case we ask again what norms govern a person so represented. Or we might represent them as having both credences and full beliefs, and then ask how those two representations should interact with one another. This handbook is incomplete, as such ventures often are. Formal epistemology is a much wider topic than we present here. One omission, for instance, is social epistemology, where we consider not only individual believers but also the epistemic aspects of their place in a social world. Michael Caie’s entry on doxastic logic touches on one part of this topic, but there is much more. Relatedly, there is no entry on epistemic logic, nor any on knowledge more generally. There are still more gaps. These omissions should not be taken as ideological choices. This material is missing not because it is any less valuable or interesting, but because we failed to secure it in time. Rather than delay publication further, we chose to go ahead with what is already a substantial collection. We anticipate a further volume in the future that will cover more ground. Why an open access handbook on this topic? A number of reasons. The topics covered here are large and complex and need the space allowed by the sort of 50-page treatment that many of the authors give. We also wanted to show that, using free and open software, one can overcome a major hurdle facing open access publishing, even on topics with complex typesetting needs. With the right software, one can produce attractive, clear publications at reasonably low cost. Indeed, this handbook was created on a budget of exactly £0 (≈ $0). Our thanks to PhilPapers for serving as publisher, and to the authors: we are enormously grateful for the effort they put into their entries.
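To make the preface’s talk of credences “hanging together” concrete, here is the standard probabilist rendering of those norms, offered as a sketch rather than as the handbook’s official formulation:

    % Credences form a function c from propositions to real numbers in [0, 1].
    % Probabilism: c should obey the probability axioms, e.g.
    \[
    c(\top) = 1, \qquad c(A \vee B) = c(A) + c(B) \quad \text{for incompatible } A, B.
    \]
    % Conditionalization: on learning exactly E (with c(E) > 0), the new
    % credence function should be
    \[
    c_{\mathrm{new}}(A) = c(A \mid E) = \frac{c(A \wedge E)}{c(E)}.
    \]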
Formal epistemology is just what it sounds like: epistemology done with formal tools. Coinciding with the general rise in popularity of experimental philosophy, formal epistemologists have begun to apply experimental methods in their own work. In this entry, I survey some of the work at the intersection of formal and experimental epistemology. I show that experimental methods have unique roles to play when epistemology is done formally, and I highlight some ways in which results from formal epistemology have been used fruitfully to advance epistemically relevant experimental work. The upshot of this brief, incomplete survey is that formal and experimental methods often constitute mutually informative means to epistemological ends.
This paper reviews the central points and presents some recent developments of the epistemic approach to paraconsistency in terms of the preservation of evidence. Two formal systems are surveyed, the basic logic of evidence (BLE) and the logic of evidence and truth (LET_J), designed to deal, respectively, with evidence and with evidence and truth. While BLE is equivalent to Nelson’s logic N4, it was conceived for a different purpose. Adequate valuation semantics that provide decidability are given for both BLE and LET_J. The meanings of the connectives of BLE and LET_J, from the point of view of preservation of evidence, are explained with the aid of an inferential semantics. A formalization of the notion of evidence for BLE as proposed by M. Fitting is also reviewed here. As a novel result, the paper shows that LET_J is semantically characterized through the so-called Fidel structures. Some opportunities for further research are also discussed.
There are two fundamentally distinct kinds of biological theorizing. "Formal biology" focuses on the relations, captured in formal laws, among mathematically abstracted properties of abstract objects. Population genetics and theoretical mathematical ecology, which are cases of formal biology, thus share methods and goals with theoretical physics. "Compositional biology," on the other hand, is concerned with articulating the concrete structure, mechanisms, and function, through developmental and evolutionary time, of material parts and wholes. Molecular genetics, biochemistry, developmental biology, and physiology, which are examples of compositional biology, are in serious need of philosophical attention. For example, the very concept of a "part" is understudied in both philosophy of biology and philosophy of science. My dissertation is an attempt to clarify the distinction between formal biology and compositional biology and, in so doing, provide a clear philosophical analysis, with case studies, of compositional biology. Given the social, economic, and medical importance of compositional biology, understanding it is urgent. For my investigation, I draw on the philosophical fields of metaphysics and epistemology, as well as philosophy of biology and philosophy of science. I suggest new ways of thinking about some classic philosophy of science issues, such as modeling, laws of nature, abstraction, explanation, and confirmation. I hint at the relevance of my study of two kinds of biological theorizing to debates concerning the disunity of science.
We consider the complex interactions between rape culture and epistemology. A central case study is the consideration of a deferential attitude about the epistemology of sexual assault testimony. According to the deferential attitude, individuals and institutions should decline to act on allegations of sexual assault unless and until they are proven in a formal setting, i.e., a criminal court. We attack this deference from several angles, including the pervasiveness of rape culture in the criminal justice system, the epistemology of testimony and norms connecting knowledge and action, the harms of tacit idealizations away from important contextual factors, and a contextualist semantics for 'knows' ascriptions.
The Duhem-Quine Thesis is the claim that it is impossible to test a scientific hypothesis in isolation because any empirical test requires assuming the truth of one or more auxiliary hypotheses. This is taken by many philosophers, and is assumed here, to support the further thesis that theory choice is underdetermined by empirical evidence. This inquiry is focused strictly on the axiological commitments engendered in solutions to underdetermination, specifically those of Pierre Duhem and W. V. Quine. Duhem resolves underdetermination by appealing to a cluster of virtues called 'good sense', and it has recently been argued by Stump (Stud Hist Philos Biol Biomed Sci, 18(1):149-159, 2007) that good sense is a form of virtue epistemology. This paper considers whether Quine, whose philosophy is heavily influenced by the very thesis that led Duhem to the virtues, is also led to a virtue epistemology in the face of underdetermination. Various sources of Quinian epistemic normativity are considered, and it is argued that, in conjunction with other normative commitments, Quine's sectarian solution to underdetermination amounts to a skills-based virtue epistemology. The paper also sketches formal features of the novel form of virtue epistemology common to Duhem and Quine that challenges the adequacy of epistemic value truth-monism and blocks any imperialist naturalization of virtue epistemology, as the epistemic virtues are essential to the success of the sciences themselves.
Can there be knowledge and rational belief in the absence of a rational degree of confidence? Yes, and cases of "mistuned knowledge" demonstrate this. In this paper we leverage this normative possibility in support of advancing our understanding of the metaphysical relation between belief and credence. It is generally assumed that a Lockean metaphysics of belief that reduces outright belief to degrees of confidence would immediately effect a unification of coarse-grained epistemology of belief with fine-grained epistemology of confidence. Scott Sturgeon has suggested that the unification is effected by understanding the relation between outright belief and confidence as an instance of the determinable-determinate relation. But determination of belief by confidence would not by itself yield the result that norms for confidence carry over to norms for outright belief unless belief and high confidence are token identical. We argue that this token-identity thesis is incompatible with the neglected phenomenon of “mistuned knowledge”—knowledge and rational belief in the absence of rational confidence. We contend that there are genuine cases of mistuned knowledge and that, therefore, epistemological unification must forego token identity of belief and high confidence. We show how partial epistemological unification can be secured given determination of outright belief by degrees of confidence even without token-identity. Finally, we suggest a direction for the pursuit of thoroughgoing epistemological unification.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose applying the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability that an agent ends up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
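A minimal Monte Carlo sketch of the second part’s setup may help; it is illustrative only (the dissertation’s model differs in detail, for instance in how agents select whom to average with, and the parameters here are invented):

    import random

    def consistent(stance):
        # A stance on (p, q, p-and-q) is consistent iff the third
        # component matches the conjunction of the first two.
        return stance[2] == (stance[0] and stance[1])

    def random_consistent_stance():
        p, q = random.randint(0, 1), random.randint(0, 1)
        return (p, q, p and q)

    def one_round(n_agents=11):
        agents = [random_consistent_stance() for _ in range(n_agents)]
        # Updating by averaging: a proposition is accepted after the round
        # iff the mean opinion across all agents reaches 0.5.
        means = [sum(a[i] for a in agents) / n_agents for i in range(3)]
        return consistent(tuple(int(m >= 0.5) for m in means))

    trials = 100_000
    inconsistent = sum(not one_round() for _ in range(trials))
    print(f"estimated probability of inconsistency: {inconsistent / trials:.4f}")

Even though every agent starts from a consistent stance, averaging can accept p and q while rejecting their conjunction, which is the kind of inconsistency whose probability the dissertation computes.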
Epistemology is the study of knowledge. This entry covers epistemology in two parts: one historical, one contemporary. The former provides a brief theological history of epistemology. The latter outlines three categories of contemporary epistemology: traditional epistemology, social epistemology, and formal epistemology, along with corresponding theological questions that arise in each.
Medical terminology collects and organizes the many different kinds of terms employed in the biomedical domain both by practitioners and also in the course of biomedical research. In addition to serving as labels for biomedical classes, these names reflect the organizational principles of biomedical vocabularies and ontologies. Some names represent invariant features (classes, universals) of biomedical reality (i.e., they are a matter for ontology). Other names, however, also convey how this reality is perceived, measured, and understood by health professionals (i.e., they belong to the domain of epistemology). We analyze terms from several biomedical vocabularies in order to throw light on the interactions between ontological and epistemological components of these terminologies. We identify four cases: 1) terms containing classification criteria, 2) terms reflecting detectability, modality, uncertainty, and vagueness, 3) terms created in order to obtain a complete partition of a given domain, and 4) terms reflecting mere fiat boundaries. We show that epistemology-loaded terms are pervasive in biomedical vocabularies, that the “classes” they name often do not comply with sound classification principles, and that they are therefore likely to cause problems in the evolution and alignment of terminologies and associated ontologies.
In the latter half of the twentieth century, philosophers of science have argued (implicitly and explicitly) that epistemically rational individuals might compose epistemically irrational groups and that, conversely, epistemically rational groups might be composed of epistemically irrational individuals. We call the conjunction of these two claims the Independence Thesis, as they together imply that methodological prescriptions for scientific communities and those for individual scientists might be logically independent of one another. We develop a formal model of scientific inquiry, define four criteria for individual and group epistemic rationality, and then prove that the four definitions diverge, in the sense that individuals will be judged rational when groups are not and vice versa. We conclude by explaining implications of the Independence Thesis for (i) descriptive history and sociology of science and (ii) normative prescriptions for scientific communities.
(This is for the Cambridge Handbook of Analytic Philosophy, edited by Marcus Rossberg) In this handbook entry, I survey the different ways in which formal mathematical methods have been applied to philosophical questions throughout the history of analytic philosophy. I consider: formalization in symbolic logic, with examples such as Aquinas’ third way and Anselm’s ontological argument; Bayesian confirmation theory, with examples such as the fine-tuning argument for God and the paradox of the ravens; foundations of mathematics, with examples such as Hilbert’s programme and Gödel’s incompleteness theorems; social choice theory, with examples such as Condorcet’s paradox and Arrow’s theorem; ‘how possibly’ results, with examples such as Condorcet’s jury theorem and recent work on intersectionality theory; and the application of advanced mathematics in philosophy, with examples such as accuracy-first epistemology.
John D. Norton is responsible for a number of influential views in contemporary philosophy of science. This paper will discuss two of them. The material theory of induction claims that inductive arguments are ultimately justified by their material features, not their formal features. Thus, while a deductive argument can be valid irrespective of the content of the propositions that make up the argument, an inductive argument about, say, apples, will be justified (or not) depending on facts about apples. The argument view of thought experiments claims that thought experiments are arguments, and that they function epistemically however arguments do. These two views have generated a great deal of discussion, although there hasn’t been much written about their combination. I argue that despite some interesting harmonies, there is a serious tension between them. I consider several options for easing this tension, before suggesting a set of changes to the argument view that I take to be consistent with Norton’s fundamental philosophical commitments, and which retain what seems intuitively correct about the argument view. These changes require that we move away from a unitary epistemology of thought experiments and towards a more pluralist position.
The notion of an ideal reasoner has several uses in epistemology. Often, ideal reasoners are used as a parameter of (maximum) rationality for finite reasoners (e.g. humans). However, the notion of an ideal reasoner is normally construed with such a high degree of idealization (e.g. infinite/unbounded memory) that this use is ill-advised. In this dissertation, I investigate the conditions under which an ideal reasoner may be used as a parameter of rationality for finite reasoners. In addition, I present and justify the research program of computational epistemology, which investigates the parameter of maximum rationality for finite reasoners using computer simulations.
The formal and empirical-generative perspectives of computation are demonstrated to be inadequate to secure the goals of simulation in the social sciences. Simulation does not resemble formal demonstrations or generative mechanisms that deductively explain how certain models are sufficient to generate emergent macrostructures of interest. The description of scientific practice implies additional epistemic conceptions of scientific knowledge. Three kinds of knowledge that account for a comprehensive description of the discipline are identified: formal, empirical and intentional knowledge. The use of formal conceptions of computation for describing simulation is refuted; the roles of programming languages according to intentional accounts of computation are identified; and the roles of iconographic programming languages and aesthetic machines in simulation are characterized. The roles that simulation and intentional decision making may be able to play in a participative information society are also discussed.
This is Book I of three philosophy books, a formal academic philosophy source in the Kurdish language. It gives an overall view of classical epistemology for university students in non-English-language philosophy departments and philosophy schools.
This is Book II of three philosophy books, a formal academic philosophy source in the Kurdish language. Written for non-English-speaking university students as a philosophy guide to epistemology and social epistemology, it serves as a resource for philosophy departments and philosophy schools.
True contradictions are taken increasingly seriously by philosophers and logicians. Yet, the belief that contradictions are always false remains deeply intuitive. This paper confronts this belief head-on by explaining in detail how one specific contradiction is true. The contradiction in question derives from Priest's reworking of Berkeley's argument for idealism. However, technical aspects of the explanation offered here differ considerably from Priest's derivation. The explanation uses novel formal and epistemological tools to guide the reader through a valid argument with, not just true, but eminently acceptable premises, to an admittedly unusual conclusion: a true contradiction. The novel formal and epistemological tools concern points of view and changes in points of view. The result is an understanding of why the contradiction is true.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians: that between logic as formal ontology and logic as formal epistemology.
The rise of experimental philosophy has placed metaphilosophical questions, particularly those concerning concepts, at the center of philosophical attention. X-phi offers empirically rigorous methods for identifying conceptual content, but what exactly it contributes towards evaluating conceptual content remains unclear. We show how x-phi complements Rudolf Carnap’s underappreciated methodology for concept determination, explication. This clarifies and extends x-phi’s positive philosophical import, and also exhibits explication’s broad appeal. But there is a potential problem: Carnap’s account of explication was limited to empirical and logical concepts, but many concepts of interest to philosophers are essentially normative. With formal epistemology as a case study, we show how x-phi-assisted explication can apply to normative domains.
We present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language. We shall defend the view according to which logics of formal inconsistency are theories of logical consequence of normative and epistemic character. This approach not only allows us to make inferences in the presence of contradictions, but offers a philosophically acceptable account of paraconsistency.
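The distinctive resource mentioned here can be illustrated compactly (a standard textbook rendering, not a quotation from the paper): logics of formal inconsistency add a unary connective ∘, reading ∘A as “A is consistent”, and recover explosion only in its presence:

    % Not valid in a logic of formal inconsistency
    % (a contradiction alone does not explode):
    \[
    A, \neg A \nvdash B
    \]
    % Valid ("gentle explosion"): a contradiction together with the
    % consistency of A entails anything:
    \[
    \circ A, A, \neg A \vdash B
    \]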
There is a long tradition in formal epistemology and in the psychology of reasoning of investigating indicative conditionals. In psychology, the propositional calculus was taken for granted to be the normative standard of reference. Experimental tasks, evaluation of the participants’ responses, and psychological model building were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue that neither logic nor standard probability theory provides appropriate rationality norms for uncertain conditionals. I advocate coherence-based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
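To illustrate the kind of result the abstract mentions, here is a sketch of the probabilistic non-informativeness of one paradox of the material conditional (“B; therefore, if A then B”) in the coherence-based setting; the formulation below is a reconstruction, not a quotation from the paper:

    % Premise: P(B) = x. Conclusion: the conditional "if A, then B".
    % Material reading: a non-trivial lower bound follows, since B entails
    % (not-A or B):
    \[
    P(A \supset B) = P(\neg A \vee B) \ge P(B) = x .
    \]
    % Conditional-probability reading: for 0 < x < 1, every value in [0, 1]
    % for P(B | A) is coherent, i.e. the tightest derivable bounds are
    \[
    0 \le P(B \mid A) \le 1 ,
    \]
    % so on this reading the inference is probabilistically non-informative.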
Applying Bernard Lonergan's (1957/1992, 1972) analysis of intentional consciousness and its concomitant epistemology, this paper highlights epistemological confusion in contemporary consciousness studies as exemplified mostly in David Chalmers's (1996) position. In ideal types, a first section outlines two epistemologies, sensate-modeled and intelligence-based, whose difference significantly explains the different positions. In subsequent sections, this paper documents the sensate-modeled epistemology in Chalmers's position and consciousness studies in general. Tellingly, this model of knowing is at odds with the formal-operational theorizing in twentieth-century science. This paper then links this epistemology with functionalism and its focus on descriptive efficient causality in external behaviors and its oversight of explanatory formal causality; highlights the theoretical incoherence of the understanding of science in the functionalist approach; connects it with the construal of consciousness as primarily intentional (i.e., directed toward an object) to the neglect of consciousness as conscious (i.e., constituted by a non-objectified self-presence); and relates this outcome to the reduction of human consciousness to animal-like perception and mechanistic interactions. A brief conclusion summarizes these multiple, subtle, and interconnected considerations and suggests how only an intellectual epistemology would be adequate to the intellectual nature of human consciousness and the world of meaning, not of mere bodies, in which humans exist.
The branch of philosophical logic which has become known as “belief change” has, in the course of its development, become alienated from its epistemological origins. However, as formal criteria do not suffice to defend a principled choice between competing systems for belief change, we do need to take their epistemological embedding into account. Here, on the basis of a detailed examination of Isaac Levi's epistemology, we argue for a new direction of belief change research and propose to construct systems for belief change that can do without, but do not rule out, selection functions, in order to enable an *empirical* assessment of the relative merits of competing belief change systems.
Prior Analytics by the Greek philosopher Aristotle (384 – 322 BCE) and Laws of Thought by the English mathematician George Boole (1815 – 1864) are the two most important surviving original logical works from before the advent of modern logic. This article has a single goal: to compare Aristotle’s system with the system that Boole constructed over twenty-two centuries later intending to extend and perfect what Aristotle had started. This comparison merits an article in itself. Accordingly, this article does not discuss many other historically and philosophically important aspects of Boole’s book, e.g. his confused attempt to apply differential calculus to logic, his misguided effort to make his system of ‘class logic’ serve as a kind of ‘truth-functional logic’, his now almost forgotten foray into probability theory, or his blindness to the fact that a truth-functional combination of equations that follows from a given truth-functional combination of equations need not follow truth-functionally. One of the main conclusions is that Boole’s contribution widened logic and changed its nature to such an extent that he fully deserves to share with Aristotle the status of being a founding figure in logic. By setting forth in clear and systematic fashion the basic methods for establishing validity and for establishing invalidity, Aristotle became the founder of logic as formal epistemology. By making the first unmistakable steps toward opening logic to the study of ‘laws of thought’—tautologies and laws such as excluded middle and non-contradiction—Boole became the founder of logic as formal ontology.
We’ll sketch the debate on testimony in social epistemology by reference to the contemporary debate on reductionism/anti-reductionism, communitarian epistemology and inferentialism. Testimony is a fundamental source of knowledge we share, and it is worth considering within a dialogical perspective, which requires a description of a formal structure which entails deontic statuses and deontic attitudes. In particular, we’ll argue for a social reformulation of the “space of reasons”, which establishes a fruitful relationship with the epistemological view of Wilfrid Sellars.
This article is primarily concerned with the articulation of a defensible position on the relevance of phenomenological analysis to the current epistemological edifice as this latter has evolved since the rupture with the classical scientific paradigm pointing to the Newtonian-Leibnizian tradition which took place around the beginning of the 20th century. My approach is generally based on the reduction of the objects-contents of natural sciences, abstracted in the form of ideal objectivities in the corresponding logical-mathematical theories, to the content of meaning-acts ultimately referring to a specific being-within-the-world experience. This is a position that finds itself in line with Husserl’s gradual departure from the psychologistic interpretations of his earlier works on the philosophy of logic and mathematics and culminates in a properly meant phenomenological foundation of natural sciences in his last major published work, namely The Crisis of European Sciences and Transcendental Phenomenology. Further, this article tries to set up a context of discourse in which to found both physical and formal objects in parallel terms as essentially temporal-noematic objects to the extent that they may be considered as invariants of the constitutional modes of a temporal consciousness.
We have recently started to understand that fundamental aspects of complex systems such as emergence, the measurement problem, inherent uncertainty, complex causality in connection with unpredictable determinism, time irreversibility and nonlocality all highlight the observer's participatory role in determining their workings. In addition, the principle of 'limited universality' in complex systems, which prompts us to search for the appropriate 'level of description in which unification and universality can be expected', looks like a version of Bohr's 'complementarity principle'. It is more or less certain that the different levels of description possible of a complex whole (actually partial objectifications) are projected onto and even redefine its constituent parts. Thus it is interesting that these fundamental complexity issues don't just bear a formal resemblance to, but reveal a profound connection with, quantum mechanics. Indeed, they point to a common origin on a deeper level of description.
How should a group with different opinions (but the same values) make decisions? In a Bayesian setting, the natural question is how to aggregate credences: how to use a single credence function to naturally represent a collection of different credence functions. An extension of the standard Dutch-book arguments that apply to individual decision-makers recommends that group credences should be updated by conditionalization. This imposes a constraint on what aggregation rules can be like. Taking conditionalization as a basic constraint, we gather lessons from the established work on credence aggregation, and extend this work with two new impossibility results. We then explore contrasting features of two kinds of rules that satisfy the constraints we articulate: one kind uses fixed prior credences, and the other uses geometric averaging, as opposed to arithmetic averaging. We also prove a new characterisation result for geometric averaging. Finally we consider applications to neighboring philosophical issues, including the epistemology of disagreement.
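One contrast between arithmetic and geometric averaging can be checked numerically. The following sketch (with made-up credences over a four-world toy space) illustrates a fact well known in this literature: pooling by geometric averaging commutes with conditionalization, while pooling by arithmetic averaging generally does not:

    import math

    # Four worlds; the evidence E is the set of worlds {0, 1}.
    P1 = [0.6, 0.2, 0.1, 0.1]
    P2 = [0.1, 0.2, 0.3, 0.4]
    E = {0, 1}

    def normalize(p):
        s = sum(p)
        return [x / s for x in p]

    def conditionalize(p, E):
        return normalize([x if i in E else 0.0 for i, x in enumerate(p)])

    def linear_pool(p, q):
        return [(x + y) / 2 for x, y in zip(p, q)]

    def geometric_pool(p, q):
        return normalize([math.sqrt(x * y) for x, y in zip(p, q)])

    for pool in (linear_pool, geometric_pool):
        pool_then_update = conditionalize(pool(P1, P2), E)
        update_then_pool = pool(conditionalize(P1, E), conditionalize(P2, E))
        agree = all(abs(x - y) < 1e-12
                    for x, y in zip(pool_then_update, update_then_pool))
        print(pool.__name__, "commutes with conditionalization:", agree)
    # linear_pool: False; geometric_pool: True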
When do we agree? The answer might once have seemed simple and obvious: we agree that p when we each believe that p. But from a formal epistemological perspective, where degrees of belief are more fundamental than beliefs, this answer is unsatisfactory. On the one hand, there is reason to suppose that it is false; degrees of belief about p might differ when beliefs simpliciter on p do not. On the other hand, even if it is true, it is too vague; for what it is to believe simpliciter ought to be explained in terms of degrees of belief. This paper presents several possible notions of agreement, and corresponding notions of disagreement. It indicates how the findings are fruitful for the epistemology of disagreement, with special reference to the notion of epistemic peerhood.
Our evidence can be about different subject matters. In fact, necessarily equivalent pieces of evidence can be about different subject matters. Does the hyperintensionality of ‘aboutness’ engender any hyperintensionality at the level of rational credence? In this paper, I present a case which seems to suggest that the answer is ‘yes’. In particular, I argue that our intuitive notions of independent evidence and inadmissible evidence are sensitive to aboutness in a hyperintensional way. We are thus left with a paradox. While there is strong reason to think that rational credence cannot make such hyperintensional distinctions, our intuitive judgements about certain cases seem to demand that it does.
Conditionalization is one of the central norms of Bayesian epistemology. But there are a number of competing formulations, and a number of arguments that purport to establish it. In this paper, I explore which formulations of the norm are supported by which arguments. In their standard formulations, each of the arguments I consider here depends on the same assumption, which I call Deterministic Updating. I will investigate whether it is possible to amend these arguments so that they no longer depend on it. As I show, whether this is possible depends on the formulation of the norm under consideration.
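For reference, here is the norm in its most familiar formulation, together with a gloss on the paper’s key assumption (the gloss is a paraphrase, not the paper’s wording):

    % Conditionalization: if c is your current credence function and you
    % learn exactly E, with c(E) > 0, your new credences should be
    \[
    c'(\cdot) = c(\cdot \mid E) = \frac{c(\cdot \wedge E)}{c(E)}.
    \]
    % Deterministic Updating, roughly: an updating plan assigns to each
    % possible evidence proposition a single posterior credence function,
    % never a chancy choice among several candidate posteriors.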
Ranking theory is a formal epistemology that has been developed in over 600 pages in Spohn's recent book The Laws of Belief, which aims to provide a normative account of the dynamics of beliefs that presents an alternative to current probabilistic approaches. It has long been received in the AI community, but it has not yet found application in experimental psychology. The purpose of this paper is to derive clear, quantitative predictions by exploiting a parallel between ranking theory and a statistical model called logistic regression. This approach is illustrated by the development of a model for the conditional inference task using Spohn's ranking theoretic approach to conditionals.
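For readers new to the framework, the basic apparatus of ranking theory can be stated compactly (a standard rendering, not a quotation from Spohn):

    % A negative ranking function kappa assigns to each proposition a
    % degree of disbelief in {0, 1, 2, ...} together with infinity, with
    \[
    \kappa(\top) = 0, \qquad \kappa(A \vee B) = \min\{\kappa(A), \kappa(B)\},
    \]
    % and conditional ranks defined by
    \[
    \kappa(B \mid A) = \kappa(A \wedge B) - \kappa(A).
    \]
    % A proposition A is believed iff kappa(not-A) > 0, i.e. iff its
    % negation is disbelieved to some positive degree.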
Some recent work in formal epistemology shows that “witness agreement” by itself implies neither an increase in the probability of truth nor a high probability of truth—the witnesses need to have some “individual credibility.” It can seem that, from this formal epistemological result, it follows that coherentist justification (i.e., doxastic coherence) is not truth-conducive. I argue that this does not follow. Central to my argument is the thesis that, though coherentists deny that there can be noninferential justification, coherentists do not deny that there can be individual credibility.
What is the relationship between degrees of belief and binary beliefs? Can the latter be expressed as a function of the former—a so-called “belief-binarization rule”—without running into difficulties such as the lottery paradox? We show that this problem can be usefully analyzed from the perspective of judgment-aggregation theory. Although some formal similarities between belief binarization and judgment aggregation have been noted before, the connection between the two problems has not yet been studied in full generality. In this paper, we seek to fill this gap. The paper is organized around a baseline impossibility theorem, which we use to map out the space of possible solutions to the belief-binarization problem. Our theorem shows that, except in limiting cases, there exists no belief-binarization rule satisfying four initially plausible desiderata. Surprisingly, this result is a direct corollary of the judgment-aggregation variant of Arrow’s classic impossibility theorem in social choice theory.
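The lottery paradox mentioned above is easy to exhibit for the simplest candidate rule, binarization by a credence threshold. A small sketch (the lottery size and threshold are arbitrary choices, not the paper’s):

    # A 100-ticket lottery with exactly one winner. Believing whatever has
    # credence >= 0.9 yields a jointly inconsistent belief set.
    TICKETS = 100
    THRESHOLD = 0.9

    cred = {f"ticket {i} loses": 1 - 1 / TICKETS for i in range(TICKETS)}
    cred["some ticket wins"] = 1.0

    beliefs = {prop for prop, c in cred.items() if c >= THRESHOLD}
    print(len(beliefs))  # 101 beliefs: every ticket is believed to lose,
    # yet some ticket is believed to win; no world makes all of these true.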
In this paper we propose and analyze a game-theoretic model of the epistemology of peer disagreement. In this model, the peers' rationality is evaluated in terms of their probability of ending the disagreement with a true belief. We find that different strategies, in particular one based on the Steadfast View and one based on the Conciliatory View, are rational depending on the truth-sensitivity of the individuals involved in the disagreement. Interestingly, the Steadfast and the Conciliatory Views can even be rational simultaneously in some circumstances. We tentatively provide some reasons to favor the Conciliatory View in such cases. We argue that the game-theoretic perspective is a fruitful one in this debate, and this fruitfulness has not been exhausted by the present paper.
Consider two epistemic experts—for concreteness, let them be two weather forecasters. Suppose that you aren’t certain that they will issue identical forecasts, and you would like to proportion your degrees of belief to theirs in the following way. First, conditional on either’s forecast of rain being x, you’d like your own degree of belief in rain to be x. Second, conditional on them issuing different forecasts of rain, you’d like your own degree of belief in rain to be some weighted average of the forecast of each. Finally, you’d like your degrees of belief to be given by an orthodox probability measure. Moderate ambitions, all. But you can’t always get what you want.
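One compact way to see why the three desiderata clash, as a sketch under the simplifying assumption that the forecasts X and Y take finitely many values (the derivation is a reconstruction, not the paper’s own):

    % Deference: P(R | X = x) = x and P(R | Y = y) = y.
    % Averaging: P(R | X = x, Y = y) = a x + (1 - a) y for fixed 0 < a < 1
    % (this also holds trivially when x = y). By total probability,
    \[
    x = P(R \mid X = x) = \sum_y \bigl(a x + (1-a) y\bigr)\, P(Y = y \mid X = x)
      = a x + (1-a)\, E[Y \mid X = x],
    \]
    % so E[Y | X = x] = x for all x; symmetrically, E[X | Y = y] = y.
    % Jointly these force X = Y almost surely, contradicting your
    % uncertainty about whether the two forecasts are identical.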
The epistemology of self-locating belief concerns itself with how rational agents ought to respond to certain kinds of indexical information. I argue that those who endorse the thesis of Time-Slice Rationality ought to endorse a particular view about the epistemology of self-locating belief, according to which ‘essentially indexical’ information is never evidentially relevant to non-indexical matters. I close by offering some independent motivations for endorsing Time-Slice Rationality in the context of the epistemology of self-locating belief.
Accuracy-first epistemology is an approach to formal epistemology which takes accuracy to be a measure of epistemic utility and attempts to vindicate norms of epistemic rationality by showing how conformity with them is beneficial. If accuracy-first epistemology can actually vindicate any epistemic norms, it must adopt a plausible account of epistemic value. Any such account must avoid the epistemic version of Derek Parfit's “repugnant conclusion.” I argue that the only plausible way of doing so is to say that accurate credences in certain propositions have no, or almost no, epistemic value. I prove that this is incompatible with standard accuracy-first arguments for probabilism, and argue that there is no way for accuracy-first epistemology to show that all credences of all agents should be coherent.
Recent philosophical work has praised the reward structure of science, while recent empirical work has shown that many scientific results may not be reproducible. I argue that the reward structure of science incentivizes scientists to focus on speed and impact at the expense of the reproducibility of their work, thus contributing to the so-called reproducibility crisis. I use a rational choice model to identify a set of sufficient conditions for this problem to arise, and I argue that these conditions plausibly apply to a wide range of research situations. Currently proposed solutions will not fully address this problem. Philosophical commentators should temper their optimism about the reward structure of science.
Coherentists on epistemic justification claim that all justification is inferential, and that beliefs, when justified, get their justification together (not in isolation) as members of a coherent belief system. Some recent work in formal epistemology shows that “individual credibility” is needed for “witness agreement” to increase the probability of truth and generate a high probability of truth. It can seem that, from this result in formal epistemology, it follows that coherentist justification is not truth-conducive, that it is not the case that, under the requisite conditions, coherentist justification increases the probability of truth and generates a high probability of truth. I argue that this does not follow.
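The notion of individual credibility at issue in this entry and the related one above can be stated simply. In Olsson-style witness models (a standard gloss, not the author’s wording), a witness report R_p that p carries individual credibility just in case

    \[
    P(R_p \mid p) > P(R_p \mid \neg p),
    \]

that is, the report is more likely when p is true than when it is false. The formal results in question show that agreement among witnesses raises the probability of truth only when the individual reports carry this sort of evidential weight.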
Social scientists use many different methods, and there are often substantial disagreements about which method is appropriate for a given research question. In response to this uncertainty about the relative merits of different methods, W. E. B. Du Bois advocated for and applied “methodological triangulation”. This is to use multiple methods simultaneously in the belief that, where one is uncertain about the reliability of any given method, if multiple methods yield the same answer, that answer is confirmed more strongly than it could have been by any single method. Against this, methodological purists believe that one should choose a single appropriate method and stick with it. Using tools from voting theory, we show Du Boisian methodological triangulation to be more likely to yield the correct answer than purism, assuming the scientist is subject to some degree of diffidence about the relative merits of the various methods. This holds even when in fact only one of the methods is appropriate for the given research question.
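The voting-theoretic point can be illustrated with a toy calculation (all reliabilities are invented here; this is not the paper’s model). A purist who is diffident about which method is right does, in expectation, only as well as a randomly chosen method, while triangulation takes a majority vote:

    import itertools

    def majority_accuracy(ps):
        # Probability that a strict majority of independent methods,
        # with per-method reliabilities ps, returns the correct answer.
        acc = 0.0
        for outcome in itertools.product([True, False], repeat=len(ps)):
            prob = 1.0
            for correct, p in zip(outcome, ps):
                prob *= p if correct else (1 - p)
            if sum(outcome) > len(ps) / 2:
                acc += prob
        return acc

    ps = [0.8, 0.7, 0.6]  # hypothetical per-method reliabilities
    print(sum(ps) / len(ps))      # diffident purist, in expectation: 0.70
    print(majority_accuracy(ps))  # majority-vote triangulation: 0.788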
Recently there have been several attempts in formal epistemology to develop an adequate probabilistic measure of coherence. There is much to recommend probabilistic measures of coherence. They are quantitative and render formally precise a notion—coherence—notorious for its elusiveness. Further, some of them do very well, intuitively, on a variety of test cases. Siebel, however, argues that there can be no adequate probabilistic measure of coherence. Take some set of propositions A, some probabilistic measure of coherence, and a probability distribution such that all the probabilities on which A’s degree of coherence depends (according to the measure in question) are defined. Then, the argument goes, the degree to which A is coherent depends solely on the details of the distribution in question and not at all on the explanatory relations, if any, standing between the propositions in A. This is problematic, the argument continues, because, first, explanation matters for coherence, and, second, explanation cannot be adequately captured solely in terms of probability. We argue that Siebel’s argument falls short.
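For concreteness, one prominent measure of the kind at issue is Shogenji’s (stated here only as an example; the paper’s argument concerns such measures generally):

    \[
    C(A_1, \dots, A_n) = \frac{P(A_1 \wedge \dots \wedge A_n)}{P(A_1)\,P(A_2)\cdots P(A_n)}
    \]
    % The value depends only on the probability distribution: any two sets
    % with matching joint and marginal probabilities receive the same degree
    % of coherence, whatever explanatory relations hold among their members.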
The communist norm requires that scientists widely share the results of their work. Where did this norm come from, and how does it persist? Michael Strevens provides a partial answer to these questions by showing that scientists should be willing to sign a social contract that mandates sharing. However, he also argues that it is not in an individual credit-maximizing scientist's interest to follow this norm. I argue against Strevens that individual scientists can rationally conform to the communist norm, even in the absence of a social contract or other ways of socially enforcing the norm, by proving results to this effect in a game-theoretic model. This shows that the incentives provided to scientists through the priority rule are sufficient to explain both the origins and the persistence of the communist norm, adding to previous results emphasizing the benefits of the incentive structure created by the priority rule.
Pluralistic ignorance is a socio-psychological phenomenon that involves a systematic discrepancy between people’s private beliefs and public behavior in certain social contexts. Recently, pluralistic ignorance has gained increased attention in formal and social epistemology. But to get clear on what precisely a formal and social epistemological account of pluralistic ignorance should look like, we need answers to at least the following two questions: What exactly is the phenomenon of pluralistic ignorance? And can the phenomenon arise among perfectly rational agents? In this paper, we propose answers to both these questions. First, we characterize different versions of pluralistic ignorance and define the version that we claim most adequately captures the examples cited as paradigmatic cases of pluralistic ignorance in the literature. In doing so, we will stress certain key epistemic and social interactive aspects of the phenomenon. Second, given our characterization of pluralistic ignorance, we argue that the phenomenon can indeed arise in groups of perfectly rational agents. This, in turn, ensures that the tools of formal epistemology can be fully utilized to reason about pluralistic ignorance.
The field of iterated belief change has focused mainly on revision, with the other main operator of AGM belief change theory, i.e. contraction, receiving relatively little attention. In this paper we extend the Harper Identity from single-step change to define iterated contraction in terms of iterated revision. Specifically, just as the Harper Identity provides a recipe for defining the belief set resulting from contracting A in terms of (i) the initial belief set and (ii) the belief set resulting from revision by ¬A, we look at ways to define the plausibility ordering over worlds resulting from contracting A in terms of (iii) the initial plausibility ordering, and (iv) the plausibility ordering resulting from revision by ¬A. After noting that the most straightforward such extension leads to a trivialisation of the space of permissible orderings, we provide a family of operators for combining plausibility orderings that avoid such a result. These operators are characterised in our domain of interest by a pair of intuitively compelling properties, which turn out to enable the derivation of a number of iterated contraction postulates from postulates for iterated revision. We finish by observing that a salient member of this family allows for the derivation of counterparts for contraction of some well known iterated revision operators, as well as for defining new iterated contraction operators.
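The single-step Harper Identity referred to above is standardly written as follows (the gloss on the paper’s project is a paraphrase):

    \[
    K \div A = K \cap (K * \neg A),
    \]
    % i.e., the belief set after contracting A is what the initial belief
    % set K and the belief set after revising by not-A have in common.
    % The paper's project, lifted to plausibility orderings: define the
    % ordering after contracting A from (iii) the initial ordering and
    % (iv) the ordering after revising by not-A.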
Some of the most interesting recent work in formal epistemology has focused on developing accuracy-based approaches to justifying Bayesian norms. These approaches are interesting not only because they offer new ways to justify these norms, but because they potentially offer a way to justify all of these norms by appeal to a single, attractive epistemic goal: having accurate beliefs. Recently, Easwaran & Fitelson (2012) have raised worries regarding whether such “all-accuracy” or “purely alethic” approaches can accommodate and justify evidential Bayesian norms. In response, proponents of purely alethic approaches, such as Pettigrew (2013b) and Joyce (2016), have argued that scoring rule arguments provide us with compatible and purely alethic justifications for the traditional Bayesian norms, including evidential norms. In this paper I raise several challenges to this claim. First, I argue that many of the justifications these scoring rule arguments provide are not compatible. Second, I raise worries for the claim that these scoring rule arguments provide purely alethic justifications. Third, I turn to assess the more general question of whether purely alethic justifications for evidential norms are even possible, and argue that, without making some contentious assumptions, they are not. Fourth, I raise some further worries for the possibility of providing purely alethic justifications for content-sensitive evidential norms, like the Principal Principle.
This article shows that a slight variation of the argument in Milne 1996 yields the log-likelihood ratio l rather than the log-ratio measure r as “the one true measure of confirmation.”
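As standardly defined in this literature (e.g. in Fitelson’s notation), the two measures at issue are:

    \[
    r(H, E) = \log \frac{P(H \mid E)}{P(H)}, \qquad
    l(H, E) = \log \frac{P(E \mid H)}{P(E \mid \neg H)}.
    \]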