I argue that normative formal epistemology (NFE) is best understood as modelling, in the sense that this is the reconstruction of its methodology on which NFE is doing best. I focus on Bayesianism and show that it has the characteristics of modelling. But modelling is a scientific enterprise, while NFE is normative. I thus develop an account of normative models on which they are idealised representations put to normative purposes. Normative assumptions, such as the transitivity of comparative credence, are characterised as modelling idealisations motivated normatively. I then survey the landscape of methodological options: what might formal epistemologists be up to? I argue the choice is essentially binary: modelling or theorising. If NFE is theorising it is doing very poorly: generating false claims with no clear methodology for separating out what is to be taken seriously. Modelling, by contrast, is a successful methodology precisely suited to the management of useful falsehoods. Regarding NFE as modelling is not costless, however. First, our normative inferences are less direct and are muddied by the presence of descriptive idealisations. Second, our models are purpose-specific and limited in their scope. I close with suggestions for how to adapt our practice.
Formal epistemology is just what it sounds like: epistemology done with formal tools. Coinciding with the general rise in popularity of experimental philosophy, formal epistemologists have begun to apply experimental methods in their own work. In this entry, I survey some of the work at the intersection of formal and experimental epistemology. I show that experimental methods have unique roles to play when epistemology is done formally, and I highlight some ways in which results from formal epistemology have been used fruitfully to advance epistemically relevant experimental work. The upshot of this brief, incomplete survey is that formal and experimental methods often constitute mutually informative means to epistemological ends.
In formal epistemology, we use mathematical methods to explore the questions of epistemology and rational choice. What can we know? What should we believe and how strongly? How should we act based on our beliefs and values? We begin by modelling phenomena like knowledge, belief, and desire using mathematical machinery, just as a biologist might model the fluctuations of a pair of competing populations, or a physicist might model the turbulence of a fluid passing through a small aperture. Then, we explore, discover, and justify the laws governing those phenomena, using the precision that mathematical machinery affords. For example, we might represent a person by the strengths of their beliefs, and we might measure these using real numbers, which we call credences. Having done this, we might ask what the norms are that govern that person when we represent them in that way. How should those credences hang together? How should the credences change in response to evidence? And how should those credences guide the person’s actions? This is the approach of the first six chapters of this handbook. In the second half, we consider different representations—the set of propositions a person believes; their ranking of propositions by their plausibility. And in each case we ask again what the norms are that govern a person so represented. Or, we might represent them as having both credences and full beliefs, and then ask how those two representations should interact with one another. This handbook is incomplete, as such ventures often are. Formal epistemology is a much wider topic than we present here. One omission, for instance, is social epistemology, where we consider not only individual believers but also the epistemic aspects of their place in a social world. Michael Caie’s entry on doxastic logic touches on one part of this topic, but there is much more. Relatedly, there is no entry on epistemic logic, nor any on knowledge more generally. There are still more gaps. These omissions should not be taken as ideological choices. This material is missing, not because it is any less valuable or interesting, but because we failed to secure it in time. Rather than delay publication further, we chose to go ahead with what is already a substantial collection. We anticipate a further volume in the future that will cover more ground. Why an open access handbook on this topic? A number of reasons. The topics covered here are large and complex and need the space allowed by the sort of 50-page treatment that many of the authors give. We also wanted to show that, using free and open software, one can overcome a major hurdle facing open access publishing, even on topics with complex typesetting needs. With the right software, one can produce attractive, clear publications at reasonably low cost. Indeed, this handbook was created on a budget of exactly £0 (≈ $0). Our thanks to PhilPapers for serving as publisher, and to the authors: we are enormously grateful for the effort they put into their entries.
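To fix ideas, the two norms this preface gestures at, how credences should "hang together" and how they should "change in response to evidence", are standardly cashed out as probabilism and conditionalization. A minimal statement (textbook formulations, not quoted from the handbook; c is a credence function, A, B, E are propositions):

```latex
% Probabilism: credences obey the probability axioms.
% (\top is a tautology; the additivity clause assumes A and B are incompatible.)
c(\top) = 1, \qquad c(A) \ge 0, \qquad c(A \vee B) = c(A) + c(B).

% Conditionalization: on learning exactly E, where c_{\mathrm{old}}(E) > 0,
c_{\mathrm{new}}(A) \;=\; c_{\mathrm{old}}(A \mid E)
  \;=\; \frac{c_{\mathrm{old}}(A \wedge E)}{c_{\mathrm{old}}(E)}.
```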
How does being a woman affect one’s epistemic life? What about being Black? Or queer? Standpoint theorists argue that such social positions can give rise to otherwise unavailable epistemic privilege. “Epistemic privilege” is a murky concept, however. Critics of standpoint theory argue that the view is offered without a clear explanation of how standpoints confer their benefits, what those benefits are, or why social positions are particularly apt to produce them. For this reason, many regard standpoint theory as being out of step with epistemology more broadly. But this need not be so. This article articulates a minimal version of standpoint epistemology that avoids these criticisms and supports the normative goals of its feminist forerunners. This account serves as the foundation for developing a formal model in which to explore standpoint epistemology using neighborhood semantics for modal logic.
Can there be knowledge and rational belief in the absence of a rational degree of confidence? Yes, and cases of "mistuned knowledge" demonstrate this. In this paper we leverage this normative possibility in support of advancing our understanding of the metaphysical relation between belief and credence. It is generally assumed that a Lockean metaphysics of belief that reduces outright belief to degrees of confidence would immediately effect a unification of coarse-grained epistemology of belief with fine-grained epistemology of confidence. Scott Sturgeon has suggested that the unification is effected by understanding the relation between outright belief and confidence as an instance of the determinable-determinate relation. But determination of belief by confidence would not by itself yield the result that norms for confidence carry over to norms for outright belief unless belief and high confidence are token identical. We argue that this token-identity thesis is incompatible with the neglected phenomenon of “mistuned knowledge”—knowledge and rational belief in the absence of rational confidence. We contend that there are genuine cases of mistuned knowledge and that, therefore, epistemological unification must forego token identity of belief and high confidence. We show how partial epistemological unification can be secured given determination of outright belief by degrees of confidence even without token-identity. Finally, we suggest a direction for the pursuit of thoroughgoing epistemological unification.
This paper reviews the central points and presents some recent developments of the epistemic approach to paraconsistency in terms of the preservation of evidence. Two formal systems are surveyed, the basic logic of evidence (BLE) and the logic of evidence and truth (LETJ), designed to deal, respectively, with evidence and with evidence and truth. While BLE is equivalent to Nelson’s logic N4, it has been conceived for a different purpose. Adequate valuation semantics that provide decidability are given for both BLE and LETJ. The meanings of the connectives of BLE and LETJ, from the point of view of preservation of evidence, are explained with the aid of an inferential semantics. A formalization of the notion of evidence for BLE as proposed by M. Fitting is also reviewed here. As a novel result, the paper shows that LETJ is semantically characterized through the so-called Fidel structures. Some opportunities for further research are also discussed.
Epistemology is the study of knowledge. This entry covers epistemology in two parts: one historical, one contemporary. The former provides a brief theological history of epistemology. The latter outlines three categories of contemporary epistemology: traditional epistemology, social epistemology, and formal epistemology, along with corresponding theological questions that arise in each.
In the past few years, social epistemologists have developed several formal models of the social organisation of science. While their robustness and representational adequacy have been analysed at length, the function of these models has begun to be discussed in more general terms only recently. In this article, I will interpret many of the current formal models of the scientific community as representing the latest development of what I will call the ‘Kuhnian project’. These models share with Kuhn a number of questions about the relation between individuals and communities. At the same time, they also inherit some of Kuhn’s problematic characterisations of the scientific community. In particular, current models of the social organisation of science represent the scientific community as essentially value-free. This may put into question both their representational adequacy and their normative ambitions. In the end, it will be shown that the discussion on the formal models of the scientific community may contribute in fruitful ways to the ongoing debates on value judgements in science.
(This is for the Cambridge Handbook of Analytic Philosophy, edited by Marcus Rossberg) In this handbook entry, I survey the different ways in which formal mathematical methods have been applied to philosophical questions throughout the history of analytic philosophy. I consider: formalization in symbolic logic, with examples such as Aquinas’ third way and Anselm’s ontological argument; Bayesian confirmation theory, with examples such as the fine-tuning argument for God and the paradox of the ravens; foundations of mathematics, with examples such as Hilbert’s programme and Gödel’s incompleteness theorems; social choice theory, with examples such as Condorcet’s paradox and Arrow’s theorem; ‘how possibly’ results, with examples such as Condorcet’s jury theorem and recent work on intersectionality theory; and the application of advanced mathematics in philosophy, with examples such as accuracy-first epistemology.
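Since the entry above leans on Condorcet’s paradox as one of its examples, here is the textbook three-voter profile that generates it, as a short sketch (the profile and candidate names are the standard illustration, not taken from the entry):

```python
# Condorcet's paradox: pairwise majority voting over three candidates
# can be cyclic even though every individual ranking is transitive.

profile = [
    ("A", "B", "C"),  # voter 1: A > B > C
    ("B", "C", "A"),  # voter 2: B > C > A
    ("C", "A", "B"),  # voter 3: C > A > B
]

def majority_prefers(x, y, profile):
    """True iff a strict majority of voters ranks x above y."""
    wins = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
    return wins > len(profile) / 2

# Each check prints True, exhibiting the cycle A > B > C > A.
for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"majority prefers {x} over {y}: {majority_prefers(x, y, profile)}")
```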
There are two fundamentally distinct kinds of biological theorizing. "Formal biology" focuses on the relations, captured in formal laws, among mathematically abstracted properties of abstract objects. Population genetics and theoretical mathematical ecology, which are cases of formal biology, thus share methods and goals with theoretical physics. "Compositional biology," on the other hand, is concerned with articulating the concrete structure, mechanisms, and function, through developmental and evolutionary time, of material parts and wholes. Molecular genetics, biochemistry, developmental biology, and physiology, which are examples of compositional biology, are in serious need of philosophical attention. For example, the very concept of a "part" is understudied in both philosophy of biology and philosophy of science. My dissertation is an attempt to clarify the distinction between formal biology and compositional biology and, in so doing, provide a clear philosophical analysis, with case studies, of compositional biology. Given the social, economic, and medical importance of compositional biology, understanding it is urgent. For my investigation, I draw on the philosophical fields of metaphysics and epistemology, as well as philosophy of biology and philosophy of science. I suggest new ways of thinking about some classic philosophy of science issues, such as modeling, laws of nature, abstraction, explanation, and confirmation. I hint at the relevance of my study of two kinds of biological theorizing to debates concerning the disunity of science.
Much contemporary epistemology is informed by a kind of confirmational holism, and a consequent rejection of the assumption that all confirmation rests on experiential certainties. Another prominent theme is that belief comes in degrees, and that rationality requires apportioning one's degrees of belief reasonably. Bayesian confirmation models based on Jeffrey Conditionalization attempt to bring together these two appealing strands. I argue, however, that these models cannot account for a certain aspect of confirmation that would be accounted for in any adequate holistic confirmation theory. I then survey the prospects for constructing a formal epistemology that better accommodates holistic insights.
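For reference, the Jeffrey Conditionalization rule on which these confirmation models are based generalizes strict conditionalization to learning episodes that merely redistribute probability over a partition {E_i} rather than making one cell certain:

```latex
P_{\mathrm{new}}(H) \;=\; \sum_i P_{\mathrm{old}}(H \mid E_i)\, P_{\mathrm{new}}(E_i),
```

which reduces to strict conditionalization in the special case where some P_new(E_k) = 1.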
Philosophers have recently highlighted substantial affinities between causation and grounding, which has inclined some to import the conceptual and formal resources of causal interventionism into the metaphysics of grounding. The prospect of grounding interventionism raises two important questions: exactly what are grounding interventions, and why should we think they enable knowledge of grounding? This paper will approach these questions by examining how causal interventionists have addressed (or might address) analogous questions and then comparing the available options for grounding interventionism. I argue that grounding interventions must be understood in worldly terms, as adding something to or deleting something from the roster of entities, or making some fact obtain or fail to obtain. I consider three bases for counterfactual assessment: imagination, structural equation models, and background theory. I conclude that grounding interventionism requires firmer epistemological foundations, without which the interventionist's epistemology of grounding is incomplete and ineffectually rationalist.
We consider the complex interactions between rape culture and epistemology. A central case study is the consideration of a deferential attitude about the epistemology of sexual assault testimony. According to the deferential attitude, individuals and institutions should decline to act on allegations of sexual assault unless and until they are proven in a formal setting, i.e., a criminal court. We attack this deference from several angles, including the pervasiveness of rape culture in the criminal justice system, the epistemology of testimony and norms connecting knowledge and action, the harms of tacit idealizations away from important contextual factors, and a contextualist semantics for 'knows' ascriptions.
The Duhem-Quine Thesis is the claim that it is impossible to test a scientific hypothesis in isolation because any empirical test requires assuming the truth of one or more auxiliary hypotheses. This is taken by many philosophers, and is assumed here, to support the further thesis that theory choice is underdetermined by empirical evidence. This inquiry is focused strictly on the axiological commitments engendered in solutions to underdetermination, specifically those of Pierre Duhem and W. V. Quine. Duhem resolves underdetermination by appealing to a cluster of virtues called 'good sense', and it has recently been argued by Stump (Stud Hist Philos Biol Biomed Sci, 18(1):149-159, 2007) that good sense is a form of virtue epistemology. This paper considers whether Quine, whose philosophy is heavily influenced by the very thesis that led Duhem to the virtues, is also led to a virtue epistemology in the face of underdetermination. Various sources of Quinian epistemic normativity are considered, and it is argued that, in conjunction with other normative commitments, Quine's sectarian solution to underdetermination amounts to a skills-based virtue epistemology. The paper also sketches formal features of the novel form of virtue epistemology common to Duhem and Quine that challenges the adequacy of epistemic value truth-monism and blocks any imperialist naturalization of virtue epistemology, as the epistemic virtues are essential to the success of the sciences themselves.
This publication defends a phenomenalist interpretation of Kant’s idealism, which, however, deviates from usual phenomenalist interpretations in several respects. According to my reading, appearances are the content of representations, but not the true object of cognition. The object to which our cognition refers is rather the thing itself as the transcendental object. Nonetheless, we only cognize things as they appear and not as they are in themselves. Thus the unknowability of things as they are in themselves is retained. In the course of my presentation, I discuss a number of aspects of Kant’s philosophy, among which are the distinction between appearances and things in themselves, Kant’s relationship to Cartesian epistemology, the refutation of idealism, and not least his theory of synthesis. My aim is not only to show that Kant is a phenomenalist, but also to characterize the kind of his phenomenalism.
The notion of an ideal reasoner has several uses in epistemology. Often, ideal reasoners are used as a parameter of (maximum) rationality for finite reasoners (e.g. humans). However, the notion of an ideal reasoner is normally construed with such a high degree of idealization (e.g. infinite/unbounded memory) that this use is ill-advised. In this dissertation, I investigate the conditions under which an ideal reasoner may be used as a parameter of rationality for finite reasoners. In addition, I present and justify the research program of computational epistemology, which investigates the parameter of maximum rationality for finite reasoners using computer simulations.
Recently, some have challenged the idea that there are genuine norms of diachronic rationality. Part of this challenge has involved offering replacements for diachronic principles. Skeptics about diachronic rationality believe that we can provide an error theory for it by appealing to synchronic updating rules that, over time, mimic the behavior of diachronic norms. In this paper, I argue that the most promising attempts to develop this position within the Bayesian framework are unsuccessful. I sketch a new synchronic surrogate that draws upon some of the features of each of these earlier attempts. At the heart of this discussion is the question of what exactly it means to say that one norm is a surrogate for another. I argue that surrogacy, in the given context, can be taken as a proxy for the degree to which formal and traditional epistemology can be made compatible.
This dissertation is a contribution to formal and computational philosophy. In the first part, we show that by exploiting the parallels between large, yet finite lotteries on the one hand and countably infinite lotteries on the other, we gain insights into the foundations of probability theory as well as into epistemology. Case 1: Infinite lotteries. We discuss how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. The solution boils down to the introduction of infinitesimal probability values, which can be achieved using non-standard analysis. Our solution can be generalized to uncountable sample spaces, giving rise to a Non-Archimedean Probability (NAP) theory. Case 2: Large but finite lotteries. We propose application of the language of relative analysis (a type of non-standard analysis) to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’. The second part presents a case study in social epistemology. We model a group of agents who update their opinions by averaging the opinions of other agents. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating. To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. The probability of ending up in an inconsistent belief state turns out to be always smaller than 2%.
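The second part’s headline number invites a small illustration. Here is a deliberately toy Monte Carlo sketch in the spirit of the model described above, under assumptions of my own (theories as nonempty 0/1 vectors over possible worlds, one round of coordinate-wise averaging followed by rounding, the all-zero vector counting as the inconsistent state); the dissertation’s actual update rule, parameters, and analytical expression may differ:

```python
import random

def estimate_inconsistency(n_agents=5, n_worlds=3, trials=100_000, seed=1):
    """Estimate the probability that averaging agents' 0/1 theory vectors
    and rounding yields the all-zero (inconsistent) belief state."""
    rng = random.Random(seed)
    inconsistent = 0
    for _ in range(trials):
        agents = []
        for _ in range(n_agents):
            vec = [rng.randint(0, 1) for _ in range(n_worlds)]
            while not any(vec):            # start from consistent states only
                vec = [rng.randint(0, 1) for _ in range(n_worlds)]
            agents.append(vec)
        # One update: every agent adopts the rounded coordinate-wise average.
        avg = [round(sum(a[w] for a in agents) / n_agents) for w in range(n_worlds)]
        if not any(avg):                   # every world ruled out: inconsistent
            inconsistent += 1
    return inconsistent / trials

print(f"estimated probability of an inconsistent state: {estimate_inconsistency():.4f}")
```

With an odd number of agents the coordinate averages never hit 0.5 exactly, so Python’s round-half-to-even behaviour does not affect the count.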
The formal and empirical-generative perspectives of computation are demonstrated to be inadequate to secure the goals of simulation in the social sciences. Simulation does not resemble formal demonstrations or generative mechanisms that deductively explain how certain models are sufficient to generate emergent macrostructures of interest. The description of scientific practice implies additional epistemic conceptions of scientific knowledge. Three kinds of knowledge that account for a comprehensive description of the discipline were identified: formal, empirical and intentional knowledge. The use of formal conceptions of computation for describing simulation is refuted; the roles of programming languages according to intentional accounts of computation are identified; and the roles of iconographic programming languages and aesthetic machines in simulation are characterized. The roles that simulation and intentional decision making may be able to play in a participative information society are also discussed.
Medical terminology collects and organizes the many different kinds of terms employed in the biomedical domain both by practitioners and also in the course of biomedical research. In addition to serving as labels for biomedical classes, these names reflect the organizational principles of biomedical vocabularies and ontologies. Some names represent invariant features (classes, universals) of biomedical reality (i.e., they are a matter for ontology). Other names, however, convey also how this reality is perceived, measured, and understood by health professionals (i.e., they belong to the domain of epistemology). We analyze terms from several biomedical vocabularies in order to throw light on the interactions between ontological and epistemological components of these terminologies. We identify four cases: 1) terms containing classification criteria, 2) terms reflecting detectability, modality, uncertainty, and vagueness, 3) terms created in order to obtain a complete partition of a given domain, and 4) terms reflecting mere fiat boundaries. We show that epistemology-loaded terms are pervasive in biomedical vocabularies, that the “classes” they name often do not comply with sound classification principles, and that they are therefore likely to cause problems in the evolution and alignment of terminologies and associated ontologies.
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we show that mbC, a logic of formal inconsistency based on classical logic, may be enhanced in order to express the basic ideas of an intuitive interpretation of contradictions as conflicting evidence.
This is book I of three philosophy books in an international language: a formal academic philosophy source in the Kurdish language. It gives an overall view of classical epistemology for university students in non-English-language philosophy departments and philosophy schools.
This is book II of three philosophy books in an international language: a formal academic philosophy source in the Kurdish language. Written for non-English-speaking university students as a philosophy guide to epistemology and social epistemology, and as a resource for the use of philosophy departments and philosophy schools.
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelean syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities. With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
In his book, History as a Science and the System of the Sciences, Thomas Seebohm articulates the view that history can serve to mediate between the sciences of explanation and the sciences of interpretation, that is, between the natural sciences and the human sciences. Among other things, Seebohm analyzes history from a phenomenological perspective to reveal the material foundations of the historical human sciences in the lifeworld. As a preliminary to his analyses, Seebohm examines the formal and material presuppositions of phenomenological epistemology, as well as the emergence of the human sciences and the traditional distinctions and divisions that are made between the natural and the human sciences. As part of this examination, Seebohm devotes a section to discussing Husserl’s formal mereology because he understands that a reflective analysis of the foundations of the historical sciences requires a reflective analysis of the objects of the historical sciences, that is, of concrete organic wholes (i.e., social groups) and of their parts. Seebohm concludes that Husserl’s mereological ontology needs to be altered with regard to the historical sciences because the relations between organic wholes and their parts are not summative relations. Seebohm’s conclusion is relevant for the issue of the reducibility of organic wholes such as social groups to their parts and for the issue of the reducibility of the historical sciences to the lower-order sciences, that is, to the sciences concerned with lower-order ontologies. In this paper, I propose to extend Seebohm’s conclusion to the ontology of chemical wholes as object of quantum chemistry and to argue that Husserl’s formal mereology is descriptively inadequate for this regional ontology as well. This may seem surprising at first, since the objects studied by quantum chemists are not organic wholes. However, my discussion of atoms and molecules as they are understood in quantum chemistry will show that Husserl’s classical summative and extensional mereology does not accurately capture the relations between chemical wholes and their parts. This conclusion is relevant for the question of the reducibility of chemical wholes to their parts and of the reducibility of chemistry to physics, issues that have been of central importance within the philosophy of chemistry for the past several decades.
Bayesian epistemology tells us with great precision how we should move from prior to posterior beliefs in light of new evidence or information, but says little about where our prior beliefs come from. It offers few resources to describe some prior beliefs as rational or well-justified, and others as irrational or unreasonable. A different strand of epistemology takes the central epistemological question to be not how to change one’s beliefs in light of new evidence, but what reasons justify a given set of beliefs in the first place. We offer an account of rational belief formation that closes some of the gap between Bayesianism and its reason-based alternative, formalizing the idea that an agent can have reasons for his or her (prior) beliefs, in addition to evidence or information in the ordinary Bayesian sense. Our analysis of reasons for belief is part of a larger programme of research on the role of reasons in rational agency (Dietrich and List, Nous, 2012a, in press; Int J Game Theory, 2012b, in press).
Formally-inclined epistemologists often theorize about ideally rational agents--agents who exemplify rational ideals, such as probabilistic coherence, that human beings could never fully realize. This approach can be defended against the well-known worry that abstracting from human cognitive imperfections deprives the approach of interest. But a different worry arises when we ask what an ideal agent should believe about her own cognitive perfection (even an agent who is in fact cognitively perfect might, it would seem, be uncertain of this fact). Consideration of this question reveals an interesting feature of the structure of our epistemic ideals: for agents with limited information, our epistemic ideals turn out to conflict with one another.
In the latter half of the twentieth century, philosophers of science have argued (implicitly and explicitly) that epistemically rational individuals might compose epistemically irrational groups and that, conversely, epistemically rational groups might be composed of epistemically irrational individuals. We call the conjunction of these two claims the Independence Thesis, as they together imply that methodological prescriptions for scientific communities and those for individual scientists might be logically independent of one another. We develop a formal model of scientific inquiry, define four criteria for individual and group epistemic rationality, and then prove that the four definitions diverge, in the sense that individuals will be judged rational when groups are not and vice versa. We conclude by explaining implications of the Independence Thesis for (i) descriptive history and sociology of science and (ii) normative prescriptions for scientific communities.
We present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language. We shall defend the view according to which logics of formal inconsistency are theories of logical consequence of normative and epistemic character. This approach not only allows us to make inferences in the presence of contradictions, but offers a philosophically acceptable account of paraconsistency.
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we argue that an intuitive reading of the bivalued semantics for the logic mbC, a logic of formal inconsistency based on classical logic, fits in well with the basic ideas of an intuitive interpretation of contradictions. On this interpretation, the acceptance of a pair of propositions A and ¬A does not mean that A is simultaneously true and false, but rather that there is conflicting evidence about the truth value of A.
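For readers new to logics of formal inconsistency, the “distinctive feature” both of the abstracts above describe can be put schematically. These are standard LFI facts about mbC rather than the papers’ own formulations: explosion fails in general, but holds “gently” once the consistency of A, written ∘A, is assumed:

```latex
A,\; \neg A \;\nvdash_{\mathrm{mbC}}\; B
\qquad\text{but}\qquad
{\circ}A,\; A,\; \neg A \;\vdash_{\mathrm{mbC}}\; B.
```

On the evidential reading, accepting both A and ¬A records conflicting evidence about A, while ∘A records that A’s truth value is settled, which is why jointly accepting all three collapses into triviality.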
John D. Norton is responsible for a number of influential views in contemporary philosophy of science. This paper will discuss two of them. The material theory of induction claims that inductive arguments are ultimately justified by their material features, not their formal features. Thus, while a deductive argument can be valid irrespective of the content of the propositions that make up the argument, an inductive argument about, say, apples, will be justified (or not) depending on facts about apples. The argument view of thought experiments claims that thought experiments are arguments, and that they function epistemically however arguments do. These two views have generated a great deal of discussion, although there hasn’t been much written about their combination. I argue that despite some interesting harmonies, there is a serious tension between them. I consider several options for easing this tension, before suggesting a set of changes to the argument view that I take to be consistent with Norton’s fundamental philosophical commitments, and which retain what seems intuitively correct about the argument view. These changes require that we move away from a unitary epistemology of thought experiments and towards a more pluralist position.
True contradictions are taken increasingly seriously by philosophers and logicians. Yet, the belief that contradictions are always false remains deeply intuitive. This paper confronts this belief head-on by explaining in detail how one specific contradiction is true. The contradiction in question derives from Priest's reworking of Berkeley's argument for idealism. However, technical aspects of the explanation offered here differ considerably from Priest's derivation. The explanation uses novel formal and epistemological tools to guide the reader through a valid argument with, not just true, but eminently acceptable premises, to an admittedly unusual conclusion: a true contradiction. The novel formal and epistemological tools concern points of view and changes in points of view. The result is an understanding of why the contradiction is true.
John Venn has the “uneasy suspicion” that the stagnation in mathematical logic between J. H. Lambert and George Boole was due to Kant’s “disastrous effect on logical method,” namely the “strictest preservation [of logic] from mathematical encroachment.” Kant’s actual position is more nuanced, however. In this chapter, I tease out the nuances by examining his use of Leonhard Euler’s circles and comparing it with Euler’s own use. I do so in light of the developments in logical calculus from G. W. Leibniz to Lambert and Gottfried Ploucquet. While Kant is evidently open to using mathematical tools in logic, his main concern is to clarify what mathematical tools can be used to achieve. For without such clarification, all efforts at introducing mathematical tools into logic would be blind, if not a complete waste of time. In the end, Kant would stress, the means provided by formal logic at best help us to express and order what we already know in some sense. No matter how much mathematical notations may enhance the precision of this function of formal logic, it does not change the fact that no truths can, strictly speaking, be revealed or established by means of those notations.
The branch of philosophical logic which has become known as “belief change” has, in the course of its development, become alienated from its epistemological origins. However, as formal criteria do not suffice to defend a principled choice between competing systems for belief change, we do need to take their epistemological embedding into account. Here, on the basis of a detailed examination of Isaac Levi's epistemology, we argue for a new direction of belief change research and propose to construct systems for belief change that can do without, but do not rule out, selection functions, in order to enable an *empirical* assessment of the relative merits of competing belief change systems.
Philosophy of science in the 20th century is to be considered as mostly characterized by a fundamentally systematic heuristic attitude, which looks to mathematics, and more generally to the philosophy of mathematics, for a genuinely and epistemologically legitimate form of knowledge. Rooted in this assumption, the book provides a formal reconsidering of the dynamics of scientific theories, especially in the field of the physical sciences, and offers a significant contribution to current epistemological investigations regarding the validity of using formal (especially: model-theoretic) methods of analysis, as developed principally by Stegmüller, Sneed, Suppes, Moulines, “to bring the airy flights of analytical philosophy back down to earth”, to borrow Stephan Hartmann’s provocative statement. At the same time, the volume represents a comprehensive account of the epistemic content of physical theories, the logic of theory change in science, and specific (inter-)theoretical core aspects of scientific progress, particularly in the form suggested informally by Thomas Kuhn. As C. Ulises Moulines writes in the preface, “there is no other example in present-day literature (in any language) on this topic, i.e. the formal analysis of the ideographic characterization of the dynamics of theories between Kuhn’s theory of science and structural epistemology, that is as systematic and complete as Perrone’s work”.
This article is primarily concerned with the articulation of a defensible position on the relevance of phenomenological analysis to the current epistemological edifice as this latter has evolved since the rupture with the classical scientific paradigm pointing to the Newtonian-Leibnizian tradition which took place around the beginning of the 20th century. My approach is generally based on the reduction of the objects-contents of natural sciences, abstracted in the form of ideal objectivities in the corresponding logical-mathematical theories, to the content of meaning-acts ultimately referring to a specific being-within-the-world experience. This is a position that finds itself in line with Husserl’s gradual departure from the psychologistic interpretations of his earlier works on the philosophy of logic and mathematics and culminates in a properly phenomenological foundation of the natural sciences in his last major published work, namely the Crisis of European Sciences and the Transcendental Phenomenology. Further, this article tries to set up a context of discourse in which to found both physical and formal objects in parallel terms as essentially temporal-noematic objects, to the extent that they may be considered as invariants of the constitutional modes of a temporal consciousness.
We’ll sketch the debate on testimony in social epistemology by reference to the contemporary debate on reductionism/anti-reductionism, communitarian epistemology and inferentialism. Testimony is a fundamental source of the knowledge we share, and it is worth considering within a dialogical perspective, which requires a description of a formal structure that entails deontic statuses and deontic attitudes. In particular, we’ll argue for a social reformulation of the “space of reasons”, which establishes a fruitful relationship with the epistemological view of Wilfrid Sellars.
Applying Bernard Lonergan's (1957/1992, 1972) analysis of intentional consciousness and its concomitant epistemology, this paper highlights epistemological confusion in contemporary consciousness studies as exemplified mostly in David Chalmers's (1996) position. In ideal types, a first section outlines two epistemologies, sensate-modeled and intelligence-based, whose difference significantly explains the different positions. In subsequent sections, this paper documents the sensate-modeled epistemology in Chalmers's position and consciousness studies in general. Tellingly, this model of knowing is at odds with the formal-operational theorizing in twentieth-century science. This paper then links this epistemology with functionalism and its focus on descriptive efficient causality in external behaviors and its oversight of explanatory formal causality; highlights the theoretical incoherence of the understanding of science in the functionalist approach; connects it with the construal of consciousness as primarily intentional (i.e., directed toward an object) to the neglect of consciousness as conscious (i.e., constituted by a non-objectified self-presence); and relates this outcome to the reduction of human consciousness to animal-like perception and mechanistic interactions. A brief conclusion summarizes these multiple, subtle, and interconnected considerations and suggests how only an intellectual epistemology would be adequate to the intellectual nature of human consciousness and the world of meaning, not of mere bodies, in which humans exist.
The naive idea of a mimesis between theory and experiments, a concept still lasting in many epistemologies, is here substituted by a more sophisticated mathematical methexis where theoretical physics is a system of production of formal structures under strong mathematical constraints, such as global and local symmetries. Instead of an ultimate “everything theory”, the image of physical theories here proposed is a totality of interconnected structures establishing the very conditions of its “thinkability” and the relations with the experimental domain.
We have recently started to understand that fundamental aspects of complex systems such as emergence, the measurement problem, inherent uncertainty, complex causality in connection with unpredictable determinism, time-irreversibility and nonlocality all highlight the observer's participatory role in determining their workings. In addition, the principle of 'limited universality' in complex systems, which prompts us to search for the appropriate 'level of description in which unification and universality can be expected', looks like a version of Bohr's 'complementarity principle'. It is more or less certain that the different levels of description possible of a complex whole (actually partial objectifications) are projected on to and even redefine its constituent parts. Thus it is interesting that these fundamental complexity issues don't just bear a formal resemblance to, but reveal a profound connection with, quantum mechanics. Indeed, they point to a common origin on a deeper level of description.
This paper is about teaching probability to students of philosophy who don’t aim to do primarily formal work in their research. These students are unlikely to seek out classes about probability or formal epistemology for various reasons, for example because they don’t realize that this knowledge would be useful for them or because they are intimidated by the material. However, most areas of philosophy now contain debates that incorporate probability, and basic knowledge of it is essential even for philosophers whose work isn’t primarily formal. In this paper, I explain how to teach probability to students who are not already enthusiastic about formal philosophy, taking into account the common phenomena of math anxiety and the lack of reading skills for formal texts. I address course design, lesson design, and assignment design. Most of my recommendations also apply to teaching formal methods other than probability theory.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
The rise of experimental philosophy has placed metaphilosophical questions, particularly those concerning concepts, at the center of philosophical attention. X-phi offers empirically rigorous methods for identifying conceptual content, but what exactly it contributes towards evaluating conceptual content remains unclear. We show how x-phi complements Rudolf Carnap’s underappreciated methodology for concept determination, explication. This clarifies and extends x-phi’s positive philosophical import, and also exhibits explication’s broad appeal. But there is a potential problem: Carnap’s account of explication was limited to empirical and logical concepts, but many concepts of interest to philosophers are essentially normative. With formal epistemology as a case study, we show how x-phi-assisted explication can apply to normative domains.
There is a long tradition in formal epistemology and in the psychology of reasoning of investigating indicative conditionals. In psychology, the propositional calculus was taken for granted to be the normative standard of reference. Experimental tasks, evaluation of the participants’ responses, and psychological model building were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue why neither logic nor standard probability theory provides appropriate rationality norms for uncertain conditionals. I advocate coherence-based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
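To illustrate the kind of result the abstract reports, here is my reconstruction of a standard example, not the paper’s own proof. Take the “paradox of the material conditional” licensed by ¬A ⊨ A ⊃ C. On the material reading, a high probability for ¬A forces a high probability for the conditional, whereas on the conditional-probability reading used in coherence-based probability logic, the same premise is completely non-informative:

```latex
% Material reading: since A \supset C is \neg A \vee C,
P(\neg A) = x \;\Longrightarrow\; P(A \supset C) \ge x.

% Conditional-probability reading: for x < 1, the premise P(\neg A) = x
% coherently constrains P(C \mid A) only to the whole unit interval,
P(C \mid A) \in [0, 1] \qquad \text{(non-informative)}.
```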