In ‘A reductio of coherentism’ (Analysis 67, 2007) Tom Stoneham offers a novel argument against epistemological coherentism. ‘On the face of it’, he writes, ‘the argument gives a conclusive reductio ad absurdum of any coherence theory of justification. But that cannot be right, can it?’ (p. 254). It could be right, but it isn’t. I argue that coherentists need not accept the central premises of Stoneham’s argument and that, even if these premises were acceptable and true, Stoneham’s reductio would not follow.
Coherentists on epistemic justification claim that all justification is inferential, and that beliefs, when justified, get their justification together (not in isolation) as members of a coherent belief system. Some recent work in formal epistemology shows that “individual credibility” is needed for “witness agreement” to increase the probability of truth and generate a high probability of truth. It can seem that, from this result in formal epistemology, it follows that coherentist justification is not truth-conducive, that it is not the case that, under the requisite conditions, coherentist justification increases the probability of truth and generates a high probability of truth. I argue that this does not follow.
Once upon a time, coherentism was the dominant response to the regress problem in epistemology, but in recent decades the view has fallen into disrepute: now almost everyone is a foundationalist (with a few infinitists sprinkled here and there). In this paper, I sketch a new way of thinking about coherentism, and show how it avoids many of the problems often thought fatal for the view, including the isolation objection, worries over circularity, and concerns that the concept of coherence is too vague or metaphorical for serious theoretical use. The key to my approach is to take a familiar tool from discussions of the regress problem -- namely, directed graphs depicting the support relations between beliefs -- and to use that tool in a more sophisticated manner than it is standardly employed.
Semantic holists view what one's terms mean as a function of all of one's usage. Holists will thus be coherentists about semantic justification: showing that one's usage of a term is semantically justified involves showing how it coheres with the rest of one's usage. Semantic atomists, by contrast, understand semantic justification in a foundationalist fashion. Saul Kripke has, on Wittgenstein's behalf, famously argued for a type of skepticism about meaning and semantic justification. However, Kripke's argument has bite only if one understands semantic justification in foundationalist terms. Consequently, Kripke's arguments lead not to a type of skepticism about meaning, but rather to the conclusion that one should be a coherentist about semantic justification, and thus a holist about semantic facts.
Can a perceptual experience justify (epistemically) a belief? More generally, can a nonbelief justify a belief? Coherentists answer in the negative: Only a belief can justify a belief. A perceptual experience can cause a belief but cannot justify a belief. Coherentists eschew all noninferential justification—justification independent of evidential support from beliefs—and, with it, the idea that justification has a foundation. Instead, justification is holistic in structure. Beliefs are justified together, not in isolation, as members of a coherent belief system. The main question of the paper is whether coherentism is consistent. I set out an apparent inconsistency in coherentism and then give a resolution to that apparent inconsistency.
Approximate coherentism suggests that imperfectly rational agents should hold approximately coherent credences. This norm is intended as a generalization of ordinary coherence. I argue that it may be unable to play this role by considering its application under learning experiences. While it is unclear how imperfect agents should revise their beliefs, I suggest a plausible route is through Bayesian updating. However, Bayesian updating can take an incoherent agent from relatively more coherent credences to relatively less coherent credences, depending on the data observed. Thus, comparative rationality judgments among incoherent agents are unduly sensitive to luck.
The most pressing difficulty coherentism faces is, I believe, the problem of justified inconsistent beliefs. In a nutshell, there are cases in which our beliefs appear to be both fully rational and justified, and yet the contents of the beliefs are inconsistent, often knowingly so. This fact contradicts the seemingly obvious idea that a minimal requirement for coherence is logical consistency. Here, I present a solution to one version of this problem.
If a subject’s belief system is inconsistent, does it follow that the subject’s beliefs (all of them) are unjustified? It seems not. But, coherentist theories of justification (at least some of them) imply otherwise, and so, it seems, are open to counterexample. This is the “Problem of Justified Inconsistent Beliefs”. I examine two main versions of the Problem of Justified Inconsistent Beliefs, and argue that coherentists can give at least a promising line of response to each of them.
In this paper I prove that holistic coherentism is logically equivalent to the conjunction of symmetry and quasi-transitivity of epistemic support and a condition on justified beliefs. On the way I defend Tom Stoneham from a criticism made by Darrell Rowbottom and prove a premiss of Stoneham’s argument to be an entailment of coherentism.
Plantinga argues that cases involving ‘fixed’ beliefs refute the coherentist thesis that a belief’s belonging to a coherent set of beliefs suffices for its having justification (warrant). According to Plantinga, a belief cannot be justified if there is a ‘lack of fit’ between it and its subject’s experiences. I defend coherentism by showing that if Plantinga means to claim that any ‘lack of fit’ destroys justification, his argument is obviously false. If he means to claim that significant ‘lack of fit’ destroys justification, his argument suffers a critical lack of support. Either way, Plantinga’s argument fails and coherentism emerges unscathed.
The paper takes off from the suggestion of Jouni-Matti Kuukkanen that Kuhn’s account of science may be understood in coherentist terms. There are coherentist themes in Kuhn’s philosophy of science. But one crucial element is lacking. Kuhn does not deny the existence of basic beliefs which have a non-doxastic source of justification. Nor does he assert that epistemic justification only derives from inferential relationships between non-basic beliefs. Despite this, the coherentist interpretation is promising and I develop it further in this paper. I raise the question of whether Kuhn’s account of science can deal with the input objection to coherentism. I argue that the role played by problems in Kuhn’s theory of science ensures that there is input from the external world into scientists’ belief-systems. I follow Hoyningen-Huene in pointing to the causal role played by the external world in determining perceptual states. I next turn to the question of whether Kuhn’s rejection of foundationalism implies coherentism. I argue that Kuhn’s rejection of the one-to-one relation between object and experience is compatible with a foundationalist account of justification. Nor does Kuhn’s rejection of the given entail the same coherentist implications as Sellars’ critique of the myth of the given.
According to metaphysical coherentism, grounding relations form an interconnected system in which things ground each other and nothing is ungrounded. This potentially viable view’s logical territory remains largely unexplored. In this paper, I describe that territory by articulating four varieties of metaphysical coherentism. I do not argue for any variety in particular. Rather, I aim to show that not all issues which might be raised against coherentism will be equally problematic for all the versions of that view, which features far more nuance and diversity than is typically ascribed to it.
... matter of knowing that -- that injustice is wrong, courage is valuable, and care is due. Such knowledge is embodied in a range of capacities, abilities, and skills. As a result, what I'll be doing is primarily defending ... in general ...
This paper considers a problem for Bayesian epistemology and proposes a solution to it. On the traditional Bayesian framework, an agent updates her beliefs by Bayesian conditioning, a rule that tells her how to revise her beliefs whenever she gets evidence that she holds with certainty. In order to extend the framework to a wider range of cases, Jeffrey (1965) proposed a more liberal version of this rule that has Bayesian conditioning as a special case. Jeffrey conditioning is a rule that tells the agent how to revise her beliefs whenever she gets evidence that she holds with any degree of confidence. The problem? While Bayesian conditioning has a foundationalist structure, this foundationalism disappears once we move to Jeffrey conditioning. If Bayesian conditioning is a special case of Jeffrey conditioning, then they should have the same normative structure. The solution? To reinterpret Bayesian updating as a form of diachronic coherentism.
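To make the relationship between the two updating rules concrete, here is a minimal sketch in Python (not drawn from the paper; the hypotheses, likelihoods, and numbers are illustrative assumptions): Jeffrey conditioning over a finite evidence partition, with Bayesian conditioning recovered as the special case in which the new credence in one cell of the partition is 1.

```python
# Minimal sketch: Jeffrey conditioning P_new(H) = sum_E P_old(H | E) * q(E)
# over a finite evidence partition, with Bayesian conditioning as the
# special case q(E) = 1 for a single cell E.

def jeffrey_update(prior, likelihoods, new_partition_credences):
    """prior: dict mapping hypotheses H to P(H).
    likelihoods: dict mapping (H, E) pairs to P(E | H).
    new_partition_credences: dict mapping partition cells E to the agent's
    post-experience credence q(E)."""
    posterior = {}
    for h in prior:
        total = 0.0
        for e, q_e in new_partition_credences.items():
            # P(E) by the law of total probability over the hypotheses.
            p_e = sum(prior[h2] * likelihoods[(h2, e)] for h2 in prior)
            # P(H | E) by Bayes' theorem.
            p_h_given_e = prior[h] * likelihoods[(h, e)] / p_e
            total += p_h_given_e * q_e
        posterior[h] = total
    return posterior

# Toy example: hypothesis H with prior 0.3; evidence partition {E, not-E}.
prior = {"H": 0.3, "not-H": 0.7}
likelihoods = {("H", "E"): 0.8, ("H", "not-E"): 0.2,
               ("not-H", "E"): 0.4, ("not-H", "not-E"): 0.6}

# Jeffrey update: the experience raises the credence in E to 0.9, not to 1.
print(jeffrey_update(prior, likelihoods, {"E": 0.9, "not-E": 0.1}))

# Bayesian conditioning is the special case q(E) = 1.
print(jeffrey_update(prior, likelihoods, {"E": 1.0, "not-E": 0.0}))
```

On these numbers the Jeffrey update moves the credence in H from 0.3 to roughly 0.43, while full conditioning on E moves it to roughly 0.46; the first rule collapses into the second as q(E) approaches 1.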
Philosophers of well-being have tended to adopt a foundationalist approach to the question of theory and measurement, according to which theories are conceptually prior to measures. By contrast, social scientists have tended to adopt operationalist commitments, according to which they develop and refine well-being measures independently of any philosophical foundation. Unfortunately, neither approach helps us overcome the problem of coordinating between how we characterize well-being and how we measure it. Instead, we should adopt a coherentist approach to well-being science.
It is standard practice, when distinguishing between the foundationalist and the coherentist, to construe the coherentist as an internalist. The coherentist, the construal goes, says that justification is solely a matter of coherence, and that coherence, in turn, is solely a matter of internal relations between beliefs. The coherentist, so construed, is an internalist (in the sense I have in mind) in that the coherentist, so construed, says that whether a belief is justified hinges solely on what the subject is like mentally. I argue that this practice is fundamentally misguided, by arguing that the foundationalism / coherentism debate and the internalism / externalism debate are about two very different things, so that there is nothing, qua coherentist, precluding the coherentist from siding with the externalist. I then argue that this spells trouble for two of the three most pressing and widely known objections to coherentism: the Alternative-Systems Objection and the Isolation Objection.
I discuss the ideas of common sense and common-sense morality in Sidgwick. I argue that, far from aiming at overcoming common-sense morality, Sidgwick aimed purposely at grounding a consistent code of morality by methods allegedly taken from the natural sciences, in order to reach also in the domain of morality the same kind of “mature” knowledge as in the natural sciences. His whole polemic with intuitionism was vitiated by the a priori assumption that the widespread ethos of the educated part of humankind, not the theories of the intuitionist philosophers, was what was really worth considering as the expression of intuitionist ethics. In spite of the naïve positivist starting point, Sidgwick was encouraged by his own approach in exploring the fruitfulness of coherentist methods for normative ethics. Thus, Sidgwick left an ambivalent legacy to twentieth-century ethics: the dogmatic idea of a “new” morality of a consequentialist kind, and the fruitful idea that we can argue rationally in normative ethics albeit without shared foundations.
In this paper we show that the coherence measures of Olsson (J Philos 94:246–272, 2002), Shogenji (Log Anal 59:338–345, 1999), and Fitelson (Log Anal 63:194–199, 2003) satisfy the two most important adequacy requirements for the purpose of assessing theories. Following Hempel (Synthese 12:439–469, 1960), Levi (Gambling with truth, New York, A. A. Knopf, 1967), and recently Huber (Synthese 161:89–118, 2008) we require, as minimal or necessary conditions, that adequate assessment functions favor true theories over false theories and true and informative theories over true but uninformative theories. We then demonstrate that the coherence measures of Olsson, Shogenji, and Fitelson satisfy these minimal conditions if we confront the hypotheses with a separating sequence of observational statements. In the concluding remarks we set out the philosophical relevance, and limitations, of the formal results. Inter alia, we discuss the problematic implications of our precondition that competing hypotheses must be confronted with a separating sequence of observational statements, which also leads us to discuss theory assessment in the context of scientific antirealism.
Some recent work in formal epistemology shows that “witness agreement” by itself implies neither an increase in the probability of truth nor a high probability of truth—the witnesses need to have some “individual credibility.” It can seem that, from this formal epistemological result, it follows that coherentist justification (i.e., doxastic coherence) is not truth-conducive. I argue that this does not follow. Central to my argument is the thesis that, though coherentists deny that there can be noninferential justification, coherentists do not deny that there can be individual credibility.
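A minimal sketch of the kind of result the abstract alludes to (my own toy model, not the formalism of the paper): two witnesses whose reports are conditionally independent given H and given not-H. If each report is no more likely given H than given not-H, that is, the witnesses have no individual credibility, then their agreement leaves the probability of H exactly where it started.

```python
# Minimal sketch: two witnesses independently (conditional on H and on not-H)
# report that H is true. Without individual credibility, agreement does not
# raise the probability of H.

def posterior_given_two_reports(prior_h, p_report_given_h, p_report_given_not_h):
    """P(H | both witnesses report H), assuming the reports are
    conditionally independent given H and given not-H."""
    joint_h = prior_h * p_report_given_h ** 2
    joint_not_h = (1 - prior_h) * p_report_given_not_h ** 2
    return joint_h / (joint_h + joint_not_h)

prior = 0.2
# Witnesses with some individual credibility: agreement is truth-conducive.
print(posterior_given_two_reports(prior, 0.7, 0.3))   # about 0.58
# Witnesses with no individual credibility: agreement changes nothing.
print(posterior_given_two_reports(prior, 0.5, 0.5))   # exactly 0.2
```

With likelihoods 0.7 versus 0.3 the agreeing reports raise P(H) from 0.2 to about 0.58; with 0.5 versus 0.5 the posterior stays at 0.2 no matter how many agreeing witnesses are added.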
We put forward a new, ‘coherentist’ account of quantum entanglement, according to which entangled systems are characterized by symmetric relations of ontological dependence among the component particles. We compare this coherentist viewpoint with the two most popular alternatives currently on offer—structuralism and holism—and argue that it is essentially different from, and preferable to, both. In the course of this article, we point out how coherentism might be extended beyond the case of entanglement and further articulated.
Ted Poston's book Reason and Explanation: A Defense of Explanatory Coherentism is worthy of careful study. Poston develops and defends an explanationist theory of (epistemic) justification on which justification is a matter of explanatory coherence which in turn is a matter of conservativeness, explanatory power, and simplicity. He argues that his theory is consistent with Bayesianism. He argues, moreover, that his theory is needed as a supplement to Bayesianism. There are seven chapters. I provide a chapter-by-chapter summary along with some substantive concerns.
Coherentism maintains that coherent beliefs are more likely to be true than incoherent beliefs, and that coherent evidence provides more confirmation of a hypothesis when the evidence is made coherent by the explanation provided by that hypothesis. Although probabilistic models of credence ought to be well-suited to justifying such claims, negative results from Bayesian epistemology have suggested otherwise. In this essay we argue that the connection between coherence and confirmation should be understood as a relation mediated by the causal relationships among the evidence and a hypothesis, and we offer a framework for doing so by fitting together probabilistic models of coherence, confirmation, and causation. We show that the causal structure among the evidence and hypothesis is sometimes enough to determine whether the coherence of the evidence boosts confirmation of the hypothesis, makes no difference to it, or even reduces it. We also show that, ceteris paribus, it is not the coherence of the evidence that boosts confirmation, but rather the ratio of the coherence of the evidence to the coherence of the evidence conditional on a hypothesis.
The philosophy of measurement studies the conceptual, ontological, epistemic, and technological conditions that make measurement possible and reliable. A new wave of philosophical scholarship has emerged in the last decade that emphasizes the material and historical dimensions of measurement and the relationships between measurement and theoretical modeling. This essay surveys these developments and contrasts them with earlier work on the semantics of quantity terms and the representational character of measurement. The conclusions highlight four characteristics of the emerging research program in philosophy of measurement: it is epistemological, coherentist, practice oriented, and model based.
I develop a probabilistic account of coherence, and argue that at least in certain respects it is preferable to (at least some of) the main extant probabilistic accounts of coherence: (i) Igor Douven and Wouter Meijs’s account, (ii) Branden Fitelson’s account, (iii) Erik Olsson’s account, and (iv) Tomoji Shogenji’s account. Further, I relate the account to an important, but little discussed, problem for standard varieties of coherentism, viz., the “Problem of Justified Inconsistent Beliefs”.
I argue that coherence is truth-conducive in that coherence implies an increase in the probability of truth. Central to my argument is a certain principle for transitivity in probabilistic support. I then address a question concerning the truth-conduciveness of coherence as it relates to (something else I argue for) the truth-conduciveness of consistency, and consider how the truth-conduciveness of coherence bears on coherentist theories of justification.
This article examines the method of reflective equilibrium (RE) and its role in philosophical inquiry. It begins with an overview of RE before discussing some of the subtleties involved in its interpretation, including challenges to the standard assumption that RE is a form of coherentism. It then evaluates some of the main objections to RE, in particular, the criticism that this method generates unreasonable beliefs. It concludes by considering how RE relates to recent debates about the role of intuitions in philosophy.
This brief reply to McCain and Poston's chapter problematizes both their objections to my chapter on experience justifying belief and their version of epistemological coherentism.
Applying a theory of psychological modularity, I argue for a theory of defeasibility conditions for the epistemic justification of perceptual beliefs. My theory avoids the extremes of holism (e.g., coherentism and confirmation holism) and of foundationalist theories of non-inferential justification.
In this paper, we evaluate some proposals that can be advanced to clarify the ontological consequences of Relational Quantum Mechanics. We first focus on priority monism and ontic structural realism and argue that these views are not suitable for providing an ontological interpretation of the theory. Then, we discuss an alternative interpretation that we regard as more promising, based on so-called ‘metaphysical coherentism’, which we also connect to the idea of an event-based, or ‘flash’, ontology.
We can distinguish between ambitious metanormative constructivism and a variety of other constructivist projects in ethics and metaethics. Ambitious metanormative constructivism is the project of either developing a type of new metanormative theory, worthy of the label “constructivism”, that is distinct from the existing types of metaethical, or metanormative, theories already on the table—various realisms, non-cognitivisms, error-theories and so on—or showing that the questions that lead to these existing types of theories are somehow fundamentally confused. Natural ways of pursuing the project of ambitious metanormative constructivism lead to certain obvious, and related, worries about whether the ambitions are really being achieved—that is, whether we really are being given a distinctive theory. I will argue that responding to these initial worries pushes ambitious metanormative constructivism towards adopting a kind of position that I will call “constructivism all the way down”. Such a position does see off most of the above initial worries. Drawing on the work of Ralph Walker and Crispin Wright, I argue, however, that it faces a distinct objection that is a descendant of Bertrand Russell’s Bishop Stubbs objection against coherentist theories of truth. I grant that the constructivist need not be a coherentist about truth. I argue, however, that despite this the constructivist cannot escape my version of the objection. I also distinguish between this objection and various traditional charges of circularity, regress, relativism, or psychologistic reductionism.
I will compare Lehrer’s anti-skeptical strategy from a coherentist point of view with the anti-skeptical strategy of the Mooreans. I will argue that there are strong similarities between them: neither can present a persuasive argument to the skeptic and both face the problem of easy knowledge in one way or another. However, both can offer a complete and self-explanatory explanation of knowledge, although Mooreanism can offer the more natural one. Hence, one has good reasons to prefer Mooreanism to Lehrer’s anti-skeptical approach, if one does not prefer coherentism to foundationalism for other reasons.
Lam and Esfeld have argued that, within Bohmian mechanics, the wave function can be interpreted as a physical structure instantiated by the fundamental particles posited by the theory. Further, to characterize the nature of this structure, they appeal to the framework of Ontic Structural Realism, thereby proposing a structuralist interpretation of Bohmian mechanics. However, I shall point out that OSR denotes a family of distinct views, each of which maintains a different account of the relation between structures and objects, and entails a different kind of ontology. Thus, in this paper I will show how to articulate the structuralist approach to Bohmian Mechanics according to the different standard versions of OSR, and I will evaluate these alternatives. Moreover, I will propose a novel and sui generis kind of structuralist interpretation of Bohmian Mechanics, based on the framework of metaphysical coherentism.
Andrew Cling presents a new version of the epistemic regress problem, and argues that intuitionist foundationalism, social contextualism, holistic coherentism, and infinitism fail to solve it. Cling’s discussion is quite instructive, and deserving of careful consideration. But, I argue, Cling’s discussion is not in all respects decisive. I argue that Cling’s dilemma argument against holistic coherentism fails.
As one of the first modern philosophers, Georg Simmel systematically developed a “relativistic world view” (Simmel 2004, VI). In this paper I attempt to examine Simmel’s relativistic answer to the question of truth. I trace his main arguments regarding the concept of truth and present his justification of epistemic relativism. In doing so, I also want to show that some of Simmel’s claims are surprisingly timely. Simmel’s relativistic concept of truth is supported by an evolutionary argument. The first part of this paper outlines that pragmatic foundation of his epistemology. The second part of the paper shows that Simmel develops what today would be called a coherence theory of truth. He presents his coherentist view that every belief is true only in relation to another one primarily as a theory of epistemic justification. The third part turns to Simmel’s original way of dealing with the (in)famous self-refutation charge against relativism.
According to Agrippa's trilemma, an attempt to justify something leads to either infinite regress, circularity, or dogmatism. This essay examines whether and to what extent the trilemma applies to ethics. There are various responses to the trilemma, such as foundationalism, coherentism, contextualism, infinitism, and German idealism. Examining those responses, the essay shows that the trilemma applies at least to rational justification of contentful moral beliefs. This means that rationalist ethics based on any contentful moral belief are rationally unjustifiable.
Foundationalists distinguish basic from nonbasic beliefs. At a first approximation, to say that a belief of a person is basic is to say that it is epistemically justified and it owes its justification to something other than her other beliefs, where “belief” refers to the mental state that goes by that name. To say that a belief of a person is nonbasic is to say that it is epistemically justified and not basic. Two theses constitute Foundationalism: (a) Minimality: There are some basic beliefs, and (b) Exclusivity: If there are any nonbasic beliefs, that is solely because they (ultimately) owe their justification to some basic belief. Proponents of Minimality but not Exclusivity endorse Minimal Foundationalism. Proponents of Exclusivity but not Minimality endorse either Epistemic Nihilism, the view that there are no justified beliefs, or some non-foundationalist epistemology such as Coherentism or Infinitism. In this essay I aim to characterize the notion of a basic belief more precisely and to assess some arguments for and against Foundationalism. In the process, I hope to exhibit the resilience and attractiveness of Foundationalism.
The thesis evaluates a contemporary debate concerning the very possibility of thinking about the world. In the first chapter, McDowell's critique of Davidson is presented, focusing on the coherentism defended by the latter. The critique of the myth of the given (as it appears in Sellars and Wittgenstein), as well as the necessity of a minimal empiricism (which McDowell finds in Quine and Kant), lead to an oscillation in contemporary thinking between two equally unsatisfactory ways of understanding the empirical content of thought. In the second chapter, I defend Davidson's approach, focusing on his theory of interpretation and semantic externalism, as well as on the relation between causes and reasons. In the third chapter, the debate is analyzed in more detail. I criticize anomalous monism, the way in which the boundaries between the conceptual and the non-conceptual are understood by Davidson, as well as the naturalized Platonism defended by McDowell. This thesis is mainly negative, and it concludes by revealing problems in both positions under evaluation.
Theorists are divided as to whether truth is or is not a substantive property. In a nutshell, those that maintain that it is, pragmatists, coherentists, and correspondence theorists among others, oppose deflationists who claim that ascribing truth to an assertion is nothing more, or little more, than simply making the assertion. Deflationists typically refuse to grant truth a metaphysical standing, although we must recognise deflationism is not just a statement about the metaphysical status of truth. Unfortunately, propertyhood is elusive to define in relation to truth, but to deny it is to say that truth is not a quality bestowed on truth-bearers, one that they possess; or to say that truth is not the kind of philosophical entity apt for dissection into constituents and common to all true assertions; or to say that truth ascription is a mere convenience, a façon de parler that eases conversation and confers style; or to say that truth is in some sense a trivial logical fragment that all but disappears upon closer inspection. Intuitions and theories vary on the details of what it takes to be a property, but some or all of these premises are accepted in embracing deflationism. Whiteness, for example, is an uncontroversial property of snow, even if it is arguably a relational one; what deflationism rejects is the analogous property of truth-bearers.
In this paper I discuss the role played by the ideas of ‘common sense’ and ‘common sense morality’ in Sidgwick’s system of ideas. I argue that, far from aiming at overcoming common sense morality, Sidgwick aimed purposely at grounding a consistent code of morality by methods allegedly taken from the example provided by the natural sciences, in order to reach also in the moral field some body of ‘mature’ knowledge similar to that provided by the natural sciences. His whole polemic with intuitionism was vitiated by the a priori assumption that the widespread ethos, not the theories of intuitionist philosophers, was what was really worth considering. In spite of the naïve positivist starting point, Sidgwick was encouraged by his own approach in exploring the fruitfulness of coherentist methods for normative ethics. Thus Sidgwick left an ambivalent legacy to twentieth-century ethics: the dogmatic idea of a ‘new’ morality of a consequentialist kind, and the fruitful idea that in normative ethics we can argue rationally even without shared foundations.
The relationship between experience and thought is one of the distinctive problems in contemporary philosophy and has significant implications for both philosophy of mind and epistemology. John McDowell, in his magnum opus Mind and World, has argued in favour of a rational and conceptual relationship between experience and thought. In our understanding of the relationship between experience and thought, in his opinion, we fall into an “intolerable oscillation” between the Myth of the Given and Coherentism. One of the pitfalls he specifically targets is Davidson’s coherentism, according to which there cannot be a rational relationship between experience and thought. The point Davidson makes is that our perception of the world cannot give justification to our beliefs about the world. Only a belief, in his opinion, can justify another belief, and this is considered to be one of the most controversial claims in contemporary philosophy. Both Davidson and McDowell would agree that the root of the problem pertaining to the relationship between experience and thought lies in the relationship between reason and nature (rationality and the natural world). In this paper, my aim is to critically evaluate the debate between Davidson and McDowell about the relationship between experience and thought in connection with their views on the relationship between reason and nature. I will argue that a rational relation between experience and thought is necessary for our thought to have genuine content from the external world.
Coherentism in epistemology has long suffered from lack of formal and quantitative explication of the notion of coherence. One might hope that probabilistic accounts of coherence such as those proposed by Lewis, Shogenji, Olsson, Fitelson, and Bovens and Hartmann will finally help solve this problem. This paper shows, however, that those accounts have a serious common problem: the problem of belief individuation. The coherence degree that each of the accounts assigns to an information set (or the verdict it gives as to whether the set is coherent tout court) depends on how beliefs (or propositions) that represent the set are individuated. Indeed, logically equivalent belief sets that represent the same information set can be given drastically different degrees of coherence. This feature clashes with our natural and reasonable expectation that the coherence degree of a belief set does not change unless the believer adds essentially new information to the set or drops old information from it; or, to put it simply, that the believer cannot raise or lower the degree of coherence by purely logical reasoning. None of the accounts in question can adequately deal with coherence once logical inferences get into the picture. Toward the end of the paper, another notion of coherence that takes into account not only the contents but also the origins (or sources) of the relevant beliefs is considered. It is argued that this notion of coherence is of dubious significance, and that it does not help solve the problem of belief individuation.
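To illustrate the belief-individuation worry with one concrete measure, here is a small sketch using the Shogenji measure (the toy probability assignment is my own; the paper's examples may differ): the sets {A, B} and {A & B, A v B} carry the same total information, since their conjunctions are logically equivalent, yet they receive different coherence scores.

```python
# Minimal sketch: the Shogenji coherence measure
# C(B1, ..., Bn) = P(B1 & ... & Bn) / (P(B1) * ... * P(Bn))
# applied to two logically equivalent representations of the same information.

# A toy probability space over two atomic propositions A and B:
# worlds are (A, B) truth-value pairs with stipulated probabilities.
worlds = {(True, True): 0.3, (True, False): 0.2,
          (False, True): 0.2, (False, False): 0.3}

def prob(prop):
    """Probability of a proposition, given as a function from worlds to bool."""
    return sum(p for w, p in worlds.items() if prop(w))

def shogenji(props):
    conjunction = lambda w: all(prop(w) for prop in props)
    denominator = 1.0
    for prop in props:
        denominator *= prob(prop)
    return prob(conjunction) / denominator

A = lambda w: w[0]
B = lambda w: w[1]
A_and_B = lambda w: w[0] and w[1]
A_or_B = lambda w: w[0] or w[1]

print(shogenji([A, B]))             # 0.3 / (0.5 * 0.5) = 1.2
print(shogenji([A_and_B, A_or_B]))  # 0.3 / (0.3 * 0.7) ~= 1.43
```

On this assignment the first representation scores 1.2 and the second about 1.43, so purely logical repackaging of the same information changes the measured coherence, which is just the kind of shift the paper argues such measures should not allow.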
I start by defining sentience and giving an analysis of the epistemological problems that plague its scientific study; this consists mainly in showing that the attribution of sentience is underdetermined by the data. Second, I show that as a result of this situation of underdetermination, most of the types of arguments used to infer sentience from the data are inconclusive and lead to a stalemate. Third, I argue that the stalemates arise from a foundationalist epistemology which needlessly leads to skeptical conclusions; as an alternative, I propose to adopt a coherentist framework and defend a process of ‘epistemic iteration’ (Chang 2004) within that framework, which I argue gives us a way out of the underdetermination.
The question of authenticity centers in the lives of women of color to invite and restrict their representative roles. For this reason, Gayatri Chakravorty Spivak and Uma Narayan advocate responding with strategic essentialism. This paper argues against such a strategy and proposes an epistemic understanding of the question of authenticity. The question stems from a kernel of truth—the connection between experience and knowledge. But a coherence theory of knowledge better captures the sociality and the holism of experience and knowledge.
Does a coherentist version of rationality issue requirements on states? Or does it issue requirements on processes? This paper evaluates the possibility of process-requirements. It argues that there are two possible definitions of state- and process-requirements: a satisfaction-based definition and a content-based definition. I demonstrate that the satisfaction-based definition is inappropriate. It does not allow us to uphold a clear-cut distinction between state- and process-requirements. We should therefore use a content-based definition of state- and process-requirements. However, a content-based definition entails that rationality does not issue process-requirements. Content-based process-requirements violate the principle that ‘rationality requires’ implies ‘can satisfy’. The conclusion of this paper therefore amounts to a radical rejection of process-requirements of rationality.
We will be in a better position to evaluate some important skeptical theses if we first investigate two questions about justified suspended judgment. One question is this: when, if ever, does one justified suspension confer justification on another suspension? And the other is this: what is the structure of justified suspension? The goal of this essay is to make headway at answering these questions. After surveying the four main views about the non-normative nature of suspended judgment and offering a taxonomy of the epistemic principles that might govern which suspended judgments are justified, I will isolate five important principles that might govern which suspended judgments are justified. I will call these suspension-to-suspension principles. I will then evaluate these principles by the lights of each of the four views about what suspensions are. I close by drawing some conclusions about the prospects for skepticism, the structure of justified suspended judgment, and the importance of theorizing about justified suspended judgment.
How can we figure out what’s right or wrong, if moral truths are neither self-evident nor something we can perceive? Very roughly, the method of reflective equilibrium (RE) says that we should begin moral inquiry from what we already confidently think, seeking to find a match between our initial convictions and general principles that are well-supported by background theories, mutually adjusting both until we reach a coherent outlook in which our beliefs are in harmony (the equilibrium part) and we know why and how they support each other (the reflective part). It has been central to the self-understanding of normative ethics and other branches of philosophy in the last half-century. In this chapter, we examine the history of the idea of RE and introduce a schema for generating 256 variants. We explain why RE is subject to serious objections insofar as it purports to yield epistemic justification in virtue of achieving coherence. However, we also develop a new argument to the effect that RE is the best feasible method for us to achieve moral understanding and the ability to justify our judgments to others. It may thus be crucial for responsible moral inquiry, even if coherence among considered judgments and principles is neither sufficient nor necessary for justified moral belief.