Moral luck occurs when factors beyond an agent’s control affect her blameworthiness. Several scholars deny the existence of moral luck by distinguishing judgments of blameworthiness from blame-related practices. Luck does not affect an agent’s blameworthiness because morality is conceptually fair, but it can affect the appropriate degree of blame for that agent. While separatism resolves the paradox of moral luck, we aim to show that it needs amendment, because it is unfair to treat two equally blameworthy people unequally. We argue that separatists should conceive of fairness as a pro tanto reason for blame. By locating fairness as a ground for blame within a wider axiology of blame, separatism could resolve the challenge of blaming fairly. In such an axiology, reasonable and fair blame diverge.
A prominent view in contemporary philosophy of technology suggests that more technology implies more possibilities and, therefore, more responsibilities. Consequently, the question ‘What technology?’ is discussed primarily against the backdrop of assessing, assigning, and avoiding technology-borne culpability. The view is reminiscent of the Olympian gods’ vengeful and harsh reaction to Prometheus’ play with fire. However, the Olympian view leaves unexplained how technologies increase possibilities. Also, if the Olympians are right, endorsing their view will at some point demand putting a halt to technological development, which is absurd. Hence, we defend an alternative perspective on the relationship between responsibility and technology: our Promethean view recognises technology as the result of collective, forward-looking responsibility and not only as a cause thereof. Several examples illustrate that technologies are not always the right means to tackle human vulnerabilities. Together, these arguments prompt a change in focus from the question ‘What technology?’ to ‘Why technology?’.
I propose that the meanings of vague expressions render the truth conditions of utterances of sentences containing them sensitive to our interests. For example, 'expensive' is analyzed as meaning 'costs a lot', which in turn is analyzed as meaning 'costs significantly greater than the norm'. Whether a difference is a significant difference depends on what our interests are. Appeal to the proposal is shown to provide an attractive resolution of the sorites paradox that is compatible with classical logic and semantics.
Animal ethicists have recently debated the ethical questions raised by disenhancing animals to improve their welfare. Here, we focus on the particular case of breeding hens for commercial egg-laying systems to become blind, in order to benefit their welfare. Many people find breeding blind hens intuitively repellent, yet ‘welfare-only’ positions appear to be committed to endorsing this possibility if it produces welfare gains. We call this the ‘Blind Hens’ Challenge’. In this paper, we argue that there are both empirical and theoretical reasons why even those adopting ‘welfare-only’ views should be concerned about breeding blind hens. But we also argue that alternative views, which (for example) claim that it is important to respect the telos or rights of an animal, do not offer a more convincing solution to questions raised by the possibility of disenhancing animals for their own benefit.
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
In this paper I draw attention to a peculiar epistemic feature exhibited by certain deductively valid inferences. Certain deductively valid inferences are unable to enhance the reliability of one's belief that the conclusion is true—in a sense that will be fully explained. As I shall show, this feature is demonstrably present in certain philosophically significant inferences—such as G. E. Moore's notorious 'proof' of the existence of the external world. I suggest that this peculiar epistemic feature might be correlated with the much discussed phenomenon that Crispin Wright and Martin Davies have called 'transmission failure'—the apparent failure, on the part of some deductively valid inferences, to transmit one's justification for believing the premises.
There is something puzzling about statistical evidence. One place this manifests is in the law, where courts are reluctant to base affirmative verdicts on evidence that is purely statistical, in spite of the fact that it is perfectly capable of meeting the standards of proof enshrined in legal doctrine. After surveying some proposed explanations for this, I shall outline a new approach – one that makes use of a notion of normalcy that is distinct from the idea of statistical frequency. The puzzle is not, however, merely a legal one. Our unwillingness to base beliefs on statistical evidence is by no means limited to the courtroom, and is at odds with almost every general principle that epistemologists have proposed as to how we ought to manage our beliefs.
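A standard toy case from this literature (not drawn from the paper itself, with the numbers chosen purely for illustration) shows how purely statistical evidence can clear a probabilistic reading of the standard of proof:

```latex
% Gatecrasher-style illustration (assumed numbers): of 1,000 spectators at an
% event, 990 are known to have entered without paying. For an arbitrary
% spectator S, the purely statistical evidence yields
\[
  \Pr(\text{$S$ did not pay}) \;=\; \frac{990}{1000} \;=\; 0.99,
\]
% a probability well above the usual probabilistic gloss on the civil standard
% ($> 0.5$) and above demanding thresholds such as $0.95$, yet courts are
% reluctant to find against $S$ on this evidence alone.
```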
According to a captivating picture, epistemic justification is essentially a matter of epistemic or evidential likelihood. While certain problems for this view are well known, it is motivated by a very natural thought—if justification can fall short of epistemic certainty, then what else could it possibly be? In this paper I shall develop an alternative way of thinking about epistemic justification. On this conception, the difference between justification and likelihood turns out to be akin to the more widely recognised difference between ceteris paribus laws and brute statistical generalisations. I go on to discuss, in light of this suggestion, issues such as classical and lottery-driven scepticism as well as the lottery and preface paradoxes.
According to the principle of Conjunction Closure, if one has justification for believing each of a set of propositions, one has justification for believing their conjunction. The lottery and preface paradoxes can both be seen as posing challenges for Closure, but leave open familiar strategies for preserving the principle. While this is all relatively well-trodden ground, a new Closure-challenging paradox has recently emerged, in two somewhat different forms, due to Backes (2019a: 3773–3787) and Praolini (2019: 715–726). This paradox synthesises elements of the lottery and the preface and is designed to close off the familiar Closure-preserving strategies. By appealing to a normic theory of justification, I will defend Closure in the face of this new paradox. Along the way I will draw more general conclusions about justification, normalcy and defeat, which bear upon what Backes (2019b: 2877–2895) has dubbed the ‘easy defeat’ problem for the normic theory.
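Stated schematically (a rough rendering, with J(p) abbreviating ‘one has justification for believing p’), the Closure principle at issue is:

```latex
% Conjunction Closure, rendered schematically: justification for each conjunct
% yields justification for the conjunction.
\[
  J(p_1) \wedge J(p_2) \wedge \dots \wedge J(p_n)
  \;\Longrightarrow\;
  J(p_1 \wedge p_2 \wedge \dots \wedge p_n)
\]
```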
Is it right to convict a person of a crime on the basis of purely statistical evidence? Many who have considered this question agree that it is not, posing a direct challenge to legal probabilism – the claim that the criminal standard of proof should be understood in terms of a high probability threshold. Some defenders of legal probabilism have, however, held their ground: Schoeman (1987) argues that there are no clear epistemic or moral problems with convictions based on purely statistical evidence, and speculates that our aversion to such convictions may be nothing more than an irrational bias. More recently, Hedden and Colyvan (2019, section VI) describe our reluctance to convict on the basis of purely statistical evidence as an ‘intuition’, but suggest that there may be no ‘in principle’ problem with such convictions (see also Papineau, forthcoming, section 6). In this paper, I argue that there is, in some cases, an in principle problem with a conviction based upon statistical evidence alone – namely, it commits us to a precedent which, if consistently followed through, could lead to the deliberate conviction of an innocent person. I conclude with some reflections on the idea that the criminal justice system should strive to maximise the accuracy of its verdicts – and the related idea that we should each strive to maximise the accuracy of our beliefs.
One of the most intriguing claims in Sven Rosenkranz’s Justification as Ignorance is that Timothy Williamson’s celebrated anti-luminosity argument can be resisted when it comes to the condition ~K~KP—the condition that one is in no position to know that one is in no position to know P. In this paper, I critically assess this claim.
Recent years have seen an explosion of interest in metaphysical explanation, and philosophers have fixed on the notion of ground as the conceptual tool with which such explanation should be investigated. I will argue that this focus on ground is myopic and that some metaphysical explanations that involve the essences of things cannot be understood in terms of ground. Such ‘essentialist’ explanation is of interest, not only for its ubiquity in philosophy, but for its being in a sense an ultimate form of explanation. I give an account of the sense in which such explanation is ultimate and support it by defending what I call the inessentiality of essence. I close by suggesting that this principle is the key to understanding why essentialist explanations can seem so satisfying.
Metaphysical rationalism, the doctrine which affirms the Principle of Sufficient Reason (the PSR), is out of favor today. The best argument against it is that it appears to lead to necessitarianism, the claim that all truths are necessarily true. Whatever the intuitive appeal of the PSR, the intuitive appeal of the claim that things could have been otherwise is greater. This problem did not go unnoticed by the great metaphysical rationalists Spinoza and Leibniz. Spinoza’s response was to embrace necessitarianism. Leibniz’s response was to argue that, despite appearances, rationalism does not lead to necessitarianism. This paper examines the debate between these two rationalists and concludes that Leibniz has persuasive grounds for his opinion. This has significant implications both for the plausibility of the PSR and for our understanding of modality.
In Explaining and Understanding International Relations, philosopher Martin Hollis and international relations scholar Steve Smith join forces to analyse the dominant theories of international relations and to examine the philosophical issues underlying them.
This paper proposes a view of time that takes passage to be the most basic temporal notion, instead of the usual A-theoretic and B-theoretic notions, and explores how we should think of a world that exhibits such a genuine temporal passage. It will be argued that an objective passage of time can only be made sense of from an atemporal point of view and only when it is able to constitute a genuine change of objects across time. This requires that passage can flip one fact into a contrary fact, even though neither side of the temporal passage is privileged over the other. We can make sense of this if the world is inherently perspectival. Such an inherently perspectival world is characterized by fragmentalism, a view that has been introduced by Fine in his ‘Tense and Reality’ (2005). Unlike Fine's tense-theoretic fragmentalism though, the proposed view will be a fragmentalist view based in a primitive notion of passage.
Theories of epistemic justification are commonly assessed by exploring their predictions about particular hypothetical cases – predictions as to whether justification is present or absent in this or that case. With a few exceptions, it is much less common for theories of epistemic justification to be assessed by exploring their predictions about logical principles. The exceptions are a handful of ‘closure’ principles, which have received a lot of attention, and which certain theories of justification are well known to invalidate. But these closure principles are only a small sample of the logical principles that we might consider. In this paper, I will outline four further logical principles that plausibly hold for justification and two which plausibly do not. While my primary aim is just to put these principles forward, I will use them to evaluate some different approaches to justification and (tentatively) conclude that a ‘normic’ theory of justification best captures its logic.
Our understanding of subjunctive conditionals has been greatly enhanced through the use of possible world semantics and, more precisely, by the idea that they involve variably strict quantification over possible worlds. I propose to extend this treatment to ceteris paribus conditionals – that is, conditionals that incorporate a ceteris paribus or ‘other things being equal’ clause. Although such conditionals are commonly invoked in scientific theorising, they traditionally arouse suspicion and apprehensiveness amongst philosophers. By treating ceteris paribus conditionals as a species of variably strict conditional I hope to shed new light upon their content and their logic.
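For orientation, the variably strict treatment being extended here is the familiar Lewis-style one; a rough statement of its standard truth conditions (given from general knowledge of Lewis 1973, not from the paper):

```latex
% Variably strict (counterfactual) conditional, standard Lewis-style clause:
\[
  w \models A \mathbin{\Box\!\!\rightarrow} C
  \;\iff\;
  \text{no $A$-world is accessible from $w$, or some accessible}
  \ (A \wedge C)\text{-world is closer to $w$ than any } (A \wedge \neg C)\text{-world.}
\]
```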
A ‘lottery belief’ is a belief that a particular ticket has lost a large, fair lottery, based on nothing more than the odds against it winning. The lottery paradox brings out a tension between the idea that lottery beliefs are justified and the idea that one can always justifiably believe the deductive consequences of things that one justifiably believes – what is sometimes called the principle of closure. Many philosophers have treated the lottery paradox as an argument against the second idea – but I make a case here that it is the first idea that should be given up. As I shall show, there are a number of independent arguments for denying that lottery beliefs are justified.
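The tension is easy to make vivid with a little arithmetic (an illustration only; the size of the lottery is chosen arbitrarily):

```latex
% For a fair lottery with n = 1{,}000{,}000 tickets, the odds alone make each
% lottery belief extremely probable:
\[
  \Pr(\text{ticket } i \text{ loses}) \;=\; 1 - \tfrac{1}{n} \;=\; 0.999999 .
\]
% But if each such belief is justified and closure holds, one is justified in
% believing the conjunction 'every ticket loses', which has probability 0 and
% contradicts the known fact that exactly one ticket wins.
```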
Any explanation of one fact in terms of another will appeal to some sort of connection between the two. In a causal explanation, the connection might be a causal mechanism or law. But not all explanations are causal, and neither are all explanatory connections. For example, in explaining the fact that a given barn is red in terms of the fact that it is crimson, we might appeal to a non-causal connection between things’ being crimson and their being red. Many such connections, like this one, are general rather than particular. I call these general non-causal explanatory connections 'laws of metaphysics'. In this paper I argue that some of these laws are to be found in the world at its most fundamental level, forming a bridge between fundamental reality and everything else. It is only by admitting fundamental laws, I suggest, that we can do justice to the explanatory relationship between what is fundamental and what is not. And once these laws are admitted, we are able to provide a nice resolution of the puzzle of why there are any non-fundamental facts in the first place.
Generally accepted modern models of the growth of knowledge are scrutinized. It is maintained that Thomas Kuhn’s model of the growth of knowledge is grounded pre-eminently in Heidegger’s epistemology. To justify this tenet, the corresponding works of both thinkers are considered. As a result, a one-to-one correspondence between the key propositions of Heideggerian epistemology and the basic tenets of Kuhn’s model is elicited. The tenets under consideration include the holistic nature of a paradigm, the incommensurability thesis, the conventional status of a paradigm arising from the pragmatist way its vocabulary is justified, and even the central example: the connection between Aristotelian and Newtonian mechanics. It is conjectured that an indirect influence of Heidegger upon Kuhn should be taken into account to explain this isomorphism, for instance through the works of Alexandre Koyré, whom Kuhn admired. As is well known, Koyré had close professional links with another Russian émigré, Alexandre Kojève, who presented in his 1933–1939 Paris lectures Hegel’s “Phenomenology of Spirit” seen through the cognitive lens of Heideggerian phenomenology. Key words: Martin Heidegger, Thomas Kuhn, Imre Lakatos, growth of knowledge, paradigm, incommensurability thesis, holism, pragmatism.
Many epistemologists have responded to the lottery paradox by proposing formal rules according to which high probability defeasibly warrants acceptance. Douven and Williamson present an ingenious argument purporting to show that such rules invariably trivialise, in that they reduce to the claim that a probability of 1 warrants acceptance. Douven and Williamson’s argument does, however, rest upon significant assumptions – amongst them a relatively strong structural assumption to the effect that the underlying probability space is both finite and uniform. In this paper, I will show that something very like Douven and Williamson’s argument can in fact survive with much weaker structural assumptions – and, in particular, can apply to infinite probability spaces.
This critical review aims to more fully situate the claim Martin Heidegger makes in ‘Letter on Humanism’ that a “productive dialogue” between his work and that of Karl Marx is possible. The prompt for this is Laurence Paul Hemming’s recently published Heidegger and Marx: A Productive Dialogue over the Language of Humanism (2013), which fails to fully account for the historical situation which motivated Heidegger’s seemingly positive endorsement of Marxism. This piece will show that there were significant external factors which influenced Heidegger’s claim and that, when seen within his broader corpus, these particular comments in ‘Letter on Humanism’ are evidently disingenuous, given that his general opinion of Marxism can only be described as vitriolic. Any attempt to explore how such a “productive dialogue” could be construed must fully contextualise Heidegger’s claim for it. This piece will aim to do that, and more broadly explore Heidegger’s general opinion of Marxism.
In 1990 Edward Craig published a book called Knowledge and the State of Nature in which he introduced and defended a genealogical approach to epistemology. In recent years Craig’s book has attracted a lot of attention, and his distinctive approach has been put to a wide range of uses, including anti-realist metaepistemology, contextualism, relativism, anti-luck virtue epistemology, epistemic injustice, the value of knowledge, pragmatism and virtue epistemology. While objections to Craig’s approach have accumulated, there has been no sustained attempt to develop answers to these objections. In this paper we provide answers to seven important objections in the literature.
There are a number of debates that are relevant to questions concerning objectivity in science. One of the oldest, and still one of the most intensely fought, is the debate over epistemic relativism. All forms of epistemic relativism commit themselves to the view that it is impossible to show in a neutral, non-question-begging, way that one “epistemic system”, that is, one interconnected set of epistemic standards, is epistemically superior to others. I shall call this view “No-metajustification”. No-metajustification is commonly taken to deny the objectivity of standards. In this paper I shall discuss two currently popular attempts to attack “No-metajustification”. The first attempt attacks No-metajustification by challenging a particular strategy of arguing in its defence: this strategy involves the ancient Pyrrhonian “Problem of the Criterion”. The second attempt to refute No-metajustification targets its metaphysical underpinning: to wit, the claim that there are, or could be, several fundamentally different and irreconcilable epistemic systems. I shall call this assumption “Pluralism”. I shall address three questions with respect to these attempts to refute epistemic relativism by attacking No-metajustification: Can the epistemic relativist rely on the Problem of the Criterion in support of No-metajustification? Is a combination of Chisholmian “particularism” and epistemic naturalism an effective weapon against No-metajustification? And is Pluralism a defensible assumption?
In the debate about Heidegger’s commitment to National Socialism, reference is often made to his membership in the “Committee for the Philosophy of Right” (“Ausschuss für Rechtsphilosophie”) of the “Academy for German Law” (“Akademie für Deutsches Recht”), founded in 1934 by the then “Reichsminister” Hans Frank. Since the protocols of the Committee were destroyed and no relevant information is to be found in other writings, nothing can be said about the frequency or content of its meetings. It is only documented that the Committee was officially dissolved in 1938. In September 2017, however, the philosopher Sidonie Kellerer and the semiotician François Rastier pointed to a document that, they say, proves that Heidegger remained on the Committee until 1941/42 and that the Committee participated “in practice” (Rastier) in the Holocaust. That document was depicted for the first time in the above-mentioned FAZ publication and is analysed in detail in the present essay. The analysis rules out that the document proves either the continued existence of the “Committee for the Philosophy of Right” until 1941/42 or the alleged participation. Rather, it can be concluded with high probability that the document merely lists the names and addresses of potential experts for the conversion of the Civil Code (BGB) into a “Volksgesetzbuch”. The allegation that the Committee participated in the Holocaust is rejected as untenable. The publication of the article in the FAZ triggered the “Debate about Heidegger and Fake News”.
The tremendous advances in research on artificial intelligence and neuroscience over the last two to three decades have lent further support to a renewed interest in philosophical discussions of the mind-body problem. The last decade in particular has seen a revival of panpsychist and idealist considerations, often focused on solving philosophical puzzles like the so-called hard problem of consciousness [1–9]. While a number of respectable philosophers now advocate some sort of panpsychist solution to the mind-body problem, fewer argue that idealism can contribute substantially to the debate. Interest in idealism has nevertheless risen again, as can also be seen from recent overview articles and collections of works [10–14]. The working hypothesis here is that a properly formulated idealism can not only provide an alternative view of the mind/matter gap, but that this new view will also shed light on open questions in our common scientific, i.e. materialist, world view. To investigate this possibility, idealism first of all needs a model for the integration of modern science which allows for a mathematically consistent reinterpretation of the physical world as a limiting case of a world that is both material and non-material, which would make the outcome of idealistic considerations accessible to scientific investigation. To develop such a model, I will first try to explain what I mean by a ‘scientifically tenable’ idealism, including a formulation of the emanation problem, which for idealism replaces the interaction problem; then give a very brief summary of the elements of such a theory available in the philosophical literature; before sketching out some ‘design questions’ which have to be answered in constructing such models; and finally put forward a first model for a scientifically tenable objective idealism.
In ‘The normative role of knowledge’ (2012), Declan Smithies defends a ‘JK-rule’ for belief: One has justification to believe that P iff one has justification to believe that one is in a position to know that P. Similar claims have been defended by others (Huemer 2007; Reynolds forthcoming). In this paper, I shall argue that the JK-rule is false. The standard and familiar way of arguing against putative rules for belief or assertion is, of course, to describe putative counterexamples. My argument, though, won’t be like this – indeed I doubt that there are any intuitively compelling counterexamples to the JK-rule. Nevertheless, the claim that there are counterexamples to the JK-rule can, I think, be given something approaching a formal proof. My primary aim here is to sketch this proof. I will briefly consider some broader implications for how we ought to think about the epistemic standards governing belief and assertion.
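Rendered schematically (a rough formalisation, with J for ‘one has justification to believe that’ and K for ‘one is in a position to know that’), the JK-rule is the biconditional:

```latex
% The JK-rule, stated schematically as a biconditional.
\[
  J(P) \;\Longleftrightarrow\; J(K P)
\]
```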
Fragmentalism was first introduced by Kit Fine in his ‘Tense and Reality’. According to fragmentalism, reality is an inherently perspectival place that exhibits a fragmented structure. The current paper defends the fragmentalist interpretation of the special theory of relativity, which Fine briefly considers in his paper. The fragmentalist interpretation makes room for genuine facts regarding absolute simultaneity, duration and length. One might worry that positing such variant properties is a turn for the worse in terms of theoretical virtues because such properties are not involved in physical explanations and hence theoretically redundant. It will be argued that this is not right: if variant properties are indeed instantiated, they will also be involved in straightforward physical explanations and hence not explanatorily redundant. Hofweber and Lange, in their ‘Fine’s Fragmentalist Interpretation of Special Relativity’, object that the fragmentalist interpretation is in tension with the right explanation of the Lorentz transformations. It will be argued that their objection targets an inessential aspect of the fragmentalist framework and fails to raise any serious problem for the fragmentalist interpretation of special relativity.
Say that two goals are normatively coincident just in case one cannot aim for one goal without automatically aiming for the other. While knowledge and justification are distinct epistemic goals, with distinct achievement conditions, this paper begins from the suggestion that they are nevertheless normatively coincident—aiming for knowledge and aiming for justification are one and the same activity. A number of surprising consequences follow from this—both specific consequences about how we can ascribe knowledge and justification in lottery cases and more general consequences about the nature of justification and the relationship between justification and evidential probability. Many of these consequences turn out to be at variance with conventional, prevailing views.
People have been arguing about natural law for at least a couple of thousand years now. During that time, a number of substantially different sorts of theory have been identified as falling within the natural law tradition. Even within each sort of natural law theory, there has been a variety of quite different arguments proposed, both on behalf of and in opposition to the theory. These facts about the natural law tradition serve to confound its critics. It's extremely tough to get a tidy formulation of just what natural law theory is, and such a formulation is needed if the theory is to be analyzed as to its adequacy, truth, value, or whatever. I am going to try to perform such an analysis nonetheless, and to do it as neatly as possible within a fairly short space.
Martin Peterson’s The Ethics of Technology: A Geometric Analysis of Five Moral Principles offers a welcome contribution to the ethics of technology, understood by Peterson as a branch of applied ethics that attempts ‘to identify the morally right courses of action when we develop, use, or modify technological artifacts’ (3). He argues that problems within this field are best treated by the use of five domain-specific principles: the Cost-Benefit Principle, the Precautionary Principle, the Sustainability Principle, the Autonomy Principle, and the Fairness Principle. These principles are, in turn, to be understood and applied with reference to the geometric method. This method is perhaps the most interesting and novel part of Peterson’s book, and I’ll devote the bulk of my review to it.
In a number of papers and in his recent book, Is Water H₂O? Evidence, Realism, Pluralism (2012), Hasok Chang has argued that the correct interpretation of the Chemical Revolution provides a strong case for the view that progress in science is served by maintaining several incommensurable “systems of practice” in the same discipline, and concerning the same region of nature. This paper is a critical discussion of Chang's reading of the Chemical Revolution. It seeks to establish, first, that Chang's assessment of Lavoisier's and Priestley's work and character follows the phlogistonists' “actors' sociology”; second, that Chang simplifies late-eighteenth-century chemical debates by reducing them to an alleged conflict between two systems of practice; third, that Chang's evidence for a slow transition from phlogistonist theory to oxygen theory is not strong; and fourth, that he is wrong to assume that chemists at the time did not have overwhelmingly good reasons to favour Lavoisier's over the phlogistonists' views.
This paper seeks to widen the dialogue between the “epistemology of peer disagreement” and the epistemology informed by Wittgenstein’s last notebooks, later edited as On Certainty. The paper defends the following theses: not all certainties are groundless; many of them are beliefs; and they do not have a common essence. An epistemic peer need not share all of my certainties. Which response to a disagreement over a certainty is called for depends on the type of certainty in question. Sometimes a form of relativism is the right response. Reasonable, mutually recognized peer disagreement over a certainty is possible. The paper thus addresses both interpretative and systematic issues. It uses Wittgenstein as a resource for thinking about peer disagreement over certainties.
The Principle of Indifference (POI) was once regarded as a linchpin of probabilistic reasoning, but has now fallen into disrepute as a result of the so-called problem of multiple partitions. In ‘Evidential symmetry and mushy credence’, Roger White suggests that we have been too quick to jettison this principle and argues that the problem of multiple partitions rests on a mistake. In this paper I will criticise White’s attempt to revive POI. In so doing, I will argue that what underlies the problem of multiple partitions is a fundamental tension between POI and the very idea of evidential incomparability.
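For readers unfamiliar with the problem of multiple partitions, a standard toy case from this literature (not taken from White’s paper or from this reply) brings out the difficulty:

```latex
% The Principle of Indifference, roughly: given n mutually exclusive and
% jointly exhaustive possibilities and no evidence favouring any of them,
% assign each a probability of 1/n.
%
% Multiple-partitions worry (standard cube-factory case): a factory produces
% cubes with side length somewhere in (0, 1]. Indifference over side length
% gives
\[
  \Pr\!\left(\text{side} \le \tfrac{1}{2}\right) \;=\; \tfrac{1}{2},
\]
% while indifference over volume, which also ranges over (0, 1], gives
\[
  \Pr\!\left(\text{volume} \le \tfrac{1}{8}\right) \;=\; \tfrac{1}{8},
\]
% even though 'side <= 1/2' and 'volume <= 1/8' pick out exactly the same cubes.
```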
My concern in this paper is with the claim that knowledge is a mental state – a claim that Williamson places front and centre in Knowledge and Its Limits. While I am not by any means convinced that the claim is false, I do think it carries certain costs that have not been widely appreciated. One source of resistance to this claim derives from internalism about the mental – the view, roughly speaking, that one’s mental states are determined by one’s internal physical state. In order to know that something is the case it is not, in general, enough for one’s internal physical state to be a certain way – the wider world must also be a certain way. If we accept that knowledge is a mental state, we must give up internalism. One might think that this is no cost, since much recent work in the philosophy of mind has, in any case, converged on the view that internalism is false. This thought, though, is too quick. As I will argue here, the claim that knowledge is a mental state would take us to a view much further from internalism than anything philosophers of mind have converged upon.
Companies committed to corporate social responsibility (CSR) should ensure that their managers possess the appropriate competencies to effectively manage the CSR adaptation process. The literature provides insights into the individual competencies these managers need but fails to prioritize them and adequately contextualize them in a manner that makes them meaningful in practice. In this study, we contextualized the competencies within the different job roles CSR managers have in the CSR adaptation process. We interviewed 28 CSR managers, followed by a survey to explore the relative importance of the competencies within each job role. Based on our analysis, we identified six distinct managerial roles, including strategic, coordinating, and stimulating roles. Next, for each role we identified the key individual CSR-related competencies as prioritized by the respondents. Our results show that the context, as indicated in this study by CSR managers’ job roles, indeed influenced the importance of particular CSR-related competencies, because each role seems to require a different combination and prioritization of these competencies. Moreover, the results suggest that the relative importance of these competencies within each role may be driven by business logic rather than an idealistic logic. The results are presented as a competence profile which can serve as a reflection tool and as a frame of reference for further developing the competence profile for CSR managers.
The epistemology of religion is the branch of epistemology concerned with the rationality, the justificatory status and the knowledge status of religious beliefs – most often the belief in the existence of an omnipotent, omniscient and loving God as conceived by the major monotheistic religions. While other sorts of religious beliefs – such as belief in an afterlife or in disembodied spirits or in the occurrence of miracles – have also been the focus of considerable attention from epistemologists, I shall concentrate here on belief in God. There were a number of significant works in the epistemology of religion written during the early and mid Twentieth Century. The late Twentieth Century, however, saw a surge of interest in this area, fuelled by the work of philosophers such as William Alston, Alvin Plantinga and Linda Zagzebski amongst others. Alston, Plantinga and Zagzebski succeeded in importing, into the epistemology of religion, various new ideas from mainstream epistemology – in particular, externalist approaches to justification, such as reliabilism, and virtue theoretic approaches to knowledge (see, for instance, Alston 1986, 1991; Plantinga 1988, 2000; Zagzebski 1993a, 1993b). This laid fertile ground for new research – questions about the justificatory and knowledge status of belief in God begin to look very different when viewed through the lens of theories such as these. I will begin by surveying some of this groundbreaking work in the present article, before moving on to work from the last five years – a period in which the epistemology of religion has again received impetus from a number of ideas from mainstream epistemology; ideas such as pragmatic encroachment, phenomenal conservatism and externalist theories of evidence.
A de minimis risk is defined as a risk that is so small that it may be legitimately ignored when making a decision. While ignoring small risks is common in our day-to-day decision making, attempts to introduce the notion of a de minimis risk into the framework of decision theory have run up against a series of well-known difficulties. In this paper, I will develop an enriched decision theoretic framework that is capable of overcoming two major obstacles to the modelling of de minimis risk. The key move is to introduce, into decision theory, a non-probabilistic conception of risk known as normic risk.
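To fix ideas, here is a minimal sketch of the kind of probabilistic de minimis cut-off that runs into the familiar difficulties; it illustrates the standard threshold idea only, not the paper’s normic framework, and the threshold value and example gamble are assumptions chosen for the example.

```python
# A minimal sketch (illustrative only, not the paper's normic framework):
# a naive probabilistic de minimis rule that drops any outcome whose
# probability falls below a chosen threshold before computing expected utility.
from typing import Dict


def expected_utility(probs: Dict[str, float], utils: Dict[str, float],
                     de_minimis: float = 0.0) -> float:
    """Expected utility, ignoring outcomes whose probability is below the
    de minimis threshold and renormalising over the outcomes that remain."""
    kept = {o: p for o, p in probs.items() if p >= de_minimis}
    total = sum(kept.values())
    return sum((p / total) * utils[o] for o, p in kept.items())


# Illustrative gamble: a tiny chance of catastrophe that a threshold of 1e-6
# would simply screen off.
probs = {"win": 0.5, "lose": 0.4999999, "catastrophe": 0.0000001}
utils = {"win": 100.0, "lose": -10.0, "catastrophe": -1_000_000.0}

print(expected_utility(probs, utils))                   # catastrophe counted
print(expected_utility(probs, utils, de_minimis=1e-6))  # catastrophe ignored
```

One well-known difficulty is already visible here: whether the catastrophe is ignored depends entirely on where the probability threshold happens to be set.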
This article aims to bring some work in contemporary analytic metaphysics to discussions of the Real Presence of Christ in the Eucharist. I will show that some unusual claims of the Real Presence doctrine exactly parallel what would be happening in the world if objects were to time-travel in certain ways. Such time-travel would make ordinary objects multiply located, and in the relevantly analogous respects. If it is conceptually coherent that objects behave in this way, we have a model for the behaviour of the Eucharist which shows the doctrine to be coherent, at least with respect to the issues discussed.
Entitlement is conceived as a kind of positive epistemic status, attaching to certain propositions, that involves no cognitive or intellectual accomplishment on the part of the beneficiary — a status that is in place by default. In this paper I will argue that the notion of entitlement — or something very like it — falls out of an idea that may at first blush seem rather disparate: that the evidential support relation can be understood as a kind of variably strict conditional (in the sense of Lewis 1973). Lewis provided a general recipe for deriving what he termed inner modalities from any variably strict conditional governed by a logic meeting certain constraints. On my proposal, entitlement need be nothing more exotic than the inner necessity associated with evidential support. Understanding entitlement in this way helps to answer some common concerns — in particular, the concern that entitlement could only be a pragmatic, and not genuinely epistemic, status.
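As a rough gloss on the Lewisian recipe referred to here (stated from general knowledge of Lewis 1973, not from the paper; the mapping to evidential support is the paper’s own proposal): the inner necessity associated with a variably strict conditional can be obtained by plugging in a tautologous antecedent, so that A is inner-necessary just in case A holds throughout the closest (innermost non-empty sphere of) worlds.

```latex
% Inner necessity derived from a variably strict conditional, roughly in the
% style of Lewis (1973); the gloss assumes the usual sphere semantics.
\[
  \boxdot A \;:=\; \top \mathbin{\Box\!\!\rightarrow} A
\]
```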
There is a type of metaphysical picture that surfaces in a range of philosophical discussions, is of intrinsic interest and yet remains ill-understood. According to this picture, the world contains a range of standpoints relative to which different facts obtain. Any true representation of the world cannot but adopt a particular standpoint. The aim of this paper is to propose a regimentation of a metaphysics that underwrites this picture. Key components are a factive notion of metaphysical relativity, a deflationary notion of adopting standpoints and two kinds of valid inference, one that allows one to abandon standpoints and one that doesn’t. To better understand how theories formulated in terms of this framework are situated in dialectical space, I sketch a theory in the philosophy of time that admits both temporal and atemporal standpoints.
In this paper, I offer reasons for thinking that two prominent sceptical arguments in the literature – the underdetermination-based sceptical argument and the closure-based sceptical argument – are less philosophically interesting than is commonly supposed. The underdetermination-based argument begs the question against a non-sceptic and can be dismissed with little fanfare. The closure-based argument, though perhaps not question-begging per se, does rest upon contentious assumptions that a non-sceptic is under no pressure to accept.