Throughout the biological and biomedical sciences, prescriptive ‘minimum information’ (MI) checklists specifying the key information to include when reporting experimental results are beginning to find favor with experimentalists, analysts, publishers and funders alike. Such checklists aim to ensure that methods, data, analyses and results are described to a level sufficient to support the unambiguous interpretation, sophisticated search, reanalysis, experimental corroboration and reuse of data sets, facilitating the extraction of maximum value from them. However, such MI checklists are usually developed independently by groups working within particular biologically or technologically delineated domains. Consequently, an overview of the full range of checklists can be difficult to establish without intensive searching, and even tracking the evolution of a single checklist may be a non-trivial exercise. Checklists are also inevitably partially redundant with respect to one another, and where they overlap is far from straightforward to determine. Furthermore, conflicts in scope and arbitrary decisions on wording and sub-structuring inhibit their use in combination. Overall, these issues present significant difficulties for the users of checklists, especially those in areas such as systems biology, who routinely combine information from multiple biological domains and technology platforms. To address these issues, we present MIBBI (Minimum Information for Biological and Biomedical Investigations): a web-based communal resource for such checklists, designed to act as a ‘one-stop shop’ for those exploring the range of extant checklist projects, to foster collaborative, integrative development, and ultimately to promote the gradual integration of checklists.
With increasing publication and data production, scientific knowledge presents not simply an achievement but also a challenge. Scientific publications and data are increasingly treated as resources that need to be digitally ‘managed.’ This gives rise to scientific Knowledge Management (KM): second-order scientific work aiming to systematically collect, take care of and mobilise first-hand disciplinary knowledge and data in order to provide new first-order scientific knowledge. We follow the work of Leonelli, Efstathiou and Hislop in our analysis of the use of KM in semantic systems biology. Through an empirical philosophical account of KM-enabled biological research, we argue that KM helps produce new first-order biological knowledge that did not exist before, and which could not have been produced by traditional means. KM work is enabled by conceiving of ‘knowledge’ as an object for computational science: as explicated in the text of biological articles and computable via appropriate data and metadata. However, these founded knowledge concepts enabling computational KM risk treating only computationally tractable data as knowledge, underestimating practice-based knowing and its significance in ensuring the validity of ‘manageable’ knowledge as knowledge.
This book explores a question central to philosophy, namely: what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it – roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
There is something puzzling about statistical evidence. One place this manifests is in the law, where courts are reluctant to base affirmative verdicts on evidence that is purely statistical, in spite of the fact that it is perfectly capable of meeting the standards of proof enshrined in legal doctrine. After surveying some proposed explanations for this, I shall outline a new approach – one that makes use of a notion of normalcy that is distinct from the idea of statistical frequency. The puzzle is not, however, merely a legal one. Our unwillingness to base beliefs on statistical evidence is by no means limited to the courtroom, and is at odds with almost every general principle that epistemologists have proposed as to how we ought to manage our beliefs.
According to a captivating picture, epistemic justification is essentially a matter of epistemic or evidential likelihood. While certain problems for this view are well known, it is motivated by a very natural thought—if justification can fall short of epistemic certainty, then what else could it possibly be? In this paper I shall develop an alternative way of thinking about epistemic justification. On this conception, the difference between justification and likelihood turns out to be akin to the more widely recognised difference between ceteris paribus laws and brute statistical generalisations. I go on to discuss, in light of this suggestion, issues such as classical and lottery-driven scepticism as well as the lottery and preface paradoxes.
Is it right to convict a person of a crime on the basis of purely statistical evidence? Many who have considered this question agree that it is not, posing a direct challenge to legal probabilism – the claim that the criminal standard of proof should be understood in terms of a high probability threshold. Some defenders of legal probabilism have, however, held their ground: Schoeman (1987) argues that there are no clear epistemic or moral problems with convictions based on purely statistical evidence, and speculates that our aversion to such convictions may be nothing more than an irrational bias. More recently, Hedden and Colyvan (2019, section VI) describe our reluctance to convict on the basis of purely statistical evidence as an ‘intuition’, but suggest that there may be no ‘in principle’ problem with such convictions (see also Papineau, forthcoming, section 6). In this paper, I argue that there is, in some cases, an in principle problem with a conviction based upon statistical evidence alone – namely, it commits us to a precedent which, if consistently followed through, could lead to the deliberate conviction of an innocent person. I conclude with some reflections on the idea that the criminal justice system should strive to maximise the accuracy of its verdicts – and the related idea that we should each strive to maximise the accuracy of our beliefs.
In this paper I draw attention to a peculiar epistemic feature exhibited by certain deductively valid inferences. Certain deductively valid inferences are unable to enhance the reliability of one's belief that the conclusion is true—in a sense that will be fully explained. As I shall show, this feature is demonstrably present in certain philosophically significant inferences—such as G. E. Moore's notorious 'proof' of the existence of the external world. I suggest that this peculiar epistemic feature might be correlated with the much-discussed phenomenon that Crispin Wright and Martin Davies have called 'transmission failure'—the apparent failure, on the part of some deductively valid inferences, to transmit one's justification for believing the premises.
Recent years have seen an explosion of interest in metaphysical explanation, and philosophers have fixed on the notion of ground as the conceptual tool with which such explanation should be investigated. I will argue that this focus on ground is myopic and that some metaphysical explanations that involve the essences of things cannot be understood in terms of ground. Such ‘essentialist’ explanation is of interest, not only for its ubiquity in philosophy, but for its being in a sense an ultimate form of explanation. I give an account of the sense in which such explanation is ultimate and support it by defending what I call the inessentiality of essence. I close by suggesting that this principle is the key to understanding why essentialist explanations can seem so satisfying.
This paper proposes a view of time that takes passage to be the most basic temporal notion, instead of the usual A-theoretic and B-theoretic notions, and explores how we should think of a world that exhibits such a genuine temporal passage. It will be argued that an objective passage of time can only be made sense of from an atemporal point of view and only when it is able to constitute a genuine change of objects across time. This requires that passage can flip one fact into a contrary fact, even though neither side of the temporal passage is privileged over the other. We can make sense of this if the world is inherently perspectival. Such an inherently perspectival world is characterized by fragmentalism, a view that has been introduced by Fine in his ‘Tense and Reality’ (2005). Unlike Fine's tense-theoretic fragmentalism though, the proposed view will be a fragmentalist view based in a primitive notion of passage.
Metaphysical rationalism, the doctrine which affirms the Principle of Sufficient Reason (the PSR), is out of favor today. The best argument against it is that it appears to lead to necessitarianism, the claim that all truths are necessarily true. Whatever the intuitive appeal of the PSR, the intuitive appeal of the claim that things could have been otherwise is greater. This problem did not go unnoticed by the great metaphysical rationalists Spinoza and Leibniz. Spinoza’s response was to embrace necessitarianism. Leibniz’s response was to argue that, despite appearances, rationalism does not lead to necessitarianism. This paper examines the debate between these two rationalists and concludes that Leibniz has persuasive grounds for his opinion. This has significant implications both for the plausibility of the PSR and for our understanding of modality.
One of the most intriguing claims in Sven Rosenkranz’s Justification as Ignorance is that Timothy Williamson’s celebrated anti-luminosity argument can be resisted when it comes to the condition ~K~KP—the condition that one is in no position to know that one is in no position to know P. In this paper, I critically assess this claim.
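Spelled out in standard epistemic-logic notation, with K read as ‘one is in a position to know that’ (following the gloss in the abstract itself):

\[ {\sim}K{\sim}KP: \quad \text{one is in no position to know that one is in no position to know } P. \]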
A ‘lottery belief’ is a belief that a particular ticket has lost a large, fair lottery, based on nothing more than the odds against it winning. The lottery paradox brings out a tension between the idea that lottery beliefs are justified and the idea that one can always justifiably believe the deductive consequences of things that one justifiably believes – what is sometimes called the principle of closure. Many philosophers have treated the lottery paradox as an argument against the second idea – but I make a case here that it is the first idea that should be given up. As I shall show, there are a number of independent arguments for denying that lottery beliefs are justified.
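The tension can be made concrete with a small worked example (illustrative numbers, not drawn from the paper). In a fair lottery with n = 1000 tickets,

\[ P(\text{ticket } i \text{ loses}) = \frac{n-1}{n} = 0.999 \quad \text{for each } i, \]

so each individual lottery belief is overwhelmingly probable; yet closure would then license believing the conjunction

\[ \bigwedge_{i=1}^{n} (\text{ticket } i \text{ loses}), \]

which has probability 0, since some ticket is guaranteed to win.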
Theories of epistemic justification are commonly assessed by exploring their predictions about particular hypothetical cases – predictions as to whether justification is present or absent in this or that case. With a few exceptions, it is much less common for theories of epistemic justification to be assessed by exploring their predictions about logical principles. The exceptions are a handful of ‘closure’ principles, which have received a lot of attention, and which certain theories of justification are well known to invalidate. But these closure principles are only a small sample of the logical principles that we might consider. In this paper, I will outline four further logical principles that plausibly hold for justification and two which plausibly do not. While my primary aim is just to put these principles forward, I will use them to evaluate some different approaches to justification and (tentatively) conclude that a ‘normic’ theory of justification best captures its logic.
Any explanation of one fact in terms of another will appeal to some sort of connection between the two. In a causal explanation, the connection might be a causal mechanism or law. But not all explanations are causal, and neither are all explanatory connections. For example, in explaining the fact that a given barn is red in terms of the fact that it is crimson, we might appeal to a non-causal connection between things’ being crimson and their being red. Many such connections, like this one, are general rather than particular. I call these general non-causal explanatory connections 'laws of metaphysics'. In this paper I argue that some of these laws are to be found in the world at its most fundamental level, forming a bridge between fundamental reality and everything else. It is only by admitting fundamental laws, I suggest, that we can do justice to the explanatory relationship between what is fundamental and what is not. And once these laws are admitted, we are able to provide a nice resolution of the puzzle of why there are any non-fundamental facts in the first place.
There are a number of debates that are relevant to questions concerning objectivity in science. One of the oldest, and still one of the most intensely fought, is the debate over epistemic relativism. All forms of epistemic relativism commit themselves to the view that it is impossible to show in a neutral, non-question-begging way that one “epistemic system”, that is, one interconnected set of epistemic standards, is epistemically superior to others. I shall call this view “No-metajustification”. No-metajustification is commonly taken to deny the objectivity of standards. In this paper I shall discuss two currently popular attempts to attack No-metajustification. The first attempt attacks No-metajustification by challenging a particular strategy of arguing in its defence: this strategy involves the ancient Pyrrhonian “Problem of the Criterion”. The second attempt to refute No-metajustification targets its metaphysical underpinning: to wit, the claim that there are, or could be, several fundamentally different and irreconcilable epistemic systems. I shall call this assumption “Pluralism”. I shall address three questions with respect to these attempts to refute epistemic relativism by attacking No-metajustification: Can the epistemic relativist rely on the Problem of the Criterion in support of No-metajustification? Is a combination of Chisholmian “particularism” and epistemic naturalism an effective weapon against No-metajustification? And is Pluralism a defensible assumption?
In 1990 Edward Craig published a book called Knowledge and the State of Nature in which he introduced and defended a genealogical approach to epistemology. In recent years Craig’s book has attracted a lot of attention, and his distinctive approach has been put to a wide range of uses, including anti-realist metaepistemology, contextualism, relativism, anti-luck virtue epistemology, epistemic injustice, the value of knowledge, pragmatism and virtue epistemology. While objections to Craig’s approach have accumulated, there has been no sustained attempt to develop answers to these objections. In this paper we provide answers to seven important objections in the literature.
According to the principle of Conjunction Closure, if one has justification for believing each of a set of propositions, one has justification for believing their conjunction. The lottery and preface paradoxes can both be seen as posing challenges for Closure, but leave open familiar strategies for preserving the principle. While this is all relatively well-trodden ground, a new Closure-challenging paradox has recently emerged, in two somewhat different forms, due to Marvin Backes (2019a) and Francesco Praolini (2019). This paradox synthesises elements of the lottery and the preface and is designed to close off the familiar Closure-preserving strategies. By appealing to a normic theory of justification, I will defend Closure in the face of this new paradox. Along the way I will draw more general conclusions about justification, normalcy and defeat, which bear upon what Backes (2019b) has dubbed the ‘easy defeat’ problem for the normic theory.
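Schematically, with J as an assumed justification operator (a formalization of the principle as glossed above, not notation taken from the paper):

\[ (J\varphi_1 \wedge \dots \wedge J\varphi_n) \rightarrow J(\varphi_1 \wedge \dots \wedge \varphi_n). \]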
Our understanding of subjunctive conditionals has been greatly enhanced through the use of possible world semantics and, more precisely, by the idea that they involve variably strict quantification over possible worlds. I propose to extend this treatment to ceteris paribus conditionals – that is, conditionals that incorporate a ceteris paribus or ‘other things being equal’ clause. Although such conditionals are commonly invoked in scientific theorising, they traditionally arouse suspicion and apprehensiveness amongst philosophers. By treating ceteris paribus conditionals as a species of variably strict conditional I hope to shed new light upon their content and their logic.
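For orientation, the variably strict treatment alluded to here is standardly given Lewis-style truth conditions (a textbook formulation; the paper’s extension to ceteris paribus conditionals may refine it):

\[ A \mathrel{\Box\!\!\rightarrow} C \text{ is true at } w \iff \text{there are no accessible } A\text{-worlds, or some } (A \wedge C)\text{-world is closer to } w \text{ than any } (A \wedge \lnot C)\text{-world.} \]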
Fragmentalism was first introduced by Kit Fine in his ‘Tense and Reality’. According to fragmentalism, reality is an inherently perspectival place that exhibits a fragmented structure. The current paper defends the fragmentalist interpretation of the special theory of relativity, which Fine briefly considers in his paper. The fragmentalist interpretation makes room for genuine facts regarding absolute simultaneity, duration and length. One might worry that positing such variant properties is a turn for the worse in terms of theoretical virtues because such properties are not involved in physical explanations and hence theoretically redundant. It will be argued that this is not right: if variant properties are indeed instantiated, they will also be involved in straightforward physical explanations and hence not explanatorily redundant. Hofweber and Lange, in their ‘Fine’s Fragmentalist Interpretation of Special Relativity’, object that the fragmentalist interpretation is in tension with the right explanation of the Lorentz transformations. It will be argued that their objection targets an inessential aspect of the fragmentalist framework and fails to raise any serious problem for the fragmentalist interpretation of special relativity.
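For reference, the Lorentz transformations at issue take the familiar textbook form (for frames in standard configuration, with relative velocity v along the x-axis):

\[ x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}. \]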
In this paper I respond to Marcello Di Bello’s criticisms of the ‘normic account’ of the criminal standard of proof. In so doing, I further elaborate on what the normic account predicts about certain significant legal categories of evidence, including DNA and fingerprint evidence and eyewitness identifications.
This paper seeks to widen the dialogue between the “epistemology of peer disagreement” and the epistemology informed by Wittgenstein’s last notebooks, later edited as On Certainty. The paper defends the following theses: not all certainties are groundless; many of them are beliefs; and they do not have a common essence. An epistemic peer need not share all of my certainties. Which response to a disagreement over a certainty is called for depends on the type of certainty in question. Sometimes a form of relativism is the right response. Reasonable, mutually recognized peer disagreement over a certainty is possible. The paper thus addresses both interpretative and systematic issues. It uses Wittgenstein as a resource for thinking about peer disagreement over certainties.
In a number of papers and in his recent book, Is Water H₂O? Evidence, Realism, Pluralism (2012), Hasok Chang has argued that the correct interpretation of the Chemical Revolution provides a strong case for the view that progress in science is served by maintaining several incommensurable “systems of practice” in the same discipline, and concerning the same region of nature. This paper is a critical discussion of Chang's reading of the Chemical Revolution. It seeks to establish, first, that Chang's assessment of Lavoisier's and Priestley's work and character follows the phlogistonists' “actors' sociology”; second, that Chang simplifies late-eighteenth-century chemical debates by reducing them to an alleged conflict between two systems of practice; third, that Chang's evidence for a slow transition from phlogistonist theory to oxygen theory is not strong; and fourth, that he is wrong to assume that chemists at the time did not have overwhelmingly good reasons to favour Lavoisier's over the phlogistonists' views.
In Explaining and Understanding International Relations, philosopher Martin Hollis and international relations scholar Steve Smith join forces to analyse the dominant theories of international relations and to examine the philosophical issues underlying them.
This article aims to bring some work in contemporary analytic metaphysics to discussions of the Real Presence of Christ in the Eucharist. I will show that some unusual claims of the Real Presence doctrine exactly parallel what would be happening in the world if objects were to time-travel in certain ways. Such time-travel would make ordinary objects multiply located, and in the relevantly analogous respects. If it is conceptually coherent that objects behave in this way, we have a model for the behaviour of the Eucharist which shows the doctrine to be coherent, at least with respect to the issues discussed.
Many epistemologists have responded to the lottery paradox by proposing formal rules according to which high probability defeasibly warrants acceptance. Douven and Williamson present an ingenious argument purporting to show that such rules invariably trivialise, in that they reduce to the claim that a probability of 1 warrants acceptance. Douven and Williamson’s argument does, however, rest upon significant assumptions – amongst them a relatively strong structural assumption to the effect that the underlying probability space is both finite and uniform. In this paper, I will show that something very like Douven and Williamson’s argument can in fact survive with much weaker structural assumptions – and, in particular, can apply to infinite probability spaces.
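The kind of rule at issue can be illustrated with a simple threshold schema (an illustrative, Lockean-style rendering with an assumed threshold t, not Douven and Williamson’s own formulation):

\[ \text{accept } \varphi \ \text{ iff } \ \Pr(\varphi) \geq t, \quad \text{for some fixed } t < 1. \]

The trivialisation result described above says, in effect, that defensible rules of this shape collapse into the limiting case t = 1.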
Argument mapping is a way of diagramming the logical structure of an argument to explicitly and concisely represent reasoning. The use of argument mapping in critical thinking instruction has increased dramatically in recent decades. This paper overviews the innovation and provides a procedural approach for new teachers wanting to use argument mapping in the classroom. A brief history of argument mapping is provided at the end of this paper.
In ‘The normative role of knowledge’ (2012), Declan Smithies defends a ‘JK-rule’ for belief: One has justification to believe that P iff one has justification to believe that one is in a position to know that P. Similar claims have been defended by others (Huemer, 2007; Reynolds, forthcoming). In this paper, I shall argue that the JK-rule is false. The standard and familiar way of arguing against putative rules for belief or assertion is, of course, to describe putative counterexamples. My argument, though, won’t be like this – indeed I doubt that there are any intuitively compelling counterexamples to the JK-rule. Nevertheless, the claim that there are counterexamples to the JK-rule can, I think, be given something approaching a formal proof. My primary aim here is to sketch this proof. I will briefly consider some broader implications for how we ought to think about the epistemic standards governing belief and assertion.
In their 2010 book, Biology’s First Law, D. McShea and R. Brandon present a principle that they call “ZFEL,” the zero force evolutionary law. ZFEL says (roughly) that when there are no evolutionary forces acting on a population, the population’s complexity (i.e., how diverse its member organisms are) will increase. Here we develop criticisms of ZFEL and describe a different law of evolution; it says that diversity and complexity do not change when there are no evolutionary causes.
Several theistic arguments are formulated as arguments for the best explanation. This article discusses how one can determine that some phenomenon actually needs an explanation. One way to demonstrate that an explanation is needed is by providing one. The proposed explanation ought either to make the occurrence of the phenomenon in question more probable than its occurring by chance, or to sufficiently increase our understanding of the phenomenon. A second way to demonstrate that an explanation is needed is to show that the phenomenon in question both violates our expectations and is particularly noticeable.
Relativism can be found in all philosophical traditions and subfields of philosophy. It is also a central idea in the social sciences, the humanities, religion and politics. This is the first volume to map relativistic motifs in all areas of philosophy, synchronically and diachronically. It thereby provides essential intellectual tools for thinking about contemporary issues like cultural diversity, the plurality of the sciences, or the scope of moral values. The Routledge Handbook of Philosophy of Relativism is an outstanding major reference source on this fundamental topic. The 57 chapters by a team of international contributors are divided into nine parts: relativism in non-Western philosophical traditions; relativism in Western philosophical traditions; relativism in ethics; relativism in political and legal philosophy; relativism in epistemology; relativism in metaphysics; relativism in philosophy of science; relativism in philosophy of language and mind; and relativism in other areas of philosophy. Essential reading for students and researchers in all branches of philosophy, this handbook will also be of interest to those in related subjects such as politics, religion, sociology, cultural studies and literature.
Say that two goals are normatively coincident just in case one cannot aim for one goal without automatically aiming for the other. While knowledge and justification are distinct epistemic goals, with distinct achievement conditions, this paper begins from the suggestion that they are nevertheless normatively coincident—aiming for knowledge and aiming for justification are one and the same activity. A number of surprising consequences follow from this—both specific consequences about how we can ascribe knowledge and justification in lottery cases and more general consequences about the nature of justification and the relationship between justification and evidential probability. Many of these consequences turn out to be at variance with conventional, prevailing views.
This paper outlines a novel solution to the Ship of Theseus puzzle. The solution relies on situations, a philosophical tool used in natural language semantics among other places. The core idea is that what is true is always relative to the situation under consideration. I begin by outlining the problem before briefly introducing situations. I then present the solution: in smaller situations the candidate is identical to Theseus’s ship. But in larger situations containing both candidates these identities are neither true nor false. Finally, I discuss some worries for the view that arise from the nature of identity, and suggest responses. It is concluded that the solution, and the theory that underpins it, are worth further investigation.
The standard of proof applied in civil trials is the preponderance of evidence, often said to be met when a proposition is shown to be more than 50% likely to be true. A number of theorists have argued that this 50%+ standard is too weak – there are circumstances in which a court should find that the defendant is not liable, even though the evidence presented makes it more than 50% likely that the plaintiff’s claim is true. In this paper, I will recapitulate the familiar arguments for this thesis, before defending a more radical one: The 50%+ standard is also too strong – there are circumstances in which a court should find that a defendant is liable, even though the evidence presented makes it less than 50% likely that the plaintiff’s claim is true. I will argue that the latter thesis follows naturally from the former once we accept that the parties in a civil trial are to be treated equally. I will conclude by sketching an alternative interpretation of the civil standard of proof.
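Schematically, the orthodox reading of the standard treats the 50% mark as both necessary and sufficient (a formalization of the abstract’s gloss, with E for the evidence presented):

\[ \text{find for the plaintiff} \iff P(\text{plaintiff's claim} \mid E) > 0.5. \]

The familiar arguments deny that exceeding the threshold is sufficient for a liability finding; the more radical thesis defended here denies that it is necessary.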
My concern in this paper is with the claim that knowledge is a mental state – a claim that Williamson places front and centre in Knowledge and Its Limits. While I am not by any means convinced that the claim is false, I do think it carries certain costs that have not been widely appreciated. One source of resistance to this claim derives from internalism about the mental – the view, roughly speaking, that one’s mental states are determined by one’s internal physical state. In order to know that something is the case it is not, in general, enough for one’s internal physical state to be a certain way – the wider world must also be a certain way. If we accept that knowledge is a mental state, we must give up internalism. One might think that this is no cost, since much recent work in the philosophy of mind has, in any case, converged on the view that internalism is false. This thought, though, is too quick. As I will argue here, the claim that knowledge is a mental state would take us to a view much further from internalism than anything philosophers of mind have converged upon.
Experiments in particle physics have hitherto failed to produce any significant evidence for the many explicit models of physics beyond the Standard Model (BSM) that have been proposed over the past decades. As a result, physicists have increasingly turned to model-independent strategies as tools in searching for a wide range of possible BSM effects. In this paper, we describe the Standard Model Effective Field Theory (SM-EFT) and analyse it in the context of the philosophical discussions about models, theories, and (bottom-up) effective field theories. We find that while the SM-EFT is a quantum field theory, assisting experimentalists in searching for deviations from the SM, in its general form it lacks some of the characteristic features of models. Those features only come into play if put in by hand or prompted by empirical evidence for deviations. Employing different philosophical approaches to models, we argue that the case study suggests not to take a view on models that is overly permissive because it blurs the lines between the different stages of the SM-EFT research strategies and glosses over particle physicists' motivations for undertaking this bottom-up approach in the first place. Looking at EFTs from the perspective of modelling does not require taking a stance on some specific brand of realism or taking sides in the debate between reduction and emergence into which EFTs have recently been embedded.
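As standardly written (the textbook form, not notation specific to this paper), the SM-EFT supplements the Standard Model Lagrangian with a tower of higher-dimensional operators suppressed by powers of a new-physics scale \Lambda:

\[ \mathcal{L}_{\text{SM-EFT}} = \mathcal{L}_{\text{SM}} + \sum_{d>4} \sum_{i} \frac{c_i^{(d)}}{\Lambda^{d-4}}\, \mathcal{O}_i^{(d)}, \]

where the \mathcal{O}_i^{(d)} are operators of mass dimension d built from Standard Model fields and the Wilson coefficients c_i^{(d)} parametrize possible BSM effects without commitment to any particular model.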
Material from this paper appears in Chap. 7 of my book Reason and Being, but there is also stuff here that isn't in the book. In particular, it discusses the claims that, for Spinoza, conceiving implies explaining and that existence is identical to or reducible to conceivability. So, if you're interested in those issues, this paper might be worth a read.
This article focuses on two developments in nineteenth-century (philosophy of) social science: Moritz Lazarus’s and Heymann Steinthal’s Völkerpsychologie and Georg Simmel’s early sociology of knowledge. The article defends the following theses. First, Lazarus and Steinthal wavered between a “strong” and a “weak” program for Völkerpsychologie. Ingredients for the strong program included methodological neutrality and symmetry; causal explanation of beliefs based on causal laws; a focus on groups, interests, tradition, culture, or materiality; determinism; and a self-referential model of social institutions. Second, elements of the weak program were the blurring of explanatory and normative interests, an emphasis on freedom of the will, and antirelativism and antimaterialism. Third, later research projects keeping the label “Völkerpsychologie” followed the weak program. Fourth, in the 1880s and 1890s, Simmel tried to build on some of the elements of the strong program. Finally, and fifth, part of the explanation for why Simmel did not succeed in his attempt had to do with the social-political situation of German academia around 1900.
The Principle of Indifference (POI) was once regarded as a linchpin of probabilistic reasoning, but has now fallen into disrepute as a result of the so-called problem of multiple partitions. In ‘Evidential symmetry and mushy credence’ Roger White suggests that we have been too quick to jettison this principle and argues that the problem of multiple partitions rests on a mistake. In this paper I will criticise White’s attempt to revive POI. In so doing, I will argue that what underlies the problem of multiple partitions is a fundamental tension between POI and the very idea of evidential incomparability.
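The problem of multiple partitions can be illustrated with a standard example (van Fraassen’s cube factory, offered for orientation; it is not necessarily the case White himself discusses). A factory produces cubes with side length somewhere in (0, 1]. Applying indifference to side length gives

\[ P\left(\text{side} \leq \tfrac{1}{2}\right) = \tfrac{1}{2}, \]

while applying it to volume gives

\[ P\left(\text{volume} \leq \tfrac{1}{8}\right) = \tfrac{1}{8}, \]

even though ‘side ≤ 1/2’ and ‘volume ≤ 1/8’ describe the very same event.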
In this paper I defend the claim that justification is closed under conjunction, and confront its most alarming consequence – that one can have justification for believing propositions that are unlikely to be true, given one’s evidence.
Perhaps the central problem which preoccupies Spinoza as a moral philosopher is the conflict between reason and passion. He belongs to a long tradition that sees the key to happiness and virtue as mastery and control by reason over the passions. This mastery, however, is hard won, as the passions often overwhelm its power and subvert its rule. When reason succumbs to passion, we act against our better judgment. Such action is often termed 'akratic'. Many commentators have complained that the psychological principles that Spinoza appeals to in his account of akrasia are mere ad hoc modifications to his philosophical psychology. I show, on the contrary, that these principles follow from some of the most important and interesting aspects of Spinoza's philosophy of mind.
The epistemology of religion is the branch of epistemology concerned with the rationality, the justificatory status and the knowledge status of religious beliefs – most often the belief in the existence of an omnipotent, omniscient and loving God as conceived by the major monotheistic religions. While other sorts of religious beliefs – such as belief in an afterlife or in disembodied spirits or in the occurrence of miracles – have also been the focus of considerable attention from epistemologists, I shall concentrate here on belief in God. There were a number of significant works in the epistemology of religion written during the early and mid-Twentieth Century. The late Twentieth Century, however, saw a surge of interest in this area, fuelled by the work of philosophers such as William Alston, Alvin Plantinga and Linda Zagzebski amongst others. Alston, Plantinga and Zagzebski succeeded in importing, into the epistemology of religion, various new ideas from mainstream epistemology – in particular, externalist approaches to justification, such as reliabilism, and virtue theoretic approaches to knowledge (see, for instance, Alston 1986, 1991; Plantinga 1988, 2000; Zagzebski 1993a, 1993b). This laid fertile ground for new research – questions about the justificatory and knowledge status of belief in God begin to look very different when viewed through the lens of theories such as these. I will begin by surveying some of this groundbreaking work in the present article, before moving on to work from the last five years – a period in which the epistemology of religion has again received impetus from a number of ideas from mainstream epistemology; ideas such as pragmatic encroachment, phenomenal conservatism and externalist theories of evidence.
Extended simples are fruitfully discussed in metaphysics. They are entities which are located in a complex region of space but do not themselves have parts. In this paper, I will discuss unextended complexes: entities which are not located at a complex region of space but do themselves have parts. In particular, I focus on one type of unextended complex: pointy complexes. Four areas are indicated where pointy complexes might prove philosophically useful. Unextended complexes are therefore philosophically fruitful, in much the same way as extended simples.