Drawing inspiration from Fred Dretske, L. S. Carrier, John A. Barker, and Robert Nozick, we develop a tracking analysis of knowing according to which a true belief constitutes knowledge if and only if it is based on reasons that are sensitive to the fact that makes it true, that is, reasons that wouldn’t obtain if the belief weren’t true. We show that our sensitivity analysis handles numerous Gettier-type cases and lottery problems, blocks pathways leading to skepticism, and validates the epistemic closure thesis that correct inferences from known premises yield knowledge of the conclusions. We discuss the plausible views of Ted Warfield and Branden Fitelson regarding cases of knowledge acquired via inference from false premises, and we show how our sensitivity analysis can account for such cases. We present arguments designed to discredit putative counterexamples to sensitivity analyses recently proffered by Tristan Haze, John Williams and Neil Sinhababu, which involve true statements made by untrustworthy informants and strange clocks that sometimes display the correct time while running backwards. Finally, we show that these results are secured in virtue of employing the paradox-free subjunctive conditionals codified by Relevance Logic theorists instead of the paradox-laden subjunctive conditionals codified by Robert Stalnaker and David Lewis.
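A schematic rendering of the sensitivity condition may help; the symbolization below is ours, not the authors' (K_S for knowledge, B for belief, R for the basing reasons, and the boxed arrow for the subjunctive conditional):

```latex
% Sensitivity analysis of knowing (our schematic rendering, not the authors'):
% S knows that p iff (i) p is true, (ii) S believes p on the basis of
% reasons R, and (iii) R is sensitive: R would not obtain if p were false.
\[
  K_S\,p \;\iff\; p \;\wedge\; B^{R}_{S}\,p \;\wedge\; \bigl(\neg p \mathrel{\Box\!\!\rightarrow} \neg R\bigr)
\]
```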
When consumers choose to abstain from purchasing meat, they face some uncertainty about whether their decisions will have an impact on the number of animals raised and killed. Consequentialists have argued that this uncertainty should not dissuade consumers from a vegetarian diet because the “expected” impact, or average impact, will be predictable. Recently, however, critics have argued that the expected marginal impact of a consumer change is likely to be much smaller or more radically unpredictable than previously thought. This objection to the consequentialist case for vegetarianism is known as the “causal inefficacy” (or “causal impotence”) objection. In this paper, we argue that the inefficacy objection fails. First, we summarize the contours of the objection and the standard “expected impact” response to it. Second, we examine and rebut two contemporary attempts (by Mark Budolfson and Ted Warfield) to defeat the expected impact reply through alleged demonstrations of the inefficacy of abstaining from meat consumption. Third, we argue that there are good reasons to believe that single individual consumers—not just individual consumers taken as an aggregate—really do make a positive difference when they choose to abstain from meat consumption. Our case rests on three economic observations: (i) animal producers operate in a highly competitive environment, (ii) complex supply chains efficiently communicate some information about product demand, and (iii) consumers of plant-based meat alternatives have positive consumption spillover effects on other consumers.
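The "expected impact" reply summarized above is standardly illustrated with a threshold model; the batch size N and the arithmetic below are our own illustrative assumptions, not figures from the paper. If producers adjust supply in batches of N units, a single purchase decision has roughly a 1/N chance of crossing a threshold that changes production by N units, so:

```latex
\[
  \mathbb{E}[\text{impact of one purchase}]
  \;\approx\; \frac{1}{N}\cdot N \;=\; 1 \text{ unit.}
\]
```

On the standard reply, uncertainty about where the thresholds lie therefore leaves the average impact intact; the critics discussed in the paper deny precisely that the relevant probability is on the order of 1/N.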
A novel argument is offered against the following popular condition on inferential knowledge: a person inferentially knows a conclusion only if they know each of the claims from which they essentially inferred that conclusion. The epistemology of conditional proof reveals that we sometimes come to know conditionals by inferring them from assumptions rather than beliefs. Since knowledge requires belief, cases of knowing via conditional proof refute the popular knowledge-from-knowledge condition. The argument also suggests more radical cases against the condition, and it brings to light the under-recognized category of inferential basic knowledge.
It is a widespread intuition that the coherence of independent reports provides a powerful reason to believe that the reports are true. Formal results by Huemer (1997, “Probability and Coherence Justification,” Southern Journal of Philosophy 35: 463–72), Olsson (2002, “What Is the Problem of Coherence and Truth?” Journal of Philosophy 99: 246–72), Olsson (2005, Against Coherence: Truth, Probability, and Justification, Oxford University Press), and Bovens and Hartmann (2003, Bayesian Epistemology, Oxford University Press) prove that, under certain conditions, coherence cannot increase the probability of the target claim. These formal results, known as ‘the impossibility theorems’, have been widely discussed in the literature. They are taken to have significant epistemic upshot. In particular, they are taken to show that reports must first individually confirm the target claim before the coherence of multiple reports offers any positive confirmation. In this paper, I dispute this epistemic interpretation. The impossibility theorems are consistent with the idea that the coherence of independent reports provides a powerful reason to believe that the reports are true even if the reports do not individually confirm prior to coherence. Once we see that the formal discoveries do not have this implication, we can recover a model of coherence justification consistent with Bayesianism and these results. This paper, thus, seeks to turn the tide of the negative findings for coherence reasoning by defending coherence as a unique source of confirmation.
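To make the probabilistic setup concrete, here is a minimal sketch of a two-witness model of the kind this literature studies; the numbers and the conditional-independence assumption are our own illustrative choices, not a model from the paper.

```python
# Toy two-witness model (illustrative; not a model from the paper).
# H is the target claim; each witness independently reports that H.
prior_H = 0.1
p_report_given_H = 0.7      # chance a witness reports H if H is true
p_report_given_not_H = 0.3  # chance a witness reports H if H is false

def posterior(n_agreeing_reports):
    """P(H | n agreeing reports), with reports conditionally
    independent given H and given not-H."""
    like_H = p_report_given_H ** n_agreeing_reports
    like_not_H = p_report_given_not_H ** n_agreeing_reports
    joint_H = like_H * prior_H
    return joint_H / (joint_H + like_not_H * (1 - prior_H))

print(posterior(1))  # ~0.21: one report individually confirms H
print(posterior(2))  # ~0.38: two agreeing reports confirm H further
```

If p_report_given_H is set equal to p_report_given_not_H, each report is individually neutral and, in this independence setup, agreement adds nothing; that is the kind of situation the impossibility theorems exploit, and the epistemic interpretation of that fact is what the paper disputes.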
Why are we conscious? What does consciousness enable us to do that cannot be done by zombies in the dark? This paper argues that introspective consciousness probably co-evolved as a "spandrel" along with our more useful ability to represent the mental states of other people. The first part of the paper defines and motivates a conception of consciousness as a kind of "double vision" – the perception of how things seem to us as well as what they are – along lines recently explored by Peter Carruthers. The second part explains the basic socioepistemic function of consciousness and suggests an evolutionary pathway to this cognitive capacity that begins with predators and prey. The third part discusses the relevance of these considerations to the traditional problem of other minds.
The coherence of independent reports provides a strong reason to believe that the reports are true. This plausible claim has come under attack from recent work in Bayesian epistemology. This work shows that, under certain probabilistic conditions, coherence cannot increase the probability of the target claim. These theorems are taken to demonstrate that epistemic coherentism is untenable. To date no one has investigated how these results bear on different conceptions of coherence. I investigate this situation using Thagard’s ECHO model of explanatory coherence. Thagard’s ECHO model provides a natural representation of the evidential significance of multiple independent reports.
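For readers unfamiliar with ECHO, the following is a minimal sketch of the connectionist machinery behind Thagard-style explanatory coherence; the tiny network and all parameter values are our illustrative assumptions, not the network used in the paper. Propositions are units, explanatory links are excitatory, contradictions are inhibitory, and activations are updated until the network settles.

```python
# Minimal ECHO-style coherence network (illustrative sketch, not full ECHO).
# E is an evidence unit (clamped on); H1 and H2 are competing hypotheses.
units = ["E", "H1", "H2"]
act = {"E": 1.0, "H1": 0.01, "H2": 0.01}  # initial activations
weights = {                               # symmetric link weights
    ("E", "H1"): 0.4,    # H1 explains E: excitatory link
    ("E", "H2"): 0.3,    # H2 explains E, but more weakly
    ("H1", "H2"): -0.2,  # H1 and H2 contradict: inhibitory link
}

def w(a, b):
    return weights.get((a, b), weights.get((b, a), 0.0))

decay, max_a, min_a = 0.05, 1.0, -1.0
for _ in range(200):  # iterate until the activations settle
    new_act = {}
    for u in units:
        if u == "E":
            new_act[u] = 1.0  # evidence stays clamped
            continue
        net = sum(w(u, v) * act[v] for v in units if v != u)
        growth = net * (max_a - act[u]) if net > 0 else net * (act[u] - min_a)
        new_act[u] = max(min_a, min(max_a, act[u] * (1 - decay) + growth))
    act = new_act

print(act)  # H1 settles higher than H2: it coheres better with the evidence
```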
In this paper I shall show how Heidegger’s notions of Dasein’s “Being-with” (Mitsein), “discourse” (Rede), and “solicitude” (Fürsorge) illustrate his conception of the dialogical in Being and Time. There are at least three advantages to proposing that Heidegger is a dialogist in Being and Time. First, this paradigm offers an alternative, and more perspicuous, vocabulary for describing the discursive nature of Dasein’s Being-in-the-world as a Being-with others. Second, it provides a better way of recognizing and understanding the normative dimensions of “solicitude.” And third, it helps to underscore the ineliminable sociality of Dasein’s understanding of itself and of others, such that its identity remains social even in the seemingly individualizing initial moment of becoming authentic.
Prerequisite to memory is a past distinct from present. Because wave evolution is both continuous and time-reversible, the undisturbed quantum system lacks a distinct past and therefore the possibility of memory. With the quantum transition, a reversibly evolving superposition of values yields to an irreversible emergence of definite values in a distinct and transient moment of time. The succession of such moments generates an irretrievable past and thus the possibility of memory. Bohm’s notion of implicate and explicate order provides a conceptual basis for memory as a general feature of nature akin to gravity and electromagnetism. I propose that natural memory is an outcome of the continuity of implicate time in the context of discontinuous explicate time. Among the ramifications of natural memory are that laws of nature can propagate through time much like habits and that personal memory does not require neural information storage.
The problem of biological memory led Russell to propose the existence of mnemic causation, a mechanism by which past experience influences current thought directly, that is, without the need for a material intermediary in the form of a neural "memory trace." Russell appears to have been inspired by German biologist Richard Semon's concept of mnemic homophony, which conveys memory to consciousness on the basis of similarity between current and past circumstances. Semon, however, in no way denied a role for stable neural structures or "engrams" in the memory process. In contrast to Russell, contemporary biologist Rupert Sheldrake provides a viable theory of memory by remaining true to Semon's original idea and expanding it beyond personal memory to the collective memory of species, encompassing not only mental but developmental memory.
This paper articulates a way to ground a relatively high prior probability for grand explanatory theories apart from an appeal to simplicity. I explore the possibility of enumerating the space of plausible grand theories of the universe by using the explanatory properties of possible views to limit the number of plausible theories. I motivate this alternative grounding by showing that Swinburne’s appeal to simplicity is problematic along several dimensions. I then argue that there are three plausible grand views—theism, atheism, and axiarchism—which satisfy explanatory requirements for plausibility. Other possible views lack the explanatory virtue of these three theories. Consequently, this explanatory grounding provides a way of securing a nontrivial prior probability for theism, atheism, and axiarchism. An important upshot of my approach is that a modest amount of empirical evidence can bear significantly on the posterior probability of grand theories of the universe.
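The final claim is driven by Bayes' theorem; the rendering and notation below are ours, with T a grand theory and E a body of empirical evidence:

```latex
\[
  P(T \mid E) \;=\; \frac{P(E \mid T)\,P(T)}{\sum_i P(E \mid T_i)\,P(T_i)}
\]
```

If only a few theories (here theism, atheism, and axiarchism) receive nontrivial priors, the denominator is dominated by those few terms, so even modest likelihood differences among them can shift the posterior significantly.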
Dmitri Nikulin is one of the few contemporary philosophers to have devoted books to the topic of dialogue and the dialogical self, especially in the last fifteen years. Yet his work on dialogue and the dialogical has received scant attention from philosophers, and this neglect has hurt the ongoing development of contemporary philosophical work on dialogicality. I want to address this lacuna in contemporary philosophical scholarship on dialogicality and suggest that, although Nikulin’s account is no doubt insightful and thought-provoking, it is problematic for two main reasons: first, his account fails to recognize the proper relationship between dialogue and agency; and second, his enumeration of the necessary and sufficient conditions for dialogue contains conceptual inconsistencies.
Ted Poston's Reason and Explanation: A Defense of Explanatory Coherentism is a book worthy of careful study. Poston develops and defends an explanationist theory of (epistemic) justification on which justification is a matter of explanatory coherence, which in turn is a matter of conservativeness, explanatory power, and simplicity. He argues that his theory is consistent with Bayesianism. He argues, moreover, that his theory is needed as a supplement to Bayesianism. There are seven chapters. I provide a chapter-by-chapter summary along with some substantive concerns.
Suppositions can be introduced in either the indicative or subjunctive mood. The introduction of either type of supposition initiates judgments that may be either qualitative (binary judgments about whether a given proposition is acceptable) or quantitative (numerical judgments about how acceptable it is). As such, accounts of qualitative/quantitative judgment under indicative/subjunctive supposition have been developed in the literature. We explore these four different types of theories by systematically explicating the relationships between canonical representatives of each. Our representative qualitative accounts of indicative and subjunctive supposition are based on the belief change operations provided by AGM revision and KM update, respectively; our representative quantitative ones are offered by conditionalization and imaging. This choice is motivated by the familiar approach of understanding supposition as 'provisional belief revision', wherein one temporarily treats the supposition as true and forms judgments by making appropriate changes to one's other opinions. To compare the numerical judgments recommended by the quantitative theories with the binary ones recommended by the qualitative accounts, we rely on a suitably adapted version of the Lockean thesis. Ultimately, we establish a number of new results that we interpret as vindicating the often-repeated claim that conditionalization is a probabilistic version of revision, while imaging is a probabilistic version of update.
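As a concrete illustration of the two quantitative operations, here is a minimal sketch on a three-world space; the worlds, priors, and similarity ordering are our own assumptions, chosen only to display the contrast. Conditionalization rescales the prior mass inside the supposed proposition E, while imaging shifts each world's mass to its most similar E-world.

```python
# Conditionalization vs. imaging on a toy 3-world space (illustrative).
# E = {w1, w2} is the supposed proposition.
prior = {"w1": 0.2, "w2": 0.3, "w3": 0.5}
E = {"w1", "w2"}

# Most similar E-world for each world, under an assumed similarity ordering:
nearest_E_world = {"w1": "w1", "w2": "w2", "w3": "w2"}

def conditionalize(p, E):
    """P(w | E) = P(w) / P(E) for w in E, and 0 elsewhere."""
    p_E = sum(p[w] for w in E)
    return {w: (p[w] / p_E if w in E else 0.0) for w in p}

def image(p, E, nearest):
    """Lewis-style imaging: move each world's mass to its nearest E-world."""
    out = {w: 0.0 for w in p}
    for w, mass in p.items():
        out[nearest[w]] += mass
    return out

print(conditionalize(prior, E))          # {'w1': 0.4, 'w2': 0.6, 'w3': 0.0}
print(image(prior, E, nearest_E_world))  # {'w1': 0.2, 'w2': 0.8, 'w3': 0.0}
```

Note that conditionalization preserves the odds between the E-worlds (0.2 : 0.3 becomes 0.4 : 0.6) while imaging need not; a Lockean threshold (believe p when its probability is at least t) can then translate either numerical profile into binary verdicts comparable with those of the qualitative AGM and KM accounts.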
By eliminating the need for an absolute frame of reference or ether, Einstein resolved the problem of the constancy of light-speed in all inertial frames but created a new problem in our understanding of time. The resolution of this problem requires no experimentation but only a careful analysis of special relativity, in particular the relativity of simultaneity. This concept is insufficiently relativistic insofar as Einstein failed to recognize that any given set of events privileges the frame in which the events occur; relative to those events, only the privileged frame yields the correct measurement. Instead of equally valid frames occupying different times, one frame is correct and all others incorrect within a shared present moment. I conclude that (1) time is a succession of universal moments and (2) in the context of flowing time, time dilation requires absolute simultaneity, whereas relative simultaneity predicts a nonexistent phenomenon here dubbed time regression.
Ted Honderich's theory of consciousness as existence, which he here calls Radical Externalism, starts with a good phenomenological observation: that perceptual experience appears to involve external things being immediately present to us. As P.F. Strawson once observed, when asked to describe my current perceptual state, it is normally enough simply to describe the things around me (Strawson, 1979, p. 97). But in my view that does not make the whole theory plausible.
Because an unmeasured quantum system consists of information — neither tangible existence nor its complete absence — no property can be assigned a definite value, only a range of likely values should it be measured. The instantaneous transition from information to matter upon measurement establishes a gradient between being and not-being. A quantum system enters a determinate state in a particular moment until this moment is past, at which point the system resumes its default state as an evolving superposition of potential values of properties, neither strictly being nor not-being. Like a “self-organized” chemical system that derives energy from breaking down environmental gradients, a quantum system derives information from breaking down the ontological gradient. An organism is a body in the context of energy and a mind in the context of information.
The quantum measurement problem resolves according to the twofold nature of time. Whereas the continuous evolution of the wave function reflects the fundamental nature of time as continuous presence, the collapse of the wave function indicates the subsidiary aspect of time as the projection of instantaneity from the ongoing present. Each instant irreversibly emerges from the reversible temporal continuum implicit in the smoothly propagating wave function. The basis of this emergence is periodic conflict between quantum systems, the definitive resolution of which requires the momentary reduction of each system from the potentially infinite dimensions of configuration space to the three dimensions of classical space at an instant.
An atom is characterized mathematically as an evolving superposition of possible values of properties and experimentally as an instantaneous phenomenon with a precise value of a measured property. Likewise, an organism is to itself a flux of experience and to an observer a tangible body in a distinct moment. Whereas the implicit atom is the stream of computation represented by the smoothly propagating wave function, the implicit organism is both the species from which the body individuates and the personal mind its behavior explicates. As with the wave computation that underlies the atom, the substance of the implicit organism is not matter but information. And like projection from a superposition of potential values to a single outcome in a precise and fleeting moment, the organism actualizes only one of the many possible behaviors calculated in the ongoing presence we know as consciousness.
The foundation of irreversible, probabilistic time -- the classical time of conscious observation -- is the reversible and deterministic time of the quantum wave function. The tendency in physics is to regard time in the abstract, a mere parameter devoid of inherent direction, implying that a concept of real time begins with irreversibility. In reality time has no need for irreversibility, and every invocation of time implies becoming or flow. Neither symmetry under time reversal, of which Newton was well aware, nor the absence of an absolute parameter, as in relativity, negates temporal passage. Far from encapsulating time, irreversibility is a secondary property dependent on the emergence of distinct moments from the ceaseless presence charted by the wave function.
Though Einstein explained time dilation without recourse to a universal frame of reference, he erred by abolishing universal present moments. Relative simultaneity is insufficiently relativistic insofar as it depends on the absolute equality of reference frames in the measurement of the timing of events. Yet any given set of events privileges the frame in which the events take place. Relative to those events, the privileged frame yields the correct measurement of their timing while all other frames yield an incorrect measurement. Instead of multiple frames occupying multiple times, one frame is correct and all others incorrect within a shared present moment. With the collapse of relative simultaneity, we may regard time as a succession of universal moments. Absolute simultaneity, in turn, explains why an accelerated inertial frame dilates in time rather than regressing to a prior moment relative to non-accelerated frames. In the context of flowing time, absolute simultaneity predicts time dilation while relative simultaneity predicts time regression. Einstein's explanation of time dilation is therefore incomplete.
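For reference, the standard special-relativistic formulas at issue (the rendering is ours): time dilation of a moving clock, and the Lorentz transformation of time intervals from which relative simultaneity, the claim the paper rejects, follows.

```latex
% Time dilation: a clock moving at speed v accrues proper time
% \Delta\tau while the observer's frame assigns \Delta t.
\[
  \Delta t = \gamma\,\Delta\tau, \qquad
  \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
\]
% Lorentz transformation of a time interval: events simultaneous in one
% frame (\Delta t = 0, \Delta x \neq 0) are not simultaneous in another.
\[
  \Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right)
\]
```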
When you and I seriously argue over whether a man of seventy is old enough to count as an "old man", it seems that we are appealing neither to our own separate standards of oldness nor to a common standard that is already fixed in the language. Instead, it seems that both of us implicitly invoke an ideal, shared standard that has yet to be agreed upon: the place where we ought to draw the line. As with other normative standards, it is hard to know whether such borderlines exist prior to our coming to agree on where they are. But epistemicists plausibly argue that they must exist whether we ever agree on them or not, as this provides the only logically acceptable response to the sorites paradox. This paper argues that such boundaries do typically exist as hypothetical ideals, but not as determinate features of the present actual world. There is in fact no general solution to the paradox, but attention to practice in resolving vague disagreements shows that its instances can be dealt with separately, as they arise, in many reasonable ways.
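The sorites paradox at issue, in its standard schematic form (the schematization is ours; F abbreviates "is an old man" and the ages are illustrative): a tolerance premise plus an uncontroversial case yields an absurd conclusion.

```latex
\[
  F(70), \qquad \forall n\,\bigl(F(n) \rightarrow F(n-1)\bigr)
  \;\;\vdash\;\; F(30)
\]
% Epistemicists block the argument by denying the tolerance premise:
% there is a sharp cutoff k with F(k) and not-F(k-1), though its
% location cannot be known.
```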
Bradford Hill (1965) highlighted nine aspects of the complex evidential situation a medical researcher faces when determining whether a causal relation exists between a disease and various conditions associated with it. These aspects are widely cited in the literature on epidemiological inference as justifying an inference to a causal claim, but the epistemological basis of the Hill aspects is not understood. We offer an explanatory coherentist interpretation, explicated by Thagard's ECHO model of explanatory coherence. The ECHO model captures the complexity of epidemiological inference and provides a tractable model for inferring disease causation. We apply this model to three cases: the inference of a causal connection between the Zika virus and birth defects, the classic inference that smoking causes cancer, and John Snow’s inference about the cause of cholera.
AI-supported methods for identifying and combating disinformation are progressing in their development and application. However, these methods face a litany of epistemic and ethical challenges. These include (1) robustly defining disinformation, (2) reliably classifying data according to this definition, and (3) navigating ethical risks in the deployment of countermeasures, which involve a mixture of harms and benefits. This paper seeks to expose and offer preliminary analysis of these challenges.
Book review of Earl Conee and Ted Sider's "Riddles of Existence: A Guided Tour of Metaphysics", written in Spanish.
Perhaps no one has written more extensively, more deeply, and more insightfully about determinism and freedom than Ted Honderich. His influence and legacy with regard to the problem of free will—or the determinism problem, as he prefers to frame it—loom large. In these comments I would like to focus on three main aspects of Honderich’s work: his defense of determinism and its consequences for origination and moral responsibility; his concern that the truth of determinism threatens and restricts, but does not eliminate, our life-hopes; and his attack on the traditional justifications for punishment. In many ways, I see my own defense of free will skepticism as the natural successor to Honderich’s work. There are, however, some small differences between us. My goal in this paper is to clarify our areas of agreement and disagreement and to acknowledge my enormous debt to Ted. If I can also move him toward my own more optimistic brand of free will skepticism, that would be great too.
Fears of black-box algorithms are multiplying. Black-box algorithms are said to prevent accountability, make it harder to detect bias and so on. Some fears concern the epistemology of black-box algorithms in medicine and the ethical implications of that epistemology. In ‘Who is afraid of black box algorithms? On the epistemological and ethical basis of trust in medical AI,’ Juan Durán and Karin Jongsma seek to allay such fears. While we find some of their arguments compelling, we still see reasons for fear.
This paper raises questions concerning Ted Morris’ interpretation of Hume’s notion of meaning and investigates the private and public aspects of Hume’s notion of meaning.
The INBIOSA project brings together a group of experts across many disciplines who believe that science requires a revolutionary transformative step in order to address many of the vexing challenges presented by the world. It is INBIOSA’s purpose to enable the focused collaboration of an interdisciplinary community of original thinkers. This paper sets out the case for support for this effort. The focus of the transformative research program proposal is biology-centric. We admit that biology to date has been more fact-oriented and less theoretical than physics. However, the key leverageable idea is that careful extension of the science of living systems can be more effectively applied to some of our most vexing modern problems than the prevailing scheme, derived from abstractions in physics. While these have some universal application and demonstrate computational advantages, they are not theoretically mandated for the living. A new set of mathematical abstractions derived from biology can now be similarly extended. This is made possible by leveraging new formal tools to understand abstraction and enable computability. [The latter has a much expanded meaning in our context from the one known and used in computer science and biology today, that is "by rote algorithmic means", since it is not known if a living system is computable in this sense (Mossio et al., 2009).] Two major challenges constitute the effort. The first challenge is to design an original general system of abstractions within the biological domain. The initial issue is descriptive, leading to the explanatory. There has not yet been a serious formal examination of the abstractions of the biological domain. What is used today is an amalgam; much is inherited from physics (via the bridging abstractions of chemistry) and there are many new abstractions from advances in mathematics (incentivized by the need for more capable computational analyses). Interspersed are abstractions, concepts and underlying assumptions “native” to biology and distinct from the mechanical language of physics and computation as we know them. A pressing agenda should be to single out the most concrete and at the same time the most fundamental process-units in biology and to recruit them into the descriptive domain. Therefore, the first challenge is to build a coherent formal system of abstractions and operations that is truly native to living systems. Nothing will be thrown away, but many common methods will be philosophically recast, just as in physics relativity subsumed and reinterpreted Newtonian mechanics.

This step is required because we need a comprehensible, formal system to apply in many domains. Emphasis should be placed on the distinction between multi-perspective analysis and synthesis and on what could be the basic terms or tools needed. The second challenge is relatively simple: the actual application of this set of biology-centric ways and means to cross-disciplinary problems. In its early stages, this will seem to be a “new science”. This White Paper sets out the case for continuing support of Information and Communication Technology (ICT) for transformative research in biology and information processing centered on paradigm changes in the epistemological, ontological, mathematical and computational bases of the science of living systems. Today, curiously, living systems cannot be said to be anything more than dissipative structures organized internally by genetic information; there is not anything substantially different from abiotic systems other than the empirical nature of their robustness. We believe that there are other new and unique properties and patterns comprehensible at this bio-logical level. The report lays out a fundamental set of approaches to articulate these properties and patterns, and is composed as follows.

Sections 1 through 4 (preamble, introduction, motivation and major biomathematical problems) are incipient. Section 5 describes the issues affecting Integral Biomathics, and Section 6 describes the aspects of the Grand Challenge we face with this project. Section 7 contemplates the effort to formalize a General Theory of Living Systems (GTLS) from what we have today. The goal is to have a formal system, equivalent to that which exists in the physics community. Here we define how to perceive the role of time in biology. Section 8 describes the initial efforts to apply this general theory of living systems in many domains, with special emphasis on cross-disciplinary problems and multiple domains spanning both “hard” and “soft” sciences. The expected result is a coherent collection of integrated mathematical techniques. Section 9 discusses the first two test cases, project proposals, of our approach. They are designed to demonstrate the ability of our approach to address “wicked problems” which span across physics, chemistry, biology, societies and societal dynamics. The solutions require integrated measurable results at multiple levels known as “grand challenges” to existing methods. Finally, Section 10 issues an appeal for action, advocating the necessity for further long-term support of the INBIOSA program.

The report is concluded with a preliminary, non-exclusive list of challenging research themes to address, as well as required administrative actions. The efforts described in the ten sections of this White Paper will proceed concurrently. Collectively, they describe a program that can be managed and measured as it progresses.
BACKGROUND: The governments and citizens of the developed nations are increasingly called upon to contribute financially to health initiatives outside their borders. Although international development assistance for health has grown rapidly over the last two decades, austerity measures related to the 2008 and 2011 global financial crises may impact negatively on aid expenditures. The competition between national priorities and foreign aid commitments raises important ethical questions for donor nations. This paper aims to foster individual reflection and public debate on donor responsibilities for global health. METHODS: We undertook a critical review of contemporary accounts of justice. We selected theories that: (i) articulate important and widely held moral intuitions; (ii) have had extensive impact on debates about global justice; (iii) represent diverse approaches to moral reasoning; and (iv) present distinct stances on the normative importance of national borders. Due to space limitations we limit the discussion to four frameworks. RESULTS: Consequentialist, relational, human rights, and social contract approaches were considered. Responsibilities to provide international assistance were seen as significant by all four theories and place limits on the scope of acceptable national autonomy. Among the range of potential aid foci, interventions for health enjoyed consistent prominence. The four theories concur that there are important ethical responsibilities to support initiatives to improve the health of the worst off worldwide, but offer different rationales for intervention and suggest different implicit limits on responsibilities. CONCLUSIONS: Despite significant theoretical disagreements, four influential accounts of justice offer important reasons to support many current initiatives to promote global health. Ethical argumentation can complement pragmatic reasons to support global health interventions and provide an important foundation to strengthen collective action.
In the third and final part of his A Theory of Determinism (TD), Ted Honderich addresses the fundamental question concerning “the consequences of determinism.” The critical question he aims to answer is: what follows if determinism is true? This question is, of course, intimately bound up with the problem of free will and, in particular, with the question of whether the truth of determinism is compatible with the sort of freedom required for moral responsibility. It is Honderich’s aim to provide a solution to “the problem of the consequences of determinism,” and a key element of this is his articulation and defence of an alternative response to the implications of determinism that collapses the familiar Compatibilist/Incompatibilist dichotomy. Honderich offers us a third way – the response of “Affirmation” (HFY 125-6). Although his account of Affirmation has application and relevance to issues and features beyond freedom and responsibility, my primary concern in this essay will be to examine Honderich’s theory of “Affirmation” as it concerns the free will problem.
Ted Sider argues that nihilism about objects is incompatible with the metaphysical possibility of gunk and takes this point to show that nihilism is flawed. I shall describe one kind of nihilism able to answer this objection. I believe that most of the things we usually encounter do not exist. That is, I take talk of macroscopic objects and macroscopic properties to refer to sets of fundamental properties, which are invoked as a matter of linguistic convention. This view is a kind of nihilism: it rules out the existence of objects; that is, from an ontological point of view, there are no objects. But unlike the moderate nihilism of Mark Heller, Peter van Inwagen and Trenton Merricks that claims that most objects do not exist, I endorse a radical nihilism according to which there are no objects in the world, but only properties instantiated in spacetime. As I will show, radical nihilism is perfectly compatible with the metaphysical possibility of gunk. It is also compatible with the epistemic possibility that we actually live in a gunk world. The objection raised by Ted Sider only applies to moderate nihilism that admits some objects in its ontology.
Ted Sider’s Proportionality of Justice condition requires that any two moral agents instantiating nearly the same moral state be treated in nearly the same way. I provide a countermodel in supervaluation semantics to the proportionality of justice condition. It is possible that moral agents S and S' are in nearly the same moral state even though S' is beyond all redemption and S is not. It is consistent with perfect justice, then, that moral agents that are not beyond redemption go determinately to heaven and moral agents that are beyond all redemption go determinately to hell. I conclude that moral agents that are in nearly the same moral state may be treated in very unequal ways.
Ted Sider has famously argued that existence, in the unrestricted sense of ontology, cannot be vague, as long as vagueness is modeled by means of precisifications. The first section of Chapter 9 exposes some controversial assumptions underlying Sider’s alleged reductio of vague existence. The upshot of the discussion is that, although existence cannot be vague, it can be super-vague, i.e. higher-order vague, for all orders. The second section develops and defends a novel framework, dubbed negative supervaluationary semantics, which makes room for the possibility of super-vague existence.
Ted Sider has shown that my indeterminism argument against comparativist theories of quantity also applies to Mundy's absolutist theory. This is because Mundy's theory posits only "pure" relations, i.e. relations between values of the same quantity (between masses and other masses, or distances and other distances). It is straightforward to solve the problem by positing additional mixed relations.
Ted Honderich's edited volume, with introductions to his chosen philosophers, shows his contempt for, or ignorance of, the non-white world's thinkers. Further, this review points out the iterative nature of Western philosophy today. The book under review is banal and shows the pathetic state of philosophising in the West in 2020.
In a situation of peer disagreement, peers are usually assumed to share the same evidence. However, they might not share the same evidence for the epistemic system used to process the evidence. This synchronic complication of the peer disagreement debate, suggested by Goldman (2010, in Feldman and Warfield (eds), Disagreement, Oxford University Press, pp. 187–215), is elaborated diachronically by use of a simulation. The Hegselmann–Krause model is extended to multiple epistemic systems and used to investigate the role of consensus and difference splitting in peer disagreement. I find that the very possibility of multiple epistemic systems downgrades the epistemic value of consensus and makes difference splitting a suboptimal strategy.
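For orientation, here is a minimal sketch of the baseline Hegselmann–Krause bounded-confidence dynamics that the paper extends; the parameter values are our own illustrative choices, and the paper's extension to multiple epistemic systems is not reproduced. Each agent repeatedly adopts the average opinion of all agents within its confidence interval epsilon.

```python
# Baseline Hegselmann-Krause model (illustrative parameters; the paper's
# multiple-epistemic-systems extension is not modelled here).
import random

random.seed(0)
n_agents, epsilon, steps = 20, 0.2, 50
opinions = [random.random() for _ in range(n_agents)]  # opinions in [0, 1]

for _ in range(steps):
    updated = []
    for x in opinions:
        # Confidence set: everyone (including oneself) within epsilon.
        peers = [y for y in opinions if abs(x - y) <= epsilon]
        updated.append(sum(peers) / len(peers))
    opinions = updated

# The population typically settles into one or more opinion clusters
# (full consensus or stable splits, depending on epsilon).
print(sorted(round(x, 3) for x in opinions))
```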
Alessandro Torza argues that Ted Sider’s Lewisian argument against vague existence is insufficient to rule out the possibility of what he calls ‘super-vague existence’, that is, the idea that existence is higher-order vague for all orders. In this chapter, it is argued that the possibility of super-vague existence is ineffective against the conclusion of Sider’s argument, since super-vague existence cannot be consistently claimed to be a kind of linguistic vagueness. Torza’s idea of super-vague existence seems to be better suited to model vague existence under the assumption that vague existence is instead a form of ontic indeterminacy, contra what Ted Sider and David Lewis assume.
The truthmaker literature has recently come to the consensus that the logic of truthmaking is distinct from classical propositional logic. This development has huge implications for the free will literature. Since free will and moral responsibility are primarily ontological concerns (and not semantic concerns), the logic of truthmaking ought to be central to the free will debate. I shall demonstrate that counterexamples to transfer principles employed in the direct argument occur precisely where a plausible logic of truthmaking diverges from classical logic. Further, restricted transfer principles (like the ones employed by McKenna, Stump, and Warfield) are as problematic as the original formulation of the direct argument.
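The original transfer principle of the direct argument is van Inwagen's Rule B; its statement is standard in this literature (the rendering below is ours), reading Np as "p obtains and no one is, or ever has been, even partly morally responsible for the fact that p."

```latex
% Rule B (transfer of non-responsibility):
\[
  \text{Rule B:}\qquad \frac{Np \qquad N(p \rightarrow q)}{Nq}
\]
% The paper's claim is that counterexamples to this inference arise
% exactly where a plausible truthmaker logic diverges from classical logic.
```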
I here argue that Ted Sider's indeterminacy argument against vagueness in quantifiers fails. Sider claims that vagueness entails precisifications, but holds that precisifications of quantifiers cannot be coherently described: they will either deliver the wrong logical form to quantified sentences, or involve a presupposition that contradicts the claim that the quantifier is vague. Assuming (as does Sider) that the “connectedness” of objects can be precisely defined, I present a counter-example to Sider's contention, consisting of a partial, implicit definition of the existential quantifier that in effect sets a given degree of connectedness among the putative parts of an object as a condition upon there being something (in the sense in question) with those parts. I then argue that such an implicit definition, taken together with an “auxiliary logic” (e.g., introduction and elimination rules), proves to function as a precisification in just the same way as paradigmatic precisifications of, e.g., “red”. I also argue that with a quantifier that is stipulated as maximally tolerant as to what mereological sums there are, precisifications can be given in the form of truth-conditions of quantified sentences, rather than by implicit definition.
In Writing the Book of the World, Ted Sider argues that David Lewis’s distinction between those predicates which are ‘perfectly natural’ and those which are not can be extended so that it applies to words of all semantic types. Just as there are perfectly natural predicates, there may be perfectly natural connectives, operators, singular terms and so on. According to Sider, one of our goals as metaphysicians should be to identify the perfectly natural words. Sider claims that there is a perfectly natural first-order quantifier. I argue that this claim is not justified. Quine has shown that we can dispense with first-order quantifiers, by using a family of ‘predicate functors’ instead. I argue that we have no reason to think that it is the first-order quantifiers, rather than Quine’s predicate functors, which are perfectly natural. The discussion of quantification is used to provide some motivation for a general scepticism about Sider’s project. Shamik Dasgupta’s ‘generalism’ and Jason Turner’s critique of ‘ontological nihilism’ are also discussed.
This paper looks at an argument strategy for assessing the epistemic closure principle. This is the principle that says knowledge is closed under known entailment; or (roughly) if S knows p and S knows that p entails q, then S knows that q. The strategy in question looks to the individual conditions on knowledge to see if they are closed. According to one conjecture, if all the individual conditions are closed, then so too is knowledge. I give a deductive argument for this conjecture. According to a second conjecture, if one (or more) condition is not closed, then neither is knowledge. I give an inductive argument for this conjecture. In sum, I defend the strategy by defending the claim that knowledge is closed if, and only if, all the conditions on knowledge are closed. After making my case, I look at what this means for the debate over whether knowledge is closed.
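Schematically, with K_S for "S knows" and C_1, ..., C_n for the individual conditions on knowledge (the symbolization is ours), the closure principle and the first conjecture read:

```latex
\[
  \text{Closure:}\quad
  \bigl(K_S\,p \,\wedge\, K_S(p \rightarrow q)\bigr) \rightarrow K_S\,q
\]
% Conjecture 1: if knowledge is the conjunction of the conditions,
%   K_S\,x \equiv \bigwedge_{i=1}^{n} C_i(S,x),
% and each C_i is closed under known entailment, then K is closed:
% given K_S p and K_S(p -> q), each C_i(S,q) follows, and their
% conjunction yields K_S q.
```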