Host-microbiome interactions (HMIs) are critical for the modulation of biological processes and are associated with several diseases, and extensive HMI studies have generated large amounts of data. We propose that the logical representation of the knowledge derived from these data and the standardized representation of experimental variables and processes can foster integration of data and reproducibility of experiments and thereby further HMI knowledge discovery. A community-based Ontology of Host-Microbiome Interactions (OHMI) was developed following the OBO Foundry principles. OHMI leverages established ontologies to create logically structured representations of microbiomes, microbial taxonomy, host species, host anatomical entities, and HMIs under different conditions, as well as associated study protocols, types of data analysis, and experimental results.
Conscience is often referred to yet little understood. This text develops a theory of cognition around a model of conscience, the ACTWith model. It represents a synthesis of results from contemporary neuroscience with traditional philosophy, building from Jamesian insights into the emergence of the self to narrative identity, all the while motivated by a single mechanism as represented in the ACTWith model. Emphasis is placed on clarifying historical expressions and demonstrations of conscience (Socrates, Heidegger, Kant, M.L. King) in light of the ACTWith model, while at once turning these resources to developing the basic architecture. In the end, this text aims to enrich moral theory by improving our understanding of moral cognition, while at once providing a useful tool in everyday moral practice and self-development.
Our purpose in this article is to identify and suggest resolutions for two core problematics of grounded theory. First, while grounded theory provides transparency to one part of the conceptualization process, where codes emerge directly from the data, it provides no such systematic or transparent way of gaining insight into the conceptual relationships between discovered codes. Producing a grounded theory depends not only on the definition of conceptual pieces, but also on the delineation of a relationship between at least two of those pieces. Second, the conceptualization process of grounded theory is done in hierarchical fashion, where individual codes emerge from the data but are then used to generate insight into more general concepts and thematic statements. But various works on grounded theory have failed to provide any systematic way of using data-specific levels of scale (the codes) to gain insight into more macro levels of scale (concepts and themes). We offer fractal concept analysis as a means of resolving both of these issues. By using a logic structure generator, fractal concept analysis delineates self-similar conceptual frameworks at various levels of abstraction, yielding a method for linking concepts together within and between the levels of scale encountered in the grounded theory coding and categorization process. We conclude that this fractal analytic technique can bolster the aims of grounded theory as a formalized and systematic process for generating theory from empirical data.
Eyal et al. have recently argued that researchers should consider conducting severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) human challenge studies to hasten vaccine development. We have conducted (J. L.) and overseen (L. D.) human challenge studies and agree that they can be useful in developing anti-infective agents. We also agree that adults can autonomously choose to undergo risks with no prospect of direct benefit to themselves. However, we disagree that SARS-CoV-2 challenge studies are ethically appropriate at this time, for 3 reasons: (1) current scientific knowledge of SARS-CoV-2 infection is insufficient to manage risks; (2) autonomous decision making, while necessary, does not override concerns about risk; and (3) undertaking challenge studies now would imperil confidence in the research enterprise, potentially undermining the global response to the coronavirus disease 2019 (COVID-19) pandemic.
In this paper, I shed light on Kant’s notion of Erkenntnis or cognition by focusing on texts pertaining to Kant’s thoughts on logic. Although a passage from Kant’s Logik is widely referred to for understanding Kant’s conception of Erkenntnis, this work was not penned by Kant himself but rather compiled by Benjamin Jäsche. So, it is imperative to determine its fidelity to Kant’s thought. I compare the passage with other sources, including Reflexionen and students’ lecture notes. I argue that several of the text’s peculiarities stem from Jäsche rather than Kant, but that nevertheless Jäsche largely got Kant's view right, with two major exceptions. First, Jäsche’s text fails to reproduce Kant’s key thesis that kennen and verstehen are jointly sufficient for Erkenntnis. Second, Jäsche’s text gives the false impression that Kant holds that animals have consciousness.
Some contextually sensitive expressions are such that their context-independent conventional meanings need to be in some way supplemented in context for the expressions to secure semantic values in those contexts. As we’ll see, it is not clear that there is a paradigm here, but ‘he’ used demonstratively is a clear example of such an expression. Call expressions of this sort supplementives, in order to highlight the fact that their context-independent meanings need to be supplemented in context for them to have semantic values relative to the context. Many philosophers and linguists think that there is a lot of contextual sensitivity in natural language that goes well beyond the pure indexicals and supplementives like ‘he’. Constructions/expressions that are good candidates for being contextually sensitive include: quantifiers, gradable adjectives including “predicates of personal taste”, modals, conditionals, possessives, and relational expressions taking implicit arguments. It would appear that in none of these cases does the expression/construction in question have a context-independent meaning that when placed in context suffices to secure a semantic value for the expression/construction in the context. In each case, some sort of supplementation is required to do this. Hence, all these expressions are supplementives in my sense. For a given supplementive, the question arises as to what the mechanism is for supplementing its conventional meaning in context so as to secure a semantic value for it in context. That is, what form does the supplementation take? The question also arises as to whether different supplementives require different kinds of supplementation. Let us call an account of what, in addition to its conventional meaning, secures a semantic value for a supplementive in context a metasemantics for that supplementive.
So we can put our two questions thus: what is the proper metasemantics for a given supplementive; and do all supplementives have the same metasemantics? In the present work, I sketch the metasemantics I formulated for demonstratives in earlier work. Next, I briefly consider a number of other supplementives that I think the metasemantics I propose plausibly applies to and explain why I think that. Finally, I consider the prospects for extending the account to all supplementives. In so doing, I take up arguments due to Michael Glanzberg to the effect that supplementives are governed by two different metasemantics and attempt to respond to them.
Mereological nihilism is the philosophical position that there are no items that have parts. If there are no items with parts, then the only items that exist are partless fundamental particles, such as the true atoms (also called philosophical atoms) theorized to exist by some ancient philosophers, some contemporary physicists, and some contemporary philosophers. With several novel arguments I show that mereological nihilism is the correct theory of reality. I will also discuss strong similarities that mereological nihilism has with empirical results in quantum physics. And I will discuss how mereological nihilism vindicates a few other theories, such as a very specific theory of philosophical atomism, which I will call quantum abstract atomism. I will show that mereological nihilism is also an interpretation of quantum mechanics that avoids the problems of other interpretations, such as the widely known, metaphysically generated paradoxes of quantum physics, which ironically are typically accepted as facts about reality. I will also show why it is very surprising that mereological nihilism is not a widely held theory, and not the premier theory in philosophy.
The Epistemic Objection says that certain theories of time imply that it is impossible to know which time is absolutely present. Standard presentations of the Epistemic Objection are elliptical—and some of the most natural premises one might fill in to complete the argument end up leading to radical skepticism. But there is a way of filling in the details which avoids this problem, using epistemic safety. The new version has two interesting upshots. First, while Ross Cameron alleges that the Epistemic Objection applies to presentism as much as to theories like the growing block, the safety version does not overgeneralize this way. Second, the Epistemic Objection does generalize in a different, overlooked way. The safety objection is a serious problem for a widely held combination of views: “propositional temporalism” together with “metaphysical eternalism”.
This paper is about modeling morality, with a proposal as to the best way to do it. There is the small problem, however, of continuing disagreements over what morality actually is, and so what is worth modeling. This paper resolves this problem around an understanding of the purpose of a moral model, and from this purpose approaches the best way to model morality.
How should a group with different opinions (but the same values) make decisions? In a Bayesian setting, the natural question is how to aggregate credences: how to use a single credence function to naturally represent a collection of different credence functions. An extension of the standard Dutch-book arguments that apply to individual decision-makers recommends that group credences should be updated by conditionalization. This imposes a constraint on what aggregation rules can be like. Taking conditionalization as a basic constraint, we gather lessons from the established work on credence aggregation, and extend this work with two new impossibility results. We then explore contrasting features of two kinds of rules that satisfy the constraints we articulate: one kind uses fixed prior credences, and the other uses geometric averaging, as opposed to arithmetic averaging. We also prove a new characterisation result for geometric averaging. Finally we consider applications to neighboring philosophical issues, including the epistemology of disagreement.
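The contrast between arithmetic and geometric credence aggregation can be made concrete with a small sketch. This is a hypothetical illustration, not code from the paper; the function names, the equal weights, and the two-agent example are my own assumptions.

```python
# Two ways of pooling agents' credence functions over the same outcomes.
# `credences` is a list of probability vectors (one per agent);
# `weights` is one non-negative weight per agent, summing to 1.

def arithmetic_pool(credences, weights):
    # Weighted arithmetic mean, computed outcome by outcome.
    return [sum(w * c[i] for w, c in zip(weights, credences))
            for i in range(len(credences[0]))]

def geometric_pool(credences, weights):
    # Weighted geometric mean, renormalized so the result sums to 1.
    raw = [1.0] * len(credences[0])
    for w, c in zip(weights, credences):
        raw = [r * (ci ** w) for r, ci in zip(raw, c)]
    total = sum(raw)
    return [r / total for r in raw]

# Example: two agents, two outcomes, equal weights.
agents = [[0.5, 0.5], [0.8, 0.2]]
print(arithmetic_pool(agents, [0.5, 0.5]))  # [0.65, 0.35]
print(geometric_pool(agents, [0.5, 0.5]))   # [2/3, 1/3]
```

A standard observation in the aggregation literature is that geometric pooling commutes with Bayesian conditionalization (pooling then updating gives the same result as updating then pooling), while arithmetic pooling generally does not, which is one way to see why a conditionalization constraint favors geometric rules.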
The Christian doctrine of the Trinity poses a serious philosophical problem. On the one hand, it seems to imply that there is exactly one divine being; on the other hand, it seems to imply that there are three. There is another well-known philosophical problem that presents us with a similar sort of tension: the problem of material constitution. We argue in this paper that a relatively neglected solution to the problem of material constitution can be developed into a novel solution to the problem of the Trinity.
According to the doctrine of divine simplicity, God is an absolutely simple being lacking any distinct metaphysical parts, properties, or constituents. Although this doctrine was once an essential part of traditional philosophical theology, it is now widely rejected as incoherent. In this paper, I develop an interpretation of the doctrine designed to resolve contemporary concerns about its coherence, as well as to show precisely what is required to make sense of divine simplicity.
We consider how an epistemic network might self-assemble from the ritualization of the individual decisions of simple heterogeneous agents. In such evolved social networks, inquirers may be significantly more successful than they could be investigating nature on their own. The evolved network may also dramatically lower the epistemic risk faced by even the most talented inquirers. We consider networks that self-assemble in the context of both perfect and imperfect communication and compare the behaviour of inquirers in each. This provides a step in bringing together two new and developing research programs, the theory of self-assembling games and the theory of network epistemology.
There is a traditional theistic doctrine, known as the doctrine of divine simplicity, according to which God is an absolutely simple being, completely devoid of any metaphysical complexity. On the standard understanding of this doctrine—as epitomized in the work of philosophers such as Augustine, Anselm, and Aquinas—there are no distinctions to be drawn between God and his nature, goodness, power, or wisdom. On the contrary, God is identical with each of these things, along with anything else that can be predicated of him intrinsically.
David Lewis holds that a single possible world can provide more than one way things could be. But what are possible worlds good for if they come apart from ways things could be? We can make sense of this if we go in for a metaphysical understanding of what the world is. The world does not include everything that is the case—only the genuine facts. Understood this way, Lewis's “cheap haecceitism” amounts to a kind of metaphysical anti-haecceitism: it says there aren't any genuine facts about individuals over and above their qualitative roles.
Readers of Philosophical Psychology may be most familiar with Ron Sun by way of an article recently appearing in this journal on creative composition expressed within his own hybrid computational intelligence model, CLARION (Sun, 2013). That article represents nearly two decades’ work in situated agency, stressing the importance of psychologically realistic architectures and processes in the articulation of both functional, and reflectively informative, AI and agent-level social-cultural simulations. Readers may be less familiar with Sun’s 2001 “prolegomena” to related multi-agent (proto-social) research, also from this journal. That article argues that “a proper balance between ‘objective’ social reality and individual cognitive processes” is necessary in order to understand “how individual belief systems... and the social/cultural belief system ... interact” (Sun, 2001, pages 10 and 23). This issue remains central in Sun’s 2012 edited volume, Grounding Social Sciences in the Cognitive Sciences, here addressed from within the expanding field of pioneering researchers bent on orchestrating that proper balance, the “cognitive social sciences.” The volume's fifteen chapters are sectioned according to culture, politics, religion, and economics, and it closes with an especially rewarding pair of contributions from Gintis, and McCubbins and Turner, under the heading of “unifying perspectives.” Most entries, except Sun’s own, are serviceably summarized in the introductory overview. So, rather than follow suit, this review will focus on setting out Sun’s vision, noting how this text helps us to realize it more clearly, with a positive focus on a few entries in particular.
We prove a representation theorem for preference relations over countably infinite lotteries that satisfy a generalized form of the Independence axiom, without assuming Continuity. The representing space consists of lexicographically ordered transfinite sequences of bounded real numbers. This result is generalized to preference orders on abstract superconvex spaces.
Linguists often advert to what are sometimes called linguistic intuitions. These intuitions and the uses to which they are put give rise to a variety of philosophically interesting questions: What are linguistic intuitions – for example, what kind of attitude or mental state is involved? Why do they have evidential force and how might this force be underwritten by their causal etiology? What light might their causal etiology shed on questions of cognitive architecture – for example, as a case study of how consciously inaccessible subpersonal processes give rise to conscious states, or as a candidate example of cognitive penetrability? What methodological issues arise concerning how linguistic intuitions are gathered and interpreted – for example, might some subjects' intuitions be more reliable than others? And what bearing might all this have on philosophers' own appeals to intuitions? This paper surveys and critically discusses leading answers to these questions. In particular, we defend a ‘mentalist’ conception of linguistics and the role of linguistic intuitions therein.
I examine three ‘anti-object’ metaphysical views: nihilism, generalism, and anti-quantificationalism. After setting aside nihilism, I argue that generalists should be anti-quantificationalists. Along the way, I attempt to articulate what a ‘metaphysically perspicuous’ language might even be.
“There are no gaps in logical space,” David Lewis writes, giving voice to a sentiment shared by many philosophers. But different natural ways of trying to make this sentiment precise turn out to conflict with one another. One is a *pattern* idea: “Any pattern of instantiation is metaphysically possible.” Another is a *cut and paste* idea: “For any objects in any worlds, there exists a world that contains any number of duplicates of all of those objects.” We use resources from model theory to show the inconsistency of certain packages of combinatorial principles and the consistency of others.
Psychopathy is increasingly in the public eye. However, it is yet to be fully and effectively understood. Within the context of the DSM-IV, for example, it is best regarded as a complex family of disorders. The upside is that this family can be tightly related along common dimensions. Characteristic marks of psychopaths include a lack of guilt and remorse for paradigm case immoral actions, leading to the common conception of psychopathy rooted in affective dysfunctions. An adequate portrait of psychopathy is much more complicated, however. Though some neural regions and corresponding functions are commonly indicated, they range across those responsible for action planning and learning, as well as emotional processes. Accordingly, a complete fine-grained map of all neural mechanisms responsible for psychopathy has not been realized, and even if it were, such a map would have limited utility outside of the context of surgical or chemical intervention. The utility of a neural-level understanding of psychopathy is further limited by the fact that it is only applicable in the clinical identification of individual subjects, and the neuro-chemical/biological correction of those subjects after they are positively identified as psychopaths. On the other hand, an information processing model of moral cognition provides for wider-ranging applications. The theoretical and practical implications for such a feasible working model of psychopathic personalities are assessed. Finally, this chapter raises the possibility of directed modification of social-environmental factors discouraging the development of psychopathic personalities in the first place, modifications which are also open to simulation and testing in terms of the same model of moral cognition.
Could space consist entirely of extended regions, without any regions shaped like points, lines, or surfaces? Peter Forrest and Frank Arntzenius have independently raised a paradox of size for space like this, drawing on a construction of Cantor’s. I present a new version of this argument and explore possible lines of response.
That believing truly as a matter of luck does not generally constitute knowing has become epistemic commonplace. Accounts of knowledge incorporating this anti-luck idea frequently rely on one or another of a safety or sensitivity condition. Sensitivity-based accounts of knowledge have a well-known problem with necessary truths, to wit, that any believed necessary truth trivially counts as knowledge on such accounts. In this paper, we argue that safety-based accounts similarly trivialize knowledge of necessary truths and that two ways of responding to this problem for safety, issuing from work by Williamson and Pritchard, are of dubious success.
Famous results by David Lewis show that plausible-sounding constraints on the probabilities of conditionals or evaluative claims lead to unacceptable results, by standard probabilistic reasoning. Existing presentations of these results rely on stronger assumptions than they really need. When we strip these arguments down to a minimal core, we can see both how certain replies miss the mark, and also how to devise parallel arguments for other domains, including epistemic “might,” probability claims, claims about comparative value, and so on. A popular reply to Lewis's results is to claim that conditional claims, or claims about subjective value, lack truth conditions. For this strategy to have a chance of success, it needs to give up basic structural principles about how epistemic states can be updated—in a way that is strikingly parallel to the commitments of the project of dynamic semantics.
Few notions are more central to Aquinas’s thought than those of matter and form. Although he invokes these notions in a number of different contexts, and puts them to a number of different uses, he always assumes that in their primary or basic sense they are correlative both with each other and with the notion of a “hylomorphic compound”—that is, a compound of matter (hyle) and form (morphe). Thus, matter is an entity that can have form, form is an entity that can be had by matter, and a hylomorphic compound is an entity that exists when the potentiality of some matter to have form is actualized. What is more, Aquinas assumes that the matter of a hylomorphic compound explains certain of its general characteristics, whereas its form explains certain of its more specific characteristics. Thus, the matter of a bronze statue explains the fact that it is bronze, whereas its form explains the fact that it is a statue. Again, the matter of a human being explains the fact that it is a material object, whereas its form explains the specific type of material object it is (namely, human). My aim in this chapter is to provide a systematic introduction to Aquinas’s primary or basic notions of matter and form. To accomplish this aim, I focus on the two main theoretical contexts in which he deploys them—namely, his theory of change and his theory of individuation. In both contexts, as we shall see, Aquinas appeals to matter and form to account for relations of sameness and difference holding between distinct individuals.
Until recently, experimental philosophy has been associated with the questionnaire-based study of intuitions; however, experimental philosophers now adapt a wide range of empirical methods for new philosophical purposes. New methods include paradigms for behavioural experiments from across the social sciences as well as computational methods from the digital humanities that can process large bodies of text and evidence. This book offers an accessible overview of these exciting innovations. The volume brings together established and emerging research leaders from several areas of experimental philosophy to explore how new empirical methods can contribute to philosophical debates. Each chapter presents one or several methods new to experimental philosophy, demonstrating their application in a key area of philosophy and discussing their strengths and limitations. Methods covered include eye tracking, virtual reality technology, neuroimaging, statistical learning, and experimental economics as well as corpus linguistics, visualisation techniques and data and text mining. The volume explores their use in moral philosophy and moral psychology, epistemology, philosophy of science, metaphysics, philosophy of language, philosophy of mind and the history of ideas. Methodological Advances in Experimental Philosophy is essential reading for undergraduates, graduate students and researchers working in experimental philosophy.
The counterpart theorist has a problem: there is no obvious way to understand talk about actuality in terms of counterparts. Fara and Williamson have charged that this obstacle cannot be overcome. Here I defend the counterpart theorist by offering systematic interpretations of a quantified modal language that includes an actuality operator. Centrally, I disentangle the counterpart relation from a related notion, a ‘representation relation’. The relation of possible things to the actual things they represent is variable, and an adequate account of modal language must keep track of the way it is systematically shifted by modal operators. I apply my account to resolve several puzzles about counterparts and actuality. In technical appendices, I prove some important logical results about this ‘representational’ counterpart system and its relationship to other modal systems.
“Pragmatic encroachers” about knowledge generally advocate two ideas: (1) you can rationally act on what you know; (2) knowledge is harder to achieve when more is at stake. Charity Anderson and John Hawthorne have recently argued that these two ideas may not fit together so well. I extend their argument by working out what “high stakes” would have to mean for the two ideas to line up, using decision theory.
Suppose that all non-qualitative facts are grounded in qualitative facts. I argue that this view naturally comes with a picture in which trans-world identity is indeterminate. But this in turn leads to either pervasive indeterminacy in the non-qualitative, or else contingency in what facts about modality and possible worlds are determinate.
Some hold that the lesson of Russell’s paradox and its relatives is that mathematical reality does not form a ‘definite totality’ but rather is ‘indefinitely extensible’. There can always be more sets than there ever are. I argue that certain contact puzzles are analogous to Russell’s paradox this way: they similarly motivate a vision of physical reality as iteratively generated. In this picture, the divisions of the continuum into smaller parts are ‘potential’ rather than ‘actual’. Besides the intrinsic interest of this metaphysical picture, it has important consequences for the debate over absolute generality. It is often thought that ‘indefinite extensibility’ arguments at best make trouble for mathematical platonists; but the contact arguments show that nominalists face the same kind of difficulty, if they recognize even the metaphysical possibility of the picture I sketch.
Edited by Ignazio Licata and Ammar J. Sakaji. Contributors: Jeffrey A. Barrett, Enrico Celeghini, Leonardo Chiatti, Maurizio Consoli, Davide Fiscaletti, Ervin Goldfain, Annick Lesne, Maria Paola Lombardo, Mohammad Mehrafarin, Ronald Mirman, Ulrich Mohrhoff, Renato Nobili, Farrin Payandeh, Eliano Pessa, L.I. Petrova, Erasmo Recami, Giovanni Salesi, Francesco Maria Scarpa, Mohammad Vahid Takook, Giuseppe Vitiello. This volume grew out of an informal discussion between friends and colleagues around the question: what topic do you think is fundamental in theoretical physics nowadays? Naturally we received different answers according to disposition and research area, and answers in superposition states too. And yet some attractors emerged, pointing out the keys to the physicists' conception of Nature, all of them converging towards a group of strongly interconnected problems: the concept of particle identity in Quantum Mechanics (QM) and Quantum Field Theory (QFT); the relationship between QM and QFT, in particular the non-local aspects of field theory and the problem of non-perturbative solutions; the local/global problem in the relationship between particle physics and cosmology; the role of the renormalization group in describing meso- and macroscopic emergent behaviour; the possible extension of the Poincaré symmetry group and quantum cosmology; and the Higgs "mechanism" and the origin of mass.
Linguists, particularly in the generative tradition, commonly rely upon intuitions about sentences as a key source of evidence for their theories. While widespread, this methodology has also been controversial. In this paper, I develop a positive account of linguistic intuition, and defend its role in linguistic inquiry. Intuitions qualify as evidence as a form of linguistic behavior, which, since it is partially caused by linguistic competence (the object of investigation), can be used to study this competence. I defend this view by meeting two challenges. First, that intuitions are collected through methodologically unsound practices, and second, that intuition cannot distinguish between the contributions of competence and performance systems.
In this paper I defend a form of epistocracy I call limited epistocracy: rule by institutions housing expertise in non-political areas that become politically relevant. This kind of limited epistocracy, I argue, isn’t a far-off fiction. With increasing frequency, governments are outsourcing political power to expert institutions to solve urgent, multidimensional problems, because they outperform ordinary democratic decision-making. I consider the objection that limited epistocracy, while more effective than its competitors, lacks a fundamental intrinsic value that its competitors have; namely, political inclusion. After explaining this challenge, I suggest that limited epistocracies can be made compatible with robust political inclusion if specialized institutions are confined to issuing directives that give citizens multiple actionable options. I explain how this safeguards citizens’ inclusion through rational deliberation, choice, and contestation.
Some philosophers respond to Leibniz’s “shift” argument against absolute space by appealing to antihaecceitism about possible worlds, using David Lewis’s counterpart theory. But separated from Lewis’s distinctive system, it is difficult to understand what this doctrine amounts to or how it bears on the Leibnizian argument. In fact, the best way of making sense of the relevant kind of antihaecceitism concedes the main point of the Leibnizian argument, pressing us to consider alternative spatiotemporal metaphysics.
Creationism is the conjunction of the following theses: (i) fictional individuals (e.g., Sherlock Holmes) actually exist; (ii) fictional names (e.g., 'Holmes') are at least sometimes genuinely referential; (iii) fictional individuals are the creations of the authors who first wrote (or spoke, etc.) about them. CA Creationism is the conjunction of (i)-(iii) and the following thesis: (iv) fictional individuals are contingently existing abstracta; they are non-concrete artifacts of our world and various other possible worlds. Takashi Yagisawa has recently provided a number of arguments designed to show that Creationism is unjustified. I here critically examine three of his challenges to CA Creationism. I argue that each fails to undermine this version of Creationism.
In a recent article, Edward Wierenga defends a version of Social Trinitarianism according to which the Persons of the Trinity form a unique society of really distinct divine beings, each of whom has its own exemplification of divinity. In this paper, I call attention to several philosophical and theological difficulties with Wierenga’s account, as well as to a problem that such difficulties pose for Social Trinitarianism generally. I then briefly suggest what I take to be a more promising approach to the Trinity.
Scientific knowledge is not merely a matter of reconciling theories and laws with data and observations. Science presupposes a number of metatheoretic shaping principles in order to distinguish good methods and theories from bad. Some of these principles are metaphysical and some are methodological. While many shaping principles have endured since the scientific revolution, others have changed in response to conceptual pressures both from within science and without. Many of them have theistic roots. For example, the notion that nature conforms to mathematical laws flows directly from the early modern presupposition that there is a divine Lawgiver. This interplay between theism and shaping principles is often unappreciated in discussions about the relation between science and religion. Today, of course, naturalists reject the influence of theism and prefer to do science on their own terms. But as Robert Koons and Alvin Plantinga have argued, this is more difficult than is typically assumed. In particular, they argue, metaphysical naturalism is in conflict with several metatheoretic shaping principles, especially explanatory virtues such as simplicity, and with scientific realism more broadly. These arguments will be discussed, as well as possible responses. In the end, theism is able to provide justification for the philosophical foundations of science that naturalism cannot.
Though legal positivism remains popular, H. L. A. Hart’s version has fallen somewhat by the wayside. This is because, according to many, the central task of a theory of law is to explain the so-called ‘normativity of law’. Hart’s theory, it is thought, is not up to the task. Some have suggested modifying the theory accordingly. This paper argues that both Hart’s theory and the normativity of law have been misunderstood. First, a popular modification of Hart’s theory is considered and rejected. It stems from a misunderstanding of Hart and his project. Second, a new understanding of the mysterious but often-mentioned ‘normativity of law’ is presented. Once we have dispelled some misunderstandings of Hart’s view and clarified the sense in which law is supposed to be normative, we see that Hart’s view, unmodified, is well suited to the task of explaining law’s normativity.
Theology is the preeminent academic discipline during the Middle Ages and, as a result, most of the great thinkers of this period are highly trained theologians. Although this is common knowledge, it is sometimes overlooked that the systematic nature of medieval theology led its practitioners to develop full treatments of virtually every area within philosophy. Indeed, theological reflection not only provides the main context in which the medievals theorize about what we would now recognize as distinctively philosophical issues, but it is responsible for some of their most significant philosophical contributions. To give just a few examples: it is problems with the Christian doctrine of the Incarnation that prompt medievals to develop the notions of ‘substance’ and ‘person’ in striking and original ways; it is problems with the doctrine of the Eucharist that lead them to consider the possibility of ‘accidents that do not inhere’; and it is problems of…
Nietzsche, Nihilism and the Philosophy of the Future examines Nietzsche's analysis of and response to contemporary nihilism, the sense that nothing has value or ...
We criticize the bare theory of quantum mechanics: a theory on which the Schrödinger equation is universally valid and the standard way of thinking about superpositions is correct.
Eliminativists sometimes invoke evolutionary debunking arguments against ordinary object beliefs, either to help them establish object skepticism or to soften the appeal of commonsense ontology. I argue that object debunkers face a self-defeat problem: their conclusion undermines the scientific support for one of their premises, because evolutionary biology depends on our object beliefs. Using work on reductionism and multiple realizability from the philosophy of science, I argue that it will not suffice for an eliminativist debunker to simply appeal to some object-free surrogate theory of evolution that results from converting any scientific proposition about some object K into a proposition about simples arranged K-wise. In the process, I examine some hazards peculiar to eliminative reductions of scientific theories, and propose a trilemma for eliminativists who attempt to recoup generality for ontologically sparse reducing theories by appealing to pluralities of simples arranged K-wise. The paper is intended to define and develop the object debunker’s self-defeat problem for further study, and to clarify some of the ways sparse and abundant ontologies interact with scientific theory.
I think it would be fair to say that, until about 1900, philosophers were generally reluctant to admit the existence of what are nowadays called polyadic properties. It is important to recognize, however, that this reluctance on the part of pre-twentieth-century philosophers did not prevent them from theorizing about relations. On the contrary, philosophers from the ancient through the modern period have had much to say about both the nature and the ontological status of relations. In this paper I examine the views of one such philosopher, namely, Albert the Great.