Philosophical debate about the meaning of normative terms has long been pulled in two directions by two apparently competing ideas: (i) ‘ought’s do not describe what is actually the case but rather prescribe possible action, thought, or feeling; (ii) all declarative sentences deserve the same general semantic treatment, e.g. in terms of compositionally specified truth conditions. In this paper, I pursue a resolution of this tension by rehearsing the case for a relatively standard truth-conditionalist semantics for ‘ought’ conceived as a necessity modal and proposing a revision to it motivated by the distinctively prescriptive character of some deontic modals. In my view, this puts pressure on a popular conception of one of the core debates of metanormative theory between realists and antirealists. To make good on this claim, I go on to explore two very general ways we might interpret the results of compositional semantics—“representationalism” and “inferentialism”—in order to argue that, contrary to what is generally assumed, both can capture the special prescriptivity of ‘ought’ and both can countenance compositionally specified and informative truth-conditions for ought-sentences. Hence, my main thesis is that the deciding factor between them should not be which of ideas (i) and (ii) we are more impressed by but rather what we think of the relative merits of how representationalism and inferentialism respect these ideas. I’m inclined to favor an antirealist form of inferentialism, but the task I’ve set myself here is mainly to articulate the view in the context of metanormative theory and the semantics of deontic modals rather than to defend it fully. To this purpose, towards the end I also briefly compare and contrast inferentialism with a third “ideationalist” metasemantic view, which may be an attractive home for some sophisticated versions of metanormative expressivism.
Depending on how expressivism is worked out, it may be completely compatible with and so perhaps usefully combined with inferentialism, or it may offer a competing way to respect ideas (i) and (ii).
First-order normative theories concerning what’s right and wrong, good and bad, etc. and metanormative theories concerning the nature of first-order normative thought and talk are widely regarded as independent theoretical enterprises. This paper argues that several debates in metanormative theory involve views that have first-order normative implications, even if the implications in question may not be immediately recognizable as normative. I first make my claim more precise by outlining a general recipe for generating this result. I then apply this recipe to three debates in metaethics: the modal status of basic normative principles, normative vagueness and indeterminacy, and the determination of reference for normative predicates. In each case I argue that certain views on each issue carry first-order normative commitments, in accordance with my recipe.
How should you decide what to do when you're uncertain about basic normative principles (e.g., Kantianism vs. utilitarianism)? A natural suggestion is to follow some "second-order" norm: e.g., "comply with the first-order norm you regard as most probable" or "maximize expected choiceworthiness". But what if you're uncertain about second-order norms too -- must you then invoke some third-order norm? If so, it seems that any norm-guided response to normative uncertainty is doomed to a vicious regress. In this paper, I aim to rescue second-order norms from this threat of regress. I first elaborate and defend the suggestion some philosophers have entertained that the regress problem forces us to accept normative externalism, the view that at least one norm is incumbent on agents regardless of their beliefs or evidence concerning that norm. But, I then argue, we need not accept externalism about first-order (e.g., moral) norms, which would close off any question of what an agent should do in light of her normative beliefs. Rather, it is more plausible to ascribe external force to a single, second-order rational norm: the enkratic principle, correctly formulated. This modest form of externalism, I argue, is both intrinsically well-motivated and sufficient to head off the threat of regress.
We can distinguish between ambitious metanormative constructivism and a variety of other constructivist projects in ethics and metaethics. Ambitious metanormative constructivism is the project of either developing a new type of metanormative theory, worthy of the label “constructivism”, that is distinct from the existing types of metaethical, or metanormative, theories already on the table—various realisms, non-cognitivisms, error-theories and so on—or showing that the questions that lead to these existing types of theories are somehow fundamentally confused. Natural ways of pursuing the project of ambitious metanormative constructivism lead to certain obvious, and related, worries about whether the ambitions are really being achieved—that is, whether we really are being given a distinctive theory. I will argue that responding to these initial worries pushes ambitious metanormative constructivism towards adopting a kind of position that I will call “constructivism all the way down”. Such a position does see off most of the above initial worries. Drawing on the work of Ralph Walker and Crispin Wright, I argue, however, that it faces a distinct objection that is a descendant of Bertrand Russell’s Bishop Stubbs objection against coherentist theories of truth. I grant that the constructivist need not be a coherentist about truth. I argue, however, that despite this the constructivist cannot escape my version of the objection. I also distinguish between this objection and various traditional charges of circularity, regress, relativism, or psychologistic reductionism.
Decision-making under normative uncertainty requires an agent to aggregate the assessments of options given by rival normative theories into a single assessment that tells her what to do in light of her uncertainty. But what if the assessments of rival theories differ not just in their content but in their structure -- e.g., some are merely ordinal while others are cardinal? This paper describes and evaluates three general approaches to this "problem of structural diversity": structural enrichment, structural depletion, and multi-stage aggregation. All three approaches have notable drawbacks, but I tentatively defend multi-stage aggregation as the least bad of the three.
I present and defend (1) an account of ethical judgments as judgments about our reasons to feel specific motivationally laden attitudes, (2) an account of what an agent should do in terms of what would achieve ends that she has reason to be motivated to pursue, and (3) an account of an agent’s reasons for motivation (and thus action) in terms of the prescriptions of the most fundamental principles that guide her deliberations. Using these accounts, I explain the connection between ethics and reasons for action, how ethical judgments are both descriptive and intrinsically motivating, and how ethical facts arise from facts about agents’ deliberations.
This paper investigates whether different philosophers’ claims about “normativity” are about the same subject or whether (as recently argued by Derek Parfit) theorists who appear to disagree are really using the term with different meanings, in order to cast disambiguating light on the debates over at least the nature, existence, extension, and analyzability of normativity. While I suggest the term may be multiply ambiguous, I also find reasons for optimism about a common subject-matter for metanormative theory. This is supported partly by sketching a special kind of hybrid view of normative judgment, perspectivism, that occupies a position between cognitivism and noncognitivism, naturalism and nonnaturalism, objectivism and subjectivism, making it more plausible that radically different metanormative theories could be about the same thing. I explore three main fissures: between (i) the “normativity” of language/thought versus that of facts and properties, (ii) abstract versus substantive senses, and (iii) formal versus robust senses.
This note explores how ideal subjectivism in metanormative theory can help solve two important problems for Fitting Attitude analyses of value. The wrong-kind-of-reason problem is that there may be sufficient reason for attitude Y even if the object is not Y-able. The many-kinds-of-fittingness problem is that the same attitude can be fitting in many ways. Ideal subjectivism addresses both by maintaining that an attitude is W-ly fitting if and only if it is endorsed by any W-ly ideal subject. A subject is W-ly ideal when the most robust way of avoiding W-type practical problems is deferring to her endorsement.
Critical examination of Alchourrón and Bulygin’s set-theoretic definition of normative system shows that deductive closure is not an inevitable property. Following von Wright’s conjecture that axioms of standard deontic logic describe perfection-properties of a norm-set, a translation algorithm from the modal to the set-theoretic language is introduced. The translations reveal that the plausibility of metanormative principles rests on different grounds. Using a methodological approach that distinguishes the actor roles in a norm-governed interaction, it is shown that metanormative principles are directed second-order obligations and, in particular, that the requirement related to deductive closure is directed to the norm-applier role rather than to the norm-giver role. The approach has been applied to the case of pure derogation, yielding a new result, namely, that an independence property is a perfection-property of a norm-set in view of possible derogation. In a polemical way, this paper also touches upon several points raised by Kristan in his recent paper.
If there is something P that every possible agent is committed to value, and certain actions or attitudes either enhance or diminish P, then normative claims about a range of intentional actions can be objectively and non-trivially evaluated. I argue that the degree of existence as an agent depends on the consistency of reflexive-relating with other individuals of the agent-kind: the ontological thesis. I then show that in intending to act on a reason, every agent is rationally committed to value being an agent, which consists in exercising the capacity to act and having the freedom to discriminate between more or less valuable actions: the transcendental thesis. Since the degree of possession of this personal but non-contingent good depends on relating to other agents in a special way, certain actions and attitudes may be objectively right or wrong for all agents.
How can someone reconcile the desire to eat meat with a tendency toward vegetarian ideals? How should we reconcile contradictory moral values? How can we aggregate different moral theories? How can individual preferences be fairly aggregated to represent a will, norm, or social decision? Conflict resolution and preference aggregation are tasks that intrigue philosophers, economists, sociologists, decision theorists, and many other scholars, constituting a rich interdisciplinary area for research. When trying to solve questions about moral uncertainty, a meta-level understanding of the concept of normativity can help us develop strategies to deal with norms themselves. Second-order normativity, or norms about norms, is a hierarchical way to think about how to combine many different normative structures and preferences into a single coherent decision. That is what metanormativity is all about: a way to answer the question, what should we do when we don’t know what to do? In this study, we review a decision-making strategy for dealing with moral uncertainty, Maximization of Expected Choiceworthiness. This strategy, proposed by William MacAskill, allows for the aggregation and inter-theoretical comparison of different normative structures, cardinal theories, and ordinal theories. We exemplify the metanormative methods proposed by MacAskill using as an example a series of vegetarian dilemmas. Given the similarity of this metanormative strategy to expected utility theory, we also show that it is possible to integrate both models to address decision-making problems in situations of empirical and moral uncertainty. We believe that this kind of ethical-mathematical formalism can be useful in developing strategies to better aggregate moral preferences and resolve conflicts.
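The aggregation rule the abstract describes can be sketched numerically. The following is a minimal illustration, not MacAskill's own formalization: the credences and choiceworthiness scores are invented, and the scores are assumed to already sit on a common inter-theoretic cardinal scale (the hard part the paper discusses).

```python
# Maximization of Expected Choiceworthiness (MEC), minimal sketch:
# weight each theory's choiceworthiness score for an option by the
# agent's credence in that theory, then pick the option whose
# credence-weighted sum is highest. All numbers are hypothetical.

credences = {"utilitarianism": 0.6, "deontology": 0.4}

# Cardinal choiceworthiness of each option under each theory,
# assumed already comparable across theories.
choiceworthiness = {
    "eat_meat":       {"utilitarianism": -10.0, "deontology": 2.0},
    "eat_vegetarian": {"utilitarianism":   8.0, "deontology": 1.0},
}

def expected_choiceworthiness(option):
    """Credence-weighted sum of an option's scores across theories."""
    return sum(credences[t] * choiceworthiness[option][t] for t in credences)

best = max(choiceworthiness, key=expected_choiceworthiness)
print(best)  # -> eat_vegetarian
```

Structurally this is expected utility with theories in place of states of the world and credences in place of chances, which is what makes the integration with empirical uncertainty mentioned above natural: one can take expectations over theory-and-world pairs in a single sum.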
Both mindreading and stereotyping are forms of social cognition that play a pervasive role in our everyday lives, yet too little attention has been paid to the question of how these two processes are related. This paper offers a theory of the influence of stereotyping on mental-state attribution that draws on hierarchical predictive coding accounts of action prediction. It is argued that the key to understanding the relation between stereotyping and mindreading lies in the fact that stereotypes centrally involve character-trait attributions, which play a systematic role in the action–prediction hierarchy. On this view, when we apply a stereotype to an individual, we rapidly attribute to her a cluster of generic character traits on the basis of her perceived social group membership. These traits are then used to make inferences about that individual’s likely beliefs and desires, which in turn inform inferences about her behavior.
A platitude that took hold with Kuhn is that there can be several equally good ways of balancing theoretical virtues for theory choice. Okasha recently modelled theory choice using technical apparatus from the domain of social choice: famously, Arrow showed that no method of social choice can jointly satisfy four desiderata, and each of the desiderata in social choice has an analogue in theory choice. Okasha suggested that one can avoid the Arrow analogue for theory choice by employing a strategy used by Sen in social choice, namely, to enhance the information made available to the choice algorithms. I argue here that, despite Okasha’s claims to the contrary, the information-enhancing strategy is not compelling in the domain of theory choice.
Ranking theory is a formal epistemology that has been developed over 600 pages in Spohn's recent book The Laws of Belief, which aims to provide a normative account of the dynamics of belief that presents an alternative to current probabilistic approaches. It has long been well received in the AI community, but it has not yet found application in experimental psychology. The purpose of this paper is to derive clear, quantitative predictions by exploiting a parallel between ranking theory and a statistical model called logistic regression. This approach is illustrated by the development of a model for the conditional inference task using Spohn's ranking-theoretic approach to conditionals.
If the world itself is metaphysically indeterminate in a specified respect, what follows? In this paper, we develop a theory of metaphysical indeterminacy answering this question.
Decision theory has at its core a set of mathematical theorems that connect rational preferences to functions with certain structural properties. The components of these theorems, as well as their bearing on questions surrounding rationality, can be interpreted in a variety of ways. Philosophy’s current interest in decision theory represents a convergence of two very different lines of thought, one concerned with the question of how one ought to act, and the other concerned with the question of what action consists in and what it reveals about the actor’s mental states. As a result, the theory has come to have two different uses in philosophy, which we might call the normative use and the interpretive use. It also has a related use that is largely within the domain of psychology, the descriptive use. This essay examines the historical development of decision theory and its uses; the relationship between the norm of decision theory and the notion of rationality; and the interdependence of the uses of decision theory.
The topic of a priori knowledge is approached through the theory of evidence. A shortcoming in traditional formulations of moderate rationalism and moderate empiricism is that they fail to explain why rational intuition and phenomenal experience count as basic sources of evidence. This explanatory gap is filled by modal reliabilism -- the theory that there is a qualified modal tie between basic sources of evidence and the truth. This tie to the truth is then explained by the theory of concept possession: this tie is a consequence of what, by definition, it is to possess (i.e., to understand) one’s concepts. A corollary of the overall account is that the a priori disciplines (logic, mathematics, philosophy) can be largely autonomous from the empirical sciences.
The Austrian philosopher Christian von Ehrenfels published his essay "On 'Gestalt Qualities'" in 1890. The essay initiated a current of thought which enjoyed a powerful position in the philosophy and psychology of the first half of this century and has more recently enjoyed a minor resurgence of interest in the area of cognitive science, above all in criticisms of the so-called 'strong programme' in artificial intelligence. The theory of Gestalt is of course associated most specifically with psychologists of the Berlin school such as Max Wertheimer, Wolfgang Köhler and Kurt Koffka. We shall see in what follows, however, that an adequate philosophical understanding of the Gestalt idea and of Ehrenfels' achievement will require a close examination not merely of the work of the Berlin school but also of a much wider tradition in Austrian and German philosophy in general.
The sensorimotor theory of perceptual consciousness offers a form of enactivism in that it stresses patterns of interaction instead of any alleged internal representations of the environment. But how does it relate to forms of enactivism stressing the continuity between life and mind? We shall distinguish sensorimotor enactivism, which stresses perceptual capacities themselves, from autopoietic enactivism, which claims an essential connection between experience and autopoietic processes or associated background capacities. We show how autopoiesis, autonomous agency, and affective dimensions of experience may fit into sensorimotor enactivism, and we identify differences between this interpretation and autopoietic enactivism. By taking artificial consciousness as a case in point, we further sharpen the distinction between sensorimotor enactivism and autopoietic enactivism. We argue that sensorimotor enactivism forms a strong default position for an enactive account of perceptual consciousness.
This paper provides an introductory review of the theory of judgment aggregation. It introduces the paradoxes of majority voting that originally motivated the field, explains several key results on the impossibility of propositionwise judgment aggregation, presents a pedagogical proof of one of those results, discusses escape routes from the impossibility, and relates judgment aggregation to some other salient aggregation problems, such as preference aggregation, abstract aggregation and probability aggregation. The present review, illustrative rather than exhaustive, is intended to give readers new to the field of judgment aggregation a sense of this rapidly growing research area.
Orthodox decision theory gives no advice to agents who hold two goods to be incommensurate in value because such agents will have incomplete preferences. According to standard treatments, rationality requires complete preferences, so such agents are irrational. Experience shows, however, that incomplete preferences are ubiquitous in ordinary life. In this paper, we aim to do two things: (1) show that there is a good case for revising decision theory so as to allow it to apply non-vacuously to agents with incomplete preferences, and (2) identify one substantive criterion that any such non-standard decision theory must obey. Our criterion, Competitiveness, is a weaker version of a dominance principle. Despite its modesty, Competitiveness is incompatible with prospectism, a recently developed decision theory for agents with incomplete preferences. We spend the final part of the paper showing why Competitiveness should be retained, and prospectism rejected.
The so-called theory of karma is one of the distinguishing aspects of Hinduism and other non-Hindu south-Asian traditions. While the theory can be seen as closely connected with the freedom of will and action that we humans supposedly have, it has many times been said to be determinist and fatalist. The purpose of this paper is to analyze in some depth the relations between the theory of karma, on the one hand, and determinism, fatalism and free will, on the other. In order to do that, I shall use what has been described as the best formal approach we have to indeterminism: branching time theory. More specifically, I shall introduce a branching time semantic framework in which, among other things, statements such as “state of affairs e is a karmic effect of agent a”, “a wills it to be the case that e” and “e is inevitable” can be properly represented.
Sensorimotor Theory (SMT) is the claim that it is our practical know-how of the relations between our environments and us that gives our environmental interactions their experiential qualities. Yet why should such interactions involve or be accompanied by experience? This is the ‘absolute’ gap question. Some proponents of SMT answer this question by arguing that our interactions with an environment involve experience when we cognitively access those interactions. In this paper, I aim to persuade proponents of SMT to accept the following three claims. First, that appeals to cognitive access fail to answer the absolute gap question. Second, that SMT can be read in a way that rejects the gap question. Third, that if proponents of SMT are prepared to read SMT in a way that rejects the absolute gap question, then they can also reject the claim that cognitive access is needed to explain experience.
Generating an account that can sidestep the disagreement among substantive theories of well-being, while at the same time still providing useful guidance for well-being public policy, would be a significant achievement. Unfortunately, the various attempts to remain agnostic regarding what constitutes well-being fail to either be an account of well-being, provide useful guidance for well-being policy, or avoid relying on a substantive well-being theory. There are no theory-free lunches in well-being policy. Instead, I propose an intermediate account, according to which well-being is constituted by endorsed veridical experiences. This account refers back to theories of well-being but does so as agnostically as possible. An intermediate account of well-being is meant as a policy guiding compromise between the different theories of well-being that make claims regarding what constitutes well-being. An intermediate account does as well as can be hoped for in providing a basis for well-being policy.
The thesis of theory-ladenness of observations, in its various guises, is widely considered as either ill-conceived or harmless to the rationality of science. The latter view rests partly on the work of the proponents of New Experimentalism who have argued, among other things, that experimental practices are efficient in guarding against any epistemological threat posed by theory-ladenness. In this paper I show that one can generate a thesis of theory-ladenness for experimental practices from an influential New Experimentalist account. The notion I introduce for this purpose is the concept of ‘theory-driven data reliability judgments’ (TDRs), according to which theories which are sought to be tested with a particular set of data guide reliability judgments about those very same data. I provide various prominent historical examples to show that TDRs are used by scientists to resolve data conflicts. I argue that the rationality of the practices which employ TDRs can be saved if the independent support of the theories driving TDRs is construed in a particular way.
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, and nor can it serve usefully in the development and justification of the new theory.
This essay makes the case for, in the phrase of Angelika Kratzer, packing the fruits of the study of rational decision-making into our semantics for deontic modals—specifically, for parametrizing the truth-condition of a deontic modal to things like decision problems and decision theories. Then it knocks it down. While the fundamental relation of the semantic theory must relate deontic modals to things like decision problems and theories, this semantic relation cannot be intelligibly understood as representing the conditions under which a deontic modal is true. Rather it represents the conditions under which it is accepted by a semantically competent agent. This in turn motivates a reorientation of the whole of semantic theorizing, away from the truth-conditional paradigm, toward a form of Expressivism.
This essay is divided into two parts. In the first part (§2), I introduce the idea of practical meaning by looking at a certain kind of procedural system — the motor system — that plays a central role in computational explanations of motor behavior. I argue that in order to give a satisfactory account of the content of the representations computed by motor systems (motor commands), we need to appeal to a distinctively practical kind of meaning. Defending the explanatory relevance of semantic properties in a computationalist explanation of motor behavior, my argument concludes that practical meanings play a central role in an adequate psychological theory of motor skill. In the second part of this essay (§3), I generalize and clarify the notion of practical meaning, and I defend the intelligibility of practical meanings against an important objection.
Ultimately this book provides a theory of intergenerational justice that is both intellectually robust and practical, with wide applicability to law and policy.
This article provides a conceptual map of the debate on ideal and non‐ideal theory. It argues that this debate encompasses a number of different questions, which have not been kept sufficiently separate in the literature. In particular, the article distinguishes between the following three interpretations of the ‘ideal vs. non‐ideal theory’ contrast: full compliance vs. partial compliance theory; utopian vs. realistic theory; end‐state vs. transitional theory. The article advances critical reflections on each of these sub‐debates, and highlights areas for future research in the field.
In this essay I outline a radical kind of virtue theory I call exemplarism, which is foundational in structure but which is grounded in exemplars of moral goodness, direct reference to which anchors all the moral concepts in the theory. I compare several different kinds of moral theory by the way they relate the concepts of the good, a right act, and a virtue. In the theory I propose, these concepts, along with the concepts of a duty and of a good life, are defined by reference to exemplars, identified directly through the emotion of admiration, not through a description. It is an advantage of the theory that what makes a good person good is not given a priori but is determined by empirical investigation. The same point applies to what good persons do and what states of affairs they aim at. The theory gives an important place to empirical investigation and narratives about exemplars analogous to the scientific investigation of natural kinds in the theory of direct reference.
This paper provides a critical overview of the realist current in contemporary political philosophy. We define political realism on the basis of its attempt to give varying degrees of autonomy to politics as a sphere of human activity, in large part through its exploration of the sources of normativity appropriate for the political, and so distinguish sharply between political realism and non-ideal theory. We then identify and discuss four key arguments advanced by political realists: from ideology, from the relationship of ethics to politics, from the priority of legitimacy over justice and from the nature of political judgement. Next, we ask to what extent realism is a methodological approach as opposed to a substantive political position and so discuss the relationship between realism and a few such positions. We close by pointing out the links between contemporary realism and the realist strand that runs through much of the history of Western political thought.
Eternalism, the view that what we regard locally as being located in the past, the present and the future equally exists, is the best ontological account of temporal existence in line with special and general relativity. However, special and general relativity are not fundamental theories and several research programs aim at finding a more fundamental theory of quantum gravity weaving together all we know from relativistic physics and quantum physics. Interestingly, some of these approaches assert that time is not fundamental. If time is not fundamental, what does it entail for eternalism and the standard debate over existence in time? First, I will argue that the non-fundamentality of time to be found in string theory entails standard eternalism. Second, I will argue that the non-fundamentality of time to be found in loop quantum gravity entails atemporal eternalism, namely a novel position in the spirit of standard eternalism.
The theory of the organism-environment system starts with the proposition that in any functional sense organism and environment are inseparable and form only one unitary system. The organism cannot exist without the environment and the environment has descriptive properties only if it is connected to the organism. Although for practical purposes we do separate organism and environment, this common-sense starting point leads in psychological theory to problems which cannot be solved. Therefore, separation of organism and environment cannot be the basis of any scientific explanation of human behavior. The theory leads to a reinterpretation of basic problems in many fields of inquiry and makes possible the definition of mental phenomena without their reduction either to neural or biological activity or to separate mental functions. According to the theory, mental activity is activity of the whole organism-environment system, and the traditional psychological concepts describe only different aspects of organisation of this system. Therefore, mental activity cannot be separated from the nervous system, but the nervous system is only one part of the organism-environment system. This problem will be dealt with in detail in the second part of the article.
This article points out the criteria necessary for a qualitative scientific method to qualify as phenomenological in a descriptive Husserlian sense. One would have to employ description within the attitude of the phenomenological reduction, and seek the most invariant meanings for a context. The results of this analysis are used to critique an article by Klein and Westcott, which presents a typology of the development of the phenomenological psychological method.
Is it possible to get by with just one ontological category? We evaluate L.A. Paul's attempt to do so: the mereological bundle theory. The upshot is that Paul's attempt to construct a one-category ontology may be challenged with some of her own arguments. In the positive part of the paper we outline a two-category ontology with property universals and kind universals. We will also examine Paul's arguments against a version of universal bundle theory that takes spatiotemporal co-location instead of compresence or coinstantiation as the feature by which we can identify genuine bundles. We compare this novel theory, bundle theory with kinds, and Paul's mereological bundle theory, and apply them to a case study concerning entangled fermions and co-located bosons.
Knowledge-making practices in biology are being strongly affected by the availability of data on an unprecedented scale, the insistence on systemic approaches and growing reliance on bioinformatics and digital infrastructures. What role does theory play within data-intensive science, and what does that tell us about scientific theories in general? To answer these questions, I focus on Open Biomedical Ontologies, digital classification tools that have become crucial to sharing results across research contexts in the biological and biomedical sciences, and argue that they constitute an example of classificatory theory. This form of theorizing emerges from classification practices in conjunction with experimental know-how and expresses the knowledge underpinning the analysis and interpretation of data disseminated online.
The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
This paper explores whether it is possible to reformulate or re-interpret Lewis’s theory of fundamental laws of nature—his “best system analysis”—in such a way that it becomes a useful theory for special science laws. One major step in this enterprise is to make plausible how law candidates within best system competitions can tolerate exceptions—this is crucial because we expect special science laws to be so-called “ceteris paribus laws”. I attempt to show how this is possible and also how we can thereby make the first step towards a solution for the infamous difficulties surrounding the troublesome ceteris paribus clause. The paper outlines the general ideas of the theory but also points out some of its difficulties and background assumptions.
My aim in this article is to contribute to the larger project of assessing the relative merits of different theories of substance. An important preliminary step in this project is assessing the explanatory resources of one main theory of substance, the so-called bundle theory. This article works towards such an assessment. I identify and explain three distinct explanatory challenges an adequate bundle theory must meet. Each points to a putative explanatory gap, so I call them the Gap Challenges. I consider three bundle-theoretic strategies for meeting these challenges. I argue that none of them goes very far. The upshot is that, absent other strategies for meeting the challenges, bundle theory involves a significant amount of stipulation. This black box makes bundle theory relatively weak with respect to its explanatory power—unless, of course, rival theories of substance are unable to do better.
In this paper I defend what I call the argument from epistemic reasons against the moral error theory. I argue that the moral error theory entails that there are no epistemic reasons for belief, and that this is bad news for the moral error theory since, if there are no epistemic reasons for belief, no one knows anything. If no one knows anything, then no one knows that there is thought when they are thinking, and no one knows that they do not know everything. And it could not be the case that we do not know that there is thought when we believe that there is thought, and that we do not know that we do not know everything. I address several objections to the claim that the moral error theory entails that there are no epistemic reasons for belief. It might seem that arguing against the error theory on the grounds that it entails that no one knows anything is just providing a Moorean argument against the moral error theory. I show that even if my argument against the error theory is indeed a Moorean one, it avoids Streumer's, McPherson's and Olson's objections to previous Moorean arguments against the error theory, and is a more powerful argument against the error theory than Moore's argument against external world skepticism is against external world skepticism.
This paper offers a general model of substantive moral principles as a kind of hedged moral principles that can (but don't have to) tolerate exceptions. I argue that the kind of principles I defend provide an account of what would make an exception to them permissible. I also argue that these principles are nonetheless robustly explanatory with respect to a variety of moral facts; that they make sense of error, uncertainty, and disagreement concerning moral principles and their implications; and that one can grasp these principles without having to grasp any particular list of their permissibly exceptional instances. I conclude by pointing out various advantages that this model of principles has over several of its rivals. The bottom line is that we should find nothing peculiarly odd or problematic about the idea of exception-tolerating and yet robustly explanatory moral principles.
I respond to an argument presented by Daniel Povinelli and Jennifer Vonk that the current generation of experiments on chimpanzee theory of mind cannot decide whether chimpanzees have the ability to reason about mental states. I argue that Povinelli and Vonk's proposed experiment is subject to their own criticisms and that there should be a more radical shift away from experiments that ask subjects to predict behavior. Further, I argue that Povinelli and Vonk's theoretical commitments should lead them to accept this new approach, and that experiments which offer subjects the opportunity to look for explanations for anomalous behavior should be explored.
Lipsey and Lancaster's "general theory of second best" is widely thought to have significant implications for applied theorizing about the institutions and policies that most effectively implement abstract normative principles. It is also widely thought to have little significance for theorizing about which abstract normative principles we ought to implement. Contrary to this conventional wisdom, I show how the second-best theorem can be extended to myriad domains beyond applied normative theorizing, and in particular to more abstract theorizing about the normative principles we should aim to implement. I start by separating the mathematical model used to prove the second-best theorem from its familiar economic interpretation. I then develop an alternative normative-theoretic interpretation of the model, which yields a novel second-best theorem for idealistic normative theory. My method for developing this interpretation provides a template for developing additional interpretations that can extend the reach of the second-best theorem beyond normative theoretical domains. I also show how, within any domain, the implications of the second-best theorem are more specific than is typically thought. I conclude with some brief remarks on the value of mathematical models for conceptual exploration.
We have a variety of different ways of dividing up, classifying, mapping, sorting and listing the objects in reality. The theory of granular partitions presented here seeks to provide a general and unified basis for understanding such phenomena in formal terms that is more realistic than existing alternatives. Our theory has two orthogonal parts: the first is a theory of classification; it provides an account of partitions as cells and subcells; the second is a theory of reference or intentionality; it provides an account of how cells and subcells relate to objects in reality. We define a notion of well-formedness for partitions, and we give an account of what it means for a partition to project onto objects in reality. We continue by classifying partitions along three axes: (a) in terms of the degree of correspondence between partition cells and objects in reality; (b) in terms of the degree to which a partition represents the mereological structure of the domain it is projected onto; and (c) in terms of the degree of completeness with which a partition represents this domain.
Many realists argue that present scientific theories will not follow the fate of past scientific theories because the former are more successful than the latter. Critics object that realists need to show that present theories have reached the level of success that warrants their truth. I reply that the special theory of relativity has been repeatedly reinforced by unconceived scientific methods, so it will be reinforced by infinitely many unconceived scientific methods. This argument for the special theory of relativity overcomes the critics’ objection, and has advantages over the no-miracle argument and the selective induction for it.
For each positive n, two alternative axiomatizations of the theory of strings over n alphabetic characters are presented. One class of axiomatizations derives from Tarski's system of the Wahrheitsbegriff and uses the n characters and concatenation as primitives. The other class involves using n character-prefixing operators as primitives and derives from Hermes' Semiotik. All underlying logics are second order. It is shown that, for each n, the two theories are definitionally equivalent [or synonymous in the sense of deBouvere]. It is further shown that each member of one class is synonymous with each member of the other class; thus all of the theories are definitionally equivalent with each other and with Peano arithmetic. Categoricity of Peano arithmetic then implies categoricity of each of the above theories.
This is a penultimate draft of a paper that will appear in Handbook of Imagination, Amy Kind (ed.). Routledge Press. Please cite only the final printed version.
I propose a new theory of semantic presupposition, which I call dissatisfaction theory. I first briefly review a cluster of problems, known collectively as the proviso problem, for most extant theories of presupposition, arguing that the main pragmatic response to them faces a serious challenge. I avoid these problems by adopting two changes in perspective on presupposition. First, I propose a theory of projection according to which presuppositions project unless they are locally entailed. Second, I reject the standard assumption that presuppositions are contents which must be entailed by the input context; instead, I propose that presuppositions are contents which are marked as backgrounded. I show that, together, these commitments allow us to avoid the proviso problem altogether, and generally make plausible predictions about presupposition projection out of connectives and attitude predicates. I close by sketching a two-dimensional implementation of my theory which allows us to make further welcome predictions about attitude predicates and quantifiers.