We care not only about what experiences we have, but also about when we have them. However, on the B-theory of time, something’s timing isn’t an intrinsic way for that thing to be or become. Given B-theory, should we be rationally indifferent about the timing per se of an experience? In this paper, I argue that B-theorists can justify time-biased preferences for pains to be past rather than present and for pleasures to be present rather than past. In support of this argument, I appeal to the doctrine of temporal parts, or “four-dimensionalism” for short. When held in conjunction with a certain evaluative principle about whose experiences matter, four-dimensionalism reconciles B-theory with some time-biased preferences.
The purpose of this paper is to explore the connection between change and the B-theory of time, sometimes also called the Scientific view of time, according to which reality is a four-dimensional spacetime manifold, where past, present and future things equally exist, and the present time and non-present times are metaphysically the same. I argue in favour of a novel response to the much-vexed question of whether there is change on the B-theory or not. In fact, B-theorists are often said to hold a ‘static’ view of time. But this is far from being an innocent label: if the B-theory of time presents a model of temporal reality that is static, then there is no change on the B-theory. From this, one can reasonably think as follows: of course there is change, so the B-theory must be false. What I plan to do in this paper is to argue that in some sense there is change on the B-theory, but in some other sense there is no change on the B-theory. To do so, I present three instances of change: Existential Change, namely the view that things change with respect to their existence over time; Qualitative Change, the view that things change with respect to how they are over time; and Propositional Change, namely the view that things (i.e. propositions) change with respect to truth value over time. I argue that while there is a reading of these three instances of change that is true on the B-theory, and so there is change on the B-theory in this sense, there is another reading of each of them that is not true on the B-theory, and therefore there is no change on the B-theory in this other sense.
Elsewhere I have suggested that the B-theory includes a notion of passage, by virtue of including succession. Here, I provide further support for that claim by showing that uncontroversial elements of the B-theory straightforwardly ground a veridical sense of passage. First, I argue that the B-theory predicts that subjects of experience have a sense of passivity with respect to time that they do not have with respect to space, which they are right to have, even according to the B-theory. I then ask what else might be involved in our experience of time as passing that is not yet vindicated by the B-theoretic conception. I examine a recent B-theoretic explanation of our ‘illusory’ sense of passage, by Robin Le Poidevin, and argue that it explains away too much: our perception of succession poses no more of a problem on the B-theory than it does on other theories of time. Finally, I respond to an objection by Oreste Fiocco that a causal account of our sense of passage cannot succeed, because it leaves out the ‘phenomenological novelty’ of each moment.
In this paper I consider two strategies for providing tenseless truth-conditions for tensed sentences: the token-reflexive theory and the date theory. Both theories have faced a number of objections by prominent A-theorists such as Quentin Smith and William Lane Craig. Traditionally, these two theories have been viewed as rival methods for providing truth-conditions for tensed sentences. I argue that the debate over whether the token-reflexive theory or the date theory is true has arisen from a failure to distinguish between conditions for the truth of tensed tokens and conditions for the truth of propositions expressed by tensed tokens. I demonstrate that there is a true formulation of the token-reflexive theory that provides necessary and sufficient conditions for the truth of tensed tokens, and there is a true formulation of the date theory that provides necessary and sufficient conditions for the truth of propositions expressed by tensed tokens. I argue that once the views are properly formulated, the A-theorist’s objections fail to make their mark. However, I conclude by claiming that even though there is a true formulation of the token-reflexive theory and a true formulation of the date theory, the New B-theory nonetheless fails to provide a complete account of the truth and falsity of tensed sentences.
A uniform theory of conditionals is one which compositionally captures the behavior of both indicative and subjunctive conditionals without positing ambiguities. This paper raises new problems for the closest thing to a uniform analysis in the literature (Stalnaker, Philosophia, 5, 269–286, 1975) and develops a new theory which solves them. I also show that this new analysis provides an improved treatment of three phenomena (the import-export equivalence, reverse Sobel-sequences and disjunctive antecedents). While these results concern central issues in the study of conditionals, broader themes in the philosophy of language and formal semantics are also engaged here. This new analysis exploits a dynamic conception of meaning where the meaning of a symbol is its potential to change an agent’s mental state (or the state of a conversation) rather than being the symbol’s content (e.g. the proposition it expresses). The analysis of conditionals is also built on the idea that the contrast between subjunctive and indicative conditionals parallels a contrast between revising and consistently extending some body of information.
French translation by G. B. Côté and Roger Lapalme of "A Geneticist's Roadmap to Sanity" (G. B. Côté, 2019), with added bibliography.

Looking at the world today, one might think that we have lost our reason. Here I want to explore the very foundations of our existence. I briefly discuss free will, ethics, religion, suffering, Cartesian dualism and the state of consciousness, against a background stressing the importance of present-day quantum physics and of the timeless. To do so, I must first establish that mathematical Platonism is an essential premise for a universe (or even a multiverse) to take form, and I introduce the three modes of existence, abstract, virtual and concrete (in philosophy), corresponding respectively to the concepts of information, energy and mass (in physics). This article is a brief exposition of my theory of everything.
This essay presents the normative foundation of W.E.B. Du Bois’s constructivist theory of justice in three steps. First, I show that for Du Bois the public sphere in Anglo-European modern states consists of a dialectical interplay between reasonable persons and illiberal rogues. Second, under these nonideal circumstances, the ideal of autonomy grounds reasonable persons’ deliberative openness, an attitude of public moral regard for others which is necessary for constructing the terms of political rule. Though deliberative openness is the essential vehicle of construction, reasonable persons only have a pragmatic political obligation to forge ties of deliberative reciprocity with likeminded persons who they trust will listen and not harm them. Finally, I present Du Bois’s defense of black suffragists’ support of the 19th Amendment to illustrate pragmatic political obligation in action. I sketch successful democratic engagement that reconstitutes a nonideal public sphere.
Recent work in social psychology suggests that people harbor “implicit race biases,” biases which can be unconscious or uncontrollable. Because awareness and control have traditionally been deemed necessary for the ascription of moral responsibility, implicit biases present a unique challenge: do we pardon discrimination based on implicit biases because of its unintentional nature, or do we punish discrimination regardless of how it comes about? The present experiments investigated the impact that different theories of implicit bias have upon moral judgments about racial discrimination. The results show that these theories differ in their impact on moral judgments: when implicit biases are defined as unconscious, people hold the biased agent less morally responsible than when these biases are defined as automatic (i.e., difficult to control), or when no theory of implicit bias is provided.
These days the two most popular approaches to belief ascription are Millianism and Contextualism. The former approach is inconsistent with the existence of ordinary Frege cases, such as Lois believing that Superman flies while failing to believe that Clark Kent flies. The Millian holds that the only truth-conditionally relevant aspect of a proper name is its referent or extension. Contextualism, as I will define it for the purposes of this essay, includes all theories according to which ascriptions of the form ‘S believes that a is F’ and ‘S believes that b is F’, where ‘a’ and ‘b’ are coreferential proper names, may, depending on the context, differ in truth-value even though in those very contexts each ascription relates the same believer to the very same proposition. What the two theories have in common is the claim that names are Millian. What separates the two theories is what they say about belief contexts. In this essay I prove that Millianism is true, Contextualism is true, or our intuitions regarding belief ascriptions are hopelessly inaccurate. As a consequence, my argument is a proof that either names and many general terms are Millian or our intuitions regarding belief ascriptions are hopelessly inaccurate.
“The universe is expanding, not contracting.” Many statements of this form appear unambiguously true; after all, the discovery of the universe’s expansion is one of the great triumphs of empirical science. However, the statement is time-directed: the universe expands towards what we call the future; it contracts towards the past. If we deny that time has a direction, should we also deny that the universe is really expanding? This article draws together and discusses what I call ‘C-theories’ of time (in short, philosophical positions that hold that time lacks a direction) from different areas of the literature. I set out the various motivations, aims, and problems for C-theories, and outline different versions of antirealism about the direction of time.
In her re-analysis of the evidence presented in Klein and Nichols (2012) to support their argument that patient R.B. temporarily lost possessory custody of consciously apprehended objects (in this case, objects that normally would be non-inferentially taken as episodic memory), Professor Roache concludes that Klein and Nichols's claims are untenable. I argue that Professor Roache is incorrect in her re-interpretation, and that this is due, in part, to a lack of sufficient familiarity with psychological theory on memory as well as with the clinical literature on felt loss of ownership of one’s intentional objects.
This paper proposes a view of time that takes passage to be the most basic temporal notion, instead of the usual A-theoretic and B-theoretic notions, and explores how we should think of a world that exhibits such genuine temporal passage. It will be argued that an objective passage of time can only be made sense of from an atemporal point of view, and only when it is able to constitute a genuine change of objects across time. This requires that passage can flip one fact into a contrary fact, even though neither side of the temporal passage is privileged over the other. We can make sense of this if the world is inherently perspectival. Such an inherently perspectival world is characterized by fragmentalism, a view that has been introduced by Fine in his ‘Tense and Reality’ (2005). Unlike Fine's tense-theoretic fragmentalism, though, the proposed view will be a fragmentalist view based in a primitive notion of passage.
In this paper I revisit a dispute between Mikel Burley and Robin Le Poidevin about whether or not the B-theory of time can give its adherents any reason to be less afraid of death. In ‘Should a B-theoretic atheist fear death?’, Burley argues that even on Le Poidevin’s understanding of the B-theory, atheists shouldn’t be comforted. His reason is that the prevalent B-theoretic account of our attitudes towards the past and future precludes treating our fear of death as unwarranted. I examine his argument and provide a tentative defense of Le Poidevin. I claim that while Burley rightly spots a tension with a non-revisionary approach to our ordinary emotional life, he doesn’t isolate the source of that tension. The real question is how to understand Le Poidevin’s idea that on the B-theory, we and our lives are ‘eternally real’. I then suggest that there is a view of time that does justice to Le Poidevin’s remarks, albeit a strange one. The view takes temporal relations to be quasi-spatial and temporal entities to exist in a totum simul.
This article reconstructs and defends Theodor Adorno’s social theory by motivating the central role of abstract domination within it. Whereas critics such as Axel Honneth have charged Adorno with adhering to a reductive model of personal domination, I argue that Adorno instead understands domination as a structural and de-individualized feature of capitalist society. If Adorno’s social theory is to be explanatory, however, it must account for the source of the abstractions that dominate modern individuals and, in particular, that of value. While such an account remains undeveloped in Adorno, Marx provides resources for its development, in positing the constitution of value neither in production nor in exchange alone, but in the social totality. This article argues that Marx’s account is compatible with Adorno’s, and that it may be used to render Adorno’s theory of domination more credible on explanatory grounds.
To make sense of what Gilles Deleuze understands by a mathematical concept requires unpacking what he considers to be the conceptualizable character of a mathematical theory. For Deleuze, the mathematical problems to which theories are solutions retain their relevance to the theories not only as the conditions that govern their development, but also insofar as they can contribute to determining the conceptualizable character of those theories. Deleuze presents two examples of mathematical problems that operate in this way, which he considers to be characteristic of a more general theory of mathematical problems. By providing an account of the historical development of this more general theory, which he traces drawing upon the work of Weierstrass, Poincaré, Riemann, and Weyl, and of its significance to the work of Deleuze, an account of what a mathematical concept is for Deleuze will be developed.
This article is a response to Clifford Williams’s claim that the debate between A- and B-theories of time is misconceived because these theories do not differ. I provide some missing support for Williams’s claim that the B-theory includes transition, by arguing that representative B-theoretic explanations for why we experience time as passing (even though it does not) are inherently unstable. I then argue that, contra Williams, it does not follow that there is nothing at stake in the A versus B debate.
What is it to be a woman? What is it to be a man? We start by laying out desiderata for an analysis of 'woman' and 'man': descriptively, it should link these gender categories to sex biology without reducing them to sex biology, and politically, it should help us explain and combat traditional sexism while also allowing us to make sense of the activist view that gendering should be consensual. Using a Putnam-style 'Twin Earth' example, we argue that none of the existing analyses in the feminist literature succeeds in meeting all of our desiderata. Finally, we propose a positive account that we believe can satisfy all the desiderata outlined. According to our theory, the genders 'woman' and 'man' are individuated not by their contemporary connections to sex biology, but by their historical continuity with classes that were originally closely connected to sex biology.
The label 'semiotic grammar' captures a fundamental property of the grammars of human languages: not only is language a semiotic system in the familiar Saussurean sense, but its organizing system, its grammar, is also a semiotic system. This proposition, explicated in detail by William McGregor in this book, constitutes a new theory of grammar. Semiotic Grammar is 'functional' rather than 'formal' in its intellectual origins, approaches, and methods. It demonstrates, however, that neither a purely functional nor a purely formal account of language is adequate, given the centrality of the sign as the fundamental unit of grammatical analysis. The author distinguishes four types of grammatical signs: experiential, logical, interpersonal, and textural. The signifiers of these signs are syntagmatic relationships of the following types, respectively: constituency, dependency, conjugational, and linking. McGregor illustrates and exemplifies the theory with data from a variety of languages including English, Acehnese, Polish, Finnish, Japanese, Chinese, and Mohawk; and from his pioneering research on Gooniyandi and Nyulnyul, two languages of the Kimberleys region of Western Australia.
The theme of the conflict between the different interpretations of Spinoza’s philosophy in French scholarship, introduced by Christopher Norris in this volume and expanded on by Alain Badiou, is also central to the argument presented in this chapter. Indeed, this chapter will be preoccupied with distinguishing the interpretations of Spinoza by two of the figures introduced by Badiou. The interpretation of Spinoza offered by Gilles Deleuze in Expressionism in Philosophy provides an account of the dynamic changes or transformations of the characteristic relations of a Spinozist finite existing mode, or human being. This account has been criticized more or less explicitly by a number of commentators, including Charles Ramond. Rather than providing a defence of Deleuze on this specific point, which I have done elsewhere, what I propose to do in this chapter is provide an account of the role played by “joyful passive affections” in these dynamic changes or transformations by distinguishing Deleuze’s account of this role from that offered by one of his more explicit critics on this issue, Pierre Macherey. An appreciation of the role played by “joyful passive affections” in this context is crucial to understanding how Deleuze’s interpretation of Spinoza is implicated in his broader philosophical project of constructing a philosophy of difference. The outcome is a position that, like Badiou in the previous chapter, rules out “intellect in potentiality” but maintains a role for the joyful passive affects in the development of adequate ideas.
The current resurgence of interest in cognition and in the nature of cognitive processing has brought with it also a renewed interest in the early work of Husserl, which contains one of the most sustained attempts to come to grips with the problems of logic from a cognitive point of view. Logic, for Husserl, is a theory of science; but it is a theory which takes seriously the idea that scientific theories are constituted by the mental acts of cognitive subjects. The present essay begins with an exposition of Husserl's act-based conception of what a science is, and goes on to consider his account of the role of linguistic meanings, of the ontology of scientific objects, and of evidence and truth. The essay concentrates almost exclusively on the Logical Investigations of 1900/01. This is not only because this work, which is surely Husserl's single most important masterpiece, has been overshadowed first of all by his Ideas I and then later by the Crisis. It is also because the Investigations contain, in a peculiarly clear and pregnant form, a whole panoply of ideas on logic and cognitive theory which either simply disappeared in Husserl's own later writings or became obfuscated by an admixture of that great mystery which is 'transcendental phenomenology'.
Does the recent success of Podemos and Syriza herald a new era of inclusive, egalitarian left populism? Because leaders of both parties are former students of Ernesto Laclau and cite his account of populism as guiding their political practice, this essay considers whether his theory supports hope for a new kind of populism. For Laclau, the essence of populism is an “empty signifier” that provides a means by which anyone can identify with the people as a whole. However, the concept of the empty signifier is not as neutral as he assumes. As I show by analyzing the role of race in his theory, some subjects are constituted in a way that prevents their unmediated identification with the people. Consequently, Laclau’s view should be read as symptomatic of the problems with populist logic if its adherents are to avoid reproducing its exclusions and practice a more inclusive politics.
I offer a new approach to the increasingly convoluted debate between the A- and B-theories of time, the ‘tensed’ and ‘tenseless’ theories. It is often assumed that the B-theory faces more difficulties than the A-theory in explaining the apparently tensed features of temporal experience. I argue that the A-theory cannot explain these features at all, because on any physicalist or supervenience theory of the mind, in which the nature of experience is fixed by the physical state of the world, the tensed properties of time posited by the A-theory could play no role in shaping temporal experience. It follows that the A-theory is false; even a priori arguments for it fail, because they still require the tensed vocabulary which is used to describe temporal experience.
According to Heidegger's Being and Time, social relations are constitutive of the core features of human agency. On this view, which I call a ‘strong conception’ of sociality, the core features of human agency cannot obtain in an individual subject independently of social relations to others. I explain the strong conception of sociality captured by Heidegger's underdeveloped notion of ‘being-with’ by reconstructing Heidegger's critique of the ‘weak conception’ of sociality characteristic of Kant's theory of agency. According to a weak conception, sociality is a mere aggregation of individual subjects and the core features of human agency are built into each individual mind. The weak conception of sociality remains widely taken for granted today. I show that Christine Korsgaard, one of the most creative contemporary appropriators of Kant, operates with a weak conception of sociality and that this produces a problematic explanatory deficiency in her view: she is unable to explain the peculiar motivational efficacy of shared social norms. Heidegger's view is tailor-made to explain this phenomenon. I end by sketching how Heidegger provides a social explanation of a major systematic concern animating Korsgaard, the concern with the importance of individual autonomy and answerability in human life.
This paper assesses branching spacetime theories in light of metaphysical considerations concerning time. I present the A, B, and C series in terms of the temporal structure they impose on sets of events, and raise problems for two elements of extant branching spacetime theories (McCall’s ‘branch attrition’ and the ‘no backward branching’ feature of Belnap’s ‘branching space-time’) in terms of their respective A- and B-theoretic nature. I argue that McCall’s presentation of branch attrition can only be coherently formulated on a model with at least two temporal dimensions, and that this results in severing the link between branch attrition and the flow of time. I argue that ‘no backward branching’ prohibits Belnap’s theory from capturing the modal content of indeterministic physical theories, and results in it ascribing to the world a time-asymmetric modal structure that lacks physical justification.
We presuppose a position of scientific realism to the effect (i) that the world exists and (ii) that through the working out of ever more sophisticated theories our scientific picture of reality will approximate ever more closely to the world as it really is. Against this background consider, now, the following question: Do the empirical theories with the help of which we seek to approximate a good or true picture of reality rest on any non-empirical presuppositions? One can answer this question with either a 'yes' or a 'no'. 'No' is the preferred answer of most contemporary methodologists (Murray Rothbard is one distinguished counterexample to this trend), who maintain that empirical theories are completely free of non-empirical ('a priori') admixtures and who see science as a matter of the gathering of pure 'data' obtained through simple observation. From such data scientific propositions are then supposed to be somehow capable of being established.
This chapter discusses some aspects of the relation between temporal experience and the A versus B debate. To begin with, I provide an overview of the A versus B debate and, following Baron et al. (2015), distinguish between two B-theoretic responses to the A-theoretic argument from experience, veridicalism and illusionism. I then argue for veridicalism over illusionism, by examining our (putative) experiences as of presentness and as of time passing. I close with some remarks on the relation between veridicalism and a deflationary view of the A versus B debate. I suggest that the deflationary view can provide further support for veridicalism.
In this paper, I present a general theory of topological explanations, and illustrate its fruitfulness by showing how it accounts for explanatory asymmetry. My argument is developed in three steps. In the first step, I show what it is for some topological property A to explain some physical or dynamical property B. Based on that, I derive three key criteria of successful topological explanations: a criterion concerning the facticity of topological explanations, i.e. what makes a topological explanation true of a particular system; a criterion for describing counterfactual dependencies in two explanatory modes, i.e. the vertical and the horizontal; and, finally, a third perspectival one that tells us when to use the vertical and when to use the horizontal mode. In the second step, I show how this general theory of topological explanations accounts for explanatory asymmetry in both the vertical and horizontal explanatory modes. Finally, in the third step, I argue that this theory is universally applicable across biological sciences, which helps to unify essential concepts of biological networks.
This thesis is about the conceptualization of persistence of physical, middle-sized objects within the theoretical framework of the revisionary ‘B-theory’ of time. According to the B-theory, time does not flow, but is an extended and inherently directed fourth dimension along which the history of the universe is ‘laid out’ once and for all. It is a widespread view among philosophers that if we accept the B-theory, the commonsensical ‘endurance theory’ of persistence will have to be rejected. The endurance theory says that objects persist through time by being wholly present at distinct times as numerically the same entity. Instead of endurantism, it has been argued, we have to adopt either ‘perdurantism’ or the ‘stage theory’. Perdurantism is the theory that objects are four-dimensional ‘space-time worms’ persisting through time by having distinct temporal parts at distinct times. The stage theory says that objects are instantaneous temporal parts (stages) of space-time worms, persisting by having distinct temporal counterparts at distinct times. In the thesis, it is argued that no good arguments have been provided for the conclusion that we are obliged to drop the endurance theory by acceptance of the B-theory. This conclusion stands even if the endurance theory incorporates the claim that objects endure through intrinsic change. It is also shown that perdurantism and the stage theory come with unwelcome consequences.

Paper I demonstrates that the main arguments for the view that objects cannot endure in B-time intrinsically unchanged fail. Papers II and III do the same with respect to the traditional arguments against endurance through intrinsic change in B-time. Paper III also contains a detailed account of the semantics of the tenseless copula, which occurs frequently in the debate. The contention of Paper IV is that four-dimensional space-time worms, as traditionally understood, are not suited to take dispositional predicates. In Paper V, it is shown that the stage theory needs to introduce an overabundance of persistence-concepts, many of which will have to be simultaneously applicable to a single object (qua falling under a single sortal), in order for the theory to be consistent. The final article, Paper VI, investigates the sense in which persistence can, as is sometimes suggested, be a ‘conventional matter’. It also asks whether alleged cases of ‘conventional persistence’ create trouble for the endurance theory. It is argued that conventions can only enter at a trivial semantic level, and that the endurance theory is no more threatened by such conventions than are its rivals.
We have a variety of different ways of dividing up, classifying, mapping, sorting and listing the objects in reality. The theory of granular partitions presented here seeks to provide a general and unified basis for understanding such phenomena in formal terms that is more realistic than existing alternatives. Our theory has two orthogonal parts: the first is a theory of classification; it provides an account of partitions as cells and subcells; the second is a theory of reference or intentionality; it provides an account of how cells and subcells relate to objects in reality. We define a notion of well-formedness for partitions, and we give an account of what it means for a partition to project onto objects in reality. We continue by classifying partitions along three axes: (a) in terms of the degree of correspondence between partition cells and objects in reality; (b) in terms of the degree to which a partition represents the mereological structure of the domain it is projected onto; and (c) in terms of the degree of completeness with which a partition represents this domain.
I offer an interpretation and a partial defense of Kit Fine's ‘Argument from Passage’, which is situated within his reconstruction of McTaggart's paradox. Fine argues that existing A-theoretic approaches to passage are no more dynamic, i.e. capture passage no better, than the B-theory. I argue that this comparative claim is correct. Our intuitive picture of passage, which inclines us towards A-theories, suggests more than coherent A-theories can deliver. In Finean terms, the picture requires not only Realism about tensed facts, but also Neutrality, i.e. the tensed facts not being ‘oriented towards’ one privileged time. However, unlike Fine, and unlike others who advance McTaggartian arguments, I take McTaggart's paradox to indicate neither the need for a more dynamic theory of passage nor that time does not pass. A more dynamic theory is not to be had: Fine's ‘non-standard realism’ amounts to no more than a conceptual gesture. But instead of concluding that time does not pass, we should conclude that theories of passage cannot deliver the dynamicity of our intuitive picture. For this reason, a B-theoretic account of passage that simply identifies passage with the succession of times is a serious contender.
Standard Type Theory, STT, tells us that $b^n(a^m)$ is well-formed iff $n=m+1$. However, Linnebo and Rayo [23] have advocated the use of Cumulative Type Theory, CTT, which has more relaxed type-restrictions: according to CTT, $b^\beta(a^\alpha)$ is well-formed iff $\beta>\alpha$. In this paper, we set ourselves against CTT. We begin our case by arguing against Linnebo and Rayo's claim that CTT sheds new philosophical light on set theory. We then argue that, while CTT's type-restrictions are unjustifiable, the type-restrictions imposed by STT are justified by a Fregean semantics. What is more, this Fregean semantics provides us with a principled way to resist Linnebo and Rayo's Semantic Argument for CTT. We end by examining an alternative approach to cumulative types due to Florio and Jones [10]; we argue that their theory is best seen as a misleadingly formulated version of STT.
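For readers scanning the notation, the two formation rules quoted in this abstract can be set side by side; the concrete instance that follows, with types 2 and 0, is only an added illustration and is not taken from the paper itself:

$$\text{STT: } b^{n}(a^{m}) \text{ is well-formed} \iff n = m+1 \qquad\qquad \text{CTT: } b^{\beta}(a^{\alpha}) \text{ is well-formed} \iff \beta > \alpha$$

For instance, $b^{2}(a^{0})$ is ill-formed by the STT rule, since $2 \neq 0+1$, but well-formed by the CTT rule, since $2 > 0$.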
This notebook presents an introductory overview of the cognitive perspective on the psychology of human behaviour for social science students. Starting with an introduction to cognitive developmental theories of how babies reason, the overview then moves to discuss how children develop into better thinkers. Adult theories of cognition are subsequently outlined and critically evaluated.

A chronology of topics includes: the rise of 'this thing we call cognition', Piaget's theory of cognitive development and its evaluation, problem space theory, and theories of mental representation in adult thought, examining, amongst other types of thinking and reasoning, deduction and induction, and an evaluation of mental representation theories.
Transactive memory theory describes the processes by which benefits for memory can occur when remembering is shared in dyads or groups. In contrast, cognitive psychology experiments demonstrate that social influences on memory disrupt and inhibit individual recall. However, most research in cognitive psychology has focused on groups of strangers recalling relatively meaningless stimuli. In the current study, we examined social influences on memory in groups with a shared history, who were recalling a range of stimuli, from word lists to personal, shared memories. We focused in detail on the products and processes of remembering during in-depth interviews with 12 older married couples. These interviews consisted of three recall tasks: (1) word list recall; (2) personal list recall, where stimuli were relevant to the couples’ shared past; and (3) an open-ended autobiographical interview. We conducted these tasks individually and then collaboratively two weeks later. Across each of the tasks, although some couples demonstrated collaborative inhibition, others demonstrated collaborative facilitation. We identified a number of factors that predicted collaborative success, in particular, group-level strategy use. Our results show that collaboration may help or hinder memory, and certain interactions are more likely to produce collaborative benefits.
This article offers a novel, conservative account of material constitution, one that incorporates sortal essentialism and features a theory of dominant sortals. It avoids coinciding objects, temporal parts, relativizations of identity, mereological essentialism, anti-essentialism, denials of the reality of the objects of our ordinary ontology, and other departures from the metaphysic implicit in ordinary ways of thinking. Defenses of the account against important objections are found in Burke 1997, 2003, and 2004, as well as in the often neglected six paragraphs that conclude section V of this article.
Semantic originalism is a theory of constitutional meaning that aims to disentangle the semantic, legal, and normative strands of debates in constitutional theory about the role of original meaning in constitutional interpretation and construction. This theory affirms four theses: (1) the fixation thesis, (2) the clause meaning thesis, (3) the contribution thesis, and (4) the fidelity thesis.

The fixation thesis claims that the semantic content of each constitutional provision is fixed at the time the provision is framed and ratified: subsequent changes in linguistic practice cannot change the semantic content of an utterance.

The clause meaning thesis claims that the semantic content is given by the conventional semantic meaning (or original public meaning) of the text with four modifications. The first modification is provided by the publicly available context of constitutional utterance: words and phrases that might be ambiguous in isolation can become clear in light of those circumstances of framing and ratification that could be expected to be known to interpreters of the Constitution across time. The second modification is provided by the idea of the division of linguistic labor: some constitutional provisions, such as the natural born citizen clause, may be terms of art, the meaning of which is fixed by the usages of experts. The third modification is provided by the idea of constitutional implicature: the Constitution may mean things it does not explicitly say. The fourth modification is provided by the idea of constitutional stipulations: the Constitution brings into being new terms, such as House of Representatives, and the meaning of these terms is stipulated by the Constitution itself.

The contribution thesis asserts that the semantic content of the Constitution contributes to the law: the most plausible version of the contribution thesis is modest, claiming that the semantic content of the Constitution provides rules of constitutional law, subject to various qualifications. Our constitutional practice provides strong evidence for the modest version of the contribution thesis.

The fidelity thesis asserts that we have good reasons to affirm fidelity to constitutional law: virtuous citizens and officials are disposed to act in accord with the Constitution; right-acting citizens and officials obey the Constitution in normal circumstances; constitutional conformity produces good consequences. Our public political culture affirms the great value of the rule of law.

We can summarize semantic originalism as a slogan: The original public meaning of the Constitution is the law, and for that reason it should be respected and obeyed. The slogan recapitulates each of the claims made by semantic originalism, but it is potentially misleading because it does not clearly distinguish between the semantic claims made by the fixation and clause meaning theses, the legal claim made by the contribution thesis, and the normative claim made by the fidelity thesis.

Part I introduces the four theses. Part II is entitled An Opinionated History of Constitutional Originalism, and it provides the context for all that follows. Part III is entitled Semantic Originalism: A Theory of Constitutional Meaning, and it lays out the case for original public meaning as the best nonnormative theory of constitutional content. Part IV is entitled The Normative Implications of Semantic Originalism, and it articulates a variety of normative arguments for originalism. Part V is entitled Conclusion: Semantic Originalism and Living Constitutionalism, and it explores the broad implications of semantic originalism for living constitutionalism and the future of constitutional theory.
This paper will examine the nature of mechanisms and the distinction between the relevant and irrelevant parts involved in a mechanism’s operation. I first consider Craver’s account of this distinction in his book on the nature of mechanisms, and explain some problems. I then offer a novel account of the distinction that appeals to some resources from Mackie’s theory of causation. I end by explaining how this account enables us to better understand what mechanisms are and their various features.
"Procedural Justice" offers a theory of procedural fairness for civil dispute resolution. The core idea behind the theory is the procedural legitimacy thesis: participation rights are essential for the legitimacy of adjudicatory procedures. The theory yields two principles of procedural justice: the accuracy principle and the participation principle. The two principles require a system of procedure to aim at accuracy and to afford reasonable rights of participation qualified by a practicability constraint. The Article begins in Part I, (...) Introduction, with two observations. First, the function of procedure is to particularize general substantive norms so that they can guide action. Second, the hard problem of procedural justice corresponds to the following question: How can we regard ourselves as obligated by legitimate authority to comply with a judgment that we believe (or even know) to be in error with respect to the substantive merits? The theory of procedural justice is developed in several stages, beginning with some preliminary questions and problems. The first question - what is procedure? - is the most difficult and requires an extensive answer: Part II, Substance and Procedure, defines the subject of the inquiry by offering a new theory of the distinction between substance and procedure that acknowledges the entanglement of the action-guiding roles of substantive and procedural rules while preserving the distinction between two ideal types of rules. The key to the development of this account of the nature of procedure is a thought experiment, in which we imagine a world with the maximum possible acoustic separation between substance and procedure. Part III, The Foundations of Procedural Justice, lays out the premises of general jurisprudence that ground the theory and answers a series of objections to the notion that the search for a theory of procedural justice is a worthwhile enterprise. Sections II and III set the stage for the more difficult work of constructing a theory of procedural legitimacy. Part IV, Views of Procedural Justice, investigates the theories of procedural fairness found explicitly or implicitly in case law and commentary. After a preliminary inquiry that distinguishes procedural justice from other forms of justice, Part IV focuses on three models or theories. The first, the accuracy model, assumes that the aim of civil dispute resolution is correct application of the law to the facts. The second, the balancing model, assumes that the aim of civil procedure is to strike a fair balance between the costs and benefits of adjudication. The third, the participation model, assumes that the very idea of a correct outcome must be understood as a function of process that guarantees fair and equal participation. Part IV demonstrates that none of these models provides the basis for a fully adequate theory of procedural justice. In Part V, The Value of Participation, the lessons learned from analysis and critique of the three models are then applied to the question whether a right of participation can be justified for reasons that are not reducible to either its effect on the accuracy or its effect on the cost of adjudication. The most important result of Part V is the Participatory Legitimacy Thesis: it is (usually) a condition for the fairness of a procedure that those who are to be finally bound shall have a reasonable opportunity to participate in the proceedings. 
The central normative thrust of Procedural Justice is developed in Part VI, Principles of Procedural Justice. The first principle, the Participation Principle, stipulates a minimum (and minimal) right of participation, in the form of notice and an opportunity to be heard, that must be satisfied (if feasible) in order for a procedure to be considered fair. The second principle, the Accuracy Principle, specifies the achievement of legally correct outcomes as the criterion for measuring procedural fairness, subject to four provisos, each of which sets out circumstances under which a departure from the goal of accuracy is justified by procedural fairness itself. In Part VII, The Problem of Aggregation, the Participation Principle and the Accuracy Principle are applied to the central problem of contemporary civil procedure: the aggregation of claims in mass litigation. Part VIII offers some concluding observations about the point and significance of Procedural Justice.
The computational paradigm, which has dominated psychology and artificial intelligence since the cognitive revolution, has been a source of intense debate. Recently, several cognitive scientists have argued against this paradigm, not by objecting to computation, but rather by objecting to the notion of representation. Our analysis of these objections reveals that it is not the notion of representation per se that is causing the problem, but rather specific properties of representations as they are used in various psychological theories. Our analysis suggests that all theorists accept the idea that cognitive processing involves internal information-carrying states that mediate cognitive processing. These mediating states are a superordinate category of representations. We discuss five properties that can be added to mediating states and examine their importance in various cognitive models. Finally, three methodological lessons are drawn from our analysis and discussion.
The goal of this paper is to defend the general tenet that time travelers cannot change the past within B-theoretical models of time, independently of how many temporal dimensions there are. Baron (Pacific Philosophical Quarterly, 98, 129–147) offered a strong argument intended to reach this general conclusion. However, his argument does not cover a peculiar case, i.e. a B-theoretical one-dimensional model of time that allows for the presence of internal times. Loss (Pacific Philosophical Quarterly, 96, 1–11) used the latter model to argue that time travelers can change the past within such a model. We show a way to debunk Loss’s argument, so that the general tenet about the impossibility of changing the past within B-theoretical models is maintained.
The article gives a novel argument to show that there is a sense of 'exists' suitable for posing a substantive issue between presentists and eternalists. It then seeks to invigorate a neglected variety of presentism. There are seven doctrines, widely accepted even among presentists, that create problems for presentism. Without distinguishing existence and being, presentists can comfortably reject all seven. Doing so would dispose of the majority of presentism’s problems. Further, it would enable presentists to reduce A-judgments to B-judgments, thereby insulating presentism from doubts about the intelligibility of A-theories. For reasons indicated very briefly, it might also make presentism less difficult to reconcile with special relativity, though the point is not pursued here.
Kopeikin (forthcoming a, forthcoming b) and Rachels’ (1975) bare-difference cases elicit the intuition that killing is no different than letting die. Hill’s (2018) bare-difference cases elicit the intuition that killing is worse than letting die. At least one of the intuitions must be mistaken. This calls for an error theory. Hill has an error theory for the intuition elicited by the Kopeikin/Rachels cases. Kopeikin and Rachels have an error theory for the intuition elicited by Hill’s cases. A natural thought is that we are at an impasse. There is no plausible basis for preferring one error theory to the other. I argue that this natural thought is mistaken. Not all error theories are equal. Preliminary considerations favor Hill’s error theory and disfavor the Kopeikin/Rachels error theory. But preliminary considerations are not decisive. The way forward in the bare-difference debate is not to evaluate intuitions. The intuitions are in. What is left to do now is evaluate the comparative status of the Hill and the Kopeikin/Rachels error theories.
This paper proposes a semantics for free choice permission that explains both the non-classical behavior of modals and disjunction in sentences used to grant permission, and their classical behavior under negation. It also explains why permissions can expire when new information comes in and why free choice arises even when modals scope under disjunction. On the proposed approach, deontic modals update preference orderings, and connectives operate on these updates rather than propositions. The success of this approach stems from its capacity to capture the difference between expressing the preferences that give rise to permissions and conveying propositions about those preferences.
Some New Mechanists have proposed that claims of compositional relations are justified by combining the results of top-down and bottom-up interlevel interventions. But what do scientists do when they can perform, say, a cellular intervention, but not a subcellular detection? In such cases, paired interlevel interventions are unavailable. We propose that scientists use abduction and we illustrate its use through a case study of the ionic theory of resting and action potentials.
Very plausibly, nothing can be a genuine computing system unless it meets an input-sensitivity requirement. Otherwise all sorts of objects, such as rocks or pails of water, can count as performing computations, even such as might suffice for mentality, thus threatening computationalism about the mind with panpsychism. Maudlin (J Philos 86:407–432, 1989) and Bishop (2002a, b) have argued, however, that such a requirement creates difficulties for computationalism about conscious experience, putting it in conflict with the very intuitive thesis that conscious experience supervenes on physical activity. Klein (Synthese 165:141–153, 2008) proposes a way for computationalists about experience to avoid panpsychism while still respecting the supervenience of experience on activity. I argue that his attempt to save computational theories of experience from Maudlin’s and Bishop’s critique fails.
According to a common objection to epistemological naturalism, no empirical, scientific theory of knowledge can be normative in the way epistemological theories need to be. In response, such naturalists as W.V. Quine have claimed that naturalized epistemology can be normative by emulating engineering disciplines and addressing the relations of causal efficacy between our cognitive means and ends. This paper evaluates that "engineering reply" and finds it a mixed success. Based on consideration of what it might mean to call a theory "normative," seven versions of the normativity objection to epistemological naturalism are formulated. The engineering reply alone is sufficient to answer only the four least sophisticated versions. To answer the others, naturalists must draw on more resources than their engineering reply alone provides.