On one popular view, the general covariance of gravity implies that change is relational in a strong sense, such that all it is for a physical degree of freedom to change is for it to vary with regard to a second physical degree of freedom. At a quantum level, this view of change as relative variation leads to a fundamentally timeless formalism for quantum gravity. Here, we will show how one may avoid this acute ‘problem of time’. Under our view, duration is still regarded as relative, but temporal succession is taken to be absolute. Following our approach, which is presented in more formal terms elsewhere, it is possible to conceive of a genuinely dynamical theory of quantum gravity within which time, in a substantive sense, remains.
1 Introduction
1.1 The problem of time
1.2 Our solution
2 Understanding Symmetry
2.1 Mechanics and representation
2.2 Freedom by degrees
2.3 Voluntary redundancy
3 Understanding Time
3.1 Change and order
3.2 Quantization and succession
4 Time and Gravitation
4.1 The two faces of classical gravity
4.2 Retaining succession in quantum gravity
5 Discussion
5.1 Related arguments
5.2 Concluding remarks
Effective Field Theory (EFT) is the successful paradigm underlying modern theoretical physics, including the "Core Theory" of the Standard Model of particle physics plus Einstein's general relativity. I will argue that EFT grants us a unique insight: each EFT model comes with a built-in specification of its domain of applicability. Hence, once a model is tested within some domain (of energies and interaction strengths), we can be confident that it will continue to be accurate within that domain. Currently, the Core Theory has been tested in regimes that include all of the energy scales relevant to the physics of everyday life (biology, chemistry, technology, etc.). Therefore, we have reason to be confident that the laws of physics underlying the phenomena of everyday life are completely known.
Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.
I defend the extremist position that the fundamental ontology of the world consists of a vector in Hilbert space evolving according to the Schrödinger equation. The laws of physics are determined solely by the energy eigenspectrum of the Hamiltonian. The structure of our observed world, including space and fields living within it, should arise as a higher-level emergent description. I sketch how this might come about, although much work remains to be done.
It seems natural to ask why the universe exists at all. Modern physics suggests that the universe can exist all by itself as a self-contained system, without anything external to create or sustain it. But there might not be an absolute answer to why it exists. I argue that any attempt to account for the existence of something rather than nothing must ultimately bottom out in a set of brute facts; the universe simply is, without ultimate cause or explanation.
We have a much better understanding of physics than we do of consciousness. I consider ways in which intrinsically mental aspects of fundamental ontology might induce modifications of the known laws of physics, or whether they could be relevant to accounting for consciousness if no such modifications exist. I suggest that our current knowledge of physics should make us skeptical of hypothetical modifications of the known rules, and that without such modifications it’s hard to imagine how intrinsically mental aspects could play a useful explanatory role. Draft version of a paper submitted to the Journal of Consciousness Studies, special issue responding to Philip Goff’s Galileo’s Error: Foundations for a New Science of Consciousness.
Cosmological models that invoke a multiverse - a collection of unobservable regions of space where conditions are very different from the region around us - are controversial, on the grounds that unobservable phenomena shouldn't play a crucial role in legitimate scientific theories. I argue that the way we evaluate multiverse models is precisely the same as the way we evaluate any other models: on the basis of abduction, Bayesian inference, and empirical success. There is no scientifically respectable way to do cosmology without taking into account different possibilities for what the universe might be like outside our horizon. Multiverse theories are utterly conventionally scientific, even if evaluating them can be difficult in practice.
When are inequalities in political power undemocratic, and why? While some writers condemn any inequalities in political power as a deviation from the ideal of democracy, this view is vulnerable to the simple objection that representative democracies concentrate political power in the hands of elected officials rather than distributing it equally among citizens, but they are no less democratic for it. Building on recent literature that interprets democracy as part of a broader vision of social equality, I argue that concentrations of political power are incompatible with democracy, and with a commitment to social equality more generally, when they consist in some having greater arbitrary power to influence decisions according to their idiosyncratic preferences. A novel account of the relationship between power and social status clarifies the role of social equality in the justification of democracy, including a representative democracy in which public officials have more political power than ordinary citizens.
We study the question of how to decompose Hilbert space into a preferred tensor-product factorization without any pre-existing structure other than a Hamiltonian operator, in particular the case of a bipartite decomposition into "system" and "environment." Such a decomposition can be defined by looking for subsystems that exhibit quasi-classical behavior. The correct decomposition is one in which pointer states of the system are relatively robust against environmental monitoring (their entanglement with the environment does not continually and dramatically increase) and remain localized around approximately-classical trajectories. We present an in-principle algorithm for finding such a decomposition by minimizing a combination of entanglement growth and internal spreading of the system. Both of these properties are related to locality in different ways. This formalism could be relevant to the emergence of spacetime from quantum entanglement.
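As a rough illustration of one ingredient of such a procedure, here is a minimal Python sketch of an entanglement-growth score for a fixed candidate split (a toy construction of our own, assuming numpy and scipy; the in-principle algorithm described in the abstract also penalizes internal spreading and searches over candidate factorizations, which is omitted here):

import numpy as np
from scipy.linalg import expm

def entanglement_entropy(psi, d_sys, d_env):
    # Von Neumann entropy of the reduced state of the candidate "system".
    M = psi.reshape(d_sys, d_env)
    rho_sys = M @ M.conj().T
    evals = np.linalg.eigvalsh(rho_sys)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def entanglement_growth(H, d_sys, d_env, t=0.1, trials=50, seed=0):
    # Average entanglement generated from random product states by exp(-iHt).
    # A quasi-classical system/environment split keeps this growth small.
    rng = np.random.default_rng(seed)
    U = expm(-1j * H * t)
    total = 0.0
    for _ in range(trials):
        a = rng.normal(size=d_sys) + 1j * rng.normal(size=d_sys)
        b = rng.normal(size=d_env) + 1j * rng.normal(size=d_env)
        psi = np.kron(a / np.linalg.norm(a), b / np.linalg.norm(b))
        total += entanglement_entropy(U @ psi, d_sys, d_env)
    return total / trials

# Toy usage: score the fixed 2 x 2 split of a random two-qubit Hamiltonian.
# A full search would minimize this score over unitary changes of basis.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2  # Hermitian toy Hamiltonian
print(entanglement_growth(H, d_sys=2, d_env=2))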
Starting from the special theory of relativity it is argued that the structure of an experience is extended over time, making experience dynamic rather than static. The paper describes and explains what is meant by phenomenal parts and outlines opposing positions on the experience of time. Time according to the special theory of relativity is defined, and the possibility of static experience is shown to be implausible, leading to the conclusion that experience is dynamic. Some implications of this for the relationship of phenomenology to the physical world are considered.
It is commonplace in discussions of modern cosmology to assert that the early universe began in a special state. Conventionally, cosmologists characterize this fine-tuning in terms of the horizon and flatness problems. I argue that the fine-tuning is real, but these problems aren't the best way to think about it: causal disconnection of separated regions isn't the real problem, and flatness isn't a problem at all. Fine-tuning is better understood in terms of a measure on the space of trajectories: given reasonable conditions in the late universe, the fraction of cosmological histories that were smooth at early times is incredibly tiny. This discussion helps clarify what is required by a complete theory of cosmological initial conditions.
We provide a derivation of the Born Rule in the context of the Everett (Many-Worlds) approach to quantum mechanics. Our argument is based on the idea of self-locating uncertainty: in the period between the wave function branching via decoherence and an observer registering the outcome of the measurement, that observer can know the state of the universe precisely without knowing which branch they are on. We show that there is a uniquely rational way to apportion credence in such cases, which leads directly to the Born Rule. Our analysis generalizes straightforwardly to cases of combined classical and quantum self-locating uncertainty, as in the cosmological multiverse.
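To see the shape of the self-locating argument, here is a worked toy calculation in LaTeX (the particular state and fine-graining below are our own illustration, not an example taken from the paper):

\[
|\Psi\rangle \;=\; \tfrac{1}{\sqrt{3}}\,|A\rangle|E_A\rangle \;+\; \sqrt{\tfrac{2}{3}}\,|B\rangle|E_B\rangle .
\]
Further decoherence can fine-grain the $B$ branch into two orthogonal, equal-amplitude sub-branches:
\[
\sqrt{\tfrac{2}{3}}\,|B\rangle|E_B\rangle \;=\; \tfrac{1}{\sqrt{3}}\,|B\rangle|E_{B_1}\rangle \;+\; \tfrac{1}{\sqrt{3}}\,|B\rangle|E_{B_2}\rangle .
\]
All three resulting branches now carry equal amplitude $1/\sqrt{3}$, so indifference over self-locating possibilities assigns each credence $1/3$, giving $P(A)=1/3$ and $P(B)=2/3$: exactly the Born-rule weights $|1/\sqrt{3}|^{2}$ and $|\sqrt{2/3}|^{2}$.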
Foucault’s disciplinary society and his notion of panopticism are often invoked in discussions regarding electronic surveillance. Against this use of Foucault, I argue that contemporary trends in surveillance technology abstract human bodies from their territorial settings, separating them into a series of discrete flows through what Deleuze terms the surveillant assemblage. The surveillant assemblage and its product, the socially sorted body, aim less at molding, punishing and controlling the body and more at triggering events of in- and ex-clusion from life opportunities. The meaning of the body as monitored by latest-generation vision technologies based on machine-only surveillance has been transformed. Such a body is no longer disciplinary in the Foucauldian sense. It is a virtual/flesh interface broken into discrete data flows whose comparison and breakage generate bodies as both legible and eligible (or illegible).
Republicans hold that people are dominated merely in virtue of others' having unconstrained abilities to frustrate their choices. They argue further that public officials may dominate citizens unless subject to popular control. Critics identify a dilemma. To maintain the possibility of popular control, republicans must attribute to the people an ability to control public officials merely in virtue of the possibility that they might coordinate their actions. But if the possibility of coordination suffices for attributing abilities to groups, then, even in the best case, countless groups will be dominating because it will be possible for their members to coordinate their actions with the aim of frustrating others' choices. We argue the dilemma is apparent only. To make our argument, we present a novel interpretation of the republican concept of domination with the help of a game-theoretic model that clarifies the significance of collective action problems for republican theory.
In democracies citizens are supposed to have some control over the general direction of policy. According to a pretheoretical interpretation of this idea, the people have control if elections and other democratic institutions compel officials to do what the people want, or what the majority want. This interpretation of popular control fits uncomfortably with insights from social choice theory; some commentators—Riker, most famously—have argued that these insights should make us abandon the idea of popular rule as traditionally understood. This article presents a formal theory of popular control that responds to the challenge from social choice theory. It makes precise a sense in which majorities may be said to have control even if the majority preference relation has an empty core. And it presents a simple game-theoretic model to illustrate how majorities can exercise control in this specified sense, even when incumbents are engaged in purely redistributive policymaking and the majority rule core is empty.
I ask whether what we know about the universe from modern physics and cosmology, including fine-tuning, provides compelling evidence for the existence of God, and answer largely in the negative.
In this article, it is argued that existing democracies might establish popular rule even if Joseph Schumpeter’s notoriously unflattering picture of ordinary citizens is accurate. Some degree of popular rule is in principle compatible with apathetic, ignorant and suggestible citizens, contrary to what Schumpeter and others have maintained. The people may have control over policy, and their control may constitute popular rule, even if citizens lack definite policy opinions and even if their opinions result in part from elites’ efforts to manipulate these opinions. Thus, even a purely descriptive, ‘realist’ account of democracy of the kind that Schumpeter professed to offer may need to concede that there is no democracy without some degree of popular rule.
In this introduction to contemporary conceptions of time and change, I investigate what our experience of time, that is, our experience of change, seems to be, and ask whether how it seems could match the reality. My conclusion is that more recent contemporary conceptions of time can do this but that more intuitive or traditional conceptions cannot. Thus, the more contemporary conceptions are preferable for research into time consciousness.
A crucial part of the contemporary interest in logicism in the philosophy of mathematics resides in its idea that arithmetical knowledge may be based on logical knowledge. Here an implementation of this idea is considered that holds that knowledge of arithmetical principles may be based on two things: (i) knowledge of logical principles and (ii) knowledge that the arithmetical principles are representable in the logical principles. The notions of representation considered here are related to theory-based and structure-based notions of representation from contemporary mathematical logic. It is argued that the theory-based versions of such logicism are either too liberal (the plethora problem) or are committed to intuitively incorrect closure conditions (the consistency problem). Structure-based versions must on the other hand respond to a charge of begging the question (the circularity problem) or explain how one may have a knowledge of structure in advance of a knowledge of axioms (the signature problem). This discussion is significant because it gives us a better idea of what a notion of representation must look like if it is to aid in realizing some of the traditional epistemic aims of logicism in the philosophy of mathematics.
The identity theory’s rise to prominence in analytic philosophy of mind during the late 1950s and early 1960s is widely seen as a watershed in the development of physicalism, in the sense that whereas logical behaviourism proposed analytic and a priori ascertainable identities between the meanings of mental and physical-behavioural concepts, the identity theory proposed synthetic and a posteriori knowable identities between mental and physical properties. While this watershed does exist, the standard account of it is misleading, as it is founded in erroneous intensional misreadings of the logical positivists’—especially Carnap’s—extensional notions of translation and meaning, as well as misinterpretations of the positivists’ shift from the strong thesis of translation-physicalism to the weaker and more liberal notion of reduction-physicalism that occurred in the Unity of Science programme. After setting the historical record straight, the essay traces the first truly modern identity theory to Schlick’s pre-positivist views circa 1920 and goes on to explore its further development in Feigl, arguing that the fundamental difference between the Schlick-Feigl identity theory and the more familiar and influential Place-Smart-Armstrong identity theory has resurfaced in the deep and seemingly unbridgeable gulf in contemporary philosophy of consciousness between inflationary mentalism and deflationary physicalism.
Riker (1982) famously argued that Arrow’s impossibility theorem undermined the logical foundations of “populism”, the view that in a democracy, laws and policies ought to express “the will of the people”. In response, his critics have questioned the use of Arrow’s theorem on the grounds that not all configurations of preferences are likely to occur in practice; the critics allege, in particular, that majority preference cycles, whose possibility the theorem exploits, rarely happen. In this essay, I argue that the critics’ rejoinder to Riker misses the mark even if its factual claim about preferences is correct: Arrow’s theorem and related results threaten the populist’s principle of democratic legitimacy even if majority preference cycles never occur. In this particular context, the assumption of an unrestricted domain is justified irrespective of the preferences citizens are likely to have.
Political scientists' failure to pay careful attention to the content (as opposed to the operationalization) of their chosen definition of 'democracy' can make them liable to draw invalid inferences from their empirical research. With this problem in mind, we argue for the following proposition: if one wishes to conduct empirical research that contributes to an existing conversation about democracy, then one must choose a definition of 'democracy' that picks out the topic of that conversation as opposed to some other (perhaps nearby) topic of conversation. We show that, as a practical matter, one of the most effective methods for preserving "topic continuity" is to choose a definition of 'democracy' that concurs with prevailing judgments about how to classify particular regimes, emphasizing the superiority (in this regard) of judgments about stylized hypothetical scenarios as opposed to judgments about the actual regimes we observe in our datasets.
Pettit (2012) presents a model of popular control over government, according to which it consists in the government being subject to those policy-making norms that everyone accepts. In this paper, I provide a formal statement of this interpretation of popular control, which illuminates its relationship to other interpretations of the idea with which it is easily conflated, and which gives rise to a theorem, similar to the famous Gibbard-Satterthwaite theorem. The theorem states that if government policy is subject to popular control, as Pettit interprets it, and policy responds positively to changes in citizens' normative attitudes, then there is a single individual whose normative attitudes unilaterally determine policy. I use the model and theorem as an illustrative example to discuss the role of mathematics in normative political theory.
Athenaeus of Attalia distinguishes two types of exercise or training (γυμνασία) that are required at each stage of life: training of the body and training of the soul. He says that training of the body includes activities like physical exercises, eating, drinking, bathing and sleep. Training of the soul, on the other hand, consists of thinking, education, and emotional regulation (in other words, 'philosophy'). The notion of 'training of the soul' and the contrast between 'bodily' and 'psychic' exercise is common in the Academic and Stoic traditions Athenaeus is drawing from; however, he is the earliest extant medical author to distinguish these kinds of training and to treat them as equally important aspects of regimen. In this paper, I propose some reasons why he found this distinction useful, and I examine how he justified incorporating it into his writings on regimen, namely by attributing Plato's beliefs about regimen to Hippocrates, a strategy Galen would adopt well over a century later.
This paper examines the notion of the biopolitical body from the standpoint of Foucault’s logic of the security mechanism and the history he tells of vaccine technology. It then investigates how the increasing importance of the genetic code for determining the meaning and limits of the human in the field of 20th century cell biology has been a cause for ongoing transformation in the practices that currently extend vaccine research and development. I argue that these transformations mark the emergence of a new kind of medical subject – the stabilized and infinitely reproducible human cell line – and that the practices and markets exploiting this new form of organism have had a destabilizing effect on the very biopolitical structures that engendered them and, in fact, mark a new way of conceiving the possibilities of cellular life. I call these new ways of organizing power that intervene in the logic of the security measure by mediating the relationship between populations and persons the microbiopolitical.
In Morality & Mathematics, Justin Clarke-Doane argues that it is hard to imagine being "a realist about, for example, the standard model of particle physics, but not about mathematics." I try to explain how that seems very possible from the perspective of a physicist. What is real is the physical world; mathematics starts from descriptions of the natural world and extrapolates from there, but that extrapolation does not imply any independent reality. Submitted to an Analysis Reviews symposium on Clarke-Doane's Morality & Mathematics.
Our culture is dominated by digital documents in ways that are easy to overlook. These documents have changed our worldviews about science and have raised our expectations of them as tools for knowledge justification. This article explores the complexities surrounding the digital document by revisiting Michael Polanyi’s theory of tacit knowledge—the idea that “we can know more than we can tell.” The theory presents to us a dilemma: if we can know more than we can tell, then this means that the communication of science via the document as a primary form of telling will always be incomplete. This dilemma presents significant challenges to the open science movement.
The received view in the history of the philosophy of psychology is that the logical positivists—Carnap and Hempel in particular—endorsed the position commonly known as “logical” or “analytical” behaviourism, according to which the relations between psychological statements and the physical-behavioural statements intended to give their meaning are analytic and knowable a priori. This chapter argues that this is sheer legend: most, if not all, such relations were viewed by the logical positivists as synthetic and knowable only a posteriori. It then traces the origins of the legend to the logical positivists’ idiosyncratic extensional or at best weakly intensional use of what are now considered crucially strongly intensional semantic notions, such as “translation,” “meaning” and their cognates, focussing on a particular instance of this latter phenomenon, arguing that a conflation of explicit definition and analyticity may be the chief source of the legend.
The Pneumatist school of medicine has the distinction of being the only medical school in antiquity named for a belief in a part of a human being. Unlike the Herophileans or the Asclepiadeans, their name does not pick out the founder of the school. Unlike the Dogmatists, Empiricists, or Methodists, their name does not pick out a specific approach to medicine. Instead, the name picks out a belief: that pneuma is of paramount importance, both for explaining health and disease, and for determining treatments for the healthy and sick. In this paper, we re-examine what our sources say about the pneuma of the Pneumatists in order to understand what these physicians thought it was and how it shaped their views on physiology, diagnosis and treatment.
According to Russellianism (or Millianism), the two sentences ‘Ralph believes George Eliot is a novelist’ and ‘Ralph believes Mary Ann Evans is a novelist’ cannot diverge in truth-value, since they express the same proposition. The problem for the Russellian (or Millian) is that a puzzle of Kaplan’s seems to show that they can diverge in truth-value and that therefore, since the Russellian holds that they express the same proposition, the Russellian view is contradictory. I argue that the standard Russellian appeal to “ways of thinking” or “propositional guises” is not necessary to solve the puzzle. Rather than this retrograde concession to Fregeanism, appeal should be made to second-order belief. The puzzle is solved, and the contradiction avoided, by maintaining that both sentences are indeed true in addition to the sentence ‘Ralph (mistakenly) believes that he does not believe Mary Ann Evans/George Eliot is a novelist’.
This paper is about the history of a question in ancient Greek philosophy and medicine: what holds the parts of a whole together? The idea that there is a single cause responsible for cohesion is usually associated with the Stoics. They refer to it as the synectic cause (αἴτιον συνεκτικόν), a term variously translated as ‘cohesive cause,’ ‘containing cause’ or ‘sustaining cause.’ The Stoics, however, are neither the first nor the only thinkers to raise this question or to propose a single answer. Many earlier thinkers offer their own candidates for what actively binds parts together, with differing implications not only for why we are wholes rather than heaps, but also why our bodies inevitably become diseased and fall apart. This paper assembles, up to the time of the Stoics, one part of the history of such a cause: what is called ‘the synechon’ (τὸ συνέχον) – that which holds things together. Starting with our earliest evidence from Anaximenes (sixth century BCE), the paper looks at different candidates and especially the models and metaphors for thinking about causes of cohesion which were proposed by different philosophers and doctors including Empedocles, early Greek doctors, Diogenes of Apollonia, Plato and Aristotle. My goal is to explore why these candidates and models were proposed and how later philosophical objections to them led to changes in how causes of cohesion were understood.
Propositionalism is the view that intentional attitudes, such as belief, are relations to propositions. Propositionalists argue that propositionalism follows from the intuitive validity of certain kinds of inferences involving attitude reports. Jubien (2001) argues powerfully against propositions and sketches some interesting positive proposals, based on Russell’s multiple relation theory of judgment, about how to accommodate “propositional phenomena” without appeal to propositions. This paper argues that none of Jubien’s proposals succeeds in accommodating an important range of propositional phenomena, such as the aforementioned validity of attitude-report inferences. It then shows that the notion of a predication act-type, which remains importantly Russellian in spirit, is sufficient to explain the range of propositional phenomena in question, in particular the validity of attitude-report inferences. The paper concludes with a discussion of whether predication act-types are really just propositions by another name.
Quine introduced a famous distinction between the ‘notional’ sense and the ‘relational’ sense of certain attitude verbs. The distinction is both intuitive and sound but is often conflated with another distinction Quine draws between ‘dyadic’ and ‘triadic’ (or higher degree) attitudes. I argue that this conflation is largely responsible for the mistaken view that Quine’s account of attitudes is undermined by the problem of the ‘exportation’ of singular terms within attitude contexts. Quine’s system is also supposed to suffer from the problem of ‘suspended judgement with continued belief’. I argue that this criticism fails to take account of a crucial presupposition of Quine’s about the connection between thought and language. The aim of the paper is to defend the spirit of Quine’s account of attitudes by offering solutions to these two problems.
This concluding chapter of _Techno-Fixers: Origins and Implications of Technological Faith_ examines the widespread overconfidence in present-day and proposed 'technological fixes', and provides guidelines - social, ethical and technical - for soberly assessing candidate technological solutions for societal problems.
According to Russellianism, the content of a Russellian thought, in which a person ascribes a monadic property to an object, can be represented as an ordered couple of the object and the property. A consequence of this is that it is not possible for a person to believe that a is F and not to believe that b is F, when a = b. Many critics of Russellianism suppose that this is possible and thus that Russellianism is false. Several arguments for this claim are criticized, and it is argued that Russellians need not appeal to representational notions in order to defeat them. Contrary to popular opinion, the prospects for a pure Russellianism, a Russellianism without representations, are in fact very good.
This is a phenomenologically grounded ethnographic study of the life-world of ecstasy users in the socio-cultural contexts of raving and clubs in Sydney, Australia. The thesis espouses existential-phenomenology as a framework for describing and understanding these experiences. I argue against and reject the widespread mechanistic-materialist paradigms that inform bio-medical and bio-psychological interpretations of drug-use and non-ordinary states of consciousness.

As an alternative to these dominant reductionist perspectives I draw on a holistic organismic approach and the application of phenomenology to ethnographic field research. More specifically, my exploration of the experiences of ecstasy is based upon a dialogal phenomenology which enabled me to generate a processual morphology of the varieties of ecstasy experience and the users’ mode of being-in-the-world. Through this endeavour I also argue for a phenomenological foundation of the study of drug-use and non-ordinary states of consciousness in general.
The existence of object-dependent thoughts has been doubted on the grounds that reference to such thoughts is unnecessary or 'redundant' in the psychological explanation of intentional action. This paper argues to the contrary that reference to object-dependent thoughts is necessary to the proper psychological explanation of intentional action upon objects. Section I sets out the argument for the alleged explanatory redundancy of object-dependent thoughts, an argument which turns on the coherence of an alternative 'dual-component' model of explanation. Section II rebuts this argument by showing the dual-component model to be incoherent precisely because of its exclusion of object-dependent thoughts. Section III concludes with a conjecture about the further possible significance of object-dependent thoughts for the prediction of action.
This chapter addresses Susan Sontag's undeserved neglect by contemporary ethical philosophers, bringing awareness to some of the unique metaethical insights born of her reflections on photographic representations of evil. I argue that Sontag's thought provides fertile ground for thinking about: (1) moral perception and its relation to moral knowledge; and (2) the epistemic and moral value of our emotional responses to the misery and suffering of others. I show that, contrary to standard moral perception theory (e.g. Blum 1994), Sontag holds that we can have general moral perceptual knowledge. I then explore Sontag's idea that certain emotional responses, like sympathy and compassion, can sometimes be impertinent, in virtue of their having false or illusory content. I explain why this is so, and show the epistemic and motivational problems it poses for moral sentimentalism.
The paper presents a new theory of perceptual demonstrative thought, the property-dependent theory. It argues that the theory is superior to both the object-dependent theory (Evans, McDowell) and the object-independent theory (Burge).
The theory of object-dependent singular thought is outlined and the central motivation for it, turning on the connection between thought content and truth conditions, is discussed. Some of its consequences for the epistemology of thought are noted and connections are drawn to the general doctrine of externalism about thought content. Some of the main criticisms of the object-dependent view of singular thought are outlined. Rival conceptions of singular thought are also sketched and their problems noted.
Reichenbach's 'principle of the common cause' is a foundational assumption of some important recent contributions to quantitative social science methodology, but no similar principle appears in econometrics. Reiss (2005) has argued that the principle is necessary for instrumental variables methods in econometrics, and Pearl (2009) builds a framework using it that he proposes as a means of resolving an important methodological dispute among econometricians. We aim to show, through analysis of the main problem instrumental variables methods are used to resolve, that the relationship of the principle to econometric methods is more nuanced than previous work implies, but that the principle may nevertheless make a valuable contribution to the coherence and validity of existing methods.
In this paper I attempt to discover the object of Aristotle’s God’s νόησις in Metaphysics Λ.9. In Section I, I catalogue existing interpretations and mention the two key concepts of (i) God’s substancehood and (ii) his metaphysical simplicity. In Section II, I explore the first two aporiae of Λ.9 – namely (1) what God’s οὐσία is and (2) what God intelligizes. In Section III, I show how Aristotle solves these aporiae by contending that God’s οὐσία is actually intelligizing, and being determined to do so by himself, and that the object of his νόησις is himself, such that he intelligizes his own οὐσία, and I explain what this means. In Section IV, I present the second pair of aporiae in Λ.9 and show how, by solving these, Aristotle clarifies the position arrived at in Section III. Lastly, in Section V, I present the final aporia and its solution, and conclude that Aristotle’s God is a radically-unified Narcissus-God who intelligizes his own οὐσία.
Extended Cognition (EC) hypothesizes that there are parts of the world outside the head serving as cognitive vehicles. One criticism of this controversial view is the problem of “cognitive bloat”, which says that EC is too permissive and fails to provide an adequate necessary criterion for cognition. It cannot, for instance, distinguish genuine cognitive vehicles from mere supports (e.g. the Yellow Pages). In response, Andy Clark and Mark Rowlands have independently suggested that genuine cognitive vehicles are distinguished from supports in that the former have been “recruited,” i.e. they are either artifacts or products of evolution. I argue against this proposal: there are counterexamples to the claim that “Teleological” EC is either necessary or sufficient for cognition. Teleological EC conflates different types of scientific projects, and it inherits content externalism’s alienation from historically impartial cognitive science.