Our understanding of ourselves and the world as historical has drastically changed since the postwar period, yet this emerging historical sensibility has not been appropriately explained in a coherent theory of history. In this book, Zoltán Simon argues that instead of seeing the past, the present and the future together on a temporal continuum as history, we now expect unprecedented change to happen in the future and we look at the past by assuming that such changes have already happened. This radical theory of history challenges narrative conceptualizations of history which assume a past potential of humanity unfolding over time to reach future fulfillment and seeks new ways of conceptualizing the altered socio-cultural concerns Western societies are currently facing. By creating a novel set of concepts to make sense of our altered historical condition regarding both history understood as the course of human affairs and historical writing, History in Times of Unprecedented Change offers a highly original and engaging take on the state of history and historical theory in the present and beyond.
The posthuman has been looming large on the human horizon lately. Yet there is no shared understanding of what a posthuman future could possibly mean, and the tension between a technological‐scientific prospect of posthumanity and the critical posthumanist scholarship of the humanities is growing palpable. Whereas the former harbors a novel sense of historicity signaled by the expectation of an evental change to bring about the technological posthuman as a previously nonexistent and other‐than‐human central subject, the latter theorizes a postanthropocentric subjectivity of beings still human. In doing so, it extends the already familiar emancipatory concerns of the human world over the nonhuman, with special attention paid to the ecological other. Despite the occasional claims of critical posthumanism to bring humanities and technological‐scientific approaches to a shared platform, the prospect of technological beings of unparalleled power and the ecotopia of species equality do not fit together very well. In this article I argue that, in their present shape, technological posthumanity and critical posthumanism represent hardly reconcilable social imaginaries and two cultures of the posthuman future. My intervention is a plea for developing a more profound and mutual understanding of both. Instead of advocating particular agendas that nevertheless claim validity for the entirety of planetary life and the entire scholarly enterprise of knowledge‐production, we could invest more in efforts to come to grips with both social imaginaries and venture jointly into the creation of the conceptual tools of a new knowledge economy of understanding the rapidly changing world and our own (post)human prospects.
The aim of this article is to analyze the uses Werner Heisenberg made of Greek philosophy in his work. We intend to relate these uses not only to the internal argumentation present in the German physicist’s texts, but also to the historical context and to the conflicts and debates among the various interpretations of quantum theory during the first half of the twentieth century. We begin with a general presentation of quantum theory and of the presence of philosophy in Heisenberg’s work, followed by a case study of Heisenberg’s appropriation of the thought of Leucippus, Democritus, Heraclitus, Plato, and Aristotle.
Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component to any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with The Institute for Formal Ontology and Medical Information Sciences (IFOMIS), are working towards developing and implementing just such a theory, combining the open software architecture of L&C’s LinkSuite™ with the philosophical rigor of IFOMIS’s Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.
In recent years, a number of theorists have claimed that beliefs about probability are transparent. To believe probably p is simply to have a high credence that p. In this paper, I prove a variety of triviality results for theses like the above. I show that such claims are inconsistent with the thesis that probabilistic modal sentences have propositions or sets of worlds as their meaning. Then I consider the extent to which a dynamic semantics for probabilistic modals can capture theses connecting belief, certainty, credence, and probability. I show that although a dynamic semantics for probabilistic modals does allow one to validate such theses, it can only do so at a cost. I prove that such theses can only be valid if probabilistic modals do not satisfy the axioms of the probability calculus.
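A rough formalization may help fix ideas (the notation here is illustrative, not the paper’s own, and the threshold t and the Lockean belief-credence link are assumptions of this sketch):

```latex
\begin{align*}
\text{Transparency:}\quad & B(\text{probably } p) \;\leftrightarrow\; Cr(p) > t\\
\text{Lockean link (assumed):}\quad & B(q) \;\leftrightarrow\; Cr(q) > t \quad \text{for any proposition } q\\
\text{If ``probably } p\text{'' denotes a proposition:}\quad & Cr(\text{probably } p) > t \;\leftrightarrow\; Cr(p) > t \quad \text{for every } Cr
\end{align*}
```

Demanding the last equivalence for every rational credence function Cr is the kind of global constraint that triviality results of the sort described show to be untenable.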
I argue that there can be no such thing as a borderline case of the predicate ‘phenomenally conscious’: for any given creature at any given time, it cannot be vague whether that creature is phenomenally conscious at that time. I first defend the Positive Characterization Thesis, which says that for any borderline case of any predicate there is a positive characterization of that case that can show any sufficiently competent speaker what makes it a borderline case. I then appeal to the familiar claim that zombies are conceivable, and I argue that this claim entails that there can be no positive characterizations of borderline cases of ‘phenomenally conscious’. By the Positive Characterization Thesis, it follows that ‘phenomenally conscious’ cannot have any borderline cases.
This paper is a fresh attempt to articulate the role of a theory of truthmakers. We argue that truthmaker theory constitutes a cornerstone of good methodology in metaphysics, but that a conflation of truthmaker theory with the theory of truth has been responsible for certain excesses associated with truthmaker-based approaches in the recent literature. If truthmaker theory is not a component of a theory of truth, then truthmaker maximalism – the view that every truth has a truthmaker – loses its primary motivation. More generally, if the task of truthmaker theory is not to provide a definition or account of truth in truthmaker terms, there is no pressing need for hard, a priori principles stating which truths have truthmakers and which do not.
In this open peer commentary, we categorize the possible “neuroscience in national security” definitions of misuse of science and identify which, if any, are uniquely presented by advances in neuroscience. To define misuse, we first define what we would consider appropriate use: the application of reasonably safe and effective technology, based on valid and reliable scientific research, to serve a legitimate end. This definition presents distinct opportunities for assessing misuse: misuse is the application of invalid or unreliable science, or is the use of reliable scientific methods to serve illegitimate ends. Ultimately, we conclude that while national security is often a politicized issue, assessing the state of scientific progress should not be.
Today’s technological-scientific prospect of posthumanity simultaneously evokes and defies historical understanding. On the one hand, it implies a historical claim of an epochal transformation concerning posthumanity as a new era. On the other, by postulating the birth of a novel, better-than-human subject for this new era, it eliminates the human subject of modern Western historical understanding. In this article, I attempt to understand posthumanity as measured against the story of humanity as the story of history itself. I examine the fate of humanity as the central subject of history in three consecutive steps: first, by exploring how classical philosophies of history achieved the integrity of the greatest historical narrative of history itself through the very invention of humanity as its subject; second, by recounting how this central subject came under heavy criticism by postcolonial and gender studies in the last half-century, targeting the universalism of the story of humanity as the greatest historical narrative of history; and third, by conceptualizing the challenge of posthumanity against both the story of humanity and its criticism. Whereas criticism fragmented history but retained the possibility of smaller-scale narratives, posthumanity does not doubt the feasibility of the story of humanity. Instead, it necessarily invokes humanity, if only in order to be able to claim its supersession by a better-than-human subject. In that, it represents a fundamental challenge to the modern Western historical condition and the very possibility of historical narratives – small-scale or large-scale, fragmented or universal.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology greatly benefits application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this project we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase® are standardized in a framework of first-order logic. In this paper we describe how this standardization has already led to an improvement in the LinKBase® structure that allows for a greater degree of internal coherence than ever before possible. We then show the use of this philosophical standardization for the purpose of mapping external databases to one another, using LinKBase® as translation hub, with a greater degree of success than possible hitherto. We demonstrate how this offers a genuine advance over other application ontologies that have not submitted themselves to the demands of philosophical scrutiny. LinKBase® is one of the world’s largest applications-oriented medical domain ontologies, and BFO is one of the world’s first philosophically driven reference ontologies. The collaboration of the two thus initiates a new phase in the quest to solve the so-called “Tower of Babel” problem.
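A toy sketch may make the translation-hub idea concrete (the mappings, term codes, and function below are illustrative inventions, not LinKBase® data or the L&C API): two external vocabularies become intertranslatable once each is mapped into a standardized hub concept.

```python
# Toy sketch of an ontology used as a translation hub. The dictionaries and
# codes are placeholders, not actual LinKBase(R) contents or L&C interfaces.

SOURCE_A_TO_HUB = {"A:0001": "hub:MyocardialInfarction"}  # one external vocabulary
SOURCE_B_TO_HUB = {"B:9001": "hub:MyocardialInfarction"}  # another external vocabulary

def translate(term, source_map, target_map):
    """Return target terms that share the source term's hub concept."""
    hub_concept = source_map.get(term)
    if hub_concept is None:
        return []  # term lacks a standardized definition in the hub
    return [t for t, c in target_map.items() if c == hub_concept]

print(translate("A:0001", SOURCE_A_TO_HUB, SOURCE_B_TO_HUB))  # -> ['B:9001']
```

The design point is that each vocabulary needs only one mapping (into the hub) rather than pairwise mappings to every other vocabulary.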
This article is a fresh attempt to articulate the role of a theory of truthmakers. We argue that truthmaker theory constitutes a cornerstone of good methodology in metaphysics, but that the conflation of truthmaker theory with the theory of truth has been responsible for certain excesses associated with truthmaker-based approaches in the recent literature. We show that truthmaker theory retains its appeal as an instrument of metaphysical inquiry despite our agreement with (or at least our neutrality toward) deflationist doctrines such as those defended by Ayer, Quine, Field, and Horwich. We further argue that the intuitions underlying truthmaker theory become clearer when we dissociate them from a theory of truth and, above all, from the attempt to provide a definition of truth.
The theory and philosophy of history (just like philosophy in general) has established a dogmatic dilemma regarding the issue of language and experience: either you have an immediate experience separated from language, or you have language without any experiential basis. In other words, either you have an immediate experience that is and must remain mute and ineffable, or you have language and linguistic conceptualization that precedes experience, provides the condition of possibility of it, and thus, in a certain sense, produces it. Either you join forces with the few and opt for such mute experiences, or you go with the flow of narrative philosophy of history and the impossibility of immediacy. Either way, you end up postulating a mutual hostility between the nonlinguistic and language, and, more important, you remain unable to account for new insights and change. Contrary to this and in relation to history, I am going to talk about something nonlinguistic—historical experience—and about how such historical experience could productively interact with language in giving birth to novel historical representations. I am going to suggest that, under a theory of expression, a more friendly relationship can be established between experience and language: a relationship in which they are not hostile to but rather desperately need each other. To explain the occurrence of new insights and historiographical change, I will talk about a process of expression as sense-formation and meaning-constitution in history, and condense the theory into a struck-through “of,” as the expression of historical experience.
The collaboration of Language and Computing nv (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is guided by the hypothesis that quality constraints on ontologies for software application purposes closely parallel the constraints salient to the design of sound philosophical theories. The extent of this parallel has been poorly appreciated in the informatics community, and it turns out that importing the benefits of philosophical insight and methodology into application domains yields a variety of improvements. L&C’s LinKBase® is one of the world’s largest medical domain ontologies. Its current primary use pertains to natural language processing applications, but it also supports intelligent navigation through a range of structured medical and bioinformatics information resources, such as SNOMED-CT, Swiss-Prot, and the Gene Ontology (GO). In this report we discuss how and why philosophical methods improve both the internal coherence of LinKBase®, and its capacity to serve as a translation hub, improving the interoperability of the ontologies through which it navigates.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
The humanities and the social sciences have been hostile to future visions in the postwar period. The most famous victim of their hostility was the enterprise of classical philosophy of history, condemned to illegitimacy precisely because of its fundamental engagement with the future. Contrary to this attitude, in this essay I argue that there is no history (neither in the sense of the course of human affairs nor in the sense of historical writing) without having a future vision in the first place. History, its very possibility, begins in the future, in the postulation of a future where further change can take place. Our notions of history, change, and the future are interdependent; they come as one package, meaning that the abandonment of one entails the abandonment of the other two. As to the current situation, although it has lately become commonplace to diagnose our age as presentist, Western societies are deeply engaged in a vision of the future revolving around artificial intelligence and the prospect of technological singularity. This technological vision is best characterized as the prospect of unprecedented change, substantially differing from Enlightenment and nineteenth-century developmental visions of the future. If our notions of history, change, and the future are necessarily interdependent, and if we have a characteristically new future vision, it follows that our historical sensibility is already transformed and is accommodated to the prospect of unprecedented change. The ultimate aim of this essay is to outline this transformed historical sensibility of our technological age.
The rapidly growing transdisciplinary enthusiasm about developing new kinds of Anthropocene stories is based on the shared assumption that the Anthropocene predicament is best made sense of by narrative means. Against this assumption, this article argues that the challenge we are facing today does not merely lie in telling either scientific, socio-political, or entangled Anthropocene narratives to come to terms with our current condition. Instead, the challenge lies in coming to grips with how the stories we can tell in the Anthropocene relate to the radical novelty of the Anthropocene condition about which no stories can be told. What we need to find are meaningful ways to reconcile an inherited commitment to narrativization and the collapse of storytelling as a vehicle of understanding the Anthropocene as our current predicament.
The historical sensibility of Western modernity is best captured by the phrase “acting upon a story that we can believe.” Whereas the most famous stories of historians facilitated nation-building processes, philosophers of history told the largest possible story to act upon: history itself. When the rise of an overwhelming postwar skepticism about the modern idea of history discredited the entire enterprise, the historical sensibility of “acting upon a story that we can believe” fell apart into its constituents: action, story form, and belief in a feasible future outcome. Its constituent parts nevertheless still hold, either separately or in paired arrangements. First, believable stories are still told, but without an equally believable future outcome to facilitate. Second, in the shape of what I call the prospect of unprecedented change, there still is a feasible vision of a future (in prospects of technology and the Anthropocene), but it defies story form. And third, it is even possible to act upon that feasible future, but such action aims at avoiding worst-case scenarios instead of facilitating best outcomes. These are, I believe, features of an emerging postwar historical sensibility that the theory and philosophy of history has yet to understand.
This brief article is a discussion-starter on the question of the role and use of theories and philosophies of history. In the last few decades, theories of history typically intended to transform the practice of historical studies through a straightforward application of their insights. Contrary to this, I argue that they either bring about particular historiographical innovations in terms of methodology but leave the entirety of historical studies intact, or change the way we think about the entirety of historical studies merely by describing and explaining it in fresh and novel ways, without the need for application. In the former case, theories appear as internal to historical studies. In the latter case, they appear as theories about history. Such theories about history are no longer limited to studying history understood as historical writing. In reflecting on the historical condition of the ever-changing world, they foster a more fruitful cooperative relationship with the discipline of history. Discussing the scope and use of such theories of history is inevitable today when a younger generation sets out to theorize history against the backdrop of the experiential horizon of their own times.
Both short- and long-term video-game play may result in superior performance on visual and attentional tasks. To extend these findings, we compared the performance of experienced male video-game players (VGPs) and non-VGPs on a Simon task. Experienced VGPs began playing before the age of 10, had a minimum of 8 years of experience, and had a minimum play time of over 20 hours per week over the past 6 months. Our results reveal a significantly reduced Simon effect in experienced VGPs relative to non-VGPs. However, this was true only for right responses, which typically show a greater Simon effect than left responses. In addition, experienced VGPs demonstrated significantly quicker reaction times and more balanced left-versus-right-hand performance than non-VGPs. Our results suggest that experienced VGPs can resolve response-selection conflicts more rapidly for right responses than non-VGPs, and this may in part be underpinned by improved bimanual motor control.
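For readers unfamiliar with the measure, here is a minimal sketch of how a Simon effect is computed from reaction times (the numbers below are invented for illustration, not the study’s data):

```python
# Minimal sketch: the Simon effect is the mean reaction-time cost of trials
# where stimulus location conflicts with response side. Data are invented.
from statistics import mean

congruent_rt_ms = [412, 398, 405, 420]    # stimulus side matches response side
incongruent_rt_ms = [455, 448, 462, 439]  # stimulus side conflicts with response side

simon_effect = mean(incongruent_rt_ms) - mean(congruent_rt_ms)
print(f"Simon effect: {simon_effect:.1f} ms")  # smaller = faster conflict resolution
```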
In times of a felt need to justify the value of the humanities, the need to revisit and re-establish the public relevance of the discipline of history cannot come as a surprise. On the following pages I will argue that this need is unappeasable by scholarly proposals. The much desired revitalization of historical writing lies instead in reconciling ourselves with the dual meaning of the word history, in exploring the necessary interconnection between history understood as the course of events and as historical writing. Despite the general tendency of the last decades to forbid philosophizing about history in the former sense (at least in departments of history and philosophy), I think that to a certain extent we already do so without succumbing to substantive thought. We already have the sprouts of a speculative although only quasi-substantive philosophy of history that nevertheless takes seriously the postwar criticism of the substantive enterprise. In this essay I will first try to outline this quasi-substantive philosophy of history that attests to the historical sensibility of our times; and second, I will try to outline its consequences regarding history as historical writing. Finally, in place of a conclusion I will suggest that historical writing is not as much a contribution to public agendas as it is the very arena in which public life is at stake.
A brief examination of the self-negating quality of the precautionary principle within the context of environmental ethics, and of its consequent failure, as an ethical guide, to justify large-scale regulation of atmospheric carbon dioxide emissions.
This article reviews Herbert Simon's theory of bounded rationality, with a view to deemphasizing his "satisficing" model and, by contrast, emphasizing his distinction between "procedural" and "substantive" rationality. The article also discusses a possible move by neo-classical economists to respond to Simon's criticisms, i.e., a reduction of bounded rationality to a special case of second-order optimization, using Stigler's search theory. This move is eventually dismissed.
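To make the "second optimization" concrete, here is a standard textbook rendering of Stigler-style sequential search (the notation is illustrative, not necessarily the article's own): the searcher stops at a reservation value r at which the expected marginal gain of one more draw just equals its cost c.

```latex
% Reservation-value condition in sequential search: with per-draw cost c and
% offers drawn from density f, keep searching while the expected improvement
% over the best offer in hand exceeds c; the optimal cutoff r satisfies
c \;=\; \int_{r}^{\infty} (x - r)\, f(x)\, \mathrm{d}x
```

On this reading, "bounded" search behavior is itself the solution to a further, fully classical optimization problem, which is exactly the reduction the article examines and dismisses.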
Simon Blackburn has not shied away from the use of vivid imagery in developing, over a long and prolific career, a large-scale philosophical vision. Here one might think, for instance, of ‘Practical Tortoise Raising’ or ‘Ramsey's Ladder’ or ‘Frege's Abyss’. Blackburn develops a ‘quasi-realist’ account of many of our philosophical and everyday commitments, both theoretical (e.g., modality and causation) and practical (e.g., moral judgement and normative reasons). Quasi-realism aims to provide a naturalistic treatment of its targeted phenomena while earning the right to deploy all of the ‘trappings’ of realism—i.e., while eschewing any idea that our normal thought and talk about such phenomena are pervasively in error. The quasi-realist project is that of explaining how (as Huw Price puts it here) ‘the folk come to “talk the realist talk” without committing ourselves—us theorists, as it were—to “walking the metaphysical walk”’ (p. 136). Quasi-realism, too, can speak of truth, facts, properties, belief, knowledge, and so on. The imagery in this collection also abounds, though, in capturing a different view of quasi-realism: No fewer than three of the contributors picture Blackburn as wanting to have his cake and eat it too (Louise Antony asking, in addition, ‘Who doesn't? It's cake’ [p. 19]).
This chapter argues that Simon anticipated what has emerged as the consensus view about human cognition: embodied functionalism. According to embodied functionalism, cognitive processes appear at a distinctively cognitive level; types of cognitive processes (such as proving a theorem) are not identical to kinds of neural processes, because the former can take various physical forms in various individual thinkers. Nevertheless, the distinctive characteristics of such processes — their causal structures — are determined by fine-grained properties shared by various, often especially bodily related, physical processes that realize them. Simon’s apparently anti-embodiment views are surveyed and are shown to be consistent with his many claims that lend themselves to an embodied interpretation and that, to a significant extent, helped to lay the groundwork for an embodied cognitive science.
In his recent book ‘Experiencing Time’, Simon Prosser discusses a wide variety of topics relating to temporal experience, in a way that is accessible both to those steeped in the philosophy of mind and to those more familiar with the philosophy of time. He forcefully argues for the conclusion that the B-theorist of time can account for the temporal appearances. In this article, I offer a chapter-by-chapter response.
Review of: Menachem Fisch; Simon Schaffer (Editors). William Whewell: A Composite Portrait. xiv + 403 pp., bibl., index. Oxford: Clarendon Press of Oxford University Press, 1991. $98.
Albert Lautman. Mathematics, Ideas and the Physical Real. Simon B. Duffy, trans. London and New York: Continuum, 2011. 978-1-4411-2344-2 (pbk); 978-1-4411-4656-4 (hbk); 978-1-4411-4433-1 (pdf e-bk); 978-1-4411-4654-0 (epub e-bk). Pp. xlii + 310.
One of the most celebrated mathematical physicists, Pierre-Simon Laplace is often remembered as the mathematician who showed that despite appearances, the Solar System does conform to Newton’s theories. Together with distinguished scholars Robert Fox and Ivor Grattan-Guinness, Charles Gillispie gives us a new perspective, showing that Laplace did not merely vindicate Newton’s system, but had a uniquely creative and independent mind.
The goal of this paper is to critically examine the objections of John Locke’s contemporaries against the theory of substance or substratum. Locke argues in the Essay that substratum is the bearer of the properties of a particular substance. Locke also claims that we have no knowledge of substratum. But Locke’s claim about our ignorance as to what substratum is remains contentious. That is, if we don’t know what substratum is, then what is the point of proposing it as a bearer of properties? This question underlies the criticism Locke’s contemporaries raise against the notion of substratum. In section I, I lay out the context for Locke’s theory of substratum by pointing out his main motivation in proposing his theory. In section II, I give a brief analysis of the theory of substratum. In section III, I discuss the objections of Locke’s contemporaries against the theory of substratum. I focus on Edward Stillingfleet, Henry Lee, G. W. Leibniz and John Sergeant. In section IV, I conclude that there is no warrant to dismiss Locke’s theory of substance.
If the import of a book can be assessed by the problem it takes on, how that problem unfolds, and the extent of the problem’s fruitfulness for further exploration and experimentation, then Duffy has produced a text worthy of much close attention. Duffy constructs an encounter between Deleuze’s creation of a concept of difference in Difference and Repetition (DR) and Deleuze’s reading of Spinoza in Expressionism in Philosophy: Spinoza (EP). It is surprising that such an encounter has not already been explored, at least not to this extent and in this much detail. Since the two works were written simultaneously, as Deleuze’s primary and secondary dissertations, it is to be expected that there is much to learn from their interaction. Duffy proceeds by explicating, in terms of the differential calculus, a logic of what Deleuze in DR calls different/ciation, and then maps this onto Deleuze’s account of modal expression in EP.
In probability discounting (or probability weighting), one multiplies the value of an outcome by one's subjective probability that the outcome will obtain in decision-making. The broader import of defending probability discounting is to help justify cost-benefit analyses in contexts such as climate change. This chapter defends probability discounting under risk both negatively, against arguments by Simon Caney (2008, 2009), and with a new positive argument. First, in responding to Caney, I argue that small costs and benefits need to be evaluated, and that viewing practices at the social level is too coarse-grained. Second, I argue for probability discounting, using a distinction between causal responsibility and moral responsibility. Moral responsibility can be cashed out in terms of blameworthiness and praiseworthiness, while causal responsibility obtains in full for any effect which is part of a causal chain linked to one's act. With this distinction in hand, moral responsibility, unlike causal responsibility, can be seen as coming in degrees. My argument is that, given that we can limit our deliberation and consideration to that for which we are morally responsible, and that our moral responsibility for outcomes is limited by our subjective probabilities, our subjective probabilities can ground probability discounting.
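Spelled out as a formula (the notation is mine, not the chapter's), probability discounting values an act by its outcomes' values weighted by the agent's subjective probabilities:

```latex
% Probability-discounted value of act a over outcomes o, with subjective
% probability Pr_s and outcome value v (illustrative notation):
V(a) \;=\; \sum_{o} \Pr\nolimits_{s}(o \mid a)\, v(o)
```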
The professor of psychopathology Simon Baron-Cohen is well-known for his thesis that males are on average better at systematizing than empathizing and females are on average better at empathizing than systematizing. In this paper, I note an ambiguity in how he defines systematizing.
Simon Blackburn objects that Wittgenstein's private language argument overlooks the possibility that a private linguist can equip himself with a criterion of correctness by confirming generalizations about the patterns in which his private sensations occur. Crispin Wright responds that appropriate generalizations would be too few to be interesting. But I show that Wright's calculations are upset by his failure to appreciate both the richness of the data and the range of theories that would be available to the private linguist.
The author argues that Plato’s “proof” that happiness follows justice has a fatal flaw: the philosopher king in Plato’s Republic is itself a counterexample.
The aim of this paper is to trace the development of a particular current of thought known under the label ‘pragmatism’ in the last part of the Twentieth century and the beginning of the Twenty-first. I address three questions about this current of thought. First, what is its actual historical development? Second, does it constitute a single, coherent, philosophical outlook? Third, in what form, if any, does it constitute an attractive philosophical outlook? In the course of addressing these questions I identify a potential tension between the two main proponents of this current of thought (Huw Price and Simon Blackburn), concerning the attitude they take to what I refer to as their ‘naturalist master narrative’.
In nearly forty years of work, Simon Blackburn has done more than anyone to expand our imaginations about the aspirations for broadly projectivist/expressivist theorizing in all areas of philosophy. I know that I am far from alone in that his work has often been a source of both inspiration and provocation for my own work. It might be tempting, in a volume of critical essays such as this, to pay tribute to Blackburn’s special talent for destructive polemic, by seeking to take down that by which I’ve been most provoked over the years. But Blackburn’s biting wit has both more wit and more bite than I could hope to emulate. So instead I’ll try to emulate here what I’ve admired the most about Blackburn: the constructive vein of much of his work.
The professor of psychopathology Simon Baron-Cohen claims that males are on average stronger at systematizing than empathizing and females are on average stronger at empathizing than systematizing. Systematizing is defined as the drive to construct or understand systems. In this paper, I observe that Baron-Cohen overlooks certain examples of systems, examples which lead to doubts about his claim.