We argue that the notion of "mental institutions," discussed in recent debates about extended cognition, can help us better understand the origin and character of social impairments in autism, and can also help illuminate the extent to which some mechanisms of autistic dysfunction extend across both internal and external factors (i.e., they do not just reside within an individual's head). After providing some conceptual background, we discuss the connection between mental institutions and embodied habits of mind. We then discuss the significance of our view for understanding autistic habits of mind and consider why these embodied habits are sometimes a poor fit with neurotypical mental institutions. We conclude by considering how these insights highlight the two-way, extended nature of social impairments in autism, and how this extended picture might assist in constructing more inclusive mental institutions and intervention strategies.
In my paper "Controlling the noise", I present a phenomenological investigation of bodily experience in anorexia nervosa (AN). Turning to descriptions by those who have suffered from AN, which repeatedly describe the experience of finding their bodies threatening, out of control and noisy, I suggest that the phenomenological conceptions of body-as-object, body-as-subject and visceral body can help us unpack the complex bodily experience of AN throughout its various stages. My claim is that self-starvation is enacted by a bodily subject who wishes to quell or reassert authority over a visceral body whose demands and needs she finds threatening to her autonomy.

In response to the generous and insightful commentaries of Michelle Maiese and Drew Leder, I explore how their suggestions can further enrich our understanding of the visceral body and how it is experienced in AN.
The surface grammar of reports such as ‘I have a pain in my leg’ suggests that pains are objects which are spatially located in parts of the body. We show that the parallel construction is not available in Mandarin. Further, four philosophically important grammatical features of such reports cannot be reproduced. This suggests that arguments and puzzles surrounding such reports may be tracking artefacts of English, rather than philosophically significant features of the world.
While social geographers have convincingly made the case that space is not an external constant, but rather is produced through inter-relations, anthropologists and sociologists have done much to further an understanding of time, as itself constituted through social interaction and inter-relation. Their work suggests that time is not an apolitical background to social life, but shapes how we perceive and relate to others. For those interested in exploring issues such as identity, community and difference, this suggests that attending to how temporal discourses are utilised in relation to these issues is a key task. This article seeks to contribute to an expansion of the debate about time and sociality by contributing an analysis of a variety of ways in which Gloria Anzaldúa utilises temporal concepts as part of her work of rethinking social identity and community. In particular, I suggest that in contesting homogeneous identity, Anzaldúa also implicitly contests linear temporal frameworks. Further, in creating new frameworks for identity, I suggest the possibility of discerning an alternative approach to time in her work that places difference at the heart of simultaneity. I suggest that the interconnection between concepts of time and community within Anzaldúa’s work indicates, more broadly, that attempts to rework understandings of relationality must be accompanied by reworked accounts of temporality.
Although conceptually distinct, ‘time’ and ‘community’ are multiply intertwined within a myriad of key debates in both the social sciences and the humanities. Even so, the role of conceptions of time in social practices of inclusion and exclusion has yet to achieve the prominence of other key analytical categories such as identity and space. This article seeks to contribute to the development of this field by highlighting the importance of thinking time and community together through the lens of political apologies. Often ostensibly offered in order to re-articulate both the constitution of ‘the community’ and its future direction, official apologies are prime examples of deliberate attempts to intervene in shared understandings of political community and its temporality. Offering a detailed case study of one of these apologies, I will focus on Australian debates over the removal of indigenous children from their families, known as the Stolen Generations, and examine the temporal dimensions of the different responses offered by former prime ministers John Howard and Kevin Rudd.
In the philosophy of mind, revelation is the claim that the nature of qualia is revealed in phenomenal experience. In the literature, revelation is often thought of as intuitive but in tension with physicalism. While mentions of revelation are frequent, there is room for further discussion of how precisely to formulate the thesis of revelation and what exactly it amounts to. Drawing on the work of David Lewis, this paper provides a detailed discussion of how the thesis of revelation, as well as its incompatibility with physicalism, is to be understood.
Focusing particularly on the role of the clock in social life, this article explores the conventions we use to “tell the time.” I argue that although clock time generally appears to be an all-encompassing tool for social coordination, it is actually failing to coordinate us with some of the most pressing ecological changes currently taking place. Utilizing philosophical approaches to performativity to explore what might be going wrong, I then draw on Derrida’s and Haraway’s understandings of social change in order to suggest a fairly unconventional, but perhaps more accurate, mode of reckoning time in the context of climate change, resource depletion, and mass extinctions.
Since the human genome was decoded, great emphasis has been placed on the unique, personal nature of the genome, along with the benefits that personalized medicine can bring to individuals and the importance of safeguarding genetic privacy. As a result, an equally important aspect of the human genome – its common nature – has been underappreciated and underrepresented in the ethics literature and policy dialogue surrounding genetics and genomics. This article will argue that, just as the personal nature of the genome has been used to reinforce individual rights and justify important privacy protections, so too the common nature of the genome can be employed to support protections of the genome at a population level and policies designed to promote the public's wellbeing. In order for public health officials to have the authority to develop genetics policies for the sake of the public good, the genome must have not only a common, but also a public, dimension. This article contends that DNA carries a public dimension through the use of two conceptual frameworks: the common heritage (CH) framework and the common resource (CR) framework. Both frameworks establish a public interest in the human genome, but the CH framework can be used to justify policies aimed at preserving and protecting the genome, while the CR framework can be employed to justify policies for utilizing the genome for the public benefit. A variety of possible policy implications are discussed, with special attention paid to the use of large-scale genomics databases for public health research.
P.F. Strawson famously suggested that employment of the objective attitude in an intimate relationship forebodes the relationship’s demise. Relatively less remarked is Strawson's admission that the objective attitude is available as a refuge from the strains of relating to normal, mature adults as proper subjects of the reactive attitudes. I develop an account of the strategic employment of the objective attitude in such cases according to which it denies a person a power of will – authorial power – whose recognition is necessary for sustaining intimacy. This conception of the objective attitude in its strategic employment presses those who urge universal adoption of the objective attitude (perhaps in its nonreactive, emotionally toned species) to confront the costs of their proposal.
Michelle Marder Kamhi offers an in-depth response to The Aesthetics Symposium. In addition to answering many of the contributors’ objections to What Art Is: The Esthetic Theory of Ayn Rand, she offers a critique of their own theses—in particular, Barry Vacker’s claim that chaos theory is implicit in Rand’s aesthetics, Jeff Riggenbach’s argument that much of Rand’s theory was anticipated by Susanne Langer and Stephen Pepper, and Roger Bissell’s suggestion that the concept of a microcosm be applied to Rand’s view of the function of art.
In this chapter, I bring insights from the social sciences, about the role of time in exclusionary practices, into debates around the under-representation of women in philosophy. I will suggest that part of what supports the exclusionary culture of philosophy is a particular approach to time, and thus that changing this culture requires that we also change its time.
The principle of subsidiarity is a multi-layered and flexible principle that can be utilised to empower, inform, enhance and reform scholarship in a range of significant areas; however, it has been somewhat overlooked in recent scholarship. In order to highlight the continued relevance and potential applications of the principle, this, the first of two papers, will provide a detailed analysis of the meaning and application of the principle of subsidiarity in Catholic social teaching. In doing so, the interplay of the principle of subsidiarity and other key principles of Catholic social teaching, such as the dignity of the person, solidarity, and the common good, will be highlighted. The second paper discusses the political applications of the principle, including its ability to inform scholarship on the allocation of governmental powers (including federalism), democracy, and individual participation in government. This leads to a discussion, in the second paper, of the Catholic aspects of subsidiarity in the governance of the European Union.
This paper is a response to Val Plumwood's call for writers to engage in ‘the struggle to think differently’. Specifically, she calls on writers to engage in the task of opening up an experience of nature as powerful and as possessing agency. I argue that a critical component of opening up who or what can be understood as possessing agency involves challenging the conception of time as linear, externalised and absolute, particularly inasmuch as it has guided Western conceptions of process, change and invention. I explore this through anthropologist Carol Greenhouse's claim that social conceptions of time can be read as theories of agency. Thus, in seeking to respond to Plumwood’s call to think differently, the question becomes: what kind of writing would enable a fundamental re-thinking of agency without, however, ignoring the way Western notions of agency have been shaped by linear accounts of time? I look to Jacques Derrida's work as one example. I first locate the possibility of re-writing time and agency in the experiential aspects of his writing, which I argue interrupt both the reader’s sense of agency and linear models of reading. But further, I connect Derrida’s work directly with Plumwood’s by examining how his deconstruction of the Western concept of invention may enable another account of creative change that could reshape what counts as ‘agency’ within the Anthropocene.
This article explores Donna Haraway’s overlooked theories of coalition-building along with the tactics of transversalism. I initially outline Haraway’s contributions and discuss why the cyborg of coalition has been ignored. I then relate this work to transversal politics, a form of coalition-building that acknowledges both the need for more open understandings of the subject and also the threatening circumstances that form these ‘hybrid’ subjects. The intriguing alliance that can be formed between them offers ways of dealing with the fears and self-defensiveness that often stand in the way of creating effective political groups. Having explored the tactics that could productively structure interactions between Haraway’s cyborg actors, I broaden this out to discuss community more generally, in order to further explore the possible applications of these theories of coalition. Finally, through the work of Linnell Secomb and Jean-Luc Nancy, I suggest a vision of a community of cyborgs.
In June 2002, Arthur Andersen LLP became the first accounting firm in history to be criminally convicted. The repercussions were immense. From a position as one of the leading professional services firms in the world, with 85,000 staff in 84 countries and revenues in excess of $9 billion, Andersen effectively ceased to exist within a matter of months. Although Andersen’s conviction related specifically to a charge of obstructing justice, public attention focused on the audit relationship between Andersen and its major client, Enron Corporation, particularly the actions that had allowed Enron to post spectacular year-on-year earnings and profit growth. As well as examining events leading up to the demise of Andersen, the case provides an opportunity to consider the broader controversy over accounting and corporate governance practices and, more generally, the pressures found within organisations that can foster unethical conduct. The case was prepared from public sources.
Human resource management (HRM) education has tended to focus on specific functions and tasks within organizations, such as compensation, staffing, and evaluation. This task orientation within HRM education fails to account for the bigger questions facing human resource management and employment relationships, questions which address the roles and responsibilities of the HR function and HR practitioners. An educational focus on HRM that does not explicitly address larger ethical questions fails to equip students to address stakeholder concerns about how employees are treated or the ethical dilemmas facing employers with regard to the employment relationship, and ironically makes the HRM function less strategic to the organization. In this paper, we identify some of the key ethical issues within the employment relationship, discuss how extant HRM education often fails to address these issues or help students to become aware of them, and offer a framework for integrating ethics into HRM education.
In the law of rape, consent has been and remains a gendered concept. Consent presumes female acquiescence to male sexual initiation. It presumes a man desires to penetrate a woman sexually. It presumes the woman willingly yields to the man's desires. It does not presume, and of course does not require, female sexual desire. Consent is what the law calls it when he advances and she does not put up a fight. I have argued elsewhere that the kind of thin consent that the law focuses on is not enough ethically and it should not be enough legally to justify sexual penetration. I advocate sexual negotiation, where individuals discuss sexual desires and boundaries and agree to engage in penetration before it occurs, except under circumstances in which the partners have a reasonable basis to assess one another's nonverbal behavior. I argue that not only is verbal consultation about desire ordinarily ethically necessary before most acts of sexual penetration, it should be legally required. Consultation to ascertain sexual desires and boundaries assures that both parties desire penetration.
Across a wide range of cultural forms, including philosophy, cultural theory, literature and art, the figure of the clock has drawn suspicion, censure and outright hostility. In contrast, even while maps have been shown to be complicit with forms of domination, they are also widely recognised as tools that can be critically reworked in the service of more liberatory ends. This paper seeks to counteract the tendency to see clocks in this way, arguing that they have many more interesting possibilities than they are usually given credit for. An analysis of approaches to clocks in continental philosophy critiques the way they have too often been dismissed as unworthy of further analysis, and argues that this dismissal is based upon an inadequate understanding of how clocks operate. Seeking to move towards more critical and curious approaches, the paper draws inspiration from critical cartography in order to call for the development of a ‘critical horology’ which would emphasise both the fundamentally political nature of clocks, and the potential for designing them otherwise. A discussion of temporal design provides a range of examples of how clocks might open up new horizons within the politics of time.
This paper is the second of two papers which examine the versatility of the principle of subsidiarity. The first paper explored the nature of the principle in Catholic social teaching as a moral and social principle and its potential application in the political sphere. This paper further explores the political application of the principle of subsidiarity through a discussion of its operation in the European Union, where it is embodied in article 5(3) of the Treaty on European Union. This paper discusses subsidiarity’s interpretation by the European Court of Justice as a political value judgement, rather than a legal principle. In its discussion of subsidiarity in the European Union, this paper draws some comparisons with the principle’s enunciation in Catholic social teaching. Together, these papers are intended to highlight the many facets of the principle of subsidiarity in order to promote its continued relevance and to promote further scholarship on subsidiarity.
This commentary focuses on explaining the intuition of revelation, an issue that Chalmers (2018) raises in his paper. I first sketch how the truth of revelation provides an explanation for the intuition of revelation, and then assess a physicalist proposal to explain the intuition that appeals to Derk Pereboom’s (2011, 2016, 2019) qualitative inaccuracy hypothesis.
The paradox of pain refers to the idea that the folk concept of pain is paradoxical, treating pains as simultaneously mental states and bodily states (e.g. Hill 2005, 2017; Borg et al. 2020). By taking a close look at our pain terms, this paper argues that there is no paradox of pain. The air of paradox dissolves once we recognise that pain terms are polysemous and that there are two separate but related concepts of pain rather than one.
In a recent paper, Reuter, Sienhold and Sytsma put forward an implicature account to explain the intuitive failure of the pain-in-mouth argument. They argue that utterances such as ‘There is tissue damage / a pain / an inflammation in my mouth’ carry the conversational implicature that there is something wrong with the speaker’s mouth. Appealing to new empirical data, this paper argues against the implicature account and for the entailment account, according to which pain reports using locative locutions, such as ‘There is a pain in my mouth’, are intuitively understood as entailing corresponding predicative locutions, such as ‘My mouth hurts.’ On this latter account, the pain-in-mouth argument seems invalid because the conclusion is naturally understood as entailing something which cannot be inferred from the premisses. Implications for the philosophical debate about pain are also drawn.
In Book 10 of the Republic, Plato launches an extensive critique of art, claiming that it can have no legitimate role within the well-ordered state. While his reasons are multifaceted, Plato’s primary objection to art rests on its status as a mere shadow of a shadow. Such shadows inevitably lead the human mind away from the Good, rather than toward it. However, after voicing his many objections, Plato concedes that if art “has any arguments to show it should have a place in a well-governed city, [he] would gladly welcome it back.” Over two millennia later, the nineteenth-century Russian philosopher Vladimir Solov’ev implicitly responded to this challenge in his Lectures on Godmanhood (1881). Solov’ev cited the phenomenon of art as additional proof in favor of his model of the metaphysical foundations of reality. According to Solov’ev, art is not three steps removed from ultimate reality; rather, an artist creates true art only when he has experienced a vision of the universal and substantial ideas that stand over and above particular things, and then conveys them to the viewer directly, via the artistic medium. Hence, the artist is able to sidestep the intermediate shadow and produce something that is more than a shadow—a clear reflection of that higher reality. If Solov’ev is correct, the artist should enjoy the elevated status of sage, perhaps even philosopher-king, rather than face exile from Plato’s republic, because the artist both knows the Good and guides the less enlightened toward it. After a brief sketch of the metaphysical grounds for Plato’s critique of art, I provide an analysis of Solov’ev’s ontology, as represented in his Lectures on Godmanhood. Next, I describe Solov’ev’s concept of the three-fold mission of art and its relationship to human nature, drawing both from the Lectures and from The Universal Meaning of Art (1890). Finally, in the last section, I demonstrate how the aforementioned account comprises Solov’ev’s robust and successful response to Plato’s challenge, from within a Platonic framework.
Theistic activism and theistic conceptual realism attempt to relieve the tension between transcendent realism about universals and a strong aseity-sovereignty doctrine. Paradoxically, both theories seem to imply that God is both metaphysically prior and metaphysically posterior to his own nature. In this paper I critique one attempt to respond to this worry and offer a neo-Augustinian solution in its place. I demonstrate that Augustine’s argument for forms as ideas in the mind of God strongly suggests that only created beings need universals to ground their character. For them, divine concepts can do all of the work that universals are typically invoked to do in the contemporary literature. An uncreated being’s character needs no such grounding and can be accounted for in terms of his own concepts. If this is correct, theists may be realists about universals while maintaining the traditional reading of God’s aseity and sovereignty.
There has been little discussion of the compatibility of Theistic Conceptual Realism (TCR) with the doctrine of divine simplicity (DDS). On one hand, if a plurality of universals is necessary to explain the character of particular things, there is reason to think this commits the proponent of TCR to the existence of a plurality of divine concepts. So the proponent of the DDS has a prima facie reason to reject TCR (and vice versa). On the other hand, many mediaeval philosophers accept both the existence of divine ideas and the DDS. In this paper I draw on mediaeval and contemporary accounts of properties and divine simplicity to argue that the two theories are not logically incompatible.
Kosch attempts to show that post-Kantian German idealism duplicates and exacerbates a kind of intelligible determinism that is incompatible with a muscular conception of human freedom. Schelling, in his Freedom essay of 1809, finally recognized this, and his attempt to reconfigure idealism from within was motivated by his recognition of the need to provide a place for human freedom. The attempt failed (even if interestingly) but is taken up again, and more successfully, by Kierkegaard. While the account of Kant draws on a quite familiar interpretive matrix, the application of this matrix to Schelling and Kierkegaard is novel, fruitful, compelling and rigorously laid out.
Part 1 of 2, this is an introductory critical review of Michelle Alexander's "The New Jim Crow: Mass Incarceration in the Age of Colorblindness" (The New Press, 2010). See part 2, "Toward Détournement of The New Jim Crow", for an advanced critical reading.
In three empirical studies we examined how people reason about prior convictions in child abuse cases. We tested whether the disclosure of similar prior convictions prompts a mental representation or an additive probative value (Criminal Justice Act, 2003). Asymmetrical use of similar priors was observed in all three studies. A pilot study showed that disclosure of a second prior did not contribute a weight equivalent to that of the first disclosure. Study 1 showed that jurors did not see left-handed evidence (i.e., matching victim bruising) as more indicative of guilt than right-handedness unless a prior conviction was present, and that the presence of priors suppressed the generation of alternative possibilities indicative of innocence. Study 2 showed that disclosure did not decrease community ratings of reoffending propensity and dangerousness as much as a similar prior conviction increased them. We consider the results in the context of a new psychological theory of prior conviction bias and the consequences for the implementation of Section 100 of the Criminal Justice Act (2003).
Many theories in philosophy, law, and psychology make no distinction in meaning between causing and enabling conditions. Yet, psychologically, people readily make such distinctions every day. In this paper we report three experiments showing that individuals distinguish between causes and enabling conditions in brief descriptions of wrongful outcomes. Respondents rate actions that bring about outcomes as causes, and actions that make possible the causal relation as enablers. Likewise, causers (as opposed to enablers) are rated as more responsible for the outcome, as liable to longer prison sentences, and as liable to pay higher fines. Moreover, the more actors involved, the more blame volume there is psychologically to be apportioned between them. The implication is that theories and the law in practice, both criminal and civil, may dangerously mismatch the intuitions of those to whom they are supposed to apply. The findings are discussed in light of contemporary psychological theories of how people reason about cause.

The paper presents the extended findings supplementary to the conference paper: Frosch, C. A., Johnson-Laird, P. N., & Cowley, M. (2007). Don’t blame me your Honor, I’m only the enabler. Proceedings of the Twenty-Ninth Annual Conference of the Cognitive Science Society, p. 1755. Mahwah, NJ: Erlbaum. Nashville, USA. The main findings are now cited in the Oxford Handbook of Causal Reasoning.
Falsification may demarcate science from non-science as the rational way to test the truth of hypotheses. But experimental evidence from studies of reasoning shows that people often find falsification difficult. We suggest that domain expertise may facilitate falsification. We consider new experimental data about chess experts’ hypothesis testing. The results show that chess masters were readily able to falsify their plans. They generated move sequences that falsified their plans more readily than novice players, who tended to confirm their plans. The finding that experts in a domain are more likely to falsify their hypotheses has important implications for the debate about human rationality.
This paper provides a detailed technical protocol analysis of chess masters' evaluative expertise, paying particular attention to the structure of their memory process in evaluating foreseen possibilities in games of dynamic equilibrium. The paper has two purposes: first, to publish a results chapter from my DPhil thesis (in revised journal-article form) on the measurement of foresight in chess masters' evaluation process, testing alternative theories of cognitive expertise in the domain of chess; and second, to provide the subset of the technical graphical analysis that corresponds to that measurement, preserving this protocol analysis for access in the academic domain for future studies of expert memory and foresight (e.g., Ericsson & Simon, 1993). The step-by-step protocol analysis consists of: (i) an introduction to foresight cognition as hypothesis testing; (ii) a theoretical review of chess masters' expertise according to the frameworks in that field that propose hypotheses relevant to chess masters' evaluative skill processes; and (iii) summary tables and non-parametric statistical analysis corroborating chunking-theory frameworks of expert cognition (e.g., DeGroot, 1965; Newell & Simon, 1972; Gobet, 1998; Gobet et al., 2004) and refuting the alternative search-evaluation models (e.g., Holding & Reynolds, 1982). Moreover, the article argues for preserving the traditional protocol analysis method core to the field of expert cognition (DeGroot, 1969; Kotov, 1971). The full protocol analysis can be found in monograph form on my SSRN profile in ‘The role of falsification in hypothesis testing’. It takes the form of a specialist population study (e.g., detailed case study work; Luria, 1987).
The outline thus consists of a short introduction; a theoretical and methodological review discussing protocol analysis methods for specialist population studies in cognition (with particular attention to the preservation of protocol analysis methods for chess studies in cognition and expert memory, and with a fresh angle on the foresight process); and the full set of protocol analyses with corresponding problem behaviour graphs. A subset of the main results has been published elsewhere (e.g., Cowley & Byrne, 2004; Cowley, 2006), receiving scientific and journalistic acclaim (e.g., Nature Online News, 2004).
Two main cognitive theories predict that people find refuting evidence that falsifies their theorising difficult, if not impossible, to consider, even though such reasoning may be pivotal to grounding their everyday thoughts in reality (i.e., Poletiek, 1996; Klayman & Ha, 1987). In the classic 2-4-6 number sequence task devised by psychologists to test such reasoning skills in a simulated environment, people fail the test more often than not. In the 2-4-6 task, participants try to discover which rule the number triple 2-4-6 conforms to. The rule is ‘ascending numbers’, but it is tricky to discover. Participants tend to generate hypotheses with the properties of the 2-4-6 triple, for example, ‘even numbers ascending in twos’. They must search for evidence to test whether their hypothesis is the rule, but experimental evidence has shown that they tend to generate confirming triples that they expect to be consistent with their hypothesis rather than inconsistent, falsifying triples. Counter to the two main hypothesis-testing theories, this paper demonstrates that falsification is possible in five 2-4-6 task experiments when participants consider an Imaginary Participant’s hypothesis. Experiments 1 and 2 show that competition with an opponent hypothesis tester facilitates falsification. Experiments 3 to 5 show that the consideration of an alternative hypothesis helps this falsification of hypotheses lead to rule discovery. The implications of the results for theories of hypothesis testing and reasoning are discussed.
Each day people are presented with circumstances that may require speculation. Scientists may ponder questions such as why a star is born or how rainbows are made, psychologists may ask social questions such as why people are prejudiced, and military strategists may imagine what the consequences of their actions might be. Speculations may lead to the generation of putative explanations called hypotheses. But it is by checking whether hypotheses accurately reflect the encountered facts that people arrive at the sensible behaviour demonstrating a true understanding. If evidence shows a hypothesis to be false, then people should rationally abandon it, especially if there are negative consequences. The aim of this thesis is to examine how effectively people search for evidence when testing whether their hypotheses are true or false in competitive games.

Research findings from six studies of hypothesis testing behaviour in competitive deductive tasks are explored. Chapter by chapter, the thesis tests how everyday people, and master chess players, tackle hypothesis testing in mathematical tasks, such as how to solve sequential number puzzles when thinking about an opponent, or how to solve chess problems in a variety of contexts. The implications of the results are discussed in light of aspects of general cognition, such as reasoning, social hypothesis testing and planning.
Intended and merely foreseen consequences: The psychology of the ‘cause or allow’ offence. A short report for the Socio-Legal Community on ESRC Grant RES-000-22-3114.
According to the mental model theory, causes and enablers differ in meaning, and therefore in their logical consequences (Goldvarg & Johnson-Laird, 2001). They are consistent with different possibilities. Recent psychological studies have argued to the contrary, and suggested that linguistic cues guide this distinction (Kuhnmünch & Beller, 2005). The issue is important because neither British nor American law recognizes this distinction (e.g., Roberts & Zuckerman, 2004). Yet, in our view, it is central to human conceptions of causality. Hence, in two experiments, we examined our participants’ ability to distinguish between causes and enablers in scenarios describing the actions of two agents and a subsequent outcome, e.g.: ‘Mary threw a lighted cigarette into a bush. Just as the cigarette was going out, Laura deliberately threw petrol on it. The resulting fire burnt down her neighbor’s house.’ Here Mary enabled the fire to occur, whereas Laura caused the fire to occur.
This summary note series outlines legal empirical approaches to the study of juries and jury decision-making behaviour for undergraduate students of sociology, criminology and legal systems, and forensic psychology. The series is divided into two lectures. The first lecture attends to the background relevant to the historical rise of juries and the socio-legal methodologies used to understand jury behaviour. The second lecture attends to questions surrounding jury competence, classic studies illustrative of juror bias, and a critical comparison of juries with legal alternatives not reliant on jury deliberation for judicial process. Where appropriate, the series indicates key readings relevant to each core component, for students to develop their understanding in self-study time.
This paper presents empirical findings from a set of reasoning and mock jury studies presented at the Experimental Psychology Oxford Seminar Series (2010) and the King's Bench Chambers KBW Barristers Seminar Series (2010). The presentation asks the following questions and presents empirical answers using the Lenses of Evidence Framework (Cowley & Colyer, 2010; see also van Koppen & Wagenaar, 1993):

- Why is mental representation important for psychology?
- Why is mental representation important for evidence law?
- Lens 1: the self representation (key findings)
- Lens 2: the expert representation (key findings)
- Lens 3: the anchor representation (key findings)
- Conclusions and future directions

The research series explores how people represent evidence in mind, and presents key findings now cited in the following literatures: Philosophy of Science, Cognitive Expertise, Behavioural Economics, Cognitive Science, Psychology and Public Policy, and Causation and the Law.
There are two competing theoretical frameworks with which cognitive science examines how people reason. These frameworks are broadly categorized into logic and probability. This paper reports two applied experiments testing which framework better explains how people reason about evidence in criminal cases. Logical frameworks predict that people derive conclusions from the presented evidence to endorse an absolute value of certainty such as ‘guilty’ or ‘not guilty’ (e.g., Johnson-Laird, 1999). Probabilistic frameworks predict instead that people derive conclusions from the presented evidence using knowledge of prior instances, endorsing a conclusion of guilt that varies in certainty (e.g., Tenenbaum, Griffiths, & Kemp, 2006). Experiment 1 showed that reasoning about evidence of prior instances, such as disclosed prior convictions, affected participants’ underlying ratings of guilt: guilt ratings increased in certainty with the number of disclosed prior convictions. Experiment 2 showed that reasoning about evidence of prior convictions together with some forensic evidence tended to lead participants to endorse biased ‘guilty’ verdicts when, rationally, the evidence does not prove guilt. Both results are predicted by probabilistic frameworks. The paper considers the implications of logical and probabilistic frameworks for reasoning in the real world.
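The probabilistic prediction can be illustrated with a toy Bayesian update in odds form. All numbers here are hypothetical illustrations, not the paper’s data; in particular, the assumption that each disclosed prior conviction is weighted as 2:1 evidence for guilt is made up for the sketch:

```python
# Hedged sketch: a probabilistic framework treats a guilt judgment as a
# posterior probability that rises with evidence of prior instances,
# rather than as a binary logical verdict. All figures are illustrative.

def posterior_guilt(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

baseline = 0.10  # assumed belief in guilt before any disclosures
for n_convictions in range(4):
    # Assume each disclosed prior conviction is weighted as 2:1 for guilt,
    # so n disclosures contribute a likelihood ratio of 2**n.
    p = posterior_guilt(baseline, 2 ** n_convictions)
    print(n_convictions, round(p, 3))
```

The printed posteriors increase monotonically with the number of disclosed convictions, mirroring the graded certainty that probabilistic frameworks predict and logical (absolute-verdict) frameworks do not.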
This notebook presents an introductory overview of the cognitive perspective on the psychology of human behaviour for social science students. Starting with an introduction to cognitive developmental theories of how babies reason, the overview moves on to discuss how children develop into better thinkers. Adult theories of cognition are then outlined and critically evaluated.

The chronology of topics includes: the rise of ‘this thing we call cognition’; Piaget’s theory of cognitive development and its evaluation; problem space theory; and theories of mental representation in adult thought, examining, amongst other types of thinking and reasoning, deduction and induction, with an evaluation of mental representation theories.
DNA evidence is one of the most significant modern advances in the search for truth since cross-examination, but its format as a random-match probability makes it difficult for people to assign it an appropriate probative value (Koehler, 2001). While frequentist theories propose that presenting the match as a frequency rather than a probability facilitates more accurate assessment (e.g., Slovic et al., 2000), Exemplar-Cueing Theory predicts that the subjective weight assigned may be affected both by the frequency or probability format and by how easily examples of the event, i.e., ‘exemplars’, are generated from linguistic cues that frame the match in light of further evidence (Koehler & Macchi, 2004). This paper presents two juror research studies examining the difficulties jurors have in assigning appropriate probative value to DNA evidence when contradictory evidence is presented. Study 1 showed that refuting evidence significantly reduced guilt judgments when exemplars were linguistically cued, even when the probability match and the refuting evidence had the same objective probative value. Moreover, qualitative reason-for-judgment responses revealed that interpreting refuting evidence was complex and not necessarily reductive; refutation was found indicative of innocence or guilt depending on whether exemplars had been cued or not. Study 2 showed that the introduction of judges’ directions to linguistically cue exemplars did not increase the impact of refuting evidence beyond its objective probative value, but fewer guilty verdicts were returned when jurors were instructed to consider all possible explanations of the evidence. The results are discussed in light of the contradictory frequentist and exemplar-cueing theoretical positions, and their real-world consequences.
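The format manipulation at issue can be sketched numerically. The match probability and reference population below are illustrative values chosen for the sketch, not the studies’ materials:

```python
# Sketch of the probability-vs-frequency framing of a random-match
# probability. Exemplar-Cueing Theory holds that the frequency framing
# cues imagined "exemplars" (other people who would also match).
# All figures are illustrative assumptions.

match_probability = 1e-6          # an assumed random-match probability
reference_population = 5_000_000  # an assumed relevant population size

# Probability format: the chance an innocent person would match.
probability_format = f"{match_probability:.4%}"

# Frequency format: the expected number of coincidental matches in the
# reference population, which invites jurors to picture those people.
expected_matches = match_probability * reference_population
frequency_format = f"about {expected_matches:.0f} in {reference_population:,} would match"

print(probability_format, "|", frequency_format)
```

Both lines express the same objective probative value; the point of the sketch is that only the second framing makes the coincidentally matching individuals easy to imagine.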
Can people consistently attempt to falsify, that is, search for refuting evidence, when testing the truth of hypotheses? Experimental evidence indicates that people tend to search for confirming evidence. We report two novel experiments that show that people can consistently falsify when it is the only helpful strategy. Experiment 1 showed that participants readily falsified somebody else’s hypothesis. Their task was to test a hypothesis belonging to an ‘imaginary participant’ and they knew it was a low quality hypothesis. Experiment 2 showed that participants were able to falsify a low quality hypothesis belonging to an imaginary participant more readily than their own low quality hypothesis. The results have important implications for theories of hypothesis testing and human rationality.
This paper sketches a brief account of multiculturalism in order to distinguish it from other positions that have been under attack recently. Following this, we address two prevalent and diametrically opposed criticisms of multiculturalism, namely, that multiculturalism is relativistic, on the one hand, and that it is absolutist, on the other. Both of these criticisms, we argue, simply mask liberal democratic theory’s myth-begotten attempt to resolve the tension between the one and the many. Multiculturalism challenges the myths of meritocracy and abstract individualism which underlie liberalism; properly understood, it evades the criticisms often hurled at it.
This paper argues against the realization principle, which reifies the realization relation between lower-level and higher-level properties. It begins with a review of some principles of naturalistic metaphysics. Then it criticizes some likely reasons for embracing the realization principle, and finally it argues against the principle directly. The most likely reasons for embracing the principle depend on the dubious assumption that special science theories cannot be true unless special science predicates designate properties. The principle itself turns out to be false because the realization relation fails the naturalistic test for reality: it makes no causal difference to the world.

Acknowledgements: This paper resulted from work done at John Heil’s 2006 Mind and Metaphysics NEH Summer Seminar at Washington University in St. Louis. An early version of it was presented in a special symposium on realization at the 2007 meeting of the Southern Society for Philosophy and Psychology. I owe thanks to all the participants in both events for helpful discussions, and particular thanks to Ken Aizawa, Torin Alter, Jason Ford, Carl Gillett, John Heil, Nicholas Helms, Pete Mandik, John Post, Gene Witmer, Michelle Wrenn, Tad Zawidzki, and two anonymous referees for the AJP.
Propositionalism is the view that the contents of intentional attitudes have a propositional structure. Objectualism opposes propositionalism in allowing the contents of these attitudes to be ordinary objects or properties. Philosophers including Talbot Brewer, Paul Thagard, Michelle Montague, and Alex Grzankowski attack propositionalism about such attitudes as desire, liking, and fearing. This article defends propositionalism, mainly on grounds that it better supports psychological explanations.