The authors of the book conclude that modern approaches to developing and implementing strategies of sustainable socio-economic development must be used effectively in order to increase the efficiency and competitiveness of economic entities. The basic research focuses on economic diagnostics of the socio-economic potential and financial results of economic entities, the transition period in the economies of individual countries and the means of ensuring their competitiveness, and the assessment of educational processes and knowledge management. The research results have been implemented in models and strategies of supply and logistics management, the development of non-profit organizations, the competitiveness of tourism and transport, financing strategies for small and medium-sized enterprises, and cross-border cooperation. The results of the study can be used in decision-making at the level of economic entities across different areas of activity and organizational-legal forms of ownership, and by ministries and departments that promote the development of economic entities on the basis of models and strategies for sustainable socio-economic development. The results can also be used by students and young scientists studying modern concepts and mechanisms for managing the sustainable socio-economic development of economic entities under conditions of global economic transformations and challenges.
Writing a book on hermeneutics today, having that hermeneutics be the one developed by G. Gadamer in his well-known Verdad y método, and claiming to add something new to the vast literature on the subject would seem, at first sight, an impossible undertaking. That both ambitions inspire the solid monograph by María G. Navarro, titled Interpretar y argumentar, makes it a bold and risky enterprise, full of innovative courage, which provokes admiration, curiosity, and interest. Contrary to what might appear at first sight, the book contains a high degree of originality and creativity, owing to the methodological stratagem the author employs: namely, a hermeneutics in obliquo, a strategy that consists in interpreting Gadamerian hermeneutics through the prism of the logic of argumentation.
A number of recent writers have expressed scepticism about the viability of a specifically moral concept of obligation, and some of the considerations offered have been interesting and persuasive. This is a scepticism that has its roots in Nietzsche, even if he is mentioned only rather rarely in the debate. More proximately, the scepticism in question receives seminal expression in Elizabeth Anscombe's 1958 essay, ‘Modern Moral Philosophy’, a piece that is often paid lip-service to, but—like Nietzsche's work—has only rarely been taken seriously by those wishing to defend the conception of obligation under attack. This is regrettable. Anscombe's essay is powerful and direct, and it makes a forthright case for the claim that, in the absence of a divine law conception of ethics, any specifically moral concept of obligation must be redundant, and that the best that can be hoped for in a secular age is some sort of neo-Aristotelianism. Anscombe is right about this, we think. And, among those who disagree, one of the very few to have taken her on at all explicitly is Christine Korsgaard, whose Kantianism of course commits her to the view that the concept of moral obligation is central, with or without God. Here, we try to show that Korsgaard loses the argument.
In this chapter, Maria Kronfeldner discusses whether psychological essentialism is a necessary part of dehumanization. This involves different elements of essentialism, and a narrow and a broad way of conceptualizing psychological essentialism, the first akin to natural kind thinking, the second based on entitativity. She first presents authors who have connected essentialism with dehumanization. She then introduces the error theory of psychological essentialism regarding the category of the human, and distinguishes different elements of psychological essentialism. On that basis, Kronfeldner connects historical, socio-psychological, and philosophical insights in order to show that although essentialism can act as a catalyst for dehumanization, it is not necessary for it. Examples relate to dehumanization in the context of colonialism and evolutionary thinking, to the history of dehumanizing women from Aristotle to 19th-century craniology, and to contemporary self-dehumanization and ‘lesser mind’ attribution.
Human nature has always been a foundational issue for philosophy. What does it mean to have a human nature? Is the concept the relic of a bygone age? What is the use of such a concept? What are the epistemic and ontological commitments people make when they use the concept? In What’s Left of Human Nature? Maria Kronfeldner offers a philosophical account of human nature that defends the concept against contemporary criticism. In particular, she takes on challenges related to the social misuse of the concept that dehumanizes those regarded as lacking human nature (the dehumanization challenge); the conflict between Darwinian thinking and essentialist concepts of human nature (the Darwinian challenge); and the consensus that evolution, heredity, and ontogenetic development result from both nurture and nature. After answering each of these challenges, Kronfeldner presents a revisionist account of human nature that minimizes dehumanization and does not fall back on outdated biological ideas. Her account is post-essentialist because it eliminates the concept of an essence of being human; pluralist in that it argues that there are different things in the world that correspond to three different post-essentialist concepts of human nature; and interactive because it understands nature and nurture as interacting at the developmental, epigenetic, and evolutionary levels. On the basis of this, she introduces a dialectical concept of an ever-changing and “looping” human nature. Finally, noting the essentially contested character of the concept and the ambiguity and redundancy of the terminology, she wonders if we should simply eliminate the term “human nature” altogether.
Maria Kronfeldner’s Preface and Introduction to the Routledge Handbook of Dehumanization maps the landscape of dehumanization studies. She starts with a brief portrayal of the history of the field. The systematically minded sections that follow guide the reader through the resulting rugged landscape represented in the Handbook’s contributions. Different realizations, levels, forms, and ontological contrasts of dehumanization are distinguished, followed by remarks on the variety of targets of dehumanization. A discussion on valence and emotional aspects is added. Causes, functions, and consequences of dehumanization, and the prospects for reducing or undoing it, are introduced. The systematic overview closes with a discussion of some important theoretical complexities that arise in studying dehumanization. After these systematic sections, the scholarly work on dehumanization gets situated in the broader intellectual landscape of debates about the ‘human’ in the humanities and social sciences. The Introduction ends with some notes on scope, limitations, and intended readership of the Handbook.
This book surveys the ways in which languages of different types refer to past, present, and future events and how these referents are related to the knowledge and attitudes of discourse participants. The book is the culmination of fifteen years of research by the author. Four major language types are examined in depth: tense-based English, tense-aspect-based Polish, aspect-based Chinese, and mood-based Kalaallisut. Each contributes to a series of logical representation languages, which together define a common logical language that is argued to underlie all language types. The four types differ in whether they choose to grammaticalize discourse reference to times (tense), events (aspect), and/or attitudes (mood), and how non-grammaticalized elements are inferred. The common logical language is a dynamic update logic, building on DRT and Centering Theory, but with a novel architecture—e.g. the distinction between focal vs. peripheral attention plays a key role, parallel to focal vs. peripheral vision.
Creativity has often been declared, especially by philosophers, as the last frontier of science. The assumption is that it will defy explanation forever. I will defend two claims in order to oppose this assumption and to demystify creativity: (1) the perspective that creativity cannot be explained wrongly identifies creativity with what I shall call metaphysical freedom; (2) the Darwinian approach to creativity, a prominent naturalistic account of creativity, fails to give an explanation of creativity, because it confuses conceptual issues with explanation. I will close with some remarks on the status and differences in some explanations available in contemporary cognitive science.
This work offers a possible Foucaultian interpretation of Corporate Compliance Management. It is motivated by the so-called "failure" of ethical training. The research pursues a critical and local perspective, aiming to endorse and strengthen the ethical potential of a Compliance Program. In the first part, metaphors produced by corporate employees are presented; these images symbolize the power relationships with their employers and illustrate some Foucaultian concepts. In the second part, Compliance Management is interpreted as an exercise of corporate power validated by an intentional discourse. In the third part, the Compliance Program is interpreted as a possibility of resistance to power. The fourth part provides a discussion with some authors who have tried to rescue Foucault's contribution to business ethics. The conclusion reached is that a Compliance Program contains potential for the individual's empowerment, granted mainly through dialogue in training workshops.
Recent authors have drawn attention to a new kind of defeating evidence commonly referred to as higher-order evidence. Such evidence works by inducing doubts that one’s doxastic state is the result of a flawed process – for instance, a process brought about by a reason-distorting drug. I argue that accommodating defeat by higher-order evidence requires a two-tiered theory of justification, and that the phenomenon gives rise to a puzzle. The puzzle is that at least in some situations involving higher-order defeaters the correct epistemic rules issue conflicting recommendations. For instance, a subject ought to believe p, but she ought also to suspend judgment in p. I discuss three responses. The first resists the puzzle by arguing that there is only one correct epistemic rule, an Über-rule. The second accepts that there are genuine epistemic dilemmas. The third appeals to a hierarchy or ordering of correct epistemic rules. I spell out problems for all of these responses. I conclude that the right lesson to draw from the puzzle is that a state can be epistemically rational or justified even if one has what looks to be strong evidence to think that it is not. As such, the considerations put forth constitute a non-question-begging argument for a kind of externalism.
Recent philosophical work on the concept of human nature disagrees on how to respond to the Darwinian challenge, according to which biological species do not have traditional essences. Three broad kinds of reactions can be distinguished: conservative intrinsic essentialism, which defends essences in the traditional sense, eliminativism, which suggests dropping the concept of human nature altogether, and constructive approaches, which argue that revisions can generate sensible concepts of human nature beyond traditional essences. The different constructive approaches pick out one or two of the three epistemic roles that are fused in traditional essentialist conceptions of human nature: descriptive, explanatory, definitional, or explanatory and definitional. These turns towards diverging epistemic roles are best interpreted pluralistically: there is a plurality of concepts of human nature that have to be clearly distinguished, each with a legitimate role in respective scientific contexts.
I formulate a resilient paradox about epistemic rationality, discuss and reject various solutions, and sketch a way out. The paradox exemplifies a tension between a wide range of views of epistemic justification, on the one hand, and enkratic requirements on rationality, on the other. According to the enkratic requirements, certain mismatched doxastic states are irrational, such as believing p, while believing that it is irrational for one to believe p. I focus on an evidentialist view of justification on which a doxastic state regarding a proposition p is epistemically rational or justified just in case it tracks the degree to which one’s evidence supports p. If it is possible to have certain kinds of misleading evidence, then evidentialism and the enkratic requirements come into conflict. Yet, both have been defended as platitudinous. After discussing and rejecting three solutions, I sketch an account that rejects the enkratic requirements, while nevertheless explaining our sense that epistemic akrasia is a distinct kind of epistemic failure. Central to the account is distinguishing between two evaluative perspectives, one having to do with the relevant kind of success, the other having to do with manifesting good dispositions. The problem with akratic subjects, I argue, is that they manifest dispositions to fail to correctly respond to a special class of conclusive and conspicuous reasons.
In the face of causal complexity, scientists reconstitute phenomena in order to arrive at a more simplified and partial picture that ignores most of the 'bigger picture.' This paper will distinguish between two modes of reconstituting phenomena: one moving down to a level of greater decomposition (toward organizational parts of the original phenomenon), and one moving up to a level of greater abstraction (toward different differences regarding the phenomenon). The first aim of the paper is to illustrate that phenomena are moving targets, i.e., they are not fixed once and for all, but are adapted, if necessary, on the basis of the preferred perspective adopted for pragmatic reasons. The second aim is to analyze in detail the second mode of reconstituting phenomena. This includes an exposition of the kind of pragmatic-pluralistic picture resulting from the fact that phenomena are reconstituted by a move up to a level of greater abstraction.
Human nature is a concept that transgresses the boundary between science and society and between fact and value. It is as much a political concept as it is a scientific one. This chapter will cover the politics of human nature by using evidence from history, anthropology and social psychology. The aim is to show that an important political function of the vernacular concept of human nature is social demarcation (inclusion/exclusion): it is involved in regulating who is ‘us’ and who is ‘them.’ It is a folk concept that is used for dehumanization, for denying (a) membership in humankind or (b) full humanness to certain people in order to include or exclude them from various forms of politically relevant aspects of human life, such as rights, power, etc.
In this paper I propose a way of characterizing human agency in terms of the concept of a two‐way power. I outline this conception of agency, defend it against some objections, and briefly indicate how it relates to free agency and to moral praise‐ and blameworthiness.
In English, discourse reference to time involves grammatical tenses interpreted as temporal anaphors. Recently, it has been argued that conditionals involve modal discourse anaphora expressed by a parallel grammatical system of anaphoric modals. Based on evidence from Kalaallisut, this paper argues that temporal and modal anaphora can be just as precise in a language that does not have either grammatical category. Instead, temporal anaphora directly targets eventualities of verbs, without mediating tenses, while modal anaphora involves anaphoric moods and/or attitudinal verbs.
This paper advances the view that the history of philosophy is both a kind of history and a kind of philosophy. Through a discussion of some examples from epistemology, metaphysics, and the historiography of philosophy, it explores the benefit to philosophy of a deep and broad engagement with its history. It comes to the conclusion that doing history of philosophy is a way to think outside the box of the current philosophical orthodoxies. Somewhat paradoxically, far from imprisoning its students in outdated and crystallized views, the history of philosophy trains the mind to think differently and alternatively about the fundamental problems of philosophy. It keeps us alert to the fact that the latest is not always the best, and that a genuinely new perspective often means embracing and developing an old insight. The upshot is that the study of the history of philosophy has an innovative and subversive potential, and that philosophy has a great deal to gain from a long, broad, and deep conversation with its history.
This paper addresses whether the often-bemoaned loss of unity of knowledge about humans, which results from the disciplinary fragmentation of science, is something to be overcome. The fragmentation of being human rests on a couple of distinctions, such as the nature-culture divide. Since antiquity the distinction between nature (roughly, what we inherit biologically) and culture (roughly, what is acquired by social interaction) has been a commonplace in science and society. Recently, the nature/culture divide has come under attack in various ways, in philosophy as well as in cultural anthropology. Regarding the latter, for instance, the divide was quintessential in its beginnings as an academic discipline, when Alfred L. Kroeber, one of the first professional anthropologists in the US, rallied for (what I call) the right to ignore—in his case, human nature—by adopting a separationist epistemic stance. A separationist stance will be understood as an epistemic research heuristic that defends the right to ignore a specific phenomenon (e.g., human nature) or a specific causal factor in an explanation typical for a disciplinary field. I will use Kroeber’s case as an example for making a general point against a bias towards integration (synthesis bias, as I call it) that is exemplified, for instance, by defenders of evolutionary psychology. I will claim that, in principle, a separationist stance is as good as an integrationist stance since both can be equally fruitful. With this argument from fruitful separation in place, not just the separationist stance but also the nature/culture divide can be defended against its critics.
In 1969 Harry Frankfurt published his hugely influential paper 'Alternate Possibilities and Moral Responsibility' in which he claimed to present a counterexample to the so-called 'Principle of Alternate Possibilities' ('a person is morally responsible for what he has done only if he could have done otherwise'). The success of Frankfurt-style cases as counterexamples to the Principle has been much debated since. I present an objection to these cases that, in questioning their conceptual cogency, undercuts many of those debates. Such cases all require a counterfactual mechanism that could cause an agent to perform an action that he cannot avoid performing. I argue that, given our concept of what it is for someone to act, this requirement is inconsistent. Frankfurt-style alleged counterexamples are cases where an agent is morally responsible for an action he performs even though, the claim goes, he could not have avoided performing that action. However, it has recently been argued, e.g. by John Fischer, that a counterexample to the Principle could be a 'Fischer-style case', i.e. a case where the agent can either perform the action or do nothing else. I argue that, although Fischer-style cases do not share the conceptual flaw common to all Frankfurt-style cases, they also fail as counterexamples to the Principle. The paper finishes with a brief discussion of the significance of the Principle of Alternate Possibilities.
Over the last four decades arguments for and against the claim that creative hypothesis formation is based on Darwinian ‘blind’ variation have been put forward. This paper offers a new and systematic route through this long-lasting debate. It distinguishes between undirected, random, and unjustified variation, to prevent widespread confusions regarding the meaning of undirected variation. These misunderstandings concern Lamarckism, equiprobability, developmental constraints, and creative hypothesis formation. The paper then introduces and develops the standard critique that creative hypothesis formation is guided rather than blind, integrating developments from contemporary research on creativity. On that basis, I discuss three compatibility arguments that have been used to answer the critique. These arguments do not deny guided variation but insist that an important analogy exists nonetheless. These compatibility arguments all fail, even though they do so for different reasons: trivialisation, conceptual confusion, and lack of evidence respectively. Revisiting the debate in this manner not only allows us to see where exactly a ‘Darwinian’ account of creative hypothesis formation goes wrong, but also to see that the debate is not about factual issues, but about the interpretation of these factual issues in Darwinian terms.
The Eskimo language Kalaallisut (alias West Greenlandic) has traditionally been described as having a rich tense system, with three future tenses (Kleinschmidt 1851, Bergsland 1955, Fortescue 1984) and possibly four past tenses (Fortescue 1984). Recently however, Shaer (2003) has challenged these traditional claims, arguing that Kalaallisut is in fact tenseless.
It is common orthodoxy among internalists and externalists alike that knowledge is lost or defeated in situations involving misleading evidence of a suitable kind. But making sense of defeat has seemed to present a particular challenge for those who reject an internalist justification condition on knowledge. My main aim here is to argue that externalists ought to take seriously a view on which knowledge can be retained even in the face of strong seemingly defeating evidence. As an instructive example, I first discuss whether a theory on which knowledge is belief that is safe from error has the resources for accommodating defeat. I argue that beliefs retained in defeat cases need not be unsafe or true in some accidental way. I then discuss externalist strategies for explaining why we have incorrect intuitions about defeat. The notion of an epistemically reasonable subject plays a central role in my theory. Reasonable subjects adopt general strategies that are good for acquiring true belief and knowledge across a wide range of normal cases, but stubbornly retaining belief in the face of new evidence does not reflect such policies. I argue that though the methods employed by subjects who fail to adjust their beliefs in defeat cases may be perfectly good, they are not good methods to adopt, as their adoption is accompanied by bad dispositions. What emerges is a view on which a subject can know despite being unreasonable, and despite failing to manifest dispositions to know across normal cases. Unreasonable subjects are genuinely criticisable, but like almost anything, knowledge can sometimes be achieved in the absence of a good general strategy.
Continuing Franz Boas' work to establish anthropology as an academic discipline in the US at the turn of the twentieth century, Alfred L. Kroeber re-defined culture as a phenomenon sui generis. To achieve this he asked geneticists to enter into a coalition against hereditarian thoughts prevalent at that time in the US. The goal was to create space for anthropology as a separate discipline within academia, distinct from other disciplines. To this end he crossed the boundary separating anthropology from biology in order to secure the boundary. His notion of culture, closely bound to the concept of heredity, saw it as independent of biological heredity (culture as superorganic) but at the same time as a heredity of another sort. The paper intends to summarise the shifting boundaries of anthropology at the beginning of the twentieth century, and to present Kroeber's ideas on culture, with a focus on how the changing landscape of concepts of heredity influenced his views. The historical case serves to illustrate two general conclusions: that the concept of culture played and plays different roles in explaining human existence; that genetics and the concept of Weismannian hard inheritance did not have an unambiguous unidirectional historical effect on the vogue for hereditarianism at that time; on the contrary, it helped to establish culture in Kroeber's sense, culture as independent of heredity.
What does it mean to be an aesthetic beholder? Is it different than simply being a perceiver? Most theories of aesthetic perception focus on 1) features of the perceived object and its presentation or 2) on psychological evaluative or emotional responses and intentions of perceiver and artist. In this chapter I propose that we need to look at the process of engaged perception itself, and further that this temporal process of becoming a beholder must be understood in its embodied, contextual and dynamic specificity. Through both phenomenological and neuroscientific explorations I analyze what is characteristic about a more “aesthetic stance” and argue that there is a certain asymmetry between beholder and beheld, which has to do with a disengagement of goal-directed action, and which allows for other kinds of perceptual involvement than in a more “practical stance”. It is a multi-disciplinary project integrating a sensorimotor notion of aesthetic affordances, 18th century philosophy, and large-scale brain network findings. What ensues is a new dynamic framework for future empirical and theoretical research on aesthetic perception.
The term ‘human nature’ can refer to different things in the world and fulfil different epistemic roles. Human nature can refer to a classificatory nature (classificatory criteria that determine the boundaries of, and membership in, a biological or social group called ‘human’), a descriptive nature (a bundle of properties describing the respective group’s life form), or an explanatory nature (a set of factors explaining that life form). This chapter will first introduce these three kinds of ‘human nature’, together with seven reasons why we disagree about human nature. In the main, this chapter focuses on the explanatory concept of human nature, which is related to one of the seven reasons for disagreement, namely, the scientific authority inherent in the term ‘nature’. I will examine why, in a number of historical contexts, it was attractive to refer to ‘nature’ as an explanatory category, and why this usage has led to the continual contestation of the term within the sciences. The claim is that even if the contents of talk about ‘nature’ varied historically, the term’s pragmatic function of demarcation stayed the same. The term ‘nature’ conveys scientific authority over a territory; ‘human nature’ is a concept used to divide causes, as well as experts, and thereby conquer others who threaten to invade one’s epistemic territory. Analysing this demarcation, which has social as well as epistemic aspects, will help us to understand why the explanatory role has been important and why it is unlikely that people will ever agree on either the meaning or the importance of ‘human nature’ as an explanatory category.
My paper examines the popular idea, defended by Kripke, that meaning is an essentially normative notion. I consider four common versions of this idea and suggest that none of them can be supported, either because the alleged normativity has nothing to do with normativity or because it cannot plausibly be said that meaning is normative in the sense suggested. I argue that contrary to received opinion, we don’t need normativity to secure the possibility of meaning. I conclude by considering the repercussions of rejecting semantic normativity on three central issues: justification, communication, and naturalism.
Ignorance is often a perfectly good excuse. There are interesting debates about whether non-culpable factual ignorance and mistake subvert obligation, but little disagreement about whether non-culpable factual ignorance and mistake exculpate. What about agents who have all the relevant facts in view but fail to meet their obligations because they do not have the right moral beliefs? If their ignorance of their obligations derives from mistaken moral beliefs or from ignorance of the moral significance of the facts they have in view, should they be excused for failing to meet their moral obligations? It is not obvious that they should. In this paper we argue that the best non-skeptical accounts of moral responsibility acknowledge that factual ignorance and mistake will diminish moral responsibility in a way that moral ignorance and mistake will not. That is because factual ignorance is often non-culpable so long as it meets certain merely procedural epistemic standards but the same is not true of moral ignorance. Our argument is that the assumption that it is gets the standards of culpability for moral ignorance wrong, and that the mistake is encouraged by the thought that culpability in general requires an instance of known wrongdoing: that acting wrongly requires de dicto unresponsiveness to one’s obligations at some stage. We deny this and conclude that, therefore, ignorance and mistaken belief are indeed often perfectly good excuses – but far less often than some philosophers claim.
This paper introduces a framework for direct surface composition by online update. The surface string is interpreted as is, with each morpheme in turn updating the input state of information and attention. A formal representation language, Logic of Centering, is defined and some crosslinguistic constraints on lexical meanings and compositional operations are formulated.
Although cognitivism has lost some ground recently in philosophical circles, it is still the favorite view of many scholars of emotions. Even though I agree with cognitivism's insight that emotions typically involve some type of evaluative intentional state, I shall argue that in some cases, less epistemically committed, non-propositional evaluative states such as mental pictures can do a better job in identifying the emotion and providing its intentional object. Mental pictures have different logical features from propositions: they are representational, and some may or may not portray actual objects aptly. Yet, unlike propositional attitudes, mental pictures do not allow for objective criteria by which one can judge that a certain picture is an apt portrait of someone or something.
Partee (1973) noted anaphoric parallels between English tenses and pronouns. Since then these parallels have been analyzed in terms of type-neutral principles of discourse anaphora. Recently, Stone (1997) extended the anaphoric parallel to English modals. In this paper I extend the story to languages of other types. This evidence also shows that centering parallels are even more detailed than previously recognized. Based on this evidence, I propose a semantic representation language (Logic of Change with Centered Worlds), in which the observed parallels can be formally analyzed.
No two individuals with the autism diagnosis are ever the same—yet many practitioners and parents can recognize signs of ASD very rapidly with the naked eye. What, then, is this phenotype of autism that shows itself across such distinct clinical presentations and heterogeneous developments? The "signs" seem notoriously slippery and resistant to the behavioral threshold categories that make up current assessment tools. Part of the problem is that cognitive and behavioral "abilities" are typically theorized as high-level disembodied and modular functions that are assessed discretely (impaired, normal, enhanced) to define a spectral syndrome. Even as biology reminds us that organic developing bodies are not made up of independent switches, we often remain seduced by the simplicity of mechanistic and cognitive models. Developmental disorders such as autism have accordingly been theorized as due to different modular dysfunctions—typically of cortical origin, i.e., failures of "theory of mind" (Baron-Cohen et al., 1985), of the "mirror neuron system" (Ramachandran and Oberman, 2006), of "weak central coherence" (Happe and Frith, 2006), or of the balance of "empathizing" and "systemizing" (Baron-Cohen, 2009), to list just a few.

The broad array of autonomic (Ming et al., 2005; Cheshire, 2012) and sensorimotor (Damasio and Maurer, 1978; Maurer and Damasio, 1982; Donnellan and Leary, 1995; Leary and Hill, 1996; Donnellan and Leary, 2012; Donnellan et al., 2012) differences experienced and reported by people with autism have typically been sidelined by such theories as "co-morbidities," possibly sharing genetic causes but rendered as incidental and behaviorally irrelevant symptoms—surely disconnected from cognition. But what if the development of cortically based mental processes and autonomous control relies on the complexities and proper function of the peripheral nervous systems?
Through such an "embodied" lens, the heterogeneous symptoms of autism invite new interpretations. We propose here that many behavioral-level findings can be re-defined as downstream effects of how developing nervous systems attempt to cope with and adapt to the challenges of having various noisy, unpredictable, and unreliable peripheral inputs.
There seems to be confusion and disagreement among scholars about the meaning of interpersonal forgiveness. In this essay we shall venture to clarify the meaning of forgiveness by examining various literary works. In particular, we shall discuss instances of forgiveness from Homer's The Iliad, Euripides' Hippolytus, and Aristotle's Nicomachean Ethics, and we shall focus on the changes that the concept of forgiveness has gone through over the centuries, in the hope of being able to understand, and therefore to use more accurately, contemporary notions of forgiveness. We shall also explore the relationship between forgiveness and concepts that are closely associated with it, such as anger/resentment, hurt, clemency, desert/merit, excuse, etc.
This paper investigates civility from an Aristotelian perspective and has two objectives. The first is to offer a novel account of this virtue based on Aristotle's remarks about civic friendship. The proposed account distinguishes two main components of civility—civic benevolence and civil deliberation—and shows how Aristotle's insights can speak to the needs of our communities today. The notion of civil deliberation is then unpacked into three main dimensions: motivational, inquiry-related, and ethical. The second objective is to illustrate how the post-truth condition—in particular, the spread of misinformation typical of the digital environments we inhabit—obstructs our capacity to cultivate the virtue of civility by impairing every component of civil deliberation. The paper hopes to direct virtue theorists' attention to the need to foster civic virtues as a means of counteracting the negative aspects of the post-truth age.
How and when do we learn to understand other people's perspectives and possibly divergent beliefs? This question has elicited much theoretical and empirical research. A puzzling finding has been that toddlers perform well on so-called implicit false belief (FB) tasks but do not show such capacities on traditional explicit FB tasks. I propose a navigational approach, which offers a hitherto ignored way of making sense of the seemingly contradictory results. The proposal involves a distinction between how we navigate FBs as they relate to (1) our current affordances (here & now navigation) as opposed to (2) presently non-actual relations, where we need to leave our concrete embodied/situated viewpoint (counterfactual navigation). It is proposed that whereas toddlers seem able to understand FBs in their current affordance space, they do not yet possess the resources to navigate in abstraction from such concrete affordances, which explicit FB tests seem to require. It is hypothesized that counterfactual navigation depends on the development of "sensorimotor priors," i.e., statistical expectations of own kinesthetic re-afference, which evidence now suggests matures around age four, consistent with core findings of explicit FB performance.
It has long been recognized that temporal anaphora in French and English depends on the aspectual distinction between events and states. For example, temporal location as well as temporal update depends on the aspectual type. This paper presents a general theory of aspect-based temporal anaphora, which extends from languages with grammatical tenses (like French and English) to tenseless languages (e.g. Kalaallisut). This theory also extends to additional aspect-dependent phenomena and to non-atomic aspectual types, processes and habits, which license anaphora to proper atomic parts (cf. nominal pluralities and kinds).
Rooth & Partee (1982) and Rooth (1985) have shown that the English-specific rule-by-rule system of PTQ can be factored out into function application plus two transformations for resolving type mismatch (type lifting and variable binding). Building on these insights, this article proposes a universal system for type-driven translation, by adding two more innovations: local type determination for gaps (generalizing Montague 1973) and a set of semantic filters (extending Cooper 1983). This system, dubbed Cross-Linguistic Semantics (XLS), is shown to account for various phenomena—including scope relations in English and Greenlandic Eskimo, internally headed relative clauses in Lakhota, serial verbs in Yoruba, and VP ellipsis in English.
This paper examines Bohdan Boichuk's poetry by looking into the role his childhood memories played in forming his poetic imagination. Displaced by World War II, the poet displays a unique capacity to transcend his traumatic experiences by engaging in creative writing. Witnessing war atrocities perpetrated by the Nazis does not destroy his belief in the healing power of poetry; on the contrary, it makes him appreciate poetry as the only existentially worthy enterprise. Invoking Gaston Bachelard's classic work The Poetics of Reverie: Childhood, Language, and the Cosmos, I argue that Boichuk's vivid childhood memories, however painful they might be, helped him poetically recreate and reimagine fateful moments of his migrant life.
This paper addresses the problem of causal selection and comments on Collingwood's classic paper "The so-called idea of causation". It discusses the relevance of Collingwood's control principle in the contemporary life sciences and argues that it is not the ability to control, but the willingness to control, that often biases us towards some rather than other causes of a phenomenon. Willingness to control is certainly only one principle that influences causal selection, but it is an important one. It shows how norms make causes.
Section III of part IV of Book I of Hume's Treatise, entitled "Of the ancient philosophy", has been virtually ignored by most Hume scholars. Although philosophers seem to concentrate on sections II and VI of part IV and pay little or no attention to section III, the latter section is paramount in showing how serious Hume's skepticism is, and how Hume's philosophy, contrary to his intention, is far removed from "the sentiments of the vulgar". In this paper I shall first explore Hume's view on ancient philosophy as it is presented in section III, and I shall particularly focus on his discussion of identity and simplicity of bodies. Second, I shall argue that Hume's account of identity and simplicity in terms of qualities is at best unsatisfactory. Finally, I shall try to show that Hume's advice to hold a "moderate" skepticism cannot be taken seriously. On the contrary, Hume seems to hold an "extravagant" skepticism, since he claims that there is a contradiction between our most fundamental natural beliefs, as well as between our natural beliefs and philosophical reasoning.
While most scholars focus on the advantages of forgiveness, the negative effects of hasty forgiveness have been largely neglected in the literature. In this essay I shall argue that in certain contexts granting forgiveness to a wrongdoer could be morally questionable, and sometimes it could even be morally wrong. Following Aristotle's view of emotion, and, in particular, his notion of virtuous anger, I shall claim that appropriate, righteous anger is instrumental for justice, and, as a result, inappropriate or imprudent forgiveness could be an impediment to justice, or even a license for the continuation of injustice.
Theories of cultural evolution rest on the assumption that cultural inheritance is distinct from biological inheritance. Cultural and biological inheritance are two separate so-called channels of inheritance, two sub-systems of the sum total of developmental resources traveling in distinct ways between individual agents. This paper asks: what justifies this assumption? In reply, a philosophical account is offered that points to three related but distinct criteria which, taken together, render the distinction between cultural and biological inheritance precise and justify it as real, i.e. as ontologically adequate. These three criteria are: (a) the autonomy of cultural change, (b) the near-decomposability of culture, and (c) differences in temporal order between cultural and biological inheritance.
This paper aims to show that a proper understanding of what Leibniz meant by "hypercategorematic infinite" sheds light on some fundamental aspects of his conceptions of God and of the relationship between God and created simple substances or monads. After revisiting Leibniz's distinction between (i) syncategorematic infinite, (ii) categorematic infinite, and (iii) actual infinite, I examine his claim that the hypercategorematic infinite is "God himself" in conjunction with other key statements about God. I then discuss the issue of whether the hypercategorematic infinite is a "whole", comparing the four kinds of infinite outlined by Leibniz in 1706 with the three degrees of infinity outlined in 1676. In the last section, I discuss the relationship between the hypercategorematic infinite and created simple substances. I conclude that, for Leibniz, only a being beyond all determinations but eminently embracing all determinations can enjoy the pure positivity of what is truly infinite while constituting the ontological grounding of all things.
Willem B. Drees’ book defends the humanities as a valuable endeavor in understanding human beings that is vibrant and essential for the academic and non-academic world ... The review highlights two issues, the book's naturalism (presenting the humanities as a human necessity) and the book's idealistic outlook (presenting the humanities as following the value-free ideal).
This article illustrates in which sense genetic determinism is still part of the contemporary interactionist consensus in medicine. Three dimensions of this consensus are discussed: kinds of causes, a continuum of traits ranging from monogenetic diseases to car accidents, and different kinds of determination due to different norms of reaction. On this basis, this article explicates in which sense the interactionist consensus presupposes the innate-acquired distinction. After a descriptive Part 1, Part 2 reviews why the innate-acquired distinction is under attack in contemporary philosophy of biology. Three arguments are then presented to provide a limited and pragmatic defense of the distinction: an epistemic, a conceptual, and a historical argument. If interpreted in a certain manner, and if the pragmatic goals of prevention and treatment (ideally specifying what medicine and health care is all about) are taken into account, then the innate-acquired distinction can be a useful epistemic tool. It can help, first, to understand that genetic determination does not mean fatalism, and, second, to maintain a system of checks and balances in the continuing nature-nurture debates.
We think we have lots of substantial knowledge about the future. But contemporary wisdom has it that indeterminism prevails in such a way that just about any proposition about the future has a non-zero objective chance of being false. What should one do about this? One, pessimistic, reaction is scepticism about knowledge of the future. We think this should be something of a last resort, especially since this scepticism is likely to infect alleged knowledge of the present and past. One anti-sceptical strategy is to pin our hopes on determinism, conceding that knowledge of the future is unavailable in an indeterministic world. This is not satisfying either: we would rather not be hostage to empirical fortune in the way that this strategy recommends. A final strategy, one that we shall explore in this paper, is one of reconciliation: knowledge of a proposition is compatible with a subject's belief having a non-zero objective chance of error. Following Williamson, we are interested in tying knowledge to the presence or absence of error in close cases, and so we shall explore the connections between knowledge and objective chance within such a framework. We don't want to get tangled up here in complications involved in attempting to formulate a necessary and sufficient condition for knowledge in terms of safety. Instead, we will assume the following rough and ready necessary condition: a subject knows P only if she could not easily have falsely believed P. Assuming that easiness is to be spelt…
Faced with current urgent calls for more trust in experts, especially in high impact and politically sensitive domains, such as climate science and COVID-19, the complex and problematic nature of public trust in experts and the need for a more critical approach to the topic are easy to overlook. Scepticism – at least in its Humean mitigated form that encourages independent, questioning attitudes – can prove valuable to democratic governance, but stands in opposition to the cognitive dependency entailed by epistemic trust. In this paper, we investigate the tension between the value of mitigated scepticism – understood as the exercise of reason-based doubt in a particular domain – and the need for trust in experts. We offer four arguments in favour of mitigated scepticism: the argument from loss of intellectual autonomy; the argument from democratic deficit; the argument from the normative failures of science; and the argument from past and current injustices. These arguments highlight the tension between the requirements for trust and justified scepticism about the role of experts. One solution, which we reject, is the idea that reliance, rather than trust, is sufficient for the purposes of accommodating experts in policy matters. The solution we endorse is to create a 'climate of trust', where questioning experts and expertise is welcomed, but the epistemic trust necessary for action upon information which the public cannot obtain first-hand is enabled and encouraged through structural, institutional, and justice-based measures.
Reasons can play a variety of roles in a variety of contexts. For instance, reasons can motivate and guide us in our actions (and omissions), in the sense that we often act in the light of reasons. And reasons can be grounds for beliefs, desires and emotions and can be used to evaluate, and sometimes to justify, all these. In addition, reasons are used in explanations: both in explanations of human actions, beliefs, desires, emotions, etc., and in explanations of a wide range of phenomena involving all sorts of animate and inanimate substances. This diversity has encouraged the thought that the term 'reason' is ambiguous or has different senses in different contexts. Moreover, this view often goes hand in hand with the claim that reasons of these different kinds belong to different ontological categories: to facts (or something similar) in the case of normative/justifying reasons, and to mental states in the case of motivating/explanatory reasons. In this paper I shall explore some of the main roles that reasons play and, on that basis, I shall offer a classification of kinds of reasons. As will become clear, my classification of reasons is at odds with much of the literature in several respects: first, because of my views about how we should understand the claim that reasons are classified into different kinds; second, because of the kinds into which I think reasons should be classified; and, finally, because of the consequences I think this view has for the ontology of reasons.
Natural languages exhibit a great variety of grammatical paradigms. For instance, in English verbs are grammatically marked for tense, whereas in the tenseless Eskimo-Aleut language Kalaallisut they are marked for illocutionary mood. Although time is a universal dimension of the human experience and speaking is part of that experience, some languages encode reference to time without any grammatical tense morphology, or reference to speech acts without any illocutionary mood morphology. Nevertheless, different grammatical systems are semantically parallel in certain respects. Specifically, I propose that English tenses form a temporal centering system, which monitors and updates topic times, whereas Kalaallisut moods form a modal centering system, which monitors and updates modal discourse referents. To formalize these centering parallels I define a dynamic logic that represents not only changing information but also changing focus of attention in discourse (Update with Centering, formalizing Grosz et al 1995). Different languages can be translated into this typed logic by directly compositional universal rules of Combinatory Categorial Grammar (CCG). The resulting centering theory of tense and illocutionary mood draws semantic parallels across different grammatical systems. The centering generalizations span the extremes of the typological spectrum, so they are likely to be universal. In addition, the theory accounts for the translation equivalence of tense and illocutionary mood in a given utterance context. Following Stalnaker (1978) I assume that the very act of speaking up has a 'commonplace effect' on the context. It focuses attention on the speech act and thereby introduces default modal and temporal topics. These universal defaults complement language-specific grammars, e.g. English tenses and Kalaallisut moods.
In a given utterance context, the universal discourse-initial defaults plus language-specific grammatical marking may add up to the same truth conditions.