Carl Gillett has defended what he calls the “dimensioned” view of the realization relation, which he contrasts with the traditional “flat” view of realization (2003, 2007; see also Gillett 2002). Intuitively, the dimensioned approach characterizes realization in terms of composition whereas the flat approach views realization in terms of occupiers of functional roles. Elsewhere we have argued that the general view of realization and multiple realization that Gillett advances is not able to discharge the theoretical duties of those relations (Shapiro 2004, unpublished manuscript; Polger 2004, 2007, forthcoming). Here we focus on an internal objection to Gillett’s account and then raise some broader reasons to reject it.
Successful athletic performance requires precision in many respects. A batter stands behind home plate awaiting the arrival of a ball that is less than three inches in diameter and moving close to 100 mph. His goal is to hit it with a bat that is also less than three inches in diameter. This impressive feat requires extraordinary temporal and spatial coordination. The sweet spot of the bat must be at the same place, at the same time, as the ball. A basketball player must keep a ball bouncing as she speeds from one end of the court to the other, evading defensive players. She may never break pace as she lifts from the ground, throwing the ball fifteen feet toward a hoop that is eighteen inches in diameter. One task facing a psychologist involves explaining how the body does such things within the sometimes very demanding spatial and temporal constraints that a given task imposes. Part of the goal of this chapter is to sketch the commitments of an embodied approach to such an explanation. We shall see that an embodied account of motor skills draws on concepts that depart radically from those of more traditional cognitivist theories of motor activity. Similarly, because an embodied approach to cognition introduces new ways to understand the human capacity for social interaction, it also promises to shed new light on how athletes coordinate their actions with each other.
The field of textbooks in philosophy of mind is a crowded one. I shall consider six recent texts for their pedagogical usefulness. All have been published within the last five years, though two are new editions of previously published books. The first three are authored monographs: by K. T. Maslin, Barbara Montero, and André Kukla and Joel Walmsley. I then review three anthologies, each with two editors: William Lycan and Jesse Prinz, Brie Gertler and Lawrence Shapiro, and Brian McLaughlin and Jonathan Cohen. These six texts constitute a diverse bunch. Within each of the two groups (monographs and anthologies), each individual text differs significantly from the other two in its approach, scope, and thus suitability for various levels of teaching.
Hossack’s project in this book is to provide a new foundation for the philosophy of number inspired by the traditional idea that numbers are magnitudes.
Building on recent work, I present sequent systems for the non-classical logics LP, K3, and FDE with two main virtues. First, derivations closely resemble those in standard Gentzen-style systems. Second, the systems can be obtained by reformulating a classical system using nonstandard sequent structure and simply removing certain structural rules (relatives of exchange and contraction). I clarify two senses in which these logics count as “substructural.”
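For orientation, here is a minimal sketch, in standard Gentzen notation, of the two classical structural rules whose “relatives” the abstract says are removed; the paper’s own nonstandard sequent structure is not reproduced here, so these are generic illustrations only.

```latex
% Left contraction (LC) and left exchange (LE) in a standard
% Gentzen-style sequent system; the systems for LP, K3, and FDE
% described above remove relatives of rules like these.
\[
\frac{\Gamma, A, A \vdash \Delta}{\Gamma, A \vdash \Delta}\;(\mathrm{LC})
\qquad
\frac{\Gamma, A, B, \Gamma' \vdash \Delta}{\Gamma, B, A, \Gamma' \vdash \Delta}\;(\mathrm{LE})
\]
```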
Cells are cognitive entities possessing great computational power. DNA serves as a multivalent information storage medium for these computations at various time scales. Information is stored in sequences, epigenetic modifications, and rapidly changing nucleoprotein complexes. Because DNA must operate through complexes formed with other molecules in the cell, genome functions are inherently interactive and involve two-way communication with various cellular compartments. Both coding sequences and repetitive sequences contribute to the hierarchical systemic organization of the genome. By virtue of nucleoprotein complexes, epigenetic modifications, and natural genetic engineering activities, the genome can serve as a read-write storage system. An interactive informatic conceptualization of the genome allows us to understand the functional importance of DNA that does not code for protein or RNA structure, clarifies the essential multidirectional and systemic nature of genomic information transfer, and emphasizes the need to investigate how cellular computation operates in reproduction and evolution.
Examines the theories of Socrates, Kant, Dewey, Piaget, and others to explore the implications of Socrates' question: "What is a virtuous man, and what is a virtuous school and society which educates virtuous men?"
The scientific consensus regarding anthropogenic climate change is firmly established, yet climate change denialism, a species of what I call pseudoskepticism, is on the rise in the industrial nations most responsible for climate change. Such denialism suggests the need for a robust ethics of inquiry and public discourse. In this paper I argue: (1) that ethical obligations of inquiry extend to every voting citizen insofar as citizens are bound together as a political body; (2) that it is morally condemnable for public officials to put forward assertions contrary to scientific consensus when such consensus is decisive for public policy and legislation; and (3) that it is imperative for educators, journalists, politicians, and all those with greater access to the public forum to condemn, factually and ethically, pseudoskeptical assertions made in the public realm without equivocation.
Since the publication of Andrews Reath's “Two Conceptions of the Highest Good in Kant” (Journal of the History of Philosophy 26:4 (1988)), most scholars have come to accept the view that Kant migrated away from an earlier “theological” version to one that is more “secular.” The purpose of this paper is to explore the roots of this interpretative trend, re-assess its merits, and then examine how the Highest Good is portrayed in Kant’s Religion within the Boundaries of Mere Reason. As will be argued, it is in this text, more than any other, that Kant develops his most philosophically sophisticated account of the Highest Good. Because of the central significance of Kant’s doctrine of the Highest Good for both his ethical theory and philosophy of religion, this paper seeks to provide an important corrective to the current received views.
In the Fifth Meditation, Descartes makes a remarkable claim about the ontological status of geometrical figures. He asserts that an object such as a triangle has a 'true and immutable nature' that does not depend on the mind, yet has being even if there are no triangles existing in the world. This statement has led many commentators to assume that Descartes is a Platonist regarding essences and in the philosophy of mathematics. One problem with this seemingly natural reading is that it contradicts the conceptualist account of universals that one finds in the Principles of Philosophy and elsewhere. In this paper, I offer a novel interpretation of the notion of a true and immutable nature which reconciles the Fifth Meditation with the conceptualism of Descartes' other work. Specifically, I argue that Descartes takes natures to be innate ideas considered in terms of their so-called 'objective being'.
Stewart Shapiro’s book develops a contextualist approach to vagueness. It’s chock-full of ideas and arguments, laid out in wonderfully limpid prose. Anyone working on vagueness (or the other topics it touches on—see below) will want to read it. According to Shapiro, vague terms have borderline cases: there are objects to which the term neither determinately applies nor determinately does not apply. A term determinately applies in a context iff the term’s meaning and the non-linguistic facts determine that it does. The non-linguistic facts include the “external” context: “comparison class, paradigm cases, contrasting cases, etc.” (33) But external-context sensitivity is not what’s central to Shapiro’s contextualism. Even fixing external context, vague terms’ (anti-)extensions exhibit sensitivity to internal context: the decisions of competent speakers. According to Shapiro’s open texture thesis, for each borderline case, there is some circumstance in which a speaker, consistently with the term’s meaning and the non-linguistic facts, can judge it to fall into the term’s extension and some circumstance in which the speaker can judge it to fall into the term’s anti-extension: she can “go either way.” Moreover, borderline sentences are Euthyphronically judgment-dependent: a competent speaker’s judging a borderline case to fall into a term’s (anti-)extension makes it so. For Shapiro, then, a sentence can be true but indeterminate: a case left unsettled by meaning and the non-linguistic facts (and thus indeterminate, or borderline) may be made true by a competent speaker’s judgment. Importantly, among the non-linguistic facts that constrain speakers’ judgments (at least in the cases Shapiro cares about) is a principle of tolerance: for all x and y, if x and y differ marginally in the relevant respect (henceforth, Mxy), then if one competently judges Bx, one cannot competently judge y in any other manner in the same (total) context. This does not require that one judge By: one might not consider the matter at all.
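The tolerance principle just quoted can be put schematically as follows; the notation J(φ) for “one competently judges that φ” is our gloss, not the review’s.

```latex
% Tolerance, schematically: if x and y differ marginally in the
% relevant respect (Mxy), then a competent judgment that Bx rules
% out judging y in any other manner in the same total context:
\[
\forall x \, \forall y \, \bigl( Mxy \rightarrow ( J(Bx) \rightarrow \neg J(\neg By) ) \bigr)
\]
% Note that this does not require judging By: one may simply not
% consider the matter at all.
```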
This article seeks to re-conceptualize Rawlsian public reason as a critical tool against ideological propaganda. The article proposes that public reason, as a standard for public discourse, must be conceptualized beyond its mandate for comprehensive neutrality to additionally emphasize critique of ideologically driven ignorance and propaganda in the public realm. I connect uncritical hospitality to such ideological propaganda with Harry Frankfurt’s concept of bullshit. This paper proposes that philosophers have a unique moral obligation to engage bullshit critically in the public sphere. The obligation for such critique, I argue, represents philosophy’s essential moral component in a society committed to the protection of free speech and deliberative democracy.
One prominent criticism of the abstractionist program is the so-called Bad Company objection. The complaint is that abstraction principles cannot in general be a legitimate way to introduce mathematical theories, since some of them are inconsistent. The most notorious example, of course, is Frege’s Basic Law V. A common response to the objection suggests that an abstraction principle can be used to legitimately introduce a mathematical theory precisely when it is stable: when it can be made true on all sufficiently large domains. In this paper, we raise a worry for this response to the Bad Company objection. We argue, perhaps surprisingly, that it requires very strong assumptions about the range of the second-order quantifiers; assumptions that the abstractionist should reject.
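As a gloss on the stability requirement mentioned above, it is standardly formalized along the following lines; this is a common formulation in the abstractionist literature, not a quotation from the paper.

```latex
% An abstraction principle A is stable iff there is a cardinal k
% such that A is satisfiable on every domain at least as large as k:
\[
\mathrm{Stable}(A) \;\Longleftrightarrow\;
\exists \kappa \, \forall \lambda \geq \kappa \;
(\text{$A$ has a model of cardinality } \lambda)
\]
```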
This paper argues that it is morally irresponsible for modern medical providers or health care institutions to support and advocate the integration of CAM practices (e.g., homeopathy, acupuncture, energy healing) with conventional modern medicine. The results of such practices are not reliable beyond those of placebo. As a corollary, it is argued that prescribing placebos perceived to stand outside the norm of modern medicine is morally inappropriate. Even when such treatments do no direct physical harm, they create unnecessary barriers to patients' informed understanding of their health.
The Many Gods Objection (MGO) is widely viewed as a decisive criticism of Pascal’s Wager. By introducing a plurality of hypotheses with infinite expected utility into the decision matrix, the objection leaves the wagerer without adequate grounds to decide between them. However, some have attempted to rebut this objection by employing various criteria drawn from the theological tradition. Unfortunately, such defenses do little good for an argument that is supposed to be an apologetic aimed at atheists and agnostics. The purpose of this paper is to offer a defensive strategy of a different sort, one more suited to the Wager’s apologetic aim and status as a decision under ignorance. Instead of turning to criteria independent of the Wager, it will be shown that there are characteristics already built into its decision-theoretic structure that can be used to block many categories of theological hypotheses, including MGO’s more outrageous “cooked-up” hypotheses and “philosophers’ fictions”. Please note that there are editorial errors in the published version; they have been corrected in the attached.
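For readers unfamiliar with the decision-theoretic setup, here is a schematic and much-simplified version of the expected-utility reasoning behind the Wager and the MGO; the symbols are illustrative and are not drawn from the paper.

```latex
% With any positive credence p in God G, wagering for G is assigned
% infinite expected utility (f1 a finite utility if G does not exist):
\[
EU(\text{wager for } G) = p \cdot \infty + (1 - p) \cdot f_1 = \infty
\]
% The MGO adds rival hypotheses G_1, G_2, ..., each with positive
% credence p_i, so each rival wager is also assigned infinite
% expected utility, leaving the wagerer no grounds for choice:
\[
EU(\text{wager for } G_i) = p_i \cdot \infty + (1 - p_i) \cdot f_2 = \infty
\]
```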
Is it possible to recognize the limits of rationality, and thus to embrace moral pluralism, without embracing moral relativism? My answer is yes; nevertheless, certain anti-foundational positions, both recent and ancient, take a cynical stance toward the possibility of any critical moral judgment, and as such, must be regarded as relativistic. It is such cynicism, I argue, whether openly announced or unknowingly implied, that marks the distinction between relativism and pluralism. The danger of this cynicism is not so much that it renders the categorical acceptance of a particular moral view unattainable, but that it renders categorical condemnation of any particular position (or action) impossible.
We present a computational analysis of de re, de dicto, and de se belief and knowledge reports. Our analysis solves a problem first observed by Hector-Neri Castañeda, namely, that the simple rule ‘(A knows that P) implies P’ apparently does not hold if P contains a quasi-indexical. We present a single rule, in the context of a knowledge-representation and reasoning system, that holds for all P, including those containing quasi-indexicals. In so doing, we explore the difference between reasoning in a public communication language and in a knowledge-representation language, we demonstrate the importance of representing proper names explicitly, and we provide support for the necessity of considering sentences in the context of extended discourse (for example, written narrative) in order to fully capture certain features of their semantics.
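Castañeda's problem, as described above, can be illustrated schematically; the formalization below is our sketch, not the paper's actual representation (which is given in a knowledge-representation system).

```latex
% Knowledge is factive for ordinary P:  K(a, P) -> P.
% But if P contains the quasi-indexical 'he*' ('he himself'), P
% cannot stand alone outside the attitude report. From
%   "John knows that he* is rich"
% we may not detach "he* is rich"; the intuitively correct
% conclusion is instead about John:
\[
K(\mathrm{John}, \mathrm{Rich}(\mathit{he}^{*}))
\;\not\vdash\; \mathrm{Rich}(\mathit{he}^{*}),
\qquad
K(\mathrm{John}, \mathrm{Rich}(\mathit{he}^{*}))
\;\vdash\; \mathrm{Rich}(\mathrm{John})
\]
```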
Nicolas Malebranche holds that we see all things in the physical world by means of ideas in God (the doctrine of "vision in God"). In some writings he seems to posit ideas of particular bodies in God, but when pressed by critics he insists that there is only one general idea of extension, which he calls “intelligible extension.” But how can this general and “pure” idea represent particular sensible objects? I develop systematic solutions to this and two other putative difficulties with Malebranche’s theory of sensory cognition by appealing to the notion of “seeing as” and to his doctrine that ideas in God have causal powers to affect the mind.
Solidarity within a group facing adversity exemplifies certain human goods, some instrumental to the goal of mitigating the adversity, some non-instrumental, such as trust, loyalty, and mutual concern. Group identity, shared experience, and shared political commitments are three distinct but often-conflated bases of racial group solidarity. Solidarity groups built around political commitments include members of more than one identity group, even when the political focus is primarily on the justice-related interests of only one identity group (such as African Americans). A solidarity group is more than a mere political coalition or alliance. Two other forms of political-commitment solidarity groups are ones devoted to racial justice more generally, and to social justice more generally still. Racially plural political solidarity groups realize values beyond the aforementioned solidaristic ones, in meeting the challenges of different races working together.
This is a review of Vicious Circles: On the Mathematics of Non-Wellfounded Phenomena, written by Jon Barwise and Lawrence Moss and published by CSLI Publications in 1996.
Semantic originalism is a theory of constitutional meaning that aims to disentangle the semantic, legal, and normative strands of debates in constitutional theory about the role of original meaning in constitutional interpretation and construction. This theory affirms four theses: (1) the fixation thesis, (2) the clause meaning thesis, (3) the contribution thesis, and (4) the fidelity thesis.

The fixation thesis claims that the semantic content of each constitutional provision is fixed at the time the provision is framed and ratified: subsequent changes in linguistic practice cannot change the semantic content of an utterance.

The clause meaning thesis claims that the semantic content is given by the conventional semantic meaning (or original public meaning) of the text, with four modifications. The first modification is provided by the publicly available context of constitutional utterance: words and phrases that might be ambiguous in isolation can become clear in light of those circumstances of framing and ratification that could be expected to be known to interpreters of the Constitution across time. The second modification is provided by the idea of the division of linguistic labor: some constitutional provisions, such as the natural born citizen clause, may be terms of art, the meanings of which are fixed by the usages of experts. The third modification is provided by the idea of constitutional implicature: the Constitution may mean things it does not explicitly say. The fourth modification is provided by the idea of constitutional stipulations: the Constitution brings into being new terms, such as “House of Representatives,” and the meaning of these terms is stipulated by the Constitution itself.

The contribution thesis asserts that the semantic content of the Constitution contributes to the law: the most plausible version of the contribution thesis is modest, claiming that the semantic content of the Constitution provides rules of constitutional law, subject to various qualifications. Our constitutional practice provides strong evidence for the modest version of the contribution thesis.

The fidelity thesis asserts that we have good reasons to affirm fidelity to constitutional law: virtuous citizens and officials are disposed to act in accord with the Constitution; right-acting citizens and officials obey the Constitution in normal circumstances; and constitutional conformity produces good consequences. Our public political culture affirms the great value of the rule of law.

We can summarize semantic originalism as a slogan: the original public meaning of the Constitution is the law, and for that reason it should be respected and obeyed. The slogan recapitulates each of the claims made by semantic originalism, but it is potentially misleading because it does not clearly distinguish between the semantic claims made by the fixation and clause meaning theses, the legal claim made by the contribution thesis, and the normative claim made by the fidelity thesis.

Part I introduces the four theses. Part II is entitled An Opinionated History of Constitutional Originalism, and it provides the context for all that follows. Part III is entitled Semantic Originalism: A Theory of Constitutional Meaning, and it lays out the case for original public meaning as the best nonnormative theory of constitutional content. Part IV is entitled The Normative Implications of Semantic Originalism, and it articulates a variety of normative arguments for originalism. Part V is entitled Conclusion: Semantic Originalism and Living Constitutionalism, and it explores the broad implications of semantic originalism for living constitutionalism and the future of constitutional theory.
"Procedural Justice" offers a theory of procedural fairness for civil dispute resolution. The core idea behind the theory is the procedural legitimacy thesis: participation rights are essential for the legitimacy of adjudicatory procedures. The theory yields two principles of procedural justice: the accuracy principle and the participation principle. The two principles require a system of procedure to aim at accuracy and to afford reasonable rights of participation qualified by a practicability constraint. The Article begins in Part I, Introduction, with two (...) observations. First, the function of procedure is to particularize general substantive norms so that they can guide action. Second, the hard problem of procedural justice corresponds to the following question: How can we regard ourselves as obligated by legitimate authority to comply with a judgment that we believe (or even know) to be in error with respect to the substantive merits? The theory of procedural justice is developed in several stages, beginning with some preliminary questions and problems. The first question - what is procedure? - is the most difficult and requires an extensive answer: Part II, Substance and Procedure, defines the subject of the inquiry by offering a new theory of the distinction between substance and procedure that acknowledges the entanglement of the action-guiding roles of substantive and procedural rules while preserving the distinction between two ideal types of rules. The key to the development of this account of the nature of procedure is a thought experiment, in which we imagine a world with the maximum possible acoustic separation between substance and procedure. Part III, The Foundations of Procedural Justice, lays out the premises of general jurisprudence that ground the theory and answers a series of objections to the notion that the search for a theory of procedural justice is a worthwhile enterprise. Sections II and III set the stage for the more difficult work of constructing a theory of procedural legitimacy. Part IV, Views of Procedural Justice, investigates the theories of procedural fairness found explicitly or implicitly in case law and commentary. After a preliminary inquiry that distinguishes procedural justice from other forms of justice, Part IV focuses on three models or theories. The first, the accuracy model, assumes that the aim of civil dispute resolution is correct application of the law to the facts. The second, the balancing model, assumes that the aim of civil procedure is to strike a fair balance between the costs and benefits of adjudication. The third, the participation model, assumes that the very idea of a correct outcome must be understood as a function of process that guarantees fair and equal participation. Part IV demonstrates that none of these models provides the basis for a fully adequate theory of procedural justice. In Part V, The Value of Participation, the lessons learned from analysis and critique of the three models are then applied to the question whether a right of participation can be justified for reasons that are not reducible to either its effect on the accuracy or its effect on the cost of adjudication. The most important result of Part V is the Participatory Legitimacy Thesis: it is (usually) a condition for the fairness of a procedure that those who are to be finally bound shall have a reasonable opportunity to participate in the proceedings. 
The central normative thrust of Procedural Justice is developed in Part VI, Principles of Procedural Justice. The first principle, the Participation Principle, stipulates a minimum (and minimal) right of participation, in the form of notice and an opportunity to be heard, that must be satisfied (if feasible) in order for a procedure to be considered fair. The second principle, the Accuracy Principle, specifies the achievement of legally correct outcomes as the criterion for measuring procedural fairness, subject to four provisos, each of which sets out circumstances under which a departure from the goal of accuracy is justified by procedural fairness itself. In Part VII, The Problem of Aggregation, the Participation Principle and the Accuracy Principle are applied to the central problem of contemporary civil procedure - the aggregation of claims in mass litigation. Part VIII offers some concluding observations about the point and significance of Procedural Justice. (shrink)
Nichols’s view of empathy (in Sentimental Rules), considered in light of experimental moral psychology, suffers from several deficiencies: (1) It operates with an impoverished view of the altruistic emotions (empathy, sympathy, concern, compassion, etc.) as mere short-term, affective states of mind, lacking any essential connection to intentionality, perception, cognition, and expressiveness. (2) It fails to keep in focus the moral distinction between two very different kinds of emotional response to the distress and suffering of others—other-directed, altruistic emotions that have moral value, and self-directed emotional responses, such as personal distress, that do not. (3) Nichols is correct to see morality as requiring affectivity and the capability of emotional response to others; but his incorrect view of altruistic emotions (and of emotions in general) leads him to misstate the connection between morality and emotion. (4) Nichols’s specific attempt to ground moral judgment in emotion fails, but the argument he provides for it is part of the explanation of point (2), his failure to sustain the distinction between egoistic and altruistic emotions. (5) Without in any way denying that moral philosophy is strengthened by knowledge of empirical psychology, I suggest that the foregoing failures of Nichols’s argument are partly due to his misuse of particular empirical results and findings, and possibly in part to a weakened commitment to the distinctive contribution the humanistic methods of philosophy make to our understanding of the moral dimension of life.
The continuing rejection of anthropogenic global warming by non-experts despite overwhelming scientific consensus is rationally untenable and best described as “pseudoskeptical”; it is akin to AIDS denialism, the advocacy of intelligent design, and anti-vaccination movements.
SNePS, the Semantic Network Processing System [45, 54], has been designed to be a system for representing the beliefs of a natural-language-using intelligent system (a "cognitive agent"). It has always been the intention that a SNePS-based "knowledge base" would ultimately be built, not by a programmer or knowledge engineer entering representations of knowledge in some formal language or data entry system, but by a human informing it using a natural language (NL) (generally supposed to be English), or by the system reading books or articles that had been prepared for human readers. Because of this motivation, the criteria for the development of SNePS have included: it should be able to represent anything and everything expressible in NL; it should be able to represent generic, as well as specific, information; it should be able to use the generic and the specific information to reason and infer information implied by what it has been told; it cannot count on any particular order among the pieces of information it is given; and it must continue to act reasonably even if the information it is given includes circular definitions, recursive rules, and inconsistent information.
ABSTRACT: Stewart Shapiro recently argued that there is no higher-order vagueness. More specifically, his thesis is: (ST) ‘So-called second-order vagueness in “F” is nothing but first-order vagueness in the phrase “competent speaker of English” or “competent user of ‘F’”.’ Shapiro bases (ST) on a description of the phenomenon of higher-order vagueness and two accounts of ‘borderline case’, and provides several arguments in its support. We present the phenomenon (as Shapiro describes it) and the accounts; we then discuss Shapiro’s arguments, arguing that none is compelling. Lastly, we introduce the account of vagueness Shapiro would have obtained had he retained compositionality and show that it entails true higher-order vagueness.
Stereotypes are false or misleading generalizations about groups, generally widely shared in a society, and held in a manner resistant, though not totally so, to counterevidence. Stereotypes shape the stereotyper’s perception of stereotyped groups, leading the stereotyper to see the stereotypic characteristics when they are not present and generally to homogenize the group. The association between the group and the given characteristic involved in a stereotype often involves a cognitive investment weaker than that of belief. The cognitive distortions involved in stereotyping lead to various forms of moral distortion, to which moral philosophers have paid insufficient attention. Some of these are common to all stereotypes—failing to see members of the stereotyped group as individuals, moral distancing, failing to see subgroup diversity within the group. Other moral distortions vary with the stereotype. Some attribute a much more damaging or stigmatizing characteristic (e.g. being violent) than others (e.g. being good at basketball). But the characteristic in question must also be viewed in its wider historical and social context to appreciate its overall negative and positive dimensions.
This chapter constructs the argument that corporate and political policies known to accelerate anthropogenic global warming, and subsequent climate change, constitute crimes against humanity—predicated on failures to avoid reasonably foreseeable threats to sustained human existence. Given the moral gravity of crimes against humanity, it follows that financial divestment is ethically obligatory for institutions wishing to avoid moral association. The moral case for fossil fuel divestment, in the wake of such crimes, derives from (a) the ethical implications of negative responsibility, or what constitutes culpability by omission; (b) institutional collaboration based upon immediate financial interests tied to climate inaction; and (c) institutional complicity derived from financial support of corporate behaviors that hasten global warming. Possible objections are examined and refuted.
Locke's claims about the "inadequacy" of substance-ideas can only be understood once it is recognized that the "sort" represented by such an idea is not wholly determined by the idea's descriptive content. The key to his compromise between classificatory conventionalism and essentialism is his injunction to "perfect" the abstract ideas that serve as "nominal essences." This injunction promotes the pursuit of collections of perceptible qualities that approach ever closer to singling out things that possess some shared explanatory-level constitution. It is in view of this norm regulating natural-historical inquiry that a substance-idea represents a sort for which some such constitution serves as the "real essence," i.e. as that on which all the sort's characteristic "properties" depend.
The Brown v. Board of Education decision of 1954 mandated school integration. The decision failed, however, to recognize that inequalities outside the schools, of both a class- and race-based nature, prevent equality in education. Today, the most prominent argument for integration is that disadvantaged students benefit from the financial, social, and cultural “capital” of middle-class families when the children attend the same schools. This argument fails to recognize that disadvantaged students contribute to advantaged students’ educational growth, and it sends demeaning messages to the disadvantaged students and messages of unwarranted superiority to the advantaged. Parents, teachers, and schools can adopt a justice perspective that avoids these deleterious aspects of the capital argument and helps create a community of equals inside the integrated school. Struggles for educational justice must remain closely linked with struggles for other forms of justice, of both a class- and race-based nature, in the wider society.
This article explicates the views on both race and ethnicity of three prominent Latinx philosophers (Corlett, Gracia, and Alcoff), compares them (somewhat), and offers some criticisms. Corlett jettisons race as a categorization of groups, but accepts a form of racialization somewhat at odds with this jettisoning. Gracia adopts as a general principle that an account of both ethnicity and race should help us see aspects of reality that would otherwise be obscured; but this is at odds with his regarding the Latin American view of race as more rational than the U.S. version with its “one-drop rule.” The latter has structured the reality of race in the U.S. for African Americans. Alcoff is much more concerned with the phenomenology of race and ethnicity than the other two, and she clearly adds “pan-ethnicity” to the mix of concepts required to understand Latino/a Americans. I argue that she fails to see the agentic and political aspect of black identity in the U.S., and in a sense shares with Gracia a misplaced sense that the mixedness of Latin American racial identity is somehow to be preferred to the more binary U.S. form.
Forty years’ experience as a bacterial geneticist has taught me that bacteria possess many cognitive, computational and evolutionary capabilities unimaginable in the first six decades of the twentieth century. Analysis of cellular processes such as metabolism, regulation of protein synthesis, and DNA repair established that bacteria continually monitor their external and internal environments and compute functional outputs based on information provided by their sensory apparatus. Studies of genetic recombination, lysogeny, antibiotic resistance and my own work on transposable elements revealed multiple widespread bacterial systems for mobilizing and engineering DNA molecules. Examination of colony development and organization led me to appreciate how extensive multicellular collaboration is among the majority of bacterial species. Contemporary research in many laboratories on cell–cell signaling, symbiosis and pathogenesis shows that bacteria utilise sophisticated mechanisms for intercellular communication and even have the ability to commandeer the basic cell biology of ‘higher’ plants and animals to meet their own needs. This remarkable series of observations requires us to revise basic ideas about biological information processing and recognise that even the smallest cells are sentient beings.
This work is an examination of Peter Singer’s notion of speciesism and a case for animal rights in Ejagham culture. It evaluates the phenomenon of animal rights from the standpoint of Singer’s notion of speciesism, which concerns the moral obligations humans owe to animals, as against the bias or prejudice that humanity has greater moral worth than non-human animals. Defenders of speciesism contend that animals are not members of the moral community and that humans therefore have no moral obligation toward them. Contrary to this view, critics of speciesism argue that animals are capable of suffering and should be given moral consideration. The emphasis here is that, like many societies of the world, the Ejagham people are guilty of speciesism, and this work identifies hunting, deforestation, bush burning, and fishing as the practices by which they are so. Using the tools of critical analysis, evaluation, and prescription, this work submits that animals have interests and, as such, should be granted rights.