Kant’s distinction between the determining and reflecting power of judgment in the third Critique is not well understood in the literature. A mainstream view unifies these by making determination the telos of all acts of judgment (Longuenesse 1998). On this view, all reflection is primarily in the business of producing empirical concepts for cognition, and thus has what I call a determinative ideal. I argue that this view fails to take seriously the independence and autonomy of the ‘power of judgment’ [Urteilskraft] as a higher cognitive faculty in its own right with its own a priori principle. Instead of seeing merely reflecting judgments as failed or incomplete acts of judgment, I argue that these are in fact paradigmatic of the activity of the power of judgment. More precisely, the reflecting power of judgment just is the power of judgment. Accordingly, reflecting judgment takes precedence over determining judgment; while the former operates according to a law that it gives itself, the latter requires another higher cognitive faculty to provide its principle. On my view, reflecting judgment should be understood as the capacity for purposive subsumption—most clearly seen in the activity of mere reflection.
Anselm is known for offering a distinctive definition of freedom of choice as “the ability of preserving uprightness of will for its own sake.” When we turn to Anselm’s account of the devil’s fall in De Casu Diaboli, however, this idiosyncratic understanding of freedom is not at the forefront. In that text, Anselm seemingly assumes a traditional understanding of free will defined in terms of alternative possibilities for the angels. These alternative possibilities must be present so the angels can engage in ‘self-determination.’ God, however, does not face alternative possibilities to achieve His self-determination. Anselm thus explicates his notion of free will in terms of three different concepts: his distinctive definition of free choice, self-determination, and the principle of alternative possibilities. Despite attempts (by both scholars and Anselm) to explain how these three concepts are related, I argue that their relationship is problematic. In particular, I argue that Anselm is guilty of conflating and equivocating with regard to these concepts. Importantly, I further claim that the conflation obscures the fact that his understanding of self-determination calls into question God’s excellence over that of the good angels.
Disputes over territory are among the most contentious in human affairs. Throughout the world, societies view control over land and resources as necessary to ensure their survival and to further their particular life-style, and the very passion with which claims over a region are asserted and defended suggests that difficult normative issues lurk nearby. Questions about rights to territory vary. It is one thing to ask who owns a particular parcel of land, another who has the right to reside within its boundaries and yet another to determine which individuals or groups have political rights of citizenship, sovereignty, and self-determination within it. It must also be asked how these rights—if ‘rights’ is the correct term—are acquired. When attention turns to the territorial rights of communities, national groups or states, sovereignty is the principal concern. Within international law, de facto power over a territory, say, of occupying forces or trustees, is insufficient to possess or acquire sovereignty (Brownlie, 1990, p. 111). The central conceptions underlying modern democratic thought are that sovereignty over a politically demarcated territory is vested in the resident population, and that governmental authority is derived from the consent of that population. It is simple enough to identify the latter with the citizenry of a state, but demographic and political flux makes this a loose criterion. States come and go, and sometimes a territory is stateless. Also, large-scale demographic shifts during upheavals and peacetime immigrations change the assessments of who belongs where. Does everyone residing in a place at a particular time have a right to share in its governance then? What about illegal immigrants? Presumably, sovereignty rests with the established population or..
Functional decomposition is an important goal in the life sciences, and is central to mechanistic explanation and explanatory reduction. A growing literature in philosophy of science, however, has challenged decomposition-based notions of explanation. ‘Holists’ posit that complex systems exhibit context-sensitivity, dynamic interaction, and network dependence, and that these properties undermine decomposition. They then infer from the failure of decomposition to the failure of mechanistic explanation and reduction. I argue that complexity, so construed, is only incompatible with one notion of decomposition, which I call ‘atomism’, and not with decomposition writ large. Atomism posits that function ascriptions must be made to parts with minimal reference to the surrounding system. Complexity does indeed falsify atomism, but I contend that there is a weaker, ‘contextualist’ notion of decomposition that is fully compatible with the properties that holists cite. Contextualism suggests that the function of parts can shift with external context, and that interactions with other parts might help determine their context-appropriate functions. This still admits of functional decomposition within a given context. I will give examples based on the notion of oscillatory multiplexing in systems neuroscience. If contextualism is feasible, then holist inferences are faulty—one cannot infer from the presence of complexity to the failure of decomposition, mechanism, and reductionism.
Throughout its brief history the philosophy of technology has been largely concerned with the debate over the nature of technology. Typically, technology has been viewed as being essentially another term for applied science, the practical application of scientific theory to the material world. In recent years philosophers and cultural critics have characterised technology in a far more problematic fashion, as an authoritarian power with the ability to bring about far-reaching cultural, political and ecological effects. Proponents of the former view are often termed instrumentalists and those of the latter technological determinists. The debate between them revolves around the question of the fact/value distinction, namely whether technology can be deemed to be value-neutral. I argue that employing a phenomenological approach to technology grants us a fresh perspective on the instrumentalism-determinism debate. It enables us to recast the instrumentalist/determinist debate as a debate between technological idealism and materialism, and to ground the instrumentalist and determinist positions in different experiential relations to technology. It also gives us a better grasp of the function of the different critiques of technology, with idealists concerned primarily with the misapplication of technology as a form of knowledge, and materialists with the existential implications of concrete technological relations.
The doctrine of permanent sovereignty over natural resources is a hugely consequential one in the contemporary world, appearing to grant nation-states both jurisdiction-type rights and rights of ownership over the resources to be found in their territories. But the normative justification for that doctrine is far from clear. This article elucidates the best arguments that might be made for permanent sovereignty, including claims from national improvement of or attachment to resources, as well as functionalist claims linking resource rights to key state functions. But it also shows that these defences are insufficient to justify permanent sovereignty and that in many cases they actually count against it as a practice. They turn out to be compatible, furthermore, with the dispersal of resource rights away from the nation-state which global justice appears to demand.
Cases of heroic supererogation have been taken to suggest that non-moral reasons are morally relevant. While non-moral reasons are unable to make actions morally required, they can prevent moral reasons from doing so. I argue that non-moral reasons are morally relevant in yet another way, since they can also play an essential role in making it the case that an action is morally required. Even though non-moral reasons are not able themselves to make actions morally required, they can prevent reasons that otherwise would prevent moral reasons from making actions morally required from doing so. I elaborate and defend this view, and I show how it can be made the basis of an explanation of moral requirements in terms of reasons that is to be preferred over rival accounts.
Feelings of belonging are integral in people’s choice of what career to pursue. Women and men are disproportionately represented across careers, starting with academic training. The present research focuses on two fields that are similar in their history and subject matter but feature inverse gender gaps—psychology (more women than men) and philosophy (more men than women)—to investigate how theorized explanations for academic gender gaps contribute to feelings of belonging. Specifically, we simultaneously model the relative contribution of theoretically relevant individual differences (empathizing, systematizing, and intellectual combativeness) as well as life goals (prioritization of family, money, and status) to feelings of belonging and majoring in psychology or philosophy. We find that men report higher intellectual combativeness than women, and intellectual combativeness predicts feelings of belonging and majoring in philosophy over psychology. Although systematizing and empathizing are predictive of belonging and, in turn, majoring in psychology and philosophy, respectively, when other factors are taken into account, women and men do not differ in empathizing and systematizing. Women, more than men, report prioritizing having a family, wealth, and status in choosing a career, and these directly or indirectly feed into feelings of belonging and majoring in psychology, in contrast to prior theory. Together, these findings suggest that students’ perceptions of their own combativeness and the extent to which they desire money and status play essential roles in women’s feeling they belong in psychology and men’s feeling they belong in philosophy.
Sleep onset is associated with marked changes in behavioral, physiological, and subjective phenomena. In daily life, though, subjective experience is the main criterion by which we identify it. Yet very few studies have focused on these experiences. This study seeks to identify the subjective variables that reflect sleep onset. Twenty young subjects took an afternoon nap in the laboratory while polysomnographic recordings were made. They were awakened four times in order to assess subjective experiences that correlate with (1) the appearance of slow eye movements, (2) the initiation of stage 1 sleep, (3) the initiation of stage 2 sleep, and (4) the point 5 min after the start of stage 2 sleep. A logistic regression identified control over and logic of thought as the two variables that predict the perception of having fallen asleep. For sleep perception, these two variables accurately classified 91.7% of the cases; for the waking state, 84.1%.
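The classification step described above can be made concrete with a minimal sketch of a logistic-regression classifier; the ratings, variable names, and scale below are illustrative assumptions, not the study's data or code.

```python
# A minimal sketch, assuming hypothetical 1-5 ratings of the two predictors
# ("control over thought", "logic of thought") and a binary outcome
# (did the participant report having fallen asleep?). Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [control_over_thought, logic_of_thought]
X = np.array([[5, 5], [4, 5], [5, 4], [4, 4],   # awakenings reported as "awake"
              [2, 1], [1, 2], [1, 1], [2, 2]])  # awakenings reported as "asleep"
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])          # 1 = perceived having fallen asleep

model = LogisticRegression().fit(X, y)
print(model.predict([[1, 2]]))   # predicted perception for a new awakening
print(model.score(X, y))         # in-sample classification accuracy
```

The accuracies reported in the abstract (91.7% and 84.1%) are classification rates of this general kind.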
This paper concerns model-checking of fragments and extensions of CTL* on infinite-state Presburger counter systems, where the states are vectors of integers and the transitions are determined by means of relations definable within Presburger arithmetic. In general, reachability properties of counter systems are undecidable, but we have identified a natural class of admissible counter systems (ACS) for which we show that the quantification over paths in CTL* can be simulated by quantification over tuples of natural numbers, eventually allowing translation of the whole Presburger-CTL* into Presburger arithmetic, thereby enabling effective model checking. We provide evidence that our results are close to optimal with respect to the class of counter systems described above.
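As an informal illustration of the objects under discussion (not the paper's admissible class or its Presburger translation), the sketch below encodes a toy counter system whose states are integer vectors and whose transitions are guarded affine updates, together with a bounded breadth-first reachability search; the bound is needed because, as noted above, reachability for counter systems is undecidable in general.

```python
# A toy counter system: states are pairs (x, y) of integers; each transition
# is a (guard, update) over the counter vector. Illustrative assumptions only.
from collections import deque

transitions = [
    (lambda s: s[0] > 0,  lambda s: (s[0] - 1, s[1] + 2)),  # if x > 0: x -= 1, y += 2
    (lambda s: True,      lambda s: (s[0] + 1, s[1])),      # unconditionally: x += 1
    (lambda s: s[1] >= 3, lambda s: (s[0], s[1] - 3)),      # if y >= 3: y -= 3
]

def reachable(init, target, max_steps=10_000):
    """Bounded BFS over the infinite space of counter valuations."""
    seen, frontier, steps = {init}, deque([init]), 0
    while frontier and steps < max_steps:
        state = frontier.popleft()
        steps += 1
        if state == target:
            return True
        for guard, update in transitions:
            if guard(state):
                nxt = update(state)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return False

print(reachable((0, 0), (0, 4)))  # True: (0,0) -> (1,0) -> (0,2) -> (1,2) -> (0,4)
```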
There are a number of important links and similarities between public health and safety. In this extended essay, Gregg D. Caruso defends and expands his public health-quarantine model, which is a non-retributive alternative for addressing criminal behavior that draws on the public health framework and prioritizes prevention and social justice. In developing his account, he explores the relationship between public health and safety, focusing on how social inequalities and systemic injustices affect health outcomes and crime rates, how poverty affects brain development, how offenders often have pre-existing medical conditions (especially mental health issues), how involvement in the criminal justice system itself can lead to or worsen health and cognitive problems, how treatment and rehabilitation methods can best be employed to reduce recidivism and reintegrate offenders back into society, and how a public health approach could be successfully applied within the criminal justice system. Caruso's approach draws on research from the health sciences, social sciences, public policy, law, psychiatry, medical ethics, neuroscience, and philosophy, and he delivers a set of ethically defensible and practically workable proposals for implementing the public health-quarantine model. The essay begins by discussing recent empirical findings in psychology, neuroscience, and the social sciences that provide us with an increased understanding of the social and neurological determinants of health and criminal behavior. It then turns to Caruso's public health-quarantine model and argues that the model provides the most justified, humane, and effective approach for addressing criminal behavior. Caruso concludes by proposing a capability approach to social justice grounded in six key features of human well-being. He argues that we cannot successfully address concerns over public health and safety without simultaneously addressing issues of social justice—including the social determinants of health (SDH) and the social determinants of criminal behavior (SDCB)—and he recommends eight general policy proposals consistent with his model.
This paper argues that the influence of language on science, philosophy and other fields is mediated by communicative practices. Where communication is more restrictive, established linguistic structures exercise a tighter control over innovations and scientifically motivated reforms of language. The viewpoint here centers on the thesis that argumentation is crucial to the understanding and evaluation of proposed reforms, and that social practices which limit argumentation serve to erode scientific objectivity. Thus, a plea is made for a sociology of scientific belief designed to understand and ensure the social-institutional conditions of the possibility of knowledge and its growth. A chief argument draws on the work of Axelrod concerning the evolution of cooperation.
Children's vulnerability gives rise to duties of justice towards children and determines when authority over them is legitimately exercised. I argue for two claims. First, children's general vulnerability to objectionable dependency on their caregivers entails that they have a right not to be subject to monopolies of care, and therefore determines the structure of legitimate authority over them. Second, children's vulnerability to the loss of some special goods of childhood determines the content of legitimate authority over them. My interest is in the so-far little-discussed goods of engaging in world discovery, artistic creation, philosophical pursuits and experimentation with one's self. I call these ‘special goods of childhood’ because individuals, in general, only have full access to them during childhood and they make a distinctive and weighty contribution to wellbeing. Therefore, they are part of the metric of justice towards children. The overall conclusion is that we ought to make good institutional care part of every child's upbringing.
In debating the ethics of immigration, philosophers have focused much of their attention on determining whether a political community ought to have the discretionary right to control immigration. They have not, however, given the same amount of consideration to determining whether there are any ethical limits on how a political community enforces its immigration policy. This article, therefore, offers a different approach to immigration justice. It presents a case against legitimate states having discretionary control over immigration by showing both how ethical limits on enforcement circumscribe the options legitimate states have in determining their immigration policy and how all immigrants (including undocumented immigrants) are entitled to certain protections against a state’s enforcement apparatus.
The temporal unity of the self cannot be accounted for by the continuity of causal, factual, or contiguous relations between independently definable mental events, as proposed by Locke and Parfit. The identity of the self over time is normative: it depends on the institutional context of social rules external to the self that determine the relationship between past commitments and current responsibilities.
The paper argues that while Serbian society and the political elite are known for treating their country’s accession to the EU in terms of pragmatic utility maximisation, they generally conceive of Serbian relations with Russia, contrariwise, as an identity-laden issue. To support this claim, the author analyses Serbia’s behaviour toward Russia along the features of emotion-driven cooperation found in the literature on identity and emotions in foreign policy. In particular, the paper focuses on Serbians’ especially strong friendliness vis-à-vis Russia, the parallel existence of the Other (the West) in their identity and the particularly strong intensity of their attraction to Russia during Serbia-West conflicts, the reinforcement of their affection for Russia by national traumas, the endurance of the affection’s strength despite conflicting rational interests and negative experiences in bilateral interaction, the frequent occurrence of references to Russia in Serbia’s domestic discourse and decisional justifications, and a large use of historical analogies concerning Russia. Finally, the author considers the implications of the existing configuration of emotional and pragmatic forces in Serbian politics for the country’s current and future conduct toward Russia and the EU.
European colonialism is one of the most controversial episodes in world history and is still debated today. This paper explores the influence of these events, as the world has incorporated European culture into every sphere of life. Europe dominated almost the entire world and its people were leaders in science and technology. European languages, literature, and culture spread all over the globe. Decisions made in Europe largely determined global events for centuries. The other continents did not approach European power until after WW1 or WW2. Europe consisted of constantly competing nations, and this competition spurred development. Europe achieved world hegemony in the years after 1500 A.D., primarily due to technological advancements, scientific research, the political development of nations with stable succession and continuity, and a culture dominated by Christianity. These advantages were translated into military power, projected all over the world.
Can we ever justly critique the norms and practices of another culture? When activists or policy-makers decide that one culture’s traditional practice is harmful and needs to be eradicated, does it matter whether they are members of that culture? Given the history of imperialism, many argue that any critique of another culture’s practices must be internal. Others argue that we can appeal to a universal standard of human wellbeing to determine whether or not a particular practice is legitimate or whether it should be eradicated. In this paper, I use the FGC eradication campaigns of the 1980s to show that the internal/external divide is complicated by the interconnectedness of these debates on the international level. As the line blurs between internal and external criticism and interventions, new questions emerge about the representativeness of global institutions.
In this paper, I argue that standard psychological continuity theory does not account for an important feature of what matters in survival – having the property of personhood. I offer a theory that can account for this, and I explain how it avoids the implausible consequences of standard psychological continuity theory, as well as having certain other advantages over that theory.
Persons interested in developing virtue will find that attending to, and attempting to act on, the right reason for action is a rich resource for doing so. In this paper I consider the role of self-knowledge in intentional moral development. I begin by making a general case that because improving one’s moral character requires intimate knowledge of its components and their relation to right reason, the aim of developing virtue typically requires the development of self-knowledge. I next turn to Kant’s ethics for an account which explains the reflexivity involved in moral reasoning generally, and the significance of self-knowledge to morality. I then take up Robert Audi’s interesting notion of the harnessing and unharnessing of reasons as a potential way of strengthening the agent’s connection to right reason, and his concerns about our limited and indirect resources for becoming virtuous. I argue that harnessing and unharnessing are not plausibly characterized as activities to be accomplished by an exertion of will; rather, they involve a dynamic, cognitive, reflective attempt to gain self-knowledge and align oneself with one’s moral reasons for action.
A standard way of representing causation is with neuron diagrams. This has become popular since the influential work of David Lewis. But it should not be assumed that such representations are metaphysically neutral and amenable to any theory of causation. On the contrary, this way of representing causation already makes several Humean assumptions about what causation is, assumptions which suit Lewis’s programme of Humean Supervenience. An alternative, the vector diagram, is better suited to a powers ontology. Causation should be understood as connecting property types and tokens where there are dispositions towards some properties rather than others. Such a model illustrates how an effect is typically polygenous: caused by many powers acting with each other, and sometimes against each other. It models causation as a tendency towards an effect which can be counteracted. The model can represent cases of causal complexity, interference, over-determination and causation of absence (equilibrium).
On the traditional picture, accidents must inhere in substances in order to exist. Berkeley famously argues that a particular class of accidents—the sensible qualities—are mere ideas—entities that depend for their existence on minds. To defend this view, Berkeley provides us with an elegant alternative to the traditional framework: sensible qualities depend on a mind, not in virtue of inhering in it, but in virtue of being perceived by it. This metaphysical insight, once correctly understood, gives us the resources to solve a central problem that still plagues the philosophy of perception—the problem of how, given the power of the mind to create phenomenally rich experiences, ordinary perception can nonetheless be said to acquaint us with the mind-independent world.
In this paper, by suggesting a formal representation of science based on recent advances in logic-based Artificial Intelligence (AI), we show how three serious concerns around the realisation of traditional scientific realism (the theory/observation distinction, over-determination of theories by data, and theory revision) can be overcome such that traditional realism is given a new guise as ‘naturalised’. We contend that such issues can be dealt with (in the context of scientific realism) by developing a formal representation of science based on the application of the following tools from Knowledge Representation: the family of Description Logics, an enrichment of classical logics via defeasible statements, and an application of the preferential interpretation of the approach to Belief Revision.
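For orientation, here is a schematic example (my notation and concept names, not the authors' formalization) of the kind of defeasible statement that can be added to a Description Logic knowledge base under a preferential reading:

```latex
% Defeasible subsumption (\sqsubseteq_{\sim}) read as: the most typical members
% of the left-hand concept fall under the right-hand concept.
\[
  \mathsf{Bird} \mathrel{\sqsubseteq_{\sim}} \mathsf{Flier}, \qquad
  \mathsf{Penguin} \sqsubseteq \mathsf{Bird}, \qquad
  \mathsf{Penguin} \sqsubseteq \lnot\mathsf{Flier}.
\]
```

Under the preferential interpretation the first statement constrains only the most typical birds, so the exceptional subclass is accommodated without classical inconsistency; statements of this shape, together with belief-revision-style retraction, illustrate how a formalized theory can be revised rather than discarded when anomalies arrive.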
Ludwig Wittgenstein, in the "Remarks on the Foundations of Mathematics", often refers to contradictions as deserving special study. He is said to have predicted that there will be mathematical investigations of calculi containing contradictions and that people will pride themselves on having emancipated themselves from consistency. This paper examines a way of taking this prediction seriously. It starts by demonstrating that the easy way of understanding the role of contradictions in a discourse, namely in terms of pure convention within a specific linguistic community, is naive and therefore to be avoided. A number of steps will then be taken to formulate and justify what will be called a ‘containment-account’ of contradiction. This term refers to a way of understanding how a contradiction in a discourse can be contained without allowing it to infect the entire discourse with inconsistency. The account builds on work done by N. Rescher and involves complex ontologies that are either over-determined or under-determined at some points. These worlds are such that, for some P, they allow us to deny simultaneously both that P and that -P. They likewise allow us to hold both that P and that -P. The ontological status of P and that of -P are considered independent issues, so as to block the inference from P and -P to the conjunction (P & -P). In this way, the task of containment is carried out by the complexity of the ontology determining the possible worlds under consideration. The paper proceeds by countering the objection that such possible worlds are useless fictions. This is carried out by highlighting three significant areas of application, one in philosophy of science, one in philosophy of religion, and one in the area of epistemology. In general, one may say that, in some language-games, a contradiction can be a useful instrument to attest that rationality goes beyond ratiocination, in other words to indicate that there is more to discourse and practice than what can be expressed in precise algorithmic trees. The upshot of the entire argument is that the proposed ‘containment-account’ of contradiction can be an adequate illustration of how Wittgenstein’s prediction is not sterile, but a possible source of inspiration.
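One way to put the blocking move just described into symbols (a sketch in my notation, not the author's formal apparatus): because the ontological status of P and of -P are treated as independent, adjunction fails, and the classical explosion step never becomes applicable.

```latex
\[
  \Gamma \vdash P \ \text{ and } \ \Gamma \vdash \lnot P
  \quad\not\Longrightarrow\quad \Gamma \vdash (P \land \lnot P),
  \qquad \text{so the step } (P \land \lnot P) \vdash Q \ \text{is never triggered.}
\]
```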
The paper will explicate the Sache or matter of the dialectic of the founder of Kyoto School philosophy, Nishida Kitarō (1870-1945), from the standpoint of his mature thought, especially from the 1930s and 40s. Rather than providing a simple exposition of his thought I will engage in a creative reading of his concept of basho (place) in terms of chiasma and chōra, or a chiasmatic chōra. I argue that Nishida’s appropriation of nineteenth century German, especially Hegelian, terminology was inadequate in expressing what he strove to say—for his concept of basho confounds traditional metaphysical discourse. Because of its chiasmatic and chōratic nature, the Sache he strove to capture and express through the language of dialectical philosophy, perpetually slips away from any systemic bounds. His “dialectic” (benshōhō) implies a chiasma or a criss-crossing of multiple factors on multi-dimensional levels that exceed in complexity simplistic binomial oppositions or the triadic formula of traditional dialectics. The complexity is one of over-determination that threatens to undermine the very language of such a dialectic. As the deep complexity of over-inter-determinations would deconstruct any notion of a substance, what Nishida offers—as opposed to an ousiology (or logic of substance)—is a chiasmology. I thus argue that his so-called dialectic is really an unfolding of that chiasma. And if chiasma expresses the over-determinate aspect of Nishida’s matter of thinking, chōra would express its under-determinate aspect. Nishida himself based his concept of basho or “place” on Plato’s notion of the chōra from the Timaeus. I take Nishida’s basho in its chōratic nature as what simultaneously unfolds and enfolds the chiasma. But in the case of the chōra it is its under-determinate nature that refuses reduction to any of the terms of opposition. In its self-withdrawal, it provides a clearing, a space, for the chiasmatic unraveling of the many. Like the chiasma it undermines any claim to a first substance or the hegemony of a universal First. For in its indeterminateness, it is “nothing” (mu). The unfolding it enfolds is, as Nishida states, “a determination without determiner.” In concrete terms, however, we might develop Nishida’s concept further by returning to the original pre-Platonic Greek meaning of chōra in the sense of “region” or “country,” to understand chōra or basho here as the very space of co-existence provided by this very earth. As a chiasmatic chōra irreducible, in its over- and under-determinations, to being or non-being, Nishida’s basho qua mu proves to be the an-ontological origin of both on and meon (being and non-being). Rejecting the culture-nature dichotomy this notion of our place of being as chiasma and chōra underscores our holistic symbiosis with the earth as the anontological (un)ground and clearing for our co-existence in a concrete milieu with one another and with nature. It is this earth as our ultimate contextual wherein that provides a clearing for co-dwelling and mutual encounter with one’s other, that we must acknowledge today if we are to co-exist authentically and freely vis-à-vis our global neighbors and vis-à-vis the surrounding nature.
Critics of carbon mitigation often appeal to what Jonathan Glover has called ‘the argument from no difference’: that is, ‘If I don’t do it, someone else will’. Yet even if this justifies continued high emissions by the industrialised countries, it cannot excuse business as usual. The North’s emissions might not harm the victims of climate change in the sense of making them worse off than they would otherwise be. Nevertheless, it receives benefits produced at the latter’s expense, with the result that it has more than it deserves, and that victims will have less. This enrichment is unjust; unjustly enriched agents ought to make compensation. The best form of compensation is vigorous action against climate change.
Over the years, companies have adopted hiring algorithms because they promise wider job candidate pools, lower recruitment costs and less human bias. Despite these promises, they also bring perils. Using them can inflict unintentional harms on individual human rights. These include the five human rights to work, equality and nondiscrimination, privacy, free expression and free association. Despite the human rights harms of hiring algorithms, the AI ethics literature has predominantly focused on abstract ethical principles. This is problematic for two reasons. First, AI principles have been criticized for being vague and not actionable. Second, the use of vague ethical principles to discuss algorithmic risks does not provide any accountability. This lack of accountability creates an algorithmic accountability gap. Closing this gap is crucial because, without accountability, the use of hiring algorithms can lead to discrimination and unequal access to employment opportunities. This paper makes two contributions to the AI ethics literature. First, it frames the ethical risks of hiring algorithms using international human rights law as a universal standard for determining algorithmic accountability. Second, it evaluates four types of algorithmic impact assessments in terms of how effectively they address the five human rights of job applicants implicated in hiring algorithms. It determines which of the assessments can help companies audit their hiring algorithms and close the algorithmic accountability gap.
One of the central topics in semantic theory over the last few decades concerns the nature of local contexts. Recently, theorists have tried to develop general, non-stipulative accounts of local contexts (Schlenker, 2009; Ingason, 2016; Mandelkern & Romoli, 2017a). In this paper, we contribute to this literature by drawing attention to the local contexts of subclausal expressions. More specifically, we focus on the local contexts of quantificational determiners, e.g. ‘all’, ‘both’, etc. Our central tool for probing the local contexts of subclausal elements is the principle Maximize Presupposition! (Percus, 2006; Singh, 2011). The empirical basis of our investigation concerns some data discussed by Anvari (2018b), e.g. the fact that sentences such as ‘All of the two presidential candidates are crooked’ are unacceptable. In order to explain this, we suggest that the local context of determiners needs to contain the information carried by their restrictor. However, no existing non-stipulative account predicts this. Consequently, we think that the local contexts of subclausal expressions will likely have to be stipulated. This result has important consequences for debates in semantics and pragmatics, e.g. those around the so-called "explanatory problem" for dynamic semantics (Soames, 1982; Heim, 1990; Schlenker, 2009).
Debates over the reality of race often rely on arguments about the connection between race and science—those who deny that race is real argue that there is no significant support from science for our ordinary race concepts; those who affirm that race is real argue that our ordinary race concepts are supported by scientific findings. However, there is arguably a more fundamental concern here: How should we define race concepts in the first place? The reason I claim that this definitional question is more fundamental is that our handling of the underlying definitional problem often determines the scientific support our ordinary race concepts need, and importantly the likelihood of finding such support. In short, the definitional question, “How do we define race?” often undercuts the question of whether race is scientifically meaningful.
After determining one set of skills that we hoped our students were learning in the introductory philosophy class at Carnegie Mellon University, we performed an experiment twice over the course of two semesters to test whether they were actually learning these skills. In addition, there were four different lectures of this course in the first semester, and five in the second; in each semester students in some lectures were taught the material using argument diagrams as a tool to aid understanding and critical evaluation, while the other students were taught using more traditional methods. In each lecture, the students were given a pre-test at the beginning of the semester, and a structurally identical post-test at the end. We determined that the students did develop the skills in which we were interested over the course of the semester. We also determined that the students who were taught argument diagramming gained significantly more than the students who were not. We conclude that learning how to construct argument diagrams significantly improves a student’s ability to analyze arguments.
The debate about justice in immigration has somewhat stagnated, given that justice seems to require both further exclusion and more porous borders. In the face of this, I propose to take a step back and recognize that the general problem of borders—to determine what kind of borders liberal democracies ought to have—gives rise to two particular problems: first, to justify exclusive control over the administration of borders (the problem of the legitimacy of borders) and, second, to specify how this control ought to be exercised (the problem of the justice of borders). The literature has explored the second but ignored the first. Therefore, I propose a different approach to the ethics of immigration by focusing on concerns of legitimacy in a three-step framework: first, identifying the kind of authority or power that immigration controls exercise; second, redefining borders as international and domestic institutions that issue that kind of power; and finally, considering supranational institutions that redistribute the right to exclude among legitimate borders.
One determining characteristic of contemporary sociopolitical systems is their power over increasingly large and diverse populations. This raises questions about power relations between heterogeneous individuals and increasingly dominant and homogenizing system objectives. This article crosses epistemic boundaries by integrating computer engineering and a historical-philosophical approach, making intelligible the general organization of individuals within large-scale systems and the corresponding homogenization of individuals. From a versatile archeological-genealogical perspective, an analysis of computer and social architectures is conducted that reinterprets Foucault’s disciplines and political anatomy to establish the notion of politics for a purely technical system. This permits an understanding of system organization as modern technology with application to technical and social systems alike. Connecting to Heidegger’s notions of the enframing (Gestell) and a more primal truth (anfänglicheren Wahrheit), the recognition of politics in differently developing systems then challenges the immutability of contemporary organization. Following this critique of modernity and within the conceptualization of system organization, Derrida’s democracy to come (à venir) is then reformulated more abstractly as organizations to come. Through the integration of the discussed concepts, the framework of Large-Scale Systems Composed of Homogeneous Individuals (LSSCHI) is proposed, problematizing the relationships between individuals, structure, activity, and power within large-scale systems. The LSSCHI framework highlights the conflict between homogenizing system-level objectives and individual heterogeneity, and outlines power relations and mechanisms of control shared across different social and technical systems.
Over the past decades a number of scholars have identified Johann Heinrich Bisterfeld as one of the most decisive early influences on Leibniz. In particular, the impressive similarity between their conceptions of universal harmony has been stressed. Since the issue of relations is at the heart of both Bisterfeld and Leibniz’s doctrines of universal harmony, the extent of the similarity between their doctrines will depend, however, on Bisterfeld and Leibniz’s respective theories of relations, and especially on their ontologies of relations. This paper attempts to determine in more detail whether Bisterfeld’s ontology of relations contains at least the germ of the defining features of the ontology of relations later developed by Leibniz. It comes to the conclusion that, although Bisterfeld’s theory of relations is not as fully developed and explicit as that of Leibniz, it does contain all the key “ingredients” of it.
Over the past 50 years, there has been a great deal of philosophical interest in laws of nature, perhaps because of the essential role that laws play in the formulation of, and proposed solutions to, a number of perennial philosophical problems. For example, many have thought that a satisfactory account of laws could be used to resolve thorny issues concerning explanation, causation, free-will, probability, and counterfactual truth. Moreover, interest in laws of nature is not constrained to metaphysics or philosophy of science; claims about laws play essential roles in areas as diverse as the philosophy of religion (e.g., in the argument from design) and the philosophy of mind (e.g., in the formulation of Davidson’s anomalous monism). In my dissertation, I consider and reject the widely-held thesis that the facts concerning laws can be reduced to the facts concerning the particular entities that the laws “govern,” and that the laws thus have no independent existence. I instead defend a version of nomic primitivism, according to which the facts about laws cannot be reduced to facts that are themselves non-nomic – i.e., to facts that do not fundamentally involve laws, counterfactuals, causes, etc. Insofar as the truth or falsity of reductionism about laws has implications for many of the problems mentioned above, I think that this result should be of interest even to those who do not work in metaphysics or the philosophy of science. My methodology, which I lay out and defend in Chapter One, is a version of Carnapian explication. This method emphasizes the importance of articulating and maintaining clear distinctions between (1) the vague concept (or concepts) law of nature inherent in ordinary language and scientific practice and (2) the precise analyses of “law of nature” that philosophers have proposed as potential replacements for this concept. I argue that metaphysics-as-explication has clear advantages over rival conceptions of metaphysical methodology; in particular, it allows us to formulate evaluative criteria for metaphysical claims. In Chapter Two, I offer an example of how careful attention to concepts already in use can help resolve philosophical debate. Specifically, I argue that much recent literature has mistakenly assumed that there is only one concept of “law of nature” in use, while there are in fact at least two. Strong laws are the principles pursued by fundamental physics: they are true, objective, and bear distinctive relationships to counterfactuals and explanation. Weak laws, by contrast, lack at least one of these distinctive characteristics but play central roles both in the “special sciences” and in everyday life. In Chapters Three and Four, I offer extended arguments against the two most prominent versions of reductionism about laws – Humeanism and law necessitarianism. According to philosophical Humeans, the laws of nature supervene upon the non-modal, non-nomic facts concerning the behavior of particular things at particular times and places. Law necessitarians, by contrast, argue that the laws are in fact metaphysically necessary, and that which laws there are is determined by a class of primitive, modally loaded facts concerning essences, natures, or dispositions. I argue that both of these views are mistaken insofar as they disagree with well-entrenched scientific practices, and that those in favor of reductionism have failed to provide sufficient reason for thinking that these practices should be revised.
Much of my argument is focused on the role played by a number of supposed methodological principles, including appeals to intuition, parsimony, and methodological naturalism. While the conclusions of this dissertation are explicitly constrained to laws, many of the arguments should be of interest to those who are concerned about philosophical methodology (especially the role of intuition in philosophical argument) or the appropriate relation between metaphysics, science, and the philosophy of science.
This essay (a revised version of my undergraduate honors thesis at Stanford) constructs a theory of analogy as it applies to argumentation and reasoning, especially as used in fields such as philosophy and law. The word analogy has been used in different senses, which the essay defines. The theory developed herein applies to analogia rationis, or analogical reasoning. Building on the framework of situation theory, a type of logical relation called determination is defined. This determination relation solves a puzzle about analogy in the context of logical argument, namely, whether an analogous situation contributes anything logically over and above what could be inferred from the application of prior knowledge to a present situation. Scholars of reasoning have often claimed that analogical arguments are never logically valid, and that they therefore lack cogency. However, when the right type of determination structure exists, it is possible to prove that projecting a conclusion inferred by analogy onto the situation about which one is reasoning is both valid and non-redundant. Various other properties and consequences of the determination relation are also proven. Some analogical arguments are based on principles such as similarity, which are not logically valid. The theory therefore provides us with a way to distinguish between legitimate and illegitimate arguments. It also provides an alternative to procedures based on the assessment of similarity for constructing analogies in artificial intelligence systems.
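For readers unfamiliar with determination relations, the formulation standardly associated with them runs as follows (whether this matches the essay's exact definition is an assumption on my part): a schema P(x, y) determines a schema Q(x, z) just in case

```latex
\[
  \forall y\,\forall z\,\Big[\,\exists x\,\big(P(x,y)\land Q(x,z)\big)
  \;\rightarrow\;\forall x\,\big(P(x,y)\rightarrow Q(x,z)\big)\Big].
\]
```

Given such a structure, the premises P(a, y0) and Q(a, z0) (the analogue) together with P(b, y0) (the present situation) validly yield Q(b, z0); and the analogue is not redundant, since the determination alone does not say which value of z goes with y0. This is the sense in which analogical projection can be both valid and non-redundant.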
The principle of national self-determination holds that a national community, simply by virtue of being a national community, has a prima facie right to create its own sovereign state. While many support this principle, not as many agree that it should be formally recognized by political institutions. One of the main concerns is that implementing this principle may lead to certain types of inequalities—between nations with and without their own states, members inside and outside the border, and members and nonmembers inside the same nation state. While these inequalities may arise, I shall argue that they are not unjust. These worries are partly the results of confusing two types of interests that a national group may have—in cultural affairs and in political affairs. While a national community should enjoy rights over their cultural affairs, this does not grant them authority over other non-cultural, political affairs. Once the distinction is drawn, we can see that there are constraints on the implementation of this principle. Consequently, these inequalities justify setting limits to a group’s right of self-government, although they do not conclusively refute the right itself.
How do we determine whether some candidate causal factor is an actual cause of some particular outcome? Many philosophers have wanted a view of actual causation which fits with folk intuitions of actual causation and those who wish to depart from folk intuitions of actual causation are often charged with the task of providing a plausible account of just how and where the folk have gone wrong. In this paper, I provide a range of empirical evidence aimed at showing just how and where the folk go wrong in determining whether an actual causal relation obtains. The evidence suggests that folk intuitions of actual causation are generated by two epistemically defective processes. I situate the empirical evidence within a background discussion of debunking, arguing for a two-pronged debunking explanation of folk intuitions of actual causation. I conclude that those who wish to depart from folk intuitions of actual causation should not be compelled to square their account of actual causation with the verdicts of the folk. In the dispute over actual causation, folk intuitions deserve to be rejected.
How should we determine the distribution of psychological traits—such as Theory of Mind, episodic memory, and metacognition—throughout the Animal kingdom? Researchers have long worried about the distorting effects of anthropomorphic bias on this comparative project. A purported corrective against this bias was offered as a cornerstone of comparative psychology by C. Lloyd Morgan in his famous “Canon”. Also dangerous, however, is a distinct bias that loads the deck against animal mentality: our tendency to tie the competence criteria for cognitive capacities to an exaggerated sense of typical human performance. I dub this error “anthropofabulation”, since it combines anthropocentrism with confabulation about our own prowess. Anthropofabulation has long distorted the debate about animal minds, but it is a bias that has been little discussed and against which the Canon provides no protection. Luckily, there is a venerable corrective against anthropofabulation: a principle offered long ago by David Hume, which I call “Hume’s Dictum”. In this paper, I argue that Hume’s Dictum deserves a privileged place next to Morgan’s Canon in the methodology of comparative psychology, illustrating my point through a discussion of the debate over Theory of Mind in nonhuman animals.
Over the decades, scholarly discourses on sovereignty and globalization have been produced following various theories and numerous debates about the strength and weakness of the sovereign nation-state and globalization. In this paper, the various theories on the discourse of sovereignty and globalization are traced and placed into four categories: the contending paradigm, the globalization paradigm, the transformation paradigm and the complementary paradigm. Both concepts, sovereignty and globalization, are explored by adopting the methodological framework of sources of explanation. The argument is that there is an intricate relationship between these concepts. To determine the relationship between sovereignty and globalization, three world systems were examined, and this revealed that globalization is born of the sovereign nation-state and that globalization can only be asserted in the current sovereign world system and not the ones preceding it. The overall conclusion is that globalization emerged as a result of sovereignty and, since the discourse of sovereignty and globalization is about the same space and its inhabitants, the two are bound to be discursively set against each other if the discourse focuses solely on the phenomena seen as globalization. The forces of globalization and sovereignty need to be researched further to tell where they are leading us.
For over two decades now, Sub-Saharan Africa has been subjected to a coercive and contradictory neo-liberal development economism agenda. According to this paradigm, markets and not states are the fundamental determinants of distributive justice and human flourishing, through the promotion of economic growth that is believed to trickle down to the poor in due time. Despite global intellectual criticism of this neo-liberal development economics orthodoxy of measuring development and wellbeing in terms of market-induced economic growth, autocratic states in Sub-Saharan Africa that have accumulated uni-dimensional growth continue to be applauded as role models on poverty reduction, wellbeing and social justice by donors and global development institutions such as the World Bank and the International Monetary Fund (IMF). This is basically because they have wholly embraced the implementation of the anti-pro-poor neo-liberal structural adjustment tool kit. This study uses a critical hermeneutics methodology to expose the distortions embedded in neo-liberal gross domestic product (GDP) growth cartographies and how these disguise the social injustices against the poor in Sub-Saharan Africa, with particular reference to Uganda. The study contends that in measuring development and wellbeing, human rights and social justice must take precedence over economic efficiency, and GDP growth for that matter.
Over the past decade, attention to epistemically significant disagreement has centered on the question of whose disagreement qualifies as significant, but ignored another fundamental question: what is the epistemic significance of disagreement? While epistemologists have assumed that disagreement is only significant when it indicates a determinate likelihood that one’s own belief is false, and therefore that only disagreements with epistemic peers are significant at all, they have ignored a more subtle and more basic significance that belongs to all disagreements, regardless of who they are with—that the opposing party is wrong. It is important to recognize the basic significance of disagreement since it is what explains all manner of rational responses to disagreement, including assessing possible epistemic peers and arguing against opponents regardless of their epistemic fitness.
Over the centuries, philosophers have theorised about what constitutes ‘the good’ regarding behavioural choice. Characteristically, these attempts have tried to decipher the nature and substantive values that link the apparently trichotomous nature of the human psyche, variously articulated in terms of human reasoning, feeling, and desiring. Of the three, most emphasis has focused on the unique human characteristic of reasoned behavioural choice in terms of its relationship to the emotions. This article determines the principal dynamics behind ‘ethical’ behaviour: In the nervous system, efferent nerves, otherwise known as motor neurones, carry nerve impulses away from the central nervous system to effectors such as muscles. A great deal of neural activity underpins ‘efferent information processing’. What follows is a categorisation of the structure of ‘efferent information processing’ in a manner that enhances our understanding of the attempts of philosophers, from Plato to Russell, to explain ‘what it is to behave well’.
Disputes over land are the major source of conflict between Aboriginal and non-Aboriginal peoples around the globe. According to the Royal Commission on Aboriginal Peoples in Canada, land claims do not simply have to do with economic settlements. They also involve, in a critical sense, respect and recognition for cultural differences regarding culturally distinct self-understandings of land. The Commissioners argue that these disputes will never be wholly resolved unless dialogue and negotiations are "guided by one of the fundamental insights from our hearings: that is, to Aboriginal peoples, land is not just a commodity; it is an inextricable part of Aboriginal identity, deeply rooted in moral and spiritual values" (1996, 430). I would contend that human rights and global justice require that, as the United Nations Charter asserts, formerly colonized peoples have a legitimate claim to pursue their social, economic and cultural interests within the boundaries of a people’s right to self-determination. I examine a spectrum of dominant liberal theories of justice regarding cultural membership and its relationship to politics with respect to Indigenous demands for self-determination. Specifically, my purpose is to explore which position is best able to accommodate the key aspect of this demand: that they have the power to organize themselves according to their traditional views of the land and that, importantly, they have the power to promote such self-understandings in their social, legal, and political institutions. I demonstrate the manner in which many such liberal theories continue to perpetuate (at times, unwittingly) a neo-colonial agenda in which Indigenous claims would be recognized by a liberal state only to the degree that Indigenous tribes assimilate to European cultural self-understandings.
This paper sets out an account of trust in AI as a relationship between clinicians, AI applications, and AI practitioners in which AI is given discretionary authority over medical questions by clinicians. Compared to other accounts in recent literature, this account more adequately explains the normative commitments created by practitioners when inviting clinicians’ trust in AI. To avoid committing to an account of trust in AI applications themselves, I sketch a reductive view on which discretionary authority is exercised by AI practitioners through the vehicle of an AI application. I conclude with four critical questions based on the discretionary account to determine if trust in particular AI applications is sound, and a brief discussion of the possibility that the main roles of the physician could be replaced by AI.
In the debate over what determines the reference of an indexical expression on a given occasion of use, we can distinguish between two generic positions. According to the first, the reference is determined by internal factors, such as the speaker’s intentions. According to the second, the reference is determined by external factors, like conventions or what a competent and attentive audience would take the reference to be. It has recently been argued that the first position is untenable, since there are cases of mismatch where the intuitively correct reference differs from the one that would be determined by the relevant internal factors. The aim of this paper is to show that, contrary to this line of argument, it is the proponent of the second position that should be worried, since this position yields counterintuitive consequences regarding communicative success in cases of mismatch.
Farmers’ organizations all over the world are well aware that, in order to build and retain a critical mass with sufficient bargaining power to democratically influence local governments and international organizations, they will have to unite by identifying common goals and setting aside their differences. After decades of local movements and struggles, farmers’ organizations around the globe found in the concept of “food sovereignty” the normative framework they had long been searching for. The breadth of the concept has proved remarkably successful in embracing the interests of food producers and consumers from all geographic locations and levels of development. (shrink)
Since its formal definition over sixty years ago, category theory has been increasingly recognized as having a foundational role in mathematics. It provides the conceptual lens through which to isolate and characterize the structures of importance and universality in mathematics. The notion of an adjunction (a pair of adjoint functors) has moved to center stage as the principal such lens. The central feature of an adjunction is what might be called “determination through universals,” based on universal mapping properties. A recently developed “heteromorphic” theory of adjoints suggests a conceptual structure, albeit abstract and atemporal, for how new, relatively autonomous behavior can emerge within a system obeying certain laws. The focus here is on applications in the life sciences (e.g., selectionist mechanisms) and the human sciences (e.g., the generative grammar view of language). (shrink)
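For readers unfamiliar with the terminology, the following is a minimal sketch of the standard universal-mapping-property formulation of an adjunction. It is textbook category-theoretic material offered only as illustration; it does not reproduce the paper's own "heteromorphic" apparatus.

% A functor F : C -> D is left adjoint to G : D -> C (written F \dashv G)
% when there is an isomorphism of hom-sets, natural in c and d:
\[
  \mathrm{Hom}_{\mathcal{D}}\bigl(F(c),\, d\bigr) \;\cong\; \mathrm{Hom}_{\mathcal{C}}\bigl(c,\, G(d)\bigr).
\]
% Equivalently, the unit \eta_c : c \to G(F(c)) is a universal arrow: every map
% f : c \to G(d) factors uniquely as f = G(\bar{f}) \circ \eta_c for a unique
% \bar{f} : F(c) \to d.

On this formulation, the universal arrow does the determining: any map into G(d) is fixed by the universal together with a unique mediating map, which is one standard way of glossing the phrase "determination through universals" used above.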
I argue that when determining whether an agent ought to perform an act, we should not hold fixed the fact that she’s going to form certain attitudes (and, here, I’m concerned with only reasons-responsive attitudes such as beliefs, desires, and intentions). For, as I argue, agents have, in the relevant sense, just as much control over which attitudes they form as which acts they perform. This is important because what effect an act will have on the world depends not only on which acts the agent will simultaneously and subsequently perform, but also on which attitudes she will simultaneously and subsequently form. And this all leads me to adopt a new type of practical theory, which I call rational possibilism. On this theory, we first evaluate the entire set of things over which the agent exerts control, where this includes the formation of certain attitudes as well as the performance of certain acts. And, then, we evaluate individual acts as being permissible if and only if, and because, there is such a set that is itself permissible and that includes that act as a proper part. Importantly, this theory has two unusual features. First, it is not exclusively act-orientated, for it requires more from us than just the performance of certain voluntary acts. It requires, in addition, that we involuntarily form certain attitudes. Second, it is attitude-dependent in that it holds that which acts we’re required to perform depends on which attitudes we’re required to form. I then show how these two features can help us both to address certain puzzling cases of rational choice and to understand why most typical practical theories (utilitarianism, virtue ethics, rational egoism, Rossian deontology, etc.) are problematic. (shrink)
The debate over the merits of originalism has advanced considerably in recent years, both in terms of its intellectual sophistication and its practical significance. In the process, some prominent originalists (Lawrence Solum and Jeffrey Goldsworthy being the two discussed here) have been at pains to separate out the linguistic and normative components of the theory. For these authors, while it is true that judges and other legal decision-makers ought to be originalists, it is also true that the communicated content of the constitution is its original meaning. That is to say: the meaning is what it is, not what it should be. Accordingly, there is no sense in which the communicated content of the constitution is determined by reference to moral desiderata; linguistic desiderata do all the work. In this article, I beg to differ. In advancing their arguments for linguistic originalism, both authors rely upon the notion of conditions for successful communication. In doing so they implicitly open the door for moral desiderata to play a role in determining the original communicated content. This undercuts their claim and considerably changes the dialectical role of linguistic originalism in the debate over constitutional interpretation. (shrink)
The Controversy over the Existence of the World (henceforth Controversy) is the magnum opus of the Polish philosopher Roman Ingarden. Despite renewed interest in Ingarden’s pioneering ontological work within analytic philosophy, little attention has been paid to Controversy's main goal, clearly indicated by the very title of the book: finding a solution to the centuries-old philosophical controversy about the ontological status of the external world.

There are at least three reasons for this relative indifference. First, even at the time when the book was published, the Controversy was no longer seen as a serious polemical topic, whether disqualified as an archaic metaphysical pseudo-problem or taken to be the last remnant of an antiscientific approach to philosophy culminating in idealism and relativism. Second, Ingarden’s reasoning on the matter is highly complex, at times misleading, and occasionally even faulty. Finally, his analysis is not only incomplete (Controversy being unfinished) but also arguably aporetic.

One may wonder, then, why it is still worth excavating this mammoth treatise to study an issue apparently no longer relevant to contemporary philosophy. Aside from historical and exegetical purposes, which are of course very interesting in their own right, Ingarden’s treatment of the Controversy remains one of the most detailed and ambitious ontological undertakings of the twentieth century. Not only does it lay out an incredibly detailed map of possible solutions to the Controversy, but it also tries to show why the latter is a genuine and fundamental problem that owes its hasty disqualification to various oversimplifications over the course of the history of philosophy.

In this chapter, I first give an overview of Ingarden’s method, which relies mainly on combinatorial analysis. Then I summarize his examination of possible solutions to the Controversy and determine which ones can be ruled out on ontological grounds. Finally, I explain why this ambitious project ultimately leads to a theoretical impasse, leaving Ingarden unable to come up with a definitive solution to the Controversy, regardless of the fact that the book is unfinished. I argue that his analysis of the problem yields a more modest but nonetheless valuable result. (shrink)