What is nature, or what could it be? These and related questions are fundamental to how we think about and act toward nature. This textbook offers a historical-systematic and at the same time practice-oriented introduction to the philosophy of nature and its most important concepts. It brings the plural character of the perception of nature into philosophical focus and is also well suited for self-study.
Democratic theorists have set aside the question of who legally forms part of the authorized "people", a question that cuts across every theory of democracy and continually animates democratic practice. Determining who constitutes the people is an intractable dilemma, one that may even be impossible to answer democratically; it is not a question the people themselves can decide procedurally, because its very premise subverts the premises of its resolution. This paradox of the popular mandate reveals that the people are best understood as a political claim, as a process of subjectivization, which emerges and develops in different democratic contexts. In the United States, the contested power to speak on behalf of the people derives from a constitutive surplus inherited from the revolutionary era, from the fact that since the Revolution the people have been at once embodied by representation and in excess of any form of representation. The postrevolutionary authority of the vox populi derives from that continually reiterated but never realized reference to the sovereignty of the people beyond representation, legitimacy beyond the law, the spirit beyond the letter, the word through the word. This essay examines the historical emergence of this democratic excess in the revolutionary period, and how it enables a subsequent history of "constituent moments", moments in which the underauthorized (radicals, self-created entities) seize the mantle of authority, changing the rules of authority in the process. These small dramas of claiming political authority to speak in the name of the people are felicitous even when they explicitly break with the established procedures or rules for representing the popular voice. Momentos constituyentes: paradojas y poder popular en los Estados Unidos de América posrevolucionarios [translation], Revista Argentina de Ciencia Política, No. 15, EUDEBA, Buenos Aires, October 2012, pp. 49-74. ISSN: 0329-3092. Introduction to Constituent Moments: Enacting the People in Postrevolutionary America, by Jason Frank [Duke University Press Books, January 2010. ISBN-10: 0822346753; ISBN-13: 978-0822346753].
The aim of this paper is to discuss the “Austro-American” logical empiricism proposed by physicist and philosopher Philipp Frank, particularly his interpretation of Carnap’s Aufbau, which he considered the charter of logical empiricism as a scientific world conception. According to Frank, the Aufbau was to be read as an integration of the ideas of Mach and Poincaré, leading eventually to a pragmatism quite similar to that of the American pragmatist William James. Relying on this peculiar interpretation, Frank intended to bring about a rapprochement between the logical empiricism of the Vienna Circle in exile and American pragmatism. In the course of this project, in the last years of his career, Frank outlined a comprehensive, socially engaged philosophy of science that could serve as a “link between science and philosophy”.
In this paper, I offer an analysis of the radical disagreement over the adequacy of string theory. The prominence of string theory despite its notorious lack of empirical support is sometimes explained as a troubling case of science gone awry, driven largely by sociological mechanisms such as groupthink (e.g. Smolin 2006). Others, such as Dawid (2013), explain the controversy by positing a methodological revolution of sorts, according to which string theorists have quietly turned to nonempirical methods of theory assessment given the technological inability to directly test the theory. The appropriate response, according to Dawid, is to acknowledge this development and widen the canons of acceptable scientific methods. As I’ll argue, however, the current situation in fundamental physics does not require either of these responses. Rather, as I’ll suggest, much of the controversy stems from a failure to properly distinguish the “context of justification” from the “context of pursuit”. Both those who accuse string theorists of betraying the scientific method and those who advocate an enlarged conception of scientific methodology objectionably conflate epistemic justification with judgements of pursuit-worthiness. Once we get clear about this distinction and about the different norms governing the two contexts, the current situation in fundamental physics becomes much less puzzling. After defending this diagnosis of the controversy, I’ll show how the argument patterns that have been posited by Dawid as constituting an emergent methodological revolution in science are better off if reworked as arguments belonging to the context of pursuit.
In this paper, I consider the relationship between Inference to the Best Explanation (IBE) and Bayesianism, both of which are well-known accounts of the nature of scientific inference. In Sect. 2, I give a brief overview of Bayesianism and IBE. In Sect. 3, I argue that IBE in its most prominently defended forms is difficult to reconcile with Bayesianism because not all of the items that feature on popular lists of “explanatory virtues”—by means of which IBE ranks competing explanations—have confirmational import. Rather, some of the items that feature on these lists are “informational virtues”—properties that do not make a hypothesis H1 more probable than some competitor H2 given evidence E, but that, roughly speaking, give that hypothesis greater informative content. In Sect. 4, I consider as a response to my argument a recent version of compatibilism which argues that IBE can provide further normative constraints on the objectively correct probability function. I argue that this response does not succeed, owing to the difficulty of defending with any generality such further normative constraints. Lastly, in Sect. 5, I propose that IBE should be regarded, not as a theory of scientific inference, but rather as a theory of when we ought to “accept” H, where the acceptability of H is fixed by the goals of science and concerns whether H is worthy of commitment as a research program. In this way, IBE and Bayesianism, as I will show, can be made compatible, and thus the Bayesian and the proponent of IBE can be friends.
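A quick illustration of the Bayesian contrast at issue (my own example, not drawn from the paper): a feature of a hypothesis has confirmational import only insofar as it bears on posterior probability, i.e. only if it helps make $P(H_1 \mid E) > P(H_2 \mid E)$. Informational content typically pulls the other way: if $H_2$ says strictly more than $H_1$ and entails it (say, $H_1$ = "the coin is biased toward heads" and $H_2$ = "the coin is biased toward heads with bias 0.7"), then $P(H_2) \leq P(H_1)$ and $P(H_2 \mid E) \leq P(H_1 \mid E)$ for any evidence $E$, however attractive $H_2$ may look as the more informative explanation.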
The Politics of Military Force uses discourse theory to examine the dynamics of discursive change that made participation in military operations possible against the background of German antimilitarist culture. Once considered a strict taboo, so-called out-of-area operations are now widely considered by German policymakers to be without alternative. The book argues that to understand how certain policies are made possible (in this case, military operations abroad and force transformation), one needs to focus on processes of discursive change that result in different policy options appearing rational, appropriate, feasible, or even self-evident. Drawing on Essex School discourse theory, the book develops a theoretical framework to understand how discursive change works, and elaborates on how discursive change makes once unthinkable policy options not only acceptable but even without alternative. Based on a detailed discourse analysis of more than 25 years of German parliamentary debates, The Politics of Military Force provides an explanation for: (1) the emergence of a new hegemonic discourse in German security policy after the end of the Cold War (discursive change), (2) the rearticulation of German antimilitarism in the process (ideational change/norm erosion), and (3) the resulting making-possible of military operations and force transformation (policy change). In doing so, the book also demonstrates the added value of a poststructuralist approach compared to the naive realism and linear conceptions of norm change so prominent in the study of German foreign policy and International Relations more generally.
In this paper, I critically evaluate several related, provocative claims made by proponents of data-intensive science and “Big Data” which bear on scientific methodology, especially the claim that scientists will soon no longer have any use for familiar concepts like causation and explanation. After introducing the issue, in Section 2, I elaborate on the alleged changes to scientific method that feature prominently in discussions of Big Data. In Section 3, I argue that these methodological claims are in tension with a prominent account of scientific method, often called “Inference to the Best Explanation” (IBE). Later on, in Section 3, I consider an argument against IBE that will be congenial to proponents of Big Data, namely, the argument due to Roche and Sober (Analysis, 73:659–668) that “explanatoriness is evidentially irrelevant.” This argument is based on Bayesianism, one of the most prominent general accounts of theory-confirmation. In Section 4, I consider some extant responses to this argument, especially that of Climenhaga (Philosophy of Science, 84:359–368). In Section 5, I argue that Roche and Sober’s argument does not show that explanatory reasoning is dispensable. In Section 6, I argue that there is good reason to think explanatory reasoning will continue to prove indispensable in scientific practice. Drawing on Cicero’s oft-neglected De Divinatione, I formulate what I call the “Ciceronian Causal-nomological Requirement” (CCR), which states, roughly, that causal-nomological knowledge is essential for relying on correlations in predictive inference. I defend a version of the CCR by appealing to the challenge of “spurious correlations,” chance correlations which we should not rely upon for predictive inference. In Section 7, I offer some concluding remarks.
In this article, I consider an important challenge to the popular theory of scientific inference commonly known as ‘inference to the best explanation’ (IBE), one that has received scant attention. The problem is that there exists a wide array of rival models of explanation, thus leaving IBE objectionably indeterminate. First, I briefly introduce IBE. Then, I motivate the problem and offer three potential solutions, the most plausible of which is to adopt a kind of pluralism about the rival models of explanation. However, I argue that how explanations are to be ranked on this pluralistic account of IBE remains obscure, and that pluralism leads to contradictory results. In light of these objections, I attempt to dissolve the problem by showing why IBE does not require a ‘model’ of explanation and by giving an account of what explanation consists in within the context of IBE.
Some properties are causally relevant for a certain effect, others are not. In this paper we describe a problem for our understanding of this notion and then offer a solution in terms of the notion of a program explanation.
We develop a Bayesian framework for thinking about the way evidence about the here and now can bear on hypotheses about the qualitative character of the world as a whole, including hypotheses according to which the total population of the world is infinite. We show how this framework makes sense of the practice cosmologists have recently adopted in their reasoning about such hypotheses.
In this paper, I discuss the ways in which epistemic anxiety promotes well-being, specifically by examining the positive contributions that feelings of epistemic anxiety make toward intellectually virtuous inquiry. While the prospects for connecting the concept of epistemic anxiety to the two most prominent accounts of intellectual virtue, i.e., “virtue-reliabilism” and “virtue-responsibilism”, are promising, I primarily focus on whether the capacity for epistemic anxiety counts as an intellectual virtue in the reliabilist sense. As I argue, there is a close yet unexplored connection between feelings of epistemic anxiety and the form of inference commonly known as “Inference to the Best Explanation” (IBE). Specifically, I argue that both the recognition that some fact requires an explanation—a necessary first step in applying IBE—and the subsequent motivation to employ IBE are typically facilitated by feelings of epistemic anxiety. So, provided IBE is truth-conducive, the capacity for epistemic anxiety should count as an intellectual virtue in the reliabilist sense. After outlining my main argument, I address the challenge that the capacity for epistemic anxiety has the potential to be misleading. To respond to this challenge, I discuss how our recognition that a fact requires an explanation must in part be a species of practical knowledge, rather than theoretical knowledge. For the agent to skillfully distinguish between facts that require an explanation and facts that do not, she must develop the virtuous disposition to feel the appropriate amount of epistemic anxiety. Despite the many negative aspects associated with anxiety, as I conclude, being disposed to feel the appropriate amount of epistemic anxiety is ultimately good for us.
In this paper, I describe some strategies for teaching an introductory philosophy of science course to Science, Technology, Engineering, and Mathematics (STEM) students, with reference to my own experience teaching a philosophy of science course in the Fall of 2020. The most important strategy that I advocate is what I call the “Second Philosophy” approach, according to which instructors ought to emphasize that the problems that concern philosophers of science are not manufactured and imposed by philosophers from the outside, but rather are ones that arise internally, during the practice of science itself. To justify this approach, I appeal to considerations from both educational research and the epistemology of testimony. In addition, I defend some distinctive learning goals that philosophy of science instructors ought to adopt when teaching STEM students, which include rectifying empirically well-documented shortcomings in students’ conceptions of the “scientific method” and the “nature of science.” Although my primary focus will be on teaching philosophy of science to STEM students, much of what I propose can be applied to non-philosophy majors generally. Ultimately, as I argue, a successful philosophy of science course for non-philosophy majors must be one that advances a student’s science education. The strategies that I describe and defend here are aimed at precisely that end.
In this paper, I examine Cicero’s oft-neglected De Divinatione, a dialogue investigating the legitimacy of the practice of divination. First, I offer a novel analysis of the main arguments for divination given by Quintus, highlighting the fact that he employs two logically distinct argument forms. Next, I turn to the first of the main arguments against divination given by Marcus. Here I show, with the help of modern probabilistic tools, that Marcus’ skeptical response is far from the decisive, proto-naturalistic assault on superstition that it is sometimes portrayed to be. Then, I offer an extended analysis of the second of the main arguments against divination given by Marcus. Inspired by Marcus’ second main argument, I formulate, explicate, and defend a substantive principle of scientific methodology that I call the “Ciceronian Causal-Nomological Requirement” (CCR). Roughly, this principle states that causal knowledge is essential for relying on correlations in predictive inference. Although I go on to argue that Marcus’ application of the CCR in his debate with Quintus is dialectically inadequate, I conclude that De Divinatione deserves its place in Cicero’s philosophical corpus, and that ultimately, its significance for the history and philosophy of science ought to be recognized.
Intuitions about intentional action have turned out to be sensitive to normative factors: most people say that an indifferent agent brings about an effect of her action intentionally when it is harmful, but unintentionally when it is beneficial. Joshua Knobe explains this asymmetry, which is known as ‘the Knobe effect’, in terms of the moral valence of the effect, arguing that this explanation generalizes to other asymmetries concerning notions as diverse as deciding and being free. I present an alternative explanation of the Knobe effect in terms of normative reasons. This explanation generalizes to other folk psychological notions such as deciding, but not to such notions as being free. I go on to argue, against Knobe, that offering a unified explanation of all the asymmetries he discusses is in fact undesirable.
These essays draw on work in the history and philosophy of science, the philosophy of mind and language, the development of concepts in children, conceptual…
The reception of Walter Benjamin's work is marked by a paradox: although there is consensus that he developed his concepts in 'close contact' with the material at hand, his writings are often read without any independent study of his sources, detached from the relevant contexts of problems and debates. This reinforces the impression that his texts are esoteric and can invite the assumption that Benjamin took motifs arbitrarily from his study of materials and sources and used them as vehicles of a way of thinking that is in any case difficult to place within a tradition. The contributors to this volume put the emphasis on Benjamin's material, be it literature, theater, metaphysics, legal and moral philosophy, journal projects, social movements, or urban architecture. Drawing on texts from different phases of his work, they examine the relationship between material and concept from both directions. This approach also makes it possible to shed light on theoretical relationships and working contexts that have so far received little attention.
In this article, I will provide a critical overview of the form of non-deductive reasoning commonly known as “Inference to the Best Explanation” (IBE). Roughly speaking, according to IBE, we ought to infer the hypothesis that provides the best explanation of our evidence. In section 2, I survey some contemporary formulations of IBE and highlight some of its putative applications. In section 3, I distinguish IBE from C.S. Peirce’s notion of abduction. After underlining some of the essential elements of IBE, the rest of the entry is organized around an examination of various problems that IBE confronts, along with some extant attempts to address these problems. In section 4, I consider the question of when a fact requires an explanation, since presumably IBE applies only in cases where some explanation is called for. In section 5, I consider the difficult question of how we ought to understand IBE in light of the fact that among philosophers, there is significant disagreement about what constitutes an explanation. In section 6, I consider different strategies for justifying the truth-conduciveness of the explanatory virtues, e.g., simplicity, unification, scope, etc., criteria which play an indispensable role in any given application of IBE. In section 7, I survey some of the most recent literature on IBE, much of which consists of investigations of the status of IBE from the standpoint of the Bayesian philosophy of science.
Juhani Yli-Vakkuri and John Hawthorne have recently presented a thought experiment—Mirror Man—designed to refute internalist theories of belief and content. We distinguish five ways in which the case can be interpreted and argue that on none does it refute internalism.
In a previous issue of this journal Michael Veber argued that God could not answer certain prayers because doing so would be immoral. In this article I attempt to demonstrate that Veber’s argument is simply the logical problem of evil applied to a possible world. Because of this, his argument is susceptible to a Plantinga-style defense.
Jason Frank's book can be situated in this second wave. Similar to other agonistic theorists, he focuses on the affective, aesthetic, and strategic dimensions of politics, while assuming that conflict and struggle are inevitable features of political experience.
Artificial intelligence (AI) and machine learning (ML) systems can support or replace many parts of the medical decision-making process. They could also help physicians deal with clinical and moral dilemmas. AI/ML decisions may thus come to take the place of professional judgments. We argue that this has important consequences for the relationship between patients and the medical profession as an institution, and that it will inevitably lead to an erosion of institutional trust in medicine.
Environmental health research produces scientific knowledge about environmental hazards crucial for public health and environmental justice movements that seek to prevent or reduce exposure to these hazards. The environment in environmental health research is conceptualized as the range of possible social, biological, chemical, and/or physical hazards or risks to human health, some of which merit study due to factors such as their probability and severity, the feasibility of their remediation, and injustice in their distribution. This paper explores the ethics of identifying the relevant environment for environmental health research, as judgments involved in defining an environmental hazard or risk, judgments of that hazard or risk's probability, severity, and/or injustice, as well as the feasibility of its remediation, all ought to appeal to non-epistemic as well as epistemic values. I illustrate by discussing the case of environmental lead, a housing-related hazard that remains unjustly distributed by race and class and is particularly dangerous to children. Examining a controversy in environmental health research ethics where researchers tested multiple levels of lead abatement in lead-contaminated households, I argue that the broader perspective on the ethics of environmental health research provided in the first part of this paper may have helped prevent this controversy.
This paper is an attempt to historicize Frank Plumpton Ramsey’s Apostle talks, delivered from 1923 to 1925, within the social and political context of the time. In his talks, Ramsey discusses socialism, psychoanalysis, and feminism. Ramsey’s views on these three intellectual movements were interconnected, and they all contributed to his take on the then-current policy debates on the role of women in the economy. Drawing on archival materials, biographical facts, and the historiographical literature on the early inter-war politics of motherhood, I show that Ramsey held a positive view of the feminist campaign for family endowment. He called for government financial support for motherhood in recognition of the economic significance of women’s domestic work and as something that could bring women economic independence. In addition, he found such an economic scheme compatible with the kind of maternalism endorsed by Freudian psychoanalysis, his favorite theory of psychology.
Many morally significant outcomes can be brought about only if several individuals contribute to them. However, individual contributions to collective outcomes often fail to have morally significant effects on their own. Some have concluded from this that it is permissible to do nothing. What I call ‘the problem of insignificant hands’ is the challenge of determining whether and when people are obligated to contribute. For this to be the case, I argue, the prospect of helping to bring about the outcome has to be good enough. Furthermore, the individual must be in a position to increase the probability of its being brought about to an appropriate extent. Finally, I argue that when too few are willing to contribute, people may have a duty to increase their number. Thus, someone can be obligated to contribute or to get others to contribute. This prospect account is consistent with Kantianism, contractualism and rule consequentialism but inconsistent with act consequentialism.
Journal: Nietzsche-Studien, volume 44, issue 1, pages 267-290. In this paper, I examine the possibility of constructing an ontological phenomenology of love by tracing Nietzsche’s questioning about science. I examine how the evolution of Nietzsche’s thinking about science and his increasing suspicion towards it coincide with his interest in the question of love. Although the texts from the early and middle period praise science as an antidote to asceticism, the later texts associate the scientific spirit with asceticism. I argue that this shift is motivated by Nietzsche’s realization that asceticism and science share the same fetish of facts. For Nietzsche it is no longer a matter of proving the so-called facts of the backworlds to be wrong (something science is very capable of doing), but a matter of rejecting the very structure of thought that reduces a shapeless reality into a series of facts, subjects and objects. It is this second attitude that Nietzsche regards as the common core of science and asceticism. From this critique of science and its correlative critique of facts, Nietzsche begins searching for a counter-attitude able to perform the reduction of the factual attitude. This is the attitude he calls love. Although Nietzsche’s concept of love has often been elucidated in terms of its object or its subject, I argue that such interpretations precisely defeat Nietzsche’s point, which is to recover a ground that precedes the division of the world into subjects and objects. Love becomes the name of this intra-relationship of being, opening up new perspectives on Nietzsche’s ontology of the will to power.
This paper consists of two parts. In the first part, I give an in-depth comparison and analysis of the theories of Frank Ankersmit and Eelco Runia, in which I highlight their most important resemblances and differences. What both have in common is their notion of the presence of the past as a ‘presence in absence’. They differ, however, with respect to the character of this past and the role representation plays in making it present. In the second part, I argue that for both Ankersmit and Runia, the presence of the past is always the present of our past, which excludes the experience of the otherness of the past, and which opens both theories to the criticisms of being self-centered and nationalistic.
Institutions can be strong or weak. But what does this mean? Equilibrium theories equate institutions with behavioural regularities. In contrast, rule theories explicate them in terms of a standard that people are supposed to meet. I propose that, when an institution is weak, a discrepancy exists between the regularity and the standard or rule. To capture this discrepancy, I present a hybrid theory, the Rules-and-Equilibria Theory. According to this theory, institutions are rule-governed behavioural regularities. The Rules-and-Equilibria Theory provides the basis for two measures of institutional strength. First, institutions that pertain to coordination games solve problems of information. Their strength is primarily a matter of the expected degree of compliance. Second, institutions that concern mixed-motive games solve problems of motivation. Their strength can be measured in terms of the weight people attribute to their rules.
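For readers unfamiliar with the game-theoretic distinction the two measures rest on, a standard textbook illustration (not an example from the paper): in a pure coordination game, players simply need to know what the others will do, whereas in a mixed-motive game each player is also tempted to deviate, so the problem is motivational rather than informational.

Coordination game (choosing a side of the road):
              Left        Right
    Left      (1, 1)      (0, 0)
    Right     (0, 0)      (1, 1)

Mixed-motive game (prisoner's dilemma):
              Cooperate   Defect
    Cooperate (2, 2)      (0, 3)
    Defect    (3, 0)      (1, 1)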
In the past twenty years, scholarly interest in John Dewey's later writings has surged. While later works such as Art as Experience (1934), Logic: The Theory of Inquiry (1938), and Freedom and Culture (1939) have received considerable attention, Knowing and the Known (1949), Dewey's late-in-life collaboration with Arthur F. Bentley, has been largely neglected. A common bias among Dewey scholars is that this work, instead of developing Dewey's Logic, departs from its spirit, reflects the overbearing influence of Bentley on Dewey (who was at the time an octogenarian), and, therefore, merits little serious scholarly consideration. However, Dewey and Bentley engaged in an extended correspondence, collected in John Dewey and Arthur Bentley: A Philosophical Correspondence, 1932-1951 (1964), the result of which was no less than a watershed moment in Dewey's thinking on the experimental method of inquiry. The Logic was improved in ways that incorporated the insights of Charles Sanders Peirce's logic and developed Dewey's earlier work in a direction expressly intended by the aging pragmatist. Indeed, Dewey writes in correspondence with his co-author: "You [Bentley] shouldn't lean too heavily on the [1938] Logic; it wasn't a bad job at the time, but I could do better now [with Knowing and the Known]; largely through association with you and getting the courage to see my thing [logical theory] through without compromise" (Correspondence, 4:595, see also 184, 420, 481, 483-84). One of the few scholars of American pragmatism to acknowledge that Knowing and the Known was a watershed development in Dewey's thinking is Frank X. Ryan, author of an exciting new book, Seeing Together: Mind, Matter, and the Experimental Outlook of John Dewey and Arthur F. Bentley, that clearly and concisely presents the revolutionary method developed in Knowing and the Known: the transactional approach.
For each positive n, two alternative axiomatizations of the theory of strings over n alphabetic characters are presented. One class of axiomatizations derives from Tarski's system of the Wahrheitsbegriff and uses the n characters and concatenation as primitives. The other class involves using n character-prefixing operators as primitives and derives from Hermes' Semiotik. All underlying logics are second order. It is shown that, for each n, the two theories are definitionally equivalent [or synonymous in the sense of deBouvere]. It is further shown that each member of one class is synonymous with each member of the other class; thus that all of the theories are definitionally equivalent with each other and with Peano arithmetic. Categoricity of Peano arithmetic then implies categoricity of each of the above theories.
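To convey what the claimed equivalences amount to, an illustrative sketch of the simplest case, $n = 1$ (my own gloss, not taken from the paper): strings over a single character $a$ can be identified with the natural numbers by sending the string $a^k$ (that is, $k$ occurrences of $a$) to the number $k$. Under this identification the character-prefixing operator acts like the successor function, since prefixing $a$ to $a^k$ yields $a^{k+1}$, while concatenation acts like addition, since $a^{j}$ followed by $a^{k}$ is $a^{j+k}$; induction over strings then corresponds to ordinary arithmetical induction. For larger $n$, the standard route runs through $n$-adic notation over the $n$ characters.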
In recent years, language has been shown to play a number of important cognitive roles over and above the communication of thoughts. One hypothesis gaining support is that language facilitates thought about abstract categories, such as democracy or prediction. To test this proposal, a novel set of semantic memory task trials, designed for assessing abstract thought non-linguistically, was normed for levels of abstractness. The trials were rated as more or less abstract to the degree that answering them required the participant to abstract away from both perceptual features and common setting associations corresponding to the target image. The normed materials were then used with a population of people with aphasia to assess the relationship of abstract thought to language. While the language-impaired group with aphasia showed lower overall accuracy and longer response times than controls in general, of special note is that their response times were significantly longer as a function of a trial’s degree of abstractness. Further, the aphasia group’s response times in reporting their degree of confidence (a separate, metacognitive measure) were negatively correlated with their language production abilities, with lower language scores predicting longer metacognitive response times. These results provide some support for the hypothesis that language is an important aid to abstract thought and to metacognition about abstract thought.
Biomarker-based predictive tests for subjectively asymptomatic Alzheimer’s disease (AD) are utilized in research today. Novel applications of artificial intelligence (AI) promise to predict the onset of AD several years in advance without determining biomarker thresholds. Until now, little attention has been paid to the new ethical challenges that AI brings to early diagnosis in asymptomatic individuals, beyond contributing to research purposes, when we still lack adequate treatment. The aim of this paper is to explore the ethical arguments put forward for AI-aided AD prediction in subjectively asymptomatic individuals and their ethical implications. The ethical assessment is based on a systematic literature search. Thematic analysis was conducted inductively on the 18 included publications. The ethical framework includes the principles of autonomy, beneficence, non-maleficence, and justice. Reasons for offering predictive tests to asymptomatic individuals are the right to know, a positive balance in the risk-benefit assessment, and the opportunity for future planning. Reasons against are the lack of disease-modifying treatment, the accuracy and explicability of AI-aided prediction, the right not to know, and threats to social rights. We conclude that there are serious ethical concerns in offering early diagnosis to asymptomatic individuals, and that the issues raised by the application of AI add to the already known issues. Nevertheless, pre-symptomatic testing should only be offered on request, to avoid inflicting harm. We recommend developing training for physicians in communicating AI-aided prediction.
The topic of fake news has received increased attention from philosophers since the term became a favorite of politicians (Habgood-Coote 2016; Dentith 2016). Notably missing from the conversation, however, is a discussion of fake news and conspiracy theory media as a market. This paper will take as its starting point the account of noxious markets put forward by Debra Satz (2010), and will argue that there is a pro tanto moral reason to restrict the market for fake news. Specifically, we begin with Satz’s argument that restricting a market may be required when i) that market inhibits citizens from being able to stand in an equal relationship with one another, and ii) this problem cannot be solved without such direct restrictions. Our own argument then proceeds in three parts: first, we argue that the market for fake news fits Satz’s description of a noxious market; second, we argue against explanations of the proliferation of fake news that are couched in terms of “epistemic vice”, and likewise argue against prescribing critical thinking education as a solution to the problem; finally, we conclude that, in the absence of other solutions to mitigate the noxious effects of the fake news market, we have a pro tanto moral reason to impose restrictions on this market. At the end of the paper, we consider one proposal to regulate the fake news market, which involves making social media outlets potentially liable in civil court for damages caused by the fake news hosted on their websites.
This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency (RSA). Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and may indeed serve to correct, the informal understanding and applications of the RSA criteria concerning their conceptual dependence, their function as update-thresholds, and their status as obligatory rather than permissive norms, but also show how these formal and informal normative approaches can in fact align.
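A schematic illustration of how such quantities can interact (the numbers are invented, and reading sensitivity and selectivity as likelihoods is my gloss, not the authors' formalism): with a prior commitment to the claim of $P(C) = 0.5$, a reason with $P(R \mid C) = 0.8$ and $P(R \mid \neg C) = 0.2$ yields $P(C \mid R) = \frac{0.8 \times 0.5}{0.8 \times 0.5 + 0.2 \times 0.5} = 0.8$, enough to clear an acceptability threshold for the claim of, say, $0.75$; a less selective reason with $P(R \mid \neg C) = 0.6$ yields only $P(C \mid R) \approx 0.57$, and the same threshold is not met.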
This paper discusses the finite version of the two envelope paradox. (That is, we treat the paradox against the background assumption that there is only a finite amount of money in the world.)
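For readers who want the paradox itself, the standard reasoning runs as follows (a textbook rendering, not specific to this paper): one envelope contains twice as much money as the other; you hold an envelope containing $x$, and treating the other envelope as equally likely to contain $2x$ or $x/2$ gives it an expected value of $\tfrac{1}{2}(2x) + \tfrac{1}{2}\left(\tfrac{x}{2}\right) = 1.25x$, which seems to recommend switching; but the same reasoning recommends switching back once you have switched. One way the finite background assumption bites (my gloss): if $M$ is the largest amount of money there is, then an envelope containing more than $M/2$ cannot be the smaller of the two, so the "equally likely" step cannot hold for every value of $x$.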
We introduce two notions–the shadows and the shallows of explanation–in opening up explanation to broader, interdisciplinary investigation. The shadows of explanation refer to past philosophical efforts to provide either a conceptual analysis of explanation or in some other way to pinpoint the essence of explanation. The shallows of explanation refer to the phenomenon of having surprisingly limited everyday, individual cognitive abilities when it comes to explanation. Explanations are ubiquitous, but they typically are not accompanied by the depth that we might, prima facie, expect. We explain the existence of the shadows and shallows of explanation in terms of there being a theoretical abyss between explanation and richer, theoretical structures that are often attributed to people. We offer an account of the shallows, in particular, both in terms of shorn-down, internal, mental machinery, and in terms of an enriched, public symbolic environment, relative to the currently dominant ways of thinking about cognition and the world.
In this chapter, we consider ethical and philosophical aspects of trust in the practice of medicine. We focus on trust within the patient-physician relationship, trust and professionalism, and trust in Western (allopathic) institutions of medicine and medical research. Philosophical approaches to trust contain important insights into medicine as an ethical and social practice. In what follows we explain several philosophical approaches and discuss their strengths and weaknesses in this context. We also highlight some relevant empirical work in the section on trust in the institutions of medicine. It is hoped that the approaches discussed here can be extended to nursing and other topics in the philosophy of medicine.
We model scientific theories as Bayesian networks. Nodes carry credences and function as abstract representations of propositions within the structure. Directed links carry conditional probabilities and represent connections between those propositions. Updating is Bayesian across the network as a whole. Evidence at one point within a scientific theory can have a very different impact on the network than does evidence of the same strength at a different point. A Bayesian model allows us to envisage and analyze the differential impact of evidence and credence change at different points within a single network and across different theoretical structures.
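A minimal sketch of the kind of model described, assuming nothing about the authors' actual implementation: one hypothesis node with two downstream proposition nodes, where the links carry conditional probabilities and updating on evidence at either node propagates back by Bayes' theorem. Even with the same prior credence in the hypothesis, the impact of an observation differs with the strength of the link it arrives through.

```python
# A toy Bayesian-network-style update (illustrative only): hypothesis node T
# with two child nodes A and B. Each directed link T -> A and T -> B carries
# the conditional probabilities P(child | T) and P(child | not-T).

def posterior(prior, p_e_given_t, p_e_given_not_t):
    """Credence in T after observing the evidence node, by Bayes' theorem."""
    numerator = p_e_given_t * prior
    return numerator / (numerator + p_e_given_not_t * (1.0 - prior))

p_t = 0.5                                  # prior credence in the hypothesis node T
link_a = (0.9, 0.4)                        # P(A | T), P(A | not-T): a tight connection
link_b = (0.6, 0.5)                        # P(B | T), P(B | not-T): a loose connection

print(round(posterior(p_t, *link_a), 3))   # 0.692 -- observing A moves T substantially
print(round(posterior(p_t, *link_b), 3))   # 0.545 -- observing B barely moves T
```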
This article presents a critique of the current naturalist readings of Nietzsche by drawing a distinction between a sense of naturalism based on nature taken as "what there is" and one based on the scientific concept of nature. The paper suggests that Nietzsche is a naturalist in the first sense, but not in the latter, and that due to the confusion between the two senses, many arguments in favor of the first have been unwarrantedly transferred to the latter. The article begins with a close critical reading of Brian Leiter's "Nietzsche's Naturalism Revisited" before confronting the naturalist readings with one case from Nietzsche's writings, found in Daybreak's Book V.