This paper considers the relevance of the Duhem-Quine thesis in economics. In the introductory discussion which follows, the meaning of the thesis and a brief history of its development are detailed. The purpose of the paper is to discuss the effects of the thesis in four specific and diverse theories in economics, and to illustrate the dependence of testing the theories on a set of auxiliary hypotheses. A general taxonomy of auxiliary hypotheses is provided to demonstrate the confounding of auxiliary hypotheses with the testing of economic theory.
Contemporary developments in economic methodology have produced a vibrant agenda of competing positions. These include, among others, constructivism, critical realism and rhetoric, with each contributing to the Realism vs. Pragmatism debate in the philosophies of the social sciences. A major development in the neo-pragmatist contribution to economic methodology has been Quine's pragmatic assault on the dogmas of empiricism, which are now clearly acknowledged within contemporary economic methodology. This assault is encapsulated in the celebrated Duhem-Quine thesis, which, according to a number of contemporary leading philosophers of economics, poses a particularly serious methodological problem for economics. This problem, as reflected in Hausman's analysis, consists of the inability of economics to learn from experience, thereby subverting the capacity to test economic theories. In this paper we dispute this position. Our argument is based on a combination of Quine's holism with Van Fraassen's constructive empiricism, especially the latter's analysis of empirical adequacy and his pragmatic approach to explanation. The resulting reorientation of economic methodology restores the capacity of economics to learn from experience and reinstates the imperative of developing alternatives to orthodox theorizing in economics.
The Duhem-Quine thesis is the claim that it is impossible to test a scientific hypothesis in isolation, because any empirical test requires assuming the truth of one or more auxiliary hypotheses. This is taken by many philosophers, and is assumed here, to support the further thesis that theory choice is underdetermined by empirical evidence. This inquiry is focused strictly on the axiological commitments engendered in solutions to underdetermination, specifically those of Pierre Duhem and W. V. Quine. Duhem resolves underdetermination by appealing to a cluster of virtues called 'good sense', and it has recently been argued by Stump (Stud Hist Philos Biol Biomed Sci, 18(1):149-159, 2007) that good sense is a form of virtue epistemology. This paper considers whether Quine, whose philosophy is heavily influenced by the very thesis that led Duhem to the virtues, is also led to a virtue epistemology in the face of underdetermination. Various sources of Quinean epistemic normativity are considered, and it is argued that, in conjunction with other normative commitments, Quine's sectarian solution to underdetermination amounts to a skills-based virtue epistemology. The paper also sketches formal features of the novel form of virtue epistemology common to Duhem and Quine, one that challenges the adequacy of epistemic value truth-monism and blocks any imperialist naturalization of virtue epistemology, since the epistemic virtues are essential to the success of the sciences themselves.
Quine claims that holism (i.e., the Quine-Duhem thesis) prevents us from defining synonymy and analyticity (section 2). In Word and Object, he dismisses a notion of synonymy which works well even if holism is true. The notion goes back to a proposal from Grice and Strawson and runs thus: R and S are synonymous iff for all sentences T we have that the logical conjunction of R and T is stimulus-synonymous to that of S and T. Whereas Grice and Strawson did not attempt to defend this definition, I try to show that it indeed gives us a satisfactory account of synonymy. Contrary to Quine, the notion is tighter than stimulus-synonymy – particularly when applied to sentences with less than critical semantic mass (section 3). Now according to Quine, analyticity could be defined in terms of synonymy, if synonymy were to make sense: a sentence is analytic iff synonymous to self-conditionals. This leads us to the following notion of analyticity: S is analytic iff, for all sentences T, the logical conjunction of S and T is stimulus-synonymous to T; an analytic sentence does not change the semantic mass of any theory to which it may be conjoined (section 4). This notion is tighter than Quine's stimulus-analyticity; unlike stimulus-analyticity, it does not apply to those sentences from the very center of our theories which can be assented to come what may, even though they are not analytic in the intuitive sense (section 5). Conclusion: We can have well-defined notions of synonymy and analyticity even if we embrace Quine's holism, naturalism, behaviorism, and radical translation. Quine's meaning skepticism is to be repudiated on Quinean grounds.
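The two definitions reported in this abstract can be stated compactly. The symbols below are my shorthand, not the author's notation: ≈ for the defined synonymy relation and ≈_stim for Quine's stimulus-synonymy.

```latex
% Grice-Strawson synonymy, as reconstructed in the abstract:
\[
  R \approx S \;\iff\; \forall T:\; (R \wedge T) \approx_{\mathrm{stim}} (S \wedge T)
\]
% Derived notion of analyticity: conjoining S to any theory
% leaves that theory's semantic mass unchanged.
\[
  S \text{ is analytic} \;\iff\; \forall T:\; (S \wedge T) \approx_{\mathrm{stim}} T
\]
```

The second definition follows from the first together with the proposal that a sentence is analytic iff synonymous to self-conditionals, since conjoining a self-conditional to T is stimulus-equivalent to T itself.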
Quine is routinely perceived as having changed his mind about the scope of the Duhem-Quine thesis, shifting from what has been called an 'extreme holism' to a more moderate view. Where the Quine of 'Two Dogmas of Empiricism' argues that "the unit of empirical significance is the whole of science" (1951, 42), the later Quine seems to back away from this "needlessly strong statement of holism" (1991, 393). In this paper, I show that the received view is incorrect. I distinguish three ways in which Quine's early holism can be said to be wide-scoped and show that he has never changed his mind about any one of these aspects of his early view. Instead, I argue that Quine's apparent change of mind can be explained away as a mere shift of emphasis.
Pierre Duhem is the discoverer of the physics of the Middle Ages. The discovery that there existed a physics of the Middle Ages was a surprise primarily for Duhem himself. This discovery completely changed the way he saw the evolution of physics, bringing him to formulate a complex argument for the growth and continuity of scientific knowledge, which I call the 'Pierre Duhem Thesis' (not to be confused either with what Roger Ariew called the 'true Duhem thesis' as opposed to the Quine-Duhem thesis, which he persuasively argued is not Duhem's, or with the famous 'Quine-Duhem Thesis' itself). The 'Pierre Duhem Thesis' consists of five sub-theses (some transcendental in nature, others causal, factual, or descriptive), which are not independent, as they do not work separately (but only as a system) and do not relate to reality separately (but only simultaneously). The famous and disputed 'continuity thesis' is part, as a sub-thesis, of this larger argument. I argue that the 'Pierre Duhem Thesis' wraps up all of Duhem's discoveries in the history of science and as a whole represents his main contribution to the historiography of science. The 'Pierre Duhem Thesis' is the central argument of Pierre Duhem's work as a historian of science.
In this paper I explore Karl Popper's 'critical rationalism', focusing on its presuppositions and implications as a form of realism regarding the nature of scientific truth. I reveal an underlying tension in Popper's thought pertaining to his account of basic statements and the related question of whether the falsification of a universal theory can ever justifiably be regarded as final or conclusive. I conclude that Popper's account of basic statements is implicitly conventionalist, and that it should, in consistency, have forced him in the direction of Quinean holism.
This article examines whether Willard Van Orman Quine's indeterminacy thesis can be sustained. The argument from above, Quine argues, can derive indeterminacy as its conclusion. I will argue that the indeterminacy claim cannot be sustained. I further argue that Quine changed the formulation of the underdetermination of theory by evidence (UTE) argument from what Duhem said to the Quine/Peirce meaning-verification view, in order to use the new formulation of UTE to imply indeterminacy. Given all that, we see that when we apply the old UTE argument we arrive only at underdetermination of theory by evidence, and that applies to all sciences, philosophy, and knowledge, including philosophy of language.
Despite offering many formulations of his controversial indeterminacy of translation thesis, Quine has never explored in detail the connection between indeterminacy and the conception of meaning that he supposedly derived from the work of Peirce and Duhem. The outline of such a conception of meaning, as well as its relationship to the indeterminacy thesis, is worked out in this paper; and its merits and implications are assessed both in the context of Quine's own philosophical agenda, and also with a view to a very different approach to meaning and understanding exemplified by the work of Gadamer.
The Duhem-Quine thesis famously holds that a single hypothesis cannot be confirmed or disconfirmed in isolation, but instead only in conjunction with other background hypotheses. This article argues that this has important and underappreciated implications for metaethics. Section 1 argues that if one begins metaethics firmly wedded to a naturalistic worldview—due (e.g.) to methodological/epistemic considerations—then normativity will appear to be reducible to a set of social-psycho-semantic behaviors that I call the 'normative stance.' Contra Hume and Bedke (2012), I argue that the normative stance provides semantically-grounded entailments from natural truths to normative truths, reducing the latter to the former. Specifically, the normative stance explains the truth-conditions, truth-values, and truth-makers of normative propositions in terms of socially grounded cognitive-behavioral rules and other natural facts, thus explaining how there can be bona fide normative facts and properties in a wholly naturalistic world. I then show that the normative stance explains the apparent stance-independence and non-naturalness of normative reasons, intrinsic value, and the categoricity of moral reasons as 'user-illusions' generated by people having strong psycho-social propensities—rooted in evolution and social cooperation—to take these normative stances. Section 2 then argues that while the normative stance may appear to naturalists to successfully explain normativity, it will not appeal to those who come to metaethics with different background commitments. I conclude that naturalists should take the normative stance to be a promising metaethical theory of normativity, and that whether it is a true theory of normativity is something that can only be ascertained by determining which background hypotheses—naturalistic or otherwise—we should have when doing metaethics.
This paper aims to contribute to the current debate about the status of the "Ought Implies Can" principle (OIC) and the growing body of empirical evidence that undermines it. We report the results of an experimental study which show that people judge that agents ought to perform an action even when they also judge that those agents cannot do it, and that such "ought" judgments exhibit an actor-observer effect. Because of this actor-observer effect on "ought" judgments and the Duhem-Quine thesis, talk of an "empirical refutation" of OIC is empirically and methodologically unwarranted. What the empirical fact that people attribute moral obligations to unable agents shows is that OIC is not intuitive, not that OIC has been refuted.
This article aims to discuss Bunge's critique of Popper's falsifiabilism as a criterion for distinguishing science from pseudoscience. First, it provides a historical background to the demarcation of science; second, it explores the Duhem-Quine thesis as an early objection to falsifiability; third, it discusses Bunge's criticism of falsifiability as all-around false: logically, methodologically, psychologically, and historically; and finally, it presents the criteria of scientificity from Bunge's point of view. At a time when scientists are racing to find a cure for the coronavirus (COVID-19), media outlets offer us certain recipes as cures, such as shalolo (dried molokhia with garlic, lemon, and chili), fava beans, and the like. The task of philosophers of science is to reconsider, from time to time, what distinguishes science from non-science and pseudoscience; perhaps the least of the benefits of scientific philosophy is protecting us from this kind of intellectual deception. This article proceeds in three steps: the first provides a historical background to the problem of the demarcation of science; the second presents Popper's solution to this problem and the holist objection to it; and the third addresses Bunge's critique of Popper's solution and Bunge's own conception of the criteria of science.
While a large social-choice-theoretic literature discusses the aggregation of individual judgments into collective ones, there is much less formal work on the transformation of judgments in group communication. I develop a model of judgment transformation and prove a baseline impossibility theorem: any judgment transformation function satisfying some initially plausible conditions is the identity function, under which no opinion change occurs. I identify escape routes from this impossibility and argue that the kind of group communication envisaged by deliberative democrats must be "holistic": it must focus on webs of connected propositions, not on one proposition at a time, which echoes the Duhem-Quine "holism thesis" on scientific theory testing. My approach provides a map of the logical space in which different possible group communication processes are located.
Scott Soames argues that, interpreted in the light of Quine's holistic verificationism, Quine's thesis of underdetermination leads to a contradiction. It is contended here that if we pay proper attention to the evolution of Quine's thinking on the subject, particularly his criterion of theory individuation, Quine's thesis of underdetermination escapes Soames' charge of paradoxicality.
This is the Introduction to my translation of Quine's Kant Lectures. Part of my interpretation is that an "esoteric doctrine" is involved in Quine's distinctive semantic claims: his skepticism about the credulity of non-expert evaluation of discourse and theory.
The analytic/synthetic distinction can be conceived from two points of view: from within or from without; from the perspective of one's own language or from the perspective of the language of others. From without, the central question is which sentences of a foreign language are to be classified as analytic. From within, by contrast, the question concerning the synthetic and the analytic acquires a normative dimension: which sentences am I not permitted to reject if I want to avoid talking nonsense? The two perspectives on analytic sentences do not mutually exclude but rather supplement and illuminate each other. In "Two Dogmas", Quine's criticism of the analytic/synthetic distinction comes from within, whereas in Word and Object, Quine repeats his earlier criticism from without. His criticism is directed against Carnap's views on our understanding of theoretical terms. I challenge Quine's criticism in both of its versions and provide two definitions of analyticity that are immune to Quine's arguments. Using the first of these definitions (the one from without) I try to show how it is possible to distinguish (genuine) belief revision from linguistic change, even in the case of a scientific revolution.
Quine on Ethics: The Gavagai of Moral Discourse is the first comprehensive treatment of Quine's brief yet memorable foray into ethics. It defends him against his most formidable critics, corrects misconceptions in the reception of his outlook on morality as a social institution and ethics as a philosophical enterprise, and restores emphasis on observationality as the impetus behind his momentous intervention in ethical theory. The central focus is on Quine's infamous challenge to ethical theory: his thesis of the methodological infirmity of ethics as compared with science. The ultimate aim is to demonstrate that the challenge is not only valid but also valuable insofar as it identifies opportunities for reformation in moral justification.
A theoretical physicist, philosopher of physics, and historian of physical theories, the French Catholic scholar Pierre Duhem (1861-1916) profoundly marked twentieth-century thought. Everyone knows the Système du monde, whose ten volumes contributed to the rediscovery of medieval science, and La théorie physique, which notably gave rise to the celebrated 'Duhem-Quine thesis'. While Clio has thus remembered Duhem as a great historian of science and a perceptive philosopher of physics, he himself aspired only to be recognized as a physicist. His work is indeed traversed by a scientific project that consists in ordering and uniting the various branches of physics under the aegis of thermodynamics, within the framework of a theory that represents rather than explains the real. It is this project that Duhem sought to realize in his scientific publications, to expound in his philosophical writings, and finally to vindicate through his historical research. However, Duhem's ever-greater investment in the history of science, and the presence in his work of apologetic considerations and patriotic writings, may suggest that he progressively turned away from this primary project in favor of other concerns. Likewise, the tensions that persist within this scientific project, between his unifying ambition and his phenomenalist commitment, may lead one to relativize the latter, conceived as a contextual, transitory, and ultimately insignificant demand. Without ignoring these historical, religious, or patriotic concerns, and without neglecting this conflict of interest between the two constitutive parts of the Duhemian project, this study seeks first of all to reaffirm that this scientific project was never abandoned or curtailed. Yet once the permanence, priority, and integrity of this project are maintained, three paradoxes immediately arise.
If Duhem wanted above all to be a physicist and wished to be recognized as such, by what extravagance of history is he ultimately known for his historical research and philosophical work rather than for what he held most dear? If he wanted only to be an illustrious physicist, why did he labor, on returning from the laboratory, to rescue from oblivion the manuscripts and scientific theories of medieval authors? Finally, if he truly wanted to establish a physics that was unified, coherent, and perfect, why did he deprive himself of realism and encumber himself with phenomenalism? Based on Duhem's unpublished correspondence, this study, focused more particularly on the third paradox, ultimately helps to elucidate each of them.
The paper concentrates on an appreciation of W.V. Quine's thought on meaning and how it escalates beyond meaning holism and confirmation holism, thereby paving the way for a 'meaning nihilism' and 'confirmation rejectionism'. My effort is to show how the acceptance of radical naturalism in Quine's theory of meaning escorts him to the indeterminacy thesis of meaning. There is an interesting shift from epistemology to language, as Quine considers that a person who is aware of the linguistic trick can be the master of referential language. Another important question is how Quine's radical translation thesis reduces to semantic indeterminacy, which is a consequence of his confirmation method. I also think that the notion and the analysis of meaning became hopelessly vague in Quine's later work. I further argue for a reading of Quine's position on meaning that I call, following Hilary Putnam, 'meaning nihilism'. It seems to me that Quine had no belief of the form 'meaning consists in', or 'meaning depends on', something. Through this argument, I would like to challenge the confirmation holism that was foisted by Fodor on Quine's thesis. My attempt is to scrutinize Putnam's view that Quine was neither a confirmation holist nor a meaning holist. I think that both Putnam and Quine dismissed the concept of a constitutive connection of meaning as a second-grade notion, not only from the realm of semantics but also from the perspective of epistemology. So linguistic meaning cannot be formed by any sample of its uses. For Quine, the concept of meaning in metaphysics is heuristic and need not be taken seriously in any 'science-worthy' literature.
The paper begins by acknowledging that weakened systematic precision in phenomenology has made its application in philosophy of science obscure and ineffective. The defining aspirations of early transcendental phenomenology are, however, believed to be important ones. A path is therefore explored that attempts to show how certain recent developments in the logic of self-reference fulfill, in a clearer and more rigorous fashion in the context of philosophy of science, certain of the early hopes of phenomenologists. The resulting dual approach is applied to several problems in the philosophy of science: on the one hand, to proposed rejections of scientific objectivity, to the doctrine of radical meaning variance, and to the Quine-Duhem thesis; and on the other, to an analysis of hidden-variable theory in quantum mechanics.
The paper seeks to show that Quine's theses concerning the underdetermination of scientific theories by experience and the indeterminacy of reference cannot be reconciled if some of Quine's central assumptions are accepted. The argument is this. Quine holds that the thesis about reference is not just a special case of the other thesis. In order to make sense of this comment we must distinguish between factual and epistemic indeterminacy. Something is factually indeterminate if it is not determined by the facts. Epistemic indeterminacy, on the other hand, is due to the lack of evidence. Quine's claim about the relationship between the two theses is best understood as saying that reference is factually indeterminate, whereas the underdetermination of scientific theories is merely epistemic. But the latter claim cannot be sustained in light of Quine's verificationism, holism and naturalism.
Quine claims that holism (i.e., the Quine-Duhem thesis) prevents us from defining synonymy. In Word and Object he rejects a notion of synonymy that works well even if holism is true. The notion can be defined thus: R and S are synonymous iff for all sentences T the logical conjunction of R and T is stimulus-synonymous to the conjunction of S and T. This notion escapes Quine's meaning-skeptical, holistic objections. Contrary to what Quine supposed, the notion is tighter than his notion of stimulus-synonymy, particularly when applied to sentences with less than critical semantic mass. Conclusion: we can define synonymy cleanly even if we go along with Quine's holism, naturalism, behaviorism, and radical translation. Quine's meaning skepticism should be repudiated even on Quine's own territory.
Both John Langshaw Austin and Willard Van Orman Quine were critical of the traditional division of propositions into the two categories: analytic and synthetic. Their criticism has, however, a different character. Quine questions the usefulness of the notion of analyticity, whereas Austin does not accept the view that every proposition should be considered either analytic or synthetic. According to Quine, we have to abandon the notion of analyticity because we cannot define it in a satisfactory way. Quine's criticism is based on his conviction that the very notion of meaning is suspicious from the scientific point of view. This general outlook is supported by arguments the point of which is to show that we cannot avoid an indeterminacy of translation. Austin criticises the distinction for different reasons. According to him, it is not the notion of meaning which is suspicious, but a certain model of this notion — a model which is based on false analogies. In my text, I compare these two approaches and point out that they have different metaphilosophical sources. The main difference lies in the fact that, according to Austin, statements about linguistic meaning usually have a descriptive character, whereas Quine claims that linguistic meanings are theoretical entities. In the last part of my article, I discuss the thesis of indeterminacy of translation and assess its credibility, as it plays a key role in Quine's criticism of the notion of meaning.
Anyone who wants to set up a philosophical theory of meaning should, among other things, try to clarify the concept of synonymy (sameness of meaning). A main problem for this meaning-theoretic project is connected with the holism of the Quine-Duhem thesis: according to this holistic thesis, the testing of our claims about the world takes place not at the level of the individual sentence but at the level of whole theories (Section I). If, for example, we want to extract information about synonymy from Davidson's conception of meaning, we stumble over Quine's holism. And if Davidson's theory does not aim to account for synonymy at all, the theory is at least incomplete (Section II). This verdict holds, of course, only if it is possible at all to introduce a concept of synonymy within the framework of Quine's holism. That this is possible I show (against Quine) in Section III, by developing verificationist and falsificationist pieces of theory into a new empiricist theory of meaning within whose framework one can speak meaningfully about synonymy. In Section IV this positive result is used to defend the concept of the analytic sentence.
Hume and Quine argue that human beings do not have access to general knowledge, that is, to general truths. The arguments of these two philosophers are premised on what Jaakko Hintikka has called the atomistic postulate. In the present work, it is shown that Hume and Quine in fact sanction an extreme version of this postulate, according to which even items of particular knowledge are not directly accessible in so far as they are relational. For according to their fully realized systems, human beings do not initially perceive any relations, or similar epistemological elements that can associate or combine terms on which a relational or general knowledge claim may be based. Nor, likewise, do human beings perceive the relations or the associations themselves as separate entities. In Chapters 1 and 2, respectively, it is shown precisely why Hume and Quine deny that human beings initially perceive either such associative elements or associations in general. Concomitantly, it is made clear why Hume's and Quine's respective epistemologies preclude human beings from initially apprehending not only general knowledge, but particular relational knowledge as well. But this is not to say that Hume and Quine do not think we can eventually acquire such associative elements and, correspondingly, knowledge. Rather, Hume and Quine do provide an account of knowledge, but one that holds all relational and connective elements to be constructed by the human mind. In Hume's case, they are constructed by the imagination. In Quine's case, we are never told quite how this construction occurs, although the evidence suggests that Quine implicitly relies on a faculty similar to Hume's imagination. In the final chapter of this thesis, it is argued that both Hume and Quine must be read as philosophers who justify knowledge by reducing its possibility to a psychological faculty of construction, as well as to a few concepts of intuitively grasped relations.
By way of conclusion, it is shown that this makes Quine's naturalism the psychological heir to Carnap's Aufbau.
This monographic chapter explains how expected utility (EU) theory arose in von Neumann and Morgenstern, how it was called into question by Allais and others, and how it gave way to non-EU theories, at least among the specialized quarters of decision theory. I organize the narrative around the idea that the successive theoretical moves amounted to resolving Duhem-Quine underdetermination problems, so they can be assessed in terms of the philosophical recommendations made to overcome these problems. I actually follow Duhem's recommendation, which was essentially to rely on the passing of time to make many experiments and arguments available, and eventually strike a balance between competing theories on the basis of this improved knowledge. Although Duhem's solution seems disappointingly vague, relying as it does on "bon sens" to bring an end to the temporal process, I do not think there is any better one in the philosophical literature, and I apply it here for what it is worth. In this perspective, EU theorists were justified in resisting the first attempts at refuting their theory, including Allais's in the 50s, but they would have lacked "bon sens" in not acknowledging their defeat in the 80s, after the long process of pros and cons had sufficiently matured. This primary Duhemian theme is actually combined with a secondary theme: normativity. I suggest that EU theory was normative at its very beginning and has remained so all along, and I express dissatisfaction with the orthodox view that it could be treated as a straightforward descriptive theory for purposes of prediction and scientific test. This view is usually accompanied by a faulty historical reconstruction, according to which EU theorists initially formulated the VNM axioms descriptively and retreated to a normative construal once they felt threatened by empirical refutation.
From my historical study, things did not evolve in this way, and the theory was both proposed and rebutted on the basis of normative arguments already in the 1950s. The ensuing, major problem was to make choice experiments compatible with this inherently normative feature of the theory. Compatibility was obtained in some experiments, but implicitly and somewhat confusingly, for instance by excluding overtly incoherent subjects or by creating strong incentives for the subjects to reflect on the questions and provide answers they would be able to defend. I also claim that Allais had an intuition of how to combine testability and normativity, unlike most later experimenters, and that it would have been more fruitful to work from his intuition than to make choice experiments of the naively empirical style that flourished after him. In sum, it can be said that the underdetermination process accompanying EU theory was resolved in a Duhemian way, but not without major inefficiencies. To embody explicit rationality considerations into experimental schemes right from the beginning would have limited the scope of empirical research, avoided wasting resources on minor findings, and speeded up the Duhemian process of groping towards a choice among competing theories.
Confirmation and falsification are different strategies for testing theories and characterizing the outcomes of those tests. Roughly speaking, confirmation is the act of using evidence or reason to verify or certify that a statement is true, definite, or approximately true, whereas falsification is the act of classifying a statement as false in the light of observation reports. After expounding the intellectual history behind confirmation and falsificationism, reaching back to Plato and Aristotle, I survey some of the main controversial issues and arguments that pertain to the choice between these strategies: the Raven Paradox, the Duhem/Quine problem and the Grue Paradox. Finally, I outline an evolutionary criticism of inductive Bayesian approaches based on my assumption of doxastic involuntarism.
Much contemporary epistemology is informed by a kind of confirmational holism, and a consequent rejection of the assumption that all confirmation rests on experiential certainties. Another prominent theme is that belief comes in degrees, and that rationality requires apportioning one's degrees of belief reasonably. Bayesian confirmation models based on Jeffrey Conditionalization attempt to bring together these two appealing strands. I argue, however, that these models cannot account for a certain aspect of confirmation that would be accounted for in any adequate holistic confirmation theory. I then survey the prospects for constructing a formal epistemology that better accommodates holistic insights.
Analytic sentences, which are supposed to be true by definition, either harm natural science or trivialize its progress: so runs one of the objections Quine raised in his campaign against the distinction between synthetic and analytic sentences. They do harm, according to Quine, because they may not be revised and thus unduly restrict our freedom of choice in theory change. (Had Einstein, for example, been impressed by the analytic status of the Newtonian definition of momentum, he could not have formulated the theory of relativity.) Or they trivialize progress, because abandoning analytic sentences merely changes the language, and how is a change of concepts supposed to move us forward? To meet this challenge, we must disentangle language change from theory change. In a scientific revolution, both occur at once. Nevertheless, the observational consequences of the new theory can be compared with those of the old (by means of the Ramsey sentence); and by means of the Carnap sentence, the analytic component of each of the two theories can be extracted. It was Carnap's mistake to use this move to define the notion of an analytic sentence. The right way round is the reverse: with a holism-compatible definition of "analytic", it can be shown that the Carnap sentences are analytic. The basic idea of the definition in question is this: a sentence is analytic if implanting it into any total theory leaves that theory's empirical content unchanged.
Quine’s writings on indeterminacy of translation are mostly abstract and theoretical; his reasons for the thesis are not based on historical cases of translation but on general considerations about how language works. So it is no surprise that a common objection to the thesis asserts that it is not backed up by any positive empirical evidence. Ian Hacking (1981 and 2002) claims that whatever credibility the thesis does enjoy comes rather from alleged (fictitious) cases of radical mistranslation. This paper responds to objections of that kind by exhibiting actual cases of indeterminacy of translation.
The present essay addresses the epistemic difficulties involved in achieving consensus with respect to the Hayek–Keynes debate. It is argued that the empirical implications of the relevant theories are such that, regardless of what is observed, both theories can be interpreted as true, or at least, as not falsified. The essay explicates the respects in which the empirical evidence underdetermines the choice between the relevant theories. In particular, it is argued both that there are convenient responses that protect each theory from what appears to be threatening evidence and that, for particular kinds of evidence, the two theories are empirically equivalent. Larry Laudan's suggestion that ampliative methodological criteria can resolve an underdetermined choice between multiple scientific theories is considered and rejected as a possible means to rational consensus.
This study discusses the attribution of truth to knowledge, and the way truth has become a dogma in scientific knowledge, together with the origins of both. In presenting Karl Popper's falsificationist image of science, the study follows a twofold path: first, what Popper objected to and why; second, how the gap opened by that objection was filled. The first stage thus constitutes a critique of the traditional image from Popper's standpoint, while the second stage expounds the Popperian image of science. Finally, while Popper's innovations brought fresh air to scientific thinking, the last section discusses the shortcomings his system contains. These shortcomings, drawing on Ayer and on the Duhem-Quine thesis, concern respectively the logical impossibility of falsification and the necessity of a holistic approach to the falsification of a theory.
Quine’s thesis of the indeterminacy of translation has puzzled the philosophical community for several decades. It is unquestionably among the best known and most disputed theses in contemporary philosophy. Quine’s classical argument for the indeterminacy thesis, in his seminal work Word and Object, has even been described by Putnam as “what may well be the most fascinating and the most discussed philosophical argument since Kant’s Transcendental Deduction of the Categories” (Putnam, 1975a: p. 159).
Functionalism would be mistaken if there existed a system of deviant relations (an “anti-mind”) that had the same functional roles as the standard mental relations. In this paper such a system is constructed, using “Quinean transformations” of the sort associated with Quine’s thesis of the indeterminacy of translation: for example, a mapping m from particularistic propositions (e.g., that there exists a rabbit) to universalistic propositions (that rabbithood is manifested). Using m, a deviant relation thinking* is defined: x thinks* p iff x thinks m(p). Such deviant relations satisfy the commonly discussed functionalist psychological principles. Finally, a more complicated system of deviant relations is constructed, one satisfying sophisticated principles dealing with the self-conscious rational mind.
Willard V. Quine’s 1951 article, “Two Dogmas of Empiricism” (Two Dogmas), was taken to be revolutionary because it rejects the analytic-synthetic distinction and the thesis that empirical statements are confirmed individually rather than holistically. The present chapter, however, argues that the overcoming of modern philosophy already included the overcoming of these theses by Hegelians, pragmatists and two critics of Hegelianism and pragmatism, Grace and Theodore de Laguna. From this perspective, Two Dogmas offers a Hegelian epistemology that was already superseded in 1910. The perspective is largely based on the de Lagunas’ 1910 book Dogmatism and Evolution: Studies in Modern Philosophy. The de Lagunas’ book also helps to make clear that the real revolution Two Dogmas participated in was the marginalisation of their work and that of other speculative philosophers. Grace de Laguna surely recognised much of this when she stood opposite Quine as he first presented Two Dogmas.
Fundamental to Quine’s philosophy of logic is the thesis that substitutional quantification does not express existence. This paper considers the content of this claim and the reasons for thinking it is true.
In 1947 Quine wrote one of the most important and influential articles in twentieth-century philosophy, "On What There Is". One of the aims of this article was a critique of Meinong's Theory of Objects. The critique was especially focused upon nonactual possibilities, which (according to Meinong) are some kinds of nonexistent objects. In my paper I want to present Neo-Meinongian rebuttals of Quine's critique. In order to do this I discuss: (i) the main thesis of "On What There Is", (ii) the premises of Meinongian theory, (iii) the views of proponents and opponents of the idea of nonexistent objects, (iv) Quine's critique aimed at nonactual possibilities, (v) Terence Parsons' theory, based on the distinction between nuclear and extranuclear properties, and (vi) noneism, defended by Richard Routley. I also try to give a reply to the most popular critiques aimed at both Neo-Meinongian theories. The main conclusion is that Quine's critique and his arguments against nonactual possibilities aren't dangerous for theories endorsing Meinong's Theory of Objects. Contrary to what Gilbert Ryle once claimed (if Meinongianism isn't dead, nothing is), Meinongian theories are still alive and doing well.
The paper concentrates on how the acceptance of radical naturalism in Quine’s theory of meaning leads Quine to naturalized epistemology. W.V. Quine was fascinated by the evidential acquisition of scientific knowledge, and language, as a vehicle of knowledge, plays a significant role in his regimented naturalistic theory anchored in the scientific framework. My point is that there is an interesting shift from epistemology to language (semantic externalism). The rejection of the mentalist approach to meaning vindicates externalism, which in turn paves the way for ‘semantic holism’, a thesis on which the meaning of a sentence is defined in terms of the totality of nodes and paths of its semantic network, so that the meaning of linguistic units depends upon the meaning of the entire language. I revisit Quine’s striking claim about the co-extensiveness of the sentential relation and the evidential relation, which points towards an affirmation of meaning holism and semantic externalism. Besides, the knowledge by acquaintance that relinquishes singular thought from the account of psychological consideration, together with the self-knowledge hypothesis, engages with testimonial and warranted knowledge, entangled with the claims of social knowledge as anticipated by Alvin Goldman. My conclusion is closest to the stance of semantic externalism informed by social knowledge (in an epistemic sense) and by semantic holism.
Quine’s classic interpretation succinctly characterizes Carnap’s Aufbau as an attempt “to account for the external world as a logical construct of sense-data....” Consequently, Russell was characterized as the most important influence on the Aufbau. Those times have passed. Formulating a comprehensive and balanced interpretation of the Aufbau has turned out to be a difficult task and one that must take into account several disjointed sources. My thesis is that the core of the Aufbau rested on a problem that had haunted German philosophy since the end of the 19th century. In terms fashionable at the time, this problem may be expressed as the polarity between Leben and Geist that characterized German philosophy during the years of the Weimar Republic. At that time, many philosophers, including Cassirer, Rickert and Vaihinger, were engaged in overcoming this polarity. As I will show, Carnap’s Aufbau joined the ranks of these projects. This suggests that Lebensphilosophie and Rickert’s System der Philosophie exerted a strong influence on Carnap’s projects, an influence that is particularly conspicuous in his unpublished manuscript Vom Chaos zur Wirklichkeit. Carnap himself asserted that this manuscript could be considered “the germ of the constitution theory” of the Aufbau. Reading Chaos also reveals another strong but neglected influence on the Aufbau, namely a specific version of neutral monism put forward by the philosopher and psychologist Theodor Ziehen before World War I. Ziehen’s work contributed much to the invention of the constitutional method of quasi-analysis.
PUTNAM has made highly regarded contributions to mathematics, to philosophy of logic and to philosophy of science, and in this book he brings his ideas in these three areas to bear on the traditional philosophic problem of materialism versus (objective) idealism. The book assumes that contemporary science (mathematical and physical) is largely correct as far as it goes, or at least that it is rational to believe in it. The main thesis of the book is that consistent acceptance of contemporary science requires the acceptance of some sort of Platonistic idealism affirming the existence of abstract, non-temporal, non-material, non-mental entities (numbers, scientific laws, mathematical formulas, etc.). The author is thus in direct opposition to the extreme materialism which had dominated philosophy of science in the first three quarters of this century. The book can be recommended to the scientifically literate, general reader whose acquaintance with these areas is limited to the literature of the 1950s and before, when it had been assumed that empiricistic materialism was the only philosophy compatible with a scientific outlook. To this group the book presents an eye-opening challenge fulfilling the author’s intention of “shaking up preconceptions and stimulating further discussion”. QUINE’S book is not easy to read, partly because the level of sophistication fluctuates at high frequency between remote extremes and partly because of a convoluted English prose style and devilish terminology. Almost all of the minor but troublesome technical errata in the first printing have been corrected [see reviews, e.g., the reviewer, Philos. Sci. 39 (1972), no. 1, 97–99]. In the opinion of the reviewer the book is not suitable for undergraduate instruction, and without external motivation few mathematicians are likely to have the patience to appreciate it.
Nevertheless, a careful study of the book will more than repay the effort, and one should expect to find frequent references to this book in coming years.
Inspired by Rudolf Carnap's Der Logische Aufbau Der Welt, David J. Chalmers argues that the world can be constructed from a few basic elements. He develops a scrutability thesis saying that all truths about the world can be derived from basic truths and ideal reasoning. This thesis leads to many philosophical consequences: a broadly Fregean approach to meaning, an internalist approach to the contents of thought, and a reply to W. V. Quine's arguments against the analytic and the a priori. Chalmers also uses scrutability to analyze the unity of science, to defend a conceptual approach to metaphysics, and to mount a structuralist response to skepticism. Based on the 2010 John Locke lectures, Constructing the World opens up debate on central philosophical issues involving language, consciousness, knowledge, and reality. This major work by a leading philosopher will appeal to philosophers in all areas. This entry contains uncorrected proofs of front matter, chapter 1, and first excursus.
Evidential holism begins with something like the claim that “it is only jointly as a theory that scientific statements imply their observable consequences.” This is the holistic claim that Elliott Sober tells us is an “unexceptional observation”. But variations on this “unexceptional” claim feature as a premise in a series of controversial arguments for radical conclusions, such as that there is no analytic or synthetic distinction, that the meaning of a sentence cannot be understood without understanding the whole language of which it is a part, and that all knowledge is empirical knowledge. This paper is a survey of what evidential holism is, how plausible it is, and what consequences it has. Section 1 will distinguish a range of different holistic claims, Sections 2 and 3 explore how well motivated they are and how they relate to one another, and Section 4 returns to the arguments listed above and uses the distinctions from the previous sections to identify holism's role in each case.
The thesis evaluates a contemporary debate concerning the very possibility of thinking about the world. In the first chapter, McDowell's critique of Davidson is presented, focusing on the coherentism defended by the latter. The critique of the myth of the given (as it appears in Sellars and Wittgenstein), as well as the necessity of a minimal empiricism (which McDowell finds in Quine and Kant), lead to an oscillation in contemporary thinking between two equally unsatisfactory ways of understanding the empirical content of thought. In the second chapter, I defend Davidson's approach, focusing on his theory of interpretation and semantic externalism, as well as on the relation between causes and reasons. In the third chapter, the debate is analyzed in more detail. I criticize anomalous monism, the way in which the boundaries between the conceptual and the non-conceptual are understood by Davidson, as well as the naturalized Platonism defended by McDowell. This thesis is mainly negative, and it concludes by revealing problems in both positions under evaluation.
In this paper, I develop a criticism of a method for metaontology, namely, the idea that a discourse’s or theory’s ontological commitments can be read off its sentences’ truth-conditions. Firstly, I will put forward this idea’s basis and, secondly, I will present the way Quine subscribed to it. However, I distinguish between two readings of Quine’s famous ontological criterion, and I center the focus on the one currently dubbed “ontological minimalism”, a kind of modern Ockhamism applied to the mentioned metaontological view. I show that this view has a certain application via the Quinean thesis of reference inscrutability, but that it is not possible to press that application any further and, in particular, not for the ambitious metaontological task some authors try to employ it for. The conclusion may sound promising: having shown the impossibility of a semantic ontological criterion, intentionalist or subjectivist ones should be explored.
"Understanding Scientific Progress constitutes a potentially enormous and revolutionary advancement in philosophy of science. It deserves to be read and studied by everyone with any interest in or connection with physics or the theory of science. Maxwell cites the work of Hume, Kant, J.S. Mill, Ludwig Boltzmann, Pierre Duhem, Einstein, Henri Poincaré, C.S. Peirce, Whitehead, Russell, Carnap, A.J. Ayer, Karl Popper, Thomas Kuhn, Imre Lakatos, Paul Feyerabend, Nelson Goodman, Bas van Fraassen, and numerous others. He lauds Popper for advancing beyond verificationism and Hume’s problem of induction, but faults both Kuhn and Popper for being unable to show that and how their work could lead nearer to the truth." —Dr. LLOYD EBY teaches philosophy at The George Washington University and The Catholic University of America, in Washington, DC "Maxwell's aim-oriented empiricism is in my opinion a very significant contribution to the philosophy of science. I hope that it will be widely discussed and debated." – ALAN SOKAL, Professor of Physics, New York University "Maxwell takes up the philosophical challenge of how natural science makes progress and provides a superb treatment of the problem in terms of the contrast between traditional conceptions and his own scientifically-informed theory—aim-oriented empiricism. This clear and rigorously-argued work deserves the attention of scientists and philosophers alike, especially those who believe that it is the accumulation of knowledge and technology that answers the question."—LEEMON McHENRY, California State University, Northridge "Maxwell has distilled the finest essence of the scientific enterprise. Science is about making the world a better place. Sometimes science loses its way. The future depends on scientists doing the right things for the right reasons.
Maxwell's Aim-Oriented Empiricism is a map to put science back on the right track."—TIMOTHY McGETTIGAN, Professor of Sociology, Colorado State University - Pueblo "Maxwell has a great deal to offer with these important ideas, and deserves to be much more widely recognised than he is. Readers with a background in philosophy of science will appreciate the rigour and thoroughness of his argument, while more general readers will find his aim-oriented rationality a promising way forward in terms of a future sustainable and wise social order."—David Lorimer, Paradigm Explorer, 2017/2 "This is a book about the very core problems of the philosophy of science. The idea of replacing Standard Empiricism with Aim-Oriented Empiricism is understood by Maxwell as the key to the solution of these central problems. Maxwell handles his main tool masterfully, producing a fascinating and important reading for his colleagues in the field. However, Nicholas Maxwell is much more than just a philosopher of science. In the closing part of the book he lets the reader know about his deep concern with, and possible solutions of, the biggest problems humanity is facing."—Professor PEETER MŰŰREPP, Tallinn University of Technology, Estonia “For many years, Maxwell has been arguing that fundamental philosophical problems about scientific progress, especially the problem of induction, cannot be solved granted standard empiricism (SE), a doctrine which, he thinks, most scientists and philosophers of science take for granted. A key tenet of SE is that no permanent thesis about the world can be accepted as a part of scientific knowledge independent of evidence. For a number of reasons, Maxwell argues, we need to adopt a rather different conception of science which he calls aim-oriented empiricism (AOE).
This holds that we need to construe physics as accepting, as a part of theoretical scientific knowledge, a hierarchy of metaphysical theses about the comprehensibility and knowability of the universe, which become increasingly insubstantial as we go up the hierarchy. In his book “Understanding Scientific Progress: Aim-Oriented Empiricism”, Maxwell gives a concise and excellent illustration of this view and the arguments supporting it… Maxwell’s book is a potentially important contribution to our understanding of scientific progress and philosophy of science more generally. Maybe it is the time for scientists and philosophers to acknowledge that science has to make metaphysical assumptions concerning the knowability and comprehensibility of the universe. Fundamental philosophical problems about scientific progress, which cannot be solved granted SE, may be solved granted AOE.” Professor SHAN GAO, Shanxi University, China.
According to Quine, Charles Parsons, Mark Steiner, and others, Russell's logicist project is important because, if successful, it would show that mathematical theorems possess desirable epistemic properties often attributed to logical theorems, such as a prioricity, necessity, and certainty. Unfortunately, Russell never attributed such importance to logicism, and such a thesis contradicts Russell's explicitly stated views on the relationship between logic and mathematics. This raises the question: what did Russell understand to be the philosophical importance of logicism? Building on recent work by Andrew Irvine and Martin Godwyn, I argue that Russell thought a systematic reduction of mathematics increases the certainty of known mathematical theorems (even basic arithmetical facts) by showing mathematical knowledge to be coherently organized. The paper outlines Russell's theory of coherence, and discusses its relevance to logicism and the certainty attributed to mathematics.