While Thomas Kuhn's theory of scientific revolutions does not specifically deal with validation, the validation of simulations can be related in various ways to Kuhn's theory: 1) Computer simulations are sometimes depicted as located between experiments and theoretical reasoning, thus potentially blurring the line between theory and empirical research. Does this require a new kind of research logic that differs from the classical paradigm, which clearly distinguishes between theory and empirical observation? I argue that this is not the case. 2) Another typical feature of computer simulations is their being ``motley'' (Winsberg 2003) with respect to the various premises that enter into simulations. A possible consequence is that in case of failure it can become difficult to tell which of the premises is to blame. Could this issue be understood as fostering Kuhn's mild relativism with respect to theory choice? I argue that there is no need to worry about relativism with respect to computer simulations in particular. 3) The field of social simulations, in particular, still lacks a common understanding concerning the requirements of empirical validation of simulations. Does this mean that social simulations are still in a pre-scientific state in the sense of Kuhn? My conclusion is that despite ongoing efforts to promote quality standards in this field, lack of proper validation is still a problem in many published simulation studies and that at least large parts of the field of social simulations must be considered pre-scientific.
This paper tries to answer the question why the epistemic value of so many social simulations is questionable. I consider the epistemic value of a social simulation questionable if it contributes neither directly nor indirectly to the understanding of empirical reality. To examine this question, two classical social simulations are analyzed with respect to their possible epistemic justification: Schelling’s neighborhood segregation model and Axelrod’s reiterated Prisoner’s Dilemma simulations of the evolution of cooperation. It is argued that Schelling’s simulation is useful because it can be related to empirical reality, while Axelrod’s simulations and those of his followers cannot, and that their scientific value therefore remains doubtful. I relate this finding to the background beliefs of modelers about the superiority of the modeling method as expressed in Joshua Epstein’s keynote address “Why model?”.
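To make the contrast concrete, the following is a minimal one-dimensional sketch of the kind of mechanism behind Schelling's neighborhood segregation model. The ring size, tolerance threshold and neighborhood radius are illustrative choices of mine, not parameters taken from Schelling's original model or from the paper.

import random

# Minimal one-dimensional Schelling-style segregation sketch (illustrative
# parameters): agents of two types live on a ring of cells and move to a
# random empty cell whenever fewer than THRESHOLD of their occupied
# neighboring cells hold agents of their own type.
SIZE, THRESHOLD, STEPS = 100, 0.5, 2000

def init_grid():
    cells = ['A', 'B'] * 45 + [None] * 10   # 90 agents, 10 empty cells
    random.shuffle(cells)
    return cells

def unhappy(grid, i):
    if grid[i] is None:
        return False
    neighbors = [grid[(i + d) % SIZE] for d in (-2, -1, 1, 2)]
    occupied = [n for n in neighbors if n is not None]
    return bool(occupied) and sum(n == grid[i] for n in occupied) / len(occupied) < THRESHOLD

def step(grid):
    movers = [i for i in range(SIZE) if unhappy(grid, i)]
    empties = [i for i in range(SIZE) if grid[i] is None]
    if movers and empties:
        src, dst = random.choice(movers), random.choice(empties)
        grid[dst], grid[src] = grid[src], None

grid = init_grid()
for _ in range(STEPS):
    step(grid)
print(''.join(c if c else '.' for c in grid))   # runs of A's and B's indicate clustering

The point of the mechanism is that even a mild preference for like neighbors tends to produce marked clustering, and it is this qualitative pattern that can be compared with empirical segregation data.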
Employing computer simulations for the study of the evolution of altruism has been popular since Axelrod's book "The Evolution of Cooperation". But have the myriad simulation studies that followed in Axelrod's footsteps really increased our knowledge about the evolution of altruism or cooperation? This book examines in detail the working mechanisms of simulation-based evolutionary explanations of altruism. It shows that the "theoretical insights" that can be derived from simulation studies are often quite arbitrary and of little use for empirical research. In the final chapter of the book, therefore, a set of epistemological requirements for computer simulations is proposed and recommendations for the proper research design of simulation studies are made.
In this paper we investigate with a case study from chemistry under what conditions a simulation can serve as a surrogate for an experiment. The case study concerns a simulation of H2 formation in outer space. We find that in this case the simulation can act as a surrogate for an experiment, because there exists comprehensive theoretical background knowledge in the form of quantum mechanics about the range of phenomena to which the investigated process belongs, and because the particular modelling assumptions can be justified. If these requirements are met, then direct empirical validation may even be dispensable. We conjecture that this is not the case in the absence of comprehensive theoretical background knowledge.
In a recent Philosophy of Science article Gerhard Schurz proposes meta-inductivistic prediction strategies as a new approach to Hume's problem of induction. This comment examines the limitations of Schurz's approach. It can be proven that the meta-inductivist approach no longer works if the meta-inductivists have to face an infinite number of alternative predictors. With this limitation it remains doubtful whether the meta-inductivist approach can provide a full solution to the problem of induction.
This paper discusses critically what simulation models of the evolution of cooperation can possibly prove, by examining Axelrod’s “Evolution of Cooperation” (1984) and the modeling tradition it has inspired. Hardly any of the many simulation models in this tradition have been applicable empirically. Axelrod’s role model suggested a research design that seemingly allowed general conclusions to be drawn from simulation models even if the mechanisms that drive the simulation could not be identified empirically. But this research design was fundamentally flawed. At best such simulations can claim to prove logical possibilities, i.e. they prove that certain phenomena are possible as a consequence of the modeling assumptions built into the simulation, but not that they are possible or can be expected to occur in reality. I suggest several requirements under which proofs of logical possibilities can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. It would be better not to use this kind of simulation at all.
This paper discusses critically what simulation models of the evolution of cooperation can possibly prove, by examining Axelrod’s “Evolution of Cooperation” and the modeling tradition it has inspired. Hardly any of the many simulation models of the evolution of cooperation in this tradition have been applicable empirically. Axelrod’s role model suggested a research design that seemingly allowed general conclusions to be drawn from simulation models even if the mechanisms that drive the simulation could not be identified empirically. But this research design was fundamentally flawed, because it is not possible to draw general empirical conclusions from theoretical simulations. At best such simulations can claim to prove logical possibilities, i.e. they prove that certain phenomena are possible as a consequence of the modeling assumptions built into the simulation, but not that they are possible or can be expected to occur in reality. I suggest several requirements under which proofs of logical possibilities can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. I contrast this with Schelling’s neighborhood segregation model, the core mechanism of which can be retraced empirically.
Simulation models of the Reiterated Prisoner's Dilemma (in the following: RPD-models) have for 30 years been considered one of the standard tools for studying the evolution of cooperation (Rangoni 2013; Hoffmann 2000). A considerable number of such simulation models have been produced by scientists. Unfortunately, though, none of these models has been empirically verified, and there exists no example of empirical research where any of the RPD-models has been successfully applied to a particular instance of cooperation. Surprisingly, this has not kept scientists from continuing to produce simulation models in the same tradition and from writing their own history as a history of success. In a recent simulation study -- which does not make use of the RPD but otherwise follows the same pattern of research -- Robert Axelrod's (1984) original role model for this kind of simulation study is praised as ``an extremely effective means for investigating the evolution of cooperation'' and considered ``widely credited with invigorating that field'' (Rendell et al. 2010).
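For readers unfamiliar with the setup, the following minimal sketch shows the core of such an RPD-model: a round-robin tournament between strategies in the reiterated Prisoner's Dilemma. The payoff values are the conventional ones from Axelrod's tournaments (T=5, R=3, P=1, S=0); the strategy set and the number of rounds are illustrative choices of mine.

# Payoffs for (my move, opponent's move); 'C' = cooperate, 'D' = defect.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(own_history, other_history):
    return other_history[-1] if other_history else 'C'   # cooperate first, then copy the opponent

def always_defect(own_history, other_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

strategies = {'tit for tat': tit_for_tat, 'always defect': always_defect}
for name_a, strat_a in strategies.items():
    for name_b, strat_b in strategies.items():
        print(name_a, 'vs', name_b, play(strat_a, strat_b))

Typical RPD-models place larger strategy sets, noise or evolutionary dynamics on top of such a tournament and interpret the surviving strategies as evidence about the evolution of cooperation.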
Simulation models of the Reiterated Prisoner's Dilemma have been popular for studying the evolution of cooperation for more than 30 years now. However, there have been practically no successful instances of empirical application of any of these models. At the same time, this lack of empirical testing and confirmation has almost entirely been ignored by the modelers' community. In this paper, I examine some of the typical narratives and standard arguments with which these models are justified by their authors despite the lack of empirical validation. I find that most of these narratives and arguments are not at all compelling. Nonetheless, they seem to serve an important function in keeping the simulation business running despite its empirical shortcomings.
This paper is intended as a critical examination of the question of when and under what conditions the use of computer simulations is beneficial to scientific explanations. This objective is pursued in two steps: First, I try to establish clear criteria that simulations must meet in order to be explanatory. Basically, a simulation has explanatory power only if it includes all causally relevant factors of a given empirical configuration and if the simulation delivers stable results within the measurement inaccuracies of the input parameters. In the second step, I examine a few examples of Axelrod-style simulations as they have been used to understand the evolution of cooperation (Axelrod, Schüßler). These simulations do not meet the criteria for explanatory validity, and it can be shown, as I believe, that they lead us astray from the scientific problems they were meant to solve.
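The stability criterion can be illustrated with a small sketch: rerun a simulation with its input parameters perturbed within their measurement error and check whether the output stays within an acceptable tolerance. The simulate function below is only a hypothetical stand-in (a simple logistic-growth model); the check itself is independent of what is being simulated, and all names and values are illustrative.

import random

def simulate(growth_rate, carrying_capacity, steps=50, x0=10.0):
    # hypothetical stand-in model: discrete logistic growth
    x = x0
    for _ in range(steps):
        x += growth_rate * x * (1 - x / carrying_capacity)
    return x

def stable_within_error(params, errors, tolerance, runs=100):
    # rerun the simulation with inputs perturbed within their measurement error
    baseline = simulate(**params)
    for _ in range(runs):
        perturbed = {k: v + random.uniform(-errors[k], errors[k]) for k, v in params.items()}
        if abs(simulate(**perturbed) - baseline) > tolerance:
            return False
    return True

print(stable_within_error({'growth_rate': 0.3, 'carrying_capacity': 100.0},
                          {'growth_rate': 0.02, 'carrying_capacity': 5.0},
                          tolerance=10.0))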
There is an ongoing debate on whether, or to what degree, computer simulations can be likened to experiments. Many philosophers are sceptical about whether a strict separation between the two categories is possible and deny that the materiality of experiments makes a difference (Morrison 2009, Parker 2009, Winsberg 2010). Some also like to describe computer simulations as a “third way” between experimental and theoretical research (Rohrlich 1990, Axelrod 2003, Kueppers/Lenhard 2005). In this article I defend the view that computer simulations are not experiments but tools for evaluating the consequences of theories and theoretical assumptions. In order to do so, the (alleged) similarities and differences between simulations and experiments are examined. It is found that three fundamental differences between simulations and experiments remain: 1) Only experiments can generate new empirical data. 2) Only experiments can operate directly on the target system. 3) Experiments alone can be employed for testing fundamental hypotheses. As a consequence, experiments enjoy a distinct epistemic role in science that cannot completely be superseded by computer simulations. This finding, in connection with a discussion of border cases such as hybrid methods that combine measurement with simulation, shows that computer simulations can clearly be distinguished from empirical methods. It is important to understand that computer simulations are not experiments, because otherwise there is a danger of systematically underestimating the need for empirical validation of simulations.
In this paper Kant's "perpetual peace" is interpreted as a realistic utopia. Kant's "perpetual peace" remains a utopia even today, in the sense that the perpetual world peace it describes is still a long way off from today's state of world politics. But Kant also tried to show that the utopian scenario is possible under realistic assumptions. Therefore this essay examines whether Kant's basic assumptions - such as, for example, the assumption that democracies are generally non-aggressive - are still valid in the light of the political experiences of the two centuries that have elapsed since the publication of the "perpetual peace", and how the realisation of Kant's utopia can best be promoted in today's situation.
This article is a commentary on another article by Burkhard Stephan in "Erwägen Wissen Ethik" (16/2005, Issue 3). It examines the question whether there exist analogies between (Darwinian) biological evolution and cultural development processes. The topics discussed are: 1. Analogies to biological evolution on the cultural level. 2. Analogies to cultural processes on the biological level. 3. Features of the biological evolution of human nature that have direct consequences on the cultural level. 4. Ethical questions raised by the previous three points.
Epistemic responsibility means that scientists are responsible for their research being suitable to contribute to our understanding of the world, or at least some part of the world. As will be shown with the example of computer simulations in the social sciences, this is unfortunately far from being understood as a matter of course. Rather, there exist whole research traditions in which the bulk of the contributions is quite free from any tangible purpose of enhancing our knowledge about anything. This essay is concerned with the causes of this phenomenon and pleads for taking epistemic responsibility seriously as a scientific virtue. Science should be organized in such a way that it is possible and likely that scientists will assume epistemic responsibility for their research tasks.
In this paper the ethical problem of how moral judgments of foreign cultures and bygone epochs can be justified is discussed. After ruling out the extremes of moral absolutism (judging without any reservations by the standards of one's own culture and epoch) and moral relativism (judging only by the respective standards of the time and culture in question), the following solution to the dilemma is sought: A distinction has to be made between judging the norms and institutions in power at a certain place and time and judging people acting within the social institutions of their time and culture. While the former may be judged rigorously, only taking into account the objective possibilities for having other institutions at a certain stage of development, the latter should be judged against the background of the common-sense morals of the respective time and culture.
Mathematical models are a well-established tool in most natural sciences. Although models have been neglected by the philosophy of science for a long time, their epistemological status as a link between theory and reality is now fairly well understood. However, regarding the epistemological status of mathematical models in the social sciences, there still exists considerable unclarity. In my paper I argue that this results from specific challenges that mathematical models and especially computer simulations face in the social sciences. The most important difference between the social sciences and the natural sciences with respect to modeling is that powerful and well-confirmed background theories (like Newtonian mechanics, quantum mechanics or the theory of relativity in physics) do not exist in the social sciences. Therefore, an epistemology of models that is formed on the role model of physics may not be appropriate for the social sciences. I discuss the challenges that modeling faces in the social sciences and point out their epistemological consequences. The most important consequences are that greater emphasis must be placed on empirical validation than on theoretical validation and that the relevance of purely theoretical simulations is strongly limited.
In this commentary I criticise Doris Gerber's intentionalistic reading of history. While an intentionalistic philosophy of history has some plausibility, a *purely* intentionalistic view is often irreconcilable with the most elementary common sense. For example, the view that history ought to be considered exclusively as the history of human action and not also of things that simply happen to humans - like the eruption of Mount Vesuvius in the year 79, which led to the destruction of Pompeii. Or the view that historical value judgments should always be judgments about the reasons for human action, which raises the question if and how the unintended consequences of human action are to be judged.
Hans Kelsen's critique of Eric Voegelin's "New Science of Politics" has for a long time been very difficult to access, because Kelsen published only parts of it in his lifetime and left other parts unpublished. This allowed Voegelin to spread the myth that Kelsen had refrained from publishing his criticism because he had understood that he was wrong. This is nonsense. The reasons why Kelsen left part of his criticism unpublished are mostly accidental. At the same time, Kelsen's criticism itself is a very sound rejection of Voegelin's hopelessly flawed theory of modernity as an age of Gnosticism. In this Nachwort to the edition of Kelsen's Voegelin critique I summarize and comment on the most important points of Kelsen's criticism: 1. Kelsen's take on Voegelin's criticism of positivism. 2. Kelsen's rejection of Voegelin's "theory of representation", which is in fact a rejection of democratic representation in favor of an authoritarian and theocratic model of politics. 3. Kelsen's refutation of Voegelin's explanation of political disorder in the 20th century as a result of Gnostic heresy. My conclusion is that Kelsen's criticism is successful (with only very minor qualifications) in all of these points and that it is thus an important contribution to the discussion of Voegelin's scholarship, which has until recently been almost entirely apologetic.
This is a series of lectures on formal decision theory held at the University of Bayreuth during the summer terms 2008 and 2009. It largely follows the book by Michael D. Resnik: Choices. An Introduction to Decision Theory, 5th ed., Minneapolis/London 2000, and covers the following topics: decisions under ignorance and risk; probability calculus (Kolmogorov axioms, Bayes' theorem); philosophical interpretations of probability (R. v. Mises, Ramsey-de Finetti); von Neumann-Morgenstern utility theory; introductory game theory; social choice theory (Sen's paradox of liberalism, Arrow's theorem).
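As an illustration of the probability-calculus part, Bayes' theorem together with a standard worked example; the numbers are chosen purely for illustration and are not taken from the lectures.

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E \mid H)\, P(H) + P(E \mid \neg H)\, P(\neg H)}

With a prior P(H) = 0.01, a hit rate P(E \mid H) = 0.9 and a false-positive rate P(E \mid \neg H) = 0.05 this gives

P(H \mid E) = \frac{0.9 \cdot 0.01}{0.9 \cdot 0.01 + 0.05 \cdot 0.99} = \frac{0.009}{0.0585} \approx 0.15,

i.e. even a fairly reliable test yields only a modest posterior probability when the prior probability of the hypothesis is low.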