  • Nature's capacities and their measurement. Nancy Cartwright - 1989 - New York: Oxford University Press.
    Ever since David Hume, empiricists have barred powers and capacities from nature. In this book Cartwright argues that capacities are essential in our scientific world, and, contrary to empiricist orthodoxy, that they can meet sufficiently strict demands for testability. Econometrics is one discipline where probabilities are used to measure causal capacities, and the technology of modern physics provides several examples of testing capacities (such as lasers). Cartwright concludes by applying the lessons of the book about capacities and probabilities to the (...)
  • Wimsatt and the robustness family: Review of Wimsatt’s Re-engineering Philosophy for Limited Beings. [REVIEW] Brett Calcott - 2011 - Biology and Philosophy 26 (2):281-293.
    This review of Wimsatt’s book Re-engineering Philosophy for Limited Beings focuses on analysing his use of robustness, a central theme in the book. I outline a family of three distinct conceptions of robustness that appear in the book, and look at the different roles they play. I briefly examine what underwrites robustness, and suggest that further work is needed to clarify both the structure of robustness and the relation between its various conceptions.
  • Dynamic mechanistic explanation: computational modeling of circadian rhythms as an exemplar for cognitive science. William Bechtel & Adele Abrahamsen - 2010 - Studies in History and Philosophy of Science Part A 41 (3):321-333.
    Two widely accepted assumptions within cognitive science are that (1) the goal is to understand the mechanisms responsible for cognitive performances and (2) computational modeling is a major tool for understanding these mechanisms. The particular approaches to computational modeling adopted in cognitive science, moreover, have significantly affected the way in which cognitive mechanisms are understood. Unable to employ some of the more common methods for conducting research on mechanisms, cognitive scientists’ guiding ideas about mechanism have developed in conjunction with their (...)
  • Model Organisms and Mathematical and Synthetic Models to Explore Gene Regulation Mechanisms. Andrea Loettgers - 2007 - Biological Theory 2 (2):134-142.
    Gene regulatory networks are intensively studied in biology. One of the main aims of these studies is to gain an understanding of how the structure of genetic networks relates to specific functions such as chemotaxis and the circadian clock. Scientists have examined this question by using model organisms such as Drosophila and mathematical models. In recent years, synthetic models—engineered genetic networks—have become increasingly important in the exploration of gene regulation. What is the potential of this new approach (...)
  • Robustness, Reliability, and Overdetermination (1981). William C. Wimsatt - 2012 - In Lena Soler (ed.), Characterizing the robustness of science: after the practice turn in philosophy of science. New York: Springer Verlag. pp. 61-78.
    The use of multiple means of determination to “triangulate” on the existence and character of a common phenomenon, object, or result has had a long tradition in science but has seldom been a matter of primary focus. As with many traditions, it is traceable to Aristotle, who valued having multiple explanations of a phenomenon, and it may also be involved in his distinction between special objects of sense and common sensibles. It is implicit though not emphasized in the distinction between (...)
  • The Robust Volterra Principle. Michael Weisberg & Kenneth Reisman - 2008 - Philosophy of Science 75 (1):106-131.
    Theorizing in ecology and evolution often proceeds via the construction of multiple idealized models. To determine whether a theoretical result actually depends on core features of the models and is not an artifact of simplifying assumptions, theorists have developed the technique of robustness analysis, the examination of multiple models looking for common predictions. A striking example of robustness analysis in ecology is the discovery of the Volterra Principle, which describes the effect of general biocides in predator-prey systems. This paper details (...)
  • Three Kinds of Idealization. Michael Weisberg - 2007 - Journal of Philosophy 104 (12):639-659.
    Philosophers of science increasingly recognize the importance of idealization: the intentional introduction of distortion into scientific theories. Yet this recognition has not yielded consensus about the nature of idealization. The literature of the past thirty years contains disparate characterizations and justifications, but little evidence of convergence towards a common position.
  • Forty years of 'the strategy': Levins on model building and idealization. Michael Weisberg - 2006 - Biology and Philosophy 21 (5):623-645.
    This paper is an interpretation and defense of Richard Levins’ “The Strategy of Model Building in Population Biology,” which has been extremely influential among biologists since its publication 40 years ago. In this article, Levins confronted some of the deepest philosophical issues surrounding modeling and theory construction. By way of interpretation, I discuss each of Levins’ major philosophical themes: the problem of complexity, the brute-force approach, the existence and consequence of tradeoffs, and robustness analysis. I argue that Levins’ article is (...)
  • Robustness, discordance, and relevance. Jacob Stegenga - 2009 - Philosophy of Science 76 (5):650-661.
    Robustness is a common platitude: hypotheses are better supported with evidence generated by multiple techniques that rely on different background assumptions. Robustness has been put to numerous epistemic tasks, including the demarcation of artifacts from real entities, countering the “experimenter’s regress,” and resolving evidential discordance. Despite the frequency of appeals to robustness, the notion itself has received scant critique. Arguments based on robustness can give incorrect conclusions. More worrying is that although robustness may be valuable in ideal evidential circumstances (i.e., (...)
  • Does matter really matter? Computer simulations, experiments, and materiality. Wendy S. Parker - 2009 - Synthese 169 (3):483-496.
    A number of recent discussions comparing computer simulation and traditional experimentation have focused on the significance of “materiality.” I challenge several claims emerging from this work and suggest that computer simulation studies are material experiments in a straightforward sense. After discussing some of the implications of this material status for the epistemology of computer simulation, I consider the extent to which materiality (in a particular sense) is important when it comes to making justified inferences about target systems on the basis (...)
  • Models and the locus of their truth. Uskali Mäki - 2011 - Synthese 180 (1):47-63.
    If models can be true, where is their truth located? Giere (Explaining science, University of Chicago Press, Chicago, 1988) has suggested an account of theoretical models on which models themselves are not truth-valued. The paper suggests modifying Giere’s account without going all the way to purely pragmatic conceptions of truth—while giving pragmatics a prominent role in modeling and truth-acquisition. The strategy of the paper is to ask: if I want to relocate truth inside models, how do I get it, what (...)
  • MISSing the World. Models as Isolations and Credible Surrogate Systems. Uskali Mäki - 2009 - Erkenntnis 70 (1):29-43.
    This article shows how the MISS account of models—as isolations and surrogate systems—accommodates and elaborates Sugden’s account of models as credible worlds and Hausman’s account of models as explorations. Theoretical models typically isolate by means of idealization, and they are representatives of some target system, which prompts issues of resemblance between the two to arise. Models as representations are constrained both ontologically (by their targets) and pragmatically (by the purposes and audiences of the modeller), and these relations are coordinated by (...)
  • Exporting causal knowledge in evolutionary and developmental biology. Sandra D. Mitchell - 2008 - Philosophy of Science 75 (5):697-706.
    In this article I consider the challenges for exporting causal knowledge raised by complex biological systems. In particular, James Woodward’s interventionist approach to causality identified three types of stability in causal explanation: invariance, modularity, and insensitivity. I consider an example of robust degeneracy in genetic regulatory networks and knockout experimental practice to pose methodological and conceptual questions for our understanding of causal explanation in biology.
  • Robustness Analysis. Michael Weisberg - 2006 - Philosophy of Science 73 (5):730-742.
    Modelers often rely on robustness analysis, the search for predictions common to several independent models. Robustness analysis has been characterized and championed by Richard Levins and William Wimsatt, who see it as central to modern theoretical practice. The practice has also been severely criticized by Steven Orzack and Elliott Sober, who claim that it is a nonempirical form of confirmation, effective only under unusual circumstances. This paper addresses Orzack and Sober's criticisms by giving a new account of robustness analysis and (...)
  • Economic Modelling as Robustness Analysis. Jaakko Kuorikoski, Aki Lehtinen & Caterina Marchionni - 2010 - British Journal for the Philosophy of Science 61 (3):541-567.
    We claim that the process of theoretical model refinement in economics is best characterised as robustness analysis: the systematic examination of the robustness of modelling results with respect to particular modelling assumptions. We argue that this practice has epistemic value by extending William Wimsatt's account of robustness analysis as triangulation via independent means of determination. For economists robustness analysis is a crucial methodological strategy because their models are often based on idealisations and abstractions, and it is usually difficult to tell (...)
  • Isolating Representations Versus Credible Constructions? Economic Modelling in Theory and Practice. Tarja Knuuttila - 2009 - Erkenntnis 70 (1):59-80.
    This paper examines two recent approaches to the nature and functioning of economic models: models as isolating representations and models as credible constructions. The isolationist view conceives of economic models as surrogate systems that isolate some of the causal mechanisms or tendencies of their respective target systems, while the constructionist approach treats them rather like pure constructions or fictional entities that nevertheless license different kinds of inferences. I will argue that whereas the isolationist view is still tied to the representationalist (...)
  • Extending Ourselves: Computational Science, Empiricism, and Scientific Method. Paul Humphreys - 2004 - New York: Oxford University Press.
    Computational methods such as computer simulations, Monte Carlo methods, and agent-based modeling have become the dominant techniques in many areas of science. Extending Ourselves contains the first systematic philosophical account of these new methods, and how they require a different approach to scientific method. Paul Humphreys draws a parallel between the ways in which such computational methods have enhanced our abilities to mathematically model the world, and the more familiar ways in which scientific instruments have expanded our access to the (...)
  • Computational Models. Paul Humphreys - 2002 - Philosophy of Science 69 (S3):S1-S11.
    A different way of thinking about how the sciences are organized is suggested by the use of cross-disciplinary computational methods as the organizing unit of science, here called computational templates. The structure of computational models is articulated using the concepts of construction assumptions and correction sets. The existence of these features indicates that certain conventionalist views are incorrect, in particular it suggests that computational models come with an interpretation that cannot be removed as well as a prior justification. A form (...)
  • The Vanity of Rigour in Economics: Theoretical Models and Galilean Experiments. Nancy Cartwright - 1999 - LSE, Centre for the Philosophy of the Natural and Social Sciences.
  • Re-engineering philosophy for limited beings: piecewise approximations to reality. William C. Wimsatt - 2007 - Cambridge, Mass.: Harvard University Press.
    This book offers a philosophy for error-prone humans trying to understand messy systems in the real world.
  • Depth: An Account of Scientific Explanation. Michael Strevens - 2008 - Cambridge, Mass.: Harvard University Press.
    Approaches to explanation -- Causal and explanatory relevance -- The kairetic account of difference-making -- The kairetic account of explanation -- Extending the kairetic account -- Event explanation and causal claims -- Regularity explanation -- Abstraction in regularity explanation -- Approaches to probabilistic explanation -- Kairetic explanation of frequencies -- Kairetic explanation of single outcomes -- Looking outward -- Looking inward.
  • A System of Logic. John Stuart Mill - 1874 - Longman.
    Reprint of the original, first published in 1869.
  • Complex biological mechanisms: Cyclic, oscillatory, and autonomous. William Bechtel & Adele Abrahamsen - unknown
    The mechanistic perspective has dominated biological disciplines such as biochemistry, physiology, cell and molecular biology, and neuroscience, especially during the 20th century. The primary strategy is reductionist: organisms are to be decomposed into component parts and operations at multiple levels. Researchers adopting this perspective have generated an enormous body of information about the mechanisms of life at scales ranging from the whole organism down to genetic and other molecular operations.
  • On the method of isolation in economics. Uskali Mäki - 1992 - Poznan Studies in the Philosophy of the Sciences and the Humanities 26:19-54.
  • Fictions, representations, and reality. Margaret Morrison - 2009 - In Mauricio Suárez (ed.), Fictions in Science: Philosophical Essays on Modeling and Idealization. Routledge. pp. 4-110.