  • Software Intensive Science. John Symons & Jack Horner - 2014 - Philosophy and Technology 27 (3):461-477.
    This paper argues that the difference between contemporary software-intensive scientific practice and more traditional non-software-intensive varieties results from the characteristically high conditionality of software. We explain why the path complexity of programs with high conditionality imposes limits on standard error correction techniques and why this matters. While it is possible, in general, to characterize the error distribution in inquiry that does not involve high conditionality, we cannot characterize the error distribution in inquiry that depends on software. Software intensive (...) [A minimal sketch of this path-explosion point appears after the list below.]
  • Epistemic Entitlements and the Practice of Computer Simulation. John Symons & Ramón Alvarado - 2019 - Minds and Machines 29 (1):37-60.
    What does it mean to trust the results of a computer simulation? This paper argues that trust in simulations should be grounded in empirical evidence, good engineering practice, and established theoretical principles. Without these constraints, computer simulation risks becoming little more than speculation. We argue against two prominent positions in the epistemology of computer simulation and defend a conservative view that emphasizes the difference between the norms governing scientific investigation and those governing ordinary epistemic practices.
  • Epistemic injustice and data science technologies. John Symons & Ramón Alvarado - 2022 - Synthese 200 (2):1-26.
    Technologies that deploy data science methods are liable to result in epistemic harms involving the diminution of individuals with respect to their standing as knowers or their credibility as sources of testimony. Not all harms of this kind are unjust, but when they are, we ought to try to prevent or correct them. Epistemically unjust harms will typically intersect with other more familiar and well-studied kinds of harm that result from the design, development, and use of data science technologies. However, (...)
  • Can we trust Big Data? Applying philosophy of science to software. John Symons & Ramón Alvarado - 2016 - Big Data and Society 3 (2).
    We address some of the epistemological challenges highlighted by the Critical Data Studies literature by reference to some of the key debates in the philosophy of science concerning computational modeling and simulation. We provide a brief overview of these debates focusing particularly on what Paul Humphreys calls epistemic opacity. We argue that debates in Critical Data Studies and philosophy of science have neglected the problem of error management and error detection. This is an especially important feature of the epistemology of (...)
  • On malfunctioning software. Giuseppe Primiero, Nir Fresco & Luciano Floridi - 2015 - Synthese 192 (4):1199-1220.
    Artefacts do not always do what they are supposed to, for a variety of reasons including manufacturing problems, poor maintenance, and normal wear and tear. Since software is an artefact, it should be subject to malfunctioning in the same sense in which other artefacts can malfunction. Yet, whether software is on a par with other artefacts when it comes to malfunctioning crucially depends on the abstraction used in the analysis. We distinguish between "negative" and "positive" notions of malfunction. A negative malfunction, (...)
  • Software engineering standards for epidemiological models. Jack K. Horner & John F. Symons - 2020 - History and Philosophy of the Life Sciences 42 (4):1-24.
    There are many tangled normative and technical questions involved in evaluating the quality of software used in epidemiological simulations. In this paper we answer some of these questions and offer practical guidance to practitioners, funders, scientific journals, and consumers of epidemiological research. The heart of our paper is a case study of the Imperial College London COVID-19 simulator, set in the context of recent work in the epistemology of simulation and the philosophy of epidemiology.
  • Models and people: An alternative view of the emergent properties of computational models. Fabio Boschetti - 2016 - Complexity 21 (6):202-213.
  • Computer Simulations as Scientific Instruments. Ramón Alvarado - 2022 - Foundations of Science 27 (3):1183-1205.
    Computer simulations have conventionally been understood to be either extensions of formal methods such as mathematical models or as special cases of empirical practices such as experiments. Here, I argue that computer simulations are best understood as instruments. Understanding them as such can better elucidate their actual role as well as their potential epistemic standing in relation to science and other scientific methods, practices and devices.
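The first entry's claim about high conditionality rests on a simple combinatorial point: each independent conditional in a program doubles the number of distinct execution paths, so exhaustive path-by-path error checking becomes infeasible very quickly. The following minimal Python sketch (illustrative only, not drawn from Symons & Horner's paper; it assumes the branches are independent) makes the point concrete:

    # Illustrative sketch: why "high conditionality" limits exhaustive,
    # path-by-path error checking. Assumes branches are independent, so a
    # block with n if/else statements has 2**n distinct execution paths.
    from itertools import product

    def enumerate_paths(num_conditionals):
        """Yield each execution path as a tuple of branch outcomes."""
        return product([True, False], repeat=num_conditionals)

    # Small programs can still be surveyed path by path...
    for n in (1, 2, 3):
        print(f"{n} conditionals -> {len(list(enumerate_paths(n)))} paths")

    # ...but the count is 2**n, so the path space explodes long before a
    # program reaches realistic size.
    for n in (10, 30, 100):
        print(f"{n} independent conditionals -> {2**n:,} possible paths")

Thirty independent conditionals already yield over a billion paths, which is the sense in which the error distribution of software-intensive inquiry resists the kind of characterization available for less conditional forms of inquiry.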