• Amalgamating evidence of dynamics. David Danks & Sergey Plis - 2019 - Synthese 196 (8):3213-3230.
    Many approaches to evidence amalgamation focus on relatively static information or evidence: the data to be amalgamated involve different variables, contexts, or experiments, but not measurements over extended periods of time. However, much of scientific inquiry focuses on dynamical systems; the system’s behavior over time is critical. Moreover, novel problems of evidence amalgamation arise in these contexts. First, data can be collected at different measurement timescales, where potentially none of them correspond to the underlying system’s causal timescale. Second, missing variables (...)
• With or Without Mechanisms: A Reply to Weber. Daniel Steel - 2007 - Philosophy of the Social Sciences 37 (3):360-365.
    This reply to Erik Weber's commentary agrees that mechanisms are important for causal inference in social science, but argues that Weber makes the mistake that was the main focus of my original essay: inferring that since a problem cannot be solved without mechanisms, it can be solved with them. As it stands, this inference is invalid since the problem might be unsolvable with or without mechanisms. Any claim about the usefulness of mechanisms for some purpose requires an adequate account of (...)
• Is meta-analysis the platinum standard of evidence? Jacob Stegenga - 2011 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 42 (4):497-507.
    An astonishing volume and diversity of evidence is available for many hypotheses in the biomedical and social sciences. Some of this evidence—usually from randomized controlled trials (RCTs)—is amalgamated by meta-analysis. Despite the ongoing debate regarding whether or not RCTs are the ‘gold-standard’ of evidence, it is usually meta-analysis which is considered the best source of evidence: meta-analysis is thought by many to be the platinum standard of evidence. However, I argue that meta-analysis falls far short of that standard. Different meta-analyses (...)
• The Problem of Piecemeal Induction. Conor Mayo-Wilson - 2011 - Philosophy of Science 78 (5):864-874.
    It is common to assume that the problem of induction arises only because of small sample sizes or unreliable data. In this paper, I argue that the piecemeal collection of data can also lead to underdetermination of theories by evidence, even if arbitrarily large amounts of completely reliable experimental and observational data are collected. Specifically, I focus on the construction of causal theories from the results of many studies (perhaps hundreds), including randomized controlled trials and observational studies, where the studies (...)
• The Limits of Piecemeal Causal Inference. Conor Mayo-Wilson - 2014 - British Journal for the Philosophy of Science 65 (2):213-249.
    In medicine and the social sciences, researchers must frequently integrate the findings of many observational studies, which measure overlapping collections of variables. For instance, learning how to prevent obesity requires combining studies that investigate obesity and diet with others that investigate obesity and exercise. Recently developed causal discovery algorithms provide techniques for integrating many studies, but little is known about what can be learned from such algorithms. This article argues that there are causal facts that one could learn by conducting (...)
• What is right with 'Bayes net methods' and what is wrong with 'hunting causes and using them'? Clark Glymour - 2010 - British Journal for the Philosophy of Science 61 (1):161-211.
Nancy Cartwright's recent criticisms of efforts and methods to obtain causal information from sample data using automated search are considered. In addition to reviewing that work, I argue that almost all of her criticisms are false and rest on misreading, overgeneralization, or neglect of the relevant literature.