
Citations of: In defense of meta-analysis, Synthese 196 (8):3189-3211 (2019)

  • Simulation of Trial Data to Test Speculative Hypotheses about Research Methods. Hamed Tabatabaei Ghomi & Jacob Stegenga - 2023 - In Kristien Hens & Andreas De Block (eds.), Advances in experimental philosophy of medicine. New York: Bloomsbury Academic. pp. 111-128.
    We simulate trial data to test speculative claims about research methods, such as the impact of publication bias.
  • Bias as an epistemic notion. Anke Bueter - 2022 - Studies in History and Philosophy of Science Part A 91 (C):307-315.
  • Evidence amalgamation in the sciences: an introduction. Roland Poellinger, Jürgen Landes & Samuel C. Fletcher - 2019 - Synthese 196 (8):3163-3188.
    Amalgamating evidence from heterogeneous sources and across levels of inquiry is becoming increasingly important in many pure and applied sciences. This special issue provides a forum for researchers from diverse scientific and philosophical perspectives to discuss evidence amalgamation, its methodologies, its history, its pitfalls, and its potential. We situate the contributions therein within six themes from the broad literature on this subject: the variety-of-evidence thesis, the philosophy of meta-analysis, the role of robustness/sensitivity analysis for evidence amalgamation, its bearing on questions (...)
  • Is meta-analysis of RCTs assessing the efficacy of interventions a reliable source of evidence for therapeutic decisions? Mariusz Maziarz - 2022 - Studies in History and Philosophy of Science Part A 91 (C):159-167.
  • (1 other version) Assessing the Overall Validity of Randomised Controlled Trials. Alexander Krauss - 2021 - International Studies in the Philosophy of Science 34 (3):159-182.
    In the biomedical, behavioural and social sciences, the leading method used to estimate causal effects is commonly randomised controlled trials (RCTs) that are generally viewed as both the source and justification of the most valid evidence. In studying the foundation and theory behind RCTs, the existing literature analyses important single issues and biases in isolation that influence causal outcomes in trials (such as randomisation, statistical probabilities and placebos). The common account of biased causal inference is described in a general way (...)
  • (1 other version) The promise and perils of industry‐funded science. Bennett Holman & Kevin C. Elliott - 2018 - Philosophy Compass 13 (11).
    Private companies provide by far the most funding for scientific research and development. Nevertheless, relatively little attention has been paid to the dynamics of industry‐funded research by philosophers of science. This paper addresses this gap by providing an overview of the major strengths and weaknesses of industry research funding, together with the existing recommendations for addressing the weaknesses. It is designed to provide a starting point for future philosophical work that explores the features of industry‐funded research, avenues for addressing concerns, (...)
  • How (not) to measure replication. Samuel C. Fletcher - 2021 - European Journal for Philosophy of Science 11 (2):1-27.
    The replicability crisis refers to the apparent failures to replicate both important and typical positive experimental claims in psychological science and biomedicine, failures which have gained increasing attention in the past decade. In order to provide evidence that there is a replicability crisis in the first place, scientists have developed various measures of replication that help quantify or “count” whether one study replicates another. In this nontechnical essay, I critically examine five types of replication measures used in the landmark article (...)