References

  • Teaching evolutionary developmental biology: concepts, problems, and controversy. A. C. Love - 2013 - In Kostas Kampourakis (ed.), The Philosophy of Biology: a Companion for Educators. Dordrecht: Springer. pp. 323-341.
    Although sciences are often conceptualized in terms of theory confirmation and hypothesis testing, an equally important dimension of scientific reasoning is the structure of problems that guide inquiry. This problem structure is evident in several concepts central to evolutionary developmental biology (Evo-devo)—constraints, modularity, evolvability, and novelty. Because problems play an important role in biological practice, they should be included in biological pedagogy, especially when treating the issue of scientific controversy. A key feature of resolving controversy is synthesizing methodologies from different (...)
  • Expanding Our Grasp: Causal Knowledge and the Problem of Unconceived Alternatives. Matthias Egg - 2016 - British Journal for the Philosophy of Science 67 (1):115-141.
    I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy (...)
  • Science without (parametric) models: the case of bootstrap resampling. Jan Sprenger - 2011 - Synthese 180 (1):65-76.
    Scientific and statistical inferences build heavily on explicit, parametric models, and often with good reasons. However, the limited scope of parametric models and the increasing complexity of the studied systems in modern science raise the risk of model misspecification. Therefore, I examine alternative, data-based inference techniques, such as bootstrap resampling. I argue that their neglect in the philosophical literature is unjustified: they suit some contexts of inquiry much better and use a more direct approach to scientific inference. Moreover, they make (...)
  • Bayesian Confirmation Theory and The Likelihood Principle. Daniel Steel - 2007 - Synthese 156 (1):53-77.
    The likelihood principle (LP) is a core issue in disagreements between Bayesian and frequentist statistical theories. Yet statements of the LP are often ambiguous, while arguments for why a Bayesian must accept it rely upon unexamined implicit premises. I distinguish two propositions associated with the LP, which I label LP1 and LP2. I maintain that there is a compelling Bayesian argument for LP1, based upon strict conditionalization, standard Bayesian decision theory, and a proposition I call the practical relevance principle. In (...)
  • Non-cognitive Values and Methodological Learning in the Decision-Oriented Sciences. Oliver Todt & José Luis Luján - 2017 - Foundations of Science 22 (1):215-234.
    The function and legitimacy of values in decision making is a critically important issue in the contemporary analysis of science. It is particularly relevant for some of the more application-oriented areas of science, specifically decision-oriented science in the field of regulation of technological risks. Our main objective in this paper is to assess the diversity of roles that non-cognitive values related to decision making can adopt in the kinds of scientific activity that underlie risk regulation. We start out, first, by (...)
  • Climate Simulations: Uncertain Projections for an Uncertain World. Rafaela Hillerbrand - 2014 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 45 (1):17-32.
    Between the fourth and the recent fifth IPCC report, science as well as policy making have made great advances in dealing with uncertainties in global climate models. However, the uncertainties public decision making has to deal with go well beyond what is currently addressed by policy makers and climatologists alike. It is shown in this paper that within an anthropocentric framework, a whole hierarchy of models from various scientific disciplines is needed for political decisions as regards climate change. Via what (...)
  • Feminist Philosophy of Science. Lynn Hankinson Nelson - 2002 - In Peter K. Machamer & Michael Silberstein (eds.), The Blackwell guide to the philosophy of science. Malden, Mass.: Blackwell. pp. 312–331.
    This chapter contains sections titled: Highlights of Past Literature; Current Work; Future Work.
  • Interpretive praxis and theory‐networks. Sangwon Lee - 2006 - Pacific Philosophical Quarterly 87 (2):213-230.
    I develop the idea of what I call an interpretive praxis as a generalized procedure for analyzing how experimenters can formulate observable predictions, discern real effects from experimental artifacts, and compare predictions with data. An interpretive praxis requires theories – theories not only about instruments and the interpretation of phenomena, but also theories that connect the use of instruments and interpretation of phenomena to high‐level theory. I will call all such theories that enable experimentation to work intermediate theories. I offer (...)
  • Towards the Methodological Turn in the Philosophy of Science. Hsiang-Ke Chao, Szu-Ting Chen & Roberta L. Millstein - 2013 - In Hsiang-Ke Chao, Szu-Ting Chen & Roberta L. Millstein (eds.), Mechanism and Causality in Biology and Economics. Dordrecht: Springer.
    This chapter provides an introduction to the study of the philosophical notions of mechanisms and causality in biology and economics. This chapter sets the stage for this volume, Mechanism and Causality in Biology and Economics, in three ways. First, it gives a broad review of the recent changes and current state of the study of mechanisms and causality in the philosophy of science. Second, consistent with a recent trend in the philosophy of science to focus on scientific practices, it in (...)
  • Securing reliable evidence. Kent W. Staley - unknown
    Evidence claims depend on fallible assumptions. Three strategies for making true evidence claims in spite of this fallibility are strengthening the support for those assumptions, weakening conclusions, and using multiple independent tests to produce robust evidence. Reliability itself, understood in frequentist terms, does not explain the usefulness of all three strategies; robustness, in particular, sometimes functions in a way that is not well-characterized in terms of reliability. I argue that, in addition to reliability, the security of evidence claims is (...)
  • How experimental algorithmics can benefit from Mayo’s extensions to Neyman–Pearson theory of testing. Thomas Bartz-Beielstein - 2008 - Synthese 163 (3):385-396.
    Although theoretical results for several algorithms in many application domains were presented during the last decades, not all algorithms can be analyzed fully theoretically. Experimentation is necessary. The analysis of algorithms should follow the same principles and standards of other empirical sciences. This article focuses on stochastic search algorithms, such as evolutionary algorithms or particle swarm optimization. Stochastic search algorithms tackle hard real-world optimization problems, e.g., problems from chemical engineering, airfoil optimization, or bio-informatics, where classical methods from mathematical optimization fail. (...)
  • The quantitative problem of old evidence. E. C. Barnes - 1999 - British Journal for the Philosophy of Science 50 (2):249-264.
    The quantitative problem of old evidence is the problem of how to measure the degree to which e confirms h for agent A at time t when A regards e as justified at t. Existing attempts to solve this problem have applied the e-difference approach, which compares A's probability for h at t with what probability A would assign h if A did not regard e as justified at t. The quantitative problem has been widely regarded as unsolvable primarily on (...)
  • Experimental psychology and Duhem's problem. Sam S. Rakover - 2003 - Journal for the Theory of Social Behaviour 33 (1):45–66.
    The paper proposes a practical answer to Duhem's problem within the framework of experimental psychology. First, this problem is briefly discussed; second, two studies in psychology are presented illustrating how theories are tested. Thirdly, based on the foregoing, an approach called the “Empirical Reasoning” is developed and justified. It is shown that the ER approach can successfully cope with Duhem's problem. Finally, the ER approach and the Error Statistics approach of Mayo are critically compared with regard to Duhem's problem.
  • Introduction: History of science and philosophy of science. Friedrich Steinle & Richard M. Burian - 2002 - Perspectives on Science 10 (4):391-397.
    Introduces a series of articles which deals with the relationship between history of science and philosophy of science.
  • What is a data model? An anatomy of data analysis in high energy physics. Antonis Antoniou - 2021 - European Journal for Philosophy of Science 11 (4):1-33.
    Many decades ago Patrick Suppes argued rather convincingly that theoretical hypotheses are not confronted with the direct, raw results of an experiment, rather, they are typically compared with models of data. What exactly is a data model however? And how do the interactions of particles at the subatomic scale give rise to the huge volumes of data that are then moulded into a polished data model? The aim of this paper is to answer these questions by presenting a detailed case (...)
  • Accuracy, conditionalization, and probabilism. Don Fallis & Peter J. Lewis - 2019 - Synthese 198 (5):4017-4033.
    Accuracy-based arguments for conditionalization and probabilism appear to have a significant advantage over their Dutch Book rivals. They rely only on the plausible epistemic norm that one should try to decrease the inaccuracy of one’s beliefs. Furthermore, conditionalization and probabilism apparently follow from a wide range of measures of inaccuracy. However, we argue that there is an under-appreciated diachronic constraint on measures of inaccuracy which limits the measures from which one can prove conditionalization, and none of the remaining measures allow (...)
  • A mistaken confidence in data. Edouard Machery - 2021 - European Journal for Philosophy of Science 11 (2):1-17.
    In this paper I explore an underdiscussed factor contributing to the replication crisis: Scientists, and following them policy makers, often neglect sources of errors in the production and interpretation of data and thus overestimate what can be learnt from them. This neglect leads scientists to conduct experiments that are insufficiently informative and science consumers, including other scientists, to put too much weight on experimental results. The former leads to fragile empirical literatures, the latter to surprise and disappointment when the fragility (...)
  • What is epistemically wrong with research affected by sponsorship bias? The evidential account. Alexander Reutlinger - 2020 - European Journal for Philosophy of Science 10 (2):1-26.
    Biased research occurs frequently in the sciences. In this paper, I will focus on one particular kind of biased research: research that is subject to sponsorship bias. I will address the following epistemological question: what precisely is epistemically wrong with biased research of this kind? I will defend the evidential account of epistemic wrongness: that is, research affected by sponsorship bias is epistemically wrong if and only if the researchers in question make false claims about the evidential support of some (...)
  • Phenomenology and qualitative research: Amedeo Giorgi's hermetic epistemology. John Paley - 2018 - Nursing Philosophy 19 (3):e12212.
    Amedeo Giorgi has published a review article devoted to Phenomenology as Qualitative Research: A Critical Analysis of Meaning Attribution. However, anyone reading this article, but unfamiliar with the book, will get a distorted view of what it is about, whom it is addressed to, what it seeks to achieve and how it goes about presenting its arguments. Not mildly distorted, in need of the odd correction here and there, but systematically misrepresented. The article is a study in misreading. Giorgi misreads (...)
  • No evidence amalgamation without evidence measurement. Veronica J. Vieland & Hasok Chang - 2019 - Synthese 196 (8):3139-3161.
    In this paper we consider the problem of how to measure the strength of statistical evidence from the perspective of evidence amalgamation operations. We begin with a fundamental measurement amalgamation principle : for any measurement, the inputs and outputs of an amalgamation procedure must be on the same scale, and this scale must have a meaningful interpretation vis a vis the object of measurement. Using the p value as a candidate evidence measure, we examine various commonly used approaches to amalgamation (...)
  • Evidence in Neuroimaging: Towards a Philosophy of Data Analysis. Jessey Wright - 2017 - Dissertation, The University of Western Ontario
    Neuroimaging technology is the most widely used tool to study human cognition. While originally a promising tool for mapping the content of cognitive theories onto the structures of the brain, recently developed tools for the analysis, handling and sharing of data have changed the theoretical landscape of cognitive neuroscience. Even with these advancements philosophical analyses of evidence in neuroimaging remain skeptical of the promise of neuroimaging technology. These views often treat the analysis techniques used to make sense of data produced (...)
  • Psychopathy: Morally Incapacitated Persons. Heidi Maibom - 2017 - In Thomas Schramme & Steven Edwards (eds.), Handbook of the Philosophy of Medicine. Springer. pp. 1109-1129.
    After describing the disorder of psychopathy, I examine the theories and the evidence concerning the psychopaths’ deficient moral capacities. I first examine whether or not psychopaths can pass tests of moral knowledge. Most of the evidence suggests that they can. If there is a lack of moral understanding, then it has to be due to an incapacity that affects not their declarative knowledge of moral norms, but their deeper understanding of them. I then examine two suggestions: it is their deficient (...)
  • Was regression to the mean really the solution to Darwin’s problem with heredity? Essay Review of Stigler, Stephen M. 2016. The Seven Pillars of Statistical Wisdom. Cambridge, Massachusetts: Harvard University Press. [REVIEW] Adam Krashniak & Ehud Lamm - 2017 - Biology and Philosophy (5):1-10.
    Statistical reasoning is an integral part of modern scientific practice. In The Seven Pillars of Statistical Wisdom Stephen Stigler presents seven core ideas, or pillars, of statistical thinking and the historical developments of each of these pillars, many of which were concurrent with developments in biology. Here we focus on Stigler’s fifth pillar, regression, and his discussion of how regression to the mean came to be thought of as a solution to a challenge for the theory of natural selection. Stigler (...)
  • Bayesian Perspectives on the Discovery of the Higgs Particle. Richard Dawid - 2017 - Synthese 194 (2):377-394.
    It is argued that the high degree of trust in the Higgs particle before its discovery raises the question of a Bayesian perspective on data analysis in high energy physics in an interesting way that differs from other suggestions regarding the deployment of Bayesian strategies in the field.
  • Two approaches to reasoning from evidence or what econometrics can learn from biomedical research. Julian Reiss - 2015 - Journal of Economic Methodology 22 (3):373-390.
    This paper looks at an appeal to the authority of biomedical research that has recently been used by empirical economists to motivate and justify their methods. I argue that those who make this appeal mistake the nature of biomedical research. Randomised trials, which are said to have revolutionised biomedical research, are a central methodology, but according to only one paradigm. There is another paradigm at work in biomedical research, the inferentialist paradigm, in which randomised trials play no special role. I (...)
  • Objective evidence and rules of strategy: Achinstein on method: Peter Achinstein: Evidence and method: Scientific strategies of Isaac Newton and James Clerk Maxwell. Oxford and New York: Oxford University Press, 2013, 177 pp, $24.95 HB. William L. Harper, Kent W. Staley, Henk W. de Regt & Peter Achinstein - 2014 - Metascience 23 (3):413-442.
  • Phenomenology as rhetoric. John Paley - 2005 - Nursing Inquiry 12 (2):106-116.
    The literature on ‘nursing phenomenology’ is driven by a range of ontological and epistemological considerations, intended to distance it from conventionally scientific approaches. However, this paper examines a series of discrepancies between phenomenological rhetoric and phenomenological practice. The rhetoric celebrates perceptions and experience; but the concluding moment of a research report almost always makes implicit claims about reality. The rhetoric insists on uniquely personal meanings; but the practice offers blank, anonymous abstractions. The rhetoric invites us to believe (...)
  • A Synthesis of Hempelian and Hypothetico-Deductive Confirmation. Jan Sprenger - 2013 - Erkenntnis 78 (4):727-738.
    This paper synthesizes confirmation by instances and confirmation by successful predictions, and thereby the Hempelian and the hypothetico-deductive traditions in confirmation theory. The merger of these two approaches is subsequently extended to the piecemeal confirmation of entire theories. It is then argued that this synthetic account makes a useful contribution from both a historical and a systematic perspective.
  • Testing a precise null hypothesis: the case of Lindley’s paradox. Jan Sprenger - 2013 - Philosophy of Science 80 (5):733-744.
    The interpretation of tests of a point null hypothesis against an unspecified alternative is a classical and yet unresolved issue in statistical methodology. This paper approaches the problem from the perspective of Lindley's Paradox: the divergence of Bayesian and frequentist inference in hypothesis tests with large sample size. I contend that the standard approaches in both frameworks fail to resolve the paradox. As an alternative, I suggest the Bayesian Reference Criterion: it targets the predictive performance of the null hypothesis in (...)
  • Probabilistic Logics and Probabilistic Networks. Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler & Jon Williamson - 2010 - Dordrecht, Netherlands: Synthese Library. Edited by Gregory Wheeler, Rolf Haenni, Jan-Willem Romeijn & Jon Williamson.
    Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
  • Why Frequentists and Bayesians Need Each Other. Jon Williamson - 2013 - Erkenntnis 78 (2):293-318.
    The orthodox view in statistics has it that frequentism and Bayesianism are diametrically opposed—two totally incompatible takes on the problem of statistical inference. This paper argues to the contrary that the two approaches are complementary and need to mesh if probabilistic reasoning is to be carried out correctly.
  • From data to phenomena: a Kantian stance. Michela Massimi - 2011 - Synthese 182 (1):101-116.
    This paper investigates some metaphysical and epistemological assumptions behind Bogen and Woodward’s data-to-phenomena inferences. I raise a series of points and suggest an alternative possible Kantian stance about data-to-phenomena inferences. I clarify the nature of the suggested Kantian stance by contrasting it with McAllister’s view about phenomena as patterns in data sets.
  • A new solution to the paradoxes of rational acceptability. Igor Douven - 2002 - British Journal for the Philosophy of Science 53 (3):391-410.
    The Lottery Paradox and the Preface Paradox both involve the thesis that high probability is sufficient for rational acceptability. The standard solution to these paradoxes denies that rational acceptability is deductively closed. This solution has a number of untoward consequences. The present paper suggests that a better solution to the paradoxes is to replace the thesis that high probability suffices for rational acceptability with a somewhat stricter thesis. This avoids the untoward consequences of the standard solution. The new solution will (...)
  • Comparativist rationality and. Kristin Shrader-Frechette - 2004 - Topoi 23 (2):153-163.
    US testing of nuclear weapons has resulted in about 800,000 premature fatal cancers throughout the globe, and the nuclear tests of China, France, India, Russia, and the UK have added to this total. Surprisingly, however, these avoidable deaths have not received much attention, as compared, for example, to the smaller number of US fatalities on 9-11-01. This essay (1) surveys the methods and models used to assess effects of low-dose ionizing radiation from above-ground nuclear weapons tests and (2) explains some (...)
  • Golden events and statistics: What's wrong with Galison's image/logic distinction? Kent W. Staley - 1999 - Perspectives on Science 7 (2):196-230.
    Peter Galison has recently claimed that twentieth-century microphysics has been pursued by two distinct experimental traditions--the image tradition and the logic tradition--that have only recently merged into a hybrid tradition. According to Galison, the two traditions employ fundamentally different forms of experimental argument, with the logic tradition using statistical arguments, while the image tradition strives for non-statistical demonstrations based on compelling ("golden") single events. I show that discoveries in both traditions have employed the same statistical form of argument, even (...)
  • Manipulating underdetermination in scientific controversy: The case of the molecular clock. Michael R. Dietrich & Robert A. Skipper - 2007 - Perspectives on Science 15 (3):295-326.
    Where there are cases of underdetermination in scientific controversies, such as the case of the molecular clock, scientists may direct the course and terms of dispute by playing off the multidimensional framework of theory evaluation. This is because assessment strategies themselves are underdetermined. Within the framework of assessment, there are a variety of trade-offs between different strategies as well as shifting emphases as specific strategies are given more or less weight in assessment situations. When a strategy is underdetermined, scientists (...)
  • Scientific self-correction: the Bayesian way. Felipe Romero & Jan Sprenger - 2020 - Synthese (Suppl 23):1-21.
    The enduring replication crisis in many scientific disciplines casts doubt on the ability of science to estimate effect sizes accurately, and in a wider sense, to self-correct its findings and to produce reliable knowledge. We investigate the merits of a particular countermeasure—replacing null hypothesis significance testing with Bayesian inference—in the context of the meta-analytic aggregation of effect sizes. In particular, we elaborate on the advantages of this Bayesian reform proposal under conditions of publication bias and other methodological imperfections that are (...)
  • On the Possibility of Crucial Experiments in Biology. Tudor Baetu - 2019 - British Journal for the Philosophy of Science 70 (2):407-429.
    The article analyses in detail the Meselson–Stahl experiment, identifying two novel difficulties for the crucial experiment account, namely, the fragility of the experimental results and the fact that the hypotheses under scrutiny were not mutually exclusive. The crucial experiment account is rejected in favour of an experimental-mechanistic account of the historical significance of the experiment, emphasizing that the experiment generated data about the biochemistry of DNA replication that is independent of the testing of the semi-conservative, conservative, and dispersive hypotheses. (...)
  • Significance testing, p-values and the principle of total evidence. Bengt Autzen - 2016 - European Journal for Philosophy of Science 6 (2):281-295.
    The paper examines the claim that significance testing violates the Principle of Total Evidence. I argue that p-values violate PTE for two-sided tests but satisfy PTE for one-sided tests invoking a sufficient test statistic independent of the preferred theory of evidence. While the focus of the paper is to evaluate a particular claim about the relationship of significance testing and PTE, I clarify the reading of this methodological principle along the way.
  • Base empírica global de contrastación, base empírica local de contrastación y aserción empírica de una teoría. Pablo Lorenzano - 2012 - Agora 31 (2):71-107.
    The aim of this article is to contribute to the discussion about the so-called “empirical claim” and “empirical basis” of theory testing. First, the proposals of reconceptualization of the standard notions of partial potential model, intended application and empirical claim of a theory made by Balzer (1982, 1988, 1997a, 1997b, 2006, Balzer, Lauth & Zoubek 1993) and Gähde (1996, 2002, 2008) will be first discussed. Then, the distinction between “global” and “local empirical basis” will be introduced, linking it with that (...)
  • What’s Wrong With Our Theories of Evidence? Julian Reiss - 2014 - Theoria: Revista de Teoría, Historia y Fundamentos de la Ciencia 29 (2):283-306.
    This paper surveys and critically assesses existing theories of evidence with respect to four desiderata. A good theory of evidence should be both a theory of evidential support (i.e., be informative about what kinds of facts speak in favour of a hypothesis), and of warrant (i.e., be informative about how strongly a given set of facts speaks in favour of the hypothesis), it should apply to the non-ideal cases in which scientists typically find themselves, and it should be ‘descriptively adequate’, (...)
  • Financial Conflicts of Interest and Criteria for Research Credibility. Kevin C. Elliott - 2014 - Erkenntnis 79 (5):917-937.
    The potential for financial conflicts of interest (COIs) to damage the credibility of scientific research has become a significant social concern, especially in the wake of high-profile incidents involving the pharmaceutical, tobacco, fossil-fuel, and chemical industries. Scientists and policy makers have debated whether the presence of financial COIs should count as a reason for treating research with suspicion or whether research should instead be evaluated solely based on its scientific quality. This paper examines a recent proposal to develop criteria for (...)
  • Pragmatic norms in science: making them explicit. María Caamaño Alegre - 2013 - Synthese 190 (15):3227-3246.
    The present work constitutes an attempt to make explicit those pragmatic norms successfully operating in empirical science. I will first comment on the initial presuppositions of the discussion, in particular, on those concerning the instrumental character of scientific practice and the nature of scientific goals. Then I will depict the moderately naturalistic frame in which, from this approach, the pragmatic norms make sense. Third, I will focus on the specificity of the pragmatic norms, making special emphasis on what I regard (...)
  • Computer simulation and the philosophy of science. Eric Winsberg - 2009 - Philosophy Compass 4 (5):835-845.
    There are a variety of topics in the philosophy of science that need to be rethought, in varying degrees, after one pays careful attention to the ways in which computer simulations are used in the sciences. There are a number of conceptual issues internal to the practice of computer simulation that can benefit from the attention of philosophers. This essay surveys some of the recent literature on simulation from the perspective of the philosophy of science and argues that philosophers have (...)
  • Franklin, Holmes, and the epistemology of computer simulation. Wendy S. Parker - 2008 - International Studies in the Philosophy of Science 22 (2):165-183.
    Allan Franklin has identified a number of strategies that scientists use to build confidence in experimental results. This paper shows that Franklin's strategies have direct analogues in the context of computer simulation and then suggests that one of his strategies—the so-called 'Sherlock Holmes' strategy—deserves a privileged place within the epistemologies of experiment and simulation. In particular, it is argued that while the successful application of even several of Franklin's other strategies (or their analogues in simulation) may not be sufficient for (...)
  • Error, error-statistics and self-directed anticipative learning. R. P. Farrell & C. A. Hooker - 2008 - Foundations of Science 14 (4):249-271.
    Error is protean, ubiquitous and crucial in scientific process. In this paper it is argued that understanding scientific process requires what is currently absent: an adaptable, context-sensitive functional role for error in science that naturally harnesses error identification and avoidance to positive, success-driven, science. This paper develops a new account of scientific process of this sort, error and success driving Self-Directed Anticipative Learning (SDAL) cycling, using a recent re-analysis of ape-language research as test example. The example shows the limitations of (...)
  • Incommensurability and theory comparison in experimental biology. Marcel Weber - 2002 - Biology and Philosophy 17 (2):155-169.
    Incommensurability of scientific theories, as conceived by Thomas Kuhn and Paul Feyerabend, is thought to be a major or even insurmountable obstacle to the empirical comparison of these theories. I examine this problem in light of a concrete case from the history of experimental biology, namely the oxidative phosphorylation controversy in biochemistry (ca. 1961-1977). After a brief historical exposition, I show that the two main competing theories which were the subject of the ox-phos controversy instantiate some of the characteristic features of incommensurable theories, namely translation failure, non-corresponding predictions, and different (...)
  • Explaining disease: Correlations, causes, and mechanisms. [REVIEW] Paul Thagard - 1998 - Minds and Machines 8 (1):61-78.
    Why do people get sick? I argue that a disease explanation is best thought of as causal network instantiation, where a causal network describes the interrelations among multiple factors, and instantiation consists of observational or hypothetical assignment of factors to the patient whose disease is being explained. This paper first discusses inference from correlation to causation, integrating recent psychological discussions of causal reasoning with epidemiological approaches to understanding disease causation, particularly concerning ulcers and lung cancer. It then shows how causal (...)
  • The phenomena of homology. Paul Edmund Griffiths - 2007 - Biology and Philosophy 22 (5):643-658.
    Philosophical discussions of biological classification have failed to recognise the central role of homology in the classification of biological parts and processes. One reason for this is a misunderstanding of the relationship between judgments of homology and the core explanatory theories of biology. The textbook characterisation of homology as identity by descent is commonly regarded as a definition. I suggest instead that it is one of several attempts to explain the phenomena of homology. Twenty years ago the ‘new experimentalist’ movement (...)
  • What distinguishes data from models? Sabina Leonelli - 2019 - European Journal for Philosophy of Science 9 (2):22.
    I propose a framework that explicates and distinguishes the epistemic roles of data and models within empirical inquiry through consideration of their use in scientific practice. After arguing that Suppes’ characterization of data models falls short in this respect, I discuss a case of data processing within exploratory research in plant phenotyping and use it to highlight the difference between practices aimed to make data usable as evidence and practices aimed to use data to represent a specific phenomenon. I then (...)