  • Introduction: Machine learning as philosophy of science. Kevin B. Korb - 2004 - Minds and Machines 14 (4):433-440.
    I consider three aspects in which machine learning and philosophy of science can illuminate each other: methodology, inductive simplicity and theoretical terms. I examine the relations between the two subjects and conclude by claiming these relations to be very close.
  • The Structure and Dynamics of Scientific Theories: A Hierarchical Bayesian Perspective. Leah Henderson, Noah D. Goodman, Joshua B. Tenenbaum & James F. Woodward - 2010 - Philosophy of Science 77 (2):172-200.
    Hierarchical Bayesian models (HBMs) provide an account of Bayesian inference in a hierarchically structured hypothesis space. Scientific theories are plausibly regarded as organized into hierarchies in many cases, with higher levels sometimes called ‘paradigms’ and lower levels encoding more specific or concrete hypotheses. Therefore, HBMs provide a useful model for scientific theory change, showing how higher‐level theory change may be driven by the impact of evidence on lower levels. HBMs capture features described in the Kuhnian tradition, particularly the idea that (...)
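The hierarchical picture in the abstract above can be made concrete with a toy computation: evidence is scored only against low-level hypotheses, yet probability moves at the paradigm level via each paradigm's marginal likelihood. The sketch below is purely illustrative; the paradigm names, hypotheses, and priors are invented, not taken from the paper.

```python
# Toy hierarchical Bayesian update (illustrative; names and numbers invented).
# Two "paradigms" (higher-level theories), each containing two concrete
# hypotheses about a coin's bias, with uniform priors at both levels.
paradigms = {
    "near-fair": {0.5: 0.5, 0.6: 0.5},    # bias hypothesis -> prior within paradigm
    "biased":    {0.9: 0.5, 0.95: 0.5},
}
paradigm_prior = {"near-fair": 0.5, "biased": 0.5}

def likelihood(p, heads, tails):
    """Binomial likelihood of the data under bias p (constant factor omitted)."""
    return p ** heads * (1.0 - p) ** tails

def paradigm_posterior(heads, tails):
    """Posterior over paradigms: prior times marginal likelihood, where the
    marginal likelihood averages over the paradigm's own hypotheses."""
    joint = {
        name: paradigm_prior[name]
              * sum(w * likelihood(p, heads, tails) for p, w in hyps.items())
        for name, hyps in paradigms.items()
    }
    z = sum(joint.values())
    return {name: v / z for name, v in joint.items()}

post = paradigm_posterior(heads=9, tails=1)   # strongly head-biased evidence
```

With 9 heads in 10 tosses, most posterior mass ends up on the "biased" paradigm even though the data were never compared to the paradigms directly: this is one simple sense in which higher-level theory change can be "driven by the impact of evidence on lower levels".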
  • Bayesian Perspectives on the Discovery of the Higgs Particle. Richard Dawid - 2017 - Synthese 194 (2):377-394.
    It is argued that the high degree of trust in the Higgs particle before its discovery raises the question of a Bayesian perspective on data analysis in high energy physics in an interesting way that differs from other suggestions regarding the deployment of Bayesian strategies in the field.
  • Can there be a Bayesian explanationism? On the prospects of a productive partnership. Frank Cabrera - 2017 - Synthese 194 (4):1245–1272.
    In this paper, I consider the relationship between Inference to the Best Explanation and Bayesianism, both of which are well-known accounts of the nature of scientific inference. In Sect. 2, I give a brief overview of Bayesianism and IBE. In Sect. 3, I argue that IBE in its most prominently defended forms is difficult to reconcile with Bayesianism because not all of the items that feature on popular lists of “explanatory virtues”—by means of which IBE ranks competing explanations—have confirmational import. (...)
  • Artificial Intelligence: A Modern Approach. Stuart Jonathan Russell & Peter Norvig (eds.) - 1995 - Prentice-Hall.
    Artificial Intelligence: A Modern Approach, 3e offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence. Number one in its field, this textbook is ideal for one- or two-semester, undergraduate- or graduate-level courses in Artificial Intelligence. Dr. Peter Norvig, a contributing author, and Professor Sebastian Thrun, a Pearson author, are offering a free online course at Stanford University on artificial intelligence. According to an article in The New York Times, the course on artificial intelligence is (...)
  • Simplicity and model selection. Guillaume Rochefort-Maranda - 2016 - European Journal for Philosophy of Science 6 (2):261-279.
    In this paper I compare parametric and nonparametric regression models with the help of a simulated data set. Doing so, I have two main objectives. The first one is to differentiate five concepts of simplicity and assess their respective importance. The second one is to show that the scope of the existing philosophical literature on simplicity and model selection is too narrow because it does not take the nonparametric approach into account, S112–S123, 2002; Forster and Sober in The British Journal (...)
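As a concrete (hypothetical) instance of the parametric model-selection problem this entry discusses, here is a minimal sketch of penalized model choice: fit polynomials of increasing degree and select the degree minimizing AIC, which trades goodness of fit against parameter count. The data, noise level, and candidate degrees are invented for illustration.

```python
import numpy as np

# Illustrative sketch: selecting polynomial degree by AIC (not from the paper).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)   # true relationship is linear

def aic(degree):
    """AIC under a Gaussian noise model: n * log(RSS / n) + 2 * k,
    where k is the number of fitted parameters (including the intercept)."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    return n * np.log(np.mean(resid ** 2)) + 2 * k

best_degree = min(range(1, 8), key=aic)            # candidate degrees 1..7
```

Higher-degree fits always reduce residual error, but the 2k penalty means the selected degree tends to stay low when the true signal is simple. A penalty on parameter count is only one of the several distinct senses of "simplicity" the paper distinguishes, and it presupposes the parametric setting the paper argues is too narrow.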
  • The Nature of Statistical Learning Theory. Vladimir Vapnik - 2000 - Springer: New York.
    The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. This second edition contains three new chapters devoted to further development of the learning theory and SVM techniques. Written in a readable (...)
  • Pattern Recognition and Machine Learning. Christopher M. Bishop - 2006 - Springer: New York.
    This is the first textbook on pattern recognition to present the Bayesian viewpoint. The book presents approximate inference algorithms that permit fast approximate answers in situations where exact answers are not feasible, and it uses graphical models to describe probability distributions, an approach no other book applies to machine learning. No previous knowledge of pattern recognition or machine learning concepts is assumed. Familiarity with multivariate calculus and basic linear algebra is required, and some experience in the use of probabilities would (...)
  • Ockham’s Razors: A User’s Manual. Elliott Sober - 2015 - Cambridge: Cambridge University Press.
    Ockham's razor, the principle of parsimony, states that simpler theories are better than theories that are more complex. It has a history dating back to Aristotle and it plays an important role in current physics, biology, and psychology. The razor also gets used outside of science - in everyday life and in philosophy. This book evaluates the principle and discusses its many applications. Fascinating examples from different domains provide a rich basis for contemplating the principle's promises and perils. It is (...)
  • Simplicity, Truth, and Probability. Kevin T. Kelly - unknown
    Simplicity has long been recognized as an apparent mark of truth in science, but it is difficult to explain why simplicity should be accorded such weight. This chapter examines some standard, statistical explanations of the role of simplicity in scientific method and argues that none of them explains, without circularity, how a reliance on simplicity could be conducive to finding true models or theories. The discussion then turns to a less familiar approach that does explain, in a sense, the elusive (...)
  • Ockham's razor, empirical complexity, and truth-finding efficiency. Kevin Kelly - 2007 - Theoretical Computer Science 383:270-289.