References
  • Reconciling simplicity and likelihood principles in perceptual organization. Nick Chater - 1996 - Psychological Review 103 (3):566-581.
  • Probabilistic Reasoning in Cosmology. Yann Benétreau-Dupin - 2015 - Dissertation, The University of Western Ontario
    Cosmology raises novel philosophical questions regarding the use of probabilities in inference. This work aims at identifying and assessing lines of argument and problematic principles in probabilistic reasoning in cosmology. The first, second, and third papers deal with the intersection of two distinct problems: accounting for selection effects, and representing ignorance or indifference in probabilistic inferences. These two problems meet in the cosmology literature when anthropic considerations are used to predict cosmological parameters by conditionalizing the distribution of, e.g., the (...)
  • Philosophy as conceptual engineering: Inductive logic in Rudolf Carnap's scientific philosophy. Christopher F. French - 2015 - Dissertation, University of British Columbia
    My dissertation explores the ways in which Rudolf Carnap sought to make philosophy scientific, further developing recent interpretive efforts to explain Carnap's mature philosophical work as a form of engineering. It does this by looking in detail at his philosophical practice in his most sustained mature project, his work on pure and applied inductive logic. I first specify the sort of engineering Carnap is engaged in as involving an engineering design problem and then draw out the complications of design (...)
  • Error statistical modeling and inference: Where methodology meets ontology. Aris Spanos & Deborah G. Mayo - 2015 - Synthese 192 (11):3533-3555.
    In empirical modeling, an important desideratum for deeming theoretical entities and processes real is that they be reproducible in a statistical sense. Current crises regarding replicability in science intertwine with the question of how statistical methods link data to statistical and substantive theories and models. Different answers to this question have important methodological consequences for inference, which are intertwined with a contrast between the ontological commitments of the two types of models. The key to untangling them is (...)
  • The mind, the lab, and the field: Three kinds of populations in scientific practice. Rasmus Grønfeldt Winther, Ryan Giordano, Michael D. Edge & Rasmus Nielsen - 2015 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 52:12-21.
    Scientists use models to understand the natural world, and it is important not to conflate model and nature. As an illustration, we distinguish three different kinds of populations in studies of ecology and evolution: theoretical, laboratory, and natural populations, exemplified by the work of R.A. Fisher, Thomas Park, and David Lack, respectively. Biologists are rightly concerned with all three types of populations. We examine the interplay between these different kinds of populations, and their pertinent models, in three examples: the notion (...)
  • A frequentist interpretation of probability for model-based inductive inference. Aris Spanos - 2013 - Synthese 190 (9):1555-1585.
    The main objective of the paper is to propose a frequentist interpretation of probability in the context of model-based induction, anchored on the Strong Law of Large Numbers (SLLN) and justifiable on empirical grounds. It is argued that the prevailing views in philosophy of science concerning induction and the frequentist interpretation of probability are unduly influenced by enumerative induction, and the von Mises rendering, both of which are at odds with frequentist model-based induction that dominates current practice. The differences between (...)
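    Spanos anchors frequentist model-based induction on the Strong Law of Large Numbers. For readers who want the statement in front of them, here is a textbook formulation (not Spanos's own notation): for independent and identically distributed random variables $X_1, X_2, \ldots$ with $\mathbb{E}|X_1| < \infty$ and $\mathbb{E}[X_1] = \mu$,

        $$\Pr\!\left( \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu \right) = 1.$$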
  • Probabilistic Logics and Probabilistic Networks. Rolf Haenni, Jan-Willem Romeijn, Gregory Wheeler & Jon Williamson - 2010 - Dordrecht, Netherlands: Synthese Library. Edited by Gregory Wheeler, Rolf Haenni, Jan-Willem Romeijn & Jon Williamson.
    Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
  • Prediction in selectionist evolutionary theory. Rasmus Grønfeldt Winther - 2009 - Philosophy of Science 76 (5):889-901.
    Selectionist evolutionary theory has often been faulted for not making novel predictions that are surprising, risky, and correct. I argue that it in fact exhibits the theoretical virtue of predictive capacity in addition to two other virtues: explanatory unification and model fitting. Two case studies show the predictive capacity of selectionist evolutionary theory: parallel evolutionary change in E. coli, and the origin of eukaryotic cells through endosymbiosis.
  • Causality and causal modelling in the social sciences. Federica Russo - 2009 - Dordrecht: Springer.
    The anti-causal prophecies of the last century have been disproved. Causality is neither a ‘relic of a bygone age’ nor ‘another fetish of modern science’; it still occupies a large part of the current debate in philosophy and the sciences. This investigation into causal modelling presents the rationale of causality, i.e. the notion that guides causal reasoning in causal modelling. It is argued that causal models are regimented by a rationale of variation, not of regularity nor of invariance, thus breaking down the dominant (...)
  • The theory of nomic probability. John L. Pollock - 1992 - Synthese 90 (2):263-299.
    This article sketches a theory of objective probability focusing on nomic probability, which is supposed to be the kind of probability figuring in statistical laws of nature. The theory is based upon a strengthened probability calculus and some epistemological principles that formulate a precise version of the statistical syllogism. It is shown that from this rather minimal basis it is possible to derive theorems comprising (1) a theory of direct inference, and (2) a theory of induction. The theory of induction (...)
  • Models and statistical inference: The controversy between Fisher and Neyman–Pearson. Johannes Lenhard - 2006 - British Journal for the Philosophy of Science 57 (1):69-91.
    The main thesis of the paper is that in the case of modern statistics, the differences between the various concepts of models were the key to its formative controversies. The mathematical theory of statistical inference was mainly developed by Ronald A. Fisher, Jerzy Neyman, and Egon S. Pearson. Fisher on the one side and Neyman–Pearson on the other were often involved in polemical controversy. The common view is that Neyman and Pearson made Fisher's account more stringent mathematically. It is (...)
  • Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers. Peter D. Grünwald & Paul M. B. Vitányi - 2003 - Journal of Logic, Language and Information 12 (4):497-529.
    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov (“algorithmic”) mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, (...)
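    The Shannon–Kolmogorov comparison in this entry is easy to explore numerically. Below is a minimal sketch (my own illustration, not from the paper): it contrasts the empirical Shannon entropy of a string's character distribution with the length of its zlib compression, the latter serving only as a crude, computable upper-bound stand-in for Kolmogorov complexity.

        import math
        import zlib

        def shannon_entropy_bits(s: str) -> float:
            """Empirical Shannon entropy of the character distribution, in bits per symbol."""
            n = len(s)
            return -sum((s.count(c) / n) * math.log2(s.count(c) / n) for c in set(s))

        def compressed_bits(s: str) -> int:
            """Size of the zlib-compressed string in bits: a rough proxy for Kolmogorov complexity."""
            return 8 * len(zlib.compress(s.encode("utf-8"), 9))

        # A highly regular string compresses far below what its symbol statistics alone suggest,
        # illustrating how algorithmic regularity goes beyond first-order Shannon entropy.
        for text in ("ab" * 500, "the quick brown fox jumps over the lazy dog " * 22):
            print(f"entropy/symbol = {shannon_entropy_bits(text):.3f}, "
                  f"compressed = {compressed_bits(text)} bits, raw = {8 * len(text)} bits")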
  • The rational analysis of mind and behavior. Nick Chater & Mike Oaksford - 2000 - Synthese 122 (1-2):93-131.
    Rational analysis (Anderson 1990, 1991a) is an empirical program of attempting to explain why the cognitive system is adaptive, with respect to its goals and the structure of its environment. We argue that rational analysis has two important implications for philosophical debate concerning rationality. First, rational analysis provides a model for the relationship between formal principles of rationality (such as probability or decision theory) and everyday rationality, in the sense of successful thought and action in daily life. Second, applying the program of rational analysis to research on human reasoning (...)
  • Type I error rates are not usually inflated. Mark Rubin - manuscript
    The inflation of Type I error rates is thought to be one of the causes of the replication crisis. Questionable research practices such as p-hacking are thought to inflate Type I error rates above their nominal level, leading to unexpectedly high levels of false positives in the literature and, consequently, unexpectedly low replication rates. In this article, I offer an alternative view. I argue that questionable and other research practices do not usually inflate relevant Type I error rates. I begin (...)
  • The Identification of Mean Quantum Potential with Fisher Information Leads to a Strong Uncertainty Relation. Yakov Bloch & Eliahu Cohen - 2022 - Foundations of Physics 52 (6):1-11.
    The Cramér–Rao bound, satisfied by classical Fisher information, a key quantity in information theory, has been shown in different contexts to give rise to the Heisenberg uncertainty principle of quantum mechanics. In this paper, we show that the identification of the mean quantum potential, an important notion in Bohmian mechanics, with the Fisher information leads, through the Cramér–Rao bound, to an uncertainty principle which is stronger, in general, than both the Heisenberg and Robertson–Schrödinger uncertainty relations, allowing one to experimentally test the validity (...)
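    For orientation, the classical Cramér–Rao bound invoked above, in its standard single-parameter form (the paper's quantum strengthening is not reproduced here): for an unbiased estimator $\hat{\theta}$ of $\theta$ from data $X \sim f(x; \theta)$,

        $$\operatorname{Var}(\hat{\theta}) \geq \frac{1}{I(\theta)}, \qquad I(\theta) = \mathbb{E}\!\left[ \left( \frac{\partial}{\partial \theta} \log f(X; \theta) \right)^{2} \right],$$

    where $I(\theta)$ is the Fisher information.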
  • Biosemiotics and Applied Evolutionary Epistemology: A Comparison. Nathalie Gontier & M. Facoetti - 2021 - In E. Pagni & R. Theisen Simanke (eds.), Biosemiotics and Evolution. Interdisciplinary Evolution Research, vol. 6. Cham: Springer, pp. 175-199.
    Both biosemiotics and evolutionary epistemology are concerned with how knowledge evolves. (Applied) Evolutionary Epistemology thereby focuses on identifying the units, levels, and mechanisms or processes that underlie the evolutionary development of knowing and knowledge, while biosemiotics places emphasis on the study of how signs underlie the development of meaning. We compare the two schools of thought and analyze how in delineating their research program, biosemiotics runs into several problems that are overcome by evolutionary epistemologists. For one, by emphasizing signs, biosemiotics (...)
  • Bernoulli’s golden theorem in retrospect: error probabilities and trustworthy evidence. Aris Spanos - 2021 - Synthese 199 (5-6):13949-13976.
    Bernoulli’s 1713 golden theorem is viewed retrospectively in the context of modern model-based frequentist inference that revolves around the concept of a prespecified statistical model $\mathcal{M}_{\boldsymbol{\theta}}(\mathbf{x})$, defining the inductive premises of inference. It is argued that several widely-accepted claims relating to the golden theorem and frequentist inference are either misleading or erroneous: (a) Bernoulli solved the problem of inference ‘from probability to frequency’, and thus (b) the golden theorem (...)
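    In modern notation, the golden theorem discussed here is the weak law of large numbers for Bernoulli trials (a standard rendering, not Spanos's own formulation): for i.i.d. $X_1, \ldots, X_n \sim \mathrm{Bernoulli}(\theta)$ and any $\varepsilon > 0$,

        $$\lim_{n \to \infty} \Pr\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \theta \right| < \varepsilon \right) = 1.$$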
  • Inference in the age of big data: Future perspectives on neuroscience. Danilo Bzdok & B. Yeo - unknown
  • The problem of model selection and scientific realism. Stanislav Larski - unknown
    This thesis has two goals. Firstly, we consider the problem of model selection for the purposes of prediction. In modern science predictive mathematical models are ubiquitous and can be found in such diverse fields as weather forecasting, economics, ecology, mathematical psychology, sociology, etc. It is often the case that for a given domain of inquiry there are several plausible models, and the issue then is how to discriminate between them – this is the problem of model selection. We consider approaches (...)
  • Can a Significance Test Be Genuinely Bayesian? Julio Michael Stern, Carlos Alberto de Bragança Pereira & Sergio Wechsler - 2008 - Bayesian Analysis 3 (1):79-100.
    The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.
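    As I understand the construction reviewed here, the FBST evidence value for a sharp hypothesis $H: \theta \in \Theta_H$ is built from the posterior density $f(\theta \mid x)$: with $f^{*} = \sup_{\theta \in \Theta_H} f(\theta \mid x)$ and tangential set $T = \{\theta : f(\theta \mid x) > f^{*}\}$,

        $$\overline{\mathrm{ev}}(H) = \Pr(\theta \in T \mid x), \qquad \mathrm{ev}(H) = 1 - \overline{\mathrm{ev}}(H),$$

    so a high $\mathrm{ev}(H)$ means the hypothesis does not sit deep in the tails of the posterior.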
  • On Peirce’s 1878 article ‘The probability of induction’: a conceptualistic appraisal. G. A. Kyriazis - 2020 - Archive for History of Exact Sciences 75 (1):1-20.
    Charles Sanders Peirce wrote the article “The probability of induction” in 1878. It was the fourth article of the series “Illustrations of the Logic of Science” which comprised a total of six articles. According to Peirce, to get a clear idea of the conception of probability, one has ‘to consider what real and sensible difference there is between one degree of probability and another.’ He endorsed what John Venn had called the ‘materialistic view’ of the subject, namely that probability is (...)
  • Semantic Information G Theory and Logical Bayesian Inference for Machine Learning. Chenguang Lu - 2019 - Information 10 (8):261.
    An important problem with machine learning is that when label number n>2, it is very difficult to construct and optimize a group of learning functions, and we wish that optimized learning functions are still useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists (...)
  • Burnside’s engagement with the “modern theory of statistics”. John Aldrich - 2008 - Archive for History of Exact Sciences 63 (1):51-79.
    The group theorist William Burnside devoted much of the last decade of his life to probability and statistics. The work led to contact with Ronald Fisher who was on his way to becoming the leading statistician of the age and with Karl Pearson, the man Fisher supplanted. Burnside corresponded with Fisher for nearly three years until their correspondence ended abruptly. This paper examines Burnside’s interactions with the statisticians and looks more generally at his work in probability and statistics.
  • What type of Type I error? Contrasting the Neyman–Pearson and Fisherian approaches in the context of exact and direct replications. Mark Rubin - 2021 - Synthese 198 (6):5809-5834.
    The replication crisis has caused researchers to distinguish between exact replications, which duplicate all aspects of a study that could potentially affect the results, and direct replications, which duplicate only those aspects of the study that are thought to be theoretically essential to reproduce the original effect. The replication crisis has also prompted researchers to think more carefully about the possibility of making Type I errors when rejecting null hypotheses. In this context, the present article considers the utility of two (...)
  • Statistics and Probability Have Always Been Value-Laden: An Historical Ontology of Quantitative Research Methods. Michael J. Zyphur & Dean C. Pierides - 2020 - Journal of Business Ethics 167 (1):1-18.
    Quantitative researchers often discuss research ethics as if specific ethical problems can be reduced to abstract normative logics (e.g., virtue ethics, utilitarianism, deontology). Such approaches overlook how values are embedded in every aspect of quantitative methods, including ‘observations,’ ‘facts,’ and notions of ‘objectivity.’ We describe how quantitative research practices, concepts, discourses, and their objects/subjects of study have always been value-laden, from the invention of statistics and probability in the 1600s to their subsequent adoption as a logic made to appear as (...)
  • From Models to Simulations. Franck Varenne - 2018 - London, UK: Routledge.
    This book analyses the impact computerization has had on contemporary science and explains the origins, technical nature and epistemological consequences of the current decisive interplay between technology and science: an intertwining of formalism, computation, data acquisition, data and visualization and how these factors have led to the spread of simulation models since the 1950s. -/- Using historical, comparative and interpretative case studies from a range of disciplines, with a particular emphasis on the case of plant studies, the author shows how (...)
  • Simplicity, Inference and Modelling: Keeping It Sophisticatedly Simple. Arnold Zellner, Hugo A. Keuzenkamp & Michael McAleer (eds.) - 2001 - New York: Cambridge University Press.
    The idea that simplicity matters in science is as old as science itself, with the much cited example of Ockham's Razor, 'entia non sunt multiplicanda praeter necessitatem': entities are not to be multiplied beyond necessity. A problem with Ockham's razor is that nearly everybody seems to accept it, but few are able to define its exact meaning and to make it operational in a non-arbitrary way. Using a multidisciplinary perspective including philosophers, mathematicians, econometricians and economists, this 2002 monograph examines simplicity (...)
  • Applied Logic without Psychologism. Gregory Wheeler - 2008 - Studia Logica 88 (1):137-156.
    Logic is a celebrated representation language because of its formal generality. But there are two senses in which a logic may be considered general, one that concerns a technical ability to discriminate between different types of individuals, and another that concerns constitutive norms for reasoning as such. This essay embraces the former, permutation-invariance conception of logic and rejects the latter, Fregean conception of logic. The question of how to apply logic under this pure invariantist view is addressed, and a methodology (...)
  • Reasoning defeasibly about probabilities. John L. Pollock - 2011 - Synthese 181 (2):317-352.
    In concrete applications of probability, statistical investigation gives us knowledge of some probabilities, but we generally want to know many others that are not directly revealed by our data. For instance, we may know prob(P/Q) (the probability of P given Q) and prob(P/R), but what we really want is prob(P/Q&R), and we may not have the data required to assess that directly. The probability calculus is of no help here. Given prob(P/Q) and prob(P/R), it is consistent with the probability (...)
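    Pollock's opening point, that prob(P/Q) and prob(P/R) do not determine prob(P/Q&R), can be checked directly. A minimal sketch (my own illustration, not Pollock's): two joint distributions over the truth values of P, Q, R agree that prob(P/Q) = prob(P/R) = 0.8 yet assign different values to prob(P/Q&R).

        def cond(dist, event, given):
            """prob(event | given) over a distribution on (P, Q, R) truth-value triples."""
            den = sum(w for atom, w in dist.items() if given(atom))
            num = sum(w for atom, w in dist.items() if event(atom) and given(atom))
            return num / den

        P = lambda a: a[0]
        Q = lambda a: a[1]
        R = lambda a: a[2]
        QandR = lambda a: a[1] and a[2]

        # Distribution 1: Q and R always co-occur, so prob(P/Q&R) = prob(P/Q) = 0.8.
        d1 = {(True, True, True): 0.4, (False, True, True): 0.1, (False, False, False): 0.5}

        # Distribution 2: the same conditionals prob(P/Q) = prob(P/R) = 0.8,
        # but only 60% of the Q-and-R region is P.
        d2 = {
            (True, True, True): 0.12, (False, True, True): 0.08,  # Q&R region
            (True, True, False): 0.20,                            # Q-only, all P
            (True, False, True): 0.20,                            # R-only, all P
            (False, False, False): 0.40,
        }

        for d in (d1, d2):
            print(round(cond(d, P, Q), 3), round(cond(d, P, R), 3), round(cond(d, P, QandR), 3))
        # Prints 0.8 0.8 0.8 for d1 but 0.8 0.8 0.6 for d2.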
  • Johannes von Kries’s Principien: A Brief Guide for the Perplexed. Sandy Zabell - 2016 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 47 (1):131-150.
    This paper has the aim of making Johannes von Kries’s masterpiece, Die Principien der Wahrscheinlichkeitsrechnung of 1886, a little more accessible to the modern reader in three modest ways: first, it discusses the historical background to the book; next, it summarizes the basic elements of von Kries’s approach; and finally, it examines the so-called “principle of cogent reason” with which von Kries’s name is often identified in the English literature.
  • Two Cornell realisms: moral and scientific. Elliott Sober - 2015 - Philosophical Studies 172 (4):905-924.
    Richard Boyd and Nicholas Sturgeon develop distinctive naturalistic arguments for scientific realism and moral realism. Each defends a realist position by an inference to the best explanation. In this paper, I suggest that these arguments for realism should be reformulated, with the law of likelihood replacing inference to the best explanation. The resulting arguments for realism do not work.
  • Not throwing out the baby with the bathwater: Bell's condition of local causality mathematically 'sharp and clean'. Michiel P. Seevinck & Jos Uffink - 2011 - In Dennis Dieks, Wenceslao J. Gonzalez, Thomas Uebel, Stephan Hartmann & Marcel Weber (eds.), Explanation, Prediction, and Confirmation. Springer. pp. 425-450.
    The starting point of the present paper is Bell’s notion of local causality and his own sharpening of it so as to provide for mathematical formalisation. Starting with Norsen’s analysis of this formalisation, it is subjected to a critique that reveals two crucial aspects that have so far not been properly taken into account. These are the correct understanding of the notions of sufficiency, completeness and redundancy involved; and the fact that the apparatus settings and measurement outcomes have very different (...)
  • On the determinants of the conjunction fallacy: Probability versus inductive confirmation. Katya Tentori, Vincenzo Crupi & Selena Russo - 2013 - Journal of Experimental Psychology: General 142 (1):235.
  • Megavariate Genetics: What You Find Is What You Go Looking For. Clive E. Bowman - 2009 - Biological Theory 4 (1):21-28.
    The subjectivity or “purpose dependency” of measurement in biology is discussed using examples from high-dimensional medical genetic research. The human observer and study designer tacitly determine the numerical and graphical representation of biological simplicity or complexity via choice of ascertainment, numbers to measure, referential basis, statistical learning formalism and feature search, and also via the selection of display styles for all these quantifications.
  • The Unreasonable Ineffectiveness of Fisherian “Tests” in Biology, and Especially in Medicine. Deirdre N. McCloskey & Stephen T. Ziliak - 2009 - Biological Theory 4 (1):44-53.
    Biometrics has done damage with levels of R or p or Student’s t. The damage widened with Ronald A. Fisher’s victory in the 1920s and 1930s in devising mechanical methods of “testing,” against methods of common sense and scientific impact, “oomph.” The scale along which one would measure oomph is particularly clear in biomedical sciences: life or death. Cardiovascular epidemiology, to take one example, combines with gusto the “fallacy of the transposed conditional” and what we call the “sizeless stare” of (...)
  • An objectivist argument for thirdism. The Oscar Seminar - 2008 - Analysis 68 (2):149-155.
  • Mathematical statistics and metastatistical analysis. Andrés Rivadulla - 1991 - Erkenntnis 34 (2):211-236.
    This paper deals with meta-statistical questions concerning frequentist statistics. In Sections 2 to 4 I analyse the dispute between Fisher and Neyman on the so-called logic of statistical inference, a polemic that accompanied the development of mathematical statistics. My conclusion is that, whenever mathematical statistics makes it possible to draw inferences, it only uses deductive reasoning. Therefore I reject Fisher's inductive approach to statistical estimation theory and adhere to Neyman's deductive one. On the other hand, (...)
  • Ein Beispiel zur Ermittlung des optimalen Signifikanzniveaus nach der methodologischen Regel der Maximierung des Grades der Prüfbarkeit [An example of determining the optimal significance level according to the methodological rule of maximizing the degree of testability]. Bernd Fleischmann - 1986 - Erkenntnis 24 (3):385-395.
  • Probable probabilities. John Pollock - 2007
    In concrete applications of probability, statistical investigation gives us knowledge of some probabilities, but we generally want to know many others that are not directly revealed by our data. For instance, we may know prob(P/Q) (the probability of P given Q) and prob(P/R), but what we really want is prob(P/Q&R), and we may not have the data required to assess that directly. The probability calculus is of no help here. Given prob(P/Q) and prob(P/R), it is consistent with the probability calculus (...)
  • Corroboration, explanation, evolving probability, simplicity and a sharpened razor. I. J. Good - 1968 - British Journal for the Philosophy of Science 19 (2):123-143.