
Citations of:

Stephen M. Stigler, The History of Statistics: The Measurement of Uncertainty Before 1900. Cambridge, MA: Harvard University Press (1986).

  • Econophysics: making sense of a chimera. Adrian K. Yee - 2021 - European Journal for Philosophy of Science 11 (4):1-34.
    The history of economic thought witnessed several prominent economists who took seriously models and concepts in physics for the elucidation and prediction of economic phenomena. Econophysics is an emerging discipline at the intersection of heterodox economics and the physics of complex systems, with practitioners typically engaged in two overlapping but distinct methodological programs. The first is to export mathematical methods used in physics for the purposes of studying economic phenomena. The second is to export mechanisms in physics into economics. A (...)
  • Decoupling, Sparsity, Randomization, and Objective Bayesian Inference. Julio Michael Stern - 2008 - Cybernetics and Human Knowing 15 (2):49-68.
    Decoupling is a general principle that allows us to separate simple components in a complex system. In statistics, decoupling is often expressed as independence, no association, or zero covariance relations. These relations are sharp statistical hypotheses that can be tested using the FBST - Full Bayesian Significance Test. Decoupling relations can also be introduced by some techniques of Design of Statistical Experiments, DSEs, like randomization. This article discusses the concepts of decoupling, randomization and sparsely connected statistical models in the epistemological (...)
  • A small step towards unification of economics and physics. Subhendu Bhattacharyya - 2021 - Mind and Society 20 (1):69-84.
    Unification of natural science and social science is a centuries-old, unmitigated debate. Natural science has a chronological advantage over social science because the latter took time to include many social phenomena in its fold. The history of science witnessed quite a number of efforts by social scientists to fit this discipline in a rational if not mathematical framework. On the other hand, a tendency has been observed among some physicists, especially since the last century, to recast a number of social phenomena (...)
  • La valeur de l'incertitude : l'évaluation de la précision des mesures physiques et les limites de la connaissance expérimentale [The value of uncertainty: the evaluation of the precision of physical measurements and the limits of experimental knowledge]. Fabien Grégis - 2016 - Dissertation, Université Sorbonne Paris Cité (Université Paris Diderot, Paris 7)
    A measurement result is never absolutely accurate: it is affected by an unknown “measurement error” which characterizes the discrepancy between the obtained value and the “true value” of the quantity intended to be measured. As a consequence, to be acceptable a measurement result cannot take the form of a unique numerical value, but has to be accompanied by an indication of its “measurement uncertainty”, which enunciates a state of doubt. What, though, is the value of measurement uncertainty? What (...)
  • Philosophy as conceptual engineering: Inductive logic in Rudolf Carnap's scientific philosophy. Christopher F. French - 2015 - Dissertation, University of British Columbia
    My dissertation explores the ways in which Rudolf Carnap sought to make philosophy scientific by further developing recent interpretive efforts to explain Carnap’s mature philosophical work as a form of engineering. It does this by looking in detail at his philosophical practice in his most sustained mature project, his work on pure and applied inductive logic. I, first, specify the sort of engineering Carnap is engaged in as involving an engineering design problem and then draw out the complications of design (...)
  • “Describing our whole experience”: The statistical philosophies of W. F. R. Weldon and Karl Pearson. Charles H. Pence - 2011 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 42 (4):475-485.
    There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton’s footsteps. I argue for two related theses in light of this standard interpretation, based on a reading (...)
  • The Philosophy of Generative Linguistics. Peter Ludlow - 2011 - Oxford, GB: Oxford University Press.
    Peter Ludlow presents the first book on the philosophy of generative linguistics, including both Chomsky's government and binding theory and his minimalist program.
  • Regression explanation and statistical autonomy. Joeri Witteveen - 2019 - Biology and Philosophy 34 (5):1-20.
    The phenomenon of regression toward the mean is notoriously liable to be overlooked or misunderstood; regression fallacies are easy to commit. But even when regression phenomena are duly recognized, it remains perplexing how they can feature in explanations. This article develops a philosophical account of regression explanations as “statistically autonomous” explanations that cannot be deepened by adducing details about causal histories, even if the explananda as such are embedded in the causal structure of the world. That regression explanations have statistical (...)
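    Because regression toward the mean is so often misread causally, a tiny simulation helps fix the idea the entry above turns on. The Python sketch below is a generic illustration, not anything from Witteveen's paper; all quantities are invented. It selects the top decile on one noisy measurement and shows the same group scoring closer to the population mean on an independent remeasurement.

      # Regression toward the mean: select on one noisy test, remeasure.
      import random

      random.seed(0)
      N = 100_000
      true = [random.gauss(0, 1) for _ in range(N)]      # latent trait
      test1 = [t + random.gauss(0, 1) for t in true]     # noisy measure 1
      test2 = [t + random.gauss(0, 1) for t in true]     # noisy measure 2

      cutoff = sorted(test1)[int(0.9 * N)]               # top-decile cut
      top = [i for i in range(N) if test1[i] >= cutoff]

      mean1 = sum(test1[i] for i in top) / len(top)
      mean2 = sum(test2[i] for i in top) / len(top)
      print(f"top decile on test 1: {mean1:.2f}")        # well above 0
      print(f"same group on test 2: {mean2:.2f}")        # pulled toward 0

    No causal process connects the two tests; the fallback is a purely statistical consequence of selection plus imperfect correlation, which is the sense in which such explanations are "statistically autonomous."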
  • Statistics and Probability Have Always Been Value-Laden: An Historical Ontology of Quantitative Research Methods. Michael J. Zyphur & Dean C. Pierides - 2020 - Journal of Business Ethics 167 (1):1-18.
    Quantitative researchers often discuss research ethics as if specific ethical problems can be reduced to abstract normative logics (e.g., virtue ethics, utilitarianism, deontology). Such approaches overlook how values are embedded in every aspect of quantitative methods, including ‘observations,’ ‘facts,’ and notions of ‘objectivity.’ We describe how quantitative research practices, concepts, discourses, and their objects/subjects of study have always been value-laden, from the invention of statistics and probability in the 1600s to their subsequent adoption as a logic made to appear as (...)
  • Notes on Bayesian Confirmation Theory. Michael Strevens - n.d.
    Bayesian confirmation theory—abbreviated to BCT in these notes—is the predominant approach to confirmation in late twentieth century philosophy of science. It has many critics, but no rival theory can claim anything like the same following. The popularity of the Bayesian approach is due to its flexibility, its apparently effortless handling of various technical problems, the existence of various a priori arguments for its validity, and its injection of subjective and contextual elements into the process of confirmation in just the places where (...)
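    The quantitative core of the framework the entry describes fits in a few lines. The sketch below is a generic textbook computation, not material from Strevens's notes; the probability values are invented. It applies Bayes' theorem and checks the standard criterion that evidence E confirms hypothesis H when P(H|E) > P(H).

      # Bayesian confirmation: does evidence E raise the probability of H?
      p_h = 0.3           # prior P(H) (illustrative value)
      p_e_given_h = 0.9   # P(E | H)
      p_e_given_not_h = 0.2

      # Law of total probability, then Bayes' theorem.
      p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
      p_h_given_e = p_e_given_h * p_h / p_e

      print(f"prior     P(H)   = {p_h:.3f}")          # 0.300
      print(f"posterior P(H|E) = {p_h_given_e:.3f}")  # 0.659 > 0.300: E confirms H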
  • Gauss on least-squares and maximum-likelihood estimation. Jan R. Magnus - 2022 - Archive for History of Exact Sciences 76 (4):425-430.
    Gauss’ 1809 discussion of least squares, which can be viewed as the beginning of mathematical statistics, is reviewed. The general consensus seems to be that Gauss’ arguments are at fault, but we show that his reasoning is in fact correct, given his self-imposed restrictions, and persuasive without these restrictions.
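    The equivalence at issue can be stated compactly. The LaTeX fragment below records the standard modern derivation that, under independent normal errors, maximum likelihood reduces to least squares; it is offered as background to the entry, not as a reconstruction of Gauss's 1809 argument or of Magnus's analysis.

      \documentclass{article}
      \usepackage{amsmath}
      \begin{document}
      For observations $y_i = x_i^\top \beta + \varepsilon_i$ with
      $\varepsilon_i \sim \mathcal{N}(0,\sigma^2)$ i.i.d., the log-likelihood is
      \begin{equation}
      \ell(\beta,\sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2)
        - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - x_i^\top \beta\bigr)^2 .
      \end{equation}
      For fixed $\sigma^2$, maximizing $\ell$ over $\beta$ is the same as
      minimizing $\sum_i (y_i - x_i^\top \beta)^2$: the maximum-likelihood
      estimator under normal errors is the least-squares estimator.
      \end{document}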
  • R. A. Fisher and his advocacy of randomization. Nancy S. Hall - 2007 - Journal of the History of Biology 40 (2):295-325.
    The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. (...)
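    Fisher's dictum that randomization "permits a valid test of significance" is usually illustrated with a randomization (permutation) test: under random assignment, the null distribution of a test statistic can be generated by re-randomizing the group labels. The Python sketch below is such a textbook illustration with invented data, not an example from Hall's paper.

      # Randomization test for a difference in group means.
      import random

      random.seed(1)
      treatment = [23.1, 25.4, 24.8, 26.0, 25.1]
      control   = [22.0, 23.5, 21.9, 23.0, 22.7]

      def mean_diff(a, b):
          return sum(a) / len(a) - sum(b) / len(b)

      observed = mean_diff(treatment, control)
      pooled = treatment + control
      n_t = len(treatment)

      # Re-randomize labels many times; count differences at least as large.
      extreme = 0
      trials = 10_000
      for _ in range(trials):
          random.shuffle(pooled)
          if mean_diff(pooled[:n_t], pooled[n_t:]) >= observed:
              extreme += 1

      print(f"observed difference: {observed:.2f}")
      print(f"one-sided p-value:   {extreme / trials:.4f}")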
  • Johannes von Kries's Principien: A Brief Guide for the Perplexed. Sandy Zabell - 2016 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 47 (1):131-150.
    This paper has the aim of making Johannes von Kries's masterpiece, Die Principien der Wahrscheinlichkeitsrechnung of 1886, a little more accessible to the modern reader in three modest ways: first, it discusses the historical background to the book; next, it summarizes the basic elements of von Kries's approach; and finally, it examines the so-called “principle of cogent reason” with which von Kries's name is often identified in the English literature.
  • Two Mathematical Approaches to Random Fluctuations. Chen-Pang Yeang - 2016 - Perspectives on Science 24 (1):45-72.
    Randomness, uncertainty, and lack of regularity had concerned savants for a long time. As early as the seventeenth century, Blaise Pascal conceived the arithmetic of chance for gambling. At the height of positional astronomy, mathematicians developed a theory of errors to cope with random deviations in astronomical observations. In the nineteenth century, pioneers of statistics employed probabilistic calculus to define “normal” and “pathological” in the distribution of social characters, while physicists devised the statistical-mechanical interpretation of thermodynamic effects. By the end (...)
  • William Stanley Jevons. Bert Mosselmans - 2008 - Stanford Encyclopedia of Philosophy.
  • Indifference, neutrality and informativeness: generalizing the three prisoners paradox. Sergio Wechsler, L. G. Esteves, A. Simonis & C. Peixoto - 2005 - Synthese 143 (3):255-272.
    The uniform prior distribution is often seen as a mathematical description of noninformativeness. This paper uses the well-known Three Prisoners Paradox to examine the impossibility of maintaining noninformativeness throughout hierarchization. The Paradox has been solved by Bayesian conditioning over the choice made by the Warder when asked to name a prisoner who will be shot. We generalize the paradox to situations of N prisoners, k executions and m announcements made by the Warder. We then extend the consequences of hierarchically (...)
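    The classic N = 3 case that the paper generalizes is solvable by direct enumeration. The Python sketch below conditions on the warder's announcement under the usual assumptions (uniform prior over who is pardoned; the warder never names the asker A or the pardoned prisoner, and chooses at random when both B and C are eligible). It shows the standard solution only, not the paper's N-prisoner extension.

      # Three Prisoners Paradox by Bayesian conditioning, in exact arithmetic.
      from fractions import Fraction

      prior = {p: Fraction(1, 3) for p in "ABC"}

      # P(warder names B | pardoned prisoner): never A (the asker), never
      # the pardoned one; a fair coin between B and C when A is pardoned.
      names_B = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}

      joint = {p: prior[p] * names_B[p] for p in "ABC"}
      total = sum(joint.values())
      posterior = {p: joint[p] / total for p in "ABC"}

      for p in "ABC":
          print(p, posterior[p])   # A: 1/3, B: 0, C: 2/3

    A's chance of pardon stays at 1/3 after the announcement, while C's rises to 2/3, which is why the naive "it is now 1/2 each" reading is a fallacy.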
  • Emancipation Through Interaction – How Eugenics and Statistics Converged and Diverged. Francisco Louçã - 2009 - Journal of the History of Biology 42 (4):649-684.
    The paper discusses the scope and influence of eugenics in defining the scientific programme of statistics and the impact of the evolution of biology on social scientists. It argues that eugenics was instrumental in providing a bridge between sciences, and therefore created both the impulse and the institutions necessary for the birth of modern statistics in its applications first to biology and then to the social sciences. Looking at the question from the point of view of the history of statistics (...)
  • The Emergence of Modern Statistics in Agricultural Science: Analysis of Variance, Experimental Design and the Reshaping of Research at Rothamsted Experimental Station, 1919–1933. Giuditta Parolini - 2015 - Journal of the History of Biology 48 (2):301-335.
    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician (...)
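    For orientation, the method whose genesis the paper traces partitions observed variation into between-group and within-group components and compares them. The Python sketch below works a one-way analysis of variance by hand on invented yield data; it is a textbook illustration, not a Rothamsted computation.

      # One-way ANOVA: F = (between-group variance) / (within-group variance).
      groups = [
          [20.1, 21.3, 19.8, 20.6],   # yields under variety A (invented)
          [22.5, 23.1, 22.0, 22.9],   # variety B
          [19.0, 18.4, 19.5, 18.8],   # variety C
      ]

      k = len(groups)
      n = sum(len(g) for g in groups)
      grand = sum(sum(g) for g in groups) / n

      ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
      ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

      f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
      print(f"F({k - 1}, {n - k}) = {f_stat:.2f}")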
  • Probability, uncertainty and artificial intelligence: Carlotta Piscopo: The metaphysical nature of the non-adequacy claim. Dordrecht: Springer, 2013, 146pp, $129 HB. James Cussens - 2014 - Metascience 23 (3):505-511.
    The central thesis of this book is that the argument that probability is insufficient to handle uncertainty in artificial intelligence (AI) is metaphysical in nature. Piscopo calls this argument against probability the non-adequacy claim and provides this summary of it, which first appeared in Piscopo and Birattari (2008): “Probability theory is not suitable to handle uncertainty in AI because it has been developed to deal with intrinsically stochastic phenomena, while in AI, uncertainty has an epistemic nature” (Piscopo, 3). Piscopo uses the term (...)
  • Minding Matter/Mattering Mind: Knowledge and the Subject in Nineteenth-Century Psychology. John Carson - 1999 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 30 (3):345-376.
  • Dispersion of response times reveals cognitive dynamics. John G. Holden, Guy C. Van Orden & Michael T. Turvey - 2009 - Psychological Review 116 (2):318-342.
  • Measurement, Explanation, and Biology: Lessons From a Long Century. Fred L. Bookstein - 2009 - Biological Theory 4 (1):6-20.
    It is far from obvious that outside of highly specialized domains such as commercial agriculture, the methodology of biometrics—quantitative comparisons over groups of organisms—should be of any use in today’s bioinformatically informed biological sciences. The methods in our biometric textbooks, such as regressions and principal components analysis, make assumptions of homogeneity that are incompatible with current understandings of the origins of developmental or evolutionary data in historically contingent processes, processes that might have come out otherwise; the appropriate statistical methods are (...)
  • Personalization as a promise: Can Big Data change the practice of insurance? Arthur Charpentier & Laurence Barry - 2020 - Big Data and Society 7 (1).
    The aim of this article is to assess the impact of Big Data technologies for insurance ratemaking, with a special focus on motor products. The first part shows how statistics and insurance mechanisms adopted the same aggregate viewpoint. It made visible regularities that were invisible at the individual level, further supporting the classificatory approach of insurance and the assumption that all members of a class are identical risks. The second part focuses on the reversal of perspective currently occurring in data analysis (...)
  • Models of integration given multiple sources of information. Dominic W. Massaro & Daniel Friedman - 1990 - Psychological Review 97 (2):225-252.
  • Racial zigzags. Amir Teicher - 2015 - History of the Human Sciences 28 (5):17-48.
    In 1907, German anthropologist Theodor Mollison invented a unique method for racial differentiation, called ‘deviation curves’. By transforming anthropometric data matrices into graphs, Mollison’s method enabled the simultaneous comparison of a large number of physical attributes of individuals and groups. However, the construction of deviation curves had been highly desultory, and their interpretation had been prone to various visual misjudgements. Despite their methodological shortcomings, deviation curves became very popular among racial anthropologists. This positive reception not only stemmed from the method’s (...)
  • On the construction of mental objects in third and in first persons. Arno L. Goudsmit - 2000 - Foundations of Science 5 (4):399-428.
    This paper deals with some formal properties of objects that are supposed to be internal to persons, that is, mental structures and mental functions. Depending on the ways of talking about these internal objects, they will appear different. Two types of discourse will be presented, to be called the realist and the nominalist discourses, and for each discourse I will focus upon the construction of 'self'. The realist discourse assumes an identity between the person and his construction of himself. I will illustrate (...)
  • The chaos of particular facts: statistics, medicine and the social body in early 19th-century France. Joshua Cole - 1994 - History of the Human Sciences 7 (3):1-27.