
Citations of:

The Rise of Statistical Thinking, 1820-1900

Theodore M. Porter. Princeton: Princeton University Press (1986)

  • Philosophy as conceptual engineering: Inductive logic in Rudolf Carnap's scientific philosophy.Christopher F. French - 2015 - Dissertation, University of British Columbia
    My dissertation explores the ways in which Rudolf Carnap sought to make philosophy scientific by further developing recent interpretive efforts to explain Carnap’s mature philosophical work as a form of engineering. It does this by looking in detail at his philosophical practice in his most sustained mature project, his work on pure and applied inductive logic. I, first, specify the sort of engineering Carnap is engaged in as involving an engineering design problem and then draw out the complications of design (...)
  • The Early History of Chance in Evolution.Charles H. Pence - 2015 - Studies in History and Philosophy of Science Part A 50:48-58.
    Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of (...)
  • International Handbook of Research in History, Philosophy and Science Teaching.Michael R. Matthews (ed.) - 2014 - Springer.
    This inaugural handbook documents the distinctive research field that utilizes history and philosophy in the investigation of theoretical, curricular and pedagogical issues in the teaching of science and mathematics. It is contributed to by 130 researchers from 30 countries; it provides a logically structured, fully referenced guide to the ways in which science and mathematics education is informed by the history and philosophy of these disciplines, as well as by the philosophy of education more generally. The first handbook to cover the (...)
  • Continuity, causality and determinism in mathematical physics: from the late 18th until the early 20th century.Marij van Strien - 2014 - Dissertation, University of Ghent
    It is commonly thought that before the introduction of quantum mechanics, determinism was a straightforward consequence of the laws of mechanics. However, around the nineteenth century, many physicists, for various reasons, did not regard determinism as a provable feature of physics. This is not to say that physicists in this period were not committed to determinism; there were some physicists who argued for fundamental indeterminism, but most were committed to determinism in some sense. However, for them, determinism was often not (...)
  • Towards “A Natural History of Data”: Evolving Practices and Epistemologies of Data in Paleontology, 1800–2000. [REVIEW]David Sepkoski - 2013 - Journal of the History of Biology 46 (3):401-444.
    The fossil record is paleontology’s great resource, telling us virtually everything we know about the past history of life. This record, which has been accumulating since the beginning of paleontology as a professional discipline in the early nineteenth century, is a collection of objects. The fossil record exists literally, in the specimen drawers where fossils are kept, and figuratively, in the illustrations and records of fossils compiled in paleontological atlases and compendia. However, as has become increasingly clear since the later (...)
  • The chaos of particular facts: statistics, medicine and the social body in early 19th-century France.Joshua Cole - 1994 - History of the Human Sciences 7 (3):1-27.
  • Michael Young's The Rise of the Meritocracy: A Philosophical Critique.Ansgar Allen - 2011 - British Journal of Educational Studies 59 (4):367-382.
    This paper examines Michael Young's 1958 dystopia, The Rise of the Meritocracy. In this book, the word 'meritocracy' was coined and used in a pejorative sense. Today, however, meritocracy represents a positive ideal against which we measure the justice of our institutions. This paper argues that, when read in the twenty-first century, Young's dystopia does little to dislodge the implicit appeal of a meritocratic society. It examines the principles of education and administrative justice upon which meritocracy is based, suggesting that (...)
  • “Describing our whole experience”: The statistical philosophies of W. F. R. Weldon and Karl Pearson.Charles H. Pence - 2011 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 42 (4):475-485.
    There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton’s footsteps. I argue for two related theses in light of this standard interpretation, based on a reading (...)
  • Notes on Bayesian confirmation theory.Michael Strevens -
    Bayesian confirmation theory—abbreviated to BCT in these notes—is the predominant approach to confirmation in late twentieth century philosophy of science. It has many critics, but no rival theory can claim anything like the same following. The popularity of the Bayesian approach is due to its flexibility, its apparently effortless handling of various technical problems, the existence of various a priori arguments for its validity, and its injection of subjective and contextual elements into the process of confirmation in just the places where (...)
  • Pedigrees of madness: the study of heredity in nineteenth and early twentieth century psychiatry.Bernd Gausemeier - 2015 - History and Philosophy of the Life Sciences 36 (4):467-483.
    This article discusses the development of the statistical methods employed by psychiatrists to study heredity as a causative factor of mental diseases. It argues that psychiatric asylums and clinics were the first institutions in which human heredity became the object of systematic research. It also highlights the different concepts of heredity prevalent in the psychiatric community. The first of four parts traces how heredity became a central category of asylum statistics in the first half of the nineteenth century. The second (...)
  • Two explanations of evolutionary progress.Gregory Radick - 2000 - Biology and Philosophy 15 (4):475-491.
    Natural selection explains how living forms are fitted to their conditions of life. Darwin argued that selection also explains what he called the gradual advancement of the organisation, i.e. evolutionary progress. Present-day selectionists disagree. In their view, it is happenstance that sustains conditions favorable to progress, and therefore happenstance, not selection, that explains progress. I argue that the disagreement here turns not on whether there exists a selection-based condition bias – a belief now attributed to Darwin – but on whether there needs to be such a bias (...)
  • When data drive health: an archaeology of medical records technology.Colin Koopman, Paul D. G. Showler, Patrick Jones, Mary McLevey & Valerie Simon - 2022 - Biosocieties 17 (4):782-804.
    Medicine is often thought of as a science of the body, but it is also a science of data. In some contexts, it can even be asserted that data drive health. This article focuses on a key piece of data technology central to contemporary practices of medicine: the medical record. By situating the medical record in the perspective of its history, we inquire into how the kinds of data that are kept at sites of clinical encounter often depend on informational (...)
  • The Average Isn’t Normal: The History and Cognitive Science of an Everyday Scientific Practice.Henry Cowles & Joshua Knobe - 2023 - In Uriah Kriegel (ed.), Oxford Studies in Philosophy of Mind Vol. 3. Oxford: Oxford University Press.
    Within contemporary science, it is common practice to compare data points to the average, i.e., to the statistical mean. Because this practice is so familiar, it might at first appear not to be the sort of thing that requires explanation. But recent research in cognitive science and in the history of science gives us reason to adopt the opposite perspective. Cognitive science research on the ways people ordinarily make sense of the world suggests that, instead of using a purely statistical (...)
  • Whatever Happened to Reversion?Charles H. Pence - 2022 - Studies in History and Philosophy of Science Part A 92 (C):97-108.
    The idea of ‘reversion’ or ‘atavism’ has a peculiar history. For many authors in the late-nineteenth and early-twentieth centuries – including Darwin, Galton, Pearson, Weismann, and Spencer, among others – reversion was one of the central phenomena which a theory of heredity ought to explain. By only a few decades later, however, Fisher and others could look back upon reversion as a historical curiosity, a non-problem, or even an impediment to clear theorizing. I explore various reasons that reversion might have (...)
  • The Average Isn’t Normal.Joshua Knobe & Henry Cowles - manuscript
    Within contemporary science, it is common practice to compare data points to the _average_, i.e., to the statistical mean. Because this practice is so familiar, it might at first appear not to be the sort of thing that requires explanation. But recent research in cognitive science gives us reason to adopt the opposite perspective. Research on the cognitive processes involved in people’s ordinary efforts to make sense of the world suggests that, instead of using a purely statistical notion of the (...)
  • In Praise of Clausius Entropy: Reassessing the Foundations of Boltzmannian Statistical Mechanics.Christopher Gregory Weaver - 2021 - Foundations of Physics 51 (3):1-64.
    I will argue, pace a great many of my contemporaries, that there's something right about Boltzmann's attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what's right about (what was at least at one time) Boltzmann's explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Boltzmann entropy and the work of Rudolf Clausius.
  • Outline for a History of Science Measurement.Benoît Godin - 2002 - Science, Technology, and Human Values 27 (1):3-27.
    The measurement of science and technology is now fifty years old. It owes a large part of its existence to the work of the National Science Foundation and the Organization for Economic Cooperation and Development in the 1950s and 1960s. Given the centrality of S&T statistics in science studies, it is surprising that no history of the measurement exists in the literature. This article outlines such a history. The history is cast in the light of social statistics. Like social statistics, (...)
  • Corporate Capitalism and the Growing Power of Big Data: Review Essay. [REVIEW]Martha Poon - 2016 - Science, Technology, and Human Values 41 (6):1088-1108.
  • Making Quantitative Research Work: From Positivist Dogma to Actual Social Scientific Inquiry.Michael J. Zyphur & Dean C. Pierides - 2020 - Journal of Business Ethics 167 (1):49-62.
    Researchers misunderstand their role in creating ethical problems when they allow dogmas to purportedly divorce scientists and scientific practices from the values that they embody. Cortina, Edwards, and Powell help us clarify and further develop our position by responding to our critique of, and alternatives to, this misleading separation. In this rebuttal, we explore how the desire to achieve the separation of facts and values is unscientific on the very terms endorsed by its advocates—this separation is refuted by empirical observation. (...)
  • Statistics and Probability Have Always Been Value-Laden: An Historical Ontology of Quantitative Research Methods.Michael J. Zyphur & Dean C. Pierides - 2020 - Journal of Business Ethics 167 (1):1-18.
    Quantitative researchers often discuss research ethics as if specific ethical problems can be reduced to abstract normative logics (e.g., virtue ethics, utilitarianism, deontology). Such approaches overlook how values are embedded in every aspect of quantitative methods, including ‘observations,’ ‘facts,’ and notions of ‘objectivity.’ We describe how quantitative research practices, concepts, discourses, and their objects/subjects of study have always been value-laden, from the invention of statistics and probability in the 1600s to their subsequent adoption as a logic made to appear as (...)
  • Jamesian Free Will, The Two-stage Model Of William James.Bob Doyle - 2010 - William James Studies 5:1-28.
    Research into two-stage models of “free will” – first “free” random generation of alternative possibilities, followed by “willed” adequately determined decisions consistent with character, values, and desires – suggests that William James was in 1884 the first of a dozen philosophers and scientists to propose such a two-stage model for free will. We review the later work to establish James’s priority. By limiting chance to the generation of alternative possibilities, James was the first to overcome the standard two-part argument against (...)
  • The Emergence of Modern Statistics in Agricultural Science: Analysis of Variance, Experimental Design and the Reshaping of Research at Rothamsted Experimental Station, 1919–1933.Giuditta Parolini - 2015 - Journal of the History of Biology 48 (2):301-335.
    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician (...)
  • Observation observed: Lorraine Daston and Elizabeth Lunbeck: Histories of scientific observation. Chicago: University of Chicago Press, 2011, 460pp, $81.00 HB, $27.50 PB.Sachiko Kusukawa - 2013 - Metascience 23 (2):347-352.
    This is an important volume of seventeen essays that historicizes observation as a practice, concept and ideal. It belongs to the historiographical tradition of scrutinizing central aspects of the scientific enterprise such as experiments and objectivity that once appeared too self-evident to be probed. The challenge of historicizing such a significant idea is that it has to be a collective enterprise. The volume starts with three essays that provide a chronological survey of the period from 500 to 1800. Katherine Park, covering (...)
  • Probability and statistics in Boltzmann's early papers on kinetic theory.Massimiliano Badino - unknown
    Boltzmann’s equilibrium theory has not received from scholars the attention it deserves. It was always interpreted as a mere generalization of Maxwell’s work or, in the most favorable case, as a sketch of some ideas more consistently developed in the 1872 memoir. In this paper, I try to prove that this view is ungenerous. My claim is that in the theory developed during the period 1866-1871 the generalization of Maxwell’s distribution was mainly a means to achieve a more general scope: (...)
  • Health Equity’s Missing Substance: (Re)Engaging the Normative in Public Health Discourse and Knowledge Making.Adam Wildgen & Keith Denny - 2020 - Public Health Ethics 13 (3):247-258.
    Since 1984, the idea of health equity has proliferated throughout public health discourse with little mainstream critique for its variability and distance from its original articulation signifying social transformation and a commitment to social justice. In the years since health equity’s emergence and proliferation, it has taken on a seemingly endless range of invocations and deployments, but it most often translates into proactive and apolitical discourse and practice. In Margaret Whitehead’s influential characterization, achieving health equity requires determining what is inequitable (...)
  • Die Historizität der Verdatung: Konzepte, Werkzeuge und Praktiken im 19. Jahrhundert.Christine von Oertzen - 2017 - NTM Zeitschrift für Geschichte der Wissenschaften, Technik und Medizin 25 (4):407-434.
    This contribution examines the now-ubiquitous term “data” (“Daten”) as a historical category. It traces the slow spread of the word among statisticians in the nineteenth century and investigates the material culture of the concepts and practices that accompanied its use. Using the example of the Prussian census, the contribution thereby uncovers hitherto unnoticed genealogies of data-driven research: it was not only the computer specialists of the twentieth and twenty-first centuries but already the scientists of the nineteenth century who took up the term for the production of strictly abstracted, numerical (...)
  • Critical Data Studies: A dialog on data and space.Jim Thatcher, Linnet Taylor & Craig M. Dalton - 2016 - Big Data and Society 3 (1).
    In light of recent technological innovations and discourses around data and algorithmic analytics, scholars of many stripes are attempting to develop critical agendas and responses to these developments. In this mutual interview, three scholars discuss the stakes, ideas, responsibilities, and possibilities of critical data studies. The resulting dialog seeks to explore what kinds of critical approaches to these topics, in theory and practice, could open and make available such approaches to a broader audience.
  • Why Friedman's methodology did not generate consensus among economists?David Teira - 2009 - Journal of the History of Economic Thought 31 (2):201-214.
    In this paper I study how the theoretical categories of consumption theory were used by Milton Friedman in order to classify empirical data and obtain predictions. Friedman advocated a case-by-case definition of these categories that traded theoretical coherence for empirical content. I contend that this methodological strategy creates a clear incentive to contest any prediction contrary to our interests: it can always be argued that these predictions rest on a wrong classification of data. My conjecture is that this (...)
  • A Primer on Ernst Abbe for Frege Readers.Jamie Tappenden - 2008 - Canadian Journal of Philosophy 38 (S1):31-118.
    Setting out to understand Frege, the scholar confronts a roadblock at the outset: We just have little to go on. Much of the unpublished work and correspondence is lost, probably forever. Even the most basic task of imagining Frege's intellectual life is a challenge. The people he studied with and those he spent daily time with are little known to historians of philosophy and logic. To be sure, this makes it hard to answer broad questions like: 'Who influenced Frege?' But (...)
  • Series of forms, visual techniques, and quantitative devices: ordering the world between the end of the nineteenth and early twentieth centuries.Marco Tamborini - 2019 - History and Philosophy of the Life Sciences 41 (4):1-20.
    In this paper, I investigate the variety and richness of the taxonomical practices between the end of the nineteenth and the early twentieth centuries. During these decades, zoologists and paleontologists came up with different quantitative practices in order to classify their data in line with the new biological principles introduced by Charles Darwin. Specifically, I will investigate Florentino Ameghino’s mathematization of mammalian dentition and the quantitative practices and visualizations of several German-speaking paleontologists at the beginning of the twentieth century. In (...)
  • Phrenology and the average person, 1840–1940.Fenneke Sysling - 2021 - History of the Human Sciences 34 (2):27-45.
    The popular science of phrenology is known for its preoccupation with geniuses and criminals, but this article shows that phrenologists also introduced ideas about the ‘average’ person. Popular phrenologists in the US and the UK examined the heads of their clients to give an indication of their character. Based on the publications of phrenologists and on a large collection of standardized charts with clients’ scores, this article analyses their definition of what they considered to be the ‘average’. It can be (...)
  • The Norton Dome and the Nineteenth Century Foundations of Determinism.Marij van Strien - 2014 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 45 (1):167-185.
    The recent discovery of an indeterministic system in classical mechanics, the Norton dome, has shown that answering the question whether classical mechanics is deterministic can be a complicated matter. In this paper I show that indeterministic systems similar to the Norton dome were already known in the nineteenth century: I discuss four nineteenth century authors who wrote about such systems, namely Poisson, Duhamel, Boussinesq and Bertrand. However, I argue that their discussion of such systems was very different from the contemporary (...)
  • Commentary on Missimer.Christina Slade - unknown
  • Teaching Bayesian reasoning in less than two hours.Peter Sedlmeier & Gerd Gigerenzer - 2001 - Journal of Experimental Psychology: General 130 (3):380.
  • Peopling Europe through Data Practices: Introduction to the Special Issue.Stephan Scheel, Evelyn Ruppert & Baki Cakici - 2020 - Science, Technology, and Human Values 45 (2):199-211.
    Politically, Europe has been unable to address itself to a constituted polity and people as more than an agglomeration of nation-states. From the resurgence of nationalisms to the crisis of the single currency and the unprecedented decision of a member state to leave the European Union, core questions about the future of Europe have been rearticulated: Who are the people of Europe? Is there a European identity? What does it mean to say, “I am European?” Where does Europe begin and (...)
  • Biopolitical bordering: Enacting populations as intelligible objects of government.Stephan Scheel - 2020 - European Journal of Social Theory 23 (4):571-590.
    Since Foucault introduced the notion of biopolitics, it has been fiercely debated—usually in highly generalized terms—how to interpret and use this concept. This article argues that these discussions need to be situated, as biopolitics have features that do not travel from one site to the next. This becomes apparent if we attend to an aspect of biopolitics that has only received scant attention so far: the knowledge practices required to constitute populations as intelligible objects of government. To illustrate this point, (...)
  • Null Findings, Replications and Preregistered Studies in Business Ethics Research.Julia Roloff & Michael J. Zyphur - 2018 - Journal of Business Ethics 160 (3):609-619.
  • Gene.Hans-Jörg Rheinberger - 2008 - Stanford Encyclopedia of Philosophy.
  • Data politics.Didier Bigo, Engin Isin & Evelyn Ruppert - 2017 - Big Data and Society 4 (2).
    The commentary raises political questions about the ways in which data has been constituted as an object vested with certain powers, influence, and rationalities. We place the emergence and transformation of professional practices such as ‘data science’, ‘data journalism’, ‘data brokerage’, ‘data mining’, ‘data storage’, and ‘data analysis’ as part of the reconfiguration of a series of fields of power and knowledge in the public and private accumulation of data. Data politics asks questions about the ways in which data has (...)
  • Data Cleaners for Pristine Datasets: Visibility and Invisibility of Data Processors in Social Science.Jean-Christophe Plantin - 2019 - Science, Technology, and Human Values 44 (1):52-73.
    This article investigates the work of processors who curate and “clean” the data sets that researchers submit to data archives for archiving and further dissemination. Based on ethnographic fieldwork conducted at the data processing unit of a major US social science data archive, I investigate how these data processors work, under which status, and how they contribute to data sharing. This article presents two main results. First, it contributes to the study of invisible technicians in science by showing that the (...)
  • Norming Normality: On Scientific Fictions and Canonical Visualisations.Lara Huber - 2011 - Medicine Studies 3 (1):41-52.
    Taking the visual appeal of the ‘bell curve’ as an example, this paper discusses to what extent the availability of quantitative approaches (here: statistics), which comes along with representational standards, immediately affects qualitative concepts of scientific reasoning (here: normality). Within the realm of this paper I shall focus on the relationship between normality, as defined by scientific enterprise, and normativity, which results from the very processes of standardisation itself. Two hypotheses guide this analysis: (1) normality, as it is (...)
  • Robert Leslie Ellis and John Stuart Mill on the one and the many of frequentism.Berna Kilinç - 2000 - British Journal for the History of Philosophy 8 (2):251-274.
  • Scientific revolutions.Thomas Nickles - 2010 - Stanford Encyclopedia of Philosophy.
  • Jurisdiction, inscription, and state formation: administrative modernism and knowledge regimes. [REVIEW]Chandra Mukerji - 2011 - Theory and Society 40 (3):223-245.
  • William Stanley Jevons.Bert Mosselmans - 2008 - Stanford Encyclopedia of Philosophy.
  • Do the Fallacies you Favour Retard the Growth of Knowledge?Connie Missimer - unknown
    A simple way to approach fallacies is to ask, "Has reasoning-strategy X retarded or halted the growth of knowledge?" and seek uncontroversial historical events as empirical support for the fallacy moniker. Historical support also offers a means of retiring reasoning strategies heretofore thought fallacious—they are wrongly accused if they helped drive knowledge. Finally, this approach allows us to be more critical of our argumentative practices. Evidence is offered for an Intuitive Fallacy: In its extreme form it rules out the possibility (...)
  • From pool to profile: Social consequences of algorithmic prediction in insurance.Elena Esposito & Alberto Cevolini - 2020 - Big Data and Society 7 (2).
    The use of algorithmic prediction in insurance is regarded as the beginning of a new era, because it promises to personalise insurance policies and premiums on the basis of individual behaviour and level of risk. The core idea is that the price of the policy would no longer refer to the calculated uncertainty of a pool of policyholders, with the consequence that everyone would have to pay only for her real exposure to risk. For insurance, however, uncertainty is not only (...)
  • Dispersion of response times reveals cognitive dynamics.John G. Holden, Guy C. Van Orden & Michael T. Turvey - 2009 - Psychological Review 116 (2):318-342.
  • Minding Matter/Mattering Mind: Knowledge and the Subject in Nineteenth-Century Psychology.John Carson - 1999 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 30 (3):345-376.