Results for 'Statistical Relevance'

965 found
  1. Causal Conditionals, Tendency Causal Claims and Statistical Relevance.Michał Sikorski, Noah van Dongen & Jan Sprenger - 2024 - Review of Philosophy and Psychology 1:1-26.
    Indicative conditionals and tendency causal claims are closely related (e.g., Frosch and Byrne, 2012), but despite these connections, they are usually studied separately. A unifying framework could consist in their dependence on probabilistic factors such as high conditional probability and statistical relevance (e.g., Adams, 1975; Eells, 1991; Douven, 2008, 2015). This paper presents a comparative empirical study on differences between judgments on tendency causal claims and indicative conditionals, how these judgments are driven by probabilistic factors, and how these (...)
  2. On statistical criteria of algorithmic fairness.Brian Hedden - 2021 - Philosophy and Public Affairs 49 (2):209-231.
    Predictive algorithms are playing an increasingly prominent role in society, being used to predict recidivism, loan repayment, job performance, and so on. With this increasing influence has come an increasing concern with the ways in which they might be unfair or biased against individuals in virtue of their race, gender, or, more generally, their group membership. Many purported criteria of algorithmic fairness concern statistical relationships between the algorithm’s predictions and the actual outcomes, for instance requiring that the rate of (...)
    39 citations
  3. Determination, uniformity, and relevance: normative criteria for generalization and reasoning by analogy.Todd R. Davies - 1988 - In T. Davies (ed.), Analogical Reasoning. Kluwer Academic Publishers. pp. 227-250.
    This paper defines the form of prior knowledge that is required for sound inferences by analogy and single-instance generalizations, in both logical and probabilistic reasoning. In the logical case, the first order determination rule defined in Davies (1985) is shown to solve both the justification and non-redundancy problems for analogical inference. The statistical analogue of determination that is put forward is termed 'uniformity'. Based on the semantics of determination and uniformity, a third notion of "relevance" is defined, both (...)
    11 citations
  4. Inherent Properties and Statistics with Individual Particles in Quantum Mechanics.Matteo Morganti - 2009 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 40 (3):223-231.
    This paper puts forward the hypothesis that the distinctive features of quantum statistics are exclusively determined by the nature of the properties it describes. In particular, all statistically relevant properties of identical quantum particles in many-particle systems are conjectured to be irreducible, ‘inherent’ properties only belonging to the whole system. This allows one to explain quantum statistics without endorsing the ‘Received View’ that particles are non-individuals, or postulating that quantum systems obey peculiar probability distributions, or assuming that there are primitive (...)
    17 citations
  5. Theory of signs and statistical approach to big data in assessing the relevance of clinical biomarkers of inflammation and oxidative stress.Pietro Ghezzi, Kevin Davies, Aidan Delaney & Luciano Floridi - 2018 - Proceedings of the National Academy of Sciences of the United States of America 115 (10):2473-2477.
    Biomarkers are widely used not only as prognostic or diagnostic indicators, or as surrogate markers of disease in clinical trials, but also to formulate theories of pathogenesis. We identify two problems in the use of biomarkers in mechanistic studies. The first problem arises in the case of multifactorial diseases, where different combinations of multiple causes result in patient heterogeneity. The second problem arises when a pathogenic mediator is difficult to measure. This is the case of the oxidative stress (OS) theory (...)
  6. Disparate Statistics.Kevin P. Tobia - 2017 - Yale Law Journal 126 (8):2382-2420.
    Statistical evidence is crucial throughout disparate impact’s three-stage analysis: during (1) the plaintiff’s prima facie demonstration of a policy’s disparate impact; (2) the defendant’s job-related business necessity defense of the discriminatory policy; and (3) the plaintiff’s demonstration of an alternative policy without the same discriminatory impact. The circuit courts are split on a vital question about the “practical significance” of statistics at Stage 1: Are “small” impacts legally insignificant? For example, is an employment policy that causes a one percent (...)
  7. Statistical mechanics and thermodynamics: A Maxwellian view.Wayne C. Myrvold - 2011 - Studies in History and Philosophy of Science Part A 42 (4):237-243.
    One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a (...)
    16 citations
  8. (1 other version)Cultural Statistics, the Media and the Planning and Development of Calabar.Lawrence Ekwok - 2019 - GNOSI: An Interdisciplinary Journal of Human Theory and Praxis 2 (2).
    This paper, “Cultural Statistics, the Media and the Planning and Development of Calabar, Nigeria” stresses the need for the use of Cultural Statistics and effective media communication in the planning and development of Calabar, the Cross River State Capital. This position is anchored on the fact that in virtually every sphere of life, there can be no development without planning, and there can be no proper planning without accurate data or information. Cultural Statistics, and effective use of the media thus (...)
  9. Drift and “Statistically Abstractive Explanation”.Mohan Matthen - 2009 - Philosophy of Science 76 (4):464-487.
    A hitherto neglected form of explanation is explored, especially its role in population genetics. “Statistically abstractive explanation” (SA explanation) mandates the suppression of factors probabilistically relevant to an explanandum when these factors are extraneous to the theoretical project being pursued. When these factors are suppressed, the explanandum is rendered uncertain. But this uncertainty traces to the theoretically constrained character of SA explanation, not to any real indeterminacy. Random genetic drift is an artifact of such uncertainty, and it is therefore wrong (...)
    29 citations
  10. Legal Burdens of Proof and Statistical Evidence.Georgi Gardiner - 2018 - In David Coady & James Chase (eds.), Routledge Handbook of Applied Epistemology. New York: Routledge, Taylor & Francis Group.
    In order to perform certain actions – such as incarcerating a person or revoking parental rights – the state must establish certain facts to a particular standard of proof. These standards – such as preponderance of evidence and beyond reasonable doubt – are often interpreted as likelihoods or epistemic confidences. Many theorists construe them numerically; beyond reasonable doubt, for example, is often construed as 90 to 95% confidence in the guilt of the defendant. A family of influential cases suggests (...)
    34 citations
  11. Artificial intelligence and identity: the rise of the statistical individual.Jens Christian Bjerring & Jacob Busch - forthcoming - AI and Society:1-13.
    Algorithms are used across a wide range of societal sectors such as banking, administration, and healthcare to make predictions that impact on our lives. While the predictions can be incredibly accurate about our present and future behavior, there is an important question about how these algorithms in fact represent human identity. In this paper, we explore this question and argue that machine learning algorithms represent human identity in terms of what we shall call the statistical individual. This statisticalized representation (...)
  12. Reality in a Few Thermodynamic Reference Frames: Statistical Thermodynamics From Boltzmann via Gibbs to Einstein.Vasil Penchev - 2020 - Philosophy of Science eJournal (Elsevier: SSRN) 13 (33):1-14.
    The success of a few theories in statistical thermodynamics can be correlated with their selectivity to reality. These are the theories of Boltzmann, Gibbs, and Einstein. The starting point is Carnot’s theory, which defines implicitly the general selection of reality relevant to thermodynamics. The three other theories share this selection, but specify it further in detail. Each of them separates a few main aspects within the scope of the implicit thermodynamic reality. Their success grounds on that selection. Those aspects (...)
  13. Reducing Emergence: The Case Studies in Statistic Thermodynamics, General Relativity, and Quantum Mechanics.Vasil Penchev - 2020 - Epistemology eJournal (Elsevier: SSRN) 13 (23):1-3.
    Emergent properties are properties that refer to a system as a whole but do not make sense for its elements or sufficiently small parts. Furthermore, certain emergent properties are often reducible to those of its elements or relevant parts. The paper concerns the special case where the description of the system by means of its emergent properties is much simpler than that of its relevant elements or parts. The concept is investigated by a case study based on statistical thermodynamics, (...)
  14. Relevance of the history of concepts for psychopathology and the other sciences of mind: introspection as a case in point.Massimiliano Aragona - 2013 - Dialogues in Philosophy, Mental and Neuro Sciences (1):1-3.
    Sometimes the same concept is discussed independently, yet at the same time, in different disciplinary fields. The recent dominance of neuroscientific research has reintroduced into the experimental realm the importance of the experimental subject’s self-evaluation, to be correlated with detectable changes in brain activity. For example, experimental subjects are instructed to press a button or move a finger when they perceive or feel something, or they fill in questionnaires supposed to measure their experience; all these “data” are (...)
  15. Consciousness without Report: Insights from Summary Statistics and Inattention ‘Blindness’.Marius Usher, Zohar Bronfman, Shiri Talmor, Hilla Jacobson & Baruch Eitam - 2018 - Philosophical Transactions of the Royal Society B: Biological Sciences 373 (1755).
    We contrast two theoretical positions on the relation between phenomenal and access consciousness. First, we discuss previous data supporting a mild Overflow position, according to which transient visual awareness can overflow report. These data are open to two interpretations: (i) observers transiently experience specific visual elements outside attentional focus without encoding them into working memory; (ii) no specific visual elements but only statistical summaries are experienced in such conditions. We present new data showing that under data-limited conditions observers cannot (...)
    3 citations
  16. Conflict Contagion.Marie Oldfield - 2015 - Institute of Mathematics and its Applications 1.
    With an increased emphasis on upstream activity and Defence Engagement, it has become increasingly more important for the UK Ministry of Defence (MOD) and government to understand the relationship between conflict and regional instability. As part of this process, the Historical and Operational Data Analysis Team (HODA) in Defence Science and Technology Laboratory (Dstl) was tasked to look at factors that influenced the regional spread of internal conflicts to help aid the decision making of government. Conflict contagion is the process (...)
  17. Probabilistic causation and the explanatory role of natural selection.Pablo Razeto-Barry & Ramiro Frick - 2011 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 42 (3):344-355.
    The explanatory role of natural selection is one of the long-standing debates in evolutionary biology. Nevertheless, the consensus has been slippery because of conceptual confusions and the absence of a unified, formal causal model that integrates the different explanatory scopes of natural selection. In this study we attempt to examine two questions: (i) What can the theory of natural selection explain? and (ii) Is there a causal or explanatory model that integrates all natural selection explananda? For the first question, we argue that (...)
    9 citations
  18. Which Models of Scientific Explanation Are (In)Compatible with Inference to the Best Explanation?Yunus Prasetya - 2024 - British Journal for the Philosophy of Science 75 (1):209-232.
    In this article, I explore the compatibility of inference to the best explanation (IBE) with several influential models and accounts of scientific explanation. First, I explore the different conceptions of IBE and limit my discussion to two: the heuristic conception and the objective Bayesian conception. Next, I discuss five models of scientific explanation with regard to each model’s compatibility with IBE. I argue that Kitcher’s unificationist account supports IBE; Railton’s deductive–nomological–probabilistic model, Salmon’s statistical-relevance model, and van Fraassen’s erotetic (...)
    4 citations
  19. Explicação Científica.Eduardo Castro - 2020 - Compêndio Em Linha de Problemas de Filosofia Analítica.
    Opinionated state of the art paper on scientific explanation. Analysis and discussion of the most relevant models and theories in the contemporary literature, namely, the deductive-nomological model, the models of inductive-statistical and statistical relevance, the pragmatic theory of why questions, the unifying theory of standard arguments, and the causal/non-causal counterfactual theory.
    1 citation
  20. Evidence, Risk, and Proof Paradoxes: Pessimism about the Epistemic Project.Giada Fratantonio - 2021 - International Journal of Evidence and Proof:online first.
    Why can testimony alone be enough for findings of liability? Why can’t statistical evidence alone be enough? These questions underpin the “Proof Paradox” (Redmayne 2008, Enoch et al. 2012). Many epistemologists have attempted to explain this paradox from a purely epistemic perspective. I call it the “Epistemic Project”. In this paper, I take a step back from this recent trend. Stemming from considerations about the nature and role of standards of proof, I define three requirements that any successful account in line (...)
    9 citations
  21. Why Does Laudan’s Confutation of Convergent Realism Fail?Antonio Diéguez-Lucena - 2006 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 37 (2):393 - 403.
    In his paper "A Confutation of Convergent Realism", Larry Laudan offered one of the most powerful criticisms of scientific realism. I defend here that although Laudan's criticism is right, this does not refute the realist position. The thesis that Laudan confutes is a much stronger thesis than the realist needs to maintain. As I will exemplify with Salmon's statistical-relevance model, a less strict notion of explanation would allow us to claim that (approximate) truth is the best explanation for such (...)
    1 citation
  22. Legal evidence and knowledge.Georgi Gardiner - 2024 - In Maria Lasonen-Aarnio & Clayton Littlejohn (eds.), The Routledge Handbook of the Philosophy of Evidence. New York, NY: Routledge.
    This essay is an accessible introduction to the proof paradox in legal epistemology. In 1902 the Supreme Judicial Court of Maine filed an influential legal verdict. The judge claimed that in order to find a defendant culpable, the plaintiff “must adduce evidence other than a majority of chances”. The judge thereby claimed that bare statistical evidence does not suffice for legal proof. In this essay I first motivate the claim that bare statistical evidence does not suffice (...)
    4 citations
  23. Dissertation Abstract - Math Over Mechanism: Proposing the Rational-Relational Theory of Scientific Explanation in Light of Impinging Constraints of New Mechanism.Jenny Nielsen - forthcoming - In ProQuest.
    In this dissertation I achieve the following: (1) I present motivating criteria for a general comprehensive theory of scientific explanation. I review historical approaches to modeling explanation in light of these criteria. (2) I present New Mechanist Explanation ("NME") as the leading candidate for a contemporary, complete theory of scientific explanation. (3) I present constraints on the applicability of New Mechanism in modeling biology, chemistry, and physics. I argue for the unsuitability of NME as a candidate for a general theory (...)
  24. Are we Living in a (Quantum) Simulation? – Constraints, observations, and experiments on the simulation hypothesis.Anders Indset, Florian Neukart, Markus Pflitsch & Michael R. Perelshtein - manuscript
    The God Experiment – Let there be Light. The question “What is real?” can be traced back to the shadows in Plato’s cave. Two thousand years later, René Descartes lacked knowledge about arguing against an evil deceiver feeding us the illusion of sensation. Descartes’ epistemological concept later led to various theories of what our sensory experiences actually are. The concept of “illusionism”, proposing that even the very conscious experience we have – our qualia – is an illusion, is not (...)
  25. Sensitivity, safety, and the law: A reply to Pardo.David Enoch & Levi Spectre - 2019 - Legal Theory 25 (3):178-199.
    In a recent paper, Michael Pardo argues that the epistemic property that is legally relevant is the one called Safety, rather than Sensitivity. In the process, he argues against our Sensitivity-related account of statistical evidence. Here we revisit these issues, partly in order to respond to Pardo, and partly in order to make general claims about legal epistemology. We clarify our account, we show how it adequately deals with counterexamples and other worries, we raise suspicions about Safety's value here, (...)
    12 citations
  26. Non-factive Understanding: A Statement and Defense.Yannick Doyle, Spencer Egan, Noah Graham & Kareem Khalifa - 2019 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 50 (3):345-365.
    In epistemology and philosophy of science, there has been substantial debate about truth’s relation to understanding. “Non-factivists” hold that radical departures from the truth are not always barriers to understanding; “quasi-factivists” demur. The most discussed example concerns scientists’ use of idealizations in certain derivations of the ideal gas law from statistical mechanics. Yet, these discussions have suffered from confusions about the relevant science, as well as conceptual confusions. Addressing this example, we shall argue that the ideal gas law is (...)
    11 citations
  27. Explanatory reasoning in the material theory of induction.William Peden - 2022 - Metascience 31 (3):303-309.
    In his recent book, John Norton has created a theory of inference to the best explanation, within the context of his "material theory of induction". I apply it to the problem of scientific explanations that are false: if we want the theories in our explanations to be true, then why do historians and scientists often say that false theories explained phenomena? I also defend Norton's theory against some possible objections.
    1 citation
  28. Reviving Frequentism.Mario Hubert - 2021 - Synthese 199:5255–5584.
    Philosophers now seem to agree that frequentism is an untenable strategy to explain the meaning of probabilities. Nevertheless, I want to revive frequentism, and I will do so by grounding probabilities on typicality in the same way as the thermodynamic arrow of time can be grounded on typicality within statistical mechanics. This account, which I will call typicality frequentism, will evade the major criticisms raised against previous forms of frequentism. In this theory, probabilities arise within a physical theory from (...)
    5 citations
  29. A Weibull Wearout Test: Full Bayesian Approach.Julio Michael Stern, Telba Zalkind Irony, Marcelo de Souza Lauretto & Carlos Alberto de Braganca Pereira - 2001 - Reliability and Engineering Statistics 5:287-300.
    The Full Bayesian Significance Test (FBST) for precise hypotheses is presented, with some applications relevant to reliability theory. The FBST is an alternative to significance tests or, equivalently, to p-values. In the FBST we compute the evidence of the precise hypothesis. This evidence is the probability of the complement of a credible set "tangent" to the sub-manifold (of the parameter space) that defines the null hypothesis. We use the FBST in an application requiring a quality control of used components, based (...)
  30. (2 other versions)The explanation game: a formal framework for interpretable machine learning.David S. Watson & Luciano Floridi - 2020 - Synthese 198 (10):1–32.
    We propose a formal framework for interpretable machine learning. Combining elements from statistical learning, causal interventionism, and decision theory, we design an idealised explanation game in which players collaborate to find the best explanation for a given algorithmic prediction. Through an iterative procedure of questions and answers, the players establish a three-dimensional Pareto frontier that describes the optimal trade-offs between explanatory accuracy, simplicity, and relevance. Multiple rounds are played at different levels of abstraction, allowing the players to explore (...)
    16 citations
  31. Some Libertarian Ideas about Human Social Life.Gheorghe-Ilie Farte - 2012 - Argumentum. Journal of the Seminar of Discursive Logic, Argumentation Theory and Rhetoric 10 (2):07-19.
    The central thesis of my article is that people live a life worthy of a human being only as self-ruling members of some autarchic (or self-governing) communities. On the one hand, nobody is born as a self-ruling individual, and on the other hand, everybody can become such a person by observing progressively the non-aggression principle and, ipso facto, by behaving as a moral being. A self-ruling person has no interest in controlling her neighbors, but in mastering his own impulses, needs, (...)
    1 citation
  32. What If Plato Took Surveys? Thoughts about Philosophy Experiments.William M. Goodman - 2012 - In Patricia Hanna (ed.), An Anthology of Philosophical Studies - Volume 6. Athiner.
    The movement called Experimental Philosophy (‘x-Phi’) has now passed its tenth anniversary. Its central insight is compelling: When an argument hinges on accepting certain ‘facts’ about human perception, knowledge, or judging, the evoking of relevant intuitions by thought experiments is intended to make those facts seem obvious. But these intuitions may not be shared universally. Experimentalists propose testing claims that traditionally were intuition-based using real experiments, with real samples. Demanding that empirical claims be empirically supported is certainly reasonable; though experiments (...)
  33. Preregistration Does Not Improve the Transparent Evaluation of Severity in Popper’s Philosophy of Science or When Deviations are Allowed.Mark Rubin - manuscript
    One justification for preregistering research hypotheses, methods, and analyses is that it improves the transparent evaluation of the severity of hypothesis tests. In this article, I consider two cases in which preregistration does not improve this evaluation. First, I argue that, although preregistration can facilitate the transparent evaluation of severity in Mayo’s error statistical philosophy of science, it does not facilitate this evaluation in Popper’s theory-centric approach. To illustrate, I show that associated concerns about Type I error rate inflation (...)
  34. Wright’s path analysis: Causal inference in the early twentieth century.Zili Dong - 2024 - Theoria. An International Journal for Theory, History and Foundations of Science 39 (1):67–88.
    Despite being a milestone in the history of statistical causal inference, Sewall Wright’s 1918 invention of path analysis did not receive much immediate attention from the statistical and scientific community. Through a careful historical analysis, this paper reveals some previously overlooked philosophical issues concerning the history of causal inference. Placing the invention of path analysis in a broader historical and intellectual context, I portray the scientific community’s initial lack of interest in the method as a natural consequence of (...)
  35. Mathematical Explanations in Evolutionary Biology or Naturalism? A Challenge for the Statisticalist.Fabio Sterpetti - 2021 - Foundations of Science 27 (3):1073-1105.
    This article presents a challenge that those philosophers who deny the causal interpretation of explanations provided by population genetics might have to address. Indeed, some philosophers, known as statisticalists, claim that the concept of natural selection is statistical in character and cannot be construed in causal terms. On the contrary, other philosophers, known as causalists, argue against the statistical view and support the causal interpretation of natural selection. The problem I am concerned with here arises for the statisticalists (...)
    1 citation
  36. Evaluating Methods of Correcting for Multiple Comparisons Implemented in SPM12 in Social Neuroscience fMRI Studies: An Example from Moral Psychology.Hyemin Han & Andrea L. Glenn - 2018 - Social Neuroscience 13 (3):257-267.
    In fMRI research, the goal of correcting for multiple comparisons is to identify areas of activity that reflect true effects, and thus would be expected to replicate in future studies. Finding an appropriate balance between trying to minimize false positives (Type I error) while not being too stringent and omitting true effects (Type II error) can be challenging. Furthermore, the advantages and disadvantages of these types of errors may differ for different areas of study. In many areas of social neuroscience (...)
  37. Initial Conditions as Exogenous Factors in Spatial Explanation.Clint Ballinger - 2008 - Dissertation, University of Cambridge
    This dissertation shows how initial conditions play a special role in the explanation of contingent and irregular outcomes, including, in the form of geographic context, the special case of uneven development in the social sciences. The dissertation develops a general theory of this role, recognizes its empirical limitations in the social sciences, and considers how it might be applied to the question of uneven development. The primary purpose of the dissertation is to identify and correct theoretical problems in the study (...)
    2 citations
  38. The politics of uncertainty.Luciano Floridi - 2015 - Philosophy and Technology 28 (1):1-4.
    What is uncertainty? There are of course several possible definitions, offered by different fields, from epistemology to statistics, but, in the background, one usually finds some kind of relation with the lack of information, in the following sense. Suppose we define semantic or factual information as the combination of a question plus the relevant, correct answer. If one has both the question and the correct answer, one is informed: “was Berlin the capital of Germany in 2010? Yes”. If one has (...)
    6 citations
  39. g as bridge model.Devin Sanchez Curry - 2021 - Philosophy of Science 88 (5):1067-1078.
    Psychometric g—a statistical factor capturing intercorrelations between scores on different IQ tests—is of theoretical interest despite being a low-fidelity model of both folk psychological intelligence and its cognitive/neural underpinnings. Psychometric g idealizes away from those aspects of cognitive/neural mechanisms that are not explanatory of the relevant variety of folk psychological intelligence, and it idealizes away from those varieties of folk psychological intelligence that are not generated by the relevant cognitive/neural substrate. In this manner, g constitutes a high-fidelity bridge model (...)
    2 citations
  40. Black Hole Paradoxes: A Unified Framework for Information Loss.Saakshi Dulani - 2024 - Dissertation, University of Geneva
    The black hole information loss paradox is a catch-all term for a family of puzzles related to black hole evaporation. For almost 50 years, the quest to elucidate the implications of black hole evaporation has not only sustained momentum, but has also become increasingly populated with proposals that seem to generate more questions than they purport to answer. Scholars often neglect to acknowledge ongoing discussions within black hole thermodynamics and statistical mechanics when analyzing the paradox, including the interpretation of (...)
  41. Truth, knowledge, and the standard of proof in criminal law.Clayton Littlejohn - 2020 - Synthese 197 (12):5253-5286.
    Could it be right to convict and punish defendants using only statistical evidence? In this paper, I argue that it is not and explain why it would be wrong. This is difficult to do because there is a powerful argument for thinking that we should convict and punish defendants using statistical evidence. It looks as if the relevant cases are cases of decision under risk and it seems we know what we should do in such cases (i.e., maximize (...)
    62 citations
  42. Two Problems of Direct Inference.Paul D. Thorn - 2012 - Erkenntnis 76 (3):299-318.
    The article begins by describing two longstanding problems associated with direct inference. One problem concerns the role of uninformative frequency statements in inferring probabilities by direct inference. A second problem concerns the role of frequency statements with gerrymandered reference classes. I show that past approaches to the problem associated with uninformative frequency statements yield the wrong conclusions in some cases. I propose a modification of Kyburg’s approach to the problem that yields the right conclusions. Past theories of direct inference have (...)
    17 citations
  43. Explaining Thermodynamic-Like Behavior in Terms of Epsilon-Ergodicity.Roman Frigg & Charlotte Werndl - 2011 - Philosophy of Science 78 (4):628-652.
    Gases reach equilibrium when left to themselves. Why do they behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the systems have to be ergodic. This answer has been criticised on different grounds and is now widely regarded as flawed. In this paper we argue that some of the main arguments against Boltzmann's answer, in particular, arguments based on the KAM-theorem and the Markus-Meyer theorem, are beside the point. We then argue that something (...)
    29 citations
  44. Machine learning in bail decisions and judges’ trustworthiness.Alexis Morin-Martel - 2023 - AI and Society:1-12.
    The use of AI algorithms in criminal trials has been the subject of very lively ethical and legal debates recently. While there are concerns over the lack of accuracy and the harmful biases that certain algorithms display, new algorithms seem more promising and might lead to more accurate legal decisions. Algorithms seem especially relevant for bail decisions, because such decisions involve statistical data to which human reasoners struggle to give adequate weight. While getting the right legal outcome is a (...)
    3 citations
  45. A new approach to the approach to equilibrium.Roman Frigg & Charlotte Werndl - 2012 - In Yemima Ben-Menahem & Meir Hemmo (eds.), Probability in Physics. Springer. pp. 99-114.
    Consider a gas confined to the left half of a container. Then remove the wall separating the two parts. The gas will start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Why does the gas behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the system has to be ergodic for the approach to equilibrium to take place. This answer has been criticised on different grounds (...)
    11 citations
  46. How People Judge What Is Reasonable.Kevin P. Tobia - 2018 - Alabama Law Review 70 (2):293-359.
    A classic debate concerns whether reasonableness should be understood statistically (e.g., reasonableness is what is common) or prescriptively (e.g., reasonableness is what is good). This Article elaborates and defends a third possibility. Reasonableness is a partly statistical and partly prescriptive “hybrid,” reflecting both statistical and prescriptive considerations. Experiments reveal that people apply reasonableness as a hybrid concept, and the Article argues that a hybrid account offers the best general theory of reasonableness. First, the Article investigates how ordinary (...)
    13 citations
  47. Difference and Robustness in the Patterns of Philosophical Intuition Across Demographic Groups.Joshua Knobe - 2023 - Review of Philosophy and Psychology 14 (2):435-455.
    In a recent paper, I argued that philosophical intuitions are surprisingly robust both across demographic groups and across development. Machery and Stich reply by reviewing a series of studies that do show significant differences in philosophical intuition between different demographic groups. This is a helpful point, which gets at precisely the issues that are most relevant here. However, even when one looks at those very studies, one finds truly surprising robustness. In other words, despite the presence of statistically significant differences (...)
    6 citations
  48. Defeasible Conditionalization.Paul D. Thorn - 2014 - Journal of Philosophical Logic 43 (2-3):283-302.
    The applicability of Bayesian conditionalization in setting one’s posterior probability for a proposition, α, is limited to cases where the value of a corresponding prior probability, P_PRI(α|∧E), is available, where ∧E represents one’s complete body of evidence. In order to extend probability updating to cases where the prior probabilities needed for Bayesian conditionalization are unavailable, I introduce an inference schema, defeasible conditionalization, which allows one to update one’s personal probability in a proposition by conditioning on a proposition that represents a (...)
    4 citations
  49. Does my total evidence support that I’m a Boltzmann Brain?Sinan Dogramaci - 2020 - Philosophical Studies 177 (12):3717-3723.
    A Boltzmann Brain, haphazardly formed through the unlikely but still possible random assembly of physical particles, is a conscious brain having experiences just like an ordinary person. The skeptical possibility of being a Boltzmann Brain is an especially gripping one: scientific evidence suggests our actual universe’s full history may ultimately contain countless short-lived Boltzmann Brains with experiences just like yours or mine. I propose a solution to the skeptical challenge posed by these countless actual Boltzmann Brains. My key idea is (...)
    8 citations
  50. From The Principle Of Least Action To The Conservation Of Quantum Information In Chemistry: Can One Generalize The Periodic Table?Vasil Penchev - 2019 - Chemistry: Bulgarian Journal of Science Education 28 (4):525-539.
    The success of a few theories in statistical thermodynamics can be correlated with their selectivity to reality. These are the theories of Boltzmann, Gibbs, and Einstein. The starting point is Carnot’s theory, which defines implicitly the general selection of reality relevant to thermodynamics. The three other theories share this selection, but specify it further in detail. Each of them separates a few main aspects within the scope of the implicit thermodynamic reality. Their success grounds on that selection. Those aspects (...)
    1 citation
1 — 50 / 965