The inflation of Type I error rates is thought to be one of the causes of the replication crisis. Questionable research practices such as p-hacking are thought to inflate Type I error rates above their nominal level, leading to unexpectedly high levels of false positives in the literature and, consequently, unexpectedly low replication rates. In this article, I offer an alternative view. I argue that questionable and other research practices do not usually inflate relevant Type I error rates. I begin with an introduction to Type I error rates that distinguishes them from theoretical errors. I then illustrate my argument with respect to model misspecification, multiple testing, selective inference, forking paths, exploratory analyses, p-hacking, optional stopping, double dipping, and HARKing. In each case, I demonstrate that relevant Type I error rates are not usually inflated above their nominal level, and in the rare cases that they are, the inflation is easily identified and resolved. I conclude that the replication crisis may be explained, at least in part, by researchers’ misinterpretation of statistical errors and their underestimation of theoretical errors.
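As an illustrative aside (not part of the abstract above), a minimal Python sketch shows how Type I error rates are estimated by simulation: with many true-null tests, the per-comparison rate stays near its nominal level, and a standard Bonferroni adjustment keeps the familywise rate at or below it. The sample size, number of tests, and alpha below are arbitrary choices.

# Illustrative sketch (not from the article): estimating Type I error rates
# under a true null hypothesis, with and without a multiplicity adjustment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, n, m, reps = 0.05, 30, 10, 2000  # nominal level, per-group sample size, tests per family, simulated studies

per_test_rejections = 0
familywise_rejections = 0
for _ in range(reps):
    # m independent two-sample t-tests in which the null is true (both groups drawn from N(0, 1))
    pvals = [stats.ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue for _ in range(m)]
    per_test_rejections += sum(p < alpha for p in pvals)        # uncorrected, per comparison
    familywise_rejections += any(p < alpha / m for p in pvals)  # Bonferroni-corrected family

print("per-comparison Type I error rate:", per_test_rejections / (reps * m))       # close to 0.05
print("familywise Type I error rate (Bonferroni):", familywise_rejections / reps)  # at or below 0.05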
The major competing statistical paradigms share a remarkable but unremarked common thread: in many of their inferential applications, different probability interpretations are combined. How this plays out in different theories of inference depends on the type of question asked. We distinguish four question types: confirmation, evidence, decision, and prediction. We show that Bayesian confirmation theory mixes what are intuitively “subjective” and “objective” interpretations of probability, whereas the likelihood-based account of evidence melds three conceptions of what constitutes an “objective” probability.
I argue that when we use ‘probability’ language in epistemic contexts—e.g., when we ask how probable some hypothesis is, given the evidence available to us—we are talking about degrees of support, rather than degrees of belief. The epistemic probability of A given B is the mind-independent degree to which B supports A, not the degree to which someone with B as their evidence believes A, or the degree to which someone would or should believe A if they had B as their evidence. My central argument is that the degree-of-support interpretation lets us better model good reasoning in certain cases involving old evidence. Degree-of-belief interpretations make the wrong predictions not only about whether old evidence confirms new hypotheses, but about the values of the probabilities that enter into Bayes’ Theorem when we calculate the probability of hypotheses conditional on old evidence and new background information.
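For reference, the schematic form of Bayes’ Theorem at issue (stated generically; the paper’s own notation may differ) is

$$P(H \mid E \wedge B) \;=\; \frac{P(E \mid H \wedge B)\, P(H \mid B)}{P(E \mid B)},$$

where H is the hypothesis, E the (old) evidence, and B the background information; the dispute concerns what values the probabilities on the right-hand side should take when E is already part of one’s evidence.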
In this paper, we illustrate some serious difficulties involved in conveying information about uncertain risks and securing informed consent for risky interventions in a clinical setting. We argue that in order to secure informed consent for a medical intervention, physicians often need to do more than report a bare, numerical probability value. When probabilities are given, securing informed consent generally requires communicating how probability expressions are to be interpreted and communicating something about the quality and quantity of the evidence for the probabilities reported. Patients may also require guidance on how probability claims may or may not be relevant to their decisions, and physicians should be ready to help patients understand these issues.
One popular approach to statistical mechanics understands statistical mechanical probabilities as measures of rational indifference. Naive formulations of this “indifference approach” face reversibility worries - while they yield the right prescriptions regarding future events, they yield the wrong prescriptions regarding past events. This paper begins by showing how the indifference approach can overcome the standard reversibility worries by appealing to the Past Hypothesis. But, the paper argues, positing a Past Hypothesis doesn't free the indifference approach from all reversibility worries. For while appealing to the Past Hypothesis allows it to escape one kind of reversibility worry, it makes it susceptible to another - the Meta-Reversibility Objection. And there is no easy way for the indifference approach to escape the Meta-Reversibility Objection. As a result, reversibility worries pose a steep challenge to the viability of the indifference approach.
This paper develops a probabilistic analysis of conditionals which hinges on a quantitative measure of evidential support. In order to spell out the interpretation of ‘if’ suggested, we will compare it with two more familiar interpretations, the suppositional interpretation and the strict interpretation, within a formal framework which rests on fairly uncontroversial assumptions. As it will emerge, each of the three interpretations considered exhibits specific logical features that deserve separate consideration.
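As a point of reference (the abstract does not specify which measure of evidential support the paper adopts), one familiar candidate is the difference measure, on which the degree to which A supports C is

$$s(C, A) \;=\; P(C \mid A) - P(C \mid \neg A),$$

whereas the suppositional interpretation ties ‘if A, C’ to the conditional probability $P(C \mid A)$ alone, and the strict interpretation is often rendered probabilistically as the requirement that $P(C \mid A) = 1$.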
Current planetary defense policy prioritizes a probability assessment of the risk of Earth impact by an asteroid or a comet in the planning of detection and mitigation strategies and in setting the levels of urgency and budgeting to operationalize them. The result has been a focus on asteroids of Tunguska size, which could destroy a city or a region, since this is the most likely sort of object we would need to defend against. However, a complete risk assessment would consider not only the probability of an impact but also the magnitude of its consequences, which in the case of an object of Chicxulub size could be the end of civilization or even human extinction. This paper argues that a planetary defense policy based on a complete (or one could say genuine) risk assessment would justify expenditures much higher than at present.
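A toy expected-loss comparison (all figures below are hypothetical placeholders, used only to illustrate the form of the argument) shows how a vanishingly small impact probability can still dominate once the magnitude of the consequences is factored in:

# Purely illustrative sketch with hypothetical figures: expected loss per century
# = (probability of impact per century) x (loss if the impact occurs).
scenarios = {
    "Tunguska-scale (regional damage)":      {"p_per_century": 1e-1, "loss": 1e11},  # hypothetical values
    "Chicxulub-scale (civilization-ending)": {"p_per_century": 1e-6, "loss": 1e17},  # hypothetical values
}
for name, s in scenarios.items():
    expected_loss = s["p_per_century"] * s["loss"]
    print(f"{name}: expected loss per century = {expected_loss:.1e}")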
Why is the concept of truth so important to us? After all, it is not at all obvious why human intelligence would have evolved to do anything other than to dissimulate, deceive, cheat, and trick. Pragmatic genealogies like the genealogies of the value of truth told by Nietzsche and Williams can help us grasp why we think as we do. But instead of explaining concepts by tracing them to antecedent objects in reality, they trace them to practical needs and reverse-engineer the functions performed by the concepts.
The epistemic probability of A given B is the degree to which B evidentially supports A, or makes A plausible. This paper is a first step in answering the question of what determines the values of epistemic probabilities. I break this question into two parts: the structural question and the substantive question. Just as an object’s weight is determined by its mass and gravitational acceleration, some probabilities are determined by other, more basic ones. The structural question asks what probabilities are not determined in this way—these are the basic probabilities which determine values for all other probabilities. The substantive question asks how the values of these basic probabilities are determined. I defend an answer to the structural question on which basic probabilities are the probabilities of atomic propositions conditional on potential direct explanations. I defend this against the view, implicit in orthodox mathematical treatments of probability, that basic probabilities are the unconditional probabilities of complete worlds. I then apply my answer to the structural question to clear up common confusions in expositions of Bayesianism and shed light on the “problem of the priors.”
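To fix ideas (a generic illustration, not the paper’s own formalism), the sense in which some probabilities are determined by more basic ones is familiar from the law of total probability:

$$P(A) \;=\; \sum_i P(A \mid E_i)\, P(E_i),$$

where the $E_i$ are mutually exclusive and jointly exhaustive. If the probabilities of an atomic proposition conditional on its potential direct explanations $E_i$ are taken as basic, then unconditional values such as $P(A)$ come out as derived rather than basic.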
A historical review and philosophical look at the introduction of “negative probability” as well as “complex probability” is suggested. The generalization of “probability” is forced by mathematical models in physical or technical disciplines. Initially, these quantities are involved only as an auxiliary tool, complementing mathematical models so that the corresponding operations are complete. Afterwards, they acquire ontological status, especially in quantum mechanics and its formulation as a natural information theory, as “quantum information,” after the experimental confirmation of the phenomena of “entanglement.” Philosophical interpretations appear. A generalization of them is suggested: ontologically, they correspond to a relevant generalization of the relation of a part and its whole, in which the whole is a subset of the part rather than vice versa. The structure of “vector space” is necessarily involved in order to distinguish the part “by itself” from the part in relation to the whole, as a projection within it. That difference is reflected in the new dimension of the vector space, both mathematically and conceptually. Then “negative or complex probability” is interpreted as a quantity corresponding to the generalized case, in which the part can be “bigger” than the whole and is in general only partly represented within it.
A new article, published on 19 May 2020 with corresponding author Nguyễn Minh Hoàng, a PhD candidate and researcher at the ISR Center, presents a Bayesian statistical approach to research on social science data. It is a result of the research direction of the SDAG group, set out clearly as early as 18 May 2019.
Abstract: A measurement result is never absolutely accurate: it is affected by an unknown “measurement error” which characterizes the discrepancy between the obtained value and the “true value” of the quantity intended to be measured. As a consequence, to be acceptable a measurement result cannot take the form of a unique numerical value, but has to be accompanied by an indication of its “measurement uncertainty”, which enunciates a state of doubt. What, though, is the value of measurement uncertainty? What is its numerical value: how does one calculate it? What is its epistemic value: how should one interpret a measurement result? Firstly, we describe the statistical models that scientists use in contemporary metrology to perform an uncertainty analysis, and we show that the issue of the interpretation of probabilities is vigorously debated. This debate brings out epistemological issues about the nature and function of physical measurements, metrologists insisting in particular on the subjective aspect of measurement. Secondly, we examine the philosophical elaboration of metrologists in their technical works, where they criticize the use of the notion of the “true value” of a physical quantity. We then challenge this elaboration and defend such a notion. The third part turns to a specific use of measurement uncertainty in order to address our theme from the perspective of precision physics, considering the activity of the adjustment of physical constants. In the course of this activity, physicists have developed a dynamic conception of the accuracy of their measurement results, oriented towards future progress of knowledge, and underlining the epistemic virtues of a never-ending process of identification and correction of measurement errors.
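As a concrete point of reference (a standard textbook calculation, not drawn from the paper), the GUM-style Type A evaluation of measurement uncertainty takes repeated readings and reports the experimental standard deviation of the mean; a minimal sketch, with made-up readings:

# Type A evaluation sketch: the standard uncertainty of a repeated measurement
# taken as the experimental standard deviation of the mean (s / sqrt(n)).
import statistics

readings = [9.812, 9.809, 9.815, 9.811, 9.808]  # hypothetical repeated observations
n = len(readings)
best_estimate = statistics.mean(readings)
standard_uncertainty = statistics.stdev(readings) / n ** 0.5
print(f"result: {best_estimate:.4f} ± {standard_uncertainty:.4f}")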
Some Stochastic Processes with Jumps. Hoàng Thị Phương Thảo. Doctoral dissertation, VNU University of Science, Vietnam National University, Hanoi. Hanoi, 2015.
This doctoral dissertation investigates the notion of physical necessity. Specifically, it studies whether it is possible to account for non-accidental regularities without the standard assumption of a pre-existent set of governing laws. Thus, it sides with the so-called deflationist accounts of laws of nature, like the Humean or the antirealist. The specific aim is to complement such accounts by providing a missing explanation of the appearance of physical necessity. To provide this explanation, I draw on fields that have not so far been appealed to in discussions about the metaphysics of laws, namely complex systems theory and the foundations of statistical mechanics. The explanation proposed is inspired by how complex systems theory has elucidated the way patterns emerge, and by the probabilistic explanations of the second law of thermodynamics. More specifically, this thesis studies how constraints that make no direct reference to the dynamics can be a sufficient condition for obtaining, in the long run and with high probability, stable regular behavior. I hope to show how certain metaphysical accounts of laws might benefit from the insights achieved in these other fields. According to the proposal studied in this thesis, some regularities are non-accidental, but not in virtue of an underlying physical necessity: the non-accidental character of certain regular behavior is due only to its overwhelming stability. From this point of view, the goal becomes to explain the stability of temporal patterns without assuming a set of pre-existent guiding laws. It is argued that this stability can be the result of a process of convergence to simpler and stable regularities from a more complex lower level. If this project is successful, there would be no need to postulate a (mysterious) intermediate category between logical necessity and pure contingency; similarly, there would be no need to postulate a (mysterious) set of pre-existent governing laws. Part I of the thesis motivates part II, mostly by arguing why further explanation of the notions of physical necessity and governing laws should be welcomed (chapter 1), and by studying the plausibility of a lawless fundamental level (chapters 2 and 3). Part II then develops the explanation of the formation of simpler and stable behavior from a more complex underlying level.
Abstract: The opening section outlines probabilism in 20th-century philosophy and briefly discusses the major accomplishments of Polish probabilist thinkers. A concise characterization of Bayesianism as the major recent form of probabilism follows; it builds from the core personalist version of Bayesianism towards more objectively oriented versions. The problem of a priori probability is briefly discussed. A tentative characterization of Kazimierz Ajdukiewicz’s standpoint regarding inductive inference is cast in Bayesian terms, and his objections against it in Pragmatic Logic are presented. His 1958 paper on the justification of non-deductive inference, as amply demonstrated by K. Szaniawski and I. Niiniluoto, extends his earlier Bayesian position from his 1928 monograph. In the closing section, Ajdukiewicz’s standpoint is presented as a characteristically pragmatist and empiricist version of Bayesianism, which remains an unexplored and stimulating position.
In this paper the strategy for the eliminative reduction of the alethic modalities suggested by John Venn is outlined and it is shown to anticipate certain related contemporary empiricistic and nominalistic projects. Venn attempted to reduce the alethic modalities to probabilities, and thus suggested a promising solution to the nagging issue of the inclusion of modal statements in empiricistic philosophical systems. However, despite the promise that this suggestion held for laying the ‘ghost of modality’ to rest, this general approach, tempered modal eliminativism, is shown to be inadequate for that task.
This dissertation looks at a set of interconnected questions concerning the foundations of probability, and gives a series of interconnected answers. At its core is a piece of old-fashioned philosophical analysis, working out what probability is, or, equivalently, investigating the semantic question: what is the meaning of ‘probability’? Like Keynes and Carnap, I say that probability is degree of reasonable belief. This immediately raises an epistemological question: which degrees count as reasonable? To solve that in its full generality would mean the end of human inquiry, so that won’t be attempted here. Rather, I will follow tradition and merely investigate which sets of partial beliefs are coherent.

The standard answer to this question, commonly called the Bayesian answer, says that degrees of belief are coherent iff they form a probability function. I disagree with the way this is usually justified, but subject to an important qualification I accept the answer. The important qualification is that degrees of belief may be imprecise, or vague. Part one of the dissertation, chapters 1 to 6, looks largely at the consequences of this qualification for the semantic and epistemological questions already mentioned. It turns out that when we allow degrees of belief to be imprecise, we can discharge potentially fatal objections to some philosophically attractive theses. Two of these, that probability is degree of reasonable belief and that the probability calculus provides coherence constraints on partial beliefs, have been mentioned. Others include the claim, defended in chapter 4, that chance is probability given total history.

As well as these semantic and epistemological questions, studies of the foundations of probability usually include a detailed discussion of decision theory. For reasons set out in chapter 2, I deny that we can gain epistemological insights from decision theory. Nevertheless, it is an interesting field to study on its own, and it might be expected that there would be decision-theoretic consequences of allowing imprecise degrees of belief. As I show in part two, this expectation seems to be mistaken. Chapter 9 shows that there aren’t interesting consequences of this theory for decision theory proper, and chapters 10 and 11 show that Keynes’s attempt to use imprecision in degrees of belief to derive a distinctive theory of interest rates is unsound.

Chapters 7 and 8 provide a link between these two parts. In chapter 7 I look at some previous philosophical investigations into the effects of imprecision. In chapter 8 I develop what I take to be the best competitor to the theory defended here: a constructivist theory of probability. On this view degrees of belief are precise, but the relevant coherence constraint is a constructivist probability calculus. This view is, I think, mistaken, but the calculus has some intrinsic interest, and there are at least enough arguments for it to warrant a chapter-length examination.
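For reference, the coherence constraint discussed above (that degrees of belief form a probability function) is standardly the requirement that a credence function $Cr$ satisfy the probability axioms:

$$Cr(A) \ge 0, \qquad Cr(\top) = 1, \qquad Cr(A \vee B) = Cr(A) + Cr(B) \text{ whenever } A \text{ and } B \text{ are incompatible};$$

imprecise or vague degrees of belief are then commonly modeled by a set of such functions rather than a single one.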
The iroha song of human concepts (2021). The iroha is a Japanese poem that is a perfect pangram and isogram, containing each character of the Japanese syllabary exactly once. It also mimics an ultimate conceptual engineering, in that there is more and more restricted scope for meaningful expressions, given more and more condensed means of description. This culminates in crystallizations of human values by auto-condensations of meaningful concepts. Instead of distilling the Japanese values of the 11th century, I try for those of human concepts, given our merging mind, language and culture.