Results for 'Statistical approaches'

1000+ found
  1. Theory of signs and statistical approach to big data in assessing the relevance of clinical biomarkers of inflammation and oxidative stress.Pietro Ghezzi, Kevin Davies, Aidan Delaney & Luciano Floridi - 2018 - Proceedings of the National Academy of Sciences of the United States of America 115 (10):2473-2477.
    Biomarkers are widely used not only as prognostic or diagnostic indicators, or as surrogate markers of disease in clinical trials, but also to formulate theories of pathogenesis. We identify two problems in the use of biomarkers in mechanistic studies. The first problem arises in the case of multifactorial diseases, where different combinations of multiple causes result in patient heterogeneity. The second problem arises when a pathogenic mediator is difficult to measure. This is the case of the oxidative stress (OS) theory (...)
  2. Contemporary Approaches to Statistical Mechanical Probabilities: A Critical Commentary - Part I: The Indifference Approach.Christopher J. G. Meacham - 2010 - Philosophy Compass 5 (12):1116-1126.
    This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The first of these articles provides a brief sketch of statistical mechanics, and (...)
    11 citations
  3. Contemporary Approaches to Statistical Mechanical Probabilities: A Critical Commentary - Part II: The Regularity Approach.Christopher J. G. Meacham - 2010 - Philosophy Compass 5 (12):1127-1136.
    This pair of articles provides a critical commentary on contemporary approaches to statistical mechanical probabilities. These articles focus on the two ways of understanding these probabilities that have received the most attention in the recent literature: the epistemic indifference approach, and the Lewis-style regularity approach. These articles describe these approaches, highlight the main points of contention, and make some attempts to advance the discussion. The second of these articles discusses the regularity approach to statistical mechanical probabilities, (...)
    5 citations
  4. Pedagogical Approaches in Statistics and Probability during Pandemic.Melanie Gurat & Cathlyn Dumale - 2023 - American Journal of Educational Research 11 (6):337-347.
    The difficulty of the students in Statistics and Probability subject, and the pedagogical approaches used by the teachers, were the challenges encountered by both students and teachers due to the restrictions during the CoViD-19 pandemic. Hence, this study aimed to determine the pedagogical approaches used in teaching statistics and probability during the pandemic. The study used a qualitative approach, particularly document analysis. The main source of the data was the module in statistics and probability specifically the learning activity (...)
  5. A statistical learning approach to a problem of induction.Kino Zhao - manuscript
    At its strongest, Hume's problem of induction denies the existence of any well justified assumptionless inductive inference rule. At the weakest, it challenges our ability to articulate and apply good inductive inference rules. This paper examines an analysis that is closer to the latter camp. It reviews one answer to this problem drawn from the VC theorem in statistical learning theory and argues for its inadequacy. In particular, I show that it cannot be computed, in general, whether we are (...)
  6. Vagueness: a statistical epistemicist approach.Jiri Benovsky - 2011 - Teorema: International Journal of Philosophy (3):97-112.
    There are three main traditional accounts of vagueness: one takes it as a genuinely metaphysical phenomenon, one takes it as a phenomenon of ignorance, and one takes it as a linguistic or conceptual phenomenon. In this paper I first very briefly present these views, especially the epistemicist and supervaluationist strategies, and point to some well-known problems that the views carry. I then examine a 'statistical epistemicist' account of vagueness that is designed to avoid precisely these problems – (...)
    7 citations
  7. Environmental genotoxicity evaluation: Bayesian approach for a mixture statistical model.Julio Michael Stern, Angela Maria de Souza Bueno, Carlos Alberto de Braganca Pereira & Maria Nazareth Rabello-Gay - 2002 - Stochastic Environmental Research and Risk Assessment 16:267–278.
    The data analyzed in this paper are part of the results described in Bueno et al. (2000). Three cytogenetic endpoints were analyzed in three populations of a species of wild rodent – Akodon montensis – living in an industrial, an agricultural, and a preservation area at the Itajaí Valley, State of Santa Catarina, Brazil. The polychromatic/normochromatic ratio, the mitotic index, and the frequency of micronucleated polychromatic erythrocytes were used in an attempt to establish a genotoxic profile of each area. It (...)
  8. Comment on Gignac and Zajenkowski, “The Dunning-Kruger effect is (mostly) a statistical artefact: Valid approaches to testing the hypothesis with individual differences data”.Avram Hiller - 2023 - Intelligence 97 (March-April):101732.
    Gignac and Zajenkowski (2020) find that “the degree to which people mispredicted their objectively measured intelligence was equal across the whole spectrum of objectively measured intelligence”. This Comment shows that Gignac and Zajenkowski’s (2020) finding of homoscedasticity is likely the result of a recoding choice by the experimenters and does not in fact indicate that the Dunning-Kruger Effect is a mere statistical artifact. Specifically, Gignac and Zajenkowski (2020) recoded test subjects’ responses to a question regarding self-assessed comparative IQ onto (...)
  9. Statistical Evidence, Normalcy, and the Gatecrasher Paradox.Michael Blome-Tillmann - 2020 - Mind 129 (514):563-578.
    Martin Smith has recently proposed, in this journal, a novel and intriguing approach to puzzles and paradoxes in evidence law arising from the evidential standard of the Preponderance of the Evidence. According to Smith, the relation of normic support provides us with an elegant solution to those puzzles. In this paper I develop a counterexample to Smith’s approach and argue that normic support can neither account for our reluctance to base affirmative verdicts on bare statistical evidence nor resolve the (...)
    5 citations
  10. The Statistical Nature of Causation.David Papineau - 2022 - The Monist 105 (2):247-275.
    Causation is a macroscopic phenomenon. The temporal asymmetry displayed by causation must somehow emerge along with other asymmetric macroscopic phenomena like entropy increase and the arrow of radiation. I shall approach this issue by considering ‘causal inference’ techniques that allow causal relations to be inferred from sets of observed correlations. I shall show that these techniques are best explained by a reduction of causation to structures of equations with probabilistically independent exogenous terms. This exogenous probabilistic independence imposes a recursive order (...)
    4 citations
  11. Statistical Inference and the Replication Crisis.Lincoln J. Colling & Dénes Szűcs - 2018 - Review of Philosophy and Psychology 12 (1):121-147.
    The replication crisis has prompted many to call for statistical reform within the psychological sciences. Here we examine issues within Frequentist statistics that may have led to the replication crisis, and we examine the alternative—Bayesian statistics—that many have suggested as a replacement. The Frequentist approach and the Bayesian approach offer radically different perspectives on evidence and inference with the Frequentist approach prioritising error control and the Bayesian approach offering a formal method for quantifying the relative strength of evidence for (...)
    1 citation
  12. 'More Likely Than Not' - Knowledge First and the Role of Statistical Evidence in Courts of Law.Michael Blome-Tillmann - 2017 - In Carter Adam, Gordon Emma & Jarvis Benjamin (eds.), Knowledge First. Oxford University Press. pp. 278-292.
    The paper takes a closer look at the role of knowledge and evidence in legal theory. In particular, the paper examines a puzzle arising from the evidential standard Preponderance of the Evidence and its application in civil procedure. Legal scholars have argued since at least the 1940s that the rule of the Preponderance of the Evidence gives rise to a puzzle concerning the role of statistical evidence in judicial proceedings, sometimes referred to as the Problem of Bare Statistical (...)
    20 citations
  13. Sensitivity, Causality, and Statistical Evidence in Courts of Law.Michael Blome-Tillmann - 2015 - Thought: A Journal of Philosophy 4 (2):102-112.
    Recent attempts to resolve the Paradox of the Gatecrasher rest on a now familiar distinction between individual and bare statistical evidence. This paper investigates two such approaches, the causal approach to individual evidence and a recently influential (and award-winning) modal account that explicates individual evidence in terms of Nozick's notion of sensitivity. This paper offers counterexamples to both approaches, explicates a problem concerning necessary truths for the sensitivity account, and argues that either view is implausibly committed to (...)
    41 citations
  14. Statistical trends of school size, location and enrolment: An evaluation of public junior secondary schools for sustainable development.Samuel Okpon Ekaette, Eyiene Ameh & Valentine Joseph Owan - 2020 - World Journal of Vocational Education and Training 2 (2):76-88.
    This is a trend study of School Size, Location and Enrolment Figures of Junior secondary schools in Akwa Ibom State of Nigeria covering 2008 – 2016 with implications on sustainable development. The study was tailored to follow the ex-post facto research design. This study was a census, hence the entire population of 227 public secondary schools was used. Secondary quantitative data were obtained using the “School Size, Location and Enrolment Figures Checklist (SSLEFC)” and analysed using descriptive statistics, while line graphs and (...)
  15. Revisiting the two predominant statistical problems: the stopping-rule problem and the catch-all hypothesis problem.Yusaku Ohkubo - 2021 - Annals of the Japan Association for Philosophy of Science 30:23-41.
    The history of statistics is filled with many controversies, in which the prime focus has been the difference in the “interpretation of probability” between Frequentist and Bayesian theories. Many philosophical arguments have been elaborated to examine the problems of both theories based on this dichotomized view of statistics, including the well-known stopping-rule problem and the catch-all hypothesis problem. However, there are also several “hybrid” approaches in theory, practice, and philosophical analysis. This poses many fundamental questions. This (...)
  16. Improving Bayesian statistics understanding in the age of Big Data with the bayesvl R package.Quan-Hoang Vuong, Viet-Phuong La, Minh-Hoang Nguyen, Manh-Toan Ho, Manh-Tung Ho & Peter Mantello - 2020 - Software Impacts 4 (1):100016.
    The exponential growth of social data both in volume and complexity has increasingly exposed many of the shortcomings of the conventional frequentist approach to statistics. The scientific community has called for careful usage of the approach and its inference. Meanwhile, the alternative method, Bayesian statistics, still faces considerable barriers toward a more widespread application. The bayesvl R package is an open program, designed for implementing Bayesian modeling and analysis using the Stan language’s no-U-turn (NUTS) sampler. The package combines the ability (...)
    3 citations
  17. Why do we need to employ Bayesian statistics and how can we employ it in studies of moral education?: With practical guidelines to use JASP for educators and researchers.Hyemin Han - 2018 - Journal of Moral Education 47 (4):519-537.
    ABSTRACTIn this article, we discuss the benefits of Bayesian statistics and how to utilize them in studies of moral education. To demonstrate concrete examples of the applications of Bayesian statistics to studies of moral education, we reanalyzed two data sets previously collected: one small data set collected from a moral educational intervention experiment, and one big data set from a large-scale Defining Issues Test-2 survey. The results suggest that Bayesian analysis of data sets collected from moral educational studies can provide (...)
    8 citations
  18. Mathematics and Statistics in the Social Sciences.Stephan Hartmann & Jan Sprenger - 2011 - In Ian C. Jarvie & Jesus Zamora-Bonilla (eds.), The SAGE Handbook of the Philosophy of Social Sciences. London: Sage Publications. pp. 594-612.
    Over the years, mathematics and statistics have become increasingly important in the social sciences. A look at history quickly confirms this claim. At the beginning of the 20th century most theories in the social sciences were formulated in qualitative terms while quantitative methods did not play a substantial role in their formulation and establishment. Moreover, many practitioners considered mathematical methods to be inappropriate and simply unsuited to foster our understanding of the social domain. Notably, the famous Methodenstreit also concerned (...)
    1 citation
  19. ‘‘Describing our whole experience’’: The statistical philosophies of W. F. R. Weldon and Karl Pearson.Charles H. Pence - 2011 - Studies in History and Philosophy of Biological and Biomedical Sciences 42 (4):475-485.
    There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton’s footsteps. I argue for two related theses in light of this standard interpretation, based on a (...)
    9 citations
  20. Milton Friedman, the Statistical Methodologist.David Teira - 2007 - History of Political Economy 39 (3):511-28.
    In this paper I study Milton Friedman’s statistical education, paying special attention to the different methodological approaches (Fisher, Neyman and Savage) to which he was exposed. I contend that these statistical procedures involved different views as to the evaluation of statistical predictions. In this light, the thesis defended in Friedman’s 1953 methodological essay appears substantially ungrounded.
    1 citation
  21. Curriculum Management and Graduate Programmes’ Viability: The Mediation of Institutional Effectiveness Using PLS-SEM Approach.Valentine Joseph Owan, Emmanuel E. Emanghe, Chiaka P. Denwigwe, Eno Etudor-Eyo, Abosede A. Usoro, Victor O. Ebuara, Charles Effiong, Joseph O. Ogar & Bassey A. Bassey - 2022 - Journal of Curriculum and Teaching 11 (5):114-127.
    This study used a partial least squares structural equation modelling (PLS-SEM) to estimate curriculum management's direct and indirect effects on university graduate programmes' viability. The study also examined the role of institutional effectiveness in mediating the nexus between the predictor and response variables. This is a correlational study with a factorial research design. The study's participants comprised 149 higher education administrators (23 Faculty Deans and 126 HODs) from two public universities in Nigeria. A structured questionnaire designed by the researchers was (...)
    3 citations
  22. How Typical! An Epistemological Analysis of Typicality in Statistical Mechanics.Massimiliano Badino - manuscript
    The recent use of typicality in statistical mechanics for foundational purposes has stirred an important debate involving both philosophers and physicists. While this debate customarily focuses on technical issues, in this paper I try to approach the problem from an epistemological angle. The discussion is driven by two questions: (1) What does typicality add to the concept of measure? (2) What kind of explanation, if any, does typicality yield? By distinguishing the notions of `typicality-as-vast-majority' and `typicality-as-best-exemplar', I argue that (...)
  23. Normativity, Epistemic Rationality, and Noisy Statistical Evidence.Boris Babic, Anil Gaba, Ilia Tsetlin & Robert Winkler - forthcoming - British Journal for the Philosophy of Science.
    Many philosophers have argued that statistical evidence regarding group characteristics (particularly stereotypical ones) can create normative conflicts between the requirements of epistemic rationality and our moral obligations to each other. In a recent paper, Johnson-King and Babic argue that such conflicts can usually be avoided: what ordinary morality requires, they argue, epistemic rationality permits. In this paper, we show that as data gets large, Johnson-King and Babic’s approach becomes less plausible. More constructively, we build on their project and (...)
    1 citation
  24. A simple proof of Born’s rule for statistical interpretation of quantum mechanics.Biswaranjan Dikshit - 2017 - Journal for Foundations and Applications of Physics 4 (1):24-30.
    The Born’s rule to interpret the square of wave function as the probability to get a specific value in measurement has been accepted as a postulate in foundations of quantum mechanics. Although there have been so many attempts at deriving this rule theoretically using different approaches such as frequency operator approach, many-world theory, Bayesian probability and envariance, literature shows that arguments in each of these methods are circular. In view of absence of a convincing theoretical proof, recently some researchers (...)
    1 citation
  25. Reliable credence and the foundations of statistics.Jesse Clifon - manuscript
    If the goal of statistical analysis is to form justified credences based on data, then an account of the foundations of statistics should explain what makes credences justified. I present a new account called statistical reliabilism (SR), on which credences resulting from a statistical analysis are justified (relative to alternatives) when they are in a sense closest, on average, to the corresponding objective probabilities. This places (SR) in the same vein as recent work on the reliabilist justification (...)
  26. A new approach to the approach to equilibrium.Roman Frigg & Charlotte Werndl - 2012 - In Yemima Ben-Menahem & Meir Hemmo (eds.), Probability in Physics. The Frontiers Collection. Springer. pp. 99-114.
    Consider a gas confined to the left half of a container. Then remove the wall separating the two parts. The gas will start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Why does the gas behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the system has to be ergodic for the approach to equilibrium to take place. This answer has been criticised on different grounds (...)
    9 citations
  27. Phenomenology and Dimensional Approaches to Psychiatric Research and Classification.Anthony Vincent Fernandez - 2019 - Philosophy, Psychiatry, and Psychology 26 (1):65-75.
    Contemporary psychiatry finds itself in the midst of a crisis of classification. The developments begun in the 1980s—with the third edition of the Diagnostic and Statistical Manual of Mental Disorders —successfully increased inter-rater reliability. However, these developments have done little to increase the predictive validity of our categories of disorder. A diagnosis based on DSM categories and criteria often fails to accurately anticipate course of illness or treatment response. In addition, there is little evidence that the DSM categories link (...)
    11 citations
  28. Development of an Offline Computer- Based Assessment Tool in Statistics and Probability Utilizing MS PowerPoint and MS Excel.Cherry Mae Cabrera & Jupeth Pentang - 2023 - International Journal of Science, Technology, Engineering and Mathematics 3 (4):73-100.
    This capstone project aimed to develop an offline computer-based assessment (CBA) tool that uses a computer instead of a traditional paper test in evaluating student learning. This addresses the difficulties faced by teachers in administering quarterly assessments. Incorporating technology into student assessment can increase student interest because of the immediate feedback generated automatically and can help teachers improve their work performance. This capstone project employed a developmental approach and utilized a modified ADDIE (Analysis, Design, and Development) model to design the (...)
  29. Grade 12 Students’ Retention in Statistics and Probability amidst Covid-19.Cathlyn Dumale & Melanie Gurat - 2023 - American Journal of Educational Research 11 (10):670-676.
    This study aimed to assess the retention of grade 12 students in statistics and probability, along with a comparative analysis of these retentions across the distinct topics of the subject. Statistics and probability subjects were taken by the students when they were in grade 11. Employing a quantitative approach, the research used descriptive-comparative design to describe the level of retention of the students and compare the retention of the students in each topic. These encompass random variables and probability distribution, normal (...)
  30. On Regression Modeling for Students’ Attitude towards Statistics Online Learning in Higher Education.Leomarich Casinillo & Ginna Tavera - 2023 - St. Theresa Journal of Humanities and Social Sciences 9 (2):60-74.
    Students during the distance education were experiencing solitude and depression in their studies due to no social interaction which led to psychological suffering. In this article, college students' attitudes toward statistics learning were investigated, and its predictors by statistical modeling. Secondary data was extracted from a current study from the literature, summarized using descriptive statistics, and presented in tabular form. As for modeling the predictors of students' attitudes in learning statistics, it was done through multiple linear regression via the (...)
  31. A systems thinking approach to e-learning on climate change: capacity-building for junior high school teachers in the Philippines.John Trixstan Ignacio, Charlotte Kendra Gonzales & Queena Lee-Chua - forthcoming - International Journal of Disaster Resilience in the Built Environment:1-16.
    Purpose A mixed-method study was performed to determine the impact of integrating systems thinking (ST) into an electronic learning module for junior high school teachers in the Philippines. The study aims to assess how an ST approach to pedagogy compared against a conventional approach in terms of contribution to the participants’ global climate change content knowledge, holistic thinking and depth and accuracy of knowledge and reasoning. -/- Design/methodology/approach The study implemented e-learning modules using an ST approach versus a conventional approach (...)
  32. A decoherence-based approach to the classical limit in Bohm's theory.Davide Romano - 2023 - Foundations of Physics 53 (41):1-27.
    The paper explains why the de Broglie-Bohm theory reduces to Newtonian mechanics in the macroscopic classical limit. The quantum-to-classical transition is based on three steps: (i) interaction with the environment produces effectively factorized states, leading to the formation of effective wave functions and hence decoherence; (ii) the effective wave functions selected by the environment–the pointer states of decoherence theory–will be well-localized wave packets, typically Gaussian states; (iii) the quantum potential of a Gaussian state becomes negligible under standard classicality conditions; therefore, (...)
  33. Arguments from Expert Opinion – An Epistemological Approach.Christoph Lumer - 2020 - In Catarina Dutilh Novaes, Henrike Jansen, Jan Albert Van Laar & Bart Verheij (eds.), Reason to Dissent. Proceedings of the 3rd European Conference on Argumentation. College Publications. pp. 403-422.
    In times of populist mistrust towards experts, it is important and the aim of the paper to ascertain the rationality of arguments from expert opinion and to reconstruct their rational foundations as well as to determine their limits. The foundational approach chosen is probabilistic. However, there are at least three correct probabilistic reconstructions of such argumentations: statistical inferences, Bayesian updating, and interpretive arguments. To solve this competition problem, the paper proposes a recourse to the arguments' justification strengths achievable in (...)
  34. A Decoherence-Based Approach to the Classical Limit in Bohm’s Theory.Davide Romano - 2023 - Foundations of Physics 53 (2):1-27.
    The paper explains why the de Broglie–Bohm theory reduces to Newtonian mechanics in the macroscopic classical limit. The quantum-to-classical transition is based on three steps: (i) interaction with the environment produces effectively factorized states, leading to the formation of _effective wave functions_ and hence _decoherence_; (ii) the effective wave functions selected by the environment—the pointer states of decoherence theory—will be well-localized wave packets, typically Gaussian states; (iii) the quantum potential of a Gaussian state becomes negligible under standard classicality conditions; therefore, (...)
  35. A computational approach to classifying UKLO questions.Charles Massingham - manuscript
    I leave the work I’ve done to classify all UKLO questions below, with the latest appendices for historical record. From my recollection, this was the first complete system to classify all UKLO questions. Attempts at classifying UKLO questions before accounted for some questions but not others, leaving some questions unclassified. The classifications then were more labelling rather than a systemic approach that applied to all questions. As a result, this made it difficult for competitors and test development to find the (...)
  36. A Gentle Approach to Imprecise Probabilities.Gregory Wheeler - 2022 - In Thomas Augustin, Fabio Gagliardi Cozman & Gregory Wheeler (eds.), Reflections on the Foundations of Probability and Statistics: Essays in Honor of Teddy Seidenfeld. Springer. pp. 37-67.
    The field of of imprecise probability has matured, in no small part because of Teddy Seidenfeld’s decades of original scholarship and essential contributions to building and sustaining the ISIPTA community. Although the basic idea behind imprecise probability is (at least) 150 years old, a mature mathematical theory has only taken full form in the last 30 years. Interest in imprecise probability during this period has also grown, but many of the ideas that the mature theory serves can be difficult to (...)
    3 citations
  37. DSM-5 and Psychiatry's Second Revolution: Descriptive vs. Theoretical Approaches to Psychiatric Classification.Jonathan Y. Tsou - 2015 - In Steeves Demazeux & Patrick Singy (eds.), The DSM-5 in Perspective: Philosophical Reflections on the Psychiatric Babel. Springer. pp. 43-62.
    A large part of the controversy surrounding the publication of DSM-5 stems from the possibility of replacing the purely descriptive approach to classification favored by the DSM since 1980. This paper examines the question of how mental disorders should be classified, focusing on the issue of whether the DSM should adopt a purely descriptive or theoretical approach. I argue that the DSM should replace its purely descriptive approach with a theoretical approach that integrates causal information into the DSM’s descriptive diagnostic (...)
    9 citations
  38. Interactive Justice as an Approach to Enhance Organizational Loyalty among Faculty Staff At Palestine Technical University- (Kadoorei).Al Shobaki Mazen J. - 2018 - International Journal of Academic Information Systems Research (IJAISR) 2 (9):17-28.
    This study aimed to identify interactive justice and its impact on the organizational loyalty of the Faculty Staff at the Technical University of Palestine-(Kadoorei). To achieve this, the researchers used a questionnaire consisting of (22) paragraphs, where the first area (10 paragraphs) examined interactive justice and the second area (12 paragraphs) examined the organizational loyalty of the Faculty Staff at the university. It was distributed to (105) individuals from the study sample, and after the (...)
  39. Assessing the needs of healthcare information for assisting family caregivers in cancer fear management: A mindsponge-based approach.Ni Putu Wulan Purnama Sari, Minh-Phuong Thi Duong, Made Mahaguna Putra, Pande Made Arbi Yudamuckti, Minh-Hoang Nguyen & Quan-Hoang Vuong - manuscript
    Fear of cancer is mostly related to cancer recurrence, metastasis, additional cancer, and diagnostic tests. Its legacy as a lethal disease has raised fear of approaching death. Currently, cancer’s total suffering and the worsening phenomena have raised fear, especially among female patients. Family caregivers (FCGs) who are responsible for the day-to-day cancer care at home need to help the patients deal with this fear frequently. Due to the limited care competencies, they need supportive care from healthcare professionals in cancer fear (...)
  40. An evaluation of four solutions to the forking paths problem: Adjusted alpha, preregistration, sensitivity analyses, and abandoning the Neyman-Pearson approach.Mark Rubin - 2017 - Review of General Psychology 21:321-329.
    Gelman and Loken (2013, 2014) proposed that when researchers base their statistical analyses on the idiosyncratic characteristics of a specific sample (e.g., a nonlinear transformation of a variable because it is skewed), they open up alternative analysis paths in potential replications of their study that are based on different samples (i.e., no transformation of the variable because it is not skewed). These alternative analysis paths count as additional (multiple) tests and, consequently, they increase the probability of making a Type (...)
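    As a hedged, back-of-envelope illustration of how forking paths inflate Type I error, and of the "adjusted alpha" solution named in the title (assuming, for simplicity, k independent analysis paths, which real forking paths need not be):

```python
# Illustrative only: treats k forking analysis paths as k independent tests,
# a simplification; real forking paths are typically correlated.
alpha = 0.05  # nominal per-test Type I error rate
k = 5         # hypothetical number of alternative analysis paths

# Familywise error: chance of at least one false positive across the k paths.
familywise = 1 - (1 - alpha) ** k      # about 0.23, not 0.05

# Two standard alpha adjustments that restore the familywise rate:
bonferroni = alpha / k                 # 0.01
sidak = 1 - (1 - alpha) ** (1 / k)     # about 0.0102

# Sanity check: the Sidak-adjusted alpha recovers the nominal rate exactly.
recovered = 1 - (1 - sidak) ** k       # 0.05
```

    Under five potential paths, the effective error rate more than quadruples, which is why a single unadjusted p < .05 from one realized path overstates the evidence.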
  41. A Weibull Wearout Test: Full Bayesian Approach.Julio Michael Stern, Telba Zalkind Irony, Marcelo de Souza Lauretto & Carlos Alberto de Braganca Pereira - 2001 - Reliability and Engineering Statistics 5:287-300.
    The Full Bayesian Significance Test (FBST) for precise hypotheses is presented, with some applications relevant to reliability theory. The FBST is an alternative to significance tests or, equivalently, to p-values. In the FBST we compute the evidence of the precise hypothesis. This evidence is the probability of the complement of a credible set "tangent" to the sub-manifold (of the parameter space) that defines the null hypothesis. We use the FBST in an application requiring a quality control of used components, based (...)
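    The evidence calculation described above can be sketched numerically. The toy below (the Beta-posterior setting and all names are illustrative assumptions, not taken from the paper) estimates the FBST e-value for a precise hypothesis theta = theta0 by Monte Carlo: the e-value is one minus the posterior mass of the set of parameter points whose posterior density exceeds the density at theta0.

```python
# Hedged sketch of the FBST e-value for a precise hypothesis, using a toy
# Beta posterior for a binomial proportion; names are illustrative.
import random

def beta_density(theta, a, b):
    # Unnormalized Beta(a, b) density; enough for density comparisons.
    return theta ** (a - 1) * (1 - theta) ** (b - 1)

def fbst_evalue(a, b, theta0, n_samples=100_000, seed=0):
    """Monte Carlo e-value for H0: theta = theta0 under a Beta(a, b) posterior.

    The 'tangent set' T is {theta : posterior(theta) > posterior(theta0)};
    the e-value (evidence for H0) is 1 minus the posterior mass of T.
    """
    rng = random.Random(seed)
    f0 = beta_density(theta0, a, b)
    in_tangent = 0
    for _ in range(n_samples):
        # Draw theta ~ Beta(a, b).
        theta = rng.betavariate(a, b)
        if beta_density(theta, a, b) > f0:
            in_tangent += 1
    return 1.0 - in_tangent / n_samples

# 7 successes in 10 trials with a uniform prior gives a Beta(8, 4) posterior.
evidence = fbst_evalue(8, 4, 0.5)  # modest evidence for theta = 0.5
```

    A hypothesis at the posterior mode gets an e-value near 1, while an implausible one gets an e-value near 0, which is the sense in which the e-value measures support for the precise null.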
  42. Individual, Motivational, and Social Support Factors Towards Learning Mathematics of University Students in the Blended Learning Approach.Dishelle Hufana & Melanie Gurat - 2023 - American Journal of Educational Research 11 (4):175-182.
    The broad range of emotional factors that affected learning during the sudden change of learning modality in Mathematics led to this study. This study aimed to compare the levels of the support factors that most affect students’ learning of mathematics in a blended learning approach. Sex, age, and relationship status were considered grouping variables for the individual, motivational, and social support factors towards learning mathematics in the blended learning mode of thirty education students at the tertiary level. This study (...)
  43. Combining Optimization and Randomization Approaches for the Design of Clinical Trials.Julio Michael Stern, Victor Fossaluza, Marcelo de Souza Lauretto & Carlos Alberto de Braganca Pereira - 2015 - Springer Proceedings in Mathematics and Statistics 118:173-184.
    Intentional sampling methods are non-randomized procedures that select a group of individuals for a sample with the purpose of meeting specific prescribed criteria. In this paper we extend previous works related to intentional sampling, and address the problem of sequential allocation for clinical trials with few patients. Roughly speaking, patients are enrolled sequentially, according to the order in which they start the treatment at the clinic or hospital. The allocation problem consists in assigning each new patient to one, and (...)
  44. A MULTIVARIATE ANALYSIS ON THE FACTORS AFFECTING THE STUDENTS’ MATHEMATICS PERFORMANCE IN A MODULAR APPROACH OF DISTANCE LEARNING.Joel C. Patiño Jr - 2023 - Get International Research Journal 1 (2).
    This study sought to examine the factors affecting the Science, Technology, Engineering and Mathematics (STEM) students’ Mathematics performance in a modular distance learning at Notre Dame Village National High School (NDVNHS). In particular, the researcher was interested to determine if these factors had a significant effect on the students’ Pre-Calculus and General Mathematics performance considering the number of hours spent in modular learning. The period covered by the study was during the first semester of the school year 2020-2021. The respondents (...)
  45. Non-Arbitrage In Financial Markets: A Bayesian Approach for Verification.Julio Michael Stern & Fernando Valvano Cerezetti - 2012 - AIP Conference Proceedings 1490:87-96.
    The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, the analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets, and may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST (...)
  46. Beitz and the Problem with a State-Focused Approach to Human Rights.Jennifer Szende - manuscript
    Charles Beitz has presented us with a new and novel theory of human rights, one that is motivated by a concern for the enforcement of human rights in modern international practice. However, the focus on states in his human rights project generates a tension between the universal aspirations of individual human rights and the vulnerable individuals who through rendition or state failure find themselves outside the international state system. This paper argues that Beitz and other theorists of human rights make (...)
  47. Basic Paradigm Change: Communicative Rationality Approach.Rinat M. Nugayev (ed.) - 2003 - Dom Pechati.
    Special Relativity and the Early Quantum Theory were created within the same programme of reconciling statistical mechanics, thermodynamics and Maxwellian electrodynamics. I shall try to explain why classical mechanics and classical electrodynamics were “refuted” almost simultaneously or, in terms more suitable for the present congress, why the quantum revolution and the relativistic one both took place at the beginning of the 20th century. I shall argue that the quantum and relativistic revolutions were simultaneous since they had a common origin - the (...)
  48. Adaptive Channel Hopping for IEEE 802.15.4 TSCH-Based Networks: A Dynamic Bernoulli Bandit Approach.Nastooh Taheri Javan - 2021 - IEEE Sensors Journal 21 (20):23667-23681.
    In the IEEE 802.15.4 standard for low-power, low-range wireless communications, only one channel is employed for transmission, which can result in increased energy consumption, high network delay and poor packet delivery ratio (PDR). In the subsequent IEEE 802.15.4-2015 standard, a Time-Slotted Channel Hopping (TSCH) mechanism has been developed which allows for a periodic yet fixed frequency-hopping pattern over 16 different channels. Unfortunately, most of these channels are susceptible to high-power interference from coexisting Wi-Fi signals and possibly from other ISM-band (...)
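    The paper's specific algorithm is not reproduced here, but the underlying idea - treating each channel's delivery success as a Bernoulli arm and learning which channels to hop to - can be sketched with a generic Thompson-sampling Bernoulli bandit (all names and parameters below are illustrative assumptions):

```python
# Hedged illustration: a generic Bernoulli bandit (Thompson sampling) that
# learns per-channel packet delivery ratios; not the paper's exact algorithm.
import random

class ChannelBandit:
    def __init__(self, n_channels=16, seed=0):
        self.rng = random.Random(seed)
        # Beta(1, 1) prior on each channel's delivery probability.
        self.alpha = [1] * n_channels  # successes + 1
        self.beta = [1] * n_channels   # failures + 1

    def pick_channel(self):
        # Sample a plausible PDR for each channel; hop to the best draw.
        draws = [self.rng.betavariate(a, b)
                 for a, b in zip(self.alpha, self.beta)]
        return max(range(len(draws)), key=draws.__getitem__)

    def record(self, channel, delivered):
        # Bayesian update from the observed ACK (True) or loss (False).
        if delivered:
            self.alpha[channel] += 1
        else:
            self.beta[channel] += 1

# Toy environment: channel 3 is clean (PDR 0.9), the rest see interference.
true_pdr = [0.2] * 16
true_pdr[3] = 0.9
env = random.Random(42)
bandit = ChannelBandit()
for _ in range(2000):
    ch = bandit.pick_channel()
    bandit.record(ch, env.random() < true_pdr[ch])
best = max(range(16),
           key=lambda c: bandit.alpha[c] / (bandit.alpha[c] + bandit.beta[c]))
```

    After a few hundred simulated transmissions the posterior concentrates on the clean channel, so the hopping sequence adapts away from the interfered channels without any explicit interference model.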
  49. The Meta-Reversibility Objection.Christopher Meacham - 2023 - In Barry Loewer, Brad Weslake & Eric B. Winsberg (eds.), The Probability Map of the Universe: Essays on David Albert’s _Time and Chance_. Cambridge, MA: Harvard University Press.
    One popular approach to statistical mechanics understands statistical mechanical probabilities as measures of rational indifference. Naive formulations of this “indifference approach” face reversibility worries - while they yield the right prescriptions regarding future events, they yield the wrong prescriptions regarding past events. This paper begins by showing how the indifference approach can overcome the standard reversibility worries by appealing to the Past Hypothesis. But, the paper argues, positing a Past Hypothesis doesn't free the indifference approach from all reversibility (...)
  50. Race and Racial Profiling.Annabelle Lever - 2017 - In Naomi Zack (ed.), The Oxford Handbook of Philosophy and Race. New York: Oxford University Press. pp. 425-435.
    Philosophical reflection on racial profiling tends to take one of two forms. The first sees it as an example of ‘statistical discrimination’ (SD), raising the question of when, if ever, probabilistic generalisations about group behaviour or characteristics can be used to judge particular individuals (Applbaum 2014; Harcourt 2004; Hellman 2014; Risse and Zeckhauser 2004; Risse 2007; Lippert-Rasmussen 2006; Lippert-Rasmussen 2007; Lippert-Rasmussen 2014). This approach treats racial profiling as one example amongst many others of a general problem in egalitarian political (...)
1 — 50 / 1000