Results for 'statistical representation'

998 found
  1. An Alternative Interpretation of Statistical Mechanics.C. D. McCoy - 2020 - Erkenntnis 85 (1):1-21.
    In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a (...)
    12 citations
  2. Neutrosophic Statistics is an extension of Interval Statistics, while Plithogenic Statistics is the most general form of statistics (second version).Florentin Smarandache - 2022 - International Journal of Neutrosophic Science 19 (1):148-165.
    In this paper, we prove that Neutrosophic Statistics is more general than Interval Statistics, since it can deal with all types of indeterminacy (with respect to the data, inferential procedures, probability distributions, graphical representations, etc.), it allows the reduction of indeterminacy, and it uses neutrosophic probability, which is more general than imprecise and classical probabilities and has more detailed corresponding probability density functions, whereas Interval Statistics only deals with indeterminacy that can be represented by intervals. We also respond to (...)
    1 citation
  3. Sample representation in the social sciences.Kino Zhao - 2021 - Synthese (10):9097-9115.
    The social sciences face a problem of sample non-representation, where the majority of samples consist of undergraduate students from Euro-American institutions. The problem has been identified for decades with little trend of improvement. In this paper, I trace the history of sampling theory. The dominant framework, called the design-based approach, takes random sampling as the gold standard. The idea is that a sampling procedure that is maximally uninformative prevents samplers from introducing arbitrary bias, thus preserving sample representation. I (...)
    4 citations
  4. Artificial intelligence and identity: the rise of the statistical individual.Jens Christian Bjerring & Jacob Busch - forthcoming - AI and Society:1-13.
    Algorithms are used across a wide range of societal sectors such as banking, administration, and healthcare to make predictions that impact on our lives. While the predictions can be incredibly accurate about our present and future behavior, there is an important question about how these algorithms in fact represent human identity. In this paper, we explore this question and argue that machine learning algorithms represent human identity in terms of what we shall call the statistical individual. This statisticalized (...) of individuals, we shall argue, differs significantly from our ordinary conception of human identity, which is tightly intertwined with considerations about biological, psychological, and narrative continuity—as witnessed by our most well-established philosophical views on personal identity. Indeed, algorithmic representations of individuals give no special attention to biological, psychological, and narrative continuity and instead rely on predictive properties that significantly exceed and diverge from those that we would ordinarily take to be relevant for questions about how we are.
  5. INFERENCE AND REPRESENTATION: PHILOSOPHICAL AND COGNITIVE ISSUES.Igor Mikhailov - 2020 - Vestnik Tomskogo Gosudarstvennogo Universiteta. Filosofiya, Sotsiologiya, Politologiya 1 (58):34-46.
    The paper is dedicated to particular cases of interaction and mutual impact of philosophy and cognitive science. Thus, philosophical preconditions in the middle of the 20th century shaped the newly born cognitive science as mainly based on conceptual and propositional representations and syntactical inference. Further developments towards neural networks and statistical representations did not change the prejudice much: many still believe that network models must be complemented with some extra tools that would account for proper human cognitive traits. I (...)
  6. Maxwell-Boltzmann Statistics and the Metaphysics of Modality.Bruce L. Gordon - 2002 - Synthese 133 (3):393-417.
    Two arguments have recently been advanced that Maxwell-Boltzmann particles are indistinguishable just like Bose-Einstein and Fermi-Dirac particles. Bringing modal metaphysics to bear on these arguments shows that ontological indistinguishability for classical (MB) particles does not follow. The first argument, resting on symmetry in the occupation representation for all three cases, fails since peculiar correlations exist in the quantum (BE and FD) context as harbingers of ontic indistinguishability, while the indistinguishability of classical particles remains purely epistemic. The second argument, (...)
  7. Perceptron Connectives in Knowledge Representation.Pietro Galliani, Guendalina Righetti, Daniele Porello, Oliver Kutz & Nicolas Toquard - 2020 - In Pietro Galliani, Guendalina Righetti, Daniele Porello, Oliver Kutz & Nicolas Toquard (eds.), Knowledge Engineering and Knowledge Management - 22nd International Conference, EKAW 2020, Bolzano, Italy, September 16-20, 2020, Proceedings. Lecture Notes in Computer Science 12387. pp. 183-193.
    We discuss the role of perceptron (or threshold) connectives in the context of Description Logic, and in particular their possible use as a bridge between statistical learning of models from data and logical reasoning over knowledge bases. We prove that such connectives can be added to the language of most forms of Description Logic without increasing the complexity of the corresponding inference problem. We show, with a practical example over the Gene Ontology, how even simple instances of perceptron connectives (...)
    2 citations
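As a rough illustration of the threshold connectives described in the entry above: a perceptron connective can be read as assigning a weight to each constituent feature and counting an individual as an instance of the defined concept when the weighted sum of satisfied features reaches a threshold. The sketch below assumes that reading; the feature names, weights, and threshold are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a perceptron (threshold) connective over named features.
# Weights, threshold, and feature names are illustrative assumptions.

def threshold_concept(weights, threshold):
    """Return a classifier: an individual is a set of feature names it satisfies."""
    def classify(individual):
        score = sum(w for feature, w in weights.items() if feature in individual)
        return score >= threshold
    return classify

# Hypothetical concept: enough weighted evidence from simpler features.
is_transporter = threshold_concept(
    {"has_transmembrane_domain": 2.0, "binds_ion": 1.0, "located_in_membrane": 1.0},
    threshold=3.0,
)

print(is_transporter({"has_transmembrane_domain", "binds_ion"}))   # True  (2.0 + 1.0 >= 3.0)
print(is_transporter({"binds_ion", "located_in_membrane"}))        # False (1.0 + 1.0 < 3.0)
```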
  8. Explanation through representation, and its limits.Bas Van Fraassen - 2012 - Epistemologia 1:30-46.
    Why-questions and how-possibly-questions are two common forms of explanation request. Answers to the former ones require factual assertions, but the latter ones can be answered by displaying a representation of the targeted phenomenon. However, in an extreme case, a representation could come accompanied by the assertion that it displays the only possible way a phenomenon could develop. Using several historical controversies concerning statistical modeling, it is argued that such cases must inevitably involve tacit or explicit empirical assumptions.
  9. Pedagogical Approaches in Statistics and Probability during Pandemic.Melanie Gurat & Cathlyn Dumale - 2023 - American Journal of Educational Research 11 (6):337-347.
    Students' difficulty with the Statistics and Probability subject, and the pedagogical approaches used by teachers, were among the challenges both students and teachers encountered under the restrictions of the COVID-19 pandemic. Hence, this study aimed to determine the pedagogical approaches used in teaching statistics and probability during the pandemic. The study used a qualitative approach, particularly document analysis. The main source of data was the module in statistics and probability, specifically the learning activity sheets in (...)
  10. OBCS: The Ontology of Biological and Clinical Statistics.Jie Zheng, Marcelline R. Harris, Anna Maria Masci, Yu Lin, Alfred Hero, Barry Smith & Yongqun He - 2014 - Proceedings of the Fifth International Conference on Biomedical Ontology 1327:65.
    Statistics play a critical role in biological and clinical research. To promote logically consistent representation and classification of statistical entities, we have developed the Ontology of Biological and Clinical Statistics (OBCS). OBCS extends the Ontology of Biomedical Investigations (OBI), an OBO Foundry ontology supported by some 20 communities. Currently, OBCS contains 686 terms, including 381 classes imported from OBI and 147 classes specific to OBCS. The goal of this paper is to present OBCS for community critique and to (...)
    1 citation
  11. Diagrammatic Reasoning as the Basis for Developing Concepts: A Semiotic Analysis of Students' Learning about Statistical Distribution.Arthur Bakker & Michael H. G. Hoffmann - 2005 - Educational Studies in Mathematics 60:333–358.
    In recent years, semiotics has become an innovative theoretical framework in mathematics education. The purpose of this article is to show that semiotics can be used to explain learning as a process of experimenting with and communicating about one's own representations of mathematical problems. As a paradigmatic example, we apply a Peircean semiotic framework to answer the question of how students learned the concept of "distribution" in a statistics course by "diagrammatic reasoning" and by developing "hypostatic abstractions," that is by (...)
    2 citations
  12. La Estadística Neutrosófica es una extensión de la Estadística de Intervalos, mientras que la Estadística Plitogénica es la forma más general de estadística. (Cuarta versión). Neutrosophic Statistics is an extension of Interval Statistics, while Plitogenic Statistics is the most general form of statistics (Fourth version).Florentin Smarandache - 2022 - Neutrosophic Computing and Machine Learning 23 (1):21-38.
    In this paper we show that Neutrosophic Statistics is an extension of Interval Statistics, since it deals with all kinds of indeterminacy (with respect to data, inferential procedures, probability distributions, graphical representations, etc.), allows for indeterminacy reduction, and uses neutrosophic probability, which is more general than imprecise and classical probabilities and has more detailed corresponding probability density functions, whereas Interval Statistics only deals with indeterminacy that can be represented by intervals. We also respond to the arguments of Woodall et al. (...)
  13. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.Jie Zheng, Marcelline R. Harris, Anna Maria Masci, Lin Yu, Alfred Hero, Barry Smith & Yongqun He - 2016 - Journal of Biomedical Semantics 7 (53).
    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. Terms in OBCS, including ‘data collection’, ‘data transformation in statistics’, ‘data visualization’, (...)
  14. The common effect of value on prioritized memory and category representation.Joshua Knobe & Fiery Cushman - forthcoming - Trends in Cognitive Sciences.
    The way we represent categories depends on both the frequency and value of the category’s members. Thus, for instance, prototype representations can be impacted both by information about what is statistically frequent and by judgments about what is valuable. Notably, recent research on memory suggests that prioritized memory is also influenced by both statistical frequency and value judgments. Although work on conceptual representation and work on prioritized memory have thus far proceeded almost entirely independently, the patterns of existing (...)
  15. Visual features as carriers of abstract quantitative information.Ronald A. Rensink - 2022 - Journal of Experimental Psychology: General 8 (151):1793-1820.
    Four experiments investigated the extent to which abstract quantitative information can be conveyed by basic visual features. This was done by asking observers to estimate and discriminate Pearson correlation in graphical representations where the first data dimension of each element was encoded by its horizontal position, and the second by the value of one of its visual features; perceiving correlation then requires combining the information in the two encodings via a common abstract representation. Four visual features were examined: luminance, (...)
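The experiments in the entry above ask observers to estimate Pearson correlation from graphical encodings. For reference, a minimal sketch of the quantity being estimated, computed directly from an underlying two-dimensional dataset; the synthetic data and the target correlation of 0.7 are assumptions for illustration only.

```python
# Pearson correlation r for a synthetic two-dimensional dataset, the quantity
# observers estimate from the graphical encodings. Data are illustrative.
import math
import random

random.seed(0)
n = 200
x = [random.gauss(0, 1) for _ in range(n)]
# Second dimension correlated with the first (true r about 0.7).
y = [0.7 * xi + math.sqrt(1 - 0.7 ** 2) * random.gauss(0, 1) for xi in x]

def pearson_r(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    var_a = sum((ai - ma) ** 2 for ai in a)
    var_b = sum((bi - mb) ** 2 for bi in b)
    return cov / math.sqrt(var_a * var_b)

print(round(pearson_r(x, y), 2))  # sample estimate, close to the target 0.7
```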
  16. Cognition according to Quantum Information: Three Epistemological Puzzles Solved.Vasil Penchev - 2020 - Epistemology eJournal (Elsevier: SSRN) 13 (20):1-15.
    The cognition of quantum processes raises a series of questions about ordering and information connecting the states of one and the same system before and after measurement: Quantum measurement, quantum invariance and the non-locality of quantum information are considered in the paper from an epistemological viewpoint. The adequate generalization of ‘measurement’ is discussed to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. (...)
  17. Quantum Invariance.Vasil Penchev - 2020 - Epistemology eJournal (Elsevier: SSRN) 13 (22):1-6.
    Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. The adequate generalization of ‘measurement’ is discussed to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. A set-theory corollary is the curious invariance to the axiom of choice: Any coherent state excludes any well-ordering and thus excludes also the axiom of choice. It (...)
  18. Scrambling for higher metrics in the Journal Impact Factor bubble period: a real-world problem in science management and its implications.Tran Trung, Hoang Khanh Linh, La Viet Phuong, Manh-Toan Ho & Quan-Hoang Vuong - 2020 - Problems and Perspectives in Management 18 (1):48-56.
    Universities and funders in many countries have been using the Journal Impact Factor (JIF) as an indicator for research and grant assessment despite its controversial nature as a statistical representation of scientific quality. This study investigates how changes in JIF over the years can affect its role in research evaluation and science management, using JIF data from annual Journal Citation Reports (JCR) to illustrate the changes. The descriptive statistics show an increase in the median JIF for (...)
  19. Intentional Models as Essential Scientific Tools.Eric Hochstein - 2013 - International Studies in the Philosophy of Science 27 (2):199-217.
    In this article, I argue that the use of scientific models that attribute intentional content to complex systems bears a striking similarity to the way in which statistical descriptions are used. To demonstrate this, I compare and contrast an intentional model with a statistical model, and argue that key similarities between the two give us compelling reasons to consider both as a type of phenomenological model. I then demonstrate how intentional descriptions play an important role in scientific methodology (...)
    5 citations
  20. Can Deep CNNs Avoid Infinite Regress/Circularity in Content Constitution?Jesse Lopes - 2023 - Minds and Machines 33 (3):507-524.
    The representations of deep convolutional neural networks (CNNs) are formed from generalizing similarities and abstracting from differences in the manner of the empiricist theory of abstraction (Buckner, Synthese 195:5339–5372, 2018). The empiricist theory of abstraction is well understood to entail infinite regress and circularity in content constitution (Husserl, Logical Investigations. Routledge, 2001). This paper argues these entailments hold a fortiori for deep CNNs. Two theses result: deep CNNs require supplementation by Quine’s “apparatus of identity and quantification” in order to (1) (...)
  21. Why the Brain Knows More than We Do.Birgitta Dresp-Langley - 2011 - Brain Sciences 2:1-21.
    Scientific studies have shown that non-conscious stimuli and representations influence information processing during conscious experience. In the light of such evidence, questions about potential functional links between non-conscious brain representations and conscious experience arise. This article discusses models capable of explaining how statistical learning mechanisms in dedicated resonant circuits could generate specific temporal activity traces of non-conscious representations in the brain. How reentrant signaling, top-down matching, and statistical coincidence of such activity traces may lead to the progressive consolidation (...)
    1 citation
  22. Just How Conservative is Conservative Predictive Processing?Paweł Gładziejewski - 2017 - Hybris. Internetowy Magazyn Filozoficzny 38:98-122.
    The Predictive Processing (PP) framework construes perception and action (and perhaps other cognitive phenomena) as a matter of minimizing prediction error, i.e. the mismatch between the sensory input and sensory predictions generated by a hierarchically organized statistical model. There is a question of how PP fits into the debate between traditional, neurocentric and representation-heavy approaches in cognitive science and those approaches that see cognition as embodied, environmentally embedded, extended and (largely) representation-free. In the present paper, I aim to (...)
    4 citations
  23. Special Characterizations of Standard Discrete Models.Julio Michael Stern & Carlos Alberto de Braganca Pereira - 2008 - RevStat – Statistical Journal 6:199-230.
    This article presents important properties of standard discrete distributions and their conjugate densities. The Bernoulli and Poisson processes are described as generators of such discrete models. A characterization of distributions by mixtures is also introduced. This article adopts a novel singular notation and representation. Singular representations are unusual in statistical texts. Nevertheless, the singular notation makes it simpler to extend and generalize theoretical results and greatly facilitates numerical and computational implementation.
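The entry above concerns standard discrete distributions and their conjugate densities, with the Bernoulli and Poisson processes as generators. As a generic, textbook-level illustration of conjugacy (not the paper's singular notation or any of its results), two standard conjugate updates:

```python
# Generic conjugate-prior updates for two standard discrete models.
# Textbook material, shown only to illustrate what "conjugate densities" means here.

def beta_bernoulli_update(alpha, beta, successes, failures):
    """Beta(alpha, beta) prior + Bernoulli data -> Beta posterior parameters."""
    return alpha + successes, beta + failures

def gamma_poisson_update(shape, rate, counts):
    """Gamma(shape, rate) prior + Poisson counts -> Gamma posterior parameters."""
    return shape + sum(counts), rate + len(counts)

print(beta_bernoulli_update(1, 1, successes=7, failures=3))   # (8, 4)
print(gamma_poisson_update(2.0, 1.0, counts=[3, 0, 2, 4]))    # (11.0, 5.0)
```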
  24. Normality: Part Descriptive, part prescriptive.Adam Bear & Joshua Knobe - 2017 - Cognition 167 (C):25-37.
    People’s beliefs about normality play an important role in many aspects of cognition and life (e.g., causal cognition, linguistic semantics, cooperative behavior). But how do people determine what sorts of things are normal in the first place? Past research has studied both people’s representations of statistical norms (e.g., the average) and their representations of prescriptive norms (e.g., the ideal). Four studies suggest that people’s notion of normality incorporates both of these types of norms. In particular, people’s representations of what (...)
    43 citations
  25. Mechanizmy predykcyjne i ich normatywność [Predictive mechanisms and their normativity].Michał Piekarski - 2020 - Warszawa, Polska: Liberi Libri.
    The aim of this study is to justify the belief that there are biological normative mechanisms that fulfill non-trivial causal roles in the explanations (as formulated by researchers) of actions and behaviors present in specific systems. One example of such mechanisms is the predictive mechanisms described and explained by predictive processing (hereinafter PP), which (1) guide actions and (2) shape causal transitions between states that have specific content and fulfillment conditions (e.g. mental states). Therefore, I am guided by a specific (...)
  26. Using Phenotypology Hypotheses as a Personality Assessment Tool: the Tentative Validation Study.Vitalii Shymko - 2020 - PSYCHOLOGICAL JOURNAL 6 (5):9-17.
    The transformational pace of modern education, healthcare, business management systems, etc., requires new approaches for prompt and reliable personality assessment. Phenotypology is one such theory; it claims to have discovered interconnections between a person’s psychological and psychophysical characteristics on the basis of individual features of his/her phenotype. The aim of the article is to present some validation results for the Phenotypology hypotheses as a possible tool for personality assessment. In order to verify connections between phenotypic traits and individual behavior, we (...)
  27. Towards Knowledge-driven Distillation and Explanation of Black-box Models.Roberto Confalonieri, Guendalina Righetti, Pietro Galliani, Nicolas Toquard, Oliver Kutz & Daniele Porello - 2021 - In Roberto Confalonieri, Guendalina Righetti, Pietro Galliani, Nicolas Toquard, Oliver Kutz & Daniele Porello (eds.), Proceedings of the Workshop on Data meets Applied Ontologies in Explainable AI (DAO-XAI 2021), part of Bratislava Knowledge September (BAKS 2021), Bratislava, Slovakia, September 18th to 19th, 2021. CEUR 2998.
    We introduce and discuss a knowledge-driven distillation approach to explaining black-box models by means of two kinds of interpretable models. The first is perceptron (or threshold) connectives, which enrich knowledge representation languages such as Description Logics with linear operators that serve as a bridge between statistical learning and logical reasoning. The second is Trepan Reloaded, an approach that builds post-hoc explanations of black-box classifiers in the form of decision trees enhanced by domain knowledge. Our aim is, firstly, (...)
  28. Essays on the Metaphysics of Quantum Mechanics.Eddy Keming Chen - 2019 - Dissertation, Rutgers University, New Brunswick
    What is the proper metaphysics of quantum mechanics? In this dissertation, I approach the question from three different but related angles. First, I suggest that the quantum state can be understood intrinsically as relations holding among regions in ordinary space-time, from which we can recover the wave function uniquely up to an equivalence class (by representation and uniqueness theorems). The intrinsic account eliminates certain conventional elements (e.g. overall phase) in the representation of the quantum state. It also dispenses (...)
    1 citation
  29. Conceptual Centrality and Implicit Bias.Guillermo Del Pinal & Shannon Spaulding - 2018 - Mind and Language 33 (1):95-111.
    How are biases encoded in our representations of social categories? Philosophical and empirical discussions of implicit bias overwhelmingly focus on salient or statistical associations between target features and representations of social categories. These are the sorts of associations probed by the Implicit Association Test and various priming tasks. In this paper, we argue that these discussions systematically overlook an alternative way in which biases are encoded, that is, in the dependency networks that are part of our representations of social (...)
    10 citations
  30. Editorial: Perceptual Grouping — The State of the Art.Birgitta Dresp-Langley - 2017 - Frontiers in Psychology 8:67.
    Perceptual neuroscience has identified mechanisms of perceptual grouping which account for the ways in which visual sensitivity to ordered structure and regularities expresses itself, in behavior and in the brain. The need to actively construct order, notably representations of objects in depth, is mandated as soon as visual signals reach the retina, given the occlusion of retinal signals by retinal veins and other retinal elements or blur. Multiple stages of neural processing transform fragmented signals into visual key representations of 3D (...)
    1 citation
  31. A Simpler and More Realistic Subjective Decision Theory.Haim Gaifman & Yang Liu - 2018 - Synthese 195 (10):4205--4241.
    In his classic book “The Foundations of Statistics”, Savage developed a formal system of rational decision making. The system is based on (i) a set of possible states of the world, (ii) a set of consequences, (iii) a set of acts, which are functions from states to consequences, and (iv) a preference relation over the acts, which represents the preferences of an idealized rational agent. The goal and the culmination of the enterprise is a representation theorem: Any preference relation (...)
    9 citations
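The entry above summarizes Savage's framework: states, consequences, acts as functions from states to consequences, and a representation theorem recovering preferences as expected-utility comparisons. A minimal sketch of the expected-utility side of that representation, with invented states, probabilities, and utilities:

```python
# Sketch of the expected-utility representation: an act maps states to
# consequences, and one act is preferred to another iff its expected
# utility is higher. States, probabilities, and utilities are illustrative.

states = {"rain": 0.3, "sun": 0.7}                       # subjective probabilities
utility = {"wet": 0, "dry": 8, "dry_and_unburdened": 10} # utilities of consequences

def expected_utility(act):
    """act: state -> consequence."""
    return sum(p * utility[act(s)] for s, p in states.items())

def take_umbrella(state):
    return "dry"

def leave_umbrella(state):
    return "dry_and_unburdened" if state == "sun" else "wet"

print(expected_utility(take_umbrella))    # 8.0
print(expected_utility(leave_umbrella))   # 7.0 -> take_umbrella is preferred
```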
  32. A Plastic Temporal Brain Code for Conscious State Generation.Birgitta Dresp & Jean Durup - 2009 - Neural Plasticity 2009:1-15.
    Consciousness is known to be limited in processing capacity and often described in terms of a unique processing stream across a single dimension: time. In this paper, we discuss a purely temporal pattern code, functionally decoupled from spatial signals, for conscious state generation in the brain. Arguments in favour of such a code include Dehaene et al.’s long-distance reverberation postulate, Ramachandran’s remapping hypothesis, evidence for a temporal coherence index and coincidence detectors, and Grossberg’s Adaptive Resonance Theory. A time-bin resonance model (...)
    2 citations
  33. The State of the Discipline: New Data on Women Faculty in Philosophy.Sherri Lynn Conklin, Irina Artamonova & Nicole Hassoun - 2019 - Ergo: An Open Access Journal of Philosophy 6.
    This paper presents data on the representation of women at 98 philosophy departments in the United States: those ranked by the Philosophical Gourmet Report (PGR) in 2015, as well as all of those schools for which data from 2004 exist. The paper makes four points in providing an overview of the state of the field. First, all programs reveal a statistically significant increase in the percentage of women tenured/tenure-track faculty since 2004. Second, out of the 98 US philosophy (...)
  34. The outlier paradox: The role of iterative ensemble coding in discounting outliers.Michael Epstein, Jake Quilty-Dunn, Eric Mandelbaum & Tatiana Emmanouil - forthcoming - Journal of Experimental Psychology: Human Perception and Performance 1.
    Ensemble perception—the encoding of objects by their group properties—is known to be resistant to outlier noise. However, this resistance is somewhat paradoxical: how can the visual system determine which stimuli are outliers without already having derived statistical properties of the ensemble? A simple solution would be that ensemble perception is not a simple, one-step process; instead, outliers are detected through iterative computations that identify items with high deviance from the mean and reduce their weight in the representation over (...)
    1 citation
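The entry above proposes that outliers are discounted through iterative computations that down-weight items deviant from the running mean. A rough sketch of one such iterative scheme; the weighting rule, cutoff, and data are assumptions for illustration, not the paper's model:

```python
# Rough sketch of iterative ensemble coding: estimate the ensemble mean,
# down-weight items far from it, and re-estimate. Weighting rule and
# parameters are illustrative assumptions.
import statistics

def iterative_ensemble_mean(values, iterations=3, cutoff=2.0):
    weights = [1.0] * len(values)
    for _ in range(iterations):
        mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
        spread = statistics.pstdev(values) or 1.0
        # Items more than `cutoff` standard deviations from the current
        # estimate get their weight reduced.
        weights = [0.2 if abs(v - mean) > cutoff * spread else 1.0 for v in values]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

sizes = [10, 11, 9, 10, 12, 30]                    # one outlier at 30
print(round(statistics.mean(sizes), 2))            # 13.67, pulled by the outlier
print(round(iterative_ensemble_mean(sizes), 2))    # about 11, outlier discounted
```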
  35. Ecomindsponge: A Novel Perspective on Human Psychology and Behavior in the Ecosystem.Minh-Hoang Nguyen, Tam-Tri Le & Quan-Hoang Vuong - 2023 - Urban Science 7 (1):31.
    Modern society faces major environmental problems, but there are many difficulties in studying the nature–human relationship from an integral psychosocial perspective. We propose the ecomindsponge conceptual framework, based on the mindsponge theory of information processing. We present a systematic method to examine the nature–human relationship with conceptual frameworks of system boundaries, selective exchange, and adaptive optimization. The theoretical mechanisms were constructed based on principles and new evidence in natural sciences. The core mechanism of ecomindsponge is the subjective sphere of (...)
  36. Media Possibilities of Comics: Modern Tools for the Formation and Presentation of Organizational Culture.O. Hudoshnyk & Oleksandr P. Krupskyi - 2023 - European Journal of Management Issues 31 (1):40-49.
    Purpose: The modern development of mass culture is characterized by the growth of the market for graphic narratives, the rapid increase in the segment of digital comics, and the active use of comics as a communication tool in various industries and disciplinary areas. The purpose of the study is to determine the media capabilities of comics in presenting educational, cross-cultural, problematic, and ethical content of modern organizational culture. Design / Method / Approach: The review nature of the article involves the (...)
    1 citation
  37. Paraconsistent Sensitivity Analysis for Bayesian Significance Tests.Julio Michael Stern - 2004 - Lecture Notes in Artificial Intelligence 3171:134-143.
    In this paper, the notion of degree of inconsistency is introduced as a tool to evaluate the sensitivity of the Full Bayesian Significance Test (FBST) value of evidence with respect to changes in the prior or reference density. For that, both the definition of the FBST, a possibilistic approach to hypothesis testing based on Bayesian probability procedures, and the use of bilattice structures, as introduced by Ginsberg and Fitting, in paraconsistent logics, are reviewed. The computational and theoretical advantages of using (...)
    11 citations
  38. Classification by decomposition: a novel approach to classification of symmetric 2 × 2 games.Mikael Böörs, Tobias Wängberg, Tom Everitt & Marcus Hutter - 2022 - Theory and Decision 93 (3):463-508.
    In this paper, we provide a detailed review of previous classifications of 2 × 2 games and suggest a mathematically simple way to classify the symmetric 2 × 2 games based on a decomposition of the payoff matrix into a cooperative and a zero-sum part. We argue that differences in the interaction between the parts is what makes games interesting in different ways. Our claim is supported by evolutionary computer experiments and findings in previous literature. In addition, we provide a (...)
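The entry above classifies symmetric 2 × 2 games via a decomposition of the payoff matrix into a cooperative and a zero-sum part. One natural reading, assumed here for illustration rather than taken from the paper, splits the row player's payoff matrix into its symmetric (common-interest) and antisymmetric (zero-sum) components:

```python
# Decomposition of a symmetric 2 x 2 game's payoff matrix into a cooperative
# (common-interest) part and a zero-sum part. Splitting into symmetric and
# antisymmetric components is one natural reading, assumed for illustration.

def decompose(payoff):
    """payoff[i][j]: row player's payoff; the column player gets payoff[j][i]."""
    coop = [[(payoff[i][j] + payoff[j][i]) / 2 for j in range(2)] for i in range(2)]
    zero_sum = [[(payoff[i][j] - payoff[j][i]) / 2 for j in range(2)] for i in range(2)]
    return coop, zero_sum

# Prisoner's Dilemma payoffs for the row player (cooperate / defect).
pd = [[3, 0],
      [5, 1]]
coop, zero_sum = decompose(pd)
print(coop)      # [[3.0, 2.5], [2.5, 1.0]]  -- part both players share
print(zero_sum)  # [[0.0, -2.5], [2.5, 0.0]] -- purely competitive part
```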
  39. Philosophical foundations of intelligence collection and analysis: a defense of ontological realism.William Mandrick & Barry Smith - 2022 - Intelligence and National Security 38.
    There is a common misconception across the Intelligence Community (IC) to the effect that information trapped within multiple heterogeneous data silos can be semantically integrated by the sorts of meaning-blind statistical methods employed in much of artificial intelligence (AI) and natural language processing (NLP). This leads to the misconception that incoming data can be analysed coherently by relying exclusively on the use of statistical algorithms and thus without any shared framework for classifying what the data are about. Unfortunately, (...)
  40. Language Writ Large: LLMs, ChatGPT, Grounding, Meaning and Understanding.Stevan Harnad - manuscript
    Apart from what (little) OpenAI may be concealing from us, we all know (roughly) how ChatGPT works (its huge text database, its statistics, its vector representations, and their huge number of parameters, its next-word training, and so on). But none of us can say (hand on heart) that we are not surprised by what ChatGPT has proved to be able to do with these resources. This has even driven some of us to conclude that ChatGPT actually understands. It is not (...)
  41. Dependence relationships between Gene Ontology terms based on TIGR gene product annotations.Anand Kumar, Barry Smith & Christian Borgelt - 2004 - Proceedings of the 3rd International Workshop on Computational Terminology 2004:31-38.
    The Gene Ontology is an important tool for the representation and processing of information about gene products and functions. It provides controlled vocabularies for the designations of cellular components, molecular functions, and biological processes used in the annotation of genes and gene products. These constitute three separate ontologies of cellular components, molecular functions, and biological processes, respectively. The question we address here is: how are the terms in these three separate ontologies related to each other? We use statistical (...)
    2 citations
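The entry above derives dependence relationships between Gene Ontology terms from gene-product annotations. A minimal sketch of one simple association measure over co-annotations; the annotation table, the choice of terms, and the conditional-support measure are illustrative assumptions, not the paper's actual method or data:

```python
# Sketch of a simple dependence measure between two ontology terms based on
# gene-product annotations: how often term B is annotated given term A.
# The annotation data are made up for illustration.

annotations = {
    "geneA": {"GO:0005739", "GO:0006119"},
    "geneB": {"GO:0005739"},
    "geneC": {"GO:0005739", "GO:0006119"},
    "geneD": {"GO:0005634"},
}

def conditional_support(term_a, term_b, annot):
    """Estimate P(term_b annotated | term_a annotated) from counts."""
    with_a = [g for g, terms in annot.items() if term_a in terms]
    if not with_a:
        return 0.0
    with_both = [g for g in with_a if term_b in annot[g]]
    return len(with_both) / len(with_a)

print(conditional_support("GO:0005739", "GO:0006119", annotations))  # about 0.67
```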
  42. Is Mass at Rest One and the Same? A Philosophical Comment: on the Quantum Information Theory of Mass in General Relativity and the Standard Model.Vasil Penchev - 2014 - Journal of SibFU. Humanities and Social Sciences 7 (4):704-720.
    The way, in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity, is investigated. Quantum information is defined as the generalization of the concept of information as to the choice among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit is interpreted as a relevant elementary choice among an infinite set of alternatives generalizing that of a bit. The invariance to the axiom of (...)
  43. On the Notions of Rulegenerating & Anticipatory Systems.Niels Ole Finnemann - 1997 - Online Publication on Conference Site - Which Does Not Exist Any More.
    Until the late 19th century scientists almost always assumed that the world could be described as a rule-based and hence deterministic system or as a set of such systems. The assumption is maintained in many 20th century theories although it has also been doubted because of the breakthrough of statistical theories in thermodynamics (Boltzmann and Gibbs) and other fields, unsolved questions in quantum mechanics as well as several theories forwarded within the social sciences. Until recently it has furthermore been (...)
  44. Linguistic Competence and New Empiricism in Philosophy and Science.Vanja Subotić - 2023 - Dissertation, University of Belgrade
    The topic of this dissertation is the nature of linguistic competence, the capacity to understand and produce sentences of natural language. I defend the empiricist account of linguistic competence embedded in the connectionist cognitive science. This strand of cognitive science has been opposed to the traditional symbolic cognitive science, coupled with transformational-generative grammar, which was committed to nativism due to the view that human cognition, including language capacity, should be construed in terms of symbolic representations and hardwired rules. Similarly, linguistic (...)
  45. Body, dance and abstraction for spatial and structural comprehension in the first year of design education.Serkan Can Hatıpoğlu, Melih Kamaoğlu, Gamze Şensoy & Mehmet İnceoğlu - 2023 - International Journal of Technology and Design Education 33 (1).
    The first year of design education is essential for students as it is their initial interaction with the design process. Awareness of the body through dance has the potential to reveal bodily experience in space. Abstraction of embodied experience contributes to realising the significance of the body and its analytical dimension for spatial and structural design. This study investigates the impact of embodied experience and abstraction on the architectural design process and the outcome through correlation and regression analysis. We observed (...)
  46. Big Data: truth, quasi-truth or post-truth?Ricardo Peraça Cavassane & Itala M. Loffredo D'Ottaviano - 2020 - Acta Scientiarum. Human and Social Sciences 42 (3):1-7.
    In this paper we investigate if sentences presented as the result of the application of statistical models and artificial intelligence to large volumes of data – the so-called ‘Big Data’ – can be characterized as semantically true, or as quasi-true, or even if such sentences can only be characterized as probably quasi-false and, in a certain way, post-true; that is, if, in the context of Big Data, the representation of a data domain can be configured as a total (...)
  47. PRAGMATIC TRANSFER ON COMPLIMENT AND COMPLIMENT RESPONSES TO ENGLISH LEARNERS AT THE FACULTY OF ADAB AND HUMANIORA UIN ALAUDDIN MAKASSAR.Natsir Nurasia - 2021 - International Journal of Research and Innovation in Social Science (IJRISS) 5 (6):514-519.
    This study aims to describe (1) the forms of compliment strategies and compliment responses of English learners at the Adab and Humanities Faculty of UIN Alauddin Makassar, and (2) the factors related to the pragmatic transfer of English learners at the same faculty. The population in this study is all sixth-semester students of the Class of 2017 in English Language and Literature, Faculty of Adab and Humanities, UIN Alauddin Makassar, with a (...)
  48. Neural Correlates of Color-Selective Metacontrast in Human Early Retinotopic Areas.Kiyohiro Maeda, Hiroki Yamamoto, Masaki Fukunaga, Masahiro Umeda, Chuzo Tanaka & Yoshimichi Ejima - 2010 - Journal of Neurophysiology 104:2291-2301.
    Metacontrast is a visual illusion in which the visibility of a target stimulus is virtually lost when immediately followed by a nonoverlapping mask stimulus. For a colored target, metacontrast is color-selective, with target visibility markedly reduced when the mask and target are the same color, but only slightly reduced when the colors differ. This study investigated neural correlates of color-selective metacontrast for cone-opponent red and green stimuli in the human V1, V2, and V3 using functional magnetic resonance imaging. Neural activity (...)
    1 citation
  49. Demographic statistics in defensive decisions.Renée Jorgensen Bolinger - 2019 - Synthese 198 (5):4833-4850.
    A popular informal argument suggests that statistics about the preponderance of criminal involvement among particular demographic groups partially justify others in making defensive mistakes against members of the group. One could worry that evidence-relative accounts of moral rights vindicate this argument. After constructing the strongest form of this objection, I offer several replies: most demographic statistics face an unmet challenge from reference class problems, even those that meet it fail to ground non-negligible conditional probabilities, even if they did, they introduce (...)
    4 citations
  50. Statistical Evidence, Sensitivity, and the Legal Value of Knowledge.David Enoch, Levi Spectre & Talia Fisher - 2012 - Philosophy and Public Affairs 40 (3):197-224.
    The law views with suspicion statistical evidence, even evidence that is probabilistically on a par with direct, individual evidence that the law is in no way suspicious of. But it has proved remarkably hard to either justify this suspicion, or to debunk it. In this paper, we connect the discussion of statistical evidence to broader epistemological discussions of similar phenomena. We highlight Sensitivity – the requirement that a belief be counterfactually sensitive to the truth in a specific way (...)
    103 citations
Showing 1–50 of 998