Results for 'Shannon-information'

998 found
  1. Confusing the "Confusion Matrix": The Misapplication of Shannon Information Theory in Sensory Psychology. Lance Nizami - 2012 - Acta Systemica 12 (1):1-17.
    Information flow in a system is a core cybernetics concept. It has been used frequently in Sensory Psychology since 1951. There, Shannon Information Theory was used to calculate "information transmitted" in "absolute identification" experiments involving human subjects. Originally, in Shannon's "system", any symbol received ("outcome") is among the symbols sent ("events"). Not all symbols are received as transmitted, hence an indirect noise measure is calculated, "information transmitted", which requires knowing the confusion matrix, its columns (...)
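The "information transmitted" statistic that this entry (and several Nizami entries below) scrutinizes is Shannon's mutual information estimated from a stimulus-by-response confusion matrix. A minimal sketch, in Python; the 3-category matrix is invented for illustration, not taken from any of the cited experiments:

```python
import math

def information_transmitted(confusion):
    """Estimate Shannon 'information transmitted' (mutual information,
    in bits) from a stimulus-by-response confusion matrix of counts."""
    total = sum(sum(row) for row in confusion)
    # Joint and marginal probabilities from the raw counts.
    p_joint = [[c / total for c in row] for row in confusion]
    p_stim = [sum(row) for row in p_joint]
    p_resp = [sum(col) for col in zip(*p_joint)]
    t = 0.0
    for i, row in enumerate(p_joint):
        for j, p in enumerate(row):
            if p > 0:
                t += p * math.log2(p / (p_stim[i] * p_resp[j]))
    return t

# Hypothetical 3-category absolute-identification data:
# rows = stimuli sent ("events"), columns = responses received ("outcomes").
matrix = [[8, 2, 0],
          [1, 7, 2],
          [0, 3, 7]]
print(round(information_transmitted(matrix), 3))
```

Perfect identification would yield log2(3) ≈ 1.585 bits for three categories; responses independent of stimuli yield 0 bits, which is why the measure is read as an indirect measure of noise.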
  2. Counting Distinctions: On the Conceptual Foundations of Shannon’s Information Theory. David Ellerman - 2009 - Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U ] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite (...)
    18 citations
  3. A Generalization of Shannon's Information Theory. Chenguang Lu - 1999 - International Journal of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations (...)
    3 citations
  4. A Semantic Information Formula Compatible with Shannon and Popper's Theories. Chenguang Lu - manuscript
    Semantic Information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purpose, this paper introduces a new formula, Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon and K. Popper’s (...)
  5. Cognitive Empathy. Shannon Spaulding - 2017 - In Heidi L. Maibom (ed.), The Routledge Handbook of Philosophy of Empathy. Routledge Press. pp. 13-21.
    We have various strategies available to us for understanding another person’s state of mind. Cognitive empathy may be achieved by mental simulation, i.e. by imagining yourself in another’s situation and figuring out what you would think and feel in that situation. Alternatively, you could consider all the relevant information about the person’s situation and folk psychology and draw a sophisticated inference to the best explanation of that person’s perspective. In this chapter, I examine the conditions under which we are (...)
  6. Do You See What I See? How Social Differences Influence Mindreading. Shannon Spaulding - 2018 - Synthese 195 (9):4009-4030.
    Disagreeing with others about how to interpret a social interaction is a common occurrence. We often find ourselves offering divergent interpretations of others’ motives, intentions, beliefs, and emotions. Remarkably, philosophical accounts of how we understand others do not explain, or even attempt to explain such disagreements. I argue these disparities in social interpretation stem, in large part, from the effect of social categorization and our goals in social interactions, phenomena long studied by social psychologists. I argue we ought to expand (...)
    9 citations
  7. An Enactive-Ecological Approach to Information and Uncertainty. Eros Moreira de Carvalho & Giovanni Rolla - 2020 - Frontiers in Psychology 11 (Enaction and Ecological Psychology):1-11.
    Information is a central notion for cognitive sciences and neurosciences, but there is no agreement on what it means for a cognitive system to acquire information about its surroundings. In this paper, we approximate three influential views on information: the one at play in ecological psychology, which is sometimes called information for action; the notion of information as covariance as developed by some enactivists, and the idea of information as minimization of uncertainty as presented (...)
    4 citations
  8. Information Theory is Abused in Neuroscience. Lance Nizami - 2019 - Cybernetics and Human Knowing 26 (4):47-97.
    In 1948, Claude Shannon introduced his version of a concept that was core to Norbert Wiener's cybernetics, namely, information theory. Shannon's formalisms include a physical framework, namely a general communication system having six unique elements. Under this framework, Shannon information theory offers two particularly useful statistics, channel capacity and information transmitted. Remarkably, hundreds of neuroscience laboratories subsequently reported such numbers. But how (and why) did neuroscientists adapt a communications-engineering framework? Surprisingly, the literature offers no (...)
    1 citation
  9. An Introduction to Logical Entropy and its Relation to Shannon Entropy. David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
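Ellerman's logical entropy admits a compact calculation: for a partition pi of an n-element set U with equiprobable elements, h(pi) = |dit(pi)|/n^2 = 1 - sum over blocks B of (|B|/n)^2, against the Shannon entropy H(pi) = -sum of p_B log2 p_B. A minimal sketch under that equiprobability assumption; the example partition is invented:

```python
import math

def logical_entropy(partition, n):
    """h(pi) = 1 - sum((|B|/n)^2): the normalized count of distinctions,
    i.e. the probability that two independent equiprobable draws from U
    land in different blocks of the partition."""
    return 1 - sum((len(block) / n) ** 2 for block in partition)

def shannon_entropy(partition, n):
    """H(pi) = -sum(p_B * log2(p_B)) over the blocks."""
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in partition)

# U = {0,...,5} partitioned into blocks {0,1,2}, {3,4}, {5}.
pi = [{0, 1, 2}, {3, 4}, {5}]
print(logical_entropy(pi, 6), shannon_entropy(pi, 6))
```

The two extremes illustrate the duality: the indiscrete partition (one block) makes no distinctions and both entropies are 0, while the discrete partition into singletons gives h = 1 - 1/n and H = log2(n).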
  10. From Art to Information System. Miro Brada - 2021 - AGI Laboratory.
    This insight to art came from chess composition concentrating art in a very dense form. To identify and mathematically assess the uniqueness is the key applicable to other areas eg. computer programming. Maximization of uniqueness is minimization of entropy that coincides as well as goes beyond Information Theory (Shannon, 1948). The reusage of logic as a universal principle to minimize entropy, requires simplified architecture and abstraction. Any structures (e.g. plugins) duplicating or dividing functionality increase entropy and so unreliability (...)
  11. Information-Not-Thing: Further Problems with and Alternatives to the Belief That Information is Physical. Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and discuss existing definitions of information that are consistent with this approach. -/- Dans ce court article nous montrons qu'une vision (...)
    1 citation
  12. Conceptual Centrality and Implicit Bias. Guillermo Del Pinal & Shannon Spaulding - 2018 - Mind and Language 33 (1):95-111.
    How are biases encoded in our representations of social categories? Philosophical and empirical discussions of implicit bias overwhelmingly focus on salient or statistical associations between target features and representations of social categories. These are the sorts of associations probed by the Implicit Association Test and various priming tasks. In this paper, we argue that these discussions systematically overlook an alternative way in which biases are encoded, that is, in the dependency networks that are part of our representations of social categories. (...)
  13. How Lateral Inhibition and Fast Retinogeniculo-Cortical Oscillations Create Vision: A New Hypothesis. Ravinder Jerath, Shannon M. Cearley, Vernon A. Barnes & Elizabeth Nixon-Shapiro - 2016 - Medical Hypotheses 96:20-29.
    The role of the physiological processes involved in human vision escapes clarification in current literature. Many unanswered questions about vision include: 1) whether there is more to lateral inhibition than previously proposed, 2) the role of the discs in rods and cones, 3) how inverted images on the retina are converted to erect images for visual perception, 4) what portion of the image formed on the retina is actually processed in the brain, 5) the reason we have an after-image with (...)
    2 citations
  14. Logical Entropy: Introduction to Classical and Quantum Logical Information Theory. David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The (...)
    4 citations
  15. The Introduction of Information Into Neurobiology. Justin Garson - 2003 - Philosophy of Science 70 (5):926-936.
    The first use of the term "information" to describe the content of nervous impulse occurs 20 years prior to Shannon’s (1948) work, in Edgar Adrian’s The Basis of Sensation (1928). Although, at least throughout the 1920s and early 30s, the term "information" does not appear in Adrian’s scientific writings to describe the content of nervous impulse, the notion that the structure of nervous impulse constitutes a type of message subject to certain constraints plays an important role in (...)
    17 citations
  16. Information, Learning and Falsification. David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled (...)
    1 citation
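The Shannon side of Balduzzi's contrast, information as a count of the alternatives an observation rules out, reduces to the surprisal -log2(p) of an event and its expectation, the entropy of the ensemble. A minimal sketch; the example distribution is invented:

```python
import math

def surprisal(p):
    """Shannon information (in bits) of an event with probability p:
    the log of the number of equally likely alternatives it rules out."""
    return -math.log2(p)

def entropy(dist):
    """Expected surprisal over an ensemble of mutually exclusive events."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# Observing one event among 8 equally likely alternatives carries 3 bits.
print(surprisal(1 / 8))
print(entropy([0.5, 0.25, 0.125, 0.125]))
```

Contrast this with the algorithmic (Kolmogorov) approach also named in the abstract, which assigns information to an individual string via the length of its shortest generating program rather than via an ensemble of alternatives.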
  17. Homunculus Strides Again: Why ‘Information Transmitted’ in Neuroscience Tells Us Nothing. Lance Nizami - 2015 - Kybernetes 44:1358-1370.
    Purpose – For half a century, neuroscientists have used Shannon Information Theory to calculate “information transmitted,” a hypothetical measure of how well neurons “discriminate” amongst stimuli. Neuroscientists’ computations, however, fail to meet even the technical requirements for credibility. Ultimately, the reasons must be conceptual. That conclusion is confirmed here, with crucial implications for neuroscience. The paper aims to discuss these issues. Design/methodology/approach – Shannon Information Theory depends upon a physical model, Shannon’s “general communication system.” (...)
    2 citations
  18. Information Theory’s Failure in Neuroscience: On the Limitations of Cybernetics. Lance Nizami - 2014 - In Proceedings of the IEEE 2014 Conference on Norbert Wiener in the 21st Century.
    In Cybernetics (1961 Edition), Professor Norbert Wiener noted that “The role of information and the technique of measuring and transmitting information constitute a whole discipline for the engineer, for the neuroscientist, for the psychologist, and for the sociologist”. Sociology aside, the neuroscientists and the psychologists inferred “information transmitted” using the discrete summations from Shannon Information Theory. The present author has since scrutinized the psychologists’ approach in depth, and found it wrong. The neuroscientists’ approach is highly (...)
  19. Semantic Information Measure with Two Types of Probability for Falsification and Confirmation. Chenguang Lu - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use a sampling distribution (conditional SP function) to test or confirm a fuzzy truth function (conditional LP function). The Semantic Information Measure (SIM) proposed is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the less the LP of a predicate is and the larger the true value of the proposition is, the more (...) there is. So the SIM can be used as Popper's information criterion for falsification or test. The SIM also allows us to optimize the true-value of counterexamples or degrees of disbelief in a hypothesis to get the optimized degree of belief, i.e., Degree of Confirmation (DOC). To explain confirmation, this paper 1) provides the calculation method of the DOC of universal hypotheses; 2) discusses how to resolve the Raven Paradox with new DOC and its increment; 3) derives the DOC of rapid HIV tests: DOC of “+” = 1-(1-specificity)/sensitivity, which is similar to the Likelihood Ratio (= sensitivity/(1-specificity)) but has the upper limit 1; 4) discusses negative DOC for excessive affirmations, wrong hypotheses, or lies; and 5) discusses the DOC of general hypotheses with GPS as an example.
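The abstract's degree-of-confirmation formula for a positive test, DOC("+") = 1 - (1 - specificity)/sensitivity, and the likelihood ratio it is compared with, LR+ = sensitivity/(1 - specificity), are simple enough to check numerically. A minimal sketch using only the formulas as stated in the abstract; the test characteristics are hypothetical:

```python
def doc_positive(sensitivity, specificity):
    """Lu's Degree of Confirmation for a positive test result:
    DOC('+') = 1 - (1 - specificity) / sensitivity.
    Bounded above by 1, reached when specificity = 1."""
    return 1 - (1 - specificity) / sensitivity

def likelihood_ratio_positive(sensitivity, specificity):
    """Classical LR+ = sensitivity / (1 - specificity); unbounded
    as specificity approaches 1, unlike the DOC."""
    return sensitivity / (1 - specificity)

# Hypothetical rapid-test characteristics.
sens, spec = 0.95, 0.99
print(doc_positive(sens, spec), likelihood_ratio_positive(sens, spec))
```

The contrast the abstract draws is visible here: as specificity rises toward 1, LR+ diverges while DOC saturates at its upper limit of 1.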
  20. Objects and Processes: Two Notions for Understanding Biological Information. Agustín Mercado-Reyes, Pablo Padilla Longoria & Alfonso Arroyo-Santos - forthcoming - Journal of Theoretical Biology.
    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that (...)
    1 citation
  21. A Quantitative-Informational Approach to Logical Consequence. Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon’s quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define (...)
  22. Interpretation of Absolute Judgments Using Information Theory: Channel Capacity or Memory Capacity? Lance Nizami - 2010 - Cybernetics and Human Knowing 17:111-155.
    Shannon’s information theory has been a popular component of first-order cybernetics. It quantifies information transmitted in terms of the number of times a sent symbol is received as itself, or as another possible symbol. Sent symbols were events and received symbols were outcomes. Garner and Hake reinterpreted Shannon, describing events and outcomes as categories of a stimulus attribute, so as to quantify the information transmitted in the psychologist’s category (or absolute judgment) experiment. There, categories are (...)
    3 citations
  23. An Analysis of Information Visualisation. Min Chen & Luciano Floridi - 2013 - Synthese 190 (16):3421-3438.
    Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualisation pipeline. We (...)
  24. The Concept of Information for Functions. Yair Lapin - manuscript
    Shannon information for functions: an interpretation for equations with multiple solutions, and some reflections on recursion, recursive functions, and computational complexity.
  25. Turing's Three Philosophical Lessons and the Philosophy of Information. Luciano Floridi - 2012 - Philosophical Transactions of the Royal Society A 370 (1971):3536-3542.
    In this article, I outline the three main philosophical lessons that we may learn from Turing’s work, and how they lead to a new philosophy of information. After a brief introduction, I discuss his work on the method of levels of abstraction (LoA), and his insistence that questions could be meaningfully asked only by specifying the correct LoA. I then look at his second lesson, about the sort of philosophical questions that seem to be most pressing today. Finally, I (...)
    6 citations
  26. Memory Model of Information Transmitted in Absolute Judgment. Lance Nizami - 2011 - Kybernetes 40:80-109.
    Purpose – The purpose of this paper is to examine the popular “information transmitted” interpretation of absolute judgments, and to provide an alternative interpretation if one is needed. Design/methodology/approach – The psychologists Garner and Hake and their successors used Shannon’s Information Theory to quantify information transmitted in absolute judgments of sensory stimuli. Here, information theory is briefly reviewed, followed by a description of the absolute judgment experiment, and its information theory analysis. Empirical channel capacities (...)
    3 citations
  27. Is Dretske's Theory of Information Naturalistically Grounded? How Emergent Communication Channels Reference an Abstracted Ontic Framework. Timothy M. Rogers - manuscript
    By bringing together Dretske’s theory of knowledge, Shannon’s theory of information, and the conceptual framework of statistical physics, this paper explores some of the meta-physical challenges posed by a naturalistic notion of semantical information. It is argued that Dretske’s theory cannot be said to be naturalistically grounded in the world described by classical physics and that Dretske information is not consistent with Shannon information. A possible route to reconciling Dretske’s insights with Shannon’s theory (...)
  28. Preface to a Philosophy of Legal Information. Kevin Lee - 2018 - SMU Science and Technology Law Review 20.
    This essay introduces the philosophy of legal information (PLI), which is a response to the radical changes brought about in philosophy by the information revolution. It reviews in some detail the work of Luciano Floridi, who is an influential advocate for an information turn in philosophy that he calls the philosophy of information (PI). Floridi proposes that philosophers investigate the conceptual nature of information as it currently exists across multiple disciplines. He shows how a focus (...)
  29. Historical and Conceptual Foundations of Information Physics. Javier Anta - 2021 - Dissertation, Universitat de Barcelona
    The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I will first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. In the following, (...)
  30. Beyond Turing: Hypercomputation and Quantum Morphogenesis. Ignazio Licata - 2012 - Asia Pacific Mathematics Newsletter 2 (3):20-24.
    A Geometrical Approach to Quantum Information.
  31. Norwich’s Entropy Theory: How Not to Go From Abstract to Actual. Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” (...)
    4 citations
  32. Paradigm Versus Praxis: Why Psychology ‘Absolute Identification’ Experiments Do Not Reveal Sensory Processes. Lance Nizami - 2013 - Kybernetes 42:1447-1456.
    Purpose – A key cybernetics concept, information transmitted in a system, was quantified by Shannon. It quickly gained prominence, inspiring a version by Harvard psychologists Garner and Hake for “absolute identification” experiments. There, human subjects “categorize” sensory stimuli, affording “information transmitted” in perception. The Garner-Hake formulation has been in continuous use for 62 years, exerting enormous influence. But some experienced theorists and reviewers have criticized it as uninformative. They could not explain why, and were ignored. Here, the (...)
    3 citations
  33. Emergence and Computation at the Edge of Classical and Quantum Systems. Ignazio Licata - 2008 - In World Scientific (ed.), Physics of Emergence and Organization. World Scientific.
    The problem of emergence in physical theories makes it necessary to build a general theory of the relationships between the observed system and the observing system. It can be shown that there exists a correspondence between classical systems and computational dynamics according to the Shannon-Turing model. A classical system is an informational closed system with respect to the observer; this characterizes the emergent processes in classical physics as phenomenological emergence. In quantum systems, the analysis based on the computation theory fails. (...)
    7 citations
  34. Normativity at the Edge of Reason - Review of Cecile Malaspina, An Epistemology of Noise. [REVIEW] Iain Campbell - 2021 - Radical Philosophy 9:93-96.
    In recent years noise seems to have become an interdisciplinary concept par excellence, apt to capture important dynamics at work whether in technological, scientific, social, or aesthetic domains. But when economists, biologists, psychologists, and musicians speak of noise, are they really all referring to the same thing? In An Epistemology of Noise Cecile Malaspina takes this dispersion of the notion of noise as a starting point, and moreover accepts that, when removed from its mathematical formulation in information theory and (...)
  35. Sensory Systems as Cybernetic Systems That Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception. Lance Nizami - 2009 - Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics. San Antonio, TX.
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] -present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation (...)
    3 citations
  36. Arithmetic Logical Irreversibility and the Turing's Halt Problem. Yair Lapin - manuscript
    A new approach to the halting problem of the Turing machine, using different interpretations of the Shannon measure of information on the computational process, represented as a distribution of events (deletions, logical or arithmetic operations). It defines a new concept of arithmetic logical irreversibility and memory erasure, which generate uncertainty and computational improbability due to the loss of information during these events. Different computational steps (input) may give the same result (next step, output), thus introducing information (...)
  37. “Identifying Phrasal Connectives in Italian Using Quantitative Methods”. Edoardo Zamuner, Fabio Tamburini & Cristiana de Sanctis - 2002 - In Stefania Nuccorini (ed.), Phrases and Phraseology – Data and Descriptions. Peter Lang Verlag.
    In recent decades, the analysis of phraseology has made use of the exploration of large corpora as a source of quantitative information about language. This paper intends to present the main lines of work in progress based on this empirical approach to linguistic analysis. In particular, we focus our attention on some problems relating to the morpho-syntactic annotation of corpora. The CORIS/CODIS corpus of contemporary written Italian, developed at CILTA – University of Bologna (Rossini Favretti 2000; Rossini Favretti, Tamburini, (...)
  38. The Rise of Cognitive Science in the 20th Century. Carrie Figdor - 2018 - In Amy Kind (ed.), Philosophy of Mind in the Twentieth and Twenty-First Centuries: The History of the Philosophy of Mind, Volume 6. Abingdon, UK and New York: Routledge. pp. 280-302.
    This chapter describes the conceptual foundations of cognitive science during its establishment as a science in the 20th century. It is organized around the core ideas of individual agency as its basic explanans and information-processing as its basic explanandum. The latter consists of a package of ideas that provide a mathematico-engineering framework for the philosophical theory of materialism.
    1 citation
  39. On the Notions of Rulegenerating & Anticipatory Systems. Niels Ole Finnemann - 1997 - Online publication on a conference site which no longer exists.
    Until the late 19th century scientists almost always assumed that the world could be described as a rule-based and hence deterministic system or as a set of such systems. The assumption is maintained in many 20th century theories although it has also been doubted because of the breakthrough of statistical theories in thermodynamics (Boltzmann and Gibbs) and other fields, unsolved questions in quantum mechanics as well as several theories forwarded within the social sciences. Until recently it has furthermore been assumed (...)
  40. Whispers and Shouts. The Measurement of the Human Act. Fernando Flores Morador & Luis de Marcos Ortega (eds.) - 2021 - Alcalá de Henares, Madrid: Department of Computational Sciences, University of Alcalá.
    The 20th Century is the starting point for the most ambitious attempts to extrapolate human life into artificial systems. Norbert Wiener’s Cybernetics, Claude Shannon’s Information Theory, John von Neumann’s Cellular Automata and Universal Constructor, the Turing Test, Artificial Intelligence, and Maturana and Varela’s Autopoietic Organization all shared the goal of understanding in what sense humans resemble a machine. This scientific and technological movement has embraced all disciplines without exceptions, not only mathematics and physics but also biology, sociology, psychology, (...)
  41. Is Uncertainty Reduction the Basis for Perception? Errors in Norwich’s Entropy Theory of Perception Imply Otherwise. Lance Nizami - 2010 - Proceedings of the World Congress on Engineering and Computer Science 2010 (Lecture Notes in Engineering and Computer Science) 2.
    This paper reveals errors within Norwich et al.’s Entropy Theory of Perception, errors that have broad implications for our understanding of perception. What Norwich and coauthors dubbed their “informational theory of neural coding” is based on cybernetics, that is, control and communication in man and machine. The Entropy Theory uses information theory to interpret human performance in absolute judgments. There, the continuum of the intensity of a sensory stimulus is cut into categories and the subject is shown exemplar stimuli (...)
  42. Shannon + Friston = Content: Intentionality in Predictive Signaling Systems. Carrie Figdor - 2021 - Synthese 199 (1-2):2793-2816.
    What is the content of a mental state? This question poses the problem of intentionality: to explain how mental states can be about other things, where being about them is understood as representing them. A framework that integrates predictive coding and signaling systems theories of cognitive processing offers a new perspective on intentionality. On this view, at least some mental states are evaluations, which differ in function, operation, and normativity from representations. A complete naturalistic theory of intentionality must account for (...)
    Bookmark   2 citations  
  43. Phenomenology of Social Explanation.Shannon Spaulding - forthcoming - Phenomenology and the Cognitive Sciences:1-17.
    The orthodox view of social cognition maintains that mentalizing is an important and pervasive element of our ordinary social interactions. The orthodoxy has come under scrutiny from various sources recently. Critics from the phenomenological tradition argue that phenomenological reflection on our social interactions tells against the orthodox view. Proponents of pluralistic folk psychology argue that our ordinary social interactions extend far beyond mentalizing. Both sorts of critics argue that emphasis in social cognition research ought to be on other elements of (...)
  44. On Direct Social Perception.Shannon Spaulding - 2015 - Consciousness and Cognition 36:472-482.
    Direct Social Perception (DSP) is the idea that we can non-inferentially perceive others’ mental states. In this paper, I argue that the standard way of framing DSP leaves the debate at an impasse. I suggest two alternative interpretations of the idea that we see others’ mental states: others’ mental states are represented in the content of our perception, and we have basic perceptual beliefs about others’ mental states. I argue that the latter interpretation of DSP is more promising and examine (...)
    Bookmark   52 citations  
  45. Mirror Neurons and Social Cognition.Shannon Spaulding - 2013 - Mind and Language 28 (2):233-257.
    Mirror neurons are widely regarded as an important key to social cognition. Despite such wide agreement, there is very little consensus on how or why they are important. The goal of this paper is to clearly explicate the exact role mirror neurons play in social cognition. I aim to answer two questions about the relationship between mirroring and social cognition: What kind of social understanding is involved with mirroring? How is mirroring related to that understanding? I argue that philosophical and (...)
    Bookmark   26 citations  
  46. Imagination Through Knowledge.Shannon Spaulding - 2016 - In Amy Kind & Peter Kung (eds.), Knowledge Through Imagination. Oxford University Press. pp. 207-226.
    Imagination seems to play an epistemic role in philosophical and scientific thought experiments, mindreading, and ordinary practical deliberations insofar as it generates new knowledge of contingent facts about the world. However, it also seems that imagination is limited to creative generation of ideas. Sometimes we imagine fanciful ideas that depart freely from reality. The conjunction of these claims is what I call the puzzle of knowledge through imagination. This chapter aims to resolve this puzzle. I argue that imagination has an (...)
    Bookmark   18 citations  
  47. On Whether We Can See Intentions.Shannon Spaulding - 2017 - Pacific Philosophical Quarterly 98 (2):150-170.
    Direct Perception is the view that we can see others' mental states, i.e. that we perceive others' mental states with the same immediacy and directness that we perceive ordinary objects in the world. I evaluate Direct Perception by considering whether we can see intentions, a particularly promising candidate for Direct Perception. I argue that the view equivocates on the notion of intention. Disambiguating the Direct Perception claim reveals a troubling dilemma for the view: either it is banal or highly implausible.
    Bookmark   16 citations  
  48. Imagination, Desire, and Rationality.Shannon Spaulding - 2015 - Journal of Philosophy 112 (9):457-476.
    We often have affective responses to fictional events. We feel afraid for Desdemona when Othello approaches her in a murderous rage. We feel disgust toward Iago for orchestrating this tragic event. What mental architecture could explain these affective responses? In this paper I consider the claim that the best explanation of our affective responses to fiction involves imaginative desires. Some theorists argue that accounts that do not invoke imaginative desires imply that consumers of fiction have irrational desires. I argue that (...)
    Bookmark   13 citations  
  49. Introduction to Folk Psychology: Pluralistic Approaches.Kristin Andrews, Shannon Spaulding & Evan Westra - 2020 - Synthese 199 (1-2):1685-1700.
    This introduction to the topical collection Folk Psychology: Pluralistic Approaches reviews the origins and basic theoretical tenets of the framework of pluralistic folk psychology. It places special emphasis on pluralism about the variety of folk psychological strategies that underlie behavioral prediction and explanation beyond belief-desire attribution, and on the diverse range of social goals that folk psychological reasoning supports beyond prediction and explanation. Pluralism is not presented as a single theory or model of social cognition, but rather as a big-tent research (...)
    Bookmark   6 citations  
  50. Mind Misreading.Shannon Spaulding - 2016 - Philosophical Issues 26 (1).
    Most people think of themselves as pretty good at understanding others’ beliefs, desires, emotions, and intentions. Accurate mindreading is an impressive cognitive feat, and for this reason the philosophical literature on mindreading has focused exclusively on explaining such successes. However, as it turns out, we regularly make mindreading mistakes. Understanding when and how mind misreading occurs is crucial for a complete account of mindreading. In this paper, I examine the conditions under which mind misreading occurs. I argue that these patterns (...)
    Bookmark   9 citations  
1 — 50 / 998