Results for 'Shannon-information'

961 found
  1. Do you see what I see? How social differences influence mindreading.Shannon Spaulding - 2018 - Synthese 195 (9):4009-4030.
    Disagreeing with others about how to interpret a social interaction is a common occurrence. We often find ourselves offering divergent interpretations of others’ motives, intentions, beliefs, and emotions. Remarkably, philosophical accounts of how we understand others do not explain, or even attempt to explain such disagreements. I argue these disparities in social interpretation stem, in large part, from the effect of social categorization and our goals in social interactions, phenomena long studied by social psychologists. I argue we ought to expand (...)
    14 citations
  2. Cognitive Empathy.Shannon Spaulding - 2017 - In Heidi Lene Maibom (ed.), The Routledge Handbook of Philosophy of Empathy. Routledge. pp. 13-21.
    We have various strategies available to us for understanding another person’s state of mind. Cognitive empathy may be achieved by mental simulation, i.e. by imagining yourself in another’s situation and figuring out what you would think and feel in that situation. Alternatively, you could consider all the relevant information about the person’s situation and folk psychology and draw a sophisticated inference to the best explanation of that person’s perspective. In this chapter, I examine the conditions under which we are (...)
    3 citations
  3. Military Ethics and Strategy: Senior Commanders, Moral Values and Cultural Perspectives.Shannon Brandt Ford - 2015 - In George Lucas Jr. (ed.), Routledge Handbook of Military Ethics. London: Routledge.
    In this chapter, I explore the importance of ethics education for senior military officers with responsibilities at the strategic level of government. One problem, as I see it, is that senior commanders might demand “ethics” from their soldiers but then they are themselves primarily informed by a “morally skeptical viewpoint” (in the form of political realism). I argue that ethics are more than a matter of personal behavior alone: the ethical position of an armed service is a matter of the (...)
  4. Conceptual Centrality and Implicit Bias.Guillermo Del Pinal & Shannon Spaulding - 2018 - Mind and Language 33 (1):95-111.
    How are biases encoded in our representations of social categories? Philosophical and empirical discussions of implicit bias overwhelmingly focus on salient or statistical associations between target features and representations of social categories. These are the sorts of associations probed by the Implicit Association Test and various priming tasks. In this paper, we argue that these discussions systematically overlook an alternative way in which biases are encoded, that is, in the dependency networks that are part of our representations of social categories. (...)
    10 citations
  5. Cybersecurity, Trustworthiness and Resilient Systems: Guiding Values for Policy.Adam Henschke & Shannon Ford - 2017 - Journal of Cyber Policy 1 (2).
    Cyberspace relies on information technologies to mediate relations between different people across different communication networks, and it is reliant on the supporting technology. These interactions typically occur without physical proximity, and those who depend on cybersystems must be able to trust the overall human–technical systems that support cyberspace. As such, detailed discussion of cybersecurity policy would be improved by including trust as a key value to help guide policy discussions. Moreover, effective cybersystems must have resilience designed into them. This paper (...)
    1 citation
  6. I, Spy Robot: The Ethics of Robots in National Intelligence Activities.Patrick Lin & Shannon Brandt Ford - 2016 - In Jai Galliott & Warren Reed (eds.), Ethics and the Future of Spying: Technology, National Security and Intelligence Collection. Routledge. pp. 145-157.
    In this chapter, we examine the key moral issues for the intelligence community with regard to the use of robots for intelligence collection. First, we survey the diverse range of spy robots that currently exist or are emerging, and examine their value for national security. This includes describing a number of plausible scenarios in which they have been (or could be) used, including: surveillance, attack, sentry, information collection, delivery, extraction, detention, interrogation and as Trojan horses. Second, we examine several (...)
    1 citation
  7. How lateral inhibition and fast retinogeniculo-cortical oscillations create vision: A new hypothesis.Ravinder Jerath, Shannon M. Cearley, Vernon A. Barnes & Elizabeth Nixon-Shapiro - 2016 - Medical Hypotheses 96:20-29.
    The role of the physiological processes involved in human vision escapes clarification in current literature. Many unanswered questions about vision include: 1) whether there is more to lateral inhibition than previously proposed, 2) the role of the discs in rods and cones, 3) how inverted images on the retina are converted to erect images for visual perception, 4) what portion of the image formed on the retina is actually processed in the brain, 5) the reason we have an after-image with (...)
    2 citations
  8. Confusing the "Confusion Matrix": The Misapplication of Shannon Information Theory in Sensory Psychology.Lance Nizami - 2012 - Acta Systemica 12 (1):1-17.
    Information flow in a system is a core cybernetics concept. It has been used frequently in Sensory Psychology since 1951. There, Shannon Information Theory was used to calculate "information transmitted" in "absolute identification" experiments involving human subjects. Originally, in Shannon's "system", any symbol received ("outcome") is among the symbols sent ("events"). Not all symbols are received as transmitted, hence an indirect noise measure is calculated, "information transmitted", which requires knowing the confusion matrix, its columns (...)
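    A minimal illustration of the quantity discussed in this entry: "information transmitted" computed as the mutual information of a stimulus-response confusion matrix. This sketch is not taken from Nizami's paper, and the counts below are invented.

import numpy as np

def information_transmitted(confusion):
    """Mutual information (in bits) between sent and received symbols,
    estimated from a count matrix: rows = stimuli sent, columns = responses."""
    p = confusion / confusion.sum()            # joint distribution p(s, r)
    ps = p.sum(axis=1, keepdims=True)          # marginal p(s)
    pr = p.sum(axis=0, keepdims=True)          # marginal p(r)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log2(p / (ps * pr))     # zero cells become NaN, dropped below
    return float(np.nansum(terms))

# A made-up 3-category absolute-identification experiment (invented counts):
counts = np.array([[18, 2, 0],
                   [3, 14, 3],
                   [0, 4, 16]], dtype=float)
print(information_transmitted(counts))         # bits transmitted per presentation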
  9. Ethical Leadership as a Balance Between Opposing Neural Networks.Kylie C. Rochford, Anthony I. Jack, Richard E. Boyatzis & Shannon E. French - 2017 - Journal of Business Ethics 144 (4):755-770.
    In this article, we explore the implications of opposing domains theory for developing ethical leaders. Opposing domains theory highlights a neurological tension between analytic reasoning and socioemotional reasoning. Specifically, when we engage in analytic reasoning, we suppress our ability to engage in socioemotional reasoning and vice versa. In this article, we bring together the domains of neuroscience, psychology, and ethics, to inform our theorizing around ethical leadership. We propose that a key issue for ethical leadership is achieving a healthy balance (...)
    5 citations
  10. A Semantic Information Formula Compatible with Shannon and Popper's Theories.Chenguang Lu - manuscript
    Semantic Information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purpose, this paper introduces a new formula, Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon and K. Popper’s (...)
  11. A Generalization of Shannon's Information Theory.Chenguang Lu - 1999 - Int. J. Of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecast always conforms with objective facts then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations (...)
    3 citations
  12. Information Theory is abused in neuroscience.Lance Nizami - 2019 - Cybernetics and Human Knowing 26 (4):47-97.
    In 1948, Claude Shannon introduced his version of a concept that was core to Norbert Wiener's cybernetics, namely, information theory. Shannon's formalisms include a physical framework, namely a general communication system having six unique elements. Under this framework, Shannon information theory offers two particularly useful statistics, channel capacity and information transmitted. Remarkably, hundreds of neuroscience laboratories subsequently reported such numbers. But how (and why) did neuroscientists adapt a communications-engineering framework? Surprisingly, the literature offers no (...)
    2 citations
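    For reference, the two statistics named in this entry have standard textbook forms (general definitions, not equations taken from the article); for a binary symmetric channel with crossover probability p, the capacity has a closed form:

\[
I(X;Y) = H(Y) - H(Y \mid X), \qquad
C = \max_{p(x)} I(X;Y), \qquad
C_{\mathrm{BSC}} = 1 - H_2(p), \quad
H_2(p) = -p\log_2 p - (1-p)\log_2(1-p).
\]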
  13. Counting distinctions: on the conceptual foundations of Shannon’s information theory.David Ellerman - 2009 - Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U ] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite (...)
    18 citations
  14. An Enactive-Ecological Approach to Information and Uncertainty.Eros Moreira de Carvalho & Giovanni Rolla - 2020 - Frontiers in Psychology 11 (Enaction and Ecological Psycholo):1-11.
    Information is a central notion for cognitive sciences and neurosciences, but there is no agreement on what it means for a cognitive system to acquire information about its surroundings. In this paper, we approximate three influential views on information: the one at play in ecological psychology, which is sometimes called information for action; the notion of information as covariance as developed by some enactivists, and the idea of information as minimization of uncertainty as presented (...)
    7 citations
  15. “A Thousand Words”: How Shannon Entropy perspective provides link among exponential data growth, average temperature of the Earth, declining Earth magnetic field, and global consciousness.Victor Christianto & Florentin Smarandache - manuscript
    The sunspot data seem to indicate that the Sun is likely to enter a Maunder Minimum, which would mean that low solar activity may cause lower temperatures on Earth. If this happens, it will cause a phenomenon that some climatology experts call a "Little Ice Age" for the next 20-30 years, starting within the next few years. The Earth's climate in the coming years would therefore tend to be cooler than before. This phenomenon then causes us to (...)
  16. An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
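    A small worked sketch of the contrast drawn in this entry, using only the definitions stated in the abstract (the partition below is an arbitrary example): logical entropy as the normalized count of distinctions of a partition, next to the Shannon entropy of the same block probabilities.

import math
from itertools import product

def block_of(partition, u):
    """Index of the block of the partition containing element u."""
    return next(i for i, block in enumerate(partition) if u in block)

def logical_entropy(partition, universe):
    """h(pi) = |dit(pi)| / |U|^2: the fraction of ordered pairs (u, u')
    whose two elements fall in distinct blocks of the partition."""
    n = len(universe)
    dits = sum(1 for u, v in product(universe, repeat=2)
               if block_of(partition, u) != block_of(partition, v))
    return dits / n ** 2

def shannon_entropy(partition, universe):
    """H(pi) = -sum of p_B log2 p_B over block probabilities p_B = |B|/|U|."""
    n = len(universe)
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in partition)

U = range(6)                     # arbitrary example universe {0,...,5}
pi = [{0, 1, 2}, {3, 4}, {5}]    # arbitrary example partition
print(logical_entropy(pi, U))    # 22/36 ~= 0.611, i.e. 1 - sum of (|B|/|U|)^2
print(shannon_entropy(pi, U))    # ~1.459 bits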
  17. Information-not-thing: further problems with and alternatives to the belief that information is physical.Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and discuss existing definitions of information that are consistent with this approach.
    1 citation
  18. Further on informational quanta, interactions, and entropy under the granular view of value formation.Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - Informational Entropy-Based Notion of Value.
    A recent study suggests that value and quantum states seem to be governed by the same underlying mechanisms. In our recent book titled "Better economics for the Earth: A lesson from quantum and information theories," specifically Chapter 5, we have proposed an informational entropy-based notion of value, drawing on the granular worldview and primary features of quantum mechanics, Shannon’s information theory, and the mindsponge theory. Specifically, the notion suggests that values are created through the interactions of (...). But how do information and its interactions lead to values? This research note aims to contribute to the answer to this question. (shrink)
  19. Information before information theory: The politics of data beyond the perspective of communication.Colin Koopman - forthcoming - New Media and Society.
    Scholarship on the politics of new media widely assumes that communication functions as a sufficient conceptual paradigm for critically assessing new media politics. This article argues that communication-centric analyses fail to engage the politics of information itself, limiting information only to its consequences for communication, and neglecting information as it reaches into our selves, lives, and actions beyond the confines of communication. Furthering recent new media historiography on the “information theory” of Shannon and Wiener, the (...)
    3 citations
  20. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The (...)
    4 citations
  21. Information, learning and falsification.David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled (...)
    1 citation
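    For orientation, the first two approaches listed in this entry have compact standard definitions (general textbook forms, not specific to this preprint): algorithmic information relative to a universal machine U, and Shannon surprisal and entropy for an event drawn from an ensemble:

\[
K_U(s) = \min\{\, \lvert p \rvert : U(p) = s \,\}, \qquad
I(x) = -\log_2 \Pr(x), \qquad
H(X) = -\sum_x \Pr(x)\log_2 \Pr(x).
\]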
  22. Information Theory’s failure in neuroscience: on the limitations of cybernetics.Lance Nizami - 2014 - In Martin Gibbs (ed.), Proceedings of the IEEE 2014 Conference on Norbert Wiener in the 21st Century. IEEE.
    In Cybernetics (1961 Edition), Professor Norbert Wiener noted that “The role of information and the technique of measuring and transmitting information constitute a whole discipline for the engineer, for the neuroscientist, for the psychologist, and for the sociologist”. Sociology aside, the neuroscientists and the psychologists inferred “information transmitted” using the discrete summations from Shannon Information Theory. The present author has since scrutinized the psychologists’ approach in depth, and found it wrong. The neuroscientists’ approach is highly (...)
  23. Homunculus strides again: why ‘information transmitted’ in neuroscience tells us nothing.Lance Nizami - 2015 - Kybernetes 44:1358-1370.
    Purpose – For half a century, neuroscientists have used Shannon Information Theory to calculate “information transmitted,” a hypothetical measure of how well neurons “discriminate” amongst stimuli. Neuroscientists’ computations, however, fail to meet even the technical requirements for credibility. Ultimately, the reasons must be conceptual. That conclusion is confirmed here, with crucial implications for neuroscience. The paper aims to discuss these issues. Design/methodology/approach – Shannon Information Theory depends upon a physical model, Shannon’s “general communication system.” (...)
    2 citations
  24. Semantic Information Measure with Two Types of Probability for Falsification and Confirmation.Chenguang Lu - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use sampling distribution (conditional SP function) to test or confirm fuzzy truth function (conditional LP function). The Semantic Information Measure (SIM) proposed is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the less the LP of a predicate is and the larger the true value of the proposition is, the more (...) there is. So the SIM can be used as Popper's information criterion for falsification or test. The SIM also allows us to optimize the true-value of counterexamples or degrees of disbelief in a hypothesis to get the optimized degree of belief, i. e. Degree of Confirmation (DOC). To explain confirmation, this paper 1) provides the calculation method of the DOC of universal hypotheses; 2) discusses how to resolve Raven Paradox with new DOC and its increment; 3) derives the DOC of rapid HIV tests: DOC of “+” =1-(1-specificity)/sensitivity, which is similar to Likelihood Ratio (=sensitivity/(1-specificity)) but has the upper limit 1; 4) discusses negative DOC for excessive affirmations, wrong hypotheses, or lies; and 5) discusses the DOC of general hypotheses with GPS as example. (shrink)
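    The degree-of-confirmation formula quoted in this entry can be computed directly; the following sketch uses invented sensitivity and specificity values for illustration.

def doc_positive(sensitivity, specificity):
    """Degree of Confirmation of '+', as quoted above: 1 - (1 - specificity) / sensitivity.
    Unlike the likelihood ratio, it is bounded above by 1."""
    return 1 - (1 - specificity) / sensitivity

def likelihood_ratio_positive(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity), for comparison; unbounded."""
    return sensitivity / (1 - specificity)

# Invented example values for a rapid test:
sens, spec = 0.995, 0.998
print(doc_positive(sens, spec))               # ~0.998
print(likelihood_ratio_positive(sens, spec))  # ~497.5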
  25. A quantitative-informational approach to logical consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define (...)
  26. Interpretation of absolute judgments using information theory: channel capacity or memory capacity?Lance Nizami - 2010 - Cybernetics and Human Knowing 17:111-155.
    Shannon’s information theory has been a popular component of first-order cybernetics. It quantifies information transmitted in terms of the number of times a sent symbol is received as itself, or as another possible symbol. Sent symbols were events and received symbols were outcomes. Garner and Hake reinterpreted Shannon, describing events and outcomes as categories of a stimulus attribute, so as to quantify the information transmitted in the psychologist’s category (or absolute judgment) experiment. There, categories are (...)
    3 citations
  27. The Introduction of Information into Neurobiology.Justin Garson - 2003 - Philosophy of Science 70 (5):926-936.
    The first use of the term "information" to describe the content of nervous impulse occurs 20 years prior to Shannon's (1948) work, in Edgar Adrian's The Basis of Sensation (1928). Although, at least throughout the 1920s and early 30s, the term "information" does not appear in Adrian's scientific writings to describe the content of nervous impulse, the notion that the structure of nervous impulse constitutes a type of message subject to certain constraints plays an important role in (...)
    17 citations
  28. Objects and processes: two notions for understanding biological information.Agustín Mercado-Reyes, Pablo Padilla Longoria & Alfonso Arroyo-Santos - forthcoming - Journal of Theoretical Biology.
    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that (...)
    1 citation
  29. The Contribution of the Rejection Mechanism to Scientific Knowledge Production: A View from Granular Interaction Thinking and Information Theories.Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - Qeios Preprint.
    Rejection is an essential part of the scholarly publishing process, acting as a filter to distinguish between robust and less credible scientific works. This study examines the advantages and limitations of the rejection mechanism through the lens of Shannon’s information theory and the theory of granular interactions thinking. We argue that while rejection helps reduce entropy and increase the likelihood of disseminating useful knowledge, the process is not devoid of subjectivity. We propose two recommendations to improve the rejection (...)
  30. Is Dretske's Theory of Information Naturalistically Grounded? How emergent communication channels reference an abstracted ontic framework.Timothy M. Rogers - manuscript
    By bringing together Dretske’s theory of knowledge, Shannon’s theory of information, and the conceptual framework of statistical physics, this paper explores some of the meta-physical challenges posed by a naturalistic notion of semantical information. It is argued that Dretske’s theory cannot be said to be naturalistically grounded in the world described by classical physics and that Dretske information is not consistent with Shannon information. A possible route to reconciling Dretske’s insights with Shannon’s theory (...)
  31. An analysis of information visualisation.Min Chen & Luciano Floridi - 2013 - Synthese 190 (16):3421-3438.
    Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualization pipeline. We (...)
  32. From Art to Information System.Miro Brada - 2021 - AGI Laboratory.
    This insight into art came from chess composition, which concentrates art in a very dense form. Identifying and mathematically assessing uniqueness is the key, and it is applicable to other areas, e.g. computer programming. Maximizing uniqueness means minimizing entropy, which coincides with, and also goes beyond, Information Theory (Shannon, 1948). The reuse of logic as a universal principle to minimize entropy requires simplified architecture and abstraction. Any structures (e.g. plugins) that duplicate or divide functionality increase entropy and thus unreliability (...)
  33. Memory model of information transmitted in absolute judgment.Lance Nizami - 2011 - Kybernetes 40:80-109.
    Purpose – The purpose of this paper is to examine the popular “information transmitted” interpretation of absolute judgments, and to provide an alternative interpretation if one is needed. Design/methodology/approach – The psychologists Garner and Hake and their successors used Shannon’s Information Theory to quantify information transmitted in absolute judgments of sensory stimuli. Here, information theory is briefly reviewed, followed by a description of the absolute judgment experiment, and its information theory analysis. Empirical channel capacities (...)
    3 citations
  34. Erroneous concepts of prominent scientists: C. F. Weizsäcker, J. A. Wheeler, S. Wolfram, S. Lloyd, J. Schmidhuber, and M. Vopson, resulting from misunderstanding of information and complexity.Mariusz Stanowski - 2024 - Journal of Information Science 1:9.
    The common use of Shannon's information, which was specified for the needs of telecommunications, gives rise to many misunderstandings outside of this context (e.g. in the conceptions of such well-known theorists as C. F. Weizsäcker and J. A. Wheeler). This article shows that structural information meets the terms of the general definition of information, and that Shannon's information is a special case of it. Similarly, complexity is misunderstood today, as exemplified by the concepts of reputable computer scientists, such (...)
  35. Preface to a Philosophy of Legal Information.Kevin Lee - 2018 - SMU Science and Technology Law Review 20.
    This essay introduces the philosophy of legal information (PLI), which is a response to the radical changes brought about in philosophy by the information revolution. It reviews in some detail the work of Luciano Floridi, who is an influential advocate for an information turn in philosophy that he calls the philosophy of information (PI). Floridi proposes that philosophers investigate the conceptual nature of information as it currently exists across multiple disciplines. He shows how a focus (...)
  36. Turing's three philosophical lessons and the philosophy of information.Luciano Floridi - 2012 - Philosophical Transactions of the Royal Society A 370 (1971):3536-3542.
    In this article, I outline the three main philosophical lessons that we may learn from Turing’s work, and how they lead to a new philosophy of information. After a brief introduction, I discuss his work on the method of levels of abstraction (LoA), and his insistence that questions could be meaningfully asked only by specifying the correct LoA. I then look at his second lesson, about the sort of philosophical questions that seem to be most pressing today. Finally, I (...)
    6 citations
  37. "L'énergie en science et la théorie de l'information".Philippe Gagnon - 2022 - Connaître : Cahiers de l'Association Foi Et Culture Scientifique 58 (December):5-30.
    This is the outline: 1. Introduction 2. A first approach to information 3. Janus-faced information 4. What is information for? 5. The uses made of information theory 6. So-called "second-order" information 7. Should information be put in the service of a theological vision? 7.1 An implication drawn from our images of God.
  38. Historical and Conceptual Foundations of Information Physics.Javier Anta - 2021 - Dissertation, Universitat de Barcelona
    The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I will first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. In the following, (...)
  39. Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning. [REVIEW]Chenguang Lu - 2023 - Entropy 25 (5).
    A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise Contrast Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions. Coincidentally, EMI is essentially the same as Semantic Mutual Information (SeMI) proposed by the author 30 years ago. This paper first reviews the evolutionary histories of semantic information measures and learning functions. Then, it briefly introduces the author’s (...)
  40. Beyond Turing: Hypercomputation and Quantum Morphogenesis.Ignazio Licata - 2012 - Asia Pacific Mathematics Newsletter 2 (3):20-24.
    A Geometrical Approach to Quantum Information.
  41. Norwich’s Entropy Theory: how not to go from abstract to actual.Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” (...)
    4 citations
  42. The relevance of communication theory for theories of representation.Stephen Francis Mann - 2023 - Philosophy and the Mind Sciences 4.
    Prominent views about representation share a premise: that mathematical communication theory is blind to representational content. Here I challenge that premise by rejecting two common misconceptions: that Claude Shannon said that the meanings of signals are irrelevant for communication theory (he didn't and they aren't), and that since correlational measures can't distinguish representations from natural signs, communication theory can't distinguish them either (the premise is true but the conclusion is false; no valid argument can link them).
  43. Paradigm versus praxis: why psychology ‘absolute identification’ experiments do not reveal sensory processes.Lance Nizami - 2013 - Kybernetes 42:1447-1456.
    Purpose – A key cybernetics concept, information transmitted in a system, was quantified by Shannon. It quickly gained prominence, inspiring a version by Harvard psychologists Garner and Hake for “absolute identification” experiments. There, human subjects “categorize” sensory stimuli, affording “information transmitted” in perception. The Garner-Hake formulation has been in continuous use for 62 years, exerting enormous influence. But some experienced theorists and reviewers have criticized it as uninformative. They could not explain why, and were ignored. Here, the (...)
    3 citations
  44. Emergence and Computation at the Edge of Classical and Quantum Systems.Ignazio Licata - 2008 - In World Scientific (ed.), Physics of Emergence and Organization.
    The problem of emergence in physical theories makes it necessary to build a general theory of the relationships between the observed system and the observing system. It can be shown that there exists a correspondence between classical systems and computational dynamics according to the Shannon-Turing model. A classical system is an informationally closed system with respect to the observer; this characterizes the emergent processes in classical physics as phenomenological emergence. In quantum systems, the analysis based on the computation theory fails. (...)
    7 citations
  45. Ludwig Boltzmann and the Key to Connecting Crucial Physics and Social Science Theories.Minh-Hoang Nguyen - manuscript
    Since 2019, I have been accompanying the development of the Mindsponge Theory (MT). In 2023, the book Mindsponge Theory was officially published, marking a significant milestone. During this period, I completed my doctoral research and explored new methods to enhance work efficiency. The journey was filled with challenges. Surprisingly, the difficulties arose from concepts that seemed small and basic, almost taken for granted as "already known." I was entrusted by my co-author—the father of the mindsponge mechanism—with the task of further (...)
  46. Irreversibility and Complexity.Yair Lapin - manuscript
    Complexity is a relatively new field of study that is still heavily influenced by philosophy. However, with the advent of modern computing, it has become easier to conduct thorough investigations of complex systems using computational simulations. Despite significant progress, there remain certain characteristics of complex systems that are difficult to comprehend. To better understand these features, information can be applied using simple models of complex systems. The concepts of Shannon's information theory, Kolmogorov complexity, and logical depth are (...)
  47. Sensory Systems as Cybernetic Systems that Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception.Lance Nizami - 2009 - Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics. San Antonio, TX.
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] -present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation (...)
    3 citations
  48. On the Notions of Rulegenerating & Anticipatory Systems.Niels Ole Finnemann - 1997 - Online Publication on Conference Site - Which Does Not Exist Any More.
    Until the late 19th century scientists almost always assumed that the world could be described as a rule-based and hence deterministic system or as a set of such systems. The assumption is maintained in many 20th century theories although it has also been doubted because of the breakthrough of statistical theories in thermodynamics (Boltzmann and Gibbs) and other fields, unsolved questions in quantum mechanics as well as several theories forwarded within the social sciences. Until recently it has furthermore been assumed (...)
  49. Ludwig Boltzmann và đầu mối dẫn tới kết nối lý thuyết quan trọng.Minh-Hoang Nguyen - manuscript
    Having accompanied the Mindsponge Theory (MT) since 2019, I saw the book Mindsponge Theory officially published in 2023, marking a stage in which I both completed my doctoral research and explored and developed methods for working more effectively. The journey was filled with challenges. Strangely, the difficulties arose from concepts that seemed very small and very basic, even (...)
  50. Normativity at the edge of reason - review of Cecile Malaspina, An Epistemology of Noise. [REVIEW]Iain Campbell - 2021 - Radical Philosophy 9:93-96.
    In recent years noise seems to have become an interdisciplinary concept par excellence, apt to capturing important dynamics at work whether in technological, scientific, social, or aesthetic domains. But when economists, biologists, psychologists, and musicians speak of noise, are they really all referring to the same thing? In An Epistemology of Noise Cecile Malaspina takes this dispersion of the notion of noise as a starting point, and moreover accepts that, when removed from its mathematical formulation in information theory and (...)
Showing results 1–50 of 961