Results for 'information entropy'

1000+ found
  1. There’s Plenty of Boole at the Bottom: A Reversible CA Against Information Entropy. Francesco Berto, Jacopo Tagliabue & Gabriele Rossi - 2016 - Minds and Machines 26 (4):341-357.
    “There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct (...)
    1 citation
  2. Logical Entropy: Introduction to Classical and Quantum Logical Information Theory. David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The (...)
    4 citations
  3. Information and meaning (Entropy 2003). Christophe Menant - 2003 - Entropy 5:193-204.
    We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint. We build up definitions and properties for meaningful information, a meaning generator system and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
    17 citations
  4. How Entropy Explains the Emergence of Consciousness: The Entropic Theory. Peter C. Lugten - 2024 - Journal of Neurobehavioral Sciences 11 (1):10-18.
    Background: Emergentism as an ontology of consciousness leaves unanswered the question as to its mechanism. Aim: I aim to solve the Body-Mind problem by explaining how conscious organisms emerged on an evolutionary basis at various times in accordance with an accepted scientific principle, through a mechanism that cannot be understood, in principle. Proposal: The reason for this cloak of secrecy is found in a seeming contradiction in the behaviour of information with respect to the first two laws of thermodynamics. (...)
  5. Entropy - A Guide for the Perplexed. Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  6. An introduction to logical entropy and its relation to Shannon entropy. David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the (...)
    5 citations
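The definition in this abstract is directly computable: logical entropy is the number of distinctions of a partition (ordered pairs of elements in distinct blocks) divided by the total number of ordered pairs, which reduces to one minus the sum of squared block probabilities. A minimal sketch in Python (function names are mine, not from the paper):

```python
from math import log2

def logical_entropy(partition, universe_size):
    # h(pi) = |dit(pi)| / |U|^2: the normalized count of distinctions.
    # Equivalently: 1 - sum over blocks B of (|B|/|U|)^2.
    probs = [len(block) / universe_size for block in partition]
    return 1 - sum(p * p for p in probs)

def shannon_entropy(partition, universe_size):
    # Shannon entropy H(pi) of the same block probabilities, in bits.
    probs = [len(block) / universe_size for block in partition]
    return -sum(p * log2(p) for p in probs if p > 0)

# A partition of U = {0, 1, 2, 3} into two equal blocks: 8 of the
# 16 ordered pairs are distinctions, so h = 0.5, while H = 1 bit.
pi = [{0, 1}, {2, 3}]
print(logical_entropy(pi, 4))  # 0.5
print(shannon_entropy(pi, 4))  # 1.0
```

For the indiscrete partition (a single block) both measures are 0; for the discrete partition of n elements into singletons, h = 1 - 1/n while H = log2 n.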
  7. Degeneration and Entropy. Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  8. Norwich’s Entropy Theory: how not to go from abstract to actual. Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to (...)
    4 citations
  9. How bad is the postulation of a low entropy initial state of the universe? Aldo Filomeno - 2023 - Aphex 27:141-158.
    I summarize, in this informal interview, the main approaches to the ‘Past Hypothesis’, the postulation of a low-entropy initial state of the universe. I’ve chosen this as an open problem in the philosophical foundations of physics. I hope that this brief overview helps readers in gaining perspective and in appreciating the diverse range of approaches in this fascinating unresolved debate.
  10. Information ethics: on the philosophical foundation of computer ethics. Luciano Floridi - 1999 - Ethics and Information Technology 1 (1):33–52.
    The essential difficulty about Computer Ethics' (CE) philosophical status is a methodological problem: standard ethical theories cannot easily be adapted to deal with CE-problems, which appear to strain their conceptual resources, and CE requires a conceptual foundation as an ethical theory. Information Ethics (IE), the philosophical foundational counterpart of CE, can be seen as a particular case of environmental ethics or ethics of the infosphere. What is good for an information entity and the infosphere in general? This is (...)
    113 citations
  11. On Classical and Quantum Logical Entropy. David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  12. “A Thousand Words”: How Shannon Entropy perspective provides link among exponential data growth, average temperature of the Earth, declining Earth magnetic field, and global consciousness. Victor Christianto & Florentin Smarandache - manuscript
    The sunspot data seem to indicate that the Sun is likely to enter a Maunder Minimum, which would mean that low solar activity may cause low temperatures on Earth. If this happens, it will cause a phenomenon which some climatology experts call “The Little Ice Age”, lasting for the next 20-30 years and starting within the next few years. Therefore, the Earth's climate in the coming years will tend to be cooler than before. This phenomenon then causes us to (...)
  13. Information-Matter Bipolarity of the Human Organism and Its Fundamental Circuits: From Philosophy to Physics/Neurosciences-Based Modeling. Florin Gaiseanu - 2020 - Philosophy Study 10 (2):107-118.
    Starting from a philosophical perspective, which states that living structures are actually a combination of matter and information, this article presents the results of an analysis of the bipolar information-matter structure of the human organism, distinguishing three fundamental circuits for its survival, which demonstrates and supports this statement, as a basis for further development of the informational model of consciousness into a general informational model of the human organism. For this purpose, it examined the Informational System of (...)
    5 citations
  14. Information ethics: on the philosophical foundation of computer ethics. Luciano Floridi - 2007 - In John Weckert (ed.), Computer Ethics. Routledge. pp. 63–82.
    The essential difficulty about Computer Ethics’ (CE) philosophical status is a methodological problem: standard ethical theories cannot easily be adapted to deal with CE-problems, which appear to strain their conceptual resources, and CE requires a conceptual foundation as an ethical theory. Information Ethics (IE), the philosophical foundational counterpart of CE, can be seen as a particular case of ‘environmental’ ethics or ethics of the infosphere. What is good for an information entity and the infosphere in general? This is (...)
    34 citations
  15. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as (...)
  16. The temporal foundation of the principle of maximal entropy. Vasil Penchev - 2020 - Logic and Philosophy of Mathematics eJournal 12 (11):1-3.
    The principle of maximal entropy (hereafter abbreviated as “MaxEnt”) can be founded on the formal mechanism in which the future transforms into the past by the mediation of the present. This allows MaxEnt to be investigated by the theory of quantum information. MaxEnt can be considered as an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice, and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is (...)
  17. Qualitative probabilistic inference under varied entropy levels. Paul D. Thorn & Gerhard Schurz - 2016 - Journal of Applied Logic 19 (2):87-101.
    In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to (...)
    4 citations
  18. Does Information Have a Moral Worth in Itself? Luciano Floridi - 1998 - In CEPE 1998, Computer Ethics: Philosophical Enquiry. London:
    The paper provides an axiological analysis of the concepts of respect for information and of information dignity from the vantage point provided by Information Ethics and the conceptual paradigm of object-oriented analysis (OOA). The general perspective adopted is that of an ontocentric approach to the philosophy of information ethics, according to which the latter is an expansion of environmental ethics towards a less biologically biased concept of a ‘centre of ethical worth’. The paper attempts to answer (...)
    7 citations
  19. The cognitive agent: Overcoming informational limits. Orlin Vakarelov - 2011 - Adaptive Behavior 19 (2):83-100.
    This article provides an answer to the question: What is the function of cognition? By answering this question it becomes possible to investigate what are the simplest cognitive systems. It addresses the question by treating cognition as a solution to a design problem. It defines a nested sequence of design problems: (1) How can a system persist? (2) How can a system affect its environment to improve its persistence? (3) How can a system utilize better information from the environment (...)
    6 citations
  20. Information and meaning in life, humans and robots (FIS 2005). Christophe Menant - manuscript
    Information and meaning exist around us and within ourselves, and the same information can correspond to different meanings. This is true for humans and animals, and is becoming true for robots. We propose here an overview of this subject by using a systemic tool related to meaning generation that has already been published (C. Menant, Entropy 2003). The Meaning Generator System (MGS) is a system submitted to a constraint that generates meaningful information when it receives (...)
    2 citations
  21. Sensory Systems as Cybernetic Systems that Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception. Lance Nizami - 2009 - Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics. San Antonio, TX.
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] -present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation (...)
    3 citations
  22. Information of the chassis and information of the program in synthetic cells. Antoine Danchin - 2009 - Systems and Synthetic Biology 3:125-134.
    Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristics explored here is that an authentic category of reality, information, must be coupled with the standard categories, matter, energy, space and time to account for what life is. The use of this still elusive category permits (...)
    4 citations
  23. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness. Julio Michael Stern - 2011 - Information 2 (2):277-301.
    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
    2 citations
  24. There Is No Puzzle about the Low Entropy Past. Craig Callender - 2004 - In Christopher Hitchcock (ed.), Contemporary Debates in Philosophy of Science. Blackwell. pp. 240-255.
    Suppose that God or a demon informs you of the following future fact: despite recent cosmological evidence, the universe is indeed closed and it will have a ‘final’ instant of time; moreover, at that final moment, all 49 of the world’s Imperial Faberge eggs will be in your bedroom bureau’s sock drawer. You’re absolutely certain that this information is true. All of your other dealings with supernatural powers have demonstrated that they are a trustworthy lot.
    37 citations
  25. Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5. Florentin Smarandache - 2023 - Edited by Florentin Smarandache, Jean Dezert & Albena Tchamova.
    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The collected contributions of this volume have either been published or presented in international conferences, seminars, workshops and journals after the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of this volume are chronologically ordered. The first part of this book presents (...)
  26. The Informational Foundation of the Human Act. Fernando Flores Morador & Luis de Marcos Ortega (eds.) - 2018 - Alcalá. Madrid: Servicio de Publicaciones Universidad de Alcalá.
    This book is the result of a collective research effort performed over many years in both Sweden and Spain. It is the result of attempting to develop a new field of research that we could call «human act informatics». The goal has been to apply information technologies to the study of the human act in general, including embodied and disembodied acts. The book presents a theory of the quantification of the informational value of human acts as (...)
  27. Information, learning and falsification. David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out (...)
    1 citation
  28. Complexity and information. Panu Raatikainen - 1998 - In Complexity, Information and Incompleteness (doctoral dissertation). Reports from the Department of Philosophy, University of Helsinki, 2/1998.
    "Complexity" is a catchword of certain extremely popular and rapidly developing interdisciplinary new sciences, often called accordingly the sciences of complexity. It is often closely associated with another notably popular but ambiguous word, "information"; information, in turn, may be justly called the central new concept in the whole 20th century science. Moreover, the notion of information is regularly coupled with a key concept of thermodynamics, viz. entropy. And like this was not enough it is quite usual (...)
  29. Is uncertainty reduction the basis for perception? Errors in Norwich’s Entropy Theory of Perception imply otherwise. Lance Nizami - 2010 - Proceedings of the World Congress on Engineering and Computer Science 2010 (Lecture Notes in Engineering and Computer Science) 2.
    This paper reveals errors within Norwich et al.’s Entropy Theory of Perception, errors that have broad implications for our understanding of perception. What Norwich and coauthors dubbed their “informational theory of neural coding” is based on cybernetics, that is, control and communication in man and machine. The Entropy Theory uses information theory to interpret human performance in absolute judgments. There, the continuum of the intensity of a sensory stimulus is cut into categories and the subject is shown (...)
  30. From Art to Information System. Miro Brada - 2021 - AGI Laboratory.
    This insight into art came from chess composition, which concentrates art in a very dense form. Identifying and mathematically assessing uniqueness is the key, applicable to other areas, e.g. computer programming. Maximization of uniqueness is minimization of entropy, which both coincides with and goes beyond Information Theory (Shannon, 1948). The reuse of logic as a universal principle to minimize entropy requires simplified architecture and abstraction. Any structures (e.g. plugins) duplicating or dividing functionality increase entropy and (...)
  31. Black Hole Paradoxes: A Unified Framework for Information Loss. Saakshi Dulani - 2024 - Dissertation, University of Geneva
    The black hole information loss paradox is a catch-all term for a family of puzzles related to black hole evaporation. For almost 50 years, the quest to elucidate the implications of black hole evaporation has not only sustained momentum, but has also become increasingly populated with proposals that seem to generate more questions than they purport to answer. Scholars often neglect to acknowledge ongoing discussions within black hole thermodynamics and statistical mechanics when analyzing the paradox, including the interpretation of (...)
  32. On the intrinsic value of information objects and the infosphere. Luciano Floridi - 2002 - Ethics and Information Technology 4 (4):287–304.
    What is the most general common set of attributes that characterises something as intrinsically valuable and hence as subject to some moral respect, and without which something would rightly be considered intrinsically worthless or even positively unworthy and therefore rightly to be disrespected in itself? This paper develops and supports the thesis that the minimal condition of possibility of an entity's least intrinsic value is to be identified with its ontological status as an information object. All entities, even when (...)
    69 citations
  33. Visual features as carriers of abstract quantitative information. Ronald A. Rensink - 2022 - Journal of Experimental Psychology: General 151 (8):1793-1820.
    Four experiments investigated the extent to which abstract quantitative information can be conveyed by basic visual features. This was done by asking observers to estimate and discriminate Pearson correlation in graphical representations where the first data dimension of each element was encoded by its horizontal position, and the second by the value of one of its visual features; perceiving correlation then requires combining the information in the two encodings via a common abstract representation. Four visual features were examined: (...)
  34. Discussion on the Relationship between Computation, Information, Cognition, and Their Embodiment. Gordana Dodig-Crnkovic & Marcin Miłkowski - 2023 - Entropy 25 (2):310.
    Three special issues of Entropy journal have been dedicated to the topics of “Information Processing and Embodied, Embedded, Enactive Cognition”. They addressed morphological computing, cognitive agency, and the evolution of cognition. The contributions show the diversity of views present in the research community on the topic of computation and its relation to cognition. This paper is an attempt to elucidate current debates on computation that are central to cognitive science. It is written in the form of a dialog between two (...)
  35. From data to semantic information. Luciano Floridi - 2003 - Entropy 5:125–145.
    There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do (...)
    3 citations
  36. A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics. David Ellerman - 2024 - Entropy Special Issue: Information-Theoretic Concepts in Physics 26 (2).
    The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) is showing that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. (...)
  37. Counting distinctions: on the conceptual foundations of Shannon’s information theory. David Ellerman - 2009 - Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U ] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite (...)
    18 citations
  38. The Fundamental Tension in Integrated Information Theory 4.0’s Realist Idealism. Ignacio Cea - 2023 - Entropy 25 (10).
    Integrated Information Theory (IIT) is currently one of the most influential scientific theories of consciousness. Here, we focus specifically on a metaphysical aspect of the theory’s most recent version (IIT 4.0), what we may call its idealistic ontology, and its tension with a kind of realism about the external world that IIT also endorses. IIT 4.0 openly rejects the mainstream view that consciousness is generated by the brain, positing instead that consciousness is ontologically primary while the physical domain is (...)
  39. A Generalization of Shannon's Information Theory. Chenguang Lu - 1999 - Int. J. of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the (...)
    3 citations
  40. Optimization Models for Reaction Networks: Information Divergence, Quadratic Programming and Kirchhoff’s Laws. Julio Michael Stern - 2014 - Axioms 109:109-118.
    This article presents a simple derivation of optimization models for reaction networks leading to a generalized form of the mass-action law, and compares the formal structure of Minimum Information Divergence, Quadratic Programming and Kirchhoff type network models. These optimization models are used in related articles to develop and illustrate the operation of ontology alignment algorithms and to discuss closely connected issues concerning the epistemological and statistical significance of sharp or precise hypotheses in empirical science.
    5 citations
  41. Destiny or Free Will Decision? A Life Overview from the Perspective of an Informational Modeling of Consciousness Part I: Information, Consciousness and Life Cycle. Florin Gaiseanu - 2019 - Gerontology and Geriatrics Studies 4 (3):1-6.
    We drive our lives permanently by YES/NO decisions, and even if we most often no longer distinguish the elementary intermediary steps of such decisions, they form stereotyped chains that, once triggered, run unconsciously, facilitating our daily activities. We actually lead our lives by conscious decisions, each of them establishing our future trajectory. The YES/NO dipole is actually the elemental evaluation and decisional unit in informational transmission/reception equipment and lines, and in computers, respectively. Based on a binary probabilistic system, (...)
    13 citations
  42. A Complexity Basis for Phenomenology: How information states at criticality offer a new approach to understanding experience of self, being and time. Alex Hankey - 2015 - Progress in Biophysics and Molecular Biology 119:288–302.
    In the late 19th century Husserl studied our internal sense of time passing, maintaining that its deep connections into experience represent prima facie evidence for it as the basis for all investigations in the sciences: Phenomenology was born. Merleau-Ponty focused on perception, pointing out that any theory of experience must be in accord with established aspects of biology, i.e. embodied. Recent analyses suggest that theories of experience require non-reductive, integrative information, together with a specific property connecting them to experience. Here (...)
    6 citations
  43. Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning. [REVIEW] Chenguang Lu - 2023 - Entropy 25 (5).
    A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise Contrast Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions. Coincidentally, EMI is essentially the same as Semantic Mutual Information (SeMI) proposed by the author 30 years ago. This paper first reviews the evolutionary histories of semantic information measures and learning functions. Then, it briefly introduces the author’s (...)
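MINE and InfoNCE, mentioned in the abstract above, are neural estimators of mutual information; the exact Shannon quantity they approximate can be computed directly when the joint distribution is small and known. A sketch under that assumption (the function name is my own, not from the paper):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, from a joint pmf given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():  # marginalize over each variable
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated bits share exactly 1 bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```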
  44. Historical and Conceptual Foundations of Information Physics.Anta Javier - 2021 - Dissertation, Universitat de Barcelona
    The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I will first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) different entropy, probability and information notions, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. In the (...)
  45. The nature of correlation perception in scatterplots.Ronald A. Rensink - 2017 - Psychonomic Bulletin & Review 24 (3):776-797.
    For scatterplots with gaussian distributions of dots, the perception of Pearson correlation r can be described by two simple laws: a linear one for discrimination, and a logarithmic one for perceived magnitude (Rensink & Baldridge, 2010). The underlying perceptual mechanisms, however, remain poorly understood. To cast light on these, four different distributions of datapoints were examined. The first had 100 points with equal variance in both dimensions. Consistent with earlier results, just noticeable difference (JND) was a linear function of the (...)
    Bookmark   1 citation 
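The abstract above concerns how accurately viewers perceive Pearson correlation r in gaussian scatterplots. As a point of reference, here is one standard way to generate such a sample with a target r and compute r itself; the generator and names are my own illustration, not the paper's stimuli:

```python
import random
import statistics

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def correlated_gauss(n, r, seed=0):
    """n gaussian points whose population correlation is r (standard mixing trick)."""
    rng = random.Random(seed)
    pts = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    return [(x, r * x + (1 - r ** 2) ** 0.5 * y) for x, y in pts]

pts = correlated_gauss(100, 0.8)
xs, ys = zip(*pts)
print(round(pearson_r(xs, ys), 2))  # close to 0.8, up to sampling noise
```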
  46. 47 Vietnamese people are among the world's most influential scientists in 2023.Authority of Foreign Information Service - 2023 - Vietnam Information Service.
    Vietnamese scientists on this year's list of the "100,000 influential scientists" rose sharply in both number and rank. The ranking was compiled by a group of scientists led by Professor John P. A. Ioannidis and colleagues from Stanford University (USA), based on the Scopus database, and published by Elsevier.
  47. Artificial Evil and the Foundation of Computer Ethics.Luciano Floridi & J. W. Sanders - 2001 - Springer Netherlands. Edited by Luciano Floridi & J. W. Sanders.
    Moral reasoning traditionally distinguishes two types of evil:moral (ME) and natural (NE). The standard view is that ME is the product of human agency and so includes phenomena such as war,torture and psychological cruelty; that NE is the product of nonhuman agency, and so includes natural disasters such as earthquakes, floods, disease and famine; and finally, that more complex cases are appropriately analysed as a combination of ME and NE. Recently, as a result of developments in autonomous agents in cyberspace, (...)
    Bookmark   29 citations 
  48. Falsification and future performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy (...)
    Bookmark   3 citations 
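The paper reads capacity measures as counts of hypotheses that an empirical-risk-minimizing learner falsifies. A toy illustration of that counting idea, using 1-D threshold classifiers (my own simplification, not the authors' construction): every hypothesis whose empirical risk exceeds the minimum achieved is "falsified" by the data.

```python
def threshold_hypotheses(xs):
    """All behaviorally distinct thresholds for h_t(x) = 1 iff x >= t."""
    cuts = sorted(set(xs))
    return [min(cuts) - 1] + cuts  # one threshold below all points, then one per point

def empirical_risk(t, data):
    """Fraction of labeled points (x, y) the threshold classifier gets wrong."""
    return sum(1 for x, y in data if (x >= t) != y) / len(data)

data = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
risks = {t: empirical_risk(t, data) for t in threshold_hypotheses([x for x, _ in data])}
best = min(risks.values())
falsified = sum(1 for r in risks.values() if r > best)
print(best, falsified)  # ERM finds a zero-risk threshold; all 4 rivals are falsified
```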
  49. SAR-BSO meta-heuristic hybridization for feature selection and classification using DBN over stream data.Dharani Talapula, Kiran Ravulakollu, Manoj Kumar & Adarsh Kumar - forthcoming - Artificial Intelligence Review.
    Advancements in cloud technologies have increased the infrastructural needs of data centers due to storage needs and processing of extensive dimensional data. Many service providers envisage anomaly detection criteria to guarantee availability and avoid the breakdowns and complexities caused by large-scale operations. The streaming log data generated is associated with multi-dimensional complexity and thus poses a considerable challenge to detect the anomalies or unusual occurrences in the data. In this research, a hybrid model is proposed that is motivated by deep (...)
  50. From Quantum Entanglement to Spatiotemporal Distance.Alyssa Ney - 2021 - In Christian Wüthrich, Baptiste Le Bihan & Nick Huggett (eds.), Philosophy Beyond Spacetime. Oxford: Oxford University Press.
    Within the field of quantum gravity, there is an influential research program developing the connection between quantum entanglement and spatiotemporal distance. Quantum information theory gives us highly refined tools for quantifying quantum entanglement such as the entanglement entropy. Through a series of well-confirmed results, it has been shown how these facts about the entanglement entropy of component systems may be connected to facts about spatiotemporal distance. Physicists are seeing these results as yielding promising methods for better understanding (...)
    Bookmark   3 citations 
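For a bipartite pure state written in Schmidt form, the entanglement entropy mentioned in the abstract above is simply the Shannon entropy of the squared Schmidt coefficients. A minimal sketch (it assumes the state is already given in Schmidt form; the function name is my own):

```python
import math

def entanglement_entropy(amps):
    """Von Neumann entropy (in bits) of one subsystem of the Schmidt-form state
    sum_i amps[i] |i>|i>; the squared coefficients are the reduced state's spectrum."""
    probs = [abs(a) ** 2 for a in amps]
    assert abs(sum(probs) - 1.0) < 1e-9, "state must be normalized"
    return -sum(p * math.log2(p) for p in probs if p > 0)

bell = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # (|00> + |11>)/sqrt(2)
print(entanglement_entropy(bell))  # 1 bit: each qubit of a Bell pair is maximally mixed
```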
1 — 50 / 1000