Results for 'Entropy increase'

997 found
  1. How Entropy Explains the Emergence of Consciousness: The Entropic Theory.Peter C. Lugten - 2024 - Journal of Neurobehavioral Sciences 11 (1):10-18.
    Background: Emergentism as an ontology of consciousness leaves unanswered the question as to its mechanism. Aim: I aim to solve the Body-Mind problem by explaining how conscious organisms emerged on an evolutionary basis at various times in accordance with an accepted scientific principle, through a mechanism that cannot be understood, in principle. Proposal: The reason for this cloak of secrecy is found in a seeming contradiction in the behaviour of information with respect to the first two laws of thermodynamics. Information, (...)
  2. Logical Entropy: Introduction to Classical and Quantum Logical Information Theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
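    A minimal illustration of the contrast drawn in this abstract, assuming the standard definitions from Ellerman's logical information theory: the logical entropy of a partition is h(π) = 1 − Σᵢ pᵢ² (the probability that two independent draws land in different blocks, i.e. form a distinction), while the Shannon entropy is H(π) = −Σᵢ pᵢ log₂ pᵢ. The Python sketch below is my own; the function names and the toy partition are not from the paper:

      from math import log2

      def block_probabilities(partition):
          """partition: list of blocks (lists of elements); returns p_i = |B_i| / |U|."""
          total = sum(len(block) for block in partition)
          return [len(block) / total for block in partition]

      def logical_entropy(partition):
          # probability that two independent draws are distinguished by the partition
          p = block_probabilities(partition)
          return 1.0 - sum(pi ** 2 for pi in p)

      def shannon_entropy(partition):
          p = block_probabilities(partition)
          return -sum(pi * log2(pi) for pi in p if pi > 0)

      # Example: partition {a, b, c, d} into {a, b} | {c} | {d}
      pi = [["a", "b"], ["c"], ["d"]]
      print(logical_entropy(pi))   # 1 - (0.5**2 + 0.25**2 + 0.25**2) = 0.625
      print(shannon_entropy(pi))   # 1.5 bits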
  3. Entropy and the Direction of Time.Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the (...)
    1 citation
  4. Randomness Increases Order in Biological Evolution.Giuseppe Longo & Maël Montévil - 2012 - In M. Dinneen, B. Khoussainov & A. Nies (eds.), Computation, Physics and Beyond. Berlin Heidelberg: pp. 289-308.
    In this text, we revisit part of the analysis of anti-entropy in Bailly and Longo (2009) and develop further theoretical reflections. In particular, we analyze how randomness, an essential component of biological variability, is associated with the growth of biological organization, both in ontogenesis and in evolution. This approach, in particular, focuses on the role of global entropy production and provides a tool for a mathematical understanding of some fundamental observations by Gould on the increasing phenotypic complexity along (...)
  5. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  6. Why did life emerge?Arto Annila & Erkki Annila - 2008 - International Journal of Astrobiology 7 (3-4):293–300.
    Many mechanisms, functions and structures of life have been unraveled. However, the fundamental driving force that propelled chemical evolution and led to life has remained obscure. The second law of thermodynamics, written as an equation of motion, reveals that elemental abiotic matter evolves from the equilibrium via chemical reactions that couple to external energy towards complex biotic non-equilibrium systems. Each time a new mechanism of energy transduction emerges, e.g., by random variation in syntheses, evolution prompts by punctuation and settles to (...)
    5 citations
  7. The Statistical Nature of Causation.David Papineau - 2022 - The Monist 105 (2):247-275.
    Causation is a macroscopic phenomenon. The temporal asymmetry displayed by causation must somehow emerge along with other asymmetric macroscopic phenomena like entropy increase and the arrow of radiation. I shall approach this issue by considering ‘causal inference’ techniques that allow causal relations to be inferred from sets of observed correlations. I shall show that these techniques are best explained by a reduction of causation to structures of equations with probabilistically independent exogenous terms. This exogenous probabilistic independence imposes a (...)
    3 citations
  8. The Universal Arrow of Time.Oleg Kupervasser, Hrvoje Nikolić & Vinko Zlatić - 2012 - Foundations of Physics 42 (9):1165-1185.
    Statistical physics cannot explain why a thermodynamic arrow of time exists, unless one postulates very special and unnatural initial conditions. Yet, we argue that statistical physics can explain why the thermodynamic arrow of time is universal, i.e., why the arrow points in the same direction everywhere. Namely, if two subsystems have opposite arrow-directions at a particular time, the interaction between them makes the configuration statistically unstable and causes a decay towards a system with a universal direction of the arrow of (...)
    2 citations
  9. The physics of implementing logic: Landauer's principle and the multiple-computations theorem.Meir Hemmo & Orly Shenker - 2019 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 68:90-105.
    This paper makes a novel linkage between the multiple-computations theorem in philosophy of mind and Landauer’s principle in physics. The multiple-computations theorem implies that certain physical systems implement simultaneously more than one computation. Landauer’s principle implies that the physical implementation of “logically irreversible” functions is accompanied by minimal entropy increase. We show that the multiple-computations theorem is incompatible with, or at least challenges, the universal validity of Landauer’s principle. To this end we provide accounts of both ideas in (...)
    7 citations
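    For orientation, the minimal entropy increase referenced here is Landauer's bound: erasing one bit of information raises thermodynamic entropy by at least k_B ln 2, equivalently dissipating at least k_B T ln 2 of heat at temperature T. A back-of-the-envelope Python sketch (my own, not from the paper):

      from math import log

      K_B = 1.380649e-23  # Boltzmann constant, J/K

      def landauer_min_entropy(bits):
          """Minimum entropy increase (J/K) for erasing `bits` logically irreversible bits."""
          return bits * K_B * log(2)

      def landauer_min_heat(bits, temperature_kelvin=300.0):
          """Minimum heat (J) dissipated when erasing `bits` bits at temperature T."""
          return bits * K_B * temperature_kelvin * log(2)

      # Erasing one gigabyte (8e9 bits) at room temperature:
      print(landauer_min_entropy(8e9))  # ~7.7e-14 J/K
      print(landauer_min_heat(8e9))     # ~2.3e-11 J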
  10. The Second Law of Thermodynamics and the Psychological Arrow of Time.Meir Hemmo & Orly Shenker - 2022 - British Journal for the Philosophy of Science 73 (1):85-107.
    Can the second law of thermodynamics explain our mental experience of the direction of time? According to an influential approach, the past hypothesis of universal low entropy also explains how the psychological arrow comes about. We argue that although this approach has many attractive features, it cannot explain the psychological arrow after all. In particular, we show that the past hypothesis is neither necessary nor sufficient to explain the psychological arrow on the basis of current physics. We propose two (...)
    2 citations
  11. Why gravity is not an entropic force.Shan Gao - 2010
    The remarkable connections between gravity and thermodynamics seem to imply that gravity is not fundamental but emergent, and in particular, as Verlinde suggested, gravity is probably an entropic force. In this paper, we will argue that the idea of gravity as an entropic force is debatable. It is shown that there is no convincing analogy between gravity and entropic force in Verlinde’s example. Neither holographic screen nor test particle satisfies all requirements for the existence of entropic force in a thermodynamics (...)
  12. Are we Living in a (Quantum) Simulation? – Constraints, observations, and experiments on the simulation hypothesis.Anders Indset, Florian Neukart, Markus Pflitsch & Michael R. Perelshtein - manuscript
    The God Experiment – Let there be Light. The question “What is real?” can be traced back to the shadows in Plato’s cave. Two thousand years later, René Descartes lacked knowledge about arguing against an evil deceiver feeding us the illusion of sensation. Descartes’ epistemological concept later led to various theories of what our sensory experiences actually are. The concept of “illusionism”, proposing that even the very conscious experience we have – our qualia – is an illusion, is not (...)
  13. In the light of time.Arto Annila - 2009 - Proceedings of Royal Society A 465:1173–1198.
    The concept of time is examined using the second law of thermodynamics that was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, (...)
    3 citations
  14. Towards an AB-series interpretation of time in physics.Paul Merriam - manuscript
    How can McTaggart's A-series notion of time be incorporated into physics while retaining the B-series notion? It may be that the A-series 'now' can be construed as ontologically private. How is that modeled? Could a definition of a combined AB-series entropy help with the Past Hypothesis problem? What if the increase in entropy as a system goes from earlier times to later times is canceled by the decrease in entropy as a system goes from future to present, (...)
  15. Time is not Entropic in Origin.Paul Merriam - manuscript
    Many researchers suppose that the forward direction of time is a consequence of increasing entropy. But the human brain, including the human brain with memories, and life in general, are regarded as pockets of decreasing entropy, so as to accommodate the continual addition of memories and abilities. But then humans and life in general should see time going ‘backward’ in some sense. But we do not. Therefore, time is not entropic in origin. The purpose of this note (...)
  16. Black Hole Paradoxes: A Unified Framework for Information Loss.Saakshi Dulani - 2024 - Dissertation, University of Geneva
    The black hole information loss paradox is a catch-all term for a family of puzzles related to black hole evaporation. For almost 50 years, the quest to elucidate the implications of black hole evaporation has not only sustained momentum, but has also become increasingly populated with proposals that seem to generate more questions than they purport to answer. Scholars often neglect to acknowledge ongoing discussions within black hole thermodynamics and statistical mechanics when analyzing the paradox, including the interpretation of Bekenstein-Hawking (...)
  17. Cosmological Black Holes and the Direction of Time.Gustavo E. Romero, Federico G. López Armengol & Daniela Pérez - 2018 - Foundations of Science 23 (2):415-426.
    Macroscopic irreversible processes emerge from fundamental physical laws of reversible character. The source of the local irreversibility seems to be not in the laws themselves but in the initial and boundary conditions of the equations that represent the laws. In this work we propose that the screening of currents by black hole event horizons determines, locally, a preferred direction for the flux of electromagnetic energy. We study the growth of black hole event horizons due to the cosmological expansion and accretion (...)
  18. Time's Arrow in a Quantum Universe: On the Status of Statistical Mechanical Probabilities.Eddy Keming Chen - 2020 - In Valia Allori (ed.), Statistical Mechanics and Scientific Explanation: Determinism, Indeterminism and Laws of Nature. World Scientific. pp. 479–515.
    In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a “typical” choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, (...)
    12 citations
  19. Where there is life there is mind: In support of a strong life-mind continuity thesis.Michael David Kirchhoff & Tom Froese - 2017 - Entropy 19.
    This paper considers questions about continuity and discontinuity between life and mind. It begins by examining such questions from the perspective of the free energy principle (FEP). The FEP is becoming increasingly influential in neuroscience and cognitive science. It says that organisms act to maintain themselves in their expected biological and cognitive states, and that they can do so only by minimizing their free energy given that the long-term average of free energy is entropy. The paper then argues that (...)
    35 citations
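    The free energy principle mentioned here is usually stated in terms of the variational free energy F = E_q[ln q(s) − ln p(o, s)], which upper-bounds the surprise −ln p(o) and equals it when q is the exact posterior. A toy discrete-state Python sketch (my own illustration; the two-state generative model is invented for the example):

      import numpy as np

      def free_energy(q, p_joint_given_o):
          """F = E_q[ln q(s) - ln p(o, s)] for an approximate posterior q over hidden states."""
          q = np.asarray(q, dtype=float)
          p = np.asarray(p_joint_given_o, dtype=float)
          return float(np.sum(q * (np.log(q) - np.log(p))))

      # Toy generative model: 2 hidden states, p(s) = [0.5, 0.5], p(o=1 | s) = [0.9, 0.2].
      prior = np.array([0.5, 0.5])
      likelihood_o1 = np.array([0.9, 0.2])
      p_joint = prior * likelihood_o1          # p(o=1, s)
      surprise = -np.log(p_joint.sum())        # -ln p(o=1)

      exact_posterior = p_joint / p_joint.sum()
      print(free_energy(exact_posterior, p_joint))  # equals the surprise at the optimum
      print(free_energy([0.5, 0.5], p_joint))       # strictly larger for a suboptimal q
      print(surprise)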
  20. From Art to Information System.Miro Brada - 2021 - AGI Laboratory.
    This insight into art came from chess composition, which concentrates art in a very dense form. Identifying and mathematically assessing uniqueness is the key, and it is applicable to other areas, e.g. computer programming. Maximization of uniqueness is minimization of entropy, which coincides with, as well as goes beyond, Information Theory (Shannon, 1948). The reuse of logic as a universal principle to minimize entropy requires simplified architecture and abstraction. Any structures (e.g. plugins) duplicating or dividing functionality increase entropy and (...)
  21. The Pharmacological Significance of Mechanical Intelligence and Artificial Stupidity.Adrian Mróz - 2019 - Kultura I Historia 36 (2):17-40.
    By drawing on the philosophy of Bernard Stiegler, the phenomenon of mechanical (a.k.a. artificial, digital, or electronic) intelligence is explored in terms of its real significance as an ever-repeating threat of the reemergence of stupidity (as cowardice), which can be transformed into knowledge (pharmacological analysis of poisons and remedies) by practices of care, through the outlook of what researchers describe equivocally as “artificial stupidity”, which has been identified as a new direction in the future of computer science and machine problem (...)
  22. The cognitive agent: Overcoming informational limits.Orlin Vakarelov - 2011 - Adaptive Behavior 19 (2):83-100.
    This article provides an answer to the question: What is the function of cognition? By answering this question it becomes possible to investigate what are the simplest cognitive systems. It addresses the question by treating cognition as a solution to a design problem. It defines a nested sequence of design problems: (1) How can a system persist? (2) How can a system affect its environment to improve its persistence? (3) How can a system utilize better information from the environment to (...)
    6 citations
  23. Time Arrows and Determinism in Biology.Bartolomé Sabater - 2009 - Biological Theory 4 (2):174-182.
    I propose that, in addition to the commonly recognized increase of entropy, two more time arrows influence living beings. The increase of damage reactions, which produce aging and genetic variation, and the decrease of the rate of entropy production involved in natural selection are neglected arrows of time. Although based on the statistical theory of the arrow of time, they are distinguishable from the general arrow of the increase of entropy. Physiology under healthy conditions (...)
  24. SAR-BSO meta-heuristic hybridization for feature selection and classification using DBN over stream data.Dharani Talapula, Kiran Ravulakollu, Manoj Kumar & Adarsh Kumar - forthcoming - Artificial Intelligence Review.
    Advancements in cloud technologies have increased the infrastructural needs of data centers due to storage needs and the processing of extensive dimensional data. Many service providers envisage anomaly detection criteria to guarantee availability and to avoid breakdowns and complexities caused by large-scale operations. The streaming log data generated is associated with multi-dimensional complexity and thus poses a considerable challenge to detect the anomalies or unusual occurrences in the data. In this research, a hybrid model is proposed that is motivated by deep (...)
  25. Arithmetic logical Irreversibility and the Halting Problem (Revised and Fixed version).Yair Lapin - manuscript
    The Turing machine halting problem can be explained by several factors, including arithmetic logic irreversibility and memory erasure, which contribute to computational uncertainty due to information loss during computation. Essentially, this means that an algorithm can only preserve information about an input, rather than generate new information. This uncertainty arises from characteristics such as arithmetic logical irreversibility, Landauer's principle, and memory erasure, which ultimately lead to a loss of information and an increase in entropy. To measure this uncertainty (...)
  26. Physical complexity and cognitive evolution.Peter Jedlicka - 2007 - In Carlos Gershenson, Diederik Aerts & Bruce Edmonds (eds.), Worldviews, Science, and Us: Philosophy and Complexity. World Scientific. pp. 221--231.
    Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism stores (...)
    2 citations
  27. Pragmatism and the definition of the idea of disease.Maurilio Lovatti - manuscript
    The purpose of this paper is to support the idea that pragmatism is still a productive resource for the study of health and disease. Combining the results of thermodynamic theories with evolutionism, some have argued that the biological structures selected by evolution are ideally suited to maximizing the conservation of energy; as a consequence, they tend to reduce the entropy in the organism. Thus, a process can be defined as pathological if it increases the entropy, which means a diminished efficiency (...)
  28. Blockchain Philosophy - Bitcoin.Nicolae Sfetcu - manuscript
    State features are affected by the connection with digital coins. Social systems create their own limits and remain alive according to their internal logic, which does not derive from the system environment. Thus, social systems are operationally and autonomously closed: they interact with their environment, and while there is a general increase in entropy, individual systems work to maintain and preserve their internal order. Autopoietic systems (like the state, with the tendency to maintain the inner order with a (...)
  29. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  30. The entropy theory of counterfactuals.Douglas N. Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    27 citations
  31. Entropy of Polysemantic Words for the Same Part of Speech.Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic words, that is, words with multiple meanings for the same part of speech, are analyzed under the name of neutrosophic words. These words represent the most difficult cases for the disambiguation algorithms as they represent the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework made by means of concepts from neutrosophic theory and entropy measure in which we incorporate sense related data. We show (...)
  32. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment.Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    The cross entropy measure is one of the best ways to calculate the divergence of a variable from a prior one. We define a new cross entropy measure in the interval neutrosophic set environment.
    4 citations
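    For readers unfamiliar with the base notion: the paper's IN-cross entropy generalizes the ordinary cross entropy between discrete distributions. The Python sketch below shows only that ordinary case, H(p, q) = −Σ pᵢ ln qᵢ, and the derived KL divergence; it is not the interval neutrosophic measure defined in the paper:

      from math import log

      def cross_entropy(p, q):
          """H(p, q) = -sum p_i * ln q_i; p is the reference, q the compared distribution."""
          return -sum(pi * log(qi) for pi, qi in zip(p, q) if pi > 0)

      def kl_divergence(p, q):
          """D_KL(p || q) = H(p, q) - H(p, p); zero iff p == q."""
          return cross_entropy(p, q) - cross_entropy(p, p)

      prior = [0.7, 0.2, 0.1]
      observed = [0.5, 0.3, 0.2]
      print(cross_entropy(prior, observed))  # ~0.887 nats
      print(kl_divergence(prior, observed))  # ~0.085 nats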
  33. Does von Neumann Entropy Correspond to Thermodynamic Entropy?Eugene Y. S. Chua - 2021 - Philosophy of Science 88 (1):145-168.
    Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, (...)
    3 citations
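    The quantity at issue is the von Neumann entropy S(ρ) = −Tr(ρ ln ρ). A short numerical sketch (my own, not from the paper) computes it from the eigenvalues of a density matrix, contrasting a pure state with the maximally mixed state:

      import numpy as np

      def von_neumann_entropy(rho):
          """Entropy in nats of a density matrix rho (Hermitian, positive semidefinite, trace 1)."""
          eigvals = np.linalg.eigvalsh(rho)
          eigvals = eigvals[eigvals > 1e-12]          # drop numerical zeros
          return float(-np.sum(eigvals * np.log(eigvals)))

      pure = np.array([[1.0, 0.0], [0.0, 0.0]])       # pure state |0><0|
      maximally_mixed = np.eye(2) / 2                 # I/2

      print(von_neumann_entropy(pure))             # 0.0
      print(von_neumann_entropy(maximally_mixed))  # ln 2 ≈ 0.693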
  34. Quantum Mechanics as an Entropy Maximization Problem: Deriving Axioms from Measurements.Alexandre Harvey-Tremblay - manuscript
    This work presents a novel formulation of quantum mechanics as the solution to an entropy maximization problem constrained by empirical measurement outcomes. By treating the complete set of possible measurement outcomes as an optimization constraint, our entropy maximization problem derives the axioms of quantum mechanics as theorems, demonstrating that the theory's mathematical structure is the least biased probability measure consistent with the observed data. This approach reduces the foundation of quantum mechanics to a single axiom, the measurement constraint, (...)
    1 citation
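    As a rough illustration of entropy maximization under a measurement constraint (a generic maximum-entropy calculation, not the paper's derivation): among distributions over states with values Eᵢ and a fixed mean, the entropy maximizer has the Gibbs form pᵢ ∝ exp(−β Eᵢ), with β fixed by the constraint. A Python sketch that finds β by bisection:

      from math import exp, log

      def gibbs(values, beta):
          weights = [exp(-beta * v) for v in values]
          z = sum(weights)
          return [w / z for w in weights]

      def mean(values, probs):
          return sum(v * p for v, p in zip(values, probs))

      def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
          """Bisect on beta so the Gibbs distribution satisfies the mean constraint."""
          for _ in range(200):
              mid = 0.5 * (lo + hi)
              if mean(values, gibbs(values, mid)) > target_mean:
                  lo = mid   # mean decreases as beta grows, so raise beta
              else:
                  hi = mid
              if hi - lo < tol:
                  break
          return gibbs(values, 0.5 * (lo + hi))

      values = [0.0, 1.0, 2.0]
      p = maxent_distribution(values, target_mean=0.5)
      print(p)                               # skewed toward the low-value state
      print(-sum(pi * log(pi) for pi in p))  # its Shannon entropy (nats)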
  35. Entropy : A concept that is not a physical quantity.Shufeng Zhang - 2012 - Physics Essays 25 (2):172-176.
    This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such (...)
  36. An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
  37. Entropy as Root Metaphor.Eric Zencey - 1986 - Dissertation, The Claremont Graduate University
    Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing a coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. ;The second law of thermodynamics--the (...)
    2 citations
  38. Einstein, Entropy, and Anomalies.Daniel Sirtes & Eric Oberheim - 2006 - AIP Conference Proceedings 861:1147-1154.
    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend (...)
    7 citations
  39. Degeneration and Entropy.Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  40. Counting with Cilia: The Role of Morphological Computation in Basal Cognition Research.Wiktor Rorot - 2022 - Entropy 24 (11):1581.
    “Morphological computation” is an increasingly important concept in robotics, artificial intelligence, and philosophy of the mind. It is used to understand how the body contributes to cognition and control of behavior. Its understanding in terms of "offloading" computation from the brain to the body has been criticized as misleading, and it has been suggested that the use of the concept conflates three classes of distinct processes. In fact, these criticisms implicitly hang on accepting a semantic definition of what constitutes computation. (...)
  41. Change in Entropy as a Function of McTaggart's A-series and B-series.Paul Merriam - manuscript
    This careful note is an initial foray into the issue of the change in entropy with respect to both McTaggart’s A-series and his B-series. We find a possible solution to the Past Hypothesis problem.
  42. How bad is the postulation of a low entropy initial state of the universe?Aldo Filomeno - 2023 - Aphex 27:141-158.
    I summarize, in this informal interview, the main approaches to the ‘Past Hypothesis’, the postulation of a low-entropy initial state of the universe. I’ve chosen this as an open problem in the philosophical foundations of physics. I hope that this brief overview helps readers in gaining perspective and in appreciating the diverse range of approaches in this fascinating unresolved debate.
  43. On walk entropies in graphs. Response to Dehmer and Mowshowitz.Ernesto Estrada, José A. de la Peña & Naomichi Hatano - 2016 - Complexity 21 (S1):15-18.
    We provide here irrefutable facts that prove the falsehood of the claims published in [1] by Dehmer and Mowshowitz (DM) against our paper published in [2]. We first prove that Dehmer’s definition of node probability [3] is flawed. In addition, we show that it was not Dehmer in [3] who proposed this definition for the first time. We continue by proving how the use of Dehmer’s definition does not reveal all the physico-mathematical richness of the walk entropy of graphs. (...)
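    For context, the walk entropy at issue in this exchange is, as I understand Estrada and coauthors' definition, S(G, β) = −Σᵢ pᵢ ln pᵢ with pᵢ = [exp(βA)]ᵢᵢ / Tr exp(βA), where A is the graph's adjacency matrix. The Python sketch below is a hedged illustration under that assumption, not a reproduction of the paper:

      import numpy as np

      def walk_entropy(adjacency, beta=1.0):
          a = np.asarray(adjacency, dtype=float)
          eigvals, eigvecs = np.linalg.eigh(a)                 # A is symmetric
          exp_beta_a = eigvecs @ np.diag(np.exp(beta * eigvals)) @ eigvecs.T
          p = np.diag(exp_beta_a) / np.trace(exp_beta_a)       # node probabilities
          return float(-np.sum(p * np.log(p)))

      # Path graph P3 (node 1 is the center) vs. triangle K3 (all nodes equivalent).
      path3 = [[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]]
      triangle = [[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]]
      print(walk_entropy(path3))     # < ln 3, since the nodes are not walk-equivalent
      print(walk_entropy(triangle))  # ln 3 ≈ 1.0986, maximal for 3 nodes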
  44. Norwich’s Entropy Theory: how not to go from abstract to actual.Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver”, (...)
    4 citations
  45. “A Thousand Words”: How Shannon Entropy perspective provides link among exponential data growth, average temperature of the Earth, declining Earth magnetic field, and global consciousness.Victor Christianto & Florentin Smarandache - manuscript
    The sunspot data seem to indicate that the Sun is likely to enter a Maunder Minimum, which would mean that low solar activity may cause low temperatures on Earth. If this happens, it will cause a phenomenon which some climatology experts call “The Little Ice Age” for the next 20-30 years, starting from the next few years. Therefore, the Earth’s climate in the coming years will tend to be cooler than before. This phenomenon then causes us to (...)
  46. The Concept of Entropy in Statistical Mechanics and Stochastic Music Theory.Ivano Zanzarella - manuscript
    Originally appeared in the field of thermodynamics, the concept of entropy, especially in its statistical acceptation, has found applications in many different disciplines, both inside and outside science. In this work we focus on the possibility of drawing an isomorphism between the entropy of Boltzmann’s statistical mechanics and that of Xenakis’s stochastic music theory. We expose the major technical aspects of the two entropies and then consider affinities and differences between them, both at syntactic and at semantic level, (...)
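    As a concrete anchor for the statistical-mechanical side of the comparison: Boltzmann's entropy is S = k_B ln W, with W the number of microstates compatible with a macrostate. A toy Python sketch (mine, not from the paper) counting W for particles distributed over energy levels:

      from math import factorial, log

      K_B = 1.380649e-23  # J/K

      def multiplicity(occupations):
          """W = N! / (n1! n2! ... nk!) for N particles with the given occupation numbers."""
          n_total = sum(occupations)
          w = factorial(n_total)
          for n in occupations:
              w //= factorial(n)
          return w

      def boltzmann_entropy(occupations):
          return K_B * log(multiplicity(occupations))

      print(multiplicity([4, 0, 0]))        # 1  (perfectly ordered macrostate)
      print(multiplicity([2, 1, 1]))        # 12 (more disordered, higher W)
      print(boltzmann_entropy([2, 1, 1]))   # k_B * ln 12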
  47. Increasing FX Exposures Unnerved Vietnam Banking Authorities.Quan-Hoang Vuong - 2011 - Stratfor Global Intelligence 2011 (8).
    Vietnam’s money and capital markets continue to go through dramatic stages in the remaining months of 2011, due to high bank rates and the increasing foreign-exchange exposure faced by domestic enterprises.
  48. Ampliative Inference Under Varied Entropy Levels.Paul D. Thorn & Gerhard Schurz - 2013 - In Christoph Beierle & Gabriele Kern-Isberner (eds.), Proceedings of the 4th Workshop on Dynamics of Knowledge and Belief (DKB-2013). Fakultät für Mathematik und Informatik, FernUniversität in Hagen. pp. 77-88.
  49. A Decision-Making Approach Incorporating TODIM Method and Sine Entropy in q-Rung Picture Fuzzy Set Setting.Büşra Aydoğan, Murat Olgun, Florentin Smarandache & Mehmet Ünver - 2024 - Journal of Applied Mathematics 2024.
    In this study, we propose a new approach based on fuzzy TODIM (Portuguese acronym for interactive and multicriteria decision-making) for decision-making problems in uncertain environments. Our method incorporates group utility and individual regret, which are often ignored in traditional multicriteria decision-making (MCDM) methods. To enhance the analysis and application of fuzzy sets in decision-making processes, we introduce novel entropy and distance measures for q-rung picture fuzzy sets. These measures include an entropy measure based on the sine function and (...)
  50. Information and meaning (Entropy 2003).Christophe Menant - 2003 - Entropy 5:193-204.
    We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint. We build up definitions and properties for meaningful information, a meaning generator system and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
    17 citations
1 — 50 / 997