Results for 'maximum entropy'

409 found
  1. Bertrand's Paradox and the Maximum Entropy Principle.Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the (...)
    2 citations
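    For orientation on the paradox at issue in this entry: the three classical ways of picking a 'random chord' assign different probabilities to the chord being longer than the side of the inscribed equilateral triangle. These standard textbook values are added here as background and are not quoted from the paper.

```latex
% Classical Bertrand chord probabilities under the three standard parameterizations
P(\text{random endpoints on the circle}) = \tfrac{1}{3}, \qquad
P(\text{random radius, random midpoint on it}) = \tfrac{1}{2}, \qquad
P(\text{random midpoint in the disc}) = \tfrac{1}{4}.
```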
  2. Entropy in Physics using my Universal Formula.Angelito Malicse - manuscript
    1. Thermodynamic Entropy and Balance in Nature: Thermodynamic entropy in physics measures the level of disorder in a system, reflecting the natural tendency of energy to spread and of systems to become more disordered. Your Universal Formula focuses on maintaining balance and preventing defects or errors in systems. Integration: increasing thermodynamic entropy (e.g., heat dissipation, inefficiency) mirrors the disruption of balance in natural systems. Preventing imbalance: to minimize entropy, systems must operate (...)
  3. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness.Julio Michael Stern - 2011 - Information 2 (2):277-301.
    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
    4 citations
  4. Beyond the Bekenstein Bound: Prime-Driven Structured Resonance as the Fundamental Ordering Principle of Information and Entropy.Devin Bostick - manuscript
    The Bekenstein Bound proposed that the maximum information content of a finite region of space is determined by its surface area, not its volume, leading to the holographic principle. This shift reframed information as the fundamental constraint on physical systems rather than matter or energy. It suggested that entropy, rather than being an inherent measure of disorder, is a function of informational constraints at the boundaries of a system. However, existing models that incorporate this bound still (...)
  5. An Ontological Framework of Space‐Time‐Entropy.Zhikai Zou - manuscript
    A speculative, concretized framework for understanding relativity and quantum field theory: a thermodynamic definition of time with a clear arrow of time, based on the law of entropy increase. Abstract: Define time as a mapping of the transformations of the whole universe. Under this definition of whole-universe transformations, the concept of relative or local time is different from the time of the universe, and the definition is consistent with most physical phenomena. The physicists of the last century excluded the possibility of the existence (...)
  6. On the naturalisation of teleology: self-organisation, autopoiesis and teleodynamics.Miguel Garcia-Valdecasas - 2022 - Adaptive Behavior 30 (2):103-117.
    In recent decades, several theories have claimed to explain the teleological causality of organisms as a function of self-organising and self-producing processes. The most widely cited theories of this sort are variations of autopoiesis, originally introduced by Maturana and Varela. More recent modifications of autopoietic theory have focused on system organisation, closure of constraints and autonomy to account for organism teleology. This article argues that the treatment of teleology in autopoiesis and other organisation theories is inconclusive for three reasons: First, (...)
    5 citations
  7. Symmetry, Invariance and Ontology in Physics and Statistics.Julio Michael Stern - 2011 - Symmetry 3 (3):611-635.
    This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective (...)
    18 citations
  8. A Fundamentally Irreversible World as an Opportunity towards a Consistent Understanding of Quantum and Cosmological Contexts.Helmut Tributsch - 2016 - Journal of Modern Physics 7:1455-1482.
    In a preceding publication, a fundamentally oriented and irreversible world was shown to be derivable from the important principle of least action. A consequence of such a paradigm change is the avoidance of paradoxes within a “dynamic” quantum physics. This becomes possible essentially because fundamental irreversibility allows consideration of the “entropy” concept in elementary processes. For this reason, and for a compensation of entropy in the spread-out energy of the wave, the duality of particle and wave has (...)
  9. Cellular Primary Consciousness Theory (CPCT): The Foundation Intelligence of Emergent Phenomena in Closed Systems; in Theory and Practice And Open and Closed Systems Theory (OCST): The Purpose of Meaninglessness.Brian Brown - manuscript
    This paper presents a unified theory of reality, which integrates two interdependent frameworks: Cellular Primary Consciousness Theory (CPCT) and Open and Closed Systems Theory (OCST). Although CPCT and OCST can each stand as individual theories, they are, in this work, combined to form a cohesive explanation of both the mechanics and purpose of the universe. CPCT posits that consciousness is a fundamental aspect of all life, extending to even the simplest cells, rather than being an emergent property exclusive to complex (...)
  10. Spirit.Eric Steinhart - 2017 - Sophia 56 (4):557-571.
    Many religions and religious philosophies say that ultimate reality is a kind of primal energy. This energy is often described as a vital power animating living things, as a spiritual force directing the organization of matter, or as a divine creative power which generates all things. By refuting older conceptions of primal energy, modern science opens the door to new and more precise conceptions. Primal energy is referred to here as ‘spirit’. But spirit is a natural power. A naturalistic theory (...)
    1 citation
  11. Heat Death and Its Theological Implications.Javad Navaei & Seyyed Mohammad Kazem Alavi - 2019 - Imam Sadiq University 17 (1):233-253.
    According to the second law of thermodynamics, irreversible processes in an isolated system move towards the goal of reaching maximum entropy. In this state, mechanical work is converted to thermal energy and thermodynamic equilibrium occurs; which is determined by the equilibrium in temperature, pressure, etc. Assuming that the universe is an isolated system, the second law of thermodynamics states that the fate of the universe is a state of thermodynamic equilibrium in which all mechanical energies are converted to (...)
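    The abstract appeals to the second law for isolated systems; a compact textbook statement of that claim (added for orientation, not quoted from the paper) is:

```latex
% Clausius inequality; for an isolated system entropy is non-decreasing
dS \ge \frac{\delta Q}{T}, \qquad
\text{isolated system } (\delta Q = 0):\; dS \ge 0, \qquad
S \to S_{\max} \text{ at thermodynamic equilibrium.}
```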
  12. Indifference to Anti-Humean Chances.J. Dmitri Gallow - 2022 - Canadian Journal of Philosophy 52 (5):485-501.
    An indifference principle says that your credences should be distributed uniformly over each of the possibilities you recognise. A chance deference principle says that your credences should be aligned with the chances. My thesis is that, if we are anti-Humeans about chance, then these two principles are incompatible. Anti-Humeans think that it is possible for the actual frequencies to depart from the chances. So long as you recognise possibilities like this, you cannot both spread your credences evenly and defer to (...)
  13. Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning. [REVIEW]Chenguang Lu - 2023 - Entropy 25 (5).
    A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise Contrast Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions. Coincidentally, EMI is essentially the same as Semantic Mutual Information (SeMI) proposed by the author 30 years ago. This paper first reviews the evolutionary histories of semantic information measures and learning functions. Then, it briefly introduces the author’s semantic information G theory with (...)
  14. Informational entropy-based value formation: A new paradigm for a deeper understanding of value.Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - 2025 - I.E.V.25.
    The major global challenges of our time, like climate and environmental crises, rising inequality, the emergence of disruptive technologies, etc., demand interdisciplinary research for effective solutions. A clear understanding of value is essential for guiding socio-cultural and economic transitions to address these issues. Despite numerous attempts to define value, existing approaches remain inconsistent across disciplines and lack a comprehensive framework. This paper introduces a novel perspective on value through the lens of granular interaction thinking theory, proposing an informational entropy-based (...)
    17 citations
  15. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann, Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    38 citations
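    Some of the standard definitions such a survey compares, stated here only as background (textbook forms, not excerpts from the chapter):

```latex
% Boltzmann, Gibbs, and Shannon entropies
S_B = k_B \ln W, \qquad
S_G = -k_B \sum_i p_i \ln p_i, \qquad
H = -\sum_i p_i \log_2 p_i \;\; \text{(Shannon, in bits).}
```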
  16. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
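    The central quantity of Ellerman's logical information theory, for a partition π with block probabilities p_i, next to its Shannon counterpart (standard definitions, included for readers new to the entry's terminology):

```latex
% Logical entropy = probability that two independent draws fall in distinct blocks
h(\pi) = \sum_{i \ne j} p_i p_j = 1 - \sum_i p_i^{2},
\qquad
H(\pi) = -\sum_i p_i \log_2 p_i \;\;\text{(Shannon).}
```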
  17. Rethinking Maximum Likelihood.Paul Mayer - manuscript
    It is argued that Maximum Likelihood Estimation (MLE) is wrong, both conceptually and in terms of results it produces (except in two very special cases, which are discussed). While the use of MLE can still be justified on the basis of its practical performance, we argue there are better estimation methods that overcome MLE's empirical and philosophical shortcomings while retaining all of MLE's benefits.
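    The entry criticizes Maximum Likelihood Estimation; purely as a reference point, here is a minimal sketch of the textbook estimator it targets, for i.i.d. Gaussian data. The Gaussian example and the function name are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_mle(x: np.ndarray) -> tuple[float, float]:
    """Closed-form maximum likelihood estimates (mu, sigma^2) for i.i.d. Gaussian data.

    The ML variance divides by n rather than n - 1, so it is biased; this is one
    of the standard criticisms raised in debates about MLE.
    """
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()
    sigma2_hat = np.mean((x - mu_hat) ** 2)  # 1/n, the ML estimate
    return float(mu_hat), float(sigma2_hat)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(loc=2.0, scale=1.5, size=500)
    print(gaussian_mle(sample))
```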
  18. Entropy and the Direction of Time.Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open (...)
    2 citations
  19. Conditional Entropy with Swarm Optimization Approach for Privacy Preservation of Datasets in Cloud.Sugumar R. - 2016 - Indian Journal of Science and Technology 9 (28):1-6.
    Background/Objective: The primary intention is to provide a utility trade-off and good privacy for intermediate datasets in the cloud. Methods: An efficient conditional entropy and a database difference ratio are employed for the process. Utility is taken care of through conditional entropy with the help of Particle Swarm Optimization (PSO). Privacy is handled by the database difference ratio. Findings: Conditional entropy is computed between the first column and the original database, and this is taken as the fitness function in (...)
    13 citations
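    The paper's fitness function is described only as the conditional entropy between a column and the original database; the PSO and anonymization details are not reproduced here. As a minimal sketch of the underlying quantity, this computes conditional entropy from a joint contingency table (the table values are invented for the example):

```python
import numpy as np

def conditional_entropy(joint_counts: np.ndarray) -> float:
    """H(Y | X) in bits, from a contingency table of counts (rows = X, cols = Y).

    Uses the chain-rule identity H(Y | X) = H(X, Y) - H(X).
    """
    p_xy = joint_counts / joint_counts.sum()
    p_x = p_xy.sum(axis=1)

    def entropy(p: np.ndarray) -> float:
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    return entropy(p_xy.ravel()) - entropy(p_x)

if __name__ == "__main__":
    table = np.array([[10, 5],
                      [2, 8]])  # toy joint counts of two attributes
    print(round(conditional_entropy(table), 4))
```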
  20. Further on informational quanta, interactions, and entropy under the granular view of value formation.Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - SSRN.
    A recent study suggests that value and quantum states seem to be governed by the same underlying mechanisms. In our recent book titled "Better economics for the Earth: A lesson from quantum and information theories," specifically Chapter 5, we have proposed an informational entropy-based notion of value, grounded in Granular Interaction Thinking Theory (GITT), which integrates granular worldview and primary features of quantum mechanics, Shannon’s information theory, and the mindsponge theory. Specifically, the notion suggests that values are created through (...)
    125 citations
  21. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment.Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    The cross entropy measure is one of the best ways to calculate the divergence of a variable from a prior one. We define a new cross entropy measure under an interval neutrosophic set environment.
    6 citations
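    In this literature the cross entropy of a distribution from a prior is usually a Kullback–Leibler type divergence; its classical (non-neutrosophic) form is recalled below as background. The interval neutrosophic generalization itself is defined in the paper and is not reproduced here.

```latex
% Classical cross-entropy (KL-type) divergence of p from the prior q
D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i} \;\ge\; 0,
\qquad \text{with equality iff } p = q.
```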
  22. Entropy as Root Metaphor.Eric Zencey - 1986 - Dissertation, The Claremont Graduate University
    Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing a coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. The second law of thermodynamics--the (...)
    2 citations
  23. The entropy theory of counterfactuals.Douglas N. Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    31 citations
  24. Entropy : A concept that is not a physical quantity.Shufeng Zhang - 2012 - Physics Essays 25 (2):172-176.
    This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such (...)
  25. (1 other version)How Entropy Explains the Emergence of Consciousness: The Entropic Theory.Peter C. Lugten - 2024 - Journal of Neurobehavioral Sciences 11 (1):10-18.
    Background: Emergentism as an ontology of consciousness leaves unanswered the question as to its mechanism. Aim: I aim to solve the Body-Mind problem by explaining how conscious organisms emerged on an evolutionary basis at various times in accordance with an accepted scientific principle, through a mechanism that cannot be understood, in principle. Proposal: The reason for this cloak of secrecy is found in a seeming contradiction in the behaviour of information with respect to the first two laws of thermodynamics. Information, (...)
  26. An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
  27. Entropy of Polysemantic Words for the Same Part of Speech.Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic words, that is, words with multiple meanings for the same part of speech, is analyzed under the name of neutrosophic words. These words represent the most difficult cases for disambiguation algorithms, as they are the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework built from concepts of neutrosophic theory and an entropy measure, into which we incorporate sense-related data. We show (...)
  28. Does von Neumann Entropy Correspond to Thermodynamic Entropy?Eugene Y. S. Chua - 2021 - Philosophy of Science 88 (1):145-168.
    Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, (...)
    3 citations
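    The quantity under dispute in this entry, in its standard form (stated here as background; the arguments about its thermodynamic status are in the paper):

```latex
% von Neumann entropy of a density operator rho with eigenvalues lambda_i
S_{\mathrm{vN}}(\rho) = -\,\mathrm{Tr}\,(\rho \ln \rho) = -\sum_i \lambda_i \ln \lambda_i .
```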
  29. Degeneration and Entropy.Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  30. Einstein, Entropy, and Anomalies.Daniel Sirtes & Eric Oberheim - 2006 - AIP Conference Proceedings 861:1147-1154.
    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend (...)
    7 citations
  31. AI training data, model success likelihood, and informational entropy-based value.Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - manuscript
    Since the release of OpenAI's ChatGPT, the world has entered a race to develop more capable and powerful AI, including artificial general intelligence (AGI). The development is constrained by the dependency of AI on the model, quality, and quantity of training data, making the AI training process highly costly in terms of resources and environmental consequences. Thus, improving the effectiveness and efficiency of the AI training process is essential, especially when the Earth is approaching the climate tipping points and planetary (...)
  32. Change in Entropy as a Function of McTaggart's A-series and B-series.Paul Merriam - manuscript
    This careful note is a very initial foray into the issue of the change in entropy with respect to both McTaggart’s A-series and his B-series. We find a possible solution to the Past Hypothesis problem.
  33. Increasing coding opportunities using maximum-weight clique.Nastooh Taheri Javan - 2013 - 2013 5th Computer Science and Electronic Engineering Conference (CEEC) 1 (1):168-173.
    Network coding is used to improve the throughput of communication networks. In this technique, the intermediate nodes mix packets to increase the information content of each transmission. For each flow, a coding pattern is defined as a set of flows that can be coded together. Finding a suitable coding pattern is challenging because of the problem's complexity. In this paper, we propose an algorithm to find a suitable coding pattern at intermediate nodes by mapping this problem onto the maximum-weight clique problem. (...)
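    The entry maps coding-pattern selection onto the maximum-weight clique problem; the paper's actual graph construction and weighting are not reproduced here. Below is a minimal sketch of the target problem using NetworkX, with invented flow names and integer weights standing in for coding gain.

```python
import networkx as nx

# Hypothetical "coding graph": nodes are flows, an edge means the two flows can be
# coded together, and the integer node weight stands in for the coding gain.
G = nx.Graph()
G.add_nodes_from([("f1", {"weight": 3}), ("f2", {"weight": 2}),
                  ("f3", {"weight": 4}), ("f4", {"weight": 1})])
G.add_edges_from([("f1", "f2"), ("f1", "f3"), ("f2", "f3"), ("f3", "f4")])

# Exact maximum-weight clique (exponential worst case, fine for small instances)
clique, total_weight = nx.max_weight_clique(G, weight="weight")
print(clique, total_weight)  # e.g. ['f1', 'f2', 'f3'] with total weight 9
```

    For graphs of the size that arise at a single intermediate node, an exact search like this is usually feasible; larger instances would call for a heuristic or an approximation.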
  34. To Relativity, the maximum speed of information transmission is c, which is false.Alfonso Leon Guillen Gomez - manuscript
    According to a bit-generation technique in which bits can only take the binary value 0 in the absence of wave function collapse and 1 when wave function collapse occurs, that is, regardless of the random value at which the collapse of the wave function occurs (the point which in the past caused the development of an alternative technique of classic bits to be abandoned and which remains an impossibility in normal science), through series of quantum entanglements, using BCD, EBCDIC or (...)
  35. A Decision-Making Approach Incorporating TODIM Method and Sine Entropy in q-Rung Picture Fuzzy Set Setting.Büşra Aydoğan, Murat Olgun, Florentin Smarandache & Mehmet Ünver - 2024 - Journal of Applied Mathematics 2024.
    In this study, we propose a new approach based on fuzzy TODIM (Portuguese acronym for interactive and multicriteria decision-making) for decision-making problems in uncertain environments. Our method incorporates group utility and individual regret, which are often ignored in traditional multicriteria decision-making (MCDM) methods. To enhance the analysis and application of fuzzy sets in decision-making processes, we introduce novel entropy and distance measures for q-rung picture fuzzy sets. These measures include an entropy measure based on the sine function and (...)
  36. How bad is the postulation of a low entropy initial state of the universe?Aldo Filomeno - 2023 - Aphex 27:141-158.
    I summarize, in this informal interview, the main approaches to the ‘Past Hypothesis’, the postulation of a low-entropy initial state of the universe. I’ve chosen this as an open problem in the philosophical foundations of physics. I hope that this brief overview helps readers in gaining perspective and in appreciating the diverse range of approaches in this fascinating unresolved debate.
  37. On walk entropies in graphs. Response to Dehmer and Mowshowitz.Ernesto Estrada, José A. de la Peña & Naomichi Hatano - 2016 - Complexity 21 (S1):15-18.
    We provide here irrefutable facts that prove the falsehood of the claims published in [1] by Dehmer and Mowshowitz (DM) against our paper published in [2]. We first prove that Dehmer’s definition of node probability [3] is flawed. In addition, we show that it was not Dehmer in [3] who proposed this definition for the first time. We continue by proving how the use of Dehmer’s definition does not reveal all the physico-mathematical richness of the walk entropy of graphs. (...)
  38. The Temporal Mastery Hypothesis: Entropy, Knowledge, and the Structuring of Time.Devin Bostick - manuscript
    This paper proposes that entropy—both physical and cognitive—can be reduced through structured mastery and framework collapse, leading to a restructured perception of time and order. By integrating key philosophical frameworks—Adler’s drive for superiority, Fromm’s humanistic psychoanalysis, Arendt’s theory of action, Rand’s rational individualism, Boulivert’s feminist critique of power structures, and Foucault’s knowledge-power dynamics—we examine whether structured mastery over systems creates a functional negentropy, where the refinement of cognitive and social structures leads to an increase in coherence and (...)
  39. The Four Horsemen of Entropy: A Metaphysical Warning for Systemic Renewal.Tim Grooms - manuscript
    This paper examines the Four Horsemen of the Apocalypse through the lens of entropy, ethical misalignment, and systemic collapse. Rather than viewing the Horsemen as physical entities or apocalyptic figures, they are reinterpreted as metaphors representing the forces that lead to the degradation of systems—both societal and individual. The paper argues that war, famine, pestilence, and death can be reinterpreted as metaphors for systemic entropy, representing the forces of ethical misalignment and systemic breakdown that threaten both individuals (...)
  40. Privacy preserving data mining using hiding maximum utility item first algorithm by means of grey wolf optimisation algorithm.Sugumar Rajendran - 2023 - Int. J. Business Intell. Data Mining 10 (2):1-20.
    In privacy-preserving data mining, utility mining plays a very vital part. The suggested technique works by concealing the highly sensitive item sets with the help of the hiding maximum utility item first (HMUIF) algorithm, which evaluates the sensitive item sets by exploiting a user-defined utility threshold value. It then estimates the sensitive item sets using the optimal threshold value obtained by means of the grey wolf optimisation (GWO) algorithm. (...)
    52 citations
  41. “A Thousand Words”: How Shannon Entropy perspective provides link among exponential data growth, average temperature of the Earth, declining Earth magnetic field, and global consciousness.Victor Christianto & Florentin Smarandache - manuscript
    The sunspot data seem to indicate that the Sun is likely to enter a Maunder Minimum, which would mean that low solar activity may cause low temperatures on Earth. If this happens, it will cause a phenomenon that some climatology experts call a “Little Ice Age” for the next 20-30 years, starting within the next few years. The Earth's climate in the coming years would therefore tend to be cooler than before. This phenomenon then causes us to (...)
  42. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  43. Norwich’s Entropy Theory: how not to go from abstract to actual.Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver”, (...)
    5 citations
  44. The Concept of Entropy in Statistical Mechanics and Stochastic Music Theory.Ivano Zanzarella - manuscript
    Originally appeared in the field of thermodynamics, the concept of entropy, especially in its statistical acceptation, has found applications in many different disciplines, both inside and outside science. In this work we focus on the possibility of drawing an isomorphism between the entropy of Boltzmann’s statistical mechanics and that of Xenakis’s stochastic music theory. We expose the major technical aspects of the two entropies and then consider affinities and differences between them, both at syntactic and at semantic level, (...)
  45. Re-examination of Fundamental Concepts of Heat, Work, Energy, Entropy, and Information Based on NGST.Pan Lingli & Cui Weicheng - 2022 - Philosophy Study 12 (1):1-17.
    In order to use the framework of general system theory (GST) to unify the three mechanics subjects of classical mechanics, quantum mechanics, and relativistic mechanics, a new general system theory (NGST) is developed, based on a new ontology of ether and minds as the fundamental existences in the world. Based on this new ontology, many fundamental concepts have been found to be ambiguously defined nowadays and, in particular, to lack ontological support. In our previous work, some of the fundamental concepts such (...)
  46. There’s Plenty of Boole at the Bottom: A Reversible CA Against Information Entropy.Francesco Berto, Jacopo Tagliabue & Gabriele Rossi - 2016 - Minds and Machines 26 (4):341-357.
    “There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final (...)
    1 citation
  47. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the (...)
  48. Qualitative probabilistic inference under varied entropy levels.Paul D. Thorn & Gerhard Schurz - 2016 - Journal of Applied Logic 19 (2):87-101.
    In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield (...)
    4 citations
  49. Information and meaning (Entropy 2003).Christophe Menant - 2003 - Entropy 5:193-204.
    We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint. We build up definitions and properties for meaningful information, a meaning generator system and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
    18 citations
  50. The temporal foundation of the principle of maximal entropy.Vasil Penchev - 2020 - Logic and Philosophy of Mathematics eJournal 12 (11):1-3.
    The principle of maximal entropy (further abbreviated as “MaxEnt”) can be founded on the formal mechanism in which future transforms into past by the mediation of present. This allows MaxEnt to be investigated by means of the theory of quantum information. MaxEnt can be considered as an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is for the (...)
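    For readers unfamiliar with the principle the note builds on, here is the standard constrained-optimization statement of MaxEnt and its Gibbs-form solution (textbook form, not the paper's quantum-information reformulation):

```latex
% Maximize Shannon entropy subject to normalization and moment constraints
\max_{p}\; H(p) = -\sum_i p_i \ln p_i
\quad \text{s.t.} \quad \sum_i p_i = 1,\;\; \sum_i p_i f_k(x_i) = F_k,
\qquad\Longrightarrow\qquad
p_i \propto \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big).
```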
Showing results 1–50 of 409