Results for 'entropy measure'

953 found
  1. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment.Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    Cross entropy measure is one of the best ways to calculate the divergence of a variable from an a priori one. We define a new cross entropy measure under an interval neutrosophic set environment.
    5 citations
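For orientation, the classical cross entropy that such neutrosophic measures generalize can be sketched in a few lines of Python. This is only an illustration of the standard Kullback-Leibler setup, not the authors' IN-cross entropy; the function names are mine.

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i) (natural log).

    Measures how far a distribution q diverges from the a priori
    distribution p; it equals the entropy H(p) exactly when q == p.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = H(p, q) - H(p).

    Non-negative, and zero if and only if p == q.
    """
    return cross_entropy(p, q) - cross_entropy(p, p)

prior = [0.5, 0.5]
observed = [0.9, 0.1]
print(kl_divergence(prior, observed))  # positive: observed diverges from prior
```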
  2. Entropy of Polysemantic Words for the Same Part of Speech.Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic word, that is, a word with multiple meanings for the same part of speech, is analyzed under the name of neutrosophic word. These words represent the most difficult cases for disambiguation algorithms, as they are the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework built from concepts of neutrosophic theory and an entropy measure, into which we incorporate sense-related data. We (...)
  3. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
  4. A Decision-Making Approach Incorporating TODIM Method and Sine Entropy in q-Rung Picture Fuzzy Set Setting.Büşra Aydoğan, Murat Olgun, Florentin Smarandache & Mehmet Ünver - 2024 - Journal of Applied Mathematics 2024.
    In this study, we propose a new approach based on fuzzy TODIM (Portuguese acronym for interactive and multicriteria decision-making) for decision-making problems in uncertain environments. Our method incorporates group utility and individual regret, which are often ignored in traditional multicriteria decision-making (MCDM) methods. To enhance the analysis and application of fuzzy sets in decision-making processes, we introduce novel entropy and distance measures for q-rung picture fuzzy sets. These measures include an entropy measure based on the sine function (...)
  5. An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the (...)
    5 citations
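The "normalized counting measure" of distinctions described in this abstract has a compact numerical form: for a partition whose blocks have probabilities p_i, logical entropy is 1 − Σ p_i², the probability that two independent draws land in distinct blocks. A minimal sketch comparing it with Shannon entropy (illustrative only; function names are mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def logical_entropy(probs):
    """Logical entropy h = 1 - sum_i p_i**2: the probability that two
    independent draws fall in different blocks of the partition."""
    return 1 - sum(p * p for p in probs)

# Uniform partition into 4 blocks: h = 1 - 1/4 = 0.75, H = log2(4) = 2 bits.
blocks = [0.25, 0.25, 0.25, 0.25]
print(logical_entropy(blocks), shannon_entropy(blocks))
```

Both vanish on the trivial one-block partition (no distinctions) and grow with the number of equiprobable blocks, which is the parallel the paper develops systematically.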
  6. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements (...)
  7. Sensory Systems as Cybernetic Systems that Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception.Lance Nizami - 2009 - Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics. San Antonio, TX.
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] -present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation is (...)
    3 citations
  8. (1 other version)Causal Confirmation Measures: From Simpson’s Paradox to COVID-19.Chenguang Lu - 2023 - Entropy 25 (1):143.
    When we compare the influences of two causes on an outcome, if the conclusion from every group is against that from the conflation, we think there is Simpson’s Paradox. The Existing Causal Inference Theory (ECIT) can make the overall conclusion consistent with the grouping conclusion by removing the confounder’s influence to eliminate the paradox. The ECIT uses relative risk difference Pd = max(0, (R − 1)/R) (R denotes the risk ratio) as the probability of causation. In contrast, Philosopher Fitelson uses (...)
  9. A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics.David Ellerman - 2024 - Entropy Special Issue: Information-Theoretic Concepts in Physics 26 (2).
    The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) is showing that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. (...)
  10. Whispers and Shouts. The measurement of the human act.Fernando Flores Morador & Luis de Marcos Ortega (eds.) - 2021 - Alcalá de Henares, Madrid: Departement of Computational Sciences. University of Alcalá; Madrid.
    The 20th Century is the starting point for the most ambitious attempts to extrapolate human life into artificial systems. Norbert Wiener’s Cybernetics, Claude Shannon’s Information Theory, John von Neumann’s Cellular Automata, Universal Constructor to the Turing Test, Artificial Intelligence to Maturana and Varela’s Autopoietic Organization, all shared the goal of understanding in what sense humans resemble a machine. This scientific and technological movement has embraced all disciplines without exceptions, not only mathematics and physics but also biology, sociology, psychology, economics etc. (...)
  11. Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning. [REVIEW]Chenguang Lu - 2023 - Entropy 25 (5).
    A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise Contrast Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions. Coincidentally, EMI is essentially the same as Semantic Mutual Information (SeMI) proposed by the author 30 years ago. This paper first reviews the evolutionary histories of semantic information measures and learning functions. Then, it briefly introduces the author’s semantic information G theory with (...)
  12. Theory of Cooperative-Competitive Intelligence: Principles, Research Directions, and Applications.Robert Hristovski & Natàlia Balagué - 2020 - Frontiers in Psychology 11.
    We present a theory of cooperative-competitive intelligence (CCI), its measures, research program, and applications that stem from it. Within the framework of this theory, satisficing sub-optimal behavior is any behavior that does not promote a decrease in the prospective control of the functional action diversity/unpredictability (D/U) potential of the agent or team. This potential is defined as the entropy measure in multiple, context-dependent dimensions. We define the satisficing interval of behaviors as CCI. In order to manifest itself at (...)
    1 citation
  13. Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5.Florentin Smarandache - 2023 - Edited by Smarandache Florentin, Dezert Jean & Tchamova Albena.
    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of applications and in mathematics, and is available in open-access. The collected contributions of this volume have either been published or presented after disseminating the fourth volume in 2015 in international conferences, seminars, workshops and journals, or they are new. The contributions of each part of this volume are chronologically ordered. First Part of this book presents some (...)
  14. (1 other version)Introduction to Image Processing via Neutrosophic Techniques.A. A. Salama, Florentin Smarandache & Mohamed Eisa - 2014 - Neutrosophic Sets and Systems 5:59-64.
    This paper proposes a processing approach for neutrosophic techniques in image processing. As neutrosophic sets are a suitable tool for coping with imperfectly defined images, the properties, basic operations, distance measures, and entropy measures of the neutrosophic set method are presented here. In this paper we introduce the distances between neutrosophic sets: the Hamming distance, the normalized Hamming distance, the Euclidean distance and the normalized Euclidean distance. We will extend the concepts of distances to the case (...)
    2 citations
  15. Everettian Formulation of the Second Law of Thermodynamics.Yu Feng - manuscript
    The second law of thermodynamics is traditionally interpreted as a coarse-grained result of classical mechanics. Recently its relation with quantum mechanical processes such as decoherence and measurement has been revealed in literature. In this paper we will formulate the second law and the associated time irreversibility following Everett’s idea: systems entangled with an object getting to know the branch in which they live. Accounting for this self-locating knowledge, we get two forms of entropy: objective entropy measuring the uncertainty (...)
  16. The computable universe: from prespace metaphysics to discrete quantum mechanics.Martin Leckey - 1997 - Dissertation, Monash University
    The central motivating idea behind the development of this work is the concept of prespace, a hypothetical structure that is postulated by some physicists to underlie the fabric of space or space-time. I consider how such a structure could relate to space and space-time, and the rest of reality as we know it, and the implications of the existence of this structure for quantum theory. Understanding how this structure could relate to space and to the rest of reality requires, I (...)
    1 citation
  17. On the (Im)possibility of Scalable Quantum Computing.Andrew Knight - manuscript
    The potential for scalable quantum computing depends on the viability of fault tolerance and quantum error correction, by which the entropy of environmental noise is removed during a quantum computation to maintain the physical reversibility of the computer’s logical qubits. However, the theory underlying quantum error correction applies a linguistic double standard to the words “noise” and “measurement” by treating environmental interactions during a quantum computation as inherently reversible, and environmental interactions at the end of a quantum computation as (...)
  18. Maxwell’s Demon in Quantum Mechanics.Orly Shenker & Meir Hemmo - 2020 - Entropy 22 (3):269.
    Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 in order to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens the account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made throughout the years, (...)
    1 citation
  19. CSsEv: Modelling QoS Metrics in Tree Soft Toward Cloud Services Evaluator based on Uncertainty Environment.Mona Gharib, Florentin Smarandache & Mona Mohamed - 2024 - International Journal of Neutrosophic Science 23 (2):32-41.
    Cloud computing (ClC) has become a more popular computer paradigm in the preceding few years. Quality of Service (QoS) is becoming a crucial issue in service alteration because of the rapid growth in the number of cloud services. When evaluating cloud service functioning using several performance measures, the issue becomes more complex and non-trivial. It is therefore quite difficult and crucial for consumers to choose the best cloud service. The user's choices are provided in a quantifiable manner in the current (...)
  20. Follow the Math!: The Mathematics of Quantum Mechanics as the Mathematics of Set Partitions Linearized to (Hilbert) Vector Spaces.David Ellerman - 2022 - Foundations of Physics 52 (5):1-40.
    The purpose of this paper is to show that the mathematics of quantum mechanics is the mathematics of set partitions linearized to vector spaces, particularly in Hilbert spaces. That is, the math of QM is the Hilbert space version of the math to describe objective indefiniteness that at the set level is the math of partitions. The key analytical concepts are definiteness versus indefiniteness, distinctions versus indistinctions, and distinguishability versus indistinguishability. The key machinery to go from indefinite to more definite (...)
    1 citation
  21. Falsification and future performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies (...)
    3 citations
  22. A Generalization of Shannon's Information Theory.Chenguang Lu - 1999 - Int. J. Of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecast always conforms with objective facts then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the (...)
    3 citations
  23. Entanglement and thermodynamics in general probabilistic theories.Giulio Chiribella & Carlo Maria Scandolo - 2015 - New Journal of Physics 17:103027.
    Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question whether an entangled state can be transformed into another by (...)
    4 citations
  24. Fragmental Presentism and Quantum Mechanics.Paul Merriam - 2021
    This paper develops a Fragmentalist theory of Presentism and shows how it can help to develop an interpretation of quantum mechanics. There are several fragmental interpretations of physics. In the interpretation of this paper, each quantum system forms a fragment, and fragment f1 makes a measurement on fragment f2 if and only if f2 makes a corresponding measurement on f1. The main idea is then that each fragment has its own present (or ‘now’) until a mutual quantum measurement—at which time (...)
    3 citations
  25. On the Genealogy of the Eternal Return.Dmitri Safronov - 2021 - Vestnik 78 (4):3-24.
    Guided to the notion of the eternal return by the philosophical intuitions of the Greek antiquity, Nietzsche turned to the physical sciences of his day in order to further his inquiry. This extensive intellectual engagement represented a genuine attempt to investigate the possible continuity of meaning between the mythical tradition, on the one hand, and the rational-empirical (i.e. scientific), on the other. In particular, Nietzsche was intrigued by the manner in which the relationship between myth and science played out in (...)
  26. A Stieglerianesque Critique Of Transhumanisms: On Narratives And Neganthropocene.Adrian Mróz - 2019 - Hybris 46:138-160.
    While drawing from the philosophy of Bernard Stiegler throughout the paper, I commence by highlighting Zoltan Istvan’s representation of transhumanism in the light of its role in politics. I continue by elaborating on the notion of the promise of eternal life. After that I differentiate between subjects that are proper for philosophy (such as the mind or whether life is worth living) and science (measurable and replicable). The arguments mostly concern mind-uploading and at the same time I elaborate on a (...)
  27. De la selección natural al diseño: una propuesta de extensión del darwinismo formal.Giorgio Airoldi & Cristian Saborido - 2017 - Metatheoria – Revista de Filosofía E Historia de la Ciencia 8 (1):71--80.
    Darwin’s claim that Natural Selection, through optimization of fitness, explains complex biological design has not yet been properly formalized. Alan Grafen’s Formal Darwinism Project aims at providing such a formalization and at demonstrating that fitness maximization is coherent with results from Population Genetics, usually interpreted as denying it. We suggest that Grafen’s proposal suffers from some limitations linked to its concept of design as optimized fitness. In order to overcome these limitations, we propose a classification of evolutionary facts based on (...)
  28. Dimensional theoretical properties of some affine dynamical systems.Jörg Neunhäuserer - 1999 - Dissertation,
    In this work we study dimensional theoretical properties of some affine dynamical systems. By dimensional theoretical properties we mean Hausdorff dimension and box-counting dimension of invariant sets and ergodic measures on these sets. Especially we are interested in two problems. First we ask whether the Hausdorff and box-counting dimension of invariant sets coincide. Second we ask whether there exists an ergodic measure of full Hausdorff dimension on these invariant sets. If this is not the case we ask (...)
  29. Physical complexity and cognitive evolution.Peter Jedlicka - 2007 - In Carlos Gershenson, Diederik Aerts & Bruce Edmonds (eds.), Worldviews, Science and Us: Philosophy and Complexity. World Scientific. pp. 221--231.
    Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism (...)
    2 citations
  30. Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox.Chenguang Lu - 2020 - Entropy 22 (4):384.
    After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Measure F proposed by Kemeny and Oppenheim among them possesses symmetries and asymmetries proposed by Eells and Fitelson, monotonicity proposed by Greco et al., and normalizing property suggested by many researchers. Based (...)
    4 citations
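The Kemeny–Oppenheim measure F singled out in this abstract has a simple closed form, F(h, e) = [P(e|h) − P(e|¬h)] / [P(e|h) + P(e|¬h)], normalized to the interval [−1, 1]. A minimal sketch (the function name is mine; this illustrates the standard measure, not the paper's channel-based extension):

```python
def kemeny_oppenheim_F(p_e_given_h, p_e_given_not_h):
    """Kemeny-Oppenheim confirmation measure
    F(h, e) = (P(e|h) - P(e|~h)) / (P(e|h) + P(e|~h)).

    Ranges from -1 (evidence e maximally disconfirms hypothesis h)
    to +1 (e maximally confirms h); 0 means e is confirmationally neutral.
    """
    return (p_e_given_h - p_e_given_not_h) / (p_e_given_h + p_e_given_not_h)

# Evidence much likelier under h than under ~h strongly confirms h.
print(kemeny_oppenheim_F(0.9, 0.1))   # strongly positive
print(kemeny_oppenheim_F(0.1, 0.9))   # strongly negative
```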
  31. Information, learning and falsification.David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
    1 citation
  32. Arithmetic logical Irreversibility and the Halting Problem (Revised and Fixed version).Yair Lapin - manuscript
    The Turing machine halting problem can be explained by several factors, including arithmetic logic irreversibility and memory erasure, which contribute to computational uncertainty due to information loss during computation. Essentially, this means that an algorithm can only preserve information about an input, rather than generate new information. This uncertainty arises from characteristics such as arithmetic logical irreversibility, Landauer's principle, and memory erasure, which ultimately lead to a loss of information and an increase in entropy. To measure this uncertainty (...)
  33. Evidence and Credibility: Full Bayesian Significance Test for Precise Hypotheses.Julio Michael Stern & Carlos Alberto de Braganca Pereira - 1999 - Entropy 1 (1):69-80.
    A Bayesian measure of evidence for precise hypotheses is presented. The intention is to give a Bayesian alternative to significance tests or, equivalently, to p-values. In fact, a set is defined in the parameter space and the posterior probability, its credibility, is evaluated. This set is the “Highest Posterior Density Region” that is “tangent” to the set that defines the null hypothesis. Our measure of evidence is the complement of the credibility of the “tangent” region.
    2 citations
  34. The Montevideo Interpretation of Quantum Mechanics: a short review.Rodolfo Gambini & Jorge Pullin - 2015 - Entropy 20 (6).
    The Montevideo interpretation of quantum mechanics, which consists in supplementing environmental decoherence with fundamental limitations in measurement stemming from gravity, has been described in several publications. However, some of them appeared before the full picture provided by the interpretation was developed. As such, it can be difficult to get a good understanding via the published literature. Here we summarize it in a self-contained brief presentation including all its principal elements.
    4 citations
  35. Contextuality and Indistinguishability.Acacio de Barros, Federico Holik & Décio Krause - 2017 - Entropy 19 ((9)):435.
    It is well known that in quantum mechanics we cannot always define consistently properties that are context independent. Many approaches exist to describe contextual properties, such as Contextuality by Default, sheaf theory, topos theory, and non-standard or signed probabilities. In this paper we propose a treatment of contextual properties that is specific to quantum mechanics, as it relies on the relationship between contextuality and indistinguishability. In particular, we propose that if we assume the ontological thesis that quantum particles or properties (...)
  36. Closing the door on quantum nonlocality.Marian Kupczynski - 2018 - Entropy 363347 (363347):17.
    Bell-type inequalities are proven using oversimplified probabilistic models and/or counterfactual definiteness (CFD). If setting-dependent variables describing measuring instruments are correctly introduced, none of these inequalities may be proven. In spite of this, a belief in a mysterious quantum nonlocality is not fading. Computer simulations of Bell tests allow people to study the different ways in which the experimental data might have been created. They also allow for the generation of various counterfactual experiments’ outcomes, such as repeated or simultaneous measurements performed (...)
    3 citations
  37. Contextuality-by-Default Description of Bell Tests: Contextuality as the Rule and Not as an Exception.Marian Kupczynski - 2021 - Entropy 2021 (23):1104-1120.
    Contextuality and entanglement are valuable resources for quantum computing and quantum information. Bell inequalities are used to certify entanglement; thus, it is important to understand why and how they are violated. Quantum mechanics and behavioural sciences teach us that random variables ‘measuring’ the same content (the answer to the same Yes or No question) may vary, if ‘measured’ jointly with other random variables. Alice’s and Bob’s raw data confirm Einsteinian non-signaling, but setting dependent experimental protocols are used to create samples (...)
    2 citations
  38. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  39. Entropy and the Direction of Time.Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open (...)
    1 citation
  40. (1 other version)How Entropy Explains the Emergence of Consciousness: The Entropic Theory.Peter C. Lugten - 2024 - Journal of Neurobehavioral Sciences 11 (1):10-18.
    Background: Emergentism as an ontology of consciousness leaves unanswered the question as to its mechanism. Aim: I aim to solve the Body-Mind problem by explaining how conscious organisms emerged on an evolutionary basis at various times in accordance with an accepted scientific principle, through a mechanism that cannot be understood, in principle. Proposal: The reason for this cloak of secrecy is found in a seeming contradiction in the behaviour of information with respect to the first two laws of thermodynamics. Information, (...)
  41. The entropy theory of counterfactuals.Douglas N. Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    29 citations
  42. Entropy : A concept that is not a physical quantity.Shufeng Zhang - 2012 - Physics Essays 25 (2):172-176.
    This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such (...)
  43. Measurement scales and welfarist social choice.Michael Morreau & John A. Weymark - 2016 - Journal of Mathematical Psychology 75:127-136.
    The social welfare functional approach to social choice theory fails to distinguish a genuine change in individual well-beings from a merely representational change due to the use of different measurement scales. A generalization of the concept of a social welfare functional is introduced that explicitly takes account of the scales that are used to measure well-beings so as to distinguish between these two kinds of changes. This generalization of the standard theoretical framework results in a more satisfactory formulation of (...)
    Download  
     
    Export citation  
     
    Bookmark   8 citations  
  44. AI training data, model success likelihood, and informational entropy-based value.Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - manuscript
    Since the release of OpenAI's ChatGPT, the world has entered a race to develop more capable and powerful AI, including artificial general intelligence (AGI). The development is constrained by the dependency of AI on the model, quality, and quantity of training data, making the AI training process highly costly in terms of resources and environmental consequences. Thus, improving the effectiveness and efficiency of the AI training process is essential, especially when the Earth is approaching the climate tipping points and planetary (...)
    Download  
     
    Export citation  
     
    Bookmark  
  45. Measuring effectiveness.Jacob Stegenga - 2015 - Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 54:62-71.
    Measuring the effectiveness of medical interventions faces three epistemological challenges: the choice of good measuring instruments, the use of appropriate analytic measures, and the use of a reliable method of extrapolating measures from an experimental context to a more general context. In practice each of these challenges contributes to overestimating the effectiveness of medical interventions. These challenges suggest the need for corrective normative principles. The instruments employed in clinical research should measure patient-relevant and disease-specific parameters, and should not be (...)
    Download  
     
    Export citation  
     
    Bookmark   27 citations  
  46. Entropy as Root Metaphor.Eric Zencey - 1986 - Dissertation, The Claremont Graduate University
    Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. The second law of thermodynamics--the (...)
    Download  
     
    Export citation  
     
    Bookmark   2 citations  
  47. Does von Neumann Entropy Correspond to Thermodynamic Entropy?Eugene Y. S. Chua - 2021 - Philosophy of Science 88 (1):145-168.
    Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, (...)
    Download  
     
    Export citation  
     
    Bookmark   3 citations  
  48. Measurement in biology is methodized by theory.Maël Montévil - 2019 - Biology and Philosophy 34 (3):35.
    We characterize access to empirical objects in biology from a theoretical perspective. Unlike objects in current physical theories, biological objects are the result of a history and their variations continue to generate a history. This property is the starting point of our concept of measurement. We argue that biological measurement is relative to a natural history which is shared by the different objects subjected to the measurement and is more or less constrained by biologists. We call symmetrization the theoretical and (...)
    Download  
     
    Export citation  
     
    Bookmark   6 citations  
  49. Measuring knowledge management maturity at HEI to enhance performance-an empirical study at Al-Azhar University in Palestine.Samy S. Abu Naser, Mazen J. Al Shobaki & Youssef M. Abu Amuna - 2016 - International Journal of Commerce and Management Research 2 (5):55-62.
    This paper aims to assess knowledge management maturity at HEI in order to determine the variables that most affect knowledge management and enhance the total performance of the organization. The study was applied at Al-Azhar University in the Gaza Strip, Palestine. It depends on the Asian Productivity Organization model, which is used to assess KM maturity. A second dimension, assessing high performance, was developed by the authors. The controlled sample was (364). Several statistical tools were used for data analysis and hypotheses testing, including reliability Correlation (...)
    Download  
     
    Export citation  
     
    Bookmark   14 citations  
  50. Similarity Measure of Refined Single-Valued Neutrosophic Sets and Its Multicriteria Decision Making Method.Jun Ye & Florentin Smarandache - 2016 - Neutrosophic Sets and Systems 12:41-44.
    This paper introduces a refined single-valued neutrosophic set (RSVNS) and presents a similarity measure of RSVNSs. Then a multicriteria decision-making method with RSVNS information is developed based on the similarity measure of RSVNSs. By the similarity measure between each alternative and the ideal solution (ideal alternative), all the alternatives can be ranked and the best one selected. Finally, an actual example on the selection of construction projects demonstrates the application and effectiveness of (...)
    Download  
     
    Export citation  
     
    Bookmark   2 citations  
1 — 50 / 953