Results for 'entropy'

189 found
  1. How Entropy Explains the Emergence of Consciousness: The Entropic Theory. Peter C. Lugten - 2024 - Journal of Neurobehavioral Sciences 11 (1):10-18.
    Background: Emergentism as an ontology of consciousness leaves unanswered the question as to its mechanism. Aim: I aim to solve the Body-Mind problem by explaining how conscious organisms emerged on an evolutionary basis at various times in accordance with an accepted scientific principle, through a mechanism that cannot be understood, in principle. Proposal: The reason for this cloak of secrecy is found in a seeming contradiction in the behaviour of information with respect to the first two laws of thermodynamics. Information, (...)
  2. Entropy - A Guide for the Perplexed. Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  3. Logical Entropy: Introduction to Classical and Quantum Logical Information theory. David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
  4. Entropy and the Direction of Time. Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open (...)
    1 citation
  5. The entropy theory of counterfactuals. Douglas N. Kutach - 2002 - Philosophy of Science 69 (1):82-104.
    I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the (...)
    27 citations
  6. Entropy of Polysemantic Words for the Same Part of Speech. Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic words, that is, words with multiple meanings for the same part of speech, is analyzed under the name of neutrosophic words. These words represent the most difficult cases for disambiguation algorithms, as they are the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework built from concepts of neutrosophic theory and an entropy measure, in which we incorporate sense-related data. We show (...)
  7. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment. Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    The cross entropy measure is one of the best ways to calculate the divergence of a variable from a prior variable. We define a new cross entropy measure under an interval neutrosophic set environment.
    4 citations
  8. Does von Neumann Entropy Correspond to Thermodynamic Entropy? Eugene Y. S. Chua - 2021 - Philosophy of Science 88 (1):145-168.
    Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, (...)
    3 citations
  9. Entropy: A concept that is not a physical quantity. Shufeng Zhang - 2012 - Physics Essays 25 (2):172-176.
    This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such (...)
  10. An introduction to logical entropy and its relation to Shannon entropy. David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean (...)
    5 citations
  11. Entropy as Root Metaphor. Eric Zencey - 1986 - Dissertation, The Claremont Graduate University
    Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing a coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. The second law of thermodynamics--the (...)
    2 citations
  12. Quantum Mechanics as the Solution to a Maximization Problem on the Entropy of All Quantum Measurements. Harvey-Tremblay Alexandre - manuscript
    This work presents a novel formulation of quantum mechanics as the solution to an entropy maximization problem constrained by empirical measurement outcomes. By treating the complete set of possible measurement outcomes as an optimization constraint, our entropy maximization problem derives the axioms of quantum mechanics as theorems, demonstrating that the theory's mathematical structure is the least biased probability measure consistent with the observed data. This approach reduces the foundation of quantum mechanics to a single axiom, the measurement constraint, (...)
    1 citation
  13. Einstein, Entropy, and Anomalies. Daniel Sirtes & Eric Oberheim - 2006 - AIP Conference Proceedings 861:1147-1154.
    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend (...)
    7 citations
  14. Degeneration and Entropy. Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  15. Change in Entropy as a Function of McTaggart's A-series and B-series. Paul Merriam - manuscript
    This note is an initial foray into the issue of the change in entropy with respect to both McTaggart’s A-series and his B-series. We find a possible solution to the Past Hypothesis problem.
  16. How bad is the postulation of a low entropy initial state of the universe? Aldo Filomeno - 2023 - Aphex 27:141-158.
    I summarize, in this informal interview, the main approaches to the ‘Past Hypothesis’, the postulation of a low-entropy initial state of the universe. I’ve chosen this as an open problem in the philosophical foundations of physics. I hope that this brief overview helps readers in gaining perspective and in appreciating the diverse range of approaches in this fascinating unresolved debate.
  17. On walk entropies in graphs. Response to Dehmer and Mowshowitz. Ernesto Estrada, José A. de la Peña & Naomichi Hatano - 2016 - Complexity 21 (S1):15-18.
    We provide here irrefutable facts that prove the falsehood of the claims published in [1] by Dehmer and Mowshowitz (DM) against our paper published in [2]. We first prove that Dehmer’s definition of node probability [3] is flawed. In addition, we show that it was not Dehmer in [3] who proposed this definition for the first time. We continue by proving how the use of Dehmer’s definition does not reveal all the physico-mathematical richness of the walk entropy of graphs. (...)
  18. Norwich’s Entropy Theory: how not to go from abstract to actual. Lance Nizami - 2011 - Kybernetes 40:1102-1118.
    Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver”, (...)
    4 citations
  19. “A Thousand Words”: How Shannon Entropy perspective provides link among exponential data growth, average temperature of the Earth, declining Earth magnetic field, and global consciousness. Victor Christianto & Florentin Smarandache - manuscript
    The sunspot data seem to indicate that the Sun is likely to enter a Maunder Minimum, which would mean that low solar activity may cause low temperatures on Earth. If this happens, it will cause a phenomenon which some climatology experts call “The Little Ice Age” for the next 20-30 years, starting within the next few years. Therefore, the Earth's climate in the coming years will tend to be cooler than before. This phenomenon then causes us to (...)
  20. A Decision-Making Approach Incorporating TODIM Method and Sine Entropy in q-Rung Picture Fuzzy Set Setting. Büşra Aydoğan, Murat Olgun, Florentin Smarandache & Mehmet Ünver - 2024 - Journal of Applied Mathematics 2024.
    In this study, we propose a new approach based on fuzzy TODIM (Portuguese acronym for interactive and multicriteria decision-making) for decision-making problems in uncertain environments. Our method incorporates group utility and individual regret, which are often ignored in traditional multicriteria decision-making (MCDM) methods. To enhance the analysis and application of fuzzy sets in decision-making processes, we introduce novel entropy and distance measures for q-rung picture fuzzy sets. These measures include an entropy measure based on the sine function and (...)
  21. The Concept of Entropy in Statistical Mechanics and Stochastic Music Theory. Ivano Zanzarella - manuscript
    Having originally appeared in the field of thermodynamics, the concept of entropy, especially in its statistical acceptation, has found applications in many different disciplines, both inside and outside science. In this work we focus on the possibility of drawing an isomorphism between the entropy of Boltzmann’s statistical mechanics and that of Xenakis’s stochastic music theory. We expose the major technical aspects of the two entropies and then consider affinities and differences between them, both at the syntactic and at the semantic level, (...)
  22. Ampliative Inference Under Varied Entropy Levels. Paul D. Thorn & Gerhard Schurz - 2013 - In Christoph Beierle & Gabriele Kern-Isberner (eds.), Proceedings of the 4th Workshop on Dynamics of Knowledge and Belief (DKB-2013). Fakultät für Mathematik und Informatik, FernUniversität in Hagen. pp. 77-88.
  23. Information and meaning (Entropy 2003). Christophe Menant - 2003 - Entropy 5:193-204.
    We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint. We build up definitions and properties for meaningful information, a meaning generator system and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
    17 citations
  24. Bertrand's Paradox and the Maximum Entropy Principle. Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the chords, that (...)
    2 citations
  25. There’s Plenty of Boole at the Bottom: A Reversible CA Against Information Entropy. Francesco Berto, Jacopo Tagliabue & Gabriele Rossi - 2016 - Minds and Machines 26 (4):341-357.
    “There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final (...)
    1 citation
  26. Qualitative probabilistic inference under varied entropy levels. Paul D. Thorn & Gerhard Schurz - 2016 - Journal of Applied Logic 19 (2):87-101.
    In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield (...)
    4 citations
  27. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness. Julio Michael Stern - 2011 - Information 2 (2):277-301.
    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
    2 citations
  28. On Classical and Quantum Logical Entropy. David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  29. The Inert vs. the Living State of Matter: Extended Criticality, Time Geometry, Anti-Entropy - An Overview. Giuseppe Longo & Maël Montévil - 2012 - Frontiers in Physiology 3:39.
    The physical singularity of life phenomena is analyzed by means of comparison with the driving concepts of theories of the inert. We outline conceptual analogies, transferals of methodologies and theoretical instruments between physics and biology, in addition to indicating significant differences and sometimes logical dualities. In order to make biological phenomenalities intelligible, we introduce theoretical extensions to certain physical theories. In this synthetic paper, we summarize and propose a unified conceptual framework for the main conclusions drawn from work spanning a (...)
    4 citations
  30. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the (...)
  31. The temporal foundation of the principle of maximal entropy. Vasil Penchev - 2020 - Logic and Philosophy of Mathematics eJournal 12 (11):1-3.
    The principle of maximal entropy (further abbreviated as “MaxEnt”) can be founded on the formal mechanism in which the future transforms into the past by the mediation of the present. This allows MaxEnt to be investigated by the theory of quantum information. MaxEnt can be considered as an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is for the (...)
  32. There Is No Puzzle about the Low Entropy Past. Craig Callender - 2004 - In Christopher Hitchcock (ed.), Contemporary Debates in Philosophy of Science. Blackwell. pp. 240-255.
    Suppose that God or a demon informs you of the following future fact: despite recent cosmological evidence, the universe is indeed closed and it will have a ‘final’ instant of time; moreover, at that final moment, all 49 of the world’s Imperial Faberge eggs will be in your bedroom bureau’s sock drawer. You’re absolutely certain that this information is true. All of your other dealings with supernatural powers have demonstrated that they are a trustworthy lot.
    37 citations
  33. Can Interventionists Be Neo-Russellians? Interventionism, the Open Systems Argument, and the Arrow of Entropy. Alexander Reutlinger - 2013 - International Studies in the Philosophy of Science 27 (3):273-293.
    4 citations
  34. Sensory Systems as Cybernetic Systems that Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception. Lance Nizami - 2009 - Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics. San Antonio, TX.
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] -present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation is (...)
    3 citations
  35. Matter Cycles, Energy Flows, Entropy Ensures Complexity Grows. [REVIEW] Blaine Snow - manuscript
    A review of Eric Schneider and Dorion Sagan's 2006 book, "Into the Cool: Energy Flows, Thermodynamics, and Life."
  36. Is uncertainty reduction the basis for perception? Errors in Norwich’s Entropy Theory of Perception imply otherwise. Lance Nizami - 2010 - Proceedings of the World Congress on Engineering and Computer Science 2010 (Lecture Notes in Engineering and Computer Science) 2.
    This paper reveals errors within Norwich et al.’s Entropy Theory of Perception, errors that have broad implications for our understanding of perception. What Norwich and coauthors dubbed their “informational theory of neural coding” is based on cybernetics, that is, control and communication in man and machine. The Entropy Theory uses information theory to interpret human performance in absolute judgments. There, the continuum of the intensity of a sensory stimulus is cut into categories and the subject is shown exemplar (...)
  37. Ethical implications of onto-epistemological pluralism in relation to entropy. Mónica Gómez - 2019 - Scientia in Verba Magazine 3 (2):200-211.
    From the epistemological stance that we present in this work, we defend the following theses: (1) that as subjects we constitute the world we live in through one of the possible conceptual frameworks; and (2) that our cognitive and social practices construct the world in a certain manner, which makes us responsible for the way this world is constituted.
  38. From Quantum Entanglement to Spatiotemporal Distance. Alyssa Ney - 2021 - In Christian Wüthrich, Baptiste Le Bihan & Nick Huggett (eds.), Philosophy Beyond Spacetime. Oxford: Oxford University Press.
    Within the field of quantum gravity, there is an influential research program developing the connection between quantum entanglement and spatiotemporal distance. Quantum information theory gives us highly refined tools for quantifying quantum entanglement such as the entanglement entropy. Through a series of well-confirmed results, it has been shown how these facts about the entanglement entropy of component systems may be connected to facts about spatiotemporal distance. Physicists are seeing these results as yielding promising methods for better understanding the (...)
    3 citations
  39. Time's arrow and self-locating probability. Eddy Keming Chen - 2021 - Philosophy and Phenomenological Research 105 (3):533-563.
    One of the most difficult problems in the foundations of physics is what gives rise to the arrow of time. Since the fundamental dynamical laws of physics are (essentially) symmetric in time, the explanation for time's arrow must come from elsewhere. A promising explanation introduces a special cosmological initial condition, now called the Past Hypothesis: the universe started in a low-entropy state. Unfortunately, in a universe where there are many copies of us (in the distant ''past'' or the distant (...)
  40. Why did life emerge? Arto Annila & E. Annila - 2008 - International Journal of Astrobiology 7 (3-4):293–300.
    Many mechanisms, functions and structures of life have been unraveled. However, the fundamental driving force that propelled chemical evolution and led to life has remained obscure. The second law of thermodynamics, written as an equation of motion, reveals that elemental abiotic matter evolves from the equilibrium via chemical reactions that couple to external energy towards complex biotic non-equilibrium systems. Each time a new mechanism of energy transduction emerges, e.g., by random variation in syntheses, evolution prompts by punctuation and settles to (...)
    5 citations
  41. Statistical mechanics and thermodynamics: A Maxwellian view. Wayne C. Myrvold - 2011 - Studies in History and Philosophy of Science Part A 42 (4):237-243.
    One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic (...)
    16 citations
  42. Artificial Evil and the Foundation of Computer Ethics. Luciano Floridi & J. W. Sanders - 2001 - Springer Netherlands. Edited by Luciano Floridi & J. W. Sanders.
    Moral reasoning traditionally distinguishes two types of evil: moral (ME) and natural (NE). The standard view is that ME is the product of human agency and so includes phenomena such as war, torture and psychological cruelty; that NE is the product of nonhuman agency, and so includes natural disasters such as earthquakes, floods, disease and famine; and finally, that more complex cases are appropriately analysed as a combination of ME and NE. Recently, as a result of developments in autonomous agents in cyberspace, (...)
    29 citations
  43. Natural process – Natural selection. Arto Annila - 2007 - Biophysical Chemistry 127: 123–128.
    Life is supported by a myriad of chemical reactions. To describe the overall process we have formulated entropy for an open system undergoing chemical reactions. The entropy formula allows us to recognize various ways for the system to move towards more probable states. These correspond to the basic processes of life i.e. proliferation, differentiation, expansion, energy intake, adaptation and maturation. We propose that the rate of entropy production by various mechanisms is the fitness criterion of natural selection. (...)
    5 citations
  44. A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics. David Ellerman - 2024 - Entropy Special Issue: Information-Theoretic Concepts in Physics 26 (2).
    The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) is showing that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. Or, (...)
  45. The nature of correlation perception in scatterplots. Ronald A. Rensink - 2017 - Psychonomic Bulletin & Review 24 (3):776-797.
    For scatterplots with gaussian distributions of dots, the perception of Pearson correlation r can be described by two simple laws: a linear one for discrimination, and a logarithmic one for perceived magnitude (Rensink & Baldridge, 2010). The underlying perceptual mechanisms, however, remain poorly understood. To cast light on these, four different distributions of datapoints were examined. The first had 100 points with equal variance in both dimensions. Consistent with earlier results, just noticeable difference (JND) was a linear function of the (...)
    1 citation
  46. Squaring the Circle: In Quest for Sustainability. Gennady Shkliarevsky - 2015 - Systems Research and Behavioral Science 32 (6):629-49.
    Development has been the main strategy in addressing the problem of sustainability since at least the mid-1980s. The results of this strategy have been mixed, if not disappointing. In their objections to this approach, critics frequently invoke constraints imposed by physical reality of which the most important one is entropy production. They question the belief that technological innovations are capable of solving the problem of sustainability. Is development the right response to this problem and is the current course capable of attaining (...)
    3 citations
  47. Essays on the Metaphysics of Quantum Mechanics. Eddy Keming Chen - 2019 - Dissertation, Rutgers University, New Brunswick
    What is the proper metaphysics of quantum mechanics? In this dissertation, I approach the question from three different but related angles. First, I suggest that the quantum state can be understood intrinsically as relations holding among regions in ordinary space-time, from which we can recover the wave function uniquely up to an equivalence class (by representation and uniqueness theorems). The intrinsic account eliminates certain conventional elements (e.g. overall phase) in the representation of the quantum state. It also dispenses with first-order (...)
    1 citation
  48. In the light of time. Arto Annila - 2009 - Proceedings of Royal Society A 465:1173–1198.
    The concept of time is examined using the second law of thermodynamics that was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, (...)
    3 citations
  49. Theory of Cooperative-Competitive Intelligence: Principles, Research Directions, and Applications. Robert Hristovski & Natàlia Balagué - 2020 - Frontiers in Psychology 11.
    We present a theory of cooperative-competitive intelligence (CCI), its measures, research program, and applications that stem from it. Within the framework of this theory, satisficing sub-optimal behavior is any behavior that does not promote a decrease in the prospective control of the functional action diversity/unpredictability (D/U) potential of the agent or team. This potential is defined as the entropy measure in multiple, context-dependent dimensions. We define the satisficing interval of behaviors as CCI. In order to manifest itself at individual (...)
    1 citation
  50. Towards an AB-series interpretation of time in physics. Paul Merriam - manuscript
    How can McTaggart's A-series notion of time be incorporated into physics while retaining the B-series notion? It may be that the A-series 'now' can be construed as ontologically private. How is that modeled? Could a definition of a combined AB-series entropy help with the Past Hypothesis problem? What if the increase in entropy as a system goes from earlier times to later times is canceled by the decrease in entropy as a system goes from future, to present, to (...)
1 — 50 / 189