Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the theory has some success in evaluating a wide variety of ordinary counterfactuals, it fails as an explanation of counterfactual asymmetry.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
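The two measures contrasted in this abstract have simple closed forms: logical entropy is h(p) = 1 - Σ pᵢ², the probability that two independent draws yield a distinction, while Shannon entropy is H(p) = -Σ pᵢ log₂ pᵢ. A minimal sketch in Python (the example partition and its block probabilities are invented for illustration):

```python
import math

def logical_entropy(p):
    # Logical entropy h(p) = 1 - sum(p_i^2): the "two-draw" probability
    # that two independent draws fall in different blocks (a distinction).
    return 1 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    # Shannon entropy: average number of binary partitions (bits)
    # needed to make all the distinctions of the partition.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A partition of an 8-element set into blocks of sizes 4, 2, 2
# (equiprobable points), so block probabilities are 1/2, 1/4, 1/4.
p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(shannon_entropy(p))   # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
```

Both compound notions (conditional entropy, mutual information) then follow from these base measures as the abstract describes.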
This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm-driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend stressed the importance of pluralism for scientific progress. He rejected Kuhn's model, arguing that it fails to recognize the role that alternative theories can play in identifying exactly which phenomena are anomalous in the first place. On Feyerabend's account, Einstein's predictions allow for a crucial experiment between two incommensurable theories, and are an example of an anomaly that could refute the reigning paradigm only after the development of a competitor. Using Kuhn's specification of a disciplinary matrix to illustrate the incommensurability between the two paradigms, we examine the different research strategies available in this peculiar case. On the basis of our reconstruction, we conclude by rebutting some critics of Feyerabend's argument.
Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing a coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. The second law of thermodynamics--the law of entropy--holds that in any spontaneous process, usable energy becomes unusable energy. It also suggests that improbable order must succumb, through time, to more probable chaos. The law of entropy has enjoyed a popularity as metaphor unusual for such physics esoterica. In the works of Brooks Adams, Henry Adams, Nicholas Georgescu-Roegen, and Thomas Pynchon, the idea of entropy appears as the fundamental, organizing idea for an economic interpretation of history, a philosophy of history, an ecologically enlightened economic theory, and an encyclopedic novel that apotheosizes modern culture. Analysis of how the entropy metaphor is manifest in the works of these thinkers allows us to judge the strengths and weaknesses of entropy as root metaphor. Analysis of its contemporary popularity affords insight into the politics of the day. Ultimately, the entropy root metaphor serves as the foundation of a refurbished "generating substance" world hypothesis, but the root metaphor itself remains equivocal on the important issue of centralized versus decentralized political organization.
This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W₁, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such definitions would be absurd. The fundamental error of entropy is that in any reversible process, the polytropic process function Q is not a single-valued function of T, and the key step from Σ[(ΔQ)/T] to ∫dQ/T does not hold; the P-V diagram should be a P-V-T diagram in thermodynamics. Similarly, ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 do not hold, either. Since the absolute entropy of Boltzmann is used to explain Clausius entropy and the unit (J/K) of the former is transformed from the latter, the non-existence of Clausius entropy simultaneously denies Boltzmann entropy.
The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open future. To this end, the approaches of Boltzmann, Reichenbach (and his followers), and Albert are analysed. It is argued that we should look for alternative approaches instead: we should either consider a temporally asymmetrical physical theory or seek a source of the asymmetry of time in metaphysics. This second approach may even turn out to be complementary, if only we accept that metaphysics can complement scientific research programmes.
In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield probability distributions with moderately high entropy levels. In the present article, we present data charting the performance of the four systems when reasoning in environments of various entropy levels. The results illustrate variations in the performance of the respective reasoning systems that derive from the entropy of the environment, and allow for a more inclusive assessment of the reliability and robustness of the four systems.
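The kind of sampling described in this abstract, drawing an unknown probability distribution at random and measuring its entropy, can be sketched as follows. This is an illustrative reconstruction, not the authors' actual simulation code; the simplex-sampling method and all parameters are assumptions:

```python
import math
import random

def random_distribution(n, rng):
    # Normalizing i.i.d. unit-exponential variates yields a point drawn
    # uniformly from the n-outcome probability simplex.
    xs = [rng.expovariate(1.0) for _ in range(n)]
    total = sum(xs)
    return [x / total for x in xs]

def shannon_entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

rng = random.Random(0)
n = 8
hs = [shannon_entropy(random_distribution(n, rng)) for _ in range(1000)]
mean_h = sum(hs) / len(hs)
# Uniform sampling from the simplex tends to give moderately high
# entropies, well above half the maximum of log2(8) = 3 bits.
print(round(mean_h, 2))
```

To chart performance across entropy regimes, as the article describes, the sampler would instead be biased toward low- or high-entropy regions of the simplex.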
The physical singularity of life phenomena is analyzed by means of comparison with the driving concepts of theories of the inert. We outline conceptual analogies, transferals of methodologies and theoretical instruments between physics and biology, in addition to indicating significant differences and sometimes logical dualities. In order to make biological phenomenalities intelligible, we introduce theoretical extensions to certain physical theories. In this synthetic paper, we summarize and propose a unified conceptual framework for the main conclusions drawn from work spanning a book and several articles, quoted throughout.
Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver” that passes “outcomes” (received events) to a “destination”. Findings – In the entropy theory, “events” were sometimes interactions with the stimulus, but could be microscopic stimulus conditions. “Outcomes” often went unnamed; sometimes, the stimulus, or the interaction with it, or the resulting sensation, were “outcomes”. A “source” was often implied to be a “transmitter”, which frequently was a primary afferent neuron; elsewhere, the stimulus was the “transmitter” and perhaps also the “source”. “Channel” was rarely named; once, it was the whole eye; once, the incident photons; elsewhere, the primary or secondary afferent. “Receiver” was usually the sensory receptor, but could be an afferent. “Destination” went unmentioned. In sum, the entropy theory’s idea of Shannon’s “general communication system” was entirely ambiguous. Research limitations/implications – The ambiguities indicate that, contrary to claim, the entropy theory cannot be an “information theoretical description of the process of perception”. Originality/value – Scrutiny of the entropy theory’s use of information theory was overdue and reveals incompatibilities that force a reconsideration of information theory’s possible role in perception models. A second-order-cybernetics approach is suggested.
We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system subjected to a constraint. We build up definitions and properties for meaningful information, a meaning generator system, and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
This careful note is an initial foray into the issue of the change in entropy with respect to both McTaggart’s A-series and his B-series. We find a possible solution to the Past Hypothesis problem.
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite particles case, and the infinite particles case.
Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1] to present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation is regarded as a measure of the entropy or uncertainty of the stimulus signal” [2]. “To be uncertain about the outcome of an event, one must first be aware of a set of alternative outcomes” [3]. “The entropy-establishing process begins with the generation of a [internal] sensory signal by the stimulus generator. This is followed by receipt of the [external] stimulus by the sensory receptor, transmission of action potentials by the sensory neurons, and finally recapture of the [response to the internal] signal by the generator” [4]. The latter “recapture” differentiates external from internal stimuli. The hypothetical “stimulus generators” are internal emitters, which generate photons in vision, audible sounds in audition (to Norwich, the spontaneous otoacoustic emissions [SOAEs]), “temperatures in excess of local skin temperature” in skin temperature sensation [4], etc. Method (1): Several decades of empirical sensory physiology literature were scrutinized for internal “stimulus generators”. Results (1): Spontaneous photopigment isomerization (“dark light”) does not involve visible light. SOAEs are electromechanical basilar-membrane artefacts that rarely produce audible tones. The skin’s temperature sensors do not raise skin temperature, etc. Method (2): The putative action of the brain-and-sensory-receptor loop was carefully reexamined.
Results (2): The sensory receptor allegedly “perceives”, experiences “awareness”, possesses “memory”, and has a “mind”. But those traits describe the whole human. The receptor, thus anthropomorphized, must therefore contain its own perceptual loop, containing a receptor, containing a perceptual loop, etc. Summary & Conclusions: The Entropy Theory demands sensory awareness of alternatives, through an imagined brain-and-sensory-receptor loop containing internal “stimulus generators”. But (1) no internal “stimulus generators” seem to exist and (2) the loop would be the outermost of an infinite nesting of identical loops.
This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
The principle of maximum entropy (hereafter “MaxEnt”) can be founded on the formal mechanism in which future transforms into past by the mediation of present. This allows MaxEnt to be investigated by the theory of quantum information. MaxEnt can be considered an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice, and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is that the relevant data known so far are postulated as a sufficient foundation for the conclusion. That axiom is the kind of choice grounding both principles. Popper’s falsifiability (1935) can be discussed as a complement to them: the axiom (or axiom scheme) is always a sufficient but never a necessary condition of the conclusion, thereby postulating the choice at the base of MaxEnt. Furthermore, the abstraction axiom (or axiom scheme) relevant to set theory (e.g. the axiom scheme of specification in ZFC) involves choice analogously.
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets, so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
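The main result stated in this abstract, that the entropy increase from a projective measurement equals the sum of the absolute squares of the decohered off-diagonal entries, can be checked by hand for a single qubit. A minimal sketch using plain 2x2 arithmetic (the amplitudes are invented for illustration):

```python
# Quantum logical entropy h(rho) = 1 - tr(rho^2), for a single qubit
# pure state a|0> + b|1> measured in the computational basis.
a, b = 0.6, 0.8          # real amplitudes, a^2 + b^2 = 1

# Pure-state density matrix rho = |psi><psi|
rho = [[a * a, a * b],
       [a * b, b * b]]

def trace_of_square(m):
    # tr(m^2) for a 2x2 matrix.
    return sum(m[i][j] * m[j][i] for i in range(2) for j in range(2))

def logical_entropy(m):
    return 1 - trace_of_square(m)

# Projective measurement in the computational basis zeroes the
# off-diagonal entries ("coherences" are "decohered").
rho_post = [[rho[0][0], 0.0],
            [0.0, rho[1][1]]]

increase = logical_entropy(rho_post) - logical_entropy(rho)
off_diagonal_sum = rho[0][1] ** 2 + rho[1][0] ** 2

print(increase)          # 2 * a^2 * b^2 = 0.4608
print(off_diagonal_sum)  # the same value: the decohered coherences
```

The post-measurement value is also the "two-draw" probability from the abstract: the chance that two independent measurements of the same state give different results.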
The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of celestial mechanics and through some of the most exciting developments of mathematical physics of the 19th century.
“There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s seminal 1959 lecture at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final states. On the other hand, as von Neumann already conjectured, irreversible information processing is expensive: to erase a single bit of information costs ~3 × 10⁻²¹ joules at room temperature. Information entropy is a thermodynamic cost, to be paid in non-computational energy dissipation. This paper addresses the problem drawing on Edward Fredkin’s Finite Nature hypothesis: the ultimate nature of the universe is discrete and finite, satisfying the axioms of classical, atomistic mereology. The chosen model is a cellular automaton with reversible dynamics, capable of retaining memory of the information present at the beginning of the universe. Such a CA can implement the Boolean logical operations and the other building bricks of computation: it can develop and host all-purpose computers. The model is a candidate for the realization of computational systems, capable of exploiting the resources of the physical world in an efficient way, for they can host logical circuits with negligible internal energy dissipation.
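The erasure cost quoted in this abstract is the Landauer bound k_B · T · ln 2, which can be checked directly. A minimal sketch, assuming "room temperature" means 300 K:

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2.
K_B = 1.380649e-23       # Boltzmann's constant, exact SI value, J/K
T_ROOM = 300.0           # "room temperature" taken as 300 K (assumption)

def landauer_cost(bits, temperature=T_ROOM):
    # Minimum energy in joules to erase the given number of bits.
    return bits * K_B * temperature * math.log(2)

cost = landauer_cost(1)
print(f"{cost:.2e} J")   # ~2.87e-21 J, i.e. the ~3e-21 J figure quoted
```

A reversible CA avoids this cost in principle because no information is ever erased: each global state has exactly one predecessor.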
This paper reveals errors within Norwich et al.’s Entropy Theory of Perception, errors that have broad implications for our understanding of perception. What Norwich and coauthors dubbed their “informational theory of neural coding” is based on cybernetics, that is, control and communication in man and machine. The Entropy Theory uses information theory to interpret human performance in absolute judgments. There, the continuum of the intensity of a sensory stimulus is cut into categories and the subject is shown exemplar stimuli of each category. The subject must then identify individual exemplars by category. The identifications are recorded in the Garner-Hake version of the Shannon “confusion matrix”. The matrix yields “H”, the entropy (degree of uncertainty) about what stimulus was presented. Hypothetically, uncertainty drops as a stimulus lengthens, i.e. a plot of H vs. stimulus duration should fall monotonically. Such “adaptation” is known for both sensation and firing rate. Hence, because “the physiological adaptation curve has the same general shape as the psychophysical adaptation curve”, Norwich et al. assumed that both have the same time course; sensation and firing rate were thus both declared proportional to H. However, a closer look reveals insurmountable contradictions. First, the peripheral neuron hypothetically cannot fire in response to a stimulus of a given intensity until after somehow computing H from its responses to stimuli of various intensities. Thus no sensation occurs until firing rate adapts, i.e. attains its spontaneous rate. But hypothetically, once adaptation is complete, certainty is reached and perception ends. Altogether, then, perception cannot occur until perception is over. Secondly, sensations, firing rates, and H’s are empirically not synchronous, contrary to assumption.
In sum, the core concept of the cybernetics-based Entropy Theory of Perception, that is, that uncertainty reduction is the basis for perception, is irrational.
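The quantity H at the heart of the theory under review comes from a Garner-Hake-style information-transmission analysis of a confusion matrix: T = H(stimulus) + H(response) - H(joint). A minimal sketch of that computation, with an invented 3-category matrix (not data from the papers under review):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def transmitted_information(matrix):
    # Garner-Hake style estimate from a stimulus-by-response count
    # matrix: T = H(stimulus) + H(response) - H(joint).
    total = sum(sum(row) for row in matrix)
    joint = [c / total for row in matrix for c in row]
    row_p = [sum(row) / total for row in matrix]
    col_p = [sum(col) / total for col in zip(*matrix)]
    return entropy(row_p) + entropy(col_p) - entropy(joint)

# Invented 3-category absolute-judgment data: rows are presented
# stimuli, columns are the categories the subject reported.
confusions = [[8, 2, 0],
              [2, 6, 2],
              [0, 2, 8]]
print(round(transmitted_information(confusions), 3))  # bits transmitted
```

A perfectly diagonal matrix would transmit the full log₂ 3 ≈ 1.585 bits; the off-diagonal confusions above reduce the estimate well below that ceiling.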
From the epistemological position that we present in this work we defend the following theses: that as subjects we constitute the world we live in through one of the possible conceptual frameworks; and that our cognitive and social practices construct the world in a certain manner, which makes us responsible for the way this world is constituted.
Within the field of quantum gravity, there is an influential research program developing the connection between quantum entanglement and spatiotemporal distance. Quantum information theory gives us highly refined tools for quantifying quantum entanglement such as the entanglement entropy. Through a series of well-confirmed results, it has been shown how these facts about the entanglement entropy of component systems may be connected to facts about spatiotemporal distance. Physicists are seeing these results as yielding promising methods for better understanding the emergence of (the dynamical) spacetime (of general relativity) from more fundamental quantum theories, and moreover, as promising for the development of a nonperturbative theory of quantum gravity. However, to what extent does the case for the entanglement entropy-distance link provide evidence that spacetime structure is nonfundamental and emergent from nongravitational degrees of freedom? I will show that a closer look at the results lends support only to a weaker conclusion, that the facts about quantum entanglement are constrained by facts about spatiotemporal distance, and not that they are the basis from which facts about spatiotemporal distance emerge.
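The entanglement entropy this abstract relies on has a textbook closed form for a two-qubit pure state: the von Neumann entropy of either reduced density matrix. A minimal sketch (the states and amplitudes are chosen for illustration):

```python
import math

def entanglement_entropy(a, b):
    # For a two-qubit pure state |psi> = a|00> + b|11| with real
    # amplitudes, the reduced density matrix of either qubit is
    # diag(a^2, b^2), so S = -a^2 log2(a^2) - b^2 log2(b^2).
    s = 0.0
    for p in (a * a, b * b):
        if p > 0:
            s -= p * math.log2(p)
    return s

# Maximally entangled Bell state: a = b = 1/sqrt(2) gives exactly 1 bit.
print(entanglement_entropy(1 / math.sqrt(2), 1 / math.sqrt(2)))
# Product state: a = 1, b = 0 gives no entanglement.
print(entanglement_entropy(1.0, 0.0))
```

In the research program the abstract discusses, it is quantities of this kind for component subsystems that get connected to spatiotemporal distances.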
Many mechanisms, functions and structures of life have been unraveled. However, the fundamental driving force that propelled chemical evolution and led to life has remained obscure. The second law of thermodynamics, written as an equation of motion, reveals that elemental abiotic matter evolves from the equilibrium via chemical reactions that couple to external energy towards complex biotic non-equilibrium systems. Each time a new mechanism of energy transduction emerges, e.g., by random variation in syntheses, evolution proceeds by punctuation and settles to a stasis when the accessed free energy has been consumed. The evolutionary course towards an increasingly larger energy transduction system accumulates a diversity of energy transduction mechanisms, i.e. species. The rate of entropy increase is identified as the fitness criterion among the diverse mechanisms, which places the theory of evolution by natural selection on the fundamental thermodynamic principle with no demarcation line between inanimate and animate.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way that they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is of limited scope, limited to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
Moral reasoning traditionally distinguishes two types of evil: moral (ME) and natural (NE). The standard view is that ME is the product of human agency and so includes phenomena such as war, torture and psychological cruelty; that NE is the product of nonhuman agency, and so includes natural disasters such as earthquakes, floods, disease and famine; and finally, that more complex cases are appropriately analysed as a combination of ME and NE. Recently, as a result of developments in autonomous agents in cyberspace, a new class of interesting and important examples of hybrid evil has come to light. In this paper, it is called artificial evil (AE) and a case is made for considering it to complement ME and NE to produce a more adequate taxonomy. By isolating the features that have led to the appearance of AE, cyberspace is characterised as a self-contained environment that forms the essential component in any foundation of the emerging field of Computer Ethics (CE). It is argued that this goes some way towards providing a methodological explanation of why cyberspace is central to so many of CE's concerns; and it is shown how notions of good and evil can be formulated in cyberspace. Of considerable interest is how the propensity for an agent's action to be morally good or evil can be determined even in the absence of biologically sentient participants, and thus allows artificial agents not only to perpetrate evil (and for that matter good) but conversely to `receive' or `suffer from' it. The thesis defended is that the notion of entropy structure, which encapsulates human value judgement concerning cyberspace in a formal mathematical definition, is sufficient to achieve this purpose and, moreover, that the concept of AE can be determined formally, by mathematical methods. A consequence of this approach is that the debate on whether CE should be considered unique, and hence developed as a Macroethics, may be viewed, constructively, in an alternative manner.
The case is made that whilst CE issues are not uncontroversially unique, they are sufficiently novel to render inadequate the approach of standard Macroethics such as Utilitarianism and Deontologism and hence to prompt the search for a robust ethical theory that can deal with them successfully. The name Information Ethics (IE) is proposed for that theory. It is argued that the uniqueness of IE is justified by its being non-biologically biased and patient-oriented: IE is an Environmental Macroethics based on the concept of data entity rather than life. It follows that the novelty of CE issues such as AE can be appreciated properly because IE provides a new perspective (though not vice versa). In light of the discussion provided in this paper, it is concluded that Computer Ethics is worthy of independent study because it requires its own application-specific knowledge and is capable of supporting a methodological foundation, Information Ethics.
Life is supported by a myriad of chemical reactions. To describe the overall process we have formulated entropy for an open system undergoing chemical reactions. The entropy formula allows us to recognize various ways for the system to move towards more probable states. These correspond to the basic processes of life, i.e. proliferation, differentiation, expansion, energy intake, adaptation and maturation. We propose that the rate of entropy production by various mechanisms is the fitness criterion of natural selection. The quest for more probable states results in organization of matter in functional hierarchies.
The concept of time is examined using the second law of thermodynamics that was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, i.e. geodesics, are obtained from the principle of least action for mechanics, electrodynamics and quantum mechanics. The arrow of time, associated with the expansion of the Universe, identifies with the grand dispersal of energy when high-energy densities transform by various mechanisms to lower densities in energy and eventually to ever-diluting electromagnetic radiation. Likewise, time in a quantum system takes an increment forwards in the detection-associated dissipative transformation when the stationary-state system begins to evolve, pictured as the wave function collapse. The energy dispersal is understood to underlie causality, so that an energy gradient is a cause and the resulting energy flow is an effect. The account of causality by the concepts of physics does not imply determinism; on the contrary, evolution of space–time as a causal chain of events is non-deterministic.
Development has been the main strategy in addressing the problem of sustainability since at least the mid-1980s. The results of this strategy have been mixed, if not disappointing. In their objections to this approach, critics frequently invoke constraints imposed by physical reality, of which the most important one is entropy production. They question the belief that technological innovations are capable of solving the problem of sustainability. Is development the right response to this problem, and is the current course capable of attaining sustainability? The article examines closely and critiques the principal theoretical objection to sustainable development that emphasizes physical constraints, and more specifically entropy production. It also offers a critique of the current approach to sustainable development. The article advocates a systems approach as a way to anchor a broad consensus in the ongoing sustainability debates.
One of the most difficult problems in the foundations of physics is what gives rise to the arrow of time. Since the fundamental dynamical laws of physics are (essentially) symmetric in time, the explanation for time’s arrow must come from elsewhere. A promising explanation introduces a special cosmological initial condition, now called the Past Hypothesis: the universe started in a low-entropy state. Unfortunately, in a universe where there are many copies of us (in the distant “past” or the distant “future”), the Past Hypothesis is not enough; we also need to postulate self-locating (de se) probabilities. However, I show that we can similarly use self-locating probabilities to strengthen its rival—the Fluctuation Hypothesis, leading to in-principle empirical underdetermination and radical epistemological skepticism. The underdetermination is robust in the sense that it is not resolved by the usual appeal to ‘empirical coherence’ or ‘simplicity.’ That is a serious problem for the vision of providing a completely scientific explanation of time’s arrow.
The central motivating idea behind the development of this work is the concept of prespace, a hypothetical structure that is postulated by some physicists to underlie the fabric of space or space-time. I consider how such a structure could relate to space and space-time, and the rest of reality as we know it, and the implications of the existence of this structure for quantum theory. Understanding how this structure could relate to space and to the rest of reality requires, I believe, that we consider how space itself relates to reality, and how other so-called "spaces" used in physics relate to reality. In chapter 2, I compare space and space-time to other spaces used in physics, such as configuration space, phase space and Hilbert space. I support what is known as the "property view" of space, opposing both the traditional views of space and space-time, substantivalism and relationism. I argue that all these spaces are property spaces. After examining the relationships of these spaces to causality, I argue that configuration space has, due to its role in quantum mechanics, a special status in the microscopic world similar to the status of position space in the macroscopic world. In chapter 3, prespace itself is considered. One way of approaching this structure is through the comparison of the prespace structure with a computational system, in particular to a cellular automaton, in which space or space-time and all other physical quantities are broken down into discrete units. I suggest that one way open for a prespace metaphysics can be found if physics is made fully discrete in this way. I suggest as a heuristic principle that the physical laws of our world are such that the computational cost of implementing those laws on an arbitrary computational system is minimized, adapting a heuristic principle of this type proposed by Feynman.
In chapter 4, some of the ideas of the previous chapters are applied in an examination of the physics and metaphysics of quantum theory. I first discuss the "measurement problem" of quantum mechanics: this problem and its proposed solution are the primary subjects of chapter 4. It turns out that considering how quantum theory could be made fully discrete leads naturally to a suggestion of how standard linear quantum mechanics could be modified to give rise to a solution to the measurement problem. The computational heuristic principle reinforces the same solution. I call the modified quantum mechanics Critical Complexity Quantum Mechanics (CCQM). I compare CCQM with some of the other proposed solutions to the measurement problem, in particular the spontaneous localization model of Ghirardi, Rimini and Weber. Finally, in chapters 5 and 6, I argue that the measure of complexity of quantum mechanical states I introduce in CCQM also provides a new definition of entropy for quantum mechanics, and suggests a solution to the problem of providing an objective foundation for statistical mechanics, thermodynamics, and the arrow of time.
How can McTaggart's A-series notion of time be incorporated into physics while retaining the B-series notion? It may be that the A-series 'now' can be construed as ontologically private. How is that modeled? Could a definition of a combined AB-series entropy help with the Past Hypothesis problem? What if the increase in entropy as a system goes from earlier times to later times is canceled by the decrease in entropy as a system goes from future, to present, to past?
Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 in order to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens the account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made throughout the years, but none of them were successful. We have shown (in a number of publications) by a general state-space argument that Maxwell’s Demon is compatible with classical mechanics, and that the most recent solutions, based on Landauer’s thesis, are not general. In this paper we demonstrate that Maxwell’s Demon is also compatible with quantum mechanics. We do so by analyzing a particular (but highly idealized) experimental setup and proving that it violates the Second Law. Our discussion is in the framework of standard quantum mechanics; we give two separate arguments in the framework of quantum mechanics with and without the projection postulate. We address in our analysis the connection between measurement and erasure interactions and we show how these notions are applicable in the microscopic quantum mechanical structure. We discuss what might be the quantum mechanical counterpart of the classical notion of “macrostates”, thus explaining why our Quantum Demon setup works not only at the micro level but also at the macro level, properly understood. One implication of our analysis is that the Second Law cannot provide a universal lawlike basis for an account of the arrow of time; this account has to be sought elsewhere.
We refer to the remarkable thought of Erwin Schrödinger, expressed in his book “What is Life?”, regarding the connection between life and a decrease of entropy realized via feeding (eating). This thought is “transferred” into the field of human psychology, explaining hooligan behavior (e.g. the “days of violence”) as a natural human response to improper (in its content or form) “informational feeding” that does not allow one to normally treat (“digest”) the received information, i.e. to make one's thoughts simpler in their logical structure. Delivering information without pedagogical, psychological, and (when children are involved) neurological supervision or assistance is supposed to be the cause of the hooliganism that more and more often takes a dangerous organized form.
In this text, we revisit part of the analysis of anti-entropy in Bailly and Longo (2009) and develop further theoretical reflections. In particular, we analyze how randomness, an essential component of biological variability, is associated with the growth of biological organization, both in ontogenesis and in evolution. This approach, in particular, focuses on the role of global entropy production and provides a tool for a mathematical understanding of some fundamental observations by Gould on the increasing phenotypic complexity along evolution. Lastly, we analyze the situation in terms of theoretical symmetries, in order to further specify the biological meaning of anti-entropy as well as its strong link with randomness.
In a preceding publication a fundamentally oriented and irreversible world was shown to be derivable from the important principle of least action. A consequence of such a paradigm change is avoidance of paradoxes within a “dynamic” quantum physics. This becomes essentially possible because fundamental irreversibility allows consideration of the “entropy” concept in elementary processes. For this reason, and for a compensation of entropy in the spread-out energy of the wave, the duality of particle and wave has to be mediated via an information self-image of matter. In this publication considerations are extended to irreversible thermodynamics, to gravitation and cosmology with its dependence on quantum interpretations. The information self-image of matter around particles could be identified with gravitation. Because information can also impose an always constant light velocity there is no need any more to attribute such a property to empty space, as done in relativity theory. In addition, the possibility is recognized to consider entropy generation by expanding photon fields in the universe. Via a continuous activation of information on matter, photons can generate entropy and release small energy packages without interacting with matter. This facilitates a new interpretation of galactic redshift, emphasizes an information link between quantum and cosmological phenomena, and evidences an information-triggered origin of the universe. Self-organized processes approach maximum entropy production within their constraints. In a far-from-equilibrium world also information, with its energy content, can self-organize to a higher hierarchy of computation. It is here identified with consciousness. This appears to explain the evolution of spirit and intelligence on a materialistic basis.
Also gravitation, here identified as information on matter, could, under special conditions, self-organize to act as a super-gravitation, offering an alternative to dark matter. Time is not an illusion, but has to be understood as a flux of action, which is the ultimate reality of change. The concept of an irreversible physical world opens a route towards a rational understanding of complex contexts in nature.
This paper reviews the complex, overlapping ideas of two prominent Italian philosophers, Lorenzo Magnani and Luciano Floridi, with the aim of facilitating the nonviolent transformation of self and world, and with a focus on information technologies in mediating this process. In Floridi’s information ethics, problems of consistency arise between self-poiesis, anagnorisis, entropy, evil, and the narrative structure of the world. Solutions come from Magnani’s work in distributed morality, moral mediators, moral bubbles and moral disengagement. Finally, two examples of information technology, one ancient and one new, a Socratic narrative and an information processing model of moral cognition, are offered as mediators for the nonviolent transformation of self and world respectively, while avoiding the tragic requirements inherent in Floridi’s proposal.
This paper considers questions about continuity and discontinuity between life and mind. It begins by examining such questions from the perspective of the free energy principle (FEP). The FEP is becoming increasingly influential in neuroscience and cognitive science. It says that organisms act to maintain themselves in their expected biological and cognitive states, and that they can do so only by minimizing their free energy given that the long-term average of free energy is entropy. The paper then argues that there is no singular interpretation of the FEP for thinking about the relation between life and mind. Some FEP formulations express what we call an independence view of life and mind. One independence view is a cognitivist view of the FEP. It turns on information processing with semantic content, thus restricting the range of systems capable of exhibiting mentality. Other independence views exemplify what we call an overly generous non-cognitivist view of the FEP, and these appear to go in the opposite direction. That is, they imply that mentality is nearly everywhere. The paper proceeds to argue that non-cognitivist FEP, and its implications for thinking about the relation between life and mind, can be usefully constrained by key ideas in recent enactive approaches to cognitive science. We conclude that the most compelling account of the relationship between life and mind treats them as strongly continuous, and that this continuity is based on particular concepts of life (autopoiesis and adaptivity) and mind (basic and non-semantic).
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition to account for the temporal asymmetry. In this paper, I show that the Past Hypothesis also contains enough information to simplify the quantum ontology and define a unique initial condition in such a world. First, I introduce Density Matrix Realism, the thesis that the quantum universe is described by a fundamental density matrix that represents something objective. This stands in sharp contrast to Wave Function Realism, the thesis that the quantum universe is described by a wave function that represents something objective. Second, I suggest that the Past Hypothesis is sufficient to determine a unique and simple density matrix. This is achieved by what I call the Initial Projection Hypothesis: the initial density matrix of the universe is the normalized projection onto the special low-dimensional Hilbert space. Third, because the initial quantum state is unique and simple, we have a strong case for the Nomological Thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, including for the harmony between statistical mechanics and quantum mechanics, for the dynamic unity of the universe and its subsystems, and for the alleged conflict between Humean supervenience and quantum entanglement.
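The Initial Projection Hypothesis can be illustrated with a minimal linear-algebra sketch. The dimensions below are invented for illustration (the real Past Hypothesis subspace is not spelled out at this level): the point is only that the normalized projection onto a low-dimensional subspace is automatically a legitimate density matrix, maximally mixed over that subspace.

```python
import numpy as np

# Toy sketch, not the paper's actual construction: pretend the
# "Past Hypothesis subspace" is spanned by the first k basis vectors
# of a small d-dimensional Hilbert space (d and k are arbitrary here).
d, k = 6, 2
P = np.zeros((d, d))
P[:k, :k] = np.eye(k)          # projection operator onto the subspace

rho = P / np.trace(P)          # normalized projection = initial density matrix

# rho has the defining properties of a density matrix ...
assert np.allclose(rho, rho.conj().T)   # Hermitian
assert np.isclose(np.trace(rho), 1.0)   # unit trace
# ... and is maximally mixed over the subspace: k eigenvalues 1/k, rest 0.
print(np.round(np.linalg.eigvalsh(rho), 3))
```

Given the subspace, no further choice is needed: this is the "conditional uniqueness" the related abstracts describe, with the density matrix fixed entirely by the projection.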
Philosophers of physics have long debated whether the Past State of low entropy of our universe calls for explanation. What is meant by “calls for explanation”? In this article we analyze this notion, distinguishing between several possible meanings that may be attached to it. Taking the debate around the Past State as a case study, we show how our analysis of what “calling for explanation” might mean can contribute to clarifying the debate and perhaps to settling it, thus demonstrating the fruitfulness of this analysis. Applying our analysis, we show that two main opponents in this debate, Huw Price and Craig Callender, are, for the most part, talking past each other rather than disagreeing, as they employ different notions of “calling for explanation”. We then proceed to show how answering the different questions that arise out of the different meanings of “calling for explanation” can result in clarifying the problems at hand and thus, hopefully, to solving them.
We review some of the main implications of the free-energy principle (FEP) for the study of the self-organization of living systems – and how the FEP can help us to understand (and model) biotic self-organization across the many temporal and spatial scales over which life exists. In order to maintain its integrity as a bounded system, any biological system - from single cells to complex organisms and societies - has to limit the disorder or dispersion (i.e., the long-run entropy) of its constituent states. We review how this can be achieved by living systems that minimize their variational free energy. Variational free energy is an information theoretic construct, originally introduced into theoretical neuroscience and biology to explain perception, action, and learning. It has since been extended to explain the evolution, development, form, and function of entire organisms, providing a principled model of biotic self-organization and autopoiesis. It has provided insights into biological systems across spatiotemporal scales, ranging from microscales (e.g., sub- and multicellular dynamics), to intermediate scales (e.g., groups of interacting animals and culture), through to macroscale phenomena (the evolution of entire species). A crucial corollary of the FEP is that an organism just is (i.e., embodies or entails) an implicit model of its environment. As such, organisms come to embody causal relationships of their ecological niche, which, in turn, is influenced by their resulting behaviors. Crucially, free-energy minimization can be shown to be equivalent to the maximization of Bayesian model evidence. This allows us to cast natural selection in terms of Bayesian model selection, providing a robust theoretical account of how organisms come to match or accommodate the spatiotemporal complexity of their surrounding niche.
In line with the theme of this volume, namely biological complexity and self-organization, this chapter will examine a variational approach to self-organization across multiple dynamical scales.
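The equivalence between free-energy minimization and maximizing Bayesian model evidence can be made concrete in a minimal sketch. The two-state generative model below is hypothetical (the prior and likelihood numbers are invented for illustration): variational free energy decomposes as KL(q ‖ posterior) minus log evidence, so it is minimized exactly when the approximate belief q equals the Bayesian posterior, where F = -log p(o).

```python
import numpy as np

# Hypothetical two-hidden-state generative model (numbers invented
# for illustration): prior p(s) and likelihood p(o=1 | s).
prior = np.array([0.7, 0.3])
likelihood = np.array([0.2, 0.9])
joint = prior * likelihood          # p(o=1, s)
evidence = joint.sum()              # model evidence p(o=1)
posterior = joint / evidence        # exact Bayesian posterior p(s | o=1)

def free_energy(q):
    # F(q) = E_q[log q(s) - log p(o, s)] = KL(q || posterior) - log evidence
    return float(np.sum(q * (np.log(q) - np.log(joint))))

# Scanning candidate beliefs q shows the minimum sits at the posterior,
# where F equals -log(evidence): minimizing F maximizes model evidence.
grid = [np.array([a, 1.0 - a]) for a in np.linspace(0.01, 0.99, 99)]
best = min(grid, key=free_energy)
print(best, posterior)
print(free_energy(posterior), -np.log(evidence))
```

Because F upper-bounds surprise (-log evidence), a system that keeps F low thereby bounds the long-run entropy of its states, which is the sense of "limiting dispersion" used in the abstract above.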
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate---the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a “typical” choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic/thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities in the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
The essential difficulty concerning the philosophical status of Computer Ethics (CE) is a methodological problem: standard ethical theories cannot easily be adapted to deal with CE problems, which appear to strain their conceptual resources, and CE requires a conceptual foundation as an ethical theory. Information Ethics (IE), the philosophical foundational counterpart of CE, can be seen as a particular case of environmental ethics or ethics of the infosphere. What is good for an information entity and the infosphere in general? This is the ethical question asked by IE. The answer is provided by a minimalist theory of deserts: IE argues that there is something more elementary and fundamental than life and pain, namely being, understood as information, and entropy, and that any information entity is to be recognised as the centre of a minimal moral claim, which deserves recognition and should help to regulate the implementation of any information process involving it. IE can provide a valuable perspective from which to approach, with insight and adequate discernment, not only moral problems in CE, but also the whole range of conceptual and moral phenomena that form the ethical discourse.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
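The duality described above can be made tangible with a small sketch (the universe set and partitions below are arbitrary examples): a partition's "distinctions" are the ordered pairs of elements lying in different blocks, and refining a partition, i.e. making an attribute more definite, can only enlarge its set of distinctions.

```python
def dits(partition):
    """Distinctions of a partition: ordered pairs (u, v) with u and v
    in different blocks."""
    return {(u, v)
            for b1 in partition for b2 in partition if b1 != b2
            for u in b1 for v in b2}

U = {1, 2, 3, 4}                 # an arbitrary example universe
coarse = [{1, 2}, {3, 4}]        # a partition of U
fine = [{1}, {2}, {3, 4}]        # a refinement: block {1, 2} split up

# Refinement makes strictly more distinctions: the dit set grows.
assert dits(coarse) < dits(fine)             # proper subset
print(len(dits(coarse)), len(dits(fine)))    # 8 and 10 of |U|^2 = 16 pairs
```

Here "more definite" is literal: every distinction made by the coarser partition is preserved, and the split block contributes new ones.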
The development of technology is unbelievably rapid. From limited local networks to high-speed Internet, from crude computing machines to powerful semiconductors, the world has changed drastically compared to just a few decades ago. In the constantly renewing process of adapting to such an unnaturally high-entropy setting, innovations, as well as entirely new concepts, were often born. In the business world, one such phenomenon was the creation of a new type of entrepreneurship. This paper proposes a new academic discipline of computational entrepreneurship, which centers on: (i) exponentially growing (and less expensive) computing power, to the extent that almost everybody in a modern society can own and use it; (ii) omnipresent high-speed Internet connectivity, wired or wireless, representing our modern day's economic connectomics; (iii) growing concern with exploiting “serendipity” for a strategic commercial advantage; and (iv) growing capabilities of lay people in performing calculations for their informed decisions in taking fast-moving entrepreneurial opportunities. Computational entrepreneurship has slowly become a new mode of operation for business ventures and will likely bring the academic discipline of entrepreneurship back to mainstream economics.
Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets, so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., u and u′ being in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions made by the partition, normalized by the total number of ordered pairs |U|² from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition, while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions."
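The normalized-distinction count can be computed directly. A minimal sketch (the example universe and partition are arbitrary) comparing logical entropy, the probability that two independent draws from U are distinguished by the partition, with Shannon entropy over the same block probabilities:

```python
from math import log2

def logical_entropy(partition, n):
    # h(pi) = |dit(pi)| / |U|^2 = 1 - sum of squared block probabilities:
    # the probability that two independent draws fall in different blocks.
    return 1 - sum((len(b) / n) ** 2 for b in partition)

def shannon_entropy(partition, n):
    # Average number of binary partitions (bits) needed to make the
    # same distinctions.
    probs = [len(b) / n for b in partition]
    return sum(p * log2(1 / p) for p in probs)

U = {1, 2, 3, 4}
pi = [{1, 2}, {3}, {4}]               # an arbitrary partition of U
print(logical_entropy(pi, len(U)))    # 1 - (1/2)^2 - (1/4)^2 - (1/4)^2 = 0.625
print(shannon_entropy(pi, len(U)))    # 1/2*1 + 1/4*2 + 1/4*2 = 1.5
```

The two measures are built from the same block probabilities but answer different questions: a two-draw distinction probability versus an average code length in bits.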