One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way they are usually conceived. In particular, though Maxwell agrees with the currently accepted view that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker: the validity of even this probabilistic version is of limited scope, restricted to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.
This paper is the first part of a three-part project, 'How the principle of energy conservation evolved between 1842 and 1870: the view of a participant'. This paper aims at showing how the new ideas of Mayer and Joule were received, what constituted the new theory in the period under study, and how it was supported experimentally. A connection was found between the new theory and thermodynamics which benefited both of them. Some considerations are offered about the desirability of taking a historical approach to teaching energy and its conservation.
Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure-state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question of whether an entangled state can be transformed into another by means of local operations and classical communication. Under two operational requirements, we prove a general version of the Lo-Popescu theorem, which lies at the foundations of the theory of pure-state entanglement. We then consider a resource theory of purity where free operations are random reversible transformations, modelling the scenario where an agent has limited control over the dynamics of a closed system. Our key result is a duality between the resource theory of entanglement and the resource theory of purity, valid for every physical theory where all processes arise from pure states and reversible interactions at the fundamental level. As an application of the main result, we establish a one-to-one correspondence between entropies and measures of pure bipartite entanglement and exploit it to define entanglement measures in the general probabilistic framework. In addition, we show a duality between the task of information erasure and the task of entanglement generation, whereby the existence of entropy sinks (systems that can absorb arbitrary amounts of information) becomes equivalent to the existence of entanglement sources (correlated systems from which arbitrary amounts of entanglement can be extracted).
The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain models and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight into the formal relationship between cognition and neural activity.
In this article, it is argued that, given an initial uncertainty in the state of a system, the information possessed about the system by any given observer tends to decrease exponentially until there is none left. By linking the subjective, i.e. observer-dependent, concepts of information and entropy, the statement of information decrease represents an alternative formulation of the second law of thermodynamics. With this reformulation, the connection between the foundations of statistical mechanics and classical mechanics is clarified. In conclusion, it is argued that concepts such as probability, ergodicity, and entropy, as well as the arrow of time, arise naturally as a consequence of the Gibbs-Liouville theorem in combination with the fact that any given observer of a system does not possess infinite knowledge about the initial conditions of the system.
The genesis of the early quantum theory, represented by Planck's 1897-1906 papers, is considered. It is shown that the first quantum theoretical schemes were constructed as hybrids, composed from the ideal models and laws of Maxwellian electrodynamics, Newtonian mechanics, statistical mechanics, and thermodynamics. Ludwig Boltzmann's ideas and techniques proved crucial. In deriving the black-body radiation law, Max Planck had to take the experimental evidence into account. It forced him not to deduce from phenomena but to use more theory instead. The experiments forced Planck to apply the statistical technique to radiation in increasing portions. Planck's theories were in no way generalizations of existing experimental results. They represented the stages of an ambitious programme of reconciling Maxwellian electrodynamics and statistical mechanics.
Emergent properties are properties referring to a system as a whole, which do not make sense for its elements or for sufficiently small parts. Furthermore, certain emergent properties are often reducible to those of elements or relevant parts. The paper concerns the special case where the description of the system by means of its emergent properties is much simpler than that of its relevant elements or parts. The concept is investigated by a case study based on statistical thermodynamics, general relativity, and quantum mechanics.
The success of a few theories in statistical thermodynamics can be correlated with their selectivity to reality. These are the theories of Boltzmann, Gibbs, and Einstein. The starting point is Carnot's theory, which implicitly defines the general selection of reality relevant to thermodynamics. The three other theories share this selection, but specify it further in detail. Each of them separates a few main aspects within the scope of the implicit thermodynamic reality. Their success is grounded in that selection. Those aspects can be represented by corresponding oppositions: macroscopic vs. microscopic; elements vs. states; relational vs. non-relational; and observable vs. theoretical. They can be interpreted as axes of independent qualities constituting a common qualitative reference frame shared by those theories. Each of them can be situated in this reference frame, occupying a different place. This reference frame can be interpreted as an additional selection of reality within Carnot's initial selection, describable as macroscopic and both observable and theoretical. The deduced reference frame refers implicitly to many scientific theories independently of their subject, therefore defining a general and common space or subspace for scientific theories (though not for all). The immediate conclusion is: the examples of a few statistical thermodynamic theories demonstrate that the concept of "reality" is changed or generalized, or even exemplified (i.e., "de-generalized"), from one theory to another. Still, a few more general suggestions bearing on the scientific realism debate can be added: one can admit that reality in scientific theories is some partially shared common qualitative space or subspace, describable by relevant oppositions and rather independent of the theories' subjects, which may differ greatly in general. Many or maybe all theories can be situated in that space of reality, which should develop by adding new dimensions for ever newer theories.
Its division into independent subspaces can represent the many-realities conception. The subject of a theory determines some relevant subspace of reality. This represents a selection within reality, relevant to the theory in question. The success of that theory correlates essentially with the selection within reality relevant to its subject.
The aim of this paper is to take a step towards a complete description of Special Relativity's genesis and acceptance, shedding some light on the intertheoretic relations between Special Relativity and other physical theories of the day. I'll try to demonstrate that Special Relativity and the Early Quantum Theory were created within the same programme of reconciling statistical mechanics, thermodynamics, and Maxwellian electrodynamics, i.e. of eliminating the contradictions between the consequences of these theories. The approach proposed enables one to explain why classical mechanics and classical electrodynamics were "refuted" almost simultaneously or, in terms more suitable for the present discussion, why the quantum and the relativistic revolutions both took place at the beginning of the 20th century. I'll argue that the quantum and the relativistic revolutions were simultaneous since they had a common origin: the clash between the fundamental theories of the second half of the 19th century that constituted the "body" of Classical Physics. The revolution's most dramatic turning point was Einstein's 1905 light quantum paper, which laid the foundations of the Old Quantum Theory and influenced the fate of the special theory of relativity too. Hence, the following two main interrelated theses are defended. (1) Einstein's special relativity 1905 paper can be considered as a subprogramme of a general research programme that had its pivot in the quantum; (2) one of the reasons for Einstein's victory over Lorentz consists in the following: special relativity theory superseded Lorentz's theory when the general programme imposed itself, and, in so doing, made the ether concept untenable. Key words: A. Einstein; H. Lorentz; I. Yu. Kobzarev; context of discovery; context of justification.
Entangled quantum systems can be harnessed to transmit, store, and manipulate information in a more efficient and secure way than is possible in the realm of classical physics. Given this resource character of entanglement, it is an important problem to characterize ways to manipulate it and meaningful approaches to its quantification. This is the objective of entanglement theory.
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not to take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
This study has demonstrated that entropy is not a physical quantity, that is, that the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 are thought to define new system state variables, such definitions would be absurd. The fundamental error of entropy is that in any reversible process, the polytropic process function Q is not a single-valued function of T, and the key step from Σ[(ΔQ)/T] to ∫dQ/T does not hold; the P-V diagram should be a P-V-T diagram in thermodynamics. Similarly, ∮dQ/T = 0, ∮dW/T = 0 and ∮dE/T = 0 do not hold, either. Since the absolute entropy of Boltzmann is used to explain Clausius entropy and the unit (J/K) of the former is transformed from the latter, the non-existence of Clausius entropy simultaneously denies Boltzmann entropy.
I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the theory has some success in evaluating a wide variety of ordinary counterfactuals, it fails as an explanation of counterfactual asymmetry.
The ecological crisis demonstrates the inadequacy of current modes of thought to grasp the nature of reality and to act accordingly. A more sophisticated metaphysical system is necessary. Arran Gare, a prominent Australian philosopher, has produced such a system, which takes into account the postmodern sciences of non-linear thermodynamics, quantum mechanics, and complexity theory. The present article promotes a cosmology based on Gare's metaphysics. In contrast to modern science, the postmodern account offered here will come to terms with a world governed by indifference, which is the same indifference that Albert Camus describes as "absurd". Camus will be interpreted in light of Gare's metaphysics.
The explicit history of the "hidden variables" problem is well known and established. The main events of its chronology are traced. An implicit context of that history is suggested. It links the problem with the "conservation of energy conservation" in quantum mechanics. Bohr, Kramers, and Slater (1924) admitted its violation as being due to the "fourth Heisenberg uncertainty", that of energy in relation to time. Wolfgang Pauli rejected the conjecture and even forecast the existence of a new, then-unknown elementary particle, the neutrino, on the ground of energy conservation in quantum mechanics, afterwards confirmed experimentally. Bohr recognized his defeat and Pauli's truth: the paradigm of elementary particles (furthermore underlying the Standard Model) dominates nowadays. However, the reason for energy conservation in quantum mechanics is quite different from that in classical mechanics (the Lie group of all translations in time). Even more, if the reason were the latter, Bohr, Kramers, and Slater's argument would be valid. The link between the "conservation of energy conservation" and the problem of hidden variables is the following: the former is equivalent to their absence. The same can be verified historically by the unification of Heisenberg's matrix mechanics and Schrödinger's wave mechanics in contemporary quantum mechanics by means of the separable complex Hilbert space. The Heisenberg version relies on the vector interpretation of Hilbert space, and the Schrödinger one on the wave-function interpretation. However, the two are equivalent to each other only under the additional condition that a certain well-ordering is equivalent to the corresponding ordinal number (as in von Neumann's definition of "ordinal number"). The same condition, interpreted in the proper terms of quantum mechanics, means its "unitarity", and therefore the "conservation of energy conservation".
In other words, the "conservation of energy conservation" is postulated in the foundations of quantum mechanics by means of the concept of the separable complex Hilbert space, which furthermore is equivalent to postulating the absence of hidden variables in quantum mechanics (directly deducible from the properties of that Hilbert space). Further, the lesson of that unification (of Heisenberg's approach and Schrödinger's version) can be directly interpreted in terms of the unification of general relativity and quantum mechanics in the cherished "quantum gravity", as well as a "manual" of how one can do this by considering them as isomorphic to each other in a new mathematical structure corresponding to quantum information. Even more, the condition of the unification is analogous to that in the historical precedent of the unifying mathematical structure (namely the separable complex Hilbert space of quantum mechanics) and consists in the class of equivalence of any smooth deformations of the pseudo-Riemannian space of general relativity: each element of that class is a wave function and vice versa. Thus, quantum mechanics can be considered a "thermodynamic version" of general relativity, on which the universe is observed as if from "outside" (similarly to a phenomenological thermodynamic system, observable only from "outside" as a whole). The statistical approach to that "phenomenological thermodynamics" of quantum mechanics implies Gibbs classes of equivalence of all states of the universe, furthermore representable in Boltzmann's manner, implying general relativity proper. The meta-lesson is that the historical lesson can serve for future discoveries.
Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 in order to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens the account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made throughout the years, but none of them has been successful. We have shown (in a number of publications), by a general state-space argument, that Maxwell’s Demon is compatible with classical mechanics, and that the most recent solutions, based on Landauer’s thesis, are not general. In this paper we demonstrate that Maxwell’s Demon is also compatible with quantum mechanics. We do so by analyzing a particular (but highly idealized) experimental setup and proving that it violates the Second Law. Our discussion is in the framework of standard quantum mechanics; we give two separate arguments in the framework of quantum mechanics with and without the projection postulate. We address in our analysis the connection between measurement and erasure interactions, and we show how these notions are applicable in the microscopic quantum mechanical structure. We discuss what might be the quantum mechanical counterpart of the classical notion of “macrostates”, thus explaining why our Quantum Demon setup works not only at the micro level but also at the macro level, properly understood. One implication of our analysis is that the Second Law cannot provide a universal lawlike basis for an account of the arrow of time; this account has to be sought elsewhere.
The remarkable connections between gravity and thermodynamics seem to imply that gravity is not fundamental but emergent, and in particular, as Verlinde suggested, gravity is probably an entropic force. In this paper, we will argue that the idea of gravity as an entropic force is debatable. It is shown that there is no convincing analogy between gravity and entropic force in Verlinde’s example. Neither the holographic screen nor the test particle satisfies all the requirements for the existence of an entropic force in a thermodynamic system. As a result, there is no entropic force in the gravity system. Furthermore, we show that the entropy increase of the screen is not caused by its statistical tendency to increase entropy, as required by the existence of an entropic force, but is in fact caused by gravity. Therefore, Verlinde’s argument for the entropic origin of gravity is problematic. In addition, we argue that the existence of a minimum size of spacetime, together with the Heisenberg uncertainty principle in quantum theory, may imply the fundamental existence of gravity as a geometric property of spacetime. This provides further support for the conclusion that gravity is not an entropic force.
In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate: the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic / thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities from the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
Many attempts have been made to provide Quantum Field Theory with conceptually clear and mathematically rigorous foundations; remarkable examples are the Bohmian and the algebraic perspectives, respectively. In this essay we introduce the dissipative approach to QFT (DQFT), a new alternative formulation of the theory explaining the phenomena of particle creation and annihilation starting from nonequilibrium thermodynamics. It is shown that DQFT presents a rigorous mathematical structure and a clear particle ontology, taking the best from the aforementioned perspectives. Finally, after the discussion of its principal implications and consequences, we compare it with the main Bohmian QFTs implementing a particle ontology.
This chapter draws on insights from non-equilibrium thermodynamics to demonstrate the ontological inadequacy of the machine conception of the organism. The thermodynamic character of living systems underlies the importance of metabolism and calls for the adoption of a processual view, exemplified by the Heraclitean metaphor of the stream of life. This alternative conception is explored in its various historical formulations and the extent to which it captures the nature of living systems is examined. Following this, the chapter considers the metaphysical implications of reconceptualizing the organism from complex machine to flowing stream. What do we learn when we reject the mechanical and embrace the processual? Three key lessons for biological ontology are identified. The first is that activity is a necessary condition for existence. The second is that persistence is grounded in the continuous self-maintenance of form. And the third is that order does not entail design.
This doctoral dissertation investigates the notion of physical necessity. Specifically, it studies whether it is possible to account for non-accidental regularities without the standard assumption of a pre-existent set of governing laws. Thus, it sides with the so-called deflationist accounts of laws of nature, like the Humean or the antirealist. The specific aim is to complement such accounts by providing a missing explanation of the appearance of physical necessity. In order to provide an explanation, I turn to fields that have not so far been appealed to in discussions about the metaphysics of laws: namely, complex systems theory and the foundations of statistical mechanics. The explanation proposed is inspired by how complex systems theory has elucidated the way patterns emerge, and by the probabilistic explanations of the second law of thermodynamics. More specifically, this thesis studies how some constraints that make no direct reference to the dynamics can be a sufficient condition for obtaining, in the long run and with high probability, stable regular behavior. I hope to show how certain metaphysical accounts of laws might benefit from the insights achieved in these other fields. According to the proposal studied in this thesis, some regularities are non-accidental not in virtue of an underlying physical necessity; the non-accidental character of certain regular behavior is due only to its overwhelming stability. Thus, from this point of view the goal becomes to explain the stability of temporal patterns without assuming a set of pre-existent guiding laws. It is argued that the stability can be the result of a process of convergence to simpler and stable regularities from a more complex lower level. According to this project, if successful, there would be no need to postulate a (mysterious) intermediate category between logical necessity and pure contingency.
Similarly, there would be no need to postulate a (mysterious) set of pre-existent governing laws. Part I of the thesis motivates Part II, mostly by arguing why further explanation of the notions of physical necessity and governing laws should be welcomed (chapter 1), and by studying the plausibility of a lawless fundamental level (chapters 2 and 3). Part II then develops the explanation of the formation of simpler and stable behavior from a more complex underlying level.
Philosophers of physics have long debated whether the Past State of low entropy of our universe calls for explanation. What is meant by “calls for explanation”? In this article we analyze this notion, distinguishing between several possible meanings that may be attached to it. Taking the debate around the Past State as a case study, we show how our analysis of what “calling for explanation” might mean can contribute to clarifying the debate and perhaps to settling it, thus demonstrating the fruitfulness of this analysis. Applying our analysis, we show that two main opponents in this debate, Huw Price and Craig Callender, are, for the most part, talking past each other rather than disagreeing, as they employ different notions of “calling for explanation”. We then proceed to show how answering the different questions that arise out of the different meanings of “calling for explanation” can result in clarifying the problems at hand and thus, hopefully, to solving them.
The model of scientific revolution genesis and structure, extracted from Einstein’s revolution and described in the author’s previous publications, is applied to the Copernican one. In the case of Einstein’s revolution I had argued that its cause consisted in the clash between the main classical physics scientific programmes: Newtonian mechanics, Maxwellian electrodynamics, classical thermodynamics, and statistical mechanics. Analogously, in the present paper it is argued that the Copernican revolution took place due to the realization of the dualism between mathematical astronomy and Aristotelian qualitative physics in Ptolemy’s cosmology and the corresponding efforts to eliminate it. The works of Copernicus, Galileo, Kepler, and Newton were all stages of the descent of mathematics from the skies to the earth and the reciprocal extrapolation of terrestrial physics onto the divine phenomena.
A comprehensible model is proposed, aimed at an analysis of the reasons for theory change in science. According to the model, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but of “old” fundamental theories with each other, leading to contradictions that can only be eliminated in a more general theory. The model is illustrated with reference to physics in the early 20th century, the three “old” theories in this case being Maxwellian electrodynamics, statistical mechanics, and thermodynamics. A modern example, referring to the fusion of general relativity and quantum field theory, is highlighted. Key words: Popper, Kuhn, Lakatos, Feyerabend, Stepin, Bransky, Mamchur, mature theory, structure, Einstein, Lorentz, Boltzmann, Planck, Hawking, De Witt.
Many religions and religious philosophies say that ultimate reality is a kind of primal energy. This energy is often described as a vital power animating living things, as a spiritual force directing the organization of matter, or as a divine creative power which generates all things. By refuting older conceptions of primal energy, modern science opens the door to new and more precise conceptions. Primal energy is referred to here as ‘spirit’. But spirit is a natural power. A naturalistic theory of spirit is developed using ideas from information theory and thermodynamics, such as the maximum entropy production principle. Spirit drives the evolution of complexity at all levels of existence.
Different anesthetics are known to modulate different types of membrane-bound receptors. Their common mechanism of action is expected to alter the mechanism for consciousness. Consciousness is hypothesized as the integral of all the units of internal sensations induced by reactivation of inter-postsynaptic membrane functional LINKs during mechanisms that lead to oscillating potentials. The thermodynamics of the spontaneous lateral curvature of lipid membranes induced by lipophilic anesthetics can lead to the formation of non-specific inter-postsynaptic membrane functional LINKs by different mechanisms. These include direct membrane contact by excluding the inter-membrane hydrophilic region and readily reversible partial membrane hemifusion. The constant reorganization of the lipid membranes at the lateral edges of the postsynaptic terminals (dendritic spines) resulting from AMPA receptor-subunit vesicle exocytosis and endocytosis can favor the effect of anesthetic molecules on lipid membranes at this location. Induction of a large number of non-specific LINKs can alter the conformation of the integral of the units of internal sensations that maintain consciousness. Anesthetic requirement is reduced in the presence of dopamine, which causes enlargement of dendritic spines. Externally applied pressure can transduce from the middle ear through the perilymph, cerebrospinal fluid, and the recently discovered glymphatic pathway to the extracellular matrix space, and finally to the paravenular space. The pressure gradient reduces solubility and displaces anesthetic molecules from the membranes into the paravenular space, explaining the pressure reversal of anesthesia. Changes in membrane composition and the conversion of membrane hemifusion to fusion due to defects in the checkpoint mechanisms can lead to cytoplasmic content mixing between neurons and cause neurodegenerative changes.
The common mechanism of anesthetics presented here can operate along with the known specific actions of different anesthetics.
Many mechanisms, functions and structures of life have been unraveled. However, the fundamental driving force that propelled chemical evolution and led to life has remained obscure. The second law of thermodynamics, written as an equation of motion, reveals that elemental abiotic matter evolves from the equilibrium via chemical reactions that couple to external energy towards complex biotic non-equilibrium systems. Each time a new mechanism of energy transduction emerges, e.g., by random variation in syntheses, evolution proceeds by punctuation and settles to a stasis when the accessed free energy has been consumed. The evolutionary course towards an increasingly larger energy transduction system accumulates a diversity of energy transduction mechanisms, i.e. species. The rate of entropy increase is identified as the fitness criterion among the diverse mechanisms, which grounds the theory of evolution by natural selection in a fundamental thermodynamic principle, with no demarcation line between the inanimate and the animate.
In the beginning God created the elementary particles. Bosons, electrons, protons, quarks and the rest he created them. And they were without form and void, so God created the fundamental laws of physics - the laws of mechanics, electromagnetism, thermodynamics and the rest - and assigned values to the fundamental physical constants: the gravitational constant, the speed of light, Planck's constant and the rest. God then set the Universe in motion. And God looked at what he had done, and saw that it was physicalistically acceptable.
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE Vol. 10, number 2, 1996, pp. 127-140. R.M. Nugayev. Why did the new physics force out the old? Abstract. The aim of my paper is to demonstrate that special relativity and the early quantum theory were created within the same programme of reconciling statistical mechanics, thermodynamics and Maxwellian electrodynamics. I’ll try to explain why classical mechanics and classical electrodynamics were “refuted” almost simultaneously or, in other words, why the quantum revolution and the relativistic one both took place at the beginning of the 20th century. I’ll argue that the quantum and relativistic revolutions were simultaneous since they had a common origin – the clash between the mature theories of the second half of the 19th century that constituted the “body” of classical physics. The revolution’s most dramatic point was Einstein’s 1905 photon paper that laid the foundations of both special relativity and the old quantum theory. Hence the dialectic of the old theories is crucial for theory change. Later, classical physics was forced out by the joint development of the quantum and relativistic subprogrammes. The title of my paper can be reformulated in Bruno Latour’s terms: The Einstein Revolution, or Drawing Models Together.
The concept of time is examined using the second law of thermodynamics that was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, i.e. geodesics, are obtained from the principle of least action for mechanics, electrodynamics and quantum mechanics. The arrow of time, associated with the expansion of the Universe, is identified with the grand dispersal of energy, when high energy densities transform by various mechanisms to lower energy densities and eventually to ever-diluting electromagnetic radiation. Likewise, time in a quantum system takes an increment forwards in the detection-associated dissipative transformation, when the stationary-state system begins to evolve, pictured as the wave function collapse. The energy dispersal is understood to underlie causality, so that an energy gradient is a cause and the resulting energy flow is an effect. This account of causality in terms of physical concepts does not imply determinism; on the contrary, the evolution of space–time as a causal chain of events is non-deterministic.
A comprehensible model is proposed, aimed at an analysis of the reasons for theory change in science. According to the model, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but of “old” fundamental theories with each other, leading to contradictions that can only be eliminated in a more general theory. The model is illustrated with reference to physics in the early 20th century, the three “old” theories in this case being Maxwellian electrodynamics, statistical mechanics and thermodynamics. A modern example, referring to general relativity and quantum field theory, is considered. Key words: Popper, Kuhn, Stepin, Einstein.
This paper distinguishes between three meanings of reversal, all of which are mathematically equivalent in classical mechanics: velocity reversal, retrodiction, and time reversal. It then concludes that, in order to have well-defined velocities, a primitive arrow of time must be included in every time slice. The paper briefly mentions that this arrow cannot come from the Second Law of thermodynamics, but this point is developed in more detail elsewhere.
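The classical equivalence the abstract invokes can be illustrated with a minimal numerical sketch (a hypothetical harmonic-oscillator example, not taken from the paper): reversing all velocities at time T and integrating the same equations forward retraces the trajectory back to its start.

```python
# Hedged sketch (assumed example, not from the paper): leapfrog
# integration of a harmonic oscillator. Reversing the velocity at
# time T and integrating forward again retraces the trajectory,
# illustrating the equivalence of velocity reversal and time reversal.
def leapfrog(x, v, accel, dt, steps):
    for _ in range(steps):
        v += 0.5 * dt * accel(x)   # half kick
        x += dt * v                # drift
        v += 0.5 * dt * accel(x)   # half kick
    return x, v

accel = lambda x: -x               # harmonic oscillator, a(x) = -x
x0, v0 = 1.0, 0.0
xT, vT = leapfrog(x0, v0, accel, dt=0.01, steps=500)   # evolve to time T
xb, vb = leapfrog(xT, -vT, accel, dt=0.01, steps=500)  # reverse v, evolve again
assert abs(xb - x0) < 1e-9 and abs(-vb - v0) < 1e-9    # back at the start
```

Leapfrog is itself a time-reversible integrator, which is why the retraced state agrees with the initial one to within floating-point rounding.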
To make out in what way Einstein’s 1905 ‘annus mirabilis’ writings hang together, one has to attend to Einstein’s striving for unity, evinced in his stubborn attempts to coordinate with one another the basic research traditions of classical physics. The light quanta hypothesis and the special theory of relativity turn out to be mere milestones of the programme of reconciling Maxwellian electrodynamics and statistical thermodynamics. The conception of luminiferous ether was an insurmountable stumbling block for Einstein’s statistical thermodynamics programme, in which the leading role was played by the light quanta paper.
The central motivating idea behind the development of this work is the concept of prespace, a hypothetical structure that is postulated by some physicists to underlie the fabric of space or space-time. I consider how such a structure could relate to space and space-time, and the rest of reality as we know it, and the implications of the existence of this structure for quantum theory. Understanding how this structure could relate to space and to the rest of reality requires, I believe, that we consider how space itself relates to reality, and how other so-called "spaces" used in physics relate to reality. In chapter 2, I compare space and space-time to other spaces used in physics, such as configuration space, phase space and Hilbert space. I support what is known as the "property view" of space, opposing both the traditional views of space and space-time, substantivalism and relationism. I argue that all these spaces are property spaces. After examining the relationships of these spaces to causality, I argue that configuration space has, due to its role in quantum mechanics, a special status in the microscopic world similar to the status of position space in the macroscopic world. In chapter 3, prespace itself is considered. One way of approaching this structure is through the comparison of the prespace structure with a computational system, in particular to a cellular automaton, in which space or space-time and all other physical quantities are broken down into discrete units. I suggest that one way open for a prespace metaphysics can be found if physics is made fully discrete in this way. I suggest as a heuristic principle that the physical laws of our world are such that the computational cost of implementing those laws on an arbitrary computational system is minimized, adapting a heuristic principle of this type proposed by Feynman.
In chapter 4, some of the ideas of the previous chapters are applied in an examination of the physics and metaphysics of quantum theory. I first discuss the "measurement problem" of quantum mechanics: this problem and its proposed solution are the primary subjects of chapter 4. It turns out that considering how quantum theory could be made fully discrete leads naturally to a suggestion of how standard linear quantum mechanics could be modified to give rise to a solution to the measurement problem. The computational heuristic principle reinforces the same solution. I call the modified quantum mechanics Critical Complexity Quantum Mechanics (CCQM). I compare CCQM with some of the other proposed solutions to the measurement problem, in particular the spontaneous localization model of Ghirardi, Rimini and Weber. Finally, in chapters 5 and 6, I argue that the measure of complexity of quantum mechanical states I introduce in CCQM also provides a new definition of entropy for quantum mechanics, and suggests a solution to the problem of providing an objective foundation for statistical mechanics, thermodynamics, and the arrow of time.
The principle of least action, which has so successfully been applied to diverse fields of physics, looks back at three centuries of philosophical and mathematical discussions and controversies. These could not explain why nature applies the principle and why scalar energy quantities succeed in describing dynamic motion. When the least action integral is subdivided into infinitesimally small sections, each one has to maintain the ability to minimise. This, however, has the mathematical consequence that the Lagrange function at a given point of the trajectory, the dynamic, available energy generating motion, must itself have a fundamental property to minimize. Since a scalar quantity, a pure number, cannot do that, energy must fundamentally be dynamic and time-oriented for a consistent understanding. It must have vectorial properties aiming at a decrease of free energy per state (which would also allow derivation of the second law of thermodynamics). Present physics ignores this and applies variational calculus as a formal mathematical tool to impose a minimisation of scalar assumed energy quantities for obtaining dynamic motion. When, however, the dynamic property of energy is taken seriously, it is fundamental and has also to be applied to quantum processes. A consequence is that particle and wave are not equivalent; rather, the wave (distributed energy) follows from the particle (concentrated energy). Information, provided from the beginning, an informational self-image of matter, is additionally needed to recreate the particle from the wave, shaping a “dynamic” particle–wave duality. It is shown that this new concept of a “dynamic” quantum state rationally explains quantization, the double-slit experiment and quantum correlation, which has not been possible before. Some more general considerations on the link between quantum processes, gravitation and cosmological phenomena are also advanced.
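The standard variational-calculus picture the abstract criticizes can itself be shown numerically. Below is a hedged sketch under simplifying assumptions (a free particle with unit mass, a discretized action, plain gradient descent; none of this is the author's construction): minimizing the discretized action over the interior points of a trajectory with fixed endpoints recovers the classical straight-line motion.

```python
import numpy as np

# Hedged illustration (assumed free-particle example, not the author's):
# discretize the action S = integral of (1/2) m v^2 dt (m = 1, V = 0)
# and minimize it over the interior points of a trajectory with fixed
# endpoints. The minimizer is the classical straight-line trajectory.
rng = np.random.default_rng(0)
N, dt = 20, 1.0 / 20
x = np.zeros(N + 1)
x[-1] = 1.0                        # boundary conditions x(0)=0, x(1)=1
x[1:-1] = rng.random(N - 1)        # arbitrary initial interior guess

def action(x):
    v = np.diff(x) / dt
    return np.sum(0.5 * v**2) * dt

# Gradient descent on the interior points; dS/dx_i is (minus) the
# discrete second difference of the trajectory.
for _ in range(20000):
    grad = -(x[2:] - 2.0 * x[1:-1] + x[:-2]) / dt
    x[1:-1] -= 0.05 * dt * grad

straight = np.linspace(0.0, 1.0, N + 1)
assert np.allclose(x, straight, atol=1e-3)   # classical trajectory recovered
print(round(action(x), 4))                   # the minimal action, here 0.5
```

Subdividing the trajectory makes each segment's contribution minimal as well, which is the property the abstract's argument turns on.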
The possibility of empirical test is discussed with respect to three issues: (1) What is the ontological relationship between consciousness and the brain/physical world? (2) What physical characteristics are associated with the mind/brain interface? (3) Can consciousness act on the brain independently of any brain process?
In quantum theory every state can be diagonalized, i.e. decomposed as a convex combination of perfectly distinguishable pure states. This elementary structure plays a ubiquitous role in quantum mechanics, quantum information theory, and quantum statistical mechanics, where it provides the foundation for the notions of majorization and entropy. A natural question then arises: can we reconstruct these notions from purely operational axioms? We address this question in the framework of general probabilistic theories, presenting a set of axioms that guarantee that every state can be diagonalized. The first axiom is Causality, which ensures that the marginal of a bipartite state is well defined. Then, Purity Preservation states that the set of pure transformations is closed under composition. The third axiom is Purification, which allows one to assign a pure state to the composition of a system with its environment. Finally, we introduce the axiom of Pure Sharpness, stating that for every system there exists at least one pure effect occurring with unit probability on some state. For theories satisfying our four axioms, we show a constructive algorithm for diagonalizing every given state. The diagonalization result allows us to formulate a majorization criterion that captures the convertibility of states in the operational resource theory of purity, where random reversible transformations are regarded as free operations.
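In standard quantum theory, the diagonalization these axioms are designed to recover is just the spectral decomposition of a density matrix. A minimal sketch of that quantum-mechanical special case (a hypothetical 2x2 example, assuming numpy; the paper's operational construction is more general):

```python
import numpy as np

# A hypothetical qubit density matrix (unit trace, positive semidefinite):
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

# Spectral decomposition: every quantum state is a convex combination of
# perfectly distinguishable (orthonormal) pure states -- the structure the
# operational axioms in the paper are designed to recover.
p, vecs = np.linalg.eigh(rho)

# Reconstruct rho = sum_i p_i |v_i><v_i| to confirm the diagonalization.
rho_rec = sum(p[i] * np.outer(vecs[:, i], vecs[:, i])
              for i in range(len(p)))
assert np.allclose(rho, rho_rec)
assert np.isclose(p.sum(), 1.0) and np.all(p >= 0)  # a probability vector

# Entropy (and majorization) are functions of this spectrum alone.
entropy = -sum(pi * np.log2(pi) for pi in p if pi > 0)
assert 0.0 < entropy < 1.0   # mixed, but not maximally mixed
```

That entropy and majorization depend only on the eigenvalue spectrum is exactly why a diagonalization theorem suffices to found both notions.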
Metaphors establish connection. Root metaphors--patterns of relational imagery in the language and thought of a culture, in which a diverse group of tenors are related to a single identifiable class of vehicles--play an important role in organizing our thought, and in bringing a coherence to our vision of the world. This is a political function; root metaphors, as philosopher Stephen Pepper discusses them, are most often found in the works of philosophers remembered as political philosophers. The second law of thermodynamics--the law of entropy--holds that in any spontaneous process, usable energy becomes unusable energy. It also suggests that improbable order must succumb, through time, to more probable chaos. The law of entropy has enjoyed a popularity as metaphor unusual for such physics esoterica. In the works of Brooks Adams, Henry Adams, Nicholas Georgescu-Roegen, and Thomas Pynchon, the idea of entropy appears as the fundamental, organizing idea for an economic interpretation of history, a philosophy of history, an ecologically enlightened economic theory, and an encyclopedic novel that apotheosizes modern culture. Analysis of how the entropy metaphor is manifest in the works of these thinkers allows us to judge the strengths and weaknesses of entropy as root metaphor. Analysis of its contemporary popularity affords insight into the politics of the day. Ultimately, the entropy root metaphor serves as the foundation of a refurbished "generating substance" world hypothesis, but the root metaphor itself remains equivocal on the important issue of centralized versus decentralized political organization.
In the brain the relations between free neurons and the conditioned ones establish the constraints for the informational neural processes. These constraints reflect the system-environment state, i.e. the dynamics of homeocognitive activities. The constraints allow us to define the cost function in the phase space of free neurons so as to trace the trajectories of the possible configurations at minimal cost while respecting the constraints imposed. Since the space of the free states is a manifold, or a non-orthogonal space, the minimum distance is not a straight line but a geodesic. The minimum condition is expressed by a set of ordinary differential equations (ODEs) that in general are not linear. In the brain there is no algorithm or physical field that regulates the computation, so we must consider an emergent process coming out of the neural collective behavior triggered by synaptic variability. We define neural computation as the study of the classes of trajectories on a manifold geometry defined under suitable constraints. The cost function supervises pseudo-equilibrium thermodynamic effects that manage the computational activities from beginning to end and realizes an optimal control through constraints and geodesics. The task of this work is to establish a connection between the geometry of neural computation and cost functions. To illustrate the essential mathematical aspects we will use as a toy model a Network Resistor with Adaptive Memory (Memristors). The information geometry here defined is an analog computation; therefore it does not suffer the limits of Turing computation, and it seems to respond to the demand for greater biological plausibility. The model of brain optimal control proposed here can be a good foundation for implementing the concept of "intentionality", according to the suggestion of W. Freeman.
Indeed, the geodesic in the brain states can produce suitable behavior to realize wanted functions and invariants as neural expressions of cognitive intentions.
This report offers a modern perspective on the question of time directionality as it arises in a semi-classical context, based on key developments in the field of gravitational physics. Important clarifications are achieved regarding, in particular, the concepts of time reversal and negative energy. The conditions imposed by the Leibnizian constraint of relational definition of physical attributes are thoroughly examined and significant consequences of applying this consistency requirement are derived. From this analysis emerges an improved understanding of the general relativistic concept of stress-energy of matter as being a manifestation of local variations in the energy density of zero-point vacuum fluctuations. Based on those developments a set of axioms is proposed that enables the derivation of generalized gravitational field equations which actually constitute a simplification of relativity theory in the presence of negative energy matter and vacuum energy. Those results are then applied to provide significant new insights into many aspects of the semi-classical theory of black hole thermodynamics and to offer original solutions to several long-standing problems in theoretical cosmology, including the problem of the nature of dark matter and dark energy, that of the origin of thermodynamic time asymmetry and several other issues traditionally approached using inflation theory.
A comprehensible model is proposed, aimed at an analysis of the reasons for theory change in science. According to the model, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but of “old” research traditions with each other, leading to contradictions that can only be eliminated in a more general theory. The model is illustrated with reference to physics in the early 20th century, the three “old” traditions in this case being linked with Maxwellian electrodynamics, Newtonian mechanics and phenomenological thermodynamics. Some modern examples are considered. Key words: Kuhn, Lakatos, Zahar.
The articles in this collection were written over the last 10 years and edited to bring them up to date (2017). All the articles are about human behavior (as are all articles by anyone about anything), and so about the limitations of having a recent monkey ancestry (8 million years or much less, depending on viewpoint) and of manifesting words and deeds within the framework of our innate psychology as presented in the table of intentionality. As the famous evolutionist Richard Leakey says, it is critical to keep in mind not that we evolved from apes, but that in every important way we are apes. If everyone were given a real understanding of this (i.e., of human ecology and psychology, to actually give them some control over themselves), maybe civilization would have a chance. As things are, however, the leaders of society have no more grasp of things than their constituents, and so collapse into anarchy is inevitable. It is critical to understand why we behave as we do, and so the first section presents articles that try to describe (not explain, as Wittgenstein insisted) behavior. Section one starts with a brief review of the logical structure of rationality, which provides some heuristics for the description of language (mind) and gives some suggestions as to how this relates to the evolution of social behavior. This centers around the two writers I have found the most important in this regard, Ludwig Wittgenstein and John Searle, whose ideas I combine and extend within the dual-system (two systems of thought) framework that has proven so useful in recent thinking and reasoning research. As I note, there is in my view essentially complete overlap between philosophy, in the strict sense of the enduring questions that concern the academic discipline, and the descriptive psychology of higher order thought (behavior).
Once one has grasped Wittgenstein’s insight that there is only the issue of how the language game is to be played, one determines the Conditions of Satisfaction (what makes a statement true or satisfied, etc.) and that is the end of the discussion. Since philosophical problems are the result of our innate psychology, or, as Wittgenstein put it, due to the lack of perspicuity of language, they run throughout human discourse, so there is endless need for philosophical analysis, not only in the ‘human sciences’ of philosophy, sociology, anthropology, political science, psychology, history, literature, religion, etc., but in the ‘hard sciences’ of physics, mathematics, and biology. It is universal to mix the language game questions with the real scientific ones as to what the empirical facts are. Scientism is ever present, and the master laid it before us long ago, i.e., Wittgenstein (hereafter W), beginning with the Blue and Brown Books in the early 1930s. "Philosophers constantly see the method of science before their eyes and are irresistibly tempted to ask and answer questions in the way science does. This tendency is the real source of metaphysics and leads the philosopher into complete darkness." (BBB p18) The key to everything about us is biology, and it is obliviousness to it that leads millions of smart educated people like Obama, Chomsky, Clinton and the Pope to espouse suicidal utopian ideals that inexorably lead straight to Hell on Earth. As W noted, it is what is always before our eyes that is the hardest to see. We live in the world of conscious, deliberative, linguistic System 2, but it is unconscious, automatic, reflexive System 1 that rules. This is the source of the universal blindness described by Searle’s The Phenomenological Illusion (TPI), Pinker’s Blank Slate and Tooby and Cosmides’ Standard Social Science Model.
The astute may wonder why we cannot see System 1 at work, but it is clearly counterproductive for an animal to be thinking about or second-guessing every action, and in any case there is no time for the slow, massively integrated System 2 to be involved in the constant stream of split-second ‘decisions’ we must make. As W noted, our ‘thoughts’ (T1, or the ‘thoughts’ of System 1) must lead directly to actions. It is my contention that the table of intentionality (rationality, mind, thought, language, personality, etc.) that features prominently here describes more or less accurately, or at least serves as a heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology, but everything else (history, literature, mathematics, politics, etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view it includes both conscious deliberative System 2 and unconscious automated System 1 actions or reflexes. Thus all the articles, like all behavior, are intimately connected if one knows how to look at them. As I note, the Phenomenological Illusion (oblivion to our automated System 1) is universal and extends not merely throughout philosophy but throughout life. I am sure that Chomsky, Obama, Zuckerberg and the Pope would be incredulous if told that they suffer from the same problem as Hegel, Husserl and Heidegger (or that they differ only in degree from drug and sex addicts in being motivated by stimulation of their frontal cortices by the delivery of dopamine via the ventral tegmentum and the nucleus accumbens), but it’s clearly true. While the phenomenologists only wasted a lot of people’s time, they are wasting the earth and their descendants’ future. Section one continues with other views of behavior which my reviews attempt to correct and put in context with minimal theory.
The next section describes the digital delusions, which confuse the language games of System 2 with the automatisms of System 1, and so cannot distinguish biological machines (i.e., people) from other kinds of machines (i.e., computers). The ‘reductionist’ claim is that one can ‘explain’ behavior at a ‘lower’ level, but what actually happens is that one does not explain human behavior but a ‘stand-in’ for it. Hence the title of Searle’s classic review of Dennett’s book (“Consciousness Explained”): “Consciousness Explained Away”. In most contexts ‘reduction’ of higher-level emergent behavior to brain functions, biochemistry, or physics is incoherent. Even for chemistry or physics the path is blocked by chaos and uncertainty. Anything can be ‘represented’ by equations, but when they ‘represent’ higher-order behavior, it is not clear what the ‘results’ mean. Reductionist metaphysics is a joke, but most scientists and philosophers lack the appropriate sense of humor. Another hi-tech delusion is that we will be saved from the pure evil (selfishness) of System 1 by the computers/AI/robotics/nanotech/genetic engineering created by System 2. The No Free Lunch principle tells us there will be serious and possibly fatal consequences. The adventurous may regard this principle as a higher-order emergent expression of the Second Law of Thermodynamics. The last section describes The One Big Happy Family Delusion, i.e., that we are selected for cooperation with everyone, and that the euphonious ideals of Democracy, Diversity and Equality will lead us into utopia. Again, the No Free Lunch Principle ought to warn us it cannot be true, and we see throughout history and all over the contemporary world that, without strict controls, selfishness and stupidity gain the upper hand and soon destroy any nation that embraces them.
In addition, the monkey mind steeply discounts the future, and so we sell our descendants’ heritage for temporary comforts, greatly exacerbating the problems. I describe versions of this delusion (i.e., that we are basically ‘friendly’ if just given a chance) as it appears in some recent books on sociology/biology/economics. I end with an essay on the great tragedy playing out in America and the world, which can be seen as a direct result of our evolved psychology manifested as the inexorable machinations of System 1. Our evolved psychology, eminently adaptive and eugenic on the plains of Africa from ca. 50,000 years ago, when many of our ancestors left Africa, back to ca. 6 million years ago, when we split from chimpanzees (i.e., in the EEA or Environment of Evolutionary Adaptation), is now maladaptive and dysgenic and the source of our Suicidal Utopian Delusions. So, like all discussions of behavior, this book is about evolutionary strategies, selfish genes and inclusive fitness.
Special Relativity and the Early Quantum Theory were created within the same programme of reconciling statistical mechanics, thermodynamics and Maxwellian electrodynamics. I shall try to explain why classical mechanics and classical electrodynamics were “refuted” almost simultaneously or, in terms more suitable for the present congress, why the quantum revolution and the relativistic one both took place at the beginning of the 20th century. I shall argue that the quantum and relativistic revolutions were simultaneous since they had a common origin: the clash between the fundamental theories of the second half of the 19th century that constituted the “body” of Classical Physics. The revolution’s most dramatic point was Einstein’s 1905 photon paper that laid the foundations of both Special Relativity and the Old Quantum Theory. Hence the dialectic of the old theories is crucial for theory change. Modern physics began with Einstein’s reconciliation of electrodynamics, mechanics and thermodynamics in 1905 and his unification of Special Relativity and the Newtonian Theory of Gravity. Or, in a more general social context: progressive scientific change can be described not only in Weberian terms of zweckrational action forcing out all other forms of action, but also in terms of Habermas’s communicative rationality, encouraging the establishment of mutual understanding between the various scientific communities. Einstein’s programme constituted a progressive step with respect to its rivals not because it could explain more “facts” or was more “mathematical”. It stood higher than its rivals because it constituted a basis of communication and interpenetration between the three main paradigms of 19th-century physics. Of course, in the long run it resulted in empirical successes. Key words: Einstein, scientific revolution, communicative rationality.
To make out in what way Einstein’s manifold 1905 ‘annus mirabilis’ writings hang together, one has to take into consideration Einstein’s striving for unity, evinced in his persistent attempts to reconcile the basic research traditions of classical physics. The light quanta hypothesis and the special theory of relativity turn out to be the contours of a more profound design, mere milestones in the implementation of the programme of reconciling Maxwellian electrodynamics, statistical mechanics and thermodynamics. The conception of luminiferous ether was an insurmountable obstacle for Einstein’s statistical thermodynamics, in which the leading role was played by the light quanta paper. In his critical stand against the entrenched research traditions of classical physics Einstein was apparently influenced by David Hume and Ernst Mach. However, as regards its creative momenta, Einstein’s 1905 unificationist modus operandi drew upon Mach’s principle of economy of thought, taken in the context of his ‘instinctive knowledge’ doctrine, together with promising inclinations of Kantian epistemology presuming the coincidence of both constructing theory and the integrating intuition of Principle.
Press release. The ebook entitled Einstein's Revolution: A Study of Theory-Unification gives students of physics and philosophy, and general readers, an epistemological insight into the genesis of Einstein's special relativity and its further unification with other theories, which culminated in the construction of general relativity. The book was written by Rinat Nugayev, who graduated from the relativity department of Kazan State University and received his M.Sc. at the department of philosophy of science of Moscow State University and his Ph.D. at the Institute of Philosophy of the Russian Academy of Sciences, Moscow. He has forty years of teaching and research experience in the philosophy of science and relativistic astrophysics, evinced in more than 200 papers in the scientific journals of Russia, Ukraine, Belarus, the USA, Great Britain, Germany, Spain, Italy, Sweden, Switzerland, the Netherlands, Canada, Denmark, Poland, Romania, France, Greece, Japan and other countries, and in 8 monographs. Revolutions in physics all embody theoretical unification. Hence the overall aim of the present book is to unfold Einstein's unificationist modus operandi, the hallmarks of Einstein's actual methodology of unification that engendered his 1905 special relativity as well as his 1915 general relativity. To achieve this object, a lucid epistemic model is expounded, aimed at analysing the reasons for mature theory change in science (chapter 1). According to the model, scientific revolutions are due not to the fanciful creation of new ideas 'ex nihilo', but rather to long-term processes of reconciliation, interpenetration and intertwinement of 'old' research traditions preceding such breaks. Accordingly, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but in a clash of 'old' mature research traditions with each other, leading to contradictions that can only be attenuated in a more general theoretical approach.
In chapter 2 it is contended that Einstein's ingenious approach to the creation of special relativity, which substantially distinguishes him from Lorentz's and Poincaré's invaluable contributions, turns out to be a milestone in the design of reconciling Maxwellian electrodynamics, statistical mechanics and thermodynamics. Special relativity turns out to be grounded in Einstein's breakthrough 1905 light-quantum hypothesis. Finally, the author amends the received view of the genesis of general relativity by stressing that the main reason for Einstein's victory over the rival programmes of Abraham and Nordström was the unificationist character of Einstein's research programme (chapter 3). Rinat M. Nugayev, Ph.D., professor at the Volga Region Academy, Kazan, the Republic of Tatarstan, the Russian Federation.
What are the reasons for the second scientific revolution, which took place at the beginning of the twentieth century? Why did the new physics supersede the old one? The author tries to answer these subtle questions with the help of an epistemological model of scientific revolutions that takes into account some recent advances in the philosophy, sociology and history of science. According to the model, Einstein's revolution took place owing to the resolution of deep contradictions between the basic classical research traditions: Newtonian mechanics, Maxwellian electrodynamics, thermodynamics and statistical mechanics. As a result, two new research programmes, the relativistic and the quantum, were constructed. It was the interaction between them that formed the interdisciplinary context of Einstein's revolution.
What are the reasons for theory change in science? To give a sober answer, a comprehensible model is proposed, based on the works of V.P. Bransky, P. Feyerabend, T.S. Kuhn, I. Lakatos, K.R. Popper, V.S. Shvyrev, Ya. Smorodinsky, V.S. Stepin and others. According to the model, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but in a clash of 'old' basic research traditions with each other, leading to contradictions that can only be eliminated in a more general theory. The model is illustrated with reference to physics in the early 20th century, the three 'old' research programmes in this case being Maxwellian electrodynamics, statistical mechanics and thermodynamics. A modern example, referring to general relativity and quantum field theory, is also considered. Key words: Bransky, Feyerabend, Popper, Kuhn, Lakatos, Stepin, Einstein, Lorentz, Planck.