A Monograph Dealing With Unification In Relation To Dark Energy, Dark Matter, Cosmic Expansion, E=mc2, Quantum Gravity, "Imaginary" Computers, Creation Of The Infinite And Eternal Universe Using Electronic BITS + PI + "Imaginary" Time, Earthly Education, Science-Religion Union, The Human Condition, Superconductivity, Planetary Fields, How Gravitation Can Boost Health, Space-Time Propulsion From The Emdrive To The Brouwer Fixed-Point Theorem, "Light Matter", Etc. These Effects Were Originally Discussed In Several Short Internet Articles.
Table Of Contents:
- Introduction
- Superconductivity And Planetary Magnetic / Electric Fields
- Co-Movement Of Photons And Gravitons
- General Relativity Deletes Dark Energy, Dark Matter And Universal Expansion
- The Relation Of The Higgs Field To Gravity
- Spin Interactions And Making Bosons Or Fermions
- The Final Missing Steps In E=mc2
- What Will Education Be Like In 2049? Learn By Holographic Teachers Using Quantum Mechanics, "Imaginary" Computers And A Unification Of Physics That Will Bring Education To Everyone, Everywhere
- Hypotheses Supporting Gravitation As A Push: (1) M-Sigma, The Non-Fundamental Nuclear Forces (2) Geysers On Saturn's Moon Enceladus (3) Gravity, Falling Bodies (4) Earth's Tides, Astronomical Unit, Cosmic Backgrounds
- A Proposal For The True Human Condition That Reconciles Science With Religion
- Back To The Moon And On To The Stars
- Normalising Patients With Gravitation
This article aims to unify all scientific theories, including the fundamental theories of physics, on the basis of the concept of the "intrinsicality of nature". First, the general property shared by existing natural phenomena, this "intrinsicality of nature", is deduced to be "logicality" and "imperfectness". Then, the same intrinsicality is deduced for science itself, as a basis for unifying all scientific theories. Finally, combining this intrinsicality with the novelties of consciousness, the unification of physical theories, a Theory of Everything, is framed by a physical model of a more fundamental element (MFE), which manages to cover the concepts of quantum mechanics, general relativity, and the physics of consciousness together, so as to test this model of unification. Since the MFE is only a theoretical model, two methods are introduced for its experimental verification: its application to as yet unanswered questions in science is tested, and further verification of the model in a lab by measuring consciousness signals is discussed.
I motivate the concept of styles of scientific investigation, and differentiate two styles, formal and compositional. Styles are ways of doing scientific research, and radically different styles exist. I explore the possibility of the unification of biology and social science, as well as the possibility of unifying the two styles I identify. Recent attempts at unifying biology and social science have been premised almost exclusively on the formal style. Through a historical example of defenders of compositional biological social science, the Ecology Group at the University of Chicago from, roughly, the 1930s to the 1950s, I attempt to show the coherence and possibility, if not utility, of employing the compositional style to effect the synthesis of biology and social science. I also relate the efforts of the Ecology Group to those of investigators in the Sociology Department of the University of Chicago. In my conclusion, I discuss the usefulness both of employing the category of styles of scientific investigation in historical and philosophical studies of science and of employing the concept of compositionality in scientific studies. I end the paper with some tentative suggestions regarding the importance of compositionality for an analysis of human society.
The unification of natural science and social science is a centuries-old, unresolved debate. Natural science has a chronological advantage over social science because the latter took time to include many social phenomena in its fold. The history of science has witnessed quite a number of efforts by social scientists to fit their discipline into a rational, if not mathematical, framework. On the other hand, a tendency has been observed among some physicists, especially since the last century, to recast a number of social phenomena in the mould of events taking place in the physical world and governed by well-known systems and equations of physics. This necessitated the introduction of social physics as a new interdisciplinary subject. Obviously this attempt is aimed at explaining hitherto unsolved or highly debated issues of social science. Physicists are showing special interest in problems of economics, ranging from some topics of normative economics to the movement of prices of derivatives. Statistics has been widely used in these attempts, and at least two sub-disciplines of the subject, namely stochastic processes and time series analysis, deserve special mention. All these research activities gave birth to another interdisciplinary subject, named econophysics. Interestingly, the global financial crisis of 2007–08 revived the need to determine the prices of derivatives more accurately. This article adumbrates a sketch of the theoretical synthesis between physics and economics and the role played by statistics in this process.
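The stochastic-process machinery this abstract refers to can be made concrete with a standard toy example: pricing a European call option by Monte Carlo simulation of geometric Brownian motion. This is a minimal sketch; the model choice and all parameter values are illustrative assumptions of this note, not figures from the article.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the article)
S0, K = 100.0, 105.0           # spot price and strike
r, sigma, T = 0.02, 0.2, 1.0   # risk-free rate, volatility, maturity in years
n_paths = 100_000

rng = np.random.default_rng(0)
# Terminal prices under geometric Brownian motion (risk-neutral measure):
# S_T = S_0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z), with Z ~ N(0, 1)
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# The option price is the discounted expected payoff
payoff = np.maximum(ST - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
print(f"Monte Carlo European call price: {price:.3f}")
```

The "more accurate" derivative pricing the abstract mentions typically replaces this constant-volatility process with richer ones, estimated by exactly the time-series methods the article highlights.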
What were the reasons for the Copernican Revolution? How did modern science (created by a bunch of ambitious intellectuals) manage to force out the old one created by Aristotle and Ptolemy, rooted in millennial traditions and strongly supported by the Church? What deep internal causes and strong social movements took part in the genesis, development and victory of modern science? The author comes to a new picture of the Copernican Revolution on the basis of an elaborated model of scientific revolutions that takes into account some recent advances in the philosophy, sociology and history of science. The model was initially invented to describe Einstein's revolution of the beginning of the twentieth century. The model considers the growth of knowledge as the interaction, interpenetration and unification of research programmes springing out of different cultural traditions. Thus, the Copernican Revolution appears as a result of the revelation and (partial) resolution of a dualism, of the gap between Ptolemy's mathematical astronomy and Aristotelian qualitative physics. The works of Copernicus, Galileo, Kepler and Newton were all stages of the descent of mathematics from the heavens to the earth and the reciprocal extrapolation of terrestrial physics onto the heavens. The model elaborated enables us to reassess the role of some social factors crucial for the scientific revolution. It is argued that initially modern science was a result of the development of the Christian Weltanschauung. Later the main support came from the absolute monarchies. In the long run the creators of modern science appeared to be the "apparatchiks" of the "regime of truth" built into the state machine. Natural science became a part of the ideological state apparatus, providing not only scientific education but also the internalization of values crucial for the functioning of the state.
Cognitive science is an interdisciplinary conglomerate of various research fields and disciplines, which increases the risk of fragmentation of cognitive theories. However, while most previous work has focused on theoretical integration, some kinds of integration may turn out to be monstrous, or result in superficially lumped and unrelated bodies of knowledge. In this paper, I distinguish theoretical integration from theoretical unification, and propose some analyses of theoretical unification dimensions. Moreover, two research strategies that are supposed to lead to unification are analyzed in terms of the mechanistic account of explanation. Finally, I argue that theoretical unification is not an absolute requirement from the mechanistic perspective, and that strategies aiming at unification may be premature in fields where there are multiple conflicting explanatory models.
In this paper, we defend a novel, multidimensional account of representational unification, which we distinguish from integration. The dimensions of unity are simplicity, generality and scope, non-monstrosity, and systematization. In our account, unification is a graded property. The account is used to investigate the issue of how research traditions contribute to representational unification, focusing on embodied cognition in cognitive science. Embodied cognition contributes to unification even if it fails to offer a grand unification of cognitive science. The study of this failure shows that unification, contrary to what defenders of mechanistic explanation claim, is an important mechanistic virtue of research traditions.
A nonstandard viewpoint on quantum gravity is discussed. General relativity and quantum mechanics are to be related as two descriptions of the same thing, much as Heisenberg's matrix mechanics and Schrödinger's wave mechanics merged in contemporary quantum mechanics. From the viewpoint of general relativity, one can search for the generalization of relativity implying the invariance "within – out of" of the same system.
It is often claimed that the greatest value of the Bayesian framework in cognitive science consists in its unifying power. Several Bayesian cognitive scientists assume that unification is obviously linked to explanatory power. But this link is not obvious, as unification in science is a heterogeneous notion, which may have little to do with explanation. While a crucial feature of most adequate explanations in cognitive science is that they reveal aspects of the causal mechanism that produces the phenomenon to be explained, the kind of unification afforded by the Bayesian framework to cognitive science does not necessarily reveal aspects of a mechanism. Bayesian unification, nonetheless, can place fruitful constraints on causal–mechanical explanation.
1 Introduction
2 What a Great Many Phenomena Bayesian Decision Theory Can Model
3 The Case of Information Integration
4 How Do Bayesian Models Unify?
5 Bayesian Unification: What Constraints Are There on Mechanistic Explanation?
5.1 Unification constrains mechanism discovery
5.2 Unification constrains the identification of relevant mechanistic factors
5.3 Unification constrains confirmation of competitive mechanistic models
6 Conclusion
Appendix
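For readers unfamiliar with the information-integration case named in the outline above, the standard Bayesian model is optimal cue combination: two noisy Gaussian estimates of one quantity are merged with inverse-variance weights. A minimal sketch follows; the numbers are hypothetical, and this is the generic textbook model, not code from the paper.

```python
# Bayes-optimal combination of two Gaussian cues (flat prior assumed)
mu_v, var_v = 10.0, 4.0   # e.g. a visual estimate: mean and variance (hypothetical)
mu_h, var_h = 12.0, 1.0   # e.g. a haptic estimate: more reliable here

w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # inverse-variance weight
mu_post = w_v * mu_v + (1 - w_v) * mu_h      # combined estimate
var_post = 1 / (1 / var_v + 1 / var_h)       # always below either cue's variance

print(f"combined estimate: {mu_post:.2f}, variance: {var_post:.2f}")
# -> combined estimate: 11.60, variance: 0.80
```

The model's unifying pull is visible here: the same three lines of algebra apply whether the cues are visual and haptic, visual and auditory, or prior and likelihood, which is exactly the breadth the paper interrogates.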
Predictive processing (PP) has been repeatedly presented as a unificatory account of perception, action, and cognition. In this paper, we argue that this is premature: As a unifying theory, PP fails to deliver general, simple, homogeneous, and systematic explanations. By examining its current trajectory of development, we conclude that PP remains only loosely connected both to its computational framework and to its hypothetical biological underpinnings, which makes its fundamentals unclear. Instead of offering explanations that refer to the same set of principles, we observe systematic equivocations in PP-based models, or outright contradictions with its avowed principles. To make matters worse, PP-based models are seldom empirically validated, and they are frequently offered as mere just-so stories. The large number of PP-based models is thus not evidence of theoretical progress in unifying perception, action, and cognition. On the contrary, we maintain that the gap between theory and its biological and computational bases contributes to the arrested development of PP as a unificatory theory. Thus, we urge the defenders of PP to focus on its critical problems instead of offering mere re-descriptions of known phenomena, and to validate their models against possible alternative explanations that stem from different theoretical assumptions. Otherwise, PP will ultimately fail as a unified theory of cognition.
The Comprehensibility of the Universe puts forward a radically new conception of science. According to the orthodox conception, scientific theories are accepted and rejected impartially with respect to evidence, no permanent assumption being made about the world independently of the evidence. Nicholas Maxwell argues that this orthodox view is untenable. He urges that in its place a new orthodoxy is needed, which sees science as making a hierarchy of metaphysical assumptions about the comprehensibility and knowability of the universe, these assumptions asserting less and less as one ascends the hierarchy. This view has significant implications: that it is part of scientific knowledge that the universe is physically comprehensible; that metaphysics and philosophy are central to scientific knowledge; that science possesses a rational, if fallible, method of discovery; that a new understanding of scientific method and rationality is required. Maxwell argues that this new conception makes possible a natural resolution of long-standing philosophical problems about science, regarding simplicity, induction, and progress. His goal is the reform not just of the philosophy of science but of science itself, and the healing of the rift between the two.
Mathematics clearly plays an important role in scientific explanation. Debate continues, however, over the kind of role that mathematics plays. I argue that if pure mathematical explananda and physical explananda are unified under a common explanation within science, then we have good reason to believe that mathematics is explanatory in its own right. The argument motivates the search for a new kind of scientific case study, a case in which pure mathematical facts and physical facts are explanatorily unified. I argue that it is possible for there to be such cases, and provide some toy examples to demonstrate this. I then identify a potential source of scientific case studies as a guide for future work.
Incommensurability was Kuhn's worst mistake. If it is to be found anywhere in science, it would be in physics. But revolutions in theoretical physics all embody theoretical unification. Far from obliterating the idea that there is a persisting theoretical idea in physics, revolutions do just the opposite: they all actually exemplify the persisting idea of underlying unity. Furthermore, persistent acceptance of unifying theories in physics when empirically more successful disunified rivals can always be concocted means that physics makes a persistent implicit assumption concerning unity. To put it in Kuhnian terms, underlying unity is a paradigm for paradigms. We need a conception of science which represents problematic assumptions concerning the physical comprehensibility and knowability of the universe in the form of a hierarchy, these assumptions becoming less and less substantial and more and more such that their truth is required for science, or the pursuit of knowledge, to be possible at all, as one goes up the hierarchy. This hierarchical conception of science has important Kuhnian features, but also differs dramatically from the view Kuhn expounds in his The Structure of Scientific Revolutions. In this paper, I compare and contrast these two views in a much more detailed way than has been done hitherto. I show how the hierarchical view can be construed to emerge from Kuhn's view as it is modified to overcome objections. I argue that the hierarchical conception of science is to be preferred to Kuhn's view.
Summary: Throughout the history of science, indeed throughout the history of knowledge, unification has been touted as a central aim of intellectual inquiry. We've always wanted to discover not only numerous bare facts about the universe, but also to show how such facts are linked and interrelated. Large amounts of time and effort have been spent trying to show that diverse arrays of things can be seen as different manifestations of some common underlying entities or properties. Thales is said to have originated philosophy and science with his declaration that everything was, at base, a form of water. Plato's theory of the forms was thought to be a magnificent accomplishment because it gave a unified solution to the separate problems of the relation between knowledge and belief, the grounding of objective values, and how continuity is possible amid change. Pasteur made numerous medical advancements possible by demonstrating the interconnection between microorganisms and human disease symptoms. Many technological advances were aided by Maxwell's showing that light is a kind of electromagnetic radiation. The attempt to unify the various known forces is often referred to as "The Holy Grail" of physics. Some philosophers have even suggested that providing explanations is itself just a sort of unifying of our knowledge. But while unification (like simplicity) has often been hailed as a tremendous virtue in science, the meaning of the term is not altogether clear. Scientists often don't specify what, precisely, they mean by unification. And in cases where what they mean is clear, different thinkers plainly mean different things by the term. What are the various senses of unification, and why has unification been such an important aim in the history of inquiry?
According to Michael Friedman's theory of explanation, a law X explains laws Y1, Y2, …, Yn precisely when X unifies the Y's, where unification is understood in terms of reducing the number of independently acceptable laws. Philip Kitcher criticized Friedman's theory but did not analyze the concept of independent acceptability. Here we show that Kitcher's objection can be met by modifying an element in Friedman's account. In addition, we argue that there are serious objections to the use that Friedman makes of the concept of independent acceptability.
Given the diversity of explanatory practices that is typical of the sciences, a healthy pluralism would seem to be desirable where theories of explanation are concerned. Nevertheless, I argue that explanations are only unifying in Kitcher's unificationist sense if they are backed by the kind of understanding of underlying mechanisms, dispositions, constitutions, and dependencies that is central to a causalist account of explanation. This case can be made through analysis of Kitcher's account of the conditions under which apparent improvements in unifying power may be judged spurious. But to clarify what is at issue I consider an archaeological case in which debate about the merits of an ambitious explanatory account reproduces exactly the intuitions that divide Salmon and Kitcher. The case in question is the "demic-diffusion" account of contemporary linguistic diversity advanced by Renfrew in the late 1980s: the thesis that the diffusion of agricultural populations, itself attributed to demographic pressure, was responsible for the spread of the ancestral root languages (e.g., proto-Indo-European) that account for the existence and distribution of linguistic macrofamilies. The credibility of this powerfully unifying argument pattern depends entirely on the plausibility of its claims about the conditions and mechanisms actually responsible for the explanandum, the spread of agriculture, and not on an elaboration of its unificationist virtues.
The widespread mistrust of metaphysics, the main obstacle to the unification of physics and philosophy, is based on the myth that metaphysical claims cannot be falsified or verified, because they are supposedly true independently of empirical knowledge. This is not true of metaphysical naturalism, whose approach is to critically reflect on the theories and findings of all the empirical disciplines and abstract from them a theory about such general features of reality that no single empirical discipline can be the authority on. Causation is such a feature, since its instances include anything from planetary motions to particle interactions, chemical reactions, biological functions, and closing a door. Consequently, a general account of causation is beyond any particular empirical discipline. Metaphysical naturalism takes a meta-perspective on the results of the empirical sciences and attempts to figure out how they fit together in a coherent whole, e.g. by offering a general account of causation. General accounts of this kind are falsifiable in so far as the theories from which they are generalizations are falsifiable. The paper also discusses some fundamental metaphysical principles implicitly assumed by the sciences generally, and why they imply that unification is methodologically virtuous.
The study of color expanded rapidly in the 20th century. With this expansion came fragmentation, as philosophers, physicists, physiologists, psychologists, and others explored the subject in vastly different ways. There are at least two ways in which the study of color became contentious. The first was with regard to the definitional question: what is color? The second was with the location question: are colors inside the head or out in the world? In this chapter, we summarize the most prominent answers that color scientists and philosophers gave to the definitional and location questions in the 20th century. We identify some of the different points at which their work intersected, as well as the most prominent schisms between them. One overarching theme of the chapter is the surprising proliferation of different views on color. Whereas some assume that progress in science must take the form of convergence, the 20th century history of color exhibited a marked divergence in views. This chapter leaves it an open question whether an ultimate unification of views is possible, or whether the only thing that ties together the study of "color" is the shared inheritance of a word.
It is discerned what light the recent historical reconstructions of the Maxwellian unification of optics and electromagnetism can shed on the following philosophical/methodological questions. I. Why should one believe that Nature is ultimately simple and that unified theories are more likely to be true? II. What does it mean to say that a theory is unified? III. Why should theory unification be an epistemic virtue? To answer these questions, the genesis and development of Maxwellian electrodynamics are elucidated. It is argued that the Maxwellian Revolution is a far more complicated phenomenon than it may seem in the light of Kuhnian and Lakatosian epistemological models. Correspondingly, it is maintained that Maxwellian electrodynamics was elaborated in the course of the reconciliation of the old pre-Maxwellian programmes: the electrodynamics of Ampère-Weber, the wave theory of Young-Fresnel, and Faraday's programme. To compare the different theoretical schemes springing from the different language games, James Maxwell constructed a peculiar neutral language. Initially it encompassed incompressible-fluid models; eventually, molecular-vortex ones. The encounter of the three programmes engendered the construction of a hybrid theory, at first with an irregular set of theoretical schemes. However, step by step, by revealing and gradually eliminating the contradictions between the programmes involved, the hybrid set is "put into order" (Maxwell's term). A hierarchy of theoretical schemes is set up, ranging from ingenious crossbreeds (the displacement current) up to usual hybrids. After the construction of the displacement current, the interpenetration of the pre-Maxwellian programmes begins, marking the commencement of the real unification of the theoretical schemes of optics, electricity and magnetism. Maxwell's programme surpassed that of Ampère-Weber because it absorbed the ideas of the Ampère-Weber programme, as well as the presuppositions of the programmes of Young-Fresnel and Faraday, properly co-ordinating them with each other. But the opposite statement is not true: the Ampère-Weber programme did not assimilate the propositions of the Maxwellian programme. Maxwell's victory over his rivals became possible because the gist of Maxwell's unification strategy was formed by Kantian epistemology seen in the light of William Whewell and such representatives of the Scottish Enlightenment as Thomas Reid and Sir William Hamilton. Maxwell put forward as basic synthetic principles ideas that radically differed from those of the Ampère-Weber approach by their open, flexible, contra-ontological and genuinely epistemological, Kantian character. For Maxwell, ether was not the ultimate building block of physical reality, from which all the charges and fields should be constructed. "Action at a distance", "incompressible fluid", "molecular vortices", etc. were contrived analogies for Maxwell, capable only of directing the researcher at the "right" mathematical relations. Key words: J.C. Maxwell, unification of optics and electromagnetism, I. Kant, T. Reid, W. Hamilton.
Speaking for God has been part of religion for many years. However, science has come in the past few years to question that role, or even our very ability to speak about God in general. My goal is to show that dogmatism, under any form, is wrong. And even though dogmatism had for a long time been associated with ill-intentioned religion, nowadays science has replaced religion on the throne of doctrinaire thinking. The point of the paper is to illustrate that one-way thinking is never correct: most of the time a combination of science and religion, measurements and theoretical thinking, logic and intuition, is required to draw a conclusion about the most important philosophical questions. The paper establishes that the exact sciences can be very useful, but they also have limits. The Religion-vs-Science problem is a pseudo-problem; logic and evidence can easily be used to defend theistic views. Both science and religion use common tools and methods and can be unified in a new way of thinking. This paper sets the foundations for how this can be achieved. The conclusion is that science and religion both complete our knowledge of the world, our understanding of humans, and our purpose in life. Speaking about God is part of science as well as of religion. Only when we think of God as theologians and as scientists at the same time can we fully reach Him.
In this paper I investigate unification as a virtue of explanation. In the first part of the paper I give a brief exposition of the unification account of Schurz and Lambert and of Schurz. I illustrate the advantages of this account in comparison to the older unification accounts of Friedman and Kitcher. In the second part I discuss several comments and objections to the Schurz-Lambert account that were raised by Weber and van Dyck, Gijsbers, and de Regt. In the third and final part, I argue that explanation should be understood as a prototype concept which contains nomic expectability, causality and unification as prototypical virtues of explanations, although none of these virtues provides a sufficient and necessary "defining condition" of explanation.
The origins of the Copernican Revolution that led to the genesis of modern science can be explained only by the joint influence of external and internal factors. The author tries to take this influence into account with the help of his own growth-of-knowledge model, according to which the growth of science consists in the interaction, interpenetration and unification of various scientific research programmes springing from different cultural milieux. The Copernican Revolution consisted in the revelation and elimination of the gap between Ptolemy's mathematical astronomy and Aristotelian qualitative physics. But the very realization of the gap between physics and astronomy appeared to be possible because, at least at its first stages, modern science was a result of the development of the Christian Weltanschauung, with its aspiration to eliminate pagan components. Of all the external factors, religion was the strongest one. Key words: scientific revolution, Christian Weltanschauung, modernity, Copernicus, Ptolemy.
This article is a commentary on Machery (2009), Doing without Concepts. Concepts are mental symbols that have semantic structure and processing structure. This approach (1) allows for different disciplines to converge on a common subject matter; (2) promotes theoretical unification; and (3) accommodates the varied processes that preoccupy Machery. It also avoids problems that go with his eliminativism, including the explanation of how fundamentally different types of concepts can be co-referential.
This article had its beginning with Einstein's 1919 paper "Do gravitational fields play an essential role in the structure of elementary particles?" Together with General Relativity's statement that gravity is not a pull but a push caused by the curvature of space-time, a hypothesis for Earth's ocean tides was developed that does not solely depend on the Sun and Moon as Kepler and Newton believed. It also borrows from Galileo. The breakup of planets and asteroids by white dwarfs, neutron stars or black holes is popularly ascribed by today's science to tidal forces (gravitation emanating from the stellar body and having a greater effect on the near side of a planet or asteroid than on the far side). Remembering Einstein's 1919 paper, it was apparent that my revised idea of tidal forces improves on current accounts because it views matter and mass as unified with space-time, whose curvature is gravitation. Unification is a necessity for modern science's developing view of one united and entangled universe, expressed in the Unified Field Theory, the Theory of Everything, string theory and Loop Quantum Gravity. The writing of this article was also assisted by visualizing the gravitational fields forming space-time as being themselves formed by a multitude of weak and presently undetectable gravitational waves. The final part of this article concludes that the section BITS AND TOPOLOGY will lead to the conclusions in ETERNAL LIFE, WORLD PEACE AND PHYSICS' UNIFICATION. The final part also compares cosmology to biological enzymes and biology's substrate of reacting "chemicals", using virtual particles, hidden variables, gravitation, electromagnetism, electronics' binary digits, plus topology's Möbius strip and figure-8 Klein bottle. The product is mass; enzyme, substrate and product are all considered mathematical in nature. Also, gravitation and electromagnetism are united using logic and topology, showing there's no need in this article for things like mathematical formalism, field equations or tensor calculus.
It is argued that the origins of modern science can be revealed through a joint account of external and internal factors. The author tries to keep this in mind in applying his scientific-revolution model, according to which the growth of knowledge consists in the interaction, interpenetration and even unification of different scientific research programmes. Hence the Copernican Revolution as a matter of fact consisted in the realization and elimination of the gap between mathematical astronomy and Aristotelian qualitative physics in Ptolemaic cosmology. Yet the very realization of the contradictions became possible because, at its first stages, European science was a result of the evolution of the Christian Weltanschauung, with its gradual elimination of pagan components. Key words: modern European science, Christian Weltanschauung.
The model of scientific revolution genesis and structure, extracted from Einstein's revolution and considered in my previous publications, is applied to the Copernican one. According to the model, the origins of Einstein's revolution can be understood through the occurrence and partial resolution of the contradictions between the main rival classical physics research programmes: Newtonian mechanics, Maxwellian electrodynamics, thermodynamics, and Boltzmann's statistical mechanics. In general the growth of knowledge consists in the interaction, interpenetration and even unification of different scientific research programmes. It is argued that the Copernican revolution also happened due to the realization of a certain dualism, now between mathematical astronomy and Aristotelian qualitative physics in Ptolemy's cosmology, and the corresponding efforts to eliminate it. The works of Copernicus, Galileo, Kepler and Newton were all stages of the descent of mathematics from the heavens to the earth and the reciprocal extrapolation of terrestrial physics onto divine phenomena. Yet the very realization of the gap between physics and astronomy appeared to be possible because, at least at its first stages, modern science was a result of the development of the Christian Weltanschauung, with its aspiration to eliminate pagan components. Key words: scientific revolution, modernity, Christian Weltanschauung, Copernicus, Ptolemy.
Husserl (a mathematician by education) left behind a few famous and notable philosophical "slogans" along with his innovative doctrine of phenomenology, directed at transcending "reality" into a more general essence underlying both "body" and "mind" (after Descartes) and sometimes called "ontology" (terminologically following his notorious assistant Heidegger). Husserl's tradition can thus be tracked as an idea for philosophy to be reinterpreted in a way that is both generalized and mathematizable in the final analysis. The paper offers a pattern borrowed from the theory of information and quantum information (thereby relating philosophy to both mathematics and physics) to formalize logically a few key concepts of Husserl's phenomenology, such as "epoché" and the "eidetic, phenomenological, and transcendental reductions", as well as the identification of the "phenomenological, transcendental, and psychological reductions" in a way allowing for that identification to be continued to the "eidetic reduction" (and thus to mathematics). The approach is tested on an independent and earlier idea of Husserl's, "logical arithmetic" (implemented in parallel in mathematics by Whitehead and Russell's Principia), as which "Hilbert arithmetic", generalizing Peano arithmetic, is interpreted. A basic conclusion states that the unification of philosophy, mathematics, and physics in their foundations and fundamentals is the Husserl tradition, both tracked to its origin (in being itself, after Heidegger, or after Husserl's "zu den Sachen selbst") and embodied in the development of human cognition in the third millennium.
We discuss Russell's 1913 essay arguing for the irrelevance of the idea of causation to science and its elimination from metaphysics as a precursor to contemporary philosophical naturalism. We show how Russell's application raises issues now receiving much attention in debates about the adequacy of such naturalism, in particular, problems related to the relationship between folk and scientific conceptual influences on metaphysics, and to the unification of a scientifically inspired worldview. In showing how to recover an approximation to Russell's conclusion while explaining scientists' continuing appeal to causal ideas (without violating naturalism by philosophically correcting scientists) we illustrate a general naturalist strategy for handling problems around the unification of sciences that assume different levels of naïveté with respect to folk conceptual frameworks. We do this despite rejecting one of the premises of Russell's argument, a version of reductionism that was scientifically plausible in 1913 but is not so now.
This paper presents the unification of all knowledge and the framework of all sciences, and with them the theory of consciousness, the method to measure consciousness, and the three keys of Strong AI. "Logicality and non-absoluteness" is found to be the intrinsicality of nature, whereby the "Fundamental Law of Nature" is discovered. Then, the "general methodology of research" and the "model system of nature" are developed to explain everything, especially consciousness. The Coupling Theory of Consciousness holds that nature has MFEs (more fundamental elements) that couple to appear as matter and consciousness, and that the brain has a CNS (central nervous system)-independent consciousness system that couples to give off consciousness signals, with three excitation levels appearing as memory, sub-consciousness, and subjective consciousness. As consciousness does not come from neurons, the method to measure consciousness via NCC (neural correlates of consciousness) is to find participants with the same kinds of consciousness systems, and have them produce parallel "conscious information processes" to filter out the "noise" neural activities and obtain the "real" NCC data. Examining the property of consciousness may advance not only neuroscience and physics, but also cognitive science and AI technology. First, the "conscious information processing" model has shown that Strong AI should process beyond-domain concepts with a self-controlled intention; second, as the "general methodology of research" is just another name for the "principle of cognition", we already know the running style of Strong AI. Hopefully, we can develop Strong AI technology to upgrade human civilization very soon.
Press release. The ebook entitled Einstein's Revolution: A Study of Theory-Unification gives students of physics and philosophy, and general readers, an epistemological insight into the genesis of Einstein's special relativity and its further unification with other theories, which culminated in the construction of general relativity. The book was developed by Rinat Nugayev, who graduated from the Kazan State University relativity department, took his M.Sci. at the Moscow State University department of philosophy of science, and his Ph.D. at the Moscow Institute of Philosophy, Russian Academy of Sciences. He has forty years of teaching and research experience in the philosophy of science and relativistic astrophysics, reflected in more than 200 papers in the scientific journals of Russia, Ukraine, Belorussia, the USA, Great Britain, Germany, Spain, Italy, Sweden, Switzerland, the Netherlands, Canada, Denmark, Poland, Romania, France, Greece, Japan and some other countries, and 8 monographs. Revolutions in physics all embody theoretical unification. Hence the overall aim of the present book is to unfold Einstein's unificationist modus operandi, the hallmarks of Einstein's actual methodology of unification that engendered his 1905 special relativity as well as his 1915 general relativity. To achieve this object, a lucid epistemic model is expounded, aimed at analysing the reasons for mature theory change in science (chapter 1). According to the model, scientific revolutions were due not to the fanciful creation of new ideas ex nihilo, but rather to the long-term processes of reconciliation, interpenetration and intertwinement of "old" research traditions preceding such breaks. Accordingly, the origins of scientific revolutions lie not in a clash of fundamental theories with facts, but of "old" mature research traditions with each other, leading to contradictions that can only be attenuated in a more general theoretical approach. In chapter 2 it is contended that Einstein's ingenious approach to the creation of special relativity, substantially distinguishing his contribution from Lorentz's and Poincaré's invaluable impacts, turns out to be a milestone of the design for reconciling Maxwellian electrodynamics, statistical mechanics and thermodynamics. Special relativity turns out to be grounded on Einstein's breakthrough 1905 light-quantum hypothesis. Eventually the author amends the received view on the genesis of general relativity by stressing that the main reason for Einstein's victory over the rival programmes of Abraham and Nordström was the unificationist character of Einstein's research programme (chapter 3). Rinat M. Nugayev, Ph.D., professor of Volga Region Academy, Kazan, the Republic of Tatarstan, the Russian Federation.
A scientific theory, in order to be accepted as a part of theoretical scientific knowledge, must satisfy both empirical and non-empirical requirements, the latter having to do with simplicity, unity, explanatory character, symmetry, beauty. No satisfactory, generally accepted account of such non-empirical requirements has so far been given. Here, a proposal is put forward which, it is claimed, makes a contribution towards solving the problem. This proposal concerns unity of physical theory. In order to satisfy the non-empirical requirement of unity, a physical theory must be such that the same laws govern all possible phenomena to which the theory applies. Eight increasingly demanding versions of this requirement are distinguished. Some implications for other non-empirical requirements, and for our understanding of science, are indicated.
This chapter examines the status of inference to the best explanation in naturalistic metaphysics. The methodology of inference to the best explanation in metaphysics is studied from the perspective of contemporary views on scientific explanation and explanatory inferences in the history and philosophy of science. This reveals serious shortcomings in prevalent attempts to vindicate metaphysical "explanationism" by reference to similarities between science and naturalistic metaphysics. This critique is brought out by considering a common gambit of methodological unity: (1) Both metaphysics and science employ inference to the best explanation. (2) One has no reason to think that if explanationism is truth-conducive in science, it is not so in metaphysics. (3) One has a positive reason to think that if explanationism is truth-conducive in science, it is also so in metaphysics.
An account of distinctively mathematical explanation (DME) should satisfy three desiderata: it should account for the modal import of some DMEs; it should distinguish uses of mathematics in explanation that are distinctively mathematical from those that are not (Baron [2016]); and it should also account for the directionality of DMEs (Craver and Povich [2017]). Baron's (forthcoming) deductive-mathematical account, because it is modelled on the deductive-nomological account, is unlikely to satisfy these desiderata. I provide a counterfactual account of DME, the Narrow Ontic Counterfactual Account (NOCA), that can satisfy all three desiderata. NOCA appeals to ontic considerations to account for explanatory asymmetry and ground the relevant counterfactuals. NOCA provides a unification of the causal and the non-causal, the ontic and the modal, by identifying a common core that all explanations share and in virtue of which they are explanatory.
The host of SUSY (supersymmetry) based string theories is considered. Superstrings are comprehended as possible candidates for the basic objects of quantum gravity. It is argued that superstring theories constitute mainly mathematical progress and can, at best, reconcile general relativity with quantum field theory. Yet they cannot provide the genuine synthesis. The superstring unification of all four forces at hand is a formal one. It is contended that the genesis and proliferation of superstrings can be better described not by philosophy of science models but in terms of the modern sociology of science. The formal character of the fusion of gravity and quantum fields and the lack of experimental verification make the transition to superstring theories ad hoc by Lakatosian standards. A possible way of explanation is proposed, based on the social-interests conception of Andrew Pickering. Key words: superstrings, unification, Kaluza-Klein, sociology of science.
The chapter discusses the principle of conservatism and traces how the general principle is related to the specific one. This tracing suggests that the principle of conservatism needs to be refined. Connecting the principle in cognitive science to more general questions about scientific inference also allows us to revisit the question of realism versus instrumentalism. The framework deployed in model selection theory is very general; it is not specific to the subject matter of science. The chapter outlines some non-Bayesian ideas that have been developed in model selection theory. The principle of conservatism, like C. Lloyd Morgan's canon, describes a preference concerning kinds of parameters. It says that a model that postulates only lower-level intentionality is preferable to one that postulates higher-level intentionality if both fit the data equally well. The model selection approach to parsimony helps explain why unification is a theoretical virtue.
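One standard non-Bayesian model-selection idea of the kind the chapter draws on is the Akaike Information Criterion (AIC), which rewards fit but penalizes extra parameters, giving "equally good fit with fewer parameters" a precise sense. Here is a toy sketch; the data, the polynomial models, and the use of AIC specifically are assumptions of this example, not the chapter's own case.

```python
import numpy as np

# Toy data drawn from a noisy straight line (hypothetical)
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 30)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.3, x.size)

def aic_least_squares(y, y_hat, k):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k (up to an additive constant)."""
    n = y.size
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * k

# A lean model (degree 1) and a profligate one (degree 5); lower AIC is preferred
for degree in (1, 5):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    print(f"degree {degree}: AIC = {aic_least_squares(y, y_hat, degree + 1):.2f}")
```

The penalty term 2k is what formalizes the preference for models that postulate fewer kinds of parameters, the analogue of preferring lower-level to higher-level intentionality when both fit equally well.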
Philosophers of science have examined The Theory of Island Biogeography by Robert MacArthur and E. O. Wilson (1967) mainly due to its important contribution to modeling in ecology, but they have not examined it as a representative case of ecological explanation. In this paper, I scrutinize the type of explanation used in this paradigmatic work of ecology. I describe the philosophy of science of MacArthur and Wilson and show that it is mechanistic. Based on this account and in light of contributions to the mechanistic conception of explanation due to Craver (2007), and Bechtel and Richardson (1993), I argue that MacArthur and Wilson use a mechanistic approach to explain the species-area relationship. In light of this examination, I formulate a normative account of mechanistic explanation in ecology. Furthermore, I argue that it offers a basis for methodological unification of ecology and solves a dispute on the nature of ecology. Lastly, I show that proposals for a new paradigm of biogeography appear to maintain the norms of mechanistic explanation implicit in The Theory of Island Biogeography.
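For orientation, the species-area relationship that MacArthur and Wilson's theory addresses is standardly written as a power law; this canonical form is supplied here as background, since the abstract itself does not state it.

```latex
% Canonical power-law form of the species-area relationship:
% S = number of species, A = island area, c and z = fitted constants
% (for islands, z is often reported in roughly the 0.2-0.35 range)
\[
  S = c\,A^{z}
  \qquad\Longleftrightarrow\qquad
  \log S = \log c + z \log A
\]
```

The mechanistic reading defended in the paper concerns what produces this regularity (immigration and extinction dynamics), not the curve-fitting itself.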
This paper argues that there is no good reason to suppose that the current physical laws represent the end of the road for science. Taking due account of experience, and especially mystical experience, may lead to an extension of science involving a synthesis of scientific and spiritual knowledge.
This paper intends to further the understanding of the formal properties of (higher-order) vagueness by connecting theories of (higher-order) vagueness with more recent work in topology. First, we provide a "translation" of Bobzien's account of columnar higher-order vagueness into the logic of topological spaces. Since columnar vagueness is an essential ingredient of her solution to the Sorites paradox, a central problem of any theory of vagueness comes into contact with the modern mathematical theory of topology. Second, Rumfitt's recent topological reconstruction of Sainsbury's theory of prototypically defined concepts is shown to lead to the same class of spaces that characterize Bobzien's account of columnar vagueness, namely, weakly scattered spaces. Rumfitt calls these spaces polar spaces. They turn out to be closely related to Gärdenfors' conceptual spaces, which have come to play an ever more important role in cognitive science and related disciplines. Finally, Williamson's "logic of clarity" is explicated in terms of a generalized topology ("locology") that can be considered an alternative to standard topology. Arguably, locology has some conceptual advantages over topology with respect to the conceptualization of a boundary and a borderline. Moreover, in Williamson's logic of clarity, vague concepts are (stably) columnar with respect to a locologically inspired notion of a "slim boundary". Thus, Williamson's logic of clarity also exhibits a certain affinity for columnar vagueness. In sum, a topological perspective is useful for a conceptual elucidation and unification of central aspects of a variety of contemporary accounts of vagueness.
The no-miracles argument (Putnam, 1975) holds that science is successful because successful theories are (approximately) true. Frost-Arnold (2010) objects that this argument is unacceptable because it generates neither new predictions nor unifications. It is similar to the unacceptable explanation that opium puts people to sleep because it has a dormitive virtue. I reply that on close examination, realism explains not only why some theories are successful but also why successful theories exist in current science. Therefore, it unifies the disparate phenomena.
WikiSilo is a tool for theorizing across interdisciplinary fields such as Cognitive Science, and provides a vocabulary for talking about the problems of doing so. It can be used to demonstrate that a particular cognitive theory is complete and coherent at multiple levels of discourse, and commensurable with and relevant to a wider domain of cognition. WikiSilo is also a minimalist theory and methodology for effectively doing science. WikiSilo is simultaneously similar to and distinct from, as well as integrated with and separated from, Wikipedia™. This paper will introduce the advantages of WikiSilo for use in the Cognitive Sciences.
It is usually accepted that deductions are non-informative and monotonic, inductions are informative and nonmonotonic, abductions create hypotheses but are epistemically irrelevant, and both deductions and inductions can't provide new insights. In this article, I attempt to provide a more cohesive view of the subject with the following hypotheses: (1) the paradigmatic examples of deductions, such as modus ponens and hypothetical syllogism, are not inferential forms, but coherence requirements for inferences; (2) since any reasoner aims to be coherent, any inference must be deductive; (3) a coherent inference is an intuitive process where the premises should be taken as sufficient evidence for the conclusion, which in turn should be viewed as necessary evidence for the premises in some modal range; (4) inductions, properly understood, are abductions, but there are no abductions beyond the fact that in any inference the conclusion should be regarded as necessary evidence for the premises; (5) monotonicity is not only compatible with the retraction of past inferences given new information, but is a requirement for it; (6) this explanation of inferences holds true for discovery processes, predictions and trivial inferences.
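Hypothesis (1) treats modus ponens and hypothetical syllogism as coherence requirements rather than inference forms; as formal objects, though, they are one-line derivations, rendered here in Lean purely for illustration (the rendering is this note's, not the article's).

```lean
-- Modus ponens: from p → q and p, conclude q
example (p q : Prop) (hpq : p → q) (hp : p) : q :=
  hpq hp

-- Hypothetical syllogism: from p → q and q → r, conclude p → r
example (p q r : Prop) (hpq : p → q) (hqr : q → r) : p → r :=
  fun hp => hqr (hpq hp)
```

The article's point can then be put sharply: what a proof checker verifies is not the psychological act of inferring but the coherence constraint that any such act must respect.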
This article had its start with another article concerned with measuring the speed of gravitational waves: "The Measurement of the Light Deflection from Jupiter: Experimental Results" by Ed Fomalont and Sergei Kopeikin (2003), The Astrophysical Journal 598 (1): 704–711. This starting point led to many other topics that required explanation or naturally seemed to follow on: unification of gravity with electromagnetism and the two nuclear forces, the speed of electromagnetic waves, the energy of cosmic rays and UHECRs, digital string theory, dark energy + gravity + binary digits, cosmic strings and wormholes from figure-8 Klein bottles, massless and massive photons and gravitons, inverse square + quantum entanglement = God + evolution, binary digits projected to make Prof. Greene's cosmic holographic movie, renormalization of infinity, colliding subuniverses, unifying cosmic inflation, TOE (emphasizing "EVERYthing") = Bose-Einstein renormalized. The text also addresses (in a nonmathematical way) the wavelength of electromagnetic waves, the frequency of gravitational waves, gravitational and electromagnetic waves having identical speed, the gamma-ray burst designated GRB 090510, the smoothness of space, and includes these words: "Gravity produces electromagnetism. Retrocausally (by means of humans travelling into the past of this subuniverse with their electronics), this 'Cosmic EM Background' produces base-2 mathematics, which produces gravity. EM interacts with gravity to produce particles, mass - gravity/EM could be termed 'the Higgs field' - and the nuclear forces associated with those particles. It makes gravity using BITS that copy the principle of magnetism attracting and repelling, before pasting it into what we call the strong force and dark energy."
This presentation discusses a notion encountered across disciplines and in different facets of human activity: autonomous activity. We engage it in an interdisciplinary way. We start by considering the reactions and behaviors of biological entities to biotechnological intervention. An attempt is made to characterize the degree of freedom of embryos and clones, which show openness to different outcomes when the epigenetic developmental landscape is factored in. We then consider the claim made in programming and artificial intelligence that automata could show self-directed behavior in the determination of their step-wise decisions on courses of action. This question remains largely open and calls for some important qualifications. We try to make sense of the presence of claims of freedom in agency, first in common sense, then by ascribing developmental plasticity in biology and biotechnology, and in the mapping of programmed systems in the presence of environmental cues and self-referenced circuits as well as environmental coupling. This is the occasion to recall attempts at working out a logical and methodological approach to the openness of concepts that are still to be found, and to assess whether they can provide the structuring intelligibility of a yet undeveloped or underdeveloped field of study, where a "bisociation" and a unification of knowledge might be possible.
Can there be knowledge and rational belief in the absence of a rational degree of confidence? Yes, and cases of "mistuned knowledge" demonstrate this. In this paper we leverage this normative possibility in support of advancing our understanding of the metaphysical relation between belief and credence. It is generally assumed that a Lockean metaphysics of belief that reduces outright belief to degrees of confidence would immediately effect a unification of coarse-grained epistemology of belief with fine-grained epistemology of confidence. Scott Sturgeon has suggested that the unification is effected by understanding the relation between outright belief and confidence as an instance of the determinable-determinate relation. But determination of belief by confidence would not by itself yield the result that norms for confidence carry over to norms for outright belief unless belief and high confidence are token identical. We argue that this token-identity thesis is incompatible with the neglected phenomenon of "mistuned knowledge": knowledge and rational belief in the absence of rational confidence. We contend that there are genuine cases of mistuned knowledge and that, therefore, epistemological unification must forego token identity of belief and high confidence. We show how partial epistemological unification can be secured given determination of outright belief by degrees of confidence even without token-identity. Finally, we suggest a direction for the pursuit of thoroughgoing epistemological unification.
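The Lockean reduction under discussion can be stated as a simple threshold rule: outright belief just is credence at or above some threshold t. A toy sketch follows; the threshold value is an arbitrary illustrative choice, and nothing in the paper fixes it.

```python
LOCKEAN_THRESHOLD = 0.9  # arbitrary illustrative value of t

def outright_belief(credence: float, t: float = LOCKEAN_THRESHOLD) -> bool:
    """Lockean thesis: believing p outright just is having credence(p) >= t."""
    return credence >= t

# On this reduction, norms on credence would seem to carry over to belief;
# the paper's "mistuned knowledge" cases are meant to block exactly that step.
print(outright_belief(0.95))  # True
print(outright_belief(0.60))  # False
```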
The subject of this paper is Charles Morris' semiotic theory, which has as one of its major projects the unification of all sciences of signs. Since this project has proven unsuccessful, we try to examine here the reasons that led to this. Accordingly, we argue that, to transcend the particularities of the individual disciplines he wanted to unify, Morris had to make certain ontological assumptions, instead of theoretical and methodological ones, that they could share. However, because the 'sign' as an ontological category could, in our view, only be established if we follow the principles of the pragmatic philosophical tradition, we try to show that the reasons for this failure should primarily be sought in the different effects that consistent application of pragmatic principles has in each of them (primarily in linguistics and the philosophy of language). On the other hand, this should enable us to draw several important conclusions regarding Morris' project: namely, that his failure need not mean giving up semiotics as a potentially key discipline in approaching some fundamental philosophical problems, but also that it would demand a return to the original semiotics developed in Peirce's works.
Inquiry is an aim-directed activity, and as such governed by instrumental normativity. If you have reason to figure out a question, you have reason to take means to figuring it out. Beliefs are governed by epistemic normativity. On a certain pervasive understanding, this means that you are permitted – maybe required – to believe what you have sufficient evidence for. The norms of inquiry and epistemic norms both govern us as agents in pursuit of knowledge and understanding, and, on the surface, they do so in harmony. Recently, however, Jane Friedman (2020) has pointed out that they are in tension with each other. In this paper, I aim to resolve this tension by showing that reasons for acts of inquiry – zetetic reasons – and epistemic reasons for belief can both be understood as flowing from the same general normative principle: the transmission principle for instrumental reasons. The resulting account is a version of epistemic instrumentalism that offers an attractive unity between zetetic and epistemic normativity.
Unity of science was once a very popular idea among both philosophers and scientists. But it has fallen out of fashion, largely because of its association with reductionism and the challenge from multiple realisation. Pluralism and the disunity of science are the new norm, and higher-level natural kinds and special science laws are considered to have an important role in scientific practice. What kind of reductionism does multiple realisability challenge? What does it take to reduce one phenomenon to another? How do we determine which kinds are natural? What is the ontological basis of unity? In this Element, Tuomas Tahko examines these questions from a contemporary perspective, after a historical overview. The upshot is that there is still value in the idea of a unity of science. We can combine a modest sense of unity with pluralism and give an ontological analysis of unity in terms of natural kind monism. This title is available as Open Access on Cambridge Core.
The present crisis of foundations in Fundamental Science is manifested as a comprehensive conceptual crisis: a crisis of understanding, a crisis of interpretation and representation, a crisis of methodology, a loss of certainty. Fundamental Science has "rested" on the understanding of matter, space, the nature of the "laws of nature", fundamental constants, number, time, information, consciousness. The question "What is fundamental?" pushes the mind to other questions → Is Fundamental Science fundamental? → What is the most fundamental in the Universum?.. Physics, do not be afraid of Metaphysics! Levels of fundamentality. Problem No. 1 of Fundamental Science is the ontological justification (basification) of mathematics. To understand is to "grasp" Structure ("La Structure mère"). Key ontological ideas for emerging from the crisis of understanding: the total unification of matter across all levels of the Universum, one ontological superaxiom, one ontological superprinciple. The ontological construction method of the knowledge basis (framework, carcass, foundation). The triune (absolute, ontological) space of eternal generation of new structures and meanings. The super-concept of the scientific world picture of the Information era: Ontological (structural, cosmic) memory as the "soul of matter", the measure of the being of the Universum as a holistic generating process. The result of the ontological construction of the knowledge basis: the primordial (absolute) generating structure is the most fundamental in the Universum.
This paper presents the evolution of gravitational tests from an epistemological perspective framed by Imre Lakatos's concept of rational reconstruction, based on his methodology of research programmes. Unlike other works on the same subject, the period evaluated is very extensive, starting with Newton's natural philosophy and extending up to the quantum gravity theories of today. In order to explain in a more rational way the complex evolution of the concept of gravity over the last century, I propose a natural extension of Lakatos's methodology of research programmes, which I then use throughout the paper. I believe that this approach offers a new perspective on how the concept of gravity, and the methods of testing each theory of gravity through observations and experiments, evolved over time. I argue, based on the methodology of research programmes and the studies of scientists and philosophers, that the current theories of quantum gravity are degenerative, due to the lack of experimental evidence over a long period of time and to self-immunization against the possibility of falsification. Moreover, a methodological current is developing that assigns a secondary, unimportant role to verification through observations and/or experiments. For this reason, it will not be possible to have a complete theory of quantum gravity in its current form, one that includes general relativity in the appropriate limit, since physical theories have always been adjusted, during their evolution, on the basis of observational or experimental tests, and verified by the predictions they made. Also, contrary to a widespread opinion and to current active programs regarding the unification of all the fundamental forces of physics in a single final theory based on string theory, I argue that this unification is generally unlikely, and that in any case no unification can be developed on the basis of current theories of quantum gravity, including string theory. In addition, I support the views of those scientists and philosophers who hold that too many resources are currently being consumed on the idea of developing quantum gravity theories, and in particular string theory, to include general relativity and to unify gravity with the other forces, as long as science does not impose such research programs. DOI: 10.13140/RG.2.2.35350.70724.