In a preceding publication, a fundamentally oriented and irreversible world was shown to be derivable from the important principle of least action. A consequence of such a paradigm change is the avoidance of paradoxes within a "dynamic" quantum physics. This becomes possible because fundamental irreversibility allows consideration of the concept of entropy in elementary processes. For this reason, and for a compensation of entropy in the spread-out energy of the wave, the duality of particle and wave has to be mediated via an information self-image of matter. In this publication, these considerations are extended to irreversible thermodynamics, to gravitation, and to cosmology with its dependence on quantum interpretations. The information self-image of matter around particles could be identified with gravitation. Because information can also impose an always constant velocity of light, there is no longer any need to attribute such a property to empty space, as is done in relativity theory. In addition, the possibility is recognized of considering entropy generation by expanding photon fields in the universe. Via a continuous activation of information on matter, photons can generate entropy and release small energy packages without interacting with matter. This facilitates a new interpretation of galactic redshift, emphasizes an information link between quantum and cosmological phenomena, and evidences an information-triggered origin of the universe. Self-organized processes approach maximum entropy production within their constraints. In a far-from-equilibrium world, information, with its energy content, can also self-organize to a higher hierarchy of computation. It is here identified with consciousness. This appears to explain the evolution of spirit and intelligence on a materialistic basis. Gravitation, here identified as information on matter, could also, under special conditions, self-organize to act as a super-gravitation, offering an alternative to dark matter. Time is not an illusion, but has to be understood as flux of action, which is the ultimate reality of change. The concept of an irreversible physical world opens a route towards a rational understanding of complex contexts in nature.
This paper centers on the implicit metaphysics behind the Theory of Relativity and the Principle of Indeterminacy – two revolutionary theories that changed 20th-century physics – using the perspective of Husserlian Transcendental Phenomenology. Albert Einstein (1879-1955) and Werner Heisenberg (1901-1976) abolished the theoretical framework of Classical (Galilean-Newtonian) physics, which had been complemented and strengthened by Cartesian metaphysics. René Descartes (1596-1650) introduced a separation between subject and object (as two different and self-enclosed substances), while Galileo and Newton carried out the "mathematization" of the world. Newtonian physics, however, had an inexplicable postulate of absolute space and absolute time – a kind of geometrical framework, independent of all matter, for the explication of locality and acceleration. Thus, Cartesian modern metaphysics and Galilean-Newtonian physics go hand in hand, resulting in socio-ethical problems, materialism, and environmental destruction. Einstein got rid of the Newtonian absolutes and was able to provide a new foundation for our notions of space and time: four-dimensional space-time, simultaneity and the constancy of the velocity of light, and the relativity of all systems of reference. Heisenberg, following the quantum theory of Max Planck, told us of our inability to know subatomic phenomena, thereby blurring the line of the Cartesian separation of object and subject and initiating the crisis of the foundations of Classical Physics. But the real crisis, according to Edmund Husserl (1859-1938), is that Modern (Classical) Science had "idealized" the world, severing nature from what he calls the Lebenswelt (life-world), the world that is simply there even before it has been reduced to mere mathematical-logical equations.
Husserl thus aims to establish a new science that returns to the "pre-scientific" and "non-mathematized" world of rich and complex phenomena: phenomena as they "appear to human consciousness". To overcome the Cartesian equation of subject vs. object (man versus environment), Husserl brackets the external reality of Newtonian science (epoché = to put in brackets, to suspend judgment) and emphasizes (1) the meaning of "world" as different from the "world" of Classical Physics, and (2) the intentionality of consciousness (L. in + tendere = to tend towards, to be essentially related or connected to), which means that even before any scientific-logical description of external reality, there is always already a relation between consciousness and an external reality. The world is the equiprimordial existence of consciousness and of external reality. My paper aims to look at this new science of pre-idealized phenomena started by Husserl (a science of phenomena as they appear to conscious, human, lived experience – hence he calls it phenomenology), centering on the life-world and the intentionality of consciousness, as providing a new way of looking at ourselves and the world – in short, as providing a new metaphysics (an antidote to Cartesian metaphysics) that grounds the revolutionary findings of Einstein and Heisenberg. The environmental destruction, technocracy, and socio-ethical problems of the modern world are all rooted in this Galilean-Newtonian-Cartesian interpretation of the relationship between humans and the world after the crumbling of the European Middle Ages. Friedrich Nietzsche (1844-1900) commented that the modern world was heading toward nihilism (L. nihil = nothingness) at the turn of the century.
Now, after two World Wars and the dropping of the atomic bomb, with capitalism and imperialism on the one hand and, on the other, the poverty and hunger of the non-industrialized countries alongside the destruction of nature (i.e., global warming), Nietzsche might be correct – unless humanity changes the way it looks at itself and the kosmos. The works of Einstein, Heisenberg, and Husserl seem to point the way for us humans to escape nihilism through a "great existential transformation." What these thinkers of post-modernity (after Cartesian/Newtonian/Galilean modernity) point to are: a) a new, therapeutic way of looking at ourselves and our world (metaphysics), and b) a new, corrective notion of "rationality" (different from the objectivist, mathematico-logical way of thinking). This paper is divided into four parts: 1) a summary of Classical Physics and a short history of quantum theory; 2) Einstein's Special and General Relativity and Heisenberg's Indeterminacy Principle; 3) Husserl's discussion of the Crisis of Europe, the life-world, and the intentionality of consciousness; 4) a metaphysics of Relativity and Indeterminacy and a corrective notion of rationality in Husserl's Phenomenology.
This paper resolves a paradox concerning colour constancy. On the one hand, our intuitive, pre-theoretical concept holds that colour constancy involves invariance in the perceived colours of surfaces under changes in illumination. On the other, there is a robust scientific consensus that colour constancy can persist in cerebral achromatopsia, a profound impairment in the ability to perceive colours. The first stage of the solution advocates pluralism about our colour constancy capacities. The second details the close relationship between colour constancy and contrast. The third argues that achromatopsics retain a basic type of colour constancy associated with invariants in contrast processing. The fourth suggests that one person-level, conscious upshot of such processing is the visual awareness of chromatic contrasts 'at' the edges of surfaces, implicating the 'colour for form' perceptual function. This primitive type of constancy sheds new light on our most basic perceptual capacities, which mark the lower borders of representational mind.
For a stable visual world, the colours of objects should appear the same under different lights. This property of colour constancy has been assumed to be fundamental to vision, and many experimental attempts have been made to quantify it. I contend here, however, that the usual methods of measurement are either too coarse or concentrate not on colour constancy itself, but on other, complementary aspects of scene perception. Whether colour constancy exists other than in nominal terms remains unclear.
Small changes in daylight in the environment can produce large changes in reflected light, even over short intervals of time. Do these changes limit the visual recognition of surfaces by their colour? To address this question, information-theoretic methods were used to estimate computationally the maximum number of surfaces in a sample that can be identified as the same after an interval. Scene data were taken from successive hyperspectral radiance images. With no illumination change, the average number of surfaces distinguishable by colour was of the order of 10,000. But with an illumination change, the average number still identifiable declined rapidly with change duration. In one condition, the number after two minutes was around 600, after ten minutes around 200, and after an hour around 70. These limits on identification are much lower than with spectral changes in daylight. No recoding of the colour signal is likely to recover surface identity lost in this uncertain environment.
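The estimate described in the abstract above can be caricatured with a toy calculation (not the authors' actual method, which used hyperspectral scene data): under a joint-Gaussian assumption, the number of surfaces identifiable by colour is roughly 2 raised to the mutual information, in bits, between the colour codes before and after the illumination change. The noise levels and dimensions below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mi_bits(x, y):
    """Mutual information I(X;Y) in bits, assuming X and Y are jointly Gaussian:
    I = 0.5 * log2(det(Cx) * det(Cy) / det(Cxy))."""
    cx = np.cov(x, rowvar=False)
    cy = np.cov(y, rowvar=False)
    cxy = np.cov(np.hstack([x, y]), rowvar=False)
    return 0.5 * np.log2(np.linalg.det(cx) * np.linalg.det(cy)
                         / np.linalg.det(cxy))

# Hypothetical 3-channel colour codes for 20,000 surfaces, re-observed after
# illumination changes modelled (crudely) as additive noise of growing size.
codes = rng.normal(size=(20000, 3))
counts = []
for noise in (0.05, 0.3, 1.0):
    later = codes + noise * rng.normal(size=codes.shape)
    counts.append(round(2 ** gaussian_mi_bits(codes, later)))
print(counts)  # the identifiable-surface count falls as the change grows
```

The qualitative behaviour matches the abstract: the identifiable count collapses by orders of magnitude as the change between the two observations grows.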
I argue that perception is necessarily situation-dependent. The way an object is must be distinguished not just from the way it appears and the way it is represented, but also from the way it is presented given the situational features. First, I argue that the way an object is presented is best understood in terms of external, mind-independent, but situation-dependent properties of objects. Situation-dependent properties are exclusively sensitive to and ontologically dependent on the intrinsic properties of objects, such as their shape, size, and color, and the situational features, such as the lighting conditions and the perceiver's location in relation to the perceived object. Second, I argue that perceiving intrinsic properties is epistemically dependent on representing situation-dependent properties. Recognizing situation-dependent properties yields four advantages. First, it makes it possible to embrace the motivations that lead to phenomenalism and indirect realism by recognizing that objects are presented a certain way, while holding on to the intuition that subjects directly perceive objects. Second, it acknowledges that perceptions are not just individuated by the objects they are of, but by the ways those objects are presented given the situational features. Third, it allows for a way to accommodate the fact that there is a wide range of viewing conditions or situational features that can count as normal. Finally, it makes it possible to distinguish perception and thought about the same object with regard to what is represented.
By eliminating the need for an absolute frame of reference or ether, Einstein resolved the problem of the constancy of light-speed in all inertial frames but created a new problem in our understanding of time. The resolution of this problem requires no experimentation but only a careful analysis of special relativity, in particular the relativity of simultaneity. This concept is insufficiently relativistic insofar as Einstein failed to recognize that any given set of events privileges the frame in which the events occur; relative to those events, only the privileged frame yields the correct measurement. Instead of equally valid frames occupying different times, one frame is correct and all others incorrect within a shared present moment. I conclude that (1) time is a succession of universal moments and (2) in the context of flowing time, time dilation requires absolute simultaneity, whereas relative simultaneity predicts a nonexistent phenomenon here dubbed time regression.
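The relativity of simultaneity that the abstract above takes issue with reduces to a two-line calculation. The following sketch (a standard textbook computation, not drawn from the paper, in units where c = 1) boosts two events that are simultaneous in one frame and shows that they are not simultaneous in another:

```python
import math

def boost(t, x, v):
    """Lorentz-transform an event (t, x) into a frame moving at speed v (c = 1)."""
    g = 1.0 / math.sqrt(1.0 - v * v)
    return g * (t - v * x), g * (x - v * t)

# Two events at t = 0 in the rest frame, one light-second apart:
t1, _ = boost(0.0, 0.0, v=0.6)
t2, _ = boost(0.0, 1.0, v=0.6)
print(t1, t2)  # 0.0 and -0.75: simultaneous in one frame, not in the other
```

The disagreement on time-ordering grows with both the spatial separation of the events and the relative speed of the frames, which is exactly the feature the paper's "privileged frame" proposal is meant to remove.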
The electronic and muonic hydrogen energy levels are calculated very accurately [1] in Quantum Electrodynamics (QED) by coupling the Dirac equation four-vector current (cα, mc²) covariantly with the external electromagnetic (EM) field four-vector in QED's Interaction Representation (IR). The cα Non-Exclusion Principle (cα-NEP) states that, if one accepts cα as the electron/muon velocity operator because of the very accurate hydrogen energy levels calculated, then one must also accept the resulting electron/muon internal spatial and time coordinate operators (ISaTCO) derived directly from cα without any assumptions. This paper does not change any of the accurate QED calculations of hydrogen's energy levels, given the simplistic model of the proton used in these calculations [1]. The Proton Radius Puzzle [2, 3] may indicate that new physics is necessary beyond the Standard Model (SM), and this paper describes the bizarre, and very different, situation in which the electron and muon are located "inside" the spatially extended proton, with their Centers of Charge (CoCs) orbiting the proton at the speed of light in S energy states. The electron/muon center of charge (CoC) is a structureless point that vibrates rapidly in its inseparable, non-random vacuum, whose geometry and time structure are defined by the electron/muon ISaTCO discrete geometry. The electron/muon self-mass becomes finite in a natural way because the ISaTCO cut off high virtual-photon energies in the photon propagator. The Dirac-Maxwell-Wilson (DMW) equations are derived from the ISaTCO for the EM fields of an electron/muon, and the electron/muon "look" like point particles in far-field scattering experiments in the same way that the electric field from a sphere with evenly distributed charge "e" "looks" like a point with the same charge in the far field (Gauss's law).
The electron's/muon's three fluctuating CoC internal spatial coordinate operators have eight possible eigenvalues [4, 5, 6] that fall on a spherical shell centered on the electron's CoM, with a fixed radius in the rest frame. The electron/muon internal time operator is discrete and describes the rapid virtual electron/positron pair production and annihilation. Together, the ISaTCO create a current that produces spin and magnetic moment operators, and the electron and muon no longer have "intrinsic" properties, since the ISaTCO kinematics define the spin and magnetic moment properties.
This is an essay I entered in a competition about the Bhagavad Gita. Probably written about 2,000 years ago, this writing is perhaps the greatest philosophical expression of Hinduism. I was attracted to the contest because the website included a very favourable comment about the Bhagavad Gita by Albert Einstein (see below). For a while, I actually considered it possible that I'd win the contest. But that time has passed. The winner has been announced, and I can now see my entry for what it is – a naïve attempt to preach science to the religionists, as well as a naïve attempt to preach religion to the scientists.

There's a statement in the essay which I'm wondering about. I said, "However, the concept of possessing a soul is not automatically supported. The idea of having a reincarnating soul would be an easy way of explaining immortality thousands of years ago, when people were completely unfamiliar with the scientific facts of an eternal universe, Einstein's Unified Field, and fractal geometry." I still think this might be correct. But "might" is the word. I'm wondering about something I later wrote – could the ghostly immaterial body described below be what we call the soul? If such a body is developed in the future to overcome present limitations, could it be referred to as a soul if it travels into the past and is absorbed into a physical body? (It might be what the Bhagavad Gita refers to as the Supersoul, and might be quantum entangled with all space and all time. And if the physical brain is receptive to this so-called entangled soul's knowledge of everything in space and time, the presently accepted limits to acquiring knowledge would, to use the below quote of Einstein's, be "superfluous".) The preceding agrees with Zen Buddhism's idea of intuitively getting in touch with your inner self.

From "Physics and Philosophy Beyond the Standard Model" (http://vixra.org/abs/1411.0585) – "In 1925, the Austrian physicist Wolfgang Pauli discovered the exclusion principle. This says two similar particles cannot have both the same position and velocity. If electrons could have identical positions and velocities, they would all collapse into a roughly uniform, dense "soup". Protons and neutrons would do the same, and there would be no well-defined atoms. So we need the exclusion principle. Force-carrying particles like photons and gravitons do not obey the exclusion principle, so we might assume the immaterial body wouldn't be well-defined and would collapse into a ghostly soup. But perhaps a well-defined structure can be built if the photons are first stopped. The potential for photons to possess mass by having their digital sequence altered and being converted to other particles – or the potential for programming the photons – may make this definition possible. A chrononaut whose body is defined by mass would still have a gravitational effect, and be dark matter. But if she or he would rather not be a lump of dark matter, her or his body might be defined by programming photons and gravitons, creating a body of "light matter". The beginnings of this technology may be in [43], which speaks of one photon being "stuck" to another."
This paper presents an attempt to define temporal coincidence starting from first principles. The temporal coincidence defined here differs from Einstein's simultaneity, for it is invariant across inertial frames – not relative. The meaning and significance of temporal coincidence is derived from axioms of existence, and it relates in some way to Kant's notion of simultaneity. Consistently applied within the framework of the Special Theory of Relativity, temporal coincidence does not in any way create mathematical contradictions; however, it allows looking at some common relativity claims with a dose of scepticism. Time, as derived from the Lorentz transformations, appears to be conventional, chosen to match the postulate of the constancy of the speed of light. The relative simultaneity is only apparent, due to that convention. There are insufficient grounds to claim that inertial systems moving relative to each other have their own different temporal realities. Overall, the innate temporal logic we have is not erroneous and does not need to be replaced, contrary to the claims of some relativity educators.
How are fundamental constants, such as c for the speed of light, related to particular technological environments? Our understanding of the constant c and Einstein's relativistic cosmology depended on key experiences and lessons learned in connection to new forms of telecommunications, first used by the military and later adapted for commercial purposes. Many of Einstein's contemporaries understood his theory of relativity by reference to telecommunications, some referring to it as "signal-theory" and "message theory." Prominent physicists who contributed to it (Hans Reichenbach, Max Born, Paul Langevin, Louis de Broglie, and Léon Brillouin, among others) worked in radio units during WWI. At the time of its development, the old Newtonian mechanics was retrospectively rebranded as based on the belief in a means of "instantaneous signaling at a distance." Even our thinking about lengths and solid bodies, argued Einstein and his interlocutors, needed to be overhauled in light of a new understanding of signaling possibilities. Pulling a rod from one side will not make the other end move at once, since relativity had shown that "this would be a signal that moves with infinite speed." Einstein's universe, where time and space dilated, where the shortest path between two points was often curved and which broke the laws of Euclidean geometry, functioned according to the rules of electromagnetic signal transmission. For some critics, the new understanding of the speed of light as an unsurpassable velocity—a fundamental tenet of Einstein's theory—was a mere technological effect related to current limitations in communication technologies.
This chapter considers the ways in which whiteness, as a skin color and ideology, becomes a dominant level that sets the background against which all things, people, and relations appear. Drawing on Merleau-Ponty's phenomenology, it takes up a series of films by Bruce Nauman and Marlon Riggs to consider ways in which this level is phenomenally challenged, providing insights into the embodiment of racialization.
The Mach-Zehnder Interferometer (MZI) is chosen to illustrate the long-standing wave-particle duality problem. Why is which-way (welcher Weg) information incompatible with wave interference? How do we explain Wheeler's delayed choice experiment? Most crucially, how can the photon divide at the first beam splitter and yet terminate on either arm with its undiminished energy? The position advanced is that the photon has two identities, one supporting particle features and the other wave features. There is photon kinetic energy that never splits (on half-silvered mirrors) or diffracts (in pinholes or slits). Then there are photon probability waves that do diffract and can reinforce or cancel. Photon kinetic energy is oscillatory; its cycles require/occupy time. E = mc² suggests that kinetic energy is physically real as occurrence in time, just as rest mass is physically real as existence in space; both are quantized, and both occupy/require a dimension for their occurrence or existence. Photon kinetic energy (KE) thus resides in time but is still present/available for interactions (events) in space; rest mass (e.g., your desk) resides in space but is still present/available for interactions (events) in time. While photon probability waves progress in space and diffract there, photon KE resides in time and never diffracts in space; at reception it always arrives whole and imitates particle impact without being a particle. Photon probability waves are real; they diffract in space. Acknowledging that the photon has two identities (residing energy and progressing probability) explains the photon's dual nature. And wave-particle duality is central to quantum mechanics. Understanding it leads to new insights into entanglement, nonlocality, and the measurement problem. A 30-minute video on nonlocality and photon dualism can be found with a Google search for "youtube klevgard nonlocality".
Here we discuss, and hope to solve, a problem rooted in the necessity of the study of historical science, the slow deviation of physics education over the past century, and how the loss of a crucial contextual tool has debilitated discussion of a very important yet specialized physics sub-topic: the isotropy of the one-way speed of light. Most notably, the information that appears to be most commonly missing is not simply knowledge of the historical fact that Poincaré and Lorentz presented a mathematically equivalent representation of relativity theory contemporary with Einstein's publication; the most deleterious outcome is a lack of understanding of how that theory worked within a mechanical wave system to give the same results in all known experiments via an instrumentation-based illusion. Unfortunately, those well educated in relativity theory often haven't been granted the advantage of contrast that this history of the theory's development gives, and thus some of the most direct and critical implications of modern theory are often lost even on graduate students. Chief among the implications that should be trivial to a student of relativity is the incompatibility of a mechanical wave concept of light with the modern assumptions of relativity. One should also be able to expect a graduate student to comprehend easily that the mathematical necessity of conjoining space with time descends only from the presumption of isotropic constancy specifically, and further that its mechanics are one and the same as the relativity of simultaneity. However, many published papers appear to lack this understanding.
The result is that the modern consensus appears to have arrived at the conclusion that the one-way speed of light is intrinsically untestable, and therefore the physics community has accidentally pushed Minkowski spacetime, and most directly the relativity of simultaneity, into the domain of pseudoscience, leaving the historical relativistic "ether" described by the Mansouri-Sexl test theory (also known as Lorentz Ether Theory) as the only workable alternative. This situation must be re-examined at the lowest possible level to arrive at an appropriate experimental regime.
Perceptual experience has the phenomenal character of encountering a mind-independent objective world. What we encounter in perceptual experience is not presented to us as a state of our own mind. Rather, we seem to encounter facts, objects, and properties that are independent from our mind. In short, perceptual experience has phenomenal objectivity. This paper proposes and defends a Kantian account of phenomenal objectivity that grounds it in experiences of lawlike regularities. The paper offers a novel account of the connection between phenomenology and intentionality. It also sheds some light on one of the central themes in Kant's Critique of Pure Reason.
The Simulation Hypothesis proposes that all of reality is in fact an artificial simulation, analogous to a computer simulation. Outlined here is a method for programming relativistic mass, space, and time at the Planck level, as applicable to the Planck Universe-as-a-Simulation Hypothesis. For the virtual universe, the model uses a 4-axis hyper-sphere that expands in incremental steps (the simulation clock-rate). Virtual particles that oscillate between an electric wave-state and a mass point-state are mapped within this hyper-sphere, the oscillation driven by this expansion. Particles are assigned an N-S axis which determines the direction in which they are pulled along by the expansion; thus an independent particle motion may be dispensed with. Only in the mass point-state do particles have fixed hyper-sphere co-ordinates. The rate of expansion translates to the speed of light, and so in terms of the hyper-sphere co-ordinates all particles (and objects) travel at the speed of light; time (as the clock-rate) and velocity (as the rate of expansion) are therefore constant. However, photons, as the means of information exchange, are restricted to lateral movement across the hyper-sphere, thus giving the appearance of a 3-D space. Lorentz formulas are used to translate between this 3-D space and the hyper-sphere co-ordinates, relativity resembling the mathematics of perspective.
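The claim that, in hyper-sphere co-ordinates, everything travels at the speed of light can be illustrated with a short numerical check (an illustrative sketch, not the paper's actual model): if a particle's spatial 3-speed is v and its total speed through the expanding hyper-sphere is fixed at c, the leftover "temporal" component is sqrt(c² − v²), whose ratio to c reproduces the standard Lorentz factor 1/γ.

```python
import math

C = 1.0  # work in units where the expansion rate (speed of light) is 1

def temporal_rate(v):
    """Speed left over along the expansion axis when the total speed is C."""
    return math.sqrt(C * C - v * v) / C

for v in (0.0, 0.6, 0.8):
    inv_gamma = math.sqrt(1.0 - v * v / (C * C))
    print(v, temporal_rate(v), inv_gamma)  # the last two columns agree
```

This is the familiar "everything moves at c through spacetime" picture: the faster an object moves through space, the slower its leftover rate along the remaining axis, which is time dilation read geometrically.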
Albert Einstein's bold assertion of the form-invariance of the equation of a spherical light wave with respect to inertial frames of reference became, in the space of six years, the preferred foundation of his theory of relativity. Early on, however, Einstein's universal light-sphere invariance was challenged on epistemological grounds by Henri Poincaré, who promoted an alternative demonstration of the foundations of relativity theory based on the notion of a light-ellipsoid. Drawing in part on archival sources, this paper shows how an informal, international group of physicists, mathematicians, and engineers, including Einstein, Paul Langevin, Poincaré, Hermann Minkowski, Ebenezer Cunningham, Harry Bateman, Otto Berg, Max Planck, Max Laue, A. A. Robb, and Ludwig Silberstein, employed figures of light during the formative years of relativity theory in their discovery of the salient features of the relativistic worldview.
An analysis of the Third Man Argument, especially in light of Constance Meinwald's book Plato's Parmenides. I argue that her solution to the TMA fails. Then I present my own theory as to what Plato's solution was.
In this essay, we draw on John Haugeland's work in order to argue that Burge is wrong to think that exercises of perceptual constancy mechanisms suffice for perceptual representation. Although Haugeland did not live to read or respond to Burge's Origins of Objectivity, we think that his work contains resources that can be developed into a critique of the very foundation of Burge's approach. Specifically, we identify two related problems for Burge. First, if (what Burge calls) mere sensory responses are not representational, then neither are exercises of constancy mechanisms, since the differences between them do not suffice to imply that one is representational and the other is not. Second, taken by themselves, exercises of constancy mechanisms are only derivatively representational, so merely understanding how they work is not sufficient for understanding what is required for something, in itself, to be representational (and thereby provide a full solution to the problem of perceptual representation).
No one is quite sure what happened to T.S. Eliot in that rose-garden. What we do know is that it formed the basis for Four Quartets, arguably the greatest English poem written in the twentieth century. Luckily it turns out that Martin Heidegger, when not pondering the meaning of being, spent a great deal of time thinking and writing about the kind of event that Eliot experienced. This essay explores how Heidegger developed the concept of Ereignis, "event," which, in the context of Eliot's poetry, helps us understand an encounter with the "heart of light" a little better.
Space-time intervals are the fundamental components of conscious experience, gravity, and a Theory of Everything. Space-time intervals are relationships that arise naturally between events. They have a general covariance (independence of coordinate systems, scale invariance), a physical constancy, that encompasses all frames of reference. There are three basic types of space-time intervals (light-like, time-like, space-like) which interact to create space-time and its properties. Human conscious experience is a four-dimensional space-time continuum created through the processing of space-time intervals by the brain; space-time intervals are the source of conscious experience (observed physical reality). Human conscious experience is modeled by Einstein's special theory of relativity, a theory designed specifically from the general covariance of space-time intervals (for inertial frames of reference). General relativity is our most accurate description of gravity. In general relativity, the general covariance of space-time intervals is extended to all frames of reference (inertial and non-inertial), including gravitational reference frames; space-time intervals are the source of gravity in general relativity. The general covariance of space-time intervals is further extended to quantum mechanics; space-time intervals are the source of quantum gravity. The general covariance of space-time intervals seamlessly merges general relativity with quantum field theory (the two grand theories of the universe). Space-time intervals consequently are the basis of a Theory of Everything (a single all-encompassing coherent theoretical framework of physics that fully explains and links together all physical aspects of the universe). This theoretical framework encompasses our observed physical reality (conscious experience) as well; space-time intervals link observed physical reality to actual physical reality.
This provides an accurate and reliable match between observed physical reality and the physical universe by which we can carry on our activity. The Minkowski metric, which defines generally covariant space-time intervals, may be considered an axiom (premise, postulate) for the Theory of Everything.
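The quantity this abstract repeatedly invokes can be made concrete. The following is a minimal sketch, not taken from the reviewed work, assuming the common (−, +, +, +) signature convention for the Minkowski metric; the function names and the tolerance parameter are illustrative choices:

```python
# Sketch: the squared space-time interval between two events, and its
# classification into the three basic types the abstract names.
# Signature convention assumed here: (-, +, +, +).

C = 299_792_458.0  # speed of light in vacuum, m/s


def interval_squared(dt: float, dx: float, dy: float, dz: float) -> float:
    """Squared interval (ds)^2 = -(c*dt)^2 + dx^2 + dy^2 + dz^2."""
    return -(C * dt) ** 2 + dx ** 2 + dy ** 2 + dz ** 2


def classify(ds2: float, tol: float = 1e-6) -> str:
    """Classify an interval; with this signature, ds2 < 0 is time-like."""
    if abs(ds2) <= tol:
        return "light-like"
    return "time-like" if ds2 < 0 else "space-like"
```

In this convention a negative squared interval is time-like, zero is light-like, and positive is space-like; `classify(interval_squared(1.0, C, 0.0, 0.0))` (a separation traversed exactly at light speed) gives "light-like". The value of `interval_squared` is the same in every inertial frame, which is the general covariance the abstract builds on.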
This article shows that in 1.4.2.15-24 of the Treatise of Human Nature, Hume presents his own position on objects, which is to be distinguished from both the vulgar and philosophical conceptions of objects. Here, Hume argues that objects that are effectively imagined to have a “perfect identity” are imagined due to the constancy and coherence of our perceptions (what we may call ‘level 1 constancy and coherence’). In particular, we imagine that objects cause such perceptions, via what I call ‘indirect causation.’ In virtue of imagining ideas of objects that have a perfect identity, our perceptions seem to be even more constant and coherent (what we may call ‘level 2 constancy and coherence’). Thus, in addition to seeing that Hume is presenting his own position on objects in this section of the Treatise, we see that he is working with a previously unrecognized kind of causation, i.e., indirect causation, and that he has two kinds of constancy and coherence in mind: level 1 and level 2.
The wormhole is a popular tool for interstellar travel in science fiction. It connects two different space-time points directly. Such an astrophysical object has not yet been observed, but in 2015 researchers in Spain created a tiny magnetic wormhole for the first time ever. They used it to connect two regions of space so that a magnetic field could travel 'invisibly' between them. This article presents a model built on the principle of conversion of energy, such that light entering one end comes out of the other end; hence it can be called a wormhole of light. Materials including a solar cell, an amplified speaker, a one-way walkie-talkie, alligator clips, light-emitting diodes, transistors and a clap switch have been used. The method of conversion of energy is applied: light is first transformed into sound and then back into light. The preparation of such a wormhole of light is split into two parts. One end includes a circuit that accepts light through a box, and the other end provides light coming from another box. To realize a wormhole of light between two regions of a room, the goal was to make light travel between them. Sound made from light travels the distance between the two boxes. For this purpose a one-way walkie-talkie provides the connection between the two boxes, and thus as a whole the apparatus works as a wormhole of light.
Time contraction is not a property of the moving body. Permanent contraction is against the constancy of the speed of light. Einstein uses wrong units of measurement and comes to wrong conclusions. We use the translation of the article "On the Electrodynamics of Moving Bodies".
A deconstruction of the implicit notion of Absolute space that dominates modern physics. The deconstruction is enacted by juxtaposing the common notion of Absolute space abstracted from Newton’s Philosophiae Naturalis Principia Mathematica with Levinas’s particular treatment of space and the present in Otherwise than Being: Or Beyond Essence.
This study, presenting a history of the measurement of light intensity from its first hesitant emergence to its gradual definition as a scientific subject, explores two major themes. The first concerns the adoption by the evolving physics and engineering communities of quantitative measures of light intensity around the turn of the twentieth century. The mathematisation of light measurement was a contentious process that hinged on finding an acceptable relationship between the mutable response of the human eye and the more easily stabilised, but less encompassing, techniques of physical measurement.

A second theme is the exploration of light measurement as an example of ‘peripheral science’. Among the characteristics of such a science, I identify the lack of a coherent research tradition and the persistent partitioning of the subject between disparate groups of practitioners. Light measurement straddled the conventional categories of ‘science’ and ‘technology’, and was influenced by such distinct factors as utilitarian requirements, technological innovation, human perception and bureaucratisation. Peripheral fields such as this, which may be typical of much of modern science and technology, have hitherto received little attention from historians.

These themes are pursued with reference to the social and technological factors which were combined inextricably in the development of the subject. The intensity of light gained only sporadic attention until the late nineteenth century. Measured for the utilitarian needs of the gas lighting industry from the second half of the century, light intensity was appropriated by members of the nascent electric lighting industry, too, in their search for a standard of illumination. By the turn of the century the ‘illuminating engineering movement’ was becoming an organised, if eclectic, community which promoted research into and standards for the measurement of light intensity.
The twentieth-century development of the subject was moulded by organisation and institutionalisation. Between 1900 and 1920, the new national and industrial laboratories in Britain, America and Germany were crucial in stabilising the subject. In the inter-war period, committees and international commissions sought to standardise light measurement and to promote research. Such government- and industry-supported delegations, rather than academic institutions, were primarily responsible for the ‘construction’ of the subject. Practitioners increasingly came to interpret the three topics of photometry (visible light measurement), colorimetry (the measurement of colour) and radiometry (the measurement of invisible radiations) as aspects of a broader study, and enthusiastically applied them to industrial and scientific problems.

From the 1920s, the long-established visual methods of observation were increasingly replaced by physical means of light measurement, a process initially contingent on scientific fashion more than demonstrated superiority. New photoelectric techniques for measuring light intensity engendered new commercial instruments, a trend which accelerated in the following decade when photometric measurement was applied with limited success to a range of industrial problems. Seeds sown in the 1920s – namely commercialisation and industrial application, the transition from visual to ‘physical’ methods, and the search for fundamental limitations in light measurement – gave the subject substantially the form it was to retain over the next half-century.
Reid, Constance. Hilbert (a Biography). Reviewed by Corcoran in Philosophy of Science 39 (1972), 106–08.

Constance Reid was an insider of the Berkeley-Stanford logic circle. Her San Francisco home was in Ashbury Heights near the homes of logicians such as Dana Scott and John Corcoran. Her sister Julia Robinson was one of the top mathematical logicians of her generation, as was Julia’s husband Raphael Robinson, for whom Robinson Arithmetic was named. Julia was a Tarski PhD and, in recognition of a distinguished career, was elected President of the American Mathematical Society. https://en.wikipedia.org/wiki/Julia_Robinson http://www.awm-math.org/noetherbrochure/Robinson82.html
The special theory of relativity holds significant interest for scientific perspectivists. In this paper, I distinguish between two related meanings of “perspectival,” and argue that reference frames are perspectives, provided that perspectival means “being conditional” rather than “being partial.” Frame-dependent properties such as length, time duration, and simultaneity are not partially measured in a reference frame, but their measurements are conditional on the choice of frame. I also discuss whether the constancy of the speed of light depends on perspectival factors such as the idealized definition of the speed of light in a perfect vacuum and the Einstein synchronization convention. Furthermore, I argue for the view that the constancy of its speed is a robust property of light according to the conditions of currently acceptable experimental setups pertaining to special relativity, and conclude that this view supports perspectivism.
What colour does a white wall look in the pinkish light of the late afternoon? Philosophers disagree: they hold variously that it looks pink, white, both, and no colour at all. A new approach is offered. After reviewing the dispute, a reinterpretation of perceptual constancy is proposed. In accordance with this reinterpretation, it is argued that perceptual features such as colour must always be predicated of perceptual objects. Thus, it might be that in pinkish light, the wall looks white and the light looks pink. The paper concludes by discussing some criteria for object identification in perceptual states.
Albert Abraham Michelson (1852-1931), the American optical physicist best known for his precise determination of the velocity of light and for his experiments concerning aether drift, is less often acknowledged as the creator of new spectroscopic instrumentation and new spectroscopies. He devised a new method of light analysis relying upon his favourite instrument – a particular configuration of optical interferometer – and published investigations of spectral line separation, Doppler-broadening and simple high-resolution spectra (1887-1898). Contemporaries did not pursue his method. Michelson himself discarded the technique by the end of the decade, promoting a new device, the ‘echelon spectroscope’, as a superior instrument. High-resolution spectroscopy was taken up by others at the turn of the century using the echelon, Fabry-Pérot etalon and similar instruments. Michelson’s ‘Light Wave Analysis’ was largely forgotten, but was rediscovered c. 1950 and developed over the following three decades into a technique rechristened ‘Fourier transform spectroscopy’. This paper presents Michelson’s interferometric work as a continuum of personal interests and historical context, and as an example of ‘research technology’ and ‘peripheral science’.
Non-commuting quantities and hidden parameters – Wave-corpuscular dualism and hidden parameters – Local or nonlocal hidden parameters – Phase space in quantum mechanics – Weyl, Wigner, and Moyal – Von Neumann’s theorem about the absence of hidden parameters in quantum mechanics and Hermann – Bell’s objection – Quantum-mechanical and mathematical incommeasurability – Kochen and Specker’s idea about their equivalence – The notion of partial algebra – Embeddability of a qubit into a bit – Quantum computer is not Turing machine – Is continuality universal? – Diffeomorphism and velocity – Einstein’s general principle of relativity – „Mach’s principle“ – The Skolemian relativity of the discrete and the continuous – The counterexample in § 6 of their paper – About the classical tautology which is untrue being replaced by the statements about commeasurable quantum-mechanical quantities – Logical hidden parameters – The undecidability of the hypothesis about hidden parameters – Wigner’s work and Weyl’s previous one – Lie groups, representations, and psi-function – From a qualitative to a quantitative expression of relativity − psi-function, or the discrete by the random – Bartlett’s approach − psi-function as the characteristic function of random quantity – Discrete and/or continual description – Quantity and its “digitalized projection“ – The idea of „velocity−probability“ – The notion of probability and the light speed postulate – Generalized probability and its physical interpretation – A quantum description of macro-world – The period of the associated de Broglie wave and the length of now – Causality equivalently replaced by chance – The philosophy of quantum information and religion – Einstein’s thesis about “the consubstantiality of inertia and weight“ – Again about the interpretation of complex velocity – The speed of time – Newton’s law of inertia and Lagrange’s formulation of mechanics – Force and effect – The theory of tachyons and general relativity –
Riesz’s representation theorem – The notion of covariant world line – Encoding a world line by psi-function – Spacetime and qubit − psi-function by qubits – About the physical interpretation of both the complex axes of a qubit – The interpretation of the self-adjoint operators components – The world line of an arbitrary quantity – The invariance of the physical laws towards quantum object and apparatus – Hilbert space and that of Minkowski – The relationship between the coefficients of the psi-function and the qubits – World line = psi-function + self-adjoint operator – Reality and description – Does a „curved“ Hilbert space exist? – The axiom of choice, or when is possible a flattening of Hilbert space? – But why not to flatten also pseudo-Riemannian space? – The commutator of conjugate quantities – Relative mass – The strokes of self-movement and its philosophical interpretation – The self-perfection of the universe – The generalization of quantity in quantum physics – An analogy of the Feynman formalism – Feynman and many-world interpretation – The psi-function of various objects – Countable and uncountable basis – Generalized continuum and arithmetization – Field and entanglement – Function as coding – The idea of „curved“ Descartes product – The environment of a function – Another view to the notion of velocity-probability – Reality and description – Hilbert space as a model both of object and description – The notion of holistic logic – Physical quantity as the information about it – Cross-temporal correlations – The forecasting of future – Description in separable and inseparable Hilbert space – „Forces“ or „miracles“ – Velocity or time – The notion of non-finite set – Dasein or Dazeit – The trajectory of the whole – Ontological and onto-theological difference – An analogy of the Feynman and many-world interpretation − psi-function as physical quantity – Things in the world and instances in time – The generation of the physical by mathematical – The generalized notion of observer – Subjective or objective probability – Energy as the change of probability per unit of time – The generalized principle of least action from a new viewpoint – The exception of two dimensions and Fermat’s last theorem.
This paper situates Einstein's theory of relativity within broader networks of communication. The speed of light, explained Einstein, was an unsurpassable velocity if, and only if, it was considered in terms of »arbitrary« and »voluntary« signals. Light signals in physics belong within a broader set of signs and symbols that include communication and military signals, understood by reference to Helmholtz, Saussure, media philosophies from WWII to '68 (Lavelle, Ong, McLuhan) and Derrida. Once light signals in physics are considered in relation to semaphore, print, telegraph, radio, computers and tape recorders, Kittler and Habermas provide us with conflicting ways of understanding science and technology, rationality and consensus. We conclude with a study of »flash and bang« in popular accounts of relativity theory to understand the role of theoretical science in the transmission of information and violence.
This dissertation is about human knowledge of reality. In particular, it argues that scientific knowledge is bounded by historically available instruments and theories; nevertheless, the use of several independent instruments and theories can provide access to the persistent potentialities of reality. The replicability of scientific observations and experiments allows us to obtain explorable evidence of robust entities and properties. The dissertation includes seven chapters. It also studies three cases – namely, Higgs bosons and hypothetical Ϝ-particles (section 2.4), the Ptolemaic and Keplerian models of the planets (section 6.7), and the special theory of relativity (chapter 7).

Chapter 1 is the introduction of the dissertation. Chapter 2 clarifies the notion of the real on the basis of two concepts: persistence and resistance. These concepts enable me to explain my ontological belief in the real potentialities of human-independent things and the implications of this view for the perceptual and epistemological levels of discussion. On the basis of the concept of “overlapping perspectives”, chapter 3 argues that entity realism and perspectivism are complementary. That is, an entity that manifests itself through several experimental/observational methods is something real, but our knowledge of its nature is perspectival. Critically studying the recent views of entity realism, chapter 4 extends the discussion of entity realism and provides a criterion for the reality of property tokens. Chapter 5, in contrast, develops the perspectival aspects of my view on the basis of the phenomenological-hermeneutical approaches to the philosophy of science. This chapter also elaborates my view of empirical evidence, as briefly expressed in sections 2.5 and 4.5. Chapter 6 concerns diachronic theoretical perspectives. It first explains my view of progress, according to which current perspectives are broader than past ones.
Second, it argues that the successful explanations and predictions of abandoned theories can be accounted for from our currently acceptable perspectives. The case study of Ptolemaic astronomy supports the argument of this chapter. Chapter 7 serves as the conclusion of the dissertation by applying the central themes of the previous chapters to the case study of special relativity theory. I interpret frame-dependent properties, such as length and time duration, and the constancy of the speed of light according to realist perspectivism.
In this journal, Schulte develops a novel solution to the problem of distal content: by virtue of what is a mental representation about a distal object rather than a more proximal cause of that representation? Schulte maintains that in order for a representation to have a distal content, it must be produced by a constancy mechanism, along with two other conditions. I raise three objections to his solution. First, a core component of Schulte's solution is just a restrictive version of Dretske's solution, but Schulte gives no argument for his restriction. Second, his proposed solution to a disjunction problem is ad hoc. Finally, his ‘far-out’ version of the distality problem is not a version of the distality problem at all. I conclude that Dretske's solution is preferable to Schulte's.
This wide-ranging discourse covers many disciplines of science and the human condition in an attempt to fully understand the manifestation of time. Time's Paradigm is, at its inception, a philosophical debate between the theories of 'Presentism' and 'The Block Model', beginning with a pronounced psychological analysis of 'free will' in an environment where the past and the future already exist. It lays the foundation for the argument that time is a cyclical, contained progression, rather than a meandering voyage into infinity, bringing into question the validity of a commensurate 'Big Bang'. The proposal then widens to encompass physics. It tackles clock rates and time dilation, acausality and the nuisance of a universal clock, and demonstrates that conscious consideration creates the present moment – time's flow – separating the solid-state past and future whose reality is devoid of space. Arguments relating to quantum physics, including the Uncertainty Principle and a superposition of states, lend credibility to key areas involving cognitive awareness. It is posited that defined points in time and space prohibit progress in linear models of progression. Thus paradoxes of motion can be resolved with the absence of infinities; temporal perception, it is concluded, being the result of uncertainty. Time's Paradigm takes the bold step of asking us to consider a tangible dimension of time, representing an intimate extension of our three known spatial dimensions. Chaos theory is briefly introduced, leading to the configuration of a fractal fourth dimension of time whose assumption demands only one direction of flow. Further, it asks whether our universe is expanding or contracting. It considers the simple physics of bodies contracting in a fourth dimension of time (UC), and how that marries comfortably with standard scientific models such as Special Relativity.
The rate at which matter is contracting in the universe is illustrated by a reduction factor of 1.618..., coinciding with Fibonacci's ratio and countering time dilation. Lastly, the more complex aspects of relativistic velocities are tackled, together with the conundrum of Zero Velocity and The Speed of Light being attributes of the same event in Cyclical Space-Time, and ultimately the prospect of superluminal velocities by interaction with parallel time-zones in a multi-layered block universe.
In this paper I argue that, from the point of view of a theist, inclusivism with respect to the issue of whether adherents of different religious traditions can have veridical experience of God (or Ultimate Reality) now is more plausible than Alstonian exclusivism. I suggest that mystical inclusivism of the kind I imply in this paper may contribute to the development of cross-cultural philosophy of religion, as well as to the theoretical framework for inter-religious dialogue, because (1) it allows for the possibility of veridical experience of God in a variety of religious traditions, but (2) it avoids the radical revisionist postulates of Hickian pluralism, and (3) it leaves open the question whether the creed of any specific tradition is a better approximation to the truth about God than the creeds of other traditions.
We present an epistemological scheme of the natural sciences inspired by Peirce's pragmaticist view, stressing the role of the 'phenomenological map', which connects reality and our ideas about it. The scheme has a recognisable mathematical/logical structure which allows us to explore some of its consequences. We show that seemingly independent principles such as the requirement of reproducibility of experiments and the principle of sufficient reason are both implied by the scheme, as is Popper's concept of falsifiability. We show that the scheme has some power in demarcating science, first by comparing it with an alternative scheme advanced during the first part of the twentieth century (which we call Popper-Einstein and which has its roots in Hertz). Further, the identified differences allow us to focus on the construction of Special Relativity, showing that it uses an intuited concept of velocity that does not satisfy the requirements of reality in Peirce's sense. We trace the problem to hidden hypotheses in Einstein's work. While the main mathematical observation has been known for more than a century, it has not been investigated from an epistemological point of view, probably because the socially dominant epistemology in physics discourages doing so.
All representationalists maintain that there is a necessary connection between an experience’s phenomenal character and intentional content; but there is a disagreement amongst representationalists regarding the nature of those intentional contents that are necessarily connected to phenomenal character. Russellian representationalists maintain that the relevant contents are composed of objects and/or properties, while Fregean representationalists maintain that the relevant contents are composed of modes of presentation of objects and properties. According to Fregean representationalists such as David Chalmers and Brad Thompson, the Fregean variety of the view is preferable to the Russellian variety because the former can accommodate purported counterexamples involving spectrum inversion without illusion and colour constancy while the latter cannot. I maintain that colour constancy poses a special problem for the Fregean theory in that the features of the theory that enable it to handle spectrum inversion without illusion cannot be extended to handle colour constancy. I consider the two most plausible proposals regarding how the Fregean view might be developed in order to handle colour constancy – one of which has recently been defended by Thompson – and argue that neither is adequate. I conclude that Fregean representationalism is no more able to accommodate colour constancy than is Russellian representationalism and, as such, ought to be rejected.
I argue for the following claims: (1) a core Husserlian account of perceptual constancy needs to be given in terms of indicative future-oriented conditionals but can be complemented by a counterfactual account; (2) thus conceived, constancy is a necessary aspect of content. I speak about a “core Husserlian” account so as to capture certain ideas that Michael Madary has presented as the core of Edmund Husserl's approach to perceptual constancy, viz., that “perception is partly constituted by the continuous interplay of intention and fulfilment” and that this “gives us a way to understand the relationship between different appearances of the same object” (see Madary, M. (2012) “Husserl on Perceptual Constancy.” European Journal of Philosophy 20(1): 145–165). I take myself to be developing, and perhaps correcting, Madary's view as I discuss the role of the core Husserlian ideas, and counterfactuals, in accounting for shape and color constancy, respectively. I bridge constancy and fulfilment-conditional content by appealing to the Husserlian notion of constitution, which captures the process in which objectivity and, correlatively, intentional experience, are built up in the experiential flow.
Naive observers viewed a sequence of colored Mondrian patterns, simulated on a color monitor. Each pattern was presented twice in succession, first under one daylight illuminant with a correlated color temperature of either 16,000 or 4,000 K and then under the other, to test for color constancy. The observers compared the central square of the pattern across illuminants, either rating it for sameness of material appearance or sameness of hue and saturation or judging an objective property—that is, whether its change of color originated from a change in material or only from a change in illumination. Average color constancy indices were high for material appearance ratings and binary judgments of origin and low for hue–saturation ratings. Individuals’ performance varied, but judgments of material and of hue and saturation remained demarcated. Observers seem able to separate phenomenal percepts from their ontological projections of mental appearance onto physical phenomena; thus, even when a chromatic change alters perceived hue and saturation, observers can reliably infer the cause, the constancy of the underlying surface spectral reflectance.