It is shown that the heuristic "derivation" of the Schrödinger equation in quantum mechanics textbooks can be turned into a real derivation by resorting to spacetime translation invariance and relativistic invariance.
I give concise derivations of Price's equation and the criteria for kin and group selection, prove that kin and group selection are equivalent, and discuss the controversies about altruism.
In physics there are different opinions about the conceptual interpretation of Einstein's famous equation describing the equivalence of mass and energy. That the equation admits different interpretations is understandable, given the different points of view from which phenomenological reality can be interpreted. This paper concerns the meaning of the equation in relation to the general framework of quantum field theory, in which reality is created by the underlying structure of the basic quantum fields.
We propose to simplify the problem of a unified theory of Quantum-Gravity by dealing first with the simple case of the non-relativistic equations of Gravity and Quantum Mechanics. We show that unification of the two non-relativistic formalisms can be achieved through the joint classical and Quantum postulate that every natural body is composed of N identical final particles. This includes the current 'elementary' particles of the standard model, such as quarks, photons, and gluons. Furthermore, we show that this opens a new route toward a Generalized Equation of Quantum-Gravity that takes the effects of both velocity and acceleration into account.
The Equation (TE) states that the probability of A → B is the probability of B given A (Jeffrey, 1964: 702–703). Lewis has shown that the acceptance of TE implies that the probability of A → B is the probability of B, which is implausible: the probability of a conditional cannot plausibly be the same as the probability of its consequent; e.g., the probability that the match will light given that it is struck is not intuitively the same as the probability that it will light (Lewis, 1976: 299–300). Here I want to counter Lewis' claim. My aim is to argue that: (1) TE doesn't track the probability of A → B, but instead our willingness to employ it on a modus ponens; (2) the triviality result doesn't strike us as implausible if our willingness to employ A → B on a modus ponens implies a similar result; (3) TE is still inadequate in this limited role, given that some conditionals are only employable on a modus tollens or can't be employed on a modus ponens; (4) TE does not have the logical significance that is usually attributed to it, since inferential disposition is a pragmatic phenomenon.
Four intuitions are recurrent and influential in theories about conditionals: Ramsey's test, Adams' Thesis, the Equation, and the robustness requirement. For simplicity's sake, I call these intuitions 'the big four'. My aim is to show that: (1) the big four are interdependent; (2) they express our inferential dispositions to employ a conditional on a modus ponens; (3) the disposition to employ conditionals on a modus ponens doesn't have the epistemic significance that is usually attributed to it, since the acceptability or truth conditions of a conditional are not necessarily associated with its employability on a modus ponens.
In this manuscript we study individual variation in the interpretation of conditionals by establishing individual profiles of the participants based on their behavioral responses and reflective attitudes. To investigate the participants’ reflective attitudes we introduce a new experimental paradigm called the Scorekeeping Task, and a Bayesian mixture model tailored to analyze the data. The goal is thereby to identify the participants who follow the Suppositional Theory of conditionals and Inferentialism and to investigate their performance on the uncertain and-to-if inference task.
This paper examines precursors and consequences of the perceived relevance of a proposition A for a proposition C. In Experiment 1, we test Spohn's assumption that ΔP = P(C|A) − P(C|¬A) is a good predictor of ratings of perceived relevance and reason relations, and we examine whether it is a better predictor than the difference measure P(C|A) − P(C). In Experiment 2, we examine the effects of relevance on probabilistic coherence in Cruz, Baratgin, Oaksford, and Over's uncertain "and-to-if" inferences. The results suggest that ΔP predicts perceived relevance and reason relations better than the difference measure, and that participants are either less probabilistically coherent in "and-to-if" inferences than initially assumed or do not follow the Equation P(if A, then C) = P(C|A). Results are discussed in light of recent results suggesting that the Equation may not hold under conditions of irrelevance or negative relevance.
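The two competing relevance measures can be made concrete in a short sketch. The encoding of the joint distribution over A and C as four conjunction probabilities, and the numbers used, are my own illustration, not the study's materials:

```python
def delta_p(p_ac, p_anc, p_nac, p_nanc):
    """Spohn-style relevance: ΔP = P(C|A) - P(C|¬A).

    Arguments are the probabilities of the four conjunctions:
    A∧C, A∧¬C, ¬A∧C, ¬A∧¬C (they should sum to 1).
    """
    p_a = p_ac + p_anc
    p_not_a = p_nac + p_nanc
    return p_ac / p_a - p_nac / p_not_a

def difference_measure(p_ac, p_anc, p_nac, p_nanc):
    """Alternative predictor: P(C|A) - P(C)."""
    p_a = p_ac + p_anc
    p_c = p_ac + p_nac
    return p_ac / p_a - p_c

# Positive relevance: A raises the probability of C.
dp = delta_p(0.4, 0.1, 0.1, 0.4)              # P(C|A)=0.8, P(C|¬A)=0.2 → ΔP = 0.6
dm = difference_measure(0.4, 0.1, 0.1, 0.4)   # P(C|A)=0.8, P(C)=0.5 → 0.3
```

Irrelevance in this encoding is the case ΔP = 0, e.g. an independent joint distribution such as `delta_p(0.25, 0.25, 0.25, 0.25)`.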
In their article 'Causes and Explanations: A Structural-Model Approach. Part I: Causes', Joseph Halpern and Judea Pearl draw upon structural equation models to develop an attractive analysis of 'actual cause'. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation.
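A minimal structural-model sketch can illustrate the machinery the analysis builds on. The forest-fire example is a standard textbook case, assumed here for illustration; the snippet shows interventions and a contingency-restored counterfactual dependence, not Halpern and Pearl's full definition of actual cause:

```python
def solve(equations, exogenous, interventions=None):
    """Evaluate a structural model: interventions override both exogenous
    settings and structural equations for the variables they mention."""
    interventions = interventions or {}
    values = {**exogenous, **interventions}
    for var, fn in equations.items():
        values[var] = interventions.get(var, fn(values))
    return values

# Disjunctive model: fire occurs if lightning strikes OR a match is dropped.
equations = {"fire": lambda v: v["lightning"] or v["match"]}

actual = solve(equations, {"lightning": 1, "match": 1})

# Simple counterfactual dependence on lightning fails: the match still suffices.
no_lightning = solve(equations, {"lightning": 1, "match": 1}, {"lightning": 0})

# Under the contingency that the match is not dropped, dependence is restored.
contingency = solve(equations, {"lightning": 1, "match": 0}, {"lightning": 0})
```

The probabilistic extension discussed in the paper would replace the fixed exogenous settings with a distribution over them; the deterministic core above stays the same.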
The term prohairesis has a long history; its usage is crucial for the development and understanding of basic ethical and anthropological assumptions in ancient Hellenic philosophy. In this article the author analyses the most important moments in the semantic transformation of this term, with particular reference to the implications of its usage in the Byzantine theological and philosophical heritage, finding its ultimate expression in the work of St Maximus the Confessor and his christological synthesis. The equation of the terms prohairesis and gnome and their separation from authentic human nature, as well as the usage of the term thelesis for the original "human will", represent a thorough revision of the antique philosophical heritage, comparable with the distinction between the terms ousia and hypostasis drawn by the Cappadocian Fathers. The author shows the extent to which, and the way in which, Byzantine theological and philosophical thought adopted and transformed its Hellenic heritage.
According to manipulationist accounts of causal explanation, to explain an event is to show how it could be changed by intervening on its cause. The relevant change must be a 'serious possibility', claims Woodward (2003), distinct from mere logical or physical possibility, approximating something I call 'scientific possibility'. This idea creates significant difficulties: background knowledge is necessary for judgments of possibility. Yet the primary vehicles of explanation in manipulationism are 'invariant' generalisations, and these are not well adapted to encoding such knowledge, especially in the social sciences, as some of it is non-causal. Ceteris paribus (CP) laws or generalisations labour under no such difficulty. A survey of research methods such as case and comparative studies, randomised control trials, ethnography, and structural equation modeling suggests that it would be more difficult, and in some instances impossible, to represent the output of each method in invariant generalisations; and that this is because in each method causal and non-causal background knowledge mesh in a way that cannot easily be accounted for in manipulationist terms. Ceteris paribus generalisations being superior in this regard, a theory of explanation based on the latter is a better fit for social science.
The purpose of this writing is to propose a frame of view, a form as the eternal world element, that is compatible with paradox within the history of ideas and modern discovery as they confront one another. Under special consideration are problems of representing phenomena, life, and the cosmos as the rational faculty of mind confronts the physical/perceptual, and itself. Current topics in pursuit are nearly as diverse and numerous as the possibilities for a world composed strictly of uniqueness able to fill infinite space; it is assumed that not all of the paths chosen in contemporary pursuits will produce coherent determinations in an appropriate frame able to accommodate a world of nominals in motion, containing motion, and commensurate with basic physical law and the propagation of form and change from within. Intended as a potential guidepost for reason seeking to select, define and capture topics, the special examples chosen are the works of logician/mathematician Lewis Carroll as he presents a paradox of actuality versus the reality of perception in Alice in Wonderland, the theory of relativity of Albert Einstein as he fails to elaborate a mathematics to communicate an inertial frame of reference, and the deconstruction ideas of Jacques Derrida, offered for contrast with the scientific worldview constructed of dualisms and of monisms conceived to have no opposites. Supporting discussion draws on the works of Bertrand Russell, Erwin Schrödinger, Jürgen Habermas, Bronislaw Malinowski, and Michel Foucault.
The paper investigates the kind of dependence relation that best portrays Machian frame-dragging in general relativity. The question is tricky because frame-dragging relates local inertial frames to distant distributions of matter in a time-independent way, thus establishing some sort of non-local link between the two. For this reason, a plain causal interpretation of frame-dragging faces huge challenges. The paper will shed light on the issue by using a generalized structural equation model analysis in terms of manipulationist counterfactuals, recently applied in the context of metaphysical enquiry by Schaffer (2016) and Wilson (2017). The verdict of the analysis will be that frame-dragging is best understood in terms of a novel type of dependence relation that is half-way between causation and grounding.
The electronic and muonic hydrogen energy levels are calculated very accurately [1] in Quantum Electrodynamics (QED) by coupling the Dirac equation four-vector current (cα, mc²) covariantly with the external electromagnetic (EM) field four-vector in QED's Interaction Representation (IR). The cα Non-Exclusion Principle (cα-NEP) states that if one accepts cα as the electron/muon velocity operator, because of the very accurate hydrogen energy levels calculated, then one must also accept the resulting electron/muon internal spatial and time coordinate operators (ISaTCO) derived directly from cα without any assumptions. This paper does not change any of the accurate QED calculations of hydrogen's energy levels, given the simplistic model of the proton used in these calculations [1]. The Proton Radius Puzzle [2, 3] may indicate that new physics is necessary beyond the Standard Model (SM), and this paper describes the bizarre, and very different, situation when the electron and muon are located "inside" the spatially extended proton, with their Centers of Charge (CoCs) orbiting the proton at the speed of light in S energy states.

The electron/muon center of charge (CoC) is a structureless point that vibrates rapidly in its inseparable, non-random vacuum, whose geometry and time structure are defined by the electron/muon ISaTCO discrete geometry. The electron/muon self-mass becomes finite in a natural way due to the ISaTCO cutting off high virtual photon energies in the photon propagator. The Dirac-Maxwell-Wilson (DMW) equations are derived from the ISaTCO for the EM fields of an electron/muon, and the electron/muon "looks" like a point particle in far-field scattering experiments, in the same way that the electric field from a sphere with evenly distributed charge e "looks" like a point with the same charge in the far field (Gauss's law).
The electron's/muon's three fluctuating CoC internal spatial coordinate operators have eight possible eigenvalues [4, 5, 6] that fall on a spherical shell centered on the electron's CoM, with radius in the rest frame. The electron/muon internal time operator is discrete and describes the rapid virtual electron/positron pair production and annihilation. The ISaTCO together create a current that produces spin and magnetic moment operators, and the electron and muon no longer have "intrinsic" properties, since the ISaTCO kinematics define the spin and magnetic moment properties.
The meaning of the wave function and its evolution are investigated. First, we argue that the wave function in quantum mechanics is a description of the random discontinuous motion of particles, and the modulus squared of the wave function gives the probability density of the particles being in certain locations in space. Next, we show that the linear non-relativistic evolution of the wave function of an isolated system obeys the free Schrödinger equation due to the requirements of spacetime translation invariance and relativistic invariance. Thirdly, we argue that the random discontinuous motion of particles may lead to a stochastic, nonlinear collapse evolution of the wave function. A discrete model of energy-conserved wavefunction collapse is proposed and shown to be consistent with existing experiments and our macroscopic experience. In addition, we give a critical analysis of the de Broglie-Bohm theory, the many-worlds interpretation and other dynamical collapse theories, and briefly discuss the issue of unifying quantum mechanics and relativity.
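The heuristic route that such derivations upgrade can be glossed in two lines. This is a textbook-style summary under the stated symmetry requirements, not the paper's actual argument:

```latex
% Spacetime translation invariance picks out plane-wave solutions
\psi(x,t) \;\propto\; e^{\,i(px - Et)/\hbar},
% and the non-relativistic dispersion relation E = p^2/2m then yields
i\hbar\,\frac{\partial \psi}{\partial t}
  \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2} \psi}{\partial x^{2}} .
```

Differentiating the plane wave once in time brings down E, twice in space brings down p²; the free Schrödinger equation is exactly the statement that these agree via E = p²/2m.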
This report reviews what quantum physics and information theory have to tell us about the age-old question: How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or the spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase "it from bit".
In this paper I put forward a new micro-realistic, fundamentally probabilistic, propensiton version of quantum theory. According to this theory, the entities of the quantum domain - electrons, photons, atoms - are neither particles nor fields, but a new kind of fundamentally probabilistic entity, the propensiton - entities which interact with one another probabilistically. This version of quantum theory leaves the Schrödinger equation unchanged, but reinterprets it to specify how propensitons evolve when no probabilistic transitions occur. Probabilistic transitions occur when new "particles" are created as a result of inelastic interactions. All measurements are just special cases of this. This propensiton version of quantum theory, I argue, solves the wave/particle dilemma, is free of the conceptual problems that plague orthodox quantum theory, recovers all the empirical success of orthodox quantum theory, and at the same time yields as yet untested predictions that differ from those of orthodox quantum theory.
In book II of Plato's Republic, Socrates discusses the cities of necessity and luxury (372d-373a). Discussions of these cities have often focused on citizens desiring more than they need, which creates a demand for luxury. Yet the second part of the equation, which is not usually recognized, is that there must be sufficient supply to meet this demand. The focus of this article is on the importance of supply in the discussion of the first two cities in book II of the Republic. This article argues that the way Plato models the cities makes it the case that a surplus above levels of necessity will be generated from time to time. That the unwanted surplus cannot be spontaneously disposed of entails that the first two cities are institutionally incomplete. A government is needed in order to coordinate the disposal of the surplus supply the city will produce.
This paper provides a realist analysis of the EU's legitimacy. We propose a modification of Bernard Williams' theory of legitimacy, which we term critical responsiveness. For Williams, 'Basic Legitimation Demand + Modernity = Liberalism'. Drawing on that model, we make three claims. (i) The right side of the equation is insufficiently sensitive to popular sovereignty; (ii) the left side of the equation is best thought of as a 'legitimation story': a non-moralised normative account of how to shore up belief in legitimacy while steering clear of both raw domination and ideological distortions; (iii) the EU's current legitimation story draws on a tradition of popular sovereignty that sits badly with the supranational delegation and pooling of sovereign powers. We conclude by suggesting that the EU's legitimation deficit may be best addressed demoicratically, by recovering the value of popular sovereignty at the expense of a degree of state sovereignty.
Cogitation described as calculation, the living being described as a machine, cognitive functions considered as algorithmic sequences, and the 'mechanization' of the subjective were the theoretical elements that late Heideggerian anti-humanism, especially in France, was able to utilize [1], even more so after the second cybernetics or post-cybernetics movement of the late '60s introduced the concepts of the autopoietic and the allopoietic automata [2]. Recently, neurologists have laid claim to the traditional epistemological field of philosophy, proceeding from this ontological decision, the equation of human cognition with cybernetic systems.

The emergence of the world-wide-web in the 1990s and the global expansion of the internet during the first decades of the 21st century indicate the fallacies of the cybernetics programme to mechanize the mind. We stand witnesses to a semantic colonization of the cybernetic system, a social imaginary creation and expansion within the digital ensemblistic-identitarian organization that cannot be described in mechanical or cybernetic terms. Paradoxically, cyberspace, as a new being, a form of alterity, seems both to exacerbate and to capsize the polarization between the operational and the symbolic. The creation of the internet might be more than an epistemological revolution, to use the terminology of Thomas Kuhn. It might be an ontological revolution.

I will try to demonstrate that the emergence of the Internet refutes any such claims, since its context and utility can only be described by means of a social epistemology based on the understanding of social significances as continuous creations of an anonymous social imaginary, as proposed by Cornelius Castoriadis (1922-1997).
I will try to explore some social-semantic aspects of cyberspace as a nexus of social representations of individual identity that forms a new sphere of being, where the subjective and the objective merge in a virtual subjective objectivity with unique epistemological attributes and possibilities.
Among Bayesian confirmation theorists, several quantitative measures have been proposed of the degree to which an evidential proposition E confirms a hypothesis H. According to one popular recent measure, s, the degree to which E confirms H is a function of P(H|E) − P(H|~E). A consequence of s is that when we have two evidential propositions, E1 and E2, such that P(H|E1) = P(H|E2) and P(H|~E1) ≠ P(H|~E2), the confirmation afforded to H by E1 does not equal the confirmation afforded to H by E2. I present several examples that demonstrate the unacceptability of this result, and conclude that we should reject s (and other measures that share this feature) as a measure of confirmation.
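The structural feature at issue is easy to exhibit numerically. The probability values below are hypothetical, chosen only to display the pattern the abstract describes:

```python
def s(p_h_given_e, p_h_given_not_e):
    """The measure s: degree of confirmation as P(H|E) - P(H|~E)."""
    return p_h_given_e - p_h_given_not_e

# Two bodies of evidence that raise H to the same posterior,
# P(H|E1) = P(H|E2) = 0.9, yet s scores them differently
# because the catch-all terms P(H|~E1) and P(H|~E2) differ.
s1 = s(0.9, 0.1)   # P(H|~E1) = 0.1 → s = 0.8
s2 = s(0.9, 0.5)   # P(H|~E2) = 0.5 → s = 0.4
```

Any measure defined purely in terms of the posterior P(H|E) would treat E1 and E2 alike here; s separates them, which is precisely the consequence the paper argues is unacceptable.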
This dissertation is devoted to empirically contrasting the Suppositional Theory of conditionals, which holds that indicative conditionals serve the purpose of engaging in hypothetical thought, and Inferentialism, which holds that indicative conditionals express reason relations. Throughout a series of experiments, probabilistic and truth-conditional variants of Inferentialism are investigated using new stimulus materials, which manipulate previously overlooked relevance conditions. These studies are some of the first published studies to directly investigate the central claims of Inferentialism empirically. In contrast, the Suppositional Theory of conditionals has an impressive track record through more than a decade of intensive testing. The evidence for the Suppositional Theory encompasses three sources. Firstly, direct investigations of the probability of indicative conditionals, which substantiate "the Equation" (P(if A, then C) = P(C|A)). Secondly, the pattern of results known as the "defective truth table" effect, which corroborates the de Finetti truth table. And thirdly, indirect evidence from the uncertain and-to-if inference task. Through four studies, each of these sources of evidence is scrutinized anew under the application of novel stimulus materials that factorially combine all permutations of prior and relevance levels of two conjoined sentences. The results indicate that the Equation only holds under positive relevance (P(C|A) − P(C|¬A) > 0) for indicative conditionals. In the case of irrelevance (P(C|A) − P(C|¬A) = 0), or negative relevance (P(C|A) − P(C|¬A) < 0), the strong relationship between P(if A, then C) and P(C|A) is disrupted. This finding suggests that participants tend to view natural language conditionals as defective under irrelevance and negative relevance (Chapter 2).
Furthermore, most of the participants turn out only to be probabilistically coherent above chance levels for the uncertain and-to-if inference in the positive relevance condition, when applying the Equation (Chapter 3). Finally, the results on the truth table task indicate that the de Finetti truth table is at most descriptive for about a third of the participants (Chapter 4). Conversely, strong evidence for a probabilistic implementation of Inferentialism could be obtained from assessments of P(if A, then C) across relevance levels (Chapter 2) and the participants' performance on the uncertain and-to-if inference task (Chapter 3). Yet the results from the truth table task suggest that these findings could not be extended to truth-conditional Inferentialism (Chapter 4). On the contrary, strong dissociations could be found between the presence of an effect of the reason relation reading on the probability and acceptability evaluations of indicative conditionals (and connate sentences), and the lack of an effect of the reason relation reading on the truth evaluation of the same sentences. A bird's eye view on these surprising results is taken in the final chapter, and it is discussed which perspectives these results open up for future research.
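The coherence check behind the uncertain and-to-if task can be sketched as follows. The interval form is the standard one from the probabilistic-coherence literature, assumed here rather than quoted from the dissertation: given a premise P(A and C) = x, any value of P(C|A), and hence, via the Equation, of P(if A, then C), is coherent iff it lies in [x, 1], since P(C|A) = P(A and C) / P(A) ≥ P(A and C):

```python
def and_to_if_interval(p_a_and_c):
    """Coherence interval for P(if A, then C) under the Equation,
    given the premise P(A and C)."""
    return (p_a_and_c, 1.0)

def is_coherent(p_conditional, p_a_and_c):
    """True iff the conditional judgment respects the interval."""
    lo, hi = and_to_if_interval(p_a_and_c)
    return lo <= p_conditional <= hi

ok = is_coherent(0.8, 0.6)    # coherent: conditional at least as probable as the conjunction
bad = is_coherent(0.4, 0.6)   # incoherent: conditional rated below the conjunction
```

A participant whose conditional rating falls below the conjunction premise violates the interval no matter what P(A) is, which is why the task can diagnose incoherence from just these two judgments.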
Albert Einstein's bold assertion of the form-invariance of the equation of a spherical light wave with respect to inertial frames of reference became, in the space of six years, the preferred foundation of his theory of relativity. Early on, however, Einstein's universal light-sphere invariance was challenged on epistemological grounds by Henri Poincaré, who promoted an alternative demonstration of the foundations of relativity theory based on the notion of a light-ellipsoid. Drawing in part on archival sources, this paper shows how an informal, international group of physicists, mathematicians, and engineers, including Einstein, Paul Langevin, Poincaré, Hermann Minkowski, Ebenezer Cunningham, Harry Bateman, Otto Berg, Max Planck, Max Laue, A. A. Robb, and Ludwig Silberstein, employed figures of light during the formative years of relativity theory in their discovery of the salient features of the relativistic worldview.
Building upon Nancy Cartwright's discussion of models in How the Laws of Physics Lie, this paper addresses solid state research in transition metal oxides. Historical analysis reveals that in this domain models function both as the culmination of phenomenology and the commencement of theoretical explanation. Those solid state chemists who concentrate on the description of phenomena pertinent to specific elements or compounds assess models according to different standards than those who seek explanation grounded in approximate applications of the Schroedinger equation. Accurate accounts of scientific debate in this field must include both perspectives.
Fitting Attitudes accounts of value analogize or equate being good with being desirable, on the premise that 'desirable' means not 'able to be desired', as Mill has been accused of mistakenly assuming, but 'ought to be desired', or something similar. The appeal of this idea is visible in the critical reaction to Mill, which generally goes along with his equation of 'good' with 'desirable' and only balks at the second step, and it crosses broad boundaries in terms of philosophers' other commitments. For example, Fitting Attitudes accounts play a central role both in T.M. Scanlon's [1998] case against teleology, and in Michael Smith's [2003], [unpublished] and Doug Portmore's [2007] cases for it. And of course they have a long and distinguished history.
The paper explores the influence of greenwash on green trust and discusses the mediating roles of green consumer confusion and green perceived risk. The research object of this study is Taiwanese consumers who have purchase experience of information and electronics products in Taiwan. The research employs an empirical study by means of structural equation modeling. The results show that greenwash is negatively related to green trust. Therefore, this study suggests that companies must reduce their greenwash behaviors to enhance their consumers' green trust. In addition, this study finds that green consumer confusion and green perceived risk mediate the negative relationship between greenwash and green trust. The results also demonstrate that greenwash is positively associated with green consumer confusion and green perceived risk, which in turn negatively affect green trust. This means that greenwash not only negatively affects green trust directly, but also negatively influences it indirectly via green consumer confusion and green perceived risk. Hence, if companies would like to reduce the negative relationship between greenwash and green trust, they need to decrease their consumers' green consumer confusion and green perceived risk.
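The mediation logic behind such a model can be illustrated with a small simulation. The variable names mirror the study's constructs, but the data, effect sizes, and single-mediator structure are entirely hypothetical; this is a regression-based sketch of an indirect effect, not the study's SEM estimates:

```python
import random

random.seed(0)
n = 4000

# Simulated standardized scores: greenwash raises confusion (a-path),
# and confusion lowers trust (b-path) alongside a direct negative effect.
greenwash = [random.gauss(0, 1) for _ in range(n)]
confusion = [0.6 * g + random.gauss(0, 0.5) for g in greenwash]
trust = [-0.5 * c - 0.2 * g + random.gauss(0, 0.5)
         for c, g in zip(confusion, greenwash)]

def slope(x, y):
    """Simple-regression slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

a = slope(greenwash, confusion)                 # a-path, ≈ 0.6
# b-path holding greenwash fixed: residualize confusion on greenwash first
resid = [c - a * g for c, g in zip(confusion, greenwash)]
b = slope(resid, trust)                         # ≈ -0.5
indirect = a * b                                # < 0: trust eroded via confusion
```

The negative product a × b is the indirect effect the abstract describes: greenwash lowers trust partly by way of the confusion it creates.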
The idea that incompatibilism is intuitive is one of the key motivators for incompatibilism. Not surprisingly, then, philosophers who defend incompatibilism often claim that incompatibilism is the natural, commonsense view about free will and moral responsibility (e.g., Pereboom 2001, Kane 1999, Strawson 1986). And a number of recent studies find that people give apparently incompatibilist responses in vignette studies. When participants are presented with a description of a causally deterministic universe, they tend to deny that people are morally responsible in that universe. Although this suggests that people are intuitive incompatibilists, Eddy Nahmias and Dylan Murray, in a recent series of important papers, have developed a challenge to this interpretation. They argue that people confuse determinism with bypassing, the idea that one's mental states lack causal efficacy. Murray and Nahmias present new experiments that seem to confirm the bypassing hypothesis. In this paper, we use structural equation modeling to re-examine the issue. We find support instead for an incompatibilist explanation of the bypassing results, i.e., incompatibilist judgments seem to cause bypassing judgments. We hypothesize that this phenomenon occurs because people think of decisions as essentially indeterministic; thus, when confronted with a description of determinism, they tend to think that decisions do not even occur. We provide evidence for this in three subsequent studies, which show that many participants deny that people make decisions in a deterministic universe; by contrast, most participants allow that people add numbers in a deterministic universe. Together, these studies suggest that bypassing results don't reflect a confusion, but rather the depth of the incompatibilist intuition.
This paper investigates the link between the consumer perception that a company is socially oriented and the consumer intention to buy products marketed by that company. We suggest that this link exists when at least two conditions prevail: (1) the products sold by that company comply with ethical and social requirements; (2) the company has an acknowledged commitment to protect consumer rights and interests. To test these hypotheses, we conducted a survey among the clients of retail chains offering Fair Trade products. The results show that socially oriented companies can successfully leverage their reputation to market products with high symbolic values.
Arnold Sommerfeld introduced the fine-structure constant that determines the strength of the electromagnetic interaction. Following Sommerfeld, Wolfgang Pauli left several clues to calculating the fine-structure constant with his research on Johannes Kepler's view of nature and Pythagorean geometry. The Laplace limit of Kepler's equation in classical mechanics, the Bohr-Sommerfeld model of the hydrogen atom and Julian Schwinger's research enable a calculation of the electron magnetic moment anomaly. Considerations of fundamental lengths such as the charge radius of the proton and mass ratios suggest some further foundational interpretations of quantum electrodynamics.
From the exponential function of Euler’s equation to the geometry of a fundamental form, a calculation of the fine-structure constant and its relationship to the proton-electron mass ratio is given. Equations are found for the fundamental constants of the four forces of nature: electromagnetism, the weak force, the strong force and the force of gravitation. Symmetry principles are then associated with traditional physical measures.
The paper addresses the problem which quantum mechanics in fact resolves. Its viewpoint suggests that the crucial link of time and its course is omitted in understanding the problem. The common interpretation, underlain by the history of quantum mechanics, sees discreteness only on the Planck scale, which is transformed into continuity and even smoothness on the macroscopic scale. That approach is fraught with a series of seeming paradoxes. It suggests that the present mathematical formalism of quantum mechanics is only partly relevant to its problem, which is ostensibly known. The paper accepts just the opposite: the mathematical solution is absolutely relevant and serves as an axiomatic base from which the real and yet hidden problem is deduced. Wave-particle duality, Hilbert space, both the probabilistic and many-worlds interpretations of quantum mechanics, quantum information, and the Schrödinger equation are included in that base. The Schrödinger equation is understood as a generalization of the law of energy conservation to past, present, and future moments of time. The deduced real problem of quantum mechanics is: "What is the universal law describing the course of time in any physical change, therefore including any mechanical motion?"
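The reading of the Schrödinger equation as a generalized conservation law can be rendered schematically. This is my textbook-style gloss of the familiar correspondence, not the paper's own formalism:

```latex
% Operator substitutions E \mapsto i\hbar\,\partial_t,\quad p \mapsto -i\hbar\nabla
% applied to the classical conservation law
E \;=\; \frac{p^{2}}{2m} + V
% give the Schrödinger equation as its quantum generalization:
i\hbar\,\frac{\partial \psi}{\partial t}
  \;=\; \Bigl(-\frac{\hbar^{2}}{2m}\,\nabla^{2} + V\Bigr)\psi .
```

On the paper's reading, the left side tracks the change of the state in time while the right side is the conserved energy acting on it, so the equation extends energy conservation across past, present, and future moments.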
I examine to what extent accounts of mechanisms based on formal interventionist theories of causality can adequately represent biological mechanisms with complex dynamics. Using a differential equation model for a circadian clock mechanism as an example, I first show that there exists an iterative solution that can be interpreted as a structural causal model. Thus, in principle it is possible to integrate causal difference-making information with dynamical information. However, the differential equation model itself lacks the right modularity properties for a full integration. A formal mechanistic model will therefore have to leave out either causal or non-causal explanatory relations.
Although fuzzy logic and fuzzy mathematics are a widespread subject with a vast literature, the use of fuzzy notions such as fuzzy sets and fuzzy numbers has been relatively rare in the concept of time. This can be seen in fuzzy time series. In addition, some attempts have been made at fuzzifying Turing machines, but seemingly without any felt need to fuzzify time itself. Throughout this article, we try to change this picture and show why it is helpful to consider the instants of time as fuzzy numbers. In physics, although there are revolutionary ideas about the concept of time, such as B-theories in contrast to the A-theory, and although central concepts like space and momentum were changed long ago, time is treated classically in all well-known and established physical theories. Seemingly, we stick to the classical concept of time in all fields of science, with a vast inertia against changing it. Our goal in this article is to provide some grounds for why it is rational and reasonable to change and modify this picture. The central point here is the modified version of the "Unexpected Hanging" paradox as described in "Is Classical Mathematics Appropriate for Theory of Computation". This modified version leads to a contradiction, and on that basis it is shown there why some problems in the theory of computation remain unsolved. To resolve the difficulties arising there, we have two choices: either "choosing" a new type of logic, such as paraconsistent logic, to tolerate contradiction, or changing and improving the concept of time and consequently modifying the Turing computational model. Throughout this paper, we take the second route, so as to preserve some aspects of classical logic. In chapter 2, by applying quantum mechanics and the Schrödinger equation, we compute the fuzzy number associated with time.
This provides a new interpretation of quantum mechanics. More exactly, what we see here is a "particle-fuzzy time" interpretation of quantum mechanics, in contrast to other interpretations such as the "wave-particle" interpretation. At the end, we propound a question about the possible solution of a paradox in physics: the contradiction between general relativity and quantum mechanics.
Gone are the heady days when Bernard Williams (1993) could get away with saying that "Nietzsche is not a source of philosophical theories" (p. 4). The last two decades have witnessed a flowering of research that aims to interpret, elucidate, and defend Nietzsche's theories about science, the mind, and morality. This paper is one more blossom in that efflorescence. What I want to argue is that Nietzsche theorized three important and surprising moral psychological insights that have been borne out by contemporary empirical psychology. The first Nietzschean insight is the disunity of the self. The second, connected, Nietzschean insight is the primacy of affect. This primacy is expressed by what I have called elsewhere (Alfano 2010, 2013a) the tenacity of the intentional, and what Nietzsche calls the Socratic equation (TI Socrates 4, 10; WP 2:432-3). The third major Nietzschean insight is the social construction of character, which presupposes a wild diversity within the extensions of trait-terms and the dual direction of fit of character trait attributions. This last point is somewhat in tension with the only other published defense of the empirical credentials of Nietzsche's moral psychology (Knobe & Leiter 2007), so I will make a few remarks about the contrast between my view and theirs.
All too often women are considered sexy in accordance with an externally dictated and unduly narrow conception of sexiness – one that excludes large portions of the female population from being considered sexy. In response to this, some feminists have suggested that we should give up on sexiness altogether. Since the agency, subjectivity, and autonomy of a woman being judged sexy is generally ignored, they argue, we have, in effect, an equation of sexiness with objecthood. In a recent essay entitled “Sex Objects and Sexy Subjects” Sheila Lintott and Sherri Irvin object to this strategy because they see sexuality as a crucial element of selfhood – something that one cannot simply ‘give up on’. Instead, they propose to reclaim and redefine sexiness in such a way that makes room for women as sexy subjects desiring and pursuing authentic pleasure. In this short paper, I will investigate the merits and shortcomings of their proposal and present an alternative account.
Hydrogen, an atom composed of a single proton and electron, is the fundamental and most abundant element in the universe, composing approximately 90% of the visible universe. As is well known, different types of energy are associated with the proton-electron system due to the fundamental forces in any atom, such as kinetic energy, electrostatic energy and gravitational energy. In the quantum framework, gravity is a very weak force, and its equivalence with the other forces was once thought impossible. I strongly believe that all energies are one or another form of a single energy, and I aim to show that all energies are a fraction of the rest mass energy m_e c^2. In this paper, I have tried to define a new equation unifying all the energies of a hydrogen atom.
Relationships with international students can be beneficial to higher education in terms of financial and human resources. For this reason, establishing and maintaining such relationships are usually pre-eminent concerns. In this study, we extended the application of the disconfirmation expectation model by incorporating components from subjective task value to predict the loyalty of international students toward their host countries. On a sample of 410 Vietnamese students enrolled in establishments of higher education in over 15 countries across the globe, we employed structural equation modelling to construct the conceptual model. Our empirical findings revealed that while the roles of satisfaction and disconfirmation are still important as direct and indirect antecedents of international student loyalty, its most powerful predictors are the three components of subjective task value: attainment, utility and intrinsic value. These insights yield a number of implications for actors on the higher education scene, such as heads of institutions and policy makers.
According to orthodox quantum mechanics, state vectors change in two incompatible ways: "deterministically" in accordance with Schrödinger's time-dependent equation, and probabilistically if and only if a measurement is made. It is argued here that the problem of measurement arises because the precise, mutually exclusive conditions for these two types of transition to occur are not specified within orthodox quantum mechanics. Fundamentally, this is due to an inevitable ambiguity in the notion of "measurement" itself. Hence, if the problem of measurement is to be resolved, a new, fully objective version of quantum mechanics needs to be developed which does not incorporate the notion of measurement in its basic postulates at all.
We criticize the bare theory of quantum mechanics -- a theory on which the Schrödinger equation is universally valid, and the standard way of thinking about superpositions is correct.
The concept of time is examined using the second law of thermodynamics, which was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between the energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, i.e. geodesics, are obtained from the principle of least action for mechanics, electrodynamics and quantum mechanics. The arrow of time, associated with the expansion of the Universe, identifies with the grand dispersal of energy, when high energy densities transform by various mechanisms to lower energy densities and eventually to ever-diluting electromagnetic radiation. Likewise, time in a quantum system takes an increment forwards in the detection-associated dissipative transformation, when the stationary-state system begins to evolve, pictured as the wave function collapse. The energy dispersal is understood to underlie causality, so that an energy gradient is a cause and the resulting energy flow is an effect. The account of causality in the concepts of physics does not imply determinism; on the contrary, the evolution of space–time as a causal chain of events is non-deterministic.
The study aimed to identify the reality of process design management in local NGOs in the Gaza Strip. In order to achieve the objectives of the study and test its hypotheses, the analytical descriptive method was used, relying on a questionnaire as the main tool for data collection. The study population consisted of decision makers in 78 local NGOs in the Gaza Strip, surveyed by complete enumeration. SPSS was used to process and analyze the data obtained through the survey tool, and Smart-PLS was used to construct a structural equation model (SEM) to analyze the relationships between the study variables and to calculate the direct and indirect effects of the independent variable on the dependent variable through the mediating variable. The study reached a set of results, the most important being the lack of a direct relationship between process design management and decision-making; the study found that design thinking mediates the relationship between process design management and decision-making with a holistic effect. The study showed the interest of local NGOs in creating a good mental image in the local community, the ownership by local NGOs of the expertise and technical skills required to implement projects, the adoption by local NGOs of activities that meet the needs and wishes of beneficiaries, and that local NGOs analyze problems and their causes through data relevant to the decision, based on reference data for decision-making. The main recommendations of the study are: the need for the senior management of local NGOs in the Gaza Strip to encourage managers and employees to develop the field of process design management in their projects, enhancing the creative environment and adding competitive advantage to the organization.
The study also recommended that the senior management of local NGOs in the Gaza Strip adopt the design thinking methodology, because of its impact on the sustainability of projects, the design of technical feasibility studies, meeting the wishes of beneficiaries, and the continued development of local NGOs in the Gaza Strip, so that they make sound decisions, and that they be encouraged to follow scientific methodologies in the decision-making mechanism.
The basic idea of quantum mechanics is that any property of a system can be in a state of superposition of various possibilities. This state of superposition is also known as the wave function, and it evolves linearly with time in a deterministic way in accordance with the Schrödinger equation. However, when a measurement is carried out on the system to determine the value of that property, the system instantaneously transforms to one of the eigenstates, and thus we get only a single value as the outcome of the measurement. The quantum measurement problem seeks to find the cause and exact mechanism governing this transformation. In an attempt to solve this problem, in this paper we will first define what the wave function represents in the real world and identify the root cause behind the stochastic nature of events. Then, we will develop a model to explain the mechanism of collapse of the quantum-mechanical wave function in response to a measurement. In the course of developing the model, we will explain the Schrödinger's cat paradox and show how Born's rule for probability becomes a natural consequence of the measurement process.
Based on de Broglie’s wave hypothesis and the covariant ether, the Three Wave Hypothesis (TWH) was proposed and developed in the last century. In 2007, the author found that the TWH may be attributed to a kinematical classical system of two perpendicular rolling circles. In 2012, the author showed that the position vector of a point in a model of two circles rolling in a plane can be transformed into a complex vector under a proposed effect of partial observation. In the present project, this concept of transformation is developed into a lab-observation concept. Under this transformation of the lab observer, it is found that the velocity equation of the motion of the point is transformed into an equation analogous to the relativistic quantum mechanics equation (the Dirac equation). Many other analogies have been found, and are listed in a comparison table. The analogy tries to explain entanglement within the scope of the transformation. These analogies may suggest that both quantum mechanics and special relativity are emergent, unified, and of the same origin. The similarities suggest analogies and pose questions of interpretation for standard quantum theory, without any causal claims.
According to the BSM-Supergravitation Unified Theory (BSM-SG), energy is an indispensable feature of matter, while matter possesses hierarchical levels of organization from simple to complex forms, with fields appearing at some levels. Therefore, energy also follows these levels. At the fundamental level, where the primary energy source exists, matter is in its primordial form, in which two super-dense fundamental particles (FPs) exist in a classical pure empty space (not a physical vacuum). They are associated with the Planck-scale parameters of frequency and distance and interact by supergravitational (SG) forces. These forces are inversely proportional to the cube of the distance in pure empty space and are based on frequency interactions. Since the two FPs have different intrinsic frequencies, the SG forces appear different for interactions between like and unlike FPs and may change sign. This primordial form of matter exists in the super-heavy black holes located at the center of each well-formed galaxy. The next level of matter organization up includes the underlying structure of the physical vacuum, called a Cosmic Lattice, and the structure of elementary particles. They have common substructure elements obtained by a specific crystallization process preceding the formation of the observable galaxies. The Cosmic Lattice, forming the space known as the physical vacuum, is responsible for the existence and propagation of the physical fields: electric, magnetic, Newtonian gravity and inertia. The energy of the physical vacuum takes two forms: static (enormous) and dynamic (weak). The static energy is directly related to the Newtonian mass by the Einstein equation E = mc^2 and is the primary source of nuclear energy. The dynamic energy is responsible for the existence of the electric and magnetic fields, the constant speed of light and the quantum-mechanical properties of the physical vacuum.
The next energy level up is the dynamical energy of excited atoms and molecules. At this level hidden energy wells exist, such as the internal energy of the electron and the internal energy of atoms with more than one electron. The next energy level up occurs in some organic molecules, particularly in biomolecules that contain ring atomic structures. In such a structure, some quantum states are not emitted immediately but rotate in the ring. While in organic molecules the energy stored in such a ring is released by a chemical process, in the long-chain protein molecules of living organisms the stored energy can be released simultaneously by triggering. A huge number of atomic rings are contained in the DNA strands. The release of the energy stored in DNA, for example, is an avalanche process that causes an emission of entangled photons possessing a strong penetrating capability. A sequence of entangled photons emitted by DNA should carry the genetic information encoded by the codons. This mechanism, predicted by the BSM-SG theory, is very important for intercommunication between the cells of a living organism. The next level of energy organization up may exist in the brain. The brain is the organ with the most abundant number of atomic rings, while its tissue environment might permit complex energy interactions. The human brain contains billions of atomic rings. The next hypothetical level of energy organization up is an information field, physically existing outside of, but connected with, the living brain. It corresponds to a specific field known as the aura, while the possibility of its existence is still not accepted by mainstream science. The problem is that this field cannot be detected by the currently existing technical means used for EM communications.
The BSM-SG predicts that this field might differ from the EM field we use for communication, but this is a subject for further theoretical development that must be supported by experiments using specifically designed technical means. According to the BSM-SG theory, the energy conversion from the primary energy source to the complex levels of matter and field organization is a permanent syntropic process based on complex resonance interactions.
Are mathematical objects affected by their historicity? Do they simply lose their identity and their validity in the course of history? If not, how can they always be accessible in their ideality regardless of their transmission in the course of time? Husserl and Foucault have raised this question and offered accounts, both of which, albeit different in their originality, are equally provocative. Both acknowledge that a scientific object like a geometrical theorem or a chemical equation has a history because it is only constituted in and transmitted through history. But they see that history as a part of its ideality, so that, although historical, a scientific object retains its identity as one and the same object.
This study analyses the predictions of the General Theory of Relativity (GTR) against a slightly modified version of the standard central-mass solution (the Schwarzschild solution). It is applied to central gravity in the solar system, the Pioneer spacecraft anomalies (which GTR fails to predict correctly), and planetary orbit distances and times, etc. (where GTR is thought consistent).

The modified gravity equation was motivated by a theory originally called 'TFP' (Time Flow Physics, 2004). This is now replaced by the 'Geometric Model', 2014 [20], which retains the same theory of gravity. This analysis is offered partly as supporting detail for the claim in [20] that the theory is realistic in the solar system and explains the Pioneer anomalies. The overall conclusion is that the model can claim to explain the Pioneer anomalies, contingent of course on the analysis being independently verified and duplicated.

However, the interest lies beyond testing this theory. To start with, it gives us a realistic scale on which gravity might vary from the accepted theory while remaining consistent with most solar-scale astronomical observations. It is found here that the modified gravity equation appears consistent with GTR for most phenomena, but that it would retard the Pioneer spacecraft by about the observed amount (15 seconds or so at the time). Hence it is a possible explanation of this anomaly, which as far as I know has remained unexplained for 20 years.

It also shows what many philosophers of science have emphasized: the pivotal role of counterfactual reasoning. By putting forward an exact alternative solution and working through the full explanation, we discover a surprising 'counterfactual paradox': the modified theory slightly weakens GTR gravity, and yet the effect is to slow down the Pioneer trajectory, making it appear as if gravity is stronger than GTR. The inference that “there must be some tiny extra force…” (Musser, 1998 [1]) is wrong: there is a second option: “…or there may be a slightly weaker form of gravity than GTR.”
In this paper, we argue that computer simulations can provide valuable insights into the performance of voting methods on different collective decision problems. This could improve institutional design, even when there is no general theoretical result to support the optimality of a voting method. To support our claim, we first describe a decision problem that has not received much theoretical attention in the literature. We outline different voting methods to address that collective decision problem. Under certain criteria of assessment akin to extensions of the Condorcet Jury Theorem, we run simulations for the methods using MATLAB, in order to compare their performance under various conditions. We consider and respond to concerns about the use of simulations in the assessment of voting procedures for policymaking.
The problem of measurement in economic models and the possibility of their quantum-mechanical description are considered. It is revealed that the apparent paradox of such a description is associated with the a priori requirement that the model conform to all the alternatives of the observer's free choice. The measurement of the state of a trader on a stock exchange is formally defined as his responses to proposals of sale at a fixed price. It is shown that an analogue of Bell's inequalities for this measurement model is violated under the most general assumptions about the trader's strategy, requiring a quantum-mechanical description of the dynamics of his condition. In the framework of the theory of weak continuous quantum measurements, the equation of stock price dynamics and a quantum-mechanical generalization of the F. Black and M. Scholes model for pricing options are obtained. The fundamental distinctions between the obtained model and the classical one are discussed.
The importance of corporate social responsibility is shaping investment decisions and entrepreneurial actions from diverse perspectives. The rapid growth of SMEs has tremendous impacts on the environment. Nonetheless, the economic emergence plan of Cameroon has prompted government support of SMEs through diverse projects. This saw economic growth increase to 3.8% and unemployment drop to 4.3%, driven by the expansion of private-sector investment. The dilemma that necessitated this study is the response strategy of SME operators toward environmental sustainability. This study thus seeks to examine the effects of entrepreneurial intentions and actions on environmental sustainability. The research is a conclusive case-study design supported by the philosophical underpinnings of objectivist ontology and positivist epistemology. Data were sourced from four hundred (400) SME operators purposively sampled from the Centre and Littoral regions of Cameroon using a structured questionnaire, and analysed using the structural equation modelling technique with the aid of the statistical packages SPSS 24 and AMOS 23. The study revealed that entrepreneurial action has a weak positive and statistically significant impact on environmental sustainability, whereas entrepreneurial intention has a strong positive and statistically significant effect. Entrepreneurial intention comprised self-efficacy and perceived control, whereas entrepreneurial action involved entrepreneurial alertness and uncertainty. This study concludes that entrepreneurs in Cameroon have sustainable intentions to protect the environment, but that the actions currently taken are inadequate. This research recommends that entrepreneurs enhance their efforts toward attaining a state of genuine sustainability.