The study aimed to analyze the lexical items embedded in the textbooks currently used in teaching ESP and GE. It employed content analysis, a systematic evaluation of texts that examines nuances and bridges quantitative and qualitative data. This was important, though difficult to study, given the issues of interest here: the frequency of lexical items in ESP and GE textbooks. Results identified 13,713 lexical items in Hospitality Management, 17,561 in Criminology, 4,576 in Tourism, 7,167 in Marine Engineering, and 512 in Information Technology. Furthermore, the largest share of ESP lexical items fell in Tier 2 (multiple-meaning vocabulary), while the smallest fell in Tier 3 (context-specific vocabulary). Vocabulary learning is at the core of English language teaching; its goal is to help learners better understand language, so that they can understand others and express themselves in turn, not only in speaking but also in writing and reading. Wilkins (1972, pp. 111-112) stated that without grammar very little can be conveyed, and without vocabulary nothing can be conveyed at all. Thus, even without good grammar, one who is equipped with useful words and expressions can still manage to communicate. Lewis (1993) argued that “lexis is the heart of language” and that it develops better fluency and expression in English. He also stressed how important it is for learners to acquire more productive vocabulary knowledge and to develop their own personal vocabulary strategies. A bridge-model program was therefore recommended to highlight the study findings, using the lexicons drawn from the respective courses.
This letter was rejected by International Knowledge Press because "we are unable to conclude that these findings would warrant publication in this journal." The letter suggests that dark energy, dark matter and universal expansion are intimately related. However, they aren't viewed as revolutions in cosmology essential to a complete understanding of the modern universe. They are instead viewed as properties that need to be added to the cosmos because Einstein's theory of gravity (General Relativity) is apparently still not thoroughly comprehended a little over a century after it was published.
Generalized Quantifier Theory, I will argue, led in the second half of the last century to an important rapprochement, relevant both to logic and to linguistics, between logical theories of quantification and the semantic analysis of quantification in natural languages. In this paper I concisely illustrate the formal aspects and the theoretical implications of this rapprochement.
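For readers unfamiliar with the framework, the rapprochement rests on treating natural-language determiners as relations between sets. A standard textbook rendering (Barwise and Cooper's setting, not notation drawn from this paper):

```latex
% A type <1,1> generalized quantifier is a relation between subsets
% A, B of a domain E; two standard examples:
\mathrm{every}_E(A, B) \iff A \subseteq B,
\qquad
\mathrm{most}_E(A, B) \iff |A \cap B| > |A \setminus B|
% On finite models, "most" is not expressible using first-order
% \forall and \exists alone; this is one formal payoff of the
% generalized framework.
```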
This article describes a generalized and constructive formal model of the processes of subjective and creative thinking. According to the author, the algorithm presented in the article is capable of real and arbitrarily complex thinking and is potentially able to report on the presence of consciousness.
Hossack's 'The Metaphysics of Knowledge' develops a theory of facts, entities in which universals are combined with universals or particulars, as the foundation of his metaphysics. While Hossack argues at length that there must be negative facts, facts in which the universal 'negation' is combined with universals or particulars, his conclusion that there are also general facts, facts in which the universal 'generality' is combined with universals, is reached rather more swiftly. In this paper I present Hossack with three arguments for his conclusion. They all draw, as does Hossack's theory of facts, on views Russell expressed in various writings. Two arguments are based on Russell's explanation of universals as aspects of resemblance; the third on Russell's observation that general propositions do not follow logically from exclusively particular premises. Comparison with other metaphysics of generality shows them to be wanting and Russell's and Hossack's accounts superior.
The paper discusses the philosophical conclusions that the interrelation between quantum mechanics and general relativity implies via quantum measure. Quantum measure is three-dimensional, both universal as the Borel measure and complete as the Lebesgue one. Its unit is a quantum bit (qubit) and can be considered a generalization of the unit of classical information, the bit. It allows quantum mechanics to be interpreted in terms of quantum information, and all physical processes to be seen as informational in a generalized sense. This implies a fundamental connection between the physical and material, on the one hand, and the mathematical and ideal, on the other. Quantum measure unifies them by a common and joint informational unit. Furthermore, the approach clarifies philosophically how quantum mechanics and general relativity can be understood, correspondingly, as the holistic and temporal aspects of one and the same thing: the state of a quantum system, e.g. that of the universe as a whole. The key link between them is the notion of the Bekenstein bound, as well as that of quantum temperature. General relativity can be interpreted as a special particular case of quantum gravity. All the principles laid down by Einstein (1918) reduce the latter to the former; consequently, their generalization, and therefore violation, bears directly on a theory of quantum gravity. Quantum measure reinterprets the "Big Bang" theories about the beginning of the universe. It measures jointly any quantum leap and any smooth motion, complementary to each other, and thus the jump-like initiation of anything and the corresponding continuous process of its appearance. Quantum measure unifies the "Big Bang" and the whole visible expansion of the universe as two complementary "halves" of one and the same thing: the set of all states of the universe as a whole. It offers a scientific viewpoint on "creation from nothing".
Examining the significance of the General’s enlightenment in the Platform Sutra, this article clarifies the fundamental role that emotions play in the development of one’s spiritual understanding. In order to do so, this article emphasizes that the way to enlightenment implicit in the story of the General and the Master involves first granting negative emotions a means for productive expression. By acting as a preparatory measure for calming the mind and surrendering control over it, human passions become a necessary, antecedent condition to wisdom, a conclusion that this article argues is a major, and sometimes underappreciated, lesson embedded in the teachings of the Sixth Patriarch.
In responding to my article, Quinn raises the question of development in science and scientific method. He picks up on the topic of the last section of my paper, and suggests that “generalized empirical method” will be “coherent with the essential dynamics of scientific progress.” He points out that, if implemented, such an extended method “promises to be a way toward new and practical results”.
Abstract. The theory-change epistemological model, previously tried on the Maxwellian revolution and the genesis of special relativity, is unfolded here to apprehend the genesis of general relativity. It is exhibited that the dynamics of general relativity (GR) construction was largely governed by internal tensions between special relativity and Newton’s theory of gravitation. The encounter of research traditions engendered the construction of a hybrid domain, at first with an irregular set of theoretical models. However, step by step, by revealing and gradually eliminating the contradictions between the models involved, the hybrid set was put into order with the help of the equivalence principle. A hierarchy of theoretical models, starting from the crossbreeds and up to usual hybrids, was moulded. The claim put forward is that Einstein’s unification design could be successfully implemented because his programme embraced the ideas of the Nordström research programme, as well as the presuppositions of the programme of Max Abraham. By and large, Einstein’s victory over his rivals became possible because the core of his research strategy was formed by the equivalence principle comprehended in the light of Kantian epistemology. It is argued that the theories of Nordström and Abraham, devised before November 25, 1915, were not merely scaffolds for constructing the basic GR model; they remain a necessary part of the whole GR theory, required for its common use. Key words: Einstein, Nordström, Abraham, general relativity.
Although there are significant differences between the philosophy of Mario Bunge and that of Graham Harman, there are also fundamental similarities between them. Both thinkers claim that it is possible to develop a general theory of objects. The former holds that the theory in question is logico-mathematical, while the latter suggests that it is ontological. In any case, they agree that all objects must be considered, whether they are real or not. Furthermore, they suggest that even though no object should be excluded from the theory, it is necessary to distinguish different types of objects.
Vogel argues that sensitivity accounts of knowledge are implausible because they entail that we cannot have any higher-level knowledge that our beliefs are true, not false. Becker and Salerno object that Vogel is mistaken because he does not formalize higher-level beliefs adequately. They claim that if formalized correctly, higher-level beliefs are sensitive and can therefore constitute knowledge. However, these accounts do not consider the belief-forming method, as sensitivity accounts require. If we take bootstrapping as the belief-forming method, as the discussed cases suggest, then we face a generality problem. Our higher-level beliefs as formalized by Becker and Salerno turn out to be sensitive on a wide reading of bootstrapping, but insensitive on a narrow reading. This particular generality problem does not arise for the alternative accounts of process reliabilism and basis-relative safety. Hence, sensitivity accounts not only deliver opposite results for different formalizations of higher-level beliefs, but also for the same formalization, depending on how we interpret bootstrapping. Therefore, sensitivity accounts do not fail because they make higher-level knowledge impossible, as Vogel argues, and they do not succeed in allowing higher-level knowledge, as Becker and Salerno suggest. Rather, their problem is that they deliver far too heterogeneous results.
This article outlines a method of collaboration that promises a high probability of cumulative and progressive results in science. The method accomplishes this through a division of labour grounded in the order of occurrence of human cognitional operations. The article explores the possibility of a method known as functional specialization, a division into distinct tasks presently operative in neuroscience. Functional specialization will enhance collaboration within a science as well as initiate the implementation of generalized empirical method, which will be achieved by focusing individual specialties on specific mental operations.
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, verifying De Morgan’s Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on the coherence of prevision assessments for some families of compounded conditionals; in particular, we examine the Fréchet-Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction C_{n+1} of n+1 conditional events to the family {C_n, E_{n+1}|H_{n+1}}. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related to the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non-p-valid inference rules; then, we illustrate by an example two methods which allow one to suitably modify non-p-valid inference rules in order to obtain inferences which are p-valid.
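For orientation, the classical Fréchet-Hoeffding bounds in the unconditional case, whose analogue for previsions of conjoined conditionals the abstract mentions (a standard statement, not the paper's own formulation):

```latex
% For events A, B with P(A) = x and P(B) = y, coherence requires
\max\{x + y - 1,\ 0\} \;\le\; P(A \wedge B) \;\le\; \min\{x,\ y\}
% The paper establishes the corresponding bounds for the prevision
% of the conjunction of conditional events.
```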
A general framework for translating various logical systems is presented, including a set of partial unary operators of affirmation and negation. Despite its usual reading, affirmation is not redundant in any domain of values whenever it does not behave like a full mapping. After depicting how partial functions proceed, a number of logics are translated through a variety of affirmations and a unique pair of negations. This relies upon two preconditions: a deconstruction of truth-values as ordered and structured objects, unlike their mainstream presentation as simple objects; and a redefinition of the Principle of Bivalence as a set of four independent properties, such that its definition does not equate with normality.
This paper offers a modification of Fabrice Correia's and Alexander Skiles' ("Grounding, Essence, and Identity") definition of grounding in terms of generalized identity that extends it to zero-grounding. This definition promises to improve our understanding of zero-grounding by capturing it within the framework of generalized identity and allows an essentialist theory of modality based on Correia's and Skiles' account to resist a recent challenge by Jessica Leech. The latter is achieved by combining the following two ideas: (1) some necessities are grounded in truths about zero-grounding, and (2) at least some identity truths are zero-grounded. Finally, some advantages of the zero-grounding approach over Correia's and Skiles' recent definition of necessity in terms of generalized identity and logical consequence are argued for.
The generalized Darwinian research programme accepts physicalism, but holds that all life is purposive in character. It seeks to understand how and why all purposiveness has evolved in the universe, especially purposiveness associated with what we value most in human life, such as sentience, consciousness, person-to-person understanding, science, art, freedom, love. As evolution proceeds, the mechanisms of evolution themselves evolve to take into account the increasingly important role that purposive action can play, especially when quasi-Lamarckian evolution by cultural means comes into existence. This programme of research brings together, into a coherent field of inquiry, aspects of such diverse fields of research as orthodox Darwinian theory (given its purposive interpretation), the study of animal behaviour, palaeontology, archaeology, history, anthropology, psycho-neurology, artificial intelligence, psychology, sociology, philosophy, linguistics, semantics, history and philosophy of science, and history and philosophy of inquiry more generally (the history and philosophy of ideas and culture). The great advantage of the generalized Darwinian research programme is that it provides a framework for understanding the deeds, achievements and experiences of people in a way that is compatible with the kind of knowledge and understanding achieved in the physical sciences, without being reducible to such knowledge and understanding. It promises to enable us to understand ourselves as a part of the biological domain without our humanity, our distinctive human value, being in any way denied: persons are not reduced to animals, and nor are animals misconceived to be persons. It holds out the hope that we can come to understand the human world as an integral part of the natural world without the meaning and value of the human world being thereby conceptually annihilated. The programme specifies in general terms what we must seek to do in order to develop a coherent understanding of nature and of ourselves which does justice to the character of both.
A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessments and optimizations of pattern recognition, predictions, and detection with the generalized information criterion are discussed. For the economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is a theory for data compression and also a theory for matching objective channels with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
The 'complexity' approach can be positive and very helpful for General Linguistics theory because it departs from: a) the idea that knowledge or meaning can exist without a being who produces them; b) the fragmented and reductionist view of reality and its too mechanistically oriented images; c) 'linear' causality models; d) the tendency to dichotomise the categories about reality; e) the Aristotelian principle of the 'excluded third' (binary logic: if something is here it is not there); f) the disappearance of the mind in some 'higher' social sciences; g) an inadequate approach to the relationships between the whole and its parts; and h) a perspective on creativity too much based on logic and not on 'artistic' intuition and imagination in science.
Penelope Maddy’s Second Philosophy is one of the most well-known approaches in recent philosophy of mathematics. She applies her second-philosophical method to analyze mathematical methodology by reconstructing historical cases in a setting of means-ends relations. However, outside of Maddy’s own work, this kind of methodological analysis has not yet been extensively used and analyzed. In the present work, we make a first step in this direction. We develop a general framework that allows us to clarify the procedure and aims of the Second Philosopher’s investigation into set-theoretic methodology; provides a platform to analyze the Second Philosopher’s methods themselves; and can be applied to further questions in the philosophy of set theory.
In the mid-1960s, Soviet computer scientist Mikhail Moiseevich Bongard created sets of visual puzzles where the objective was to spot an easily justifiable difference between two sides of a single image (for instance, white shapes vs. black shapes). The idea was that these puzzles could be used to teach computers the general faculty of abstraction: perhaps by learning to spot the differences between these sorts of images, a computational agent could learn about inference in general. Considered a global expert on Bongard problems, cognitive scientist Harry Foundalis developed the Phaeaco cognitive architecture for his PhD thesis, based on emulating cognition by solving the problems, creating a kind of artificial intelligence. In this paper, the extent to which Foundalis' approach allows for artificial general intelligence (the ability to reproduce a wide range of human abilities, the goal of cognitive models) is evaluated, with reference to Daniel Dennett’s reductive theory of mind and Immanuel Kant’s concepts of the phenomenon and the noumenon. The point of view presented is that Phaeaco is missing several characteristics of general artificial intelligence.
Maxwell’s Classical Electrodynamics (MCED) suffers from several inconsistencies: (1) the Lorentz force law of MCED violates Newton’s Third Law of Motion (N3LM) in the case of stationary and divergent or convergent current distributions; (2) the general Jefimenko electric field solution of MCED shows two longitudinal far fields that are not waves; (3) the ratio of the electrodynamic energy-momentum of a charged sphere in uniform motion has an incorrect factor of 4/3. A consistent General Classical Electrodynamics (GCED) is presented that is based on Whittaker’s reciprocal force law, which satisfies N3LM. The Whittaker force is expressed as a scalar magnetic field force, added to the Lorentz force. GCED is consistent only if it is assumed that the electric potential velocity in vacuum, ’a’, is much greater than ’c’ (a ≫ c); GCED reduces to MCED if we assume a = c. Longitudinal electromagnetic waves and superluminal longitudinal electric potential waves are predicted. This theory has been verified by seemingly unrelated experiments, such as the detection of superluminal Coulomb fields and longitudinal Ampère forces, and has a wide range of electrical engineering applications.
The aim of this paper is twofold. First, to generalize Quine's epistemology: to show that what Quine refutes in traditional epistemology is not only Cartesian foundationalism and Carnapian reductionism, but also any epistemological program that takes atomic verificationist semantics or supernaturalism, which are rooted in the linguistic/factual distinction of individual sentences, as its underlying system. Thus, we will see that the range of naturalization in the Quinean sense is not as narrow as his critics think. Second, to normalize Quine's epistemology: to explain in what sense Quinean naturalized epistemology is normative. The reason I maintain that critics miss the point of Quinean naturalized epistemology is that they do not appreciate the close connection between Quine's naturalistic approach and his holistic approach to epistemology. To show this I shall reconstruct Quine's argument for naturalizing epistemology within his systematic philosophy, and focus specifically on his holism and its applications, on which Quine relies both in arguing against traditional epistemology and in supporting his theses of the underdetermination of physical theory and the indeterminacy of translation. This is the key to understanding the scope and the normativity of Quine's epistemology. In the conclusion I will point out what the genuine problems are for Quinean naturalized epistemology.
Robert Henman’s article, "Can brain scanning and imaging techniques contribute to a theory of thinking?", came to my attention recently. My own work over the years has included applications of mathematics in the biological sciences, collaboration in experimental work in biochemistry, as well as work in the philosophy of the biological sciences. Henman suggests something that, at present, would be out of the ordinary.
The thesis is that the “periodic table” of “dark matter” is equivalent to the standard periodic table of visible matter in entangled states. Thus, it should consist of all possible entangled states of the atoms of the chemical elements as quantum systems. In other words, an atom of any chemical element, as a quantum system, i.e. as a wave function, should be represented as a generally non-orthogonal (i.e. entangled) subspace of the separable complex Hilbert space relevant to the system to which the atom at issue is related as a true part of it. The paper follows previous publications of mine stating that “dark matter” and “dark energy” are projections of arbitrarily entangled states on the cognitive “screen” of Einstein’s “Mach’s principle” in general relativity, which postulates that a gravitational field can be generated only by mass or energy.
Widespread amongst scholars is the legend according to which Locke shows a strong aversion to abstract ideas, similar to that of Berkeley in the Treatise. This legend is endorsed by influential commentators on Locke. Yet Locke does not even propose the reduction of ideas to mental pictures (a reduction which in Berkeley and Hume forms the basis of the negation of the existence of abstract ideas in the mind). Locke is not in the least afraid of abstract ideas; his constant concern, which is evident in his treatment of the complex question of the relation between real and nominal essence, is to refute the position of the Scholastics, according to which a universal concept in the mind (post rem) reflects the universal present in all things as substantial form (the universal in re), without assuming positions which are purely conventionalist and nominalist with regard to knowledge, such as those of Mersenne, Gassendi, Hobbes and sceptical and anti-Cartesian free-thinkers. To show this, I offer an analysis of the relation Locke draws between real and nominal essence, with regard to the relations which link term to idea and idea to things. The nature of the relation between signifier and signified is variable, though, in the relation between ideas and things with respect to the various kinds of complex ideas which the human mind may frame. The greatest difference is to be found between complex ideas of mixed modes and complex ideas of substances.
Epistemic rationality is typically taken to be immodest at least in this sense: a rational epistemic state should always take itself to be doing at least as well, epistemically and by its own lights, as any alternative epistemic state. If epistemic states are probability functions and their alternatives are other probability functions defined over the same collection of propositions, we can capture the relevant sense of immodesty by claiming that epistemic utility functions are (strictly) proper. In this paper I examine what happens if we allow the alternatives to an epistemic state to include probability functions with different domains. I first prove an impossibility result: on minimal assumptions, I show that there is no way of vindicating strong immodesty principles to the effect that any probability function should take itself to be doing at least as well as any alternative probability function, regardless of its domain. I then consider alternative, weaker generalizations of the traditional immodesty principle and prove some characterization results for classes of epistemic utility functions satisfying each of the relevant principles.
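As background, the usual statement of strict propriety over a fixed domain, which the paper generalizes to varying domains (textbook notation, not the paper's):

```latex
% U is strictly proper iff every probability function p expects
% itself to do strictly better than any rival q on the same domain W:
\sum_{w \in W} p(w)\, U(p, w) \;>\; \sum_{w \in W} p(w)\, U(q, w)
\quad \text{for all probability functions } q \neq p
% Example: the negative Brier score
% U(p, w) = -\sum_{X} \bigl(p(X) - \mathbb{1}_{w}(X)\bigr)^{2}
% is strictly proper.
```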
Assuming that votes are independent, the epistemically optimal procedure in a binary collective choice problem is known to be a weighted supermajority rule with weights given by personal log-likelihood-ratios. It is shown here that an analogous result holds in a much more general model. Firstly, the result follows from a more basic principle than expected-utility maximisation, namely from an axiom (Epistemic Monotonicity) which requires neither utilities nor prior probabilities of the ‘correctness’ of alternatives. Secondly, a person’s input need not be a vote for an alternative; it may be any type of input, for instance a subjective degree of belief or probability of the correctness of one of the alternatives. The case of a profile of subjective degrees of belief is particularly appealing, since here no parameters such as competence parameters need to be known.
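The classical binary result that the paper generalizes is easy to state in code. A minimal sketch, assuming independent voters with known competences; the function name and inputs are illustrative, not from the paper:

```python
import math

def optimal_weighted_vote(votes, competences):
    """Binary collective choice with log-likelihood-ratio weights.

    votes: +1/-1 ballots for the two alternatives.
    competences: each voter's probability p of voting correctly,
    assumed independent with 0.5 < p < 1. Weighting each ballot by
    log(p / (1 - p)) is the classical epistemically optimal rule
    in this binary setting.
    """
    score = sum(v * math.log(p / (1.0 - p))
                for v, p in zip(votes, competences))
    return 1 if score > 0 else (-1 if score < 0 else 0)  # 0 marks a tie

# Two highly competent voters outweigh a simple majority of three
# mediocre ones:
print(optimal_weighted_vote([+1, +1, -1, -1, -1],
                            [0.9, 0.9, 0.6, 0.6, 0.6]))  # -> 1
```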
In this paper, I define and study an abstract algebraic structure, the dimensive algebra, which embodies the most general features of the algebra of dimensional physical quantities. I prove some elementary results about dimensive algebras and suggest some directions for future work.
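The abstract does not spell out the axioms, but the structure being abstracted is familiar from dimensional analysis: multiplication of quantities is always defined and adds dimension exponents, while addition is defined only between like-dimensioned quantities. A small illustrative model (the class and the three-dimension basis are my choices, not the paper's):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Quantity:
    """A dimensioned quantity: a magnitude plus integer exponents
    over a basis of dimensions (length, mass, time)."""
    value: float
    dims: Tuple[int, int, int]

    def __mul__(self, other: "Quantity") -> "Quantity":
        # Multiplication is total: dimension exponents add.
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __add__(self, other: "Quantity") -> "Quantity":
        # Addition is partial: defined only for like dimensions.
        if self.dims != other.dims:
            raise TypeError("cannot add quantities of unlike dimension")
        return Quantity(self.value + other.value, self.dims)

second = Quantity(1.0, (0, 0, 1))
speed = Quantity(3.0, (1, 0, -1))   # metres per second
print(speed * second)               # a length: dims == (1, 0, 0)
```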
Several decades ago, Wheeler and Misner presented a model of electric charge based on the topological trapping of electric field lines in wormholes. In this paper, which does not argue for or against the "charge without charge" concept, I describe some generalizations of this model which might serve as topological analogs of color charges and electroweak charges.
Abstract: The main goal of this work is to find the general solution for a class of linear second-order homogeneous differential equations with variable coefficients of the general form y'' + p(x)y' + q(x)y = 0, by using the substitution z = y'/y, which transforms the above equation into a Riccati equation.
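A sketch of the standard reduction, assuming the usual form of the equation and substitution:

```latex
% Standard form and substitution:
y'' + p(x)\,y' + q(x)\,y = 0, \qquad z = \frac{y'}{y}
\quad \bigl(\text{equivalently } y = e^{\int z\,dx}\bigr)
% Then y' = z y and y'' = (z' + z^2)\,y, so dividing the equation
% by y yields the Riccati equation
z' + z^2 + p(x)\,z + q(x) = 0
```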
In this article, a possible generalization of Löb’s theorem is considered. The main result is: let κ be an inaccessible cardinal; then ¬Con(ZFC + ∃κ).
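For context, the classical theorem being generalized:

```latex
% Löb's theorem, for a sentence \varphi and a theory T extending
% Peano Arithmetic, with standard provability predicate \mathrm{Prov}_T:
\text{if } T \vdash \mathrm{Prov}_T(\ulcorner \varphi \urcorner) \rightarrow \varphi,
\text{ then } T \vdash \varphi
```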
Lipsey and Lancaster's "general theory of second best" is widely thought to have significant implications for applied theorizing about the institutions and policies that most effectively implement abstract normative principles. It is also widely thought to have little significance for theorizing about which abstract normative principles we ought to implement. Contrary to this conventional wisdom, I show how the second-best theorem can be extended to myriad domains beyond applied normative theorizing, and in particular to more abstract theorizing about the normative principles we should aim to implement. I start by separating the mathematical model used to prove the second-best theorem from its familiar economic interpretation. I then develop an alternative normative-theoretic interpretation of the model, which yields a novel second-best theorem for idealistic normative theory. My method for developing this interpretation provides a template for developing additional interpretations that can extend the reach of the second-best theorem beyond normative theoretical domains. I also show how, within any domain, the implications of the second-best theorem are more specific than is typically thought. I conclude with some brief remarks on the value of mathematical models for conceptual exploration.
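A compact statement of the Lipsey-Lancaster model that the paper reinterprets (a standard textbook rendering, not the paper's notation):

```latex
% Maximize F(x_1,\dots,x_n) subject to \Phi(x_1,\dots,x_n) = 0,
% writing F_i := \partial F / \partial x_i. The first-best optimum
% satisfies, for all i,
F_i / F_n = \Phi_i / \Phi_n
% Second best: suppose one of these conditions is unattainable and
% is replaced by the constraint
F_1 / F_n = k\, \Phi_1 / \Phi_n, \qquad k \neq 1
% Then at the constrained optimum the remaining first-best conditions
% F_i/F_n = \Phi_i/\Phi_n (i \neq 1) in general no longer hold.
```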
In this paper, I present a general theory of topological explanations and illustrate its fruitfulness by showing how it accounts for explanatory asymmetry. My argument is developed in three steps. In the first step, I show what it is for some topological property A to explain some physical or dynamical property B. Based on that, I derive three key criteria of successful topological explanations: a criterion concerning the facticity of topological explanations, i.e. what makes such an explanation true of a particular system; a criterion for describing counterfactual dependencies in two explanatory modes, the vertical and the horizontal; and, finally, a third, perspectival one that tells us when to use the vertical and when to use the horizontal mode. In the second step, I show how this general theory of topological explanations accounts for explanatory asymmetry in both the vertical and horizontal explanatory modes. Finally, in the third step, I argue that this theory is universally applicable across the biological sciences, which helps to unify essential concepts of biological networks.
An examination of the role played by general rules in Hume's positive (nonskeptical) epistemology. General rules for Hume are roughly just general beliefs. The difference between justified and unjustified belief is a matter of the influence of good versus bad general rules, the good general rules being the "extensive" and "constant" ones.
We present a module-based criterion, i.e. a sufficient condition based on the absolute values of the matrix coefficients, for the convergence of the Gauss–Seidel method (GSM) for a square system of linear algebraic equations: the Generalized Line Criterion (GLC). We prove GLC to be the “most general” module-based criterion and derive, as GLC corollaries, some previously known and also some new criteria for GSM convergence. Although far more general than the previously known results, the proof of GLC is simpler. The results used here are related to recent research in the stability of dynamical systems and the control of manufacturing systems.
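The abstract does not state the GLC itself, so the sketch below pairs plain GSM with the classical strict-row-diagonal-dominance check, one of the previously known criteria that the GLC subsumes (function name and tolerances are illustrative):

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b by the Gauss-Seidel method (GSM)."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    # Classical module-based sufficient condition: strict row
    # diagonal dominance guarantees GSM convergence (the paper's GLC
    # is a strictly more general criterion than this check).
    dominant = all(abs(A[i, i]) > sum(abs(A[i, j]) for j in range(n) if j != i)
                   for i in range(n))
    if not dominant:
        print("warning: not strictly diagonally dominant; this simple "
              "check cannot guarantee convergence")
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # New values x[:i] are used immediately; old values fill
            # the strictly upper-triangular part.
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

# A strictly diagonally dominant example; exact solution (1/6, 1/3):
print(gauss_seidel([[4.0, 1.0], [2.0, 5.0]], [1.0, 2.0]))
```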
Einstein structured the theoretical frame of his work on gravity around special relativity and Minkowski’s spacetime, using three guiding principles. The strong principle of equivalence establishes that acceleration and gravity are equivalent. Mach’s principle explains the inertia of bodies and particles as completely determined by the total mass existing in the universe. And general covariance seeks to extend the principle of relativity from inertial motion to accelerated motion. Mach’s principle was quickly abandoned; general covariance turned out to be a mathematical property of tensors; and the principle of equivalence proved inconsistent, applying only to punctual gravity, not to extended gravity. Also, the basic principle of special relativity, i.e. the constancy of the speed of the electromagnetic wave in vacuum, was abandoned; static Minkowski spacetime was replaced by the dynamic Lorentz manifold; and it is not known what the main conceptual foundation of the theory, i.e. spacetime, is. On the other hand, gravity was never conceptually defined, nor is it answered what the law of gravity is in general. However, the predictions arising from Einstein’s equations are rigorously exact. Thus, the conclusion is that on gravity we have only the equations. This work shows that the principle of equivalence really applies to both punctual and extended gravity; that gravity is defined as an effect of a change of coordinates, although in the case of extended gravity with a change of geometry from Minkowski spacetime to a Lorentz manifold; and that gravitational motion is geodesic motion, which may well be declared the general law of gravity.
We outline the possibility of extending the quantum formalism in relation to the requirements of general systems theory. This can be done by using a quantum semantics arising from the deep logical structure of quantum theory, making it possible to take into account the logical openness relationship between observer and system. We show how considering the truth-values of quantum propositions within the context of fuzzy sets is more useful here for systemics. In conclusion, we propose an example of formal quantum coherence.
What does it mean for a general term to be rigid? It is argued by some that if we take general terms to designate their extensions, then almost no empirical general term will turn out to be rigid; and if we take them to designate some abstract entity, such as a kind, then it turns out that almost all general terms will be rigid. Various authors who pursue this line of reasoning have attempted to capture Kripke’s intent by defining a rigid general term as one that applies to the objects in its extension essentially. I argue that this account is significantly mistaken, for various reasons: it conflates a metaphysical notion (essentialism) with a semantic one (rigidity); it fails to countenance the fact that any term can be introduced into a language by stipulating that it be a rigid designator; it limits the extension of rigid terms so much that terms such as ‘meter’, ‘rectangle’, ‘truth’, etc. do not turn out to be rigid, when they obviously are; and it wrongly concentrates on the predicative use of a general term in applying a certain test offered by Kripke to determine whether a term is rigid.
Metacognition is the capacity to evaluate the success of one's own cognitive processes in various domains, for example, memory and perception. It remains controversial whether metacognition relies on a domain-general resource that is applied to different tasks or whether self-evaluative processes are domain-specific. Here, we investigated this issue directly by examining the neural substrates engaged when metacognitive judgments were made by human participants of both sexes during perceptual and memory tasks matched for stimulus and performance characteristics. By comparing patterns of fMRI activity while subjects evaluated their performance, we revealed both domain-specific and domain-general metacognitive representations. Multivoxel activity patterns in anterior prefrontal cortex predicted levels of confidence in a domain-specific fashion, whereas domain-general signals predicting confidence and accuracy were found in a widespread network in the frontal and posterior midline. The demonstration of domain-specific metacognitive representations suggests the presence of a content-rich mechanism available to introspection and cognitive control.
The question of whether there are laws of nature in ecology has developed substantially in the last 20 years. Many have attempted to rehabilitate ecology’s lawlike status by establishing that ecology possesses laws that robustly appear across many different ecological systems. I argue that there is still something missing, which explains why so many have been skeptical of ecology’s lawlike status: community ecology has struggled to establish what I call a General Unificatory Theory (GUT). The lack of a GUT causes problems for explanation, as there are no guidelines for how to integrate the lower-level mathematical and causal models into a larger theory of how ecological assemblages are formed. I turn to a promising modern attempt to provide a unified higher-level explanation in ecology, presented by ecologist Mark Vellend, and advocate for philosophical engagement with its prospects for aiding ecological explanation.
Although written in Japanese, 経験的綜合判断の一般的形式 (The General Form of Empirical Synthetic Judgement) pursues the general logical form of Kantian empirical synthetic judgements.
A wide range of problems concerning the relationship between consciousness and matter are discussed. Particular attention is paid to the analysis of the structure and properties of consciousness in the framework of information evolution. The role of specific (non-computational) properties of consciousness in the procedure of classical and quantum measurements is analyzed. In particular, the issue of "cloning" of consciousness (the possibility of copying its properties onto a new material carrier) is discussed in detail. We hope that the generalized principle of complementarity formulated by us will open up new ways for studying the problems of consciousness within the framework of the fundamental physical picture of the world.
According to Intellectualism, knowing how to V is a matter of knowing a suitable proposition about a way of V-ing. In this paper, I consider the question of which ways of acting might figure in the propositions which Intellectualists claim constitute the object of knowledge-how. I argue that Intellectualists face a version of the Generality Problem, familiar from discussions of Reliabilism, since not all ways of V-ing are such that knowledge about them suffices for knowledge-how. I consider various responses to this problem, and argue that none are satisfactory.
The dynamics of general relativity is encoded in a set of ten differential equations, the so-called Einstein field equations. It is usually believed that Einstein's equations represent a physical law describing the coupling of spacetime with material fields. However, just six of these equations actually describe the coupling mechanism: the remaining four represent a set of differential relations known as the Bianchi identities. The paper discusses the physical role that the Bianchi identities play in general relativity, and investigates whether these identities, qua part of a physical law, highlight some kind of a posteriori necessity in a Kripkean sense. The inquiry shows that general relativistic physics has an interesting bearing on the debate about the metaphysics of the laws of nature.
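In standard notation, the ten equations and the four identities at issue:

```latex
% Ten field equations (G_{\mu\nu} is symmetric in four dimensions):
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
% Four contracted Bianchi identities, which hold for any metric:
\nabla^{\mu} G_{\mu\nu} = 0
% These four identities leave six independent equations to do the
% actual coupling of geometry to matter.
```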
I present and defend the generalized selected effects theory (GSE) of function. According to GSE, the function of a trait consists in the activity that contributed to its bearer’s differential reproduction, or differential retention, within a population. Unlike the traditional selected effects (SE) theory, it does not require that the functional trait helped its bearer reproduce; differential retention is enough. Although the core theory has been presented previously, I go significantly beyond those presentations by providing a new argument for GSE and defending it from a recent objection. I also sketch its implications for teleosemantics and philosophy of medicine.
I distinguish two ways an ability might be general: (i) an ability might be general in that its possession doesn't entail the possession of an opportunity; (ii) an ability might be general in virtue of pertaining to a wide range of circumstances. I argue that these two types of generality, which I refer to with the terms ‘general’ and ‘generic’, respectively, produce two orthogonal distinctions among abilities. I show that the two types of generality are sometimes run together by those writing on free will, and argue that both types of generality are relevant to understanding the modality of abilities.
For anyone with an interest in the philosophical teachings of Ch’an (Zen Buddhism), the Platform Sutra is arguably the classic source of philosophical, as opposed to religious, Ch’an. The text is exclusively concerned with expounding the nature of Ch’an and its key feature: enlightenment achieved by the mind alone, or by pure understanding without the assistance of textual authority, religious devotion, charitable acts, meditative practices or monastic discipline. Yet, despite its centrality in Zen Buddhism, the book presents one account of enlightenment that has received very little attention: the story of the General. It is commonly thought that emotions are to be repressed in order to attain enlightenment. The argument I would like to present is that one case of attaining enlightenment recounted in the Platform Sutra shows that one ought to take a very different attitude to desire and emotions than annihilation, tranquilization or repression. From the account of the General, it would appear that desires and emotions are not to be simply eradicated or repressed. Rather, they must first be expressed and then acknowledged as possessed before one can attain enlightenment.