An early, very preliminary edition of this book was circulated in 1962 under the title Set-theoretical Structures in Science. There are many reasons for maintaining that such structures play a role in the philosophy of science. Perhaps the best is that they provide the right setting for investigating problems of representation and invariance in any systematic part of science, past or present. Examples are easy to cite. Sophisticated analysis of the nature of representation in perception is to be found already in Plato and Aristotle. One of the great intellectual triumphs of the nineteenth century was the mechanical explanation of such familiar concepts as temperature and pressure by their representation in terms of the motion of particles. A more disturbing change of viewpoint was the realization at the beginning of the twentieth century that the separate invariant properties of space and time must be replaced by the space-time invariants of Einstein's special relativity. Another example, the focus of the longest chapter in this book, is the controversy, extending over several centuries, over the proper representation of probability. The six major positions on this question are critically examined. Topics covered in other chapters include an unusually detailed treatment of theoretical and experimental work on visual space, the two senses of invariance represented by weak and strong reversibility of causal processes, and the representation of hidden variables in quantum mechanics. The final chapter concentrates on different kinds of representations of language, concluding with some empirical results on brain-wave representations of words and sentences.
Many philosophers are baffled by necessity. Humeans, in particular, are deeply disturbed by the idea of necessary laws of nature. In this paper I offer a systematic yet down-to-earth explanation of necessity and laws in terms of invariance. The type of invariance I employ for this purpose generalizes an invariance used in meta-logic. The main idea is that properties and relations in general have certain degrees of invariance, and some properties/relations have a stronger degree of invariance than others. The degrees of invariance of highly-invariant properties are associated with high degrees of necessity of laws governing/describing these properties, and this explains the necessity of such laws both in logic and in science. This non-mysterious explanation has rich ramifications for both fields, including the formality of logic and mathematics, the apparent conflict between the contingency of science and the necessity of its laws, the difference between logical-mathematical, physical, and biological laws/principles, the abstract character of laws, the applicability of logic and mathematics to science, scientific realism, and logical-mathematical realism.
Although the invariance criterion of logicality first emerged as a criterion of a purely mathematical interest, it has developed into a criterion of considerable linguistic and philosophical interest. In this paper I compare two different perspectives on this criterion. The first is the perspective of natural language. Here, the invariance criterion is measured by its success in capturing our linguistic intuitions about logicality and explaining our logical behavior in natural-linguistic settings. The second perspective is more theoretical. Here, the invariance criterion is used as a tool for developing a theoretical foundation of logic, focused on a critical examination, explanation, and justification of its veridicality and modal force.
Wigner’s quantum-mechanical classification of particle-types in terms of irreducible representations of the Poincaré group has a classical analogue, which we extend in this paper. We study the compactness properties of the resulting phase spaces at fixed energy, and show that in order for a classical massless particle to be physically sensible, its phase space must feature a classical-particle counterpart of electromagnetic gauge invariance. By examining the connection between massless and massive particles in the massless limit, we also derive a classical-particle version of the Higgs mechanism.
This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that are suggested by symmetry and invariance arguments; (c) Introduce the cognitive constructivism epistemological framework as a solution that overcomes the realism-subjectivism dilemma and its pitfalls. The work of the physicist and philosopher Max Born will be particularly important in our discussion.
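For concreteness, the de Finetti-type result referred to above can be stated in its simplest textbook form (this statement is standard and is not quoted from the paper): if a sequence of 0/1 observations is exchangeable, i.e. its joint law is invariant under permutations of the indices, then it is a mixture of i.i.d. Bernoulli sequences,

\[ P(X_1 = x_1, \ldots, X_n = x_n) \;=\; \int_0^1 \theta^{\sum_i x_i} (1-\theta)^{\,n-\sum_i x_i}\, d\mu(\theta), \]

for some probability measure \(\mu\) on \([0,1]\). The symmetry premise (exchangeability) is what licenses treating \(\theta\) as an objective-looking parameter, which illustrates the kind of subjective-versus-objective question at issue.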
What is a logical constant? The question is addressed in the tradition of Tarski's definition of logical operations as operations which are invariant under permutation. The paper introduces a general setting in which invariance criteria for logical operations can be compared and argues for invariance under potential isomorphism as the most natural characterization of logical operations.
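As a gloss on the criterion at issue (a standard formulation from the Tarskian tradition, not a quotation from this paper): an operation \(O\) over a domain \(M\) counts as logical under the permutation criterion just in case it is fixed by every permutation of the domain,

\[ \pi[O] = O \quad \text{for every bijection } \pi : M \to M , \]

where \(\pi\) is lifted pointwise to relations and higher-type objects over \(M\). Invariance under potential isomorphism, which the paper defends as the most natural characterization, instead requires preservation under back-and-forth systems of partial isomorphisms between structures, a stricter demand than invariance under the permutations of a single domain.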
What can rational deliberation indicate about belief? Belief clearly influences deliberation. The principle that rational belief is stake-invariant rules out at least one way that deliberation might influence belief. The principle is widely, if implicitly, held in work on the epistemology of categorical belief, and it is built into the model of choice-guiding degrees of belief that comes to us from Ramsey and de Finetti. Criticisms of subjective probabilism include challenges to the assumption of additive values (the package principle) employed by defenses of probabilism. But the value-interaction phenomena often cited in such challenges are excluded by stake-invariance. A comparison with treatments of categorical belief suggests that the appeal to stake-invariance is not ad hoc. Whether or not to model belief as stake-invariant is a question not settled here.
It is now standard to interpret symmetry-related models of physical theories as representing the same state of affairs. Recently, a debate has sprung up around the question of when this interpretational move is warranted. In particular, Møller-Nielsen (2017: 1253–1264) has argued that one is only allowed to interpret symmetry-related models as physically equivalent when one has a characterisation of their common content. I disambiguate two versions of this claim. On the first, a perspicuous interpretation is required: an account of the models’ common ontology. On the second, stricter, version of this claim, a perspicuous formalism is required in addition: one whose mathematical structures ‘intrinsically’ represent the physical world, in the sense of Field. Using Dewar’s (2019: 485–521) distinction between internal and external sophistication as a case study, I argue that the second requirement is decisive. This clarifies the conditions under which it is warranted to interpret symmetry-related models as physically equivalent.
I provide a theory of causation within the causal modeling framework. In contrast to most of its predecessors, this theory is model-invariant in the following sense: if the theory says that C caused (didn't cause) E in a causal model, M, then it will continue to say that C caused (didn't cause) E once we've removed an inessential variable from M. I suggest that, if this theory is true, then we should understand a cause as something which transmits deviant or non-inertial behavior to its effect.
The presence of symmetries in physical theories implies a pernicious form of underdetermination. In order to avoid this theoretical vice, philosophers often espouse a principle called Leibniz Equivalence, which states that symmetry-related models represent the same state of affairs. Moreover, philosophers have claimed that the existence of non-trivial symmetries motivates us to accept the Invariance Principle, which states that quantities that vary under a theory’s symmetries aren’t physically real. Leibniz Equivalence and the Invariance Principle are often seen as part of the same package. I argue that this is a mistake: Leibniz Equivalence and the Invariance Principle are orthogonal to each other. This means that it is possible to hold that symmetry-related models represent the same state of affairs whilst having a realist attitude towards variant quantities. Various arguments have been presented in favour of the Invariance Principle: a rejection of the Invariance Principle is inter alia supposed to cause indeterminism, undetectability or failure of reference. I respond that these arguments at best support Leibniz Equivalence.
Properties and relations in general have a certain degree of invariance, and some types of properties/relations have a stronger degree of invariance than others. In this paper I will show how the degrees of invariance of different types of properties are associated with, and explain, the modal force of the laws governing them. This explains differences in the modal force of laws/principles of different disciplines, starting with logic and mathematics and proceeding to physics and biology.
Quantum invariance designates the relation of any quantum coherent state to the corresponding statistical ensemble of measured results. The adequate generalization of ‘measurement’ is discussed to involve the discrepancy, due to the fundamental Planck constant, between any quantum coherent state and its statistical representation as a statistical ensemble after measurement. A set-theory corollary is the curious invariance to the axiom of choice: any coherent state excludes any well-ordering and thus excludes also the axiom of choice. It should be equated to a well-ordered set after measurement and thus requires the axiom of choice. Quantum invariance underlies quantum information and reveals it as the relation of an unordered quantum “much” (i.e. a coherent state) and a well-ordered “many” of the measured results (i.e. a statistical ensemble). It opens up a new horizon, in which all physical processes and phenomena can be interpreted as quantum computations realizing relevant operations and algorithms on quantum information. All phenomena of entanglement can be described in terms of the so defined quantum information. Quantum invariance elucidates the link between general relativity and quantum mechanics and thus the problem of quantum gravity.
Relativity Theory by Albert Einstein has so far been little considered by cognitive scientists, notwithstanding its undisputed scientific and philosophical moment. Unfortunately, we don't have a diary or notebook as cognitively useful as Faraday's. But physics historians and philosophers have done a great job that is relevant both for the study of the scientist's reasoning and the philosophy of science. I will try here to highlight the fertility of a 'triangulation' using cognitive psychology, history of science and philosophy of science in starting to answer a clearly very complex question: why did Einstein discover Relativity Theory? Here we are not much concerned with the unending question of precisely what Einstein discovered, which still remains unanswered, for we have no consensus over the exact nature of the theory's foundations. We are mainly interested in starting to answer the 'how question', and especially the following sub-question: what were his goals and strategies in his search? I will base my argument on fundamental publications of Einstein, aiming at pointing out a theory-specific heuristic, setting both a goal and a strategy: covariance/invariance. The result has significance in theory formation in science, especially in concept and model building. It also raises other questions that go beyond the aim of this paper: why was he so confident in such a heuristic? Why didn't many other scientists use it? Where did he get such a heuristic? Do we have any other examples of similar heuristic search in other scientific problem solving?
How should we understand the notion of moral objectivity? Metaethical positions that vindicate morality’s objective appearance are often associated with moral realism. On a realist construal, moral objectivity is understood in terms of mind-, stance-, or attitude-independence. But realism is not the only game in town for moral objectivists. On an antirealist construal, morality’s objective features are understood in virtue of our attitudes. In this paper I aim to develop this antirealist construal of moral objectivity in further detail, and to make its metaphysical commitments explicit. I do so by building on Sharon Street’s version of “Humean Constructivism”. Instead of the realist notion of attitude-independence, the antirealist account of moral objectivity that I articulate centres on the notion of standpoint-invariance. While constructivists have been criticized for compromising on the issue of moral objectivity, I make a preliminary case for the thesis that, armed with the notion of standpoint-invariance, constructivists have resources to vindicate an account of objectivity with just the right strength, given the commitments of ordinary moral thought and practice. In support of this thesis I highlight recent experimental findings about folk moral objectivism. Empirical observations about the nature of moral discourse have traditionally been taken to give prima facie support to moral realism. I argue, by contrast, that from what we can tell from our current experimental understanding, antirealists can capture the commitments of ordinary discourse at least as well as realists can.
Some philosophers say that in special relativity, four-dimensional stuff is invariant in some sense that three-dimensional stuff is not. I show that this claim is false.
This review is a critical discussion of three main claims in Debs and Redhead’s thought-provoking book Objectivity, Invariance, and Convention. These claims are: (i) Social acts impinge upon formal aspects of scientific representation; (ii) symmetries introduce the need for conventional choice; (iii) perspectival symmetry is a necessary and sufficient condition for objectivity, while symmetry simpliciter fails to be necessary.
This contribution is an invitation to consider the professional situation in a way that goes beyond a social meaning or a subjective approach. Understood as an intermediate object, the professional situation is studied as a result of tension between invariance and perspective. The data centre on the activity of counselors whose role is to guide farmers confronted with agro-environmental standards. This text brings into question, on the one hand, the attributions qualifying the situation as «professional» and that attest to a form of invariance; and on the other, the counselors’ activity of understanding the «perspective» of the farmers with whom they are in relation. Finally, the article suggests three avenues for future work on the notion of professional situation.
N. Cartwright’s results on invariance under intervention and causality (2003) are reconsidered. The procedural approach to causality elicited in this paper, and contrasted with Cartwright’s apparently philosophical one, unravels certain ramifications of her results. The procedural approach seems to license only a constrained notion of intervention, and in consequence the “correctness to invariance” part of Cartwright’s first theorem fails for a class of cases. The converse “invariance to correctness” part of the theorem relies heavily on modeling assumptions which prove to be difficult to validate in practice and are often buttressed by independently acquired evidence.
In this paper, we review a general technique for converting the standard Lagrangian description of a classical system into a formulation that puts time on an equal footing with the system's degrees of freedom. We show how the resulting framework anticipates key features of special relativity, including the signature of the Minkowski metric tensor and the special role played by theories that are invariant under a generalized notion of Lorentz transformations. We then use this technique to revisit a classification of classical particle-types that mirrors Wigner's classification of quantum particle-types in terms of irreducible representations of the Poincaré group, including the cases of massive particles, massless particles, and tachyons. Along the way, we see gauge invariance naturally emerge in the context of classical massless particles with nonzero spin, as well as study the massless limit of a massive particle and derive a classical-particle version of the Higgs mechanism.
The incompleteness of set theory ZFC leads one to look for natural extensions of ZFC in which one can prove statements independent of ZFC which appear to be "true". One approach has been to add large cardinal axioms. Or, one can investigate second-order expansions like Kelley-Morse class theory, KM, or Tarski–Grothendieck set theory TG [1]-[3]. The latter is a non-conservative extension of ZFC and is obtained from other axiomatic set theories by the inclusion of Tarski's axiom, which implies the existence of inaccessible cardinals [1]. A non-conservative extension of ZFC based on generalized quantifiers is considered in [4]. In this paper we deal with a set theory NC_{∞^{#}}^{#} based on bivalent hyper-infinitary logic with a restricted Modus Ponens rule [5]-[8]. Set theory NC_{∞^{#}}^{#} contains Aczel's anti-foundation axiom [9]. We present a new approach to the invariant subspace problem for Hilbert spaces. Our main result is that if T is a bounded linear operator on an infinite-dimensional complex separable Hilbert space H, then T has a non-trivial closed invariant subspace. A non-conservative extension, based on set theory NC_{∞}^{#}, of model-theoretical nonstandard analysis [10]-[12] is also considered.
The nature of temporal experience is typically explained in one of a small number of ways, most of which are versions of either retentionalism or extensionalism. After describing these, I make a distinction between two kinds of temporal character that could structure temporal experience: A-ish contents are those that present events as structured in past/present/future terms, and B-ish contents are those that present events as structured in earlier-than/later-than/simultaneous-with relations. There are a few exceptions, but most of the literature ignores this distinction, and silently assumes temporal experience is A-ish. I then argue that temporal character is not scale invariant, but rather that temporal experience is A-ish at larger scales, and B-ish at smaller scales. I then point out that this scale non-invariance opens the possibility of hybrid views. I clarify my own view as a hybrid view, according to which temporal experience is B-ish at small scales – and at this scale my trajectory estimation model (TEM) applies – but A-ish at larger scales, and at the larger scale my TEM does not apply. I then motivate this hybrid position by first defending it against arguments that have tried to show that the TEM is untenable. Since the hybrid view has the TEM as its small-scale component, it must address this objection. I then put pressure on the main alternative account, extensionalism, by showing that its proponents have not adequately dealt with the problem of temporal illusions. The result is a new theory motivated by i) explaining its virtues, ii) showing that objections to it can be met, and iii) showing that objections to its main competitors have not been met.
In this paper, I argue that the recent discussion on the time-reversal invariance of classical electrodynamics (see (Albert 2000: ch. 1), (Arntzenius 2004), (Earman 2002), (Malament 2004), (Horwich 1987: ch. 3)) can be best understood assuming that the disagreement among the various authors is actually a disagreement about the metaphysics of classical electrodynamics. If so, the controversy will not be resolved until we have established which alternative is the most natural. It turns out that we have a paradox, namely that the following three claims are incompatible: the electromagnetic fields are real, classical electrodynamics is time-reversal invariant, and the content of the state of affairs of the world does not depend on whether it belongs to a forward or a backward sequence of states of the world.
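For reference, the standard textbook time-reversal transformation that frames this debate (summarized here for orientation; it is the account that Albert, in particular, rejects) sends t to −t and takes

\[ \mathbf{E}(t,\mathbf{x}) \mapsto \mathbf{E}(-t,\mathbf{x}), \qquad \mathbf{B}(t,\mathbf{x}) \mapsto -\mathbf{B}(-t,\mathbf{x}), \qquad \rho \mapsto \rho, \qquad \mathbf{j} \mapsto -\mathbf{j}, \]

under which Maxwell's equations keep their form. Whether the sign flip of \(\mathbf{B}\) is part of what time reversal really does to the electromagnetic field, or an unmotivated stipulation, is one way of posing the metaphysical disagreement the paper identifies.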
A violation of procedure invariance in preference measurement is that the predominant or prominent attribute looms larger in choice than in a matching task. In Experiment 1, this so-called prominence effect was demonstrated for choices between pairs of options, choices to accept single options, and preference ratings of single options. That is, in all these response modes the prominent attribute loomed larger than in matching. The results were replicated in Experiment 2, in which subjects chose between or rated their preference for pairs of options which were matched to be equally attractive either in the same session or 1 week earlier. On the basis of these and previous results, it is argued that the prominence effect is a reliable phenomenon. However, none of several cognitive explanations which have been offered appears to be completely viable.
This essay explores Kaila's interpretation of the special theory of relativity. Although the relevance of his work to logical empiricism is well-known, not much has been written on what Kaila calls the ‘Einstein-Minkowski invariance theory’. Kaila's interpretation focuses on two salient features. First, he emphasizes the importance of the invariance of the spacetime interval. The general point about spacetime invariance has been known at least since Minkowski, yet Kaila applies his overall tripartite theory of invariances to space, time and spacetime in an original way. Second, Kaila provides a non-conventionalist argument for the isotropic speed of electromagnetic signals. The standard Einstein synchrony is not a mere convention but a part of a larger empirical theory. According to Kaila's holistic principle of testability, which stands in contrast to the theses of translatability and verification, different items in the theory cannot be sharply divided into conventional and empirical. Kaila's invariantism/non-conventionalism about relativity reflects an interesting case in the gradual transition from positivism to realism within the philosophy of science.
This French article aims at analyzing the Ricardian problem of an "invariable standard of value" in Ricardo's own terms. It is argued that Ricardo's commentators and modern followers have changed these terms significantly. The problem actually branches into two subproblems, i.e., that of "invariability" strictly, and that of "neutrality with respect to distribution". These subproblems do not matter to Ricardo to the same extent. He regards the latter (in various formulations recapitulated here) as a complication of the former, which is the crucial one in his search for a "good" standard. This exemplifies precisely how Ricardo could theoretically focus on the production side of the economy at the expense of the distribution side. With these conclusions at hand, the paper can be critical of Marx's and Sraffa's interpretations of the Ricardian problem of the standard: respectively, because Marx's is simply incorrect, and because Sraffa's solved a problem that was unrelated to the original one in Ricardo.
The philosophy of science of Patrick Suppes is centered on two important notions that are part of the title of his recent book (Suppes 2002): Representation and Invariance. Representation is important because when we embrace a theory we implicitly choose a way to represent the phenomenon we are studying. Invariance is important because, since invariants are the only things that are constant in a theory, in a way they give the “objective” meaning of that theory. Every scientific theory gives a representation of a class of structures and studies the invariant properties holding in that class of structures. In Suppes’ view, the best way to define this class of structures is via axiomatization. This is because a class of structures is given by a definition, and this same definition establishes which properties a single structure must possess in order to belong to the class. These properties correspond to the axioms of a logical theory. In Suppes’ view, the best way to characterize a scientific structure is by giving a representation theorem for its models and singling out the invariants in the structure. Thus, we can say that the philosophy of science of Patrick Suppes consists in the application of the axiomatic method to scientific disciplines. What I want to argue in this paper is that this application of the axiomatic method is also at the basis of a new approach that is being increasingly applied to the study of computer science and information systems, namely the approach of formal ontologies. The main task of an ontology is that of making explicit the conceptual structure underlying a certain domain. By “making explicit the conceptual structure” we mean singling out the most basic entities populating the domain and writing axioms expressing the main properties of these primitives and the relations holding among them. So, in both cases, the axiomatization is the main tool used to characterize the object of inquiry, whether this object is scientific theories (in Suppes’ approach) or information systems (for formal ontologies). In the following section I will present the view of Patrick Suppes on the philosophy of science and the axiomatic method, in section 3 I will survey the theoretical issues underlying the work that is being done in formal ontologies, and in section 4 I will draw a comparison of these two approaches and explore similarities and differences between them.
Symmetries have a crucial role in today’s physics. In this thesis, we are mostly concerned with time reversal invariance (T-symmetry). A physical system is time reversal invariant if its underlying laws are not sensitive to the direction of time. There are various accounts of time reversal transformation resulting in different views on whether or not a given theory in physics is time reversal invariant. With a focus on quantum mechanics, I describe the standard account of time reversal and compare it with my alternative account, arguing why it deserves serious attention. Then, I review three known ways to T-violation in quantum mechanics, and explain two unique experiments made to detect it in the neutral K and B mesons.
This paper presents an attempt to define temporal coincidence starting from first principles. The temporal coincidence defined here differs from Einstein’s simultaneity, for it is invariant across inertial frames, not relative. The meaning and significance of temporal coincidence is derived from axioms of existence, and it somehow relates to Kant’s notion of simultaneity. Consistently applied to the Special Theory of Relativity framework, temporal coincidence does not in any way create mathematical contradictions; however, it allows looking at some common relativity claims with a dose of scepticism. Time, as derived from Lorentz transformations, appears to be conventional in order to match the postulate of constancy of the speed of light. The relative simultaneity is only apparent due to that convention. There are insufficient grounds to claim that inertial systems moving relatively to each other have their own different temporal realities. Overall, the innate temporal logic we have is not erroneous and does not need to be replaced, contrary to the claims of some relativity educators.
The concept of ‘ideas’ plays a central role in philosophy. The genesis of the idea of continuity and its essential role in intellectual history have been analyzed in this research. The main question of this research is how the idea of continuity came to the human cognitive system. In this context, we analyzed the epistemological function of this idea. In intellectual history, the idea of continuity was first introduced by Leibniz. After him, this idea, as a paradigm, formed the basis of several fundamental scientific conceptions. This idea also allowed mathematicians to justify the nature of real numbers, which was one of the central questions and intellectual discussions in the history of mathematics. For this reason, we analyzed how Dedekind’s continuity idea was used for this justification. As a result, it can be said that several fundamental conceptions in intellectual history, philosophy and mathematics could not have arisen without the existence of the idea of continuity. However, this idea is neither a purely philosophical nor a mathematical idea. This is an interdisciplinary concept. For this reason, we call and classify it as mathematical and philosophical invariance.
A primary dimension of our engagement with fictional works of art – paradigmatically literary, dramatic, and cinematic narratives – is figuring out what is true in such representations, what the facts are in the fictional world. These facts include not only those that ground any genuine understanding of a story – say, that it was his own father whom Oedipus killed – but also those that may be missed in even a largely competent reading, say, that Emma Bovary's desires and dissatisfactions are fed by reading romance novels.
The purpose of the article is the reconstruction of ancient Greek and ancient Roman models of religiosity as anthropological invariants that determine the patterns of thinking and being of subsequent eras. Theoretical basis. The author applied the statement of Protagoras that "Man is the measure of all things" to the reconstruction of the religious sphere of culture. I proceed from the fact that each historical community has a set of inherent ideas about the principles of reality, which found unique "universes of meanings". The historical space acquires anthropological properties that determine the specific mythology of the respective societies, as well as their spiritual successors. In particular, the religious models of ancient Greece and ancient Rome had a huge influence on the formation of the worldview of the Christian civilization of the West. Originality. The multiplicity of the Olympic mythology contributed to the diversity of the expression forms of the Greek genius, which manifested itself in different fields of cultural activity, not reducible to political, philosophical or religious unity. The poverty of Roman mythology was compensated by a clear awareness of the unity of the community, which for all historical vicissitudes had always remained an unchanging ideal, and which was conceived as a reflection of the unity of the heavens. These two approaches to the divine predetermined the formation of two interacting, but conceptually different anthropological paradigms of Antiquity. Conclusions. Western concepts of divinity are invariants of two basic theological concepts – "Greek" (naturalism and paganism) and "Roman" (transcendentalism and henotheism). These are ideal types, so these two tendencies can co-exist in one society. The Roman trend continued to be realized by the anti-Roman religion, which took Roman forms and a Roman name. Iconoclasm was a Byzantine version of the Reformation, promoted by the Isaurian emperors, which failed due to the strong Hellenistic naturalistic lobby. Modern "Romans" are trying to get rid of the last elements of religious naturalism, and modern "Greeks" are trying to preserve the Hellenic elements in Christianity. Patterns can be transformed, but the observational view will still be able to identify their lineage. The developed model allows a deeper understanding of the culture of both ancient societies, as well as the outlook of Western man.
In this paper I investigate, within the framework of realistic interpretations of the wave function in nonrelativistic quantum mechanics, the mathematical and physical nature of the wave function. I argue against the view that mathematically the wave function is a two-component scalar field on configuration space. First, I review how this view makes quantum mechanics non-Galilei invariant and yields the wrong classical limit. Moreover, I argue that if the wave function is instead interpreted as a ray, in agreement with many physicists, Galilei invariance is preserved. In addition, I discuss how the wave function behaves more similarly to a gauge potential than to a field. Finally, I show how this favors a nomological rather than an ontological view of the wave function.
This paper develops and explores a new framework for theorizing about the measurement and aggregation of well-being. It is a qualitative variation on the framework of social welfare functionals developed by Amartya Sen. In Sen’s framework, a social or overall betterness ordering is assigned to each profile of real-valued utility functions. In the qualitative framework developed here, numerical utilities are replaced by the properties they are supposed to represent. This makes it possible to characterize the measurability and interpersonal comparability of well-being directly, without the use of invariance conditions, and to distinguish between real changes in well-being and merely representational changes in the unit of measurement. The qualitative framework is shown to have important implications for a range of issues in axiology and social choice theory, including the characterization of welfarism, axiomatic derivations of utilitarianism, the meaningfulness of prioritarianism, the informational requirements of variable-population ethics, the impossibility theorems of Arrow and others, and the metaphysics of value.
The full Bayesian significance test (FBST) for precise hypotheses is presented, with some illustrative applications. In the FBST we compute the evidence against the precise hypothesis. We discuss some of the theoretical properties of the FBST, and provide an invariant formulation for coordinate transformations, provided a reference density has been established. This evidence is the probability of the highest relative surprise set, “tangential” to the sub-manifold (of the parameter space) that defines the null hypothesis.
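Schematically, and as a condensed restatement of the published FBST construction rather than a quotation from this paper: given a posterior density \(p(\theta \mid x)\), a reference density \(r(\theta)\), and a precise (null) hypothesis \(\Theta_0 \subset \Theta\), let \(s(\theta) = p(\theta \mid x)/r(\theta)\) be the relative surprise and \(s^{*} = \sup_{\theta \in \Theta_0} s(\theta)\). The tangential (highest relative surprise) set is \(T = \{\theta : s(\theta) > s^{*}\}\), and the evidence against the hypothesis is its posterior probability,

\[ \overline{\mathrm{ev}}(\Theta_0) \;=\; \int_{T} p(\theta \mid x)\, d\theta , \qquad \mathrm{ev}(\Theta_0) \;=\; 1 - \overline{\mathrm{ev}}(\Theta_0) . \]

Because \(s(\theta)\) is a ratio of densities, the construction is unchanged under smooth reparametrizations of \(\theta\), which is the invariance property mentioned above.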
On a widespread naturalist view, the meanings of mathematical terms are determined, and can only be determined, by the way we use mathematical language—in particular, by the basic mathematical principles we’re disposed to accept. But it’s mysterious how this can be so, since, as is well known, minimally strong first-order theories are non-categorical and so are compatible with countless non-isomorphic interpretations. As for second-order theories: though they typically enjoy categoricity results—for instance, Dedekind’s categoricity theorem for second-order PA and Zermelo’s quasi-categoricity theorem for second-order ZFC—these results require full second-order logic. So appealing to these results seems only to push the problem back, since the principles of second-order logic are themselves non-categorical: those principles are compatible with restricted interpretations of the second-order quantifiers on which Dedekind’s and Zermelo’s results are no longer available. In this paper, we provide a naturalist-friendly, non-revisionary solution to an analogous but seemingly more basic problem—Carnap’s Categoricity Problem for propositional and first-order logic—and show that our solution generalizes, giving us full second-order logic and thereby securing the categoricity or quasi-categoricity of second-order mathematical theories. Briefly, the first-order quantifiers have their intended interpretation, we claim, because we’re disposed to follow the quantifier rules in an open-ended way. As we show, given this open-endedness, the interpretation of the quantifiers must be permutation-invariant and so, by a theorem recently proved by Bonnay and Westerståhl, must be the standard interpretation. Analogously for the second-order case: we prove, by generalizing Bonnay and Westerståhl’s theorem, that the permutation invariance of the interpretation of the second-order quantifiers, guaranteed once again by the open-endedness of our inferential dispositions, suffices to yield full second-order logic.
The paper discusses the origin of dark matter and dark energy from the concepts of time and the totality in the final analysis. Though both seem to be rather philosophical, nonetheless they are postulated axiomatically and interpreted physically, and the corresponding philosophical transcendentalism serves heuristically. The exposition of the article means to outline the “forest for the trees”, however in an absolutely rigorous mathematical way, which is to be explicated in detail in a future paper. The “two deductions” are two successive stages of the single conclusion mentioned above. The concept of “transcendental invariance”, meaning the ontological and physical interpretation of the mathematical equivalence of the axiom of choice and the well-ordering “theorem”, is utilized again. Then, the arrow of time is a corollary of that transcendental invariance, and in turn, it implies quantum information conservation as the Noether correlate of the linear “increase of time” after the arrow of time. Quantum information conservation implies a few fundamental corollaries such as the “conservation of energy conservation” in quantum mechanics, for reasons quite different from those in classical mechanics and physics, as well as the “absence of hidden variables” (versus Einstein’s conjecture) in it. However, the paper concentrates only on the inference of another corollary from quantum information conservation, namely, that dark matter and dark energy are due to entanglement, and thus, in the final analysis, to the conservation of quantum information, however observed experimentally only on the “cognitive screen” of “Mach’s principle” in Einstein’s general relativity, therefore excluding any other source of gravitational field than mass and gravity. Then, if quantum information by itself would generate a certain nonzero gravitational field, it will be depicted on the same screen as certain masses and energies distributed in space-time, and most presumably observable as those dark energy and dark matter predominating in the universe as about 96% of its energy and matter, quite unexpectedly for physics and the scientific worldview nowadays. Besides the cognitive screen of general relativity, entanglement is necessarily available on one more “cognitive screen” (namely, that of quantum mechanics), which is furthermore “flat”. Most probably, that projection is confinement, a mysterious and ad hoc added interaction alongside the three fundamental ones of the Standard Model, being even conceptually inconsistent with them, as far as it needs to distinguish the local space from the global space while being definable only as a relation between them (similar to entanglement). So, entanglement is able to link the gravity of general relativity to the confinement of the Standard Model as its projections on the “cognitive screens” of those two fundamental physical theories.
The idea that beliefs may be stake-sensitive is explored. This is the idea that the strength with which a single, persistent belief is held may vary and depend upon what the believer takes to be at stake. The stakes in question are tied to the truth of the belief—not, as in Pascal’s wager and other cases, to the belief’s presence. Categorical beliefs and degrees of belief are considered; both kinds of account typically exclude the idea and treat belief as stake-invariant, though an exception is briefly described. The role of the assumption of stake-invariance in familiar accounts of degrees of belief is also discussed, and morals are drawn concerning finite and countable Dutch book arguments.
A generalized and unifying viewpoint on both general relativity and quantum mechanics and information is investigated. It may be described as a generalization of the concept of reference frame from mechanics to thermodynamics, or from a reference frame linked to an element of a system, and thus within it, to another reference frame linked to the whole of the system or to any of other similar systems, and thus out of it. Furthermore, the former is the viewpoint of general relativity, the latter is that of quantum mechanics and information. Cyclicity in the manner of Nicolas Cusanus (Nicolas of Cusa) is complemented as a fundamental and definitive property of any totality, e.g. physically, that of the universe. It has to contain its externality within it somehow, being namely the totality. This implies a seemingly paradoxical (in fact, only to common sense rather than logically and mathematically) viewpoint for the universe to be represented within it as each one quantum of action according to the fundamental Planck constant. That approach implies the unification of gravity and entanglement corresponding to the former or latter class of reference frames. An invariance more general than Einstein's general covariance is to be involved as to both classes of reference frames, unifying them. Its essence is the unification of the discrete and the continuous (smooth). That idea underlies implicitly quantum mechanics via Bohr's principle that it studies the system of quantum microscopic entities and the macroscopic apparatus described uniformly by the smooth equations of classical physics.
Psychopathy refers to a range of complex behaviors and personality traits, including callousness and antisocial behavior, typically studied in criminal populations. Recent studies have used self-reports to examine psychopathic traits among noncriminal samples. The goal of the current study was to examine the underlying factor structure of the Self-Report of Psychopathy Scale–Short Form (SRP-SF) across complementary samples and examine the impact of gender on factor structure. We examined the structure of the SRP-SF among 2,554 young adults from three undergraduate samples and a high-risk young adult sample. Using confirmatory factor analysis, a four-correlated factor model and a four-bifactor model showed good fit to the data. Evidence of weak invariance was found for both models across gender. These findings highlight that the SRP-SF is a useful measure of low-level psychopathic traits in noncriminal samples, although the underlying factor structure may not fully translate across men and women.
It is shown that the heuristic "derivation" of the Schrödinger equation in quantum mechanics textbooks can be turned into a real derivation by resorting to spacetime translation invariance and relativistic invariance.
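For orientation, the heuristic in question runs roughly as follows (a minimal sketch of the familiar textbook route, not the paper's own argument): spacetime translation invariance motivates representing momentum and energy by the generators of spatial and temporal translations, \(\hat{p} = -i\hbar\,\partial_x\) and \(\hat{E} = i\hbar\,\partial_t\), and substituting these into the nonrelativistic relation \(E = p^{2}/2m + V\) gives

\[ i\hbar\,\frac{\partial \psi}{\partial t} \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2} \psi}{\partial x^{2}} + V(x)\,\psi . \]

The paper's claim is that, once translation invariance and relativistic invariance are made to do real work, this hand-waving can be replaced by a genuine derivation.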
The present article is a contribution to the development of metrological structural realism (MSR). This position in the philosophy of science goes back to Matthias Neuber, who introduces it as a third variation alongside the main structural realisms: epistemic structural realism (ESR) and ontic structural realism (OSR). Neuber attempts to tackle the problems of OSR and ESR while preserving their respective strengths. Of central importance to his approach are the concepts of invariance, structure and, especially, measurement. Starting from Eino Kaila's "non-linguistic, realist account of logical empiricism", the present article investigates the necessity of yet another position of structural realism. The established structural realisms are examined for their strengths and weaknesses. Afterwards, the requirements on MSR are formulated in a way that extends beyond Neuber's account. These requirements are of ontological, epistemological and metrological nature.
In this manuscript, published here for the first time, Tarski explores the concept of logical notion. He draws on Klein's Erlanger Programm to locate the logical notions of ordinary geometry as those invariant under all transformations of space. Generalizing, he explicates the concept of logical notion of an arbitrary discipline.
We criticise Shepard's notions of “invariance” and “universality,” and the incorporation of Shepard's work on inference into the general framework of his paper. We then criticise Tenenbaum and Griffiths' account of Shepard (1987b), including the attributed likelihood function, and the assumption of “weak sampling.” Finally, we endorse Barlow's suggestion that minimum message length (MML) theory has useful things to say about the Bayesian inference problems discussed by Shepard and Tenenbaum and Griffiths. [Barlow; Shepard; Tenenbaum & Griffiths].
Although they are continually compositionally reconstituted and reconfigured, organisms nonetheless persist as ontologically unified beings over time – but in virtue of what? A common answer is: in virtue of their continued possession of the capacity for morphological invariance which persists through, and in spite of, their mereological alteration. While we acknowledge that organisms' capacity for the "stability of form" – homeostasis – is an important aspect of their diachronic unity, we argue that this capacity is derived from, and grounded in, a more primitive one – namely, the homeodynamic capacity for the "specified variation of form". In introducing a novel type of causal power – a "structural power" – we claim that it is the persistence of their dynamic potential to produce a specified series of structurally adaptive morphologies which grounds organisms' privileged status as metaphysically "one over many" over time.
The study examined the differential item functioning (DIF) of the 2018 Basic Education Certificate Examination (BECE) in Mathematics tests of the National Examination Council (NECO) and the BECE of the Akwa Ibom State government in Nigeria. The invariance in the tests with regard to sex was considered using an Item Response Theory (IRT) approach. The study area was Akwa Ibom State of Nigeria, with a student population of 58,281 for the examination. The sample was made up of 3,810 students drawn through a multi-stage sampling approach. The multidimensional IRT (MIRT) package implemented in the R programming language was applied in analyzing the data. The findings reveal that the BECE of NECO displayed 23 (38.3%) DIF items while the BECE of Akwa Ibom State had 37 (61.7%) DIF items in terms of sex. The findings also revealed that, in the two examinations, more items favoured the male candidates than the female candidates in terms of performance. It was recommended that an IRT model should be adopted by test developers to determine item parameters for the selection of good items, to ensure the quality of items before administration. Test equating of students who write equivalent-form examinations conducted by different examining bodies was also recommended for admission and placement of candidates, to determine actual group differences in performance. The study posits that research on differential item functioning is inconclusive, and so should be encouraged.
In this paper we have shown how the consideration of a chaotic mechanics supplies a redefinition of special-relativistic space-time. In particular, chaotic time means there is no possibility of defining temporal ordering, and implies a breakdown of causality. The new chaotic transformations among "undetermined" space-time coordinates are no longer linear and homogeneous. The principles of inertia and of energy-impulse conservation are no longer well defined and are in any case no longer invariant.
Debates about modularity invariably involve a crucial premise about how visual illusions are experienced. This paper argues that these debates are wrongheaded, and that experience of illusions is orthogonal to the core issue of the modularity hypothesis: informational encapsulation.