The idea that there could be spatially extended mereological simples has recently been defended by a number of metaphysicians (Markosian 1998, 2004; Simons 2004; Parsons (2000) also takes the idea seriously). Peter Simons (2004) goes further, arguing not only that spatially extended mereological simples (henceforth just extended simples) are possible, but that it is more plausible that our world is composed of such simples than that it is composed of either point-sized simples or of atomless gunk. The difficulty for these views lies in explaining why the various sub-volumes of space occupied by such simples are not occupied by proper parts of those simples. Intuitively at least, many of us find compelling the idea that spatially extended objects have proper parts at every sub-volume of the region they occupy. It seems that the defender of extended simples must reject a seemingly plausible claim, what Simons calls the geometric correspondence principle (GCP): that any (spatially) extended object has parts that correspond to the parts of the region that it occupies (Simons 2004: 371). We disagree. We think that GCP is a plausible principle. We also think it is plausible that our world is composed of extended simples. We reconcile these two notions by two means. On the one hand, we pay closer attention to the physics of our world. On the other hand, we consider what happens when our concept of something—in this case space—contains elements not all of which are realized in any one thing; instead, key components are realized in different features of the world.
We discuss the fate of the correspondence principle beyond quantum mechanics, specifically in quantum field theory and quantum gravity, in connection with the intrinsic limitations of the human ability to observe the external world. We conclude that the best correspondence principle is made of unitarity, locality, proper renormalizability (a refinement of strict renormalizability), combined with fundamental local symmetries and the requirement of having a finite number of fields. Quantum gravity is identified in an essentially unique way. The gauge interactions are uniquely identified in form. The matter sector, by contrast, remains basically unrestricted. The major prediction is the violation of causality at small distances.
In a recent revision (chapter 4 of Nowakowa and Nowak 2000) of an older article Leszek Nowak (1992) has attempted to rebut Niiniluoto’s 1990 critical suggestion that proponents of the Poznań idealizational approach to the sciences have committed a rather elementary logical error in the formal machinery that they advocate for use in the analysis of scientific methodology. In this paper I criticize Nowak’s responses to Niiniluoto’s suggestion, and, subsequently, work out some of the consequences of that criticism for understanding the role that idealization plays in scientific methodology.
The correspondence principle made of unitarity, locality and renormalizability has been very successful in quantum field theory. Among other things, it helped us build the standard model. However, it also showed important limitations. For example, it failed to restrict the gauge group and the matter sector in a powerful way. After discussing its effectiveness, we upgrade it to make room for quantum gravity. The unitarity assumption is better understood, since it allows for the presence of physical particles as well as fake particles (fakeons). The locality assumption is applied to an interim classical action, since the true classical action is nonlocal and emerges from the quantization and a later process of classicization. The renormalizability assumption is refined to single out the special role of the gauge couplings. We show that the upgraded principle leads to an essentially unique theory of quantum gravity. In particular, in four dimensions, a fakeon of spin 2, together with a scalar field, is able to make the theory renormalizable while preserving unitarity. We offer an overview of quantum field theories of particles and fakeons in various dimensions, with and without gravity.
Supra-Bayesianism is the Bayesian response to learning the opinions of others. Probability pooling constitutes an alternative response. One natural question is whether there are cases where probability pooling gives the supra-Bayesian result. This has been called the problem of Bayes-compatibility for pooling functions. It is known that in a common prior setting, under standard assumptions, linear pooling cannot be non-trivially Bayes-compatible. We show by contrast that geometric pooling can be non-trivially Bayes-compatible. Indeed, we show that, under certain assumptions, geometric and Bayes-compatible pooling are equivalent. Granting supra-Bayesianism its usual normative status, one upshot of our study is thus that, in a certain class of epistemic contexts, geometric pooling enjoys a normative advantage over linear pooling as a social learning mechanism. We discuss the philosophical ramifications of this advantage, which we show to be robust to variations in our statement of the Bayes-compatibility problem.
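The contrast between the two pooling rules can be sketched numerically (a sketch of mine, not code from the paper; the probabilities and weights below are hypothetical): linear pooling takes a weighted arithmetic mean of the individual credences, while geometric pooling takes a normalized weighted geometric mean.

```python
def linear_pool(probs, weights):
    """Linear pool: weighted arithmetic mean of the agents' probabilities."""
    return sum(w * p for w, p in zip(weights, probs))

def geometric_pool(probs, weights):
    """Geometric pool for a binary event: normalized weighted geometric mean."""
    yes = no = 1.0
    for w, p in zip(weights, probs):
        yes *= p ** w        # weighted geometric mean of the "yes" probabilities
        no *= (1 - p) ** w   # ...and of the complementary "no" probabilities
    return yes / (yes + no)  # renormalize so the result is a probability

probs, weights = [0.6, 0.8], [0.5, 0.5]
print(linear_pool(probs, weights))     # ≈ 0.7
print(geometric_pool(probs, weights))  # ≈ 0.710
```

In this toy case the geometric pool (≈0.710) is slightly more extreme than the linear pool (0.7), reflecting its multiplicative, Bayes-like way of combining the two opinions.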
This paper explores the issue of the unification of three languages of physics: the geometric language of forces, the geometric language of fields or 4-dimensional space-time, and the probabilistic language of quantum mechanics. On the one hand, equations in each language may be derived from the Principle of Least Action (PLA). On the other hand, Feynman's path integral method can explain the physical meaning of the PLA. The axioms of classical and relativistic mechanics can thus be considered consequences of Feynman's formulation of quantum mechanics.
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.
When talking about truth, we ordinarily take ourselves to be talking about one-and-the-same thing. Alethic monists suggest that theorizing about truth ought to begin with this default or pre-reflective stance, and, subsequently, parlay it into a set of theoretical principles that are aptly summarized by the thesis that truth is one. Foremost among them is the invariance principle.
In a first investigation, a Lacan-motivated template of the Poe story is fitted to the data. A segmentation of the storyline is used in order to map out the diachrony. Based on this, it will be shown how synchronous aspects, potentially related to Lacanian registers, can be sought. This demonstrates the effectiveness of an approach based on a model template of the storyline narrative. In a second and more comprehensive investigation, we develop an approach for revealing, that is, uncovering, Lacanian register relationships. Objectives of this work include the wide and general application of our methodology. This methodology is strongly based on the “letting the data speak” Correspondence Analysis analytics platform of Jean-Paul Benzécri, which is also the geometric data analysis, encompassing both qualitative and quantitative analytics, developed by Pierre Bourdieu.
Robert Batterman’s ontological insights (2002, 2004, 2005) are apt: Nature abhors singularities. “So should we,” responds the physicist. However, Batterman's epistemic assessments of the matter prove to be less clear, for in the same vein he writes that singularities play an essential role in certain classes of physical theories referring to certain types of critical phenomena. I devise a procedure (“methodological fundamentalism”) which exhibits how singularities, at least in principle, may be avoided within the same classes of formalisms discussed by Batterman. I show that we need not accept some divergence between explanation and reduction (Batterman 2002), or between epistemological and ontological fundamentalism (Batterman 2004, 2005). Though I remain sympathetic to the ‘principle of charity’ (Frisch 2005), which appears to favor a pluralist outlook, I nevertheless call into question some of the forms such pluralist implications take in Robert Batterman’s conclusions. It is difficult to reconcile some of the pluralist assessments that he and some of his contemporaries advocate with what appears to be a countervailing trend in a burgeoning research tradition known as Clifford (or geometric) algebra. In my critical chapters (2 and 3) I use some of the demonstrated formal unity of Clifford algebra to argue that Batterman (2002) conflates a physical theory’s ontology with its purely mathematical content. Carefully distinguishing the two, and employing Clifford algebraic methods, reveals a symmetry between reduction and explanation that Batterman overlooks. I refine this point by noting that geometric algebraic methods are an active area of research in computational fluid dynamics and, as applied in modeling droplet formation, appear to instantiate a “methodologically fundamental” approach.
I argue in my introductory and concluding chapters that the model of inter-theoretic reduction and explanation offered by Fritz Rohrlich (1988, 1994) provides the best framework for accommodating the burgeoning pluralism in philosophical studies of physics, together with the presumed claims of formal unification demonstrated by physicists' choices of mathematical formalisms such as Clifford algebra. I show how Batterman’s insights can be reconstructed in Rohrlich’s framework, preserving Batterman’s important philosophical work minus what I consider to be his incorrect conclusions.
Martin Peterson’s The Ethics of Technology: A Geometric Analysis of Five Moral Principles offers a welcome contribution to the ethics of technology, understood by Peterson as a branch of applied ethics that attempts ‘to identify the morally right courses of action when we develop, use, or modify technological artifacts’ (3). He argues that problems within this field are best treated by the use of five domain-specific principles: the Cost-Benefit Principle, the Precautionary Principle, the Sustainability Principle, the Autonomy Principle, and the Fairness Principle. These principles are, in turn, to be understood and applied with reference to the geometric method. This method is perhaps the most interesting and novel part of Peterson’s book, and I’ll devote the bulk of my review to it.
Recent work has defended “Euclidean” theories of set size, in which Cantor’s Principle (two sets have equally many elements if and only if there is a one-to-one correspondence between them) is abandoned in favor of the Part-Whole Principle (if A is a proper subset of B then A is smaller than B). It has also been suggested that Gödel’s argument for the unique correctness of Cantor’s Principle is inadequate. Here we see from simple examples, not that Euclidean theories of set size are wrong, but that they must be either very weak and narrow or largely arbitrary and misleading.
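The clash between the two principles can be made vivid with a toy check (my illustration, not one of the authors' examples): the map n ↦ 2n pairs the natural numbers one-to-one with the even naturals, even though the evens are a proper subset of the naturals, so Cantor's Principle says "equally many" while the Part-Whole Principle says "smaller".

```python
# The pairing n -> 2n between the naturals and the even naturals,
# checked on a finite window (the clash itself only arises for infinite sets).
def double(n):
    return 2 * n

sample = range(1000)
images = [double(n) for n in sample]
assert len(set(images)) == len(images)           # injective: distinct n, distinct 2n
assert all(m % 2 == 0 for m in images)           # the image lies in the evens
assert all(double(m // 2) == m for m in images)  # every even 2k is hit by n = k
```

On any finite set the two principles agree; only over the full infinite sets does the bijection n ↦ 2n force a choice between them, which is the choice Euclidean theories make differently from Cantor.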
In Newton’s correspondence with Richard Bentley, Newton rejected the possibility of remote action, even though he accepted it in the Principia. In practice, Newton’s natural philosophy is indissolubly linked to his conception of God. The knowledge of God seems to be essentially immutable, unlike the laws of nature, which can be subjected to procedures of refinement, revision and rejection. As Newton later states in the Opticks, the cause of gravity is an active principle in matter; this active principle is not an essential aspect of matter, however, but something that must have been added to matter by God, and in the same Query of the Opticks Newton even argues for the need for divine intervention. DOI: 10.13140/RG.2.2.16732.44162.
Based on various documents from 1989-2002, and drawing on the original texts as well as the author's contributions, this paper presents the refutation of A. Einstein's "general relativity" by the mathematicians and physicists A. Logunov and M. Mestvirishvili, from these authors' relativistic theory of gravitation: applying the fundamental physical principle of the conservation of energy-momentum, and using the absolute differential calculus, they rigorously perform their mathematical tests. It is conclusively shown that, on the Einstein-Grossmann-Hilbert equations, gravity is absurdly a metric field devoid of physical reality, unlike all other fields in nature, which are material fields, thus interrupting the chain of transformations between the different existing fields. Also, in Einstein's theory the proven equality of "inertial mass" and gravitational mass has no physical meaning. Therefore, "general relativity" does not obey the correspondence principle with Newton's gravity.
In times of crisis, when current theories are revealed as inadequate to the task, and new physics is thought to be required---physics turns to re-evaluate its principles, and to seek new ones. This paper explores the various types, and roles, of principles that feature in the problem of quantum gravity as a current crisis in physics. I illustrate the diversity of the principles being appealed to, and show that principles serve in a variety of roles in all stages of the crisis, including in motivating the need for a new theory, and defining what this theory should be like. In particular, I consider: the generalised correspondence principle, UV-completion, background independence, and the holographic principle. I also explore how the current crisis fits with Friedman's view on the roles of principles in revolutionary theory-change, finding that while many key aspects of this view are not represented in quantum gravity, the view could potentially offer a useful diagnostic, and prescriptive strategy. This paper is intended to be relatively non-technical, and to bring some of the philosophical issues from the search for quantum gravity to a more general philosophical audience interested in the roles of principles in scientific theory-change.
The numerous and diverse roles of theory reduction in science have been insufficiently explored in the philosophy literature on reduction. Part of the reason for this has been a lack of attention paid to reduction2 (successional reduction)---although I here argue that this sense of reduction is closer to reduction1 (explanatory reduction) than is commonly recognised, and I use an account of reduction that is neutral between the two. This paper draws attention to the utility---and incredible versatility---of theory reduction. A non-exhaustive list of various applications of reduction in science is presented, some of which are drawn from a particular case-study, namely the current search for a new theory of fundamental physics. This case-study is especially interesting because it employs both senses of reduction at once, and because of the huge weight being put on reduction by the different research groups involved; additionally, it presents some unique uses for reduction---revealing, I argue, the fact that reduction can be of specialised and unexpected service in particular scientific cases. The paper makes two other general findings: that the functions of reduction that are typically assumed to characterise the different forms of the relation may instead be understood as secondary consequences of some other roles; and that most of the roles that reduction plays in science can actually also be fulfilled by a weaker relation than (the typical understanding of) reduction.
Gideon Rosen and Robert Schwartzkopff have independently suggested (variants of) the following claim, which is a variant of Hume's Principle:

When the number of Fs is identical to the number of Gs, this fact is grounded by the fact that there is a one-to-one correspondence between the Fs and the Gs.

My paper is a detailed critique of the proposal. I don't find any decisive refutation of the proposal. At the same time, it has some consequences which many will find objectionable.
The predominant approaches to understanding how quantum theory and General Relativity are related to each other implicitly assume that both theories use the same concept of mass. Given that despite great efforts such approaches have not yet produced a consistent falsifiable quantum theory of gravity, this paper entertains the possibility that the concepts of mass in the two theories are in fact distinct. It points out that if the concept of mass in quantum mechanics is defined such that it always exists in a superposition and is not a gravitational source, then this sharply segregates the domains of quantum theory and of general relativity. This concept of mass violates the equivalence principle applied to active gravitational mass, but may still produce effects consistent with the equivalence principle when applied to passive gravitational mass (in agreement with observations) by the correspondence principle applied to a weak field in the appropriate limit. An experiment that successfully measures the gravity field of quantum objects in a superposition, and in particular of photons, would not only falsify this distinction but also constitute the first direct empirical test that gravity must in fact be described fundamentally by a quantum theory.
An analysis of the classical-quantum correspondence shows that it needs to identify a preferred class of coordinate systems, which defines a torsionless connection. One such class is that of the locally-geodesic systems, corresponding to the Levi-Civita connection. Another class, thus another connection, emerges if a preferred reference frame is available. From the classical Hamiltonian that rules geodesic motion, the correspondence yields two distinct Klein-Gordon equations and two distinct Dirac-type equations in a general metric, depending on the connection used. Each of these two equations is generally-covariant, transforms the wave function as a four-vector, and differs from the Fock-Weyl gravitational Dirac equation (DFW equation). One obeys the equivalence principle in an often-accepted sense, whereas the DFW equation obeys that principle only in an extended sense.
Gauss’s quadratic reciprocity theorem is among the most important results in the history of number theory. It’s also among the most mysterious: since its discovery in the late 18th century, mathematicians have regarded reciprocity as a deeply surprising fact in need of explanation. Intriguingly, though, there’s little agreement on how the theorem is best explained. Two quite different kinds of proof are most often praised as explanatory: an elementary argument that gives the theorem an intuitive geometric interpretation, due to Gauss and Eisenstein, and a sophisticated proof using algebraic number theory, due to Hilbert. Philosophers have yet to look carefully at such explanatory disagreements in mathematics. I do so here. According to the view I defend, there are two important explanatory virtues—depth and transparency—which different proofs (and other potential explanations) possess to different degrees. Although not mutually exclusive in principle, the packages of features associated with the two stand in some tension with one another, so that very deep explanations are rarely transparent, and vice versa. After developing the theory of depth and transparency and applying it to the case of quadratic reciprocity, I draw some morals about the nature of mathematical explanation.
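For readers who want the statement of the theorem in front of them, here is a small numerical check (my illustration, not material from the paper), computing the Legendre symbol by Euler's criterion and verifying the reciprocity law for a few odd primes:

```python
def legendre(a, p):
    """Legendre symbol (a|p) for an odd prime p, via Euler's criterion:
    a^((p-1)/2) mod p is 1 if a is a nonzero square mod p, and p-1 (i.e. -1) if not."""
    r = pow(a, (p - 1) // 2, p)
    return -1 if r == p - 1 else r

def reciprocity_holds(p, q):
    """Quadratic reciprocity: (p|q)(q|p) = (-1)^(((p-1)/2) * ((q-1)/2))."""
    return legendre(p, q) * legendre(q, p) == (-1) ** (((p - 1) // 2) * ((q - 1) // 2))

odd_primes = [3, 5, 7, 11, 13, 17, 19, 23]
assert all(reciprocity_holds(p, q)
           for p in odd_primes for q in odd_primes if p != q)
```

The assertion passes silently: for every pair of distinct odd primes in the list, whether p is a square mod q is tied to whether q is a square mod p in exactly the way the theorem dictates.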
Can a group be an orthodox rational agent? This requires the group's aggregate preferences to follow expected utility (static rationality) and to evolve by Bayesian updating (dynamic rationality). Group rationality is possible, but the only preference aggregation rules which achieve it (and are minimally Paretian and continuous) are the linear-geometric rules, which combine individual values linearly and combine individual beliefs geometrically. Linear-geometric preference aggregation contrasts with classic linear-linear preference aggregation, which combines both values and beliefs linearly, but achieves only static rationality. Our characterisation of linear-geometric preference aggregation has two corollaries: a characterisation of linear aggregation of values (Harsanyi's Theorem) and a characterisation of geometric aggregation of beliefs.
Conwy Lloyd Morgan (1852–1936) is widely regarded as the father of modern comparative psychology. Yet, Morgan initially had significant doubts about whether a genuine science of comparative psychology was even possible, only later becoming more optimistic about our ability to make reliable inferences about the mental capacities of non-human animals. There has been a fair amount of disagreement amongst scholars of Morgan’s work about the nature, timing, and causes of this shift in Morgan’s thinking. We argue that Morgan underwent two quite different shifts of attitude towards the proper practice of comparative psychology. The first was a qualified acceptance of the Romanesian approach to comparative psychology that he had initially criticized. The second was a shift away from Romanes’ reliance on systematizing anecdotal evidence of animal intelligence towards an experimental approach, focused on studying the development of behaviour. We emphasize the role of Morgan’s evolving epistemological views in bringing about the first shift – in particular, his philosophy of science. We emphasize the role of an intriguing but overlooked figure in the history of comparative psychology in explaining the second shift, T. Mann Jones, whose correspondence with Morgan provided an important catalyst for Morgan’s experimental turn, particularly the special focus on development. We also shed light on the intended function of Morgan’s Canon, the methodological principle for which Morgan is now mostly known. The Canon can only be properly understood by seeing it in the context of Morgan’s own unique experimental vision for comparative psychology.
Formalizing Euclid’s first axiom. Bulletin of Symbolic Logic. 20 (2014) 404–5. (Coauthor: Daniel Novotný)

Euclid [fl. 300 BCE] divides his basic principles into what came to be called ‘postulates’ and ‘axioms’—two words that are synonyms today but which are commonly used to translate Greek words meant by Euclid as contrasting terms.

Euclid’s postulates are specifically geometric: they concern geometric magnitudes, shapes, figures, etc.—nothing else. The first: “to draw a line from any point to any point”; the last: the parallel postulate.

Euclid’s axioms are general principles of magnitude: they concern geometric magnitudes, and magnitudes of other kinds as well, even numbers. The first is often translated “Things that equal the same thing equal one another”.

There are other differences that are or might become important.

Aristotle [fl. 350 BCE] meticulously separated his basic principles [archai, singular archê] according to subject matter: geometrical, arithmetic, astronomical, etc. However, he made no distinction that can be assimilated to Euclid’s postulate/axiom distinction.

Today we divide basic principles into non-logical [topic-specific] and logical [topic-neutral], but this too is not the same as Euclid’s. In this regard it is important to be cognizant of the difference between equality and identity—a distinction often crudely ignored by modern logicians. Tarski is a rare exception. The four angles of a rectangle are equal to—not identical to—one another; the size of one angle of a rectangle is identical to the size of any other of its angles. No two angles are identical to each other.

The sentence ‘Things that equal the same thing equal one another’ contains no occurrence of the word ‘magnitude’. This paper considers the problem of formalizing the proposition Euclid intended as a principle of magnitudes while being faithful to the logical form and to its information content.
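As a hedged sketch of what one naive formalization might begin with (my illustration; the paper's own proposal may differ), the axiom can be stated over an arbitrary type of magnitudes equipped with an equality-like relation, kept distinct from identity:

```lean
-- A naive first-order reading of Euclid's Common Notion 1,
-- "Things that equal the same thing equal one another",
-- for magnitudes `M` with an equality-like relation `eq`
-- (deliberately distinguished from identity `=`).
variable {M : Type} (eq : M → M → Prop)

/-- Common Notion 1 as a "Euclidean" property of the relation `eq`. -/
def commonNotion1 : Prop :=
  ∀ a b c : M, eq a c → eq b c → eq a b
```

Note that this reading quantifies over a single type of magnitudes, which is exactly what the paper flags as problematic: Euclid's sentence contains no occurrence of the word 'magnitude', and the axiom is meant to cover magnitudes of several kinds at once.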
This article discusses the relation between the early Wittgenstein’s and Carnap’s philosophies of logic, arguing that Carnap’s position in The Logical Syntax of Language is in certain respects much closer to the Tractatus than has been recognized. In Carnapian terms, the Tractatus’ goal is to introduce, by means of quasi-syntactical sentences, syntactical principles and concepts to be used in philosophical clarification in the formal mode. A distinction between the material and formal mode is therefore already part of the Tractatus’ view, and its method for introducing syntactical concepts and principles should be entirely acceptable for Carnap by his own criteria. Moreover, despite the Tractatus’ rejection of syntactical statements, there is an important correspondence between Wittgenstein’s saying-showing distinction and Carnap’s object-language/syntax-language distinction: both constitute a distinction between logico-syntactical determinations concerning language and language as determined or described by those determinations. Wittgenstein’s distinction therefore constitutes a precursor of the object-language/syntax-language distinction, which the latter in a certain sense affirms, rather than simply contradicting it. The saying-showing distinction agrees with Carnap’s position also in marking logic as something that isn’t true/false about either language or reality, which is a conception that underlies Carnap’s principle of tolerance.
Thomas Reid’s philosophy is a philosophy of mind—a Pneumatology in the idiom of 18th century Scotland. His overarching philosophical project is to construct an account of the nature and operations of the human mind, focusing on the two-way correspondence, in perception and action, between the thinking principle within and the material world without. Like his contemporaries, Reid’s treatment of these topics aimed to incorporate the lessons of the scientific revolution. What sets Reid’s philosophy of mind apart is his commitment to a set of intuitive contingent truths he called the principles of common sense. This difference, as this chapter will show, enables Reid to construct an account of mind that resists the temptation to which so many philosophers in his day and ours succumb, i.e., the temptation, in his words, to materialize minds or spiritualize bodies.
Motivated by Scholze and Fargues' geometrization of the local Langlands correspondence using perfectoid diamonds and Clausen and Scholze's work on the K-theory of adic spaces using condensed mathematics, we introduce the Efimov K-theory of diamonds. We propose a pro-diamond, a large stable (infinity,1)-category of diamonds D^{diamond}, a diamond spectra and chromatic tower, and a localization sequence for diamond spectra. Commensurate with the localization sequence, we detail three potential applications of the Efimov K-theory of D^{diamond}: to emergent time as a pro-emergence (v-stack time) in quantum gravity in a diamond holographic principle using Scholze's six operations in the étale cohomology of diamonds; to D^{diamond}-cryptography; and to nonlocality in perfectoid quantum physics.
This contribution discusses Leibniz’s conception of the Christian church, his life-long ecumenical efforts, and his stance toward religious toleration. Leibniz regarded the main Christian denominations as particular churches constituting the one truly catholic or universal church, whose authority went back to apostolic times, and whose theology was to be traced back to the entire ecclesiastical tradition. This is the ecclesiology which underpins his ecumenism. The main phases and features of his work toward reunification of Protestants and Roman Catholics, and unification of Protestant churches, are briefly explored, before turning to the issue of religious toleration. It is argued that a remarkably inclusive conception of toleration can be gleaned from a broad sample of Leibniz’s writings and correspondence. It is thanks to the philosophical and theological grounds of this conception that toleration can in principle be extended, for Leibniz, to all men and women of good will, including non-Christians, pagans, and atheists.
Abstract This essay takes up the following question: what is the correspondence between the sign and the thing it names? In reality, the signifier is closely linked with being; in this case, however, the sense will be a particular manifestation of a subjectivity of one's own, and so the determining relation is limited to the capacity of representation, where meaning is connected with socio-cultural sense. In this regard, this work approaches the problem through the analytical lens of Ludwig Wittgenstein, whose method of philosophical investigation is grounded in formal logic. In his first period of development, his speculative work, which in principle led him to the search for logical foundations for mathematics, was pursued in the direction of the study of the nature of representation. Subsequently, and as a necessary delimitation of this essay, we expose the confrontation—though not necessarily contradiction—of these approaches in the light of Wittgenstein himself who, in a second reflective period expressed in his later work, is obliged to discuss the foundational logical-philosophical topics of his speculative work from a pragmatic point of view.
We discuss central aspects of history of the concept of an affine differentiable manifold, as a proposal confirming the need for using some quantitative methods (drawn from elementary Model Theory) in Mathematical Historiography. In particular, we prove that this geometric structure is a syntactic rigid designator in the sense of Kripke-Putnam.
The remarkable connections between gravity and thermodynamics seem to imply that gravity is not fundamental but emergent, and in particular, as Verlinde suggested, gravity is probably an entropic force. In this paper, we will argue that the idea of gravity as an entropic force is debatable. It is shown that there is no convincing analogy between gravity and entropic force in Verlinde’s example. Neither holographic screen nor test particle satisfies all requirements for the existence of entropic force in a thermodynamic system. As a result, there is no entropic force in the gravity system. Furthermore, we show that the entropy increase of the screen is not caused by its statistical tendency to increase entropy as required by the existence of entropic force, but in fact caused by gravity. Therefore, Verlinde’s argument for the entropic origin of gravity is problematic. In addition, we argue that the existence of a minimum size of spacetime, together with the Heisenberg uncertainty principle in quantum theory, may imply the fundamental existence of gravity as a geometric property of spacetime. This provides further support for the conclusion that gravity is not an entropic force.
Nelson's Proof of the Impossibility of the Theory of Knowledge

In addressing the possibility of a theory of knowledge, Leonard Nelson noted the contradiction inherent in an epistemological criterion that one would require in order to differentiate between valid and invalid knowledge. Nelson concluded that the inconsistency of such a criterion proves the impossibility of the theory of knowledge.

Were the epistemological criterion itself knowledge, it would presume to adjudicate on its own truth (an epistemological circular argument). If, however, one were to assume that the criterion is not knowledge, one would then have to justify how it can be a criterion of truth - yet this would be possible only if it could be considered an object of knowledge. One would equally have had to presuppose the criterion in order to determine the truth of this knowledge, thereby producing another circular argument. Ostensibly, every criterion of truth fails at its very own test, since it cannot guarantee its own truth, just as Munchausen, contrary to his assertion, could not pull himself out of the swamp by tugging on a tuft of his own hair.

Nelson proposed a solution to the epistemological problem (the question of differentiating between valid and invalid knowledge) based on Jakob Friedrich Fries's distinction between proof and deduction. Proof, according to Nelson (following Fries), is the derivation of the truth of one statement from that of another. Thus, from the truth of the statements "all men are mortal" and "Socrates is a man" one can derive the truth of the statement "Socrates is mortal." If knowledge is taken to be judgmental (expressed in a statement), then any attempt at proof (i.e., recourse to prior judgments) inevitably leads to an infinite regress of justification, since each judgment would necessitate a further justification by another judgment.
Every attempt to prove an epistemological criterion is thus also confronted by this regress of justification.

Nelson's attempted solution rests on the assumption of the existence of immediate knowledge as a justification of the truth of mediate knowledge. Nelson considers immediate knowledge to be non-judgmental knowledge. This includes intuitions (e.g. seeing-the-red-roof) and also philosophical knowledge that, in his opinion, pre-exists immediately in our reason, prior to judgmental reflection (e.g. the principle of causality).

Proof of the truth of mediate knowledge can be effected by showing its compliance with the attendant immediate knowledge (rational truth = correspondence of mediate knowledge with its immediate knowledge). Nelson considered this a resolution of the circular epistemological argument. Philosophical knowledge, in Nelson's view, is subject to deduction rather than proof. The following example illustrates the goal of deduction:

An approach to deducing the principle of causality: A) Every change has a cause. (The principle of causality) A') A is a reiteration of an immediate knowledge. (Meta-assertion following A)

"A" may not be provable, but A' may be justified, and thus Nelson identified it as a deduction following from A. Reference: http://www.friesian.com/nelproof.htm.
Many commentators have remarked upon the striking points of correspondence that can be found in the works of Freud and Nietzsche. However, this essay argues that on the subject of desire their work presents us with a radical choice: Freud or Nietzsche. I first argue that Freud’s theory of desire is grounded in the principle of inertia, a principle that is incompatible with his later theory of Eros and the life drive. Furthermore, the principle of inertia is not essentially distinct from his later theory of the death drive. Consequently, Freud’s theory of desire can only be interpreted consistently as a monism of the death drive. I then analyze Nietzsche’s attempt to ground his theory of desire in the concept of the will to power. I argue that Nietzsche’s view of desire is fundamentally opposed to the key elements of Freud’s theory of desire: the principle of constancy, the Freudian definition of the drive, and the pleasure principle. Next, I explicate the stakes of this opposition by analyzing the social consequences of each view for morality and justice. I argue that the Freudian subject seeks to dominate the social other, and that there is an insurmountable conflict between the satisfaction of desire and the demands of social life. Consequently, Freud’s view allows only for a negative conception of the social good in which morality is defined as the intrinsically impossible task of eliminating evil, and justice can be achieved only through the equal distribution of instinctual frustration. Finally, I argue that in Nietzsche’s theory of desire there is no essential conflict between individual desire and social life. The Nietzschean subject desires to manifest power in the form of activity that is independent of external agents, not to dominate the other.
Consequently, Nietzsche’s view allows for the possibility of a positively defined concept of the social good in which morality is the affirmation and enhancement of every subject’s happiness, and justice can be achieved through the promotion and protection of an equality of power among subjects.
PETER KNAUER'S CONCEPTION OF MORAL CHOICE ON THE ANTHROPOLOGICAL CREATIVENESS IN MODERN MORAL THEOLOGY Summary The author undertakes a critical analysis of the ethical views of Peter Knauer, one of the most influential moral theologians today, and seeks to show that the consequences of Knauer's theory are destructive for morality. The first part of the paper presents Knauer's standpoint on the conception of moral choice and identifies three crucial points of his system: the definition of moral good (rightness) in its relation to physical good; the reinterpretation of the principle of double effect, a reinterpretation that reduces that principle to its "teleological explanation"; and, finally, the conception of so-called non-counterproductivity, which decides whether an activity is right. Non-counterproductivity is understood as an all-embracing correspondence between the goals the subject has chosen and the means the subject has taken in order to accomplish those goals. The second, critical part raises questions belonging to the immanent critique of the discussed theory and draws out some consequences of Knauer's claims. If one applies his assumptions strictly, it turns out that in his model of morality there is no place for an activity of man that would be at once rational and free, i.e., moral sensu stricto. Within Knauer's system the criteria of moral evaluation he proposes lose their sense. Thus morality and ethics lose their essential normative character. Consequently, the system under scrutiny leads to an antipersonalistic vision of both the individual and society, and in terms of eternity it seems to dismiss the possibility of recognizing the existence of the Absolute. The theory rejects the very values for the sake of which it was construed. Translated by Jan Kłos.
Leibniz claims that Berkeley “wrongly or at least pointlessly rejects abstract ideas”. What he fails to realize, however, is that some of his own core views commit him to essentially the same stance. His belief that this is the best (and thus most harmonious) possible world, which itself stems from his Principle of Sufficient Reason, leads him to infer that mind and body must perfectly represent or ‘express’ one another. In the case of abstract thoughts he admits that this can happen only in virtue of thinking of some image that, being essentially a mental copy of a brain state, expresses (and is expressed by) that state. But here he faces a problem. In order for a thought to be genuinely abstract, its representational content must differ from that of any mental image, since the latter can represent only something particular. In that case, however, an exact correspondence between the accompanying mental image and the brain state would not suffice to establish a perfect harmony between mind and body. Even on Leibniz’s own principles, then, it appears that Berkeley was right to dismiss abstract ideas.
The aim of this study is to examine the relation between Nietzsche’s perspectivism and his doctrine of the will to power, and to show that perspectivism is almost a direct and natural consequence of that doctrine. Without exploring the doctrine, it is not possible to understand what Nietzsche’s perspectivism is and what he is trying to do by proposing it as an alternative to traditional epistemology. To this aim, Nietzsche’s doctrine of the will to power is first explained in detail. Next, in order to provide a deeper understanding of the doctrine, its relation to Darwinism and the claims that it is a metaphysical principle are analyzed. Afterwards, Nietzsche’s construction of the world as becoming out of will to power is investigated, and his conception of interpretation as power struggle and its role in perspectivism is explained. Then it is shown how Nietzsche’s construction of the world as becoming and his concept of interpretation as power struggle emerge as perspectivism. After that, in order to present the differences between Nietzsche’s perspectivism and the traditional understanding of epistemology, Nietzsche’s critiques of some fundamental assumptions of traditional epistemology, i.e., causality, logic, and the subject-object and apparent-real world distinctions, are investigated. Finally, Nietzsche’s understanding of truth based on his perspectivism is inquired into. Its relation to the correspondence, pragmatic and coherence theories of truth is explored to show that Nietzsche’s understanding of truth cannot be comprehended through these theories. Consequently, it is claimed that the tendency, prevalent in current Nietzsche studies, to attribute a truth theory to Nietzsche’s perspectivism stems from commentators’ conscious or unconscious ignoring of the relation between his perspectivism and his doctrine of the will to power.
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.

For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along.

To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
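As a toy illustration of the proofs-as-programs correspondence this abstract describes (a minimal sketch of my own in Python, not the author's notation): a natural-deduction proof of A ∧ B → B ∧ A corresponds to a program that swaps a pair, and implication elimination (modus ponens) corresponds to function application.

```python
# Proofs as programs: evidence for a conjunction "A and B" is a pair
# (evidence-for-A, evidence-for-B), so a proof of (A and B) -> (B and A)
# is a function that swaps the components of a pair.
def swap(pair):
    a, b = pair
    return (b, a)

# Implication elimination (modus ponens) is just function application:
# from a proof f of A -> B and a proof a of A, obtain a proof of B.
def modus_ponens(f, a):
    return f(a)

print(modus_ponens(swap, (1, "x")))  # evidence for B and A: ('x', 1)
```

Simplifying the proof "introduce the implication, then immediately eliminate it" amounts to running `swap` on its argument, which is the proof-simplification/program-execution correspondence the essay develops.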
This study presents a new type of foundational model unifying quantum theory, relativity theory and gravitational physics, with a novel cosmology. It proposes a six-dimensional geometric manifold as the foundational ontology for our universe. The theoretical unification is simple and powerful, and there are a number of novel empirical predictions and theoretical reductions that are strikingly accurate. It subsequently addresses a variety of current anomalies in physics. It shows how incomplete modern physics is by giving an example of a theory that is genuinely unified. In doing this, it radically alters the metaphysical interpretation of the nature of time, space and matter currently drawn from modern physics. It also profoundly challenges materialist expectations about a naturalistic account of our own existence. I contend here that there is sufficient evidence to support this theory as a leading paradigm for a unified foundational theory.
I show that centered propositions—also called de se propositions, and usually modeled as sets of centered worlds—pose a serious problem for various versions of Lewis's Principal Principle. The problem, put roughly, is that in scenarios like Elga's 'Sleeping Beauty' case, those principles imply that rational agents ought to have obviously irrational credences. To solve the problem, I propose a centered version of the Principal Principle. My version allows centered propositions to be objectively chancy.
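To make the Sleeping Beauty setup concrete, here is a toy computation of my own (the names `halfer` and `thirder` are mine, chosen for illustration): modeling the three centered worlds directly shows how the two standard updating rules diverge.

```python
from fractions import Fraction

# Centered possibilities (world, waking): heads yields one waking,
# tails yields two.  Each world has objective chance 1/2.
centers = [("heads", "Mon"), ("tails", "Mon"), ("tails", "Tue")]
chance = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}

def halfer(coin):
    # Split each world's chance evenly over its centers.
    total = Fraction(0)
    for world, _ in centers:
        if world == coin:
            n = sum(1 for w, _ in centers if w == world)
            total += chance[world] / n
    return total

def thirder(coin):
    # Weight each center by its world's chance, then renormalize.
    hit = sum(chance[w] for w, _ in centers if w == coin)
    return hit / sum(chance[w] for w, _ in centers)

print(halfer("heads"), thirder("heads"))  # 1/2 vs 1/3
```

The tension the abstract points to is that an uncentered chance principle pushes the agent toward the first answer even in contexts where the second seems rationally required.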
Luminance and color are strong and self-sufficient cues to pictorial depth in visual scenes and images. The present study investigates the conditions under which luminance or color either strengthens or overrides geometric depth cues. We investigated how luminance contrasts associated with color contrast interact with relative height in the visual field, partial occlusion, and interposition in determining the probability that a given figure is perceived as "nearer" than another. Latencies of "near" responses were analyzed to test for effects of attentional selection. Figures in a pair were supported by luminance contrast or isoluminant color contrast and combined with one of the three geometric cues. The results of Experiment 1 show that luminance contrast associated with a hue, when it does not interact with other hues, produces the same effects as achromatic luminance contrast: the probability of "near" increases with luminance contrast while the latencies for "near" responses decrease. Partial occlusion is found to be a pictorial cue strong enough to support a weaker red luminance contrast. Interposition cues lose out against cues of spatial position and partial occlusion. The results of Experiment 2, with isoluminant displays of varying color contrast, reveal that red color contrast on a light background supported by any of the three geometric cues wins over green or white supported by any of the three geometric cues. On a dark background, red color contrast supported by the interposition cue loses out against green or white color contrast supported by partial occlusion. These findings reveal that color is not an independent depth cue, but is strongly influenced by luminance contrast and stimulus geometry. Systematically shorter response latencies for stronger "near" percepts demonstrate that selective visual attention reliably detects the most likely depth cue combination in a given configuration.
In the early 20th century, scepticism was common among philosophers about the very meaningfulness of the notion of truth – and of the related notions of denotation, definition, etc. (i.e., what Tarski called semantical concepts). Awareness was growing of the various logical paradoxes and anomalies arising from these concepts, and further philosophical reasons were being given for this aversion. The atmosphere changed dramatically with Alfred Tarski’s path-breaking contribution. What Tarski did was to show that, assuming that the syntax of the object language is specified exactly enough, and that the metatheory has a certain amount of set-theoretic power, one can explicitly define truth in the object language. And what can be explicitly defined can be eliminated. It follows that the defined concept cannot give rise to any inconsistencies (that is, paradoxes). This gave new respectability to the concept of truth and related notions. Nevertheless, philosophers’ judgements on the nature and philosophical relevance of Tarski’s work have varied. It is my aim here to review and evaluate some threads in this debate.
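The construction the abstract refers to can be summarized schematically (a standard textbook sketch, not part of the paper): truth is defined recursively via satisfaction, and the definition counts as materially adequate exactly when it entails every instance of the T-schema.

```latex
% The T-schema: one instance per object-language sentence \varphi.
\[
\mathrm{True}(\ulcorner \varphi \urcorner) \;\leftrightarrow\; \varphi
\]
% Sample recursion clauses of the explicit definition:
\[
\mathrm{True}(\ulcorner \varphi \wedge \psi \urcorner)
  \;\leftrightarrow\;
  \mathrm{True}(\ulcorner \varphi \urcorner)
  \wedge \mathrm{True}(\ulcorner \psi \urcorner),
\qquad
\mathrm{True}(\ulcorner \neg\varphi \urcorner)
  \;\leftrightarrow\;
  \neg\,\mathrm{True}(\ulcorner \varphi \urcorner)
\]
```

Because the recursion is carried out in a metatheory with enough set-theoretic strength, "True" is explicitly defined and hence eliminable, which is the step that blocks the semantic paradoxes.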
In a previous work we introduced the algorithm SQEMA for computing first-order equivalents and proving canonicity of modal formulae, and thus established a very general correspondence and canonical completeness result. SQEMA is based on transformation rules, the most important of which employs a modal version of a result by Ackermann that enables elimination of an existentially quantified predicate variable from a formula, provided a certain negative-polarity condition on that variable is satisfied. In this paper we develop several extensions of SQEMA in which that syntactic condition is replaced by a semantic one, viz. downward monotonicity. For the first, and most general, extension SSQEMA we prove correctness for a large class of modal formulae containing an extension of the Sahlqvist formulae, defined by replacing polarity with monotonicity. By employing a special modal version of Lyndon's monotonicity theorem and imposing additional requirements on the Ackermann rule we obtain restricted versions of SSQEMA which guarantee canonicity, too.
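The Ackermann result the abstract relies on can be stated in its standard downward form (a sketch from the general second-order quantifier-elimination literature, not the paper's exact formulation): if the predicate variable P occurs only negatively in B and not at all in A, then

```latex
\[
\exists P \,\bigl[\, \forall \bar{x}\,\bigl(A(\bar{x}) \rightarrow P(\bar{x})\bigr)
  \;\wedge\; B(P) \,\bigr]
\;\equiv\;
B\bigl(P := A\bigr),
\]
% where B(P := A) is the result of substituting A(\bar{x})
% for every occurrence P(\bar{x}) in B.
```

The equivalence eliminates the second-order quantifier over P, which is exactly the step SQEMA's Ackermann rule performs; the paper's extensions replace the negative-polarity side condition with downward monotonicity of B in P.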
Ernst Cassirer claimed that Kant's notion of an actual object presupposes the notion of truth. Therefore, Kant cannot define truth as the correspondence of a judgement with an actual object. In this paper, I discuss the relations between Kant's notions of truth, object, and actuality. I argue that Kant's notion of an actual object does not presuppose the notion of truth. I conclude that Kant can define truth as the correspondence of a judgement with an actual object.
The paper delineates a new approach to truth that falls under the category of “Pluralism within the bounds of correspondence”, and illustrates it with respect to mathematical truth. Mathematical truth, like all other truths, is based on correspondence, but the route of mathematical correspondence differs from other routes of correspondence in (i) connecting mathematical truths to a special aspect of reality, namely, its formal aspect, and (ii) doing so in a complex, indirect way, rather than in a simple and direct way. The underlying idea is that an intricate mind is capable of creating intricate routes from language to reality, and this enables it to apply correspondence principles in areas for which correspondence is traditionally thought to be problematic.
The previously introduced algorithm SQEMA computes first-order frame equivalents for modal formulae and also proves their canonicity. Here we extend SQEMA with an additional rule based on a recursive version of Ackermann's lemma, which enables the algorithm to compute local frame equivalents of modal formulae in the extension of first-order logic with monadic least fixed points (MFFO). This computation operates by transforming input formulae into locally frame-equivalent ones in the pure fragment of the hybrid mu-calculus. In particular, we prove that the recursive extension of SQEMA succeeds on the class of 'recursive formulae'. We also show that a certain version of this algorithm guarantees the canonicity of the formulae on which it succeeds.
Although the controversy between Malthus and Ricardo has long been considered to be an important source for the history of economic thought, it has hardly been the object of a careful study qua controversy, i.e. as a polemical dialogical exchange. We have undertaken to fill this gap, within the framework of a more ambitious project that places controversies at the center of an account of the history of ideas, in science and elsewhere. It is our contention that the dialogical co-text is essential for reconstructing the meaning and the evolution of science. In the present paper we try to substantiate this contention by means of a pragma-rhetorical study of this particular controversy. First, we reconstruct, through an analysis of a chunk of the correspondence, a micro-level of specific moves and countermoves which constitute a sequential structure within which also meta-scientific and meta-controversial considerations play a role. We then move to a macro-level of analysis, looking for recurrent patterns of argumentation. Finally, we draw epistemological conclusions on the nature of rationality and progress as manifested in actual scientific controversies.
Recently we proposed "quantum language" (or, "the linguistic Copenhagen interpretation of quantum mechanics"), which is characterized not only as the metaphysical and linguistic turn of quantum mechanics but also as the linguistic turn of Descartes=Kant epistemology. We believe that quantum language is the language in which to describe science, which is the final goal of dualistic idealism. Hence there is reason to want to clarify, from the quantum-linguistic point of view, the following problems: the "brain in a vat" argument, the Cogito proposition, the "five-minute hypothesis", "only the present exists", the "Copernican revolution", "McTaggart's paradox", and so on. In this paper, these are discussed and clarified in quantum language; that is, it is shown that they cannot be stated within quantum language. We also emphasize that Leibniz's relationalism in the Leibniz-Clarke correspondence is regarded as one of the most important parts of the linguistic Copenhagen interpretation of quantum mechanics. This paper is a revised version of the paper in Open Journal of Philosophy, 2018, Vol. 8, No. 5, 466-480.