Kant's arguments for the synthetic a priori status of geometry are generally taken to have been refuted by the development of non-Euclidean geometries. Recently, however, some philosophers have argued that, on the contrary, the development of non-Euclidean geometry has confirmed Kant's views, for since a demonstration of the consistency of non-Euclidean geometry depends on a demonstration of its equi-consistency with Euclidean geometry, one need only show that the axioms of Euclidean geometry have 'intuitive content' in order to show that both Euclidean and non-Euclidean geometry are bodies of synthetic a priori truths. Michael Friedman has argued that this defence presumes a polyadic conception of logic that was foreign to Kant. According to Friedman, Kant held that geometrical reasoning itself relies essentially on intuition, and that this precludes the very possibility of non-Euclidean geometry. While Friedman's characterization of Kant's views on geometrical reasoning is correct, I argue that Friedman's conclusion that non-Euclidean geometries are logically impossible for Kant is not. I argue that Kant is best understood as a proto-constructivist and that modern constructive axiomatizations (unlike Hilbert-style axiomatizations) of both Euclidean and non-Euclidean geometry capture Kant's views on the essentially constructive nature of geometrical reasoning well.
This paper examines Helmholtz's attempt to use empirical psychology to refute certain of Kant's epistemological positions. Particularly, Helmholtz believed that his work in the psychology of visual perception showed Kant's doctrine of the a priori character of spatial intuition to be in error. Some of Helmholtz's arguments are effective, but this effectiveness derives from his arguments to show the possibility of obtaining evidence that the structure of physical space is non-Euclidean, and these arguments do not depend on his theory of vision. Helmholtz's general attempt to provide an empirical account of the "inferences" of perception is regarded as a failure.
The role of conventions in the formulation of Thomas Reid’s theory of the geometry of vision, which he calls the “geometry of visibles”, is the subject of this investigation. In particular, we will examine the work of N. Daniels and R. Angell who have alleged that, respectively, Reid’s “geometry of visibles” and the geometry of the visual field are non-Euclidean. As will be demonstrated, however, the construction of any geometry of vision is subject to a choice of conventions regarding the construction and assignment of its various properties, especially metric properties, and this fact undermines the claim for a unique non-Euclidean status for the geometry of vision. Finally, a suggestion is offered for trying to reconcile Reid’s direct realist theory of perception with his geometry of visibles.
David Hyder.The Determinate World: Kant and Helmholtz on the Physical Meaning of Geometry. viii + 229 pp., bibl., index. Berlin/New York: Walter de Gruyter, 2009.
In the article, an argument is given that Euclidean geometry is a priori in the same way that numbers are a priori, the result of modelling, not the world, but our activities in the world.
Kant argued that Euclidean geometry is synthesized on the basis of an a priori intuition of space. This proposal inspired much behavioral research probing whether spatial navigation in humans and animals conforms to the predictions of Euclidean geometry. However, Euclidean geometry also includes concepts that transcend the perceptible, such as objects that are infinitely small or infinitely large, or statements of necessity and impossibility. We tested the hypothesis that certain aspects of nonperceptible Euclidean geometry map onto intuitions of space that are present in all humans, even in the absence of formal mathematical education. Our tests probed intuitions of points, lines, and surfaces in participants from an indigene group in the Amazon, the Mundurucu, as well as adults and age-matched children controls from the United States and France and younger US children without education in geometry. The responses of Mundurucu adults and children converged with those of mathematically educated adults and children and revealed an intuitive understanding of essential properties of Euclidean geometry. For instance, on a surface described to them as perfectly planar, the Mundurucu's estimations of the internal angles of triangles added up to ∼180 degrees, and when asked explicitly, they stated that there exists one single parallel line to any given line through a given point. These intuitions were also partially in place in the group of younger US participants. We conclude that, during childhood, humans develop geometrical intuitions that spontaneously accord with the principles of Euclidean geometry, even in the absence of training in mathematics.
John Corcoran and George Boger. Aristotelian logic and Euclidean geometry. Bulletin of Symbolic Logic. 20 (2014) 131. -/- By an Aristotelian logic we mean any system of direct and indirect deductions, chains of reasoning linking conclusions to premises—complete syllogisms, to use Aristotle’s phrase—1) intended to show that their conclusions follow logically from their respective premises and 2) resembling those in Aristotle’s Prior Analytics. Such systems presuppose the existence of cases where it is not obvious that the conclusion follows from the premises: there must be something deductions can show. Corcoran calls a proposition that follows from given premises a hidden consequence of those premises if it is not obvious that the proposition follows from those premises. By a Euclidean geometry we mean an extended discourse beginning with basic premises—axioms, postulates, definitions—1) treating a universe of geometrical figures and 2) resembling Euclid’s Elements. There were Euclidean geometries before Euclid (fl. 300 BCE), even before Aristotle (384–322 BCE). Bochenski, Lukasiewicz, Patzig and others never knew this, or if they did they found it inconvenient to mention. Euclid shows no awareness of Aristotle. It is obvious today—as it should have been obvious in Euclid’s time, if anyone knew both—that Aristotle’s logic was insufficient for Euclid’s geometry: few if any geometrical theorems can be deduced from Euclid’s premises by means of Aristotle’s deductions. Aristotle’s writings don’t say whether his logic is sufficient for Euclidean geometry. But there is not even one fully presented example. However, Aristotle’s writings do make clear that he endorsed the goal of a sufficient system. Nevertheless, incredible as this is today, many logicians after Aristotle claimed that Aristotelian logics are sufficient for Euclidean geometries. This paper reviews and analyses such claims by Mill, Boole, De Morgan, Russell, Poincaré, and others.
It also examines early contrary statements by Hintikka, Mueller, Smith, and others. Special attention is given to the argumentations pro or con and especially to their logical, epistemic, and ontological presuppositions. What methodology is necessary or sufficient to show that a given logic is adequate or inadequate to serve as the underlying logic of a given science?
In this paper, we will make explicit the relationship that exists between geometric objects and geometric figures in planar Euclidean geometry. That will enable us to determine basic features regarding the role of geometric figures and diagrams when used in the context of pure and applied planar Euclidean geometry, arising due to this relationship. By taking into account pure geometry, as developed in Euclid’s Elements, and practical geometry, we will establish a relation between geometric objects and figures. Geometric objects are defined in terms of idealizations of the corresponding figures of practical geometry. We name the relationship between them as a relation of idealization. This relation, existing between objects and figures, is what enables figures to have a role in pure and applied geometry. That is, we can use a figure or diagram as a representation of geometric objects or composite geometric objects because the relation of idealization corresponds to a resemblance-like relationship between objects and figures. Moving beyond pure geometry, we will defend that there are two other ‘layers’ of representation at play in applied geometry. To show that, we will consider Euclid’s Optics.
Throughout history, almost all mathematicians, physicists and philosophers have been of the opinion that space and time are infinitely divisible. That is, it is usually believed that space and time do not consist of atoms, but that any piece of space and time of non-zero size, however small, can itself be divided into still smaller parts. This assumption is included in geometry, as in Euclid, and also in the Euclidean and non-Euclidean geometries used in modern physics. Of the few who have denied that space and time are infinitely divisible, the most notable are the ancient atomists, and Berkeley and Hume. All of these assert not only that space and time might be atomic, but that they must be. Infinite divisibility is, they say, impossible on purely conceptual grounds.
In this article I develop an elementary system of axioms for Euclidean geometry. On one hand, the system is based on the symmetry principles which express our a priori ignorant approach to space: all places are the same to us, all directions are the same to us, and all units of length we use to create geometric figures are the same to us. On the other hand, through the process of algebraic simplification, this system of axioms directly yields Weyl’s system of axioms for Euclidean geometry. The system of axioms, together with its a priori interpretation, offers new views to the philosophy and pedagogy of mathematics: it supports the thesis that Euclidean geometry is a priori; it supports the thesis that in modern mathematics Weyl’s system of axioms is dominant over Euclid’s system because it reflects the a priori underlying symmetries; and it gives a new and promising approach to learning geometry which, through Weyl’s system of axioms, leads from the essential geometric symmetry principles of the mathematical nature directly to modern mathematics.
Social space is superimposed on the civilization map of the world, whereas social time is correlated with the duration of a civilization's existence. Within a civilization the concept space is non-homogeneous; there are “singled out points” — “concept factories”. As social structures, cities may exist rather long, sometimes for several millennia, but as concept centres they are limited by the duration of the civilization's existence. If a civilization is a “concept universe”, nobody and nothing may cross its boundaries, which include cities as well. The death of a civilization leads to a reboot of cultural and historical space-time. On the other hand, reformatted old concepts are not preserved, but there may be a reception of old concepts and their new interpretation. However, even where genetic links are present, these are other concepts and not modified old ones. Under certain circumstances a “rebranding” may take place, in which an attractive name is attached to a concept of an absolutely different order so as to lend it the authority of the past.
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697) to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa which continued to grow in the period preceding the second edition \ref[see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions", indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED March 14, 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th-century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P,Q/R becomes P,~R/~Q or ~R, Q/~P. Saccheri’s work involved indirect deduction not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
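The schematic transformation just described (P,Q/R becoming P,~R/~Q or ~R,Q/~P) can be checked mechanically in classical propositional logic: an indirect reduction is valid exactly when the original argument is. The following sketch (a truth-table enumeration with hypothetical helper names, not anything from the review itself) verifies this for a modus ponens instance, where the premise P → R is swapped for the negated conclusion ~R and the new conclusion is the negation of the replaced premise.

```python
from itertools import product

def valid(premises, conclusion, n_atoms):
    # An argument is valid iff every truth assignment that satisfies
    # all premises also satisfies the conclusion.
    for row in product([False, True], repeat=n_atoms):
        if all(p(*row) for p in premises) and not conclusion(*row):
            return False
    return True

# Original argument: P, (P -> R) / R  (modus ponens).
# Indirect reduction: P, ~R / ~(P -> R) — one premise replaced by the
# contradictory of the old conclusion; new conclusion is the
# contradictory of the replaced premise.
P    = lambda p, r: p
Q    = lambda p, r: (not p) or r          # P -> R
R    = lambda p, r: r
notR = lambda p, r: not r
notQ = lambda p, r: not ((not p) or r)    # ~(P -> R)

original  = valid([P, Q], R, 2)           # P, Q / R
reduction = valid([P, notR], notQ, 2)     # P, ~R / ~Q
print(original, reduction)                # prints: True True
```

Both come out valid together, in line with the review's point that reduction merely transforms one argument into another of the same premise count rather than deducing anything new.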
In his doctoral dissertation On the Principle of Sufficient Reason, Arthur Schopenhauer outlines a critique of Euclidean geometry on the basis of the changing nature of mathematics, and hence of demonstration, as a result of Kantian idealism. According to Schopenhauer, Euclid treats geometry synthetically, proceeding from the simple to the complex, from the known to the unknown, “synthesizing” later proofs on the basis of earlier ones. Such a method, although proving the case logically, nevertheless fails to attain the raison d’être of the entity. In order to obtain this, a separate method is required, which Schopenhauer refers to as “analysis,” thus echoing a method already in practice among the early Greek geometers, with however some significant differences. In this essay, I discuss Schopenhauer’s criticism of synthesis in Euclid’s Elements, and the nature and relevance of his own method of analysis.
In this paper I will offer a novel understanding of a priori knowledge. My claim is that the sharp distinction that is usually made between a priori and a posteriori knowledge is groundless. It will be argued that a plausible understanding of a priori and a posteriori knowledge has to acknowledge that they are in a constant bootstrapping relationship. It is also crucial that we distinguish between a priori propositions that hold in the actual world and merely possible, non-actual a priori propositions, as we will see when considering cases like Euclidean geometry. Furthermore, contrary to what Kripke seems to suggest, a priori knowledge is intimately connected with metaphysical modality, indeed, grounded in it. The task of a priori reasoning, according to this account, is to delimit the space of metaphysically possible worlds in order for us to be able to determine what is actual.
This paper investigates the nature of reality by looking at the philosophical debate between realism and idealism, at scientific investigations in quantum physics, and at recent studies of animal senses, neurology and cognitive psychology. The concept of perceptual relativity is examined, and this involves looking at sense perception in other animals and various examples of perceptual relativity in science. It will be concluded that the universe is observer dependent and that there is no reality independent of the observer that is knowable to the observer. The paper concludes with an investigation of what an observer-dependent universe would be like, and argues that recognition of an observer-dependent world would lead to a more open-minded and tolerant world.
Stanisław Ignacy Witkiewicz and Franciszka and Stefan Themerson constitute a rare constellation of outstanding artists of the 20th century avant-garde. Their best known contributions were the concept of Pure Form and the idea of Semantic Poetry respectively. They all shared a multiplicity and diversity of interests and areas of activity, not only artistic ones. Philosophy and science influenced to a large extent the form and content of their works. Despite their mutual interest in each other’s work they never met personally, and no correspondence between them has been found. Nevertheless, similarities in attitude resulted in an intertwining of their paths of artistic development. One of Stefan Themerson’s early novels shows the strong influence of Witkacy’s works, but all his later writings were fully original and individual. The author believes, however, that one can still find some flavours, tones and traces leading to the oeuvre of the author of Insatiability. After leaving Poland in 1938 the Themersons maintained their interest in Witkiewicz’s works, which finally led to the translation and preparation for publication by Gaberbocchus Press of two of his plays – Gyubal Wahazar and Mother – featuring Franciszka’s drawings. A brief review of the Gaberbocchus Press achievements and of the resources of the Themersons Archive, established by Jasia Reichardt after their deaths, constitutes the background for an analysis of the relationships between Witkacy and the Themersons. The acquisition of the Themersons’ heritage by the National Library in Warsaw has greatly eased access to many very interesting source documents. The author focuses on the part of the Archive related to all aspects of Witkacy’s works. Typescripts of the publication-ready translated and adapted plays, Vahazar and Mother, which in the end were not published by Gaberbocchus Press, triggered the idea of taking up the challenge of publishing a “bestlooker” today.
A draft of a bibliophilic bilingual edition of the Gyubal Wahazar drama illustrated with Franciszka Themerson’s drawings (samples are included in the paper) is proposed as a final thought.
The electronic and muonic hydrogen energy levels are calculated very accurately [1] in Quantum Electrodynamics (QED) by coupling the Dirac Equation four vector (c, mc2) current covariantly with the external electromagnetic (EM) field four vector in QED’s Interactive Representation (IR). The c-Non Exclusion Principle (c-NEP) states that, if one accepts c as the electron/muon velocity operator because of the very accurate hydrogen energy levels calculated, then one must also accept the resulting electron/muon internal spatial and time coordinate operators (ISaTCO) derived directly from c without any assumptions. This paper does not change any of the accurate QED calculations of hydrogen’s energy levels, given the simplistic model of the proton used in these calculations [1]. The Proton Radius Puzzle [2, 3] may indicate that new physics is necessary beyond the Standard Model (SM), and this paper describes the bizarre, and very different, situation when the electron and muon are located “inside” the spatially extended proton with their Centers of Charge (CoCs) orbiting the proton at the speed of light in S energy states. The electron/muon center of charge (CoC) is a structureless point that vibrates rapidly in its inseparable, non-random vacuum, whose geometry and time structure are defined by the electron/muon ISaTCO discrete geometry. The electron/muon self mass becomes finite in a natural way due to the ISaTCO cutting off high virtual photon energies in the photon propagator. The Dirac-Maxwell-Wilson (DMW) Equations are derived from the ISaTCO for the EM fields of an electron/muon, and the electron/muon “look” like point particles in far-field scattering experiments in the same way the electric field from a sphere with evenly distributed charge “e” “looks” like a point with the same charge in the far field (Gauss’s Law).
The electron’s/muon’s three fluctuating CoC internal spatial coordinate operators have eight possible eigenvalues [4, 5, 6] that fall on a spherical shell centered on the electron’s CoM, with radius in the rest frame. The electron/muon internal time operator is discrete and describes the rapid virtual electron/positron pair production and annihilation. The ISaTCO together create a current that produces the spin and magnetic moment operators, and the electron and muon no longer have “intrinsic” properties, since the ISaTCO kinematics define the spin and magnetic moment properties.
Geometry, etymologically the “science of measuring the Earth”, is a mathematical formalization of space. Just as formal concepts of number may be rooted in an evolutionarily ancient system for perceiving numerical quantity, the fathers of geometry may have been inspired by their perception of space. Is the spatial content of formal Euclidean geometry universally present in the way humans perceive space, or is Euclidean geometry a mental construction, specific to those who have received appropriate instruction? The spatial content of the formal theories of geometry may depart from spatial perception for two reasons: first, because in geometry, only some of the features of spatial figures are theoretically relevant; and second, because some geometric concepts go beyond any possible perceptual experience. Focusing in turn on these two aspects of geometry, we will present several lines of research on US adults and children from the age of three years, and on participants from an Amazonian culture, the Mundurucu. Almost all the aspects of geometry tested proved to be shared between these two cultures. Nevertheless, some aspects involve a process of mental construction in which explicit instruction seems to play a role in the US, but which can still take place in the absence of instruction in geometry.
Does geometry constitute a core set of intuitions present in all humans, regardless of their language or schooling? We used two nonverbal tests to probe the conceptual primitives of geometry in the Munduruku, an isolated Amazonian indigene group. Our results provide evidence for geometrical intuitions in the absence of schooling, experience with graphic symbols or maps, or a rich language of geometrical terms.
This paper examines explanations that turn on non-local geometrical facts about the space of possible configurations a system can occupy. I argue that it makes sense to contrast such explanations from "geometry of motion" with causal explanations. I also explore how my analysis of these explanations cuts across the distinction between kinematics and dynamics.
This work examines the unique way in which Benedict de Spinoza combines two significant philosophical principles: that real existence requires causal power and that geometrical objects display exceptionally clearly how things have properties in virtue of their essences. Valtteri Viljanen argues that underlying Spinoza's psychology and ethics is a compelling metaphysical theory according to which each and every genuine thing is an entity of power endowed with an internal structure akin to that of geometrical objects. This allows Spinoza to offer a theory of existence and of action - human and non-human alike - as dynamic striving that takes place with the same kind of necessity and intelligibility that pertain to geometry. This fresh and original study will interest a wide range of readers in Spinoza studies and early modern philosophy more generally.
Evolution and geometry generate complexity in similar ways. Evolution drives natural selection while geometry may capture the logic of this selection and express it visually, in terms of specific generic properties representing some kind of advantage. Geometry is ideally suited for expressing the logic of evolutionary selection for symmetry, which is found in the shape curves of vein systems and other natural objects such as leaves, cell membranes, or tunnel systems built by ants. The topology and geometry of symmetry are controlled by numerical parameters, which act in analogy with a biological organism’s DNA. The introductory part of this paper reviews findings from experiments illustrating the critical role of two-dimensional (2D) design parameters, affine geometry and shape symmetry for visual or tactile shape sensation and perception-based decision making in populations of experts and non-experts. It will be shown that 2D fractal symmetry, referred to herein as the “symmetry of things in a thing”, results from principles very similar to those of affine projection. Results from experiments on aesthetic and visual preference judgments in response to 2D fractal trees with varying degrees of asymmetry are presented. In a first experiment (psychophysical scaling procedure), non-expert observers had to rate (on a scale from 0 to 10) the perceived beauty of a random series of 2D fractal trees with varying degrees of fractal symmetry. In a second experiment (two-alternative forced choice procedure), they had to express their preference for one of two shapes from the series. The shape pairs were presented successively in random order. Results show that the smallest possible fractal deviation from “symmetry of things in a thing” significantly reduces the perceived attractiveness of such shapes.
The potential of future studies where different levels of complexity of fractal patterns are weighed against different degrees of symmetry is pointed out in the conclusion.
This paper argues that Frege's notoriously long commitment to Kant's thesis that Euclidean geometry is synthetic _a priori_ is best explained by realizing that Frege uses ‘intuition’ in two senses. Frege sometimes adopts the usage presented in Hermann Helmholtz's sign theory of perception. However, when using ‘intuition’ to denote the source of geometric knowledge, he is appealing to Hermann Cohen's use of Kantian terminology. We will see that Cohen reinterpreted Kantian notions, stripping them of any psychological connotation. Cohen's defense of his modified Kantian thesis on the unique status of the Euclidean axioms presents Frege's own views in a much more favorable light.
In the brain, the relations between free neurons and conditioned ones establish the constraints for the informational neural processes. These constraints reflect the system-environment state, i.e. the dynamics of homeocognitive activities. The constraints allow us to define the cost function in the phase space of free neurons so as to trace the trajectories of the possible configurations at minimal cost while respecting the constraints imposed. Since the space of free states is a manifold, a non-orthogonal space, the minimum distance is not a straight line but a geodesic. The minimum condition is expressed by a set of ordinary differential equations (ODEs) that in general are not linear. In the brain there is no algorithm or physical field that regulates the computation, so we must consider an emergent process arising from the neural collective behavior triggered by synaptic variability. We define neural computation as the study of the classes of trajectories on a manifold geometry defined under suitable constraints. The cost function supervises pseudo-equilibrium thermodynamic effects that manage the computational activities from beginning to end and realizes an optimal control through constraints and geodesics. The task of this work is to establish a connection between the geometry of neural computation and cost functions. To illustrate the essential mathematical aspects we will use as a toy model a Network Resistor with Adaptive Memory (memristors). The information geometry defined here is an analog computation; therefore it does not suffer the limits of Turing computation and it seems to respond to the demand for a greater biological plausibility. The model of brain optimal control proposed here can be a good foundation for implementing the concept of "intentionality", according to the suggestion of W. Freeman.
Indeed, the geodesic in the brain states can produce suitable behavior to realize wanted functions and invariants as neural expressions of cognitive intentions.
When national borders in the modern sense first began to be established in early modern Europe, non-contiguous and perforated nations were a commonplace. According to the conception of the shapes of nations that is currently preferred, however, nations must conform to the topological model of circularity; their borders must guarantee contiguity and simple connectedness, and such borders must as far as possible conform to existing topographical features on the ground. The striving to conform to this model can be seen at work today in Quebec and in Ireland, it underpins much of the rhetoric of the P.L.O., and was certainly to some degree involved as a motivating factor in much of the ethnic cleansing which took place in Bosnia in recent times. The question to be addressed in what follows is: to what extent could inter-group disputes be more peacefully resolved, and ethnic cleansing avoided, if political leaders, diplomats and others involved in the resolution of such disputes could be brought to accept weaker geometrical constraints on the shapes of nations? A number of associated questions then present themselves: What sorts of administrative and logistical problems have been encountered by existing non-contiguous nations and by perforated nations, and by other nations deviating in different ways from the received geometrical ideal? To what degree is the desire for contiguity and simple connectedness a rational desire, and to what degree does it rest on species of political rhetoric which might be countered by, for example, philosophical argument? These and a series of related questions will form the subject-matter of the present essay.
Abstract. Let REL(O*E) be the relation algebra of binary relations defined on the Boolean algebra O*E of regular open regions of the Euclidean plane E. The aim of this paper is to prove that the canonical contact relation C of O*E generates a subalgebra REL(O*E, C) of REL(O*E) that has infinitely many elements. More precisely, REL(O*E, C) contains an infinite family {SPPn, n ≥ 1} of relations generated by the relation SPP (Separable Proper Part). This relation can be used to define a point-free concept of connectedness that, for the regular open regions of E, coincides with the standard topological notion of connectedness, i.e., a region of the plane E is connected in the sense of topology if and only if it has no separable proper part. Moreover, it is shown that the contact relation algebra REL(O*E, C) and the relation algebra REL(O*E, NTPP) generated by the non-tangential proper parthood relation NTPP coincide. This entails that the allegedly purely topological notion of connectedness can be defined in mereological terms.
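The equivalence between connectedness and the absence of a separable proper part can be sanity-checked in a discrete toy model. This is my own illustration, not the paper's setting: finite sets of grid cells stand in for regions, and edge-adjacency (or overlap) stands in for the contact relation C.

```python
from itertools import combinations

def in_contact(a, b):
    # discrete stand-in for the contact relation C: two cell sets are in
    # contact if they share a cell or contain edge-adjacent cells
    return any(ca == cb or abs(ca[0] - cb[0]) + abs(ca[1] - cb[1]) == 1
               for ca in a for cb in b)

def has_separable_proper_part(region):
    # SPP: a nonempty proper part p of region with no contact to region - p
    cells = sorted(region)
    for r in range(1, len(cells)):
        for p in combinations(cells, r):
            if not in_contact(set(p), region - set(p)):
                return True
    return False

# connected iff no separable proper part (here: iff 4-adjacency-connected)
print(has_separable_proper_part({(0, 0), (0, 1), (1, 0)}))  # False: the L-shape is connected
print(has_separable_proper_part({(0, 0), (2, 2)}))          # True: two separated cells
```

In this discrete analogue, a region has a separable proper part exactly when it splits into two mutually non-adjacent pieces, which is disconnectedness in the graph-theoretic sense, mirroring the paper's mereological definition of topological connectedness.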
Berkeley, in his Introduction to the Principles of Human Knowledge, uses geometrical examples to illustrate a way of generating "universal ideas," which allegedly account for the existence of general terms. In doing proofs we might, for example, selectively attend to the triangular shape of a diagram; presumably what we prove using just that property applies to all triangles. I contend, rather, that given Berkeley's view of extension, no Euclidean triangles exist to attend to. Proof, as Berkeley would normally assume, instead requires idealizing diagrams, treating them as if they obeyed Euclidean constraints. This convention solves the problem of representative generalization. Berkeley and Proof in Geometry, Volume 51, Issue 3, Richard J. Brook. DOI: https://doi.org/10.1017/S0012217312000686
We introduce and study a variety of modal logics of parallelism, orthogonality, and affine geometries, for which we establish several completeness, decidability, and complexity results, and state a number of related open and apparently difficult problems. We also demonstrate that lack of the finite model property is a rather common phenomenon for modal logics of sufficiently rich affine or projective geometries (including the real affine and projective planes).
Let us start with some general definitions of the concept of complexity. We take a complex system to be one composed of a large number of parts, and whose properties are not fully explained by an understanding of its component parts. Studies of complex systems recognized the importance of "wholeness", defined as problems of organization (and of regulation), phenomena not resolvable into local events, dynamic interactions manifest in the difference of behaviour of parts when isolated or in a higher configuration; in short, systems of various orders (or levels) not understandable by investigation of their respective parts in isolation. In a complex system it is essential to distinguish between 'global' and 'local' properties. Theoretical physicists in the last two decades have discovered that the collective behaviour of a macro-system, i.e. a system composed of many objects, does not change qualitatively when the behaviour of its single components is modified slightly. Conversely, it has also been found that the behaviour of single components does change when the overall behaviour of the system is modified. There are many universality classes which describe the collective behaviour of the system, and each class has its own characteristics; the universality classes do not change when we perturb the system. The most interesting and rewarding work consists in finding these universality classes and in spelling out their properties. This conception has been followed in studies done in the last twenty years on second-order phase transitions. The objective, which has been mostly achieved, was to classify all possible types of phase transitions into different universality classes and to compute the parameters that control the behaviour of the system near the transition (or critical, or bifurcation) point as a function of the universality class. This point of view is not very different from the one expressed by Thom in the introduction to Structural Stability and Morphogenesis (1975).
It differs from Thom's program because there is no a priori idea of the mathematical framework which should be used. Indeed, Thom considers only a restricted class of models (ordinary differential equations in low-dimensional spaces), while we do not have any prejudice regarding which models should be accepted. One of the most interesting and surprising results obtained by studying complex systems is the possibility of classifying the configurations of the system taxonomically. It is well known that a well-founded taxonomy is possible only if the objects we want to classify have a particular property: species may be introduced in an objective way only if it is impossible to pass continuously from one species to another; in a more mathematical language, we say that the objects must have the property of ultrametricity. More precisely, it was discovered that there are conditions under which a class of complex systems may only exist in configurations that have the ultrametricity property, and consequently they can be classified in a hierarchical way. Indeed, it has been found that this ultrametricity property is shared by the near-optimal solutions of many optimization problems of complex functions, i.e. corrugated landscapes in Kauffman's language. These results are derived from the study of spin glass models, but they have wider implications: it is possible that the kind of structures that arise in these cases is present in many other apparently unrelated problems. Before going on with our considerations, we should keep in mind two main complementary ideas about complexity. (i) According to the prevalent and usual point of view, the essence of complex systems lies in the emergence of complex structures from the non-linear interaction of many simple elements that obey simple rules. Typically, these rules consist of 0–1 alternatives selected in response to the input received, as in many prototypes like cellular automata, Boolean networks, spin systems, etc.
Quite intricate patterns and structures can occur in such systems. However, these are toy systems, and the systems occurring in reality consist rather of elements that are individually quite complex themselves. (ii) This brings in a new aspect that seems essential and indispensable to the emergence and functioning of complex systems, namely the coordination of individual agents or elements that are themselves complex at their own scale of operation. This coordination dramatically reduces the degrees of freedom of the participating agents. Even the constituents of molecules, i.e. the atoms, are rather complicated conglomerations of subatomic particles, perhaps ultimately excitations of patterns of superstrings. Genes, the elementary biochemical coding units, are very complex macromolecular strings, as are the metabolic units, the proteins. Neurons, the basic elements of cognitive networks, are themselves cells. In these and in other complex systems, it is an important feature that the potential complexity of the behaviour of the individual agents gets dramatically simplified through the global interactions within the system. The individual degrees of freedom are drastically reduced; in a more formal terminology, the factual state space of the system is much smaller than the product of the state spaces of the individual elements. That is one key aspect. The other is that on this basis, that is, utilizing the coordination between the activities of its members, the system then becomes able to develop and express a coherent structure at a higher level: an emergent behaviour (and emergent properties) that transcends what each element is individually capable of.
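The role of ultrametricity in making hierarchical classification possible can be made concrete with a minimal sketch (my own example, not drawn from the spin-glass literature): in a nested taxonomy, defining the distance between two items as the level of the smallest cluster containing both yields a distance satisfying the strong (ultrametric) triangle inequality d(x, z) ≤ max(d(x, y), d(y, z)).

```python
from itertools import permutations

# Toy nested taxonomy: five items grouped at increasing levels
# (the items and levels are invented for illustration).
clusters = {
    1: [{"a", "b"}, {"c"}, {"d", "e"}],   # finest grouping, e.g. species
    2: [{"a", "b", "c"}, {"d", "e"}],     # coarser grouping, e.g. genus
    3: [{"a", "b", "c", "d", "e"}],       # the whole taxonomy
}

def d(x, y):
    # distance = level of the smallest cluster containing both items
    if x == y:
        return 0
    return min(level for level, cs in clusters.items()
               if any(x in c and y in c for c in cs))

# the ultrametric inequality holds for every ordered triple of items
ok = all(d(x, z) <= max(d(x, y), d(y, z))
         for x, y, z in permutations("abcde", 3))
print(ok)  # True: the hierarchy induces an ultrametric
```

The converse direction is the point made in the text: it is precisely because near-optimal configurations satisfy this inequality that they can be arranged into a well-founded hierarchical taxonomy at all.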
While working on some basic physical phenomena, I discovered some geometric relations that are also of interest to mathematics. In this work, I applied the rules I have proven to the P=NP? problem, via the impossibility of perpendicularity in the universe. It also yields extremely interesting results, for instance concerning imaginary numbers, which are currently known as real numbers. It also seems that Euclidean geometry is impossible: the actual geometry is Riemannian geometry, and complex numbers are real.
REVIEW OF: Automated Development of Fundamental Mathematical Theories by Art Quaife (1992: Kluwer Academic Publishers), 271pp. Using the theorem prover OTTER, Art Quaife has proved four hundred theorems of von Neumann-Bernays-Gödel set theory; twelve hundred theorems and definitions of elementary number theory; dozens of Euclidean geometry theorems; and Gödel's incompleteness theorems. It is an impressive achievement. To gauge its significance and to see what prospects it offers, this review looks closely at the book and the proofs it presents.
This article analyzes the value of geometric models for understanding matter, with the examples of the Platonic model of the four primary elements (fire, air, water, and earth) and the models of carbon atomic structures in the new science of crystallography. It explains how the geometry of these models is built in order to discover the properties of matter: movement and stability for the primary elements, and hardness, softness and elasticity for the carbon atoms. These geometric models appear to have a double quality: firstly, they exhibit visually the scientific properties of matter, and secondly, they give us the possibility of visualizing its whole nature. Geometric models thus appear to be the expression of the mind in the understanding of physical matter.
The purpose of the article is to study the Western worldview as a framework of beliefs in probable supernatural encroachment into objective reality. The methodology rests on the idea that every cultural-historical community envisions the principles of reality according to the beliefs inherent to it, which accounts for the formation of unique "universes of meanings". The space of history acquires non-Euclidean properties that determine the specific cultural attitudes, as well as the attendant mythology, of the corresponding communities. The novelty consists in the approach to the miracle as a psychological need for a religious authority, expressed through the religious and non-religious (scientific) worldviews, which are deeply interconnected by invariant thinking patterns. It is argued that the full-fledged existence of religion is impossible without a miraculous constituent, and it is illustrated that the development of society causes a transformation of beliefs in gods and in the miracles they perform. The theological origins of the scientific belief in the importance and regularity of natural processes are outlined. Conclusions: religion presupposes emotional involvement and reasoning, which is realized by means of the miracle. Modern science reproduces, at its own level, the theological concept of the permanence of God and His will. Through the history of humankind, not only has the nature of the miracle changed (the common tendency being its expansion into daily reality), but so has its supposed subject (where abstraction is the trend).
Analysing several characteristic mathematical models: natural and real numbers, Euclidean geometry, group theory, and set theory, I argue that a mathematical model in its final form is a junction of a set of axioms and an internal partial interpretation of the corresponding language. It follows from the analysis that (i) mathematical objects do not exist in the external world: they are our internally imagined objects, some of which, at least approximately, we can realize or represent; (ii) mathematical truths are not truths about the external world but specifications (formulations) of mathematical conceptions; and (iii) mathematics is first and foremost our imagined tool by which, with certain assumptions about its applicability, we explore nature and synthesize our rational cognition of it.
We examine some of Connes’ criticisms of Robinson’s infinitesimals starting in 1995. Connes sought to exploit the Solovay model S as ammunition against non-standard analysis, but the model tends to boomerang, undercutting Connes’ own earlier work in functional analysis. Connes described the hyperreals as both a “virtual theory” and a “chimera”, yet acknowledged that his argument relies on the transfer principle. We analyze Connes’ “dart-throwing” thought experiment, but reach an opposite conclusion. In S, all definable sets of reals are Lebesgue measurable, suggesting that Connes views a theory as being “virtual” if it is not definable in a suitable model of ZFC. If so, Connes’ claim that a theory of the hyperreals is “virtual” is refuted by the existence of a definable model of the hyperreal field due to Kanovei and Shelah. Free ultrafilters aren’t definable, yet Connes exploited such ultrafilters both in his own earlier work on the classification of factors in the 1970s and 80s, and in Noncommutative Geometry, raising the question whether the latter may not be vulnerable to Connes’ criticism of virtuality. We analyze the philosophical underpinnings of Connes’ argument based on Gödel’s incompleteness theorem, and detect an apparent circularity in Connes’ logic. We document the reliance on non-constructive foundational material, and specifically on the Dixmier trace −∫ (featured on the front cover of Connes’ magnum opus) and the Hahn–Banach theorem, in Connes’ own framework. We also note an inaccuracy in Machover’s critique of infinitesimal-based pedagogy.
The concept of time is examined using the second law of thermodynamics, which was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between the energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space-time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, i.e. geodesics, are obtained from the principle of least action for mechanics, electrodynamics and quantum mechanics. The arrow of time, associated with the expansion of the Universe, is identified with the grand dispersal of energy, when high energy densities transform by various mechanisms to lower densities and eventually to ever-diluting electromagnetic radiation. Likewise, time in a quantum system takes an increment forward in the detection-associated dissipative transformation, when the stationary-state system begins to evolve, pictured as the wave-function collapse. The energy dispersal is understood to underlie causality, so that an energy gradient is a cause and the resulting energy flow is an effect. This account of causality in terms of the concepts of physics does not imply determinism; on the contrary, the evolution of space-time as a causal chain of events is non-deterministic.
How are fundamental constants, such as c for the speed of light, related to particular technological environments? Our understanding of the constant c and Einstein’s relativistic cosmology depended on key experiences and lessons learned in connection to new forms of telecommunications, first used by the military and later adapted for commercial purposes. Many of Einstein’s contemporaries understood his theory of relativity by reference to telecommunications, some referring to it as “signal-theory” and “message theory.” Prominent physicists who contributed to it (Hans Reichenbach, Max Born, Paul Langevin, Louis de Broglie, and Léon Brillouin, among others) worked in radio units during WWI. At the time of its development, the old Newtonian mechanics was retrospectively rebranded as based on the belief in a means of “instantaneous signaling at a distance.” Even our thinking about lengths and solid bodies, argued Einstein and his interlocutors, needed to be overhauled in light of a new understanding of signaling possibilities. Pulling a rod from one side will not make the other end move at once, since relativity had shown that “this would be a signal that moves with infinite speed.” Einstein’s universe, where time and space dilated, where the shortest path between two points was often curved and which broke the laws of Euclidean geometry, functioned according to the rules of electromagnetic signal transmission. For some critics, the new understanding of the speed of light as an unsurpassable velocity, a fundamental tenet of Einstein’s theory, was a mere technological effect related to current limitations in communication technologies.
We present an elementary system of axioms for the geometry of Minkowski spacetime. It strikes a balance between a simple and streamlined set of axioms and the attempt to give a direct formalization in first-order logic of the standard account of Minkowski spacetime in [Maudlin 2012] and [Malament, unpublished]. It is intended for future use in the formalization of physical theories in Minkowski spacetime. The choice of primitives is in the spirit of [Tarski 1959]: a predicate of betweenness and a four-place predicate to compare the squares of the relativistic intervals. Minkowski spacetime is described as a four-dimensional 'vector space' that can be decomposed everywhere into a spacelike hyperplane, which obeys the Euclidean axioms in [Tarski and Givant, 1999], and an orthogonal timelike line. The lengths of other 'vectors' are calculated according to Pythagoras' theorem. We conclude with a Representation Theorem relating models of our system that satisfy second-order continuity to the mathematical structure called 'Minkowski spacetime' in physics textbooks.
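The decomposition described here, a Euclidean spacelike hyperplane plus an orthogonal timelike line with other lengths given by a signed Pythagorean rule, amounts to the standard textbook squared interval. The sketch below illustrates that rule, not the paper's axiomatization; the signature convention (+, -, -, -) is my assumption.

```python
def interval_squared(t, x, y, z):
    # signature (+, -, -, -): the Euclidean squared length on the spacelike
    # hyperplane is subtracted from the square along the timelike line
    return t * t - (x * x + y * y + z * z)

def classify(v):
    s = interval_squared(*v)
    if s > 0:
        return "timelike"
    if s < 0:
        return "spacelike"
    return "lightlike"

print(classify((2, 1, 0, 0)))  # timelike:  4 - 1 = 3 > 0
print(classify((1, 1, 0, 0)))  # lightlike: 1 - 1 = 0
print(classify((1, 2, 2, 1)))  # spacelike: 1 - 9 = -8 < 0
```

Comparing these squared intervals, rather than computing them, is all the paper's four-place primitive predicate needs to express, which is why the axiomatization can stay first-order.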
This is a chapter of the planned monograph "Out of Nowhere: The Emergence of Spacetime in Quantum Theories of Gravity", co-authored by Nick Huggett and Christian Wüthrich and under contract with Oxford University Press. (More information at www<dot>beyondspacetime<dot>net.) This chapter introduces the problem of emergence of spacetime in quantum gravity. It introduces the main philosophical challenge to spacetime emergence and sketches our preferred solution to it.
This article’s conclusion is that the theories of Einstein are generally correct and will still be relevant in the next century (modifications will be necessary for the development of quantum gravity). Those Einsteinian theories are Special Relativity, General Relativity, and the subject of a paper he published in 1919 which asked whether gravitation plays a role in the composition of the elementary particles of matter. That paper was the bridge between General Relativity and the Unified Field Theory he sought during the last 25 years of his life. In an article published in the Annals of Physics in 1957, Charles Misner and John Wheeler claimed that Einstein's latest equations demonstrated the unified field theory; Einstein himself, however, felt he had not fully succeeded. The present article begins with Olbers’ paradox (why is the sky dark at night?). It then briefly proceeds to the subjects of Newtonian gravity, quantum entanglement, gravitational waves, E=mc^2, dark energy, dark matter, cosmic expansion, redshift, blueshift, the cosmic microwave background, the 1st Law of Thermodynamics, and an explanation of advanced waves travelling back in time. The section “vector-tensor-scalar geometry” touches on mass, quantum spin, the Higgs boson and Higgs field, stellar jets, the pervasiveness of photons and gravitons, and supersymmetry. Then come half a dozen paragraphs referring to the formation of planets, black holes, and the bosons of the weak and strong nuclear forces, ending with Descartes’ space-matter relation.
Also added are paragraphs about simply-connected mathematics, non-orientability, consciousness, the Law of Falling Bodies, the multiverse, space-time travel developed from maths’ Brouwer Fixed Point Theorem and from an experiment in electrical engineering performed at Yale University, the development from future space-time travel of human flight in the manner of fiction’s Superman and Supergirl, as well as downloaded band-gap implants in the brain that could deal with forms of matter. They could add or delete anything and everything we choose by emulating computers’ copy/paste function to add things, as well as their delete function to remove things. To complete my seemingly unusual ideas, six sections are added: 1) “Advanced and Retarded Waves” is extended to include dinosaurs, ageing, and photography; 2) there is a bit about space-time warping and “imaginary” computers; 3) several paragraphs about restoring health (even gaining immortality) by using gravity; 4) a section about superconductivity and the electric or magnetic fields of planets (this section mentions Mercury, Planet 9 and precession); 5) a section titled EXPLAINING OCEAN TIDES WHEN GENERAL RELATIVITY SAYS GRAVITY IS A PUSH CAUSED BY THE CURVATURE OF SPACE-TIME (with subsections about M-Sigma, Geysers on Saturn’s Moon Enceladus, and A Brief History of Gravity); plus 6) the potential of COVID-19 to create the Golden Rule, world peace, eternal life, and a non-economic world that doesn’t use any form of money (no cash, credit cards, digital currency, etc.). The final section is called DISTANT-FUTURE SCIENCE INTERPRETED BY RELIGIONS AS SUPERNATURAL and introduces an idea for becoming immortal in these physical bodies. If the Theory of Everything sought by physicists applies to all space-time, then every person’s brain must be entangled with the 22nd century (and far beyond that time too).
The human attempts to access, measure and organize physical phenomena have led to a manifold construction of mathematical and physical spaces. We will survey the evolution of geometries from Euclid to the Algebraic Geometry of the 20th century. The role of Persian/Arabic Algebra in this transition and its Western symbolic development is emphasized. In this relation, we will also discuss changes in the ontological attitudes toward mathematics and its applications. Historically, the encounter of geometric and algebraic perspectives enriched the mathematical practices and their foundations. Yet, the collapse of Euclidean certitudes, of over 2300 years, and the crisis in the mathematical analysis of the 19th century, led to the exclusion of “geometric judgments” from the foundations of Mathematics. After the success and the limits of the logico-formal analysis, it is necessary to broaden our foundational tools and re-examine the interactions with natural sciences. In particular, the way the geometric and algebraic approaches organize knowledge is analyzed as a cross-disciplinary and cross-cultural issue and will be examined in Mathematical Physics and Biology. We finally discuss how the current notions of mathematical (phase) “space” should be revisited for the purposes of life sciences.
In standard probability theory, probability zero is not the same as impossibility. But many have suggested that only impossible events should have probability zero. This can be arranged if we allow infinitesimal probabilities, but infinitesimals do not solve all of the problems. We will see that regular probabilities are not invariant over rigid transformations, even for simple, bounded, countable, constructive, and disjoint sets. Hence, regular chances cannot be determined by space-time invariant physical laws, and regular credences cannot satisfy seemingly reasonable symmetry principles. Moreover, the examples here are immune to the objections against Williamson’s infinite coin flips.
The content of the manuscript represents a bold system of ideas that goes beyond the boundaries of existing knowledge, but its method of reasoning and logic is also very strict and scientific. The purpose of the manuscript is to unify the natural categories (natural philosophy, natural geometry, quantum mechanics, astronomy, ...), and to open a new direction for most other sciences. Abstract of the manuscript: About Philosophy: • Proved the existence of time and its non-dilation. • Proved that matter is always motionless in space. • Concluded that space is energy. • ... About Mathematics: • Solved the squaring-the-circle problem (a millennium problem). • Used an equation to calculate the speed of light as 471,000,000 m/s. • ... About Physics: • Explained the nature of gravity. • Explained dark energy and the spin value of the nucleon. • Explained matter and antimatter. • ...
Are the special sciences autonomous from physics? Those who say they are need to explain how dependent special science properties could feature in irreducible causal explanations, but that’s no easy task. The demands of a broadly physicalist worldview require that such properties are not only dependent on the physical, but also physically realized. Realized properties are derivative, so it’s natural to suppose that they have derivative causal powers. Correspondingly, philosophical orthodoxy has it that if we want special science properties to bestow genuinely new causal powers, we must reject physical realization and embrace a form of emergentism, in which such properties arise from the physical by mysterious brute determination. In this paper, I argue that contrary to this orthodoxy, there are physically realized properties that bestow new causal powers in relation to their realizers. The key to my proposal is to reject causal-functional accounts of realization and embrace a broader account that allows for the realization of shapes and patterns. Unlike functional properties, such properties are defined by qualitative, non-causal specifications, so realizing them does not consist in bestowing causal powers. This, I argue, allows for causal novelty of the strongest kind. I argue that the molecular geometry of H2O, a qualitative, multiply realizable property, plays an irreducible role in explaining its dipole moment, and thereby bestows novel powers. On my proposal, special science properties can have the kind of causal novelty traditionally associated with strong emergence, without any of the mystery.
Ladyman and Ross (LR) argue that quantum objects are not individuals, and use this idea to ground their metaphysical view, ontic structural realism, according to which relational structures are primary to things. LR acknowledge that there is a version of quantum theory, namely the Bohm theory (BT), according to which particles do have definite trajectories at all times. However, LR interpret the research by Brown et al. as implying that "raw stuff" or haecceities are needed for the individuality of the particles of BT, and they dismiss this as idle metaphysics. In this paper we note that Brown et al.'s research does not imply that haecceities are needed. Thus BT remains a genuine option for those who seek to understand quantum particles as individuals. However, we go on to discuss some problems with BT which led Bohm and Hiley to modify it. This modified version underlines that, due to features such as context-dependence and non-locality, Bohmian particles have a very limited autonomy in situations where quantum effects are non-negligible. So while BT restores the possibility of quantum individuals, it also underlines the primacy of the whole over the autonomy of the parts. The later sections of the paper also examine the Bohm theory in the general mathematical context of symplectic geometry, which provides yet another way of understanding the subtle, holistic and dynamic nature of Bohmian individuals. We finally briefly consider Bohm's other main line of research, the "implicate order", which is in some ways similar to LR's structural realism.
The Renaissance architect, moral philosopher, cryptographer, mathematician, Papal adviser, painter, city planner and land surveyor Leon Battista Alberti provided the theoretical foundations of modern perspective geometry. Alberti’s work on perspective exerted a powerful influence on painters of the stature of Albrecht Dürer, Leonardo da Vinci and Piero della Francesca. But his Della pittura of 1435-36 also contains a hitherto unrecognized ontology of pictorial projection. We sketch this ontology, and show how it can be generalized to apply to representative devices in general, including maps and spatial and non-spatial databases.
[written in 2002/2003 while I was a graduate student at the University of Connecticut and ultimately submitted as part of my qualifying exam for the Masters of Philosophy] The question I am interested in revolves around Kant’s notion of the unity of experience. My central claim will be that, apart from the unity of experiencings and the unity of individual substances, there is a third unity: the unity of Experience. I will argue that this third unity can be conceived of as a sort of ‘experiential space’ with the Aesthetic and the Categories as dimensions. I call this ‘Euclidean Experience’ to emphasize the idea that individual experiencings have a ‘location’ within this framework, much as individual objects have a location in space and time. The first sort of unity, that of experiences (or ‘experiencings’ as I will call them), is not enough. In order to have self-consciousness (ascribed atomic experiencings) there must be a consciousness in which the experiencings ‘take place’, just as in order for there to be objects there must be a space in which they are located. With such a notion of experience in hand, I argue that it can be used to bring together the solipsistic and non-solipsistic strands in Kant’s thinking. The resulting position I call ‘Polysolipsism’.
This paper examines two objections by Colin McGinn to panexperientialist metaphysics as a solution to the mind-body problem. It begins by briefly stating how the ‘ontological problem’ of the mind-body relationship is central to the philosophy of mind, summarizes the difficulties with dualism and materialism, and outlines the main tenets of panexperientialism. Panexperientialists, such as David Ray Griffin, claim that theirs is an approach to solving the mind-body problem which gets stuck neither in accounting for interaction nor in the difficulties with emergentism and epiphenomenalism. McGinn attacks panexperientialism on two fronts: first, the Whiteheadian distinction between ‘consciousness’ and ‘experience’ and the notion of consciousness emerging from ‘non-conscious experience’; and second, the implicit ‘absurdities’ inherent in the notion of experience and self-agency in the fundamental particles of physics. Griffin's defence fails to satisfactorily address the first challenge, though a model is presented by the author which may offer panexperientialism a way out. McGinn's second challenge, which Griffin rejects, is an attempted reductio: that panexperientialism contradicts the evidence of modern quantum-relativistic physics. The author's analysis of the opposing positions shows that both philosophers are arguing from incompatible ‘geometries of discourse’ and radically inconsistent metaphysical assumptions. The paper concludes that a resolution of both the mind-body problem in general, and of the McGinn-Griffin dispute in particular, needs to involve an epistemological shift to include extra-rational ways of knowing.
Symmetry in biological and physical systems is a product of self-organization, driven by evolutionary processes or by mechanical constraints. Symmetry-based feature extraction or representation by neural networks may unravel the most informative contents in large image databases. Despite significant achievements of artificial intelligence in the recognition and classification of regular patterns, uncertainty remains a major challenge in ambiguous data. In this study, we present an artificial neural network that detects symmetry uncertainty states in human observers. To this end, we exploit a neural network metric in the output of a biologically inspired Self-Organizing Map: the Quantization Error (SOM-QE). Shape pairs with perfect geometric mirror symmetry but a non-homogeneous appearance, caused by local variations in hue, saturation, or lightness within and/or across the shapes in a given pair, produce, as shown here, a longer choice response time (RT) for “yes” responses relative to symmetry. These data are consistently mirrored by the variations in the SOM-QE from unsupervised neural network analysis of the same stimulus images. The neural network metric is thus capable of detecting and scaling human symmetry uncertainty in response to patterns. Such capacity is tightly linked to the metric’s proven selectivity to local contrast and color variations in large and highly complex image data.
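The SOM-QE metric named in this abstract is, in essence, the mean distance between each input vector and its best-matching unit in a trained Self-Organizing Map. The following is a minimal illustrative sketch of that idea in plain NumPy, not the authors' implementation: the grid size, decay schedules, and function names are assumptions made for the example.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=5, lr0=0.5, sigma0=2.0, seed=0):
    """Train a minimal Self-Organizing Map; return the (h, w, dim) weight grid."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))
    # Grid coordinates of each node, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit (BMU): node whose weight vector is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Linearly decay learning rate and neighbourhood radius over time.
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 1e-3
            # Gaussian neighbourhood around the BMU pulls nearby nodes toward x.
            grid_dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            nb = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
            weights += lr * nb[..., None] * (x - weights)
            step += 1
    return weights

def quantization_error(weights, data):
    """SOM-QE: mean distance from each input to its best-matching unit."""
    # Distances of every input (axis 0) to every node (axes 1, 2).
    d = np.linalg.norm(weights[None, :, :, :] - data[:, None, None, :], axis=-1)
    return d.reshape(len(data), -1).min(axis=1).mean()

# Tiny demo: a tight cluster of 3-dimensional "feature vectors".
rng = np.random.default_rng(1)
data = 0.5 + 0.01 * rng.standard_normal((200, 3))
som = train_som(data)
qe = quantization_error(som, data)
```

In the study's setting the inputs would be pixel or feature vectors from the stimulus images, and variations in the resulting QE across image pairs are what track the human uncertainty responses.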
Genevieve Lloyd’s Spinoza is quite a different thinker from the arch-rationalist caricature of some undergraduate philosophy courses devoted to “The Continental Rationalists”. Lloyd’s Spinoza does not see reason as a complete source of knowledge, nor is deductive rational thought productive of the highest grade of knowledge. Instead, that honour goes to a third kind of knowledge—intuitive knowledge (scientia intuitiva)—which provides an immediate, non-discursive knowledge of its singular object. To the embarrassment of some hard-nosed philosophers, intellectual intuition has an affective component; it is a form of love, and ultimately, given that human beings are finite modes of God/Nature (Deus sive Natura), it is a form of the intellectual love of God (amor Dei intellectualis). Some philosophers do not know what to make of this mysterious aspect of Spinoza’s philosophy, which is nonetheless firmly anchored in a reading of Part V of the Ethics. Even so, this note will insist, with Lloyd’s “Reconsidering Spinoza’s Rationalism”, that this doctrine is an integral part of Spinoza’s philosophy. Moreover, it will be shown that Spinoza is well aware of the limitations of reason (ratio) in gaining scientific knowledge of the world and requires intuition precisely because of the inability of reason to represent individuals in their full particularity. Imagination, too, has a role to play in shaping scientific knowledge, although reason performs a vital critical role in disciplining and liberating the human mind from inadequate imaginary ideas. The result is an interpretation of Spinoza’s epistemology as both rationalist and intuitionist. DOI: 10.1080/24740500.2021.1962652.