Abstract. Let REL(O*E) be the relation algebra of binary relations defined on the Boolean algebra O*E of regular open regions of the Euclidean plane E. The aim of this paper is to prove that the canonical contact relation C of O*E generates a subalgebra REL(O*E, C) of REL(O*E) that has infinitely many elements. More precisely, REL(O*E, C) contains an infinite family {SPPn, n ≥ 1} of relations generated by the relation SPP (Separable Proper Part). This relation can be used to define a point-free concept of connectedness that, for the regular open regions of E, coincides with the standard topological notion of connectedness, i.e., a region of the plane E is connected in the sense of topology if and only if it has no separable proper part. Moreover, it is shown that the contact relation algebra REL(O*E, C) and the relation algebra REL(O*E, NTPP) generated by the non-tangential proper parthood relation NTPP coincide. This entails that the allegedly purely topological notion of connectedness can be defined in mereological terms.
The work is an attempt to transfer a structure from the Euclidean plane (purely geometrical), under the physical observation limit (resolving power), to a physical space (observable space). The transformation from the mathematical space to physical space passes through the observation condition. A mathematical-modelling approach is adopted. The project is based on two steps: (1) looking for a simple mathematical model that satisfies the definition of the Euclidean plane; (2) examining that model against three observation-resolution conditions (resolved, unresolved, and partially resolved). The simplest mechanical model satisfying the definition of the Euclidean plane is a planetary gear. The interesting examination of the mechanical model is the one under partial resolution. That examination yields an equation analogous to Euler’s formula. The derived complex formula contains the resolved (observable) quantities of the mechanical system and satisfies the linear wave equation. The interpretation of this complex formula is that it is a function related to the position vector of a point on the small wheel of the partially resolved planetary gear system, expressed in terms of the observable quantities only. The work shows the possibility of transformation from real to complex space. The work is purely classical, but the result of the partial resolution is a function similar to the quantum-mechanical wave function.
ABSTRACT of The Flying Termite by L.L. Katona. In this book I would like to show that the term “intelligence“ has a universal, non-anthropomorphic meaning. We can perceive intelligence in dogs, dolphins, or gorillas without understanding it, but intelligence can also be seen in many other things, from insects and the Solar System to elementary particles or the rules of a triangle. That does not mean intelligence comes from Intelligent Design, let alone a Designer; it seems to be the “state of things” in the Universe. One of the best-known laws of the triangle is that “the measures of the interior angles of a triangle in Euclidean space always add up to 180 degrees.” The question is whether this law exists without any existing triangle. If there is an encrypted program in DNA that determines which living being will be a chameleon and which a cat, where does this program lie? In the DNA itself? But DNA is also programmed. Where is the program of a computer? You may say in the hard drive, or in the head of the programmer. But the hard drive or the head of the programmer is just the vehicle of the program; when you dissect a computer or a programmer’s head you will not find the program, just diodes or cells connected by packs of orders. And you will not see the orders in them. In my book I want to show the workings of this elusive intelligence, with some interesting contradictions between causality, determinism, teleology, and dependent formation, through biology, mathematics, and logic. If I need to summarize my idea, I can use the following parable. A priest and an atheist are playing with a beach ball in a pool, in a two-dimensional plane. They are rod-people with a circle. What they cannot perceive is that this plane is part of a three-dimensional world, where the wind is blowing, so they cannot understand why the ball keeps disappearing, and why it comes back at another point at random.
Both of them have a hint that the ball is a sphere and the space is three-dimensional, but instead of trying to work as a team to understand this curious phenomenon of the Beach Ball, they use the old method: insulting each other.
In his doctoral dissertation On the Principle of Sufficient Reason, Arthur Schopenhauer outlines a critique of Euclidean geometry on the basis of the changing nature of mathematics, and hence of demonstration, as a result of Kantian idealism. According to Schopenhauer, Euclid treats geometry synthetically, proceeding from the simple to the complex, from the known to the unknown, “synthesizing” later proofs on the basis of earlier ones. Such a method, although proving the case logically, nevertheless fails to attain the raison d’être of the entity. In order to obtain this, a separate method is required, which Schopenhauer refers to as “analysis,” thus echoing a method already in practice among the early Greek geometers, albeit with some significant differences. In this essay, I discuss Schopenhauer’s criticism of synthesis in Euclid’s Elements, and the nature and relevance of his own method of analysis.
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697), to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa, which continued to grow in the period preceding the second edition \ref[see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid, whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions": indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED March 14, 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th-century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians, including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction, one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion, and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P, Q/R becomes P, ~R/~Q or ~R, Q/~P. Saccheri’s work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word ‘deduction’ is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
John Corcoran and George Boger. Aristotelian logic and Euclidean geometry. Bulletin of Symbolic Logic 20 (2014) 131. By an Aristotelian logic we mean any system of direct and indirect deductions, chains of reasoning linking conclusions to premises—complete syllogisms, to use Aristotle’s phrase—1) intended to show that their conclusions follow logically from their respective premises and 2) resembling those in Aristotle’s Prior Analytics. Such systems presuppose the existence of cases where it is not obvious that the conclusion follows from the premises: there must be something deductions can show. Corcoran calls a proposition that follows from given premises a hidden consequence of those premises if it is not obvious that the proposition follows from those premises. By a Euclidean geometry we mean an extended discourse beginning with basic premises—axioms, postulates, definitions—1) treating a universe of geometrical figures and 2) resembling Euclid’s Elements. There were Euclidean geometries before Euclid (fl. 300 BCE), even before Aristotle (384–322 BCE). Bochenski, Lukasiewicz, Patzig, and others never knew this, or if they did, they found it inconvenient to mention. Euclid shows no awareness of Aristotle. It is obvious today—as it should have been obvious in Euclid’s time, if anyone knew both—that Aristotle’s logic was insufficient for Euclid’s geometry: few if any geometrical theorems can be deduced from Euclid’s premises by means of Aristotle’s deductions. Aristotle’s writings don’t say whether his logic is sufficient for Euclidean geometry, and there is not even one fully presented example. However, Aristotle’s writings do make clear that he endorsed the goal of a sufficient system. Nevertheless, incredible as this is today, many logicians after Aristotle claimed that Aristotelian logics are sufficient for Euclidean geometries. This paper reviews and analyses such claims by Mill, Boole, De Morgan, Russell, Poincaré, and others.
It also examines early contrary statements by Hintikka, Mueller, Smith, and others. Special attention is given to the argumentations pro or con, and especially to their logical, epistemic, and ontological presuppositions. What methodology is necessary or sufficient to show that a given logic is adequate or inadequate to serve as the underlying logic of a given science?
The discrete–structural makeup of the world is described. In comparison with the idea of Heraclitus of an indissoluble world, preference is given to the discrete world of Democritus. It is noted that whereas the discrete atoms of Democritus were simple and indivisible, the atoms of the modern world discussed in the article possess, rather, an internal structure. The article examines how the mutual connection of mathematics and philosophy influences cognition, creating a discrete–structural worldview. The author notes that the appearance of writing, symbolic language, and the depiction of the picture of the world through mathematics led us into the sphere of discrete mathematics.
This paper examines Helmholtz's attempt to use empirical psychology to refute certain of Kant's epistemological positions. In particular, Helmholtz believed that his work in the psychology of visual perception showed Kant's doctrine of the a priori character of spatial intuition to be in error. Some of Helmholtz's arguments are effective, but this effectiveness derives from his arguments for the possibility of obtaining evidence that the structure of physical space is non-Euclidean, and these arguments do not depend on his theory of vision. Helmholtz's general attempt to provide an empirical account of the "inferences" of perception is regarded as a failure.
Like all philosophical concepts, the concept of person constitutes itself in reference to a particular problem. The analysis drafted by Marcel Mauss regarding the genesis of this category makes it possible to identify the problematic nucleus to which the concept refers, disclosing that it responds to the necessity of solving the ancient problem of the link between mind and body in a specific perspective. In this respect, the Bergsonian theory of images represents a solid attempt to move beyond not only the habitual solution of the problem represented by the concept of person, but also the problem itself to which it refers. The Deleuzian reading of Henri Bergson’s theory leads Deleuze to define the impersonal and pre-individual field sketched out by Bergson as a plane of immanence, and this last as “a life”. The ultimate aim of this paper is to analyse the nature of such a plane and to suggest an interpretation of Gilles Deleuze’s last piece of writing, "L’immanence: une vie...", that unveils its profound Bergsonism.
REVIEW OF: Automated Development of Fundamental Mathematical Theories by Art Quaife. (1992: Kluwer Academic Publishers) 271pp. Using the theorem prover OTTER, Art Quaife has proved four hundred theorems of von Neumann-Bernays-Gödel set theory; twelve hundred theorems and definitions of elementary number theory; dozens of Euclidean geometry theorems; and Gödel's incompleteness theorems. It is an impressive achievement. To gauge its significance and to see what prospects it offers, this review looks closely at the book and the proofs it presents.
In standard probability theory, probability zero is not the same as impossibility. But many have suggested that only impossible events should have probability zero. This can be arranged if we allow infinitesimal probabilities, but infinitesimals do not solve all of the problems. We will see that regular probabilities are not invariant over rigid transformations, even for simple, bounded, countable, constructive, and disjoint sets. Hence, regular chances cannot be determined by space-time invariant physical laws, and regular credences cannot satisfy seemingly reasonable symmetry principles. Moreover, the examples here are immune to the objections against Williamson’s infinite coin flips.
This article analyzes the value of geometric models for understanding matter, with the examples of the Platonic model for the primary four elements (fire, air, water, and earth) and the models of carbon atomic structures in the new science of crystallography. How the geometry of these models is built in order to discover the properties of matter is explained: movement and stability for the primary elements, and hardness, softness, and elasticity for the carbon atoms. These geometric models appear to have a double quality: firstly, they exhibit visually the scientific properties of matter, and secondly, they make it possible to visualize its whole nature. Geometrical models appear to be the expression of the mind in the understanding of physical matter.
David Hyder.The Determinate World: Kant and Helmholtz on the Physical Meaning of Geometry. viii + 229 pp., bibl., index. Berlin/New York: Walter de Gruyter, 2009.
The human attempts to access, measure, and organize physical phenomena have led to a manifold construction of mathematical and physical spaces. We will survey the evolution of geometries from Euclid to the Algebraic Geometry of the 20th century. The role of Persian/Arabic Algebra in this transition and its Western symbolic development is emphasized. In this relation, we will also discuss changes in the ontological attitudes toward mathematics and its applications. Historically, the encounter of geometric and algebraic perspectives enriched mathematical practices and their foundations. Yet the collapse of Euclidean certitudes, after over 2300 years, and the crisis in the mathematical analysis of the 19th century led to the exclusion of “geometric judgments” from the foundations of Mathematics. After the success and the limits of the logico-formal analysis, it is necessary to broaden our foundational tools and re-examine the interactions with natural sciences. In particular, the way the geometric and algebraic approaches organize knowledge is analyzed as a cross-disciplinary and cross-cultural issue and will be examined in Mathematical Physics and Biology. We finally discuss how the current notions of mathematical (phase) “space” should be revisited for the purposes of life sciences.
Physical boundaries and the earliest topologists. Topology has a relatively short history; but its 19th century roots are embedded in philosophical problems about the nature of extended substances and their boundaries which go back to Zeno and Aristotle. Although it seems that there have always been philosophers interested in these matters, questions about the boundaries of three-dimensional objects were closest to center stage during the later medieval and modern periods. Are the boundaries of an object actually existing, less-than-three-dimensional parts of the object—that is, are solids bounded by two-dimensional surfaces, surfaces by one-dimensional “edges” or “physical lines”, edges by dimensionless “simples”? If not, how does a perfectly spherical object manage to touch a perfectly flat object—what part of the sphere is in immediate contact with the plane, if the sphere has no unextended parts? But if such parts be admitted, are we not then saddled with “actual infinities” of simples, lines, and surfaces spread throughout each continuous object—the boundaries of all the object’s internal parts? Does it help to say that these internal boundaries exist only “potentially”?
Recent work has defended “Euclidean” theories of set size, in which Cantor’s Principle (two sets have equally many elements if and only if there is a one-to-one correspondence between them) is abandoned in favor of the Part-Whole Principle (if A is a proper subset of B then A is smaller than B). It has also been suggested that Gödel’s argument for the unique correctness of Cantor’s Principle is inadequate. Here we see from simple examples, not that Euclidean theories of set size are wrong, but that they must be either very weak and narrow or largely arbitrary and misleading.
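The tension between the two principles can be seen in the classic example of the even numbers (this illustration is mine, not drawn from the paper): the map n ↦ 2n pairs every natural with a distinct even natural, so by Cantor’s Principle the two sets are equinumerous, yet the evens are a proper subset of the naturals, so by the Part-Whole Principle they must be strictly smaller. A minimal sketch on a finite prefix:

```python
# Illustrative sketch (not from the paper): the bijection n -> 2n
# witnesses a one-to-one correspondence between naturals and evens,
# even though the evens form a proper subset of the naturals.
def double(n):
    return 2 * n

naturals = list(range(10))
evens = [double(n) for n in naturals]

# Each natural is paired with exactly one even natural (Cantor's Principle)...
assert evens == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
# ...yet the evens below 20 are a proper subset of the naturals below 20
# (Part-Whole Principle would call them strictly smaller).
assert set(evens) < set(range(20))
```

Of course, no finite prefix exhibits the full paradox; it is the infinite case where the two principles come apart.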
Hobbes emphasized that the state of nature is a state of war because it is characterized by fundamental and generalized distrust. Exiting the state of nature and the conflicts it inevitably fosters is therefore a matter of establishing trust. Extant discussions of trust in the philosophical literature, however, focus either on isolated dyads of trusting individuals or on trust in large, faceless institutions. In this paper, I begin to fill the gap between these extremes by analyzing what I call the topology of communities of trust. Such communities are best understood in terms of interlocking dyadic relationships that approximate the ideal of being symmetric, Euclidean, reflexive, and transitive. Few communities of trust live up to this demanding ideal, and those that do tend to be small (between three and fifteen individuals). Nevertheless, such communities of trust serve as the conditions for the possibility of various important prudential, epistemic, cultural, and mental health goods. However, communities of trust also make possible various problematic phenomena. They can become insular and walled off from the surrounding community, leading to distrust of out-groups. And they can lead their members to abandon public goods for tribal or parochial goods. These drawbacks of communities of trust arise from some of the same mechanisms that give them positive prudential, epistemic, cultural, and mental health value – and so can at most be mitigated, not eliminated.
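For readers unfamiliar with the relational vocabulary: a relation R is Euclidean when, whenever a relates to both b and c, b also relates to c. These four properties can be checked mechanically on any finite trust relation; the following sketch (names and data hypothetical, not from the paper) shows how:

```python
# Hypothetical helpers: check the four properties the abstract names,
# for a finite relation R (a set of ordered pairs) over a domain.
def is_reflexive(domain, R):
    return all((a, a) in R for a in domain)

def is_symmetric(R):
    return all((b, a) in R for (a, b) in R)

def is_transitive(R):
    return all((a, c) in R for (a, b) in R for (b2, c) in R if b2 == b)

def is_euclidean(R):
    # If a trusts both b and c, then b trusts c.
    return all((b, c) in R for (a, b) in R for (a2, c) in R if a2 == a)

# A tiny three-member community in which everyone trusts everyone
# satisfies the full ideal:
people = {"ann", "bo", "cy"}
R = {(x, y) for x in people for y in people}
assert is_reflexive(people, R) and is_symmetric(R)
assert is_transitive(R) and is_euclidean(R)

# A single one-way trust link is not Euclidean (ann trusts bo twice over,
# but bo does not trust bo):
assert not is_euclidean({("ann", "bo")})
```

The ideal community in the paper's sense is thus, formally, one whose trust relation is an equivalence-like relation on its members.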
The plane was going to crash, but it didn't. Johnny was going to bleed to death, but he didn't. Geach sees here a changing future. In this paper, I develop Geach's primary argument for the (almost universally rejected) thesis that the future is mutable (an argument from the nature of prevention), respond to the most serious objections such a view faces, and consider how Geach's view bears on traditional debates concerning divine foreknowledge and human freedom. As I hope to show, Geach's view constitutes a radically new view on the logic of future contingents, and deserves the status of a theoretical contender in these debates.
The attempt to define meaning arouses numerous questions, such as whether life can be meaningful without actions devoted to a central purpose, or whether the latter guarantee a meaningful life. Communities of inquiry are relevant in this context because they create relationships within and between people and the environment. The more they address relations—social, cognitive, emotional, etc.—that tie in with the children’s world, even if not in a concrete fashion, the more they enable young people to search for and find meaning. Examining the way in which philosophical communities of inquiry serve as a dialogical space that enables a search for meaning on the personal and collective plane, this article seeks to expand the discussion of how and whether finding meaning on a private or communal level can promote recognition of the existential uniqueness of each individual and the development of a sense of responsibility for him or her. Grounded in the writings of Matthew Lipman, it links his ideas about finding meaning in philosophical communities of inquiry with those of Jean-Paul Sartre, Viktor Frankl, and Emmanuel Levinas, in particular with regard to the association between meaning and responsibility.
In this paper I deal with Nietzsche's theory of knowledge in the context of 19th century epistemology. In particular, I argue that, even though Nietzsche shows the ontological lack of content of truths (both on the theoretic and on the moral plane), he nevertheless leaves the space for a practical use of them, in a way that can be compared with William James' pragmatism. I thus deal with Nietzsche's and James' concept of "truth", and show their relationship with some outcomes of Ernst Mach's epistemology.
We can locate the problematic of time within three philosophical questions, which respectively designate three central areas of philosophical reflection and contemplation. These are: 1) the ontological question, i.e. 'what is being?'; 2) the epistemological question, i.e. 'what can we know with certainty?'; 3) the existential question, i.e. 'what is the meaning of existence?'. These three questions, which are philosophical, but also scientific and political, as they underline the political and moral question of truth and justice, arise from the phenomenon of time, the irreversible constant flow of phenomena that undermines every claim to absolute knowledge. The purpose of this essay is to illuminate the importance of time for philosophical thought and, more generally, for human social and psychical life, in the context of the ontology of Cornelius Castoriadis. Castoriadis, who asserted that "being is time – and not in the horizon of time", correlated history to society and being to temporality within the social-historical stratum, the ontological plane created by human existence, where "existence is signification". Time is interpreted as the creation and destruction of forms in a magmatic reality, layered with a non-regular stratification, where the social-historical manifests as the creation of collective human activity, in the manner of social imaginary significations. This notion of temporality is accompanied by a profound criticism of traditional rationalistic philosophy, to which Castoriadis assigns the name 'ensemblistic/identitary', a criticism that highlights the necessity of a new, magmatic ontology based on the primacy of time.
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently "potent" to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident "yes" has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
The paper aims to investigate some aspects of Ernst Mach’s epistemology in the light of the problem of human orientation in relation to the world (Weltorientierung), which has been a main topic of Western philosophy since Kant. As will be argued, Mach was concerned with that problem, insofar as he developed an original pragmatist epistemology. In order to support my argument, I first investigate whether Mach defended a nominalist or a realist account of knowledge and compare his view to those elaborated by other pragmatist thinkers, such as W. James, H. Vaihinger, and H. Poincaré. Secondly, the question of what it means, for Mach, to orient ourselves in science is addressed. Finally, it will be argued that, although Mach tried to keep his epistemology restricted to a merely operational and economical account of science, that question involves the wider plane of practical philosophy.
Poorly saturated colors are closer to a pure grey than strongly saturated ones and, therefore, appear less “colorful”. Color saturation is effectively manipulated in the visual arts for balancing conflicting sensations and moods and for inducing the perception of relative distance in the pictorial plane. While perceptual science has shown quite clearly that the luminance contrast of any hue acts as a self-sufficient cue to relative depth in visual images, the role of color saturation in such figure-ground organization has remained unclear. We presented configurations of colored inducers on grey ‘test’ backgrounds to human observers. Luminance and saturation of the inducers were uniform on each trial, but varied across trials. We ran two separate experimental tasks. In the relative background brightness task, perceptual judgments indicated whether the apparent brightness of the grey test background contrasted with, assimilated to, or appeared equal (no effect) to that of a comparison background with the same luminance contrast. Contrast polarity and its interaction with color saturation affected response proportions for contrast, assimilation, and no effect. In the figure-ground task, perceptual judgments indicated whether the inducers appeared to lie in front of, behind, or at the same depth as the background. Strongly saturated inducers produced significantly larger proportions of foreground effects, indicating that these inducers stand out as figure against the background. Weakly saturated inducers produced significantly larger proportions of background effects, indicating that these inducers are perceived as lying behind the backgrounds. We infer that color saturation modulates figure-ground organization, both directly, by determining relative inducer depth, and indirectly, in interaction with contrast polarity, by affecting apparent background brightness.
The results point towards a hitherto undocumented functional role of color saturation in the genesis of form, and in particular of figure-ground percepts, in the absence of chromatostereopsis.
The spin-statistics connection is derived in a simple manner under the postulates that the original and the exchange wave functions are simply added, and that the azimuthal phase angle, which defines the orientation of the spin part of each single-particle spin-component eigenfunction in the plane normal to the spin-quantization axis, is exchanged along with the other parameters. The spin factor (−1)^(2s) belongs to the exchange wave function when this function is constructed so as to get the spinor ambiguity under control. This is achieved by effecting the exchange of the azimuthal angle by means of rotations and admitting only rotations in one sense. The procedure works in Galilean as well as in Lorentz-invariant quantum mechanics. Relativistic quantum field theory is not required.
For Filipinos in Japan, their long-historicized existence in Japan has forced them to continually (re)adjust and (re)articulate their own sociocultural norms, particularly in secular areas like workplaces, societal institutions, marketplaces, and even their own domestic familial spaces. This article argues, however, that this narrative of struggle extends even into the religious and ecclesial spaces of Catholic parishes and churches. In this light, this article attempts to articulate the current status and predicament of Filipino Catholics in Japan, particularly in the Archdiocese of Tokyo, where the author spent ten months of fieldwork in selected parishes, churches, and Filipino Catholic communities. It seeks to offer a fresh and updated analysis of their ethnoreligious stories given the emerging situational predicament of the increasing nonreligiosity of society, the aging population of Filipino Catholics, and their disinterested bicultural children. In response to the current demographic crisis and future uncertainty, the Archdiocese has responded by initiating a call for “full integration” that embraces the image of a multicultural church in Japan to acknowledge the presence and contribution of foreign Catholics in Japan. However, this has been received with suspicion and anxiety, particularly by Filipino Catholics. While attempting to expose its ambiguity, this article also highlights the interesting situation of Filipino Catholics as religious in nonreligious Japan—despite which they have reconfigured the way they express and practice their faith. Their historicized attempt to survive and negotiate as a religious “other” within a constrained and confined socio-spatial plane reveals interesting dynamics and opportunities for renewed dialogue.
While representation learning techniques have shown great promise in application to a number of different NLP tasks, they have had little impact on the problem of ontology matching. Unlike past work that has focused on feature engineering, we present a novel representation learning approach that is tailored to the ontology matching task. Our approach is based on embedding ontological terms in a high-dimensional Euclidean space. This embedding is derived on the basis of a novel phrase retrofitting strategy through which (...) semantic similarity information becomes inscribed onto fields of pre-trained word vectors. The resulting framework also incorporates a novel outlier detection mechanism based on a denoising autoencoder that is shown to improve performance. An ontology matching system derived using the proposed framework achieved an F-score of 94% on an alignment scenario involving the Adult Mouse Anatomical Dictionary and the Foundational Model of Anatomy ontology (FMA) as targets. This compares favorably with the best performing systems on the Ontology Alignment Evaluation Initiative anatomy challenge. We performed additional experiments on aligning FMA to NCI Thesaurus and to SNOMED CT based on a reference alignment extracted from the UMLS Metathesaurus. Our system obtained overall F-scores of 93.2% and 89.2% for these experiments, thus achieving state-of-the-art results. (shrink)
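The retrofitting idea described above can be made concrete. Below is a minimal, illustrative sketch in the spirit of Faruqui-style retrofitting, not the paper's exact phrase-retrofitting procedure: each vector is iteratively pulled toward its semantically linked neighbours while staying anchored to its pre-trained value. The toy terms, weights, and data are my assumptions.

```python
import numpy as np

def retrofit(vectors, synonym_pairs, alpha=1.0, beta=1.0, iters=10):
    """Illustrative retrofitting: nudge each vector toward its
    semantically linked neighbours while staying close to the
    pre-trained embedding (a Faruqui-style update, shown only as
    a sketch of the general technique)."""
    q = {w: v.copy() for w, v in vectors.items()}
    neighbours = {w: [] for w in vectors}
    for a, b in synonym_pairs:
        neighbours[a].append(b)
        neighbours[b].append(a)
    for _ in range(iters):
        for w, ns in neighbours.items():
            if not ns:
                continue
            # weighted average of the original vector and the
            # current vectors of the linked neighbours
            q[w] = (alpha * vectors[w] + beta * sum(q[n] for n in ns)) \
                   / (alpha + beta * len(ns))
    return q

# toy vectors: two terms that should align across ontologies
vecs = {"kidney": np.array([1.0, 0.0]),
        "renal":  np.array([0.0, 1.0])}
out = retrofit(vecs, [("kidney", "renal")])

cos = lambda a, b: float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
# the linked pair is more similar after retrofitting than before
print(cos(out["kidney"], out["renal"]) > cos(vecs["kidney"], vecs["renal"]))
```

Because the update is a weighted average, the retrofitted vectors of linked terms drift together: the cosine similarity of the toy pair rises from 0 to roughly 0.8 at the fixed point.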
It might sound rather convincing to assume that we owe the pleasure of reading the novel form to our elemental repository of physical perception, to our feelings. This would be true only if mere feelings could add up to something more than just emotions, to some deep understanding of the human. After all, a moment of epiphany, where we begin to realize things that dramatically disturb our normal state of mind, is not just emotional, nor indeed a simple moment. Despite (...) its root in the corporeal, a mo(ve)ment of affective realization reaches beyond the realm of the human and opens up the plane of virtual potentials. In this work, we intend to map out the points and relations of affective singularity that pervade the narrative of Toni Morrison’s Sula (1973). Also, we will discuss how these mo(ve)ments of sensation give form to Sula’s and Nel’s experiences and contribute to an affective transformation in morality and friendship. (shrink)
In this paper I will offer a novel understanding of a priori knowledge. My claim is that the sharp distinction that is usually made between a priori and a posteriori knowledge is groundless. It will be argued that a plausible understanding of a priori and a posteriori knowledge has to acknowledge that they are in a constant bootstrapping relationship. It is also crucial that we distinguish between a priori propositions that hold in the actual world and merely possible, non-actual a (...) priori propositions, as we will see when considering cases like Euclidean geometry. Furthermore, contrary to what Kripke seems to suggest, a priori knowledge is intimately connected with metaphysical modality, indeed, grounded in it. The task of a priori reasoning, according to this account, is to delimit the space of metaphysically possible worlds in order for us to be able to determine what is actual. (shrink)
Throughout history, almost all mathematicians, physicists and philosophers have been of the opinion that space and time are infinitely divisible. That is, it is usually believed that space and time do not consist of atoms, but that any piece of space and time of non-zero size, however small, can itself be divided into still smaller parts. This assumption is included in geometry, as in Euclid, and also in the Euclidean and non-Euclidean geometries used in modern physics. Of (...) the few who have denied that space and time are infinitely divisible, the most notable are the ancient atomists, and Berkeley and Hume. All of these assert not only that space and time might be atomic, but that they must be. Infinite divisibility is, they say, impossible on purely conceptual grounds. (shrink)
In this reading of the Euthyphro, Socrates and Euthyphro are seen less in a primordial conflict between reason and devotion, than as sincere Hellenic polytheists engaged in an inquiry based upon a common intuition that, in addition to the irreducible agency of the Gods, there is also some irreducible intelligible content to holiness. This reading is supported by the fact that Euthyphro does not claim the authority of revelation for his decision to prosecute his father, but rather submits it to (...) elenchus, and that Euthyphro does not embrace the ‘solution’ of theological voluntarism when Socrates explicitly offers it. Since the goal of this inquiry is neither to eliminate the noetic content of the holy, nor to eliminate the Gods’ agency, the purpose of the elenchus becomes the effort to articulate the results of this productive tension between the Gods and the intelligible on the several planes of Being implied by each conception of the holy which is successively taken up and dialectically overturned to yield the conception appropriate to the next higher plane, a style of interpretation characteristic of the ancient Neoplatonists. (shrink)
The naturally occurring phenomenon that forms the starting point of the present investigation is empirical consciousness. If one considers the contents of human empirical consciousness, it is striking that they can fundamentally be divided into two classes: on the one hand, there are spontaneous representations subject to the discretion of the empirical subject, such as fantasies, plans, decisions (to act), or speculations. On the other hand, the subject has spatially and temporally structured representations that are given to it in such a way that it can direct itself intentionally toward objects and processes of inner and outer perception (...) (receptive representations). The empirical character of such representations is decisively determined by structures that are not subject to the discretion of the empirical subject: space, time, and the objecthood of objects, including their shapes, colors, and other sensory properties (smells, textures), are given to the empirical subject. It always already finds itself in a spatially and temporally structured world consisting of objects. In this investigation I aim to shed light on this relation of ourselves to the represented world, a relation that fundamentally determines human life. (shrink)
Berkeley in his Introduction to the Principles of Human knowledge uses geometrical examples to illustrate a way of generating “universal ideas,” which allegedly account for the existence of general terms. In doing proofs we might, for example, selectively attend to the triangular shape of a diagram. Presumably what we prove using just that property applies to all triangles. I contend, rather, that given Berkeley’s view of extension, no Euclidean triangles exist to attend to. Rather, proof, as Berkeley would normally assume, (...) requires idealizing diagrams: treating them as if they obeyed Euclidean constraints. This convention solves the problem of representative generalization. Berkeley and Proof in Geometry, Volume 51, Issue 3, RICHARD J. BROOK. DOI: https://doi.org/10.1017/S0012217312000686
The concept of time is examined using the second law of thermodynamics that was recently formulated as an equation of motion. According to the statistical notion of increasing entropy, flows of energy diminish differences between energy densities that form space. The flow of energy is identified with the flow of time. The non-Euclidean energy landscape, i.e. the curved space–time, is in evolution when energy is flowing down along gradients and levelling the density differences. The flows along the steepest descents, (...) i.e. geodesics, are obtained from the principle of least action for mechanics, electrodynamics and quantum mechanics. The arrow of time, associated with the expansion of the Universe, identifies with the grand dispersal of energy when high-energy densities transform by various mechanisms to lower densities in energy and eventually to ever-diluting electromagnetic radiation. Likewise, time in a quantum system takes an increment forwards in the detection-associated dissipative transformation when the stationary-state system begins to evolve, pictured as the wave-function collapse. The energy dispersal is understood to underlie causality, so that an energy gradient is a cause and the resulting energy flow is an effect. The account of causality in terms of the concepts of physics does not imply determinism; on the contrary, the evolution of space–time as a causal chain of events is non-deterministic. (shrink)
This paper argues that Frege's notoriously long commitment to Kant's thesis that Euclidean geometry is synthetic _a priori_ is best explained by realizing that Frege uses ‘intuition’ in two senses. Frege sometimes adopts the usage presented in Hermann Helmholtz's sign theory of perception. However, when using ‘intuition’ to denote the source of geometric knowledge, he is appealing to Hermann Cohen's use of Kantian terminology. We will see that Cohen reinterpreted Kantian notions, stripping them of any psychological connotation. Cohen's defense (...) of his modified Kantian thesis on the unique status of the Euclidean axioms presents Frege's own views in a much more favorable light. (shrink)
In the prospectus for his later work, presented in 1952, Merleau-Ponty announced that his move beyond the phenomenological to the ontological level of analysis is motivated by issues of sociality, notably communication with others. I propose to interrogate this priority attributed by the author to this interpersonal bond in his reflections on corporeality in general, marking a departure from The Structure of Behavior and The Phenomenology of Perception, which privileged the starting point of consciousness and the body proper. My interest (...) lies particularly in exposing the psychological sources of Merleau-Ponty's thinking about the primacy of sociality. Referring to his lectures on Child Psychology and Pedagogy, which he delivered as Professor at the Sorbonne in Paris in 1949-52, I will develop the contention that the developmental psychology of child sociality significantly informed his understanding of relations between self and other laid out in the later texts, and thereby also informed his conception of the flesh. Specifically, the psychological hypotheses about the anonymous and fusional form initially taken by human sociality appear to play a determining role in his conception of interpersonal life formulated on the ontological plane. I will then point to the internal tensions involved in the theory of sociality based on the thesis of anonymity and disclose an alternative theoretical account, which has the merit of preserving the advantages of the anonymity thesis while avoiding its drawbacks; it also facilitates continued dialogue between Merleau-Ponty's philosophy and recent developmental psychology. (shrink)
Seeing with Ears, Hearing with Eyes. How Technology Molds Synesthesia Within Us -/- The subject of consideration within this lecture is the contribution of existing scientific discoveries on the visual and musical connection within the perceptual plane. Points of reference are the studies of Amir Amedi, Jacob Jolij and Maaieke Meurs, Harry McGurk, as well as the works of Iwona Sowińska, Roger Scruton, Oliver Sacks, and a cultural analysis of Joshua Bell’s performance. I will also consider how the senses (...) affect each other, pursuing the diversified reception of vision, which consists of the sense of hearing [sic!], on which I would like to focus attention. -/- Paper structure: Introduction to key concepts in the fields of research and development, The visual outlook on hearing, The aural perspective on vision, Relationships to related sciences, Summary of multimedia examples, An attempt to extend the “techno-view” to the auditory senses within synesthesia. (shrink)
It is a received view that Kant’s formal logic (or what he calls “pure general logic”) is thoroughly intensional. On this view, even the notion of logical extension must be understood solely in terms of the concepts that are subordinate to a given concept. I grant that the subordination relation among concepts is an important theme in Kant’s logical doctrine of concepts. But I argue that it is both possible and important to ascribe to Kant an objectual notion of logical (...) extension according to which the extension of a concept is the multitude of objects falling under it. I begin by defending this ascription in response to three reasons that are commonly invoked against it. First, I explain that this ascription is compatible with Kant’s philosophical reflections on the nature and boundary of a formal logic. Second, I show that the objectual notion of extension I ascribe to Kant can be traced back to many of the early modern works of logic with which he was more or less familiar. Third, I argue that such a notion of extension makes perfect sense of a pivotal principle in Kant’s logic, namely the principle that the quantity of a concept’s extension is inversely proportional to that of its intension. In the process, I tease out two important features of the Kantian objectual notion of logical extension in terms of which it markedly differs from the modern one. First, on the modern notion the extension of a concept is the sum of the objects actually falling under it; on the Kantian notion, by contrast, the extension of a concept consists of the multitude of possible objects—not in the metaphysical sense of possibility, though—to which a concept applies in virtue of being a general representation. While the quantity of the former extension is finite, that of the latter is infinite—as is reflected in Kant’s use of a plane-geometrical figure (e.g., circle, square), which is continuum as opposed to discretum, to represent the extension in question. 
Second, on the modern notion of extension, a concept that signifies exactly one object has a one-member extension; on the Kantian notion, however, such a concept has no extension at all—for a concept is taken to have extension only if it signifies a multitude of things. This feature of logical extension is manifested in Kant’s claim that a singular concept (or a concept in its singular use) can, for lack of extension, be figuratively represented only by a point—as opposed to an extended figure like circle, which is reserved for a general concept (or a concept in its general use). Precisely on account of these two features, the Kantian objectual extension proves vital to Kant’s theory of logical quantification (in universal, particular and singular judgments, respectively) and to his view regarding the formal truth of analytic judgments. (shrink)
The word ‘equality’ often requires disambiguation, which is provided by context or by an explicit modifier. For each sort of magnitude, there is at least one sense of ‘equals’ with its correlated senses of ‘is greater than’ and ‘is less than’. Given any two magnitudes of the same sort—two line segments, two plane figures, two solids, two time intervals, two temperature intervals, two amounts of money in a single currency, and the like—the one equals the other or the one (...) is greater than the other or the one is less than the other [sc. in appropriate correlated senses of ‘equals’, ‘is greater than’ and ‘is less than’]. In case there are two or more appropriate senses of ‘equals’, the one intended is often indicated by an adverb. For example, one plane figure may be said to be equal in area to another and, in certain cases, one plane figure may be said to be equal in length to another. Each sense of ‘equality’ is tied to a specific domain and is therefore non-logical. Notice that in every case ‘equality’ is definable in terms of ‘is greater than’ and also in terms of ‘is less than’, both of which are routinely considered domain-specific, non-logical. The word ‘identity’ in the logical sense does not require disambiguation. Moreover, it is not correlated with ‘is greater than’ and ‘is less than’. If it is not the case that a certain designated triangle is [sc. is identical to] an otherwise designated triangle, it is not necessary for the one to be greater than or less than the other. Moreover, if two magnitudes are equal then a unit of measure can be chosen and, no matter what unit is chosen, each magnitude is the same multiple of the unit that the other is. But identity does not require units. In this regard, congruence is like identity and unlike equality. In arithmetic, the logical concept of identity is coextensive with the arithmetic concept of equality.
The logical concept of identity admits of an analytically adequate definition in terms of logical concepts: given any number x and any number y, x is y iff x has every property that y has. The arithmetical concept of equality admits of an analytically adequate definition in terms of arithmetical concepts: given any number x and any number y, x equals y iff x is neither less than nor greater than y. As Aristotle told us and as Frege retold us, just because one relation is coextensive with another is no reason to conclude that they are one. (shrink)
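These two definitions invite a small executable check. The Leibniz-law definition of identity quantifies over all properties and so cannot be coded directly, but the order-theoretic definition of equality can, and its coextensiveness with identity (modelled here by Python's built-in `==`) can be verified on a finite range of integers. The function name is my own illustration.

```python
def equals(x, y):
    """Arithmetical equality defined order-theoretically:
    x equals y iff x is neither less than nor greater than y."""
    return not (x < y) and not (x > y)

# On the integers, the order-theoretic relation is coextensive
# with identity, as the text notes: coextensive, though defined
# from different (non-logical vs. logical) concepts.
for x in range(-3, 4):
    for y in range(-3, 4):
        assert equals(x, y) == (x == y)
print("equality and identity are coextensive on this range")
```

The check illustrates the closing point: coextensiveness of the two relations on the numbers is an observed fact, not a ground for concluding they are one concept.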
The content of Boscovich’s Theoria philosophiae naturalis was well-known to his contemporaries, but both scientists and philosophers chiefly discussed it during the 19th century. The observations that Boscovich presented in this text, and that he himself defined as “philosophicas metitationes”, soon showed themselves to be a good programme for the forthcoming atomic physics, and contributed to overcoming the mechanistic paradigm in science. In this paper I will return to some meaningful moments in the history of Boscovich’s reception in the (...) era of contemporary philosophy, by referring to what authors such as Popper, Cassirer, Nietzsche and Fechner wrote about him. These thinkers, indeed, particularly stressed the importance of the Theoria in the history of Western thought, and showed that it can easily be evaluated beyond the plane of a pure scientific investigation. (shrink)
[written in 2002/2003 while I was a graduate student at the University of Connecticut and ultimately submitted as part of my qualifying exam for the Masters of Philosophy] The question I am interested in revolves around Kant’s notion of the unity of experience. My central claim will be that, apart from the unity of experiencings and the unity of individual substances, there is a third unity: the unity of Experience. I will argue that this third unity can be conceived of (...) as a sort of ‘experiential space’ with the Aesthetic and Categories as dimensions. I call this ‘Euclidean Experience’ to emphasize the idea that individual experiencings have a ‘location’ within this framework much like individual objects have a location in space and time. The first sort of unity, that of experiences (or ‘experiencings’ as I will call them) is not enough. In order to have self-consciousness (ascribed atomic experiencings) there must be a consciousness in which the experiencings ‘take place’ just as in order for there to be objects there must be space in which they are located. With such a notion of experience in hand I argue that it can be used to bring together the solipsistic and non-solipsistic strands in Kant’s thinking. The resulting position I call ‘Polysolipsism’. (shrink)
British art historian Charles Harrison presumes the existence of a patriarchal world with power in the hands of men who dominate the representation of women and femininity. He applauds the ground-breaking work of feminist theorists who have questioned this imbalance of power since the 1970s. He stops short, however, of accepting their claims that all women have been represented by male artists as images of “utter passivity” (p. 4), routinely reduced by the male gaze to the status of exploited sexual (...) objects, or that women’s subjectivity is eroded by the visual treatment they receive at the hands of male artists such as Manet and Picasso. He wants to show that what is depicted in the picture plane by the (typically male) artist and enjoyed by the (typically male) spectator is more nuanced than just a simple privileged understanding between two men. He adds a third (and possibly fourth or more) party to the mix when he significantly redefines and expands our concept of the gaze: “A gaze may also be conceived of as a function of a painting’s represented content” (p. 9). In other words, a gaze may be “addressed outward by a represented figure,” and regardless of who and where, “the assumption conveyed by the term [‘gaze’] is that some differential and usually asymmetrical relation will be at stake in any exchange between one who directs the gaze and another at whom it is directed. In fact, it is just this difference—in age, in sex, in class, in interest, in power —that the operation of the gaze tends to mark” (p. 9). Referring to a woman depicted within the picture plane, he asks us to consider, “What does it feel like to look like this?” (p. 21) in order to entertain our many emotional responses and interpretations. When he adds, “What does it feel like to whom?” the sexual difference of the spectator also clearly comes into play. (shrink)
Based on de Broglie’s wave hypothesis and the covariant ether, the Three Wave Hypothesis (TWH) has been proposed and developed in the last century. In 2007, the author found that the TWH may be attributed to a kinematical classical system of two perpendicular rolling circles. In 2012, the author showed that the position vector of a point in a model of two rolling circles in a plane can be transformed to a complex vector under a proposed effect of partial observation. (...) In the present project, this concept of transformation is developed to be a lab observation concept. Under this transformation of the lab observer, it is found that the velocity equation of the motion of the point is transformed into an equation analogous to the relativistic quantum mechanics equation (the Dirac equation). Many other analogies have been found and are listed in a comparison table. The analogy tries to explain the entanglement within the scope of the transformation. These analogies may suggest that both quantum mechanics and special relativity are emergent, both of them are unified, and of the same origin. The similarities suggest analogies and propose questions of interpretation for the standard quantum theory, without any possible causal claims. (shrink)
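Since the abstract does not give the model's exact kinematics, the following is only a generic toy analogue: the position vector of a marked point on one circle rolling inside another, written as a complex number via Euler's formula. The radii, rates, and hypocycloid form are my assumptions, not the TWH equations.

```python
import cmath

def point_on_rolling_circle(R, r, theta):
    """Position (as a complex number) of a marked point on a small
    circle of radius r rolling without slipping inside a fixed
    circle of radius R. Rolling gives the marked point the phase
    -((R - r)/r) * theta. This is a generic hypocycloid, offered
    only as a toy analogue of the two-rolling-circle model
    described in the abstract."""
    centre = (R - r) * cmath.exp(1j * theta)
    marked = r * cmath.exp(-1j * (R - r) / r * theta)
    return centre + marked

# With R = 2r the marked point oscillates on a straight segment
# (the classical Cardano-circles case), staying real for all theta:
z = point_on_rolling_circle(2.0, 1.0, 0.7)
print(abs(z.imag) < 1e-9)
```

The R = 2r case reduces to z = 2r·cos(θ), which is easy to confirm numerically and illustrates how a real-space rolling motion acquires a natural complex-exponential description.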
I argue that Galileo regarded unaccelerated motion as requiring no cause to sustain it. In an inclined plane experiment, the cause ceases when the incline ceases. When the incline ceases, what ceases is acceleration, not motion. Hence, unaccelerated motion requires no cause to sustain it.
This largely expository lecture deals with aspects of traditional solid geometry suitable for applications in logic courses. Polygons are plane or two-dimensional; the simplest are triangles. Polyhedra [or polyhedrons] are solid or three-dimensional; the simplest are tetrahedra [or triangular pyramids, made of four triangles]. -/- A regular polygon has equal sides and equal angles. A polyhedron having congruent faces and congruent [polyhedral] angles is not called regular, as some might expect; rather, it is said to be subregular—a word coined (...) for this lecture. To repeat, a subregular polyhedron has congruent faces and congruent [polyhedral] angles. A subregular polyhedron whose faces are all regular polygons is regular—using standard terminology. -/- Geometers before Euclid showed that there are “essentially” only five regular polyhedra: every regular polyhedron is a tetrahedron (4 faces), a hexahedron or cube (6 faces), an octahedron (8 faces), a dodecahedron (12 faces), or an icosahedron (20 faces). -/- The first question is whether there are subregular polyhedra that are not regular. For example, are there tetrahedra having congruent angles and congruent triangular faces but whose faces are not equilateral triangles? -/- Another question is the classification of subregular polyhedra if they exist. For example, considering the fact that the regular tetrahedra all have equilateral triangles as faces, we ask which triangles other than equilaterals are faces of subregular tetrahedra. Similarly, considering the fact that the regular hexahedra all have squares as faces, we ask which quadrangles other than squares are faces of subregular hexahedra. -/- After introductory remarks that include historical and philosophical points, we concentrate on tetrahedra. A triangle that is congruent to each of the four faces of a tetrahedron is called a generator of the tetrahedron.
The main result proved is that every acute triangle is a generator of a subregular tetrahedron. The proof includes an algorithm (implementable with scissors and paper) that constructs from any given acute triangle a subregular tetrahedron whose faces are congruent to the given triangle. -/- Algorithm: Given any acute triangle. Construct a similar triangle whose sides are double the sides of the given triangle. Draw the three lines connecting the three midpoints of the sides (making four triangles congruent to the given triangle—a central triangle surrounded by three peripheral triangles). Make three “hinges” along the lines connecting the midpoints. “Fold” the peripheral triangles together (into a tetrahedron). [LIGHTLY EDITED VERSION OF PRINTED ABSTRACT] Acknowledgements: William Lawvere, Colin McLarty, Irvin Miller, Frango Nabrasa, Lawrence Spector, Roberto Torretti, and Richard Vesley. (shrink)
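The main result can also be checked numerically. The sketch below uses the standard "box" embedding of a tetrahedron with four congruent faces rather than the lecture's folding algorithm: from sides a, b, c of an acute triangle one solves p² = (a² + b² − c²)/2, and similarly for q² and r², places vertices at alternating corners of a p × q × r box, and verifies that all four faces have sides {a, b, c}. The coordinatisation is my assumption; the congruence check itself reflects the stated theorem.

```python
from itertools import combinations
from math import dist, isclose

def isosceles_tetrahedron(a, b, c):
    """Vertices of a tetrahedron all four of whose faces are congruent
    to the triangle with sides a, b, c. Uses the box embedding (not
    the lecture's folding construction): the edges are face diagonals
    of a rectangular box with p^2 = (a^2 + b^2 - c^2)/2, etc.;
    acuteness of the triangle makes p^2, q^2, r^2 positive."""
    p2 = (a*a + b*b - c*c) / 2
    q2 = (a*a + c*c - b*b) / 2
    r2 = (b*b + c*c - a*a) / 2
    assert min(p2, q2, r2) > 0, "triangle must be acute"
    p, q, r = p2**0.5, q2**0.5, r2**0.5
    return [(0, 0, 0), (p, q, 0), (p, 0, r), (0, q, r)]

# Every face of the resulting tetrahedron has sides {a, b, c}:
a, b, c = 4.0, 5.0, 6.0            # an acute triangle (16 + 25 > 36)
V = isosceles_tetrahedron(a, b, c)
for face in combinations(V, 3):
    sides = sorted(dist(u, v) for u, v in combinations(face, 2))
    assert all(isclose(s, t) for s, t in zip(sides, [a, b, c]))
print("all four faces congruent to the given triangle")
```

Opposite edges of this tetrahedron are equal, which is why each of the four faces ends up with the same three side lengths; for a right or obtuse triangle one of p², q², r² vanishes or goes negative, matching the restriction to acute generators.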
In my dissertation (Rutgers, 2007) I developed the proposal that one can establish that material quantum objects behave classically just in case there is a “local plane wave” regime, which naturally corresponds to the suppression of all quantum interference.
The Ethics of Immigration, by Joseph Carens, Oxford University Press, 2013. -/- Joseph Carens is arguably the most prominent political theorist to defend open borders, a view which he did much to make intellectually respectable in a famous 1987 article, “Aliens and Citizens: The Case for Open Borders.” In The Ethics of Immigration Carens again defends the open borders view, but with a new rationale. Whereas before he argued that seemingly opposed philosophies provided converging support for open borders, now he (...) bases his case on “democratic principles,” by which he means uncontroversial moral commitments that are widely shared in liberal states. Carens argues that one such commitment is to freedom, which can be understood as “not being the subject of the will of another.” A commitment to such a value would explain why freedom of movement within a state is considered a basic human right. But, Carens asks, if we have a general right to freedom of movement within countries, why not between them? -/- Carens has long noted that despite the attractiveness of open borders at the level of pure justice, it is deeply at odds with how immigration policy is normally viewed. Given this, Carens’ many writings on immigration have long approached it from a second perspective, one that puts aside questions of ideal theory and takes for granted the conventional view that states are entitled to discretionary control over their borders. This second perspective is the dominant one in The Ethics of Immigration, as Carens spends most of the book outlining standards of fair treatment for permanent residents, temporary workers, refugees and other migrants that do not presuppose any commitment to open borders. In this mode Carens offers a revised version of one of his most thought-provoking and controversial arguments, defending amnesty for immigrants who first arrive illegally.
-/- Carens’ investigation of immigration issues at both the level of ideal justice and the more immediate plane of the debate over amnesty and related issues makes his book unusually rich. It has the rare virtue of being both philosophically rigorous and politically relevant. -/- . (shrink)
Forgotten Truth is primarily a presentation of the traditional esoteric view that reality consists of a hierarchy of Being. Within the hierarchy there are an indefinite number of worlds, but they can be classified into four levels: the terrestrial, psychic, and celestial planes, and the Infinite. The corresponding levels within the human microcosm are body, mind, soul, and spirit. “From the multiple heavens of Judaism to the storied structure of the Hindu temple and the angelologies of innumerable traditions, the view (...) was reached convergently and independently, as if by innate tropism, by virtually all known societies…” (18). The important exception is our current society, whose fundamental flaw is its inclination to reduce all reality to the terrestrial plane alone. Modernity is “captive of an outlook presumed to be scientific but in fact scientistic” (17); it “goes beyond the actual findings of science to deny that other approaches to knowledge are valid and other truths true” (16). (shrink)
Ordinary people shudder at the thought that people in positions of power might do whatever they think they can get away with. But that is often the way it is in the real world, and the risks go even higher when opportunity is compounded with impatience. The ways of negotiation and diplomacy are not considered entirely outmoded. But more and more we are being duped by a dream of some ultimate technological fix: that one more fancy gadget is all it (...) will take to solve the vexing problems that less well-tooled folks have been stumbling over for centuries. Our success rate, this reasoning goes, has been limited so far only by the limits on our equipment. With the new super-missile, or the new super-plane, or the new super-launching system in space, we will be able to leap tall buildings in a single bound, or, what is more to the point, just blow them away and walk across the crater. "Bombs can be clean." "Nuclear war is winnable." The illusion of omnipotence that accompanies this megalomania is well nurtured by manufacturers who stand in line for contracts to help build some super-weapon. This should not be surprising. What at first glance is surprising is the almost total failure of our commercial media to call this myth into question. This criticism is meant to be sweeping, but I will here focus my remarks on film. (shrink)
Filtration combustion is described by Laplacian growth without surface tension. These equations have elegant analytical solutions that replace the complex integro-differential motion equations by simple differential equations of pole motion in a complex plane. The main problem with such a solution is the existence of finite-time singularities. To prevent such singularities, nonzero surface tension is usually used. However, nonzero surface tension does not exist in filtration combustion, and this destroys the analytical solutions. Fortunately, a more elegant approach exists (...) for solving the problem. First, we can introduce a small amount of pole noise to the system. Second, for regularisation of the problem, we throw out all new poles that can produce a finite-time singularity. It can be strictly proved that the asymptotic solution for such a system is a single finger. Moreover, the qualitative consideration demonstrates that a finger with 1/2 of the channel width is statistically stable. Therefore, all properties of such a solution are exactly the same as those of the solution with nonzero surface tension under numerical noise. The solution of the Saffman–Taylor (ST) problem without surface tension is similar to the solution for the equation of cellular flames in the case of the combustion of gas mixtures. (shrink)