Given the hard problem of consciousness (Chalmers, 1995), there are no brain electrophysiological correlates of subjective experience (the felt quality of redness or the redness of red, the experience of dark and light, the quality of depth in a visual field, the sound of a clarinet, the smell of mothballs, bodily sensations from pains to orgasms, mental images that are conjured up internally, the felt quality of emotion, the experience of a stream of conscious thought, or the phenomenology of thought). There are, however, occipital and left temporal electrophysiological correlates of subjective experience (Pereira, 2015). Notwithstanding, as an evoked signal, the change in event-related brain potential phase is instantaneous (frequency is the change in phase over time), so the frequency will transiently be infinite: a transient peak in frequency (positive or negative), if any, is instantaneous in the electroencephalogram averaging or filtering that event-related brain potentials require. The underlying structure of event-related brain potentials in the frequency domain cannot be accounted for by, for example, Wavelet Transform (WT) or Fast Fourier Transform (FFT) analysis, because these methods derive frequency by convolution rather than by differentiation. However, as I show in the current original research report, one suitable method for analysing the instantaneous change in event-related brain potential phase, and for accounting for a transient peak in frequency (positive or negative), if any, in the underlying structure of event-related brain potentials, is Empirical Mode Decomposition with post-processing (Xie et al., 2014), Ensemble Empirical Mode Decomposition (postEEMD), and the Hilbert-Huang Transform (HHT).
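The Hilbert spectral step of the HHT mentioned above can be sketched as follows. This is a minimal illustration on a synthetic signal standing in for one intrinsic mode function already extracted by EMD/postEEMD; the sampling rate, carrier frequency, and amplitude modulation are assumptions for the example, not values from the study:

```python
import numpy as np
from scipy.signal import hilbert

fs = 250.0                        # assumed EEG sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
# Stand-in for one intrinsic mode function (IMF): a 10 Hz oscillation
# whose amplitude is slowly modulated over time.
imf = (1.0 + 0.5 * np.sin(2 * np.pi * 1.0 * t)) * np.sin(2 * np.pi * 10.0 * t)

analytic = hilbert(imf)                        # analytic signal via the Hilbert transform
inst_amplitude = np.abs(analytic)              # instantaneous amplitude (envelope)
inst_phase = np.unwrap(np.angle(analytic))     # instantaneous phase (radians)
# Instantaneous frequency by differentiation of phase, in Hz --
# the step that convolution-based WT/FFT analysis does not provide.
inst_frequency = np.diff(inst_phase) * fs / (2 * np.pi)

print(round(float(np.median(inst_frequency)), 1))  # close to the 10 Hz carrier
```

Because frequency is obtained by differentiating the phase sample by sample, a transient jump in phase shows up as a transient (arbitrarily large) spike in `inst_frequency`, which is exactly what convolution-based spectral estimates smear out.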
R Core Team. (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing. Supplement to Occipital and left temporal instantaneous amplitude and frequency oscillations correlated with access and phenomenal consciousness. The supplement moves from the features of the ERP characterized in Occipital and Left Temporal EEG Correlates of Phenomenal Consciousness (Pereira, 2015) towards the instantaneous amplitude and frequency of event-related changes correlated with a contrast in access and in phenomenology. It proceeds as follows. In the first section, empirical mode decomposition (EMD) with post-processing (Xie, G., Guo, Y., Tong, S., and Ma, L., 2014. Calculate excess mortality during heatwaves using Hilbert-Huang transform algorithm. BMC Medical Research Methodology, 14, 35), Ensemble Empirical Mode Decomposition (postEEMD), and the Hilbert-Huang Transform (HHT) are applied. In the second section, the variance inflation factor (VIF) is calculated. In the third section, partial least squares regression (PLSR) is used to find the minimal root mean squared error of prediction (RMSEP). In the last section, PLSR is assessed with the significance multivariate correlation (sMC) statistic.
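The variance inflation factor computed in the second section can be sketched as follows. The predictor matrix here is simulated (the collinear columns are an assumption for illustration, not the study's EEG-derived predictors); the VIF of a predictor is 1/(1 − R²) from regressing it on the remaining predictors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: x2 is nearly collinear with x0 + x1, standing in
# for correlated instantaneous-amplitude/frequency regressors.
x0 = rng.normal(size=n)
x1 = rng.normal(size=n)
x2 = x0 + x1 + 0.1 * rng.normal(size=n)
X = np.column_stack([x0, x1, x2])

def vif(X, j):
    """VIF of column j: 1 / (1 - R^2) from regressing X[:, j] on the others."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])   # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

print([round(vif(X, j), 1) for j in range(3)])  # x2's VIF far exceeds the usual cutoff of 10
```

A large VIF flags the multicollinearity that motivates moving to PLSR, which handles correlated predictors by regressing on latent components instead of the raw columns.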
Arthur Clarke and Michael Kube-McDowell ("The Trigger", 2000) suggested the sci-fi idea of a direct transformation of one chemical substance into another by the action of a new physical "Trigger" field. Karl Brohier, a Nobel Prize winner and a character in the novel, elaborates a new theory, re-reading and re-writing Pauling's "The Nature of the Chemical Bond"; according to Brohier: "Information organizes and differentiates energy. It regularizes and stabilizes matter. Information propagates through matter-energy and mediates the interactions of matter-energy." Dr Horton, his collaborator in the novel, replies: "If the universe consists of energy and information, then the Trigger somehow alters the information envelope of certain substances –". "Alters it, scrambles it, overwhelms it, destabilizes it," Brohier adds. There is a scientific debate whether, or how far, chemistry is fundamentally reducible to quantum mechanics. Nevertheless, the fact that many essential chemical properties and reactions are at least partly representable in terms of quantum mechanics is doubtless. Since quantum mechanics itself has been reformulated as a theory of a special kind of information, quantum information, chemistry might in turn be interpreted in the same terms. The wave function, the fundamental concept of quantum mechanics, can be equivalently defined as a series of qubits, eventually infinite. A qubit, being defined as the normed superposition of two orthogonal subspaces of the complex Hilbert space, can be interpreted as a generalization of the standard bit of information to infinite sets or series. All "forces" in the Standard Model, which are furthermore essential for chemical transformations, are groups [U(1), SU(2), SU(3)] of transformations of the complex Hilbert space and thus of series of qubits. One can suggest that any chemical substances and changes are fundamentally representable as quantum information and its transformations.
If entanglement is interpreted as a physical field, though none of the groups above seems attachable to it, it might be identified as the "Trigger field". It might cause a direct transformation of any chemical substance from a remote distance. Is this possible in principle?
The paper addresses the problem which quantum mechanics in fact resolves. Its viewpoint suggests that the crucial link of time and its course is omitted in understanding the problem. The common interpretation, underlain by the history of quantum mechanics, sees discreteness only on the Planck scale, which is transformed into continuity and even smoothness on the macroscopic scale. That approach is fraught with a series of seeming paradoxes. It suggests that the present mathematical formalism of quantum mechanics is only partly relevant to its problem, which is ostensibly known. The paper accepts just the opposite: the mathematical solution is absolutely relevant and serves as an axiomatic base, from which the real and yet hidden problem is deduced. Wave-particle duality, Hilbert space, both the probabilistic and many-worlds interpretations of quantum mechanics, quantum information, and the Schrödinger equation are included in that base. The Schrödinger equation is understood as a generalization of the law of energy conservation to past, present, and future moments of time. The deduced real problem of quantum mechanics is: "What is the universal law describing the course of time in any physical change, therefore including any mechanical motion?"
The concept of quantum information is introduced as both the normed superposition of two orthogonal subspaces of the separable complex Hilbert space and the invariance of the Hamilton and Lagrange representations of any mechanical system. The base is the isomorphism between the standard introduction of a qubit and its representation as a 3D unit ball in which two points are chosen. The separable complex Hilbert space is considered as the free variable of quantum information, and any point in it (a wave function describing a state of a quantum system) as its value as the bound variable. A qubit is equivalent to the generalization of 'bit' from the set of two equally probable alternatives to an infinite set of alternatives. Then, that Hilbert space is considered as a generalization of Peano arithmetic where any unit is substituted by a qubit, and thus the set of natural numbers is mappable within any qubit as the complex internal structure of the unit or a different state of it. Thus, any mathematical structure reducible to set theory is representable as a set of wave functions and a subspace of the separable complex Hilbert space, and it can be identified as the category of all categories, for any functor represents an operator transforming a set (or subspace) of the separable complex Hilbert space into another. Thus, category theory is isomorphic to the Hilbert-space representation of set theory and Peano arithmetic as above. Given any value of quantum information, i.e. a point in the separable complex Hilbert space, it always admits two equally acceptable interpretations: one physical, the other mathematical. The former is a wave function as the exhaustive description of a certain state of a certain quantum system. The latter chooses a certain mathematical structure among a certain category. Thus there is no way to distinguish a mathematical structure from a physical state, for both are described exhaustively as a value of quantum information.
This statement in turn can be utilized to define quantum information by the identity of any mathematical structure to a physical state, and also vice versa. Further, that definition is equivalent to both the standard definition as normed superposition and the invariance of the Hamilton and Lagrange interpretations of mechanical motion introduced in the beginning of the paper. Then, the concept of information symmetry can be involved as the symmetry between three elements, or two pairs of elements: the Lagrange representation and each counterpart of the pair of the Hamilton representation. The sense and meaning of information symmetry may be visualized by a single (quantum) bit and its interpretation as both a (privileged) reference frame and the symmetries of the Standard Model.
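The definition of a qubit used in these abstracts, a normed superposition generalizing the bit of two equally probable alternatives, can be illustrated numerically; the particular amplitudes below are an arbitrary choice for the example:

```python
import numpy as np

# A qubit: a normed superposition a|0> + b|1> with |a|^2 + |b|^2 = 1.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)   # example amplitudes (assumed, for illustration)
qubit = np.array([a, b])
assert np.isclose(np.linalg.norm(qubit), 1.0)   # the norm condition

# A classical bit is the degenerate case where one amplitude is 1 and the other 0:
bit0, bit1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Measurement probabilities generalize the "two equally probable alternatives":
probs = np.abs(qubit) ** 2
print(probs.round(2))   # [0.5 0.5]
```

The continuum of admissible amplitude pairs (a, b) is what licenses the text's reading of the qubit as a choice among infinitely many alternatives rather than between two.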
Girolamo Saccheri (1667--1733) was an Italian Jesuit priest, scholastic philosopher, and mathematician. He earned a permanent place in the history of mathematics by discovering and rigorously deducing an elaborate chain of consequences of an axiom-set for what is now known as hyperbolic (or Lobachevskian) plane geometry. Reviewer's remarks: (1) On two pages of this book Saccheri refers to his previous and equally original book Logica demonstrativa (Turin, 1697), to which 14 of the 16 pages of the editor's "Introduction" are devoted. At the time of the first edition, 1920, the editor was apparently not acquainted with the secondary literature on Logica demonstrativa, which continued to grow in the period preceding the second edition [see D. J. Struik, in Dictionary of scientific biography, Vol. 12, 55--57, Scribner's, New York, 1975]. Of special interest in this connection is a series of three articles by A. F. Emch [Scripta Math. 3 (1935), 51--60; Zbl 10, 386; ibid. 3 (1935), 143--152; Zbl 11, 193; ibid. 3 (1935), 221--333; Zbl 12, 98]. (2) It seems curious that modern writers believe that demonstration of the "nondeducibility" of the parallel postulate vindicates Euclid, whereas at first Saccheri seems to have thought that demonstration of its "deducibility" is what would vindicate Euclid. Saccheri is perfectly clear in his commitment to the ancient (and now discredited) view that it is wrong to take as an "axiom" a proposition which is not a "primal verity", which is not "known through itself". So it would seem that Saccheri should think that he was convicting Euclid of error by deducing the parallel postulate. The resolution of this confusion is that Saccheri thought that he had proved, not merely that the parallel postulate was true, but that it was a "primal verity" and, thus, that Euclid was correct in taking it as an "axiom". As implausible as this claim about Saccheri may seem, the passage on p.
237, lines 3--15, seems to admit of no other interpretation. Indeed, Emch takes it this way. (3) As has been noted by many others, Saccheri was fascinated, if not obsessed, by what may be called "reflexive indirect deductions", indirect deductions which show that a conclusion follows from given premises by a chain of reasoning beginning with the given premises augmented by the denial of the desired conclusion and ending with the conclusion itself. It is obvious, of course, that this is simply a species of ordinary indirect deduction; a conclusion follows from given premises if a contradiction is deducible from those given premises augmented by the denial of the conclusion---and it is immaterial whether the contradiction involves one of the premises, the denial of the conclusion, or even, as often happens, intermediate propositions distinct from the given premises and the denial of the conclusion. Saccheri seemed to think that a proposition proved in this way was deduced from its own denial and, thus, that its denial was self-contradictory (p. 207). Inference from this mistake to the idea that propositions proved in this way are "primal verities" would involve yet another confusion. The reviewer gratefully acknowledges extensive communication with his former doctoral students J. Gasser and M. Scanlan. ADDED March 14, 2015: (1) Wikipedia reports that many of Saccheri's ideas have a precedent in the 11th-century Persian polymath Omar Khayyám's Discussion of Difficulties in Euclid, a fact ignored in most Western sources until recently. It is unclear whether Saccheri had access to this work in translation, or developed his ideas independently. (2) This book is another exemplification of the huge difference between indirect deduction and indirect reduction. Indirect deduction requires making an assumption that is inconsistent with the premises previously adopted. This means that the reasoner must perform a certain mental act of assuming a certain proposition.
In case the premises are all known truths, indirect deduction—which would then be indirect proof—requires the reasoner to assume a falsehood. This fact has been noted by several prominent mathematicians including Hardy, Hilbert, and Tarski. Indirect reduction requires no new assumption. Indirect reduction is simply a transformation of an argument in one form into another argument in a different form. In an indirect reduction one proposition in the old premise set is replaced by the contradictory opposite of the old conclusion and the new conclusion becomes the contradictory opposite of the replaced premise. Roughly and schematically, P,Q/R becomes P,~R/~Q or ~R,Q/~P. Saccheri's work involved indirect deduction, not indirect reduction. (3) The distinction between indirect deduction and indirect reduction has largely slipped through the cracks, the cracks between medieval-oriented logic and modern-oriented logic. The medievalists have a heavy investment in reduction and, though they have heard of deduction, they think that deduction is a form of reduction, or vice versa, or in some cases they think that the word 'deduction' is the modern way of referring to reduction. The modernists have no interest in reduction, i.e. in the process of transforming one argument into another having exactly the same number of premises. Modern logicians, like Aristotle, are concerned with deducing a single proposition from a set of propositions. Some focus on deducing a single proposition from the null set—something difficult to relate to reduction.
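The schematic transformation P,Q/R into P,~R/~Q preserves validity in classical logic, and this can be checked exhaustively for truth-functional propositions over two atoms. A small brute-force sketch (not part of the reviewed text) encodes each proposition as its truth table:

```python
from itertools import product

# Each proposition over two atoms is encoded as its 4-row truth table.
functions = list(product([False, True], repeat=4))  # all 16 truth functions

def entails(premises, conclusion):
    """ps |= c: every row making all premises true makes the conclusion true."""
    return all(conclusion[i] for i in range(4)
               if all(p[i] for p in premises))

def neg(p):
    return tuple(not x for x in p)

# Check that P,Q |= R  iff  P,~R |= ~Q, for all 16^3 truth-functional triples.
ok = all(entails([P, Q], R) == entails([P, neg(R)], neg(Q))
         for P in functions for Q in functions for R in functions)
print(ok)  # prints True
```

The equivalence holds because both arguments are valid exactly when the set {P, Q, ~R} is unsatisfiable, which is why the reduction needs no new assumption: it merely rearranges which member of that set is called the conclusion.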
Based on various documents, 1989-2002, through the original texts, and in addition to the author's contributions, this paper presents the refutation by the mathematicians and physicists A. Logunov and M. Mestvirishvili of A. Einstein's "general relativity", from these authors' relativistic theory of gravitation; applying the fundamental physical principle of conservation of energy-momentum and using absolute differential calculus, they rigorously carry out their mathematical proofs. It is conclusively shown that, from the Einstein-Grossmann-Hilbert equations, gravity is absurdly a metric field devoid of physical reality, unlike all other fields in nature, which are material fields, interrupting the chain of transformations between the different existing fields. Also, in Einstein's theory the proven equality of inertial and gravitational mass has no physical meaning. Therefore, "general relativity" does not obey the correspondence principle with Newton's gravity.
The quantum information introduced by quantum mechanics is equivalent to a certain generalization of classical information: from finite to infinite series or collections. The quantity of information is the quantity of choices measured in units of elementary choice. The "qubit" can be interpreted as that generalization of "bit": a choice among a continuum of alternatives. The axiom of choice is necessary for quantum information. The coherent state is transformed into a well-ordered series of results in time after measurement. The quantity of quantum information is the transfinite ordinal number corresponding to the infinite series in question. The transfinite ordinal numbers can be defined as ambiguously corresponding to "transfinite natural numbers" generalizing the natural numbers of Peano arithmetic to "Hilbert arithmetic", allowing for the unification of the foundations of mathematics and quantum mechanics.
Restall set forth a "consecution" calculus in his An Introduction to Substructural Logics. This is a natural-deduction-type sequent calculus where the structural rules play an important role. This paper looks at different ways of extending Restall's calculus. It is shown that Restall's weak soundness and completeness result with regard to a Hilbert calculus can be extended to a strong one, so as to encompass what Restall calls proofs from assumptions. It is also shown how to extend the calculus so as to validate the metainferential rule of reasoning by cases, as well as certain theory-dependent rules.
Detlefsen (1986) reads Hilbert's program as a sophisticated defense of instrumentalism, but Feferman (1998) has it that Hilbert's program leaves significant ontological questions unanswered. One such question concerns the reference of individual number terms. Hilbert's use of admittedly "meaningless" signs for numbers and formulae appears to impair his ability to establish the reference of mathematical terms and the content of mathematical propositions (Weyl (1949); Kitcher (1976)). The paper traces the history and context of Hilbert's reasoning about signs, which illuminates Hilbert's account of mathematical objectivity, axiomatics, idealization, and consistency.
Hilbert’s program was an ambitious and wide-ranging project in the philosophy and foundations of mathematics. In order to “dispose of the foundational questions in mathematics once and for all,” Hilbert proposed a two-pronged approach in 1921: first, classical mathematics should be formalized in axiomatic systems; second, using only restricted, “finitary” means, one should give proofs of the consistency of these axiomatic systems. Although Gödel’s incompleteness theorems show that the program as originally conceived cannot be carried out, it had many partial successes, and generated important advances in logical theory and metatheory, both at the time and since. The article discusses the historical background and development of Hilbert’s program, its philosophical underpinnings and consequences, and its subsequent development and influences since the 1930s.
Color-vision defects constitute a spectrum of disorders with varying degrees and types of departure from normal human color vision. One form of color-vision defect is dichromacy; by mixing together only two lights, the dichromat can match any light, unlike normal trichromatic humans, who need to mix three. In a philosophical context, our titular question may be taken in two ways. First, it can be taken at face value as a question about visible properties of external objects, and second, it may be interpreted as the more intangible question of “what it’s like” to be color-blind.
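The matching claim above is a fact of linear algebra: a match only requires the lights to agree in the observer's cone-response space, which for a dichromat is two-dimensional, so two primaries suffice. A sketch with made-up numbers (the cone sensitivities, primaries, and test light below are assumptions for illustration):

```python
import numpy as np

# Hypothetical 2-channel photoreceptor sensitivities: rows are the dichromat's
# two surviving cone classes, columns are three sampled wavelengths.
cones = np.array([[0.8, 0.4, 0.1],
                  [0.1, 0.5, 0.9]])

primary_a = np.array([1.0, 0.0, 0.0])   # two primary lights (spectra, assumed)
primary_b = np.array([0.0, 0.0, 1.0])
target    = np.array([0.2, 0.6, 0.3])   # an arbitrary test light

# A dichromatic match only needs the 2-D cone responses to agree, so two
# primaries give a solvable 2x2 linear system for the mixture weights.
M = np.column_stack([cones @ primary_a, cones @ primary_b])
weights = np.linalg.solve(M, cones @ target)
mixture = weights[0] * primary_a + weights[1] * primary_b
print(np.allclose(cones @ mixture, cones @ target))  # prints True: a match for the dichromat
```

For a trichromat the response space is three-dimensional, so the analogous system needs three primaries; a negative weight corresponds to the standard move of adding that primary to the test light instead of the mixture.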
In the 1920s, David Hilbert proposed a research program with the aim of providing mathematics with a secure foundation. This was to be accomplished by first formalizing logic and mathematics in their entirety, and then showing---using only so-called finitistic principles---that these formalizations are free of contradictions. In the area of logic, the Hilbert school accomplished major advances both in introducing new systems of logic, and in developing central metalogical notions, such as completeness and decidability. The analysis of unpublished material presented in Chapter 2 shows that a completeness proof for propositional logic was found by Hilbert and his assistant Paul Bernays already in 1917--18, and that Bernays's contribution was much greater than is commonly acknowledged. Aside from logic, the main technical contributions of Hilbert's Program are the development of formal mathematical theories and proof-theoretical investigations thereof, in particular, consistency proofs. In this respect Wilhelm Ackermann's 1924 dissertation is a milestone both in the development of the Program and in proof theory in general. Ackermann gives a consistency proof for a second-order version of primitive recursive arithmetic which, surprisingly, explicitly uses a finitistic version of transfinite induction up to ω^ω^ω. He also gave a faulty consistency proof for a system of second-order arithmetic based on Hilbert's ε-substitution method. Detailed analyses of both proofs in Chapter 3 shed light on the development of finitism and proof theory in the 1920s as practiced in Hilbert's school. In a series of papers, Charles Parsons has attempted to map out a notion of mathematical intuition which he also brings to bear on Hilbert's finitism. According to him, mathematical intuition fails to be able to underwrite the kind of intuitive knowledge Hilbert thought was attainable by the finitist.
It is argued in Chapter 4 that the extent of finitistic knowledge which intuition can provide is broader than Parsons supposes. According to another influential analysis of finitism due to W. W. Tait, finitist reasoning coincides with primitive recursive reasoning. The acceptance of non-primitive recursive methods in Ackermann's dissertation presented in Chapter 3, together with additional textual evidence presented in Chapter 4, shows that this identification is untenable as far as Hilbert's conception of finitism is concerned. Tait's conception, however, differs from Hilbert's in important respects, yet it is also open to criticisms leading to the conclusion that finitism encompasses more than just primitive recursive reasoning.
Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917-1923. The aim of this paper is to describe these results, focussing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
The human faculty of moral judgment is not well suited to address problems, like climate change, that are global in scope and remote in time. Advocates of ‘moral bioenhancement’ have proposed that we should investigate the use of medical technologies to make human beings more trusting and altruistic, and hence more willing to cooperate in efforts to mitigate the impacts of climate change. We survey recent accounts of the proximate and ultimate causes of human cooperation in order to assess the prospects for bioenhancement. We identify a number of issues that are likely to be significant obstacles to effective bioenhancement, as well as areas for future research.
This paper critically examines color relationalism and color relativism, two theories of color that are allegedly supported by variation in normal human color vision. We mostly discuss color relationalism, defended at length in Jonathan Cohen's The Red and the Real, and argue that the theory has insuperable problems.
David Hilbert's finitistic standpoint is a conception of elementary number theory designed to answer the intuitionist doubts regarding the security and certainty of mathematics. Hilbert was unfortunately not exact in delineating what that viewpoint was, and Hilbert himself changed his usage of the term through the 1920s and 30s. The purpose of this paper is to outline what the main problems are in understanding Hilbert and Bernays on this issue, based on some publications by them which have so far received little attention, and on a number of philosophical reconstructions of the viewpoint (in particular, by Hand, Kitcher, and Tait).
After sketching the main lines of Hilbert's program, certain well-known and influential interpretations of the program are critically evaluated, and an alternative interpretation is presented. Finally, some recent developments in logic related to Hilbert's program are reviewed.
Integrity is often conceived as a heroic ideal: the person of integrity sticks to what they believe is right, regardless of the consequences. In this article, I defend a conception of ordinary integrity, for people who either do not desire or are unable to be moral martyrs. Drawing on the writings of seventeenth century thinker Huang Zongxi, I propose refocussing attention away from an abstract ideal of integrity, to instead consider the institutional conditions whereby it is made safe not to be servile.
This paper proposes a way to understand transformative choices, choices that change ‘who you are.’ First, it distinguishes two broad models of transformative choice: 1) ‘event-based’ transformative choices in which some event—perhaps an experience—downstream from a choice transforms you, and 2) ‘choice-based’ transformative choices in which the choice itself—and not something downstream from the choice—transforms you. Transformative choices are of interest primarily because they purport to pose a challenge to standard approaches to rational choice. An examination of the event-based transformative choices of L. A. Paul and Edna Ullman-Margalit, however, suggests that event-based transformative choices don’t raise any difficulties for standard approaches to rational choice. An account of choice-based transformative choices—and what it is to be transformed—is then proposed. Transformative choices so understood not only capture paradigmatic cases of transformative choice but also point the way to a different way of thinking about rational choice and agency.
Religious freedom is often thought to protect, not only religious practices, but also the underlying religious beliefs of citizens. But what should be said about religious beliefs that oppose religious freedom itself or that deny the concept of equal citizenship? The author argues here that such beliefs, while protected against coercive sanction, are rightly subject to attempts at transformation by the state in its expressive capacities. Transformation is entailed by a commitment to publicizing the reasons and principles that justify the basic rights of citizens.
What words we use, and what meanings they have, is important. We shouldn't use slurs; we should use 'rape' to include spousal rape (for centuries we didn't); we should have a word which picks out the sexual harassment suffered by people in the workplace and elsewhere (for centuries we didn't). Sometimes we need to change the word-meaning pairs in circulation, either by getting rid of the pair completely (slurs), changing the meaning (as we did with 'rape'), or adding brand new word-meaning pairs (as with 'sexual harassment'). A problem, though, is how to do this. One might worry that any attempt to change language in this way will lead to widespread miscommunication and confusion. I argue that this is indeed so, but that's a feature, not a bug, of attempting to change word-meaning pairs. The miscommunications and confusion such changes cause can lead us, via a process I call transformative communicative disruption, to reflect on our language and its use, and this can further, rather than hinder, our goal of improving language.
The objective of this article is twofold. First, a methodological issue is addressed. It is pointed out that even if philosophers of mathematics have recently become more and more concerned with the practice of mathematics, there is still a need for a sharp definition of what the targets of a philosophy of mathematical practice should be. Three possible objects of inquiry are put forward: (1) the collective dimension of the practice of mathematics; (2) the cognitive capacities required of the practitioners; and (3) the specific forms of representation and notation shared and selected by the practitioners. Moreover, it is claimed that a broadening of the notion of 'permissible action', as introduced by Larvor (2012) with respect to mathematical arguments, allows for a consideration of all three of these elements simultaneously. Second, a case from topology – the proof of Alexander's theorem – is presented to illustrate a concrete analysis of a mathematical practice and to exemplify the proposed method. It is discussed that attention to the three elements of the practice identified above brings out philosophically relevant features in the practice of topology: the need for a revision of the criteria of validity, the interest in tracking the operations that are performed on the notation, and the constant and fruitful back-and-forth from one representation to another in dealing with mathematical content. Finally, some suggestions for further research are given in the conclusions.
This study attempts to reconstruct Nietzsche’s reading of Aristotle in the 1860s and 1870s—the years before he left his career as a philologist. Against the popular view that Nietzsche read only one book by Aristotle, namely the Rhetoric, the present study hopes to show that he had direct knowledge of several of Aristotle’s main works, while much of his interest in Aristotle centred on the latter’s account of art. The particular aim of this study is to explore how Nietzsche’s reading of Aristotle contributed to the formation of The Birth of Tragedy. It will show that, although Nietzsche mentions Aristotle in his first book only en passant, his theory of tragedy should be understood against the background of Aristotelian poetics, especially as interpreted by such contemporaries as Jacob Bernays, Joseph Hubert Reinkens, and Gustav Teichmüller.
We often find ourselves in situations where it is up to us to make decisions on behalf of others. How can we determine whether such decisions are morally justified, especially if those decisions may change who it is these others end up becoming? In this paper, I will evaluate one plausible kind of justification that may tempt us: we may want to justify our decision by appealing to the likelihood that the other person will be glad we made that specific choice down the line. Although it is tempting, I ultimately argue that we should reject this sort of appeal as a plausible justification for the moral permissibility of our vicarious decisions. This is because the decisions that we make on behalf of another may affect the interests and values that that person will hold in the future. As I will show, this complicates the justificatory relationship between present decisions and future attitudes, since the latter can depend on the former. This is not to say that the predicted future attitudes of others can play no significant role in justifying our decisions on others’ behalf. Rather, appealing to the future attitudes in our moral justifications may play an important role in our practical thinking but only when we consider the future attitudes of all relevant possible futures.
McQueen and Vaidman argue that the Many Worlds Interpretation (MWI) of quantum mechanics provides local causal explanations of the outcomes of experiments in our experience as due to the total effect of all the worlds together. We show that although the explanation is local in one world, it requires a causal influence that travels across different worlds. We further argue that in the MWI the local nature of our experience is not derivable from the Hilbert space structure, but has to be added to it as an independent postulate. This is due to what we call the factorisation-symmetry and basis-symmetry of Hilbert space.
Social scientists have paid insufficient attention to the role of law in constituting the economic institutions of capitalism. Part of this neglect emanates from inadequate conceptions of the nature of law itself. Spontaneous conceptions of law and property rights that downplay the role of the state are criticized here, because they typically assume relatively small numbers of agents and underplay the complexity and uncertainty in developed capitalist systems. In developed capitalist economies, law is sustained through interaction between private agents, courts and the legislative apparatus. Law is also a key institution for overcoming contracting uncertainties. It is furthermore a part of the power structure of society, and a major means by which power is exercised. This argument is illustrated by considering institutions such as property and the firm. Complex systems of law have played a crucial role in capitalist development and are also vital for developing economies.
In artificial intelligence, recent research has demonstrated the remarkable potential of Deep Convolutional Neural Networks (DCNNs), which seem to exceed state-of-the-art performance in new domains weekly, especially on the sorts of very difficult perceptual discrimination tasks that skeptics thought would remain beyond the reach of artificial intelligence. However, it has proven difficult to explain why DCNNs perform so well. In philosophy of mind, empiricists have long suggested that complex cognition is based on information derived from sensory experience, often appealing to a faculty of abstraction. Rationalists have frequently complained, however, that empiricists never adequately explained how this faculty of abstraction actually works. In this paper, I tie these two questions together, to the mutual benefit of both disciplines. I argue that the architectural features that distinguish DCNNs from earlier neural networks allow them to implement a form of hierarchical processing that I call “transformational abstraction”. Transformational abstraction iteratively converts sensory-based representations of category exemplars into new formats that are increasingly tolerant to “nuisance variation” in input. Reflecting upon the way that DCNNs leverage a combination of linear and non-linear processing to efficiently accomplish this feat allows us to understand how the brain is capable of bi-directional travel between exemplars and abstractions, addressing longstanding problems in empiricist philosophy of mind. I end by considering the prospects for future research on DCNNs, arguing that rather than simply implementing 80s connectionism with more brute-force computation, transformational abstraction counts as a qualitatively distinct form of processing ripe with philosophical and psychological significance, because it is significantly better suited to depict the generic mechanism responsible for this important kind of psychological processing in the brain.
Advancements in computing, instrumentation, robotics, digital imaging, and simulation modeling have changed science into a technology-driven institution. Government, industry, and society increasingly exert their influence over science, raising questions of values and objectivity. These and other profound changes have led many to speculate that we are in the midst of an epochal break in scientific history.

This edited volume presents an in-depth examination of these issues from philosophical, historical, social, and cultural perspectives. It offers arguments both for and against the epochal break thesis in light of historical antecedents. Contributors discuss topics such as: science as a continuing epistemological enterprise; the decline of the individual scientist and the rise of communities; the intertwining of scientific and technological needs; links to prior practices and ways of thinking; the alleged divide between mode-1 and mode-2 research methods; the commodification of university science; and the shift from the scientific to a technological enterprise. Additionally, they examine the epochal break thesis using specific examples, including the transition from laboratory to real world experiments; the increased reliance on computer imaging; how analog and digital technologies condition behaviors that shape the object and beholder; the cultural significance of humanoid robots; the erosion of scientific quality in experimentation; and the effect of computers on prediction at the expense of explanation.

Whether these events represent a historic break in scientific theory, practice, and methodology is disputed. What they do offer is an important occasion for philosophical analysis of the epistemic, institutional and moral questions affecting current and future scientific pursuits.
I defend the extremist position that the fundamental ontology of the world consists of a vector in Hilbert space evolving according to the Schrödinger equation. The laws of physics are determined solely by the energy eigenspectrum of the Hamiltonian. The structure of our observed world, including space and fields living within it, should arise as a higher-level emergent description. I sketch how this might come about, although much work remains to be done.
This dissertation examines several of the problems that Hilbert discovered in the foundations of mathematics, from a metalogical perspective. The problems manifest themselves in four different aspects of Hilbert’s views: (i) Hilbert’s axiomatic approach to the foundations of mathematics; (ii) his response to criticisms of set theory; (iii) his response to intuitionist criticisms of classical mathematics; (iv) Hilbert’s contribution to the specification of the role of logical inference in mathematical reasoning. This dissertation argues that Hilbert’s axiomatic approach was guided primarily by model theoretical concerns. Accordingly, the ultimate aim of his consistency program was to prove the model-theoretical consistency of mathematical theories. It turns out that for the purpose of carrying out such consistency proofs, a suitable modification of the ordinary first-order logic is needed. To effect this modification, independence-friendly logic is needed as the appropriate conceptual framework. It is then shown how the model theoretical consistency of arithmetic can be proved by using IF logic as its basic logic. Hilbert’s other problems, manifesting themselves as aspects (ii), (iii), and (iv)—most notably the problem of the status of the axiom of choice, the problem of the role of the law of excluded middle, and the problem of giving an elementary account of quantification—can likewise be approached by using the resources of IF logic. It is shown that by means of IF logic one can carry out Hilbertian solutions to all these problems. The two major results concerning aspects (ii), (iii) and (iv) are the following: (a) the axiom of choice is a logical principle; (b) the law of excluded middle divides metamathematical methods into elementary and non-elementary ones. It is argued that these results show that IF logic helps to vindicate Hilbert’s nominalist philosophy of mathematics.
On the basis of an elementary approach to logic, which enriches the expressive resources of ordinary first-order logic, this dissertation shows how the different problems that Hilbert discovered in the foundations of mathematics can be solved.
This paper explores the use of model organisms in studying the cognitive phenomenon of decision-making. Drawing on the framework of biological control to develop a skeletal conception of decision-making, we show that two core features of decision-making mechanisms can be identified by studying model organisms, such as E. coli, jellyfish, C. elegans, and lamprey. First, decision mechanisms are distributed and heterarchically-structured. Second, they depend heavily on chemical information processing, such as that involving neuromodulators. We end by discussing the implications for studying distinctively human decision-making.
In this commentary, I critique three aspects of Emily Walsh's proposal to reduce the moral and legal weight of advance directives: (1) the ambiguity of its initial thesis, (2) its views about the ethics and legality of clinical practice, and (3) its interpretation and application of Ronald Dworkin’s account of advance directives and L.A. Paul's view on transformative experience. I also consider what Walsh’s proposal would mean for people facing the prospect of dementia. I conclude that our reasons to honor many advance directives survive the move to a transformative experience framework.
The Protein Ontology (PRO) provides a formal, logically-based classification of specific protein classes including structured representations of protein isoforms, variants and modified forms. Initially focused on proteins found in human, mouse and Escherichia coli, PRO now includes representations of protein complexes. The PRO Consortium works in concert with the developers of other biomedical ontologies and protein knowledge bases to provide the ability to formally organize and integrate representations of precise protein forms so as to enhance accessibility to results of protein research. PRO (http://pir.georgetown.edu/pro) is part of the Open Biomedical Ontologies (OBO) Foundry.
ABSTRACT In a Buddhist treatise from around the fourth century CE there is a very remarkable story which serves as a thought experiment calling us to question the nature of self and the identity of persons. Lost in Sanskrit, the passage is fortunately preserved in a Chinese translation, the Dà zhìdù lùn. We here present the first reliable translation directly from the Classical Chinese, and discuss the philosophical significance of the story in its historical and literary context. We emphasise the philosophical importance of embedding the story in two framing narratives, and demonstrate that the story taps a range of intuitions, and indeed fears, about the survival of the self which have also played a large role in the history of the topic in the West, and which continue to be of great contemporary concern.
After summarizing the proof-theoretic aim of Hilbert’s program in its historical development, the model-theoretic motivation behind it is identified. It is argued that Hilbert’s ultimate goal was to develop a theory of mathematics free of all epistemological and ontological assumptions concerning the foundations of mathematics. It is suggested that certain recent developments in logic provide a new perspective from which Hilbert’s program can be pursued solely on the basis of nominalist assumptions.
The issue of the use of the Nachlass material has been much debated in Nietzsche scholarship in recent decades. Some insist on the absolute interpretative priority of his published writings over those unpublished and suggest that an extensive engagement with the Nachlass is harmful because it is something Nietzsche rejected. To verify this claim, they appeal to the story of Nietzsche asking his landlord in Sils-Maria to burn some of his notes. Since the notes that were ultimately retrieved are purportedly incorporated into the compilation The Will to Power, the story also leads some to conclude that Nietzsche rejected his project on the will to power. However, the reliability of this story has been questioned. In this manuscript I first present the decisive piece of evidence that will settle the controversy over the story’s authenticity. After showing that it is true that in 1888 Nietzsche wanted some of his notes burned, I address the question of what we can conclude from this story. I argue that it neither suggests the abandonment of the will to power project, nor warrants a devaluation of the Nachlass. Finally, I will discuss the methodological problem of the use of Nietzsche’s Nachlass in general.
Human development is meant to be transformational in that it aims to improve people's lives by enhancing their capabilities. But who does it target: people as they are or the people they will become? This paper argues that the human development approach relies on an understanding of personal identity as dynamic rather than as static collections of preferences, and that this distinguishes human development from conventional approaches to development. Nevertheless, this dynamic understanding of personal identity is presently poorly conceptualized and this has implications for development practice. We identify a danger of paternalism and propose institutionalizing two procedural principles as side constraints on development policies and projects: the principle of free prior informed consent and the principle of democratic development.
Deliberative Transformative Moments (DTM) is a new concept that serves as an amendment to the DQI. With this new concept it is easier to get at the quick give-and-take of discussions of small groups of ordinary citizens. As an illustration, we apply the concept to discussions about the peace process among Colombian ex-combatants, ex-guerrillas and ex-paramilitaries. Specifically, we show how personal stories can transform a discussion from a low to a high level of deliberation and how they can have the opposite effect. To increase the level of deliberation in the general population, we recommend that good illustrations of DTMs should be part of school programs from an early age on, so that children learn how to discuss with others who have different opinions and values.
This paper aims to provide a clarification of the long debate on whether enhancement will or will not diminish authenticity. It focuses particularly on accounts provided by Carl Elliott and David DeGrazia. Three clarifications will be presented here. First, most discussants only criticise Elliott’s identity argument and neglect that his conservative position on the use of enhancement can be understood as a concern over social coercion. Second, Elliott’s and DeGrazia’s views can not only co-exist but even converge as an autonomy-based theory of authenticity. Third, the current account of autonomy provided by DeGrazia fails to address the importance of rationality and the ability of self-correction, which, as a result, prevents the theory from providing a fully developed account of authenticity. In conclusion, a satisfactory account of authenticity cannot focus only on identity or subjective preference.
The foundational ideas of David Hilbert have been generally misunderstood. In this dissertation prospectus, different aims of Hilbert are summarized and a new interpretation of Hilbert's work in the foundations of mathematics is roughly sketched out. Hilbert's view of the axiomatic method, his response to criticisms of set theory and intuitionist criticisms of the classical foundations of mathematics, and his view of the role of logical inference in mathematical reasoning are briefly outlined.