Vogel argues that sensitivity accounts of knowledge are implausible because they entail that we cannot have any higher-level knowledge that our beliefs are true, not false. Becker and Salerno object that Vogel is mistaken because he does not formalize higher-level beliefs adequately. They claim that, if formalized correctly, higher-level beliefs are sensitive and can therefore constitute knowledge. However, these accounts do not consider the belief-forming method, as sensitivity accounts require. If we take bootstrapping as the belief-forming method, as the discussed cases suggest, then we face a generality problem. Our higher-level beliefs as formalized by Becker and Salerno turn out to be sensitive according to a wide reading of bootstrapping, but insensitive according to a narrow reading. This particular generality problem does not arise for the alternative accounts of process reliabilism and basis-relative safety. Hence, sensitivity accounts deliver opposite results not only given different formalizations of higher-level beliefs, but also for the same formalization, depending on how we interpret bootstrapping. Therefore, sensitivity accounts do not fail because they make higher-level knowledge impossible, as Vogel argues, and they do not succeed in allowing higher-level knowledge, as Becker and Salerno suggest. Rather, their problem is that they deliver far too heterogeneous results.
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, by verifying De Morgan's Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on coherence of prevision assessments for some families of compounded conditionals; in particular we examine the Fréchet-Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction $\mathcal{C}_{n+1}$ of n+1 conditional events to the family $\{\mathcal{C}_{n}, E_{n+1}|H_{n+1}\}$. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related with the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non p-valid inference rules; then, we illustrate by an example two methods which allow one to suitably modify non p-valid inference rules in order to get inferences which are p-valid.
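For reference, the two-event case of the bounds mentioned above can be stated compactly; this rendering is mine, and the paper extends the result to n conditional events:

```latex
% Given coherent assessments x_1 = P(E_1|H_1) and x_2 = P(E_2|H_2), any
% coherent prevision \mu of the conjunction (E_1|H_1) \wedge (E_2|H_2)
% satisfies the Fr\'echet--Hoeffding bounds:
\[
\max\{x_1 + x_2 - 1,\; 0\} \;\le\; \mu \;\le\; \min\{x_1,\; x_2\}.
\]
```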
Lipsey and Lancaster's "general theory of second best" is widely thought to have significant implications for applied theorizing about the institutions and policies that most effectively implement abstract normative principles. It is also widely thought to have little significance for theorizing about which abstract normative principles we ought to implement. Contrary to this conventional wisdom, I show how the second-best theorem can be extended to myriad domains beyond applied normative theorizing, and in particular to more abstract theorizing about the normative principles we should aim to implement. I start by separating the mathematical model used to prove the second-best theorem from its familiar economic interpretation. I then develop an alternative normative-theoretic interpretation of the model, which yields a novel second-best theorem for idealistic normative theory. My method for developing this interpretation provides a template for developing additional interpretations that can extend the reach of the second-best theorem beyond normative theoretical domains. I also show how, within any domain, the implications of the second-best theorem are more specific than is typically thought. I conclude with some brief remarks on the value of mathematical models for conceptual exploration.
In this paper, I present a general theory of topological explanations, and illustrate its fruitfulness by showing how it accounts for explanatory asymmetry. My argument is developed in three steps. In the first step, I show what it is for some topological property A to explain some physical or dynamical property B. Based on that, I derive three key criteria of successful topological explanations: a criterion concerning the facticity of topological explanations, i.e. what makes them true of a particular system; a criterion for describing counterfactual dependencies in two explanatory modes, i.e. the vertical and the horizontal; and, finally, a third, perspectival one that tells us when to use the vertical and when to use the horizontal mode. In the second step, I show how this general theory of topological explanations accounts for explanatory asymmetry in both the vertical and horizontal explanatory modes. Finally, in the third step, I argue that this theory is universally applicable across biological sciences, which helps to unify essential concepts of biological networks.
A general framework for translating various logical systems is presented, including a set of partial unary operators of affirmation and negation. Despite its usual reading, affirmation is not redundant in any domain of values where it does not behave like a full mapping. After depicting the process of partial functions, a number of logics are translated through a variety of affirmations and a unique pair of negations. This relies upon two preconditions: a deconstruction of truth-values as ordered and structured objects, unlike their mainstream presentation as simple objects; and a redefinition of the Principle of Bivalence as a set of four independent properties, such that its definition does not equate with normality.
Hossack's 'The Metaphysics of Knowledge' develops a theory of facts, entities in which universals are combined with universals or particulars, as the foundation of his metaphysics. While Hossack argues at length that there must be negative facts, facts in which the universal 'negation' is combined with universals or particulars, his conclusion that there are also general facts, facts in which the universal 'generality' is combined with universals, is reached rather more swiftly. In this paper I present Hossack with three arguments for his conclusion. They all draw, as does Hossack's theory of facts, on views Russell expressed in various writings. Two arguments are based on Russell's explanation of universals as aspects of resemblance; the third on Russell's observation that general propositions do not follow logically from exclusively particular premises. Comparison with other metaphysics of generality shows them to be wanting and Russell's and Hossack's accounts superior.
Metacognition is the capacity to evaluate the success of one's own cognitive processes in various domains; for example, memory and perception. It remains controversial whether metacognition relies on a domain-general resource that is applied to different tasks or if self-evaluative processes are domain specific. Here, we investigated this issue directly by examining the neural substrates engaged when metacognitive judgments were made by human participants of both sexes during perceptual and memory tasks matched for stimulus and performance characteristics. By comparing patterns of fMRI activity while subjects evaluated their performance, we revealed both domain-specific and domain-general metacognitive representations. Multivoxel activity patterns in anterior prefrontal cortex predicted levels of confidence in a domain-specific fashion, whereas domain-general signals predicting confidence and accuracy were found in a widespread network in the frontal and posterior midline. The demonstration of domain-specific metacognitive representations suggests the presence of a content-rich mechanism available to introspection and cognitive control.
Multiple generality has long been known to cause confusion. For example, “Everyone has a donkey that is running” has two readings: either (i) there is a donkey, owned by everyone, and it is running; or (ii) everyone owns some donkey or other, and all such donkeys run. Medieval logicians were acutely aware of such ambiguities, and the logical problems they pose, and sought to sort them out. One of the most ambitious undertakings in this regard is a pair of massive diagrams (magnae figurae) which map out the logical interrelations of two sets of doubly-general forms. These appear in a fourteenth-century MS of John Buridan’s Summulae de Propositionibus. In this paper, I present these diagrams, and determine the truth conditions of their different forms. To that end, I have developed a bespoke system of diagrams to display their truth conditions. As we will see, such forms present significant difficulties for an all-encompassing account of the role form plays in logic. Accordingly, they can tell us important things about the role logical form plays in Buridan’s account of logical foundations.
The paper discusses the philosophical conclusions which the interrelation between quantum mechanics and general relativity implies by quantum measure. Quantum measure is three-dimensional, both universal as the Borel measure and complete as the Lebesgue one. Its unit is a quantum bit (qubit), which can be considered as a generalization of the unit of classical information, a bit. It allows quantum mechanics to be interpreted in terms of quantum information, and all physical processes to be seen as informational in a generalized sense. This implies a fundamental connection between the physical and material, on the one hand, and the mathematical and ideal, on the other hand. Quantum measure unifies them by a common and joint informational unit. Furthermore, the approach clarifies philosophically how quantum mechanics and general relativity can be understood correspondingly as the holistic and temporal aspect of one and the same, the state of a quantum system, e.g. that of the universe as a whole. The key link between them is the notion of the Bekenstein bound as well as that of quantum temperature. General relativity can be interpreted as a special particular case of quantum gravity. All the principles laid down by Einstein (1918) reduce the latter to the former. Consequently, their generalization and therefore violation addresses directly a theory of quantum gravity. Quantum measure newly reinterprets the “Big Bang” theories about the beginning of the universe. It measures jointly any quantum leap and smooth motion complementary to each other and thus the jump-like initiation of anything and the corresponding continuous process of its appearance. Quantum measure unifies the “Big Bang” and the whole visible expansion of the universe as two complementary “halves” of one and the same, the set of all states of the universe as a whole. It offers a scientific viewpoint on the “creation from nothing”.
The generalized Darwinian research programme accepts physicalism, but holds that all life is purposive in character. It seeks to understand how and why all purposiveness has evolved in the universe – especially purposiveness associated with what we value most in human life, such as sentience, consciousness, person-to-person understanding, science, art, freedom, love. As evolution proceeds, the mechanisms of evolution themselves evolve to take into account the increasingly important role that purposive action can play - especially when quasi-Lamarckian evolution by cultural means comes into existence. This programme of research brings together, into a coherent field of inquiry, aspects of such diverse fields of research as orthodox Darwinian theory (given its purposive interpretation), the study of animal behaviour, palaeontology, archaeology, history, anthropology, psycho-neurology, artificial intelligence, psychology, sociology, philosophy, linguistics, semantics, history and philosophy of science, and history and philosophy of inquiry more generally (the history and philosophy of ideas and culture). The great advantage of the generalized Darwinian research programme is that it provides a framework for understanding the deeds, achievements and experiences of people in a way that is compatible with the kind of knowledge and understanding achieved in the physical sciences, without being reducible to such knowledge and understanding. It promises to enable us to understand ourselves as a part of the biological domain without our humanity, our distinctive human value, being in any way denied: persons are not reduced to animals, nor are animals misconceived to be persons. It holds out the hope that we can come to understand the human world as an integral part of the natural world without the meaning and value of the human world being thereby conceptually annihilated. The programme specifies in general terms what we must seek to do in order to develop a coherent understanding of nature and of ourselves which does justice to the character of both.
Reconstructability analysis (RA) decomposes wholes, namely data in the form either of set theoretic relations or multivariate probability distributions, into parts, namely relations or distributions involving subsets of variables. Data is modeled and compressed by variable-based decomposition, by more general state-based decomposition, or by the use of latent variables. Models, which specify the interdependencies among the variables, are selected to minimize error and complexity.
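As a toy illustration of the variable-based case (a sketch of my own, with assumed names rather than RA notation): approximate a two-variable joint distribution by the product of its marginals, and measure the error of the decomposition by the Kullback-Leibler divergence.

```python
import numpy as np

# Joint distribution p(X, Y) over two binary variables.
joint = np.array([[0.30, 0.10],
                  [0.10, 0.50]])

# Variable-based decomposition into independent parts: p(X) * p(Y).
indep = np.outer(joint.sum(axis=1), joint.sum(axis=0))

# Error of the decomposition: information lost, in bits.
kl = np.sum(joint * np.log2(joint / indep))
print(kl)  # > 0 here: X and Y are interdependent, so this simplest model loses information
```

A model search in this spirit trades off such error against the complexity of the interdependencies a model retains.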
I present and defend the generalized selected effects theory (GSE) of function. According to GSE, the function of a trait consists in the activity that contributed to its bearer’s differential reproduction, or differential retention, within a population. Unlike the traditional selected effects (SE) theory, it does not require that the functional trait helped its bearer reproduce; differential retention is enough. Although the core theory has been presented previously, I go significantly beyond those presentations by providing a new argument for GSE and defending it from a recent objection. I also sketch its implications for teleosemantics and philosophy of medicine.
Epistemic rationality is typically taken to be immodest at least in this sense: a rational epistemic state should always take itself to be doing at least as well, epistemically and by its own lights, as any alternative epistemic state. If epistemic states are probability functions and their alternatives are other probability functions defined over the same collection of propositions, we can capture the relevant sense of immodesty by claiming that epistemic utility functions are (strictly) proper. In this paper I examine what happens if we allow for the alternatives to an epistemic state to include probability functions with different domains. I first prove an impossibility result: on minimal assumptions, I show that there is no way of vindicating strong immodesty principles to the effect that any probability function should take itself to be doing at least as well as any alternative probability function, regardless of its domain. I then consider alternative, weaker generalizations of the traditional immodesty principle and prove some characterization results for some classes of epistemic utility functions satisfying each of the relevant principles.
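The notion of (strict) propriety invoked here can be checked numerically for the Brier score; a minimal sketch, with names of my own choosing:

```python
import numpy as np

def expected_brier(p, q):
    """Expected Brier penalty, by the lights of credence p, for reporting q."""
    return p * (1 - q) ** 2 + (1 - p) * q ** 2

p = 0.7
reports = np.linspace(0, 1, 101)
best = reports[np.argmin([expected_brier(p, q) for q in reports])]
print(best)  # 0.7: the credence expects itself to do at least as well as any rival
```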
Assuming that votes are independent, the epistemically optimal procedure in a binary collective choice problem is known to be a weighted supermajority rule with weights given by personal log-likelihood-ratios. It is shown here that an analogous result holds in a much more general model. Firstly, the result follows from a more basic principle than expected-utility maximisation, namely from an axiom (Epistemic Monotonicity) which requires neither utilities nor prior probabilities of the ‘correctness’ of alternatives. Secondly, a person’s input need not be a vote for an alternative; it may be any type of input, for instance a subjective degree of belief or probability of the correctness of one of the alternatives. The case of a profile of subjective degrees of belief is particularly appealing, since here no parameters such as competence parameters need to be known.
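For the classical special case alluded to in the first sentence, a minimal sketch (the setup and names below are my assumptions, not the paper's general model): with independent voters of known competences p_i, each vote is weighted by its log-likelihood-ratio log(p_i / (1 - p_i)).

```python
from math import log

def optimal_decision(votes, competences):
    """votes: +1/-1 per person; competences: each p_i in (0.5, 1)."""
    score = sum(v * log(p / (1 - p)) for v, p in zip(votes, competences))
    return +1 if score > 0 else -1

# A single highly competent dissenter can outweigh two mediocre voters:
print(optimal_decision([+1, +1, -1], [0.6, 0.6, 0.9]))  # -1
```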
This paper offers a modification of Fabrice Correia's and Alexander Skiles' ("Grounding, Essence, and Identity") definition of grounding in terms of generalized identity that extends it to zero-grounding. This definition promises to improve our understanding of zero-grounding by capturing it within the framework of generalized identity and allows an essentialist theory of modality based on Correia's and Skiles' account to resist a recent challenge by Jessica Leech. The latter is achieved by combining the following two ideas: (1) Some necessities are grounded in truths about zero-grounding, and (2) at least some identity truths are zero-grounded. Finally, some advantages of the zero-grounding approach over Correia's and Skiles' recent definition of necessity in terms of generalized identity and logical consequence are argued for.
Examining the significance of the General’s enlightenment in the Platform Sutra, this article clarifies the fundamental role that emotions play in the development of one’s spiritual understanding. In order to do so, this article emphasizes that the way to enlightenment implicit in the story of the General and the Master involves first granting negative emotions a means for productive expression. By acting as a preparatory measure for calming the mind and surrendering control over it, human passions become a necessary, antecedent condition to wisdom—a conclusion that this article argues is a major, and sometimes underappreciated, lesson embedded in the teachings of the Sixth Patriarch.
Abstract. The theory-change epistemological model, tried on the Maxwellian revolution and the genesis of special relativity, is unfolded to apprehend the genesis of general relativity. It is shown that the dynamics of general relativity (GR) construction was largely governed by internal tensions between special relativity and Newton’s theory of gravitation. The encounter of research traditions engendered construction of a hybrid domain, at first with an irregular set of theoretical models. However, step by step, on revealing and gradually eliminating the contradictions between the models involved, the hybrid set was put into order with the help of the equivalence principle. A hierarchy of theoretical models, starting from the crossbreeds and up to usual hybrids, was moulded. The claim put forward is that Einstein’s unification design could be successfully implemented because his programme embraced the ideas of the Nordström research programme, as well as the presuppositions of the programme of Max Abraham. By and large, Einstein’s victory over his rivals became possible because the core of his research strategy was formed by the equivalence principle comprehended in the light of Kantian epistemology. It is stated that the theories of Nordström and Abraham, contrived before November 25, 1915, were not merely scaffolding for the construction of the GR basic model. They remain a part of the whole GR theory that is necessary for its common use. Key words: Einstein, Nordström, Abraham, general relativity.
We present a module-based criterion, i.e. a sufficient condition based on the absolute values of the matrix coefficients, for the convergence of the Gauss–Seidel method (GSM) for a square system of linear algebraic equations: the Generalized Line Criterion (GLC). We prove GLC to be the “most general” module-based criterion and derive, as GLC corollaries, some previously known and also some new criteria for GSM convergence. Although GLC is far more general than the previously known results, its proof is simpler. The results used here are related to recent research in stability of dynamical systems and control of manufacturing systems.
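A minimal sketch of the method itself, under the classical row (line) criterion of strict diagonal dominance, which GLC generalizes; the code is my illustration, not the paper's construction:

```python
import numpy as np

def row_criterion(A):
    """Classical line criterion: strict diagonal dominance by rows.
    A module-based sufficient condition for GSM convergence."""
    d = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - d
    return np.all(off < d)

def gauss_seidel(A, b, tol=1e-10, max_iter=1000):
    """Solve Ax = b by sweeping through the rows, reusing updated entries."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
b = np.array([6.0, 8.0, 4.0])
assert row_criterion(A)    # sufficient (not necessary) for convergence
print(gauss_seidel(A, b))  # agrees with np.linalg.solve(A, b)
```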
According to Intellectualism knowing how to V is a matter of knowing a suitable proposition about a way of V-ing. In this paper, I consider the question of which ways of acting might figure in the propositions which Intellectualists claim constitute the object of knowledge-how. I argue that Intellectualists face a version of the Generality Problem – familiar from discussions of Reliabilism – since not all ways of V-ing are such that knowledge about them suffices for knowledge-how. I consider various responses to this problem, and argue that none are satisfactory.
An examination of the role played by general rules in Hume's positive (nonskeptical) epistemology. General rules for Hume are roughly just general beliefs. The difference between justified and unjustified belief is a matter of the influence of good versus bad general rules, the good general rules being the "extensive" and "constant" ones.
We argue that the extant evidence for Stoic logic provides all the elements required for a variable-free theory of multiple generality, including a number of remarkably modern features that straddle logic and semantics, such as the understanding of one- and two-place predicates as functions, the canonical formulation of universals as quantified conditionals, a straightforward relation between elements of propositional and first-order logic, and the roles of anaphora and rigid order in the regimented sentences that express multiply general propositions. We consider and reinterpret some ancient texts that have been neglected in the context of Stoic universal and existential propositions and offer new explanations of some puzzling features in Stoic logic. Our results confirm that Stoic logic surpasses Aristotle’s with regard to multiple generality, and are a reminder that focusing on multiple generality through the lens of Frege-inspired variable-binding quantifier theory may hamper our understanding and appreciation of pre-Fregean theories of multiple generality.
The aim of this paper is twofold: First, to generalize Quine's epistemology, to show that what Quine refutes for traditional epistemology is not only Cartesian foundationalism and Carnapian reductionism, but also any epistemological program if it takes atomic verificationist semantics or supernaturalism, which are rooted in the linguistic/factual distinction of individual sentences, as its underlying system. Thus, we will see that the range of naturalization in the Quinean sense is not as narrow as his critics think. Second, to normalize Quine's epistemology, to explain in what sense Quinean naturalized epistemology is normative. The reason I maintain that critics miss the point of Quinean naturalized epistemology is that they do not appreciate the close connection between Quine's naturalistic approach and his holistic approach to epistemology. To show this I shall reconstruct Quine's argument for naturalizing epistemology within his systematic philosophy, and focus specifically on his holism and its applications, on which Quine relies both in arguing against traditional epistemology, and in supporting his theses of underdetermination of physical theory and indeterminacy of translation. This is the key to understanding the scope and the normativity of Quine's epistemology. In the conclusion I will point out what the genuine problems are for Quinean naturalized epistemology.
The Generalized Quantifiers Theory, I will argue, in the second half of the last century led to an important rapprochement, relevant both in logic and in linguistics, between logical quantification theories and the semantic analysis of quantification in natural languages. In this paper I concisely illustrate the formal aspects and the theoretical implications of this rapprochement.
The thesis is: the “periodic table” of “dark matter” is equivalent to the standard periodic table of visible matter being entangled. Thus, it is to consist of all possible entangled states of the atoms of chemical elements as quantum systems. In other words, an atom of any chemical element, as a quantum system, i.e. as a wave function, should be represented as a generally non-orthogonal (i.e. entangled) subspace of the separable complex Hilbert space relevant to the system to which the atom at issue is related as a true part of it. The paper follows previous publications of mine stating that “dark matter” and “dark energy” are projections of arbitrarily entangled states on the cognitive “screen” of Einstein’s “Mach’s principle” in general relativity, which postulates that gravitational field can be generated only by mass or energy.
In this paper, I define and study an abstract algebraic structure, the dimensive algebra, which embodies the most general features of the algebra of dimensional physical quantities. I prove some elementary results about dimensive algebras and suggest some directions for future work.
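The familiar algebra being abstracted from can be sketched in a few lines (a toy of mine, not the paper's formalism): quantities multiply freely, with dimension exponents adding, but add only when their dimensions agree.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    value: float
    dims: tuple  # exponents of (length, mass, time), say

    def __mul__(self, other):
        # Multiplication is always defined: exponents add.
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __add__(self, other):
        # Addition is partial: defined only within a single dimension.
        if self.dims != other.dims:
            raise ValueError("cannot add quantities of different dimensions")
        return Quantity(self.value + other.value, self.dims)

metre = Quantity(1.0, (1, 0, 0))
second = Quantity(1.0, (0, 0, 1))
print(metre * second)  # Quantity(value=1.0, dims=(1, 0, 1))
# metre + second       # would raise: dimensionally ill-formed
```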
In this paper, I articulate and argue for a new truthmaker view of ontological commitment, which I call the “General Truthmaker View”: when one affirms a sentence, one is ontologically committed to there being something that makes true the proposition expressed by the sentence. This view comes apart from Quinean orthodoxy in that we are not ontologically committed to the things over which we quantify, and it comes apart from extant truthmaker views of ontological commitment in that we are not ontologically committed to the truthmakers of our sentences.
Penelope Maddy’s Second Philosophy is one of the most well-known approaches in recent philosophy of mathematics. She applies her second-philosophical method to analyze mathematical methodology by reconstructing historical cases in a setting of means-ends relations. However, outside of Maddy’s own work, this kind of methodological analysis has not yet been extensively used and analyzed. In the present work, we will make a first step in this direction. We develop a general framework that allows us to clarify the procedure and aims of the Second Philosopher’s investigation into set-theoretic methodology; provides a platform to analyze the Second Philosopher’s methods themselves; and can be applied to further questions in the philosophy of set theory.
The dynamics of general relativity is encoded in a set of ten differential equations, the so-called Einstein field equations. It is usually believed that Einstein's equations represent a physical law describing the coupling of spacetime with material fields. However, just six of these equations actually describe the coupling mechanism: the remaining four represent a set of differential relations known as Bianchi identities. The paper discusses the physical role that the Bianchi identities play in general relativity, and investigates whether these identities --qua part of a physical law-- highlight some kind of a posteriori necessity in a Kripkean sense. The inquiry shows that general relativistic physics has an interesting bearing on the debate about the metaphysics of the laws of nature.
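The counting behind these claims is standard textbook material and can be stated compactly:

```latex
% Ten field equations couple spacetime with material fields:
\[
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} \;=\; \kappa\, T_{\mu\nu}
\qquad \text{(10 equations)},
\]
% but the contracted Bianchi identities hold identically:
\[
\nabla^{\mu} G_{\mu\nu} \;=\; 0
\qquad \text{(4 identities)},
\]
% leaving $10 - 4 = 6$ equations that independently describe the coupling.
```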
Several decades ago, Wheeler and Misner presented a model of electric charge based on the topological trapping of electric field lines in wormholes. In this paper, which does not argue for or against the "charge without charge" concept, I describe some generalizations of this model which might serve as topological analogs of color charges and electroweak charges.
Although there are significant differences between the philosophy of Mario Bunge and that of Graham Harman, there are also fundamental similarities between them. Both thinkers maintain that it is possible to develop a general theory of objects. The former holds that the theory in question is logico-mathematical, while the latter suggests that it is ontological. In any case, they agree that all objects must be considered, whether they are real or not. Moreover, they suggest that even if no object is to be excluded from the theory, it is necessary to distinguish different types of objects.
This article contains a description of a generalized and constructive formal model for the processes of subjective and creative thinking. According to the author, the algorithm presented in the article is capable of real and arbitrarily complex thinking and is potentially able to report on the presence of consciousness.
This letter was rejected by International Knowledge Press because "we are unable to conclude that these findings would warrant publication in this journal." The letter suggests that dark energy, dark matter and universal expansion are intimately related. However, they aren't viewed as revolutions in cosmology which are essential to a complete understanding of the modern universe. They are instead viewed as properties which need to be added to the cosmos when Einstein's theory of gravity (General Relativity) is apparently still not thoroughly comprehended a little over a century after it was published.
Jeff Paris proves a generalized Dutch Book theorem. If a belief state is not a generalized probability, then one faces ‘sure loss’ books of bets. In Williams I showed that Joyce’s accuracy-domination theorem applies to the same set of generalized probabilities. What is the relationship between these two results? This note shows that both results are easy corollaries of the core result that Paris appeals to in proving his Dutch Book theorem. We see that every point of accuracy-domination defines a Dutch Book, but we only have a partial converse.
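The non-generalized core of the first claim can be seen in a toy computation (my illustration, with assumed names): credences on A and not-A that fail to sum to 1 admit a book of bets that loses in every state.

```python
b = {"A": 0.4, "not-A": 0.4}  # incoherent: 0.4 + 0.4 != 1

def agent_net(credences, a_is_true):
    """Agent sells a $1 bet on each event at a price equal to its credence."""
    prices_received = credences["A"] + credences["not-A"]
    # Exactly one of A, not-A occurs, so the agent pays out exactly $1.
    payout = (1.0 if a_is_true else 0.0) + (0.0 if a_is_true else 1.0)
    return prices_received - payout

for a_is_true in (True, False):
    print(a_is_true, agent_net(b, a_is_true))  # -0.2 either way: sure loss
```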
In this article, a possible generalization of Löb’s theorem is considered. The main result is: let κ be an inaccessible cardinal; then $\neg\mathrm{Con}(\mathrm{ZFC} + \exists\kappa)$.
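For context, the classical theorem being generalized can be stated as follows; this is the standard formulation for a suitable theory T (e.g. one extending PA), not the article's generalization:

```latex
% L\"ob's theorem: if T proves that provability of P implies P, then T proves P.
\[
T \vdash \mathrm{Prov}_T(\ulcorner P \urcorner) \rightarrow P
\quad\Longrightarrow\quad
T \vdash P.
\]
```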
In this paper we present the generalization of neutrosophic rings and neutrosophic fields. We also extend the neutrosophic ideal to the neutrosophic biideal and the neutrosophic N-ideal. We also find some new types of notions which are related to the strong or pure part of neutrosophy. We give a sufficient number of examples to illustrate the theory of neutrosophic birings and neutrosophic N-rings with neutrosophic bifields and neutrosophic N-fields, and display many of their properties in this paper.
I distinguish two ways an ability might be general: (i) an ability might be general in that its possession doesn't entail the possession of an opportunity; (ii) an ability might be general in virtue of pertaining to a wide range of circumstances. I argue that these two types of generality – I refer to them with the terms ‘general’ and ‘generic’, respectively – produce two orthogonal distinctions among abilities. I show that the two types of generality are sometimes run together by those writing on free will and argue that both types of generality are relevant to understanding the modality of abilities.
What does it mean for a general term to be rigid? It is argued by some that if we take general terms to designate their extensions, then almost no empirical general term will turn out to be rigid; and if we take them to designate some abstract entity, such as a kind, then it turns out that almost all general terms will be rigid. Various authors who pursue this line of reasoning have attempted to capture Kripke’s intent by defining a rigid general term as one that applies to the objects in its extension essentially. I argue that this account is significantly mistaken for various reasons: it conflates a metaphysical notion (essentialism) with a semantic one (rigidity); it fails to countenance the fact that any term can be introduced into a language by stipulating that it be a rigid designator; it limits the extension of rigid terms so much that terms such as ‘meter’, ‘rectangle’, ‘truth’, etc. do not turn out to be rigid, when they obviously are; and it wrongly concentrates on the predicative use of a general term in applying a certain test offered by Kripke to determine whether a term is rigid.
Although written in Japanese, 経験的綜合判断の一般的形式 (The General Form of Empirical Synthetic Judgement) pursues the general logical form of Kantian empirical synthetic judgements.
I address a type of circularity threat that arises for the view that we employ general basic logical principles in deductive reasoning. This type of threat has been used to argue that whatever knowing such principles is, it cannot be a fully cognitive or propositional state, otherwise deductive reasoning would not be possible. I look at two versions of the circularity threat and answer them in a way that both challenges the view that we need to apply general logical principles in deductive reasoning and defuses the threat to a cognitivist account of knowing basic logical principles.
For Humeans, many facts—even ones intuitively “about” particular, localized macroscopic parts of the world—turn out to depend on surprisingly global fundamental bases. We investigate some counterintuitive consequences of this picture. Many counterfactuals whose antecedents describe intuitively localized, non-actual states of affairs nevertheless end up involving wide-ranging implications for the global, embedding Humean mosaic. The case of self-undermining chances is a familiar example of this. We examine that example in detail and argue that popular existing strategies such as “holding the laws fixed as laws” or “holding the laws fixed as true” are of no help. Interestingly, we show how a new proposal that draws on the resources of the Mentaculus can yield the right results—but only on the assumption that the Humean can make cross-world identifications. We go on to argue that the Humean cannot make such identifications. We conclude that the root of this trouble is deeper, and its reach broader, than the familiar cases suggest. We think it is very much an open question whether the Humean has sufficient resources to properly conceptualize macroscopic objects or to analyze these “local” counterfactuals.
The way in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity is investigated. Quantum information is defined as the generalization of the concept of information as to the choice among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit, is interpreted as a relevant elementary choice among an infinite set of alternatives, generalizing that of a bit. The invariance to the axiom of choice shared by quantum mechanics is introduced: it constitutes quantum information as the relation of any state unorderable in principle (e.g. any coherent quantum state before measurement) and the same state already well-ordered (e.g. the well-ordered statistical ensemble of the measurement of the quantum system at issue). This allows the classical and quantum time to be equated correspondingly as the well-ordering of any physical quantity or quantities and their coherent superposition. That equating is interpretable as the isomorphism of Minkowski space and Hilbert space. Quantum information is the structure interpretable in both ways and thus underlying their unification. Its deformation is representable correspondingly as gravitation in the deformed pseudo-Riemannian space of general relativity and the entanglement of two or more quantum systems. The standard model studies a single quantum system and thus privileges a single reference frame turning out to be inertial for the generalized symmetry [U(1)]×[SU(2)]×[SU(3)] “gauging” the standard model. As the standard model refers to a single quantum system, it is necessarily linear and thus the corresponding privileged reference frame is necessarily inertial. The Higgs mechanism U(1) → [U(1)]×[SU(2)], already sufficiently confirmed experimentally, describes exactly the choice of the initial position of a privileged reference frame as the corresponding breaking of the symmetry. The standard model defines ‘mass at rest’ linearly and absolutely, but general relativity does so non-linearly and relatively. The “Big Bang” hypothesis is additional, interpreting that position as that of the “Big Bang”. It also serves to reconcile the linear standard model in the singularity of the “Big Bang” with the observed nonlinearity of the further expansion of the universe, described very well by general relativity. Quantum information links the standard model and general relativity in another way, by mediation of entanglement. The linearity and absoluteness of the former and the nonlinearity and relativeness of the latter can be considered as the relation of a whole and the same whole divided into parts entangled in general.
This article outlines a method of collaboration that will manifest a high probability of cumulative and progressive results in science. The method will accomplish this through a division of labour grounded in the order of occurrence of human cognitional operations. The article explores the possibility of a method known as functional specialization, which identifies distinct tasks presently operative in neuroscience. Functional specialization will enhance collaboration within a science as well as initiate implementation of generalized empirical method, which will be achieved through the focus of individual specialties on specific mental operations.
In responding to my article, Quinn raises the question of development in science and scientific method. He picks up on the topic of the last section of my paper, and suggests that “generalized empirical method” will be “coherent with the essential dynamics of scientific progress.” He points out that, if implemented, such an extended method “promises to be a way toward new and practical results”.
This is the editorial for a special volume of JETAI, featuring papers by Omohundro, Armstrong/Sotala/O’Heigeartaigh, T. Goertzel, Brundage, Yampolskiy, B. Goertzel, Potapov/Rodinov, Kornai and Sandberg. If the general intelligence of artificial systems were to surpass that of humans significantly, this would constitute a significant risk for humanity – so even if we estimate the probability of this event to be fairly low, it is necessary to think about it now. We need to estimate what progress we can expect, what the impact of superintelligent machines might be, how we might design safe and controllable systems, and whether there are directions of research that should best be avoided or strengthened.
We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and Jeffrey's rules can be characterized in terms of the same axioms: "responsiveness", which requires that revised beliefs incorporate what has been learnt, and "conservativeness", which requires that beliefs on which the learnt input is "silent" do not change. To illustrate the use of non-Bayesian belief revision in economic theory, we sketch a simple decision-theoretic application.
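A minimal sketch of the two rules (names are mine, not the paper's framework): Jeffrey's rule revises p by a new probability assignment q over a partition, and Bayes's rule is the special case where q puts probability 1 on the learnt event.

```python
def jeffrey_revise(p_H_given, q):
    """p'(H) = sum_i p(H|E_i) * q(E_i) over a partition {E_i}."""
    assert abs(sum(q.values()) - 1.0) < 1e-9
    return sum(p_H_given[E] * q[E] for E in q)

p_H_given = {"E": 0.9, "not-E": 0.2}  # p(H|E) and p(H|not-E)
print(jeffrey_revise(p_H_given, {"E": 1.0, "not-E": 0.0}))  # Bayes: 0.9
print(jeffrey_revise(p_H_given, {"E": 0.7, "not-E": 0.3}))  # Jeffrey: 0.69
```

Both outputs incorporate what has been learnt (responsiveness) while leaving the conditional beliefs p(H|E_i), on which the input is silent, unchanged (conservativeness).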
A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessments and optimizations of pattern recognition, predictions, and detection with the generalized information criterion are discussed. For economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is a theory for data compression and also a theory for matching an objective channel with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
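One simple way to make the forecast idea concrete; the measure below is my illustration of the stated desiderata, not necessarily the paper's definition:

```latex
% Score an outcome x by the log-ratio of the forecast distribution Q to the
% prior P:
\[
I(x) \;=\; \log_2 \frac{Q(x)}{P(x)}.
\]
% I(x) grows as the forecast is more precise (Q concentrated) and the outcome
% more unexpected (P(x) small); if the forecast always conforms with the facts,
% Q(x) = 1 and I(x) = -\log_2 P(x), Shannon's surprisal.
```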
The present paper was written as a contribution to ongoing methodological debates within the NCC project. We focus on the neural correlates of conscious perceptual episodes. Our claim is that the NCC notion, as applied to conscious perceptual episodes, needs to be reconceptualized. It mixes together the processing related to the perceived contents and the neural substrate of consciousness proper, i.e. mechanisms making the perceptual contents conscious. We thus propose that the perceptual NCC be divided into two constitutive subnotions. The main theoretical idea that emerges as a consequence of this reconceptualization is that the neural correlate of a perceptual episode is formed in the neural interaction between content-processing and consciousness-conferring mechanisms. The paper elaborates this distinction, marshals some initial arguments in its favour, and tests it against some of the most debated theories of consciousness.
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
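A minimal sketch of linear pooling on such a general agenda (names and weights are my assumptions): the collective probability of each agenda event is a fixed weighted average of the individuals' probabilities.

```python
def linear_pool(assignments, weights):
    """assignments: one dict (event -> probability) per individual;
    weights: non-negative, summing to 1."""
    events = assignments[0].keys()
    return {e: sum(w * a[e] for w, a in zip(weights, assignments))
            for e in events}

# The agenda need not contain the disjunction of its events:
opinions = [{"rain": 0.7, "rate-increase": 0.2},
            {"rain": 0.5, "rate-increase": 0.6}]
print(linear_pool(opinions, [0.5, 0.5]))  # {'rain': 0.6, 'rate-increase': 0.4}
```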