Informational theories of semantic content have recently been gaining prominence in the debate on the notion of mental representation. In this paper we examine new-wave informational theories which have a special focus on cognitive science. In particular, we argue that these theories face four important difficulties: they do not fully solve the problem of error, fall prey to the wrong distality attribution problem, have serious difficulties accounting for ambiguous and redundant representations, and fail to deliver a metasemantic theory of representation. Furthermore, we argue that these difficulties derive from their exclusive reliance on the notion of information, so we suggest that pure informational accounts should be complemented with functional approaches.
Integrated Information Theory (IIT) identifies consciousness with having a maximum amount of integrated information. But a thing’s having the maximum amount of anything cannot be intrinsic to it, for that depends on how that thing compares to certain other things. IIT’s consciousness, then, is not intrinsic. A mereological argument elaborates this consequence: IIT implies that one physical system can be conscious while a physical duplicate of it is not conscious. Thus, by a common and reasonable conception of intrinsicality, IIT’s consciousness is not intrinsic. It is then argued that to avoid the implication that consciousness is not intrinsic, IIT must abandon its Exclusion Postulate, which prohibits overlapping conscious systems. Indeed, theories of consciousness that attribute consciousness to physical systems should embrace the view that some conscious systems overlap. A discussion of the admittedly counterintuitive nature of this solution, along with some medical and neuroscientific realities that would seem to support it, is included.
The Integrated Information Theory is a leading scientific theory of consciousness, which implies a kind of panpsychism. In this paper, I consider whether IIT is compatible with a particular kind of panpsychism, known as Russellian panpsychism, which purports to avoid the main problems of both physicalism and dualism. I will first show that if IIT were compatible with Russellian panpsychism, it would contribute to solving Russellian panpsychism’s combination problem, which threatens to show that the view does not avoid the main problems of physicalism and dualism after all. I then show that the theories are not compatible as they currently stand, in view of what I call the coarse-graining problem. After I explain the coarse-graining problem, I will offer two possible solutions, each involving a small modification of IIT. Given either of these modifications, IIT and Russellian panpsychism may be fully compatible after all, and jointly enable significant progress on the mind–body problem.
The causal and simulation theories are often presented as very distinct views about declarative memory, their major difference lying in the causal condition. The causal theory states that remembering involves an accurate representation causally connected to an earlier experience. In the simulation theory, remembering involves an accurate representation generated by a reliable memory process. I investigate how to construe detailed versions of these theories that correctly classify memory errors as misremembering or confabulation. Neither causalists nor simulationists have paid attention to memory-conjunction errors, which is unfortunate because both theories have problems with these cases. The source of the difficulty is the background assumption that an act of remembering has one target. I fix these theories for those cases. The resulting versions are closely related when implemented using tools of information theory, differing only in how memory transmits information about the past. The implementation provides us with insights about the distinction between confabulatory and non-confabulatory memory, where memory-conjunction errors have a privileged position.
In this essay we discuss recent attempts to analyse the notion of representation, as it is employed in cognitive science, in purely informational terms. In particular, we argue that recent informational theories cannot accommodate the existence of metarepresentations. Since metarepresentations play a central role in the explanation of many cognitive abilities, this is a serious shortcoming of these proposals.
Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals, e.g., Lewis’s. These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005, 632–657) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston’s account. Then I propose an informational theory of counterfactuals, which deals well with this problem case while maintaining the main advantage of Hiddleston’s account. In addition, the informational theory offers a general theory of backtracking that provides clues for the semantics and epistemology of counterfactuals. I propose that backtracking is reasonable when the state of affairs expressed in the antecedent of a counterfactual transmits less information about an event in the past than the actual state of affairs.
In 1948, Claude Shannon introduced his version of a concept that was core to Norbert Wiener's cybernetics, namely, information theory. Shannon's formalisms include a physical framework, namely a general communication system having six unique elements. Under this framework, Shannon information theory offers two particularly useful statistics, channel capacity and information transmitted. Remarkably, hundreds of neuroscience laboratories subsequently reported such numbers. But how (and why) did neuroscientists adapt a communications-engineering framework? Surprisingly, the literature offers no clear answers. To first answer "how", 115 authoritative peer-reviewed papers, proceedings, books and book chapters were scrutinized for neuroscientists' characterizations of the elements of Shannon's general communication system. Evidently, many neuroscientists attempted no identification of the system's elements. Others identified only a few of Shannon's system's elements. Indeed, the available neuroscience interpretations show a stunning incoherence, both within and across studies. The interpretational gamut implies hundreds, perhaps thousands, of different possible neuronal versions of Shannon's general communication system. The obvious lack of a definitive, credible interpretation makes neuroscience calculations of channel capacity and information transmitted meaningless. To then answer why Shannon's system was ever adapted for neuroscience, three common features of the neuroscience literature were examined: ignorance of the role of the observer, the presumption of "decoding" of neuronal voltage-spike trains, and the pursuit of ingrained analogies such as information, computation, and machine. Each of these factors facilitated a plethora of interpretations of Shannon's system elements. Finally, let us not ignore the impact of these "informational misadventures" on society at large. It is the same impact as scientific fraud.
Chaitin’s incompleteness result related to random reals and the halting probability has been advertised as the ultimate and the strongest possible version of the incompleteness and undecidability theorems. It is argued that such claims are exaggerations.
In the first instance, IIT is formulated as a theory of the physical basis of the 'degree' or 'level' or 'amount' of consciousness in a system. I raise a series of questions about the central explanatory target, the 'degree' or 'level' or 'amount' of consciousness. I suggest it is not at all clear what scientists and philosophers are talking about when they talk about consciousness as gradable. This point is developed in more detail in my paper "What Is the Integrated Information Theory of Consciousness?", Journal of Consciousness Studies 26 (1–2) (2019).
In Cybernetics (1961 Edition), Professor Norbert Wiener noted that "The role of information and the technique of measuring and transmitting information constitute a whole discipline for the engineer, for the neuroscientist, for the psychologist, and for the sociologist". Sociology aside, the neuroscientists and the psychologists inferred "information transmitted" using the discrete summations from Shannon Information Theory. The present author has since scrutinized the psychologists' approach in depth, and found it wrong. The neuroscientists' approach is highly related, but remains unexamined. Neuroscientists quantified "the ability of [physiological sensory] receptors (or other signal-processing elements) to transmit information about stimulus parameters". Such parameters could vary along a single continuum (e.g., intensity), or along multiple dimensions that altogether provide a Gestalt – such as a face. Here, unprecedented scrutiny is given to how 23 neuroscience papers computed "information transmitted" in terms of stimulus parameters and the evoked neuronal spikes. The computations relied upon Shannon's "confusion matrix", which quantifies the fidelity of a "general communication system". Shannon's matrix is square, with the same labels for columns and for rows. Nonetheless, neuroscientists labelled the columns by "stimulus category" and the rows by "spike-count category". The resulting "information transmitted" is spurious, unless the evoked spike-counts are worked backwards to infer the hypothetical evoking stimuli. The latter task is probabilistic and, regardless, requires that the confusion matrix be square. Was it? For these 23 significant papers, the answer is No.
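For readers who want to see what such a computation involves, here is a minimal sketch of the standard mutual-information estimate from a square confusion matrix; it is an illustration, not a reconstruction of any of the 23 surveyed papers, and the function name and toy counts are hypothetical:

```python
import numpy as np

def information_transmitted(confusion):
    """Estimate Shannon's 'information transmitted' (mutual information, in bits)
    from a square confusion matrix whose entry [i, j] counts how often
    sent symbol i was received as symbol j."""
    p = confusion / confusion.sum()           # joint probabilities p(sent, received)
    p_sent = p.sum(axis=1, keepdims=True)     # row marginals (events)
    p_recv = p.sum(axis=0, keepdims=True)     # column marginals (outcomes)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log2(p / (p_sent * p_recv))
    return float(np.nansum(terms))            # empty cells contribute nothing

# Toy data: three sent symbols, occasionally received as a neighbouring symbol.
counts = np.array([[40,  8,  2],
                   [ 6, 38,  6],
                   [ 2,  9, 39]])
print(round(information_transmitted(counts), 3), "bits per symbol")
```

Note that the estimate only makes sense when rows and columns carry the same set of labels, which is exactly the requirement the paper argues the surveyed studies violated.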
Integrated Information Theory (IIT) is one of the most influential theories of consciousness, mainly due to its claim of mathematically formalizing consciousness in a measurable way. However, the theory, as it is formulated, does not account for contextual observations that are crucial for understanding consciousness. Here we put forth three possible difficulties for its current version, which could be interpreted as a trilemma. Either consciousness is contextual or not. If contextual, either IIT needs revisions to its axioms to include contextuality, or it is inconsistent. If consciousness is not contextual, then IIT faces an empirical challenge. Therefore, we argue that IIT in its current version is inadequate.
We propose that measures of information integration can be more straightforwardly interpreted as measures of agency rather than of consciousness. This may be useful to the goals of consciousness research, given how agency and consciousness are “duals” in many (although not all) respects.
This paper investigates the degree to which information theory, and the derived uses that make it work as a metaphor of our age, can be helpful in thinking about God’s immanence and transcendence. We ask when it is possible to say that a consciousness has to be behind the information we encounter. If God is to be thought about as a communicator of information, we need to ask whether a communication system has to pre-exist the divine and impose itself on God. If we want God to be Creator, and not someone who would work like a human being, ‘creating’ will mean sustaining in being as much the channel, the material system, as the message. Is information control? It seems that God’s actions are not going to be informational control of everything. To clarify the issue, we attempt to distinguish two kinds of ‘genialities’ in nature, as a way to evaluate the likelihood of God from nature. We investigate concepts and images of God, in terms of the history of ideas but also in terms of philosophical theology, metaphysics, and religious ontology.
In the first instance, IIT is formulated as a theory of the physical basis of the 'degree' or 'level' or 'amount' of consciousness in a system. In addition, integrated information theorists have tried to provide a systematic theory of how physical states determine the specific qualitative contents of episodes of consciousness: for instance, an experience as of a red and round thing rather than a green and square thing. I raise a series of questions about the central explanatory target, the 'degree' or 'level' or 'amount' of consciousness. I suggest it is not at all clear what scientists and philosophers are talking about when they talk about consciousness as gradable. I also raise some questions about the explanation of qualitative content.
Information Theory, Evolution and The Origin of Life: The Origin and Evolution of Life as a Digital Message: How Life Resembles a Computer, Second Edition. Hubert P. Yockey, 2005, Cambridge University Press, Cambridge: 400 pages, index; hardcover, US $60.00; ISBN: 0-521-80293-8. "The reason that there are principles of biology that cannot be derived from the laws of physics and chemistry lies simply in the fact that the genetic information content of the genome for constructing even the simplest organisms is much larger than the information content of these laws." Yockey in his previous book (1992, 335). In this new book, Information Theory, Evolution and The Origin of Life, Hubert Yockey points out that the digital, segregated, and linear character of the genetic information system has a fundamental significance. If inheritance blended rather than segregated, Darwinian evolution would not occur. If inheritance were analog instead of digital, evolution would also be impossible, because it would be impossible to remove the effect of noise. In this way, life is guided by information, and so information is a central concept in molecular biology. The author presents a picture of how the main concepts of the genetic code were developed. He was able to show that, despite Francis Crick's belief that the Central Dogma is only a hypothesis, the Central Dogma of Francis Crick is a mathematical consequence of the redundant nature of the genetic code. The redundancy arises from the fact that the DNA and mRNA alphabet is formed by triplets of 4 nucleotides, and so the number of letters (triplets) is 64, whereas the proteome alphabet has only 20 letters (20 amino acids), and so the translation from the larger alphabet to the smaller one is necessarily redundant. Except for Tryptophan and Methionine, all amino acids are coded by more than one triplet; therefore, it is undecidable which source code letter was actually sent from mRNA. This proof has a corollary stating that there are no such mathematical constraints for protein-protein communication. With this clarification, Yockey contributes to diminishing the widespread confusion related to such a central concept as the Central Dogma. Thus the Central Dogma prohibits the origin of life "proteins first." Proteins cannot be generated by "self-organization." Understanding this property of the Central Dogma will have a serious impact on research on the origin of life.
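To make the redundancy point concrete, the following minimal sketch (my own illustration, not an example from the book) shows why the codon-to-amino-acid map cannot be inverted uniquely; the handful of assignments used are standard genetic-code entries:

```python
# A few standard codon-to-amino-acid assignments (illustrative subset of the 64).
codon_to_aa = {
    "UUU": "Phe", "UUC": "Phe",                              # two codons, one amino acid
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",  # fourfold degenerate
    "AUG": "Met",                                            # unique
    "UGG": "Trp",                                            # unique
}

# Invert the map: each amino acid lists every codon that could have produced it.
aa_to_codons = {}
for codon, aa in codon_to_aa.items():
    aa_to_codons.setdefault(aa, []).append(codon)

for aa, codons in sorted(aa_to_codons.items()):
    print(f"{aa}: {codons}")
# Seeing 'Ala' in a protein leaves four equally possible source codons,
# which is the sense in which the source letter is undecidable from the protein alone.
```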
In this paper, we take a meta-theoretical stance and aim to compare and assess two conceptual frameworks that endeavor to explain phenomenal experience. In particular, we compare Feinberg & Mallatt’s Neurobiological Naturalism (NN) and Tononi and colleagues' Integrated Information Theory (IIT), given that the former pointed out some similarities between the two theories (Feinberg & Mallatt 2016c-d). To probe their similarity, we first give a general introduction to both frameworks. Next, we expound a ground plan for carrying out our analysis. We move on to articulate a philosophical profile of NN and IIT, addressing their ontological commitments and epistemological foundations. Finally, we compare the two point-by-point, also discussing how they stand on the issue of artificial consciousness.
The Integrated Information Theory of consciousness (IIT) claims that consciousness is identical to maximal integrated information, or maximal Φ. One objection to IIT is based on what may be called the intrinsicality problem: consciousness is an intrinsic property, but maximal Φ is an extrinsic property; therefore, they cannot be identical. In this paper, I show that this problem is not unique to IIT, but rather derives from a trilemma that confronts almost any theory of consciousness. Given most theories of consciousness, the following three claims are inconsistent. INTRINSICALITY: Consciousness is intrinsic. NON-OVERLAP: Conscious systems do not overlap with other conscious systems (à la Unger’s problem of the many). REDUCTIONISM: Consciousness is constituted by more fundamental properties (as per standard versions of physicalism and Russellian monism). In view of this, I will consider whether rejecting INTRINSICALITY is necessarily less plausible than rejecting NON-OVERLAP or REDUCTIONISM. I will also consider whether IIT is necessarily committed to rejecting INTRINSICALITY or whether it could also accept solutions that reject NON-OVERLAP or REDUCTIONISM instead. I will suggest that the best option for IIT may be a solution that rejects REDUCTIONISM rather than INTRINSICALITY or NON-OVERLAP.
The problem addressed in this paper is “the main epistemic problem concerning science”, viz. “the explication of how we compare and evaluate theories [...] in the light of the available evidence” (van Fraassen 1983, 27).
The way in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity is investigated. Quantum information is defined as the generalization of the concept of information to choices among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit, is interpreted as a relevant elementary choice among an infinite set of alternatives, generalizing that of a bit. The invariance to the axiom of choice shared by quantum mechanics is introduced: it constitutes quantum information as the relation of any state unorderable in principle (e.g. any coherent quantum state before measurement) and the same state already well-ordered (e.g. the well-ordered statistical ensemble of the measurement of the quantum system at issue). This allows the classical and quantum time to be equated, correspondingly, as the well-ordering of any physical quantity or quantities and their coherent superposition. That equating is interpretable as the isomorphism of Minkowski space and Hilbert space. Quantum information is the structure interpretable in both ways and thus underlying their unification. Its deformation is representable correspondingly as gravitation in the deformed pseudo-Riemannian space of general relativity and as the entanglement of two or more quantum systems. The standard model studies a single quantum system and thus privileges a single reference frame, which turns out to be inertial for the generalized symmetry [U(1)]×[SU(2)]×[SU(3)] “gauging” the standard model. As the standard model refers to a single quantum system, it is necessarily linear and thus the corresponding privileged reference frame is necessarily inertial. The Higgs mechanism U(1) → [U(1)]×[SU(2)], already sufficiently confirmed experimentally, describes exactly the choice of the initial position of a privileged reference frame as the corresponding breaking of the symmetry. The standard model defines ‘mass at rest’ linearly and absolutely, but general relativity does so non-linearly and relatively. The “Big Bang” hypothesis additionally interprets that position as that of the “Big Bang”. It also serves to reconcile the linear standard model in the singularity of the “Big Bang” with the observed nonlinearity of the further expansion of the universe, described very well by general relativity. Quantum information links the standard model and general relativity in another way, by mediation of entanglement. The linearity and absoluteness of the former and the nonlinearity and relativeness of the latter can be considered as the relation of a whole and the same whole divided into parts entangled in general.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
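For orientation, the two quantities at issue can be written compactly; the notation below is mine, and the quantum form is stated as the density-matrix analogue standardly used in this literature rather than as a quotation from the paper:

$$
h(\pi) \;=\; \frac{|\operatorname{dit}(\pi)|}{|U|^{2}} \;=\; 1-\sum_{B\in\pi} p_{B}^{2},
\qquad
h(\rho) \;=\; 1-\operatorname{tr}\!\left(\rho^{2}\right),
$$

where $\operatorname{dit}(\pi)$ is the set of ordered pairs of elements of $U$ lying in distinct blocks of the partition $\pi$, $p_{B}=|B|/|U|$, and $\rho$ is a density matrix; a non-selective projective measurement can only lower $\operatorname{tr}(\rho^{2})$, so it can only increase $h(\rho)$.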
The Integrated Information Theory (IIT), first formulated in 2004 by the neuroscientist Giulio Tononi, is a theoretical framework aiming to scientifically explain phenomenal consciousness. The IIT is presented in the first part of this work. Broadly speaking, integrated information is an abstract quantitative measure of the causal power a system has on itself. The main claim of IIT is the identity between informational structures and experience. The nature of this identity will be the subject of the second part. One can interpret the IIT as a fundamental law of nature connecting the physical domain to the mental domain. The philosophical implications of such a claim are numerous and they are subject to criticism. This will be the main concern of the third and last part.
Perceptual filling-in for vision is the insertion of visual properties (e.g., color, contour, luminance, or motion) into one’s visual field, when those properties have no corresponding retinal input. This paper introduces and provides preliminary empirical support for filled/non-filled pairs, pairs of images that appear identical, yet differ by amount of filling-in. It is argued that such image pairs are important to the experimental testing of theories of consciousness. We review recent experimental research and conclude that filling-in involves brain activity with relatively high integrated information (Phi) compared to veridical visual perceptions. We then present filled/non-filled pairs as an empirical challenge to the integrated information theory of consciousness, which predicts that phenomenologically identical experiences depend on brain processes with identical Phi.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets, so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., u and u′ being in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions made by the partition normalized by the total number |U|² of ordered pairs from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions".
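A worked toy case (my own, to illustrate the counting described above): take $U=\{1,2,3,4\}$ and the partition $\pi=\{\{1,2\},\{3,4\}\}$. Then

$$
|\operatorname{dit}(\pi)| = 2\times(2\cdot 2)=8,\qquad
h(\pi)=\frac{8}{|U|^{2}}=\frac{8}{16}=\tfrac12=1-\Big(\big(\tfrac12\big)^{2}+\big(\tfrac12\big)^{2}\Big),\qquad
H(\pi)=-2\cdot\tfrac12\log_{2}\tfrac12=1\ \text{bit},
$$

so the logical entropy is the probability that two independent draws from $U$ land in different blocks, while the Shannon entropy counts the binary partitions needed to make those same distinctions.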
What does it feel like to be a bat? Is conscious experience of echolocation closer to that of vision or audition? Or do bats process echolocation nonconsciously, such that they do not feel anything about echolocation? This famous question of bats' experience, posed by the philosopher Thomas Nagel in 1974, clarifies the difficult nature of the mind–body problem. Why a particular sense, such as vision, has to feel like vision, but not like audition, is totally puzzling. This is especially so given that any conscious experience is supported by neuronal activity. Activity of a single neuron appears fairly uniform across modalities and even similar to that for non-conscious processing. Without any explanation of why a particular sense has to feel the way it does, researchers cannot approach the question of the bats' experience. Is there any theory that gives us hope for such an explanation? Currently, probably none, except for one. Integrated information theory has the potential to offer a plausible explanation. IIT essentially claims that any system that is composed of causally interacting mechanisms can have conscious experience. And precisely how the system feels is determined by the way the mechanisms influence each other in a holistic way. In this article, I will give a brief explanation of the essence of IIT. Further, I will briefly provide a potential scientific pathway to approach bats' conscious experience and its philosophical implications. If IIT, or its improved or related versions, is validated enough, the theory will gain credibility. When it matures enough, predictions from the theory, including the nature of bats' experience, will have to be accepted. I argue that a seemingly impossible question about bats' consciousness will drive empirical and theoretical consciousness research to make big breakthroughs, in a similar way as an impossible question about the age of the universe has driven modern cosmology.
A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessments and optimizations of pattern recognition, predictions, and detection with the generalized information criterion are discussed. For economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is a theory for data compression and also a theory for matching an objective channel with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
In the last decade, Giulio Tononi has developed the Integrated Information Theory (IIT) of consciousness. IIT postulates that consciousness is equal to integrated information (Φ). The goal of this paper is to show that IIT fails in its stated goal of quantifying consciousness. The paper will challenge the theoretical and empirical arguments in support of IIT. The main theoretical argument for the relevance of integrated information to consciousness is the principle of information exclusion. Yet, no justification is given to support this principle. Tononi claims there is significant empirical support for IIT, but this is called into question by the creation of a trivial theory of consciousness with equal explanatory power. After examining the theoretical and empirical evidence for IIT, arguments from philosophy of mind and epistemology will be examined. Since IIT is not a form of computational functionalism, it is vulnerable to fading/dancing qualia arguments. Finally, the limitations of the phenomenological approach to studying consciousness are examined, and it will be shown that IIT is a theory of protoconsciousness rather than a theory of consciousness.
The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight into the formal relationship between cognition and neural activity.
Shannon’s information theory has been a popular component of first-order cybernetics. It quantifies information transmitted in terms of the number of times a sent symbol is received as itself, or as another possible symbol. Sent symbols were events and received symbols were outcomes. Garner and Hake reinterpreted Shannon, describing events and outcomes as categories of a stimulus attribute, so as to quantify the information transmitted in the psychologist’s category (or absolute judgment) experiment. There, categories are represented by specific stimuli, and the human subject must assign those stimuli, singly and in random order, to the categories that they represent. Hundreds of computations ensued of information transmitted and its alleged asymptote, the sensory channel capacity. The present paper critically re-examines those estimates. It also reviews estimates of memory capacity from memory experiments. It concludes that absolute judgment is memory-limited and that channel capacities are actually memory capacities. In particular, there are factors that affect absolute judgment that are not explainable within Shannon’s theory, factors such as feedback, practice, motivation, and stimulus range, as well as the anchor effect, sequential dependences, the rise in information transmitted with the increase in number of stimulus dimensions, and the phenomena of masking and stimulus duration dependence. It is recommended that absolute judgments be abandoned, because there are already many direct estimates of memory capacity.
Information flow in a system is a core cybernetics concept. It has been used frequently in Sensory Psychology since 1951. There, Shannon Information Theory was used to calculate "information transmitted" in "absolute identification" experiments involving human subjects. Originally, in Shannon's "system", any symbol received ("outcome") is among the symbols sent ("events"). Not all symbols are received as transmitted, hence an indirect noise measure is calculated, "information transmitted", which requires knowing the confusion matrix, its columns labeled by "event" and its rows labeled by "outcome". Each matrix entry is dependent upon the frequency with which a particular outcome corresponds to a particular event. However, for the sensory psychologist, stimulus intensities are "events"; the experimenter partitions the intensity continuum into ranges called "stimulus categories" and "response categories", such that each confusion-matrix entry represents the frequency with which a stimulus from a stimulus category falls within a particular response category. Of course, a stimulus evokes a sensation, and the subject's immediate memory of it is compared to the memories of sensations learned during practice, to make a categorization. Categorizing thus introduces "false noise", which is only removed if categorizations can be converted back to their hypothetical evoking stimuli. But sensations and categorizations are both statistically distributed, and the stimulus that corresponds to a given mean categorization cannot be known from only the latter; the relation of intensity to mean sensation, and of mean sensation to mean categorization, are needed. Neither, however, is presently knowable. This is a quandary, which arose because sensory psychologists ignored a ubiquitous component of Shannon's "system", the uninvolved observer, who calculates "information transmitted". Human sensory systems, however, are within de facto observers, making "false noise" inevitable.
The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient that it be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, are not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science’s sense of “verification” and “validation”); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of “proxy”) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of “commutation”); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers not just systems users. In the course of the article all technical expressions from computer science are explained.
In this conceptual paper, the traditional conceptualization of sustainable entrepreneurship is challenged because of a fundamental tension between processes involved in sustainable development and processes involved in entrepreneurship: the concept of sustainable business models contains a paradox, because sustainability involves the reduction of information asymmetries, whereas entrepreneurship involves enhanced and secured levels of information asymmetries. We therefore propose a new and integrated theory of sustainable entrepreneurship that overcomes this paradox. The basic argument is that environmental problems have to be conceptualized as wicked problems or sustainability-related ecosystem failures. Because all actors involved in the entrepreneurial process are characterized by their epistemic insufficiency regarding the solving of these problems, the role of information in the sustainable entrepreneurial process changes. On the one hand, the reduction of information asymmetries primarily aims to enable actors to become critical of sustainable entrepreneurs’ actual business models. On the other hand, the epistemic insufficiency of sustainable entrepreneurs guarantees that information asymmetries remain as a source of new sustainable business opportunities. Three further characteristics of sustainable entrepreneurs are distinguished: sustainability and entrepreneurship-related risk-taking; sustainability and entrepreneurship-related self-efficacy; and the development of satisficing and open-ended solutions, together with multiple stakeholders.
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-inspired model of the human mental lexicon. This model is currently being experimentally investigated and we present a preliminary set of pilot data suggesting that concept combinations can indeed behave non-separably.
This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what generally counts as semantic information. After a brief introduction, section two outlines the semantic paradox implied by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of semantic information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
Punctuation has so far attracted attention within the linguistics community mostly from a syntactic perspective. In this paper, we give a preliminary account of the information-based aspects of punctuation, drawing our points from assorted, naturally occurring sentences. We present our formal models of these sentences and the semantic contributions of punctuation marks. Our formalism is a simplified analogue of an extension --- due to Nicholas Asher --- of Discourse Representation Theory.
Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component to any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with The Institute for Formal Ontology and Medical Information Sciences (IFOMIS), are working towards developing and implementing just such a theory, combining the open software architecture of L&C’s LinkSuite™ with the philosophical rigor of IFOMIS’s Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.
The objective of the paper is to explore the issue that, despite the absence of adequate formal and systematic ways for poor and disadvantaged people to get access to health benefits as in a rich liberal society, there are active social customs, feelings, and individual and collective responsibilities among the people that help disadvantaged and poor people to have access to minimum health care facilities in both liberal and non-liberal poor countries. In order to explain the importance and functional contribution of the social norms in this respect, some examples will be illustrated from Bangladesh, which is a poor liberal country. The paper has two sections. In the first section, it will be exhibited how the naturally and socially disadvantaged people in a liberal society get benefits following Rawls's theory of distributive justice. In the second section, it will be shown that in a poor country, where the government has fewer resources to provide enough services to the poor and disadvantaged people, communal feelings and informal social institutions play a vital role in helping the disadvantaged and poor people to get access to health benefits. The traditional social norms impose an indirect sanction on people to come forward to help the worse-off people of the country. It is argued that the Rawlsian theory of distribution does not work properly in these countries; rather, communitarian feeling is more welcome for the overall welfare of the society, and this will be shown in the conclusion of the paper.
Bohm and Hiley suggest that a certain new type of active information plays a key objective role in quantum processes. This paper discusses the implications of this suggestion to our understanding of the relation between the mental and the physical aspects of reality.
An important problem with machine learning is that when the number of labels n>2, it is very difficult to construct and optimize a group of learning functions, and we want optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In Logical Bayesian Inference (LBI), every label’s learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a big enough sample with labels, without preparing different samples for different labels. A group of Channel Matching (CM) algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between the three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved and becomes the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced, or local convergence exists. The CM iteration algorithm needs to combine neural networks for MMI classifications on high-dimensional feature spaces. LBI needs further studies for the unification of statistics and logic.
In this article, I summarise the ontological theory of informational privacy (an approach based on information ethics) and then discuss four types of interesting challenges confronting any theory of informational privacy: (1) parochial ontologies and non-Western approaches to informational privacy; (2) individualism and the anthropology of informational privacy; (3) the scope and limits of informational privacy; and (4) public, passive and active informational privacy. I argue that the ontological theory of informational privacy can cope with such challenges fairly successfully. In the conclusion, I discuss some of the work that lies ahead.
The article develops a correctness theory of truth (CTT) for semantic information. After the introduction, in section two, semantic information is shown to be translatable into propositional semantic information (i). In section three, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in section four, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In sections five and six, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates (in a Fregean sense) Q by verifying and validating it (in the computer science’s sense of “verification” and “validation”); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of “proxy”) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of “commutation”); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. The last section draws a general conclusion about the nature of CTT as a theory for systems designers not just systems users.
In a recent paper I proposed a novel relativity theory termed Information Relativity (IR). Unlike Einstein's relativity, which dictates as force majeure that relativity is a true state of nature, Information Relativity assumes that relativity results from differences in information about nature between observers who are in motion relative to each other. The theory is based on two axioms: 1. The laws of physics are the same in all inertial frames of reference (special relativity's first axiom); 2. All translations of information from one frame of reference to another are carried by light or by another carrier with equal velocity (information-carrier axiom). For the case of constant relative velocities, I showed in the aforementioned paper that IR accounts successfully for the results of a class of relativistic time results, including the Michelson-Morley "null" result, the Sagnac effect, and the neutrino velocities reported by OPERA and other collaborations. Here I apply the theory, with no alteration, to cosmology. I show that the theory is successful in accounting for several cosmological findings, including the pattern of recession velocity predicted by inflationary theories, the GZK energy suppression phenomenon at redshift z ≈ 1.6, and the amounts of matter and dark energy reported in recent ΛCDM cosmologies.
The merits of set theory as a foundational tool in mathematics stimulate its use in various areas of artificial intelligence, in particular intelligent information systems. In this paper, a study of various nonstandard treatments of set theory from this perspective is offered. Applications of these alternative set theories to information or knowledge management are surveyed.
In the April 2002 edition of JCS I outlined the conscious electromagnetic information field theory, claiming that consciousness is that component of the brain's electromagnetic field that is downloaded to motor neurons and is thereby capable of communicating its informational content to the outside world. In this paper I demonstrate that the theory is robust to criticisms. I further explore implications of the theory particularly as regards the relationship between electromagnetic fields, information, the phenomenology of consciousness and the meaning of free will. Using cemi field theory I propose a working hypothesis that shows, among other things, that awareness and information represent the same phenomenon viewed from different reference frames.
This report reviews what quantum physics and information theory have to tell us about the age-old question, How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit.
By bringing together Dretske’s theory of knowledge, Shannon’s theory of information, and the conceptual framework of statistical physics, this paper explores some of the metaphysical challenges posed by a naturalistic notion of semantic information. It is argued that Dretske’s theory cannot be said to be naturalistically grounded in the world described by classical physics and that Dretske information is not consistent with Shannon information. A possible route to reconciling Dretske’s insights with Shannon’s theory is proposed. Along the way, an attempt is made to clarify several points of possible confusion about the relationships between Dretske information, Shannon information and statistical physics.
Scott Soames has recently argued that traditional accounts of propositions as n-tuples or sets of objects and properties or functions from worlds to extensions cannot adequately explain how these abstract entities come to represent the world. Soames’ new cognitive theory solves this problem by taking propositions to be derived from agents representing the world to be a certain way. Agents represent the world to be a certain way, for example, when they engage in the cognitive act of predicating, or cognizing, an act that takes place during cognitive events, such as perceiving, believing, judging and asserting. On the cognitive theory, propositions just are act types involving the act of predicating and certain other mental operations. This theory, Soames argues, solves not only the problem of how propositions come to represent but also a number of other difficulties for traditional theories, including the problem of de se propositions and the problems of accounting for how agents are capable of grasping propositions and how they come to stand in the relation of expression to sentences. I argue here that Soames’ particular version of the cognitive theory makes two problematic assumptions about cognitive operations and the contents of proper names. I then briefly examine what can count as evidence for the nature of the constituents of the cognitive operation types that produce propositions and argue that the common nature of cognitive operations and what they operate on ought to be determined empirically in cross-disciplinary work. I conclude by offering a semantics for cognitive act types that accommodates one type of empirical evidence.
In earlier papers I described the conscious electromagnetic information (CEMI) field theory, which claimed that the substrate of consciousness is the brain’s electromagnetic (EM) field. I here further explore this theory by examining the properties and dynamics of the information underlying meaning in consciousness. I argue that meaning suffers from a binding problem, analogous to the binding problem described for visual perception, and describe how the gestalt (holistic) properties of meaning give rise to this binding problem. To clarify the role of information in conscious meaning, I differentiate between extrinsic information that is symbolic and arbitrary, and intrinsic information, which preserves structural aspects of the represented object and thereby maintains some gestalt properties of the represented object. I contrast the requirement for a decoding process to extract meaning from extrinsic information, whereas meaning is intrinsic to the structure of the gestalt intrinsic information and does not require decoding. I thereby argue that to avoid the necessity of a decoding homunculus, conscious meaning must be encoded intrinsically — as gestalt information — in the brain. Moreover, I identify fields as the only plausible substrate for encoding gestalt intrinsic information and argue that the binding problem of meaning can only be solved by grounding meaning in this field-based gestalt information. I examine possible substrates for gestalt information in the brain and conclude that the only plausible substrate is the CEMI field.