Imagination seems to play an epistemic role in philosophical and scientific thought experiments, mindreading, and ordinary practical deliberations insofar as it generates new knowledge of contingent facts about the world. However, it also seems that imagination is limited to creative generation of ideas. Sometimes we imagine fanciful ideas that depart freely from reality. The conjunction of these claims is what I call the puzzle of knowledge through imagination. This chapter aims to resolve this puzzle. I argue that imagination has an epistemic role to play, but it is limited to the context of discovery. Imagination generates ideas, but other cognitive capacities must be employed to evaluate these ideas in order for them to count as knowledge. Consideration of the Simulation Theory's so-called "threat of collapse" provides further evidence that imagination does not, on its own, yield new knowledge of contingent facts, and it suggests a way to supplement imagination in order to get such knowledge.
Direct Social Perception (DSP) is the idea that we can non-inferentially perceive others’ mental states. In this paper, I argue that the standard way of framing DSP leaves the debate at an impasse. I suggest two alternative interpretations of the idea that we see others’ mental states: others’ mental states are represented in the content of our perception, and we have basic perceptual beliefs about others’ mental states. I argue that the latter interpretation of DSP is more promising and examine the kinds of mental states that plausibly could satisfy this version of DSP.
Mirror neurons are widely regarded as an important key to social cognition. Despite such wide agreement, there is very little consensus on how or why they are important. The goal of this paper is to clearly explicate the exact role mirror neurons play in social cognition. I aim to answer two questions about the relationship between mirroring and social cognition: What kind of social understanding is involved with mirroring? How is mirroring related to that understanding? I argue that philosophical and empirical considerations lead us to accord a fairly minimal role to mirror neurons in social cognition.
Direct Perception is the view that we can see others' mental states, i.e. that we perceive others' mental states with the same immediacy and directness that we perceive ordinary objects in the world. I evaluate Direct Perception by considering whether we can see intentions, a particularly promising candidate for Direct Perception. I argue that the view equivocates on the notion of intention. Disambiguating the Direct Perception claim reveals a troubling dilemma for the view: either it is banal or highly implausible.
This is a penultimate draft of a paper that will appear in Handbook of Imagination, Amy Kind (ed.), Routledge. Please cite only the final printed version.
Most people think of themselves as pretty good at understanding others’ beliefs, desires, emotions, and intentions. Accurate mindreading is an impressive cognitive feat, and for this reason the philosophical literature on mindreading has focused exclusively on explaining such successes. However, as it turns out, we regularly make mindreading mistakes. Understanding when and how mind misreading occurs is crucial for a complete account of mindreading. In this paper, I examine the conditions under which mind misreading occurs. I argue that these patterns of mind misreading shed light on the limits of mindreading, reveal new perspectives on how mindreading works, and have implications for social epistemology.
We often have affective responses to fictional events. We feel afraid for Desdemona when Othello approaches her in a murderous rage. We feel disgust toward Iago for orchestrating this tragic event. What mental architecture could explain these affective responses? In this paper I consider the claim that the best explanation of our affective responses to fiction involves imaginative desires. Some theorists argue that accounts that do not invoke imaginative desires imply that consumers of fiction have irrational desires. I argue that there are serious worries about imaginative desires that warrant skepticism about the adequacy of the account. Moreover, it is quite difficult to articulate general principles of rationality for desires, and even according to the most plausible of these possible principles, desires about fiction are not irrational.
In this paper, I examine the challenges socially extended minds pose for mainstream, individualistic accounts of social cognition. I argue that individualistic accounts of social cognition neglect phenomena important to social cognition that are properly emphasized by socially extended mind accounts. Although I do not think the evidence or arguments warrant replacing individualistic explanations of social cognition with socially extended explanations, I argue that we have good reason to supplement our individualistic accounts so as to include the ways in which situational context affects social interactions. The result, I hope, is a more sophisticated individualism that offers a more comprehensive account of how we think and act together.
Can phenomenological evidence play a decisive role in accepting or rejecting social cognition theories? Is it the case that a theory of social cognition ought to explain and be empirically supported by our phenomenological experience? There is serious disagreement about the answers to these questions. This paper aims to determine the methodological role of phenomenology in social cognition debates. The following three features are characteristic of evidence capable of playing a substantial methodological role: novelty, reliability, and relevance. I argue that phenomenological evidence lacks all three features and, consequently, should not play a substantial role in debates about social cognition.
In this paper I evaluate embodied social cognition, embodied cognition’s account of how we understand others. I identify and evaluate three claims that motivate embodied social cognition. These claims are not specific to social cognition; they are general hypotheses about cognition. As such, they may be used in more general arguments for embodied cognition. I argue that we have good reasons to reject these claims. Thus, the case for embodied social cognition fails. Moreover, to the extent that general arguments for embodied cognition rest on these premises, they are correspondingly uncompelling.
According to embodied cognition, the philosophical and empirical literature on theory of mind is misguided. Embodied cognition rejects the idea that social cognition requires theory of mind. It regards the intramural debate between the Theory Theory and the Simulation Theory as irrelevant, and it dismisses the empirical studies on theory of mind as ill conceived and misleading. Embodied cognition provides a novel deflationary account of social cognition that does not depend on theory of mind. In this chapter, I describe embodied cognition’s alternative to theory of mind and discuss three challenges it faces.
Empathy is many things to many people. Depending on who you ask, it is feeling what another person feels, feeling bad for another person’s suffering, understanding what another person feels, imagining yourself in another person’s situation and figuring out what you would feel, or your brain activating as if you were experiencing the emotion another person is experiencing. These are just some of the various notions of empathy that are at play in philosophy, cognitive science, neuroscience, developmental psychology, and primatology. In this chapter, we will not stipulate a definition of empathy per se. Instead, we will review the development of empathy and purported mechanisms of empathy, which will allow us to tease apart various dimensions of empathy-related concepts. Understanding the various dimensions of empathy provides context for some recent critiques of empathy as a moral compass and suggests several directions for future fruitful scientific and philosophical work on empathy.
Theory of mind, also known as mindreading, refers to our ability to attribute mental states to agents in order to make sense of and interact with other agents. Recently, theorists in this literature have advanced a broad conception of mindreading. In particular, psychologists and philosophers have examined how we attribute knowledge, intention, mentalistically-loaded stereotypes, and personality traits to others. Moreover, the diversity of our goals in a social interaction – precision, efficiency, self/in-group protection – generates diversity in the mindreading processes we employ. Finally, the products of mindreading are varied, as well. We produce different sorts of mindreading explanations depending on our epistemic goals and the situational context. In this article, I piece together these different strands of research to present a broad conception of mindreading that is complex, messy, and interesting.
Successful athletic performance requires precision in many respects. A batter stands behind home plate awaiting the arrival of a ball that is less than three inches in diameter and moving close to 100 mph. His goal is to hit it with a bat that is also less than three inches in diameter. This impressive feat requires extraordinary temporal and spatial coordination. The sweet spot of the bat must be at the same place, at the same time, as the ball. A basketball player must keep a ball bouncing as she speeds from one end of the court to another, evading defensive players. She may never break pace as she lifts from the ground, throwing the ball fifteen feet toward a hoop that is eighteen inches in diameter. One task facing a psychologist involves explaining how the body does such things within the sometimes very demanding spatial and temporal constraints that a given task imposes. Part of the goal of this chapter is to sketch the commitments of an embodied approach to such an explanation. We shall see that an embodied account of motor skills draws on concepts that depart radically from more traditional cognitivist theories of motor activity. Similarly, because an embodied approach to cognition introduces new ways to understand the human capacity for social interaction, it also promises to shed new light on how athletes coordinate their actions with each other.
Disagreeing with others about how to interpret a social interaction is a common occurrence. We often find ourselves offering divergent interpretations of others’ motives, intentions, beliefs, and emotions. Remarkably, philosophical accounts of how we understand others do not explain, or even attempt to explain, such disagreements. I argue these disparities in social interpretation stem, in large part, from the effect of social categorization and our goals in social interactions, phenomena long studied by social psychologists. I argue we ought to expand our accounts of how we understand others in order to accommodate these data and explain how such profound disagreements arise amongst informed, rational, well-meaning individuals.
In How We Understand Others: Philosophy and Social Cognition, Shannon Spaulding develops a novel account of social cognition with pessimistic implications for mindreading accuracy: according to Spaulding, mistakes in mentalizing are much more common than traditional theories of mindreading commonly assume. In this commentary, I push against Spaulding’s pessimism from two directions. First, I argue that a number of the heuristic mindreading strategies that Spaulding views as especially error prone might be quite reliable in practice. Second, I argue that current methods for measuring mindreading performance are not well-suited for the task of determining whether our mental-state attributions are generally accurate. I conclude that any claims about the accuracy or inaccuracy of mindreading are currently unjustified.
This paper utilizes Robert Smithson's philosophy as a kind of counterpoint, rather than refutation, to many of Hegel's convictions on the nature and function of art in world historical spirit.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets, so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., to u and u′ being in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions made by a partition normalized by the total number of ordered pairs |U|² from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions".
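To fix ideas, the logical-entropy definition shared by the two abstracts above can be computed directly. The following Python sketch is illustrative (the partition is an arbitrary example, not from the papers): it counts distinctions, checks the equivalent 1 - Σp² form, and compares the result with Shannon entropy.

```python
from itertools import product
from math import log2

# An arbitrary partition of U = {0,...,5} into three blocks.
U = range(6)
partition = [{0, 1, 2}, {3, 4}, {5}]

def block_of(u):
    return next(i for i, b in enumerate(partition) if u in b)

# Distinctions (dits): ordered pairs whose elements lie in distinct blocks.
dits = [(u, v) for u, v in product(U, repeat=2) if block_of(u) != block_of(v)]

n = len(U)
logical_entropy = len(dits) / n**2          # |dit(pi)| / |U|^2

# Equivalent form: 1 - sum of squared block probabilities, i.e. the
# probability that two independent draws land in different blocks.
p = [len(b) / n for b in partition]
assert abs(logical_entropy - (1 - sum(q**2 for q in p))) < 1e-12

# Shannon entropy of the same partition, for comparison.
shannon = -sum(q * log2(q) for q in p)

print(f"logical entropy h = {logical_entropy:.4f}")   # 0.6111
print(f"Shannon entropy H = {shannon:.4f} bits")      # 1.4591
```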
A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts. The more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, then the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessments and optimizations of pattern recognition, predictions, and detection with the generalized information criterion are discussed. For economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is a theory for data compression and also a theory for matching an objective channel with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
What is the content of a mental state? This question poses the problem of intentionality: to explain how mental states can be about other things, where being about them is understood as representing them. A framework that integrates predictive coding and signaling systems theories of cognitive processing offers a new perspective on intentionality. On this view, at least some mental states are evaluations, which differ in function, operation, and normativity from representations. A complete naturalistic theory of intentionality must account for both types of intentional state.
This article argues that we ought to reject Gregory Currie’s “Trace Account” of documentary film. According to the Trace Account, a film is a documentary so long as the majority of its constitutive images are traces of the film’s subject matter. The argument proceeds by considering how proponents of the Trace Account could respond to Noel Carroll’s charge that their analysis is radically revisionary. I argue that the only responses available are either implausible or show that a fully worked out version of the Trace Account collapses into Carroll’s own, rival definition of documentary. I then consider how advocates of the Trace Account might attempt to rescue the theory by reframing it as an account of a genre or as a theory of evaluation and argue that neither attempt would succeed. Given this, we ought to embrace Carroll’s own account of documentary, according to which a film is documentary if and only if it is a film of presumptive assertion.
In this article, I consider the possibilities and limitations for testimonial justice in an international criminal courtroom. I begin by exploring the relationship between epistemology and criminal law, and consider how testimony contributes to the goals of truth and justice. I then assess the susceptibility of international criminal courts to the two harms of testimonial injustice: epistemic harm to the speaker, and harm to the truth-seeking process. I conclude that international criminal courtrooms are particularly susceptible to perpetrating testimonial injustice. Hearers in the international criminal courtroom should practice testimonial justice, but the institution is not structured in a way that can prevent every instance of testimonial injustice.
Understanding the mechanics of consciousness remains one of the most important challenges in modern cognitive science. One key step toward understanding consciousness is to associate unconscious physiological processes with subjective experiences of sensory, motor, and emotional contents. This article explores the role of various cellular membrane potential differences and how they give rise to the dynamic infrastructure of conscious experience. This article explains that consciousness is a body-wide, biological process not limited to individual organs because the mind and body are unified as one entity; therefore, no single location of consciousness can be pinpointed. Consciousness exists throughout the entire body, and unified consciousness is experienced and maintained through dynamic repolarization during inhalation and expiration. Extant knowledge is reviewed to provide insight into how differences in cellular membrane potential play a vital role in the triggering of neural and non-neural oscillations. The role of dynamic cellular membrane potentials in the activity of the central nervous system, peripheral nervous system, cardiorespiratory system, and various other tissues (such as muscles and sensory organs) in the physiology of consciousness is also explored. Inspiration and expiration are accompanied by oscillating membrane potentials throughout all cells and play a vital role in subconscious human perception of feelings and states of mind. In addition, the roles of the brainstem, hypothalamus, and complete nervous system (central, peripheral, and autonomic) within the mind-body space combine to allow consciousness to emerge and to come alive. This concept departs from the notion that the brain is the only organ that gives rise to consciousness.
We designed an experiment to explore the learning effectiveness of three different ways of practicing dance movements. To our surprise we found that partial modeling, called marking in the dance world, is a better method than practicing the complete phrase, called practicing full-out; and both marking and full-out are better methods than practicing by repeated mental simulation. We suggest that marking is a form of practicing a dance phrase aspect-by-aspect. Our results also suggest that prior work on learning by observation and learning by mental practice may not scale up to complex movements.
Semantic Information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure the information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon and K. Popper’s thought. The fuzzy set’s probability defined by Zadeh is treated as the logical probability sought by Popper, and the membership grade is treated as the truth-value of a proposition and also as the posterior logical probability. The classical relative information formula (Information = log(Posterior probability / Prior probability)) is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set’s probability. The SIF can be explained as “Information = Testing severity – Relative square deviation” and hence can be used as Popper's information criterion to test scientific theories or propositions. The information measure defined by the SIF also means the spared codeword length, as does the classical information measure. This paper introduces the set-Bayes’ formula, which establishes the relationship between statistical probability and logical probability, derives the Fuzzy Information Criterion (FIC) for the optimization of the semantic channel, and discusses applications of the SIF and FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. Particularly, through a detailed example of reasoning, it is proved that we can improve the semantic channel with proper fuzziness to increase average semantic information to reach its upper limit: Shannon mutual information.
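Read literally, the revision described in this abstract gives SIF(x) = log(membership grade of x / fuzzy set's probability), with the fuzzy set's probability computed, following Zadeh, as the prior-weighted average membership grade. A hypothetical worked example in Python (the rainfall grid, the prior, and the truth function for "heavy rain" are my own illustrative assumptions, not the paper's):

```python
import numpy as np

# Illustrative sketch of the Semantic Information Formula (SIF):
#   I(x; theta) = log2( T(theta|x) / T(theta) )
# where T(theta|x) is the membership grade (truth value) of x in the fuzzy
# set theta, and T(theta) is Zadeh's probability of the fuzzy event theta:
# the prior-weighted average membership grade.

rainfall_mm = np.arange(0, 101, 5)        # hypothetical rainfall grid
prior = np.exp(-rainfall_mm / 20.0)       # hypothetical prior over rainfall
prior /= prior.sum()

def truth_heavy_rain(x_mm):
    # Hypothetical fuzzy truth function of "there will be heavy rain
    # tomorrow": membership rises smoothly around 25 mm/day.
    return 1 / (1 + np.exp(-(x_mm - 25) / 5.0))

# Zadeh's fuzzy-event probability, read here as Popper's logical probability.
logical_prob = np.dot(prior, truth_heavy_rain(rainfall_mm))

def semantic_information(x_mm):
    return np.log2(truth_heavy_rain(x_mm) / logical_prob)   # bits

# A correct bold prediction carries positive information; a clearly wrong
# one carries negative information.
for x in (10, 25, 60):
    print(f"observed rain = {x:3d} mm -> SIF = {semantic_information(x):+.2f} bits")
```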
Information flow in a system is a core cybernetics concept. It has been used frequently in Sensory Psychology since 1951. There, Shannon Information Theory was used to calculate "information transmitted" in "absolute identification" experiments involving human subjects. Originally, in Shannon's "system", any symbol received ("outcome") is among the symbols sent ("events"). Not all symbols are received as transmitted, hence an indirect noise measure is calculated, "information transmitted", which requires knowing the confusion matrix, its columns labeled by "event" and its rows labeled by "outcome". Each matrix entry is dependent upon the frequency with which a particular outcome corresponds to a particular event. However, for the sensory psychologist, stimulus intensities are "events"; the experimenter partitions the intensity continuum into ranges called "stimulus categories" and "response categories", such that each confusion-matrix entry represents the frequency with which a stimulus from a stimulus category falls within a particular response category. Of course, a stimulus evokes a sensation, and the subject's immediate memory of it is compared to the memories of sensations learned during practice, to make a categorization. Categorizing thus introduces "false noise", which is only removed if categorizations can be converted back to their hypothetical evoking stimuli. But sensations and categorizations are both statistically distributed, and the stimulus that corresponds to a given mean categorization cannot be known from only the latter; the relation of intensity to mean sensation, and of mean sensation to mean categorization, are needed. Neither, however, is presently knowable. This is a quandary, which arose because sensory psychologists ignored a ubiquitous component of Shannon's "system", the uninvolved observer, who calculates "information transmitted". Human sensory systems, however, are within de facto observers, making "false noise" inevitable.
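For readers unfamiliar with the calculation this abstract refers to, "information transmitted" is standardly computed as the mutual information of the empirical joint distribution recorded in the confusion matrix. A minimal Python sketch, with hypothetical counts:

```python
import numpy as np

# "Information transmitted" as computed in absolute-identification work:
# the mutual information of the empirical joint distribution in a
# confusion matrix. The counts below are hypothetical.
# Rows = response ("outcome") categories, columns = stimulus ("event") categories.
confusions = np.array([
    [18,  5,  1],
    [ 6, 14,  4],
    [ 1,  6, 20],
], dtype=float)

joint = confusions / confusions.sum()        # p(response, stimulus)
p_stim = joint.sum(axis=0)                   # column marginals p(stimulus)
p_resp = joint.sum(axis=1)                   # row marginals p(response)

# I(S;R) = sum over cells of p(r,s) * log2( p(r,s) / (p(r) p(s)) ),
# skipping empty cells.
it = sum(
    joint[r, s] * np.log2(joint[r, s] / (p_resp[r] * p_stim[s]))
    for r in range(joint.shape[0])
    for s in range(joint.shape[1])
    if joint[r, s] > 0
)
print(f"information transmitted = {it:.3f} bits")
```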
Information is a central notion for cognitive sciences and neurosciences, but there is no agreement on what it means for a cognitive system to acquire information about its surroundings. In this paper, we approximate three influential views on information: the one at play in ecological psychology, which is sometimes called information for action; the notion of information as covariance, as developed by some enactivists; and the idea of information as minimization of uncertainty, as presented by Shannon. Our main thesis is that information for action can be construed as covariant information, and that learning to perceive covariant information is a matter of minimizing uncertainty through skilled performance. We argue that the agent’s cognitive system conveys information for acting in an environment by minimizing uncertainty about how to achieve her intended goals in that environment. We conclude by reviewing empirical findings that support our view and by showing how direct learning, seen as an instance of ecological rationality at work, is how mere possibilities for action are turned into embodied know-how. Finally, we indicate the affinity between direct learning and sense-making activity.
Shannon’s information theory has been a popular component of first-order cybernetics. It quantifies information transmitted in terms of the number of times a sent symbol is received as itself, or as another possible symbol. Sent symbols were events and received symbols were outcomes. Garner and Hake reinterpreted Shannon, describing events and outcomes as categories of a stimulus attribute, so as to quantify the information transmitted in the psychologist’s category (or absolute judgment) experiment. There, categories are represented by specific stimuli, and the human subject must assign those stimuli, singly and in random order, to the categories that they represent. Hundreds of computations ensued of information transmitted and its alleged asymptote, the sensory channel capacity. The present paper critically re-examines those estimates. It also reviews estimates of memory capacity from memory experiments. It concludes that absolute judgment is memory-limited and that channel capacities are actually memory capacities. In particular, there are factors that affect absolute judgment that are not explainable within Shannon’s theory, factors such as feedback, practice, motivation, and stimulus range, as well as the anchor effect, sequential dependences, the rise in information transmitted with the increase in number of stimulus dimensions, and the phenomena of masking and stimulus duration dependence. It is recommended that absolute judgments be abandoned, because there are already many direct estimates of memory capacity.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
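The "two-draw" reading can be made concrete. Classically, logical entropy is 1 - Σpᵢ², the probability that two independent draws fall in distinct blocks; on my reading of this abstract, the quantum version for a density matrix ρ is h(ρ) = 1 - tr(ρ²). A small numpy check with arbitrary example states (the states are my own illustrations):

```python
import numpy as np

# Quantum logical entropy h(rho) = 1 - tr(rho^2), the quantum analogue of
# the classical two-draw probability 1 - sum(p_i^2).

def logical_entropy(rho):
    return 1 - np.trace(rho @ rho).real

# Pure state |+> = (|0> + |1>)/sqrt(2): no distinctions, h = 0.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())

# Maximally mixed qubit: h = 1 - 2*(1/2)^2 = 1/2.
rho_mixed = np.eye(2) / 2

print(logical_entropy(rho_pure))    # ~0.0
print(logical_entropy(rho_mixed))   # 0.5

# A projective measurement in the computational basis zeroes the
# off-diagonal terms; the increase in h tracks the eigenstate pairs newly
# distinguished by the measurement (the "qudits" of the abstract above).
measured = np.diag(np.diag(rho_pure))
print(logical_entropy(measured))    # 0.5: measurement created distinctions
```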
In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do not account fully for the complex phenomenon that the term “information” refers to. We propose a restructuring of the concept into two related, but independent, notions, and conclude that a complete theory of biological information must account not only for both notions but also for the relationship between them.
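The kind of computational experiment described can be approximated in outline. The sketch below uses compressed length as a stand-in complexity measure, which is an assumption on my part; the paper's actual measure and corpora may differ:

```python
import random
import zlib

# A rough complexity proxy: compressed length in bytes. This is an
# illustrative stand-in for the paper's complexity measure.
def complexity(s: str) -> int:
    return len(zlib.compress(s.encode("utf-8"), level=9))

semantic = ("the ribosome reads the messenger rna three bases at a time "
            "and adds the matching amino acid to the growing protein")

# A meaningless control: the same characters, randomly shuffled.
chars = list(semantic)
random.seed(0)
random.shuffle(chars)
meaningless = "".join(chars)

print(complexity(semantic), complexity(meaningless))
# If the two values come out close, that mirrors the abstract's finding:
# compressibility-style measures do not assign "semantic" sequences a
# distinctive complexity value.
```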
The problem of emergence in physical theories makes it necessary to build a general theory of the relationships between the observed system and the observing system. It can be shown that there exists a correspondence between classical systems and computational dynamics according to the Shannon-Turing model. A classical system is an informationally closed system with respect to the observer; this characterizes the emergent processes in classical physics as phenomenological emergence. In quantum systems, the analysis based on computation theory fails. It is here shown that a quantum system is an informationally open system with respect to the observer and able to exhibit processes of observational, radical emergence. Finally, we take into consideration the role of computation in describing the physical world.
Purpose – A key cybernetics concept, information transmitted in a system, was quantified by Shannon. It quickly gained prominence, inspiring a version by Harvard psychologists Garner and Hake for “absolute identification” experiments. There, human subjects “categorize” sensory stimuli, affording “information transmitted” in perception. The Garner-Hake formulation has been in continuous use for 62 years, exerting enormous influence. But some experienced theorists and reviewers have criticized it as uninformative. They could not explain why, and were ignored. Here, the “why” is answered. The paper aims to discuss these issues. Design/methodology/approach – A key Shannon data-organizing tool is the confusion matrix. Its columns and rows are, respectively, labeled by “symbol sent” (event) and “symbol received” (outcome), such that matrix entries represent how often outcomes actually corresponded to events. Garner and Hake made their own version of the matrix, which deserves scrutiny, and is minutely examined here. Findings – The Garner-Hake confusion-matrix columns represent “stimulus categories”, ranges of some physical stimulus attribute (usually intensity), and its rows represent “response categories” of the subject’s identification of the attribute. The matrix entries thus show how often an identification empirically corresponds to an intensity, such that “outcomes” and “events” differ in kind (unlike Shannon’s). Obtaining a true “information transmitted” therefore requires stimulus categorizations to be converted to hypothetical evoking stimuli, achievable (in principle) by relating categorization to sensation to intensity. But those relations are actually unknown, perhaps unknowable. Originality/value – The author achieves an important understanding: why “absolute identification” experiments do not illuminate sensory processes.
In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon’s quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of information for the formulae of these languages and introduce the concept of informational logical consequence, identifying some important results, among them: certain arguments that have traditionally been considered valid, such as modus ponens, are not valid from the informational perspective; the logic underlying informational logical consequence is not classical, and is at the least paraconsistent sensu lato; informational logical consequence is not a Tarskian logical consequence.
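The probabilistic semantics described here can be sketched as follows. The uniform distribution over valuations and the sample formulas are my own illustrative choices, and the paper's precise definition of informational consequence is not reproduced; the sketch only shows how Shannon-style quantities of information attach to formulas:

```python
from itertools import product
from math import log2

# Probabilistic semantics for a two-letter propositional language, with a
# uniform probability over the four valuations of (p, q). The quantity of
# information of a formula is taken, Shannon-style, to be
#   Inf(phi) = log2(1 / P(phi)).
# The paper's exact definitions and its consequence criterion may differ.

valuations = list(product([False, True], repeat=2))   # assignments to (p, q)

def prob(formula):
    satisfying = [v for v in valuations if formula(*v)]
    return len(satisfying) / len(valuations)

def inf(formula):
    return log2(1 / prob(formula))

print(f"Inf(p)           = {inf(lambda p, q: p):.3f} bits")             # 1.000
print(f"Inf(p -> q)      = {inf(lambda p, q: (not p) or q):.3f} bits")  # 0.415
print(f"Inf(q)           = {inf(lambda p, q: q):.3f} bits")             # 1.000
print(f"Inf(p & (p->q))  = "
      f"{inf(lambda p, q: p and ((not p) or q)):.3f} bits")             # 2.000
```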
Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use the sampling distribution (conditional SP function) to test or confirm the fuzzy truth function (conditional LP function). The Semantic Information Measure (SIM) proposed is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the less the LP of a predicate is and the larger the truth value of the proposition is, the more information there is. So the SIM can be used as Popper's information criterion for falsification or testing. The SIM also allows us to optimize the truth-values of counterexamples or degrees of disbelief in a hypothesis to get the optimized degree of belief, i.e., the Degree of Confirmation (DOC). To explain confirmation, this paper 1) provides the calculation method for the DOC of universal hypotheses; 2) discusses how to resolve the Raven Paradox with the new DOC and its increment; 3) derives the DOC of rapid HIV tests: DOC of “+” = 1 - (1 - specificity)/sensitivity, which is similar to the Likelihood Ratio (= sensitivity/(1 - specificity)) but has the upper limit 1; 4) discusses negative DOC for excessive affirmations, wrong hypotheses, or lies; and 5) discusses the DOC of general hypotheses, with GPS as an example.
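The DOC formula quoted in item 3) is easy to check against the likelihood ratio; the sensitivity and specificity values below are illustrative, not from the paper:

```python
# Degree of Confirmation of a positive result, as given in the abstract:
#   DOC(+) = 1 - (1 - specificity) / sensitivity
# versus the positive likelihood ratio:
#   LR(+)  = sensitivity / (1 - specificity)

def doc_positive(sensitivity, specificity):
    return 1 - (1 - specificity) / sensitivity

def lr_positive(sensitivity, specificity):
    return sensitivity / (1 - specificity)

# Hypothetical test characteristics, for illustration only.
for sens, spec in [(0.997, 0.985), (0.90, 0.99), (0.99, 0.999)]:
    print(f"sens={sens:.3f} spec={spec:.3f}  "
          f"DOC(+)={doc_positive(sens, spec):.4f}  "
          f"LR(+)={lr_positive(sens, spec):8.1f}")
# DOC(+) approaches its stated upper limit 1 as false positives vanish,
# while LR(+) grows without bound.
```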
The first use of the term "information" to describe the content of nervous impulse occurs 20 years prior to Shannon's (1948) work, in Edgar Adrian's The Basis of Sensation (1928). Although, at least throughout the 1920s and early 30s, the term "information" does not appear in Adrian's scientific writings to describe the content of nervous impulse, the notion that the structure of nervous impulse constitutes a type of message subject to certain constraints plays an important role in all of his writings throughout the period. The appearance of the concept of information in Adrian's work raises at least two important questions: (i) What were the relevant factors that motivated Adrian's use of the concept of information? (ii) What concept of information does Adrian appeal to, and how can it be situated in relation to contemporary philosophical accounts of the notion of information in biology? The first question involves an account of the application of communications technology in neurobiology as well as the historical and scientific background of Adrian's major scientific achievement, which was the recording of the action potential of a single sensory neuron. The response to the second question involves an explication of Adrian's concept of information and an evaluation of how it may be situated in relation to more contemporary philosophical explications of a semantic concept of information. I suggest that Adrian's concept of information places limitations on the sorts of systems that are referred to as information carriers by causal and functional accounts of information.
This chapter describes the conceptual foundations of cognitive science during its establishment as a science in the 20th century. It is organized around the core ideas of individual agency as its basic explanans and information-processing as its basic explanandum. The latter consists of a package of ideas that provide a mathematico-engineering framework for the philosophical theory of materialism.
Equality and identity. Bulletin of Symbolic Logic 19 (2013) 255–256. (Coauthor: Anthony Ramnauth) Also see https://www.academia.edu/s/a6bf02aaab

This article uses ‘equals’ [‘is equal to’] and ‘is’ [‘is identical to’, ‘is one and the same as’] as they are used in ordinary exact English. In a logically perfect language the oxymoron ‘the numbers 3 and 2+1 are the same number’ could not be said. Likewise, ‘the number 3 and the number 2+1 are one number’ is just as bad from a logical point of view. In normal English these two sentences are idiomatically taken to express the true proposition that ‘the number 3 is the number 2+1’. Another idiomatic convention that interferes with clarity about equality and identity occurs in discussion of numbers: it is usual to write ‘3 equals 2+1’ when “3 is 2+1” is meant. When ‘3 equals 2+1’ is written there is a suggestion that 3 is not exactly the same number as 2+1 but that they merely have the same value. This becomes clear when we say that two of the sides of a triangle are equal if the two angles they subtend are equal or have the same measure.

Acknowledgements: Robert Barnes, Mark Brown, Jack Foran, Ivor Grattan-Guinness, Forest Hansen, David Hitchcock, Spaulding Hoffman, Calvin Jongsma, Justin Legault, Joaquin Miller, Tania Miller, and Wyman Park.

► JOHN CORCORAN AND ANTHONY RAMNAUTH, Equality and identity. Philosophy, University at Buffalo, Buffalo, NY 14260-4150, USA. E-mail: corcoran@buffalo.edu

The two halves of one line are equal but not identical [one and the same]. Otherwise the line would have only one half! Every line equals infinitely many other lines, but no line is [identical to] any other line—taking ‘identical’ strictly here and below. Knowing that two lines equaling a third are equal is useful; the condition “two lines equaling a third” often holds. In fact any two sides of an equilateral triangle are equal to the remaining side! But could knowing that two lines being [identical to] a third are identical be useful? The antecedent condition “two things identical to a third” never holds, nor does the consequent condition “two things being identical”. If two things were identical to a third, they would be the third and thus not be two things but only one. The plural predicate ‘are equal’ as in ‘All diameters of a given circle are equal’ is useful and natural. ‘Are identical’ as in ‘All centers of a given circle are identical’ is awkward or worse; it suggests that a circle has multiple centers. Substituting equals for equals [replacing one of two equals by the other] makes sense. Substituting identicals for identicals is empty—a thing is identical only to itself; substituting one thing for itself leaves that thing alone, does nothing. There are as many types of equality as magnitudes: angles, lines, planes, solids, times, etc. Each admits unit magnitudes. And each such equality analyzes as identity of magnitude: two lines are equal [in length] if the one’s length is identical to the other’s. Tarski [1] hardly mentioned equality-identity distinctions (pp. 54-63). His discussion begins:

Among the logical concepts […], the concept of IDENTITY or EQUALITY […] has the greatest importance.

Not until page 62 is there an equality-identity distinction. His only “notion of equality”, if such it is, is geometrical congruence—having the same size and shape—an equivalence relation not admitting any unit. Does anyone but Tarski ever say ‘this triangle is equal to that’ to mean that the first is congruent to that? What would motivate him to say such a thing? This lecture treats the history and philosophy of equality-identity distinctions.

[1] ALFRED TARSKI, Introduction to Logic, Dover, New York, 1995. [This is expanded from the printed abstract.]
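The abstract's central contrast, identity of numbers versus mere equality of magnitudes, can be registered in a proof assistant. In the Lean 4 sketch below (my own illustration, not the authors'), 3 and 2+1 are one and the same natural number, while two distinct segments are "equal" only in the sense that their lengths are identical:

```lean
-- In Lean 4, `3` and `2 + 1` reduce to one and the same natural number,
-- so the identity is closed by reflexivity: they are not two equal
-- things but one thing.
example : (3 : Nat) = 2 + 1 := rfl

-- By contrast, two distinct segments can be "equal" only in the sense
-- that some magnitude of theirs is identical, e.g. their lengths.
structure Segment where
  left  : Nat
  right : Nat

def length (s : Segment) : Nat := s.right - s.left

-- Two different halves of the segment [0, 4] with identical lengths:
-- equal (in length) but not identical (not one and the same segment).
example : length ⟨0, 2⟩ = length ⟨2, 4⟩ := rfl
```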
Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.’s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon’s “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver”, that passes “outcomes” (received events) to a “destination”. Findings – In the entropy theory, “events” were sometimes interactions with the stimulus, but could be microscopic stimulus conditions. “Outcomes” often went unnamed; sometimes, the stimulus, or the interaction with it, or the resulting sensation, were “outcomes”. A “source” was often implied to be a “transmitter”, which frequently was a primary afferent neuron; elsewhere, the stimulus was the “transmitter” and perhaps also the “source”. “Channel” was rarely named; once, it was the whole eye; once, the incident photons; elsewhere, the primary or secondary afferent. “Receiver” was usually the sensory receptor, but could be an afferent. “Destination” went unmentioned. In sum, the entropy theory’s idea of Shannon’s “general communication system” was entirely ambiguous. Research limitations/implications – The ambiguities indicate that, contrary to its claim, the entropy theory cannot be an “information theoretical description of the process of perception”. Originality/value – Scrutiny of the entropy theory’s use of information theory was overdue and reveals incompatibilities that force a reconsideration of information theory’s possible role in perception models. A second-order-cybernetics approach is suggested.
Purpose – For half a century, neuroscientists have used Shannon Information Theory to calculate “information transmitted,” a hypothetical measure of how well neurons “discriminate” amongst stimuli. Neuroscientists’ computations, however, fail to meet even the technical requirements for credibility. Ultimately, the reasons must be conceptual. That conclusion is confirmed here, with crucial implications for neuroscience. The paper aims to discuss these issues. Design/methodology/approach – Shannon Information Theory depends upon a physical model, Shannon’s “general communication system.” Neuroscientists’ interpretation of that model is scrutinized here. Findings – In Shannon’s system, a recipient receives a message composed of symbols. The symbols received, the symbols sent, and their hypothetical occurrence probabilities altogether allow calculation of “information transmitted.” Significantly, Shannon’s system’s “reception” (decoding) side physically mirrors its “transmission” (encoding) side. However, neurons lack the “reception” side; neuroscientists nonetheless insisted that decoding must happen. They turned to Homunculus, an internal humanoid who infers stimuli from neuronal firing. However, Homunculus must contain a Homunculus, and so on ad infinitum – unless it is super-human. But any need for Homunculi, as in “theories of consciousness,” is obviated if consciousness proves to be “emergent.” Research limitations/implications – Neuroscientists’ “information transmitted” indicates, at best, how well neuroscientists themselves can use neuronal firing to discriminate amongst the stimuli given to the research animal. Originality/value – A long-overdue examination unmasks a hidden element in neuroscientists’ use of Shannon Information Theory, namely, Homunculus. Almost 50 years’ worth of computations are recognized as irrelevant, mandating fresh approaches to understanding “discriminability”.
Purpose – The purpose of this paper is to examine the popular “information transmitted” interpretation of absolute judgments, and to provide an alternative interpretation if one is needed. Design/methodology/approach – The psychologists Garner and Hake and their successors used Shannon’s Information Theory to quantify information transmitted in absolute judgments of sensory stimuli. Here, information theory is briefly reviewed, followed by a description of the absolute judgment experiment, and its information theory analysis. Empirical channel capacities are scrutinized. A remarkable coincidence, the similarity of maximum information transmitted to human memory capacity, is described. Over 60 representative psychology papers on “information transmitted” are inspected for evidence of memory involvement in absolute judgment. Finally, memory is conceptually integrated into absolute judgment through a novel qualitative model that correctly predicts how judgments change with increase in the number of judged stimuli. Findings – Garner and Hake gave conflicting accounts of how absolute judgments represent information transmission. Further, “channel capacity” is an illusion caused by sampling bias and wishful thinking; information transmitted actually peaks and then declines, the peak coinciding with memory capacity. Absolute judgments themselves have numerous idiosyncrasies that are incompatible with a Shannon general communication system but which clearly imply memory dependence. Research limitations/implications – Memory capacity limits the correctness of absolute judgments. Memory capacity is already well measured by other means, making redundant the informational analysis of absolute judgments. Originality/value – This paper presents a long-overdue comprehensive critical review of the established interpretation of absolute judgments in terms of “information transmitted”. An inevitable conclusion is reached: that published measurements of information transmitted actually measure memory capacity. A new, qualitative model is offered for the role of memory in absolute judgments. The model is well supported by recently revealed empirical properties of absolute judgments.
Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1]–present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation is regarded as a measure of the entropy or uncertainty of the stimulus signal” [2]. “To be uncertain about the outcome of an event, one must first be aware of a set of alternative outcomes” [3]. “The entropy-establishing process begins with the generation of a [internal] sensory signal by the stimulus generator. This is followed by receipt of the [external] stimulus by the sensory receptor, transmission of action potentials by the sensory neurons, and finally recapture of the [response to the internal] signal by the generator” [4]. The latter “recapture” differentiates external from internal stimuli. The hypothetical “stimulus generators” are internal emitters, that generate photons in vision, audible sounds in audition (to Norwich, the spontaneous otoacoustic emissions [SOAEs]), “temperatures in excess of local skin temperature” in skin temperature sensation [4], etc. Method (1): Several decades of empirical sensory physiology literature was scrutinized for internal “stimulus generators”. Results (1): Spontaneous photopigment isomerization (“dark light”) does not involve visible light. SOAEs are electromechanical basilar-membrane artefacts that rarely produce audible tones. The skin’s temperature sensors do not raise skin temperature, etc. Method (2): The putative action of the brain-and-sensory-receptor loop was carefully reexamined. Results (2): The sensory receptor allegedly “perceives”, experiences “awareness”, possesses “memory”, and has a “mind”. But those traits describe the whole human. The receptor, thus anthropomorphized, must therefore contain its own perceptual loop, containing a receptor, containing a perceptual loop, etc. Summary & Conclusions: The Entropy Theory demands sensory awareness of alternatives, through an imagined brain-and-sensory-receptor loop containing internal “stimulus generators”. But (1) no internal “stimulus generators” seem to exist and (2) the loop would be the outermost of an infinite nesting of identical loops.
There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning theory, has introduced measures of capacity that control (in part) the expected risk of classifiers [3]. These capacities quantify the expectations regarding future data that learning algorithms embed into classifiers. Solomonoff and Hutter have applied algorithmic information to prove remarkable results on universal induction. Shannon information provides the mathematical foundation for communication and coding theory. However, both approaches have shortcomings. Algorithmic information is not computable, severely limiting its practical usefulness. Shannon information refers to ensembles rather than actual events: it makes no sense to compute the Shannon information of a single string – or rather, there are many answers to this question depending on how a related ensemble is constructed. Although there are asymptotic results linking algorithmic and Shannon information, it is unsatisfying that there is such a large gap – a difference in kind – between the two measures. This note describes a new method of quantifying information, effective information, that links algorithmic information to Shannon information, and also links both to capacities arising in statistical learning theory [4, 5]. After introducing the measure, we show that it provides a non-universal analog of Kolmogorov complexity. We then apply it to derive basic capacities in statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. A nice byproduct of our approach is an interpretation of the explanatory power of a learning algorithm in terms of the number of hypotheses it falsifies [6], counted in two different ways for the two capacities. We also discuss how effective information relates to information gain, Shannon and mutual information.
After introducing the concept of compressed sensing as a complementary measurement mode to the classical Shannon-Nyquist approach, I discuss some of the drivers, potential challenges and obstacles to its implementation. I end with a speculative attempt to embed compressed sensing as an enabling methodology within the emergence of data-driven discovery. As a consequence I predict the growth of non-nomological sciences where heuristic correlations will find applications but often bypass conventional pure basic and use-inspired basic research stages due to the lack of verifiable hypotheses.
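As a concrete illustration of the contrast drawn here with Shannon-Nyquist sampling, the Python sketch below recovers a sparse signal from far fewer random measurements than uniform sampling would require, using orthogonal matching pursuit; the sizes, sparsity level, and solver choice are arbitrary assumptions, not the author's:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 60, 5           # signal length, measurements, sparsity (arbitrary)

# A k-sparse signal: Shannon-Nyquist would sample all n points;
# compressed sensing takes m << n random linear measurements.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                   # the m measurements

def omp(A, y, k):
    # Orthogonal matching pursuit: greedily pick the column most
    # correlated with the residual, then re-fit by least squares
    # on the chosen support.
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
print("recovered support:", sorted(np.flatnonzero(x_hat)))
print("true support:     ", sorted(support))
print("max reconstruction error:", np.max(np.abs(x_hat - x)))
```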
In recent decades, the analysis of phraseology has made use of the exploration of large corpora as a source of quantitative information about language. This paper intends to present the main lines of work in progress based on this empirical approach to linguistic analysis. In particular, we focus our attention on some problems relating to the morpho-syntactic annotation of corpora. The CORIS/CODIS corpus of contemporary written Italian, developed at CILTA – University of Bologna (Rossini Favretti 2000; Rossini Favretti, Tamburini, De Santis in press), is a synchronic 100-million-word corpus and is being lemmatised and annotated with part-of-speech (POS) tags, in order to increase the quantity of information and improve data retrieval procedures (Tamburini 2000). The aim of POS tagging is to assign each lexical unit to the appropriate word class. Usually the set of tags is pre-established by the linguist, who uses his/her competence to identify the different word classes. The very first experiments we made revealed how the traditional part-of-speech distinctions in Italian (generally based on morphological and semantic criteria) are often inadequate to represent the syntactic features of words in context. It is worth noting that the uncertainties in categorisation contained in Italian grammars and dictionaries reflect a growing difficulty as they move from fundamental linguistic classes, such as nouns and verbs, to more complex classes, such as adverbs, pronouns, prepositions and conjunctions. This latter class, that groups together elements traditionally used to express connections between sentences, appears inadequate when describing cohesive relations in Italian. This phenomenon actually seems to involve other elements traditionally assigned to different classes, such as adverbs, pronouns and interjections. Recent studies proposed the class of ‘connectives’, grouping all words that, apart from their traditional word class, have the function of connecting phrases and contributing to textual cohesion. From this point of view, conjunctions can be considered as part of phrasal connectives, that can in turn be included in the wider category of textual connectives. The aim of this study is to identify elements that can be included in the class of phrasal connectives, using quantitative methods. According to Shannon and Weaver’s (1949) observation that words are linked by dependent probabilities, corroborated by Halliday’s (1991) argument that the grammatical “system” (in Firth’s sense of the term) is essentially probabilistic, quantitative data are introduced in order to provide evidence of relative frequencies. Section 2 presents a description of word-class categorisation from the point of view of grammars and dictionaries arguing that the traditional category of conjunctions is inadequate for capturing the notion of phrasal connective. Section 3 examines the notion of ‘connective’ and suggests a truth-function interpretation of connective behaviour. Section 4 describes the quantitative methods proposed for analysing the distributional properties of lexical units, and section 5 comments on the results obtained by applying such methods drawing some provisional conclusions.
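The "dependent probabilities" observation credited to Shannon and Weaver can be made concrete with a toy distributional measure such as pointwise mutual information over adjacent word pairs. The corpus below is invented and the measure is only an illustration, not the CORIS/CODIS annotation procedure:

```python
from collections import Counter
from math import log2

# Toy illustration of Shannon and Weaver's point that words are linked by
# dependent probabilities: pointwise mutual information (PMI) of adjacent
# word pairs. The corpus is invented for this sketch.
corpus = ("in other words the results hold ; in other words the corpus "
          "shows connectives ; the corpus shows that connectives link phrases").split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
n_uni, n_bi = sum(unigrams.values()), sum(bigrams.values())

def pmi(w1, w2):
    p_joint = bigrams[(w1, w2)] / n_bi
    p1, p2 = unigrams[w1] / n_uni, unigrams[w2] / n_uni
    return log2(p_joint / (p1 * p2))

# A strongly associated pair scores higher than a looser one.
print(f"PMI(other, words) = {pmi('other', 'words'):.2f}")
print(f"PMI(the, corpus)  = {pmi('the', 'corpus'):.2f}")
```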