References
  • (2 other versions) Knowledge and the flow of information. F. Dretske - 1989 - Trans/Form/Ação 12:133-139. (1391 citations)
  • The Intentional Stance. Daniel Clement Dennett - 1987 - MIT Press. (1473 citations)
    Through the use of such "folk" concepts as belief, desire, intention, and expectation, Daniel Dennett asserts in this first full scale presentation of...
  • Darwin's Dangerous Idea. Daniel Dennett - 1994 - Behavior and Philosophy 24 (2):169-174. (547 citations)
  • Content and Consciousness. D. C. Dennett - 1969 - Journal of Philosophy 69 (18):604-604. (331 citations)
  • Is coding a relevant metaphor for the brain? Romain Brette - 2019 - Behavioral and Brain Sciences 42:1-44. (15 citations)
    “Neural coding” is a popular metaphor in neuroscience, where objective properties of the world are communicated to the brain in the form of spikes. Here I argue that this metaphor is often inappropriate and misleading. First, when neurons are said to encode experimental parameters, the neural code depends on experimental details that are not carried by the coding variable. Thus, the representational power of neural codes is much more limited than generally implied. Second, neural codes carry information only by reference (...)
  • From Bacteria to Bach and Back: The Evolution of Minds. Daniel Dennett - 2017 - W. W. Norton & Company. (191 citations)
  • Content in Simple Signalling Systems. Nicholas Shea, Peter Godfrey-Smith & Rosa Cao - 2018 - British Journal for the Philosophy of Science 69 (4):1009-1035. (42 citations)
    Our understanding of communication and its evolution has advanced significantly through the study of simple models involving interacting senders and receivers of signals. Many theorists have thought that the resources of mathematical information theory are all that are needed to capture the meaning or content that is being communicated in these systems. However, the way theorists routinely talk about the models implicitly draws on a conception of content that is richer than bare informational content, especially in contexts where false content (...)
  • Functional Information: a Graded Taxonomy of Difference Makers. Nir Fresco, Simona Ginsburg & Eva Jablonka - 2020 - Review of Philosophy and Psychology 11 (3):547-567. (12 citations)
    There are many different notions of information in logic, epistemology, psychology, biology and cognitive science, which are employed differently in each discipline, often with little overlap. Since our interest here is in biological processes and organisms, we develop a taxonomy of functional information that extends the standard cue/signal distinction (in animal communication theory). Three general, main claims are advanced here. (1) This new taxonomy can be useful in describing learning and communication. (2) It avoids some problems that the natural/non-natural information (...)
  • The Hard Problem of Content: Solved (Long Ago). Marcin Miłkowski - 2015 - Studies in Logic, Grammar and Rhetoric 41 (1):73-88. (21 citations)
    In this paper, I argue that even if the Hard Problem of Content, as identified by Hutto and Myin, is important, it was already solved in naturalized semantics, and satisfactory solutions to the problem do not rely merely on the notion of information as covariance. I point out that Hutto and Myin have double standards for linguistic and mental representation, which leads to a peculiar inconsistency. Were they to apply the same standards to basic and linguistic minds, they would (...)
  • Information: a very short introduction. Luciano Floridi - 2010 - New York: Oxford University Press. (99 citations)
    This book helps us understand the true meaning of the concept and how it can be used to understand our world.
  • Signals: Evolution, Learning, and Information. Brian Skyrms - 2010 - Oxford, GB: Oxford University Press. (242 citations)
    Brian Skyrms offers a fascinating demonstration of how fundamental signals are to our world. He uses various scientific tools to investigate how meaning and communication develop. Signals operate in networks of senders and receivers at all levels of life, transmitting and processing information. That is how humans and animals think and interact.
  • (1 other version) What is information? John Perry & David Israel - 2019 - In Studies in language and information. Stanford, California: Center for the Study of Language and Information. (7 citations)
  • A Philosophical Letter of Alfred Tarski. Morton White - 1987 - Journal of Philosophy 84 (1):28-32. (22 citations)
  • A Mathematical Theory of Communication. Claude Elwood Shannon - 1948 - Bell System Technical Journal 27:379-423. (1203 citations; the quantities it defines are sketched in the note after this list)
    The mathematical theory of communication.
  • Representational content in humans and machines. Mark H. Bickhard - 1993 - Journal of Experimental and Theoretical Artificial Intelligence 5:285-333. (89 citations)
    This article focuses on the problem of representational content. Accounting for representational content is the central issue in contemporary naturalism: it is the major remaining task facing a naturalistic conception of the world. Representational content is also the central barrier to contemporary cognitive science and artificial intelligence: it is not possible to understand representation in animals nor to construct machines with genuine representation given current (lack of) understanding of what representation is. An elaborated critique is offered to current approaches to (...)
  • (1 other version) What is information? David J. Israel & John Perry - 1990 - In Philip P. Hanson (ed.), Information, Language and Cognition. University of British Columbia Press. (41 citations)
  • On Quantifying Semantic Information. Simon D'Alfonso - 2011 - Information 2 (1):61-101. (8 citations)
    The purpose of this paper is to look at some existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s theory of semantic information before going on to look at Floridi’s theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measure truthlikeness are drawn from the literature and (...)
  • (1 other version) Measuring consciousness: relating behavioural and neurophysiological approaches. Anil K. Seth, Zoltán Dienes, Axel Cleeremans, Morten Overgaard & Luiz Pessoa - 2008 - Trends in Cognitive Sciences 12 (8):314-321. (83 citations)
  • The use of information theory in epistemology. William F. Harms - 1998 - Philosophy of Science 65 (3):472-501. (9 citations; see the note after this list for the standard definition of mutual information)
    Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual (...)
  • Information, Mechanism and Meaning. Donald M. MacKay - 1972 - Synthese 24 (3):472-474. (132 citations)
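
  A note on the information-theoretic quantities invoked in the Shannon and Harms entries above. This is a minimal sketch of the standard textbook definitions, not text drawn from either work. For discrete random variables X and Y with joint distribution p(x, y),

      H(X) = -\sum_x p(x) \log_2 p(x)

      I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}

  H(X) is the statistical entropy of X, its average uncertainty in bits, and I(X;Y) is the mutual information, the reduction in uncertainty about X gained by observing Y; Harms' abstract treats the latter as a measure of tracking efficiency for the naturalistic epistemologist.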