  1. added 2019-07-22
    Real Patterns and Indispensability.Abel Suñé & Manolo Martínez - manuscript
    While scientific inquiry crucially relies on the extraction of patterns from data, we still have a very imperfect understanding of the metaphysics of patterns—and, in particular, of what it is that makes a pattern real. In this paper we derive a criterion of real-patternhood from the notion of conditional Kolmogorov complexity. The resulting account belongs in the philosophical tradition, initiated by Dennett, that links real-patternhood to data compressibility, but is simpler and formally more perspicuous than other proposals defended heretofore in (...)
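    The Kolmogorov-complexity criterion invoked above cannot be computed exactly, but a stock compressor gives a crude upper bound, which is enough to see the compressibility intuition behind Dennett-style real patterns. A minimal sketch in Python (an illustration of the general idea, not the authors' criterion):

      # Illustration of "patternhood as compressibility"; not the authors' criterion.
      # Kolmogorov complexity is uncomputable, so zlib's output length stands in
      # as a rough upper bound on description length.
      import random
      import zlib

      def compression_ratio(data: bytes) -> float:
          """Compressed size over original size; lower means more exploitable pattern."""
          return len(zlib.compress(data, 9)) / len(data)

      patterned = b"0110" * 2500                                  # highly regular sequence
      noise = bytes(random.getrandbits(8) for _ in range(10000))  # incompressible on average

      print(f"patterned: {compression_ratio(patterned):.3f}")     # far below 1
      print(f"noise:     {compression_ratio(noise):.3f}")         # close to (or above) 1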
  2. added 2019-06-05
    Information and Inaccuracy.William Roche & Tomoji Shogenji - 2018 - British Journal for the Philosophy of Science 69 (2):577-604.
    This article proposes a new interpretation of mutual information. We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it is faced with the problem of (...)
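    For readers who want the quantity itself on the table: mutual information can be computed as I(X;Y) = H(X) + H(Y) - H(X,Y). A small worked example in Python on a made-up 2x2 joint distribution (the numbers are hypothetical, not taken from the paper):

      # Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) for a small joint distribution.
      # The joint table is hypothetical, chosen only to make the arithmetic visible.
      from math import log2

      joint = {(0, 0): 0.4, (0, 1): 0.1,
               (1, 0): 0.1, (1, 1): 0.4}

      def entropy(probs):
          return -sum(p * log2(p) for p in probs if p > 0)

      p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
      p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

      mi = entropy(p_x) + entropy(p_y) - entropy(joint.values())
      print(f"I(X;Y) = {mi:.3f} bits")   # about 0.278 bits for this table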
  3. added 2019-02-19
    Angeletics and Epistemology, Angeletics as Epistemology: A Comparison Between Capurro’s Angeletics and Goldman’s Social Epistemology.Pak-Hang Wong - 2011 - In Messages and Messengers – Angeletics as an Approach to the Phenomenology of Communication / Von Boten und Botschaften – Die Angeletik als Weg zur Phänomenologie der Kommunikation.
    Nearly a decade ago, Rafael Capurro began gradually shifting his attention towards the ideas of message and of messenger. In lieu of ‘information’, he proposes and develops a new direction of research he calls Angeletics, which aims to examine the nature of message and messenger, both of which are inherently social. Coincidentally, at about the same time, we witnessed the rise of social epistemology in Anglo-American analytic philosophy. This coincidence is interesting, because both Capurro’s Angeletics and social epistemology indicated a (...)
  4. added 2018-11-19
    Semantic Information Measure with Two Types of Probability for Falsification and Confirmation.Lu Chenguang - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use a sampling distribution (conditional SP function) to test or confirm a fuzzy truth function (conditional LP function). The proposed Semantic Information Measure (SIM) is compatible with Shannon’s information theory and Fisher’s likelihood method. It ensures that the smaller the LP of a predicate and the larger the truth value of the proposition, the more information there is. So the (...)
  5. added 2018-09-23
    What an Entangled Web We Weave: An Information-Centric Approach to Time-Evolving Socio-Technical Systems.Markus Luczak-Roesch, Kieron O’Hara, Jesse David Dinneen & Ramine Tinati - 2018 - Minds and Machines 28 (4):709-733.
    A new layer of complexity, constituted of networks of information token recurrence, has been identified in socio-technical systems such as the Wikipedia online community and the Zooniverse citizen science platform. The identification of this complexity reveals that our current understanding of the actual structure of those systems, and consequently the structure of the entire World Wide Web, is incomplete, which raises novel questions both for data science research and from the perspective of social epistemology. Here we establish the principled foundations (...)
  6. added 2018-07-31
    Information-Not-Thing: Further Problems with and Alternatives to the Belief That Information is Physical.Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and discuss existing definitions of information that are consistent with this approach.
  7. added 2018-07-31
    From Coincidence to Purposeful Flow? Properties of Transcendental Information Cascades.Markus Luczak-Roesch, Ramine Tinati, Max van Kleek & Nigel Shadbolt - 2015 - In International Conference on Advances in Social Networks Analysis and Mining (ASONAM) 2015.
    In this paper, we investigate a method for constructing cascades of information co-occurrence, which is suitable to trace emergent structures in information in scenarios where rich contextual features are unavailable. Our method relies only on the temporal order of content-sharing activities, and intrinsic properties of the shared content itself. We apply this method to analyse information dissemination patterns across the active online citizen science project Planet Hunters, a part of the Zooniverse platform. Our results lend insight into both structural and (...)
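    The method described above uses only the temporal order of content-sharing events and the shared content itself. One deliberately simplified reading, offered as an illustration of the general idea rather than as the authors' algorithm, links each event to the most recent earlier event with which it shares a content token:

      # Illustrative sketch only: link each event to the most recent earlier event that
      # shares a content token, using nothing but temporal order and the content itself.
      # This is one plausible reading of "information co-occurrence", not the authors' code.
      from collections import defaultdict

      events = [  # (timestamp, text): hypothetical stream of content-sharing activities
          (1, "bright spot near star A"),
          (2, "transit candidate star A"),
          (3, "noise in sensor B"),
          (4, "star A transit confirmed"),
      ]

      last_seen = {}             # token -> index of the most recent event containing it
      edges = defaultdict(set)   # cascade edges: earlier event -> later event

      for i, (_, text) in enumerate(events):
          for token in set(text.split()):
              if token in last_seen:
                  edges[last_seen[token]].add(i)
              last_seen[token] = i

      print(dict(edges))   # {0: {1}, 1: {3}} for the stream above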
  8. added 2018-07-30
    When Resources Collide: Towards a Theory of Coincidence in Information Spaces.Markus Luczak-Roesch, Ramine Tinati & Nigel Shadbolt - 2015 - In WWW '15 Companion Proceedings of the 24th International Conference on World Wide Web. Florence, Metropolitan City of Florence, Italy: pp. 1137-1142.
    This paper is an attempt to lay out foundations for a general theory of coincidence in information spaces such as the World Wide Web, expanding on existing work on bursty structures in document streams and information cascades. We elaborate on the hypothesis that every resource published in an information space enters a temporary interaction with another resource once a unique explicit or implicit reference between the two is found. This thought is motivated by Erwin Schrödinger's notion of entanglement (...)
  9. added 2018-03-02
    An Informational Theory of Counterfactuals.Danilo Fraga Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals (e.g., Lewis's). These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005, pp. 632–657) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston’s account. Then I propose an (...)
  10. added 2018-02-12
    Toward an Algorithmic Metaphysics.Steve Petersen - 2013 - In David Dowe (ed.), Algorithmic Probability and Friends: Bayesian Prediction and Artificial Intelligence. Springer. pp. 306-317.
    There are writers in both metaphysics and algorithmic information theory (AIT) who seem to think that the latter could provide a formal theory of the former. This paper is intended as a step in that direction. It demonstrates how AIT might be used to define basic metaphysical notions such as *object* and *property* for a simple, idealized world. The extent to which these definitions capture intuitions about the metaphysics of the simple world, times the extent to which we think the (...)
  11. added 2018-01-21
    Information-Theoretic Philosophy of Mind.Jason Winning & William Bechtel - 2016 - In Luciano Floridi (ed.), The Routledge Handbook of Philosophy of Information. London and New York: Routledge. pp. 347-360.
  12. added 2017-12-06
    A Quantitative-Informational Approach to Logical Consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of (...)
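    As a rough illustration of the kind of machinery involved (not the authors' exact definitions), one can assign a propositional formula the probability of the set of valuations satisfying it, under a uniform measure, and read off a Shannon-style information quantity as -log2 of that probability; formulas true in fewer valuations then carry more information:

      # Illustration only, not the authors' definitions: probability of a propositional
      # formula as the fraction of valuations satisfying it (uniform measure), and a
      # Shannon-style information quantity inf(A) = -log2 P(A).
      from itertools import product
      from math import log2

      def probability(formula, atoms):
          """Fraction of truth-value assignments to `atoms` that satisfy `formula`."""
          valuations = product([True, False], repeat=len(atoms))
          sat = sum(1 for vals in valuations if formula(dict(zip(atoms, vals))))
          return sat / 2 ** len(atoms)

      def information(formula, atoms):
          return -log2(probability(formula, atoms))

      atoms = ["p", "q"]
      p_and_q = lambda v: v["p"] and v["q"]
      p_or_q = lambda v: v["p"] or v["q"]

      print(information(p_and_q, atoms))   # 2.0 bits: satisfied by 1 of 4 valuations
      print(information(p_or_q, atoms))    # ~0.415 bits: satisfied by 3 of 4 valuations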
  13. added 2017-06-20
    Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  14. added 2017-01-19
    On Interpreting Chaitin's Incompleteness Theorem.Panu Raatikainen - 1998 - Journal of Philosophical Logic 27 (6):569-586.
    The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin's famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of (...)
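    For reference, the theorem under discussion in its standard formulation (stated here independently of Raatikainen's analysis): for any consistent, recursively axiomatizable theory T interpreting enough arithmetic, there is a constant c_T such that

      \[
        T \nvdash \; K(x) > c_T \quad \text{for every string } x,
      \]
      % even though K(x) > c_T is in fact true of all but finitely many x;
      % here K denotes (prefix-free) Kolmogorov complexity.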
  15. added 2016-11-09
    Counting Distinctions: On the Conceptual Foundations of Shannon’s Information Theory.David Ellerman - 2009 - Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite (...)
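    The "distinction" idea is concrete enough to compute directly: for a partition of a finite set U, the distinctions are the ordered pairs of elements lying in different blocks, and the normalized count |dit|/|U|^2 equals 1 minus the sum of squared block proportions. A minimal check on a hypothetical partition:

      # Count the distinctions of a partition: ordered pairs of elements in different blocks.
      # Normalized by |U|^2 this equals 1 - sum of squared block proportions.
      from itertools import product

      U = list(range(6))
      partition = [{0, 1, 2}, {3, 4}, {5}]        # hypothetical partition of U

      block_of = {u: i for i, block in enumerate(partition) for u in block}
      dits = [(u, v) for u, v in product(U, U) if block_of[u] != block_of[v]]

      normalized = len(dits) / len(U) ** 2
      via_blocks = 1 - sum((len(b) / len(U)) ** 2 for b in partition)

      print(len(dits), normalized, via_blocks)    # 22 pairs; both formulas give ~0.611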
  16. added 2016-04-23
    On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  17. added 2016-04-19
    Coincidence, Data Compression, and Mach's Concept of “Economy of Thought”.J. S. Markovitch - manuscript
    A case is made that Mach’s principle of “economy of thought”, and therefore usefulness, is related to the compressibility of data, but that a mathematical expression may compress data for reasons that are sometimes coincidental and sometimes not. An expression, therefore, may be sometimes explainable and sometimes not. A method is proposed for distinguishing coincidental data compression from non-coincidental, where this method may serve as a guide in uncovering new mathematical relationships. The method works by producing a probability that a (...)
  18. added 2016-03-17
    Les organismes vivants comme pièges à information.Antoine Danchin - 2008 - Ludus Vitalis 16 (30):211-212.
    Life can be defined as combining two entities that rest on completely different physico-chemical properties and on a particular way of handling information. The cell, first, is a "machine" that combines elements quite similar (although in a fairly fuzzy way) to those involved in a man-made factory. The machine combines two processes. First, it requires explicit compartmentalisation, including scaffolding structures similar to those of the chassis of engineered machines. In addition, cells clearly define an inside, the (...)
  19. added 2016-03-11
    Communicating the Same Information to a Human and to a Machine: Is There a Difference in Principle?Vincent C. Müller - 2002 - In Konstantinos Boudouris & Takis Poulakos (eds.), Philosophy of communication: Proceedings of the 13th international conference on Greek philosophy (IAGP 13). Ionia. pp. 168-176.
    We try to show that there is no difference in principle between communicating a piece of information to a human and to a machine. The argumentation depends on the following theses: communicating is transfer of information; information has propositional form; propositional form can be modelled as categorisation; categorisation can be modelled in a machine; a suitably equipped machine can grasp propositional content designed for human communication. What I suggest is that the discussion should focus on the truth and precise meaning (...)
  20. added 2016-02-25
    A Generalization of Shannon's Information Theory.Chenguang Lu - 1999 - International Journal of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform to objective facts, then the generalized information measure will be equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication (...)
  21. added 2016-02-23
    A Semantic Information Formula Compatible with Shannon and Popper's Theories.Chenguang Lu - manuscript
    Semantic information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure the information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon’s and K. Popper’s thought. The fuzzy set’s (...)
  22. added 2015-08-16
    Objects and Processes: Two Notions for Understanding Biological Information.Agustín Mercado-Reyes, Pablo Padilla Longoria & Alfonso Arroyo-Santos - forthcoming - Journal of Theoretical Biology.
    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do (...)
  23. added 2015-05-25
    Concepts, Introspection, and Phenomenal Consciousness: An Information-Theoretical Approach.Murat Aydede & Guven Guzeldere - 2005 - Noûs 39 (2):197-255.
    This essay is a sustained attempt to bring new light to some of the perennial problems in philosophy of mind surrounding phenomenal consciousness and introspection through developing an account of sensory and phenomenal concepts. Building on the information-theoretic framework of Dretske (1981), we present an informational psychosemantics as it applies to what we call sensory concepts, concepts that apply, roughly, to so-called secondary qualities of objects. We show that these concepts have a special informational character and semantic structure that closely (...)
  24. added 2015-03-26
    New Arguments for 'Intelligent Design'? Review Article on William A. Dembski, Being as Communion: A Metaphysics of Information. [REVIEW]Philippe Gagnon - 2015 - ESSSAT News and Reviews 25 (1):17-24.
    Critical notice assessing the use of information theory in the attempt to build a design inference, and to re-establish some aspects of the program of natural theology, as carried out in this third major monograph devoted to the subject of intelligent design theory by mathematician and philosopher William A. Dembski, after The Design Inference (1998) and No Free Lunch (2002).
  25. added 2014-10-16
    An Introduction to Logical Entropy and its Relation to Shannon Entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic (...)
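    For a probability distribution p = (p_1, ..., p_n), the two quantities related in the paper are, in the usual notation:

      \[
        h(p) \;=\; 1 - \sum_{i=1}^{n} p_i^{2}
        \qquad\text{and}\qquad
        H(p) \;=\; \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i},
      \]
      % h(p) is the probability that two independent draws from p yield a distinction
      % (fall in different outcomes); H(p) is the expected number of binary
      % distinctions (bits) needed to single out an outcome.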
  26. added 2014-03-25
    "Que reste-t-il de la théologie à l'âge électronique ? Valeur et cybernétique axiologique chez Raymond Ruyer" [What is left of Theology in the Electronic Age? Value and Axiological Cybernetics in Raymond Ruyer].Philippe Gagnon - 2013 - In Chromatikon Ix: Annales de la Philosophie En Procès — Yearbook of Philosophy in Process, M. Weber & V. Berne. pp. 93-120.
    This is the outline: Introduction — The question of cybernetics and information — A "thought of the milieu" — Cybernetics and homology — A theory of learning — Information seen from the other side — Field and unitary domain — The thesis of the "other I's" — The passage through axiology — True feedback — Ruyer's ontology — The murmur of being itself.
  27. added 2014-03-23
    Algorithmic Information Theory and Undecidability.Panu Raatikainen - 2000 - Synthese 123 (2):217-225.
    Chaitin’s incompleteness result related to random reals and the halting probability has been advertised as the ultimate and the strongest possible version of the incompleteness and undecidability theorems. It is argued that such claims are exaggerations.
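    The halting probability mentioned above is Chaitin's Omega, defined for a fixed universal prefix-free machine U as

      \[
        \Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},
      \]
      % an algorithmically random real in (0, 1); a consistent, sound theory can
      % determine at most finitely many of its binary digits.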
  28. added 2013-09-26
    Falsification and Future Performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
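    Empirical Rademacher complexity, one of the two capacity measures reformulated in the paper, measures how well a hypothesis class can correlate with random +/-1 labels on a fixed sample. A small Monte Carlo estimate for a toy class of one-dimensional threshold classifiers (illustrative only; the sample and class are made up, not the paper's setup):

      # Monte Carlo estimate of empirical Rademacher complexity:
      #   R_hat(F) = E_sigma [ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ]
      # for a toy class of 1-D threshold classifiers. Illustrative only.
      import random

      sample = [0.05 * i for i in range(20)]                       # n = 20 fixed points
      thresholds = [0.1 * t for t in range(11)]                    # hypothesis class F
      classifiers = [lambda x, t=t: 1 if x >= t else -1 for t in thresholds]

      def empirical_rademacher(trials=2000):
          n = len(sample)
          total = 0.0
          for _ in range(trials):
              sigma = [random.choice((-1, 1)) for _ in range(n)]   # random +/-1 labels
              best = max(sum(s * f(x) for s, x in zip(sigma, sample)) / n
                         for f in classifiers)
              total += best
          return total / trials

      print(f"estimated R_hat(F) ~ {empirical_rademacher():.3f}")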
  29. added 2013-09-26
    Information, Learning and Falsification.David Balduzzi - manuscript
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
  30. added 2013-03-26
    Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse fields as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
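    For orientation, the quantity itself: given a measure-preserving transformation T of a probability space and a finite measurable partition pi, the Kolmogorov-Sinai entropy is the supremum over partitions of the asymptotic per-step Shannon entropy of increasingly refined observations,

      \[
        h_{\mathrm{KS}}(T) \;=\; \sup_{\pi}\; \lim_{n \to \infty} \frac{1}{n}\,
        H\!\left( \bigvee_{i=0}^{n-1} T^{-i}\pi \right),
      \]
      % where H is the Shannon entropy of a partition and the join \vee collects
      % the observations made at successive time steps.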
  31. added 2012-10-10
    Information of the Chassis and Information of the Program in Synthetic Cells.Antoine Danchin - 2009 - Systems and Synthetic Biology 3:125-134.
    Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted the invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristic explored here is that an authentic category of reality, information, must be coupled with the standard categories of matter, energy, space and time to account for what life is. The use of this still elusive category permits us (...)
  32. added 2011-10-26
    Lunch Uncertain [Review Of: Floridi, Luciano (2011) The Philosophy of Information (Oxford)]. [REVIEW]Stevan Harnad - 2011 - Times Literary Supplement 5664 (22-23).
    The usual way to try to ground knowing according to contemporary theory of knowledge is: We know something if (1) it’s true, (2) we believe it, and (3) we believe it for the “right” reasons. Floridi proposes a better way. His grounding is based partly on probability theory, and partly on a question/answer network of verbal and behavioural interactions evolving in time. This is rather like modeling the data-exchange between a data-seeker who needs to know which button to press on (...)
  33. added 2011-06-10
    “What We Have Learnt From Systems Theory About the Things That Nature’s Understanding Achieves”.Philippe Gagnon - 2010 - In Dirk Evers, Antje Jackelén & Taede Smedes (eds.), How do we Know? Understanding in Science and Theology. Forum Scientiarum.
    The problem of knowledge has been centred around the study of the content of our consciousness, seeing the world through internal representation, without any satisfactory account of the operations of nature that would be a pre-condition for our own performances in terms of concept efficiency in organizing action externally. If we want to better understand where and how meaning fits in nature, we have to find the proper way to decipher its organization, and account for the fact that we have (...)
  34. added 2011-06-10
    La Théologie de la Nature Et la Science À l'Ère de L'Information.Philippe Gagnon - 2002 - Éditions du Cerf.
    The history of the relationship between Christian theology and the natural sciences has been conditioned by the initial decision of the masters of the "first scientific revolution" to disregard any necessary explanatory premiss to account for the constituting organization and the framing of naturally occurring entities. Not paying any attention to hierarchical control, they ended up disseminating a vision and understanding in which it was no longer possible for a theology of nature to send questions in the direction of the experimental (...)