Information-based epistemology maintains that ‘being informed’ is an independent cognitive state that cannot be reduced to knowledge or to belief, and the modal logic KTB has been proposed as a model. But what distinguishes the KTB analysis of ‘being informed’, the Brouwersche schema (B), is precisely its downfall, for no logic of information should include (B) and, more generally, no epistemic logic should include (B), either.
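For readers unfamiliar with the schema at issue, the Brouwersche schema can be displayed as follows; the informational notation Ia is borrowed from the abstract below, and the rendering is the standard one rather than a quotation from the paper:

\[ \text{(B)} \quad p \rightarrow \Box\Diamond p, \qquad \text{informational reading:} \quad p \rightarrow I_a \neg I_a \neg p \]

That is, if p is true, then a is informed that a does not hold the information that not-p; it is precisely this schema that the paper argues no informational or epistemic logic should validate.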
In this article, I outline a logic of design of a system as a specific kind of conceptual logic of the design of the model of a system, that is, the blueprint that provides information about the system to be created. In section two, I introduce the method of levels of abstraction as a modelling tool borrowed from computer science. In section three, I use this method to clarify two main conceptual logics of information inherited from modernity: Kant’s transcendental logic of conditions of possibility of a system, and Hegel’s dialectical logic of conditions of in/stability of a system. Both conceptual logics of information analyse structural properties of given systems. Strictly speaking, neither is a conceptual logic of information about the conditions of feasibility of a system, that is, neither is a logic of information as a logic of design. So, in section four, I outline this third conceptual logic of information and then interpret the conceptual logic of design as a logic of requirements, by introducing the relation of “sufficientisation”. In the conclusion, I argue that the logic of requirements is exactly what we need in order to make sense of, and buttress, a constructionist approach to knowledge.
One of the open problems in the philosophy of information is whether there is an information logic (IL), different from epistemic logic (EL) and doxastic logic (DL), which formalises the relation “a is informed that p” (Iap) satisfactorily. In this paper, the problem is solved by arguing that the axiom schemata of the normal modal logic (NML) KTB (also known as B or Br or Brouwer’s system) are well suited to formalise the relation of “being informed”. After having shown that IL can be constructed as an informational reading of KTB, four consequences of a KTB-based IL are explored: information overload; the veridicality thesis (Iap → p); the relation between IL and EL; and the Kp → Bp principle or entailment property, according to which knowledge implies belief. Although these issues are discussed later in the article, they are the motivations behind the development of IL.
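To make the axiomatic basis concrete, the remaining KTB schemata under the informational reading, together with the veridicality thesis and the entailment property named in the abstract, can be displayed in standard textbook form (schema (B) is shown after the first abstract above; the paper's own notation may differ in detail):

\[ \text{(K)} \quad I_a(p \rightarrow q) \rightarrow (I_a p \rightarrow I_a q), \qquad \text{(T)} \quad I_a p \rightarrow p \ \ \text{(veridicality)}, \qquad \text{(entailment)} \quad K_a p \rightarrow B_a p. \]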
Purpose – To review and discuss Luciano Floridi’s 2019 book The Logic of Information: A Theory of Philosophy as Conceptual Design, the latest instalment in his philosophy of information tetralogy, particularly with respect to its implications for library and information studies. Design/methodology/approach – Nine scholars with research interests in philosophy and LIS read and responded to the book, raising critical and heuristic questions in the spirit of scholarly dialogue. Floridi responded to these questions. Findings – Floridi’s PI, including this latest publication, is of interest to LIS scholars, and much insight can be gained by exploring this connection. It seems also that LIS has the potential to contribute to PI’s further development in some respects. Research implications – Floridi’s PI work is technical philosophy which many LIS scholars do not have the training or patience to engage with, yet doing so is rewarding. This suggests a role for translational work between philosophy and LIS. Originality/value – The book symposium format, not yet seen in LIS, provides a forum for sustained, multifaceted and generative dialogue around ideas.
The purpose of this paper is to review and discuss Luciano Floridi’s 2019 book The Logic of Information: A Theory of Philosophy as Conceptual Design, the latest instalment in his philosophy of information (PI) tetralogy, particularly with respect to its implications for library and information studies (LIS).
We present a framework for epistemic logic, modeling the logical aspects of System 1 and System 2 cognitive processes, as per dual process theories of reasoning. The framework combines non-normal worlds semantics with the techniques of Dynamic Epistemic Logic. It models non-logically-omniscient, but moderately rational agents: their System 1 makes fast sense of incoming information by integrating it on the basis of their background knowledge and beliefs. Their System 2 allows them to slowly, step-wise unpack some of the logical consequences of such knowledge and beliefs, by paying a cognitive cost. The framework is applied to three instances of limited rationality, widely discussed in cognitive psychology: Stereotypical Thinking, the Framing Effect, and the Anchoring Effect.
I want to model a finite, fallible cognitive agent who imagines that p in the sense of mentally representing a scenario—a configuration of objects and properties—correctly described by p. I propose to capture imagination, so understood, via variably strict world quantifiers, in a modal framework including both possible and so-called impossible worlds. The latter secure lack of classical logical closure for the relevant mental states, while the variability of strictness captures how the agent imports information from actuality in the imagined non-actual scenarios. Imagination turns out to be highly hyperintensional, but not logically anarchic. Section 1 sets the stage and impossible worlds are quickly introduced in Section 2. Section 3 proposes to model imagination via variably strict world quantifiers. Section 4 introduces the formal semantics. Section 5 argues that imagination has a minimal mereological structure validating some logical inferences. Section 6 deals with how imagination under-determines the represented contents. Section 7 proposes additional constraints on the semantics, validating further inferences. Section 8 describes some welcome invalidities. Section 9 examines the effects of importing false beliefs into the imagined scenarios. Finally, Section 10 hints at possible developments of the theory in the direction of two-dimensional semantics.
This is the revised version of an invited keynote lecture delivered at the "1st Australian Computing and Philosophy Conference". The paper is divided into two parts. The first part defends an informational approach to structural realism. It does so in three steps. First, it is shown that, within the debate about structural realism, epistemic and ontic structural realism are reconcilable. It follows that a version of OSR is defensible from a structuralist-friendly position. Second, it is argued that a version of OSR is also plausible, because not all relata are logically prior to relations. Third, it is shown that a version of OSR is also applicable to both sub-observable and observable entities, by developing its ontology of structural objects in terms of informational objects. The outcome is informational structural realism, a version of OSR supporting the ontological commitment to a view of the world as the totality of informational objects dynamically interacting with each other. The paper has been discussed by several colleagues and, in the second half, ten objections that have been raised against the proposal are answered in order to clarify it further.
This article explores the usefulness of interdisciplinarity as method of enquiry by proposing an investigation of the concept of information in the light of semiotics. This is because, as Kull, Deacon, Emmeche, Hoffmeyer and Stjernfelt state, information is an implicitly semiotic term (Biological Theory 4(2):167–173, 2009: 169), but the logical relation between semiosis and information has not been sufficiently clarified yet. Across the history of cybernetics, the concept of information undergoes an uneven development; that is, information is an ‘objective’ entity in first order cybernetics, and becomes a ‘subjective’ entity in second order cybernetics. This contradiction relegates the status of information to that of a ‘true’ or ‘false’ formal logic problem. The present study proposes that a solution to this contradiction can be found in Deely’s reconfiguration of Peirce’s ‘object’ (as found in his triadic model of semiosis) into ‘thing’ and ‘object’ (Deely 1981). This ontology allows one to argue that information is neither ‘true’ nor ‘false’, and to suggest that, when considered in light of its workability, information can be both true and false, and as such it constitutes an organism’s purely objective reality (Deely 2009b). It is stated that in the process of building such a reality, information is ‘motivated’ by environmental, physiological, emotional (including past feelings and expectations) constraints which are, in turn, framed by observership. Information is therefore found in the irreducible cybersemiotic process that links at once all these conditions and that is simultaneously constrained by them. The integration of cybernetics’ and semiotics’ understanding of information shows that history is the analytical principle that grants scientific rigour to interdisciplinary investigations. As such, in any attempt to clarify its epistemological stance (e.g. the semiotic aspect of information), it is argued that biosemiotics does not need only to acknowledge semiotics (as it does), but also cybernetics in its interdisciplinary heritage.
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, a is an acceptable theory for evidence β if and only if a is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
This paper is concerned with a propositional modal logic with operators for necessity, actuality and apriority. The logic is characterized by a class of relational structures defined according to ideas of epistemic two-dimensional semantics, and can therefore be seen as formalizing the relations between necessity, actuality and apriority according to epistemic two-dimensional semantics. We can ask whether this logic is correct, in the sense that its theorems are all and only the informally valid formulas. This paper gives outlines of two arguments that jointly show that this is the case. The first is intended to show that the logic is informally sound, in the sense that all of its theorems are informally valid. The second is intended to show that it is informally complete, in the sense that all informal validities are among its theorems. In order to give these arguments, a number of independently interesting results concerning the logic are proven. In particular, the soundness and completeness of two proof systems with respect to the semantics is proven (Theorems 2.11 and 2.15), as well as a normal form theorem (Theorem 3.2), an elimination theorem for the actuality operator (Corollary 3.6), and the decidability of the logic (Corollary 3.7). It turns out that the logic invalidates a plausible principle concerning the interaction of apriority and necessity; consequently, a variant semantics is briefly explored on which this principle is valid. The paper concludes by assessing the implications of these results for epistemic two-dimensional semantics.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought: only monotonicity has thus far proved indispensable. Section 5 explains another consequence of general relevance: despite the "infinitary" nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
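As an illustration of the kind of axioms surveyed, one standard formulation of the fixed-point axiom and induction rule for common knowledge/belief, with E the "everybody knows/believes" operator, is the following (the systems reviewed in the paper may differ in details such as the strength of the individual belief operators):

\[ \text{(FP)} \quad C\varphi \rightarrow E(\varphi \wedge C\varphi), \qquad \text{(Ind)} \quad \text{from } \vdash \psi \rightarrow E(\varphi \wedge \psi) \text{ infer } \vdash \psi \rightarrow C\varphi. \]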
The paper presents a new analysis of Hempel’s conditions of adequacy, differing from the one in Carnap. Hempel, so it is argued, felt the need for two concepts of confirmation: one aiming at true theories, and another aiming at informative theories. However, so the analysis continues, he also realized that these two concepts were conflicting, and so he gave up the concept of confirmation aiming at informative theories. It is then shown that one can have the cake and eat it: There is a logic of confirmation that accounts for both of these two conflicting aspects.
This paper presents a new analysis of C.G. Hempel’s conditions of adequacy for any relation of confirmation [Hempel C. G. (1945). Aspects of scientific explanation and other essays in the philosophy of science. New York: The Free Press, pp. 3–51.], differing from the one Carnap gave in §87 of his [1962. Logical foundations of probability (2nd ed.). Chicago: University of Chicago Press.]. Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true hypotheses and another aiming at informative hypotheses. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative hypotheses. I then show that one can have Hempel’s cake and eat it too. There is a logic that takes into account both of these two conflicting aspects. According to this logic, a sentence H is an acceptable hypothesis for evidence E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap’s analysis.
In this article, we discuss some issues concerning magical thinking—forms of thought and association mechanisms characteristic of early stages of mental development. We also examine good reasons for having an ambivalent attitude concerning the later permanence in life of these archaic forms of association, and the coexistence of such intuitive but informal thinking with logical and rigorous reasoning. On the one hand, magical thinking seems to serve the creative mind, working as a natural vehicle for new ideas and innovative insights, and giving form to heuristic arguments. On the other hand, it is inherently difficult to control, lacking effective mechanisms needed for rigorous manipulation. Our discussion is illustrated with many examples from the Hebrew Bible, and some final examples from modern science.
Gaining information can be modelled as a narrowing of epistemic space. Intuitively, becoming informed that such-and-such is the case rules out certain scenarios or would-be possibilities. Chalmers’s account of epistemic space treats it as a space of a priori possibility and so has trouble in dealing with the information which we intuitively feel can be gained from logical inference. I propose a more inclusive notion of epistemic space, based on Priest’s notion of open worlds yet which contains only those epistemic scenarios which are not obviously impossible. Whether something is obvious is not always a determinate matter and so the resulting picture is of an epistemic space with fuzzy boundaries.
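A minimal possible-worlds gloss of the narrowing metaphor (an illustration, not the author's full apparatus): if E is the set of scenarios left open before the update, then becoming informed that p yields

\[ E' = E \cap \{\, w : w \models p \,\} \subseteq E. \]

On a purely a priori construal of the space, every a priori truth holds at all scenarios and so rules nothing out, which is why such an account struggles with the information gained from logical inference.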
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Of all twentieth century philosophers, it is Gilles Deleuze whose work agitates most forcefully for a worldview privileging becoming over being, difference over sameness; the world as a complex, open set of multiplicities. Nevertheless, Deleuze remains singular in enlisting mathematical resources to underpin and inform such a position, refusing the hackneyed opposition between ‘static’ mathematical logic versus ‘dynamic’ physical world. This is an international collection of work commissioned from foremost philosophers, mathematicians and philosophers of science, to address the wide range of problematics and influences in this most important strand of Deleuze’s thinking. Contributors are Charles Alunni, Alain Badiou, Gilles Châtelet, Manuel DeLanda, Simon Duffy, Robin Durie, Aden Evens, Arkady Plotnitsky, Jean-Michel Salanskis, Daniel Smith and David Webb.
Logics of joint strategic ability have recently received attention, with arguably the most influential being those in a family that includes Coalition Logic (CL) and Alternating-time Temporal Logic (ATL). Notably, both CL and ATL bypass the epistemic issues that underpin Schelling-type coordination problems, by apparently relying on the meta-level assumption of (perfectly reliable) communication between cooperating rational agents. Yet such epistemic issues arise naturally in settings relevant to ATL and CL: these logics are standardly interpreted on structures where agents move simultaneously, opening the possibility that an agent cannot foresee the concurrent choices of other agents. In this paper we introduce a variant of CL we call Two-Player Strategic Coordination Logic (SCL2). The key novelty of this framework is an operator for capturing coalitional ability when the cooperating agents cannot share strategic information. We identify significant differences in the expressive power and validities of SCL2 and CL2, and present a sound and complete axiomatization for SCL2. We briefly address conceptual challenges when shifting attention to games with more than two players and stronger notions of rationality.
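For orientation, the central coalition modality of CL/ATL can be glossed in standard Pauly-style terms (an informal rendering, not a quotation from the paper):

\[ M, s \models [C]\varphi \iff \text{coalition } C \text{ has a joint action at } s \text{ such that, for every simultaneous choice by the agents outside } C, \text{ the next state satisfies } \varphi. \]

It is the existence claim for a joint action of C, quantified over all concurrent choices of the outsiders, that presupposes the kind of intra-coalition coordination the paper puts in question.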
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
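For concreteness, the classical definitions behind the "two-draw" reading are the standard ones from logical information theory (given here for orientation; notation mine): for a partition π with block probabilities p_1, ..., p_n,

\[ h(\pi) = \sum_i p_i (1 - p_i) = 1 - \sum_i p_i^2, \qquad H(\pi) = \sum_i p_i \log \frac{1}{p_i}, \]

where h(π), the logical entropy, is the probability that two independent draws from the underlying set are distinguished by π (land in different blocks), and H(π) is the corresponding Shannon entropy.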
(See also the separate entry for the volume itself.) This introduction has three parts. The first provides an overview of some main lines of research in deontic logic: the emergence of SDL, Chisholm's paradox and the development of dyadic deontic logics, various other puzzles/challenges and areas of development, along with philosophical applications. The second part focuses on some actual and potential fruitful interactions between deontic logic, computer science and artificial intelligence. These include applications of deontic logic to AI knowledge representation in legal systems, to modelling computer systems where it is expected that sub-ideal states will emerge and require countermeasures, to norm-governed human interactions with computer systems, and to the representation of some features of multi-agent systems where different agent-like computer systems interact with one another. The third and final part briefly groups and previews the papers in the anthology.
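For readers new to the area, the SDL mentioned above (standard deontic logic) is the normal modal logic KD for the obligation operator O; in one standard presentation (not specific to this volume):

\[ \text{(K)} \quad O(p \rightarrow q) \rightarrow (O p \rightarrow O q), \qquad \text{(D)} \quad O p \rightarrow \neg O \neg p, \]

together with necessitation (from a theorem p, infer Op). Chisholm's paradox arises because intuitively consistent sets of conditional obligations, including obligations about what to do once a primary obligation has been violated, come out inconsistent or redundant when formalized in SDL, which is part of what motivated the dyadic systems mentioned above.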
In this paper, I present an informational approach to the nature of personal identity. In “Plato and the problem of the chariot”, I use Plato’s famous metaphor of the chariot to introduce a specific problem regarding the nature of the self as an informational multiagent system: what keeps the self together as a whole and coherent unity? In “Egology and its two branches” and “Egology as synchronic individualisation”, I outline two branches of the theory of the self: one concerning the individualisation of the self as an entity, the other concerning the identification of such entity. I argue that both presuppose an informational approach, defend the view that the individualisation of the self is logically prior to its identification, and suggest that such individualisation can be provided in informational terms. Hence, in “A reconciling hypothesis: the three membranes model”, I offer an informational individualisation of the self, based on a tripartite model, which can help to solve the problem of the chariot. Once this model of the self is outlined, in “ICTs as technologies of the self” I use it to show how ICTs may be interpreted as technologies of the self. In “The logic of realisation”, I introduce the concept of “realization” (Aristotle’s anagnorisis) and support the rather Spinozian view according to which, from the perspective of informational structural realism, selves are the final stage in the development of informational structures. The final “Conclusion: from the egology to the ecology of the self” briefly concludes the article with a reference to the purposeful shaping of the self, in a shift from egology to ecology.
The paper argues that the two best known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC) are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes’ theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies become to a reasoning that is not only informationally useful but also logically valid.
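To illustrate with numbers of my own (not taken from the paper): affirming the consequent infers p from p → q and q. Read probabilistically via Bayes' theorem,

\[ P(p \mid q) = \frac{P(q \mid p)\, P(p)}{P(q \mid p)\, P(p) + P(q \mid \neg p)\, P(\neg p)}. \]

With P(q|p) = 1, as the conditional premise asserts, P(p) = 0.3 and P(q|¬p) = 0.1, this gives P(p|q) = 0.3 / 0.37 ≈ 0.81. Dropping P(q|¬p) and P(¬p) altogether turns the estimate into certainty, which is the fallacy; but as long as q is rare without p, the stripped-down inference remains a useful informational shortcut, which is the sense in which DA and AC can be read as degraded versions of Bayes' theorem.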
Much work in MKM depends on the application of formal logic to mathematics. However, much mathematical knowledge is informal. Luckily, formal logic only represents one tradition in logic, specifically the modeling of inference in terms of logical form. Many inferences cannot be captured in this manner. The study of such inferences is still within the domain of logic, and is sometimes called informal logic. This paper explores some of the benefits informal logic may have for the management of informal mathematical knowledge.
This paper discusses an almost sixty-year-old problem in the philosophy of science -- that of a logic of confirmation. We present a new analysis of Carl G. Hempel's conditions of adequacy (Hempel 1945), differing from the one Carnap gave in §87 of his Logical Foundations of Probability (1962). Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. We then show that one can have Hempel's cake and eat it, too: There is a (rank-theoretic and genuinely nonmonotonic) logic of confirmation -- or rather, theory assessment -- that takes into account both of these two conflicting aspects. According to this logic, a statement H is an acceptable theory for the data E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap's analysis (and solves another problem of confirmation theory).
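One way to make the two thresholds concrete, using Spohn-style ranking functions as the abstract indicates (an illustrative gloss with hypothetical thresholds s and t, not necessarily Huber's exact clauses): a ranking function κ assigns each world a rank in ℕ ∪ {∞}, with κ(A) the minimum rank of the worlds in A and conditional rank κ(B|A) = κ(A ∧ B) − κ(A). One may then require

\[ \kappa(\neg H \mid E) \ge s \quad \text{(H sufficiently plausible given E)}, \qquad \kappa(H \mid \neg E) \ge t \quad \text{(H sufficiently informative about E)}, \]

the second clause capturing informativeness as the degree to which H excludes possibilities compatible with ¬E.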
Husserl (a mathematician by education) left behind a few famous and notable philosophical “slogans” along with his innovative doctrine of phenomenology, directed at transcending “reality” toward a more general essence underlying both “body” and “mind” (after Descartes) and sometimes called “ontology” (terminologically following his notorious assistant Heidegger). Husserl’s tradition can thus be tracked as an idea for philosophy to be reinterpreted in a way that is both generalized and mathematizable in the final analysis. The paper offers a pattern borrowed from the theory of information and quantum information (therefore relating philosophy to both mathematics and physics) to formalize logically a few key concepts of Husserl’s phenomenology, such as “epoché” and the “eidetic, phenomenological, and transcendental reductions”, as well as the identification of the “phenomenological, transcendental, and psychological reductions”, in a way that allows this identification to be continued to the “eidetic reduction” (and thus to mathematics). The approach is tested against an independent and earlier idea of Husserl’s, “logical arithmetic” (implemented in parallel in mathematics by Whitehead and Russell’s Principia), which is interpreted as the “Hilbert arithmetic” generalizing Peano arithmetic. A basic conclusion is that the unification of philosophy, mathematics, and physics in their foundations and fundamentals is the Husserl tradition, both tracked to its origin (in being itself, after Heidegger, or after Husserl’s “zu Sache selbst”) and embodied in the development of human cognition in the third millennium.
This study focuses on undergraduate students' ability to unpack informally written mathematical statements into the language of predicate calculus. Data were collected between 1989 and 1993 from 61 students in six small sections of a “bridge” course designed to introduce proofs and mathematical reasoning. We discuss this data from a perspective that extends the notion of concept image to that of statement image and introduces the notion of proof framework to indicate the top-level logical structure of a proof. For simplified informal calculus statements, just 8.5% of unpacking attempts were successful; for actual statements from calculus texts, this dropped to 5%. We infer that these students would be unable to reliably relate informally stated theorems with the top-level logical structure of their proofs and hence could not be expected to construct proofs or evaluate their validity.
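An illustrative instance of the kind of unpacking at issue (my example, not an item from the study's data): the informally written statement "f is continuous at a" unpacks into

\[ \forall \varepsilon > 0 \; \exists \delta > 0 \; \forall x \, \big( |x - a| < \delta \rightarrow |f(x) - f(a)| < \varepsilon \big), \]

and this top-level logical structure fixes the proof framework: let ε be arbitrary, exhibit a δ, and establish the conditional for arbitrary x.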
The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, is not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
Information-theoretic approaches to formal logic analyze the "common intuitive" concepts of implication, consequence, and validity in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; one given proposition is a consequence of a second if the latter contains all of the information contained by the former; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically, and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyze validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
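The information-theoretic definitions can be compressed into a schematic clause (a notational restatement of the abstract, writing info(X) for the information content of X): p implies q, and q is a consequence of p, exactly when

\[ \mathrm{info}(q) \subseteq \mathrm{info}(p), \]

and an argument with premise-set P and conclusion c is valid exactly when info(c) ⊆ info(P); the contrasting transformation-theoretic clause is that the argument is valid exactly when no admissible transformation carries ⟨P, c⟩ onto an argument with all true premises and a false conclusion.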
Although it seems intuitively clear that acts of requesting are different from acts of commanding, it is not very easy to state their differences precisely in dynamic terms. In this paper we show that it becomes possible to characterize, at least partially, the effects of acts of requesting and compare them with the effects of acts of commanding by combining dynamified deontic logic with epistemic logic. One interesting result is the following: each act of requesting is appropriately differentiated from an act of commanding with the same content, but for each act of requesting, there is another act of commanding with much more complex content which updates models in exactly the same way as it does. We will also consider an application of our characterization of acts of requesting to acts of asking yes-no questions. It yields a straightforward formalization of the view of acts of asking questions as requests for information.
The informal logic movement began as an attempt to develop – and teach – an alternative logic which can account for the real life arguing that surrounds us in our daily lives – in newspapers and the popular media, political and social commentary, advertising, and interpersonal exchange. The movement was rooted in research and discussion in Canada and especially at the University of Windsor, and has become a branch of argumentation theory which intersects with related traditions and approaches (notably formal logic, rhetoric and dialectics in the form of pragma-dialectics). In this volume, some of the best known contributors to the movement discuss their views and the reasoning and argument which is informal logic’s subject matter. Many themes and issues are explored in a way that will fuel the continued evolution of the field. Federico Puppo adds an insightful essay which considers the origins and development of informal logic and whether informal logicians are properly described as a “school” of thought. In considering that proposition, Puppo introduces readers to a diverse range of essays, some of them previously published, others written specifically for this volume.
An important problem with machine learning is that when the label number n>2, it is very difficult to construct and optimize a group of learning functions, and we also want the optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with likelihood functions, Bayesian posteriors, and Logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In Logical Bayesian Inference (LBI), every label’s learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a big enough sample with labels, without preparing different samples for different labels. A group of Channel Matching (CM) algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make mutual information between three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved and becomes the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced, or local convergence exists. The CM iteration algorithm needs to combine neural networks for MMI classifications on high-dimensional feature spaces. LBI needs further studies for the unification of statistics and logic.
N. Wiener's negative definition of information is well known: it states what information is not. According to this definition, it is neither matter nor energy. But what is it? It is shown how one can follow the lead of dialectical logic as expounded by G.W.F. Hegel in his main work -- "The Science of Logic" -- to answer this and some related questions.
Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Section 2, semantic information is shown to be translatable into propositional semantic information (i). In Section 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Section 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sections 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science’s sense of verification and validation); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of proxy) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of commutation); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. Section 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers, not just systems users. In the course of the article all technical expressions from computer science are explained.
According to a prevalent view among philosophers, formal logic is the philosopher’s main tool to assess the validity of arguments, i.e. the philosopher’s ars iudicandi. By drawing on a famous dispute between Russell and Strawson over the validity of a certain kind of argument – of arguments whose premises feature definite descriptions – this paper casts doubt on the accuracy of the ars iudicandi conception. Rather than settling the question whether the contentious arguments are valid or not, Russell and Strawson, upon discussing the proper logical analysis of definite descriptions, merely contrast converse informal validity assessments rendered explicit by nonequivalent logical formalizations.
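For orientation, the logical analysis at stake in the dispute is Russell's: a premise of the form "The F is G" is formalized as

\[ \exists x \big( F x \wedge \forall y (F y \rightarrow y = x) \wedge G x \big), \]

so that the existence and uniqueness of an F are asserted, whereas on Strawson's treatment they are presupposed and the sentence lacks a truth value when the presupposition fails (standard renderings of the two positions, not quotations from the paper).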
A principle, according to which any scientific theory can be mathematized, is investigated. That theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and then the conception of “bracketing reality” is modelled to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. A comparison to Mach’s doctrine is used to reveal the fundamental philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other in their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed. An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
The paper considers the symmetries of a bit of information corresponding to one, two or three qubits of quantum information and identifiable as the three basic symmetries of the Standard model, U(1), SU(2), and SU(3), respectively. They refer to “empty qubits” (or the free variable of quantum information), i.e. those in which no point is chosen (recorded). The choice of a certain point violates those symmetries. It can be represented furthermore as the choice of a privileged reference frame (e.g. that of the Big Bang), which can be described exhaustively by means of 16 numbers (4 for position, 4 for velocity, and 8 for acceleration) independently of time, but in the space-time continuum, and still one more, 17th number is necessary for the rest mass of the observer in it. The same 17 numbers describing exhaustively a privileged reference frame thus granted to be “zero”, respectively a certain violation of all the three symmetries of the Standard model or the “record” in a qubit in general, can be represented as 17 elementary wave functions (or classes of wave functions) after the bijection of natural and transfinite natural (ordinal) numbers in Hilbert arithmetic, and further identified as those corresponding to the 17 elementary particles of the Standard model. Two generalizations of the relevant concepts of general relativity are introduced: (1) a “discrete reference frame”, generalizing the class of all arbitrarily accelerated reference frames constituting a smooth manifold; (2) a still more general principle of relativity, generalizing the general principle of relativity and meaning the conservation of quantum information across all discrete reference frames as well as across the smooth manifold of all reference frames of general relativity. Then, the bijective transition from an accelerated reference frame to the 17 elementary wave functions of the Standard model can be interpreted by the still more general principle of relativity as the equivalent redescription of a privileged reference frame: a smooth one into a discrete one. The conservation of quantum information related to the generalization of the concept of reference frame can be interpreted as restoring the concept of the ether, the absolutely immovable medium and reference frame of Newtonian mechanics, relative to which motion can be interpreted as absolute, or logically: relations as properties. The new ether is to consist of qubits (or quantum information). One can track the conceptual pathway of the “ether” from Newtonian mechanics via special relativity, via general relativity, via quantum mechanics, to the theory of quantum information (or “quantum mechanics and information”). The identification of entanglement and gravity can be considered also as a “byproduct” implied by the transition from the smooth “ether of special and general relativity” to the “flat” ether of quantum mechanics and information. The qubit ether is outside the “temporal screen” in general and is depicted on it as both matter and energy, both dark and visible.
The paper proposes two logical analyses of (the norms of) justification. In a first, realist-minded case, truth is logically independent from justification and leads to a pragmatic logic LP including two epistemic and pragmatic operators, namely, assertion and hypothesis. In a second, antirealist-minded case, truth is not logically independent from justification and results in two logical systems of information and justification: AR4 and AR4′, respectively, provided with a question-answer semantics. The latter proposes many more epistemic agents, each corresponding to a wide variety of epistemic norms. After comparing the different norms of justification involved in these logical systems, two hexagons expressing Aristotelian relations of opposition will be gathered in order to clarify how (a fragment of) pragmatic formulas can be interpreted in a fuzzy-based question-answer semantics.
A uniform construction for sequent calculi for finite-valued first-order logics with distribution quantifiers is exhibited. Completeness, cut-elimination and midsequent theorems are established. As an application, an analog of Herbrand’s theorem for the four-valued knowledge-representation logic of Belnap and Ginsberg is presented. It is indicated how this theorem can be used for reasoning about knowledge bases with incomplete and inconsistent information.
Any logic is represented as a certain collection of well-orderings admitting or not some algebraic structure such as a generalized lattice. Then universal logic should refer to the class of all subclasses of all well-orderings. One can construct a mapping between Hilbert space and the class of all logics. Thus there exists a correspondence between universal logic and the world if the latter is considered a collection of wave functions, as the points in Hilbert space can be interpreted. The correspondence can be further extended to the foundation of mathematics by set theory and arithmetic, and thus to all mathematics.
Human actions and decisions are, most of the time, not only grounded in emotional reactions; they are irrationally debasing. While such emotions and heuristics were perhaps suitable for dealing with life in the Stone Age, they are woefully inadequate in the Silicon Age. The substitution of traditional news agencies and communication platforms in Nigeria with social media networks has not only increased human capacities, it has aided the common good and further eased communication and increased the human knowledge base. For instance, Silicon Valley in the state of California in the United States of America has proved the extent to which human ingenuity can be exerted in beneficial ways. There, top multibillion-dollar communication companies like Apple, eBay, Cisco, Lockheed, Hewlett Packard (HP), Google, Netflix, Facebook, Oracle, Tesla, etc., whose yearly budgets far exceed the entire yearly budget of Nigeria, have proved that ICT remains the building block of contemporary communities. The state of California, for instance, prides itself on being the 6th largest economy of the world, after France and Brazil. This romantic picture of ICT is but only one chapter of the ICT divide. In today’s world, the scale and speed of the highly partisan news and falsehoods that circulate in the human environment is deafening. In politics and governance, for instance, populists have exploited ICT effectively and without restraint to access power and authority. Today, we are in the era of post-truth, an era in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief; an era that harbours the corruption of intellectual integrity and damage to the whole fabric of democracy. The battle to protect information integrity and expose fake news becomes a sine qua non. We shall in this paper explore the dark side of the age of information, a side that has been exploited by the media mavens, political hacks, and ideological propagandists who promote lies, illusion, confusion, and other forms of demented or manipulated imagination. In doing this, we shall proceed as follows: (i) historicize the concept of ‘fake news’, here referred to as post-truth; (ii) interrogate the concept of ‘information’ and its goal in human affairs; (iii) evaluate the nexus between ‘truth’ values in ‘information’, ‘misinformation’ and ‘disinformation’; and (iv) situate the role of philosophy in the era of post-truth.
The topic of this paper may be introduced by fast zooming in and out of the philosophy of information. In recent years, philosophical interest in the nature of information has been increasing steadily. This has led to a focus on semantic information, and then on the logic of being informed, which has attracted analyses concentrating both on the statal sense in which S holds the information that p (this is what I mean by logic of being informed in the rest of this article) and on the actional sense in which S becomes informed that p. One of the consequences of the logic debate has been a renewed epistemological interest in the principle of information closure (henceforth PIC), which finally has motivated a revival of a skeptical objection against its tenability first made popular by Dretske. This is the topic of the paper, in which I seek to defend PIC against the skeptical objection. If I am successful, this means – and we are now zooming out – that the plausibility of PIC is not undermined by the skeptical objection, and therefore that a major epistemological argument against the formalization of the logic of being informed based on the axiom of distribution in modal logic is removed. But since the axiom of distribution discriminates between normal and non-normal modal logics, this means that a potentially good reason to look for a formalization of the logic of being informed among the non-normal modal logics, which reject the axiom, is also removed. And this in turn means that a formalization of the logic of being informed in terms of the normal modal logic B (also known as KTB) is still very plausible, at least insofar as this specific obstacle is concerned. In short, I shall argue that the skeptical objection against PIC fails, so it is not a good reason to abandon the normal modal logic B as a good formalization of the logic of being informed.
In this article, I define and then defend the principle of information closure (pic) against a sceptical objection similar to the one discussed by Dretske in relation to the principle of epistemic closure. If I am successful, given that pic is equivalent to the axiom of distribution and that the latter is one of the conditions that discriminate between normal and non-normal modal logics, a main result of such a defence is that one potentially good reason to look for a formalization of the logic of “$S$ is informed that $p$” among the non-normal modal logics, which reject the axiom, is also removed. This is not to argue that the logic of “$S$ is informed that $p$” should be a normal modal logic, but that it could still be insofar as the objection that it could not be, based on the sceptical objection against pic, has been removed. In other words, I shall argue that the sceptical objection against pic fails, so such an objection provides no ground to abandon the normal modal logic B (also known as KTB) as a formalization of “$S$ is informed that $p$”, which remains plausible insofar as this specific obstacle is concerned.
Resolving the main problem of quantum mechanics, namely how a quantum leap and a smooth motion can be uniformly described, also resolves the problem of how a distribution of reliable data and a sequence of deductive conclusions can be uniformly described, by means of a relevant wave function “Ψdata”.
Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted the invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristics explored here is that an authentic category of reality, information, must be coupled with the standard categories, matter, energy, space and time to account for what life is. The use of this still elusive category permits us to interact with reality via construction of self-consistent models producing predictions which can be instantiated into experiments. While the present theory of information has much to say about the program, with the creative properties of recursivity at its heart, we almost entirely lack a theory of the information supporting the machine. We suggest that the program of life codes for processes meant to trap information which comes from the context provided by the environment of the machine.