The informal logic movement began as an attempt to develop – and teach – an alternative logic which can account for the real-life arguing that surrounds us in our daily lives – in newspapers and the popular media, political and social commentary, advertising, and interpersonal exchange. The movement was rooted in research and discussion in Canada and especially at the University of Windsor, and has become a branch of argumentation theory which intersects with related traditions and approaches (notably formal logic, rhetoric and dialectics in the form of pragma-dialectics). In this volume, some of the best known contributors to the movement discuss their views and the reasoning and argument which is informal logic's subject matter. Many themes and issues are explored in a way that will fuel the continued evolution of the field. Federico Puppo adds an insightful essay which considers the origins and development of informal logic and whether informal logicians are properly described as a "school" of thought. In considering that proposition, Puppo introduces readers to a diverse range of essays, some of them previously published, others written specifically for this volume.
The argument diagramming method developed by Monroe C. Beardsley in his 1950 book Practical Logic, which has since become the gold standard for diagramming arguments in informal logic, makes it possible to map the relation between premises and conclusions of a chain of reasoning in relatively complex ways. The method has since been adapted and developed in a number of directions by many contemporary informal logicians and argumentation theorists. It has proved useful in practical applications and especially pedagogically in teaching basic logic and critical reasoning skills at all levels of scientific education. I propose in this essay to build on Beardsley's diagramming techniques, refining and supplementing their structural tools for visualizing logical relationships in a number of categories not originally accommodated by Beardsley's method, including circular reasoning, reductio ad absurdum arguments, and efforts to dispute and contradict arguments, with applications and analysis.
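To make the convention concrete, here is a minimal, invented illustration of the kind of diagram the method yields (numbered nodes stand for statements, arrows for support; the layout is schematic rather than Beardsley's exact notation):

```
(1) The brakes had failed.     (2) A car with failed brakes is unsafe.
             \                          /
              +-----------+-----------+
                          |
                          v
              (3) The car was unsafe.
                          |
                          v
        (4) The crash was not the driver's fault.
```

Here (1) and (2) jointly support (3), which in turn supports (4), combining linked premises with a serial chain.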
Much work in MKM depends on the application of formal logic to mathematics. However, much mathematical knowledge is informal. Luckily, formal logic only represents one tradition in logic, specifically the modeling of inference in terms of logical form. Many inferences cannot be captured in this manner. The study of such inferences is still within the domain of logic, and is sometimes called informal logic. This paper explores some of the benefits informal logic may have for the management of informal mathematical knowledge.
I argue against the skeptical epistemological view exemplified by the Groarkes that “all theories of informal argument must face the regress problem.” It is true that in our theoretical representations of reasoning, infinite regresses of self-justification regularly and inadvertently arise with respect to each of the RSA criteria for argument cogency (the premises are to be relevant, sufficient, and acceptable). But they arise needlessly, by confusing an RSA criterion with argument content, usually premise material.
One of the open problems in the philosophy of information is whether there is an information logic (IL), different from epistemic logic (EL) and doxastic logic (DL), which formalises the relation "a is informed that p" (Iap) satisfactorily. In this paper, the problem is solved by arguing that the axiom schemata of the normal modal logic (NML) KTB (also known as B or Br or Brouwer's system) are well suited to formalise the relation of "being informed". After having shown that IL can be constructed as an informational reading of KTB, four consequences of a KTB-based IL are explored: information overload; the veridicality thesis (Iap → p); the relation between IL and EL; and the Kp → Bp principle or entailment property, according to which knowledge implies belief. Although these issues are discussed later in the article, they are the motivations behind the development of IL.
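For orientation, the KTB schemata under their informational reading, writing $I_a$ for "a is informed that" (a standard formulation; the paper's own notation may differ slightly):

$$\begin{aligned}
\text{(K)}\quad & I_a(p \to q) \to (I_a p \to I_a q)\\
\text{(T)}\quad & I_a p \to p \qquad \text{(the veridicality thesis)}\\
\text{(B)}\quad & p \to I_a \neg I_a \neg p
\end{aligned}$$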
In this article, I outline a logic of design of a system as a specific kind of conceptual logic of the design of the model of a system, that is, the blueprint that provides information about the system to be created. In section two, I introduce the method of levels of abstraction as a modelling tool borrowed from computer science. In section three, I use this method to clarify two main conceptual logics of information inherited from modernity: Kant's transcendental logic of conditions of possibility of a system, and Hegel's dialectical logic of conditions of in/stability of a system. Both conceptual logics of information analyse structural properties of given systems. Strictly speaking, neither is a conceptual logic of information about the conditions of feasibility of a system, that is, neither is a logic of information as a logic of design. So, in section four, I outline this third conceptual logic of information and then interpret the conceptual logic of design as a logic of requirements, by introducing the relation of "sufficientisation". In the conclusion, I argue that the logic of requirements is exactly what we need in order to make sense of, and buttress, a constructionist approach to knowledge.
Gaining information can be modelled as a narrowing of epistemic space. Intuitively, becoming informed that such-and-such is the case rules out certain scenarios or would-be possibilities. Chalmers's account of epistemic space treats it as a space of a priori possibility and so has trouble in dealing with the information which we intuitively feel can be gained from logical inference. I propose a more inclusive notion of epistemic space, based on Priest's notion of open worlds, yet containing only those epistemic scenarios which are not obviously impossible. Whether something is obvious is not always a determinate matter, and so the resulting picture is of an epistemic space with fuzzy boundaries.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as "two-draw" probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
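For reference, the definitions at stake can be stated compactly (the classical formulas are standard in this literature; the quantum formula is the usual purity-based analogue and is offered here only for orientation): for a partition $\pi = \{B_1, \dots, B_m\}$ with block probabilities $p_i$, and a density matrix $\rho$,

$$ h(\pi) = \sum_i p_i(1 - p_i) = 1 - \sum_i p_i^2, \qquad H(\pi) = -\sum_i p_i \log_2 p_i, \qquad h(\rho) = 1 - \operatorname{tr}(\rho^2), $$

where $h(\pi)$ is exactly the "two-draw" probability that two independent draws land in distinct blocks, and $H(\pi)$ is the Shannon entropy it is contrasted with.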
The paper argues that the two best known formal logical fallacies, namely denying the antecedent (DA) and affirming the consequent (AC) are not just basic and simple errors, which prove human irrationality, but rather informational shortcuts, which may provide a quick and dirty way of extracting useful information from the environment. DA and AC are shown to be degraded versions of Bayes' theorem, once this is stripped of some of its probabilities. The less the probabilities count, the closer these fallacies become to a reasoning that is not only informationally useful but also logically valid.
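As a hedged reconstruction of the claim (the paper's own derivation should be consulted for the details): Bayes' theorem gives

$$ P(p \mid q) = \frac{P(q \mid p)\,P(p)}{P(q)}, $$

and reading "if p then q" as $P(q \mid p) = 1$ reduces this to $P(p \mid q) = P(p)/P(q)$. AC concludes $p$ from $q$ outright, in effect dropping the remaining ratio; the closer $P(p)/P(q)$ is to 1, the less information the shortcut throws away, and in the limit the inference is no longer fallacious at all.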
Information-theoretic approaches to formal logic analyze the "common intuitive" concepts of implication, consequence, and validity in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; one given proposition is a consequence of a second if the latter contains all of the information contained by the former; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically, and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyze validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
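Schematically, and only as a gloss on the abstract's wording, the two analyses of "premise set $P$ implies conclusion $c$" contrast as follows:

$$ P \vDash c \iff \operatorname{info}(c) \subseteq \operatorname{info}(P) \qquad \text{(information-theoretic)} $$

$$ P \vDash c \iff \text{no transformation } \tau \text{ makes every member of } \tau(P) \text{ true and } \tau(c) \text{ false} \qquad \text{(transformation-theoretic)} $$

with substitution, reinterpretation and model shift each supplying a particular notion of transformation $\tau$.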
According to a prevalent view among philosophers, formal logic is the philosopher's main tool to assess the validity of arguments, i.e. the philosopher's ars iudicandi. By drawing on a famous dispute between Russell and Strawson over the validity of a certain kind of argument – arguments whose premises feature definite descriptions – this paper casts doubt on the accuracy of the ars iudicandi conception. Rather than settling the question whether the contentious arguments are valid or not, Russell and Strawson, upon discussing the proper logical analysis of definite descriptions, merely contrast converse informal validity assessments rendered explicit by nonequivalent logical formalizations.
An important problem with machine learning is that when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In LBI, every label's learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a big enough sample with labels, without preparing different samples for different labels. A group of CM algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved and becomes the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced or local convergence exists. The CM iteration algorithm needs to combine neural networks for MMI classifications on high-dimensional feature spaces. LBI needs further studies for the unification of statistics and logic.
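A toy sketch of the label-wise step, under one loudly flagged assumption that is not spelled out in the abstract: that the optimized membership function of a label is its empirical posterior rescaled by the posterior's maximum, so that each label can be learned from the same sample independently of the others. All names, distributions and the functional form are illustrative.

```python
import numpy as np

# Illustrative only: learn each label's membership (truth) function from one
# labeled sample, independently of the other labels. Assumed functional form
# (an assumption for this sketch, not a quotation of the paper):
#   T(theta_j | x) = P(theta_j | x) / max_x P(theta_j | x).

rng = np.random.default_rng(0)
xs = np.concatenate([rng.normal(-2, 1, 500), rng.normal(2, 1, 500)])
ys = np.concatenate([np.zeros(500, dtype=int), np.ones(500, dtype=int)])
edges = np.linspace(-5, 5, 21)
centers = (edges[:-1] + edges[1:]) / 2

for label in (0, 1):
    total, _ = np.histogram(xs, edges)
    hits, _ = np.histogram(xs[ys == label], edges)
    post = np.where(total > 0, hits / np.maximum(total, 1), 0.0)
    membership = post / post.max()   # one label at a time, same sample
    print(f"label {label}: membership peaks near x = {centers[membership.argmax()]:.1f}")
```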
Information-based epistemology maintains that ‘being informed’ is an independent cognitive state that cannot be reduced to knowledge or to belief, and the modal logic KTB has been proposed as a model. But what distinguishes the KTB analysis of ‘being informed’, the Brouwersche schema (B), is precisely its downfall, for no logic of information should include (B) and, more generally, no epistemic logic should include (B), either.
Quantum information is discussed as the universal substance of the world. It is interpreted as that generalization of classical information which includes both finite and transfinite ordinal numbers. On the other hand, any wave function and thus any state of any quantum system is just one value of quantum information. Information and its generalization as quantum information are considered as quantities of elementary choices. Their units are correspondingly a bit and a qubit. The course of time is what generates choices by itself, and thus quantum information and, in the final analysis, any item in the world. The course of time necessarily generates choices so: the future is absolutely unorderable in principle, while the past is always well-ordered and thus unchangeable. The present, as the mediation between them, needs the well-ordering theorem, equivalent to the axiom of choice. The latter guarantees the choice even among the elements of an infinite set, which is the case of quantum information. The concrete and abstract objects share information as their common base, which is quantum as to the former and classical as to the latter. The general quantities of matter in physics, mass and energy, can be considered as particular cases of quantum information. The link between choice and abstraction in set theory allows "Hume's principle" to be interpreted in terms of quantum mechanics as an equivalence of "many" and "much" underlying quantum information. Quantum information as the universal substance of the world calls for the unity of physics and mathematics, rather than that of the concrete and abstract objects, and thus, in the final analysis, for a form of quantum neo-Pythagoreanism.
This article explores the usefulness of interdisciplinarity as a method of enquiry by proposing an investigation of the concept of information in the light of semiotics. This is because, as Kull, Deacon, Emmeche, Hoffmeyer and Stjernfelt state, information is an implicitly semiotic term (Biological Theory 4(2):167–173, 2009: 169), but the logical relation between semiosis and information has not been sufficiently clarified yet. Across the history of cybernetics, the concept of information undergoes an uneven development; that is, information is an 'objective' entity in first order cybernetics, and becomes a 'subjective' entity in second order cybernetics. This contradiction relegates the status of information to that of a 'true' or 'false' formal logic problem. The present study proposes that a solution to this contradiction can be found in Deely's reconfiguration of Peirce's 'object' (as found in his triadic model of semiosis) into 'thing' and 'object' (Deely 1981). This ontology allows one to argue that information is neither 'true' nor 'false', and to suggest that, when considered in light of its workability, information can be both true and false, and as such it constitutes an organism's purely objective reality (Deely 2009b). It is stated that in the process of building such a reality, information is 'motivated' by environmental, physiological, emotional (including past feelings and expectations) constraints which are, in turn, framed by observership. Information is therefore found in the irreducible cybersemiotic process that links at once all these conditions and that is simultaneously constrained by them. The integration of cybernetics' and semiotics' understanding of information shows that history is the analytical principle that grants scientific rigour to interdisciplinary investigations. As such, in any attempt to clarify its epistemological stance (e.g. the semiotic aspect of information), it is argued that biosemiotics needs to acknowledge not only semiotics (as it does) but also cybernetics in its interdisciplinary heritage.
Any logic is represented as a certain collection of well-orderings, which may or may not admit some algebraic structure such as a generalized lattice. Then universal logic should refer to the class of all subclasses of all well-orderings. One can construct a mapping between Hilbert space and the class of all logics. Thus there exists a correspondence between universal logic and the world, if the latter is considered a collection of wave functions, as which the points in Hilbert space can be interpreted. The correspondence can be further extended to the foundation of mathematics by set theory and arithmetic, and thus to all mathematics.
Resolving the main problem of quantum mechanics, namely how a quantum leap and a smooth motion can be uniformly described, also resolves the problem of how a distribution of reliable data and a sequence of deductive conclusions can be uniformly described, by means of a relevant wave function "Ψdata".
The purpose of this paper is to review and discuss Luciano Floridi's 2019 book The Logic of Information: A Theory of Philosophy as Conceptual Design, the latest instalment in his philosophy of information (PI) tetralogy, particularly with respect to its implications for library and information studies (LIS).
We present a formal semantics for epistemic logic, capturing the notion of knowability relative to information (KRI). Like Dretske, we move from the platitude that what an agent can know depends on her (empirical) information. We treat operators of the form K_AB ('B is knowable on the basis of information A') as variably strict quantifiers over worlds with a topic- or aboutness-preservation constraint. Variable strictness models the non-monotonicity of knowledge acquisition while allowing knowledge to be intrinsically stable. Aboutness-preservation models the topic-sensitivity of information, allowing us to invalidate controversial forms of epistemic closure while validating less controversial ones. Thus, unlike the standard modal framework for epistemic logic, KRI accommodates plausible approaches to the Kripke-Harman dogmatism paradox, which bear on non-monotonicity, or on topic-sensitivity. KRI also strikes a better balance between agent idealization and a non-trivial logic of knowledge ascriptions.
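As a schematic rendering of the truth clause (hedged; the paper's official semantics may differ in detail), with $f_A(w)$ selecting the relevant $A$-worlds at $w$ and $t$ assigning topics:

$$ w \vDash K_A B \iff t(B) \preceq t(A) \ \text{ and } \ u \vDash B \text{ for all } u \in f_A(w). $$

Variable strictness comes from the dependence of the selection $f_A$ on the information $A$; the conjunct $t(B) \preceq t(A)$ is the topic-preservation constraint.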
One can construct a mapping between Hilbert space and the class of all logics, if the latter is defined as the set of all well-orderings of some relevant set (or class). That mapping can be further interpreted as a mapping between all states of all quantum systems, on the one hand, and all logics, on the other hand. The collection of all states of all quantum systems is equivalent to the world (the universe) as a whole. Thus that mapping establishes a fundamentally philosophical correspondence between the physical world and universal logic, by the mediation of a special and fundamental structure, that of Hilbert space, and therefore between quantum mechanics and logic, by mathematics. Furthermore, Hilbert space can be interpreted as the free variable of "quantum information", and any point in it as a value of that variable once "bound" by the axiom of choice.
In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of information for the formulae of these languages and introduce the concept of informational logical consequence, identifying some important results, among them: certain arguments that have traditionally been considered valid, such as modus ponens, are not valid from the informational perspective; the logic underlying informational logical consequence is not classical, and is at the least paraconsistent sensu lato; informational logical consequence is not a Tarskian logical consequence.
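A minimal executable sketch of the quantitative starting point, under simplifying assumptions of convenience (a uniform probability over truth-value assignments stands in for the paper's full probabilistic semantics):

```python
from itertools import product
from math import log2

# Toy measure: Pr(phi) = proportion of truth-value assignments satisfying
# phi; quantity of information inf(phi) = -log2 Pr(phi).

def prob(formula, atoms):
    rows = list(product([True, False], repeat=len(atoms)))
    models = sum(1 for row in rows if formula(dict(zip(atoms, row))))
    return models / len(rows)

def info(formula, atoms):
    return -log2(prob(formula, atoms))

atoms = ["p", "q"]
p   = lambda v: v["p"]
q   = lambda v: v["q"]
imp = lambda v: (not v["p"]) or v["q"]            # p -> q
mp  = lambda v: p(v) and imp(v)                   # {p, p -> q} conjoined

for name, f in [("p", p), ("p->q", imp), ("p & (p->q)", mp), ("q", q)]:
    print(f"inf({name}) = {info(f, atoms):.3f}")  # 1.0, 0.415, 2.0, 1.0
```

On this toy measure the conjoined premises of modus ponens carry 2 bits while the conclusion q carries 1; the paper's own definitions, built over a richer probabilistic semantics, determine which comparisons of this kind underwrite informational consequence.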
Purpose – To review and discuss Luciano Floridi's 2019 book The Logic of Information: A Theory of Philosophy as Conceptual Design, the latest instalment in his philosophy of information tetralogy, particularly with respect to its implications for library and information studies. Design/methodology/approach – Nine scholars with research interests in philosophy and LIS read and responded to the book, raising critical and heuristic questions in the spirit of scholarly dialogue. Floridi responded to these questions. Findings – Floridi's PI, including this latest publication, is of interest to LIS scholars, and much insight can be gained by exploring this connection. It seems also that LIS has the potential to contribute to PI's further development in some respects. Research implications – Floridi's PI work is technical philosophy that many LIS scholars do not have the training or patience to engage with, yet doing so is rewarding. This suggests a role for translational work between philosophy and LIS. Originality/value – The book symposium format, not yet seen in LIS, provides a forum for sustained, multifaceted and generative dialogue around ideas.
Human actions and decisions are, most of the time, not only grounded in emotional reactions but irrationally debased. While such emotions and heuristics were perhaps suitable for dealing with life in the Stone Age, they are woefully inadequate in the Silicon Age. The substitution of traditional news agencies and communication platforms in Nigeria with social media networks has not only increased human capacities; it has aided the common good, further eased communication and increased the human knowledge base. For instance, Silicon Valley in the state of California in the United States of America has proved the extent to which human ingenuity can be exerted in beneficial ways. Here, the top multibillion-dollar communication companies like Apple, eBay, Cisco, Lockheed, Hewlett Packard (HP), Google, Netflix, Facebook, Oracle, Tesla, etc., whose yearly budgets far exceed the entire yearly budget of Nigeria, have proved that ICT remains the building block of contemporary communities. The state of California, for instance, prides itself on being the 6th largest economy in the world, having overtaken France and Brazil. This romantic picture of ICT is but one chapter of the ICT divide. In today's world, the scale and speed of the highly partisan news and falsehoods that circulate in the human environment is deafening. In politics and governance, for instance, populists have exploited ICT effectively and without restraint to access power and authority. Today, we are in the era of post-truth, an era in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief. An era that harbours corruption of intellectual integrity and damage to the whole fabric of democracy. The battle to protect information integrity and expose fake news thus becomes a sine qua non. We shall in this paper explore the dark side of the age of information, a side that has been exploited by the media mavens, political hacks, and ideological propagandists who promote lies, illusion, confusion, and other forms of demented or manipulated imagination. In doing this, we shall proceed as follows: i. historicize the concept of 'fake news', here referred to as post-truth; ii. interrogate the concept of 'information' and its goal in human affairs; iii. evaluate the nexus between 'truth' values in 'information', 'misinformation' and 'disinformation'; iv. situate the role of philosophy in the era of post-truth.
In this paper, I present an informational approach to the nature of personal identity. In "Plato and the problem of the chariot", I use Plato's famous metaphor of the chariot to introduce a specific problem regarding the nature of the self as an informational multiagent system: what keeps the self together as a whole and coherent unity? In "Egology and its two branches" and "Egology as synchronic individualisation", I outline two branches of the theory of the self: one concerning the individualisation of the self as an entity, the other concerning the identification of such an entity. I argue that both presuppose an informational approach, defend the view that the individualisation of the self is logically prior to its identification, and suggest that such individualisation can be provided in informational terms. Hence, in "A reconciling hypothesis: the three membranes model", I offer an informational individualisation of the self, based on a tripartite model, which can help to solve the problem of the chariot. Once this model of the self is outlined, in "ICTs as technologies of the self" I use it to show how ICTs may be interpreted as technologies of the self. In "The logic of realisation", I introduce the concept of "realisation" (Aristotle's anagnorisis) and support the rather Spinozian view according to which, from the perspective of informational structural realism, selves are the final stage in the development of informational structures. The final "Conclusion: from the egology to the ecology of the self" briefly concludes the article with a reference to the purposeful shaping of the self, in a shift from egology to ecology.
The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient for it to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, are not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
N. Wiener's negative definition of information is well known: it states what information is not. According to this definition, it is neither matter nor energy. But what is it? It is shown how one can follow the lead of dialectical logic as expounded by G.W.F. Hegel in his main work -- "The Science of Logic" -- to answer this and some related questions.
In this article, I define and then defend the principle of information closure (pic) against a sceptical objection similar to the one discussed by Dretske in relation to the principle of epistemic closure. If I am successful, given that pic is equivalent to the axiom of distribution and that the latter is one of the conditions that discriminate between normal and non-normal modal logics, a main result of such a defence is that one potentially good reason to look for a formalization of the logic of "$S$ is informed that $p$" among the non-normal modal logics, which reject the axiom, is also removed. This is not to argue that the logic of "$S$ is informed that $p$" should be a normal modal logic, but that it could still be, insofar as the objection that it could not be, based on the sceptical objection against pic, has been removed. In other words, I shall argue that the sceptical objection against pic fails, so such an objection provides no ground to abandon the normal modal logic B (also known as KTB) as a formalization of "$S$ is informed that $p$", which remains plausible insofar as this specific obstacle is concerned.
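The equivalence the article trades on can be stated compactly: pic, read informationally, is the axiom of distribution for the operator $I_a$:

$$ (I_a\,p \wedge I_a(p \to q)) \to I_a\,q. $$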
According to certain normative theories in epistemology, rationality requires us to be logically omniscient. Yet this prescription clashes with our ordinary judgments of rationality. How should we resolve this tension? In this paper, I focus particularly on the logical omniscience requirement in Bayesian epistemology. Building on a key insight by Hacking (1967), I develop a version of Bayesianism that permits logical ignorance. This includes: an account of the synchronic norms that govern a logically ignorant individual at any given time; an account of how we reduce our logical ignorance by learning logical facts and how we should update our credences in response to such evidence; and an account of when logical ignorance is irrational and when it isn't. At the end, I explain why the requirement of logical omniscience remains true of ideal agents with no computational, processing, or storage limitations.
What is the ultimate nature of reality? This paper defends an answer in terms of informational realism (IR). It does so in three stages. First, it is shown that, within the debate about structural realism (SR), epistemic (ESR) and ontic (OSR) structural realism are reconcilable by using the methodology of the levels of abstraction. It follows that OSR is defensible from a structuralist-friendly position. Second, it is argued that OSR is also plausible, because not all related objects are logically prior to all relational structures. The relation of difference is at least as fundamental as (because constitutive of) any relata. Third, it is suggested that an ontology of structural objects for OSR can reasonably be developed in terms of informational objects, and that Object Oriented Programming provides a flexible and powerful methodology with which to clarify and make precise the concept of "informational object". The outcome is informational realism, the view that the world is the totality of informational objects dynamically interacting with each other.
I want to model a finite, fallible cognitive agent who imagines that p in the sense of mentally representing a scenario—a configuration of objects and properties—correctly described by p. I propose to capture imagination, so understood, via variably strict world quantifiers, in a modal framework including both possible and so-called impossible worlds. The latter secure lack of classical logical closure for the relevant mental states, while the variability of strictness captures how the agent imports information from actuality in the imagined non-actual scenarios. Imagination turns out to be highly hyperintensional, but not logically anarchic. Section 1 sets the stage and impossible worlds are quickly introduced in Sect. 2. Section 3 proposes to model imagination via variably strict world quantifiers. Section 4 introduces the formal semantics. Section 5 argues that imagination has a minimal mereological structure validating some logical inferences. Section 6 deals with how imagination under-determines the represented contents. Section 7 proposes additional constraints on the semantics, validating further inferences. Section 8 describes some welcome invalidities. Section 9 examines the effects of importing false beliefs into the imagined scenarios. Finally, Sect. 10 hints at possible developments of the theory in the direction of two-dimensional semantics.
We present a framework for epistemic logic, modeling the logical aspects of System 1 and System 2 cognitive processes, as per dual process theories of reasoning. The framework combines non-normal worlds semantics with the techniques of Dynamic Epistemic Logic. It models non-logically-omniscient, but moderately rational agents: their System 1 makes fast sense of incoming information by integrating it on the basis of their background knowledge and beliefs. Their System 2 allows them to slowly, step-wise unpack some of the logical consequences of such knowledge and beliefs, by paying a cognitive cost. The framework is applied to three instances of limited rationality, widely discussed in cognitive psychology: Stereotypical Thinking, the Framing Effect, and the Anchoring Effect.
Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science's sense of verification and validation); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science's sense of proxy) and (5) proximal access to m commutes with the distal access to s (in the category theory's sense of commutation); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science's technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers, not just systems users. In the course of the article all technical expressions from computer science are explained.
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, a is an acceptable theory for evidence β if and only if a is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
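For orientation, ranking functions in their standard Spohn form (the paper should be consulted for the exact assessment relation built on them): a ranking function grades disbelief in worlds and propositions,

$$ \kappa : W \to \mathbb{N} \cup \{\infty\}, \qquad \min_{w \in W} \kappa(w) = 0, \qquad \kappa(A) = \min\{\kappa(w) : w \in A\}, $$

and a theory $\alpha$ counts as acceptable for evidence $\beta$ when it is both sufficiently plausible given $\beta$ and sufficiently informative about $\beta$.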
(See also the separate entry for the volume itself.) This introduction has three parts. The first provides an overview of some main lines of research in deontic logic: the emergence of SDL, Chisholm's paradox and the development of dyadic deontic logics, various other puzzles/challenges and areas of development, along with philosophical applications. The second part focuses on some actual and potential fruitful interactions between deontic logic, computer science and artificial intelligence. These include applications of deontic logic to AI knowledge representation in legal systems, to modelling computer systems where it is expected that sub-ideal states will emerge and require countermeasures, to norm-governed human interactions with computer systems, and to the representation of some features of multi-agent systems where different agent-like computer systems interact with one another. The third and final part briefly groups and previews the papers in the anthology.
An information recovery problem is the problem of constructing a proposition containing the information dropped in going from a given premise to a given conclusion that follows. The proposition(s) to be constructed can be required to satisfy other conditions as well, e.g. being independent of the conclusion, or being "informationally unconnected" with the conclusion, or some other condition dictated by the context. This paper discusses various types of such problems, it presents techniques and principles useful in solving them, and it develops algorithmic methods for certain classes of such problems. The results are then applied to classical number theory, in particular, to questions concerning possible refinements of the 1931 Gödel Axiom Set, e.g. whether any of its axioms can be analyzed into "informational atoms". Two propositions are "informationally unconnected" [with each other] if no informative (nontautological) consequence of one also follows from the other. A proposition is an "informational atom" if it is informative but no information can be dropped from it without rendering it uninformative (tautological). Presentation, employment, and investigation of these two new concepts are prominent features of this paper.
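One natural formal rendering of the paper's two defined concepts, offered as a gloss (writing $\vDash$ for consequence and calling a proposition informative when it is nontautological):

$$ \operatorname{Unconn}(p, q) \iff \neg\exists r\,\big(r \text{ informative} \ \wedge\ p \vDash r \ \wedge\ q \vDash r\big) $$

$$ \operatorname{Atom}(p) \iff p \text{ informative} \ \wedge\ \forall q\,\big((p \vDash q \wedge q \nvDash p) \to q \text{ tautological}\big) $$

so an informational atom is informative, yet every strictly weaker consequence of it is uninformative.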
Epistemic two-dimensional semantics is a theory in the philosophy of language that provides an account of meaning which is sensitive to the distinction between necessity and apriority. While this theory is usually presented in an informal manner, I take some steps in formalizing it in this paper. To do so, I define a semantics for a propositional modal logic with operators for the modalities of necessity, actuality, and apriority that captures the relevant ideas of epistemic two-dimensional semantics. I also describe some properties of the logic that are interesting from a philosophical perspective, and apply it to the so-called nesting problem.
This is the revised version of an invited keynote lecture delivered at the "1st Australian Computing and Philosophy Conference". The paper is divided into two parts. The first part defends an informational approach to structural realism. It does so in three steps. First, it is shown that, within the debate about structural realism, epistemic and ontic structural realism are reconcilable. It follows that a version of OSR is defensible from a structuralist-friendly position. Second, it is argued that a version of OSR is also plausible, because not all relata are logically prior to relations. Third, it is shown that a version of OSR is also applicable to both sub-observable and observable entities, by developing its ontology of structural objects in terms of informational objects. The outcome is informational structural realism, a version of OSR supporting the ontological commitment to a view of the world as the totality of informational objects dynamically interacting with each other. The paper has been discussed by several colleagues and, in the second half, ten objections that have been raised against the proposal are answered in order to clarify it further.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first-order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
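Schematically, in a simplified one-variable version (the paper's determination rules are schemata with object variables and "polar" variables for truth values), $P$ determines $Q$ when

$$ \forall x\,\forall y\,\big(P(x) = P(y) \to Q(x) = Q(y)\big). $$

For example, if nationality determines native language, then a single observed Brazilian who speaks Portuguese licenses projecting Portuguese to Brazilians generally, while the rule itself does not prejudge which language that would be, matching the point that the conclusion is fixed only upon inspection of the instance.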
The phenomenon of distributed knowledge is well-known in epistemic logic. In this paper, a similar phenomenon in ethics, somewhat neglected so far, is investigated, namely distributed morality. The article explains the nature of distributed morality, as a feature of moral agency, and explores the implications of its occurrence in advanced information societies. In the course of the analysis, the concept of infraethics is introduced, in order to refer to the ensemble of moral enablers, which, although morally neutral per se, can significantly facilitate or hinder both positive and negative moral behaviours.
Information is often modelled as a set of relevant possibilities, treated as logically possible worlds. However, this has the unintuitive consequence that the logical consequences of an agent's information cannot be informative for that agent. There are many scenarios in which such consequences are clearly informative for the agent in question. Attempts to weaken the logic underlying each possible world are misguided. Instead, I provide a genuinely psychological notion of epistemic possibility and show how it can be captured in a formal model, which I call a fan. I then show how to use fans to build formal models of being informed, as well as knowledge, belief and information update.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
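A small executable illustration of the quantitative side described above (illustrative code, not from the paper): logical entropy as the two-draw probability of drawing a distinction, which can only increase as a partition is refined.

```python
from fractions import Fraction

# Logical entropy of a partition pi on a finite universe U with equiprobable
# points: h(pi) = 1 - sum((|B|/|U|)^2), the probability that two independent
# draws from U fall in distinct blocks (i.e., yield a "distinction").

def logical_entropy(partition):
    n = sum(len(block) for block in partition)
    return 1 - sum(Fraction(len(block), n) ** 2 for block in partition)

coarse = [{1, 2}, {3, 4}]          # two blocks
finer  = [{1}, {2}, {3, 4}]        # refines the first block
finest = [{1}, {2}, {3}, {4}]      # all points distinguished

for pi in (coarse, finer, finest):
    print(pi, logical_entropy(pi))  # 1/2, then 5/8, then 3/4
```

Refinement makes numerical attributes "more definite" in exactly the sense the abstract describes: more pairs of elements get distinguished, so h never decreases.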
This article informally presents a solution to the paradoxes of truth and shows how the solution solves classical paradoxes (such as the original Liar) as well as the paradoxes that were invented as counter-arguments for various proposed solutions ("the revenge of the Liar"). Any solution to the paradoxes of truth necessarily establishes a certain logical concept of truth. This solution complements the classical procedure of determining the truth values of sentences by its own failure and, when the procedure fails, through an appropriate semantic shift allows us to express the failure in a classical two-valued language. Formally speaking, the solution is a language with one meaning of symbols and two valuations of the truth values of sentences. The primary valuation is a classical valuation that is partial in the presence of the truth predicate. It enables us to determine the classical truth value of a sentence or leads to the failure of that determination. The language with the primary valuation is precisely the largest intrinsic fixed point of the strong Kleene three-valued semantics (LIFPSK3). The semantic shift that allows us to express the failure of the primary valuation is precisely the classical closure of LIFPSK3: it extends LIFPSK3 to a classical language in parts where LIFPSK3 is undetermined. Thus, this article provides a content-wise argumentation, which has not been present in contemporary debates so far, for the choice of LIFPSK3 and its classical closure as the right model for the logical concept of truth. In the end, an erroneous critique of the Kripke-Feferman axiomatic theory of truth, which is present in contemporary literature, is pointed out.
In explaining the notion of a fundamental property or relation, metaphysicians will often draw an analogy with languages. The fundamental properties and relations stand to reality as the primitive predicates and relations stand to a language: the smallest set of vocabulary God would need in order to write the "book of the world." This paper attempts to make good on this metaphor. To that end, a modality is introduced that, put informally, stands to propositions as logical truth stands to sentences. The resulting theory, formulated in higher-order logic, also vindicates the Humean idea that fundamental properties and relations are freely recombinable and a variant of the structural idea that propositions can be decomposed into their fundamental constituents via logical operations. Indeed, it is seen that, although these ideas are seemingly distinct, they are not independent, and fall out of a natural and general theory about the granularity of reality.
The paper proposes two logical analyses of (the norms of) justification. In a first, realist-minded case, truth is logically independent from justification and leads to a pragmatic logic LP including two epistemic and pragmatic operators, namely, assertion and hypothesis. In a second, antirealist-minded case, truth is not logically independent from justification and results in two logical systems of information and justification: AR4 and AR4′, respectively, provided with a question-answer semantics. The latter proposes many more epistemic agents, each corresponding to a wide variety of epistemic norms. After comparing the different norms of justification involved in these logical systems, two hexagons expressing Aristotelian relations of opposition will be gathered in order to clarify how (a fragment of) pragmatic formulas can be interpreted in a fuzzy-based question-answer semantics.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism to reason in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but also perhaps welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require the practical knowledge how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both this mathematical foundation and practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing to not only learn about, but also do something with, many-valued logics.
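As a minimal taste of the subject matter (a sketch for orientation, not the book's own presentation): the strong Kleene connectives on three truth values, with the middle value read as "undefined".

```python
# Strong Kleene three-valued connectives, encoding false as 0.0,
# undefined as 0.5, and true as 1.0.

F, U, T = 0.0, 0.5, 1.0

def neg(a):        return 1 - a
def conj(a, b):    return min(a, b)
def disj(a, b):    return max(a, b)
def implies(a, b): return max(1 - a, b)   # defined as neg(a) or b

for a in (F, U, T):
    for b in (F, U, T):
        print(f"{a} -> {b} = {implies(a, b)}")
```

Truth-functionality is what the generalization preserves: each connective is still a function of the component values, just over three values rather than two.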
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
In order to predict and explain behavior, one cannot specify the mental state of an agent merely by saying what information she possesses. Instead one must specify what information is available to an agent relative to various purposes. Specifying mental states in this way allows us to accommodate cases of imperfect recall, cognitive accomplishments involved in logical deduction, the mental states of confused or fragmented subjects, and the difference between propositional knowledge and know-how.