We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + TR, based on the notion of rational closure, by inclusions p :: T(C) ⊑ D (“we have probability p that typical Cs are Ds”), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete, as in ALC.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by typicality inclusions of the form p :: T(C) ⊑ D, whose intuitive meaning is that “we believe with degree p that typical Cs are Ds”. As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as for the underlying standard Description Logic ALC.
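As a toy illustration of the distributed semantics described above, each typicality inclusion p :: T(C) ⊑ D is kept in a scenario with probability p and dropped with probability 1 − p, independently, so every scenario receives a probability. A minimal sketch in Python (the inclusions and probabilities here are illustrative, not taken from the paper):

```python
from itertools import product

# Illustrative typicality inclusions p :: T(C) ⊑ D, each with its degree p.
inclusions = [
    ("T(Bird) ⊑ Flies", 0.9),
    ("T(Penguin) ⊑ ¬Flies", 0.95),
]

def scenarios(incls):
    """Enumerate all scenarios (subsets of inclusions) with their probabilities."""
    result = []
    for choice in product([True, False], repeat=len(incls)):
        kept = [ax for (ax, _), keep in zip(incls, choice) if keep]
        prob = 1.0
        for (_, p), keep in zip(incls, choice):
            prob *= p if keep else 1 - p
        result.append((kept, prob))
    return result

scs = scenarios(inclusions)
# The probabilities of all scenarios sum to 1.
assert abs(sum(p for _, p in scs) - 1.0) < 1e-9
```

In a fuller treatment, each consistent scenario would then be checked for which typical properties it ascribes to the combined concept; here we only show the scenario enumeration itself.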
Formalisms based on one or another flavor of Description Logic (DL) are sometimes put forward as helping to ensure that terminologies and controlled vocabularies comply with sound ontological principles. The objective of this paper is to study the degree to which one DL-based biomedical terminology (SNOMED CT) does indeed comply with such principles. We defined seven ontological principles (for example: each class must have at least one parent, each class must differ from its parent) and examined the properties of SNOMED CT classes with respect to these principles. Our major results are: 31% of these classes have a single child; 27% have multiple parents; 51% do not exhibit any differentiae between the description of the parent and that of the child. The applications of this study to quality assurance for ontologies are discussed and suggestions are made for dealing with the phenomenon of multiple inheritance. The advantages and limitations of our approach are also discussed.
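Checks of the kind reported above (single-child classes, multiple parents) are straightforward to mechanise once the subsumption hierarchy is available as a parent table. A minimal sketch over a hypothetical toy hierarchy, not actual SNOMED CT content:

```python
# Hypothetical toy hierarchy (not actual SNOMED CT content):
# each class is mapped to its set of parents.
parents = {
    "Disorder": set(),
    "HeartDisorder": {"Disorder"},
    "ValveDisorder": {"HeartDisorder"},
    "MitralValveDisorder": {"ValveDisorder", "HeartDisorder"},
}

def children_of(cls):
    """All classes listing `cls` among their parents."""
    return [c for c, ps in parents.items() if cls in ps]

# Classes flagged by the two audit principles.
single_child = sorted(c for c in parents if len(children_of(c)) == 1)
multi_parent = sorted(c for c, ps in parents.items() if len(ps) > 1)

print(single_child)   # classes with exactly one child
print(multi_parent)   # classes with multiple parents
```

On real terminologies the same traversal runs over the released relationship tables; the principle being checked stays the same.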
Traditionally, consistency is the only criterion for the quality of a theory in logic-based approaches to reasoning about actions. This work goes beyond that and contributes to the meta-theory of actions by investigating what other properties a good domain description should satisfy. Having Propositional Dynamic Logic (PDL) as background, we state some meta-theoretical postulates concerning this sore spot. When all postulates are satisfied, we call the action theory modular. We point out the problems that arise when the postulates about modularity are violated, and propose algorithmic checks that can help the designer of an action theory to overcome them. Besides being easier to understand and more elaboration tolerant in McCarthy’s sense, modular theories have interesting computational properties. Moreover, we also propose a framework for updating domain descriptions and show the importance modularity has in action theory change.
This paper explores the question of what logic is not. It argues against the widespread assumptions that logic is a model of reason, a model of correct reason, or the laws of thought, or indeed that it is related to reason at all in such a way that the essential natures of the two are crucially or essentially co-illustrative. I note that, due to such assumptions, our current understanding of the nature of logic itself is thoroughly entangled with the nature of reason. I show that most arguments for the presence of any sort of essential relationship between logic and reason face intractable problems and demands, and fall well short of addressing them. These arguments include those for the notion that logic is normative for reason (or that logic and correct reason are in some way the same thing), that logic is some sort of description of correct reason, and that logic is an abstracted or idealised version of correct reason. A strong version of logical realism is put forward as an alternative view, and is briefly explored.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
Description Logics are nowadays widely accepted as formalisms which provide reasoning facilities that allow us to discover inconsistencies in ontologies in an automatic fashion. Where ontologies are developed in a modular fashion, they allow changes in one module to be propagated through the system of ontologies automatically, in a way which helps to maintain consistency and stability. Utilizing this feature effectively, however, requires that domain ontologies be represented in a normalized form.
This paper uses classical logic for a simultaneous description of the syntax and semantics of a fragment of English, and it is argued that such an approach to natural language allows procedural aspects of linguistic theory to get a purely declarative formulation. In particular, it will be shown how certain construction rules in Discourse Representation Theory, such as the rule that indefinites create new discourse referents and definites pick up an existing referent, can be formulated declaratively if logic is used as a metalanguage for English. In this case the declarative aspects of a rule are highlighted when we focus on the model theory of the description language, while a procedural perspective is obtained when its proof theory is concentrated on. Themes of interest are Discourse Representation Theory, resolution of anaphora, resolution of presuppositions, and underspecification.
Ontology engineering is a hard and error-prone task, in which small changes may lead to errors, or even produce an inconsistent ontology. As ontologies grow in size, the need increases for automated methods that repair inconsistencies while preserving as much of the original knowledge as possible. Most previous approaches to this task are based on removing a few axioms from the ontology to regain consistency. We propose a new method based on weakening these axioms to make them less restrictive, employing refinement operators. We introduce the theoretical framework for weakening DL ontologies, propose algorithms to repair ontologies based on the framework, and provide an analysis of the computational complexity. Through an empirical analysis over real-life ontologies, we show that our approach preserves significantly more of the original knowledge of the ontology than removing axioms does.
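The contrast between removing and weakening an axiom can be seen in a small toy model. Here an "axiom" is simply the set of worlds it allows, an ontology is consistent when some world satisfies every axiom, and a repair step enlarges an axiom instead of deleting it. This is only a sketch of the idea, not the paper's refinement operators:

```python
# Toy "ontology": each axiom is the set of worlds it allows;
# the ontology is consistent iff some world satisfies every axiom.
WORLDS = set(range(4))

def consistent(axioms):
    common = set(WORLDS)
    for ax in axioms:
        common &= ax
    return bool(common)

def repair_by_weakening(axioms):
    """Greedy sketch: weaken (enlarge) the last axiom just enough to
    restore consistency, instead of deleting it outright."""
    if consistent(axioms):
        return axioms
    *rest, last = axioms
    common = set(WORLDS)
    for ax in rest:
        common &= ax
    # One weakening step in this toy: additionally allow one world
    # that the remaining axioms jointly accept.
    return rest + [last | {min(common)}]

axioms = [{0, 1}, {1, 2}, {2, 3}]   # jointly inconsistent
repaired = repair_by_weakening(axioms)
assert consistent(repaired)
```

Note that the weakened axiom still allows everything it allowed before, which is the sense in which weakening preserves more of the original knowledge than removal.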
We propose a typology of representational artifacts for the health and life sciences, and associate this typology with different kinds of formal and logical ontology, drawing conclusions about the strengths and limitations of different kinds of logical resources for ontology, while keeping the focus on description logic. We consider four kinds of domain representation: (i) lexico-semantic representation, (ii) representation of entity types, (iii) representation of background knowledge, and (iv) representation of individuals. We defend a clear distinction among the four kinds of representation, so as to offer a more rational basis for the use of ontologies and related artifacts in advancing data integration and the interoperability of associated reasoning systems. We emphasize that only a small portion of the scientifically relevant facts in fields such as biomedicine can be adequately represented by formal ontologies, when the latter are conceived as representations of entity types. In particular, attempts to encode default or probabilistic knowledge by means of ontologies so conceived are doomed to produce unintended and erroneous models.
In the late summer of 1998, the authors, a cognitive scientist and a logician, started talking about the relevance of modern mathematical logic to the study of human reasoning, and we have been talking ever since. This book is an interim report of that conversation. It argues that results such as those on the Wason selection task, purportedly showing the irrelevance of formal logic to actual human reasoning, have been widely misinterpreted, mainly because the picture of logic current in psychology and cognitive science is completely mistaken. We aim to give the reader a more accurate picture of mathematical logic and, in doing so, hope to show that logic, properly conceived, is still a very helpful tool in cognitive science. The main thrust of the book is therefore constructive. We give a number of examples in which logical theorizing helps in understanding and modeling observed behavior in reasoning tasks, deviations of that behavior in a psychiatric disorder (autism), and even the roots of that behavior in the evolution of the brain.
In this paper, the authors show that there is a reading of St. Anselm's ontological argument in Proslogium II that is logically valid (the premises entail the conclusion). This reading takes Anselm's use of the definite description "that than which nothing greater can be conceived" seriously. Consider a first-order language and logic in which definite descriptions are genuine terms, and in which the quantified sentence "there is an x such that..." does not imply "x exists". Then, using an ordinary logic of descriptions and a connected greater-than relation, God's existence logically follows from the claims: (a) there is a conceivable thing than which nothing greater is conceivable, and (b) if x doesn't exist, something greater than x can be conceived. To deny the conclusion, one must deny one of the premises. However, the argument involves no modal inferences and, interestingly, Descartes' ontological argument can be derived from it.
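The structure of the reading just described can be sketched schematically (our notation, not the authors'), writing Cx for "x is conceivable", E!x for "x exists", and > for the connected greater-than relation:

```latex
\begin{align*}
\text{(a)}\quad & \exists x\,\bigl(Cx \land \neg\exists y\,(Cy \land y > x)\bigr)
  && \text{something conceivable has no conceivable superior} \\
\text{(b)}\quad & \forall x\,\bigl(\neg E!x \rightarrow \exists y\,(Cy \land y > x)\bigr)
  && \text{whatever fails to exist has a conceivable superior}
\end{align*}
```

Letting g witness (a): if ¬E!g, then by (b) some conceivable thing is greater than g, contradicting (a); hence E!g. Note that no modal operator appears anywhere in the derivation.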
Gettier presented the now famous Gettier problem as a challenge to epistemology. The methods Gettier used to construct his challenge, however, utilized certain principles of formal logic that are actually inappropriate for the natural language discourse of the Gettier cases. In that challenge to epistemology, Gettier also makes truth claims that would be considered controversial in analytic philosophy of language. The Gettier challenge has escaped scrutiny in these other relevant academic disciplines, however, because of its façade as an epistemological analysis. This article examines Gettier's methods with the analytical tools of logic and analytic philosophy of language.
Hilbert’s choice operators τ and ε, when added to intuitionistic logic, strengthen it. In the presence of certain extensionality axioms they produce classical logic, while in the presence of weaker decidability conditions for terms they produce various superintuitionistic intermediate logics. In this thesis, I argue that there are important philosophical lessons to be learned from these results. To make the case, I begin with a historical discussion situating the development of Hilbert’s operators in relation to his evolving program in the foundations of mathematics and in relation to the philosophical motivations leading to the development of intuitionistic logic. This sets the stage for a brief description of the relevant part of Dummett’s program to recast debates in metaphysics, and in particular disputes about realism and anti-realism, as closely intertwined with issues in philosophical logic, with the acceptance of classical logic for a domain reflecting a commitment to realism for that domain. Then I review extant results about what is provable and what is not when one adds epsilon to intuitionistic logic, largely due to Bell and DeVidi, and I give several new proofs of intermediate logics from intuitionistic logic + ε without identity. With all this in hand, I turn to a discussion of the philosophical significance of choice operators.
Among the conclusions I defend are: that these results provide a finer-grained basis for Dummett’s contention that commitment to classically valid but intuitionistically invalid principles reflects metaphysical commitments, by showing those principles to be derivable from certain existence assumptions; that Dummett’s framework is improved by these results, as they show that questions of realism and anti-realism are not an “all or nothing” matter, but that there are plausibly metaphysical stances between the poles of anti-realism and realism, because different sorts of ontological assumptions yield intermediate rather than classical logic; and that these intermediate positions between classical and intuitionistic logic link up in interesting ways with our intuitions about issues of objectivity and reality. They do so usefully by linking to questions around intriguing everyday concepts such as “is smart”, which I suggest involve a number of distinct dimensions that might themselves be objective, but which, because of their multivalent structure, are intermediate between being objective and not. Finally, I discuss the implications of these results for ongoing debates about the status of arbitrary and ideal objects in the foundations of logic, showing among other things that much of the discussion is flawed because it does not recognize the degree to which the claims being made depend on the presumption that one is working with a very strong logic.
The classic analytic tradition associated the philosophy of George Berkeley with idealism. Yet in terms of the German Idealismus, Berkeley was no idealist. Rather, he described himself as an “immaterialist”. In the classic analytic tradition we find a misunderstanding of the German Idealismus. This paper will suggest, through reference to the work of Paul Redding, that Hegel’s Phenomenology of Spirit presents Idealismus as that which reconciles objectivity and subjectivity in the experience of consciousness. Hegel’s Phenomenology develops this idea in the elaboration of a remarkably novel theory of consciousness. For Hegel, the conditions of the possibility of the objects of experience are a dialectical movement between consciousness and the object, or immediacy and mediacy. In the whole movement of consciousness we have the logic of contradiction working at the back of phenomenological experience that Hegel will make explicit in the Science of Logic, a logic that involves the thinker becoming consciously aware of their own thought processes. Yet Hegel’s Logic is different from the common meaning of ‘logic’. His Logic is not a formal approach to valid inference but captures the method and the moments and movement of logic. For Hegel, the great problem of classical logic is the immobility of the categories. This paper proposes that Hegel’s ‘holism’ entails the description wherein Logic, Nature, and Spirit are articulated as a whole in dialectical movement.
One innovation in this paper is its identification, analysis, and description of a troubling ambiguity in the word ‘argument’. In one sense ‘argument’ denotes a premise-conclusion argument: a two-part system composed of a set of sentences—the premises—and a single sentence—the conclusion. In another sense it denotes a premise-conclusion-mediation argument—later called an argumentation: a three-part system composed of a set of sentences—the premises—a single sentence—the conclusion—and a complex of sentences—the mediation. The latter is often intended to show that the conclusion follows from the premises. The complementarity and interrelation of premise-conclusion arguments and premise-conclusion-mediation arguments resonate throughout the rest of the paper, which articulates the conceptual structure found in logic from Aristotle to Tarski. This 1972 paper can be seen as anticipating Corcoran’s signature work: the more widely read 1989 paper, Argumentations and Logic, Argumentation 3, 17–43. MR91b:03006. The 1972 paper was translated into Portuguese. The 1989 paper was translated into Spanish, Portuguese, and Persian.
We introduce a family of operators to combine Description Logic concepts. They aim to characterise complex concepts that apply to instances that satisfy "enough" of the concept descriptions given. For instance, an individual might not have any tusks, but still be considered an elephant. To formalise the meaning of "enough", the operators take a list of weighted concepts as arguments, and a certain threshold to be met. We commence a study of the formal properties of these operators, and study some variations. The intended applications concern the representation of cognitive aspects of classification tasks: the interdependencies among the attributes that define a concept, the prototype of a concept, and the typicality of the instances.
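The "enough" idea above amounts to a weighted-threshold membership test: an individual counts as an instance of the combined concept when the total weight of the descriptions it satisfies meets the threshold. A minimal sketch (the feature names, weights, and threshold are illustrative, not from the paper):

```python
# Illustrative weighted concept descriptions and threshold.
elephant_features = [("has_trunk", 0.5), ("is_large", 0.3), ("has_tusks", 0.2)]
THRESHOLD = 0.7

def satisfies(individual, features, threshold):
    """Sum the weights of the descriptions the individual satisfies
    and compare against the threshold."""
    score = sum(w for attr, w in features if individual.get(attr, False))
    return score >= threshold

# A tuskless individual can still clear the threshold.
dumbo = {"has_trunk": True, "is_large": True, "has_tusks": False}
assert satisfies(dumbo, elephant_features, THRESHOLD)
```

The same individual would fail the test if it lacked a heavily weighted feature such as the trunk, which is how the operators encode which attributes matter most to a concept.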
In this paper we present a framework for the dynamic and automatic generation of novel knowledge obtained through a process of commonsense reasoning based on typicality-based concept combination. We exploit a recently introduced extension of a Description Logic of typicality able to combine prototypical descriptions of concepts in order to generate new prototypical concepts and deal with problems like the PET FISH one (Osherson and Smith, 1981; Lieto & Pozzato, 2019). Intuitively, in the context of our application of this logic, the overall pipeline of our system works as follows: given a goal expressed as a set of properties, if the knowledge base does not contain a concept able to fulfill all these properties, then our system looks for two concepts to recombine in order to extend the original knowledge base and satisfy the goal.
We investigate a specific model of knowledge and beliefs and their dynamics. The model is inspired by public announcement logic and the approach to puzzles concerning knowledge using that logic. In the model, epistemic considerations are based on ontology. The main notion that constitutes a bridge between these two disciplines is the notion of epistemic capacities. Within the model we study scenarios in which agents can receive false announcements and can have incomplete or improper views about other agents' epistemic capacities. Moreover, we try to express the description of the problem specification using tools from applied ontology: the RDF format for information and the Protege editor.
Combining typical knowledge to generate novel concepts is an important creative trait of human cognition. Dealing with such an ability requires, from an AI perspective, the harmonization of two conflicting requirements that are hardly accommodated in symbolic systems: the need for syntactic compositionality (typical of logical systems) and the need to exhibit typicality effects (see Frixione and Lieto, 2012). In this work we provide a logical framework able to account for this type of human-like concept combination. We propose a nonmonotonic Description Logic of typicality called TCL (Typicality-based Compositional Logic).
We present Logical Description Grammar (LDG), a model of grammar and the syntax-semantics interface based on descriptions in elementary logic. A description may simultaneously describe the syntactic structure and the semantics of a natural language expression, i.e., the describing logic talks about the trees and about the truth-conditions of the language described. Logical Description Grammars offer a natural way of dealing with underspecification in natural language syntax and semantics. If a logical description (up to isomorphism) has exactly one tree plus truth-conditions as a model, it completely specifies that grammatical object. More common is the situation, corresponding to underspecification, in which there is more than one model. A situation in which there are no models corresponds to an ungrammatical input.
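The model-counting idea at the end of the abstract can be made concrete in miniature: treat a description as a set of constraints, enumerate its models, and read off the three cases (one model, several models, none). A toy sketch with invented labels and constraints, nothing from LDG itself:

```python
from itertools import product

# Toy setting: an "expression" has two nodes, each labelled NP or VP;
# a description is a set of constraints over the label assignment.
LABELS = ["NP", "VP"]

def models(constraints):
    """All label assignments satisfying every constraint."""
    return [assign for assign in product(LABELS, repeat=2)
            if all(c(assign) for c in constraints)]

# Exactly one model: the description fully specifies the object.
fully = models([lambda a: a[0] == "NP", lambda a: a[1] == "VP"])
# Several models: the description is underspecified.
under = models([lambda a: a[0] == "NP"])
# No models: the input is ungrammatical.
bad = models([lambda a: a[0] == "NP", lambda a: a[0] == "VP"])

assert (len(fully), len(under), len(bad)) == (1, 2, 0)
```

In LDG the models are trees plus truth-conditions rather than label tuples, but the one/many/none trichotomy works the same way.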
This paper is divided into four parts. In the first part we introduce the method of internal critique of philosophical theories by examination of their external consistency with scientific theories. In the second part, two metaphysical and one epistemological postulate of Wittgenstein's Tractatus are made explicit and formally expressed. In the third part we examine whether the Tractarian metaphysical and epistemological postulates (the independence of simple states of affairs, the unique mode of their composition, the possibility of complete empirical knowledge) are externally consistent with the theory of quantum mechanics. The result of the inquiry is negative: the Tractarian postulates ought to be revised. Relying on this result, we approach the question of the empirical character of logic in the fourth part. The description of theoretical transformations of the notion of disjunction, in its ontological, epistemological, and logical senses, is a common element of all parts of the text. The conjecture on the existence of different types of disjunctive connectives in the language of quantum mechanics concludes the paper.
The subject of the first section is Carnapian modal logic. One of the things I will do there is to prove that certain description principles, viz. the ''self-predication principles'', i.e. the principles according to which a descriptive term satisfies its own descriptive condition, are theorems and that others are not. The second section will be devoted to Carnapian modal arithmetic. I will prove that, if the arithmetical theory contains the standard weak principle of induction, modal truth collapses to truth. Then I will propose a different formulation of Carnapian modal arithmetic and establish that it is free of collapse. Noteworthy is that one can retain the standard strong principle of induction. I will occupy myself in the third section with Carnapian epistemic logic and arithmetic. Here too it is claimed that the standard weak principle of induction is invalid and that the alternative principle is valid. In the fourth and last section I will get back to the self-predication principles and I will point to some of the consequences if one adds them to Carnapian epistemic arithmetic. The interaction of self-predication principles and the strong principle of induction results in a collapse of de re knowability.
In this paper, by suggesting a formal representation of science based on recent advances in logic-based Artificial Intelligence (AI), we show how three serious concerns around the realisation of traditional scientific realism (the theory/observation distinction, over-determination of theories by data, and theory revision) can be overcome such that traditional realism is given a new guise as ‘naturalised’. We contend that such issues can be dealt with (in the context of scientific realism) by developing a formal representation of science based on the application of the following tools from Knowledge Representation: the family of Description Logics, an enrichment of classical logics via defeasible statements, and an application of the preferential interpretation of the approach to Belief Revision.
This paper addresses a family of issues surrounding the biological phenomenon of resistance and its representation in realist ontologies. The treatments of resistance terms in various existing ontologies are examined and found to be either overly narrow, internally inconsistent, or otherwise problematic. We propose a more coherent characterization of resistance in terms of what we shall call blocking dispositions, which are collections of mutually coordinated dispositions which are of such a sort that they cannot undergo simultaneous realization within a single bearer. A definition of ‘protective resistance’ is proposed for use in the Infectious Disease Ontology (IDO) and we show how this definition can be used to characterize the antibiotic resistance in Methicillin-Resistant Staphylococcus aureus (MRSA). The ontological relations between entities in our MRSA case study are used alongside a series of logical inference rules to illustrate logical reasoning about resistance. A description logic representation of blocking dispositions is also provided. We demonstrate that our characterization of resistance is sufficiently general to cover two other cases of resistance in the infectious disease domain involving HIV and malaria.
There has been a recent surge of work on deontic modality within philosophy of language. This work has put the deontic logic tradition in contact with natural language semantics, resulting in significant increase in sophistication on both ends. This chapter surveys the main motivations, achievements, and prospects of this work.
This paper addresses the use of dispositions in the Infectious Disease Ontology (IDO). IDO is an ontology constructed according to the principles of the Open Biomedical Ontology (OBO) Foundry and uses the Basic Formal Ontology (BFO) as an upper ontology. After providing a brief introduction to disposition types in BFO and IDO, we discuss three general techniques for representing combinations of dispositions under the headings blocking dispositions, complementary dispositions, and collective dispositions. Motivating examples for each combination of dispositions are given along with a specific use case in IDO. Description logic restrictions are used to formalize statements relating to these combinations.
SNODENT is a dental diagnostic vocabulary incompletely integrated in SNOMED-CT. Nevertheless, SNODENT could become the de facto standard for dental diagnostic coding. SNODENT's manageable size, the fact that it is administratively self-contained, and its relation to a well-understood domain provide valuable opportunities to formulate and test, in controlled experiments, a series of hypotheses concerning diagnostic systems. Of particular interest are questions related to establishing appropriate quality assurance methods for its optimal level of detail in content, its ontological structure, and its construction and maintenance. This paper builds on previous software-based methodologies designed to assess the quality of SNOMED-CT. When applied to SNODENT, several deficiencies were uncovered. 9.52% of SNODENT terms point to concepts in SNOMED-CT that have some problem. 18.53% of SNODENT terms point to SNOMED-CT concepts that do not have, in SNOMED, the term used by SNODENT. Other findings include the absence of a clear specification of the exact relationship between a term and a termcode in SNODENT and the improper assignment of the same termcode to terms with significantly different meanings. An analysis of the way in which SNODENT is structurally integrated into SNOMED resulted in the generation of 1081 new termcodes reflecting entities not present in the SNOMED tables but required by SNOMED's own description logic based classification principles. Our results show that SNODENT requires considerable enhancements in content, quality of coding, quality of ontological structure, and the manner in which it is integrated and aligned with SNOMED. We believe that methods for the analysis of the quality of diagnostic coding systems must be developed and employed if such systems are to be used effectively in both clinical practice and clinical research.
The theory of proper names proposed by J.S. Mill in A system of logic (1843), and discussed in S. Kripke’s Naming and necessity (1980), is shown to be predated by A. Rosmini’s Nuovo saggio sull’origine delle idee (1830) and T. Reid’s Essays on the intellectual powers of man (1785). For philological reasons, Rosmini probably did not obtain his view of proper names from Reid. For philosophical reasons, it is unlikely that he got it from Hobbes, Locke, Smith, or Stewart. Although not explicitly indicated by Rosmini himself, he may have been influenced by St. Thomas, who in Summa theologica discusses suppositum and natura in relation to the equivocal functions of the terms ”God” and ”sun” as common and proper names. As previously observed, forerunners of the idea can be found in Antiquity, in Plato’s Theaetetus and Aristotle’s Metaphysics. From a historical point of view, the fully developed ”Millian” opinion that connotation is not a fundamental aspect of proper names, and that their referents are not fixed by description, could more accurately be termed the Reid-Rosmini-Mill theory.
As historically acknowledged in the Reasoning about Actions and Change community, intuitiveness of a logical domain description cannot be fully automated. Moreover, like any other logical theory, action theories may evolve, and knowledge engineers therefore need revision methods to help accommodate new incoming information about the behavior of actions in an adequate manner. The present work is about changing action domain descriptions in multimodal logic. Its contribution is threefold: first, we revisit the semantics of action theory contraction proposed in previous work, giving more robust operators that express minimal change based on a notion of distance between Kripke models. Second, we give algorithms for syntactical action theory contraction and establish their correctness with respect to our semantics for those action theories that satisfy a principle of modularity investigated in previous work. Since modularity can be ensured for every action theory and, as we show here, needs to be computed at most once during the evolution of a domain description, it does not limit the method studied here. Finally, we state AGM-like postulates for action theory contraction and assess the behavior of our operators with respect to them. We also address the revision counterpart of action theory change, showing that it benefits from our semantics for contraction.
I develop and defend a truthmaker semantics for the relevant logic R. The approach begins with a simple philosophical idea and develops it in various directions, so as to build a technically adequate relevant semantics. The central philosophical idea is that truths are true in virtue of specific states. Developing the idea formally results in a semantics on which truthmakers are relevant to what they make true. A very natural notion of conditionality is added, giving us relevant implication. I then investigate ways to add conjunction, disjunction, and negation; and I discuss how to justify contraposition and excluded middle within a truthmaker semantics.
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts, all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives, explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion, which is, I argue, substantially correct, that imperatives could not have propositions, or truth conditions, as their meanings.
The aim of the paper is to argue that all, or almost all, logical rules have exceptions. In particular, it is argued that this is a moral that we should draw from the semantic paradoxes. The idea that we should respond to the paradoxes by revising logic in some way is familiar. But previous proposals advocate the replacement of classical logic with some alternative logic. That is, some alternative system of rules, where it is taken for granted that these hold without exception. The present proposal is quite different. According to this, there is no such alternative logic. Rather, classical logic retains the status of the ‘one true logic’, but this status must be reconceived so as to be compatible with (almost) all of its rules admitting of exceptions. This would seem to have significant repercussions for a range of widely held views about logic: e.g. that it is a priori, or that it is necessary. Indeed, if the arguments of the paper succeed, then such views must be given up.
The five English words—sentence, proposition, judgment, statement, and fact—are central to coherent discussion in logic. However, each is ambiguous in that logicians use each with multiple normal meanings. Several of their meanings are vague in the sense of admitting borderline cases. In the course of displaying and describing the phenomena discussed using these words, this paper juxtaposes, distinguishes, and analyzes several senses of these and related words, focusing on a constellation of recommended senses. One of the purposes of this paper is to demonstrate that ordinary English properly used has the resources for intricate and philosophically sound investigation of rather deep issues in logic and philosophy of language. No mathematical, logical, or linguistic symbols are used. Meanings need to be identified and clarified before being expressed in symbols. We hope to establish that clarity is served by deferring the extensive use of formalized or logically perfect languages until a solid “informal” foundation has been established. Questions of “ontological status”—e.g., whether propositions or sentences, or for that matter characters, numbers, truth-values, or instants, are “real entities”, are “idealizations”, or are “theoretical constructs”—play no role in this paper. As is suggested by the title, this paper is written to be read aloud. I hope that reading this aloud in groups will unite people in the enjoyment of the humanistic spirit of analytic philosophy.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality, which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
Intuitionistic logic provides an elegant solution to the Sorites Paradox. Its acceptance has been hampered by two factors. First, the lack of an accepted semantics for languages containing vague terms has led even philosophers sympathetic to intuitionism to complain that no explanation has been given of why intuitionistic logic is the correct logic for such languages. Second, switching from classical to intuitionistic logic, while it may help with the Sorites, does not appear to offer any advantages when dealing with the so-called paradoxes of higher-order vagueness. We offer a proposal that makes strides on both issues. We argue that the intuitionist’s characteristic rejection of any third alethic value alongside true and false is best elaborated by taking the normal modal system S4M to be the sentential logic of the operator ‘it is clearly the case that’. S4M opens the way to an account of higher-order vagueness which avoids the paradoxes that have been thought to infect the notion. S4M is one of the modal counterparts of the intuitionistic sentential calculus and we use this fact to explain why IPC is the correct sentential logic to use when reasoning with vague statements. We also show that our key results go through in an intuitionistic version of S4M. Finally, we deploy our analysis to reply to Timothy Williamson’s objections to intuitionistic treatments of vagueness.
Sentences containing definite descriptions, expressions of the form ‘The F’, can be formalised using a binary quantifier ι that forms a formula out of two predicates, where ιx[F, G] is read as ‘The F is G’. This is an innovation over the usual formalisation of definite descriptions with a term forming operator. The present paper compares the two approaches. After a brief overview of the system INFι of intuitionist negative free logic extended by such a quantifier, which was presented in (Kürbis 2019), INFι is first compared to a system of Tennant’s and an axiomatic treatment of a term forming ι operator within intuitionist negative free logic. Both systems are shown to be equivalent to the subsystem of INFι in which the G of ιx[F, G] is restricted to identity. INFι is then compared to an intuitionist version of a system of Lambert’s which in addition to the term forming operator has an operator for predicate abstraction for indicating scope distinctions. The two systems will be shown to be equivalent through a translation between their respective languages. Advantages of the present approach over the alternatives are indicated in the discussion.
The rather unrestrained use of second-order logic in the neo-logicist program is critically examined. It is argued in some detail that it brings with it genuine set-theoretical existence assumptions and that the mathematical power that Hume’s Principle seems to provide, in the derivation of Frege’s Theorem, comes largely from the ‘logic’ assumed rather than from Hume’s Principle. It is shown that Hume’s Principle is in reality not stronger than the very weak Robinson Arithmetic Q. Consequently, only a few rudimentary facts of arithmetic are logically derivable from Hume’s Principle. And that hardly counts as a vindication of logicism.
This paper is concerned with a propositional modal logic with operators for necessity, actuality and apriority. The logic is characterized by a class of relational structures defined according to ideas of epistemic two-dimensional semantics, and can therefore be seen as formalizing the relations between necessity, actuality and apriority according to epistemic two-dimensional semantics. We can ask whether this logic is correct, in the sense that its theorems are all and only the informally valid formulas. This paper gives outlines of two arguments that jointly show that this is the case. The first is intended to show that the logic is informally sound, in the sense that all of its theorems are informally valid. The second is intended to show that it is informally complete, in the sense that all informal validities are among its theorems. In order to give these arguments, a number of independently interesting results concerning the logic are proven. In particular, the soundness and completeness of two proof systems with respect to the semantics is proven (Theorems 2.11 and 2.15), as well as a normal form theorem (Theorem 3.2), an elimination theorem for the actuality operator (Corollary 3.6), and the decidability of the logic (Corollary 3.7). It turns out that the logic invalidates a plausible principle concerning the interaction of apriority and necessity; consequently, a variant semantics is briefly explored on which this principle is valid. The paper concludes by assessing the implications of these results for epistemic two-dimensional semantics.
We formally introduce a novel, yet ubiquitous, category of norms: norms of instrumentality. Norms of this category describe which actions are obligatory, or prohibited, as instruments for certain purposes. We propose the Logic of Agency and Norms (LAN) that enables reasoning about actions, instrumentality, and normative principles in a multi-agent setting. Leveraging LAN, we formalize norms of instrumentality and compare them to two prevalent norm categories: norms to be and norms to do. Finally, we pose principles relating the three categories and evaluate their validity vis-à-vis notions of deliberative acting. On a technical note, the logic will be shown decidable via the finite model property.
We reconsider the pragmatic interpretation of intuitionistic logic [21] regarded as a logic of assertions and their justifications and its relations with classical logic. We recall an extension of this approach to a logic dealing with assertions and obligations, related by a notion of causal implication [14, 45]. We focus on the extension to co-intuitionistic logic, seen as a logic of hypotheses [8, 9, 13] and on polarized bi-intuitionistic logic as a logic of assertions and conjectures: looking at the S4 modal translation, we give a definition of a system AHL of bi-intuitionistic logic that correctly represents the duality between intuitionistic and co-intuitionistic logic, correcting a mistake in previous work [7, 10]. A computational interpretation of co-intuitionism as a distributed calculus of coroutines is then used to give an operational interpretation of subtraction. Work on linear co-intuitionism is then recalled, a linear calculus of co-intuitionistic coroutines is defined, and a probabilistic interpretation of linear co-intuitionism is given as in [9]. We also remark that by extending the language of intuitionistic logic we can express the notion of expectation, an assertion that in all situations the truth of p is possible, and that in a logic of expectations the law of double negation holds. Similarly, extending co-intuitionistic logic, we can express the notion of conjecture that p, defined as a hypothesis that in some situation the truth of p is epistemically necessary.
(1) This paper is about how to build an account of the normativity of logic around the claim that logic is constitutive of thinking. I take the claim that logic is constitutive of thinking to mean that representational activity must tend to conform to logic to count as thinking. (2) I develop a natural line of thought about how to develop the constitutive position into an account of logical normativity by drawing on constitutivism in metaethics. (3) I argue that, while this line of thought provides some insights, it is importantly incomplete, as it is unable to explain why we should think. I consider two attempts at rescuing the line of thought. The first, unsuccessful response is that it is self-defeating to ask why we ought to think. The second response is that we need to think. But this response secures normativity only if thinking has some connection to human flourishing. (4) I argue that thinking is necessary for human flourishing. Logic is normative because it is constitutive of this good. (5) I show that the resulting account deals nicely with problems that vex other accounts of logical normativity.
An exact truthmaker for A is a state which, as well as guaranteeing A’s truth, is wholly relevant to it. States with parts irrelevant to whether A is true do not count as exact truthmakers for A. Giving semantics in this way produces a very unusual consequence relation, on which conjunctions do not entail their conjuncts. This feature makes the resulting logic highly unusual. In this paper, we set out formal semantics for exact truthmaking and characterise the resulting notion of entailment, showing that it is compact and decidable. We then investigate the effect of various restrictions on the semantics. We also formulate a sequent-style proof system for exact entailment and give soundness and completeness results.
“Second-order Logic” in Anderson, C.A. and Zeleny, M., Eds. Logic, Meaning, and Computation: Essays in Memory of Alonzo Church. Dordrecht: Kluwer, 2001. Pp. 61–76. Abstract. This expository article focuses on the fundamental differences between second-order logic and first-order logic. It is written entirely in ordinary English without logical symbols. It employs second-order propositions and second-order reasoning in a natural way to illustrate the fact that second-order logic is actually a familiar part of our traditional intuitive logical framework and that it is not an artificial formalism created by specialists for technical purposes. To illustrate some of the main relationships between second-order logic and first-order logic, this paper introduces basic logic, a kind of zero-order logic, which is more rudimentary than first-order and which is transcended by first-order in the same way that first-order is transcended by second-order. The heuristic effectiveness and the historical importance of second-order logic are reviewed in the context of the contemporary debate over the legitimacy of second-order logic. Rejection of second-order logic is viewed as radical: an incipient paradigm shift involving radical repudiation of a part of our scientific tradition, a tradition that is defended by classical logicians. But it is also viewed as reactionary: as being analogous to the reactionary repudiation of symbolic logic by supporters of “Aristotelian” traditional logic. But even if “genuine” logic comes to be regarded as excluding second-order reasoning, which seems less likely today than fifty years ago, its effectiveness as a heuristic instrument will remain and its importance for understanding the history of logic and mathematics will not be diminished. Second-order logic may someday be gone, but it will never be forgotten.
Technical formalisms have been avoided entirely in an effort to reach a wide audience, but every effort has been made to limit the inevitable sacrifice of rigor. People who do not know second-order logic cannot understand the modern debate over its legitimacy and are cut off from the heuristic advantages of second-order logic. And, what may be worse, they are cut off from an understanding of the history of logic and thus are constrained to have distorted views of the nature of the subject. As Aristotle first said, we do not understand a discipline until we have seen its development. It is a truism that a person's conceptions of what a discipline is and of what it can become are predicated on their conception of what it has been.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
In the paper we present a formal system motivated by a specific methodology of creating norms. According to the methodology, a norm-giver, before establishing a set of norms, should create a picture of the agent by creating his repertoire of actions. Then, knowing what the agent can do in particular situations, the norm-giver regulates these actions by assigning deontic qualifications to each of them. The set of norms created for each situation should respect (1) generally valid deontic principles, which are theses of our logic, and (2) facts from the ontology of action whose relevance for the systems of norms we postulate.
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names" - a special kind of propositional variables ranging over singleton sets of worlds. The obtained language ℒc proves to have a great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ(⍯) of ℒ, where ⍯ is an additional modality with the semantics x ⊧ ⍯φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further we consider deductive systems in ℒc. Strong completeness of the normal ℒc logics is proved with respect to models in which all worlds are named. Every ℒc-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning transfer of properties ([in]completeness, filtration, finite model property etc.) from ℒ to ℒc are discussed. Finally, further perspectives for names in multimodal environments are briefly sketched.
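The two semantic clauses quoted in this abstract sit one modal step apart: the universal modality ■ quantifies over all worlds, while the difference modality ⍯ quantifies over all worlds other than the current one. The expressive equivalence claimed above rests, at bottom, on the standard interdefinability of the two (a reading aid sketched from the clauses given, not quoted from the paper):

```latex
% Universal vs. difference modality: the two clauses from the abstract,
% followed by the standard interdefinability that links them.
\begin{align*}
  x \models \blacksquare\varphi &\iff \forall y\,(y \models \varphi)\\
  x \models \mathop{[\ne]}\varphi &\iff \forall y\,(y \ne x \rightarrow y \models \varphi)\\
  \blacksquare\varphi &\leftrightarrow \varphi \wedge \mathop{[\ne]}\varphi
  \quad\text{(true everywhere = true here and everywhere else)}
\end{align*}
```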
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
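The binary-quantifier notation ιx[F, G] used here is usually given the Russellian reading, on which ‘The F is G’ packs existence, uniqueness, and predication into a single claim. As a sketch (this is the textbook uniqueness clause, not a formula quoted from the paper, whose free-logic rules are proof-theoretic):

```latex
% 'The F is G': exactly one thing is F, and that thing is G.
\iota x[F, G] \;\leftrightarrow\;
  \exists x\bigl(F(x) \wedge \forall y\,(F(y) \rightarrow y = x) \wedge G(x)\bigr)
```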
Epistemic logics based on the possible worlds semantics suffer from the problem of logical omniscience, whereby agents are described as knowing all logical consequences of what they know, including all tautologies. This problem is doubly challenging: on the one hand, agents should be treated as logically non-omniscient, and on the other hand, as moderately logically competent. Many responses to logical omniscience fail to meet this double challenge because the concepts of knowledge and reasoning are not properly separated. In this paper, I present a dynamic logic of knowledge that models an agent’s epistemic state as it evolves over the course of reasoning. I show that the logic does not sacrifice logical competence on the altar of logical non-omniscience.