This paper develops a new framework for combining propositional logics, called "juxtaposition". Several general metalogical theorems are proved concerning the combination of logics by juxtaposition. In particular, it is shown that under reasonable conditions, juxtaposition preserves strong soundness, and that under reasonable conditions the juxtaposition of two consequence relations is a conservative extension of each of them. A general strong completeness result is proved. The paper then examines the philosophically important case of the combination of classical and intuitionist logics. Particular attention is paid to the phenomenon of collapse. It is shown that there are logics with two stocks of classical or intuitionist connectives that do not collapse. Finally, the paper briefly investigates the question of which rules, when added to these logics, lead to collapse.
The general methodology of "algebraizing" logics is used here for combining different logics. The combination of logics is represented as taking the colimit of the constituent logics in the category of algebraizable logics. The cocompleteness of this category as well as its isomorphism to the corresponding category of certain first-order theories are proved.
We present an algorithm for concept combination inspired and informed by research in cognitive and experimental psychology. Dealing with concept combination requires, from a symbolic AI perspective, coping with competing needs: the need for compositionality and the need to account for typicality effects. Building on our previous work on weighted logic, the proposed algorithm can be seen as a step towards managing both of these needs. More precisely, following a proposal of Hampton [1], it combines two weighted Description Logic formulas, each defining a concept, using the following general strategy. First, it selects all the features needed for the combination, based on the logical distinction between necessary and impossible features. Second, it determines the threshold and assigns new weights to the features of the combined concept, trying to preserve the relevance and the necessity of the features. We illustrate how the algorithm works using some paradigmatic examples discussed in the cognitive literature.
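The select-then-reweight strategy described in this abstract can be sketched in miniature. The feature names, weights, fallback rule, and threshold formula below are illustrative assumptions for exposition, not the authors' actual algorithm:

```python
# Illustrative sketch of combining two weighted concept descriptions
# (head and modifier), Hampton-style. All names, weights, and the
# threshold rule are hypothetical; the paper's algorithm may differ.

def combine(head, modifier, necessary=frozenset(), impossible=frozenset()):
    """Combine two {feature: weight} descriptions into one.

    Step 1: select candidate features, keeping necessary features and
    discarding impossible ones.
    Step 2: assign new weights, preferring the head's weight and falling
    back to the modifier's (a simple form of head dominance).
    """
    features = (set(head) | set(modifier) | set(necessary)) - set(impossible)
    combined = {f: head.get(f, modifier.get(f, 1.0)) for f in features}
    # Hypothetical threshold: half the total positive weight must be met.
    threshold = sum(w for w in combined.values() if w > 0) / 2
    return combined, threshold

# "Pet fish": head = fish, modifier = pet, with "aquatic" marked necessary.
fish = {"aquatic": 1.0, "has_fins": 0.7}
pet = {"domestic": 0.8, "companion": 0.9}
combined, threshold = combine(fish, pet, necessary={"aquatic"})
print(sorted(combined), threshold)
```

Under these toy numbers the combined concept keeps all four features and requires a total weight of 1.7, so an instance can miss a low-weight feature (say, fins) and still qualify.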
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + TR, based on the notion of rational closure, by inclusions p :: T(C) ⊑ D ("we have probability p that typical Cs are Ds"), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete, as in ALC.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by typicality inclusions of the form p :: T(C) ⊑ D, whose intuitive meaning is that "we believe with degree p that typical Cs are Ds". As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as for the underlying standard Description Logic ALC.
In this paper we propose a very general definition of combination of logics by means of the concept of sheaves of logics. We first discuss some properties of this general definition and list some problems, as well as connections to related work. As applications of our abstract setting, we show that the notion of possible-translations semantics, introduced in previous papers by the first author, can be described in categorial terms. Possible-translations semantics constitute illustrative cases, since they provide a new semantical account for abstract logical systems, particularly for many-valued and paraconsistent logics.
Deontic logic is devoted to the study of logical properties of normative predicates such as permission, obligation and prohibition. Since it is usual to apply these predicates to actions, many deontic logicians have proposed formalisms where actions and action combinators are present. Some standard action combinators are action conjunction, choice between actions and not doing a given action. These combinators resemble boolean operators, and therefore the theory of boolean algebra offers a well-known mathematical framework to study the properties of the classic deontic operators when applied to actions. In his seminal work, Segerberg uses constructions coming from boolean algebras to formalize the usual deontic notions. Segerberg's work provided the initial step towards understanding the logical properties of deontic operators when they are applied to actions. In recent years, other authors have proposed related logics. In this chapter we introduce Segerberg's work, study related formalisms and investigate further challenges in this area.
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous one to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts). Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable.
The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction. Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items). This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. 
This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation. Phase III: Software Assisted Analysis. The approach in the second phase was very 'manual' and time consuming; the third phase is intended to 'mechanize' much of the work involved by means of spreadsheets (to begin with). This increases reliability of calculations (though no errors were found, in fact) – but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, now each modus can be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish: its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, has been truly fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions 'mechanically' (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done 'manually', and therefore not exhaustively and with certainty.
It took over 72,000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has now definitely been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
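The grand-matrix idea behind the matricial analysis described above can be illustrated in miniature. For two items P and Q there are four elementary conjunctions (PQ, P·not-Q, not-P·Q, not-P·not-Q); a modus marks each as possible or impossible, giving 2^4 = 16 moduses. The encoding and numbering below are illustrative choices, not the book's own scheme:

```python
from itertools import product

# Elementary conjunctions for two items P and Q.
CELLS = ["PQ", "P~Q", "~PQ", "~P~Q"]

# A modus assigns possible (1) or impossible (0) to each cell;
# enumerating all assignments yields the 2**4 = 16 moduses of a
# 2-item grand matrix. (Illustrative encoding, not the book's.)
moduses = list(product([0, 1], repeat=len(CELLS)))
print(len(moduses))  # 16

# A propositional form corresponds to the set of moduses compatible
# with it: e.g. "P implies Q" requires the cell P~Q to be impossible.
implies = [m for m in moduses if m[CELLS.index("P~Q")] == 0]
print(len(implies))  # 8
```

With 3 or 4 items the same construction gives 2^8 = 256 and 2^16 = 65,536 moduses respectively, which is why the later phases needed spreadsheet automation.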
The result of combining classical quantificational logic with modal logic proves necessitism – the claim that necessarily everything is necessarily identical to something. This problem is reflected in the purely quantificational theory by theorems such as ∃x t=x; it is a theorem, for example, that something is identical to Timothy Williamson. The standard way to avoid these consequences is to weaken the theory of quantification to a certain kind of free logic. However, it has often been noted that in order to specify the truth conditions of certain sentences involving constants or variables that don't denote, one has to apparently quantify over things that are not identical to anything. In this paper I defend a contingentist, non-Meinongian metaphysics within a positive free logic. I argue that although certain names and free variables do not actually refer to anything, in each case there might have been something they actually refer to, allowing one to interpret the contingentist claims without quantifying over mere possibilia.
Type-logical semantics studies linguistic meaning with the help of the theory of types. The latter originated with Russell as an answer to the paradoxes, but has the additional virtue that it is very close to ordinary language. In fact, type theory is so much more similar to language than predicate logic is, that adopting it as a vehicle of representation can overcome the mismatches between grammatical form and predicate logical form that were observed by Frege and Russell. The grammatical forms of ordinary language sentences consequently may be taken to be much less misleading than logicians in the first half of the 20th century often thought them to be. This was realized by Richard Montague, who used the theory of types to translate fragments of ordinary language into a logical language. Semantics is commonly divided into lexical semantics, which studies the meaning of words, and compositional semantics, which studies the way in which complex phrases obtain a meaning from their constituents. The strength of type-logical semantics lies with the latter, but type-logical theories can be combined with many competing hypotheses about lexical meaning, provided these hypotheses are expressed using the language of type theory.
This paper embeds the core part of Discourse Representation Theory in the classical theory of types plus a few simple axioms that allow the theory to express key facts about variables and assignments on the object level of the logic. It is shown how the embedding can be used to combine core analyses of natural language phenomena in Discourse Representation Theory with analyses that can be obtained in Montague Semantics.
Dynamic conceptual reframing represents a crucial mechanism employed by humans, and partially by other animal species, to generate novel knowledge used to solve complex goals. In this talk, I will present a reasoning framework for knowledge invention and creative problem solving exploiting TCL: a non-monotonic extension of a Description Logic (DL) of typicality able to combine prototypical (commonsense) descriptions of concepts in a human-like fashion [1]. The proposed approach has been tested in the task of goal-driven concept invention [2,3] and has additionally been applied within the context of serendipity-based recommendation systems [4]. I will present the obtained results, the lessons learned, and the road ahead for this research path.
This paper is concerned with counterfactual logic and its implications for the modal status of mathematical claims. It is most directly a response to an ambitious program by Yli-Vakkuri and Hawthorne (2018), who seek to establish that mathematics is committed to its own necessity. I claim that their argument fails to establish this result for two reasons. First, their assumptions force our hand on a controversial debate within counterfactual logic. In particular, they license counterfactual strengthening (the inference from 'If A were true then C would be true' to 'If A and B were true then C would be true'), which many reject. Second, the system they develop is provably equivalent to appending the Deduction Theorem to a T modal logic. It is unsurprising that the combination of the Deduction Theorem with T results in necessitation; indeed, it is precisely for this reason that many logicians reject the Deduction Theorem in modal contexts. If the Deduction Theorem is unacceptable for modal logic, it cannot be assumed in deriving the necessity of mathematics.
This collection of articles was written over the last 10 years, and the most important and longest within the last year. I have also edited them to bring them up to date (2016). The copyright page has the date of this first edition, and new editions will be noted there as I edit old articles or add new ones. All the articles are about human behavior (as are all articles by anyone about anything), and so about the limitations of having a recent monkey ancestry (8 million years or much less, depending on viewpoint) and manifesting words and deeds within the framework of our innate psychology, as presented in the table of intentionality. As the famous evolutionist Richard Leakey says, it is critical to keep in mind not that we evolved from apes, but that in every important way we are apes. If everyone were given a real understanding of this (i.e., human ecology and psychology) in school, maybe civilization would have a chance.

In my view, these articles and reviews have many novel and highly useful elements, in that they use my own version of the recently (ca. 1980s) developed dual systems view of our brain and behavior to lay out a logical system of rationality (personality, psychology, mind, language, behavior, thought, reasoning, reality etc.) that is sorely lacking in the behavioral sciences (psychology, philosophy, literature, politics, anthropology, history, economics, sociology etc.). The philosophy centers around the two writers I have found most important, Ludwig Wittgenstein and John Searle, whose ideas I combine and extend within the dual system (two systems of thought) framework that has proven so useful in recent thinking and reasoning research. As I note, there is in my view essentially complete overlap between philosophy, in the strict sense of the enduring questions that concern the academic discipline, and the descriptive psychology of higher order thought (behavior).
Once one has grasped Wittgenstein's insight that there is only the issue of how the language game is to be played, one determines the Conditions of Satisfaction (what makes a statement true or satisfied etc.) and that is the end of the discussion.

Now that I think I understand how the games work, I have mostly lost interest in philosophy, which of course is how Wittgenstein said it should be. But since the problems are the result of our innate psychology, or, as Wittgenstein put it, due to the lack of perspicuity of language, they run throughout all human discourse, so there is endless need for philosophical analysis, not only in the 'human sciences' of philosophy, sociology, anthropology, political science, psychology, history, literature, religion, etc., but in the 'hard sciences' of physics, mathematics, and biology. It is universal to mix the language game questions with the real scientific ones as to what the empirical facts are. Scientism is ever present, and the master laid it before us long ago, i.e., Wittgenstein (hereafter W), beginning principally with the Blue and Brown Books in the early 1930s.

"Philosophers constantly see the method of science before their eyes and are irresistibly tempted to ask and answer questions in the way science does. This tendency is the real source of metaphysics and leads the philosopher into complete darkness." (BBB p18)

Nevertheless, a real understanding of Wittgenstein's work, and hence of how our psychology functions, is only beginning to spread in the second decade of the 21st century, due especially to P.M.S. Hacker (hereafter H) and Daniele Moyal-Sharrock (hereafter DMS), but also to many others, some of the more prominent of whom I mention in the articles.
When I read 'On Certainty' a few years ago, I characterized it in an Amazon review as the Foundation Stone of Philosophy and Psychology and the most basic document for understanding behavior, and at about the same time DMS was writing articles noting that it had solved the millennia-old epistemological problem of how we can know anything for certain. I realized that W was the first one to grasp what is now characterized as the two systems or dual systems of thought, and I generated a dual systems (S1 and S2) terminology which I found to be very powerful in describing behavior. I took the small table that John Searle (hereafter S) had been using, expanded it greatly, and found later that it integrated perfectly with the framework being used by various current workers in thinking and reasoning research.

Since they were published individually, I have tried to make the book reviews and articles stand by themselves, insofar as possible, and this accounts for the repetition of various sections, notably the table and its explanation. I start with a short article that presents the table of intentionality and briefly describes its terminology and background. Next is by far the longest article, which attempts a survey of the work of W and S as it relates to the table, and so to an understanding or description (not explanation, as W insisted) of behavior.

The key to everything about us is biology, and it is obliviousness to it that leads millions of smart educated people like Obama, Chomsky, Clinton and the Pope to espouse suicidal utopian ideals that inexorably lead straight to Hell on Earth. As W noted, it is what is always before our eyes that is the hardest to see. We live in the world of conscious deliberative linguistic System 2, but it is unconscious, automatic, reflexive System 1 that rules. This is Searle's The Phenomenological Illusion (TPI), Pinker's Blank Slate, and Tooby and Cosmides' Standard Social Science Model.
Democracy and equality are wonderful ideals, but without strict controls, selfishness and stupidity gain the upper hand and soon destroy any nation and any world that adopts them. The monkey mind steeply discounts the future, and so we sell our children's heritage for temporary comforts.

The astute may wonder why we cannot see System 1 at work, but it is clearly counterproductive for an animal to be thinking about or second-guessing every action, and in any case there is no time for the slow, massively integrated System 2 to be involved in the constant stream of split-second 'decisions' we must make. As W noted, our 'thoughts' (T1, or the thoughts of System 1) must lead directly to actions. It is my contention that the table of intentionality (rationality, mind, thought, language, personality etc.) that features prominently here describes more or less accurately, or at least serves as a heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology but everything else (history, literature, mathematics, politics etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view them include both conscious deliberative System 2 and unconscious automated System 1 actions or reflexes.

Thus all the articles, like all behavior, are intimately connected if one knows how to look at them. As I note, The Phenomenological Illusion (oblivion to our automated System 1) is universal and extends not merely throughout philosophy but throughout life. I am sure that Chomsky, Obama, Zuckerberg and the Pope would be incredulous if told that they suffer from the same problem as Hegel, Husserl and Heidegger, but it's clearly true. While the phenomenologists only wasted a lot of people's time, they are wasting the earth.

My writings are available in updated versions as paperbacks and Kindles on Amazon.
Talking Monkeys: Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet - Articles and Reviews 2006-2017 (2017) ASIN B071HVC7YP.

The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle - Articles and Reviews 2006-2016 (2017) ASIN B071P1RP1B.

Suicidal Utopian Delusions in the 21st Century: Philosophy, Human Nature and the Collapse of Civilization - Articles and Reviews 2006-2017 (2017), 2nd printing with corrections (Feb 2018) ASIN B0711R5LGX.

Suicide by Democracy: An Obituary for America and the World (2018) ASIN B07CQVWV9C.
Revised version of a chapter in J. N. Mohanty and W. McKenna (eds.), Husserl's Phenomenology: A Textbook, Lanham: University Press of America, 1989, 29–67.

Logic for Husserl is a science of science, a science of what all sciences have in common in their modes of validation. Thus logic deals with universal laws relating to truth, to deduction, to verification and falsification, and with laws relating to theory as such, and to what makes for theoretical unity, both on the side of the propositions of a theory and on the side of the domain of objects to which these propositions refer. This essay presents a systematic overview of Husserl's views on these matters as put forward in his Logical Investigations. It shows how Husserl's theory of linguistic meanings as species of mental acts, his formal ontology of part, whole and dependence, his theory of meaning categories, and his theory of categorial intuition combine with his theory of science to form a single whole. Finally, it explores the ways in which Husserl's ideas on these matters can be put to use in solving problems in the philosophy of language, logic and mathematics in a way which does justice to the role of mental activity in each of these domains while at the same time avoiding the pitfalls of psychologism.
When people combine concepts, the results are often characterised as "hybrid", "impossible", or "humorous". However, when simply considering them in terms of extensional logic, the novel concepts, understood as conjunctive concepts, will often lack meaning, having an empty extension (consider "a tooth that is a chair", "a pet flower", etc.). Still, people use different strategies to produce new non-empty concepts: additive or integrative combination of features, alignment of features, instantiation, etc. All these strategies involve the ability to deal with conflicting attributes and the creation of new (combinations of) properties. We here consider in particular the case where a Head concept has superior 'asymmetric' control over steering the resulting concept combination (or hybridisation) with a Modifier concept. Specifically, we propose a dialogical approach to concept combination and discuss an implementation based on axiom weakening, which models the cognitive and logical mechanics of this asymmetric form of hybridisation.
This paper provides an original approach to research on the logical processes that determine how certain forms participate in others. By introducing the concept of relational participation, the problems of self-referentiality of the Platonic forms can be dealt with more effectively. Applying this to the forms of likeness and unlikeness in Parmenides 132d-133a reveals a possible way to resolve different versions of the Third Man Argument. The method of generating numbers from oddness and evenness may also be of interest; relational participation in these forms clarifies the interpretation of Parmenides 143e-144a.
We present a framework for epistemic logic, modeling the logical aspects of System 1 and System 2 cognitive processes, as per dual process theories of reasoning. The framework combines non-normal worlds semantics with the techniques of Dynamic Epistemic Logic. It models non-logically-omniscient, but moderately rational agents: their System 1 makes fast sense of incoming information by integrating it on the basis of their background knowledge and beliefs. Their System 2 allows them to slowly, step-wise unpack some of the logical consequences of such knowledge and beliefs, by paying a cognitive cost. The framework is applied to three instances of limited rationality, widely discussed in cognitive psychology: Stereotypical Thinking, the Framing Effect, and the Anchoring Effect.
We study a fragment of Intuitionistic Linear Logic combined with non-normal modal operators. Focusing on the minimal modal logic, we provide a Gentzen-style sequent calculus as well as a semantics in terms of Kripke resource models. We show that the proof theory is sound and complete with respect to the class of minimal Kripke resource models. We also show that the sequent calculus allows cut elimination. We put the logical framework to use by instantiating it as a logic of agency. In particular, we apply it to reason about the resource-sensitive use of artefacts.
ABSTRACT: The modal systems of the Stoic logician Chrysippus and the two Hellenistic logicians Philo and Diodorus Cronus have survived in a fragmentary state in several sources. From these it is clear that Chrysippus was acquainted with Philo's and Diodorus' modal notions, and also that he developed his own in contrast to Diodorus' and in some way incorporated Philo's. The goal of this paper is to reconstruct the three modal systems, including their modal definitions and modal theorems, and to make clear the exact relations between them; moreover, to elucidate the philosophical reasons that may have led Chrysippus to modify his predecessors' modal concepts in the way he did. It becomes apparent that Chrysippus skillfully combined Philo's and Diodorus' modal notions, making only a minimal change to Diodorus' concept of possibility; and that he thus obtained a system of modalities (logical and physical) which fits perfectly into Stoic philosophy.
Llull and Leibniz both subscribed to conceptual atomism: the belief that the majority of concepts are compounds constructed from a relatively small number of primitive concepts. Llull worked out techniques for finding the logically possible combinations of his primitives, but Leibniz criticized Llull’s execution of these techniques. This article argues that Leibniz was right about things being more complicated than Llull thought but that he was wrong about the details. The paper attempts to correct these details.
The truth functional account of conditional statements 'if A then B' is not only inadequate; it eliminates the very conditionality expressed by 'if'. Focusing only on the truth-values of the statements 'A' and 'B' and different combinations of these, one is bound to miss out on the conditional relation expressed between them. All approaches that treat conditionals as functions of their antecedents and consequents will end up in some sort of logical atomism where causal matters simply are reduced to the joint occurrence of A and B.
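The truth-functional reading criticized in this abstract can be made concrete with a standard truth table (a textbook illustration, not taken from the paper): the material conditional counts 'if A then B' as true in every row where A is false, regardless of any connection between A and B.

```python
# Material conditional: "if A then B" treated purely as a function
# of the truth-values of A and B (the reading the paper criticizes).
def material_conditional(a, b):
    return (not a) or b

# Truth table: the conditional is false only in the row (True, False);
# it is true whenever A is false, no matter what B says.
rows = [(a, b, material_conditional(a, b))
        for a in (True, False) for b in (True, False)]
for a, b, v in rows:
    print(a, b, v)
```

So 'if the moon is made of cheese, then 2+2=5' comes out true on this account, which is exactly the loss of conditionality the paper objects to.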
We introduce a family of operators to combine Description Logic concepts. They aim to characterise complex concepts that apply to instances that satisfy "enough" of the concept descriptions given. For instance, an individual might not have any tusks, but still be considered an elephant. To formalise the meaning of "enough", the operators take a list of weighted concepts as arguments, and a certain threshold to be met. We commence a study of the formal properties of these operators, and study some variations. The intended applications concern the representation of cognitive aspects of classification tasks: the interdependencies among the attributes that define a concept, the prototype of a concept, and the typicality of the instances.
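The "enough" operators described here can be read as weighted threshold classifiers. A minimal sketch of that reading follows; the concept names, weights, and threshold value are invented for illustration and are not the paper's actual definitions:

```python
# Threshold operator over weighted concepts: an individual satisfies
# the combined concept iff the summed weights of the member concepts
# it instantiates meet the threshold. Names and numbers below are
# illustrative assumptions only.

def satisfies_threshold(individual, weighted_concepts, threshold):
    """individual: set of atomic concepts the individual instantiates."""
    score = sum(w for concept, w in weighted_concepts if concept in individual)
    return score >= threshold

# Hypothetical weighted description of "elephant" with threshold 4.0.
elephant = [("has_trunk", 3.0), ("has_tusks", 1.0),
            ("is_grey", 1.0), ("is_large", 1.0)]

# A tuskless individual scores 3.0 + 1.0 + 1.0 = 5.0 >= 4.0,
# so it still counts as an elephant, matching the abstract's example.
tuskless = {"has_trunk", "is_grey", "is_large"}
print(satisfies_threshold(tuskless, elephant, 4.0))  # True
```

The weights encode how central each attribute is, so no single attribute is strictly necessary as long as the others compensate.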
Traditional epistemology of knowledge and belief can be succinctly characterized as JTB-epistemology, i.e., it is characterized by the thesis that knowledge is justified true belief. Since Gettier's trail-blazing paper of 1963 this account has come under heavy attack. The aim of this paper is to study the Gettier problem and related issues in the framework of topological epistemic logic. It is shown that in the framework of topological epistemic logic, Gettier situations necessarily occur for most topological models of knowledge and belief. On the other hand, there exists a special class of topological models (based on so-called nodec spaces) for which traditional JTB-epistemology is valid. Further, it is shown that for each topological model of Stalnaker's combined logic KB of knowledge and belief, a canonical JTB-model (its JTB-doppelganger) can be constructed that shares many structural properties with the original model but is free of Gettier situations. The topological model and its JTB-doppelganger share the same justified belief operator and have very similar knowledge operators. Seen from a somewhat different perspective, the JTB-account of epistemology amounts to a simplification of a more general epistemological account of knowledge and belief that assumes that these two concepts may differ in some cases. The JTB-account of knowledge and belief assumes that the epistemic agent's cognitive powers are rather large; thereby, in JTB-epistemology, Gettier cases do not occur. Eventually, it is shown that for all topological models of Stalnaker's KB-logic, Gettier situations are topologically characterized as nowhere dense situations. This entails that Gettier situations are epistemologically invisible in the sense that they can neither be known nor believed with justification with respect to the knowledge operator and the belief operator of the models involved. Keywords: Stalnaker's logic KB of knowledge and belief; topological epistemology; epistemic invisibility; doxastic invisibility; Gettier cases.
Causal models show promise as a foundation for the semantics of counterfactual sentences. However, current approaches face limitations compared to the alternative similarity theory: they only apply to a limited subset of counterfactuals, and the connection to counterfactual logic is not straightforward. This paper addresses these difficulties using exogenous interventions, where causal interventions change the values of exogenous variables rather than structural equations. This model accommodates judgments about backtracking counterfactuals, extends to logically complex counterfactuals, and validates familiar principles of counterfactual logic. It thereby combines the interventionist intuitions of the causal approach with the logical advantages of the similarity approach.
Doing justice to Maximus, any philosophy which does not include mysticism will be false as philosophy. Our metaphysics must be mystical in order to be rational. In Maximus’ doctrine, then, Christ comes not to destroy but to fulfill the metaphysics of mystery elaborated by the philosophers. For him there can be no separation between philosophy and theology, or between natural and revealed theology. Thereby, Christology and liturgical mysticism are not additions to a Neoplatonic, Aristotelian, or other metaphysics. Maximus’ concern was to continue, not the philosophical tradition of the Aristotelian commentators, but the theological one of the Fathers. He was not an Aristotelian commentator himself. Union and distinction are basic logical concepts in Maximus’ thinking, and Chalcedonian logic is the application of these concepts. Only in this way can one talk about a Christianization of Aristotelian logic. St. Maximus the Confessor synthesized Aristotelian influences with those of Platonism in order to move beyond the daring speculations of Origenist cosmology. He had an extraordinary ability to combine metaphysical requirements with the effort of defining the dogma of the faith, and monastic experience with depth of thought, succeeding in proposing a new conception in which all cultural and religious influences converge. This study analyzes the relationship between logoi and energeia (the intentional or “logical” energeia and the ontology of divine energy as ontological “logic”) within Maximian cosmology, by reference to Palamite theology. The concept of the logoi plays, for St. Maximus, a role similar in many respects to that of the energies (energeiai) in the Cappadocian Fathers, but this functional similarity should not lead to an identification of the logoi with the energies.
According to tradition, logic is normative for reasoning. According to many contemporary philosophers of logic, there is more than one correct logic. What is the relationship between these two strands of thought? This paper makes two claims. First, logic is doubly normative for reasoning because, in addition to constraining the combinations of beliefs that we may have, logic also constrains the methods by which we may form them. Second, given that logic is doubly normative for reasoning, a wide array of logical pluralisms are inconsistent with the normativity of logic, as they entail contradictory claims about how agents ought to reason. Thus, if logic is normative for reasoning, these pluralisms are untenable.
A graph-theoretic account of fibring of logics is developed, capitalizing on the interleaving characteristics of fibring at the linguistic, semantic and proof levels. Fibring of two signatures is seen as a multi-graph (m-graph) where the nodes and the m-edges include the sorts and the constructors of the signatures at hand. Fibring of two models is an m-graph where the nodes and the m-edges are the values and the operations in the models, respectively. Fibring of two deductive systems is an m-graph whose nodes are language expressions and whose m-edges represent the inference rules of the two original systems. The sobriety of the approach is confirmed by proving that all the fibring notions are universal constructions. This graph-theoretic view is general enough to accommodate very different fibrings of propositional-based logics, encompassing logics with non-deterministic semantics, logics with an algebraic semantics, logics with partial semantics and substructural logics, among others. Soundness and weak completeness are proved to be preserved under very general conditions. Strong completeness is also shown to be preserved under tighter conditions. In this setting, the collapsing problem appearing in several combinations of logic systems can be avoided.
A. J. Ayer’s Language, Truth, and Logic was responsible for introducing the Vienna Circle’s ideas, developed within a Germanophone framework, to an Anglophone readership. Inevitably, this migration from one context to another resulted in the alteration of some of the concepts being transmitted. Such alterations have served to facilitate a number of false impressions of Logical Empiricism from which recent scholarship still tries to recover. In this paper, I will attempt to point to the ways in which LTL has helped to foster the various mistaken stereotypes about Logical Empiricism which were combined into the received view. I will begin by examining Ayer’s all-too-brief presentation of an Anglocentric lineage for his ideas. This lineage, as we shall see, simply omits the major 19th-century Germanophone influences on the rise of analytic philosophy. The Germanophone ideas he presents are selectively introduced into an Anglophone context and directed towards various concerns that arose within that context. I will focus on the differences between Carnap’s version of the overcoming of metaphysics and Ayer’s reconfiguration into what he calls the elimination of metaphysics. Having discussed the above, I will very briefly outline the consequences that Ayer’s radicalisation of the Vienna Circle’s doctrines had on the subsequent Anglophone reception of Logical Empiricism.
Logical pluralism is the view that there is more than one correct logic. Most logical pluralists think that logic is normative in the sense that you make a mistake if you accept the premisses of a valid argument but reject its conclusion. Some authors have argued that this combination is self-undermining: Suppose that L1 and L2 are correct logics that coincide except for the argument from Γ to φ, which is valid in L1 but invalid in L2. If you accept all sentences in Γ, then, by normativity, you make a mistake if you reject φ. In order to avoid mistakes, you should accept φ or suspend judgment about φ. Both options are problematic for pluralism. Can pluralists avoid this worry by rejecting the normativity of logic? I argue that they cannot. All else being equal, the argument goes through even if logic is not normative.
The paper develops Lambda Grammars, a form of categorial grammar that, unlike other categorial formalisms, is non-directional. Linguistic signs are represented as sequences of lambda terms and are combined with the help of linear combinators.
Evolutionary psychology and the selectionist theories of neural development are usually regarded as two unrelated theories addressing two logically distinct questions. The focus of evolutionary psychology is the phylogeny of the human mind, whereas the selectionist theories of neural development analyse the ontogeny of the mind. This paper will endeavour to combine these two approaches in the explanation of the human mind. Doing so might help in overcoming some of the criticisms of both theories. The first part of the paper mentions three standard objections to evolutionary psychology and then outlines three philosophical problems to which evolutionary psychology has to offer a solution. The second part will try to show that an approach combining evolutionary psychology and the selectionist theory of neural development might overcome some of these objections.
This paper develops a logical theory that unifies all three standard types of argumentative attack in AI, namely rebutting, undercutting and undermining attacks. We build on default justification logic, which already represents undercutting and rebutting attacks, and we add undermining attacks. Intuitively, undermining does not target a default inference, as undercutting does, or a default conclusion, as rebutting does, but rather attacks an argument’s premise as a starting point for default reasoning. In default justification logic, reasoning starts from a set of premises, which is then extended by conclusions that hold by default. We argue that modeling undermining defeaters in the setting of default theories requires changing the set of premises upon receiving new information. To model changes to premises, we give a dynamic aspect to default justification logic by using techniques from the logic of belief revision. More specifically, undermining is modeled with belief revision operations that include contracting a set of premises, that is, removing some information from it. The novel combination of default reasoning and belief revision in justification logic enriches both approaches to reasoning under uncertainty. By the end of the paper, we show some important aspects of defeasible argumentation in which our logic compares favorably to structured argumentation frameworks.
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article will explore concept combinations and will argue that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint probability distribution with a distribution based on a separability assumption using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
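The second validation method named in this abstract, a chi-square goodness-of-fit test of a joint distribution against a separability (independence) assumption, can be sketched as follows. The 2x2 counts below are invented for illustration; the paper's actual analysis uses free-association data for bi-ambiguous concept combinations.

```python
def chi_square_separability(observed):
    """Chi-square statistic for a 2x2 table of joint outcome counts,
    tested against the product of its marginals (the separability /
    independence assumption). Illustrative sketch only.
    """
    total = sum(sum(row) for row in observed)
    row_sums = [sum(row) for row in observed]
    col_sums = [sum(col) for col in zip(*observed)]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under separability: product of marginals.
            expected = row_sums[i] * col_sums[j] / total
            stat += (observed[i][j] - expected) ** 2 / expected
    return stat  # df = 1 for a 2x2 table

# If the statistic exceeds the 5% critical value for df=1 (about
# 3.841), the separability assumption is rejected for these counts.
counts = [[40, 10], [15, 35]]
print(chi_square_separability(counts) > 3.841)  # True
```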
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
“Trends in Logic XVI: Consistency, Contradiction, Paraconsistency, and Reasoning - 40 years of CLE” is being organized by the Centre for Logic, Epistemology and the History of Science at the State University of Campinas (CLE-Unicamp) from September 12th to 15th, 2016, with the auspices of the Brazilian Logic Society, Studia Logica and the Polish Academy of Sciences. The conference is intended to celebrate the 40th anniversary of CLE, and is centered around the areas of logic, epistemology, philosophy and history of science, while bringing together scholars in the fields of philosophy, logic, mathematics, computer science and other disciplines who have contributed significantly to what Studia Logica is today and to what CLE has achieved in its four decades of existence. It intends to celebrate CLE’s strong influence in Brazil and Latin America and the tradition of investigating formal methods inspired by, and devoted to, philosophical views, as well as philosophical problems approached by means of formal methods. The title of the event commemorates one of the three main areas of CLE, what has been called the “Brazilian school of paraconsistency”, combining such a pluralist view about logic and reasoning.
We generalize intuitionistic tense logics to the multi-modal case by placing grammar logics on an intuitionistic footing. We provide axiomatizations for a class of base intuitionistic grammar logics as well as provide axiomatizations for extensions with combinations of seriality axioms and what we call "intuitionistic path axioms". We show that each axiomatization is sound and complete with completeness being shown via a typical canonical model construction.
Sentences about logic are often used to show that certain embedding expressions, including attitude verbs, conditionals, and epistemic modals, are hyperintensional. Yet it is not clear how to regiment “logic talk” in the object language so that it can be compositionally embedded under such expressions. This paper does two things. First, it argues against a standard account of logic talk, viz., the impossible worlds semantics. It is shown that this semantics does not easily extend to a language with propositional quantifiers, which are necessary for regimenting some logic talk. Second, it develops an alternative framework based on logical expressivism, which explains logic talk using shifting conventions. When combined with the standard S5π+ semantics for propositional quantifiers, this framework results in a well-behaved system that does not face the problems of the impossible worlds semantics. It can also be naturally extended with hybrid operators to regiment a broader range of logic talk, e.g., claims about what laws hold according to other logics. The resulting system, called hyperlogic, is therefore a better framework for modeling logic talk than previous accounts.
In [5], Béziau provides a means by which Gentzen’s sequent calculus can be combined with the general semantic theory of bivaluations. In doing so, according to Béziau, it is possible to construe the abstract “core” of logics in general, where logical syntax and semantics are “two sides of the same coin”. The central suggestion there is that, by way of a modification of the notion of maximal consistency, it is possible to prove soundness and completeness for any normal logic. However, the reduction to bivaluation may be a side effect of the architecture of ordinary sequents, which is both overly restrictive and entails certain expressive restrictions over the language. This paper provides an expansion of Béziau’s completeness results for logics, by showing that there is a natural extension of that line of thinking to n-sided sequent constructions. Through techniques analogous to Béziau’s construction, it is possible, in this setting, to construct abstract soundness and completeness results for n-valued logics.
Trivalence is quite natural for deontic action logic, where actions are treated as good, neutral or bad. We present the ideas of trivalent deontic logic after J. Kalinowski and its realisation in a 3-valued logic of M. Fisher, along with two systems designed by the authors of the paper: a 4-valued logic inspired by N. Belnap’s logic of truth and information and a 3-valued logic based on nondeterministic matrices. Moreover, we combine Kalinowski’s idea of trivalence with deontic action logic based on boolean algebra.
Accurate and precise trajectory tracking is crucial for a quadrotor operating in disturbed environments. This paper presents a novel hybrid tracking controller for a quadrotor UAV that combines adaptive and fuzzy logic control. The adaptive fuzzy controller is implemented to govern the behavior of a two-degrees-of-freedom quadrotor UAV. The proposed controller allows controlling the movement of the UAV to track a given trajectory in a 2D vertical plane. The fuzzy logic system provides automatic adjustment of the adaptive parameters to reduce tracking errors and improve the quality of the controller. The results showed good performance of the control law in a quadrotor trajectory tracking task. To show the effectiveness of the intelligent controller, simulation results are given to confirm the advantages of the proposed control method, compared with fuzzy and proportional-integral-derivative (PID) control methods.
ABSTRACT Theories of sets such as Zermelo-Fraenkel set theory are usually presented as the combination of two distinct kinds of principles: logical and set-theoretic principles. The set-theoretic principles are imposed ‘on top’ of first-order logic. This is in agreement with a traditional view of logic as universally applicable and topic neutral. Such a view of logic has been rejected by the intuitionists, on the ground that quantification over infinite domains requires the use of intuitionistic rather than classical logic. In the following, I consider constructive set theories, which use intuitionistic rather than classical logic, and argue that they manifest a distinctive interdependence or an entanglement between sets and logic. In fact, Martin-Löf type theory identifies fundamental logical and set-theoretic notions. Remarkably, one of the motivations for this identification is the thought that classical quantification over infinite domains is problematic, while intuitionistic quantification is not. The approach to quantification adopted in Martin-Löf’s type theory is subtly interconnected with its predicativity. I conclude by recalling key aspects of an approach to predicativity inspired by Poincaré, which focuses on the issue of correct quantification over infinite domains, and relate it back to Martin-Löf type theory.
Although it seems intuitively clear that acts of requesting are different from acts of commanding, it is not very easy to state their differences precisely in dynamic terms. In this paper we show that it becomes possible to characterize, at least partially, the effects of acts of requesting and compare them with the effects of acts of commanding by combining dynamified deontic logic with epistemic logic. One interesting result is the following: each act of requesting is appropriately differentiated from an act of commanding with the same content, but for each act of requesting there is another act of commanding with much more complex content which updates models in exactly the same way as it does. We will also consider an application of our characterization of acts of requesting to acts of asking yes-no questions. It yields a straightforward formalization of the view of acts of asking questions as requests for information.
(1) We will begin by offering a short introduction to Epistemic Logic and presenting Fitch’s paradox in an epistemic-modal logic. (2) Then, we will proceed to presenting three Epistemic Temporal logical frameworks created by Hoshi (2009): TPAL (Temporal Public Announcement Logic), TAPAL (Temporal Arbitrary Public Announcement Logic) and TPAL+P! (Temporal Public Announcement Logic with Labeled Past Operators). We will show how Hoshi stated the Verificationist Thesis in the language of TAPAL and analyze his argument on why this version of it is immune from paradox. (3) Edgington (1985) offered an interpretation of the Verificationist Thesis that blocks Fitch’s paradox, and we will propose a way to formulate it in a TAPAL-based language. The language we will use is a combination of TAPAL and TPAL+P! with an Indefinite (Unlabeled) Past Operator (TAPAL+P!+P). Using indexed satisfiability relations (as introduced in (Wang 2010; 2011)) we will offer a prospective semantics for this language. We will investigate whether the tentative reformulation of Edgington’s Verificationist Thesis in TAPAL+P!+P is free from paradox and adequate to Edgington’s ideas on how “all truths are knowable” should be interpreted.
An important problem with machine learning is that when the label number n>2, it is very difficult to construct and optimize a group of learning functions, and we wish that optimized learning functions remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In LBI, every label’s learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a big enough sample with labels, without preparing different samples for different labels. A group of CM algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved and becomes the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced or local convergence exists. The CM iteration algorithm needs to be combined with neural networks for MMI classifications on high-dimensional feature spaces. LBI needs further studies for the unification of statistics and logic.
Kate Manne’s Down Girl: The Logic of Misogyny combines traditional conceptual analysis and feminist conceptual engineering with critical exploration of cases drawn from popular culture and current events in order to produce an ameliorative account of misogyny, i.e., one that will help address the problems of misogyny in the actual world. A feminist account of misogyny that is both intersectional and ameliorative must provide theoretical tools for recognizing misogyny in its many-dimensional forms, as it interacts and overlaps with other oppressions. While Manne thinks subtly about many of the material conditions that create misogyny as a set of normative social practices, she does not fully extend this care to the other intersectional forms of oppression she discusses. After touching on the book’s strengths, I track variations of its main problem, namely, its failure to fully conceive of oppressions besides sexism and misogyny as systemic patterns of social practices, as inherently structural rather than mere collections of individual beliefs and behaviors.
A graph-theoretic account of logics is explored based on the general notion of m-graph (that is, a graph where each edge can have a finite sequence of nodes as source). Signatures, interpretation structures and deduction systems are seen as m-graphs. After defining a category freely generated by an m-graph, formulas and expressions in general can be seen as morphisms. Moreover, derivations involving rule instantiation are also morphisms. Soundness and completeness theorems are proved. As a consequence of the generality of the approach, our results apply to very different logics encompassing, among others, substructural logics as well as logics with nondeterministic semantics, and subsume all logics endowed with an algebraic semantics.
Imperatives cannot be true or false, so they are shunned by logicians. And yet imperatives can be combined by logical connectives: "kiss me and hug me" is the conjunction of "kiss me" with "hug me". This example may suggest that declarative and imperative logic are isomorphic: just as the conjunction of two declaratives is true exactly if both conjuncts are true, the conjunction of two imperatives is satisfied exactly if both conjuncts are satisfied—what more is there to say? Much more, I argue. "If you love me, kiss me", a conditional imperative, mixes a declarative antecedent ("you love me") with an imperative consequent ("kiss me"); it is satisfied if you love and kiss me, violated if you love but don't kiss me, and avoided if you don't love me. So we need a logic of three-valued imperatives which mixes declaratives with imperatives. I develop such a logic.
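The three-valued evaluation described in this abstract (satisfied, violated, avoided) can be sketched directly. This is a minimal illustration of the stated truth conditions for a conditional imperative, not the full logic the paper develops; all names are chosen here for illustration.

```python
from enum import Enum

class Status(Enum):
    SATISFIED = "satisfied"
    VIOLATED = "violated"
    AVOIDED = "avoided"

def conditional_imperative(antecedent_true, consequent_satisfied):
    """Evaluate 'if A, do C' three-valuedly: satisfied if A holds
    and C is carried out, violated if A holds and C is not, and
    avoided if A fails. Sketch of the abstract's truth conditions."""
    if not antecedent_true:
        return Status.AVOIDED
    return Status.SATISFIED if consequent_satisfied else Status.VIOLATED

# "If you love me, kiss me"
print(conditional_imperative(True, True).value)    # satisfied
print(conditional_imperative(True, False).value)   # violated
print(conditional_imperative(False, False).value)  # avoided
```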