What corresponds in Hegel to the present-day ‘transcendental-pragmatic’ concept of ultimate grounding is his claim to the absoluteness of logic. Hegel’s fundamental intuition is that of a ‘backward-going grounding’ which retrieves the initially unproved presuppositions, thereby ‘wrapping itself into a circle’ – the project of the self-grounding of logic, understood as the self-explication of logic by logical means. Yet this is not about one of the many ‘logics’ which, as formal constructs, cannot claim absoluteness. It is rather a fundamentallogic that makes logical structures possible at all and so has transcendental character. The principle of non-contradiction is an example of this. What is essential is that it is ‘under-cover efficient’ as soon as meaningful concepts are used. Self-explication of the fundamentallogic then means explicating its implicit under-cover validity, and doing so by means of the fundamentallogic itself. As is shown, this is the business of dialectic, which is thereby to be understood as the ultimate grounding of the fundamentallogic. This is analyzed in detail using the example of the being/non-being dialectic. As is demonstrated, each explication step generates a new implicit issue and therewith a new explication discrepancy, inducing an antinomical structure that drives the explication procedure forward anew. The procedure is thus entirely determined by itself. Decisive for the ultimate-grounding argumentation is that an objectively verifiable procedure is thereby found, which is apparently possible only in a Hegelian framework. By contrast, the immediate evidence of a speech act claimed by the transcendental-pragmatic position has only private character, which is irrelevant from a grounding-theoretic point of view.
Autologos. A dialogue on fundamentallogic. - In this dialogue between three partners, an attempt is made to prove the logical prerequisites of any meaningful dialogue by means of transcendental arguments. Among these inescapable logical premises are a semantics as strong as that of the modal logic S5, and an epistemic anti-realism.
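As a rough illustration of what the S5 claim in this abstract amounts to (the toy model below is ours, not from the dialogue): S5 semantics requires the accessibility relation between possible worlds to be an equivalence relation, which validates the characteristic principle ◇p → □◇p ("what is possible is necessarily possible"). A minimal finite-model check:

```python
# A toy Kripke model whose accessibility relation is an
# equivalence relation (the S5 frame condition): two
# equivalence classes, {0, 1} and {2}.
WORLDS = {0, 1, 2}
ACCESS = {0: {0, 1}, 1: {0, 1}, 2: {2}}
VAL = {0: False, 1: True, 2: False}  # worlds where p holds

def possibly_p(w):
    """◇p at world w: p holds at some accessible world."""
    return any(VAL[v] for v in ACCESS[w])

def necessarily_possibly_p(w):
    """□◇p at world w: ◇p holds at every accessible world."""
    return all(possibly_p(v) for v in ACCESS[w])

# The S5 principle ◇p → □◇p holds at every world of the model:
print(all((not possibly_p(w)) or necessarily_possibly_p(w)
          for w in WORLDS))  # True
```

On a frame whose relation fails to be an equivalence relation (e.g. one that is not symmetric), this principle can fail, which is what makes the S5 condition a substantive semantic commitment.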
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts—all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives—explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion—which is, I argue, substantially correct—that imperatives could not have propositions, or truth conditions, as their meanings.
“Second-order Logic” in Anderson, C.A. and Zeleny, M., Eds. Logic, Meaning, and Computation: Essays in Memory of Alonzo Church. Dordrecht: Kluwer, 2001. Pp. 61–76. Abstract. This expository article focuses on the fundamental differences between second-order logic and first-order logic. It is written entirely in ordinary English without logical symbols. It employs second-order propositions and second-order reasoning in a natural way to illustrate the fact that second-order logic is actually a familiar part of our traditional intuitive logical framework and that it is not an artificial formalism created by specialists for technical purposes. To illustrate some of the main relationships between second-order logic and first-order logic, this paper introduces basic logic, a kind of zero-order logic, which is more rudimentary than first-order and which is transcended by first-order in the same way that first-order is transcended by second-order. The heuristic effectiveness and the historical importance of second-order logic are reviewed in the context of the contemporary debate over the legitimacy of second-order logic. Rejection of second-order logic is viewed as radical: an incipient paradigm shift involving radical repudiation of a part of our scientific tradition, a tradition that is defended by classical logicians. But it is also viewed as reactionary: as being analogous to the reactionary repudiation of symbolic logic by supporters of “Aristotelian” traditional logic. But even if “genuine” logic comes to be regarded as excluding second-order reasoning, which seems less likely today than fifty years ago, its effectiveness as a heuristic instrument will remain and its importance for understanding the history of logic and mathematics will not be diminished. Second-order logic may someday be gone, but it will never be forgotten.
Technical formalisms have been avoided entirely in an effort to reach a wide audience, but every effort has been made to limit the inevitable sacrifice of rigor. People who do not know second-order logic cannot understand the modern debate over its legitimacy and are cut off from the heuristic advantages of second-order logic. And, what may be worse, they are cut off from an understanding of the history of logic and are thus constrained to have distorted views of the nature of the subject. As Aristotle first said, we do not understand a discipline until we have seen its development. It is a truism that a person's conceptions of what a discipline is and of what it can become are predicated on their conception of what it has been.
In the present paper we propose a system of propositional logic for reasoning about justification, truthmaking, and the connection between justifiers and truthmakers. The logic of justification and truthmaking is developed according to the fundamental ideas introduced by Artemov. Justifiers and truthmakers are treated in a similar way, exploiting the intuition that justifiers provide epistemic grounds for propositions to be considered true, while truthmakers provide ontological grounds for propositions to be true. This system of logic is then applied both for interpreting the notorious definition of knowledge as justified true belief and for advancing a new solution to Gettier counterexamples to this standard definition.
We show how removing faith-based beliefs in current philosophies of classical and constructive mathematics admits formal, evidence-based, definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic'; whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'. We then adopt what may be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. We then consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based, definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways. We show that the two definitions correspond to two distinctly different---not necessarily evidence-based but complementary---assignments of satisfaction and truth to the compound formulas of PA over N. We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both of the complementary interpretations; and conclude some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.
Gila Sher approaches knowledge from the perspective of the basic human epistemic situation—the situation of limited yet resourceful beings, living in a complex world and aspiring to know it in its full complexity. What principles should guide them? Two fundamental principles of knowledge are epistemic friction and freedom. Knowledge must be substantially constrained by the world (friction), but without active participation of the knower in accessing the world (freedom) theoretical knowledge is impossible. This requires a grounding of all knowledge, empirical and abstract, in both mind and world, but the fall of traditional foundationalism has led many to doubt the viability of this ‘classical’ project. Sher challenges this skepticism, charting a new foundational methodology, foundational holism, that differs from others in being holistic, world-oriented, and universal (i.e., applicable to all fields of knowledge). Using this methodology, Epistemic Friction develops an integrated theory of knowledge, truth, and logic. This includes (i) a dynamic model of knowledge, incorporating some of Quine’s revolutionary ideas while rejecting his narrow empiricism, (ii) a substantivist, non-traditional correspondence theory of truth, and (iii) an outline of a joint grounding of logic in mind and world. The model of knowledge subjects all disciplines to demanding norms of both veridicality and conceptualization. The correspondence theory is robust and universal yet not simplistic or naive, admitting diverse forms of correspondence. Logic’s grounding in the world brings it in line with other disciplines while preserving, and explaining, its strong formality, necessity, generality, and normativity.
Our concept of the universe and the material world is foundational for our thinking and our moral lives. In an earlier contribution to the URAM project I presented what I called 'the ultimate organizational principle' of the universe. In that article (Grandpierre 2000, pp. 12-35) I took as an adversary the widespread system of thinking which I called 'materialism'. According to those who espouse this way of thinking, the universe consists of inanimate units or sets of material such as atoms or elementary particles. Against this point of view on reality, I argued that it is 'logic', which exists in our inner world as a function of our mind, that is the universal organizing power of the universe. The present contribution builds upon this insight. Then I focussed on rationality; now I am interested in the responsibility that is the driving force behind our effort to find coherence and ultimate perspectives in our cosmos. It is shown that biology fundamentally differs from physics. Biology has its own fundamental principle, which was formulated scientifically for the first time by Ervin Bauer. This fundamental principle is the cosmic life principle. I show that if one regards the physical laws as corresponding to reality, as scientific realism does, then physicalism becomes fundamentally spiritual, because the physical laws are not material. I point out that the physical laws originate from the fundamental principle of physics, the least action principle. I show that the fundamental principle of physics can be considered the "instinct of atoms". Our research has found deep and meaningful connections between the basic principle of physics and the ultimate principles of the universe: matter, life and reason. Therefore, the principle of least action is not necessarily an expression of sterile inanimateness.
On the contrary, the principle of physics is related to the life principle of the universe, to the world of instincts behind the atomic world, in which the principles of physics, biology, and psychology arise from the same ultimate principle. Our research sheds new light on the sciences of physics, biology, and psychology in close relation to these basic principles. These ultimate principles are of primary importance for our understanding of the nature of Man and the Universe, together with the relations between Man and Nature, Man and the Universe. The results offer new foundations for understanding our own role on Earth, in Nature and in the Universe. Even the apparently inanimate world of physics shows itself to be animate on long timescales and to have a kind of pre-human consciousness in its basic organisation. This hypothesis offers a way to understand when and how biological laws may direct physical laws and, moreover, offers a new perspective for studying and understanding under which conditions self-consciousness can govern the laws of biology and physics. This point of view offers living beings and humans the possibility of strengthening our natural identity, and of recognising the wide perspective that arises from having access to the deepest ranges of our own human resources and realising the task for which human and individual life has been created.
This is the first of a two-volume work combining two fundamental components of contemporary computing into classical deductive computing, a powerful form of computation, highly adequate for programming and automated theorem proving, which, in turn, have fundamental applications in areas of high complexity and/or high security such as mathematical proof, software specification and verification, and expert systems. Deductive computation is concerned with truth-preservation: this is the essence of the satisfiability problem, or SAT, the central computational problem in computability and complexity theory. The Turing machine provides the classical version of this theory—classical computing—with its standard model, which is physically concretized—and thus spatio-temporally limited and restricted—in the von Neumann, or digital, computer. Although a number of new technological applications require classical deductive computation with non-classical logics, many key technologies still do well—or exclusively, for that matter—with classical logic. In this first volume, we elaborate on classical deductive computing with classical logic. The objective of the main text is to provide the reader with a thorough elaboration on both classical computing and classical deduction with the classical first-order predicate calculus with a view to computational implementations. As a complement to the mathematically based exposition of the topics we offer the reader a very large selection of exercises. This selection aims not only at practice of the discussed material, but also at creative approaches to problems, for both discussed and novel contents, as well as at research into further relevant topics.
We propose a solution to the problem of logical omniscience in what we take to be its fundamental version: as concerning arbitrary agents and the knowledge attitude per se. Our logic of knowledge is a spin-off from a general theory of thick content, whereby the content of a sentence has two components: an intension, taking care of truth conditions; and a topic, taking care of subject matter. We present a list of plausible logical validities and invalidities for the logic of knowledge per se for arbitrary agents, and isolate three explanatory factors for them: the topic-sensitivity of content; the fragmentation of knowledge states; the defeasibility of knowledge acquisition. We then present a novel dynamic epistemic logic that yields precisely the desired validities and invalidities, for which we provide expressivity and completeness results. We contrast this with related systems and address possible objections.
The present essay deals with certain questions in the field of humanistic philosophy, ethics and axiology, discussed in the light of the ever newer challenges of our changing times. It highlights the significant role of Professor Andrzej Grzegorczyk in solving and overcoming problems encountered in the life of man, which is based on his natural logic and incessant efforts aimed at the preservation of fundamental moral values, as well as at shaping the principles of individual and social life. The views held by Andrzej Grzegorczyk, which are outlined in the work, form a certain rationalistic vision of the world and mankind.
Abu Nasr Muhammad Al-Farabi (870–950 AD), the second outstanding representative of the Muslim peripatetics after al-Kindi (801–873 AD), was born in Turkestan about 870 AD. Al-Farabi’s studies commenced in Farab; then he travelled to Baghdad, where he studied logic with a Christian scholar named Yuhanna b. Hailan. Al-Farabi wrote numerous works dealing with almost every branch of science in the medieval world. Author of a large number of books on logic and other sciences, he came to be known as the “Second Teacher” (al-Mou’allim al-Thani), Aristotle being the first. One of Al-Farabi’s most important contributions was clarifying the functions of logic as follows: 1. He defined logic and compared it with grammar, and discussed the classification and fundamental principles of science in a unique and useful manner. 2. He made the study of logic easier by dividing it into two categories: Takhayyul (idea) and Thubut (proof). 3. He believed that the objective of logic is to correct faults we may find in ourselves and in others, and faults that others find in us. 4. He said that if we do not comprehend logic, we must either have faith in all people, or mistrust all people, or differentiate between them; such actions would be undertaken without a basis of evidence or experimentation. In this paper, I will analyse the functions of logic in Al-Farabi’s works, Enumeration of the Sciences, Book on the Syllogism, Book on Dialectic, Book on Demonstration and Ring Stones of Wisdom, in order to present his contributions in the field of logic.
One can construct a mapping between Hilbert space and the class of all logic if the latter is defined as the set of all well-orderings of some relevant set (or class). That mapping can be further interpreted as a mapping between all states of all quantum systems, on the one hand, and all logic, on the other hand. The collection of all states of all quantum systems is equivalent to the world (the universe) as a whole. Thus that mapping establishes a fundamentally philosophical correspondence between the physical world and universal logic by the mediation of a special and fundamental structure, that of Hilbert space, and therefore, between quantum mechanics and logic by mathematics. Furthermore, Hilbert space can be interpreted as the free variable of "quantum information", and any point in it as a value of that variable, already "bound" by the axiom of choice.
Engaging with Kant’s transcendental logic may seem a matter of merely scholarly, historical interest today. It is most commonly regarded as a mixture of logic and psychology or epistemology, and hence not a serious form of logic; transcendental logic seems to have no systematic impact on the concept of logic. My paper aims to disclose a different account of the endeavour of Kant’s transcendental logic in particular and of the “Critique of Pure Reason” (CPR) in general. Kant’s fundamental question aims, in a revolutionary way, to ground the necessity of knowledge, that is, to justify the claim that thinking in accordance with the forms and principles of formal logic does not lead to sheer tautologies or an unsolved contradiction, but to knowledge that is objectively valid. In a first part, I shall demonstrate the necessity and the significance of this new fundamental question of the CPR with respect to its genesis out of pre-Kantian metaphysics. A brief outline of Kant’s answer to this question, with special emphasis on his revolutionary new comprehension of logical form, will be given as well. A second part shall open up a perspective that lies beyond Kant’s standpoint, with reference to Nietzsche and eventually to Hegel. I will answer the question: what knowledge do we achieve about being or actuality by means of formal logic? I will argue that Kant shows that formal logic is the logic of all technical-practical conduct but also, at least indirectly, the limitation of technical-practical knowledge and its legitimate sphere of application.
The title of the present paper might arouse some curiosity in the minds of its readers. The very first question that arises in this respect is whether India produced any logic in the real sense of the term as it has been used in the West. This paper is centered on three systems of Indian philosophy: Nyāya, Buddhism and Jainism. We have long spoken of Indian philosophy, Indian religion, Indian culture and Indian spirituality, but not of that which is more fundamental for any branch of knowledge, whether in the social sciences or the humanities: Indian logic. No aspect of human life and the universe has been left unexamined by Indian philosophers, and this leads to a totality of vision in both the philosophical and psychological fields. In this paper we will discuss the main thinkers, sources and main concepts of Indian logic.
I proffer a success argument for classical logical consequence. I articulate in what sense that notion of consequence should be regarded as the privileged notion for metaphysical inquiry aimed at uncovering the fundamental nature of the world. Classical logic breeds necessitism. I use necessitism to produce problems for both ontological naturalism and atheism.
The present paper thus deals with some fundamental agreements and disagreements between Peirce and James on crucial issues such as perception and consciousness. When Peirce first read the Principles, he was sketching his theory of the categories and testing its applications in many fields of knowledge, and many investigations were launched, concerning indexicals, diagrams, growth and development. James's utterances led Peirce to make his own views clearer on a wide range of topics that go to the heart of the foundations of psychology and that involve the relationship between perception and logic, between consciousness and the categories, between abstraction and the 'stream of thought'. The idea is to show that Peirce detected important discoveries and insights in the Principles, but felt that James could not make proper use of them because of logical confusions, and also because of his "clandestine" metaphysics. The point in this essay is thus not to look for remains of psychologism in Peirce's writings, but to look at Peirce's comments about James's psychology in an attempt to identify where and why Peirce amended James's views. Since the project of providing some insight into Peirce's extensive reading of James's Principles of Psychology would deserve a full volume, I shall focus here on three occasions on which Peirce explicitly commented on James's Principles. In the first section, I shall consider his assessment of James's chapter on space, which was published as a series of articles in Mind in 1887. I shall then turn to the 1891 review of the Principles in The Nation for important complements on perception as inference. In the third section, I shall deal with Peirce's manuscript "Questions on James's Principles" (R1099).
These "Questions" reveal a deep interest in psychological problems and suggest different ways along which Peirce's new advances in the field of the categories, of continuity, and abstraction could provide a proper basis for the philosophy of mind. (shrink)
This paper formalizes part of the cognitive architecture that Kant develops in the Critique of Pure Reason. The central Kantian notion that we formalize is the rule. As we interpret Kant, a rule is not a declarative conditional stating what would be true if such and such conditions hold. Rather, a Kantian rule is a general procedure, represented by a conditional imperative or permissive, indicating which acts must or may be performed, given certain acts that are already being performed. These acts are not propositions; they do not have truth-values. Our formalization is related to the input/output logics, a family of logics designed to capture relations between elements that need not have truth-values. In this paper, we introduce KL3 as a formalization of Kant’s conception of rules as conditional imperatives and permissives. We explain how it differs from standard input/output logics, geometric logic, and first-order logic, as well as how it translates natural language sentences not well captured by first-order logic. Finally, we show how the various distinctions in Kant’s much-maligned Table of Judgements emerge as the most natural way of dividing up the various types and sub-types of rule in KL3. Our analysis sheds new light on the way in which normative notions play a fundamental role in the conception of logic at the heart of Kant’s theoretical philosophy.
Identity is traditionally taken to be a fundamental notion of our conceptual framework as well as a fundamental metaphysical component of entities. But as soon as we make this claim we find ourselves faced with two problems: what is identity? And why would it be fundamental? These questions will guide us towards a discussion put forward by Bueno (2014) and Krause and Arenhart (2015). Bueno holds that four aspects make identity fundamental: (1) identity is assumed in every conceptual system; (2) it is required for a minimal characterisation of being an individual; (3) it cannot be defined; and (4) identity is required for quantification. On the other hand, Krause and Arenhart reject the thesis that identity is fundamental, replying to Bueno's arguments. In this dissertation we will deal with this debate. In the introduction we will deal with the first problem – what is identity? – showing how this concept is traditionally understood, both in its metaphysical characteristics and in its formal treatment. After that we will deal with each of the four aspects defended by Bueno and challenged by Krause and Arenhart. After a critical presentation of each position we will also provide further arguments bearing on the current debate. Finally we will outline an alternative view to those defended throughout this work.
This paper examines the role of 'situations' in John Dewey's philosophy of logic. To do this properly it is necessary to contrast Dewey's conception of experience and mentality with views characteristic of modern epistemology. The primary difference is that, rather than treat experience as peripheral and/or external to mental functions (reason, etc.), we should treat experience as a field in, and as a part of, which thinking takes place. Experience in this broad sense subsumes theory and fact, hypothesis and evidence, reason and observation, thought and perception. Logic in this view is a formal study of the generic features of all possible kinds of experience in this broad (thick, deep, wide, multifaceted) sense. The goal of this paper is to explain what Dewey thinks a situation is in the context of this view of experience, and to argue for the fundamental importance of that idea for logic and philosophy in general.
Continuing prior work by the author, a simple classical system for personal obligation is integrated with a fairly rich system for aretaic (agent-evaluative) appraisal. I then explore various relationships between definable aretaic statuses such as praiseworthiness and blameworthiness and deontic statuses such as obligatoriness and impermissibility. I focus on partitions of the normative statuses generated ("normative positions", but without explicit representation of agency). In addition to being able to model and explore fundamental questions in ethical theory about the connection between blame, praise, permissibility and obligation, this allows me to carefully represent schemes for supererogation and kin. These controversial concepts have provided challenges to both ethical theory and deontic logic, and are among deontic logic's test cases.
The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
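The interaction this abstract mentions, discrete computer models of continuous systems, can be exhibited in miniature. The sketch below (our illustration, not from the article) discretizes the continuous decay equation y' = -y with forward-Euler steps and shows that refining the discrete grid recovers the continuous answer e^{-t}:

```python
import math

def euler(f, y0, t_end, n):
    """Approximate y(t_end) for y' = f(t, y), y(0) = y0,
    using n forward-Euler steps: a discrete model of a
    continuous system."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# y' = -y with y(0) = 1 has the exact continuous solution e^{-t}.
exact = math.exp(-1.0)
coarse = euler(lambda t, y: -y, 1.0, 1.0, 10)      # 10 steps
fine = euler(lambda t, y: -y, 1.0, 1.0, 10000)     # 10000 steps
# The finer discrete grid lies much closer to the continuous value.
print(abs(coarse - exact) > abs(fine - exact))  # True
```

The same pattern, with far more sophisticated discretizations, underlies the fluid-flow models the article cites as the meeting point of the two traditions.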
The most widespread models of rational reasoners (the model based on modal epistemic logic and the model based on probability theory) exhibit the problem of logical omniscience. The most common strategy for avoiding this problem is to interpret the models as describing the explicit beliefs of an ideal reasoner, but only the implicit beliefs of a real reasoner. I argue that this strategy faces serious normative issues. In this paper, I present the more fundamental problem of logical omnipotence, which highlights the normative content of the problem of logical omniscience. I introduce two developments of the notion of implicit belief (accessible and stable belief) and use them in two versions of the most common strategy applied to the problem of logical omnipotence.
Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator εx. Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions of existential theorems obtained by this elimination procedure.
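For orientation, the ε-operator's characteristic axiom and the resulting definability of the quantifiers can be sketched as follows (standard textbook material, added here for readers unfamiliar with the calculus, not specific to this paper):

```latex
% Critical axiom schema: if any term t satisfies A, so does the epsilon term
A(t) \rightarrow A(\varepsilon x\, A(x))
% The quantifiers then become definable via epsilon terms:
\exists x\, A(x) \;\equiv\; A(\varepsilon x\, A(x)) \qquad
\forall x\, A(x) \;\equiv\; A(\varepsilon x\, \neg A(x))
```

Intuitively, εx A(x) denotes some witness of A if one exists; the epsilon theorems then show that detours through such terms can be eliminated from proofs of epsilon-free theorems.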
Recent developments in pure mathematics and in mathematical logic have uncovered a fundamental duality between "existence" and "information." In logic, the duality is between the Boolean logic of subsets and the logic of quotient sets, equivalence relations, or partitions. The analogue to an element of a subset is the notion of a distinction of a partition, and that leads to a whole stream of dualities or analogies--including the development of new logical foundations for information theory parallel to Boole's development of logical finite probability theory. After outlining these dual concepts in mathematical terms, we turn to a more metaphysical speculation about two dual notions of reality, a fully definite notion using Boolean logic and appropriate for classical physics, and the other objectively indefinite notion using partition logic which turns out to be appropriate for quantum mechanics. The existence-information duality is used to intuitively illustrate these two dual notions of reality. The elucidation of the objectively indefinite notion of reality leads to the "killer application" of the existence-information duality, namely the interpretation of quantum mechanics.
This book concerns the foundations of epistemic modality. I examine the nature of epistemic modality, when the modal operator is interpreted as concerning both apriority and conceivability, as well as states of knowledge and belief. The book demonstrates how epistemic modality relates to the computational theory of mind; metaphysical modality; deontic modality; the types of mathematical modality; to the epistemic status of undecidable propositions and abstraction principles in the philosophy of mathematics; to the a priori/a posteriori distinction; to the modal profile of rational propositional intuition; and to the types of intention, when the latter is interpreted as a modal mental state. Examining the nature of epistemic logic itself, I develop a novel approach to conditions of self-knowledge in the setting of the modal μ-calculus, as well as novel epistemicist solutions to Curry's and the liar paradoxes. Solutions to the Julius Caesar Problem, and to previously intransigent issues concerning the first-person concept, the distinction between fundamental and derivative truths, and the unity of intention and its role in decision theory, are developed along the way.
The complexity, subtlety, interlinking, and scale of many problems faced individually and collectively in today's rapidly changing world requires an epistemology--a way of thinking about our knowing--capable of facilitating new kinds of responses that avoid recapitulation of old ways of thinking and living. Epistemology, which implicitly provides the basis for engagement with the world via the fundamental act of distinction, must therefore be included as a central facet of any practical attempts at self/world transformation. We need to change how we think, not just what we think. The new epistemology needs to be of a higher order than the source of the problems we face. This theoretical, transdisciplinary dissertation argues that such a new epistemology needs to be recursive and process-oriented. This means that the thoughts about thinking that it produces must explicitly follow the patterns of thinking by which those thoughts are generated. The new epistemology is therefore also phenomenological, requiring the development of a reflexivity in thinking that recursively links across two levels of order--between content and process. The result is an epistemology that is of (and for) the whole human being. It is an enacted (will-imbued) and aesthetic (feeling-permeated) epistemology (thinking-penetrated) that is sensitive to and integrative of material, soul, and spiritual aspects of ourselves and our world. I call this kind of epistemology aesthetic, because its primary characteristic is found in the phenomenological, mutually fructifying and transformative marriage between the capacity for thinking and the capacity for feeling. Its foundations are brought forward through the confluence of multiple domains: cybernetic epistemology, the esoteric epistemology of anthroposophy (the spiritual science of Rudolf Steiner), and the philosophy of the implicit as developed by Eugene Gendlin. The practice of aesthetic epistemology opens new phenomenal domains of experience, shedding light on relations between ontology and epistemology, mind and body, logic and thinking, as well as on the formation (and transformation) of identity, the immanence of thinking in world-processes, the existence of different types of logic, and the nature of beings, of objects, and most importantly of thinking itself and its relationship to spirit.
Why do I have to be ethical? That is the essential question of a logical foundation of ethics in the phenomenology of Edmund Husserl. This article proposes to see the basic motivation of an ethical reason in the relationship between the two fundamental poles, that is the «Lifeworld» («Lebenswelt») and the «I-subject» («Ich-Subjekt»). This connection will be considered to constitute ethics in this article. This kind of ethics as a «condition of possibility» is then an a-priori ontological necessity. The article will demonstrate how the composition of Husserl’s Prolegomena and his argumentation are an example of a foundation for phenomenological ethics: in this book Husserl derived logic as the first «condition of possibility». With logic’s three main characteristics — theory, normativity and praxis — it is the theoretical basis of a phenomenological ethics.
The definitions of ‘deduction’ found in virtually every introductory logic textbook would encourage us to believe that the inductive/deductive distinction is a distinction among kinds of arguments and that the extension of ‘deduction’ is a determinate class of arguments. In this paper, we argue that this approach is mistaken. Specifically, we defend the claim that typical definitions of ‘deduction’ operative in attempts to get at the induction/deduction distinction are either too narrow or insufficiently precise. We conclude by presenting a deflationary understanding of the inductive/deductive distinction; in our view, its content is nothing over and above the answers to two fundamental sorts of questions central to critical thinking.
Climate change assessments rely upon scenarios of socioeconomic developments to conceptualize alternative outcomes for global greenhouse gas emissions. These are used in conjunction with climate models to make projections of future climate. Specifically, the estimations of greenhouse gas emissions based on socioeconomic scenarios constrain climate models in their outcomes of temperatures, precipitation, etc. Traditionally, the fundamental logic of the socioeconomic scenarios—that is, the logic that makes them plausible—is developed and prioritized using methods that are very subjective. This introduces a fundamental challenge for climate change assessment: The veracity of projections of future climate currently rests on subjective ground. We elaborate on these subjective aspects of scenarios in climate change research. We then consider an alternative method for developing scenarios, a systems dynamics approach called ‘Cross-Impact Balance’ (CIB) analysis. We discuss notions of ‘objective’ and ‘objectivity’ as criteria for distinguishing appropriate scenario methods for climate change research. We distinguish seven distinct meanings of ‘objective,’ and demonstrate that CIB analysis is more objective than traditional subjective approaches. However, we also consider criticisms concerning which of the seven meanings of ‘objective’ are appropriate for scenario work. Finally, we arrive at conclusions regarding which meanings of ‘objective’ and ‘objectivity’ are relevant for climate change research. Because scientific assessments uncover knowledge relevant to the responses of a real, independently existing climate system, this requires scenario methodologies employed in such studies to also uphold the seven meanings of ‘objective’ and ‘objectivity’.
In this paper, we axiomatize the deontic logic in Fusco 2015, which uses a Stalnaker-inspired account of diagonal acceptance and a two-dimensional account of disjunction to treat Ross’s Paradox and the Puzzle of Free Choice Permission. On this account, disjunction-involving validities are a priori rather than necessary. We show how to axiomatize two-dimensional disjunction so that the introduction/elimination rules for boolean disjunction can be viewed as one-dimensional projections of more general two-dimensional rules. These completeness results help make explicit the restrictions Fusco’s account must place on free-choice inferences. They are also of independent interest, as they raise difficult questions about how to ‘lift’ a Kripke frame for a one-dimensional modal logic into two dimensions.
Minimal Type Theory (MTT) is based on type theory in that it is agnostic about Predicate Logic level and expressly disallows the evaluation of incompatible types. It is called Minimal because it has the fewest possible number of fundamental types, and has all of its syntax expressed entirely as the connections in a directed acyclic graph.
There has been a recent surge of work on deontic modality within philosophy of language. This work has put the deontic logic tradition in contact with natural language semantics, resulting in a significant increase in sophistication on both ends. This chapter surveys the main motivations, achievements, and prospects of this work.
Derrida’s thought on “trace,” “différance,” “writing,” and “supplement” is always thought of as breaking with logocentrism, the essence, the positive meaning, and the closure of the metaphysics of presence; this thinking is accordingly regarded as thinking with the fundamental structure of difference and openness. By tracking back to Saussure, Husserl and Levinas, this fundamental difference breaks the myth of ideal meaning as well as the illusion of the absolute open; its lack of ideality and absoluteness contains the fundamental difference within itself and thus has the structure of the open. However, from a broader perspective, I will re-ask the question of whether or not Derrida’s “trace,” “différance,” “writing,” and “supplement” have a structure of the open. When the web of differences encompasses everything, even its own “exit,” this “open” thus conceals and denies other modes of thinking. With the impossibility of going outside this mode of thinking, the structure of différance is closed.
This paper aims to provide a basic explanation of existence, fundamental aspects of reality, and consciousness. Existence in its most general sense is identified with the principle of logical consistency: to exist means to be logically consistent. The essence of the principle of logical consistency is that every thing is what it is and is not what it is not. From this principle follows the existence of intrinsic, indescribable identities of things and relations between them. There are three fundamental, logically necessary relations: similarity, composition and instantiation. Set theory, mathematics, logic and science are presented as relational descriptions of reality. Qualities of consciousness (qualia) are identified with intrinsic identities of things or at least a certain subset of them, especially in the context of a dynamic form of organized complexity.
A principle, according to which any scientific theory can be mathematized, is investigated. That theory is presupposed to be a consistent text, which can be exhaustively represented by a certain mathematical structure constructively. As thus used, the term “theory” includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should be accepted rather as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl’s phenomenology is what is used, and then the conception of “bracketing reality” is modelled to generalize Peano arithmetic in its relation to set theory in the foundation of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. A comparison to Mach’s doctrine is used to reveal the fundamental philosophical reductionism of Husserl’s phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics within Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved. It is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed.
An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
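As a gloss on the generalization mentioned in the abstract, the ordinary induction schema of Peano arithmetic and the transfinite induction schema that replaces it can be contrasted as follows (standard formulations, added here for orientation only):

```latex
% Ordinary induction over the natural numbers:
\bigl[\varphi(0) \land \forall n\,\bigl(\varphi(n) \rightarrow \varphi(n+1)\bigr)\bigr]
  \rightarrow \forall n\,\varphi(n)
% Transfinite induction over the ordinals:
\forall \alpha\,\bigl[\forall \beta\,\bigl(\beta < \alpha \rightarrow \varphi(\beta)\bigr)
  \rightarrow \varphi(\alpha)\bigr] \rightarrow \forall \alpha\,\varphi(\alpha)
```

The second schema subsumes the first when the ordinals are restricted to the natural numbers, which is why it counts as a generalization of Peano arithmetic rather than a rival system.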
We present epistemic multilateral logic, a general logical framework for reasoning involving epistemic modality. Standard bilateral systems use propositional formulae marked with signs for assertion and rejection. Epistemic multilateral logic extends standard bilateral systems with a sign for the speech act of weak assertion (Incurvati and Schlöder 2019) and an operator for epistemic modality. We prove that epistemic multilateral logic is sound and complete with respect to the modal logic S5 modulo an appropriate translation. The logical framework developed provides the basis for a novel, proof-theoretic approach to the study of epistemic modality. To demonstrate the fruitfulness of the approach, we show how the framework allows us to reconcile classical logic with the contradictoriness of so-called Yalcin sentences and to distinguish between various inference patterns on the basis of the epistemic properties they preserve.
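The modal logic S5 invoked here (and in the dialogue mentioned in the headnote) is standardly axiomatized by adding the following schemas to classical propositional logic; they are listed for orientation only:

```latex
\text{K: } \Box(p \rightarrow q) \rightarrow (\Box p \rightarrow \Box q) \qquad
\text{T: } \Box p \rightarrow p \qquad
\text{5: } \Diamond p \rightarrow \Box \Diamond p
```

Semantically, S5 corresponds to Kripke frames whose accessibility relation is an equivalence relation, which is why it is a natural fit for epistemic readings of the box.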
In 1926, Ernst Mally, an Austrian logician, introduced a system of deontic logic in which he proposed three fundamental distinctions which proved to be important in the context of the further development of the logic of norms. It is argued that in his philosophical considerations Mally introduced a number of important distinctions concerning the very concept of norm, but, by confusing them when introducing the subsequent formalisms, he failed to preserve them formally. In some of his philosophically made distinctions Mally apparently foresaw contemporary trends in the logic of norms. To some extent this particular feature of Mally’s system opens wide opportunities to reconstruct – with the corresponding renovations – his ill-formed Deontik into many systems of the logic of norms known today, and thus provides a fertile ground for this kind of research.
It has been largely assumed from the start that truth, the first premise of the Tripartite theory of Knowledge, is necessary for a mental state of knowing. And this has intuitively made sense. Examples that demonstrate the logic of this premise are widespread and easily found. Yet, if one tries to establish the necessity of this condition for oneself, one may discover a logical flaw in this premise. In theory truth is necessary; however, in practice it is not truth that establishes knowledge. We obtain knowledge using our perception or five senses. We accept the subjectivity of perspective, even though truth may or may not be established. We reject knowledge only when evidence of falsehood is obtained, if ever. In practice, in the obtaining of knowledge of the conditional, those things we experience that could have been otherwise, truth is not established. The fundamental application of how we attain knowledge is in witnessing with our perception, our five senses, evidence for our conclusion. However, evidence provides support for our conclusion, by way of justification, but does not by itself provide truth. And therefore, it is not truth that is necessary for knowledge. It is the lack of evidence of falsehood in our conclusion that is the basis of our knowledge. And a lack of evidence of falsehood is not the equal of truth. The following thought experiment will demonstrate this flaw between theory and practice.
The article presents Martin Heidegger’s early conception of foundational questions of logic and science. It focuses on their treatment in the Introduction to Phenomenological Research. Presenting the conception of phenomenon, perception, language/speech, noun and verb, proposition and deceit, the article shows the fundamental idea of facticity of speaking as the ground of these questions. It uses ideas of existence, fact and time to achieve the result. The impact on the most famous early book by Martin Heidegger is also considered.
This dissertation has as its main objective an exposition and critical analysis of Charles Kielkopf's translation of Standard Deontic Logic (SDL) into a normal alethic logic, and the resulting construction of a system of deontic logic that captures fundamental Kantian concepts and principles such as causal necessity and the formulations of the Categorical Imperative concerning the Kingdom of Nature and the Kingdom of Ends. Since this process results in an interpretation of aspects of Kantian philosophy, the dissertation begins with a presentation in general lines of these conceptions and, considering the difficulties regarding the applicability of a translation process between deontic and ontic principles, also includes an exposition of the problem of inferential barriers, as well as of its most immediate consequence, namely Jörgensen's Dilemma. In a second moment, normal modal systems, both deontic and alethic, are characterized, as well as the notion of translation between logics and Dawson modelling. The final chapter consists of a critical examination of Kielkopf's proposal, which uses Dawson modelling to develop a deontic logic based on the alethic system K1. Dawson modelling allows the definition of deontic modalities in terms of iterated alethic modalities, so that this model constitutes a way of avoiding the problems regarding inferential barriers. The innovative aspect of the proposal, however, lies not in the development of an alternative for attributing logical status to deontic concepts, but in Kielkopf's use of his formal model as a tool for the investigation of philosophical conceptions, in this case the Kantian conceptions already mentioned.
I develop and defend a truthmaker semantics for the relevant logic R. The approach begins with a simple philosophical idea and develops it in various directions, so as to build a technically adequate relevant semantics. The central philosophical idea is that truths are true in virtue of specific states. Developing the idea formally results in a semantics on which truthmakers are relevant to what they make true. A very natural notion of conditionality is added, giving us relevant implication. I then investigate ways to add conjunction, disjunction, and negation; and I discuss how to justify contraposition and excluded middle within a truthmaker semantics.
Weakly Aggregative Modal Logic (WAML) is a collection of disguised polyadic modal logics with n-ary modalities whose arguments are all the same. WAML has some interesting applications in epistemic logic and the logic of games, so we study some basic model-theoretical aspects of WAML in this paper. Specifically, we give a van Benthem-Rosen characterization theorem of WAML based on an intuitive notion of bisimulation and show that each basic WAML system Kn lacks Craig Interpolation.
The aim of the paper is to argue that all—or almost all—logical rules have exceptions. In particular, it is argued that this is a moral that we should draw from the semantic paradoxes. The idea that we should respond to the paradoxes by revising logic in some way is familiar. But previous proposals advocate the replacement of classical logic with some alternative logic. That is, some alternative system of rules, where it is taken for granted that these hold without exception. The present proposal is quite different. According to this, there is no such alternative logic. Rather, classical logic retains the status of the ‘one true logic’, but this status must be reconceived so as to be compatible with (almost) all of its rules admitting of exceptions. This would seem to have significant repercussions for a range of widely held views about logic: e.g. that it is a priori, or that it is necessary. Indeed, if the arguments of the paper succeed, then such views must be given up.
The five English words—sentence, proposition, judgment, statement, and fact—are central to coherent discussion in logic. However, each is ambiguous in that logicians use each with multiple normal meanings. Several of their meanings are vague in the sense of admitting borderline cases. In the course of displaying and describing the phenomena discussed using these words, this paper juxtaposes, distinguishes, and analyzes several senses of these and related words, focusing on a constellation of recommended senses. One of the purposes of this paper is to demonstrate that ordinary English properly used has the resources for intricate and philosophically sound investigation of rather deep issues in logic and philosophy of language. No mathematical, logical, or linguistic symbols are used. Meanings need to be identified and clarified before being expressed in symbols. We hope to establish that clarity is served by deferring the extensive use of formalized or logically perfect languages until a solid “informal” foundation has been established. Questions of “ontological status”—e.g., whether propositions or sentences, or for that matter characters, numbers, truth-values, or instants, are “real entities”, are “idealizations”, or are “theoretical constructs”—play no role in this paper. As is suggested by the title, this paper is written to be read aloud. I hope that reading this aloud in groups will unite people in the enjoyment of the humanistic spirit of analytic philosophy.
Sentences containing definite descriptions, expressions of the form ‘The F’, can be formalised using a binary quantifier ι that forms a formula out of two predicates, where ιx[F, G] is read as ‘The F is G’. This is an innovation over the usual formalisation of definite descriptions with a term-forming operator. The present paper compares the two approaches. After a brief overview of the system INFι of intuitionist negative free logic extended by such a quantifier, which was presented in (Kürbis 2019), INFι is first compared to a system of Tennant’s and an axiomatic treatment of a term-forming ι operator within intuitionist negative free logic. Both systems are shown to be equivalent to the subsystem of INFι in which the G of ιx[F, G] is restricted to identity. INFι is then compared to an intuitionist version of a system of Lambert’s which in addition to the term-forming operator has an operator for predicate abstraction for indicating scope distinctions. The two systems will be shown to be equivalent through a translation between their respective languages. Advantages of the present approach over the alternatives are indicated in the discussion.
We formally introduce a novel, yet ubiquitous, category of norms: norms of instrumentality. Norms of this category describe which actions are obligatory, or prohibited, as instruments for certain purposes. We propose the Logic of Agency and Norms (LAN) that enables reasoning about actions, instrumentality, and normative principles in a multi-agent setting. Leveraging LAN, we formalize norms of instrumentality and compare them to two prevalent norm categories: norms to be and norms to do. Last, we pose principles relating the three categories and evaluate their validity vis-à-vis notions of deliberative acting. On a technical note, the logic will be shown decidable via the finite model property.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
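The abstract's core analogy — an element is to a subset what a distinction is to a partition — can be made concrete in a few lines. The sketch below is illustrative code, not from the paper; the function name `distinctions` is my own. It computes the distinctions of a partition, i.e. the unordered pairs of elements separated into different blocks:

```python
from itertools import combinations

def distinctions(partition):
    """Return the distinctions of a partition: unordered pairs of
    universe elements that lie in different blocks."""
    # Map each element to the index of its block.
    block_of = {x: i for i, block in enumerate(partition) for x in block}
    universe = sorted(block_of)
    return {frozenset((a, b)) for a, b in combinations(universe, 2)
            if block_of[a] != block_of[b]}

# On the universe {1,2,3,4}: the indiscrete partition makes no
# distinctions; the discrete partition makes all C(4,2) = 6 of them.
print(len(distinctions([{1, 2, 3, 4}])))      # 0
print(len(distinctions([{1}, {2}, {3}, {4}])))  # 6
```

Refining a partition can only add distinctions, mirroring how enlarging a subset can only add elements; normalizing the count of distinctions by the squared size of the universe gives the logical entropy the abstract mentions.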