First, I characterize Simple Partial Logic (SPL) as a generalization and extension of a certain two-valued logic. Based on this characterization, I present two definitions of validity in SPL. Finally, I show that, given my characterization, these two definitions are more appropriate than the definitions that have been prevalent, since both have desirable semantic properties that the others lack.
Simple partial logic (SPL) is, broadly speaking, an extensional logic which allows for truth-value gaps. First I give a system of propositional SPL by partializing classical logic, as well as by extending it with several non-classical truth-functional operators. Second I show a way, based on SPL, to construct a system of tensed ontology, by representing tensed statements as two kinds of necessary statements in a linear model that consists of the present and future worlds. Finally I compare that way with two other ways based on Łukasiewicz’s three-valued logic and branching temporal logic.
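Since the abstract turns on truth-functional operators over a gap, a tiny sketch may help fix ideas. The abstract does not specify SPL's tables, so the strong Kleene tables below are an assumption made for illustration, with `N` marking a gap:

```python
# A minimal sketch of truth-value gaps, assuming strong Kleene tables.
# SPL's actual operator set is not specified in the abstract.
T, F, N = 'T', 'F', 'N'

def neg(a):
    return {T: F, F: T, N: N}[a]

def conj(a, b):
    if a == F or b == F:
        return F              # one false conjunct settles the conjunction
    if a == T and b == T:
        return T
    return N                  # otherwise the gap propagates

def disj(a, b):
    return neg(conj(neg(a), neg(b)))   # De Morgan dual of conjunction

print(conj(F, N))   # -> 'F': a gappy conjunct need not infect the whole
print(disj(T, N))   # -> 'T'
```

On such tables a false conjunct settles a conjunction even when the other conjunct is gappy, which is the characteristic way a partialized classical logic retains some classical verdicts.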
In this paper we consider the theory of predicate logics in which the principle of Bivalence or the principle of Non-Contradiction or both fail. Such logics are partial or paraconsistent or both. We consider sequent calculi for these logics and prove Model Existence. For L4, the most general logic under consideration, we also prove a version of the Craig-Lyndon Interpolation Theorem. The paper shows that many techniques used for classical predicate logic generalise to partial and paraconsistent logics once the right set-up is chosen. Our logic L4 has a semantics that also underlies Belnap’s [4] and is related to the logic of bilattices. L4 is in focus most of the time, but it is also shown how results obtained for L4 can be transferred to several variants.
One response to the problem of logical omniscience in standard possible worlds models of belief is to extend the space of worlds so as to include impossible worlds. It is natural to think that essentially the same strategy can be applied to probabilistic models of partial belief, for which parallel problems also arise. In this paper, I note a difficulty with the inclusion of impossible worlds into probabilistic models. Under weak assumptions about the space of worlds, most of the propositions which can be constructed from possible and impossible worlds are in an important sense inexpressible, leaving the probabilistic model committed to saying that agents in general have at least as many attitudes towards inexpressible propositions as they do towards expressible propositions. If it is reasonable to think that our attitudes are generally expressible, then a model with such commitments looks problematic.
This is part one of a two-part paper, in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. This allows us to connect theories of partial ground with axiomatic theories of truth. In this part of the paper, we develop an axiomatization of the relation of partial ground over the truths of arithmetic and show that the theory is a proof-theoretically conservative extension of the theory PT of positive truth. We construct models for the theory and draw some conclusions for the semantics of conceptualist ground.
A general framework for translating various logical systems is presented, including a set of partial unary operators of affirmation and negation. Despite its usual reading, affirmation is not redundant in any domain of values in which it does not behave like a full mapping. After depicting the process of forming partial functions, a number of logics are translated through a variety of affirmations and a unique pair of negations. This relies upon two preconditions: a deconstruction of truth-values as ordered and structured objects, unlike their mainstream presentation as simple objects; and a redefinition of the Principle of Bivalence as a set of four independent properties, such that its definition does not equate with normality.
A Fortiori Logic: Innovations, History and Assessments is a wide-ranging and in-depth study of a fortiori reasoning, comprising a great many new theoretical insights into such argument, a history of its use and discussion from antiquity to the present day, and critical analyses of the main attempts at its elucidation. Its purpose is nothing less than to lay the foundations for a new branch of logic and greatly develop it; and thus to once and for all dispel the many fallacious ideas circulating regarding the nature of a fortiori reasoning.

The work is divided into three parts. The first part, Formalities, presents the author’s largely original theory of a fortiori argument, in all its forms and varieties. Its four (or eight) principal moods are analyzed in great detail and formally validated, and secondary moods are derived from them. A crescendo argument is distinguished from purely a fortiori argument, and similarly analyzed and validated. These argument forms are clearly distinguished from the pro rata and analogical forms of argument. Moreover, we examine the wide range of a fortiori argument; the possibilities of quantifying it; the formal interrelationships of its various moods; and their relationships to syllogistic and analogical reasoning. Although a fortiori argument is shown to be deductive, inductive forms of it are acknowledged and explained. Although a fortiori argument is essentially ontical in character, more specifically logical-epistemic and ethical-legal variants of it are acknowledged.

The second part of the work, Ancient and Medieval History, looks into the use and discussion of a fortiori argument in Greece and Rome, in the Talmud, among post-Talmudic rabbis, and in Christian, Moslem, Chinese and Indian sources. Aristotle’s approach to a fortiori argument is described and evaluated. There is a thorough analysis of the Mishnaic qal vachomer argument, and a reassessment of the dayo principle relating to it, as well as of the Gemara’s later take on these topics. The valuable contribution, much later, by Moshe Chaim Luzzatto is duly acknowledged. Lists are drawn up of the use of a fortiori argument in the Jewish Bible, the Mishna, the works of Plato and Aristotle, the Christian Bible and the Koran; and the specific moods used are identified. Moreover, there is a pilot study of the use of a fortiori argument in the Gemara, with reference to Rodkinson’s partial edition of the Babylonian Talmud, setting detailed methodological guidelines for a fuller study. There is also a novel, detailed study of logic in general in the Torah.

The third part of the present work, Modern and Contemporary Authors, describes and evaluates the work of numerous (some thirty) recent contributors to a fortiori logic, as well as the articles on the subject in certain lexicons. Here, we discover that whereas a few authors in the last century or so made some significant contributions to the field, most of them shot woefully off-target in various ways. The work of each author, whether famous or unknown, is examined in detail in a dedicated chapter, or at least in a section; and his ideas on the subject are carefully weighed. The variety of theories that have been proposed is impressive, and stands witness to the complexity and elusiveness of the subject, and to the crying need for the present critical and integrative study. But whatever the intrinsic value of each work, it must be realized that even errors and lacunae are interesting because they teach us how not to proceed.
This book also contains, in a final appendix, some valuable contributions to general logic, including new analyses of symbolization and axiomatization, existential import, the tetralemma, the Liar paradox and the Russell paradox.
This is part two of a two-part paper in which we develop an axiomatic theory of the relation of partial ground. The main novelty of the paper is the use of a binary ground predicate rather than an operator to formalize ground. In this part of the paper, we extend the base theory of the first part of the paper with hierarchically typed truth-predicates and principles about the interaction of partial ground and truth. We show that our theory is a proof-theoretically conservative extension of the ramified theory of positive truth up to ε0.
In the paper, an original formal-logical conception of the syntactic and semantic (intensional and extensional) senses of expressions of any language L is outlined. Syntax and the bi-level, intensional and extensional, semantics of language L are characterized categorially: in the spirit of some of Husserl’s ideas of pure grammar, the Leśniewski-Ajdukiewicz theory of syntactic/semantic categories and, in accordance with Frege’s ontological canons, Bocheński’s famous motto that syntax mirrors ontology, and some ideas of Suszko: language should be a linguistic scheme of ontological reality and simultaneously a tool of its cognition. In the logical conception of language L, its expressions should satisfy some general conditions of language adequacy. The adequacy ensures their unambiguous syntactic and semantic senses and their mutual syntactic and semantic compatibility (correspondence), guaranteed by the acceptance of a postulate of categorial compatibility of the syntactic and semantic categories of expressions of L. From this postulate, three principles of compositionality follow: one syntactic and two semantic ones, already known to Frege. They are treated as conditions of homomorphism of the partial algebra of L into algebraic models of L: syntactic, intensional, and extensional. In the paper, they are applied to some expressions with quantifiers. Language adequacy connected with the logical senses described in the logical conception of language L is, of course, an idealization, but only expressions with high degrees of precision of their senses, after due justification, may become theorems of science.
Even though it is not very often admitted, partial functions do play a significant role in many practical applications of deduction systems. Kleene already gave a semantic account of partial functions using a three-valued logic decades ago, but there has not been a satisfactory mechanization. Recent years have seen a thorough investigation of the framework of many-valued truth-functional logics. However, strong Kleene logic, where quantification is restricted and therefore not truth-functional, does not fit the framework directly. We solve this problem by applying recent methods from sorted logics. This paper presents a tableau calculus that combines the proper treatment of partial functions with the efficiency of sorted calculi.
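To see why an undefined term forces a third value, here is a toy sketch, not the paper's tableau calculus; the domain, the partial function `recip`, and the predicate are all invented for illustration, and the evaluation follows the strong Kleene treatment the abstract mentions:

```python
# A toy sketch, assuming strong Kleene handling of partial functions:
# an atomic formula whose term is undefined gets the third value 'N'.
T, F, N = 'T', 'F', 'N'
domain = [0, 1, 2, 3]
recip = {1: 1.0, 2: 0.5}          # a partial function: undefined at 0 and 3

def atom_positive_recip(x):
    if x not in recip:            # the term denotes nothing: truth-value gap
        return N
    return T if recip[x] > 0 else F

def exists(pred):
    vals = [pred(x) for x in domain]
    if T in vals:
        return T                  # one true instance suffices
    if all(v == F for v in vals):
        return F
    return N                      # no witness, and some instances undefined

print(exists(atom_positive_recip))   # -> 'T'
```

Note that if `recip` had no positive values at all, the existential claim would come out `N` rather than `F`, which is exactly the non-classical behaviour a mechanization has to handle.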
Recently, Bourne constructed a system of three-valued logic that he supposed to replace Łukasiewicz’s three-valued logic in view of the problems of future contingents. In this paper, I will show first that Bourne’s system makes no improvement on Łukasiewicz’s system. However, finding some good motivations and lessons in his attempt, I will then suggest a better way of achieving his original goal in some sense. The crucial part of my way lies in reconsidering the significance of the intermediate truth-value so as to reconstruct Łukasiewicz’s three-valued logic as a kind of extensional modal logic based on partial logic.
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous one to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts).

Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable. The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction.

Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items).
This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation.

Phase III: Software Assisted Analysis. The approach in the second phase was very ‘manual’ and time consuming; the third phase is intended to ‘mechanize’ much of the work involved by means of spreadsheets (to begin with). This increases the reliability of calculations (though no errors were found, in fact) – but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, now each modus can be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish! Its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, has truly been fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions ‘mechanically’ (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done ‘manually’, and therefore not exhaustively and with certainty. It took over 72,000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has definitely now been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
According to the preference-centric approach to understanding partial belief, the connection between partial beliefs and preferences is key to understanding what partial beliefs are and how they’re measured. As Ramsey put it, the ‘degree of a belief is a causal property of it, which we can express vaguely as the extent to which we are prepared to act on it’ (The Foundations of Mathematics and Other Logical Essays, Routledge, Oxon, pp. 156–198, 1931). But this idea is not as popular as it once was. Nowadays, the preference-centric approach is frequently dismissed out of hand as behaviouristic, unpalatably anti-realist, and/or prone to devastating counterexamples. Cases like Eriksson and Hájek’s (2007: 183–213) preferenceless Zen monk and Christensen’s (2001: 356–376) other roles argument have suggested to many that any account of partial belief that ties them too closely to preferences is irretrievably flawed. In this paper I provide a defence of preference-centric accounts of partial belief.
This paper uses a partially ordered set of syntactic categories to accommodate optionality and licensing in natural language syntax. A complex but well-studied data set pertaining to the syntax of quantifier scope and negative polarity licensing in Hungarian is used to illustrate the proposal. The presentation is geared towards both linguists and logicians. The paper highlights that the main ideas can be implemented in different grammar formalisms, and discusses in detail an implementation where the partial ordering on categories is given by the derivability relation of a calculus with residuated and Galois-connected unary operators.
An analysis of some concepts of logic is proposed, around the work of Edelcio de Souza. Two related issues in his work will be emphasized, namely opposition and quasi-truth. After a review of opposition between logical systems [2], its extension to many-valuedness is considered following a special semantics including partial operators [13]. Within this semantic framework, the concepts of antilogic and counterlogic are translated into opposition-forming operators [15] and specified as special cases of contradictoriness and contrariety. Then quasi-truth [5] is introduced and equally translated as a product of two partial operators. Finally, the reflections proposed around opposition and quasi-truth lead to a third new logical concept: quasi-opposition, borrowing the central feature of partiality and opening the way to a potential field of new investigations into philosophical logic.
This article derives from a project attempting to show that Western formal logic, from Aristotle onward, has both been partially constituted by, and partially constitutive of, what has become known as racism. In the present article, I will first discuss, in light of Frege’s honorary role as founder of the philosophy of mathematics, Reuben Hersh’s What is Mathematics, Really? Second, I will explore how the infamous section of Frege’s 1924 diary (specifically the entries from March 10 to April 9) supports Hersh's claim regarding the link between political conservatism and the (historically and currently) dominant school of the philosophy of mathematics (to which Frege undeniably belongs). Third, I will examine Frege’s attempt at a more reader-friendly introduction to his philosophy of mathematics, The Foundations of Arithmetic. And finally, I will briefly analyze Frege’s Begriffsschrift to see how questions of race arise even at the heights of his logical abstraction.
The famous Cartesian Nicolas Malebranche (1638-1715) espoused the occasionalist doctrine that ‘there is only one true cause because there is only one true God; that the nature or power of each thing is nothing but the will of God; that all natural causes are not true causes but only occasional causes’ (LO, 448, original italics). One of Malebranche’s well-known arguments for occasionalism, known as the ‘no necessary connection’ argument (NNC), stems from the principle that ‘a true cause… is one such that the mind perceives a necessary connection between it and its effect’ (LO, 450). The outline of this paper is as follows. I explicitly lay out NNC and articulate some of its prima facie strengths (§1). I then critically discuss what I take to be the two main arguments against NNC of the Lee-Pyle interpretation (§2). The main conclusion from (§2) is that Malebranche did not abandon NNC in his later works, given textual evidence from the Dialogues, contrary to the Lee-Pyle interpretation. In (§3) I discuss in what ways Suárez, Leibniz, Régis and Spinoza all accepted the main premise of NNC. Then, I rebut Steven Nadler’s influential and unchallenged criticism that Malebranche conflated causal and logical necessity, and provide a more accurate interpretation of Malebranche that only commits him to a partial reduction of causal to logical necessity (§4).
Accounts of Hobbes’s ‘system’ of sciences oscillate between two extremes. On one extreme, the system is portrayed as wholly axiomatic-deductive, with statecraft being deduced in an unbroken chain from the principles of logic and first philosophy. On the other, it is portrayed as rife with conceptual cracks and fissures, with Hobbes’s statements about its deductive structure amounting to mere window-dressing. This paper argues that a middle way is found by conceiving of Hobbes’s _Elements of Philosophy_ on the model of a mixed-mathematical science, not the model provided by Euclid’s _Elements of Geometry_. I suggest that Hobbes is a test case for understanding early-modern system-construction more generally, as inspired by the structure of the applied mathematical sciences. This approach has the additional virtue of bolstering, in a novel way, the thesis that the transformation of philosophy in the long seventeenth century was heavily indebted to mathematics, a thesis that has increasingly come under attack in recent years.
Heinrich Behmann (1891-1970) obtained his Habilitation under David Hilbert in Göttingen in 1921 with a thesis on the decision problem. In his thesis, he solved - independently of Löwenheim and Skolem's earlier work - the decision problem for monadic second-order logic in a framework that combined elements of the algebra of logic and the newer axiomatic approach to logic then being developed in Göttingen. In a talk given in 1921, he outlined this solution, but also presented important programmatic remarks on the significance of the decision problem and of decision procedures more generally. The text of this talk as well as a partial English translation are included.
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran-Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.

This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters. This obviates the need for two-dimensional trees, tableaux, lists, and the like, thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis’s S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of “a proof of a sentence from a set of premises” where today “a deduction of a sentence from a set of premises” would be preferable.

1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77.
2. Corcoran, John and George Weaver. 1969. Logical Consequence in Modal Logic: Natural Deduction in S5, Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).
3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
Łukasiewicz has often been criticized for his motive for inventing his three-valued logic, namely the avoidance of determinism. First of all, I want to show that almost all of the criticism along this line was wrong. Second, I will indicate that he nevertheless made mistakes in constructing his system, because he had other motives at the same time. Finally, I will propose some modifications of his system and its interpretation which can attain his original purpose in some sense.
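For readers who want the locus of the dispute in symbols, the next sketch contrasts Łukasiewicz's conditional with the strong Kleene one, the connective on which the two best-known sets of three-valued tables come apart. The numeric encoding (1, 0.5, 0) is conventional, not taken from the paper:

```python
# A quick sketch contrasting Łukasiewicz's implication with the strong
# Kleene one; values: 1 = true, 0.5 = indeterminate, 0 = false.
vals = [0, 0.5, 1]

def imp_lukasiewicz(a, b):
    return min(1, 1 - a + b)     # makes 0.5 -> 0.5 come out fully true

def imp_kleene(a, b):
    return max(1 - a, b)         # leaves 0.5 -> 0.5 indeterminate

for a in vals:
    for b in vals:
        if imp_lukasiewicz(a, b) != imp_kleene(a, b):
            print(a, '->', b, ':', imp_lukasiewicz(a, b),
                  '(Łukasiewicz) vs', imp_kleene(a, b), '(Kleene)')
# Only the entry 0.5 -> 0.5 differs.
```

The single divergent entry is what makes p → p a Łukasiewicz tautology even for future contingents, and it is also the entry that generates the best-known objections to his tables.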
Although it seems intuitively clear that acts of requesting are different from acts of commanding, it is not very easy to state their differences precisely in dynamic terms. In this paper we show that it becomes possible to characterize, at least partially, the effects of acts of requesting and compare them with the effects of acts of commanding by combining dynamified deontic logic with epistemic logic. One interesting result is the following: each act of requesting is appropriately differentiated from an act of commanding with the same content, but for each act of requesting, there is another act of commanding with much more complex content which updates models in exactly the same way as it does. We will also consider an application of our characterization of acts of requesting to acts of asking yes-no questions. It yields a straightforward formalization of the view of acts of asking questions as requests for information.
In this paper we focus our attention on tableau methods for propositional interval temporal logics. These logics provide a natural framework for representing and reasoning about temporal properties in several areas of computer science. However, while various tableau methods have been developed for linear and branching time point-based temporal logics, not much work has been done on tableau methods for interval-based ones. We develop a general tableau method for Venema's CDT logic interpreted over partial orders (NSBCDT for short). It combines features of the classical tableau method for first-order logic with those of explicit tableau methods for modal logics with constraint label management, and it can be easily tailored to most propositional interval temporal logics proposed in the literature. We prove its soundness and completeness, and we show how it has been implemented.
A graph-theoretic account of fibring of logics is developed, capitalizing on the interleaving characteristics of fibring at the linguistic, semantic and proof levels. Fibring of two signatures is seen as a multi-graph (m-graph) where the nodes and the m-edges include the sorts and the constructors of the signatures at hand. Fibring of two models is a multi-graph (m-graph) where the nodes and the m-edges are the values and the operations in the models, respectively. Fibring of two deductive systems is an m-graph whose nodes are language expressions and the m-edges represent the inference rules of the two original systems. The sobriety of the approach is confirmed by proving that all the fibring notions are universal constructions. This graph-theoretic view is general enough to accommodate very different fibrings of propositional based logics encompassing logics with non-deterministic semantics, logics with an algebraic semantics, logics with partial semantics and substructural logics, among others. Soundness and weak completeness are proved to be preserved under very general conditions. Strong completeness is also shown to be preserved under tighter conditions. In this setting, the collapsing problem appearing in several combinations of logic systems can be avoided.
In the paper, various notions of the logical semiotic sense of linguistic expressions – namely, syntactic and semantic, intensional and extensional – are considered and formalised on the basis of a formal-logical conception of any language L characterised categorially in the spirit of certain of Husserl's ideas of pure grammar, the Leśniewski-Ajdukiewicz theory of syntactic/semantic categories and, in accordance with Frege's ontological canons, Bocheński's and some of Suszko's ideas of the language adequacy of expressions of L. The adequacy ensures their unambiguous syntactic and semantic senses and their mutual, syntactic and semantic, correspondence, guaranteed by the acceptance of a postulate of categorial compatibility of the syntactic and semantic categories of expressions of L. This postulate defines the unification of these three logical senses. Three principles of compositionality follow from this postulate: one syntactic and two semantic ones, already known to Frege. They are treated as conditions of homomorphism of the partial algebra of L into algebraic models of L: syntactic, intensional and extensional. In the paper, they are applied to some expressions with quantifiers. Language adequacy connected with the logical senses described in the logical conception of language L is, obviously, an idealisation. The syntactic and semantic unambiguity of its expressions is not, of course, a feature of natural languages, but every syntactically and semantically ambiguous expression of such languages may be treated as a schema representing all of its interpretations that are unambiguous expressions.
This article informally presents a solution to the paradoxes of truth and shows how the solution solves classical paradoxes (such as the original Liar) as well as the paradoxes that were invented as counter-arguments to various proposed solutions (“the revenge of the Liar”). Any solution to the paradoxes of truth necessarily establishes a certain logical concept of truth. This solution complements the classical procedure of determining the truth values of sentences by its own failure and, when the procedure fails, through an appropriate semantic shift allows us to express the failure in a classical two-valued language. Formally speaking, the solution is a language with one meaning of symbols and two valuations of the truth values of sentences. The primary valuation is a classical valuation that is partial in the presence of the truth predicate. It enables us to determine the classical truth value of a sentence or leads to the failure of that determination. The language with the primary valuation is precisely the largest intrinsic fixed point of the strong Kleene three-valued semantics (LIFPSK3). The semantic shift that allows us to express the failure of the primary valuation is precisely the classical closure of LIFPSK3: it extends LIFPSK3 to a classical language in parts where LIFPSK3 is undetermined. Thus, this article provides a content-wise argumentation, which has not been present in contemporary debates so far, for the choice of LIFPSK3 and its classical closure as the right model for the logical concept of truth. In the end, an erroneous critique of the Kripke-Feferman axiomatic theory of truth, which is present in contemporary literature, is pointed out.
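As a toy illustration of the fixed-point idea behind such semantics (this is Kripke's least-fixed-point construction in miniature, not the article's own argument, and the two-sentence language is invented), one can watch the revision step leave ungrounded sentences gappy:

```python
# A toy sketch of a strong Kleene fixed-point construction: 'liar' asserts
# its own untruth, 'teller' asserts its own truth; 'N' marks a gap.
T, F, N = 'T', 'F', 'N'

def jump(v):
    """One revision step: re-evaluate each sentence from the current
    partial assignment of truth values."""
    return {
        'liar':   {T: F, F: T, N: N}[v['liar']],   # negation of its own value
        'teller': v['teller'],                      # assertion of its own value
    }

v = {'liar': N, 'teller': N}       # start from the empty (all-gap) valuation
while jump(v) != v:
    v = jump(v)
print(v)   # -> both remain 'N': neither sentence is grounded
```

Neither sentence ever acquires a classical value, which is the kind of failure the article's "semantic shift" is then designed to express classically.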
This article is part of a larger project in which I attempt to show that Western formal logic, from its inception in Aristotle onward, has both been partially constituted by, and partially constitutive of, what has become known as racism. In contrast to this trend, the present article concerns the major philosopher whose contribution to logic has been perhaps the most derided and marginalized, and yet whose character and politics are, from a contemporary perspective, drastically superior—John Stuart Mill. My approach to my core concern will be one of narrowing concentric circles. I will begin with Mill’s occasional political writings that bear on the issue of racism, including “The Negro Question.” From there, the core of the article will explore the political dimensions of Mill’s A System of Logic.
In Winter 2017, the first author piloted a course in formal logic in which we aimed to (a) improve student engagement and mastery of the content, and (b) reduce maths anxiety and its negative effects on student outcomes, by adopting student-oriented teaching, including peer instruction and classroom flipping techniques. The course implemented a partially flipped approach and incorporated group-work and peer-learning elements, while retaining some of the traditional lecture format. By doing this, a wide variety of student learning preferences could be provided for.
In the final book of the Logical Investigations from 1901, Husserl develops a theory of knowledge based on the intentional structure of consciousness. While there is some textual evidence that Husserl considered this to entail a critique of Kantian philosophy, he did not elaborate substantially on this. This paper reconstructs the covert critique of Kant’s theory of knowledge which the LI contains. With respect to Kant, I discuss three core aspects of his theory of knowledge which, as Husserl’s reflections on Kant indicate, Husserl was familiar with. These are the cooperation of two faculties in the justification of beliefs; the concept of a priori structures of knowledge Kant operated with; and the transcendental proof he delivered of these structures. Regarding the Logical Investigations, I first briefly outline the intentional structure of consciousness as presented in the fifth book and then turn to the theory of knowledge in the sixth book. I then clarify, partially on the basis of manuscripts and lecture notes, the covert critique of the three core aspects of Kant’s theory which the sixth book contains.
Abstract. The aim of this paper is to sketch a topological epistemology that can be characterized as a knowledge-first epistemology. For this purpose, the standard topological semantics for knowledge in terms of the interior kernel operator K of a topological space is extended to a topological semantics of belief operators B in a new way. It is shown that a topological structure has a kind of “derivation” (its “assembly” or “lattice of nuclei”) that defines a profusion of belief operators B. These operators are compatible with the knowledge operator K in the sense that all the pairs (K, B) satisfy the rules and axioms of a (weak) Stalnaker logic of knowledge and belief. The family of belief operators B compatible with K is partially ordered such that different belief operators can be compared according to their strength or reliability. Thereby, for a given topological knowledge operator, a kind of intuitionist logic of belief operators B compatible with K is defined. In sum, the topological knowledge-first epistemology presented in this paper amounts to a pluralist knowledge-first epistemology that conceives the relation between knowledge and belief not as a 1-1 relation but as a pluralist 1-n relation, i.e., one knowledge operator K gives rise to a numerous family of compatible belief operators B.
This paper examines the evolution of Edmund Husserl’s theory of perceptual occlusion. This task is accomplished in two stages. First, I elucidate Husserl’s conclusion, from his 1901 Logical Investigations, that the occluded parts of perceptual objects are intended by partial signitive acts. I focus on two doctrines of that account. I examine Husserl’s insight that signitive intentions are composed of Gehalt and I discuss his conclusion that signitive intentions sit on the continuum of fullness. Second, the paper discloses how Husserl transforms his 1901 philosophy in his 1913 revisions to the Sixth Logical Investigation, affirming that the occluded parts of perceptual objects are intended by empty contiguity acts. I demonstrate how he overturns the two core doctrines of his theory from the Investigations in these revisions, claiming that empty intentions are not composed of Gehalt and asserting that those acts break with the continuum of fullness. Husserl implements these changes to solve problems that arise from his recognition of two new kinds of intentions: darker and completely dark acts. Finally, in the conclusion, I cash out this analysis by indicating that, in 1913, Husserl transforms his theory of fulfillment on the basis of his new insights about empty acts.
Recent work in formal philosophy has concentrated overwhelmingly on the logical problems pertaining to epistemic shortfall - which is to say, on the various ways in which partial and sometimes incorrect information may be stored and processed. A directly depicting language, in contrast, would reflect a condition of epistemic perfection. It would enable us to construct representations not of our knowledge but of the structures of reality itself, in much the way that chemical diagrams allow the representation (at a certain level of abstractness) of the structures of molecules of different sorts. A diagram of such a language would be true if that which it sets out to depict exists in reality, i.e. if the structural relations between the names (and other bits and pieces in the diagram) map structural relations among the corresponding objects in the world. Otherwise it would be false. All of this should, of course, be perfectly familiar. (See, for example, Aristotle, Metaphysics, 1027 b 22, 1051 b 32ff.) The present paper seeks to go further than its predecessors, however, in offering a detailed account of the syntax of a working universal characteristic and of the ways in which it might be used.
Models of collective deliberation often assume that the chief aim of a deliberative exchange is the sharing of information. In this paper, we argue that an equally important role of deliberation is to draw participants’ attention to pertinent questions, which can aid the assembly and processing of distributed information by drawing deliberators’ attention to new issues. The assumption of logical omniscience renders classical models of agents’ informational states unsuitable for modelling this role of deliberation. Building on recent insights from psychology, linguistics and philosophy about the role of questions in speech and thought, we propose a different model in which beliefs are treated as answers directed at specific questions. Here, questions are formally represented as partitions of the space of possibilities and individuals’ information states as sets of questions and corresponding partial answers to them. The state of conversation is then characterised by individuals’ information together with the questions under discussion, which can be steered by various deliberative inputs. Using this model, deliberation is then shown to shape collective decisions in ways that classical models cannot capture, allowing for novel explanations of how group consensus is achieved.
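A minimal sketch of the formal picture, with invented worlds and question names: a question is a partition of the possibility space, and an information state pairs questions with partial answers (unions of cells):

```python
# A minimal sketch of the question-directed model described above;
# the worlds and the question are invented for illustration.
worlds = {'w1', 'w2', 'w3', 'w4'}

# Question 'Is it raining?' partitions the worlds by complete answer.
raining = [{'w1', 'w2'}, {'w3', 'w4'}]            # yes-cell, no-cell

# An information state: a question entertained plus a partial answer to it.
state = {'raining?': {'w1', 'w2', 'w3'}}           # rules out only w4

def live_answers(partial_answer, question):
    """Which complete answers (cells) remain open given a partial answer?"""
    return [cell for cell in question if cell & partial_answer]

print(live_answers(state['raining?'], raining))    # both cells still live
```

Raising a new question under discussion then amounts to adding a partition to the state, which is how deliberative inputs can redirect attention without adding information.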
The aim of this paper is to elaborate a topological semantics of knowledge and belief operators that can be used for an epistemological characterisation of Gettier cases. Relying on this semantics, it will be shown that in Stalnaker’s logic KB every topological knowledge operator K is accompanied by a partially ordered family of belief operators B compatible with K in the sense that the pairs (K, B) of modal operators K and B satisfy all axioms of KB (except the contentious axiom (NI) of negative introspection). For most topological models of KB, Gettier cases occur in a natural way, i.e., most models of KB contain sets of possible worlds that can be interpreted as Gettier cases where true justified beliefs obtain that are not knowledge. On the other hand, there exists a special class of models that lack Gettier cases. Topologically, Gettier cases are characterized as nowhere dense sets. This entails that Gettier cases are “epistemically invisible” and “doxastically invisible”, i.e., they can neither be known by K nor consistently believed by B. The proof that Gettier cases cannot be known by knowledge operators K is elementary; the proof that they cannot be believed by belief operators B relies, however, on a non-trivial theorem of point-free topology, namely, Isbell’s density theorem.

Keywords: Stalnaker’s logic KB of knowledge and belief; topological epistemology; nuclei; epistemic invisibility; doxastic invisibility; Gettier cases; Isbell’s theorem.
This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections show consequence to be a primitive, indefinable notion, and that we must reject the claim that valid arguments necessarily preserve truth. We suggest that the accounts in question have the resources to meet the objections standardly thought to herald their demise and make two main claims: (i) that consequence, as opposed to logical consequence, is the epistemologically significant relation philosophers should be mainly interested in; and (ii) that consequence is a paradoxical notion if truth is.
Abstract. The aim of this paper is to show that the topological interpretation of knowledge as an interior kernel operator K of a topological space (X, OX) comes along with a partially ordered family of belief modalities B that fit K in the sense that the pairs (K, B) satisfy all axioms of Stalnaker’s KB logic of knowledge and belief with the exception of the contentious axiom of negative introspection (NI). The new belief modalities B introduced in this paper are defined with the help of the (dense) nuclei of the Heyting algebra OX of open subsets of the topological space (X, OX). In this way, the natural context for the belief operators B related to the topological knowledge operator K is shown to be the Heyting algebra NUC(OX) of nuclei of the Heyting algebra OX. More precisely, the dense nuclei of NUC(OX) can be used to define a variety of bimodal logics of knowledge operators K and belief operators B. The operators K and B are compatible with each other in the sense that the pairs (K, B) satisfy all axioms of Stalnaker’s KB system with the exception of the axiom (NI). Therefore, for (X, OX), one obtains a bounded, partially ordered family of belief operators B defined by the elements of NUC(OX).
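To make the construction concrete, here is a small finite example, assuming K = interior and taking the double-negation nucleus j(U) = int(cl(U)) as one representative dense nucleus; the four-point space and the test set A are invented for illustration:

```python
# A small sketch on a four-point space: K = topological interior, and one
# belief operator B obtained by applying the double-negation nucleus
# j(U) = int(cl(U)) to K(A). Space and set A are invented for illustration.
X = frozenset({1, 2, 3, 4})
opens = [frozenset(s) for s in
         [set(), {1}, {1, 2}, {1, 3}, {1, 2, 3}, {1, 2, 3, 4}]]

def interior(A):          # largest open subset of A (topologies are
    return max((U for U in opens if U <= A), key=len)   # closed under union)

def closure(A):           # complement of the interior of the complement
    return X - interior(X - A)

def K(A):
    return interior(A)

def B(A):                 # j(K(A)), with j the double-negation nucleus
    return interior(closure(interior(A)))

A = frozenset({1, 2, 4})
print(sorted(K(A)))       # -> [1, 2]
print(sorted(B(A)))       # -> [1, 2, 3, 4]: B(A) extends K(A)
```

Here B(A) properly extends K(A): the nucleus weakens knowledge to a belief-like operator, which is the intended direction of fit, and different dense nuclei would give different members of the partially ordered family.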
The present paper aims to promote epistemic pluralism as an alternative view of non-classical logics. For this purpose, a bilateralist logic of acceptance and rejection is developed in order to mark an important difference between several concepts of epistemology, including information and justification. Moreover, the notion of disagreement corresponds to a set of epistemic oppositions between agents. The result is a non-standard theory of opposition for many-valued logics, rendering total and partial disagreement in terms of epistemic negation and semi-negations.
In order to decide whether a discursive product of human reason corresponds or not to the logical order, one must analyze it in terms of syntactic correctness, consistency, and validity. The first step in logical analysis is formalization, that is, the process by which the logical forms of thoughts are represented in different formal languages or logical systems. Although each thought can be properly formalized in different ways, the formalization variants are not equally adequate. The adequacy of a formalization seems to depend on several essential features: parsimony, accuracy, transparency, fertility and reliability. Because there is a partial antinomy between these traits, it is impossible to find a perfectly adequate variant of formalization. However, it is possible and preferable to reach a reasonable compromise by choosing the variant of formalization which best satisfies all of these fundamental characteristics.
This paper offers an expressivist account of logical form, arguing that in order to fully understand it one must examine what valid arguments make us do (or: what Achilles does and the Tortoise doesn’t, in Carroll’s famed fable). It introduces Charles Peirce’s distinction between symbols, indices and icons as three different kinds of signification whereby the sign picks out its object by learned convention, by unmediated indication, and by resemblance, respectively. It is then argued that logical form is represented by the third, iconic, kind of sign. It is noted that icons uniquely enjoy partial identity between sign and object, and argued that this holds the key to Carroll’s puzzle. Finally, from this examination of sign-types metaphysical morals are drawn: that the traditional foes, metaphysical realism and conventionalism, constitute a false dichotomy, and that reality contains intriguingly inference-binding structures.
I propose and defend a novel view called ‘de se consequentialism’, which is noteworthy for two reasons. First, it demonstrates — contra Doug Portmore, Mark Schroeder, Campbell Brown, and Michael Smith, among others — that a consequentialist theory employing agent-neutral value is logically consistent with agent-centered constraints. Second, de se consequentialism clarifies both the nature of agent-centered constraints and why philosophers have found them puzzling, thereby meriting attention from even dedicated non-consequentialists. Scrutiny reveals that moral theories in general, whether consequentialist or not, incorporate constraints by assessing states in a first-personal guise. Consequently, it is no coincidence that de se consequentialism mimics constraints: its distinctive feature is the very feature through which non-consequentialist theories enact them.
Fine (2017a) sets out a theory of content based on truthmaker semantics which distinguishes two kinds of consequence between contents. There is entailment, corresponding to the relationship between disjunct and disjunction, and there is containment, corresponding to the relationship between conjunctions and their conjuncts. Fine associates these with two notions of parthood: disjunctive and conjunctive. Conjunctive parthood is a very useful notion, allowing us to analyse partial content and partial truth. In this chapter, I extend the notion of disjunctive parthood in terms of a structural relation of refinement, which stands to disjunctive parthood much as mereological parthood stands to conjunctive parthood. Philosophically, this relation may be modelled on the determinable-determinate relation, or on a fact-to-fact notion of grounding. I discuss its connection to two other Finean notions: vagueness (understood via precisification) and arbitrary objects. I then investigate what a logic of truthmaking with refinement might look like. I argue that (i) parthood naturally gives rise to a relevant conditional; (ii) refinement underlies a relevant notion of disjunction; and so (iii) truthmaker semantics with refinement is a natural home for relevant logic. The resulting formal models draw on Fine’s (1974) semantics for relevant logics. Finally, I use this understanding of relevant semantics to investigate the status of the mingle axiom.
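As a quick gloss on conjunctive parthood (a deliberately simplified state-space reading, not Fine's official definitions): if states are sets of atoms and fusion is union, conjunctive parthood is just the part-whole order on states, which is what lets 'A' count as a partial content of 'A and B':

```python
# A rough sketch under a simplified state-space reading: states are sets
# of atoms, fusion is union, and conjunctive (mereological) parthood is
# the subset order on states.
a, b = frozenset({'a'}), frozenset({'b'})
ab = a | b                    # fusion of the two states

def part_of(s, t):            # conjunctive parthood
    return s <= t

print(part_of(a, ab))         # True: a verifier of 'A' is part of a
                              # verifier of 'A and B'
print(part_of(a, b))          # False: disjoint states share no part
```

The chapter's refinement relation is then meant to play the analogous structural role for disjunctive parthood that this subset order plays for conjunctive parthood.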
We present a theory of human artistic experience and the neural mechanisms that mediate it. Any theory of art ideally has three components: the logic of art (whether there are universal rules or principles); the evolutionary rationale (why these rules evolved and why they have the form that they do); and the brain circuitry involved. Our paper begins with a quest for artistic universals and proposes a list of ‘Eight laws of artistic experience’ -- a set of heuristics that artists either consciously or unconsciously deploy to optimally titillate the visual areas of the brain. One of these principles is a psychological phenomenon called the peak shift effect: if a rat is rewarded for discriminating a rectangle from a square, it will respond even more vigorously to a rectangle that is longer and skinnier than the prototype. We suggest that this principle explains not only caricatures, but many other aspects of art. Example: an evocative sketch of a female nude may be one which selectively accentuates those feminine form-attributes that allow one to discriminate it from a male figure; a Boucher, a Van Gogh, or a Monet may be a caricature in ‘colour space’ rather than form space. Even abstract art may employ ‘supernormal’ stimuli to excite form areas in the brain more strongly than natural stimuli. Second, we suggest that grouping is a very basic principle. The different extrastriate visual areas may have evolved specifically to extract correlations in different domains, and discovering and linking multiple features into unitary clusters -- objects -- is facilitated and reinforced by direct connections from these areas to limbic structures. In general, when object-like entities are partially discerned at any stage in the visual hierarchy, messages are sent back to earlier stages to alert them to certain locations or features in order to look for additional evidence for the object. Finally, given constraints on allocation of attentional resources, art is most appealing if it produces heightened activity in a single dimension rather than redundant activation of multiple modules. This idea may help explain the effectiveness of outline drawings and sketches, the savant syndrome in autists, and the sudden emergence of artistic talent in fronto-temporal dementia. In addition to these three basic principles we propose five others, constituting a total of ‘eight laws of aesthetic experience’.
Epistemologists who study partial beliefs, or credences, have a well-developed account of how you should change your credences when you learn new evidence; that is, when your body of evidence grows. What's more, they boast a diverse range of epistemic and pragmatic arguments that support that account. But they do not have a satisfactory account of when and how you should change your credences when you become aware of possibilities and propositions you have not entertained before; that is, when your awareness grows. In this paper, I consider each of the arguments for the credal epistemologist's account of how to respond to evidence, and I ask whether they can help us generate an account of how to respond to awareness growth. The results are surprising: the arguments that all support the same norms for responding to evidence growth support a number of different norms when they are applied to awareness growth. Some of these norms seem too weak, others too strong. I ask what we should conclude from this, and argue that our credal response to awareness growth is considerably less rigorously constrained than our credal response to new evidence.
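The well-developed account alluded to at the start is conditionalization; a minimal sketch (the weather worlds and the numbers are invented) shows the evidence-growth case that the paper contrasts with awareness growth:

```python
# A minimal sketch of conditionalization on new evidence, the standard
# norm for evidence growth; worlds and priors are invented for illustration.
priors = {'rain & wind': 0.2, 'rain & calm': 0.3,
          'dry & wind': 0.1,  'dry & calm': 0.4}

def conditionalize(credences, evidence):
    """Zero out worlds ruled out by the evidence, renormalize the rest."""
    kept = {w: p for w, p in credences.items() if evidence(w)}
    total = sum(kept.values())
    return {w: p / total for w, p in kept.items()}

posterior = conditionalize(priors, lambda w: w.startswith('rain'))
print(posterior)   # {'rain & wind': 0.4, 'rain & calm': 0.6}
```

Awareness growth is harder precisely because it adds new worlds to the domain of `priors` rather than ruling existing ones out, so no renormalization rule of this shape straightforwardly applies.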
I am saying farewell after more than forty happy years of teaching logic at the University of Buffalo. But this is only a partial farewell. I will no longer be at UB to teach classroom courses or seminars. But nothing else will change. I will continue to be available for independent study. I will continue to write abstracts and articles with people who have taken courses or seminars with me. And I will continue to honor the LogicLifetimeGuarantee™, which is earned by taking one of my logic courses or seminars. As you know, according to the terms of the LogicLifetimeGuarantee™, I stand behind everything I teach. If you find anything to be unsatisfactory, I am committed to fixing it. If you forget anything, I will remind you. If you have questions, I will answer them or ask more questions. And if you need more detail on any topic we discussed, I will help you to broaden and deepen your knowledge—and maybe write an abstract or article. Stay in touch.
Dynamic conceptual reframing represents a crucial mechanism employed by humans, and partially by other animal species, to generate novel knowledge used to solve complex goals. In this talk, I will present a reasoning framework for knowledge invention and creative problem solving exploiting TCL: a non-monotonic extension of a Description Logic (DL) of typicality able to combine prototypical (commonsense) descriptions of concepts in a human-like fashion [1]. The proposed approach has been tested in the task of goal-driven concept invention [2,3] and has additionally been applied within the context of serendipity-based recommendation systems [4]. I will present the obtained results, the lessons learned, and the road ahead of this research path.
This collection of articles was written over the last 10 years and edited to bring them up to date (2019). All the articles are about human behavior (as are all articles by anyone about anything), and so about the limitations of having a recent monkey ancestry (8 million years or much less depending on viewpoint) and manifest words and deeds within the framework of our innate psychology as presented in the table of intentionality. As famous evolutionist Richard Leakey says, it is critical to keep in mind not that we evolved from apes, but that in every important way, we are apes. If everyone was given a real understanding of this (i.e., of human ecology and psychology to actually give them some control over themselves), maybe civilization would have a chance. As things are, however, the leaders of society have no more grasp of things than their constituents, and so collapse into anarchy is inevitable.

The first group of articles attempts to give some insight into how we behave that is reasonably free of theoretical delusions. In the next three groups, I comment on three of the principal delusions preventing a sustainable world: technology, religion and politics (cooperative groups). People believe that society can be saved by them, so I provide some suggestions in the rest of the book as to why this is unlikely, via short articles and reviews of recent books by well-known writers.

It is critical to understand why we behave as we do, and so the first section presents articles that try to describe (not explain, as Wittgenstein insisted) behavior. I start with a brief review of the logical structure of rationality, which provides some heuristics for the description of language (mind, rationality, personality) and gives some suggestions as to how this relates to the evolution of social behavior. This centers around the two writers I have found the most important in this regard, Ludwig Wittgenstein and John Searle, whose ideas I combine and extend within the dual system (two systems of thought) framework that has proven so useful in recent thinking and reasoning research. As I note, there is in my view essentially complete overlap between philosophy, in the strict sense of the enduring questions that concern the academic discipline, and the descriptive psychology of higher order thought (behavior). Once one has grasped Wittgenstein’s insight that there is only the issue of how the language game is to be played, one determines the Conditions of Satisfaction (what makes a statement true or satisfied etc.) and that is the end of the discussion. No neurophysiology, no metaphysics, no postmodernism, no theology.

It is my contention that the table of intentionality (rationality, mind, thought, language, personality etc.) that features prominently here describes more or less accurately, or at least serves as an heuristic for, how we think and behave, and so it encompasses not merely philosophy and psychology, but everything else (history, literature, mathematics, politics etc.). Note especially that intentionality and rationality as I (along with Searle, Wittgenstein and others) view it, includes both conscious deliberative System 2 and unconscious automated System 1 actions or reflexes.

The next section describes the digital delusions, which confuse the language games of System 2 with the automatisms of System 1, and so cannot distinguish biological machines (i.e., people) from other kinds of machines (i.e., computers).
The ‘reductionist’ claim is that one can ‘explain’ behavior at a ‘lower’ level, but what actually happens is that one does not explain human behavior but a ‘stand-in’ for it. Hence the title of Searle’s classic review of Dennett’s book (“Consciousness Explained”): “Consciousness Explained Away”. In most contexts, ‘reduction’ of higher-level emergent behavior to brain functions, biochemistry, or physics is incoherent. Even for ‘reduction’ of chemistry to physics, the path is blocked by chaos and uncertainty. Anything can be ‘represented’ by equations, but when they ‘represent’ higher-order behavior, it is not clear (and cannot be made clear) what the ‘results’ mean. Reductionist metaphysics is a joke, but most scientists and philosophers lack the appropriate sense of humor.

The last section describes The One Big Happy Family Delusion, i.e., that we are selected for cooperation with everyone, and that the euphonious ideals of Democracy, Diversity and Equality will lead us into utopia if we just manage things correctly (the possibility of politics). Again, the No Free Lunch Principle ought to warn us it cannot be true, and we see throughout history and all over the contemporary world that, without strict controls, selfishness and stupidity gain the upper hand and soon destroy any nation that embraces these delusions. In addition, the monkey mind steeply discounts the future, and so we cooperate in selling our descendants’ heritage for temporary comforts, greatly exacerbating the problems. The only major change in this edition is the addition, in the last article, of a short discussion of China, a threat to peace and freedom as great as overpopulation and climate change, and one to which even most professional scholars and politicians are oblivious, so I regarded it as sufficiently important to warrant a new edition.

I describe versions of this delusion (i.e., that we are basically ‘friendly’ if just given a chance) as it appears in some recent books on sociology/biology/economics. Even Sapolsky’s otherwise excellent “Behave” (2017) embraces leftist politics and group selection and gives space to a discussion of whether humans are innately violent. I end with an essay on the great tragedy playing out in America and the world, which can be seen as a direct result of our evolved psychology manifested as the inexorable machinations of System 1. Our psychology, eminently adaptive and eugenic on the plains of Africa from ca. 6 million years ago, when we split from chimpanzees, to ca. 50,000 years ago, when many of our ancestors left Africa (i.e., in the EEA or Environment of Evolutionary Adaptation), is now maladaptive and dysgenic and the source of our Suicidal Utopian Delusions. So, like all discussions of behavior (philosophy, psychology, sociology, biology, anthropology, politics, law, literature, history, economics, soccer strategies, business meetings, etc.), this book is about evolutionary strategies, selfish genes and inclusive fitness (kin selection, natural selection).

The great mystic Osho said that the separation of God and Heaven from Earth and Humankind was the most evil idea that ever entered the human mind. In the 20th century an even more evil notion arose, or at least became popular with leftists: that humans are born with rights, rather than having to earn privileges. The idea of human rights is an evil fantasy created by leftists to draw attention away from the merciless destruction of the earth by unrestrained 3rd-world motherhood.
Thus, every day the population increases by 200,000, who must be provided with resources to grow and space to live, and who soon produce another 200,000, etc. And one almost never hears it noted that what they receive must be taken from those already alive, and from their descendants. Their lives diminish those already here in major, obvious ways and in countless subtle ones. Every new baby destroys the earth from the moment of conception. In a horrifically overcrowded world with vanishing resources, there cannot be human rights without destroying the earth and our descendants’ futures. It could not be more obvious, but it is rarely mentioned in a clear and direct way, and one will never see the streets full of protesters against motherhood.

The most basic facts, almost never mentioned, are that there are not enough resources in America or the world to lift a significant percentage of the poor out of poverty and keep them there. Even the attempt to do this is already bankrupting America and destroying the world. The earth’s capacity to produce food decreases daily, as does our genetic quality. And now, as always, by far the greatest enemy of the poor is the other poor, not the rich.

America and the world are in the process of collapse from excessive population growth, most of it over the last century, and now all of it, due to 3rd-world people. Consumption of resources and the addition of 4 billion more people by ca. 2100 will collapse industrial civilization and bring about starvation, disease, violence and war on a staggering scale. The earth loses about 2% of its topsoil every year, so as we near 2100, most of its food-growing capacity will be gone. Billions will die, and nuclear war is all but certain. In America, this is being hugely accelerated by massive immigration and immigrant reproduction, combined with abuses made possible by democracy. Depraved human nature inexorably turns the dream of democracy and diversity into a nightmare of crime and poverty. China will continue to overwhelm America and the world as long as it maintains the dictatorship that limits selfishness. The root cause of collapse is the inability of our innate psychology to adapt to the modern world, which leads people to treat unrelated persons as though they had common interests (which I suggest may be regarded as an unrecognized, but the commonest and most serious, psychological problem: Inclusive Fitness Disorder). This, plus ignorance of basic biology and psychology, leads to the social engineering delusions of the partially educated who control democratic societies. Few understand that if you help one person you harm someone else; there is no free lunch, and every single item anyone consumes destroys the earth beyond repair. Consequently, social policies everywhere are unsustainable, and one by one all societies without stringent controls on selfishness will collapse into anarchy or dictatorship. Without dramatic and immediate changes, there is no hope for preventing the collapse of America, or of any country that follows a democratic system. Hence my concluding essay, “Suicide by Democracy”.

Those wishing to read my other writings may see Talking Monkeys 2nd ed (2019), The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019), Suicide by Democracy 3rd ed (2019), The Logical Structure of Human Behavior (2019) and Suicidal Utopian Delusions in the 21st Century 4th ed (2019). (shrink)
It is sometimes argued that certain sentences of natural language fail to express truth-conditional contents. Standard examples include “Tipper is ready” and “Steel is strong enough”. In this paper, we provide a novel analysis of truth-conditional meaning using the notion of a question under discussion. This account explains why these types of sentences are not, in fact, semantically underdetermined, provides a principled analysis of the process by which natural language sentences can come to have enriched meanings in (...) context, and shows why various alternative views, e.g. so-called Radical Contextualism, Moderate Contextualism, and Semantic Minimalism, are partially right in their respective analyses of the problem, but also all ultimately wrong. Our analysis achieves this result using a standard truth-conditional and compositional semantics and without making any assumptions about enriched logical forms, i.e. logical forms containing phonologically null expressions. (shrink)
Othering is the construction and identification of the self or in-group and the other or out-group in mutual, unequal opposition, by attributing relative inferiority and/or radical alienness to the other/out-group. The notion of othering spread from feminist theory and post-colonial studies to other areas of the humanities and social sciences, but it is originally rooted in Hegel’s dialectic of identification and distantiation in the encounter of the self with some other, in his “Master-Slave dialectic”. In this paper, after reviewing the philosophical (...) and psychological background of othering, I distinguish two kinds of othering, “crude” and “sophisticated”, that differ in the logical form of their underlying arguments. The essential difference is that the former is merely self-other distantiating, while the latter, as in Hegel’s dialectic, partially depends on self-other identification. While crude othering is closer to the paradigmatic notion of othering, sophisticated othering is closer to Hegel’s; but so is quasi-othering, which is nearly identical in form to sophisticated othering but lacks the defining feature of othering: attributing relative inferiority and/or radical alienness. Because Hegel’s dialectic applies to any encounter of an interpreting self with some other, sophisticated or quasi-othering is at least potentially a very common occurrence in the interpretation of others, especially of those who do not belong to the in-group. However, although othering is usually undesirable, the Hegelian varieties can provide a “mirror” that can be used as a tool to improve understanding of both the other and the interpreting self, and the malignant aspects of othering can be avoided through charity. (shrink)
This dissertation concerns the methodology Kant employs in the first two sections of the Groundwork of the Metaphysics of Morals (Groundwork I-II), with particular attention to how the execution of the method of analysis in these sections contributes to the establishment of moral metaphysics as a science. My thesis is that Kant had a detailed strategy for the Groundwork, that this strategy and Kant’s reasons for adopting it can be ascertained from the Critique of Pure Reason (first Critique) and his (...) lectures on logic, and that understanding this strategy gains us interpretive insight into Kant’s moral metaphysics. At the most general level of methodology, Kant says there are four steps for the establishment of any science: (1) make distinct the idea of the natural unity of its material; (2) determine the special content of the science; (3) articulate the systematic unity of the science; and (4) critique the science to determine its boundaries. The first two of these steps are accomplished by the genetically scholastic method of analysis, paradigmatically the method whereby confused and obscure ideas are made clear and distinct, thereby logically perfecting them and transforming them into possible grounds of cognitive insight that are potentially complete and adequate to philosophical purposes. The analysis of Groundwork I is a paradigmatic analysis that makes distinct what is contained in common understanding, i.e. its Inhalt or intension, making distinct the higher partial concepts that together define the concept of morality. The analysis of Groundwork II is an employment more specifically of the method of logical division, which makes distinct what is contained under the concept, i.e. its Umfang, by which the extension or object of morality is determined. Part I introduces Kant’s conception of moral metaphysical science and why he took it to be in need of establishment, explains the general method for establishing science and the scholastic method of analysis by which its first two steps are to be accomplished, and then provides an interpretation of Groundwork I as an execution of this method. Part II details Kant’s determination of the special content of moral science in Groundwork II in relation to the central problem for moral metaphysics: how synthetic a priori practical cognition is possible. (shrink)