Sentences containing definite descriptions, expressions of the form ‘The F’, can be formalised using a binary quantifier ι that forms a formula out of two predicates, where ιx[F, G] is read as ‘The F is G’. This is an innovation over the usual formalisation of definite descriptions with a term forming operator. The present paper compares the two approaches. After a brief overview of the system INFι of intuitionist negative free logic extended by such a quantifier, which was presented in (Kürbis 2019), INFι is first compared to a system of Tennant’s and an axiomatic treatment of a term forming ι operator within intuitionist negative free logic. Both systems are shown to be equivalent to the subsystem of INFι in which the G of ιx[F, G] is restricted to identity. INFι is then compared to an intuitionist version of a system of Lambert’s which in addition to the term forming operator has an operator for predicate abstraction for indicating scope distinctions. The two systems will be shown to be equivalent through a translation between their respective languages. Advantages of the present approach over the alternatives are indicated in the discussion.
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
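For orientation, the binary quantifier can be glossed along Russellian lines. This explicit definition is an illustrative assumption for the reader, not Kürbis's official treatment, which proceeds via proof-theoretic introduction and elimination rules:

```latex
% Russellian gloss of the binary description quantifier (illustrative only):
% 'The F is G' is true iff exactly one thing is F, and that thing is G.
\iota x[F, G] \;\leftrightarrow\; \exists x\,\bigl(F(x) \wedge \forall y\,(F(y) \rightarrow y = x) \wedge G(x)\bigr)
```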
In this paper I am concerned with an analysis of negative existential sentences that contain proper names only by using negative or neutral free logic. I will compare different versions of neutral free logic with the standard system of negative free logic (Burge, Sainsbury) and aim to defend my version of neutral free logic that I have labeled non-standard neutral free logic.
In this paper I aim to defend a first‐order non‐discriminating property view concerning existence. The version of this view that I prefer is based on negative (or a specific neutral) free logic that treats the existence predicate as a first‐order logical predicate. I will provide reasons why such a view is more plausible than a second‐order discriminating property view concerning existence and I will also discuss four challenges for the proposed view and provide solutions to them.
Halbach has argued that Tarski biconditionals are not ontologically conservative over classical logic, but his argument is undermined by the fact that he cannot include a theory of arithmetic, which functions as a theory of syntax. This article is an improvement on Halbach's argument. By adding the Tarski biconditionals to inclusive negative free logic and the universal closure of minimal arithmetic, which is by itself an ontologically neutral combination, one can prove that at least one thing exists. The result can then be strengthened to the conclusion that infinitely many things exist. Those things are not just all Gödel codes of sentences but rather all natural numbers. Against this background inclusive negative free logic collapses into noninclusive free logic, which collapses into classical logic. The consequences for ontological deflationism with respect to truth are discussed.
The principle of universal instantiation plays a pivotal role both in the derivation of intensional paradoxes such as Prior’s paradox and Kaplan’s paradox and the debate between necessitism and contingentism. We outline a distinctively free logical approach to the intensional paradoxes and note how the free logical outlook allows one to distinguish two different, though allied themes in higher-order necessitism. We examine the costs of this solution and compare it with the more familiar ramificationist approaches to higher-order logic. Our assessment of both approaches is largely pessimistic, and we remain reluctantly inclined to take Prior’s and Kaplan’s derivations at face value.
From Leibniz to Krauss philosophers and scientists have raised the question as to why there is something rather than nothing. Why-questions request a type of explanation and this is often thought to include a deductive component. With classical logic in the background only trivial answers are forthcoming. With free logics in the background, be they of the negative, positive or neutral variety, only question-begging answers are to be expected. The same conclusion is reached for the modal version of the Question, namely ‘Why is there something contingent rather than nothing contingent?’. The categorial version of the Question, namely ‘Why is there something concrete rather than nothing concrete?’, is also discussed. The conclusion is reached that deductive explanations are question-begging, whether one works with classical logic or positive or negative free logic. I also look skeptically at the prospects of giving causal-counterfactual or probabilistic answers to the Question, although the discussion of the options is less comprehensive and the conclusions are more tentative. The meta-question, viz. ‘Should we not stop asking the Question?’, is accordingly tentatively answered affirmatively.
In this extended critical discussion of 'Kant's Modal Metaphysics' by Nicholas Stang (OUP 2016), I focus on one central issue from the first chapter of the book: Stang’s account of Kant’s doctrine that existence is not a real predicate. In §2 I outline some background. In §§3-4 I present and then elaborate on Stang’s interpretation of Kant’s view that existence is not a real predicate. For Stang, the question of whether existence is a real predicate amounts to the question: ‘could there be non-actual possibilia?’ (p.35). Kant’s view, according to Stang, is that there could not, and that the very notion of non-actual or ‘mere’ possibilia is incoherent. In §5 I take a close look at Stang’s master argument that Kant’s Leibnizian predecessors are committed to the claim that existence is a real predicate, and thus to mere possibilia. I argue that it involves substantial logical commitments that the Leibnizian could reject. I also suggest that it is in danger of proving too much. In §6 I explore two closely related logical commitments that Stang’s reading implicitly imposes on Kant, namely a negative universal free logic and a quantified modal logic that invalidates the Converse Barcan Formula. I suggest that each can seem to involve Kant himself in commitment to mere possibilia.
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelean syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities.
With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
The result of combining classical quantificational logic with modal logic proves necessitism – the claim that necessarily everything is necessarily identical to something. This problem is reflected in the purely quantificational theory by theorems such as ∃x t=x; it is a theorem, for example, that something is identical to Timothy Williamson. The standard way to avoid these consequences is to weaken the theory of quantification to a certain kind of free logic. However, it has often been noted that in order to specify the truth conditions of certain sentences involving constants or variables that don’t denote, one has to apparently quantify over things that are not identical to anything. In this paper I defend a contingentist, non-Meinongian metaphysics within a positive free logic. I argue that although certain names and free variables do not actually refer to anything, in each case there might have been something they actually refer to, allowing one to interpret the contingentist claims without quantifying over mere possibilia.
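The theorem ∃x t=x mentioned in the abstract falls out of classical quantification theory in two standard steps, with a third step once modal operators are added. This is textbook material sketched here for orientation, not a derivation drawn from the paper itself:

```latex
% Why classical quantificational logic proves that t denotes something:
% 1. reflexivity of identity; 2. existential generalisation on 1;
% 3. necessitation on 2, once modal logic is combined with the theory.
1.\; t = t \qquad\quad 2.\; \exists x\,(t = x) \qquad\quad 3.\; \Box\,\exists x\,(t = x)
```

Free logic blocks step 2, existential generalisation, for terms that may fail to denote.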
Currently available systems of action deontic logic are not designed to model procedures to assess the conduct of an agent which take into account the intentions of the agent and the circumstances in which she is acting. Yet, procedures of this kind are essential to determine what counts as culpable not doing. In light of this, we design an action logic, AL, in which it is possible to distinguish actions that are objectively possible for an agent, viz. there are no objective impediments for the agent to do them, from actions that, besides being objectively possible, are compatible with the setting or intentions of the agent.
This article reconstructs Kant's view on the existential import of categorical sentences. Kant is widely taken to have held that affirmative sentences (the A and I sentences of the traditional square of opposition) have existential import, whereas negative sentences (E and O) lack existential import. The article challenges this standard interpretation. It is argued that Kant ascribes existential import only to some affirmative synthetic sentences. However, the reasons for this do not fall within the remit of Kant's formal logic. Unlike traditional logic and modern standard quantification theory, Kant's formal logic is free from existential commitments.
This paper describes a decision procedure for disjunctions of conjunctions of anti-prenex normal forms of pure first-order logic (FOLDNFs) that do not contain ∨ within the scope of quantifiers. The disjuncts of these FOLDNFs are equivalent to prenex normal forms whose quantifier-free parts are conjunctions of atomic and negated atomic formulae (= Herbrand formulae). In contrast to the usual algorithms for Herbrand formulae, neither skolemization nor unification algorithms with function symbols are applied. Instead, a procedure is described that rests on nothing but equivalence transformations within pure first-order logic (FOL). This procedure involves the application of a calculus for negative normal forms (the NNF-calculus) with A -||- A & A (= &I) as the sole rule that increases the complexity of given FOLDNFs. The described algorithm illustrates how, in the case of Herbrand formulae, decision problems can be solved through a systematic search for proofs that reduce the number of applications of the rule &I to a minimum in the NNF-calculus. In the case of Herbrand formulae, it is even possible to entirely abstain from applying &I. Finally, it is shown how the described procedure can be used within an optimized general search for proofs of contradiction and what kind of questions arise for a &I-minimal proof strategy in the case of a general search for proofs of contradiction.
Debunking arguments—also known as etiological arguments, genealogical arguments, access problems, isolation objections, and reliability challenges—arise in philosophical debates about a diverse range of topics, including causation, chance, color, consciousness, epistemic reasons, free will, grounding, laws of nature, logic, mathematics, modality, morality, natural kinds, ordinary objects, religion, and time. What unifies the arguments is the transition from a premise about what does or doesn't explain why we have certain mental states to a negative assessment of their epistemic status. I examine the common, underlying structure of the arguments and the different strategies for motivating and resisting the premises of debunking arguments.
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as previously had been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: First, that it presupposed no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system.
JOHN CORCORAN AND WAGNER SANZ, Disbelief Logic Complements Belief Logic. Philosophy, University at Buffalo, Buffalo, NY 14260-4150 USA E-mail: corcoran@buffalo.edu Filosofia, Universidade Federal de Goiás, Goiás, GO 74001-970 Brazil E-mail: sanz@fchf.ufg.br -/- Consider two doxastic states belief and disbelief. Belief is taking a proposition to be true and disbelief taking it to be false. Judging also dichotomizes: accepting a proposition results in belief and rejecting in disbelief. Stating follows suit: asserting a proposition conveys belief and denying conveys disbelief. Traditional logic implicitly focused on logical relations and processes needed in expanding and organizing systems of beliefs. Deducing a conclusion from beliefs results in belief of the conclusion. Deduction presupposes consequence: one proposition is a consequence of a set of propositions if the latter logically implies the former. The role of consequence depends on its being truth-preserving: every consequence of a set of truths is true. This paper, which builds on previous work by the second author, explores roles of logic in expanding and organizing systems of disbeliefs. Aducing a conclusion from disbeliefs results in disbelief of the conclusion. Aduction presupposes contrequence: one proposition is a contrequence of a set of propositions if the set of negations or contradictory opposites of the latter logically implies that of the former. The role of contrequence depends on its being falsity-preserving: every contrequence of a set of falsehoods is false. A system of aductions that includes, for every contrequence of a given set, an aduction of the contrequence from the set is said to be complete. Historical and philosophical discussion is illustrated and enriched by presenting complete systems of aductions constructed by the second author.
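The definition of contrequence given in the abstract can be stated compactly in standard notation (the notation is mine, not the authors'):

```latex
% P is a contrequence of a set Gamma iff the negations of Gamma
% logically imply the negation of P; hence contrequence is
% falsity-preserving where consequence is truth-preserving.
P \text{ is a contrequence of } \Gamma
\;\iff\; \{\neg Q : Q \in \Gamma\} \models \neg P
```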
One such, a natural aduction system for Aristotelian categorical propositions, is based on a natural deduction system attributed to Aristotle by the first author and others. ADDED NOTE: Wagner Sanz reconstructed Aristotle’s logic the way it would have been had Aristotle focused on constructing “anti-sciences” instead of sciences: more generally, on systems of disbeliefs.
Rawlins (2013: 160) observes that both unconditionals and more classical free choice can be meta-characterized using orthogonality, but does not actually unify the two. One reason may be that in English, different expressions serve in these roles. By contrast, in Hungarian, AKÁR expressions serve as NPIs, FCIs, and unconditional adjuncts, but not as interrogatives or free relatives. This paper offers a unified account of the Hungarian data, extending Chierchia 2013 and Dayal 2013. The account produces the same unconditional meanings that Rawlins derives from an interrogative basis. This result highlights the fact that sets of alternatives arise from different morpho-syntactic sources and are utilized by the grammar in different ways, but the results may fully converge.
It is quite plausible to say that you may read or write implies that you may read and you may write (though possibly not both at once). This so-called free choice principle is well-known in deontic logic. Sadly, despite being so intuitive and seemingly innocent, this principle causes a lot of worries. The paper briefly but critically examines leading accounts of free choice permission present in the literature. Subsequently, the paper suggests accepting the free choice principle, but only as a default (or defeasible) rule, issuing to it a ticket-of-leave, granting it some freedom, until it commits an undesired inference.
Paul Oppenheimer and Edward Zalta's formalisation of Anselm's ontological argument for the existence of God is automated by embedding a free logic for definite descriptions within Isabelle/HOL.
In this paper I introduce a sequent system for the propositional modal logic S5. Derivations of valid sequents in the system are shown to correspond to proofs in a novel natural deduction system of circuit proofs (reminiscent of proofnets in linear logic, or multiple-conclusion calculi for classical logic). The sequent derivations and proofnets are both simple extensions of sequents and proofnets for classical propositional logic, in which the new machinery—to take account of the modal vocabulary—is directly motivated in terms of the simple, universal Kripke semantics for S5. The sequent system is cut-free and the circuit proofs are normalising.
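For readers less familiar with the system, S5 is standardly axiomatised over classical propositional logic by the following schemas; this is textbook background, not material from the paper, which works proof-theoretically rather than axiomatically:

```latex
% S5 = K + T + 5 over classical propositional logic,
% characterised by Kripke frames whose accessibility relation
% is an equivalence relation (or is universal).
\text{K:}\;\; \Box(A \rightarrow B) \rightarrow (\Box A \rightarrow \Box B) \qquad
\text{T:}\;\; \Box A \rightarrow A \qquad
\text{5:}\;\; \Diamond A \rightarrow \Box\Diamond A
```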
The additive presupposition of particles like "too"/"even" is uncontested, but usually stipulated. This paper proposes to derive it based on two properties. (i) "too"/"even" is cross-linguistically focus-sensitive, and (ii) in many languages, "too"/"even" builds negative polarity items and free-choice items as well, often in concert with other particles. (i) is the source of its existential presupposition, and (ii) offers clues regarding how additivity comes about. (i)-(ii) together demand a sparse semantics for "too/even," one that can work with different kinds of alternatives (focus, subdomain, scalar) and invoke suitably different further operators.
It is well known that systems of action deontic logic emerging from a standard analysis of permission in terms of possibility of doing an action without incurring a violation of the law are subject to paradoxes. In general, paradoxes are acknowledged as such if we have intuitions telling us that things should be different. The aim of this paper is to introduce a paradox-free deontic action system by (i) identifying the basic intuitions leading to the emergence of the paradoxes and (ii) exploiting these intuitions in order to develop a consistent deontic framework, where it can be shown why some phenomena seem to be paradoxical and why they are not so if interpreted in a correct way.
Reinach’s essay of 1911 establishes an ontological theory of logic, based on the notion of Sachverhalt or state of affairs. He draws on the theory of meaning and reference advanced in Husserl’s Logical Investigations and at the same time anticipates both Wittgenstein’s Tractatus and later speech act theorists’ ideas on performative utterances. The theory is used by Reinach to draw a distinction between two kinds of negative judgment: the simple negative judgment, which is made true by a negative state of affairs; and the polemical negative judgment, which is a performative utterance in which the truth of some earlier judgment – typically a judgment made by some other person – is denied.
We present cut-free labelled sequent calculi for a central formalism in logics of agency: STIT logics with temporal operators. These include sequent systems for Ldm, Tstit and Xstit. All calculi presented possess essential structural properties such as contraction- and cut-admissibility. The labelled calculi G3Ldm and G3Tstit are shown sound and complete relative to irreflexive temporal frames. Additionally, we extend current results by showing that also Xstit can be characterized through relational frames, omitting the use of BT+AC frames.
In researching presuppositions dealing with the logic and dynamics of belief we distinguish two related parts. The first part refers to presuppositions and logic, which is not necessarily involved with intentional operators. We are primarily concerned with classical, free and presuppositional logic. Here, we apply Strawson’s well-known approach to the problem of presupposition in relation to classical logic. Further on in this work, free logic is used, especially Van Fraassen’s research on the role of presupposition in supervaluational logical systems. At the end of the first part, presuppositional logic, advocated by S.K. Thomason, is taken into consideration. The second part refers to the presuppositions in relation to the logic of the dynamics of belief. Here the logic of belief change is taken into consideration and other epistemic notions with an immanent mechanism for the presentation of the dynamics. Three representative and dominant approaches are evaluated. First, we deal with new, less classical, situation semantics. Besides Strawson’s theory, the second theory is the theory of belief change, developed by Alchourron, Gärdenfors, and Makinson (AGM theory). At the end, the oldest, universal, and dominant approach is used, recognized as Hintikka’s approach to the analysis of epistemic notions.
Anonymity promotes free speech by protecting the identity of people who might otherwise face negative consequences for expressing their ideas. Wrongdoers, however, often abuse this invisibility cloak. Defenders of anonymity online emphasise its value in advancing public debate and safeguarding political dissension. Critics emphasise the need for identifiability in order to achieve accountability for wrongdoers such as trolls. The problematic tension between anonymity and identifiability online lies in the desirability of having low costs (no repercussions) for desirable speech and high costs (appropriate repercussions) for undesirable speech. If we practice either full anonymity or identifiability, we end up having either low or high costs in all online contexts and for all kinds of speech. I argue that free speech is compatible with instituting costs in the form of repercussions and penalties for controversial and unacceptable speech. Costs can minimise the risks of anonymity by providing a reasonable degree of accountability. Pseudonymity is a tool that can help us regulate those costs while furthering free speech. This article argues that, in order to redesign the Internet to better serve free speech, we should shape much of it to resemble an online masquerade.
Benjamin Libet’s work paved the way for the neuroscientific study of free will. Other scientists have praised this research as groundbreaking. In philosophy, the reception has been more negative, often even dismissive. First, I will propose a diagnosis of this striking discrepancy. I will suggest that the experiments seem irrelevant, from the perspective of philosophy, due to the way in which they operationalize free will. In particular, I will argue that this operational definition does not capture free will properly and that it is based on a false dichotomy between internal and external causes. However, I will also suggest that this problem could be overcome, as there are no obvious obstacles to an operationalization of free will that is in accord with the philosophical conception of free will.
The problem of negative truth is the problem of how, if everything in the world is positive, we can speak truly about the world using negative propositions. A prominent solution is to explain negation in terms of a primitive notion of metaphysical incompatibility. I argue that if this account is correct, then minimal logic is the correct logic. The negation of a proposition A is characterised as the minimal incompatible of A composed of it and the logical constant ¬. A rule-based account of the meanings of logical constants that appeals to the notion of incompatibility in the introduction rule for negation ensures the existence and uniqueness of the negation of every proposition. But it endows the negation operator with no more formal properties than those it has in minimal logic.
Several prominent contemporary philosophers, including Jürgen Habermas, John Caputo, and Robert Bernasconi, have at times painted a somewhat negative picture of Gadamer as not only an uncritical traditionalist, but also as one whose philosophical project fails to appreciate difference. Against such claims, I argue that Gadamer’s reflections on art exhibit a genuine appreciation for alterity not unrelated to his hermeneutical approach to the other. Thus, by bringing Gadamer’s reflections on our experience of art into conversation with key aspects of his philosophical hermeneutics, we are able to better assess the viability of Gadamer’s contributions to contemporary discussions of difference and alterity. Sections two through six focus on key concepts in Gadamer’s account of art’s dynamic ontology and our experience of art. Such concepts include the play structure of art, hermeneutic identity, tarrying with a work, and contemporaneity. The opening sections provide not only a discussion of these central themes, but they also (1) draw attention to the various ways in which difference and otherness are integral to Gadamer’s account, and (2) utilize relevant musical examples that prepare the reader for a more focused discussion of a Gadamerian approach to free jazz in section seven. By highlighting how Gadamer’s understanding of art possesses a dialogical play structure, is characterized by identity and difference, requires actively engaged spectators and auditors, and is amenable to what many criticize as an unintelligible musical expression, viz. free jazz, Gadamer’s project is shown as other-affirming and open to ambiguity and dynamism. That is, the essential structures and concepts characterizing Gadamer’s reflections on art are likewise central to his overall hermeneutical project, and hence are not rightly described as un-attuned to difference or other-negating.
Rather, Gadamer’s philosophical project upholds difference, since it requires a dialogical interplay between self and other that creates the possibility for a transformative experience.
Working within the broad lines of general consensus that mark out the core features of John Stuart Mill’s (1806–1873) logic, as set forth in his A System of Logic (1843–1872), this chapter provides an introduction to Mill’s logical theory by reviewing his position on the relationship between induction and deduction, and the role of general premises and principles in reasoning. Locating induction, understood as a kind of analogical reasoning from particulars to particulars, as the basic form of inference that is both free-standing and the sole load-bearing structure in Mill’s logic, the foundations of Mill’s logical system are briefly inspected. Several naturalistic features are identified, including its subject matter, human reasoning, its empiricism, which requires that only particular, experiential claims can function as basic reasons, and its ultimate foundations in ‘spontaneous’ inference. The chapter concludes by comparing Mill’s naturalized logic to Russell’s (1907) regressive method for identifying the premises of mathematics.
«The world is about to get rid of morality, becoming total organization that is total destruction. Progress tends to culminate in a catastrophe». These few words sum up the fears of the late Horkheimer, who is increasingly worried about the effects of the dialectic of enlightenment. The fatal outcome of such dialectic has led the world to the brink of annihilation. According to Horkheimer, the root of the dialectic of enlightenment is an instrumental reason tending to the dominion (the dominion of man over nature and the dominion of man over man). With his theoretical production, Horkheimer has tried to prevent the disastrous effects of this instrumental reason, ending up framing the longing for the Totally Other. In our opinion, the Totally Other plays an anti-dialectic function over enlightenment, having the triple meaning of transcendence as other than immanence; of utopia as an other society than the present one; and of an other reason than the instrumental reason, as a promise of a thought that is not anymore allied to the dominion. A new interpretation of the Totally Other would free its critical power, and it would be a “negative” attempt (in the sense of negative theology) to prevent the catastrophe towards which we all are pushed by what Horkheimer calls the immanent logic of history.
The goal of the article is to substantiate that despite the criticism the paradigm in economics will not change because of the axiomatic assumptions of value-free economics. How these assumptions work is demonstrated on the example of Gary Becker’s economic approach which is analyzed from the perspective of a scientific research programme. The author indicates the hard core of the economic approach and the protective belt which makes the hard core immune from any criticism. This immunity leads economists to believe that they are objective scientists and, consequently, it results in epistemological hubris. Due to its tautological nature, the economic approach is considered to be a degenerative programme. This conclusion is extended to value-free economics. In spite of these problems, many economists still believe in positive economics and they dismiss normative approaches. It has a negative influence on people. The conclusion of the article is that thanks to axiomatic assumptions economists do not have an objective and ironclad methodology and they should accept normative values in their research.
This paper presents a new analysis of C.G. Hempel’s conditions of adequacy for any relation of confirmation [Hempel C. G. (1945). Aspects of scientific explanation and other essays in the philosophy of science. New York: The Free Press, pp. 3–51.], differing from the one Carnap gave in §87 of his [1962. Logical foundations of probability (2nd ed.). Chicago: University of Chicago Press.]. Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true hypotheses and another aiming at informative hypotheses. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative hypotheses. I then show that one can have Hempel’s cake and eat it too. There is a logic that takes into account both of these two conflicting aspects. According to this logic, a sentence H is an acceptable hypothesis for evidence E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap’s analysis.
ABSTRACT: This chapter offers a revenge-free solution to the liar paradox (at the centre of which is the notion of Gestalt shift) and presents a formal representation of truth in, or for, a natural language like English, which proposes to show both why, and how, truth is coherent and how it appears to be incoherent, while preserving classical logic and most principles that some philosophers have taken to be central to the concept of truth and our use of that notion. The chapter argues that, by using a truth operator rather than a truth predicate, it is possible to provide a coherent, model-theoretic representation of truth with various desirable features. After investigating what features of liar sentences are responsible for their paradoxicality, the chapter identifies the logic as the normal modal logic KT4M (= S4M). Drawing on the structure of KT4M (= S4M), the author proposes that, pace deflationism, truth has content, that the content of truth is bivalence, and that the notions of both truth and bivalence are semideterminable.
This paper presents what the authors call the ‘divergence problem’ regarding choosing between different future possibilities. As is discussed in the first half, the central issue of the problem is the difficulty of temporally locating the ‘active cause’ on the modal divergent diagram. In the second half of this paper, we discuss the ‘second-person freedom’ which is, strictly, neither compatibilist negative freedom nor incompatibilist positive freedom. The divergence problem leads us to two hypothetical views (i.e. the view of single-line determination and that of one-off chance), and these views bring humans closer to the afree side – i.e. outside of the contrast between being free and being unfree. The afree side is greatly different from the ordinary human side. This paper tries to secure the second-person freedom as a substitute for the ordinary human freedom while preventing the divergence problem from arising.
Combinatory logic (Curry and Feys 1958) is a “variable-free” alternative to the lambda calculus. The two have the same expressive power but build their expressions differently. “Variable-free” semantics is, more precisely, “free of variable binding”: it has no operation like abstraction that turns a free variable into a bound one; it uses combinators—operations on functions—instead. For the general linguistic motivation of this approach, see the works of Steedman, Szabolcsi, and Jacobson, among others. The standard view in linguistics is that reflexive and personal pronouns are free variables that get bound by an antecedent through some coindexing mechanism. In variable-free semantics the same task is performed by some combinator that identifies two arguments of the function it operates on (a duplicator). This combinator may be built into the lexical semantics of the pronoun, into that of the antecedent, or it may be a free-floating operation applicable to predicates or larger chunks of texts, i.e. a type-shifter. This note is concerned with the case of cross-sentential anaphora. It adopts Hepple’s and Jacobson’s interpretation of pronouns as identity maps and asks how this can be extended to the cross-sentential case, assuming the dynamic semantic view of anaphora. It first outlines the possibility of interpreting indefinites that antecede non-c-commanded pronouns as existential quantifiers enriched with a duplicator. Then it argues that it is preferable to use the duplicator as a type-shifter that applies “on the fly”. The proposal has consequences for two central ingredients of the classical dynamic semantic treatment: it does away with abstraction over assignments and with treating indefinites as inherently existentially quantified. However, cross-sentential anaphora remains a matter of binding, and the idea of propositions as context change potentials is retained.
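The duplicator combinator described above can be sketched in a few lines. This is an illustrative toy, not the note’s formalism: the combinator W (standard in combinatory logic) identifies two arguments of a curried function, and the relation name `admires` and the argument `"Ann"` are invented for the example.

```python
def W(f):
    """Duplicator combinator: W f x = f x x.
    It feeds a single argument to f twice, identifying two argument
    positions without any variable binding."""
    return lambda x: f(x)(x)

# A curried two-place relation standing in for a transitive verb.
admires = lambda x: lambda y: (y, "admires", x)

# Reflexive reading "Ann admires herself", obtained by the duplicator
# rather than by a bound variable coindexed with its antecedent.
print(W(admires)("Ann"))  # ('Ann', 'admires', 'Ann')
```

The point of the sketch is only that the work usually done by coindexing a pronoun variable with its antecedent can instead be done by an operation on the function itself.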
This book treats ancient logic: the logic that originated in Greece with Aristotle and the Stoics, mainly in the hundred-year period beginning about 350 BCE. Ancient logic was never completely ignored by modern logic from its Boolean origin in the middle 1800s: it was prominent in Boole’s writings and it was mentioned by Frege and by Hilbert. Nevertheless, the first century of mathematical logic did not take it seriously enough to study the ancient logic texts. A renaissance in ancient logic studies occurred in the early 1950s with the publication of the landmark Aristotle’s Syllogistic by Jan Łukasiewicz, Oxford UP 1951, 2nd ed. 1957. Despite its title, it treats the logic of the Stoics as well as that of Aristotle. Łukasiewicz was a distinguished mathematical logician. He had created many-valued logic and the parenthesis-free prefix notation known as Polish notation. He co-authored with Alfred Tarski an important paper on the metatheory of propositional logic, and he was one of Tarski’s three main teachers at the University of Warsaw. Łukasiewicz’s stature was just short of that of the giants: Aristotle, Boole, Frege, Tarski and Gödel. No mathematical logician of his caliber had ever before quoted the actual teachings of ancient logicians. Not only did Łukasiewicz inject fresh hypotheses, new concepts, and imaginative modern perspectives into the field, his enormous prestige and that of the Warsaw School of Logic reflected on the whole field of ancient logic studies. Suddenly, this previously somewhat dormant and obscure field became active and gained in respectability and importance in the eyes of logicians, mathematicians, linguists, analytic philosophers, and historians. Next to Aristotle himself and perhaps the Stoic logician Chrysippus, Łukasiewicz is the most prominent figure in ancient logic studies. A huge literature traces its origins to Łukasiewicz.
This book, Ancient Logic and Its Modern Interpretations, is based on the 1973 Buffalo Symposium on Modernist Interpretations of Ancient Logic, the first conference devoted entirely to critical assessment of the state of ancient logic studies.
Deontic logic is standardly conceived as the logic of true statements about the existence of obligations and permissions. In his last writings on the subject, G. H. von Wright criticized this view of deontic logic, stressing the rationality of norm imposition as the proper foundation of deontic logic. The present paper is an attempt to advance such an account of deontic logic using the formal apparatus of update semantics and dynamic logic. That is, we first define norm systems and a semantics of norm performatives as transformations of the norm system. Then a static modal logic for norm propositions is defined on that basis. In the course of this exposition we stress the performative nature of (i) free choice permission, (ii) the sealing legal principle and (iii) the social nature of permission. That is, (i) granting a disjunctive permission means granting permission for both disjuncts; (ii) non-prohibition does not entail permission, but the authority can declare that whatever he does not forbid is thereby permitted; and (iii) granting permission to one person means that all others are committed not to prevent the invocation of that permission.
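Point (i) above, the performative reading of free choice permission, can be sketched as an update on a norm system. This is our toy illustration, not the paper’s formal apparatus: the representation of a norm system as a set of `("permitted", action)` records and the disjunction encoding `("or", ...)` are invented for the example.

```python
def grant_permission(norms, action):
    """Treat granting permission as a performative that transforms
    the norm system rather than as a true/false statement."""
    if isinstance(action, tuple) and action[0] == "or":
        # Free choice: granting P(p or q) grants P(p) and P(q).
        for disjunct in action[1:]:
            norms = grant_permission(norms, disjunct)
        return norms
    return norms | {("permitted", action)}

# The authority grants "you may walk or ride": both disjuncts
# end up permitted in the updated norm system.
norms = grant_permission(frozenset(), ("or", "walk", "ride"))
print(sorted(norms))  # [('permitted', 'ride'), ('permitted', 'walk')]
```

Note that on this picture an action absent from `norms` is merely non-prohibited, not permitted, matching point (ii): permission only enters the system through an explicit granting act.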
The theory of imperatives is philosophically relevant since, in building it, some long-standing problems need to be addressed, and presumably some new ones are waiting to be discovered. The relevance of the theory of imperatives for philosophical research is remarkable, but usually recognized only within the field of practical philosophy. Nevertheless, the emphasis can be put on problems of theoretical philosophy. Proper understanding of imperatives is likely to raise doubts about some of our deeply entrenched and tacit presumptions. In philosophy of language it is the presumption that declaratives provide the paradigm for sentence form; in philosophy of science it is the belief that theory construction is independent of language practice; in logic it is the conviction that logical meaning relations are constituted out of logical terminology; in ontology it is the view that language use is free from ontological commitments. The list is not exhaustive; it includes only those presumptions that this paper concerns.
Two competing models of metaaxiological justification of politics are analyzed. Politics is understood broadly, as actions which aim at organizing social life. I will be, first of all, interested in law-making activities. When I talk about metaaxiological justification I think not so much about determinations of what is good, but about determinations referring to the way the good is founded; in short: determinations which answer the question why something is good. In the first model, which is described here as objectivistic, it is assumed that determining that which is good is a matter of cognition; in the second model, which could be described here as voluntaristic or exceedingly liberal, it is assumed that determining good is not a matter of cognition but of will – something is good because it is wanted. In the latter model, the cognoscibility of good is rejected and therefore the objective criteria for evaluating which ‘will’ is better and which is worse are rejected. As a consequence, negative freedom becomes the fundamental value of social order and the basic requirement is that of maximizing the sphere of the individual’s free actions, the sphere which is free from interference by other individuals or institutions. I am going to argue that neither of these models is acceptable as a basis for organizing social life, for at least one reason: each of them leads to a certain version of totalitarianism. In the conclusion I am going to present a mixed model, which, in my opinion, reflects well the practice of democratic states. Analysis of these three models allows, first of all, a clearer identification of some of the problems appearing in making law, including procedural questions. By pointing at the interdependence of the foundations of good and law-making procedures it is argued that the choice of the concept of good (a metaaxiological choice) is prior to the choice of law-making procedures.
(1) We will begin by offering a short introduction to Epistemic Logic and presenting Fitch’s paradox in an epistemic-modal logic. (2) Then, we will proceed to presenting three Epistemic Temporal logical frameworks created by Hoshi (2009): TPAL (Temporal Public Announcement Logic), TAPAL (Temporal Arbitrary Public Announcement Logic) and TPAL+P! (Temporal Public Announcement Logic with Labeled Past Operators). We will show how Hoshi stated the Verificationist Thesis in the language of TAPAL and analyze his argument on why this version of it is immune from paradox. (3) Edgington (1985) offered an interpretation of the Verificationist Thesis that blocks Fitch’s paradox, and we will propose a way to formulate it in a TAPAL-based language. The language we will use is a combination of TAPAL and TPAL+P! with an Indefinite (Unlabeled) Past Operator (TAPAL+P!+P). Using indexed satisfiability relations (as introduced in Wang (2010; 2011)) we will offer a prospective semantics for this language. We will investigate whether the tentative reformulation of Edgington’s Verificationist Thesis in TAPAL+P!+P is free from paradox and adequate to Edgington’s ideas on how “all truths are knowable” should be interpreted.
This paper deals with higher-order vagueness in Williamson’s ‘logic of clarity’. Its aim is to prove that for ‘fixed margin models’ (W, d, α, [ ]) the notion of higher-order vagueness collapses to second-order vagueness. First, it is shown that fixed margin models can be reformulated in terms of similarity structures (W, ~). The relation ~ is assumed to be reflexive and symmetric, but not necessarily transitive. Then, it is shown that the structures (W, ~) come along with naturally defined maps h and s that define a Galois connection on the power set PW of W. These maps can be used to define two distinct boundary operators bd and BD on W. The main theorem of the paper states that higher-order vagueness with respect to bd collapses to second-order vagueness. This does not hold for BD, the iterations of which behave in quite an erratic way. In contrast, the operator bd defines a variety of tolerance principles that do not fall prey to the sorites paradox and, moreover, do not always satisfy the principles of positive and negative introspection.
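A similarity structure of the kind described can be made concrete in a few lines. This is our illustrative reconstruction, not the paper’s definitions: the particular five-world structure, the choice of ~ as "differ by at most 1", and the reading of h as "clearly" and s as "possibly" are assumptions made for the example.

```python
W = {0, 1, 2, 3, 4}

def sim(v, w):
    """A reflexive, symmetric, non-transitive similarity relation."""
    return abs(v - w) <= 1

def h(A):
    """Worlds all of whose ~-neighbours lie in A ('clearly A')."""
    return {w for w in W if all(v in A for v in W if sim(v, w))}

def s(A):
    """Worlds with at least one ~-neighbour in A ('possibly A')."""
    return {w for w in W if any(v in A for v in W if sim(v, w))}

def bd(A):
    """A boundary operator: possibly-A worlds that are not clearly A."""
    return s(A) - h(A)

A = {0, 1, 2}
print(sorted(bd(A)))  # [2, 3]: the borderline zone of A
```

Since ~ is not transitive, h and s are not idempotent, which is what makes iterating such boundary operators, and hence higher-order vagueness, a substantive question.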
The Free Choice effect---whereby <>(p or q) seems to entail both <>p and <>q---has traditionally been characterized as a phenomenon affecting the deontic modal "may". This paper presents an extension of the semantic account of free choice defended in Fusco (2015) to the agentive modal "can", the "can" which, intuitively, describes an agent's powers. I begin by sketching a model of inexact ability, which grounds a modal approach to agency (Belnap & Perloff 1998; Belnap, Perloff, and Xu 2001) in a Williamson (1992, 2014)-style margin of error. A classical propositional semantics combined with this framework can reflect the intuitions highlighted by Kenny (1976)'s much-discussed dartboard cases, as well as the counterexamples to simple conditional views recently discussed by Mandelkern, Schultheis, and Boylan (2017). In Section 3, I turn to an actual-world-sensitive account of disjunction, and show how it extends free choice inferences into an object language for propositional modal logic.
Future Logic is an original and wide-ranging treatise of formal logic. It deals with deduction and induction of categorical and conditional propositions, involving the natural, temporal, extensional, and logical modalities. Traditional and modern logic have covered in detail only formal deduction from actual categoricals, or from logical conditionals (conjunctives, hypotheticals, and disjunctives). Deduction from modal categoricals has also been considered, though very vaguely and roughly; whereas deduction from natural, temporal and extensional forms of conditioning has been all but totally ignored. As for induction, apart from the elucidation of adductive processes (the scientific method), almost no formal work has been done. This is the first work ever to strictly formalize the inductive processes of generalization and particularization, through the novel methods of factorial analysis, factor selection and formula revision. It is also the first work ever to develop a formal logic of the natural, temporal and extensional types of conditioning (as distinct from logical conditioning), including their production from modal categorical premises. Future Logic contains a great many other new discoveries, organized into a unified, consistent and empirical system, with precise definitions of the various categories and types of modality (including logical modality), and full awareness of the epistemological and ontological issues involved. Though strictly formal, it uses ordinary language wherever symbols can be avoided. Among its other contributions: a full list of the valid modal syllogisms (which is more restrictive than previous lists); the main formalities of the logic of change (which introduces a dynamic instead of merely static approach to classification); the first formal definitions of the modal types of causality; a new theory of class logic, free of the Russell Paradox; as well as a critical review of modern metalogic.
But it is impossible to list briefly all the innovations in logical science, and therefore in epistemology and ontology, that this book presents; it has to be read for its scope to be appreciated.
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts). Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable.
The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction. Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items). This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. 
This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation. Phase III: Software Assisted Analysis. The approach in the second phase was very ‘manual’ and time consuming; the third phase is intended to ‘mechanize’ much of the work involved by means of spreadsheets (to begin with). This increases reliability of calculations (though no errors were found, in fact) – but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, now each modus can be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish! Its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, being truly fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions ‘mechanically’ (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done ‘manually’, and therefore not exhaustively and with certainty. 
It took over 72,000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has now definitely been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
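The 'grand matrix' and 'modus' machinery described in the phases above can be illustrated for the smallest case. This is our rough sketch, not Sion's actual tables: for two items P and Q there are four conjunctive cells, and a modus assigns each cell possible (1) or impossible (0); the particular conditions used below for complete causation are an assumed simplification of the book's definition.

```python
from itertools import product

# The four conjunctions of two items and their negations:
# the cells of the 2-item grand matrix.
cells = ["P.Q", "P.notQ", "notP.Q", "notP.notQ"]

# A 'modus' marks each cell possible (1) or impossible (0);
# enumerating them all gives the 2-item framework.
moduses = list(product([0, 1], repeat=len(cells)))
print(len(moduses))  # 16 moduses in the 2-item framework

# Roughly, complete causation of Q by P requires P without Q to be
# impossible, while P with Q and notP with notQ remain possible.
complete = [m for m in moduses if m[1] == 0 and m[0] == 1 and m[3] == 1]
print(len(complete))  # 2 moduses fit these conditions
```

A causative form can then be identified with its set of alternative moduses, and oppositions or syllogisms checked by set operations over those lists, which is what makes the approach mechanizable in spreadsheets.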
How does vagueness interact with metaphysical modality and with restrictions of it, such as nomological modality? In particular, how do definiteness, necessity (understood as restricted in some way or not), and actuality interact? This paper proposes a model-theoretic framework for investigating the logic and semantics of that interaction. The framework is put forward in an ecumenical spirit: it is intended to be applicable to all theories of vagueness that express vagueness using a definiteness (or: determinacy) operator. We will show how epistemicists, supervaluationists, and theorists of metaphysical vagueness like Barnes and Williams (2010) can interpret the framework. We will also present a complete axiomatization of the logic we recommend to both epistemicists and local supervaluationists.
We provide a direct method for proving Craig interpolation for a range of modal and intuitionistic logics, including those containing a "converse" modality. We demonstrate this method for classical tense logic, its extensions with path axioms, and for bi-intuitionistic logic. These logics do not have straightforward formalisations in the traditional Gentzen-style sequent calculus, but have all been shown to have cut-free nested sequent calculi. The proof of the interpolation theorem uses these calculi and is purely syntactic, without resorting to embeddings, semantic arguments, or interpreted connectives external to the underlying logical language. A novel feature of our proof includes an orthogonality condition for defining duality between interpolants.
The determinism-free will debate is perhaps as old as philosophy itself and has been engaged in from a great variety of points of view, including those of scientific, theological, and logical character. This chapter focuses on two arguments from logic. First, there is an argument in support of determinism that dates back to Aristotle, if not earlier. It rests on acceptance of the Law of Excluded Middle, according to which every proposition is either true or false, no matter whether the proposition is about the past, present or future. In particular, the argument goes, whatever one does or does not do in the future is determined in the present by the truth or falsity of the corresponding proposition. The second argument coming from logic is much more modern and appeals to Gödel's incompleteness theorems to make the case against determinism and in favour of free will, insofar as that applies to the mathematical potentialities of human beings. The claim, more precisely, is that as a consequence of the incompleteness theorems, those potentialities cannot be exactly circumscribed by the output of any computing machine, even allowing unlimited time and space for its work. The chapter concludes with some new considerations that may be in favour of a partial mechanist account of the mathematical mind.