Prevailing opinion, defended by Jason Brennan and others, is that voting to change the outcome is irrational, since although the payoffs of tipping an election can be quite large, the probability of doing so is extraordinarily small. This paper argues that prevailing opinion is incorrect. Voting is shown to be rational so long as two conditions are satisfied: first, the average social benefit of electing the better candidate must be at least twice as great as the individual cost of voting, and second, the chance of casting the decisive vote must be at least 1/N, where N stands for the number of citizens. It is argued that both of these conditions are often true in the real world.
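On the usual expected-value reading (a reconstruction offered for illustration, not necessarily the paper's own derivation), the two conditions jointly guarantee that the expected social benefit of a vote exceeds its cost. Writing $N$ for the number of citizens, $B$ for the average per-citizen benefit of electing the better candidate, $c$ for the individual cost of voting, and $p$ for the probability of being decisive, the conditions $p \geq 1/N$ and $B \geq 2c$ give
\[ p \cdot N B \;\geq\; \tfrac{1}{N} \cdot N B \;=\; B \;\geq\; 2c \;>\; c, \]
so the expected benefit of voting dominates its cost even before the factor of two does any further work in the argument.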
There is a well-known moral quandary concerning how to account for the rightness or wrongness of acts that clearly contribute to some morally significant outcome – but which each seem too small, individually, to make any meaningful difference. One consequentialist-friendly response to this problem is to deny that there could ever be a case of this type. This paper pursues this general strategy, but in an unusual way. Existing arguments for the consequentialist-friendly position are sorites-style arguments. Such arguments imagine varying a subject’s predicament bit by bit until it is clear that a relevant difference has been achieved. The arguments offered in this paper are structurally different, and do not rely on any sorites series. For this reason, they are not vulnerable to objections that have been leveled against the sorites-style arguments.
This paper is about how to aggregate outside opinion. If two experts are on one side of an issue, while three experts are on the other side, what should a non-expert believe? Certainly, the non-expert should take into account more than just the numbers. But which other factors are relevant, and why? According to the view developed here, one important factor is whether the experts should have been expected, in advance, to reach the same conclusion. When the agreement of two (or of twenty) thinkers can be predicted with certainty in advance, their shared belief is worth only as much as one of their beliefs would be worth alone. This expectational model of belief dependence can be applied whether we think in terms of credences or in terms of all-or-nothing beliefs.
Conciliationism faces a challenge that has not been satisfactorily addressed. There are clear cases of epistemically significant merely possible disagreement, but there are also clear cases where merely possible disagreement is epistemically irrelevant. Conciliationists have not yet accounted for this asymmetry. In this paper, we propose that the asymmetry can be explained by positing a selection constraint on all cases of peer disagreement, whether actual or merely possible. If a peer’s opinion was not selected in accordance with the proposed constraint, then it lacks epistemic significance. This allows us to distinguish the epistemically significant cases of merely possible disagreement from the insignificant ones.
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
Hilbert’s program was an ambitious and wide-ranging project in the philosophy and foundations of mathematics. In order to “dispose of the foundational questions in mathematics once and for all,” Hilbert proposed a two-pronged approach in 1921: first, classical mathematics should be formalized in axiomatic systems; second, using only restricted, “finitary” means, one should give proofs of the consistency of these axiomatic systems. Although Gödel’s incompleteness theorems show that the program as originally conceived cannot be carried out, it had many partial successes, and generated important advances in logical theory and metatheory, both at the time and since. The article discusses the historical background and development of Hilbert’s program, its philosophical underpinnings and consequences, and its subsequent development and influences since the 1930s.
Should we believe our controversial philosophical views? Recently, several authors have argued from broadly conciliationist premises that we should not. If they are right, we philosophers face a dilemma: If we believe our views, we are irrational. If we do not, we are not sincere in holding them. This paper offers a way out, proposing an attitude we can rationally take toward our views that can support sincerity of the appropriate sort. We should arrive at our views via a certain sort of ‘insulated’ reasoning – that is, reasoning that involves setting aside certain higher-order worries, such as those provided by disagreement – when we investigate philosophical questions.
The period from 1900 to 1935 was particularly fruitful and important for the development of logic and logical metatheory. This survey is organized along eight "itineraries" concentrating on historically and conceptually linked strands in this development. Itinerary I deals with the evolution of conceptions of axiomatics. Itinerary II centers on the logical work of Bertrand Russell. Itinerary III presents the development of set theory from Zermelo onward. Itinerary IV discusses the contributions of the algebra of logic tradition, in particular, Löwenheim and Skolem. Itinerary V surveys the work in logic connected to the Hilbert school, and Itinerary VI deals specifically with consistency proofs and metamathematics, including the incompleteness theorems. Itinerary VII traces the development of intuitionistic and many-valued logics. Itinerary VIII surveys the development of semantical notions from the early work on axiomatics up to Tarski's work on truth.
On some accounts of vagueness, predicates like “is a heap” are tolerant. That is, their correct application tolerates sufficiently small changes in the objects to which they are applied. Of course, such views face the sorites paradox, and various solutions have been proposed. One proposed solution involves banning repeated appeals to tolerance, while affirming tolerance in any individual case. In effect, this solution rejects the reasoning of the sorites argument. This paper discusses a thorny problem afflicting this approach to vagueness. In particular, it is shown that, on the foregoing view, whether an object is a heap will sometimes depend on factors extrinsic to that object, such as whether its components came from other heaps. More generally, the paper raises the issue of how to count heaps in a tolerance-friendly framework.
Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator εx. Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions of existential theorems obtained by this elimination procedure.
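For orientation, the standard definitions (not specific to this paper) are as follows: the term $\varepsilon x\, A(x)$ is intended to denote a witness for $A(x)$ if there is one, the characteristic axioms of the ε-calculus are the critical formulas, and the quantifiers become definable by the ε-translation:
\[ A(t) \rightarrow A(\varepsilon x\, A(x)), \qquad \exists x\, A(x) \leftrightarrow A(\varepsilon x\, A(x)), \qquad \forall x\, A(x) \leftrightarrow A(\varepsilon x\, \neg A(x)). \]
The first and second epsilon theorems then state, roughly, that this extension is conservative: quantifier-free (respectively, ε-free) theorems provable with the help of ε-terms and critical formulas are already provable without them.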
David Hilbert's finitistic standpoint is a conception of elementary number theory designed to answer the intuitionist doubts regarding the security and certainty of mathematics. Hilbert was unfortunately not exact in delineating what that viewpoint was, and Hilbert himself changed his usage of the term through the 1920s and 30s. The purpose of this paper is to outline what the main problems are in understanding Hilbert and Bernays on this issue, based on some publications by them which have so far received little attention, and on a number of philosophical reconstructions of the viewpoint (in particular, by Hand, Kitcher, and Tait).
Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
On the heels of Franzén's fine technical exposition of Gödel's incompleteness theorems and related topics (Franzén 2004) comes this survey of the incompleteness theorems aimed at a general audience. Gödel's Theorem: An Incomplete Guide to its Use and Abuse is an extended and self-contained exposition of the incompleteness theorems and a discussion of what informal consequences can, and in particular cannot, be drawn from them.
This chapter describes Kurt Gödel's paper on the incompleteness theorems. Gödel's incompleteness results are two of the most fundamental and important contributions to logic and the foundations of mathematics. It had been assumed that first-order number theory is complete in the sense that any sentence in the language of number theory would be either provable from the axioms or refutable. Gödel's first incompleteness theorem showed that this assumption was false: it states that there are sentences of number theory that are neither provable nor refutable. The first theorem is general in the sense that it applies to any axiomatic theory which is ω-consistent, has an effective proof procedure, and is strong enough to represent basic arithmetic. Their importance lies in their generality: although proved specifically for extensions of one particular system, the method Gödel used is applicable in a wide variety of circumstances. Gödel's results had a profound influence on the further development of the foundations of mathematics. They pointed the way to a reconceptualization of the view of axiomatic foundations.
Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917-1923. The aim of this paper is to describe these results, focussing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
Although Kurt Gödel does not figure prominently in the history of computability theory, he exerted a significant influence on some of the founders of the field, both through his published work and through personal interaction. In particular, Gödel’s 1931 paper on incompleteness and the methods developed therein were important for the early development of recursive function theory and the lambda calculus at the hands of Church, Kleene, and Rosser. Church and his students studied Gödel 1931, and Gödel taught a seminar at Princeton in 1934. Seen in the historical context, Gödel was an important catalyst for the emergence of computability theory in the mid 1930s.
Epstein and Carnielli's fine textbook on logic and computability is now in its second edition. The readers of this journal might be particularly interested in the timeline 'Computability and Undecidability' added in this edition, and the included wall-poster of the same title. The text itself, however, has some aspects which are worth commenting on.
In Winter 2017, the first author piloted a course in formal logic in which we aimed to (a) improve student engagement and mastery of the content, and (b) reduce maths anxiety and its negative effects on student outcomes, by adopting student-oriented teaching, including peer instruction and classroom flipping techniques. The course implemented a partially flipped approach, and incorporated group-work and peer learning elements, while retaining some of the traditional lecture format. In this way, a wide variety of student learning preferences could be provided for.
A textbook for modal and other intensional logics based on the Open Logic Project. It covers normal modal logics, relational semantics, axiomatic and tableaux proof systems, intuitionistic logic, and counterfactual conditionals.
Textbook on Gödel’s incompleteness theorems and computability theory, based on the Open Logic Project. Covers recursive function theory, arithmetization of syntax, the first and second incompleteness theorems, models of arithmetic, second-order logic, and the lambda calculus.
An introductory textbook on metalogic. It covers naive set theory, first-order logic, sequent calculus and natural deduction, the completeness, compactness, and Löwenheim-Skolem theorems, Turing machines, and the undecidability of the halting problem and of first-order logic. The audience is undergraduate students with some background in formal logic.
In The Boundary Stones of Thought, Rumfitt defends classical logic against challenges from intuitionistic mathematics and vagueness, using a semantics of pre-topologies on possibilities, and a topological semantics on predicates, respectively. These semantics are suggestive, but the characterizations of negation face difficulties that may undermine their usefulness in Rumfitt’s project.
Roger White (2015) sketches an ingenious new solution to the problem of induction. He argues from the principle of indifference for the conclusion that the world is more likely to be induction-friendly than induction-unfriendly. But there is reason to be skeptical about the proposed indifference-based vindication of induction. It can be shown that, in the crucial test cases White concentrates on, the assumption of indifference renders induction no more accurate than random guessing. After discussing this result, the paper explains why the indifference-based argument seemed so compelling, despite ultimately being unsound.
Priest has provided a simple tableau calculus for Chellas's conditional logic Ck. We provide rules which, when added to Priest's system, result in tableau calculi for Chellas's CK and Lewis's VC. Completeness of these tableaux, however, relies on the cut rule.
In the 1920s, David Hilbert proposed a research program with the aim of providing mathematics with a secure foundation. This was to be accomplished by first formalizing logic and mathematics in their entirety, and then showing, using only so-called finitistic principles, that these formalizations are free of contradictions. In the area of logic, the Hilbert school accomplished major advances both in introducing new systems of logic and in developing central metalogical notions, such as completeness and decidability. The analysis of unpublished material presented in Chapter 2 shows that a completeness proof for propositional logic was found by Hilbert and his assistant Paul Bernays already in 1917-18, and that Bernays's contribution was much greater than is commonly acknowledged. Aside from logic, the main technical contributions of Hilbert's Program are the development of formal mathematical theories and proof-theoretical investigations thereof, in particular, consistency proofs. In this respect Wilhelm Ackermann's 1924 dissertation is a milestone both in the development of the Program and in proof theory in general. Ackermann gives a consistency proof for a second-order version of primitive recursive arithmetic which, surprisingly, explicitly uses a finitistic version of transfinite induction up to ω^(ω^ω). He also gave a faulty consistency proof for a system of second-order arithmetic based on Hilbert's ε-substitution method. Detailed analyses of both proofs in Chapter 3 shed light on the development of finitism and proof theory in the 1920s as practiced in Hilbert's school. In a series of papers, Charles Parsons has attempted to map out a notion of mathematical intuition which he also brings to bear on Hilbert's finitism. According to him, mathematical intuition fails to be able to underwrite the kind of intuitive knowledge Hilbert thought was attainable by the finitist. It is argued in Chapter 4 that the extent of finitistic knowledge which intuition can provide is broader than Parsons supposes. According to another influential analysis of finitism due to W. W. Tait, finitist reasoning coincides with primitive recursive reasoning. The acceptance of non-primitive recursive methods in Ackermann's dissertation presented in Chapter 3, together with additional textual evidence presented in Chapter 4, shows that this identification is untenable as far as Hilbert's conception of finitism is concerned. Tait's conception, however, differs from Hilbert's in important respects, yet it is also open to criticisms leading to the conclusion that finitism encompasses more than just primitive recursive reasoning.
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic, including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
Angell's logic of analytic containment AC has been shown to be characterized by a 9-valued matrix NC by Ferguson, and by a 16-valued matrix by Fine. We show that the former is the image of a surjective homomorphism from the latter, i.e., an epimorphic image. The epimorphism was found with the help of MUltlog, which also provides a tableau calculus for NC extended by quantifiers that generalize conjunction and disjunction.
A construction principle for natural deduction systems for arbitrary, finitely-many-valued first order logics is exhibited. These systems are systematically obtained from sequent calculi, which in turn can be automatically extracted from the truth tables of the logics under consideration. Soundness and cut-free completeness of these sequent calculi translate into soundness, completeness, and normal-form theorems for natural deduction systems.
Internal Logic brings together several threads of Yvon Gauthier's work on the foundations of mathematics and revisits his attempt to, as he puts it, radicalize Hilbert's Program. A radicalization of Hilbert's Program, I take it, is supposed to take Hilbert's finitary viewpoint more seriously than other attempts to salvage Hilbert's Program have. Such a return to the "roots of Hilbert's metamathematical idea" will, so claims Gauthier, enable him to save Hilbert's Program.
Any intermediate propositional logic can be extended to a calculus with epsilon- and tau-operators and critical formulas. For classical logic, this results in Hilbert’s $\varepsilon$-calculus. The first and second $\varepsilon$-theorems for classical logic establish conservativity of the $\varepsilon$-calculus over its classical base logic. It is well known that the second $\varepsilon$-theorem fails for the intuitionistic $\varepsilon$-calculus, as prenexation is impossible. The paper investigates the effect of adding critical $\varepsilon$- and $\tau$-formulas and using the translation of quantifiers into $\varepsilon$- and $\tau$-terms to intermediate logics. It is shown that conservativity over the propositional base logic also holds for such intermediate $\varepsilon\tau$-calculi. The “extended” first $\varepsilon$-theorem holds if the base logic is finite-valued Gödel–Dummett logic, and fails otherwise, but holds for certain provable formulas in infinite-valued Gödel logic. The second $\varepsilon$-theorem also holds for finite-valued first-order Gödel logics. The methods used to prove the extended first $\varepsilon$-theorem for infinite-valued Gödel logic suggest applications to theories of arithmetic.
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence computational, treatment has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
Entailment in propositional Gödel logics can be defined in a natural way. While all infinite sets of truth values yield the same sets of tautologies, the entailment relations differ. It is shown that there is a rich structure of infinite-valued Gödel logics, only one of which is compact. It is also shown that the compact infinite-valued Gödel logic is the only one which interpolates, and the only one with an r.e. entailment relation.
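For reference, over a truth-value set $V \subseteq [0,1]$ containing 0 and 1, the Gödel truth functions are the standard ones, and the natural entailment relation mentioned here can be read as preservation of the designated value 1 (the paper's official definition should be consulted):
\[ v(A \wedge B) = \min(v(A), v(B)), \qquad v(A \vee B) = \max(v(A), v(B)), \]
\[ v(A \rightarrow B) = \begin{cases} 1 & \text{if } v(A) \leq v(B), \\ v(B) & \text{otherwise,} \end{cases} \qquad v(\neg A) = \begin{cases} 1 & \text{if } v(A) = 0, \\ 0 & \text{otherwise.} \end{cases} \]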
It is shown that Gqp↑, the quantified propositional Gödel logic based on the truth-value set V↑ = {1 - 1/n : n≥1}∪{1}, is decidable. This result is obtained by reduction to Büchi's theory S1S. An alternative proof based on elimination of quantifiers is also given, which yields both an axiomatization and a characterization of Gqp↑ as the intersection of all finite-valued quantified propositional Gödel logics.
The first-order temporal logics with □ and ○ of time structures isomorphic to ω (discrete linear time) and trees of ω-segments (linear time with branching gaps), and some of their fragments, are compared: the first is not recursively axiomatizable. For the second, a cut-free complete sequent calculus is given, and from this, a resolution system is derived by the method of Maslov.
Propositional logics in general, considered as a set of sentences, can be undecidable even if they have “nice” representations, e.g., are given by a calculus. Even decidable propositional logics can be computationally complex (e.g., already intuitionistic logic is PSPACE-complete). On the other hand, finite-valued logics are computationally relatively simple (at worst NP). Moreover, finite-valued semantics are simple, and general methods for theorem proving exist. This raises the question to what extent and under what circumstances propositional logics represented in various ways can be approximated by finite-valued logics. It is shown that the minimal m-valued logic for which a given calculus is strongly sound can be calculated. It is also investigated under which conditions propositional logics can be characterized as the intersection of (effectively given) sequences of finite-valued logics.
The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ k: Is there a proof of length ≤ l?" When restricted to proofs with universal or existential cuts, this problem is shown to be (1) undecidable for linear or tree-like LK-proofs (corresponds to the undecidability of second-order unification), (2) undecidable for linear LKB-proofs (corresponds to the undecidability of semi-unification), and (3) decidable for tree-like LKB-proofs (corresponds to a decidable subproblem of semi-unification).
All first-order Gödel logics G_V with globalization operator based on truth value sets V ⊆ [0,1] where 0 and 1 lie in the perfect kernel of V are axiomatized by Ciabattoni’s hypersequent calculus HGIF.
It is shown that the infinite-valued first-order Gödel logic G° based on the set of truth values {1/k : k ∈ ω \ {0}} ∪ {0} is not r.e. The logic G° is the same as that obtained from the Kripke semantics for first-order intuitionistic logic with constant domains and where the order structure of the model is linear. From this, the unaxiomatizability of Kröger's temporal logic of programs (even of the fragment without the nexttime operator ○) and of the authors' temporal logic of linear discrete time with gaps follows.
The problem of approximating a propositional calculus is to find many-valued logics which are sound for the calculus (i.e., all theorems of the calculus are tautologies) with as few tautologies as possible. This has potential applications for representing (computationally complex) logics used in AI by (computationally easy) many-valued logics. It is investigated how far this method can be carried using (1) one or (2) an infinite sequence of many-valued logics. It is shown that the optimal candidate matrices for (1) can be computed from the calculus.
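As a toy illustration of the soundness notion in play (a hypothetical Python sketch, not code from the paper): a finite-valued matrix is sound for a calculus if every theorem of the calculus is a tautology of the matrix, and for a finite matrix this can be checked axiom by axiom by brute force. The three-valued Gödel matrix with designated value 1 serves as the example here.

from itertools import product

# Truth values of the 3-valued Goedel matrix; 1.0 is the designated value.
VALUES = (0.0, 0.5, 1.0)
DESIGNATED = {1.0}

def impl(a, b):
    # Goedel implication: 1 if a <= b, otherwise b.
    return 1.0 if a <= b else b

def neg(a):
    # Goedel negation: 1 if a is 0, otherwise 0.
    return 1.0 if a == 0.0 else 0.0

def is_tautology(formula, num_vars):
    # A formula is a tautology if it takes a designated value
    # under every assignment of truth values to its variables.
    return all(formula(*vals) in DESIGNATED
               for vals in product(VALUES, repeat=num_vars))

# The axiom A -> (B -> A) is a 3-valued tautology ...
print(is_tautology(lambda a, b: impl(a, impl(b, a)), 2))  # True
# ... but excluded middle A v ~A is not (refuted at value 0.5).
print(is_tautology(lambda a: max(a, neg(a)), 1))          # False

A matrix like this one is sound for a calculus for intuitionistic logic (every intuitionistic theorem takes value 1 under all assignments) while excluding some classical tautologies, which is the kind of trade-off the approximation problem optimizes.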
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
The use of the symbol ∨ for disjunction in formal logic is ubiquitous. Where did it come from? The paper details the evolution of the symbol ∨ in its historical and logical context. Some sources say that disjunction in its use as connecting propositions or formulas was introduced by Peano; others suggest that it originated as an abbreviation of the Latin word for “or”, vel. We show that the origin of the symbol ∨ for disjunction can be traced to Whitehead and Russell’s pre-Principia work in formal logic. Because of Principia’s influence, its notation was widely adopted by philosophers working in logic (the logical empiricists in the 1920s and 1930s, especially Carnap and early Quine). Hilbert’s adoption of ∨ in his Grundzüge der theoretischen Logik guaranteed its widespread use by mathematical logicians. The origins of other logical symbols are also discussed.
forall x: Calgary is a full-featured textbook on formal logic. It covers key notions of logic such as consequence and validity of arguments, the syntax of truth-functional propositional logic TFL and truth-table semantics, the syntax of first-order (predicate) logic FOL with identity (first-order interpretations), translating (formalizing) English in TFL and FOL, and Fitch-style natural deduction proof systems for both TFL and FOL. It also deals with some advanced topics such as truth-functional completeness and modal logic. Exercises with solutions are available. It is provided in PDF (for screen reading, printing, and a special version for dyslexics) and in LaTeX source code.
Heinrich Behmann (1891-1970) obtained his Habilitation under David Hilbert in Göttingen in 1921 with a thesis on the decision problem. In his thesis, he solved - independently of Löwenheim and Skolem's earlier work - the decision problem for monadic second-order logic in a framework that combined elements of the algebra of logic and the newer axiomatic approach to logic then being developed in Göttingen. In a talk given in 1921, he outlined this solution, but also presented important programmatic remarks on the significance of the decision problem and of decision procedures more generally. The text of this talk as well as a partial English translation are included.
An introductory logic textbook, with (more than) a dash of philosophy of logic, produced as a revised and expanded version of the book Forallx: Calgary. This is the version of October 13, 2022. Comments, criticisms, corrections, and suggestions are very welcome.
A general class of labeled sequent calculi is investigated, and necessary and sufficient conditions are given for when such a calculus is sound and complete for a finite-valued logic if the labels are interpreted as sets of truth values. Furthermore, it is shown that any finite-valued logic can be given an axiomatization by such a labeled calculus using arbitrary "systems of signs," i.e., of sets of truth values, as labels. The number of labels needed is logarithmic in the number of truth values, and it is shown that this bound is tight.
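To illustrate just the counting claim (an example of the bound, not the paper's construction): for a logic with four truth values $t_0, t_1, t_2, t_3$, two signs already separate all the values, e.g.
\[ S_1 = \{t_2, t_3\}, \qquad S_2 = \{t_1, t_3\}, \]
since each $t_i$ is uniquely determined by its pattern of membership in $S_1$ and $S_2$ (the binary digits of $i$); in general $\lceil \log_2 n \rceil$ such sets suffice to distinguish $n$ truth values, matching the stated logarithmic bound.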
A uniform construction for sequent calculi for finite-valued first-order logics with distribution quantifiers is exhibited. Completeness, cut-elimination and midsequent theorems are established. As an application, an analog of Herbrand’s theorem for the four-valued knowledge-representation logic of Belnap and Ginsberg is presented. It is indicated how this theorem can be used for reasoning about knowledge bases with incomplete and inconsistent information.
The aim of this paper is to emphasize the fact that for all finitely-many-valued logics there is a completely systematic relation between sequent calculi and tableau systems. More importantly, we show that for both of these systems there are always two dual proof systems (not just two ways to interpret the calculi). This phenomenon may easily escape one’s attention since in the classical (two-valued) case the two systems coincide. (In two-valued logic the assignment of a truth value and the exclusion of the opposite truth value describe the same situation.)