What should a person do when, through no fault of her own, she ends up believing a false moral theory? Some suggest that she should act against what the false theory recommends; others argue that she should follow her rationally held moral beliefs. While the former view better accords with intuitions about cases, the latter one seems to enjoy a critical advantage: It seems better able to render moral requirements ‘followable’ or ‘action-guiding.’ But this tempting thought proves difficult to justify. Indeed, whether it can be justified turns out to depend importantly on the rational status of epistemic akrasia. Furthermore, it can be argued, from premises all parties to the moral ignorance debate should accept, that rational epistemic akrasia is possible. If the argument proves successful, it follows that a person should sometimes act against her rationally held moral convictions.
Should we believe our controversial philosophical views? Recently, several authors have argued from broadly conciliationist premises that we should not. If they are right, we philosophers face a dilemma: If we believe our views, we are irrational. If we do not, we are not sincere in holding them. This paper offers a way out, proposing an attitude we can rationally take toward our views that can support sincerity of the appropriate sort. We should arrive at our views via a certain sort of ‘insulated’ reasoning – that is, reasoning that involves setting aside certain higher-order worries, such as those provided by disagreement – when we investigate philosophical questions.
There is a well-known moral quandary concerning how to account for the rightness or wrongness of acts that clearly contribute to some morally significant outcome – but which each seem too small, individually, to make any meaningful difference. One consequentialist-friendly response to this problem is to deny that there could ever be a case of this type. This paper pursues this general strategy, but in an unusual way. Existing arguments for the consequentialist-friendly position are sorites-style arguments. Such arguments imagine varying a subject’s predicament bit by bit until it is clear that a relevant difference has been achieved. The arguments offered in this paper are structurally different, and do not rely on any sorites series. For this reason, they are not vulnerable to objections that have been leveled against the sorites-style arguments.
This paper is about how to aggregate outside opinion. If two experts are on one side of an issue, while three experts are on the other side, what should a non-expert believe? Certainly, the non-expert should take into account more than just the numbers. But which other factors are relevant, and why? According to the view developed here, one important factor is whether the experts should have been expected, in advance, to reach the same conclusion. When the agreement of two (or of twenty) thinkers can be predicted with certainty in advance, their shared belief is worth only as much as one of their beliefs would be worth alone. This expectational model of belief dependence can be applied whether we think in terms of credences or in terms of all-or-nothing beliefs.
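The expectational idea can be sketched in code. The discounting rule below is a hypothetical toy (simple log-odds pooling with an invented dependence weight), not the model developed in the paper; all function names are mine.

```python
import math

def log_odds(p):
    return math.log(p / (1 - p))

def pooled_credence(prior, expert_credences, expected_agreement):
    """Toy log-odds pooling with a dependence discount.

    expected_agreement: the probability, judged in advance, that the
    experts would reach the same conclusion regardless of the truth.
    At 1.0 the group counts as a single voice; at 0.0 each expert
    counts fully. (Illustrative formula only -- not the paper's model.)
    """
    total = log_odds(prior)
    for i, p in enumerate(expert_credences):
        # The first expert counts fully; later ones are discounted by
        # how predictable their agreement was in advance.
        weight = 1.0 if i == 0 else (1.0 - expected_agreement)
        total += weight * (log_odds(p) - log_odds(prior))
    return 1 / (1 + math.exp(-total))

# Two experts at 0.8 whose agreement was fully predictable count as one:
assert abs(pooled_credence(0.5, [0.8, 0.8], 1.0) - 0.8) < 1e-9
```

With `expected_agreement` at 0.0 the same two experts push the pooled credence above 0.9, mirroring the abstract's claim that independence is what makes additional agreeing voices evidentially valuable.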
On some accounts of vagueness, predicates like “is a heap” are tolerant. That is, their correct application tolerates sufficiently small changes in the objects to which they are applied. Of course, such views face the sorites paradox, and various solutions have been proposed. One proposed solution involves banning repeated appeals to tolerance, while affirming tolerance in any individual case. In effect, this solution rejects the reasoning of the sorites argument. This paper discusses a thorny problem afflicting this approach to vagueness. In particular, it is shown that, on the foregoing view, whether an object is a heap will sometimes depend on factors extrinsic to that object, such as whether its components came from other heaps. More generally, the paper raises the issue of how to count heaps in a tolerance-friendly framework.
Conciliationism faces a challenge that has not been satisfactorily addressed. There are clear cases of epistemically significant merely possible disagreement, but there are also clear cases where merely possible disagreement is epistemically irrelevant. Conciliationists have not yet accounted for this asymmetry. In this paper, we propose that the asymmetry can be explained by positing a selection constraint on all cases of peer disagreement—whether actual or merely possible. If a peer’s opinion was not selected in accordance with the proposed constraint, then it lacks epistemic significance. This allows us to distinguish the epistemically significant cases of merely possible disagreement from the insignificant ones.
Roger White (2015) sketches an ingenious new solution to the problem of induction. He argues from the principle of indifference for the conclusion that the world is more likely to be induction-friendly than induction-unfriendly. But there is reason to be skeptical about the proposed indifference-based vindication of induction. It can be shown that, in the crucial test cases White concentrates on, the assumption of indifference renders induction no more accurate than random guessing. After discussing this result, the paper explains why the indifference-based argument seemed so compelling, despite ultimately being unsound.
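The chance-level claim can be checked with a small computation. Assuming a toy reconstruction in which "indifference" means a uniform prior over all 0/1 sequences (my simplification, not White's own formulation), an inductive rule that projects the observed majority scores exactly at chance:

```python
from itertools import product

def induction_accuracy(n):
    """Uniform (indifferent) prior over all 0/1 sequences of length n+1.
    The inductive rule predicts that the majority bit observed so far
    continues; ties are broken by predicting 1. Returns its accuracy.
    (Toy reconstruction of the test case, not White's formulation.)
    """
    hits = total = 0
    for seq in product((0, 1), repeat=n + 1):
        history, nxt = seq[:n], seq[n]
        guess = 1 if sum(history) * 2 >= n else 0
        hits += (guess == nxt)
        total += 1
    return hits / total

# Under indifference every continuation is equally likely, so for each
# history the fixed inductive guess is right in exactly half the worlds:
assert induction_accuracy(5) == 0.5
```

The same 0.5 comes out for any prediction rule that is a function of the history, which is the structural point behind the skeptical result.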
Argument mapping is a way of diagramming the logical structure of an argument to explicitly and concisely represent reasoning. The use of argument mapping in critical thinking instruction has increased dramatically in recent decades. This paper overviews the innovation and provides a procedural approach for new teachers wanting to use argument mapping in the classroom. A brief history of argument mapping is provided at the end of this paper.
In The Boundary Stones of Thought, Rumfitt defends classical logic against challenges from intuitionistic mathematics and vagueness, using a semantics of pre-topologies on possibilities, and a topological semantics on predicates, respectively. These semantics are suggestive, but the characterizations of negation face difficulties that may undermine their usefulness in Rumfitt’s project.
A natural view of testimony holds that a source's statements provide one with evidence about what the source believes, which in turn provides one with evidence about what is true. But some theorists have gone further and developed a broadly analogous view of memory. According to this view, which this essay calls the “diary model,” one's memory ordinarily serves as a means for one's present self to gain evidence about one's past judgments, and in turn about the truth. This essay rejects the diary model's analogy between memory and testimony from one's former self, arguing first that memory and a diary differ with respect to their psychological roles, and second that this psychological difference underwrites important downstream epistemic differences. The resulting view stands opposed to prominent discussions of memory and testimony, which either, like the diary model, treat memory by analogy to what we naively wish to say about testimony, or which instead attempt to extend to testimony the epistemically preservative role of memory.
On an optimistic version of realist moral epistemology, a significant range of ordinary moral beliefs, construed in realist terms, constitute knowledge—or at least some weaker positive epistemic status, such as epistemic justification. The “debunking challenge” to this view grants prima facie justification but claims that it is “debunked” (i.e., defeated), yielding the final verdict that moral beliefs are ultima facie unjustified. Notable candidate “debunkers” (i.e., defeaters) include the so-called “evolutionary debunking arguments,” the “Benacerraf-Field Challenge,” and persistent moral disagreement among epistemic peers. Such defeaters are best treated as higher-order evidence—viz., evidence contesting the merits of the first-order evidence on which moral beliefs are based. This chapter first develops a theory of higher-order defeat in general, which it then applies to debunking in particular. The result: the challenge fails entirely on epistemic grounds—regardless of whether or not its empirical and metaphysical presuppositions are correct. An advantage of this purely epistemic defense over alternative strategies is that the former extends even to laypersons who themselves lack the expertise necessary to formulate an adequate response. However, this leaves open the prospects for non-epistemological interpretations of debunking (e.g., moral or ontological). The chapter therefore concludes with brief suggestions in that direction.
Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
If the reliability of a source of testimony is open to question, it seems epistemically illegitimate to verify the source’s reliability by appealing to that source’s own testimony. Is this because it is illegitimate to trust a questionable source’s testimony on any matter whatsoever? Or is there a distinctive problem with appealing to the source’s testimony on the matter of that source’s own reliability? After distinguishing between two kinds of epistemically illegitimate circularity—bootstrapping and self-verification—I argue for a qualified version of the claim that there is nothing especially illegitimate about using a questionable source to evaluate its own reliability. Instead, it is illegitimate to appeal to a questionable source’s testimony on any matter whatsoever, with the matter of the source’s own reliability serving only as a special case.
The period from 1900 to 1935 was particularly fruitful and important for the development of logic and logical metatheory. This survey is organized along eight "itineraries" concentrating on historically and conceptually linked strands in this development. Itinerary I deals with the evolution of conceptions of axiomatics. Itinerary II centers on the logical work of Bertrand Russell. Itinerary III presents the development of set theory from Zermelo onward. Itinerary IV discusses the contributions of the algebra of logic tradition, in particular, Löwenheim and Skolem. Itinerary V surveys the work in logic connected to the Hilbert school, and Itinerary VI deals specifically with consistency proofs and metamathematics, including the incompleteness theorems. Itinerary VII traces the development of intuitionistic and many-valued logics. Itinerary VIII surveys the development of semantical notions from the early work on axiomatics up to Tarski's work on truth.
forall x: Calgary is a full-featured textbook on formal logic. It covers key notions of logic such as consequence and validity of arguments, the syntax of truth-functional propositional logic TFL and truth-table semantics, the syntax of first-order (predicate) logic FOL with identity (first-order interpretations), translating (formalizing) English in TFL and FOL, and Fitch-style natural deduction proof systems for both TFL and FOL. It also deals with some advanced topics such as truth-functional completeness and modal logic. Exercises with solutions are available. It is provided in PDF (for screen reading, printing, and a special version for dyslexics) and in LaTeX source code.
Priest has provided a simple tableau calculus for Chellas's conditional logic Ck. We provide rules which, when added to Priest's system, result in tableau calculi for Chellas's CK and Lewis's VC. Completeness of these tableaux, however, relies on the cut rule.
What is critical thinking, especially in the context of higher education? How have research and scholarship on the matter developed over recent decades? What is the current state of the art here? How might the potential of critical thinking be enhanced? What kinds of teaching are necessary in order to realize that potential? And just why is this topic important now? These are the key questions motivating this volume. We hesitate to use terms such as “comprehensive” or “complete” or “definitive,” but we believe that, taken in the round, the chapters in this volume together offer a fair insight into the contemporary understandings of higher education worldwide. We also believe that this volume is much needed, and we shall try to justify that claim in this introduction.
On the heels of Franzén's fine technical exposition of Gödel's incompleteness theorems and related topics (Franzén 2004) comes this survey of the incompleteness theorems aimed at a general audience. Gödel's Theorem: An Incomplete Guide to its Use and Abuse is an extended and self-contained exposition of the incompleteness theorems and a discussion of what informal consequences can, and in particular cannot, be drawn from them.
Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917-1923. The aim of this paper is to describe these results, focussing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
According to a traditional Cartesian epistemology of perception, perception does not provide one with direct knowledge of the external world. Instead, your immediate perceptual evidence is limited to facts about your own visual experience, from which conclusions about the external world must be inferred. Cartesianism faces well-known skeptical challenges. But this chapter argues that any anti-Cartesian view strong enough to avoid these challenges must license a way of updating one’s beliefs in response to anticipated experiences that seems diachronically irrational. To avoid this result, the anti-Cartesian must either license an unacceptable epistemic chauvinism, or else claim that merely reflecting on one’s experiences defeats perceptual justification. This leaves us with a puzzle: Although Cartesianism faces problems, avoiding them brings a new set of problems.
Heinrich Behmann (1891-1970) obtained his Habilitation under David Hilbert in Göttingen in 1921 with a thesis on the decision problem. In his thesis, he solved - independently of Löwenheim and Skolem's earlier work - the decision problem for monadic second-order logic in a framework that combined elements of the algebra of logic and the newer axiomatic approach to logic then being developed in Göttingen. In a talk given in 1921, he outlined this solution, but also presented important programmatic remarks on the significance of the decision problem and of decision procedures more generally. The text of this talk as well as a partial English translation are included.
In Winter 2017, the first author piloted a course in formal logic in which we aimed to (a) improve student engagement and mastery of the content, and (b) reduce maths anxiety and its negative effects on student outcomes, by adopting student-oriented teaching, including peer instruction and classroom flipping techniques. The course implemented a partially flipped approach, and incorporated group-work and peer learning elements, while retaining some of the traditional lecture format. By doing this, a wide variety of student learning preferences could be provided for.
A common view among nontheists combines the de jure objection that theism is epistemically unacceptable with agnosticism about the de facto objection that theism is false. Following Plantinga, we can call this a “proper” de jure objection—a de jure objection that does not depend on any de facto objection. In his Warranted Christian Belief, Plantinga has produced a general argument against all proper de jure objections. Here I first show that this argument is logically fallacious (it commits subtle probabilistic fallacies disguised by scope ambiguities), and proceed to lay the groundwork for the construction of actual proper de jure objections.
Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator εx. Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions of existential theorems obtained by this elimination procedure.
A construction principle for natural deduction systems for arbitrary, finitely-many-valued first order logics is exhibited. These systems are systematically obtained from sequent calculi, which in turn can be automatically extracted from the truth tables of the logics under consideration. Soundness and cut-free completeness of these sequent calculi translate into soundness, completeness, and normal-form theorems for natural deduction systems.
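The extraction idea can be illustrated with a minimal sketch. Assuming signed formulas of the form value : A ∘ B, the premise branches of an introduction rule are just the argument combinations that produce the target value under the connective's truth table; the function below is a schematic toy of mine, not the paper's construction:

```python
def signed_rule(truth_table, value):
    """Given the truth table of a binary connective in a finite-valued
    logic (a dict mapping (a, b) -> value), collect the premise branches
    of the introduction rule for the signed formula  value : A o B.
    Each branch asserts a pair of signed subformulas  a : A  and  b : B.
    (A schematic sketch of the extraction idea, not a full calculus.)
    """
    return sorted((a, b) for (a, b), v in truth_table.items() if v == value)

# Three-valued conjunction as minimum on the values {0, 1, 2}:
vals = (0, 1, 2)
conj = {(a, b): min(a, b) for a in vals for b in vals}

# The rule for  1 : A & B  branches over every way of getting min(a, b) = 1:
assert signed_rule(conj, 1) == [(1, 1), (1, 2), (2, 1)]
```

Soundness of such a rule is immediate from the truth table, which is what lets the sequent and natural deduction systems be generated mechanically.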
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence, computational treatment, has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
A general class of labeled sequent calculi is investigated, and necessary and sufficient conditions are given for when such a calculus is sound and complete for a finite-valued logic if the labels are interpreted as sets of truth values. Furthermore, it is shown that any finite-valued logic can be given an axiomatization by such a labeled calculus using arbitrary "systems of signs," i.e., of sets of truth values, as labels. The number of labels needed is logarithmic in the number of truth values, and it is shown that this bound is tight.
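Why logarithmically many labels can suffice is easy to see with a binary encoding. The sketch below is an illustrative encoding of my own, not the paper's general systems of signs: it pins down each of n truth values by its membership pattern across ceil(log2 n) sets.

```python
import math

def bit_labels(n):
    """Encode n truth values 0..n-1 by ceil(log2 n) 'signs', where sign i
    is the set of values whose i-th bit is set. A single value is then
    determined by saying, for each sign, whether it applies or not --
    illustrating why logarithmically many labels can suffice.
    (Illustrative encoding; the paper's systems of signs are more general.)
    """
    k = max(1, math.ceil(math.log2(n)))
    return [frozenset(v for v in range(n) if v >> i & 1) for i in range(k)]

labels = bit_labels(5)          # 5 truth values need only 3 sign-sets
assert len(labels) == 3
# The membership pattern across the signs uniquely identifies each value:
patterns = {tuple(v in s for s in labels) for v in range(5)}
assert len(patterns) == 5
```

That the logarithmic bound is tight is the harder direction, which the encoding above does not address.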
Propositional logics in general, considered as a set of sentences, can be undecidable even if they have “nice” representations, e.g., are given by a calculus. Even decidable propositional logics can be computationally complex (e.g., already intuitionistic logic is PSPACE-complete). On the other hand, finite-valued logics are computationally relatively simple—at worst NP. Moreover, finite-valued semantics are simple, and general methods for theorem proving exist. This raises the question to what extent and under what circumstances propositional logics represented in various ways can be approximated by finite-valued logics. It is shown that the minimal m-valued logic for which a given calculus is strongly sound can be calculated. It is also investigated under which conditions propositional logics can be characterized as the intersection of (effectively given) sequences of finite-valued logics.
Although Kurt Gödel does not figure prominently in the history of computability theory, he exerted a significant influence on some of the founders of the field, both through his published work and through personal interaction. In particular, Gödel’s 1931 paper on incompleteness and the methods developed therein were important for the early development of recursive function theory and the lambda calculus at the hands of Church, Kleene, and Rosser. Church and his students studied Gödel 1931, and Gödel taught a seminar at Princeton in 1934. Seen in the historical context, Gödel was an important catalyst for the emergence of computability theory in the mid-1930s.
David Hilbert's finitistic standpoint is a conception of elementary number theory designed to answer the intuitionist doubts regarding the security and certainty of mathematics. Hilbert was unfortunately not exact in delineating what that viewpoint was, and Hilbert himself changed his usage of the term through the 1920s and 30s. The purpose of this paper is to outline what the main problems are in understanding Hilbert and Bernays on this issue, based on some publications by them which have so far received little attention, and on a number of philosophical reconstructions of the viewpoint (in particular, by Hand, Kitcher, and Tait).
The first-order temporal logics with □ and ○ of time structures isomorphic to ω (discrete linear time) and trees of ω-segments (linear time with branching gaps) and some of their fragments are compared: the first is not recursively axiomatizable. For the second, a cut-free complete sequent calculus is given, and from this, a resolution system is derived by the method of Maslov.
Hilbert’s program was an ambitious and wide-ranging project in the philosophy and foundations of mathematics. In order to “dispose of the foundational questions in mathematics once and for all,” Hilbert proposed a two-pronged approach in 1921: first, classical mathematics should be formalized in axiomatic systems; second, using only restricted, “finitary” means, one should give proofs of the consistency of these axiomatic systems. Although Gödel’s incompleteness theorems show that the program as originally conceived cannot be carried out, it had many partial successes, and generated important advances in logical theory and metatheory, both at the time and since. The article discusses the historical background and development of Hilbert’s program, its philosophical underpinnings and consequences, and its subsequent development and influences since the 1930s.
It is shown that Gqp↑, the quantified propositional Gödel logic based on the truth-value set V↑ = {1 - 1/n : n ≥ 1} ∪ {1}, is decidable. This result is obtained by reduction to Büchi's theory S1S. An alternative proof based on elimination of quantifiers is also given, which yields both an axiomatization and a characterization of Gqp↑ as the intersection of all finite-valued quantified propositional Gödel logics.
The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ k: Is there a proof of length ≤ l?" When restricted to proofs with universal or existential cuts, this problem is shown to be (1) undecidable for linear or tree-like LK-proofs (corresponds to the undecidability of second-order unification), (2) undecidable for linear LKB-proofs (corresponds to the undecidability of semi-unification), and (3) decidable for tree-like LKB-proofs (corresponds to a decidable subproblem of semi-unification).
Entailment in propositional Gödel logics can be defined in a natural way. While all infinite sets of truth values yield the same sets of tautologies, the entailment relations differ. It is shown that there is a rich structure of infinite-valued Gödel logics, only one of which is compact. It is also shown that the compact infinite-valued Gödel logic is the only one which interpolates, and the only one with an r.e. entailment relation.
A uniform construction for sequent calculi for finite-valued first-order logics with distribution quantifiers is exhibited. Completeness, cut-elimination and midsequent theorems are established. As an application, an analog of Herbrand’s theorem for the four-valued knowledge-representation logic of Belnap and Ginsberg is presented. It is indicated how this theorem can be used for reasoning about knowledge bases with incomplete and inconsistent information.
It is shown that the infinite-valued first-order Gödel logic G° based on the set of truth values {1/k : k ∈ ω \ {0}} ∪ {0} is not r.e. The logic G° is the same as that obtained from the Kripke semantics for first-order intuitionistic logic with constant domains and where the order structure of the model is linear. From this, the unaxiomatizability of Kröger's temporal logic of programs (even of the fragment without the next-time operator ○) and of the authors' temporal logic of linear discrete time with gaps follows.
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
This chapter describes Kurt Gödel's paper on the incompleteness theorems. Gödel's incompleteness results are two of the most fundamental and important contributions to logic and the foundations of mathematics. It had been assumed that first-order number theory is complete in the sense that any sentence in the language of number theory would be either provable from the axioms or refutable. Gödel's first incompleteness theorem showed that this assumption was false: it states that there are sentences of number theory that are neither provable nor refutable. The first theorem is general in the sense that it applies to any axiomatic theory that is ω-consistent, has an effective proof procedure, and is strong enough to represent basic arithmetic. The importance of the theorems lies in their generality: although proved specifically for extensions of the system P of Principia Mathematica, the method Gödel used is applicable in a wide variety of circumstances. Gödel's results had a profound influence on the further development of the foundations of mathematics. They pointed the way to a reconceptualization of the view of axiomatic foundations.
The problem of approximating a propositional calculus is to find many-valued logics which are sound for the calculus (i.e., all theorems of the calculus are tautologies) with as few tautologies as possible. This has potential applications for representing (computationally complex) logics used in AI by (computationally easy) many-valued logics. It is investigated how far this method can be carried using (1) one or (2) an infinite sequence of many-valued logics. It is shown that the optimal candidate matrices for (1) can be computed from the calculus.
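A minimal brute-force soundness check illustrates the setting. The checker below is a toy of my own (with a hypothetical encoding of formulas as nested tuples over '->'), not the paper's algorithm for computing optimal matrices: it simply verifies that every axiom takes a designated value under all assignments.

```python
from itertools import product

def is_sound(imp_table, designated, axioms, vals):
    """Brute-force soundness check: every axiom must evaluate to a
    designated value under all assignments. Axioms are nested tuples
    over '->' with variables 'p' and 'q'. (Toy checker; the paper
    computes optimal matrices, which this does not attempt.)
    """
    def ev(f, env):
        if isinstance(f, str):
            return env[f]
        _, a, b = f
        return imp_table[(ev(a, env), ev(b, env))]
    variables = ('p', 'q')
    return all(
        ev(ax, dict(zip(variables, vs))) in designated
        for ax in axioms
        for vs in product(vals, repeat=len(variables))
    )

# Goedel implication on {0, 1, 2} with 2 designated validates  p -> (q -> p):
vals, desig = (0, 1, 2), {2}
imp = {(a, b): 2 if a <= b else b for a in vals for b in vals}
assert is_sound(imp, desig, [('->', 'p', ('->', 'q', 'p'))], vals)
```

Searching over all m-valued candidate matrices with such a check is exponential; the paper's point is that the optimal sound matrices can nevertheless be computed from the calculus itself.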
All first-order Gödel logics G_V with globalization operator based on truth value sets V ⊆ [0,1] where 0 and 1 lie in the perfect kernel of V are axiomatized by Ciabattoni’s hypersequent calculus HGIF.
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first-order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
The aim of this paper is to emphasize the fact that for all finitely-many-valued logics there is a completely systematic relation between sequent calculi and tableau systems. More importantly, we show that for both of these systems there are always two dual proof systems (not just two ways to interpret the calculi). This phenomenon may easily escape one’s attention since in the classical (two-valued) case the two systems coincide. (In two-valued logic the assignment of a truth value and the exclusion of the opposite truth value describe the same situation.)
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
We face grave global problems. We urgently need to learn how to tackle them in wiser, more effective, intelligent and humane ways than we have done so far. This requires that universities become devoted to helping humanity acquire the necessary wisdom to perform the task. But at present universities do not even conceive of their role in these terms. The essays of this book consider what needs to change in the university if it is to help humanity acquire the wisdom it so urgently needs.
Epstein and Carnielli's fine textbook on logic and computability is now in its second edition. The readers of this journal might be particularly interested in the timeline `Computability and Undecidability' added in this edition, and the included wall-poster of the same title. The text itself, however, has some aspects which are worth commenting on.
In the 1920s, David Hilbert proposed a research program with the aim of providing mathematics with a secure foundation. This was to be accomplished by first formalizing logic and mathematics in their entirety, and then showing---using only so-called finitistic principles---that these formalizations are free of contradictions. In the area of logic, the Hilbert school accomplished major advances both in introducing new systems of logic, and in developing central metalogical notions, such as completeness and decidability. The analysis of unpublished material presented in Chapter 2 shows that a completeness proof for propositional logic was found by Hilbert and his assistant Paul Bernays already in 1917--18, and that Bernays's contribution was much greater than is commonly acknowledged. Aside from logic, the main technical contributions of Hilbert's Program are the development of formal mathematical theories and proof-theoretical investigations thereof, in particular, consistency proofs. In this respect Wilhelm Ackermann's 1924 dissertation is a milestone both in the development of the Program and in proof theory in general. Ackermann gives a consistency proof for a second-order version of primitive recursive arithmetic which, surprisingly, explicitly uses a finitistic version of transfinite induction up to ω^ω^ω. He also gave a faulty consistency proof for a system of second-order arithmetic based on Hilbert's ε-substitution method. Detailed analyses of both proofs in Chapter 3 shed light on the development of finitism and proof theory in the 1920s as practiced in Hilbert's school. In a series of papers, Charles Parsons has attempted to map out a notion of mathematical intuition which he also brings to bear on Hilbert's finitism. According to him, mathematical intuition fails to be able to underwrite the kind of intuitive knowledge Hilbert thought was attainable by the finitist.
It is argued in Chapter 4 that the extent of finitistic knowledge which intuition can provide is broader than Parsons supposes. According to another influential analysis of finitism, due to W. W. Tait, finitist reasoning coincides with primitive recursive reasoning. The acceptance of non-primitive recursive methods in Ackermann's dissertation, presented in Chapter 3, together with additional textual evidence presented in Chapter 4, shows that this identification is untenable as far as Hilbert's conception of finitism is concerned. Tait's conception differs from Hilbert's in important respects, yet it too is open to criticisms leading to the conclusion that finitism encompasses more than just primitive recursive reasoning.
Internal Logic brings together several threads of Yvon Gauthier's work on the foundations of mathematics and revisits his attempt to, as he puts it, radicalize Hilbert's Program. A radicalization of Hilbert's Program, I take it, is supposed to take Hilbert's finitary viewpoint more seriously than other attempts to salvage the Program have. Such a return to the "roots of Hilbert's metamathematical idea" will, so Gauthier claims, enable him to save Hilbert's Program.
Decision theory has a long history in the behavioural and social sciences as a tool for constructing good approximations of human behaviour. Yet as artificially intelligent systems (AIs) grow in intellectual capacity and eventually outpace humans, decision theory becomes ever more important as a model of AI behaviour. What sort of decision procedure might an AI employ? In this work, I propose that policy-based causal decision theory (PCDT), which places a primacy on the decision-relevance of predictors and simulations of agent (...) behaviour, may be such a procedure. I compare this account to the recently developed functional decision theory (FDT), which is motivated by similar concerns. I also address potentially counterintuitive features of PCDT, such as its refusal to condition on observations made at certain times.
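The divergence the abstract alludes to is easiest to see in a Newcomb-style problem with a reliable predictor, the classic setting where causal and evidential recommendations come apart. The following sketch is purely illustrative background, not the paper's PCDT formalism; the 0.99 accuracy figure and the payoff amounts are assumptions.

```python
# Illustrative Newcomb-style payoff calculation (not from the paper).
# A predictor fills an opaque box with $1,000,000 iff it foresaw
# one-boxing; a transparent box always holds $1,000.

ACCURACY = 0.99  # assumed reliability: P(prediction correct | your choice)

def evidential_eu(one_box):
    """Expected utility when your choice is evidence about the prediction."""
    if one_box:
        return ACCURACY * 1_000_000
    return (1 - ACCURACY) * 1_000_000 + 1_000

def causal_eu(one_box, p_money_already_there):
    """Expected utility holding the (already fixed) box contents constant."""
    base = p_money_already_there * 1_000_000
    return base if one_box else base + 1_000

# Evidentially, one-boxing dominates; causally, two-boxing always adds $1,000.
print(evidential_eu(True), evidential_eu(False))
print(causal_eu(True, 0.5), causal_eu(False, 0.5))
```

Predictor-sensitive theories such as FDT, and on the abstract's account PCDT, are designed to recover the one-boxing verdict without abandoning causal reasoning wholesale.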
A textbook for modal and other intensional logics based on the Open Logic Project. It covers normal modal logics, relational semantics, axiomatic and tableaux proof systems, intuitionistic logic, and counterfactual conditionals.
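The relational (Kripke) semantics the textbook covers can be sketched in a few lines: a formula □p holds at a world just in case p holds at every accessible world, and ◇p holds just in case p holds at some accessible world. The following is a minimal illustrative evaluator, not code from the textbook; the tuple-based formula encoding is an assumption made here.

```python
# Minimal sketch of relational (Kripke) semantics. Formulas are tuples:
# ('p',) atomic, ('not', f), ('and', f, g), ('box', f), ('dia', f).
# R maps each world to its accessible worlds; V maps atoms to worlds.

def holds(world, formula, R, V):
    """Truth of a modal formula at a world of a Kripke model (R, V)."""
    op = formula[0]
    if op == 'not':
        return not holds(world, formula[1], R, V)
    if op == 'and':
        return holds(world, formula[1], R, V) and holds(world, formula[2], R, V)
    if op == 'box':  # true iff the subformula holds at every accessible world
        return all(holds(v, formula[1], R, V) for v in R.get(world, ()))
    if op == 'dia':  # true iff the subformula holds at some accessible world
        return any(holds(v, formula[1], R, V) for v in R.get(world, ()))
    return world in V.get(op, set())  # atomic proposition

# Two-world model: w1 sees w2, w2 sees nothing; p holds only at w2.
R = {'w1': ['w2'], 'w2': []}
V = {'p': {'w2'}}
print(holds('w1', ('box', ('p',)), R, V))  # True: p holds everywhere w1 sees
print(holds('w2', ('box', ('p',)), R, V))  # True vacuously: w2 sees no worlds
```

Properties of the accessibility relation R (reflexivity, transitivity, and so on) correspond to the different normal modal logics the text treats.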