Proofs of God in Early Modern Europe offers a fascinating window into early modern efforts to prove God’s existence. Assembled here are twenty-two key texts, many translated into English for the first time, which illustrate the variety of arguments that philosophers of the seventeenth and eighteenth centuries offered for God. These selections feature traditional proofs—such as various ontological, cosmological, and design arguments—but also introduce more exotic proofs, such as the argument from eternal truths, the argument from universal aseity, and the argument ex consensu gentium. Drawn from the work of eighteen philosophers, this book includes both canonical figures (such as Descartes, Spinoza, Newton, Leibniz, Locke, and Berkeley) and noncanonical thinkers (such as Norris, Fontenelle, Voltaire, Wolff, Du Châtelet, and Maupertuis). Lloyd Strickland provides fresh translations of all selections not originally written in English and updates the spelling and grammar of those that were. Each selection is prefaced by a lengthy headnote, giving a biographical account of its author, an analysis of the main argument(s), and important details about the historical context. Strickland’s introductory essay provides further context, focusing on the various reasons that led so many thinkers of early modernity to develop proofs of God’s existence. Proofs of God is perfect for both students and scholars of early modern philosophy and philosophy of religion.
Many philosophers are sceptical about the power of philosophy to refute commonsensical claims. They look at the famous attempts and judge them inconclusive. I prove that even if those famous attempts are failures, there are alternative successful philosophical proofs against commonsensical claims. After presenting the proofs I briefly comment on their significance.
I argue that Graham Oppy’s attempt to redefend his charge that all modal theistic arguments “must be question-begging” is unsuccessful. Oppy’s attempt to show that theism and modal concretism are compatible is not only tangential to his purposes but also marred by a misunderstanding of theism, and vulnerable to a counterexample that actually demonstrates incompatibility. Moreover, the notion of begging the question employed by Oppy against the theist is seen to be far too permissive.
Though not a history of classical logic, this book discusses and quotes central passages on its origins and development, chiefly from a philosophical perspective. Though not a book in mathematical logic, it takes formal logic from an essentially mathematical perspective. Biased towards a computational approach, with SAT and VAL as its backbone, this is an introduction to logic that covers essential aspects of the three branches of logic, to wit, philosophical, mathematical, and computational.
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century. For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along. To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
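The proofs-as-programs correspondence this abstract describes can be shown in miniature. The sketch below is an illustrative assumption on my part, not Wadler’s own presentation: it uses plain Python (so the types of the lambda calculus stay implicit), treats a proof of A → A as the identity function, and treats modus ponens as function application.

```python
# A miniature, hedged sketch of the Curry-Howard correspondence:
# proofs are programs, and simplifying a proof is running the program.

def identity(x):
    """A 'proof' of A -> A: given evidence for A, return it unchanged."""
    return x

def modus_ponens(proof_of_implication, proof_of_antecedent):
    """From a proof of A -> B and a proof of A, obtain a proof of B
    (implication elimination is just function application)."""
    return proof_of_implication(proof_of_antecedent)

# "Simplifying a proof corresponds to executing a program": the detour
# through A -> A normalizes away, leaving the original evidence.
evidence_for_A = "some evidence for A"
result = modus_ponens(identity, evidence_for_A)
print(result)
```

In a typed setting the analogy is tighter: the typing rule for application is literally the modus ponens rule with propositions read as types.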
Mathematicians distinguish between proofs that explain their results and those that merely prove. This paper explores the nature of explanatory proofs, their role in mathematical practice, and some of the reasons why philosophers should care about them. Among the questions addressed are the following: what kinds of proofs are generally explanatory (or not)? What makes a proof explanatory? Do all mathematical explanations involve proof in an essential way? Are there really such things as explanatory proofs, and if so, how do they relate to the sorts of explanation encountered in philosophy of science and metaphysics?
When modeling informal proofs like that of Euclid’s Elements using a sound logical system, we go from proofs seen as somewhat unrigorous – even having gaps to be filled – to rigorous proofs. However, metalogic grounds the soundness of our logical system, and proofs in metalogic are not like formal proofs and look suspiciously like the informal proofs. This brings about what I am calling here the groundedness problem: how can we decide with certainty that our metalogical proofs are rigorous and sustain our logical system? In this paper, I will expose this problem. I will not try to solve it here.
Considered in light of the reader’s expectation of a thoroughgoing criticism of the pretensions of the rational psychologist, and of the wealth of discussions available in the broader 18th-century context, which includes a variety of proofs that do not explicitly turn on the identification of the soul as a simple substance, Kant’s discussion of immortality in the Paralogisms falls lamentably short. However, outside of the Paralogisms (and the published works generally), Kant had much more to say about the arguments for the soul’s immortality, as he devoted considerable time to the topic throughout his career in his lectures on metaphysics. In fact, as I show in this paper, the student lecture notes prove to be an indispensable supplement to the treatment in the Paralogisms, not only for illuminating Kant’s criticism of the rational psychologist’s views on the immortality of the soul, but also in reconciling this criticism with Kant’s own positive claims regarding certain theoretical proofs of immortality.
The traditional view of evidence in mathematics is that evidence is just proof and proof is just derivation. There are good reasons for thinking that this view should be rejected: it misrepresents both historical and current mathematical practice. Nonetheless, evidence, proof, and derivation are closely intertwined. This paper seeks to tease these concepts apart. It emphasizes the role of argumentation as a context shared by evidence, proofs, and derivations. The utility of argumentation theory, in general, and argumentation schemes, in particular, as a methodology for the study of mathematical practice is thereby demonstrated. Argumentation schemes represent an almost untapped resource for mathematics education. Notably, they provide a consistent treatment of rigorous and non-rigorous argumentation, thereby working to exhibit the continuity of reasoning in mathematics with reasoning in other areas. Moreover, since argumentation schemes are a comparatively mature methodology, there is a substantial body of existing work to draw upon, including some increasingly sophisticated software tools. Such tools have significant potential for the analysis and evaluation of mathematical argumentation. The first four sections of the paper address the relationships of evidence to proof, proof to derivation, argument to proof, and argument to evidence, respectively. The final section directly addresses some of the educational implications of an argumentation scheme account of mathematical reasoning.
Review of John Stillwell, Reverse Mathematics: Proofs from the Inside Out. Princeton, NJ: Princeton University Press, 2018, pp. 200. ISBN 978-0-69-117717-5 (hbk), 978-0-69-119641-1 (pbk), 978-1-40-088903-7 (e-book).
The Four-Colour Theorem (4CT) proof, presented to the mathematical community in a pair of papers by Appel and Haken in the late 1970s, provoked a series of philosophical debates. Many conceptual points of these disputes still require some elucidation. After a brief presentation of the main ideas of Appel and Haken’s procedure for the proof and a reconstruction of Thomas Tymoczko’s argument for the novelty of 4CT’s proof, we shall formulate some questions regarding the connections between the points raised by Tymoczko and some Wittgensteinian topics in the philosophy of mathematics, such as the importance of surveyability as a criterion for distinguishing mathematical proofs from empirical experiments. Our aim is to show that the “characteristic Wittgensteinian invention” (Mühlhölzer 2006) – the strong distinction between proofs and experiments – can shed some light on the conceptual confusions surrounding the Four-Colour Theorem.
This paper discusses critically what simulation models of the evolution of cooperation can possibly prove by examining Axelrod’s “Evolution of Cooperation” (1984) and the modeling tradition it has inspired. Hardly any of the many simulation models in this tradition have been applicable empirically. Axelrod’s role model suggested a research design that seemingly allowed one to draw general conclusions from simulation models even if the mechanisms that drive the simulation could not be identified empirically. But this research design was fundamentally flawed. At best such simulations can claim to prove logical possibilities, i.e. they prove that certain phenomena are possible as the consequence of the modeling assumptions built into the simulation, but not that they are possible or can be expected to occur in reality. I suggest several requirements under which proofs of logical possibilities can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. It would be better not to use this kind of simulation at all.
In this chapter we introduce concepts for analyzing proofs, and for analyzing undergraduate and beginning graduate mathematics students’ proving abilities. We discuss how coordination of these two analyses can be used to improve students’ ability to construct proofs. For this purpose, we need a richer framework for keeping track of students’ progress than the everyday one used by mathematicians. We need to know more than that a particular student can, or cannot, prove theorems by induction or contradiction, or can, or cannot, prove certain theorems in beginning set theory or analysis. It is more useful to describe a student’s work in terms of a finer-grained framework that includes various smaller abilities that contribute to proving and that can be learned in differing ways and at differing periods of a student’s development.
Corcoran reviews Boute’s 2013 paper “How to calculate proofs”. There are tricky aspects to classifying occurrences of variables: is an occurrence of ‘x’ free as in ‘x + 1’, is it bound as in ‘{x: x = 1}’, or is it orthographic as in ‘extra’? The trickiness is compounded by failure to employ conventions to separate use of expressions from their mention. The variable occurrence is free in the term ‘x + 1’ but it is orthographic in that term’s quotes name ‘‘{x: x = 1}’’. The term has no quotes, the term’s name has one set of quotes, and the name of the term’s name has two sets of quotes. The trickiness is further compounded by failure to explicitly distinguish a variable’s values from its substituents. The variable ranges over its values but its occurrences are replaced by occurrences of its substituents. In arithmetic the values are numbers, not numerals, but the substituents are numerals, not numbers. See https://www.academia.edu/s/1eddee0c62?source=link Raymond Boute tries to criticize Daniel Velleman for mistakes in this area. However, Corcoran finds mistakes in Boute’s handling of the material. The reader is invited to find mistakes in Corcoran’s handling of this tricky material. The paper and the review treat other issues as well. Acknowledgements: Raymond Boute, Joaquin Miller, Daniel Velleman, George Weaver, and others.
This paper considers logics which are formally dual to intuitionistic logic in order to investigate a co-constructive logic for proofs and refutations. This is philosophically motivated by a set of problems regarding the nature of constructive truth, and its relation to falsity. It is well known both that intuitionism cannot deal constructively with negative information, and that defining falsity by means of intuitionistic negation leads, under widely-held assumptions, to a justification of bivalence. For example, we do not want to equate falsity with the non-existence of a proof since this would render a statement such as “pi is transcendental” false prior to 1882. In addition, the intuitionist account of negation as shorthand for the derivation of absurdity is inadequate, particularly outside of purely mathematical contexts. To deal with these issues, I investigate the dual of intuitionistic logic, co-intuitionistic logic, as a logic of refutation, alongside the intuitionistic logic of proofs. Direct proof and refutation are dual to each other, and are constructive, whilst there also exist syntactic, weak, negations within both logics. In this respect, the logic of refutation is weakly paraconsistent in the sense that it allows for statements for which, neither they, nor their negation, are refuted. I provide a proof theory for the co-constructive logic, a formal dualizing map between the logics, and a Kripke-style semantics. This is given an intuitive philosophical rendering in a re-interpretation of Kolmogorov’s logic of problems.
This paper discusses critically what simulation models of the evolution of cooperation can possibly prove by examining Axelrod’s “Evolution of Cooperation” and the modeling tradition it has inspired. Hardly any of the many simulation models of the evolution of cooperation in this tradition have been applicable empirically. Axelrod’s role model suggested a research design that seemingly allowed one to draw general conclusions from simulation models even if the mechanisms that drive the simulation could not be identified empirically. But this research design was fundamentally flawed, because it is not possible to draw general empirical conclusions from theoretical simulations. At best such simulations can claim to prove logical possibilities, i.e. they prove that certain phenomena are possible as the consequence of the modeling assumptions built into the simulation, but not that they are possible or can be expected to occur in reality. I suggest several requirements under which proofs of logical possibilities can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. I contrast this with Schelling’s neighborhood segregation model, the core mechanism of which can be retraced empirically.
Transfinite ordinal numbers enter mathematical practice mainly via the method of definition by transfinite recursion. Outside of axiomatic set theory, there is a significant mathematical tradition of works recasting proofs by transfinite recursion in other terms, mostly with the intention of eliminating the ordinals from the proofs. Leaving aside the different motivations behind each specific case, we investigate the mathematics of this proof-transforming activity and address the problem of formalising the philosophical notion of elimination which characterises this move.
The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ k: Is there a proof of length ≤ l?" When restricted to proofs with universal or existential cuts, this problem is shown to be (1) undecidable for linear or tree-like LK-proofs (corresponds to the undecidability of second-order unification), (2) undecidable for linear LKB-proofs (corresponds to the undecidability of semi-unification), and (3) decidable for tree-like LKB-proofs (corresponds to a decidable subproblem of semi-unification).
It is shown how the schema of equivalence can be used to obtain short proofs of tautologies A, where the depth of proofs is linear in the number of variables in A.
Hegel endorsed proofs of the existence of God, and also believed God to be a person. Some of his interpreters ignore these apparently retrograde tendencies, shunning them in favor of the philosopher's more forward-looking contributions. Others embrace Hegel's religious thought, but attempt to recast his views as less reactionary than they appear to be. Robert Williams's latest monograph belongs to a third category: he argues that Hegel's positions in philosophical theology are central to his philosophy writ large. The book is diligently researched, and marshals an impressive amount of textual evidence concerning Hegel's view of the proofs, his theory of personhood, and his views on religious community.
We introduce an effective translation from proofs in the display calculus to proofs in the labelled calculus in the context of tense logics. We identify the labelled calculus proofs in the image of this translation as those built from labelled sequents whose underlying directed graph possesses certain properties. For the basic normal tense logic Kt, the image is shown to be the set of all proofs in the labelled calculus G3Kt.
This paper explores the role of aesthetic judgements in mathematics by focussing on the relationship between the epistemic and aesthetic criteria employed in such judgements, and on the nature of the psychological experiences underpinning them. I claim that aesthetic judgements in mathematics are plausibly understood as expressions of what I will call ‘aesthetic-epistemic feelings’ that serve a genuine cognitive and epistemic function. I will then propose a naturalistic account of these feelings in terms of sub-personal processes of representing and assessing the relation between cognitive processes and certain properties of the stimuli at which they are directed.
In their recent paper “Bi-facial truth: a case for generalized truth values”, Zaitsev and Shramko [7] distinguish between an ontological and an epistemic interpretation of classical truth values. By taking the Cartesian product of the two disjoint sets of values thus obtained, they arrive at four generalized truth values and consider two “semi-classical negations” on them. The resulting semantics is used to define three novel logics which are closely related to Belnap’s well-known four-valued logic. A syntactic characterization of these logics is left for further work. In this paper, based on our previous work on a functionally complete extension of Belnap’s logic, we present a sound and complete tableau calculus for these logics. It crucially exploits the Cartesian nature of the four values, which is reflected in the fact that each proof consists of two tableaux. The bi-facial notion of truth of Z&S is thus augmented with a bi-facial notion of proof. We also provide translations between the logics for semi-classical negation and classical logic and show that an argument is valid in a logic for semi-classical negation just in case its translation is valid in classical logic.
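The Cartesian construction this abstract describes is easy to illustrate in a few lines. The sketch below builds the four generalized values as ontological/epistemic pairs and models two negations that each flip one facet; these particular definitions are an illustrative assumption, not necessarily the semi-classical negations Zaitsev and Shramko define.

```python
# A hedged sketch of four generalized truth values as the Cartesian
# product of classical values read ontologically and epistemically.
from itertools import product

ONTOLOGICAL = {"t", "f"}
EPISTEMIC = {"t", "f"}

# Four generalized values: (ontological facet, epistemic facet).
generalized_values = set(product(ONTOLOGICAL, EPISTEMIC))

flip = {"t": "f", "f": "t"}

def neg_ontological(v):
    """Negate only the ontological facet (illustrative assumption)."""
    onto, epi = v
    return (flip[onto], epi)

def neg_epistemic(v):
    """Negate only the epistemic facet (illustrative assumption)."""
    onto, epi = v
    return (onto, flip[epi])

# Each facet-negation is an involution, and the two commute.
for v in generalized_values:
    assert neg_ontological(neg_ontological(v)) == v
    assert neg_epistemic(neg_epistemic(v)) == v
    assert neg_ontological(neg_epistemic(v)) == neg_epistemic(neg_ontological(v))
```

The two-tableaux structure of the proofs mirrors exactly this two-facet structure of the values: one tableau per facet.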
Could the intersection of [formal proofs of mathematical logic] and [sound deductive inference] specify formal systems having [deductively sound formal proofs of mathematical logic]? All that we have to do to obtain such proofs is select the subset of conventional [formal proofs of mathematical logic] that have true premises; the members of this subset are then [deductively sound formal proofs of mathematical logic].
We report on an exploratory study of the way eight mid-level undergraduate mathematics majors read and reflected on four student-generated arguments purported to be proofs of a single theorem. The results suggest that mid-level undergraduates tend to focus on surface features of such arguments and that their ability to determine whether arguments are proofs is very limited -- perhaps more so than either they or their instructors recognize. We begin by discussing arguments (purported proofs) regarded as texts and validations of those arguments, i.e., reflections of individuals checking whether such arguments really are proofs of theorems. We relate the way the mathematics research community views proofs and their validations to ideas from reading comprehension and literary theory. We then give a detailed analysis of the four student-generated arguments and finally analyze the eight students' validations of them.
Faith is the highest truth, one that ensures the happiness and salvation of man in this world and in the Hereafter, whereas superstition is in its essence invalid and wrong. The realization of this happiness and salvation is possible only by having a true faith. Another consequence of true faith is the ability to recognize that this belief is right. Believing with true faith ensures rightness and makes it possible to prove and disclose this truth. It is important to have true faith and accurate affirmation, and this certainty requires certain criteria for accuracy and precision. In addition, it involves advocating the faith and delivering it to other people, which is closely related to the recognition and proof of these truths. It also makes it necessary to explore the distinction between presumption and belief, as well as between knowledge and ignorance, and to examine the concepts of certainty, doubt, evidence, persuasion, and proof, and what they imply in terms of faith. The examination of the proofs of faith in terms of persuasion and proof also deepens the understanding of the certainty of a belief.
Gödel's incompleteness theorems establish the stunning result that mathematics cannot be fully formalized and, further, that any formal system containing a modicum of number or set theory cannot establish its own consistency. Wilfried Sieg and Clinton Field, in their paper "Automated Search for Gödel's Proofs", presented automated proofs of Gödel's theorems at an abstract axiomatic level; they used an appropriate expansion of the strategic considerations that guide the search of the automated theorem prover AProS. The representability conditions that allow the syntactic notions of the metalanguage to be represented inside the object language were taken as axioms in the automated proofs. The concrete task I am taking on in this project is to extend the search by formally verifying these conditions. Using a formal metatheory defined in the language of binary trees, the syntactic objects of the metatheory lend themselves naturally to a direct encoding in Zermelo's theory of sets. The metatheoretic notions can then be inductively defined and shown to be representable in the object-theory using appropriate inductive arguments. Formal verification of the representability conditions is the first step towards an automated proof thereof which, in turn, brings the automated verification of Gödel's theorems one step closer to completion.
Several of Thomas Aquinas's proofs for the existence of God rely on the claim that causal series cannot proceed in infinitum. I argue that Aquinas has good reason to hold this claim given his conception of causation. Because he holds that effects are ontologically dependent on their causes, he holds that the relevant causal series are wholly derivative: the later members of such series serve as causes only insofar as they have been caused by and are effects of the earlier members. Because the intermediate causes in such series possess causal powers only by deriving them from all the preceding causes, they need a first and non-derivative cause to serve as the source of their causal powers.
This paper offers a reconstruction of Wittgenstein's discussion of inductive proofs. An "algebraic version" of these indirect proofs is offered and contrasted with the usual ones, in which an infinite sequence of modus ponens is projected.
REVIEW OF: Automated Development of Fundamental Mathematical Theories by Art Quaife (1992: Kluwer Academic Publishers), 271pp. Using the theorem prover OTTER, Art Quaife has proved four hundred theorems of von Neumann-Bernays-Gödel set theory; twelve hundred theorems and definitions of elementary number theory; dozens of Euclidean geometry theorems; and Gödel's incompleteness theorems. It is an impressive achievement. To gauge its significance and to see what prospects it offers, this review looks closely at the book and the proofs it presents.
When adopting a sound logical system, reasonings made within this system are correct. The situation with reasonings expressed, at least in part, with natural language is much more ambiguous. One way to be certain of the correctness of these reasonings is to provide a logical model of them. To conclude that a reasoning process is correct we need the logical model to be faithful to the reasoning. In this case, the reasoning inherits, so to speak, the correctness of the logical model. There is a weak link in this procedure, which I call the faithfulness problem: how do we decide that the logical model is faithful to the reasoning that it is supposed to model? That is an issue external to logic, and we do not have rigorous formal methods to make the decision. The purpose of this paper is to expose the faithfulness problem (not to solve it). For that purpose, we will consider two examples, one from the geometrical reasoning in Euclid’s Elements and the other from a study on deductive reasoning in the psychology of reasoning.
A truth-preservation fallacy is using the concept of truth-preservation where some other concept is needed. For example, in certain contexts saying that consequences can be deduced from premises using truth-preserving deduction rules is a fallacy if it suggests that all truth-preserving rules are consequence-preserving. The arithmetic additive-associativity rule that yields 6 = (3 + (2 + 1)) from 6 = ((3 + 2) + 1) is truth-preserving but not consequence-preserving. As noted in James Gasser’s dissertation, Leibniz has been criticized for using that rule in attempting to show that arithmetic equations are consequences of definitions. A system of deductions is truth-preserving if each of its deductions having true premises has a true conclusion—and consequence-preserving if, for any given set of sentences, each deduction having premises that are consequences of that set has a conclusion that is a consequence of that set. Consequence-preserving amounts to: in each of its deductions the conclusion is a consequence of the premises. The same definitions apply to deduction rules considered as systems of deductions. Every consequence-preserving system is truth-preserving. It is not as well-known that the converse fails: not every truth-preserving system is consequence-preserving. Likewise for rules: not every truth-preserving rule is consequence-preserving. There are many famous examples. In ordinary first-order Peano Arithmetic, the induction rule yields the conclusion ‘every number x is such that: x is zero or x is a successor’—which is not a consequence of the null set—from two tautological premises, which are consequences of the null set, of course. The arithmetic induction rule is truth-preserving but not consequence-preserving. Truth-preserving rules that are not consequence-preserving are non-logical or extra-logical rules. Such rules are unacceptable to persons espousing traditional truth-and-consequence conceptions of demonstration: a demonstration shows its conclusion is true by showing that its conclusion is a consequence of premises already known to be true. The 1965 Preface in Benson Mates (1972, vii) contains the first occurrence of truth-preservation fallacies in the book.
Enlightenment philosopher René Descartes set out to establish what could be known with certainty, untainted by a deceiving demon. With his method of doubt, he rejected all previous beliefs, allowing only those that survived rigorous scrutiny. In this essay, Leslie Allan examines whether Descartes's program of skeptical enquiry was successful in laying a firm foundation for our manifold beliefs. He subjects Descartes's conclusions to Descartes's own uncompromising methodology to determine whether Descartes escaped from a self-imposed radical skepticism.
A guide to the first steps into the world of geometry and trigonometry and their lines of reasoning, widely used throughout high school and the first years of college in the exact sciences.
Inspired by Rudolf Carnap's Der Logische Aufbau Der Welt, David J. Chalmers argues that the world can be constructed from a few basic elements. He develops a scrutability thesis saying that all truths about the world can be derived from basic truths and ideal reasoning. This thesis leads to many philosophical consequences: a broadly Fregean approach to meaning, an internalist approach to the contents of thought, and a reply to W. V. Quine's arguments against the analytic and the a priori. Chalmers also uses scrutability to analyze the unity of science, to defend a conceptual approach to metaphysics, and to mount a structuralist response to skepticism. Based on the 2010 John Locke lectures, Constructing the World opens up debate on central philosophical issues involving language, consciousness, knowledge, and reality. This major work by a leading philosopher will appeal to philosophers in all areas. This entry contains uncorrected proofs of front matter, chapter 1, and first excursus.
While there has been much discussion about what makes some mathematical proofs more explanatory than others, and what are mathematical coincidences, in this article I explore the distinct phenomenon of mathematical facts that call for explanation. The existence of mathematical facts that call for explanation stands in tension with virtually all existing accounts of “calling for explanation”, which imply that necessary facts cannot call for explanation. In this paper I explore what theoretical revisions are needed in order to accommodate this phenomenon. One of the important upshots is that, contrary to the current consensus, low prior probability is not a necessary condition for calling for explanation. In the final section I explain how the results of this inquiry help us make progress in assessing Hartry Field's style of reliability argument against mathematical Platonism and against robust realism in other domains of necessary facts, such as ethics.
In this paper, I propose that applying the methods of data science to “the problem of whether mathematical explanations occur within mathematics itself” (Mancosu 2018) might be a fruitful way to shed new light on the problem. By carefully selecting indicator words for explanation and justification, and then systematically searching for these indicators in databases of scholarly works in mathematics, we can get an idea of how mathematicians use these terms in mathematical practice and with what frequency. The results of this empirical study suggest that mathematical explanations do occur in research articles published in mathematics journals, as indicated by the occurrence of explanation indicators. When compared with the use of justification indicators, however, the data suggest that justifications occur much more frequently than explanations in scholarly mathematical practice. The results also suggest that justificatory proofs occur much more frequently than explanatory proofs, thus suggesting that proof may be playing a larger justificatory role than an explanatory role in scholarly mathematical practice.
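The indicator-word method this abstract describes can be sketched in a few lines of code. The indicator lists and the toy corpus below are illustrative assumptions on my part, not the study's actual word lists or data.

```python
# A minimal sketch of counting explanation vs. justification indicator
# words in a body of mathematical text (word lists are assumptions).
import re
from collections import Counter

EXPLANATION_INDICATORS = {"explain", "explains", "explanation", "because"}
JUSTIFICATION_INDICATORS = {"justify", "justifies", "justification",
                            "prove", "proves", "proof"}

def indicator_counts(text, indicators):
    """Total occurrences of any indicator word in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w in indicators)
    return sum(counts.values())

# Toy stand-in for a database of scholarly mathematical writing.
corpus = ("We prove the theorem by induction. The proof also explains "
          "why the identity holds, because each term cancels.")

explanations = indicator_counts(corpus, EXPLANATION_INDICATORS)
justifications = indicator_counts(corpus, JUSTIFICATION_INDICATORS)
print(explanations, justifications)
```

A real study would run such counts over full-text databases and would need to handle false positives (e.g. "proof" used in a title), but the core frequency comparison has this shape.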
After reviewing Kant’s well-known criticisms of the traditional proofs of God’s existence and his preferred moral argument, this paper presents a detailed analysis of a densely-packed theistic argument in Religion within the Bounds of Bare Reason. Humanity’s ultimate moral destiny can be fulfilled only through organized religion, for only by participating in a religious community can we overcome the evil in human nature. Yet we cannot conceive how such a community can even be founded without presupposing God’s existence. Viewing God as the internal moral lawgiver, empowering a community of believers, is Kant’s ultimate rationale for theistic belief.
There is a long tradition in formal epistemology and in the psychology of reasoning of investigating indicative conditionals. In psychology, the propositional calculus was taken for granted as the normative standard of reference. Experimental tasks, evaluation of the participants’ responses, and psychological model building were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue that neither logic nor standard probability theory provides appropriate rationality norms for uncertain conditionals. I advocate coherence-based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
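A minimal sketch of what "probabilistically non-informative" means for one paradox of the material conditional (from "not-A" infer "if A then B"): even when P(not-A) is fixed at a high value, every value in [0, 1] remains coherent for the conditional probability P(B|A). The grid enumeration below is an illustrative toy, not the paper's formal proof.

```python
from fractions import Fraction

def conditional_range(p_not_a, grid=20):
    """Enumerate coherent values of P(B|A) when P(not-A) is held fixed.

    Possible worlds: (A&B, A&~B, ~A&B, ~A&~B). Fixing the total mass
    on the ~A worlds leaves the split of the remaining mass between
    A&B and A&~B unconstrained, so P(B|A) ranges over all of [0, 1]."""
    p_a = 1 - p_not_a
    if p_a == 0:
        return []  # P(B|A) is undefined when P(A) = 0
    values = []
    for i in range(grid + 1):
        p1 = p_a * Fraction(i, grid)   # mass on A&B
        p2 = p_a - p1                  # mass on A&~B
        values.append(p1 / (p1 + p2))  # P(B|A)
    return values

vals = conditional_range(Fraction(9, 10))
print(min(vals), max(vals))  # prints: 0 1
```

Even with P(not-A) = 0.9, the coherent interval for P(B|A) is the whole unit interval, which is the sense in which the premise is non-informative about the conclusion.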
Gauss’s quadratic reciprocity theorem is among the most important results in the history of number theory. It’s also among the most mysterious: since its discovery in the late 18th century, mathematicians have regarded reciprocity as a deeply surprising fact in need of explanation. Intriguingly, though, there’s little agreement on how the theorem is best explained. Two quite different kinds of proof are most often praised as explanatory: an elementary argument that gives the theorem an intuitive geometric interpretation, due to Gauss and Eisenstein, and a sophisticated proof using algebraic number theory, due to Hilbert. Philosophers have yet to look carefully at such explanatory disagreements in mathematics. I do so here. According to the view I defend, there are two important explanatory virtues—depth and transparency—which different proofs (and other potential explanations) possess to different degrees. Although not mutually exclusive in principle, the packages of features associated with the two stand in some tension with one another, so that very deep explanations are rarely transparent, and vice versa. After developing the theory of depth and transparency and applying it to the case of quadratic reciprocity, I draw some morals about the nature of mathematical explanation.
Wilfrid Sellars argued that Kant’s account of the conceptual structures involved in experience can be given a linguistic turn so as to provide an analytic account of the resources a language must have in order to be the bearer of empirical knowledge. In this paper I examine the methodological aspects of Kant’s transcendental philosophy that Sellars took to be fundamental to influential themes in his own philosophy. My first aim here is to clarify and argue for the plausibility of what I claim is Sellars’ interpretation of Kant’s ‘analytic’ transcendental method in the first Critique, based ultimately on non-trivial analytic truths concerning the concept of an object of our possible experience. Kant’s ‘transcendental proofs’ thereby avoid a certain methodological trilemma confronting the candidate premises of any such proof, taken from Sellars’ 1970s undergraduate exam question on Kant. In part II of the essay I conclude by highlighting in general terms how Kant’s method, as interpreted in the analytic manner explained in part I, was adapted by Sellars to produce some of the more influential aspects of his own philosophy, expressed in terms of what he contends is their sustainable reformulation in light of the so-called linguistic turn in twentieth-century philosophy.
We aim to compile some means for a rational reconstruction of a named part of the start-over of Baruch (Benedictus) de Spinoza's metaphysics in 'de deo' (which is 'pars prima' of the 'ethica, ordine geometrico demonstrata') in terms of first-order model theory. In so far as our approach is judged successful, it may, besides providing some help in understanding Spinoza, also contribute to the discussion of one or another philosophical evergreen, e.g. 'ontological commitment'. For this text we assume the reader is familiar with 'de deo' as well as with some basic concepts and results of first-order model theory. Before we start the reconstruction, we will first briefly revisit the concept of 'attributum' (definitio IV) in its setting in 'de deo', next scan for formalizable aspects of 'in suo genere finita' ('de deo', definitio II), and subsequently list the model-theoretic constructs we will make use of. Then we begin the reconstruction by stating "coordinative definitions" for the notions of 'attribute (of a substance)', 'modus (as conceived via an attribute)' and 'substance (as conceived via an attribute)', reasoning briefly for each of them. The "coordinative definitions" we arrive at must not be understood as literal translations of Spinoza's concepts - of course, there can be no such thing as a literal translation - they are meant as formal analoga of these concepts, mapping some logical structure. But even with this caveat they may seem strange to the reader at this stage of the discussion. Additional justification for them should then be found in our endeavour to map some of the argumentation of Spinoza's proofs of some of his propositions from this starting point.
In this paper, I argue that, other things being equal, simpler arguments are better; that is, it is rational to prefer simpler arguments over less simple ones. I sketch three arguments in support of this claim: an argument from mathematical proofs, an argument from scientific theories, and an argument from the conjunction rule.
Plato is commonly considered a metaphysical dualist conceiving of a world of Forms separate from the world of particulars in which we live. This paper explores the motivation for postulating that second world as opposed to making do with the one we have. The main objective is to demonstrate that and how everything, Forms and all, can instead fit into the same world. The approach is exploratory, as there can be no proof in the standard sense. The debate between explaining Plato’s ontology with a single world and requiring a two-world model to make sense of the same thing is typically about scouring the Platonic corpus for evidence and turning to Aristotle for help where we need it. The aim here is to dig deeper than what either of them, or anyone else, has said or implied about the number of worlds, searching instead for any insight that might be gained from the way Forms are supposed to exist versus what we ourselves might understand by existence. Although the paper is, at its most basic level, about the existence of the Forms, the intention is not to prove that they exist, nor to evaluate the proofs and objections already on record, but to consider how they might exist and what follows if they do. Dialogue toward consensus on why we think the Forms exist, and why we think they do not, could perhaps provide a smoother “second sailing” in waters where we have been unable to agree whether they do and where they would if they did.
Inquiry into the meaning of logical terms in natural language (‘and’, ‘or’, ‘not’, ‘if’) has generally proceeded along two dimensions. On the one hand, semantic theories aim to predict native speaker intuitions about the natural language sentences involving those logical terms. On the other hand, logical theories explore the formal properties of the translations of those terms into formal languages. Sometimes, these two lines of inquiry appear to be in tension: for instance, our best logical investigation into conditional connectives may show that there is no conditional operator that has all the properties native speaker intuitions suggest ‘if’ has. Indicative conditionals have famously been the source of one such tension, ever since the triviality proofs of both Lewis (1976) and Gibbard (1981) established conclusions which are in prima facie tension with ordinary judgments about natural language indicative conditionals. In a recent series of papers, Branden Fitelson has strengthened both triviality results (Fitelson 2013, 2015, 2016), revealing a common culprit: a logical schema known as IMPORT-EXPORT. Fitelson’s results focus the tension between the logical results and ordinary judgments, since IMPORT-EXPORT seems to be supported by intuitions about natural language. In this paper, we argue that the intuitions which have been taken to support IMPORT-EXPORT are really evidence for a closely related, but subtly different, principle. We show that the two principles are independent by showing how, given a standard assumption about the conditional operator in the formal language in which IMPORT-EXPORT is stated, many existing theories of indicative conditionals validate one, but not the other. Moreover, we argue that once we clearly distinguish these principles, we can use propositional anaphora to show that IMPORT-EXPORT is in fact not valid for natural language indicative conditionals (given this assumption about the formal conditional operator). This gives us a principled and independently motivated way of rejecting a crucial premise in many triviality results, while still making sense of the speaker intuitions which appeared to motivate that premise. We suggest that this strategy has broad application and an important lesson: in theorizing about the logic of natural language, we must pay careful attention to the translation between the formal languages in which logical results are typically proved, and natural languages which are the subject matter of semantic theory.
Benchmarking automated theorem proving (ATP) systems using standardized problem sets is a well-established method for measuring their performance. However, the availability of such libraries for non-classical logics is very limited. In this work we propose a library for benchmarking Girard's (propositional) intuitionistic linear logic. For a quick bootstrapping of the collection of problems, and for discussing the selection of relevant problems and understanding their meaning as linear logic theorems, we use translations of the collection of Kleene's intuitionistic theorems in the traditional monograph "Introduction to Metamathematics". We analyze four different translations of intuitionistic logic into linear logic and compare their proofs using a linear logic based prover with focusing. In order to enhance the set of problems in our library, we apply the three provability-preserving translations to the propositional benchmarks in the ILTP Library. Finally, we generate a comprehensive set of reachability problems for Petri nets and encode such problems as linear logic sequents, thus enlarging our collection of problems.
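As an illustration of what such a translation looks like, the sketch below implements the standard clauses of Girard's call-by-name translation of intuitionistic formulas into linear logic (one of several translations of the kind the abstract compares, though not necessarily among the four it analyzes). The tuple encoding of formulas and the ASCII notation (-o for the linear implication, + for the additive disjunction) are assumptions of this sketch.

```python
# Formulas as nested tuples:
#   ("atom", "p"), ("imp", A, B), ("and", A, B), ("or", A, B)

def girard(f):
    """Girard's call-by-name translation into linear logic:
    A -> B becomes !A -o B, A & B uses the additive conjunction &,
    and A v B becomes !A + !B (i.e. !A (+) !B); atoms are unchanged."""
    kind = f[0]
    if kind == "atom":
        return f[1]
    if kind == "imp":
        return f"(!{girard(f[1])} -o {girard(f[2])})"
    if kind == "and":
        return f"({girard(f[1])} & {girard(f[2])})"
    if kind == "or":
        return f"(!{girard(f[1])} + !{girard(f[2])})"
    raise ValueError(f"unknown connective: {kind}")

# Translating one of Kleene's intuitionistic theorems, p -> (q -> p)
print(girard(("imp", ("atom", "p"), ("imp", ("atom", "q"), ("atom", "p")))))
# prints: (!p -o (!q -o p))
```

Feeding such translated theorems to a linear logic prover is the bootstrapping step the abstract describes; negation and the quantifier-free fragment shown here keep the sketch short.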
This paper is concerned with the claim that supervaluationist consequence is not classical for a language including an operator for definiteness. Although there is some sense in which this claim is uncontroversial, there is a sense in which the claim must be qualified. In particular I defend Keefe's position according to which supervaluationism is classical except when the inference from phi to Dphi is involved. The paper gives precise content to this claim, showing that we can provide complete (and sound) systems of deduction for supervaluationist consequence in which proofs are completely classical with the exception of a single last step (involving the above-mentioned inference).