This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’ philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and discuss prospects for an account of the meanings of modal operators in terms of rules of inference.
Smith argues that, unlike other forms of evidence, naked statistical evidence fails to satisfy normic support. This is his solution to the puzzles of statistical evidence in legal proof. This paper focuses on Smith’s claim that DNA evidence in cold-hit cases does not satisfy normic support. I argue that if this claim is correct, virtually no other form of evidence used at trial can satisfy normic support. This is troublesome. I discuss a few ways in which Smith can respond.
Which rules for aggregating judgments on logically connected propositions are manipulable and which are not? In this paper, we introduce a preference-free concept of non-manipulability and contrast it with a preference-theoretic concept of strategy-proofness. We characterize all non-manipulable and all strategy-proof judgment aggregation rules and prove an impossibility theorem similar to the Gibbard–Satterthwaite theorem. We also discuss weaker forms of non-manipulability and strategy-proofness. Comparing two frequently discussed aggregation rules, we show that “conclusion-based voting” is less vulnerable to manipulation than “premise-based voting”, which is strategy-proof only for “reason-oriented” individuals. Surprisingly, for “outcome-oriented” individuals, the two rules are strategically equivalent, generating identical judgments in equilibrium. Our results introduce game-theoretic considerations into judgment aggregation and have implications for debates on deliberative democracy.
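The contrast between the two voting rules can be illustrated with the classic doctrinal paradox, in which a three-member panel judges two premises p, q and the conclusion p ∧ q. The profile below is an illustrative textbook example, not data from the paper:

```python
# Doctrinal paradox: three judges vote on premises p, q; the conclusion is c = p AND q.
judges = [
    {"p": True,  "q": True},   # judge 1 accepts both premises, hence the conclusion
    {"p": True,  "q": False},  # judge 2 rejects q, hence rejects the conclusion
    {"p": False, "q": True},   # judge 3 rejects p, hence rejects the conclusion
]

def majority(votes):
    """True iff a strict majority of the votes are True."""
    return sum(votes) > len(votes) / 2

# Premise-based voting: take majorities on p and q separately, then infer c.
p_maj = majority([j["p"] for j in judges])   # p passes 2-1
q_maj = majority([j["q"] for j in judges])   # q passes 2-1
premise_based = p_maj and q_maj              # conclusion accepted

# Conclusion-based voting: each judge derives c individually, then take a majority.
conclusion_based = majority([j["p"] and j["q"] for j in judges])  # only judge 1 accepts c

print(premise_based, conclusion_based)  # the two rules disagree on the same profile
```

The divergence on a single profile is what opens the door to strategic behaviour: a judge who cares about the outcome can misreport premise judgments to steer the premise-based result.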
The paper briefly surveys the sentential proof-theoretic semantics for a fragment of English. Then, appealing to a version of Frege’s context-principle (specified to fit type-logical grammar), a method is presented for deriving proof-theoretic meanings for sub-sentential phrases, down to lexical units (words). The sentential meaning is decomposed according to the function-argument structure as determined by the type-logical grammar. In doing so, the paper presents a novel proof-theoretic interpretation of simple types, replacing Montague’s model-theoretic type interpretation (in arbitrary Henkin models). The domains of derivations are collections of derivations in the associated “dedicated” natural-deduction proof-system, and functions therein (with no appeal to models, truth-values and elements of a domain). The compositionality of the semantics is analyzed.
Kurt Gödel wrote (1964, p. 272), after he had read Husserl, that the notion of objectivity raises a question: “the question of the objective existence of the objects of mathematical intuition (which, incidentally, is an exact replica of the question of the objective existence of the outer world)”. This “exact replica” brings to mind the close analogy Husserl saw between our intuition of essences in Wesensschau and of physical objects in perception. What is it like to experience a mathematical proving process? What is the ontological status of a mathematical proof? Can computer-assisted provers output a proof? Taking a naturalized world account, I will assess the relationship between mathematics, the physical world and consciousness by introducing a significant conceptual distinction between proving and proof. I will propose that proving is a phenomenological conscious experience. This experience involves a combination of what Kurt Gödel called intuition, and what Husserl called intentionality. In contrast, proof is a function of that process — the mathematical phenomenon — that objectively self-presents a property in the world, and that results from a spatiotemporal unity being subject to the exact laws of nature. In this essay, I apply phenomenology to mathematical proving as a performance of consciousness, that is, a lived experience expressed and formalized in language, in which there is the possibility of formulating intersubjectively shareable meanings.
In order to perform certain actions – such as incarcerating a person or revoking parental rights – the state must establish certain facts to a particular standard of proof. These standards – such as preponderance of evidence and beyond reasonable doubt – are often interpreted as likelihoods or epistemic confidences. Many theorists construe them numerically; beyond reasonable doubt, for example, is often construed as 90 to 95% confidence in the guilt of the defendant. A family of influential cases suggests standards of proof should not be interpreted numerically. These ‘proof paradoxes’ illustrate that purely statistical evidence can warrant high credence in a disputed fact without satisfying the relevant legal standard. In this essay I evaluate three influential attempts to explain why merely statistical evidence cannot satisfy legal standards.
Recent years have seen fresh impetus brought to debates about the proper role of statistical evidence in the law. Recent work largely centres on a set of puzzles known as the ‘proof paradox’. While these puzzles may initially seem academic, they have important ramifications for the law: raising key conceptual questions about legal proof, and practical questions about DNA evidence. This article introduces the proof paradox, why we should care about it, and new work attempting to resolve it.
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
The impossibility results in judgement aggregation show a clash between fair aggregation procedures and rational collective outcomes. In this paper, we are interested in analysing the notion of rational outcome by proposing a proof-theoretical understanding of collective rationality. In particular, we use the analysis of proofs and inferences provided by linear logic in order to define a fine-grained notion of group reasoning that allows for studying collective rationality with respect to a number of logics. We analyse the well-known paradoxes in judgement aggregation and we pinpoint the reasoning steps that trigger the inconsistencies. Moreover, we extend the map of possibility and impossibility results in judgement aggregation by discussing the case of substructural logics. In particular, we show that there exist fragments of linear logic for which general possibility results can be obtained.
“Proof of concept” is a phrase frequently used in descriptions of research sought in program announcements, in experimental studies, and in the marketing of new technologies. It is often coupled with either a short definition or none at all, its meaning assumed to be fully understood. This is problematic. As a phrase with potential implications for research and technology, its assumed meaning requires some analysis to avoid it becoming a descriptive category that refers to all things scientifically exciting. I provide a short analysis of proof of concept research and offer an example of it within synthetic biology. I suggest that not only are there activities that circumscribe new epistemological categories but there are also associated normative ethical categories or principles linked to the research. I examine these and provide an outline for an alternative ethical account to describe these activities that I refer to as “extended agency ethics”. This view is used to explain how the type of research described as proof of concept also provides an attendant proof of principle that is the result of decision-making that extends across practitioners, their tools, techniques, and the problem solving activities of other research groups.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
How is the burden of proof to be distributed among individuals who are involved in resolving a particular issue? Under what conditions should the burden of proof be distributed unevenly? We distinguish attitudinal from dialectical burdens and argue that these questions should be answered differently, depending on which is in play. One has an attitudinal burden with respect to some proposition when one is required to possess sufficient evidence for it. One has a dialectical burden with respect to some proposition when one is required to provide supporting arguments for it as part of a deliberative process. We show that the attitudinal burden with respect to certain propositions is unevenly distributed in some deliberative contexts, but in all of these contexts, establishing the degree of support for the proposition is merely a means to some other deliberative end, such as action guidance, or persuasion. By contrast, uneven distributions of the dialectical burden regularly further the aims of deliberation, even in contexts where the quest for truth is the sole deliberative aim, rather than merely a means to some different deliberative end. We argue that our distinction between these two burdens resolves puzzles about unevenness that have been raised in the literature.
Most human actions are complex, but some of them are basic. Which are these? In this paper, I address this question by invoking slips, a common kind of mistake. The proposal is this: an action is basic if and only if it is not possible to slip in performing it. The argument discusses some well-established results from the psychology of language production in the context of a philosophical theory of action. In the end, the proposed criterion is applied to discuss some well-known theories of basic actions.
Theism and its cousins, atheism and agnosticism, are seldom taken to task for logical-epistemological incoherence. This paper provides a condensed proof that not only theism, but atheism and agnosticism as well, are all of them conceptually self-undermining, and for the same reason: All attempt to make use of the concept of “transcendent reality,” which here is shown not only to lack meaning, but to preclude the very possibility of meaning. In doing this, the incoherence of theism, atheism, and agnosticism is secondary to the more general incoherence of any attempt to refer to so-called “transcendent realities.” A recognition of the conceptually fundamental incoherence of theism, atheism, and agnosticism compels our rational assent to a position the author names “paratheism.”
According to Jim Pryor’s dogmatism, if you have an experience as if P, you acquire immediate prima facie justification for believing P. Pryor contends that dogmatism validates Moore’s infamous proof of a material world. Against Pryor, I argue that if dogmatism is true, Moore’s proof turns out to be non-transmissive of justification according to one of the senses of non-transmissivity defined by Crispin Wright. This type of non-transmissivity doesn’t deprive dogmatism of its apparent antisceptical bite.
A textbook on proof in mathematics, inspired by an Aristotelian point of view on mathematics and proof. The book expounds the traditional view of proof as deduction of theorems from evident premises via obviously valid steps. It deals with the proof of "all" statements, "some" statements, multiple quantifiers and mathematical induction.
The paper proposes two logical analyses of (the norms of) justification. In the first, realist-minded case, truth is logically independent from justification and leads to a pragmatic logic LP including two epistemic and pragmatic operators, namely, assertion and hypothesis. In the second, antirealist-minded case, truth is not logically independent from justification and results in two logical systems of information and justification: AR4 and AR4′, respectively, provided with a question-answer semantics. The latter proposes many more epistemic agents, each corresponding to a wide variety of epistemic norms. After comparing the different norms of justification involved in these logical systems, two hexagons expressing Aristotelian relations of opposition will be gathered in order to clarify how (a fragment of) pragmatic formulas can be interpreted in a fuzzy-based question-answer semantics.
The current industrial revolution is said to be driven by the digitization that exploits connected information across all aspects of manufacturing. Standards have been recognized as an important enabler. Ontology-based information standards may provide benefits not offered by current information standards. Although there have been ontologies developed in the industrial manufacturing domain, they have been fragmented and inconsistent, and few have achieved standard status. With successes in developing coherent ontologies in the biological, biomedical, and financial domains, an effort called Industrial Ontologies Foundry (IOF) has been formed to pursue the same goal for the industrial manufacturing domain. However, developing a coherent ontology covering the entire industrial manufacturing domain has been known to be a mountainous challenge because of the multidisciplinary nature of manufacturing. To manage the scope and expectations, the IOF community kicked off its effort with a proof-of-concept (POC) project. This paper describes the developments within the project. It also provides a brief update on the IOF organizational set-up.
We investigated whether mathematicians typically agree about the qualities of mathematical proofs. Between-mathematician consensus in proof appraisals is an implicit assumption of many arguments made by philosophers of mathematics, but to our knowledge the issue has not previously been empirically investigated. We asked a group of mathematicians to assess a specific proof on four dimensions, using the framework identified by Inglis and Aberdein (2015). We found widespread disagreement between our participants about the aesthetics, intricacy, precision and utility of the proof, suggesting that a priori assumptions about the consistency of mathematical proof appraisals are unreasonable.
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski’s semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately.
However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion of Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. From the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, also second-order logic and Saul Kripke’s approach using Kleenean logic will be shown to fail in a similar fashion.
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternative accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
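On such an account, whether a standard is met depends not on crossing a numerical threshold but on which error possibilities the evidence eliminates. A toy formalization of that idea (the alternative sets and case details below are illustrative placeholders, not drawn from the paper):

```python
# Relevant-alternatives sketch: a standard of proof is satisfied when the
# evidence rules out every error possibility that the standard treats as
# relevant. Stricter standards treat more alternatives as relevant.

def standard_met(relevant_alternatives, ruled_out):
    """True iff every relevant error possibility has been ruled out."""
    return relevant_alternatives <= ruled_out  # subset test on sets

# Hypothetical error possibilities, graded by standard of proof.
relevant_preponderance = {"defendant was elsewhere"}
relevant_brd = {"defendant was elsewhere",
                "evidence was planted",
                "eyewitness misidentification"}

# Hypothetical state of the evidence at trial.
ruled_out = {"defendant was elsewhere", "eyewitness misidentification"}

print(standard_met(relevant_preponderance, ruled_out))  # preponderance: met
print(standard_met(relevant_brd, ruled_out))            # beyond reasonable doubt: not met
```

The same body of evidence can satisfy a weaker standard while failing a stronger one, with no probability threshold appearing anywhere in the model.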
According to one of Leibniz's theories of contingency, a proposition is contingent if and only if it cannot be proved in a finite number of steps. It has been argued that this faces the Problem of Lucky Proof, namely that we could begin by analysing the concept ‘Peter’ by saying that ‘Peter is a denier of Christ and …’, thereby having proved the proposition ‘Peter denies Christ’ in a finite number of steps. It also faces a more general but related problem that we dub the Problem of Guaranteed Proof. We argue that Leibniz has an answer to these problems since for him one has not proved that ‘Peter denies Christ’ unless one has also proved that ‘Peter’ is a consistent concept, an impossible task since it requires the full decomposition of the infinite concept ‘Peter’. We defend this view from objections found in the literature and maintain that for Leibniz all truths about created individual beings are contingent.
By the middle of the seventeenth century we find that algebra is able to offer proofs in its own right. That is, by that time algebraic argument had achieved the status of proof. How did this transformation come about?
In this article, I argue that it is impossible to complete infinitely many tasks in a finite time. A key premise in my argument is that the only way to get to 0 tasks remaining is from 1 task remaining, when tasks are done 1-by-1. I suggest that the only way to deny this premise is by begging the question, that is, by assuming that supertasks are possible. I go on to present one reason why this conclusion (that supertasks are impossible) is important, namely that it implies a new verdict on a decision puzzle propounded by Jeffrey Barrett and Frank Arntzenius.
Cantor’s proof that the powerset of the set of all natural numbers is uncountable yields a version of Richard’s paradox when restricted to the full definable universe, that is, to the universe containing all objects that can be defined not just in one formal language but by means of the full expressive power of natural language: this universe seems to be countable on one account and uncountable on another. We argue that the claim that definitional contexts impose restrictions on the scope of quantifiers reveals a natural way out.
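The diagonal construction behind Cantor's proof can be sketched directly: given any purported enumeration of subsets of the naturals, each represented by its characteristic function, the diagonal set disagrees with every enumerated set. The sample enumeration below is an arbitrary illustration:

```python
# Cantor's diagonal argument, sketched: for ANY enumeration f of subsets of
# the naturals (each subset given as a characteristic function), the diagonal
# set D = { n : n not in f(n) } differs from every f(n) at index n, so no
# enumeration can exhaust the powerset.

def diagonal_set(f):
    """Characteristic function of the diagonal set for the enumeration f."""
    return lambda n: not f(n)(n)

# A sample enumeration to diagonalize against: f(i) = set of multiples of i + 1.
f = lambda i: (lambda n: n % (i + 1) == 0)

D = diagonal_set(f)

# D disagrees with each enumerated set f(i) at the witness point i.
assert all(D(i) != f(i)(i) for i in range(1000))
```

The paradox the abstract discusses arises when the "language" of definitions is rich enough to describe D itself, since D is definable from the enumeration yet provably absent from it.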
The essay argues that while there is no general agreement on whether moral realism is true, there is general agreement on at least some of the moral obligations that we have if moral realism is true. Given that moral realism might be true, and given that we know some of the things we ought to do if it is true, we have a reason to do those things. Furthermore, this reason is itself an objective moral reason. Thus, if moral realism might be true, then it is true.
In this paper, I develop a quasi-transcendental argument to justify Kant’s infamous claim “man is evil by nature.” The cornerstone of my reconstruction lies in drawing a systematic distinction between the seemingly identical concepts of “evil disposition” (böse Gesinnung) and “propensity to evil” (Hang zum Bösen). The former, I argue, Kant reserves to describe the fundamental moral outlook of a single individual; the latter, the moral orientation of the whole species. Moreover, the appellative “evil” ranges over two different types of moral failure: while an “evil disposition” is a failure to realize the good (i.e., to adopt the motive of duty as limiting condition for all one’s desires), an “evil propensity” is a failure to realize the highest good (i.e., to engage in the collective project of transforming the legal order into an ethical community). This correlation between units of moral analysis and types of obligation suggests a way to offer a deduction of the universal propensity on behalf of Kant. It consists in tracing the source of radical evil to the same subjective necessity that gives rise to the doctrine of the highest good. For, at the basis of Kant’s two doctrines lies the same natural dialectic between happiness and morality. While the highest good brings about the critically acceptable resolution of this dialectic, the propensity to evil perpetuates and aggravates it. Instead of connecting happiness and morality in an objective relation, the human will subordinates morality to the pursuit of happiness according to the subjective order of association. If this reading is correct, it would explain why prior attempts at a transcendental deduction have failed: interpreters have looked for the key to the deduction in the body of Kant’s text, where it is not to be found, for it is tucked, instead, in the Preface to the first edition.
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
This work provides proof-search algorithms and automated counter-model extraction for a class of STIT logics. With this, we answer an open problem concerning syntactic decision procedures and cut-free calculi for STIT logics. A new class of cut-free complete labelled sequent calculi G3Ldm^m_n, for multi-agent STIT with at most n-many choices, is introduced. We refine the calculi G3Ldm^m_n through the use of propagation rules and demonstrate the admissibility of their structural rules, resulting in auxiliary calculi Ldm^m_nL. In the single-agent case, we show that the refined calculi Ldm^m_nL derive theorems within a restricted class of (forestlike) sequents, allowing us to provide proof-search algorithms that decide single-agent STIT logics. We prove that the proof-search algorithms are correct and terminate.
Takeuti and Titani have introduced and investigated a logic they called intuitionistic fuzzy logic. This logic is characterized as the first-order Gödel logic based on the truth value set [0,1]. The logic is known to be axiomatizable, but no deduction system amenable to proof-theoretic, and hence, computational treatment, has been known. Such a system is presented here, based on previous work on hypersequent calculi for propositional Gödel logics by Avron. It is shown that the system is sound and complete, and allows cut-elimination. A question by Takano regarding the eliminability of the Takeuti-Titani density rule is answered affirmatively.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers’ knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, No whale flies is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, No whale flies entails No blue whale flies and No whale flies high. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, Under no circumstances would a whale fly is grammatical, whereas Under some circumstances would a whale fly is not, corresponding to the downward vs. upward monotonic features of the preposed phrases.
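The abstract's own whale example can be rendered set-theoretically, including the downward-monotonicity inference in (B). This is a minimal sketch with an invented toy model:

```python
# (A) Model-theoretic interpretation: "No whale flies" is true in a model iff
# the intersection of the set of whales and the set of fliers is empty.
whales = {"moby", "willy"}
blue_whales = {"moby"}                 # a subset of the whales
fliers = {"sparrow", "eagle"}

no_whale_flies = whales.isdisjoint(fliers)            # True in this toy model

# (B) Downward monotonicity of "no" in its restrictor: shrinking the set
# (whales -> blue whales) preserves disjointness with the fliers, so
# "No whale flies" entails "No blue whale flies" in every model.
no_blue_whale_flies = blue_whales.isdisjoint(fliers)
assert (not no_whale_flies) or no_blue_whale_flies    # material implication holds

print(no_whale_flies, no_blue_whale_flies)
```

The licensing fact in (C) tracks the same property: preposed phrases like under no circumstances create a downward-monotone environment, while under some circumstances does not.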
Researchers often pursue proof of concept research, but criteria for evaluating such research remain poorly specified. This paper proposes a general framework for proof of concept research that knits together and augments earlier discussions. The framework includes prototypes, proof of concept demonstrations, and post facto demonstrations. With a case from theoretical evolutionary genetics, the paper illustrates the general framework and articulates some of the reasoning strategies used within that field. This paper provides both specific tools with which to understand how researchers evaluate models in theoretical evolutionary genetics, and general tools that apply to proof of concept research more generally.
In this paper, I propose that applying the methods of data science to “the problem of whether mathematical explanations occur within mathematics itself” (Mancosu 2018) might be a fruitful way to shed new light on the problem. By carefully selecting indicator words for explanation and justification, and then systematically searching for these indicators in databases of scholarly works in mathematics, we can get an idea of how mathematicians use these terms in mathematical practice and with what frequency. The results of this empirical study suggest that mathematical explanations do occur in research articles published in mathematics journals, as indicated by the occurrence of explanation indicators. When compared with the use of justification indicators, however, the data suggest that justifications occur much more frequently than explanations in scholarly mathematical practice. The results also suggest that justificatory proofs occur much more frequently than explanatory proofs, thus suggesting that proof may be playing a larger justificatory role than an explanatory role in scholarly mathematical practice.
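The indicator-word method described above can be sketched as a simple frequency count over a corpus. The stem lists and sample sentence below are illustrative placeholders, not the study's actual word lists or data:

```python
import re

# Count occurrences of explanation vs. justification indicator stems in a text,
# in the spirit of the corpus study described above. A word is counted once per
# stem it matches (case-insensitive, word-initial match).
EXPLANATION_STEMS = ["explain", "explanat", "because", "why"]
JUSTIFICATION_STEMS = ["justif", "prove", "verif", "establish"]

def count_indicators(text, stems):
    """Total stem matches across all alphabetic words in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(1 for w in words for s in stems if w.startswith(s))

sample = ("We prove the lemma by induction. This explains why the bound holds, "
          "and the estimate is justified by Theorem 2.")

print(count_indicators(sample, EXPLANATION_STEMS))
print(count_indicators(sample, JUSTIFICATION_STEMS))
```

A real study would of course need disambiguation (e.g. "because" in non-explanatory uses) and normalization by corpus size; the sketch only shows the counting step.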
Could it be right to convict and punish defendants using only statistical evidence? In this paper, I argue that it is not and explain why it would be wrong. This is difficult to do because there is a powerful argument for thinking that we should convict and punish defendants using statistical evidence. It looks as if the relevant cases are cases of decision under risk and it seems we know what we should do in such cases (i.e., maximize expected value). Given some standard assumptions about the values at stake, the case for convicting and punishing using statistical evidence seems solid. In trying to show where this argument goes wrong, I shall argue (against Lockeans, reliabilists, and others) that beliefs supported only by statistical evidence are epistemically defective and (against Enoch, Fisher, and Spectre) that these epistemic considerations should matter to the law. To solve the puzzle about the role of statistical evidence in the law, we need to revise some commonly held assumptions about epistemic value and defend the relevance of epistemology to this practical question.
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because generally impure proofs are simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.
In this dissertation, we shall investigate whether Tennant's criterion for paradoxicality (TCP) can be a correct criterion for genuine paradoxes and whether the requirement of a normal derivation (RND) can be a proof-theoretic solution to the paradoxes. Tennant's criterion has two types of counterexamples. One is a case that raises the problem of overgeneration, in which TCP makes a paradoxical derivation non-paradoxical. The other generates the problem of undergeneration, in which TCP renders a non-paradoxical derivation paradoxical. Chapter 2 deals with the problem of undergeneration and Chapter 3 concerns the problem of overgeneration. Chapter 2 argues that Tennant's diagnosis of the counterexample that applies the CR-rule and causes the undergeneration problem is not correct, and presents a solution to the problem of undergeneration. Chapter 3 argues that Tennant's diagnosis of the counterexample raising the overgeneration problem is wrong and provides a solution to the problem. Finally, Chapter 4 addresses what must be explicated in order for RND to be a proof-theoretic solution to the paradoxes.
A question, long discussed by legal scholars, has recently provoked a considerable amount of philosophical attention: ‘Is it ever appropriate to base a legal verdict on statistical evidence alone?’ Many philosophers who have considered this question reject legal reliance on bare statistics, even when the odds of error are extremely low. This paper develops a puzzle for the dominant theories concerning why we should eschew bare statistics. Namely, there seem to be compelling scenarios in which there are multiple sources of incriminating statistical evidence. As we conjoin different types of statistical evidence, it becomes increasingly incredible to suppose that a positive verdict would be impermissible. I suggest that none of the dominant views in the literature can easily accommodate such cases, and close by offering a diagnosis of my own.
I discuss two types of evidential problems with the most widely touted experiments in evolutionary psychology, those performed by Leda Cosmides and interpreted by Cosmides and John Tooby. First, and despite Cosmides and Tooby's claims to the contrary, these experiments don't fulfil the standards of evidence of evolutionary biology. Second, Cosmides and Tooby claim to have performed a crucial experiment, and to have eliminated rival approaches. Though they claim that their results are consistent with their theory but contradictory to the leading non-evolutionary alternative, Pragmatic Reasoning Schemas theory, I argue that this claim is unsupported. In addition, some of Cosmides and Tooby's interpretations arise from misguided and simplistic understandings of evolutionary biology. While I endorse the incorporation of evolutionary approaches into psychology, I reject the claims of Cosmides and Tooby that a modular approach is the only one supported by evolutionary biology. Lewontin's critical examinations of the applications of adaptationist thinking provide a background of evidentiary standards against which to view the currently fashionable claims of evolutionary psychology.
In this paper we introduce a Gentzen calculus for (a functionally complete variant of) Belnap's logic in which establishing the provability of a sequent in general requires two proof trees, one establishing that whenever all premises are true some conclusion is true and one that guarantees the falsity of at least one premise if all conclusions are false. The calculus can also be put to use in proving that one statement necessarily approximates another, where necessary approximation is a natural dual of entailment. The calculus, and its tableau variant, not only capture the classical connectives, but also the ‘information’ connectives of four-valued Belnap logics. This answers a question by Avron.
Introduction to the Scientific Proof of the Natural Moral Law. This paper proves that Aquinas has a means of demonstrating and deriving both moral goodness and the natural moral law from human nature alone. Aquinas scientifically proves the existence of the natural moral law as the natural rule of human operations from human nature alone. The distinction between moral goodness and transcendental goodness is affirmed. This provides the intellectual tools to refute G.E. Moore's attack (in Principia Ethica) against the natural law as committing a "naturalistic fallacy". This article proves that instead Moore commits the fallacy of equivocation between moral goodness and transcendental goodness in his very assertion of a "naturalistic fallacy" by the proponents of the natural moral law. In the process, the new deontological/Kantian theory of natural law as articulated by John Finnis, Robert George, and Germain Grisez is shown to be historically and philosophically false. Ethical naturalism is affirmed as a result.
This paper explains a way of understanding Kant's proof of God's existence in the Critique of Practical Reason that has hitherto gone unnoticed and argues that this interpretation possesses several advantages over its rivals. By first looking at examples where Kant indicates the role that faith plays in moral life and then reconstructing the proof of the second Critique with this in view, I argue that, for Kant, we must adopt a certain conception of the highest good, and so also must choose to believe in the kind of God that can make it possible, because this is essentially a way of actively striving for virtue. One advantage of this interpretation, I argue, is that it is able to make sense of the strong link Kant draws between morality and religion.
Presumption is a complex concept in law, affecting the dialogue setting. However, it is not clear how presumptions work in everyday argumentation, in which the concept of “plausible argumentation” seems to encompass all kinds of inferences. By analyzing the legal notion of presumption, it appears that this type of reasoning combines argument schemes with reasoning from ignorance. Presumptive reasoning can be considered a particular form of reasoning, which needs positive or negative evidence to carry a probative weight on the conclusion. For this reason, presumptions shift the burden of providing evidence or explanations onto the interlocutor. The latter can provide new information or fail to do so: whereas in the first case the new information rebuts the presumption, in the second case, the absence of information that the interlocutor could reasonably provide strengthens the conclusion of the presumptive reasoning. In both cases the result of the presumption is to strengthen the conclusion of the reasoning from lack of evidence. As shown in the legal cases, the effect of presumption is to shift the burden of proof to the interlocutor; however, the shift a presumption effects is only the shift of the evidential burden, or the burden of completing the incomplete knowledge from which the conclusion was drawn. The burden of persuasion remains on the proponent of the presumption. By contrast, reasoning from definition in law is a conclusive proof, and shifts to the other party the burden to prove the contrary. This crucial difference can be applied to everyday argumentation: natural arguments can be divided into dialectical and presumptive arguments, leading to conclusions materially different in strength.
The author argues that Plato’s “proof” that happiness follows justice has a fatal flaw – because the philosopher king in Plato’s Republic is itself a counterexample.
Writing strategic documents is a major practice of many actors striving to see their educational ideas realised in the curriculum. In these documents, arguments are systematically developed to establish the legitimacy of a new educational goal and the competence to make claims about it. Through a qualitative analysis of the writing strategies used in these texts, I show how two of the main actors in the Czech educational discourse have developed a proof that a new educational goal is needed. I draw on the connection of the relational approach in the sociology of education with Lyotard’s analytical semantics of instances in the event. The comparison of the writing strategies in the two documents reveals differences in the formation of a particular pattern of justification. In one case the texts function as a herald of pure reality, and in the other case as a messenger of other witnesses. This reveals different regimens of proof, although both of them were written as prescriptive directives – normative models of the educational world.
Bell inequalities are usually derived by assuming locality and realism, and therefore violations of the Bell-CHSH inequality are usually taken to imply violations of either locality or realism, or both. But, after reviewing an oversight by Bell, in the Corollary below we derive the Bell-CHSH inequality by assuming only that Bob can measure along vectors b and b' simultaneously while Alice measures along either a or a', and likewise Alice can measure along vectors a and a' simultaneously while Bob measures along either b or b', without assuming locality. The violations of the Bell-CHSH inequality therefore only mean impossibility of measuring along b and b' simultaneously.
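For background, the Bell-CHSH inequality discussed in this abstract is standardly stated in terms of correlation functions E for the two measurement settings of each party (this formulation is common background, not a quotation from the paper itself):

```latex
S \;=\; E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 .
```

Under the usual locality-and-realism derivation, classical correlations obey $|S| \le 2$, while quantum-mechanical correlations can reach $|S| = 2\sqrt{2}$ (Tsirelson's bound); the paper's point is that the bound can also be derived from the simultaneous-measurability assumption alone.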
The debate on how to interpret Kant's transcendental idealism has been prominent for several decades now. In his book Kant's Transcendental Proof of Realism Kenneth R. Westphal introduces and defends his version of the metaphysical dual-aspect reading. But his real aim lies deeper: to provide a sound transcendental proof for realism, based on Kant's work, without resorting to transcendental idealism. In this sense his aim is similar to that of Peter F. Strawson – although Westphal's approach is far more sophisticated. First he attempts to show that noumenal causation – on the reality of which his argument partly rests – is coherent in and necessary for Kant's transcendental idealism. Westphal then aims to undermine transcendental idealism by two major claims: Kant can neither account for transcendental affinity nor satisfactorily counter Hume's causal scepticism. Finally Westphal defends his alternative to transcendental idealism by showing that it solves these problems and thus offers a genuine transcendental proof for realism. In this paper I will show that all three steps outlined above suffer from decisive shortcomings, and that consequently, regardless of its merits, Westphal's transcendental argument for realism remains undemonstrated.