Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.
In this paper, I consider a family of three-valued regular logics: S.C. Kleene's well-known strong and weak logics and two intermediate logics, one discovered by M. Fitting and the other by E. Komendantskaya. All these systems were originally presented semantically and based on the theory of recursion. However, their proof theory is still not fully developed. Thus, natural deduction systems have been built only for strong Kleene's logic, both with one (A. Urquhart, G. Priest, A. Tamminga) and with two designated values (G. Priest, B. Kooi, A. Tamminga). The purpose of this paper is to provide natural deduction systems for the weak and intermediate regular logics, both with one and with two designated values.
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke and Peirce's arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
The present article illustrates a conflict between the claim that rational belief sets are closed under deductive consequences, and a very inclusive claim about the factors that are sufficient to determine whether it is rational to believe the respective propositions. Inasmuch as it is implausible to hold that the factors listed here are insufficient to determine whether it is rational to believe the respective propositions, we have good reason to deny that rational belief sets are closed under deductive consequences.
In section 1, I develop epistemic communism, my view of the function of epistemically evaluative terms such as 'rational'. The function is to support the coordination of our belief-forming rules, which in turn supports the reliable acquisition of beliefs through testimony. This view is motivated by the existence of valid inferences that we hesitate to call rational. I defend the view against the worry that it fails to account for a function of evaluations within first-personal deliberation. In the rest of the paper, I then argue, on the basis of epistemic communism, for a view about rationality itself. I set up the argument in section 2 by saying what a theory of rational deduction is supposed to do. I claim that such a theory would provide a necessary, sufficient, and explanatorily unifying condition for being a rational rule for inferring deductive consequences. I argue in section 3 that, given epistemic communism and the conventionality that it entails, there is no such theory. Nothing explains why certain rules for deductive reasoning are rational.
We present a sound and complete Fitch-style natural deduction system for an S5 modal logic containing an actuality operator, a diagonal necessity operator, and a diagonal possibility operator. The logic is two-dimensional, where we evaluate sentences with respect to both an actual world (first dimension) and a world of evaluation (second dimension). The diagonal necessity operator behaves as a quantifier over every point on the diagonal between actual worlds and worlds of evaluation, while the diagonal possibility operator quantifies over some point on the diagonal. Thus, they behave just like the epistemic operators for apriority and its dual. We take this extension of Fitch's familiar derivation system to be a very natural one, since the new rules and labeled lines introduced here preserve the structure of Fitch's own rules for the modal case.
This presentation of Aristotle's natural deduction system supplements earlier presentations and gives more historical evidence. Some fine-tunings resulted from conversations with Timothy Smiley, Charles Kahn, Josiah Gould, John Kearns, John Glanville, and William Parry. The criticism of Aristotle's theory of propositions found at the end of this 1974 presentation was retracted in Corcoran's 2009 HPL article "Aristotle's demonstrative logic".
James Van Cleve has argued that Kant's Transcendental Deduction of the categories shows, at most, that we must apply the categories to experience. And this falls short of Kant's aim, which is to show that they must so apply. In this discussion I argue that once we have noted the differences between the first and second editions of the Deduction, this objection is less telling. But Van Cleve's objection can help illuminate the structure of the B Deduction, and it suggests an interesting reason why the rewriting might have been thought necessary.
It is tempting to think that multi premise closure creates a special class of paradoxes having to do with the accumulation of risks, and that these paradoxes could be escaped by rejecting the principle, while still retaining single premise closure. I argue that single premise deduction is also susceptible to risks. I show that what I take to be the strongest argument for rejecting multi premise closure is also an argument for rejecting single premise closure. Because of the symmetry between the principles, they come as a package: either both will have to be rejected or both will have to be revised.
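The symmetry claimed in this abstract can be made concrete with a toy calculation (the numbers and function names below are illustrative, not drawn from the paper): if each premise of a multi-premise inference, or each step of a chain of single-premise inferences, carries an independent 1% chance of error, the two cases accumulate risk in exactly the same way.

```python
# Toy illustration of risk accumulation under deductive closure.
# Idealizing assumption: each premise / inference step is error-free
# with independent probability p_ok.

def conjunction_reliability(p_ok: float, n_premises: int) -> float:
    """Chance that all n premises of a multi-premise inference hold."""
    return p_ok ** n_premises

def chain_reliability(p_ok: float, n_steps: int) -> float:
    """Chance that a chain of n single-premise inferences preserves truth."""
    return p_ok ** n_steps

# With 99%-reliable premises, a 100-premise conjunction is exactly as
# risky as a 100-step chain of single-premise deductions (both ~0.366).
multi = conjunction_reliability(0.99, 100)
chain = chain_reliability(0.99, 100)
print(f"multi-premise: {multi:.3f}, single-premise chain: {chain:.3f}")
```

Under this idealization the two closure principles are formally on a par, which is the shape of the symmetry the abstract describes.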
This paper describes a cubic water tank equipped with a movable partition receiving various amounts of liquid used to represent joint probability distributions. This device is applied to the investigation of deductive inferences under uncertainty. The analogy is exploited to determine by qualitative reasoning the limits in probability of the conclusion of twenty basic deductive arguments (such as Modus Ponens, And-introduction, Contraposition, etc.) often used as benchmark problems by the various theoretical approaches to reasoning under uncertainty. The probability bounds imposed (...) by the premises on the conclusion are derived on the basis of a few trivial principles such as "a part of the tank cannot contain more liquid than its capacity allows", or "if a part is empty, the other part contains all the liquid". This stems from the equivalence between the physical constraints imposed by the capacity of the tank and its subdivisions on the volumes of liquid, and the axioms and rules of probability. The device materializes de Finetti's coherence approach to probability. It also suggests a physical counterpart of Dutch book arguments to assess individuals' rationality in probability judgments in the sense that individuals whose degrees of belief in a conclusion are out of the bounds of coherence intervals would commit themselves to executing physically impossible tasks. (shrink)
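The coherence intervals the tank materializes can be sketched numerically for two of the benchmark arguments mentioned in this abstract. These are the standard de Finetti-style bounds; the function names are our own, and the numbers are illustrative rather than taken from the paper.

```python
# Coherence intervals for two benchmark arguments under uncertainty.
# Standard results in the probability logic the abstract describes.

def modus_ponens_bounds(p_a: float, p_b_given_a: float):
    """Coherent bounds on P(B), given P(A) and P(B|A)."""
    low = p_a * p_b_given_a
    high = low + (1.0 - p_a)  # B may also hold in cases where A fails
    return low, high

def and_intro_bounds(p_a: float, p_b: float):
    """Frechet bounds on P(A and B), given P(A) and P(B)."""
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

# E.g. P(A) = 0.9 and P(B|A) = 0.8 force P(B) into [0.72, 0.82];
# P(A) = 0.9 and P(B) = 0.8 force P(A and B) into [0.7, 0.8].
print(tuple(round(x, 3) for x in modus_ponens_bounds(0.9, 0.8)))
print(tuple(round(x, 3) for x in and_intro_bounds(0.9, 0.8)))
```

A degree of belief outside such an interval corresponds, in the tank analogy, to demanding that a compartment hold more liquid than its capacity allows.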
The paper discusses the origin of dark matter and dark energy from the concepts of time and, in the final analysis, the totality. Though both seem rather philosophical, they are nonetheless postulated axiomatically and interpreted physically, and the corresponding philosophical transcendentalism serves heuristically. The exposition of the article aims to outline the "forest for the trees", yet in an absolutely rigorous mathematical way, to be explicated in detail in a future paper. The "two deductions" are two successive stages of the single conclusion mentioned above. The concept of "transcendental invariance", meaning an ontological and physical interpretation of the mathematical equivalence of the axiom of choice and the well-ordering "theorem", is utilized again. The arrow of time is then a corollary of that transcendental invariance, and it in turn implies the conservation of quantum information as the Noether correlate of the linear "increase of time" along the time arrow. Quantum information conservation implies a few fundamental corollaries, such as the "conservation of energy conservation" in quantum mechanics, for reasons quite different from those in classical mechanics and physics, as well as the "absence of hidden variables" in it (versus Einstein's conjecture). However, the paper concentrates only on the inference of another corollary from quantum information conservation, namely that dark matter and dark energy are due to entanglement, and thus, in the final analysis, to the conservation of quantum information, though observed experimentally only on the "cognitive screen" of "Mach's principle" in Einstein's general relativity, which excludes any source of gravitational field other than mass and energy.
If quantum information by itself generates a certain nonzero gravitational field, it will then be depicted on the same screen as certain masses and energies distributed in space-time, and most presumably be observable as the dark energy and dark matter that predominate in the universe, constituting about 96% of its energy and matter, quite unexpectedly for physics and the scientific worldview nowadays. Besides the cognitive screen of general relativity, entanglement necessarily appears on one more "cognitive screen", namely that of quantum mechanics, which is furthermore "flat". Most probably, that projection is confinement: a mysterious interaction added ad hoc alongside the three fundamental ones of the Standard Model, and even conceptually inconsistent with them, insofar as it must distinguish local space from global space, being definable only as a relation between them (similar to entanglement). So entanglement is able to link the gravity of general relativity to the confinement of the Standard Model as its projections onto the "cognitive screens" of those two fundamental physical theories.
For deductive reasoning to be justified, it must be guaranteed to preserve truth from premises to conclusion; and for it to be useful to us, it must be capable of informing us of something. How can we capture this notion of information content, whilst respecting the fact that the content of the premises, if true, already secures the truth of the conclusion? This is the problem I address here. I begin by considering and rejecting several accounts of informational content. I then develop an account on which informational contents are indeterminate in their membership. This allows there to be cases in which it is indeterminate whether a given deduction is informative. Nevertheless, on the picture I present, there are determinate cases of informative (and determinate cases of uninformative) inferences. I argue that the model I offer is the best way for an account of content to respect the meaning of the logical constants and the inference rules associated with them without collapsing into a classical picture of content, unable to account for informative deductive inferences.
Harold Hodes in [1] introduces an extension of first-order modal logic featuring a backtracking operator, and provides a possible worlds semantics, according to which the operator is a kind of device for 'world travel'; he does not provide a proof theory. In this paper, I provide a natural deduction system for modal logic featuring this operator, and argue that the system can be motivated in terms of a reading of the backtracking operator whereby it serves to indicate modal scope. I prove soundness and completeness theorems with respect to Hodes' semantics, as well as semantics with fewer restrictions on the accessibility relation.
In the transcendental deduction, the central argument of the Critique of Pure Reason, Kant seeks to secure the objective validity of our basic categories of thought. He distinguishes objective and subjective sides of this argument. The latter side, the subjective deduction, is normally understood as an investigation of our cognitive faculties. It is identified with Kant's account of a threefold synthesis involved in our cognition of objects of experience, and it is said to precede and ground Kant's proof of the validity of the categories in the objective deduction. I challenge this standard reading of the subjective deduction, arguing, first, that there is little textual evidence for it, and, second, that it encourages a problematic conception of how the deduction works. In its place, I present a new reading of the subjective deduction. Rather than being a broad investigation of our cognitive faculties, it should be seen as addressing a specific worry that arises in the course of the objective deduction. The latter establishes the need for a necessary connection between our capacities for thinking and being given objects, but Kant acknowledges that his readers might struggle to comprehend how these seemingly independent capacities are coordinated. Even worse, they might well believe that in asserting this necessary connection, Kant's position amounts to an implausible subjective idealism. The subjective deduction is meant to allay these concerns by showing that they rest on a misunderstanding of the relation between these faculties. This new reading of the subjective deduction offers a better fit with Kant's text. It also has broader implications, for it reveals the more philosophically plausible account of our relation to the world as thinkers that Kant is defending, an account that is largely obscured by the standard reading of the subjective deduction.
Deductive inference is usually regarded as being "tautological" or "analytical": the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of "depth" or "informativeness" of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure "intelim logic", which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is "analytic" in a particularly strict sense, in that it rules out any use of "virtual information", which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
One of the strongest motivations for conceptualist readings of Kant is the belief that the Transcendental Deduction is incompatible with nonconceptualism. In this article, I argue that this belief is simply false: the Deduction and nonconceptualism are compatible at both an exegetical and a philosophical level. Placing particular emphasis on the case of non-human animals, I discuss in detail how and why my reading diverges from those of Ginsborg, Allais, Gomes and others. I suggest ultimately that it is only by embracing nonconceptualism that we can fully recognise the delicate calibration of the trap which the Critique sets for Hume.
This essay presents deductive arguments to an introductory-level audience via a discussion of Aristotle's three types of rhetoric, the goals of and differences between deductive and non-deductive arguments, and the major features of deductive arguments (e.g., validity and soundness).
The definitions of 'deduction' found in virtually every introductory logic textbook would encourage us to believe that the inductive/deductive distinction is a distinction among kinds of arguments and that the extension of 'deduction' is a determinate class of arguments. In this paper, we argue that this approach is mistaken. Specifically, we defend the claim that typical definitions of 'deduction' operative in attempts to get at the induction/deduction distinction are either too narrow or insufficiently precise. We conclude by presenting a deflationary understanding of the inductive/deductive distinction; in our view, its content is nothing over and above the answers to two fundamental sorts of questions central to critical thinking.
Deductive reasoning is the kind of reasoning in which, roughly, the truth of the input propositions (the premises) logically guarantees the truth of the output proposition (the conclusion), provided that no mistake has been made in the reasoning. The premises may be propositions that the reasoner believes or assumptions that the reasoner is exploring. Deductive reasoning contrasts with inductive reasoning, the kind of reasoning in which the truth of the premises need not guarantee the truth of the conclusion.
The deduction of categories in the 1781 edition of the Critique of Pure Reason (A Deduction) has "two sides": the "objective deduction" and the "subjective deduction". Kant seems ambivalent about the latter deduction. I treat it as a significant episode of Kant's thinking about categories that extended from the early 1770s to around 1790. It contains his most detailed answer to the question about the origin of categories that he formulated in the 1772 letter to Marcus Herz. The answer is that categories are generated a priori through a kind of intellectual "epigenesis". This account leaves unexplained why precisely such and such categories should be generated. While this observation caused Kant to worry about the hypothetical status of the subjective deduction in 1781, he would come to acquiesce in the recognition that the ground of the possibility of categories is itself inscrutable. I call this his "methodological skepticism".
Juhani Yli-Vakkuri has argued that the Twin Earth thought experiments offered in favour of semantic externalism can be replaced by a straightforward deductive argument from premisses widely accepted by both internalists and externalists alike. The deductive argument depends, however, on premisses that, on standard formulations of internalism, cannot be satisfied by a single belief simultaneously. It does not, therefore, constitute a proof of externalism. The aim of this article is to explain why.
It is commonly held that Kant ventured to derive morality from freedom in Groundwork III. It is also believed that he reversed this strategy in the second Critique, attempting to derive freedom from morality instead. In this paper, I set out to challenge these familiar assumptions: Kant's argument in Groundwork III rests on a moral conception of the intelligible world, one that plays a similar role as the 'fact of reason' in the second Critique. Accordingly, I argue, there is no reversal in the proof-structure of Kant's two works.
It is often assumed that Fichte's aim in Part I of the System of Ethics is to provide a deduction of the moral law, the very thing that Kant – after years of unsuccessful attempts – deemed impossible. On this familiar reading, what Kant eventually viewed as an underivable 'fact' (Factum), the authority of the moral law, is what Fichte traces to its highest ground in what he calls the principle of the 'I'. However, scholars have largely overlooked a passage in the System of Ethics where Fichte explicitly invokes Kant's doctrine of the fact of reason with approval, claiming that consciousness of the moral law grounds our belief in freedom (GA I/5:65). On the reading I defend, Fichte's invocation of the Factum is consistent with the structure of Part I when we distinguish (a) the feeling of moral compulsion from (b) the moral law itself.
This paper offers an account of the role that critical skepticism plays in the transcendental deduction of the categories of the understanding, arguing that deferred trust in our cognitive faculty is pivotal for reason’s maturation as Kant conceives it and as Hegel subsequently redefines it in his science of the experience of consciousness.
A construction principle for natural deduction systems for arbitrary, finitely-many-valued first order logics is exhibited. These systems are systematically obtained from sequent calculi, which in turn can be automatically extracted from the truth tables of the logics under consideration. Soundness and cut-free completeness of these sequent calculi translate into soundness, completeness, and normal-form theorems for natural deduction systems.
Examining every sense, this article claims to prove through deduction that your mind needs a source to dream. Dreams are old, experienced essences of Platonic forms. You can only experience the essences of new forms when you are awake, because of initial experiences. If dreams are old, experienced essences (what this article claims to prove), then you know you are awake when you initially sense new experienced essences.
According to non-conceptualist interpretations, Kant held that the application of concepts is not necessary for perceptual experience. Some have motivated non-conceptualism by noting the affinities between Kant's account of perception and contemporary relational theories of perception. In this paper I argue (i) that non-conceptualism cannot provide an account of the Transcendental Deduction and thus ought to be rejected; and (ii) that this has no bearing on the issue of whether Kant endorsed a relational account of perceptual experience.
How is moral knowledge possible? This paper defends the anti-Humean thesis that we can acquire moral knowledge by deduction from wholly non-moral premises. According to Hume's Law, as it has become known, we cannot deduce an 'ought' from an 'is', since it is "altogether inconceivable how this new relation can be a deduction from others, which are entirely different from it" (Hume, 1739, 3.1.1). This paper explores the prospects for a deductive theory of moral knowledge that rejects Hume's Law.
The thesis of this paper is that we can justify induction deductively relative to one end, and deduction inductively relative to a different end. I will begin by presenting a contemporary variant of Hume's argument for the thesis that we cannot justify the principle of induction. Then I will criticize the responses the resulting problem of induction has received by Carnap and Goodman, as well as praise Reichenbach's approach. Some of these authors compare induction to deduction. Haack compares deduction to induction, and I will critically discuss her argument for the thesis that we cannot justify the principles of deduction next. In concluding I will defend the thesis that we can justify induction deductively relative to one end, and deduction inductively relative to a different end, and that we can do so in a non-circular way. Along the way I will show how we can understand deductive and inductive logic as normative theories, and I will briefly sketch an argument to the effect that there are only hypothetical, but no categorical imperatives.
This study aims at showing how the meteorologist Edward Lorenz, in two papers of 1963 and 1964, explores the properties of chaotic systems through the interplay between mathematical deduction (based on dynamical systems theory) and a study of the numerical solutions of the so-called Lorenz system in an instability regime.
Mathematicians often speak of conjectures, yet unproved, as probable or well-confirmed by evidence. The Riemann Hypothesis, for example, is widely believed to be almost certainly true. There seems no initial reason to distinguish such probability from the same notion in empirical science. Yet it is hard to see how there could be probabilistic relations between the necessary truths of pure mathematics. The existence of such logical relations, short of certainty, is defended using the theory of logical probability (or objective Bayesianism or non-deductive logic), and some detailed examples of its use in mathematics surveyed. Examples of inductive reasoning in experimental mathematics are given and it is argued that the problem of induction is best appreciated in the mathematical case.
This paper raises obvious questions undermining any residual confidence in Mates' work and revealing our embarrassing ignorance of the true nature of Stoic deduction. It was inspired by the challenging exploratory work of Josiah Gould.
The idea that knowledge can be extended by inference from what is known seems highly plausible. Yet, as shown by familiar preface paradox and lottery-type cases, the possibility of aggregating uncertainty casts doubt on its tenability. We show that these considerations go much further than previously recognized and significantly restrict the kinds of closure ordinary theories of knowledge can endorse. Meeting the challenge of uncertainty aggregation requires either the restriction of knowledge-extending inferences to single premises, or eliminating epistemic uncertainty in known premises. The first strategy, while effective, retains little of the original idea: conclusions even of modus ponens inferences from known premises are not always known. We then look at the second strategy, inspecting the most elaborate and promising attempt to secure the epistemic role of basic inferences, namely Timothy Williamson's safety theory of knowledge. We argue that while it indeed has the merit of allowing basic inferences such as modus ponens to extend knowledge, Williamson's theory faces formidable difficulties. These difficulties, moreover, arise from the very feature responsible for its virtue: the infallibilism of knowledge.
The third deduction in Plato's Parmenides is often given a constructive reading on which Plato's Parmenides, or even Plato himself, presents us with a positive account of the relation between parts and wholes. However, I argue that there is a hitch in the third deduction which threatens to undermine the mereology of the third deduction by the lights of the dialogue. Roughly, even if the Others partake of the One, the account of the third deduction leads to an ontology of gunk, that is, an ontology on which there are no mereological atoms (except for the One). Hence, it is unclear whether the participation relation between the Others and the One is sufficient to impose the sort of structure on the Others which, in the context of this deduction, Parmenides (or Plato) seems to hope for. Instead of the constructive reading of the third deduction, I therefore offer an aporetic reading on which the third deduction raises further difficulties for the participation relation at the heart of young Socrates' theory of Forms.
I take up Kant's remarks about a "transcendental deduction" of the "concepts of space and time". I argue for the need to make a clearer assessment of the philosophical resources of the Aesthetic in order to account for this transcendental deduction. Special attention needs to be given to the fact that the central task of the Aesthetic is simply the "exposition" of these concepts. The Metaphysical Exposition reflects upon facts about our usage to reveal our commitment to the idea that these concepts refer to pure intuitions. But the legitimacy of these concepts still hangs in the balance: these concepts may turn out to refer to nothing real at all. The subsequent Transcendental Exposition addresses this issue. The objective validity of the concepts of space and time, and hence their transcendental deduction, hinges on careful treatment of this last point.
This essay presents and discusses the currently most famous among the deductive conceptions of explanation, i.e., the deductive-nomological one, and proceeds to apply it to microeconomic theory. After restating the basic ideas, the essay investigates some of the important objections raised against it, with a view to decide whether or not they invalidate the proposed application to economics.
In an attempt to provide an answer to the question of the origin of deductive proofs, I argue that Aristotle's philosophy of mathematics is more accurate than a Platonic philosophy of mathematics, given the evidence of how mathematics began. Aristotle says that mathematical knowledge is a posteriori, known through induction; but once knowledge has become unqualified it can grow into deduction. Two pieces of recent scholarship on Greek mathematics propose new ways of thinking about how mathematics began in the Greek culture. Both claim there was a close relationship between the culture and its mathematicians; mathematics was understood through imaginative processes, experiencing the proofs in tangible ways, and establishing a consistent unified form of argumentation. These pieces of evidence provide the context in which Aristotle worked, and their contributions lend support to the argument that taking mathematical premises as inductively available is a better way of understanding the origins of deductive practices than the Platonic tradition.
This paper presents a way of formalising definite descriptions with a binary quantifier ι, where ιx[F, G] is read as ‘The F is G’. Introduction and elimination rules for ι in a system of intuitionist negative free logic are formulated. Procedures for removing maximal formulas of the form ιx[F, G] are given, and it is shown that deductions in the system can be brought into normal form.
This essay partly builds on and partly criticizes a striking idea of Dieter Henrich. Henrich argues that Kant's distinction in the first Critique between the question of fact (quid facti) and the question of law (quid juris) provides clues to the argumentative structure of a philosophical "Deduction". Henrich suggests that the unity of apperception plays a role analogous to a legal factum. By contrast, I argue, first, that the question of fact in the first Critique is settled by the Metaphysical Deduction, which establishes the purity of origin of the Categories, and, second, that in the second Critique, the relevant factum is the Fact of Reason, which amounts to the fact that the Moral Law is pure in origin.
My goal here is to provide a detailed analysis of the methods of inference employed in De prospectiva pingendi. For this purpose, a method of natural deduction is proposed. The treatise by Piero della Francesca is a manifestation of a union between the fine arts and the mathematical sciences of arithmetic and geometry. He defines painting as a part of perspective and, speaking precisely, as a branch of geometry, which is why we find advanced geometrical exercises here.
One semantic and two syntactic decision procedures are given for determining the validity of Aristotelian assertoric and apodeictic syllogisms. Results are obtained by using the Aristotelian deductions that necessarily have an even number of premises.
Medical diagnosis has traditionally been recognized as a privileged field of application for so-called probabilistic induction. Consequently, Bayes' theorem, which mathematically formalizes this form of inference, has been seen as the most adequate tool for quantifying the uncertainty surrounding a diagnosis, since it provides probabilities of different diagnostic hypotheses given symptomatic or laboratory data. On the other hand, it has also been remarked that differential diagnosis works rather by exclusion, e.g. by modus tollens, i.e. deductively. By drawing on a case history, this paper aims to clarify some points on the issue. Namely: 1) medical diagnosis does not represent, strictly speaking, a form of induction, but a type of what in Peircean terms should be called 'abduction' (identifying a case as the token of a specific type); 2) in performing the single diagnostic steps, however, different inferential methods of both inductive and deductive nature are used: modus tollens, the hypothetico-deductive method, abduction; 3) Bayes' theorem is a probabilized form of abduction which uses mathematics in order to justify the degree of confidence that can be entertained in a hypothesis given the available evidence; 4) although theoretically irreconcilable, in practice both the hypothetico-deductive method and the Bayesian one are used in the same diagnosis with no serious compromise to its correctness; 5) medical diagnosis, especially differential diagnosis, also uses a kind of "probabilistic modus tollens", in that signs (symptoms or laboratory data) are taken as strong evidence that a given hypothesis is not true: the focus is not on confirming a hypothesis but on refuting it [Pr(¬H | E1, E2, …, En)]. Especially at the beginning of a complicated case, the odds are between the hypothesis that is potentially being excluded and a vague "other".
This procedure has the advantage of indicating what evidence to look for and of eventually reducing the set of candidate hypotheses if conclusive negative evidence is found. 6) Bayes' theorem in its hypothesis-confirmation form can more faithfully, although idealistically, represent medical diagnosis once the diagnostic itinerary has arrived at a reduced set of plausible hypotheses after a process of progressive elimination of candidates; 7) Bayes' theorem is, however, indispensable in cases of litigation, in order to assess the doctor's responsibility for medical error by taking into account the weight of the evidence at his disposal.
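The Bayesian updating over a differential that the abstract describes, including the "probabilistic modus tollens" focus on Pr(¬H | E), can be sketched in a few lines. This is a minimal illustration only: the hypothesis labels `H1`, `H2`, and the vague "other", along with all priors and likelihoods, are made-up values, not drawn from the paper's case history.

```python
# Bayes' theorem over a small set of diagnostic hypotheses.
# All numbers below are illustrative, not clinical data.

def posterior(priors, likelihoods):
    """Return Pr(H | E) for each hypothesis H given evidence E."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(joint.values())  # Pr(E), by the law of total probability
    return {h: p / total for h, p in joint.items()}

# Hypothetical differential: two named hypotheses plus a vague "other".
priors = {"H1": 0.3, "H2": 0.2, "other": 0.5}
likelihoods = {"H1": 0.9, "H2": 0.1, "other": 0.2}  # Pr(E | H)

post = posterior(priors, likelihoods)

# "Probabilistic modus tollens": the evidence serves to refute H2,
# since Pr(not-H2 | E) = 1 - Pr(H2 | E) is close to 1.
print(post)
print(1 - post["H2"])
```

On this sketch, the same computation supports both readings in the abstract: read positively, it confirms H1; read by exclusion, it drives Pr(¬H2 | E) toward 1 and shrinks the candidate set.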
Kant’s argument in § 38 of the *Critique of Judgment* is subject to a dilemma: if the subjective condition of cognition is the sufficient condition of the pleasure of taste, then every object of experience must produce that pleasure; if not, then the universal communicability of cognition does not entail the universal communicability of the pleasure. Kant’s use of an additional premise in § 21 may get him out of this difficulty, but the premises themselves hang in the air and have no independent plausibility. What Kant offers as a proof of our right to make judgments of taste is more charitably construed as an indirect argument for the adequacy of a speculative explanation of a *presumed* right to make judgments of taste.
We argue that the need for commentary in commonly used linear calculi of natural deduction is connected to the “deletion” of illocutionary expressions that express the role of propositions as reasons, assumptions, or inferred propositions. We first analyze the formalization of an informal proof in some common calculi which do not formalize natural language illocutionary expressions, and show that in these calculi the formalizations of the example proof rely on commentary devices that have no counterpart in the original proof. We then present a linear natural deduction calculus that makes use of formal illocutionary expressions in such a way that unique readability for derivations is guaranteed, thus showing that formalizing illocutionary expressions can eliminate the need for commentary.
Fichte argues that the conclusion of Kant’s transcendental deduction of the categories is correct yet lacks a crucial premise, given Kant’s admission that the metaphysical deduction locates an arbitrary origin for the categories. Fichte provides the missing premise by employing a new method: a genetic deduction of the categories from a first principle. Since Fichte claims to articulate the same view as Kant in a different form, it is crucial to grasp genetic deduction in relation to the sorts of deduction that Kant offers. I propose to interpret genetic deduction as the simultaneous fulfillment of two tasks: answering the question quid facti by deriving the categories from the I, and answering the question quid juris by establishing our entitlement to the categories as conditions of experience. While the second task represents Fichte’s agreement with Kant’s transcendental deduction, the first reflects his correction of Kant’s metaphysical deduction.