References in:
A probabilistic analysis of argument cogency
Synthese 195 (4):1715–1740 (2018)


Recent work on conditional reasoning argues that denying the antecedent [DA] and affirming the consequent [AC] are defeasible but cogent patterns of argument, either because they are effective, rational, albeit heuristic applications of Bayesian probability, or because they are licensed by the principle of total evidence. Against this, we show that on any prevailing interpretation of indicative conditionals the premises of DA and AC arguments do not license their conclusions without additional assumptions. The cogency of DA and AC inferences rather (...) 
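The probabilistic point in this abstract can be illustrated numerically. The function name and probability values below are hypothetical, chosen only to show that a fixed conditional strength P(C|A) leaves P(¬C|¬A) unconstrained without further assumptions about base rates:

```python
# Denying the antecedent (DA): from "if A then C" and "not A", infer "not C".
# Read probabilistically, the premises fix P(C|A), but the conclusion's
# probability P(~C|~A) also depends on the base rates P(A) and P(C).
def p_not_c_given_not_a(p_a, p_c, p_c_given_a):
    """Compute P(~C|~A) from P(A), P(C), and P(C|A) via total probability:
    P(C) = P(C|A)P(A) + P(C|~A)P(~A)."""
    p_c_given_not_a = (p_c - p_c_given_a * p_a) / (1 - p_a)
    return 1 - p_c_given_not_a

# Same conditional strength P(C|A) = 0.9, different base rates:
print(p_not_c_given_not_a(0.5, 0.50, 0.9))  # 0.9   -> DA looks cogent
print(p_not_c_given_not_a(0.1, 0.85, 0.9))  # ~0.16 -> DA fails
```

On the first set of base rates DA transmits high probability to its conclusion; on the second it does not, matching the claim that the premises alone do not license the conclusion without additional assumptions.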

This paper begins a working through of Blair’s (2001) theoretical agenda concerning argumentation schemes and their attendant critical questions, in which we propose a number of solutions to some outstanding theoretical issues. We consider the classification of schemes, their ultimate nature, their role in argument reconstruction, their foundation as normative categories of argument, and the evaluative role of critical questions. We demonstrate the role of schemes in argument reconstruction, and defend a normative account of their nature against specific criticisms due to (...) 

Scientific reasoning is—and ought to be—conducted in accordance with the axioms of probability. This Bayesian view—so called because of the central role it accords to a theorem first proved by Thomas Bayes in the late eighteenth ... 

Bayesian reasoning has been applied formally to statistical inference, machine learning and analysing scientific method. Here I apply it informally to more common forms of inference, namely natural language arguments. I analyse a variety of traditional fallacies, deductive, inductive and causal, and find more merit in them than is generally acknowledged. Bayesian principles provide a framework for understanding ordinary arguments which is well worth developing. 

Consider the proposition, "Informal logic is a subdiscipline of philosophy". The best chance of showing this to be true is showing that informal logic is part of logic, which in turn is a part of philosophy. Part 1 is given over to the task of sorting out these connections. If successful, informal logic can indeed be seen as part of philosophy; but there is no question of an exclusive relationship. Part 2 is a critical appraisal of the suggestion that informal (...) 

Bayes' Theorem is a simple mathematical formula used for calculating conditional probabilities. It figures prominently in subjectivist or Bayesian approaches to epistemology, statistics, and inductive logic. Subjectivists, who maintain that rational belief is governed by the laws of probability, lean heavily on conditional probabilities in their theories of evidence and their models of empirical learning. Bayes' Theorem is central to these enterprises both because it simplifies the calculation of conditional probabilities and because it clarifies significant features of the subjectivist position. Indeed, (...) 
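Since this entry states the theorem itself, a minimal worked calculation may help. The function name and the numbers are illustrative assumptions, not part of the source:

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E),
# with P(E) expanded by the law of total probability.
def posterior(prior_h, likelihood, likelihood_not_h):
    """Posterior P(H|E) from prior P(H), P(E|H), and P(E|~H)."""
    p_e = likelihood * prior_h + likelihood_not_h * (1 - prior_h)
    return likelihood * prior_h / p_e

# Hypothetical example: evidence with P(E|H) = 0.9 and P(E|~H) = 0.05,
# bearing on a hypothesis with prior probability 0.1.
print(posterior(0.1, 0.9, 0.05))  # 0.09 / (0.09 + 0.045) = 2/3
```

The calculation shows the characteristic subjectivist point: even strong evidence yields only a moderate posterior when the prior is low.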

This paper is an exercise in intellectual history, an attempt to understand how a specific term—“informal logic”—came to be interpreted in so many different ways. I trace the emergence and development of “informal logic” to help explain the many different meanings, how they emerged and how they are related. This paper is also, to some degree, an account of a movement that developed outside the mainstream of philosophy, whose origins lie in a desire to make logic useful (echoing Dewey). 



The appeal to expert opinion is an argument form that uses the verdict of an expert to support a position or hypothesis. A previous scheme-based treatment of the argument form is formalized within a Bayesian network that is able to capture the critical aspects of the argument form, including the central considerations of the expert's expertise and trustworthiness. We propose this as an appropriate normative framework for the argument form, enabling the development and testing of quantitative predictions as to how (...) 

Bayesian confirmation theory—abbreviated to in these notes—is the predominant approach to confirmation in late twentieth century philosophy of science. It has many critics, but no rival theory can claim anything like the same following. The popularity of the Bayesian approach is due to its flexibility, its apparently effortless handling of various technical problems, the existence of various a priori arguments for its validity, and its injection of subjective and contextual elements into the process of confirmation in just the places where (...) 

In this article, we argue for the general importance of normative theories of argument strength. We also provide some evidence based on our recent work on the fallacies as to why Bayesian probability might, in fact, be able to supply such an account. In the remainder of the article we discuss the general characteristics that make a specifically Bayesian approach desirable, and critically evaluate putative flaws of Bayesian probability that have been raised in the argumentation literature. 

Norms—that is, specifications of what we ought to do—play a critical role in the study of informal argumentation, as they do in studies of judgment, decision-making and reasoning more generally. Specifically, they guide a recurring theme: are people rational? Though rules and standards have been central to the study of reasoning, and behavior more generally, there has been little discussion within psychology about why (or indeed if) they should be considered normative despite the considerable philosophical literature that bears on this (...) 

Starting with a brief overview of current usages, this paper offers some constituents of a use-based analysis of ‘fallacy’, listing 16 conditions that have, for the most part implicitly, been discussed in the literature. Our thesis is that at least three related conceptions of ‘fallacy’ can be identified. The 16 conditions thus serve to “carve out” a semantic core and to distinguish three core-specifications. As our discussion suggests, these specifications can be related to three normative positions in the philosophy of (...) 

I argue in a non-reductive sense for a plausible epistemic principle, which can (1) theoretically and instrumentally unify or systematize all fallacies, and (2) provide a justification for using such a principle for characterizing an erroneous argument as a fallacy. This plausible epistemic principle involves the idea of an error in the method of justification, which results in a failure to provide relevant evidence to satisfy certain standards of adequate proof. Thus, all fallacies are systematically disguised failures to provide substantive (...) 

This paper examines the adequacy of commitment change, as a measure of the successful resolution of a difference of opinion. I argue that differences of opinion are only effectively resolved if commitments undertaken in argumentation survive beyond its conclusion and go on to govern an arguer’s actions in everyday life, e.g., by serving as premises in her practical reasoning. Yet this occurs, I maintain, only when an arguer’s beliefs are changed, not merely her commitments. 

We examine in detail three classic reasoning fallacies, that is, supposedly “incorrect” forms of argument. These are the so-called argumentum ad ignorantiam, the circular argument or petitio principii, and the slippery slope argument. In each case, the argument type is shown to match structurally arguments which are widely accepted. This suggests that it is not the form of the arguments as such that is problematic but rather something about the content of those examples with which they are typically justified. This (...) 

The study of deductive reasoning has been a major paradigm in psychology for approximately the past 40 years. Research has shown that people make many logical errors on such tasks and are strongly influenced by problem content and context. It is argued that this paradigm was developed in a context of logicist thinking that is now outmoded. Few reasoning researchers still believe that logic is an appropriate normative system for most human reasoning, let alone a model for describing the process (...) 

This paper presents a formalization of informal logic using the Carneades Argumentation System (CAS), a formal, computational model of argument that consists of a formal model of argument graphs and audiences. Conflicts between pro and con arguments are resolved using proof standards, such as preponderance of the evidence. CAS also formalizes argumentation schemes. Schemes can be used to check whether a given argument instantiates the types of argument deemed normatively appropriate for the type of dialogue. 

There is an ongoing controversy in philosophy about the connection between explanation and inference. According to Bayesians, explanatory considerations should be given weight in determining which inferences to make, if at all, only insofar as doing so is compatible with Strict Conditionalization. Explanationists, on the other hand, hold that explanatory considerations can be relevant to the question of how much confidence to invest in our hypotheses in ways which violate Strict Conditionalization. The controversy has focused on normative issues. This paper (...) 

In this paper, it is argued that the most fruitful approach to developing normative models of argument quality is one that combines the argumentation scheme approach with Bayesian argumentation. Three sample argumentation schemes from the literature are discussed: the argument from sign, the argument from expert opinion, and the appeal to popular opinion. Limitations of the schemebased treatment of these argument forms are identified and it is shown how a Bayesian perspective may help to overcome these. At the same time, (...) 

My One Fallacy theory says there is only one fallacy: equivocation, or playing on an ambiguity. In this paper I explain how this theory arose from metaphilosophical concerns. And I contrast this theory with purely logical, dialectical, and psychological notions of fallacy. 

in The Oxford Handbook of Corporate Social Responsibility, ed. Paul Anand, Prasanta Pattanaik, and Clemens Puppe, forthcoming 2007. 

A Theory of Argument is an advanced textbook intended for students in philosophy, communications studies and linguistics who have completed at least one course in argumentation theory, informal logic, critical thinking or formal logic. Containing nearly 400 exercises, Mark Vorobej develops a novel approach to argument interpretation and evaluation. One of the key themes of the book is that we cannot succeed in distinguishing good arguments from bad arguments until we learn to listen carefully to others. Part I develops a (...) 

This book provides a systematic analysis of many common argumentation schemes and a compendium of 96 schemes. The study of these schemes, or forms of argument that capture stereotypical patterns of human reasoning, is at the core of argumentation research. Surveying all aspects of argumentation schemes from the ground up, the book takes the reader from the elementary exposition in the first chapter to the latest state of the art in the research efforts to formalize and classify the schemes, outlined (...) 

In Logical Self-Defense, Johnson and I introduced the criteria of acceptability, relevance and sufficiency as appropriate for the evaluation of arguments in the sense of reasons offered in support of a claim. These three criteria have been widely adopted, but each has been subjected to a number of criticisms; and also 30 years of research have intervened. How do these criteria stand up today? In this paper I argue that they still have a place in argument analysis and evaluation, (...) 

According to Bayesian confirmation theory, evidence E (incrementally) confirms (or supports) a hypothesis H (roughly) just in case E and H are positively probabilistically correlated (under an appropriate probability function Pr). There are many logically equivalent ways of saying that E and H are correlated under Pr. Surprisingly, this leads to a plethora of nonequivalent quantitative measures of the degree to which E confirms H (under Pr). In fact, many nonequivalent Bayesian measures of the degree to which E confirms (or (...) 
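The correlation criterion described here is easy to state in code. This is a sketch with hypothetical probabilities; the difference measure d(H, E) = P(H|E) − P(H) stands in as one representative of the many non-equivalent measures the abstract mentions, and the function names are my own:

```python
def confirms(p_h, p_e, p_h_and_e):
    """E confirms H iff they are positively correlated under Pr:
    P(H & E) > P(H) * P(E)."""
    return p_h_and_e > p_h * p_e

def difference_measure(p_h, p_e, p_h_and_e):
    """One of many non-equivalent degree-of-confirmation measures:
    d(H, E) = P(H|E) - P(H)."""
    return p_h_and_e / p_e - p_h

# Hypothetical joint distribution: P(H)=0.3, P(E)=0.4, P(H&E)=0.2.
print(confirms(0.3, 0.4, 0.2))            # True: 0.2 > 0.12
print(difference_measure(0.3, 0.4, 0.2))  # 0.5 - 0.3 = 0.2
```

Whether E confirms H is settled by the correlation test alone; the measures disagree only about *how much* E confirms H, which is the plurality the abstract is pointing at.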

‘Bayesian epistemology’ became an epistemological movement in the 20th century, though its two main features can be traced back to the eponymous Reverend Thomas Bayes (c. 1701–61). Those two features are: (1) the introduction of a formal apparatus for inductive logic; (2) the introduction of a pragmatic self-defeat test (as illustrated by Dutch Book Arguments) for epistemic rationality as a way of extending the justification of the laws of deductive logic to include a justification for the laws of inductive logic. (...) 
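The pragmatic self-defeat test mentioned here can be illustrated with a minimal Dutch Book calculation. The function name, stake, and credence values are hypothetical, chosen only to exhibit the guaranteed-loss structure:

```python
# Dutch Book sketch: an agent whose credences in a proposition P and its
# negation sum to more than 1 violates the probability axioms, and will
# accept a pair of bets that guarantees a net loss.
def dutch_book_net(cr_p, cr_not_p, stake=1.0):
    """Agent buys a bet paying `stake` if P, priced at cr_p * stake, and
    a bet paying `stake` if ~P, priced at cr_not_p * stake. Exactly one
    bet pays off, so the net outcome is the same either way."""
    return stake - (cr_p + cr_not_p) * stake

# Incoherent credences: cr(P) = 0.7, cr(~P) = 0.5 (sum 1.2 > 1).
print(dutch_book_net(0.7, 0.5))  # -0.2: a sure loss, whatever happens
```

Because the loss is certain regardless of how the world turns out, the incoherent credences are self-defeating by the agent's own lights, which is the pragmatic argument the abstract describes.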

The book also comes with an exhaustive array of study aids that enable the reader to monitor and enhance the learning process. 

