The paper considers contemporary models of presumption in terms of their ability to contribute to a working theory of presumption for argumentation. Beginning with the Whatelian model, we consider its contemporary developments and alternatives, as proposed by Sidgwick, Kauffeld, Cronkhite, Rescher, Walton, Freeman, Ullmann-Margalit, and Hansen. Based on these accounts, we present a picture of presumptions characterized by their nature, function, foundation and force. On our account, presumption is a modal status that is attached to a claim and has the effect of shifting, in a dialogue, a burden of proof set at a local level. Presumptions can be analysed and evaluated inferentially as components of rule-based structures. Presumptions are defeasible, and the force of a presumption is a function of its normative foundation. This picture seeks to provide a framework to guide the development of specific theories of presumption.
Reasoning from negative evidence takes place where an expected outcome is tested for, and when it is not found, a conclusion is drawn based on the significance of the failure to find it. By using Gricean maxims and implicatures, we show how a set of alternatives, which we call a paradigm, provides the deep inferential structure on which reasoning from lack of evidence is based. We show that the strength of reasoning from negative evidence depends on how the arguer defines his conclusion and what he considers to be in the paradigm of negated alternatives. If we negate only two of the several possible alternatives, even if they are the most probable, the conclusion will be weak. However, if we deny all possible alternatives, the reasoning will be strong, and even in some cases deductively valid.
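To make the closing claim concrete, here is a minimal formalization (our own notation, not the authors') of how the paradigm of alternatives fixes the strength of the inference:

```latex
% Let the paradigm be a set of alternatives assumed in the dialogue to be
% mutually exclusive and jointly exhaustive.
\begin{align*}
&\text{Closure premise (paradigm): } A_1 \lor A_2 \lor \dots \lor A_n\\
&\text{Negative evidence: } \lnot A_1,\ \lnot A_2,\ \dots,\ \lnot A_{n-1}\\
&\text{Conclusion: } A_n
\end{align*}
% If all rival alternatives are ruled out, the step is deductively valid
% (disjunctive syllogism). If only some are ruled out, the closure premise
% holds merely presumptively and the conclusion is correspondingly weak.
```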
I assess the thesis that counterfactual asymmetries are explained by an asymmetry of the global entropy at the temporal boundaries of the universe, by developing a method of evaluating counterfactuals that includes, as a background assumption, the low entropy of the early universe. The resulting theory attempts to vindicate the common practice of holding the past mostly fixed under counterfactual supposition while at the same time allowing the counterfactual's antecedent to obtain by a natural physical development. Although the theory has some success in evaluating a wide variety of ordinary counterfactuals, it fails as an explanation of counterfactual asymmetry.
This book analyzes the uses of emotive language and redefinitions from pragmatic, dialectical, epistemic and rhetorical perspectives, investigating the relationship between emotions, persuasion and meaning, and focusing on the implicit dimension of the use of a word and its dialectical effects. It offers a method for evaluating the persuasive and manipulative uses of emotive language in ordinary and political discourse. Through the analysis of political speeches and legal arguments, the book offers a systematic study of emotive language in argumentation, rhetoric, communication, political science and public speaking.
Statutory interpretation involves the reconstruction of the meaning of a legal statement when it cannot be considered as accepted or granted. This phenomenon needs to be considered not only from the legal and linguistic perspective, but also from the argumentative one - which focuses on the strategies for defending a controversial or doubtful viewpoint. This book draws upon linguistics, legal theory, computing, and dialectics to present an argumentation-based approach to statutory interpretation. By translating and summarizing the existing legal interpretative canons into eleven patterns of natural arguments - called argumentation schemes - the authors offer a system of argumentation strategies for developing, defending, assessing, and attacking an interpretation. Illustrated through major cases from both common and civil law, this methodology is summarized in diagrams and maps for application to computer sciences. These visuals help make the structures, strategies, and vulnerabilities of legal reasoning accessible to both legal professionals and laypeople.
This book shows how research in linguistic pragmatics, philosophy of language, and rhetoric can be connected through argumentation to analyze a recognizably common strategy used in political and everyday conversation, namely the distortion of another’s words in an argumentative exchange. Straw man argumentation refers to the modification of a position by misquoting, misreporting or wrenching the original speaker’s statements from their context in order to attack them more easily or more effectively. Through 63 examples taken from different contexts (including political and forensic discourses and dialogs) and 20 legal cases, the book analyzes the explicit and implicit types of straw man, shows how to assess the correctness of a quote or a report, and illustrates the arguments that can be used for supporting an interpretation and defending against a distortion. The tools of argumentation theory, a discipline aimed at investigating the uses of arguments by combining insights from pragmatics, logic, and communication, are applied to provide an original account of interpretation and reporting, and to describe and illustrate tactics and procedures that can be used and implemented for practical purposes. This book will appeal to scholars in the fields of political communication, communication in general, argumentation theory, rhetoric and pragmatics, as well as to people working in public speech, speech writing, and discourse analysis.
The aim of the paper is to present a typology of argument schemes. First, we found it helpful to define what an argument scheme is. Since many argument schemes found in contemporary theories stem from the ancient tradition, we took into consideration classical and medieval dialectical studies and their relation with argumentation theory. This overview of the main works on topics and schemes provides a summary of the main principles of classification. In the second section, Walton’s theory is briefly explained to introduce the schemes classification and its different levels. Lastly, the final part shows the main applications of the schemes in computing and AI.
Using tools like argument diagrams and profiles of dialogue, this paper studies a number of examples of everyday conversational argumentation where determination of relevance and irrelevance can be assisted by means of adopting a new dialectical approach. According to the new dialectical theory, dialogue types are normative frameworks with specific goals and rules that can be applied to conversational argumentation. In this paper it is shown how such dialectical models of reasonable argumentation can be applied to a determination of whether an argument in a specific case is relevant or not in these examples. The approach is based on a linguistic account of dialogue and text from congruity theory, and on the notion of a dialectical shift. Such a shift occurs where an argument starts out as fitting into one type of dialogue, but then it only continues to make sense as a coherent argument if it is taken to be a part of a different type of dialogue.
Douglas Walton’s multitudinous contributions to the study of argumentation seldom, if ever, directly engage with argumentation in mathematics. Nonetheless, several of the innovations with which he is most closely associated lend themselves to improving our understanding of mathematical arguments. I concentrate on two such innovations: dialogue types (§1) and argumentation schemes (§2). I argue that both devices are much more applicable to mathematical reasoning than may be commonly supposed.
Argumentation schemes can be described as abstract structures representing the most generic types of argument, constituting the building blocks of the ones used in everyday reasoning. This paper investigates the structure, classification, and uses of such schemes. Three goals are pursued: 1) to describe the schemes, showing how they evolved and how they have been classified in the traditional and the modern theories; 2) to propose a method for classifying them based on ancient and modern developments; and 3) to outline and show how schemes can be used to describe and analyze or produce real arguments. To this purpose, we will build on the traditional distinctions for building a dichotomic classification of schemes, and we will advance a modular approach to argument analysis, in which different argumentation schemes are combined together in order to represent each step of reasoning on which a complex argument relies. Finally, we will show how schemes are applied to formal systems, focusing on their applications to Artificial Intelligence, AI & Law, argument mining, and formal ontologies.
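As a rough illustration of what such a scheme looks like when treated as a reusable building block, here is a sketch in Python; the class and the wording of the premises and critical questions are our own simplifications, not the authors' formalism:

```python
from dataclasses import dataclass, field

@dataclass
class ArgumentationScheme:
    """Generic, simplified representation of a defeasible argumentation scheme."""
    name: str
    premises: list[str]
    conclusion: str
    critical_questions: list[str] = field(default_factory=list)

# Illustrative instance (our own simplified wording of a familiar scheme).
expert_opinion = ArgumentationScheme(
    name="Argument from expert opinion",
    premises=[
        "Source E is an expert in domain D",
        "E asserts that proposition A, which falls within D, is true",
    ],
    conclusion="A may plausibly be taken to be true",
    critical_questions=[
        "Is E a genuine expert in D?",
        "Is E's assertion based on evidence?",
        "Is A consistent with what other experts in D assert?",
    ],
)
```

On the modular view described above, a complex argument would then be represented as a chain of such instances, each step remaining open to its own critical questions.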
In this paper, we use concepts, structure and tools from argumentation theory to show how conversational implicatures are triggered by conflicts of presumptions. Presumptive implicatures are shown to be based on defeasible forms of inference used in conditions of lack of knowledge, including analogical reasoning, inference to the best explanation, practical reasoning, appeal to pity, and argument from cause. Such inferences are modelled as communicative strategies for coping with knowledge gaps that shift the burden of providing the missing contrary evidence to the other party in a dialogue. Through a series of illustrative examples, we show how such principles of inference are based on common knowledge about the ordinary course of events shared by participants in a structured dialogue setting in which they take turns putting forward and responding to speech acts.
Argument from analogy is a common and formidable form of reasoning in law and in everyday conversation. Although there is substantial literature on the subject, according to a recent survey (Juthe 2005) there is little fundamental agreement on what form the argument should take, or on how it should be evaluated. The lack of conformity, no doubt, stems from the complexity and multiplicity of forms taken by arguments that fall under the umbrella of analogical reasoning in argumentation, dialectical studies, and law. Modeling arguments with argumentation schemes has proven useful in attempts to refine the analyst’s understanding of not only the logical structures that shape the backbone of the argument itself, but also the logical underpinning of strategies for evaluating it, strategies based on the semantic categories of genus and relevance. By clarifying the distinction between argument from example and argument from analogy, it is possible to advance a useful proposal for the treatment of argument from analogy in law.
This paper uses argument diagrams, argumentation schemes, and some tools from formal argumentation systems developed in artificial intelligence to build a graph-theoretic model of relevance shown to be applicable as a practical method for helping a third party judge issues of relevance or irrelevance of an argument in real examples. Examples used to illustrate how the method works are drawn from disputes about relevance in natural language discourse, including a criminal trial and a parliamentary debate.
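One way to picture a graph-theoretic relevance test of this general kind is sketched below; the function is our own toy reconstruction, not the paper's model, and the node names are invented:

```python
# Hedged sketch: relevance as connectedness in an argument diagram.
# Nodes are statements; a directed edge means "offered as support for".
# An argument is treated here as relevant iff some chain of support
# leads from it to the ultimate conclusion at issue.
from collections import deque

def is_relevant(diagram: dict[str, list[str]], node: str, ultimate: str) -> bool:
    """Breadth-first search from `node` along support edges to `ultimate`."""
    seen, queue = {node}, deque([node])
    while queue:
        current = queue.popleft()
        if current == ultimate:
            return True
        for target in diagram.get(current, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return False

# Toy diagram: premise P1 supports C1, which supports the ultimate
# conclusion C; premise P9 supports only a side issue X.
diagram = {"P1": ["C1"], "C1": ["C"], "P9": ["X"]}
print(is_relevant(diagram, "P1", "C"))  # True
print(is_relevant(diagram, "P9", "C"))  # False
```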
This paper proposes an argumentation-based procedure for legal interpretation, by reinterpreting the traditional canons of textual interpretation in terms of argumentation schemes, which are then classified, formalized, and represented through argument visualization and evaluation tools. The problem of statutory interpretation is framed as one of weighing contested interpretations as pro and con arguments. The paper builds an interpretation procedure by formulating a set of argumentation schemes that can be used to comparatively evaluate the types of arguments used in cases of contested statutory interpretation in law. A simplified version of the Carneades Argumentation System is applied in a case analysis showing how the procedure works. A logical model for statutory interpretation is finally presented, covering pro-tanto and all-things-considered interpretive conclusions.
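The following toy sketch illustrates the general idea of weighing applicable pro and con interpretive arguments; it is our own simplified illustration, not the Carneades Argumentation System itself, and the scheme names and weights are invented:

```python
from dataclasses import dataclass

@dataclass
class InterpretiveArgument:
    scheme: str          # e.g. "ordinary meaning", "legislative intent" (invented labels)
    pro: bool            # supports (True) or attacks (False) the candidate interpretation
    weight: float        # assumed to be assigned by the analyst
    applicable: bool     # survives its critical questions in this case

def evaluate(arguments: list[InterpretiveArgument], threshold: float = 0.0) -> bool:
    """Accept the interpretation iff applicable pro arguments outweigh the con ones."""
    pro = sum(a.weight for a in arguments if a.applicable and a.pro)
    con = sum(a.weight for a in arguments if a.applicable and not a.pro)
    return (pro - con) > threshold

args = [
    InterpretiveArgument("ordinary meaning", pro=True, weight=0.6, applicable=True),
    InterpretiveArgument("absurdity avoidance", pro=False, weight=0.4, applicable=True),
]
print(evaluate(args))  # True: in this toy case the pro side outweighs the con side
```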
This paper explains the importance of classifying argumentation schemes, and outlines how schemes are being used in current research in artificial intelligence and computational linguistics on argument mining. It provides a survey of the literature on scheme classification. The schemes so far generally taken to be the most widely useful defeasible argumentation schemes are surveyed and explained systematically, including some that are difficult to classify. A new classification system covering these centrally important schemes is built.
This paper analyzes selected examples of uses of argumentation tactics that exploit emotive language, many of them criticized as deceptive and even fallacious by classical and recent sources, including current informal logic textbooks. The analysis is based on six argumentation schemes, and an account of the dialectical setting in which these schemes are used. The three conclusions are (1) that such uses of emotive language are often reasonable and necessary in argumentation based on values, (2) that they are nevertheless defeasible, and hence need to be seen as open to critical questioning, and (3) that when they are used fallaciously, it is because they interfere with critical questioning or conceal the need for it. The analysis furnishes criteria for distinguishing between arguments based on the use of emotive words that are reasonable tools of persuasion, and those that are fallacious tactics used to conceal and distort information.
This article analyses the fallacy of wrenching from context, using the dialectical notions of commitment and implicature as tools. The data, a set of key examples, is used to sharpen the conceptual borderlines around the related fallacies of straw man, accent, misquotation, and neglect of qualifications. According to the analysis, the main characteristics of wrenching from context are the manipulation of the meaning of the other’s statement through devices such as the use of misquotations, selective quotations, and quoting out of context. The theoretical tools employed in the analysis are pragmatic theories of meaning and a dialectical model of commitment, used to explain how and why a standpoint is distorted. The analysis is based on a conception of fallacies as deceptive strategic moves in a game of dialogue. As a consequence, our focus is not only on misquotations as distortions of meaning, but on how they are used as dialectical tools to attack an opponent or win a dispute. Wrenching from context is described as a fallacy of unfairly attributing a commitment to another party that he never held. Its power as a deceptive argumentation tactic is based on complex mechanisms of implicit commitments and on their misemployment to improperly suggest an attribution of commitment.
With the advent of the semantic web, the problem of ambiguity is becoming more and more urgent. Semantic analysis is necessary for explaining and resolving some sorts of ambiguity by inquiring into the relation between possibilities of predication and definition of a concept, in order to solve problems such as interpretation and ambiguity. If computing is now approaching such problems of linguistic analysis, it is worth inquiring into how the development of linguistic studies can be useful for developing the theoretical background of ontologies. Our proposal is to develop a theory of definition alternative to the traditional metaphysical approach and the modern relativistic account. We interpret the ancient notion of essential definition in a dialectical perspective, and show how the dialectical definition by genus and difference corresponds to the semantic analysis of the definiendum. The dialectical definition is shown to be grounded on the deepest endoxa (shared knowledge) of a community, and to be the argumentatively strongest definition. After presenting the most common types of definition used in argumentation, the linguistic and logical characteristics of the notion of definition by genus and difference are set out in a pragmatic framework.
We argue that common knowledge, of the kind used in reasoning in law and computing, is best analyzed using a dialogue model of argumentation (Walton & Krabbe 1995). In this model, implicit premises resting on common knowledge are analyzed as endoxa or widely accepted opinions and generalizations (Tardini 2005). We argue that, in this sense, common knowledge is not really knowledge of the kind represented by belief and/or knowledge of the epistemic kind studied in current epistemology. This paper takes a different approach, defining it in relation to a common commitment store of two participants in a rule-governed dialogue in which two parties engage in rational argumentation (Jackson & Jacobs 1980; van Eemeren & Grootendorst 2004). A theme of the paper is how arguments containing common knowledge premises can be studied with the help of argumentation schemes for arguments from generally accepted opinion and expert opinion. It is argued that common knowledge is a species of provisional acceptance of a premise that is not in dispute at a given point in a dialogue, but may later be defeated as the discussion proceeds.
In this paper we analyze the uses and misuses of argumentation schemes from verbal classification, and show how argument from definition supports argumentation based on argument from verbal classification. The inquiry has inevitably included the broader study of the concept of definition. The paper presents the schemes for argument from classification and for argument from definition, and shows how the latter type of argument so typically supports the former. The problem of analyzing arguments based on classification is framed in a structure that reveals the crucial role it plays in the persuasion process. The survey of the literature includes the work of Hastings, Perelman, Kienpointner and Schiappa, but still finds much of value in Aristotle. Lessons drawn from Aristotle’s Topics are shown to be useful for developing new tools for assessing definitions and arguments from definition.
There are emotively powerful words that can modify our judgment, arouse our emotions and influence our decisions. This paper shows how the use of emotive meaning in argumentation can be explained by showing how its logical dimension, which can be analysed using argumentation schemes, combines with heuristic processes triggered by emotions. Arguing with emotive words is shown to use value-based practical reasoning grounded on hierarchies of values and maxims of experience for evaluative classification.
In this paper, we present a survey of the development of the technique of argument diagramming, covering not only the fields in which it originated - informal logic, argumentation theory, evidence law and legal reasoning - but also more recent work in applying and developing it in computer science and artificial intelligence. Beginning with a simple example of an everyday argument, we present an analysis of it visualised as an argument diagram constructed using a software tool. In the context of a brief history of the development of diagramming, it is then shown how argument diagrams have been used to analyze and work with argumentation in law, philosophy and artificial intelligence.
Statutory Interpretation as Argumentation. Douglas Walton, Giovanni Sartor & Fabrizio Macagno - 2018 - In Colin Aitken, Amalia Amaya, Kevin D. Ashley, Carla Bagnoli, Giorgio Bongiovanni, Bartosz Brożek, Cristiano Castelfranchi, Samuele Chilovi, Marcello Di Bello, Jaap Hage, Kenneth Einar Himma, Lewis A. Kornhauser, Emiliano Lorini, Fabrizio Macagno, Andrei Marmor, J. J. Moreso, Veronica Rodriguez-Blanco, Antonino Rotolo, Giovanni Sartor, Burkhard Schafer, Chiara Valentini, Bart Verheij, Douglas Walton & Wojciech Załuski (eds.), Handbook of Legal Reasoning and Argumentation. Cambridge University Press. pp. 519-560.
This chapter proposes a dialectical approach to legal interpretation, consisting of three dimensions: a formalization of the canons of interpretation in terms of argumentation schemes; a dialectical classification of interpretive schemes; and a logical and computational model for comparing the arguments pro and contra an interpretation. The traditional interpretive maxims or canons used in both common and civil law are translated into defeasible patterns of arguments, which can be evaluated through sets of corresponding critical questions. These interpretive argumentation schemes are classified in general categories and a distinction is drawn between schemes supporting and rebutting an interpretation. This framework allows conceiving statutory interpretation as a dialectical procedure consisting in weighing arguments pro and contra an interpretation. This procedure is formalized and represented computationally through tools from formal argumentation systems.
This paper argues for a reinterpretation of Aristotle's concept of an enthymeme and also his wider informal logic in terms of arguments that are defeasible. They are represented by forms of argument that are called argumentation schemes, considered to be similar to forms of argument found in deductive logic, but different from the foregoing in virtue of their being defeasible. Indeed, the most interesting schemes have been put forward as a helpful way of characterizing structures of human reasoning that have proved troublesome to model deductively. The paper sheds new light on Aristotle's topics and how to define 'enthymeme'. If the traditional definition of an enthymeme in logic accepted for over two thousand years is a misnomer, the question is raised whether we ought to redefine it as a defeasible argumentation scheme or leave things as they are.
In this paper it is shown how certain defeasible argumentation schemes can be used to represent the logical structure of the most common types of argument used for statutory interpretation both in civil and common law. The method is based on an argumentation structure in which the conclusion, namely, the meaning attributed to a legal source, is modeled as a claim that needs to be supported by pro and con defeasible arguments. The defeasible nature of each scheme is shown by means of critical questions, which identify the default conditions for accepting interpretative arguments and provide a method for evaluating a given argument as weak or strong.
This paper compares current ways of modeling the inferential structure of practical reasoning arguments, and proposes a new approach in which it is regarded in a modular way. Practical reasoning is not simply seen as reasoning from a goal and a means to an action using the basic argumentation scheme. Instead, it is conceived as a complex structure of classificatory, evaluative, and practical inferences, which is formalized as a cluster of three types of distinct and interlocked argumentation schemes. Using two real examples, we show how applying the three types of schemes to a cluster of practical argumentation allows an argument analyst to reconstruct the tacit premises presupposed and evaluate the argumentative reasoning steps involved. This approach will be shown to overcome the limitations of the existing models of practical reasoning arguments within the BDI and commitment theoretical frameworks, providing a useful tool for discourse analysis and other disciplines. In particular, applying this method brings to light the crucial role of classification in practical argumentation, showing how the ordering of values and preferences is only one of the possible areas of deep disagreement.
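As a gloss on the three-part cluster just described, the following schematic reconstruction (our notation and wording, offered only as an interpretive sketch) indicates how the three scheme types might interlock:

```latex
% Our schematic reconstruction, not the paper's own notation.
\begin{align*}
&\textbf{Classificatory:} && \text{state of affairs } s \text{ falls under category } C.\\
&\textbf{Evaluative:}     && \text{states of affairs of category } C \text{ are good (or bad) in light of value } V.\\
&\textbf{Practical:}      && \text{action } a \text{ brings about (or prevents) } s;\ \text{therefore } a \text{ should (not) be performed.}
\end{align*}
% Disagreement can arise at any of the three steps, not only over the
% ordering of values in the evaluative step.
```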
The purpose of this paper is to analyze the structure and the defeasibility conditions of argument from analogy, addressing the issues of determining the nature of the comparison underlying the analogy and the types of inferences justifying the conclusion. In the dialectical tradition, different forms of similarity were distinguished and related to the possible inferences that can be drawn from them. The kinds of similarity can be divided into four categories, depending on whether they represent fundamental semantic features of the terms of the comparison or non-semantic ones, indicating possible characteristics of the referents. Such distinct types of similarity characterize different kinds of analogical arguments, all based on a similar general structure, in which a common genus is abstracted. Depending on the nature of the abstracted common feature, different rules of inference will apply, guaranteeing the attribution of the analogical predicate to the genus and to the primary subject. This analysis of similarity and the relationship thereof with the rules of inference allows a deeper investigation of the defeasibility conditions.
Manipulation of quotation, shown to be a common tactic of argumentation in this paper, is associated with fallacies like wrenching from context, hasty generalization, equivocation, accent, the straw man fallacy, and ad hominem arguments. Several examples are presented from everyday speech, legislative debates and trials. Analysis using dialog models explains the critical defects of argumentation illustrated in each of the examples. In the formal dialog system CB, a proponent and respondent take turns in making moves in an orderly goal-directed sequence of argumentation in which the proponent tries to persuade the respondent to become committed to a conclusion by asking questions and offering arguments. Analyzing quotation by using the notion of commitment in dialog, it is shown (a) how an arguer’s previous assertions can be brought to light in the course of a dialog to deal with problems arising from misquotation, (b) how the profile of dialog model allows a critic to analyse the fundamental effects misquotation brings about in a dialog, and (c) how the critic can use such an analysis to correct the problem.
A problem for dialogue models of argumentation is to specify a set of conditions under which an opponent’s claims, offered in support of a standpoint under dispute, ought to be challenged. This project is related to the issue of providing a set of acceptability conditions for claims made in a dialogue. In this paper, we consider the conditions of suspicion and trust articulated by Jacobs (Alta, 2003), arguing that neither is acceptable as a general condition for challenge. We propose a third condition that attempts to mark a middle ground between suspicion and trust.
The representation and classification of the structure of natural arguments has been one of the most important aspects of Aristotelian and medieval dialectical and rhetorical theories. This traditional approach is represented nowadays in models of argumentation schemes. The purpose of this article is to show how arguments are characterized by a complex combination of two levels of abstraction, namely, semantic relations and types of reasoning, and to provide an effective and comprehensive classification system for this matrix of semantic and quasilogical connections. To this purpose, we propose a dichotomous criterion of classification, transcending both levels of abstraction and representing not what an argument is but how it is understood and interpreted. The schemes are grouped according to an end-means criterion, which is strictly bound to the ontological structure of the conclusion and the premises. On this view, a scheme can be selected according to the intended or reconstructed purpose of an argument and the possible strategies that can be used to achieve it.
The fields of linguistic pragmatics and legal interpretation are deeply interrelated. The purpose of this paper is to show how pragmatics and the developments in argumentation theory can contribute to the debate on legal interpretation. The relation between the pragmatic maxims and the presumptions underlying the legal canons is brought to light, unveiling the principles that underlie the types of argument usually used to justify a construction. The Gricean maxims and the arguments of legal interpretation are regarded as presumptions subject to default used to justify an interpretation. This approach can allow one to trace the different legal interpretive arguments back to their basic underlying presumptions, so that they can be compared, ordered, and assessed according to their defeasibility conditions. This approach allows one to understand the difference between various types of interpretive canons, and their strength in justifying an interpretation.
This paper builds a nine-step method for determining whether a straw man fallacy has been committed in a given case or not, by starting with some relatively easy textbook cases and moving to more realistic and harder cases. The paper shows how the type of argument associated with the fallacy can be proved to be a fallacy in a normative argumentation model, and then moves on to the practical task of building a hands-on method for applying the model to real examples of argumentation. Insights from linguistic pragmatics are used to distinguish the different pragmatic processes involved in reconstructing what is said and what is meant by an utterance, and to differentiate strong and weak commitments. In particular, the process of interpretation is analyzed in terms of an abductive pattern of reasoning, based on co-textual and contextual information, and assessable through the instruments of argumentation theory.
We contend that it is possible to argue reasonably for and against arguments from classifications and definitions, provided they are seen as defeasible (subject to exceptions and critical questioning). Arguments from classification of the most common sorts are shown to be based on defeasible reasoning of various kinds represented by patterns of logical reasoning called defeasible argumentation schemes. We show how such schemes can be identified with heuristics, or short-cut solutions to a problem. We examine a variety of arguments of this sort, including argument from abductive classification, argument from causal classification, argument from analogy-based classification and arguments from classification based on generalizations.
In this paper we present an analysis of persuasive definition based on argumentation schemes. Using the medieval notion of differentia and the traditional approach to topics, we explain the persuasiveness of emotive terms in persuasive definitions by applying the argumentation schemes for argument from classification and argument from values. Persuasive definitions, we hold, are persuasive because their goal is to modify the emotive meaning denotation of a persuasive term in a way that contains an implicit argument from values. However, our theory is different from Stevenson’s, a positivistic view that sees emotive meaning as subjective, and defines it as a behavioral effect. Our proposal is to treat the persuasiveness produced by the use of emotive words and persuasive definitions as due to implicit arguments that an interlocutor may not be aware of. We use congruence theory to provide the linguistic framework for connecting a term with the function it is supposed to play in a text. Our account allows us to distinguish between conflicts of values and conflicts of classifications.
Donald Trump’s speeches and messages are characterized by terms that are commonly referred to as “thick” or “emotive,” meaning that they tend to be used to generate emotive reactions. This paper investigates how emotive meaning is related to emotions, and how it is generated or manipulated. Emotive meaning is analyzed as an evaluative conclusion that results from inferences triggered by the use of a term, which can be represented and assessed using argumentation schemes. The evaluative inferences are regarded as part of the connotation of emotive words, which can be modified and stabilized by means of recontextualizations. The manipulative risks underlying the misuse and the redefinition of emotive words are accounted for in terms of presuppositions and implicit modifications of the interlocutors’ commitments.
In this paper a theoretical definition is presented that helps to explain how the logical structure of legal presumptions is constructed, by applying the Carneades model of argumentation developed in artificial intelligence. Using this model, it is shown how presumptions work as devices used in evidentiary reasoning in law in the event of a lack of evidence, to assist a chain of reasoning to move forward to prove or disprove a claim. It is shown how presumptions work as practical devices that may be useful in cases in which there is insufficient evidence to prove the claim to an appropriate standard of proof.
The purpose of this paper is to inquire into the relationship between persuasive definition and common knowledge (propositions generally accepted and not subject to dispute in a discussion). We interpret the gap between common knowledge and persuasive definition (PD) in terms of potential disagreements: PDs are conceived as implicit arguments to win a potential conflict. Persuasive definitions are analyzed as arguments instantiating two argumentation schemes, argument from classification and argument from values, and presupposing a potential disagreement. The argumentative structure of PDs reveals different levels of disagreement, and different possibilities of resolving the conflict or causing dialogical deadlock.
An ancient argument attributed to the philosopher Carneades is presented that raises critical questions about the concept of an all-virtuous Divine being. The argument is based on the premises that virtue involves overcoming pains and dangers, and that only a being that can suffer or be destroyed is one for whom there are pains and dangers. The conclusion is that an all-virtuous Divine (perfect) being cannot exist. After presenting this argument, reconstructed from sources in Sextus Empiricus and Cicero, this paper goes on to model it as a deductively valid sequence of reasoning. The paper also discusses whether the premises are true. Questions about the possibility and value of proving and disproving the existence of God by logical reasoning are raised, as well as ethical questions about how the cardinal ethical virtues should be defined.
In this paper we use a series of examples to show how oppositions and dichotomies are fundamental in legal argumentation, and vitally important to be aware of, because of their twofold nature. On the one hand, they are argument structures underlying various kinds of rational argumentation commonly used in law as a means of getting to the truth in a conflict of opinion under critical discussion by two opposing sides before a trier of fact. On the other hand, they are argument structures underlying moves made in strategic advocacy by both sides that function as platforms for different kinds of questionable argumentation tactics and moves that are in some instances tricky and deceptive.
One of the goals of physiologists who study the detailed physical, chemical, and neurological mechanisms operating within the human body is to understand the intricate causal processes which underlie human abilities and activities. It is doubtless premature to predict that they will eventually be able to explain the behaviour of a particular human being as we might now explain the behaviour of a pendulum clock or even the invisible changes occurring within the hardware of a modern electronic computer. Nonetheless, it seems fair to say that hovering in the background of investigations into human physiology is the promise or threat, depending upon how one looks at the matter, that human beings are complete physical-chemical systems and that all events taking place within their bodies and all movements of their bodies could be accounted for by physical causes if we but knew enough. I am not concerned at the moment with whether or not this ‘mechanistic’ hypothesis is true, assuming that it is clear enough to be intelligible, nor with whether or not we could ever know that it is true. I wish to consider the somewhat more accessible yet equally important question whether our coming to believe that the hypothesis is true would warrant our relinquishing our conception of ourselves as beings who are capable of acting for reasons to achieve ends of our own choosing. I use the word ‘warrant’ to indicate that I will not be discussing the possibility that believing the mechanistic hypothesis might lead us, as a matter of psychological fact, to think of human beings as mere automata, as objects whose movements are to be explained only by causes rather than by reasons, as are the actions of a personal subject. I intend to consider only whether the acceptance of mechanism would in fact justify such a change in conception.
So-called ‘distinctively mathematical explanations’ (DMEs) are said to explain physical phenomena, not in terms of contingent causal laws, but rather in terms of mathematical necessities that constrain the physical system in question. Lange argues that the existence of four or more equilibrium positions of any double pendulum has a DME. Here we refute both Lange’s claim itself and a strengthened and extended version of the claim that would pertain to any n-tuple pendulum system, on the ground that such explanations are actually causal explanations in disguise and their associated modal conditionals are not general enough to explain the said features of such dynamical systems. We argue and show that if circumscribing the antecedent for a necessarily true conditional in such explanations involves making a causal analysis of the problem, then the resulting explanation is not distinctively mathematical or non-causal. Our argument generalises to other dynamical systems that may have purported DMEs analogous to the one proposed by Lange, and even to some other counterfactual accounts of non-causal explanation given by Reutlinger and Rice.
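For readers unfamiliar with Lange's pendulum example, the purported DME can be reconstructed roughly as follows (our informal paraphrase of the standard topological presentation, not a quotation from the paper):

```latex
% Configuration space of a double pendulum: one angle per arm.
\[
  Q \;\cong\; S^1 \times S^1 \;=\; T^2 \quad \text{(a torus)}
\]
% Equilibria are the critical points of the potential energy V : Q -> R.
% For a Morse function on T^2, the sum of the Betti numbers gives a lower
% bound on the number of critical points:
\[
  \#\{\text{equilibrium configurations}\} \;\geq\; b_0 + b_1 + b_2 \;=\; 1 + 2 + 1 \;=\; 4 .
\]
% The DME claim is that this bound flows from the topology of Q alone,
% independently of the particular force laws; the paper's objection is that
% fixing Q (and the antecedent of the relevant conditional) already
% involves a causal analysis of the system.
```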
Antimicrobial resistance (AMR) is a global public health disaster driven largely by antibiotic use in human health care. Doctors considering whether to prescribe antibiotics face an ethical conflict between upholding individual patient health and advancing public health aims. Existing literature mainly examines whether patients awaiting consultations desire or expect to receive antibiotic prescriptions, but does not report views of the wider public regarding conditions under which doctors should prescribe antibiotics. It also does not explore the ethical significance of public views or their sensitivity to awareness of AMR risks or the standpoint (self-interested or impartial) taken by participants. Methods: An online survey was conducted with a sample of the U.S. public (n = 158). Participants were asked to indicate what relative priority should be given to individual patients and society-at-large from various standpoints and in various contexts, including antibiotic prescription. Results: Of the participants, 50.3% thought that doctors should generally prioritize individual patients over society, whereas 32.0% prioritized society over individual patients. When asked in the context of AMR, 39.2% prioritized individuals whereas 45.5% prioritized society. Participants were significantly less willing to prioritize society over individuals when they themselves were the patient, both in general (p = .001) and in relation to AMR specifically (p = .006). Conclusions: Participants’ attitudes were more oriented to society and sensitive to collective responsibility when informed about the social costs of antibiotic use and when considered from a third-person rather than first-person perspective. That is, as participants came closer to taking the perspective of an informed and impartial “ideal observer,” their support for prioritizing society increased. Our findings suggest that, insofar as antibiotic policies and practices should be informed by attitudes that are impartial and well-informed, there is significant support for prioritizing society.
(Longer version - work in progress) Various accounts of distinctively mathematical explanations (DMEs) of complex systems have been proposed recently which bypass the contingent causal laws and appeal primarily to mathematical necessities constraining the system. These necessities are considered to be modally exalted in that they obtain with a greater necessity than the ordinary laws of nature (Lange 2016). This paper focuses on DMEs of the number of equilibrium positions of n-tuple pendulum systems and considers several different DMEs of these systems which bypass causal features. It then argues that there is a tension between the modal strength of these DMEs and their epistemic hooking, and we are forced to choose between (a) a purported DME with greater modal strength and wider applicability but poor epistemic hooking, or (b) a narrowly applicable DME with lesser modal strength but with the right kind of epistemic hooking. It also aims to show why some kinds of DMEs may be unappealing for working scientists despite their strong modality, and why some DMEs fail to be modally robust because of making ill-informed assumptions about their target systems. The broader goal is to show why such tensions weaken the case for DMEs for pendulum systems in general.
In this paper, I will try to answer the question of how we are supposed to assess the expert’s opinion in an argument from the position of an outsider to the specialized field, by placing it in the larger context of the political status of epistemic authority. In order to do this I will first sketch the actual debate around the problem of expertise in a democracy and relate this to the issue of the status of science in society. Secondly, I will review how Douglas Walton’s pragma-dialectical approach offers a practical procedure to assess expert bias from a nonprofessional’s perspective. Thirdly, I will introduce the problem of group bias using insights from Bohman and Fischer and show how Walton’s solution does not address this specific type of bias. Lastly, I will try to propose a revision of Walton’s solution in order to address this problem. In order to make the explanation easier to follow I will use a case study concerning the medical expertise in the public debate on second-hand smoke.
Corroborative evidence can have a dual function in argument whereby not only does it have a primary function of providing direct evidence supporting the main conclusion, but it also has a secondary, bolstering function which increases the probative value of some other piece of evidence in the argument. It has been argued (Redmayne, 2000) that this double function gives rise to the fallacy of double counting, whereby the probative weight of evidence is overvalued by counting it twice. Walton has proposed several models of corroborative evidence, each of which seems to accept the fallaciousness of double counting, thereby seeming to deny the dual function of corroborative evidence. Against this view, I argue that the bolstering effect is legitimate, and can be explained by recourse to inference to the best explanation.
Kendall Walton’s “Categories of Art” (1970) is one of the most important and influential papers in twentieth-century aesthetics. It is almost universally taken to refute traditional aesthetic formalism/empiricism, according to which all that matters aesthetically is what is manifest to perception. Most commentators assume that the argument of “Categories” applies to works of literature. Walton himself notes a word of caution: “The aesthetic properties of works of literature are not happily called ‘perceptual’ … (The notion of perceiving a work in a category … is not straightforwardly applicable to literary works)” (335 n.5). However, he goes on to say that although he focuses “on visual and musical works … the central points I make concerning them hold, with suitable modifications, for novels, plays, and poems as well” (335 n.5). Here I consider what “suitable modifications” are required to extend the account to literature.
Douglas Walton’s work is extremely vast, multifaceted, and interdisciplinary. He developed theoretical proposals that have been used in disciplines that are not traditionally related to philosophy, such as law, education, discourse analysis, artificial intelligence, or medical communication. Through his papers and books, Walton redefined the boundaries not only of argumentation theory, but also of logic and philosophy. He was a philosopher in the sense that his interest was developing theoretical models that can help explain reality and, more importantly, interact with it. For this reason, he proposed methods that have been used for analyzing different types of dialogical interactions, and modeling procedures for regulating them.
Argument schemes—an epistemological approach. Christoph Lumer - 2011 - Argumentation. Cognition and Community. Proceedings of the 9th International Conference of the Ontario Society for the Study of Argumentation (OSSA), May 18-22, 2011.
The paper develops a classificatory system of basic argument types on the basis of the epistemological approach to argumentation. This approach has provided strict rules for several kinds of arguments. These kinds may be brought into a system of basic irreducible types, which rely on different parts of epistemology: deductive logic, probability theory, utility theory. The system reduces a huge mass of different argument schemes to basic types and gives them an epistemological foundation.