What are people who disagree about logic disagreeing about? The paper argues that (in a wide range of cases) they are primarily disagreeing about how to regulate their degrees of belief. An analogy is drawn between beliefs about validity and beliefs about chance: both sorts of belief serve primarily to regulate degrees of belief about other matters, but in both cases the concepts have a kind of objectivity nonetheless.
Neuroscience has studied deductive reasoning over the last 20 years under the assumption that deductive inferences are not only de jure but also de facto distinct from other forms of inference. The objective of this research is to determine whether logically valid deductions leave a cerebral electrical trace distinct from the trace left by non-valid deductions. 23 subjects with an average age of 20.35 years were recorded with MEG in a two-condition paradigm (100 trials for each condition) in which the two conditions presented exactly the same relational complexity (same variables and content) but distinct logical complexity. Both conditions show the same electromagnetic components (P3, N4) in the early temporal window (250–525 ms) and P6 in the late temporal window (500–775 ms). The significant activity in both valid and invalid conditions is found in sensors over medial prefrontal regions, probably corresponding to the ACC or the medial prefrontal cortex. The amplitude and intensity of valid deductions are significantly lower in both temporal windows (p = 0.0003). Reaction time was 54.37% slower in the valid condition. Validity leaves a minimal but measurable hypoactive electrical trace in brain processing. The lower electrical demand is attributable to the recursive and automatable character of valid deductions, suggesting a physical indicator of computational deductive properties. It is hypothesized that all valid deductions are recursive and hypoactive.
First, I characterize Simple Partial Logic (SPL) as the generalization and extension of a certain two-valued logic. Based on this characterization, I present two definitions of validity in SPL. Finally, I show that, given my characterization, these two definitions are more appropriate than other definitions that have been prevalent, since both have desirable semantic properties that the others lack.
The first learning game developed to help students build and hone skills in constructing proofs in both the propositional and first-order predicate calculi. It comprises an autotelic (self-motivating) learning approach to assist students in developing skills and strategies of proof in the propositional and predicate calculus. The text of VALIDITY consists of a general introduction that describes earlier studies of autotelic learning games, paying particular attention to work done at the Law School of Yale University, called the ALL Project (Accelerated Learning of Logic). Following the introduction, the game of VALIDITY is described, first with reference to the propositional calculus, and then in connection with the first-order predicate calculus with identity. Sections in the text are devoted to discussions of the various rules of derivation employed in both calculi. Three appendices follow the main text; these provide a catalogue of sequents and theorems that have been proved for the propositional calculus and for the predicate calculus, and include suggestions for the classroom use of VALIDITY in university-level courses in mathematical logic.
Background: Despite often being taken as the benchmark of quality for diagnostic and classificatory tools, 'validity' is acknowledged to be a poorly worked-out notion in psychiatric nosology. Objective: Here we aim to present a view that we believe does better justice to the significance of the notion of validity, and to explain away some misconceptions and inappropriate expectations regarding this attribute in the aforementioned context. Method: The notion of validity is addressed taking into account its role, the framework according to which it should be assessed, and the specific contents to which it refers within psychiatric nosology. Results and Conclusions: The notion of validity has an epistemological thrust, and its foremost role is distinguishing correct reasoning and truth from what is irrational or false. From this it follows not only that 'validity' always refers to elements of knowledge and rationality such as arguments, inferences and propositions, but also that the appropriate frameworks for assessing 'validity' are logics and scientific methodology. When the validity of a psychiatric diagnostic category is at stake, the contents to which it refers are those relevantly related to the notion of 'diagnostic concept'. The consequences of our reading of the notion of 'validity' are discussed vis-à-vis the challenges psychiatric nosology faces in order to have its diagnostic categories validated.
Beall and Murzi (2013: 143–165) introduce an object-linguistic predicate for naïve validity, governed by intuitive principles that are inconsistent with the classical structural rules. As a consequence, they suggest that revisionary approaches to semantic paradox must be substructural. In response to Beall and Murzi, Field (2017: 1–19) has argued that naïve validity principles do not admit of a coherent reading and that, for this reason, a non-classical solution to the semantic paradoxes need not be substructural. The aim of this paper is to respond to Field's objections and to point to a coherent notion of validity which underwrites a coherent reading of Beall and Murzi's principles: grounded validity. The notion, first introduced by Nicolai and Rossi, is a generalisation of Kripke's notion of grounded truth, and yields an irreflexive logic. While we do not advocate the adoption of a substructural logic, we take the notion of naïve validity to be a legitimate semantic notion that points to genuine expressive limitations of fully structural revisionary approaches.
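For orientation, the naïve validity principles at issue are standardly rendered as follows, where Val is the object-language validity predicate and ⟨·⟩ is a name-forming device (a schematic restatement, not necessarily the paper's exact formulation):

\[
\begin{array}{ll}
\text{(VP)} & \text{if } A \vdash B, \text{ then } \vdash \mathrm{Val}(\langle A \rangle, \langle B \rangle)\\[2pt]
\text{(VD)} & A,\ \mathrm{Val}(\langle A \rangle, \langle B \rangle) \vdash B
\end{array}
\]

Together with the classical structural rules of contraction and cut, these principles trivialize any theory capable of diagonalization, which is the inconsistency the abstract mentions.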
Tarski's Undefinability of Truth Theorem comes in two versions: that no consistent theory which interprets Robinson's Arithmetic (Q) can prove all instances of the T-Scheme and hence define truth; and that no such theory, if sound, can even express truth. In this note, I prove corresponding limitative results for validity. While Peano Arithmetic already has the resources to define a predicate expressing logical validity, as Jeff Ketland has recently pointed out (2012, 'Validity as a primitive', Analysis 72: 421–30), no theory which interprets Q and is closed under the standard structural rules can define or even express validity, on pain of triviality. The results put pressure on the widespread view that there is an asymmetry between truth and validity, viz. that while the former cannot be defined within the language, the latter can. I argue that Vann McGee's and Hartry Field's arguments for the asymmetry view are problematic.
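To see why triviality threatens, here is the usual v-Curry construction in outline, assuming the validity principles VP and VD above, diagonalization, and the structural rules of contraction and cut:

\[
\begin{array}{lll}
1. & \pi \dashv\vdash \mathrm{Val}(\langle \pi \rangle, \langle \bot \rangle) & \text{(diagonalization)}\\
2. & \pi, \pi \vdash \bot & \text{(1, VD)}\\
3. & \pi \vdash \bot & \text{(2, contraction)}\\
4. & \vdash \mathrm{Val}(\langle \pi \rangle, \langle \bot \rangle) & \text{(3, VP)}\\
5. & \vdash \pi & \text{(1, 4)}\\
6. & \vdash \bot & \text{(3, 5, cut)}
\end{array}
\]

Blocking the derivation therefore requires giving up one of the validity principles or one of the structural rules, which is the choice point on which the truth/validity asymmetry debate turns.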
Nontransitive responses to the validity Curry paradox face a dilemma that was recently formulated by Barrio, Rosenblatt and Tajer. It seems that, in the nontransitive logic ST enriched with a validity predicate, either you cannot prove that all derivable metarules preserve validity, or you can prove that instances of Cut that are not admissible in the logic preserve validity. I respond on behalf of the nontransitive approach. I argue, first, that we should reject the detachment principle for naive validity. Secondly, I show how to add a validity predicate to ST while avoiding the dilemma.
The logical-pragmatic perspective on psychiatric diagnosis presented by Rodriguez and Banzato contributes to and develops the existing conventional taxonomic framework. The latter is regarded as grounded in the epistemological prerequisites propounded by Carl Gustav Hempel in the late 1960s and adopted by the DSM task force of R. Spitzer in 1973.
Any theory of truth must find a way around Curry's paradox, and there are well-known ways to do so. This paper concerns an apparently analogous paradox, about validity rather than truth, which JC Beall and Julien Murzi call the v-Curry. They argue that there are reasons to want a common solution to it and the standard Curry paradox, and that this rules out the solutions to the latter offered by most “naive truth theorists.” To this end they recommend a radical solution to both paradoxes, involving a substructural logic, in particular, one without structural contraction. In this paper I argue that substructuralism is unnecessary. Diagnosing the “v-Curry” is complicated because of a multiplicity of readings of the principles it relies on. But these principles are not analogous to the principles of naive truth, and taken together, there is no reading of them that should have much appeal to anyone who has absorbed the morals of both the ordinary Curry paradox and the second incompleteness theorem.
This paper considers Rumfitt's bilateral classical logic (BCL), which is proposed to counter Dummett's challenge to classical logic. First, agreeing with several authors, we argue that Rumfitt's notion of harmony, used to justify logical rules in a purely proof-theoretical manner, is not sufficient to justify the coordination rules in BCL purely proof-theoretically. For the central part of this paper, we propose a notion of proof-theoretical validity for BCL, similar to Prawitz's, and prove that BCL is sound and complete with respect to this notion of validity. The major difficulty in defining validity for BCL is that the validity of a positive +A appears to depend on that of the negative −A, and vice versa; the straightforward inductive definition does not work because of this circular dependence. However, the Knaster–Tarski fixed point theorem can resolve the circularity. Finally, we discuss the philosophical relevance of our work, in particular the impact of the use of the fixed point theorem and the issue of decidability.
Logic arguably plays a role in the normativity of reasoning. In particular, there are plausible norms of belief/disbelief whose antecedents are constituted by claims about what follows from what. But is logic also relevant to the normativity of agnostic attitudes? The question here is whether logical entailment also puts constraints on what kinds of things one can suspend judgment about. In this paper I address that question and I give a positive answer to it. In particular, I advance two logical norms of agnosticism, where the first one allows us to assess situations in which the subject is agnostic about the conclusion of a valid argument and the second one allows us to assess situations in which the subject is agnostic about one of the premises of a valid argument.
The so-called materially valid inferences have come to new prominence through the work of Robert Brandom. This paper introduces a fragment of a logic of concepts that does not reduce concepts to their extensions. Concept logic and its semantics allow us to represent the conceptual knowledge used in material inferences and thus suggest a way to deal with them.
It is one thing for a given proposition to follow or not to follow from a given set of propositions, and it is quite another thing for it to be shown either that the given proposition follows or that it does not follow. Using a formal deduction to show that a conclusion follows and using a countermodel to show that a conclusion does not follow are both traditional practices recognized by Aristotle and used down through the history of logic. These practices presuppose, respectively, a criterion of validity and a criterion of invalidity, each of which has been extended and refined by modern logicians: deductions are studied in formal syntax (proof theory) and countermodels are studied in formal semantics (model theory). The purpose of this paper is to compare these two criteria to the corresponding criteria employed in Boole's first logical work, The Mathematical Analysis of Logic (1847). In particular, this paper presents a detailed study of the relevant metalogical passages and an analysis of Boole's symbolic derivations. It is well known, of course, that Boole's logical analysis of compound terms (involving ‘not’, ‘and’, ‘or’, ‘except’, etc.) contributed to the enlargement of the class of propositions and arguments formally treatable in logic. The present study shows, in addition, that Boole made significant contributions to the study of deductive reasoning. He identified the role of logical axioms (as opposed to inference rules) in formal deductions, and he conceived of the idea of an axiomatic deductive system (which yields logical truths by itself and which yields consequences when applied to arbitrary premises). Nevertheless, surprisingly, Boole's attempt to implement his idea of an axiomatic deductive system involved striking omissions: Boole does not use his own formal deductions to establish validity. Boole does give symbolic derivations, several of which are vitiated by “Boole's Solutions Fallacy”: the fallacy of supposing that a solution to an equation is necessarily a logical consequence of the equation. This fallacy seems to have led Boole to confuse equational calculi (i.e., methods for generating solutions) with deduction procedures (i.e., methods for generating consequences). The methodological confusion is closely related to the fact, shown in detail below, that Boole had adopted an unsound criterion of validity. It is also shown that Boole totally ignored the countermodel criterion of invalidity. Careful examination of the text does not reveal with certainty a test for invalidity which was adopted by Boole. However, we have isolated a test that he seems to use in this way, and we show that this test is ineffectual in the sense that it does not serve to identify invalid arguments. We go beyond the simple goal stated above. Besides comparing Boole's earliest criteria of validity and invalidity with those traditionally (and still generally) employed, this paper also investigates the framework and details of The Mathematical Analysis of Logic.
Monists say that the nature of truth is invariant, whichever sentence you consider; pluralists say that the nature of truth varies between different sets of sentences. The orthodoxy is that logic and logical form favour monism: there must be a single property that is preserved in any valid inference; and any truth-functional complex must be true in the same way as its components. The orthodoxy, I argue, is mistaken. Logic and logical form impose only structural constraints on a metaphysics of truth. Monistic theories are not guaranteed to satisfy these constraints, and there is a pluralistic theory that does so.
The need to distinguish between logical and extra-logical varieties of inference, entailment, validity, and consistency has played a prominent role in meta-ethical debates between expressivists and descriptivists. But, to date, the importance of matters of logical form in these distinctions has been overlooked. That's a mistake, given the foundational place that logical form occupies in our understanding of the difference between the logical and the extra-logical. This essay argues that descriptivists are better positioned than their expressivist rivals to provide the needed account of logical form, and so better able to capture the needed distinctions. This finding is significant for several reasons: First, it provides a new argument against expressivism. Second, it reveals that descriptivists can make use of this new argument only if they are willing to take a controversial—but plausible—stand on claims about the nature and foundations of logic.
Paraconsistent logics are logical systems that reject the classical principle, usually dubbed Explosion, that a contradiction implies everything. However, the received view about paraconsistency focuses only on the inferential version of Explosion, which is concerned with formulae, thereby overlooking other possible accounts. In this paper, we propose to focus, additionally, on a meta-inferential version of Explosion, i.e. one concerned with inferences or sequents. In doing so, we offer a new characterization of paraconsistency by means of which a logic is paraconsistent if it invalidates either the inferential or the meta-inferential notion of Explosion. We show the non-triviality of this criterion by discussing a number of logics: on the one hand, logics which validate both versions of Explosion or invalidate both, such as classical logic and Asenjo–Priest's 3-valued logic LP, respectively; on the other hand, logics which validate one version of Explosion but not the other, such as the substructural logics TS and ST, introduced by Malinowski and by Cobreros, Egré, Ripley and van Rooij, which are obtained via Malinowski's and Frankowski's q- and p-matrices, respectively.
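Schematically, the two versions of Explosion can be rendered as follows, with the meta-inferential version understood as a rule taking sequents to sequents (a standard sequent-style presentation, not necessarily the paper's exact notation):

\[
\text{inferential:}\quad A, \neg A \Rightarrow B
\qquad\qquad
\text{meta-inferential:}\quad \frac{\Rightarrow A \qquad \Rightarrow \neg A}{\Rightarrow B}
\]

On this way of carving things up, a logic can accept every classically valid sequent while still rejecting the metarule, or vice versa, which is how TS and ST come to occupy opposite corners of the classification.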
My first paper on the Is/Ought issue. The young Arthur Prior endorsed the Autonomy of Ethics, in the form of Hume's No-Ought-From-Is (NOFI), but the later Prior developed a seemingly devastating counter-argument. I defend Prior's earlier logical thesis (albeit in a modified form) against his later self. However, it is important to distinguish between three versions of the Autonomy of Ethics: Ontological, Semantic and Logical. Ontological Autonomy is the thesis that moral judgments, to be true, must answer to a realm of sui generis non-natural PROPERTIES. Semantic Autonomy insists on a realm of sui generis non-natural PREDICATES which do not mean the same as any natural counterparts. Logical Autonomy maintains that moral conclusions cannot be derived from non-moral premises with the aid of logic alone. Logical Autonomy does not entail Semantic Autonomy, and Semantic Autonomy does not entail Ontological Autonomy. But, given some plausible assumptions, Ontological Autonomy entails Semantic Autonomy, and given the conservativeness of logic – the idea that in a valid argument you don't get out what you haven't put in – Semantic Autonomy entails Logical Autonomy. So if Logical Autonomy is false – as Prior appears to prove – then Semantic and Ontological Autonomy would appear to be false too! I develop a version of Logical Autonomy (or NOFI) and vindicate it against Prior's counterexamples, which are also counterexamples to the conservativeness of logic as traditionally conceived. The key concept here is an idea derived in part from Quine: that of INFERENCE-RELATIVE VACUITY. I prove that you cannot derive conclusions in which the moral terms appear non-vacuously from premises from which they are absent. But this is because you cannot derive conclusions in which ANY (non-logical) terms appear non-vacuously from premises from which they are absent. Thus NOFI or Logical Autonomy comes out as an instance of the conservativeness of logic. This means that the reverse entailment that I have suggested turns out to be a mistake. The falsehood of Logical Autonomy would not entail either the falsehood of Semantic Autonomy or the falsehood of Ontological Autonomy, since Semantic Autonomy only entails Logical Autonomy with the aid of the conservativeness of logic, of which Logical Autonomy is simply an instance. Thus NOFI or Logical Autonomy is vindicated, but it turns out to be a less world-shattering thesis than some have supposed. It provides no support for either non-cognitivism or non-naturalism.
Logical pluralism is the view that there is more than one correct logic. Most logical pluralists think that logic is normative in the sense that you make a mistake if you accept the premisses of a valid argument but reject its conclusion. Some authors have argued that this combination is self-undermining: Suppose that L1 and L2 are correct logics that coincide except for the argument from Γ to φ, which is valid in L1 but invalid in L2. If you accept all sentences in Γ, then, by normativity, you make a mistake if you reject φ. In order to avoid mistakes, you should accept φ or suspend judgment about φ. Both options are problematic for pluralism. Can pluralists avoid this worry by rejecting the normativity of logic? I argue that they cannot. All else being equal, the argument goes through even if logic is not normative.
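In bare-bones form, the collapse worry turns on a single point of disagreement between two putatively correct logics (a schematic restatement of the setup above):

\[
\Gamma \vDash_{L_1} \varphi \qquad \text{and} \qquad \Gamma \nvDash_{L_2} \varphi.
\]

Once the normative principle is applied to L1, accepting every member of Γ already forbids rejecting φ, so the more permissive verdict of L2 never issues any guidance of its own; this is the sense in which pluralism is said to collapse into monism about the strongest correct logic.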
In this paper, the authors show that there is a reading of St. Anselm's ontological argument in Proslogium II that is logically valid (the premises entail the conclusion). This reading takes Anselm's use of the definite description "that than which nothing greater can be conceived" seriously. Consider a first-order language and logic in which definite descriptions are genuine terms, and in which the quantified sentence "there is an x such that..." does not imply "x exists". Then, using an ordinary logic of descriptions and a connected greater-than relation, God's existence logically follows from the claims: (a) there is a conceivable thing than which nothing greater is conceivable, and (b) if x doesn't exist, something greater than x can be conceived. To deny the conclusion, one must deny one of the premises. However, the argument involves no modal inferences and, interestingly, Descartes' ontological argument can be derived from it.
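A minimal first-order sketch of the reading described, with Cx for 'x is conceivable', Gxy for 'x is greater than y', E!x for 'x exists', and g abbreviating the description (ιx)(Cx ∧ ¬∃y(Gyx ∧ Cy)); the symbolization is illustrative rather than the authors' exact notation:

\[
\begin{array}{ll}
\text{(a)} & \exists x\,(Cx \land \neg\exists y\,(Gyx \land Cy))\\
\text{(b)} & \forall x\,(\neg E!x \rightarrow \exists y\,(Gyx \land Cy))\\
\hline
\text{(c)} & E!g
\end{array}
\]

Given (a), the description g denotes and satisfies ¬∃y(Gyg ∧ Cy); if g did not exist, (b) would yield some conceivable thing greater than g, contradicting that clause; hence E!g. The quantifiers range over conceivable objects without implying existence, which is what keeps the premises jointly consistent.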
I want to model a finite, fallible cognitive agent who imagines that p in the sense of mentally representing a scenario—a configuration of objects and properties—correctly described by p. I propose to capture imagination, so understood, via variably strict world quantifiers, in a modal framework including both possible and so-called impossible worlds. The latter secure lack of classical logical closure for the relevant mental states, while the variability of strictness captures how the agent imports information from actuality into the imagined non-actual scenarios. Imagination turns out to be highly hyperintensional, but not logically anarchic. Section 1 sets the stage, and impossible worlds are quickly introduced in Section 2. Section 3 proposes to model imagination via variably strict world quantifiers. Section 4 introduces the formal semantics. Section 5 argues that imagination has a minimal mereological structure validating some logical inferences. Section 6 deals with how imagination under-determines the represented contents. Section 7 proposes additional constraints on the semantics, validating further inferences. Section 8 describes some welcome invalidities. Section 9 examines the effects of importing false beliefs into the imagined scenarios. Finally, Section 10 hints at possible developments of the theory in the direction of two-dimensional semantics.
In this paper I discuss a prevailing view by which logical terms determine the forms of sentences and arguments and therefore the logical validity of arguments. This view is common to those who hold that there is a principled distinction between logical and nonlogical terms and to those holding relativistic accounts. I adopt the Tarskian tradition by which logical validity is determined by form, but reject the centrality of logical terms. I propose an alternative framework for logic where logical terms no longer play a distinctive role. This account employs a new notion of semantic constraints.
ABSTRACT: An introduction to Stoic logic. Stoic logic can in many respects be regarded as a forerunner of modern propositional logic. I discuss: 1. the Stoic notion of sayables or meanings (lekta); the Stoic assertibles (axiomata) and their similarities to and differences from modern propositions; the time-dependency of their truth; 2.–3. assertibles with demonstratives and quantified assertibles and their truth-conditions; truth-functionality of negations and conjunctions; non-truth-functionality of disjunctions and conditionals; language regimentation and ‘bracketing’ devices; Stoic basic principles of propositional logic; 4. Stoic modal logic; 5. Stoic theory of arguments: the two-premiss requirement; validity and soundness; 6. Stoic syllogistic, or theory of formally valid arguments: a reconstruction of the Stoic deductive system, which consisted of accounts of five types of indemonstrable syllogisms, which function as nullary argumental rules that identify the indemonstrables or axioms of the system, and four deductive rules (themata) by which certain complex arguments can be reduced to indemonstrables and thus shown to be formally valid themselves; 7. arguments that were considered non-syllogistically valid (subsyllogistic and unmethodically concluding arguments), whose validity was explained by recourse to formally valid arguments.
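For orientation, the five indemonstrable argument forms mentioned under 6. are standardly reconstructed as follows, with p, q standing in for assertibles (and with the caveat that the Stoic conditional is stronger than the material conditional and Stoic disjunction is exclusive, which is what validates the fourth mode):

\[
\begin{array}{ll}
1. & p \rightarrow q,\ p\ \therefore\ q\\
2. & p \rightarrow q,\ \neg q\ \therefore\ \neg p\\
3. & \neg(p \land q),\ p\ \therefore\ \neg q\\
4. & p \lor q,\ p\ \therefore\ \neg q\\
5. & p \lor q,\ \neg p\ \therefore\ q
\end{array}
\]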
The main objective of the paper is to provide the conceptual apparatus of a general logical theory of language communication: to outline a formal-logical theory of language in which the concepts of the phenomenon of language communication and of language communication in general are defined, and some conditions for their adequacy are formulated. The theory explicates the key notions of contemporary syntax, semantics, and pragmatics. The theory is formalized on two levels: token-level and type-level. As such, it takes into account the dual – token and type – ontological character of linguistic entities. The basic notions of the theory – language communication, meaning and interpretation – are introduced on the second, type-level of formalization, and they required prior formalization of some of the notions introduced on the first, token-level, among others the notion of an act of communication. Owing to the theory, it is possible to address the problems of adequacy both of empirical acts of communication and of language communication in general. All the conditions of adequacy of communication discussed in the paper are valid for one-way communication (sender-recipient); nevertheless, they can also apply to the reverse direction of language communication (recipient-sender). Therefore, they concern the problem of two-way understanding in language communication.
Argumentations are at the heart of the deductive and the hypothetico-deductive methods, which are involved in attempts to reduce currently open problems to problems already solved. These two methods span the entire spectrum of problem-oriented reasoning from the simplest and most practical to the most complex and most theoretical, thereby uniting all objective thought whether ancient or contemporary, whether humanistic or scientific, whether normative or descriptive, whether concrete or abstract. The analysis, synthesis, evaluation, and function of argumentations are described. Perennial philosophic problems, epistemic and ontic, related to argumentations are put in perspective. So much of what has been regarded as logic is seen to be involved in the study of argumentations that logic may be usefully defined as the systematic study of argumentations, which is virtually identical to the quest for an objective understanding of objectivity.
We are much better equipped to let the facts reveal themselves to us instead of blinding ourselves to them or stubbornly trying to force them into preconceived molds. We no longer embarrass ourselves in front of our students, for example, by insisting that “Some Xs are Y” means the same as “Some X is Y”, and lamely adding “for purposes of logic” whenever there is pushback. Logic teaching in this century can exploit the new spirit of objectivity, humility, clarity, observationalism, contextualism, and pluralism. Besides the new spirit there have been quiet developments in logic and its history and philosophy that could radically improve logic teaching. One rather conspicuous example is that the process of refining logical terminology has been productive. Future logic students will no longer be burdened by obscure terminology and they will be able to read, think, talk, and write about logic in a more careful and more rewarding manner. Closely related is increased use and study of variable-enhanced natural language as in “Every proposition x that implies some proposition y that is false also implies some proposition z that is true”. Another welcome development is the culmination of the slow demise of logicism. No longer is the teacher blocked from using examples from arithmetic and algebra fearing that the students had been indoctrinated into thinking that every mathematical truth was a tautology and that every mathematical falsehood was a contradiction. A fifth welcome development is the separation of laws of logic from so-called logical truths, i.e., tautologies. Now we can teach the logical independence of the laws of excluded middle and non-contradiction without fear that students had been indoctrinated into thinking that every logical law was a tautology and that every falsehood of logic was a contradiction. This separation permits the logic teacher to apply logic in the clarification of laws of logic. This lecture expands the above points, which apply equally well in first, second, and third courses, i.e. in “critical thinking”, “deductive logic”, and “symbolic logic”.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first-order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question.
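A common first-order rendering of the determination relation in this literature (following Davies and Russell's formulation; the notation here is illustrative) states that the value of P for an object fixes the value of Q:

\[
P(x, y) \succ Q(x, z) \;\equiv\; \forall y\,\forall z\,[\exists x\,(P(x,y) \land Q(x,z)) \rightarrow \forall x\,(P(x,y) \rightarrow Q(x,z))]
\]

So, for example, if nationality determines native language, then observing one Brazilian who speaks Portuguese licenses concluding that Brazilians in general speak Portuguese, even though the determination rule itself says nothing about which language goes with which nationality.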
Supervaluationism is often described as the most popular semantic treatment of indeterminacy. There's little consensus, however, about how to fill out the bare-bones idea to include a characterization of logical consequence. The paper explores one methodology for choosing between the logics: pick a logic that norms belief as classical consequence is standardly thought to do. The main focus of the paper is a variant of standard supervaluationism on which we can characterize degrees of determinacy. It applies the methodology above to focus on degree logic. This is developed first in a basic, single-premise case, and then extended to the multipremise case and to allow degrees of consequence. The metatheoretic properties of degree logic are set out. On the positive side, the logic is supraclassical: all classically valid sequents are degree-logic valid. Strikingly, metarules such as cut and conjunction introduction fail.
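Stated in sequent form (a standard rendering, not necessarily the paper's own notation), the failing metarules are the familiar ones:

\[
\text{cut:}\quad \frac{\Gamma \Rightarrow A \qquad A, \Delta \Rightarrow B}{\Gamma, \Delta \Rightarrow B}
\qquad\qquad
\text{conjunction introduction:}\quad \frac{\Gamma \Rightarrow A \qquad \Gamma \Rightarrow B}{\Gamma \Rightarrow A \land B}
\]

That every classically valid sequent remains valid while such metarules fail is precisely what makes the logic supraclassical at the level of sequents but non-classical at the level of metarules.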
In previous articles, it has been shown that the deductive system developed by Aristotle in his "second logic" is a natural deduction system and not an axiomatic system as previously had been thought. It was also stated that Aristotle's logic is self-sufficient in two senses: First, that it presupposed no other logical concepts, not even those of propositional logic; second, that it is (strongly) complete in the sense that every valid argument expressible in the language of the system is deducible by means of a formal deduction in the system. Review of the system makes the first point obvious. The purpose of the present article is to prove the second. Strong completeness is demonstrated for the Aristotelian system.
When discussing Logical Pluralism, several critics argue that such an open-minded position is untenable. The key to this conclusion is that, given a number of widely accepted assumptions, the pluralist view collapses into Logical Monism. In this paper we show that the arguments usually employed to arrive at this conclusion do not work. The main reason for this is the existence of certain substructural logics which have the same set of valid inferences as Classical Logic—although they are, in a clear sense, non-identical to it. We argue that this phenomenon can be generalized, given the existence of logics which coincide with Classical Logic regarding a number of metainferential levels—although they are, again, clearly different systems. We claim this highlights the need to arrive at a more refined version of the Collapse Argument, which we discuss at the end of the paper.
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelean syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities. With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation. Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic.
This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians: that between logic as formal ontology and logic as formal epistemology.
ABSTRACT: A detailed presentation of Stoic theory of arguments, including truth-value changes of arguments, Stoic syllogistic, Stoic indemonstrable arguments, Stoic inference rules (themata), including cut rules and antilogism, argumental deduction, elements of relevance logic in Stoic syllogistic, the question of completeness of Stoic logic, Stoic arguments valid in the specific sense, e.g. "Dio says it is day. But Dio speaks truly. Therefore it is day." A more formal and more detailed account of the Stoic theory of deduction can be found in S. Bobzien, Stoic Syllogistic, OSAP 1996.
For semantic inferentialists, the basic semantic concept is validity. An inferentialist theory of meaning should offer an account of the meaning of "valid." If one tries to add a validity predicate to one's object language, however, one runs into problems like the v-Curry paradox. In previous work, I presented a validity predicate for a non-transitive logic that can adequately capture its own meta-inferences. Unfortunately, in that system, one cannot show of any inference that it is invalid. Here I extend the system so that it can capture invalidities.
This paper is concerned with a propositional modal logic with operators for necessity, actuality and apriority. The logic is characterized by a class of relational structures defined according to ideas of epistemic two-dimensional semantics, and can therefore be seen as formalizing the relations between necessity, actuality and apriority according to epistemic two-dimensional semantics. We can ask whether this logic is correct, in the sense that its theorems are all and only the informally valid formulas. This paper gives outlines of two arguments that jointly show that this is the case. The first is intended to show that the logic is informally sound, in the sense that all of its theorems are informally valid. The second is intended to show that it is informally complete, in the sense that all informal validities are among its theorems. In order to give these arguments, a number of independently interesting results concerning the logic are proven. In particular, the soundness and completeness of two proof systems with respect to the semantics is proven (Theorems 2.11 and 2.15), as well as a normal form theorem (Theorem 3.2), an elimination theorem for the actuality operator (Corollary 3.6), and the decidability of the logic (Corollary 3.7). It turns out that the logic invalidates a plausible principle concerning the interaction of apriority and necessity; consequently, a variant semantics is briefly explored on which this principle is valid. The paper concludes by assessing the implications of these results for epistemic two-dimensional semantics.
ABSTRACT: A detailed presentation of Stoic logic, part one, including their theories of propositions (or assertibles, Greek: axiomata), demonstratives, temporal truth, simple propositions, non-simple propositions (conjunction, disjunction, conditional), quantified propositions, logical truths, modal logic, and general theory of arguments (including definition, validity, soundness, classification of invalid arguments).
What sort of logic do we get if we adopt a supervaluational semantics for vagueness? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing that within a supervaluational framework, the main alternative being between “global” construals (e.g., an argument is valid iff it preserves truth-under-all-precisifications) and “local” construals (an argument is valid iff, under all precisifications, it preserves truth). The former alternative is by far the more popular, but I argue in favor of the latter, for (i) it does not suffer from a number of serious objections, and (ii) it makes it possible to restore global validity as a defined notion.
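In symbols, writing p for admissible precisifications and true_p for truth under p, the contrast comes to this (a standard formulation of the two construals just described):

\[
\begin{array}{ll}
\text{global:} & \Gamma \vDash_{G} \varphi \iff \text{if } \mathrm{true}_p(\gamma) \text{ for all } p \text{ and all } \gamma \in \Gamma, \text{ then } \mathrm{true}_p(\varphi) \text{ for all } p\\[4pt]
\text{local:} & \Gamma \vDash_{L} \varphi \iff \text{for all } p: \text{ if } \mathrm{true}_p(\gamma) \text{ for all } \gamma \in \Gamma, \text{ then } \mathrm{true}_p(\varphi)
\end{array}
\]

Local validity implies global validity, but not conversely; this divergence is what drives the disagreement over which construal better captures supervaluational consequence.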
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, verifying De Morgan's Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on coherence of prevision assessments for some families of compounded conditionals; in particular we examine the Fréchet–Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction $\mathcal{C}_{n+1}$ of $n+1$ conditional events to the family $\{\mathcal{C}_{n}, E_{n+1}|H_{n+1}\}$. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related to the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non-p-valid inference rules; then, we illustrate by an example two methods which allow us to suitably modify non-p-valid inference rules in order to get inferences which are p-valid.
What is a valid measuring instrument? Recent philosophy has attended to the logic of justifying measures, such as construct validation, but not to the question of what it means for an instrument to be a valid measure of a construct. A prominent approach grounds validity in the existence of a causal link between the attribute and its detectable manifestations. Some of its proponents claim that, therefore, validity does not depend on pragmatics and research context. In this paper, I cast doubt on the possibility of a context-independent causal account of validity. I assess several versions, arguing that all of them fail to judge the validity of measuring instruments correctly. Because different research purposes require different properties from measuring instruments, no account of validity succeeds without referring to the specific research purpose that creates the need for measurement in the first place.
Causal models provide a framework for counterfactual prediction, which makes them useful for evaluating the truth conditions of counterfactual sentences. However, current causal models for counterfactual semantics face limitations compared to the alternative similarity-based approach: they apply only to a limited subset of counterfactuals, and the connection to counterfactual logic is not straightforward. This paper argues that these limitations arise from the theory of interventions, where intervening on variables requires changing structural equations rather than the values of variables. Using an alternative theory of exogenous interventions, this paper extends the causal approach to counterfactuals to handle more complex counterfactuals, including backtracking counterfactuals and those with logically complex antecedents. The theory also validates familiar principles of counterfactual logic and offers an explanation of counterfactual disagreement and of backtracking readings of forward counterfactuals.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
I consider the first-order modal logic which counts as valid those sentences which are true on every interpretation of the non-logical constants. Based on the assumptions that it is necessary what individuals there are and that it is necessary which propositions are necessary, Timothy Williamson has tentatively suggested an argument for the claim that this logic is determined by a possible-world structure consisting of an infinite set of individuals and an infinite set of worlds. He notes that only the cardinalities of these sets matter, and that not all pairs of infinite sets determine the same logic. I use so-called two-cardinal theorems from model theory to investigate the space of logics and consequence relations determined by pairs of infinite sets, and show how to eliminate the assumption that worlds are individuals from Williamson's argument.
A complete axiomatic system CTL$_{rp}$ is introduced for a temporal logic for finitely branching $\omega^+$-trees in a temporal language extended with so-called reference pointers. Syntactic and semantic interpretations are constructed for the branching time computation tree logic CTL$^{*}$ into CTL$_{rp}$. In particular, that yields a complete axiomatization for the translations of all valid CTL$^{*}$-formulae. Thus, the temporal logic with reference pointers is brought forward as a simpler (with no path quantifiers), but in a way more expressive, medium for reasoning about branching time.
Revised version of a chapter in J. N. Mohanty and W. McKenna (eds.), Husserl's Phenomenology: A Textbook, Lanham: University Press of America, 1989, 29–67. Logic for Husserl is a science of science, a science of what all sciences have in common in their modes of validation. Thus logic deals with universal laws relating to truth, to deduction, to verification and falsification, and with laws relating to theory as such, and to what makes for theoretical unity, both on the side of the propositions of a theory and on the side of the domain of objects to which these propositions refer. This essay presents a systematic overview of Husserl's views on these matters as put forward in his Logical Investigations. It shows how Husserl's theory of linguistic meanings as species of mental acts, his formal ontology of part, whole and dependence, his theory of meaning categories, and his theory of categorial intuition combine with his theory of science to form a single whole. Finally, it explores the ways in which Husserl's ideas on these matters can be put to use in solving problems in the philosophy of language, logic and mathematics in a way which does justice to the role of mental activity in each of these domains while at the same time avoiding the pitfalls of psychologism.
Definitions I presented in a previous article as part of a semantic approach in epistemology assumed that the concept of derivability from standard logic held across all mathematical and scientific disciplines. The present article argues that this assumption is not true for quantum mechanics (QM) by showing that concepts of validity applicable to proofs in mathematics and in classical mechanics are inapplicable to proofs in QM. Because semantic epistemology must include this important theory, revision is necessary. The one I propose also extends semantic epistemology beyond the ‘hard’ sciences. The article ends by presenting and then refuting some responses QM theorists might make to my arguments.
The following four theses all have some intuitive appeal: (I) There are valid norms. (II) A norm is valid only if justified by a valid norm. (III) Justification, on the class of norms, has an irreflexive proper ancestral. (IV) There is no infinite sequence of valid norms each of which is justified by its successor. However, at least one must be false, for (I)–(III) together entail the denial of (IV). There is thus a conflict between intuition and logical possibility. This paper, after distinguishing various conceptions of a norm, of validity and of justification, argues for the following position. (I) is true. (II) is false for legislative justification and true for epistemic justification. (III) is true for legislative and false for epistemic justification. (IV) is true for legislative justification; for epistemic justification (IV) is true or false depending on the conception taken of a norm. Our intuition in favour of (II) must therefore be abandoned where justification is conceived legislatively. Our intuition in favour of (III) must be abandoned, and our intuition in favour of (IV) qualified, where justification is conceived epistemically.
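The entailment from (I)–(III) to the denial of (IV) can be made explicit in a few steps (a reconstruction of the reasoning the abstract reports):

\[
\begin{array}{ll}
1. & \text{By (I), there is a valid norm } n_0.\\
2. & \text{By (II), each valid } n_i \text{ is justified by some valid } n_{i+1}, \text{ giving a sequence } n_0, n_1, n_2, \ldots\\
3. & \text{By (III), the justification chain never loops back: no norm justifies itself, even indirectly.}\\
4. & \text{So the } n_i \text{ are pairwise distinct and the sequence is infinite, contradicting (IV).}
\end{array}
\]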
Suppose that a sign at the entrance of a hotel reads: “Don't enter these premises unless you are accompanied by a registered guest”. You see someone who is about to enter, and you tell her: “Don't enter these premises if you are an unaccompanied registered guest”. She asks why, and you reply: “It follows from what the sign says”. It seems that you made a valid inference from an imperative premise to an imperative conclusion. But it also seems that imperatives cannot be true or false, so what does it mean to say that your inference is valid? It cannot mean that the truth of its premise guarantees the truth of its conclusion. One is thus faced with what is known as “Jørgensen's dilemma” (Ross 1941: 55-6): it seems that imperative logic cannot exist because logic deals only with entities that, unlike imperatives, can be true or false, but it also seems that imperative logic must exist. It must exist not only because inferences with imperatives can be valid, but also because imperatives (like “Enter” and “Don't enter”) can be inconsistent with each other, and also because one can apply logical operations to imperatives: “Don't enter” is the negation of “Enter”, and “Sing or dance” is the disjunction of “Sing” and “Dance”. A standard reaction to this dilemma consists in basing imperative logic on analogues of truth and falsity. For example, the imperative “Don't enter” is satisfied if you don't enter and is violated if you enter, and one might say that an inference from an imperative premise to an imperative conclusion is valid exactly if the satisfaction (rather than the truth) of the premise guarantees the satisfaction of the conclusion. But before getting into the details, more needs to be said on what exactly imperatives are.
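On the satisfaction-based proposal just sketched, the hotel example checks out. Writing E for 'you enter', A for 'you are accompanied by a registered guest', and R for 'you are a registered guest' (an illustrative symbolization), the satisfaction conditions are:

\[
\text{premise (sign):}\quad E \rightarrow A
\qquad\qquad
\text{conclusion:}\quad (R \land \neg A) \rightarrow \neg E
\]

Any valuation satisfying E → A also satisfies the contrapositive ¬A → ¬E, and hence (R ∧ ¬A) → ¬E; so satisfaction of the premise guarantees satisfaction of the conclusion, which is what satisfaction-based validity requires of a valid imperative inference.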
Complete deductive systems are constructed for the non-valid (refutable) formulae and sequents of some propositional modal logics. Thus, complete syntactic characterizations in the sense of Łukasiewicz are established for these logics and, in particular, purely syntactic decision procedures for them are obtained. The paper also contains some historical remarks and a general discussion of refutation systems.
While Thomas Kuhn's theory of scientific revolutions does not specifically deal with validation, the validation of simulations can be related in various ways to Kuhn's theory: 1) Computer simulations are sometimes depicted as located between experiments and theoretical reasoning, thus potentially blurring the line between theory and empirical research. Does this require a new kind of research logic that is different from the classical paradigm, which clearly distinguishes between theory and empirical observation? I argue that this is not the case. 2) Another typical feature of computer simulations is their being “motley” (Winsberg 2003) with respect to the various premises that enter into simulations. A possible consequence is that in case of failure it can become difficult to tell which of the premises is to blame. Could this issue be understood as fostering Kuhn's mild relativism with respect to theory choice? I argue that there is no need to worry about relativism with respect to computer simulations in particular. 3) The field of social simulations, in particular, still lacks a common understanding concerning the requirements of empirical validation of simulations. Does this mean that social simulations are still in a pre-scientific state in the sense of Kuhn? My conclusion is that, despite ongoing efforts to promote quality standards in this field, lack of proper validation is still a problem of many published simulation studies and that at least large parts of social simulations must be considered pre-scientific.