Complete deductive systems are constructed for the non-valid (refutable) formulae and sequents of some propositional modal logics. Thus, complete syntactic characterizations in the sense of Łukasiewicz are established for these logics and, in particular, purely syntactic decision procedures for them are obtained. The paper also contains some historical remarks and a general discussion of refutation systems.
This is the first of a two-volume work combining two fundamental components of contemporary computing into classical deductive computing, a powerful form of computation, highly adequate for programming and automated theorem proving, which in turn have fundamental applications in areas of high complexity and/or high security such as mathematical proof, software specification and verification, and expert systems. Deductive computation is concerned with truth-preservation: this is the essence of the satisfiability problem, or SAT, the central computational problem in computability and complexity theory. The Turing machine provides the classical version of this theory, classical computing, with its standard model, which is physically concretized, and thus spatio-temporally limited and restricted, in the von Neumann, or digital, computer. Although a number of new technological applications require classical deductive computation with non-classical logics, many key technologies still do well, or even exclusively, with classical logic. In this first volume, we elaborate on classical deductive computing with classical logic. The objective of the main text is to provide the reader with a thorough treatment of both classical computing and classical deduction in the classical first-order predicate calculus, with a view to computational implementations. As a complement to the mathematically based exposition of the topics, we offer the reader a very large selection of exercises. This selection aims not only at practice with the discussed material, but also at creative approaches to problems, for both discussed and novel content, as well as at research into further relevant topics.
In this paper, two axiomatic theories T− and T′ are constructed, which are dual to Tarski's theory T+ (1930) of deductive systems based on classical propositional calculus. While in Tarski's theory T+ the primitive notion is the classical consequence function (entailment) Cn+, in the dual theory T− it is replaced by the notion of Słupecki's rejection consequence Cn−, and in the dual theory T′ by the notion of the family Incons of inconsistent sets. The author has proved that the theories T+, T−, and T′ are equivalent.
ABSTRACT: This 1974 paper builds on our 1969 paper (Corcoran–Weaver [2]). Here we present three (modal, sentential) logics which may be thought of as partial systematizations of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of these three logics coincide with one another and with those of standard formalizations of Lewis's S5. These logics, when regarded as logistic systems (cf. Corcoran [1], p. 154), are seen to be equivalent; but, when regarded as consequence systems (ibid., p. 157), one diverges from the others in a fashion which suggests that two standard measures of semantic complexity may not be as closely linked as previously thought.

This 1974 paper uses the linear notation for natural deduction presented in [2]: each two-dimensional deduction is represented by a unique one-dimensional string of characters, thus obviating the need for two-dimensional trees, tableaux, lists, and the like, and thereby facilitating electronic communication of natural deductions. The 1969 paper presents a (modal, sentential) logic which may be thought of as a partial systematization of the semantic and deductive properties of a sentence operator which expresses certain kinds of necessity. The logical truths [sc. tautologies] of this logic coincide with those of standard formalizations of Lewis's S4. Among the paper's innovations is its treatment of modal logic in the setting of natural deduction systems, as opposed to axiomatic systems. The authors apologize for the now obsolete terminology. For example, these papers speak of "a proof of a sentence from a set of premises" where today "a deduction of a sentence from a set of premises" would be preferable.

1. Corcoran, John. 1969. Three Logical Theories, Philosophy of Science 36, 153–77.
2. Corcoran, John and George Weaver. 1969. Logical Consequence in Modal Logic: Natural Deduction in S5, Notre Dame Journal of Formal Logic 10, 370–84. MR0249278 (40 #2524).
3. Weaver, George and John Corcoran. 1974. Logical Consequence in Modal Logic: Some Semantic Systems for S4, Notre Dame Journal of Formal Logic 15, 370–78. MR0351765 (50 #4253).
To eliminate incompleteness, undecidability, and inconsistency from formal systems, we only need to convert the formal proofs to theorem consequences of symbolic logic so that they conform to the sound deductive inference model.

Within the sound deductive inference model there is a connected sequence of valid deductions from true premises to a true conclusion; thus, unlike the formal proofs of symbolic logic, provability cannot diverge from truth.
This paper is concerned with the construction of theories of software systems yielding adequate predictions of their target systems' computations. It is first argued that mathematical theories of programs are not able to provide predictions that are consistent with observed executions. Empirical theories of software systems are here introduced semantically, in terms of a hierarchy of computational models that are supplied by formal methods and testing techniques in computer science. Both deductive top-down and inductive bottom-up approaches to the discovery of semantic software theories are rejected, arguing in favour of the abductive process of hypothesising and refining models at each level in the hierarchy until they become satisfactorily predictive. Empirical theories of computational systems are required to be modular, as most software verification and testing activities are modular. We argue that logic relations must thereby be defined among models representing different modules in a semantic theory of a modular software system. We deny that scientific structuralism is able to define the module relations needed in software modular theories. The algebraic Theory of Institutions is finally introduced to specify the logical structure of modular semantic theories of computational systems.
In this paper, partly historical and partly theoretical, after having briefly outlined the development of meta-ethics in the 1900s starting from Wittgenstein's Tractatus, I argue that it is possible to maintain that emotivism and intuitionism are unsatisfactory ethical conceptions, while, on the contrary, reason (understood in a logical-deductive sense) plays an effective role both in ethical discussions and in choices. There are some characteristics of ethical language (prescriptivity, universalizability, and predominance) that cannot be eluded (on pain of the insignificance of that very language) by those who want to reason morally, i.e. by those who intend to regulate their own behaviour on the basis of known and coherent principles. These characteristics can be found whether or not any possible ontological-metaphysical foundation of morals is taken into account. Furthermore, the systems of deontic logic allow the formalization of ethical theories and, at least in principle, a rigorous critical discussion of them, but obviously nothing can be affirmed about the truth-value of the axioms of a system. In systems of deontic logic, Hume's law is assumed as an implicit result of (conventional) inferential rules, and the acceptance of Hume's law as a logical-linguistic thesis does not entail the cancellation of values (nihilism), ethical relativism, or indifferentism.
The idea of rejection originated with Aristotle. The notion of rejection was introduced into formal logic by Łukasiewicz [20], who applied it to the complete syntactic characterization of deductive systems using an axiomatic method of rejection of propositions [22, 23]. The paper gives not only the genesis, but also the development and generalization of the notion of rejection. It also emphasizes the methodological approach to the biaspectual axiomatic method of characterizing deductive systems as acceptance (assertion) systems and rejection (refutation) systems, introduced by Łukasiewicz and developed by his student Słupecki, the pioneers of the method, which has become relevant in modern approaches to logic.
This note considers deductive systems for the operator ⊣ of unprovability in some particular propositional normal modal logics. We thus give a complete syntactic characterization of these logics in the sense of Łukasiewicz: for every formula α, either ⊢ α or ⊣ α (but not both) is derivable. In particular, a purely syntactic decision procedure is provided for the logics under consideration.
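The assert/reject dichotomy above can be illustrated with a toy semantic stand-in: for classical propositional logic, a truth-table check sorts every formula into exactly one of the two camps, mirroring the claim that either ⊢ α or ⊣ α is derivable, never both. This is only a sketch under my own encoding; the truth-table test is a semantic surrogate for the note's purely syntactic procedure, and for modal logics the real procedure is more involved.

```python
from itertools import product

# Formulas as nested tuples: ("var", "p"), ("not", f), ("imp", f, g).
def atoms(f):
    if f[0] == "var":
        return {f[1]}
    return set().union(*(atoms(g) for g in f[1:]))

def ev(f, v):
    op = f[0]
    if op == "var":
        return v[f[1]]
    if op == "not":
        return not ev(f[1], v)
    if op == "imp":
        return (not ev(f[1], v)) or ev(f[2], v)

def classify(f):
    """'asserted' iff f is a (classical) tautology, else 'rejected'."""
    vs = sorted(atoms(f))
    for vals in product([False, True], repeat=len(vs)):
        if not ev(f, dict(zip(vs, vals))):
            return "rejected"
    return "asserted"

p = ("var", "p")
print(classify(("imp", p, p)))   # p -> p is asserted
print(classify(p))               # a bare variable is rejected
```

Every formula lands in exactly one camp, which is the syntactic completeness claim in miniature.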
We investigate an enrichment of the propositional modal language ℒ with a "universal" modality ■ having the semantics x ⊧ ■φ iff ∀y(y ⊧ φ), and a countable set of "names", a special kind of propositional variables ranging over singleton sets of worlds. The obtained language ℒc proves to have great expressive power. It is equivalent with respect to modal definability to another enrichment ℒ([≠]) of ℒ, where [≠] is an additional modality with the semantics x ⊧ [≠]φ iff ∀y(y ≠ x → y ⊧ φ). Model-theoretic characterizations of modal definability in these languages are obtained. Further, we consider deductive systems in ℒc. Strong completeness of the normal ℒc-logics is proved with respect to models in which all worlds are named. Every ℒc-logic axiomatized by formulae containing only names (but not propositional variables) is proved to be strongly frame-complete. Problems concerning the transfer of properties ([in]completeness, filtration, finite model property, etc.) from ℒ to ℒc are discussed. Finally, further perspectives for names in a multimodal environment are briefly sketched.
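The two modalities above are inter-definable: at every world, ■φ holds iff φ ∧ [≠]φ does. This can be checked mechanically on a tiny finite model. Below is a minimal sketch under my own encoding (a formula is represented by its extension, the set of worlds where it holds), not the paper's apparatus.

```python
# Worlds of a tiny model; a formula is represented by its extension.
W = {1, 2, 3}

def box_u(ext):
    # universal modality: x |= [U]phi iff every world satisfies phi
    return W if ext == W else set()

def box_d(ext):
    # difference modality: x |= [!=]phi iff every world y != x satisfies phi
    return {x for x in W if all(y in ext for y in W if y != x)}

# Inter-definability: [U]phi has the same extension as phi AND [!=]phi
for ext in [set(), {1}, {1, 2}, set(W)]:
    assert box_u(ext) == ext & box_d(ext)
```

For instance, with extension {1, 2}, the difference modality holds only at world 3 (every *other* world satisfies φ), while the universal modality holds nowhere, and the conjunction with φ correctly empties it out.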
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the ‘logicality of language’, accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is ‘blind’ to open class terms, to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical, indeed quite exotic, kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-à-vis its open class terms and employs a deductive system that is basically classical.
We introduce and study hierarchies of extensions of the propositional modal and temporal languages with pairs of new syntactic devices, points of reference and reference pointers, which enable semantic references to be made within a formula. We propose three different but equivalent semantics for the extended languages, and discuss and compare their expressiveness. The languages with reference pointers are shown to have great expressive power (especially when their frugal syntax is taken into account), perspicuous semantics, and simple deductive systems. For instance, Kamp's and Stavi's temporal operators, as well as nominals (names, clock variables), are definable in them. Universal validity in these languages is proved undecidable. The basic modal and temporal logics with reference pointers are uniformly axiomatized, and a strong completeness theorem is proved for them and extended to some classes of their extensions.
This paper is concerned with the claim that supervaluationist consequence is not classical for a language including an operator for definiteness. Although there is some sense in which this claim is uncontroversial, there is a sense in which it must be qualified. In particular, I defend Keefe's position, according to which supervaluationism is classical except when the inference from φ to Dφ is involved. The paper gives precise content to this claim by showing that we can provide complete (and sound) systems of deduction for supervaluationist consequence in which proofs are completely classical with the exception of a single last step (involving the above-mentioned inference).
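The qualification above (classicality fails only where the inference from φ to Dφ enters) can be seen in a toy model. In a borderline case, the global consequence from φ to Dφ is not threatened, yet conditional proof on that very step fails: the conditional φ → Dφ need not be supertrue. A minimal sketch, with my own encoding of precisifications:

```python
# Precisifications: classical valuations of a single borderline letter p.
# 'Supertrue' means true on every precisification; D(phi) ("definitely phi")
# holds iff phi holds on ALL precisifications, so D ignores the local one.
precs = [{"p": True}, {"p": False}]

def supertrue(f):
    return all(f(v) for v in precs)

p = lambda v: v["p"]
Dp = lambda v: all(p(w) for w in precs)
p_implies_Dp = lambda v: (not p(v)) or Dp(v)

# p is not supertrue here (it is borderline), so the global consequence
# from p to Dp is respected; but the conditional p -> Dp is NOT supertrue,
# so conditional proof fails exactly at the p-to-Dp step.
assert not supertrue(p)
assert not supertrue(p_implies_Dp)
```

The failure is localized exactly where the paper says it is: every classical step except the one discharging p into Dp survives supervaluation.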
Deductionism assimilates nature to conceptual artifacts (models, equations), and tacitly holds that real physical systems are such artifacts. Some physical concepts represent properties of deductive systems rather than of nature. Properties of mathematical or deductive systems can thereby sometimes falsely be ascribed to natural systems.
A graph-theoretic account of fibring of logics is developed, capitalizing on the interleaving characteristics of fibring at the linguistic, semantic, and proof levels. Fibring of two signatures is seen as a multi-graph (m-graph) where the nodes and the m-edges include the sorts and the constructors of the signatures at hand. Fibring of two models is an m-graph where the nodes and the m-edges are the values and the operations in the models, respectively. Fibring of two deductive systems is an m-graph whose nodes are language expressions and whose m-edges represent the inference rules of the two original systems. The sobriety of the approach is confirmed by proving that all the fibring notions are universal constructions. This graph-theoretic view is general enough to accommodate very different fibrings of propositional-based logics, encompassing logics with non-deterministic semantics, logics with an algebraic semantics, logics with partial semantics, and substructural logics, among others. Soundness and weak completeness are proved to be preserved under very general conditions. Strong completeness is also shown to be preserved under tighter conditions. In this setting, the collapsing problem appearing in several combinations of logic systems can be avoided.
This is a PhD dissertation, written under the supervision of Professor Jerzy Słupecki, published in the book: U. Wybraniec-Skardowska and Grzegorz Bryll, "Z badań nad teorią zdań odrzuconych" ("Studies in the theory of rejected sentences"), Zeszyty Naukowe Wyższej Szkoły Pedagogicznej w Opolu, Seria B: Studia i Monografie nr 22, pp. 5–131. It is the first original publication on the theory of rejected sentences, on which are based, among others, the papers "Theory of rejected propositions. I" and "Theory of rejected propositions. II" with Jerzy Słupecki and Grzegorz Bryll.
The paper is about 'absolute logic': an approach to logic that differs from standard first-order logic and other known approaches. It is a new approach, created by the author, that aims to provide a general and unifying approach to logic and a faithful model of the human mathematical deductive process. In first-order logic there are two different concepts, term and formula; in place of these two concepts, our approach has just one notion of expression. In our system the set-builder notation is an expression-building pattern, and we can easily express second-order, third-order, and any-order conditions. The meaning of a sentence depends solely on the meaning of the symbols it contains; it does not depend on external 'structures'. Our deductive system is based on a very simple definition of proof and provides a good model of the human mathematical deductive process. The soundness and consistency of the system are proved. We discuss the completeness of our deductive system. We also discuss how our system relates to the best-known types of paradoxes; from the discussion, no specific vulnerability to paradoxes emerges. The paper provides both the theoretical material and a fully documented example of a deduction.
Ian Rumfitt has proposed systems of bilateral logic for primitive speech acts of assertion and denial, with the purpose of 'exploring the possibility of specifying the classically intended senses for the connectives in terms of their deductive use' (810f). Rumfitt formalises two systems of bilateral logic and gives two arguments for their classical nature. I assess both arguments and conclude that only one system satisfies the meaning-theoretical requirements Rumfitt imposes in his arguments. I then formalise an intuitionist system of bilateral logic which also meets those requirements. Thus Rumfitt cannot claim that only classical bilateral rules of inference succeed in imparting a coherent sense onto the connectives. My system can be extended to classical logic by adding the intuitionistically unacceptable half of a structural rule Rumfitt uses to codify the relation between assertion and denial. Thus there is a clear sense in which, in the bilateral framework, the difference between classicism and intuitionism is not one of the rules of inference governing negation, but rather one of the relation between assertion and denial.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians: that between logic as formal ontology and logic as formal epistemology.
It is argued that the goal of Hilbert's program was to prove the model-theoretical consistency of different axiom systems. This Hilbert proposed to do by proving the deductive consistency of the relevant systems. In the extended independence-friendly logic there is a complete proof method for the contradictory negations of independence-friendly sentences, so the existence of a single proposition that is not disprovable from arithmetic axioms can be shown formally in the extended independence-friendly logic. It can also be proved by means of independence-friendly logic that proof-theoretical consistency of a sentence S implies the existence of a model in which S is not false. Hence the consistency of the axioms of arithmetic in the sense of being not-false in a model can be proved.
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values. The main technical problem studied in this paper is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria and various (counter) examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
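Criterion (iii), reduction to deductive logic in the 0/1 limit, can be illustrated in a toy finite case: putting probability 1 on a knowledge base {A, A → B} forces probability 1 on B. A minimal sketch with a uniform prior over truth assignments; all names are mine, and this is far simpler than the paper's general construction over expressive languages.

```python
from itertools import product

# Worlds = truth assignments to (A, B); uniform prior, then condition on
# the knowledge base {A, A -> B} holding with probability 1.
worlds = list(product([False, True], repeat=2))
prior = {w: 0.25 for w in worlds}

def prob(event, dist):
    return sum(pr for w, pr in dist.items() if event(w))

def condition(event, dist):
    z = prob(event, dist)
    return {w: (pr / z if event(w) else 0.0) for w, pr in dist.items()}

kb = lambda w: w[0] and ((not w[0]) or w[1])   # A and (A -> B)
post = condition(kb, prior)

# Modus ponens recovered in the deductive limit: P(B | KB) = 1
assert prob(lambda w: w[1], post) == 1.0
```

Only the world (A=True, B=True) satisfies the knowledge base, so conditioning concentrates all mass there and the probabilistic query collapses into the deductive consequence.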
Because formal systems of symbolic logic inherently express and represent the deductive inference model, formal proofs to theorem consequences can be understood to represent sound deductive inference to deductive conclusions without any need for other representations.
Because formal systems of symbolic logic inherently express and represent the deductive inference model, formal proofs to theorem consequences can be understood to represent sound deductive inference to true conclusions without any need for other representations such as model theory.
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelian syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities.
With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
For many years, the human need for the group and social life, and the impact of this form of life on mental and bodily health, have been discussed. Much less is said about loneliness and the role it plays for human beings. Loneliness is a global issue, experienced more or less by all humans during their lives. In other words, people of different races, cultures, social classes, ages, and times each experience some kind of loneliness. It is true that the human being is a thoroughly social being, and we constantly hear of the benefits of communication and the satisfaction drawn from it; but the constructive and positive aspects of being alone should not be overlooked, especially in education systems. Hence, this article examines the opinions and views of Erich Fromm on loneliness and analyzes its consequences in education. This research belongs to qualitative research and is carried out by the analytic-deductive method. The findings indicate that a social person with extensive communication does not necessarily enjoy mental health; on the other hand, loneliness is not always a sign of malicious and anti-social characters. In other words, what matters is the difference between antisocial people and those who consciously choose loneliness. Loneliness is thus an emotional state that, when balanced, can be constructive and lead to self-knowledge, the development of reflective thinking, and self-consciousness; when extreme, it does plenty of harm in the community, especially in education systems.
The present article illustrates a conflict between the claim that rational belief sets are closed under deductive consequences, and a very inclusive claim about the factors that are sufficient to determine whether it is rational to believe respective propositions. Inasmuch as it is implausible to hold that the factors listed here are insufficient to determine whether it is rational to believe respective propositions, we have good reason to deny that rational belief sets are closed under deductive consequences.
In section 1, I develop epistemic communism, my view of the function of epistemically evaluative terms such as ‘rational’. The function is to support the coordination of our belief-forming rules, which in turn supports the reliable acquisition of beliefs through testimony. This view is motivated by the existence of valid inferences that we hesitate to call rational. I defend the view against the worry that it fails to account for a function of evaluations within first-personal deliberation. In the rest of the paper, I then argue, on the basis of epistemic communism, for a view about rationality itself. I set up the argument in section 2 by saying what a theory of rational deduction is supposed to do. I claim that such a theory would provide a necessary, sufficient, and explanatorily unifying condition for being a rational rule for inferring deductive consequences. I argue in section 3 that, given epistemic communism and the conventionality that it entails, there is no such theory. Nothing explains why certain rules for deductive reasoning are rational.
Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.
According to the received view of scientific theories, a scientific theory is an axiomatic-deductive linguistic structure which must include some set of guidelines (“correspondence rules”) for interpreting its theoretical terms with reference to the world of observable phenomena. According to the semantic view, a scientific theory need not be formulated as an axiomatic-deductive structure with correspondence rules, but need only specify models which are said to be “isomorphic” with actual phenomenal systems. In this paper, I consider both the received and semantic views as they bear on the issue of how a theory relates to the world (Section 1). Then I offer a critique of some arguments frequently put forth in support of the semantic view (Section 2). Finally, I suggest a more convincing “meta-methodological” argument (based on the thought of Bernard Lonergan) in favor of the semantic view (Section 3).
One of the most desirable properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of its deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics are uniquely characterized by semantics of a non-deterministic kind. This paper offers a solution to two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
The objective of this research programme is to contribute to the establishment of the emerging science of Formal Ontology in Information Systems via a collaborative project involving researchers from a range of disciplines including philosophy, logic, computer science, linguistics, and the medical sciences. The researchers will work together on the construction of a unified formal ontology, which means: a general framework for the construction of ontological theories in specific domains. The framework will be constructed using the axiomatic-deductive method of modern formal ontology. It will be tested via a series of applications relating to on-going work in Leipzig on medical taxonomies and data dictionaries in the context of clinical trials. This will lead to the production of a domain-specific ontology which is designed to serve as a basis for applications in the medical field.
Could the intersection of [formal proofs of mathematical logic] and [sound deductive inference] specify formal systems having [deductively sound formal proofs of mathematical logic]? All that we have to do to provide [deductively sound formal proofs of mathematical logic] is select the subset of conventional [formal proofs of mathematical logic] having true premises and now we have [deductively sound formal proofs of mathematical logic].
Tarski "proved" that there cannot possibly be any correct formalization of the notion of truth entirely on the basis of an insufficiently expressive formal system that was incapable of recognizing and rejecting semantically incorrect expressions of language. The only thing required to eliminate incompleteness, undecidability and inconsistency from formal systems is transforming the formal proofs of symbolic logic to use the sound deductive inference model.
I argue 1) That in his celebrated Is/Ought passage, Hume employs ‘deduction’ in the strict sense, according to which if a conclusion B is justly or evidently deduced from a set of premises A, A cannot be true and B false, or B false and the premises A true. 2) That Hume was following the common custom of his times which sometimes employed ‘deduction’ in a strict sense to denote inferences in which, in the words of Dr Watts’ Logick, ‘the premises, according to the reason of things, do really contain the conclusion that is deduced from them’; that although Hume sometimes uses ‘demonstrative argument’ as a synonym for ‘deduction’, like most of his contemporaries, he generally reserves the word ‘demonstration’ for deductive inferences in which the premises are both necessary and self-evident. 3) That Mr Hume did indeed mean to suggest that deductions from IS to OUGHT were ‘altogether inconceivable’ since if ought represents a new relation or affirmation, it cannot, in the strict sense, be justly deduced from premises which do not really contain it. 4) That in a large and liberal (or perhaps loose and promiscuous) sense Hume does deduce oughts and ought nots from observations concerning human affairs, but that the deductions in question are not inferences, but explanations, since in another sense of ‘deduce’, common in the Eighteenth Century, to deduce B from A is to trace B back to A or to explain B in terms of A; 5) That a small attention to the context of Hume’s remarks and to the logical notions on which they are based would subvert those vulgar systems of philosophy which exaggerate the distinction between fact and value; for just because it is ‘altogether inconceivable’ that the new relation or affirmation OUGHT should be a deduction from others that are entirely different from it, it does not follow that the facts represented by IS and IS NOT are at bottom any different from the values represented by OUGHT and OUGHT NOT.
This paper discusses the theoretical assumptions behind the conception of the logic of faith and deed (LF&D) and outlines its formal-axiomatic frame and its method of construction, which enable us to understand it as a kind of deductive science. The paper is divided into several sections, starting with the logical analysis of the ambiguous terms of 'faith' and 'action', and focusing in particular on the concepts of religious faith and deed as a type of conscious activity relating to a matter or matters of social importance. After outlining the main ideas and basic assumptions of the theoretical conception of the LF&D as an axiomatic theory, the author introduces some axiom systems for: 1) the logics of faith LF (doxastic logics), 2) the logic of deed LD, and 3) certain logics of norms DL (deontic logics) connected with "duties" and concerning actions/deeds. Lastly, the paper outlines the scientific LF&D based on the three types of logic 1)-3).
This essay (a revised version of my undergraduate honors thesis at Stanford) constructs a theory of analogy as it applies to argumentation and reasoning, especially as used in fields such as philosophy and law. The word analogy has been used in different senses, which the essay defines. The theory developed herein applies to analogia rationis, or analogical reasoning. Building on the framework of situation theory, a type of logical relation called determination is defined. This determination relation solves a puzzle about analogy in the context of logical argument, namely, whether an analogous situation contributes anything logically over and above what could be inferred from the application of prior knowledge to a present situation. Scholars of reasoning have often claimed that analogical arguments are never logically valid, and that they therefore lack cogency. However, when the right type of determination structure exists, it is possible to prove that projecting a conclusion inferred by analogy onto the situation about which one is reasoning is both valid and non-redundant. Various other properties and consequences of the determination relation are also proven. Some analogical arguments are based on principles such as similarity, which are not logically valid. The theory therefore provides us with a way to distinguish between legitimate and illegitimate arguments. It also provides an alternative to procedures based on the assessment of similarity for constructing analogies in artificial intelligence systems.
This paper describes a cubic water tank equipped with a movable partition receiving various amounts of liquid used to represent joint probability distributions. This device is applied to the investigation of deductive inferences under uncertainty. The analogy is exploited to determine by qualitative reasoning the limits in probability of the conclusion of twenty basic deductive arguments (such as Modus Ponens, And-introduction, Contraposition, etc.) often used as benchmark problems by the various theoretical approaches to reasoning under uncertainty. The probability bounds imposed by the premises on the conclusion are derived on the basis of a few trivial principles such as "a part of the tank cannot contain more liquid than its capacity allows", or "if a part is empty, the other part contains all the liquid". This stems from the equivalence between the physical constraints imposed by the capacity of the tank and its subdivisions on the volumes of liquid, and the axioms and rules of probability. The device materializes de Finetti's coherence approach to probability. It also suggests a physical counterpart of Dutch book arguments to assess individuals' rationality in probability judgments in the sense that individuals whose degrees of belief in a conclusion are out of the bounds of coherence intervals would commit themselves to executing physically impossible tasks.
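The bound derivations this abstract describes are elementary arithmetic once the tank constraints are written down. As a hedged illustration (not taken from the paper; the function name and example values are my own), here is the coherent probability interval for the Modus Ponens conclusion, computed from the two "tank" principles quoted above:

```python
def modus_ponens_bounds(p_a: float, p_b_given_a: float) -> tuple[float, float]:
    """Coherent interval [lo, hi] for P(B), given the Modus Ponens
    premises P(A) = p_a and P(B|A) = p_b_given_a.

    P(B) = P(B and A) + P(B and not-A). The first summand is fixed at
    P(A) * P(B|A); the second can range from 0 (that compartment of
    the tank is empty) up to 1 - P(A) (it is filled to capacity).
    """
    fixed = p_a * p_b_given_a
    return fixed, fixed + (1.0 - p_a)

# Example: P(A) = 0.9 and P(B|A) = 0.8 constrain P(B) to [0.72, 0.82].
lo, hi = modus_ponens_bounds(0.9, 0.8)
print(f"{lo:.2f} <= P(B) <= {hi:.2f}")
```

A degree of belief in B outside this interval is incoherent in de Finetti's sense: in the tank analogy, it would demand more liquid in a compartment than its capacity allows.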
Juhani Yli-Vakkuri has argued that the Twin Earth thought experiments offered in favour of semantic externalism can be replaced by a straightforward deductive argument from premisses widely accepted by both internalists and externalists alike. The deductive argument depends, however, on premisses that, on standard formulations of internalism, cannot be satisfied by a single belief simultaneously. It does not, therefore, constitute a proof of externalism. The aim of this article is to explain why.
The idea that knowledge can be extended by inference from what is known seems highly plausible. Yet, as shown by the familiar preface paradox and lottery-type cases, the possibility of aggregating uncertainty casts doubt on its tenability. We show that these considerations go much further than previously recognized and significantly restrict the kinds of closure ordinary theories of knowledge can endorse. Meeting the challenge of uncertainty aggregation requires either the restriction of knowledge-extending inferences to single premises, or eliminating epistemic uncertainty in known premises. The first strategy, while effective, retains little of the original idea—conclusions even of modus ponens inferences from known premises are not always known. We then look at the second strategy, inspecting the most elaborate and promising attempt to secure the epistemic role of basic inferences, namely Timothy Williamson’s safety theory of knowledge. We argue that while it indeed has the merit of allowing basic inferences such as modus ponens to extend knowledge, Williamson’s theory faces formidable difficulties. These difficulties, moreover, arise from the very feature responsible for its virtue: the infallibilism of knowledge.
This article presents the first systematic analysis of the ethical challenges posed by recommender systems through a literature review. The article identifies six areas of concern, and maps them onto a proposed taxonomy of different kinds of ethical impact. The analysis uncovers a gap in the literature: currently, user-centred approaches do not consider the interests of a variety of other stakeholders—as opposed to just the receivers of a recommendation—in assessing the ethical impacts of a recommender system.
This essay presents and discusses the currently most famous among the deductive conceptions of explanation, i.e., the deductive-nomological one, and proceeds to apply it to microeconomic theory. After restating the basic ideas, the essay investigates some of the important objections raised against it, with a view to deciding whether or not they invalidate the proposed application to economics.
This essay presents deductive arguments to an introductory-level audience via a discussion of Aristotle's three types of rhetoric, the goals of and differences between deductive and non-deductive arguments, and the major features of deductive arguments (e.g., validity and soundness).
We are justified in employing the rule of inference Modus Ponens (or one much like it) as basic in our reasoning. By contrast, we are not justified in employing a rule of inference that permits inferring to some difficult mathematical theorem from the relevant axioms in a single step. Such an inferential step is intuitively “too large” to count as justified. What accounts for this difference? In this paper, I canvass several possible explanations. I argue that the most promising approach is to appeal to features like usefulness or indispensability to important or required cognitive projects. On the resulting view, whether an inferential step counts as large or small depends on the importance of the relevant rule of inference in our thought.
In this reply to James H. Fetzer’s “Minds and Machines: Limits to Simulations of Thought and Action”, I argue that computationalism should not be the view that (human) cognition is computation, but that it should be the view that cognition (simpliciter) is computable. It follows that computationalism can be true even if (human) cognition is not the result of computations in the brain. I also argue that, if semiotic systems are systems that interpret signs, then both humans and computers are semiotic systems. Finally, I suggest that minds can be considered as virtual machines implemented in certain semiotic systems, primarily the brain, but also AI computers. In doing so, I take issue with Fetzer’s arguments to the contrary.
Developmental systems theory (DST) is a wholeheartedly epigenetic approach to development, inheritance and evolution. The developmental system of an organism is the entire matrix of resources that are needed to reproduce the life cycle. The range of developmental resources that are properly described as being inherited, and which are subject to natural selection, is far wider than has traditionally been allowed. Evolution acts on this extended set of developmental resources. From a developmental systems perspective, development does not proceed according to a preformed plan; what is inherited is much more than DNA; and evolution is change not only in gene frequencies, but in entire developmental systems.
With the advent of computers in the experimental labs, dynamic systems have become a new tool for research on problem solving and decision making. A short review of this research is given and the main features of these systems (connectivity and dynamics) are illustrated. To allow systematic approaches to the influential variables in this area, two formal frameworks (linear structural equations and finite state automata) are presented. Besides the formal background, the article sets out how the task demands of system identification and system control can be realised in these environments, and how psychometrically acceptable dependent variables can be derived.
In this paper we will demonstrate that a computational system can meet the criteria for autonomy laid down by classical enactivism. The two criteria that we will focus on are operational closure and structural determinism, and we will show that both can be applied to a basic example of a physically instantiated Turing machine. We will also address the question of precariousness, and briefly suggest that a precarious Turing machine could be designed. Our aim in this paper is to challenge the assumption that computational systems are necessarily heteronomous systems, to try and motivate in enactivism a more nuanced and less rigid conception of computational systems, and to demonstrate to computational theorists that they might find some interesting material within the enactivist tradition, despite its historical hostility towards computationalism.