Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917-1923. The aim of this paper is to describe these results, focussing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
We are much better equipped to let the facts reveal themselves to us instead of blinding ourselves to them or stubbornly trying to force them into preconceived molds. We no longer embarrass ourselves in front of our students, for example, by insisting that “Some Xs are Y” means the same as “Some X is Y”, and lamely adding “for purposes of logic” whenever there is pushback. Logic teaching in this century can exploit the new spirit of objectivity, humility, clarity, observationalism, contextualism, and pluralism. Besides the new spirit there have been quiet developments in logic and its history and philosophy that could radically improve logic teaching. One rather conspicuous example is that the process of refining logical terminology has been productive. Future logic students will no longer be burdened by obscure terminology and they will be able to read, think, talk, and write about logic in a more careful and more rewarding manner. Closely related is increased use and study of variable-enhanced natural language as in “Every proposition x that implies some proposition y that is false also implies some proposition z that is true”. Another welcome development is the culmination of the slow demise of logicism. No longer is the teacher blocked from using examples from arithmetic and algebra fearing that the students had been indoctrinated into thinking that every mathematical truth was a tautology and that every mathematical falsehood was a contradiction. A fifth welcome development is the separation of laws of logic from so-called logical truths, i.e., tautologies. Now we can teach the logical independence of the laws of excluded middle and non-contradiction without fear that students had been indoctrinated into thinking that every logical law was a tautology and that every falsehood of logic was a contradiction. This separation permits the logic teacher to apply logic in the clarification of laws of logic.
This lecture expands the above points, which apply equally well in first, second, and third courses, i.e., in “critical thinking”, “deductive logic”, and “symbolic logic”.
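The variable-enhanced sentence quoted above admits a straightforward rendering in standard notation. One possible formalization (the predicate names Imp, T, and F for "implies", "is true", and "is false" are chosen here for illustration, not taken from the lecture):

```latex
\forall x\,\bigl(\exists y\,(\mathrm{Imp}(x,y)\wedge \mathrm{F}(y))
  \;\rightarrow\; \exists z\,(\mathrm{Imp}(x,z)\wedge \mathrm{T}(z))\bigr)
```

Comparing the English sentence with such a rendering is precisely the kind of exercise that variable-enhanced natural language is meant to make accessible before full symbolization is introduced.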
As noted in 1962 by Timothy Smiley, if Aristotle’s logic is faithfully translated into modern symbolic logic, the fit is exact. If categorical sentences are translated into many-sorted logic MSL according to Smiley’s method or the two other methods presented here, an argument with arbitrarily many premises is valid according to Aristotle’s system if and only if its translation is valid according to modern standard many-sorted logic. As William Parry observed in 1973, this result can be proved using my 1972 proof of the completeness of Aristotle’s syllogistic.
This paper concerns “human symbolic output,” or strings of characters produced by humans in our various symbolic systems; e.g., sentences in a natural language, mathematical propositions, and so on. One can form a set that consists of all of the strings of characters that have been produced by at least one human up to any given moment in human history. We argue that at any particular moment in human history, even at moments in the distant future, this set is finite. But then, given fundamental results in recursion theory, the set will also be recursive, recursively enumerable, axiomatizable, and could be the output of a Turing machine. We then argue that it is impossible to produce a string of symbols that humans could possibly produce but no Turing machine could. Moreover, we show that any given string of symbols that we could produce could also be the output of a Turing machine. Our arguments have implications for Hilbert’s sixth problem and the possibility of axiomatizing particular sciences; they undermine at least two distinct arguments against the possibility of Artificial Intelligence; and they entail that expert systems that are the equals of human experts are possible, and so at least one of the goals of Artificial Intelligence can be realized, at least in principle.
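The recursion-theoretic step in the abstract can be illustrated with a minimal sketch (the sample strings and the function name are illustrative, not from the paper): membership in any finite set of strings is decidable by exhaustive lookup, so such a set is trivially recursive.

```python
# Illustrative finite set standing in for "all human symbolic output
# produced up to some moment"; any finite set behaves the same way.
produced_so_far = {"2+2=4", "Snow is white", "∀x(x=x)"}

def is_produced(s: str) -> bool:
    """Decision procedure for the set: exhaustive lookup in a finite
    table always halts, so the set is recursive (hence also r.e. and
    the possible output of a Turing machine)."""
    return s in produced_so_far

assert is_produced("2+2=4")
assert not is_produced("a string never yet written")
```

The point of the sketch is only that finiteness does all the work: no property of the strings themselves matters, since a finite lookup table is itself a (rather boring) Turing machine.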
A Mathematical Review by John Corcoran, SUNY/Buffalo, of: Macbeth, Danielle, “Diagrammatic reasoning in Frege's Begriffsschrift,” Synthese 186 (2012), no. 1, 289–314. ABSTRACT This review begins with two quotations from the paper: its abstract and the first paragraph of the conclusion. The point of the quotations is to make clear by the “give-them-enough-rope” strategy how murky, incompetent, and badly written the paper is. I know I am asking a lot, but I have to ask you to read the quoted passages—aloud if possible. Don’t miss the silly attempt to recycle Kant’s quip “Concepts without intuitions are empty; intuitions without concepts are blind”. What the paper was aiming at includes the absurdity: “Proofs without definitions are empty; definitions without proofs are, if not blind, then dumb.” But the author even bollixed this. The editor didn’t even notice. The copy-editor missed it. And the author’s proof-reading did not catch it. In order not to torment you I will quote the sentence as it appears: “In a slogan: proofs without definitions are empty, merely the aimless manipulation of signs according to rules; and definitions without proofs are, if no blind, then dumb.”[sic] The rest of my review discusses the paper’s astounding misattribution to contemporary logicians of the information-theoretic approach. This approach was cruelly trashed by Quine in his 1970 Philosophy of Logic, and thereafter ignored by every text I know of. The paper under review attributes generally to modern philosophers and logicians views that were never espoused by any of the prominent logicians—such as Hilbert, Gödel, Tarski, Church, and Quine—apparently in an attempt to distance them from Frege: the focus of the article. On page 310 we find the following paragraph. “In our logics it is assumed that inference potential is given by truth-conditions. Hence, we think, deduction can be nothing more than a matter of making explicit information that is already contained in one’s premises.
If the deduction is valid then the information contained in the conclusion must be contained already in the premises; if that information is not contained already in the premises […], then the argument cannot be valid.” Although the paper is meticulous in citing supporting literature for less questionable points, no references are given for this. In fact, the view that deduction is the making explicit of information that is only implicit in premises has not been espoused in any standard symbolic logic book. It has only recently been articulated by a small number of philosophical logicians from a younger generation, for example, in the prize-winning essay by J. Sagüillo, Methodological practice and complementary concepts of logical consequence: Tarski’s model-theoretic consequence and Corcoran’s information-theoretic consequence, History and Philosophy of Logic, 30 (2009), pp. 21–48. The paper omits definitions of key terms including ‘ampliative’, ‘explicatory’, ‘inference potential’, ‘truth-condition’, and ‘information’. The definition of prime number on page 292 is as follows: “To say that a number is prime is to say that it is not divisible without remainder by another number”. This would make one the only prime number. The paper being reviewed had the benefit of two anonymous referees who contributed “very helpful comments on an earlier draft”. Could these anonymous referees have read the paper? J. Corcoran, U of Buffalo, SUNY. PS By the way, if anyone has a paper that has been turned down by other journals, any journal that would publish something like this might be worth trying.
The syllogistic figures and moods can be taken to be argument schemata as can the rules of the Stoic propositional logic. Sentence schemata have been used in axiomatizations of logic only since the landmark 1927 von Neumann paper [31]. Modern philosophers know the role of schemata in explications of the semantic conception of truth through Tarski’s 1933 Convention T [42]. Mathematical logicians recognize the role of schemata in first-order number theory where Peano’s second-order Induction Axiom is approximated by Herbrand’s Induction-Axiom Schema [23]. Similarly, in first-order set theory, Zermelo’s second-order Separation Axiom is approximated by Fraenkel’s first-order Separation Schema [17]. In some of several closely related senses, a schema is a complex system having multiple components one of which is a template-text or scheme-template, a syntactic string composed of one or more “blanks” and also possibly significant words and/or symbols. In accordance with a side condition the template-text of a schema is used as a “template” to specify a multitude, often infinite, of linguistic expressions such as phrases, sentences, or argument-texts, called instances of the schema. The side condition is a second component. The collection of instances may but need not be regarded as a third component. The instances are almost always considered to come from a previously identified language (whether formal or natural), which is often considered to be another component. This article reviews the often-conflicting uses of the expressions ‘schema’ and ‘scheme’ in the literature of logic. It discusses the different definitions presupposed by those uses. And it examines the ontological and epistemic presuppositions circumvented or mooted by the use of schemata, as well as the ontological and epistemic presuppositions engendered by their use. In short, this paper is an introduction to the history and philosophy of schemata.
Preface. After a long absence from the philosophy and methodology of science, the issues of causal dependencies, their modeling, and their discovery now attract considerable interest. This is due above all to the dynamic development of computational techniques, especially since the 1990s. Bayesian networks, worked out in that period, are regarded as the mathematical language of causality. They permit far-reaching automation of inference, which also encourages attempts to algorithmize the discovery of causes. For scientific research that allows randomized experiments, standard methods of establishing causal dependencies were developed at the beginning of the twentieth century. The situation is entirely different for non-experimental research, where comparable solutions remain a matter for the future. The task of this book is to state the conditions such solutions should satisfy and to formulate a procedural criterion of causal dependence as a specific realization of those conditions. This criterion carries weighty consequences for the philosophy and methodology of science, which are revealed by the outline of procedural methodology given in Part II. The literature lacks a reasonably comprehensive and systematic discussion of the most recent philosophical and methodological debates on causality, which may explain why at some points in this book I report in detail on source texts that are difficult to access. I use the adjective "procedural" in a narrower sense than Huw Price (in whose works the term "criterial" would be more apt), to emphasize, in keeping with the Latin root procedo, that establishing a cause requires researchers to undertake specific interactions with the reality under investigation.
I presented the seeds of the idea developed in this book at the philosophical workshop "Philosophy and Probability" in 2002, organized by the Department of Philosophy of the University of Konstanz. I am grateful to the participants of that workshop for their comments, above all to Luc Bovens, Brandon Fitelson, Alan Hájek, Stephan Hartmann, and Jon Williamson. At the international conference "Analytical Pragmatism", organized in Lublin in 2003 by the Faculty of Philosophy of the Catholic University of Lublin, I related my conception to the work of Nancy Cartwright. Huw Price's commentary on my paper, and the discussion with him, proved especially inspiring. In 2004, at the "5th Quadrennial Fellows Conference", organized by the Institute of Philosophy of the Jagiellonian University and the Center for Philosophy of Science in Pittsburgh, I presented the conception of procedural methodology against the broader background of the contemporary empiricist current in the philosophy of science. The comments of James Bogen, Janet Kourany, James Lennox, John Norton, Thomas Bonk, Jan Woleński, and John Worrall were especially helpful in my further work, and I express my gratitude for them. The body of the book was written during my stay at the Center for Philosophy of Science in Pittsburgh, which I held as a fellow of the Foundation for Polish Science in the academic year 2004-2005. During that time I took part in the scholarly life of the Center and in the research of the group at the Department of Philosophy of Carnegie Mellon University in Pittsburgh led by Clark Glymour. I thank him for many helpful comments on my talks and texts, and for discussions above all with him and with his closest collaborators, Peter Spirtes and Richard Scheines, as well as with the other members of the group, its doctoral students, and the participants of the research seminar "Causality in the Social Sciences".
For many years of support, for inspiration on many levels while this book was being written, and for numerous helpful comments on its earlier versions, I thank above all Professor Andrzej Bronk and Professor Józef Herbut, co-directors of the doctoral seminar in the Department of Methodology of Science of the John Paul II Catholic University of Lublin, as well as the other participants of that seminar. I thank my wife, Dr. Anna Kawalec, for the great effort she put into improving the book, both linguistically and substantively. This book can be read in several ways. Readers interested primarily in conducting empirical research are advised to begin with Chapter 2 and to continue with the remaining chapters of Part I, followed by the Appendices. Readers interested in problems of the philosophy and methodology of science are advised to begin with Part II, supplementing it with Chapter 2, and then to read the Introduction and the Conclusion. Readers less interested in theoretical questions are advised to acquaint themselves with the fascinating story of John Snow's discovery of the causes of cholera, which I reconstruct in Chapter 1, and then to proceed to the Introduction and the Conclusion, which present the solutions proposed here in a less specialized way. The text of the book has not been published before, except for certain fragments of Chapters 8 and 9, which appeared in revised form in Roczniki Filozoficzne (Kawalec 2004). Lublin, February 2006
The Symbolic Logic Study Guide is designed to accompany the widely used symbolic logic textbook Language, Proof and Logic (LPL), by Jon Barwise and John Etchemendy (CSLI Publications 2003). The guide has two parts. The first part contains condensed, essential lecture notes, which streamline and systematize the first fourteen chapters of the book into seven teaching sections, and thus provide a clear, well-designed roadmap for the understanding of the text. The second part consists of twelve sample quizzes and solutions. The Symbolic Logic Study Guide is essential for all instructors and students who use LPL in their symbolic logic classes.
The period from 1900 to 1935 was particularly fruitful and important for the development of logic and logical metatheory. This survey is organized along eight "itineraries" concentrating on historically and conceptually linked strands in this development. Itinerary I deals with the evolution of conceptions of axiomatics. Itinerary II centers on the logical work of Bertrand Russell. Itinerary III presents the development of set theory from Zermelo onward. Itinerary IV discusses the contributions of the algebra of logic tradition, in particular, Löwenheim and Skolem. Itinerary V surveys the work in logic connected to the Hilbert school, and Itinerary VI deals specifically with consistency proofs and metamathematics, including the incompleteness theorems. Itinerary VII traces the development of intuitionistic and many-valued logics. Itinerary VIII surveys the development of semantical notions from the early work on axiomatics up to Tarski's work on truth.
Recent experimental evidence from developmental psychology and cognitive neuroscience indicates that humans are equipped with unlearned elementary mathematical skills. However, formal mathematics has properties that cannot be reduced to these elementary cognitive capacities. The question then arises how human beings cognitively deal with more advanced mathematical ideas. This paper draws on the extended mind thesis to suggest that mathematical symbols enable us to delegate some mathematical operations to the external environment. In this view, mathematical symbols are not only used to express mathematical concepts—they are constitutive of the mathematical concepts themselves. Mathematical symbols are epistemic actions, because they enable us to represent concepts that are literally unthinkable with our bare brains. Using case-studies from the history of mathematics and from educational psychology, we argue for an intimate relationship between mathematical symbols and mathematical cognition.
This is a homework exercise in symbolic logic, an important part of philosophy and mathematics. The rules of implication are a key to solving problems in symbolic logic, as this example shows.
The determinism-free will debate is perhaps as old as philosophy itself and has been engaged in from a great variety of points of view including those of scientific, theological, and logical character. This chapter focuses on two arguments from logic. First, there is an argument in support of determinism that dates back to Aristotle, if not farther. It rests on acceptance of the Law of Excluded Middle, according to which every proposition is either true or false, no matter whether the proposition is about the past, present or future. In particular, the argument goes, whatever one does or does not do in the future is determined in the present by the truth or falsity of the corresponding proposition. The second argument coming from logic is much more modern and appeals to Gödel's incompleteness theorems to make the case against determinism and in favour of free will, insofar as that applies to the mathematical potentialities of human beings. The claim more precisely is that as a consequence of the incompleteness theorems, those potentialities cannot be exactly circumscribed by the output of any computing machine even allowing unlimited time and space for its work. The chapter concludes with some new considerations that may be in favour of a partial mechanist account of the mathematical mind.
A crucial part of the contemporary interest in logicism in the philosophy of mathematics resides in its idea that arithmetical knowledge may be based on logical knowledge. Here an implementation of this idea is considered that holds that knowledge of arithmetical principles may be based on two things: (i) knowledge of logical principles and (ii) knowledge that the arithmetical principles are representable in the logical principles. The notions of representation considered here are related to theory-based and structure-based notions of representation from contemporary mathematical logic. It is argued that the theory-based versions of such logicism are either too liberal (the plethora problem) or are committed to intuitively incorrect closure conditions (the consistency problem). Structure-based versions must on the other hand respond to a charge of begging the question (the circularity problem) or explain how one may have a knowledge of structure in advance of a knowledge of axioms (the signature problem). This discussion is significant because it gives us a better idea of what a notion of representation must look like if it is to aid in realizing some of the traditional epistemic aims of logicism in the philosophy of mathematics.
Call an explanation in which a non-mathematical fact is explained—in part or in whole—by mathematical facts: an extra-mathematical explanation. Such explanations have attracted a great deal of interest recently in arguments over mathematical realism. In this article, a theory of extra-mathematical explanation is developed. The theory is modelled on a deductive-nomological theory of scientific explanation. A basic DN account of extra-mathematical explanation is proposed and then redeveloped in the light of two difficulties that the basic theory faces. The final view appeals to relevance logic and uses resources in information theory to understand the explanatory relationship between mathematical and physical facts. Contents: 1 Introduction; 2 Anchoring; 3 The Basic Deductive-Mathematical Account; 4 The Genuineness Problem; 5 Irrelevance; 6 Relevance and Information; 7 Objections and Replies (7.1 Against Relevance Logic; 7.2 Too Epistemic; 7.3 Informational Containment); 8 Conclusion.
What can we infer from numerical cognition about mathematical realism? In this paper, I will consider one aspect of numerical cognition that has received little attention in the literature: the remarkable similarities of numerical cognitive capacities across many animal species. This Invariantism in Numerical Cognition (INC) indicates that mathematics and morality are disanalogous in an important respect: proto-moral beliefs differ substantially between animal species, whereas proto-mathematical beliefs (at least in the animals studied) seem to show more similarities. This makes moral beliefs more susceptible to a contingency challenge from evolution compared to mathematical beliefs, and indicates that mathematical beliefs might be less vulnerable to evolutionary debunking arguments. I will then examine to what extent INC can be used to flesh out a positive case for mathematical realism. Finally, I will review two forms of mathematical realism that are promising in the light of the evolutionary evidence about numerical cognition, ante rem structuralism and Millean empiricism.
In the present article we attempt to show that Aristotle's syllogistic is an underlying logic which includes a natural deductive system and that it is not an axiomatic theory as had previously been thought. We construct a mathematical model which reflects certain structural aspects of Aristotle's logic. We examine the relation of the model to the system of logic envisaged in scattered parts of Prior and Posterior Analytics. Our interpretation restores Aristotle's reputation as a logician of consummate imagination and skill. Several attributions of shortcomings and logical errors to Aristotle are shown to be without merit. Aristotle's logic is found to be self-sufficient in several senses: his theory of deduction is logically sound in every detail. (His indirect deductions have been criticized, but incorrectly on our account.) Aristotle's logic presupposes no other logical concepts, not even those of propositional logic. The Aristotelian system is seen to be complete in the sense that every valid argument expressible in his system admits of a deduction within his deductive system: every semantically valid argument is deducible.
The development of symbolic logic is often presented in terms of a cumulative story of consecutive innovations that led to what is known as modern logic. This narrative hides the difficulties that this new logic faced at first, which shaped its history. Indeed, negative reactions to the emergence of the new logic in the second half of the nineteenth century were numerous, and we study here one case, namely logic at Oxford, where one finds Lewis Carroll, a mathematics teacher who promoted symbolic logic, and John Cook Wilson, the Wykeham Professor of Logic who notoriously opposed it. An analysis of their disputes on the topic of logical symbolism shows that their opposition was not as sharp as it might look at first: Cook Wilson was not so much opposed to the "symbolic" character of logic as to the intrusion of mathematics and to what he perceived to be the futility of some of its problems, for logicians and philosophers alike.
In spite of its significance for everyday and philosophical discourse, the explanatory connective has not received much treatment in the philosophy of logic. The present paper develops a logic for the explanatory connective based on systematic connections between it and the truth-functional connectives.
The first learning game developed to help students acquire and hone skills in constructing proofs in both the propositional and first-order predicate calculi. It comprises an autotelic (self-motivating) learning approach to assist students in developing skills and strategies of proof in the propositional and predicate calculus. The text of VALIDITY consists of a general introduction that describes earlier studies made of autotelic learning games, paying particular attention to work done at the Law School of Yale University, called the ALL Project (Accelerated Learning of Logic). Following the introduction, the game of VALIDITY is described, first with reference to the propositional calculus, and then in connection with the first-order predicate calculus with identity. Sections in the text are devoted to discussions of the various rules of derivation employed in both calculi. Three appendices follow the main text; these provide a catalogue of sequents and theorems that have been proved for the propositional calculus and for the predicate calculus, and include suggestions for the classroom use of VALIDITY in university-level courses in mathematical logic.
Whether mathematical truths are syntactical (as Rudolf Carnap claimed) or empirical (as Mill actually never claimed, though Carnap claimed that he did) might seem merely an academic topic. However, it becomes a practical concern as soon as we consider the role of questions. For if we inquire as to the truth of a mathematical statement, this question must be (in a certain respect) meaningless for Carnap, as its truth or falsity is certain in advance due to its purely syntactical (or formal-semantical) nature. In contrast, for Mill such a question is as valid as any other. These differing views have their consequences for contemporary erotetic logic.
Symbolic logic faced great difficulties in its early stage of development in gaining recognition of its utility for the needs of science and society. The aim of this paper is to discuss an early attempt by the British logician Lewis Carroll (1832–1898) to promote symbolic logic as a social good. This examination proceeds in three phases: first, Carroll's belief in the social utility of logic, broadly understood, is demonstrated by his numerous interventions to fight fallacious reasoning in public debates. Then, Carroll's attempts to promote symbolic logic specifically are revealed through his work on a treatise that would make the subject accessible to a wide and young audience. Finally, it is argued that Carroll's ideal of logic as a common good influenced the logical methods he invented and allowed him to tackle more efficiently some problems that resisted early symbolic logicians.
This review concludes that if the authors know what mathematical logic is, they have not shared their knowledge with the readers. This highly praised book is replete with errors and incoherence.
This paper responds to recent work in the philosophy of Homotopy Type Theory by James Ladyman and Stuart Presnell. They consider one of the rules for identity, path induction, and justify it along ‘pre-mathematical’ lines. I give an alternate justification based on the philosophical framework of inferentialism. Accordingly, I construct a notion of harmony that allows the inferentialist to say when a connective or concept is meaning-bearing and this conception unifies most of the prominent conceptions of harmony through category theory. This categorical harmony is stated in terms of adjoints and says that any concept definable by iterated adjoints from general categorical operations is harmonious. Moreover, it has been shown that identity in a categorical setting is determined by an adjoint in the relevant way. Furthermore, path induction as a rule comes from this definition. Thus we arrive at an account of how path induction, as a rule of inference governing identity, can be justified on mathematically motivated grounds.
This paper considers logics which are formally dual to intuitionistic logic in order to investigate a co-constructive logic for proofs and refutations. This is philosophically motivated by a set of problems regarding the nature of constructive truth, and its relation to falsity. It is well known both that intuitionism cannot deal constructively with negative information, and that defining falsity by means of intuitionistic negation leads, under widely-held assumptions, to a justification of bivalence. For example, we do not want to equate falsity with the non-existence of a proof, since this would render a statement such as “pi is transcendental” false prior to 1882. In addition, the intuitionist account of negation as shorthand for the derivation of absurdity is inadequate, particularly outside of purely mathematical contexts. To deal with these issues, I investigate the dual of intuitionistic logic, co-intuitionistic logic, as a logic of refutation, alongside the intuitionistic logic of proofs. Direct proof and refutation are dual to each other, and are constructive, whilst there also exist weak, syntactic negations within both logics. In this respect, the logic of refutation is weakly paraconsistent in the sense that it allows for statements for which neither they nor their negation is refuted. I provide a proof theory for the co-constructive logic, a formal dualizing map between the logics, and a Kripke-style semantics. This is given an intuitive philosophical rendering in a re-interpretation of Kolmogorov’s logic of problems.
While many different mechanisms contribute to the generation of spatial order in biological development, the formation of morphogenetic fields which in turn direct cell responses giving rise to pattern and form are of major importance and essential for embryogenesis and regeneration. Most likely the fields represent concentration patterns of substances produced by molecular kinetics. Short range autocatalytic activation in conjunction with longer range “lateral” inhibition or depletion effects is capable of generating such patterns (Gierer and Meinhardt, 1972). Non-linear reactions are required, and mathematical criteria were derived to design molecular models capable of pattern generation. The classical embryological feature of proportion regulation can be incorporated into the models. The conditions are mathematically necessary for the simplest two-factor case, and are likely to be a fair approximation in multi-component systems in which activation and inhibition are systems parameters subsuming the action of several agents. Gradients, symmetric and periodic patterns, in one or two dimensions, stable or pulsing in time, can be generated on this basis. Our basic concept of autocatalysis in conjunction with lateral inhibition accounts for self-regulatory biological features, including the reproducible formation of structures from near-uniform initial conditions as required by the logic of the generation cycle. Real tissue form, for instance that of budding Hydra, may often be traced back to local curvature arising within an initially relatively flat cell sheet, the position of evagination being determined by morphogenetic fields. Shell theory developed for architecture may also be applied to such biological processes.
Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to adapt Popper's falsificationism and extend it not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
Review of Joseph Y. Halpern (ed.), Theoretical Aspects of Reasoning About Knowledge: Proceedings of the 1986 Conference (Los Altos, CA: Morgan Kaufmann, 1986).
This paper suggests that time could have a much richer mathematical structure than that of the real numbers. Clark & Read (1984) argue that a hypertask (uncountably many tasks done in a finite length of time) cannot be performed. Assuming that time takes values in the real numbers, we give a trivial proof of this. If we instead take the surreal numbers as a model of time, then not only are hypertasks possible but so is an ultratask (a sequence which includes one task done for each ordinal number—thus a proper class of them). We argue that the surreal numbers are in some respects a better model of the temporal continuum than the real numbers as defined in mainstream mathematics, and that surreal time and hypertasks are mathematically possible.
K. Marx’s 200th jubilee coincides with the 85th anniversary of the first publication of his “Mathematical Manuscripts” in 1933. Its editor, Sofia Alexandrovna Yanovskaya (1896–1966), was a renowned Soviet mathematician, whose significant studies on the foundations of mathematics and mathematical logic, as well as on the history and philosophy of mathematics, are unduly neglected nowadays. Yanovskaya, as a militant Marxist, was actively engaged in the ideological confrontation with idealism and its influence on modern mathematics and its interpretation. Concomitantly, she was one of the pioneers of mathematical logic in the Soviet Union, in an era of fierce disputes on its compatibility with Marxist philosophy. Yanovskaya managed to embrace in an originally Marxist spirit the contemporary level of logico-philosophical research of her time. Due to her highly esteemed status within Soviet academia, she became one of the most significant pillars for the culmination of modern mathematics in the Soviet Union. In this paper, I attempt to trace the influence of the complex socio-cultural context of the first decades of the Soviet Union on Yanovskaya’s work. Among the several issues I discuss, her encounter with L. Wittgenstein is striking.
This paper shows how to conservatively extend classical logic with a transparent truth predicate, in the face of the paradoxes that arise as a consequence. All classical inferences are preserved, and indeed extended to the full (truth-involving) vocabulary. However, not all classical metainferences are preserved; in particular, the resulting logical system is nontransitive. Some limits on this nontransitivity are adumbrated, and two proof systems are presented and shown to be sound and complete. (One proof system allows for Cut-elimination, but the other does not.)
One of the key features of modern mathematics is the adoption of the abstract method. Our goal in this paper is to propose an explication of that method that is rooted in the history of the subject.
This book treats ancient logic: the logic that Aristotle and the Stoics originated in Greece, mainly in the hundred-year period beginning about 350 BCE. Ancient logic was never completely ignored by modern logic from its Boolean origin in the middle 1800s: it was prominent in Boole’s writings and it was mentioned by Frege and by Hilbert. Nevertheless, the first century of mathematical logic did not take it seriously enough to study the ancient logic texts. A renaissance in ancient logic studies occurred in the early 1950s with the publication of the landmark Aristotle’s Syllogistic by Jan Łukasiewicz, Oxford UP 1951, 2nd ed. 1957. Despite its title, it treats the logic of the Stoics as well as that of Aristotle. Łukasiewicz was a distinguished mathematical logician. He had created many-valued logic and the parenthesis-free prefix notation known as Polish notation. He co-authored with Alfred Tarski an important paper on the metatheory of propositional logic, and he was one of Tarski’s three main teachers at the University of Warsaw. Łukasiewicz’s stature was just short of that of the giants: Aristotle, Boole, Frege, Tarski, and Gödel. No mathematical logician of his caliber had ever before quoted the actual teachings of ancient logicians. Not only did Łukasiewicz inject fresh hypotheses, new concepts, and imaginative modern perspectives into the field; his enormous prestige and that of the Warsaw School of Logic reflected on the whole field of ancient logic studies. Suddenly, this previously somewhat dormant and obscure field became active and gained in respectability and importance in the eyes of logicians, mathematicians, linguists, analytic philosophers, and historians. Next to Aristotle himself and perhaps the Stoic logician Chrysippus, Łukasiewicz is the most prominent figure in ancient logic studies. A huge literature traces its origins to Łukasiewicz.
This volume, Ancient Logic and Its Modern Interpretations, is based on the 1973 Buffalo Symposium on Modernist Interpretations of Ancient Logic, the first conference devoted entirely to critical assessment of the state of ancient logic studies.
This paper is a contribution to graded model theory, in the context of mathematical fuzzy logic. We study characterizations of classes of graded structures in terms of the syntactic form of their first-order axiomatization. We focus on classes given by universal and universal-existential sentences. In particular, we prove two amalgamation results using the technique of diagrams in the setting of structures valued on a finite MTL-algebra, from which analogues of the Łoś–Tarski and the Chang–Łoś–Suszko preservation theorems follow.
Review of Karel Lambert, Meinong and the Principle of Independence: Its Place in Meinong's Theory of Objects and Its Significance in Contemporary Philosophical Logic.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism to reason in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but also perhaps welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require practical knowledge of how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both this mathematical foundation and practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing to not only learn about, but also do something with, many-valued logics.
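Truth-functionality, which the abstract above identifies as what makes many-valued logics mechanizable, can be made concrete with a small sketch. We use the three-valued Łukasiewicz connectives as one standard instance; the choice of Ł3 and all helper names are our own illustration, not necessarily the systems treated in the book.

```python
from fractions import Fraction

# Truth values: 0 (false), 1/2 (indeterminate), 1 (true).
HALF = Fraction(1, 2)

def neg(x):
    # Łukasiewicz negation: 1 - x
    return 1 - x

def conj(x, y):
    # "weak" conjunction: minimum of the two values
    return min(x, y)

def disj(x, y):
    # "weak" disjunction: maximum of the two values
    return max(x, y)

def implies(x, y):
    # Łukasiewicz implication: min(1, 1 - x + y)
    return min(1, 1 - x + y)

# Excluded middle is not a tautology in Ł3: at the middle value it is only 1/2.
assert disj(HALF, neg(HALF)) == HALF

# Self-implication remains a tautology: value 1 under every assignment.
assert all(implies(v, v) == 1 for v in (0, HALF, 1))
```

Because every connective is a function of the truth values of its arguments, checking whether a formula is a tautology reduces to enumerating finitely many assignments, which is exactly the kind of SAT-style computation the book situates these logics in.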
This presentation includes a complete bibliography of John Corcoran’s publications devoted at least in part to Aristotle’s logic. Sections I–IV list 20 articles, 43 abstracts, 3 books, and 10 reviews. It starts with two watershed articles published in 1972: the Philosophy & Phenomenological Research article that antedates Corcoran’s Aristotle studies and the Journal of Symbolic Logic article first reporting his original results; it ends with works published in 2015. A few of the items are annotated with endnotes connecting them with other work. In addition, Section V “Discussions” is a nearly complete secondary bibliography of works describing, interpreting, extending, improving, supporting, and criticizing Corcoran’s work: 8 items published in the 1970s, 22 in the 1980s, 39 in the 1990s, 56 in the 2000s, and 65 in the current decade. The secondary bibliography is annotated with endnotes: some simply quoting from the cited item, but several answering criticisms and identifying errors. As is evident from the Acknowledgements sections, all of Corcoran’s publications benefited from correspondence with other scholars, most notably Timothy Smiley, Michael Scanlan, and Kevin Tracy. All of Corcoran’s Greek translations were done in consultation with two or more classicists. Corcoran never published a sentence without discussing it with his colleagues and students. REQUEST: Please send errors, omissions, and suggestions. I am especially interested in citations made in non-English publications.
Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
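The "logical notion of entropy" mentioned in the abstract above has a simple quantitative core: the logical entropy of a partition is the probability that two independent equiprobable draws from the universe set land in different blocks, i.e. 1 minus the sum of the squared block proportions. The following is a minimal sketch under that definition; the function name and block representation are our own.

```python
from fractions import Fraction

def logical_entropy(partition):
    # partition: iterable of disjoint blocks covering the universe set
    u = sum(len(block) for block in partition)  # size of the universe
    # 1 - sum((|B|/|U|)^2): probability two draws distinguish a pair
    return 1 - sum(Fraction(len(block), u) ** 2 for block in partition)

# The indiscrete partition (one block) distinguishes nothing: entropy 0.
assert logical_entropy([{1, 2, 3, 4}]) == 0

# The discrete partition on 4 elements distinguishes every pair: 1 - 4*(1/4)^2.
assert logical_entropy([{1}, {2}, {3}, {4}]) == Fraction(3, 4)

# Refining a partition can only increase logical entropy.
coarse = [{1, 2}, {3, 4}]
fine = [{1, 2}, {3}, {4}]
assert logical_entropy(fine) > logical_entropy(coarse)
```

The monotonicity under refinement matches the abstract's point that attributes become more definite as the inverse-image partition becomes more refined: more refined partitions make more distinctions.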
A central area of current philosophical debate in the foundations of mathematics concerns whether or not there is a single, maximal, universe of set theory. Universists maintain that there is such a universe, while Multiversists argue that there are many universes, no one of which is ontologically privileged. Often model-theoretic constructions that add sets to models are cited as evidence in favour of the latter. This paper informs this debate by developing a way for a Universist to interpret talk that seems to necessitate the addition of sets to V. We argue that, despite the prima facie incoherence of such talk for the Universist, she nonetheless has reason to try to provide an interpretation of this discourse. We present a method of interpreting extension-talk (V-logic), and show how it captures satisfaction in `ideal' outer models and relates to impredicative class theories. We provide some reasons to regard the technique as philosophically virtuous, and argue that it opens new doors to philosophical and mathematical discussions for the Universist.
A reasoned argument or tarka is essential for a wholesome vāda that aims at establishing the truth. A strong tarka consists of a number of elements, including an anumāna based on a valid hetu. Several scholars, such as Dharmakīrti, Vasubandhu and Dignāga, have worked on theories for the establishment of a valid hetu to distinguish it from an invalid one. This paper aims to interpret Dignāga’s hetu-cakra, called the wheel of grounds, from a modern philosophical perspective by deconstructing it into a simple probabilistic mathematical model. The objective is to understand how and why a vāda based on a probabilistically weaker hetu can degrade into a jalpa or vitaṇḍā. To do so, the paper maps the concept of ‘Bounded Rationality’ onto the hetu-cakra. Bounded Rationality, an idea coined by the management thinker Herbert Simon, is often employed in understanding decision-making processes of rational agents. In the context of this paper, the concept would state that the prativādin and ālocaka (debater) may not hold unbounded information to back their pratijñā (proposition). The paper argues that within the probabilistically deconstructed hetu-cakra model, most people argue in the ‘Zone of Bounded Rationality’, and thus, the probability of a debate degrading into jalpa or vitaṇḍā is high.
1971. Discourse Grammars and the Structure of Mathematical Reasoning II: The Nature of a Correct Theory of Proof and Its Value, Journal of Structural Learning 3, #2, 1–16. Reprinted 1976 in Structural Learning II: Issues and Approaches, ed. J. Scandura, Gordon & Breach Science Publishers, New York, MR56#15263. This is the second of a series of three articles dealing with the application of linguistics and logic to the study of mathematical reasoning, especially in the setting of a concern for improvement of mathematical education. The present article presupposes the previous one. Herein we develop our ideas of the purposes of a theory of proof and the criterion of success to be applied to such theories. In addition we speculate at length concerning the specific kinds of uses to which a successful theory of proof may be put vis-à-vis improvement of various aspects of mathematical education. The final article will deal with the construction of such a theory. The first article is Discourse Grammars and the Structure of Mathematical Reasoning I: Mathematical Reasoning and Stratification of Language, Journal of Structural Learning 3, #1, 55–74 (1971). https://www.academia.edu/s/fb081b1886?source=link .
Heinrich Behmann (1891-1970) obtained his Habilitation under David Hilbert in Göttingen in 1921 with a thesis on the decision problem. In his thesis, he solved - independently of Löwenheim and Skolem's earlier work - the decision problem for monadic second-order logic in a framework that combined elements of the algebra of logic and the newer axiomatic approach to logic then being developed in Göttingen. In a talk given in 1921, he outlined this solution, but also presented important programmatic remarks on the significance of the decision problem and of decision procedures more generally. The text of this talk as well as a partial English translation are included.
Since the time of Aristotle's students, interpreters have considered Prior Analytics to be a treatise about deductive reasoning, more generally, about methods of determining the validity and invalidity of premise-conclusion arguments. People studied Prior Analytics in order to learn more about deductive reasoning and to improve their own reasoning skills. These interpreters understood Aristotle to be focusing on two epistemic processes: first, the process of establishing knowledge that a conclusion follows necessarily from a set of premises (that is, on the epistemic process of extracting information implicit in explicitly given information) and, second, the process of establishing knowledge that a conclusion does not follow. Despite the overwhelming tendency to interpret the syllogistic as formal epistemology, it was not until the early 1970s that it occurred to anyone to think that Aristotle may have developed a theory of deductive reasoning with a well worked-out system of deductions comparable in rigor and precision with systems such as propositional logic or equational logic familiar from mathematical logic. When modern logicians in the 1920s and 1930s first turned their attention to the problem of understanding Aristotle's contribution to logic in modern terms, they were guided both by the Frege-Russell conception of logic as formal ontology and at the same time by a desire to protect Aristotle from possible charges of psychologism. They thought they saw Aristotle applying the informal axiomatic method to formal ontology, not as making the first steps into formal epistemology. They did not notice Aristotle's description of deductive reasoning. Ironically, the formal axiomatic method (in which one explicitly presents not merely the substantive axioms but also the deductive processes used to derive theorems from the axioms) is incipient in Aristotle's presentation.
Partly in opposition to the axiomatic, ontically-oriented approach to Aristotle's logic and partly as a result of attempting to increase the degree of fit between interpretation and text, logicians in the 1970s working independently came to remarkably similar conclusions to the effect that Aristotle indeed had produced the first system of formal deductions. They concluded that Aristotle had analyzed the process of deduction and that his achievement included a semantically complete system of natural deductions including both direct and indirect deductions. Where the interpretations of the 1920s and 1930s attribute to Aristotle a system of propositions organized deductively, the interpretations of the 1970s attribute to Aristotle a system of deductions, or extended deductive discourses, organized epistemically. The logicians of the 1920s and 1930s take Aristotle to be deducing laws of logic from axiomatic origins; the logicians of the 1970s take Aristotle to be describing the process of deduction and in particular to be describing deductions themselves, both those deductions that are proofs based on axiomatic premises and those deductions that, though deductively cogent, do not establish the truth of the conclusion but only that the conclusion is implied by the premise-set. Thus, two very different and opposed interpretations had emerged, interestingly both products of modern logicians equipped with the theoretical apparatus of mathematical logic. The issue at stake between these two interpretations is the historical question of Aristotle's place in the history of logic and of his orientation in philosophy of logic. This paper affirms Aristotle's place as the founder of logic taken as formal epistemology, including the study of deductive reasoning. A by-product of this study of Aristotle's accomplishments in logic is a clarification of a distinction implicit in discourses among logicians--that between logic as formal ontology and logic as formal epistemology.
While not focusing on the history of classical logic, this book discusses and quotes central passages on its origins and development, mainly from a philosophical perspective. While not a book in mathematical logic, it approaches formal logic from an essentially mathematical perspective. Biased towards a computational approach, with SAT and VAL as its backbone, this is an introduction to logic that covers essential aspects of the three branches of logic, to wit, philosophical, mathematical, and computational.
Supervaluationism is often described as the most popular semantic treatment of indeterminacy. There's little consensus, however, about how to fill out the bare-bones idea to include a characterization of logical consequence. The paper explores one methodology for choosing between the logics: pick a logic that norms belief as classical consequence is standardly thought to do. The main focus of the paper considers a variant of standard supervaluationism, on which we can characterize degrees of determinacy. It applies the methodology above to focus on degree logic. This is developed first in a basic, single-premise case; and then extended to the multipremise case, and to allow degrees of consequence. The metatheoretic properties of degree logic are set out. On the positive side, the logic is supraclassical: all classically valid sequents are degree logic valid. Strikingly, metarules such as cut and conjunction introduction fail.
The rather unrestrained use of second-order logic in the neo-logicist program is critically examined. It is argued in some detail that it brings with it genuine set-theoretical existence assumptions and that the mathematical power that Hume’s Principle seems to provide, in the derivation of Frege’s Theorem, comes largely from the ‘logic’ assumed rather than from Hume’s Principle. It is shown that Hume’s Principle is in reality not stronger than the very weak Robinson Arithmetic Q. Consequently, only a few rudimentary facts of arithmetic are logically derivable from Hume’s Principle. And that hardly counts as a vindication of logicism.
“Second-order Logic” in Anderson, C.A. and Zeleny, M., eds., Logic, Meaning, and Computation: Essays in Memory of Alonzo Church. Dordrecht: Kluwer, 2001. Pp. 61–76. Abstract. This expository article focuses on the fundamental differences between second-order logic and first-order logic. It is written entirely in ordinary English without logical symbols. It employs second-order propositions and second-order reasoning in a natural way to illustrate the fact that second-order logic is actually a familiar part of our traditional intuitive logical framework and that it is not an artificial formalism created by specialists for technical purposes. To illustrate some of the main relationships between second-order logic and first-order logic, this paper introduces basic logic, a kind of zero-order logic, which is more rudimentary than first-order and which is transcended by first-order in the same way that first-order is transcended by second-order. The heuristic effectiveness and the historical importance of second-order logic are reviewed in the context of the contemporary debate over the legitimacy of second-order logic. Rejection of second-order logic is viewed as radical: an incipient paradigm shift involving radical repudiation of a part of our scientific tradition, a tradition that is defended by classical logicians. But it is also viewed as reactionary: as being analogous to the reactionary repudiation of symbolic logic by supporters of “Aristotelian” traditional logic. But even if “genuine” logic comes to be regarded as excluding second-order reasoning, which seems less likely today than fifty years ago, its effectiveness as a heuristic instrument will remain and its importance for understanding the history of logic and mathematics will not be diminished. Second-order logic may someday be gone, but it will never be forgotten.
Technical formalisms have been avoided entirely in an effort to reach a wide audience, but every effort has been made to limit the inevitable sacrifice of rigor. People who do not know second-order logic cannot understand the modern debate over its legitimacy and they are cut off from the heuristic advantages of second-order logic. And, what may be worse, they are cut off from an understanding of the history of logic and thus are constrained to have distorted views of the nature of the subject. As Aristotle first said, we do not understand a discipline until we have seen its development. It is a truism that a person's conceptions of what a discipline is and of what it can become are predicated on their conception of what it has been.
In the present paper we propose a system of propositional logic for reasoning about justification, truthmaking, and the connection between justifiers and truthmakers. The logic of justification and truthmaking is developed according to the fundamental ideas introduced by Artemov. Justifiers and truthmakers are treated in a similar way, exploiting the intuition that justifiers provide epistemic grounds for propositions to be considered true, while truthmakers provide ontological grounds for propositions to be true. This system of logic is then applied both for interpreting the notorious definition of knowledge as justified true belief and for advancing a new solution to Gettier counterexamples to this standard definition.
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
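The subset/partition duality described above can be made concrete by computing joins and meets in the lattice of partitions on a small unstructured set. This is a hedged sketch: a partition is modelled as a list of disjoint blocks, and the helper names and lattice orientation (more refined = higher, so join is the common refinement) follow one common convention rather than the paper's own notation.

```python
def join(p, q):
    """Common refinement: the nonempty intersections of blocks of p and q."""
    return [b & c for b in p for c in q if b & c]

def meet(p, q):
    """Coarsest common coarsening: merge blocks linked through either partition."""
    merged = []
    for b in [set(b) for b in p] + [set(b) for b in q]:
        # absorb every already-merged block that overlaps the new one
        overlapping = [m for m in merged if m & b]
        for m in overlapping:
            merged.remove(m)
            b = b | m
        merged.append(b)
    return merged

p = [{1, 2}, {3, 4}]
q = [{1, 3}, {2, 4}]
# These two partitions are "complementary": their join is the discrete
# partition (all singletons) and their meet is the indiscrete one (one block).
assert sorted(map(sorted, join(p, q))) == [[1], [2], [3], [4]]
assert sorted(map(sorted, meet(p, q))) == [[1, 2, 3, 4]]
```

Note that, unlike the subset lattice, the partition lattice is not distributive, which is one reason partition logic has its own character rather than being a notational variant of subset logic.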