Formal systems are standardly envisaged in terms of a grammar specifying well-formed formulae together with a set of axioms and rules. Derivations are ordered lists of formulae each of which is either an axiom or is generated from earlier items on the list by means of the rules of the system; the theorems of a formal system are simply those formulae for which there are derivations. Here we outline a set of alternative and explicitly visual ways of envisaging and analyzing at least simple formal systems using fractal patterns of infinite depth. Progressively deeper dimensions of such a fractal can be used to map increasingly complex wffs or increasingly complex 'value spaces', with tautologies, contradictions, and various forms of contingency coded in terms of color. This and related approaches, it turns out, offer not only visually immediate and geometrically intriguing representations of formal systems as a whole but also promising formal links (1) between standard systems and classical patterns in fractal geometry, (2) between quite different kinds of value spaces in classical and infinite-valued logics, and (3) between cellular automata and logic. It is hoped that pattern analysis of this kind may open possibilities for a geometrical approach to further questions within logic and metalogic.
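The color-coding idea above can be illustrated with a minimal, self-contained sketch (an illustration only, not the authors' fractal construction; the function and color names are invented for the example): a propositional wff is classified by brute-force truth-table enumeration and then mapped to a display color.

```python
from itertools import product

def classify(formula, variables):
    """Classify a propositional wff by enumerating all valuations.

    `formula` is a function from a dict of truth values to bool;
    `variables` lists the propositional letters it mentions.
    """
    values = [formula(dict(zip(variables, row)))
              for row in product([False, True], repeat=len(variables))]
    if all(values):
        return "tautology"
    if not any(values):
        return "contradiction"
    return "contingent"

# Hypothetical color scheme in the spirit of the abstract's coding.
COLORS = {"tautology": "green", "contradiction": "red", "contingent": "gray"}

if __name__ == "__main__":
    # p -> (q -> p) is a classical tautology.
    wff = lambda v: (not v["p"]) or ((not v["q"]) or v["p"])
    kind = classify(wff, ["p", "q"])
    print(kind, COLORS[kind])  # tautology green
```

In the fractal representation described above, such colored cells would then be arranged recursively, with deeper levels of the pattern corresponding to wffs or value spaces of greater complexity.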
Because formal systems of symbolic logic inherently express and represent the deductive inference model, formal proofs to theorem consequences can be understood to represent sound deductive inference to deductive conclusions, without any need for other representations.
To eliminate incompleteness, undecidability, and inconsistency from formal systems, we only need to convert the formal proofs to theorem consequences of symbolic logic so that they conform to the sound deductive inference model. Within the sound deductive inference model there is a connected sequence of valid deductions from true premises to a true conclusion; thus, unlike the formal proofs of symbolic logic, provability cannot diverge from truth.
The naive idea of a mimesis between theory and experiments, a concept still lasting in many epistemologies, is here replaced by a more sophisticated mathematical methexis, according to which theoretical physics is a system of production of formal structures under strong mathematical constraints, such as global and local symmetries. Instead of an ultimate “everything theory”, the image of physical theories proposed here is a totality of interconnected structures establishing the very conditions of its “thinkability” and the relations with the experimental domain.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
This special issue of the Logic Journal of the IGPL includes revised and updated versions of the best work presented at the fourth edition of the workshop Formal Approaches to Multi-Agent Systems, FAMAS'09, which took place in Turin, Italy, from 7 to 11 September, 2009, under the umbrella of the Multi-Agent Logics, Languages, and Organisations Federated Workshops (MALLOW). Just like its predecessor, this FAMAS 2009 special issue reports research that is very much inspired by practical concerns. This time the authors of all five selected papers are concerned with knowledge and beliefs in multi-agent settings: How to create a group belief in a fair way from individual plausibility orderings? How to close gaps and resolve ambiguities in a tractable way, when information comes from multiple sources? How to reason about a spatial environment? How to compare the strengths of an agent's beliefs in a principled way? How to decide as efficiently as possible whether a given formula concerning group beliefs is valid? These questions and their answers lead to a multi-faceted and at the same time coherent special issue. We concisely introduce the five articles.
Over the last decade, multi-agent systems have come to form one of the key technologies for software development. The Formal Approaches to Multi-Agent Systems (FAMAS) workshop series brings together researchers from the fields of logic, theoretical computer science and multi-agent systems in order to discuss formal techniques for specifying and verifying multi-agent systems. FAMAS addresses the issues of logics for multi-agent systems, formal methods for verification, for example model checking, and formal approaches to cooperation, multi-agent planning, communication, coordination, negotiation, games, and reasoning under uncertainty in a distributed environment. In 2007, the third FAMAS workshop, FAMAS'007, was one of the agent workshops gathered together under the umbrella of Multi-Agent Logics, Languages, and Organisations - Federated Workshops, MALLOW'007, taking place from 3 to 7 September 2007 in Durham. This current special issue of the Logic Journal of the IGPL gathers together the revised and updated versions of the five best FAMAS'007 contributions.
This paper reviews the central points and presents some recent developments of the epistemic approach to paraconsistency in terms of the preservation of evidence. Two formal systems are surveyed, the basic logic of evidence (BLE) and the logic of evidence and truth (LETJ), designed to deal, respectively, with evidence and with evidence and truth. While BLE is equivalent to Nelson’s logic N4, it has been conceived for a different purpose. Adequate valuation semantics that provide decidability are given for both BLE and LETJ. The meanings of the connectives of BLE and LETJ, from the point of view of preservation of evidence, are explained with the aid of an inferential semantics. A formalization of the notion of evidence for BLE as proposed by M. Fitting is also reviewed here. As a novel result, the paper shows that LETJ is semantically characterized through the so-called Fidel structures. Some opportunities for further research are also discussed.
Edward Nieznański developed two logical systems in order to deal with a version of the problem of evil associated with two formulations of religious determinism. The aim of this research was to revisit these systems, providing them with a more appropriate formalization. The new resulting systems, namely, N1 and N2, were reformulated in first-order modal logic; they retain much of their original basic structures, but some additional results were obtained. Furthermore, our research found that an underlying minimal set of axioms is enough to settle the questions proposed. Thus, we developed a minimal system, called N3, that solves the same issues tackled by N1 and N2, but with fewer assumptions than these systems. All of the systems developed here are proposed as solutions to the logical problem of evil through the refutation of two versions of religious determinism, showing that the attributes of God in Classical Theism, namely, those of omniscience, omnipotence, infallibility, and omnibenevolence, when formalized, are consistent with the existence of evil, providing one more response to this traditional issue.
This work aims to present an overview of the top-level ontology BFO - Basic Formal Ontology - and its applicability to satellite systems. As an upper-level ontology, BFO was designed to be extended, providing the basis for the specification of detailed representational artifacts about scientific information domains. These aspects, together with the challenges posed by the complexity and large size of satellite systems, compose a suitable scenario for the creation of a specialized dialect to improve efficiency and accuracy when modeling such systems. By analyzing BFO-based ontologies in other disciplines and existing satellite models, it is possible to describe an application for satellite systems, which can provide a foundation for the creation of a concrete ontology to be applied to satellite modeling.
This paper introduces new logical systems which axiomatize a formal representation of inconsistency (here taken to be equivalent to contradictoriness) in classical logic. We start from an intuitive semantical account of inconsistent data, fixing some basic requirements, and provide two distinct sound and complete axiomatics for such semantics, LFI1 and LFI2, as well as their first-order extensions, LFI1* and LFI2*, depending on which additional requirements are considered. These formal systems are examples of what we dub Logics of Formal Inconsistency (LFI) and form part of a much larger family of similar logics. We also show that there are translations from classical and paraconsistent first-order logics into LFI1* and LFI2*, and back. Hence, despite their status as subsystems of classical logic, LFI1* and LFI2* can codify any classical or paraconsistent reasoning.
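For orientation, the trait that makes a system a Logic of Formal Inconsistency can be stated schematically as follows (this is the general pattern from the LFI literature, expressed with a consistency operator ∘; the systems above work with a representation of inconsistency, definable as its dual):

```latex
% Schematic LFI pattern: contradictions do not explode in general,
% but they do explode for formulas marked as consistent ("gentle explosion").
\exists\alpha\,\exists\beta\;(\alpha,\neg\alpha \nvdash \beta)
\qquad\text{while}\qquad
{\circ}\alpha,\ \alpha,\ \neg\alpha \vdash \beta \quad\text{for all }\alpha,\beta .
```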
Although formal thought disorder (FTD) has long been a clinical label in the assessment of some psychiatric disorders, in particular of schizophrenia, it remains a source of controversy, mostly because it is hard to say what exactly the “formal” in FTD refers to. We see anomalous processing of terminological knowledge, a core construct of human knowledge in general, behind FTD symptoms, and we approach this anomaly from a strictly formal perspective. More specifically, we present here a symbolic computational model of storage in, and activation of, a human semantic network, or semantic memory, whose core element is logical form; this is normalized by description logic (DL), namely by CL, a DL-based language – Conception Language – designed to formalize conceptualization from the viewpoint of individual cognitive agency. In this model, disruptions in the rule-based implementation of the logical form account for the apparently semantic anomalies symptomatic of FTD, which are detected by means of a CL-based algorithmic assessment.
Common sense is on the one hand a certain set of processes of natural cognition - of speaking, reasoning, seeing, and so on. On the other hand common sense is a system of beliefs (of folk physics, folk psychology and so on). Over against both of these is the world of common sense, the world of objects to which the processes of natural cognition and the corresponding belief-contents standardly relate. What are the structures of this world? How does the scientific treatment of this world relate to traditional and contemporary metaphysics and formal ontology? Can we embrace a thesis of common-sense realism to the effect that the world of common sense exists uniquely? Or must we adopt instead a position of cultural relativism which would assign distinct worlds of common sense to each group and epoch? The present paper draws on recent work in computer science (especially in the fields of naive and qualitative physics), in perceptual and developmental psychology, and in cognitive anthropology, in order to consider in a new light these and related questions and to draw conclusions for the methodology and philosophical foundations of the cognitive sciences.
The formal sciences - mathematical as opposed to natural sciences, such as operations research, statistics, theoretical computer science, systems engineering - appear to have achieved mathematically provable knowledge directly about the real world. It is argued that this appearance is correct.
Two senses of ‘ontology’ can be distinguished in the current literature. First is the sense favored by information scientists, who view ontologies as software implementations designed to capture in some formal way the consensus conceptualization shared by those working on information systems or databases in a given domain [Gruber 1993]. Second is the sense favored by philosophers, who regard ontologies as theories of different types of entities (objects, processes, relations, functions) [Smith 2003]. Where information systems ontologists seek to maximize reasoning efficiency even at the price of simplifications on the side of representation, philosophical ontologists argue that representational adequacy can bring benefits for the stability and resistance to error of an ontological framework and also for its extendibility in the future. In bioinformatics, however, a third sense of ‘ontology’ has established itself, above all as a result of the successes of the Gene Ontology (hereafter: GO), which is a tool for the representation and processing of information about gene products and their biological functions [Gene Ontology Consortium 2000]. We show how Basic Formal Ontology (BFO) has established itself as an overarching ontology drawing on all three of the strands distinguished above, and describe applications of BFO especially in the treatment of biological granularity.
One of the tasks of ontology in information science is to support the classification of entities according to their kinds and qualities. We hold that to realize this task as far as entities such as material objects are concerned we need to distinguish four kinds of entities: substance particulars, quality particulars, substance universals, and quality universals. These form, so to speak, an ontological square. We present a formal theory of classification based on this idea, including both a semantics for the theory and a provably sound axiomatization.
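A minimal data-model sketch of the ontological square may help fix ideas (an illustration under the usual reading of the square, not the paper's axiomatization; all class and relation names are invented): particulars instantiate universals, and quality particulars inhere in substance particulars.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SubstanceUniversal:
    name: str

@dataclass(frozen=True)
class QualityUniversal:
    name: str

@dataclass(frozen=True)
class SubstanceParticular:
    name: str
    instantiates: SubstanceUniversal      # particular -> universal

@dataclass(frozen=True)
class QualityParticular:
    name: str
    instantiates: QualityUniversal        # particular -> universal
    inheres_in: SubstanceParticular       # quality -> its bearer

if __name__ == "__main__":
    apple_kind = SubstanceUniversal("apple")
    redness = QualityUniversal("redness")
    this_apple = SubstanceParticular("apple#42", instantiates=apple_kind)
    its_red = QualityParticular("red#42", instantiates=redness, inheres_in=this_apple)
    # Classification: a particular falls under the universal it instantiates.
    print(this_apple.instantiates.name, its_red.instantiates.name, its_red.inheres_in.name)
```

Classification in the sense of the abstract then amounts to tracking which universals a given particular instantiates and which quality particulars inhere in it.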
Edward Nieznański developed two logical systems to deal with the problem of evil and to refute religious determinism. However, when formalized in first-order modal logic, two axioms of each system contradict one another, revealing that there is an underlying minimal set of axioms that is enough to settle the questions. In this article, we develop this minimal system, called N3, which is based on Nieznański’s contribution. The purpose of N3 is to solve the logical problem of evil through the defeat of a version of religious determinism. On the one hand, these questions are also addressed by Nieznański’s systems, but, on the other hand, they are obtained in N3 with fewer assumptions. Our approach can be considered a case of logic of religion, that is, of logic applied to religious discourse, as proposed by Józef Maria Bocheński; in this particular case, it is a discourse in theodicy, which is situated in the context of the philosophy of religion.
We propose a typology of representational artifacts for health care and life sciences domains and associate this typology with different kinds of formal ontology and logic, drawing conclusions as to the strengths and limitations for ontology in a description logics framework. The four types of domain representation we consider are: (i) lexico-semantic representation, (ii) representation of types of entities, (iii) representation of background knowledge, and (iv) representation of individuals. We advocate a clear distinction of the four kinds of representation in order to provide a more rational basis for using ontologies and related artifacts to advance integration of data and enhance interoperability of associated reasoning systems. We highlight the fact that only a minor portion of scientifically relevant facts in a domain such as biomedicine can be adequately represented by formal ontologies as long as the latter are conceived as representations of entity types. In particular, the attempt to encode default or probabilistic knowledge using ontologies so conceived is prone to produce unintended, erroneous models.
An intelligent tutoring system (ITS) is a computer system that aims to provide immediate and customized instruction or feedback to learners, usually without the intervention of a human teacher. ITSs have the common goal of enabling learning in a meaningful and effective manner through the use of a variety of computing technologies. There are many examples of ITSs used in both formal education and professional settings that have proven their capabilities. There is a close relationship between intelligent tutoring, cognitive learning theories, and design; and there is ongoing research to improve the effectiveness of ITSs. Such systems aim to offer a solution to the problem of students' over-reliance on teachers for quality education, with the goal of providing access to high-quality education to every student and therefore of reforming the education system as a whole. In this paper, we use the Intelligent Tutoring System Builder (ITSB) to build an education system on cloud computing, covering the concept of cloud computing, its components, and how to take advantage of cloud computing in this field.
The human body is a system made of systems. The body is divided into bodily systems proper, such as the endocrine and circulatory systems, which are subdivided into many sub-systems at a variety of levels, whereby all systems and subsystems engage in massive causal interaction with each other and with their surrounding environments. Here we offer an explicit definition of bodily system and provide a framework for understanding the causal interactions of such systems. Medical sciences provide at best informal accounts of basic notions such as system, process, and function, and while such informality is acceptable in documentation created for human beings, it falls short of what is needed for computer representations. In our analysis we will accordingly provide the framework for a formal definition of bodily system and of associated notions.
We propose an ontological theory that is powerful enough to describe both complex spatio-temporal processes (occurrents) and the enduring entities (continuants) that participate in such processes. For this purpose we distinguish between meta-ontology and token ontologies. Token ontologies fall into two major categories: ontologies of type SPAN and ontologies of type SNAP. These represent two complementary perspectives on reality and result in distinct though compatible systems of categories. The meta-ontological level then describes the relationships between the different token ontologies. In a SNAP (snapshot) ontology we have enduring entities such as substances, qualities, roles, functions as these exist to be inventoried at a given moment of time. In a SPAN ontology we have perduring entities such as processes and their parts and aggregates. We argue that both kinds of ontological theory are required, together with the meta-ontology which joins them together, in order to give a non-reductionistic account of both static and dynamic aspects of the geospatial world.
An important part of the Unified Medical Language System (UMLS) is its Semantic Network, consisting of 134 Semantic Types connected to each other by edges formed by one or more of 54 distinct Relation Types. This Network is, however, for many purposes overly complex, and various groups have thus made attempts at simplification. Here we take this work further by simplifying the relations which involve the three Semantic Types – Diagnostic Procedure, Laboratory Procedure, and Therapeutic or Preventive Procedure. We define operators which can be used to generate terms instantiating types from this selected set when applied to terms designating certain other Semantic Types, including almost all the terms specifying clinical tasks. Usage of such operators thus provides a useful and economical way of specifying clinical tasks. The operators allow us to define a mapping between those types within the UMLS which do not represent clinical tasks and those which do. This mapping then provides a basis for an ontology of clinical tasks that can be used in the formulation of computer-interpretable clinical guideline models.
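As an informal illustration of what such a term-generating operator might look like (purely hypothetical names and type strings, not actual UMLS identifiers and not the authors' own definitions):

```python
# Illustrative sketch: an operator maps a term of one Semantic Type
# (e.g. a disease term) to a term instantiating a procedure type,
# in the spirit of the mapping described in the abstract.

def make_operator(procedure_type: str):
    """Return an operator that builds a clinical-task term of the given
    procedure type from a term of some other Semantic Type."""
    def operator(source_term: str) -> dict:
        return {
            "semantic_type": procedure_type,
            "term": f"{procedure_type.lower()} for {source_term}",
        }
    return operator

diagnostic_procedure_for = make_operator("Diagnostic Procedure")
therapeutic_procedure_for = make_operator("Therapeutic or Preventive Procedure")

if __name__ == "__main__":
    print(diagnostic_procedure_for("diabetes mellitus"))
    # {'semantic_type': 'Diagnostic Procedure',
    #  'term': 'diagnostic procedure for diabetes mellitus'}
```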
Since the beginning of the 20th century to the present day, it has rarely been doubted that whenever formal aesthetic methods meet their iconological counterparts, the two approaches appear to be mutually exclusive. In reality, though, an ahistorical concept is challenging a historical analysis of art. It is especially Susanne K. Langer's long-overlooked system of analogies between perceptions of the world and of artistic creations, dependent on feelings, that today allows a rapprochement of these positions. Krois's insistence on a similar point supports this analysis. - It remains undisputed to this day that formal-scientific and iconological methods appear to be fundamentally mutually exclusive, since the former build on ahistorical and the latter on historical foundations. Against this, the present contribution aims to show how the research of Susanne K. Langer in particular, supplemented by that of John M. Krois, makes a rapprochement of the two positions possible.
A generative grammar for a language L generates one or more syntactic structures for each sentence of L and interprets those structures both phonologically and semantically. A widely accepted assumption in generative linguistics dating from the mid-60s, the Generative Grammar Hypothesis (GGH), is that the ability of a speaker to understand sentences of her language requires her to have tacit knowledge of a generative grammar of it, and the task of linguistic semantics in those early days was taken to be that of specifying the form that the semantic component of a generative grammar must take. Then in the 70s linguistic semantics took a curious turn. Without rejecting GGH, linguists turned away from the task of characterizing the semantic component of a generative grammar to pursue instead the Montague-inspired project of providing for natural languages the same kind of model-theoretic semantics that logicians devise for the artificial languages of formal systems of logic, and “formal semantics” continues to dominate semantics in linguistics. This essay argues that the sort of compositional meaning theory that would verify GGH would not only be quite different from the theories formal semanticists construct, but would be a more fundamental theory that supersedes those theories in that it would explain why they are true when they are true, but their truth wouldn’t explain its truth. Formal semantics has undoubtedly made important contributions to our understanding of such phenomena as anaphora and quantification, but semantics in linguistics is supposed to be the study of meaning. This means that the formal semanticist can’t be unconcerned that the kind of semantic theory for a natural language that interests her has no place in a theory of linguistic competence; for if GGH is correct, then the more fundamental semantic theory is the compositional meaning theory that is the semantic component of the internally represented generative grammar, and if that is so, then linguistic semantics has so far ignored what really ought to be its primary concern.
A theory of cognitive systems individuation is presented and defended. The approach has some affinity with Leonard Talmy's Overlapping Systems Model of Cognitive Organization, and the paper's first section explores aspects of Talmy's view that are shared by the view developed herein. According to the view on offer -- the conditional probability of co-contribution account (CPC) -- a cognitive system is a collection of mechanisms that contribute, in overlapping subsets, to a wide variety of forms of intelligent behavior. Central to this approach is the idea of an integrated system. A formal characterization of integration is laid out in the form of a conditional-probability-based measure of the clustering of causal contributors to the production of intelligent behavior. I relate the view to the debate over extended and embodied cognition and respond to objections that have been raised in print by Andy Clark, Colin Klein, and Felipe de Brigard.
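The flavor of a conditional-probability-based clustering measure can be sketched as follows (a rough illustration, not the paper's own formula; the data and function names are invented): from a binary record of which mechanisms contributed to which behaviors, estimate how strongly each mechanism's contribution predicts each other's.

```python
import numpy as np

# rows = observed intelligent behaviors, columns = mechanisms (1 = contributed)
contributions = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [1, 1, 0, 1],
])

def co_contribution(matrix: np.ndarray) -> np.ndarray:
    """Estimate P(mechanism j contributes | mechanism i contributes)."""
    counts = matrix.sum(axis=0)          # how often each mechanism contributes
    joint = matrix.T @ matrix            # pairwise co-contribution counts
    return joint / counts[:, None]       # row i is conditioned on mechanism i

if __name__ == "__main__":
    # High mutual values across a subset of columns would mark a cluster of
    # mechanisms, i.e. a candidate integrated cognitive system on this sketch.
    print(np.round(co_contribution(contributions), 2))
```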
Could the intersection of [formal proofs of mathematical logic] and [sound deductive inference] specify formal systems having [deductively sound formal proofs of mathematical logic]? All that we have to do to provide [deductively sound formal proofs of mathematical logic] is to select the subset of conventional [formal proofs of mathematical logic] having true premises; we then have [deductively sound formal proofs of mathematical logic].
Ontologies are some of the most central constructs in today's large plethora of knowledge technologies, namely in the context of the semantic web. As their coinage indicates, they are direct heirs to the ontological investigations in the long Western philosophical tradition, but it is not easy to make bridges between them. Contemporary ontological commitments often take causality as a central aspect for the ur-segregation of entities, especially in scientific upper ontologies; theories of causality and philosophical ontological investigations often go hand-in-hand, and were essentially inseparable in medieval thought. This constitutes the foundation for a bridge, and this article analyzes the causality-based ontology of the late medieval philosopher Dietrich of Freiberg from the viewpoint of today's upper-ontology engineering. In this bridging attempt, it offers a translation into English of the first part of Dietrich's De origine (abbreviated title) that is a compromise between traditional scholarly translations of medieval Latin philosophical texts and contemporary ontology.
With the advent of computers in the experimental labs, dynamic systems have become a new tool for research on problem solving and decision making. A short review of this research is given and the main features of these systems (connectivity and dynamics) are illustrated. To allow systematic approaches to the influential variables in this area, two formal frameworks (linear structural equations and finite state automata) are presented. Besides the formal background, the article sets out how the task demands of system identification and system control can be realised in these environments, and how psychometrically acceptable dependent variables can be derived.
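A toy version of the second framework may be useful for concreteness (an invented example, not one of the article's own tasks): a small deterministic finite state automaton standing in for a dynamic system that participants might be asked to identify or control.

```python
# Transition table of a deterministic finite state automaton:
# (current state, intervention) -> next state.
TRANSITIONS = {
    ("cold", "heat"): "warm",
    ("warm", "heat"): "hot",
    ("warm", "cool"): "cold",
    ("hot", "cool"): "warm",
}

def run(state, interventions):
    """Drive the automaton through a sequence of interventions."""
    for action in interventions:
        # Undefined moves leave the state unchanged.
        state = TRANSITIONS.get((state, action), state)
    return state

if __name__ == "__main__":
    # System identification: infer TRANSITIONS from observed input/output pairs.
    # System control: choose interventions that reach a goal state.
    print(run("cold", ["heat", "heat", "cool"]))  # warm
```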
In this paper, we present an intelligent tutoring system developed to help students in learning Computer Theory. The intelligent tutoring system was built using the ITSB authoring tool. The system helps students to learn finite automata, pushdown automata, and Turing machines, and examines the relationship between these automata and formal languages, deterministic and nondeterministic machines, regular expressions, context-free grammars, undecidability, and complexity. During the process the intelligent tutoring system gives assistance and feedback of many types in an intelligent manner according to the behavior of the student. An evaluation of the intelligent tutoring system has revealed reasonably acceptable results as far as its usability and learning abilities are concerned.
This paper defends a view of the Gene Ontology (GO) and of Basic Formal Ontology (BFO) as examples of what the manufacturing industry calls product-service systems. This means that they are products (the ontologies) bundled with a range of ontology services such as updates, training, help desk, and permanent identifiers. The paper argues that GO and BFO are contrasted in this respect with DOLCE, which approximates more closely to a scientific theory or a scientific publication. The paper provides a detailed overview of ontology services and concludes with a discussion of some implications of the product-service system approach for the understanding of the nature of applied ontology. Ontology developer communities are compared in this respect with developers of scientific theories and of standards (such as W3C). For each of these we can ask: what kinds of products do they develop and what kinds of services do they provide for the users of these products?
The term ‘formal ontology’ was first used by the philosopher Edmund Husserl in his Logical Investigations to signify the study of those formal structures and relations – above all relations of part and whole – which are exemplified in the subject-matters of the different material sciences. We follow Husserl in presenting the basic concepts of formal ontology as falling into three groups: the theory of part and whole, the theory of dependence, and the theory of boundary, continuity and contact. These basic concepts are presented in relation to the problem of providing an account of the formal ontology of the mesoscopic realm of everyday experience, and specifically of providing an account of the concept of individual substance.
Substructural logics and their application to logical and semantic paradoxes have been extensively studied, but non-reflexive systems have been somewhat neglected. Here, we aim to fill this lacuna, at least in part, by presenting a non-reflexive logic and theory of naive consequence (and truth). We also investigate the semantics and the proof-theory of the system. Finally, we develop a compositional theory of truth (and consequence) in our non-reflexive framework.
The article deals with some current pioneering formal reconstructions and interpretations of the problem well known in antiquity as The Master Argument. This problem concerns the enrichment of formal logical systems with modal and temporal notions. The opening topic is devoted to the reconstruction by Arthur Prior, while the other approaches to the problem included here are mostly reactions, revisions, or additions to this one.
We present an elementary system of axioms for the geometry of Minkowski spacetime. It strikes a balance between a simple and streamlined set of axioms and the attempt to give a direct formalization in first-order logic of the standard account of Minkowski spacetime in [Maudlin 2012] and [Malament, unpublished]. It is intended for future use in the formalization of physical theories in Minkowski spacetime. The choice of primitives is in the spirit of [Tarski 1959]: a predicate of betweenness and a four-place predicate to compare the squares of the relativistic intervals. Minkowski spacetime is described as a four-dimensional ‘vector space’ that can be decomposed everywhere into a spacelike hyperplane - which obeys the Euclidean axioms in [Tarski and Givant, 1999] - and an orthogonal timelike line. The lengths of other ‘vectors’ are calculated according to Pythagoras’ theorem. We conclude with a Representation Theorem relating models of our system that satisfy second-order continuity to the mathematical structure called ‘Minkowski spacetime’ in physics textbooks.
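For orientation, the quantity compared by the four-place primitive corresponds to the familiar coordinate expression of the squared relativistic interval (given here only as background; the axiom system itself is synthetic and coordinate-free):

```latex
% Squared interval between events p and q in inertial coordinates,
% with signature (+,-,-,-):
I^{2}(p,q) \;=\; (t_p - t_q)^{2} - (x_p - x_q)^{2} - (y_p - y_q)^{2} - (z_p - z_q)^{2}.
```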
We outline the possibility of extending the quantum formalism in relation to the requirements of the general systems theory. This can be done by using a quantum semantics arising from the deep logical structure of quantum theory, making it possible to take into account the logical openness relationship between observer and system. We show how considering the truth-values of quantum propositions within the context of fuzzy sets is more useful for systemics. In conclusion we propose an example of formal quantum coherence.
The integration of information resources in the life sciences is one of the most challenging problems facing bioinformatics today. We describe how Language and Computing nv, originally a developer of ontology-based natural language understanding systems for the healthcare domain, is developing a framework for the integration of structured data with unstructured information contained in natural language texts. L&C’s LinkSuite™ combines the flexibility of a modular software architecture with an ontology based on rigorous philosophical and logical principles that is designed to comprehend the basic formal relationships that structure both reality and the ways humans perceive and communicate about reality.
In this paper two systems of AGM-like Paraconsistent Belief Revision are overviewed, both defined over Logics of Formal Inconsistency (LFIs) due to the possibility of defining a formal consistency operator within these logics. The AGM° system is strongly based on this operator and internalizes the notion of formal consistency in its explicit constructions and postulates. Alternatively, the AGMp system uses the AGM-compliance of LFIs and thus assumes a wider notion of paraconsistency - not necessarily related to the notion of formal consistency.
A critique is given of the attempt by Hettema and Kuipers to formalize the periodic table. In particular I dispute their notions of identifying a naïve periodic table with tables having a constant periodicity of eight elements and their views on the different conceptions of the atom by chemists and physicists. The views of Hettema and Kuipers on the reduction of the periodic system to atomic physics are also considered critically.
Because formal systems of symbolic logic inherently express and represent the deductive inference model, formal proofs to theorem consequences can be understood to represent sound deductive inference to true conclusions, without any need for other representations such as model theory.
A discussion of what operates from "within" formal agency as irreal surplus to artworks and how otherwise discursive systems become abstracted by the artwork. Text by Gavin Keeney. Images by Parsa Khalili.
This short paper grew out of an observation—made in the course of a larger research project—of a surprising convergence between, on the one hand, certain themes in the work of Mary Hesse and Nelson Goodman in the 1950s/60s and, on the other hand, recent work on the representational resources of science, in particular regarding model-based representation. The convergence between these more recent accounts of representation in science and the earlier proposals by Hesse and Goodman consists in the recognition that, in order to secure successful representation in science, collective representational resources must be available. Such resources may take the form of (amongst others) mathematical formalisms, diagrammatic methods, notational rules, or—in the case of material models—conventions regarding the use and manipulation of the constituent parts. More often than not, an abstract characterization of such resources tells only half the story, as they are constituted equally by the pattern of (practical and theoretical) activities—such as instances of manipulation or inference—of the researchers who deploy them. In other words, representational resources need to be sustained by a social practice; this is what renders them collective representational resources in the first place.
In his book, History as a Science and the System of the Sciences, Thomas Seebohm articulates the view that history can serve to mediate between the sciences of explanation and the sciences of interpretation, that is, between the natural sciences and the human sciences. Among other things, Seebohm analyzes history from a phenomenological perspective to reveal the material foundations of the historical human sciences in the lifeworld. As a preliminary to his analyses, Seebohm examines the formal and material presuppositions of phenomenological epistemology, as well as the emergence of the human sciences and the traditional distinctions and divisions that are made between the natural and the human sciences. As part of this examination, Seebohm devotes a section to discussing Husserl’s formal mereology, because he understands that a reflective analysis of the foundations of the historical sciences requires a reflective analysis of the objects of the historical sciences, that is, of concrete organic wholes (i.e., social groups) and of their parts. Seebohm concludes that Husserl’s mereological ontology needs to be altered with regard to the historical sciences because the relations between organic wholes and their parts are not summative relations. Seebohm’s conclusion is relevant for the issue of the reducibility of organic wholes such as social groups to their parts and for the issue of the reducibility of the historical sciences to the lower-order sciences, that is, to the sciences concerned with lower-order ontologies. In this paper, I propose to extend Seebohm’s conclusion to the ontology of chemical wholes as objects of quantum chemistry and to argue that Husserl’s formal mereology is descriptively inadequate for this regional ontology as well. This may seem surprising at first, since the objects studied by quantum chemists are not organic wholes. However, my discussion of atoms and molecules as they are understood in quantum chemistry will show that Husserl’s classical summative and extensional mereology does not accurately capture the relations between chemical wholes and their parts. This conclusion is relevant for the question of the reducibility of chemical wholes to their parts and of the reducibility of chemistry to physics, issues that have been of central importance within the philosophy of chemistry for the past several decades.
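For readers unfamiliar with the target of the criticism, the extensionality principle of classical extensional mereology, the principle that summative accounts rely on and that is here argued to fail for chemical (as for organic) wholes, can be stated as follows (a standard formulation, with ≺ for proper parthood):

```latex
% Extensionality of proper parthood: composite objects with the same
% proper parts are identical.
\forall x\,\forall y\,\bigl[(\exists z\,(z \prec x) \lor \exists z\,(z \prec y))
  \rightarrow (\forall z\,(z \prec x \leftrightarrow z \prec y) \rightarrow x = y)\bigr].
```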
Judaic Logic is an original inquiry into the forms of thought determining Jewish law and belief, from the impartial perspective of a logician. Judaic Logic attempts to honestly estimate the extent to which the logic employed within Judaism fits into the general norms, and whether it has any contributions to make to them. The author ranges far and wide in Jewish lore, finding clear evidence of both inductive and deductive reasoning in the Torah and other books of the Bible, and analyzing the methodology of the Talmud and other Rabbinic literature by means of formal tools which make possible its objective evaluation with reference to scientific logic. The result is a highly innovative work – incisive and open, free of clichés or manipulation. Judaic Logic succeeds in translating vague and confusing interpretative principles and examples into formulas with the clarity and precision of Aristotelean syllogism. Among the positive outcomes, for logic in general, are a thorough listing, analysis and validation of the various forms of a-fortiori argument, as well as a clarification of dialectic logic. However, on the negative side, this demystification of Talmudic/Rabbinic modes of thought (hermeneutic and heuristic) reveals most of them to be, contrary to the boasts of orthodox commentators, far from deductive and certain. They are often, legitimately enough, inductive. But they are also often unnatural and arbitrary constructs, supported by unverifiable claims and fallacious techniques. Many other thought-processes, used but not noticed or discussed by the Rabbis, are identified in this treatise, and subjected to logical review. Various more or less explicit Rabbinic doctrines, which have logical significance, are also examined in it. In particular, this work includes a formal study of the ethical logic (deontology) found in Jewish law, to elicit both its universal aspects and its peculiarities. With regard to Biblical studies, one notable finding is an explicit formulation (which, however, the Rabbis failed to take note of and stress) of the principles of adduction in the Torah, written long before the acknowledgement of these principles in Western philosophy and their assimilation in a developed theory of knowledge. Another surprise is that, in contrast to Midrashic claims, the Tanakh (Jewish Bible) contains a lot more than ten instances of qal vachomer (a-fortiori) reasoning. In sum, Judaic Logic elucidates and evaluates the epistemological assumptions which have generated the Halakhah (Jewish religious jurisprudence) and allied doctrines. Traditional justifications, or rationalizations, concerning Judaic law and belief, are carefully dissected and weighed at the level of logical process and structure, without concern for content. This foundational approach, devoid of any critical or supportive bias, clears the way for a timely reassessment of orthodox Judaism (and incidentally, other religious systems, by means of analogies or contrasts). Judaic Logic ought, therefore, to be read by all Halakhists, as well as Bible and Talmud scholars and students; and also by everyone interested in the theory, practice and history of logic.
This work addresses a broad range of questions which belong to four fields: computation theory, general philosophy of science, philosophy of cognitive science, and philosophy of mind. Dynamical system theory provides the framework for a unified treatment of these questions. The main goal of this dissertation is to propose a new view of the aims and methods of cognitive science--the dynamical approach. According to this view, the object of cognitive science is a particular set of dynamical systems, which I call "cognitive systems". The goal of a cognitive study is to specify a dynamical model of a cognitive system, and then use this model to produce a detailed account of the specific cognitive abilities of that system. The dynamical approach does not limit a priori the form of the dynamical models which cognitive science may consider. In particular, this approach is compatible with both computational and connectionist modeling, for both computational systems and connectionist networks are special types of dynamical systems. To substantiate these methodological claims about cognitive science, I deal first with two questions in two different fields: What is a computational system? What is a dynamical explanation of a deterministic process? Intuitively, a computational system is a deterministic system which evolves in discrete time steps, and which can be described in an effective way. In chapter 1, I give a formal definition of this concept which employs the notions of isomorphism between dynamical systems, and of Turing computable function. In chapter 2, I propose a more comprehensive analysis which is based on a natural generalization of the concept of Turing machine. The goal of chapter 3 is to develop a theory of the dynamical explanation of a deterministic process. By a "dynamical explanation" I mean the specification of a dynamical model of the system or process which we want to explain. I start from the analysis of a specific type of explanandum--dynamical phenomena--and I then use this analysis to shed light on the general form of a dynamical explanation. Finally, I analyze the structure of those theories which generate explanations of this form, namely dynamical theories.
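Schematically, and only as a gloss on the informal characterization in the abstract (not the dissertation's official definitions), the idea behind the first chapter can be put as follows:

```latex
% A discrete-time deterministic dynamical system is a pair <M, g> with
% state space M and transition function g : M -> M; its t-step evolution
% is the iterate g^t. It counts as a computational system when, relative
% to some effective encoding e of states as strings, the induced map is
% Turing computable.
S = \langle M, g \rangle, \qquad
g^{t} = \underbrace{g \circ \cdots \circ g}_{t\ \text{times}}, \qquad
e \circ g \circ e^{-1}\ \text{Turing computable}.
```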
One of the most expected properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of the deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics are uniquely characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
Life as self-organization is philosophically understood by L. Polo in terms of co-causality between matter, formal configuration, and intrinsic efficiency. This characterization provides a dynamic account of life and soul, capable of explaining both its identity and its continuous renovation. In this article I especially highlight this author's metaphysical notions of finality, unity, and cosmos, which may be helpful for understanding the sense of biological systems in the universe.
Genera, typically hand-in-hand with their branching species, are essential elements of vocabulary-based information constructs, in particular scientific taxonomies. Should they also feature in formal ontologies, the highest of such constructs? I argue in this article that the answer is “Yes” and that the question posed in its title also has a Yes-answer: The way medieval ontologists sliced up the world into genera does matter to formal ontology. More specifically, the way Dietrich of Freiberg, a Latin scholastic, conceived and applied strictly generic criteria to slice up the world into its entities can provide some guidelines to the field of formal ontology with respect to not only its contents, but also its scope. In particular, Dietrich's information criterion plays here a central role.
One of the main motivations for having a compositional semantics is the account of the productivity of natural languages. Formal languages are often part of the account of productivity, i.e., of how beings with finite capacities are able to produce and understand a potentially infinite number of sentences, by offering a model of this process. This account of productivity consists in the generation of proofs in a formal system, that is taken to represent the way speakers grasp the meaning of an indefinite number of sentences. The informational basis is restricted to what is represented in the lexicon. This constraint is considered as a requirement for the account of productivity, or at least of an important feature of productivity, namely, that we can grasp automatically the meaning of a huge number of complex expressions, far beyond what can be memorized. However, empirical results in psycholinguistics, and especially particular patterns of ERP, show that the brain integrates information of different sources very fast, without any felt effort on the part of the speaker. This shows that formal procedures do not explain productivity. However, formal models are still useful in the account of how we get at the semantic value of a complex expression, once we have the meanings of its parts, even if there is no formal explanation of how we get at those meanings. A practice-oriented view of modeling gives an adequate interpretation of this result: formal compositional semantics may be a useful model for some explanatory purposes concerning natural languages, without being a good model for dealing with other explananda.
Arthur Norman Prior was born on 4 December 1914 in Masterton, New Zealand. He studied philosophy in the 1930s and was a significant, and often provocative, voice in theological debates until well into the 1950s. He became a lecturer in philosophy at Canterbury University College in Christchurch in 1946, succeeding Karl Popper. He became a full professor in 1952. He left New Zealand permanently for England in 1959, first taking a chair in philosophy at Manchester University, and then becoming a fellow of Balliol College, Oxford, in 1966. Prior died on 6 October 1969 in Trondheim, Norway. After Prior’s death, many logicians and philosophers have analysed and discussed his approach to formal and philosophical logic. In particular, his contributions to modal logic, tense-logic and deontic logic have been studied. In 1957, A.N. Prior proposed the three-valued modal logic Q as a ‘correct’ modal logic from his philosophical motivations, see Prior (1957). Prior developed Q in order to offer a logic for contingent beings, in which one could intelligibly and rationally state that some beings are contingent and some are necessary, see Akama & Nagata (2005). According to Akama & Nagata (2005), Q has a natural semantics. In other words, from the philosophical point of view, Q can be regarded as an ‘actualist’ modal logic. This review article is a developed description of, and discussion on, ‘The System Q’, which is the fifth chapter of Prior (1957). In addition, in his logical analysis of ‘Time & Existence’ (the eighth chapter of Prior (1967)), Prior worked on system Q. Thus, Prior (1967) has also been very useful for this article. This article analyses the logical structure of system Q in order to provide a more understandable description as well as logical analysis for today’s logicians, philosophers, and information and computer scientists. In the paper, the Polish notations are translated into modern notations in order to be more comprehensible and to support the developed formal descriptions as well as the semantic analysis.