The use-mention distinction is elaborated into a four-way distinction between use, formal mention, material mention and pragmatic mention. The notion of pragmatic mention is motivated through the problem of monsters in Kaplanian indexical semantics. It is then formalized and applied in an account of schemata in formalized languages.
One of the main motivations for having a compositional semantics is the account of the productivity of natural languages. Formal languages are often part of the account of productivity, i.e., of how beings with finite capacities are able to produce and understand a potentially infinite number of sentences, by offering a model of this process. This account of productivity consists in the generation of proofs in a formal system that is taken to represent the way speakers grasp the meaning of an indefinite number of sentences. The informational basis is restricted to what is represented in the lexicon. This constraint is considered a requirement for the account of productivity, or at least of an important feature of productivity, namely, that we can automatically grasp the meaning of a huge number of complex expressions, far beyond what can be memorized. However, empirical results in psycholinguistics, and especially particular patterns of ERP, show that the brain integrates information from different sources very fast, without any felt effort on the part of the speaker. This shows that formal procedures do not explain productivity. However, formal models are still useful in the account of how we get at the semantic value of a complex expression, once we have the meanings of its parts, even if there is no formal explanation of how we get at those meanings. A practice-oriented view of modeling gives an adequate interpretation of this result: formal compositional semantics may be a useful model for some explanatory purposes concerning natural languages, without being a good model for dealing with other explananda.
The purpose of this paper is twofold: (i) we will argue that formal semantics might have faltered due to its failure to distinguish between two fundamentally different types of concepts, namely ontological concepts, which should be types in a strongly-typed ontology, and logical concepts, which are predicates corresponding to properties of, and relations between, objects of various ontological types; and (ii) we show that accounting for these differences amounts to a new formal semantics: one that integrates lexical and compositional semantics in one coherent framework, and one where formal semantics is embedded in a strongly-typed ontology, an ontology that reflects our commonsense knowledge of the world and the way we talk about it in ordinary language. We will show how, in such a framework, a number of challenges in the semantics of natural language are adequately and systematically treated.
For any natural (human) or formal (mathematical) language L, we know that an expression X of language L is true if and only if there are expressions Γ of language L that connect X to known facts. By extending the notion of a Well-Formed Formula to include syntactically formalized rules for rejecting semantically incorrect expressions, we can recognize and reject expressions that evaluate to neither True nor False.
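The idea of rejecting expressions that evaluate to neither True nor False can be sketched with a toy three-valued evaluator. This is a hypothetical illustration, not the author's formalism: the `"liar"` atom, the tuple-based expression format, and the function names are all invented here to make the rejection rule concrete.

```python
# Toy sketch: expressions evaluate to True, False, or Neither, and the
# grammar rejects anything whose value is Neither. Illustrative only.

TRUE, FALSE, NEITHER = "True", "False", "Neither"

def eval_expr(expr):
    """Evaluate a tiny expression language. Atoms are truth values;
    'liar' models a self-referential sentence with no classical value;
    compound forms are tuples ('not', e) and ('and', a, b)."""
    if expr == "liar":
        return NEITHER
    if expr in (TRUE, FALSE):
        return expr
    op = expr[0]
    if op == "not":
        v = eval_expr(expr[1])
        if v == NEITHER:
            return NEITHER
        return FALSE if v == TRUE else TRUE
    if op == "and":
        a, b = eval_expr(expr[1]), eval_expr(expr[2])
        if NEITHER in (a, b):
            return NEITHER
        return TRUE if a == b == TRUE else FALSE
    raise ValueError(f"unknown operator: {op}")

def well_formed(expr):
    """Reject (return False) any expression that evaluates to Neither."""
    return eval_expr(expr) != NEITHER
```

On this sketch, `well_formed("liar")` is rejected even though the expression is syntactically built from the grammar, which is the extension of well-formedness the abstract describes.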
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology greatly benefits application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this project we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase® are standardized in a framework of first-order logic. In this paper we describe how this standardization has already led to an improvement in the LinKBase® structure that allows for a greater degree of internal coherence than ever before possible. We then show the use of this philosophical standardization for the purpose of mapping external databases to one another, using LinKBase® as a translation hub, with a greater degree of success than hitherto possible. We demonstrate how this offers a genuine advance over other application ontologies that have not submitted themselves to the demands of philosophical scrutiny. LinKBase® is one of the world’s largest applications-oriented medical domain ontologies, and BFO is one of the world’s first philosophically driven reference ontologies. The collaboration of the two thus initiates a new phase in the quest to solve the so-called “Tower of Babel” problem.
3rd ed., 2021. A circumscription of the classical theory of computation, building up from the Chomsky hierarchy, with the usual topics in formal language and automata theory.
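The lowest level of the Chomsky hierarchy mentioned here, the regular languages, is recognized by deterministic finite automata. The following is a generic textbook sketch, not tied to this book's notation; the state names and example language are chosen here for illustration.

```python
# Minimal deterministic finite automaton (DFA), the machine class for
# regular languages at the bottom of the Chomsky hierarchy.

def make_dfa(alphabet, delta, start, accepting):
    """Build a recognizer from a transition table delta mapping
    (state, symbol) pairs to successor states."""
    def accepts(word):
        q = start
        for symbol in word:
            if symbol not in alphabet:
                return False  # symbols outside the alphabet are rejected
            q = delta[(q, symbol)]
        return q in accepting
    return accepts

# Example: binary strings containing an even number of 1s.
even_ones = make_dfa(
    alphabet={"0", "1"},
    delta={("even", "0"): "even", ("even", "1"): "odd",
           ("odd", "0"): "odd", ("odd", "1"): "even"},
    start="even",
    accepting={"even"},
)
```

Context-free and higher levels of the hierarchy require strictly more powerful machines (pushdown automata, Turing machines), which is the progression such a circumscription typically follows.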
Chemistry is by nature an experimental science: it works in the laboratory on substances, analyzing and synthesizing them, and it erects its systematic structures guided by definite rules that govern the procedures and results of experimental inquiry. Like any other scientific activity, chemical practice requires a special sub-language that describes its experimental constructions and standardizes their forms. Since analysis and synthesis, as experimental procedures, are the twin pillars and the very core of chemical research, the language of chemistry must contain formal representations that can themselves be described as analytic and synthetic formulas or expressions. We may therefore claim that there is a relation of mutual dependence between the language of chemistry and its laboratory practices: the language directly affects, in more than one way, the course and development of chemical research, and through it one can gauge that research’s strength or weakness, its progress or backwardness, and the logical legitimacy or illegitimacy of its predictions; conversely, the methods of laboratory research in chemistry are necessarily reflected in the rules of formal notation of this language, and in the extent to which it can be developed and purified of the residues of the ordinary language from which it was derived. In other words, language and experiment are two complementary halves, two faces of a single coin called scientific practice. Does the language of chemistry differ from the other special languages of science? How do chemists use their language? What rules govern it, and what are the consequences of its use for chemistry as a whole?
In the era of “big data,” science is increasingly information driven, and the potential for computers to store, manage, and integrate massive amounts of data has given rise to such new disciplinary fields as biomedical informatics. Applied ontology offers a strategy for the organization of scientific information in computer-tractable form, drawing on concepts not only from computer and information science but also from linguistics, logic, and philosophy. This book provides an introduction to the field of applied ontology that is of particular relevance to biomedicine, covering theoretical components of ontologies, best practices for ontology design, and examples of biomedical ontologies in use. After defining an ontology as a representation of the types of entities in a given domain, the book distinguishes between different kinds of ontologies and taxonomies, and shows how applied ontology draws on more traditional ideas from metaphysics. It presents the core features of the Basic Formal Ontology (BFO), now used by over one hundred ontology projects around the world, and offers examples of domain ontologies that utilize BFO. The book also describes Web Ontology Language (OWL), a common framework for Semantic Web technologies. Throughout, the book provides concrete recommendations for the design and construction of domain ontologies.
The discussions which follow rest on a distinction, first expounded by Husserl, between formal logic and formal ontology. The former concerns itself with (formal) meaning-structures; the latter with formal structures amongst objects and their parts. The paper attempts to show how, when formal ontological considerations are brought into play, contemporary extensionalist theories of part and whole, and above all the mereology of Leśniewski, can be generalised to embrace not only relations between concrete objects and object-pieces, but also relations between what we shall call dependent parts or moments. A two-dimensional formal language is canvassed for the resultant ontological theory, a language which owes more to the tradition of Euler, Boole and Venn than to the quantifier-centred languages which have predominated amongst analytic philosophers since the time of Frege and Russell. Analytic philosophical arguments against moments, and against the entire project of a formal ontology, are considered and rejected. The paper concludes with a brief account of some applications of the theory presented.
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the ‘logicality of language’, accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is ‘blind’ to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical, indeed quite exotic, kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-à-vis its open class terms and employs a deductive system that is basically classical.
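The notion of "marking expressions with trivial truth-conditions" can be made concrete with a brute-force triviality checker over truth tables. This is a hypothetical illustration of the idea only, not the deductive system the paper proposes; the function names and representation of formulas as Python callables are assumptions made here for the sketch.

```python
# Toy triviality checker: classifies a propositional formula as a
# tautology (trivially true), contradiction (trivially false), or
# contingent, by exhaustively enumerating valuations.

from itertools import product

def classify(formula, atoms):
    """formula: a function from a valuation dict {atom: bool} to bool.
    atoms: the list of atomic sentence letters the formula uses."""
    values = [formula(dict(zip(atoms, row)))
              for row in product([True, False], repeat=len(atoms))]
    if all(values):
        return "tautology"       # trivial truth-conditions: flagged
    if not any(values):
        return "contradiction"   # trivial truth-conditions: flagged
    return "contingent"
```

On the logicality hypothesis, formulas classified as "tautology" or "contradiction" at logical form would be marked unacceptable; the paper's point concerns which notion of logical form feeds such a check, not the check itself.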
There are many different ways to talk about the world. Some ways of talking are more expressive than others—that is, they enable us to say more things about the world. But what exactly does this mean? When is one language able to express more about the world than another? In my dissertation, I systematically investigate different ways of answering this question and develop a formal theory of expressive power, translation, and notational variance. In doing so, I show how these investigations help to clarify the role that expressive power plays within debates in metaphysics, logic, and the philosophy of language.
Formal thought disorder (FTD) is a clinical mental condition that is typically diagnosable by the speech productions of patients. However, this has been a vexing condition for the clinical community, as it is not at all easy to determine what “formal” means in the plethora of symptoms exhibited. We present a logic-based model for the syntax–semantics interface in semantic networking that can not only explain, but also diagnose, FTD. Our model is based on description logic (DL), which is well known for its adequacy to model terminological knowledge. More specifically, we show how faulty logical form as defined in DL-based Conception Language (CL) impacts the semantic content of linguistic productions that are characteristic of FTD. We accordingly call this the dyssyntax model.
A Formal Model of Metaphor in Frame Semantics. Vasil Penchev - 2015 - In Proceedings of the 41st Annual Convention of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour. New York: Curran Associates, Inc., pp. 187-194.
A formal model of metaphor is introduced. It models metaphor, first, as an interaction of “frames” according to frame semantics, and then as a wave function in Hilbert space. The practical way for a probability distribution and a corresponding wave function to be assigned to a given metaphor in a given language is considered. A series of formal definitions is deduced from this for: “representation”, “reality”, “language”, “ontology”, etc. All are based on Hilbert space. A few statements about a quantum computer are implied: the so-defined reality is inherent and internal to it; it can report a result only “metaphorically”; transmitting the result “literally”, i.e., absolutely exactly, would demolish it. A new and different formal definition of metaphor is introduced as a few entangled wave functions corresponding to different “signs” in different languages formally defined as above. The change of frames, as the change from the one formal definition of metaphor to the other, is interpreted as a formal definition of thought. Four areas of cognition are unified as different but isomorphic interpretations of the mathematical model based on Hilbert space. These are: quantum mechanics, frame semantics, formal semantics by means of a quantum computer, and the theory of metaphor in linguistics.
Fodor and Pylyshyn's critique of connectionism has posed a challenge to connectionists: adequately explain such nomological regularities as systematicity and productivity without postulating a "language of thought" (LOT). Some connectionists, like Smolensky, took the challenge very seriously and attempted to meet it by developing models that were supposed to be non-classical. At the core of these attempts lies the claim that connectionist models can provide a representational system with a combinatorial syntax and processes sensitive to syntactic structure. They are not implementation models because, it is claimed, the way they obtain syntax and structure sensitivity is not "concatenative," hence "radically different" from the way classicists handle them. In this paper, I offer an analysis of what it is to physically satisfy/realize a formal system. In this context, I examine the minimal truth-conditions of the LOT Hypothesis. From my analysis it will follow that concatenative realization of formal systems is irrelevant to LOTH, since the very notion of LOT is indifferent to such an implementation-level issue as concatenation. I will conclude that, to the extent to which they can explain the law-like cognitive regularities, a certain class of connectionist models proposed as radical alternatives to the classical LOT paradigm will in fact turn out to be LOT models, even though new and potentially very exciting ones.
A preliminary statement of the formal theory of the truthmaker relation advanced in the paper “Truth-makers” (Mulligan, Simons and Smith) in 1984. Correspondence theories of truth have. I give a brief account of some more or less obvious formal characteristics of this almost forgotten basic truthmaker relation. I then attempt to show how this account may be extended to provide elements of a theory of truth which is in keeping with the spirit of Wittgenstein’s Tractatus.
In the first part, I defend the claim that formal semantics can be used as a guide to ontological commitment. Thus, if one endorses an ontological view \(O\) and wants to interpret a formal language \(L\), a thorough understanding of the relation between semantics and ontology will help us to construct a semantics for \(L\) in such a way that its ontological commitment will be in perfect accordance with \(O\). Basically, that is what I call constructing formal semantics from an ontological perspective. In the rest of the paper, I develop such a method rigorously and put it into practice, especially concerning the interpretation of second-order quantification. I will define the notion of an ontological framework: a set-theoretical structure from which one can construct semantics whose ontological commitments correspond exactly to a given ontological view. I will define five ontological frameworks corresponding respectively to: (i) predicate nominalism, (ii) resemblance nominalism, (iii) Armstrongian realism, (iv) Platonic realism, and (v) tropism. From those different frameworks, I will construct different semantics for first-order and second-order languages. Notably, I will present different kinds of nominalist semantics for second-order languages, showing thus that we can perfectly well quantify over properties and relations while being ontologically committed only to individuals. I will show to what extent those semantics differ from each other; this will make clear how the disagreements between the ontological views extend from ontology to logic, and thus why endorsing an ontological view should have an impact on the kind of logic one should use.
Revised version of chapter in J. N. Mohanty and W. McKenna (eds.), Husserl’s Phenomenology: A Textbook, Lanham: University Press of America, 1989, 29–67. Logic for Husserl is a science of science, a science of what all sciences have in common in their modes of validation. Thus logic deals with universal laws relating to truth, to deduction, to verification and falsification, and with laws relating to theory as such, and to what makes for theoretical unity, both on the side of the propositions of a theory and on the side of the domain of objects to which these propositions refer. This essay presents a systematic overview of Husserl’s views on these matters as put forward in his Logical Investigations. It shows how Husserl’s theory of linguistic meanings as species of mental acts, his formal ontology of part, whole and dependence, his theory of meaning categories, and his theory of categorial intuition combine with his theory of science to form a single whole. Finally, it explores the ways in which Husserl’s ideas on these matters can be put to use in solving problems in the philosophy of language, logic and mathematics in a way which does justice to the role of mental activity in each of these domains while at the same time avoiding the pitfalls of psychologism.
As conceived by analytic philosophers ontology consists in the application of the methods of mathematical logic to the analysis of ontological discourse. As conceived by realist philosophers such as Meinong and the early Husserl, Reinach and Ingarden, it consists in the investigation of the forms of entities of various types. The suggestion is that formal methods be employed by phenomenological ontologists, and that phenomenological insights may contribute to the construction of adequate formal-ontological languages. The paper sketches an account of what might be involved in this new discipline, an account which is illustrated in application to the formal-ontological problems raised by negative states of affairs.
A generative grammar for a language L generates one or more syntactic structures for each sentence of L and interprets those structures both phonologically and semantically. A widely accepted assumption in generative linguistics dating from the mid-60s, the Generative Grammar Hypothesis (GGH), is that the ability of a speaker to understand sentences of her language requires her to have tacit knowledge of a generative grammar of it, and the task of linguistic semantics in those early days was taken to be that of specifying the form that the semantic component of a generative grammar must take. Then in the 70s linguistic semantics took a curious turn. Without rejecting GGH, linguists turned away from the task of characterizing the semantic component of a generative grammar to pursue instead the Montague-inspired project of providing for natural languages the same kind of model-theoretic semantics that logicians devise for the artificial languages of formal systems of logic, and “formal semantics” continues to dominate semantics in linguistics. This essay argues that the sort of compositional meaning theory that would verify GGH would not only be quite different from the theories formal semanticists construct, but would be a more fundamental theory that supersedes those theories, in that it would explain why they are true when they are true, but their truth wouldn’t explain its truth. Formal semantics has undoubtedly made important contributions to our understanding of such phenomena as anaphora and quantification, but semantics in linguistics is supposed to be the study of meaning.
This means that the formal semanticist can’t be unconcerned that the kind of semantic theory for a natural language that interests her has no place in a theory of linguistic competence; for if GGH is correct, then the more fundamental semantic theory is the compositional meaning theory that is the semantic component of the internally represented generative grammar, and if that is so, then linguistic semantics has so far ignored what really ought to be its primary concern.
This paper formalizes part of the cognitive architecture that Kant develops in the Critique of Pure Reason. The central Kantian notion that we formalize is the rule. As we interpret Kant, a rule is not a declarative conditional stating what would be true if such and such conditions hold. Rather, a Kantian rule is a general procedure, represented by a conditional imperative or permissive, indicating which acts must or may be performed, given certain acts that are already being performed. These acts are not propositions; they do not have truth-values. Our formalization is related to the input/output logics, a family of logics designed to capture relations between elements that need not have truth-values. In this paper, we introduce KL3 as a formalization of Kant’s conception of rules as conditional imperatives and permissives. We explain how it differs from standard input/output logics, geometric logic, and first-order logic, as well as how it translates natural language sentences not well captured by first-order logic. Finally, we show how the various distinctions in Kant’s much-maligned Table of Judgements emerge as the most natural way of dividing up the various types and sub-types of rule in KL3. Our analysis sheds new light on the way in which normative notions play a fundamental role in the conception of logic at the heart of Kant’s theoretical philosophy.
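The input/output logics the paper relates KL3 to pair rule-bodies with rule-heads without assigning truth-values to either. A toy version of one such operation, in the spirit of "reusable output" (rules may be re-applied to acts required by other rules, iterating to a fixed point), can be sketched as follows. The rule and act names are invented here, logical closure of inputs is omitted for simplicity, and this is not KL3 itself.

```python
# Toy sketch in the spirit of reusable output from input/output logic.
# Rules are (body, head) pairs read as conditional imperatives:
# "given that body is being performed, head must be performed".
# Bodies and heads are bare act-labels, not propositions.

def reusable_output(rules, inputs):
    """Return the set of acts required, given the acts in `inputs`
    that are already being performed, iterating rule application
    until a fixed point (a required act may trigger further rules)."""
    required = set()
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body in inputs | required and head not in required:
                required.add(head)
                changed = True
    return required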
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
The paper concentrates on the problem of the adequate reflection of fragments of reality via expressions of language and inter-subjective knowledge about these fragments, called here, in brief, language adequacy. This problem is formulated in several aspects, the most important being the compatibility of language syntax with its bi-level semantics: intensional and extensional. In this paper, various aspects of language adequacy find their logical explication on the ground of the formal-logical theory T of any categorial language L generated by the so-called classical categorial grammar, and also on the ground of its extension to the bi-level, intensional and extensional semantic-pragmatic theory ST for L. In T, according to the token-type distinction of Ch. S. Peirce, L is characterized first as a language of well-formed expression-tokens (wfe-tokens) - material, concrete objects - and then as a language of wfe-types - abstract objects, classes of wfe-tokens. In ST the semantic-pragmatic notions of meaning and interpretation for wfe-types of L of intensional semantics and the notion of denotation of extensional semantics for wfe-types and constituents of knowledge are formalized. These notions allow us to formulate a postulate (an axiom of categorial adequacy) from which follow all the most important conditions of language adequacy, including the above, and a structural one connected with three principles of compositionality.
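Classical categorial grammar of the kind invoked above checks well-formedness by reducing a string of categories to a distinguished sentence category. The following is a generic toy reducer in the Ajdukiewicz/Bar-Hillel style, not the theory T of the paper; the category notation and example lexicon are assumptions made here for illustration.

```python
# Toy classical categorial grammar reducer. Categories are either
# basic ("s", "n") or flat functors: "a/b" combines with a following
# "b" to give "a"; "a\\b" combines with a preceding "b" to give "a".
# Nested functor categories are not handled in this sketch.

def reduce_once(cats):
    """Apply one forward or backward application step, or return None."""
    for i in range(len(cats) - 1):
        left, right = cats[i], cats[i + 1]
        if "/" in left:
            result, arg = left.split("/", 1)
            if arg == right:  # forward application: a/b . b => a
                return cats[:i] + [result] + cats[i + 2:]
        if "\\" in right:
            result, arg = right.split("\\", 1)
            if arg == left:  # backward application: b . a\b => a
                return cats[:i] + [result] + cats[i + 2:]
    return None

def well_formed_sentence(cats):
    """A category string is a sentence iff it reduces to ['s']."""
    while cats and cats != ["s"]:
        cats = reduce_once(cats)
        if cats is None:
            return False
    return cats == ["s"]
```

For example, assigning "John" the category `n` and "sleeps" the category `s\n`, the string reduces to `s` and counts as well-formed, while two bare nouns in a row do not.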
The Monist’s call for papers for this issue ended: “if formalism is true, then it must be possible in principle to mechanize meaning in a conscious thinking and language-using machine; if intentionalism is true, no such project is intelligible”. We use the Grelling-Nelson paradox to show that natural language is indefinitely extensible, which has two important consequences: it cannot be formalized, and model-theoretic semantics, standard for formal languages, is not suitable for it. We also point out that object-object mapping theories of semantics, the usual account of the possibility of non-intentional semantics, do not seem able to account for the indefinitely extensible productivity of natural language.
This special issue of the Logic Journal of the IGPL includes revised and updated versions of the best work presented at the fourth edition of the workshop Formal Approaches to Multi-Agent Systems, FAMAS'09, which took place in Turin, Italy, from 7 to 11 September 2009, under the umbrella of the Multi-Agent Logics, Languages, and Organisations Federated Workshops (MALLOW). Just like its predecessor, the research reported in this FAMAS 2009 special issue is very much inspired by practical concerns. This time the authors of all five selected papers are concerned with knowledge and beliefs in multi-agent settings: How to create a group belief in a fair way from individual plausibility orderings? How to close gaps and resolve ambiguities in a tractable way when information comes from multiple sources? How to reason about a spatial environment? How to compare the strengths of an agent's beliefs in a principled way? How to decide as efficiently as possible whether a given formula concerning group beliefs is valid? These questions and their answers lead to a multi-faceted and at the same time coherent special issue. We concisely introduce the five articles.
Over the last decade, multi-agent systems have come to form one of the key technologies for software development. The Formal Approaches to Multi-Agent Systems (FAMAS) workshop series brings together researchers from the fields of logic, theoretical computer science and multi-agent systems in order to discuss formal techniques for specifying and verifying multi-agent systems. FAMAS addresses the issues of logics for multi-agent systems, formal methods for verification, for example model checking, and formal approaches to cooperation, multi-agent planning, communication, coordination, negotiation, games, and reasoning under uncertainty in a distributed environment. In 2007, the third FAMAS workshop, FAMAS'007, was one of the agent workshops gathered together under the umbrella of Multi-Agent Logics, Languages, and Organisations - Federated Workshops, MALLOW'007, taking place from 3 to 7 September 2007 in Durham. This current special issue of the Logic Journal of the IGPL gathers together the revised and updated versions of the five best FAMAS'007 contributions.
An important part of the Unified Medical Language System (UMLS) is its Semantic Network, consisting of 134 Semantic Types connected to each other by edges formed by one or more of 54 distinct Relation Types. This Network is, however, for many purposes overcomplex, and various groups have thus made attempts at simplification. Here we take this work further by simplifying the relations which involve the three Semantic Types – Diagnostic Procedure, Laboratory Procedure and Therapeutic or Preventive Procedure. We define operators which can be used to generate terms instantiating types from this selected set when applied to terms designating certain other Semantic Types, including almost all the terms specifying clinical tasks. Usage of such operators thus provides a useful and economical way of specifying clinical tasks. The operators allow us to define a mapping between those types within the UMLS which do not represent clinical tasks and those which do. This mapping then provides a basis for an ontology of clinical tasks that can be used in the formulation of computer-interpretable clinical guideline models.
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we show that mbC, a logic of formal inconsistency based on classical logic, may be enhanced in order to express the basic ideas of an intuitive interpretation of contradictions as conflicting evidence.
Hyppolite stresses his proximity to Merleau-Ponty, but the received interpretation of his “anti-humanist” reading of Hegel suggests a greater distance between their projects. This paper focuses on an under-explored dimension of their philosophical relationship. I argue that Merleau-Ponty and Hyppolite are both committed to formulating a mode of philosophical expression that can avoid the pitfalls of purely formal or literal and purely aesthetic or creative modes of expression. Merleau-Ponty’s attempt to navigate this dichotomy, I suggest, closely resembles Hyppolite’s interpretation of Hegel’s “speculative” mode of expression. In particular, his emphasis on the “mediating” character of philosophical language, which moves between descriptive and creative expression, suggests a debt to Hyppolite. This reading provides more evidence to think that Hyppolite cannot be straightforwardly understood as an anti-humanist or post-phenomenological thinker, and paves the way for a _rapprochement_ between his work and the broader phenomenological tradition.
Inquiry into the meaning of logical terms in natural language (‘and’, ‘or’, ‘not’, ‘if’) has generally proceeded along two dimensions. On the one hand, semantic theories aim to predict native speaker intuitions about the natural language sentences involving those logical terms. On the other hand, logical theories explore the formal properties of the translations of those terms into formal languages. Sometimes, these two lines of inquiry appear to be in tension: for instance, our best logical investigation into conditional connectives may show that there is no conditional operator that has all the properties native speaker intuitions suggest ‘if’ has. Indicative conditionals have famously been the source of one such tension, ever since the triviality proofs of both Lewis (1976) and Gibbard (1981) established conclusions which are in prima facie tension with ordinary judgments about natural language indicative conditionals. In a recent series of papers, Branden Fitelson has strengthened both triviality results (Fitelson 2013, 2015, 2016), revealing a common culprit: a logical schema known as IMPORT-EXPORT. Fitelson’s results sharpen the tension between the logical results and ordinary judgments, since IMPORT-EXPORT seems to be supported by intuitions about natural language. In this paper, we argue that the intuitions which have been taken to support IMPORT-EXPORT are really evidence for a closely related, but subtly different, principle. We show that the two principles are independent by showing how, given a standard assumption about the conditional operator in the formal language in which IMPORT-EXPORT is stated, many existing theories of indicative conditionals validate one, but not the other. Moreover, we argue that once we clearly distinguish these principles, we can use propositional anaphora to show that IMPORT-EXPORT is in fact not valid for natural language indicative conditionals (given this assumption about the formal conditional operator).
This gives us a principled and independently motivated way of rejecting a crucial premise in many triviality results, while still making sense of the speaker intuitions which appeared to motivate that premise. We suggest that this strategy has broad application and an important lesson: in theorizing about the logic of natural language, we must pay careful attention to the translation between the formal languages in which logical results are typically proved, and natural languages which are the subject matter of semantic theory.
In his new book, Logical Form, Andrea Iacona distinguishes between two different roles that have been ascribed to the notion of logical form: the logical role and the semantic role. These two roles entail a bifurcation of the notion of logical form. Both notions of logical form, according to Iacona, are descriptive, having to do with different features of natural language sentences. I agree that the notion of logical form bifurcates, but not that the logical role is merely descriptive. In this paper, I focus on formalization, a process by which logical form, in its logical role, is attributed to natural language sentences. According to some, formalization is a form of explication, and it involves normative, pragmatic, as well as creative aspects. I present a view by which formalization involves explicit commitments on behalf of a reasoner or an interpreter, which serve as the normative grounds for the evaluation of a given text. In previous work, I proposed the framework of semantic constraints for the explication of logical consequence. Here, I extend the framework to include formalization constraints. The various constraints then serve the role of commitments. I discuss specific issues raised by Iacona concerning univocality, co-reference and equivocation, and I show how our views on these matters diverge as a result of our different starting assumptions.
The main purpose of the paper is to outline a formal-logical, general theory of language treated as a particular ontological being. The theory itself is called the ontology of language, because it is motivated by the fact that language plays a special role: it reflects ontology, and ontology reflects the world. Language expressions are considered to have a dual ontological status. They are understood either as concretes, that is, tokens – material, physical objects – or as types – classes of tokens, which are abstract objects. Such a duality is taken into account in the presented logical theory of syntax, semantics and pragmatics. We point to the possibility of building it on two different levels: one which stems from concretes, language tokens of expressions, whereas the other stems from their classes, types conceived as abstract, ideal beings. The aim of this work is not only to outline this theory as taking into account the functional approach to language, with respect to the dual ontological nature of its expressions, but also to show that the logic based on it is ontologically neutral in the sense that it abstracts from accepting existential assumptions related to the ontological nature of these linguistic expressions and their extra-linguistic ontological counterparts (objects).
One of the most desirable properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of its deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are not self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics can only be characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
The formal and empirical-generative perspectives of computation are demonstrated to be inadequate to secure the goals of simulation in the social sciences. Simulation does not resemble formal demonstrations or generative mechanisms that deductively explain how certain models are sufficient to generate emergent macrostructures of interest. The description of scientific practice implies additional epistemic conceptions of scientific knowledge. Three kinds of knowledge that account for a comprehensive description of the discipline were identified: formal, empirical and intentional knowledge. The use of formal conceptions of computation for describing simulation is refuted; the roles of programming languages according to intentional accounts of computation are identified; and the roles of iconographic programming languages and aesthetic machines in simulation are characterized. The roles that simulation and intentional decision making may be able to play in a participative information society are also discussed.
The paper develops Lambda Grammars, a form of categorial grammar that, unlike other categorial formalisms, is non-directional. Linguistic signs are represented as sequences of lambda terms and are combined with the help of linear combinators.
The verb to legitimate is often used in political discourse in a way that is prima facie perplexing. To wit, it is often said that an actor legitimates a practice which is officially prohibited in the relevant context – for example, that a worker telling sexist jokes legitimates sex discrimination in the workplace. In order to clarify the meaning of statements like this, and show how they can sometimes be true and informative, we need an explanation of how something that is officially illegitimate can have a kind of ersatz legitimacy conferred on it, and how this can occur even when the actor ‘doing the legitimating’ lacks formal authority. I examine one putative explanation centred around the phenomenon of normalization, and I highlight some advantages that this account has in comparison to an alternative explanation, one that makes reference to the phenomenon of licensed authority.
In recent years the term ‘family language policy’ has begun to circulate in the international sociolinguistics literature (cf. Spolsky 2004, 2007, 2012; King et al. 2008; Caldas 2012; Schwartz & Verschik 2013). From a conceptual standpoint, however, the creation and/or use of this syntagma, applied directly to the language decisions taken by family members to speak to one another, can raise questions about whether one should apply what appears rather to be a framework that pertains to actions arising out of institutionalisation, public debate, and formal decisions to a phenomenon produced ‘spontaneously’. ‘Language policy’, which is also commonly associated with the term ‘planning’, has traditionally evoked the study of actions taken by public authorities at the level of the institutional and social use of languages, of their process of decision-making and implementation, and of any effects on social language behaviours that may ensue. The expansion of this concept to the level of interpersonal uses in families, which corresponds to another sphere involving elements that are distinct from those of the political level or of a formally constituted organisation, can be misleading and conceal phenomena specific to this level of social reality.
J.L. Austin is regarded as having an especially acute ear for fine distinctions of meaning overlooked by other philosophers. Austin employed an informal experimental approach to gathering evidence in support of these fine distinctions in meaning, an approach that has become a standard technique for investigating meaning in both philosophy and linguistics. In this paper, we subject Austin's methods to formal experimental investigation. His methods produce mixed results: we find support for his most famous distinction, drawn on the basis of his ‘donkey stories’, that ‘mistake’ and ‘accident’ apply to different cases, but not for some of his other attempts to distinguish the meanings of philosophically significant terms. We critically examine the methodology of informal experiments employed in ordinary language philosophy and much of contemporary philosophy of language and linguistics, and discuss the role that experimenter bias can play in influencing judgments about informal and formal linguistic experiments.
The main objective of the paper is to provide the conceptual apparatus of a general logical theory of language communication. The aim is to outline a formal-logical theory of language in which the concepts of the phenomenon of language communication and of language communication in general are defined, and some conditions for their adequacy are formulated. The theory explicates the key notions of contemporary syntax, semantics, and pragmatics. The theory is formalized on two levels: the token-level and the type-level. As such, it takes into account the dual – token and type – ontological character of linguistic entities. The basic notions of the theory – language communication, meaning and interpretation – are introduced on the second, type-level of formalization, and their definition requires the prior formalization of some notions introduced on the first, token-level, among others the notion of an act of communication. Owing to the theory, it is possible to address the problems of adequacy both of empirical acts of communication and of language communication in general. All the conditions of adequacy of communication discussed in the paper are valid for one-way communication (sender-recipient); nevertheless, they can also apply to the reverse direction of language communication (recipient-sender). Therefore, they concern the problem of two-way understanding in language communication.
Carnap and Twentieth-Century Thought: Explication as Enlightenment is the first book in the English language that seeks to place Carnap's philosophy in a broad cultural, political and intellectual context. According to the author, Carnap synthesized many different currents of thought and thereby arrived at a novel philosophical perspective that remains strikingly relevant today. Whether the reader agrees with Carus's bold theses on Carnap's place in the landscape of twentieth-century philosophy, and his even bolder claims concerning the role that philosophy in Carnap's style should play in the thought of our century, does not matter so much as the excellent opportunity Carus's book offers to thoroughly rethink one's ideas about Carnap's philosophy. One reason why Carnap and Twentieth-Century Thought might change one's ideas is that Carus has unearthed much hitherto unknown material from the archives that sheds new light on Carnap's early life and thought. Indeed, the many archival findings presented in CTT for the first time suffice to make the book rewarding reading for philosophers and historians of philosophy alike. CTT exhibits a high standard of historical scholarship, and the book itself is a beautiful example of high-quality academic publishing. Up to now, Carnap has remained a controversial figure on the philosophical scene. On the one hand, he has a solid reputation as a leading figure of logical positivism. According to conventional wisdom, this was a school of thought characterized by its formal and technical philosophy, as well as being rather dismissive of other ways of doing philosophy, dogmatically sticking to its own theses. As a typical example of this arrogant logical empiricist attitude, one usually refers to Carnap's notorious Overcoming Metaphysics by Logical Analysis of Language, written when the Vienna Circle's Logical Empiricism had entered its most radical phase.
Self-proclaimed postpositivist philosophers of science dismissed logical positivism, in particular Carnap's, as the dogmatic and orthodox “received view.” The tendency to portray logical empiricism as an obsolete doctrine centering around certain “dogmas” started with Quine's Two Dogmas of Empiricism and reached its somewhat ridiculous culmination in the early 1980s when allegedly “six or seven dogmas” were discovered. Thereby an allegedly unbridgeable gap between classical “dogmatic” logical empiricism and its modern “enlightened” successors was construed.
The integration of information resources in the life sciences is one of the most challenging problems facing bioinformatics today. We describe how Language and Computing nv, originally a developer of ontology-based natural language understanding systems for the healthcare domain, is developing a framework for the integration of structured data with unstructured information contained in natural language texts. L&C’s LinkSuite™ combines the flexibility of a modular software architecture with an ontology based on rigorous philosophical and logical principles that is designed to comprehend the basic formal relationships that structure both reality and the ways humans perceive and communicate about reality.
Here we focus on two questions: What is the proper semantics for deontic modal expressions in English? And what is the connection between true deontic modal statements and normative reasons? Our contribution towards thinking about the first, which makes up the bulk of our paper, considers a representative sample of recent challenges to a Kratzer-style formal semantics for modal expressions, as well as the rival views—Fabrizio Cariani’s contrastivism, John MacFarlane’s relativism, and Mark Schroeder’s ambiguity theory—those challenges are thought to motivate. These include the Professor Procrastinate challenge to Inheritance (the principle that ‘If ought p and p entails q, then ought q’), as well as Parfit’s miners puzzle regarding information-sensitive deontic modals. Here we argue that a Kratzer-style view is able to meet all of the challenges we’ll consider. In addition, we’ll identify challenges for each of those rival views. Our overall conclusion is that a Kratzer-style semantics remains the one to beat. With this conclusion in place, we then ask how we should understand the relationship between true deontic modal statements and normative reasons. Should we hold, for example, that the truth of such a statement entails the existence of a normative reason for some agent to comply? Here we argue that, in many cases, acceptance of Kratzer’s semantics for deontic modals leaves open for substantive normative theorizing the question of whether an agent has a normative reason to comply with what she ought to do.
The paper deals with polymodal languages combined with standard semantics defined by means of some conditions on the frames. A notion of "polymodal base" thus arises which provides various enrichments of the classical modal language. One of these enrichments, viz. the base L(R,−R), with modalities over a relation and over its complement, is the paper's main paradigm. The modal definability (in the spirit of van Benthem's correspondence theory) of arbitrary and ~-elementary classes of frames in this base and in some of its extensions, e.g., L(R,−R,R⁻¹,−R⁻¹), L(R,−R,≠), etc., is described, and numerous examples of conditions definable there, as well as undefinable ones, are adduced.
Traditional views concerning musical meaning in the field of philosophy quite often oscillate around the question of whether music can convey meaning (and, if so, whether it does so by means similar to language). Philosophers have provided a wide range of views: according to some, music has no meaning whatsoever, or if there is any meaning involved, it is only of formal/structural significance. According to opposing views, music can carry meaning similarly to language and, what is more, can sometimes be even richer than language, as in music we are – arguably – able to encode “emotional meanings”. In recent years, several approaches – some speculative – to this old philosophical question have been proposed by evolutionary psychologists, one of the most controversial being Steven Pinker’s famous metaphor of music as “auditory cheesecake”. This anti-adaptationist view has been challenged by, for example, Geoffrey Miller and Ian Cross. In this chapter, I survey some main philosophical views on the titular problem and investigate some evolutionary-paradigm-based proposals for its solution, to examine whether – from both explanatory and methodological standpoints – the philosophy of music could gain something from recent developments in evolutionary psychology.
Formal ontologies are nowadays widely considered a standard tool for knowledge representation and reasoning in the Semantic Web. In this context, they are expected to play an important role in helping automated processes to access information. Namely, they are expected to provide a formal structure able to explicate the relationships between different concepts/terms, thus allowing intelligent agents to correctly interpret the semantics of web resources and improving the performance of search technologies. Here we take into account a problem regarding knowledge representation in general, and ontology-based representations in particular: the fact that knowledge modeling seems to be constrained between conflicting requirements, such as compositionality on the one hand and the need to represent prototypical information on the other. In particular, most common-sense concepts seem not to be captured by the stringent semantics expressed by formalisms such as Description Logics (the formalisms on which the ontology languages have been built). The aim of this work is to analyse this problem and suggest a possible solution suitable for formal ontologies and Semantic Web representations. The questions guiding this research have been: is it possible to provide a formal representational framework which, for the same concept, combines both the classical modelling view (accounting for compositional information) and defeasible, prototypical knowledge? Is it possible to propose a modelling architecture able to provide different types of reasoning (e.g. classical deductive reasoning for the compositional component and non-monotonic reasoning for the prototypical one)?
We suggest a possible answer to these questions by proposing a modelling framework able to represent, within the Semantic Web languages, a multilevel representation of conceptual information, integrating both classical and non-classical (typicality-based) information. Within this framework we hypothesise, at least in principle, the coexistence of multiple reasoning processes involving the different levels of representation.
This paper takes a closer look at the actual semantic behavior of apparent truth predicates in English and re-evaluates the way they could motivate particular philosophical views regarding the formal status of 'truth predicates' and their semantics. The paper distinguishes two types of 'truth predicates' and proposes semantic analyses that better reflect the linguistic facts. These analyses match particular independently motivated philosophical views.