We study definability in terms of monotone generalized quantifiers satisfying Isomorphism Closure, Conservativity and Extension. Among the quantifiers with the latter three properties, here called CE quantifiers, one finds the interpretations of determiner phrases in natural languages. The property of monotonicity is also linguistically ubiquitous, though some determiners, like "an even number of", are highly non-monotone. They are nevertheless definable in terms of monotone CE quantifiers: we give a necessary and sufficient condition for such definability. We further identify (...)
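The properties named above can be made concrete with a small sketch (the encoding and function names are mine, not the paper's): type ⟨1,1⟩ quantifiers are modelled as Boolean functions of a restrictor set A and a scope set B, and CONSERV and right upward monotonicity are checked by brute force over a small universe. The non-monotone but conservative behaviour of "an even number of" then shows up directly.

```python
from itertools import combinations

def subsets(universe):
    """All subsets of a small finite universe."""
    return [set(c) for r in range(len(universe) + 1)
            for c in combinations(sorted(universe), r)]

def conservative(Q, universe):
    """CONSERV: Q(A, B) depends only on A and A & B."""
    S = subsets(universe)
    return all(Q(A, B) == Q(A, A & B) for A in S for B in S)

def right_monotone(Q, universe):
    """Upward right monotonicity: Q(A, B) and B <= B2 imply Q(A, B2)."""
    S = subsets(universe)
    return all(not Q(A, B) or Q(A, B2)
               for A in S for B in S for B2 in S if B <= B2)

def every(A, B):
    return A <= B          # "every A is a B"

def an_even_number_of(A, B):
    return len(A & B) % 2 == 0

U = {1, 2, 3}
print(conservative(every, U), right_monotone(every, U))  # True True
print(conservative(an_even_number_of, U),
      right_monotone(an_even_number_of, U))              # True False
```

Growing B from {} to {1} flips "an even number of" from true to false, which is exactly the failure of upward monotonicity the abstract alludes to.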

Abstract This paper discusses the semantic theory presented in Robert Brandom’s Making It Explicit. I argue that it is best understood as a special version of dynamic semantics, so that this semantics by itself offers an interesting theoretical alternative to more standard truth-conditional theories. This reorientation also has implications for more foundational issues. I argue that it gives us the resources for a renewed argument for the normativity of meaning. The paper ends by critically assessing the view in both (...)

This paper discusses the possibility of modelling inductive inference (Gold 1967) in dynamic epistemic logic (see e.g. van Ditmarsch et al. 2007). The general purpose is to propose a semantic basis for designing a modal logic for learning in the limit. First, we analyze a variety of epistemological notions involved in identification in the limit and match them with traditional epistemic and doxastic logic approaches. Then, we provide a comparison of learning by erasing (Lange et al. 1996) and iterated epistemic (...)

We implement the extension of the logical consequence relation to a partial order ≤ on arbitrary types built from e (entities) and t (Booleans) that was given in [1], and the definition of monotonicity preserving and monotonicity reversing functions in terms of ≤. Next, we present a new algorithm for polarity marking, and implement this for a particular fragment of syntax. Finally, we list the research agenda that these definitions and this algorithm suggest. The implementations use Haskell [8], and are (...)
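The order-theoretic definitions behind polarity marking can be sketched in a few lines (here in Python rather than the paper's Haskell; the names and the restriction to type t are mine): on t take False ≤ True, call a function monotone if it preserves ≤ and antitone if it reverses ≤, and note that composing two reversing functions yields a preserving one, which is the bookkeeping that polarity marking automates.

```python
BOOLS = [False, True]

def leq(x, y):
    """The order on type t: False <= True."""
    return (not x) or y

def monotone(f):
    """f preserves the order (polarity +)."""
    return all(not leq(x, y) or leq(f(x), f(y)) for x in BOOLS for y in BOOLS)

def antitone(f):
    """f reverses the order (polarity -)."""
    return all(not leq(x, y) or leq(f(y), f(x)) for x in BOOLS for y in BOOLS)

ident = lambda x: x
neg = lambda x: not x

print(monotone(ident), antitone(neg))   # True True
# Two polarity-reversing functions compose to a preserving one:
print(monotone(lambda x: neg(neg(x))))  # True
```

At higher types the same definitions lift pointwise, which is how the ≤ order of [1] extends from t to arbitrary types over e and t.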



Epistemic modals are a prominent topic in the literature on natural language semantics, with wide-ranging implications for issues in philosophy of language and philosophical logic. Considerations about the role that epistemic "might" and "must" play in discourse and reasoning have led to the development of several important alternatives to classical possible worlds semantics for natural language modal expressions. This is an opinionated overview of what I take to be some of the most exciting issues and developments in the field.

One well-known problem regarding quantifiers, in particular the 1st-order quantifiers, is connected with their syntactic categories and denotations. The unsatisfactory efforts to establish the syntactic and ontological categories of quantifiers in formalized first-order languages can be solved by means of the so-called principle of categorial compatibility formulated by Roman Suszko, referring to some innovative ideas of Gottlob Frege and visible in syntactic and semantic compatibility of language expressions. In the paper the principle is introduced for categorial languages generated (...)

A logic is called higher order if it allows for quantification over higher order objects, such as functions of individuals, relations between individuals, functions of functions, relations between functions, etc. Higher order logic began with Frege, was formalized in Russell [46] and Whitehead and Russell [52] early in the previous century, and received its canonical formulation in Church [14]. While classical type theory has long since been overshadowed by set theory as a foundation of mathematics, recent decades have shown remarkable (...)

In this paper, I shall explore a determiner in natural language which is ambivalent as to whether it should be classified as quantificational or object-denoting: the determiner both. Both in many ways appears to be a paradigmatic quantifier; and yet, I shall argue, it can be interpreted as having an individual, an object, as semantic value. To show the significance of this, I shall discuss two ways of thinking about quantifiers. We often think about quantifiers via intuitions about kinds of thoughts. Certain (...)

The paper concentrates on the problem of adequate reflection of fragments of reality via expressions of language and intersubjective knowledge about these fragments, called here, in brief, language adequacy. This problem is formulated in several aspects, the most important being the compatibility of language syntax with its bilevel semantics: intensional and extensional. In this paper, various aspects of language adequacy find their logical explication on the ground of the formal-logical theory T of any categorial language L generated by the so-called classical (...)

The present volume contains a collection of papers presented at the 21st annual meeting “Sinn und Bedeutung” of the Gesellschaft für Semantik, which was held at the University of Edinburgh on September 4th–6th, 2016. The Sinn und Bedeutung conferences are among the leading international venues for research in formal semantics.

The paper provides a compositional account of cumulative readings with non-increasing modified numerals (aka van Benthem's puzzle), for example, "Exactly 3 boys saw exactly 5 movies". The main proposal is that modified numerals make two kinds of semantic contributions. Their asserted/at-issue contribution is a maximization operator that introduces the maximal set of entities that satisfies their restrictor and nuclear scope. The second contribution is a post-supposition, that is, a cardinality constraint that needs to be satisfied relative to the context that (...)



We provide a computational model of semantic alignment among communicating agents constrained by social and cognitive pressures. We use our model to analyze the effects of social stratification and a local transmission bottleneck on the coordination of meaning in isolated dyads. The analysis suggests that the traditional approach to learning—understood as inferring prescribed meaning from observations—can be viewed as a special case of semantic alignment, manifesting itself in the behaviour of socially imbalanced dyads put under mild pressure of a local (...) 

We examine the verification of simple quantifiers in natural language from a computational model perspective. We refer to previous neuropsychological investigations of the same problem and suggest extending their experimental setting. Moreover, we give some direct empirical evidence linking computational complexity predictions with cognitive reality. In the empirical study we compare the time needed for understanding different types of quantifiers. We show that the computational distinction between quantifiers recognized by finite automata and push-down automata is psychologically relevant. Our research improves upon hypothesis and (...)

Natural language sentences that talk about two or more sets of entities can be assigned various readings. The ones in which the sets are independent of one another are particularly challenging from the formal point of view. In this paper we will call them ‘Independent Set (IS) readings’. Cumulative and collective readings are paradigmatic examples of IS readings. Most approaches aiming at representing the meaning of IS readings implement some kind of maximality conditions on the witness sets involved. Two kinds (...) 



This article points out problems in current dynamic treatments of anaphora and provides a new account that solves these by grafting Muskens' Compositional Discourse Representation Theory onto a partial theory of types. Partiality is exploited to keep track of which discourse referents have been introduced in the text (thus avoiding the overwrite problem) and to account for cases of anaphoric failure. Another key assumption is that the set of discourse referents is well-ordered, so that we can keep track of the (...)

In the paper, an original formal-logical conception of the syntactic and semantic, intensional and extensional, senses of expressions of any language L is outlined. Syntax and bilevel intensional and extensional semantics of language L are characterized categorially: in the spirit of some of Husserl's ideas of pure grammar, the Leśniewski-Ajdukiewicz theory of syntactic/semantic categories, and, in accordance with Frege's ontological canons, Bocheński's famous motto that syntax mirrors ontology, and some ideas of Suszko: language should be a linguistic scheme of ontological reality and simultaneously a tool of its (...)

The article studies two related issues. First, it introduces the notion of the contraposition of quantifiers, which is a "dual" notion of symmetry and bears the same relation to co-intersectivity as symmetry bears to intersectivity. Second, it shows how symmetry and contraposition can be generalised to higher-order type quantifiers, while preserving their relations with other notions from generalized quantifier theory.
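The two notions can be illustrated with a brute-force check over a small universe (the formulation below is my reconstruction, not the article's definitions): Q is symmetric if Q(A, B) iff Q(B, A), and Q contraposes if Q(A, B) iff Q(M − B, M − A) for the universe M. "Some" is symmetric (and intersective), while "every" contraposes (and is co-intersective), mirroring "all A are B" iff "all non-B are non-A".

```python
from itertools import combinations

M = {1, 2, 3}
SUBSETS = [set(c) for r in range(len(M) + 1) for c in combinations(sorted(M), r)]

some = lambda A, B: bool(A & B)    # A and B overlap
every = lambda A, B: A <= B        # A is contained in B

def symmetric(Q):
    """Q(A, B) iff Q(B, A), for all A, B over M."""
    return all(Q(A, B) == Q(B, A) for A in SUBSETS for B in SUBSETS)

def contraposes(Q):
    """Q(A, B) iff Q(M - B, M - A), for all A, B over M."""
    return all(Q(A, B) == Q(M - B, M - A) for A in SUBSETS for B in SUBSETS)

print(symmetric(some), contraposes(some))    # True False
print(symmetric(every), contraposes(every))  # False True
```

The failure of contraposition for "some" (take A = B = M: the complements are empty) is what makes the two properties genuinely dual rather than equivalent.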

Some dynamic semantic theories include an attempt to derive truth-conditional meaning from context change potential. This implies defining truth in terms of context change. Focusing on presuppositions and epistemic modals, this paper points out some problems with how this project has been carried out. It then suggests a way of overcoming these problems. This involves appealing to a richer notion of context than the one found in standard dynamic systems.



This paper discusses some formal properties of trivalent approaches to presupposition projection, and in particular of the middle Kleene system of Peters (1977) and Krahmer (1998). After exploring the relationship between trivalent truth-functional accounts and dynamic accounts in the tradition of Heim (1983), I show how the middle Kleene trivalent account can be formulated in a way which shows that it meets the explanatory challenge of Schlenker (2006, 2008a,b), and provide some results relating to the application of the middle Kleene (...)

The paper gives a survey of known results related to computational devices (finite and push-down automata) recognizing monadic generalized quantifiers in finite models. Some of these results are simple reinterpretations of descriptive-feasible correspondence theorems from finite-model theory. Additionally, a new result characterizing monadic quantifiers recognized by push-down automata is proven.

This paper is concerned with De Morgan's explanation of the validity of arguments that involve relational notions. It discusses De Morgan's expansion of traditional logic aimed at accommodating those inferences, and makes the point that his endeavour is not successful, in that the rules that make up his new logic are not sound. Nevertheless, and contrary to the most important scholarly work on De Morgan's logic, De Morgan's mistake is not beyond repair. The rules that determine his new logic (...)

The Lambek-Grishin calculus is a symmetric version of categorial grammar obtained by augmenting the standard inventory of type-forming operations (product and residual left and right division) with a dual family: coproduct, left and right difference. Interaction between these two families is provided by distributivity laws. These distributivity laws have pleasant invariance properties: stability of interpretations for the Curry-Howard derivational semantics, and structure-preservation at the syntactic end. The move to symmetry thus offers novel ways of reconciling the demands of natural language (...)

This work explores the hypothesis that natural language is a tool for changing a language user's state of mind and, more specifically, the hypothesis that a sentence's meaning is constituted by its characteristic role in fulfilling this purpose. This view contrasts with the dominant approach to semantics due to Frege, Tarski and others' work on artificial languages: language is first and foremost a tool for representing the world. Adapted to natural language by Davidson, Lewis, Montague, et al., this dominant approach (...)



Logic has its roots in the study of valid argument, but while traditional logicians worked with natural language directly, modern approaches first translate natural arguments into an artificial language. The reason for this step is that some artificial languages now have very well developed inferential systems. There is no doubt that this is a great advantage in general, but for the study of natural reasoning it is a drawback that the original linguistic forms get lost in translation. An alternative approach (...) 

In this paper we discuss a new perspective on the syntax-semantics interface. Semantics, in this new setup, is not ‘read off’ from Logical Forms as in mainstream approaches to generative grammar. Nor is it assigned to syntactic proofs using a Curry-Howard correspondence as in versions of the Lambek Calculus, or read off from f-structures using Linear Logic as in Lexical-Functional Grammar (LFG, Kaplan & Bresnan [9]). All such approaches are based on the idea that syntactic objects (trees, proofs, f-structures) are (...)



We overview logical and computational explanations of the notion of tractability as applied in cognitive science. We start by introducing the basics of mathematical theories of complexity: computability theory, computational complexity theory, and descriptive complexity theory. Computational philosophy of mind often identifies mental algorithms with computable functions. However, with the development of programming practice it has become apparent that for some computable problems finding effective algorithms is hardly possible. Some problems need too many computational resources, e.g., time or memory, to (...)



The semantic automata framework, developed originally in the 1980s, provides computational interpretations of generalized quantifiers. While recent experimental results have associated structural features of these automata with neuroanatomical demands in processing sentences with quantifiers, the theoretical framework has remained largely unexplored. In this paper, after presenting some classic results on semantic automata in a modern style, we present the first application of semantic automata to polyadic quantification, exhibiting automata for iterated quantifiers. We also discuss the role of semantic automata in (...) 

We report two experiments which tested whether cognitive capacities are limited to those functions that are computationally tractable (PTIME-Cognition Hypothesis). In particular, we investigated the semantic processing of reciprocal sentences with generalized quantifiers, i.e., sentences of the form "Q dots are directly connected to each other", where Q stands for a generalized quantifier, e.g. all or most. Sentences of this type are notoriously ambiguous and it has been claimed in the semantic literature that the logically strongest reading is preferred (Strongest (...)

ABSTRACT George Gargov was an active pioneer in the ‘Sofia School’ of modal logicians. Starting in the 1970s, he and his colleagues expanded the scope of the subject by introducing new modal expressive power, of various innovative kinds. The aim of this paper is to show some general patterns behind such extensions, and review some very general results that we know by now, 20 years later. We concentrate on simulation invariance, decidability, and correspondence. What seems clear is that ‘modal logic’ (...) 

ABSTRACT This paper gives a survey of known results related to computational devices recognising monadic generalised quantifiers in finite models. Some of these results are simple reinterpretations of descriptive-feasible correspondence theorems from finite-model theory. Additionally, a new result characterizing monadic quantifiers recognized by push-down automata is proven.

We give derivations of two formal models of Gricean Quantity implicature and strong exhaustivity in bidirectional optimality theory and in a signalling games framework. We show that, under a unifying model based on signalling games, these interpretative strategies are game-theoretic equilibria when the speaker is known to be respectively minimally and maximally expert in the matter at hand. That is, in this framework the optimal strategy for communication depends on the degree of knowledge the speaker is known to have concerning (...)

Generalized quantifiers are functions from pairs of properties to truth-values; these functions can be used to interpret natural language quantifiers. The space of such functions is vast and a great deal of research has sought to find natural constraints on the functions that interpret determiners and create quantifiers. These constraints have demonstrated that quantifiers rest on number and number sense. In the first part of the paper, we turn to developing this argument. In the remainder, we report on work in (...)
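The "quantifiers rest on number" point has a standard concrete form, which can be sketched as follows (the function names are mine): for quantifiers satisfying Conservativity, Extension and Isomorphism Closure, the truth value depends only on the two cardinalities |A − B| and |A ∩ B|, so each determiner denotation reduces to a binary relation on natural numbers.

```python
# Determiner denotations as relations on the pair (|A - B|, |A & B|):

def most(n_a_minus_b, n_a_and_b):
    """More As are Bs than are not."""
    return n_a_and_b > n_a_minus_b

def at_least_two(n_a_minus_b, n_a_and_b):
    """At least two As are Bs (ignores |A - B| entirely)."""
    return n_a_and_b >= 2

def evaluate(Q, A, B):
    """Apply a number-based quantifier to concrete sets."""
    return Q(len(A - B), len(A & B))

print(evaluate(most, {1, 2, 3}, {2, 3}))    # True  (2 > 1)
print(evaluate(at_least_two, {1, 2}, {2}))  # False (only one A is a B)
```

This numerical reduction is what connects determiner meanings to counting and the number sense discussed in the paper.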