The epistemic modal auxiliaries must and might are vehicles for expressing the force with which a proposition follows from some body of evidence or information. Standard approaches model these operators using quantificational modal logic, but probabilistic approaches are becoming increasingly influential. According to a traditional view, must is a maximally strong epistemic operator and might is a bare possibility one. A competing account, popular amongst proponents of a probabilistic turn, says that, given a body of evidence, must φ entails that Pr(φ) is high but non-maximal and might φ that Pr(φ) is significantly greater than 0. Drawing on several observations concerning the behavior of must, might and similar epistemic operators in evidential contexts, deductive inferences, downplaying and retraction scenarios, and expressions of epistemic tension, I argue that those two influential accounts have systematic descriptive shortcomings. To better make sense of their complex behavior, I propose instead a broadly Kratzerian account according to which must φ entails that Pr(φ) = 1 and might φ that Pr(φ) > 0, given a body of evidence and a set of normality assumptions about the world. From this perspective, must and might are vehicles for expressing a common mode of reasoning whereby we draw inferences from specific bits of evidence against a rich set of background assumptions, some of which we represent as defeasible, which capture our general expectations about the world. I will show that the predictions of this Kratzerian account can be substantially refined once it is combined with a specific yet independently motivated 'grammatical' approach to the computation of scalar implicatures. Finally, I discuss some implications of these results for more general discussions concerning the empirical and theoretical motivation to adopt a probabilistic semantic framework.
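The contrast between the two accounts just summarized can be made concrete with a small sketch. The function names and the 0.9/0.1 thresholds below are illustrative assumptions, not the paper's formalism; the point is only how the entries come apart at the extremes.

```python
# Hypothetical sketch of the two accounts of "must"/"might".
# pr is the probability of the prejacent given the evidence.

def must_threshold(pr, high=0.9):
    return high <= pr < 1.0   # probabilistic account: high but non-maximal

def might_threshold(pr, low=0.1):
    return pr > low           # significantly greater than 0

def must_kratzerian(pr):
    return pr == 1.0          # Pr(phi) = 1, given evidence plus normality assumptions

def might_kratzerian(pr):
    return pr > 0.0           # Pr(phi) > 0
```

On this sketch, must φ with Pr(φ) = 1 is true on the Kratzerian entry but false on the threshold entry, and might φ with a tiny positive probability is true only on the Kratzerian entry, which is one way the accounts make divergent predictions.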
In my article, I present a new version of a probabilistic truth-prescribing semantics for natural language indicative conditionals. The proposed truth conditions can be paraphrased as follows: an indicative conditional is true if the corresponding conditional probability is high and the antecedent is positively probabilistically relevant for the consequent, or the probability of the antecedent of the conditional equals 0. In the paper, the truth conditions are defended and some of the logical properties of the proposed semantics are described.
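The paraphrased truth conditions can be sketched directly. In the snippet below, a probability space is represented as a dict of world weights and "high" is fixed at 0.9; both choices are our own illustration, not the article's formalism.

```python
# Minimal sketch of the stated truth conditions, assuming a 0.9 threshold
# for "high" and worlds represented as keys of a weight dict.

def prob(p, event):
    """Total probability of the set of worlds `event`."""
    return sum(w for world, w in p.items() if world in event)

def indicative_true(p, antecedent, consequent, threshold=0.9):
    pa = prob(p, antecedent)
    if pa == 0:
        return True  # vacuous case: probability of the antecedent equals 0
    cond = prob(p, antecedent & consequent) / pa          # P(C | A)
    relevant = cond > prob(p, consequent)                 # positive relevance
    return cond >= threshold and relevant
```

For example, with p = {"w1": 0.5, "w2": 0.4, "w3": 0.1}, the conditional with antecedent {"w1", "w2"} and consequent {"w1", "w2"} comes out true (P(C|A) = 1 > P(C) = 0.9), while swapping the consequent for {"w3"} makes it false.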
We investigate a basic probabilistic dynamic semantics for a fragment containing conditionals, probability operators, modals, and attitude verbs, with the aim of shedding light on the prospects for adding probabilistic structure to models of the conversational common ground.
Quantum mechanics admits a "linguistic interpretation" if one preliminarily equates any quantum state of some entity, whether a quantum entity or a word, with a wave function interpretable as an element of the separable complex Hilbert space. All possible Feynman pathways can link to each other any two semantic units, such as words or terms in any theory. Then causal reasoning would correspond to the case of classical mechanics (a single trajectory, in which any next point is causally conditioned), and probabilistic reasoning to the case of quantum mechanics (many Feynman trajectories). Frame semantics turns out to be the natural counterpart of that linguistic interpretation of quantum mechanics.
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. While the systematicity and productivity of language provide a strong argument in favor of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. Additionally, the principle of semantic compositionality is underspecified, which means that notions of both "strong" and "weak" compositionality appear in the literature. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. In addition, we suggest that the distinction between these is contextually sensitive. Compositionality is equated with a joint probability distribution modeling how the constituent concepts in the combination are interpreted. Marginal selectivity is introduced as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities. Non-compositionality is equated with a failure of marginal selectivity, or violation of either system of inequalities in the presence of marginal selectivity. This means that the conceptual combination cannot be modeled in a joint probability distribution, the variables of which correspond to how the constituent concepts are being interpreted. The formal analysis methods are demonstrated by applying them to an empirical illustration of twenty-four non-lexicalised conceptual combinations.
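The CHSH system of inequalities invoked above can be checked mechanically. The sketch below illustrates only the standard CHSH test on four pairwise correlations (the article's full method also requires checking marginal selectivity first, which is not shown here): if any signed sum exceeds the classical bound of 2, no joint probability distribution over the four interpretation variables exists.

```python
def chsh_satisfied(e11, e12, e21, e22, tol=1e-9):
    """Check the four CHSH inequalities |±E11 ± E12 ± E21 ± E22| <= 2,
    where exactly one of the four correlation terms is negated."""
    terms = [e11, e12, e21, e22]
    for i in range(4):
        s = sum(-t if j == i else t for j, t in enumerate(terms))
        if abs(s) > 2 + tol:
            return False
    return True
```

Correlations such as (1, 1, 1, 1) pass, whereas (1, 1, 1, -1) yields a signed sum of 4 and violates the bound, signalling non-compositionality on the framework's reading.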
Formalization of the semantics of generics has been considered extremely challenging because of their inherent vagueness and context-dependence, which hinder a single fixed truth condition. The present study suggests a way to formalize the semantics of generics by constructing flexible acceptance conditions with comparative probabilities. Findings from our in-depth psycholinguistic experiment show that two comparative probabilities (cue validity and prevalence) indeed construct the flexible acceptance conditions for generics in a systematic manner that can be applied to diverse types of generics: acceptability of IS_A relational generics is mostly determined by prevalence, without interaction with cue validity; feature-describing generics with high cue validity are endorsed as acceptable, albeit mediated by prevalence; and acceptability of feature-describing generics with low cue validity is mostly determined by prevalence, irrespective of cue validity. Such systematic patterns indicate a great potential for the formalization of the semantics of generics.
We present an interdisciplinary approach to study systematic relations between logical form and attacks between claims in an argumentative framework. We propose to generalize qualitative attack principles by quantitative ones. Specifically, we use coherent conditional probabilities to evaluate the rationality of principles which govern the strength of argumentative attacks. Finally, we present an experiment which explores the psychological plausibility of selected attack principles.
Moss (2018) argues that rational agents are best thought of not as having degrees of belief in various propositions but as having beliefs in probabilistic contents, or probabilistic beliefs. Probabilistic contents are sets of probability functions. Probabilistic belief states, in turn, are modeled by sets of probabilistic contents, or sets of sets of probability functions. We argue that this Mossean framework is of considerable interest quite independently of its role in Moss’ account of probabilistic knowledge or her semantics for epistemic modals and probability operators. It is an extremely general model of uncertainty. Indeed, it is at least as general and expressively powerful as every other current imprecise probability framework, including lower probabilities, lower previsions, sets of probabilities, sets of desirable gambles, and choice functions. In addition, we partially answer an important question that Moss leaves open, viz., why should rational agents have consistent probabilistic beliefs? We show that an important subclass of Mossean believers avoid Dutch bookability iff they have consistent probabilistic beliefs.
This paper calls for a re-appraisal of McGee's analysis of the semantics, logic and probabilities of indicative conditionals presented in his 1989 paper 'Conditional probabilities and compounds of conditionals'. The probabilistic measures introduced by McGee are given a new axiomatisation built on the principle that the antecedent of a conditional is probabilistically independent of the conditional, and a more transparent method of constructing such measures is provided. McGee's Dutch book argument is restructured to more clearly reveal that it introduces a novel contribution to the epistemology of semantic indeterminacy, and it is shown that its more controversial implications are unavoidable if we want to maintain the Ramsey Test along with the standard laws of probability. Importantly, it is shown that the counterexamples that have been levelled at McGee's analysis (generating a rather wide consensus that it yields 'unintuitive' or 'wrong' probabilities for compounds) fail to strike at their intended target; for to honour the intuitions of the counterexamples one must either give up the Ramsey Test or the standard laws of probability. It will be argued that we need to give up neither if we take the counterexamples as further evidence that the indicative conditional sometimes allows for a non-epistemic 'causal' interpretation alongside its usual epistemic interpretation.
In recent years, a number of theorists have claimed that beliefs about probability are transparent. To believe probably p is simply to have a high credence that p. In this paper, I prove a variety of triviality results for theses like the above. I show that such claims are inconsistent with the thesis that probabilistic modal sentences have propositions or sets of worlds as their meaning. Then I consider the extent to which a dynamic semantics for probabilistic modals can capture theses connecting belief, certainty, credence, and probability. I show that although a dynamic semantics for probabilistic modals does allow one to validate such theses, it can only do so at a cost. I prove that such theses can only be valid if probabilistic modals do not satisfy the axioms of the probability calculus.
Contrastivists view ought-sentences as expressing comparisons among alternatives. Deontic actualists believe that the value of each alternative in such a comparison is determined by what would actually happen if that alternative were to be the case. One of the arguments that motivates actualism is a challenge to the principle of agglomeration over conjunction: the principle according to which if you ought to run and you ought to jump, then you ought to run and jump. I argue that there is no way of developing the actualist insight into a logic that invalidates the agglomeration principle without also invalidating other desirable patterns of inference. After doing this, I extend the analysis to other contrastive views that challenge agglomeration in the way that the actualist does. This motivates skepticism about the actualist's way of challenging agglomeration.
Joyce (1998) gives an argument for probabilism: the doctrine that rational credences should conform to the axioms of probability. In doing so, he provides a distinctive take on how the normative force of probabilism relates to the injunction to believe what is true. But Joyce presupposes that the truth values of the propositions over which credences are defined are classical. I generalize the core of Joyce's argument to remove this presupposition. On the same assumptions as Joyce uses, the credences of a rational agent should always be weighted averages of truth value assignments. In the special case where the truth values are classical, the weighted averages of truth value assignments are exactly the probability functions. But in the more general case, probabilistic axioms formulated in terms of classical logic are violated; we show, however, that generalized versions of the axioms formulated in terms of non-classical logics are satisfied.
The disjunction problem and the distality problem each present a challenge that any theory of mental content must address. Here we consider their bearing on purely probabilistic causal (ppc) theories. In addition to considering these problems separately, we consider a third challenge: that a theory must solve both. We call this "the hard problem." We consider 8 basic ppc theories along with 240 hybrids of them, and show that some can handle the disjunction problem and some can handle the distality problem, but none can handle the hard problem. This is our main result. We then discuss three possible responses to that result, and argue that though the first two fail, the third has some promise.
A fundamental problem in understanding the nature of time is explaining its directionality. This 1990 PhD thesis re-examines the concepts of time flow, the physical directionality of time, and the semantics of tensed language. Several novel results are argued for that contradict the orthodox anti-realist views still dominant in the subject. Specifically, the concept of "metaphysical time flow" is supported as a valid scientific concept, and argued to be intrinsic to the directionality of objective probabilities in quantum mechanics; the common claim that quantum probability theory is time-reversible is shown to be based on an analytic error, stemming from a false choice of the criterion for reversibility of probabilistic theories (recognized by Satosi Watanabe in the 1950s but ignored in all philosophical discussions); and a consistent semantics for tensed language (adapted from the tree model of Storrs McCall) is constructed, showing that the common rejection of "time flow" as having no meaningful semantics is false. These debates are still ongoing in almost exactly the same state they were pre-1990, and there appears to be no visible progress in the subject. Critical points made against errors in the orthodox account (which has been sustained for 70 years by the anti-realist philosophy of time, typified by the "Pittsburgh School" of Grunbaum-Earman-Norton-Roberts) are still not recognized in the philosophy of time or physics. Some key technical proofs in this thesis have been published in physics proper. See p.ii-iii for the full original abstract. (This pdf is uploaded from the Massey University archive.)
Probabilistic theories of “should” and “ought” face a predicament. At first blush, it seems that such theories must provide different lexical entries for the epistemic and the deontic interpretations of these modals. I show that there is a new style of premise semantics that can avoid this consequence in an attractively conservative way.
In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of information for the formulae of these languages and introduce the concept of informational logical consequence, identifying some important results, among them: certain arguments that have traditionally been considered valid, such as modus ponens, are not valid from the informational perspective; the logic underlying informational logical consequence is not classical, and is at the least paraconsistent sensu lato; informational logical consequence is not a Tarskian logical consequence.
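The Shannon quantity of information that serves as the paper's starting point is standardly computed as the negative logarithm of a formula's probability. The snippet below illustrates that generic notion only; it is not the paper's specific informational semantics.

```python
import math

def info(p):
    """Shannon information (surprisal) in bits of a formula whose
    probability is p; rarer formulae carry more information."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)
```

On this measure a tautology (p = 1) carries 0 bits, while a formula with probability 1/4 carries 2 bits, which is the kind of quantity the informational notion of consequence compares between premises and conclusion.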
Recent years have witnessed a proliferation of attempts to apply the mathematical theory of probability to the semantics of natural language probability talk. These sorts of "probabilistic" semantics are often motivated by their ability to explain intuitions about inferences involving "likely" and "probably", intuitions that Angelika Kratzer's canonical semantics fails to accommodate through a semantics based solely on an ordering of worlds and a qualitative ranking of propositions. However, recent work by Wesley Holliday and Thomas Icard has been widely thought to undercut this motivation: they present a world-ordering semantics that yields essentially the same logic as probabilistic semantics. In this paper, I argue that the challenge remains: defenders of world-ordering semantics have yet to offer a plausible semantics that captures the logic of comparative likelihood. Holliday & Icard's semantics yields an adequate logic only if models are restricted to Noetherian pre-orders. But I argue that the Noetherian restriction faces problems in cases involving infinitely large domains of epistemic possibilities. As a result, probabilistic semantics remains the better explanation of the data.
In this paper, new evidence is presented for the assumption that the reason-relation reading of indicative conditionals ('if A, then C') reflects a conventional implicature. In four experiments, it is investigated whether relevance effects found for the probability assessment of indicative conditionals (Skovgaard-Olsen, Singmann, and Klauer, 2016a) can be classified as being produced by a) a conversational implicature, b) a (probabilistic) presupposition failure, or c) a conventional implicature. After considering several alternative hypotheses and the accumulating evidence from other studies as well, we conclude that the evidence is most consistent with the Relevance Effect being the outcome of a conventional implicature. This finding indicates that the reason-relation reading is part of the semantic content of indicative conditionals, albeit not part of their primary truth-conditional content.
We advocate and develop a states-based semantics for both nominal and adjectival confidence reports, as in "Ann is confident/has confidence that it's raining", and their comparatives "Ann is more confident/has more confidence that it's raining than that it's snowing". Other examples of adjectives that can report confidence include "sure" and "certain". Our account adapts Wellwood's account of adjectival comparatives, in which the adjectives denote properties of states and measure functions are introduced compositionally. We further explore the prospects of applying these tools to the semantics of probability operators. We emphasize three desirable and novel features of our semantics: (i) probability claims only exploit qualitative resources unless there is explicit compositional pressure for quantitative resources; (ii) the semantics applies to both probabilistic adjectives (e.g., "likely") and probabilistic nouns (e.g., "probability"); (iii) the semantics can be combined with an account of belief reports that allows thinkers to have incoherent probabilistic beliefs (e.g. thinking that A & B is more likely than A) even while validating the relevant purely probabilistic claims (e.g. validating the claim that A & B is never more likely than A). Finally, we explore the interaction between confidence-reporting discourse (e.g., "I am confident that...") and belief-reports about probabilistic discourse (e.g., "I think it's likely that...").
Epistemic modal operators give rise to something very like, but also very unlike, Moore's paradox. I set out the puzzling phenomena, explain why a standard relational semantics for these operators cannot handle them, and recommend an alternative semantics. A pragmatics appropriate to the semantics is developed, and interactions between the semantics, the pragmatics, and the definition of consequence are investigated. The semantics is then extended to probability operators. Some problems and prospects for probabilistic representations of content and context are explored.
Recent studies indicate that indicative conditionals like "If people wear masks, the spread of Covid-19 will be diminished" require a probabilistic dependency between their antecedents and consequents to be acceptable (Skovgaard-Olsen et al., 2016). But it is easy to slip from this claim to the thesis that indicative conditionals are acceptable only if this probabilistic dependency results from a causal relation between antecedent and consequent. According to Pearl (2009), understanding a causal relation involves multiple, hierarchically organized conceptual dimensions: prediction, intervention, and counterfactual dependence. In a series of experiments, we test the hypothesis that these conceptual dimensions are differentially encoded in indicative and counterfactual conditionals. If this hypothesis holds, then there are limits as to how much of a causal relation is captured by indicative conditionals alone. Our results show that the acceptance of indicative and counterfactual conditionals can become dissociated. Furthermore, it is found that the acceptance of both is needed for accepting a causal relation between two co-occurring events. The implications that these findings have for the hypothesis above, and for recent debates at the intersection of the psychology of reasoning and causal judgment, are critically discussed. Our findings are consistent with viewing indicative conditionals as answering predictive queries requiring evidential relevance (even in the absence of direct causal relations). Counterfactual conditionals, in contrast, target causal relevance specifically. Finally, we discuss the implications our results have for the yet unsolved question of how reasoners succeed in constructing causal models from verbal descriptions.
This paper concerns the semantic difference between strong and weak necessity modals. First we identify a number of explananda: the well-known intuitive difference in strength between 'must' and 'ought', as well as differences in connections to probabilistic considerations and to acts of requiring and recommending. Here we argue that important extant analyses of the semantic differences, though tailored to account for some of these aspects, fail to account for all. We proceed to suggest that the difference between 'ought' and 'must' lies in how they relate to scalar and binary standards. Briefly put, must(φ) says that among the relevant alternatives, φ is selected by the relevant binary standard, whereas ought(φ) says that among the relevant alternatives, φ is selected by the relevant scale. Given independently plausible assumptions about how standards are provided by context, this explains the relevant differences discussed.
Ranking theory is a formal epistemology developed over 600 pages in Spohn's recent book The Laws of Belief, which aims to provide a normative account of the dynamics of beliefs as an alternative to current probabilistic approaches. It has long been received in the AI community, but it has not yet found application in experimental psychology. The purpose of this paper is to derive clear, quantitative predictions by exploiting a parallel between ranking theory and a statistical model called logistic regression. This approach is illustrated by the development of a model for the conditional inference task using Spohn's ranking-theoretic approach to conditionals.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of the combination of prototypical concepts. The proposed logic relies on the logic of typicality ALC + TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC + TR by typicality inclusions of the form p :: T(C) v D, whose intuitive meaning is that "we believe with degree p that typical Cs are Ds". As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as for the underlying standard Description Logic ALC.
This paper motivates and develops a novel semantic framework for deontic modals. The framework is designed to shed light on two things: the relationship between deontic modals and substantive theories of practical rationality, and the interaction of deontic modals with conditionals, epistemic modals and probability operators. I argue that, in order to model inferential connections between deontic modals and probability operators, we need more structure than is provided by classical intensional theories. In particular, we need probabilistic structure that interacts directly with the compositional semantics of deontic modals. However, I reject theories that provide this probabilistic structure by claiming that the semantics of deontic modals is linked to the Bayesian notion of expectation. I offer a probabilistic premise semantics that explains all the data that create trouble for the rival theories.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + TR, based on the notion of rational closure, by inclusions p :: T(C) v D ("we have probability p that typical Cs are Ds"), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete, as in ALC.
Logical empiricism is commonly seen as a counter-position to scientific realism. In the present paper it is shown that there indeed existed a realist faction within the logical empiricist movement. In particular, I shall point out that at least four types of realistic arguments can be distinguished within this faction: Reichenbach's 'probabilistic argument,' Feigl's 'pragmatic argument,' Hempel's 'indispensability argument,' and Kaila's 'invariantist argument.' All these variations of arguments are intended to protect the logical empiricist agenda from the shortcomings of radical positivism, instrumentalism, and other forms of scientific antirealism. On the whole, it will be seen that logical empiricism and scientific realism are essentially compatible with each other. Especially Kaila's invariantist approach to science (and nature) comes quite close to what is nowadays discussed under the label 'structural realism.' This, in turn, necessitates a fundamental reevaluation of Kaila's role in the logical empiricist movement in particular and in twentieth-century philosophy of science in general.
This paper defines the form of prior knowledge that is required for sound inferences by analogy and single-instance generalizations, in both logical and probabilistic reasoning. In the logical case, the first-order determination rule defined in Davies (1985) is shown to solve both the justification and non-redundancy problems for analogical inference. The statistical analogue of determination that is put forward is termed 'uniformity'. Based on the semantics of determination and uniformity, a third notion of 'relevance' is defined, both logically and probabilistically. The statistical relevance of one function in determining another is put forward as a way of defining the value of information: the statistical relevance of a function F to a function G is the absolute value of the change in one's information about the value of G afforded by specifying the value of F. This theory provides normative justifications for conclusions projected by analogy from one case to another, and for generalization from an instance to a rule. The soundness of such conclusions, in either the logical or the probabilistic case, can be identified with the extent to which the corresponding criteria (determination and uniformity) actually hold for the features being related.
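The quantitative notion just described, the change in one's information about G afforded by specifying F, can be illustrated with a small sketch. Reading 'information' as Shannon entropy is our own illustrative assumption; on that reading the quantity coincides with the mutual information I(F; G), estimated here from paired observations of the two functions' values.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def relevance(f_vals, g_vals):
    """Illustrative 'relevance' of F to G: |H(G) - H(G|F)|, the reduction in
    uncertainty about G's value afforded by specifying F's value."""
    n = len(f_vals)
    h_g_given_f = 0.0
    for f in set(f_vals):
        sub = [g for fv, g in zip(f_vals, g_vals) if fv == f]
        h_g_given_f += (len(sub) / n) * entropy(sub)
    return abs(entropy(g_vals) - h_g_given_f)
```

When F determines G the relevance is maximal (specifying F removes all uncertainty about G), and when F and G are independent it is 0, matching the intuition that determination is the limiting case of statistical relevance.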
In this work we present an explainable system for emotion attribution and recommendation, called DEGARI (Dynamic Emotion Generator And ReclassIfier), relying on a recently introduced probabilistic commonsense reasoning framework (the TCL logic, see Lieto & Pozzato 2020), which is based on a human-like procedure for the automatic generation of novel concepts in a Description Logics knowledge base (see also Lieto et al. 2019 and Chiodino et al. 2020 for other applications). In particular, in order to model human-like forms of concept combination, the TCL logic combines a probabilistic description logic of typicality with the HEAD-MODIFIER heuristics coming from cognitive semantics. In the context of our application of this framework, our system exploits the logic TCL to automatically generate novel commonsense semantic representations of compound emotions (e.g. Love as derived from the combination of Joy and Trust, according to Plutchik's theory of emotions; see Plutchik 2001). The generated emotions correspond to prototypes, i.e. commonsense representations of the given concepts, and have been used to reclassify emotion-related content in a variety of artistic domains. We have tested our system in the context of the H2020 EU project SPICE, providing artistic recommendations of museum items in the Galleria di Arte Moderna (GAM) of Turin. The results obtained (reported in Lieto et al. 2021 and Lieto et al., to appear) are promising with respect to both users' acceptance of the provided affective recommendations and the explainability of each suggestion. Here we discuss the open problems and the lessons learned.
In this paper I present a precise version of Stalnaker's thesis and show that it is both consistent and predicts our intuitive judgments about the probabilities of conditionals. The thesis states that someone whose total evidence is E should have the same credence in the proposition expressed by 'if A then B' in a context where E is salient as they have conditional credence in the proposition B expresses given the proposition A expresses in that context. The thesis is formalised rigorously, and two models are provided that demonstrate that the new thesis is indeed tenable within a standard possible world semantics based on selection functions. Unlike the Stalnaker-Lewis semantics, the selection functions cannot be understood in terms of similarity. A probabilistic account of selection is defended in its place. I end the paper by suggesting that this approach overcomes some of the objections often leveled at accounts of indicatives based on the notion of similarity.
We propose a typology of representational artifacts for health care and life sciences domains and associate this typology with different kinds of formal ontology and logic, drawing conclusions as to the strengths and limitations of ontology in a description logics framework. The four types of domain representation we consider are: (i) lexico-semantic representation, (ii) representation of types of entities, (iii) representation of background knowledge, and (iv) representation of individuals. We advocate a clear distinction of the four kinds of representation in order to provide a more rational basis for using ontologies and related artifacts to advance integration of data and enhance interoperability of associated reasoning systems. We highlight the fact that only a minor portion of scientifically relevant facts in a domain such as biomedicine can be adequately represented by formal ontologies as long as the latter are conceived as representations of entity types. In particular, the attempt to encode default or probabilistic knowledge using ontologies so conceived is prone to produce unintended, erroneous models.
In recent decades, the analysis of phraseology has made use of the exploration of large corpora as a source of quantitative information about language. This paper intends to present the main lines of work in progress based on this empirical approach to linguistic analysis. In particular, we focus our attention on some problems relating to the morpho-syntactic annotation of corpora. The CORIS/CODIS corpus of contemporary written Italian, developed at CILTA – University of Bologna (Rossini Favretti 2000; Rossini Favretti, Tamburini, De Santis in press), is a synchronic 100-million-word corpus and is being lemmatised and annotated with part-of-speech (POS) tags, in order to increase the quantity of information and improve data retrieval procedures (Tamburini 2000). The aim of POS tagging is to assign each lexical unit to the appropriate word class. Usually the set of tags is pre-established by the linguist, who uses his/her competence to identify the different word classes. Our very first experiments revealed that the traditional part-of-speech distinctions in Italian (generally based on morphological and semantic criteria) are often inadequate to represent the syntactic features of words in context. It is worth noting that the uncertainties in categorisation contained in Italian grammars and dictionaries reflect a growing difficulty as they move from fundamental linguistic classes, such as nouns and verbs, to more complex classes, such as adverbs, pronouns, prepositions and conjunctions. This latter class, which groups together elements traditionally used to express connections between sentences, appears inadequate when describing cohesive relations in Italian. This phenomenon actually seems to involve other elements traditionally assigned to different classes, such as adverbs, pronouns and interjections.
Recent studies proposed the class of ‘connectives’, grouping all words that, apart from their traditional word class, have the function of connecting phrases and contributing to textual cohesion. From this point of view, conjunctions can be considered as part of phrasal connectives, which can in turn be included in the wider category of textual connectives. The aim of this study is to identify elements that can be included in the class of phrasal connectives, using quantitative methods. According to Shannon and Weaver’s (1949) observation that words are linked by dependent probabilities, corroborated by Halliday’s (1991) argument that the grammatical “system” (in Firth’s sense of the term) is essentially probabilistic, quantitative data are introduced in order to provide evidence of relative frequencies. Section 2 presents a description of word-class categorisation from the point of view of grammars and dictionaries, arguing that the traditional category of conjunctions is inadequate for capturing the notion of phrasal connective. Section 3 examines the notion of ‘connective’ and suggests a truth-functional interpretation of connective behaviour. Section 4 describes the quantitative methods proposed for analysing the distributional properties of lexical units, and section 5 comments on the results obtained by applying such methods, drawing some provisional conclusions.
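The dependent-probability idea invoked above can be made concrete with bigram statistics. The following is a minimal sketch with an invented English mini-corpus standing in for CORIS/CODIS; it illustrates the general method, not the project's actual code.

```python
# Estimating bigram (dependent) probabilities from a token sequence.
# Corpus and candidate connective are invented for illustration.
from collections import Counter

corpus = "the data support the claim and the claim supports the theory".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_next(w1, w2):
    """Maximum-likelihood estimate of P(w2 | w1).
    (Naive: the final token of the corpus has no successor.)"""
    return bigrams[(w1, w2)] / unigrams[w1]

# A candidate phrasal connective such as 'and' shows characteristic
# transition probabilities with the units it links:
print(p_next("claim", "and"))  # 0.5: one of two occurrences of 'claim' precedes 'and'
```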
ACTION-THEORETICALLY EXPLANATORY INTERPRETATIONS AS A MEANS OF SEMANTIC MEANING ANALYSIS. The article first develops a general procedure for semantic meaning analysis in difficult cases where the meaning is very uncertain. The procedure consists in searching for one or more possible hypothetical causal explanations of the text, these explanations containing, among other things, the semantic intention of the author, his subjective reasons for this meaning and for the writing down of the text, but also the path of transmission of the text from the writing down to the respective present text version. These hypothetical explanations (interpretations) are deductive-nomological or probabilistic and therefore presuppose corresponding general law hypotheses. If there is only one possible (hypothetical) explanation of the text at hand, this hypothetical explanation is also true. Very often, however, there are several possible explanations. In this case, the probabilities of these possible explanations must be determined; these probabilities then carry over to the individual hypotheses in the explanation (in accordance with Bayes' law). This general procedure of meaning analysis is used in the second part of the article to reconstruct two famous interpretations (by Wapnewski and Hahn) of a poem by Walther von der Vogelweide ("Nemt, frowe, disen kranz").
Many recent theories of epistemic discourse exploit an informational notion of consequence, i.e. a notion that defines entailment as preservation of support by an information state. This paper investigates how informational consequence fits with probabilistic reasoning. I raise two problems. First, all informational inferences that are not also classical inferences are, intuitively, probabilistically invalid. Second, all these inferences can be exploited, in a systematic way, to generate triviality results. The informational theorist is left with two options, both of them radical: they can either deny that epistemic modal claims have probability at all, or they can move to a nonstandard probability theory.
Famous results by David Lewis show that plausible-sounding constraints on the probabilities of conditionals or evaluative claims lead to unacceptable results, by standard probabilistic reasoning. Existing presentations of these results rely on stronger assumptions than they really need. When we strip these arguments down to a minimal core, we can see both how certain replies miss the mark, and also how to devise parallel arguments for other domains, including epistemic “might,” probability claims, claims about comparative value, and so on. A popular reply to Lewis's results is to claim that conditional claims, or claims about subjective value, lack truth conditions. For this strategy to have a chance of success, it needs to give up basic structural principles about how epistemic states can be updated—in a way that is strikingly parallel to the commitments of the project of dynamic semantics.
A series of recent studies have explored the impact of people's judgments regarding physical law, morality, and probability. Surprisingly, such studies indicate that these three apparently unrelated types of judgments often have precisely the same impact. We argue that these findings provide evidence for a more general hypothesis about the kind of cognition people use to think about possibilities. Specifically, we suggest that this aspect of people's cognition is best understood using an idea developed within the formal semantics tradition, namely the notion of modality. On the view we propose, people may have separate representations for physical, moral and probabilistic considerations, but they also integrate these various considerations into a unified representation of modality.
There is a long tradition in formal epistemology and in the psychology of reasoning of investigating indicative conditionals. In psychology, the propositional calculus was taken for granted as the normative standard of reference. Experimental tasks, evaluation of the participants’ responses, and psychological model building were inspired by the semantics of the material conditional. Recent empirical work on indicative conditionals focuses on uncertainty. Consequently, the normative standard of reference has changed. I argue that neither logic nor standard probability theory provides appropriate rationality norms for uncertain conditionals. I advocate coherence-based probability logic as an appropriate framework for investigating uncertain conditionals. Detailed proofs of the probabilistic non-informativeness of a paradox of the material conditional illustrate the approach from a formal point of view. I survey selected data on human reasoning about uncertain conditionals which additionally support the plausibility of the approach from an empirical point of view.
This dissertation is devoted to empirically contrasting the Suppositional Theory of conditionals, which holds that indicative conditionals serve the purpose of engaging in hypothetical thought, and Inferentialism, which holds that indicative conditionals express reason relations. Throughout a series of experiments, probabilistic and truth-conditional variants of Inferentialism are investigated using new stimulus materials, which manipulate previously overlooked relevance conditions. These are among the first published studies to directly investigate the central claims of Inferentialism empirically. In contrast, the Suppositional Theory of conditionals has an impressive track record through more than a decade of intensive testing. The evidence for the Suppositional Theory encompasses three sources. First, direct investigations of the probability of indicative conditionals, which substantiate “the Equation” (P(if A, then C) = P(C|A)). Second, the pattern of results known as the “defective truth table” effect, which corroborates the de Finetti truth table. And third, indirect evidence from the uncertain and-to-if inference task. Through four studies, each of these sources of evidence is scrutinized anew under the application of novel stimulus materials that factorially combine all permutations of prior and relevance levels of two conjoined sentences. The results indicate that the Equation only holds under positive relevance (P(C|A) – P(C|¬A) > 0) for indicative conditionals. In the case of irrelevance (P(C|A) – P(C|¬A) = 0), or negative relevance (P(C|A) – P(C|¬A) < 0), the strong relationship between P(if A, then C) and P(C|A) is disrupted. This finding suggests that participants tend to view natural language conditionals as defective under irrelevance and negative relevance (Chapter 2).
Furthermore, most of the participants turn out only to be probabilistically coherent above chance levels for the uncertain and-to-if inference in the positive relevance condition, when applying the Equation (Chapter 3). Finally, the results on the truth table task indicate that the de Finetti truth table is at most descriptive for about a third of the participants (Chapter 4). Conversely, strong evidence for a probabilistic implementation of Inferentialism could be obtained from assessments of P(if A, then C) across relevance levels (Chapter 2) and the participants’ performance on the uncertain and-to-if inference task (Chapter 3). Yet the results from the truth table task suggest that these findings could not be extended to truth-conditional Inferentialism (Chapter 4). On the contrary, strong dissociations were found between the presence of an effect of the reason relation reading on the probability and acceptability evaluations of indicative conditionals (and connate sentences), and the lack of an effect of the reason relation reading on the truth evaluation of the same sentences. The final chapter takes a bird's-eye view of these surprising results and discusses the perspectives they open up for future research.
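The relevance levels used throughout these studies are defined by the sign of Δp = P(C|A) – P(C|¬A). A minimal sketch of how they can be computed from a joint probability table (the numbers are invented for illustration, not data from the experiments):

```python
# Classifying relevance levels for a conditional "if A then C" from a
# joint distribution over the truth values of A and C.

def conditional(p_joint, a, c):
    """P(C=c | A=a) from a dict keyed by (A, C) truth-value pairs."""
    return p_joint[(a, c)] / (p_joint[(a, True)] + p_joint[(a, False)])

def delta_p(p_joint):
    """Relevance: P(C|A) - P(C|not-A)."""
    return conditional(p_joint, True, True) - conditional(p_joint, False, True)

# Positive relevance: A raises the probability of C.
positive = {(True, True): 0.4, (True, False): 0.1,
            (False, True): 0.2, (False, False): 0.3}

# Irrelevance: P(C|A) equals P(C|not-A).
irrelevant = {(True, True): 0.3, (True, False): 0.2,
              (False, True): 0.3, (False, False): 0.2}

print(delta_p(positive))    # > 0: the Equation is predicted to hold
print(delta_p(irrelevant))  # 0: P(if A, then C) comes apart from P(C|A)
```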
The Ramsey Test is considered the default test for the acceptability of indicative conditionals. I will argue that it is incompatible with some recent developments in the conceptualization of conditionals, namely the growing empirical evidence for the _Relevance Hypothesis_. According to the hypothesis, one of the necessary conditions of acceptability for an indicative conditional is that its antecedent be positively probabilistically relevant for the consequent. The source of the idea is the _Evidential Support Theory_ presented in Douven (2008). I will defend the hypothesis against alleged counterexamples and show that it is supported by growing empirical evidence. Finally, I will present a version of the Ramsey Test which incorporates the relevance condition and is therefore consistent with that evidence.
This paper offers a critical discussion of the role of entailments in the so-called New Paradigm psychology of reasoning, which is based on Bayesian models of rationality (Elqayam & Over, 2013). It is argued that assessments of probabilistic coherence cannot stand on their own, but need to be integrated with empirical studies of intuitive entailment judgments. This need is motivated not just by the requirements of probability theory itself, but also by the need to enhance the interdisciplinary integration of the psychology of reasoning with formal semantics in linguistics. The constructive goal of the paper is to introduce a new experimental paradigm, called the Dialogical Entailment task, to supplement current trends in the psychology of reasoning towards investigating knowledge-rich, social reasoning under uncertainty (Oaksford & Chater, 2019). As a case study, this experimental paradigm is applied to reasoning with conditionals and negation operators (e.g. CEM, wide and narrow negation). As part of the investigation, participants’ entailment judgments are evaluated against their probability evaluations to assess participants’ cross-task consistency over two experimental sessions.
There is an ongoing debate in the philosophical literature whether the conditionals that are central to deliberation are subjunctive or indicative conditionals and, if the latter, what semantics of the indicative conditional is compatible with the role that conditionals play in deliberation. We propose a possible-world semantics where conditionals of the form “if I take action a the outcome will be x” are interpreted as material conditionals. The proposed framework is illustrated with familiar examples, and both qualitative and probabilistic beliefs are considered. Issues such as common-cause cases and ‘Egan-style’ cases are discussed.
The nature and topology of time remain an open question in philosophy: both tensed and tenseless concepts of time appear to have merit. A concept of time that includes both kinds of time evolution of physical systems in quantum mechanics subsumes the properties of both notions. The linear dynamics defines the universe probabilistically throughout space-time, and can be seen as the definition of a block universe. The collapse dynamics is the time evolution of the linear dynamics, and is thus of a different logical type from the linear dynamics. These two different kinds of time evolution are respectively tensed and tenseless. Ascribing tensed semantics to the collapse dynamics is problematic in the light of special relativity, but this difficulty does not apply to a relational quantum mechanics. In this context, while the linear dynamics is the time evolution of the universe objectively, the collapse dynamics is the time evolution of the universe subjectively, applying solely in the functional frame of reference of the observer.
Presentations and reviews of philosophical works that have influenced philosophical concepts and discourses, and the current vision of the world. CONTENTS: On Sense and Reference; The Normative Structure of Science - Universalism - "Communism" - Disinterestedness - Organized Skepticism; The Nature of Theories - Theories and non-observables - Correspondence rules - How new empirical laws are derived from theoretical laws; Laws and their role in scientific explanation - Two basic requirements for scientific explanations - Deductive-nomological explanation - Universal laws and accidental generalizations - What distinguishes genuine laws from accidental generalizations? - Probabilistic explanation: foundations - Statistical probabilities and probabilistic laws - The inductive character of probabilistic explanation; Events and particulars; Naming and Necessity - Lecture I - Lecture II - Lecture III - Concluding remarks; The Methodology of Scientific Research Programmes; Toward a philosophy of technology - The formal dynamics of technology - Features of modern technology - The material aspects of technology - Toward an ethics of technology; Laboratory Life; Moral philosophy as applied science; Progress or rationality?; Democratic rationalization; Existential relativity; Scientific research is a moral duty - The principle of beneficence - The principle of fairness; Coordination between variables - The antinomy of the variable - The Tarskian approach - Rejection of the semantic role - The instantial approach - The algebraic approach - The relational approach - Relational semantics for first-order logic; References; About the author - Nicolae Sfetcu - By the same author - Contact; Publisher - MultiMedia Publishing.
The present dissertation aims at analyzing some linguistic aspects related to the lexical, semantic and syntactic behaviour of a number of verbs of FEELING in English, whose lexical, grammatical and idiosyncratic properties have been entered into the FunGramKB Editor in application of the theoretical assumptions propounded by the Lexical-Constructional Model. Analysis and subsequent input of data have been assessed against the background of some of the 20th-century trends in linguistics which find their expression in the first decade of this century, and the role of semantics in a world in which increasing priority is given to probabilistic, machine-learned output in lexicographic work. From this stance, the generic features contained in the FunGramKB meaning postulates and thematic frames, as outlined in the Lexical-Constructional Model, bring hope for a more faithful rendering of the semantic relationships established within human expression, while making provision for a semanticist's contribution to the refinement and storage of both thorough and extensive knowledge.
Richard Dawkins has popularized an argument that he thinks sound for showing that there is almost certainly no God. It rests on the assumptions (1) that complex and statistically improbable things are more difficult to explain than those that are not and (2) that an explanatory mechanism must show how this complexity can be built up from simpler means. But what justifies claims about the designer’s own complexity? One comes to a different understanding of order and of simplicity when one considers the psychological counterpart of information. In assessing his treatment of biological organisms as either self-programmed machines or algorithms, I show how self-generated organized complexity does not fit well with our knowledge of abduction and of information theory as applied to genetics. I also review some philosophical proposals for explaining how the complexity of the world could be externally controlled if one wanted to uphold a traditional understanding of divine simplicity.
The aim of the paper is to develop general criteria of argumentative validity and adequacy for probabilistic arguments on the basis of the epistemological approach to argumentation. In this approach, as in most other approaches to argumentation, probabilistic arguments have been somewhat neglected. Nonetheless, criteria for several special types of probabilistic arguments have been developed, in particular by Richard Feldman and Christoph Lumer. In the first part (sects. 2-5) the epistemological basis of probabilistic arguments is discussed. With regard to the philosophical interpretation of probabilities a new subjectivist, epistemic interpretation is proposed, which identifies probabilities with tendencies of evidence (sect. 2). After drawing the conclusions of this interpretation with respect to the syntactic features of the probability concept, e.g. one variable referring to the data base (sect. 3), the justification of basic probabilities (priors) by judgements of relative frequency (sect. 4) and the justification of derivative probabilities by means of the probability calculus are explained (sect. 5). The core of the paper is the definition of '(argumentatively) valid derivative probabilistic arguments', which provides exact conditions for epistemically good probabilistic arguments, together with conditions for the adequate use of such arguments for the aim of rationally convincing an addressee (sect. 6). Finally, some measures for improving the applicability of probabilistic reasoning are proposed (sect. 7).
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
Suppose several individuals (e.g., experts on a panel) each assign probabilities to some events. How can these individual probability assignments be aggregated into a single collective probability assignment? This article reviews several proposed solutions to this problem. We focus on three salient proposals: linear pooling (the weighted or unweighted linear averaging of probabilities), geometric pooling (the weighted or unweighted geometric averaging of probabilities), and multiplicative pooling (where probabilities are multiplied rather than averaged). We present axiomatic characterisations of each class of pooling functions (most of them classic, but one new) and argue that linear pooling can be justified procedurally, but not epistemically, while the other two pooling methods can be justified epistemically. The choice between them, in turn, depends on whether the individuals' probability assignments are based on shared information or on private information. We conclude by mentioning a number of other pooling methods.
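The three pooling rules compared in the article can be sketched for a single event as follows. The weights and probabilities are invented, and the renormalisation step for geometric and multiplicative pooling is the standard one over an event and its complement; this is an illustration of the general idea, not the article's formal framework.

```python
import math

def linear_pool(ps, ws):
    """Weighted arithmetic average of the individual probabilities."""
    return sum(w * p for p, w in zip(ps, ws))

def geometric_pool(ps, ws):
    """Weighted geometric average, renormalised over the event and its
    complement so the result is again a probability."""
    num = math.prod(p ** w for p, w in zip(ps, ws))
    den = num + math.prod((1 - p) ** w for p, w in zip(ps, ws))
    return num / den

def multiplicative_pool(ps):
    """Probabilities are multiplied (unweighted), then renormalised."""
    num = math.prod(ps)
    den = num + math.prod(1 - p for p in ps)
    return num / den

ps, ws = [0.8, 0.6], [0.5, 0.5]   # two experts, equal weights
print(linear_pool(ps, ws))        # 0.7
print(geometric_pool(ps, ws))     # ≈ 0.71
print(multiplicative_pool(ps))    # ≈ 0.857
```

Note how multiplicative pooling yields a collective probability more extreme than either individual's, which fits its epistemic rationale for independent private information.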
How can different individuals' probability functions on a given sigma-algebra of events be aggregated into a collective probability function? Classic approaches to this problem often require 'event-wise independence': the collective probability for each event should depend only on the individuals' probabilities for that event. In practice, however, some events may be 'basic' and others 'derivative', so that it makes sense first to aggregate the probabilities for the former and then to let these constrain the probabilities for the latter. We formalize this idea by introducing a 'premise-based' approach to probabilistic opinion pooling, and show that, under a variety of assumptions, it leads to linear or neutral opinion pooling on the 'premises'. This paper is the second of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
A common view among nontheists combines the de jure objection that theism is epistemically unacceptable with agnosticism about the de facto objection that theism is false. Following Plantinga, we can call this a “proper” de jure objection—a de jure objection that does not depend on any de facto objection. In his Warranted Christian Belief, Plantinga has produced a general argument against all proper de jure objections. Here I first show that this argument is logically fallacious (it commits subtle probabilistic fallacies disguised by scope ambiguities), and proceed to lay the groundwork for the construction of actual proper de jure objections.