In the classic Miners case, an agent subjectively ought to do what they know is objectively wrong. This case shows that the subjective and objective ‘oughts’ are somewhat independent. But there remains a powerful intuition that the guidance of objective ‘oughts’ is more authoritative—so long as we know what they tell us. We argue that this intuition must be given up in light of a monotonicity principle, which undercuts the rationale for saying that objective ‘oughts’ are an authoritative guide for agents and advisors.
A non-embracing consequence relation is one such that no set of wffs closed under it is equal to the set of all wffs. I prove that these relations have no deductive power if they are also extensive and monotonic.
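A hedged formal gloss of these conditions (our reconstruction in standard Tarskian notation, not the author's own wording):

```latex
% Conditions on a consequence operation Cn over a language whose set of
% well-formed formulas is WFF (notation ours):
\begin{align*}
&\text{Extensiveness:}  && \Gamma \subseteq \mathrm{Cn}(\Gamma)\\
&\text{Monotonicity:}   && \Gamma \subseteq \Delta \;\Rightarrow\; \mathrm{Cn}(\Gamma) \subseteq \mathrm{Cn}(\Delta)\\
&\text{Non-embracing:}  && \mathrm{Cn}(\Gamma) \subseteq \Gamma \;\Rightarrow\; \Gamma \neq \mathrm{WFF}
\end{align*}
```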
In this paper we present a framework for the dynamic and automatic generation of novel knowledge obtained through a process of commonsense reasoning based on typicality-based concept combination. We exploit a recently introduced extension of a Description Logic of typicality able to combine prototypical descriptions of concepts in order to generate new prototypical concepts and deal with problems like the PET FISH one (Osherson and Smith, 1981; Lieto & Pozzato, 2019). Intuitively, in the context of our application of this logic, the overall pipeline of our system works as follows: given a goal expressed as a set of properties, if the knowledge base does not contain a concept able to fulfill all these properties, then our system looks for two concepts to recombine in order to extend the original knowledge base and satisfy the goal.
In this contribution we describe a computational-creativity system able to automatically generate new concepts using a nonmonotonic Description Logic that integrates three main ingredients: a Description Logic of typicality, a probabilistic extension based on the distribution semantics known as DISPONTE, and a cognitively inspired heuristic for combining multiple concepts. One of the system's main applications lies in the field of computational creativity and, more specifically, in its use as a creativity-support tool in the media domain. In particular, the system is able to generate new stories (starting from a narrative representation of pre-existing stories), to generate new characters (e.g., the new "villain" of a TV series or a cartoon) and, in general, can be used to propose new narrative solutions and formats to be explored by the creative industry.
Nguyen argues that only his radically pragmatic account and Sterken’s indexical account can capture what we call the positive data. We present some new data, which we call the negative data, and argue that no theory of generics on the market is compatible with both the positive data and the negative data. We develop a novel version of the indexical account and show that it captures both. In particular, we argue that there is a semantic constraint to the effect that, in any context, the semantic value of GEN is upward monotone and non-symmetric. The pragmatic account, on the other hand, has difficulty accommodating the negative data, because no pragmatic principles have been developed that can explain them. In the paper we focus only on the pragmatic account and the indexical account, but our discussion has broad implications for the debate on generics: any empirically adequate account of generics must be flexible enough to accommodate the positive data and yet constrained enough to accommodate the negative data.
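For concreteness, the constraint can be stated schematically (our notation, not the authors'):

```latex
% Upward monotonicity in the scope argument, and non-symmetry, of GEN
% relative to a context c (schematic reconstruction):
\begin{align*}
&\text{Upward monotone:} && \mathrm{GEN}_c(A)(B) \text{ and } B \models B' \;\Rightarrow\; \mathrm{GEN}_c(A)(B')\\
&\text{Non-symmetric:}   && \mathrm{GEN}_c(A)(B) \;\not\Rightarrow\; \mathrm{GEN}_c(B)(A)
\end{align*}
```

For instance, if ‘Ravens are black’ is true, upward monotonicity predicts that ‘Ravens are black or grey’ is true as well, while non-symmetry blocks the inference to ‘Black things are ravens’.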
Inferentialism, especially Brandom’s theory, is the project of understanding meaning as determined by inferences, and language as a social practice governed by rational discursive norms. Discursive practice is thus understood as the basic rational practice, where commitments undertaken by participants are evaluated in terms of their being correct or incorrect. This model of explanation is also intended to rescue, by means of reasons, the commitments we undertake ourselves and to assess the commitments we attribute to others, in an objective sense: starting from our subjective normative and doxastic attitudes, we should be able to use the normative discursive resources apt to assess our commitments, referring not only to what we take to be correct, but also to how things actually are. My hypothesis is that this objectivity is not achieved only on the basis of the rational structure of discursive practice. The main worry concerns the fact that material inferences, those responsible for the content of our concepts (and commitments), are in general non-monotonic. These inferences put experts in an advantageous position, namely as those capable of defeasible reasoning. I propose a view on which this asymmetry among language users is the crucial factor in assessing the objectivity of claims within discursive practice.
It is usually accepted that deductions are non-informative and monotonic, inductions are informative and nonmonotonic, abductions create hypotheses but are epistemically irrelevant, and that neither deductions nor inductions can provide new insights. In this article, I attempt to provide a more cohesive view of the subject with the following hypotheses: (1) the paradigmatic examples of deductions, such as modus ponens and hypothetical syllogism, are not inferential forms, but coherence requirements for inferences; (2) since any reasoner aims to be coherent, any inference must be deductive; (3) a coherent inference is an intuitive process where the premises should be taken as sufficient evidence for the conclusion, which in its turn should be viewed as necessary evidence for the premises in some modal range; (4) inductions, properly understood, are abductions, but there are no abductions beyond the fact that in any inference the conclusion should be regarded as necessary evidence for the premises; (5) monotonicity is not only compatible with the retraction of past inferences given new information, but is a requirement for it; (6) this explanation of inferences holds true for discovery processes, predictions, and trivial inferences.
We study imagination as reality-oriented mental simulation (ROMS): the activity of simulating nonactual scenarios in one’s mind, to investigate what would happen if they were realized. Three connected questions concerning ROMS are: What is the logic, if there is one, of such an activity? How can we gain new knowledge via it? What is voluntary in it and what is not? We address them by building a list of core features of imagination as ROMS, drawing on research in cognitive psychology and the philosophy of mind. We then provide a logic of imagination as ROMS which models such features, combining techniques from epistemic logic, action logic, and subject matter semantics. Our logic comprises a modal propositional language with non-monotonic imagination operators, a formal semantics, and an axiomatization.
This paper offers a new account of metaphysical explanation. The account is modelled on Kitcher’s unificationist approach to scientific explanation. We begin, in Sect. 2, by briefly introducing the notion of metaphysical explanation and outlining the target of analysis. After that, we introduce a unificationist account of metaphysical explanation before arguing that such an account is capable of capturing four core features of metaphysical explanations: irreflexivity, non-monotonicity, asymmetry and relevance. Since the unificationist theory of metaphysical explanation inherits irreflexivity and non-monotonicity directly from the unificationist theory of scientific explanation that underwrites it, we focus on demonstrating how the account can secure asymmetry and relevance.
This paper presents a uniform semantic treatment of nonmonotonic inference operations that allow for inferences from infinite sets of premises. The semantics is formulated in terms of selection functions and is a generalization of the preferential semantics of Shoham (1987), (1988), Kraus, Lehmann, and Magidor (1990) and Makinson (1989), (1993). A selection function picks out from a given set of possible states (worlds, situations, models) a subset consisting of those states that are, in some sense, the most preferred ones. A proposition α is a nonmonotonic consequence of a set of propositions Γ iff α holds in all the most preferred Γ-states. In the literature on revealed preference theory, there are a number of well-known theorems concerning the representability of selection functions, satisfying certain properties, in terms of underlying preference relations. Such theorems are utilized here to give corresponding representation theorems for nonmonotonic inference operations. At the end of the paper, the connection between nonmonotonic inference and belief revision, in the sense of Alchourrón, Gärdenfors, and Makinson, is explored. In this connection, infinitary belief revision operations that allow for the revision of a theory with a possibly infinite set of propositions are introduced and characterized axiomatically.
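The core definition lends itself to a toy implementation. The sketch below (our illustration under simplifying assumptions, not the paper's formalism: finitely many atoms, states as sets of true atoms, and an ad hoc abnormality-minimizing selection function) shows how a selection function induces a nonmonotonic consequence test:

```python
# Nonmonotonic consequence via a selection function over states.
from itertools import combinations

ATOMS = ("bird", "penguin", "flies")

def states():
    """All subsets of ATOMS, each read as a total valuation (a 'state')."""
    return [frozenset(c) for r in range(len(ATOMS) + 1)
            for c in combinations(ATOMS, r)]

def select(candidates):
    """Toy preference: prefer states with fewest 'abnormalities'."""
    def abnormality(s):
        return (("bird" in s and "flies" not in s) +      # non-flying bird
                ("penguin" in s and "flies" in s) +       # flying penguin
                ("penguin" in s and "bird" not in s))     # non-bird penguin
    best = min(map(abnormality, candidates), default=0)
    return [s for s in candidates if abnormality(s) == best]

def nm_consequence(premises, conclusion):
    """conclusion follows iff it holds in all selected premise-states."""
    candidates = [s for s in states() if all(p(s) for p in premises)]
    return all(conclusion(s) for s in select(candidates))

bird    = lambda s: "bird" in s
penguin = lambda s: "penguin" in s
flies   = lambda s: "flies" in s

print(nm_consequence([bird], flies))           # True: birds normally fly
print(nm_consequence([bird, penguin], flies))  # False: monotonicity fails
```

The two calls illustrate the failure of monotonicity: enlarging the premise set from {bird} to {bird, penguin} defeats the conclusion.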
I present a sequent calculus that extends a nonmonotonic consequence relation over an atomic language to a logically complex language. The system is in line with two guiding philosophical ideas: (i) logical inferentialism and (ii) logical expressivism. The extension defined by the sequent rules is conservative. The conditional tracks the consequence relation and negation tracks incoherence. Besides the ordinary propositional connectives, the sequent calculus introduces a new kind of modal operator that marks implications that hold monotonically. Transitivity fails, but for good reasons. Intuitionistic and classical logic can easily be recovered from the system.
We present a formal semantics for epistemic logic, capturing the notion of knowability relative to information (KRI). Like Dretske, we move from the platitude that what an agent can know depends on her (empirical) information. We treat operators of the form K_AB (‘B is knowable on the basis of information A’) as variably strict quantifiers over worlds with a topic- or aboutness-preservation constraint. Variable strictness models the non-monotonicity of knowledge acquisition while allowing knowledge to be intrinsically stable. Aboutness-preservation models the topic-sensitivity of information, allowing us to invalidate controversial forms of epistemic closure while validating less controversial ones. Thus, unlike the standard modal framework for epistemic logic, KRI accommodates plausible approaches to the Kripke-Harman dogmatism paradox, which bear on non-monotonicity, or on topic-sensitivity. KRI also strikes a better balance between agent idealization and a non-trivial logic of knowledge ascriptions.
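A schematic truth clause along these lines (our hedged reconstruction; the paper's official semantics may differ in details) combines a world-selection component with a topic-inclusion component:

```latex
% K_A B at world w: B holds throughout the selected A-worlds, and the
% topic of B is contained in the topic of A (f = selection function,
% t = topic assignment; notation ours):
w \Vdash K_A B \;\iff\;
  \forall w' \in f_A(w):\, w' \Vdash B
  \;\;\text{and}\;\;
  t(B) \sqsubseteq t(A)
```

Variable strictness enters through f_A, which may select different worlds for different pieces of information A; topic inclusion is what blocks the controversial closure principles.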
In this article we present an advanced version of Dual-PECCS, a cognitively-inspired knowledge representation and reasoning system aimed at extending the capabilities of artificial systems in conceptual categorization tasks. It combines different sorts of common-sense categorization (prototypical and exemplar-based categorization) with standard monotonic categorization procedures. These different types of inferential procedures are reconciled according to the tenets coming from the dual process theory of reasoning. From a representational perspective, on the other hand, the system relies on the hypothesis of conceptual structures represented as heterogeneous proxytypes. Dual-PECCS has been experimentally assessed in a task of conceptual categorization where a target concept illustrated by a simple common-sense linguistic description had to be identified by resorting to a mix of categorization strategies, and its output has been compared to human responses. The obtained results suggest that our approach can be beneficial in improving the representational and reasoning conceptual capabilities of standard cognitive artificial systems and, in addition, that it may plausibly be applied to different general computational models of cognition. The current version of the system, in fact, extends our previous work in that Dual-PECCS is now integrated and tested in two cognitive architectures, ACT-R and CLARION, which implement different assumptions on the underlying invariant structures governing human cognition. Such integration allowed us to extend our previous evaluation.
This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it faces the problem of measure sensitivity and fails to justify the use of MI in giving definitive answers to questions of information. We propose a fourth interpretation of MI: by reduction in expected inaccuracy, where inaccuracy is measured by a strictly proper monotonic scoring rule. It is shown that the answers to questions of information given by MI are definitive whenever this interpretation is appropriate, and that it is appropriate in a wide range of applications with epistemic implications.
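As a worked illustration of the fourth interpretation (a standard identity, stated in our notation): when inaccuracy is measured by the logarithmic score, which is strictly proper, MI is exactly the expected reduction in inaccuracy that comes from replacing the prior by the posterior.

```latex
% Expected log-score inaccuracy before and after learning Y:
I(X;Y) \;=\; \underbrace{\mathbb{E}_{x,y}\!\left[-\log p(x)\right]}_{H(X)}
\;-\; \underbrace{\mathbb{E}_{x,y}\!\left[-\log p(x \mid y)\right]}_{H(X \mid Y)}
\;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
```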
This paper outlines an account of conditionals, the evidential account, which rests on the idea that a conditional is true just in case its antecedent supports its consequent. As we will show, the evidential account exhibits some distinctive logical features that deserve careful consideration. On the one hand, it departs from the material reading of ‘if then’ exactly in the way we would like it to depart from that reading. On the other, it significantly differs from the non-material accounts which hinge on the Ramsey Test, advocated by Adams, Stalnaker, Lewis, and others.
We introduce a ranking of multidimensional alternatives, including uncertain prospects as a particular case, when these objects can be given a matrix form. This ranking is separable in terms of rows and columns, and continuous and monotonic in the basic quantities. Owing to the theory of additive separability developed here, we derive very precise numerical representations over a large class of domains (i.e., typically not of the Cartesian product form). We apply these representations to (1) streams of commodity baskets through time, (2) uncertain social prospects, (3) uncertain individual prospects. Concerning (1), we propose a finite-horizon variant of Koopmans’s (1960) axiomatization of infinite discounted utility sums. The main results concern (2). We push the classic comparison between the ex ante and ex post social welfare criteria one step further by avoiding any expected utility assumptions, and as a consequence obtain what appears to be the strongest existing form of Harsanyi’s (1955) Aggregation Theorem. Concerning (3), we derive a subjective probability for Anscombe and Aumann’s (1963) finite case by merely assuming that there are two epistemically independent sources of uncertainty.
We introduce the notion of complexity, first at an intuitive level and then in relatively more concrete terms, explaining the various characteristic features of complex systems with examples. There exists a vast literature on complexity, and our exposition is intended to be an elementary introduction, meant for a broad audience.

Briefly, a complex system is one whose description involves a hierarchy of levels, where each level is made of a large number of components interacting among themselves. The time evolution of such a system is of a complex nature, depending on the interactions among subsystems in the next level below the one under consideration and, at the same time, conditioned by the level above, where the latter sets the context for the evolution. Generally speaking, the interactions among the constituents of the various levels lead to a dynamics characterized by numerous characteristic scales, each level having its own set of scales. What is more, a level commonly exhibits ‘emergent properties’ that cannot be derived from considerations relating to its component systems taken in isolation or to those in a different contextual setting. In the dynamic evolution of some particular level, there occurs a self-organized emergence of a higher level, and the process is repeated at still higher levels.

The interaction and self-organization of the components of a complex system follow the principle commonly expressed by saying that the ‘whole is different from the sum of the parts’. In the case of systems whose behavior can be expressed mathematically in terms of differential equations, this means that the interactions are nonlinear in nature.

While all of the above features are not universally exhibited by complex systems, they are nevertheless indicative of a broad commonness relative to which individual systems can be described and analyzed. There exist measures of complexity which, once again, are not of universal applicability, being more heuristic than exact. The present state of knowledge and understanding of complex systems is itself an emerging one. Still, a large number of results on various systems can be related to their complex character, making complexity an immensely fertile concept in the study of natural, biological, and social phenomena.

All this puts a very definite limitation on the complete description of a complex system as a whole, since such a system can be precisely described only contextually, relative to some particular level, where emergent properties rule out an exact description of more than one level within a common framework.

We discuss the implications of these observations in the context of our conception of the so-called noumenal reality that has a mind-independent existence and is perceived by us in the form of the phenomenal reality. The latter is derived from the former by means of our perceptions and interpretations, and our efforts at sorting out and making sense of the bewildering complexity of reality take the form of incessant processes of inference that lead to theories. Strictly speaking, theories apply to models that are constructed as idealized versions of parts of reality, within which inferences and abstractions can be carried out meaningfully, enabling us to construct the theories.

There exists a correspondence between the phenomenal and the noumenal realities in terms of events and their correlations, where these are experienced as the complex behavior of systems or entities of various descriptions.
The infinite diversity of behavior of systems in the phenomenal world is explained within specified contexts by theories. The latter are constructs generated in our ceaseless attempts at interpreting the world, and the question arises as to whether these are reflections of ‘laws of nature’ residing in the noumenal world. This is a fundamental concern of scientific realism, within the fold of which there exists a trend towards the assumption that theories express truths about the noumenal reality. We examine this assumption (referred to as a ‘point of view’ in the present essay) closely and indicate that an alternative point of view is also consistent within the broad framework of scientific realism. This is the view that theories are domain-specific and contextual, and that they are arrived at by independent processes of inference and abstraction in the various domains of experience. Theories in contiguous domains of experience dovetail and interpenetrate with one another, and bear the responsibility of correctly explaining our observations within these domains.

With accumulating experience, theories get revised and the network of our theories of the world acquires a complex structure, exhibiting a complex evolution. There exists a tendency within the fold of scientific realism of interpreting this complex evolution in rather simple terms, where one assumes (this, again, is a point of view) that theories tend more and more closely to truths about Nature and, what is more, progress towards an all-embracing ‘ultimate theory’ -- a foundational one in respect of all our inquiries into nature. We examine this point of view closely and outline the alternative view -- one broadly consistent with scientific realism -- that there is no ‘ultimate’ law of nature, that theories do not correspond to truths inherent in reality, and that successive revisions in theory do not lead monotonically to some ultimate truth. Instead, the theories generated in succession are incommensurate with each other, testifying to the fact that a theory gives us a perspective view of some part of reality, arrived at contextually. Instead of resembling a monotonically converging series, successive theories are analogous to asymptotic series.

Before we summarize all the above considerations, we briefly address the issue of the complexity of the human mind -- one as pervasive as the complexity of Nature at large. The complexity of the mind is related to the complexity of the underlying neuronal organization in the brain, which operates within a larger biological context, its activities being modulated by other physiological systems, notably the one involving a host of chemical messengers. The mind, with no materiality of its own, is nevertheless emergent from the activity of interacting neuronal assemblies in the brain. As in the case of reality at large, there can be no ultimate theory of the mind, from which one can explain and predict the entire spectrum of human behavior, which is infinitely rich and diverse.
In this paper we propose a computational framework aimed at extending the problem-solving capabilities of cognitive artificial agents through the introduction of a novel, goal-directed, dynamic knowledge-generation mechanism obtained via a non-monotonic reasoning procedure. In particular, the proposed framework relies on the assumption that certain classes of problems cannot be solved by simply learning or injecting new external knowledge into the declarative memory of a cognitive artificial agent but, rather, require a mechanism for the automatic and creative re-framing, or re-formulation, of the available knowledge. We show how such a mechanism can be obtained through a framework of dynamic knowledge generation that is able to tackle the problem of commonsense concept combination. In addition, we show how such a framework can be employed in the field of cognitive architectures to overcome situations like the impasse in SOAR, by extending the possible options of its subgoaling procedures.
This paper formulates some paradoxes of inductive knowledge. Two responses in particular are explored: According to the first sort of theory, one is able to know in advance that certain observations will not be made unless a law exists. According to the other, this sort of knowledge is not available until after the observations have been made. Certain natural assumptions, such as the idea that the observations are just as informative as each other, the idea that they are independent, and that they increase your knowledge monotonically (among others), are given precise formulations. Some surprising consequences of these assumptions are drawn, and their ramifications for the two theories examined. Finally, a simple model of inductive knowledge is offered, and independently derived from other principles concerning the interaction of knowledge and counterfactuals.
In the late summer of 1998, the authors, a cognitive scientist and a logician, started talking about the relevance of modern mathematical logic to the study of human reasoning, and we have been talking ever since. This book is an interim report of that conversation. It argues that results such as those on the Wason selection task, purportedly showing the irrelevance of formal logic to actual human reasoning, have been widely misinterpreted, mainly because the picture of logic current in psychology and cognitive science is completely mistaken. We aim to give the reader a more accurate picture of mathematical logic and, in doing so, hope to show that logic, properly conceived, is still a very helpful tool in cognitive science. The main thrust of the book is therefore constructive. We give a number of examples in which logical theorizing helps in understanding and modeling observed behavior in reasoning tasks, deviations of that behavior in a psychiatric disorder (autism), and even the roots of that behavior in the evolution of the brain.
Donkey sentences have existential and universal readings, but they are not often perceived as ambiguous. We extend the pragmatic theory of nonmaximality in plural definites by Križ (2016) to explain how context disambiguates donkey sentences. We propose that the denotations of such sentences produce truth-value gaps — in certain scenarios the sentences are neither true nor false — and demonstrate that Križ’s pragmatic theory fills these gaps to generate the standard judgments of the literature. Building on Muskens’s (1996) Compositional Discourse Representation Theory and on ideas from supervaluation semantics, the semantic analysis defines a general schema for quantification that delivers the required truth-value gaps. Given the independently motivated pragmatic theory of Križ 2016, we argue that mixed readings of donkey sentences require neither plural information states, contra Brasoveanu 2008, 2010, nor error states, contra Champollion 2016, nor singular donkey pronouns with plural referents, contra Krifka 1996, Yoon 1996. We also show that the pragmatic account improves over alternatives like Kanazawa 1994 that attribute the readings of donkey sentences to the monotonicity properties of the embedding quantifier.
James Woodward’s Making Things Happen presents the most fully developed version of a manipulability theory of causation. Although the ‘interventionist’ account of causation that Woodward defends in Making Things Happen has many admirable qualities, Michael Strevens argues that it has a fatal flaw. Strevens maintains that Woodward’s interventionist account of causation renders facts about causation relative to an individual’s perspective. In response to this charge, Woodward claims that although on his account X might be a relativized cause of Y relative to some perspective, this does not lead to the problematic relativity that Strevens claims. Roughly, Woodward argues this is so because if X is a relativized cause of Y with respect to some perspective, then X is a cause of Y simpliciter. So, the truth of whether X is a cause of Y is not relative to one’s perspective. Strevens counters by arguing that Woodward’s response fails because relativized causation is not monotonic. In this paper I argue that Strevens’ argument that relativized causation is not monotonic is unsound.
The model of self-referential truth presented in this paper, named Revision-theoretic supervaluation, aims to incorporate the philosophical insights of Gupta and Belnap’s Revision Theory of Truth into the formal framework of Kripkean fixed-point semantics. In Kripke-style theories the final set of grounded true sentences can be reached from below along a strictly increasing sequence of sets of grounded true sentences: in this sense, each stage of the construction can be viewed as an improvement on the previous ones. I want to do something similar, replacing the Kripkean sets of grounded true sentences with revision-theoretic sets of stable true sentences. This can be done by defining a monotone operator through a variant of van Fraassen’s supervaluation scheme which is simply based on ω-length iterations of the Tarskian operator. Clearly, all virtues of Kripke-style theories are preserved, and we can also prove that the resulting set of “grounded” true sentences shares some nice features with the sets of stable true sentences which are provided by the usual ways of formalising revision. What is expected is that a clearer philosophical content could be associated with this way of doing revision; hopefully, a content directly linked with the insights underlying finite revision processes.
In the theory of judgment aggregation, it is known for which agendas of propositions it is possible to aggregate individual judgments into collective ones in accordance with the Arrow-inspired requirements of universal domain, collective rationality, unanimity preservation, non-dictatorship and propositionwise independence. But it is only partially known (e.g., only in the monotonic case) for which agendas it is possible to respect additional requirements, notably non-oligarchy, anonymity, no individual veto power, or implication preservation. We fully characterize the agendas for which there are such possibilities, thereby answering the most salient open questions about propositionwise judgment aggregation. Our results build on earlier results by Nehring and Puppe (2002), Nehring (2006), Dietrich and List (2007a) and Dokow and Holzman (2010a).
This is part of an author-meets-critics session on Daniel Star's wonderful book, Knowing Better. I discuss a potential problem with Kearns and Star's Reasons as Evidence thesis. The issue has to do with the difficulties we face if we treat normative reasons as evidence and impose no possession conditions on evidence. On such a view, it's hard to see how practical reasoning could be a non-monotonic process. One way out of the difficulty would be to allow for (potent) unpossessed reasons but insist that all evidence is possessed evidence. This option, I argue, isn't open to proponents of the Reasons as Evidence thesis. Instead, it seems that they'll have to say that all normative reasons are identified with pieces of possessed evidence. This requires the proponents of the Reasons as Evidence thesis to impose epistemic constraints on norms that some of us find objectionable.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought: only monotonicity has thus far proved indispensable. Section 5 explains another consequence of general relevance: despite the "infinitary" nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
I present a possible worlds semantics for a hyperintensional belief revision operator, which reduces the logical idealization of cognitive agents affecting similar operators in doxastic and epistemic logics, as well as in standard AGM belief revision theory. Belief states are not closed under classical logical consequence; revising by inconsistent information does not perforce lead to trivialization; and revision can be subject to ‘framing effects’: logically or necessarily equivalent contents can lead to different revisions. Such results are obtained without resorting to non-classical logics, or to non-normal or impossible worlds semantics. The framework combines, instead, a standard semantics for propositional S5 with a simple mereology of contents.
Simultaneous hypothesis tests can fail to provide results that meet logical requirements. For example, if A and B are two statements such that A implies B, there exist tests that, based on the same data, reject B but not A. Such outcomes are generally inconvenient to statisticians (who want to communicate the results to practitioners in a simple fashion) and non-statisticians (confused by conflicting pieces of information). Based on this inconvenience, one might want to use tests that satisfy logical requirements. However, Izbicki and Esteves show that the only tests that are in accordance with three logical requirements (monotonicity, invertibility and consonance) are trivial tests based on point estimation, which generally lack statistical optimality. As a possible solution to this dilemma, this paper adapts the above logical requirements to agnostic tests, in which one can accept, reject or remain agnostic with respect to a given hypothesis. Each of the logical requirements is characterized in terms of a Bayesian decision-theoretic perspective. Contrary to the results obtained for regular hypothesis tests, there exist agnostic tests that satisfy all logical requirements and also perform well statistically. In particular, agnostic tests that fulfill all logical requirements are characterized as region-estimator-based tests. Examples of such tests are provided.
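A minimal sketch of a region-estimator-based agnostic test, under our own simplifying assumptions (a real-valued parameter, interval hypotheses, and a textbook confidence interval as the region estimator; an illustration, not the paper's construction):

```python
import numpy as np
from scipy import stats

ACCEPT, REJECT, AGNOSTIC = "accept", "reject", "agnostic"

def agnostic_test(region, hypothesis):
    """Accept if the region lies inside the hypothesis, reject if the two
    are disjoint, and remain agnostic otherwise."""
    (r_lo, r_hi), (h_lo, h_hi) = region, hypothesis
    if h_lo <= r_lo and r_hi <= h_hi:
        return ACCEPT
    if r_hi < h_lo or h_hi < r_lo:
        return REJECT
    return AGNOSTIC

# Region estimator: a 95% confidence interval for a normal mean.
x = np.random.default_rng(0).normal(loc=0.3, scale=1.0, size=50)
half = stats.norm.ppf(0.975) * x.std(ddof=1) / np.sqrt(len(x))
region = (x.mean() - half, x.mean() + half)

# Monotonicity holds by construction: a weaker (larger) hypothesis can
# never fare worse than a stronger one it contains.
print(agnostic_test(region, (-0.1, 0.1)))   # narrow hypothesis
print(agnostic_test(region, (-1.0, 1.0)))   # weaker hypothesis it implies
```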
The many-property problem has traditionally been taken to show that the adverbial theory of perception is untenable. This paper first shows that several widely accepted views concerning the nature of perception---including both representational and non-representational views---likewise face the many-property problem. It then presents a solution to the many-property problem for these views, but goes on to show how this solution can be adapted to provide a novel, fully compositional solution to the many-property problem for adverbialism. Thus, with respect to the many-property problem, adverbialism and several widely accepted views in the philosophy of perception are on a par, and the problem is solved.
Recent work in cognitive modelling has found that most of the data that has been cited as evidence for the dual-process theory (DPT) of reasoning is best explained by non-linear, "monotonic" one-process models (Stephens et al., 2018, 2019). In this paper, I consider an important caveat of this research: it uses models that are committed to unrealistic assumptions about how effectively task conditions can isolate Type-1 and Type-2 reasoning. To avoid this caveat, I develop a coordinated theoretical, experimental, and modelling strategy to better test DPT. First, I propose that Type-1 and Type-2 reasoning be defined as reasoning that precedes and follows metacognitive control, respectively. Second, I argue that reasoning that precedes and follows metacognitive control can be effectively isolated using debiasing paradigms that manipulate metacognitive heuristics (e.g., processing fluency) to prevent or trigger metacognitive control, respectively. Third, I argue that monotonic models can allow us to decisively test DPT only when we use them to analyse data from this particular kind of debiasing paradigm.
Section 1 provides a brief summary of the pair-list literature, singling out some points that are particularly relevant for the coming discussion.

Section 2 shows that the dilemma of quantification versus domain restriction arises only in extensional complement interrogatives. In matrix questions and in intensional complements only universals support pair-list readings, whence the simplest domain restriction treatment suffices. Related data including conjunction, disjunction, and cumulative readings are discussed.

Section 3 argues that in the case of extensional complements the domain restriction treatment is inadequate for at least two independent reasons. One has to do with the fact that not only upward monotonic quantifiers support pair-list readings, and the other with the derivation of apparent scope-out readings. The reasoning is supplemented with some discussion of the semantic properties of layered quantifiers. The above will establish the need for quantification, so the question arises how the objections explicitly enlisted in the literature against quantification can be answered. Section 4 considers the de dicto reading of the quantifier's restriction, quantificational variability, and the absence of pair-list readings with whether-questions, and argues that they need not militate against the quantificational analysis.

Section 5 summarizes the emergent proposal.

Finally, Section 6 discusses the significance of the above findings for the behavior of weak islands.
Need considerations play an important role in empirically informed theories of distributive justice. We propose a concept of need-based justice that is related to social participation and provide an ethical measurement of need-based justice. The β-ε-index satisfies the need-principle, monotonicity, sensitivity, transfer and several »technical« axioms. A numerical example is given.
Assuming that votes are independent, the epistemically optimal procedure in a binary collective choice problem is known to be a weighted supermajority rule with weights given by personal log-likelihood-ratios. It is shown here that an analogous result holds in a much more general model. Firstly, the result follows from a more basic principle than expected-utility maximisation, namely from an axiom (Epistemic Monotonicity) which requires neither utilities nor prior probabilities of the ‘correctness’ of alternatives. Secondly, a person’s input need not be a vote for an alternative; it may be any type of input, for instance a subjective degree of belief or probability of the correctness of one of the alternatives. The case of a profile of subjective degrees of belief is particularly appealing, since here no parameters such as competence parameters need to be known.
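The classical rule referred to in the first sentence is easy to state in code. A sketch (ours, for illustration; the competence values and the tie convention are our assumptions):

```python
import math

def optimal_choice(votes, competences):
    """votes: +1/-1 for the two alternatives; competences: the probability
    that each voter votes correctly, voters assumed independent."""
    score = sum(v * math.log(p / (1 - p))
                for v, p in zip(votes, competences))
    if score > 0:
        return +1
    if score < 0:
        return -1
    return 0  # tie

# A highly competent minority outweighs a mediocre majority:
# log(0.95/0.05) ~ 2.94 beats 3 * log(0.55/0.45) ~ 0.60.
print(optimal_choice([+1, -1, -1, -1], [0.95, 0.55, 0.55, 0.55]))  # -> 1
```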
In this paper we discuss the new Tweety puzzle. The original Tweety puzzle was addressed by approaches in non-monotonic logic, which aim to adequately represent the Tweety case, namely that Tweety is a penguin and, thus, an exceptional bird, which cannot fly, although in general birds can fly. The new Tweety puzzle is intended as a challenge for probabilistic theories of epistemic states. In the first part of the paper we argue against monistic Bayesians, who assume that epistemic states can at any given time be adequately described by a single subjective probability function. We show that monistic Bayesians cannot provide an adequate solution to the new Tweety puzzle, because this requires one to refer to a frequency-based probability function. We conclude that monistic Bayesianism cannot be a fully adequate theory of epistemic states. In the second part we describe an empirical study, which provides support for the thesis that monistic Bayesianism is also inadequate as a descriptive theory of cognitive states. In the final part of the paper we criticize Bayesian approaches in cognitive science, insofar as their monistic tendency cannot adequately address the new Tweety puzzle. We further argue against monistic Bayesianism in cognitive science by means of a case study. In this case study we show that Oaksford and Chater’s (2007, 2008) model of conditional inference—contrary to the authors’ theoretical position—has to refer also to a frequency-based probability function.
Standard theories of scope are semantically blind. They employ a single logico-syntactic rule of scope assignment (quantifying in, Quantifier Raising, storage, or type change, etc.) which, roughly speaking, prefixes an expression α.
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, by verifying De Morgan’s Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on coherence of prevision assessments for some families of compounded conditionals; in particular we examine the Fréchet-Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction C_{n+1} of n+1 conditional events to the family {C_n, E_{n+1}|H_{n+1}}. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related to the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non-p-valid inference rules; then, we illustrate by an example two methods which allow one to suitably modify non-p-valid inference rules in order to get inferences which are p-valid.
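For reference, the Fréchet-Hoeffding bounds mentioned above take the familiar form in the simplest two-event case (stated schematically in our notation; in the coherence-based setting they constrain the prevision of the conjunction as a conditional random quantity):

```latex
% Given marginal assessments x = P(A|H) and y = P(B|K), coherence
% confines the prevision z of the conjunction (A|H) \wedge (B|K) to:
\max(x + y - 1,\; 0) \;\le\; z \;\le\; \min(x,\; y)
```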
This paper motivates and defends a principle which captures a systematic connection between essence, truth, and grounding. It says that if a proposition expresses an essential truth, i.e., if it is true in virtue of the nature of some objects, then there are grounds for its truth which involve these objects. Together with the assumption that a fact can only be grounded in facts which are relevant to it, this principle is then applied in an argument against the monotonicity of the Essentialist notion ‘true in virtue of the nature of’.
In this essay I propose a new measure of social welfare. It captures the intuitive idea that quantity, quality, and equality of individual welfare all matter for social welfare. More precisely, it satisfies six conditions: Equivalence, Dominance, Quality, Strict Monotonicity, Equality and Asymmetry. These state that (i) populations equivalent in individual welfare are equal in social welfare; (ii) a population that dominates another in individual welfare is better; (iii) a population that has a higher average welfare than another population is better, other things being equal; (iv) the addition of a well-faring individual makes a population better, whereas the addition of an ill-faring individual makes a population worse; (v) a population that has a higher degree of equality than another population is better, other things being equal; and (vi) individual illfare matters more for social welfare than individual welfare. By satisfying the six conditions, the measure improves on previously proposed measures, such as the utilitarian Total and Average measures, as well as different kinds of Prioritarian measures.
People reason not just in beliefs, but also in intentions, preferences, and other attitudes. They form preferences from existing preferences, or intentions from existing beliefs and intentions, and so on, often facing choices between rival conclusions. Building on Broome (2013) and Dietrich et al. (2019), we present a philosophical and formal analysis of reasoning in attitudes, with or without facing such choices. Reasoning in attitudes is a mental activity that differs fundamentally from reasoning about attitudes, a form of theoretical reasoning by which one discovers rather than forms attitudes. Reasoning in attitudes has standard formal features (such as monotonicity), but is indeterministic (reflecting choice in reasoning). Like theoretical reasoning, it need not follow logical entailment, but for different reasons related to indeterminism. This makes reasoning in attitudes harder to model logically than theoretical reasoning.
In this paper, I discuss the analysis of logic in the pragmatic approach recently proposed by Brandom. I consider different consequence relations, formalized by classical, intuitionistic and linear logic, and I argue that the formal theory developed by Brandom, even if it provides powerful foundational insights on the relationship between logic and discursive practices, cannot account for important reasoning patterns represented by non-monotonic or resource-sensitive inferences. I then present an incompatibility semantics in the framework of linear logic which allows one to refine Brandom’s concept of defeasible inference and to account for those non-monotonic and relevant inferences that are expressible in linear logic. Moreover, I suggest an interpretation of discursive practices based on an abstract notion of agreement on what counts as a reason, which is deeply connected with linear logic semantics.
We present two defeasible logics of norm-propositions (statements about norms) that (i) consistently allow for the possibility of normative gaps and normative conflicts, and (ii) map each premise set to a sufficiently rich consequence set. In order to meet (i), we define the logic LNP, a conflict- and gap-tolerant logic of norm-propositions capable of formalizing both normative conflicts and normative gaps within the object language. Next, we strengthen LNP within the adaptive logic framework for non-monotonic reasoning in order to meet (ii). This results in the adaptive logics LNPr and LNPm, which interpret a given set of premises in such a way that normative conflicts and normative gaps are avoided ‘whenever possible’. LNPr and LNPm are equipped with a preferential semantics and a dynamic proof theory.
Semantics plays a role in grammar in at least three guises. (A) Linguists seek to account for speakers’ knowledge of what linguistic expressions mean. This goal is typically achieved by assigning a model-theoretic interpretation in a compositional fashion. For example, *No whale flies* is true if and only if the intersection of the sets of whales and fliers is empty in the model. (B) Linguists seek to account for the ability of speakers to make various inferences based on semantic knowledge. For example, *No whale flies* entails *No blue whale flies* and *No whale flies high*. (C) The well-formedness of a variety of syntactic constructions depends on morpho-syntactic features with a semantic flavor. For example, *Under no circumstances would a whale fly* is grammatical, whereas *Under some circumstances would a whale fly* is not, corresponding to the downward vs. upward monotonic features of the preposed phrases. It is usually assumed that once a compositional model-theoretic interpretation is assigned to all expressions, its fruits can be freely enjoyed by inferencing and syntax. What place might proof theory have in this picture?
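The inference pattern in (B) is the textbook downward monotonicity of *no*, which can be stated as a worked equation (standard generalized-quantifier fare, added here for concreteness):

```latex
% 'No' as a relation between sets, and its downward monotonicity
% in both arguments:
\mathrm{NO}(A)(B) = \mathrm{true} \iff A \cap B = \emptyset
\qquad\text{hence}\qquad
A' \subseteq A,\; B' \subseteq B,\; \mathrm{NO}(A)(B) \;\Rightarrow\; \mathrm{NO}(A')(B')
```

With A = whales and B = fliers, taking A' = blue whales or B' = high-fliers yields exactly the entailments cited.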
In a previous work we introduced the algorithm SQEMA for computing first-order equivalents and proving canonicity of modal formulae, and thus established a very general correspondence and canonical completeness result. SQEMA is based on transformation rules, the most important of which employs a modal version of a result by Ackermann that enables elimination of an existentially quantified predicate variable in a formula, provided a certain negative polarity condition on that variable is satisfied. In this paper we develop several extensions of SQEMA where that syntactic condition is replaced by a semantic one, viz. downward monotonicity. For the first, and most general, extension SSQEMA we prove correctness for a large class of modal formulae containing an extension of the Sahlqvist formulae, defined by replacing polarity with monotonicity. By employing a special modal version of Lyndon's monotonicity theorem and imposing additional requirements on the Ackermann rule we obtain restricted versions of SSQEMA which guarantee canonicity, too.
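The Ackermann result underlying the main transformation rule can be stated, in its classical second-order form, roughly as follows (our paraphrase, for orientation; the paper works with a modal version):

```latex
% Ackermann's lemma: if the predicate variable P does not occur in
% \alpha, and B(P) is negative (or, semantically, downward monotone)
% in P, then
\exists P \left[ \forall \bar{x}\,\bigl(\alpha(\bar{x}) \rightarrow P(\bar{x})\bigr) \wedge B(P) \right]
\;\equiv\; B(P := \alpha)
% i.e., the existentially quantified P can be eliminated by
% substituting \alpha for it in B.
```

Replacing the syntactic negativity condition by downward monotonicity of B in P is precisely the semantic generalization the paper pursues.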
This is a first tentative examination of the possibility of reinstating reduction as a valid candidate for presenting relations between mental and physical properties. Classical Nagelian reduction is undoubtedly contaminated in many ways, but here I investigate the possibility of adapting to problems concerning mental properties an alternative definition for theory reduction in philosophy of science. The definition I offer is formulated with the aid of non-monotonic logic, which I suspect might be a very interesting realm for testing notions concerning localized mental-physical reduction. The reason for this is that non-monotonic reasoning by definition is about appeals made not only to explicit observations, but also to an implicit selection of background knowledge containing heuristic information. The flexibility of this definition and the fact that it is not absolute, i.e. that the relation of reduction may be retracted or allowed to shift without fuss, add at least an interesting alternative factor to current materialist debates. South African Journal of Philosophy Vol. 25(2) 2006: 102-112.
When considering the social valuation of a life-year, there is a conflict between two basic intuitions: on the one hand, the intuition of universality, according to which the value of an additional life-year should be universal and, as such, should be invariant to the context considered; on the other hand, the intuition of complementarity, according to which the value of a life-year should depend on what this extra life-year allows for, and, hence, on the quality of that life-year, because the quantity of life and the quality of life are complementary. This paper proposes three distinct accounts of the intuition of universality, and shows that those accounts either conflict with a basic monotonicity property, or lead to indifference with respect to how life-years are distributed within the population. Those results support dropping the intuition of universality. But abandoning the intuition of universality does not prevent a social evaluator from giving priority, when allocating life-years, to individuals with the lowest quality of life.
In this paper, the term 'prioritarianism' is first defined, with some mathematical precision, on the basis of intuitive conceptions of prioritarianism, especially the idea that "benefiting people matters more the worse off these people are". (The prioritarian weighting function is monotonically increasing and concave, while its first derivative is smoothly decreasing and convex, but positive throughout.) Furthermore, (moderate welfare) egalitarianism is characterized. In particular, a new symmetry condition is defended, i.e. that egalitarianism evaluates upper and lower deviations from the social middle symmetrically and equally negatively (as do, e.g., variance and Gini). Finally, it is shown that this feature distinguishes egalitarianism also extensionally from prioritarianism.
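A worked example (ours) of a weighting function with exactly this profile:

```latex
% w(x) = sqrt(x) on x > 0 is increasing and concave:
w(x) = \sqrt{x}, \qquad
w'(x) = \tfrac{1}{2}x^{-1/2} > 0, \qquad
w''(x) = -\tfrac{1}{4}x^{-3/2} < 0
% and its first derivative is positive throughout, decreasing, and
% convex, since w'''(x) = \tfrac{3}{8}x^{-5/2} > 0.
```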
According to Brandom’s conceptual role semantics, to grasp a concept involves a commitment to drawing certain inferences. This is a consequence of the inferentialist thesis that the meaning of a term is given by its justification through assertibility conditions. Inferential commitments derive from a material notion of inference which underwrites human rational discourse and activity. In this paper I discuss a problem for Brandom’s semantics allegedly exposed in an argument by Paul Boghossian against Dummett’s and Brandom’s substantive conception of meaning. I contend that Boghossian’s analysis is dubious because it overlooks an important difference between Dummett’s and Brandom’s positions, related respectively to a monotonic and a non-monotonic view of the norm underwriting meaning.
A model-theoretic realist account of science places linguistic systems and their corresponding non-linguistic structures at different stages or different levels of abstraction of the scientific process. Apart from the obvious problem of underdetermination of theories by data, philosophers of science are also faced with the inverse (and very real) problem of overdetermination of theories by their empirical models, which is what this article will focus on. I acknowledge the contingency of the factors determining the nature – and choice – of a certain model at a certain time, but in my terms, this is a matter about which we can talk and whose structure we can formalise. In this article a mechanism for tracing "empirical choices" and their particularized observational-theoretical entanglements will be offered in the form of Yoav Shoham's version of non-monotonic logic. Such an analysis of the structure of scientific theories may clarify the motivations underlying choices in favor of certain empirical models (and not others) in a way that shows that "disentangling" theoretical and observation terms is more deeply model-specific than theory-specific. This kind of analysis offers a method for getting an articulable grip on the overdetermination of theories by their models – implied by empirical equivalence – which Kuipers' structuralist analysis of the structure of theories does not offer.
[Abstract] Suppose that the Big Bang was the first singularity in the history of the cosmos. Then it would be plausible to presume that the availability of strong general intelligence should mark the second singularity for the natural human race. The human race needs to be prepared to make sure that if a singularity robot becomes a person, the robotic person will be a blessing for humankind rather than a curse. To this end I scrutinize the implications of the hypothesis that the singularity robot is a member of the human society. I ask how the robot is equipped to satisfy ontological criteria such as accountability, consciousness, and identity, by demonstrating a possibility that it has epistemological capacities like conceptual-role-semantic understanding and non-monotonic inference, and by probing whether it can behave in the way human moral visions expect it to.

[Table of contents]
1. Opening: Singularity robots are coming
   1) Singularity robots of strong general intelligence
   2) A singularity robot is a member of the human community
2. Ontological interpretation of the singularity robot
   1) Responsibility: thinking, understanding, belief
   2) Consciousness: three characteristics – zombie, enjoyment, sympathy
   3) Identity: I, body, autonomy, unity
3. Epistemological prospects of the singularity robot
   1) Semantics of general intelligence: conceptual role semantics
   2) Logic for general intelligence: non-monotonic logic
4. Moral horizon of the singularity robot
   1) When a singularity robot becomes a robot person
   2) Humanity independence: the singularity robot is a user of human languages
5. Concluding: preemptive humanities
In previous work, we studied four well-known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield probability distributions with moderately high entropy levels. In the present article, we present data charting the performance of the four systems when reasoning in environments of various entropy levels. The results illustrate variations in the performance of the respective reasoning systems that derive from the entropy of the environment, and allow for a more inclusive assessment of the reliability and robustness of the four systems.
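How the entropy level of a randomly generated environment might be controlled is easy to illustrate (a sketch under our own assumptions: Dirichlet sampling and Shannon entropy in bits; not the authors' simulation code):

```python
import numpy as np

rng = np.random.default_rng(42)

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# The Dirichlet concentration parameter controls typical entropy:
# small alpha yields spiky, low-entropy distributions; large alpha
# yields near-uniform, high-entropy ones (max = log2(8) = 3 bits here).
for alpha in (0.05, 1.0, 50.0):
    dist = rng.dirichlet([alpha] * 8)
    print(f"alpha={alpha:>5}: H = {shannon_entropy(dist):.2f} bits")
```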