Citations of:

Learning Conditional Information

Mind and Language 27 (3):239-263 (2012)

  • Confirmation, Coherence and the Strength of Arguments. Stephan Hartmann & Borut Trpin - 2023 - Proceedings of the Annual Meeting of the Cognitive Science Society 45:1473-1479.
    Alongside science and law, argumentation is also of central importance in everyday life. But what characterizes a good argument? This question has occupied philosophers and psychologists for centuries. The theory of Bayesian argumentation is particularly suitable for clarifying it, because it allows us to take into account in a natural way the role of uncertainty, which is central to much argumentation. Moreover, it offers the possibility of measuring the strength of an argument in probabilistic terms. One way to do this, (...)
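One way to make the abstract above concrete: a common measure of argument strength in the Bayesian argumentation literature (offered here only as an illustration, not as the measure the paper itself develops) scores an argument from premises E to conclusion C by the change in probability it induces:

```latex
% Illustrative difference measure of argument strength (an assumption for
% exposition, not a formula taken from the entry above).
% E collects the premises, C is the conclusion, P is the agent's credence function.
\[
  s(E, C) \;=\; P(C \mid E) \;-\; P(C)
\]
% s(E,C) > 0: the premises confirm the conclusion;
% s(E,C) = 0: the premises are probabilistically irrelevant to it;
% s(E,C) < 0: the premises disconfirm it.
```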
  • On argument strength. Niki Pfeifer - 2012 - In Frank Zenker (ed.), Bayesian Argumentation – The Practical Side of Probability. Springer. pp. 185-193.
    Everyday life reasoning and argumentation is defeasible and uncertain. I present a probability logic framework to rationally reconstruct everyday life reasoning and argumentation. Coherence in the sense of de Finetti is used as the basic rationality norm. I discuss two basic classes of approaches to construct measures of argument strength. The first class imposes a probabilistic relation between the premises and the conclusion. The second class imposes a deductive relation. I argue for the second class, as the first class is (...)
  • The Compact Compendium of Experimental Philosophy. Alexander Max Bauer & Stephan Kornmesser (eds.) - 2023 - Berlin and Boston: De Gruyter.
  • Learning as Hypothesis Testing: Learning Conditional and Probabilistic Information. Jonathan Vandenburgh - manuscript
    Complex constraints like conditionals ('If A, then B') and probabilistic constraints ('The probability that A is p') pose problems for Bayesian theories of learning. Since these propositions do not express constraints on outcomes, agents cannot simply conditionalize on the new information. Furthermore, a natural extension of conditionalization, relative information minimization, leads to many counterintuitive predictions, evidenced by the sundowners problem and the Judy Benjamin problem. Building on the notion of a 'paradigm shift' and empirical research in psychology and economics, I (...)
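Because this entry and several others below turn on what relative information minimization predicts, a minimal numerical sketch of that update rule on the Judy Benjamin problem may help. The setup and numbers are the standard ones from the literature (four equiprobable regions; the learned conditional fixes P(Headquarters | Red) = 3/4); the code itself is a toy reconstruction, not code from the paper.

```python
# Toy sketch of relative information minimization (KL minimization) for the
# Judy Benjamin problem; purely illustrative.
import math

PRIOR = 0.25  # uniform prior over the four cells: Red-HQ, Red-2nd, Blue-HQ, Blue-2nd

def kl_from_prior(red_mass):
    """KL divergence from the uniform prior when Red territory carries red_mass,
    split 3:1 between HQ and 2nd Company (the learned constraint), while the
    Blue mass 1 - red_mass stays evenly split (nothing was learned about Blue)."""
    cells = [0.75 * red_mass, 0.25 * red_mass,
             0.5 * (1 - red_mass), 0.5 * (1 - red_mass)]
    return sum(q * math.log(q / PRIOR) for q in cells if q > 0)

# One-dimensional grid search for the posterior probability of Red territory.
best_red = min((i / 10000 for i in range(1, 10000)), key=kl_from_prior)
print(f"P(Red) = {best_red:.3f}, P(Blue) = {1 - best_red:.3f}")
# Prints P(Red) = 0.467, P(Blue) = 0.533: the counterintuitive shift toward Blue
# that makes the Judy Benjamin problem a benchmark case.
```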
  • Learning from Conditionals. Benjamin Eva, Stephan Hartmann & Soroush Rafiee Rad - 2020 - Mind 129 (514):461-508.
    In this article, we address a major outstanding question of probabilistic Bayesian epistemology: how should a rational Bayesian agent update their beliefs upon learning an indicative conditional? A number of authors have recently contended that this question is fundamentally underdetermined by Bayesian norms, and hence that there is no single update procedure that rational agents are obliged to follow upon learning an indicative conditional. Here we resist this trend and argue that a core set of widely accepted Bayesian norms is (...)
  • Bayesian Argumentation and the Value of Logical Validity. Benjamin Eva & Stephan Hartmann - unknown
    According to the Bayesian paradigm in the psychology of reasoning, the norms by which everyday human cognition is best evaluated are probabilistic rather than logical in character. Recently, the Bayesian paradigm has been applied to the domain of argumentation, where the fundamental norms are traditionally assumed to be logical. Here, we present a major generalisation of extant Bayesian approaches to argumentation that (i) utilizes a new class of Bayesian learning methods that are better suited to modelling dynamic and conditional inferences than (...)
  • Making Ranking Theory Useful for Psychology of Reasoning. Niels Skovgaard Olsen - 2014 - Dissertation, University of Konstanz
    An organizing theme of the dissertation is the issue of how to make philosophical theories useful for scientific purposes. An argument is presented for the contention that it doesn’t suffice merely to motivate one’s theories theoretically and make them compatible with existing data; philosophers with this aim should ideally contribute to identifying unique and hard-to-vary predictions of their theories. This methodological recommendation is applied to the ranking-theoretic approach to conditionals, which emphasizes the epistemic relevance and the (...)
  • Conditional Learning Through Causal Models. Jonathan Vandenburgh - 2020 - Synthese (1-2):2415-2437.
    Conditional learning, where agents learn a conditional sentence ‘If A, then B,’ is difficult to incorporate into existing Bayesian models of learning. This is because conditional learning is not uniform: in some cases, learning a conditional requires decreasing the probability of the antecedent, while in other cases, the antecedent probability stays constant or increases. I argue that how one learns a conditional depends on the causal structure relating the antecedent and the consequent, leading to a causal model of conditional learning. (...)
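As a toy illustration of the general idea that causal structure can determine which probabilities a conditional update is allowed to move (my own example, not the model developed in the paper above), consider a two-variable network A → B in which learning 'If A, then B' is read as revising only the conditional probability table for B, leaving the probability of the antecedent untouched:

```python
# Hypothetical two-node causal network A -> B; learning the conditional is modelled
# here as setting P(B | A) = 1 in the conditional probability table, which by
# construction leaves P(A) unchanged. Purely illustrative.

def joint(p_a, p_b_given_a, p_b_given_not_a):
    """Joint distribution over (A, B) induced by the network A -> B."""
    return {
        (True, True): p_a * p_b_given_a,
        (True, False): p_a * (1 - p_b_given_a),
        (False, True): (1 - p_a) * p_b_given_not_a,
        (False, False): (1 - p_a) * (1 - p_b_given_not_a),
    }

prior = joint(p_a=0.5, p_b_given_a=0.7, p_b_given_not_a=0.2)
posterior = joint(p_a=0.5, p_b_given_a=1.0, p_b_given_not_a=0.2)  # impose P(B | A) = 1

p_a_after = posterior[(True, True)] + posterior[(True, False)]
print(f"P(A) after learning the conditional: {p_a_after}")  # 0.5, as before
```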
  • How to Learn Concepts, Consequences, and Conditionals. Franz Huber - 2015 - Analytica: an electronic, open-access journal for philosophy of science 1 (1):20-36.
    In this brief note I show how to model conceptual change, logical learning, and revision of one's beliefs in response to conditional information such as indicative conditionals that do not express propositions.
  • A New Approach to Testimonial Conditionals. Stephan Hartmann & Ulrike Hahn - 2020 - In Stephan Hartmann & Ulrike Hahn (eds.), CogSci 2020 Proceedings. Toronto, Ontario, Canada: pp. 981–986.
    Conditionals pervade every aspect of our thinking, from the mundane and everyday such as ‘if you eat too much cheese, you will have nightmares’ to the most fundamental concerns as in ‘if global warming isn’t halted, sea levels will rise dramatically’. Many decades of research have focussed on the semantics of conditionals and how people reason from conditionals in everyday life. What has been rather overlooked, however, is how we come to such conditionals in the first place. In many cases, they (...)
  • Formal Epistemology and the New Paradigm Psychology of Reasoning. Niki Pfeifer & Igor Douven - 2014 - Review of Philosophy and Psychology 5 (2):199-221.
    This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in the studies of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by the use of probability theory as a rationality framework instead of classical logic, used by more traditional approaches to the psychology of reasoning. This paper presents a new interdisciplinary research program which involves both formal and experimental work. To illustrate the program, the paper discusses recent work on the (...)
  • Learning Conditional Information by Jeffrey Imaging on Stalnaker Conditionals. Mario Günther - 2018 - Journal of Philosophical Logic 47 (5):851-876.
    We propose a method of learning indicative conditional information. An agent learns conditional information by Jeffrey imaging on the minimally informative proposition expressed by a Stalnaker conditional. We show that the predictions of the proposed method align with the intuitions in Douven’s (2012) benchmark examples. Jeffrey imaging on Stalnaker conditionals can also capture the learning of uncertain conditional information, which we illustrate by generating predictions for the Judy Benjamin Problem.
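For readers unfamiliar with the update operation named in the abstract above, here is a small self-contained sketch of imaging and Jeffrey imaging over a toy set of worlds with a Stalnaker-style selection function. It only illustrates the general operation; the worlds, selection function, and learning parameter are invented for the example and are not the paper's own construction.

```python
# Toy imaging / Jeffrey imaging over three worlds. All names and numbers are
# hypothetical illustrations of the general operation, not the paper's model.
from typing import Callable, Dict, FrozenSet

World = str

def image(prior: Dict[World, float],
          prop: FrozenSet[World],
          closest: Callable[[World, FrozenSet[World]], World]) -> Dict[World, float]:
    """Lewis-style imaging: each world passes its probability to its closest prop-world."""
    posterior = {w: 0.0 for w in prior}
    for w, p in prior.items():
        posterior[closest(w, prop)] += p
    return posterior

def jeffrey_image(prior, prop, closest, k):
    """Jeffrey imaging: mix imaging on prop (weight k) with imaging on its complement."""
    complement = frozenset(prior) - prop
    on_prop = image(prior, prop, closest)
    on_comp = image(prior, complement, closest)
    return {w: k * on_prop[w] + (1 - k) * on_comp[w] for w in prior}

# Three worlds with a uniform prior and a linear similarity order w1 < w2 < w3.
prior = {"w1": 1 / 3, "w2": 1 / 3, "w3": 1 / 3}
order = ["w1", "w2", "w3"]

def closest(w, prop):
    if w in prop:
        return w  # a prop-world is its own closest prop-world
    return min(prop, key=lambda v: abs(order.index(v) - order.index(w)))

# Learn the proposition {w1, w2} to degree 0.9.
print(jeffrey_image(prior, frozenset({"w1", "w2"}), closest, k=0.9))
# {'w1': 0.3, 'w2': 0.6, 'w3': 0.1} (up to floating point)
```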
  • Ramsey’s conditionals. Mario Günther & Caterina Sisti - 2022 - Synthese 200 (2):1-31.
    In this paper, we propose a unified account of conditionals inspired by Frank Ramsey. Most contemporary philosophers agree that Ramsey’s account applies to indicative conditionals only. We observe against this orthodoxy that his account covers subjunctive conditionals as well—including counterfactuals. In light of this observation, we argue that Ramsey’s account of conditionals resembles Robert Stalnaker’s possible worlds semantics supplemented by a model of belief. The resemblance suggests reinterpreting the notion of conditional degree of belief in order to overcome a (...)
  • Dynamic inference and everyday conditional reasoning in the new paradigm. Mike Oaksford & Nick Chater - 2013 - Thinking and Reasoning 19 (3-4):346-379.
  • Learning from Simple Indicative Conditionals. Leendert M. Huisman - 2017 - Erkenntnis 82 (3):583-601.
    An agent who receives information in the form of an indicative conditional statement and who trusts her source will modify her credences to bring them in line with the conditional. I will argue that the agent, upon the acquisition of such information, should, in general, expand her prior credence function to an indeterminate posterior one; that is, to a set of credence functions. Two different ways the agent might interpret the conditional will be presented, and the properties of the resulting (...)