Jonathan Vandenburgh
Stanford University
  1. Causal Models and the Logic of Counterfactuals. Jonathan Vandenburgh - manuscript
    Causal models show promise as a foundation for the semantics of counterfactual sentences. However, current approaches face limitations compared to the alternative similarity theory: they only apply to a limited subset of counterfactuals and the connection to counterfactual logic is not straightforward. This paper addresses these difficulties using exogenous interventions, where causal interventions change the values of exogenous variables rather than structural equations. This model accommodates judgments about backtracking counterfactuals, extends to logically complex counterfactuals, and validates familiar principles of counterfactual (...)
    2 citations
  2. Triviality Results, Conditional Probability, and Restrictor Conditionals. Jonathan Vandenburgh - manuscript
    Conditional probability is often used to represent the probability of the conditional. However, triviality results suggest that the thesis that the probability of the conditional always equals conditional probability leads to untenable conclusions. In this paper, I offer an interpretation of this thesis in a possible worlds framework, arguing that the triviality results make assumptions at odds with the use of conditional probability. I argue that these assumptions come from a theory called the operator theory and that the rival restrictor (...)
    1 citation
  3. A Causal Safety Criterion for Knowledge. Jonathan Vandenburgh - manuscript
    Safety purports to explain why cases of accidentally true belief are not knowledge, addressing Gettier cases and cases of belief based on statistical evidence. However, numerous examples suggest that safety fails as a condition on knowledge: a belief can be safe even when one's evidence is clearly insufficient for knowledge, and knowledge is compatible with the nearby possibility of error, a situation ruled out by the safety condition. In this paper, I argue for a new modal condition designed to capture (...)
  4. Learning as Hypothesis Testing: Learning Conditional and Probabilistic Information. Jonathan Vandenburgh - manuscript
    Complex constraints like conditionals ('If A, then B') and probabilistic constraints ('The probability that A is p') pose problems for Bayesian theories of learning. Since these propositions do not express constraints on outcomes, agents cannot simply conditionalize on the new information. Furthermore, a natural extension of conditionalization, relative information minimization, leads to many counterintuitive predictions, evidenced by the sundowners problem and the Judy Benjamin problem. Building on the notion of a 'paradigm shift' and empirical research in psychology and economics, I (...)