  • Is Causal Reasoning Harder Than Probabilistic Reasoning? Milan Mossé, Duligur Ibeling & Thomas Icard - 2024 - Review of Symbolic Logic 17 (1):106-131.
    Many tasks in statistical and causal inference can be construed as problems of entailment in a suitable formal language. We ask whether those problems are more difficult, from a computational perspective, for causal probabilistic languages than for pure probabilistic (or “associational”) languages. Despite several senses in which causal reasoning is indeed more complex—both expressively and inferentially—we show that causal entailment (or satisfiability) problems can be systematically and robustly reduced to purely probabilistic problems. Thus there is no jump in computational complexity. (...)
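    To make the contrast concrete, here is a schematic example (mine, not the authors') of the two kinds of sentences at issue: an associational sentence constrains ordinary conditional probabilities, while a causal sentence constrains interventional (do-operator) probabilities,

      \[
        \varphi_{\mathrm{assoc}}:\; P(Y=1 \mid X=1) \ge \tfrac{1}{2},
        \qquad
        \varphi_{\mathrm{causal}}:\; P\big(Y=1 \mid \mathrm{do}(X=1)\big) \ge \tfrac{1}{2}.
      \]

    The reduction the abstract reports says that deciding satisfiability (or entailment) for sentences like the second reduces to the corresponding problem for sentences like the first, so the complexity class does not change.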
  • Gravity and Grace. Gordon Belot - 2022 - Philosophers' Imprint 22 (1).
    This paper revisits the bearing of underdetermination arguments on scientific realism. First it argues that underdetermination considerations provide good reason to doubt that science is objective in the strong sense that anyone following its methods will be led closer and closer to the truth about any given question within the purview of those methods, as more relevant data are considered. Then it argues that scientific realism is difficult to maintain in the absence of this sort of objectivity.
  • Unprincipled. Gordon Belot - 2024 - Review of Symbolic Logic 17 (2):435-474.
    It is widely thought that chance should be understood in reductionist terms: claims about chance should be understood as claims that certain patterns of events are instantiated. There are many possible reductionist theories of chance, differing as to which possible pattern of events they take to be chance-making. It is also widely taken to be a norm of rationality that credence should defer to chance: special cases aside, rationality requires that one’s credence function, when conditionalized on the chance-making facts, should (...)
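    The deference norm sketched here is usually written in Principal-Principle form (a standard gloss, not Belot's own formulation): for a rational initial credence function \(C\), any proposition \(A\), value \(x\), and admissible evidence \(E\),

      \[
        C\big(A \mid \langle \mathrm{ch}(A)=x \rangle \wedge E\big) = x,
      \]

    where \(\langle \mathrm{ch}(A)=x \rangle\) is the proposition that the chance of \(A\) is \(x\). Reductionist theories of chance then disagree about which patterns of events make that proposition true.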
  • The no-free-lunch theorems of supervised learning. Tom F. Sterkenburg & Peter D. Grünwald - 2021 - Synthese 199 (3-4):9979-10015.
    The no-free-lunch theorems promote a skeptical conclusion that all possible machine learning algorithms equally lack justification. But how could this leave room for a learning theory that shows that some algorithms are better than others? Drawing parallels to the philosophy of induction, we point out that the no-free-lunch results presuppose a conception of learning algorithms as purely data-driven. On this conception, every algorithm must have an inherent inductive bias that wants justification. We argue that many standard learning algorithms should rather (...)
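    As a toy illustration of the purely data-driven reading of the theorems (a brute-force sketch, not code from the paper), the snippet below averages off-training-set accuracy over every Boolean target function on a five-point domain; two quite different learners come out exactly equal.

      # No-free-lunch toy check: averaged uniformly over all Boolean target
      # functions on a finite domain, any two learners have the same expected
      # off-training-set accuracy (0.5), whatever they do with the training data.
      from itertools import product

      DOMAIN = list(range(5))                               # inputs 0..4
      TRAIN_X = [0, 1, 2]                                    # training inputs
      TEST_X = [x for x in DOMAIN if x not in TRAIN_X]       # off-training-set inputs

      def learner_majority(train_pairs):
          """Predict the majority training label everywhere (ties -> 0)."""
          ones = sum(y for _, y in train_pairs)
          guess = 1 if 2 * ones > len(train_pairs) else 0
          return lambda x: guess

      def learner_constant_one(train_pairs):
          """Ignore the data and always predict 1."""
          return lambda x: 1

      def mean_ots_accuracy(learner):
          """Average off-training-set accuracy over all 2**5 target functions."""
          total, count = 0.0, 0
          for labels in product([0, 1], repeat=len(DOMAIN)):  # every target f
              target = dict(zip(DOMAIN, labels))
              h = learner([(x, target[x]) for x in TRAIN_X])
              total += sum(h(x) == target[x] for x in TEST_X) / len(TEST_X)
              count += 1
          return total / count

      print(mean_ots_accuracy(learner_majority))      # 0.5
      print(mean_ots_accuracy(learner_constant_one))  # 0.5

    Both averages come out exactly 0.5; the abstract's point is that this uniformity presupposes treating learners as purely data-driven in this way.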
  • No free theory choice from machine learning. Bruce Rushing - 2022 - Synthese 200 (5):1-21.
    Ravit Dotan argues that a No Free Lunch theorem from machine learning shows epistemic values are insufficient for deciding the truth of scientific hypotheses. She argues that NFL shows that the best case accuracy of scientific hypotheses is no more than chance. Since accuracy underpins every epistemic value, non-epistemic values are needed to assess the truth of scientific hypotheses. However, NFL cannot be coherently applied to the problem of theory choice. The NFL theorem Dotan’s argument relies upon is a member (...)