Local explanations via necessity and sufficiency: unifying theory and practice

Minds and Machines 32:185-218 (2022)

Abstract

Necessity and sufficiency are the building blocks of all successful explanations. Yet despite their importance, these notions have been conceptually underdeveloped and inconsistently applied in explainable artificial intelligence (XAI), a fast-growing research area that so far lacks firm theoretical foundations. Building on work in logic, probability, and causality, we establish the central role of necessity and sufficiency in XAI, unifying seemingly disparate methods in a single formal framework. We provide a sound and complete algorithm for computing explanatory factors with respect to a given context, and demonstrate its flexibility and competitive performance against state-of-the-art alternatives on various tasks.
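The necessity/sufficiency framing the abstract describes can be illustrated with a small sketch. This is not the paper's algorithm; the toy model, feature names, and frequency estimators below are hypothetical stand-ins, in the spirit of Pearl-style probabilities of necessity and sufficiency for a binary classifier.

```python
import random

def predict(x):
    # Hypothetical stand-in for a trained binary classifier.
    return int(x["a"] and x["b"])

def necessity_sufficiency(model, samples, factor, value):
    """Estimate, by simple frequency counts:
    - sufficiency: among negatively classified inputs, how often does
      forcing `factor = value` flip the prediction to positive?
    - necessity: among positively classified inputs where `factor == value`,
      how often does removing that value flip the prediction to negative?
    """
    suff = nec = n_suff = n_nec = 0
    for x in samples:
        y = model(x)
        if y == 0:
            xi = dict(x)
            xi[factor] = value
            n_suff += 1
            suff += model(xi)          # flipped to positive?
        elif x[factor] == value:
            xi = dict(x)
            xi[factor] = 1 - value
            n_nec += 1
            nec += 1 - model(xi)       # flipped to negative?
    return (suff / n_suff if n_suff else 0.0,
            nec / n_nec if n_nec else 0.0)

random.seed(0)
samples = [{"a": random.randint(0, 1), "b": random.randint(0, 1)}
           for _ in range(1000)]
ps, pn = necessity_sufficiency(predict, samples, "a", 1)
```

For the toy conjunction `a and b`, the factor `a = 1` is fully necessary (every positive case flips when it is removed) but only partially sufficient (setting it forces a positive only when `b = 1`), which is the kind of asymmetry the framework is designed to quantify.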

Author Profiles

David Watson
University College London
Luciano Floridi
Yale University
