arXiv (Cornell University)
Problems with Shapley-value-based explanations as feature importance measures
February 2020 • I. Elizabeth Kumar, Suresh Venkatasubramanian, Carlos Scheidegger, Sorelle A. Friedler
Game-theoretic formulations of feature importance have become popular as a way to "explain" machine learning models. These methods define a cooperative game between the features of a model and distribute influence among these input elements using some form of the game's unique Shapley values. Justification for these methods rests on two pillars: their desirable mathematical properties, and their applicability to specific motivations for explanations. We show that mathematical problems arise when Shapley values are…
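For context, the "unique Shapley values" the abstract mentions are the standard cooperative-game quantity: feature i receives the weighted average of its marginal contributions over all coalitions S of the remaining features, where N is the full feature set and v(S) is the value the explanation method assigns to the model when only the features in S are "present." How v is defined for a machine learning model (e.g. by conditioning on or intervening over the missing features) varies between methods, which is part of what "some form of" alludes to; the sketch below is the generic formula, not any one method's definition of v.

\[
\phi_i(v) \;=\; \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,\bigl(|N| - |S| - 1\bigr)!}{|N|!}\,\bigl(v(S \cup \{i\}) - v(S)\bigr)
\]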
Topics: Shapley Value • Computer Science • Game Theory • Artificial Intelligence • Mathematics • Machine Learning • Philosophy