Epistemic values in feature importance methods: Lessons from feminist epistemology

29 Jan 2021  ·  Leif Hancox-Li, I. Elizabeth Kumar ·

As the public seeks greater accountability and transparency from machine learning algorithms, the research literature on methods to explain algorithms and their outputs has rapidly expanded. Feature importance methods form a popular class of explanation methods. In this paper, we apply the lens of feminist epistemology to recent feature importance research. We investigate what epistemic values are implicitly embedded in feature importance methods and whether they conflict with feminist epistemology. We offer some suggestions on how to conduct research on explanations in a way that respects feminist epistemic values: taking into account the importance of social context and the epistemic privilege of subjugated knowers, and adopting more interactional ways of knowing.


Categories

Computers and Society
