I am a second-year Computing Ph.D. student at the University of Utah, where I am advised by Suresh Venkatasubramanian. I am interested in analyzing and measuring the social impact of black-box machine learning systems in order to inform better AI law and policy. My research has been supported by the ARCS Foundation.

Previously, I worked on the Data Science team at MassMutual while completing my M.S. in Computer Science at the University of Massachusetts. I received my B.A. in Mathematics from Scripps College in 2016.

cv: here
email: kumari at cs dot utah dot edu

Papers

Shapley Residuals: Quantifying the limits of the Shapley value for explanations. I. Elizabeth Kumar, Carlos Scheidegger, Suresh Venkatasubramanian, Sorelle Friedler. Presented at the 5th ICML Workshop on Human Interpretability in Machine Learning (WHI), 2020.

Problems with Shapley-value-based explanations as feature importance measures. I. Elizabeth Kumar, Suresh Venkatasubramanian, Carlos Scheidegger, Sorelle Friedler. Forthcoming in Proceedings of the 37th International Conference on Machine Learning (ICML), 2020.