no code implementations • 1 Feb 2023 • Aria Khademi, Michael Hopka, Devesh Upadhyay
We further discuss multiple aspects of model monitoring and robustness that need to be analyzed simultaneously to achieve robustness for industry safety standards.
no code implementations • 2 Oct 2020 • Christopher Seto, Aria Khademi, Corina Graif, Vasant G. Honavar
This study explored how population mobility flows form commuting networks across US counties and influence the spread of COVID-19.
no code implementations • 1 Aug 2020 • Aria Khademi, Vasant Honavar
We aim to address this problem in settings where the predictive model is a black box; that is, we can only observe the model's responses to various inputs, but have no knowledge of its internal structure, its parameters, the objective function, or the algorithm used to optimize the model.
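The black-box setting described above can be sketched in a few lines: the only operation available is querying the model for predictions, with no access to its internals. The sketch below is illustrative only; `black_box_predict` is a hypothetical stand-in for any opaque predictor, and the query-based sensitivity probe is one simple example of what input/output access alone permits, not the method of the paper.

```python
import numpy as np

def black_box_predict(x):
    # Hypothetical stand-in for an opaque model: from the caller's
    # perspective, its internals are assumed unknown.
    return float(1.0 / (1.0 + np.exp(-(2.0 * x[0] - 0.5 * x[1]))))

def query_sensitivity(predict, x, eps=1e-4):
    """Estimate per-feature sensitivity using only prediction queries,
    never touching the model's parameters or structure."""
    x = np.asarray(x, dtype=float)
    base = predict(x)
    grads = np.zeros_like(x)
    for i in range(len(x)):
        x_pert = x.copy()
        x_pert[i] += eps                      # perturb one input feature
        grads[i] = (predict(x_pert) - base) / eps
    return grads

sens = query_sensitivity(black_box_predict, [0.3, 0.7])
```

Here every piece of information about the model is obtained through `predict(...)` calls alone, which is exactly the access regime the black-box assumption imposes.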
no code implementations • 24 Nov 2019 • Aria Khademi, Vasant Honavar
Specifically, we assess whether COMPAS exhibits racial bias against African American defendants using FACT, a recently introduced causality grounded measure of algorithmic fairness.
no code implementations • 27 Mar 2019 • Aria Khademi, Sanghack Lee, David Foley, Vasant Honavar
As virtually all aspects of our lives are increasingly impacted by algorithmic decision making systems, it is incumbent upon us as a society to ensure such systems do not become instruments of unfair discrimination on the basis of gender, race, ethnicity, religion, etc.