Power analysis of knockoff filters for correlated designs

NeurIPS 2019 · Jingbo Liu, Philippe Rigollet

The knockoff filter introduced by Barber and Candès (2016) is an elegant framework for controlling the false discovery rate in variable selection. While empirical results indicate that this methodology is not too conservative, there is no conclusive theoretical result on its power. When the predictors are i.i.d. Gaussian, it is known that, as the signal-to-noise ratio tends to infinity, the knockoff filter is consistent in the sense that one can drive the FDR to 0 and the power to 1 simultaneously. In this work we study the case where the predictors have a general covariance matrix $\Sigma$. We introduce a simple functional of the covariance matrix $\Sigma$, called effective signal deficiency (ESD), that predicts consistency of various variable selection methods. In particular, ESD reveals that the structure of the precision matrix $\Sigma^{-1}$ plays a central role in consistency, and therefore so does the conditional independence structure of the predictors. To leverage this connection, we introduce the conditional independence knockoff, a simple procedure that competes with more sophisticated knockoff filters and is well defined when the predictors obey a Gaussian tree graphical model (or when the graph is sufficiently sparse). Our theoretical results are supported by numerical evidence on synthetic data.
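For readers unfamiliar with the baseline procedure studied here, the sketch below illustrates a generic model-X Gaussian knockoff filter: the standard equicorrelated knockoff construction paired with a lasso coefficient-difference statistic and the knockoff+ threshold. This is only a hedged illustration of the framework, not the conditional independence knockoff proposed in the paper; the choice of $s$, the lasso penalty, and the synthetic AR(1) design are illustrative assumptions.

```python
# Minimal sketch of a model-X Gaussian knockoff filter (equicorrelated
# construction). Illustrative only -- not the paper's conditional
# independence knockoff; penalty and design choices are assumptions.
import numpy as np
from sklearn.linear_model import Lasso

def gaussian_knockoffs(X, Sigma, rng):
    """Sample Gaussian knockoffs for rows of X ~ N(0, Sigma)."""
    p = Sigma.shape[0]
    # Equicorrelated choice s_j = min(1, 2*lambda_min(Sigma)),
    # assuming Sigma has unit diagonal (standardized predictors).
    s = np.full(p, min(1.0, 2.0 * np.linalg.eigvalsh(Sigma).min()))
    Sigma_inv = np.linalg.inv(Sigma)
    D = np.diag(s)
    mu = X - X @ Sigma_inv @ D           # conditional mean of knockoffs
    V = 2.0 * D - D @ Sigma_inv @ D      # conditional covariance
    L = np.linalg.cholesky(V + 1e-10 * np.eye(p))
    return mu + rng.standard_normal(X.shape) @ L.T

def knockoff_filter(X, X_tilde, y, q=0.1):
    """Select variables with the knockoff+ threshold at target FDR q."""
    p = X.shape[1]
    beta = Lasso(alpha=0.1).fit(np.hstack([X, X_tilde]), y).coef_
    W = np.abs(beta[:p]) - np.abs(beta[p:])   # feature statistics
    T = np.inf
    for t in np.sort(np.abs(W[W != 0])):
        fdp = (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
        if fdp <= q:
            T = t
            break
    return np.where(W >= T)[0]

# Toy usage on synthetic data with an AR(1) covariance.
rng = np.random.default_rng(0)
n, p, k = 500, 50, 10
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
beta_true = np.zeros(p)
beta_true[:k] = 1.0
y = X @ beta_true + rng.standard_normal(n)
print("selected:", knockoff_filter(X, gaussian_knockoffs(X, Sigma, rng), y))
```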
