Rademacher Complexity Bounds for Non-I.I.D. Processes

NeurIPS 2008 · Mehryar Mohri, Afshin Rostamizadeh

This paper presents the first data-dependent generalization bounds for non-i.i.d. settings based on the notion of Rademacher complexity. Our bounds extend existing Rademacher complexity bounds derived for the i.i.d. setting to the non-i.i.d. case. They provide a strict generalization of the i.i.d. bounds and can also be used within the standard i.i.d. scenario. They apply to the standard setting of beta-mixing stationary sequences examined in many previous studies of non-i.i.d. settings and benefit from the crucial advantages of Rademacher complexity over other measures of the complexity of hypothesis classes. In particular, they are data-dependent and measure the complexity of a class of hypotheses based on the training sample. The empirical Rademacher complexity can be estimated from finite samples and leads to tighter bounds.
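The abstract's remark that the empirical Rademacher complexity can be estimated from a finite sample can be illustrated concretely. The sketch below (not from the paper) Monte Carlo-estimates the empirical Rademacher complexity of a finite hypothesis class given its predictions on a fixed sample; the function name empirical_rademacher and the random sign hypotheses in the example are illustrative assumptions.

```python
import numpy as np

def empirical_rademacher(predictions, n_draws=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    of a finite hypothesis class on a fixed sample S.

    predictions: array of shape (n_hypotheses, m) holding h(x_i) for
                 each hypothesis h and sample point x_i, with values
                 in [-1, +1].
    Returns an estimate of E_sigma[ sup_h (1/m) sum_i sigma_i h(x_i) ].
    """
    rng = np.random.default_rng(seed)
    n_hyp, m = predictions.shape
    total = 0.0
    for _ in range(n_draws):
        # Rademacher variables: independent uniform +/-1 signs.
        sigma = rng.choice([-1.0, 1.0], size=m)
        # Supremum over the (finite) class of the empirical correlation
        # between hypothesis outputs and the random signs.
        total += np.max(predictions @ sigma) / m
    return total / n_draws

if __name__ == "__main__":
    # Hypothetical example: 50 random sign hypotheses on a sample of size 200.
    rng = np.random.default_rng(1)
    preds = rng.choice([-1.0, 1.0], size=(50, 200))
    print(empirical_rademacher(preds))
```

For richer classes (e.g., kernel-based hypotheses), the inner supremum would be computed analytically or by optimization rather than by a max over an enumerated class; the Monte Carlo average over sigma draws stays the same.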

No code implementations yet.

Datasets


No datasets listed for this paper.

Results from the Paper


No results listed for this paper.

Methods


No methods listed for this paper.