Fundamental Limits of Prediction, Generalization, and Recursion: An Entropic-Innovations Perspective

12 Jan 2020 · Song Fang, Quanyan Zhu

In this paper, we examine the fundamental performance limits of prediction, with or without side information. More specifically, we derive generic lower bounds on the $\mathcal{L}_p$ norms of the prediction errors that hold for any prediction algorithm and any data distribution. We then combine entropic analysis from information theory with the innovations approach from prediction/estimation theory to characterize the conditions (in terms of, e.g., directed information or mutual information) under which these bounds are achieved. We also investigate the implications of the results for the fundamental limits of generalization in fitting (learning) problems, viewed as prediction with side information, and for the fundamental limits of recursive algorithms, viewed as generalized prediction problems.
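As a rough illustration of where such $\mathcal{L}_p$ bounds come from, the following is a minimal derivation sketch in LaTeX of the classical maximum-entropy argument; it is our reading of the general technique, not necessarily the paper's exact statement. For a prediction error $e$ with differential entropy $h(e)$ (in nats), among all densities with a fixed $p$-th absolute moment, entropy is maximized by a generalized Gaussian $f(x) \propto e^{-\lambda |x|^p}$, which yields a distribution-free lower bound:

% Maximum-entropy sketch (assumption: h(e) is measured in nats).
% Under the moment constraint on E|e|^p, entropy is maximized by
% the generalized Gaussian f(x) = c exp(-lambda |x|^p), giving
%   h(e) <= 1/p + log( (2 Gamma(1/p) / p) (p E|e|^p)^{1/p} ),
% which rearranges to
\[
  \bigl(\mathbb{E}\,|e|^p\bigr)^{1/p}
  \;\ge\;
  \frac{e^{h(e)}}{2\,\Gamma\!\bigl(1+\tfrac{1}{p}\bigr)\,(pe)^{1/p}},
\]
% with equality iff e is generalized Gaussian. For p = 2 this
% reduces to E[e^2] >= e^{2 h(e)} / (2 pi e), the entropy power
% of e, which is tight exactly when e is Gaussian.

Such a bound is algorithm-independent because, roughly speaking, when the prediction $\hat{x}$ is a function of the observed past, $h(e) \ge h(e \mid \text{past}) = h(x \mid \text{past})$, a quantity that does not depend on the predictor; the innovations and directed-information conditions in the paper characterize when a predictor drives $h(e)$ down to that conditional limit, making the bound tight.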
