no code implementations • 1 Feb 2024 • Francisco Daunas, Iñaki Esnaola, Samir M. Perlaza, H. Vincent Poor
The solution to empirical risk minimization with $f$-divergence regularization (ERM-$f$DR) is presented under mild conditions on $f$.
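The general solution depends on the choice of $f$; for the classical special case $f(x) = x \log x$ (relative entropy), the ERM-$f$DR solution is the well-known Gibbs measure, whose Radon-Nikodym derivative with respect to the reference measure $Q$ is proportional to $\exp(-L(\theta)/\lambda)$. A minimal sketch of this special case on a discrete model set (the function name, toy risk values, and the regularization parameter `lam` are illustrative assumptions, not from the paper):

```python
import numpy as np

def gibbs_solution(risks, q, lam):
    """KL special case of ERM-fDR: the Gibbs measure.

    dP/dQ(theta) is proportional to exp(-L(theta)/lam); normalizing over
    the discrete model set yields a probability vector.
    """
    w = q * np.exp(-np.asarray(risks) / lam)
    return w / w.sum()

risks = np.array([0.2, 0.5, 1.0])   # toy empirical risks L(theta)
q = np.array([0.5, 0.3, 0.2])       # reference measure Q
p = gibbs_solution(risks, q, lam=0.5)
```

Models with lower empirical risk receive more mass relative to the reference measure, with `lam` controlling how strongly the data reweights $Q$.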
no code implementations • 12 Jun 2023 • Francisco Daunas, Iñaki Esnaola, Samir M. Perlaza, H. Vincent Poor
The analysis of the solution unveils the following properties of relative entropy when it acts as a regularizer in the ERM-RER problem: i) relative entropy forces the support of the Type-II solution to collapse into the support of the reference measure, which introduces a strong inductive bias that dominates the evidence provided by the training data; ii) Type-II regularization is equivalent to classical relative entropy regularization with an appropriate transformation of the empirical risk function.
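Property (i) can be illustrated numerically in the relative-entropy case: wherever the reference measure assigns zero mass, the solution assigns zero mass as well, no matter how low the empirical risk of that model is. A small sketch under the Gibbs-measure form of the solution (the discrete setup and toy numbers are illustrative assumptions):

```python
import numpy as np

def gibbs_posterior(risks, q, lam):
    # Relative-entropy-regularized ERM solution on a discrete model set:
    # dP/dQ proportional to exp(-L/lam), normalized to a probability vector.
    w = q * np.exp(-np.asarray(risks) / lam)
    return w / w.sum()

# Atom 0 has the smallest empirical risk but zero reference mass.
risks = np.array([0.0, 0.4, 0.8])
q = np.array([0.0, 0.6, 0.4])
p = gibbs_posterior(risks, q, lam=0.1)
# p[0] is exactly 0: the support of the solution collapses into supp(Q),
# overriding the evidence the training data provides for atom 0.
```

This is the strong inductive bias described above: the reference measure's support constrains the solution regardless of the data.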