5 Jul 2018 • Septimia Sârbu, Riccardo Volpi, Alexandra Peşte, Luigi Malagò
In this paper we propose two novel lower bounds on the log-likelihood, based on the Kullback-Leibler and Rényi divergences, which can be used for variational inference and, in particular, for training Variational AutoEncoders.
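As background for the bounds the abstract refers to, the sketch below estimates the two standard divergence-based lower bounds on log p(x) by Monte Carlo: the KL-based ELBO and the Rényi variational (VR) bound. The toy model (a 1-D Gaussian prior, likelihood, and approximate posterior) and all numerical values are illustrative assumptions, not the paper's actual constructions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (not from the paper):
#   prior       p(z)    = N(0, 1)
#   likelihood  p(x|z)  = N(z, 1)
#   approximate q(z|x)  = N(mu_q, sig_q^2)
x = 1.5
mu_q, sig_q = 0.7, 0.8

def log_normal(v, mu, sig):
    """Log-density of N(mu, sig^2) at v."""
    return -0.5 * np.log(2 * np.pi * sig**2) - (v - mu) ** 2 / (2 * sig**2)

K = 100_000
z = rng.normal(mu_q, sig_q, size=K)          # z ~ q(z|x)
log_w = (log_normal(z, 0.0, 1.0)             # log p(z)
         + log_normal(x, z, 1.0)             # + log p(x|z)
         - log_normal(z, mu_q, sig_q))       # - log q(z|x)

# KL-based bound (ELBO): E_q[log p(x,z) - log q(z|x)] <= log p(x)
elbo = log_w.mean()

# Renyi (VR) bound for alpha in [0, 1):
#   L_alpha = 1/(1-alpha) * log E_q[w^(1-alpha)]
# estimated with a numerically stable log-sum-exp over the K samples.
alpha = 0.5
s = (1 - alpha) * log_w
m = s.max()
vr = (m + np.log(np.mean(np.exp(s - m)))) / (1 - alpha)

print(f"ELBO      = {elbo:.4f}")
print(f"L_{alpha} = {vr:.4f}")  # >= ELBO by Jensen's inequality
```

For alpha in [0, 1) the VR bound is at least as tight as the ELBO (it recovers the ELBO as alpha -> 1), which the printed values illustrate on this toy example.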