no code implementations • ICLR Workshop DeepGenStruct 2019 • Septimia Sârbu, Luigi Malagò
In training, they exploit the power of variational inference by optimizing a lower bound on the model evidence.
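The lower bound referred to here is the standard evidence lower bound (ELBO) used to train variational autoencoders. As a minimal sketch (not the paper's specific model), the following assumes a diagonal-Gaussian encoder, a standard-normal prior, and Bernoulli likelihood; the function name `elbo` and the `decode` callback are illustrative choices, not names from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo(x, mu, log_var, decode, n_samples=1):
    """Monte Carlo estimate of the evidence lower bound (ELBO) for
    a Gaussian encoder q(z|x) = N(mu, diag(exp(log_var))) and a
    standard-normal prior p(z) = N(0, I)."""
    # Closed-form KL(q(z|x) || p(z)) between diagonal Gaussians.
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    # Reparameterised samples z = mu + sigma * eps.
    recon = 0.0
    for _ in range(n_samples):
        eps = rng.standard_normal(mu.shape)
        z = mu + np.exp(0.5 * log_var) * eps
        x_hat = decode(z)
        # Bernoulli log-likelihood log p(x|z) (binary data assumed).
        recon += np.sum(x * np.log(x_hat + 1e-9)
                        + (1 - x) * np.log(1 - x_hat + 1e-9))
    return recon / n_samples - kl
```

Maximizing this quantity over the encoder and decoder parameters tightens the bound on the model evidence log p(x).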
no code implementations • 5 Jul 2018 • Septimia Sârbu, Riccardo Volpi, Alexandra Peşte, Luigi Malagò
In this paper we propose two novel bounds for the log-likelihood, based on the Kullback-Leibler and Rényi divergences, which can be used for variational inference and, in particular, for training Variational AutoEncoders.
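For the Rényi case, a standard family of such bounds is the variational Rényi bound L_α = (1/(1−α)) log E_q[(p(x,z)/q(z|x))^(1−α)], which recovers the ELBO as α → 1. The paper's two bounds are its own constructions; the sketch below only shows the generic Monte Carlo estimator of the variational Rényi bound, with the function name `renyi_bound` chosen for illustration:

```python
import numpy as np

def renyi_bound(log_w, alpha):
    """Monte Carlo estimate of the variational Renyi bound
    L_alpha = 1/(1-alpha) * log E_q[(p(x,z)/q(z|x))^(1-alpha)],
    given log importance weights log_w = log p(x,z) - log q(z|x)
    for samples z ~ q(z|x).  The limit alpha -> 1 recovers the
    ELBO; alpha = 0 gives the importance-weighted (IWAE) bound."""
    log_w = np.asarray(log_w, dtype=float)
    if np.isclose(alpha, 1.0):
        return log_w.mean()  # alpha -> 1 limit: standard ELBO
    scaled = (1.0 - alpha) * log_w
    # log-mean-exp, computed stably against overflow
    m = scaled.max()
    log_mean = m + np.log(np.mean(np.exp(scaled - m)))
    return log_mean / (1.0 - alpha)
```

The estimate is a power mean of the importance weights, so it is non-increasing in α: smaller α gives a tighter (larger) bound at the cost of higher estimator variance.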