A Perceived Environment Design using a Multi-Modal Variational Autoencoder for learning Active-Sensing

This contribution combines a multi-modal variational autoencoder with an environment to form a perceived environment on which an agent can act. We conclude our work with a comparison to curiosity-driven learning.
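The abstract describes wrapping the raw environment so that the agent observes VAE latents instead of raw multi-modal inputs. The sketch below illustrates one way such a perceived environment could look; it is an assumption-based illustration, not the paper's implementation, and names such as `MultiModalEncoder`, `PerceivedEnv`, and the two-modality observation layout are hypothetical.

```python
# Hypothetical sketch of a "perceived environment": an environment wrapper that
# replaces the raw multi-modal observation with the latent code produced by a
# multi-modal VAE encoder. Class and variable names are illustrative only.
import torch
import torch.nn as nn


class MultiModalEncoder(nn.Module):
    """Encodes two observation modalities into a shared Gaussian latent."""

    def __init__(self, dim_a, dim_b, latent_dim=8):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(dim_a + dim_b, 64), nn.ReLU())
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)

    def forward(self, obs_a, obs_b):
        h = self.backbone(torch.cat([obs_a, obs_b], dim=-1))
        return self.mu(h), self.logvar(h)


class PerceivedEnv:
    """Wraps an environment so the agent acts on VAE latents, not raw inputs."""

    def __init__(self, env, encoder):
        self.env = env
        self.encoder = encoder

    def _perceive(self, obs):
        # Assumes the wrapped env returns a pair of modalities,
        # e.g. vision and proprioception, as torch tensors.
        obs_a, obs_b = obs
        with torch.no_grad():
            mu, logvar = self.encoder(obs_a, obs_b)
            # Reparameterised sample; the latent code is the agent's state.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return z

    def reset(self):
        return self._perceive(self.env.reset())

    def step(self, action):
        obs, reward, done, info = self.env.step(action)
        return self._perceive(obs), reward, done, info
```

An agent trained on `PerceivedEnv` would then operate entirely in the VAE's latent space, which is the sense in which the autoencoder and the environment interact to form the perceived environment.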
