no code implementations • 4 Dec 2019 • Walter Vinci, Lorenzo Buffoni, Hossein Sadeghi, Amir Khoshaman, Evgeny Andriyash, Mohammad H. Amin
The hybrid structure of QVAE allows us to deploy current-generation quantum annealers in QCH generative models to achieve competitive performance on datasets such as MNIST.
no code implementations • 26 Aug 2019 • Hossein Sadeghi, Evgeny Andriyash, Walter Vinci, Lorenzo Buffoni, Mohammad H. Amin
Here we introduce PixelVAE++, a VAE with three types of latent variables and a PixelCNN++ for the decoder.
Ranked #22 on Image Generation on CIFAR-10 (bits/dimension metric)
3 code implementations • ICML 2020 • Arash Vahdat, Evgeny Andriyash, William G. Macready
We extend the class of posterior models that may be learned by using undirected graphical models.
no code implementations • 29 Sep 2018 • Evgeny Andriyash, Arash Vahdat, Bill Macready
In many applications we seek to maximize an expectation with respect to a distribution over discrete variables.
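The entry above concerns maximizing an expectation over discrete variables, a setting where the standard baseline is the score-function (REINFORCE) estimator. As illustration only — this is the textbook estimator, not the method proposed in the paper — a minimal NumPy sketch for a Bernoulli variable:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z):
    # Example objective: reward 1.0 when the discrete sample is 1, else 0.0
    return z.astype(float)

def score_function_gradient(theta, n_samples=10_000):
    """Estimate d/dtheta E_{z~Bernoulli(p)}[f(z)], p = sigmoid(theta),
    via the score-function (REINFORCE) identity:
        grad = E[ f(z) * d/dtheta log p(z | theta) ]."""
    p = 1.0 / (1.0 + np.exp(-theta))
    z = rng.binomial(1, p, size=n_samples)
    # For a Bernoulli with logit theta: d/dtheta log p(z|theta) = z - p
    score = z - p
    return np.mean(f(z) * score)

theta = 0.5
p = 1.0 / (1.0 + np.exp(-theta))
est = score_function_gradient(theta)
# Here E[f(z)] = p, so the true gradient is p*(1-p); the Monte Carlo
# estimate should land close to it.
print(est, p * (1 - p))
```

The estimator is unbiased but high-variance, which is why work in this area (including the entries on this page) focuses on relaxations and variance reduction.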
no code implementations • 27 Sep 2018 • Evgeny Andriyash, Arash Vahdat, Bill Macready
In many applications we seek to optimize an expectation with respect to a distribution over discrete variables.
no code implementations • NeurIPS 2018 • Arash Vahdat, Evgeny Andriyash, William G. Macready
Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors.
no code implementations • 15 Feb 2018 • Amir Khoshaman, Walter Vinci, Brandon Denis, Evgeny Andriyash, Hossein Sadeghi, Mohammad H. Amin
We show that our model can be trained end-to-end by maximizing a well-defined loss function: a 'quantum' lower bound to a variational approximation of the log-likelihood.
no code implementations • ICML 2018 • Arash Vahdat, William G. Macready, Zhengbing Bian, Amir Khoshaman, Evgeny Andriyash
Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult.
Ranked #53 on Image Generation on CIFAR-10 (bits/dimension metric)
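The difficulty named above — passing gradients through discrete units — is commonly sidestepped with a straight-through surrogate. As a generic illustration (not the relaxation introduced in the paper), a NumPy sketch that pairs a hard Bernoulli sample with the smooth gradient one would route through it:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def straight_through_sample(logits, rng):
    """Forward pass: a hard 0/1 Bernoulli sample.
    Backward pass (conceptually): treat the sample as if it were the
    probability sigmoid(logits), so gradients flow through the sigmoid.
    Plain NumPy has no autodiff, so the surrogate gradient is returned
    explicitly alongside the sample."""
    p = sigmoid(logits)
    z = (rng.random(p.shape) < p).astype(float)  # hard discrete sample
    # Straight-through surrogate: dz/dlogits ~= dp/dlogits = p * (1 - p)
    surrogate_grad = p * (1 - p)
    return z, surrogate_grad

rng = np.random.default_rng(1)
z, g = straight_through_sample(np.array([0.0, 2.0, -2.0]), rng)
print(z, g)
```

The surrogate is biased, which motivates the relaxation-based alternatives studied in these papers.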
no code implementations • 14 Nov 2016 • Dmytro Korenkevych, Yanbo Xue, Zhengbing Bian, Fabian Chudak, William G. Macready, Jason Rolfe, Evgeny Andriyash
We argue that this relates to the fact that we are training a quantum rather than classical Boltzmann distribution in this case.
no code implementations • 8 Jan 2016 • Mohammad H. Amin, Evgeny Andriyash, Jason Rolfe, Bohdan Kulchytskyy, Roger Melko
Inspired by the success of Boltzmann machines based on the classical Boltzmann distribution, we propose a new machine learning approach based on the quantum Boltzmann distribution of a transverse-field Ising Hamiltonian.
Quantum Physics
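For orientation, the transverse-field Ising Hamiltonian referenced in the entry above has the standard form below — a sketch assuming the usual notation ($\sigma^x_a$, $\sigma^z_a$ Pauli operators on qubit $a$; $\Gamma$ the transverse field; $b_a$, $w_{ab}$ trainable biases and couplings), not a reproduction of the paper's exact parameterization:

```latex
H = -\Gamma \sum_a \sigma^x_a - \sum_a b_a \sigma^z_a - \sum_{a,b} w_{ab}\, \sigma^z_a \sigma^z_b,
\qquad
\rho = \frac{e^{-H}}{\mathrm{Tr}\, e^{-H}}
```

When $\Gamma = 0$ the density matrix $\rho$ is diagonal and the model reduces to a classical Boltzmann machine; a nonzero transverse field makes the distribution genuinely quantum.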