no code implementations • 15 Feb 2024 • H. Rangika Iroshani Peiris, Chao Wang, Richard Gerlach, Minh-Ngoc Tran
A semi-parametric joint Value-at-Risk (VaR) and Expected Shortfall (ES) forecasting framework employing multiple realized measures is developed.
1 code implementation • 5 Sep 2023 • Chen Liu, Minh-Ngoc Tran, Chao Wang, Richard Gerlach, Robert Kohn
For many years, researchers have been exploring the use of deep learning in the forecasting of financial time series.
1 code implementation • 24 May 2023 • Nhat-Minh Nguyen, Minh-Ngoc Tran, Christopher Drovandi, David Nott
We combine the Wasserstein Gaussianization transformation with robust BSL and an efficient Variational Bayes procedure for posterior approximation to develop a highly efficient and reliable approximate Bayesian inference method for likelihood-free problems.
1 code implementation • 24 Mar 2023 • Minh-Ngoc Tran, Paco Tseng, Robert Kohn
The Mean Field Variational Bayes (MFVB) method is one of the most computationally efficient techniques for Bayesian inference.
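As a minimal illustration of the MFVB idea (not the paper's method), the sketch below runs coordinate-ascent mean-field VB on a toy model — observations x_i ~ N(mu, 1/tau) with independent priors mu ~ N(mu0, 1/lam0) and tau ~ Gamma(a0, b0) — under the factorization q(mu, tau) = q(mu) q(tau); all names and hyperparameter values are illustrative choices.

```python
import numpy as np

def mfvb_normal(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent MFVB for x_i ~ N(mu, 1/tau) with priors
    mu ~ N(mu0, 1/lam0), tau ~ Gamma(a0, b0), and the mean-field
    factorization q(mu, tau) = q(mu) q(tau)."""
    n, sx, sxx = len(x), x.sum(), (x ** 2).sum()
    E_tau = a0 / b0
    a_n = a0 + n / 2.0                       # shape of q(tau) is fixed
    for _ in range(iters):
        # update q(mu) = N(mu_n, 1/lam_n) given E_q[tau]
        lam_n = lam0 + n * E_tau
        mu_n = (lam0 * mu0 + E_tau * sx) / lam_n
        # update q(tau) = Gamma(a_n, b_n) given E_q[mu], E_q[mu^2]
        E_mu, E_mu2 = mu_n, mu_n ** 2 + 1.0 / lam_n
        b_n = b0 + 0.5 * (sxx - 2 * E_mu * sx + n * E_mu2)
        E_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.0, size=5000)
mu_n, lam_n, a_n, b_n = mfvb_normal(x)
```

Each update optimizes one factor holding the other fixed, which is what makes MFVB cheap: every step is a closed-form expression in the current moments of the other factor.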
1 code implementation • 16 Feb 2023 • Chen Liu, Chao Wang, Minh-Ngoc Tran, Robert Kohn
We propose a new approach to volatility modeling by combining deep learning (LSTM) and realized volatility measures.
no code implementations • 13 Dec 2021 • Anna Lopatnikova, Minh-Ngoc Tran, Scott A. Sisson
Quantum computers promise to surpass the most powerful classical supercomputers when it comes to solving many critically important practical problems, such as pharmaceutical and fertilizer design, supply chain and traffic optimization, or optimization for machine learning tasks.
no code implementations • 10 Jun 2021 • Anna Lopatnikova, Minh-Ngoc Tran
Variational Bayes (VB) is a critical method in machine learning and statistics, underpinning the recent success of Bayesian deep learning.
1 code implementation • 1 Mar 2021 • Minh-Ngoc Tran, Trong-Nghia Nguyen, Viet-Hung Dao
This tutorial gives a quick introduction to Variational Bayes (VB), also called Variational Inference or Variational Approximation, from a practical point of view.
no code implementations • 17 Aug 2020 • Renlong Jie, Junbin Gao, Andrey Vasnev, Minh-Ngoc Tran
In this study, we investigate learning rate adaptation at different levels based on the hyper-gradient descent framework and propose a method that adaptively learns the optimizer parameters by combining multiple levels of learning rates with hierarchical structures.
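The basic hyper-gradient descent mechanism the entry builds on can be sketched in a few lines (a simplified single-level version, not the paper's hierarchical method): the learning rate alpha is itself updated by gradient descent, using the fact that the hyper-gradient at step t is -g_t . g_{t-1}. Names and the toy quadratic objective are illustrative.

```python
import numpy as np

def sgd_hd(grad, x0, alpha=0.01, beta=1e-4, steps=200):
    """Gradient descent whose learning rate alpha is adapted by
    hyper-gradient descent: the derivative of the loss w.r.t. alpha
    at step t is -g_t . g_{t-1}, so alpha moves by +beta * g_t . g_{t-1}."""
    x, g_prev = np.asarray(x0, float), None
    for _ in range(steps):
        g = grad(x)
        if g_prev is not None:
            alpha += beta * g @ g_prev   # grow alpha while gradients align
        x = x - alpha * g
        g_prev = g
    return x, alpha

# toy objective f(x) = 0.5 * ||x||^2, so grad(x) = x
x_final, alpha_final = sgd_hd(lambda x: x, x0=[5.0, -3.0])
```

When successive gradients point the same way, alpha grows and progress accelerates; if the iterate overshoots, the dot product turns negative and alpha shrinks, which is the self-stabilizing behavior the framework exploits.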
no code implementations • 23 Jan 2020 • Zhengkun Li, Minh-Ngoc Tran, Chao Wang, Richard Gerlach, Junbin Gao
Value-at-Risk (VaR) and Expected Shortfall (ES) are widely used in the financial sector to measure market risk and manage extreme market movements.
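For readers unfamiliar with the two risk measures, a minimal historical-simulation estimate of VaR and ES (not the model proposed in the entry) can be computed directly from a return sample; the function name and tolerance are illustrative.

```python
import numpy as np

def var_es(returns, alpha=0.05):
    """Empirical (historical-simulation) VaR and ES at level alpha.

    VaR is the alpha-quantile of the return distribution; ES is the
    average of the returns at or below that quantile."""
    returns = np.sort(np.asarray(returns))
    k = max(int(np.floor(alpha * len(returns))), 1)
    var = returns[k - 1]
    es = returns[:k].mean()
    return var, es

# toy example: standard normal returns, for which the theoretical
# values are VaR_5% = -1.645 and ES_5% = -2.063
rng = np.random.default_rng(0)
r = rng.standard_normal(100_000)
var, es = var_es(r, alpha=0.05)
```

ES is always at least as extreme as VaR at the same level, since it averages over the tail beyond the quantile — which is why it is preferred as a coherent measure of tail risk.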
no code implementations • 25 Sep 2019 • Renlong Jie, Junbin Gao, Andrey Vasnev, Minh-Ngoc Tran
Based on this, we develop two novel flexible activation functions that can be implemented in LSTM cells and auto-encoder layers.
1 code implementation • 8 Aug 2019 • Minh-Ngoc Tran, Dang H. Nguyen, Duy Nguyen
Nonetheless, the development of existing VB algorithms has so far been largely restricted to the case where the variational parameter space is Euclidean, which hinders the potentially broad application of VB methods.
no code implementations • 7 Jun 2019 • Trong-Nghia Nguyen, Minh-Ngoc Tran, David Gunawan, R. Kohn
The Stochastic Volatility (SV) model and its variants are widely used in the financial sector, while recurrent neural network (RNN) models are successfully used in many large-scale industrial applications of deep learning.
no code implementations • 11 Feb 2019 • Bingxin Zhou, Junbin Gao, Minh-Ngoc Tran, Richard Gerlach
Gaussian variational approximation is a popular methodology to approximate posterior distributions in Bayesian inference especially in high dimensional and large data settings.
no code implementations • 23 Jul 2018 • Matias Quiroz, Mattias Villani, Robert Kohn, Minh-Ngoc Tran, Khue-Dung Dang
The rapid development of computing power and efficient Markov Chain Monte Carlo (MCMC) simulation algorithms have revolutionized Bayesian statistics, making it a highly practical inference method in applied work.
2 code implementations • 25 May 2018 • Minh-Ngoc Tran, Nghia Nguyen, David Nott, Robert Kohn
Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix.
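The factor parametrization in this entry keeps the Gaussian approximation tractable in high dimensions: the covariance is Sigma = B B^T + diag(d^2), which needs only O(pk) parameters and supports cheap reparameterized sampling. A small numerical sketch (dimensions and scales are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
p, k = 50, 3                              # dimension and number of factors
B = rng.normal(scale=0.3, size=(p, k))    # factor loadings (p x k)
d = np.full(p, 0.5)                       # idiosyncratic std deviations
mu = np.zeros(p)

# implied covariance: only O(p*k) free parameters instead of O(p^2)
Sigma = B @ B.T + np.diag(d ** 2)

# reparameterized sampling: theta = mu + B eps + d * xi,
# with eps ~ N(0, I_k) and xi ~ N(0, I_p)
n = 200_000
eps = rng.standard_normal((n, k))
xi = rng.standard_normal((n, p))
theta = mu + eps @ B.T + xi * d
emp_cov = np.cov(theta, rowvar=False)
```

The reparameterized draw never forms Sigma or its Cholesky factor explicitly, which is what makes stochastic-gradient optimization of the variational parameters scale to large p.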
no code implementations • 8 May 2018 • David Gunawan, Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran
SMC sequentially updates a cloud of particles through a sequence of distributions, beginning with a distribution that is easy to sample from, such as the prior, and ending with the posterior distribution.
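The particle-cloud recursion described above can be sketched with a likelihood-tempered sequence pi_t ∝ prior × likelihood^gamma_t: at each stage the particles are reweighted by the tempering increment, resampled, and refreshed with a Metropolis move. This is a minimal generic SMC sampler on a toy Gaussian-mean model, not the paper's algorithm; all names, step sizes, and the temperature schedule are illustrative.

```python
import numpy as np

def smc_tempering(loglike, sample_prior, logprior, gammas, n_part=2000, seed=0):
    """Minimal tempered SMC: move a particle cloud from the prior
    through pi_t(theta) ∝ prior(theta) * likelihood(theta)^gamma_t."""
    rng = np.random.default_rng(seed)
    theta = sample_prior(rng, n_part)
    for g_prev, g in zip(gammas[:-1], gammas[1:]):
        # reweight by the tempered likelihood increment
        logw = (g - g_prev) * loglike(theta)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # multinomial resampling
        theta = theta[rng.choice(n_part, n_part, p=w)]
        # one random-walk Metropolis move targeting pi_t
        prop = theta + 0.5 * rng.standard_normal(n_part)
        logr = (logprior(prop) + g * loglike(prop)
                - logprior(theta) - g * loglike(theta))
        theta = np.where(np.log(rng.random(n_part)) < logr, prop, theta)
    return theta

# toy model: y_i ~ N(theta, 1) with prior theta ~ N(0, 10^2)
y = np.array([1.8, 2.2, 2.0, 1.9, 2.1])
loglike = lambda th: -0.5 * ((y[:, None] - th) ** 2).sum(axis=0)
logprior = lambda th: -0.5 * th ** 2 / 100.0
gammas = np.linspace(0.0, 1.0, 21)
theta = smc_tempering(loglike, lambda rng, n: 10.0 * rng.standard_normal(n),
                      logprior, gammas)
```

The gradual schedule is what keeps the importance weights well behaved: each increment changes the target only slightly, so the resampled cloud never collapses onto a few particles.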
no code implementations • 2 Aug 2017 • Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran, Mattias Villani
The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration.
no code implementations • 27 Mar 2016 • Matias Quiroz, Minh-Ngoc Tran, Mattias Villani, Robert Kohn, Khue-Dung Dang
A pseudo-marginal MCMC method is proposed that estimates the likelihood by data subsampling using a block-Poisson estimator.
no code implementations • 16 Apr 2014 • Matias Quiroz, Robert Kohn, Mattias Villani, Minh-Ngoc Tran
We propose Subsampling MCMC, a Markov Chain Monte Carlo (MCMC) framework where the likelihood function for $n$ observations is estimated from a random subset of $m$ observations.
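The core mechanism — replacing the full-data log-likelihood with an unbiased estimate from a size-m subsample, variance-reduced by Taylor-expansion control variates around a central value — can be sketched on a toy Gaussian model. This is a simplified illustration, not the paper's full scheme; for a Gaussian log-likelihood the second-order surrogate happens to be exact, so the residual correction is zero and the chain behaves like plain Metropolis-Hastings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10_000, 100
x = rng.normal(0.0, 1.0, n)        # data: x_i ~ N(theta, 1), true theta = 0

# second-order Taylor control variates around theta_hat (the sample mean)
theta_hat = x.mean()
l = lambda th, xi: -0.5 * (xi - th) ** 2            # per-observation log-lik
q = lambda th, xi: (l(theta_hat, xi)
                    + (xi - theta_hat) * (th - theta_hat)
                    - 0.5 * (th - theta_hat) ** 2)  # Taylor surrogate
q_sum = lambda th: (l(theta_hat, x).sum()
                    + (x - theta_hat).sum() * (th - theta_hat)
                    - 0.5 * n * (th - theta_hat) ** 2)

def ll_hat(th):
    """Unbiased estimate of sum_i l(th, x_i): cheap exact sum of the
    surrogates plus a size-m subsampled correction for the residuals."""
    idx = rng.choice(n, m, replace=False)
    return q_sum(th) + n / m * (l(th, x[idx]) - q(th, x[idx])).sum()

# pseudo-marginal random-walk Metropolis on theta (flat prior)
theta, ll = 0.0, ll_hat(0.0)
draws = []
for _ in range(3000):
    prop = theta + 0.02 * rng.standard_normal()
    llp = ll_hat(prop)
    if np.log(rng.random()) < llp - ll:
        theta, ll = prop, llp
    draws.append(theta)
draws = np.array(draws)
```

Each iteration touches only m = 100 of the n = 10,000 observations (plus O(1) sufficient statistics for the surrogate sum), which is the source of the computational saving.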
no code implementations • 30 Jul 2013 • David J. Nott, Minh-Ngoc Tran, Anthony Y. C. Kuk, Robert Kohn
We propose a divide-and-recombine strategy for the analysis of large datasets: a large dataset is partitioned into smaller pieces, and the variational distributions learnt in parallel on each separate piece are then combined using the hybrid Variational Bayes algorithm.
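A minimal sketch of the recombination step in the Gaussian case (the hybrid VB algorithm in the entry is more general): when each piece yields a Gaussian approximation, the combined distribution is their product — precisions add, and the combined mean is the precision-weighted average of the piece means. The toy model and piece counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy: x_i ~ N(theta, 1) with a flat prior; the exact Gaussian
# sub-posteriors stand in for variational fits on each piece
x = rng.normal(3.0, 1.0, 9000)
pieces = np.split(x, 3)

# each worker returns q_k = N(m_k, s2_k) for its piece
subs = [(p.mean(), 1.0 / len(p)) for p in pieces]

# recombine by multiplying the Gaussians:
# add the precisions, precision-weight the means
prec = sum(1.0 / s2 for _, s2 in subs)
mean = sum(m / s2 for m, s2 in subs) / prec
```

In this Gaussian toy case the recombined mean and variance match the full-data posterior exactly, which is why product-of-Gaussians recombination is the natural baseline for parallel variational schemes.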