1 code implementation • 17 May 2019 • Christopher Aicher, Nicholas J. Foti, Emily B. Fox
Truncated backpropagation through time (TBPTT) is a popular method for learning in recurrent neural networks (RNNs) that saves computation and memory at the cost of bias by truncating backpropagation after a fixed number of lags.
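The truncation described here can be made concrete with a toy example. The sketch below (not from the paper; `tbptt_grad` and the scalar linear RNN are illustrative assumptions) computes the gradient of the final hidden state with respect to the recurrent weight, keeping only the last `K` backpropagation lags — full BPTT corresponds to `K` equal to the sequence length, and smaller `K` drops the older terms, which is the source of the bias:

```python
def tbptt_grad(w, xs, K):
    """Gradient of the final state h_T w.r.t. w for the toy linear RNN
    h_t = w * h_{t-1} + x_t, backpropagating only K steps (truncation).
    Full BPTT corresponds to K = len(xs)."""
    h = 0.0
    hs = [h]                      # hs[t] = h_t, with h_0 = 0
    for x in xs:                  # forward pass
        h = w * h + x
        hs.append(h)
    T = len(xs)
    # Full gradient: dh_T/dw = sum_{t=1}^{T} w^(T-t) * h_{t-1}.
    # Truncation keeps only the last K lags, t = T-K+1, ..., T.
    return sum(w ** (T - t) * hs[t - 1] for t in range(max(1, T - K + 1), T + 1))
```

For `w = 0.5` and `xs = [1, 1, 1]`, the full gradient (`K = 3`) is `2.0`, while truncating to one lag (`K = 1`) gives `1.5` — the gap between the two is exactly the bias that truncation introduces in exchange for cheaper computation.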
2 code implementations • 29 Jan 2019 • Christopher Aicher, Srshti Putcha, Christopher Nemeth, Paul Fearnhead, Emily B. Fox
We evaluate our proposed particle-buffered stochastic gradient using stochastic gradient MCMC for inference on both long synthetic sequences and minute-resolution financial returns data, demonstrating the importance of this class of methods.
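The inference engine named here, stochastic gradient MCMC, can be illustrated with its canonical instance, stochastic gradient Langevin dynamics (SGLD). This is a generic sketch of SGLD, not the paper's particle-buffered estimator; `sgld_step` and the standard-normal target in the usage note are illustrative assumptions:

```python
import math
import random

def sgld_step(theta, grad_logpost_est, step_size, rng=random):
    """One step of stochastic gradient Langevin dynamics (SGLD):
    a (possibly noisy) gradient step on the log-posterior plus
    injected Gaussian noise with variance equal to the step size."""
    noise = rng.gauss(0.0, math.sqrt(step_size))
    return theta + 0.5 * step_size * grad_logpost_est(theta) + noise
```

Iterating this step with an unbiased gradient estimate yields approximate posterior samples; for example, with `grad_logpost_est = lambda th: -th` (a standard normal target) the chain's long-run mean and variance are close to 0 and 1 for small step sizes.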
1 code implementation • 22 Oct 2018 • Christopher Aicher, Yi-An Ma, Nicholas J. Foti, Emily B. Fox
However, inference in SSMs is often computationally prohibitive for long time series.
no code implementations • 19 Jul 2018 • Christopher Aicher, Emily B. Fox
We develop a framework for approximating collapsed Gibbs sampling in generative latent variable cluster models.
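For readers unfamiliar with the baseline being approximated, here is a minimal collapsed Gibbs sweep for a simple generative cluster model — a finite mixture of Bernoullis with the mixing weights (Dirichlet prior) and per-cluster success probabilities (Beta prior) integrated out analytically. This is a textbook sketch, not the paper's framework; the function name and hyperparameters are illustrative assumptions:

```python
import random

def collapsed_gibbs_sweep(x, z, K, alpha=1.0, a=1.0, b=1.0, rng=random):
    """One sweep of collapsed Gibbs sampling for a K-component mixture
    of Bernoullis over binary data x. Cluster parameters are integrated
    out, so only the assignments z are resampled."""
    counts = [0] * K   # points currently in each cluster
    ones = [0] * K     # number of x_i == 1 in each cluster
    for xi, zi in zip(x, z):
        counts[zi] += 1
        ones[zi] += xi
    for i in range(len(x)):
        # Remove point i from its current cluster.
        counts[z[i]] -= 1
        ones[z[i]] -= x[i]
        # Unnormalized predictive probability of each cluster k,
        # conditioned on all other assignments (parameters collapsed).
        weights = []
        for k in range(K):
            prior = counts[k] + alpha                       # CRP-style count term
            if x[i] == 1:
                lik = (ones[k] + a) / (counts[k] + a + b)   # Beta-Bernoulli predictive
            else:
                lik = (counts[k] - ones[k] + b) / (counts[k] + a + b)
            weights.append(prior * lik)
        # Sample the new assignment proportional to the weights.
        u = rng.random() * sum(weights)
        for k in range(K):
            u -= weights[k]
            if u <= 0:
                z[i] = k
                break
        counts[z[i]] += 1
        ones[z[i]] += x[i]
    return z
```

Each point is resampled from its exact conditional given the other assignments; approximating this sweep (e.g. with cheaper conditional weights) is the kind of target the framework above addresses.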
no code implementations • 2 Apr 2014 • Christopher Aicher, Abigail Z. Jacobs, Aaron Clauset
We then evaluate the WSBM's performance on both edge-existence and edge-weight prediction tasks for a set of real-world weighted networks.
no code implementations • 24 May 2013 • Christopher Aicher, Abigail Z. Jacobs, Aaron Clauset
We generalize the stochastic block model to the important case in which edges are annotated with weights drawn from an exponential family distribution.
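To make the weighted generalization concrete, the sketch below scores edge weights under one exponential-family choice — a Normal with unit variance whose mean is specific to each block pair. This is an illustrative fragment, not the paper's full WSBM (which handles arbitrary exponential families and learns assignments); `wsbm_normal_loglik` and the MLE plug-in are assumptions for the example:

```python
import math
from collections import defaultdict

def wsbm_normal_loglik(edges, z):
    """Log-likelihood of edge weights under a weighted SBM where the
    weight of an edge between blocks (r, s) is Normal(mu_rs, 1), with
    mu_rs set to its maximum-likelihood estimate from the data.
    edges: list of (i, j, weight); z: block assignment per node."""
    groups = defaultdict(list)
    for i, j, w in edges:
        groups[(z[i], z[j])].append(w)   # pool weights by block pair
    ll = 0.0
    for ws in groups.values():
        mu = sum(ws) / len(ws)           # MLE of the block-pair mean
        # Normal(mu, 1) log-density summed over this block pair's weights
        ll += sum(-0.5 * math.log(2 * math.pi) - 0.5 * (w - mu) ** 2 for w in ws)
    return ll
```

Swapping the Normal density for another exponential-family log-density (Poisson, exponential, ...) changes only the per-group term, which is what makes the exponential-family formulation convenient.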