Search Results for author: Cole Pospisil

Found 2 papers, 2 papers with code

Breaking Time Invariance: Assorted-Time Normalization for RNNs

1 code implementation • 28 Sep 2022 • Cole Pospisil, Vasily Zadorozhnyy, Qiang Ye

Methods such as Layer Normalization (LN) and Batch Normalization (BN) have proven to be effective in improving the training of Recurrent Neural Networks (RNNs).

Language Modelling
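The snippet above names Layer Normalization inside recurrent networks. As a minimal illustrative sketch only (not the paper's Assorted-Time Normalization), here is LN applied to the pre-activation of a vanilla RNN step; the function names (`layer_norm`, `ln_rnn_step`), shapes, and weight scales are assumptions for the example.

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    # Standard Layer Normalization over a single hidden vector:
    # zero mean, unit variance, then learned gain and bias.
    return gain * (x - x.mean()) / np.sqrt(x.var() + eps) + bias

def ln_rnn_step(x_t, h_prev, W_xh, W_hh, gain, bias):
    # One vanilla RNN step with LN on the pre-activation, a common
    # placement for normalization inside recurrent cells.
    pre = W_xh @ x_t + W_hh @ h_prev
    return np.tanh(layer_norm(pre, gain, bias))

rng = np.random.default_rng(0)
d_in, d_h = 4, 8
W_xh = 0.1 * rng.normal(size=(d_h, d_in))
W_hh = 0.1 * rng.normal(size=(d_h, d_h))
gain, bias = np.ones(d_h), np.zeros(d_h)
h = np.zeros(d_h)
for _ in range(3):                     # unroll a short sequence
    h = ln_rnn_step(rng.normal(size=d_in), h, W_xh, W_hh, gain, bias)
print(h.shape)                         # (8,)
```

Normalizing the pre-activation at every step keeps the hidden-state statistics stable across time, which is the general motivation the abstract snippet alludes to.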

Orthogonal Gated Recurrent Unit with Neumann-Cayley Transformation

1 code implementation • 12 Aug 2022 • Edison Mucllari, Vasily Zadorozhnyy, Cole Pospisil, Duc Nguyen, Qiang Ye

In recent years, using orthogonal matrices has been shown to be a promising approach to improving the training, stability, and convergence of Recurrent Neural Networks (RNNs), particularly for controlling gradients.
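To make the orthogonal-matrix idea concrete, below is a sketch of the classical Cayley transform, which maps a skew-symmetric matrix to an exactly orthogonal one. The Neumann-series inverse shown is an assumed stand-in suggested by the paper's title; the paper's actual Neumann-Cayley scheme may differ in its details.

```python
import numpy as np

def cayley_orthogonal(A):
    # Classical Cayley transform: for skew-symmetric A (A.T == -A),
    # W = (I - A) @ inv(I + A) is exactly orthogonal.
    I = np.eye(A.shape[0])
    return (I - A) @ np.linalg.inv(I + A)

def neumann_inverse_of_I_plus(A, terms=12):
    # Truncated Neumann series inv(I + A) ~= I - A + A^2 - ...,
    # valid when the spectral norm of A is below 1. Assumed
    # stand-in for the paper's Neumann-series step, not its exact scheme.
    approx = np.eye(A.shape[0])
    power = np.eye(A.shape[0])
    for _ in range(terms):
        power = power @ (-A)
        approx = approx + power
    return approx

rng = np.random.default_rng(0)
B = 0.05 * rng.normal(size=(6, 6))
A = B - B.T                                  # skew-symmetric parameter
W = cayley_orthogonal(A)
print(np.allclose(W.T @ W, np.eye(6)))       # True: W is orthogonal
W_hat = (np.eye(6) - A) @ neumann_inverse_of_I_plus(A)
print(np.linalg.norm(W - W_hat))             # tiny approximation error
```

Because orthogonal matrices have all singular values equal to 1, multiplying by such a recurrent weight neither amplifies nor shrinks gradients, which is why this parameterization helps control gradients during training.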
