Music Modeling
22 papers with code • 2 benchmarks • 6 datasets
(Image credit: R-Transformer)
Most implemented papers
R-Transformer: Recurrent Neural Network Enhanced Transformer
Recurrent Neural Networks have long been the dominant choice for sequence modeling.
Improving Polyphonic Music Models with Feature-Rich Encoding
We show that training a neural network to predict a seemingly more complex sequence, with extra features included in the series being modelled, can improve overall model performance significantly.
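A minimal sketch of this idea, assuming a simple monophonic pitch sequence as input; the extra features used here (metrical position, melodic interval) are illustrative choices, not the paper's exact feature set:

```python
# Sketch (not the paper's exact encoding): augment each timestep of the
# sequence with extra, derivable features, so the model is trained to
# predict the richer series rather than the pitches alone.
import numpy as np

def feature_rich_encode(pitches, beats_per_bar=4):
    """pitches: 1-D list of MIDI pitch numbers, one per timestep (hypothetical format)."""
    pitches = np.asarray(pitches)
    beat_position = np.arange(len(pitches)) % beats_per_bar   # metrical position
    interval = np.diff(pitches, prepend=pitches[0])           # melodic interval
    # Each timestep becomes a feature vector; the model predicts all columns.
    return np.stack([pitches, beat_position, interval], axis=1)

print(feature_rich_encode([60, 62, 64, 65, 67]))
```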
Learning Style-Aware Symbolic Music Representations by Adversarial Autoencoders
Throughout the paper, we show how Gaussian mixtures that incorporate music metadata can be used as an effective prior for the autoencoder latent space, introducing the first Music Adversarial Autoencoder (MusAE).
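A rough sketch of such a metadata-aware mixture prior, assuming a style label per example selects the mixture component; the component means and scales are placeholder values, not MusAE's published configuration:

```python
# Hedged sketch, not MusAE's code: draw the adversarial autoencoder's "real"
# latent samples from a Gaussian mixture whose component is selected by a
# music-metadata label (e.g. style), so the latent space is organised by style.
import torch

def gmm_prior_sample(style_ids, latent_dim=8, n_styles=4, scale=0.5):
    """style_ids: LongTensor of metadata labels in [0, n_styles)."""
    # One mixture component per style; fixed, well-separated means (assumption).
    means = torch.eye(n_styles, latent_dim) * 3.0
    return means[style_ids] + scale * torch.randn(len(style_ids), latent_dim)

# In AAE training, a discriminator would be trained to separate these prior
# samples from the encoder's outputs for the same metadata labels.
z_prior = gmm_prior_sample(torch.tensor([0, 1, 1, 3]))
print(z_prior.shape)  # torch.Size([4, 8])
```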
Sequential Neural Models with Stochastic Layers
How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks?
Diagonal RNNs in Symbolic Music Modeling
In this paper, we propose a new Recurrent Neural Network (RNN) architecture in which the recurrent weight matrices are constrained to be diagonal.
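A minimal sketch of the diagonal idea in PyTorch, replacing the dense recurrent matrix with a per-unit (diagonal) weight; this follows the general definition of a diagonal RNN, not necessarily the paper's exact cell:

```python
import torch
import torch.nn as nn

class DiagonalRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.w_in = nn.Linear(input_size, hidden_size)
        self.w_rec = nn.Parameter(torch.randn(hidden_size) * 0.1)  # diagonal only

    def forward(self, x, h):
        # Elementwise recurrence: h * w_rec replaces the dense h @ W_rec.
        return torch.tanh(self.w_in(x) + self.w_rec * h)

cell = DiagonalRNNCell(input_size=16, hidden_size=32)
h = torch.zeros(1, 32)
for x in torch.randn(5, 1, 16):   # toy sequence of 5 steps
    h = cell(x, h)
print(h.shape)  # torch.Size([1, 32])
```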
Bivariate Beta-LSTM
Long Short-Term Memory (LSTM) captures long-term dependencies through a cell state maintained by the input and forget gates, each of which produces a gate value in [0, 1] via a sigmoid function.
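For reference, a sketch of the standard sigmoid-gated LSTM step described here, i.e. the baseline that the Bivariate Beta-LSTM modifies; the paper's Beta-distributed gates themselves are not shown:

```python
import torch

def lstm_step(x, h, c, W_x, W_h, b):
    # W_x: (input, 4*hidden), W_h: (hidden, 4*hidden), b: (4*hidden,)
    gates = x @ W_x + h @ W_h + b
    i, f, g, o = gates.chunk(4, dim=-1)
    i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)  # gate values in [0, 1]
    c = f * c + i * torch.tanh(g)   # cell state maintained by input/forget gates
    h = o * torch.tanh(c)
    return h, c

hid, inp = 8, 4
h = c = torch.zeros(1, hid)
h, c = lstm_step(torch.randn(1, inp), h, c,
                 torch.randn(inp, 4 * hid), torch.randn(hid, 4 * hid), torch.zeros(4 * hid))
print(h.shape, c.shape)
```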
Seq-U-Net: A One-Dimensional Causal U-Net for Efficient Sequence Modelling
Compared to TCN and WaveNet, our network consistently saves memory and computation time, achieving comparable performance on all tasks, with training and inference speed-ups of over 4x in the audio generation experiment in particular.
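The building block shared by these causal architectures is a left-padded (causal) 1-D convolution, where the output at time t depends only on inputs up to t; a generic sketch, not the Seq-U-Net reference implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation            # pad on the left only
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                                  # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))

y = CausalConv1d(1, 16, kernel_size=3, dilation=2)(torch.randn(2, 1, 100))
print(y.shape)  # torch.Size([2, 16, 100]) -- same length, no future leakage
```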
PopMAG: Pop Music Accompaniment Generation
To improve harmony, in this paper, we propose a novel MUlti-track MIDI representation (MuMIDI), which enables simultaneous multi-track generation in a single sequence and explicitly models the dependencies among notes from different tracks.
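A rough illustration of merging multi-track notes into one token sequence with explicit track tokens; the token names and note format below are hypothetical placeholders, not MuMIDI's actual vocabulary:

```python
# Sketch of the idea described above: notes from all tracks are flattened into
# ONE token sequence on a shared time grid, so a single sequence model can see
# cross-track dependencies.
def to_single_sequence(tracks):
    """tracks: {track_name: [(onset_step, pitch), ...]} (hypothetical format)."""
    events = sorted(
        (onset, name, pitch)
        for name, notes in tracks.items()
        for onset, pitch in notes
    )
    tokens, last_onset = [], None
    for onset, name, pitch in events:
        if onset != last_onset:
            tokens.append(f"<pos_{onset}>")    # shared time position across tracks
            last_onset = onset
        tokens.extend([f"<track_{name}>", f"<pitch_{pitch}>"])
    return tokens

print(to_single_sequence({"melody": [(0, 72), (2, 74)], "bass": [(0, 36)]}))
```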
Gates Are Not What You Need in RNNs
In this paper, we propose a new recurrent cell, the Residual Recurrent Unit (RRU), which outperforms traditional cells while not employing a single gate.
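A purely illustrative gate-free recurrent cell with a residual update, to show what "no gates" can look like in practice; the actual RRU formulation in the paper differs:

```python
import torch
import torch.nn as nn

class GatelessResidualCell(nn.Module):
    """Illustrative only: recurrence via a residual path, with no sigmoid gates."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lin = nn.Linear(input_size + hidden_size, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x, h):
        update = torch.relu(self.lin(torch.cat([x, h], dim=-1)))
        return self.norm(h + update)          # residual hidden-state update

cell = GatelessResidualCell(16, 32)
h = torch.zeros(1, 32)
for x in torch.randn(5, 1, 16):
    h = cell(x, h)
print(h.shape)  # torch.Size([1, 32])
```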
Low-Rank Constraints for Fast Inference in Structured Models
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
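A generic illustration of why low-rank structure helps, assuming an HMM-style marginalisation over n states with a rank-r factored transition matrix; this shows only the cost reduction, not the paper's models or inference algorithms:

```python
import numpy as np

n, r = 1000, 32
U, V = np.random.rand(n, r), np.random.rand(r, n)   # rank-r factors of an n x n matrix
alpha = np.random.rand(n)                            # e.g. forward messages over n states

dense = alpha @ (U @ V)        # O(n^2) if the dense transition matrix is materialised
low_rank = (alpha @ U) @ V     # same result, O(n*r) time and memory
print(np.allclose(dense, low_rank))  # True (up to float error)
```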