The Legendre Memory Unit (LMU) is a recurrent architecture whose memory cell is mathematically derived to orthogonalize its continuous-time input history. It does this by solving $d$ coupled ordinary differential equations (ODEs), whose phase space maps linearly onto sliding windows of time via the Legendre polynomials up to degree $d-1$. This construction makes the memory cell optimal for compressing the history of its input across a sliding window of length $\theta$ into $d$ state variables.
From the paper, the memory vector $m(t) \in \mathbb{R}^d$ evolves according to

$$\theta \, \dot{m}(t) = A\,m(t) + B\,u(t),$$

where $u(t)$ is the input signal, $\theta$ is the window length, and $A \in \mathbb{R}^{d \times d}$, $B \in \mathbb{R}^{d \times 1}$ are given by

$$a_{ij} = (2i+1)\begin{cases} -1 & i < j \\ (-1)^{i-j+1} & i \ge j \end{cases}, \qquad b_i = (2i+1)(-1)^i, \qquad i, j \in [0, d-1].$$
Official GitHub repo: https://github.com/abr/lmu
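For concreteness, here is a minimal NumPy/SciPy sketch of the memory cell's dynamics (an illustration, not the official implementation linked above): it constructs the $A$ and $B$ matrices defined in the paper and discretizes them with a zero-order hold, as the paper does, so the memory can be stepped one sample at a time. The helper names `lmu_matrices` and `discretize_zoh` are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

def lmu_matrices(d, theta):
    """Continuous-time LMU memory dynamics: m'(t) = (A/theta) m(t) + (B/theta) u(t)."""
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            # a_ij = (2i+1) * (-1 if i < j else (-1)^(i-j+1))
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    k = np.arange(d)
    B = ((2 * k + 1) * (-1.0) ** k).reshape(d, 1)  # b_i = (2i+1)(-1)^i
    return A / theta, B / theta

def discretize_zoh(A, B, dt):
    """Zero-order-hold discretization via the augmented matrix exponential."""
    d = A.shape[0]
    M = expm(np.block([[A, B], [np.zeros((1, d + 1))]]) * dt)
    return M[:d, :d], M[:d, d:]  # A_bar, B_bar

# Step the memory over a toy input signal.
d, theta, dt = 6, 1.0, 0.01            # memory order, window length (s), time step (s)
A, B = lmu_matrices(d, theta)
A_bar, B_bar = discretize_zoh(A, B, dt)
m = np.zeros((d, 1))
for u in np.sin(2 * np.pi * np.arange(200) * dt):
    m = A_bar @ m + B_bar * u          # m encodes u over the last theta seconds
```

In the full LMU layer from the paper, this linear memory is coupled with a nonlinear hidden state that writes to and reads from it, and the window itself can be decoded from $m$ by evaluating the shifted Legendre polynomials.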
Source: Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks (Voelker et al., NeurIPS 2019)
| Task | Papers | Share |
|---|---|---|
| Image Classification | 2 | 10.53% |
| Machine Translation | 2 | 10.53% |
| Sentiment Analysis | 2 | 10.53% |
| Sequential Image Classification | 2 | 10.53% |
| Translation | 2 | 10.53% |
| Fraud Detection | 1 | 5.26% |
| Speech Recognition | 1 | 5.26% |
| Natural Language Inference | 1 | 5.26% |
| Semantic Similarity | 1 | 5.26% |