
Hierarchical BiLSTM Max Pooling

Introduced by Talman et al. in Sentence Embeddings in NLI with Iterative Refinement Encoders

HBMP is a sentence encoder built from a hierarchy of BiLSTM layers, each followed by max pooling over its hidden states. The model improves on the previous state of the art for SciTail and achieves strong results on SNLI and MultiNLI.
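A minimal PyTorch sketch of the idea, assuming illustrative hyperparameters (300-d embeddings, 600-d hidden size, three BiLSTM layers): each BiLSTM re-reads the word embeddings with its states initialized from the previous layer's final states (the iterative refinement), each layer's outputs are max-pooled over time, and the pooled vectors are concatenated into the sentence embedding.

```python
import torch
import torch.nn as nn

class HBMP(nn.Module):
    """Sketch of Hierarchical BiLSTM Max Pooling; dimensions are
    illustrative assumptions, not the paper's exact configuration."""
    def __init__(self, embed_dim=300, hidden_dim=600):
        super().__init__()
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.lstm1 = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.lstm2 = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.lstm3 = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)

    def forward(self, x):
        # Each BiLSTM re-reads the embeddings, initialized with the
        # previous layer's final hidden/cell states (iterative refinement).
        out1, state = self.lstm1(x)
        out2, state = self.lstm2(x, state)
        out3, _ = self.lstm3(x, state)
        # Max-pool each layer's outputs over the time dimension.
        u1 = self.pool(out1.transpose(1, 2)).squeeze(2)
        u2 = self.pool(out2.transpose(1, 2)).squeeze(2)
        u3 = self.pool(out3.transpose(1, 2)).squeeze(2)
        # Concatenate the three pooled vectors into one sentence embedding.
        return torch.cat([u1, u2, u3], dim=1)

emb = torch.randn(4, 12, 300)  # batch of 4 sentences, 12 tokens each
vec = HBMP()(emb)
print(vec.shape)  # 3 layers x 2 directions x 600 hidden units = 3600 dims
```

For NLI, the premise and hypothesis embeddings produced this way would then be combined (e.g. concatenation, difference, element-wise product) and fed to a classifier, as is standard for sentence-encoder approaches.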


Latest Papers

PAPER | AUTHORS | DATE
FarsTail: A Persian Natural Language Inference Dataset | Hossein Amirkhani, Mohammad Azari Jafari, Azadeh Amirak, Zohreh Pourjafari, Soroush Faridan Jahromi, Zeinab Kouhkan | 2020-09-18
Learning hierarchical behavior and motion planning for autonomous driving | Jingke Wang, Yue Wang, Dongkun Zhang, Yezhou Yang, Rong Xiong | 2020-05-08
Sentence Embeddings in NLI with Iterative Refinement Encoders | Aarne Talman, Anssi Yli-Jyrä, Jörg Tiedemann | 2018-08-27
