Sequence To Sequence Models

Hierarchical BiLSTM Max Pooling

Introduced by Talman et al. in Sentence Embeddings in NLI with Iterative Refinement Encoders

HBMP is a hierarchical structure of BiLSTM layers with max pooling: each BiLSTM re-reads the sentence while being initialised with the previous layer's states, and the max-pooled outputs of all layers are combined into the sentence embedding. The model improves the previous state of the art on SciTail and achieves strong results on SNLI and MultiNLI.

Source: Sentence Embeddings in NLI with Iterative Refinement Encoders
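The iterative-refinement idea can be sketched in PyTorch. This is a minimal illustration, not the paper's implementation: the layer count and dimensions are placeholder values, and the three BiLSTMs read the same token embeddings, pass their final hidden/cell states to the next layer, and have their max-pooled outputs concatenated into one sentence embedding.

```python
import torch
import torch.nn as nn

class HBMP(nn.Module):
    """Sketch of Hierarchical BiLSTM Max Pooling (HBMP).

    Each BiLSTM reads the same token embeddings but is initialised with
    the final (h, c) states of the previous layer (iterative refinement).
    Every layer's outputs are max-pooled over time, and the pooled
    vectors are concatenated into the sentence embedding.
    Dimensions here are illustrative, not the paper's settings.
    """

    def __init__(self, embed_dim=300, hidden_dim=600, n_layers=3):
        super().__init__()
        self.rnns = nn.ModuleList(
            nn.LSTM(embed_dim, hidden_dim,
                    bidirectional=True, batch_first=True)
            for _ in range(n_layers)
        )

    def forward(self, x):                 # x: (batch, seq_len, embed_dim)
        state, pooled = None, []
        for rnn in self.rnns:
            out, state = rnn(x, state)    # reuse previous layer's (h, c)
            pooled.append(out.max(dim=1).values)  # max pool over time
        # (batch, n_layers * 2 * hidden_dim)
        return torch.cat(pooled, dim=1)

emb = torch.randn(4, 12, 300)             # toy batch: 4 sentences, 12 tokens
sent = HBMP()(emb)
print(sent.shape)                          # torch.Size([4, 3600])
```

For NLI, the embeddings of premise and hypothesis are typically combined (e.g. concatenation with their difference and element-wise product) before a classifier head.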


Tasks


Task Papers Share
Natural Language Inference 2 22.22%
Autonomous Driving 1 11.11%
Decision Making 1 11.11%
Motion Planning 1 11.11%
Reinforcement Learning (RL) 1 11.11%
Sentence 1 11.11%
Sentence Embedding 1 11.11%
Sentence Embeddings 1 11.11%

