Deep Biaffine Attention for Neural Dependency Parsing

Last updated on Mar 15, 2021

Parameters: 20 Million
File Size: 69.83 MB
Training Data: Penn Treebank
Architecture: Dropout, ELU, Feedforward Network, Linear Layer, Variational Dropout
Epochs: 50
Dropout: 0.3
Batch Size: 128
Encoder Type: stacked_bidirectional_lstm
Encoder Layers: 3
Encoder Input Size: 200
Encoder Hidden Size: 400

Summary

This dependency parser follows the model of Deep Biaffine Attention for Neural Dependency Parsing (Dozat and Manning, 2016).

Word representations are produced by a bidirectional LSTM and fed to separate biaffine classifiers that, for each pair of words, predict whether a directed arc exists between them and which dependency label that arc should carry. Decoding can be done greedily, or the optimal tree can be recovered with the Chu-Liu/Edmonds maximum spanning tree (MST) algorithm by viewing the dependency tree as an MST over a fully connected graph whose nodes are words and whose edges are scored dependency arcs.
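To make the scoring concrete, here is a minimal sketch of a biaffine arc scorer in PyTorch. It is illustrative only: the dimensions, names, and initialization are assumptions, not AllenNLP's actual module. Each word gets separate "head" and "dependent" vectors from small MLPs over the BiLSTM output, and a biaffine form scores every head-dependent pair.

import torch
import torch.nn as nn

class BiaffineArcScorer(nn.Module):
    """Illustrative biaffine arc scorer (not AllenNLP's exact implementation)."""

    def __init__(self, encoder_dim: int = 800, arc_dim: int = 500):
        super().__init__()
        # Separate MLPs give head- and dependent-specific views of each word.
        self.head_mlp = nn.Sequential(nn.Linear(encoder_dim, arc_dim), nn.ELU())
        self.dep_mlp = nn.Sequential(nn.Linear(encoder_dim, arc_dim), nn.ELU())
        # Biaffine weight; the extra column, paired with the 1 appended to the
        # dependent vectors, contributes a head-only prior score.
        self.U = nn.Parameter(torch.empty(arc_dim, arc_dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, encoded: torch.Tensor) -> torch.Tensor:
        # encoded: (batch, seq_len, encoder_dim) from the bidirectional LSTM.
        heads = self.head_mlp(encoded)                 # (batch, n, arc_dim)
        deps = self.dep_mlp(encoded)                   # (batch, n, arc_dim)
        ones = torch.ones(deps.shape[:-1] + (1,), device=deps.device)
        deps = torch.cat([deps, ones], dim=-1)         # (batch, n, arc_dim + 1)
        # scores[b, i, j] = score of word i being the head of word j.
        return heads @ self.U @ deps.transpose(1, 2)   # (batch, n, n)

# scores = BiaffineArcScorer()(torch.randn(2, 7, 800))  # -> shape (2, 7, 7)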

Explore the live Dependency Parsing demo at AllenNLP.

How do I load this model?

from allennlp_models.pretrained import load_predictor
predictor = load_predictor("structured-prediction-biaffine-parser")
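If you prefer to pin a specific archive instead of the registered model name, the same predictor can also be loaded from an archive URL or local path with allennlp's Predictor.from_path; a small sketch using the archive URL from the CLI example below:

from allennlp.predictors.predictor import Predictor

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/biaffine-dependency-parser-ptb-2020.04.06.tar.gz"
)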

Getting predictions

sentence = "The dog was chased by the cat."
preds = predictor.predict(sentence)
words = preds["words"]
poss = preds["pos"]
deps = preds["predicted_dependencies"]
for word, pos, dep in zip(words, poss, deps):
    print(f"{word} ({pos}) [{dep}]")
# prints:
# The (DET) [det]
# dog (NOUN) [nsubjpass]
# was (AUX) [auxpass]
# chased (VERB) [root]
# by (ADP) [prep]
# the (DET) [det]
# cat (NOUN) [pobj]
# . (PUNCT) [punct]
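The prediction dictionary also includes head indices, so the full arcs can be reconstructed; a short sketch, assuming the predicted_heads key returned by this parser (indices are 1-based, with 0 denoting the artificial ROOT):

heads = preds["predicted_heads"]  # head index for each token; 0 means ROOT
for i, (word, head, dep) in enumerate(zip(words, heads, deps), start=1):
    head_word = "ROOT" if head == 0 else words[head - 1]
    print(f"{dep}({head_word}-{head}, {word}-{i})")
# e.g. nsubjpass(chased-4, dog-2)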

You can also get predictions using the allennlp command-line interface:

echo '{"sentence": "The dog was chased by the cat."}' | \
    allennlp predict https://storage.googleapis.com/allennlp-public-models/biaffine-dependency-parser-ptb-2020.04.06.tar.gz -
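To score several sentences at once from Python rather than the shell, the predictor also accepts batched JSON inputs in the same {"sentence": ...} format; a brief sketch using predict_batch_json:

batch = [
    {"sentence": "The dog was chased by the cat."},
    {"sentence": "Economic news had little effect on financial markets."},
]
for output in predictor.predict_batch_json(batch):
    print(list(zip(output["words"], output["predicted_dependencies"])))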

How do I train this model?

To train this model, you can use the allennlp CLI tool with the configuration file dependency_parser.jsonnet:

allennlp train dependency_parser.jsonnet -s output_dir

See the AllenNLP Training and prediction guide for more details.
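Training can also be launched from Python instead of the command line; a minimal sketch, assuming AllenNLP's train_model_from_file helper and the same configuration file (the serialization directory is a placeholder):

from allennlp.commands.train import train_model_from_file

# Trains with the given jsonnet config and writes model.tar.gz, logs,
# and metrics to the serialization directory.
train_model_from_file(
    parameter_filename="dependency_parser.jsonnet",
    serialization_dir="output_dir",
)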

Citation

@article{Dozat2017DeepBA,
 author = {Timothy Dozat and Christopher D. Manning},
 journal = {ArXiv},
 title = {Deep Biaffine Attention for Neural Dependency Parsing},
 volume = {abs/1611.01734},
 year = {2017}
}

Results

Dependency Parsing on Penn Treebank

Benchmark: Penn Treebank
Model: Deep Biaffine Attention for Neural Dependency Parsing
UAS: 95.57 (global rank #1)
LAS: 94.44 (global rank #1)