SRL BERT

Last updated on Mar 15, 2021

Parameters 110 Million
File Size 387.17 MB
Training Data OntoNotes 5.0

Training Techniques AdamW
Architecture BERT, Dropout, Layer Normalization, Linear Layer, Tanh
LR 0.00005
Epochs 15
Batch Size 32

Summary

An implementation of a BERT-based model (Shi et al., 2019) with some modifications (no additional parameters beyond a linear classification layer).

Explore the live Semantic Role Labeling demo at AllenNLP.

How do I load this model?

from allennlp_models.pretrained import load_predictor
predictor = load_predictor("structured-prediction-srl-bert")

Getting predictions

sentence = "John broke the window with a rock."
preds = predictor.predict(sentence)
print(preds["verbs"][0]["description"])
# prints:
# [ARG0: John] [V: broke] [ARG1: the window] [ARG2: with a rock] .
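The description string above is rendered from per-token BIO tags, which the predictor also returns (under each verb's "tags", alongside preds["words"]). As a minimal sketch of how such a tag sequence can be grouped into labeled argument spans, using illustrative words and tags mirroring the example sentence (not captured from a live predictor run):

```python
# Group a BIO tag sequence into (label, text) argument spans.
# The sample words/tags below are illustrative, mirroring what the
# predictor returns for "John broke the window with a rock."

def tags_to_spans(words, tags):
    """Collect (label, text) pairs from BIO-tagged tokens."""
    spans = []
    current_label, current_tokens = None, []
    for word, tag in zip(words, tags):
        if tag.startswith("B-"):
            # A new span begins; close any span that is still open.
            if current_label is not None:
                spans.append((current_label, " ".join(current_tokens)))
            current_label, current_tokens = tag[2:], [word]
        elif tag.startswith("I-") and current_label is not None:
            current_tokens.append(word)
        else:  # an "O" tag ends any open span
            if current_label is not None:
                spans.append((current_label, " ".join(current_tokens)))
            current_label, current_tokens = None, []
    if current_label is not None:
        spans.append((current_label, " ".join(current_tokens)))
    return spans

words = ["John", "broke", "the", "window", "with", "a", "rock", "."]
tags = ["B-ARG0", "B-V", "B-ARG1", "I-ARG1", "B-ARG2", "I-ARG2", "I-ARG2", "O"]
print(tags_to_spans(words, tags))
# [('ARG0', 'John'), ('V', 'broke'), ('ARG1', 'the window'), ('ARG2', 'with a rock')]
```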

You can also get predictions using the allennlp command-line interface:

echo '{"sentence": "John broke the window with a rock."}' | \
    allennlp predict https://storage.googleapis.com/allennlp-public-models/structured-prediction-srl-bert.2020.12.15.tar.gz -

How do I train this model?

To train this model, use the allennlp CLI tool with the configuration file bert_base_srl.jsonnet:

allennlp train bert_base_srl.jsonnet -s output_dir
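The hyperparameters listed at the top of this card (AdamW, LR 0.00005, 15 epochs, batch size 32) correspond to fields in that training config. A hedged sketch of the relevant fragment, using common AllenNLP config key names that may differ from the actual bert_base_srl.jsonnet:

```jsonnet
{
  // Illustrative fragment only; see the real bert_base_srl.jsonnet for full settings.
  "data_loader": {
    "batch_size": 32
  },
  "trainer": {
    "num_epochs": 15,
    "optimizer": {
      "type": "huggingface_adamw",
      "lr": 5e-5
    }
  }
}
```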

See the AllenNLP Training and prediction guide for more details.

Citation

@article{Shi2019SimpleBM,
  author  = {Peng Shi and Jimmy Lin},
  journal = {ArXiv},
  title   = {Simple BERT Models for Relation Extraction and Semantic Role Labeling},
  volume  = {abs/1904.05255},
  year    = {2019}
}

Results

Semantic Role Labeling on OntoNotes

BENCHMARK   MODEL      METRIC   VALUE   GLOBAL RANK
OntoNotes   SRL BERT   F1       86.49   #1