Syntactic Multi-view Learning for Open Information Extraction

5 Dec 2022  ·  Kuicai Dong, Aixin Sun, Jung-jae Kim, XiaoLi Li

Open Information Extraction (OpenIE) aims to extract relational tuples from open-domain sentences. Traditional rule-based or statistical models have been developed based on syntactic structures of sentences, identified by syntactic parsers. However, previous neural OpenIE models under-explore this useful syntactic information. In this paper, we model both constituency and dependency trees as word-level graphs, and enable neural OpenIE to learn from the syntactic structures. To better fuse heterogeneous information from both graphs, we adopt multi-view learning to capture multiple relationships from them. Finally, the fine-tuned constituency and dependency representations are aggregated with sentential semantic representations for tuple generation. Experiments show that both constituency and dependency information, as well as multi-view learning, are effective.
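
The architecture sketched in the abstract (and reflected in the ablation rows below, e.g. the BERT + Dep-GCN and BERT + Const-GCN variants) pairs a BERT sentence encoder with GCNs over the two word-level syntactic graphs. Below is a minimal, illustrative PyTorch sketch of that idea, not the authors' implementation: class and argument names (GCNLayer, SyntacticMultiViewEncoder, dep_adj, const_adj) are assumptions, graph construction and the multi-view training objectives are omitted, and the three views are fused by simple concatenation.

```python
# Minimal sketch (assumed structure, not the authors' released code):
# fuse BERT token states with two GCN "views" over word-level syntactic graphs.
import torch
import torch.nn as nn
from transformers import AutoModel


class GCNLayer(nn.Module):
    """One graph-convolution step: aggregate neighbour states through an adjacency matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq, dim) token representations
        # adj: (batch, seq, seq) row-normalised word-level graph
        #      (built from a dependency or constituency parse; construction omitted here)
        return torch.relu(self.linear(torch.bmm(adj, h)))


class SyntacticMultiViewEncoder(nn.Module):
    """BERT semantics + dependency view + constituency view, concatenated per token."""

    def __init__(self, model_name: str = "bert-base-uncased", num_labels: int = 5):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        dim = self.bert.config.hidden_size
        self.dep_gcn = GCNLayer(dim)    # view 1: dependency graph
        self.const_gcn = GCNLayer(dim)  # view 2: constituency-derived word graph
        self.classifier = nn.Linear(3 * dim, num_labels)  # e.g. BIO-style tuple tags

    def forward(self, input_ids, attention_mask, dep_adj, const_adj):
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h_dep = self.dep_gcn(h, dep_adj)        # syntactic view 1
        h_const = self.const_gcn(h, const_adj)  # syntactic view 2
        fused = torch.cat([h, h_dep, h_const], dim=-1)  # aggregate semantic + syntactic views
        return self.classifier(fused)           # per-token logits for tuple generation
```

In the paper, the two syntactic representations are additionally shaped by multi-view learning objectives before aggregation; the plain concatenation above corresponds only to the final fusion step described in the abstract.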


Datasets

LSOIE-wiki

Task: Open Information Extraction  ·  Dataset: LSOIE-wiki  ·  Metric: F1

Model                                        F1       Global Rank
SMiLe-OIE (this paper)                       51.73    #1
BERT + Dep-GCN - Const-GCN                   50.21    #2
BERT + Dep-GCN [?] Const-GCN                 49.89    #3
BERT + Const-GCN                             49.71    #4
IMoJIE (Kolluru et al., 2020)                49.24    #5
BERT + Dep-GCN                               48.71    #6
BERT (Solawetz and Larson, 2021)             47.54    #7
CIGL-OIE + IGL-CA (Kolluru et al., 2020)     44.75    #8
GloVe + bi-LSTM + CRF                        44.48    #9
GloVe + bi-LSTM (Stanovsky et al., 2018)     43.90    #10
CopyAttention (Cui et al., 2018)             39.52    #11

Methods


No methods listed for this paper.