AMR Parsing with Latent Structural Information

ACL 2020  ·  Qiji Zhou, Yue Zhang, Donghong Ji, Hao Tang

Abstract Meaning Representations (AMRs) capture sentence-level semantics as structural representations of broad-coverage natural sentences. We investigate parsing AMR with explicit dependency structures and interpretable latent structures. We generate the latent soft structure without additional annotations, and fuse both dependency and latent structures via an extended graph neural network. The fused structural information helps our models achieve the best reported results on both AMR 2.0 (77.5% Smatch F1 on LDC2017T10) and AMR 1.0 (71.8% Smatch F1 on LDC2014T12).
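The fusion idea in the abstract can be illustrated with a minimal sketch: a hard dependency adjacency matrix and a soft latent adjacency (here simply a softmax over pairwise scores) are mixed and used for one step of message passing. This is not the paper's architecture; the mixing weight `alpha`, the score matrix, and the single `tanh` propagation layer are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fused_gnn_layer(H, A_dep, scores, W, alpha=0.5):
    """One illustrative message-passing step over a fused graph.

    H:      (n, d) node features
    A_dep:  (n, n) 0/1 dependency adjacency (explicit structure)
    scores: (n, n) unnormalised pairwise scores; their softmax plays
            the role of the soft latent structure (learned without
            extra annotation in the paper; random here)
    W:      (d, d) layer weight matrix
    alpha:  hypothetical mixing weight between the two structures
    """
    A_lat = softmax(scores, axis=-1)            # soft latent adjacency
    A = alpha * A_dep + (1.0 - alpha) * A_lat   # fuse hard + soft graphs
    deg = A.sum(axis=-1, keepdims=True) + 1e-9  # row-normalise messages
    return np.tanh((A / deg) @ H @ W)

rng = np.random.default_rng(0)
n, d = 4, 8
H = rng.standard_normal((n, d))
A_dep = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)   # toy dependency tree
scores = rng.standard_normal((n, n))
out = fused_gnn_layer(H, A_dep, scores, rng.standard_normal((d, d)))
print(out.shape)  # (4, 8)
```

In the actual model the latent scores would be produced by a learned scorer and trained end-to-end; the sketch only shows how a hard and a soft adjacency can be combined before propagation.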
