1 code implementation • NAACL 2022 • Fei Wang, Zhewei Xu, Pedro Szekely, Muhao Chen
The method prunes full self-attention into an order-invariant graph attention that captures the connectivity of cells belonging to the same row or column, distinguishing structurally relevant cells from irrelevant ones.
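The row/column connectivity described above can be illustrated with a small sketch. This is not the paper's code; `table_attention_mask` and its row/column index inputs are hypothetical, and real implementations would also handle header cells and the text prefix:

```python
import numpy as np

def table_attention_mask(rows, cols):
    """Illustrative sketch: build a boolean attention mask where flattened
    table cell i may attend to cell j only if they share a row or a column.
    `rows`/`cols` give each cell's row and column index."""
    rows = np.asarray(rows)
    cols = np.asarray(cols)
    same_row = rows[:, None] == rows[None, :]  # pairwise same-row test
    same_col = cols[:, None] == cols[None, :]  # pairwise same-column test
    return same_row | same_col  # visible iff structurally connected

# A 2x2 table flattened row-major; cell k sits at (rows[k], cols[k]).
rows = [0, 0, 1, 1]
cols = [0, 1, 0, 1]
mask = table_attention_mask(rows, cols)
# Cell (0,0) sees (0,1) (same row) and (1,0) (same column),
# but not (1,1), which shares neither.
```

Because the mask depends only on row/column membership, permuting the rows or columns of the table permutes the mask consistently, which is what makes the attention order-invariant.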
Ranked #2 on Data-to-Text Generation on ToTTo