ConTFV: A Contrastive Learning Framework for Table-based Fact Verification

ACL ARR November 2021 · Anonymous

Table-based fact verification is a binary classification task whose main challenges lie in parsing the table's structure and performing symbolic reasoning. Joint pre-training on abundant textual and tabular data has recently been explored for table semantic parsing. However, these models are designed for general table understanding, so directly fine-tuning them for table-based fact verification cannot fully exploit their advantages. In this paper, we propose ConTFV, a Contrastive learning framework for Table-based Fact Verification, which makes the pre-trained model more task-relevant and taps its potential for better representations. By transforming verification into a semantic similarity task, our method outperforms baselines by 1.2% on TabFact.
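The abstract gives no implementation details, but a typical contrastive objective for statement-table pairs is an in-batch InfoNCE-style loss, where each statement's matching table is the positive and the other tables in the batch serve as negatives. The sketch below is a minimal NumPy illustration of that general technique, not the paper's actual loss; the function name, temperature value, and embedding setup are assumptions.

```python
import numpy as np

def contrastive_loss(stmt_emb, table_emb, temperature=0.1):
    """In-batch contrastive (InfoNCE-style) loss sketch.

    stmt_emb, table_emb: (batch, dim) arrays where row i of each
    is a matching statement-table pair (the positive); all other
    rows act as in-batch negatives. Hypothetical illustration only.
    """
    # L2-normalize so the dot product is cosine similarity
    stmt = stmt_emb / np.linalg.norm(stmt_emb, axis=1, keepdims=True)
    table = table_emb / np.linalg.norm(table_emb, axis=1, keepdims=True)
    logits = stmt @ table.T / temperature

    # Numerically stable log-softmax over each row
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Positives sit on the diagonal; minimize their negative log-likelihood
    return -np.mean(np.diag(log_probs))
```

With perfectly aligned pairs the loss approaches zero, while mismatched pairs drive it up, pushing statement embeddings toward their supporting tables and away from others in the batch.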
