PALT: Parameter-Lite Transfer of Language Models for Knowledge Graph Completion

25 Oct 2022 · Jianhao Shen, Chenguang Wang, Ye Yuan, Jiawei Han, Heng Ji, Koushik Sen, Ming Zhang, Dawn Song

This paper presents a parameter-lite transfer learning approach for pretrained language models (LMs) applied to knowledge graph (KG) completion. Instead of finetuning, which modifies all LM parameters, we tune only a small number of new parameters while keeping the original LM parameters fixed. We achieve this by reformulating KG completion as a "fill-in-the-blank" task and introducing a parameter-lite encoder on top of the original LM. We show that, by tuning far fewer parameters than finetuning, LMs transfer non-trivially to most tasks and are competitive with prior state-of-the-art approaches. For instance, we outperform full finetuning on a KG completion benchmark while tuning only 1% of the parameters. The code and datasets are available at https://github.com/yuanyehome/PALT.
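To make the setup concrete, below is a minimal sketch of the general idea described in the abstract: pose KG completion as a fill-in-the-blank query over a verbalized triple, freeze every pretrained LM parameter, and train only a small new encoder inserted between the frozen encoder and the frozen MLM head. This is not the authors' PALT code; the BERT checkpoint, the `LiteEncoder` bottleneck, and the example cloze sentence are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertTokenizer, BertForMaskedLM

class LiteEncoder(nn.Module):
    """Small trainable bottleneck with a residual connection (illustrative)."""
    def __init__(self, hidden_size=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
lm = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Keep all original LM parameters fixed; only the new encoder is trained.
for p in lm.parameters():
    p.requires_grad = False

lite = LiteEncoder(lm.config.hidden_size)

# KG completion posed as fill-in-the-blank: predict the masked tail entity
# of a verbalized (head, relation, ?) triple.
text = "Barack Obama was born in [MASK]."
inputs = tokenizer(text, return_tensors="pt")

hidden = lm.bert(**inputs).last_hidden_state  # frozen encoder
logits = lm.cls(lite(hidden))                 # frozen MLM head on adapted states

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top = logits[0, mask_pos].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top))

# Only the bottleneck parameters are trainable, a small fraction of the total.
trainable = sum(p.numel() for p in lite.parameters())
total = trainable + sum(p.numel() for p in lm.parameters())
print(f"trainable fraction: {trainable / total:.2%}")
```

In a sketch like this, an optimizer would be constructed over `lite.parameters()` only, so gradient updates never touch the pretrained weights; the exact placement and form of the added parameters in PALT may differ from this bottleneck adapter.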


Results from the Paper


Task             Dataset     Model  Metric   Metric Value  Global Rank
Link Prediction  FB15k-237   PALT   Hits@10  0.444         #59
Link Prediction  FB15k-237   PALT   MR       144           #7
Link Prediction  UMLS        PALT   Hits@10  0.990         #5
Link Prediction  UMLS        PALT   MR       1.57          #7
Link Prediction  WN18RR      PALT   Hits@10  0.693         #8
Link Prediction  WN18RR      PALT   MR       61            #4

Methods


No methods listed for this paper.