Search Results for author: Kichun Lee

Found 1 paper, 1 paper with code

Knowledge Distillation for BERT Unsupervised Domain Adaptation

1 code implementation · 22 Oct 2020 · Minho Ryu, Kichun Lee

A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks.
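As background for the listed paper, below is a minimal sketch of a generic soft-target knowledge distillation loss in PyTorch. The function name, temperature value, and weighting are illustrative assumptions, not the authors' exact objective for BERT domain adaptation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions with a temperature, then measure the
    # KL divergence from teacher to student; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```

In practice such a term is typically added to the usual task loss on labeled source data, with the teacher's outputs providing soft targets for the student on unlabeled target-domain text.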

Tasks: General Classification · Knowledge Distillation · +4
