Search Results for author: Jae-young Jo

Found 1 paper, 1 paper with code

Roles and Utilization of Attention Heads in Transformer-based Neural Language Models

1 code implementation • ACL 2020 • Jae-young Jo, Sung-Hyon Myaeng

Sentence encoders based on the transformer architecture have shown promising results on various natural language tasks.

