Language Representation in Multilingual BERT and its applications to improve Cross-lingual Generalization

A token embedding in multilingual BERT (m-BERT) contains both language and semantic information. We find that the representation of a language can be obtained by simply averaging the embeddings of the tokens of that language...
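The averaging idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random arrays stand in for m-BERT token embeddings (which in practice would come from a pretrained encoder), and the language-shift step is an assumption about how such a vector could serve cross-lingual transfer.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 768  # m-BERT hidden size

def language_vector(token_embeddings: np.ndarray) -> np.ndarray:
    """Average a language's token embeddings into one language vector."""
    return token_embeddings.mean(axis=0)

# Stand-in token embeddings for two hypothetical monolingual corpora;
# the small offset mimics a systematic language-specific shift.
en_tokens = rng.normal(size=(1000, HIDDEN))
de_tokens = rng.normal(size=(1000, HIDDEN)) + 0.1

v_en = language_vector(en_tokens)
v_de = language_vector(de_tokens)

# A possible cross-lingual use: subtract one language vector and add
# another to move representations between language regions of the space.
shift = v_de - v_en
en_shifted = en_tokens + shift
```

Because the operation is a plain mean, it needs no labels or parallel data, only monolingual text run through the encoder.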
