Search Results for author: Tao

Found 1 paper, 1 paper with code

N-Grammer: Augmenting Transformers with latent n-grams

2 code implementations · 13 Jul 2022 · Aurko Roy, Rohan Anil, Guangda Lai, Benjamin Lee, Jeffrey Zhao, Shuyuan Zhang, Shibo Wang, Ye Zhang, Shen Wu, Rigel Swavely, Tao Yu, Phuong Dao, Christopher Fifty, Zhifeng Chen, Yonghui Wu

Transformer models have recently emerged as one of the foundational models in natural language processing, and as a byproduct, there is significant recent interest and investment in scaling these models.

Tasks: Common Sense Reasoning, Coreference Resolution, +5
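
The paper itself is not reproduced on this page, but the title points at the core idea: look up embeddings for n-grams derived from the input sequence and combine them with the ordinary token embeddings before the Transformer stack. Below is a minimal, illustrative sketch of that general idea only; the table sizes, the toy bigram hashing, and the function name `embed_with_bigrams` are assumptions for demonstration, not the paper's actual method (which derives latent n-gram ids from learned codebooks).

```python
# Minimal sketch (not the paper's implementation): augment token embeddings
# with hashed bigram ("n-gram") embeddings before feeding a Transformer.
# All sizes and the hashing scheme are illustrative assumptions.
import jax
import jax.numpy as jnp

VOCAB, NGRAM_VOCAB, D_TOK, D_NGRAM = 1000, 4096, 64, 16

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
tok_emb = jax.random.normal(k1, (VOCAB, D_TOK))            # token embedding table
ngram_emb = jax.random.normal(k2, (NGRAM_VOCAB, D_NGRAM))  # bigram embedding table

def embed_with_bigrams(token_ids):
    """Return token embeddings concatenated with hashed-bigram embeddings."""
    tok = tok_emb[token_ids]                                  # [seq, D_TOK]
    prev = jnp.concatenate([token_ids[:1], token_ids[:-1]])   # previous token (first repeats)
    bigram_ids = (token_ids * 1000003 + prev) % NGRAM_VOCAB   # toy hash of (prev, cur) pairs
    ngram = ngram_emb[bigram_ids]                             # [seq, D_NGRAM]
    return jnp.concatenate([tok, ngram], axis=-1)             # [seq, D_TOK + D_NGRAM]

ids = jnp.array([5, 17, 42, 17, 9])
print(embed_with_bigrams(ids).shape)  # (5, 80)
```

The combined embeddings would then be passed to an ordinary Transformer encoder or decoder; only the input representation changes.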
