HuggingFace's Transformers: State-of-the-art Natural Language Processing

9 Oct 2019 · Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, Rémi Louf, Morgan Funtowicz, Joe Davison, Sam Shleifer, Patrick von Platen, Clara Ma, Yacine Jernite, Julien Plu, Canwen Xu, Teven Le Scao, Sylvain Gugger, Mariama Drame, Quentin Lhoest, Alexander M. Rush

Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks.
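To make the abstract's point concrete, here is a minimal sketch of using a pretrained model through the library the paper describes. It assumes `transformers` and a backend such as PyTorch are installed and that the default model weights for the task can be downloaded; the input sentence is purely illustrative.

```python
# Minimal sketch: the pipeline API bundles a tokenizer, a pretrained model,
# and task-specific post-processing behind one call.
from transformers import pipeline

# Assumption: default weights for "sentiment-analysis" are downloadable.
classifier = pipeline("sentiment-analysis")

result = classifier("Transformers makes state-of-the-art NLP easy to use.")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```

The same pattern applies to other tasks (e.g. `"question-answering"` or `"fill-mask"`); only the pipeline name and inputs change.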

