15 Jun 2020 • Yi Yang, Mark Christopher Siy UY, Allen Huang
Contextual pretrained language models, such as BERT (Devlin et al., 2019), have made significant breakthroughs in various NLP tasks by pretraining on large-scale unlabeled text resources. The financial sector likewise accumulates a large amount of financial communication text. However, no pretrained finance-specific language model has been available.
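
To make the setup concrete, here is a minimal sketch (not from the paper) of how a contextual pretrained model such as BERT is typically loaded and queried via its masked-language-model pretraining objective, using the Hugging Face `transformers` library and the general-domain `bert-base-uncased` checkpoint from Devlin et al.; the model name and example sentence are illustrative, and the paper's point is precisely that no finance-specific counterpart of such a checkpoint existed.

```python
# Minimal sketch: querying a general-domain pretrained BERT with the
# masked-language-model objective it was pretrained on. The checkpoint
# name and the financial example sentence are illustrative choices.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Pretraining objective: predict the token hidden behind [MASK].
text = "The company reported strong [MASK] results this quarter."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and read off the top-5 predictions.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_tokens = logits[0, mask_idx].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_tokens))
```

A finance-specific model in the sense of this paper would be produced by running this same masked-LM pretraining over a large corpus of financial communication text rather than general-domain text, so that domain terms receive better contextual representations.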