FinBERT: Financial Sentiment Analysis with Pre-trained Language Models

27 Aug 2019 · Dogu Araci

Financial sentiment analysis is a challenging task due to the specialized language and the lack of labeled data in the financial domain. General-purpose models are not effective enough precisely because of this specialized language. We hypothesize that pre-trained language models can help, since they require fewer labeled examples and can be further trained on domain-specific corpora. We introduce FinBERT, a language model based on BERT, to tackle NLP tasks in the financial domain. Our results show an improvement over the current state of the art in every measured metric on two financial sentiment analysis datasets. We find that, even with a smaller training set and with only part of the model fine-tuned, FinBERT outperforms state-of-the-art machine learning methods.
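
To make the fine-tuning setup concrete, below is a minimal sketch (not the authors' released code) of how a BERT encoder can be fine-tuned for three-class financial sentiment classification with the Hugging Face transformers library. The toy sentences and labels, the choice to freeze all but the last encoder layer and pooler (one possible reading of "fine-tuning only a part of the model"), and the hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the authors' released code): fine-tune a
# BERT encoder for 3-class financial sentiment classification, updating only
# the last encoder layer, the pooler, and the classification head.
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=3)

# Freeze everything in the encoder except the final transformer layer and the
# pooler (one possible reading of "fine-tuning only a part of the model").
for name, param in model.bert.named_parameters():
    if not (name.startswith("encoder.layer.11") or name.startswith("pooler")):
        param.requires_grad = False

# Toy placeholder data; real experiments would use Financial PhraseBank sentences.
texts = ["The company reported a sharp rise in quarterly profit.",
         "Operating loss widened compared to the previous year."]
labels = [2, 0]  # 0 = negative, 1 = neutral, 2 = positive

class SentimentDataset(Dataset):
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True,
                             max_length=64, return_tensors="pt")
        self.labels = torch.tensor(labels)
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.labels[i]
        return item

loader = DataLoader(SentimentDataset(texts, labels), batch_size=2, shuffle=True)
optimizer = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=2e-5)

model.train()
for epoch in range(3):                # illustrative epoch count
    for batch in loader:
        out = model(**batch)          # cross-entropy loss is returned when "labels" is given
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

At inference time, the argmax over the three output logits gives the predicted sentiment class for a sentence.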

Datasets

Financial PhraseBank, FiQA

Task                 Dataset                Model     Metric     Value   Global Rank
Sentiment Analysis   Financial PhraseBank   FinBERT   Accuracy   86      #1
Sentiment Analysis   Financial PhraseBank   FinBERT   F1 score   84      #1
Sentiment Analysis   FiQA                   FinBERT   MSE        0.07    #2
Sentiment Analysis   FiQA                   FinBERT   R^2        0.55    #2
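
The two benchmarks are scored differently: Financial PhraseBank is a three-class classification task, reported with accuracy and F1, while FiQA sentiment targets are continuous scores, so it is evaluated as regression with MSE and R^2. The snippet below is a hedged illustration of how such metrics are typically computed with scikit-learn; the numbers are placeholders, not the paper's predictions, and the macro averaging for F1 is an assumption.

```python
# Illustration with placeholder values (not the paper's predictions) of the
# reported metric types: accuracy/F1 for classification, MSE/R^2 for regression.
from sklearn.metrics import accuracy_score, f1_score, mean_squared_error, r2_score

# Classification (Financial PhraseBank): discrete labels 0/1/2.
y_true_cls = [0, 1, 2, 1, 2]
y_pred_cls = [0, 1, 2, 2, 2]
print("Accuracy:", accuracy_score(y_true_cls, y_pred_cls))
print("Macro F1:", f1_score(y_true_cls, y_pred_cls, average="macro"))

# Regression (FiQA sentiment): continuous scores in [-1, 1].
y_true_reg = [0.30, -0.50, 0.80, 0.10]
y_pred_reg = [0.25, -0.40, 0.70, -0.05]
print("MSE :", mean_squared_error(y_true_reg, y_pred_reg))
print("R^2 :", r2_score(y_true_reg, y_pred_reg))
```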

Methods