Language Models

BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total).

Source: BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
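
Because BLOOM is released as an open-access checkpoint, it can be loaded with standard causal language model tooling. The snippet below is a minimal sketch, not taken from the source paper, assuming the Hugging Face transformers library and the small bigscience/bloom-560m checkpoint (the full 176B model is published as bigscience/bloom).

```python
# Minimal sketch: load a small BLOOM checkpoint and generate a continuation.
# Assumes the Hugging Face `transformers` library and the public
# `bigscience/bloom-560m` checkpoint (a small variant of the BLOOM family).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Decoder-only generation: the model autoregressively continues the prompt.
prompt = "BLOOM is a multilingual language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```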

Tasks


Task | Papers | Share
Language Modelling | 15 | 12.50%
Question Answering | 8 | 6.67%
Machine Translation | 5 | 4.17%
Quantization | 5 | 4.17%
Translation | 5 | 4.17%
Text Generation | 4 | 3.33%
Large Language Model | 4 | 3.33%
Benchmarking | 3 | 2.50%
Text Classification | 2 | 1.67%
