GPT-4 is a transformer-based model pre-trained to predict the next token in a document.
Source: GPT-4 Technical Report
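Next-token prediction means the model is trained to output, for each position in a document, a distribution over the vocabulary for the token that follows. As a minimal sketch of that objective (using a toy bigram counter in place of a transformer; all names here are illustrative and not from the GPT-4 Technical Report):

```python
# Sketch of next-token prediction with a toy bigram model.
# A real transformer learns this mapping from data via gradient descent;
# here we just count successor frequencies to illustrate the objective.
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, how often each next token follows it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Greedy decoding: return the most frequent successor of `token`."""
    return counts[token].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # "cat" is the most frequent successor
```

A language model generalizes this idea: instead of counting exact successors, it conditions on the full preceding context and assigns probabilities to every vocabulary token.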
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 80 | 11.78% |
| Large Language Model | 49 | 7.22% |
| Question Answering | 40 | 5.89% |
| Retrieval | 26 | 3.83% |
| In-Context Learning | 25 | 3.68% |
| Code Generation | 18 | 2.65% |
| Benchmarking | 17 | 2.50% |
| Decision Making | 15 | 2.21% |
| Prompt Engineering | 15 | 2.21% |