Language Models

CodeGen is an autoregressive transformer trained with next-token prediction as its language modeling objective, on a natural language corpus and programming language data curated from GitHub.

Source: CodeGen: An Open Large Language Model for Code with Multi-Turn Program Synthesis
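The next-token-prediction objective means the model is trained to minimize the average negative log-likelihood of each token given the tokens before it. A minimal sketch of that loss, using a toy uniform "model" over a tiny vocabulary (all names here are illustrative, not from the CodeGen codebase):

```python
import math

def next_token_nll(token_ids, next_token_probs):
    """Average negative log-likelihood of each token given its prefix.

    token_ids: list of integer token ids.
    next_token_probs: function prefix -> dict {token_id: probability}.
    """
    total = 0.0
    for t in range(1, len(token_ids)):
        p = next_token_probs(token_ids[:t]).get(token_ids[t], 1e-12)
        total += -math.log(p)
    return total / (len(token_ids) - 1)

# Toy "model": uniform distribution over a 4-token vocabulary.
uniform = lambda prefix: {i: 0.25 for i in range(4)}
loss = next_token_nll([0, 1, 2, 3], uniform)
# For a uniform model the loss equals log(vocab_size) = log 4.
```

In training, a neural network replaces the uniform distribution and its parameters are updated by gradient descent on this loss.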

Tasks
Task Papers Share
Code Generation 16 41.03%
Language Modelling 5 12.82%
Program Synthesis 4 10.26%
Memorization 2 5.13%
In-Context Learning 2 5.13%
Large Language Model 2 5.13%
Benchmarking 2 5.13%
Prompt Engineering 1 2.56%
Question Answering 1 2.56%
