Source Code Summarization
37 papers with code • 9 benchmarks • 7 datasets
Code Summarization is the task of comprehending source code and automatically generating natural-language descriptions directly from it.
Source: Improving Automatic Source Code Summarization via Deep Reinforcement Learning
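To make the task's input/output concrete, here is a minimal sketch. Real systems on this page are learned neural models; this toy baseline only derives a summary from the function's identifier (a common heuristic baseline), and the `naive_summary` helper is hypothetical, not from any listed paper.

```python
import re

def naive_summary(source: str) -> str:
    """Toy code-summarization baseline: turn the function name into words.

    Illustrates the task's input (source code) and output (a short
    natural-language description); learned models replace this heuristic.
    """
    match = re.search(r"def\s+(\w+)", source)
    if not match:
        return "no function found"
    # Split a snake_case or camelCase identifier into lowercase words.
    name = match.group(1)
    words = re.sub(r"(?<=[a-z])(?=[A-Z])", "_", name).lower().split("_")
    return "summary: " + " ".join(words)

code = "def read_user_config(path):\n    return json.load(open(path))"
print(naive_summary(code))  # → summary: read user config
```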
Libraries
Use these libraries to find Source Code Summarization models and implementations.
Latest papers
A Prompt Learning Framework for Source Code Summarization
PromptCS trains a prompt agent that generates continuous prompts to unleash the potential of LLMs in code summarization.
Revisiting File Context for Source Code Summarization
Source code summarization is the task of writing natural language descriptions of source code.
Distilled GPT for Source Code Summarization
A code summary is a brief natural language description of source code.
Semantic Similarity Loss for Neural Source Code Summarization
We also propose to combine our loss with traditional CCE for each word, which streamlines the training process compared to baselines.
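The idea of combining a per-word categorical cross-entropy (CCE) with a sentence-level semantic term can be sketched as follows. This is a hypothetical formulation for illustration only, not the paper's exact loss; `combined_loss`, its embeddings, and the `alpha` weighting are all assumptions.

```python
import numpy as np

def combined_loss(pred_probs, target_ids, pred_emb, ref_emb, alpha=0.5):
    """Sketch: per-word cross-entropy plus a semantic-similarity term.

    pred_probs: (T, V) predicted word distributions; target_ids: (T,)
    reference word indices; pred_emb / ref_emb: sentence embeddings of
    the predicted and reference summaries (assumed given).
    """
    # Categorical cross-entropy averaged over the T target words.
    cce = -np.mean(np.log(pred_probs[np.arange(len(target_ids)), target_ids] + 1e-9))
    # Semantic term: one minus cosine similarity of the two summaries.
    cos = np.dot(pred_emb, ref_emb) / (
        np.linalg.norm(pred_emb) * np.linalg.norm(ref_emb))
    return alpha * cce + (1 - alpha) * (1.0 - cos)

probs = np.array([[0.9, 0.1], [0.2, 0.8]])
same = combined_loss(probs, [0, 1], np.array([1.0, 0.0]), np.array([1.0, 0.0]))
diff = combined_loss(probs, [0, 1], np.array([1.0, 0.0]), np.array([0.0, 1.0]))
print(same < diff)  # semantically closer summaries incur a lower loss
```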
Statement-based Memory for Neural Source Code Summarization
For example, existing approaches take the entire subroutine as input to a Transformer- or RNN-based encoder.
Tram: A Token-level Retrieval-augmented Mechanism for Source Code Summarization
In this paper, we propose a fine-grained Token-level retrieval-augmented mechanism (Tram) on the decoder side rather than the encoder side to enhance the performance of neural models and produce more low-frequency tokens in generating summaries.
An Extractive-and-Abstractive Framework for Source Code Summarization
The extractive module in the framework performs a task of extractive code summarization, which takes in the code snippet and predicts important statements containing key factual details.
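The extractive module's interface can be illustrated with a toy statement ranker. The paper's module is a learned model; this sketch uses a crude hand-written importance heuristic (returns and assignments score highest), and `extract_key_statements` is a hypothetical name.

```python
def extract_key_statements(code: str, k: int = 2):
    """Toy extractive step: return the k statements ranked most important
    by a simple heuristic. Stands in for the paper's learned predictor."""
    lines = [ln.strip() for ln in code.splitlines() if ln.strip()]

    def score(line: str) -> int:
        s = 0
        if line.startswith("return"):
            s += 2  # return statements often carry key factual details
        if "=" in line and "==" not in line:
            s += 1  # assignments introduce the values being computed
        return s

    # Stable sort keeps original order among equally scored statements.
    return sorted(lines, key=score, reverse=True)[:k]

snippet = "def f(x):\n    y = x + 1\n    print(y)\n    return y"
print(extract_key_statements(snippet))
```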
M2TS: Multi-Scale Multi-Modal Approach Based on Transformer for Source Code Summarization
They use the learned code representations as input to code summarization models, which can accordingly generate summaries describing source code.
Source Code Summarization with Structural Relative Position Guided Transformer
We further show how the proposed SCRIPT captures structural relative dependencies.
Compositionality-Aware Graph2Seq Learning
It is expected that the compositionality in a graph can be associated with the compositionality of the output sequence in many graph2seq tasks.