Search Results for author: Shengmai Chen

Found 1 paper, 0 papers with code

Is Model Attention Aligned with Human Attention? An Empirical Study on Large Language Models for Code Generation

no code implementations • 2 Jun 2023 • Bonan Kou, Shengmai Chen, Zhijie Wang, Lei Ma, Tianyi Zhang

Through a quantitative experiment and a user study, we confirmed that, among twelve different attention computation methods, attention computed by the perturbation-based method is the most aligned with human attention and is consistently favored by human programmers.

Code Generation
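
This entry has no code implementation, so for intuition only, here is a minimal sketch of one common form of perturbation-based attention (not necessarily the paper's exact method): delete each prompt token in turn and score it by how much the model's log-probability of its own output drops. The model name (gpt2), the deletion-based perturbation, and the helper names are illustrative assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # hypothetical stand-in; the paper studies larger code LLMs
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

def output_logprob(prompt_ids, output_ids):
    """Sum of log-probabilities the model assigns to output_ids given prompt_ids."""
    ids = torch.cat([prompt_ids, output_ids]).unsqueeze(0)
    with torch.no_grad():
        logprobs = torch.log_softmax(model(ids).logits[0], dim=-1)
    offset = prompt_ids.size(0)
    # Logits at position p predict the token at position p + 1.
    return sum(logprobs[offset + i - 1, tok].item()
               for i, tok in enumerate(output_ids))

def perturbation_attention(prompt, output):
    """Score each prompt token by the confidence drop when it is deleted."""
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids[0]
    output_ids = tokenizer(output, return_tensors="pt").input_ids[0]
    base = output_logprob(prompt_ids, output_ids)
    scores = []
    for i in range(prompt_ids.size(0)):
        # Perturb the prompt by removing token i (one simple choice of perturbation).
        perturbed = torch.cat([prompt_ids[:i], prompt_ids[i + 1:]])
        scores.append((tokenizer.decode(prompt_ids[i:i + 1]),
                       base - output_logprob(perturbed, output_ids)))
    return scores

# Tokens whose removal hurts the model's confidence most get the highest scores.
for tok, s in sorted(perturbation_attention("def add(a, b):", " return a + b"),
                     key=lambda x: -x[1]):
    print(f"{tok!r}: {s:.3f}")
```

Unlike raw self-attention weights, a score like this reflects how the output actually changes when an input token is disturbed, which is one plausible reason such methods align better with how human programmers attend to a prompt.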
