Search Results for author: Seohyun Kim

Found 5 papers, 2 papers with code

Improving Code Autocompletion with Transfer Learning

no code implementations • 12 May 2021 • Wen Zhou, Seohyun Kim, Vijayaraghavan Murali, Gareth Ari Aye

Software language models have achieved promising results in predicting code completions, and several industry studies have described successful IDE integrations.

Code Completion · Transfer Learning
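The abstract snippet above only motivates the problem; the technique named in the title and tags is the standard transfer-learning recipe: pretrain a language model on a large code corpus, then fine-tune part of it on the target completion data. The Python sketch below illustrates that generic recipe only, under assumed placeholder names (TinyCodeLM, pretrained.pt, fine_tune_step); it is not the authors' model or training setup.

# Generic transfer-learning recipe for code completion (illustrative sketch only):
# start from a language model pretrained on a large code corpus, freeze the lower
# layers, and fine-tune the upper layers on task-specific completion data.
import torch
import torch.nn as nn

class TinyCodeLM(nn.Module):
    def __init__(self, vocab_size=10000, d_model=256, n_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):  # tokens: (batch, seq) of code-token ids
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1)).to(tokens.device)
        h = self.encoder(self.embed(tokens), mask=mask)
        return self.head(h)     # next-token logits

model = TinyCodeLM()
# model.load_state_dict(torch.load("pretrained.pt"))  # hypothetical source-corpus weights

# Freeze the embedding and the two lowest encoder layers; only the upper layers
# and the output head are updated on the target completion data.
for p in model.embed.parameters():
    p.requires_grad = False
for p in model.encoder.layers[:2].parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def fine_tune_step(tokens):
    # Next-token prediction: predict position t+1 from positions up to t.
    logits = model(tokens[:, :-1])
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), tokens[:, 1:].reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()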

Rotation-Invariant Local-to-Global Representation Learning for 3D Point Cloud

no code implementations • NeurIPS 2020 • Seohyun Kim, Jaeyoo Park, Bohyung Han

We propose a local-to-global representation learning algorithm for 3D point cloud data, which is appropriate to handle various geometric transformations, especially rotation, without explicit data augmentation with respect to the transformations.

3D Object Recognition · Data Augmentation · +1
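The abstract above states the goal, rotation invariance without augmentation, but not the mechanism. As a hedged illustration of the general idea of rotation-invariant local descriptors (not the local-to-global architecture proposed in the paper), the sketch below describes each point's neighborhood by distances and angles, quantities a global rotation cannot change.

# Generic rotation-invariant local features for a point cloud (illustrative only):
# describe each point by distances and angles within its k-nearest neighborhood,
# which are unchanged by any global rotation of the cloud.
import numpy as np

def local_rotation_invariant_features(points, k=16):
    """points: (N, 3) array. Returns (N, k, 2) rotation-invariant features."""
    n = points.shape[0]
    # Pairwise squared distances and k nearest neighbors (excluding the point itself).
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, 1:k + 1]

    centroid = points.mean(axis=0)
    feats = np.zeros((n, k, 2))
    for i in range(n):
        rel = points[knn[i]] - points[i]           # offsets to neighbors
        dist = np.linalg.norm(rel, axis=1)         # invariant: distance to each neighbor
        ref = centroid - points[i]                 # reference direction (rotates with the cloud)
        cos = rel @ ref / (dist * np.linalg.norm(ref) + 1e-9)
        feats[i, :, 0] = dist
        feats[i, :, 1] = cos                       # invariant: angle to the centroid direction
    return feats

# Sanity check: the features are (numerically) identical after a random rotation.
rng = np.random.default_rng(0)
pts = rng.normal(size=(128, 3))
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))       # random orthogonal matrix
assert np.allclose(local_rotation_invariant_features(pts),
                   local_rotation_invariant_features(pts @ q.T), atol=1e-5)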

Code Prediction by Feeding Trees to Transformers

1 code implementation • 30 Mar 2020 • Seohyun Kim, Jinman Zhao, Yuchi Tian, Satish Chandra

We provide comprehensive experimental evaluation of our proposal, along with alternative design choices, on a standard Python dataset, as well as on a Python corpus internal to Facebook.

Type prediction · Value prediction · Software Engineering
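The title summarizes the approach: serialize a program's syntax tree and let a transformer predict the next element. The sketch below shows only the ingestion step under that reading, linearizing a Python AST by pre-order traversal with the standard ast module; the exact serialization, vocabulary, and model used in the paper may differ.

# Hedged sketch of the "feed a tree to a transformer" ingestion step: linearize a
# Python AST with a pre-order (depth-first) traversal into a token sequence that a
# standard sequence model can consume.
import ast

def linearize(node):
    """Pre-order traversal of an AST, yielding (node_type, value_or_None) tokens."""
    value = None
    if isinstance(node, ast.Name):
        value = node.id
    elif isinstance(node, ast.Constant):
        value = repr(node.value)
    elif isinstance(node, ast.Attribute):
        value = node.attr
    elif isinstance(node, (ast.FunctionDef, ast.ClassDef)):
        value = node.name
    yield (type(node).__name__, value)
    for child in ast.iter_child_nodes(node):
        yield from linearize(child)

source = "def add(a, b):\n    return a + b\n"
tokens = list(linearize(ast.parse(source)))
print(tokens[:6])
# e.g. [('Module', None), ('FunctionDef', 'add'), ('arguments', None),
#       ('arg', None), ('arg', None), ('Return', None)]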

When Deep Learning Met Code Search

2 code implementations • 9 May 2019 • Jose Cambronero, Hongyu Li, Seohyun Kim, Koushik Sen, Satish Chandra

Our evaluation shows that: 1. adding supervision to an existing unsupervised technique can improve performance, though not necessarily by much; 2. simple networks for supervision can be more effective than more sophisticated sequence-based networks for code search; 3. while it is common to use docstrings to carry out supervision, there is a sizeable gap between the effectiveness of docstrings and a more query-appropriate supervision corpus.

Code Search · Natural Language Queries
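Finding 2 above concerns "simple networks" for supervised code search: embed the natural-language query and the code snippet in a shared vector space and rank by similarity. The sketch below is a minimal example of that kind of model, a bag-of-tokens embedding pair trained with a ranking loss; it is not one of the specific networks evaluated in the paper, and all sizes and names are placeholders.

# Minimal "simple network" for supervised code search (illustrative sketch only):
# embed the query and the code snippet as bags of tokens, and train the two
# embedding tables so that matching pairs have high cosine similarity.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BagOfTokensSearch(nn.Module):
    def __init__(self, query_vocab=20000, code_vocab=50000, dim=128):
        super().__init__()
        self.query_embed = nn.EmbeddingBag(query_vocab, dim, mode="mean")
        self.code_embed = nn.EmbeddingBag(code_vocab, dim, mode="mean")

    def forward(self, query_ids, code_ids):        # both: (batch, seq) token ids
        q = F.normalize(self.query_embed(query_ids), dim=-1)
        c = F.normalize(self.code_embed(code_ids), dim=-1)
        return (q * c).sum(-1)                      # cosine similarity per pair

model = BagOfTokensSearch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def training_step(query_ids, pos_code_ids, neg_code_ids, margin=0.2):
    # Ranking loss: the matching snippet should score higher than a random one.
    pos = model(query_ids, pos_code_ids)
    neg = model(query_ids, neg_code_ids)
    loss = F.relu(margin - pos + neg).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# At retrieval time, code embeddings are precomputed once; the query embedding is
# compared against all of them and the highest-cosine snippets are returned.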
