Search Results for author: Avishree Khare

Found 2 papers, 1 paper with code

GrACE: Generation using Associated Code Edits

no code implementations • 23 May 2023 • Priyanshu Gupta, Avishree Khare, Yasharth Bajpai, Saikat Chakraborty, Sumit Gulwani, Aditya Kanade, Arjun Radhakrishna, Gustavo Soares, Ashish Tiwari

In our experiments with two datasets, the knowledge of prior edits boosts the performance of the LLMs significantly and enables them to generate 29% and 54% more correctly edited code in top-1 suggestions relative to the current state-of-the-art symbolic and neural approaches, respectively.

Bug fixing • Code Generation
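At a high level, GrACE conditions a code LLM on the developer's prior edits in the same session so the model can infer the intent behind the next edit. The sketch below shows one plausible way such a prompt could be assembled; the helper names and prompt layout are illustrative assumptions, not the paper's exact implementation.

```python
# Illustrative sketch: building a prompt that conditions an LLM on prior,
# associated code edits, in the spirit of GrACE. The prompt layout and
# helper names are assumptions, not the paper's actual method.

def format_edit(before: str, after: str) -> str:
    """Render one prior edit as a before/after pair."""
    return f"--- before ---\n{before}\n--- after ---\n{after}\n"

def build_prompt(prior_edits: list[tuple[str, str]], target_snippet: str) -> str:
    """Prepend the session's prior edits so the model can infer the
    developer's intent before proposing an edit to the target snippet."""
    history = "\n".join(format_edit(b, a) for b, a in prior_edits)
    return (
        "Prior edits in this session:\n"
        f"{history}\n"
        "Apply a consistent edit to the following code:\n"
        f"{target_snippet}\n"
    )

if __name__ == "__main__":
    edits = [("total = price", "total = price * (1 + tax_rate)")]
    print(build_prompt(edits, "subtotal = base_cost"))
```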

KD-Lib: A PyTorch library for Knowledge Distillation, Pruning and Quantization

1 code implementation • 30 Nov 2020 • Het Shah, Avishree Khare, Neelay Shah, Khizir Siddiqui

In recent years, the growing size of neural networks has led to a vast amount of research concerning compression techniques to mitigate the drawbacks of such large sizes.

Knowledge Distillation • Model Compression • +1
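For context, vanilla knowledge distillation (Hinton et al., 2015) — the baseline technique among those KD-Lib covers — trains a small student to match a large teacher's softened output distribution while still fitting the hard labels. Below is a generic PyTorch rendering of that loss as a minimal sketch; it is not KD-Lib's own API.

```python
# Minimal sketch of vanilla knowledge distillation in plain PyTorch;
# a generic rendering of the technique, not KD-Lib's own interface.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Blend the soft teacher-matching loss with the hard-label loss."""
    # Soften both distributions with the temperature, then match them
    # with KL divergence; the T^2 factor keeps the gradient scale stable.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

if __name__ == "__main__":
    s = torch.randn(8, 10)          # student logits for a batch of 8
    t = torch.randn(8, 10)          # teacher logits for the same batch
    y = torch.randint(0, 10, (8,))  # ground-truth class labels
    print(distillation_loss(s, t, y).item())
```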
