Search Results for author: Ho-young Kim

Found 2 papers, 1 paper with code

Towards Next-Level Post-Training Quantization of Hyper-Scale Transformers

no code implementations · 14 Feb 2024 · Junhan Kim, Kyungphil Park, Chungman Lee, Ho-young Kim, Joonyoung Kim, Yongkweon Jeon

Through extensive experiments on various language models and complexity analysis, we demonstrate that aespa is accurate and efficient in quantizing Transformer models.

Quantization

Genie: Show Me the Data for Quantization

1 code implementation · CVPR 2023 · Yongkweon Jeon, Chungman Lee, Ho-young Kim

We also propose a post-training quantization algorithm to enhance the performance of quantized models.

Data Free Quantization
