Search Results for author: Hyung Yong Kim

Found 3 papers, 0 papers with code

Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models

no code implementations • 5 Nov 2021 • Ji Won Yoon, Hyung Yong Kim, Hyeonseung Lee, Sunghwan Ahn, Nam Soo Kim

Extending this supervised scheme further, we introduce a new type of teacher model for connectionist temporal classification (CTC)-based sequence models, namely the Oracle Teacher, which leverages both the source inputs and the output labels as inputs to the teacher model (a conceptual sketch follows this entry).

Knowledge Distillation • Machine Translation • +5
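The abstract above describes a teacher that conditions on both the source sequence and the target labels. Below is a minimal conceptual sketch of that idea in PyTorch; it is not the authors' architecture, and the module names, dimensions, and the concatenation scheme are illustrative assumptions.

```python
# Conceptual sketch only (not the Oracle Teacher implementation): a teacher that
# receives both source features and target labels when producing soft targets.
import torch
import torch.nn as nn

class OracleTeacherSketch(nn.Module):
    def __init__(self, feat_dim=80, vocab_size=1000, d_model=256):
        super().__init__()
        self.source_proj = nn.Linear(feat_dim, d_model)        # encode source features
        self.label_embed = nn.Embedding(vocab_size, d_model)   # encode target labels
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.output = nn.Linear(d_model, vocab_size)

    def forward(self, source_feats, labels):
        # Concatenate source and label representations along the time axis so the
        # teacher can exploit target information when forming its predictions.
        src = self.source_proj(source_feats)       # (B, T_src, d_model)
        lab = self.label_embed(labels)             # (B, T_lab, d_model)
        hidden = self.encoder(torch.cat([src, lab], dim=1))
        # Keep only the frames aligned with the source for frame-level distillation.
        return self.output(hidden[:, : src.size(1)])
```

The point of the sketch is the forward signature: unlike a conventional teacher, which sees only `source_feats`, this teacher also receives `labels`, so the soft targets it passes to a CTC student can carry target information.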

TutorNet: Towards Flexible Knowledge Distillation for End-to-End Speech Recognition

no code implementations • 3 Aug 2020 • Ji Won Yoon, Hyeonseung Lee, Hyung Yong Kim, Won Ik Cho, Nam Soo Kim

To reduce this computational burden, knowledge distillation (KD), a popular model compression method, has been used to transfer knowledge from a deep and complex model (teacher) to a shallower and simpler model (student); a generic sketch of this transfer follows the entry below.

Knowledge Distillation • Model Compression • +3
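For reference, the teacher-to-student transfer described in this abstract is typically realized as a temperature-softened distribution-matching loss. The sketch below shows the standard KD loss of Hinton et al.; it is a generic illustration, not the TutorNet objective, and the temperature value is an arbitrary assumption.

```python
# Generic knowledge-distillation loss (Hinton-style), not the TutorNet method:
# match the student's softened output distribution to the teacher's.
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)
```

In practice this term is usually combined with the student's own supervised loss (e.g. CTC or cross-entropy) via a weighted sum.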
