Search Results for author: Kyusam Oh

Found 2 papers, 0 papers with code

Deep Collective Knowledge Distillation

no code implementations • 18 Apr 2023 • Jihyeon Seo, Kyusam Oh, Chanho Min, Yongkeun Yun, Sungwoo Cho

We propose deep collective knowledge distillation (DCKD) for model compression, a method that trains student models to acquire rich knowledge not only from their teacher model but also from the other student models.

Knowledge Distillation • Model Compression
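
The abstract describes each student distilling from both the teacher and its peer students. Below is a minimal PyTorch sketch of one plausible collective-distillation objective under that description; the temperature T, the weights alpha and beta, and the averaged peer term are illustrative assumptions, not the DCKD loss from the paper.

import torch
import torch.nn.functional as F

def collective_kd_loss(student_logits, teacher_logits, peer_logits_list,
                       labels, T=4.0, alpha=0.5, beta=0.3):
    # Hypothetical sketch: supervised loss + distillation from the teacher
    # + distillation from peer students (all hyperparameters assumed).
    ce = F.cross_entropy(student_logits, labels)
    # Standard temperature-scaled KL distillation toward the teacher.
    kd_teacher = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean") * (T * T)
    # Collective term: average KL toward each peer student's softened output.
    kd_peers = torch.stack([
        F.kl_div(F.log_softmax(student_logits / T, dim=1),
                 F.softmax(p / T, dim=1),
                 reduction="batchmean") * (T * T)
        for p in peer_logits_list]).mean()
    return (1 - alpha - beta) * ce + alpha * kd_teacher + beta * kd_peers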

DUET: Detection Utilizing Enhancement for Text in Scanned or Captured Documents

no code implementations • 10 Jun 2021 • Eun-Soo Jung, HyeongGwan Son, Kyusam Oh, Yongkeun Yun, Soonhwan Kwon, Min Soo Kim

Moreover, ablations are conducted, and the results confirm the effectiveness of the synthetic data, the auxiliary task, and weak supervision.

Multi-Task Learning • Text Detection
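
The title and tags indicate text detection trained jointly with an enhancement auxiliary task. A minimal sketch of a two-headed multi-task loss under that assumption follows; the specific sub-losses (BCE for a text-region map, L1 for the enhanced image) and the weight lambda_enh are hypothetical choices, not the paper's formulation.

import torch.nn as nn

class DetectionWithEnhancementLoss(nn.Module):
    # Hypothetical multi-task objective: a text-detection head supervised
    # with a binary text-region map, plus an image-enhancement auxiliary
    # head; lambda_enh balances the two terms (assumed value).
    def __init__(self, lambda_enh=0.5):
        super().__init__()
        self.det_loss = nn.BCEWithLogitsLoss()  # per-pixel text/non-text map
        self.enh_loss = nn.L1Loss()             # reconstruction of the enhanced image
        self.lambda_enh = lambda_enh

    def forward(self, det_logits, det_target, enh_pred, enh_target):
        return (self.det_loss(det_logits, det_target)
                + self.lambda_enh * self.enh_loss(enh_pred, enh_target))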
