no code implementations • 30 May 2023 • Hyun Seung Lee, Seungtaek Choi, Yunsung Lee, Hyeongdon Moon, Shinhyeok Oh, Myeongho Jeong, Hyojun Go, Christian Wallraven
To mitigate these issues, we propose CEAA, a novel retrieval approach that provides effective learning in educational text classification.
no code implementations • 26 May 2023 • Shinhyeok Oh, Hyojun Go, Hyeongdon Moon, Yunsung Lee, Myeongho Jeong, Hyun Seung Lee, Seungtaek Choi
To this end, we propose to paraphrase the reference question for a more robust QG evaluation.
1 code implementation • CVPR 2023 • Hyojun Go, Yunsung Lee, Jin-Young Kim, SeungHyun Lee, Myeongho Jeong, Hyun Seung Lee, Seungtaek Choi
For that, the existing practice is to fine-tune the guidance models on labeled data corrupted with noise.
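The corruption step described above can be sketched as standard forward-diffusion noising of the labeled inputs before each guidance-classifier update. The linear beta schedule, timestep count, and `classifier_update` stub below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear beta schedule (assumption; actual schedules vary).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bars = np.cumprod(1.0 - betas)  # cumulative signal-retention factors

def noise_input(x0, t):
    """Forward-diffusion corruption q(x_t | x_0): scaled clean input plus Gaussian noise."""
    eps = rng.standard_normal(np.asarray(x0).shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

def finetune_step(x0_batch, labels, classifier_update):
    """One fine-tuning step: corrupt the labeled batch at a random timestep,
    then hand the noised batch to the (stubbed) guidance-classifier update."""
    t = int(rng.integers(0, T))
    xt = noise_input(x0_batch, t)
    return classifier_update(xt, labels, t)
```

Here `classifier_update` stands in for whatever gradient step the guidance model takes; the point is only that it sees noised inputs `x_t` rather than clean data.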
no code implementations • 9 Oct 2021 • Hyun Seung Lee, Christian Wallraven
Recent research has found that knowledge distillation can be effective in reducing the size of a network and in increasing generalization.