no code implementations • RepL4NLP (ACL) 2022 • Changwook Jun, Hansol Jang, Myoseop Sim, Hyun Kim, Jooyoung Choi, Kyungkoo Min, Kyunghoon Bae
Pre-trained language models have brought significant performance improvements across a variety of natural language processing tasks.
no code implementations • 22 May 2023 • Mengxi Liu, Bo Zhou, Zimin Zhao, Hyeonseok Hong, Hyun Kim, Sungho Suh, Vitor Fortes Rey, Paul Lukowicz
In this work, we propose FieldHAR, an open-source, scalable, end-to-end RTL framework for complex human activity recognition (HAR) from heterogeneous sensors using artificial neural networks (ANNs) optimized for FPGA or ASIC integration.
1 code implementation • LREC 2022 • Changwook Jun, Jooyoung Choi, Myoseop Sim, Hyun Kim, Hansol Jang, Kyungkoo Min
We then build a pre-trained language model based on the Transformer and fine-tune it for table question answering with these datasets.
no code implementations • 3 Sep 2020 • Duy Thanh Nguyen, Hyun Kim, Hyuk-Jae Lee
The proposed design employs two layer-specific optimizations: mixed data flow and mixed precision, each tailored to the individual layer.
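A minimal sketch of the layer-specific mixed-precision idea, assuming a simple uniform symmetric quantizer and an illustrative per-layer bit-width table (the layer names and bit-widths here are hypothetical, not taken from the paper):

```python
import numpy as np

def quantize(x, bits):
    """Uniform symmetric quantization of an array to the given bit-width."""
    qmax = 2 ** (bits - 1) - 1
    peak = np.max(np.abs(x))
    scale = peak / qmax if peak > 0 else 1.0
    return np.round(x / scale) * scale

# Hypothetical per-layer precision table: some layers keep more bits,
# others tolerate coarser quantization (illustrative values only).
layer_bits = {"conv1": 8, "conv2": 6, "fc": 4}

rng = np.random.default_rng(0)
weights = {name: rng.normal(size=(4, 4)) for name in layer_bits}
quantized = {name: quantize(w, layer_bits[name]) for name, w in weights.items()}
```

Each layer's weights are rounded onto a grid whose resolution is set by that layer's bit-width, so accuracy-sensitive layers can spend more bits than robust ones.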
no code implementations • WS 2019 • Hyun Kim, Joon-Ho Lim, Hyun-Ki Kim, Seung-Hoon Na
Our proposed model re-purposes BERT for translation quality estimation and uses multi-task learning for the sentence-level task and the word-level subtasks (i.e., source word, target word, and target gap).
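The multi-task setup above can be sketched as a shared encoder feeding one sentence-level regression head and one per-token tagging head. This is a toy NumPy stand-in (a single linear layer in place of BERT, random weights), not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_encoder(token_embs, W):
    """Stand-in for the shared BERT encoder: one linear layer + tanh."""
    return np.tanh(token_embs @ W)

def sentence_score(hidden, w_sent):
    """Sentence-level QE head: regress a quality score from the pooled states."""
    return float(np.mean(hidden, axis=0) @ w_sent)

def word_tags(hidden, W_word):
    """Word-level QE head: OK/BAD decision per token (the same pattern would
    apply to the source-word, target-word, and target-gap subtasks)."""
    logits = hidden @ W_word        # (tokens, 2)
    return logits.argmax(axis=1)

d, h = 16, 8
tokens = rng.normal(size=(5, d))    # 5 token embeddings
hidden = shared_encoder(tokens, rng.normal(size=(d, h)))  # shared by all heads
score = sentence_score(hidden, rng.normal(size=h))
tags = word_tags(hidden, rng.normal(size=(h, 2)))
```

The point of the shared encoder is that gradients from every task update the same representation, which is what "multi-task learning" buys here.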
4 code implementations • ICCV 2019 • Jiwoong Choi, Dayoung Chun, Hyun Kim, Hyuk-Jae Lee
Therefore, a detection algorithm that can cope with mislocalizations is required in autonomous driving applications.
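One way a detector can cope with mislocalizations is to predict a localization uncertainty per box coordinate and fold it into the detection score. The combination below is a simple illustrative choice, not the paper's exact formulation:

```python
def rescore(confidence, box_sigmas):
    """Down-weight a detection's confidence by its average predicted
    localization uncertainty (one sigma per box coordinate).
    Illustrative rescoring rule, assumed for this sketch."""
    avg_certainty = 1.0 - sum(box_sigmas) / len(box_sigmas)
    return confidence * avg_certainty

# A confidently localized box keeps most of its score...
well_localized = rescore(0.9, [0.05, 0.05, 0.1, 0.1])   # -> 0.8325
# ...while a box with large coordinate uncertainty is suppressed.
poorly_localized = rescore(0.9, [0.4, 0.5, 0.4, 0.5])
```

Rescoring this way lets downstream non-maximum suppression prefer boxes that are both confident and well localized, which matters in safety-critical settings like autonomous driving.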