no code implementations • 1 May 2024 • Donghee Choi, Mogan Gim, Donghyeon Park, Mujeen Sung, Hyunjae Kim, Jaewoo Kang, Jihun Choi
This paper introduces CookingSense, a descriptive collection of knowledge assertions in the culinary domain extracted from various sources, including web data, scientific papers, and recipes, covering a broad range of culinary aspects.
no code implementations • 21 Feb 2024 • Seong Hoon Lim, Taejun Yun, Jinhyeon Kim, Jihun Choi, Taeuk Kim
The successful adaptation of multilingual language models (LMs) to a specific language-task pair critically depends on the availability of data tailored for that condition.
1 code implementation • 14 Oct 2022 • Mogan Gim, Donghee Choi, Kana Maruyama, Jihun Choi, Hajung Kim, Donghyeon Park, Jaewoo Kang
To perform this task, we developed RecipeMind, a food affinity score prediction model that quantifies the suitability of adding an ingredient to a set of other ingredients.
1 code implementation • ICLR 2020 • Taeuk Kim, Jihun Choi, Daniel Edmiston, Sang-goo Lee
With the recent success and popularity of pre-trained language models (LMs) in natural language processing, there has been a rise in efforts to understand their inner workings.
no code implementations • ACL 2019 • Jihun Choi, Taeuk Kim, Sang-goo Lee
We present a latent variable model for predicting the relationship between a pair of text sequences.
1 code implementation • SEMEVAL 2019 • Sanghwan Bae, Jihun Choi, Sang-goo Lee
We present several techniques to tackle the mismatch in class distributions between training and test data in the Contextual Emotion Detection task of SemEval 2019, by extending existing methods for the class imbalance problem.
Ranked #6 on Emotion Recognition in Conversation on EC
2 code implementations • 7 Sep 2018 • Taeuk Kim, Jihun Choi, Daniel Edmiston, Sanghwan Bae, Sang-goo Lee
Most existing recursive neural network (RvNN) architectures utilize only the structure of parse trees, ignoring syntactic tags which are provided as by-products of parsing.
no code implementations • 7 Sep 2018 • Jihun Choi, Taeuk Kim, Sang-goo Lee
We propose a method of stacking multiple long short-term memory (LSTM) layers for modeling sentences.
Ranked #10 on Sentiment Analysis on SST-5 Fine-grained classification
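The entry above describes stacking LSTM layers for sentence modeling. As a rough illustration of what plain stacking means (not the paper's specific cell-aware variant), the sketch below runs each layer over the hidden-state sequence produced by the layer beneath it; all weights are randomly initialized purely for demonstration.

```python
import numpy as np

def lstm_layer(inputs, d_hidden, rng):
    """Run one LSTM layer over a sequence of input vectors.
    Weights are randomly initialised here purely for illustration.
    Returns the sequence of hidden states, one per time step."""
    d_in = inputs.shape[1]
    # One weight set per gate: input (i), forget (f), output (o), candidate (g).
    W = rng.standard_normal((4, d_hidden, d_in)) * 0.1
    U = rng.standard_normal((4, d_hidden, d_hidden)) * 0.1
    b = np.zeros((4, d_hidden))

    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    h = np.zeros(d_hidden)
    c = np.zeros(d_hidden)
    states = []
    for x in inputs:
        i = sigmoid(W[0] @ x + U[0] @ h + b[0])  # input gate
        f = sigmoid(W[1] @ x + U[1] @ h + b[1])  # forget gate
        o = sigmoid(W[2] @ x + U[2] @ h + b[2])  # output gate
        g = np.tanh(W[3] @ x + U[3] @ h + b[3])  # candidate cell update
        c = f * c + i * g
        h = o * np.tanh(c)
        states.append(h)
    return np.stack(states)

def stacked_lstm(inputs, d_hidden, num_layers, rng):
    """Stack LSTM layers: each layer consumes the hidden-state
    sequence produced by the layer below it."""
    seq = inputs
    for _ in range(num_layers):
        seq = lstm_layer(seq, d_hidden, rng)
    return seq  # hidden states of the top layer

rng = np.random.default_rng(0)
sentence = rng.standard_normal((7, 16))  # 7 tokens, 16-dim embeddings
top = stacked_lstm(sentence, d_hidden=32, num_layers=3, rng=rng)
print(top.shape)  # (7, 32)
```

In practice the top layer's final (or pooled) hidden state would serve as the sentence representation.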
no code implementations • SEMEVAL 2018 • Jihun Choi, Taeuk Kim, Sang-goo Lee
When we build a neural network model predicting the relationship between two sentences, the most general and intuitive approach is to use a Siamese architecture, where the sentence vectors obtained from a shared encoder are given as input to a classifier.
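The Siamese setup described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the shared encoder is just mean pooling of word vectors, and the pair features follow the common heuristic of concatenating the two sentence vectors with their element-wise difference and product.

```python
import numpy as np

def encode(tokens, embeddings):
    """Shared encoder: both sentences pass through the same function
    (here simply mean pooling of word vectors, for illustration)."""
    return np.mean([embeddings[t] for t in tokens], axis=0)

def match_features(u, v):
    """Heuristic matching features commonly fed to a pair classifier."""
    return np.concatenate([u, v, np.abs(u - v), u * v])

rng = np.random.default_rng(0)
vocab = ["a", "cat", "sat", "dog", "ran"]
embeddings = {w: rng.standard_normal(8) for w in vocab}

u = encode(["a", "cat", "sat"], embeddings)
v = encode(["a", "dog", "ran"], embeddings)
features = match_features(u, v)
print(features.shape)  # (32,)
```

The `features` vector would then feed a small classifier (e.g. an MLP) that predicts the relation label for the sentence pair.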
1 code implementation • SEMEVAL 2018 • Taeuk Kim, Jihun Choi, Sang-goo Lee
We present a novel neural architecture for the Argument Reasoning Comprehension task of SemEval 2018.
1 code implementation • 10 Jul 2017 • Jihun Choi, Kang Min Yoo, Sang-goo Lee
For years, recursive neural networks (RvNNs) have been shown to be suitable for encoding text into fixed-length vectors and have achieved good performance on several natural language processing tasks.
Ranked #62 on Natural Language Inference on SNLI
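The RvNN idea mentioned above — composing a parse tree bottom-up into a single fixed-length vector — can be sketched as follows. This is a generic tree-RNN illustration with a hypothetical `tanh` composition function and random weights, not the paper's learned-structure model.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16)) * 0.1  # composition weights (illustrative only)

def compose(left, right):
    """Combine two child vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]))

def encode_tree(node, embeddings):
    """Recursively encode a binary parse tree into a fixed-length vector.
    A node is either a word (str) or a (left, right) pair of subtrees."""
    if isinstance(node, str):
        return embeddings[node]
    left, right = node
    return compose(encode_tree(left, embeddings),
                   encode_tree(right, embeddings))

embeddings = {w: rng.standard_normal(8) for w in ["the", "cat", "sat"]}
tree = (("the", "cat"), "sat")  # ((the cat) sat)
vec = encode_tree(tree, embeddings)
print(vec.shape)  # (8,)
```

Whatever the tree shape, the output is a fixed-length vector, which is what makes RvNNs convenient for downstream classifiers such as those used for SNLI.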