Search Results for author: Hyopil Shin

Found 18 papers, 1 paper with code

Generating Slogans with Linguistic Features using Sequence-to-Sequence Transformer

no code implementations ICON 2021 Yeoun Yi, Hyopil Shin

We present LexPOS, a sequence-to-sequence transformer model that generates slogans given phonetic and structural information.

Tasks: POS

KIT-19: A Comprehensive Korean Instruction Toolkit on 19 Tasks for Fine-Tuning Korean Large Language Models

no code implementations 25 Mar 2024 Dongjun Jang, Sungjoo Byun, Hyemi Jo, Hyopil Shin

Based on its quality and empirical results, this paper proposes that KIT-19 has the potential to make a substantial contribution to the future improvement of Korean LLMs' performance.

A Study on How Attention Scores in the BERT Model are Aware of Lexical Categories in Syntactic and Semantic Tasks on the GLUE Benchmark

no code implementations 25 Mar 2024 Dongjun Jang, Sungjoo Byun, Hyopil Shin

This study examines whether the attention scores between tokens in the BERT model significantly vary based on lexical categories during the fine-tuning process for downstream tasks.
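The kind of comparison this abstract describes can be illustrated with a toy example: given an attention matrix and a lexical-category label per token, average the attention flowing between token groups. This is a minimal sketch with hypothetical values, not the paper's actual experimental setup; the grouping into content vs. function words is an assumption for illustration.

```python
import numpy as np

# Toy attention matrix for 4 tokens (row i holds token i's attention over all tokens).
# Values are hypothetical, standing in for one head of a fine-tuned BERT model.
attn = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.1, 0.5, 0.2, 0.2],
    [0.3, 0.2, 0.4, 0.1],
    [0.2, 0.2, 0.2, 0.4],
])

# Lexical category per token: True = content word, False = function word (assumed labels).
is_content = np.array([True, False, True, False])

def mean_attention(attn, from_mask, to_mask):
    # Average attention flowing from one category of tokens to another.
    return float(attn[np.ix_(from_mask, to_mask)].mean())

content_to_content = mean_attention(attn, is_content, is_content)
function_to_content = mean_attention(attn, ~is_content, is_content)
```

Comparing such averages across fine-tuning checkpoints and tasks is one straightforward way to test whether attention distributions vary systematically by lexical category.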

CARBD-Ko: A Contextually Annotated Review Benchmark Dataset for Aspect-Level Sentiment Classification in Korean

no code implementations 23 Feb 2024 Dongjun Jang, Jean Seo, Sungjoo Byun, Taekyoung Kim, Minseok Kim, Hyopil Shin

In order to tackle these challenges, we introduce CARBD-Ko (a Contextually Annotated Review Benchmark Dataset for Aspect-Based Sentiment Classification in Korean), a benchmark dataset that incorporates aspects and dual-tagged polarities to distinguish between aspect-specific and aspect-agnostic sentiment classification.

Tasks: Classification, Hallucination, +2

DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP

no code implementations 23 Nov 2023 Dongjun Jang, Sangah Lee, Sungjoo Byun, Jinwoong Kim, Jean Seo, Minseok Kim, Soyeon Kim, Chaeyoung Oh, Jaeyoon Kim, Hyemi Jo, Hyopil Shin

This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned through Instruction Tuning across 41 tasks within 13 distinct categories.

Tasks: Language Modelling, Large Language Model

KR-BERT: A Small-Scale Korean-Specific Language Model

1 code implementation 10 Aug 2020 Sangah Lee, Hansol Jang, Yunmee Baik, Suzi Park, Hyopil Shin

Since the appearance of BERT, recent works including XLNet and RoBERTa utilize sentence embedding models pre-trained on large corpora with a large number of parameters.

Tasks: Language Modelling, Sentence, +2

A New Approach for Measuring Sentiment Orientation based on Multi-Dimensional Vector Space

no code implementations 31 Dec 2017 Youngsam Kim, Hyopil Shin

This study implements a vector space model approach to measure the sentiment orientations of words.
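A common way to score sentiment orientation in a vector space is to compare a word's embedding against positive and negative seed words by cosine similarity. The sketch below is a hypothetical illustration of that general idea with toy vectors, not a reconstruction of this paper's multi-dimensional method.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def sentiment_orientation(word_vec, pos_seeds, neg_seeds):
    # Orientation = mean similarity to positive seeds minus mean similarity to negative seeds.
    pos = np.mean([cosine(word_vec, s) for s in pos_seeds])
    neg = np.mean([cosine(word_vec, s) for s in neg_seeds])
    return float(pos - neg)

# Toy 3-d vectors standing in for learned word embeddings (hypothetical data).
good = np.array([1.0, 0.2, 0.0])
bad = np.array([-1.0, 0.1, 0.0])
happy = np.array([0.9, 0.3, 0.1])

score = sentiment_orientation(happy, [good], [bad])
```

A positive score indicates the word sits closer to the positive seeds in the embedding space; a negative score, closer to the negative ones.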
