Search Results for author: Minseok Choi

Found 13 papers, 3 papers with code

PairEval: Open-domain Dialogue Evaluation with Pairwise Comparison

no code implementations • 1 Apr 2024 • ChaeHun Park, Minseok Choi, Dohyun Lee, Jaegul Choo

Recent studies proposed evaluation metrics that assess generated responses by considering their relevance to previous dialogue histories.

Dialogue Evaluation

SimCKP: Simple Contrastive Learning of Keyphrase Representations

1 code implementation • 12 Oct 2023 • Minseok Choi, Chaeheon Gwak, SeHo Kim, Si Hyeong Kim, Jaegul Choo

Keyphrase generation (KG) aims to generate a set of summarizing words or phrases given a source document, while keyphrase extraction (KE) aims to identify them from the text.

Contrastive Learning Keyphrase Extraction +1
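A minimal sketch of what contrastive learning of keyphrase representations can look like, given the KG/KE setup described above: gold keyphrase embeddings are scored against the document embedding and pushed above non-gold candidates. This is an illustrative assumption, not the authors' implementation; the function name, scoring scheme, and dimensions are hypothetical.

```python
import torch
import torch.nn.functional as F

def contrastive_keyphrase_loss(doc_emb, phrase_embs, is_gold, temperature=0.1):
    """InfoNCE-style loss: gold keyphrase embeddings should score higher
    against the document embedding than non-gold candidate phrases.

    doc_emb:     (d,)   document representation
    phrase_embs: (n, d) candidate phrase representations
    is_gold:     (n,)   boolean mask marking gold keyphrases
    """
    doc_emb = F.normalize(doc_emb, dim=-1)
    phrase_embs = F.normalize(phrase_embs, dim=-1)
    scores = phrase_embs @ doc_emb / temperature   # cosine similarity per candidate
    log_probs = F.log_softmax(scores, dim=0)       # distribution over all candidates
    return -log_probs[is_gold].mean()              # raise probability of gold phrases

# Illustrative usage with random tensors
doc = torch.randn(768)
candidates = torch.randn(10, 768)
gold = torch.zeros(10, dtype=torch.bool)
gold[:3] = True
loss = contrastive_keyphrase_loss(doc, candidates, gold)
```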

Two Tales of Platoon Intelligence for Autonomous Mobility Control: Enabling Deep Learning Recipes

no code implementations • 19 Jul 2023 • Soohyun Park, Haemin Lee, Chanyoung Park, Soyi Jung, Minseok Choi, Joongheon Kim

This paper presents recent deep learning-based achievements for resolving the problems of autonomous mobility control and efficient resource management of autonomous vehicles and UAVs, i.e., (i) multi-agent reinforcement learning (MARL) and (ii) the neural Myerson auction.

Autonomous Vehicles Management +1

HistRED: A Historical Document-Level Relation Extraction Dataset

1 code implementation • 10 Jul 2023 • Soyoung Yang, Minseok Choi, Youngwoo Cho, Jaegul Choo

To demonstrate the usefulness of our dataset, we propose a bilingual RE model that leverages both Korean and Hanja contexts to predict relations between entities.

Document-level Relation Extraction Relation +1

SplitGP: Achieving Both Generalization and Personalization in Federated Learning

no code implementations • 16 Dec 2022 • Dong-Jun Han, Do-Yeon Kim, Minseok Choi, Christopher G. Brinton, Jaekyun Moon

A fundamental challenge to providing edge-AI services is the need for a machine learning (ML) model that achieves personalization (i.e., to individual clients) and generalization (i.e., to unseen data) properties concurrently.

Federated Learning

Bayesian deep learning framework for uncertainty quantification in high dimensions

no code implementations • 21 Oct 2022 • Jeahan Jung, Minseok Choi

A BNN efficiently learns the posterior distribution of the parameters in deep neural networks by performing Bayesian inference on the network parameters.

Bayesian Inference Uncertainty Quantification +1
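As a rough illustration of what performing Bayesian inference on network parameters can mean in practice, here is a mean-field variational sketch (Bayes-by-Backprop style) for a single linear layer. The paper's architecture, prior, and training procedure are not reproduced; every name and hyperparameter below is an assumption for exposition only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer with a factorized Gaussian posterior over its weights."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_dim, in_dim))
        self.w_rho = nn.Parameter(torch.full((out_dim, in_dim), -3.0))  # softplus(rho) = std

    def forward(self, x):
        std = F.softplus(self.w_rho)
        w = self.w_mu + std * torch.randn_like(std)   # reparameterized weight sample
        return x @ w.t()

    def kl_to_standard_normal(self):
        # Closed-form KL(q(w) || N(0, I)) for a factorized Gaussian posterior
        std = F.softplus(self.w_rho)
        return 0.5 * (std.pow(2) + self.w_mu.pow(2) - 1.0 - 2.0 * std.log()).sum()

# ELBO-style objective: data-fit term plus KL regularizer toward the prior
layer = BayesianLinear(4, 1)
x, y = torch.randn(32, 4), torch.randn(32, 1)
loss = F.mse_loss(layer(x), y) + 1e-3 * layer.kl_to_standard_normal()
loss.backward()
```

Averaging predictions over several weight samples at test time is what turns this posterior into an uncertainty estimate.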

Search Space Adaptation for Differentiable Neural Architecture Search in Image Classification

no code implementations • 5 Jun 2022 • Youngkee Kim, Soyi Jung, Minseok Choi, Joongheon Kim

As deep neural networks achieve unprecedented performance in various tasks, research on neural architecture search (NAS), which designs neural network architectures through automated processes, is actively underway.

Image Classification Neural Architecture Search

Quantum Distributed Deep Learning Architectures: Models, Discussions, and Applications

no code implementations • 19 Feb 2022 • Yunseok Kwak, Won Joon Yun, Jae Pyoung Kim, Hyunhee Cho, Minseok Choi, Soyi Jung, Joongheon Kim

Although deep learning (DL) has already become a state-of-the-art technology for various data processing tasks, data security and computational overload problems often arise due to its heavy dependence on data and computational power.

Sageflow: Robust Federated Learning against Both Stragglers and Adversaries

no code implementations • NeurIPS 2021 • Jungwuk Park, Dong-Jun Han, Minseok Choi, Jaekyun Moon

While federated learning (FL) allows efficient model training with local data at edge devices, among the major issues still to be resolved are slow devices, known as stragglers, and malicious attacks launched by adversaries.

Federated Learning

Joint Mobile Charging and Coverage-Time Extension for Unmanned Aerial Vehicles

no code implementations • 27 Jun 2021 • Soohyun Park, Won-Yong Shin, Minseok Choi, Joongheon Kim

To overcome this, we need to characterize a new type of drone, so-called charging drones, which can deliver energy to MBS drones.

Scheduling

FedMes: Speeding Up Federated Learning with Multiple Edge Servers

no code implementations • 1 Jan 2021 • Dong-Jun Han, Minseok Choi, Jungwuk Park, Jaekyun Moon

Our key idea is to utilize the devices located in the overlapping areas between the coverage of edge servers; in the model-downloading stage, the devices in the overlapping areas receive multiple models from different edge servers, take the average of the received models, and then update the model with their local data.

Federated Learning
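The averaging step described in the FedMes excerpt above is simple enough to sketch. The snippet below shows one plausible reading (a parameter-wise mean of the models received from overlapping edge servers, followed by local SGD); it is not the paper's code, and the function names and hyperparameters are assumptions.

```python
import copy
import torch

def average_state_dicts(state_dicts):
    """Parameter-wise mean of the models received from multiple edge servers."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg

def local_update(model, received_state_dicts, data_loader, lr=0.01, epochs=1):
    """Device in an overlapping coverage area: average the received models,
    then update the averaged model with local data."""
    model.load_state_dict(average_state_dicts(received_state_dicts))
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()
```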

Sself: Robust Federated Learning against Stragglers and Adversaries

no code implementations • 1 Jan 2021 • Jungwuk Park, Dong-Jun Han, Minseok Choi, Jaekyun Moon

While federated learning allows efficient model training with local data at edge devices, two major issues that need to be resolved are slow devices, known as stragglers, and malicious attacks launched by adversaries.

Data Poisoning Federated Learning
