Search Results for author: Mingming Yang

Found 7 papers, 5 papers with code

Benchmarking LLMs via Uncertainty Quantification

1 code implementation • 23 Jan 2024 • Fanghua Ye, Mingming Yang, Jianhui Pang, Longyue Wang, Derek F. Wong, Emine Yilmaz, Shuming Shi, Zhaopeng Tu

The proliferation of open-source Large Language Models (LLMs) from various institutions has highlighted the urgent need for comprehensive evaluation methods.

Benchmarking • Uncertainty Quantification
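The benchmark evaluates LLMs on multiple-choice tasks with uncertainty taken into account. As a generic illustration only (not necessarily the measure used in this paper), predictive entropy over the answer options is one simple way to quantify a model's uncertainty; the option-scoring input here is an assumed interface, not the paper's API:

```python
# Generic illustration: score a multiple-choice question by the model's
# predictive entropy over the option letters. `option_logprobs` is an
# assumed input (log-probability of each option), not the paper's API.
import math

def predictive_entropy(option_logprobs):
    """Entropy (in nats) of the normalized distribution over answer options."""
    max_lp = max(option_logprobs)
    exp = [math.exp(lp - max_lp) for lp in option_logprobs]
    total = sum(exp)
    probs = [e / total for e in exp]
    return -sum(p * math.log(p) for p in probs if p > 0)

# A flat distribution (high uncertainty) yields a larger entropy than a peaked one.
print(predictive_entropy([-1.4, -1.4, -1.4, -1.4]))   # ~1.386 (= log 4)
print(predictive_entropy([-0.1, -3.0, -3.5, -4.0]))   # much smaller
```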

Context Consistency between Training and Testing in Simultaneous Machine Translation

1 code implementation • 13 Nov 2023 • Meizhi Zhong, Lemao Liu, Kehai Chen, Mingming Yang, Min Zhang

Simultaneous Machine Translation (SiMT) aims to yield a real-time partial translation with a monotonically growing source-side context.

Machine Translation • Translation
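As a rough illustration of a monotonically growing source-side context in SiMT, the sketch below shows a wait-k-style decoding loop. Wait-k is a standard SiMT policy, not necessarily the training/testing setup studied in this paper, and `translate_prefix` is a hypothetical model call:

```python
# Minimal wait-k-style decoding loop: after k source tokens have been read,
# the model emits one target token for each newly read source token.
# `translate_prefix` is a hypothetical model call (prefix-to-prefix decoding).
def simultaneous_decode(source_tokens, translate_prefix, k=3):
    target = []
    for t in range(len(source_tokens)):
        visible_source = source_tokens[: t + 1]   # monotonically growing context
        if t + 1 >= k:
            next_token = translate_prefix(visible_source, target)
            target.append(next_token)
    return target  # simplified: real decoding continues until an end-of-sentence token
```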

Image super-resolution via dynamic network

1 code implementation • 16 Oct 2023 • Chunwei Tian, Xuanyu Zhang, Qi Zhang, Mingming Yang, Zhaojie Ju

In this paper, we present a dynamic network for image super-resolution (DSRNet), which contains a residual enhancement block, a wide enhancement block, a feature refinement block, and a construction block.

Image Super-Resolution
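The abstract names four blocks; the PyTorch skeleton below wires placeholder versions of them together purely to show the overall structure. The layer choices are assumptions for illustration, not the published DSRNet architecture:

```python
import torch
import torch.nn as nn

class DSRNetSketch(nn.Module):
    """Rough four-block SR skeleton; block internals are placeholders."""
    def __init__(self, channels=64, scale=2):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        # Residual enhancement block: conv stack with a skip connection.
        self.residual_enhancement = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # Wide enhancement block: parallel branches with different kernel sizes.
        self.wide_branch_3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.wide_branch_5 = nn.Conv2d(channels, channels, 5, padding=2)
        # Feature refinement block.
        self.refine = nn.Conv2d(2 * channels, channels, 1)
        # Construction block: upsample and map back to RGB.
        self.construct = nn.Sequential(
            nn.Conv2d(channels, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x):
        feat = self.head(x)
        feat = feat + self.residual_enhancement(feat)
        wide = torch.cat([self.wide_branch_3(feat), self.wide_branch_5(feat)], dim=1)
        feat = self.refine(wide)
        return self.construct(feat)

# sr = DSRNetSketch()(torch.randn(1, 3, 32, 32))  # -> (1, 3, 64, 64)
```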

Hard Exudate Segmentation Supplemented by Super-Resolution with Multi-scale Attention Fusion Module

no code implementations • 17 Nov 2022 • Jiayi Zhang, Xiaoshan Chen, Zhongxi Qiu, Mingming Yang, Yan Hu, Jiang Liu

Specifically, we propose a Multi-scale Attention Fusion (MAF) module for our dual-stream framework to effectively integrate the features of the two tasks.

Boundary Detection • Segmentation +1
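The sketch below shows one plausible reading of a multi-scale attention fusion step: features from the two streams are concatenated, attention maps are computed at several pooled scales, averaged, and used to reweight the fused features. The design details are assumptions rather than the paper's MAF module:

```python
import torch
import torch.nn as nn

class MAFSketch(nn.Module):
    """Assumed multi-scale attention fusion over two feature streams."""
    def __init__(self, channels=64, scales=(1, 2, 4)):
        super().__init__()
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.scales = scales
        self.attn = nn.Sequential(
            nn.Conv2d(channels, channels // 4, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, 1), nn.Sigmoid(),
        )

    def forward(self, seg_feat, sr_feat):
        fused = self.fuse(torch.cat([seg_feat, sr_feat], dim=1))
        h, w = fused.shape[-2:]
        attn_maps = []
        for s in self.scales:
            pooled = nn.functional.adaptive_avg_pool2d(fused, (s, s))
            attn_maps.append(nn.functional.interpolate(self.attn(pooled), size=(h, w)))
        attention = torch.stack(attn_maps).mean(dim=0)
        return fused * attention
```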

Dual-path CNN with Max Gated block for Text-Based Person Re-identification

1 code implementation • 20 Sep 2020 • Tinghuai Ma, Mingming Yang, Huan Rong, Yurong Qian, Yuan Tian, Najla Al-Nabhan

With that in mind, a novel Dual-path CNN with Max Gated block (DCMG) is proposed to extract discriminative word embeddings and make the visual-textual association focus more on the salient features of both modalities.

Language Modelling • Person Re-Identification +1
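One possible reading of a "max gated" block, gating features with a signal derived from a max-pooled summary so that salient channels are emphasized, is sketched below. The design is an assumption for illustration, not the published DCMG block:

```python
import torch
import torch.nn as nn

class MaxGatedBlockSketch(nn.Module):
    """Assumed gating: a learned gate from a max-pooled summary reweights features."""
    def __init__(self, dim=512):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, features):
        # features: (batch, seq_len, dim) token or region features from one path.
        summary, _ = features.max(dim=1)              # max-pooled salient features
        return features * self.gate(summary).unsqueeze(1)

# Gated text and image features can then be matched, e.g. via cosine similarity
# of their pooled representations.
```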

Sentence-Level Agreement for Neural Machine Translation

no code implementations • ACL 2019 • Mingming Yang, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Min Zhang, Tiejun Zhao

The training objective of neural machine translation (NMT) is to minimize the loss between the words in the translated sentences and those in the references.

Machine Translation • NMT +2
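A common way to express such an objective is to add a sentence-level agreement term to the usual word-level cross-entropy. The sketch below uses mean-pooled sentence representations and cosine distance, which are assumed choices for illustration rather than the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def sentence_agreement_loss(src_embeddings, tgt_embeddings):
    """Distance between mean-pooled source and target sentence representations.
    Mean pooling and cosine distance are assumed choices for illustration."""
    src_sent = src_embeddings.mean(dim=1)   # (batch, dim)
    tgt_sent = tgt_embeddings.mean(dim=1)   # (batch, dim)
    return (1.0 - F.cosine_similarity(src_sent, tgt_sent, dim=-1)).mean()

def total_loss(logits, targets, src_embeddings, tgt_embeddings, alpha=1.0):
    # Standard word-level cross-entropy plus the sentence-level agreement term.
    ce = F.cross_entropy(logits.view(-1, logits.size(-1)), targets.view(-1))
    return ce + alpha * sentence_agreement_loss(src_embeddings, tgt_embeddings)
```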
