Search Results for author: Zhao Jun

Found 9 papers, 0 papers with code

A Robustly Optimized BERT Pre-training Approach with Post-training

no code implementations CCL 2021 Liu Zhuang, Lin Wayne, Shi Ya, Zhao Jun

In this paper, we present a ‘pre-training’ + ‘post-training’ + ‘fine-tuning’ three-stage paradigm, which is a supplementary framework to the standard ‘pre-training’ + ‘fine-tuning’ language-model approach.

Extractive Question-Answering, Question Answering
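The ‘post-training’ stage described in this entry is usually realized as continued masked-language-model training on in-domain text between generic pre-training and task fine-tuning. Below is a minimal sketch of such a three-stage pipeline with Hugging Face transformers; the checkpoint name, corpus file, and hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of a "pre-training -> post-training -> fine-tuning" pipeline.
# Stage 1 (pre-training) is taken as-is from a released checkpoint; stage 2
# (post-training) continues masked-LM training on unlabeled in-domain text;
# stage 3 would fine-tune the resulting checkpoint on the downstream task.
# Model name, data path, and hyperparameters are illustrative assumptions.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

model_name = "roberta-base"                       # stage 1: generic pre-trained checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Stage 2: "post-training" = continued MLM on in-domain text (hypothetical file).
domain_corpus = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = domain_corpus["train"].map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="post_trained", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# Stage 3 would load "post_trained" and fine-tune it on the target task.
model.save_pretrained("post_trained")
tokenizer.save_pretrained("post_trained")
```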

An Exploration of Prompt-Based Zero-Shot Relation Extraction Method

no code implementations CCL 2022 Zhao Jun, Hu Yuan, Xu Nuo, Gui Tao, Zhang Qi, Chen Yunwen, Gao Xiang

In addition, very few relation descriptions are exposed to the model during training, which we argue is the performance bottleneck of two-tower methods.

Language Modelling, Relation +1
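As context for the two-tower setting mentioned above, the sketch below shows a generic zero-shot relation extraction baseline: one encoder embeds the sentence with its entity pair, another embeds natural-language relation descriptions, and the most similar description is predicted. This is an illustration of the setting, not the paper's method; the encoder and the relation descriptions are assumptions.

```python
# Generic "two-tower" zero-shot relation extraction sketch: score a sentence
# (with its entity pair) against natural-language relation descriptions and
# predict the highest-similarity relation. Encoder and descriptions are
# illustrative assumptions, not the paper's setup.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")

sentence = "Marie Curie was born in Warsaw."
head, tail = "Marie Curie", "Warsaw"
query = f"{sentence} The relation between {head} and {tail} is:"

relation_descriptions = {
    "place_of_birth": "the place where a person was born",
    "employer": "the organization a person works for",
    "spouse": "the person someone is married to",
}

query_emb = encoder.encode(query, convert_to_tensor=True)
desc_emb = encoder.encode(list(relation_descriptions.values()), convert_to_tensor=True)
scores = util.cos_sim(query_emb, desc_emb)[0]           # similarity to each description
predicted = list(relation_descriptions)[int(scores.argmax())]
print(predicted)                                        # expected: place_of_birth
```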

Topic Knowledge Acquisition and Utilization for Machine Reading Comprehension in Social Media Domain

no code implementations CCL 2021 Tian Zhixing, Zhang Yuanzhe, Liu Kang, Zhao Jun

Having realized this, we propose a novel method that utilizes the topic knowledge implied by the clustered messages to aid in the comprehension of those short messages.

Clustering, Machine Reading Comprehension
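A rough way to picture the idea: cluster the short messages, treat each cluster's top terms as topic knowledge, and prepend them to a message before running extractive reading comprehension. The sketch below does this with TF-IDF/KMeans and an off-the-shelf QA pipeline; the messages, cluster count, and model are illustrative assumptions, not the authors' setup.

```python
# Sketch: derive "topic knowledge" from clustered short messages and prepend it
# to the context of an extractive QA model. Generic illustration only; data,
# cluster count, and QA model are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np
from transformers import pipeline

messages = [
    "Team wins the cup after penalty shootout!",
    "What a match, the keeper saved three penalties.",
    "New phone battery drains in half a day.",
    "Battery life on this phone is terrible after the update.",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(messages)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = np.array(vectorizer.get_feature_names_out())

def topic_terms(cluster_id, top_k=5):
    # Highest-weight TF-IDF terms of the cluster centroid act as topic knowledge.
    centroid = kmeans.cluster_centers_[cluster_id]
    return " ".join(terms[np.argsort(centroid)[::-1][:top_k]])

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
msg_id = 1
context = f"Topic: {topic_terms(kmeans.labels_[msg_id])}. {messages[msg_id]}"
print(qa(question="How many penalties were saved?", context=context))
```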

Intelligent reflecting surface aided wireless networks - Harris Hawks optimization for beamforming design

no code implementations 5 Oct 2020 Xu Huaqiang, Zhang Guodong, Zhao Jun, Quoc-Viet Pham

Intelligent Reflecting Surface (IRS) is envisioned to be a promising green and cost-effective solution to enhance wireless network performance by smartly reconfiguring the signal propagation.

Networking and Internet Architecture
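In this setting, the typical design problem is to pick the IRS phase shifts that maximize the received signal power of the combined direct and reflected channels. The toy sketch below sets up that objective with synthetic channels and optimizes it with a plain random-search loop as a stand-in for Harris Hawks optimization; the channel model and dimensions are assumptions made for illustration.

```python
# Toy IRS phase-shift design: maximize received power
# |h_d + g^H diag(exp(j*theta)) h_r|^2 over the phase vector theta.
# Synthetic Rayleigh channels; a plain random-search loop is used as a
# stand-in for Harris Hawks optimization, purely to illustrate the objective.
import numpy as np

rng = np.random.default_rng(0)
N = 32                                            # number of IRS elements

# Direct link, BS->IRS, and IRS->user channels (complex Gaussian).
h_d = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
h_r = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
g = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def received_power(theta):
    # Effective channel = direct path + IRS-reflected path with phase shifts theta.
    reflected = np.sum(np.conj(g) * np.exp(1j * theta) * h_r)
    return np.abs(h_d + reflected) ** 2

best_theta = rng.uniform(0, 2 * np.pi, N)
best_power = received_power(best_theta)
for _ in range(5000):                             # random-search stand-in for HHO
    candidate = (best_theta + 0.3 * rng.standard_normal(N)) % (2 * np.pi)
    p = received_power(candidate)
    if p > best_power:
        best_theta, best_power = candidate, p

print(f"optimized received power: {best_power:.3f}")
```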

End-to-End Neural Ranking for eCommerce Product Search: an application of task models and textual embeddings

no code implementations 19 Jun 2018 Brenner Eliot, Zhao Jun, Kutiyanawala Aliasgar, Yan Zheng

The different types of relevance models developed for IR have complementary advantages and disadvantages when applied to eCommerce product search.

Benchmarking
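As a simple reference point for the textual-embeddings side of this work, the sketch below ranks a small product catalog against a query by cosine similarity of sentence embeddings. It is a generic embedding-retrieval baseline, not the paper's end-to-end ranking model; the encoder and catalog are assumptions.

```python
# Baseline illustration of embedding-based product ranking: encode the query
# and product titles, then sort products by cosine similarity. Encoder and
# catalog are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")

query = "running shoes for flat feet"
products = [
    "Men's stability running shoe with arch support",
    "Leather office loafers",
    "Trail running shoe, neutral cushioning",
    "Orthotic insoles for flat feet",
]

# Encode the query and every product title into the same embedding space.
query_emb = encoder.encode(query, convert_to_tensor=True)
product_emb = encoder.encode(products, convert_to_tensor=True)
scores = util.cos_sim(query_emb, product_emb)[0]

# Rank the catalog by cosine similarity to the query.
for rank, idx in enumerate(scores.argsort(descending=True), start=1):
    i = int(idx)
    print(f"{rank}. {scores[i].item():.3f}  {products[i]}")
```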
