Search Results for author: Yinpeng Guo

Found 6 papers, 2 papers with code

MultiCoder: Multi-Programming-Lingual Pre-Training for Low-Resource Code Completion

no code implementations 19 Dec 2022 Zi Gong, Yinpeng Guo, Pingyi Zhou, Cuiyun Gao, Yasheng Wang, Zenglin Xu

On the other hand, there are few studies exploring the effects of multi-programming-lingual (MultiPL) pre-training for code completion, especially its impact on low-resource programming languages.

Code Completion

PanGu-Coder: Program Synthesis with Function-Level Language Modeling

1 code implementation 22 Jul 2022 Fenia Christopoulou, Gerasimos Lampouras, Milan Gritta, Guchun Zhang, Yinpeng Guo, Zhongqi Li, Qi Zhang, Meng Xiao, Bo Shen, Lin Li, Hao Yu, Li Yan, Pingyi Zhou, Xin Wang, Yuchi Ma, Ignacio Iacobacci, Yasheng Wang, Guangtai Liang, Jiansheng Wei, Xin Jiang, Qianxiang Wang, Qun Liu

We present PanGu-Coder, a pretrained decoder-only language model adopting the PanGu-Alpha architecture for text-to-code generation, i.e., the synthesis of programming language solutions given a natural language problem description.

Code Generation · Language Modelling +2
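
The usage pattern described above (prompt a decoder-only causal LM with a problem description, decode a program) can be sketched with the transformers library. PanGu-Coder's own checkpoint may not be loadable this way, so the model name below is a public stand-in (Salesforce/codegen-350M-mono), not the paper's model.

# Illustrative text-to-code generation with a decoder-only causal LM.
# The checkpoint is a stand-in with the same prompt-to-code usage
# pattern as PanGu-Coder, not the paper's released model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Salesforce/codegen-350M-mono"  # stand-in checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# A problem description framed at function level: signature + docstring,
# mirroring function-level text-to-code synthesis.
prompt = (
    "def fibonacci(n):\n"
    '    """Return the n-th Fibonacci number."""\n'
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=False,  # greedy decoding for reproducibility
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))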

FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models

no code implementations Findings (NAACL) 2022 Yinpeng Guo, Liangyou Li, Xin Jiang, Qun Liu

However, labeled cross-lingual corpora are expensive or even inaccessible, especially in fields where labels are private, such as diagnostic results of symptoms in medicine and user profiles in business.

Cross-Lingual Transfer · Knowledge Distillation +3

Learning Multilingual Representation for Natural Language Understanding with Enhanced Cross-Lingual Supervision

no code implementations 9 Jun 2021 Yinpeng Guo, Liangyou Li, Xin Jiang, Qun Liu

Recently, pre-training multilingual language models has shown great potential for learning multilingual representations, a crucial topic in natural language processing.

Natural Language Understanding

Training Multilingual Pre-trained Language Model with Byte-level Subwords

1 code implementation 23 Jan 2021 Junqiu Wei, Qun Liu, Yinpeng Guo, Xin Jiang

Pre-trained language models have achieved great success in various natural language understanding (NLU) tasks due to their capacity to capture deep contextualized information in text by pre-training on large-scale corpora.

Language Modelling · Natural Language Understanding
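
Byte-level subword tokenization, which this paper builds on, can be illustrated with the HuggingFace tokenizers library; the toy corpus below is invented for the example. Because every string is mapped to bytes before BPE merges are learned, no input is ever out-of-vocabulary.

# Minimal sketch of training a byte-level BPE vocabulary,
# the kind of byte-level subword segmentation discussed above.
# The corpus lines are toy examples, not the paper's data.
from tokenizers import ByteLevelBPETokenizer

corpus = [
    "Pre-trained language models capture contextualized information.",
    "Byte-level subwords avoid out-of-vocabulary symbols entirely.",
    "多语言文本也能被字节级子词切分。",  # non-Latin text still tokenizes
]

tokenizer = ByteLevelBPETokenizer()
tokenizer.train_from_iterator(corpus, vocab_size=500, min_frequency=1)

# Every Unicode string maps to bytes first, so nothing becomes <unk>.
print(tokenizer.encode("Byte-level 多语言 tokenization").tokens)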

Zero-Shot Paraphrase Generation with Multilingual Language Models

no code implementations 9 Nov 2019 Yinpeng Guo, Yi Liao, Xin Jiang, Qing Zhang, Yibo Zhang, Qun Liu

Leveraging multilingual parallel texts to automatically generate paraphrases has drawn much attention, as the size of high-quality paraphrase corpora is limited.

Denoising · Machine Translation +3
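
One common way to turn multilingual parallel models into a paraphrase generator is round-trip machine translation (pivoting through another language and back). This sketch illustrates that general idea, not the paper's exact zero-shot method; the Helsinki-NLP MarianMT checkpoints used are public models.

# Round-trip translation as a simple paraphrase generator:
# English -> French -> English, pivoting through public MarianMT models.
from transformers import MarianMTModel, MarianTokenizer

def translate(texts, model_name):
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    batch = tokenizer(texts, return_tensors="pt", padding=True)
    generated = model.generate(**batch, max_new_tokens=64)
    return [tokenizer.decode(g, skip_special_tokens=True) for g in generated]

sentence = ["The size of high-quality paraphrase corpora is limited."]
pivoted = translate(sentence, "Helsinki-NLP/opus-mt-en-fr")    # en -> fr
paraphrase = translate(pivoted, "Helsinki-NLP/opus-mt-fr-en")  # fr -> en
print(paraphrase[0])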
