1 code implementation • 13 Oct 2021 • Guochen Yu, Andong Li, Chengshi Zheng, Yinuo Guo, Yutian Wang, Hui Wang
Curriculum learning has begun to thrive in the speech enhancement area: it decouples the original spectrum estimation task into multiple easier sub-tasks to achieve better performance.
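The idea of decoupling one hard estimation task into staged sub-tasks can be sketched as a simple curriculum schedule. This is only an illustration of the general strategy, not the paper's actual training recipe; the sub-task names and step counts are made up for the example.

```python
def curriculum_schedule(sub_tasks, steps_per_stage, step):
    """Illustrative curriculum: start with the easiest sub-task and
    unlock the next one after a fixed number of training steps,
    returning the list of sub-task losses active at this step."""
    stage = min(step // steps_per_stage, len(sub_tasks) - 1)
    return sub_tasks[: stage + 1]

# Hypothetical sub-tasks ordered from easy to hard.
tasks = ["magnitude_estimation", "complex_spectrum_refinement"]
print(curriculum_schedule(tasks, 1000, 0))     # early: easiest sub-task only
print(curriculum_schedule(tasks, 1000, 1500))  # later: both sub-tasks active
```

In practice the per-stage losses would be weighted and combined rather than simply switched on, but the ordering-by-difficulty is the core of the curriculum.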
no code implementations • 13 Dec 2020 • Yinuo Guo, Zeqi Lin, Jian-Guang Lou, Dongmei Zhang
Experiments on Geo, ComplexWebQuestions, and Formulas show that our framework consistently improves the performance of neural semantic parsers across different domains.
no code implementations • 8 Dec 2020 • Yinuo Guo, Hualei Zhu, Zeqi Lin, Bei Chen, Jian-Guang Lou, Dongmei Zhang
Human intelligence exhibits compositional generalization (i.e., the capacity to understand and produce unseen combinations of seen components), but current neural seq2seq models lack this ability.
no code implementations • WMT (EMNLP) 2020 • Jin Xu, Yinuo Guo, Junfeng Hu
Copying mechanism has been commonly used in neural paraphrasing networks and other text generation tasks, in which some important words in the input sequence are preserved in the output sequence.
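A common way to realize such a copying mechanism is a pointer-generator-style mixture: the final output distribution blends the decoder's vocabulary distribution with attention mass placed on the source tokens. The sketch below shows the mixture computation only; the function name and toy numbers are illustrative, not from the paper.

```python
import numpy as np

def copy_mixture(p_gen, vocab_dist, attn_weights, src_token_ids):
    """Mix a generation distribution with a copy distribution:
    with probability p_gen generate from the vocabulary, and with
    probability (1 - p_gen) copy a source token in proportion to
    its attention weight."""
    final = p_gen * np.asarray(vocab_dist, dtype=float)
    for a, tok in zip(attn_weights, src_token_ids):
        final[tok] += (1.0 - p_gen) * a
    return final

# Toy example: vocabulary of 4 tokens, two source positions.
vocab_dist = np.array([0.2, 0.3, 0.1, 0.4])
attn = np.array([0.6, 0.4])        # attention over the source positions
src_ids = [2, 0]                   # vocabulary ids of the source tokens
mix = copy_mixture(0.7, vocab_dist, attn, src_ids)
print(mix)  # still a valid distribution: sums to 1
```

Because both input distributions sum to one, the mixture does too, so important input words receive extra probability mass without breaking normalization.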
no code implementations • NeurIPS 2020 • Yinuo Guo, Zeqi Lin, Jian-Guang Lou, Dongmei Zhang
We formalize human language understanding as a structured prediction task where the output is a partially ordered set (poset).
Ranked #4 on Semantic Parsing on CFQ
no code implementations • 16 Jan 2020 • Yinuo Guo, Tao Ge, Furu Wei
To overcome these challenges, we first propose Fact-aware Sentence Encoding, which enables the model to learn facts from the long sentence and thus improves the precision of sentence splitting; we then introduce Permutation Invariant Training to alleviate the effects of order variance in seq2seq learning for this task.
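The core of Permutation Invariant Training is to score the model's outputs against every ordering of the reference targets and train on the cheapest one, so a correct split emitted in a different order is not penalized. Below is a minimal generic sketch of that objective; the function names and the toy 0/1 pairwise loss are illustrative assumptions, not the paper's exact loss.

```python
import itertools

def pit_loss(predictions, targets, pair_loss):
    """Permutation Invariant Training objective: evaluate the pairwise
    loss under every ordering of the targets and keep the minimum, so
    the model is only penalized for content errors, not for producing
    correct sub-sentences in a different order than the reference."""
    return min(
        sum(pair_loss(p, t) for p, t in zip(predictions, perm))
        for perm in itertools.permutations(targets)
    )

# Toy usage: predictions match the targets, just in swapped order.
match = lambda p, t: 0 if p == t else 1
print(pit_loss(["a", "b"], ["b", "a"], match))  # -> 0, order is forgiven
```

The exhaustive search is factorial in the number of target sentences, which is acceptable here because a long sentence is split into only a handful of shorter ones.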
no code implementations • WS 2019 • Yinuo Guo, Junfeng Hu
This paper describes Meteor++ 2.0, our submission to the WMT19 Metric Shared Task.
no code implementations • WS 2018 • Yinuo Guo, Chong Ruan, Junfeng Hu
In machine translation evaluation, a good candidate translation can be regarded as a paraphrase of the reference.