Analogical reasoning is effective in capturing linguistic regularities.
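As an illustration, here is a minimal sketch of the vector-arithmetic analogy test often used to probe such regularities (e.g., king − man + woman ≈ queen). The embedding values and the `analogy` helper are illustrative assumptions, not taken from any particular model; in practice the vectors would come from a pretrained embedding such as word2vec or GloVe.

```python
import numpy as np

# Toy embeddings with illustrative values; real systems would load
# pretrained vectors instead.
vectors = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "man":   np.array([0.7, 0.2, 0.1]),
    "woman": np.array([0.6, 0.2, 0.9]),
    "queen": np.array([0.7, 0.9, 0.8]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c, vocab):
    """Solve a : b :: c : ? by vector arithmetic (b - a + c)."""
    target = vocab[b] - vocab[a] + vocab[c]
    # Exclude the query words themselves from the candidate set.
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman", vectors))  # expected: "queen"
```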
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality, and are often not very captivating.
Subword units are an effective way to alleviate the open-vocabulary problem in neural machine translation (NMT).
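A compact sketch of the byte-pair-encoding (BPE) merge loop commonly used to learn such subword units: the most frequent adjacent symbol pair is repeatedly merged into a new symbol. The toy vocabulary and the number of merges below are illustrative assumptions.

```python
import re
import collections

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = collections.Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Merge every occurrence of `pair` into a single new symbol."""
    bigram = re.escape(" ".join(pair))
    pattern = re.compile(r"(?<!\S)" + bigram + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Words are stored as space-separated symbols with an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for _ in range(10):  # the number of merges is the main BPE hyperparameter
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)
    vocab = merge_pair(best, vocab)
    print(best)
```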
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon.
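A minimal sketch of the lattice-construction step only, assuming a toy lexicon and a simple substring scan: every character span that matches a lexicon entry becomes a word cell in the lattice. The gated fusion of these word cells into the character LSTM is omitted here.

```python
def lexicon_matches(chars, lexicon, max_len=4):
    """Find every character span that forms a word in the lexicon.
    Each match would become a word cell feeding into the lattice."""
    matches = []
    for i in range(len(chars)):
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                matches.append((i, j, word))
    return matches

# Hypothetical lexicon and sentence ("Nanjing Yangtze River Bridge").
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
sentence = list("南京市长江大桥")
for start, end, word in lexicon_matches(sentence, lexicon, max_len=4):
    print(start, end, word)
```

Note how overlapping matches such as 市长 ("mayor") and 长江 ("Yangtze") coexist in the lattice; the model, rather than an upstream segmenter, decides which word paths to trust.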
The process of translation is ambiguous, in that there are typically many valid translations for a given sentence.
With recent advances in network architectures for Neural Machine Translation (NMT), recurrent models have effectively been replaced by either convolutional or self-attentional approaches, such as in the Transformer.
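A minimal NumPy sketch of the scaled dot-product self-attention at the core of the Transformer, assuming a single head with no learned projections, masking, or multi-layer structure.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise position similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V

# Self-attention: queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))   # 5 positions, model dimension 8
out = scaled_dot_product_attention(x, x, x)
print(out.shape)              # (5, 8)
```

Unlike a recurrent model, every position attends to every other position in one step, which is what allows the sequence to be processed in parallel.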
Inspired by how humans summarize long documents, we propose an accurate and fast summarization model that first selects salient sentences and then rewrites them abstractively (i.e., compresses and paraphrases) to generate a concise overall summary.
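A toy sketch of this select-then-rewrite pipeline under simplifying assumptions: the extractor here is a word-frequency heuristic and the `rewrite` abstractor is a pass-through stub, whereas the paper trains neural models for both stages.

```python
import re
from collections import Counter

def select_salient(document, k=2):
    """Extractor stand-in: score sentences by overlap with document word counts."""
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    doc_counts = Counter(w.lower() for w in re.findall(r"\w+", document))
    def score(sent):
        words = [w.lower() for w in re.findall(r"\w+", sent)]
        return sum(doc_counts[w] for w in words) / max(len(words), 1)
    return sorted(sentences, key=score, reverse=True)[:k]

def rewrite(sentence):
    """Abstractor stub: in the paper this is a seq2seq model that compresses
    and paraphrases each selected sentence; here it passes it through."""
    return sentence

def summarize(document, k=2):
    return " ".join(rewrite(s) for s in select_salient(document, k))

doc = ("The storm hit the coast early Monday. Thousands lost power. "
       "Officials said repairs could take days. The mayor urged residents to stay indoors.")
print(summarize(doc, k=2))
```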