WS 2019 • Tom Lippincott, Pamela Shapiro, Kevin Duh, Paul McNamee
Our submission to the MADAR shared task on Arabic dialect identification employed a language modeling technique called Prediction by Partial Matching (PPM), an ensemble of neural architectures, and additional data sources for training word embeddings and auxiliary language models.
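PPM is a compression-style technique: train one character-level model per dialect and label a text with the dialect whose model assigns it the lowest cross-entropy. The sketch below is a simplified backoff variant that omits PPM's escape-probability mechanism and is not the submission's actual implementation; `CharBackoffLM` and `classify` are illustrative names.

```python
import math
from collections import defaultdict

class CharBackoffLM:
    """Simplified PPM-style character LM: order-n counts with a crude
    backoff to shorter contexts (full PPM escape handling omitted)."""

    def __init__(self, order=3):
        self.order = order
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> char -> count

    def train(self, text):
        padded = "\x00" * self.order + text
        for i in range(self.order, len(padded)):
            for k in range(self.order + 1):  # record every context length 0..order
                self.counts[padded[i - k:i]][padded[i]] += 1

    def logprob(self, ctx, ch):
        # back off from the longest context that has seen ch to the empty context
        for k in range(min(self.order, len(ctx)), -1, -1):
            sub = ctx[len(ctx) - k:] if k > 0 else ""
            dist = self.counts.get(sub)
            if dist and ch in dist:
                return math.log(dist[ch] / sum(dist.values()))
        return math.log(1e-6)  # character never seen in training

    def cross_entropy(self, text):
        padded = "\x00" * self.order + text
        return -sum(self.logprob(padded[i - self.order:i], padded[i])
                    for i in range(self.order, len(padded))) / max(len(text), 1)

def classify(text, models):
    """Pick the dialect whose model 'compresses' the text best."""
    return min(models, key=lambda d: models[d].cross_entropy(text))
```

Training one such model per dialect and comparing per-character code lengths is the core of compression-based classification; the neural ensemble in the submission is a separate component.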
WS 2019 • Pamela Shapiro, Kevin Duh
When translating diglossic languages such as Arabic, we may wish to translate a text without knowing which dialect it is written in.
NAACL 2019 • Xuan Zhang, Pamela Shapiro, Gaurav Kumar, Paul McNamee, Marine Carpuat, Kevin Duh
We introduce a curriculum learning approach to adapt generic neural machine translation models to a specific domain.
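Broadly, such a curriculum ranks parallel sentences by relevance to the target domain, groups them into shards, and changes the training pool phase by phase. The sketch below uses a gradual fine-tuning style schedule (start on all data, progressively focus on the most relevant shards); this is one plausible schedule under stated assumptions, not necessarily the paper's exact procedure, and `make_shards` / `curriculum_batches` are illustrative names.

```python
import random

def make_shards(examples, relevance, num_shards=4):
    """Rank examples by a domain-relevance score (most relevant first)
    and split the ranked list into roughly equal shards."""
    ranked = sorted(examples, key=relevance, reverse=True)
    size = (len(ranked) + num_shards - 1) // num_shards
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

def curriculum_batches(shards, epochs_per_phase=1, seed=0):
    """Each phase drops the least-relevant remaining shard, so training
    starts on all data and gradually concentrates on in-domain-like data."""
    rng = random.Random(seed)
    for phase in range(len(shards), 0, -1):
        pool = [ex for shard in shards[:phase] for ex in shard]
        for _ in range(epochs_per_phase):
            rng.shuffle(pool)       # shuffle within a phase, order across phases
            yield from pool
```

The relevance score would typically come from in-domain vs. general-domain language model cross-entropy difference; any per-example scoring function plugs into `relevance`.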
WS 2019 • Adithya Renduchintala, Pamela Shapiro, Kevin Duh, Philipp Koehn
Neural machine translation (NMT) systems operate primarily on words (or sub-words), ignoring lower-level patterns of morphology.
5 Sep 2018 • Pamela Shapiro, Kevin Duh
Neural Machine Translation (NMT) in low-resource settings and for morphologically rich languages is made difficult in part by data sparsity: many vocabulary words occur rarely or never in the training data.
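Subword segmentation is a standard remedy for this kind of sparsity. As background only (not the authors' pipeline), a toy byte-pair-encoding learner in the style of Sennrich et al. repeatedly merges the most frequent adjacent symbol pair:

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Toy BPE learner: start from characters (plus an end-of-word marker)
    and greedily merge the most frequent adjacent symbol pair."""
    vocab = Counter(tuple(w) + ("</w>",) for w in words)
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = Counter()
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])  # apply the merge
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges
```

Frequent words end up as single units while rare words decompose into smaller, better-attested pieces, which is exactly what helps under sparsity.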
EMNLP 2018 • Shijie Wu, Pamela Shapiro, Ryan Cotterell
We compare soft and hard non-monotonic attention experimentally and find that the exact algorithm significantly improves performance over the stochastic approximation and outperforms soft attention.
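The contrast at a single decoding step: soft attention mixes source representations before predicting (expectation inside the softmax), while exact hard attention treats the attended position as a latent variable and marginalizes, mixing per-position output distributions (expectation outside). A minimal numpy sketch of that distinction, with illustrative shapes and names rather than the paper's model:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention_probs(scores, values, W):
    """Soft attention: average the n source value vectors under the
    attention weights, then predict from the single mixed vector."""
    alpha = softmax(scores)                 # (n,) attention weights
    mixed = alpha @ values                  # (d,) expected representation
    return softmax(W @ mixed)               # (V,) output distribution

def hard_attention_probs(scores, values, W):
    """Exact hard attention: marginalize over the latent alignment,
    mixing one output distribution per source position."""
    alpha = softmax(scores)                 # (n,) alignment prior
    dists = softmax(W @ values.T, axis=0)   # (V, n) distribution per position
    return dists @ alpha                    # (V,) exact marginal
```

Because softmax is nonlinear, the two generally give different distributions; the exact marginal is what a sampled (stochastic) hard-attention estimator only approximates.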
WS 2018 • Pamela Shapiro, Kevin Duh
Neural machine translation has achieved impressive results in the last few years, but its success has been limited to settings with large amounts of parallel data.