1 code implementation • EMNLP 2021 • Kayo Yin, Kenneth DeHaan, Malihe Alikhani
Coreference resolution is key to many natural language processing tasks and yet has been relatively unexplored in Sign Language Processing.
no code implementations • EMNLP 2021 • Aditi Chaudhary, Kayo Yin, Antonios Anastasopoulos, Graham Neubig
Learning fine-grained distinctions between vocabulary items is a key challenge in learning a new language.
1 code implementation • 21 Feb 2022 • Kayo Yin, Graham Neubig
Model interpretability methods are often used to explain NLP model decisions on tasks such as text classification, where the output space is relatively small.
no code implementations • 15 Sep 2021 • Patrick Fernandes, Kayo Yin, Emmy Liu, André F. T. Martins, Graham Neubig
Although proper handling of discourse significantly contributes to the quality of machine translation (MT), these improvements are not adequately measured in common translation quality metrics.
1 code implementation • 13 Sep 2021 • Aditi Chaudhary, Kayo Yin, Antonios Anastasopoulos, Graham Neubig
Learning fine-grained distinctions between vocabulary items is a key challenge in learning a new language.
no code implementations • MTSummit 2021 • Amit Moryossef, Kayo Yin, Graham Neubig, Yoav Goldberg
Sign language translation (SLT) is often decomposed into video-to-gloss recognition and gloss-to-text translation, where a gloss is a sequence of transcribed spoken-language words in the order in which they are signed.
1 code implementation • ACL 2021 • Kayo Yin, Patrick Fernandes, Danish Pruthi, Aditi Chaudhary, André F. T. Martins, Graham Neubig
Are context-aware models paying large amounts of attention to the same context as humans?
no code implementations • ACL 2021 • Kayo Yin, Amit Moryossef, Julie Hochgesang, Yoav Goldberg, Malihe Alikhani
Signed languages are the primary means of communication for many deaf and hard of hearing individuals.
1 code implementation • ACL 2021 • Patrick Fernandes, Kayo Yin, Graham Neubig, André F. T. Martins
Recent work in neural machine translation has demonstrated both the necessity and feasibility of using inter-sentential context -- context from sentences other than those currently being translated.
1 code implementation • COLING 2020 • Kayo Yin, Jesse Read
This contradicts previous claims that ground-truth (GT) gloss translation acts as an upper bound for SLT performance and reveals that glosses are an inefficient representation of sign language.
Ranked #1 on Sign Language Translation on ASLG-PC12 (using extra training data)