no code implementations • 15 Sep 2023 • Raphael Reinauer, Patrick Simianer, Kaden Uhlig, Johannes E. M. Mosig, Joern Wuebker
The emergent ability of Large Language Models to use a small number of examples to learn to perform in novel domains and tasks is also called in-context learning (ICL).
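As a rough illustration (not taken from the paper), a few-shot ICL prompt for translation can be assembled by placing a handful of demonstration pairs in front of the new source sentence, so the model infers the task without any weight updates; the example pairs, language pair, and template below are hypothetical.

```python
# Minimal ICL sketch: demonstration pairs plus the new input form the prompt.
# The sentence pairs and prompt template are illustrative assumptions.

examples = [
    ("The cat sat on the mat.", "Die Katze sass auf der Matte."),
    ("How are you today?", "Wie geht es dir heute?"),
]

def build_icl_prompt(examples, source_sentence):
    """Assemble a few-shot translation prompt from demonstration pairs."""
    lines = ["Translate English to German."]
    for src, tgt in examples:
        lines.append(f"English: {src}\nGerman: {tgt}")
    lines.append(f"English: {source_sentence}\nGerman:")
    return "\n\n".join(lines)

prompt = build_icl_prompt(examples, "Where is the train station?")
print(prompt)  # pass this prompt to any instruction-following LLM
```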
no code implementations • 11 Nov 2020 • Samuel Läubli, Patrick Simianer, Joern Wuebker, Geza Kovacs, Rico Sennrich, Spence Green
Widely used computer-aided translation (CAT) tools divide documents into segments such as sentences and arrange them in a side-by-side, spreadsheet-like view.
no code implementations • NAACL 2019 • Patrick Simianer, Joern Wuebker, John DeNero
Incremental domain adaptation, in which a system learns from the correct output for each input immediately after making its prediction for that input, can dramatically improve system performance for interactive machine translation.
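A minimal sketch of this predict-then-learn loop, assuming a hypothetical `TranslationModel` interface rather than the authors' implementation: the model first produces its hypothesis, then immediately takes an update step on the confirmed output before translating the next segment.

```python
# Hypothetical interface; `translate` and `update` are placeholders,
# not the API from the paper.
class TranslationModel:
    def translate(self, source: str) -> str:
        raise NotImplementedError

    def update(self, source: str, reference: str) -> None:
        """Take one (or a few) gradient steps on a single confirmed pair."""
        raise NotImplementedError

def interactive_session(model, segments, get_confirmed_output):
    """Incremental adaptation: predict first, then learn from the correct output."""
    for source in segments:
        hypothesis = model.translate(source)                  # prediction shown to the user
        reference = get_confirmed_output(source, hypothesis)  # user confirms or corrects it
        model.update(source, reference)                       # adapt before the next segment
```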
no code implementations • EMNLP 2018 • Joern Wuebker, Patrick Simianer, John DeNero
We propose and compare methods for gradient-based domain adaptation of self-attentive neural machine translation models.
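For orientation only, a generic fine-tuning loop of the kind such gradient-based adaptation builds on; the model interface, optimizer, and hyperparameters are assumptions, not the specific methods compared in the paper.

```python
import torch
from torch import nn

def adapt(model: nn.Module, in_domain_batches, lr: float = 1e-5, steps: int = 10):
    """Continue training a pre-trained self-attentive NMT model on in-domain data."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _, (src, tgt) in zip(range(steps), in_domain_batches):
        loss = model(src, tgt)   # assumed: forward pass returns the training loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return model
```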
no code implementations • 13 Dec 2017 • Sariya Karimova, Patrick Simianer, Stefan Riezler
The advantages of neural machine translation (NMT) have been extensively validated for offline translation of several language pairs for different domains of spoken and written language.
no code implementations • COLING 2016 • Patrick Simianer, Sariya Karimova, Stefan Riezler
Our translation systems can learn from post-edits using several weight, language-model, and novel translation-model adaptation techniques, in part by exploiting the output of the graphical interface.