Search Results for author: Patrick Simianer

Found 9 papers, 0 papers with code

Neural Machine Translation Models Can Learn to be Few-shot Learners

no code implementations 15 Sep 2023 Raphael Reinauer, Patrick Simianer, Kaden Uhlig, Johannes E. M. Mosig, Joern Wuebker

The emergent ability of Large Language Models to use a small number of examples to learn to perform in novel domains and tasks is also called in-context learning (ICL).

Domain Adaptation · In-Context Learning · +4
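The abstract above describes few-shot, in-context learning for translation. As a rough illustration only (not code from the paper), a few-shot translation prompt could be assembled like this; the language pair and demonstration sentences are invented:

```python
# Illustrative sketch of a few-shot (in-context learning) prompt for
# translation. The demonstration pairs and formatting are invented,
# not taken from the paper.

def build_fewshot_prompt(examples, source_sentence):
    """Concatenate (source, target) demonstration pairs before the new input."""
    blocks = [f"English: {src}\nGerman: {tgt}" for src, tgt in examples]
    blocks.append(f"English: {source_sentence}\nGerman:")
    return "\n\n".join(blocks)

demos = [
    ("The contract was signed yesterday.", "Der Vertrag wurde gestern unterzeichnet."),
    ("Please review the attached invoice.", "Bitte prüfen Sie die beigefügte Rechnung."),
]
print(build_fewshot_prompt(demos, "The delivery is delayed by two weeks."))
```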

The Impact of Text Presentation on Translator Performance

no code implementations 11 Nov 2020 Samuel Läubli, Patrick Simianer, Joern Wuebker, Geza Kovacs, Rico Sennrich, Spence Green

Widely used computer-aided translation (CAT) tools divide documents into segments such as sentences and arrange them in a side-by-side, spreadsheet-like view.

Sentence Translation

Measuring Immediate Adaptation Performance for Neural Machine Translation

no code implementations NAACL 2019 Patrick Simianer, Joern Wuebker, John DeNero

Incremental domain adaptation, in which a system learns from the correct output for each input immediately after making its prediction for that input, can dramatically improve system performance for interactive machine translation.

Domain Adaptation · Machine Translation · +2
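The abstract sketches the incremental-adaptation loop: translate one input, then learn from its correct output before the next input arrives. A minimal sketch of that loop, assuming a generic PyTorch-style model with hypothetical `translate` and `nll_loss` methods (not the paper's actual interface):

```python
import torch

def online_adaptation_loop(model, optimizer, stream):
    """`stream` yields (source, reference) pairs in document order."""
    for source, reference in stream:
        # 1) Predict for the current input with the current parameters.
        model.eval()
        with torch.no_grad():
            hypothesis = model.translate(source)   # hypothetical method

        # 2) Immediately update on the correct output for that same input.
        model.train()
        optimizer.zero_grad()
        loss = model.nll_loss(source, reference)   # hypothetical method
        loss.backward()
        optimizer.step()                           # single-example update
```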

Compact Personalized Models for Neural Machine Translation

no code implementations EMNLP 2018 Joern Wuebker, Patrick Simianer, John DeNero

We propose and compare methods for gradient-based domain adaptation of self-attentive neural machine translation models.

Domain Adaptation · Machine Translation · +1
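As a hedged sketch of what compact, gradient-based adaptation can look like in general (one possible strategy, not necessarily the paper's exact method): fine-tune only a small, user-specific subset of parameters, for example bias terms, while keeping the shared base model frozen:

```python
import torch

def make_personalized_optimizer(model, lr=1e-4):
    """Freeze the shared base model; adapt only bias terms per user (assumption)."""
    adapted = []
    for name, param in model.named_parameters():
        if name.endswith("bias"):
            param.requires_grad = True
            adapted.append(param)
        else:
            param.requires_grad = False
    return torch.optim.Adam(adapted, lr=lr)
```

Storing only the adapted tensors per user keeps each personalized model small compared to a fully fine-tuned copy of the network.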

A User-Study on Online Adaptation of Neural Machine Translation to Human Post-Edits

no code implementations 13 Dec 2017 Sariya Karimova, Patrick Simianer, Stefan Riezler

The advantages of neural machine translation (NMT) have been extensively validated for offline translation of several language pairs for different domains of spoken and written language.

Machine Translation · NMT · +1

A Post-editing Interface for Immediate Adaptation in Statistical Machine Translation

no code implementations COLING 2016 Patrick Simianer, Sariya Karimova, Stefan Riezler

Our translation systems may learn from post-edits using several weight, language model and novel translation model adaptation techniques, in part by exploiting the output of the graphical interface.

Domain Adaptation · Language Modelling · +2
