Search Results for author: Poorya Zaremoodi

Found 6 papers, 0 papers with code

Learning to Multi-Task Learn for Better Neural Machine Translation

no code implementations • 10 Jan 2020 • Poorya Zaremoodi, Gholamreza Haffari

We effectively and efficiently learn the training schedule policy within the imitation learning framework using an oracle policy algorithm that dynamically sets the importance weights of auxiliary tasks based on their contributions to the generalisability of the main NMT task.

Imitation Learning • Machine Translation • +5
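
The abstract above describes dynamically setting importance weights on auxiliary tasks by their contribution to the main NMT task. Below is a minimal NumPy sketch of that general idea only: the gradient-cosine heuristic for scoring auxiliary tasks is a common stand-in, not the paper's learned oracle policy, and the toy linear-regression tasks are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one "main" task and two auxiliary tasks, each a linear
# regression over a shared parameter vector (a hypothetical stand-in
# for a shared NMT encoder).
dim, n = 8, 64
theta = np.zeros(dim)
tasks = []
for _ in range(3):
    X = rng.normal(size=(n, dim))
    w_true = rng.normal(size=dim)
    y = X @ w_true + 0.1 * rng.normal(size=n)
    tasks.append((X, y))

def grad(theta, X, y):
    # Gradient of mean squared error for one task.
    return 2 * X.T @ (X @ theta - y) / len(y)

weights = np.ones(3) / 3          # importance weights over tasks
lr, beta = 0.05, 0.5              # step size; weight-update smoothing

for step in range(200):
    grads = [grad(theta, X, y) for X, y in tasks]
    g_main = grads[0]
    # Heuristic "oracle": upweight an auxiliary task when its gradient
    # aligns with the main task's gradient (cosine similarity), i.e.
    # when it plausibly helps the main task generalise.
    scores = np.array([1.0] + [
        max(0.0, g @ g_main / (np.linalg.norm(g) * np.linalg.norm(g_main) + 1e-8))
        for g in grads[1:]
    ])
    weights = beta * weights + (1 - beta) * scores / scores.sum()
    theta -= lr * sum(w * g for w, g in zip(weights, grads))

print("final task weights:", np.round(weights, 3))
```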

Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation

no code implementations • WS 2019 • Poorya Zaremoodi, Gholamreza Haffari

The role of the training schedule becomes even more crucial in biased-MTL, where the goal is to improve one task (or a subset of tasks) the most, e.g. translation quality.

Low-Resource Neural Machine Translation • NMT • +1
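
As a rough illustration of biased-MTL scheduling, the sketch below samples training tasks from a distribution that shifts toward the main task as training progresses. The fixed annealed schedule and the auxiliary task names are assumptions made for illustration; the paper's schedule is learned adaptively.

```python
import random

def biased_mtl_schedule(step, total_steps, n_aux, floor=0.05):
    """Sampling distribution over [main, aux_1..aux_n] that moves
    probability mass toward the main task over training. A fixed
    annealed schedule -- a simplification of an adaptively learned one."""
    frac = step / total_steps
    p_main = 0.5 + 0.5 * frac                  # 0.5 -> 1.0 over training
    p_aux = max(floor, 1 - p_main) / n_aux     # floor keeps aux tasks alive
    probs = [p_main] + [p_aux] * n_aux
    z = sum(probs)                             # renormalise to sum to 1
    return [p / z for p in probs]

random.seed(0)
# Main task first; the auxiliary tasks here are hypothetical examples.
tasks = ["translation", "ner", "parsing"]
for step in (0, 5000, 9999):
    probs = biased_mtl_schedule(step, 10_000, n_aux=len(tasks) - 1)
    picked = random.choices(tasks, weights=probs)[0]
    print(step, [round(p, 3) for p in probs], "->", picked)
```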

Incorporating Syntactic Uncertainty in Neural Machine Translation with a Forest-to-Sequence Model

no code implementations • COLING 2018 • Poorya Zaremoodi, Gholamreza Haffari

Incorporating syntactic information in Neural Machine Translation (NMT) can lead to better reorderings, which is particularly useful when the language pair is syntactically highly divergent or when the training bitext is small.

Machine Translation • NMT • +2

Incorporating Syntactic Uncertainty in Neural Machine Translation with Forest-to-Sequence Model

no code implementations • 19 Nov 2017 • Poorya Zaremoodi, Gholamreza Haffari

In this paper, we propose a forest-to-sequence attentional Neural Machine Translation model that makes use of the exponentially many parse trees of the source sentence to compensate for parser errors.

Machine Translation • Sentence • +1
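
To make the forest-to-sequence idea concrete, here is a small NumPy sketch of a single decoder step attending over encodings of all nodes in a packed parse forest, rather than over one committed parse. The node encodings, dimensions, and the plain dot-product attention are placeholders, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(query, keys, values):
    """Standard dot-product attention over a set of vectors."""
    scores = keys @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values, weights

# Hypothetical packed forest: one encoding per node (words plus the
# internal nodes of every alternative parse of the source sentence).
# Attending over all forest nodes means no single, possibly wrong,
# parse has to be committed to.
d = 16
n_forest_nodes = 12
forest_enc = rng.normal(size=(n_forest_nodes, d))

decoder_state = rng.normal(size=d)   # current decoder hidden state
context, attn = attention(decoder_state, forest_enc, forest_enc)

print("attention over forest nodes:", np.round(attn, 3))
print("context vector shape:", context.shape)
```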
