Search Results for author: Ergun Biçici

Found 19 papers, 0 papers with code

RTM Stacking Results for Machine Translation Performance Prediction

no code implementations WS 2019 Ergun Biçici

We obtain new results using referential translation machines with an increased number of learning models in the set of results that are stacked to obtain a better mixture-of-experts prediction.

Tasks: Machine Translation, Sentence +1
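As a concrete illustration of the stacking described in this abstract, the sketch below combines several base regressors into a mixture-of-experts-style predictor whose outputs are mixed by a meta-learner. This is a minimal sketch using scikit-learn, not the paper's RTM system; the base learners, hyperparameters, and the feature matrix X / score vector y it would be fit on are illustrative assumptions.

    # Minimal stacking sketch for sentence-level MT performance prediction.
    from sklearn.ensemble import StackingRegressor
    from sklearn.linear_model import Ridge
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR

    def build_stacked_predictor():
        # Base learners whose predictions form the "set of results" that is
        # stacked; adding more diverse learners enlarges that set.
        base_learners = [
            ("ridge", Ridge(alpha=1.0)),
            ("svr", SVR(kernel="rbf", C=1.0)),
            ("knn", KNeighborsRegressor(n_neighbors=5)),
        ]
        # The final estimator acts as the mixture-of-experts combiner.
        return StackingRegressor(estimators=base_learners,
                                 final_estimator=Ridge(alpha=1.0))

    # Usage (X: per-sentence prediction features, y: translation scores):
    # model = build_stacked_predictor().fit(X, y)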

Machine Translation with parfda, Moses, kenlm, nplm, and PRO

no code implementations WS 2019 Ergun Biçici

We build parfda Moses statistical machine translation (SMT) models for most language pairs in the news translation task.

Tasks: Machine Translation, Translation

RTM results for Predicting Translation Performance

no code implementations WS 2018 Ergun Biçici

With improved prediction combination using weights based on training performance, and with stacking and multilayer perceptrons to build deeper prediction models, RTMs rank third overall at sentence-level prediction of translation scores and achieve the lowest RMSE on the English to German NMT quality estimation task (QET) results.

Tasks: Language Modelling, Machine Translation +3
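The two combination ideas this abstract mentions can be sketched directly: weight each model's predictions by its training performance, and separately stack base-model predictions with a multilayer perceptron to build a deeper prediction model. The inverse-RMSE weighting, the models, and the MLP hyperparameters below are illustrative assumptions, not the paper's exact setup.

    import numpy as np
    from sklearn.metrics import mean_squared_error
    from sklearn.neural_network import MLPRegressor

    def combine_by_training_rmse(models, X_train, y_train, X_test):
        # Weight each fitted model by its inverse training RMSE, normalized.
        preds = np.column_stack([m.predict(X_test) for m in models])
        rmse = np.array([np.sqrt(mean_squared_error(y_train, m.predict(X_train)))
                         for m in models])
        weights = (1.0 / rmse) / (1.0 / rmse).sum()
        return preds @ weights  # weighted combination of test predictions

    def stack_with_mlp(models, X_train, y_train):
        # Deeper prediction model: an MLP meta-learner trained on the
        # base models' predictions instead of the raw features.
        meta = np.column_stack([m.predict(X_train) for m in models])
        return MLPRegressor(hidden_layer_sizes=(32, 16),
                            max_iter=2000).fit(meta, y_train)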

Robust parfda Statistical Machine Translation Results

no code implementations WS 2018 Ergun Biçici

We build Moses statistical machine translation (SMT) models using parallel feature decay algorithms (parfda) for language pairs in the translation task.

Tasks: Language Modelling, Machine Translation +1
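parfda builds on feature decay algorithms (FDA) for training-instance selection: candidate sentences are scored by how well they cover the test set's n-gram features, and a feature's weight decays each time a selected sentence contains it, pushing later picks toward still-uncovered features. The sketch below is a simplified monolingual version; the initial weights, the 1/(1+count) decay, and the bigram feature order are illustrative assumptions rather than the parfda settings.

    from collections import Counter

    def ngrams(tokens, n_max=2):
        return [tuple(tokens[i:i + n]) for n in range(1, n_max + 1)
                for i in range(len(tokens) - n + 1)]

    def fda_select(candidates, test_sentences, k):
        # Initialize feature weights from test-set n-gram counts.
        weights = Counter()
        for sent in test_sentences:
            weights.update(ngrams(sent.split()))
        weights = {f: float(c) for f, c in weights.items()}

        counts = Counter()   # how often each feature was already selected
        selected, pool = [], list(candidates)
        for _ in range(min(k, len(pool))):
            # Score = sum of decayed weights of the sentence's distinct
            # test-set features; repeated features count once per sentence.
            def score(sent):
                return sum(weights[f] / (1 + counts[f])
                           for f in set(ngrams(sent.split())) if f in weights)
            best = max(pool, key=score)
            pool.remove(best)
            selected.append(best)
            for f in set(ngrams(best.split())):
                if f in weights:
                    counts[f] += 1   # decay this feature for later rounds
        return selected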

RTM at SemEval-2017 Task 1: Referential Translation Machines for Predicting Semantic Similarity

no code implementations SEMEVAL 2017 Ergun Biçici

We use referential translation machines to predict the semantic similarity of text in all STS tasks, which this year cover Arabic, English, Spanish, and Turkish.

Tasks: Machine Translation, Semantic Similarity +3
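At a high level, an RTM treats similarity prediction as supervised regression over translation-style features computed between the two texts. The sketch below uses a few simple token-overlap features and a ridge regressor as stand-ins; the actual RTM feature set and learner are not reproduced here.

    from sklearn.linear_model import Ridge

    def pair_features(s1, s2):
        # Translation-style overlap features for a text pair (stand-ins).
        t1, t2 = set(s1.lower().split()), set(s2.lower().split())
        overlap = len(t1 & t2)
        prec = overlap / max(len(t2), 1)   # coverage of s2 by s1
        rec = overlap / max(len(t1), 1)    # coverage of s1 by s2
        f1 = 2 * prec * rec / max(prec + rec, 1e-9)
        return [prec, rec, f1, abs(len(t1) - len(t2))]

    def train_sts_model(pairs, gold_scores):
        # Regress gold similarity scores on the pair features.
        X = [pair_features(a, b) for a, b in pairs]
        return Ridge(alpha=1.0).fit(X, gold_scores)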
