no code implementations • WS 2019 • Ergun Biçici
We obtain new results using referential translation machines with an increased number of learning models in the set of results that are stacked to obtain a better mixture-of-experts prediction.
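The abstract does not give implementation details; the following is a minimal numpy sketch of the general idea of stacking, where a meta-model is fit on the predictions of several base learning models to combine them. All data, feature subsets, and the least-squares meta-learner here are synthetic illustrations, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data standing in for translation quality scores.
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=200)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

def fit_ls(X, y):
    """Ordinary least-squares fit, returning a weight vector."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Base models: each sees a different subset of the features,
# mimicking a diverse set of learning models.
subsets = [[0, 1, 2], [2, 3, 4], [0, 4]]
base_w = [fit_ls(X_train[:, s], y_train) for s in subsets]

def base_preds(X):
    """One column of predictions per base model."""
    return np.column_stack([X[:, s] @ w for s, w in zip(subsets, base_w)])

# Meta-model: least squares on the stacked base predictions.
meta_w = fit_ls(base_preds(X_train), y_train)
stacked = base_preds(X_test) @ meta_w

base_rmses = [float(np.sqrt(np.mean((X_test[:, s] @ w - y_test) ** 2)))
              for s, w in zip(subsets, base_w)]
stacked_rmse = float(np.sqrt(np.mean((stacked - y_test) ** 2)))
```

Because each base model only covers part of the signal, the stacked combination can outperform every individual model, which is the motivation for adding more models to the stacked set.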
no code implementations • WS 2019 • Ergun Biçici
We build parfda Moses statistical machine translation (SMT) models for most language pairs in the news translation task.
no code implementations • WS 2018 • Ergun Biçici
By combining predictions with weights based on their training performance, and by using stacking and multilayer perceptrons to build deeper prediction models, RTMs rank 3rd overall in sentence-level prediction of translation scores and achieve the lowest RMSE on the English-to-German NMT QET results.
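The exact weighting scheme is not specified in the abstract; one plausible sketch of performance-based prediction combination is to weight each model by its inverse training RMSE, so that models that fit the training data better contribute more. The data and the inverse-RMSE scheme below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gold scores and predictions from three models of
# differing quality (noise scales 0.2, 0.5, 1.0).
y_train = rng.normal(size=100)
preds_train = np.stack([y_train + rng.normal(scale=s, size=100)
                        for s in (0.2, 0.5, 1.0)])

# Weight each model by the inverse of its training RMSE, normalized
# to sum to one (one plausible scheme, not necessarily the paper's).
train_rmse = np.sqrt(np.mean((preds_train - y_train) ** 2, axis=1))
weights = (1.0 / train_rmse) / np.sum(1.0 / train_rmse)

# Apply the fixed weights to new predictions at test time.
y_test = rng.normal(size=50)
preds_test = np.stack([y_test + rng.normal(scale=s, size=50)
                       for s in (0.2, 0.5, 1.0)])
combined = weights @ preds_test

test_rmse = np.sqrt(np.mean((preds_test - y_test) ** 2, axis=1))
combined_rmse = float(np.sqrt(np.mean((combined - y_test) ** 2)))
```

The better a model performs in training, the larger its weight, so the combined prediction is pulled toward the stronger models rather than averaging all of them uniformly.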
no code implementations • WS 2018 • Ergun Biçici
We build parallel feature decay algorithms (parfda) Moses statistical machine translation (SMT) models for language pairs in the translation task.
no code implementations • SEMEVAL 2017 • Ergun Biçici
We use referential translation machines to predict the semantic similarity of text in all STS tasks, which this year cover Arabic, English, Spanish, and Turkish.