Word Alignment

84 papers with code • 7 benchmarks • 4 datasets

Word Alignment is the task of finding the correspondence between source and target words in a pair of sentences that are translations of each other.

Source: Neural Network-based Word Alignment through Score Aggregation
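
As a concrete illustration of the task (a minimal sketch, not any cited paper's method), a common greedy baseline links each target word to the source word whose embedding is most similar. The random vectors below are hypothetical stand-ins for real word embeddings:

```python
# Minimal similarity-based word alignment sketch: each target word is
# linked to the source word with the most similar embedding.
import numpy as np

def align(src_vecs: np.ndarray, tgt_vecs: np.ndarray) -> list[tuple[int, int]]:
    """Greedy alignment: each target index j is linked to the source
    index i maximizing cosine similarity."""
    src = src_vecs / np.linalg.norm(src_vecs, axis=1, keepdims=True)
    tgt = tgt_vecs / np.linalg.norm(tgt_vecs, axis=1, keepdims=True)
    sim = tgt @ src.T                      # (tgt_len, src_len) similarity matrix
    return [(int(sim[j].argmax()), j) for j in range(sim.shape[0])]

# Toy example with random "embeddings" standing in for real ones.
rng = np.random.default_rng(0)
src_vecs = rng.normal(size=(4, 8))   # e.g. "das Haus ist klein"
tgt_vecs = rng.normal(size=(4, 8))   # e.g. "the house is small"
print(align(src_vecs, tgt_vecs))     # list of (source_idx, target_idx) links
```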

Most implemented papers

Saliency-driven Word Alignment Interpretation for Neural Machine Translation

shuoyangd/meerkat WS 2019

Despite being originally designed to jointly learn to align and translate, Neural Machine Translation (NMT) models, especially the Transformer, are often perceived as not learning interpretable word alignments.
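
To illustrate the general saliency idea (a rough sketch under toy assumptions, not the paper's exact method), one can take the gradient of a target word's score with respect to the source embeddings and treat each position's gradient norm as an alignment score. The model and data below are stand-ins:

```python
# Gradient-saliency sketch: how strongly does each source position
# influence the score of one target word?
import torch

torch.manual_seed(0)
src_len, tgt_vocab, dim = 5, 100, 16
src_emb = torch.randn(src_len, dim, requires_grad=True)  # source embeddings
W = torch.randn(dim, tgt_vocab)                          # toy "translation model"

# Score of one target word given a mean-pooled source representation.
logits = src_emb.mean(dim=0) @ W
target_word = 42
logits[target_word].backward()

# Saliency of each source position = L2 norm of its embedding gradient;
# a larger norm means that source word contributes more to this target word.
saliency = src_emb.grad.norm(dim=1)
print(saliency / saliency.sum())
```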

Ultrasound tongue imaging for diarization and alignment of child speech therapy sessions

UltraSuite/ultrasuite-kaldi 1 Jul 2019

We investigate the automatic processing of child speech therapy sessions using ultrasound visual biofeedback, with a specific focus on complementing acoustic features with ultrasound images of the tongue for the tasks of speaker diarization and time-alignment of target words.

Unsupervised Multilingual Word Embedding with Limited Resources using Neural Language Models

twadada/multilingual-nlm ACL 2019

Recently, a variety of unsupervised methods have been proposed that map pre-trained word embeddings of different languages into the same space without any parallel data.

Bilingual Lexicon Induction through Unsupervised Machine Translation

artetxem/monoses ACL 2019

A recent line of research has obtained strong results on bilingual lexicon induction by aligning independently trained word embeddings in two languages and then using the resulting cross-lingual embeddings to induce word translation pairs through nearest-neighbor or related retrieval methods.
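
A compact sketch of that align-then-retrieve pipeline, shown here in its supervised Procrustes variant for brevity (the paper itself works without parallel data); the function names and seed-dictionary setup are illustrative assumptions:

```python
# Align two embedding spaces with orthogonal Procrustes, then induce a
# lexicon by nearest-neighbour retrieval in the shared space.
import numpy as np

def procrustes(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Orthogonal map W minimizing ||X @ W - Y||_F, fit on a seed
    dictionary whose i-th rows in X and Y are translations."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def induce_lexicon(src_emb, tgt_emb, W, src_words, tgt_words):
    """Map source embeddings into the target space and translate each
    source word to its nearest target neighbour by cosine similarity."""
    mapped = src_emb @ W
    mapped /= np.linalg.norm(mapped, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    nearest = (mapped @ tgt.T).argmax(axis=1)
    return {src_words[i]: tgt_words[j] for i, j in enumerate(nearest)}
```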

Learning Trilingual Dictionaries for Urdu – Roman Urdu – English

MoizRauf/Urdu--Roman-Urdu--English--Dictionary WS 2019

In this paper, we present an effort to generate a joint Urdu, Roman Urdu and English trilingual lexicon using automated methods.

Jointly Learning to Align and Translate with Transformer Models

pytorch/fairseq IJCNLP 2019

The state of the art in machine translation (MT) is governed by neural approaches, which typically achieve higher translation accuracy than statistical approaches.
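
Related to this paper's topic, one widely used way to read word alignments off a Transformer is to take, for each target position, the most-attended source position in a decoder cross-attention matrix. In the sketch below the attention matrix is random; in practice it would come from a trained model (e.g. one layer's weights averaged over heads):

```python
# Extract alignments from a (toy) cross-attention matrix by row-wise argmax.
import numpy as np

rng = np.random.default_rng(1)
attn = rng.random((4, 5))                 # (tgt_len, src_len) attention weights
attn /= attn.sum(axis=1, keepdims=True)   # normalize so each row sums to one

alignments = [(int(attn[j].argmax()), j) for j in range(attn.shape[0])]
print(alignments)                          # (source_idx, target_idx) pairs
```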

How Language-Neutral is Multilingual BERT?

jlibovicky/assess-multilingual-bert 8 Nov 2019

Multilingual BERT (mBERT) provides sentence representations for 104 languages, which are useful for many multilingual tasks.

Unsupervised Multilingual Alignment using Wasserstein Barycenter

alixxxin/multi-lang 28 Jan 2020

We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data.
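
The Wasserstein barycenter is the core primitive here. A hedged sketch of its entropy-regularized form via iterative Bregman projections is below (a standard textbook construction, not the paper's full multilingual pipeline; the grid and histograms are toy assumptions):

```python
# Entropy-regularized Wasserstein barycenter of k histograms.
import numpy as np

def sinkhorn_barycenter(hists, M, reg=0.1, n_iter=200, weights=None):
    """hists: (k, n) probability histograms; M: (n, n) ground cost.
    Returns the (n,) regularized barycenter via iterative Bregman
    projections (Benamou et al.-style fixed-point updates)."""
    k, n = hists.shape
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    Kmat = np.exp(-M / reg)
    v = np.ones((k, n))
    b = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        u = hists / (v @ Kmat.T)              # match each input marginal
        Ktu = u @ Kmat                        # rows hold K^T u_k
        b = np.exp(w @ np.log(Ktu + 1e-300))  # weighted geometric mean
        v = b / Ktu
    return b

# Toy usage: interpolate two peaked histograms on a 1-D grid.
x = np.linspace(0.0, 1.0, 50)
M = (x[:, None] - x[None, :]) ** 2
a = np.exp(-((x - 0.2) ** 2) / 0.001); a /= a.sum()
c = np.exp(-((x - 0.8) ** 2) / 0.001); c /= c.sum()
bary = sinkhorn_barycenter(np.stack([a, c]), M)
```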

On the Language Neutrality of Pre-trained Multilingual Representations

jlibovicky/assess-multilingual-bert Findings of the Association for Computational Linguistics 2020

Multilingual contextual embeddings, such as multilingual BERT and XLM-RoBERTa, have proved useful for many multilingual tasks.

Attention is Not Only a Weight: Analyzing Transformers with Vector Norms

gorokoba560/norm-analysis-of-transformer EMNLP 2020

Attention is a key component of Transformers, which have recently achieved considerable success in natural language processing.
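
The paper's central point is that a raw attention weight alpha_ij ignores the size of the value vector it scales; analyzing the norm of the weighted, transformed vector, ||alpha_ij f(x_j)||, can rank token contributions differently. A minimal sketch with toy tensors standing in for a real Transformer layer:

```python
# Norm-based attention analysis: compare attention weights alpha_ij with
# the norms of the weighted value vectors ||alpha_ij * f(x_j)||.
import torch

torch.manual_seed(0)
seq, dim = 5, 16
x = torch.randn(seq, dim)                         # token representations
W_v = torch.randn(dim, dim)                       # value projection: f(x) = x @ W_v
alpha = torch.softmax(torch.randn(seq, seq), -1)  # attention weights (rows sum to 1)

fx = x @ W_v                                      # transformed vectors f(x_j)
weighted = alpha.unsqueeze(-1) * fx.unsqueeze(0)  # alpha_ij * f(x_j), shape (i, j, dim)
norms = weighted.norm(dim=-1)                     # ||alpha_ij f(x_j)||, shape (i, j)

# A large weight on a near-zero value vector contributes little, so the
# norm matrix can single out different source tokens than the weights do.
print(alpha.argmax(-1) != norms.argmax(-1))
```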