Morphological Inflection

37 papers with code • 0 benchmarks • 1 dataset

Morphological Inflection is the task of generating a target word (an inflected form) from a source word (a base form), given morphological attributes such as number, tense, and person. It is useful for alleviating data sparsity issues when translating morphologically rich languages. The transformation from a base form to an inflected form usually involves concatenating the base form with a prefix or suffix and substituting some characters. For example, the inflected form of the Finnish stem eläkeikä (retirement age) is eläkei'ittä when the case is abessive and the number is plural.

Source: Tackling Sequence to Sequence Mapping Problems with Neural Networks
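The transformation described above can be illustrated with a toy rule-based inflector. The rules below are hypothetical examples for English past-tense formation, not from the cited paper; real systems learn such mappings from data.

```python
# Toy illustration of inflection as character substitution plus suffix
# concatenation. The substitution rules here are hypothetical examples.

def inflect(base: str, substitutions: dict[str, str], suffix: str) -> str:
    """Apply a stem-final substitution (if any matches), then add a suffix."""
    for old, new in substitutions.items():
        if base.endswith(old):
            base = base[: -len(old)] + new
            break
    return base + suffix

# English regular past tense: "walk" -> "walked", "carry" -> "carried"
print(inflect("walk", {}, "ed"))           # walked
print(inflect("carry", {"y": "i"}, "ed"))  # carried
```

Neural approaches replace such hand-written rules with a sequence-to-sequence model conditioned on the morphological attributes.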

Falling Through the Gaps: Neural Architectures as Models of Morphological Rule Learning

denizbeser/gaps 8 May 2021

We evaluate the Transformer as a model of morphological rule learning and compare it with Recurrent Neural Networks (RNNs) on English, German, and Russian.


Minimal Supervision for Morphological Inflection

onlplab/morphodetection EMNLP 2021

Neural models for the various flavours of morphological inflection tasks have proven to be extremely accurate given ample labeled data -- data that may be slow and costly to obtain.

17 Apr 2021

On Biasing Transformer Attention Towards Monotonicity

ZurichNLP/monotonicity_loss NAACL 2021

Many sequence-to-sequence tasks in natural language processing are roughly monotonic in the alignment between source and target sequence, and previous work has facilitated or enforced learning of monotonic attention behavior via specialized attention functions or pretraining.
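One simple way to quantify (and thus penalize) non-monotonic attention is to track the expected attended source position across target steps and charge any backward movement. This is a minimal sketch of the general idea, not the specific loss from the cited paper.

```python
import numpy as np

# Sketch of a monotonicity penalty on an attention matrix of shape
# (target_len, source_len), where attn[t] is the attention distribution
# over source positions at target step t. Illustrative formulation only.

def monotonicity_penalty(attn: np.ndarray) -> float:
    """Penalize the expected attended source position moving backwards."""
    positions = np.arange(attn.shape[1])
    expected = attn @ positions       # expected source index per target step
    backsteps = np.diff(expected)     # change between consecutive steps
    return float(np.clip(-backsteps, 0.0, None).sum())

# A perfectly monotonic (diagonal) alignment incurs zero penalty.
print(monotonicity_penalty(np.eye(3)))  # 0.0
```

Adding such a term to the training loss biases the model toward monotonic alignments without hard-constraining the attention function.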

08 Apr 2021

Interpretability for Morphological Inflection: from Character-level Predictions to Subword-level Rules

tatyana-ruzsics/interpretable-inflection EACL 2021

We apply our methodology to analyze the model's decisions on three typologically different languages and find that (a) our pattern extraction method applied to cross-attention weights uncovers variation in the form of inflection morphemes, (b) pattern extraction from self-attention shows triggers for such variation, and (c) both types of patterns are closely aligned with grammatical inflection classes and class-assignment criteria, for all three languages.

01 Apr 2021

Smoothing and Shrinking the Sparse Seq2Seq Search Space

deep-spin/S7 NAACL 2021

Current sequence-to-sequence models are trained to minimize cross-entropy and use softmax to compute the locally normalized probabilities over target sequences.
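The locally normalized objective mentioned above can be sketched in a few lines: at each target step, a softmax over the vocabulary yields a probability distribution, and the loss is the negative log-probability of the gold token, summed over steps. This is a generic illustration of softmax cross-entropy, not the paper's proposed alternative.

```python
import numpy as np

# Minimal sketch of locally normalized sequence cross-entropy:
# softmax per target step, then -log p(gold token) summed over steps.

def softmax(logits: np.ndarray) -> np.ndarray:
    z = logits - logits.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sequence_cross_entropy(step_logits: np.ndarray, gold: list[int]) -> float:
    """step_logits has shape (target_len, vocab_size); gold lists token ids."""
    return float(sum(-np.log(softmax(l)[g]) for l, g in zip(step_logits, gold)))

logits = np.array([[2.0, 0.0, 0.0],   # step 1: gold token 0 is favoured
                   [0.0, 3.0, 0.0]])  # step 2: gold token 1 is favoured
print(sequence_cross_entropy(logits, [0, 1]))
```

Because each step is normalized independently, probability mass is spread over the full vocabulary at every position; sparse alternatives to softmax aim to concentrate that mass.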

18 Mar 2021

SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection

sigmorphon2020/task0-data WS 2020

Systems were developed using data from 45 languages and just 5 language families, fine-tuned with data from an additional 45 languages and 10 language families (13 in total), and evaluated on all 90 languages.

20 Jun 2020

Applying the Transformer to Character-level Transduction

shijie-wu/neural-transducer EACL 2021

The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks.

20 May 2020

CAMeL Tools: An Open Source Python Toolkit for Arabic Natural Language Processing

CAMeL-Lab/camel_tools LREC 2020

We present CAMeL Tools, a collection of open-source tools for Arabic natural language processing in Python.

01 May 2020

Mind Your Inflections! Improving NLP for Non-Standard Englishes with Base-Inflection Encoding

salesforce/bite EMNLP 2020

Inflectional variation is a common feature of World Englishes such as Colloquial Singapore English and African American Vernacular English.

30 Apr 2020

A Latent Morphology Model for Open-Vocabulary Neural Machine Translation

d-ataman/lmm ICLR 2020

Translation into morphologically-rich languages challenges neural machine translation (NMT) models with extremely sparse vocabularies where atomic treatment of surface forms is unrealistic.

30 Oct 2019