Morphological Inflection
37 papers with code • 0 benchmarks • 1 dataset
Morphological Inflection is the task of generating a target word (an inflected form) from a source word (a base form), given a set of morphological attributes such as number, tense, and person. It is useful for alleviating data sparsity issues when translating morphologically rich languages. The transformation from a base form to an inflected form usually involves concatenating the base form with a prefix or a suffix and substituting some characters. For example, the inflected form of the Finnish stem eläkeikä (retirement age) is eläkei'ittä when the case is abessive and the number is plural.
Source: Tackling Sequence to Sequence Mapping Problems with Neural Networks
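The transformations described above (suffix concatenation plus character substitution) can be illustrated with a minimal sketch. The rule table below is hypothetical toy data for English, not drawn from any of the listed papers, and the `V;PST` / `N;PL` feature labels merely imitate UniMorph-style tags:

```python
# Toy illustration of morphological inflection as string transformation:
# each feature bundle maps to a rule that concatenates a suffix and/or
# substitutes characters. The rules here are illustrative, not real
# linguistic resources.

def inflect(lemma: str, features: str) -> str:
    """Apply a hypothetical inflection rule for the given feature bundle."""
    rules = {
        # English regular past tense: concatenate the suffix "-ed"
        "V;PST": lambda s: s + "ed",
        # English regular plural: substitute final "y" -> "ies", else add "s"
        "N;PL": lambda s: s[:-1] + "ies" if s.endswith("y") else s + "s",
    }
    return rules[features](lemma)

print(inflect("walk", "V;PST"))  # walked
print(inflect("city", "N;PL"))   # cities
```

Neural approaches learn such mappings at the character level from (lemma, features, inflected form) triples rather than from hand-written rules.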
Benchmarks
These leaderboards are used to track progress in Morphological Inflection
Most implemented papers
Interpretability for Morphological Inflection: from Character-level Predictions to Subword-level Rules
We apply our methodology to analyze the model's decisions on three typologically-different languages and find that a) our pattern extraction method applied to cross-attention weights uncovers variation in form of inflection morphemes, b) pattern extraction from self-attention shows triggers for such variation, c) both types of patterns are closely aligned with grammar inflection classes and class assignment criteria, for all three languages.
On Biasing Transformer Attention Towards Monotonicity
Many sequence-to-sequence tasks in natural language processing are roughly monotonic in the alignment between source and target sequence, and previous work has facilitated or enforced learning of monotonic attention behavior via specialized attention functions or pretraining.
Minimal Supervision for Morphological Inflection
Neural models for the various flavours of morphological inflection tasks have proven to be extremely accurate given ample labeled data -- data that may be slow and costly to obtain.
Falling Through the Gaps: Neural Architectures as Models of Morphological Rule Learning
We evaluate the Transformer as a model of morphological rule learning and compare it with Recurrent Neural Networks (RNN) on English, German, and Russian.
(Un)solving Morphological Inflection: Lemma Overlap Artificially Inflates Models' Performance
The effect is most significant for low-resourced languages with a drop as high as 95 points, but even high-resourced languages lose about 10 points on average.
Rule-based Morphological Inflection Improves Neural Terminology Translation
Current approaches to incorporating terminology constraints in machine translation (MT) typically assume that the constraint terms are provided in their correct morphological forms.
Eeny, meeny, miny, moe. How to choose data for morphological inflection
In this paper, we explore four sampling strategies for the task of morphological inflection using a Transformer model: a pair of oracle experiments where data is chosen based on whether the model already can or cannot inflect the test forms correctly, as well as strategies based on high/low model confidence, entropy, and random selection.
A Framework for Bidirectional Decoding: Case Study in Morphological Inflection
Transformer-based encoder-decoder models that generate outputs in a left-to-right fashion have become standard for sequence-to-sequence tasks.
Understanding Compositional Data Augmentation in Typologically Diverse Morphological Inflection
In this study, we aim to shed light on the theoretical aspects of the prominent data augmentation strategy StemCorrupt (Silfverberg et al., 2017; Anastasopoulos and Neubig, 2019), a method that generates synthetic examples by randomly substituting stem characters in gold standard training examples.
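The core idea of StemCorrupt, as described above, is to substitute random stem characters in gold training pairs. A minimal sketch of that idea follows; the longest-common-prefix stem approximation and the function name are assumptions for illustration, not the exact procedure of the cited papers:

```python
import random

def stem_corrupt(lemma: str, inflected: str, alphabet: str, n_swaps: int = 1):
    """Sketch of StemCorrupt-style augmentation (simplified assumption):
    substitute random characters in the shared stem of a (lemma, inflected)
    pair, applying the same substitution to both strings so the synthetic
    pair stays consistent."""
    # Approximate the stem as the longest common prefix (a simplification;
    # the cited work handles stem identification more carefully).
    stem_len = 0
    while (stem_len < min(len(lemma), len(inflected))
           and lemma[stem_len] == inflected[stem_len]):
        stem_len += 1
    lemma_chars, infl_chars = list(lemma), list(inflected)
    for _ in range(n_swaps):
        if stem_len == 0:
            break
        i = random.randrange(stem_len)          # pick a stem position
        c = random.choice(alphabet)             # pick a replacement symbol
        lemma_chars[i] = infl_chars[i] = c      # substitute in both strings
    return "".join(lemma_chars), "".join(infl_chars)

random.seed(0)
print(stem_corrupt("walk", "walked", "abcdefghijklmnopqrstuvwxyz"))
```

Because the substitution leaves the affix untouched, the synthetic pair still exemplifies the same inflection pattern, which is what makes the augmentation useful.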
Morphological Inflection: A Reality Check
Morphological inflection is a popular task in sub-word NLP with both practical and cognitive applications.