Morphological Inflection
37 papers with code • 0 benchmarks • 1 dataset
Morphological Inflection is the task of generating a target word (the inflected form) from a source word (the base form), given a morphological attribute such as number, tense, or person. It is useful for alleviating data sparsity issues when translating morphologically rich languages. The transformation from a base form to an inflected form usually involves attaching a prefix or suffix to the base form and substituting some characters. For example, the inflected form of the Finnish stem eläkeikä (retirement age) is eläkeiittä when the case is abessive and the number is plural.
Source: Tackling Sequence to Sequence Mapping Problems with Neural Networks
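The transformation described above can be sketched as a rule-based string edit: strip part of the stem, then attach a suffix selected by the morphological features. The rule table below is a toy illustration for the single Finnish example given, not a real morphological analyzer; the function name and feature keys are assumptions for the sketch.

```python
# Toy sketch: inflection as stem change + suffixation, keyed on features.
# The single rule below only covers the eläkeikä example from the text.

def inflect(lemma: str, features: dict) -> str:
    """Apply a hypothetical suffix rule selected by (case, number)."""
    # (case, number) -> (stem transformation, suffix)
    rules = {
        ("abessive", "plural"): (lambda s: s[:-2], "ittä"),
    }
    key = (features.get("case"), features.get("number"))
    if key not in rules:
        return lemma  # fall back to the base form
    stem_fn, suffix = rules[key]
    return stem_fn(lemma) + suffix

print(inflect("eläkeikä", {"case": "abessive", "number": "plural"}))
# -> eläkeiittä
```

Neural approaches replace such hand-written rules with a sequence-to-sequence model that learns the character edits from annotated lemma–form pairs.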
Benchmarks
These leaderboards are used to track progress in Morphological Inflection
Latest papers with no code
OOVs in the Spotlight: How to Inflect them?
For testing in OOV conditions, we automatically extracted a large dataset of nouns in the morphologically rich Czech language, with lemma-disjoint data splits, and we further manually annotated a real-world OOV dataset of neologisms.
Exploring Linguistic Probes for Morphological Generalization
Modern work on the cross-linguistic computational modeling of morphological inflection has typically employed language-independent data splitting algorithms.
Autoregressive Modeling with Lookahead Attention
To predict the next token, autoregressive models ordinarily examine the past.
Modeling the Graphotactics of Low-Resource Languages Using Sequential GANs
Generative Adversarial Networks (GANs) have been shown to aid in the creation of artificial data in situations where large amounts of real data are difficult to come by.
A Comprehensive Comparison of Neural Networks as Cognitive Models of Inflection
Neural networks have long been at the center of a debate around the cognitive mechanism by which humans process inflectional morphology.
UniMorph 4.0: Universal Morphology
The project comprises two major thrusts: a language-independent feature schema for rich morphological annotation and a type-level resource of annotated data in diverse languages realizing that schema.
How do we get there? Evaluating transformer neural networks as cognitive models for English past tense inflection
Neural network models have achieved good performance on morphological inflection tasks, including English past tense inflection.
A Three Step Training Approach with Data Augmentation for Morphological Inflection
We present the BME submission for the SIGMORPHON 2021 Task 0 Part 1, Generalization Across Typologically Diverse Languages shared task.
Do RNN States Encode Abstract Phonological Alternations?
Sequence-to-sequence models have delivered impressive results in word formation tasks such as morphological inflection, often learning to model subtle morphophonological details with limited training data.
Can a Transformer Pass the Wug Test? Tuning Copying Bias in Neural Morphological Inflection Models
Deep learning sequence models have been successfully applied to the task of morphological inflection.