no code implementations • Findings (EMNLP) 2021 • Stalin Varanasi, Saadullah Amin, Guenter Neumann
There has been significant progress in the field of Extractive Question Answering (EQA) in recent years.
no code implementations • 14 Nov 2023 • Tanja Baeumel, Soniya Vijayakumar, Josef van Genabith, Guenter Neumann, Simon Ostermann
Pretrained language models (PLMs) form the basis of most state-of-the-art NLP technologies.
no code implementations • 4 Sep 2020 • Eleni Metheniti, Guenter Neumann, Josef van Genabith
Inflection is an essential part of every human language's morphology, yet little effort has been made to unify linguistic theory and computational methods in recent years.
no code implementations • WS 2020 • Stalin Varanasi, Saadullah Amin, Guenter Neumann
Contextualized word embeddings provide better initialization for neural networks that deal with various natural language understanding (NLU) tasks, including Question Answering (QA) and, more recently, Question Generation (QG).
no code implementations • LREC 2020 • Eleni Metheniti, Guenter Neumann
We evaluate a multilingual inflectional corpus with morpheme boundaries, generated from the English Wiktionary (Metheniti and Neumann, 2018), against the largest multilingual, high-quality inflectional corpus of the UniMorph project (Kirov et al., 2018).
no code implementations • WS 2019 • Dominik Stammbach, Guenter Neumann
This paper contains our system description for the second Fact Extraction and VERification (FEVER) challenge.
no code implementations • SEMEVAL 2019 • Dominik Stammbach, Stalin Varanasi, Guenter Neumann
Our submission for subtask A consists of a classifier fine-tuned from this BERT checkpoint.
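The abstract only names the general recipe (a classification head fine-tuned on top of a pretrained BERT checkpoint). The sketch below illustrates that pattern in generic PyTorch; the encoder stub, dimensions, and class names are illustrative assumptions, not the paper's actual model.

```python
import torch
from torch import nn

class TinyEncoder(nn.Module):
    """Stand-in for a pretrained transformer encoder (illustrative only)."""
    def __init__(self, vocab_size=1000, hidden_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)

    def forward(self, input_ids):
        # Mean-pool token embeddings as a crude [CLS]-style sentence summary.
        return self.embed(input_ids).mean(dim=1)

class Classifier(nn.Module):
    """Pretrained encoder + linear classification head, the usual
    fine-tuning setup for a sentence-level subtask."""
    def __init__(self, encoder, hidden_size=64, num_labels=2):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids):
        return self.head(self.encoder(input_ids))

model = Classifier(TinyEncoder())
logits = model(torch.randint(0, 1000, (4, 16)))  # batch of 4, 16 tokens each
```

In practice the encoder would be a real BERT checkpoint (e.g. via the Hugging Face `transformers` library) and both encoder and head would be updated jointly during fine-tuning.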
no code implementations • EACL 2017 • Georg Heigold, Guenter Neumann, Josef van Genabith
This paper investigates neural character-based morphological tagging for languages with complex morphology and large tag sets.
1 code implementation • 21 Jun 2016 • Georg Heigold, Guenter Neumann, Josef van Genabith
We systematically explore a variety of neural architectures (DNN, CNN, CNNHighway, LSTM, BLSTM) to obtain character-based word vectors combined with bidirectional LSTMs to model across-word context in an end-to-end setting.
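The description above (character-based word vectors fed into a bidirectional LSTM that models across-word context) can be sketched as follows. This is a minimal illustration of that architecture family using the BLSTM variant; all dimensions and names are assumptions, not the paper's configuration.

```python
import torch
from torch import nn

class CharWordTagger(nn.Module):
    """Character BLSTM builds a vector per word; a word-level BLSTM then
    models across-word context and emits one tag distribution per word."""
    def __init__(self, n_chars=100, char_dim=16, word_dim=32,
                 ctx_dim=32, n_tags=10):
        super().__init__()
        self.char_embed = nn.Embedding(n_chars, char_dim)
        # Character-level BLSTM: final states of both directions are
        # concatenated into a word vector (the paper also explores
        # DNN/CNN/CNNHighway alternatives for this component).
        self.char_lstm = nn.LSTM(char_dim, word_dim // 2,
                                 bidirectional=True, batch_first=True)
        # Word-level BLSTM over the sentence for across-word context.
        self.word_lstm = nn.LSTM(word_dim, ctx_dim // 2,
                                 bidirectional=True, batch_first=True)
        self.out = nn.Linear(ctx_dim, n_tags)

    def forward(self, char_ids):
        # char_ids: (batch, n_words, n_chars) of character indices
        b, w, c = char_ids.shape
        chars = self.char_embed(char_ids.view(b * w, c))
        _, (h, _) = self.char_lstm(chars)          # h: (2, b*w, word_dim//2)
        word_vecs = torch.cat([h[0], h[1]], dim=-1).view(b, w, -1)
        ctx, _ = self.word_lstm(word_vecs)         # (b, w, ctx_dim)
        return self.out(ctx)                       # tag logits per word

tagger = CharWordTagger()
logits = tagger(torch.randint(0, 100, (2, 5, 8)))  # 2 sentences, 5 words, 8 chars
```

Training end-to-end means gradients from the tag loss flow through both LSTMs down to the character embeddings, so no hand-built morphological features are needed.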