no code implementations • COLING 2022 • Lennart Wachowiak, Dagmar Gromann
We propose to build on the recent success of large multilingual pretrained language models and a small dataset of examples from image schema literature to train a supervised classifier that classifies natural language expressions of varying lengths into image schemas.
1 code implementation • 8 Mar 2024 • Lennart Wachowiak, Andrew Coles, Oya Celiktutan, Gerard Canal
We find that GPT-4 strongly outperforms other models, generating answers that correlate closely with users' answers in two studies: the first study dealing with selecting the most appropriate communicative act for a robot in various situations ($r_s$ = 0.82), and the second with judging the desirability, intentionality, and surprisingness of behavior ($r_s$ = 0.83).
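The reported $r_s$ values are Spearman rank correlations between model outputs and user answers. A minimal pure-Python sketch of how such a coefficient is computed (not the authors' evaluation code) is to rank both variables and take the Pearson correlation of the ranks:

```python
from statistics import mean

def rank(values):
    # Assign ranks (1-based); tied values receive the average of their ranks.
    sorted_vals = sorted(values)
    ranks = []
    for v in values:
        first = sorted_vals.index(v) + 1
        count = sorted_vals.count(v)
        ranks.append(first + (count - 1) / 2)
    return ranks

def spearman(x, y):
    # Spearman's r_s = Pearson correlation computed on the ranks of x and y.
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

For example, a perfectly monotonic relationship yields $r_s$ = 1.0, and a perfectly inverted one yields $r_s$ = -1.0; in practice one would use `scipy.stats.spearmanr` rather than this hand-rolled version.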
no code implementations • 1 Feb 2024 • Philipp Wicke, Lennart Wachowiak
Surprisingly, correlations between model outputs and human responses emerge, revealing adaptability without a tangible connection to embodied experiences.
1 code implementation • NeurIPS Data-Centric AI Workshop 2021 • Christian Lang, Lennart Wachowiak, Barbara Heinisch, Dagmar Gromann
Predicting lexical-semantic relations between word pairs has been successfully accomplished by pre-trained neural language models.
1 code implementation • 3rd Conference on Language, Data and Knowledge 2021 • Lennart Wachowiak, Christian Lang, Barbara Heinisch, Dagmar Gromann
Terminological Concept Systems (TCS) provide a means of organizing, structuring and representing domain-specific multilingual information and are important to ensure terminological consistency in many tasks, such as translation and cross-border communication.
1 code implementation • 12 Dec 2020 • Lennart Wachowiak, Christian Lang, Barbara Heinisch, Dagmar Gromann
We describe our submission to the CogALex-VI shared task on the identification of multilingual paradigmatic relations, building on XLM-RoBERTa (XLM-R), a robustly optimized, multilingual BERT model.
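The general recipe here is fine-tuning a multilingual pretrained encoder as a sequence classifier over word pairs. A minimal sketch of that setup with the Hugging Face `transformers` library is shown below; the label set and input template are assumptions for illustration, not the authors' actual configuration:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed paradigmatic-relation labels; the shared task's exact label
# inventory and encoding may differ.
LABELS = ["synonym", "antonym", "hypernym", "random"]

def build_classifier(model_name="xlm-roberta-base"):
    # Load XLM-R with a freshly initialized classification head,
    # one output logit per relation label.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=len(LABELS)
    )
    return tokenizer, model

def encode_pair(tokenizer, word_a, word_b):
    # Encode the word pair as a two-segment input for the classifier.
    return tokenizer(word_a, word_b, return_tensors="pt")
```

Fine-tuning then proceeds as standard supervised training (e.g. via `transformers.Trainer`) on labeled word pairs; the multilingual pretraining is what allows a single model to cover all task languages.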