Search Results for author: Dragan Milchevski

Found 3 papers, 1 paper with code

A Study on Entity Linking Across Domains: Which Data is Best for Fine-Tuning?

no code implementations • RepL4NLP (ACL) 2022 • Hassan Soliman, Heike Adel, Mohamed H. Gad-Elrab, Dragan Milchevski, Jannik Strötgen

In particular, we represent the entities of different KGs in a joint vector space and address the questions of which data is best suited for creating and fine-tuning that space, and whether fine-tuning harms performance on the general domain.

Entity Linking
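The joint-vector-space idea from the abstract can be illustrated with a minimal sketch: entities from different knowledge graphs are embedded into one shared space, and a mention is linked to its nearest neighbour there. The toy bag-of-words encoder, entity names, and descriptions below are illustrative assumptions, not the paper's actual model.

```python
import math
from collections import Counter

def embed(text, vocab):
    # Toy bag-of-words embedding; a real system would use a learned encoder.
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical entities from two different KGs, represented in one joint space.
entities = {
    "GeneralKG:Python_(language)": "python programming language software",
    "DomainKG:Python_(snake)": "python snake reptile animal",
}
vocab = sorted({w for d in entities.values() for w in d.lower().split()})
entity_vecs = {e: embed(d, vocab) for e, d in entities.items()}

def link(mention_context):
    # Link a mention to the most similar entity in the joint space.
    v = embed(mention_context, vocab)
    return max(entity_vecs, key=lambda e: cosine(v, entity_vecs[e]))

print(link("a programming language"))  # GeneralKG:Python_(language)
```

In the paper's setting, fine-tuning would adjust the encoder on domain-specific linking data; the question studied is which data to use for that step and whether it degrades general-domain performance.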

AnnoCTR: A Dataset for Detecting and Linking Entities, Tactics, and Techniques in Cyber Threat Reports

2 code implementations • 11 Apr 2024 • Lukas Lange, Marc Müller, Ghazaleh Haratinezhad Torbati, Dragan Milchevski, Patrick Grau, Subhash Pujari, Annemarie Friedrich

In our few-shot scenario, we find that for identifying the MITRE ATT&CK concepts that are mentioned explicitly or implicitly in a text, concept descriptions from MITRE ATT&CK are an effective source for training data augmentation.

Data Augmentation
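The augmentation strategy described in the abstract, using concept descriptions as extra training data in a few-shot setting, can be sketched roughly as follows. The concept IDs, descriptions, and `augment` helper here are illustrative stand-ins, not the dataset's actual contents or the authors' pipeline.

```python
# Hypothetical concept descriptions (stand-ins for MITRE ATT&CK entries).
concepts = {
    "T1566 Phishing": "Adversaries may send phishing messages to gain "
                      "access to victim systems.",
    "T1059 Command and Scripting Interpreter": "Adversaries may abuse command "
                                               "and script interpreters.",
}

# A few annotated (text, label) examples from cyber threat reports.
train = [
    ("The actor delivered malware via spearphishing emails.", "T1566 Phishing"),
]

def augment(train, concepts):
    # Treat each concept description as an additional labelled example,
    # enlarging the few-shot training set.
    return train + [(desc, label) for label, desc in concepts.items()]

augmented = augment(train, concepts)
print(len(augmented))  # 1 original + 2 description-based examples
```

A classifier trained on the augmented set thus sees at least one example per concept even when the original annotations cover only a few labels.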
