Search Results for author: Derrick Tate

Found 1 paper, 1 paper with code

Sentence Embedding Models for Ancient Greek Using Multilingual Knowledge Distillation

2 code implementations • 24 Aug 2023 • Kevin Krahn, Derrick Tate, Andrew C. Lamicela

In this work, we use a multilingual knowledge distillation approach to train BERT models to produce sentence embeddings for Ancient Greek text.
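The core idea of multilingual knowledge distillation is that a frozen teacher encoder embeds source-language sentences, and a student encoder is trained so that its embedding of each translation matches the teacher's embedding of the corresponding source sentence. A minimal numpy sketch of that objective is below; the linear "encoders", dimensions, and synthetic parallel data are illustrative assumptions, not the paper's actual BERT-based architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "parallel corpus": each row of `src` stands in for an English sentence's
# features and the matching row of `tgt` for its Ancient Greek translation
# (same meaning, different surface form).
dim_in, dim_emb, n_pairs = 8, 4, 64
src = rng.normal(size=(n_pairs, dim_in))
surface = rng.normal(size=(dim_in, dim_in))   # surface-form change across languages
tgt = src @ surface

# Frozen teacher encoder (standing in for a multilingual sentence encoder).
W_teacher = rng.normal(size=(dim_in, dim_emb))
teacher_emb = src @ W_teacher                 # fixed distillation targets

# Student encoder: trained so student(translation) ~ teacher(source).
W_student = np.zeros((dim_in, dim_emb))

def mse(W):
    """Distillation loss: MSE between student and teacher embeddings."""
    return ((tgt @ W - teacher_emb) ** 2).mean()

# Step size from the spectral norm of the inputs, to keep plain GD stable.
n_terms = n_pairs * dim_emb
lr = n_terms / (2.0 * np.linalg.norm(tgt, ord=2) ** 2)

init_loss = mse(W_student)
for _ in range(2000):
    err = tgt @ W_student - teacher_emb
    W_student -= lr * (2.0 * tgt.T @ err / n_terms)   # gradient of the MSE
final_loss = mse(W_student)
print(f"distillation MSE: {init_loss:.3f} -> {final_loss:.3f}")
```

After training, the student maps translations near the teacher's embedding space, which is what lets a model for a low-resource language like Ancient Greek inherit a multilingual teacher's sentence-level semantics.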

Tasks: Authorship Attribution • Knowledge Distillation • +11
