Search Results for author: Alec Alameddine

Found 1 paper, 0 papers with code

Tokenization Is More Than Compression

no code implementations • 28 Feb 2024 • Craig W. Schmidt, Varshini Reddy, Haoran Zhang, Alec Alameddine, Omri Uzan, Yuval Pinter, Chris Tanner

Tokenization is a foundational step in Natural Language Processing (NLP) tasks, bridging raw text and language models.

Data Compression
