Search Results for author: Tianduo Wang

Found 3 papers, 3 papers with code

TinyLlama: An Open-Source Small Language Model

2 code implementations · 4 Jan 2024 · Peiyuan Zhang, Guangtao Zeng, Tianduo Wang, Wei Lu

We present TinyLlama, a compact 1.1B language model pretrained on around 1 trillion tokens for approximately 3 epochs.

Computational Efficiency · Language Modelling
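As a rough illustration of working with a released TinyLlama checkpoint (not taken from the paper), the sketch below loads the model through Hugging Face transformers; the checkpoint id used here is an assumption and should be swapped for whichever release you need.

```python
# Minimal sketch: text generation with a TinyLlama checkpoint via transformers.
# The model id below is an assumed example, not necessarily the paper's release.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "TinyLlama is a compact language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```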

Differentiable Data Augmentation for Contrastive Sentence Representation Learning

1 code implementation · 29 Oct 2022 · Tianduo Wang, Wei Lu

Fine-tuning a pre-trained language model via the contrastive learning framework with a large number of unlabeled sentences or labeled sentence pairs is a common way to obtain high-quality sentence representations.

Contrastive Learning · Data Augmentation · +3
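To make the contrastive fine-tuning setup mentioned above concrete, here is a generic SimCSE-style in-batch contrastive loss sketch; it is an illustrative example of the general framework, not the paper's differentiable data augmentation method, and the temperature and embedding sizes are arbitrary assumptions.

```python
# Generic in-batch contrastive loss over sentence embeddings (illustrative only).
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views of the same sentences."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Similarity of every sentence in view 1 to every sentence in view 2;
    # the matching pair for each row sits on the diagonal.
    sim = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(sim, labels)

# Example with random tensors standing in for encoder outputs.
z1, z2 = torch.randn(8, 256), torch.randn(8, 256)
print(contrastive_loss(z1, z2).item())
```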
