Search Results for author: Chih-chan Tien

Found 2 papers, 2 papers with code

Transformers are efficient hierarchical chemical graph learners

1 code implementation • 2 Oct 2023 • Zihan Pengmei, Zimu Li, Chih-chan Tien, Risi Kondor, Aaron R. Dinner

We demonstrate SubFormer on benchmarks for predicting molecular properties from chemical structures and show that it is competitive with state-of-the-art graph transformers at a fraction of the computational cost, with training times on the order of minutes on a consumer-grade graphics card.

Graph Representation Learning

Bilingual alignment transfers to multilingual alignment for unsupervised parallel text mining

1 code implementation • ACL 2022 • Chih-chan Tien, Shane Steinert-Threlkeld

To study this theory, we design unsupervised models trained on unpaired sentences and single-pair supervised models trained on bitexts, both based on the unsupervised language model XLM-R with its parameters frozen.
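
As a concrete illustration of the setup described above, the sketch below shows one way to obtain sentence embeddings from a frozen XLM-R encoder and score candidate translation pairs by cosine similarity. The checkpoint name, mean pooling, and cosine scoring are illustrative assumptions, not a reconstruction of the paper's alignment method.

# Illustrative sketch only: frozen XLM-R sentence embeddings for parallel
# text retrieval. Checkpoint name and pooling choice are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

# Freeze the pretrained language model, as described in the abstract.
for param in model.parameters():
    param.requires_grad = False
model.eval()

def embed(sentences):
    """Mean-pool the final hidden states into fixed-size sentence vectors."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (batch, tokens, hidden)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (batch, tokens, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # (batch, hidden)

# Mine parallel pairs by cosine similarity across languages.
en = embed(["The cat sits on the mat."])
de = embed(["Die Katze sitzt auf der Matte."])
score = torch.nn.functional.cosine_similarity(en, de)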

Retrieval • Sentence +1
