1 code implementation • 2 Feb 2024 • Menghua Wu, Yujia Bao, Regina Barzilay, Tommi Jaakkola
Causal discovery, the task of inferring causal structure from data, promises to accelerate scientific research, inform policy making, and more.
1 code implementation • 28 Sep 2023 • Yujia Bao, Srinivasan Sivanandan, Theofanis Karaletsos
We evaluate the performance of ChannelViT on ImageNet, JUMP-CP (microscopy cell imaging), and So2Sat (satellite imaging).
1 code implementation • 30 May 2023 • Yujia Bao, Theofanis Karaletsos
Additionally, we introduce a context inference network that predicts these context tokens on the fly from a batch of samples drawn from the group.
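A minimal sketch of the idea, assuming a hypothetical linear projection head (`W`, `b`) as the context inference network: embeddings from a batch of samples in one group are pooled into a group summary, projected into a single context token, and that token is prepended to every sample's token sequence. The shapes and parameter names are illustrative, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def infer_context_token(batch_embeddings, W, b):
    """Pool embeddings from one group's batch and predict a context token.

    batch_embeddings: (batch, tokens, dim) sample embeddings for the group.
    W, b: parameters of a hypothetical linear projection head (an assumption;
    the paper's context inference network may differ).
    """
    pooled = batch_embeddings.mean(axis=(0, 1))   # (dim,) group summary
    return pooled @ W + b                         # (dim,) context token

def prepend_context(tokens, context_token):
    """Prepend the shared context token to every sample's token sequence."""
    batch = tokens.shape[0]
    ctx = np.broadcast_to(context_token, (batch, 1, context_token.shape[-1]))
    return np.concatenate([ctx, tokens], axis=1)

dim = 8
tokens = rng.normal(size=(4, 16, dim))   # 4 samples, 16 patch tokens each
W = rng.normal(size=(dim, dim)) * 0.1
b = np.zeros(dim)

ctx = infer_context_token(tokens, W, b)
augmented = prepend_context(tokens, ctx)
print(augmented.shape)  # (4, 17, 8): one extra token per sample
```

Because the context token is computed from the whole batch, every sample in the group sees the same extra token, which is what lets the downstream model condition on group-level context.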
1 code implementation • 28 Apr 2022 • Yujia Bao, Regina Barzilay
Classifiers are biased when trained on biased datasets.
1 code implementation • 15 Jun 2021 • Yujia Bao, Shiyu Chang, Regina Barzilay
Empirical results demonstrate that our algorithm maintains robustness on the target task in both synthetically generated and real-world environments.
1 code implementation • 26 May 2021 • Yujia Bao, Shiyu Chang, Regina Barzilay
In this work, we prove that by interpolating the distributions of the correct predictions and the wrong predictions, we can uncover an oracle distribution where the unstable correlation vanishes.
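The interpolation step can be sketched numerically. In this toy version (all values hypothetical, not from the paper), samples are partitioned by whether a reference classifier predicted them correctly, and an interpolated distribution puts weight `lam` on the correct partition and `1 - lam` on the wrong one; training against the worst-case `lam` is the minimax idea the snippet alludes to.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: per-sample losses of the model being trained, and a mask marking
# which samples a reference classifier got right (hypothetical values).
losses = rng.uniform(0.0, 1.0, size=100)
correct = rng.random(100) < 0.7  # True where the reference model was correct

def interpolated_risk(losses, correct, lam):
    """Risk under a mixture distribution: weight lam on the correctly
    predicted partition, 1 - lam on the wrongly predicted partition,
    uniform within each partition."""
    risk_correct = losses[correct].mean()
    risk_wrong = losses[~correct].mean()
    return lam * risk_correct + (1.0 - lam) * risk_wrong

# Worst-case interpolation coefficient over a grid of candidate values.
lams = np.linspace(0.0, 1.0, 11)
worst = max(interpolated_risk(losses, correct, l) for l in lams)
print(round(worst, 4))
```

Since the mixture risk is linear in `lam`, the worst case sits at one of the endpoints, i.e. at whichever partition has the higher average loss; a minimax learner therefore focuses on the partition where the unstable correlation hurts most.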
2 code implementations • ICLR 2020 • Yujia Bao, Menghua Wu, Shiyu Chang, Regina Barzilay
In this paper, we explore meta-learning for few-shot text classification.
no code implementations • 24 Apr 2019 • Yujia Bao, Zhengyi Deng, Yan Wang, Heeyoon Kim, Victor Diego Armengol, Francisco Acevedo, Nofal Ouardaoui, Cathy Wang, Giovanni Parmigiani, Regina Barzilay, Danielle Braun, Kevin S. Hughes
We developed and evaluated two machine learning models to classify abstracts as relevant to the penetrance (risk of cancer for germline mutation carriers) or prevalence of germline genetic mutations.
3 code implementations • EMNLP 2018 • Yujia Bao, Shiyu Chang, Mo Yu, Regina Barzilay
Attention-based models are successful when trained on large amounts of data.