1 code implementation • 3 May 2024 • Miruna Beţianu, Abele Mălan, Marco Aldinucci, Robert Birke, Lydia Chen
In this paper, we design DALLMi, Domain Adaptation Large Language Model interpolator, a first-of-its-kind semi-supervised domain adaptation method for LLM-based text models, specifically BERT.
no code implementations • 4 Apr 2024 • Aditya Shankar, Hans Brouwer, Rihan Hai, Lydia Chen
We introduce SiloFuse, a novel generative framework for high-quality synthesis from cross-silo tabular data.
3 code implementations • 19 Oct 2023 • Zilong Zhao, Robert Birke, Lydia Chen
Results show that Tabula reduces training time per epoch by 46.2% on average compared to the current LLM-based state-of-the-art algorithm, while consistently achieving even higher synthetic data utility.
no code implementations • 19 Sep 2021 • Ben Proven-Bessel, Zilong Zhao, Lydia Chen
No existing machine learning algorithm has been developed to create comic illustrations from descriptions of the illustrations or from the dialogue in comics.
no code implementations • 6 Jul 2021 • Aditya Kunar, Robert Birke, Zilong Zhao, Lydia Chen
Additionally, we rigorously evaluate the theoretical privacy guarantees offered by DP, testing them empirically against membership and attribute inference attacks.
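To illustrate the kind of empirical privacy test the entry above refers to, here is a minimal, self-contained sketch of a loss-threshold membership inference attack. All data, distributions, and thresholds below are synthetic assumptions for illustration only, not the paper's actual evaluation protocol.

```python
# Minimal sketch of a loss-threshold membership inference attack.
# Assumption: we can query a trained model for per-example losses;
# here those losses are simulated with synthetic Gaussian draws.
import random

random.seed(0)

# Hypothetical per-example losses: members (training data) tend to
# have lower loss than non-members (held-out data).
member_losses = [random.gauss(0.2, 0.1) for _ in range(1000)]
nonmember_losses = [random.gauss(0.6, 0.2) for _ in range(1000)]

def attack_accuracy(threshold, members, nonmembers):
    """Predict 'member' when loss < threshold; return balanced accuracy."""
    tpr = sum(l < threshold for l in members) / len(members)
    tnr = sum(l >= threshold for l in nonmembers) / len(nonmembers)
    return (tpr + tnr) / 2

# Sweep thresholds and keep the best attack accuracy; values near 0.5
# mean the attacker cannot distinguish members, i.e. stronger privacy.
best = max(attack_accuracy(t / 100, member_losses, nonmember_losses)
           for t in range(0, 101))
print(f"best membership-inference accuracy: {best:.2f}")
```

In a DP evaluation, one would compare this best-case attack accuracy on models trained with and without differential privacy: effective DP noise should push it toward 0.5.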