no code implementations • 8 Mar 2024 • Warren Morningstar, Alex Bijamov, Chris Duvarney, Luke Friedman, Neha Kalibhat, Luyang Liu, Philip Mansfield, Renan Rojas-Gomez, Karan Singhal, Bradley Green, Sushant Prakash
We study the relative effects of data augmentations, pretraining algorithms, and model architectures in Self-Supervised Learning (SSL).
no code implementations • 21 Feb 2024 • Lin Ning, Luyang Liu, Jiaxing Wu, Neo Wu, Devora Berlowitz, Sushant Prakash, Bradley Green, Shawn O'Banion, Jun Xie
Large language models (LLMs) have revolutionized natural language processing.
no code implementations • 12 Jan 2024 • Yae Jee Cho, Luyang Liu, Zheng Xu, Aldi Fahrezi, Gauri Joshi
Foundation models (FMs) adapt well to specific domains or tasks with fine-tuning, and federated learning (FL) enables privacy-preserving fine-tuning of FMs with on-device local data.
no code implementations • 7 Jan 2024 • Syed Irfan Ali Meerza, Luyang Liu, Jiaxin Zhang, Jian Liu
Specifically, we use constrained optimization to enforce local fairness on the client side and adopt a fairness-aware, clustering-based aggregation on the server to ensure global model fairness across different sensitive groups while maintaining high utility.
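A minimal, hypothetical sketch of the server-side step described above, assuming each client reports a model update together with a locally measured fairness gap; the clustering choice, weighting, and function names are illustrative, not the paper's implementation.

```python
# Hypothetical sketch (not the paper's implementation): fairness-aware,
# clustering-based aggregation. The server clusters clients by their reported
# fairness gap and averages updates so no single cluster dominates the model.
import numpy as np
from sklearn.cluster import KMeans

def fairness_aware_aggregate(updates, fairness_gaps, n_clusters=3):
    """updates: list of 1-D np.ndarray model deltas; fairness_gaps: list of floats."""
    gaps = np.asarray(fairness_gaps).reshape(-1, 1)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(gaps)
    cluster_means = []
    for c in range(n_clusters):
        members = [u for u, l in zip(updates, labels) if l == c]
        if members:
            cluster_means.append(np.mean(members, axis=0))
    # Equal weight per cluster, so high-disparity groups are not averaged away.
    return np.mean(cluster_means, axis=0)
```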
no code implementations • 2 Dec 2023 • Neha Kalibhat, Warren Morningstar, Alex Bijamov, Luyang Liu, Karan Singhal, Philip Mansfield
We define augmentations in frequency space called Fourier Domain Augmentations (FDA) and show that training SSL models on a combination of these and image augmentations can improve the downstream classification accuracy by up to 1.3% on ImageNet-1K.
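To illustrate what an augmentation in frequency space can look like, here is a toy sketch that jitters an image's amplitude spectrum while preserving its phase; the specific transforms in the paper may differ, and the function name and parameters are assumptions.

```python
# Illustrative frequency-space augmentation in the spirit of FDA (not the
# paper's exact recipe): FFT each channel, perturb the amplitude spectrum,
# keep the phase, then invert the FFT.
import numpy as np

def fourier_amplitude_jitter(img, strength=0.1, rng=np.random.default_rng()):
    """img: float array of shape (H, W, C) with values in [0, 1]."""
    out = np.empty_like(img)
    for c in range(img.shape[-1]):
        spec = np.fft.fft2(img[..., c])
        amp, phase = np.abs(spec), np.angle(spec)
        amp *= 1.0 + strength * rng.standard_normal(amp.shape)  # jitter amplitudes
        out[..., c] = np.real(np.fft.ifft2(amp * np.exp(1j * phase)))
    return np.clip(out, 0.0, 1.0)
```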
no code implementations • 11 Apr 2023 • Yue Cui, Syed Irfan Ali Meerza, Zhuohang Li, Luyang Liu, Jiaxin Zhang, Jian Liu
In this paper, we seek to reconcile utility and privacy in FL by proposing a user-configurable privacy defense, RecUP-FL, that can better focus on the user-specified sensitive attributes while obtaining significant improvements in utility over traditional defenses.
3 code implementations • 3 Dec 2022 • Samiul Alam, Luyang Liu, Ming Yan, Mi Zhang
Most cross-device federated learning (FL) studies focus on the model-homogeneous setting where the global server model and local client models are identical.
1 code implementation • CVPR 2022 • Zhuohang Li, Jiaxin Zhang, Luyang Liu, Jian Liu
The Federated Learning (FL) framework brings privacy benefits to distributed learning systems by allowing multiple clients to participate in a learning task under the coordination of a central server without exchanging their private data.
no code implementations • 4 Feb 2022 • Luyang Liu, David Racz, Kara Vaillancourt, Julie Michelman, Matt Barnes, Stefan Mellem, Paul Eastham, Bradley Green, Charles Armstrong, Rishi Bal, Shawn O'Banion, Feng Guo
Hard-braking events have been widely used as a safety surrogate due to their relatively high prevalence and ease of detection with embedded vehicle sensors.
no code implementations • 23 Sep 2021 • Zewen Chi, Heyan Huang, Luyang Liu, Yu Bai, Xian-Ling Mao
The success of pretrained cross-lingual language models relies on two essential abilities: generalization for learning downstream tasks in a source language, and cross-lingual transferability for transferring task knowledge to other languages.
2 code implementations • 14 Jul 2021 • Jianyu Wang, Zachary Charles, Zheng Xu, Gauri Joshi, H. Brendan McMahan, Blaise Aguera y Arcas, Maruan Al-Shedivat, Galen Andrew, Salman Avestimehr, Katharine Daly, Deepesh Data, Suhas Diggavi, Hubert Eichner, Advait Gadhikar, Zachary Garrett, Antonious M. Girgis, Filip Hanzely, Andrew Hard, Chaoyang He, Samuel Horvath, Zhouyuan Huo, Alex Ingerman, Martin Jaggi, Tara Javidi, Peter Kairouz, Satyen Kale, Sai Praneeth Karimireddy, Jakub Konecny, Sanmi Koyejo, Tian Li, Luyang Liu, Mehryar Mohri, Hang Qi, Sashank J. Reddi, Peter Richtarik, Karan Singhal, Virginia Smith, Mahdi Soltanolkotabi, Weikang Song, Ananda Theertha Suresh, Sebastian U. Stich, Ameet Talwalkar, Hongyi Wang, Blake Woodworth, Shanshan Wu, Felix X. Yu, Honglin Yuan, Manzil Zaheer, Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
Federated learning and analytics are a distributed approach for collaboratively learning models (or statistics) from decentralized data, motivated by and designed for privacy protection.
no code implementations • 3 Jul 2021 • Zhuohang Li, Luyang Liu, Jiaxin Zhang, Jian Liu
Federated Learning (FL) enables multiple distributed clients (e.g., mobile devices) to collaboratively train a centralized model while keeping the training data local to each client.
no code implementations • 4 Jun 2021 • Jianyu Wang, Zheng Xu, Zachary Garrett, Zachary Charles, Luyang Liu, Gauri Joshi
Popular optimization algorithms for FL use vanilla (stochastic) gradient descent for both local updates at clients and global updates at the aggregating server.
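A minimal sketch of the generic template that sentence describes: clients run local SGD, and the server applies its own gradient-style step by treating the average client delta as a pseudo-gradient. The quadratic toy objective and function names are illustrative, not taken from the paper.

```python
# Toy FL round: local SGD on each client, then a server-side "SGD" step that
# uses the mean client delta as a pseudo-gradient.
import numpy as np

def local_sgd(w, data, lr=0.01, steps=5):
    x, y = data
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)   # gradient of mean squared error
        w = w - lr * grad
    return w

def federated_round(w_global, client_datasets, server_lr=1.0):
    deltas = [local_sgd(w_global.copy(), d) - w_global for d in client_datasets]
    pseudo_grad = -np.mean(deltas, axis=0)       # average delta as pseudo-gradient
    return w_global - server_lr * pseudo_grad    # global update at the server
```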
1 code implementation • 6 Jul 2020 • Amol Kapoor, Xue Ben, Luyang Liu, Bryan Perozzi, Matt Barnes, Martin Blais, Shawn O'Banion
In this work, we examine a novel forecasting approach for COVID-19 case prediction that uses Graph Neural Networks and mobility data.
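A hedged, toy illustration of the general idea of combining a region graph with mobility data: propagate recent case counts over a mobility-weighted adjacency with one graph-convolution-style step. The shapes and weight names are assumptions, and this is not the paper's model.

```python
# Toy spatio-temporal step: aggregate neighbors' case histories using
# row-normalized mobility flows, then apply a learned nonlinear transform.
import numpy as np

def mobility_gnn_step(case_history, mobility, w_self, w_neigh):
    """case_history: (regions, days); mobility: (regions, regions) flow matrix."""
    adj = mobility / (mobility.sum(axis=1, keepdims=True) + 1e-9)  # normalize flows
    neighbor_feat = adj @ case_history                             # aggregate neighbors
    hidden = np.tanh(case_history @ w_self + neighbor_feat @ w_neigh)
    return hidden  # (regions, hidden_dim), fed to a readout that predicts future cases
```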
1 code implementation • COLING 2018 • Qian Liu, He-Yan Huang, Yang Gao, Xiaochi Wei, Yuxin Tian, Luyang Liu
In this paper, we propose a task-oriented word embedding method and apply it to the text classification task.
Ranked #21 on Text Classification on AG News