no code implementations • 5 May 2023 • Danilo Ribeiro, Omid Abdar, Jack Goetz, Mike Ross, Annie Dong, Kenneth Forbus, Ahmed Mohamed
In this work we propose OpenFSP, a framework that allows for the easy creation of new domains from a handful of simple labels, which can be generated without specialist NLP knowledge.
1 code implementation • 4 Apr 2022 • Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu, Yue Liu
Model compression is important in federated learning (FL) with large models to reduce communication cost.
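As a generic illustration of compressing model updates before transmission (a sketch of uniform quantization, not the paper's specific scheme; all names here are illustrative):

```python
import numpy as np

def quantize(update, num_bits=8):
    """Uniformly quantize a model update to num_bits per entry before
    sending it, shrinking communication cost (here 8 bits vs. 64)."""
    lo, hi = update.min(), update.max()
    levels = 2 ** num_bits - 1
    q = np.round((update - lo) / (hi - lo) * levels)
    return q.astype(np.uint8), lo, hi

def dequantize(q, lo, hi, num_bits=8):
    """Invert the quantization on the server side."""
    levels = 2 ** num_bits - 1
    return q.astype(float) / levels * (hi - lo) + lo

rng = np.random.default_rng(0)
update = rng.normal(size=1000)       # a stand-in for a client's update
q, lo, hi = quantize(update)
restored = dequantize(q, lo, hi)     # close to the original update
```

The round-trip error is bounded by half a quantization step, so accuracy degrades gracefully as the bit budget shrinks.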
no code implementations • 12 Oct 2021 • Pooja Sethi, Denis Savenkov, Forough Arabshahi, Jack Goetz, Micaela Tolliver, Nicolas Scheffer, Ilknur Kabul, Yue Liu, Ahmed Aly
Improving the quality of Natural Language Understanding (NLU) models, and more specifically, task-oriented semantic parsing models, in production is a cumbersome task.
no code implementations • 11 Aug 2020 • Jack Goetz, Ambuj Tewari
Federated learning allows for the training of a model using data on multiple clients without the clients transmitting that raw data.
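The core federated-training loop can be sketched as follows (a minimal FedAvg-style toy on least squares; the per-client model and hyperparameters are illustrative assumptions):

```python
import numpy as np

def local_update(weights, client_data, lr=0.1):
    """One gradient step on a client's private least-squares data.
    Only the updated weights leave the client, never the raw (X, y)."""
    X, y = client_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(weights, clients):
    """Server averages the locally updated weights."""
    updates = [local_update(weights, data) for data in clients]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(5):                       # five clients, private data each
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(200):                     # repeated communication rounds
    w = federated_round(w, clients)
```

After enough rounds the server's weights approach the global optimum even though it never sees any client's raw data.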
no code implementations • 11 Oct 2019 • Jack Goetz, Ambuj Tewari
We generalize Stone's Theorem in the noise-free setting, proving consistency for well-known classifiers such as $k$-NN, histogram, and kernel estimators under conditions which mirror classical results.
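For concreteness, one of the classifiers covered by such consistency results is plain $k$-nearest-neighbour majority vote (a textbook sketch, not code from the paper):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the majority label among the k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return np.bincount(y_train[nearest]).argmax()

# Two well-separated clusters with noise-free labels.
X = np.array([[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]])
y = np.array([0, 0, 0, 1, 1, 1])
pred = knn_predict(X, y, np.array([0.15]))  # queries near the first cluster
```

Consistency says that, as the sample grows (with $k$ growing suitably), such rules converge to the best possible classifier.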
no code implementations • ICLR 2020 • Duc Bui, Kshitiz Malik, Jack Goetz, Honglei Liu, Seungwhan Moon, Anuj Kumar, Kang G. Shin
Furthermore, we show that user embeddings learned in FL and the centralized setting have a very similar structure, indicating that FURL can learn collaboratively through the shared parameters while preserving user privacy.
no code implementations • 27 Sep 2019 • Jack Goetz, Kshitiz Malik, Duc Bui, Seungwhan Moon, Honglei Liu, Anuj Kumar
To exploit this, we propose Active Federated Learning, in which clients in each round are selected not uniformly at random but with a probability conditioned on the current model and the client's data, in order to maximize efficiency.
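A minimal sketch of such non-uniform client selection, using each client's current loss as its value signal (the loss-as-value choice and the softmax form are illustrative assumptions, not necessarily the paper's exact criterion):

```python
import numpy as np

def select_clients(client_losses, num_select, temperature=1.0, rng=None):
    """Sample clients without replacement, with probability increasing
    in their current loss, instead of uniformly at random."""
    rng = rng or np.random.default_rng()
    losses = np.asarray(client_losses, dtype=float)
    probs = np.exp(losses / temperature)
    probs /= probs.sum()
    return rng.choice(len(losses), size=num_select, replace=False, p=probs)

rng = np.random.default_rng(1)
losses = [0.1, 0.1, 0.1, 5.0, 5.0]   # clients 3 and 4 are poorly fit
counts = np.zeros(5)
for _ in range(1000):                 # selection frequencies over many rounds
    for i in select_clients(losses, 2, rng=rng):
        counts[i] += 1
```

Clients whose data the current model handles worst are chosen far more often, concentrating communication rounds where they help most.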
1 code implementation • NeurIPS 2018 • Jack Goetz, Ambuj Tewari, Paul Zimmerman
Active learning is the task of using labelled data to select additional points to label, with the goal of fitting the most accurate model with a fixed budget of labelled points.
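One classic acquisition rule for this setting is uncertainty sampling (shown here for a binary classifier; the paper's own selection criterion may differ):

```python
import numpy as np

def uncertainty_sampling(model_probs, budget):
    """Pick the unlabelled points whose predicted positive-class
    probability is closest to 0.5, i.e. where the model is least sure."""
    uncertainty = -np.abs(np.asarray(model_probs) - 0.5)
    return np.argsort(uncertainty)[-budget:]

# Model predictions on five unlabelled points; budget of two labels.
probs = np.array([0.95, 0.51, 0.10, 0.48, 0.99])
picked = uncertainty_sampling(probs, 2)
```

The fixed labelling budget is spent on the points the current model finds most ambiguous, rather than on points it already classifies confidently.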
1 code implementation • NeurIPS 2017 • Young Hun Jung, Jack Goetz, Ambuj Tewari
Recent work has extended the theoretical analysis of boosting algorithms to multiclass problems and to online settings.