Search Results for author: Jack Goetz

Found 9 papers, 3 papers with code

Towards Zero-Shot Frame Semantic Parsing with Task Agnostic Ontologies and Simple Labels

no code implementations · 5 May 2023 · Danilo Ribeiro, Omid Abdar, Jack Goetz, Mike Ross, Annie Dong, Kenneth Forbus, Ahmed Mohamed

In this work we propose OpenFSP, a framework that allows for easy creation of new domains from a handful of simple labels that can be generated without specific NLP knowledge.

Tasks: Semantic Parsing, Sentence +1

FedSynth: Gradient Compression via Synthetic Data in Federated Learning

1 code implementation · 4 Apr 2022 · Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu, Yue Liu

Model compression is important in federated learning (FL) with large models to reduce communication cost.
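FedSynth's actual method learns synthetic training data whose induced model update approximates the real one; the details are in the paper. As a minimal numpy sketch of the core idea for a linear model with squared loss (all variable names are my own, not the paper's): the client solves for synthetic labels whose gradient on a small synthetic batch reproduces the update it wants to send, and the server recovers the update by taking a gradient step on that batch.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 8  # model dimension, synthetic batch size (tiny relative to real models)

w = rng.normal(size=d)          # current global model, known to both sides
true_grad = rng.normal(size=d)  # the client update we want to communicate

# Distill the update into a synthetic batch (X_syn, y_syn). For squared loss,
# grad = X^T (X w - y) / m, so choose y_syn such that
#   X_syn^T y_syn = X_syn^T X_syn w - m * true_grad.
X_syn = rng.normal(size=(m, d))  # could be regenerated server-side from a shared seed
rhs = X_syn.T @ (X_syn @ w) - m * true_grad
y_syn, *_ = np.linalg.lstsq(X_syn.T, rhs, rcond=None)

# Server side: recover the update as the gradient on the synthetic batch.
recovered = X_syn.T @ (X_syn @ w - y_syn) / m
```

In a real deployment the model has far more parameters than the synthetic batch has entries, which is where the communication saving comes from; here the dimensions are kept small only for illustration.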

Tasks: Federated Learning, Model Compression

AutoNLU: Detecting, root-causing, and fixing NLU model errors

no code implementations · 12 Oct 2021 · Pooja Sethi, Denis Savenkov, Forough Arabshahi, Jack Goetz, Micaela Tolliver, Nicolas Scheffer, Ilknur Kabul, Yue Liu, Ahmed Aly

Improving the quality of Natural Language Understanding (NLU) models, and more specifically, task-oriented semantic parsing models, in production is a cumbersome task.

Tasks: Active Learning, Natural Language Understanding +1

Federated Learning via Synthetic Data

no code implementations · 11 Aug 2020 · Jack Goetz, Ambuj Tewari

Federated learning allows a model to be trained using data on multiple clients without the clients transmitting their raw data.
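This paper builds on the standard federated setup in which the server only ever sees model parameters, not data. A minimal sketch of the usual aggregation step (plain federated averaging, not the paper's synthetic-data contribution):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client model parameters by a data-size-weighted average;
    raw client data never leaves the clients."""
    sizes = np.asarray(client_sizes, dtype=float)
    stacked = np.stack(client_weights)  # shape: (num_clients, num_params)
    return (sizes[:, None] * stacked).sum(axis=0) / sizes.sum()

# Two clients: the client holding 3x the data pulls the average toward itself.
avg = fedavg([np.zeros(2), np.ones(2)], client_sizes=[1, 3])  # -> [0.75, 0.75]
```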

Tasks: Federated Learning

Not All are Made Equal: Consistency of Weighted Averaging Estimators Under Active Learning

no code implementations · 11 Oct 2019 · Jack Goetz, Ambuj Tewari

We generalize Stone's Theorem in the noise-free setting, proving consistency for well-known classifiers such as $k$-NN, histogram, and kernel estimators under conditions which mirror classical results.
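The weighted averaging estimators covered by Stone's Theorem predict at a point by averaging nearby training labels. A minimal sketch of one such estimator, $k$-NN regression (an illustration of the estimator class, not of the paper's active-learning analysis):

```python
import numpy as np

def knn_regress(X_train, y_train, x, k=3):
    """Predict at x by averaging the labels of the k nearest training points,
    i.e. uniform 0/1 weights on neighbours: a weighted averaging estimator."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

X = np.array([[0.0], [1.0], [2.0], [10.0]])
y = np.array([0.0, 1.0, 2.0, 10.0])
pred = knn_regress(X, y, np.array([1.1]), k=3)  # neighbours 0, 1, 2 -> mean 1.0
```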

Tasks: Active Learning

Federated User Representation Learning

no code implementations · ICLR 2020 · Duc Bui, Kshitiz Malik, Jack Goetz, Honglei Liu, Seungwhan Moon, Anuj Kumar, Kang G. Shin

Furthermore, we show that user embeddings learned in FL and the centralized setting have a very similar structure, indicating that FURL can learn collaboratively through the shared parameters while preserving user privacy.

Tasks: Federated Learning, Privacy Preserving +1

Active Federated Learning

no code implementations · 27 Sep 2019 · Jack Goetz, Kshitiz Malik, Duc Bui, Seungwhan Moon, Honglei Liu, Anuj Kumar

To exploit this, we propose Active Federated Learning, in which clients in each round are selected not uniformly at random but with a probability conditioned on the current model and the data on the client, so as to maximize efficiency.
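The paper defines its own client valuation; as a hedged stand-in, one simple way to realize "selection probability conditioned on the current model and the client's data" is to sample clients through a softmax over their current losses. All names below are illustrative, not the paper's:

```python
import numpy as np

def sample_clients(client_losses, num_select, temperature=1.0, rng=None):
    """Pick clients with probability increasing in their current loss instead
    of uniformly at random; a softmax over losses is one simple choice of
    valuation (the paper's exact criterion may differ)."""
    rng = rng if rng is not None else np.random.default_rng()
    logits = np.asarray(client_losses, dtype=float) / temperature
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(probs), size=num_select, replace=False, p=probs)

# Client 2 has a much higher loss, so it is selected in almost every round.
chosen = sample_clients([0.1, 0.2, 5.0, 0.15], num_select=2,
                        rng=np.random.default_rng(0))
```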

Tasks: Federated Learning

Active Learning for Non-Parametric Regression Using Purely Random Trees

1 code implementation · NeurIPS 2018 · Jack Goetz, Ambuj Tewari, Paul Zimmerman

Active learning is the task of using labelled data to select additional points to label, with the goal of fitting the most accurate model with a fixed budget of labelled points.
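A common way to choose which point to label next is to query where an ensemble of randomized models disagrees most. The sketch below uses random step functions as a stand-in for purely random trees and picks the pool point with the highest prediction variance; this illustrates the general heuristic, not the paper's tree-specific criterion, and all names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabelled pool and a small ensemble of randomized predictors. Random step
# functions stand in here for an ensemble of purely random trees.
pool = np.linspace(0.0, 1.0, 9)
thresholds = rng.uniform(size=5)
predictions = np.stack([np.where(pool > t, 1.0, 0.0) for t in thresholds])

# Query the pool point where the ensemble disagrees most, i.e. where the
# variance of the members' predictions is highest.
variances = predictions.var(axis=0)
query_idx = int(np.argmax(variances))
```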

Tasks: Active Learning, Binary Classification +2

Online Multiclass Boosting

1 code implementation · NeurIPS 2017 · Young Hun Jung, Jack Goetz, Ambuj Tewari

Recent work has extended the theoretical analysis of boosting algorithms to multiclass problems and to online settings.

Tasks: Binary Classification, General Classification
