Search Results for author: Zelda Mariet

Found 14 papers, 6 papers with code

Pre-training helps Bayesian optimization too

1 code implementation • 7 Jul 2022 Zi Wang, George E. Dahl, Kevin Swersky, Chansoo Lee, Zelda Mariet, Zachary Nado, Justin Gilmer, Jasper Snoek, Zoubin Ghahramani

Contrary to a common belief that Bayesian optimization (BO) is suited to optimizing black-box functions, it actually requires domain knowledge about the characteristics of those functions to be deployed successfully.

Bayesian Optimization
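The premise above, that BO needs prior knowledge of the objective before it can work well, can be illustrated with a minimal Gaussian-process surrogate in numpy. This is a hypothetical sketch, not the paper's pre-trained priors: the kernel, lengthscales, and observations are all made up for illustration.

```python
import numpy as np

# Hedged sketch: a GP surrogate's predictions depend on a prior belief
# about smoothness (the kernel lengthscale), which is domain knowledge
# rather than something the observed data determines on its own.
def gp_posterior_mean(X, y, Xq, lengthscale, noise=1e-4):
    def k(a, b):  # squared-exponential kernel
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / lengthscale) ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    return k(Xq, X) @ np.linalg.solve(K, y)

X = np.array([0.0, 0.5, 1.0])     # three toy observations
y = np.sin(np.pi * X)             # values 0, 1, ~0
Xq = np.array([0.25])             # an unobserved query point

# Two surrogates with different smoothness beliefs disagree sharply at
# the query point; picking between them is a modeling decision.
m_short = gp_posterior_mean(X, y, Xq, lengthscale=0.1)[0]
m_long = gp_posterior_mean(X, y, Xq, lengthscale=1.0)[0]
print(m_short, m_long)
```

The short-lengthscale surrogate reverts toward the prior mean of zero between observations, while the long-lengthscale one interpolates confidently; pre-training the prior, as the paper proposes, is one way to make that choice from data on related tasks.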

Ensembling over Classifiers: a Bias-Variance Perspective

no code implementations • 21 Jun 2022 Neha Gupta, Jamie Smith, Ben Adlam, Zelda Mariet

Empirically, standard ensembling reduces the bias, leading us to hypothesize that ensembles of classifiers may perform well in part because of this unexpected reduction. We conclude with an empirical analysis of recent deep learning methods that ensemble over hyperparameters, revealing that these techniques indeed favor bias reduction.
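For contrast with the classifier setting studied above, the following toy numpy simulation shows the classical regression picture, in which averaging independent predictors shrinks variance but leaves bias unchanged; the paper's observation is that classifier ensembles can also reduce bias, which this classical picture does not predict. All quantities here are hypothetical.

```python
import numpy as np

# Toy illustration of the classical picture: averaging M independent
# noisy predictors shrinks the variance term by roughly 1/M while the
# bias term (a systematic offset of 0.3) is untouched.
rng = np.random.default_rng(1)
target, offset, noise = 1.0, 0.3, 0.5
M, trials = 10, 20000

single = target + offset + noise * rng.normal(size=trials)
ensemble = target + offset + noise * rng.normal(size=(trials, M)).mean(axis=1)

for name, preds in [("single", single), ("ensemble", ensemble)]:
    b = preds.mean() - target   # estimated bias
    v = preds.var()             # estimated variance
    print(f"{name}: bias^2={b**2:.3f} variance={v:.4f}")
```

Both predictors report a squared bias near 0.09, but the ensemble's variance is about ten times smaller than the single model's.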

Understanding the bias-variance tradeoff of Bregman divergences

no code implementations • 8 Feb 2022 Ben Adlam, Neha Gupta, Zelda Mariet, Jamie Smith

We show that, similarly to the label, the central prediction can be interpreted as the mean of a random variable, where the mean operates in a dual space defined by the loss function itself.
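A minimal numpy sketch of the dual-space mean described above, assuming the standard Bregman-divergence setup: apply the gradient of the generator $F$, average, then map back with the inverse gradient. The generators and prediction values below are illustrative, not taken from the paper.

```python
import numpy as np

# Hedged sketch: the "central prediction" under a Bregman divergence is a
# mean computed in the dual space of the generator F.
def dual_mean(preds, grad, grad_inv):
    return grad_inv(np.mean([grad(p) for p in preds], axis=0))

preds = np.array([0.2, 0.5, 0.8])

# Squared loss: F(x) = x^2 / 2, grad F is the identity, so the dual mean
# reduces to the ordinary arithmetic mean.
sq = dual_mean(preds, lambda x: x, lambda x: x)

# KL / log-loss generator: F(x) = x log x, grad F(x) = log x + 1, inverse
# grad(x) = exp(x - 1), so the dual mean is the geometric mean.
kl = dual_mean(preds, lambda x: np.log(x) + 1, lambda x: np.exp(x - 1))

print(sq)  # 0.5
print(kl)  # geometric mean of 0.2, 0.5, 0.8
```

The loss function thus determines which notion of "average prediction" appears in the bias-variance decomposition.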

Population-Based Black-Box Optimization for Biological Sequence Design

no code implementations • ICML 2020 Christof Angermueller, David Belanger, Andreea Gane, Zelda Mariet, David Dohan, Kevin Murphy, Lucy Colwell, D. Sculley

The cost and latency of wet-lab experiments require methods that find good sequences within a few experimental rounds of large batches of sequences, a setting that off-the-shelf black-box optimization methods are ill-equipped to handle.

Weighting Is Worth the Wait: Bayesian Optimization with Importance Sampling

no code implementations • 23 Feb 2020 Setareh Ariafar, Zelda Mariet, Ehsan Elhamifar, Dana Brooks, Jennifer Dy, Jasper Snoek

Casting hyperparameter search as a multi-task Bayesian optimization problem over both hyperparameters and importance sampling design achieves the best of both worlds: by learning a parameterization of IS that trades off evaluation complexity against quality, we improve upon the state-of-the-art runtime and final validation error of Bayesian optimization across a variety of datasets and complex neural architectures.

Bayesian Optimization
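As background for the method above, a generic importance-sampling estimate works by drawing from a cheap proposal distribution and reweighting. This is a textbook IS sketch, not the paper's learned parameterization; the target, proposal, and integrand below are all hypothetical.

```python
import numpy as np

# Hedged sketch of importance sampling: estimate E_p[f(x)] using samples
# from a cheaper proposal q, reweighted by the density ratio p(x)/q(x).
rng = np.random.default_rng(2)
f = lambda x: x ** 2

# Target p = N(0, 1); proposal q = N(0, 2^2), which is easy to sample and
# heavier-tailed than p, keeping the weights well-behaved.
xs = rng.normal(0.0, 2.0, size=200_000)
# p(x)/q(x); the 1/sqrt(2*pi) factors cancel in the ratio.
w = np.exp(-0.5 * xs**2) / (np.exp(-0.5 * (xs / 2.0) ** 2) / 2.0)
est = np.mean(w * f(xs))

print(est)  # close to E_{N(0,1)}[x^2] = 1
```

The paper's contribution is to learn how aggressively to subsample (trading estimator quality against evaluation cost) inside the BO loop, rather than fixing a proposal by hand.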

DPPNet: Approximating Determinantal Point Processes with Deep Networks

no code implementations • ICLR 2019 Zelda Mariet, Yaniv Ovadia, Jasper Snoek

Determinantal Point Processes (DPPs) provide an elegant and versatile way to sample sets of items that balance the point-wise quality with the set-wise diversity of selected items.

Point Processes

Foundations of Sequence-to-Sequence Modeling for Time Series

no code implementations • 9 May 2018 Vitaly Kuznetsov, Zelda Mariet

The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting.

Time Series • Time Series Forecasting

Learning Determinantal Point Processes by Corrective Negative Sampling

no code implementations • 15 Feb 2018 Zelda Mariet, Mike Gartrell, Suvrit Sra

To address this issue, which reduces the quality of the learned model, we introduce a novel optimization problem, Contrastive Estimation (CE), which encodes information about "negative" samples into the basic learning model.

Language Modelling • Point Processes

Kronecker Determinantal Point Processes

4 code implementations • NeurIPS 2016 Zelda Mariet, Suvrit Sra

Determinantal Point Processes (DPPs) are probabilistic models over all subsets of a ground set of $N$ items.

Point Processes • Stochastic Optimization
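The definition above can be made concrete with a toy L-ensemble DPP in numpy. This is a hedged sketch with a small hypothetical kernel, not the paper's Kronecker construction: an L-ensemble assigns each subset $S$ the probability $P(S) = \det(L_S) / \det(L + I)$, so subsets of high-quality, mutually dissimilar items are favored.

```python
import numpy as np
from itertools import combinations

# Hypothetical ground set of N = 4 items: feature vectors define a PSD
# kernel L whose diagonal encodes item quality and whose off-diagonal
# entries encode pairwise similarity.
rng = np.random.default_rng(0)
N = 4
feats = rng.normal(size=(N, 3))
L = feats @ feats.T + 1e-6 * np.eye(N)

# P(S) = det(L_S) / det(L + I); det(L + I) normalizes over all 2^N subsets.
Z = np.linalg.det(L + np.eye(N))
probs = {}
for k in range(N + 1):
    for S in combinations(range(N), k):
        det_S = np.linalg.det(L[np.ix_(S, S)]) if S else 1.0
        probs[S] = det_S / Z

total = sum(probs.values())
print(f"sum over all subsets = {total:.6f}")  # normalizes to 1
```

Enumerating all $2^N$ subsets is only feasible for toy $N$; structured kernels such as the Kronecker factorization studied in the paper are what make learning and sampling tractable at scale.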

Fixed-point algorithms for learning determinantal point processes

no code implementations • 4 Aug 2015 Zelda Mariet, Suvrit Sra

Determinantal point processes (DPPs) offer an elegant tool for encoding probabilities over subsets of a ground set.

Point Processes
