Search Results for author: Aryeh Kontorovich

Found 42 papers, 3 papers with code

Distribution Estimation under the Infinity Norm

no code implementations • 13 Feb 2024 • Aryeh Kontorovich, Amichai Painsky

A variety of techniques are utilized and innovated upon, including Chernoff-type inequalities and empirical Bernstein bounds.
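An empirical Bernstein bound of the Maurer–Pontil type is easy to compute in practice; the sketch below is illustrative only (it assumes i.i.d. samples in [0, 1] and is not claimed to be the exact inequality innovated upon in the paper).

```python
import math

def empirical_bernstein_radius(xs, delta):
    """Maurer-Pontil empirical Bernstein confidence radius: for i.i.d.
    samples in [0, 1], the true mean lies within this radius of the
    sample average with probability at least 1 - delta."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance
    log_term = math.log(2.0 / delta)
    # variance term decays like 1/sqrt(n); range term decays like 1/n
    return math.sqrt(2.0 * var * log_term / n) + 7.0 * log_term / (3.0 * (n - 1))
```

When the sample variance is small, the bound improves on Hoeffding's, whose radius depends only on the range.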

Splitting the Difference on Adversarial Training

1 code implementation • 3 Oct 2023 • Matan Levi, Aryeh Kontorovich

The ability to achieve such near-optimal natural accuracy, while maintaining a significant level of robustness, makes our method applicable to real-world applications where natural accuracy is at a premium.

Efficient Agnostic Learning with Average Smoothness

no code implementations • 29 Sep 2023 • Steve Hanneke, Aryeh Kontorovich, Guy Kornowski

While the recent work of Hanneke et al. (2023) established tight uniform convergence bounds for average-smooth functions in the realizable case and provided a computationally efficient realizable learning algorithm, both of these results currently lack analogs in the general agnostic (i.e., noisy) case.

Metric-valued regression

no code implementations • 7 Feb 2022 • Dan Tsir Cohen, Aryeh Kontorovich

We propose an efficient algorithm for learning mappings between two metric spaces, $\mathcal{X}$ and $\mathcal{Y}$.

regression

Adaptive Data Analysis with Correlated Observations

no code implementations • 21 Jan 2022 • Aryeh Kontorovich, Menachem Sadigurschi, Uri Stemmer

The vast majority of the work on adaptive data analysis focuses on the case where the samples in the dataset are independent.

Tree density estimation

no code implementations • 23 Nov 2021 • László Györfi, Aryeh Kontorovich, Roi Weiss

…data we identify an optimal tree $T^*$ and efficiently construct a tree density estimate $f_n$ such that, without any regularity conditions on the density $f$, one has $\lim_{n\to \infty} \int |f_n(\boldsymbol x)-f_{T^*}(\boldsymbol x)|\,d\boldsymbol x=0$ a.s. For Lipschitz $f$ with bounded support, $\mathbb E \left\{ \int |f_n(\boldsymbol x)-f_{T^*}(\boldsymbol x)|\,d\boldsymbol x\right\}=O\big(n^{-1/4}\big)$, a dimension-free rate.

Density Estimation
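The optimal tree $T^*$ in this line of work is in the spirit of the classical Chow–Liu construction: a maximum-weight spanning tree over pairwise empirical mutual information. A minimal sketch for discrete data (illustrative only; the function names and the Prim-style tree search are our own, not the paper's estimator):

```python
import numpy as np
from itertools import combinations

def empirical_mi(x, y):
    """Empirical mutual information between two discrete columns."""
    n = len(x)
    vx, cx = np.unique(x, return_counts=True)
    vy, cy = np.unique(y, return_counts=True)
    px, py = dict(zip(vx, cx / n)), dict(zip(vy, cy / n))
    pairs, cxy = np.unique(np.stack([x, y], axis=1), axis=0, return_counts=True)
    mi = 0.0
    for (a, b), c in zip(pairs, cxy):
        pab = c / n
        mi += pab * np.log(pab / (px[a] * py[b]))
    return mi

def chow_liu_edges(X):
    """Maximum-weight spanning tree over pairwise empirical MI,
    grown greedily from node 0 (Prim's algorithm)."""
    d = X.shape[1]
    w = np.zeros((d, d))
    for i, j in combinations(range(d), 2):
        w[i, j] = w[j, i] = empirical_mi(X[:, i], X[:, j])
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        i, j = max(((i, j) for i in in_tree for j in range(d) if j not in in_tree),
                   key=lambda e: w[e])
        edges.append((i, j))
        in_tree.add(j)
    return edges
```

The resulting tree determines a density factorization over its edges; the paper's contribution concerns the L1 convergence of the plug-in estimate, not the tree search itself.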

Fat-Shattering Dimension of $k$-fold Aggregations

no code implementations • 10 Oct 2021 • Idan Attias, Aryeh Kontorovich

We provide estimates on the fat-shattering dimension of aggregation rules of real-valued function classes.

Domain Invariant Adversarial Learning

1 code implementation • 1 Apr 2021 • Matan Levi, Idan Attias, Aryeh Kontorovich

We present a new adversarial training method, Domain Invariant Adversarial Learning (DIAL), which learns a feature representation that is both robust and domain invariant.

Stable Sample Compression Schemes: New Applications and an Optimal SVM Margin Bound

no code implementations • 9 Nov 2020 • Steve Hanneke, Aryeh Kontorovich

We analyze a family of supervised learning algorithms based on sample compression schemes that are stable, in the sense that removing points from the training set which were not selected for the compression set does not alter the resulting classifier.

Generalization Bounds • Open-Ended Question Answering

Non-parametric Binary regression in metric spaces with KL loss

no code implementations • 19 Oct 2020 • Ariel Avital, Klim Efremenko, Aryeh Kontorovich, David Toplin, Bo Waggoner

We propose a non-parametric variant of binary regression, where the hypothesis is regularized to be a Lipschitz function taking a metric space to [0, 1] and the loss is logarithmic.

Generalization Bounds • regression

Functions with average smoothness: structure, algorithms, and learning

no code implementations • 13 Jul 2020 • Yair Ashlagi, Lee-Ad Gottlieb, Aryeh Kontorovich

Rather than using the Lipschitz constant as the regularizer, we define a local slope at each point and gauge the function complexity as the average of these values.

Denoising • Generalization Bounds
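On a finite sample, one hypothetical way to make "the average of these values" concrete is to take, at each point, the largest difference quotient to any other sample point and then average; this is only a crude empirical proxy (the paper's actual definition involves suprema over balls), but it shows the contrast with the Lipschitz constant:

```python
import numpy as np

def average_slope(X, y):
    """Empirical average-smoothness proxy: at each sample point, take the
    largest difference quotient to any other point, then average these
    local slopes.  The Lipschitz constant would instead be their max."""
    n = len(y)
    slopes = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the point itself
        slopes[i] = np.max(np.abs(y - y[i]) / d)
    return slopes.mean()
```

A single steep region inflates the maximum (Lipschitz) regularizer everywhere, whereas the average is only mildly affected.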

On Biased Random Walks, Corrupted Intervals, and Learning Under Adversarial Design

no code implementations • 30 Mar 2020 • Daniel Berend, Aryeh Kontorovich, Lev Reyzin, Thomas Robinson

We tackle some fundamental problems in probability theory on corrupted random processes on the integer line.

Nested Barycentric Coordinate System as an Explicit Feature Map

no code implementations • 5 Feb 2020 • Lee-Ad Gottlieb, Eran Kaufman, Aryeh Kontorovich, Gabriel Nivasch, Ofir Pele

We propose a new embedding method which is particularly well-suited for settings where the sample size greatly exceeds the ambient dimension.

Generalization Bounds

Apportioned Margin Approach for Cost Sensitive Large Margin Classifiers

no code implementations • 4 Feb 2020 • Lee-Ad Gottlieb, Eran Kaufman, Aryeh Kontorovich

We consider the problem of cost sensitive multiclass classification, where we would like to increase the sensitivity of an important class at the expense of a less important one.

Generalization Bounds

Fast and Bayes-consistent nearest neighbors

no code implementations • 7 Oct 2019 • Klim Efremenko, Aryeh Kontorovich, Moshe Noivirt

Research on nearest-neighbor methods tends to focus dichotomously on either the statistical or the computational aspects -- on, say, Bayes consistency and rates of convergence, or on techniques for speeding up the proximity search.

Universal Bayes consistency in metric spaces

no code implementations • 24 Jun 2019 • Steve Hanneke, Aryeh Kontorovich, Sivan Sabato, Roi Weiss

This is the first learning algorithm known to enjoy this property; by comparison, the $k$-NN classifier and its variants are not generally universally Bayes-consistent, except under additional structural assumptions, such as an inner product, a norm, finite dimension, or a Besicovitch-type property.

Estimating the Mixing Time of Ergodic Markov Chains

no code implementations • 1 Feb 2019 • Geoffrey Wolfer, Aryeh Kontorovich

Furthermore, even if an eigenvalue perturbation analysis with better dependence on $d$ were available, in the non-reversible case the connection between the spectral gap and the mixing time is not nearly as straightforward as in the reversible case.

Minimax Testing of Identity to a Reference Ergodic Markov Chain

no code implementations • 31 Jan 2019 • Geoffrey Wolfer, Aryeh Kontorovich

We exhibit an efficient procedure for testing, based on a single long state sequence, whether an unknown Markov chain is identical to or $\varepsilon$-far from a given reference chain.
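A natural plug-in statistic for this testing problem compares the empirical transition distribution of each visited state against the corresponding reference row in total variation. The sketch below is illustrative only; the paper's procedure and its calibration against the $\varepsilon$-far alternative are more involved.

```python
import numpy as np

def identity_test_stat(path, P_ref):
    """Plug-in test statistic: the largest total-variation distance
    between any visited state's empirical transition distribution and
    the corresponding row of the reference kernel P_ref."""
    d = P_ref.shape[0]
    counts = np.zeros((d, d))
    for s, t in zip(path[:-1], path[1:]):
        counts[s, t] += 1
    stat = 0.0
    for i in range(d):
        n_i = counts[i].sum()
        if n_i == 0:
            continue  # unvisited state: no evidence either way
        stat = max(stat, 0.5 * np.abs(counts[i] / n_i - P_ref[i]).sum())
    return stat
```

One would reject identity when the statistic exceeds a threshold tuned to the sequence length and the reference chain; choosing that threshold is precisely the hard part.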

Improved Generalization Bounds for Adversarially Robust Learning

no code implementations • 4 Oct 2018 • Idan Attias, Aryeh Kontorovich, Yishay Mansour

For binary classification, the algorithm of Feige et al. (2015) uses a regret minimization algorithm and an ERM oracle as a black box; we adapt it for the multiclass and regression settings.

Binary Classification • General Classification • +3

Agnostic Sample Compression Schemes for Regression

no code implementations • 3 Oct 2018 • Idan Attias, Steve Hanneke, Aryeh Kontorovich, Menachem Sadigurschi

For the $\ell_2$ loss, does every function class admit an approximate compression scheme of polynomial size in the fat-shattering dimension?

Open-Ended Question Answering • regression

Statistical Estimation of Ergodic Markov Chain Kernel over Discrete State Space

no code implementations • 13 Sep 2018 • Geoffrey Wolfer, Aryeh Kontorovich

We investigate the statistical complexity of estimating the parameters of a discrete-state Markov chain kernel from a single long sequence of state observations.
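The basic plug-in estimator in this setting is the matrix of normalized transition counts along the single observed path; a sketch (the optional Laplace smoothing parameter is our own addition, useful when some states are rarely visited, and not necessarily the paper's estimator):

```python
import numpy as np

def estimate_kernel(path, d, alpha=0.0):
    """Estimate a d-state transition kernel from one state sequence by
    normalized transition counts.  With alpha > 0, Laplace smoothing
    keeps every row a valid distribution; with alpha = 0, a
    never-visited state yields an undefined (nan) row."""
    counts = np.full((d, d), alpha, dtype=float)
    for s, t in zip(path[:-1], path[1:]):
        counts[s, t] += 1
    return counts / counts.sum(axis=1, keepdims=True)
```

The statistical question the paper addresses is how long the path must be for this kind of estimate to be accurate, which depends on how often the chain visits its least-frequent state.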

Learning convex polyhedra with margin

no code implementations • NeurIPS 2018 • Lee-Ad Gottlieb, Eran Kaufman, Aryeh Kontorovich, Gabriel Nivasch

We present an improved algorithm for {\em quasi-properly} learning convex polyhedra in the realizable PAC setting from data with a margin.

Sample Compression for Real-Valued Learners

no code implementations • 21 May 2018 • Steve Hanneke, Aryeh Kontorovich, Menachem Sadigurschi

We give an algorithmically efficient version of the learner-to-compression scheme conversion in Moran and Yehudayoff (2016).

Open-Ended Question Answering • regression

A New Lower Bound for Agnostic Learning with Sample Compression Schemes

no code implementations • 21 May 2018 • Steve Hanneke, Aryeh Kontorovich

We establish a tight characterization of the worst-case rates for the excess risk of agnostic learning with sample compression schemes and for uniform convergence for agnostic sample compression schemes.

Mixing time estimation in reversible Markov chains from a single sample path

no code implementations • NeurIPS 2015 • Daniel Hsu, Aryeh Kontorovich, David A. Levin, Yuval Peres, Csaba Szepesvári

The interval is constructed around the relaxation time $t_{\text{relax}} = 1/\gamma$, which is strongly related to the mixing time, and the width of the interval converges to zero roughly at a $1/\sqrt{n}$ rate, where $n$ is the length of the sample path.
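When the transition matrix of a reversible chain is known, the relaxation time is directly computable from its spectrum; a sketch (this assumes the chain is known, which is exactly what the paper avoids by estimating from a single sample path):

```python
import numpy as np

def relaxation_time(P):
    """Relaxation time t_relax = 1/gamma of a reversible transition
    matrix P, where gamma = 1 - max_{i>=2} |lambda_i| is the absolute
    spectral gap (reversibility makes the spectrum real)."""
    eig = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]  # descending |eigenvalues|
    gamma = 1.0 - eig[1]  # eig[0] = 1 is the trivial eigenvalue
    return 1.0 / gamma
```

The mixing time is sandwiched between $t_{\text{relax}}$ and a multiple of it involving the stationary probabilities, which is why an interval around the relaxation time is informative.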

Temporal anomaly detection: calibrating the surprise

1 code implementation • 29 May 2017 • Eyal Gutflaish, Aryeh Kontorovich, Sivan Sabato, Ofer Biller, Oded Sofer

We learn a low-rank stationary model from the training data, and then fit a regression model for predicting the expected likelihood score of normal access patterns in the future.

Anomaly Detection

Active Nearest-Neighbor Learning in Metric Spaces

no code implementations • NeurIPS 2016 • Aryeh Kontorovich, Sivan Sabato, Ruth Urner

We propose a pool-based non-parametric active learning algorithm for general metric spaces, called MArgin Regularized Metric Active Nearest Neighbor (MARMANN), which outputs a nearest-neighbor classifier.

Active Learning • Model Selection

Mixing Time Estimation in Reversible Markov Chains from a Single Sample Path

no code implementations • NeurIPS 2015 • Daniel Hsu, Aryeh Kontorovich, Csaba Szepesvári

The interval is constructed around the relaxation time $t_{\text{relax}}$, which is strongly related to the mixing time, and the width of the interval converges to zero roughly at a $1/\sqrt{n}$ rate, where $n$ is the length of the sample path.

Nearly optimal classification for semimetrics

no code implementations • 22 Feb 2015 • Lee-Ad Gottlieb, Aryeh Kontorovich

We initiate the rigorous study of classification in semimetric spaces, which are point sets with a distance function that is non-negative and symmetric, but need not satisfy the triangle inequality.

Classification • General Classification

A Bayes consistent 1-NN classifier

no code implementations • 1 Jul 2014 • Aryeh Kontorovich, Roi Weiss

We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner.
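For reference, the unmodified 1-NN predictor that serves as the starting point, in a plain Euclidean version (the paper's Bayes-consistent modification is not reproduced here):

```python
import numpy as np

def one_nn_predict(X_train, y_train, X_query):
    """Plain 1-nearest-neighbor prediction under the Euclidean metric:
    each query point receives the label of its closest training point."""
    preds = []
    for x in X_query:
        dists = np.linalg.norm(X_train - x, axis=1)
        preds.append(y_train[int(np.argmin(dists))])
    return np.array(preds)
```

Vanilla 1-NN is famously not Bayes consistent in general (its asymptotic risk can be up to twice the Bayes risk), which is what makes a strongly consistent modification interesting.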

Near-optimal sample compression for nearest neighbors

no code implementations • NeurIPS 2014 • Lee-Ad Gottlieb, Aryeh Kontorovich, Pinhas Nisnevitch

We present the first sample compression algorithm for nearest neighbors with non-trivial performance guarantees.

General Classification

Maximum Margin Multiclass Nearest Neighbors

no code implementations • 30 Jan 2014 • Aryeh Kontorovich, Roi Weiss

We prove generalization bounds that match the state of the art in sample size $n$ and significantly improve the dependence on the number of classes $k$.

Generalization Bounds

Consistency of weighted majority votes

no code implementations • NeurIPS 2014 • Daniel Berend, Aryeh Kontorovich

We revisit the classical decision-theoretic problem of weighted expert voting from a statistical learning perspective.

Predictive PAC Learning and Process Decompositions

no code implementations • NeurIPS 2013 • Cosma Rohilla Shalizi, Aryeh Kontorovich

We informally call a stochastic process learnable if it admits a generalization error approaching zero in probability for any concept class with finite VC-dimension (IID processes are the simplest example).

PAC learning

Concentration in unbounded metric spaces and algorithmic stability

no code implementations • 4 Sep 2013 • Aryeh Kontorovich

We prove an extension of McDiarmid's inequality for metric spaces with unbounded diameter.

Efficient Classification for Metric Data

no code implementations • 11 Jun 2013 • Lee-Ad Gottlieb, Aryeh Kontorovich, Robert Krauthgamer

We design a new algorithm for classification in general metric spaces, whose runtime and accuracy depend on the doubling dimension of the data points, and can thus achieve superior classification performance in many common scenarios.

Classification • Computational Efficiency • +1

Adaptive Metric Dimensionality Reduction

no code implementations • 12 Feb 2013 • Lee-Ad Gottlieb, Aryeh Kontorovich, Robert Krauthgamer

We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces.

Dimensionality Reduction • Generalization Bounds
