Search Results for author: Mingda Qiao

Found 18 papers, 0 papers with code

Collaborative Learning with Different Labeling Functions

no code implementations16 Feb 2024 Yuyang Deng, Mingda Qiao

We study a variant of Collaborative PAC Learning, in which we aim to learn an accurate classifier for each of the $n$ data distributions, while minimizing the number of samples drawn from them in total.

Computational Efficiency
PAC learning

On the Distance from Calibration in Sequential Prediction

no code implementations12 Feb 2024 Mingda Qiao, Letian Zheng

We then show that an $O(\sqrt{T})$ lower calibration distance can be achieved via a simple minimax argument and a reduction to online learning on a Lipschitz class.

A Combinatorial Approach to Robust PCA

no code implementations28 Nov 2023 Weihao Kong, Mingda Qiao, Rajat Sen

We study the problem of recovering Gaussian data under adversarial corruptions when the noise is low-rank and the corruptions are on the coordinate level.

A Fourier Approach to Mixture Learning

no code implementations5 Oct 2022 Mingda Qiao, Guru Guruganesh, Ankit Singh Rawat, Avinava Dubey, Manzil Zaheer

Regev and Vijayaraghavan (2017) showed that with $\Delta = \Omega(\sqrt{\log k})$ separation, the means can be learned using $\mathrm{poly}(k, d)$ samples, whereas super-polynomially many samples are required if $\Delta = o(\sqrt{\log k})$ and $d = \Omega(\log k)$.

Open Problem: Properly learning decision trees in polynomial time?

no code implementations29 Jun 2022 Guy Blanc, Jane Lange, Mingda Qiao, Li-Yang Tan

The previous fastest algorithm for this problem ran in $n^{O(\log n)}$ time, a consequence of Ehrenfeucht and Haussler (1989)'s classic algorithm for the distribution-free setting.

Properly learning decision trees in almost polynomial time

no code implementations1 Sep 2021 Guy Blanc, Jane Lange, Mingda Qiao, Li-Yang Tan

We give an $n^{O(\log\log n)}$-time membership query algorithm for properly and agnostically learning decision trees under the uniform distribution over $\{\pm 1\}^n$.

Decision tree heuristics can fail, even in the smoothed setting

no code implementations2 Jul 2021 Guy Blanc, Jane Lange, Mingda Qiao, Li-Yang Tan

Greedy decision tree learning heuristics are mainstays of machine learning practice, but theoretical justification for their empirical success remains elusive.

Exponential Weights Algorithms for Selective Learning

no code implementations29 Jun 2021 Mingda Qiao, Gregory Valiant

We study the selective learning problem introduced by Qiao and Valiant (2019), in which the learner observes $n$ labeled data points one at a time.

Stronger Calibration Lower Bounds via Sidestepping

no code implementations7 Dec 2020 Mingda Qiao, Gregory Valiant

In this paper, we prove an $\Omega(T^{0.528})$ bound on the calibration error, which, to the best of our knowledge, is the first super-$\sqrt{T}$ lower bound for this setting.
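As a point of reference, the calibration error these bounds concern can be computed directly from the standard $\ell_1$ definition: group rounds by predicted value and aggregate the absolute per-value deviations. A minimal sketch (the function name and list representation are our own, not from the paper):

```python
from collections import defaultdict

def calibration_error(predictions, outcomes):
    """L1 calibration error of a binary forecast sequence.

    For each distinct predicted value p, sum the deviations (outcome - p)
    over the rounds where p was predicted; the error is the sum of the
    absolute values of these per-value totals.
    """
    deviations = defaultdict(float)
    for p, x in zip(predictions, outcomes):
        deviations[p] += x - p
    return sum(abs(d) for d in deviations.values())

# A forecaster that predicts 0.5 twice and sees one success and one
# failure is perfectly calibrated on those rounds.
print(calibration_error([0.5, 0.5], [1, 0]))  # → 0.0
```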

A Theory of Selective Prediction

no code implementations12 Feb 2019 Mingda Qiao, Gregory Valiant

The algorithm is allowed to choose when to make the prediction as well as the length of the prediction window, possibly depending on the observations so far.

Open-Ended Question Answering

Do Outliers Ruin Collaboration?

no code implementations ICML 2018 Mingda Qiao

We consider the problem of learning a binary classifier from $n$ different data sources, among which at most an $\eta$ fraction are adversarial.

Collaborative PAC Learning

no code implementations NeurIPS 2017 Avrim Blum, Nika Haghtalab, Ariel D. Procaccia, Mingda Qiao

We introduce a collaborative PAC learning model, in which k players attempt to learn the same underlying concept.

PAC learning

Learning Discrete Distributions from Untrusted Batches

no code implementations22 Nov 2017 Mingda Qiao, Gregory Valiant

Specifically, we consider the setting where there is some underlying distribution, $p$, and each data source provides a batch of $\ge k$ samples, with the guarantee that at least a $(1-\epsilon)$ fraction of the sources draw their samples from a distribution with total variation distance at most $\eta$ from $p$.
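The guarantee above is stated in terms of total variation distance, which for discrete distributions is half the $\ell_1$ distance between the probability vectors. A minimal sketch of that computation (the dict-based representation is our own choice, not from the paper):

```python
def total_variation(p, q):
    """Total variation distance between two discrete distributions,
    each given as a dict mapping outcomes to probabilities."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.6, "b": 0.4}
print(round(total_variation(p, q), 10))  # → 0.1
```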

Nearly Optimal Sampling Algorithms for Combinatorial Pure Exploration

no code implementations4 Jun 2017 Lijie Chen, Anupam Gupta, Jian Li, Mingda Qiao, Ruosong Wang

We provide a novel instance-wise lower bound for the sample complexity of the problem, as well as a nontrivial sampling algorithm, matching the lower bound up to a factor of $\ln|\mathcal{F}|$.

Multi-Armed Bandits

Practical Algorithms for Best-K Identification in Multi-Armed Bandits

no code implementations19 May 2017 Haotian Jiang, Jian Li, Mingda Qiao

In the Best-$K$ identification problem (Best-$K$-Arm), we are given $N$ stochastic bandit arms with unknown reward distributions.

Multi-Armed Bandits

Nearly Instance Optimal Sample Complexity Bounds for Top-k Arm Selection

no code implementations13 Feb 2017 Lijie Chen, Jian Li, Mingda Qiao

In the Best-$k$-Arm problem, we are given $n$ stochastic bandit arms, each associated with an unknown reward distribution.

Towards Instance Optimal Bounds for Best Arm Identification

no code implementations22 Aug 2016 Lijie Chen, Jian Li, Mingda Qiao

Here, $H(I)=\sum_{i=2}^n\Delta_{[i]}^{-2}$ is the complexity of the instance.
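Under the usual best-arm convention (arm means sorted in decreasing order, with $\Delta_{[i]}$ the gap between the best mean and the $i$-th best), $H(I)$ can be computed directly. A minimal sketch assuming that convention and a unique best arm (the function name is ours):

```python
def instance_complexity(means):
    """H(I) = sum_{i=2}^n Delta_[i]^{-2}, where Delta_[i] is the gap
    between the largest mean and the i-th largest mean.

    Assumes a unique best arm; a tie with the best mean would make a
    gap zero and raise ZeroDivisionError.
    """
    mu = sorted(means, reverse=True)
    return sum((mu[0] - m) ** -2 for m in mu[1:])

# Gaps 0.1 and 0.2 give H(I) = 1/0.01 + 1/0.04 = 125.
print(round(instance_complexity([0.9, 0.8, 0.7]), 6))  # → 125.0
```

Larger $H(I)$ (smaller gaps) means more samples are needed to identify the best arm, which is why the sample-complexity bounds in these papers are stated in terms of it.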
