Search Results for author: Maxim Sviridenko

Found 8 papers, 0 papers with code

Gradient Descent Converges Linearly for Logistic Regression on Separable Data

no code implementations26 Jun 2023 Kyriakos Axiotis, Maxim Sviridenko

We show that running gradient descent with a variable learning rate guarantees loss $f(x) \leq 1.1 \cdot f(x^*) + \epsilon$ for the logistic regression objective, where the error $\epsilon$ decays exponentially with the number of iterations and polynomially with the magnitude of the entries of an arbitrary fixed solution $x^*$.

regression
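For orientation, here is a minimal sketch of gradient descent on the logistic regression objective the abstract refers to. It uses a fixed step size; the paper's variable-learning-rate schedule is its contribution and is not reproduced here.

```python
import numpy as np

def logistic_loss(X, y, w):
    # Average logistic loss; labels y are in {-1, +1}.
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins)))

def gradient_descent(X, y, lr=0.5, iters=200):
    # Plain gradient descent with a fixed step size (a simplification;
    # the paper analyzes a variable learning rate).
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        # Gradient of mean log(1 + exp(-y * x.w)) with respect to w.
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        w -= lr * grad
    return w
```

On separable data the loss keeps shrinking toward zero; the paper quantifies how fast relative to a fixed reference solution $x^*$.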

Iterative Hard Thresholding with Adaptive Regularization: Sparser Solutions Without Sacrificing Runtime

no code implementations11 Apr 2022 Kyriakos Axiotis, Maxim Sviridenko

We propose a simple modification to the iterative hard thresholding (IHT) algorithm, which recovers asymptotically sparser solutions as a function of the condition number.
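For context, this is the baseline IHT algorithm that the paper modifies, sketched for least squares: alternate a gradient step with a hard-thresholding projection onto $s$-sparse vectors. The paper's adaptive-regularization modification is not shown.

```python
import numpy as np

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries, zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(X, y, s, lr=None, iters=500):
    # Classic iterative hard thresholding for min ||Xw - y||^2 subject
    # to w being s-sparse. This is only the unmodified baseline the
    # abstract refers to, not the proposed algorithm.
    n, d = X.shape
    if lr is None:
        lr = 1.0 / np.linalg.norm(X, 2) ** 2  # step 1/L for least squares
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y)
        w = hard_threshold(w - lr * grad, s)
    return w
```

The condition number of the objective governs how sparse the recovered solution can be; the paper's modification improves that dependence.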

TSI: an Ad Text Strength Indicator using Text-to-CTR and Semantic-Ad-Similarity

no code implementations18 Aug 2021 Shaunak Mishra, Changwei Hu, Manisha Verma, Kevin Yen, Yifan Hu, Maxim Sviridenko

To realize this opportunity, we propose an ad text strength indicator (TSI) which: (i) predicts the click-through rate (CTR) for an input ad text, (ii) fetches similar existing ads to create a neighborhood around the input ad, and (iii) compares the predicted CTRs in the neighborhood to declare whether the input ad is strong or weak.

Click-Through Rate Prediction · Retrieval +2
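Step (iii) of the pipeline can be sketched in isolation as a comparison against the neighborhood's CTRs. The function below is a hypothetical illustration: the CTR predictor from step (i) and the semantic-similarity retrieval from step (ii) are assumed to exist upstream, and the median-plus-margin rule is an assumption, not the paper's exact decision rule.

```python
import numpy as np

def ad_strength(input_ctr, neighbor_ctrs, margin=0.0):
    # Declare the input ad "strong" if its predicted CTR beats the
    # median predicted CTR of its semantic neighborhood by `margin`.
    # (Hypothetical rule for illustration only.)
    baseline = float(np.median(neighbor_ctrs))
    return "strong" if input_ctr > baseline + margin else "weak"
```

Using a neighborhood baseline rather than a global one controls for the vertical and query context the ad competes in.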

VisualTextRank: Unsupervised Graph-based Content Extraction for Automating Ad Text to Image Search

no code implementations5 Aug 2021 Shaunak Mishra, Mikhail Kuznetsov, Gaurav Srivastava, Maxim Sviridenko

Motivated by our observations in logged data on ad image search queries (given ad text), we formulate a keyword extraction problem, where a keyword extracted from the ad text (or its augmented version) serves as the ad image query.

Image Retrieval · Keyword Extraction +2
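To make the keyword-extraction formulation concrete, here is a generic graph-based scorer: build a word co-occurrence graph over the ad text and pick the highest-degree word as the image-search query. This is a textbook baseline, not the paper's VisualTextRank algorithm, and the stopword list is an arbitrary placeholder.

```python
from collections import defaultdict

def extract_keyword(text, stopwords=frozenset({"the", "a", "for", "of", "and", "to"})):
    # Degree centrality on a word co-occurrence graph (sliding window
    # of 3 tokens). A generic graph-based baseline for illustration.
    words = [w.strip(".,!?").lower() for w in text.split()]
    words = [w for w in words if w and w not in stopwords]
    degree = defaultdict(int)
    for i, w in enumerate(words):
        for u in words[i + 1:i + 3]:
            if u != w:
                degree[w] += 1
                degree[u] += 1
    return max(degree, key=degree.get) if degree else None
```

The extracted word would then be issued as the ad image search query in place of the full ad text.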

Local Search Algorithms for Rank-Constrained Convex Optimization

no code implementations ICLR 2021 Kyriakos Axiotis, Maxim Sviridenko

We propose greedy and local search algorithms for rank-constrained convex optimization, namely solving $\underset{\mathrm{rank}(A)\leq r^*}{\min}\, R(A)$ given a convex function $R:\mathbb{R}^{m\times n}\rightarrow \mathbb{R}$ and a parameter $r^*$.

Matrix Completion
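A natural point of comparison for the rank-constrained problem is projected gradient descent, where the projection onto $\{\mathrm{rank}(A) \leq r^*\}$ is a truncated SVD. The sketch below shows that baseline, not the greedy and local search algorithms the paper proposes; the default step size assumes unit-curvature quadratic $R$.

```python
import numpy as np

def rank_project(A, r):
    # Best rank-r approximation of A via truncated SVD.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s[r:] = 0.0
    return (U * s) @ Vt

def rank_constrained_descent(grad_R, shape, r, lr=1.0, iters=50):
    # Projected gradient descent for min_{rank(A) <= r} R(A), given the
    # gradient of R. A baseline for comparison, not the paper's method.
    A = np.zeros(shape)
    for _ in range(iters):
        A = rank_project(A - lr * grad_R(A), r)
    return A
```

For $R(A) = \tfrac12\|A - M\|_F^2$ with $M$ of rank at most $r$, this recovers $M$ exactly, since the gradient step lands on $M$ and the projection is a no-op.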

Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

no code implementations ICML 2020 Kyriakos Axiotis, Maxim Sviridenko

We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma=O(\kappa)$, which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT.

On the computational complexity of the probabilistic label tree algorithms

no code implementations1 Jun 2019 Robert Busa-Fekete, Krzysztof Dembczynski, Alexander Golovnev, Kalina Jasinska, Mikhail Kuznetsov, Maxim Sviridenko, Chao Xu

First, we show that finding a tree with optimal training cost is NP-complete; nevertheless, there are tractable special cases with either a perfect approximation or an exact solution obtainable in time linear in the number of labels $m$.

Multi-class Classification

An Algorithm for Online K-Means Clustering

no code implementations18 Dec 2014 Edo Liberty, Ram Sriharsha, Maxim Sviridenko

We also show that, experimentally, it is not much worse than k-means++ while operating in a strictly more constrained computational model.

Clustering
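For reference, here is the textbook streaming k-means update, in which each arriving point moves its nearest center by a $1/\text{count}$ step. This is the standard online baseline, not the paper's algorithm, which additionally decides when to open new centers under its constrained model.

```python
import numpy as np

class OnlineKMeans:
    # Classic online k-means sketch: the first k points seed the
    # centers; each later point nudges its nearest center toward
    # itself with a shrinking 1/count step size.
    def __init__(self, k):
        self.k = k
        self.centers = []
        self.counts = []

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if len(self.centers) < self.k:
            self.centers.append(x.copy())
            self.counts.append(1)
            return len(self.centers) - 1
        dists = [np.linalg.norm(x - c) for c in self.centers]
        j = int(np.argmin(dists))
        self.counts[j] += 1
        self.centers[j] += (x - self.centers[j]) / self.counts[j]
        return j
```

Each point is processed once in constant memory, which is the "strictly more constrained computational model" the abstract contrasts with k-means++.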
