no code implementations • 28 Apr 2023 • Frank E. Curtis, Vyacheslav Kungurtsev, Daniel P. Robinson, Qi Wang
A stochastic-gradient-based interior-point algorithm for minimizing a continuously differentiable objective function (that may be nonconvex) subject to bound constraints is presented, analyzed, and demonstrated through experimental results.
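The general flavor of such a method can be illustrated with a minimal sketch (not the authors' algorithm): replace the bound constraints with a log-barrier term, take stochastic-gradient steps on the barrier subproblem, and shrink the barrier parameter over time. All function names, step sizes, and tolerances below are illustrative placeholders.

```python
import numpy as np

def stochastic_barrier_descent(grad_est, l, u, x0, mu0=1.0, mu_decay=0.5,
                               outer_iters=10, inner_iters=200, step=1e-2):
    """Minimal log-barrier + stochastic-gradient sketch for
    min f(x) s.t. l <= x <= u (illustrative only, not the paper's method)."""
    x, mu = x0.copy(), mu0
    for _ in range(outer_iters):
        for _ in range(inner_iters):
            g = grad_est(x)                               # stochastic estimate of grad f(x)
            g_barrier = g - mu / (x - l) + mu / (u - x)   # gradient of the barrier terms
            x = x - step * g_barrier
            # keep iterates strictly interior to the bounds
            x = np.clip(x, l + 1e-8 * (u - l), u - 1e-8 * (u - l))
        mu *= mu_decay                                    # drive the barrier parameter toward zero
    return x

# toy usage: minimize E[(x - z)^2], z ~ N(0.8, 0.1), subject to 0 <= x <= 0.5
rng = np.random.default_rng(0)
grad = lambda x: 2.0 * (x - (0.8 + 0.1 * rng.standard_normal(x.shape)))
x_star = stochastic_barrier_descent(grad, np.zeros(1), 0.5 * np.ones(1), np.array([0.25]))
```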
no code implementations • 6 Oct 2021 • Yunchen Yang, Xinyue Zhang, Tianjiao Ding, Daniel P. Robinson, Rene Vidal, Manolis C. Tsakiris
In this paper, we revisit the problem of local optimization in RANSAC.
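As background, the toy sketch below shows the basic RANSAC loop with a simple local-optimization step (a least-squares re-fit on the current consensus set) for 2-D line fitting; it illustrates the generic idea only, and the model, threshold, and iteration count are placeholders rather than anything specific to the paper.

```python
import numpy as np

def ransac_lo(points, n_iters=500, thresh=0.05, rng=np.random.default_rng(0)):
    """Toy RANSAC for 2-D line fitting (y = a*x + b) with a simple
    local-optimization step: re-fit by least squares on the current inliers."""
    best_model, best_inliers = None, np.zeros(len(points), dtype=bool)
    x, y = points[:, 0], points[:, 1]
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        if np.isclose(x[i], x[j]):
            continue
        a = (y[j] - y[i]) / (x[j] - x[i]); b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < thresh
        if inliers.sum() > best_inliers.sum():
            # local optimization: least-squares re-fit on the consensus set
            A = np.column_stack([x[inliers], np.ones(inliers.sum())])
            a, b = np.linalg.lstsq(A, y[inliers], rcond=None)[0]
            inliers = np.abs(y - (a * x + b)) < thresh
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers
```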
1 code implementation • 24 Jun 2021 • Albert S. Berahas, Frank E. Curtis, Michael J. O'Neill, Daniel P. Robinson
A sequential quadratic optimization algorithm is proposed for solving smooth nonlinear equality constrained optimization problems in which the objective function is defined by an expectation of a stochastic function.
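A hedged sketch of the basic building block, assuming exact constraint values and Jacobians but only a stochastic gradient estimate, is shown below: it computes a single step from the textbook SQP/KKT linear system. It is not the algorithm proposed in the paper, just the subproblem such methods build on, and the step size and Hessian model here are placeholders.

```python
import numpy as np

def stochastic_sqp_step(x, grad_est, c, J, H=None):
    """One simplified SQP step for min f(x) s.t. c(x) = 0, using a stochastic
    gradient estimate g ~ grad f(x).  Illustrative sketch only; assumes J has
    full row rank and solves  [H  J^T; J  0] [d; y] = [-g; -c(x)]."""
    g = grad_est(x)
    n, m = len(x), len(c)
    H = np.eye(n) if H is None else H                 # simple positive-definite model
    K = np.block([[H, J.T], [J, np.zeros((m, m))]])
    rhs = np.concatenate([-g, -c])
    d = np.linalg.solve(K, rhs)[:n]                   # primal search direction
    return x + 0.5 * d                                # fixed step size for illustration
```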
1 code implementation • 29 Jul 2020 • Frank E. Curtis, Yutong Dai, Daniel P. Robinson
We consider the problem of minimizing an objective function that is the sum of a convex function and a group sparsity-inducing regularizer.
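One standard approach to such composite problems, shown below purely for illustration, is a proximal-gradient iteration in which the proximal operator of the group regularizer reduces to block-wise soft-thresholding; the function names and parameter values are placeholders, and this is not the specific method developed in the paper.

```python
import numpy as np

def prox_group_l2(v, groups, lam, step):
    """Proximal operator of step*lam*sum_g ||v_g||_2 (block soft-thresholding)."""
    out = v.copy()
    for g in groups:                                  # each g is an array of indices
        norm_g = np.linalg.norm(v[g])
        scale = max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
        out[g] = scale * v[g]
    return out

def proximal_gradient(grad_f, x0, groups, lam, step=1e-2, iters=500):
    """Minimize f(x) + lam*sum_g ||x_g||_2 by proximal gradient (illustrative)."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_group_l2(x - step * grad_f(x), groups, lam, step)
    return x
```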
Subjects: Optimization and Control (math.OC); MSC classes: 49M37, 65K05, 65K10, 65Y20, 68Q25, 90C30, 90C60
1 code implementation • 20 Jul 2020 • Albert Berahas, Frank E. Curtis, Daniel P. Robinson, Baoyu Zhou
It is assumed in this setting that it is intractable to compute objective function and derivative values explicitly, although one can compute stochastic function and gradient estimates.
no code implementations • 7 Jun 2020 • Chong You, Chi Li, Daniel P. Robinson, Rene Vidal
When the dataset is drawn from a union of independent subspaces, our method is able to select sufficiently many representatives from each subspace.
no code implementations • ICCV 2019 • Chong You, Chun-Guang Li, Daniel P. Robinson, Rene Vidal
Specifically, our analysis provides conditions that guarantee the correctness of affine subspace clustering methods both with and without the affine constraint, and shows that these conditions are satisfied for high-dimensional data.
no code implementations • 30 Dec 2019 • Daniel P. Robinson, Rene Vidal, Chong You
The goal is to have the representation $c$ correctly identify the subspace, i.e., the nonzero entries of $c$ should correspond to columns of $A$ that lie in the subspace $\mathcal{S}_0$.
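A small numerical illustration of this setup is sketched below under assumed data (two random 2-D subspaces in $\mathbb{R}^{20}$, with the sparse representation computed by an off-the-shelf lasso solver); whether the support of $c$ is subspace-preserving in general is exactly what the paper's conditions characterize, so the expected outcome here is only a hope, not a guarantee.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy check of a subspace-preserving representation: columns 0-4 of A span a 2-D
# subspace S_0, columns 5-9 span another; x lies in S_0, and ideally the sparse
# coefficients c concentrate on the first block.
rng = np.random.default_rng(0)
U0, U1 = rng.standard_normal((20, 2)), rng.standard_normal((20, 2))
A = np.column_stack([U0 @ rng.standard_normal((2, 5)), U1 @ rng.standard_normal((2, 5))])
A /= np.linalg.norm(A, axis=0)                     # unit-norm columns
x = U0 @ rng.standard_normal(2)                    # a point in S_0

c = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000).fit(A, x).coef_
print(np.round(c, 3))   # nonzeros should (ideally) appear only in entries 0-4
```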
no code implementations • 12 Oct 2019 • Mustafa D. Kaba, Mengnan Zhao, Rene Vidal, Daniel P. Robinson, Enrique Mallada
In the case of the partial discrete Fourier transform, our characterization of the largest sparsity pattern that can be recovered requires the unknown signal to be real and its dimension to be a prime number.
no code implementations • 2 Aug 2019 • Guilherme França, Daniel P. Robinson, René Vidal
We show that similar discretization schemes applied to Newton's equation with an additional dissipative force, which we refer to as accelerated gradient flow, allow us to obtain accelerated variants of all these proximal algorithms -- the majority of which are new although some recover known cases in the literature.
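A minimal sketch of the pattern, assuming a composite objective $f + g$ with $f$ smooth and $g$ having an inexpensive proximal operator, is given below: an explicit momentum-type discretization of the dissipative dynamics combined with a prox step. It is only one simple instance of the idea, not the family of schemes derived in the paper, and the parameter values are illustrative.

```python
import numpy as np

def heavy_ball_prox(grad_f, prox_g, x0, alpha=0.05, beta=0.9, iters=300):
    """Momentum-type proximal iteration obtained from an explicit discretization of
    x'' + gamma*x' = -grad f(x), plus a prox step for a nonsmooth term g
    (a generic sketch, not the specific schemes derived in the paper)."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        v = x - x_prev                                        # discrete velocity
        x_prev, x = x, prox_g(x + beta * v - alpha * grad_f(x), alpha)
    return x

# toy usage: lasso-type problem  min 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)   # soft-threshold
x_hat = heavy_ball_prox(grad_f, prox_g, np.zeros(10), alpha=1.0 / np.linalg.norm(A, 2) ** 2)
```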
1 code implementation • NeurIPS 2020 • Guilherme França, Jeremias Sulam, Daniel P. Robinson, René Vidal
Arguably, the two most popular accelerated or momentum-based optimization methods in machine learning are Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different discretizations of a particular second-order differential equation with friction.
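For reference, with step size $\alpha$, momentum parameter $\beta$, and friction coefficient $\gamma$, the underlying dynamics and the two standard updates are:

$$\ddot{x}(t) + \gamma\,\dot{x}(t) + \nabla f(x(t)) = 0,$$

$$\text{heavy ball:}\quad x_{k+1} = x_k + \beta\,(x_k - x_{k-1}) - \alpha\,\nabla f(x_k),$$

$$\text{Nesterov:}\quad y_k = x_k + \beta\,(x_k - x_{k-1}),\qquad x_{k+1} = y_k - \alpha\,\nabla f(y_k).$$

The only difference is whether the gradient is evaluated at the current iterate or at the extrapolated point, which is what makes them distinct discretizations of the same continuous dynamics.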
no code implementations • 24 Dec 2018 • Zhihui Zhu, Yifan Wang, Daniel P. Robinson, Daniel Q. Naiman, Rene Vidal, Manolis C. Tsakiris
However, its geometric analysis is based on quantities that are difficult to interpret and are not amenable to statistical analysis.
no code implementations • ECCV 2018 • Chong You, Chi Li, Daniel P. Robinson, Rene Vidal
Our experiments demonstrate that the proposed method outperforms state-of-the-art subspace clustering methods on two large-scale, imbalanced image datasets.
no code implementations • 13 Aug 2018 • Guilherme França, Daniel P. Robinson, René Vidal
Recently, there has been great interest in connections between continuous-time dynamical systems and optimization methods, notably in the context of accelerated methods for smooth and unconstrained problems.
no code implementations • CVPR 2017 • Chong You, Daniel P. Robinson, René Vidal
While outlier detection methods based on robust statistics have existed for decades, only recently have methods based on sparse and low-rank representation been developed along with guarantees of correct outlier detection when the inliers lie in one or more low-dimensional subspaces.
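One way such self-representation ideas can be turned into an outlier score is sketched below, with arbitrary parameter choices and not necessarily the exact procedure analyzed in the paper: express each point as a sparse combination of the others, then run a random walk on the resulting representation graph and flag points that receive little probability mass.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def self_representation_outlier_scores(X, n_nonzero=3, n_steps=100):
    """Illustrative self-representation-based outlier scoring: columns of X are data
    points; small scores indicate points that other points rarely use, i.e. likely outliers."""
    n = X.shape[1]
    C = np.zeros((n, n))
    for j in range(n):
        idx = [i for i in range(n) if i != j]
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                        fit_intercept=False).fit(X[:, idx], X[:, j])
        C[idx, j] = omp.coef_
    A = np.abs(C) + np.abs(C).T                                  # symmetric affinity
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)      # row-stochastic transitions
    pi = np.full(n, 1.0 / n)
    for _ in range(n_steps):
        pi = pi @ P                                              # power iteration toward stationarity
    return pi
```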
1 code implementation • CVPR 2016 • Chong You, Chun-Guang Li, Daniel P. Robinson, Rene Vidal
Our geometric analysis also provides a theoretical justification and a geometric interpretation for the balance between the connectedness (due to $\ell_2$ regularization) and subspace-preserving (due to $\ell_1$ regularization) properties for elastic net subspace clustering.
Ranked #7 on Image Clustering on coil-100 (Accuracy metric)
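A hedged sketch of the elastic net self-expressive pipeline described above, using off-the-shelf solvers and placeholder parameter values (not the scalable solver developed in the paper), could look like the following; the `l1_ratio` parameter plays the role of the balance between the $\ell_1$ and $\ell_2$ penalties discussed in the abstract.

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.cluster import SpectralClustering

def elastic_net_subspace_clustering(X, n_clusters, alpha=0.05, l1_ratio=0.9):
    """Illustrative elastic-net self-expressive clustering: represent each column of X
    using the other columns with an l1 + l2 penalty, then spectrally cluster the
    resulting affinity.  Parameter values are placeholders, not the paper's choices."""
    n = X.shape[1]
    C = np.zeros((n, n))
    for j in range(n):
        idx = [i for i in range(n) if i != j]
        model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, fit_intercept=False, max_iter=5000)
        C[idx, j] = model.fit(X[:, idx], X[:, j]).coef_
    W = np.abs(C) + np.abs(C).T                      # symmetric affinity matrix
    return SpectralClustering(n_clusters=n_clusters, affinity='precomputed').fit_predict(W)
```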
no code implementations • 20 Apr 2016 • Daniel P. Robinson, Suchi Saria
For the challenging real-world application of risk prediction for sepsis in intensive care units, the use of our regularizer leads to models that are in harmony with the underlying cost structure and thus provide an excellent prediction accuracy versus cost tradeoff.
2 code implementations • CVPR 2016 • Chong You, Daniel P. Robinson, Rene Vidal
Subspace clustering methods based on $\ell_1$, $\ell_2$ or nuclear norm regularization have become very popular due to their simplicity, theoretical guarantees and empirical success.
Ranked #6 on Image Clustering on Extended Yale-B