no code implementations • ICML 2020 • Alper Atamturk, Andres Gomez
We give safe screening rules to eliminate variables from regression with L0 regularization or cardinality constraint.
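For reference, the two sparse regression variants named above are standard and can be written as follows (our notation, not necessarily the paper's):

```latex
% L0-regularized least squares; lambda > 0 trades fit against sparsity
\min_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_0

% Cardinality-constrained variant: at most k nonzero coefficients
\min_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2
  \quad \text{s.t.} \quad \|\beta\|_0 \le k
```

Safe screening rules fix some variables $\beta_i$ to zero (or to nonzero) before solving, without changing the optimal solution.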
no code implementations • 25 Jul 2023 • Salar Fattahi, Andres Gomez
More specifically, we show that the entire solution path of the time-varying MRF for all sparsity levels can be obtained in $\mathcal{O}(pT^3)$, where $T$ is the number of time steps and $p$ is the number of unknown parameters at any given time.
(Here the solution path is parametric in the sparsity level, so a single $\mathcal{O}(pT^3)$ computation covers every choice of sparsity.)
no code implementations • AAAI Workshop AdvML 2022 • Nathan Justin, Sina Aghaei, Andres Gomez, Phebe Vayanos
In many high-stakes domains, the data used to drive machine learning algorithms is noisy (due, e.g., to the sensitive nature of the data being collected, limited resources available to validate the data, etc.).
no code implementations • 5 Aug 2021 • Andres Gomez, Andreas Tretter, Pascal Alexander Hager, Praveenth Sanmugarajah, Luca Benini, Lothar Thiele
By leveraging interkernel data dependencies, these energy-bounded execution cycles minimize the number of system activations and nonvolatile data transfers, and thus the total energy overhead.
no code implementations • NeurIPS 2021 • Salar Fattahi, Andres Gomez
Most of the existing methods for the inference of time-varying Markov random fields (MRFs) rely on \textit{regularized maximum likelihood estimation} (MLE), which typically suffers from weak statistical guarantees and high computational cost.
no code implementations • NeurIPS 2021 • Salar Fattahi, Andres Gomez
In this paper, we study the problem of inferring time-varying Markov random fields (MRF), where the underlying graphical model is both sparse and changes sparsely over time.
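A generic estimator capturing both priors — sparsity within each time step and sparse change across consecutive steps — might take the following form (a sketch in our own notation, not necessarily the paper's exact formulation):

```latex
% Theta_t: graphical-model parameters at time t; ell: data-fidelity loss
% s bounds per-step sparsity, d bounds how much the graph changes per step
\min_{\Theta_1,\dots,\Theta_T} \; \sum_{t=1}^{T} \ell(\Theta_t)
  \quad \text{s.t.} \quad \|\Theta_t\|_0 \le s \;\; \forall t,
  \qquad \|\Theta_t - \Theta_{t-1}\|_0 \le d \;\; \forall t \ge 2
```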
no code implementations • 29 Dec 2020 • Alper Atamturk, Andres Gomez
We show that the convex hull of the epigraph of the quadratic can be obtained from inequalities for the underlying supermodular set function by lifting them into nonlinear inequalities in the original space of variables.
no code implementations • 30 Jun 2020 • Linchuan Wei, Andres Gomez, Simge Kucukyavuz
Motivated by modern regression applications, in this paper, we study the convexification of a class of convex optimization problems with indicator variables and combinatorial constraints on the indicators.
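For concreteness, a prototypical optimization problem with indicator variables links each continuous variable $x_i$ to a binary $z_i$ that switches it on or off (a generic template, not the paper's specific problem class):

```latex
% f: convex objective; c_i: fixed cost of activating variable x_i
% Z encodes combinatorial constraints on the indicators z
\min_{x,\, z} \; f(x) + \sum_{i=1}^{n} c_i z_i
  \quad \text{s.t.} \quad x_i (1 - z_i) = 0 \;\; \forall i,
  \qquad z \in \{0,1\}^n \cap Z
```

Convexification replaces this nonconvex feasible region with a tractable convex description whose relaxation is as tight as possible.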
no code implementations • 21 Feb 2020 • Sina Aghaei, Andres Gomez, Phebe Vayanos
To fill this gap in the literature, we propose a flow-based MIP formulation for optimal binary classification trees that has a stronger linear programming relaxation.
no code implementations • 29 Jan 2019 • Alper Atamturk, Andres Gomez
Sparse regression models are increasingly prevalent due to their ease of interpretability and superior out-of-sample performance.
no code implementations • 6 Nov 2018 • Alper Atamturk, Andres Gomez, Shaoning Han
Signal estimation problems with smoothness and sparsity priors can be naturally modeled as quadratic optimization with $\ell_0$-"norm" constraints.
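Such a model combines a quadratic fidelity term, a smoothness penalty on consecutive differences, and an $\ell_0$-"norm" constraint; one illustrative instance (our notation, not taken from the paper) is:

```latex
% y: noisy observations; mu > 0 weights smoothness; k bounds support size
\min_{x \in \mathbb{R}^n} \; \|y - x\|_2^2
  + \mu \sum_{i=1}^{n-1} (x_{i+1} - x_i)^2
  \quad \text{s.t.} \quad \|x\|_0 \le k
```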
no code implementations • 20 Apr 2017 • Andres Gomez, Camilo Lara, Udo Kebschull
A remarkable example of a Grid in High Energy Physics (HEP) research is the one used by the ALICE experiment at the European Organization for Nuclear Research (CERN).