Search Results for author: Malgorzata Bogdan

Found 7 papers, 3 papers with code

Structure Learning of Gaussian Markov Random Fields with False Discovery Rate Control

no code implementations • 24 Oct 2019 • Sangkyun Lee, Piotr Sobczyk, Malgorzata Bogdan

Adapting the sorted $\ell_1$ (SL1) penalty to probabilistic graphical models, we show that SL1 can be used for structure learning of Gaussian MRFs via our proposed procedure nsSLOPE (neighborhood selection Sorted L-One Penalized Estimation), which controls the FDR of edge detection.

Model Selection

Adaptive Bayesian SLOPE -- High-dimensional Model Selection with Missing Values

3 code implementations • 14 Sep 2019 • Wei Jiang, Malgorzata Bogdan, Julie Josse, Blazej Miasojedow, Veronika Rockova, Traumabase group

We consider the problem of variable selection in high-dimensional settings with missing observations among the covariates.

Methodology • Applications • Computation

High-dimensional robust regression and outliers detection with SLOPE

no code implementations • 7 Dec 2017 • Alain Virouleau, Agathe Guilloux, Stéphane Gaïffas, Malgorzata Bogdan

Following a recent set of works providing methods for simultaneous robust regression and outlier detection, we consider in this paper a model of linear regression with individual intercepts, in a high-dimensional setting.

Regression

Sparse Portfolio Selection via the sorted $\ell_{1}$-Norm

no code implementations • 6 Oct 2017 • Philipp J. Kremer, Sangkyun Lee, Malgorzata Bogdan, Sandra Paterlini

We introduce a financial portfolio optimization framework that allows us to automatically select the relevant assets and estimate their weights by relying on a sorted $\ell_1$-Norm penalization, henceforth SLOPE.

Portfolio Optimization
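The sorted $\ell_1$-norm penalty behind SLOPE is $J_\lambda(\beta) = \sum_i \lambda_i |\beta|_{(i)}$, where $|\beta|_{(1)} \ge \dots \ge |\beta|_{(p)}$ are the magnitudes in decreasing order and $\lambda$ is nonincreasing. A minimal evaluation sketch in NumPy (illustrative names, not the authors' code):

```python
import numpy as np

def sorted_l1_norm(beta, lam):
    """Sorted l1 norm: sum_i lam[i] * |beta|_(i), with |beta| sorted in
    decreasing order and lam assumed nonincreasing and nonnegative."""
    abs_sorted = np.sort(np.abs(beta))[::-1]  # magnitudes, largest first
    return float(np.dot(lam, abs_sorted))

# e.g. beta = [3, -1, 2], lam = [3, 2, 1]:
# sorted magnitudes [3, 2, 1] dotted with lam gives 9 + 4 + 1 = 14
```

Note that with all weights equal, the penalty reduces to an ordinary (scaled) $\ell_1$ norm; the adaptivity comes from larger weights being matched to larger coefficients.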

Group SLOPE - adaptive selection of groups of predictors

1 code implementation • 17 Oct 2016 • Damian Brzyski, Alexej Gossmann, Weijie Su, Malgorzata Bogdan

Sorted L-One Penalized Estimation (SLOPE) is a relatively new convex optimization procedure which allows for adaptive selection of regressors under sparse high dimensional designs.

Methodology • 46N10 • G.1.6
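In practice, SLOPE-type estimators are fit by proximal methods, whose key step is the proximal operator of the sorted $\ell_1$ norm: sort magnitudes, subtract the sorted weights, project onto the nonincreasing cone (pool adjacent violators), clip at zero, and undo the sort. The sketch below is an illustrative reimplementation of this standard recipe, not the authors' code:

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Prox of J(b) = sum_i lam[i] * |b|_(i), lam nonincreasing, nonnegative.
    Illustrative sketch of the sort / subtract / isotonic-projection recipe."""
    sign = np.sign(y)
    abs_y = np.abs(y)
    order = np.argsort(abs_y)[::-1]     # indices sorting |y| in decreasing order
    z = abs_y[order] - lam              # subtract matched weights
    # Pool-adjacent-violators: project z onto nonincreasing sequences by
    # merging adjacent blocks whenever their means violate monotonicity.
    sums, counts = [], []
    for x in z:
        s, c = x, 1.0
        while sums and sums[-1] / counts[-1] <= s / c:
            s += sums.pop()
            c += counts.pop()
        sums.append(s)
        counts.append(c)
    proj = np.concatenate([np.full(int(c), s / c) for s, c in zip(sums, counts)])
    proj = np.maximum(proj, 0.0)        # magnitudes cannot be negative
    out = np.empty_like(abs_y)
    out[order] = proj                   # undo the sort
    return sign * out
```

With all weights equal to a constant $t$, this reduces to ordinary soft-thresholding at level $t$, which is a convenient sanity check.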

Fast Saddle-Point Algorithm for Generalized Dantzig Selector and FDR Control with the Ordered l1-Norm

no code implementations • 18 Nov 2015 • Sangkyun Lee, Damian Brzyski, Malgorzata Bogdan

In this paper we propose a primal-dual proximal extragradient algorithm to solve the generalized Dantzig selector (GDS) estimation problem, based on a new convex-concave saddle-point (SP) reformulation.

Variable Selection

False Discoveries Occur Early on the Lasso Path

3 code implementations • 5 Nov 2015 • Weijie Su, Malgorzata Bogdan, Emmanuel Candes

In regression settings where explanatory variables have very low correlations and there are relatively few effects, each of large magnitude, we expect the Lasso to find the important variables with few errors, if any.
