Search Results for author: Martin Slawski

Found 18 papers, 0 papers with code

Permuted and Unlinked Monotone Regression in $\mathbb{R}^d$: an approach based on mixture modeling and optimal transport

no code implementations · 10 Jan 2022 · Martin Slawski, Bodhisattva Sen

We study permutation recovery in the permuted regression setting and develop a computationally efficient and easy-to-use algorithm for denoising based on the Kiefer-Wolfowitz [Ann. Math. Statist., 1956] nonparametric maximum likelihood estimator.

Denoising, Math, +1 more
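The one-line summary gestures at the method: fit a Gaussian location mixture to the unlinked responses via the Kiefer-Wolfowitz NPMLE, then denoise each observation by its posterior mean. Below is a minimal grid-based EM sketch of that idea, assuming a known noise level sigma; the function name, grid size, and iteration count are illustrative, not the authors' implementation.

```python
import numpy as np

def kw_npmle_denoise(y, sigma=1.0, n_grid=200, n_iter=200):
    """Grid-based EM for the Kiefer-Wolfowitz NPMLE of a Gaussian location
    mixture, followed by empirical-Bayes (posterior-mean) denoising."""
    grid = np.linspace(y.min(), y.max(), n_grid)  # candidate atoms of the mixing measure
    L = np.exp(-0.5 * ((y[:, None] - grid[None, :]) / sigma) ** 2)  # likelihood matrix
    w = np.full(n_grid, 1.0 / n_grid)             # mixing weights, initialized uniform
    for _ in range(n_iter):
        post = L * w                              # E-step: posterior over atoms
        post /= post.sum(axis=1, keepdims=True)
        w = post.mean(axis=0)                     # M-step: reweight atoms
    post = L * w
    post /= post.sum(axis=1, keepdims=True)
    return post @ grid                            # E[mu_i | y_i] for each observation

rng = np.random.default_rng(0)
mu = rng.choice([-2.0, 2.0], size=500)            # unlinked "true" responses
y = mu + rng.normal(size=500)
denoised = kw_npmle_denoise(y)
```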

Regularization for Shuffled Data Problems via Exponential Family Priors on the Permutation Group

no code implementations · 2 Nov 2021 · Zhenbang Wang, Emanuel Ben-David, Martin Slawski

In the analysis of data sets consisting of (X, Y)-pairs, a tacit assumption is that each pair corresponds to the same observation unit.

Asynchronous Online Federated Learning for Edge Devices with Non-IID Data

no code implementations · 5 Nov 2019 · Yujing Chen, Yue Ning, Martin Slawski, Huzefa Rangwala

In this paper, we present an Asynchronous Online Federated Learning (ASO-Fed) framework, where the edge devices perform online learning with continuous streaming local data and a central server aggregates model parameters from clients.

Federated Learning
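To make the asynchronous setup concrete, here is a toy simulation: clients run online SGD on streaming, non-IID local data and report back one at a time, and the server mixes each update into the global model immediately rather than waiting for a synchronized round. The least-squares task, the per-client covariate shift, and the server mixing weight beta are all illustrative stand-ins for the paper's protocol.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_clients, beta = 5, 4, 0.3                    # beta: server mixing weight (assumed)
w_true = rng.normal(size=d)
shifts = rng.normal(size=(n_clients, d))          # per-client covariate shift -> non-IID
w_global = np.zeros(d)

def client_update(w, shift, lr=0.1, steps=5):
    """Online SGD on one client's streaming least-squares data."""
    w = w.copy()
    for _ in range(steps):
        x = rng.normal(size=d) + shift            # a fresh streaming sample
        y = x @ w_true + 0.1 * rng.normal()
        w -= lr * (x @ w - y) * x                 # squared-loss gradient step
    return w

# Asynchronous aggregation: whichever client finishes next is mixed in
# immediately, rather than waiting for all clients.
for t in range(300):
    k = rng.integers(n_clients)
    w_global = (1 - beta) * w_global + beta * client_update(w_global, shifts[k])
```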

A Pseudo-Likelihood Approach to Linear Regression with Partially Shuffled Data

no code implementations · 3 Oct 2019 · Martin Slawski, Guoqing Diao, Emanuel Ben-David

In this paper, we present a method to adjust for such mismatches under "partial shuffling" in which a sufficiently large fraction of (predictors, response)-pairs are observed in their correct correspondence.

Data Integration, Regression

The Benefits of Diversity: Permutation Recovery in Unlabeled Sensing from Multiple Measurement Vectors

no code implementations · 5 Sep 2019 · Hang Zhang, Martin Slawski, Ping Li

For the case in which both the signal and permutation are unknown, the problem is reformulated as a bi-convex optimization problem with an auxiliary variable, which can be solved by the Alternating Direction Method of Multipliers (ADMM).
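The paper solves this bi-convex reformulation with ADMM; as a simpler stand-in that conveys the alternation, the sketch below alternates a least-squares step for the signal with a linear-assignment step for the permutation (via scipy's Hungarian solver). The function name and problem sizes are illustrative, and this is not the paper's ADMM.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def alternating_unlabeled_sensing(Y, B, n_iter=20):
    """Alternating minimization for Y ~ Pi @ B @ X with unknown permutation Pi:
    a least-squares step for the signal X, then a linear assignment step
    re-estimating the row correspondence."""
    n = Y.shape[0]
    perm = np.arange(n)                    # perm[i]: row of Y matched to row i of B @ X
    for _ in range(n_iter):
        X, *_ = np.linalg.lstsq(B, Y[perm], rcond=None)   # signal step
        cost = cdist(B @ X, Y, metric="sqeuclidean")      # assignment step
        _, perm = linear_sum_assignment(cost)
    return X, perm

rng = np.random.default_rng(0)
B = rng.normal(size=(60, 5))
X_true = rng.normal(size=(5, 8))                          # multiple measurement vectors
Y = rng.permutation(B @ X_true + 0.01 * rng.normal(size=(60, 8)))  # shuffled rows
X_hat, perm_hat = alternating_unlabeled_sensing(Y, B)
```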

A Two-Stage Approach to Multivariate Linear Regression with Sparsely Mismatched Data

no code implementations · 16 Jul 2019 · Martin Slawski, Emanuel Ben-David, Ping Li

A tacit assumption in linear regression is that (response, predictor)-pairs correspond to identical observational units.

Regression

A Note on Coding and Standardization of Categorical Variables in (Sparse) Group Lasso Regression

no code implementations · 17 May 2018 · Felicitas J. Detmer, Martin Slawski

Categorical regressor variables are usually handled by introducing a set of indicator variables, and imposing a linear constraint to ensure identifiability in the presence of an intercept, or equivalently, using one of various coding schemes.

Regression, Variable Selection
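A toy look at the codings in question, assuming pandas for the indicator expansion: dummy coding drops a reference level to restore identifiability, while column-centering the full indicator matrix makes every column orthogonal to the intercept, one common standardization choice of the kind the paper examines. The variable names are illustrative.

```python
import pandas as pd

# A categorical regressor with four levels, expanded into indicator columns.
g = pd.Series(["a", "b", "c", "a", "d", "b"], name="group")
full = pd.get_dummies(g, dtype=float)              # one indicator per level

# Dummy coding: drop a reference level so the model with an intercept
# remains identifiable.
dummy = pd.get_dummies(g, drop_first=True, dtype=float)

# Alternatively, keep all levels but center the columns; each indicator
# becomes orthogonal to the intercept, which acts as a linear constraint
# on the level effects.
centered = full - full.mean(axis=0)
```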

Simple strategies for recovering inner products from coarsely quantized random projections

no code implementations · NeurIPS 2017 · Ping Li, Martin Slawski

Random projections have been increasingly adopted for a diverse set of tasks in machine learning involving dimensionality reduction.

Data Compression, Dimensionality Reduction, +1 more
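A minimal sketch of the setting, assuming a Gaussian projection, a uniform b-bit quantizer with clipping range lim, and the naive plug-in inner-product estimate; the paper's contribution is simple strategies that improve on this plug-in, and all constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, bits = 1000, 400, 2
x, y = rng.normal(size=d), rng.normal(size=d)
x, y = x / np.linalg.norm(x), y / np.linalg.norm(y)   # work with unit vectors

S = rng.normal(size=(k, d))                           # Gaussian projection; E[(Sx)_i^2] = 1

def quantize(v, bits, lim=3.0):
    """Coarse uniform (mid-rise) scalar quantizer on [-lim, lim]."""
    levels = 2 ** bits
    step = 2 * lim / levels
    idx = np.clip(np.floor(v / step), -levels // 2, levels // 2 - 1)
    return (idx + 0.5) * step

qx, qy = quantize(S @ x, bits), quantize(S @ y, bits)
naive = (qx @ qy) / k      # plug-in estimate of <x, y> from quantized projections
print(naive, x @ y)
```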

Linear Regression with Sparsely Permuted Data

no code implementations · 16 Oct 2017 · Martin Slawski, Emanuel Ben-David

In this paper, we consider the situation of "permuted data" in which this basic correspondence has been lost.

Regression

On Principal Components Regression, Random Projections, and Column Subsampling

no code implementations · 23 Sep 2017 · Martin Slawski

In this paper, we present an analysis showing that for random projections satisfying a Johnson-Lindenstrauss embedding property, the prediction error in subsequent regression is close to that of PCR, at the expense of requiring a slightly larger number of random projections than principal components.

Dimensionality Reduction, Regression
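A quick numerical comparison in the spirit of that claim, assuming an ordinary least-squares fit after either k principal components or m = 2k Gaussian random projections; the factor 2 is an arbitrary stand-in for "slightly larger", and the errors reported are in-sample.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 50, 10
X = rng.normal(size=(n, d)) @ rng.normal(size=(d, d)) / np.sqrt(d)  # correlated design
beta = rng.normal(size=d)
y = X @ beta + 0.5 * rng.normal(size=n)

def fit_predict(Z, y):
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Z @ w

# PCR: regress on the top-k principal components.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pcr_pred = fit_predict(X @ Vt[:k].T, y)

# Random projections: regress on m = 2k projected features.
m = 2 * k
rp_pred = fit_predict(X @ rng.normal(size=(d, m)), y)

print(np.mean((pcr_pred - y) ** 2), np.mean((rp_pred - y) ** 2))
```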

Quantized Random Projections and Non-Linear Estimation of Cosine Similarity

no code implementations · NeurIPS 2016 · Ping Li, Michael Mitzenmacher, Martin Slawski

Random projections constitute a simple, yet effective technique for dimensionality reduction with applications in learning and search problems.

Dimensionality Reduction, LEMMA, +1 more
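In the extreme one-bit case the estimator is classical: two sign-quantized Gaussian projections agree with probability 1 - theta/pi, where theta is the angle between the vectors, and this inverts to a cosine estimate. Below is a sketch of that special case only; the paper treats general quantization levels and non-linear, MLE-type estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 500, 2000
x, y = rng.normal(size=d), rng.normal(size=d)
x, y = x / np.linalg.norm(x), y / np.linalg.norm(y)

S = rng.normal(size=(k, d))
sx, sy = np.sign(S @ x), np.sign(S @ y)       # one-bit quantized projections

# P[sign agreement] = 1 - theta/pi, so the empirical agreement rate
# yields an angle estimate and hence a cosine estimate.
agree = np.mean(sx == sy)
cos_hat = np.cos(np.pi * (1 - agree))
print(cos_hat, x @ y)
```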

Methods for Sparse and Low-Rank Recovery under Simplex Constraints

no code implementations · 2 May 2016 · Ping Li, Syama Sundar Rangapuram, Martin Slawski

The de-facto standard approach of promoting sparsity by means of $\ell_1$-regularization becomes ineffective in the presence of simplex constraints, i.e., the target is known to have non-negative entries summing up to a given constant.

Density Estimation, Portfolio Optimization, +1 more
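The ineffectiveness is immediate: on the simplex $\|x\|_1$ is constant, so an $\ell_1$ penalty cannot distinguish candidates. A standard building block for methods that work directly with the constraint set is Euclidean projection onto the simplex; here is a sketch of the well-known sort-based algorithm (in the Duchi et al. formulation), offered as background rather than as the paper's estimators.

```python
import numpy as np

def project_simplex(v, s=1.0):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = s},
    via the classical sort-and-threshold algorithm."""
    u = np.sort(v)[::-1]                        # sort entries in decreasing order
    css = np.cumsum(u) - s
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css)[0][-1]
    theta = css[rho] / (rho + 1.0)              # optimal shift
    return np.maximum(v - theta, 0.0)

x = project_simplex(np.array([0.4, 1.2, -0.3, 0.9]))
print(x, x.sum())   # non-negative entries summing to one; ||x||_1 == 1 always
```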

b-bit Marginal Regression

no code implementations · NeurIPS 2015 · Martin Slawski, Ping Li

We consider the problem of sparse signal recovery from $m$ linear measurements quantized to $b$ bits.

Quantization, Regression
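A minimal sketch of marginal regression in the one-bit (b = 1) case: rank coordinates by the marginal statistics |X^T y| and keep the top s. The problem sizes, the noise level, and the assumption that the sparsity s is known are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 500, 200, 5
X = rng.normal(size=(n, d))
beta = np.zeros(d)
beta[:s] = 1.0
y = np.sign(X @ beta + 0.1 * rng.normal(size=n))  # b = 1: keep only the sign

# Marginal regression: one matrix-vector product, then a top-s selection.
scores = np.abs(X.T @ y) / n
support_hat = np.argsort(scores)[-s:]
print(sorted(support_hat))                        # ideally {0, ..., s-1}
```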

Regularization-free estimation in trace regression with symmetric positive semidefinite matrices

no code implementations · NeurIPS 2015 · Martin Slawski, Ping Li, Matthias Hein

Over the past few years, trace regression models have received considerable attention in the context of matrix completion, quantum state tomography, and compressed sensing.

Matrix Completion, Quantum State Tomography, +1 more

Matrix factorization with Binary Components

no code implementations · NeurIPS 2013 · Martin Slawski, Matthias Hein, Pavlo Lutsik

Motivated by an application in computational biology, we consider low-rank matrix factorization with $\{0, 1\}$-constraints on one of the factors and optionally convex constraints on the second one.

LEMMA
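A naive alternating sketch of the {0,1}-constrained factorization D ~ T @ A: a least-squares update for the real-valued factor A, then an exhaustive search over {0,1}^r for each row of the binary factor T. The brute-force row update is only viable for small r and is a stand-in for the paper's algorithm, not a reproduction of it.

```python
import numpy as np
from itertools import product

def binary_factorization(D, r, n_iter=10):
    """Alternate: least squares for A given binary T, then pick each row of T
    from all 2^r binary candidates by residual error."""
    rng = np.random.default_rng(0)
    m, n = D.shape
    T = rng.integers(0, 2, size=(m, r)).astype(float)
    candidates = np.array(list(product([0.0, 1.0], repeat=r)))   # all 2^r rows
    for _ in range(n_iter):
        A, *_ = np.linalg.lstsq(T, D, rcond=None)
        errs = (candidates @ A)[:, None, :] - D[None, :, :]      # 2^r x m x n
        T = candidates[np.argmin((errs ** 2).sum(-1), axis=0)]   # best row per sample
    return T, A

rng = np.random.default_rng(1)
T_true = rng.integers(0, 2, size=(40, 3)).astype(float)
A_true = rng.normal(size=(3, 20))
D = T_true @ A_true + 0.05 * rng.normal(size=(40, 20))
T_hat, A_hat = binary_factorization(D, 3)
```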

Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization

no code implementations · 4 May 2012 · Martin Slawski, Matthias Hein

We show that for these designs, the performance of NNLS with regard to prediction and estimation is comparable to that of the lasso.

Sparse recovery by thresholded non-negative least squares

no code implementations · NeurIPS 2011 · Martin Slawski, Matthias Hein

Non-negative data are commonly encountered in numerous fields, making non-negative least squares regression (NNLS) a frequently used tool.

Regression
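A minimal sketch of the two-step recipe the title names, assuming scipy's NNLS solver and an arbitrary hard threshold of 0.5; the paper derives principled, data-driven thresholds rather than this fixed constant.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n, d, s = 100, 40, 4
X = rng.normal(size=(n, d))
beta = np.zeros(d)
beta[:s] = 2.0                                        # non-negative sparse target
y = X @ beta + 0.1 * rng.normal(size=n)

beta_nnls, _ = nnls(X, y)                             # step 1: NNLS, no regularizer
beta_thr = np.where(beta_nnls > 0.5, beta_nnls, 0.0)  # step 2: hard thresholding
print(np.nonzero(beta_thr)[0])                        # ideally {0, ..., s-1}
```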
