Search Results for author: Dan Feldman

Found 41 papers, 8 papers with code

ORBSLAM3-Enhanced Autonomous Toy Drones: Pioneering Indoor Exploration

no code implementations • 20 Dec 2023 • Murad Tukan, Fares Fares, Yotam Grufinkle, Ido Talmor, Loay Mualem, Vladimir Braverman, Dan Feldman

In response to this formidable challenge, we introduce a real-time autonomous indoor exploration system tailored for drones equipped with a monocular \emph{RGB} camera.

Provable Data Subset Selection For Efficient Neural Network Training

1 code implementation • 9 Mar 2023 • Murad Tukan, Samson Zhou, Alaa Maalouf, Daniela Rus, Vladimir Braverman, Dan Feldman

In this paper, we introduce the first algorithm to construct coresets for \emph{RBFNNs}, i.e., small weighted subsets that approximate the loss of the input data on any radial basis function network and thus approximate any function defined by an \emph{RBFNN} on the larger input data.

Efficient Neural Network
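The coreset property claimed in this abstract can be pictured numerically. The sketch below is not the paper's construction — it uses plain uniform sampling with weights n/m as a stand-in for the provable importance-sampling coreset, and an arbitrary small RBF network as the "query":

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 10_000, 2, 500
P = rng.normal(size=(n, d))

def rbfnn_loss(points, weights, centers, sigma=1.0):
    # Weighted sum of squared outputs of a single-hidden-layer RBF network
    # with the given centers (a toy stand-in for "any RBFNN query").
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    outputs = np.exp(-(dists / sigma) ** 2).sum(axis=1)
    return float(weights @ outputs ** 2)

# "Coreset": uniform sample of m points, each reweighted by n/m.
idx = rng.choice(n, size=m, replace=False)
Q, w = P[idx], np.full(m, n / m)

centers = rng.normal(size=(3, d))          # an arbitrary query network
full = rbfnn_loss(P, np.ones(n), centers)
approx = rbfnn_loss(Q, w, centers)
rel_err = abs(full - approx) / full        # small: Q approximates P's loss
```

The paper's contribution is that such a subset can be chosen with *provable* error guarantees; uniform sampling here merely illustrates the approximation being claimed.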

Deep Learning on Home Drone: Searching for the Optimal Architecture

1 code implementation • 21 Sep 2022 • Alaa Maalouf, Yotam Gurfinkel, Barak Diker, Oren Gal, Daniela Rus, Dan Feldman

We suggest the first system that runs real-time semantic segmentation via deep learning on a weak micro-computer, such as the \$15 Raspberry Pi Zero v2, attached to a toy-drone.

Real-Time Semantic Segmentation

Obstacle Aware Sampling for Path Planning

no code implementations • 8 Mar 2022 • Murad Tukan, Alaa Maalouf, Dan Feldman, Roi Poranne

While this approach is very simple, it can become costly when the obstacles are unknown, since samples hitting these obstacles are wasted.

New Coresets for Projective Clustering and Applications

1 code implementation • 8 Mar 2022 • Murad Tukan, Xuan Wu, Samson Zhou, Vladimir Braverman, Dan Feldman

$(j, k)$-projective clustering is the natural generalization of the family of $k$-clustering and $j$-subspace clustering problems.

Clustering, regression

Coresets for Data Discretization and Sine Wave Fitting

no code implementations • 6 Mar 2022 • Alaa Maalouf, Murad Tukan, Eric Price, Daniel Kane, Dan Feldman

The goal (e.g., for anomaly detection) is to approximate the $n$ points received so far in $P$ by a single frequency $\sin$, e.g. $\min_{c\in C}cost(P, c)+\lambda(c)$, where $cost(P, c)=\sum_{i=1}^n \sin^2(\frac{2\pi}{N} p_ic)$, $C\subseteq [N]$ is a feasible set of solutions, and $\lambda$ is a given regularization function.

Anomaly Detection
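The cost from this abstract can be evaluated directly. A toy instance, with two simplifying assumptions not from the paper — the regularizer $\lambda$ is taken as 0, and the feasible set $C$ is a small range of candidate frequencies:

```python
import numpy as np

N, true_c = 100, 7
rng = np.random.default_rng(1)
# Integer points near multiples of N/true_c, so frequency true_c fits them:
# sin((2*pi/N) * p_i * true_c) is then close to 0 for every p_i.
P = (rng.integers(1, 20, size=50) * (N / true_c)).round().astype(int) % N

def cost(P, c, N):
    # cost(P, c) = sum_i sin^2((2*pi/N) * p_i * c), as in the abstract.
    return float(np.sum(np.sin((2 * np.pi / N) * P * c) ** 2))

C = range(1, 30)                       # assumed small feasible set
best_c = min(C, key=lambda c: cost(P, c, N))
```

With λ = 0 the minimization is a brute-force scan over C; the paper's coreset makes this tractable in the streaming setting instead.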

Newton-PnP: Real-time Visual Navigation for Autonomous Toy-Drones

no code implementations • 5 Mar 2022 • Ibrahim Jubran, Fares Fares, Yuval Alfassi, Firas Ayoub, Dan Feldman

The Perspective-n-Point problem aims to estimate the relative pose between a calibrated monocular camera and a known 3D model, by aligning pairs of 2D captured image points to their corresponding 3D points in the model.

Visual Navigation

Introduction to Coresets: Approximated Mean

no code implementations • 4 Nov 2021 • Alaa Maalouf, Ibrahim Jubran, Dan Feldman

The survey may help guide new researchers unfamiliar with the field, and introduce them to the very basic foundations of coresets, through a simple, yet fundamental, problem.

A Unified Approach to Coreset Learning

no code implementations • 4 Nov 2021 • Alaa Maalouf, Gilad Eini, Ben Mussay, Dan Feldman, Margarita Osadchy

Our approach offers a new definition of coreset, which is a natural relaxation of the standard definition and aims at approximating the \emph{average} loss of the original data over the queries.

Network Pruning

Coresets for Decision Trees of Signals

1 code implementation • NeurIPS 2021 • Ibrahim Jubran, Ernesto Evgeniy Sanches Shayda, Ilan Newman, Dan Feldman

Its regression or classification loss to a given matrix $D$ of $N$ entries (labels) is the sum of squared differences over every label in $D$ and its assigned label by $t$.
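The loss described here — the sum of squared differences between each label and the value assigned to it by the tree — is easy to check on a tiny example. The signal, split point, and two-leaf "tree" below are illustrative assumptions, not from the paper:

```python
import numpy as np

# A 1-D signal of N = 8 labels, and a 2-leaf tree "index < 4 ?" whose
# leaves predict the mean of their region.
D = np.array([1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.8, 5.0])
split = 4
left, right = D[:split], D[split:]
pred = np.concatenate([np.full(split, left.mean()),
                       np.full(D.size - split, right.mean())])

# Loss of the tree on D: sum over every label of its squared difference
# from the tree's assigned label.
loss = float(((D - pred) ** 2).sum())
```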

Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition

2 code implementations • NeurIPS 2021 • Lucas Liebenwein, Alaa Maalouf, Oren Gal, Dan Feldman, Daniela Rus

We present a novel global compression framework for deep neural networks that automatically analyzes each layer to identify the optimal per-layer compression ratio, while simultaneously achieving the desired overall compression.

Low-rank compression

Low-Regret Active learning

no code implementations • 6 Apr 2021 • Cenk Baykal, Lucas Liebenwein, Dan Feldman, Daniela Rus

We develop an online learning algorithm for identifying unlabeled data points that are most informative for training (i.e., active learning).

Active Learning, Informativeness

Provably Approximated ICP

no code implementations • 10 Jan 2021 • Ibrahim Jubran, Alaa Maalouf, Ron Kimmel, Dan Feldman

A harder version is the \emph{registration problem}, where the correspondence is unknown, and the minimum is also over all possible correspondence functions from $P$ to $Q$.

Provably Approximated Point Cloud Registration

no code implementations • ICCV 2021 • Ibrahim Jubran, Alaa Maalouf, Ron Kimmel, Dan Feldman

A harder version is the registration problem, where the correspondence is unknown, and the minimum is also over all possible correspondence functions from P to Q. Algorithms such as the Iterative Closest Point (ICP) and its variants were suggested for these problems, but none yield a provable non-trivial approximation for the global optimum.

Point Cloud Registration

Introduction to Core-sets: an Updated Survey

no code implementations • 18 Nov 2020 • Dan Feldman

In optimization or machine learning problems we are given a set of items, usually points in some metric space, and the goal is to minimize or maximize an objective function over some space of candidate solutions.

Clustering, Data Summarization

Deep Learning Meets Projective Clustering

no code implementations • ICLR 2021 • Alaa Maalouf, Harry Lang, Daniela Rus, Dan Feldman

Based on this approach, we provide a novel architecture that replaces the original embedding layer by a set of $k$ small layers that operate in parallel and are then recombined with a single fully-connected layer.

Clustering

Compressed Deep Networks: Goodbye SVD, Hello Robust Low-Rank Approximation

no code implementations • 11 Sep 2020 • Murad Tukan, Alaa Maalouf, Matan Weksler, Dan Feldman

Here, $d$ is the number of neurons in the layer, $n$ is the number of neurons in the next one, and $A_{k, 2}$ can be stored in $O((n+d)k)$ memory instead of $O(nd)$.
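The storage arithmetic in this snippet can be verified directly: a rank-$k$ factorization of a $d \times n$ weight matrix keeps $(d+n)k$ numbers instead of $dn$. The sketch below uses plain SVD truncation — the $\ell_2$ baseline the paper's title says goodbye to, standing in for its robust low-rank approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 256, 512, 16                  # illustrative layer sizes
A = rng.normal(size=(d, n))             # a dense weight matrix

# Truncated SVD: the best rank-k approximation under the Frobenius norm.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
L = U[:, :k] * s[:k]                    # d x k factor
R = Vt[:k, :]                           # k x n factor
A_k = L @ R                             # rank <= k, never stored densely

dense_params = d * n                    # O(nd) storage
factored_params = (d + n) * k           # O((n+d)k) storage
```

For these sizes the factorization stores 12,288 numbers instead of 131,072, at the price of the approximation error ‖A − A_k‖.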

Faster PAC Learning and Smaller Coresets via Smoothed Analysis

no code implementations • 9 Jun 2020 • Alaa Maalouf, Ibrahim Jubran, Murad Tukan, Dan Feldman

PAC-learning usually aims to compute a small subset ($\varepsilon$-sample/net) from $n$ items, that provably approximates a given loss function for every query (model, classifier, hypothesis) from a given set of queries, up to an additive error $\varepsilon\in(0, 1)$.

PAC learning
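A minimal numeric picture of the $\varepsilon$-sample in this abstract, under toy assumptions not from the paper: items are reals in $[0,1]$, a query is a threshold $t$, and the loss of item $p$ on query $t$ is the 0/1 indicator of $p \le t$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100_000, 2_000
P = rng.random(n)                        # n items
S = rng.choice(P, size=m, replace=False) # candidate epsilon-sample

# A grid of queries (thresholds); the "average loss" of a query is the
# fraction of items below its threshold.
queries = np.linspace(0.0, 1.0, 101)
full_avg = (P[None, :] <= queries[:, None]).mean(axis=1)
samp_avg = (S[None, :] <= queries[:, None]).mean(axis=1)

# Additive error over *every* query, as in the epsilon-sample definition.
worst_additive_err = float(np.abs(full_avg - samp_avg).max())
```

Uniform sampling already gives a small additive error here (this query family has low complexity); the paper's smoothed analysis is about getting provably smaller samples than worst-case bounds allow.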

Coresets for Near-Convex Functions

no code implementations • NeurIPS 2020 • Murad Tukan, Alaa Maalouf, Dan Feldman

Coreset is usually a small weighted subset of $n$ input points in $\mathbb{R}^d$, that provably approximates their loss function for a given set of queries (models, classifiers, etc.).

regression

Sets Clustering

no code implementations • ICML 2020 • Ibrahim Jubran, Murad Tukan, Alaa Maalouf, Dan Feldman

The input to the \emph{sets-$k$-means} problem is an integer $k\geq 1$ and a set $\mathcal{P}=\{P_1,\cdots, P_n\}$ of sets in $\mathbb{R}^d$.

Clustering, Document Classification

On Coresets for Support Vector Machines

no code implementations • 15 Feb 2020 • Murad Tukan, Cenk Baykal, Dan Feldman, Daniela Rus

A coreset is a small, representative subset of the original data points, such that models trained on the coreset are provably competitive with those trained on the original data set.

Small Data Image Classification

Provable Filter Pruning for Efficient Neural Networks

2 code implementations • ICLR 2020 • Lucas Liebenwein, Cenk Baykal, Harry Lang, Dan Feldman, Daniela Rus

We present a provable, sampling-based approach for generating compact Convolutional Neural Networks (CNNs) by identifying and removing redundant filters from an over-parameterized network.

Introduction to Coresets: Accurate Coresets

no code implementations • 19 Oct 2019 • Ibrahim Jubran, Alaa Maalouf, Dan Feldman

A coreset (or core-set) of an input set is a small summarization of it, such that solving a problem on the coreset as its input provably yields the same result as solving the same problem on the original (full) set, for a given family of problems (models, classifiers, loss functions).

Math

SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks

2 code implementations • 11 Oct 2019 • Cenk Baykal, Lucas Liebenwein, Igor Gilitschenski, Dan Feldman, Daniela Rus

We introduce a pruning algorithm that provably sparsifies the parameters of a trained model in a way that approximately preserves the model's predictive accuracy.

Data-Independent Neural Pruning via Coresets

no code implementations • ICLR 2020 • Ben Mussay, Margarita Osadchy, Vladimir Braverman, Samson Zhou, Dan Feldman

We propose the first efficient, data-independent neural pruning algorithm with a provable trade-off between its compression rate and the approximation error for any future test sample.

Model Compression, Network Pruning

Tight Sensitivity Bounds For Smaller Coresets

no code implementations • 2 Jul 2019 • Alaa Maalouf, Adiel Statman, Dan Feldman

With high probability, non-uniform sampling based on upper bounds on what is known as importance or sensitivity of each row in $A$ yields a coreset.
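For the squared loss, the sensitivity of a row of $A$ is exactly its leverage score, so sensitivity sampling can be sketched with leverage scores as the upper bounds. This is a generic illustration of the technique named in the abstract, not the paper's tighter bounds:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 5_000, 5, 400
# Rows with widely varying norms, so uniform sampling would be wasteful.
A = rng.normal(size=(n, d)) * rng.gamma(1.0, 1.0, size=(n, 1))

# Leverage score of row a_i: a_i (A^T A)^{-1} a_i^T  (its l2 sensitivity).
lev = np.einsum('ij,jk,ik->i', A, np.linalg.inv(A.T @ A), A)
p = lev / lev.sum()                       # sampling distribution

# Sample m rows with probability proportional to sensitivity, and rescale
# each sampled row by 1/sqrt(m * p_i) so C^T C is unbiased for A^T A.
idx = rng.choice(n, size=m, p=p)
C = A[idx] / np.sqrt(m * p[idx])[:, None]

# The coreset guarantee: ||Cx||^2 ~ ||Ax||^2 for every query x,
# i.e. C^T C is spectrally close to A^T A.
rel = float(np.linalg.norm(C.T @ C - A.T @ A, 2) / np.linalg.norm(A.T @ A, 2))
```

Tighter sensitivity upper bounds shrink the sum of sensitivities, which is what drives the coreset size down — the paper's subject.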

Coresets for Gaussian Mixture Models of Any Shape

no code implementations • 12 Jun 2019 • Dan Feldman, Zahi Kfir, Xuan Wu

For example, for any input set $D$ whose coordinates are integers in $[-n^{100}, n^{100}]$ and any fixed $k, d\geq 1$, the coreset size is $(\log n)^{O(1)}/\varepsilon^2$, and can be computed in time near-linear in $n$, with high probability.

Clustering

Fast and Accurate Least-Mean-Squares Solvers

1 code implementation • NeurIPS 2019 • Alaa Maalouf, Ibrahim Jubran, Dan Feldman

Least-mean squares (LMS) solvers such as Linear / Ridge / Lasso-Regression, SVD and Elastic-Net not only solve fundamental machine learning problems, but are also the building blocks in a variety of other methods, such as decision trees and matrix factorizations.

Data Summarization

Provable Approximations for Constrained $\ell_p$ Regression

no code implementations • 27 Feb 2019 • Ibrahim Jubran, David Cohn, Dan Feldman

The $\ell_p$ linear regression problem is to minimize $f(x)=||Ax-b||_p$ over $x\in\mathbb{R}^d$, where $A\in\mathbb{R}^{n\times d}$, $b\in \mathbb{R}^n$, and $p>0$.

regression
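The objective $f(x)=\|Ax-b\|_p$ from this abstract can be evaluated directly; for $p \ge 1$ it is the usual $p$-norm, and $p=2$ recovers ordinary least squares. A small sanity sketch (random data, illustrative only — the paper's contribution is approximating the *constrained* problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 3
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def f(x, p):
    # The l_p regression objective: ||Ax - b||_p.
    return float(np.linalg.norm(A @ x - b, ord=p))

# For p = 2 the unconstrained minimizer is the least-squares solution.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Since x_ls minimizes f(·, 2), no other point — the origin, a perturbation of x_ls — can achieve a smaller value of that objective.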

Real-Time EEG Classification via Coresets for BCI Applications

no code implementations • 2 Jan 2019 • Eitan Netzer, Alex Frid, Dan Feldman

We suggest an algorithm that maintains a representation of such a coreset, tailored to the EEG signal, which enables: (i) real-time and continuous computation of the Common Spatial Pattern (CSP) feature extraction method on a coreset representation of the signal (instead of on the signal itself), (ii) improved efficiency of the CSP algorithm, with provable guarantees, by applying it on the coreset, and (iii) real-time addition of data trials (EEG data windows) to the coreset.

Classification, Data Summarization +3

Aligning Points to Lines: Provable Approximations

no code implementations • 23 Jul 2018 • Ibrahim Jubran, Dan Feldman

This problem is non-trivial even if $z=1$ and the matching $\pi$ is given.

Data-Dependent Coresets for Compressing Neural Networks with Applications to Generalization Bounds

no code implementations • ICLR 2019 • Cenk Baykal, Lucas Liebenwein, Igor Gilitschenski, Dan Feldman, Daniela Rus

We present an efficient coresets-based neural network compression algorithm that sparsifies the parameters of a trained fully-connected neural network in a manner that provably approximates the network's output.

Generalization Bounds, Neural Network Compression

Generic Coreset for Scalable Learning of Monotonic Kernels: Logistic Regression, Sigmoid and more

no code implementations • 21 Feb 2018 • Elad Tolochinsky, Ibrahim Jubran, Dan Feldman

Coreset (or core-set) is a small weighted \emph{subset} $Q$ of an input set $P$ with respect to a given \emph{monotonic} function $f:\mathbb{R}\to\mathbb{R}$ that \emph{provably} approximates its fitting loss $\sum_{p\in P}f(p\cdot x)$ to \emph{any} given $x\in\mathbb{R}^d$.

regression
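The fitting loss in this abstract, $\sum_{p\in P}f(p\cdot x)$, can be instantiated with the monotonic function of logistic regression, $f(t)=\log(1+e^t)$. Below, a uniform sample with weights $n/m$ stands in for the paper's coreset construction — a toy illustration of the property, not the provable algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 20_000, 3, 1_000
P = rng.normal(size=(n, d))

def fit_loss(points, weights, x):
    # sum_p w_p * f(p . x) with f(t) = log(1 + e^t), computed stably.
    return float(weights @ np.logaddexp(0.0, points @ x))

idx = rng.choice(n, size=m, replace=False)
Q, w = P[idx], np.full(m, n / m)       # weighted subset standing in for Q

x = np.array([0.5, -1.0, 0.25])        # an arbitrary query
full = fit_loss(P, np.ones(n), x)
approx = fit_loss(Q, w, x)
rel_err = abs(full - approx) / full
```

A genuine coreset must make this relative error small for *every* query x simultaneously, which is what the paper proves for monotonic kernels.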

Small Coresets to Represent Large Training Data for Support Vector Machines

no code implementations • ICLR 2018 • Cenk Baykal, Murad Tukan, Dan Feldman, Daniela Rus

Support Vector Machines (SVMs) are one of the most popular algorithms for classification and regression analysis.

Coresets for Vector Summarization with Applications to Network Graphs

no code implementations • ICML 2017 • Dan Feldman, Sedat Ozer, Daniela Rus

We provide a deterministic data summarization algorithm that approximates the mean $\bar{p}=\frac{1}{n}\sum_{p\in P} p$ of a set $P$ of $n$ vectors in $\mathbb{R}^d$, by a weighted mean $\tilde{p}$ of a \emph{subset} of $O(1/\varepsilon)$ vectors, i.e., independent of both $n$ and $d$.

Data Summarization
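The claim in this abstract — the mean of many vectors approximated by the weighted mean of a small subset — can be pictured with a uniform sample (uniform weights $1/m$) standing in for the paper's deterministic $O(1/\varepsilon)$-size construction; the data and sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 50_000, 10, 500
P = rng.normal(size=(n, d)) + 3.0       # n vectors in R^d

mean_full = P.mean(axis=0)              # the mean being summarized

# Weighted mean of a small subset (uniform weights 1/m here).
idx = rng.choice(n, size=m, replace=False)
mean_sub = P[idx].mean(axis=0)

err = float(np.linalg.norm(mean_full - mean_sub))
```

The interesting part of the paper is that the subset can be chosen deterministically with size independent of both n and d; random sampling only shows the approximation is plausible.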

Training Gaussian Mixture Models at Scale via Coresets

no code implementations • 23 Mar 2017 • Mario Lucic, Matthew Faulkner, Andreas Krause, Dan Feldman

In this work we show how to construct coresets for mixtures of Gaussians.

Dimensionality Reduction of Massive Sparse Datasets Using Coresets

no code implementations • NeurIPS 2016 • Dan Feldman, Mikhail Volkov, Daniela Rus

An open practical problem has been to compute a non-trivial approximation to the PCA of very large but sparse databases such as the Wikipedia document-term matrix in a reasonable time.

Dimensionality Reduction

Coresets for Kinematic Data: From Theorems to Real-Time Systems

no code implementations • 30 Nov 2015 • Soliman Nasser, Ibrahim Jubran, Dan Feldman

By maintaining such a coreset for kinematic (moving) set of $n$ points, we can run pose-estimation algorithms, such as Kabsch or PnP, on the small coresets, instead of the $n$ points, in real-time using weak devices, while obtaining the same results.

Pose Estimation

Coresets for k-Segmentation of Streaming Data

no code implementations • NeurIPS 2014 • Guy Rosman, Mikhail Volkov, Dan Feldman, John W. Fisher III, Daniela Rus

We consider the problem of computing an optimal segmentation of such signals by a k-piecewise linear function, using only one pass over the data, by maintaining a coreset for the signal.

Segmentation, Time Series +1

Scalable Training of Mixture Models via Coresets

no code implementations • NeurIPS 2011 • Dan Feldman, Matthew Faulkner, Andreas Krause

In this paper, we show how to construct coresets for mixtures of Gaussians and natural generalizations.

Density Estimation

A Unified Framework for Approximating and Clustering Data

no code implementations • 7 Jun 2011 • Dan Feldman, Michael Langberg

In the $k$-clustering variant, each $x\in X$ is a tuple of $k$ shapes, and $f(x)$ is the distance from $p$ to its closest shape in $x$.

Clustering, PAC learning
