Search Results for author: Anthony Nouy

Found 10 papers, 0 papers with code

Weighted least-squares approximation with determinantal point processes and generalized volume sampling

no code implementations · 21 Dec 2023 · Anthony Nouy, Bertrand Michel

We first provide a generalized version of volume-rescaled sampling yielding quasi-optimality results in expectation with a number of samples $n = O(m\log(m))$, meaning that the expected $L^2$ error is bounded by a constant times the best approximation error in $L^2$.

Point Processes
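The quasi-optimality claim above can be illustrated with a plain i.i.d. least-squares fit in the regime $n = O(m\log m)$. This is a minimal sketch, not the paper's volume-rescaled or determinantal sampler; the target function, monomial basis, and sample budget below are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.exp(x)              # hypothetical target function on [-1, 1]

m = 8                                # dimension of the approximation space
n = int(3 * m * np.log(m))           # sample budget n = O(m log m), as in the result above

x = rng.uniform(-1, 1, n)            # plain i.i.d. samples (NOT volume-rescaled/DPP sampling)
V = np.vander(x, m, increasing=True) # design matrix of monomials 1, x, ..., x^(m-1)
coef, *_ = np.linalg.lstsq(V, f(x), rcond=None)

xt = np.linspace(-1, 1, 200)
err = np.max(np.abs(np.vander(xt, m, increasing=True) @ coef - f(xt)))
```

With roughly $3m\log m$ samples the least-squares projection is already close to the best degree-$(m{-}1)$ approximation of the smooth target; the paper's sampling schemes sharpen the constants and guarantees.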

Approximation Theory of Tree Tensor Networks: Tensorized Multivariate Functions

no code implementations · 28 Jan 2021 · Mazen Ali, Anthony Nouy

To answer the latter: as a candidate model class, we consider approximation classes of TNs and show that these are (quasi-)Banach spaces, that many types of classical smoothness spaces are continuously embedded into said approximation classes, and that TN approximation classes are themselves not embedded in any classical smoothness space.

Tensor Networks

A PAC algorithm in relative precision for bandit problem with costly sampling

no code implementations · 30 Jul 2020 · Marie Billaud-Friess, Arthur Macherey, Anthony Nouy, Clémentine Prieur

This paper considers the problem of maximizing an expectation function over a finite set, i.e., the finite-arm bandit problem.
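The setting above can be sketched with a naive successive-elimination strategy under additive precision, not the relative-precision PAC guarantee the paper establishes; the arm means, noise level, and Hoeffding-style confidence radius below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
means = np.array([0.3, 0.5, 0.8, 0.6])          # unknown arm means (illustrative)
pull = lambda k: means[k] + rng.normal(0, 0.1)  # costly sampling oracle

# Naive successive elimination with a Hoeffding-style radius, additive precision eps.
eps, delta = 0.05, 0.05
K = len(means)
active = list(range(K))
sums, counts = np.zeros(K), np.zeros(K)
while len(active) > 1:
    for k in active:                # one costly sample per surviving arm
        sums[k] += pull(k)
        counts[k] += 1
    mu = sums[active] / counts[active]
    t = counts[active][0]
    rad = np.sqrt(np.log(4 * K * t**2 / delta) / (2 * t))
    if 2 * rad < eps:               # all confidence intervals are eps-tight: stop
        break
    # drop arms whose upper confidence bound falls below the best lower bound
    active = [k for i, k in enumerate(active) if mu[i] + 2 * rad >= mu.max()]
best = active[int(np.argmax(sums[active] / counts[active]))]
```

The sampling cost concentrates on the near-optimal arms, which is the behavior a PAC algorithm with costly sampling must control; relative precision additionally scales the stopping rule by the empirical mean.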

Approximation of Smoothness Classes by Deep Rectifier Networks

no code implementations · 30 Jul 2020 · Mazen Ali, Anthony Nouy

We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^\alpha_{q}(L^p)$ in arbitrary dimension $d$, on general domains.
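As a toy illustration of why ReLU networks attain classical approximation rates: a continuous piecewise-linear interpolant is exactly a one-hidden-layer ReLU network whose weights are the slope increments at the knots. This is a hypothetical example, not the paper's construction for Besov spaces:

```python
import numpy as np

relu = lambda t: np.maximum(t, 0.0)
f = lambda t: np.abs(t - 0.3)                    # hypothetical target with a kink

knots = np.linspace(0, 1, 11)                    # uniform grid on [0, 1]
vals = f(knots)
slopes = np.diff(vals) / np.diff(knots)

# One hidden layer: fhat(x) = vals[0] + sum_j w_j * relu(x - knots[j]),
# where the weights w_j are the slope increments of the interpolant.
w = np.concatenate(([slopes[0]], np.diff(slopes)))
fhat = lambda x: vals[0] + relu(np.subtract.outer(x, knots[:-1])) @ w

xt = np.linspace(0, 1, 101)
err = np.max(np.abs(fhat(xt) - np.interp(xt, knots, vals)))
```

The network reproduces the linear interpolant exactly, so ReLU networks inherit (at least) the approximation rates of free-knot splines, the classical tool for Besov classes.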

Learning with tree tensor networks: complexity estimates and model selection

no code implementations · 2 Jul 2020 · Bertrand Michel, Anthony Nouy

We propose and analyze a complexity-based model selection method for tree tensor networks in an empirical risk minimization framework and we analyze its performance over a wide range of smoothness classes.

Model Selection · Quantization (+1)

Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part II

no code implementations · 30 Jun 2020 · Mazen Ali, Anthony Nouy

The considered approximation tool combines a tensorization of functions in $L^p([0, 1))$, which allows one to identify a univariate function with a multivariate function (or tensor), and the use of tree tensor networks (here, the tensor train format) to exploit low-rank structures of multivariate functions.

Tensor Networks
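The tensorization step described above can be sketched in a few lines: encode $x \in [0, 1)$ by its first $d$ binary digits, reshape the grid values of $f$ into a $2 \times \dots \times 2$ tensor, and read off the tensor-train ranks as the ranks of the sequential unfoldings. The functions and grid size below are illustrative assumptions:

```python
import numpy as np

d = 8
x = np.arange(2**d) / 2**d                 # grid x = 0.i1 i2 ... id in binary
# tensor-train ranks = ranks of the sequential unfoldings of the 2 x ... x 2 tensor
tt_ranks = lambda T: [np.linalg.matrix_rank(T.reshape(2**k, 2**(d - k)), tol=1e-10)
                      for k in range(1, d)]

# exp(x) factorizes digit by digit: exp(sum_k i_k 2^-k) = prod_k exp(i_k 2^-k),
# so its tensorization is an elementary (rank-one) tensor.
ranks_exp = tt_ranks(np.exp(x).reshape((2,) * d))

# sin(pi x) obeys a two-term addition formula, so its tensor-train ranks are 2.
ranks_sin = tt_ranks(np.sin(np.pi * x).reshape((2,) * d))
```

Functions with such algebraic structure tensorize with small, dimension-independent ranks, which is what makes the tree tensor network format an effective approximation tool.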

Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part I

no code implementations · 30 Jun 2020 · Mazen Ali, Anthony Nouy

The results of this work are both an analysis of the approximation spaces of TNs and a study of the expressivity of a particular type of neural networks (NN) -- namely feed-forward sum-product networks with sparse architecture.

Tensor Networks

Learning high-dimensional probability distributions using tree tensor networks

no code implementations · 17 Dec 2019 · Erwan Grelier, Anthony Nouy, Régis Lebrun

These algorithms exploit the multilinear parametrization of the formats to recast the nonlinear minimization problem into a sequence of empirical risk minimization problems with linear models.

Model Selection · Tensor Networks (+1)
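The recasting described above can be sketched for the simplest multilinear model, a rank-one function $p(x)\,q(y)$: the model is nonlinear in $(p, q)$ jointly but linear in each factor separately, so the fit alternates between two ordinary least-squares problems. The features, target, and iteration count are hypothetical; this is not the paper's tree-tensor-network code:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 400, 5
X = rng.uniform(-1, 1, (n, 2))
y = np.exp(X[:, 0]) * np.sin(X[:, 1])            # separable (rank-one) target

Phi = lambda t: np.vander(t, m, increasing=True) # polynomial features per variable
A, B = Phi(X[:, 0]), Phi(X[:, 1])
p, q = rng.normal(size=m), rng.normal(size=m)    # random initial factors
for _ in range(10):
    # q fixed: the model (A @ p) * (B @ q) is linear in p -> ordinary least squares
    p, *_ = np.linalg.lstsq(A * (B @ q)[:, None], y, rcond=None)
    # p fixed: linear in q
    q, *_ = np.linalg.lstsq(B * (A @ p)[:, None], y, rcond=None)
rmse = np.sqrt(np.mean(((A @ p) * (B @ q) - y) ** 2))
```

Each inner step is an empirical risk minimization with a linear model, exactly the structure the snippet describes; tree tensor networks generalize this by cycling over the nodes of the tree.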

Learning with tree-based tensor formats

no code implementations · 11 Nov 2018 · Erwan Grelier, Anthony Nouy, Mathilde Chevreuil

For a given tree, the selection of the tuple of tree-based ranks that minimize the risk is a combinatorial problem.

Small Data Image Classification

A least-squares method for sparse low rank approximation of multivariate functions

no code implementations · 30 Apr 2013 · Mathilde Chevreuil, Régis Lebrun, Anthony Nouy, Prashant Rai

In this paper, we propose a low-rank approximation method based on discrete least-squares for the approximation of a multivariate function from random, noise-free observations.
