Search Results for author: Mazen Ali

Found 5 papers, 1 paper with code

Approximation Theory of Tree Tensor Networks: Tensorized Multivariate Functions

no code implementations • 28 Jan 2021 • Mazen Ali, Anthony Nouy

To answer the latter, we consider approximation classes of TNs as a candidate model class and show that these are (quasi-)Banach spaces, that many types of classical smoothness spaces are continuously embedded into these approximation classes, and that TN approximation classes are themselves not embedded in any classical smoothness space.

Tensor Networks
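
For context, approximation classes of this kind are typically built from the error of best approximation by networks of bounded complexity. The following is a minimal sketch of the standard construction, with illustrative notation ($\Phi_n$, $E_n(f)_X$, $A^\alpha_q(X)$) that may differ from the paper's precise definitions:

\[
E_n(f)_X := \inf_{\varphi\in\Phi_n}\|f-\varphi\|_X, \qquad
A^\alpha_q(X) := \Big\{ f\in X : \|f\|_{A^\alpha_q} := \|f\|_X + \Big(\sum_{n\ge 1}\big[n^{\alpha}E_n(f)_X\big]^q\,n^{-1}\Big)^{1/q} < \infty \Big\},
\]

where $\Phi_n$ denotes the set of tree tensor networks of complexity at most $n$ and the usual supremum modification applies for $q=\infty$. In this generic setting the class is equipped with the (quasi-)norm $\|\cdot\|_{A^\alpha_q}$.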

$H^1$-Stability of the $L^2$-Projection onto Finite Element Spaces on Adaptively Refined Quadrilateral Meshes

1 code implementation • 28 Aug 2020 • Mazen Ali, Stefan A. Funken, Anja Schmidt

The $L^2$-orthogonal projection $\Pi_h:L^2(\Omega)\rightarrow\mathbb{V}_h$ onto a finite element (FE) space $\mathbb{V}_h$ is called $H^1$-stable iff $\|\nabla\Pi_h u\|_{L^2(\Omega)}\leq C\|u\|_{H^1(\Omega)}$ for all $u\in H^1(\Omega)$, with a constant $C>0$ independent of the mesh size $h>0$, i.e., $C\neq C(h)$.

Numerical Analysis 65M50
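
As a rough numerical illustration of the quantity being bounded, here is a minimal sketch assuming a uniform 1D mesh of $[0,1]$ with $P_1$ elements and plain NumPy; it is not the paper's code and does not address the adaptively refined quadrilateral setting. It assembles the mass and stiffness matrices, computes $\Pi_h u$ for a smooth test function, and prints the ratio $\|\nabla\Pi_h u\|_{L^2}/\|u\|_{H^1}$ under uniform refinement.

```python
# Minimal 1D sketch (not the paper's code): check that the ratio
# ||grad(Pi_h u)||_{L^2} / ||u||_{H^1} stays bounded for the L^2-projection
# onto P1 finite elements on a uniform mesh of [0, 1].
import numpy as np

def p1_matrices(n):
    """Mass and stiffness matrices for P1 elements on a uniform mesh with n cells."""
    h = 1.0 / n
    M = np.zeros((n + 1, n + 1))
    K = np.zeros((n + 1, n + 1))
    for e in range(n):                                   # element-by-element assembly
        idx = [e, e + 1]
        M[np.ix_(idx, idx)] += h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
        K[np.ix_(idx, idx)] += 1.0 / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return M, K

def l2_projection(u, n, quad_pts=5):
    """Coefficients of Pi_h u: solve M c = b with b_i = int u * phi_i dx."""
    h = 1.0 / n
    M, K = p1_matrices(n)
    xq, wq = np.polynomial.legendre.leggauss(quad_pts)   # Gauss nodes/weights on [-1, 1]
    b = np.zeros(n + 1)
    for e in range(n):
        x0 = e * h
        x = x0 + (xq + 1.0) * h / 2.0                    # map quadrature nodes to the element
        w = wq * h / 2.0
        b[e] += np.sum(w * u(x) * (1.0 - (x - x0) / h))  # left hat function
        b[e + 1] += np.sum(w * u(x) * ((x - x0) / h))    # right hat function
    c = np.linalg.solve(M, b)
    return c, K

u = lambda x: np.sin(np.pi * x)
h1_norm = np.sqrt(0.5 + 0.5 * np.pi**2)                  # exact ||u||_{H^1} for sin(pi x)
for n in (8, 32, 128):
    c, K = l2_projection(u, n)
    print(n, np.sqrt(c @ K @ c) / h1_norm)               # ||grad(Pi_h u)|| / ||u||_{H^1}
```

For a smooth $u$ the printed ratio stays bounded as the mesh is refined, which is what $H^1$-stability with $C\neq C(h)$ expresses in this toy setting.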

Approximation of Smoothness Classes by Deep Rectifier Networks

no code implementations • 30 Jul 2020 • Mazen Ali, Anthony Nouy

We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^\alpha_{q}(L^p)$ in arbitrary dimension $d$, on general domains.
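
For reference, the activations and the smoothness scale involved have the following standard definitions (generic forms, not statements taken from the paper): the ReLU activation is $\rho(x)=\max(0,x)$, the RePU activation of order $s\ge 2$ is $\rho_s(x)=\max(0,x)^s$, and the Besov (quasi-)seminorm is commonly defined via the $r$-th modulus of smoothness $\omega_r(f,t)_p$ with $r>\alpha$,

\[
|f|_{B^\alpha_q(L^p)} := \Big(\int_0^\infty \big[t^{-\alpha}\,\omega_r(f,t)_p\big]^q\,\frac{dt}{t}\Big)^{1/q},
\]

with the usual supremum modification for $q=\infty$.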

Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part I

no code implementations • 30 Jun 2020 • Mazen Ali, Anthony Nouy

The results of this work are both an analysis of the approximation spaces of TNs and a study of the expressivity of a particular type of neural network (NN), namely feed-forward sum-product networks with a sparse architecture.

Tensor Networks
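
To make the sum-product connection concrete, here is a minimal NumPy sketch (illustrative only, not code or notation from the paper): evaluating a tensor stored in the tensor-train (TT) format at a multi-index is a chain of small matrix products, i.e., a sparse feed-forward sum-product computation of the kind referred to above.

```python
# Minimal sketch (illustrative, not from the paper): evaluating a tensor-train
# (TT) representation at a multi-index is a chain of small matrix products,
# i.e. a feed-forward sum-product computation.
import numpy as np

def tt_eval(cores, index):
    """Evaluate a TT tensor at a multi-index.

    cores[k] has shape (r_k, n_k, r_{k+1}) with r_0 = r_d = 1.
    """
    v = np.ones((1, 1))
    for core, i in zip(cores, index):
        v = v @ core[:, i, :]      # sum over the current rank index (sum-product step)
    return v[0, 0]

# Example: random TT cores for a 4-way tensor with mode size 2 and TT-rank 3.
rng = np.random.default_rng(0)
ranks = [1, 3, 3, 3, 1]
cores = [rng.standard_normal((ranks[k], 2, ranks[k + 1])) for k in range(4)]
print(tt_eval(cores, (0, 1, 1, 0)))
```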

Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part II

no code implementations • 30 Jun 2020 • Mazen Ali, Anthony Nouy

The considered approximation tool combines a tensorization of functions in $L^p([0, 1))$, which allows one to identify a univariate function with a multivariate function (or tensor), and the use of tree tensor networks (the tensor train format) to exploit low-rank structures of multivariate functions.

Tensor Networks
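
The tensorization step can be illustrated with a minimal NumPy sketch (a toy under simplifying assumptions, not the paper's construction in full generality): samples of a univariate function on a dyadic grid of $2^L$ points of $[0,1)$ are reshaped into an $L$-way tensor indexed by the binary digits of the grid point, and the numerical ranks of its unfoldings are the low-rank structure that the tensor train format then exploits.

```python
# Minimal sketch (illustrative, not from the paper): tensorize samples of a
# univariate function on a dyadic grid of [0, 1) into an L-way binary tensor
# and inspect the numerical ranks of its unfoldings.
import numpy as np

L = 10                                    # 2**L grid points
x = np.arange(2 ** L) / 2 ** L            # left endpoints of the dyadic intervals
f = np.exp(x)                             # smooth test function

T = f.reshape((2,) * L)                   # index (i_1, ..., i_L) <-> binary digits of x
for k in range(1, L):
    unfolding = T.reshape(2 ** k, 2 ** (L - k))
    s = np.linalg.svd(unfolding, compute_uv=False)
    rank = int(np.sum(s > 1e-10 * s[0]))
    print(f"unfolding {k}: numerical rank {rank}")
```

For this particular test function the ranks are 1, since $e^{a+b}=e^a e^b$ makes the tensorized exponential factor over the binary digits; less structured functions give larger (but often still moderate) ranks, and that rank decay is the low-rank structure tree tensor networks are designed to capture.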
