Search Results for author: Behnoosh Zamanlooy

Found 5 papers, 2 papers with code

Machine Learning-Powered Course Allocation

no code implementations • 3 Oct 2022 • Ermis Soumalias, Behnoosh Zamanlooy, Jakob Weissteiner, Sven Seuken

We study the course allocation problem, where universities assign course schedules to students.

Fairness

Do ReLU Networks Have An Edge When Approximating Compactly-Supported Functions?

no code implementations • 24 Apr 2022 • Anastasis Kratsios, Behnoosh Zamanlooy

Our first main result transcribes this "structured" approximation problem into a universality problem.

Universal Approximation Under Constraints is Possible with Transformers

no code implementations • ICLR 2022 • Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić

Many practical problems need the output of a machine learning model to satisfy a set of constraints, $K$.
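As a rough illustration of what it means for outputs to satisfy a constraint set $K$, here is a minimal sketch assuming a simple box constraint enforced by Euclidean projection (coordinate-wise clipping); this is an illustrative assumption, not the transformer-based construction studied in the paper.

```python
# Hypothetical example: force model outputs into the box K = [0, 1]^D
# by Euclidean projection, which for a box reduces to clipping.
import numpy as np

def project_onto_box(y, lower=0.0, upper=1.0):
    """Project each output vector onto the box K = [lower, upper]^D."""
    return np.clip(y, lower, upper)

raw_outputs = np.array([[1.3, 0.4, -0.2],   # violates K in two coordinates
                        [0.5, 0.9, 0.1]])   # already inside K
constrained = project_onto_box(raw_outputs)
print(constrained)  # every entry now lies in [0, 1]
```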

Learning Sub-Patterns in Piecewise Continuous Functions

1 code implementation • 29 Oct 2020 • Anastasis Kratsios, Behnoosh Zamanlooy

Most stochastic gradient descent algorithms can optimize neural networks that are sub-differentiable in their parameters; however, this requires the activation function to exhibit a degree of continuity, which limits the network's uniform approximation capacity to continuous functions.
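To make the sub-differentiability point concrete, here is a minimal sketch (assuming PyTorch, which is not tied to this paper's experiments) showing that automatic differentiation returns a single element of the subdifferential at ReLU's kink:

```python
# Minimal sketch (assumes PyTorch): ReLU is not differentiable at 0, but it is
# sub-differentiable, so autograd still returns a usable (sub)gradient there.
import torch

x = torch.tensor(0.0, requires_grad=True)
torch.relu(x).backward()
print(x.grad)  # tensor(0.) -- one element of the subdifferential [0, 1] at the kink
```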

A Canonical Transform for Strengthening the Local $L^p$-Type Universal Approximation Property

2 code implementations • 24 Jun 2020 • Anastasis Kratsios, Behnoosh Zamanlooy

The transformed model class, denoted by $\mathscr{F}\text{-tope}$, is shown to be dense in $L^p_{\mu,\text{strict}}(\mathbb{R}^d,\mathbb{R}^D)$, a topological space whose elements are locally $p$-integrable functions and whose topology is much finer than the usual norm topology on $L^p_{\mu}(\mathbb{R}^d,\mathbb{R}^D)$; here $\mu$ is any suitable $\sigma$-finite Borel measure on $\mathbb{R}^d$.
