Search Results for author: Patricia Pauli

Found 13 papers, 5 papers with code

Lipschitz constant estimation for general neural network architectures using control tools

1 code implementation • 2 May 2024 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer

This paper is devoted to the estimation of the Lipschitz constant of neural networks using semidefinite programming.
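The SDP formulation itself is in the paper; as a point of reference, the naive bound that SDP-based estimators improve upon is the product of the layers' spectral norms, valid for any network with 1-Lipschitz activations such as ReLU. A minimal sketch with hypothetical weights:

```python
import numpy as np

def naive_lipschitz_bound(weights):
    """Product of spectral norms: a valid but often loose Lipschitz upper
    bound for networks with 1-Lipschitz activations (e.g. ReLU)."""
    return float(np.prod([np.linalg.norm(W, 2) for W in weights]))

# Hypothetical weights of a 3-layer network (illustrative only).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 8)),
           rng.standard_normal((16, 16)),
           rng.standard_normal((1, 16))]

print(naive_lipschitz_bound(weights))
```

SDP methods such as those in this line of work typically certify much tighter constants than this layer-wise product.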

State space representations of the Roesser type for convolutional layers

no code implementations • 18 Mar 2024 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer

For this reason, we explicitly provide a state space representation of the Roesser type for 2-D convolutional layers with $c_\mathrm{in}r_1 + c_\mathrm{out}r_2$ states, where $c_\mathrm{in}$/$c_\mathrm{out}$ is the number of input/output channels of the layer and $r_1$/$r_2$ characterizes the width/length of the convolution kernel.
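The state-count formula quoted in the abstract can be evaluated directly. A small helper, using hypothetical layer dimensions for illustration:

```python
def roesser_state_count(c_in, c_out, r1, r2):
    """Number of states in the Roesser-type realization of a 2-D
    convolutional layer, per the formula c_in * r1 + c_out * r2."""
    return c_in * r1 + c_out * r2

# Hypothetical layer: 3 input channels, 16 output channels, kernel
# extent characterized by r1 = 3 and r2 = 5.
print(roesser_state_count(3, 16, 3, 5))  # 3*3 + 16*5 = 89
```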

Novel Quadratic Constraints for Extending LipSDP beyond Slope-Restricted Activations

no code implementations • 25 Jan 2024 • Patricia Pauli, Aaron Havens, Alexandre Araujo, Siddharth Garg, Farshad Khorrami, Frank Allgöwer, Bin Hu

However, a direct application of LipSDP to the resulting residual ReLU networks is conservative and even fails to recover the well-known fact that the MaxMin activation is 1-Lipschitz.
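The 1-Lipschitz property of MaxMin mentioned here is easy to check numerically: the activation only sorts each pair of inputs, so it permutes its input values and is nonexpansive in the $\ell_2$ norm. A quick sketch (the pairing convention below is one common choice):

```python
import numpy as np

def maxmin(x):
    """MaxMin activation: splits x into consecutive pairs and returns the
    max and min of each pair. It permutes the input values, so it is
    1-Lipschitz in the l2 norm."""
    a, b = x[0::2], x[1::2]
    return np.concatenate([np.maximum(a, b), np.minimum(a, b)])

rng = np.random.default_rng(1)
x, y = rng.standard_normal(8), rng.standard_normal(8)
ratio = np.linalg.norm(maxmin(x) - maxmin(y)) / np.linalg.norm(x - y)
print(ratio)  # never exceeds 1
```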

Lipschitz-bounded 1D convolutional neural networks using the Cayley transform and the controllability Gramian

1 code implementation • 20 Mar 2023 • Patricia Pauli, Ruigang Wang, Ian R. Manchester, Frank Allgöwer

We establish a layer-wise parameterization for 1D convolutional neural networks (CNNs) with built-in end-to-end robustness guarantees.
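The Cayley transform underlying this parameterization maps any skew-symmetric matrix to an orthogonal one, which is the basic mechanism for building 1-Lipschitz layers from unconstrained parameters. A minimal sketch for a dense square matrix (the paper's construction for convolutional layers is considerably richer):

```python
import numpy as np

def cayley(A):
    """Cayley transform: for skew-symmetric A, (I - A)(I + A)^{-1} is
    orthogonal, and I + A is always invertible."""
    I = np.eye(A.shape[0])
    return (I - A) @ np.linalg.inv(I + A)

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M - M.T                # skew-symmetric free parameter
Q = cayley(A)
print(np.linalg.norm(Q @ Q.T - np.eye(4)))  # close to 0: Q is orthogonal
```

Because `Q` is orthogonal, a layer `x -> Q @ x` is exactly 1-Lipschitz, which is why such transforms give built-in robustness guarantees.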

Convolutional Neural Networks as 2-D systems

no code implementations • 6 Mar 2023 • Dennis Gramlich, Patricia Pauli, Carsten W. Scherer, Frank Allgöwer, Christian Ebenbauer

This paper introduces a novel representation of convolutional neural networks (CNNs) in terms of 2-D dynamical systems.

Lipschitz constant estimation for 1D convolutional neural networks

no code implementations • 28 Nov 2022 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer

In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs).

Bounding the difference between model predictive control and neural networks

no code implementations • 13 Apr 2022 • Ross Drummond, Stephen R. Duncan, Matthew C. Turner, Patricia Pauli, Frank Allgöwer

There is a growing debate on whether the future of feedback control systems will be dominated by data-driven or model-driven approaches.

Model Predictive Control

Neural network training under semidefinite constraints

1 code implementation • 3 Jan 2022 • Patricia Pauli, Niklas Funcke, Dennis Gramlich, Mohamed Amine Msalmi, Frank Allgöwer

This paper is concerned with the training of neural networks (NNs) under semidefinite constraints, which allows for NN training with robustness and stability guarantees.
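Verifying a semidefinite constraint numerically comes down to the smallest eigenvalue of a symmetric matrix. A toy illustration of that check (the paper's actual LMIs couple the weights across layers):

```python
import numpy as np

def is_psd(M, tol=1e-9):
    """Check M >= 0 (positive semidefinite) via its smallest eigenvalue."""
    return float(np.linalg.eigvalsh(M).min()) >= -tol

A = np.array([[2.0, -1.0], [-1.0, 2.0]])  # PSD: eigenvalues 1 and 3
B = np.array([[1.0,  2.0], [ 2.0, 1.0]])  # indefinite: eigenvalues -1 and 3
print(is_psd(A), is_psd(B))  # True False
```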

Linear systems with neural network nonlinearities: Improved stability analysis via acausal Zames-Falb multipliers

1 code implementation • 31 Mar 2021 • Patricia Pauli, Dennis Gramlich, Julian Berberich, Frank Allgöwer

In this paper, we analyze the stability of feedback interconnections of a linear time-invariant system with a neural network nonlinearity in discrete time.

Computational Efficiency

Offset-free setpoint tracking using neural network controllers

no code implementations • 23 Nov 2020 • Patricia Pauli, Johannes Köhler, Julian Berberich, Anne Koch, Frank Allgöwer

In this paper, we present a method to analyze local and global stability in offset-free setpoint tracking using neural network controllers and we provide ellipsoidal inner approximations of the corresponding region of attraction.

Robust and optimal predictive control of the COVID-19 outbreak

no code implementations • 7 May 2020 • Johannes Köhler, Lukas Schwenkel, Anne Koch, Julian Berberich, Patricia Pauli, Frank Allgöwer

Our theoretical findings support various recent studies by showing that 1) adaptive feedback strategies are required to reliably contain the COVID-19 outbreak, 2) well-designed policies can significantly reduce the number of fatalities compared to simpler ones while keeping the amount of social distancing measures on the same level, and 3) imposing stronger social distancing measures early on is more effective and cheaper in the long run than opening up too soon and restoring stricter measures at a later time.

Model Predictive Control

Training robust neural networks using Lipschitz bounds

1 code implementation • 6 May 2020 • Patricia Pauli, Anne Koch, Julian Berberich, Paul Kohler, Frank Allgöwer

More specifically, we design an optimization scheme based on the Alternating Direction Method of Multipliers that minimizes not only the training loss of an NN but also its Lipschitz constant, resulting in a semidefinite-programming-based training procedure that promotes robustness.
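The paper enforces the Lipschitz bound through semidefinite constraints; a much simpler (and weaker) proxy sometimes used in practice is to penalize each layer's spectral norm, which can be estimated cheaply by power iteration. A sketch of that building block:

```python
import numpy as np

def spectral_norm(W, iters=200):
    """Estimate ||W||_2 by power iteration on W^T W."""
    v = np.ones(W.shape[1]) / np.sqrt(W.shape[1])
    for _ in range(iters):
        v = W.T @ (W @ v)      # one step of power iteration
        v /= np.linalg.norm(v)
    return float(np.linalg.norm(W @ v))

rng = np.random.default_rng(3)
W = rng.standard_normal((10, 6))
# Agrees with the exact spectral norm from SVD up to iteration tolerance.
print(abs(spectral_norm(W) - np.linalg.norm(W, 2)))
```

Penalizing these per-layer norms bounds the naive product-of-norms Lipschitz estimate, whereas the SDP-based scheme in the paper certifies tighter network-level bounds.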
