Search Results for author: James Ferlez

Found 11 papers, 1 paper with code

SEO: Safety-Aware Energy Optimization Framework for Multi-Sensor Neural Controllers at the Edge

no code implementations • 24 Feb 2023 • Mohanad Odema, James Ferlez, Yasser Shoukry, Mohammad Abdullah Al Faruque

Runtime energy management has become essential for multi-sensor autonomous systems at the edge to achieve high performance under platform constraints.

Autonomous Driving • energy management +1

EnergyShield: Provably-Safe Offloading of Neural Network Controllers for Energy Efficiency

no code implementations • 13 Feb 2023 • Mohanad Odema, James Ferlez, Goli Vaisi, Yasser Shoukry, Mohammad Abdullah Al Faruque

To mitigate the high energy demand of Neural Network (NN) based Autonomous Driving Systems (ADSs), we consider the problem of offloading NN controllers from the ADS to nearby edge-computing infrastructure, but in such a way that formal vehicle safety properties are guaranteed.

Autonomous Driving • Edge-computing

Polynomial-Time Reachability for LTI Systems with Two-Level Lattice Neural Network Controllers

no code implementations • 20 Sep 2022 • James Ferlez, Yasser Shoukry

In this paper, we consider the computational complexity of bounding the reachable set of a Linear Time-Invariant (LTI) system controlled by a Rectified Linear Unit (ReLU) Two-Level Lattice (TLL) Neural Network (NN) controller.
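The setting can be illustrated with a generic one-step box (interval) over-approximation of the reachable set of an LTI system whose controller output is bounded. This is a sketch for intuition only, not the paper's polynomial-time TLL-specific algorithm; the function name and example matrices are assumptions.

```python
import numpy as np

def step_box(A, B, x_lo, x_hi, u_lo, u_hi):
    """One-step box over-approximation of the reachable set of
    x+ = A x + B u, given element-wise bounds on the state x and
    on the NN controller output u (interval arithmetic)."""
    Ap, An = np.maximum(A, 0.0), np.minimum(A, 0.0)  # split entries by sign
    Bp, Bn = np.maximum(B, 0.0), np.minimum(B, 0.0)
    lo = Ap @ x_lo + An @ x_hi + Bp @ u_lo + Bn @ u_hi
    hi = Ap @ x_hi + An @ x_lo + Bp @ u_hi + Bn @ u_lo
    return lo, hi

# Double-integrator example: state box [-1, 1]^2, controller output in [-1, 1]
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
lo, hi = step_box(A, B, np.array([-1.0, -1.0]), np.array([1.0, 1.0]),
                  np.array([-1.0]), np.array([1.0]))
print(lo, hi)  # [-1.1 -1.1] [1.1 1.1]
```

Iterating this step grows the box each iteration; the hardness question the paper studies is how tightly such bounds can be computed for TLL-controlled systems in polynomial time.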

Fast BATLLNN: Fast Box Analysis of Two-Level Lattice Neural Networks

no code implementations • 17 Nov 2021 • James Ferlez, Haitham Khedr, Yasser Shoukry

In this paper, we present the tool Fast Box Analysis of Two-Level Lattice Neural Networks (Fast BATLLNN) as a fast verifier of box-like output constraints for Two-Level Lattice (TLL) Neural Networks (NNs).

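For readers unfamiliar with the architecture, a TLL network represents a continuous piecewise-linear function as a maximum of minimums of local linear functions; it is this structure that makes box-like output constraints cheap to verify. A minimal sketch of that forward evaluation (the helper name and example are assumptions, not Fast BATLLNN's API):

```python
import numpy as np

def tll_forward(x, W, b, groups):
    """Evaluate a scalar Two-Level Lattice (TLL) network:
    y(x) = max over groups of ( min over that group's local linear functions ).
    W: (N, d) weight matrix, b: (N,) biases, groups: list of index lists."""
    local = W @ x + b                          # all N local linear functions
    return max(local[g].min() for g in groups)

# Example: y(x) = |x| = max( min(x), min(-x) ) on scalars
W = np.array([[1.0], [-1.0]])
b = np.array([0.0, 0.0])
groups = [[0], [1]]
print(tll_forward(np.array([-3.0]), W, b, groups))  # 3.0
```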

Assured Neural Network Architectures for Control and Identification of Nonlinear Systems

no code implementations • 21 Sep 2021 • James Ferlez, Yasser Shoukry

In this paper, we consider the problem of automatically designing a Rectified Linear Unit (ReLU) Neural Network (NN) architecture (number of layers and number of neurons per layer) with the assurance that it is sufficiently parametrized to control a nonlinear system; i.e., to control the system so that it satisfies a given formal specification.

Safe-by-Repair: A Convex Optimization Approach for Repairing Unsafe Two-Level Lattice Neural Network Controllers

no code implementations • 6 Apr 2021 • Ulices Santa Cruz, James Ferlez, Yasser Shoukry

In this paper, we consider the problem of repairing a data-trained Rectified Linear Unit (ReLU) Neural Network (NN) controller for a discrete-time, input-affine system.

Bounding the Complexity of Formally Verifying Neural Networks: A Geometric Approach

no code implementations • 22 Dec 2020 • James Ferlez, Yasser Shoukry

Specifically, we show that for two different NN architectures -- shallow NNs and Two-Level Lattice (TLL) NNs -- the verification problem with (convex) polytopic constraints is polynomial in the number of neurons in the NN to be verified, when all other aspects of the verification problem are held fixed.

PEREGRiNN: Penalized-Relaxation Greedy Neural Network Verifier

1 code implementation • 18 Jun 2020 • Haitham Khedr, James Ferlez, Yasser Shoukry

However, unique to our approach is the way we use a convex solver not only as a linear feasibility checker, but also as a means of penalizing the amount of relaxation allowed in solutions.
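The relaxation being penalized is typically the standard "triangle" convex relaxation of a ReLU. A minimal sketch of the slack such a relaxation admits (PEREGRiNN's actual penalization happens inside its LP solves; this illustration and its function name are assumptions):

```python
def relu_triangle_bounds(x, l, u):
    """Triangle relaxation of y = relu(x) for pre-activation bounds l < 0 < u:
    any feasible y satisfies 0 <= y, x <= y, and y <= u * (x - l) / (u - l).
    Returns the (y_min, y_max) interval the relaxation allows at this x."""
    assert l < 0.0 < u
    y_min = max(0.0, x)               # exact ReLU value (lower edge of triangle)
    y_max = u * (x - l) / (u - l)     # upper chord of the triangle
    return y_min, y_max

y_min, y_max = relu_triangle_bounds(0.0, -1.0, 1.0)
print(y_min, y_max)  # 0.0 0.5 -- the 0.5 gap is the slack a solver can penalize
```

Driving this gap toward zero pushes the relaxed solution back toward an exact ReLU assignment, which is the intuition behind penalizing relaxation.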

ShieldNN: A Provably Safe NN Filter for Unsafe NN Controllers

no code implementations • 16 Jun 2020 • James Ferlez, Mahmoud Elnaggar, Yasser Shoukry, Cody Fleming

In this paper, we consider the problem of creating a safe-by-design Rectified Linear Unit (ReLU) Neural Network (NN), which, when composed with an arbitrary control NN, makes the composition provably safe.
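The filter-then-compose idea can be sketched generically: the safety filter maps the current state to a safe input set and projects the arbitrary controller's output onto it. The interval safe set, function names, and toy controller below are assumptions; ShieldNN itself derives its filter from a barrier-function safety condition.

```python
def shielded_control(x, controller, safe_set):
    """Compose an arbitrary controller with a safety filter: project the
    controller's output onto the state-dependent safe input interval."""
    lo, hi = safe_set(x)
    return min(max(controller(x), lo), hi)

# Toy example: admissible inputs shrink as the state nears the boundary |x| = 1
safe_set = lambda x: (-(1.0 - abs(x)), 1.0 - abs(x))
aggressive = lambda x: 10.0 * x            # unconstrained, potentially unsafe NN
print(shielded_control(0.5, aggressive, safe_set))  # 0.5 (clamped from 5.0)
```

The key property the paper proves is that the composition stays safe no matter what the wrapped controller outputs.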

Two-Level Lattice Neural Network Architectures for Control of Nonlinear Systems

no code implementations • 20 Apr 2020 • James Ferlez, Xiaowu Sun, Yasser Shoukry

In this paper, we consider the problem of automatically designing a Rectified Linear Unit (ReLU) Neural Network (NN) architecture (number of layers and number of neurons per layer) with the guarantee that it is sufficiently parametrized to control a nonlinear system.


AReN: Assured ReLU NN Architecture for Model Predictive Control of LTI Systems

no code implementations • 5 Nov 2019 • James Ferlez, Yasser Shoukry

In this paper, we consider the problem of automatically designing a Rectified Linear Unit (ReLU) Neural Network (NN) architecture that is sufficient to implement the optimal Model Predictive Control (MPC) strategy for an LTI system with quadratic cost.

Model Predictive Control
