2 code implementations • 28 Dec 2023 • Christopher Brix, Stanley Bak, Changliu Liu, Taylor T. Johnson
This report summarizes the 4th International Verification of Neural Networks Competition (VNN-COMP 2023), held as a part of the 6th Workshop on Formal Methods for ML-Enabled Autonomous Systems (FoMLAS), which was collocated with the 35th International Conference on Computer-Aided Verification (CAV).
3 code implementations • NeurIPS 2023 • Suhas Kotha, Christopher Brix, Zico Kolter, Krishnamurthy Dvijotham, Huan Zhang
Most work on the formal verification of neural networks has focused on bounding the set of outputs that correspond to a given set of inputs (for example, bounded perturbations of a nominal input).
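The forward-bounding setting described above can be illustrated with interval bound propagation, one standard way to bound a network's outputs over a box of perturbed inputs. This is a minimal sketch, not the paper's method; the weights, perturbation radius, and single linear-plus-ReLU layer are illustrative assumptions.

```python
import numpy as np

def interval_linear(lo, hi, W, b):
    """Propagate elementwise bounds lo <= x <= hi through y = W @ x + b.
    Positive weights pair with the matching bound; negative weights swap them."""
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    out_lo = W_pos @ lo + W_neg @ hi + b
    out_hi = W_pos @ hi + W_neg @ lo + b
    return out_lo, out_hi

def interval_relu(lo, hi):
    """ReLU is monotone, so it maps bounds to bounds directly."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

# Illustrative layer and a bounded perturbation of a nominal input.
rng = np.random.default_rng(0)
W, b = rng.standard_normal((3, 4)), rng.standard_normal(3)
x_nominal = rng.standard_normal(4)
lo, hi = x_nominal - 0.1, x_nominal + 0.1
y_lo, y_hi = interval_relu(*interval_linear(lo, hi, W, b))
```

Every input in the box is guaranteed to map into `[y_lo, y_hi]`, though the bounds are generally not tight; tighter relaxations are what much of the verification literature refines.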
no code implementations • 14 Jan 2023 • Christopher Brix, Mark Niklas Müller, Stanley Bak, Taylor T. Johnson, Changliu Liu
This paper presents a summary and meta-analysis of the first three iterations of the annual International Verification of Neural Networks Competition (VNN-COMP) held in 2020, 2021, and 2022.
1 code implementation • 20 Dec 2022 • Mark Niklas Müller, Christopher Brix, Stanley Bak, Changliu Liu, Taylor T. Johnson
This report summarizes the 3rd International Verification of Neural Networks Competition (VNN-COMP 2022), held as a part of the 5th Workshop on Formal Methods for ML-Enabled Autonomous Systems (FoMLAS), which was collocated with the 34th International Conference on Computer-Aided Verification (CAV).
no code implementations • 24 Nov 2020 • Parnia Bahar, Christopher Brix, Hermann Ney
Neural translation models have proven to be effective in capturing sufficient information from a source sentence and generating a high-quality target sentence.
1 code implementation • 16 Jun 2020 • Christopher Brix, Thomas Noll
We provide proofs for tight upper and lower bounds on max-pooling layers in convolutional networks.
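The core observation behind such bounds can be sketched in a few lines: because the max operation is monotone in each argument, elementwise interval bounds on a pooling window yield bounds on its output, and both endpoints are achievable, so they are tight. This sketch is an illustration of that fact, not the paper's proofs; the window values are made up.

```python
import numpy as np

def maxpool_bounds(lo, hi):
    """Given lo[i] <= x[i] <= hi[i] over one pooling window, return
    tight bounds on max(x): max(lo) <= max(x) <= max(hi)."""
    return lo.max(), hi.max()

# Illustrative bounds on a 3-element pooling window.
lo = np.array([0.1, -0.5, 0.3])
hi = np.array([0.4, 0.2, 0.9])
out_lo, out_hi = maxpool_bounds(lo, hi)
# out_lo = 0.3, out_hi = 0.9
```

Both endpoints are attained by a concrete input in the window's box (take `x = lo` for the lower bound and `x = hi` for the upper), which is what makes the bounds tight rather than merely sound.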
no code implementations • ACL 2020 • Christopher Brix, Parnia Bahar, Hermann Ney
Sparse models require less memory for storage and enable faster inference by reducing the number of required FLOPs.
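One common way to obtain such sparsity is magnitude pruning: zeroing the weights with the smallest absolute values. This is a generic sketch of that idea, not the specific method of the paper; the 90% sparsity target and weight matrix are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(W, sparsity):
    """Return a copy of W with the smallest-magnitude entries set to zero,
    so that roughly `sparsity` of all entries are pruned."""
    k = int(sparsity * W.size)
    if k == 0:
        return W.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    return np.where(np.abs(W) <= threshold, 0.0, W)

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
W_sparse = magnitude_prune(W, 0.9)
# About 90% of entries are now zero; sparse storage formats and sparse
# kernels can then skip the zeroed weights at inference time.
```

In practice pruning is usually followed by fine-tuning to recover accuracy, and the FLOP savings only materialize when the runtime actually exploits the sparsity pattern.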
1 code implementation • EMNLP 2018 • Parnia Bahar, Christopher Brix, Hermann Ney
This work investigates an alternative model for neural machine translation (NMT) and proposes a novel architecture, where we employ a multi-dimensional long short-term memory (MDLSTM) for translation modeling.