Search Results for author: Nathaniel A. Trask

Found 6 papers, 1 paper with code

Machine learning structure preserving brackets for forecasting irreversible processes

no code implementations · NeurIPS 2021 · Kookjin Lee, Nathaniel A. Trask, Panos Stinis

Forecasting of time-series data requires imposition of inductive biases to obtain predictive extrapolation, and recent works have imposed Hamiltonian/Lagrangian form to preserve structure for systems with reversible dynamics.

Tasks: BIG-bench Machine Learning · Time Series · +1
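The Hamiltonian inductive bias mentioned in the abstract can be made concrete with a small sketch (illustrative only, not the paper's bracket formalism): writing the dynamics as dz/dt = J ∇H with J antisymmetric guarantees energy conservation along exact trajectories, since dH/dt = ∇H · J∇H = 0.

```python
import numpy as np

# Illustrative sketch of a structure-preserving (Hamiltonian) parameterization:
# dynamics of the form dz/dt = J @ grad_H(z) with antisymmetric J conserve H
# exactly, because dH/dt = grad_H(z) . (J @ grad_H(z)) = 0 by antisymmetry.
# The harmonic-oscillator H below is a stand-in, not from the paper.

J = np.array([[0.0, 1.0],
              [-1.0, 0.0]])      # canonical symplectic matrix

def grad_H(z):
    # H(q, p) = (q**2 + p**2) / 2, so grad H = z
    return z

def rhs(z):
    return J @ grad_H(z)

z = np.array([1.0, 0.0])
dH_dt = grad_H(z) @ rhs(z)       # vanishes identically by antisymmetry of J
```

Irreversible processes, the subject of this paper, require additional dissipative structure on top of such a reversible bracket.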

Partition of unity networks: deep hp-approximation

no code implementations · 27 Jan 2021 · Kookjin Lee, Nathaniel A. Trask, Ravi G. Patel, Mamikon A. Gulian, Eric C. Cyr

Approximation theorists have established best-in-class optimal approximation rates of deep neural networks by utilizing their ability to simultaneously emulate partitions of unity and monomials.

Tasks: Unity
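The partition-of-unity idea in the abstract can be sketched in a few lines (an assumption about the general construction, not the authors' implementation): a softmax over learned scores produces functions phi_i(x) that are nonnegative and sum to one, and each phi_i weights a local polynomial, yielding a piecewise-polynomial approximant.

```python
import numpy as np

# Sketch of a partition-of-unity approximant: softmax scores give phi_i >= 0
# with sum_i phi_i(x) = 1, and each partition carries its own polynomial, so
# u(x) = sum_i phi_i(x) * p_i(x). Random weights stand in for trained ones.

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)[:, None]            # 1-D sample points

n_parts, degree = 4, 2
scores = x @ rng.normal(size=(1, n_parts)) + rng.normal(size=n_parts)
scores -= scores.max(axis=1, keepdims=True)        # numerically stable softmax
phi = np.exp(scores)
phi /= phi.sum(axis=1, keepdims=True)              # rows sum to one exactly

V = np.hstack([x**k for k in range(degree + 1)])   # monomial basis 1, x, x^2
coeffs = rng.normal(size=(degree + 1, n_parts))    # one polynomial per part
u = (phi * (V @ coeffs)).sum(axis=1)               # blended local polynomials

partition_check = float(np.abs(phi.sum(axis=1) - 1.0).max())
```

Because the network can realize both the partition functions and the monomials, it inherits hp-approximation rates from classical approximation theory.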

A physics-informed operator regression framework for extracting data-driven continuum models

1 code implementation · 25 Sep 2020 · Ravi G. Patel, Nathaniel A. Trask, Mitchell A. Wood, Eric C. Cyr

The application of deep learning toward discovery of data-driven models requires careful application of inductive biases to obtain a description of physics which is both accurate and robust.

Tasks: regression

A block coordinate descent optimizer for classification problems exploiting convexity

no code implementations · 17 Jun 2020 · Ravi G. Patel, Nathaniel A. Trask, Mamikon A. Gulian, Eric C. Cyr

By alternating between a second-order method to find globally optimal parameters for the linear layer and gradient descent to train the hidden layers, we ensure an optimal fit of the adaptive basis to data throughout training.

Tasks: Classification · General Classification · +2
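The alternating scheme described in the abstract can be sketched as follows (a minimal least-squares variant, assuming a single hidden layer; the paper targets classification, and this is not the authors' code): the hidden layer defines an adaptive basis, the output layer is solved exactly (globally optimal for a quadratic loss), and the hidden weights take a gradient step.

```python
import numpy as np

# Block coordinate descent sketch: alternate (1) an exact linear solve for the
# last layer, which is globally optimal under a least-squares loss, with
# (2) a gradient-descent step on the hidden layer that shapes the basis.
# Architecture, data, and step size are illustrative choices.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))              # inputs
y = np.sin(X[:, :1]) + 0.1 * X[:, 1:]      # synthetic regression targets

W = rng.normal(size=(2, 16)) * 0.5         # hidden-layer weights
b = np.zeros(16)
lr = 1e-2

def basis(X, W, b):
    return np.tanh(X @ W + b)              # adaptive basis functions

for _ in range(100):
    # (1) second-order step: exact least-squares solve for the linear layer
    Phi = basis(X, W, b)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    # (2) gradient-descent step on the hidden layer, holding c fixed
    r = Phi @ c - y                        # residual, shape (200, 1)
    dPhi = (r @ c.T) * (1.0 - Phi**2)      # backprop through tanh
    W -= lr * (X.T @ dPhi) / len(X)
    b -= lr * dPhi.mean(axis=0)

loss = float(np.mean((basis(X, W, b) @ c - y) ** 2))
```

Because the linear layer is refit exactly at every step, the basis is always used optimally, which is the "optimal fit of the adaptive basis to data throughout training" the abstract refers to.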

Robust Training and Initialization of Deep Neural Networks: An Adaptive Basis Viewpoint

no code implementations · 10 Dec 2019 · Eric C. Cyr, Mamikon A. Gulian, Ravi G. Patel, Mauro Perego, Nathaniel A. Trask

Motivated by the gap between theoretical optimal approximation rates of deep neural networks (DNNs) and the accuracy realized in practice, we seek to improve the training of DNNs.

Tasks: regression
