Search Results for author: Julianne Chung

Found 4 papers, 1 paper with code

Goal-oriented Uncertainty Quantification for Inverse Problems via Variational Encoder-Decoder Networks

no code implementations • 17 Apr 2023 • Babak Maboudi Afkham, Julianne Chung, Matthias Chung

In this work, we describe a new approach that uses variational encoder-decoder (VED) networks for efficient goal-oriented uncertainty quantification for inverse problems.

Decoder • Uncertainty Quantification

slimTrain -- A Stochastic Approximation Method for Training Separable Deep Neural Networks

1 code implementation • 28 Sep 2021 • Elizabeth Newman, Julianne Chung, Matthias Chung, Lars Ruthotto

In the absence of theoretical guidelines or prior experience on similar tasks, this requires solving many training problems, which can be time-consuming and demanding on computational resources.

Stochastic Optimization

Learning Regularization Parameters of Inverse Problems via Deep Neural Networks

no code implementations • 14 Apr 2021 • Babak Maboudi Afkham, Julianne Chung, Matthias Chung

We emphasize that the key advantage of using DNNs for learning regularization parameters, compared to previous works on learning via optimal experimental design or empirical Bayes risk minimization, is greater generalizability.

Bilevel Optimization • Experimental Design

Stochastic Newton and Quasi-Newton Methods for Large Linear Least-squares Problems

no code implementations • 23 Feb 2017 • Julianne Chung, Matthias Chung, J. Tanner Slagel, Luis Tenorio

We describe stochastic Newton and stochastic quasi-Newton approaches for efficiently solving large linear least-squares problems in which very large data sets present a significant computational burden (e.g., the data may exceed computer memory or be collected in real time).
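To illustrate the general idea behind such methods (not the authors' specific algorithm), the following minimal NumPy sketch solves a large linear least-squares problem by streaming row blocks of the data and accumulating the normal-equations quantities, so the full data matrix never needs to be held in memory at once. The function name, `block_size`, and the small Tikhonov term `reg` are illustrative choices, not from the paper.

```python
import numpy as np

def block_streamed_lls(A, b, block_size=1000, reg=1e-8):
    """Solve min_x ||Ax - b||^2 by streaming row blocks of (A, b).

    Accumulates the Gram matrix A^T A and the vector A^T b one block
    at a time, mimicking a setting where data arrive incrementally or
    exceed memory. A tiny regularization `reg` keeps the solve stable.
    (Illustrative sketch only, not the method from the paper.)
    """
    m, n = A.shape
    H = reg * np.eye(n)      # running A^T A plus regularization
    g = np.zeros(n)          # running A^T b
    for start in range(0, m, block_size):
        Ak = A[start:start + block_size]
        bk = b[start:start + block_size]
        H += Ak.T @ Ak
        g += Ak.T @ bk
    return np.linalg.solve(H, g)

# Usage: the streamed solution matches the direct least-squares solve.
rng = np.random.default_rng(0)
A = rng.standard_normal((10_000, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 0.01 * rng.standard_normal(10_000)
x = block_streamed_lls(A, b)
```

In a true stochastic Newton or quasi-Newton method the blocks would be sampled rather than swept in order, and the Hessian approximation updated and applied at each iteration; this sketch only shows the memory-friendly block structure that makes such approaches attractive.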
