Search Results for author: George Em Karniadakis

Found 80 papers, 27 papers with code

Leveraging viscous Hamilton-Jacobi PDEs for uncertainty quantification in scientific machine learning

no code implementations12 Apr 2024 Zongren Zou, Tingwei Meng, Paula Chen, Jérôme Darbon, George Em Karniadakis

We provide several examples from SciML involving noisy data and epistemic uncertainty to illustrate the potential advantages of our approach.

Bayesian Inference Uncertainty Quantification

Learning in PINNs: Phase transition, total diffusion, and generalization

no code implementations27 Mar 2024 Sokratis J. Anagnostopoulos, Juan Diego Toscano, Nikolaos Stergiopulos, George Em Karniadakis

We investigate the learning dynamics of fully-connected neural networks through the lens of gradient signal-to-noise ratio (SNR), examining the behavior of first-order optimizers like Adam in non-convex objectives.
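The gradient signal-to-noise ratio studied here can be illustrated with a minimal sketch (the per-sample gradient values below are hypothetical, purely for illustration; they are not the paper's setup):

```python
import math

def gradient_snr(per_sample_grads):
    """SNR of one parameter's gradient across a batch:
    |mean| / std of the per-sample gradient values."""
    n = len(per_sample_grads)
    mean = sum(per_sample_grads) / n
    var = sum((g - mean) ** 2 for g in per_sample_grads) / n
    return abs(mean) / math.sqrt(var) if var > 0 else float("inf")

# Agreeing per-sample gradients -> high SNR (signal-dominated learning)
print(gradient_snr([0.9, 1.1, 1.0, 0.95]))
# Conflicting per-sample gradients -> low SNR (noise-dominated diffusion)
print(gradient_snr([0.9, -1.1, 1.0, -0.95]))
```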

Two-scale Neural Networks for Partial Differential Equations with Small Parameters

no code implementations27 Feb 2024 Qiao Zhuang, Chris Ziyi Yao, Zhongqiang Zhang, George Em Karniadakis

We propose a two-scale neural network method for solving partial differential equations (PDEs) with small parameters using physics-informed neural networks (PINNs).

Score-Based Physics-Informed Neural Networks for High-Dimensional Fokker-Planck Equations

no code implementations12 Feb 2024 Zheyuan Hu, Zhongqiang Zhang, George Em Karniadakis, Kenji Kawaguchi

The score function, defined as the gradient of the LL, plays a fundamental role in inferring LL and PDF and enables fast SDE sampling.
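As a toy illustration of the score function, i.e. the gradient of the log-likelihood (a 1D Gaussian stands in for the Fokker-Planck density here; this is not the paper's SDE setting):

```python
import math

def log_likelihood(x, mu=0.0, sigma=1.0):
    """Log-density of a 1D Gaussian."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def score(x, mu=0.0, sigma=1.0, h=1e-5):
    """Score = d/dx log p(x), here via a central finite difference."""
    return (log_likelihood(x + h, mu, sigma) - log_likelihood(x - h, mu, sigma)) / (2 * h)

# For a Gaussian the score has the closed form -(x - mu) / sigma^2:
x = 1.5
print(score(x))          # ~ -1.5
print(-(x - 0.0) / 1.0)  # -1.5
```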

RiemannONets: Interpretable Neural Operators for Riemann Problems

1 code implementation16 Jan 2024 Ahmad Peyvan, Vivek Oommen, Ameya D. Jagtap, George Em Karniadakis

Developing the proper representations for simulating high-speed flows with strong shock waves, rarefactions, and contact discontinuities has been a long-standing question in numerical analysis.

Hutchinson Trace Estimation for High-Dimensional and High-Order Physics-Informed Neural Networks

1 code implementation22 Dec 2023 Zheyuan Hu, Zekun Shi, George Em Karniadakis, Kenji Kawaguchi

We further showcase HTE's convergence to the original PINN loss and its unbiased behavior under specific conditions.
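The Hutchinson estimator underlying HTE can be sketched generically: the trace of a matrix is estimated as the expectation of v^T A v over random Rademacher vectors v (a dense-matrix sketch; per the title, the paper applies the idea to high-order derivative operators inside PINNs):

```python
import random

def hutchinson_trace(A, n_samples=20000, seed=0):
    """Estimate tr(A) as the sample mean of v^T A v with Rademacher probes v."""
    rng = random.Random(seed)
    d = len(A)
    total = 0.0
    for _ in range(n_samples):
        v = [rng.choice((-1.0, 1.0)) for _ in range(d)]
        Av = [sum(A[i][j] * v[j] for j in range(d)) for i in range(d)]
        total += sum(v[i] * Av[i] for i in range(d))
    return total / n_samples

A = [[2.0, 1.0], [1.0, 3.0]]   # exact trace = 5
print(hutchinson_trace(A))      # close to 5
```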

AI-Lorenz: A physics-data-driven framework for black-box and gray-box identification of chaotic systems with symbolic regression

1 code implementation21 Dec 2023 Mario De Florio, Ioannis G. Kevrekidis, George Em Karniadakis

The performance of this framework is validated by recovering the right-hand sides and unknown terms of certain complex, chaotic systems such as the well-known Lorenz system, a six-dimensional hyperchaotic system, and the non-autonomous Sprott chaotic system, and comparing them with their known analytical expressions.

Symbolic Regression

Rethinking materials simulations: Blending direct numerical simulations with neural operators

1 code implementation8 Dec 2023 Vivek Oommen, Khemraj Shukla, Saaketh Desai, Remi Dingreville, George Em Karniadakis

This methodology is based on the integration of a community numerical solver with a U-Net neural operator, enhanced by a temporal-conditioning mechanism that enables accurate extrapolation and efficient time-to-solution predictions of the dynamics.

Geophysics

GPT vs Human for Scientific Reviews: A Dual Source Review on Applications of ChatGPT in Science

no code implementations5 Dec 2023 Chenxi Wu, Alan John Varghese, Vivek Oommen, George Em Karniadakis

Herein, we consider 13 GPT-related papers across different scientific domains, reviewed by a human reviewer and SciSpace, a large language model, with the reviews evaluated by three distinct types of evaluators, namely GPT-3.5, a crowd panel, and GPT-4.

Language Modelling Large Language Model

Rethinking Skip Connections in Spiking Neural Networks with Time-To-First-Spike Coding

no code implementations1 Dec 2023 Youngeun Kim, Adar Kahana, Ruokai Yin, Yuhang Li, Panos Stinis, George Em Karniadakis, Priyadarshini Panda

In this work, we delve into the role of skip connections, a widely used concept in Artificial Neural Networks (ANNs), within the domain of SNNs with TTFS coding.

Bias-Variance Trade-off in Physics-Informed Neural Networks with Randomized Smoothing for High-Dimensional PDEs

no code implementations26 Nov 2023 Zheyuan Hu, Zhouhao Yang, Yezhen Wang, George Em Karniadakis, Kenji Kawaguchi

To optimize the bias-variance trade-off, we combine the two approaches in a hybrid method that balances the rapid convergence of the biased version with the high accuracy of the unbiased version.

Computational Efficiency

Mechanical Characterization and Inverse Design of Stochastic Architected Metamaterials Using Neural Operators

no code implementations23 Nov 2023 Hanxun Jin, Enrui Zhang, Boyu Zhang, Sridhar Krishnaswamy, George Em Karniadakis, Horacio D. Espinosa

Our work marks a significant advancement in the field of materials-by-design, potentially heralding a new era in the discovery and development of next-generation metamaterials with unparalleled mechanical characteristics derived directly from experimental insights.

Uncertainty quantification for noisy inputs-outputs in physics-informed neural networks and neural operators

no code implementations19 Nov 2023 Zongren Zou, Xuhui Meng, George Em Karniadakis

As a result, UQ for noisy inputs becomes a crucial factor for reliable and trustworthy deployment of these models in applications involving physical knowledge.

Uncertainty Quantification

Leveraging Hamilton-Jacobi PDEs with time-dependent Hamiltonians for continual scientific machine learning

no code implementations13 Nov 2023 Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis

This connection allows us to reinterpret incremental updates to learned models as the evolution of an associated HJ PDE and optimal control problem in time, where all of the previous information is intrinsically encoded in the solution to the HJ PDE.

Computational Efficiency Continual Learning

Operator Learning Enhanced Physics-informed Neural Networks for Solving Partial Differential Equations Characterized by Sharp Solutions

no code implementations30 Oct 2023 Bin Lin, Zhiping Mao, Zhicheng Wang, George Em Karniadakis

Initially, we utilize DeepONet to learn the solution operator for a set of smooth problems relevant to the PDEs characterized by sharp solutions.

Operator learning

Correcting model misspecification in physics-informed neural networks (PINNs)

no code implementations16 Oct 2023 Zongren Zou, Xuhui Meng, George Em Karniadakis

Despite the effectiveness of PINNs for discovering governing equations, the physical models encoded in PINNs may be misspecified in complex systems as some of the physical processes may not be fully understood, leading to the poor accuracy of PINN predictions.

DON-LSTM: Multi-Resolution Learning with DeepONets and Long Short-Term Memory Neural Networks

1 code implementation3 Oct 2023 Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep operator networks (DeepONets, DONs) offer a distinct advantage over traditional neural networks in their ability to be trained on multi-resolution data.

AI-Aristotle: A Physics-Informed framework for Systems Biology Gray-Box Identification

1 code implementation29 Sep 2023 Nazanin Ahmadi Daryakenari, Mario De Florio, Khemraj Shukla, George Em Karniadakis

The proposed framework -- named AI-Aristotle -- combines eXtreme Theory of Functional Connections (X-TFC) domain-decomposition and Physics-Informed Neural Networks (PINNs) with symbolic regression (SR) techniques for parameter discovery and gray-box identification.

regression Symbolic Regression

Artificial to Spiking Neural Networks Conversion for Scientific Machine Learning

no code implementations31 Aug 2023 Qian Zhang, Chenxi Wu, Adar Kahana, Youngeun Kim, Yuhang Li, George Em Karniadakis, Priyadarshini Panda

We introduce a method to convert Physics-Informed Neural Networks (PINNs), commonly used in scientific machine learning, to Spiking Neural Networks (SNNs), which are expected to have higher energy efficiency compared to traditional Artificial Neural Networks (ANNs).

Computational Efficiency

Sound propagation in realistic interactive 3D scenes with parameterized sources using deep neural operators

1 code implementation9 Aug 2023 Nikolas Borrel-Jensen, Somdatta Goswami, Allan P. Engsig-Karup, George Em Karniadakis, Cheol-Ho Jeong

We address the challenge of sound propagation simulations in 3D virtual rooms with moving sources, which have applications in virtual/augmented reality, game audio, and spatial computing.

Tackling the Curse of Dimensionality with Physics-Informed Neural Networks

no code implementations23 Jul 2023 Zheyuan Hu, Khemraj Shukla, George Em Karniadakis, Kenji Kawaguchi

We demonstrate in diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions very fast on a single GPU using the PINNs mesh-free approach.

Real-time Inference and Extrapolation via a Diffusion-inspired Temporal Transformer Operator (DiTTO)

no code implementations18 Jul 2023 Oded Ovadia, Vivek Oommen, Adar Kahana, Ahmad Peyvan, Eli Turkel, George Em Karniadakis

The proposed method, named Diffusion-inspired Temporal Transformer Operator (DiTTO), is inspired by latent diffusion models and their conditioning mechanism, which we use to incorporate the temporal evolution of the PDE, in combination with elements from the transformer architecture to improve its capabilities.

Operator learning Super-Resolution

Discovering a reaction-diffusion model for Alzheimer's disease by combining PINNs with symbolic regression

no code implementations16 Jul 2023 Zhen Zhang, Zongren Zou, Ellen Kuhl, George Em Karniadakis

Specifically, we integrate physics informed neural networks (PINNs) and symbolic regression to discover a reaction-diffusion type partial differential equation for tau protein misfolding and spreading.

regression Symbolic Regression

TransformerG2G: Adaptive time-stepping for learning temporal graph embeddings using transformers

1 code implementation5 Jul 2023 Alan John Varghese, Aniruddha Bora, Mengjia Xu, George Em Karniadakis

Hence, incorporating long-range dependencies from the historical graph context plays a crucial role in accurately learning their temporal dynamics.

Anomaly Detection Computational Efficiency +6

Residual-based attention and connection to information bottleneck theory in PINNs

1 code implementation1 Jul 2023 Sokratis J. Anagnostopoulos, Juan Diego Toscano, Nikolaos Stergiopulos, George Em Karniadakis

Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years.

MyCrunchGPT: A chatGPT assisted framework for scientific machine learning

no code implementations27 Jun 2023 Varun Kumar, Leonard Gleyzer, Adar Kahana, Khemraj Shukla, George Em Karniadakis

To demonstrate the flow of MyCrunchGPT and create an infrastructure that can facilitate a broader vision, we built a webapp-based guided user interface that includes options for a comprehensive summary report.

Code Generation Geophysics

A Framework Based on Symbolic Regression Coupled with eXtended Physics-Informed Neural Networks for Gray-Box Learning of Equations of Motion from Data

no code implementations18 May 2023 Elham Kiyani, Khemraj Shukla, George Em Karniadakis, Mikko Karttunen

In addition, symbolic regression is employed to determine the closed form of the unknown part of the equation from the data, and the results confirm the accuracy of the X-PINNs based approach.

Symbolic Regression

Physics-informed neural networks for predicting gas flow dynamics and unknown parameters in diesel engines

no code implementations26 Apr 2023 Kamaljyoti Nath, Xuhui Meng, Daniel J Smith, George Em Karniadakis

In other words, the mean value model uses both the PINN model and the DNNs to represent the engine's states, with the PINN providing a physics-based understanding of the engine's overall dynamics and the DNNs offering a more engine-specific and adaptive representation of the empirical formulae.

Learning in latent spaces improves the predictive accuracy of deep neural operators

1 code implementation15 Apr 2023 Katiana Kontolati, Somdatta Goswami, George Em Karniadakis, Michael D. Shields

Operator regression provides a powerful means of constructing discretization-invariant emulators for partial-differential equations (PDEs) describing physical systems.

Computational Efficiency

Real-Time Prediction of Gas Flow Dynamics in Diesel Engines using a Deep Neural Operator Framework

no code implementations2 Apr 2023 Varun Kumar, Somdatta Goswami, Daniel J. Smith, George Em Karniadakis

As an alternative to physics-based models, we develop an operator-based regression model (DeepONet) to learn the relevant output states for a mean-value gas flow engine model using the engine operating conditions as input variables.

Leveraging Multi-time Hamilton-Jacobi PDEs for Certain Scientific Machine Learning Problems

1 code implementation22 Mar 2023 Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis

Hamilton-Jacobi partial differential equations (HJ PDEs) have deep connections with a wide range of fields, including optimal control, differential games, and imaging sciences.

Continual Learning Transfer Learning

LNO: Laplace Neural Operator for Solving Differential Equations

no code implementations19 Mar 2023 Qianying Cao, Somdatta Goswami, George Em Karniadakis

Herein, we demonstrate the superior approximation accuracy of a single Laplace layer in LNO over four Fourier modules in FNO in approximating the solutions of three ODEs (Duffing oscillator, driven gravity pendulum, and Lorenz system) and three PDEs (Euler-Bernoulli beam, diffusion equation, and reaction-diffusion system).

Operator learning

ViTO: Vision Transformer-Operator

no code implementations15 Mar 2023 Oded Ovadia, Adar Kahana, Panos Stinis, Eli Turkel, George Em Karniadakis

We combine vision transformers with operator learning to solve diverse inverse problems described by partial differential equations (PDEs).

Operator learning Super-Resolution

Neural Operator Learning for Long-Time Integration in Dynamical Systems with Recurrent Neural Networks

no code implementations3 Mar 2023 Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep neural networks are an attractive alternative for simulating complex dynamical systems, as in comparison to traditional scientific computing methods, they offer reduced computational costs during inference and can be trained directly from observational data.

Operator learning

Learning stiff chemical kinetics using extended deep neural operators

1 code implementation23 Feb 2023 Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis

Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.

Unity

Learning bias corrections for climate models using deep neural operators

no code implementations7 Feb 2023 Aniruddha Bora, Khemraj Shukla, Shixuan Zhang, Bryce Harrop, Ruby Leung, George Em Karniadakis

In this study, we replace the bias correction process with a surrogate model based on the Deep Operator Network (DeepONet).

Deep neural operators can serve as accurate surrogates for shape optimization: A case study for airfoils

no code implementations2 Feb 2023 Khemraj Shukla, Vivek Oommen, Ahmad Peyvan, Michael Penwarden, Luis Bravo, Anindya Ghoshal, Robert M. Kirby, George Em Karniadakis

Deep neural operators, such as DeepONets, have changed the paradigm in high-dimensional nonlinear regression from function regression to (differential) operator regression, paving the way for significant changes in computational engineering applications.

regression

A Hybrid Deep Neural Operator/Finite Element Method for Ice-Sheet Modeling

no code implementations26 Jan 2023 Qizhi He, Mauro Perego, Amanda A. Howard, George Em Karniadakis, Panos Stinis

One of the most challenging and consequential problems in climate modeling is to provide probabilistic projections of sea level rise.

Friction Uncertainty Quantification

L-HYDRA: Multi-Head Physics-Informed Neural Networks

no code implementations5 Jan 2023 Zongren Zou, George Em Karniadakis

We introduce multi-head neural networks (MH-NNs) to physics-informed machine learning; MH-NNs are neural networks (NNs) whose nonlinear hidden layers form a shared body topped by multiple linear output layers serving as the heads.

Few-Shot Learning Multi-Task Learning +2
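The body-plus-heads structure described above can be sketched in a few lines (the weights, sizes, and two-task split here are hypothetical, purely illustrative):

```python
import math

def body(x, W, b):
    """Shared nonlinear body: one tanh hidden layer (illustrative sizes)."""
    return [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

def head(h, w, c):
    """One linear output head on top of the shared body features."""
    return sum(wi * hi for wi, hi in zip(w, h)) + c

# Hypothetical weights: one shared body, two heads (e.g. two related tasks).
W, b = [[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1]
h = body([1.0, 2.0], W, b)
print(head(h, [1.0, -1.0], 0.0))  # task-1 output
print(head(h, [0.3, 0.7], 0.5))   # task-2 output
```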

Reliable extrapolation of deep neural operators informed by physics or sparse observations

1 code implementation13 Dec 2022 Min Zhu, Handi Zhang, Anran Jiao, George Em Karniadakis, Lu Lu

Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.

SMS: Spiking Marching Scheme for Efficient Long Time Integration of Differential Equations

no code implementations17 Nov 2022 Qian Zhang, Adar Kahana, George Em Karniadakis, Panos Stinis

We propose a Spiking Neural Network (SNN)-based explicit numerical scheme for long time integration of time-dependent Ordinary and Partial Differential Equations (ODEs, PDEs).

How important are activation functions in regression and classification? A survey, performance comparison, and future directions

no code implementations6 Sep 2022 Ameya D. Jagtap, George Em Karniadakis

For this purpose, we also discuss various requirements for activation functions that have been used in the physics-informed machine learning framework.

Physics-informed machine learning regression

A Hybrid Iterative Numerical Transferable Solver (HINTS) for PDEs Based on Deep Operator Network and Relaxation Methods

no code implementations28 Aug 2022 Enrui Zhang, Adar Kahana, Eli Turkel, Rishikesh Ranade, Jay Pathak, George Em Karniadakis

Based on recent advances in scientific deep learning for operator regression, we propose HINTS, a hybrid, iterative, numerical, and transferable solver for differential equations.

NeuralUQ: A comprehensive library for uncertainty quantification in neural differential equations and operators

1 code implementation25 Aug 2022 Zongren Zou, Xuhui Meng, Apostolos F Psaros, George Em Karniadakis

In this paper, we present an open-source Python library (https://github.com/Crunch-UQ4MI), termed NeuralUQ and accompanied by an educational tutorial, for employing UQ methods for SciML in a convenient and structured manner.

Uncertainty Quantification

G2Φnet: Relating Genotype and Biomechanical Phenotype of Tissues with Deep Learning

no code implementations21 Aug 2022 Enrui Zhang, Bart Spronck, Jay D. Humphrey, George Em Karniadakis

Many genetic mutations adversely affect the structure and function of load-bearing soft tissues, with clinical sequelae often responsible for disability or death.

Physics-Informed Deep Neural Operator Networks

no code implementations8 Jul 2022 Somdatta Goswami, Aniruddha Bora, Yue Yu, George Em Karniadakis

Standard neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box, e.g., a system-of-systems.

Uncertainty Quantification
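The operator approximation described above is commonly realized as a DeepONet-style branch-trunk product, u(y) ≈ Σ_k b_k · t_k(y); a minimal sketch (the feature vectors are hypothetical placeholders for the two subnetworks' outputs):

```python
def deeponet_output(branch_out, trunk_out, bias=0.0):
    """DeepONet prediction: dot product of branch features (encoding the
    input function) and trunk features (encoding the query location)."""
    return sum(bk * tk for bk, tk in zip(branch_out, trunk_out)) + bias

# Hypothetical feature vectors from the two subnetworks:
b = [0.2, -0.5, 1.0]   # branch net applied to sensor values of the input function
t = [1.0, 0.4, 0.1]    # trunk net applied to the query point y
print(deeponet_output(b, t))  # scalar prediction of the output function at y
```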

Spiking Neural Operators for Scientific Machine Learning

no code implementations17 May 2022 Adar Kahana, Qian Zhang, Leonard Gleyzer, George Em Karniadakis

We demonstrate this new approach for classification using the SNN in the branch, achieving results comparable to the literature.

Edge-computing regression

Scalable algorithms for physics-informed neural and graph networks

no code implementations16 May 2022 Khemraj Shukla, Mengjia Xu, Nathaniel Trask, George Em Karniadakis

For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs).

BIG-bench Machine Learning Physics-informed machine learning

Bayesian Physics-Informed Neural Networks for real-world nonlinear dynamical systems

no code implementations12 May 2022 Kevin Linka, Amelie Schafer, Xuhui Meng, Zongren Zou, George Em Karniadakis, Ellen Kuhl

Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both and provides valuable guidelines for model selection.

Bayesian Inference Model Selection +1

Neural operator learning of heterogeneous mechanobiological insults contributing to aortic aneurysms

no code implementations8 May 2022 Somdatta Goswami, David S. Li, Bruno V. Rego, Marcos Latorre, Jay D. Humphrey, George Em Karniadakis

Thoracic aortic aneurysm (TAA) is a localized dilatation of the aorta resulting from compromised wall composition, structure, and function, which can lead to life-threatening dissection or rupture.

Operator learning

Deep transfer operator learning for partial differential equations under conditional shift

1 code implementation20 Apr 2022 Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis

Transfer learning (TL) enables the transfer of knowledge gained in learning to perform one task (source) to a related but different task (target), hence addressing the expense of data acquisition and labeling, potential computational power limitations, and dataset distribution mismatches.

Domain Adaptation Operator learning +4

Learning two-phase microstructure evolution using neural operators and autoencoder architectures

no code implementations11 Apr 2022 Vivek Oommen, Khemraj Shukla, Somdatta Goswami, Remi Dingreville, George Em Karniadakis

We utilize the convolutional autoencoder to provide a compact representation of the microstructure data in a low-dimensional latent space.

Vocal Bursts Valence Prediction

Discovering and forecasting extreme events via active learning in neural operators

no code implementations5 Apr 2022 Ethan Pickering, Stephen Guth, George Em Karniadakis, Themistoklis P. Sapsis

This model-agnostic framework pairs a BED scheme that actively selects data for quantifying extreme events with an ensemble of DNOs that approximate infinite-dimensional nonlinear operators.

Active Learning Experimental Design +1

On the influence of over-parameterization in manifold based surrogates and deep neural operators

1 code implementation9 Mar 2022 Katiana Kontolati, Somdatta Goswami, Michael D. Shields, George Em Karniadakis

In contrast, an even highly over-parameterized DeepONet leads to better generalization for both smooth and non-smooth dynamics.

Operator learning

Interfacing Finite Elements with Deep Neural Operators for Fast Multiscale Modeling of Mechanics Problems

no code implementations25 Feb 2022 Minglang Yin, Enrui Zhang, Yue Yu, George Em Karniadakis

In this work, we explore the idea of multiscale modeling with machine learning and employ DeepONet, a neural operator, as an efficient surrogate of the expensive solver.

Physics-informed neural networks for inverse problems in supersonic flows

no code implementations23 Feb 2022 Ameya D. Jagtap, Zhiping Mao, Nikolaus Adams, George Em Karniadakis

Accurate solutions to inverse supersonic compressible flow problems are often required for designing specialized aerospace vehicles.

Systems Biology: Identifiability analysis and parameter identification via systems-biology informed neural networks

2 code implementations3 Feb 2022 Mitchell Daneker, Zhen Zhang, George Em Karniadakis, Lu Lu

The dynamics of systems biological processes are usually modeled by a system of ordinary differential equations (ODEs) with many unknown parameters that need to be inferred from noisy and sparse measurements.

Uncertainty Quantification in Scientific Machine Learning: Methods, Metrics, and Comparisons

1 code implementation19 Jan 2022 Apostolos F Psaros, Xuhui Meng, Zongren Zou, Ling Guo, George Em Karniadakis

Neural networks (NNs) are currently changing the computational paradigm on how to combine data with mathematical laws in physics and engineering in a profound way, tackling challenging inverse and ill-posed problems not solvable with traditional methods.

BIG-bench Machine Learning Uncertainty Quantification

SympOCnet: Solving optimal control problems with applications to high-dimensional multi-agent path planning problems

1 code implementation14 Jan 2022 Tingwei Meng, Zhen Zhang, Jérôme Darbon, George Em Karniadakis

Solving high-dimensional optimal control problems in real-time is an important but challenging problem, with applications to multi-agent path planning problems, which have drawn increased attention given the growing popularity of drones in recent years.

Gradient-enhanced physics-informed neural networks for forward and inverse PDE problems

2 code implementations1 Nov 2021 Jeremy Yu, Lu Lu, Xuhui Meng, George Em Karniadakis

We tested gPINNs extensively and demonstrated the effectiveness of gPINNs in both forward and inverse PDE problems.
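A minimal 1D sketch of the gPINN idea, assuming the loss takes the form mean(r²) + w·mean((dr/dx)²), with finite differences standing in for automatic differentiation (the ODE, weights, and collocation points are illustrative):

```python
import math

def gpinn_loss(residual, xs, w_g=0.1, h=1e-4):
    """Gradient-enhanced PINN loss (1D): mean squared residual plus a
    weighted mean squared derivative of the residual."""
    n = len(xs)
    r2 = sum(residual(x) ** 2 for x in xs) / n
    dr2 = sum(((residual(x + h) - residual(x - h)) / (2 * h)) ** 2 for x in xs) / n
    return r2 + w_g * dr2

# ODE u'(x) = u(x); residual r(x) = u'(x) - u(x), with u' by finite differences.
def residual_of(u, h=1e-4):
    return lambda x: (u(x + h) - u(x - h)) / (2 * h) - u(x)

exact = residual_of(math.exp)           # u = e^x solves the ODE
wrong = residual_of(lambda x: x + 1.0)  # does not solve it
xs = [0.1 * i for i in range(11)]
print(gpinn_loss(exact, xs))  # near zero
print(gpinn_loss(wrong, xs))  # clearly positive
```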

DynG2G: An Efficient Stochastic Graph Embedding Method for Temporal Graphs

1 code implementation28 Sep 2021 Mengjia Xu, Apoorva Vikram Singh, George Em Karniadakis

However, recent advances mostly focus on learning node embeddings as deterministic "vectors" for static graphs, disregarding the key temporal dynamics of graphs and the evolving uncertainties associated with node embeddings in the latent space.

Dynamic graph embedding Uncertainty Quantification

When Do Extended Physics-Informed Neural Networks (XPINNs) Improve Generalization?

no code implementations20 Sep 2021 Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi

Specifically, for general multi-layer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem, and a posterior generalization bound via the posterior matrix norms of the networks after optimization.

GFINNs: GENERIC Formalism Informed Neural Networks for Deterministic and Stochastic Dynamical Systems

1 code implementation31 Aug 2021 Zhen Zhang, Yeonjong Shin, George Em Karniadakis

We propose the GENERIC formalism informed neural networks (GFINNs) that obey the symmetric degeneracy conditions of the GENERIC formalism.

Meta-learning PINN loss functions

no code implementations12 Jul 2021 Apostolos F Psaros, Kenji Kawaguchi, George Em Karniadakis

In the computational examples, the meta-learned losses are employed at test time for addressing regression and PDE task distributions.

Meta-Learning

Learning Functional Priors and Posteriors from Data and Physics

no code implementations8 Jun 2021 Xuhui Meng, Liu Yang, Zhiping Mao, Jose del Aguila Ferrandis, George Em Karniadakis

In summary, the proposed method is capable of learning flexible functional priors, and can be extended to big data problems using stochastic HMC or normalizing flows since the latent space is generally characterized as low dimensional.

Meta-Learning regression +1

Physics-informed neural networks (PINNs) for fluid mechanics: A review

no code implementations20 May 2021 Shengze Cai, Zhiping Mao, Zhicheng Wang, Minglang Yin, George Em Karniadakis

Despite the significant progress over the last 50 years in simulating flow problems using numerical discretization of the Navier-Stokes equations (NSE), we still cannot incorporate seamlessly noisy data into existing algorithms, mesh-generation is complex, and we cannot tackle high-dimensional problems governed by parametrized NSE.

Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions

2 code implementations20 May 2021 Ameya D. Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Em Karniadakis

We propose a new type of neural networks, Kronecker neural networks (KNNs), that form a general framework for neural networks with adaptive activation functions.

A Caputo fractional derivative-based algorithm for optimization

no code implementations6 Apr 2021 Yeonjong Shin, Jérôme Darbon, George Em Karniadakis

We propose three versions -- non-adaptive, adaptive terminal and adaptive order.

Measure-conditional Discriminator with Stationary Optimum for GANs and Statistical Distance Surrogates

no code implementations17 Jan 2021 Liu Yang, Tingwei Meng, George Em Karniadakis

We propose a simple but effective modification of the discriminators, namely measure-conditional discriminators, as a plug-and-play module for different GANs.

Transfer Learning

Operator learning for predicting multiscale bubble growth dynamics

no code implementations23 Dec 2020 Chensen Lin, Zhen Li, Lu Lu, Shengze Cai, Martin Maxey, George Em Karniadakis

Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs).

Computational Physics

Multi-fidelity Bayesian Neural Networks: Algorithms and Applications

no code implementations19 Dec 2020 Xuhui Meng, Hessam Babaee, George Em Karniadakis

We propose a new class of Bayesian neural networks (BNNs) that can be trained using noisy data of variable fidelity, and we apply them to learn function approximations as well as to solve inverse problems based on partial differential equations (PDEs).

Active Learning Uncertainty Quantification

Learning Poisson systems and trajectories of autonomous systems via Poisson neural networks

1 code implementation5 Dec 2020 Pengzhan Jin, Zhen Zhang, Ioannis G. Kevrekidis, George Em Karniadakis

We propose the Poisson neural networks (PNNs) to learn Poisson systems and trajectories of autonomous systems from data.
