Search Results for author: Takaharu Yaguchi

Found 8 papers, 2 papers with code

Neural Operators Meet Energy-based Theory: Operator Learning for Hamiltonian and Dissipative PDEs

no code implementations 14 Feb 2024 Yusuke Tanaka, Takaharu Yaguchi, Tomoharu Iwata, Naonori Ueda

Operator learning has received significant attention in recent years, with the aim of learning mappings between function spaces.

Operator learning · Super-Resolution
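
To make the function-space mapping concrete, here is a minimal, generic operator-learning sketch in the DeepONet style; the class name and all hyperparameters are hypothetical, and this is not the architecture proposed in the paper.

```python
import torch
import torch.nn as nn

class DeepONetSketch(nn.Module):
    """Maps an input function (sampled at fixed sensor points) to its
    output-function values at arbitrary query coordinates."""
    def __init__(self, n_sensors: int, width: int = 64):
        super().__init__()
        # Branch net: encodes the input function from its sensor values.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(), nn.Linear(width, width))
        # Trunk net: encodes each query coordinate of the output domain.
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(), nn.Linear(width, width))

    def forward(self, u_sensors, x_query):
        # u_sensors: (batch, n_sensors); x_query: (batch, n_points, 1)
        b = self.branch(u_sensors)               # (batch, width)
        t = self.trunk(x_query)                  # (batch, n_points, width)
        return torch.einsum("bw,bpw->bp", b, t)  # (batch, n_points)
```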

Good Lattice Training: Physics-Informed Neural Networks Accelerated by Number Theory

no code implementations 26 Jul 2023 Takashi Matsubara, Takaharu Yaguchi

However, the solutions to PDEs are inherently infinite-dimensional, and the distance between the output and the solution is defined by an integral over the domain.
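
For context, the loss the snippet alludes to is the generic physics-informed residual, an integral over the domain that must be discretized at collocation points (the formula below is standard PINN background, not an equation quoted from the paper). The title suggests the collocation points are taken from a number-theoretic good lattice, a quasi-Monte Carlo construction whose integration error for smooth integrands decays faster than the O(N^{-1/2}) of random sampling.

```latex
\mathcal{L}(\theta)
  = \int_{\Omega} \bigl|\mathcal{N}[u_\theta](x)\bigr|^{2}\,dx
  \;\approx\; \frac{1}{N}\sum_{i=1}^{N} \bigl|\mathcal{N}[u_\theta](x_i)\bigr|^{2},
\qquad x_i \in \Omega \ \text{(collocation points)}.
```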

FINDE: Neural Differential Equations for Finding and Preserving Invariant Quantities

no code implementations 1 Oct 2022 Takashi Matsubara, Takaharu Yaguchi

However, these models build in the underlying structures explicitly, and in most situations where neural networks are used to learn unknown systems, these structures are also unknown.
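
As background, one standard way to preserve a (possibly learned) quantity V along a flow dx/dt = f(x) is to project the vector field onto the tangent space of the level set of V; this is generic background, not necessarily the construction used in FINDE.

```latex
\dot{x} \;=\; f(x) \;-\; \frac{\nabla V(x)^{\top} f(x)}{\lVert \nabla V(x) \rVert^{2}}\,\nabla V(x)
\quad\Longrightarrow\quad
\frac{d}{dt}\,V(x(t)) = \nabla V(x)^{\top}\dot{x} = 0 .
```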

Neural Symplectic Form: Learning Hamiltonian Equations on General Coordinate Systems

no code implementations NeurIPS 2021 Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi

In this study, we propose a model that learns the symplectic form from data using neural networks, providing a method for learning Hamiltonian equations from data represented in general coordinate systems, not limited to generalized coordinates and momenta.
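
A simplified sketch of the idea follows, assuming the symplectic 2-form is parameterized directly as a skew-symmetric matrix field W(x); the paper itself learns a 1-form and takes its exterior derivative, which guarantees closedness. Class name and sign convention are illustrative.

```python
import torch
import torch.nn as nn

class SymplecticFormSketch(nn.Module):
    """Learns H(x) and skew-symmetric W(x); dynamics solve W(x) dx/dt = grad H(x)."""
    def __init__(self, dim: int, width: int = 64):
        super().__init__()
        assert dim % 2 == 0, "a nondegenerate skew form needs even dimension"
        self.dim = dim
        self.H = nn.Sequential(nn.Linear(dim, width), nn.Tanh(), nn.Linear(width, 1))
        self.A = nn.Sequential(nn.Linear(dim, width), nn.Tanh(),
                               nn.Linear(width, dim * dim))

    def forward(self, x):
        x = x.detach().requires_grad_(True)
        grad_H = torch.autograd.grad(self.H(x).sum(), x, create_graph=True)[0]
        A = self.A(x).reshape(-1, self.dim, self.dim)
        W = A - A.transpose(1, 2)  # skew-symmetric by construction
        # Solve W(x) dx/dt = grad H(x); W is assumed invertible here
        # (the paper's construction handles nondegeneracy more carefully).
        return torch.linalg.solve(W, grad_H.unsqueeze(-1)).squeeze(-1)
```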

KAM Theory Meets Statistical Learning Theory: Hamiltonian Neural Networks with Non-Zero Training Loss

no code implementations 22 Feb 2021 Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi

To apply KAM theory, we provide a generalization error bound for Hamiltonian neural networks by deriving an estimate of the covering number of the gradient of the multi-layer perceptron, which is the key ingredient of the model.

Learning Theory
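
For reference, the model in question is the standard Hamiltonian neural network: an MLP H_θ parameterizes the energy, and its (symplectically rotated) gradient defines the vector field, which is why the covering number of that gradient controls the bound.

```latex
\dot{q} = \frac{\partial H_\theta}{\partial p}(q, p),
\qquad
\dot{p} = -\,\frac{\partial H_\theta}{\partial q}(q, p),
\qquad\text{i.e.}\qquad
\dot{z} = J\,\nabla H_\theta(z),
\quad
J = \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix},
\quad z = (q, p).
```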

Symplectic Adjoint Method for Exact Gradient of Neural ODE with Minimal Memory

1 code implementation NeurIPS 2021 Takashi Matsubara, Yuto Miyatake, Takaharu Yaguchi

The symplectic adjoint method obtains the exact gradient (up to rounding error) with memory proportional to the number of uses plus the network size.

Numerical Integration
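
For background, the continuous adjoint system underlying gradient computation for a neural ODE dz/dt = f(z, θ) is shown below; these are the standard adjoint-sensitivity equations, not formulas quoted from the paper. Per the abstract, the symplectic adjoint method integrates this backward pass with a scheme whose result matches the forward computation exactly up to rounding error, rather than approximately as in the vanilla adjoint method.

```latex
\dot{\lambda}(t) = -\Bigl(\frac{\partial f}{\partial z}\bigl(z(t), \theta\bigr)\Bigr)^{\!\top} \lambda(t),
\qquad
\frac{\partial L}{\partial \theta}
  = \int_{t_1}^{t_0} \lambda(t)^{\top}\,
    \frac{\partial f}{\partial \theta}\bigl(z(t), \theta\bigr)\,dt .
```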

Method for estimating hidden structures determined by unidentifiable state-space models and time-series data based on the Groebner basis

no code implementations 22 Dec 2020 Mizuka Komatsu, Takaharu Yaguchi

As the parameters of unidentifiable models cannot be uniquely determined from the given data, it is difficult to examine the systems described by such models.

Time Series · Time Series Analysis
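
A hypothetical toy illustration (not the paper's example): when two parameters a and b enter a model only through a+b and a*b, the pair (a, b) is unidentifiable, since swapping a and b yields the same data. A Gröbner basis of the defining equations makes this hidden structure explicit.

```python
from sympy import symbols, groebner

a, b, s1, s2 = symbols("a b s1 s2")
# s1, s2 stand for the combinations actually determined by the data.
eqs = [a + b - s1, a * b - s2]
# Lex order with b first eliminates b: the basis contains
# a**2 - a*s1 + s2, so a is determined only up to the swap a <-> b.
G = groebner(eqs, b, a, s1, s2, order="lex")
print(G.exprs)  # includes a**2 - a*s1 + s2
```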

Deep Energy-Based Modeling of Discrete-Time Physics

1 code implementation NeurIPS 2020 Takashi Matsubara, Ai Ishikawa, Takaharu Yaguchi

Physical phenomena in the real world are often described by energy-based modeling theories, such as Hamiltonian mechanics or the Landau theory, which yield various physical laws.
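
As background on the discrete-time side, energy-based integrators of this kind typically rest on the discrete-gradient condition, under which an exact discrete analogue of the chain rule holds and energy conservation carries over to finite step sizes; the identities below are standard background, not formulas quoted from the paper.

```latex
\overline{\nabla} E(x, y)\cdot(y - x) = E(y) - E(x),
\qquad
\frac{x_{n+1} - x_n}{\Delta t} = S\,\overline{\nabla} E(x_n, x_{n+1})
\;\Rightarrow\;
E(x_{n+1}) = E(x_n)\ \text{for skew-symmetric}\ S .
```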
