no code implementations • 17 Nov 2023 • Sergei Manzhos, Manabu Ihara
Kernel methods such as kernel ridge regression and Gaussian process regression with Matérn-type kernels have been increasingly used, in particular, to fit potential energy surfaces (PES) and density functionals, and for materials informatics.
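As a minimal illustration of the setting (not the authors' code), the sketch below fits a toy one-dimensional potential energy curve with Gaussian process regression and a Matérn kernel via scikit-learn; the Morse parameters and kernel hyperparameters are arbitrary placeholders.

```python
# Illustrative sketch: GPR with a Matern kernel fitted to a toy 1D PES.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def morse(r, d_e=4.7, a=1.9, r_e=0.74):
    """Toy Morse potential standing in for ab initio PES data."""
    return d_e * (1.0 - np.exp(-a * (r - r_e))) ** 2

r_train = np.linspace(0.5, 3.0, 25).reshape(-1, 1)
v_train = morse(r_train).ravel()

gpr = GaussianProcessRegressor(kernel=Matern(length_scale=0.5, nu=2.5),
                               normalize_y=True)
gpr.fit(r_train, v_train)

r_test = np.linspace(0.5, 3.0, 200).reshape(-1, 1)
v_pred = gpr.predict(r_test)
print("max abs fit error:", np.max(np.abs(v_pred - morse(r_test).ravel())))
```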
no code implementations • 11 Feb 2023 • Sergei Manzhos, Manabu Ihara
Here, we show that neural network models of orders-of-coupling representations can easily be built by using a recently proposed neural network with optimal neuron activation functions computed with first-order additive Gaussian process regression [arXiv:2301.05567], avoiding non-linear parameter optimization.
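For orientation, the sketch below only spells out the orders-of-coupling (HDMR-type) form itself, f(x) ≈ f0 + Σ fi(xi) + Σ fij(xi, xj), truncated at second order; the component functions are arbitrary placeholders, and the paper's actual construction of them from a neural network with GPR-derived activation functions is not reproduced here.

```python
# Minimal sketch of a second-order orders-of-coupling expansion.
import itertools
import numpy as np

def oc_model(x, f0, f1_terms, f2_terms):
    """Evaluate f0 + sum of one-mode terms + sum of two-mode couplings at x."""
    val = f0
    val += sum(f(x[i]) for i, f in f1_terms)              # one-mode terms
    val += sum(f(x[i], x[j]) for (i, j), f in f2_terms)   # two-mode couplings
    return val

# Placeholder component functions for a 3D example
f1 = [(0, np.sin), (1, np.cos), (2, lambda t: t ** 2)]
f2 = [((i, j), lambda a, b: 0.1 * a * b)
      for i, j in itertools.combinations(range(3), 2)]

x = np.array([0.3, 1.2, -0.5])
print(oc_model(x, f0=0.0, f1_terms=f1, f2_terms=f2))
```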
no code implementations • 13 Jan 2023 • Sergei Manzhos, Manabu Ihara
While even a single-hidden-layer NN is a universal approximator, its expressive power is limited by the use of simple neuron activation functions (such as sigmoid functions) that are typically the same for all neurons.
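The functional form in question is sketched below (random placeholder weights, purely illustrative): a single hidden layer in which every neuron applies the same sigmoid, f(x) = Σ_k c_k σ(w_k·x + b_k) + c0.

```python
# Sketch of a single-hidden-layer NN with identical sigmoid activations.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def shl_nn(x, W, b, c, c0):
    """Single-hidden-layer network: the same sigmoid on every neuron."""
    return c @ sigmoid(W @ x + b) + c0

rng = np.random.default_rng(0)
n_in, n_hidden = 6, 20
W = rng.normal(size=(n_hidden, n_in))
b = rng.normal(size=n_hidden)
c = rng.normal(size=n_hidden)

x = rng.normal(size=n_in)
print(shl_nn(x, W, b, c, c0=0.0))
```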
no code implementations • 21 Nov 2022 • Sergei Manzhos, Manabu Ihara
It is also critical to the formulation of multi-zeta type basis functions widely used in computational chemistry. We show, on the example of fitting molecular potential energy surfaces of increasing dimensionality, that the locality property of a Gaussian-like kernel practically disappears in high dimensionality.
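A quick numerical illustration of this concentration effect (not taken from the paper): as the dimension grows, pairwise distances between random points concentrate, so Gaussian kernel values become nearly uniform across pairs and the kernel no longer distinguishes "near" from "far".

```python
# Gaussian kernel values between random points concentrate as dimension grows.
import numpy as np

rng = np.random.default_rng(1)
n = 400
for d in (1, 3, 10, 30, 100):
    X = rng.uniform(size=(n, d))
    sq_norms = (X ** 2).sum(axis=1)
    sq = np.maximum(sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T, 0.0)
    iu = np.triu_indices(n, k=1)                 # distinct pairs only
    k = np.exp(-sq[iu] / (2 * 0.3 ** 2 * d))     # length scale grown with d
    print(f"d={d:3d}  kernel value spread (std/mean): {k.std() / k.mean():.3f}")
```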
1 code implementation • 24 Nov 2020 • Owen Ren, Mohamed Ali Boussaidi, Dmitry Voytsekhovsky, Manabu Ihara, Sergei Manzhos
We present a Python implementation for RS-HDMR-GPR (Random Sampling High Dimensional Model Representation Gaussian Process Regression).
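The sketch below illustrates the underlying RS-HDMR-GPR idea rather than the package's actual API (which should be consulted directly): the target function is represented as a sum of low-dimensional component functions over subsets of coordinates, each fitted with Gaussian process regression to the residual of the others in a simple backfitting loop.

```python
# Illustrative HDMR-with-GPR backfitting sketch (not the RS-HDMR-GPR package API).
import itertools
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
D, N = 4, 300
X = rng.uniform(size=(N, D))
y = np.sin(3 * X[:, 0]) * X[:, 1] + X[:, 2] ** 2 + 0.5 * X[:, 3]  # toy target

subsets = list(itertools.combinations(range(D), 2))   # all 2D component terms
models = {s: GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6) for s in subsets}
contrib = {s: np.zeros(N) for s in subsets}

for sweep in range(5):                                 # backfitting sweeps
    for s in subsets:
        residual = y - sum(c for t, c in contrib.items() if t != s)
        models[s].fit(X[:, s], residual)
        contrib[s] = models[s].predict(X[:, s])

y_hat = sum(contrib.values())
print("train RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```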