31 Jan 2024 • Yahong Yang, Juncai He
Constructing the architecture of a neural network is a challenging pursuit for the machine learning community, and the dilemma of whether to go deeper or wider remains a persistent question.
8 Nov 2023 • Yahong Yang, Yulong Lu
This paper establishes the nearly optimal rate of approximation for deep neural networks (DNNs) when applied to Korobov functions, effectively overcoming the curse of dimensionality.
26 Sep 2023 • Yahong Yang, Qipin Chen, Wenrui Hao
In this paper, we present a novel training approach called the Homotopy Relaxation Training Algorithm (HRTA), which accelerates training relative to traditional methods.
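The abstract does not spell out the homotopy construction, so the following is only a hedged sketch of the general idea: train while annealing a homotopy parameter `s` that interpolates a relaxed activation (here, assumed to be linear at `s = 0`) toward ReLU at `s = 1`. The toy network, target, schedule, and hyperparameters are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def homotopy_act(x, s):
    # Assumed relaxation: linear activation at s = 0, ReLU at s = 1.
    return (1.0 - s) * x + s * np.maximum(x, 0.0)

def homotopy_act_grad(x, s):
    # Derivative of the interpolated activation w.r.t. its input.
    return (1.0 - s) + s * (x > 0.0)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.abs(X)  # toy target: |x| requires a genuine nonlinearity

# One hidden layer with 8 units, trained by plain gradient descent.
W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for step in range(2000):
    s = min(1.0, step / 1000.0)  # anneal the homotopy parameter 0 -> 1
    z = X @ W1 + b1
    h = homotopy_act(z, s)
    pred = h @ W2 + b2
    err = pred - y               # gradient of (1/2N) * sum of squared errors
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * homotopy_act_grad(z, s)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Evaluate with the fully relaxed (ReLU) network.
final_mse = float(np.mean((homotopy_act(X @ W1 + b1, 1.0) @ W2 + b2 - y) ** 2))
```

During the linear phase the network can do no better than the best affine fit of |x| (MSE ≈ 1/12); once the activation is relaxed toward ReLU, gradient descent drops well below that floor.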
28 May 2022 • Yahong Yang, Yang Xiang
In this paper, we construct neural networks to approximate functionals, i.e., maps from infinite-dimensional spaces to finite-dimensional spaces.