Universal Approximation and the Topological Neural Network

26 May 2023  ·  Michael A. Kouritzin, Daniel Richard ·

A topological neural network (TNN), which takes data from a Tychonoff topological space instead of the usual finite-dimensional space, is introduced. As a consequence, a distributional neural network (DNN) that takes Borel measures as data is also introduced. Combined, these new neural networks facilitate tasks such as recognizing long-range dependence, heavy tails and other properties in stochastic process paths, or acting on belief states produced by particle filtering or hidden Markov model algorithms. The veracity of the TNN and DNN is then established herein by a strong universal approximation theorem for Tychonoff spaces and its corollary for spaces of measures. These theorems show that neural networks can approximate uniformly continuous functions (with respect to the sup metric) associated with a unique uniformity arbitrarily well. We also provide some discussion showing that neural networks on positive finite measures generalize the recent deep learning notion of deep sets.
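To make the deep-sets connection concrete, the following is a minimal sketch (not the paper's construction) of a permutation-invariant network evaluated on a discrete measure μ = Σᵢ wᵢ δ_{xᵢ}: a per-point feature map φ is integrated against μ and the pooled vector is passed through a readout ρ. All names (φ, ρ, the weight shapes) are illustrative assumptions; with uniform weights this reduces to the standard deep-sets sum-decomposition ρ(Σᵢ φ(xᵢ)/n).

```python
import numpy as np

# Illustrative sketch only: a deep-sets style network viewed as acting
# on a finite positive measure rather than on an ordered point list.
# Weight shapes and the maps phi/rho are assumptions, not from the paper.
rng = np.random.default_rng(0)
W_phi = rng.normal(size=(1, 8))   # per-point feature map weights: R -> R^8
b_phi = rng.normal(size=8)
W_rho = rng.normal(size=(8, 1))   # readout weights: R^8 -> R

def phi(x):
    """Per-point feature map with one tanh hidden layer."""
    return np.tanh(x[:, None] @ W_phi + b_phi)

def network_on_measure(points, weights=None):
    """Evaluate f(mu) = rho( integral of phi d(mu) ) for the discrete
    measure with the given atoms and (optionally non-uniform) weights."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    pooled = w @ phi(pts)          # integral of phi with respect to mu
    return float(np.tanh(pooled @ W_rho))
```

Because the output depends only on the measure, it is invariant to reordering the atoms: `network_on_measure([0.3, -1.2, 2.5])` and `network_on_measure([2.5, 0.3, -1.2])` agree, and non-uniform `weights` extend the same evaluation beyond empirical (uniform-weight) measures.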
