no code implementations • 26 May 2023 • Sushma Kumari, Vladimir G. Pestov
Thanks to the results of Cérou and Guyader (2006) and Preiss (1983), this rule is known to be universally consistent in every such metric space that is sigma-finite dimensional in the sense of Nagata.
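For orientation, a minimal Python sketch of the $k$-nearest neighbour rule with uniform distance tie-breaking in an arbitrary metric space might look as follows; the function name, the `metric` callable, and the argument layout are illustrative assumptions, not the paper's own formulation:

```python
import random
from collections import Counter

def knn_predict(train_points, train_labels, metric, query, k, rng=random):
    """Predict a label for `query` by a plurality vote of its k nearest
    training points under `metric`, breaking distance ties uniformly at random
    (a sketch of the uniform tie-breaking assumed in this setting)."""
    # Sort training indices by distance to the query point.
    dists = [(metric(x, query), i) for i, x in enumerate(train_points)]
    dists.sort(key=lambda t: t[0])

    # The k-th smallest distance; every strictly closer point is always included.
    kth_dist = dists[k - 1][0]
    inside = [i for d, i in dists if d < kth_dist]
    tied = [i for d, i in dists if d == kth_dist]

    # Fill the remaining slots by a uniform random choice among the tied points.
    chosen = inside + rng.sample(tied, k - len(inside))

    # Plurality vote among the chosen neighbours.
    votes = Counter(train_labels[i] for i in chosen)
    return votes.most_common(1)[0][0]
```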
no code implementations • 4 May 2020 • Vladimir G. Pestov
We show that the $k$-nearest neighbour learning rule is universally consistent in a metric space $X$ if and only if it is universally consistent in every separable subspace of $X$ and the density of $X$ is less than every real-measurable cardinal.
no code implementations • 28 Feb 2020 • Benoît Collins, Sushma Kumari, Vladimir G. Pestov
The generalization is non-trivial because distance ties are more prevalent in the non-Euclidean setting; along the way, we investigate the relevant geometric properties of the metrics and the limitations of the Stone argument by constructing various examples.
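As a toy illustration of why ties dominate outside the Euclidean setting, the following hypothetical snippet counts how many training points share the $k$-th smallest distance to a query under the Hamming metric on the binary cube, where only $n+1$ distinct distance values exist; the helper names and parameters are assumptions made for this example only:

```python
import random

def hamming(x, y):
    """Hamming distance between two binary tuples of equal length."""
    return sum(a != b for a, b in zip(x, y))

def tie_count(points, query, k):
    """Number of points sharing the k-th smallest distance to `query`."""
    dists = sorted(hamming(p, query) for p in points)
    kth = dists[k - 1]
    return dists.count(kth)

# Example: 1000 random points on the 10-dimensional Hamming cube.
rng = random.Random(0)
n, m, k = 10, 1000, 5
points = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(m)]
query = tuple(rng.randint(0, 1) for _ in range(n))
# Typically far greater than 1: with only n + 1 possible distances,
# many points are tied at the k-th nearest distance.
print(tie_count(points, query, k))
```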
no code implementations • 6 Oct 2019 • Vladimir G. Pestov
The topics include: the geometry of the Hamming cube, concentration of measure, shattering and VC dimension, Glivenko-Cantelli classes, PAC learnability, universal consistency and the k-NN classifier in metric spaces, dimensionality reduction, universal approximation, and sample compression.