no code implementations • 7 Nov 2023 • Jonathan H. Manton
Recursive filters are often treated as linear time-invariant (LTI) systems, but they are not: left uninitialised, they have infinitely many possible outputs for any given input, while once initialised, they are no longer time-invariant.
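The claim above — that a recursive filter's behaviour depends on its initial state, and that a nonzero state breaks time-invariance — can be illustrated with a minimal first-order filter. This is an illustrative sketch, not taken from the paper; the filter coefficient and test signals are arbitrary choices.

```python
import numpy as np

def recursive_filter(x, a=0.5, y0=0.0):
    """First-order recursive filter y[n] = a*y[n-1] + x[n] with initial state y0."""
    y = np.empty(len(x))
    prev = y0
    for n, xn in enumerate(x):
        prev = a * prev + xn
        y[n] = prev
    return y

x = np.array([1.0, 0.0, 0.0, 0.0])

# Different initial states give different outputs for the same input,
# so the uninitialised filter does not define a single input-output map:
print(recursive_filter(x, y0=0.0))  # [1.    0.5   0.25  0.125]
print(recursive_filter(x, y0=1.0))  # [1.5    0.75   0.375  0.1875]

# With a fixed nonzero initial state, shifting the input does not
# simply shift the output, so the initialised system is not time-invariant:
x_shifted = np.array([0.0, 1.0, 0.0, 0.0])
y = recursive_filter(x, y0=1.0)
y_shifted = recursive_filter(x_shifted, y0=1.0)
print(np.allclose(y[:-1], y_shifted[1:]))  # False
```

With `y0=0.0` the same shift test does hold, which is why zero-state recursive filters can be analysed as LTI systems.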
no code implementations • 26 Mar 2023 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
However, such a learning rate is typically considered to be "slow", compared to a "fast rate" of $O(\lambda/n)$ in many learning scenarios.
no code implementations • 12 Jul 2022 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
Specifically, we provide generalization error upper bounds for the empirical risk minimization (ERM) algorithm where data from both distributions are available in the training phase.
no code implementations • 10 May 2022 • Xuetong Wu, Mingming Gong, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
We show that in causal learning, the excess risk depends on the size of the source sample at a rate of O(1/m) only if the labelling distribution between the source and target domains remains unchanged.
no code implementations • 6 May 2022 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
However, such a learning rate is typically considered to be "slow", compared to a "fast rate" of O(1/n) in many learning scenarios.
no code implementations • 3 Sep 2021 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
Transfer learning is a machine learning paradigm where knowledge from one problem is utilized to solve a new but related problem.
no code implementations • 4 May 2021 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
On the one hand, it is conceivable that knowledge from one task could be useful for solving a related problem.
no code implementations • 11 Jan 2021 • Salem Said, Nicolas Le Bihan, Jonathan H. Manton
Hidden Markov chain, or Markov field, models, with observations in a Euclidean space, play a major role across signal and image processing.
Statistics Theory
no code implementations • 4 Oct 2020 • Pavel Tolmachev, Jonathan H. Manton
Hopfield neural networks are a possible basis for modelling associative memory in living organisms.
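The associative-memory behaviour mentioned above can be sketched with a minimal Hopfield network: patterns are stored via a Hebbian weight matrix, and a corrupted pattern is recovered by iterating the network dynamics. This is a generic textbook construction, not the model studied in the paper.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for a list of +/-1 patterns; zero diagonal."""
    P = np.array(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates until a fixed point (or step limit) is reached."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one pattern and recover it from a one-bit-corrupted version.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1], dtype=float)
W = train_hopfield([p])
noisy = p.copy()
noisy[0] *= -1  # flip one bit
print(np.array_equal(recall(W, noisy), p))  # True
```

The stored pattern acts as an attractor of the dynamics, which is the sense in which the network behaves as an associative memory.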
no code implementations • 18 May 2020 • Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
Specifically, we provide generalization error upper bounds for general transfer learning algorithms and extend the results to a specific empirical risk minimization (ERM) algorithm where data from both distributions are available in the training phase.
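As a toy illustration of ERM when samples from both the source and target distributions are available in the training phase: pooling the two samples and minimising the empirical risk over the combined data trades target-sample variance against source-distribution bias. This is a deliberately simplified mean-estimation example under squared loss, not the paper's actual setting or bounds; the distributions and sample sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Source and target are related but not identical distributions.
source = rng.normal(loc=1.0, scale=1.0, size=200)  # m = 200 source samples
target = rng.normal(loc=1.2, scale=1.0, size=20)   # n = 20 target samples

def erm_pooled(src, tgt):
    """ERM over the pooled sample: for squared loss, the minimiser of
    sum_i (x_i - theta)^2 over all pooled points is the pooled mean."""
    data = np.concatenate([src, tgt])
    return data.mean()

theta_pooled = erm_pooled(source, target)
theta_target_only = target.mean()

# The pooled estimate has lower variance (many samples) but is biased
# toward the source mean; the target-only estimate is unbiased but noisier.
print(theta_pooled, theta_target_only)
```

Generalisation bounds for this kind of algorithm quantify exactly this trade-off in terms of the sample sizes and the discrepancy between the two distributions.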