Gaussian Process Meta Few-shot Classifier Learning via Linear Discriminant Laplace Approximation

9 Nov 2021 · Minyoung Kim, Timothy Hospedales

Meta-learning for few-shot classification is an emerging problem in machine learning that has received enormous attention recently; the goal is to learn a model that can quickly adapt to a new task with only a few labeled examples. We consider a Bayesian Gaussian process (GP) approach in which the GP prior is meta-learned, and adaptation to a new task is carried out by the GP predictive model obtained from posterior inference. We adopt the Laplace posterior approximation, but to circumvent the iterative gradient steps required to find the MAP solution, we introduce a novel linear discriminant analysis (LDA) plugin as a surrogate for the MAP solution. In essence, the MAP solution is approximated by the LDA estimate; to take the GP prior into account, we apply a prior-norm adjustment when estimating LDA's shared variance parameters, which ensures that the adjusted estimate is consistent with the GP prior. This yields closed-form, differentiable GP posteriors and predictive distributions, enabling fast meta-training. We demonstrate considerable improvement over previous approaches.
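To make the idea concrete, here is a minimal, heavily hedged NumPy sketch of the general flavor of the approach: per-class means and a shared variance are computed LDA-style from a few-shot support set, the estimate is rescaled to be consistent with an assumed zero-mean isotropic GP prior, and closed-form Gaussian class scores are produced for query points. The helper names (`lda_estimates`, `prior_norm_adjust`, `laplace_predictive`), the isotropic prior, and the specific form of the adjustment are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- NOT the paper's released code.
import numpy as np

def lda_estimates(feats, labels, n_classes):
    """Per-class means and a shared (isotropic) variance, as in classical LDA."""
    means = np.stack([feats[labels == c].mean(axis=0) for c in range(n_classes)])
    resid = feats - means[labels]
    shared_var = (resid ** 2).mean()
    return means, shared_var

def prior_norm_adjust(means, shared_var, prior_var):
    """Rescale the plug-in estimate so its norm is consistent with a
    zero-mean GP prior N(0, prior_var * I) (simplifying assumption)."""
    expected_sq_norm = prior_var * means.size      # expected squared norm under the prior
    actual_sq_norm = (means ** 2).sum()
    scale = np.sqrt(expected_sq_norm / max(actual_sq_norm, 1e-12))
    return means * min(scale, 1.0), shared_var

def laplace_predictive(query_feats, means, shared_var, prior_var):
    """Closed-form class probabilities for query points: squared distances to
    the adjusted means, tempered by the combined prior + likelihood variance."""
    var = shared_var + prior_var
    d2 = ((query_feats[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    logits = -0.5 * d2 / var
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)

# Tiny 2-way, 3-shot episode with synthetic features (illustration only).
rng = np.random.default_rng(0)
support = rng.normal(size=(6, 16)); support[3:] += 1.5
labels = np.array([0, 0, 0, 1, 1, 1])
query = rng.normal(size=(4, 16)); query[2:] += 1.5

means, var = lda_estimates(support, labels, n_classes=2)
means, var = prior_norm_adjust(means, var, prior_var=1.0)
print(laplace_predictive(query, means, var, prior_var=1.0))
```

Because every step above is a closed-form array operation rather than an iterative MAP search, the whole episode-level adaptation stays differentiable, which is what allows the GP prior (and a feature extractor producing `feats`) to be meta-trained end to end.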

