Learning What Not to Model: Gaussian Process Regression with Negative Constraints

1 Jan 2021  ·  Gaurav Shrivastava, Harsh Shrivastava, Abhinav Shrivastava

Gaussian Process (GP) regression fits a curve to a set of datapairs, each consisting of an input point '$\mathbf{x}$' and its corresponding target regression value '$y(\mathbf{x})$' (a positive datapair). But what if, for an input point '$\bar{\mathbf{x}}$', we want to constrain the GP to avoid a target regression value '$\bar{y}(\bar{\mathbf{x}})$' (a negative datapair)? This requirement often arises in real-world navigation tasks, where an agent must avoid obstacles, such as furniture in a room, when planning a trajectory. In this work, we propose to incorporate such negative constraints into a GP regression framework. Our approach, 'GP-NC' or Gaussian Process with Negative Constraints, fits the positive datapairs while avoiding the negative ones. Specifically, our key idea is to model each negative datapair as a small Gaussian blob and maximize its KL divergence from the GP. We jointly optimize GP-NC over both the positive and negative datapairs. We empirically demonstrate that our GP-NC framework outperforms traditional GP learning, does not affect the scalability of GP regression, and helps the model converge faster as the size of the data increases.
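To make the objective concrete, here is a minimal NumPy/SciPy sketch of one way such a loss could be set up: the GP's negative log marginal likelihood over the positive datapairs, minus a weighted sum of KL divergences between small Gaussian blobs placed at the negative datapairs and the GP's predictive distribution at those inputs. This is an illustration under assumptions, not the paper's implementation: the RBF kernel, the blob variance `blob_var`, the trade-off weight `lam`, and the choice to optimize only the kernel hyperparameters are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize


def rbf_kernel(X1, X2, lengthscale, variance):
    """Squared-exponential kernel k(x, x') = variance * exp(-||x - x'||^2 / (2 lengthscale^2))."""
    sqdist = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)


def gp_predict(X, y, Xstar, lengthscale, variance, noise):
    """Standard GP posterior mean and variance at test inputs Xstar."""
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(X, Xstar, lengthscale, variance)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var_star = np.diag(rbf_kernel(Xstar, Xstar, lengthscale, variance)) - np.sum(v ** 2, axis=0) + noise
    return mu, var_star


def kl_gaussians(m1, v1, m2, v2):
    """KL( N(m1, v1) || N(m2, v2) ) for univariate Gaussians, elementwise."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def gp_nc_objective(log_theta, X, y, Xneg, yneg, blob_var=0.01, lam=1.0):
    """Negative log marginal likelihood on the positive pairs, minus the
    weighted KL divergence of the negative blobs from the GP predictive
    distribution, so that minimizing pushes the GP away from the blobs."""
    lengthscale, variance, noise = np.exp(log_theta)  # log-params stay positive
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    nll = 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(X) * np.log(2 * np.pi)
    mu, var_star = gp_predict(X, y, Xneg, lengthscale, variance, noise)
    kl = kl_gaussians(yneg, blob_var, mu, var_star)
    return nll - lam * np.sum(kl)


# Toy 1-D example: fit the positives while steering away from one negative pair.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 0.8, 0.9, 0.1])
Xneg = np.array([[1.5]])
yneg = np.array([0.85])

result = minimize(gp_nc_objective, x0=np.log([1.0, 1.0, 0.1]), args=(X, y, Xneg, yneg))
mu, var_star = gp_predict(X, y, Xneg, *np.exp(result.x))
print("predictive mean at negative input:", mu, "(value to avoid:", yneg, ")")
```

Note that the KL divergence is unbounded above, so in practice the negative term would likely need a small `lam` or some form of capping to keep the joint objective well behaved; the paper's exact treatment may differ.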
