21 Sep 2020

Nonparametric regression using Gaussian Process (GP) models is a powerful but computationally demanding method. While various approximation methods have been developed to mitigate its computational complexity, few works have addressed the quality of the resulting approximations of the target posterior. In this paper we start from a general belief updating framework that can generate various approximations. We show that using composite likelihoods within this framework yields computationally scalable approximations for both GP learning and prediction. We then analyze the quality of the approximation in terms of averaged prediction errors as well as Kullback-Leibler (KL) divergences.
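To illustrate why a composite likelihood makes GP inference scalable, the sketch below factorizes the likelihood over disjoint data blocks, fits an exact GP per block (O(m³) per block of size m instead of O(n³) overall), and fuses the block posteriors with precision weighting. This is a minimal, generic sketch of the composite-likelihood idea; the precision-weighted fusion and all function names are illustrative assumptions, not the paper's exact estimator or analysis.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, noise=0.1):
    # Exact GP posterior mean/variance at test points Xs: O(n^3) in train size.
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0) + noise**2
    return mu, var

def composite_gp_predict(X, y, Xs, n_blocks=4, noise=0.1):
    # Composite-likelihood approximation: replace the joint likelihood with a
    # product over disjoint blocks, solve each block exactly, then combine
    # block posteriors by precision weighting (an illustrative fusion rule).
    blocks = np.array_split(np.arange(len(X)), n_blocks)
    prec = np.zeros(len(Xs))
    prec_mu = np.zeros(len(Xs))
    for idx in blocks:
        mu_b, var_b = gp_predict(X[idx], y[idx], Xs, noise)
        prec += 1.0 / var_b
        prec_mu += mu_b / var_b
    return prec_mu / prec, 1.0 / prec

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, (200, 1))
    y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    Xs = np.linspace(-3, 3, 5)[:, None]
    mu_full, _ = gp_predict(X, y, Xs)
    mu_comp, _ = composite_gp_predict(X, y, Xs)
    print(np.c_[mu_full, mu_comp])  # block approximation tracks the exact GP
```

Comparing the two prediction columns gives a rough, empirical sense of the approximation error that the paper studies more formally via averaged prediction errors and KL divergences.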
