Kernel Learning With Tensor Networks
Kriton Konstantinidis, Shengxi Li, Danilo P. Mandic
The expressive power of Gaussian Processes (GPs) is largely attributed to their kernel function, which highlights the crucial role of kernel design. Efforts in this direction include modern Neural Network (NN) based kernel design which, despite its success, suffers from a lack of interpretability and a tendency to overfit. To address these issues, we introduce a Tensor Network (TN) approach to learning kernel embeddings, whereby a TN maps the input to a low-dimensional manifold on which a suitable base kernel function can be applied. The proposed framework allows for joint learning of the TN and base kernel parameters using stochastic variational inference, while leveraging the low-rank regularization and multi-linear nature of TNs to boost model performance and provide enhanced interpretability. Performance evaluation within the regression paradigm against TN and Deep Kernel baselines demonstrates the potential of the framework, providing conclusive evidence for promising future extensions to other learning paradigms.
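To illustrate the idea, the sketch below is a minimal, hypothetical implementation and not the authors' code: a tensor-train (TT) feature map embeds the input into a low-dimensional space, an RBF base kernel is applied on that embedding, and the TT cores, kernel lengthscale, and noise are learned jointly. For brevity it maximizes an exact GP marginal likelihood rather than the stochastic variational objective described in the abstract; all names, ranks, and hyperparameters are illustrative.

import torch

torch.manual_seed(0)


class TTEmbedding(torch.nn.Module):
    """Map a D-dimensional input to an m-dimensional embedding via a tensor train.

    Illustrative sketch only; the paper's exact TN architecture may differ.
    """

    def __init__(self, in_dim, embed_dim, tt_rank=4, local_dim=2):
        super().__init__()
        ranks = [1] + [tt_rank] * (in_dim - 1) + [embed_dim]
        self.cores = torch.nn.ParameterList(
            [torch.nn.Parameter(0.1 * torch.randn(ranks[d], local_dim, ranks[d + 1]))
             for d in range(in_dim)]
        )

    def forward(self, x):
        # Local feature map phi(x_d) = [1, x_d] for each input coordinate,
        # contracted through the TT cores from left to right.
        v = torch.ones(x.shape[0], 1)
        for d, core in enumerate(self.cores):
            phi = torch.stack([torch.ones_like(x[:, d]), x[:, d]], dim=-1)  # (n, 2)
            v = torch.einsum("br,bp,rpq->bq", v, phi, core)
        return v  # (n, embed_dim)


def rbf_kernel(z1, z2, lengthscale):
    # Base kernel applied on the low-dimensional TN embedding.
    d2 = torch.cdist(z1, z2).pow(2)
    return torch.exp(-0.5 * d2 / lengthscale.pow(2))


# Synthetic regression data (illustrative only).
n, D = 200, 6
X = torch.rand(n, D)
y = torch.sin(4 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * torch.randn(n)

tt = TTEmbedding(in_dim=D, embed_dim=3)
log_lengthscale = torch.nn.Parameter(torch.zeros(1))
log_noise = torch.nn.Parameter(torch.log(torch.tensor([0.1])))
opt = torch.optim.Adam(list(tt.parameters()) + [log_lengthscale, log_noise], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    Z = tt(X)
    K = rbf_kernel(Z, Z, log_lengthscale.exp())
    K = K + (log_noise.exp() ** 2 + 1e-4) * torch.eye(n)  # noise plus jitter
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y.unsqueeze(-1), L)
    # Negative GP marginal log likelihood (up to an additive constant),
    # used here in place of the paper's variational objective.
    nll = 0.5 * (y.unsqueeze(-1) * alpha).sum() + torch.log(torch.diag(L)).sum()
    nll.backward()
    opt.step()

Keeping the TT ranks small plays the role of the low-rank regularization mentioned in the abstract, while the base kernel and embedding dimension can be swapped independently of the TN parameterization.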
Chairs:
Shuchin Aeron