DISCRETE MULTI-KERNEL K-MEANS WITH DIVERSE AND OPTIMAL KERNEL LEARNING
Yihang Lu, Jitao Lu, Rong Wang, Feiping Nie
SPS
Multiple Kernel k-means (MKKM) and its variants integrate a group of base kernels to improve clustering performance, but they still suffer from several drawbacks: 1) linearly combining base kernels to obtain the optimal kernel limits kernel representability and cuts off the negotiation between kernel learning and clustering; 2) ignoring the correlation among kernels leads to kernel redundancy; 3) solving the NP-hard cluster assignment problem with a two-stage strategy causes information loss. In this paper, we propose the Discrete Multi-kernel k-means with Diverse and Optimal Kernel Learning (DMK-DOK) model, which adaptively seeks a better kernel residing in the neighborhood of the base kernels and lets kernel learning and clustering negotiate with each other. Moreover, it implicitly penalizes highly correlated kernels to achieve a kernel fusion with less redundancy and more diversity. Furthermore, it jointly learns the discrete and relaxed labels within the same optimization objective, thereby avoiding information loss. Finally, extensive experiments on real-world datasets demonstrate the superiority of our model.
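For readers unfamiliar with the baseline the abstract criticizes, the following is a minimal sketch of multi-kernel k-means with a fixed linear kernel combination: each base kernel is an RBF kernel with a different width, the combined kernel is their uniform weighted sum, and kernel k-means is run on the result. The toy data, kernel widths, uniform weights, and farthest-point initialization are all illustrative assumptions; this is not the paper's DMK-DOK algorithm, which instead learns the kernel and the discrete labels jointly.

```python
import numpy as np

def kernel_kmeans(K, n_clusters, n_iter=50):
    """Kernel k-means on a precomputed kernel matrix K (n x n)."""
    n = K.shape[0]
    d = np.diag(K)
    # Farthest-point initialization: greedily pick anchor points that are
    # far apart in the implicit feature space, then assign each point to
    # its nearest anchor (distance in feature space: K_ii + K_aa - 2 K_ia).
    anchors = [0]
    for _ in range(n_clusters - 1):
        dists = np.min([d + d[a] - 2 * K[:, a] for a in anchors], axis=0)
        anchors.append(int(dists.argmax()))
    labels = np.argmin([d + d[a] - 2 * K[:, a] for a in anchors], axis=0)
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            nc = mask.sum()
            if nc == 0:
                continue  # empty cluster keeps infinite distance
            # ||phi(x_i) - m_c||^2 = K_ii - 2/nc * sum_j K_ij
            #                        + 1/nc^2 * sum_{j,l} K_jl
            dist[:, c] = (d
                          - 2.0 * K[:, mask].sum(axis=1) / nc
                          + K[np.ix_(mask, mask)].sum() / nc ** 2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

def rbf_kernel(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Toy data: two well-separated Gaussian blobs (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(3.0, 0.3, (20, 2))])

# Two base kernels fused with fixed uniform weights -- the simple linear
# combination whose limited representability the paper argues against.
base_kernels = [rbf_kernel(X, g) for g in (0.5, 2.0)]
w = np.ones(len(base_kernels)) / len(base_kernels)
K = sum(wi * Ki for wi, Ki in zip(w, base_kernels))

labels = kernel_kmeans(K, n_clusters=2)
```

Because the weights `w` are fixed in advance, the kernel cannot adapt to the clustering result; DMK-DOK's point is precisely to couple these two steps and to discourage redundant (highly correlated) base kernels during the fusion.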