Proximal Multitask Learning Over Distributed Networks With Jointly Sparse Structure
Danqi Jin, Jie Chen, Jingdong Chen, Cédric Richard
Modeling relations between the local optimum parameter vectors in multitask networks has attracted considerable attention in recent years. This work considers a distributed optimization problem for parameter vectors with a jointly sparse structure among nodes, that is, the parameter vectors share the same support set. By introducing an $\ell_{\infty,1}$-norm penalty at each node and using a proximal gradient method to minimize the regularized cost, we devise a proximal multitask diffusion LMS algorithm that promotes joint sparsity to enhance the estimation performance. Analyses are provided to ensure stability. Simulation results are presented to highlight the performance of the proposed algorithm.
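As a rough illustration of the adapt-prox-combine structure described in the abstract, the sketch below implements a diffusion-LMS-style iteration with a row-wise $\ell_{\infty,1}$ proximal step on a toy network. It is not the authors' exact algorithm: the data model, the uniform combination matrix `A`, the step size `mu`, and the regularization weight `lam` are all assumptions made here for illustration, and the $\ell_\infty$ prox is computed via the Moreau decomposition using an $\ell_1$-ball projection.

```python
# Minimal sketch (assumed setup, not the paper's exact algorithm) of a
# proximal diffusion-LMS update with an ell_{inf,1}-norm penalty that
# promotes a shared support across nodes.
import numpy as np

def proj_l1_ball(v, radius):
    """Euclidean projection of v onto the l1 ball of the given radius
    (standard sorting-based algorithm)."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u - (css - radius) / (np.arange(len(u)) + 1) > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, lam):
    """Prox of lam*||.||_inf via Moreau decomposition:
    prox_{lam||.||_inf}(v) = v - Proj_{lam * l1-ball}(v)."""
    return v - proj_l1_ball(v, lam)

def prox_linf1(W, lam):
    """Row-wise prox of lam * sum_i max_k |W[i,k]| (the l_{inf,1} norm),
    which couples the nodes' coefficients and promotes joint sparsity."""
    return np.vstack([prox_linf(W[i], lam) for i in range(W.shape[0])])

# Toy network: N nodes estimating jointly sparse M-dimensional vectors.
rng = np.random.default_rng(0)
N, M, mu, lam, n_iter = 10, 20, 0.01, 0.01, 2000
support = rng.choice(M, size=4, replace=False)        # shared support set
W_true = np.zeros((M, N))
W_true[support] = rng.standard_normal((4, N))         # jointly sparse targets
A = np.full((N, N), 1.0 / N)                          # assumed uniform combination weights
W = np.zeros((M, N))                                  # estimates, one column per node

for _ in range(n_iter):
    X = rng.standard_normal((N, M))                   # regressors, one row per node
    d = np.einsum('km,mk->k', X, W_true) + 0.01 * rng.standard_normal(N)
    # Adapt: stochastic-gradient (LMS) step at each node.
    err = d - np.einsum('km,mk->k', X, W)
    Psi = W + mu * (X * err[:, None]).T
    # Prox: apply the l_{inf,1} proximal operator to promote joint sparsity.
    Psi = prox_linf1(Psi, mu * lam)
    # Combine: diffuse intermediate estimates over the network.
    W = Psi @ A

# Rough support check (threshold chosen heuristically for this toy setup).
print('estimated support:', np.sort(np.nonzero(np.abs(W).max(axis=1) > 0.05)[0]))
print('true support     :', np.sort(support))
```

The row-wise decomposition in `prox_linf1` follows from the separability of the $\ell_{\infty,1}$ norm across coefficient indices: each row collects one coefficient from every node, so thresholding rows (rather than individual entries) is what encourages all nodes to agree on which coefficients are zero.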