13 May 2022

Generalized canonical correlation analysis (GCCA) aims to learn common low-dimensional representations from multiple "views" of the data (e.g., audio and video of the same event). In the era of big data, GCCA computation encounters many new challenges. In particular, distributed optimization for GCCA, which is well-motivated in applications such as the internet of things and parallel computing, may incur prohibitively high communication costs. To address this challenge, this work proposes a communication-efficient distributed GCCA algorithm under the popular MAX-VAR GCCA paradigm. The proposed algorithm employs a quantization strategy for the information exchanged among the computing agents. It is observed that our design, leveraging the idea of error feedback-based quantization, can reduce communication cost by at least 90% while maintaining essentially the same GCCA performance as the unquantized version. Furthermore, the proposed method is guaranteed to converge to a neighborhood of the optimal solution at a geometric rate, even under aggressive quantization. The effectiveness of our method is demonstrated using both synthetic and real data experiments.
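
To make the error feedback-based quantization idea concrete, the following is a minimal NumPy sketch of a distributed MAX-VAR-style update in which each agent quantizes its message to the server and carries the quantization residual forward to the next round. The function names, the uniform quantizer, and the orthogonal-iteration structure are illustrative assumptions, not the authors' exact algorithm from the paper.

```python
import numpy as np

def quantize(x, num_bits=4):
    """Simple uniform quantizer on a per-matrix scale.
    Generic placeholder; the paper's quantizer may differ."""
    scale = np.max(np.abs(x)) + 1e-12
    levels = 2 ** (num_bits - 1) - 1
    return np.round(x / scale * levels) / levels * scale

def distributed_maxvar_gcca(views, rank, num_iters=50, num_bits=4):
    """Sketch of distributed MAX-VAR GCCA with error-feedback quantization.

    Each agent k holds one view X_k (N x d_k). Per iteration it projects the
    current common factor G onto its view's column space, quantizes the message
    after adding back the previous round's quantization error, and the server
    aggregates the quantized messages and re-orthonormalizes.
    Hypothetical structure for illustration only.
    """
    N = views[0].shape[0]
    rng = np.random.default_rng(0)
    G, _ = np.linalg.qr(rng.standard_normal((N, rank)))   # random orthonormal init
    residuals = [np.zeros((N, rank)) for _ in views]       # error-feedback memory per agent

    # Each agent's projector X_k (X_k^T X_k)^{-1} X_k^T (small problem sizes assumed).
    projectors = [X @ np.linalg.solve(X.T @ X + 1e-8 * np.eye(X.shape[1]), X.T)
                  for X in views]

    for _ in range(num_iters):
        aggregate = np.zeros((N, rank))
        for k, P in enumerate(projectors):
            message = P @ G + residuals[k]        # add back what was lost last round
            q_message = quantize(message, num_bits)
            residuals[k] = message - q_message    # error feedback: store new residual
            aggregate += q_message                # only quantized messages are communicated
        G, _ = np.linalg.qr(aggregate)            # server step: aggregate and orthonormalize
    return G
```

The error-feedback memory is what keeps aggressive quantization from biasing the aggregate: the part of each message that the quantizer discards is re-injected in the next round, which is consistent with the paper's claim of geometric convergence to a neighborhood of the optimal solution.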
