Low-Complexity Low-Rank Approximation SVD for Massive Matrix in Tensor Train Format

Jung-Chun Chi (National Tsing Hua University); Chiao-En Chen (National Chung Hsing University); Yuan-Hao Huang (National Tsing Hua University)

07 Jun 2023

Tensor train decomposition (TTD) has recently been proposed for high-dimensional signals because it can substantially reduce storage in various signal processing applications. This paper presents a low-rank approximation algorithm for the singular value decomposition (SVD) of large-scale matrices in tensor train format (TT-format). The proposed alternating least squares block power SVD (ALS-BPSVD) algorithm reduces computational complexity by decomposing the large-scale SVD into a low-rank approximation scheme that uses a fixed-iteration block power method to search for singular values and vectors. Moreover, a low-complexity two-step truncation scheme is proposed to further reduce complexity and facilitate parallel processing. The proposed ALS-BPSVD algorithm supports low-rank approximation SVD for matrices with dimensions larger than 2^11 x 2^11. Simulation results show that ALS-BPSVD achieves up to a 21.3x speed-up over the benchmark ALS-SVD algorithm on random matrices with prescribed singular values.
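The paper's ALS-BPSVD algorithm (and its TT-format machinery) is not reproduced here; the following is a minimal NumPy sketch of the generic fixed-iteration block power method for rank-k SVD approximation that the abstract builds on. The function name block_power_svd, the QR-based orthonormalization, and the default of 8 iterations are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def block_power_svd(A, k, n_iter=8, seed=0):
        """Approximate the top-k singular triplets of A with a
        fixed-iteration block power method (illustrative sketch)."""
        m, n = A.shape
        rng = np.random.default_rng(seed)
        # Start from a random n-by-k block and orthonormalize it.
        Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
        for _ in range(n_iter):                 # fixed iteration count
            Q, _ = np.linalg.qr(A @ Q)          # basis for range(A Q)
            Q, _ = np.linalg.qr(A.T @ Q)        # basis for range(A^T Q)
        # Project A onto the k-dimensional subspace and take a small SVD.
        B = A @ Q                               # m-by-k
        U, s, Wt = np.linalg.svd(B, full_matrices=False)
        V = Q @ Wt.T
        return U, s, V                          # A ~= U @ diag(s) @ V.T

    # Usage: a random matrix with prescribed singular values, echoing the
    # abstract's experimental setup (sizes here are arbitrary).
    m, n, k = 256, 256, 5
    rng = np.random.default_rng(1)
    U0, _ = np.linalg.qr(rng.standard_normal((m, m)))
    V0, _ = np.linalg.qr(rng.standard_normal((n, n)))
    s0 = np.sort(rng.uniform(0.1, 10.0, size=min(m, n)))[::-1]
    A = U0 @ np.diag(s0) @ V0.T
    U, s, V = block_power_svd(A, k)
    print("estimated:", np.round(s, 4))
    print("true:     ", np.round(s0[:k], 4))

Because each iteration only multiplies a tall m-by-k (or n-by-k) block by A, the cost per sweep is O(mnk) rather than the O(mn·min(m,n)) of a full SVD, which is the kind of saving a fixed small iteration count makes predictable.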
