A PROXIMAL APPROACH TO IVA-G WITH CONVERGENCE GUARANTEES

Clément Cosserat (CVN); Ben Gabrielson (University of Maryland, Baltimore County); Emilie Chouzenoux (Inria Saclay); Jean-Christophe Pesquet (CentraleSupelec); Tulay Adali (University of Maryland, Baltimore County)

09 Jun 2023

Independent vector analysis (IVA) generalizes independent component analysis (ICA) to multiple datasets and, when used with a multivariate Gaussian model (IVA-G), provides a powerful tool for the joint analysis of multiple datasets in a wide range of applications. While IVA-G enjoys uniqueness guarantees, the current solution to the problem exhibits significant variability across runs, necessitating a costly scheme for selecting the most consistent one. In this paper, we present a penalized maximum-likelihood framework for the problem, from which we derive a non-convex cost function that depends on the precision matrices of the source component vectors, the main mechanism by which IVA-G leverages correlation across the datasets. By adding a quadratic regularization, a block-coordinate proximal algorithm is shown to offer a suitable solution to this minimization problem. The proposed method also provides convergence guarantees that are lacking in other state-of-the-art approaches to the problem. This allows us to obtain better overall performance; in particular, we show that our method yields better estimation than the current IVA-G algorithm for various numbers of sources, datasets, and degrees of correlation across the data.
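To illustrate the ingredients the abstract combines (a penalized Gaussian maximum-likelihood cost over precision matrices, a quadratic regularizer, and a proximal update), the following is a minimal sketch on a much simpler problem than IVA-G: proximal gradient minimization of a single-precision-matrix objective `-log det(C) + tr(S C) + (mu/2)||C||_F^2`. This is *not* the authors' algorithm; the function names, step size, and eigenvalue floor are illustrative assumptions.

```python
import numpy as np

def objective(C, S, mu):
    """Penalized Gaussian negative log-likelihood in the precision matrix C."""
    _, logdet = np.linalg.slogdet(C)
    return -logdet + np.trace(S @ C) + 0.5 * mu * np.linalg.norm(C, "fro") ** 2

def prox_grad_precision(S, mu=0.1, step=0.05, iters=200, eps=1e-6):
    """Toy proximal-gradient loop (assumed settings, not the paper's method).

    Each iteration: gradient step on the smooth part, closed-form prox of the
    quadratic penalty, then an eigenvalue floor to keep C positive definite.
    """
    d = S.shape[0]
    C = np.eye(d)
    for _ in range(iters):
        grad = -np.linalg.inv(C) + S        # gradient of -log det(C) + tr(S C)
        V = C - step * grad                 # gradient step
        V = V / (1.0 + step * mu)           # prox of (mu/2)||C||_F^2 (closed form)
        V = 0.5 * (V + V.T)                 # symmetrize against round-off
        w, U = np.linalg.eigh(V)
        C = (U * np.clip(w, eps, None)) @ U.T  # project eigenvalues to stay PD
    return C
```

In IVA-G the cost is jointly non-convex in several blocks of variables, which is why the paper resorts to a block-coordinate scheme with one proximal update per block; the per-block mechanics, however, follow the same gradient-step-plus-prox pattern sketched above.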
