Alternating Constrained Minimization based Approximate Message Passing
Christo Kurisummoottil Thomas (Virginia Tech); Dirk Slock (EURECOM, France)
Generalized Approximate Message Passing (GAMP) allows for Bayesian inference in linear models with non-identically independently distributed (n.i.i.d.) priors and n.i.i.d. measurements of the linear mixture outputs. It represents an efficient technique for approximate inference, which becomes accurate when both rows and columns of the measurement matrix can be treated as sets of independent vectors and both dimensions become large.
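For concreteness, the setting can be stated as follows (standard GAMP notation; a sketch in our own wording, not quoted from the paper). With a measurement matrix $\mathbf{A} \in \mathbb{R}^{M \times N}$ and hidden linear mixture $\mathbf{z} = \mathbf{A}\mathbf{x}$, the model is
\[
p(\mathbf{x}) = \prod_{n=1}^{N} p_n(x_n), \qquad
p(\mathbf{y} \mid \mathbf{z}) = \prod_{m=1}^{M} p_m(y_m \mid z_m), \qquad \mathbf{z} = \mathbf{A}\mathbf{x},
\]
where the per-component priors $p_n$ and likelihoods $p_m$ may differ across indices (the n.i.i.d. assumption), and the goal is to approximate the intractable posterior $p(\mathbf{x} \mid \mathbf{y})$.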
It has been shown that the fixed points of GAMP correspond to the extrema of a large system limit of the Bethe Free Energy (LSL-BFE), which represents a meaningful optimization criterion for approximate inference regardless of whether the measurement matrix exhibits these independence properties. However, the convergence of (G)AMP can be notoriously problematic for certain measurement matrices, and the only sure fix so far is damping (by a difficult-to-determine amount).
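Schematically, and following the LSL-BFE literature (notation ours, not the paper's exact expression), the criterion is optimized over beliefs $b_x$ on $\mathbf{x}$ and $b_z$ on $\mathbf{z}$:
\[
J(b_x, b_z) = D\big(b_x \,\|\, p(\mathbf{x})\big) + D\big(b_z \,\|\, p(\mathbf{y} \mid \mathbf{z})\big) + H_G(b_z; \boldsymbol{\tau}_p)
\]
subject to the moment-matching constraints
\[
\mathrm{E}_{b_z}[\mathbf{z}] = \mathbf{A}\, \mathrm{E}_{b_x}[\mathbf{x}], \qquad
\boldsymbol{\tau}_p = (\mathbf{A} \odot \mathbf{A})\, \boldsymbol{\tau}_x,
\]
where $\boldsymbol{\tau}_x$ collects the componentwise variances of $b_x$, $\odot$ denotes the elementwise product, and $H_G$ is a Gaussian entropy term; stationary points of this constrained problem coincide with the GAMP fixed points.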
In this paper, we revisit the GAMP algorithm for a sparse Bayesian learning problem by rigorously applying an alternating constrained minimization strategy to an appropriately reparameterized LSL-BFE with matched variable and constraint partitioning. This guarantees convergence, at least to a local optimum.
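In schematic form (our notation; the paper's precise variable and constraint partitioning is more refined), the strategy alternates exact minimizations of the constrained criterion:
\[
b_x^{(t+1)} = \arg\min_{b_x \in \mathcal{C}_x} J\big(b_x, b_z^{(t)}\big), \qquad
b_z^{(t+1)} = \arg\min_{b_z \in \mathcal{C}_z} J\big(b_x^{(t+1)}, b_z\big),
\]
where $\mathcal{C}_x$ and $\mathcal{C}_z$ encode the (reparameterized) constraints. Since each step can only decrease $J$, the iterates descend monotonically, which is the source of the convergence guarantee, in contrast to plain GAMP updates, which do not monotonically decrease the LSL-BFE.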
We furthermore introduce a natural extension of the BFE to integrate the estimation of hyperparameters via Variational Bayes, leading to Variational AMBGAMP or VAMBGAMP.
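The Variational Bayes ingredient can be sketched with the usual mean-field update (generic VB, not the paper's exact derivation): for hyperparameters $\boldsymbol{\theta}$, e.g. the per-component precisions of a sparse Bayesian learning prior, a factorized posterior $q(\mathbf{x})\, q(\boldsymbol{\theta})$ is updated as
\[
q(\boldsymbol{\theta}) \propto \exp\!\Big( \mathrm{E}_{q(\mathbf{x})}\big[ \ln p(\mathbf{y}, \mathbf{x}, \boldsymbol{\theta}) \big] \Big),
\]
so that the belief updates and the hyperparameter updates jointly descend a single extended free-energy criterion.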