04 May 2020

We consider the estimation of the Probability Mass Function (PMF) of a discrete random vector from partial observations. Since the PMF takes the form of a multi-way tensor, under certain model assumptions the problem becomes closely related to tensor factorization. It was recently shown that a low-rank PMF tensor can be recovered (under mild conditions) from a low-rank joint factorization of the joint PMFs of all variable subsets of a fixed cardinality larger than two, where the (approximate) joint factorization is based on a Least Squares (LS) fit to the estimated sub-tensors. Here we take a different estimation approach: we fit the partial factorization directly to the observed partial data in the sense of the Kullback-Leibler Divergence (KLD). Consequently, we avoid the need to select and directly estimate sub-tensors of any particular order, since proper weighting of all the available partial data is applied inherently. We show that our approach essentially attains the Maximum Likelihood estimate of the full PMF tensor, and therefore enjoys its consistency and asymptotic efficiency. In addition, based on the Bayesian (latent variable) interpretation of the low-rank model, we propose an Expectation-Maximization (EM) based approach, which is computationally cheap per iteration. Simulation results demonstrate the advantages of our proposed approach.
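To make the EM idea concrete, the sketch below shows one plausible iteration under the latent-variable interpretation of the low-rank PMF model: a hidden state with R values selects one column of each factor matrix, and each sample reveals only a subset of the variables. This is a minimal illustration, not the paper's implementation; the function name, the data layout (missing entries coded as -1), and all defaults are assumptions made here for the example.

    import numpy as np

    def em_lowrank_pmf(X, card, R, n_iter=100, seed=0, eps=1e-12):
        """EM sketch for a rank-R PMF model fitted to partially observed samples.

        X    : (T, N) integer array of samples; unobserved entries marked as -1.
        card : list of length N with the alphabet size of each variable.
        R    : assumed tensor rank (number of latent states).
        Returns mixture weights lam (R,) and factor matrices A[n] of shape (card[n], R).
        """
        rng = np.random.default_rng(seed)
        T, N = X.shape
        lam = np.full(R, 1.0 / R)
        # Each column of A[n] is a conditional PMF of variable n given the latent state.
        A = [rng.dirichlet(np.ones(card[n]), size=R).T for n in range(N)]

        for _ in range(n_iter):
            # E-step: posterior over the latent state for each sample,
            # using only that sample's observed coordinates.
            logq = np.tile(np.log(lam + eps), (T, 1))            # (T, R)
            for n in range(N):
                obs = X[:, n] >= 0
                logq[obs] += np.log(A[n][X[obs, n], :] + eps)
            logq -= logq.max(axis=1, keepdims=True)
            q = np.exp(logq)
            q /= q.sum(axis=1, keepdims=True)                    # (T, R)

            # M-step: posterior-weighted empirical frequencies.
            lam = q.mean(axis=0)
            for n in range(N):
                obs = X[:, n] >= 0
                counts = np.zeros((card[n], R))
                np.add.at(counts, X[obs, n], q[obs])             # accumulate per observed symbol
                A[n] = counts / (counts.sum(axis=0, keepdims=True) + eps)
        return lam, A

Because each sample contributes only through its observed coordinates, all available partial data are used with their natural (likelihood-based) weighting, which is the point the abstract makes about avoiding explicit sub-tensor selection.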
