28 May 2020

A general class of Bayesian lower bounds is derived for the case in which the underlying loss function is a Bregman divergence. This class extends the Weinstein-Weiss family of bounds for the mean squared error and relies on a variational characterization of the Bayesian risk. The approach allows a version of the Cramér-Rao bound to be derived that is tailored to a given Bregman divergence. The effectiveness of the new bound is evaluated in the Poisson noise setting.
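
For reference, a Bregman divergence is generated by a differentiable, strictly convex function \phi (the notation D_\phi is ours; the abstract fixes no symbols):

D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle,

and the choice \phi(x) = \|x\|^2 recovers the squared error D_\phi(x, y) = \|x - y\|^2, which is why bounds for Bregman losses subsume the mean-squared-error setting of Weinstein-Weiss.

As a rough illustration of the quantity such a bound targets in the Poisson noise setting, the Python sketch below Monte Carlo estimates the minimum Bayes risk under the generalized-KL Bregman loss (generated by \phi(t) = t log t) in a Gamma-Poisson model. It relies on the standard fact that the conditional mean minimizes the Bayes risk for any Bregman divergence; the model, prior parameters, and names are illustrative assumptions, not taken from the presentation.

import numpy as np

rng = np.random.default_rng(0)

# Gamma(alpha, beta) prior on the Poisson rate X; Y | X ~ Poisson(X).
# (Illustrative model choice, not from the presentation.)
alpha, beta = 3.0, 1.0
n = 200_000

x = rng.gamma(alpha, 1.0 / beta, size=n)  # samples from the prior
y = rng.poisson(x)                        # Poisson observations

# The Gamma prior is conjugate to the Poisson likelihood, so the posterior
# is Gamma(alpha + y, beta + 1) and the conditional mean is closed-form.
x_hat = (alpha + y) / (beta + 1.0)

# Bregman divergence generated by phi(t) = t*log(t) (generalized KL).
def bregman_kl(x, x_hat):
    return x * np.log(x / x_hat) - x + x_hat

# The conditional mean E[X | Y] minimizes the Bayes risk for any Bregman
# loss, so these averages approximate the minimum risks that lower bounds
# of the kind described above must sit below.
print("approx. minimum generalized-KL risk:", bregman_kl(x, x_hat).mean())
print("approx. MMSE (phi(t) = t^2 case):", np.mean((x - x_hat) ** 2))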