  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00
  • Length: 00:11:09
28 Mar 2022

Artificial Neural Networks (ANNs) are widely used in supervised Machine Learning (ML) to analyse signals or images in many applications. One of the main challenges is to optimize the network weights from a learning database. This optimization step is generally performed with a gradient-based approach and a back-propagation strategy, and regularization is commonly added for the sake of efficiency. When non-smooth regularizers are used, especially to promote sparse networks, this optimization becomes challenging: classical gradient-based optimizers cannot be applied due to differentiability issues. In this paper, we propose an MCMC-based optimization scheme formulated in a Bayesian framework. Hamiltonian dynamics are used to design an efficient sampling scheme. Promising results show the usefulness of the proposed method in allowing ANNs with low complexity to reach high accuracy rates.
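The abstract's ingredients — a Bayesian posterior over network weights, a non-smooth sparsity-promoting prior, and Hamiltonian dynamics for sampling — can be illustrated with a minimal sketch. This is not the authors' actual scheme: the toy data, the 2-5-1 tanh network, the Laplace (L1) prior strength, and the finite-difference subgradient used to handle the non-smooth term at its kinks are all hypothetical choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression data (the paper's signal/image data is not given)
X = rng.normal(size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

D = 2 * 5 + 5 + 5 + 1  # flattened weights of a tiny 2-5-1 tanh network

def forward(w, X):
    W1 = w[:10].reshape(2, 5)
    b1 = w[10:15]
    W2 = w[15:20]
    b2 = w[20]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

lam = 1.0  # strength of the sparsity-promoting Laplace (L1) prior (assumed)

def neg_log_post(w):
    # Gaussian likelihood + non-smooth L1 prior: the target is non-differentiable
    r = forward(w, X) - y
    return 0.5 * np.sum(r ** 2) + lam * np.sum(np.abs(w))

def grad(w, eps=1e-5):
    # Central finite differences; acts as a subgradient at the kinks of |w|
    g = np.empty_like(w)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (neg_log_post(wp) - neg_log_post(wm)) / (2 * eps)
    return g

def hmc_step(w, step=0.02, n_leap=15):
    # One Hamiltonian Monte Carlo transition: leapfrog + Metropolis correction
    p = rng.normal(size=w.size)
    H0 = neg_log_post(w) + 0.5 * p @ p
    w_new, p_new = w.copy(), p.copy()
    p_new -= 0.5 * step * grad(w_new)       # half momentum step
    for i in range(n_leap):
        w_new += step * p_new               # full position step
        if i < n_leap - 1:
            p_new -= step * grad(w_new)     # full momentum step
    p_new -= 0.5 * step * grad(w_new)       # final half momentum step
    H1 = neg_log_post(w_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < H0 - H1:     # Metropolis accept/reject
        return w_new, True
    return w, False

# Short chain over the weight posterior
w = 0.1 * rng.normal(size=D)
accepts = 0
for _ in range(60):
    w, ok = hmc_step(w)
    accepts += ok
```

The Metropolis correction keeps the chain valid even though the leapfrog integrator only sees a subgradient of the non-smooth term; samples with low posterior energy then serve as candidate sparse weight configurations.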
