SPS
IEEE Members: Free
Non-members: Free
Length: 0:30:53
Part 5 of the series of lectures is dedicated to Bayesian learning. Starting from the basics, i.e., the maximum likelihood and maximum a posteriori (MAP) estimators, we move to the EM algorithm and, finally, to variational approximation methods.
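As a minimal sketch (not taken from the lecture itself), the contrast between the ML and MAP estimators can be shown for the simplest case: estimating a Gaussian mean with known variance under a Gaussian prior. The variable names and the specific numbers below are illustrative assumptions.

```python
def ml_and_map_mean(data, mu0, tau2, sigma2):
    """ML and MAP estimates of a Gaussian mean.

    Likelihood: x_i ~ N(mu, sigma2), with sigma2 known.
    Prior:      mu  ~ N(mu0, tau2).
    """
    n = len(data)
    xbar = sum(data) / n  # the ML estimate is just the sample mean
    # The posterior precision is the sum of prior and data precisions;
    # the MAP estimate is the corresponding precision-weighted mean.
    post_prec = 1.0 / tau2 + n / sigma2
    mu_map = (mu0 / tau2 + n * xbar / sigma2) / post_prec
    return xbar, mu_map

mu_ml, mu_map = ml_and_map_mean([2.0, 3.0, 4.0], mu0=0.0, tau2=1.0, sigma2=1.0)
print(mu_ml, mu_map)  # 3.0 2.25 — MAP is pulled toward the prior mean 0
```

Note how the MAP estimate shrinks the sample mean toward the prior; as `n` grows, the data precision dominates and the two estimates coincide, which is the usual bridge from frequentist to Bayesian point estimation.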
Some introductory remarks are given on Bayesian and frequentist methods. Probabilities are viewed not as measures of randomness but as measures of ignorance, and Bayes' theorem is interpreted as an inverse problem solver.
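The "inverse problem" reading of Bayes' theorem can be illustrated with a standard toy calculation (the diagnostic-test numbers below are hypothetical, not from the lecture): the forward model gives P(evidence | hypothesis), and Bayes' theorem inverts it to the quantity we actually want, P(hypothesis | evidence).

```python
def posterior(prior, like_h, like_not_h):
    """Bayes' theorem as an inverse problem solver:
    given the forward model P(E | H) and P(E | not H) plus the prior P(H),
    recover the inverse quantity P(H | E)."""
    evidence = prior * like_h + (1.0 - prior) * like_not_h  # P(E), total probability
    return prior * like_h / evidence

# Illustrative numbers: 1% prevalence, 99% sensitivity, 5% false-positive rate.
p = posterior(prior=0.01, like_h=0.99, like_not_h=0.05)
print(round(p, 3))  # 0.167 — a positive test still leaves the hypothesis unlikely
```

The result highlights the role of the prior: even with a sensitive test, the posterior stays low because the hypothesis was improbable to begin with, i.e., the prior encodes our ignorance before seeing the evidence.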