
A Machine Learning Lecture Series

The IEEE Signal Processing Society (SPS) is proud to offer this free course bundle "A Machine Learning Lecture Series" by Prof. Sergios Theodoridis, with course material preparation by Konstantinos Koutroumbas.

The goal of this series of lectures is to introduce the newcomer to the “secrets” of the machine learning (ML) discipline. At the dawn of the fourth industrial revolution, machine learning is among the key technologies driving the advances and the fast evolution of this new historical period.

Prof. Sergios Theodoridis

The series of lectures is intended to cover a major part of what is considered basic knowledge in machine learning. The lectures start from the definitions of regression and classification and move from the classics to the most recent advances in the field. The online lectures have been developed to address the needs of those who wish to grasp and understand the basic notions behind the methods and algorithms, and not just the needs of black-box users of ML algorithms.

The series of lectures comprises five parts.

  • Part 1 deals with the basic definitions as well as the fundamentals related to regression and classification.
  • Part 2 deals with the classics of classification, starting with the Bayes classifier rule and ending with classification trees and the "boosting" concept.
  • Part 3 presents the notion of kernels and support vector machines.
  • Part 4 focuses on deep learning, following a historical development, starting from the classical perceptron and moving on to convolutional neural networks, recurrent neural networks, adversarial examples and GANs.
  • Part 5 presents Bayesian learning, latent variables, the expectation-maximization algorithm and the variational approximation concept, with applications to Gaussian mixtures and regression.

Parts 1, 2 and 4 are a must.

Earn Educational Credits for Short Courses

Access all of the Machine Learning courses below and take advantage of earning educational credits and continuing education units by taking a quiz, or visit the Machine Learning Course page directly.

Note: Look for the Professional Development Hours (PDH) and Continuing Education Units (CEU) notations on these courses in our Resource Center to acquire educational credits. Our webinars also offer PDHs.

Biography

Sergios Theodoridis is Professor Emeritus of Signal Processing and Machine Learning in the Department of Informatics and Telecommunications of the National and Kapodistrian University of Athens, Greece. He is also an adjunct Professor at Aalborg University, Denmark. He has served as a Distinguished Professor at Aalborg University, Denmark, and as a visiting Professor at the Shenzhen Research Institute of Big Data (SRIBD), the Chinese University of Hong Kong, Shenzhen, China (2018-2020). His research interests lie in the areas of Machine Learning and Signal Processing.

He is the author of the book “Machine Learning: From the Classics, to Deep Learning, Transformers and Diffusion Models” (Academic Press, 3rd ed., 2024), the co-author of the best-selling book “Pattern Recognition” (Academic Press, 4th ed., 2009), the co-author of the book “Introduction to Pattern Recognition: A MATLAB Approach” (Academic Press, 2010), the co-editor of the book “Efficient Algorithms for Signal Processing and System Identification” (Prentice Hall, 1993), and the co-author of three books in Greek, two of them for the Greek Open University. His books have been translated into Chinese, Korean, Japanese and Greek.

He is the co-author of seven papers that have received Best Paper Awards, including the 2014 IEEE Signal Processing Magazine Best Paper Award and the 2009 IEEE Computational Intelligence Society Transactions on Neural Networks Outstanding Paper Award.

He received an honorary doctoral degree (D.Sc.) from the University of Edinburgh, Scotland, UK, in 2023. He is the recipient of the 2021 Institute of Electrical and Electronics Engineers (IEEE) Signal Processing Society (SPS) Norbert Wiener Award, which is the IEEE SP Society’s highest honor, the 2017 European Association for Signal and Image Processing (EURASIP) Athanasios Papoulis Award, the 2014 IEEE SPS Carl Friedrich Gauss Education Award and the 2014 EURASIP Meritorious Service Award. He has served as a Distinguished Lecturer for the IEEE SP as well as the Circuits and Systems Societies. He was Otto Mønsted Guest Professor at the Technical University of Denmark in 2012, and holder of the Excellence Chair in the Department of Signal Processing and Communications, University Carlos III, Madrid, Spain, in 2011.

He currently serves as Chairman of the IEEE SP Society Awards Board. He has served as Vice President of the IEEE Signal Processing Society, as President of the European Association for Signal Processing (EURASIP), as a member of the Board of Governors of the IEEE Circuits and Systems (CAS) Society, as a member of the Board of Governors (Member-at-Large) of the IEEE SP Society, and as Chair of the Signal Processing Theory and Methods (SPTM) Technical Committee of the IEEE SPS. He has served as Editor-in-Chief of the IEEE Transactions on Signal Processing, the world’s flagship journal in Signal Processing.

He is a Fellow of IET, a Corresponding Fellow of the Royal Society of Edinburgh (RSE), a Fellow of EURASIP and a Life Fellow of IEEE.


Links to Machine Learning Lecture Series on the IEEE SPS Resource Center

Part 1

  1. P0-Machine Learning: An Overview
  2. P1.1-Introduction I
  3. P1.2-Introduction II
  4. P1.3-Least Squares Linear Regression
  5. P1.4-Classification
  6. P1.5-Biased vs. Unbiased Estimators
  7. P1.6-Cramer-Rao Bound
  8. P1.7-Ridge Regression
  9. P1.8-Inverse Problems and Overfitting
  10. P1.9-MSE Optimal Estimator
  11. P1.10-Bias-Variance Tradeoff
  12. P1.11-Maximum Likelihood Method
  13. P1.12-Curse of Dimensionality
  14. P1.13-Cross Validation

Part 2

  1. P2.1-Bayesian Rule
  2. P2.2-Average Risk
  3. P2.3-The Gaussian Case
  4. P2.4-Minimum Distance and Naive Bayes Classifiers
  5. P2.5-k-Nearest Neighbor Rule
  6. P2.6-Logistic Regression
  7. P2.7-Scatter Matrices
  8. P2.8-Fisher's Linear Discriminant
  9. P2.9-Proof of Fisher's Method
  10. P2.10-Classification Trees I
  11. P2.11-Classification Trees II
  12. P2.12-Combining Classifiers
  13. P2.13-AdaBoost

Part 3

  1. P3.1-Nonlinear Models and Cover's Theorem
  2. P3.2-Reproducing Kernel Hilbert Spaces
  3. P3.3-Kernel Trick
  4. P3.4-Examples of Kernels
  5. P3.5-Representer Theorem
  6. P3.6-Kernel Ridge Regression
  7. P3.7-Support Vector Regression
  8. P3.8-SVM-The Linearly Separable Classes Case
  9. P3.9-SVM-The Nonseparable Classes Case
  10. P3.10-SVM-The Hinge Loss Function

Part 4

  1. P4.1a-Perceptron Follow Up
  2. P4.1-Perceptron
  3. P4.2-Feed Forward NNs
  4. P4.3-Training NNs
  5. P4.4-The Backpropagation Algorithm
  6. P4.5-Variants of the Gradient Descent Scheme
  7. P4.6-Batch Normalization
  8. P4.7-Loss Function Selection and the Softmax Activation Function
  9. P4.8-ReLU Activation Function
  10. P4.9-Regularization and Dropout
  11. P4.10-Universal Approximation Property and Deep Networks
  12. P4.11-On Optimization and Generalization Properties of Deep Networks
  13. P4.12-Convolutional NNs I
  14. P4.13-Convolutional NNs II
  15. P4.14-Recurrent Neural Networks
  16. P4.15-Attention Mechanism
  17. P4.16-Adversarial Examples
  18. P4.17-Generative Adversarial Networks
  19. P4.18-Capsule Networks
  20. P4.19-Neural Machine Translation I
  21. P4.20-Neural Machine Translation II
  22. P4.21-Neural Machine Translation III

Part 5

  1. P5.1-Bayesian Learning Introductory Remarks
  2. P5.2-Maximum Likelihood and MAP
  3. P5.3-Bayesian Generalized Linear Regression
  4. P5.4-Bayesian Regression-Simulation Examples
  5. P5.5-The Evidence Function
  6. P5.6-The EM Algorithm
  7. P5.7-The EM and the Linear Regression Task
  8. P5.8-Gaussian Mixture Models
  9. P5.9-Clustering, GMMs and the k-Means Algorithm
  10. P5.10-A Lower Bound Interpretation of EM
  11. P5.11-Exponential Family of Distributions
  12. P5.12-The Variational Approximation Method
  13. P5.13-Variational Inference and Linear Regression
Posted: 23 April 2024