Machine Learning Course SS 15 U Stuttgart
Please subscribe to this mailing list.
See my general teaching page for previous versions of this lecture.
This lecture introduces modern methods in Machine Learning, including discriminative as well as probabilistic generative models. A preliminary outline of topics is:
- probabilistic modeling and inference
- regression and classification methods (kernel methods, Gaussian Processes, Bayesian kernel logistic regression, relations)
- discriminative learning (logistic regression, Conditional Random Fields)
- feature selection
- boosting and ensemble learning
- representation learning and embedding (kernel PCA and derivatives, deep learning)
- graphical models
- inference in graphical models (MCMC, message passing, variational)
- learning in graphical models
- This is the central website of the lecture. Links to slides, exercise sheets, announcements, etc. will all be posted here.
- See the 01-introduction slides for further information.
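As a first taste of the regression topic listed in the outline above, here is a minimal sketch of ridge regression in closed form (hypothetical data and parameter values, plain numpy; not code from the course materials):

```python
import numpy as np

# Hypothetical data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.standard_normal(50)

# Ridge regression, closed form: beta = (Phi^T Phi + lambda I)^-1 Phi^T y
Phi = np.hstack([np.ones((50, 1)), X])   # feature matrix with a bias column
lam = 1e-3                               # small regularization strength
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(2), Phi.T @ y)
print(beta)                              # close to the true [1.0, 2.0]
```

The regularizer lambda stabilizes the inverse when features are correlated; with lambda near zero this reduces to ordinary least squares.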
- Schedule, slides & exercises
date    topics                                          slides                     exercises (due before the lecture)
16.4.   Introduction                                    01-introduction            NO TUTORIALS in the first week
23.4.   Regression                                      02-regression              e01-intro
30.4.   Classification                                  03-classification          e02-linearRegression
7.5.    Regression & Classification (change of rhythm)
14.5.   HOLIDAY                                                                    e03-classification (due on May 11 & 13)
21.5.   SVMs                                            04-MLbreadth               e04-kernelsAndCRFs (due on May 18 & 20)
29.5.   HOLIDAY
4.6.    HOLIDAY                                                                    e05-SVM (discussed on June 1 & 3)
11.6.   Unsupervised Learning                                                      e06-NN (discussed on June 8 & 10)
18.6.   Clustering                                                                 e07-NN
25.6.   Bootstrap estimates & Boosting                                             e08-PCA
2.7.    Bayesian Learning                               05-probabilities           e09-clustering
9.7.                                                                               e10-weka-scikit
16.7.                                                                              e11-Bayes-GPs
23.7.   SUMMARY                                         15-MachineLearning-script  e12_EM_graphicalModel
- The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani and Jerome Friedman. Springer, Second Edition, 2009. Full online version available (recommended: read the introductory chapter).
- Pattern Recognition and Machine Learning by C. M. Bishop. Springer, 2006 (especially chapter 8, which is fully online).
[email by Stefan Otte:] This is a nice little (26-page) linear algebra and matrix calculus reference. It is used for the ML class at Stanford. Maybe it is interesting for your ML class. link
[email by Stefan Otte:] Feature selection, l1 vs. l2 regularization, and rotational invariance. Paper: link. Comments: link
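The l1-vs-l2 contrast that the paper above discusses can be seen in a small sketch: on hypothetical data with one relevant and one irrelevant feature, l1 (lasso, here via a simple coordinate-descent loop with soft-thresholding) drives the irrelevant weight exactly to zero, while l2 (ridge) only shrinks it. This is an illustrative toy, not the paper's experiments:

```python
import numpy as np

# Hypothetical data: column 0 is relevant, column 1 is irrelevant
rng = np.random.default_rng(1)
n = 200
X = rng.standard_normal((n, 2))
y = 3.0 * X[:, 0] + 0.1 * rng.standard_normal(n)

# l2 (ridge), closed form
lam2 = 1.0
w_l2 = np.linalg.solve(X.T @ X + lam2 * np.eye(2), X.T @ y)

# l1 (lasso) via coordinate descent with soft-thresholding
lam1 = 20.0
w_l1 = np.zeros(2)
for _ in range(100):
    for j in range(2):
        r = y - X @ w_l1 + X[:, j] * w_l1[j]   # residual excluding feature j
        rho = X[:, j] @ r                      # correlation with that residual
        z = X[:, j] @ X[:, j]
        w_l1[j] = np.sign(rho) * max(abs(rho) - lam1, 0.0) / z

print(w_l2)  # irrelevant weight small but typically nonzero
print(w_l1)  # irrelevant weight exactly zero (feature selection)
```

The soft-threshold step is what produces exact zeros, which is why l1 regularization performs feature selection while l2 does not.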