Machine Learning Course SS 15 U Stuttgart


Please subscribe to this mailing list.


See my general teaching page for previous versions of this lecture.


Exploiting large-scale data is a central challenge of our time. Machine Learning is the core discipline for addressing this challenge, aiming to extract useful models and structure from data. The study of Machine Learning is motivated in several ways: 1) as the basis of commercial data mining (Google, Amazon, Picasa, etc.), 2) as a core methodological tool for data analysis in all sciences (vision, linguistics, software engineering, but also biology, physics, neuroscience, etc.), and 3) as a core foundation of autonomous intelligent systems.


This lecture introduces modern methods in Machine Learning, including discriminative as well as probabilistic generative models. A preliminary outline of topics is:

  • motivation
  • probabilistic modeling and inference
  • regression and classification methods (kernel methods, Gaussian Processes, Bayesian kernel logistic regression, and the relations between them); a minimal regression sketch follows after this list
  • discriminative learning (logistic regression, Conditional Random Fields)
  • feature selection
  • boosting and ensemble learning
  • representation learning and embedding (kernel PCA and derivatives, deep learning)
  • graphical models
  • inference in graphical models (MCMC, message passing, variational)
  • learning in graphical models
Students should bring basic knowledge of linear algebra, probability theory and optimization.
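
To make the regression topic concrete, here is a minimal sketch of regularized (ridge) linear regression in Python/NumPy, in the spirit of exercise e02-linearRegression. The column layout of ../data/dataLinReg2D.txt (input columns first, target in the last column) and the regularization strength are assumptions for illustration, not part of the course material.

    import numpy as np

    # Load the exercise data; the layout (two input columns, target y in
    # the last column, whitespace-separated) is an assumption.
    data = np.loadtxt("../data/dataLinReg2D.txt")
    X, y = data[:, :-1], data[:, -1]

    # Linear features with a bias term: Phi = [1, x1, x2]
    Phi = np.hstack([np.ones((X.shape[0], 1)), X])

    # Ridge regression: beta = argmin ||Phi beta - y||^2 + lam ||beta||^2
    lam = 1e-2                 # arbitrary regularization strength
    I = np.eye(Phi.shape[1])
    I[0, 0] = 0.0              # do not penalize the bias term
    beta = np.linalg.solve(Phi.T.dot(Phi) + lam * I, Phi.T.dot(y))

    print("estimated parameters:", beta)

The same code applies to the quadratic datasets after extending Phi with quadratic features.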
Organization
  • This is the central website of the lecture. Links to slides, exercise sheets, announcements, etc. will all be posted here.
  • See the 01-introduction slides for further information.
Schedule, slides & exercises
date  | topics                                | slides                     | exercises (due before the lecture)
16.4. | Introduction                          | 01-introduction            | NO TUTORIALS in the first week
23.4. | Regression                            | 02-regression              | e01-intro
30.4. | Classification                        | 03-classification          | e02-linearRegression (data: ../data/dataLinReg2D.txt, ../data/dataQuadReg2D.txt, ../data/dataQuadReg2D_noisy.txt)
7.5.  | Regression & Classification (cont'd)  |                            | NO TUTORIALS (change of rhythm)
14.5. | HOLIDAY                               |                            | e03-classification (data: ../data/data2Class.txt; due on May 11 & 13)
21.5. | SVMs                                  | 04-MLbreadth               | e04-kernelsAndCRFs (due on May 18 & 20)
28.5. | HOLIDAY                               |                            | HOLIDAY
4.6.  | HOLIDAY                               |                            | e05-SVM (discussed on June 1 & 3)
11.6. | Unsupervised Learning                 |                            | e06-NN (discussed on June 8 & 10)
18.6. | Clustering                            |                            | e07-NN (data: ../data/data2ClassHastie.txt)
25.6. | Bootstrap estimates & Boosting        |                            | e08-PCA
2.7.  | Bayesian Learning                     | 05-probabilities, 06-BayesianRegressionClassification, 07-graphicalModels, 08-graphicalModels-Learning | e09-clustering
9.7.  |                                       |                            | e10-weka-scikit
16.7. |                                       |                            | e11-Bayes-GPs
23.7. | SUMMARY                               | 15-MachineLearning-script  | e12_EM_graphicalModel
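
Since the later exercises use scikit-learn (see e10-weka-scikit), a minimal classification sketch along the lines of e03-classification might look as follows. The column layout of ../data/data2Class.txt (input features first, binary class label in the last column) is an assumption for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Load the exercise data; the layout (input features followed by a
    # binary class label in the last column) is an assumption.
    data = np.loadtxt("../data/data2Class.txt")
    X, y = data[:, :-1], data[:, -1]

    # Fit a plain logistic regression classifier and report training accuracy.
    clf = LogisticRegression()
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))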
Literature
[1] The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani and Jerome Friedman. Springer, Second Edition, 2009.
full online version available
(recommended: read the introductory chapter)

[2] Pattern Recognition and Machine Learning by C. M. Bishop. Springer, 2006.
online
(especially chapter 8, which is fully online)

[email by Stefan Otte:] This is a nice little (26-page) linear algebra and matrix calculus reference. It is used for the ML class at Stanford. Maybe it is interesting for your ML class. link

[email by Stefan Otte:] Feature selection, l1 vs. l2 regularization, and rotational invariance. Paper: link. Comments: link
