Optimization Algorithms Course WS 20/21 TU Berlin

Optimization is one of the most fundamental tools of modern sciences. Many phenomena -- be it in computer science, artificial intelligence, logistics, physics, finance, or even psychology and neuroscience -- are typically described in terms of optimality principles. It is often easier to describe or design an optimality principle or a cost function rather than the system itself. However, if systems are described in terms of optimality principles, the computational problem of optimization becomes central to all these sciences.

This lecture aims to give an overview of and introduction to various approaches to optimization, together with practical experience in the exercises. The focus is on continuous optimization problems, and the lecture is structured in three parts:

  • Part 1: Downhill algorithms for unconstrained optimization:
    • gradient descent, backtracking line search, Wolfe conditions, convergence properties
    • covariant gradients, Newton, quasi-Newton methods, (L)BFGS
  • Part 2: Algorithms for constrained optimization:
    • KKT conditions, Lagrangian
    • Log-barrier, Augmented Lagrangian, primal-dual Newton
    • SQP
  • Part 3: Structured Problems, Libraries, Applications, Non-Convexity:
    • Applications in AI, Robotics
    • Factorization, structure & sparsity
    • Libraries
    • Non-convexity
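To give a flavor of Part 1, here is a minimal sketch of gradient descent with a backtracking line search enforcing the Armijo sufficient-decrease condition. This is illustrative code written for this overview, not the course's reference implementation; function names and default parameters are my own choices.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                     tol=1e-8, max_iters=1000):
    """Gradient descent with backtracking line search (Armijo condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Backtrack until the Armijo condition holds:
        #   f(x - alpha*g) <= f(x) - c * alpha * ||g||^2
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= rho
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = (x0 - 1)^2 + 2*x1^2
f = lambda x: (x[0] - 1)**2 + 2 * x[1]**2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * x[1]])
x_star = gradient_descent(f, grad, np.array([5.0, 3.0]))
```

For this quadratic, the iterates converge to the minimizer (1, 0); the exercises explore how the choice of `rho` and `c` affects the number of backtracking steps.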
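For Part 2, the log-barrier idea can be sketched on a one-dimensional toy problem: minimize (x - 2)^2 subject to x <= 1, whose optimum x* = 1 has the constraint active. The sketch below (again my own illustrative code, not course material) replaces the constraint with a barrier term -mu*log(1 - x), minimizes the barrier objective with damped Newton steps, and shrinks mu between outer iterations.

```python
def log_barrier_1d(x0=0.0, mu=1.0, mu_factor=0.1,
                   outer_iters=8, inner_iters=50):
    """Log-barrier sketch for: minimize (x-2)^2 subject to x <= 1.

    Real solvers handle many constraints and follow the central path;
    this only illustrates the outer/inner loop structure."""
    x = x0  # must start strictly feasible: x < 1
    for _ in range(outer_iters):
        for _ in range(inner_iters):
            # Barrier objective: B(x) = (x-2)^2 - mu * log(1 - x)
            g = 2 * (x - 2) + mu / (1 - x)      # B'(x)
            h = 2 + mu / (1 - x)**2             # B''(x) > 0
            step = g / h
            # Damp the Newton step so the iterate stays strictly feasible
            while x - step >= 1:
                step *= 0.5
            x -= step
            if abs(g) < 1e-10:
                break
        mu *= mu_factor  # tighten the barrier
    return x

x_star = log_barrier_1d()
```

As mu decreases, the barrier minimizer approaches the constrained optimum from the interior (here x* is roughly 1 - mu/2 for small mu).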
Students should bring solid knowledge of linear algebra and analysis, as well as good coding skills.
Schedule, slides & exercises
Date     Topics                      Slides               Exercises (due on 'date'+1)
Nov 3    Introduction & Orga         01-introduction
Nov 10   Unconstrained Optimization  02-unconstrainedOpt
Nov 17   Unconstrained Optimization                       e01-gradientDescent
Nov 24   Constrained Optimization    03-constrainedOpt    e02-unconstrainedOpt
Dec 1                                                     e03-newtonMethods