Optimization Algorithms Course, WS 20/21, TU Berlin

 Description

Optimization is one of the most fundamental tools of modern
sciences. Many phenomena (be it in computer science, artificial
intelligence, logistics, physics, finance, or even psychology and
neuroscience) are typically described in terms of optimality
principles. It is often easier to describe or design an optimality
principle or a cost function rather than the system itself. However, if
systems are described in terms of optimality principles, the
computational problem of optimization becomes central to all these
sciences.
This lecture aims to give an overview of and introduction to various approaches to optimization, together with practical experience in the exercises. The focus is on continuous optimization problems, and the lecture is structured in three parts:
 Part 1: Downhill algorithms for unconstrained optimization:
 gradient descent, backtracking line search, Wolfe conditions, convergence properties
 covariant gradients, Newton, quasi-Newton methods, (L)BFGS
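As a taste of the Part 1 material, the following is a minimal sketch of gradient descent with backtracking line search, where the step length is shrunk until the Armijo sufficient-decrease condition holds. This is an illustrative toy example, not course code; all names and parameter values are my own choices.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                     tol=1e-6, max_iters=1000):
    """Gradient descent with backtracking line search (Armijo condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # (approximately) stationary point reached
            break
        alpha = alpha0
        # Backtracking: shrink alpha until f decreases "enough"
        # (Armijo condition with sufficient-decrease parameter c)
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= rho
        x = x - alpha * g
    return x

# Toy problem: minimize the convex quadratic f(x) = x^T A x (minimum at 0)
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2.0 * A @ x
x_opt = gradient_descent(f, grad, x0=[2.0, -1.5])
```

On this convex quadratic the iterates converge to the unique minimizer at the origin; the backtracking loop guarantees monotone decrease of f without any hand-tuned step size.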
 Part 2: Algorithms for constrained optimization:
 KKT conditions, Lagrangian
 Log-barrier, Augmented Lagrangian, primal-dual Newton
 SQP
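To illustrate the Part 2 material, here is a log-barrier sketch on a hypothetical one-dimensional toy problem (minimize x^2 subject to x >= 1): the inner loop runs damped Newton steps on the barrier objective, and the outer loop shrinks the barrier parameter mu, following the central path. Names and constants are illustrative, not from the course.

```python
# Log-barrier method on a toy problem (illustrative assumption, not course code):
#   minimize f(x) = x^2   subject to   g(x) = 1 - x <= 0   (optimum x* = 1).
# Barrier objective for barrier parameter mu > 0:
#   B(x) = x^2 - mu * log(x - 1)

def solve_log_barrier(x=2.0, mu=1.0, n_outer=30, n_inner=30):
    for _ in range(n_outer):
        for _ in range(n_inner):
            grad = 2.0 * x - mu / (x - 1.0)    # B'(x)
            hess = 2.0 + mu / (x - 1.0) ** 2   # B''(x) > 0, so Newton is well-defined
            dx = -grad / hess                  # Newton step on the barrier objective
            t = 1.0
            while x + t * dx <= 1.0:           # damp the step to stay strictly feasible
                t *= 0.5
            x += t * dx
        mu *= 0.5                              # shrink mu: follow the central path
    return x

x_opt = solve_log_barrier()
```

As mu decreases, the barrier minimizer (1 + sqrt(1 + 2*mu)) / 2 approaches the constrained optimum x* = 1, so the returned value ends up close to the constraint boundary.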
 Part 3: Structured Problems, Libraries, Applications, Non-Convexity:
 Applications in AI, Robotics
 Factorization, structure & sparsity
 Libraries
 Non-convexity
 Schedule, slides & exercises

date | topics | slides | exercises (due on 'date'+1)
Nov 3 | Introduction & Orga | 01introduction |
Nov 10 | Unconstrained Optimization | 02unconstrainedOpt, 02functions | e00mathsCheck, e00pythonCheck
Nov 17 | Unconstrained Optimization | | e01gradientDescent
Nov 24 | Constrained Optimization | 03constrainedOpt | e02unconstrainedOpt
Dec 1 | | | e03newtonMethods