Convex Analysis and Optimization

(for CS and ML: Convex and Non-Convex Optimization)

Lecturer: Peter Ochs

Winter Term 2021 / 2022
Lecture (4h) and Tutorial (2h)
9 ECTS

Lecture: Tuesday 8-10 c.t. in N03
Lecture: Thursday 10-12 c.t. in N05
In-person teaching plus video recordings.
Date of First Lecture: Tuesday, 19 October 2021

Teaching Assistants: Sheheryar Mehmood and Shida Wang

Tutorials: In-person on Fridays 14-16 c.t. in N14 (C-Bau)
and Online on Wednesdays 14-16 c.t.

Lecture for Mathematics, Computer Science
and Machine Learning
Language: English
Prerequisites: Basics of Mathematics
(e.g. Linear Algebra 1-2, Analysis 1-3, Mathematics 1-3 for Computer Science)
News
27.09.2021: Updated information and Moodle Website is set up.
06.07.2021: Updated information.
01.05.2021: Webpage is online.
Description
R. T. Rockafellar stated in a SIAM review: "The watershed in optimization is between convexity and non-convexity instead of linearity and non-linearity". For many decades linear optimization problems were considered solvable and non-linear problems very difficult to solve. However, the development of fundamental convex analysis tools such as Fenchel duality changed this picture. A broad and deep understanding of the theory led to many efficient algorithms for convex optimization problems, which has also contributed to the advance of applications, for example, in machine learning, computer vision, image processing, and compressed sensing.

The course introduces basic and advanced concepts of convex analysis. After studying the definition of convexity for sets and functions, operations that generate convex sets and functions, and the relations between these notions, slightly more advanced tools such as the Moreau envelope, the subdifferential, and Fermat's rule are developed. These tools are fundamental for the study of convex optimization problems, optimality conditions, and algorithms. The second part of the lecture is devoted to the analysis of first-order convex optimization algorithms that are ubiquitous in data science applications such as image processing, computer vision, machine learning, and statistics. Then, the study of convex duality allows us to introduce widely used primal-dual algorithms.
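As a small taste of the second part of the course, the following is a minimal, illustrative Python/NumPy sketch of proximal gradient descent applied to an l1-regularized least-squares (lasso) problem. The proximal mapping of the l1-norm is the well-known soft-thresholding operator; the problem data, step size choice, and function names below are invented for this illustration and are not taken from the course material:

```python
import numpy as np

# Sketch of proximal gradient descent for the lasso problem
#   min_x  0.5 * ||A x - b||^2 + lam * ||x||_1 .
# The smooth least-squares term is handled by a gradient step,
# the l1 term by its proximal mapping (soft-thresholding).

def soft_threshold(v, t):
    """Proximal mapping of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Step size 1/L, where L = ||A||^2 is the Lipschitz constant
    # of the gradient of the smooth part.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                     # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Synthetic example: recover a sparse vector from exact measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
print(np.round(x_hat, 2))
```

With exact data and a small regularization parameter, the iterates approach a sparse vector close to the ground truth; the lecture analyzes the convergence rate of exactly this type of scheme.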

After taking the course, students know the most relevant concepts of convex analysis and convex optimization. They are able to read and understand related scientific literature. Moreover, they can rate the difficulty of convex optimization problems arising in applications in machine learning or computer vision and select an efficient algorithm accordingly. Finally, they develop basic skills in solving practical problems with Python.
Registration:
Registration open in Moodle.
Exam:
Oral or written exam depending on the number of students.

Qualification requirements:

  • Submit solutions for the weekly exercise sheets.
  • Work in groups of 2 or 3 people.
  • Present your solutions.
  • Gain 60% of the total points.

Documentation and Schedule:
Participants of this lecture may download the course material, including detailed lecture notes (script) and exercise sheets, directly from Moodle. Please note that the provided documentation is not meant to replace attendance of the lectures and tutorials, that it may contain errors, and that it does not exclusively define the contents of the exam.

Schedule

The following table shows the schedule from Winter Term 2019.

Date        No.  Title                                                 Exercise
Tu. 15.10.  L1   Introduction [pdf]
We. 16.10.  L2   Affine Sets, Convex Sets                              Ex01 [pdf]
Tu. 22.10.  L3   Convex Hull, Topological Properties, Convex Cone
We. 23.10.  L4   Projection Theorem, Convex Feasibility Problems       Ex02 [pdf]
Tu. 29.10.  L5   Notions of Variational Analysis
We. 30.10.  L6   Lower Semi-Continuity and Convex Functions            Ex03 [pdf]
Tu. 5.11.   L7   Convexity Preserving Operations and Derivative Tests
We. 6.11.   L8   Strict / Strong Convexity and Infimal Convolution     Ex04 [pdf]
Tu. 12.11.  L9   Continuity of Convex Functions
We. 13.11.  L10  Separation Theorems                                   Ex05 [pdf]
Tu. 19.11.  L11  Convex Duality
We. 20.11.  L12  Convex Duality II                                     Ex06 [pdf]
Tu. 26.11.  L13  The Subdifferential
We. 27.11.  L14  Subdifferential Calculus                              Ex07 [pdf]
Tu. 3.12.   L15  Fermat's Rule
We. 4.12.   L16  Moreau Envelope and Proximal Mapping                  Ex08 [pdf]
Tu. 10.12.  L17  Proximal Gradient Descent
We. 11.12.  L18  Proximal Gradient Descent II                          Ex09 [pdf]
Tu. 17.12.  L19  Proximal Subgradient Descent
We. 18.12.  L20  Conditional Gradient Method                           Ex10 [pdf]
Tu. 7.1.    L21  Lower Complexity Bounds [slides]
We. 8.1.    L22  Accelerated and Inertial Algorithm                    Ex11 [pdf]
Tu. 14.1.   L23  Summary of Algorithms
We. 15.1.   L24  Primal-Dual Hybrid Gradient Algorithm                 Ex12 [pdf]
Tu. 21.1.   ---  cancelled
We. 22.1.   ---  cancelled
Tu. 28.1.   L25  Stochastic Optimization
We. 29.1.   L26  Stochastic Optimization II
Tu. 4.2.    L27  oral exam
We. 5.2.    L28  oral exam
Literature
The lecture is based on the following literature:
  • R. T. Rockafellar: Convex Analysis. Princeton University Press, 1970.
  • Y. Nesterov: Introductory Lectures on Convex Optimization - A Basic Course. Kluwer Academic Publishers, 2004.
  • D. P. Bertsekas, A. Nedić, and A. E. Ozdaglar: Convex Analysis and Optimization. Athena Scientific, 2003.
  • S. Boyd and L. Vandenberghe: Convex Optimization. Cambridge University Press, 2004.
  • H. H. Bauschke and P. L. Combettes: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, 2011.
  • R. T. Rockafellar and R. J.-B. Wets: Variational Analysis. Springer-Verlag Berlin Heidelberg, 1998.