Aaron Sidford (sidford@stanford.edu)

Welcome

This page has information and lecture notes from the course "Introduction to Optimization Theory" (MS&E213 / CS 269O), which I taught in Fall 2019. I hope you enjoy the content as much as I enjoyed teaching the class; if you have questions or feedback on the notes, feel free to email me.

Course Overview

This class will introduce the theoretical foundations of continuous optimization. Starting from first principles we show how to design and analyze simple iterative methods for efficiently solving broad classes of optimization problems. The focus of the course will be on achieving provable convergence rates for solving large-scale problems.
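As a small taste of the simple iterative methods the course analyzes, here is a minimal gradient-descent sketch. The objective, step size, and iteration count below are illustrative choices for this page, not taken from the course notes:

```python
import numpy as np

def gradient_descent(grad, x0, step_size, iters):
    """Fixed-step gradient descent: x_{k+1} = x_k - eta * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step_size * grad(x)
    return x

# Illustrative problem: minimize f(x) = ||x - b||^2 / 2, with gradient x - b.
# f is 1-smooth, so a step size of 1/2 converges geometrically to b.
b = np.array([1.0, -2.0, 3.0])
x_star = gradient_descent(lambda x: x - b, np.zeros(3), step_size=0.5, iters=50)
```

For this quadratic, each step contracts the error by a factor of 1/2, so after 50 iterations `x_star` agrees with the minimizer `b` to machine precision; bounding such convergence rates for broad problem classes is exactly what the course covers.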

Email: sidford@stanford.edu

Here are the links to the course lecture notes. Feedback is welcome; if anything is unclear or you would like something added, please feel free to contact me. My apologies for missing citations and references; these may be added in future course offerings.

- **Chapter 1: Introduction**: The notes for this chapter are here.
- **Chapter 2: Smoothness**: The notes for this chapter are here.
- **Chapter 3: Convexity**: The notes for this chapter are here.
- **Chapter 4: Acceleration**: The notes for this chapter are here.
- **Chapter 5: Smooth Extensions**: The notes for this chapter are here.
- **Chapter 6: Non-smooth Convex Functions**: The notes for this chapter are here.
- **Chapter 7: Cutting Plane Methods**: The notes for this chapter are here.
- **Chapter 8: Subgradient / Mirror Descent**: The notes for this chapter are here.
- **Chapter 9: Interior Point Methods**: The *optional* notes for this chapter are here.
- **Appendix A: Norms**: The notes for this chapter are here.

Here is the schedule of material for the course.

Week 1

- **Lecture #1 (Tu 9/24)**: Introduction – oracles, efficiency, and why optimization is impossible
- **Lecture #2 (Th 9/26)**: Introduction – why optimization is doable, but expensive (Lipschitz functions)
- **Reading**: Finish Chapter 1 and start Chapter 2

Week 2

- **Lecture #3 (Tu 10/1)**: Smoothness – computing critical points dimension free
- **Lecture #4 (Th 10/3)**: Convexity – computing global optima of strongly convex functions
- **Homework #1**: due at start of class Th 10/3
- **Reading**: Finish Chapter 2 and start Chapter 3

Week 3

- **Lecture #5 (Tu 10/8)**: Convexity – computing global optima of convex functions
- **Lecture #6 (Th 10/10)**: Acceleration – tight rates for smooth convex functions
- **Homework #2**: due at start of class Th 10/10
- **Reading**: Finish Chapter 3 and start Chapter 4

Week 4

- **Lecture #7 (Tu 10/15)**: Acceleration and momentum continued
- **Lecture #8 (Th 10/17)**: Generalizations to arbitrary norms and composite functions
- **Homework #3**: due at start of class Th 10/17
- **Reading**: Finish Chapter 4 and start Chapter 5

Week 5

- **Lecture #9 (Tu 10/22)**: Composite function minimization and coordinate descent
- **Midterm (Th 10/24)**: an in-class midterm held during the usual lecture time
- **Reading**: Finish Chapter 5 and start Chapter 6

Week 6

- **Lecture #10 (Tu 10/29)**: Finish coordinate descent and start unit on convex sets, separating hyperplanes, and subgradients
- **Lecture #11 (Th 10/31)**: Convex sets, separating hyperplanes, and subgradients
- **Reading**: Finish Chapter 6

Week 7

- **Lecture #12 (Tu 11/5)**: Prove the separating hyperplane theorem
- **Lecture #13 (Th 11/7)**: Cutting plane methods
- **Homework #4**: due at start of class Th 11/7
- **Reading**: Finish Chapter 7

Week 8

- **Lecture #14 (Tu 11/12)**: Introduce subgradient optimization, online linear optimization, and learning from experts
- **Lecture #15 (Th 11/14)**: Analyze follow the regularized leader (FTRL) and solve learning from experts
- **Homework #5**: due at start of class Th 11/14
- **Reading**: Start Chapter 8

Week 9

- **Lecture #16 (Tu 11/19)**: Mirror descent and stochastic gradient descent
- **Lecture #17 (Th 11/21)**: Stochastic mirror descent and variance reduction
- **Homework #6**: due at start of class Th 11/21
- **Reading**: Finish Chapter 8

Thanksgiving Break

- **No lecture (Tu 11/26)**
- **No lecture (Th 11/28)**

Week 10

- **Lecture #18 (Tu 12/3)**: Guest lecture on advanced acceleration techniques (End-Quarter Period)
- **Lecture #19 (Th 12/5)**: Course recap and discussion of advanced topics (e.g. interior point methods) (End-Quarter Period)
- **Reading**: Chapter 9 (optional)

Take Home Final

- Released by end of week 10
- Due 6:30PM on Tu 12/10

The material in the lecture notes is based primarily on my own experience with optimization and the following two texts:

- "Introductory Lectures on Convex Programming Volume I: Basic Course" by Yurii Nesterov.
- "Convex Optimization: Algorithms and Complexity" by Sébastien Bubeck.

Additional resources that may be helpful include the following:

- "Convex Optimization" by Stephen Boyd and Lieven Vandenberghe.
- "CSE 599: Interplay between Convex Optimization and Geometry", a course by Yin Tat Lee.