MS&E 319: Approximation Algorithms

 

Instructor: Amin Saberi

saberi@stanford.edu

(650) 704-7857

Time and Location: TBD

Teaching Assistant: TBD

Class website: http://msande319.stanford.edu

Course Description

 

Discrete optimization problems are everywhere, from classic operations research problems such as scheduling, facility location, and the traveling salesman problem, to modern applications in ride sharing, online advertising, social network analysis, and data mining. Yet most interesting discrete optimization problems are NP-hard, and unless P = NP, there is no efficient algorithm that solves them optimally. This course shows how to design approximation algorithms: efficient algorithms that find provably near-optimal solutions.
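
For a first taste of the kind of guarantee the course studies, here is a minimal Python sketch of the classic factor-2 approximation for vertex cover (the function name and the 5-cycle example are purely illustrative): greedily build a maximal matching and take both endpoints of every matched edge. Since any matching lower-bounds the optimum cover, the output is at most twice optimal.

def vertex_cover_2approx(edges):
    """Factor-2 approximation: take both endpoints of a greedily built
    maximal matching; any matching lower-bounds the optimum cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)  # (u, v) joins the matching; cover both endpoints
            cover.add(v)
    return cover

# Toy example (a 5-cycle, whose optimum cover has size 3): returns a cover of size 4.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]))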

 

The course is organized around central techniques for designing approximation algorithms. Each week a new technique is introduced and illustrated through several applications, covering combinatorial algorithms, linear and semidefinite relaxations, and rounding using randomization, metric embedding, and sampling.
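
As a preview of the LP-based technique, here is a small Python sketch of randomized rounding for set cover, assuming SciPy's linprog is available (the helper name and the toy instance at the end are illustrative, not course code): solve the fractional relaxation, include each set with probability equal to its fractional value for O(log n) rounds, and patch anything still uncovered. The expected cost is O(log n) times the fractional optimum.

import math
import random

from scipy.optimize import linprog  # assumed available; any LP solver would do

def lp_rounding_set_cover(universe, sets, seed=0):
    """Illustrative sketch: LP relaxation plus randomized rounding for set cover."""
    elements = sorted(universe)
    m = len(sets)
    # Coverage constraints in "<=" form: -sum_{j : e in S_j} x_j <= -1 for each e.
    A_ub = [[-1.0 if e in sets[j] else 0.0 for j in range(m)] for e in elements]
    b_ub = [-1.0] * len(elements)
    lp = linprog(c=[1.0] * m, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, 1.0)] * m)
    assert lp.success, "every element must lie in at least one set"
    x = lp.x

    rng = random.Random(seed)
    rounds = math.ceil(2 * math.log(max(2, len(elements))))
    chosen = set()
    for _ in range(rounds):
        chosen.update(j for j in range(m) if rng.random() < x[j])

    # Patch-up step so the output is always a feasible cover.
    covered = set().union(*(sets[j] for j in chosen)) if chosen else set()
    for e in elements:
        if e not in covered:
            for j in range(m):
                if e in sets[j]:
                    chosen.add(j)
                    covered |= sets[j]
                    break
    return chosen

# Made-up instance: universe {1,...,5} and four candidate sets.
print(lp_rounding_set_cover({1, 2, 3, 4, 5}, [{1, 2, 3}, {3, 4}, {4, 5}, {2, 5}]))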

 

The course starts from the basics, assuming only familiarity with elementary algorithms, linear and convex programming, and discrete probability, but it moves at a fast pace to cover recent contributions in this active area of research.

Textbooks (available online)

·      Vijay V. Vazirani, Approximation Algorithms

·      David P. Williamson and David B. Shmoys, The Design of Approximation Algorithms

Course Load and Grading

 

Topics

 

Combinatorial Algorithms

·      Job scheduling, set cover, and vertex cover

·      Traveling salesman problems, k-center

LP-Based Algorithms

·      Linear programming relaxations, randomized rounding, and derandomization

·      Primal-dual schema and its economic interpretations

Semidefinite and Spectral Relaxations

·      Low-distortion metric embedding, tree metrics, and tree-cut packing

·      Sum-of-squares method and applications

·      Graph sparsification and electrical networks

Rounding by Sampling and Maximum Entropy Distributions

·      Traveling salesman problems, assignment problems

Approximate Counting and Sampling for #P-Hard Problems

·      Self-reducibility, Markov chain Monte Carlo methods, and importance sampling

 

 

Tentative Outline

 

Week 1

·      Introduction. Maximum Satisfiability (Chapters 1 and 16 of Vazirani)

·      Scheduling on Unrelated Parallel Machines (Chapter 17 of Vazirani)

 

Week 2

·      Set cover from three different points of view (Chapters 2, 13, and 15 of Vazirani)

·      Facility Location and k-Median (Chapters 24 and 25 of Vazirani)

 

Week 3

·      Semidefinite Programming and Maximum Cut (Chapter 26 of Vazirani)

·      Randomized Rounding of Semidefinite Programs (Chapter 6 of Williamson-Shmoys)

·      3-Coloring Problem (Section 6.5 of Williamson-Shmoys)

 

Week 4

·      Multiway Cut and k-Cut (Chapter 4 of Vazirani)

·      Cuts and Metrics (Sections 8.1-8.3 of Williamson-Shmoys)

·      Metrics, Cut Packings, and ℓ1-embeddability (Chapter 21 of Vazirani, Section 15.1 of Williamson-Shmoys)

 

Week 5

·      Cheeger’s inequality (Luca Trevisan’s lecture notes and Lovász’s excellent survey)

·      Random walks and electrical networks, matrix-tree theorem (Doyle and Snell’s excellent exposition)

 

 

Weeks 6 & 7

·      Sum-of-Squares Method (guest lectures by Tselil Schramm)

 

Other topics include applications to clustering (guest lecture by Moses Charikar), traveling salesman problems (1-2 guest lectures), approximate counting and sampling, and graph sparsification.