- 2016-06-20
- The re-exam on June 28 is cancelled.
- 2016-06-02
- Solutions for Exercise sheet 8 are online. Exercise sheet 8 has been updated.
- 2016-05-29
- The notes about linear programming and related things are online.
- 2016-05-26
- Solutions for Exercise sheet 7 are online.
- 2016-05-25
- The lecture notes have been updated (correction of a few typos, some more facts about Gaussian distributions).
- 2016-05-19
- Homework 4 has been updated. In problem 1c, the definition of P* has been (greatly) simplified.
- 2016-05-16
- Homework 4 has been updated. A typo was fixed in problem 1a), and in problem 2, the notation *T*(*v*) was changed to *S*(*v*).
- 2016-05-11
- Homework 4 is online.
- 2016-05-09
- The missing proof from "clustering under approximation stability" has been added.
- 2016-05-04
- The lecture notes have been updated. Except for the proof of how to get rid of the assumption that we know *w*_{avg}, clustering under approximation stability should now be complete. The missing proof will be added next week. Please note the last sentence of Lemma 8.6(ii), which is important but which I did not mention yesterday.
- 2016-04-25
- Some more material is online. In the folder "simplex", you find three papers. Most relevant are pages 175-176 of Borgwardt's paper (phase 1 algorithm), pages 18-26 of Vershynin's (bound for the number of edges in the shadow using the Distance Lemma), and pages 45-51 of Spielman and Teng's paper (Distance Lemma). There are also two lecture notes: convexoptim-notes.pdf (Chapter 1 up through Section 1.5 contains proofs and notation of the basics in convex analysis, including the proof of the separation theorem and the convex analytic definition of a face) and polyhedral-notes.pdf (Chapters 1-4 cover most of the background material on polyhedral theory and linear programming).
- 2016-04-21
- Homework 3 has been updated. The definition of a face has been corrected to include the requirement that a face be convex.
- 2016-04-14
- Homework 3 has been updated. Typos were fixed in the definition of a face in Exercise 1 and in the statement of Exercise 1 part e. The statement of Exercise 3 part c was also updated to be more precise.
- 2016-04-09
- Homework assignment 3 has been updated. In Exercise 3, Part d has been modified slightly and there is a clarification about Assumption 3.
- 2016-04-08
- Homework assignment 3 is online. It also contains some notes about the lecture. The deadline is April 26 (a one-week extension compared to what we announced previously).
- 2016-04-03
- Some old lecture notes for Lecture 8 are online.
- 2016-03-30
- The exam will be written and will take place on Tuesday, June 7.
- 2016-03-27
- Exercises for week 8 are online.
- 2016-03-22
- Lecture notes are updated and include smoothed analysis of SSP. Exercises for week 7 and partial solutions of the exercises of week 6 are online.
- 2016-03-08
- Exercises for week 6 and solutions for week 5 are online. Three papers (about max-cut, the knapsack problem, and the one-step model) are online. Slides of the lecture today are online.
- 2016-03-07
- The lecture notes have been updated. They now contain sections about max-cut and knapsack.
- 2016-03-07
- Homework assignment 2 contained a small mistake. In exercise 1d, there was a "1 -" missing in the probability. The updated version is online.
- 2016-03-03
- Homework assignment 2 and exercises for week 5 are online. The deadline for the homework assignment is March 22.

- Lecturers: Daniel Dadush, Bodo Manthey.
- The course takes place on Tuesdays, starting on February 9, 2016. The lectures take place in Ruppert D, Utrecht University. All lectures start at 14:00, except for the last lecture (May 24), which starts earlier, at 13:15.
- This course is part of the Dutch Master's Degree Programme in Mathematics (Mastermath) and a master course of the mathematics cluster DIAMANT. It accounts for 8 EC.

- Lecture Notes (will be updated frequently)
- Notes about linear programming and related things.
- Slides of Lectures 1 and 2 (introduction and 2-opt)
- Slides of Lectures 3 and 4 (*k*-means method)
- Slides of Lecture 5 (flip heuristic for max-cut)
- Slides of Lecture 6 (knapsack in polynomial time)
- Slides of Lecture 7 (successive shortest path algorithm for min-cost flows)
- Some old lecture notes for Lecture 8 (smoothed polynomial complexity vs. pseudo-polynomiality - relevant is Section 2.2)
- Lecture notes about convex optimization (Chapter 1 up through Section 1.5 contains proofs and notation of the basics in convex analysis, including the proof of the separation theorem and the convex analytic definition of a face) and polyhedral geometry (Chapters 1-4 cover most of the background material on polyhedral theory and linear programming).

The grade will be determined by the exam (70%) and the graded homework assignments (30%). The written exam takes place on Tuesday, June 7, 14:00-17:00. The re-exam will be either written or oral, depending on the number of participants; this will be announced in due course.

There will be four sets of graded homework assignments. These can be handed in by groups of at most two students. The submission deadlines for the four sets are the beginning of the 3rd, 7th, 11th, and 15th lectures, respectively.

- Homework assignment 1, due February 23, 2016.
- Homework assignment 2, due March 22, 2016.
- Homework assignment 3, due April 26, 2016.
- Homework assignment 4, due May 24, 2016.

There will be weekly exercises, which will not be graded and do not have to be handed in.

- Exercises for week 1. Partial solutions.
- Exercises for week 2. Partial solutions.
- Exercises for week 3. Partial solutions.
- Exercises for week 4. Partial solutions.
- Exercises for week 5. Partial solutions.
- Exercises for week 6. Partial solutions.
- Exercises for week 7. Partial solutions.
- Exercises for week 8. Partial solutions.

The following is a rough schedule of the course. This schedule might be subject to change.

- Lecture 1 (February 9)
- introduction; 2-Opt heuristic for the TSP
- Lecture 2 (February 16)
- 2-Opt heuristic for the TSP
- Lecture 3 (February 23)
- *k*-means method for clustering
- Lecture 4 (March 1)
- *k*-means method for clustering
- Lecture 5 (March 8)
- flip heuristic for Max-Cut
- Lecture 6 (March 15)
- solving the knapsack problem in polynomial time
- Lecture 7 (March 22)
- successive shortest path algorithm for minimum-cost flows
- Lecture 8 (March 29)
- pseudo-polynomiality and smoothed complexity
- Lecture 9 (April 5)
- smoothed analysis of the simplex method (1)
- Lecture 10 (April 12)
- smoothed analysis of the simplex method (2)
- Lecture 11 (April 19)
- smoothed analysis of the simplex method (3)
- Lecture 12 (April 26)
- smoothed analysis of condition numbers
- Lecture 13 (May 3)
- clustering and approximation stability
- Lecture 14 (May 10)
- semi-random coloring
- Lecture 15 (May 17)
- semi-random partitioning
- Lecture 16 (May 24)
- questions, answers, and open problems

For many algorithms, the classical (worst-case) analysis yields only unsatisfactory insights into their performance: the resulting bounds are often far too pessimistic and do not reflect empirical observations. Only in recent years has a paradigm shift towards a more realistic and robust algorithmic theory been initiated.

The focus of this course is on methods and techniques for rigorously analyzing algorithms beyond the classical worst-case analysis. Roughly, it can be divided into two parts:

- Probabilistic input models: algorithms are analyzed on inputs that are to a certain extent random, e.g., average-case analysis, smoothed analysis, and semi-random input models.
- Deterministic input models: refined analysis based on input parameters such as condition numbers or non-degeneracy conditions.
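As a toy illustration of the first model (a sketch of our own, not part of the course materials; all function names and the parameter choices are hypothetical), consider the smoothed input model applied to the 2-Opt heuristic from the first lectures: an adversary fixes the cities of a Euclidean TSP instance, each coordinate is perturbed by independent Gaussian noise with standard deviation sigma, and the quantity of interest (here, the number of improving 2-Opt steps) is studied in expectation over the noise.

```python
import math
import random

def two_opt_steps(points):
    """Run 2-Opt local search on a Euclidean TSP tour over `points`
    and return the number of improving steps until a local optimum."""
    n = len(points)
    tour = list(range(n))

    def d(i, j):  # distance between cities at tour positions i and j
        (x1, y1), (x2, y2) = points[tour[i]], points[tour[j]]
        return math.hypot(x1 - x2, y1 - y2)

    steps = 0
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n if i > 0 else n - 1):
                # Replace edges (i, i+1) and (j, j+1) by (i, j) and (i+1, j+1)
                # if this strictly shortens the tour.
                if d(i, j) + d(i + 1, (j + 1) % n) < d(i, i + 1) + d(j, (j + 1) % n) - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    steps += 1
                    improved = True
    return steps

def smoothed_instance(base, sigma, rng):
    """Perturb each coordinate of a (possibly adversarial) instance
    by independent Gaussian noise with standard deviation sigma."""
    return [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)) for x, y in base]

rng = random.Random(0)
# The adversary's instance; random here only for the demo.
base = [(rng.random(), rng.random()) for _ in range(15)]
# Sample the noise several times and record the step counts.
steps = [two_opt_steps(smoothed_instance(base, 0.05, rng)) for _ in range(5)]
print(steps)
```

Smoothed analysis then bounds the expected number of steps as a function of the instance size and of sigma; small sigma recovers worst-case analysis, while large sigma approaches average-case analysis.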