IEOR 262B: Mathematical Programming II

Instructor: Javad Lavaei
Time: Tuesdays and Thursdays, 11am-12:30pm
Location: 1174 Etcheverry
Instructor's Office Hours: Wednesdays, 10-11am
Grading Policy:

  • 15% homework

  • 40% midterm exam (April 11th)

  • 45% project

Description

This course provides a fundamental understanding of general nonlinear optimization theory, convex optimization, conic optimization, low-rank optimization, numerical algorithms, and distributed computation. Topics covered include:

  • Nonlinear optimization: First- and second-order conditions, Fritz John optimality conditions, Lagrangian, duality, augmented Lagrangian, etc.

  • Convexity: Convex sets, convex functions, convex optimization, conic optimization, convex reformulations, etc.

  • Convexification: Low-rank optimization, conic relaxations, sum-of-squares, etc.

  • Algorithms: Descent algorithms, conjugate gradient methods, gradient projection & conditional gradient methods, block coordinate methods, proximal methods, second-order methods, accelerated methods, decomposition and distributed algorithms, convergence analysis, etc. (a short illustrative sketch follows this list)

  • Applications: Machine learning, data science, etc.
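As a flavor of the simplest first-order method listed above, here is a minimal gradient descent sketch on a strongly convex quadratic. The matrix Q, vector b, step size 1/L, and stopping tolerance are illustrative choices, not course material:

    import numpy as np

    # Minimize f(x) = 0.5 x'Qx - b'x over R^2; grad f(x) = Qx - b.
    Q = np.array([[3.0, 1.0],
                  [1.0, 2.0]])   # symmetric positive definite
    b = np.array([1.0, 1.0])

    # Fixed step size 1/L, where L = largest eigenvalue of Q is the
    # Lipschitz constant of the gradient (guarantees convergence).
    L = np.linalg.eigvalsh(Q).max()

    x = np.zeros(2)
    for k in range(1000):
        g = Q @ x - b
        if np.linalg.norm(g) < 1e-8:   # stop once the gradient is small
            break
        x = x - g / L                  # x_{k+1} = x_k - (1/L) grad f(x_k)

    print(k, x, np.linalg.solve(Q, b))  # iterations, iterate, exact optimum

On a strongly convex quadratic this iteration converges linearly, which is the kind of rate analyzed in the first-order convergence lectures below.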

Textbooks

  • “Nonlinear Programming” by Dimitri P. Bertsekas, Athena Scientific, 3rd Edition.

  • “Linear and Nonlinear Programming” by David Luenberger and Yinyu Ye, Springer, 4th Edition.

  • “Convex Optimization” by Stephen Boyd and Lieven Vandenberghe, Cambridge University Press, 2004 (available online).

  • “Low-Rank Semidefinite Programming: Theory and Applications” by Alex Lemon, Anthony Man-Cho So and Yinyu Ye, Foundations and Trends in Optimization, 2015 (available online).

Lecture notes

  • Lecture 5: Convergence analysis of first-order methods

  • Lecture 6: Convergence analysis of first-order methods

  • Lecture 7: Convergence analysis of first-order methods

  • Lecture 8: Convergence analysis of second-order methods

  • Lecture 9: Quasi-Newton and conjugate gradient methods

  • Lecture 10: Conjugate gradient and coordinate descent methods

  • Lecture 12: Algorithms for constrained optimization

  • Lecture 13: Algorithms for constrained optimization

  • Lecture 14: Algorithms for constrained optimization

  • Lecture 16: Proximal gradient algorithm and optimality conditions

  • Lecture 17: Optimality conditions for constrained optimization

  • Lecture 18: Optimality conditions for constrained optimization

  • Lecture 19: Fritz John conditions and constraint qualifications

  • Lecture 23: Generalized inequalities and convexification

  • Lecture 24: Convexification and low-rank optimization

  • Lecture 25: Rounding technique and sum of squares

Homework

  • Homework 1

  • Homework 2

  • Homework 3

  • Homework 4

  • Homework 5 (due April 12 at 5pm)

  • Homework 6 (due April 29 at 5pm)