IEOR 262B: Mathematical Programming II
Instructor: Javad Lavaei
Time: Tuesdays and Thursdays, 11am-12:30pm
Location: 1174 Etcheverry
Instructor's Office Hours: Wednesdays, 10-11am
Grading Policy
Description
This course provides a fundamental understanding of general nonlinear optimization theory, convex optimization, conic optimization, low-rank optimization, numerical algorithms, and distributed computation. Some of the topics covered in this course are as follows:
Nonlinear optimization: First- and second-order conditions, Fritz John optimality conditions, Lagrangian, duality, augmented Lagrangian, etc.
Convexity: Convex sets, convex functions, convex optimization, conic optimization, convex reformulations, etc.
Convexification: Low-rank optimization, conic relaxations, sum-of-squares, etc.
Algorithms: Descent algorithms, conjugate gradient methods, gradient projection & conditional gradient methods, block coordinate methods, proximal methods, second-order methods, accelerated methods, decomposition and distributed algorithms, convergence analysis, etc.
Applications: Machine learning, data science, etc.
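As a small taste of the algorithmic material, here is an illustrative sketch (not course material) of the simplest descent algorithm from the list above: fixed-step gradient descent on a convex quadratic, whose minimizer is characterized by the first-order condition discussed in the nonlinear optimization unit. The matrix, vector, and step size below are made-up for illustration.

```python
import numpy as np

# Minimize the convex quadratic f(x) = 0.5 x'Qx - b'x.
# The unique minimizer x* satisfies the first-order condition
#   grad f(x*) = Q x* - b = 0.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite (assumed data)
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b

x = np.zeros(2)
alpha = 0.1  # fixed step size; needs alpha < 2 / lambda_max(Q) for convergence
for _ in range(500):
    x = x - alpha * grad(x)

# Compare against the closed-form minimizer x* = Q^{-1} b.
x_star = np.linalg.solve(Q, b)
print(x, x_star)  # the iterates converge to x*
```

Here lambda_max(Q) is about 3.62, so alpha = 0.1 satisfies the step-size condition and the iterates converge linearly to x* = (0.2, 0.4).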
Textbook
"Low-Rank Semidefinite Programming: Theory and Applications" by Alex Lemon, Anthony Man-Cho So, and Yinyu Ye, Foundations and Trends in Optimization, 2015.
Lecture notes
Homework
Homework 1
Homework 2
Homework 3
Homework 4
Homework 5 (due April 12 at 5pm)
Homework 6 (due April 29 at 5pm)