Advanced Numerical Optimization
CMSC764 / AMSC604 - Spring 2016

Course Information

When: TuTh 12:30pm - 1:45pm
Where: CSI 3120
Instructor: Tom Goldstein
    Office Hours: Th 2-3pm, AVW 3141
TA: Peter Sutor
    Office Hours: Wed 1:30-2:30

Final Exam:  Tue May 17 at 1:30pm in CSIC 3120

Homework
All homework should be submitted on the UMD submit server.  Instructions from last year's course can be found here.
Homework 1 - gradients, linear classifiers, and FFTs 
Homework 2 - convex functions
Homework 3 - gradient methods
Homework 4 - duality (LaTeX source)
Homework 5 - forward-backward splitting
Homework 6 - Lagrangian methods

Course Description
This is an introductory survey of optimization.  Special emphasis will be put on methods with applications in machine learning, model fitting, and image processing.  There are no formal prerequisites for this course; however, students should have a strong background in applied mathematics (especially linear algebra) and computer programming.

Students' grades will be based on completion of the following:
  •  Homework assignments (70% of grade):  these will consist of programming tasks and theoretical exercises.
  •  Final Exam (30% of grade)

Topics Covered
  • multivariable calculus and optimality conditions
  • convex functions
  • duality theory
  • gradient methods
  • linear programming
  • interior point methods
  • splitting methods
  • stochastic algorithms
  • derivative free methods
  • semidefinite programming
  • global optimization
  • ...and more

Book & Other Sources
All course materials are available for free online.   Suggested reading material for various topics includes:

Numerical Linear Algebra:  Numerical Linear Algebra by Trefethen and Bau
L1 models and sparsity:  Sparse modeling for Image and Vision Processing
Convex functions and gradient methods:  Convex Optimization by Boyd and Vandenberghe
Convergence rates for gradient methods: Optimal Rates in Convex Optimization
Proximal methods: A Field Guide to Forward-Backward Splitting
ADMM: Fast Alternating Direction Optimization Methods
Consensus ADMM: Distributed Optimization and Statistical Learning
Unwrapped ADMM:  Unwrapping ADMM
PDHG:  Adaptive Primal-Dual Hybrid Gradient Methods
SGD: Incremental Gradient, Subgradient, and Proximal Methods
SGD convergence rates: Stochastic Gradient Descent for Non-Smooth Optimization
Monte-Carlo: An Introduction to MCMC for Machine Learning
Barrier Methods: Convex Optimization by Boyd and Vandenberghe, chapter 11
Primal-Dual Interior Point Methods: Nocedal and Wright, chapter 14
Semi-definite programming:  Vandenberghe and Boyd
Metric learning: Distance Metric Learning for LMNN


Restricted Access
Lecture Slides