Advanced Topics in Numerical Methods
Sparsity and Machine Learning
CMSC878O - Spring 2015

Course Information

Schedule: TuTh 12:30pm - 1:45pm
Instructor: Tom Goldstein
    Office Hours: Th 2-3pm, AVW 3141
TA: Neal Gupta (nguptateaching gmail com)
    Office Hours: M 3-4pm, AVW 4103

Homework
All homework should be submitted on the UMD submit server.  Instructions can be found here (courtesy of Neal).
Homework 1: gradients and Fourier transforms
Homework 2: descent methods
Homework 3: forward-backward splitting (a minimal sketch of this method appears below)
Homework 4: ADMM
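
To give a flavor of the programming assignments, here is a minimal sketch of forward-backward splitting (ISTA) applied to the Lasso. This is illustrative only, not the actual assignment; the function names, step-size rule, and iteration count are assumptions.

    import numpy as np

    def soft_threshold(x, t):
        # Proximal operator of t*||.||_1 (elementwise shrinkage)
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def ista(A, b, lam, iters=500):
        # Forward-backward splitting for  min_x 0.5*||Ax-b||^2 + lam*||x||_1
        x = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # step <= 1/L, with L = ||A||_2^2
        for _ in range(iters):
            grad = A.T @ (A @ x - b)                         # forward (gradient) step
            x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
        return x

Each iteration takes a gradient step on the smooth least-squares term, then a proximal (shrinkage) step on the L1 term; with a step size no larger than 1/L the objective decreases monotonically.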


Course Description
This is an introductory survey of optimization for computer scientists.  Special emphasis will be placed on methods with applications in machine learning, model fitting, and image processing.  The format of the class is split between a traditional lecture course, in which the instructor presents material, and a reading course, in which students present papers.  There are no formal prerequisites for this course; however, students should have a strong background in applied mathematics (especially linear algebra) and computer programming.

Students' grades will be based on completion of the following:
  •  Homework assignments: these will consist mostly of programming tasks that may be completed in either Matlab or Python.
  • A 15-25 minute paper presentation (in groups of 1-4)
  • A 15-25 minute project presentation (in groups of 1-4)
The paper presentation and project are related: the paper presentation is intended to give the class the background needed to understand your final project.  The final project should be on a topic that combines optimization with your own domain-specific knowledge or ongoing research.  You are not required to write up your final project; instead, you will present your results in class.


Topics covered in lectures will include: multivariable calculus and optimality conditions, gradient methods, interior point methods, splitting methods, and stochastic optimization.  Applications covered will include: fitting generalized linear models, sparse regression methods, matrix factorizations, neural networks, support vector machines, and more.
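
As a taste of the early lectures, the basic optimality condition and gradient iteration can be written as follows (these are standard textbook forms, not notation specific to this course), with the Lasso as the canonical sparse-regression example:

    \nabla f(x^\star) = 0
        % first-order optimality for smooth, convex f

    x^{k+1} = x^k - \tau \nabla f(x^k), \quad 0 < \tau \le 1/L
        % gradient descent; L is a Lipschitz constant of \nabla f

    \min_x \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1
        % the Lasso (sparse regression)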


Book & Other Sources
All course materials are available for free online.  Suggested reading material for various topics includes:

Numerical Linear Algebra:  Numerical Linear Algebra by Trefethen and Bau
L1 models and sparsity:  Sparse Modeling for Image and Vision Processing
Convex functions and gradient methods:  Convex Optimization by Boyd and Vandenberghe
Proximal methods: A Field Guide to Forward-Backward Splitting
ADMM: Fast Alternating Direction Optimization Methods
Consensus ADMM: Distributed Optimization and Statistical Learning
Unwrapped ADMM:  Unwrapping ADMM
PDHG:  Adaptive Primal-Dual Hybrid Gradient Methods
SGD: Stochastic Gradient Descent for Non-Smooth Optimization
Monte-Carlo: An Introduction to MCMC for Machine Learning
Barrier Methods: Convex Optimization by Boyd and Vandenberghe, chapter 11
Primal-Dual Interior Point Methods: Nocedal and Wright, chapter 14
Semi-definite programming:  Vandenberghe and Boyd
Metric learning: Distance Metric Learning for LMNN

Topics Covered
Course Overview, linear algebra overview
Sparse models and L1 optimization
Total variation, calculus, and FFT
Solvers for linear problems
Convex functions
Standard-form problems
Unconstrained optimization
Compressed sensing
Duality
Splitting methods
Interior point methods
High-dimensional statistics
Stochastic methods (backprop, SGD, Monte-Carlo methods)
Integer programming
Derivative-free methods
Semidefinite programming

The course will focus on methods applicable to: sparse least-squares/Lasso, total variation image processing, deconvolution, sparse-plus-low-rank approximations, support vector machines, factor analysis, neural nets, logistic regression, and L-infinity regularized problems.
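
Many of these problems yield to the splitting methods listed above. As one illustration, here is a minimal sketch of ADMM applied to the Lasso in its standard scaled-dual form; the function name, penalty parameter rho, and iteration count are illustrative assumptions, not course-provided code.

    import numpy as np

    def admm_lasso(A, b, lam, rho=1.0, iters=200):
        # ADMM for  min 0.5*||Ax-b||^2 + lam*||z||_1  subject to  x = z
        # (scaled-dual form; rho and iters are illustrative choices)
        n = A.shape[1]
        z = np.zeros(n)
        u = np.zeros(n)
        AtA = A.T @ A + rho * np.eye(n)   # matrix for the x-subproblem
        Atb = A.T @ b
        for _ in range(iters):
            x = np.linalg.solve(AtA, Atb + rho * (z - u))             # quadratic x-update
            w = x + u
            z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0)   # shrinkage z-update
            u = u + x - z                                             # dual update
        return z

The same x/z/u pattern carries over to total variation and sparse-plus-low-rank problems by swapping in a different proximal step for the z-update.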

Restricted Access
Calendar
Lecture Slides