Biography

My research lies at the intersection of optimization and distributed computing, and targets applications in machine learning and image processing. I design optimization methods for a wide range of platforms: powerful cluster/cloud computing environments for machine learning and computer vision, as well as resource-limited integrated circuits and FPGAs for real-time signal processing. My research takes an integrative approach that jointly considers theory, algorithms, and hardware to build practical, high-performance systems. Before joining the faculty at Maryland, I completed my PhD in Mathematics at UCLA and was a research scientist at Rice University and Stanford University. I have received several awards, including SIAM’s DiPrima Prize, a DARPA Young Faculty Award, and a Sloan Fellowship.

Research

Here are some of my most recent projects. I believe in reproducible research, and I try to develop open-source tools to accompany my research when possible. For a full list of software and projects, see my complete research page.

Attacking Neural Nets with Poison Frogs

on April 11, 2018

Data poisoning is an adversarial attack in which examples are added to the training set of a classifier to manipulate the behavior of the model at test time. We propose a new poisoning attack that is effective on neural nets, and can be executed by an outsider with no control over the training process.
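For intuition, here is a minimal sketch of a feature-collision poisoning step of the kind described above: the poison image is nudged toward a target's feature representation while a proximal step keeps it visually close to a clean base image. The function names and the forward-backward formulation here are illustrative assumptions, not code from the paper.

```python
import torch

def poison_step(x, base, target_feats, net, lr=0.01, beta=0.1):
    """One forward-backward step of a feature-collision attack:
    pull net(x) toward the target's features while staying close to
    the clean base image.  `net` maps images to feature vectors."""
    x = x.detach().requires_grad_(True)
    loss = (net(x) - target_feats).pow(2).sum()       # feature-space distance
    grad, = torch.autograd.grad(loss, x)
    with torch.no_grad():
        x = x - lr * grad                             # forward (gradient) step
        x = (x + lr * beta * base) / (1 + lr * beta)  # proximal step toward base
    return x
```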

Continue reading

Visualizing Neural Net Loss Landscapes

on January 5, 2018

It is well known that certain neural network architectures produce loss functions that are easier to train and generalize better, but the reasons for this are not well understood. To shed light on this, we explore the structure of neural loss functions using a range of visualization methods.
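As a rough illustration of the idea (an assumed sketch, not the paper's code), one can plot the loss along a random direction in weight space, rescaling the direction to match the scale of the weights. This sketch normalizes per tensor for brevity; the paper's filter normalization works per filter.

```python
import torch

def loss_slice(model, loss_fn, x, y, alphas):
    """Evaluate L(alpha) = loss(theta + alpha * d) along a random,
    norm-matched direction d in parameter space."""
    base = [p.detach().clone() for p in model.parameters()]
    # Random direction, rescaled so each tensor matches the weight norm.
    dirs = [torch.randn_like(p) for p in base]
    dirs = [d * (p.norm() / (d.norm() + 1e-10)) for d, p in zip(dirs, base)]
    losses = []
    with torch.no_grad():
        for a in alphas:
            for p, p0, d in zip(model.parameters(), base, dirs):
                p.copy_(p0 + a * d)                  # move along the slice
            losses.append(loss_fn(model(x), y).item())
        for p, p0 in zip(model.parameters(), base):  # restore the weights
            p.copy_(p0)
    return losses
```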

Continue reading

Stabilizing GANs with Prediction

on December 11, 2017

Adversarial networks are notoriously hard to train, and simple training methods often lead to collapse. We present a simple modification to the standard training method that increases stability. The method is provably stable for a class of saddle-point problems, and improves the performance of numerous GANs.
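On a toy saddle-point problem min_x max_y xy (an illustrative example, not from the paper), the prediction step is easy to see: one player updates against an extrapolated copy of the other, which damps the oscillation that makes plain gradient descent-ascent diverge.

```python
# Plain simultaneous gradient descent-ascent on f(x, y) = x * y spirals
# away from the saddle at the origin; adding the prediction (lookahead)
# step makes the iteration contract toward it.
x, y, lr = 1.0, 1.0, 0.1
for _ in range(2000):
    x_new = x - lr * y            # descent step for the min player (df/dx = y)
    x_bar = x_new + (x_new - x)   # prediction: extrapolate the min player's move
    y = y + lr * x_bar            # ascent step for the max player (df/dy = x)
    x = x_new
print(x, y)                       # both approach the saddle at (0, 0)
```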

Continue reading

PhasePack: A Phase Retrieval Library

on November 20, 2017

PhasePack is a software library that implements a wide range of different phase retrieval algorithms and initialization methods. It can also produce comparisons between algorithms, and comes with empirical datasets for testing on real-world problems.
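To make the problem concrete, here is a toy solver for the task PhasePack targets: recover a signal x from magnitude-only measurements b = |Ax|. This is only an illustration of the underlying problem using simple alternating updates, not PhasePack's interface.

```python
import numpy as np

# Gerchberg-Saxton-style alternating updates for phase retrieval.
rng = np.random.default_rng(0)
n, m = 32, 256
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = np.abs(A @ x_true)                    # phases are lost in measurement

A_pinv = np.linalg.pinv(A)
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # random start
for _ in range(500):
    z = A @ x
    z = b * z / (np.abs(z) + 1e-12)       # impose measured magnitudes
    x = A_pinv @ z                        # least-squares consistency step

# Recovery is only defined up to a global phase.
phase = np.vdot(x, x_true)
phase /= abs(phase)
print(np.linalg.norm(x * phase - x_true) / np.linalg.norm(x_true))
```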

Continue reading

Theory of Binary and Low Precision Nets

on November 10, 2017

Neural net parameters can often be compressed down to a single bit per weight without a significant loss in network performance, yielding a huge reduction in memory footprint and computational workload. We develop a theory of quantized nets, and explain the performance of algorithms for weight quantization.
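A common training scheme in this space (a BinaryConnect-style sketch for illustration, not the paper's code) keeps full-precision "shadow" weights, quantizes them to ±1 on the forward pass, and routes gradients to the shadow copy with a straight-through estimator.

```python
import torch

torch.manual_seed(0)
X = torch.randn(256, 10)
w_true = torch.sign(torch.randn(10))
y = X @ w_true                               # toy regression with ±1 weights

w = (0.1 * torch.randn(10)).requires_grad_(True)   # full-precision shadow weights
opt = torch.optim.SGD([w], lr=0.01)
for _ in range(500):
    w_bin = torch.sign(w.detach())           # quantize for the forward pass
    w_bin = w_bin + (w - w.detach())         # straight-through: grads flow to w
    loss = ((X @ w_bin - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        w.clamp_(-1, 1)                      # keep shadow weights bounded
print(torch.sign(w.detach()))                # the 1-bit weights used at test time
```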

Continue reading

PhaseMax: Convex Phase Retrieval

on December 27, 2016

A number of non-convex optimization problems can be convexified by “lifting” strategies. These methods yield convex formulations at the cost of substantially increased dimensionality. PhaseMax is a new type of convex relaxation that does not require lifting; it solves problems in their original low-dimensional parameter space.
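Concretely, given magnitude measurements b = |Ax| and a rough guess x0 of the signal, PhaseMax maximizes correlation with the guess subject to the magnitude constraints, staying in the original n-dimensional space. Here is a sketch using cvxpy (an assumed choice of modeling tool, not code from the paper):

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, m = 16, 128
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = np.abs(A @ x_true)
x0 = x_true + 0.5 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # rough guess

# maximize Re(<x0, x>)  subject to  |a_i . x| <= b_i   -- convex, no lifting
x = cp.Variable(n, complex=True)
prob = cp.Problem(cp.Maximize(cp.real(x0.conj() @ x)),
                  [cp.abs(A @ x) <= b])
prob.solve()
```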

Continue reading

Distributed Machine Learning

on November 27, 2016

Classical machine learning methods, including stochastic gradient descent (aka backprop), work well on a single machine but don't scale to the cloud or cluster setting. We propose a variety of algorithmic frameworks for scaling machine learning across many workers.
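As one generic illustration of the communication pattern (not necessarily the frameworks proposed in the post), workers can run SGD locally and synchronize by averaging their models at infrequent communication rounds.

```python
import numpy as np

rng = np.random.default_rng(0)
d, workers, rounds, local_steps, lr = 10, 4, 50, 10, 0.05
w_true = rng.standard_normal(d)
data = [rng.standard_normal((100, d)) for _ in range(workers)]
labels = [X @ w_true for X in data]

w_global = np.zeros(d)
for _ in range(rounds):
    local_models = []
    for X, y in zip(data, labels):
        w = w_global.copy()
        for _ in range(local_steps):              # cheap local computation
            i = rng.integers(len(X))
            w -= lr * (X[i] @ w - y[i]) * X[i]    # SGD on 0.5*(x.w - y)^2
        local_models.append(w)
    w_global = np.mean(local_models, axis=0)      # one communication round
print(np.linalg.norm(w_global - w_true))
```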

Continue reading

FASTA: Your Handy Optimizer

on October 23, 2016

FASTA (Fast Adaptive Shrinkage/Thresholding Algorithm) is an efficient, easy-to-use implementation of the Forward-Backward Splitting (FBS) method (also known as the proximal gradient method) for regularized optimization problems. Many variations on FBS are available in FASTA, including the popular accelerated variant FISTA (Beck and Teboulle ’09) and the adaptive stepsize rule SpaRSA, among others.
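The core FBS iteration is simple; here is a bare-bones sketch for the lasso problem, alternating a gradient step on the smooth term with a proximal (soft-thresholding) step on the regularizer. This illustrates the method itself, not FASTA's interface.

```python
import numpy as np

def fbs_lasso(A, b, mu, iters=500):
    """Solve min_x 0.5*||Ax - b||^2 + mu*||x||_1 by forward-backward splitting."""
    tau = 0.9 / np.linalg.norm(A, 2) ** 2   # stepsize below 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - tau * A.T @ (A @ x - b)     # forward: gradient step on smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - tau * mu, 0.0)  # backward: prox
    return x
```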

Continue reading

For students

I teach courses in discrete mathematics and optimization.

View course webpages

Sponsors

My research is made possible by the generous support of the following organizations.