Projects

Stabilizing GANs with Prediction

Adversarial networks are notoriously hard to train, and simple training methods often collapse. We present a simple modification to the standard training method that increases stability. The method is provably stable for a class of saddle-point problems, and improves performance of numerous GANs.
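
As a toy illustration of the prediction idea (a NumPy sketch, not the paper's code), consider the bilinear saddle-point problem min_x max_y x·y. Plain simultaneous gradient descent/ascent spirals away from the saddle, while extrapolating ("predicting") the opponent's next state before responding restores convergence:

```python
import numpy as np

# Toy saddle-point problem: min_x max_y  L(x, y) = x * y.
def run(predict, lr=0.1, steps=2000):
    x, y = 1.0, 1.0                  # start away from the saddle at (0, 0)
    for _ in range(steps):
        y_new = y + lr * x           # gradient ascent step for the max player
        # Prediction step: extrapolate the opponent's last move before
        # the min player responds; otherwise use the stale value of y.
        y_eff = 2.0 * y_new - y if predict else y
        x = x - lr * y_eff           # gradient descent step for the min player
        y = y_new
    return np.hypot(x, y)            # distance from the saddle point

print("with prediction:    %.2e" % run(True))    # shrinks toward zero
print("without prediction: %.2e" % run(False))   # grows without bound
```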

Continue reading

PhasePack

PhasePack is a software library that implements a wide range of different phase retrieval algorithms and initialization methods. It can also produce comparisons between algorithms, and comes with empirical datasets for testing on real-world problems.
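
As background, here is a minimal NumPy sketch of one classic solver in the family PhasePack covers: Gerchberg-Saxton alternating projections, seeded with a spectral initializer. This illustrates the phase retrieval problem itself; it is not PhasePack's own (MATLAB) interface:

```python
import numpy as np

# Phase retrieval: recover x from magnitude-only measurements b = |Ax|.
rng = np.random.default_rng(0)
n, m = 64, 512
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
x_true = rng.normal(size=n) + 1j * rng.normal(size=n)
b = np.abs(A @ x_true)

# Spectral initializer: leading eigenvector of (1/m) * A^H diag(b^2) A.
Y = (A.conj().T * b**2) @ A / m
x = np.linalg.eigh(Y)[1][:, -1]

# Gerchberg-Saxton: alternately enforce the measured magnitudes and
# consistency with the range of A.
A_pinv = np.linalg.pinv(A)
for _ in range(200):
    phase = np.exp(1j * np.angle(A @ x))   # keep current phase estimates
    x = A_pinv @ (b * phase)               # impose measured magnitudes

# Solutions are only defined up to a global phase; align before comparing.
c = np.vdot(x, x_true)
rel_err = np.linalg.norm(x * c / abs(c) - x_true) / np.linalg.norm(x_true)
print("relative error: %.2e" % rel_err)
```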

Continue reading

Training Quantized Nets: A Deeper Understanding

Neural net parameters can often be compressed down to just a single bit without a significant loss in network performance, yielding a huge reduction in memory footprint and computational workload. We develop a theory of quantized nets, and explain the performance of algorithms for weight quantization.
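
For intuition, here is a toy NumPy sketch of the BinaryConnect-style scheme the paper analyzes, applied to logistic regression: inference uses one-bit (sign) weights, while gradient updates accumulate in a full-precision copy. The data and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) > 0).astype(float)   # synthetic labels

w_float = 0.01 * rng.normal(size=d)   # full-precision "accumulator" weights
lr = 0.05
for _ in range(500):
    w_bin = np.sign(w_float)              # 1-bit weights used for inference
    p = 1.0 / (1.0 + np.exp(-X @ w_bin))  # logistic forward pass
    grad = X.T @ (p - y) / n              # gradient taken at the binary weights...
    w_float -= lr * grad                  # ...but applied to the float copy

acc = np.mean((X @ np.sign(w_float) > 0) == (y == 1.0))
print("train accuracy with 1-bit weights: %.3f" % acc)
```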

Continue reading

PhaseMax

A number of non-convex optimization problems can be convexified by “lifting” strategies. These methods yield convex formulations at the cost of substantially increased dimensionality. PhaseMax is a new type of convex relaxation that does not require lifting; it solves problems in their original low-dimensional parameter space.
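
Concretely, given magnitudes b = |Ax| and an anchor vector x0 roughly aligned with the true signal, PhaseMax solves: maximize <x0, x> subject to |a_i^T x| <= b_i. Below is a minimal sketch using the third-party CVXPY package; the anchor here is simulated, whereas in practice it would come from a spectral initializer:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 240
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
b = np.abs(A @ x_true)                  # magnitude-only measurements

x0 = x_true + 0.5 * rng.normal(size=n)  # simulated anchor vector
x0 /= np.linalg.norm(x0)

# PhaseMax: a convex program posed in the original n-dimensional space.
x = cp.Variable(n)
prob = cp.Problem(cp.Maximize(x0 @ x), [cp.abs(A @ x) <= b])
prob.solve()

rel_err = min(np.linalg.norm(x.value - x_true),
              np.linalg.norm(x.value + x_true)) / np.linalg.norm(x_true)
print("relative error: %.2e" % rel_err)   # up to a global sign flip
```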

Continue reading

Distributed Machine Learning

Classical machine learning methods, including stochastic gradient descent (aka backprop), work well on a single machine but don't scale to cloud or cluster settings. We propose a variety of algorithmic frameworks for scaling machine learning across many workers.
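
As a schematic of the setting (not one of the project's algorithms), here is synchronous data-parallel gradient descent in NumPy: each worker computes a gradient on its own data shard, and a central server averages them:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, workers = 4000, 10, 8
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
y = X @ w_star + 0.1 * rng.normal(size=n)        # least-squares problem

shards = np.array_split(np.arange(n), workers)   # one data shard per worker
w = np.zeros(d)
lr = 0.1
for _ in range(200):
    # Each worker computes a local gradient (sequential here; in a real
    # system these run in parallel on separate machines).
    grads = [X[i].T @ (X[i] @ w - y[i]) / len(i) for i in shards]
    w -= lr * np.mean(grads, axis=0)             # server averages and steps

print("distance to w_star: %.2e" % np.linalg.norm(w - w_star))
```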

Continue reading

FASTA

FASTA (Fast Adaptive Shrinkage/Thresholding Algorithm) is an efficient, easy-to-use implementation of the Forward-Backward Splitting (FBS) method (also known as the proximal gradient method) for regularized optimization problems. Many variations on FBS are available in FASTA, including the popular accelerated variant FISTA (Beck and Teboulle ’09), the adaptive stepsize rule SpaRSA, and more.
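
The core FBS loop is only a few lines; below is a generic NumPy sketch of the plain fixed-stepsize iteration applied to the lasso problem min 0.5||Ax - b||^2 + mu||x||_1. FASTA itself is a MATLAB package that layers acceleration, adaptive stepsizes, backtracking, and stopping rules on top of this basic loop:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 80, 200, 10
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
b = A @ x_true
mu = 0.1

tau = 1.0 / np.linalg.norm(A, 2) ** 2   # stepsize <= 1/L, with L = ||A||_2^2
x = np.zeros(n)
for _ in range(500):
    z = x - tau * A.T @ (A @ x - b)     # forward (gradient) step on the
                                        #   smooth least-squares term
    x = np.sign(z) * np.maximum(np.abs(z) - tau * mu, 0)
                                        # backward (proximal) step: the prox
                                        #   of mu*||.||_1 is soft-thresholding

print("relative error: %.2e" %
      (np.linalg.norm(x - x_true) / np.linalg.norm(x_true)))
```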

Continue reading

Primal-dual hybrid gradient method

PDHG is a powerful splitting method that can solve a wide range of constrained and non-differentiable optimization problems. Unlike the popular ADMM method, the PDHG approach usually does not require expensive minimization sub-steps. We provide adaptive stepsize selection rules that automate the solver, while increasing its speed and robustness.
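
Here is a minimal NumPy sketch of the basic fixed-stepsize PDHG (Chambolle-Pock) iteration applied to 1-D total-variation denoising, min_x 0.5||x - f||^2 + lam*||Dx||_1; the adaptive stepsize rules are the project's contribution and are not shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 200, 1.0
clean = np.repeat([0.0, 2.0, -1.0, 1.0], n // 4)   # piecewise-constant signal
f = clean + 0.3 * rng.normal(size=n)               # noisy observation

D = np.diff(np.eye(n), axis=0)   # (n-1) x n finite-difference operator
tau = sigma = 0.4                # fixed steps; need tau*sigma*||D||^2 < 1

x = f.copy()
x_bar = x.copy()
y = np.zeros(n - 1)
for _ in range(500):
    # Dual ascent, then projection onto the l-infinity ball of radius lam.
    y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
    # Primal descent: prox of the quadratic data term 0.5*||. - f||^2.
    x_new = (x - tau * (D.T @ y) + tau * f) / (1 + tau)
    x_bar = 2 * x_new - x        # extrapolation ("bar") step
    x = x_new

print("error vs. clean signal: %.3f" % np.linalg.norm(x - clean))
```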

Continue reading

The STOne transform: flexible compressed sensing

The STOne transform enables images and videos to be under-sampled and then reconstructed instantly at Nyquist rates, or at high resolution using compressed sensing.

Continue reading

PIT

The Perfusion Imaging Toolkit (PIT) is a comprehensive set of tools for MR-based perfusion imaging. In addition to several perfusion calculation tools, the software smoothly integrates file formatting, image denoising, registration, segmentation, mean curve extraction, and many other image processing tasks. PIT can generate perfusion measurements from regions of interest as well as pixel-by-pixel perfusion maps.

Continue reading

Split Bregman

The split Bregman method (also known as ADMM) solves a wide range of image processing and signal reconstruction problems. It makes optimization fast and easy, particularly when L1 or total-variation priors make other methods difficult.
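
As a small illustration, here is a NumPy sketch of the split Bregman iteration for 1-D total-variation denoising, min_x 0.5||x - f||^2 + lam*||Dx||_1. The L1 term is split off via an auxiliary variable d = Dx, leaving an easy quadratic x-update and a closed-form shrinkage step:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, mu = 200, 1.0, 5.0
clean = np.repeat([0.0, 2.0, -1.0, 1.0], n // 4)   # piecewise-constant signal
f = clean + 0.3 * rng.normal(size=n)               # noisy observation

D = np.diff(np.eye(n), axis=0)        # finite-difference operator
x = f.copy()
d = np.zeros(n - 1)                   # auxiliary variable, d ~ Dx
b = np.zeros(n - 1)                   # Bregman (dual) variable
H = np.eye(n) + mu * (D.T @ D)        # x-update system matrix

for _ in range(100):
    x = np.linalg.solve(H, f + mu * D.T @ (d - b))        # quadratic solve
    v = D @ x + b
    d = np.sign(v) * np.maximum(np.abs(v) - lam / mu, 0)  # shrinkage (prox)
    b = b + D @ x - d                                     # Bregman update

print("error vs. clean signal: %.3f" % np.linalg.norm(x - clean))
```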

Continue reading