Visualizing the Loss Landscape of Neural Nets

It is well known that certain neural network architectures produce loss functions that are easier to train and generalize better, but the reasons for this are not well understood. To explore this question, we study the structure of neural loss functions using a range of visualization methods.
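One common way to visualize a loss landscape is to plot the loss along a random direction through the trained parameters. The NumPy sketch below illustrates the idea on a toy least-squares model; the model, variable names, and the simple norm-based rescaling (a 1-D stand-in for the filter normalization used with real networks) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model standing in for a network: linear least squares on fixed data.
X = rng.normal(size=(50, 10))
w_star = rng.normal(size=10)
y = X @ w_star

def loss(w):
    return float(np.mean((X @ w - y) ** 2))

# "Trained" parameters: here the exact minimizer, for simplicity.
w_trained = w_star

# Random direction, rescaled to the norm of the trained weights --
# a 1-D analogue of filter normalization for real networks.
d = rng.normal(size=10)
d *= np.linalg.norm(w_trained) / np.linalg.norm(d)

# The loss along w_trained + alpha * d is a 1-D slice of the landscape.
alphas = np.linspace(-1.0, 1.0, 21)
curve = [loss(w_trained + a * d) for a in alphas]
```

Plotting `curve` against `alphas` shows how sharp or flat the minimum looks along that direction; for real networks, two such directions give a 2-D surface plot.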

Continue reading


PhasePack

PhasePack is a software library that implements a wide range of phase retrieval algorithms. It can also produce algorithm comparisons, and comes with empirical datasets for testing on real-world problems.
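For readers unfamiliar with the problem: phase retrieval means recovering a signal x from magnitude-only measurements b = |Ax|. The sketch below shows one classic baseline, error-reduction (Gerchberg–Saxton-style) alternating projections, in NumPy; it is an illustration of the problem, not PhasePack's own API, and the dimensions and variable names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Phase retrieval: recover a signal x from magnitude-only measurements
# b = |A x|, where the phase of A x has been lost.
n, m = 20, 120                        # signal length, number of measurements
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
x_true = rng.normal(size=n) + 1j * rng.normal(size=n)
b = np.abs(A @ x_true)

A_pinv = np.linalg.pinv(A)

def residual(x):
    return float(np.linalg.norm(np.abs(A @ x) - b))

# Error-reduction iteration: keep the current phase of A x, impose the
# measured magnitudes, then project back onto the range of A.
x = rng.normal(size=n) + 1j * rng.normal(size=n)
r0 = residual(x)
for _ in range(200):
    z = A @ x
    x = A_pinv @ (b * np.exp(1j * np.angle(z)))
```

Note that recovery is only possible up to a global phase factor, which is why comparisons against the true signal must first align phases.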

Continue reading

Distributed Machine Learning

Classical machine learning methods, including stochastic gradient descent (SGD), work well on a single machine, but don’t scale well to cloud or cluster settings. We propose a variety of algorithmic frameworks for scaling machine learning across many workers.
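One simple, standard framework in this family is one-shot parameter averaging: each worker runs SGD on its own shard of the data, and the resulting parameter vectors are averaged once at the end. The NumPy sketch below illustrates this on a toy least-squares problem; the setup and names are illustrative assumptions, not the specific algorithms proposed here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared least-squares problem whose data is split across 4 workers.
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def local_sgd(Xs, ys, steps=2000, lr=0.01, seed=0):
    """Plain single-machine SGD on one worker's shard of the data."""
    r = np.random.default_rng(seed)
    w = np.zeros(Xs.shape[1])
    for _ in range(steps):
        i = r.integers(len(ys))
        grad = (Xs[i] @ w - ys[i]) * Xs[i]   # gradient of 0.5 * (x.w - y)^2
        w -= lr * grad
    return w

# Each worker trains independently on its shard; parameters are averaged once.
shards = np.array_split(np.arange(200), 4)
worker_models = [local_sgd(X[s], y[s], seed=k) for k, s in enumerate(shards)]
w_avg = np.mean(worker_models, axis=0)
```

The appeal of this scheme is that it needs only one round of communication; its weakness is that on heterogeneous data the shard-local optima can differ, which motivates more communication-efficient iterative frameworks.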

Continue reading