Tensor Methods: A new paradigm for training probabilistic models and neural networks
Tensors are rich structures for modeling complex higher-order relationships in data-rich domains such as social networks, computer vision, and the internet of things. Tensor decomposition methods are embarrassingly parallel and scale to enormous datasets. They are guaranteed to converge to the global optimum and yield consistent estimates for many probabilistic models, including topic models, community models, and hidden Markov models. I will also demonstrate how tensor methods can yield rich discriminative features for classification tasks and provide a guaranteed method for training neural networks.
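As a minimal illustration of the decomposition machinery behind these guarantees, the sketch below builds a synthetic symmetric, orthogonally decomposable third-order tensor and recovers one of its components by tensor power iteration. The setup (NumPy, orthonormal toy components, fixed random seed) is an assumption for illustration, not the speaker's implementation; real data would first require moment estimation and whitening.

```python
import numpy as np

rng = np.random.default_rng(0)

def power_iteration(T, n_iter=50):
    """Tensor power iteration: x <- T(I, x, x), normalized each step."""
    d = T.shape[0]
    x = rng.normal(size=d)
    x /= np.linalg.norm(x)
    for _ in range(n_iter):
        # Contract T with x along the last two modes: T(I, x, x)
        x = np.einsum('ijk,j,k->i', T, x, x)
        lam = np.linalg.norm(x)
        x /= lam
    return lam, x

# Toy odeco tensor T = sum_i w_i * a_i (x) a_i (x) a_i
# with orthonormal components a_i (an assumed synthetic setup).
d, k = 5, 3
A, _ = np.linalg.qr(rng.normal(size=(d, d)))
A = A[:, :k]                   # orthonormal component vectors
w = np.array([3.0, 2.0, 1.0])  # component weights
T = np.einsum('i,ji,ki,li->jkl', w, A, A, A)

lam, v = power_iteration(T)
# v aligns (up to sign) with one column of A, and lam with its weight
```

Deflation (subtracting the recovered rank-1 term from T and repeating) would recover the remaining components; each update here is a simple contraction, which is why the method parallelizes so readily.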