PhD Proposal: A Reconciliation of Deep Learning and the Brain: Towards Hybrid Biologically-Augmented Recurrent Neural Networks for Temporal Sensory Perception

Talk
Matthew Evanusa
Time: 12.07.2020, 14:30 to 16:30
Location: Remote

In 1958, Frank Rosenblatt conceived the Perceptron in an effort to fulfill the dream of connectionism: to explain and recreate brain phenomena such as learning and behavior through simple learning rules applied to simple neurons. After his tragic death, the A.I. winter, and the resurgence that followed, his brain-focused network was distilled into the standardized feed-forward multi-layer perceptrons, or deep artificial neural networks, that we are familiar with today. However, even in proposing the Perceptron, Rosenblatt hinted that it was really hierarchical and temporal information that was interesting: an intuitively clear point, since all the data we experience arrives in the temporal domain. Backpropagation continues to dominate, although it appears ill-suited to recurrent network training. This is reinforced by the fact that backpropagation-trained, feed-forward Transformer networks outperform RNNs on temporal tasks, causing RNNs to fall out of favor for temporal data in the ML and AI communities. Reservoir computing, a type of recurrent neural network that keeps random recurrent connections but trains only a readout layer, avoids the pitfalls of backpropagation with recurrence while showing strong performance, but it needs further development, especially in hierarchical or deep variants, to compete with the state of the art.

My proposed dissertation aims to pick up where the Perceptron left off, in motivation and spirit, and to continue to look to the brain to construct networks via the connectionist philosophy. Believing that recurrent connections remain a powerful tool for temporal learning, and looking to biology, I will propose a new class of recurrent neural networks, which I call B-RNNs, short for Biologically-Augmented RNNs. These networks build on the success of the reservoir computing paradigm but go steps further by incorporating reservoirs into new hybrid hierarchical architectures, trained by new backpropagation alternatives and deep reinforcement learning, and informed by new insights and findings from neuroscience. The B-RNN nomenclature will also serve as a taxonomic umbrella for organizing these networks. In completed work, I will show that even simple hybridizations can beat deep LSTMs and GRUs at complex temporal classification tasks, and I will propose several more complex B-RNNs in development and beyond. I will also lay out a framework for how B-RNNs can serve as a standardization for spiking neural network architectures.
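To make the reservoir computing paradigm referenced above concrete, below is a minimal echo-state-network-style sketch in Python/NumPy: the input and recurrent weights are fixed and random, and only a linear readout is fit, here by ridge regression. All sizes, scalings, and the toy data are illustrative assumptions; this is generic reservoir computing, not the proposed B-RNN architecture.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes; none of these come from the proposal itself.
    n_in, n_res, n_out, T = 3, 200, 2, 500

    # Fixed random input and recurrent weights -- never trained.
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))

    # Common heuristic: rescale so the spectral radius is below 1,
    # which encourages the "echo state" (fading memory) property.
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

    def run_reservoir(inputs):
        """Drive the reservoir with an input sequence, collecting states."""
        x = np.zeros(n_res)
        states = []
        for u in inputs:
            x = np.tanh(W_in @ u + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Placeholder data standing in for a real temporal task.
    U = rng.normal(size=(T, n_in))
    Y = rng.normal(size=(T, n_out))

    # Only this linear readout is trained, via ridge regression.
    X = run_reservoir(U)
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)
    Y_pred = X @ W_out  # shape (T, n_out)

Because training reduces to a single linear solve over the collected reservoir states, the approach sidesteps backpropagation through time entirely, which is the property the abstract highlights.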

Examining Committee:

Chair: Dr. Yiannis Aloimonos
Dept rep: Dr. James Reggia
Members: Dr. Cornelia Fermüller, Dr. Michelle Girvan, Dr. Daniel Butts