Cengiz Pehlevan (Harvard)

Date: 

Thursday, March 28, 2019, 3:00pm to 4:00pm

Location: 

Pierce Hall, Rm 209, 29 Oxford St, Cambridge, MA 02138, USA
"What Learning Objectives Does Hebbian Plasticity Optimize?" Synaptic plasticity is widely accepted to be the mechanism behind learning in the brain’s neural networks (NNs). A central question is how synapses, which have access to only local information about the state of the network, can still organize collectively and perform circuit-wide learning in an efficient manner. We show that such local learning actually optimizes a novel class of network-wide learning objectives. These objectives are based on similarities and contain a term that aligns the similarity of outputs to the similarity of inputs. Online optimization of these objectives leads to algorithms implementable by biologically-plausible NNs with local learning rules. Members of the fast-growing family of similarity-based objectives and associated NNs solve unsupervised learning tasks relevant to the brain such as dimensionality reduction, sparse and/or nonnegative feature extraction, blind source separation, clustering and manifold learning. In addition to serving as models of natural NNs, such networks can serve as general-purpose machine learning algorithms. Speaker Bio: Cengiz (pronounced "Jen·ghiz”) Pehlevan is an Assistant Professor of Applied Mathematics at Harvard SEAS. His research interests are in theoretical neuroscience and neural computation. Cengiz comes to SEAS from the Flatiron Institute's Center for Computational Biology (CCB), where he was a a research scientist in the neuroscience group. Before CCB, Cengiz was a postdoctoral associate at Janelia Research Campus, and before that a Swartz Fellow at Harvard. Cengiz received a doctorate in physics from Brown University and undergraduate degrees in physics and electrical engineering from Bogazici University in Turkey.