Learning visual motion in recurrent neural networks

Marius Pachitariu

Gatsby Institute, University College London
Monday, December 10, 2012 at 12:00pm
560 Evans Hall

We present a dynamic nonlinear generative model for visual motion based on a latent representation of binary-gated Gaussian variables connected in a network. Trained on sequences of images with an STDP-like rule, the model learns to represent different movement directions in different variables. We use an online approximate inference scheme that can be mapped onto the dynamics of networks of neurons. Probed with drifting-grating stimuli and moving bars of light, neurons in the model show patterns of responses analogous to those of direction-selective simple cells in primary visual cortex. We show how the computations of the model are enabled by a specific pattern of learnt asymmetric recurrent connections. I will also briefly discuss our application of recurrent neural networks as statistical models of simultaneously recorded spiking neurons.
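To make the generative structure concrete, here is a minimal sketch of sampling from a binary-gated Gaussian latent model with recurrent gate dynamics. All names, dimensions, and the exact parameterisation (gate probabilities driven by recurrent weights `R`, frames generated through a linear feature matrix `A`) are illustrative assumptions, not the model as implemented in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: latent units, pixels, time steps
n_latent, n_pix, T = 32, 64, 5

A = rng.normal(0, 0.1, (n_pix, n_latent))     # hypothetical generative features
R = rng.normal(0, 0.1, (n_latent, n_latent))  # hypothetical recurrent gate weights

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

s = np.zeros(n_latent)  # binary gates, initially all off
frames = []
for t in range(T):
    # Gate probabilities at time t depend on the previous gates
    # through the (asymmetric) recurrent weights.
    s = (rng.random(n_latent) < sigmoid(R @ s)).astype(float)
    z = rng.normal(0.0, 1.0, n_latent)        # Gaussian amplitudes
    x = A @ (s * z) + rng.normal(0, 0.01, n_pix)  # one generated image frame
    frames.append(x)

frames = np.stack(frames)  # shape (T, n_pix): a generated image sequence
print(frames.shape)
```

In a sketch like this, learning an asymmetric `R` is what would let gate activity propagate preferentially in one direction over time, which is one way to picture how direction selectivity could arise from recurrent connectivity.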