Gatsby Institute, University College London
Learning visual motion in recurrent neural networks
Monday 10th of December 2012 at 12:00pm
We present a dynamic nonlinear generative model for visual motion based on a
latent representation of binary-gated Gaussian variables connected in a network. Trained on sequences of images with an STDP-like rule, the model learns to represent different movement directions in different variables. We use an online approximate inference scheme that can be mapped onto the dynamics of networks of neurons. Probed with drifting-grating stimuli and moving bars of light, neurons in the model show response patterns analogous to those of direction-selective simple cells in primary visual cortex. We show how the model's computations are enabled by a specific pattern of learnt asymmetric recurrent connections. I will also briefly discuss our application of recurrent neural networks as statistical models of simultaneously recorded spiking neurons.
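To make the "binary-gated Gaussian variables connected in a network" concrete, here is a minimal illustrative sketch of one generative step, under assumed forms: each latent is the product of a Bernoulli gate and a Gaussian magnitude, with gate probabilities driven by the previous latent state through a recurrent weight matrix W (here random; in the model these asymmetric connections are learnt). All names and parameter choices below are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                             # number of latent units (assumed)
W = rng.normal(0.0, 0.5, (N, N))  # recurrent weights; asymmetric, stands in for learnt connections
b = np.zeros(N)                   # gate biases (assumed)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def step(z_prev):
    """One generative step: gates depend on previous latents via W."""
    p_on = sigmoid(W @ z_prev + b)             # gate-on probabilities
    s = (rng.random(N) < p_on).astype(float)   # binary gates
    g = rng.normal(0.0, 1.0, N)                # Gaussian magnitudes
    return s * g                               # binary-gated Gaussian latent

# Roll the dynamics forward for a short sequence.
z = np.zeros(N)
seq = []
for t in range(5):
    z = step(z)
    seq.append(z)
```

Because the gate probabilities at time t depend on the latents at time t-1, an asymmetric W can favour particular latent-to-latent transitions, which is one way direction selectivity could arise in such a network.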
Join Email List
You can subscribe to our weekly seminar email list by sending an email to
email@example.com that contains the words
subscribe redwood in the body of the message.
(Note: The subject line can be arbitrary and will be ignored)