Mechanisms of bottom-up and top-down processing in visual perception
Wednesday, April 22, 2009, 12:00 pm
Perception involves a complex interaction between feedforward, sensory-driven information and feedback from attentional, memory, and executive processes that modulate this feedforward processing. A mechanistic understanding of how feedforward and feedback signals are integrated is a necessary step toward elucidating key aspects of visual and cognitive function and dysfunction.
508-20 Evans Hall
In this talk, I will describe a computational framework for the study of visual perception. I will present computational as well as experimental evidence suggesting that bottom-up and top-down processes make distinct and essential contributions to the recognition of complex visual scenes. A feedforward computational architecture may provide a satisfactory account of “immediate recognition,” corresponding to the first few hundred milliseconds of visual processing. However, such an architecture may be limited when recognizing complex, cluttered visual scenes, and attentional mechanisms and cortical feedback may be necessary to overcome these limitations. Finally, I will show that it is possible to reliably read the mind’s eye from fMRI signals and predict the category of objects that human observers are mentally imagining. This result argues that cortical feedback may be highly specific.
Join Email List
You can subscribe to our weekly seminar email list by sending an email to firstname.lastname@example.org with the words "subscribe redwood" in the body of the message. (Note: the subject line can be arbitrary and will be ignored.)