Course page

VS265: Neural Computation - Fall 2020

This course provides an introduction to theories of neural computation, with an emphasis on the visual system. The goal is to familiarize students with the major theoretical frameworks and models used in neuroscience and psychology, and to provide hands-on experience in using these models. Topics include neural network models, principles of neural coding and information processing, self-organization (learning rules), recurrent networks and attractor dynamics, hierarchical models, and computing with distributed representations.

Instructor:  Bruno Olshausen, baolshausen@berkeley.edu, office hours immediately after class

GSI:  Sophia Sanborn, sanborn@berkeley.edu

Lectures:  Tuesdays & Thursdays 3:30-5, online.

Grading:  based on challenge problems (60%) and final project (40%)

  • Challenge problems will be posted on Notion.  See Instructions and Guidelines here.
  • Late challenge problems will not be accepted, but your lowest-scoring assignment will be dropped at the end of the semester.
  • Final project guidelines:
    • 5-page report + poster or oral presentation on project presentation day (Dec. 10).
    • You may work in teams of 3-4 students.
    • The project should explore one of the topics covered in class in more depth, either mathematically or computationally, or it can be a critical analysis of the prospects for how these approaches can inform our understanding of the brain.
    • Some possible project suggestions.

Textbooks:

  • [HKP] Hertz, J., Krogh, A., and Palmer, R.G. Introduction to the Theory of Neural Computation. Amazon
  • [DJCM] MacKay, D.J.C. Information Theory, Inference, and Learning Algorithms. Available online or Amazon
  • [DA] Dayan, P. and Abbott, L.F. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Amazon
  • [SL] Sterling, P. and Laughlin, S. Principles of Neural Design. MITCogNet

Discussion forum:  We have established a Piazza site where students can ask questions or propose topics for discussion.

 

Syllabus

Aug. 27:  Introduction

Sept. 1:  Brains

Sept. 3:  Neural mechanisms and models

Sept. 8:  Signal detection and amplification

  • Computing with chemistry/allostery
  • Phototransduction
  • Lecture slides and recording passcode: K0u#9Fja
  • Reading:  SL chapters 5, 6, 8

Sept. 10:  Physics of computation

Sept. 11:  Challenge problem 1 due 

Sept. 15, 17:  Neural coding

Sept. 22:  Retinal tiling

Sept. 24:  Auditory coding

Sept. 29:  Starburst amacrine cells and the computation of motion (Rowland Taylor)

Oct. 1:  Neural coding in Lateral Geniculate Nucleus (LGN) (Fritz Sommer)

Oct. 2:  Challenge problem 2 due 

Oct. 6, 8, 13:  Inference

Oct. 15, 20:  Organization, topography

Oct. 20:  Challenge problem 3 due 

Oct. 22, 27:  Attractor dynamics

Oct. 29:  Probabilistic models

Nov. 3:  Inference: denoising, dynamical models

Nov. 5, 10:  Hierarchical models and invariance

Nov. 6:  Challenge problem 4 due

Nov. 12:  Project mini-presentations

Nov. 17:  Perception-action loop

Nov. 19:  Supervised learning (and cerebellum)

Nov. 24:  Reinforcement learning and basal ganglia (Sophia)

Nov. 26:  Thanksgiving holiday (no class)

Dec. 1, 3:  Hyperdimensional computing

Dec. 8:  Neural oscillations and synchrony

  • Binding by synchrony
  • Coupled oscillator networks
  • Computing with waves

Dec. 10:  Project presentations