Course page

Neuroscience 299: Computing with High-Dimensional Vectors - Fall 2021

This seminar will introduce an emerging computing framework based on using high-dimensional vectors to represent and manipulate symbols, data structures, and functions. This framework, known variously as Hyperdimensional Computing or Vector Symbolic Architectures (VSA), originated at the intersection of symbolic and connectionist approaches to Artificial Intelligence but has grown into a research area in its own right. In recent years, it has found a growing number of applications in perception, analogical reasoning, models of memory, and language processing. These applications, in turn, can help us understand how such functions are performed by distributed networks of neurons in the brain. The purpose of this seminar is to convey the framework and its recent developments to students across a wide range of disciplines, spanning neuroscience, computer science, electrical engineering, mathematics, and cognitive science.
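To give a flavor of the framework, here is a minimal sketch (in Python with NumPy; an illustration only, not course material) of two core operations most VSA models share: binding symbols by elementwise multiplication of random bipolar vectors, and superposing bound pairs by addition, followed by noisy retrieval with a dot-product similarity search. The symbol names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random vectors nearly orthogonal

def rand_vec():
    """Random bipolar (+1/-1) hypervector."""
    return rng.choice([-1, 1], size=D)

# Codebook of atomic symbols (hypothetical example vocabulary)
symbols = {name: rand_vec() for name in ["name", "age", "alice", "42"]}

# Bind key-value pairs (elementwise multiply), bundle them by addition
record = symbols["name"] * symbols["alice"] + symbols["age"] * symbols["42"]

# Unbind: multiplying by a key recovers a noisy version of its value,
# since x * x = 1 elementwise for bipolar vectors
noisy = record * symbols["name"]

# Clean up by nearest-neighbor search over the codebook
best = max(symbols, key=lambda s: np.dot(symbols[s], noisy))
print(best)  # -> "alice"
```

Because unrelated random vectors in 10,000 dimensions are nearly orthogonal, the cross terms act as small noise and the correct value dominates the similarity search.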

Instructors:
Denis Kleyko
Bruno Olshausen
Fritz Sommer
Pentti Kanerva

GSI: Chris Kymn
Office hours: Monday 4:30-5:30pm (Zoom link) or by appointment

Time: Wednesdays, 1-4pm, Evans 560

Grading: based on (1) regular attendance & participation (10%), (2) paper presentation (30%), (3) weekly short programming or writing assignments (60%).

  • Discussion questions and weekly writing/programming assignments should be submitted via the course Notion page.
  • Late weekly assignments will not be accepted, but only your best nine scores will count toward your final grade.
  • You may work in groups of 2-3 on weekly assignments, but please submit individual writeups and indicate whom you worked with.

Discussion forum: We have established a Piazza site where students can ask questions or propose topics for discussion.

Prerequisites: The seminar is intended for graduate or undergraduate students with a basic knowledge of linear or abstract algebra, probability theory, and elementary logic, as well as basic programming skills.

Organizational Matters

  • All seminars will be held in person in Evans Hall, room 560.
  • The first seminar will occur on Wednesday, September 1 at 1 pm. (We will NOT meet on the first day of instruction, August 25.)
  • There will be an assignment after every seminar due the following week.
  • Each class session will be structured in two halves:
    • the first half will consist of student presentations on the previous week’s topic;
    • the second half will be a presentation on the current week’s topic given by one of the instructors or a guest lecturer.
  • Grading is based on the combination of assignments, student presentations, and class participation.

Schedule (provisional)

Date  | Week | Module                                                                    | Speaker(s)
9/1   | 1    | Introduction to computing with high-dimensional vectors                   | Bruno Olshausen & Pentti Kanerva
9/8   | 2    | Overview of different HD Computing/VSA models                             | Denis Kleyko
9/15  | 3    | Semantic vectors                                                          | Ryan Moughan
9/22  | 4    | Representation and manipulation of data structures                        | Denis Kleyko
9/29  | 5    | Resonator Networks                                                        | Paxon Frady
10/6  | 6    | Analogical reasoning*                                                     | TBA
10/13 | 7    | Connections to information theory*                                        | Fritz Sommer
10/20 | 8    | Locality-preserving encodings: representing continuous values and functions | Chris Kymn
10/27 | 9    | Solving classification problems                                           | Laura Galindez
11/3  | 10   | Relations to neural networks                                              | Denis Kleyko
11/10 | 11   | Hardware implementations                                                  | TBA
11/17 | 12   | Applications: Communication                                               | Ping-Chen Huang
11/24 | —    | No class (Thanksgiving Holiday)                                           | —
12/1  | 13   | Discussion of module 12                                                   | To be announced

*These two lectures may swap dates.

Module 1 (9/1): Introduction

Module 2 (9/8): Surveying Different HD Computing/VSA Models

Module 3 (9/15): Semantic vectors

Module 4 (9/22): Data structures

  • Focus paper: Kleyko et al., Vector Symbolic Architectures as a Computing Framework for Nanoscale Hardware
  • Note for focus paper: You are encouraged to read the whole paper, but the main focus of our discussion will be on Sections IV & V.
  • Further recommended reading:
    • Yerxa, Anderson, Weiss: The Hyperdimensional Stack Machine
    • Joshi, Halseth, Kanerva: Language Geometry Using Random Indexing
    • Gayler, Levy: A Distributed Basis for Analogical Mapping: New frontiers in Analogy Research
    • Hannagan, Dupoux, Christophe: Holographic String Encoding
    • Bloom: Space/Time Trade-offs in Hash Coding with Allowable Errors
    • Kleyko, Rahimi, Gayler, Osipov: Autoscaling Bloom Filter: Controlling Trade-off between True and False Positives
    • Nickel, Rosasco, Poggio: Holographic Embeddings of Knowledge Graphs
    • Rachkovskij: Representation and Processing of Structures with Binary Sparse Distributed Codes
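As a small taste of this module's theme, sequences can be represented in superposition by tagging each position with repeated applications of a fixed random permutation, then probing a position by inverting the permutation. This is a sketch under assumed conventions (bipolar vectors, permutation-as-position), not the specific scheme of any one reading; all names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000
perm = rng.permutation(D)  # fixed random permutation marking "shift by one position"

def rand_vec():
    """Random bipolar (+1/-1) hypervector."""
    return rng.choice([-1, 1], size=D)

def permute(v, times):
    """Apply the positional permutation `times` times."""
    for _ in range(times):
        v = v[perm]
    return v

alphabet = {c: rand_vec() for c in "abc"}

# Encode the sequence "cab": the symbol at position i gets i applications of perm,
# and all positions are superposed into a single vector
seq = sum(permute(alphabet[c], i) for i, c in enumerate("cab"))

# Query position 1: undo one application of perm, then clean up against the alphabet
inv = np.argsort(perm)   # inverse permutation: v[perm][inv] == v
probe = seq[inv]
best = max(alphabet, key=lambda c: np.dot(alphabet[c], probe))
print(best)  # -> "a" (the symbol at position 1 of "cab")
```

Since a random permutation of a random hypervector is nearly orthogonal to the original, the other positions contribute only noise to the similarity search, and the whole sequence lives in one fixed-width vector.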

Related resources
VSAOnline — Online Speakers’ Corner on Vector Symbolic Architectures and Hyperdimensional Computing