This seminar will introduce an emerging computing framework based on using high-dimensional vectors to represent and manipulate symbols, data structures, and functions. This framework, known variously as Hyperdimensional Computing or Vector Symbolic Architectures (VSA), originated at the intersection of symbolic and connectionist approaches to Artificial Intelligence but has grown into a research area of its own. In recent years, a growing number of applications have appeared in perception, analogical reasoning, models of memory, and language processing. These applications can in turn help us understand how such functions are performed by distributed networks of neurons in the brain. The purpose of this seminar is to convey this framework and recent developments to students across a wide variety of disciplines spanning neuroscience, computer science, electrical engineering, mathematics, and cognitive science.
Instructors
Denis Kleyko (denkle@berkeley.edu)
Bruno Olshausen (baolshausen@berkeley.edu)
Fritz Sommer (fsommer@berkeley.edu)
Pentti Kanerva (pkanerva@berkeley.edu)
GSI: Chris Kymn (cjkymn@berkeley.edu)
Office hours: Monday 4:30-5:30pm or by appointment
Time: Wednesdays, 1-4pm, Evans 560
Grading: based on (1) regular attendance & participation (10%), (2) paper presentation (30%), (3) weekly short programming or writing assignments (60%).
- Late weekly assignments will not be accepted; however, only your best 9 scores will count toward your final grade.
- You may work in groups of 2-3 on weekly assignments, but please submit individual writeups and indicate whom you worked with.
Discussion forum: We have established a Piazza site where students can ask questions or propose topics for discussion.
Prerequisites: The seminar is intended for graduate or undergraduate students with basic knowledge of linear or abstract algebra, probability theory, and elementary logic, as well as basic programming skills.
Organizational Matters
- All seminars will be held in person in Evans Hall, room 560.
- The first seminar will occur on Wednesday, September 1 at 1 pm. (We will NOT meet on the first day of instruction, August 25.)
- There will be an assignment after every seminar due the following week.
- Each class session will be structured in two halves:
- the first half will consist of student presentations on the previous week’s topic;
- the second half will be a presentation on the current week’s topic given by one of the instructors or a guest lecturer.
- Grading is based on a combination of assignments, student presentations, and class participation.
Schedule (provisional)
| Date | Week | Module | Speaker(s) |
| --- | --- | --- | --- |
| 9/1 | 1 | Introduction to computing with high-dimensional vectors | Bruno Olshausen & Pentti Kanerva |
| 9/8 | 2 | Overview of different HD Computing/VSA models | Denis Kleyko |
| 9/15 | 3 | Semantic vectors | Ryan Moughan |
| 9/22 | 4 | Representation and manipulation of data structures | Denis Kleyko |
| 9/29 | 5 | Resonator Networks | Paxon Frady |
| 10/6 | 6 | Analogical reasoning | Ross Gayler |
| 10/13 | 7 | Connections to information theory | Fritz Sommer |
| 10/20 | 8 | Locality-preserving encodings: representing continuous values and functions | Chris Kymn |
| 10/27 | 9 | Solving classification problems | Laura Galindez Olascoaga |
| 11/3 | 10 | Relations to neural networks | Denis Kleyko |
| 11/10 | 11 | Hardware implementations | Mohamed Ibrahim |
| 11/17 | 12 | Applications: Communication | Ping-Chen Huang |
| 11/24 | | No class (Thanksgiving Holiday) | |
| 12/1 | 13 | Discussion of Module 12 / To be announced | |
Module 1 (9/1): Introduction to Computing with High-Dimensional Vectors
- Lecture slides (Olshausen, Kanerva) and recordings (Olshausen, Kanerva)
- Focus paper: Kanerva, Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors
- Assignments for next session (9/8):
- (Recommended) Join Piazza site
- Fill out presentation preferences sheet (by 9/3)
- Submit 2 discussion questions to the Notion page (to be posted)
- Complete Assignment 1
- Further recommended reading:
- Neubert, Schubert, Protzel: An Introduction to Hyperdimensional Computing for Robotics
- Kanerva: Computing with High-Dimensional Vectors
- Gayler: Vector Symbolic Architectures Answer Jackendoff’s Challenges for Cognitive Neuroscience
- Fodor, Pylyshyn: Connectionism and Cognitive Architecture: A Critical Analysis
- Plate: Distributed Representations and Nested Compositional Structure
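To make "computing with high-dimensional vectors" concrete before the first assignment, here is a minimal NumPy sketch of the core operations in the bipolar model from the focus paper: random hypervectors for atomic symbols, binding by elementwise multiplication, bundling by addition, and comparison by cosine similarity. All names and parameters are illustrative, not part of the course materials.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # at high dimension, independent random vectors are nearly orthogonal

def hv():
    """A random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Atomic symbols
NAME, CAPITAL = hv(), hv()
SWEDEN, STOCKHOLM = hv(), hv()

# Bind role-filler pairs (elementwise multiply) and bundle them (add)
record = NAME * SWEDEN + CAPITAL * STOCKHOLM

# Unbinding with a role recovers a noisy copy of its filler
query = record * CAPITAL
print(cos(query, STOCKHOLM))  # high, about 0.7
print(cos(query, SWEDEN))     # near 0
```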
Module 2 (9/8): Surveying Different HD Computing/VSA Models
- Lecture slides and recording
- Focus paper: Schlegel, Neubert, Protzel: A Comparison of Vector Symbolic Architectures
- Assignments: Submit Assignment 2 and discussion questions by next session (9/15)
- Further recommended reading:
- Plate: Holographic Reduced Representations
- Kanerva: Fully Distributed Representation
- Gayler: Multiplicative Binding, Representation Operators & Analogy
- Gallant, Okaywe: Representing Objects, Relations & Sequences
- Rachkovskij, Kussul: Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning
- Laiho, Poikonen, Kanerva, Lehtonen: High-Dimensional Computing with Sparse Vectors
- Smolensky: Tensor Product Variable Binding and the Representation of Symbolic Structures in Connectionist Systems
- Mizraji: Context-Dependent Associations in Linear Distributed Memories
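To complement the comparison in the focus paper, the sketch below contrasts binding in two of the surveyed models: XOR binding in Binary Spatter Codes, which is exactly self-inverse, and circular convolution in Holographic Reduced Representations, which unbinds only approximately. This is an illustrative sketch under the standard definitions, not code from the course.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

# Binary Spatter Codes (Kanerva): dense binary vectors, XOR binding
a, b = rng.integers(0, 2, D), rng.integers(0, 2, D)
bound = a ^ b                        # binding is XOR
assert np.array_equal(bound ^ b, a)  # XOR is self-inverse: unbinding is exact

# Holographic Reduced Representations (Plate): real vectors, circular convolution
x = rng.normal(0, 1 / np.sqrt(D), D)
y = rng.normal(0, 1 / np.sqrt(D), D)
bound = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(y), D)  # circular convolution

# Approximate unbinding: convolve with the involution of y (circular correlation)
y_inv = np.concatenate(([y[0]], y[:0:-1]))
x_hat = np.fft.irfft(np.fft.rfft(bound) * np.fft.rfft(y_inv), D)
print(x_hat @ x / (np.linalg.norm(x_hat) * np.linalg.norm(x)))  # about 0.7
```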
Module 3 (9/15): Semantic vectors
- Lecture slides and recording
- Assignments: Submit Assignment 3 and discussion questions by next session (9/22)
- Python notebook for programming assignment
- Focus paper: Jones, Mewhort: Representing Word Meaning and Order Information in a Composite Holographic Lexicon
- Further recommended reading:
- Mikolov, Sutskever, Chen, Corrado, Dean: Distributed Representations of Words and Phrases and their Compositionality
- Pennington, Socher, Manning: GloVe: Global Vectors for Word Representation
- Sahlgren: An Introduction to Random Indexing
- Sahlgren, Holst, Kanerva: Permutations as a Means to Encode Order in Word Space
- Recchia, Sahlgren, Kanerva, Jones: Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation
- Sutor, Summers-Stay, Aloimonos: A Computational Theory for Life-long Learning of Semantics
- Kelly, Mewhort, and West: The Memory Tesseract: Mathematical Equivalence between Composite and Separate Storage Memory Models
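As a warm-up for this module's methods, here is a toy version of random indexing: every word gets a fixed random index vector, and a word's semantic vector accumulates the index vectors of the words it co-occurs with, so words in similar contexts acquire similar vectors. The two-sentence corpus and dense bipolar index vectors are simplifications for brevity (sparse ternary index vectors are typical in practice).

```python
import numpy as np

rng = np.random.default_rng(2)
D = 2_000
corpus = "the king rules the land the queen rules the land".split()

# Fixed random index vector per word (dense bipolar here for brevity)
index = {w: rng.choice([-1, 1], D) for w in set(corpus)}
sem = {w: np.zeros(D) for w in set(corpus)}

# Accumulate the index vectors of neighbors within a +/-1 window
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            sem[w] += index[corpus[j]]

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cos(sem["king"], sem["queen"]))  # identical contexts here, so cosine 1.0
print(cos(sem["king"], sem["land"]))   # contexts overlap only via "the": lower
```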
Module 4 (9/22): Data structures
- Lecture slides and recording
- Assignments: Submit Assignment 4 and discussion questions by next session (9/29)
- Python notebook for programming assignment
- Focus paper: Kleyko et al.: Vector Symbolic Architectures as a Computing Framework for Nanoscale Hardware
- Note for focus paper: You are encouraged to read the whole paper, but the main focus of our discussion will be on Sections IV & V.
- Further recommended reading:
- Yerxa, Anderson, Weiss: The Hyperdimensional Stack Machine
- Joshi, Halseth, Kanerva: Language Geometry Using Random Indexing
- Gayler, Levy: A Distributed Basis for Analogical Mapping: New frontiers in Analogy Research
- Hannagan, Dupoux, Christophe: Holographic String Encoding
- Bloom: Space/Time Trade-offs in Hash Coding with Allowable Errors
- Kleyko, Rahimi, Gayler, Osipov: Autoscaling Bloom Filter: Controlling Trade-off between True and False Positives
- Nickel, Rosasco, Poggio: Holographic Embeddings of Knowledge Graphs
- Rachkovskij: Representation and Processing of Structures with Binary Sparse Distributed Codes
- Neubert, Protzel: Towards Hypervector Representations for Learning and Planning with Schemas
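One data structure from this module in miniature: a sequence can be encoded by tagging each element with a position-dependent permutation (here a cyclic shift) and bundling the tagged elements; undoing the shift makes any position queryable. Names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10_000
letters = {c: rng.choice([-1, 1], D) for c in "abcdefghijklmnopqrstuvwxyz"}

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def encode(word):
    """A sequence as the sum of position-tagged (cyclically shifted) letters."""
    return np.sum([np.roll(letters[c], i) for i, c in enumerate(word)], axis=0)

def letter_at(v, i):
    """Undo the position-i shift, then look up the nearest atomic letter."""
    probe = np.roll(v, -i)
    return max(letters, key=lambda c: cos(probe, letters[c]))

v = encode("cat")
print([letter_at(v, i) for i in range(3)])  # ['c', 'a', 't']
```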
Module 5 (9/29): Resonator networks
- Lecture slides and recording
- Focus paper: Frady et al., “Resonator circuits” & “Resonator Networks, 1”
- Assignments: Submit Assignment 5 and discussion questions by next session (10/6)
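For orientation before the focus papers, a minimal sketch of resonator dynamics: given the elementwise product of one unknown vector from each of two codebooks, the network alternately unbinds its current guess of one factor, cleans up against the other codebook, and binarizes, until both factors lock in. Codebook sizes and iteration counts here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
D, M = 10_000, 25  # dimension and codebook size (odd, so sums never tie at zero)

X = rng.choice([-1, 1], (M, D))  # codebook for the first factor
Y = rng.choice([-1, 1], (M, D))  # codebook for the second factor
s = X[3] * Y[7]                  # composite to be factored

# Initialize each estimate with the superposition of its whole codebook
x_hat, y_hat = np.sign(X.sum(0)), np.sign(Y.sum(0))

# Alternate: unbind with the other estimate, project onto the codebook, binarize
for _ in range(20):
    x_hat = np.sign(X.T @ (X @ (s * y_hat)))
    y_hat = np.sign(Y.T @ (Y @ (s * x_hat)))

print(np.argmax(X @ x_hat), np.argmax(Y @ y_hat))  # recovers the factors: 3 7
```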
Module 6 (10/6): Analogical reasoning
- Lecture slides and recording
- Focus paper: Gayler, Levy: A Distributed Basis for Analogical Mapping
- Assignments: Submit Assignment 6 and discussion questions by next session (10/13)
- Further recommended reading (annotated list)
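A compact rendering of Kanerva's "What is the dollar of Mexico?" example, which previews the mapping idea in the focus paper: a single vector formed from two whole records maps fillers of one record onto the corresponding fillers of the other.

```python
import numpy as np

rng = np.random.default_rng(5)
D = 10_000
hv = lambda: rng.choice([-1, 1], D)
cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

NAME, CAPITAL, CURRENCY = hv(), hv(), hv()   # shared roles
USA, WDC, DOLLAR = hv(), hv(), hv()          # fillers for one record
MEX, MXC, PESO = hv(), hv(), hv()            # fillers for the other

usa = NAME * USA + CAPITAL * WDC + CURRENCY * DOLLAR
mexico = NAME * MEX + CAPITAL * MXC + CURRENCY * PESO

F = usa * mexico        # mapping vector between the two records as wholes
answer = DOLLAR * F     # "the dollar of Mexico"
print(cos(answer, PESO))  # the clear winner, about 0.33
print(cos(answer, MXC))   # near 0
```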
Module 7 (10/13): Information theory
- Lecture slides and recording
- Focus paper: Frady, Kleyko, Sommer: A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks
- Assignments: Submit Assignment 7 and discussion questions by next session (10/20)
- Further recommended reading:
- Plate: Distributed Representations and Nested Compositional Structure (Section 3.8 and Appendices B-E)
- Gallant, Okaywe: Representing Objects, Relations & Sequences (Section 5)
- Thomas, Dasgupta, Rosing: A Theoretical Perspective on Hyperdimensional Computing
- Mirus, Stewart, Conradt: Analyzing the Capacity of Distributed Vector Representations to Encode Spatial Information
- Kleyko, Rosato, Frady, Panella, Sommer: Perceptron Theory for Predicting the Accuracy of Neural Networks
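The capacity questions this module treats analytically can also be probed empirically. The sketch below (all parameters illustrative) bundles k random items and checks how many are still retrievable from the superposition, exhibiting the graceful degradation that the theory quantifies.

```python
import numpy as np

rng = np.random.default_rng(6)
D, M = 1_000, 1_000  # hypervector dimension and item-memory size

book = rng.choice([-1, 1], (M, D))

# Bundle k items; count how many land among the trace's k nearest codewords
for k in (5, 20, 80, 320):
    idx = rng.choice(M, k, replace=False)
    trace = book[idx].sum(0)
    top = set(np.argsort(book @ trace)[-k:])
    print(k, sum(int(i in top) for i in idx) / k)  # accuracy decays with k
```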
Module 8 (10/20): Locality-preserving encoding
- Lecture slides and recording
- Focus paper: Frady, Kleyko, Kymn, Olshausen, Sommer: Computing on Functions Using Randomized Vector Representations
- Assignments: Submit Assignment 8 and discussion questions by next session (10/27)
- Further recommended reading:
- Rachkovskij, Slipchenko, Kussul, Baidyk: Sparse Binary Distributed Encoding of Scalars
- Smith, Stanford: A Random Walk in Hamming Space
- Purdy: Encoding Data for HTM [Hierarchical Temporal Memory] Systems
- Räsänen: Generating Hyperdimensional Distributed Representations from Continuous Valued Multivariate Sensory Input
- Thomas, Dasgupta, Rosing: A Theoretical Perspective on Hyperdimensional Computing (Section 5)
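A short sketch of fractional power encoding, the locality-preserving scheme analyzed in the focus paper: a scalar x is encoded by raising a fixed random base vector (unit-modulus complex phasors) to the power x, so similarity between encodings falls off smoothly with the distance between the scalars. Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
D = 10_000
phase = rng.uniform(-np.pi, np.pi, D)  # the random base vector's phases

def encode(x):
    """The base vector raised elementwise to the fractional power x."""
    return np.exp(1j * phase * x)

def sim(a, b):
    return np.real(np.vdot(a, b)) / D

# Similarity decays smoothly with the distance between encoded scalars
for d in (0.0, 0.2, 0.5, 1.0, 3.0):
    print(d, round(sim(encode(0.0), encode(d)), 3))
```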
Module 9 (10/27): Solving classification problems
- Lecture slides and recording
- Focus paper: Rahimi, Kanerva, Benini, Rabaey: Efficient Biosignal Processing Using Hyperdimensional Computing: Network Templates for Combined Learning and Classification of ExG Signals
- Assignments: Submit Assignment 9 and discussion questions by next session (11/3)
- Python Notebooks (for part 1, for part 2)
- Further recommended reading:
- Ge, Parhi: Classification using Hyperdimensional Computing: A Review
- Neubert, Schubert, Protzel: An Introduction to Hyperdimensional Computing for Robotics
- Joshi, Halseth, Kanerva: Language Geometry Using Random Indexing
- Rahimi, Benatti, Kanerva, Benini, Rabaey: Hyperdimensional Biosignal Processing: A Case Study for EMG-based Hand Gesture Recognition
- Kleyko, Rahimi, Rachkovskij, Osipov, Rabaey: Classification and Recall with Binary Hyperdimensional Computing: Tradeoffs in Choice of Density and Mapping Characteristic
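A minimal prototype-based HD classifier in the spirit of the focus paper: each feature is quantized, bound to a per-feature role vector, and the results are bundled; a class prototype is simply the bundle of its training encodings, and test points go to the nearest prototype. The toy data and every parameter are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
D, F, L = 10_000, 5, 10  # dimension, number of features, quantization levels

roles = rng.choice([-1, 1], (F, D))   # one role vector per feature
levels = rng.choice([-1, 1], (L, D))  # one level vector per quantized value

def encode(x):
    """Bind each feature's role to its quantized level, then bundle."""
    q = np.clip((x * L).astype(int), 0, L - 1)  # features assumed in [0, 1)
    return np.sign((roles * levels[q]).sum(0))

# Toy two-class data; each class prototype bundles its training encodings
train0 = rng.uniform(0.0, 0.5, (51, F))
train1 = rng.uniform(0.5, 1.0, (51, F))
proto0 = np.sign(sum(encode(x) for x in train0))
proto1 = np.sign(sum(encode(x) for x in train1))

def classify(x):
    h = encode(x)
    return int(h @ proto1 > h @ proto0)

print(classify(rng.uniform(0.6, 0.9, F)))  # -> 1
```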
Module 10 (11/3): Relations to neural networks
- Lecture slides and recording*
- *Note: due to a technical problem, approximately 1 minute is missing at 00:31:55
- Focus paper: Danihelka, Wayne, Uria, Kalchbrenner, Graves: Associative Long Short-Term Memory
- Assignments: Submit Assignment 10 and discussion questions by next session (11/10)
- Further recommended reading:
- Ganesan et al.: Learning with Holographic Reduced Representations
- Karunaratne et al.: Robust high-dimensional memory-augmented neural networks
- Cheung, Terekhov, Chen, Agrawal, Olshausen: Superposition of many models into one
- Kleyko, Kheffache, Frady, Wiklund, Osipov: Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks
- Kleyko, Frady, Kheffache, Osipov: Integer Echo State Networks: Efficient Reservoir Computing for Digital Hardware
- Mitrokhin, Sutor, Summers-Stay, Fermuller, Aloimonos: Symbolic Representation and Learning with Hyperdimensional Computing
- Mirus, Blouw, Stewart, Conradt: An Investigation of Vehicle Behavior Prediction Using a Vector Power Representation to Encode Spatial Positions of Multiple Objects and Neural Networks
- “Kanerva machines”
  - Wu, Wayne, Graves, Lillicrap: The Kanerva Machine: A Generative Distributed Memory
  - Marblestone, Wu, Wayne: Product Kanerva Machines: Factorized Bayesian Memory
  - Ramapuram, Wu, Kalousis: Kanerva++: Extending the Kanerva Machine With Differentiable, Locally Block Allocated Latent Memory
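One recurring bridge in this module is networks with frozen random connections and a trained linear readout, as in the RVFL networks studied in the Kleyko et al. density-encoding paper above. The sketch below (toy data; everything illustrative) trains only the least-squares readout over a fixed random bipolar hidden layer.

```python
import numpy as np

rng = np.random.default_rng(9)
n, F, D = 1_000, 10, 2_000  # samples, input features, hidden units

W = rng.choice([-1, 1], (D, F))      # frozen random input weights
hidden = lambda X: np.sign(X @ W.T)  # bipolar hidden activations

X = rng.normal(size=(n, F))
y = (np.sin(2 * X[:, 0]) + X[:, 1] > 0).astype(float)  # nonlinear toy target

# Train only the linear readout (plus a bias column) by least squares
H = np.hstack([hidden(X), np.ones((n, 1))])
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

X_test = rng.normal(size=(n, F))
y_test = (np.sin(2 * X_test[:, 0]) + X_test[:, 1] > 0).astype(float)
H_test = np.hstack([hidden(X_test), np.ones((n, 1))])
print(((H_test @ beta > 0.5) == y_test).mean())  # well above chance
```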
Module 11 (11/10): Hardware implementations
- Lecture slides and recording
- Focus paper: Karunaratne et al.: In-Memory Hyperdimensional Computing
- Assignments: Submit Assignment 11 and discussion questions by next session (11/17)
- Further recommended reading:
  - Specialized Hardware
    - Rahimi, Kanerva, Rabaey: A robust and energy-efficient classifier using brain-inspired hyperdimensional computing
    - Imani et al.: Low-power sparse hyperdimensional encoder for language recognition
    - Menon et al.: A Highly Energy-Efficient Hyperdimensional Computing Processor for Wearable Multi-Modal Classification
  - Nearly General-Purpose Hardware
  - Design of Associative Memories
    - Imani et al.: Exploring hyperdimensional associative memory
  - NVM and Nanoscalable Paradigm (see also focus paper)
  - Other Brain-Inspired Hardware Models
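Part of why HD computing suits digital hardware shows up even in software: binary hypervectors pack into machine words, and the two workhorse operations, binding and distance, reduce to XOR and popcount. A small illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(10)
D = 8_192  # a multiple of 8, so vectors pack exactly into bytes

a = rng.integers(0, 2, D, dtype=np.uint8)
b = rng.integers(0, 2, D, dtype=np.uint8)
pa, pb = np.packbits(a), np.packbits(b)  # 1 bit per component

bound = pa ^ pb                                      # binding: bitwise XOR
assert np.array_equal(np.unpackbits(bound ^ pb), a)  # XOR unbinds exactly
hamming = np.unpackbits(pa ^ pb).sum()               # distance: popcount
print(hamming / D)  # about 0.5 for unrelated random vectors
```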
Module 12 (11/17): Communications
- Lecture slides and recording
- Focus paper: Kim: Hyper-Dimensional Modulation for Robust Low-Power Communications
- Assignments: Submit Assignment 12 and discussion questions by next session (12/1)
- Further recommended reading:
- Jakimovski, Becker, Sigg, Schmidtke, Beigl: Collective Computing for Dense Sensing Environments
- Hsu, Kim: Collision-Tolerant Narrowband Communication using Non-Orthogonal Modulation and Multiple Access
- Hsu, Kim: Non-Orthogonal Modulation for Short Packets in Massive Machine Type Communications
- Simpkin et al.: Constructing distributed time-critical applications using cognitive enabled services
- Simpkin et al.: Efficient Orchestration of Node-RED IoT Workflows using a Vector Symbolic Architecture
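A toy illustration of the modulation idea in the focus paper (not Kim's actual scheme): symbols map to long random codewords, and the redundancy of the high-dimensional code lets a simple correlation decoder survive heavy channel noise. All parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
D, M = 1_024, 256  # codeword length and alphabet size

book = rng.choice([-1.0, 1.0], (M, D))  # random codebook shared by both ends

sym = 42
received = book[sym] + rng.normal(0, 2.0, D)  # AWGN channel at roughly -6 dB SNR
decoded = int(np.argmax(book @ received))     # correlate against the codebook
print(decoded == sym)  # almost always True: the redundancy absorbs the noise
```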
Related resources
https://www.hd-computing.com/
VSAOnline — Online Speakers’ Corner on Vector Symbolic Architectures and Hyperdimensional Computing