Hyperdimensional Computing (HDC) is an emerging computational paradigm that represents compositional information as high-dimensional vectors, with promising applications ranging from machine learning to neuromorphic computing. One of the longstanding challenges in HDC is decomposing compositional information into its constituent factors, known as the recovery problem.
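To fix intuition, here is a minimal sketch of HDC composition and the recovery problem, assuming bipolar (±1) hypervectors with elementwise-product binding, majority-sign bundling, and brute-force similarity search for recovery; the factor names and parameters are illustrative only and are not taken from the talk or paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # high dimension makes independent random vectors quasi-orthogonal

# Codebooks for two illustrative factors, "color" and "shape"
colors = {name: rng.choice([-1, 1], size=d) for name in ["red", "green", "blue"]}
shapes = {name: rng.choice([-1, 1], size=d) for name in ["circle", "square", "star"]}

# Compose "red circle" and "blue star" into one hypervector:
# binding = elementwise product, bundling = sign of the sum (ties broken randomly)
bound1 = colors["red"] * shapes["circle"]
bound2 = colors["blue"] * shapes["star"]
scene = np.sign(bound1 + bound2 + rng.choice([-1, 1], size=d))

# Recovery problem: which (color, shape) pairs are present in `scene`?
# Brute force scores every candidate pair by normalized dot-product similarity;
# efficient recovery algorithms aim to avoid exactly this exhaustive search.
def similarity(u, v):
    return u @ v / d

scores = {(c, s): similarity(scene, colors[c] * shapes[s])
          for c in colors for s in shapes}
best = sorted(scores, key=scores.get, reverse=True)[:2]
print(best)  # the two composed pairs score highest
```

The exhaustive search above costs time proportional to the product of all codebook sizes, which is exactly what makes the recovery problem expensive as the number of factors grows.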
Coding theory is a branch of information theory that studies the mathematics of information transmission through noisy channels. This talk will show that linear codes, a well-studied coding-theoretic concept, are well suited for HDC. In particular, their rich algebraic structure simplifies and unifies many use cases of HDC; more importantly, they yield very efficient algorithms for the recovery problem, which often outperform existing ones by orders of magnitude.
The talk will be self-contained; no knowledge of coding or information theory will be assumed. It is based on the arXiv paper https://arxiv.org/abs/2403.03278, which will appear in Neural Computation (MIT Press).
Short bio: Netanel Raviv is an Assistant Professor with the Department of Computer Science and Engineering, Washington University in St. Louis, St. Louis, MO. He received a Ph.D. in computer science from the Technion–Israel Institute of Technology in 2017. His research interests include applications of coding and information theory to security, distributed computations, and machine learning.