I will present four topics from the theory of brain computation: Memory/Encoding, Invariance, Behavioral Rhythms, and Language. Each topic is paired with a mathematical model of the relevant processing: Discrete Recurrent Neural Nets (DRNNs) and Rate-Distortion Theory, the Bispectrum from applied group theory, Renewal Processes from time-series modeling, and Hyperdimensional Computing. To appeal to the broadest possible audience, I will not dwell on the technical details of these methods, focusing instead on high-level concepts and explicit examples. Details can be found in the references below.
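As a taste of the first topic, here is a minimal sketch of a binary Hopfield associative memory. It uses the classical Hebbian outer-product storage rule for illustration only; the references describe a different (minimum-probability-flow) training method, and all names and parameters here are my own choices.

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product rule; patterns is an (m, n) array of +/-1."""
    _, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates until a fixed point (or a step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

n = 64
rng = np.random.default_rng(0)
stored = rng.choice([-1, 1], size=(3, n))  # three random memories
W = train(stored)

# Corrupt a stored pattern in 6 positions, then let the dynamics clean it up.
probe = stored[0].copy()
flip = rng.choice(n, size=6, replace=False)
probe[flip] *= -1
print(np.array_equal(recall(W, probe), stored[0]))
```

At this low memory load (3 patterns in 64 units), the corrupted probe falls back into the basin of the stored pattern; the papers above study how far this capacity and robustness can be pushed.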
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
References:
Efficient and optimal binary Hopfield associative memory storage using minimum probability flow https://arxiv.org/abs/1204.2916
Robust exponential memory in Hopfield networks https://arxiv.org/abs/1411.4625
A Hopfield recurrent neural network trained on natural images performs state-of-the-art image compression http://ieeexplore.ieee.org/abstract/document/7025831/
Exploring discrete approaches to lossy compression schemes for natural image patches http://ieeexplore.ieee.org/abstract/document/7362782/
Maximum entropy distributions on graphs https://arxiv.org/abs/1301.3321
When is sparse dictionary learning well-posed? https://arxiv.org/abs/1606.06997
A novel set of rotationally and translationally invariant features for images based on the non-commutative bispectrum https://arxiv.org/abs/cs/0701127
Informational and Causal Architecture of Discrete-Time Renewal Processes https://arxiv.org/abs/1408.6876
What We Mean When We Say “What’s the Dollar of Mexico?”: Prototypes and Mapping in Concept Space https://pdfs.semanticscholar.org/f477/232c0a0835dcbc4fc6b6283db484695482f9.pdf (code: https://github.com/qualiaphile/hypercomputing)