Using noise to probe recurrent neural network structure and prune connections

Rishidev Chaudhuri

UC Davis
Tuesday, February 28, 2023 at 12:00pm
Evans Hall Room 560 and via Zoom (see below to obtain Zoom link)

Many networks in the brain are sparsely connected, and the brain eliminates connections during development and learning. How could the brain decide which synapses to prune? In a recurrent network, determining the importance of a connection between two neurons is a difficult computational problem, depending on the roles the two neurons play and on all possible pathways of information flow between them.

Noise is ubiquitous in neural systems and is often considered an irritant to be overcome. Here we suggest that noise could play a functional role in pruning, allowing the brain to probe network structure and determine which connections are redundant. We construct a simple, local, unsupervised rule that either strengthens or prunes synapses using only the connection weight and the noise-driven covariance of the neighboring neurons. For a subset of linear and rectified-linear networks, we adapt matrix concentration-of-measure arguments from the field of graph sparsification to prove that this rule preserves the spectrum of the original weight matrix, and hence preserves network dynamics, even when the fraction of pruned connections asymptotically approaches 1. The plasticity rule is biologically plausible and may suggest a new role for noise in neural computation.
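
The abstract describes the rule only at a high level. As one way to make the ingredients concrete, the NumPy sketch below prunes a noise-driven linear recurrent network by sampling each synapse with a probability tied to its weight and to the covariance of the two neurons it connects, then reweights the survivors, in the spirit of graph sparsification by random edge sampling. The network size, dynamics, and the particular score used here are illustrative assumptions, not the rule analyzed in the talk.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear recurrent network (parameters are assumptions, not the talk's):
#   x_{t+1} = W x_t + noise_t
n, p_conn = 100, 0.3
W = np.where(rng.random((n, n)) < p_conn, rng.normal(0.0, 1.0, (n, n)), 0.0)
np.fill_diagonal(W, 0.0)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # scale for stable dynamics

# Drive the network with noise and estimate the pairwise activity covariance.
T, x = 5000, np.zeros(n)
samples = np.empty((T, n))
for t in range(T):
    x = W @ x + rng.normal(0.0, 1.0, n)
    samples[t] = x
C = np.cov(samples, rowvar=False)                  # C[i, j] = covariance of neurons i and j

# Hypothetical local importance score for synapse j -> i: its weight magnitude
# times the noise-driven covariance of the two neurons it connects.
score = np.abs(W) * np.abs(C)
score[W == 0] = 0.0

# Keep each synapse independently with probability proportional to its score
# (capped at 1), and strengthen kept synapses by 1 / p_keep so the weight matrix
# is preserved in expectation, as in sparsification by random sampling.
target_frac = 0.3                                  # desired fraction of synapses kept
p_keep = np.clip(target_frac * score / score[W != 0].mean(), 0.0, 1.0)
mask = (rng.random((n, n)) < p_keep) & (W != 0)
W_pruned = np.where(mask, W / np.maximum(p_keep, 1e-12), 0.0)

print("kept fraction:", np.count_nonzero(W_pruned) / np.count_nonzero(W))
print("spectral radius before / after:",
      np.max(np.abs(np.linalg.eigvals(W))),
      np.max(np.abs(np.linalg.eigvals(W_pruned))))

In this toy setting the pruned network retains a spectral radius close to the original's, which is the kind of spectrum preservation the abstract refers to; the talk's result concerns the full spectrum and uses a specific, provably correct local rule rather than the ad hoc score above.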