Guy Isely


PhD Student

Neuroscience

Sommer Lab

Current Research

Most current training algorithms for recurrent neural networks place hard limits on the timescales over which learning can occur. While network architectures exist that can operate over the multiple timescales of an experience, they are not well matched with training algorithms that can learn across those timescales. Insights from the study of hippocampal replay suggest novel multi-timescale training algorithms.
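The hard timescale limit mentioned above can be seen in truncated backpropagation through time, a standard RNN training method: gradients are cut off after a fixed window, so inputs older than the window receive exactly zero credit. A minimal sketch of this effect for a toy linear RNN (all names and values here are hypothetical, chosen only for illustration):

```python
import numpy as np

# Toy linear RNN: h_t = w * h_{t-1} + x_t, with "loss" = h_T.
# Under truncated backprop through time (window K), gradient flow
# stops after K steps, so inputs more than K steps in the past get
# exactly zero gradient -- a hard limit on learnable timescales.

def grad_wrt_inputs(w, T, K):
    """d(h_T)/d(x_t) for t = 0..T, truncating backprop after K steps."""
    grads = np.zeros(T + 1)
    for t in range(T + 1):
        steps_back = T - t
        if steps_back < K:          # within the truncation window
            grads[t] = w ** steps_back
        # else: gradient truncated to exactly 0
    return grads

g = grad_wrt_inputs(w=0.9, T=10, K=3)
print(g)  # only the last K inputs receive nonzero gradient
```

With `K=3`, only the three most recent inputs influence the parameter update, no matter how strong the true long-range dependency is; an architecture built for multiple timescales gains nothing from dependencies the training algorithm cannot see.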

Background

I studied philosophy as an undergraduate and spent a lot of time thinking about the problem of free will. While philosophers as early as Hume had a fairly rich definition of freedom, our modern foray into understanding the brain will no doubt help us refine the notion of freedom. I also spent a fair bit of time thinking about language understanding and concluded that perceptual grounding was a key missing piece for semantics. Fortunately, neural network models are already beginning to provide that grounding.

For fun I like to have dance parties with my housemate’s dogs.