Ari Benjamin
Computational and theoretical neuroscience
I study the similarities – and differences – between artificial intelligence and the brain.
Research themes
What are the computational roles of cell types? Modern estimates put the number of cell types in the brain at between 5,000 and 10,000. What is the computational purpose of such diversity?
AI for single-cell biology Any theory of cellular function must be grounded in experimental knowledge. In reality, cellular biology is extremely complex and only beginning to be understood. I believe that AI methods are the right tools for uncovering the hidden structure of cellular neurobiology.
How does initial connectivity shape what can easily be learned? All animals easily learn some things but are stumped by other problems. What defines the line between easy and hard, 'natural' and 'unnatural' tasks? I use ideas from deep learning theory to understand learning biases, and look to experimental data for hypotheses. I operate under the hypothesis that learning modifies a genetically-encoded initial scaffold of long-range connectivity that defines brain area identities.
What are the algorithms of learning in the brain? How do cell types coordinate their plasticity such that animals learn what they do? And what is it that we are learning, exactly? I follow several literatures, including neural plasticity, deep learning optimization, credit assignment, and classical learning theory, and attempt to tie these together into a coherent picture of learning and development.
Current position
I work as a postdoc in the laboratory of Tony Zador at Cold Spring Harbor Laboratory. I completed my PhD with Konrad Kording at the University of Pennsylvania. My dissertation was titled "Machine learning as tool and theory for neuroscience."