About the lab
Our group develops and adapts cutting-edge approaches from mathematics, machine learning, and physics to analyze learning, dynamics, computation, efficiency, and robustness in the brain.
We wish to uncover how high-level cognitive function emerges from the bottom up. What aspects of circuit architecture drive the emergence of function from simple constituents? What are the properties of memory systems constructed from noisy, leaky neurons, and how are limitations of the building blocks overcome? What are the structural biases — such as modularity — in brains that make them efficient and robust learners?
We find that the brain contains neural circuits with invariant low-dimensional dynamics that underlie fundamental computations such as integration, and surprising new analog error-correcting codes that enable fault-tolerant computation. We also find ways in which the brain outperforms modern AI in learning speed, data efficiency, and robustness, and we seek to transfer these insights to build better machine intelligence.
Recent news
- 2025-12
New preprints on modular connectivity, control between subsystems, and brain-wide processing loops are out — see the Papers page.
- 2025-09
Welcome to new lab members joining across postdoc, graduate, and affiliate roles.
- 2024-10
New papers on neural network scaling laws and modularity from Boopathy et al. accepted at top ML venues.
- 2021-04
NSF Graduate Research Fellowship awarded to Akhilan.
- 2021-02
Congrats to Akhilan and Aaditya — 2021 Hertz finalists.