In our first research thrust, we study the brain using computational models and network-analysis methods, leveraging the rich datasets now becoming available on the brain's connectome and activity.
In our second research thrust, we design neuro-inspired machine learning architectures that can learn in an unsupervised, adaptive, and robust manner. We believe that the solution to the problem of Artificial General Intelligence will be neuro-inspired.
Conference on Lifelong Learning Agents (CoLLAs) 2024:
Patch-Based Contrastive Learning and Memory Consolidation for Online Unsupervised Continual Learning
with Cameron Taylor and Vassilis Vassiliades.
Conference on Neural Information Processing Systems (NeurIPS) 2023:
Neural Sculpting: Uncovering hierarchically modular task structure in neural networks through pruning and network analysis
with Shreyas Malakarjun Patil and Loizos Michael.
International Conference on Machine Learning (ICML) 2022:
NISPA: Neuro-Inspired Stability-Plasticity Adaptation for Continual Learning in Sparse Networks
with Mustafa Burak Gurbuz. Code repo: GitHub link.
International Conference on Machine Learning (ICML) 2021:
PHEW: Constructing Sparse Networks that Learn Fast and Generalize Well without Training Data
with Shreyas Malakarjun Patil. Code repo: GitHub link.
Former PhD and postdoc advisees: