Research Focus

Our group works at the intersection of Network Science (the study of complex systems using network modeling and graph-mining methods), Neuroscience, and Machine Learning.

In our first research thrust, we study the brain using computational models and network analysis methods, leveraging the rich connectome and brain-activity datasets that are becoming available.
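
As a purely illustrative sketch of the kind of network analysis mentioned above (the region names and edges below are hypothetical, not from any of our datasets), one can compute node degrees on a toy "connectome" graph to find candidate hub regions:

```python
# Toy network-analysis example (illustrative only): build an undirected
# graph of hypothetical brain regions and find the highest-degree hubs.
from collections import defaultdict

# Undirected edge list over hypothetical regions.
edges = [
    ("V1", "V2"), ("V1", "MT"), ("V2", "MT"),    # a visual cluster
    ("M1", "S1"), ("M1", "PMC"), ("S1", "PMC"),  # a sensorimotor cluster
    ("MT", "PMC"),                               # a bridge between clusters
]

# Build an adjacency list and compute each region's degree.
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

degree = {node: len(neighbors) for node, neighbors in adj.items()}
hubs = sorted(n for n, d in degree.items() if d == max(degree.values()))
print(hubs)  # the two "bridge" regions have the highest degree
```

In practice such analyses use much larger graphs and richer measures (centrality, community structure, and so on), typically via a dedicated library such as NetworkX rather than hand-rolled adjacency lists.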

In our second research thrust, we design novel neuro-inspired machine learning architectures that can learn in an unsupervised, adaptive, and robust manner. We believe that the solution to the problem of Artificial General Intelligence will be neuro-inspired.

Some representative papers and projects:

Conference on Neural Information Processing Systems (NeurIPS) 2023:
Neural Sculpting: Uncovering hierarchically modular task structure in neural networks through pruning and network analysis with Shreyas Malakarjun Patil and Loizos Michael. In NeurIPS 2023, December 2023.

International Conference on Machine Learning (ICML) 2022:
NISPA: Neuro-Inspired Stability-Plasticity Adaptation for Continual Learning in Sparse Networks with Mustafa Burak Gurbuz. In ICML 2022, July 2022. Code repo: GitHub link.

International Conference on Machine Learning (ICML) 2021:
PHEW: Constructing Sparse Networks that Learn Fast and Generalize Well without Training Data with Shreyas Malakarjun Patil. In ICML 2021, July 2021. Code repo: GitHub link.

International Joint Conference on Artificial Intelligence (IJCAI) 2021:
Unsupervised Progressive Learning and the STAM Architecture with James Smith, Cameron Taylor, and Seth Baer. In IJCAI 2021, August 2021. Code repo: GitHub link.


Teaching (recently)


PhD students and PostDocs:

Former PhD and PostDoc advisees:

Funding and sponsors (recently)

Open-Source Code (recently)