Research Focus

Our group works at the intersection of Network Science (the study of complex systems using network modeling and graph mining methods), Neuroscience, and Machine Learning.

In our first research thrust, we study the brain using computational models and network analysis methods, leveraging the rich connectome and neural-activity datasets that are becoming available.

In our second research thrust, we design novel, neuro-inspired machine learning architectures that can learn in an unsupervised, adaptive, and robust manner. We believe that the path to Artificial General Intelligence will be neuro-inspired.


Some representative papers and projects:

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024:
NICE: Neurogenesis Inspired Contextual Encoding for Replay-free Class Incremental Learning with Mustafa Burak Gurbuz and Jean Michael Mooreman.

Conference on Lifelong Learning Agents (CoLLAs), 2024:
Patch-Based Contrastive Learning and Memory Consolidation for Online Unsupervised Continual Learning with Cameron Taylor and Vassilis Vassiliades.

Conference on Neural Information Processing Systems (NeurIPS), 2023:
Neural Sculpting: Uncovering hierarchically modular task structure in neural networks through pruning and network analysis with Shreyas Malakarjun Patil and Loizos Michael.

International Conference on Machine Learning (ICML), 2022:
NISPA: Neuro-Inspired Stability-Plasticity Adaptation for Continual Learning in Sparse Networks with Mustafa Burak Gurbuz. Code repo: GitHub link.

International Conference on Machine Learning (ICML), 2021:
PHEW: Constructing Sparse Networks that Learn Fast and Generalize Well without Training Data with Shreyas Malakarjun Patil. Code repo: GitHub link.


Education


Teaching (recently)


Advisees

PhD students and PostDocs:

Former PhD and PostDoc advisees:


Funding and sponsors (recently)


Open-Source Code (recently)