Research Focus
We believe that the solution to Artificial General Intelligence will be neuro-inspired.
Some representative papers:
Transactions on Machine Learning Research (TMLR), 2025
How Can Knowledge of a Task’s Modular Structure Improve Generalization and Training Efficiency?
with Shreyas Malakarjun Patil and Cameron Ethan Taylor.
Transactions on Machine Learning Research (TMLR), 2025
Before Forgetting, There's Learning: Representation Learning Challenges in Online Unsupervised Continual Learning
with Cameron Taylor and Shreyas Malakarjun Patil.
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024
NICE: Neurogenesis Inspired Contextual Encoding for Replay-free Class Incremental Learning
with Mustafa Burak Gurbuz and Jean Michael Moorman.
Conference on Lifelong Learning Agents (CoLLAs), 2024
Patch-Based Contrastive Learning and Memory Consolidation for Online Unsupervised Continual Learning
with Cameron Taylor and Vassilis Vassiliades.
Conference on Neural Information Processing Systems (NeurIPS), 2023
Neural Sculpting: Uncovering hierarchically modular task structure in neural networks through pruning and network analysis
with Shreyas Malakarjun Patil and Loizos Michael.
International Conference on Machine Learning (ICML), 2022
NISPA: Neuro-Inspired Stability-Plasticity Adaptation for Continual Learning in Sparse Networks
with Mustafa Burak Gurbuz. Code repo: GitHub link.
International Conference on Machine Learning (ICML), 2021
PHEW: Constructing Sparse Networks that Learn Fast and Generalize Well without Training Data
with Shreyas Malakarjun Patil. Code repo: GitHub link.
Education
Teaching (recently)
Advisees
Funding and sponsors (before moving to Cyprus in 2023)
Open-Source Code (recently)