I’m a Ph.D. candidate in Computer Science & Engineering at Koç University and an AI Fellow at the KUIS AI Research Center. I am a senior member of the Machine Learning & Information Processing (MLIP) group, advised by Prof. Zafer Dogan.
My research develops a high-dimensional learning theory of how the statistics of learned representations arise from the interplay of architecture, data, and optimization, and how these statistics govern behaviors such as generalization, in-context learning, and robustness in modern deep learning.
Research interests
- Learning theory
- High-dimensional statistics
- Deep learning
- Gaussian universality
- Random matrix theory
- Robust learning
Currently on the postdoctoral job market (expected start: Fall 2026).
Selected publications
- Demir & Dogan. How Data Mixing Shapes In-Context Learning: Asymptotic Equivalence for Transformers with MLPs. NeurIPS 2025.
- Demir & Dogan. Asymptotic Analysis of Two-Layer Neural Networks after One Gradient Step under Gaussian Mixtures with Structure. ICLR 2025.
- Demir & Dogan. Optimal Attention Temperature Enhances ICL under Distribution Shift. arXiv 2025.
- Demir & Dogan. Random features outperform linear models: Effect of strong input-label correlation in spiked covariance data. arXiv 2024.
- More on Google Scholar.
Academic service
- Reviewer: ICLR (2025–26), NeurIPS (2025), ACML (2024–25).
Teaching
- Head TA: Introduction to Modern Learning Theory (2025–present); Programming for Engineers (2024–present).
- TA: Algorithms & Complexity (2023–24), Software Engineering (2023), Programming with Python (2022), Computer Systems & Programming (2020–22), Introduction to Computing (2017).