Jaeyeon Kim

Ph.D. Student

Harvard University

Biography

Hello, my name is Jaeyeon Kim! I also go by Jay. I'm currently a first-year Ph.D. student at Harvard CS, advised by Sitan Chen and Sham Kakade.

At Harvard, my research focuses on deepening our understanding of modern generative models and pioneering new paradigms for generative modeling. My recent work on Masked Diffusion received an Outstanding Paper Award at ICML!

Previously, I earned a B.S. in Math at Seoul National University, where I worked on Optimization Theory with Professors Ernest Ryu and Asu Ozdaglar. Together, we developed H-duality, a novel duality framework for optimization algorithms (1, 2, 3). Earlier in my academic journey, I won multiple awards in Math Olympiads.

I was born and raised in South Korea. When I’m not working, you’ll probably catch me outside—traveling, working out, or running. I’m quick at learning new environments, both in real life and academically: as soon as I grasp the underlying structure, I translate it into my own words and turn it into novel insights.

Interests
  • Diffusion Generative Models
  • Optimization Theory
Education
  • Ph.D. in Computer Science, 2024

    Harvard University

  • B.S. in Mathematics, 2020

    Seoul National University

Recent News

  2025.07   My paper on Masked Diffusion received an Outstanding Paper Award at ICML 2025! This award was given to just 6 papers. (Award website, News article, LinkedIn post)

  2025.05   Two papers (Masked Diffusion, LoRA theory) were accepted at ICML 2025, both as oral presentations (Top 1.0%)!

  2024.09   I’m starting my Ph.D. at Harvard University, prospectively advised by Prof. Sitan Chen and Prof. Sham Kakade. I’m really thrilled to pursue my research career here!

  2024.08   I’m honored to be selected for the Ilju Foundation scholarship, which supports graduate students studying abroad.

  2024.04   Excited to announce my new paper, Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024, Spotlight, Top 3.5%). By proposing novel algorithms, we showed that the optimal acceleration mechanism in minimax optimization and fixed-point problems is not unique. Surprisingly, our new algorithms are H-dual to the prior anchor-based accelerated methods: we discovered H-duality in yet another setting!

  2023.12   I attended NeurIPS 2023 and gave a poster presentation.

Talks

  2025.09   Panel at Delta Podcast, where I talked about my recent paper on Masked Diffusion Models. Video, Tweet

  2025.07   In-person talk at ICCOPT 2025 on H-duality. Slides

  2025.05   Virtual seminar on Ph.D. applications for Korean students, hosted by Seoul National University. Slides

Collaborators

I enjoy working with my peers! If you are a junior student interested in working with me, don’t hesitate to reach out!

Chanwoo Park: Optimization theory, Oct 2022 - Oct 2023. (1, 2)
Junsu Kim: Theoretical analysis of LoRA fine-tuning, June 2024 - Jan 2025. (3)
Kiwhan Song: Diffusion models, Jan 2025 - Present
Brian Lee: Generative modeling for discrete data, April 2025 - Present
Gaspard Beaudouin: Image editing with diffusion models, April 2025 - Present