Jaeyeon Kim

Ph.D. Student

Harvard University

Biography

Hello, my name is Jaeyeon Kim! I’m a first-year Ph.D. student at Harvard CS, advised by Sitan Chen and Sham Kakade.

My academic journey began in junior high school, where I won multiple awards in Math Olympiads. Since then, I've been driven by a passion for solving challenging problems through mathematics. I went on to earn a B.S. in Mathematics from Seoul National University, where I enjoyed organizing mathematical concepts in my own words. I received an A+ in every math course I took!

During my undergraduate years, I had the privilege of working on Optimization Theory with professors Ernest Ryu and Asu Ozdaglar. We developed H-duality, a novel duality framework for optimization algorithms. (1, 2, 3)

At Harvard, my research has shifted toward Diffusion Models. Working with incredible advisors, I have recently focused on Masked Diffusion Models, shedding light on their training and inference processes. My broader goal is to develop efficient generative models for discrete data while deepening our mathematical understanding of Diffusion Models.

I’m always happy to chat about Machine Learning, Optimization, and beyond—feel free to reach out!

Interests
  • Diffusion Generative Models
  • Optimization Theory
Education
  • Ph.D. in Computer Science, 2024

    Harvard University

  • B.S. in Mathematics, 2020

    Seoul National University

Recent News

  2025.05   Two papers (Discrete Diffusion, LoRA theory) were accepted at ICML 2025, both as oral presentations (top 1.0%)!

  2024.09   I’m starting my Ph.D. at Harvard University, prospectively advised by Prof. Sitan Chen and Prof. Sham Kakade. I’m really thrilled to pursue my research career at Harvard!

  2024.08   I’m honored to have been selected for the Ilju Foundation scholarship, which supports graduate students studying abroad.

  2024.04   Excited to announce my new paper, Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024, Spotlight, Top 3.5%). By proposing novel algorithms, we show that the optimal acceleration mechanism for minimax optimization and fixed-point problems is not unique. Surprisingly, our new algorithms are H-dual to the prior anchor-based accelerated methods: we discovered H-duality in yet another setting!

  2023.12   I attended NeurIPS 2023 and gave a poster presentation.

Collaborators

I enjoy working with my peers! If you are a junior student interested in working with me, don't hesitate to reach out!

Chanwoo Park: Optimization theory, Oct 2022 - Oct 2023. (1, 2)
Junsu Kim: Theoretical analysis of LoRA fine-tuning, June 2024 - Jan 2025. (3)
Kiwhan Song: Diffusion models, Jan 2025 - Present
Brian Lee: Generative modeling for discrete data, April 2025 - Present
Gaspard Beaudouin: Image editing with diffusion models, April 2025 - Present