Jaeyeon Kim

Ph.D. Student

Harvard University

Biography

Hello, my name is Jaeyeon Kim! I’m a first-year Ph.D. student at Harvard CS, prospectively advised by Sitan Chen and Sham Kakade.

My academic journey started in junior high school with multiple awards in Math Olympiads, and I have been driven by solving challenging problems with mathematics ever since. I went on to study at Seoul National University, where I received my B.S. in Mathematics. I loved organizing mathematical concepts in my own words, and I earned a 4.3/4.3 in every math course I took!

There, I was fortunate to work with Prof. Ernest Ryu on optimization theory. I discovered H-duality, a duality between first-order algorithms. Because H-duality relates algorithms themselves, it is distinct from any previously known duality, and my subsequent work has extended it to various setups (Mirror Descent, fixed-point problems).

At Harvard, I have turned my attention to diffusion models. With amazing and supportive professors, my recent work on Masked Diffusion Models demystifies their training and inference. My broad goal is to develop efficient generative models for discrete data, as well as to mathematically characterize scientific phenomena in diffusion models.

Feel free to reach out to me! I'd be happy to chat about various topics in Machine Learning.

Interests
  • Optimization Theory
  • Science of Deep Learning
Education
  • Ph.D. in Computer Science, 2024

    Harvard University

  • B.S. in Mathematics, 2020

    Seoul National University

Recent News

  2024.09   I'm starting my Ph.D. at Harvard University, prospectively advised by Prof. Sitan Chen and Prof. Sham Kakade. I'm really thrilled to pursue my research career at Harvard!

  2024.08   I'm honored to have been selected for the Ilju Foundation scholarship, which supports graduate students studying abroad.

  2024.04   Excited to announce my new paper, Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024, Spotlight, Top 3.5%). By proposing novel algorithms, we showed that the optimal acceleration mechanism in minimax optimization and fixed-point problems is not unique. Surprisingly, our new algorithms are H-dual to the prior anchor-based accelerated methods: we discovered H-duality in yet another setup!

  2023.12   I attended NeurIPS 2023 and gave a poster presentation.

Experience

  • Research Intern at MIT

    Under Professor Asuman Ozdaglar
    July 2023 – August 2023
    Research on Optimization Theory

  • Research Intern at Seoul National University

    Under Professor Ernest Ryu
    September 2022 – August 2024
    Research on Optimization Theory