News

  2024.09   I’ll start my Ph.D. at Harvard University, prospectively advised by Prof. Sitan Chen. I’m thrilled to pursue my research career there!

  2024.04   Excited to announce my new paper, Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024). By proposing novel algorithms, we show that the optimal acceleration mechanism in minimax optimization and fixed-point problems is not unique. Surprisingly, our new algorithms are H-dual to the prior anchor-based accelerated methods: we discovered H-duality in yet another setting!

  2023.12   I attended NeurIPS 2023 and gave a poster presentation.

  2023.11   Excited to announce Mirror Duality in Convex Optimization, a joint work with MIT EECS and UW–Madison CS. This paper provides a novel perspective on gradient reduction in the mirror descent framework for Banach spaces, and we hope it opens the door to interesting questions on gradient reduction algorithms.

  2023.09   Our H-duality paper was accepted at NeurIPS 2023.

  2023.07   I attended ICML 2023 and gave an oral presentation (top 3 papers) at the workshop Duality Principles in Modern Machine Learning.

  2023.07   I started a research internship under Prof. Asuman Ozdaglar. We hope to extend H-duality, introduced in my paper, to various settings.

  2023.05   My first paper is now on arXiv! This is joint work with MIT EECS. It presents a new duality principle, H-duality: a duality between optimization algorithms that reduce function values and those that reduce gradient magnitude. To the best of our knowledge, this work is the first instance of a duality between optimization algorithms.

  2022.09   I joined the Optimization Research Group, led by Prof. Ernest Ryu, as a research intern.