Donghwan Kim

About Me

I am an associate professor in the Department of Mathematical Sciences at KAIST.

Curriculum Vitae, Google Scholar profile

Current Research Interests

  • Optimization for Machine Learning

    • Convergence of (Stochastic) Gradient Methods in Minimax Optimization

  • Generalization in Machine Learning

    • Implicit Bias of Gradient Methods

  • Generative Models

Recent News

  • 24.09.25: We have one paper accepted to NeurIPS 2024.

  • 24.08.16: I started my sabbatical year at UC Berkeley.

  • 24.07.04: Yejun Kim and Dongchan Shin joined our group as M.S.-Ph.D. students.

  • 24.05.16: I gave a talk at KIAS under the title “How to make the gradient descent-ascent converge to local minimax optima.”

  • 24.05.02: We have two papers accepted to ICML 2024.

  • 24.04.19: I organized the “Optimization and Machine Learning” session at the 2024 KMS Spring Meeting.

  • 24.04.19: Sucheol Lee, the first Ph.D. graduate from our group, received an Excellent Dissertation Award from the Korean Mathematical Society (KMS).

  • 24.03.04: Jimyeong Kim joined our group as a postdoc.

  • 24.02.01: Kyungjae Lee joined our group as an M.S.-Ph.D. student.

  • 24.01.17: We have one paper accepted to ICLR 2024.

  • 24.01.02: We have one undergraduate student (Minhee Hong) doing individual study this winter semester.

  • 23.08.28: Munsik Kim joined our group as a Ph.D. student.

  • 23.07.28: I gave a talk at the KAI-X Mathematics Summer School under the title “The power of gradient descent in machine learning.”

  • 23.06.26: We have one graduate student (Keyan Shi, Technische Universität München) doing a KAI-X research internship this summer semester.

  • 23.06.21: We have one paper accepted to the open problems track of COLT 2023.

  • 23.06.19: We have two undergraduate students (Changmin Kang, Changhyun Ko) doing individual study this summer semester.

  • 23.05.31: I gave a talk at the SIAM Conference on Optimization under the title “Fast extra gradient methods for smooth structured nonconvex-nonconcave minimax problems.”

  • 23.05.19: I gave a talk at the KSIAM Spring Conference under the title “Two-timescale extragradient converges to local minimax points.”

  • 23.04.28: I gave a talk at the KMS Spring Meeting under the title “Two-timescale extragradient converges to local minimax points.”

  • 23.04.24: I gave a talk at the HKUST-KAIST-NUS Joint Workshop on Applied and Computational Mathematics under the title “Two-timescale extragradient converges to local minimax points.”

  • 23.04.05: I gave a talk at the KAIST BTM S&T Biz Colloquium under the title “AI learning and optimization.”

  • 23.03.31: I gave a talk on our recent research at the POSTECH IME seminar under the title “Gradient methods for minimax optimization.”

  • 23.01.13: We have two undergraduate students (Seong Bae Lim, Yujun Kim) doing individual study this winter semester.

  • 23.01.13: I will now start posting our group's recent news here.

Contact Info

donghwankim (at) kaist (dot) ac (dot) kr