[Notice] 28th KMGS on November 30 (Thu), 2023

The 28th KMGS will be held on November 30th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite Sangmin Lee from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 이상민 (Sangmin Lee) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 예종철 (Jong Chul Ye)
[Title] Data Topology and Geometry-dependent Bounds on ReLU Network Widths
[Discipline] Machine Learning
[Abstract]
While deep neural networks (DNNs) have been widely used in numerous applications over the past few decades, their underlying theoretical mechanisms remain incompletely understood. In this presentation, we propose a geometrical and topological approach to understanding how deep ReLU networks work on classification tasks. Specifically, we provide lower and upper bounds on neural network widths based on the geometrical and topological features of the given data manifold. We also prove that, irrespective of whether the mean squared error (MSE) loss or binary cross entropy (BCE) loss is employed, the loss landscape has no local minimum. (A minimal illustrative code sketch follows this notice.)
[Language] Korean (English upon request)
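For readers outside machine learning, the sketch below (PyTorch-style Python, purely illustrative and not the construction analyzed in the talk) shows the objects the abstract refers to: a ReLU classifier whose hidden width is the quantity the announced bounds concern, trained with either the BCE or the MSE loss. The class name TwoLayerReLU and the toy data are assumptions made for the example.

# Illustrative sketch only: a shallow ReLU classifier whose hidden width is
# the quantity the announced bounds concern, trained with BCE (or MSE) loss.
import torch
import torch.nn as nn

class TwoLayerReLU(nn.Module):
    def __init__(self, in_dim: int, width: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, width),  # hidden width: the object the bounds constrain
            nn.ReLU(),
            nn.Linear(width, 1),       # single logit for binary classification
        )

    def forward(self, x):
        return self.net(x)

# Toy data: labels determined by the geometry of the points (a stand-in for a data manifold).
x = torch.randn(256, 2)
y = (x.norm(dim=1, keepdim=True) > 1.0).float()

model = TwoLayerReLU(in_dim=2, width=16)
loss_fn = nn.BCEWithLogitsLoss()   # or nn.MSELoss() applied to sigmoid outputs
opt = torch.optim.SGD(model.parameters(), lr=1e-1)

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()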

[Notice] 27th KMGS on November 16 (Thu), 2023

The 27th KMGS will be held on November 16th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite Sejun Park from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 박세준 (Sejun Park) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 황강욱 (Ganguk Hwang)
[Title] Provable Ensemble Distillation based Federated Learning Algorithm
[Discipline] Machine Learning
[Abstract]
In this talk, we will primarily discuss the theoretical analysis of knowledge distillation-based federated learning algorithms. Before exploring the main topics, we will introduce the basic concepts of federated learning and knowledge distillation. Subsequently, we will present a nonparametric view of knowledge distillation-based federated learning algorithms and introduce a generalization analysis of these algorithms based on the theory of regularized kernel regression methods. (A schematic code sketch of the generic distillation step follows this notice.)
[Language] Korean
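As background, the following hedged sketch (PyTorch-style Python) illustrates the generic server-side ensemble-distillation step that such algorithms build on: the server averages the clients' soft predictions on an auxiliary unlabeled dataset and distills them into the global model. The function distill_round, its parameters, and the use of an unlabeled auxiliary set are assumptions made for illustration, not the specific algorithm analyzed in the talk.

# Generic ensemble-distillation aggregation step (illustrative sketch).
import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_round(global_model: nn.Module,
                  client_models: list,
                  unlabeled_x: torch.Tensor,
                  steps: int = 50,
                  lr: float = 1e-2,
                  temperature: float = 1.0):
    # Ensemble "teacher": average of the clients' soft predictions (kept fixed).
    with torch.no_grad():
        teacher_probs = torch.stack(
            [F.softmax(m(unlabeled_x) / temperature, dim=-1) for m in client_models]
        ).mean(dim=0)

    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        student_log_probs = F.log_softmax(global_model(unlabeled_x) / temperature, dim=-1)
        # Distillation loss: KL divergence between the fixed ensemble teacher and the student.
        loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
        loss.backward()
        opt.step()
    return global_model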

[Notice] 26th KMGS on November 2 (Thu), 2023

The 26th KMGS will be held on November 2nd, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite Minseong Kwon from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 권민성 (Minseong Kwon) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 황준묵 (Jun-Muk Hwang)
[Title] Complex Geometry Arising from Quaternionic Kähler Manifolds
[Discipline] Complex Geometry
[Abstract]
In this talk, I will introduce twistor theory, which connects complex geometry, Riemannian geometry, and algebraic geometry by producing a complex manifold, called the twistor space, from a quaternionic Kähler manifold. First, I will explain why quaternionic Kähler manifolds are natural objects of study from the viewpoint of holonomy theory in Riemannian geometry, and how twistor theory enables us to use algebraic geometry to study their geometry. Next, based on the realization of homogeneous twistor spaces as adjoint varieties, I will present a description of the compactified spaces of conics in adjoint varieties, which is motivated by twistor theory. (A standard formulation of the twistor construction is recalled after this notice.)
[Language] Korean
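For orientation, the LaTeX snippet below records a standard formulation of the twistor construction the abstract refers to (following Salamon's description; background material, not a result of the talk).

% Standard background: for a quaternionic Kähler manifold (M^{4n}, g), n >= 2,
% with holonomy contained in Sp(n)Sp(1), there is a parallel rank-3 subbundle
% \mathcal{G} \subset \mathrm{End}(TM), locally spanned by almost complex
% structures I, J, K satisfying the quaternion relations. The twistor space is
% the unit sphere bundle of \mathcal{G}:
\[
  Z \;=\; \bigl\{\, J \in \mathcal{G}_x \;:\; J^2 = -\mathrm{id}_{T_xM},\; x \in M \,\bigr\},
  \qquad
  S^2 \;\hookrightarrow\; Z \;\xrightarrow{\;\pi\;}\; M .
\]
% Z carries a natural integrable complex structure, and when the scalar
% curvature of M is positive, Z is a Fano manifold with a holomorphic contact
% structure; for the homogeneous examples mentioned in the abstract, Z is an
% adjoint variety.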

[Notice] 25th KMGS on October 5 (Thu), 2023

The 25th KMGS will be held on October 5th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite Yeongjong Kim from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 김영종 (Yeongjong Kim) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 백상훈 (Sanghoon Baek)
[Title] Online Learning in Markov Settings
[Discipline] Optimization
[Abstract]
In this talk, I will explain the setting of online convex optimization and the definitions of regret and constraint violation. I will then introduce various algorithms and their theoretical guarantees under different assumptions. Connections with topics in machine learning such as stochastic gradient descent, multi-armed bandits, and reinforcement learning will also be briefly discussed. (The standard definitions and the basic gradient update are recalled after this notice.)
[Language] Korean
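For reference, the LaTeX snippet below recalls the standard definitions from the abstract and the basic online (projected) gradient descent update; the cumulative constraint-violation measure is one common convention among several.

% Setting: at each round t = 1, ..., T the learner chooses x_t in a convex set
% \mathcal{X}, after which the convex loss f_t is revealed.
\[
  \mathrm{Regret}_T \;=\; \sum_{t=1}^{T} f_t(x_t) \;-\; \min_{x \in \mathcal{X}} \sum_{t=1}^{T} f_t(x),
  \qquad
  \mathrm{Violation}_T \;=\; \Bigl\| \Bigl[\, \sum_{t=1}^{T} g(x_t) \,\Bigr]_{+} \Bigr\|
\]
% for long-term constraints g(x) \le 0 (one common convention). Online
% projected gradient descent,
\[
  x_{t+1} \;=\; \Pi_{\mathcal{X}}\!\bigl( x_t - \eta_t \nabla f_t(x_t) \bigr),
\]
% attains \mathrm{Regret}_T = O(\sqrt{T}) for convex Lipschitz losses with
% step sizes \eta_t \propto 1/\sqrt{t} (Zinkevich, 2003).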

[Notice] 24th KMGS on September 21 (Thu), 2023

The 24th KMGS will be held on September 21st, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite Uihyeon Jeong from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 정의현 (Uihyeon Jeong) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 권순식 (Soonsik Kwon)
[Title] Quantized slow blow up dynamics for the energy-critical co-rotational wave maps problem
[Discipline] Analysis
[Abstract]
In this talk, we consider the blow-up dynamics of co-rotational solutions for energy-critical wave maps with the 2-sphere target. We briefly introduce the (2+1)-dimensional wave maps problem and its co-rotational symmetry, which reduces the full wave map to a (1+1)-dimensional semilinear wave equation. Under this symmetry, the problem has a unique explicit stationary solution, the so-called harmonic map. We then survey some of the works analyzing the long-term dynamics of the flow near the harmonic map. Among them, we focus on the smooth blow-up result corresponding to the stable regime. In particular, the case where the homotopy index is one differs in nature from the other cases, and this allows us to exhibit smooth blow-up with quantized blow-up rates corresponding to the excited regime. (The reduced equation and the harmonic map are recalled after this notice.)
[Language] Korean
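For reference, the LaTeX snippet below recalls the standard co-rotational reduction the abstract alludes to (usual normalization; background material, not a result of the talk).

% With the k-equivariant (co-rotational) ansatz
% \Phi(t,r,\theta) = (\sin u(t,r)\cos k\theta,\ \sin u(t,r)\sin k\theta,\ \cos u(t,r)),
% the (2+1)-dimensional wave map into S^2 reduces to the semilinear equation
\[
  \partial_t^2 u \;-\; \partial_r^2 u \;-\; \frac{1}{r}\,\partial_r u
  \;+\; \frac{k^2 \sin(2u)}{2 r^2} \;=\; 0 ,
\]
% whose explicit stationary solution (the harmonic map of the abstract) is
\[
  Q_k(r) \;=\; 2 \arctan\!\bigl( r^{k} \bigr),
\]
% with conserved energy, up to an overall normalization constant,
\[
  E(u, \partial_t u) \;=\; \int_0^{\infty} \Bigl( (\partial_t u)^2 + (\partial_r u)^2 + \frac{k^2 \sin^2 u}{r^2} \Bigr) r \, dr .
\]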