[Notice] 28th KMGS on November 30 (Thu), 2023

The 28th KMGS will be held on Thursday, November 30th, in Natural Science Building (E6-1), Room 1501.
We have invited Sangmin Lee from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 이상민 (Sangmin Lee) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 예종철 (Jong Chul Ye)
[Title] Data Topology and Geometry-dependent Bounds on ReLU Network Widths
[Discipline] Machine Learning
[Abstract]
While deep neural networks (DNNs) have been widely used in numerous applications over the past few decades, their underlying theoretical mechanisms remain incompletely understood. In this presentation, we propose a geometric and topological approach to understanding how deep ReLU networks work on classification tasks. Specifically, we provide lower and upper bounds on neural network widths based on the geometric and topological features of the given data manifold. We also prove that, irrespective of whether the mean squared error (MSE) loss or the binary cross-entropy (BCE) loss is employed, the loss landscape has no local minimum. (An illustrative code sketch follows this notice.)
[Language] Korean (English upon request)
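
For readers unfamiliar with the setting, here is a minimal sketch, not taken from the talk, of the kind of phenomenon the abstract describes: a dataset whose topology (one class enclosing the other) imposes a lower bound on the hidden width of a one-hidden-layer ReLU classifier. All models, widths, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch (not from the talk): the hidden width needed by a
# one-hidden-layer ReLU classifier depends on the topology of the data.
# Two concentric circles (class 0 enclosed by class 1) require a bounded
# decision region; in 2-D this takes at least d + 1 = 3 hidden units,
# so width 2 must fail. All hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

def concentric_circles(n=500):
    # Inner circle = class 0, outer circle = class 1 (a 2-D "data manifold").
    theta = 2 * torch.pi * torch.rand(2 * n)
    r = torch.cat([0.5 * torch.ones(n), 1.5 * torch.ones(n)])
    x = torch.stack([r * torch.cos(theta), r * torch.sin(theta)], dim=1)
    y = torch.cat([torch.zeros(n), torch.ones(n)])
    return x, y

x, y = concentric_circles()
for width in (2, 3, 16):
    net = nn.Sequential(nn.Linear(2, width), nn.ReLU(), nn.Linear(width, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    loss_fn = nn.BCEWithLogitsLoss()  # the BCE loss mentioned in the abstract
    for _ in range(2000):
        opt.zero_grad()
        loss_fn(net(x).squeeze(1), y).backward()
        opt.step()
    acc = ((net(x).squeeze(1) > 0).float() == y).float().mean().item()
    print(f"width={width:2d}  train accuracy={acc:.3f}")
```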

[Notice] 27th KMGS on November 16 (Thu), 2023

The 27th KMGS will be held on Thursday, November 16th, in Natural Science Building (E6-1), Room 1501.
We have invited Sejun Park from the Dept. of Mathematical Sciences, KAIST, as the speaker.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] 박세준 (Sejun Park) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 황강욱 (Ganguk Hwang)
[Title] Provable Ensemble Distillation based Federated Learning Algorithm
[Discipline] Machine Learning
[Abstract]
In this talk, we will primarily discuss the theoretical analysis of knowledge-distillation-based federated learning algorithms. Before exploring the main topics, we will introduce the basic concepts of federated learning and knowledge distillation. We will then present a nonparametric view of knowledge-distillation-based federated learning algorithms and a generalization analysis of these algorithms based on the theory of regularized kernel regression methods. (An illustrative code sketch follows this notice.)
[Language] Korean
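
To make the setup concrete, here is a minimal sketch, not the speaker's algorithm, of one round of a generic ensemble-distillation federated learning scheme. Kernel ridge regression stands in for the regularized kernel regression methods named in the abstract; the number of clients, the shared unlabeled set, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the speaker's algorithm): one round of generic
# ensemble-distillation federated learning. Each client fits a local model
# on private data; the server averages the clients' predictions on a shared
# unlabeled set (the "ensemble distillation" step) and fits a global model
# to those pseudo-labels. All data and hyperparameters are illustrative.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

def client_data(n=40):
    # Private data: noisy samples of a common target function.
    x = rng.uniform(-3, 3, size=(n, 1))
    y = np.sin(x).ravel() + 0.1 * rng.standard_normal(n)
    return x, y

clients = [client_data() for _ in range(5)]
x_public = np.linspace(-3, 3, 200).reshape(-1, 1)  # shared unlabeled set

# Local step: each client trains privately on its own data.
local_models = [
    KernelRidge(kernel="rbf", gamma=1.0, alpha=0.1).fit(x, y) for x, y in clients
]

# Server step: distill the ensemble's averaged predictions into one model.
pseudo_labels = np.mean([m.predict(x_public) for m in local_models], axis=0)
global_model = KernelRidge(kernel="rbf", gamma=1.0, alpha=0.1).fit(x_public, pseudo_labels)

print("global model at x=0:", global_model.predict([[0.0]])[0])
```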