The 27th KMGS will be held on Thursday, November 16th, at the Natural Science Building (E6-1), Room 1501.
The invited speaker is Sejun Park from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.
Slot (11:50 AM ~ 12:30 PM)
[Speaker] 박세준 (Sejun Park) from the Dept. of Mathematical Sciences, KAIST, supervised by Prof. 황강욱 (Ganguk Hwang)
[Title] Provable Ensemble Distillation based Federated Learning Algorithm
[Discipline] Machine Learning
[Abstract]
In this talk, we will primarily discuss the theoretical analysis of knowledge distillation-based federated learning algorithms. Before exploring the main topics, we will introduce the basic concepts of federated learning and knowledge distillation. We will then present a nonparametric view of knowledge distillation-based federated learning algorithms and introduce a generalization analysis of these algorithms based on the theory of regularized kernel regression methods.
[Language] Korean
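
For readers unfamiliar with the setting, the following is a minimal toy sketch of ensemble-distillation-based federated learning, where each local model and the server model is a regularized kernel regressor. It is an illustrative assumption-laden example, not the algorithm analyzed in the talk; the kernel choice, data, and constants are all hypothetical.

```python
# Toy sketch (not the speaker's algorithm): ensemble distillation in a
# federated setting, with every model being a kernel ridge regressor.
# Assumptions: 1-D regression data, an RBF kernel, and a shared unlabeled
# "public" dataset available to the server for distillation.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # K[i, j] = exp(-gamma * ||A_i - B_j||^2)
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return np.exp(-gamma * d2.sum(axis=-1))

class KernelRidge:
    """Regularized kernel regression: alpha = (K + lam * I)^{-1} y."""
    def __init__(self, lam=1e-2, gamma=1.0):
        self.lam, self.gamma = lam, gamma

    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self

    def predict(self, X_new):
        return rbf_kernel(X_new, self.X, self.gamma) @ self.alpha

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x[:, 0])          # ground-truth function

# Step 1: each client fits a local model on its own (private) data.
clients = []
for _ in range(5):
    X = rng.uniform(-2, 2, size=(30, 1))
    y = f(X) + 0.1 * rng.normal(size=30)
    clients.append(KernelRidge().fit(X, y))

# Step 2: the server queries every client model on a shared public set
# and averages the predictions (the "ensemble teacher").
X_pub = rng.uniform(-2, 2, size=(100, 1))
teacher = np.mean([c.predict(X_pub) for c in clients], axis=0)

# Step 3: the server distills the ensemble into a single global model
# by fitting it to the pseudo-labels produced by the ensemble.
server = KernelRidge().fit(X_pub, teacher)

X_test = rng.uniform(-2, 2, size=(200, 1))
print("test MSE of distilled server model:",
      np.mean((server.predict(X_test) - f(X_test)) ** 2))
```

Because the distilled server model is itself obtained by kernel ridge regression on the ensemble's pseudo-labels, its generalization behavior can be studied with tools from the theory of regularized kernel regression, which is the perspective the abstract alludes to.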