[Notice] 15th KMGS on Dec. 1 (Thu), 2022

The 15th KMGS will be held on December 1st, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite a speaker, Yeongjong Kim (김영종), from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] Yeongjong Kim(김영종) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Sanghoon Baek (백상훈 교수님)
[Title] Linear algebraic groups and related structures
[Discipline] Algebraic Geometry
[Abstract] In this talk, I will give a brief introduction to what a linear algebraic group is and how it is structured. Then I will discuss Galois descent for linear algebraic groups. Finally, I will explain what a torsor is and how it relates to other algebraic structures.
[Language] Korean (English upon request)

[Notice] 14th KMGS on Nov. 17 (Thu), 2022

The 14th KMGS will be held on November 17th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite a speaker, Junseok Kim (김준석), from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] Junseok Kim (김준석) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Hyungryul Baik (백형렬 교수님)
[Title] Hyperbolicity in Groups
[Discipline] Geometric Group Theory
[Abstract] Geometric group theory concerns how geometric properties arise in finitely generated groups. Defining the Cayley graph of a finitely generated group with respect to a finite generating set gives a way to describe geometric properties of such groups. Once we have this geometric perspective, we can classify finitely generated groups up to quasi-isometry, since any two Cayley graphs of the same group are quasi-isometric. In this talk, we will explain some basic notions appearing in geometric group theory (for example, quasi-isometry, hyperbolic groups, and the Švarc–Milnor lemma) and some theorems related to (relative) hyperbolicity of groups.
[Language] Korean
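The growth-of-balls picture behind these notions can be computed directly. A minimal sketch (Python standard library only; the two example groups are my own illustration, not from the talk): BFS on a Cayley graph computes word-metric balls, and comparing ℤ² (quadratic growth) with the free group F₂ (exponential growth) shows a geometric property that is visible in the Cayley graph and preserved by quasi-isometry.

```python
from collections import deque

def ball_sizes(start, neighbors, radius):
    """BFS in a Cayley graph: number of group elements within each
    word-metric distance 0..radius of `start` (cumulative ball sizes)."""
    dist = {start: 0}
    q = deque([start])
    while q:
        g = q.popleft()
        if dist[g] == radius:
            continue  # don't expand past the requested radius
        for h in neighbors(g):
            if h not in dist:
                dist[h] = dist[g] + 1
                q.append(h)
    sizes = [0] * (radius + 1)
    for d in dist.values():
        sizes[d] += 1
    for i in range(1, radius + 1):  # spheres -> cumulative balls
        sizes[i] += sizes[i - 1]
    return sizes

# Z^2 with generators (±1, 0), (0, ±1): quadratic growth.
def z2_neighbors(g):
    x, y = g
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

# Free group F_2: reduced words in a, A = a^-1, b, B = b^-1 -- exponential growth.
def f2_neighbors(w):
    inv = {"a": "A", "A": "a", "b": "B", "B": "b"}
    return [w + s for s in "aAbB" if not (w and inv[s] == w[-1])]

print(ball_sizes((0, 0), z2_neighbors, 5))  # [1, 5, 13, 25, 41, 61]
print(ball_sizes("", f2_neighbors, 5))      # [1, 5, 17, 53, 161, 485]
```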

[Notice] 13th KMGS on Nov. 3 (Thu), 2022

The 13th KMGS will be held on November 3rd, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite a speaker, Seonghyuk Im (임성혁), from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] Seonghyuk Im (임성혁) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Jaehoon Kim (김재훈 교수님)
[Title] Large clique subdivisions in graphs without small dense subgraphs
[Discipline] Combinatorics
[Abstract] In extremal graph theory, one big question is finding a condition on the number of edges that guarantees the existence of a particular substructure in a graph. In the first half of this talk, I'll survey the history of such problems, focusing on clique subdivisions. In the second half, I'll introduce my recent result with Jaehoon Kim, Younjin Kim, and Hong Liu, which states that if a graph G has no small dense subgraph, then G has a clique subdivision of size almost linear in its average degree, and discuss some applications and further open questions.
[Language] Korean (English upon request)

[Notice] 12th KMGS on Oct. 6 (Thu), 2022

The 12th KMGS will be held on October 6th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite a speaker, Juhun Baik (백주헌), from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] Juhun Baik (백주헌) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Hyungryul Baik (백형렬 교수님)
[Title] Shift locus of cubic polynomials
[Discipline] Topology
[Abstract] This talk is about complex dynamics, which concerns the iteration of holomorphic maps (usually a rational map on the Riemann sphere); the shift locus is the set of polynomials all of whose critical points escape to infinity under iteration. Understanding the shape and topology of the shift locus has been a challenge for decades, with accumulated work by Blanchard, Branner, Hubbard, Keen, and McMullen; recently, Calegari introduced a nice lamination model. In this talk I will explain basic complex dynamics and introduce the topology of the shift locus of cubic polynomials following Calegari's paper 'Sausages and Butcher paper'. If time allows, I will end the talk with the connection to the big mapping class group, the MCG of the sphere minus a Cantor set.
[Language] Korean (English upon request)
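The defining condition of the shift locus (all critical points escape to infinity under iteration) can be tested numerically. A minimal sketch (Python; the escape radius and iteration cap are illustrative heuristics of my own, not from the talk): for a cubic p(z) = z³ + az + b, the critical points are ±√(-a/3), and we iterate each one to see whether it escapes.

```python
import cmath

def in_shift_locus(a, b, max_iter=200, escape_radius=1e6):
    """Heuristically test whether p(z) = z^3 + a*z + b lies in the shift
    locus, i.e. whether both critical points escape to infinity."""
    p = lambda z: z**3 + a * z + b
    c = cmath.sqrt(-a / 3)              # critical points of p are ±c
    for z in (c, -c):
        for _ in range(max_iter):
            z = p(z)
            if abs(z) > escape_radius:  # this orbit has escaped
                break
        else:
            return False                # this critical orbit stayed bounded
    return True

print(in_shift_locus(0, 0))   # z^3: the critical point 0 is fixed -> False
print(in_shift_locus(0, 2))   # orbit 0 -> 2 -> 10 -> 1002 -> ... -> True
```

Of course a finite iteration cap can only certify escape, not boundedness, so a `False` answer is a numerical verdict rather than a proof.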

[Notice] 11th KMGS on Sep. 29 (Thu), 2022

The 11th KMGS will be held on September 29th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite a speaker, Junyoung Park (박준영), from the Dept. of Mathematical Sciences, KAIST.
The abstract of the talk is as follows.

Slot (11:50 AM ~ 12:30 PM)
[Speaker] Junyoung Park (박준영) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Cheolwoo Park (박철우 교수님), Prof. Jeongyoun Ahn (안정연 교수님)
[Title] Kernel methods for radial transformed compositional data with many zeros
[Discipline] Statistics
[Abstract]
Compositional data analysis with a high proportion of zeros has gained increasing popularity, especially in chemometrics and human gut microbiome research. Statistical analyses of this type of data are typically carried out via a log-ratio transformation after replacing zeros with small positive values. We should note, however, that this procedure is geometrically improper, as it causes anomalous distortions through the transformation. We propose a radial transformation that does not require zero substitutions and, more importantly, results in essential equivalence between domains before and after the transformation. We show that a rich class of kernels on hyperspheres can successfully define a kernel embedding for compositional data based on this equivalence. The applicability of the proposed approach is demonstrated with kernel principal component analysis.
[Language] Korean (English upon request)
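A minimal NumPy sketch of the idea (my own illustration, not the authors' code; the kernel exp(κ⟨x, y⟩) and κ = 5 are illustrative assumptions): taking the radial transformation to be division by the Euclidean norm maps each composition, zeros included, onto the unit hypersphere, where kernels defined on the sphere can then be applied.

```python
import numpy as np

def radial_transform(X):
    """Map compositional rows (nonnegative, summing to 1) onto the unit
    hypersphere by dividing by the Euclidean norm; zeros need no substitution."""
    X = np.asarray(X, dtype=float)
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def sphere_kernel(S, kappa=5.0):
    """A simple positive-definite kernel on the sphere: exp(kappa * <x, y>)."""
    return np.exp(kappa * S @ S.T)

# Compositions with many zeros -- no pseudo-count is needed before transforming.
X = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.3, 0.7],
              [1.0, 0.0, 0.0]])
S = radial_transform(X)
K = sphere_kernel(S)
print(np.allclose(np.linalg.norm(S, axis=1), 1.0))  # True: rows lie on the sphere
```

With the Gram matrix K in hand, kernel PCA proceeds as usual (center K, then eigendecompose).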

[Notice] 10th KMGS on Sep. 15 (Thu), 2022

The 10th KMGS will be held on September 15th, Thursday, at Natural Science Building (E6-1) Room 1501.
We invite two speakers, Sungho Han (한성호) and Hoil Lee (이호일), from the Dept. of Mathematical Sciences, KAIST.
The abstracts of the talks are as follows.

1st slot (11:50 AM ~ 12:10 PM)
[Speaker] Sungho Han (한성호) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Moon-Jin Kang (강문진 교수님)
[Title] Large time behavior of one-dimensional barotropic compressible Navier-Stokes equations
[Discipline] Analysis (PDE)
[Abstract]
We will discuss the large-time behavior of the one-dimensional barotropic compressible Navier-Stokes equations with initial data connecting two different constant states. When the two constant states are prescribed by the Riemann data of the associated Euler equations, the Navier-Stokes flow converges to a viscous counterpart of the Riemann solution. This talk will present the latest result on the case where the Riemann solution consists of two shocks, and introduce the main ideas used in the proof.
[Language] Korean
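For reference, a standard form of the system in the abstract (a textbook statement, not taken from the talk; conventions for the viscosity term vary across references):

```latex
% 1D barotropic compressible Navier-Stokes equations (Eulerian form):
% density \rho(t,x) > 0, velocity u(t,x), constant viscosity \mu > 0,
% gamma-law pressure p(\rho) = \rho^\gamma with \gamma \ge 1.
\begin{aligned}
  \rho_t + (\rho u)_x &= 0, \\
  (\rho u)_t + \bigl(\rho u^2 + p(\rho)\bigr)_x &= \mu\, u_{xx},
\end{aligned}
% initial data connecting two constant states:
\qquad (\rho, u)(0, x) \to (\rho_\pm, u_\pm) \quad \text{as } x \to \pm\infty.
```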

2nd slot (12:15 PM ~ 12:35 PM)
[Speaker] Hoil Lee (이호일) from Dept. of Mathematical Sciences, KAIST, supervised by Prof. Ji Oon Lee (이지운 교수님)
[Title] On infinitely wide deep neural networks
[Discipline] Probability theory, Deep learning
[Abstract]
Deep neural networks have proven to work very well on many complicated tasks. However, theoretical explanations of why deep networks are so good at such tasks are yet to come. To give a satisfactory mathematical explanation, one recently developed theory considers an idealized network with infinitely many nodes on each layer and an infinitesimal learning rate. This simplifies the stochastic behavior of the whole network at initialization and during training. This way, it is possible to answer, at least partly, why the initialization and training of such a network is good at particular tasks, in terms of other statistical tools that have been previously developed. In this talk, we consider the limiting behavior of a deep feed-forward network and its training dynamics, under the setting where the width tends to infinity. Then we see that the limiting behaviors can be related to Bayesian posterior inference and kernel methods. If time allows, we will also introduce a particular way to encode heavy-tailed behaviors into the network, as there is some empirical evidence that some neural networks exhibit heavy-tailed distributions.
[Language] Korean (English upon request)
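One instance of the infinite-width limit described above can be checked numerically. A minimal NumPy sketch (my own illustration, not the speaker's setup): for a one-hidden-layer ReLU network with the usual 1/width variance scaling, the limiting output covariance is the order-1 arc-cosine kernel, and a Monte Carlo average over random first-layer weights matches the closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

def arccos_kernel(x, y, d):
    """Closed-form value of E[relu(u.x) * relu(u.y)] for u ~ N(0, I_d / d):
    the order-1 arc-cosine kernel, which is the infinite-width covariance
    of a one-hidden-layer ReLU network with 1/width readout variance."""
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    cos_t = np.clip(x @ y / (nx * ny), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return nx * ny * (np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi * d)

def mc_estimate(x, y, d, n_samples=400_000):
    """Monte Carlo over first-layer weight rows u ~ N(0, I_d / d)."""
    U = rng.normal(scale=1 / np.sqrt(d), size=(n_samples, d))
    return np.mean(np.maximum(U @ x, 0) * np.maximum(U @ y, 0))

d = 2
x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(arccos_kernel(x, y, d))   # 1/(4*pi), approximately 0.0796
print(mc_estimate(x, y, d))     # close to the analytic value
```

For orthogonal unit inputs the angle is π/2 and the kernel reduces to 1/(4πd)·... = 1/(4π) here, which the sampled average reproduces to within Monte Carlo error.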