# Department Seminars and Colloquia

Subscribe via Google Calendar, iPhone, or similar to receive a notification before each seminar begins.

We consider a deep generative model for nonparametric distribution estimation problems. The true data-generating distribution is assumed to possess a certain low-dimensional structure. Under this assumption, we study convergence rates of estimators obtained by likelihood approaches and by generative adversarial networks (GANs). The convergence rate depends only on the noise level, the intrinsic dimension, and the smoothness of the underlying structure. The true distribution may or may not possess a Lebesgue density, depending on the underlying structure. In the singular case (no Lebesgue density), the convergence rate of the GAN is strictly better than that of the likelihood approaches. Our lower bound on the minimax optimal rate shows that the convergence rate of the GAN is close to optimal. If the true distribution admits a smooth Lebesgue density, an estimator obtained by a likelihood approach achieves the minimax optimal rate.
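As a toy illustration of the setting (not from the talk itself): data whose distribution is supported on a low-dimensional structure embedded in ambient space, here a 1-D circle in R², observed with or without Gaussian noise. The function name and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_low_dim(n, sigma=0.0):
    """Sample n points from a distribution supported on the unit circle
    (intrinsic dimension 1) embedded in R^2, plus optional Gaussian noise.
    With sigma = 0 the distribution is singular: it has no Lebesgue
    density on R^2, since it concentrates on a measure-zero set."""
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    x = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    return x + sigma * rng.standard_normal((n, 2))

clean = sample_low_dim(1000, sigma=0.0)   # singular case
noisy = sample_low_dim(1000, sigma=0.1)   # noise smooths the distribution
radii = np.linalg.norm(clean, axis=1)     # all exactly 1 in the singular case
```

With `sigma = 0` every sample lies exactly on the circle, which is the "no Lebesgue density" regime where the abstract states the GAN rate dominates.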

A family of surfaces is called a mean curvature flow (MCF) if the velocity of the surface at each point is equal to its mean curvature there. Even when starting from a smooth surface, the MCF typically encounters singularities, and various generalized notions of MCF have been proposed to extend the existence past singularities: the level set flow, the Brakke flow, and the BV flow, to name a few. In this talk, I explain a recent global-in-time existence result for a particular generalized solution that has some desirable properties, and I describe a basic outline of how to construct the solution.
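In symbols, the defining condition of MCF for a smooth family of surfaces can be written as follows (standard formulation, not specific to the talk):

```latex
% Mean curvature flow: at every point, the velocity of the evolving
% surface \Sigma_t equals its mean curvature vector there.
\[
  \partial_t x \;=\; \vec{H}(x), \qquad x \in \Sigma_t,
\]
% where \vec{H}(x) denotes the mean curvature vector of \Sigma_t at x.
```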

Over recent years, data science and machine learning have been the center of attention in both the scientific community and the general public. Closely tied to the ‘AI-hype’, these fields are enjoying expanding scientific influence as well as a booming job market. In this talk, I will first discuss why mathematical knowledge is important for becoming a good machine learner and/or data scientist, by covering various topics in modern deep learning research. I will then introduce my recent efforts in utilizing various deep learning methods for statistical analysis of mathematical simulations and observational data, including surrogate modeling, parameter estimation, and long-term trend reconstruction. Various scientific application examples will be discussed, including ocean diffusivity estimation, WRF-hydro calibration, AMOC reconstruction, and SIR calibration.
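One of the themes above, surrogate modeling, can be sketched in a few lines: replace an expensive simulator with a cheap model fitted to a handful of simulator runs. The "simulator" here is a stand-in toy function of my own, not any of the models named in the abstract (WRF-Hydro, SIR, etc.), and the polynomial surrogate is only one simple choice.

```python
import numpy as np

def simulator(theta):
    """Stand-in for an expensive simulation run (hypothetical toy function)."""
    return np.sin(3.0 * theta) + 0.5 * theta**2

# Run the "expensive" simulator at a small number of design points.
design = np.linspace(-1.0, 1.0, 20)
runs = simulator(design)

# Fit a cheap polynomial surrogate to those runs (degree is a modeling choice).
coeffs = np.polyfit(design, runs, deg=9)
surrogate = np.poly1d(coeffs)

# The surrogate now answers new queries without invoking the simulator.
theta_new = 0.3
approx = surrogate(theta_new)
exact = simulator(theta_new)
```

In practice the surrogate is often a neural network and the design points come from an experimental design, but the fit-then-query structure is the same.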

Polarization is a technique in algebra that provides combinatorial tools for studying algebraic invariants of monomial ideals. A depolarization of a square-free monomial ideal is a monomial ideal whose polarization is the original ideal. In this talk, we briefly introduce depolarization and related problems, and present a new method based on hypergraph coloring.
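A small standard example may help fix ideas (the notation is mine, not from the talk):

```latex
% Polarization replaces powers by products of new variables, e.g.
%   x^2 \mapsto x_1 x_2.
% For I = (x^2, xy) \subseteq k[x, y], the polarization is the
% square-free monomial ideal
\[
  I^{\mathrm{pol}} \;=\; (x_1 x_2,\; x_1 y_1) \;\subseteq\; k[x_1, x_2, y_1].
\]
% Conversely, I is a depolarization of (x_1 x_2, x_1 y_1):
% identifying x_2 with x_1 and y_1 with y recovers I.
```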

We define the notions of infinity-categories and Kan complexes using observations from the previous talk. A process called the nerve construction, which produces infinity-categories from ordinary categories, will be introduced, and we will set up a dictionary between them. Infinity-categories of functors will be introduced as well.
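The nerve construction mentioned above can be stated concisely (this is the standard definition, not material specific to the talk):

```latex
% The nerve of an ordinary category \mathcal{C} is the simplicial set with
\[
  N(\mathcal{C})_n \;=\; \operatorname{Fun}([n], \mathcal{C}),
\]
% i.e. an n-simplex is a chain of n composable morphisms
%   X_0 \to X_1 \to \cdots \to X_n,
% with face maps given by composing (or omitting) morphisms and
% degeneracy maps given by inserting identities. The nerve is always
% an infinity-category, and it is a Kan complex exactly when
% \mathcal{C} is a groupoid.
```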