Wednesday, September 14, 2022

2022-09-21 / 16:00 ~ 17:00
SAARC Seminar
by 라준현
It is challenging to perform a multiscale analysis of mesoscopic systems exhibiting singularities at the macroscopic scale. In this paper, we study the hydrodynamic limit of the Boltzmann equation \begin{equation} \mathrm{St}\, \partial_t F + v \cdot \nabla_x F = \frac{1}{\mathrm{Kn}} Q(F, F) \end{equation} toward singular solutions of the 2D incompressible Euler equations whose vorticity is unbounded \begin{equation} \partial_t u + u \cdot \nabla_x u + \nabla_x p = 0, \quad \mathrm{div}\, u = 0. \end{equation} We obtain a microscopic description of the singularity through the so-called kinetic vorticity and understand its behavior in the vicinity of the macroscopic singularity. As a consequence of our new analysis, we affirmatively settle an open problem of convergence toward Lagrangian solutions of the 2D incompressible Euler equations whose vorticity is unbounded ($\omega \in L^{\mathfrak{p}}$ for any fixed $1 \le \mathfrak{p} < \infty$). Moreover, we prove the convergence of kinetic vorticities toward the vorticity of the Lagrangian solution of the Euler equations. In particular, we obtain the rate of convergence when the vorticity blows up moderately in $L^{\mathfrak{p}}$ as $\mathfrak{p} \rightarrow \infty$ (localized Yudovich class).
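For context, the kinetic-vorticity analysis above mirrors a standard fact (not stated in the abstract itself): in 2D the Euler system reduces to a transport equation for the scalar vorticity, \begin{equation} \partial_t \omega + u \cdot \nabla_x \omega = 0, \qquad \omega := \partial_{x_1} u_2 - \partial_{x_2} u_1, \end{equation} so $\omega$ is carried along particle trajectories and every $L^{\mathfrak{p}}$ norm of $\omega$ is formally conserved. This is why Lagrangian solutions and $L^{\mathfrak{p}}$ vorticity bounds are the natural setting for the convergence result.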
2022-09-15 / 12:15 ~ 12:35
Graduate Student Seminar: On infinitely wide deep neural networks
by 이호일 (KAIST)
Deep neural networks have proven to work very well on many complicated tasks. However, theoretical explanations on why deep networks are very good at such tasks are yet to come. To give a satisfactory mathematical explanation, one recently developed theory considers an idealized network where it has infinitely many nodes on each layer and an infinitesimal learning rate. This simplifies the stochastic behavior of the whole network at initialization and during the training. This way, it is possible to answer, at least partly, why the initialization and training of such a network is good at particular tasks, in terms of other statistical tools that have been previously developed. In this talk, we consider the limiting behavior of a deep feed-forward network and its training dynamics, under the setting where the width tends to infinity. Then we see that the limiting behaviors can be related to Bayesian posterior inference and kernel methods. If time allows, we will also introduce a particular way to encode heavy-tailed behaviors into the network, as there are some empirical evidences that some neural networks exhibit heavy-tailed distributions.
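As an illustration of the infinite-width limit mentioned above, the following sketch computes the limiting output covariance (the so-called NNGP kernel) of a deep ReLU network at random initialization, using the closed-form Gaussian expectation of the ReLU (the Cho–Saul arc-cosine formula). The function names, depth, and variance hyperparameters here are illustrative choices, not details from the talk.

```python
import numpy as np

def relu_expect(k11, k22, k12):
    """E[max(u,0)*max(v,0)] for centered jointly Gaussian (u, v) with
    Var u = k11, Var v = k22, Cov(u, v) = k12 (arc-cosine formula)."""
    c = np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0)
    theta = np.arccos(c)
    return np.sqrt(k11 * k22) * (np.sin(theta) + (np.pi - theta) * c) / (2 * np.pi)

def nngp_kernel(x1, x2, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """Covariance of the outputs f(x1), f(x2) of an infinitely wide ReLU MLP
    with `depth` hidden layers at initialization (illustrative hyperparameters)."""
    d = len(x1)
    # Second moments after the first (input) affine layer.
    k11 = sigma_w2 * np.dot(x1, x1) / d + sigma_b2
    k22 = sigma_w2 * np.dot(x2, x2) / d + sigma_b2
    k12 = sigma_w2 * np.dot(x1, x2) / d + sigma_b2
    # Layer-to-layer recursion: push the covariance through the ReLU expectation.
    for _ in range(depth):
        k11, k22, k12 = (
            sigma_w2 * relu_expect(k11, k11, k11) + sigma_b2,
            sigma_w2 * relu_expect(k22, k22, k22) + sigma_b2,
            sigma_w2 * relu_expect(k11, k22, k12) + sigma_b2,
        )
    return k12
```

With `sigma_w2 = 2` (He-style scaling) the diagonal of the kernel is preserved layer to layer, since the ReLU expectation on the diagonal equals `k/2`; this deterministic recursion is exactly the Gaussian-process picture the talk relates to Bayesian posterior inference and kernel methods.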
2022-09-15 / 11:50 ~ 12:10
Graduate Student Seminar: Large time behavior of one-dimensional barotropic compressible Navier-Stokes equations
by 한성호 (KAIST)
We will discuss the large-time behavior of the one-dimensional barotropic compressible Navier-Stokes equations with initial data connecting two different constant states. When the two constant states are prescribed by the Riemann data of the associated Euler equations, the Navier-Stokes flow is expected to converge to a viscous counterpart of the Riemann solution. This talk will present the latest results for the case where the Riemann solution consists of two shocks, and will introduce the main ideas used in the proof.
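For reference, the system in question, in standard Eulerian form with a $\gamma$-law pressure as a typical barotropic example (not spelled out in the abstract itself), reads \begin{equation} \partial_t \rho + \partial_x (\rho u) = 0, \qquad \partial_t (\rho u) + \partial_x \bigl( \rho u^2 + p(\rho) \bigr) = \mu\, \partial_{xx} u, \qquad p(\rho) = a \rho^{\gamma}, \end{equation} where $\rho$ is the density, $u$ the velocity, and $\mu > 0$ the viscosity. The initial data connect two constant states $(\rho_\pm, u_\pm)$ as $x \to \pm\infty$, which is what ties the long-time dynamics to the Riemann problem for the inviscid ($\mu = 0$) Euler system.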