Department Seminars & Colloquia





You can receive notifications by subscribing to this calendar in Google Calendar, the iPhone Calendar app, etc.

In astrophysical fluid dynamics, stars are modeled as isolated fluid masses subject to self-gravity. A classical model for the dynamics of Newtonian stars is the gravitational Euler-Poisson system, which admits a wide range of star solutions: solutions that are in equilibrium, expand for all time, collapse in finite time, or rotate. In particular, in the super-critical regime the Euler-Poisson system has been widely used, via numerics, in the astrophysics literature to describe gravitational collapse, but a rigorous proof has been established only recently. The main challenge comes from the pressure, which acts against the gravitational force. In this talk, I will discuss some recent progress on Newtonian dust-like collapse and self-similar collapse.
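For context, in standard notation (units with the gravitational constant set to one, and a polytropic pressure law), the compressible Euler-Poisson system for a self-gravitating gas mentioned in the abstract reads:

```latex
\begin{aligned}
&\partial_t \rho + \nabla \cdot (\rho u) = 0, && \text{(conservation of mass)} \\
&\rho\,(\partial_t u + (u \cdot \nabla) u) + \nabla p = -\rho \nabla \Phi, && \text{(momentum balance)} \\
&\Delta \Phi = 4\pi \rho, \qquad p = K \rho^{\gamma}, && \text{(self-gravity and polytropic pressure)}
\end{aligned}
```

Here $\rho$ is the fluid density, $u$ the velocity, $p$ the pressure, and $\Phi$ the gravitational potential; the super-critical regime referred to above corresponds to the polytropic exponent $\gamma$ below the critical value $4/3$.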
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-11-19 17:45:11
In this talk, we first review some basics of stochastic processes. Then we discuss recent developments on Brownian-like jump processes. This talk is based on joint projects with Ante Mimica, Joohak Bae, Jaehoon Kang, and Jaehun Lee.
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-11-05 15:14:25
Deep neural networks have brought remarkable progress in a wide range of applications, but a satisfactory mathematical answer to why they are so effective has yet to come. One promising direction, with a large amount of recent research activity, is to analyse neural networks in an idealised setting where the networks have infinite widths and the so-called step size becomes infinitesimal. In this idealised setting, seemingly intractable questions can be answered. For instance, it has been shown that as the widths of deep neural networks tend to infinity, the networks converge to Gaussian processes, both before and after training, if their weights are initialised with i.i.d. samples from the Gaussian distribution and normalised appropriately. Furthermore, in this setting, the training of a deep neural network is shown to achieve zero training error, and the analytic form of a fully-trained network with zero error has been identified. These results, in turn, enable the use of tools from stochastic processes and differential equations for analysing deep neural networks in a novel way. In this talk, I will explain our efforts to extend the above analysis to a new type of neural network that arises from recent studies on Bayesian deep neural networks, network pruning, and the design of effective learning rates. In these networks, each network node is equipped with its own scale parameter that is initialised randomly and independently but is not updated during training. This scale parameter of a node determines the scale of the weights of outgoing network edges from the node at initialisation, thereby introducing dependency among the weights. Also, its square becomes the learning rate of those weights. I will show that these networks at given inputs become infinitely-divisible random variables in the infinite-width limit, and describe how this characterisation at the infinite-width limit can help us understand the behaviour of these neural networks.
This is joint work with Hoil Lee, Juho Lee, and Paul Jung at KAIST, Francois Caron at Oxford, and Fadhel Ayed at Huawei Technologies.
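As a toy numerical illustration of the basic infinite-width claim above (a randomly initialised wide network at a fixed input behaves like a Gaussian draw), the following sketch samples many one-hidden-layer ReLU networks with i.i.d. standard Gaussian weights and 1/sqrt(width) output normalisation. This is a generic sketch of the standard setting, not code from the speakers, and it does not include the per-node scale parameters that are the subject of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, -0.5])  # a fixed network input

def sample_network_output(width, rng):
    """One draw of a randomly initialised one-hidden-layer ReLU network at x."""
    W = rng.standard_normal((width, x.size))  # input-to-hidden weights
    b = rng.standard_normal(width)            # hidden biases
    v = rng.standard_normal(width)            # hidden-to-output weights
    h = np.maximum(W @ x + b, 0.0)            # ReLU activations
    return v @ h / np.sqrt(width)             # 1/sqrt(width) normalisation

# Over many random initialisations, the output at x is approximately
# Gaussian with mean zero; the empirical mean and standard deviation
# of the samples reflect that limiting distribution.
samples = np.array([sample_network_output(2048, rng) for _ in range(5000)])
print(f"mean={samples.mean():.3f}, std={samples.std():.3f}")
```

At width 2048 the empirical distribution is already very close to a centred Gaussian; the limiting variance can be computed in closed form for ReLU, which is what makes the Gaussian-process correspondence usable in practice.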
Please note that, due to a schedule change, this talk has unavoidably been moved to Friday, November 12.
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-10-18 14:20:22

Nov. 3 (Wed) ~ Nov. 5 (Fri)
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-10-22 15:41:27

Nov. 3 (Wed) ~ Nov. 5 (Fri)
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-10-22 15:40:19

Nov. 3 (Wed) ~ Nov. 5 (Fri)
Host: Stochastic Analysis and Application Research Center     Contact: Stochastic Analysis and Application Research Center (042-350-8111/8117)     To be announced     2021-10-22 15:36:34