Department Seminars & Colloquia





I will present my recent work on the uniqueness of the Riemann problem consisting of two shocks for the 1D isentropic Euler system. Uniqueness is guaranteed in the class of vanishing viscosity limits of solutions to the associated Navier-Stokes system, as in the Bianchini-Bressan conjecture. The main idea is to obtain uniform stability, for the Navier-Stokes system, of arbitrarily large perturbations of a composite wave of two viscous shocks. In particular, I will explain this main idea in a simpler context, namely the case of a single shock. This is based on joint work with Alexis Vasseur.
Host: 권순식     To be announced     2020-10-20 13:20:48
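For orientation, a minimal sketch of the systems referred to in the abstract above (the gamma-law pressure p(\rho)=\rho^\gamma and the form of the viscosity term are illustrative assumptions, not stated in the abstract):

\[
\begin{aligned}
&\partial_t\rho + \partial_x(\rho u) = 0,\\
&\partial_t(\rho u) + \partial_x\bigl(\rho u^2 + p(\rho)\bigr) = 0 &&\text{(isentropic Euler)},\\
&\partial_t\rho + \partial_x(\rho u) = 0,\\
&\partial_t(\rho u) + \partial_x\bigl(\rho u^2 + p(\rho)\bigr) = \varepsilon\,\partial_x\bigl(\mu(\rho)\,\partial_x u\bigr) &&\text{(Navier-Stokes, viscosity }\varepsilon\to 0),
\end{aligned}
\]

with Riemann initial data (\rho,u)(0,x) = (\rho_-,u_-) for x<0 and (\rho_+,u_+) for x>0, chosen so that the Euler solution consists of two shocks.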
The essential dimension quantifies the algebraic-geometric complexity of a class of algebraic objects (such as, but not necessarily, the class of Galois extensions with a given group): roughly speaking, it is the minimal number of parameters required to describe all objects in this class (over all fields containing a given field K). We introduce and discuss arithmetic-geometric and local analogues of this notion. These are intended to quantify the difference in complexity between the local and global Galois theory of a given group over a given number field K. In particular, we show that the "local dimension" of a finite group is bounded by 2, whereas the arithmetic dimension remains mysterious in general. We give an application concerning the solution of Grunwald problems.
Host: 백상훈     English     2020-09-25 16:09:30
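For background, the standard definition behind the "minimal number of parameters" phrase above (not quoted from the abstract): if F is a functor from field extensions of K to sets, the essential dimension of an object a in F(L), and of F itself, are

\[
\operatorname{ed}_K(a)=\min\bigl\{\operatorname{trdeg}_K K_0 \;:\; K\subseteq K_0\subseteq L,\ a \text{ comes from } F(K_0)\bigr\},
\qquad
\operatorname{ed}_K(F)=\sup_{L/K,\; a\in F(L)} \operatorname{ed}_K(a).
\]

Taking for F the class of Galois extensions (Galois algebras) with a given finite group G recovers the essential dimension of G over K.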
This lecture examines how important a role mathematics plays in the Fourth Industrial Revolution. We first review the history of scientific progress and discuss the advent of the era of the Fourth Industrial Revolution. We then survey the foundational theories of the Fourth Industrial Revolution, discuss artificial intelligence and biotechnology, which lie at its core, and finally discuss ways to strengthen the competitiveness of the mathematics industry. (Zoom link: https://kaist.zoom.us/j/86300617369?pwd=WlkyNkJYY0RZR2lNZHRHazBZcU5sUT09)
Host: 이창옥     To be announced     2020-10-07 17:27:06
In this talk, we introduce a well-known idea for producing rational torsion points on J_0(N). Conjecturally, the points constructed in this way exhaust all the rational torsion points on J_0(N). We briefly explain how to compute the orders of such points, and prove the conjecture up to finitely many primes. (If you would like to join this online seminar, please contact Bo-Hae Im to get the Zoom link.)
Host: Bo-Hae Im     To be announced     2020-09-16 13:54:13
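The abstract does not spell out the construction; presumably it is the classical cuspidal one. As a well-known instance, for a prime N the class of the degree-zero cuspidal divisor

\[
c_N = \bigl[(0)-(\infty)\bigr] \in J_0(N)(\mathbb{Q})
\]

is a rational torsion point whose order is the numerator of (N-1)/12, and Mazur's theorem asserts that for prime N this class generates the entire rational torsion subgroup of J_0(N).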
Recently, with the enormous development of deep learning techniques, solving underdetermined linear systems (more unknowns than equations) has become one of the major concerns in medical imaging. Typical examples include undersampled MRI, local tomography, and sparse-view CT, where deep learning techniques have shown excellent performance. Although deep learning methods appear to overcome the limitations of existing mathematical methods in handling various underdetermined problems, there is a lack of rigorous mathematical foundations that would allow us to understand why deep learning methods perform so well. This talk deals with the relationship between the structure of the training data and the ability of deep learning to solve highly underdetermined inverse problems. We examine whether or not a desired reconstruction map is learnable from the training data and the underdetermined system. Most problems of solving underdetermined linear systems in medical imaging are highly nonlinear.
Host: 이창옥     To be announced     2020-09-09 11:03:00
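Not the speaker's framework, just a toy Python sketch of the setting described above: an underdetermined system y = Ax with more unknowns than equations, training signals confined to a low-dimensional structure, and a reconstruction map fitted from the training pairs. All names and sizes are illustrative.

import numpy as np

rng = np.random.default_rng(0)

n, m, k = 64, 16, 4                  # unknowns, equations (m < n), latent dimension
A = rng.standard_normal((m, n))      # underdetermined forward operator

# Training signals share structure: they lie on a k-dimensional subspace x = B z.
B = rng.standard_normal((n, k))
X_train = B @ rng.standard_normal((k, 2000))   # ground-truth signals
Y_train = A @ X_train                          # measurements

# Learn a linear reconstruction map W minimizing ||W Y_train - X_train||_F.
W = np.linalg.lstsq(Y_train.T, X_train.T, rcond=None)[0].T   # shape (n, m)

# On new signals with the same structure, the underdetermined system is effectively invertible.
X_test = B @ rng.standard_normal((k, 200))
rel_err = np.linalg.norm(W @ (A @ X_test) - X_test) / np.linalg.norm(X_test)
print(f"relative reconstruction error on structured test data: {rel_err:.2e}")

Without the structural assumption on the training signals (for example, if x ranged over all of R^n), no reconstruction map could recover x from y, which is the kind of learnability question the abstract raises.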
In this talk, I will first explain what representation theory is, using an easy example from finite groups. Next, I will introduce certain algebraic objects called quantum affine algebras and quiver Hecke algebras, and explain a connection between these objects via the representation theory of associative algebras.
Korean     2020-09-09 11:00:37
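One possible "easy example" in the sense of the abstract above (my choice, not necessarily the speaker's): a representation of a finite group G is a group homomorphism \rho : G \to GL(V) into the invertible linear maps of a vector space V. For the cyclic group G = \mathbb{Z}/n\mathbb{Z}, every irreducible complex representation is one-dimensional,

\[
\rho_j : \mathbb{Z}/n\mathbb{Z} \to \mathbb{C}^{\times}, \qquad \rho_j(k) = e^{2\pi i jk/n}, \qquad j = 0,1,\dots,n-1,
\]

and every finite-dimensional complex representation of G decomposes as a direct sum of these.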
Hessian operators are, roughly speaking, operators that depend on the eigenvalues of the Hessian matrix. Classical examples include the Laplacian and the real and complex Monge-Ampère operators. Typically, the discussion of Hessian equations is restricted to suitable subfamilies of functions, so that the problem becomes (degenerate) elliptic. In my talk I will discuss the basics of general Hessian equations and explain their links to problems arising in geometric analysis. If time permits, I will focus on more specific examples admitting a richer theory.
English     2020-09-14 10:38:36
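To fix notation for the abstract above (standard facts, not quoted from it): if \lambda_1,\dots,\lambda_n are the eigenvalues of the Hessian D^2u of a C^2 function u, the Laplacian and the real Monge-Ampère operator are the extreme members of the family of k-Hessian operators,

\[
\Delta u = \sum_{i=1}^{n}\lambda_i = S_1(D^2u), \qquad
\det D^2u = \prod_{i=1}^{n}\lambda_i = S_n(D^2u), \qquad
S_k(D^2u) = \sigma_k(\lambda_1,\dots,\lambda_n),
\]

where \sigma_k is the k-th elementary symmetric polynomial. The equation S_k(D^2u) = f is (degenerate) elliptic only when restricted to k-convex functions, those u with \sigma_j(\lambda(D^2u)) \ge 0 for j = 1,\dots,k, which is the restriction to subfamilies mentioned above.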
Deep neural networks usually act on fixed-dimensional items. However, many real-world problems are formulated as learning mappings from sets of items to outputs. Such problems include multiple-instance learning, visual scene understanding, few-shot classification, and even generic Bayesian inference procedures. Recently, several methods have been proposed to construct neural networks that take sets as inputs. The key properties required of such networks are permutation invariance and equivariance: intermediate and final outputs of the network should be unaffected by, or change consistently with, the order in which the items of a set are processed. This talk discusses recent advances in permutation invariant and equivariant neural networks and their theoretical properties, especially their universality. The latter part of the talk will also introduce interesting applications of permutation invariant/equivariant neural networks.
Host: 전현호     To be announced     2020-09-09 10:58:40
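A minimal NumPy sketch related to the abstract above (the sum-pooling architecture, sizes, and names are illustrative assumptions in the spirit of Deep Sets, not taken from the talk), showing why encoding items independently and pooling with a symmetric operation yields a permutation invariant set function:

import numpy as np

rng = np.random.default_rng(0)

d_in, d_hid, d_out = 3, 8, 2
W1 = rng.standard_normal((d_in, d_hid))   # elementwise encoder phi
b1 = rng.standard_normal(d_hid)
W2 = rng.standard_normal((d_hid, d_out))  # readout rho on the pooled vector
b2 = rng.standard_normal(d_out)

def set_network(X):
    """Permutation invariant set function rho(sum_i phi(x_i)); X has shape (set_size, d_in)."""
    phi = np.tanh(X @ W1 + b1)   # encode each item independently
    pooled = phi.sum(axis=0)     # symmetric pooling: independent of item order
    return pooled @ W2 + b2      # readout on the pooled representation

X = rng.standard_normal((5, d_in))    # a set of 5 items
perm = rng.permutation(5)
print(np.allclose(set_network(X), set_network(X[perm])))   # True: order does not matter

Keeping one encoded output per item instead of summing (and applying the readout per item) would instead give a permutation equivariant network: permuting the inputs permutes the outputs in the same way.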