Department Seminars & Colloquia





For a translation surface, the associated saddle connection graph has saddle connections as vertices, and edges connecting pairs of non-crossing saddle connections. This can be viewed as an induced subgraph of the arc graph of the surface. In this talk, I will discuss both the fine and coarse geometry of the saddle connection graph. We show that the isometry type is rigid: any isomorphism between two such graphs is induced by an affine diffeomorphism between the underlying translation surfaces. However, the situation is completely different when one considers the quasi-isometry type: all saddle connection graphs form a single quasi-isometry class. We will also discuss the Gromov boundary in terms of foliations. This is based on joint work with Valentina Disarlo, Huiping Pan, and Anja Randecker.
To be announced     2023-10-27 16:02:46
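For readers new to the object, here is a formal version of the definition in the abstract above (the notation is ours, not the speaker's). A saddle connection on a translation surface $X$ is a geodesic segment joining singularities of $X$ with no singularity in its interior, and two saddle connections are non-crossing when they meet at most at their endpoints. The saddle connection graph then has

\[ V = \{ \text{saddle connections of } X \}, \qquad \{s, s'\} \in E \iff s \text{ and } s' \text{ are non-crossing}. \]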

Committee chair: 김동환; Committee members: 이창옥, 홍영준, 곽도영, 오덕순 (Chungnam National University)
To be announced     2023-10-06 14:11:17

The Gauss-Bonnet theorem implies that the two-dimensional torus does not admit a metric of nonnegative Gaussian curvature unless it is flat, and that the two-dimensional sphere does not admit a metric, other than the round one, whose Gaussian curvature is bounded below by one and which is bounded below by the standard round metric. Gromov proposed a series of conjectures generalizing the Gauss-Bonnet theorem in his four lectures. I will report on my work with Gaoming Wang (now at Tsinghua) on Gromov's dihedral rigidity conjecture in hyperbolic 3-space and on scalar curvature comparison for rotationally symmetric convex bodies with some simple singularities.
Host: 박지원     To be announced     2023-09-12 15:21:33
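For context, the classical statement behind the first sentence of the abstract: for a closed surface $M$ with Gaussian curvature $K$,

\[ \int_M K \, dA = 2\pi \, \chi(M). \]

Since $\chi(T^2) = 0$, any metric on the torus with $K \ge 0$ satisfies $\int_{T^2} K \, dA = 0$, which forces $K \equiv 0$, i.e. the metric is flat.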

Committee chair: 변재형; Committee members: 강문진, 김용정, 배명진, 권오상 (Chungbuk National University)
To be announced     2023-09-13 13:42:03
With the success of deep learning technologies in many scientific and engineering applications, neural network approximation methods have emerged as an active research area in numerical partial differential equations. However, the new approximation methods still need further validation of their accuracy, stability, and efficiency before they can serve as alternatives to classical approximation methods. In this talk, we first introduce neural network approximation methods for partial differential equations, in which a neural network function is introduced to approximate the PDE (partial differential equation) solution and its parameters are optimized to minimize a cost function derived from the differential equation. We then present the behavior of the approximation error and the optimization error in the neural network approximate solution. To reduce the approximation error, a neural network function with a larger number of parameters is often employed, but when optimizing such a large number of parameters the optimization error usually pollutes the solution accuracy. Moreover, gradient-based parameter optimization usually requires computing the cost function gradient over a tremendous number of epochs, which makes a neural network solution very expensive. To address these problems, a partitioned neural network function can be formed to approximate the PDE solution, where localized neural network functions combine to form the global neural network solution. The parameters in each local neural network function are then optimized to minimize the corresponding cost function. To further enhance the parameter training efficiency, iterative algorithms for the partitioned neural network function can be developed. We finally discuss the potential of this new approach to enhance the accuracy, stability, and efficiency of neural network solutions by utilizing classical domain decomposition algorithms and their convergence theory. Some interesting numerical results are presented to show the performance of the partitioned neural network approximation and the iterative algorithms.
Host: 이창옥     Contact: 설윤창 (010-8785-5872)     To be announced     2023-10-03 20:07:53
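To make the cost function described in the abstract concrete, here is a minimal sketch of the basic (unpartitioned) neural network approximation for the one-dimensional Poisson problem -u''(x) = f(x) on (0,1) with u(0) = u(1) = 0. The test problem, network size, and hyperparameters are illustrative choices of ours, not the speaker's; PyTorch is assumed.

    # Minimal sketch: neural network approximation of -u'' = f on (0,1), u(0) = u(1) = 0.
    # With f(x) = pi^2 sin(pi x), the exact solution is u(x) = sin(pi x).
    import torch

    net = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1),
    )
    f = lambda x: torch.pi ** 2 * torch.sin(torch.pi * x)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    for epoch in range(5000):
        x = torch.rand(128, 1, requires_grad=True)   # interior collocation points
        u = net(x)
        du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
        d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
        residual = -d2u - f(x)                       # strong-form PDE residual
        xb = torch.tensor([[0.0], [1.0]])            # boundary points
        loss = (residual ** 2).mean() + (net(xb) ** 2).mean()  # PDE + boundary terms
        opt.zero_grad()
        loss.backward()
        opt.step()

In the partitioned approach of the talk, the single network above would be replaced by local networks on subdomains, each trained on its own local cost plus interface conditions, which is where the domain decomposition viewpoint enters.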
(information) "Introduction to Oriented Matroids" Series Thursdays 14:30-15:45
Host: Andreas Holmsen     English     2023-09-13 17:56:08
Maximal functions of various forms have played crucial roles in harmonic analysis. Various outstanding open problems are related to the Lp boundedness (estimates) of the associated maximal functions. In this talk, we discuss the Lp boundedness of maximal functions given by averages over curves.
Host: 권순식     English     2023-09-08 15:22:49
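To fix ideas, the model operator behind the last sentence of the abstract (standard notation, added by us): for a curve $\gamma : (0, \infty) \to \mathbb{R}^d$,

\[ M_{\gamma} f(x) = \sup_{t > 0} \frac{1}{t} \int_0^t \big| f(x - \gamma(s)) \big| \, ds, \]

and the question is for which $p$ one has $\| M_{\gamma} f \|_{L^p(\mathbb{R}^d)} \lesssim \| f \|_{L^p(\mathbb{R}^d)}$; the parabola $\gamma(s) = (s, s^2)$ in $\mathbb{R}^2$ is the classical model case.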
"어떻게 하면 더 좋은 제품을 더 빠르게 개발할 수 있을까?"라는 문제는 모든 제조업이 안고 있는 숙제입니다. 최근 DX를 통해 많은 데이터들이 디지털화되고, AI의 급격한 발전을 통해 제품개발프로세스를 혁신하려는 시도가 일어나고 있습니다. 과거의 시뮬레이션 기반 설계에서 AI 기반 설계로의 패러다임 전환을 통해 제품개발 기간을 단축함과 동시에 제품의 품질을 향상시킬 수 있습니다. 본 세미나는 딥러닝을 통해 제품 설계안을 생성/탐색/예측/최적화/추천할 수 있는 생성형 AI 기반의 설계 프로세스(Deep Generative Design)를 소개하고, 모빌리티를 비롯한 제조 산업에 적용된 다양한 사례들을 소개합니다.
Host: 홍영준     Contact: 설윤창 (010-8785-5872)     To be announced     2023-09-24 22:03:17
In this talk, we discuss the Neural Tangent Kernel (NTK). The NTK is closely related to the dynamics of a neural network during training via gradient flow (or gradient descent). However, since the NTK is random at initialization and varies during training, the dynamics of the neural network are quite delicate to understand. In relation to this issue, we introduce an interesting result: in the infinite-width limit, the NTK converges to a deterministic kernel at initialization and remains constant during training. We provide a brief proof of the result in the simplest case.
A series of three talks: September 14, October 4, and October 5.
Host: 홍영준     Contact: 이명수 ()     Korean     2023-09-25 15:35:17
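For reference, the objects in the abstract above in standard notation (our summary of the infinite-width result, not the speaker's notes): for a network $f_\theta$ trained by gradient flow on the squared loss over data $(X, y)$, the NTK and the induced training dynamics are

\[ \Theta_\theta(x, x') = \nabla_\theta f_\theta(x)^{\top} \nabla_\theta f_\theta(x'), \qquad \frac{d}{dt} f_{\theta(t)}(X) = -\,\Theta_{\theta(t)}(X, X) \big( f_{\theta(t)}(X) - y \big). \]

In the infinite-width limit, $\Theta_{\theta(t)}$ becomes a deterministic kernel $\Theta_\infty$ that is constant in $t$, so the dynamics reduce to a linear ODE with solution $f_t(X) = y + e^{-\Theta_\infty t} \big( f_0(X) - y \big)$.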
(information) "Introduction to Oriented Matroids" Series Thursdays 14:30-15:45
Host: Andreas Holmsen     English     2023-09-13 17:54:08
In this talk, we discuss the Neural Tangent Kernel (NTK). The NTK is closely related to the dynamics of a neural network during training via gradient flow (or gradient descent). However, since the NTK is random at initialization and varies during training, the dynamics of the neural network are quite delicate to understand. In relation to this issue, we introduce an interesting result: in the infinite-width limit, the NTK converges to a deterministic kernel at initialization and remains constant during training. We provide a brief proof of the result in the simplest case.
A series of three talks (September 14, October 4, and October 5); this session mainly reviews the material from the September 14 talk.
Host: 홍영준     Contact: 이명수 ()     Korean     2023-09-25 15:29:33
We will closely review the following two papers on recent generative models, published at NeurIPS 2019 and ICLR 2021 by Professor Ermon's group at Stanford University, and discuss in depth research trends and future directions in SDE-based generative modeling.
Second of two lectures in consecutive weeks. *Notice: the lecture on Friday the 22nd has been cancelled and moved to Monday the 25th.
Host: 홍영준     Contact: 설윤창 (010-8785-5872)     To be announced     2023-09-17 01:21:36
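For orientation, the standard SDE formulation of score-based generative modeling (our background summary, since the paper list itself is not reproduced in this listing): data are noised by a forward SDE and sampled by its time reversal,

\[ dx = f(x, t) \, dt + g(t) \, dw, \qquad dx = \big[ f(x, t) - g(t)^2 \nabla_x \log p_t(x) \big] dt + g(t) \, d\bar{w}, \]

where the score $\nabla_x \log p_t(x)$ is approximated by a neural network $s_\theta(x, t)$ trained by score matching.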
(information) "Introduction to Oriented Matroids" Series Thursdays 14:30-15:45
Host: Andreas Holmsen     English     2023-09-06 20:19:28
We will discuss certain main problems concerning group actions on 1-dimensional manifolds (the circle and the interval) and perspectives for future research.
KAI-X Distinguished Lecture
Host: 백형렬     English     2023-09-08 15:21:21
We will closely review the following two papers on recent generative models, published at NeurIPS 2019 and ICLR 2021 by Professor Ermon's group at Stanford University, and discuss in depth research trends and future directions in SDE-based generative modeling.
First of two lectures in consecutive weeks.
Host: 이창옥     Korean (English if requested)     2023-09-06 21:41:05
In this talk, we discuss the Neural Tangent Kernel (NTK). The NTK is closely related to the dynamics of a neural network during training via gradient flow (or gradient descent). However, since the NTK is random at initialization and varies during training, the dynamics of the neural network are quite delicate to understand. In relation to this issue, we introduce an interesting result: in the infinite-width limit, the NTK converges to a deterministic kernel at initialization and remains constant during training. We provide a brief proof of the result in the simplest case.
Host: 홍영준     Contact: 김규식 (T.2702)     To be announced     2023-09-11 16:14:54
(information) "Introduction to Oriented Matroids" Series Thursdays 14:30-15:45
Host: Andreas Holmsen     English     2023-09-06 20:11:46
Questions of parameter estimation – that is, finding the parameter values that allow a model to best fit some data – and parameter identifiability – that is, the uniqueness of such parameter values – are often considered in settings where experiments can be repeated to gain more certainty about the data. In this talk, however, I will consider parameter estimation and parameter identifiability in situations where data can only be collected from a single experiment or trajectory. Our motivation comes from medical settings, where data comes from a patient; such limitations in data also arise in finance, ecology, and climate, for example. In this setting, we can try to find the best parameters to fit our limited data. In this talk, I will introduce a novel, alternative goal, which we refer to as a qualitative inverse problem. The aim here is to analyze what information we can gain about a system from the available data even if we cannot estimate its parameter values precisely. I will discuss results that allow us to determine whether a given model has the ability to fit the data, whether its parameters are identifiable, the signs of model parameters, and/or the local dynamics around system fixed points, as well as how much measurement error can be tolerated without changing the conclusions of our analysis. I will consider various classes of model systems and will illustrate our latest results with the classic Lotka-Volterra system.
Host: Jaekyoung Kim     Contact: Kyushik Kim (T.2702)     English     2023-08-31 13:57:07
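As a toy version of the single-trajectory estimation problem described in the abstract (our illustration, not the speaker's method; the noise level, initial condition, and initial guess are arbitrary choices):

    # Sketch: least-squares parameter estimation for Lotka-Volterra from one noisy trajectory.
    # Requires NumPy and SciPy; assumes the initial condition is known.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def lotka_volterra(t, z, a, b, c, d):
        x, y = z
        return [a * x - b * x * y, -c * y + d * x * y]

    true_params = (1.0, 0.5, 0.75, 0.25)
    t_obs = np.linspace(0.0, 15.0, 60)
    sol = solve_ivp(lotka_volterra, (0.0, 15.0), [2.0, 1.0], t_eval=t_obs, args=true_params)
    data = sol.y + 0.05 * np.random.default_rng(0).normal(size=sol.y.shape)  # one noisy trajectory

    def residuals(p):
        fit = solve_ivp(lotka_volterra, (0.0, 15.0), [2.0, 1.0], t_eval=t_obs, args=tuple(p))
        return (fit.y - data).ravel()

    est = least_squares(residuals, x0=[0.8, 0.8, 0.8, 0.8], bounds=(0.0, 5.0))
    print(est.x)  # compare with true_params

Identifiability asks whether such a minimizer is unique given only this one trajectory; the qualitative inverse problem of the talk asks what (signs of parameters, local dynamics near fixed points) can still be concluded when it is not, and how much measurement error that conclusion tolerates.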
In the past decades, there has been considerable progress in the theory of random walks on groups acting on hyperbolic spaces. Despite the abundance of such groups, this theory is inherently not preserved under quasi-isometry. In this talk, I will present our study of random walks on groups that satisfy a certain QI-invariant property that does not refer to an action on hyperbolic spaces. Joint work with Kunal Chawla, Kasra Rafi, and Vivian He.
Host: 백형렬     English     2023-09-06 15:27:22
In this talk, we discuss some concepts that are used to study (hyperbolic) holomorphic dynamics on K3 surfaces. These concepts include Green currents, their laminations, and Green measures, which emerge as the natural measures of maximal entropy. These tools effectively establish Kummer rigidity – that is, when the Green measure is absolutely continuous with respect to the volume measure, the surface is Kummer and the dynamics are linear. We provide an overview of the techniques employed to establish this principle and a glimpse into their extension to the hyperkähler setting – one of the higher-dimensional analogues of K3 surfaces.
Host: 박진현     Contact: 박진현 (2734)     English     2023-07-30 20:10:14
Hénon maps were introduced by Michel Hénon as a simplified model of the Poincaré section of the Lorenz model. They are among the most studied discrete-time dynamical systems that exhibit chaotic behavior. Complex Hénon maps in any dimension have been extensively studied over the last three decades, in parallel with the development of pluripotential theory. We will present the dynamical properties of these maps such as the behavior of point orbits, variety orbits, equidistribution of periodic points and fine ergodic properties of the systems. This talk is based on the work of Bedford, Fornaess, Lyubich, Sibony, Smillie, and on recent work of the speaker in collaboration with Bianchi and Sibony.
Host: Nguyen Ngoc Cuong     Contact: Kyushik Kim (T.2702)     English     2023-08-31 13:56:00
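For reference (standard background, our addition): the classical real Hénon family is

\[ H_{a,b}(x, y) = (1 - a x^2 + y, \; b x), \]

which already exhibits a chaotic attractor at the classical parameters $a = 1.4$, $b = 0.3$, while the complex Hénon maps of the talk are polynomial automorphisms of $\mathbb{C}^2$ of the form $H(x, y) = (p(x) - a y, \; x)$ with $p$ a polynomial of degree at least $2$ and $a \neq 0$, together with their compositions.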