Wednesday, October 11, 2023

2023-10-17 / 16:30 ~ 17:30
IBS-KAIST Seminar - Discrete Mathematics: Essentially tight bounds for rainbow cycles in proper edge-colourings
by Matija Bucić (Princeton University)
An edge-coloured graph is said to be rainbow if it uses no colour more than once. Extremal problems involving rainbow objects have been a focus of much research over the last decade as they capture the essence of a number of interesting problems in a variety of areas. A particularly intensively studied question due to Keevash, Mubayi, Sudakov and Verstraëte from 2007 asks for the maximum possible average degree of a properly edge-coloured graph on n vertices without a rainbow cycle. Improving upon a series of earlier bounds, Tomon proved an upper bound of $(\log n)^{2+o(1)}$ for this question. Very recently, Janzer-Sudakov and Kim-Lee-Liu-Tran independently removed the $o(1)$ term in Tomon's bound. We show that the answer to the question is equal to $(\log n)^{1+o(1)}$. A key tool we use is the theory of robust sublinear expanders. In addition, we observe a connection between this problem and several questions in additive number theory, allowing us to extend existing results on these questions for abelian groups to the case of non-abelian groups. Joint work with: Noga Alon, Lisa Sauermann, Dmitrii Zakharov and Or Zamir.
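For reference, the bounds mentioned above can be collected into one statement; the notation $f(n)$ is introduced here for convenience, and this is a paraphrase of the abstract rather than the authors' exact formulation. Let $f(n)$ denote the maximum possible average degree of a properly edge-coloured graph on $n$ vertices with no rainbow cycle. Then Tomon's bound reads $f(n) \le (\log n)^{2+o(1)}$, the Janzer-Sudakov and Kim-Lee-Liu-Tran improvements give $f(n) = O\big((\log n)^{2}\big)$, and the result announced in this talk is
\[
  f(n) = (\log n)^{1+o(1)}.
\]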
2023-10-13 / 11:00 ~ 12:00
Department Seminar/Colloquium - Applied and Computational Mathematics Seminar:
by Hyea Hyun Kim (Kyung Hee University)
With the success of deep learning technologies in many scientific and engineering applications, neural network approximation methods have emerged as an active research area in numerical partial differential equations. However, these new approximation methods still require further validation of their accuracy, stability, and efficiency before they can serve as alternatives to classical approximation methods. In this talk, we first introduce neural network approximation methods for partial differential equations, in which a neural network function is introduced to approximate the PDE (partial differential equation) solution and its parameters are optimized to minimize a cost function derived from the differential equation. We then present the behavior of the approximation error and the optimization error in the neural network approximate solution. To reduce the approximation error, a neural network with a larger number of parameters is often employed, but optimizing that many parameters usually lets the optimization error pollute the solution accuracy. In addition, gradient-based parameter optimization typically requires computing the cost-function gradient over a tremendous number of epochs, which makes obtaining a neural network solution very expensive. To address these problems, a partitioned neural network function can be formed to approximate the PDE solution, where localized neural network functions are combined into the global neural network solution. The parameters in each local neural network function are then optimized to minimize the corresponding cost function. To further enhance the efficiency of parameter training, iterative algorithms for the partitioned neural network function can be developed. We finally discuss the potential of this new approach to enhance the accuracy, stability, and efficiency of the neural network solution by utilizing classical domain decomposition algorithms and their convergence theory. Some interesting numerical results are presented to show the performance of the partitioned neural network approximation and the iterative algorithms.
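As an illustration of the basic approach described above, here is a minimal PyTorch sketch of a neural-network solver for the 1D Poisson problem $-u''(x)=\pi^2\sin(\pi x)$ on $[0,1]$ with $u(0)=u(1)=0$; the network size, collocation sampling, and optimizer settings are illustrative assumptions, not taken from the talk.

```python
# Minimal sketch: approximate the solution of -u''(x) = pi^2 sin(pi x),
# u(0) = u(1) = 0 (exact solution u(x) = sin(pi x)) with a small neural
# network whose parameters minimize a cost derived from the PDE residual.
# Hyperparameters below are illustrative assumptions.
import math
import torch

torch.manual_seed(0)

# Fully connected network u_theta(x) approximating the PDE solution.
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def pde_residual(x):
    # Residual -u''(x) - f(x) at collocation points, via automatic differentiation.
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = math.pi ** 2 * torch.sin(math.pi * x)
    return -d2u - f

for epoch in range(5000):
    optimizer.zero_grad()
    x_int = torch.rand(128, 1)              # interior collocation points
    x_bdy = torch.tensor([[0.0], [1.0]])    # boundary points
    # Cost = mean squared PDE residual + mean squared boundary mismatch.
    loss = (pde_residual(x_int) ** 2).mean() + (model(x_bdy) ** 2).mean()
    loss.backward()
    optimizer.step()

# After training, model(x) should approximate sin(pi * x) on [0, 1].
```

A partitioned version, in the spirit of the domain decomposition approach mentioned in the abstract, would replace the single model by several local networks, each trained on its own subdomain against a corresponding local cost.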
2023-10-12 / 14:30 ~ 15:45
Department Seminar/Colloquium - Other:
by ()
(Information) "Introduction to Oriented Matroids" series, Thursdays 14:30-15:45
2023-10-12 / 16:15 ~ 17:15
Department Seminar/Colloquium - Colloquium:
by Sanghyuk Lee (Seoul National University)
Maximal functions of various forms have played a crucial role in harmonic analysis, and various outstanding open problems are related to the $L^p$ boundedness (estimates) of the associated maximal functions. In this talk, we discuss the $L^p$ boundedness of maximal functions given by averages over curves.
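As a point of reference (a standard model case, not necessarily the specific operators treated in the talk), the one-dimensional Hardy-Littlewood maximal function and a maximal average along a curve $\gamma$ are
\[
  Mf(x)=\sup_{t>0}\frac{1}{2t}\int_{-t}^{t}|f(x-y)|\,dy,
  \qquad
  \mathcal{M}_{\gamma}f(x)=\sup_{t>0}\bigg|\frac{1}{t}\int_{0}^{t}f\big(x-\gamma(s)\big)\,ds\bigg|,
\]
and $L^p$ boundedness asks for an estimate of the form $\|Mf\|_{L^p}\le C_p\,\|f\|_{L^p}$ for all $f\in L^p$.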