Friday, May 3, 2024

2024-05-03 / 14:00 ~ 15:00
Department Seminar/Colloquium - Other
by 허은우
Link prediction (LP), inferring the connectivity between nodes, is a significant research area for graph data, where a link represents essential information about the relationship between nodes. Although graph neural network (GNN)-based models achieve high performance in LP, understanding why they perform well is challenging because most comprise complex neural networks. We employ persistent homology (PH), a topological data analysis method for studying the topological information of graphs, to explain the reasons for this high performance. We propose a novel method that employs PH for LP (PHLP), focusing on how the presence or absence of target links influences the overall topology. PHLP utilizes the angle hop subgraph and a new node labeling called degree double radius node labeling (Degree DRNL), which distinguishes graph information better than DRNL. Using only a classifier, PHLP performs comparably to state-of-the-art (SOTA) models on most benchmark datasets. Incorporating the outputs of PHLP into existing GNN-based SOTA models improves performance across all benchmark datasets. To the best of our knowledge, PHLP is the first method to apply PH to LP without GNNs. The proposed approach, employing PH while not relying on neural networks, enables the identification of crucial factors for improving performance. https://arxiv.org/abs/2404.15225
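As an illustration of the node-labeling ingredient, the sketch below implements classic DRNL (from the SEAL line of work) in plain Python: each node of the subgraph around a target link (x, y) is labeled from its pair of distances to the two endpoints, where the distance to one endpoint is computed with the other endpoint removed. The Degree DRNL variant proposed in the talk additionally incorporates node degrees; its exact formula is in the paper, so only plain DRNL is shown here, and the function names and example graph are our own.

```python
from collections import deque

def bfs_dist(adj, src, exclude):
    """BFS distances from src in the graph given by adjacency dict adj,
    ignoring the excluded node entirely."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v == exclude or v in dist:
                continue
            dist[v] = dist[u] + 1
            q.append(v)
    return dist

def drnl(adj, x, y):
    """Double Radius Node Labeling for the target link (x, y), as in SEAL:
    d(i, x) is measured with y removed, and d(i, y) with x removed."""
    dx = bfs_dist(adj, x, exclude=y)
    dy = bfs_dist(adj, y, exclude=x)
    labels = {}
    for i in adj:
        if i == x or i == y:
            labels[i] = 1                  # the two endpoints share label 1
        elif i not in dx or i not in dy:
            labels[i] = 0                  # unreachable from an endpoint
        else:
            a, b = dx[i], dy[i]
            d = a + b
            labels[i] = 1 + min(a, b) + (d // 2) * (d // 2 + d % 2 - 1)
    return labels

# Tiny example: a 4-cycle 0-1-2-3-0, labeling for the target link (0, 1)
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
labels = drnl(adj, 0, 1)
```

Nodes 2 and 3 sit symmetrically relative to the target link, so they receive the same label, which is the structural information a downstream classifier consumes.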
2024-05-03 / 11:00 ~ 12:00
Department Seminar/Colloquium - Computational Mathematics Seminar: Multi-stage Neural Networks: Function Approximator of Machine Precision
by Yongji Wang (Stanford University)
Deep learning techniques are increasingly applied to scientific problems, where the precision of networks is crucial. Despite being deemed universal function approximators, neural networks in practice struggle to reduce prediction errors below O(10⁻⁵), even with large network sizes and extended training iterations. To address this issue, we developed multi-stage neural networks, which divide the training process into stages, with each stage using a new network optimized to fit the residue from the previous stage. Across successive stages, the residue magnitude decreases substantially and follows an inverse power-law relationship with the residue frequency. Multi-stage neural networks effectively mitigate the spectral bias associated with regular neural networks, enabling them to capture the high-frequency features of target functions. We demonstrate that the prediction error of multi-stage training, for both regression problems and physics-informed neural networks, can nearly reach the machine precision O(10⁻¹⁶) of double floating-point arithmetic within a finite number of iterations. Such levels of accuracy are rarely attainable with single neural networks alone.
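A minimal sketch of the staged-residual idea, using polynomial least squares as a stand-in for the per-stage networks (the talk trains neural networks; the target function and the growing per-stage degrees here are illustrative assumptions). Each stage fits only the residual left by the stages before it, and later stages have more capacity, mirroring how later stages must capture higher-frequency residuals:

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 400)
target = np.sin(3 * x)                    # function to approximate

approx = np.zeros_like(x)
errors = []
for deg in (5, 9, 13):                    # growing capacity per stage
    residual = target - approx            # what earlier stages missed
    stage = np.polynomial.Polynomial.fit(x, residual, deg)
    approx = approx + stage(x)            # add this stage's correction
    errors.append(float(np.max(np.abs(target - approx))))

# errors shrinks sharply from stage to stage: each stage only has to
# resolve the (smaller, higher-frequency) residue of its predecessors
```

The same mechanism, with networks in place of polynomials and appropriate rescaling of the residue, is what drives the near-machine-precision results described above.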
2024-05-03 / 14:00 ~ 16:00
Department Seminar/Colloquium - Computational Mathematics Seminar: Rapid Convergence of Unadjusted Langevin Algorithm
by 최우진 (KAIST)
Review of recent papers: "Rapid Convergence of Unadjusted Langevin Algorithm" (Vempala et al.) and "Score-Based Generative Models" (Song et al.)
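For reference, ULA is the discretization x_{k+1} = x_k − η ∇V(x_k) + √(2η) ξ_k of the Langevin diffusion for a target density π ∝ exp(−V). A minimal sketch, assuming a standard Gaussian target (so V(x) = x²/2 and ∇V(x) = x) and illustrative step-size and chain-count choices of our own:

```python
import numpy as np

rng = np.random.default_rng(0)
eta, steps, chains = 0.05, 400, 20000      # step size, iterations, parallel chains

x = np.zeros(chains)                        # all chains start at 0
for _ in range(steps):
    noise = rng.standard_normal(chains)
    # ULA update for V(x) = x^2 / 2, so grad V(x) = x
    x = x - eta * x + np.sqrt(2 * eta) * noise

# x is now an approximate sample from N(0, 1); the step size eta
# controls the small discretization bias that "unadjusted" refers to
```

The rapid-convergence results reviewed in the talk quantify, under isoperimetric assumptions on π, how fast such a chain approaches the (slightly biased) stationary distribution.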
2024-05-03 / 14:00 ~ 16:00
Department Seminar/Colloquium - Other: Introduction to étale cohomology 4
by 이제학 (KAIST)
This is an introductory reading seminar presented by a senior undergraduate student, Jaehak Lee, who is studying the subject.
2024-05-10 / 14:00 ~ 16:00
Department Seminar/Colloquium - Other: Introduction to complex algebraic geometry and Hodge theory #5
by 김재홍 (KAIST)
This is part of an informal seminar series given by Mr. Jaehong Kim, who has been studying the book "Hodge Theory and Complex Algebraic Geometry, Vol. 1" by Claire Voisin for a few months. There will be 6-8 seminars during Spring 2024, covering about 70-80% of the book.
2024-05-07 / 16:30 ~ 17:30
IBS-KAIST Seminar - Discrete Mathematics: Aharoni's rainbow cycle conjecture holds up to an additive constant
by Tony Huynh(Sapienza Università di Roma)
In 2017, Aharoni proposed the following generalization of the Caccetta-Häggkvist conjecture for digraphs. If G is a simple n-vertex edge-colored graph with n color classes of size at least r, then G contains a rainbow cycle of length at most ⌈n/r⌉. In this talk, we prove that Aharoni’s conjecture holds up to an additive constant. Specifically, we show that for each fixed r, there exists a constant c such that if G is a simple n-vertex edge-colored graph with n color classes of size at least r, then G contains a rainbow cycle of length at most n/r+c. This is joint work with Patrick Hompe.
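The statement can be sanity-checked on a small example. Below, K5 gets the proper edge-coloring c({i, j}) = (i + j) mod 5, so the n = 5 color classes are each a perfect matching of size r = 2, and the conjectured bound is ⌈5/2⌉ = 3, i.e. a rainbow triangle must exist; brute force confirms this (the example and helper names are ours, not from the paper):

```python
from itertools import combinations
from math import ceil

n, r = 5, 2
# Edge coloring of K5: edge {i, j} gets color (i + j) mod 5, giving
# 5 color classes, each a perfect matching of size 2.
color = {frozenset(e): sum(e) % n for e in combinations(range(n), 2)}

def rainbow_triangles():
    """All triangles of K5 whose three edge colors are pairwise distinct."""
    found = []
    for tri in combinations(range(n), 3):
        cols = {color[frozenset(e)] for e in combinations(tri, 2)}
        if len(cols) == 3:
            found.append(tri)
    return found

bound = ceil(n / r)          # = 3, the conjectured maximum rainbow cycle length
tris = rainbow_triangles()   # every triangle is a rainbow cycle of length 3
```

In this particular coloring the colors i+j, i+k, j+k are automatically distinct mod 5, so every triangle is rainbow; the conjecture asserts that a short rainbow cycle survives under any coloring with n classes of size at least r.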
2024-05-07 / 16:00 ~ 17:30
Department Seminar/Colloquium - Algebraic Geometry
by 황준묵 (IBS-CCG)
We introduce general equivalence problems for geometric structures arising from minimal rational curves on uniruled complex projective manifolds. Studying these problems requires approaches fusing differential geometry and algebraic geometry. Among such geometric structures, those associated with homogeneous manifolds are particularly accessible to the differential-geometric methods of Cartan geometry. Even there, however, only a few cases have been worked out so far. We review some recent developments.
2024-05-03 / 16:00 ~ 17:30
Department Seminar/Colloquium - Algebraic Geometry
by 황준묵 (IBS-CCG)
I tell a personal story of how a mathematician working in complex algebraic geometry came to discover the relevance of Cartan geometry, a subject in differential geometry, to an old problem in algebraic geometry: the deformation of Grassmannians as projective manifolds, which originated in the work of Kodaira and Spencer. In joint work with Ngaiming Mok, we used the theory of minimal rational curves to study such deformations, reducing the question to a problem in Cartan geometry.
2024-05-10 / 11:00 ~ 12:00
IBS-KAIST Seminar - Mathematical Biology

2024-05-03 / 11:00 ~ 12:00
IBS-KAIST Seminar - Mathematical Biology

2024-05-09 / 16:15 ~ 17:15
Department Seminar/Colloquium - Colloquium
by 홍영준
This lecture explores the topics and areas that have guided my research in computational mathematics and machine learning in recent years. Numerical methods in computational science are essential for understanding real-world phenomena, and deep neural networks have achieved state-of-the-art results in a range of fields. The rapid expansion and outstanding success of deep learning and scientific computing have led to applications across multiple disciplines, from fluid dynamics to materials science. In this lecture, I will focus on bridging machine learning with applied mathematics, discussing topics such as scientific machine learning, numerical PDEs, and mathematical approaches to machine learning, including generative models and adversarial examples.