Friday, November 10, 2023

2023-11-17 / 11:00 ~ 12:00
Department Seminar/Colloquium - Applied and Computational Mathematics Seminar
by Hayoung Choi (Department of Mathematics, Kyungpook National University)
In this talk, we consider a group-sparse matrix estimation problem. This problem can be solved with existing compressed sensing techniques, which, however, either suffer from high computational complexity or lack robustness. To overcome these limitations, we propose a novel algorithm-unrolling framework based on deep neural networks that simultaneously achieves low computational complexity and high robustness. Specifically, we map the original iterative shrinkage-thresholding algorithm (ISTA) into an unrolled recurrent neural network (RNN), thereby improving the convergence rate and computational efficiency through end-to-end training. Moreover, the proposed algorithm-unrolling approach inherits the structure and domain knowledge of ISTA, thereby maintaining robustness and handling non-Gaussian preamble sequence matrices in massive access. We further simplify the unrolled network structure, with rigorous theoretical analysis, by removing redundant training parameters. Furthermore, we prove that the simplified unrolled deep neural network structures enjoy a linear convergence rate. Extensive simulations based on various preamble signatures show that the proposed unrolled networks outperform existing methods in convergence rate, robustness, and estimation accuracy.
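As background, the classical ISTA iteration that such unrolling methods start from can be written in a few lines. This is a minimal NumPy sketch of plain ISTA for an l1-regularized least-squares problem, not the speaker's implementation; the unrolled network would instead learn per-layer step sizes and thresholds end to end.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iters=200):
    """Plain ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    Each iteration is a gradient step on the smooth part followed by
    soft-thresholding; this fixed iteration is what an unrolled RNN
    replaces with trainable layers."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)           # gradient of 0.5*||Ax - y||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Each iteration has the same input-output structure as a recurrent cell, which is why truncating to a fixed number of iterations yields a trainable recurrent network (as in LISTA-style approaches).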
2023-11-17 / 16:00 ~ 17:00
SAARC Seminar
by Mikyoung Lee (Department of Mathematics, Pusan National University)
In this talk, we consider nonlinear elliptic equations of $p$-Laplacian type with lower-order terms involving nonnegative potentials that satisfy a reverse H\"older type condition. We establish interior and boundary $L^q$ estimates for the gradient of weak solutions and for the lower-order terms, independently, under sharp regularity conditions on the coefficients and the boundaries. In addition, we prove interior estimates for the Hessian of strong solutions and for the lower-order terms of nondivergence-type elliptic equations. The talk is based on joint works with Jihoon Ok and Yoonjung Lee.
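For orientation, a representative model problem of this type can be sketched as follows; the notation is assumed for illustration and is not necessarily the exact equation of the talk.

```latex
% The p-Laplacian operator and a model equation with a nonnegative
% potential V (V assumed to lie in a reverse H\"older class).
\Delta_p u := \operatorname{div}\!\left(|\nabla u|^{p-2}\nabla u\right),
\qquad
-\Delta_p u + V\,|u|^{p-2}u = f \quad \text{in } \Omega .
```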
2023-11-13 / 17:00 ~ 18:00
Department Seminar/Colloquium - PhD Thesis Defense: Mordell-Weil groups of elliptic curves and automorphism groups of abelian varieties
by Hansol Kim ()

2023-11-10 / 11:00 ~ 12:00
Department Seminar/Colloquium - Applied and Computational Mathematics Seminar
by ()
In this talk, I will introduce the use of deep neural networks (DNNs) to solve high-dimensional evolution equations. Unlike some existing methods (e.g., the least-squares method and physics-informed neural networks) that treat the time and space variables simultaneously, we propose a deep adaptive basis approximation structure. On the one hand, orthogonal polynomials are employed to form the temporal basis, achieving high accuracy in time. On the other hand, DNNs are employed to create the adaptive spatial basis for high dimensions in space. Numerical examples, including high-dimensional linear parabolic and hyperbolic equations and a nonlinear Allen-Cahn equation, demonstrate that the proposed DABG method outperforms existing DNN approaches.
Zoom link: https://kaist.zoom.us/j/3844475577 (Zoom ID: 384 447 5577)
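The separable ansatz behind such a structure can be sketched compactly: the solution is approximated as a sum of temporal orthogonal-polynomial basis functions multiplied by DNN-generated spatial basis functions. The sketch below is a minimal NumPy illustration under assumed names (`SpatialBasisMLP` is a stand-in with random weights, not the trained network of the talk).

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_basis(t, K):
    # Legendre polynomials P_0..P_{K-1} evaluated at times t in [-1, 1]:
    # an orthogonal temporal basis. Shape: (len(t), K).
    return np.stack(
        [np.polynomial.legendre.Legendre.basis(k)(t) for k in range(K)],
        axis=-1,
    )

class SpatialBasisMLP:
    """Tiny MLP mapping x in R^d to K adaptive spatial basis values.
    Stand-in for the trained DNN spatial basis; weights are random here."""
    def __init__(self, d, K, width=16):
        self.W1 = rng.standard_normal((d, width))
        self.b1 = np.zeros(width)
        self.W2 = rng.standard_normal((width, K))

    def __call__(self, x):
        # x: (n_points, d) -> (n_points, K)
        return np.tanh(x @ self.W1 + self.b1) @ self.W2

def adaptive_basis_approx(x, t, spatial, K):
    # u(x, t) ~ sum_k phi_k(t) * N_k(x): separable deep adaptive basis ansatz.
    return np.einsum('tk,xk->xt', temporal_basis(t, K), spatial(x))
```

In training, the spatial network's weights (and hence the basis functions N_k) would be fitted so that this ansatz satisfies the evolution equation; here only the evaluation structure is shown.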
2023-11-14 / 14:00 ~ 15:00
Department Seminar/Colloquium - PhD Thesis Defense: Essential dimension of reductive groups
by Yeongjong Kim ()

2023-11-17 / 11:00 ~ 12:00
IBS-KAIST Seminar - Mathematical Biology
by ()
TBD
2023-11-10 / 11:00 ~ 12:00
IBS-KAIST Seminar - Mathematical Biology
by ()
Interpreting data using mechanistic mathematical models provides a foundation for discovery and decision-making in all areas of science and engineering. Key steps in using mechanistic mathematical models to interpret data include: (i) identifiability analysis; (ii) parameter estimation; and (iii) model prediction. Here we present a systematic, computationally efficient likelihood-based workflow that addresses all three steps in a unified way. Recently developed methods for constructing profile-wise prediction intervals enable this workflow and provide the central linkage between different workflow components. These methods propagate profile-likelihood-based confidence sets for model parameters to predictions in a way that isolates how different parameter combinations affect model predictions. We show how to extend these profile-wise prediction intervals to two-dimensional interest parameters, and then combine profile-wise prediction confidence sets to give an overall prediction confidence set that approximates the full likelihood-based prediction confidence set well. We apply our methods to a range of synthetic data and real-world ecological data describing re-growth of coral reefs on the Great Barrier Reef after some external disturbance, such as a tropical cyclone or coral bleaching event.
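The core device in this workflow, the profile likelihood, can be illustrated on the simplest possible model. The sketch below (my own minimal example, not the speaker's code) profiles the mean of i.i.d. Gaussian data: for each value of the interest parameter, the nuisance parameter is maximized out, and the 95% confidence set is the region within half the chi-squared threshold of the peak.

```python
import numpy as np

def profile_loglik_mu(y, mu_grid):
    """Profile log-likelihood for the mean of i.i.d. Gaussian data:
    for each candidate mu, the nuisance sigma^2 is maximized out,
    which here has the closed form sigma^2(mu) = mean((y - mu)^2)."""
    n = len(y)
    out = []
    for mu in mu_grid:
        s2 = np.mean((y - mu) ** 2)     # MLE of sigma^2 given mu
        out.append(-0.5 * n * (np.log(2 * np.pi * s2) + 1.0))
    return np.array(out)

def profile_ci(y, mu_grid, chi2_95=3.841):
    # 95% profile-likelihood confidence set: grid points whose profile
    # log-likelihood is within chi2_{1,0.95}/2 of the maximum.
    ll = profile_loglik_mu(y, mu_grid)
    keep = mu_grid[ll >= ll.max() - chi2_95 / 2]
    return keep.min(), keep.max()
```

The profile-wise prediction intervals of the talk push such parameter confidence sets forward through the model to obtain prediction confidence sets; this sketch shows only the parameter-profiling step.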
2023-11-16 / 11:50 ~ 12:40
Graduate Student Seminar: Provable Ensemble Distillation based Federated Learning Algorithm
by Sejun Park (Dept. of Mathematical Sciences, KAIST)
In this talk, we will primarily discuss the theoretical analysis of knowledge-distillation-based federated learning algorithms. Before exploring the main topics, we will introduce the basic concepts of federated learning and knowledge distillation. Subsequently, we will develop a nonparametric view of knowledge-distillation-based federated learning algorithms and present a generalization analysis of these algorithms based on the theory of regularized kernel regression methods.
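To make the kernel-regression view concrete, here is a minimal sketch (my own illustration, not the speaker's algorithm) of one ensemble-distillation round with regularized kernel regression as the local learner: each client fits locally, the server averages client predictions on a shared unlabeled set, and a student is fitted to those soft targets.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3):
    # Regularized kernel regression: alpha = (K + lam*I)^{-1} y.
    K = rbf_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_test):
    return rbf_kernel(X_test, X_train) @ alpha

def ensemble_distill(client_data, X_public, lam=1e-3):
    """One round of ensemble-distillation federated learning (sketch):
    clients fit locally, the server averages their predictions on a
    shared unlabeled set, and a student is fitted to the soft targets."""
    preds = []
    for X_c, y_c in client_data:
        alpha_c = krr_fit(X_c, y_c, lam)            # local training
        preds.append(krr_predict(X_c, alpha_c, X_public))
    soft_targets = np.mean(preds, axis=0)           # ensemble "teacher"
    return krr_fit(X_public, soft_targets, lam), soft_targets
```

Because both the clients and the student are regularized kernel regressors, the generalization behavior of the distilled student can be analyzed with the standard kernel-regression toolkit, which is the kind of analysis the talk describes.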
2023-11-16 / 14:30 ~ 15:45
Department Seminar/Colloquium - Other
by ()
(Information) "Introduction to Oriented Matroids" lecture series, Thursdays 14:30-15:45