Wednesday, November 23, 2022

2022-11-29 / 17:00 ~ 18:00
IBS-KAIST Seminar - Algebraic Geometry: A 1-dimensional component of K-moduli of del Pezzo surfaces
by Andrea Petracci (Università di Bologna)
Fano varieties are algebraic varieties with positive curvature; they are basic building blocks of algebraic varieties. Great progress has recently been made by Xu et al. in constructing moduli spaces of Fano varieties using K-stability (which is related to the existence of Kähler-Einstein metrics). These moduli spaces are called K-moduli. In this talk I will explain how to easily deduce some geometric properties of K-moduli using toric geometry and deformation theory. In particular, I will show how to construct a 1-dimensional component of K-moduli which parametrises certain K-polystable del Pezzo surfaces. * ZOOM information will not be provided. Please send an email to Jinhyung Park if you are interested.
2022-11-28 / 10:40 ~ 11:40
Department Seminar/Colloquium - PhD Dissertation Defense:
by 김창섭 (KAIST)

2022-11-28 / 09:30 ~ 10:30
Department Seminar/Colloquium - PhD Dissertation Defense:
by 정성구 (KAIST)

2022-11-30 / 16:00 ~ 17:00
IBS-KAIST Seminar - Mathematical Biology:
by ()
TBA
2022-11-23 / 16:00 ~ 17:00
IBS-KAIST Seminar - Mathematical Biology:
by ()
TBD
2022-11-23 / 17:00 ~ 18:00
Department Seminar/Colloquium - Number Theory:
by ()
Let $E$ be a number field and $X$ a smooth geometrically connected variety defined over a finite field of characteristic $p$. Given an $n$-dimensional pure $E$-compatible system of semisimple $\lambda$-adic representations of the \'etale fundamental group of $X$ with connected algebraic monodromy groups $\mathbf{G}_\lambda$, we construct a common $E$-form $\mathbf{G}$ of all the groups $\mathbf{G}_\lambda$ and, in the absolutely irreducible case, a common $E$-form $\mathbf{G}\hookrightarrow\mathrm{GL}_{n,E}$ of all the tautological representations $\mathbf{G}_\lambda\hookrightarrow\mathrm{GL}_{n,E_\lambda}$. Analogous rationality results are also obtained in characteristic $p$, assuming the existence of crystalline companions in $\mathrm{\textbf{F-Isoc}}^{\dagger}(X)\otimes E_{v}$ for all $v|p$, and in characteristic zero, assuming ordinariness. Applications include a construction of a $\mathbf{G}$-compatible system from some $\mathrm{GL}_n$-compatible system and some results predicted by the Mumford-Tate conjecture. (If you would like to join this seminar, please contact Bo-Hae Im to get the Zoom link.)
2022-11-25 / 10:00 ~ 11:00
SAARC Seminar:
by 문일철, 김동준 (KAIST Department of Industrial and Systems Engineering)
Deep generative models (DGMs) sit at the intersection of the probabilistic modeling and machine learning communities. In particular, DGMs have impacted the field through the introduction of VAEs, GANs, normalizing flows, and, more recently, diffusion models, with their capability to learn the data density of datasets. While there are many model variations among DGMs, there are also common fundamental theories, assumptions, and limitations to study from a theoretical perspective. This seminar presents these general and fundamental challenges in DGMs, and then focuses in detail on the key developments in diffusion models and their mathematical properties.
2022-11-24 / 16:15 ~ 17:15
Department Seminar/Colloquium - Colloquium:
by ()
Machine learning (ML) has achieved unprecedented empirical success in diverse applications. It is now being applied to scientific problems, giving rise to an emerging field, Scientific Machine Learning (SciML). Many ML techniques, however, are complex and sophisticated, commonly requiring much trial and error and many tricks. This results in a lack of robustness and interpretability, which are critical factors for scientific applications. This talk centers on mathematical approaches to SciML that promote trustworthiness. The first part concerns how to embed physics into neural networks (NNs). I will present a general framework for designing NNs that obey the first and second laws of thermodynamics. The framework not only provides flexible ways of leveraging available physics information but also yields expressive NN architectures. The second part concerns the training of NNs, one of the biggest challenges in ML. I will present an efficient training method for NNs, Active Neuron Least Squares (ANLS). ANLS is developed from insight gained through the analysis of gradient-descent training.