Department Seminars and Colloquia
The essential dimension of an algebraic object E over a field L is, heuristically, the number of parameters it takes to define it. This notion was formalized and developed by Buhler and Reichstein in the late 90s, who observed at the time that several classical results could be interpreted as theorems about essential dimension. Since the paper of Buhler and Reichstein, most of the progress on essential dimension has concerned the essential dimension of versal G-torsors for an algebraic group G. But recently Farb, Kisin, and Wolfson showed that interesting theorems can be proved for certain (usually) non-versal torsors arising from congruence covers of Shimura varieties.
I'll explain this work, some extensions of it proved by Fakhruddin and me, and a conjecture on period maps which generalizes the picture.
For hyperbolic systems of conservation laws in one space dimension endowed with a single convex entropy, it is an open question whether it is possible to construct solutions via convex integration. Such solutions, if they exist, would be highly non-unique and exhibit little regularity. In particular, they would not have the strong traces necessary for the nonperturbative $L^2$ stability theory of Vasseur. Whether convex integration is possible is a question about large data and the global geometric structure of genuine nonlinearity for the underlying PDE. In this talk, I will discuss recent work which shows the impossibility, for a large class of $2\times 2$ systems, of doing convex integration via $T_4$ configurations. Our work applies to every well-known $2\times 2$ hyperbolic system of conservation laws which verifies the Liu entropy condition. This talk is based on joint work with László Székelyhidi.
This three-day lecture series aims to explore some topics in mathematical image processing before the era of neural networks, highlighting the techniques and applications that were prevalent at that time. From the classical filter-based models to PDE-based or minimization-based models, a variety of example-driven explanations and underlying mathematical theories are provided. By attending the lecture series, participants will gain a comprehensive understanding of image processing techniques used before the advent of neural networks, exploring the challenges, innovations and applications of classical algorithms. This knowledge will provide a foundation for further exploration in the field of image processing and its evolution into the AI-driven era.
3-day lecture series (2 of 3)
3-day lecture series (1 of 3)
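As a minimal illustration of the PDE-based models the series covers, the sketch below smooths a grayscale image by explicit time-stepping of the heat equation $u_t = \Delta u$, the simplest PDE-based denoiser. The function name and parameters are my own choices for illustration; the lectures treat far more refined models (e.g. anisotropic diffusion and minimization-based variants).

```python
import numpy as np

def heat_diffusion_denoise(image, steps=20, dt=0.2):
    """Smooth a grayscale image by explicit integration of the heat
    equation u_t = Laplacian(u), a classical PDE-based denoiser.
    dt must stay below 0.25 for stability of the explicit scheme."""
    u = image.astype(float).copy()
    for _ in range(steps):
        # 5-point discrete Laplacian with replicated (Neumann) boundaries
        up    = np.pad(u, ((1, 0), (0, 0)), mode="edge")[:-1, :]
        down  = np.pad(u, ((0, 1), (0, 0)), mode="edge")[1:, :]
        left  = np.pad(u, ((0, 0), (1, 0)), mode="edge")[:, :-1]
        right = np.pad(u, ((0, 0), (0, 1)), mode="edge")[:, 1:]
        u += dt * (up + down + left + right - 4.0 * u)
    return u

# A noisy constant image: diffusion reduces the noise variance.
rng = np.random.default_rng(0)
noisy = 100.0 + rng.normal(0.0, 10.0, size=(64, 64))
smoothed = heat_diffusion_denoise(noisy)
print(noisy.std(), smoothed.std())
```

Note the trade-off that motivates the later models: isotropic diffusion removes noise but also blurs edges, which is precisely what anisotropic and minimization-based approaches address.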
I will discuss the ‘global’ nonlinear asymptotic stability of the traveling front solutions to the Korteweg–de Vries–Burgers equation, and other dispersive-dissipative perturbations of the Burgers equation. Earlier works made strong use of the monotonicity of the profile, for relatively weak dispersion effects. We exploit the modulation of the translation parameter, establishing a new stability criterion that does not require monotonicity. Instead, a certain Schrödinger operator in one dimension must have exactly one negative eigenvalue, so that a rank-one perturbation of the operator can be made positive definite. Counting the number of bound states of the Schrödinger equation, we find a sufficient condition in terms of the ‘width’ of a front. We analytically verify that our stability criterion is met for an open set in the parameter regime including all monotone fronts. Our numerical experiments, revealing more stable fronts, suggest a computer-assisted proof. This is joint work with Blake Barker, Jared Bronski, and Zhao Yang.
We introduce configurations of lines in the combinatorial and geometric settings. After a brief summary of the classical theory, we will discuss results in the 4-dimensional setting. These include work of Ruberman and Starkston in the topological category, and work in progress in the smooth category that is joint with D. McCoy and J. Park.
We discuss an explicit formula for the structure of Bloch–Kato Selmer groups of the central critical twist of modular forms if the analytic rank is ≤ 1 or the Iwasawa main conjecture localized at the augmentation ideal holds. This formula reveals more refined arithmetic information than the p-part of the Tamagawa number conjecture for motives of modular forms and reduces the corresponding Beilinson–Bloch–Kato conjecture to a purely analytic statement. Our formula is insensitive to the local behavior at p.
In this talk, we explore a duality between federated learning and subspace correction, concepts from two very different fields. Federated learning is a paradigm of supervised machine learning in which data is decentralized across a number of clients, and each client independently updates a local correction of a global model using its local data. Subspace correction is an abstraction of general iterative algorithms, such as multigrid and domain decomposition methods, for solving scientific problems numerically. Based on this duality, we propose a novel federated learning algorithm called DualFL (Dualized Federated Learning). DualFL is the first federated learning algorithm that achieves communication acceleration even when the cost function is nonsmooth or non-strongly convex.
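The client/server structure described above can be sketched with a generic federated-averaging round on a least-squares problem. This is only an illustration of the federated setting, not of DualFL itself; all names and parameters here are my own.

```python
import numpy as np

def local_step(w, X, y, lr=0.1, epochs=20):
    """One client's local correction: a few gradient-descent epochs
    on its private least-squares loss ||Xw - y||^2 / (2n)."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def federated_round(w_global, clients):
    """One communication round: each client updates independently on
    its own data, then the server averages the local models."""
    local_models = [local_step(w_global, X, y) for X, y in clients]
    return np.mean(local_models, axis=0)

# Decentralized data: four clients, each holding a private sample
# from the same underlying linear model.
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(30):
    w = federated_round(w, clients)
print(w)  # approaches w_true
```

The analogy to subspace correction is visible in the structure: each client plays the role of a subproblem solved independently, and the server's aggregation plays the role of the global correction step.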
Tropicalizations of affine varieties give interesting ways to sketch and study affine varieties, with tools that are astonishingly elementary at the algebraic level. Moreover, studying algebraic dynamics on varieties can produce interesting pictures under tropicalization, as in the work of Spalding and Veselov, or of Filip. In this talk, we will introduce some of the most basic ideas of tropicalization, and play with the Markov cubic surfaces
$$X^2+Y^2+Z^2+XYZ=AX+BY+CZ+D,$$
where A, B, C, D are parameters, as an example of the tropical study of algebraic dynamics. It turns out that we obtain an $(\infty,\infty,\infty)$-triangle group action on the hyperbolic plane as a model of the dynamics of interest. Language: Korean (possibly English, depending on the audience)
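As a purely illustrative sketch of one standard (min-plus) convention, which may differ from the normalization used in the talk: tropicalization replaces addition by $\min$ and multiplication by $+$, so writing $x, y, z, a, b, c, d$ for the valuations of $X, Y, Z, A, B, C, D$, the tropical Markov surface is the locus where the minimum

$$\min\bigl(2x,\; 2y,\; 2z,\; x+y+z,\; a+x,\; b+y,\; c+z,\; d\bigr)$$

is attained at least twice. This "minimum attained twice" condition is what makes the tropical picture a piecewise-linear object that can be drawn and analyzed by elementary means.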
Sequential decision making under uncertainty is a problem class with solid real-world foundations and applications. We review the concept of the Knowledge Gradient (KG) from the perspective of the multi-armed bandit (MAB) problem and reinforcement learning. We then discuss the first KG algorithm with sublinear regret bounds for Gaussian MAB problems.
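For a concrete sense of the KG idea, the sketch below computes the standard one-step knowledge-gradient factor for independent Gaussian arms (a textbook formula, not necessarily the algorithm discussed in the talk; the function name and example values are my own).

```python
import math

def kg_factors(mu, sigma2, noise_var):
    """One-step knowledge-gradient factor nu_i for each independent
    Gaussian arm with posterior N(mu_i, sigma2_i) and known measurement
    noise variance. Uses nu_i = s_i * f(-|delta_i| / s_i), where
    f(z) = z*Phi(z) + phi(z), s_i is the predictive change in the
    posterior mean, and delta_i = mu_i - max_{j != i} mu_j."""
    phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    nu = []
    for i, (m, v) in enumerate(zip(mu, sigma2)):
        s = v / math.sqrt(v + noise_var)   # scale of the mean update
        best_other = max(m2 for j, m2 in enumerate(mu) if j != i)
        z = -abs(m - best_other) / s
        nu.append(s * (z * Phi(z) + phi(z)))
    return nu

# Three arms: the nearly tied, highly uncertain arm is worth measuring.
mu = [1.0, 0.9, 0.0]
sigma2 = [0.01, 1.0, 0.01]
nu = kg_factors(mu, sigma2, noise_var=1.0)
print(nu.index(max(nu)))  # → 1 (nearly tied mean, high uncertainty)
```

A KG policy measures the arm maximizing this factor, which captures the expected improvement in the decision-maker's knowledge from one more observation, rather than the immediate reward.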
(Online participation) Zoom Link: https://kaist.zoom.us/j/87516570701
A digital twin is a virtual representation of real-world physical objects. Through accurate and streamlined simulations, it effectively enhances our understanding of the real world, enabling us to predict complex and dynamic phenomena in a fraction of the time. In this talk, we will explore real-world applications of AI-based partial differential equation (PDE) solvers in various fields. Additionally, we will examine how such AI can be utilized to facilitate downstream tasks related to PDEs.