Departmental Seminars and Colloquia






Confidence sequences provide a way to characterize uncertainty in stochastic environments, and they are a widely used tool for interactive machine learning algorithms and statistical problems including A/B testing, Bayesian optimization, reinforcement learning, and offline evaluation/learning. In these problems, constructing confidence sequences that are tight and correct is crucial, since it has a significant impact on the performance of downstream tasks. In this talk, I will first show how to derive one of the tightest empirical Bernstein-style confidence bounds, both theoretically and numerically. This derivation is done via the existence of regret bounds in online learning, inspired by the seminal work of Rakhlin & Sridharan (2017). Then, I will discuss how our confidence bound extends to unbounded nonnegative random variables with provable tightness. In offline contextual bandits, this leads to the best-known second-order bound in the literature, with promising preliminary empirical results. Finally, I will turn to the $[0,1]$-valued regression problem and show how the intuition from our confidence bounds extends to a novel betting-based loss function that exhibits variance-adaptivity. I will conclude with future work, including some recent LLM-related topics.
Host: 황강욱     Contact: saarc (8117)     Language: TBD     2025-05-16 15:28:36
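For background on the abstract above: a minimal sketch of the classical empirical Bernstein confidence radius of Maurer and Pontil (2009) for i.i.d. samples in $[0,1]$. This is the standard baseline that the talk's tighter bounds improve upon, not the speaker's method; the function name and interface are illustrative.

```python
import math

def empirical_bernstein_radius(xs, delta):
    """Classical empirical Bernstein confidence radius for i.i.d. samples
    in [0, 1]: with probability >= 1 - delta, the true mean lies within
    this radius of the sample mean. The variance-dependent first term is
    what makes the bound adaptive to low-variance data."""
    n = len(xs)
    mean = sum(xs) / n
    # Unbiased sample variance.
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    log_term = math.log(2.0 / delta)
    return math.sqrt(2.0 * var * log_term / n) + 7.0 * log_term / (3.0 * (n - 1))
```

When the sample variance is small, the $O(\sqrt{1/n})$ first term shrinks and the $O(1/n)$ second term dominates, which is the variance-adaptivity the abstract refers to.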
Given a distribution, say, of data or mass, over a space, it is natural to consider a lower dimensional structure that is most “similar” or “close” to it. For example, consider a planning problem for an irrigation system (1-dimensional structure) over an agricultural region (2-dimensional distribution) where one wants to optimize the coverage and effectiveness of the water supply. This type of problem is related to “principal curves” in statistics and “manifold learning” in AI research. We will discuss some recent results in this direction that employ optimal transport approaches. This talk will be based on joint projects with Anton Afanassiev, Jonathan Hayase, Forest Kobayashi, Lucas O’Brien, Geoffrey Schiebinger, and Andrew Warren.
Host: 남경식     Language: English     2025-05-15 16:15:16
The investigation of $G_2$-structures and exceptional holonomy on 7-dimensional manifolds involves the analysis of a nonlinear Laplace-type operator on 3-forms. We will discuss the existence of solutions to the Poisson equation for this operator. Based on joint work with Timothy Buttsworth (The University of New South Wales).
Host: 박지원     Language: English     2025-05-19 18:18:35
Computing obstructions is a useful tool for determining the dimension and singularity of a Hilbert scheme at a given point. However, this task can be quite challenging when the obstruction space is nonzero. In a previous joint work with S. Mukai and its sequels, we developed techniques to compute obstructions to deforming curves on a threefold, under the assumption that the curves lie on a "good" surface (e.g., del Pezzo, K3, Enriques, etc.) contained in the threefold. In this talk, I will review some known results in the case where the intermediate surface is a K3 surface and the ambient threefold is Fano. Finally, I will discuss the deformations of certain space curves lying on a complete intersection K3 surface, and the construction of a generically non-reduced component of the Hilbert scheme of P^5.
Host: 곽시종     Contact: 김윤옥 (5745)     Language: TBD     2025-05-21 10:36:39