Monday, July 20, 2020

2020-07-27 / 16:30 ~ 17:30
Department Seminar/Colloquium - Computational Mathematics Seminar: Misaligned Domains Occurring in Practice and Test-Time Adaptive Domain Adaptation
by 공서택 (VUNO Inc.)
Modern deep learning (DL) algorithms rely extensively on large amounts of annotated data. Even when a large dataset is available, DL algorithms often fail badly when deployed in settings whose data characteristics differ significantly from those used for training. Domain adaptation (DA) and domain generalization (DG) algorithms aim to mitigate the gap between the source (train) and target (test) distributions by learning domain-agnostic features or by minimizing the discrepancy between the model's predictions on the source and target distributions. This issue is prevalent in practical medical imaging settings: obtaining both images and annotations is extremely expensive, so accessible data is often limited to a bulk of images collected from a few hospitals or detector devices, yet the model must be suitable for multi-center, multi-device settings. In this seminar, we will cover the existing literature on DA and DG, discussing their capabilities, assumptions, and methodologies, along with their limitations. The session will conclude with research directions relevant to pragmatic industrial settings.
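As a minimal sketch of the discrepancy-minimization idea mentioned in the abstract (not the speaker's actual method), the step below combines a supervised loss on labeled source data with a crude penalty that pulls source and target feature statistics together; the `encoder`, `classifier`, `optimizer`, and data tensors are hypothetical placeholders.

```python
# A minimal, assumed sketch of discrepancy-based domain adaptation:
# supervised loss on the labeled source batch plus a feature-alignment penalty.
import torch
import torch.nn.functional as F

def da_training_step(encoder, classifier, optimizer,
                     source_x, source_y, target_x, alignment_weight=1.0):
    """One gradient step combining source supervision with feature alignment."""
    optimizer.zero_grad()

    source_feat = encoder(source_x)   # features from the labeled source domain
    target_feat = encoder(target_x)   # features from the unlabeled target domain

    # Supervised loss: only the source domain has annotations.
    cls_loss = F.cross_entropy(classifier(source_feat), source_y)

    # Crude discrepancy penalty: match the mean feature vectors of the two
    # domains (a linear-kernel MMD). Real methods use richer statistics,
    # adversarial critics, or prediction-level consistency instead.
    alignment_loss = (source_feat.mean(dim=0) - target_feat.mean(dim=0)).pow(2).sum()

    loss = cls_loss + alignment_weight * alignment_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```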
2020-07-20 / 16:30 ~ 17:30
Department Seminar/Colloquium - Computational Mathematics Seminar: A Mathematical Perspective of Semi-Supervised Learning: Empirical Successes and Coherence to Generalization Theory
by 공서택 (VUNO Inc.)
With the goal of reducing the amount of annotated data required by current deep learning (DL) algorithms, semi-supervised learning (SSL) algorithms use unlabeled data, which is vastly more accessible than its labeled counterpart, to enhance the performance of deep neural networks (DNNs) trained on only a small amount of labeled data. As an example, state-of-the-art SSL algorithms can achieve up to ~84% accuracy on the CIFAR10 dataset using one image per class, as long as that single image is of "prototypical" quality. This session will introduce common SSL settings considered in recent works and cover DL-based SSL algorithms in chronological order. While existing SSL algorithms are mainly heuristics that lack theoretical justification, the intuition underlying such algorithms will also be discussed in relation to the emerging consensus in DL-based generalization theory and studies.
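To illustrate how unlabeled data can be folded into training, here is a minimal sketch of confidence-thresholded pseudo-labeling, one common DL-based SSL heuristic; it is an assumed example rather than any specific algorithm from the talk, and `model`, `optimizer`, and the batch tensors are placeholders.

```python
# A minimal, assumed sketch of pseudo-labeling SSL: confident predictions on
# unlabeled examples are reused as training targets alongside the labeled batch.
import torch
import torch.nn.functional as F

def ssl_training_step(model, optimizer, labeled_x, labeled_y,
                      unlabeled_x, threshold=0.95, unlabeled_weight=1.0):
    """One gradient step combining labeled supervision with pseudo-labels."""
    optimizer.zero_grad()

    # Standard supervised loss on the small labeled batch.
    sup_loss = F.cross_entropy(model(labeled_x), labeled_y)

    # Pseudo-labels: take the model's own confident predictions as targets.
    with torch.no_grad():
        probs = F.softmax(model(unlabeled_x), dim=1)
        confidence, pseudo_y = probs.max(dim=1)
        mask = confidence >= threshold      # keep only confident predictions

    if mask.any():
        unsup_loss = F.cross_entropy(model(unlabeled_x[mask]), pseudo_y[mask])
    else:
        unsup_loss = torch.zeros((), device=labeled_x.device)

    loss = sup_loss + unlabeled_weight * unsup_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```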