Department Seminars & Colloquia






Recently, with the enormous development of deep learning techniques, solving underdetermined linear systems (more unknowns than equations) has become one of the major concerns in medical imaging. Typical examples include undersampled MRI, local tomography, and sparse-view CT, where deep learning techniques have shown excellent performance. Although deep learning methods appear to overcome the limitations of existing mathematical methods in handling various underdetermined problems, there is a lack of rigorous mathematical foundations that would allow us to understand why deep learning methods perform so well. This talk deals with learning the causal relationship regarding the structure of training data suitable for deep learning to solve highly underdetermined inverse problems. We examine whether or not a desired reconstruction map can be learned from the training data and the underdetermined system. Most problems of solving underdetermined linear systems in medical imaging are highly non-linear.
Host: 이창옥     To be announced     2020-09-09 11:03:00
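The notion of an underdetermined linear system in the abstract above can be sketched numerically. The matrix and measurement vector below are made-up toy values (not from the talk); among the infinitely many solutions, `np.linalg.pinv` selects the minimum-norm one:

```python
import numpy as np

# Toy underdetermined system: 2 equations, 4 unknowns (made-up values).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0, 2.0]])
b = np.array([3.0, 5.0])

# The pseudoinverse picks the minimum-norm x among infinitely many solutions.
x = np.linalg.pinv(A) @ b

print(np.allclose(A @ x, b))  # True: the equations are satisfied exactly
print(x)                      # one particular (minimum-norm) solution
```

Classical methods can only single out one such solution by an explicit prior (here, minimum norm); the abstract asks when training data can instead determine the reconstruction map.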
In this talk, I will first explain what representation theory is, with an easy example from finite groups. Next, I will introduce certain algebraic objects called quantum affine algebras and quiver Hecke algebras, and explain a connection between these objects via the representation theory of associative algebras.
Korean     2020-09-09 11:00:37
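As an illustration of the kind of "easy example from finite groups" the abstract mentions (the concrete choice here is ours, not necessarily the speaker's): the symmetric group $S_3$ acts on the plane as the symmetry group of an equilateral triangle, giving a two-dimensional representation $\rho\colon S_3 \to GL_2(\mathbb{R})$ determined on generators by

```latex
% A representation of S_3: the 3-cycle (1 2 3) acts as rotation by 120 degrees,
% the transposition (1 2) as a reflection.
\[
\rho\big((1\,2\,3)\big) =
\begin{pmatrix} -\tfrac{1}{2} & -\tfrac{\sqrt{3}}{2} \\[2pt]
                 \tfrac{\sqrt{3}}{2} & -\tfrac{1}{2} \end{pmatrix},
\qquad
\rho\big((1\,2)\big) =
\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.
\]
% One checks \rho(g)\rho(h) = \rho(gh) for all g, h in S_3,
% so group multiplication becomes matrix multiplication.
```

In this way an abstract group is studied through concrete linear maps, which is the starting point generalized by the representation theory of quantum affine algebras and quiver Hecke algebras.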
Deep neural networks usually act on fixed-dimensional items. However, many real-world problems are formulated as learning mappings from sets of items to outputs. Such problems include multiple-instance learning, visual scene understanding, few-shot classification, and even generic Bayesian inference procedures. Recently, several methods have been proposed to construct neural networks that take sets as inputs. The key properties required of such neural networks are permutation invariance and equivariance, meaning that the intermediate outputs and final values of a network should remain unchanged with respect to the processing order of the items in a set. This talk discusses recent advances in permutation invariant and equivariant neural networks and their theoretical properties, especially their universality. The latter part of the talk will also introduce interesting applications of permutation invariant/equivariant neural networks.
Host: 전현호     To be announced     2020-09-09 10:58:40
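The permutation invariance property described in the abstract can be sketched with a minimal sum-pooling network (the weights and dimensions below are hypothetical, for illustration only): every set element passes through the same feature map, the features are sum-pooled (a symmetric operation), and a readout acts on the pooled vector, so reordering the input set cannot change the output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny permutation-invariant network (illustration only).
W_phi = rng.normal(size=(3, 8))   # per-element feature map phi: R^3 -> R^8
W_rho = rng.normal(size=(8, 1))   # readout rho: R^8 -> R

def set_network(X):
    """Permutation-invariant map; X has shape (n_items, 3)."""
    features = np.tanh(X @ W_phi)   # phi applied identically to every item
    pooled = features.sum(axis=0)   # sum pooling ignores the item order
    return np.tanh(pooled @ W_rho)  # rho acts on the pooled representation

X = rng.normal(size=(5, 3))   # a set of 5 items
perm = rng.permutation(5)
print(np.allclose(set_network(X), set_network(X[perm])))  # True: order is irrelevant
```

Replacing the sum with any non-symmetric pooling (e.g. concatenation) would break the invariance, which is why symmetric pooling is the structural ingredient these architectures share.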