Partial differential equations such as the heat equation have traditionally been our main tool for studying physical systems. However, physical systems are affected by randomness (noise), and stochastic partial differential equations have therefore gained popularity as an alternative. In this talk, we first consider what “noise” means mathematically and then study stochastic heat equations perturbed by space-time white noise, such as the parabolic Anderson model and stochastic reaction-diffusion equations (e.g., the KPP or Allen-Cahn equations). These stochastic heat equations share many properties with the deterministic heat equation but also exhibit distinct behavior, such as intermittency and dissipation, especially as time increases. We investigate how the long-time behavior of these stochastic heat equations differs from that of the deterministic heat equation.
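For concreteness, the equations named above can be sketched as follows (a common one-dimensional formulation with unit coefficients; the talk's normalization may differ):

```latex
% Stochastic heat equation driven by space-time white noise $\dot W$:
\[
  \partial_t u(t,x) \;=\; \tfrac{1}{2}\,\partial_{xx} u(t,x)
  \;+\; \sigma\bigl(u(t,x)\bigr)\,\dot W(t,x),
\]
% and, with a reaction (drift) term $f$, the stochastic reaction-diffusion form:
\[
  \partial_t u \;=\; \tfrac{1}{2}\,\partial_{xx} u \;+\; f(u)
  \;+\; \sigma(u)\,\dot W.
\]
```

Here the parabolic Anderson model corresponds to multiplicative noise $\sigma(u) = \lambda u$ with $f \equiv 0$, while the KPP equation takes $f(u) = u(1-u)$ and the Allen-Cahn equation $f(u) = u - u^3$.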
Recently, deep learning approaches have become the main research frontier for image reconstruction and enhancement problems thanks to their high performance and ultra-fast inference times. However, because matched reference data for supervised learning are difficult to obtain, there has been increasing interest in unsupervised learning approaches that do not require paired reference data. In particular, self-supervised learning and generative models have been successfully applied to various inverse problems. In this talk, we review these approaches from a coherent perspective in the context of classical inverse problems and discuss their various applications. In particular, the cycleGAN approach and the recent Noise2Score approach for unsupervised learning will be explained in detail using optimal transport theory and Tweedie’s formula with score matching, respectively.
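Since the talk invokes Tweedie’s formula, a minimal numerical sketch may help. For a measurement y = x + n with n ~ N(0, σ²I), Tweedie’s formula gives the posterior mean E[x | y] = y + σ²∇_y log p(y). Noise2Score-style methods estimate the score ∇_y log p(y) with a neural network; in the toy Gaussian example below (all parameter choices are illustrative, not from the talk) the score is available in closed form, so the formula can be checked directly:

```python
import numpy as np

# Toy setup (illustrative parameters): prior x ~ N(mu, tau^2),
# measurement y = x + n with n ~ N(0, sigma^2).
rng = np.random.default_rng(0)
mu, tau = 2.0, 1.0
sigma = 0.5

x = rng.normal(mu, tau, size=10_000)
y = x + rng.normal(0.0, sigma, size=x.shape)

# The marginal p(y) is N(mu, tau^2 + sigma^2), so its score is analytic.
# In Noise2Score this score would be the output of a trained network.
def score(y):
    return (mu - y) / (tau**2 + sigma**2)

# Tweedie's formula: E[x | y] = y + sigma^2 * score(y).
x_hat = y + sigma**2 * score(y)

# For this Gaussian model, Tweedie's estimate coincides with the
# Bayesian posterior mean (the MMSE estimator).
posterior_mean = (tau**2 * y + sigma**2 * mu) / (tau**2 + sigma**2)

mse_noisy = np.mean((y - x) ** 2)
mse_tweedie = np.mean((x_hat - x) ** 2)
print(mse_noisy, mse_tweedie)
```

Denoising with the exact score strictly reduces the mean-squared error relative to the raw measurement, which is the property the Noise2Score approach exploits once the score is learned from noisy data alone.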