Self-Consistency of the Fokker-Planck Equation
Posted: October 4, 2022
Posted by: He Liu
Speaker: Zebang Shen (UPenn)
Time: 2022-10-06, 10:00 to 11:00
Venue: Room 78301, Jingchunyuan Courtyard 78 (Huaixinyuan), Beijing International Center for Mathematical Research
The Fokker-Planck equation (FPE) is the partial differential equation that governs the density evolution of the Itô process and is of great importance in statistical physics and machine learning. The FPE can be regarded as a continuity equation in which the change of the density is completely determined by a time-varying velocity field. Importantly, this velocity field also depends on the current density function. As a result, the ground-truth velocity field can be shown to be the solution of a fixed-point equation, a property that we call self-consistency.
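As a concrete illustration (a standard instance with gradient drift and unit diffusion, which may differ from the exact setting considered in the talk), for the Itô process $dX_t = -\nabla f(X_t)\,dt + \sqrt{2}\,dB_t$ the density $\rho_t$ evolves according to
\[
\partial_t \rho_t \;=\; \nabla\cdot(\rho_t \nabla f) + \Delta \rho_t \;=\; -\nabla\cdot(\rho_t v_t),
\qquad
v_t(x) \;=\; -\nabla f(x) - \nabla \log \rho_t(x).
\]
Since the velocity field $v_t$ is built from $\rho_t$, while $\rho_t$ is in turn transported by $v_t$, the ground-truth velocity field is characterized as the fixed point of this map; this is the self-consistency property referred to above.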
In this talk, we exploit this concept to design a potential function of the hypothesis velocity fields, and prove that, if such a function diminishes to zero during the training procedure, the trajectory of the densities generated by the hypothesis velocity fields converges to the solution of the FPE in the KL sense. The proposed potential function is amenable to neural-network-based parameterization, as the stochastic gradient with respect to the parameters can be computed efficiently. Once a parameterized model, such as a Neural Ordinary Differential Equation, is trained, we can generate the entire solution trajectory of the FPE.
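A minimal computational sketch of the parameterization idea, written in PyTorch (the class name VelocityField, the forward-Euler integrator, and the standard-Gaussian initial density are illustrative assumptions, not details taken from the paper): a small MLP models the hypothesis velocity field v_theta(t, x), and samples together with their log-densities are transported along the induced flow using the continuity equation d/dt log rho_t(x_t) = -div v_theta(t, x_t). The paper's potential function itself (not implemented here) would penalize the discrepancy between v_theta and the self-consistent velocity built from the transported density.

import math
import torch
import torch.nn as nn

class VelocityField(nn.Module):
    """MLP hypothesis velocity field v_theta(t, x) (illustrative architecture)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t, x):
        # t: 0-dim time tensor, x: (batch, dim) particle positions
        t_col = t.expand(x.shape[0], 1)
        return self.net(torch.cat([t_col, x], dim=1))

def divergence(v, x):
    # Exact divergence of v with respect to x via autograd (fine for small dim).
    div = torch.zeros(x.shape[0])
    for i in range(x.shape[1]):
        div = div + torch.autograd.grad(v[:, i].sum(), x, create_graph=True)[0][:, i]
    return div

def transport(vf, x0, t_grid):
    # Push particles and their log-densities through the hypothesis field with
    # forward Euler; log-densities follow d/dt log rho_t(x_t) = -div v_theta(t, x_t).
    x = x0.requires_grad_(True)
    dim = x0.shape[1]
    # Assumed standard-Gaussian initial density.
    logp = -0.5 * (x0 ** 2).sum(dim=1) - 0.5 * dim * math.log(2 * math.pi)
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        v = vf(t0, x)
        logp = logp - (t1 - t0) * divergence(v, x)
        x = x + (t1 - t0) * v
    return x, logp

if __name__ == "__main__":
    vf = VelocityField(dim=2)
    x0 = torch.randn(256, 2)              # samples from the initial density
    t_grid = torch.linspace(0.0, 1.0, 21)
    xT, logpT = transport(vf, x0, t_grid)
    print(xT.shape, logpT.shape)          # torch.Size([256, 2]) torch.Size([256])

Training would then amount to minimizing a self-consistency residual built from such transported samples and log-densities over the parameters of VelocityField by stochastic gradient descent.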
Reference:
[1] Zebang Shen, Zhenfu Wang, Satyen Kale, Alejandro Ribeiro, Amin Karbasi, Hamed Hassani. "Self-Consistency of the Fokker-Planck Equation." Conference on Learning Theory (COLT), 2022.
Short Bio:
Zebang Shen received his Ph.D. degree from Zhejiang University in 2019, where he also obtained his Bachelor's degree in 2014. He was a postdoctoral researcher at the University of Pennsylvania from 2020 to 2022. His research focuses on optimization in Euclidean and Wasserstein spaces, and he currently works on solving PDEs with neural networks.