Provable in-context learning of PDEs
Speaker(s): Yulong Lu (University of Minnesota Twin Cities)
Time: 15:00-16:00 July 22, 2025
Venue: Room 77201, Jingchunyuan 78, BICMR
Abstract: Transformer-based foundation models, pre-trained on a wide range of tasks with large datasets, demonstrate remarkable adaptability to diverse downstream applications, even with limited data. One of the most striking features of these models is their in-context learning (ICL) capability: when presented with a prompt containing examples from a new task alongside a query, they can make accurate predictions without requiring parameter updates. This emergent behavior has been recognized as a paradigm shift in transformers, though its theoretical underpinnings remain underexplored. In this talk, I will discuss some recent theoretical understanding of ICL for PDEs, emphasizing its approximation power and generalization capabilities. The theoretical analysis will focus on two scientific problems: elliptic PDEs and stochastic dynamical systems.
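To make the ICL setup in the abstract concrete, the sketch below assembles a prompt for a simple 1-D elliptic problem -(a(x) u'(x))' = f(x) with zero Dirichlet boundary conditions: a few (coefficient, solution) example pairs are followed by an unlabeled query coefficient, and a pre-trained transformer would be asked to predict the query's solution from this prompt alone, without parameter updates. This is an illustrative assumption of the problem setup, not material from the talk; all names (K, n, make_coefficient, solve_elliptic) are hypothetical.

```python
# Minimal ICL-prompt sketch (illustrative, not from the talk) for the
# 1-D elliptic problem  -(a(x) u'(x))' = f(x),  u(0) = u(1) = 0.
import numpy as np

n = 32                         # interior grid points
x = np.linspace(0.0, 1.0, n + 2)
h = x[1] - x[0]
f = np.ones(n)                 # fixed right-hand side shared across the task family

def make_coefficient(rng):
    """Random smooth, positive coefficient a(x) defining one task instance."""
    c = rng.uniform(0.5, 2.0, size=3)
    return 1.0 + c[0] * np.sin(np.pi * x) ** 2 + c[1] * x + c[2] * x ** 2

def solve_elliptic(a):
    """Finite-difference solve of -(a u')' = f with zero Dirichlet data."""
    a_half = 0.5 * (a[:-1] + a[1:])           # a at the cell midpoints
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (a_half[i] + a_half[i + 1]) / h ** 2
        if i > 0:
            A[i, i - 1] = -a_half[i] / h ** 2
        if i < n - 1:
            A[i, i + 1] = -a_half[i + 1] / h ** 2
    return np.linalg.solve(A, f)              # solution at interior grid points

rng = np.random.default_rng(0)
K = 8                                          # number of in-context examples
examples = []
for _ in range(K):
    a = make_coefficient(rng)
    u = solve_elliptic(a)
    examples.append(np.concatenate([a[1:-1], u]))     # (coefficient, solution) pair

a_query = make_coefficient(rng)
query = np.concatenate([a_query[1:-1], np.zeros(n)])  # solution slot left empty

# The ICL prompt: K labeled examples followed by the unlabeled query.
# A pre-trained transformer would map this prompt to a prediction of the
# query's solution directly, with no gradient updates to its parameters.
prompt = np.stack(examples + [query])          # shape (K + 1, 2n)
print(prompt.shape)
```

The theoretical questions in the talk concern exactly this mechanism: how well a transformer acting on such prompts can approximate the coefficient-to-solution map, and how its prediction error generalizes across task instances.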
Bio: Dr. Yulong Lu is currently an Assistant Professor in the School of Mathematics at the University of Minnesota Twin Cities. His research interests lie mainly at the intersection of applied mathematics, statistics, and machine learning. He is a recipient of the 2025 National Science Foundation (NSF) CAREER Award.