Monday, November 29, 2021 - 3:30pm to 4:30pm
- Virtual via Zoom
Abstract: In this talk, we first propose a new class of metrics and show that, under such metrics, the convergence of empirical measures in high dimensions is free of the curse of dimensionality, in contrast to the Wasserstein distance. The proposed metrics originate from the maximum mean discrepancy, which we generalize by proposing criteria for test function spaces; examples include the RKHS, the Barron space, and flow-induced function spaces. One application studies the construction of Nash equilibria for the homogeneous n-player game via its mean-field limit (the mean-field game). We then discuss mean-field games with common noise and propose a deep learning algorithm based on fictitious play and signatures from rough path theory. The first part is joint work with Jiequn Han and Jihao Long; the second part is joint work with Ming Min.