no code implementations • 10 Dec 2023 • Yinuo Ren, Yiping Lu, Lexing Ying, Grant M. Rotskoff
Inferring a diffusion equation from discretely observed measurements is a statistical challenge of significant importance in a variety of fields, from single-molecule tracking in biophysical systems to modeling financial instruments.
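As a toy illustration of the inference problem (a minimal sketch assuming pure Brownian motion, not the paper's method): if a trajectory obeying dX = √(2D) dW is observed at a fixed interval dt, the increments have variance 2D dt, so the diffusion coefficient can be recovered by a simple moment estimator.

```python
import numpy as np

# Simulate a discretely observed 1D Brownian trajectory with known D,
# then estimate D from the second moment of the observed increments.
rng = np.random.default_rng(0)
D_true, dt, n_steps = 0.5, 0.01, 200_000
increments = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=n_steps)
x = np.cumsum(increments)            # discretely observed trajectory
dx = np.diff(x)                      # observed increments
D_hat = np.mean(dx**2) / (2 * dt)    # moment estimator: Var(dx) = 2 D dt
print(f"estimated D = {D_hat:.3f} (true D = {D_true})")
```

Real single-molecule data add measurement noise and state-dependent diffusivity, which is what makes the full statistical problem hard.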
1 code implementation • 12 Nov 2021 • Shriram Chennakesavalu, Grant M. Rotskoff
Experimental advances enabling high-resolution external control create new opportunities to produce materials with exotic properties.
Multi-agent Reinforcement Learning • Reinforcement Learning +2
no code implementations • ICML Workshop INNF 2021 • Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden
Normalizing flows can generate complex target distributions and thus show promise in many applications in Bayesian statistics as an alternative or complement to MCMC for sampling posteriors.
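The mechanism behind a normalizing flow is the change-of-variables formula. A minimal sketch (assumed notation, not the paper's architecture): a single affine flow x = a·z + b pushes a standard normal base density forward, and the model density is the base density at the inverted point plus a Jacobian correction.

```python
import numpy as np

# Log-density of the pushforward of N(0, 1) under the affine flow
# x = a * z + b, via change of variables:
#   log p_x(x) = log p_z((x - b) / a) - log|a|
def log_prob_affine_flow(x, a, b):
    z = (x - b) / a                                # invert the flow
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))   # standard normal log-pdf
    return log_base - np.log(np.abs(a))            # Jacobian correction

# x = 2z + 1 pushes N(0, 1) to N(1, 4); evaluate the log-density at x = 1.
print(log_prob_affine_flow(1.0, 2.0, 1.0))
```

Practical flows compose many such invertible layers with learned parameters, keeping the Jacobian determinant tractable so the exact log-likelihood above remains computable.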
no code implementations • NeurIPS 2020 • Zhengdao Chen, Grant M. Rotskoff, Joan Bruna, Eric Vanden-Eijnden
Furthermore, if the mean-field dynamics converges to a measure that interpolates the training data, we prove that the asymptotic deviation eventually vanishes in the CLT scaling.
1 code implementation • 11 Aug 2020 • Grant M. Rotskoff, Andrew R. Mitchell, Eric Vanden-Eijnden
Deep neural networks, when optimized with sufficient data, provide accurate representations of high-dimensional functions; in contrast, function approximation techniques that have predominated in scientific computing do not scale well with dimensionality.
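The scaling problem can be made concrete with a back-of-the-envelope count (an illustration, not from the paper): a tensor-product grid at resolution h = 0.1 per axis needs 10^d points, so grid-based approximation becomes infeasible at even modest dimension d.

```python
# Curse of dimensionality for grid-based function approximation:
# a grid with 10 points per axis requires 10**d points in dimension d.
for d in (1, 2, 5, 10, 20):
    print(f"d = {d:2d}: grid points = {10**d:.3e}")
```

By contrast, the parameter count of a neural network representation need not grow exponentially with d, which is the advantage the abstract alludes to.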
2 code implementations • 28 Sep 2018 • Grant M. Rotskoff, Eric Vanden-Eijnden
Nonequilibrium sampling is potentially much more versatile than its equilibrium counterpart, but it comes with challenges because the invariant distribution is not typically known when the dynamics breaks detailed balance.
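A small worked example of dynamics that break detailed balance (a hedged toy model, not the paper's scheme): overdamped Langevin dynamics in 2D with a divergence-free rotational drift added to a gradient force. The rotation breaks detailed balance, yet for this particular quadratic potential the standard Gaussian remains invariant, which can be checked empirically from sample variances.

```python
import numpy as np

# Euler-Maruyama integration of dX = (-grad V + c J X) dt + sqrt(2) dW
# with V(x) = |x|^2 / 2 and J the 2x2 antisymmetric rotation generator.
# The drift c J x is non-gradient (breaks detailed balance) but leaves
# the Gaussian exp(-V) invariant, so each coordinate variance stays ~1.
rng = np.random.default_rng(1)
dt, n_steps, c = 1e-3, 400_000, 1.0
x = np.zeros(2)
samples = np.empty((n_steps, 2))
for t in range(n_steps):
    grad = x                               # grad V for V(x) = |x|^2 / 2
    rot = c * np.array([-x[1], x[0]])      # antisymmetric, divergence-free
    x = x + (-grad + rot) * dt + np.sqrt(2 * dt) * rng.normal(size=2)
    samples[t] = x
print(samples.var(axis=0))                 # close to [1, 1]
```

The point of the abstract is the generic case: for most detailed-balance-breaking drifts no such closed-form invariant density is available, so it must be estimated or learned.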
Statistical Mechanics
no code implementations • 2 May 2018 • Grant M. Rotskoff, Eric Vanden-Eijnden
We show that, when the number $n$ of units is large, the empirical distribution of the particles descends on a convex landscape towards the global minimum at a rate independent of $n$, with a resulting approximation error that universally scales as $O(n^{-1})$.
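The O(n^{-1}) rate has the flavor of a law-of-large-numbers/CLT statement for an average over n units. A toy numerical check (an assumed i.i.d. setup for illustration, much simpler than the interacting-particle dynamics in the paper): the mean-squared deviation of an average of n i.i.d. units from its mean scales as 1/n.

```python
import numpy as np

# Empirical check of the O(1/n) scaling: average n i.i.d. units with
# mean 1 and unit variance over many trials; the MSE of the average
# should track 1/n.
rng = np.random.default_rng(2)
for n in (10, 100, 1000, 10000):
    units = rng.normal(1.0, 1.0, size=(1000, n))    # 1000 trials of n units
    mse = ((units.mean(axis=1) - 1.0) ** 2).mean()
    print(f"n = {n:5d}: MSE = {mse:.2e}  (1/n = {1/n:.2e})")
```

In the paper's setting the units are neuron parameters evolving under gradient descent rather than i.i.d. draws, which is why establishing the same scaling requires the mean-field analysis.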