Microsoft Research
Published June 2, 2021, 14:52
We show how to do gradient-based stochastic variational inference in stochastic differential equations (SDEs) in a way that allows the use of adaptive SDE solvers. This lets us scalably fit a new family of richly parameterized distributions over irregularly sampled time series. We apply latent SDEs to motion-capture data and use them to demonstrate infinitely deep Bayesian neural networks. We also discuss the pros and cons of this barely explored model class, comparing it to Gaussian processes and neural processes.
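For context, the objects being fit are SDEs of the form dY = f(t, Y) dt + g(t, Y) dW. The talk's contribution concerns adaptive solvers and variational inference, which are not reproduced here; the following is only a minimal NumPy sketch of simulating such an SDE with the standard fixed-step Euler–Maruyama method, using an illustrative Ornstein–Uhlenbeck drift and diffusion.

```python
import numpy as np

def euler_maruyama(f, g, y0, ts, rng):
    """Simulate dY = f(t, Y) dt + g(t, Y) dW on the time grid ts."""
    ys = [np.asarray(y0, dtype=float)]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dt = t1 - t0
        # Brownian increment: Gaussian with variance dt in each coordinate.
        dW = rng.normal(0.0, np.sqrt(dt), size=ys[-1].shape)
        ys.append(ys[-1] + f(t0, ys[-1]) * dt + g(t0, ys[-1]) * dW)
    return np.stack(ys)

rng = np.random.default_rng(0)
ts = np.linspace(0.0, 1.0, 101)
# Illustrative Ornstein–Uhlenbeck process: mean-reverting drift, constant diffusion.
path = euler_maruyama(lambda t, y: -y,
                      lambda t, y: 0.5 * np.ones_like(y),
                      y0=[1.0], ts=ts, rng=rng)
print(path.shape)  # (101, 1): one sample path on 101 time points
```

A latent SDE model would place a neural network in f and g and optimize them with the gradient-based variational objective described in the talk, typically via a library such as torchsde rather than this hand-rolled solver.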
David Duvenaud is an assistant professor in computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. His postdoc was done at Harvard University, and his Ph.D. at the University of Cambridge. David is a founding member of the Vector Institute for Artificial Intelligence, and also co-founded Invenia, an energy forecasting company.
Learn more about the 2020-2021 Directions in ML: AutoML and Automating Algorithms virtual speaker series: aka.ms/diml