Microsoft Research
Published July 4, 2019, 1:56
The goal of Physics ∩ ML is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle. More broadly, we invite a cast with wide-ranging expertise in order to spark new ideas. Plenary sessions from experts in each field and shorter specialized talks will introduce existing research. We will hold moderated discussions and breakout groups in which participants can identify problems and hopefully begin new collaborations in both directions. For example, physical insights can motivate advanced algorithms in machine learning, and analysis of geometric and topological datasets with machine learning can yield critical new insights in fundamental physics.
Session 2 - Applying physical insights to ML
2:45 PM–4:05 PM
Short talks
Neural tangent kernel and the dynamics of large neural nets by Clement Hongler
On the global convergence of gradient descent for over-parameterized models using optimal transport by Lénaïc Chizat
Pathological spectrum of the Fisher information matrix in deep neural networks by Ryo Karakida
Q&A
Fluctuation-dissipation relation for stochastic gradient descent by Sho Yaida
From optimization algorithms to continuous dynamical systems and back by Rene Vidal
The effect of network width on stochastic gradient descent and generalization by Daniel Park
Q&A
Short certificates for symmetric graph density inequalities by Rekha Thomas
Geometric representation learning in hyperbolic space by Maximilian Nickel
The fundamental equations of MNIST by Cedric Beny
Q&A
Talk title to be announced by Paul Smolensky
Multi-scale deep generative networks for Bayesian inverse problems by Pengchuan Zhang
Variational quantum classifiers in the context of quantum machine learning by Alex Bocharov
Q&A
See more at microsoft.com/en-us/research/e...