Microsoft Research
Published April 21, 2018
The performance of stochastic gradient descent (SGD) can be improved by actively selecting mini-batches. In this work, we explore active mini-batch selection using repulsive point processes. This simultaneously introduces active bias and leads to stochastic gradients with lower variance. We show empirically that our approach improves over standard SGD both in terms of convergence speed as well as final model performance.
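The idea can be sketched with a simple stand-in for the paper's point process: a greedy farthest-point sampler, in which points already chosen "repel" nearby candidates so the mini-batch spreads out over the data. This is an illustrative assumption, not the exact repulsive point process used in the work (which builds on determinantal point processes), and all names below (`repulsive_minibatch`, the toy least-squares problem) are hypothetical.

```python
import numpy as np

def repulsive_minibatch(X, k, rng):
    """Greedy farthest-point selection: a simple repulsive sampler.
    Chosen points 'repel' nearby candidates, so the mini-batch covers
    the data more evenly, which tends to lower gradient variance.
    NOTE: an illustrative stand-in for the paper's point process."""
    n = X.shape[0]
    first = int(rng.integers(n))
    chosen = [first]
    # distance from every point to its nearest chosen point
    d = np.linalg.norm(X - X[first], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))  # farthest point from the current batch
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(chosen)

# Toy SGD on least squares, drawing mini-batches repulsively.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=500)

w = np.zeros(2)
for step in range(200):
    idx = repulsive_minibatch(X, 32, rng)
    grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
    w -= 0.05 * grad
```

Replacing `repulsive_minibatch` with a uniform `rng.choice` recovers standard SGD, which makes the variance comparison in the video easy to reproduce on toy data.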
See more at microsoft.com/en-us/research/v...