Microsoft Research
Published 28 July 2016, 0:14
This talk is an overview of the machine learning course I have just taught at Cambridge University (UK) during the Lent term (Jan to March) 2012. The course is an introduction to basic concepts in probabilistic machine learning, focussing on statistical methods for unsupervised and supervised learning. It is centred around three recent illustrative successful applications: Gaussian processes for regression and classification, Latent Dirichlet Allocation models for unsupervised text modelling, and the TrueSkill probabilistic ranking model. The course syllabus includes linear models, maximum likelihood and Bayesian inference, the Gaussian distribution and Gaussian processes, model selection, latent variable models, the expectation-maximization (EM) algorithm, the expectation-propagation (EP) algorithm, the Dirichlet distribution, generative models and graphical models, and approximate message passing inference in factor graphs. The course was taught over a total of 16 hours. This talk will highlight a selection of the topics covered, making sure to touch each of the three areas of application. The slides for the course as well as the three pieces of hands-on coursework can be found online (mlg.eng.cam.ac.uk/teaching/4f1...
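As a flavour of the first application area mentioned above, the following is a minimal sketch of Gaussian process regression with a squared-exponential kernel, written in plain numpy. It is an illustration only and is not taken from the course slides or coursework; the function names (`rbf_kernel`, `gp_posterior`) and the toy data are assumptions made here for the example.

```python
# Minimal Gaussian process regression sketch (illustrative only; not from
# the course materials). Zero-mean GP prior, squared-exponential kernel,
# standard posterior mean/variance equations via a Cholesky factorization.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and marginal variance of a zero-mean GP at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                            # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)                   # posterior marginal variance
    return mean, var

# Toy usage: noisy samples of sin(x), predictions on a grid.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 5, size=8)
y_train = np.sin(X_train) + 0.1 * rng.standard_normal(8)
X_test = np.linspace(0, 5, 50)
mean, var = gp_posterior(X_train, y_train, X_test)
```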