Microsoft Research
Published June 22, 2016, 1:49
Distributed machine learning is an important area that has been receiving considerable attention from the academic and industrial communities, as data volumes grow at an unprecedented rate. In the first part of the talk, we review several popular approaches that have been proposed and used to learn classifier models in big data scenarios. With commodity clusters, priced by system configuration, becoming popular, machine learning algorithms must be aware of the computation and communication costs involved in order to be cost-effective and efficient. In the second part of the talk, we focus on methods that address this problem; in particular, considering different data distribution settings (e.g., example and feature partitions), we present efficient distributed learning algorithms that trade off computation and communication costs.
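To make the partitioning terminology concrete, below is a minimal Python sketch (not from the talk) contrasting example (row-wise) and feature (column-wise) partitions of a data matrix, along with a rough per-round communication estimate for a linear model; the matrix sizes, worker count, and cost formulas are illustrative assumptions, not the speakers' algorithms.

```python
import numpy as np

# Hypothetical data matrix: n examples (rows) x d features (columns).
n, d, num_workers = 1000, 50, 4
X = np.random.rand(n, d)
y = np.random.randint(0, 2, size=n)

# Example (row) partition: each worker holds a disjoint subset of examples
# and all features; local gradients are aggregated across workers.
example_shards = np.array_split(np.arange(n), num_workers)

# Feature (column) partition: each worker holds all examples but only a
# disjoint subset of features; partial predictions (inner products) are
# exchanged instead of full gradients.
feature_shards = np.array_split(np.arange(d), num_workers)

# Rough communication cost per round (floats exchanged), assuming a linear model:
#   example partition ~ num_workers * d  (each worker sends a d-dim gradient)
#   feature partition ~ num_workers * n  (each worker sends an n-dim partial prediction)
comm_example = num_workers * d
comm_feature = num_workers * n
print(f"example-partition comm/round ~ {comm_example} floats")
print(f"feature-partition comm/round ~ {comm_feature} floats")
```

Under these assumptions, the cheaper scheme per round depends on whether the number of examples or the number of features dominates, which is one face of the computation/communication trade-off the talk discusses.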