Machine Learning Algorithms Workshop

Published July 7, 2016, 23:09
Machine Learning Algorithms Workshop: Logarithmic Time Online Multiclass Prediction & Log-Concave Sampling with SGD
Logarithmic Time Online Multiclass Prediction: We study multiclass classification with an extremely large number of classes (k), aiming for train and test time complexity that is logarithmic in the number of classes. We develop top-down approaches for constructing logarithmic-depth trees. On the theoretical front, we formulate a new objective function, optimized at each node of the tree, that creates dynamic partitions of the data which are both pure (in terms of class labels) and balanced. We show that under favorable conditions we can construct logarithmic-depth trees whose leaves have low label entropy. The node objective is computationally challenging to optimize, however, so we address the empirical problem with a new online decision tree construction procedure. Experiments demonstrate that this online algorithm quickly improves test error compared with more common logarithmic-training-time approaches, making it a plausible method for computationally constrained large-k applications.

Log-Concave Sampling with SGD: We extend the Langevin Monte Carlo (LMC) algorithm to compactly supported measures via a projection step, akin to projected Stochastic Gradient Descent (SGD). We show that projected LMC can sample in polynomial time from a log-concave distribution with a smooth potential. This gives a new Markov chain for sampling from log-concave distributions.
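The node objective for the tree construction can be illustrated with a small sketch. The exact objective used in the talk is not given here, so the formula below is one plausible reading of "pure and balanced": a split scores highly when each class routes predominantly to one side while the overall split stays near 50/50. The function name and formula are illustrative assumptions.

```python
import numpy as np

def split_objective(labels, sides):
    """Hypothetical purity-and-balance score for a binary split h(x) in {0, 1}:
        J(h) = 2 * sum_i p_i * | P(h=1) - P(h=1 | class i) |
    High when classes are separated (purity) and the split is balanced."""
    labels = np.asarray(labels)
    sides = np.asarray(sides, dtype=float)
    p_right = sides.mean()                  # P(h(x) = 1) over all examples
    total = 0.0
    for c in np.unique(labels):
        mask = labels == c
        p_c = mask.mean()                   # class prior p_i
        p_right_c = sides[mask].mean()      # P(h(x) = 1 | class i)
        total += p_c * abs(p_right - p_right_c)
    return 2.0 * total

labels = [0, 0, 1, 1]
perfect = split_objective(labels, [0, 0, 1, 1])  # pure and balanced -> 1.0
useless = split_objective(labels, [0, 1, 0, 1])  # splits every class -> 0.0
```

Under this reading, a perfectly pure and balanced split attains the maximum value 1, and a split that divides every class evenly scores 0, which matches the intuition in the abstract.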
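The projected LMC update for the sampling result can be sketched as follows: a standard Langevin step (gradient descent on the potential plus Gaussian noise) followed by Euclidean projection onto the compact support. The potential, step size, and support (a unit ball) below are illustrative choices, not taken from the talk.

```python
import numpy as np

def grad_f(x):
    # Example smooth convex potential f(x) = ||x||^2 / 2 (a Gaussian target,
    # restricted to the support K); an assumption for illustration.
    return x

def project_ball(x, radius=1.0):
    # Euclidean projection onto K = {x : ||x|| <= radius}.
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_lmc(n_steps=5000, eta=0.01, dim=2, radius=1.0, seed=0):
    """Projected Langevin Monte Carlo:
        x_{t+1} = Proj_K( x_t - eta * grad_f(x_t) + sqrt(2*eta) * xi_t ),
    with xi_t standard Gaussian noise, mirroring projected SGD."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    samples = np.empty((n_steps, dim))
    for t in range(n_steps):
        noise = np.sqrt(2.0 * eta) * rng.standard_normal(dim)
        x = project_ball(x - eta * grad_f(x) + noise, radius)
        samples[t] = x
    return samples

samples = projected_lmc()
```

Every iterate stays inside the support K by construction, which is the point of the projection step; the chain's stationary behavior approximates the log-concave target restricted to K.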