Towards Agnostically Learning Halfspaces [1/6]

Published September 8, 2016, 1:01
A longstanding open problem in computational learning theory is that of learning halfspaces in the agnostic model of Kearns, Schapire and Sellie (a model that can also be viewed as learning with adversarial noise). In this problem, there is an arbitrary and unknown joint distribution over n-dimensional vectors X and their 0-1 labels Y. The goal is to design an algorithm that provably learns to predict the labels of future examples with error at most (opt + eps), where opt is the error of the best halfspace predictor for the distribution, for any eps > 0, using runtime and a number of samples from this distribution that are polynomial in n and 1/eps. We make progress on this problem by giving a simple learning algorithm that learns such functions for a very general class of distributions on n-dimensional vectors X (including any log-concave distribution) in time polynomial in the dimension n (but exponential in 1/eps). Note that our distributional assumptions are only on X, and thus the algorithm tolerates worst-case noise. The simple randomized algorithm generalizes Fourier learning and is also related to support
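The abstract above describes an algorithm in the spirit of low-degree polynomial regression: fit a degree-d polynomial to the labels and threshold it to obtain a 0/1 predictor. The following is a minimal sketch of that idea, not the authors' exact method; in particular, it uses ordinary least squares for the polynomial fit, and the function names and the choice of degree are illustrative assumptions.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(X, degree):
    """Expand each row of X into all monomials of total degree <= degree."""
    n_samples, n = X.shape
    feats = [np.ones(n_samples)]  # constant term
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(n), d):
            feats.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(feats)

def learn_halfspace(X, y, degree=2):
    """Least-squares fit of a low-degree polynomial to 0/1 labels,
    thresholded at 1/2 to produce a 0/1 predictor (a sketch of
    low-degree polynomial regression, not the paper's algorithm)."""
    Phi = poly_features(X, degree)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda X_new: (poly_features(X_new, degree) @ w >= 0.5).astype(int)

# Illustrative usage on synthetic halfspace data:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))            # log-concave (Gaussian) marginal on X
y = (X @ np.array([1.0, -2.0, 0.5]) >= 0).astype(int)
predict = learn_halfspace(X, y, degree=2)
```

In the theoretical result, the degree of the polynomial grows with 1/eps, which is where the exponential dependence on 1/eps comes from; the sketch fixes a small degree purely for illustration.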