Discovery of Latent Factors in High-dimensional Data via Tensor Decomposition

Published 11 August 2016, 8:12
Latent or hidden variable models have applications in almost every domain, e.g., social network analysis, natural language processing, computer vision, and computational biology. Training latent variable models is challenging due to the non-convexity of the likelihood objective function. An alternative method is based on the spectral decomposition of low-order moment matrices and tensors. This versatile framework is guaranteed to estimate the correct model consistently. I will discuss my results on convergence to the globally optimal solution for stochastic gradient descent, despite the non-convexity of the objective. I will then discuss large-scale implementations (which are highly parallel and scalable) of spectral methods, carried out on CPU/GPU and Spark platforms. We obtain gains in both accuracy and running time of several orders of magnitude compared to state-of-the-art variational methods. I will discuss the following applications in detail: (1) learning hidden user commonalities (communities) in social networks, and (2) learning sentence embeddings for paraphrase detection using convolutional models. More generally, I have applied the methods to a variety of problems such as text and social network analysis, healthcare analytics, and neuroscience.
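As a rough illustration of the spectral approach mentioned above, here is a minimal NumPy sketch of the tensor power method, a standard routine for decomposing a symmetric, orthogonally decomposable third-order moment tensor into its rank-one components. The specific tensor, dimensions, and deflation scheme below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the talk): build a symmetric,
# orthogonally decomposable tensor T = sum_i lam[i] * a_i (x) a_i (x) a_i,
# where the a_i are orthonormal columns of A.
d, k = 8, 3
A, _ = np.linalg.qr(rng.normal(size=(d, k)))  # orthonormal columns
lam = np.array([3.0, 2.0, 1.0])               # component weights
T = np.einsum('i,ai,bi,ci->abc', lam, A, A, A)

def tensor_power_iteration(T, n_starts=10, n_iters=100):
    """Recover one (eigenvalue, eigenvector) pair of a symmetric
    third-order tensor via the fixed-point map v <- T(I, v, v) / ||.||,
    keeping the best of several random restarts."""
    best = None
    for _ in range(n_starts):
        v = rng.normal(size=T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iters):
            v = np.einsum('abc,b,c->a', T, v, v)
            v /= np.linalg.norm(v)
        val = np.einsum('abc,a,b,c->', T, v, v, v)
        if best is None or val > best[0]:
            best = (val, v)
    return best

# Deflation: peel off one rank-one component at a time.
recovered = []
T_res = T.copy()
for _ in range(k):
    val, v = tensor_power_iteration(T_res)
    recovered.append((val, v))
    T_res = T_res - val * np.einsum('a,b,c->abc', v, v, v)
```

For orthogonally decomposable tensors, each run converges to one of the components (up to numerical error), so the recovered eigenvalues should match `lam` and the recovered vectors should align with the columns of `A`. In the applications the talk describes, `T` would instead be an empirical third-order moment tensor after whitening.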