General-purpose representation learning from words to sentences

Published 11 August 2016, 7:43
Real-valued vector representations of words (i.e. embeddings), trained on naturally occurring data by optimising general-purpose objectives, are useful for a range of downstream language tasks. The picture is less clear, however, for larger linguistic units such as phrases or sentences. Phrases and sentences typically encode the facts and propositions that constitute the 'general knowledge' missing from many NLP systems at present, so the potential benefit of making representation learning work for these units is huge. I will present a systematic comparison of ways, both novel and existing, of inducing such representations with neural language models. The results demonstrate clear and interesting differences between the representations learned by different methods; in particular, more elaborate or computationally expensive methods are not necessarily best. I will also discuss a key challenge facing all research in unsupervised or representation learning for NLP: the lack of robust evaluations.
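
The abstract's point that more elaborate methods are not necessarily best can be made concrete with the cheapest possible sentence representation: averaging pretrained word vectors. The sketch below is illustrative only and uses a toy vocabulary of random vectors as a stand-in for real pretrained embeddings (e.g. from word2vec or GloVe); it is not the talk's actual code or method.

import numpy as np

# Toy stand-in for pretrained word vectors; a real experiment would load
# embeddings trained with a neural language model (e.g. word2vec, GloVe).
rng = np.random.default_rng(0)
vocab = {w: rng.standard_normal(50)
         for w in "the cat sat on mat a dog ran".split()}

def sentence_embedding(sentence, vectors):
    # Average the vectors of in-vocabulary words; zero vector if none match.
    dim = len(next(iter(vectors.values())))
    words = [w for w in sentence.lower().split() if w in vectors]
    if not words:
        return np.zeros(dim)
    return np.mean([vectors[w] for w in words], axis=0)

def cosine(u, v):
    # Cosine similarity, the usual way such representations are compared.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence_embedding("The cat sat on the mat", vocab)
s2 = sentence_embedding("A dog ran", vocab)
print(f"cosine similarity: {cosine(s1, s2):.3f}")

Despite ignoring word order entirely, bag-of-words averages like this are a strong baseline on many sentence-level evaluations, which is exactly why robust evaluation matters when comparing them against more expensive encoders.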