Transition-based Natural Language Processing

Published 11 August 2016, 7:58
Transition-based models use a transition system, or abstract state machine, to model structured prediction problems, such as syntactic dependency parsing, as a sequence of actions. We propose a technique for learning representations of these states. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time, but, in addition, an LSTM maintains a continuous-space embedding of the stack contents. This lets us formulate efficient transition-based natural language processing models that capture three facets of the state: (i) unbounded look-ahead into the buffer of incoming words, (ii) the complete history of transition actions taken, and (iii) the complete contents of the stack of partially built fragments, including their internal structures. We also discuss different approaches to word representations: modelling words directly and modelling characters. The latter improves the handling of out-of-vocabulary words without relying on external resources and improves performance in morphologically rich languages. Transition-based modelling for natural language processing is not limited to syntactic parsing. In this talk I will explain how we successfully applied Stack-LSTMs to dependency parsing, phrase-structure parsing, language modelling and named entity recognition with outstanding results.
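To make the constant-time push/pop with a continuous-space summary concrete, here is a minimal sketch in Python using PyTorch's nn.LSTMCell. It is an illustration of the idea, not the authors' implementation, and the names (StackLSTM, push, pop, summary) are assumed for this example: each push runs one LSTM step on top of the current top state, and each pop simply moves the top of the stack back to the previous state, so the embedding of the remaining contents is recovered without recomputation.

```python
import torch
import torch.nn as nn


class StackLSTM(nn.Module):
    """A stack whose contents are summarized by an LSTM.

    Push and pop are O(1): pushing runs one LSTMCell step starting from
    the current top state; popping moves the top pointer back, so the
    summary of the remaining elements is available immediately.
    """

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.cell = nn.LSTMCell(input_dim, hidden_dim)
        h0 = torch.zeros(1, hidden_dim)
        c0 = torch.zeros(1, hidden_dim)
        # states[i] holds the (h, c) summary after i pushes; states[0]
        # is the empty-stack state.
        self.states = [(h0, c0)]

    def push(self, x):
        # One LSTM step conditioned on the current top of the stack.
        h, c = self.cell(x, self.states[-1])
        self.states.append((h, c))

    def pop(self):
        # Constant time: discard the top state; the previous summary
        # is still stored, so nothing needs to be recomputed.
        return self.states.pop()

    def summary(self):
        # Embedding of the full stack contents: the hidden state on top.
        return self.states[-1][0]


# Usage: push two word vectors, read the summary, then pop one.
stack = StackLSTM(input_dim=8, hidden_dim=16)
stack.push(torch.randn(1, 8))
stack.push(torch.randn(1, 8))
print(stack.summary().shape)  # torch.Size([1, 16])
stack.pop()
print(stack.summary().shape)  # summary of the remaining element, unchanged cost
```

In a transition-based parser, one such structure can summarize the stack of partially built fragments, another the buffer of incoming words, and a third the history of actions, giving the three facets of the state described above.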