Greedy Transition-Based Dependency Parsing with Stack-LSTMs

Published June 22, 2016, 19:38
We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time; in addition, an LSTM maintains a continuous-space embedding of the stack contents. This lets us formulate an efficient parsing model that captures three facets of a parser's state: (i) unbounded look-ahead into the buffer of incoming words, (ii) the complete history of transition actions taken by the parser, and (iii) the complete contents of the stack of partially built tree fragments, including their internal structures. In addition, we discuss two ways of representing words: modelling words directly and modelling their characters. The former is useful for all languages, while the latter handles out-of-vocabulary words without a pretraining regime and improves the parsing of morphologically rich languages.
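The core idea above can be sketched as follows. This is a hypothetical minimal implementation, not the authors' code: a stack whose contents are summarized by an LSTM. Pushing runs one LSTM step from the current top state; popping simply reverts to the previously stored state, so both operations are constant time, and the top hidden state is always a continuous-space embedding of the whole stack. All names (`StackLSTM`, `push`, `pop`, `summary`) and the single-weight-matrix gate layout are assumptions made for illustration.

```python
import numpy as np

class StackLSTM:
    """Minimal sketch of a stack LSTM: a stack augmented with an LSTM
    that embeds its contents. Keeping every past (h, c) state on the
    stack is what makes pop an O(1) reversal."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix holding all 4 gates (input, forget, output, candidate).
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        # Stack of (h, c) states; starts with the empty-stack state.
        self.states = [(np.zeros(hidden_dim), np.zeros(hidden_dim))]

    def push(self, x):
        # One LSTM step from the current top state.
        h_prev, c_prev = self.states[-1]
        z = self.W @ np.concatenate([x, h_prev]) + self.b
        i, f, o, g = np.split(z, 4)
        sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
        c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        self.states.append((h, c))

    def pop(self):
        # O(1): discard the top state; the summary reverts automatically.
        self.states.pop()

    def summary(self):
        # Continuous-space embedding of the current stack contents.
        return self.states[-1][0]
```

A usage example: after a push followed by a pop, the summary is exactly the embedding the stack had before the push, which is what lets a parser undo and re-read partial structure cheaply.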