Exploring Massively Multilingual, Massive Neural Machine Translation

Published February 27, 2020, 18:52
We will give an overview of the recent efforts towards universal translation at Google Research, from training a single translation model for 100+ languages to scaling neural networks beyond 80 billion parameters with 1000-layer-deep Transformers. We will share the research and engineering challenges the project has tackled, along with many insights: multi-task learning with hundreds of tasks, learning under heavy data imbalance, trainability of very deep networks, understanding the learned representations, and cross-lingual downstream transfer.
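One widely used way to address the heavy data imbalance mentioned above is temperature-based sampling, which flattens a skewed per-language data distribution toward uniform so low-resource languages are seen more often during training. The sketch below is illustrative only: the function name and example corpus sizes are invented, and the exact sampling scheme used in the talk may differ.

```python
# Hypothetical sketch of temperature-based data sampling for multilingual
# training. Each language pair i with n_i examples is sampled with
# probability proportional to (n_i / N)^(1/T); larger T flattens the
# distribution toward uniform. Names and counts below are made up.

def sampling_probs(example_counts, temperature=5.0):
    """Return per-language sampling probabilities under temperature T."""
    total = sum(example_counts.values())
    weights = {lang: (n / total) ** (1.0 / temperature)
               for lang, n in example_counts.items()}
    z = sum(weights.values())
    return {lang: w / z for lang, w in weights.items()}

# Skewed toy corpus: a high-, mid-, and low-resource pair.
counts = {"en-fr": 40_000_000, "en-de": 4_000_000, "en-yo": 40_000}
probs = sampling_probs(counts, temperature=5.0)
```

With T=1 this reduces to sampling proportionally to corpus size; as T grows, the low-resource pair's sampling probability rises far above its raw share of the data.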

See more at microsoft.com/en-us/research/v...