Microsoft Research
Published August 15, 2018, 16:39
The fact that many commonly used networks take hours to days to train has motivated recent research into reducing training time. On the other hand, once trained, these networks are heavyweight dense linear algebra computations, usually requiring expensive acceleration to execute in real time. However, recent advances in algorithms, hardware, and systems have broken through these barriers dramatically. Models that took days to train are now reported to be trainable in under an hour. Further, with model optimization techniques and emerging commodity silicon, these models can be executed on the edge or in the cloud at surprisingly low energy and dollar cost. This session will present the ideas and techniques underlying these breakthroughs and discuss the implications of this new regime of “free inference and instant training.”
See more at microsoft.com/en-us/research/v...