Microsoft Research
Published June 28, 2021, 19:37
Traditional deep learning frameworks such as TensorFlow and PyTorch support training a single deep neural network (DNN) model at a time, iteratively computing that model's weights. Designing a DNN model for a task, however, remains an experimental science: in practice it is a process of exploring many candidate models. Retrofitting such exploratory training onto the single-model training process that current deep learning frameworks support is unintuitive, cumbersome, and inefficient.
In this webinar, Microsoft Research Asia Senior Researcher Quanlu Zhang and Principal Program Manager Scarlett Li analyze these challenges within the context of Neural Architecture Search (NAS). The first part of the webinar focuses on Retiarii, a deep learning exploratory-training framework for DNN models. Retiarii introduces a key abstraction, Mutator, that connects the specification of a DNN model space with exploration strategies while exposing the correlations between models for optimization. It also offers a just-in-time (JIT) engine that instantiates models, manages their training, gathers information for the exploration strategy to consume, and executes the strategy's decisions. By identifying correlations between the instantiated models, Retiarii applies cross-model optimizations that improve the overall exploratory-training process. The benefits include ease of programming, reuse of components, and vastly improved (up to 8.58x) overall exploratory-training efficiency.
The second part of the talk introduces Retiarii's implementation in the open-source project Neural Network Intelligence (NNI), and shows how the toolkit enables users to carry out state-of-the-art NAS more efficiently.
Together, you’ll explore:
■ Defining an arbitrary model space with a new abstraction, Mutator.
■ Decoupling model space from model exploration strategy, making exploration strategy highly customizable and reusable.
■ Exposing the correlations between models for cross-model optimizations, which greatly speeds up model exploration.
■ The features of the AutoML open-source toolkit NNI and how it can be used to help model developers find good models.
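The first two points above — a Mutator that declares mutation points in a model space, and an exploration strategy that is decoupled from that space — can be illustrated with a minimal sketch. This is a hypothetical toy, not the actual NNI/Retiarii API: the `Mutator` class, the dict-based "model", and `random_strategy` are all invented here for illustration.

```python
import random

# Hypothetical sketch of the Mutator idea -- NOT the real NNI/Retiarii API.
# A "model" here is just a dict mapping layer slots to chosen operations.

class Mutator:
    """Declares one mutation point: a slot and its candidate choices."""
    def __init__(self, slot, candidates):
        self.slot = slot
        self.candidates = candidates

    def apply(self, model, choice):
        # Return a new model with this slot's choice filled in.
        mutated = dict(model)
        mutated[self.slot] = choice
        return mutated

def random_strategy(base_model, mutators, n_trials, seed=0):
    """An exploration strategy: it only sees mutators, never how the
    model space itself was written, so it is reusable across spaces."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        model = base_model
        for m in mutators:
            model = m.apply(model, rng.choice(m.candidates))
        trials.append(model)
    return trials

# A base model with one undecided slot, and one mutator over it.
base = {"stem": "conv3x3", "block": None, "head": "linear"}
mutators = [Mutator("block", ["conv3x3", "conv5x5", "depthwise"])]

for trial in random_strategy(base, mutators, n_trials=3):
    print(trial)
```

Because the strategy only consumes mutators, swapping random search for, say, an evolutionary strategy requires no change to the model-space definition — which is the decoupling the talk describes.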
Resource list:
■ Retiarii: A Deep Learning Exploratory-Training Framework (publication): microsoft.com/en-us/research/p...
■ Neural Network Intelligence (project page): microsoft.com/en-us/research/p...
■ Microsoft NNI (GitHub): github.com/microsoft/nni
■ NNI on YouTube (YouTube channel): youtube.com/channel/UCKcafm686...
■ Cost-effective Hyper-parameter Tuning using AdaptDL with NNI (case study): medium.com/casl-project/cost-e...
■ Quanlu Zhang (Microsoft Research profile): microsoft.com/en-us/research/p...
■ Scarlett Li (Microsoft Research profile): microsoft.com/en-us/research/p...
*This on-demand webinar features a previously recorded Q&A session and open captioning.
This webinar originally aired on June 24, 2021.
Explore more Microsoft Research webinars: aka.ms/msrwebinars