Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes

Published 5 May 2020, 20:45
This talk will describe our recent work on designing image classification systems that, after an initial multi-task training phase, can automatically adapt to new tasks encountered at test time. I will introduce an approach that relates to existing work on meta-learning and so-called conditional neural processes, generalising them to the multi-task classification setting. The resulting approach, called Conditional Neural Adaptive Processes (CNAPS), comprises a classifier whose parameters are modulated by an adaptation network that takes the current task's dataset as input. I will show that CNAPS achieves state-of-the-art results on the challenging Meta-Dataset few-shot learning benchmark, indicating high-quality transfer learning that is robust, avoiding both over-fitting in low-shot regimes and under-fitting in high-shot regimes. Timing experiments reveal that CNAPS is computationally efficient at test time, as it does not involve gradient-based adaptation. Finally, I will show that trained models are immediately deployable to continual learning and active learning, where they can outperform existing approaches that do not leverage transfer learning.
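The core idea above, amortized adaptation, is that an adaptation network reads the task's support set and emits modulation parameters for the classifier, so no test-time gradient steps are needed. The following is a minimal NumPy sketch of that pattern under stated assumptions: the mean-pooled task embedding, the fixed random projection, and the prototype-distance head are illustrative stand-ins, not the architecture from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptation_network(support_set, feat_dim):
    # Hypothetical stand-in: pool the support set into a task embedding,
    # then map it (here via fixed random weights) to per-channel
    # FiLM-style scale and shift parameters for the classifier.
    task_embedding = support_set.mean(axis=0)               # (feat_dim,)
    w = rng.standard_normal((2 * feat_dim, feat_dim)) * 0.01
    gamma_beta = w @ task_embedding
    gamma = 1.0 + gamma_beta[:feat_dim]                     # scale, near identity
    beta = gamma_beta[feat_dim:]                            # shift
    return gamma, beta

def adapted_classifier(x, gamma, beta, class_prototypes):
    # Modulate the query features with the task-conditional parameters,
    # then classify by distance to class prototypes (a common few-shot head).
    h = gamma * x + beta
    dists = np.linalg.norm(class_prototypes - h, axis=1)
    return int(np.argmin(dists))

# Toy task: 2 classes, 4-dim features, a handful of support examples.
feat_dim = 4
support = rng.standard_normal((6, feat_dim))
gamma, beta = adaptation_network(support, feat_dim)
prototypes = rng.standard_normal((2, feat_dim))
query = rng.standard_normal(feat_dim)
pred = adapted_classifier(query, gamma, beta, prototypes)
print(pred)
```

Note that adaptation here is a single forward pass through `adaptation_network`, which is why this style of model is fast at test time compared with fine-tuning approaches that run gradient descent per task.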

See more at microsoft.com/en-us/research/v...