Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes

Video Link: https://www.youtube.com/watch?v=apayUKSExmU
Duration: 15:27

This talk will describe our recent work on designing image classification systems that, after an initial multi-task training phase, can automatically adapt to new tasks encountered at test time. I will introduce an approach that relates to existing approaches to meta-learning and so-called conditional neural processes, generalising them to the multi-task classification setting. The resulting approach, called Conditional Neural Adaptive Processes (CNAPS), comprises a classifier whose parameters are modulated by an adaptation network that takes the current task's dataset as input. I will show that CNAPS achieves state-of-the-art results on the challenging Meta-Dataset few-shot learning benchmark, indicating high-quality transfer learning that is robust, avoiding both over-fitting in low-shot regimes and under-fitting in high-shot regimes. Timing experiments reveal that CNAPS is computationally efficient at test time because it does not require gradient-based adaptation. Finally, I will show that trained models are immediately deployable to continual learning and active learning, where they can outperform existing approaches that do not leverage transfer learning.
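To make the core idea concrete — a classifier whose parameters are modulated by an adaptation network conditioned on the task's dataset, with no gradient steps at test time — here is a deliberately tiny NumPy sketch. It is not the actual CNAPS architecture (which uses learned neural adaptation networks over a pretrained feature extractor); the pooling, the FiLM-style scale/shift, and the prototype classifier below are illustrative stand-ins, and all function names are hypothetical.

```python
import numpy as np

def adapt(support_x):
    # Hypothetical adaptation network: pool the task's support set into a
    # task embedding, then map it to per-feature scale (gamma) and shift
    # (beta) parameters, FiLM-style. In CNAPS this mapping is a learned
    # network; here it is a fixed toy transformation.
    task_embed = support_x.mean(axis=0)        # (d,) task representation
    gamma = 1.0 + 0.1 * np.tanh(task_embed)    # per-feature scale
    beta = 0.1 * task_embed                    # per-feature shift
    return gamma, beta

def classify(query_x, support_x, support_y, gamma, beta):
    # Apply the task-conditional modulation to both support and query
    # features, then score queries against per-class prototype means.
    # Note: adaptation used no gradient-based optimisation.
    mod_s = gamma * support_x + beta
    mod_q = gamma * query_x + beta
    classes = np.unique(support_y)
    protos = np.stack([mod_s[support_y == c].mean(axis=0) for c in classes])
    dists = ((mod_q[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]
```

Given a few labelled support examples per class, `adapt` produces the task-specific parameters in a single forward pass, and `classify` labels queries by nearest adapted prototype — mirroring why test-time adaptation is fast in this family of methods.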

See more at https://www.microsoft.com/en-us/research/video/fast-and-flexible-multi-task-classification-using-conditional-neural-adaptive-processes/




Other Videos By Microsoft Research


2020-05-26 MSR Distinguished Lecture Series: First-person Perception and Interaction
2020-05-26 Large-scale live video analytics over 5G multi-hop camera networks
2020-05-26 Kristin Lauter's TED Talk on Private AI at Congreso Futuro during Panel 11 / SOLVE
2020-05-19 How an AI agent can balance a pole using a simulation
2020-05-19 How to build Intelligent control systems using new tools from Microsoft and simulations by Mathworks
2020-05-13 Diving into Deep InfoMax with Dr. Devon Hjelm | Podcast
2020-05-08 An Introduction to Graph Neural Networks: Models and Applications
2020-05-07 MSR Cambridge Lecture Series: Photonic-chip-based soliton microcombs
2020-05-07 Multi-level Optimization Approaches to Computer Vision
2020-05-05 How good is your classifier? Revisiting the role of evaluation metrics in machine learning
2020-05-05 Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes
2020-05-05 Hypergradient descent and Universal Probabilistic Programming
2020-05-04 Learning over sets, subgraphs, and streams: How to accurately incorporate graph context
2020-05-04 An Ethical Crisis in Computing?
2020-04-21 Presentation on “Beyond the Prototype” by Rushil Khurana
2020-04-20 Understanding and Improving Database-backed Applications
2020-04-20 Efficient Learning from Diverse Sources of Information
2020-04-08 Project Orleans and the distributed database future with Dr. Philip Bernstein | Podcast
2020-04-07 Reprogramming the American Dream: A conversation with Kevin Scott and J.D. Vance, with Greg Shaw
2020-04-01 An interview with Microsoft President Brad Smith | Podcast
2020-03-30 Microsoft Rocketbox Avatar library



Tags:
Multi-Task Classification
Conditional Neural Adaptive Processes
multi-task training
meta-learning
Conditional Neural Adaptive Processes (CNAPS)
Meta-Dataset
John Bronskill
Sebastian Tschiatschek
microsoft research cambridge