Semi-Supervised Few-Shot Intent Classification and Slot Filling

NLP
NLU
Microsoft
conversational AI
paper
Our paper improving meta-learning for joint IC/SF using semi-supervised techniques (contrastive learning and data augmentation) is out on arXiv.
Author

synesis

Published

September 25, 2021

Semi-Supervised Few-Shot Intent/Slot Paper. Image: LinkedIn.

Our paper on improving meta-learning for joint intent classification and slot filling using semi-supervised techniques is out on arXiv.

Title: Semi-Supervised Few-Shot Intent Classification and Slot Filling

Abstract: Intent classification (IC) and slot filling (SF) are two fundamental tasks in modern Natural Language Understanding (NLU) systems. Collecting and annotating large amounts of data to train deep learning models for such systems is not scalable. This problem can be addressed by learning from few examples using fast supervised meta-learning techniques such as prototypical networks. In this work, we systematically investigate how contrastive learning and unsupervised data augmentation methods can benefit these existing supervised meta-learning pipelines for jointly modelled IC/SF tasks. Through extensive experiments across standard IC/SF benchmarks (SNIPS and ATIS), we show that our proposed semi-supervised approaches outperform standard supervised meta-learning methods: contrastive losses in conjunction with prototypical networks consistently outperform the existing state-of-the-art for both IC and SF tasks, while data augmentation strategies primarily improve few-shot IC by a significant margin.
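To make the approach in the abstract concrete, here is a minimal NumPy sketch of the two ingredients it names: prototypical-network classification (class prototypes are mean support embeddings; queries go to the nearest prototype) and a supervised contrastive loss that pulls same-label embeddings together. This is an illustrative form only, not the paper's implementation; the function names, the temperature value, and the toy 2-D embeddings are all assumptions for the example.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    # Prototype = mean embedding of a class's support examples
    # (prototypical networks, Snell et al., 2017).
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def proto_predict(query_emb, protos):
    # Classify each query by nearest prototype (squared Euclidean distance).
    d = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

def contrastive_loss(emb, labels, temp=0.1):
    # Illustrative supervised contrastive loss: for each anchor, maximize
    # the softmax probability of its same-label positives over all others.
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = emb @ emb.T / temp
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    loss, n = 0.0, len(labels)
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        if pos.any():
            loss -= logp[i, pos].mean()
    return loss / n
```

In training, a loss like this would be added to the usual prototypical episode loss so that the encoder also learns from unlabeled or augmented utterance pairs; at test time only the prototype classifier is used.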

https://arxiv.org/abs/2109.08754

Originally posted on LinkedIn.