NLP-progress

Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.

Intent Detection and Slot Filling

Intent Detection and Slot Filling is the task of interpreting user commands/queries by extracting the intent and the relevant slots.

Example (from ATIS):

Query: What flights are available from pittsburgh to baltimore on thursday morning
Intent: flight info
Slots: 
    - from_city: pittsburgh
    - to_city: baltimore
    - depart_date: thursday
    - depart_time: morning

ATIS

ATIS (Air Travel Information System) (Hemphill et al.) is a widely used dataset for this task; a commonly used version is hosted in the Microsoft CNTK GitHub repository and is available from the GitHub page. The slots are labeled in the BIO format (Beginning, Inside, Outside), similar to NER. This dataset contains only air-travel-related queries. Most of the ATIS results are based on the work here.
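To illustrate how BIO slot labels map back to the slots shown in the example above, here is a minimal sketch in Python. The tag names (`B-from_city`, `B-depart_date`, etc.) are illustrative and simplified, not the exact ATIS label set.

```python
# Illustrative BIO tagging of the example query; tag names are hypothetical,
# not the official ATIS slot labels.
tokens = ("what flights are available from pittsburgh "
          "to baltimore on thursday morning").split()
tags = ["O", "O", "O", "O", "O",
        "B-from_city", "O", "B-to_city", "O",
        "B-depart_date", "B-depart_time"]

def extract_slots(tokens, tags):
    """Collect contiguous B-/I- spans into {slot_name: value} pairs."""
    slots, current, words = {}, None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # a new slot span begins
            if current:
                slots[current] = " ".join(words)
            current, words = tag[2:], [tok]
        elif tag.startswith("I-") and current:
            words.append(tok)             # continue the current span
        else:                             # "O" tag ends any open span
            if current:
                slots[current] = " ".join(words)
            current, words = None, []
    if current:                           # flush a span that ends the sentence
        slots[current] = " ".join(words)
    return slots

print(extract_slots(tokens, tags))
# {'from_city': 'pittsburgh', 'to_city': 'baltimore',
#  'depart_date': 'thursday', 'depart_time': 'morning'}
```

Slot F1 in the tables below is computed over such extracted spans: a predicted slot counts as correct only if both its label and its exact token span match the gold annotation.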

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | :-----------: | :-------------: | -------------- | ---- |
| Bi-model with decoder | 96.89 | 98.99 | A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling | |
| Attention Encoder-Decoder NN | 95.87 | 98.43 | Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling | |
| SF-ID (BLSTM) network | 95.80 | 97.76 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 95.20 | 95.00 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Joint GRU model (W) | 95.49 | 98.10 | A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding | |
| Slot-Gated BLSTM with Attention | 95.20 | 94.10 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |
| Joint model with recurrent slot label context | 94.64 | 98.40 | Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks | Official |
| Recursive NN | 93.96 | 95.40 | Joint Semantic Utterance Classification and Slot Filling with Recursive Neural Networks | |
| Encoder-labeler Deep LSTM | 95.66 | NA | Leveraging Sentence-level Information with Encoder LSTM for Natural Language Understanding | |
| RNN with Label Sampling | 94.89 | NA | Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding | |
| Hybrid RNN | 95.06 | NA | Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding | |
| RNN-EM | 95.25 | NA | Recurrent Neural Networks with External Memory for Language Understanding | |
| CNN-CRF | 94.35 | NA | Convolutional Neural Network Based Triangular CRF for Joint Intent Detection and Slot Filling | |

SNIPS

SNIPS is a dataset by Snips.ai for Intent Detection and Slot Filling benchmarking, available from the GitHub page. This dataset contains several day-to-day user command categories (e.g. play a song, book a restaurant).

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | :-----------: | :-------------: | -------------- | ---- |
| SF-ID (BLSTM) network | 92.23 | 97.43 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 91.80 | 97.70 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Slot-Gated BLSTM with Attention | 88.80 | 97.00 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |