Abstract: The recognition of a speaker's intention has greatly advanced natural language understanding. In previous studies, the bidirectional long short-term memory (Bi-LSTM) model has mostly been employed in natural language processing to extract word features and the relationships between words. However, Bi-LSTM cannot adequately relate the information contained in a sentence to its individual words. Another previously proposed model, the sentence-state LSTM (S-LSTM), can establish this relation between sentence-level information and individual words, which in turn facilitates modeling the relationship between intent detection and slot filling and motivates a joint model that better captures the semantics of question-answering systems. Therefore, in this paper, a slot-gate mechanism is introduced so that the sentence state produced in the final S-LSTM iteration is not wasted when S-LSTM is applied to the joint task of intent detection and slot filling. Experimental results on the ATIS and Snips datasets confirm that the proposed mechanism outperforms other state-of-the-art approaches.
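To make the described architecture concrete, the following is a minimal sketch (not the authors' code) of a slot gate that fuses the final S-LSTM sentence state with per-word hidden states before slot tagging, in the spirit of the slot-gated formulation of Goo et al. (2018). The class name, dimensions, and exact fusion form are illustrative assumptions; the word and sentence states are assumed to come from an S-LSTM encoder.

```python
# Sketch only: a slot gate combining S-LSTM word states with the final
# sentence state for joint intent detection and slot filling.
import torch
import torch.nn as nn


class SlotGate(nn.Module):
    def __init__(self, hidden_dim, num_slots, num_intents):
        super().__init__()
        self.w_g = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects the sentence state
        self.v = nn.Parameter(torch.randn(hidden_dim))            # gate scoring vector
        self.slot_out = nn.Linear(hidden_dim, num_slots)
        self.intent_out = nn.Linear(hidden_dim, num_intents)

    def forward(self, word_states, sentence_state):
        # word_states: (batch, seq_len, hidden_dim) -- per-word states from the S-LSTM
        # sentence_state: (batch, hidden_dim) -- sentence node from the last S-LSTM iteration
        proj = self.w_g(sentence_state).unsqueeze(1)          # (batch, 1, hidden_dim)
        fused = torch.tanh(word_states + proj)                # fuse word and sentence information
        g = (self.v * fused).sum(dim=-1, keepdim=True)        # scalar gate per word
        slot_logits = self.slot_out(word_states + g * proj)   # gated slot features
        intent_logits = self.intent_out(sentence_state)       # intent read from the sentence state
        return slot_logits, intent_logits


# Usage with dummy tensors standing in for S-LSTM outputs.
gate = SlotGate(hidden_dim=64, num_slots=72, num_intents=21)
words = torch.randn(2, 10, 64)
sent = torch.randn(2, 64)
slot_logits, intent_logits = gate(words, sent)
print(slot_logits.shape, intent_logits.shape)  # (2, 10, 72) and (2, 21)
```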