Abstract
This paper presents our latest investigation of recurrent neural networks for the slot filling task of spoken language understanding. We implement a bi-directional Elman-type recurrent neural network which uses information not only from the past but also from the future context to predict the semantic label of the target word. Furthermore, we propose to use a ranking loss function to train the model, which improves performance over the cross-entropy loss function. On the ATIS benchmark data set, we achieve a new state-of-the-art result of 95.56% F1-score without using any additional knowledge or data sources.
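The abstract does not include model equations or code; as a rough illustration only, here is a minimal PyTorch sketch of the kind of architecture and objective it describes: a bi-directional Elman RNN that scores one slot label per token, trained with a pairwise ranking loss instead of cross entropy. The class name `BiElmanSlotFiller`, all hyperparameters, and the margin values `gamma`, `m_pos`, `m_neg` are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): a bi-directional
# Elman RNN over word embeddings with a per-token ranking loss.
import torch
import torch.nn as nn

class BiElmanSlotFiller(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden=100):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # nn.RNN is the Elman ("vanilla") recurrent cell; bidirectional=True
        # gives the classifier both past and future context for each word.
        self.rnn = nn.RNN(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, num_labels)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        h, _ = self.rnn(self.emb(tokens))           # (batch, seq_len, 2*hidden)
        return self.score(h)                        # per-token label scores

def ranking_loss(scores, gold, gamma=2.0, m_pos=2.5, m_neg=0.5):
    """Pairwise ranking loss (assumed here in the style of dos Santos et
    al., 2015): push the gold label's score above m_pos and the best
    competing label's score below -m_neg. Margins are illustrative."""
    batch, seq_len, num_labels = scores.shape
    flat = scores.view(-1, num_labels)              # (batch*seq_len, labels)
    gold = gold.view(-1)
    idx = torch.arange(flat.size(0))
    s_pos = flat[idx, gold]                         # score of the gold label
    masked = flat.clone()
    masked[idx, gold] = float("-inf")               # exclude the gold label
    s_neg = masked.max(dim=1).values                # strongest wrong label
    loss = (torch.log1p(torch.exp(gamma * (m_pos - s_pos)))
            + torch.log1p(torch.exp(gamma * (m_neg + s_neg))))
    return loss.mean()
```

Unlike cross entropy, this objective only constrains the gold label's score relative to its strongest competitor, rather than normalizing scores into a distribution over all labels, which is one plausible reason for the improvement the abstract reports.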
| Item Type | Journal article |
| --- | --- |
| Faculties | Languages and Literatures |
| Subjects | 400 Language > 400 Language |
| ISSN | 1520-6149 |
| Language | English |