Personal Information
Professor
Doctoral Supervisor
Master's Supervisor
Gender: Female
Alma Mater: Dalian University of Technology
Degree: Doctorate
Affiliation: School of Computer Science and Technology
Discipline: Computer Application Technology
Office: Room B811, Innovation Park Building
Phone: 0411-84706009-2811
Email: wangjian@dlut.edu.cn
HAN-ReGRU: hierarchical attention network with residual gated recurrent unit for emotion recognition in conversation
Paper Type: Journal Article
Publication Date: 2021-04-03
Journal: NEURAL COMPUTING & APPLICATIONS
Volume: 33
Issue: 7
Pages: 2685-2703
ISSN: 0941-0643
Keywords: Emotion recognition in conversation; Pre-trained word embedding; Hierarchical attention network; Bidirectional gated recurrent unit; Residual connection; Position embedding
Abstract: Emotion recognition in conversation aims to identify the emotion of each constituent utterance in a conversation from several pre-defined emotions. The task has recently become a popular research frontier in natural language processing because of the increase in open conversational data and its applications in opinion mining. However, most existing methods for the task cannot effectively capture long-range contextual information within an utterance and across a conversation. To alleviate this problem, we propose a novel hierarchical attention network with residual gated recurrent unit framework. First, we adopt the pre-trained BERT-Large model to obtain a context-dependent representation for each token of each utterance in a conversation. Then, a hierarchical attention network is proposed to capture long-range contextual information about the conversation structure. In addition, to better model the position information of utterances in a conversation, we add position embeddings to the input of the multi-head attention. Experiments on two textual dialogue emotion datasets demonstrate that our model significantly outperforms state-of-the-art baseline methods.
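To make the described components concrete, the snippet below is a minimal sketch of one building block suggested by the abstract: a bidirectional GRU with a residual connection, followed by multi-head self-attention whose input is augmented with learned position embeddings. It is written in PyTorch for illustration only; the module structure, dimensions, and hyperparameters are assumptions and do not reproduce the authors' HAN-ReGRU implementation.

```python
# Illustrative sketch (not the authors' code): residual bidirectional GRU
# followed by multi-head attention with learned position embeddings.
import torch
import torch.nn as nn


class ResidualBiGRUAttention(nn.Module):
    def __init__(self, dim=1024, heads=8, max_len=512):
        super().__init__()
        # The two GRU directions sum to `dim`, so the residual connection
        # can be added element-wise to the input.
        self.gru = nn.GRU(dim, dim // 2, batch_first=True, bidirectional=True)
        # Learned position embeddings added to the attention input.
        self.pos_emb = nn.Embedding(max_len, dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, dim), e.g. utterance-level features derived
        # from BERT-Large token representations (assumed preprocessing).
        h, _ = self.gru(x)
        h = h + x                       # residual connection around the GRU
        pos = torch.arange(h.size(1), device=h.device)
        q = h + self.pos_emb(pos)       # inject position information
        out, _ = self.attn(q, q, q)     # self-attention over the sequence
        return out + h                  # residual connection around attention


if __name__ == "__main__":
    block = ResidualBiGRUAttention()
    feats = torch.randn(2, 20, 1024)    # 2 conversations, 20 utterances each
    print(block(feats).shape)           # torch.Size([2, 20, 1024])
```

In the paper's hierarchical setting, one such block would operate within each utterance (over tokens) and another across the conversation (over utterances); this sketch shows only the shared residual-GRU-plus-attention pattern.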