
HAN-ReGRU: hierarchical attention network with residual gated recurrent unit for emotion recognition in conversation


Indexed by: Journal Papers

Date of Publication: 2021-04-03

Journal: NEURAL COMPUTING & APPLICATIONS

Volume: 33

Issue: 7

Page Numbers: 2685-2703

ISSN No.: 0941-0643

Key Words: Emotion recognition in conversation; Pre-trained word embedding; Hierarchical attention network; Bidirectional gated recurrent unit; Residual connection; Position embedding

Abstract: Emotion recognition in conversation aims to identify the emotion of each constituent utterance in a conversation from a set of pre-defined emotions. The task has recently become a popular research frontier in natural language processing because of the growth of open conversational data and its application in opinion mining. However, most existing methods for the task cannot effectively capture the long-range contextual information in an utterance and a conversation. To alleviate this problem, we propose a novel hierarchical attention network with a residual gated recurrent unit framework. First, we adopt the pre-trained BERT-Large model to obtain a context-dependent representation for each token of each utterance in a conversation. Then, a hierarchical attention network is proposed to capture long-range contextual information about the conversation structure. In addition, to better model the position information of the utterances in a conversation, we add position embeddings to the input of the multi-head attention. Experiments on two textual dialogue emotion datasets demonstrate that our model significantly outperforms state-of-the-art baseline methods.
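The abstract states that position embeddings are added to the utterance representations before the multi-head attention layer, but does not specify the embedding form. A minimal illustrative sketch, assuming fixed sinusoidal position embeddings (one common choice; the paper may instead use learned embeddings), in plain Python:

```python
import math

def position_embedding(num_positions, dim):
    """Fixed sinusoidal position embeddings; an assumed form, since the
    abstract does not specify how positions are encoded."""
    pe = [[0.0] * dim for _ in range(num_positions)]
    for pos in range(num_positions):
        for i in range(0, dim, 2):
            angle = pos / (10000 ** (i / dim))
            pe[pos][i] = math.sin(angle)
            if i + 1 < dim:
                pe[pos][i + 1] = math.cos(angle)
    return pe

def add_position_embedding(utterance_reprs):
    """Add a position embedding to each utterance vector element-wise,
    producing the position-aware input fed to multi-head attention."""
    dim = len(utterance_reprs[0])
    pe = position_embedding(len(utterance_reprs), dim)
    return [[u + p for u, p in zip(vec, pos)]
            for vec, pos in zip(utterance_reprs, pe)]
```

With this addition, utterances at different positions in the conversation receive distinct inputs to the attention layer, so the attention weights can depend on utterance order as well as content.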
