
A Hierarchical Iterative Attention Model for Machine Comprehension


Indexed by:Conference paper

Date of Publication:2017-01-01

Included Journals:EI, CPCI-S

Volume:10260

Page Number:341-352

Key Words:Machine comprehension; Hierarchical Iterative Attention; Tree-LSTM; Chinese machine comprehension; Cloze-style reading comprehension

Abstract:Enabling a computer to understand a document well enough to answer comprehension questions is a central, yet unsolved, goal of Natural Language Processing, making reading comprehension of text an important problem in NLP research. In this paper, we propose a novel Hierarchical Iterative Attention model (HIA), which constructs an iterative alternating attention mechanism over tree-structured rather than sequential representations. The proposed HIA model continually refines its view of the query and document while aggregating the information required to answer the query, computing attention not only over the document but also over the query, so that both sides benefit from their mutual information. Experimental results show that HIA achieves state-of-the-art performance on public English datasets such as the CNN and Children's Book Test datasets. Furthermore, HIA also outperforms state-of-the-art systems by a large margin on Chinese datasets, including the People Daily and Children's Fairy Tale datasets, which were recently released and are the first Chinese reading comprehension datasets.
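The iterative alternating attention idea described in the abstract can be illustrated with a minimal sketch: a recurrent state alternately attends over the query and over the document, refining its view on each pass. This is a generic, hypothetical illustration using plain vector representations and a simple state update, not the paper's actual architecture (which builds the mechanism over Tree-LSTM encodings and learned parameters); the function name and the averaging update are assumptions for readability.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def iterative_alternating_attention(D, Q, n_iters=3):
    """Sketch of iterative alternating attention.

    D: document token vectors, shape (n_doc, dim)
    Q: query token vectors,    shape (n_query, dim)
    Returns the final attention distribution over document tokens.
    """
    dim = D.shape[1]
    s = np.zeros(dim)  # evolving search state (uniform attention at start)
    for _ in range(n_iters):
        # Query-side attention: a glimpse of the query conditioned on the state.
        q_att = softmax(Q @ s)
        q_glimpse = q_att @ Q
        # Document-side attention: a glimpse of the document conditioned on
        # the query glimpse, so each side informs the other.
        d_att = softmax(D @ q_glimpse)
        d_glimpse = d_att @ D
        # Refine the state (a plain average here; the real model would use
        # a learned recurrent update such as a GRU).
        s = 0.5 * (q_glimpse + d_glimpse)
    return d_att
```

In a cloze-style reader, the final document attention distribution would be aggregated over occurrences of each candidate answer to score it.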

Pre One:The analysis and recognition of Chinese temporal expressions based on a mixtured model using statistics and rules

Next One:Research on entity relation extraction in TCM acupuncture and moxibustion field