
Joining External Context Characters to Improve Chinese Word Embedding


Indexed by: Conference Paper

Date of Publication: 2017-01-01

Included Journals: EI, CPCI-S

Volume: 10262

Page Number: 405-415

Key Words: Word embeddings; Neural network; NLP

Abstract: In Chinese, a word is usually composed of several characters, and the semantic meaning of a word is related to both its composing characters and its contexts. Previous studies have shown that modeling characters can benefit the learning of word embeddings; however, they ignore the external context characters. In this paper, we propose a novel Chinese word embedding model that considers both internal characters and external context characters. In this way, isolated characters gain more relevance, and character embeddings carry more semantic information, improving the effectiveness of Chinese word embeddings. Experimental results show that our model outperforms other word embedding methods on word relatedness computation, analogical reasoning, and text classification tasks, and that it is empirically robust to the proportion of character modeling and to corpus size.
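To illustrate the core idea of the abstract, the sketch below composes a word's representation from three sources: the word embedding itself, its internal character embeddings, and the characters of its external context words. This is a minimal toy sketch with randomly initialized vectors and an equal-weight average; the vocabularies, the `word_representation` helper, and the weighting scheme are assumptions for illustration, not the paper's actual training objective.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical toy vocabularies (assumed for illustration, not from the paper).
words = ["智能", "手机", "很", "好用"]
word_vecs = {w: rng.normal(size=dim) for w in words}
char_vecs = {c: rng.normal(size=dim) for c in "智能手机很好用"}

def word_representation(word, context_words):
    """Combine the word's own embedding with (a) its internal character
    embeddings and (b) the external context characters, i.e. the
    characters of the surrounding context words."""
    internal = np.mean([char_vecs[c] for c in word], axis=0)
    external_chars = [c for w in context_words for c in w]
    external = np.mean([char_vecs[c] for c in external_chars], axis=0)
    # Equal-weight average of the three sources -- a simplification;
    # in practice such components are learned jointly during training.
    return (word_vecs[word] + internal + external) / 3.0

vec = word_representation("手机", context_words=["智能", "很"])
print(vec.shape)  # (8,)
```

In a real model the character and word vectors would be trained jointly on a corpus (e.g. with a CBOW-style objective), so that characters shared across words and contexts tie their representations together.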
