
Word Representation on Small Background Texts


Indexed by:Conference paper

Date of Publication:2016-10-29

Included Journals:EI, CPCI-S

Volume:669

Page Number:143-150

Key Words:Natural language processing; Maximum margin; Word representation; Small background texts

Abstract:Vector representations of words learned from large-scale background texts serve as useful features in natural language processing and machine learning applications. In previous work, word representations were typically trained on large-scale unlabeled texts. In some scenarios, however, large-scale background texts are not available. In this paper, we therefore propose a novel maximum-margin word representation model that trains word representations from a small set of background texts. Experimental results demonstrate the advantages of our method.
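To illustrate the kind of maximum-margin objective the abstract describes, the sketch below trains word embeddings with a ranking (hinge) loss: a linear scorer is pushed to rate each true text window above a window whose center word has been replaced by a random one, by at least a fixed margin. This is a minimal toy illustration assuming a Collobert-and-Weston-style window ranking setup; the toy corpus, the scorer, and all hyperparameters are hypothetical and not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the "small background texts"
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim, margin, lr, epochs = 8, 1.0, 0.1, 50
E = rng.normal(scale=0.1, size=(len(vocab), dim))   # word embeddings
W = rng.normal(scale=0.1, size=3 * dim)             # linear window scorer

def score(left, center, right):
    """Score a (left, center, right) window; return score and input vector."""
    x = np.concatenate([E[left], E[center], E[right]])
    return W @ x, x

violations = []                                      # hinge violations per epoch
for epoch in range(epochs):
    count = 0
    for t in range(1, len(corpus) - 1):
        l, c, r = idx[corpus[t - 1]], idx[corpus[t]], idx[corpus[t + 1]]
        n = int(rng.integers(len(vocab)))            # corrupted center word
        if n == c:
            continue
        s_pos, x_pos = score(l, c, r)
        s_neg, x_neg = score(l, n, r)
        if margin - s_pos + s_neg > 0:               # margin constraint violated
            count += 1
            w_mid = W[dim:2 * dim].copy()            # scorer slot for the center word
            W += lr * (x_pos - x_neg)                # raise true score, lower corrupted
            E[c] += lr * w_mid                       # pull true center toward the scorer
            E[n] -= lr * w_mid                       # push corrupted center away
    violations.append(count)
```

Each training window contributes a pairwise constraint, score(true) > score(corrupted) + margin, rather than a corpus-level density estimate, which is one intuition for why margin-based training can remain usable when the background corpus is small.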
