Local Bidirectional Long Short Term Memory for Text Classification

WAN Shengxian; LAN Yanyan; GUO Jiafeng; XU Jun; PANG Liang; CHENG Xueqi

Journal of Chinese Information Processing ›› 2017, Vol. 31 ›› Issue (3): 62-68.
Language Analysis and Computation

Local Bidirectional Long Short Term Memory for Text Classification

  • WAN Shengxian1,2; LAN Yanyan1; GUO Jiafeng1; XU Jun1; PANG Liang1,2; CHENG Xueqi1

Abstract

Deep learning has shown great benefits for natural language processing in recent years. Models such as Recurrent Neural Networks (RNNs) have been proposed to build text representations, which can be applied to text classification. Long short term memory (LSTM) is an advanced kind of RNN with special neural cells. An LSTM accepts a sequence of words from a sentence, scans over the whole sequence, and outputs a representation of the sentence. However, customary practice uses only the last representation the LSTM produces for classification, ignoring all other intermediate representations. A clear drawback is that this cannot efficiently capture local features that are often very important for determining the sentence's class label. In this paper, we propose local bidirectional long short term memory to deal with this problem, including MaxBiLSTM and ConvBiLSTM. MaxBiLSTM applies a max pooling operation, and ConvBiLSTM applies a convolution operation followed by max pooling, over all intermediate representations generated by a bidirectional LSTM. Experimental results on two public text classification datasets show that local bidirectional LSTM, especially ConvBiLSTM, consistently outperforms bidirectional LSTM and reaches state-of-the-art performance.
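The two readouts described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the BiLSTM itself is omitted and its intermediate states are simulated with random vectors, and the function names, the `tanh` nonlinearity, and the single flattened filter bank `W` are assumptions for the sketch. MaxBiLSTM pools element-wise over all time steps; ConvBiLSTM first convolves a width-k window over the states, then max-pools the resulting feature maps.

```python
import numpy as np

def max_bilstm_pool(H):
    """MaxBiLSTM readout: element-wise max over all intermediate
    BiLSTM states H of shape (T, d) -> sentence vector of shape (d,)."""
    return H.max(axis=0)

def conv_bilstm_pool(H, W, b, k=3):
    """ConvBiLSTM readout: slide a width-k convolution over the
    intermediate states, then max-pool over time.
    H: (T, d) states; W: (k*d, m) filter bank; b: (m,) bias."""
    T, d = H.shape
    # one flattened window of k consecutive states per position
    windows = np.stack([H[t:t + k].reshape(-1) for t in range(T - k + 1)])
    feature_maps = np.tanh(windows @ W + b)   # (T-k+1, m)
    return feature_maps.max(axis=0)           # (m,)

# toy "sentence": 5 simulated BiLSTM states of size 4
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
v_max = max_bilstm_pool(H)                    # shape (4,)
W = rng.standard_normal((3 * 4, 6))
b = np.zeros(6)
v_conv = conv_bilstm_pool(H, W, b)            # shape (6,)
```

Either vector would then be fed to the classifier in place of the LSTM's final state, which is what lets the model surface strong local n-gram-like features regardless of where they occur in the sentence.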

Key words

text classification / deep learning / long short term memory / convolution

Cite this article

WAN Shengxian; LAN Yanyan; GUO Jiafeng; XU Jun; PANG Liang; CHENG Xueqi. Local Bidirectional Long Short Term Memory for Text Classification. Journal of Chinese Information Processing. 2017, 31(3): 62-68

Funding

National Basic Research Program of China (973 Program) (2014CB340401, 2012CB316303); National Natural Science Foundation of China (6122010, 61472401, 61433014, 61425016, 61203298); Youth Innovation Promotion Association, Chinese Academy of Sciences (2014310, 2016102)