Zhang Xiangwen, Lu Ziyao, Yang Jing, Lin Qian, Lu Yu, Wang Hongji, Su Jinsong. Weighted Lattice Based Recurrent Neural Networks for Sentence Semantic Representation Modeling[J]. Journal of Computer Research and Development, 2019, 56(4): 854-865. DOI: 10.7544/issn1000-1239.2019.20170917

Weighted Lattice Based Recurrent Neural Networks for Sentence Semantic Representation Modeling

More Information
  • Published Date: March 31, 2019
  • Abstract: Recurrent neural networks (RNNs) are widely used to model the semantic representations of text sequences in natural language processing. For languages without natural word delimiters (e.g., Chinese), RNNs generally take a segmented word sequence as input, so sub-optimal segmentation granularity and segmentation errors can degrade sentence semantic modeling as well as downstream natural language processing tasks. To address these issues, the proposed weighted word lattice based RNN takes a weighted word lattice as input and produces the current state at each step by integrating arbitrarily many input vectors together with their corresponding predecessor hidden states. The weighted word lattice is a compressed data structure that encodes an exponential number of word segmentation results and, to a certain extent, reflects the agreement among them. The lattice weights are further exploited as a supervised regularizer that refines the weights of the semantic composition operation, leading to better sentence semantic representation learning. Compared with conventional RNNs, the proposed model not only alleviates the negative impact of segmentation errors but is also more expressive and flexible for sentence representation learning (a minimal illustrative sketch of the node-level composition follows below). Experimental results on sentiment classification and question classification tasks demonstrate the superiority of the proposed model.
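The node-level composition and the lattice-weight regularizer described in the abstract can be pictured with the short sketch below. It is a minimal, hypothetical NumPy illustration, not the authors' implementation: the tanh recurrent cell, the scoring vector v, the function name lattice_rnn_step, and the squared-error penalty between the learned composition weights and the normalized lattice weights are all assumptions made for brevity.

    # Hypothetical sketch of one weighted-lattice RNN node update (not the paper's code).
    # Assumptions: each node receives K incoming edges; edge k carries a word embedding
    # x_k and the hidden state h_k of its predecessor node; a softmax over learned scores
    # merges the K candidate states; the normalized lattice weights supervise those scores.
    import numpy as np

    rng = np.random.default_rng(0)
    E, H = 8, 16                                 # embedding size, hidden size

    W_x = rng.normal(scale=0.1, size=(H, E))     # input-to-hidden weights
    W_h = rng.normal(scale=0.1, size=(H, H))     # hidden-to-hidden weights
    b = np.zeros(H)
    v = rng.normal(scale=0.1, size=E + 2 * H)    # scores one (x_k, h_k, candidate_k) triple

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def lattice_rnn_step(xs, hs, lattice_w):
        """Merge K incoming (embedding, predecessor-state) pairs into one node state.

        xs        : list of K word embeddings, each of shape (E,)
        hs        : list of K predecessor hidden states, each of shape (H,)
        lattice_w : K lattice edge weights (e.g., segmentation confidences)
        Returns the new hidden state, the learned composition weights, and the
        regularizer term that pulls them toward the normalized lattice weights.
        """
        cands, scores = [], []
        for x_k, h_k in zip(xs, hs):
            c_k = np.tanh(W_x @ x_k + W_h @ h_k + b)       # candidate state per edge
            cands.append(c_k)
            scores.append(v @ np.concatenate([x_k, h_k, c_k]))
        alpha = softmax(np.array(scores))                   # learned composition weights
        h_new = sum(a * c for a, c in zip(alpha, cands))    # weighted merge of candidates

        target = np.asarray(lattice_w, dtype=float)
        target = target / target.sum()                      # normalized lattice weights
        reg = float(np.sum((alpha - target) ** 2))          # supervised regularizer term
        return h_new, alpha, reg

    # Toy node with two incoming segmentation edges of unequal confidence.
    xs = [rng.normal(size=E), rng.normal(size=E)]
    hs = [np.zeros(H), np.zeros(H)]
    h, alpha, reg = lattice_rnn_step(xs, hs, lattice_w=[0.7, 0.3])
    print(h.shape, alpha.round(3), round(reg, 4))

Under these assumptions, a full model would accumulate the regularizer terms over all lattice nodes and add them to the task loss (e.g., classification cross-entropy) with a tunable coefficient.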
