• China Premium Science and Technology Journal
  • CCF-recommended Class A Chinese journal
  • Class T1 high-quality science and technology journal in the computing field
Che Haiyan, Feng Tie, Zhang Jiachen, Chen Wei, and Li Dali. Automatic Knowledge Extraction from Chinese Natural Language Documents[J]. Journal of Computer Research and Development, 2013, 50(4): 834-842.

Automatic Knowledge Extraction from Chinese Natural Language Documents

More Information
  • Published Date: April 14, 2013
  • Abstract: Automatic knowledge extraction methods can automatically recognize and extract factual knowledge that matches an ontology from Web documents. Such factual knowledge can not only be used to implement knowledge-based services but also provides the semantic content needed to realize the vision of the Semantic Web. However, natural language documents, especially Chinese natural language documents, are very difficult to process. This paper proposes a new automatic knowledge extraction method (AKE) based on Semantic Web theory and Chinese natural language processing (NLP) technologies. The method uses aggregated knowledge concepts to represent N-ary relation knowledge in an ontology, and it can automatically extract not only explicit but also implicit simple and N-ary complex factual knowledge from Chinese natural language documents, without relying on large-scale linguistic databases or synonym tables. Experimental results show that the method outperforms similar methods.
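The abstract's central representational idea, depicting an N-ary relation as an "aggregated knowledge concept" rather than a set of unconnected binary facts, can be sketched as follows. This is a minimal illustration, not the paper's actual implementation; the names `Concept`, `AggregatedFact`, and `to_triples` are hypothetical, and the reification into binary triples follows the common Semantic Web pattern for N-ary relations.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """An ontology concept, e.g. an event type such as 'Acquisition'."""
    name: str

@dataclass
class AggregatedFact:
    """An aggregated knowledge concept: one node that bundles all
    participants of an N-ary fact, each under a named role."""
    concept: Concept
    roles: dict = field(default_factory=dict)  # role name -> filler

def to_triples(fact: AggregatedFact, fact_id: str):
    """Flatten the aggregated node into binary triples, the standard
    Semantic Web reification of an N-ary relation."""
    triples = [(fact_id, "rdf:type", fact.concept.name)]
    triples += [(fact_id, role, filler) for role, filler in fact.roles.items()]
    return triples

# Hypothetical ternary fact "CompanyA acquired CompanyB in 2012",
# stored as a single aggregated node instead of three loose pairs:
acquisition = AggregatedFact(
    concept=Concept("Acquisition"),
    roles={"acquirer": "CompanyA", "target": "CompanyB", "year": 2012},
)
```

Keeping the participants on one node preserves the fact's identity, so an extractor can attach implicit roles (e.g. a year recovered from context) to the same fact rather than producing disconnected binary relations.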
