Citation: Liu Zhuang, Liu Chang, Wayne Lin, Zhao Jun. Pretraining Financial Language Model with Multi-Task Learning for Financial Text Mining[J]. Journal of Computer Research and Development, 2021, 58(8): 1761-1772. DOI: 10.7544/issn1000-1239.2021.20210298

Pretraining Financial Language Model with Multi-Task Learning for Financial Text Mining

Funds: This work was supported by the Basic Scientific Research Project (General Program) of the Department of Education of Liaoning Province and the University-Industry Collaborative Education Program of the Ministry of Education of China (202002037015).
  • Published Date: July 31, 2021
Abstract: Financial text mining is becoming increasingly important as the number of financial documents grows rapidly. With the progress of machine learning, extracting valuable information from financial literature has gained attention among researchers, and deep learning has boosted the development of effective financial text mining models. However, because deep learning models require large amounts of labeled training data, applying deep learning to financial text mining often fails due to the scarcity of labeled data in the financial domain. Recent research on training contextualized language representation models on large text corpora sheds light on the possibility of leveraging vast amounts of unlabeled financial text. We introduce F-BERT (BERT for financial text mining), a domain-specific language representation model pretrained on large-scale financial corpora. Built on the BERT architecture, F-BERT effectively transfers knowledge from a large amount of financial text to financial text mining models with minimal task-specific architecture modifications. The results show that F-BERT outperforms most current state-of-the-art models, which demonstrates the effectiveness and robustness of the proposed approach.
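To make the pretrain-then-fine-tune recipe described in the abstract concrete, the following minimal sketch continues masked-language-model pretraining of a generic BERT checkpoint on unlabeled financial text and then attaches a classification head for a downstream task. It uses the Hugging Face transformers and datasets libraries; the checkpoint name, corpus path, output directory, and label count are illustrative assumptions, not the authors' released artifacts.

    # Sketch of domain-adaptive MLM pretraining followed by task fine-tuning.
    # All names below (corpus file, output dir, label count) are hypothetical.
    from datasets import load_dataset
    from transformers import (
        AutoModelForMaskedLM,
        AutoModelForSequenceClassification,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    # Unlabeled financial corpus, one document per line (hypothetical path).
    corpus = load_dataset("text", data_files={"train": "financial_corpus.txt"})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=128)

    train_set = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

    # Dynamic masking with BERT's standard 15% mask rate (the MLM objective).
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

    trainer = Trainer(
        model=mlm_model,
        args=TrainingArguments(output_dir="f-bert-mlm",
                               num_train_epochs=1,
                               per_device_train_batch_size=16),
        train_dataset=train_set,
        data_collator=collator,
    )
    trainer.train()
    trainer.save_model("f-bert-mlm")

    # Downstream transfer then needs only a task head swap: reload the adapted
    # encoder with a fresh classification head and fine-tune on labeled data.
    clf = AutoModelForSequenceClassification.from_pretrained("f-bert-mlm", num_labels=2)

In BERT-style transfer, only the final head changes per task, which is what the abstract's "minimal task-specific architecture modifications" typically amounts to in practice.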
  • Related Articles

    [1]Cui Yuanning, Sun Zequn, Hu Wei. A Pre-trained Universal Knowledge Graph Reasoning Model Based on Rule Prompts[J]. Journal of Computer Research and Development, 2024, 61(8): 2030-2044. DOI: 10.7544/issn1000-1239.202440133
    [2]Wei Shaopeng, Liang Ting, Zhao Yu, Zhuang Fuzhen, Ren Fuji. Multi-View Heterogeneous Graph Neural Network Method for Enterprise Credit Risk Assessment[J]. Journal of Computer Research and Development, 2024, 61(8): 1957-1967. DOI: 10.7544/issn1000-1239.202440126
    [3]Chen Rui, Wang Zhanquan. Uni-LSDPM: A Unified Online Learning Session Dropout Prediction Model Based on Pre-Training[J]. Journal of Computer Research and Development, 2024, 61(2): 441-459. DOI: 10.7544/issn1000-1239.202220834
    [4]Wen Yimin, Yuan Zhe, Yu Hang. A New Semi-Supervised Inductive Transfer Learning Framework: Co-Transfer[J]. Journal of Computer Research and Development, 2023, 60(7): 1603-1614. DOI: 10.7544/issn1000-1239.202220232
    [5]Wang Yan, Tong Xiangrong. Cross-Domain Trust Prediction Based on tri-training and Extreme Learning Machine[J]. Journal of Computer Research and Development, 2022, 59(9): 2015-2026. DOI: 10.7544/issn1000-1239.20210467
    [6]Zhang Dongjie, Huang Longtao, Zhang Rong, Xue Hui, Lin Junyu, Lu Yao. Fake Review Detection Based on Joint Topic and Sentiment Pre-Training Model[J]. Journal of Computer Research and Development, 2021, 58(7): 1385-1394. DOI: 10.7544/issn1000-1239.2021.20200817
    [7]Cheng Xiaoyang, Zhan Yongzhao, Mao Qirong, Zhan Zhicai. Video Semantic Analysis Based on Topographic Sparse Pre-Training CNN[J]. Journal of Computer Research and Development, 2018, 55(12): 2703-2714. DOI: 10.7544/issn1000-1239.2018.20170579
    [8]Wen Yimin, Tang Shiqi, Feng Chao, Gao Kai. Online Transfer Learning for Mining Recurring Concept in Data Stream Classification[J]. Journal of Computer Research and Development, 2016, 53(8): 1781-1791. DOI: 10.7544/issn1000-1239.2016.20160223
    [9]Hong Jiaming, Yin Jian, Huang Yun, Liu Yubao, Wang Jiahai. TrSVM: A Transfer Learning Algorithm Using Domain Similarity[J]. Journal of Computer Research and Development, 2011, 48(10): 1823-1830.
    [10]Mei Canhua, Zhang Yuhong, Hu Xuegang, Li Peipei. A Weighted Algorithm of Inductive Transfer Learning Based on Maximum Entropy Model[J]. Journal of Computer Research and Development, 2011, 48(9): 1722-1728.
  • Cited by

Periodical citations (9)

    1. Fang Haiquan, Deng Mingming. Research on an intelligent government-affairs question answering system with autonomous learning and memory. 电子技术应用. 2024(01): 21-26.
    2. Cao Ce, Chen Yan, Zhou Lanjiang. A method for identifying financial fraud in listed companies based on deep learning and text sentiment. 计算机工程与应用. 2024(04): 338-346.
    3. Hu Juxiang, Lü Xueqiang, You Xindong, Zhou Jianshe. Fund news classification with clustering-based annotation and multi-granularity feature fusion. 小型微型计算机系统. 2024(02): 257-264.
    4. Wang Runzhou, Zhang Xinsheng, Wang Minghu. Text classification fusing dynamic masked attention with multi-teacher, multi-feature knowledge distillation. 中文信息学报. 2024(03): 113-129.
    5. Kang Lei, Zhang Yu. Consumer demand for Russian down jackets based on text mining. 现代纺织技术. 2024(08): 108-116.
    6. Wen Yimin, Yuan Zhe, Yu Hang. A new semi-supervised inductive transfer learning framework: Co-Transfer. 计算机研究与发展. 2023(07): 1603-1614.
    7. Ding Xiaowei, Ji Jing, Zhao Xiaoyu, Wang Benqiang, Ding Yijie, Wang Xiandong. Sentiment perception and risk early warning for Internet financial security: an exploration based on BERT. 情报杂志. 2023(09): 57-70.
    8. Bi Xin, Nie Haojie, Zhao Xiangguo, Yuan Ye, Wang Guoren. Reinforcement learning based reasoning for constrained question answering over knowledge graphs. 软件学报. 2023(10): 4565-4583.
    9. Hu Dan. Analysis of text big-data mining methods in finance. 互联网周刊. 2022(09): 12-14.

Other citation types (17)

Article views: 954; PDF downloads: 759; Cited by: 26