Liao Haibin, Xu Bin. Robust Face Expression Recognition Based on Gender and Age Factor Analysis[J]. Journal of Computer Research and Development, 2021, 58(3): 528-538. DOI: 10.7544/issn1000-1239.2021.20200288

Robust Face Expression Recognition Based on Gender and Age Factor Analysis

Funds: This work was supported by the National Natural Science Foundation of China (61701174), the Xianning Municipal Natural Science Foundation (2019kj130), and the Cultivation Foundation of Hubei University of Science and Technology (202022GP03).
  • Published Date: February 28, 2021
  • Abstract: A robust facial expression recognition method based on deep conditional random forests is proposed to address confounding factors such as race, gender, and age in uncontrolled environments. Unlike traditional single-task facial expression recognition models, we devise an effective multi-task architecture that learns from auxiliary attributes such as gender and age. Our study finds that the facial attributes of gender and age have a strong influence on facial expression recognition. To capture the relationship between facial attributes and expressions, a deep conditional random forest conditioned on facial attributes is proposed for expression recognition. In the feature extraction stage, multi-instance learning combined with an attention mechanism is used to extract face features that are robust to variations in illumination, occlusion, and low resolution. In the recognition stage, a multi-condition random forest recognizes expressions according to the gender and age attributes of the face. Extensive experiments on the public CK+, ExpW, RAF-DB, and AffectNet facial expression databases show a recognition rate of 99% on the normalized CK+ database and 70.52% on the challenging natural-scene databases. The results show that the proposed method outperforms state-of-the-art methods and is robust to occlusion, noise, and resolution variation in the wild.
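The multi-condition stage described in the abstract can be sketched as a set of random forests, one per (gender, age-group) condition, with each test sample routed to the forest matching its estimated attributes. This is a minimal illustrative sketch using scikit-learn and synthetic data; the deep feature extractor, attribute estimators, and all names here are assumptions, not the authors' implementation.

```python
# Hypothetical sketch: attribute-conditioned random forests for
# expression recognition. Deep features and attribute labels are
# replaced by synthetic data for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "deep features" for 600 faces, plus gender (0/1) and
# age group (0: young, 1: middle-aged, 2: elderly) attributes.
X = rng.normal(size=(600, 32))
gender = rng.integers(0, 2, size=600)
age_group = rng.integers(0, 3, size=600)
y = rng.integers(0, 7, size=600)  # 7 basic expression classes

# Train one forest per (gender, age-group) condition.
forests = {}
for g in (0, 1):
    for a in (0, 1, 2):
        mask = (gender == g) & (age_group == a)
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(X[mask], y[mask])
        forests[(g, a)] = clf

def predict_expression(x, g, a):
    """Route a sample to the forest matching its estimated attributes."""
    return forests[(g, a)].predict(x.reshape(1, -1))[0]

pred = predict_expression(X[0], gender[0], age_group[0])
```

In the paper's pipeline, the gender and age attributes would themselves be predicted from the face features rather than given; conditioning the forests on them lets each forest specialize in the expression statistics of one demographic group.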