    Feng Lin, Liu Shenglan, Zhang Jing, and Wang Huibing. Robust Activation Function of Extreme Learning Machine and Linear Dimensionality Reduction in High-Dimensional Data[J]. Journal of Computer Research and Development, 2014, 51(6): 1331-1340.

    Robust Activation Function of Extreme Learning Machine and Linear Dimensionality Reduction in High-Dimensional Data

Extreme learning machine (ELM), with the advantages of fast training and high classification accuracy, has been widely used in practical applications such as face recognition, with good results. However, ELM is often severely affected by noise and outliers in high-dimensional real-world datasets, which reduces its classification accuracy. This can be attributed to two causes: 1) the high dimensionality of the input samples, and 2) improper selection of the activation function. Both cause the outputs of the activation function to approach zero, which in turn degrades the performance of ELM. For the first problem, we propose a robust linear dimensionality reduction method, RAF-Global Embedding (RAF-GE), to preprocess the high-dimensional data before classifying it with ELM. For the second, we analyze different activation functions in depth and propose a robust activation function (RAF) whose outputs are kept away from zero, improving the performance of both RAF-GE and ELM. Experimental results show that the face recognition method in this paper generally outperforms comparable methods that use other activation functions.
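The abstract does not give the closed form of RAF or of RAF-GE, so the sketch below only illustrates the standard ELM pipeline the paper builds on: hidden weights are drawn at random and only the output weights are solved in closed form, which is what makes ELM fast to train. All names, shapes, and the sigmoid activation here are illustrative assumptions, not the paper's method.

```python
# Minimal single-hidden-layer ELM sketch (NumPy). Hypothetical names;
# the sigmoid used here is illustrative, not the paper's RAF.
import numpy as np

def elm_fit(X, Y, n_hidden=100, activation=None, seed=0):
    """Train an ELM: random hidden weights, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    act = activation if activation is not None else (lambda z: 1.0 / (1.0 + np.exp(-z)))
    H = act(X @ W + b)                 # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y       # output weights via the Moore-Penrose pseudoinverse
    return W, b, beta, act

def elm_predict(X, W, b, beta, act):
    return act(X @ W + b) @ beta

# Toy usage with synthetic data. With very high-dimensional X, the
# pre-activations X @ W + b grow in magnitude; for an activation such as
# the Gaussian exp(-z**2), the hidden outputs are then driven toward zero,
# which is the failure mode the abstract attributes to high dimensionality
# and poor activation choice.
X = np.random.default_rng(1).standard_normal((200, 30))
Y = (X[:, :1] > 0).astype(float)       # toy binary target
W, b, beta, act = elm_fit(X, Y, n_hidden=50)
pred = elm_predict(X, W, b, beta, act)
```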