Zhu Hong, Ding Shifei, Xu Xinzheng. An AP Clustering Algorithm of Fine-Grain Parallelism Based on Improved Attribute Reduction[J]. Journal of Computer Research and Development, 2012, 49(12): 2638-2644.

An AP Clustering Algorithm of Fine-Grain Parallelism Based on Improved Attribute Reduction

More Information
  • Published Date: December 14, 2012
  • Affinity propagation (AP) clustering considers all data points simultaneously as potential exemplars. It takes the similarities between pairs of data points as input and forms clusters gradually during a message-passing procedure. Compared with existing clustering algorithms, AP is efficient and fast on large data sets; however, the cost of computing the pairwise similarities grows with the number of attributes, so attribute reduction is important for AP. Meanwhile, fine-grain parallelism is emphasized in the design of massively parallel computers to achieve higher performance. In this paper, an AP clustering algorithm based on improved attribute reduction and fine-grain parallelism (IRPAP) is proposed. Firstly, granularity is introduced into parallel computing and the granularity principle is applied. Secondly, the data set is preprocessed with an improved attribute reduction algorithm in which the elements of the discernibility matrix are computed and selected in parallel, in order to reduce both time and space complexity. Finally, the data set is clustered by a parallel AP algorithm whose work is divided among multiple threads running simultaneously. Experimental results show that the IRPAP algorithm is more efficient than the AP algorithm for large data set clustering.
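The message-passing procedure at the core of AP alternates between two kinds of updates: responsibilities r(i,k), which measure how well point k serves as an exemplar for point i, and availabilities a(i,k), which accumulate evidence that k should be an exemplar. Below is a minimal serial sketch in Python/NumPy of the standard updates, not the paper's IRPAP implementation; the damping factor and iteration count are illustrative assumptions. IRPAP additionally partitions this per-element work across threads.

```python
import numpy as np

def affinity_propagation(S, damping=0.9, iters=200):
    """Minimal serial AP sketch. S is an n x n similarity matrix whose
    diagonal holds the exemplar preferences."""
    n = S.shape[0]
    R = np.zeros((n, n))  # responsibilities r(i,k)
    A = np.zeros((n, n))  # availabilities a(i,k)
    rows = np.arange(n)
    for _ in range(iters):
        # r(i,k) <- s(i,k) - max_{k' != k} [ a(i,k') + s(i,k') ]
        AS = A + S
        best = np.argmax(AS, axis=1)
        first = AS[rows, best]
        AS[rows, best] = -np.inf          # mask each row's maximum
        second = AS.max(axis=1)           # per-row second-best value
        R_new = S - first[:, None]
        R_new[rows, best] = S[rows, best] - second
        R = damping * R + (1 - damping) * R_new
        # a(i,k) <- min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        A_new = Rp.sum(axis=0)[None, :] - Rp
        diag = A_new.diagonal().copy()    # a(k,k) is not clipped at 0
        A_new = np.minimum(A_new, 0)
        np.fill_diagonal(A_new, diag)
        A = damping * A + (1 - damping) * A_new
    # each point's exemplar maximizes a(i,k) + r(i,k)
    return np.argmax(A + R, axis=1)
```

Damping (here 0.9) averages each new message with the previous one to avoid oscillation; setting the diagonal of S to the median similarity is a common default preference that lets the number of clusters emerge from the data.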
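The attribute-reduction preprocessing rests on the discernibility matrix of rough set theory: entry (i, j) holds the attributes that distinguish objects i and j whenever their decision labels differ, and attributes that appear in singleton entries are indispensable (the core). Because each entry depends only on one pair of objects, the entries can be computed independently, which is what makes the fine-grain parallel preprocessing described above possible. The sketch below is a generic illustration under assumed data structures (objects as attribute dicts, a thread pool over pairs), not the paper's improved algorithm:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def discernibility_entry(objects, decisions, attrs, i, j):
    """Attributes distinguishing objects i and j; empty if their decisions agree."""
    if decisions[i] == decisions[j]:
        return frozenset()
    return frozenset(a for a in attrs if objects[i][a] != objects[j][a])

def discernibility_matrix(objects, decisions, attrs, workers=4):
    """Compute every pairwise entry; each pair is an independent task."""
    pairs = list(combinations(range(len(objects)), 2))
    with ThreadPoolExecutor(max_workers=workers) as ex:
        entries = list(ex.map(
            lambda p: discernibility_entry(objects, decisions, attrs, *p),
            pairs))
    return dict(zip(pairs, entries))

def core_attributes(matrix):
    """Singleton entries name attributes no reduct can drop (the core)."""
    core = set()
    for entry in matrix.values():
        if len(entry) == 1:
            core |= entry
    return core
```

A reduct is then any minimal attribute subset that intersects every non-empty entry; selecting and simplifying entries in parallel is the step the abstract's improved algorithm targets.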
  • Related Articles

    [1]Zhang Shuyi, Xi Zhengjun. Quantum Hypothesis Testing Mutual Information[J]. Journal of Computer Research and Development, 2021, 58(9): 1906-1914. DOI: 10.7544/issn1000-1239.2021.20210346
    [2]Chu Xiaokai, Fan Xinxin, Bi Jingping. Position-Aware Network Representation Learning via K-Step Mutual Information Estimation[J]. Journal of Computer Research and Development, 2021, 58(8): 1612-1623. DOI: 10.7544/issn1000-1239.2021.20210321
    [3]Xu Mengfan, Li Xinghua, Liu Hai, Zhong Cheng, Ma Jianfeng. An Intrusion Detection Scheme Based on Semi-Supervised Learning and Information Gain Ratio[J]. Journal of Computer Research and Development, 2017, 54(10): 2255-2267. DOI: 10.7544/issn1000-1239.2017.20170456
    [4]Zha Zhengjun, Zheng Xiaoju. Query and Feedback Technologies in Multimedia Information Retrieval[J]. Journal of Computer Research and Development, 2017, 54(6): 1267-1280. DOI: 10.7544/issn1000-1239.2017.20170004
    [5]Li Feng, Miao Duoqian, Zhang Zhifei, Zhang Wei. Mutual Information Based Granular Feature Weighted k-Nearest Neighbors Algorithm for Multi-Label Learning[J]. Journal of Computer Research and Development, 2017, 54(5): 1024-1035. DOI: 10.7544/issn1000-1239.2017.20160351
    [6]Xue Yuanhai, Yu Xiaoming, Liu Yue, Guan Feng, Cheng Xueqi. Exploration of Weighted Proximity Measure in Information Retrieval[J]. Journal of Computer Research and Development, 2014, 51(10): 2216-2224. DOI: 10.7544/issn1000-1239.2014.20130339
    [7]Zhang Zhenhai, Li Shining, Li Zhigang, Chen Hao. Multi-Label Feature Selection Algorithm Based on Information Entropy[J]. Journal of Computer Research and Development, 2013, 50(6): 1177-1184.
    [8]Xu Junling, Zhou Yuming, Chen Lin, Xu Baowen. An Unsupervised Feature Selection Approach Based on Mutual Information[J]. Journal of Computer Research and Development, 2012, 49(2): 372-382.
    [9]Liu He, Zhang Xianghong, Liu Dayou, Li Yanjun, Yin Lijun. A Feature Selection Method Based on Maximal Marginal Relevance[J]. Journal of Computer Research and Development, 2012, 49(2): 354-360.
    [10]Wang Wenhui, Feng Qianjin, Chen Wufan. Segmentation of Brain MR Images Based on the Measurement of Difference of Mutual Information and Gauss-Markov Random Field Model[J]. Journal of Computer Research and Development, 2009, 46(3): 521-527.
