Wang Pengjie, Pan Zhigeng, Xu Mingliang, Liu Yongkui. A Fast and Lossless Compression Algorithm for Point-Based Models Based on Local Minimal Spanning Tree[J]. Journal of Computer Research and Development, 2011, 48(7): 1263-1268.
A Fast and Lossless Compression Algorithm for Point-Based Models Based on Local Minimal Spanning Tree

More Information
  • Published Date: July 14, 2011
  • Abstract: Point-based graphics has recently become one of the most active topics in 3D computer graphics. Since point-based models are often too large to store and transfer easily under limited hardware and bandwidth, effective compression methods are needed. We propose an efficient and fast lossless geometry compression algorithm for point-based models. First, the point-sampled surface is split into many equal-sized surface patches. Over the points of each patch, a minimal spanning tree is constructed and encoded in breadth-first order; during this traversal, each point is predicted from its parent in the spanning tree. Both the predicted and the actual positions are then decomposed into sign, exponent, and mantissa, and the resulting corrections are compressed separately by arithmetic coding under different contexts. Our algorithm outperforms previous lossless compression methods for point-based models in both bit rate and running time. It nicely complements lossy compression algorithms for point-based models and can be used in situations where lossy compression is not acceptable.
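The spanning-tree prediction step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (local_mst, bfs_order, parent_residuals) are illustrative, one patch is treated in isolation, and the patch splitting and the context-based arithmetic coding of sign/exponent/mantissa corrections are omitted. Prim's algorithm over the full Euclidean graph stands in for whatever MST construction the paper uses.

```python
import heapq
from collections import deque

def local_mst(points):
    """Prim's algorithm over the complete Euclidean graph of one patch.
    Returns the parent index of each point in the spanning tree (root = -1)."""
    n = len(points)
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    parent = [-1] * n
    in_tree = [True] + [False] * (n - 1)
    heap = [(dist2(points[0], points[j]), 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    added = 1
    while added < n:
        _, i, j = heapq.heappop(heap)
        if in_tree[j]:
            continue
        parent[j], in_tree[j] = i, True
        added += 1
        for k in range(n):
            if not in_tree[k]:
                heapq.heappush(heap, (dist2(points[j], points[k]), j, k))
    return parent

def bfs_order(parent):
    """Breadth-first traversal order of the tree, so every parent is
    visited (and thus decoded) before its children."""
    children = [[] for _ in parent]
    for j, p in enumerate(parent):
        if p >= 0:
            children[p].append(j)
    order, queue = [], deque([0])
    while queue:
        v = queue.popleft()
        order.append(v)
        queue.extend(children[v])
    return order

def parent_residuals(points, parent, order):
    """Predict each point from its tree parent; the residual (actual minus
    predicted) is what would be entropy-coded. The root is predicted from
    the origin, i.e. stored verbatim."""
    res = []
    for v in order:
        pred = points[parent[v]] if parent[v] >= 0 else (0.0, 0.0, 0.0)
        res.append(tuple(a - b for a, b in zip(points[v], pred)))
    return res
```

Because the tree is emitted in breadth-first order, a decoder can rebuild every position by adding each residual to its already-decoded parent, which is what makes the scheme lossless.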