
    Noisy Image Super-Resolution Reconstruction Based on Sparse Representation

    • Abstract: Traditional approaches to super-resolving a noisy image perform denoising and super-resolution reconstruction as two separate steps, whereas the method based on sparse representation and dictionary learning handles the two jointly. This paper proposes a collaborative scheme that solves the noisy-image super-resolution problem through the sparse representations of image patches over a trained dictionary pair. Since an image patch can be well represented as a sparse linear combination of atoms from an appropriately chosen over-complete dictionary, a pair of dictionaries is trained from noisy low-resolution and clean high-resolution patches, respectively, under the constraint that corresponding low- and high-resolution patches share the same sparse representation over their own dictionaries. Given a noisy low-resolution input, the sparse coefficients of each low-resolution patch are first computed over the low-resolution dictionary; the same coefficients are then applied to the high-resolution dictionary to reconstruct a clean high-resolution patch, and a global optimization step assembles these patches into the final clean high-resolution image, so that super-resolution and denoising are accomplished simultaneously. Experiments show that magnifying the low-resolution image to an intermediate resolution with a locally adaptive zooming algorithm before feature extraction yields better reconstruction quality than the previously used bicubic interpolation, and that a suitable setting of the dictionary parameter λ makes the super-resolution and denoising results optimal at the same time, giving a clear advantage in both objective image quality and visual effect and demonstrating the validity and robustness of the proposed method.
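    To make the per-patch reconstruction step described in the abstract concrete, a minimal sketch follows. It assumes the dictionary pair D_l (noisy low-resolution feature space) and D_h (clean high-resolution patch space) has already been trained; the function name, patch sizes, and the use of scikit-learn's Lasso solver are illustrative assumptions, not the authors' implementation.

# Minimal sketch (illustrative, not the authors' code) of the per-patch step:
# encode a noisy low-resolution patch over the low-resolution dictionary D_l,
# then reuse the same sparse coefficients with the high-resolution dictionary D_h.
import numpy as np
from sklearn.linear_model import Lasso

def reconstruct_patch(y_lr, D_l, D_h, lam=0.1):
    """y_lr: flattened noisy low-resolution feature vector, shape (d_l,)
       D_l:  low-resolution dictionary, shape (d_l, K)
       D_h:  high-resolution dictionary, shape (d_h, K)
       lam:  sparsity weight, corresponding to the λ parameter in the abstract."""
    # Solve  min_a  ||D_l a - y_lr||_2^2 + lam * ||a||_1  (sparse coding step)
    coder = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
    coder.fit(D_l, y_lr)
    alpha = coder.coef_          # sparse representation shared by both dictionaries
    return D_h @ alpha           # clean high-resolution patch estimate

# Toy usage with random data; real dictionaries would come from joint training.
rng = np.random.default_rng(0)
D_l = rng.standard_normal((36, 512))   # e.g. 6x6 low-resolution feature patches
D_h = rng.standard_normal((81, 512))   # e.g. 9x9 high-resolution patches
y = rng.standard_normal(36)            # one noisy low-resolution patch (features)
patch_hr = reconstruct_patch(y, D_l, D_h, lam=0.1)

    In the full method, estimates from overlapping patches are combined and a global optimization pass produces the final clean high-resolution image; since the same λ controls the sparsity of the code, it also governs the balance between detail recovery and noise suppression, which is why the abstract reports tuning it so that super-resolution and denoising are optimized together.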

       
