    Zhu Lixin, Pheng Ann Heng, Xia Deshen. Nonlinear Diffusion based Image Denoising Coupling Gradient Fidelity Term[J]. Journal of Computer Research and Development, 2007, 44(8): 1390-1398.

    Nonlinear Diffusion based Image Denoising Coupling Gradient Fidelity Term

Abstract: Image denoising with second-order nonlinear diffusion PDEs often leads to an undesirable staircase effect, namely the transformation of smooth regions into piecewise constant ones. In this paper, these nonlinear diffusion models are improved by adding the Euler-Lagrange equation derived from a gradient fidelity term, which describes the similarity between the gradients of the noisy image and of the restored one. After coupling this new constraint equation derived from the gradient fidelity term, the classical second-order PDE-based denoising models produce piecewise smooth results while preserving sharp jump discontinuities in images. The convexity of the proposed model is proved, which ensures the existence and uniqueness of the optimal solution. The influence of introducing spatial regularization into the gradient estimation is also analyzed, and the importance of selecting a proper regularization parameter for the final results is emphasized both theoretically and experimentally. In addition, the gradient fidelity term is integrable in the space of functions of bounded variation, which allows the model to outperform fourth-order nonlinear PDE-based denoising methods, which suffer from leakage problems and sensitivity to high-frequency components in images. Experimental results show that the new model alleviates the staircase effect to some extent while preserving image features such as textures and edges. A numerical sketch of this kind of coupled model is given below.
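    The abstract does not give the model in closed form, so the following is only a minimal numerical sketch, not the paper's exact scheme. It assumes a Perona-Malik-type diffusivity, explicit time stepping, periodic boundary handling via np.roll, and a gradient fidelity term of the assumed form (beta/2) * ∫ |∇u − ∇(G_σ∗f)|² dx, whose gradient-descent contribution is beta * div(∇u − ∇(G_σ∗f)). The function and parameter names (denoise_gradient_fidelity, k, beta, sigma, dt) are illustrative and not the paper's notation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pm_diffusivity(grad_mag, k=10.0):
    # Perona-Malik diffusivity g(s) = 1 / (1 + (s/k)^2) -- an illustrative choice
    return 1.0 / (1.0 + (grad_mag / k) ** 2)

def laplacian(v):
    # 5-point Laplacian with periodic boundaries (np.roll), for simplicity
    return (np.roll(v, -1, 0) + np.roll(v, 1, 0) +
            np.roll(v, -1, 1) + np.roll(v, 1, 1) - 4.0 * v)

def denoise_gradient_fidelity(f, n_iter=200, dt=0.1, k=10.0, beta=0.2, sigma=1.5):
    # Explicit gradient-descent sketch: second-order nonlinear diffusion coupled
    # with a gradient fidelity term that pulls grad(u) toward grad(G_sigma * f).
    # All parameter values are illustrative assumptions, not the paper's settings.
    f = f.astype(np.float64)
    u = f.copy()
    f_sigma = gaussian_filter(f, sigma)       # spatially regularized gradient reference
    lap_f_sigma = laplacian(f_sigma)
    for _ in range(n_iter):
        # forward differences for the image gradient
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        g = pm_diffusivity(np.hypot(ux, uy), k)
        # div(g * grad u) with matching backward differences
        div_pm = ((g * ux - np.roll(g * ux, 1, axis=1)) +
                  (g * uy - np.roll(g * uy, 1, axis=0)))
        # gradient fidelity coupling: beta * div(grad u - grad f_sigma)
        #                           = beta * (lap u - lap f_sigma)
        u += dt * (div_pm + beta * (laplacian(u) - lap_f_sigma))
    return u
```

    In this sketch, sigma plays the role of the spatial regularization applied to the gradient estimate and beta weights the gradient fidelity coupling; consistent with the abstract, the choice of these regularization parameters strongly influences the final result.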