Adaptive smoothing via contextual and local discontinuities

    Research output: Contribution to journal › Article › peer-review

    Abstract

    A novel adaptive smoothing approach is proposed for noise removal and feature preservation, in which two distinct measures are simultaneously adopted to detect discontinuities in an image. Inhomogeneity underlying an image is employed as a multiscale measure to detect contextual discontinuities for feature preservation and control of the smoothing speed, while the local spatial gradient is used to detect variable local discontinuities during smoothing. Unlike previous adaptive smoothing approaches, our algorithm combines the two discontinuity measures so that they act synergistically in preserving nontrivial features, leading to a constrained anisotropic diffusion process in which inhomogeneity provides intrinsic constraints for selective smoothing. Thanks to these intrinsic constraints, our smoothing scheme is insensitive to the termination time: the images produced over a wide range of iterations yield nearly identical results in various early vision tasks. Our algorithm is formally analyzed and related to anisotropic diffusion. Comparative results indicate that our algorithm yields favorable smoothing results, and its application to the extraction of hydrographic objects demonstrates its usefulness as a tool for early vision. © 2005 IEEE.
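
    The abstract describes the algorithm only at a high level. As an illustrative sketch (not the paper's actual formulation), the Python fragment below shows one way such a scheme can be realized: a Perona-Malik-style diffusion step whose conductance follows the local gradient (local discontinuities), with the diffusion speed attenuated by a windowed standard-deviation measure standing in for the paper's multiscale inhomogeneity (contextual discontinuities). The function name, the window statistic, and the parameters k, win, and lam are all assumptions for illustration.

        import numpy as np
        from numpy.lib.stride_tricks import sliding_window_view

        def adaptive_smooth(img, iters=50, k=10.0, win=5, lam=0.2):
            # Hypothetical sketch: gradient-driven diffusion whose step size
            # is scaled down where a windowed inhomogeneity measure is high,
            # so feature-rich regions smooth slowly and flat regions fast.
            u = img.astype(float).copy()
            pad = win // 2
            for _ in range(iters):
                # Contextual discontinuity proxy: local standard deviation
                # over a win x win window, normalized to [0, 1].
                p = np.pad(u, pad, mode="reflect")
                inhom = sliding_window_view(p, (win, win)).std(axis=(-2, -1))
                inhom = inhom / (inhom.max() + 1e-12)

                # Local discontinuities: nearest-neighbor intensity
                # differences (np.roll wraps at the borders; a real
                # implementation would handle boundaries explicitly).
                dn = np.roll(u, -1, axis=0) - u
                ds = np.roll(u, 1, axis=0) - u
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u

                # Perona-Malik-style conductance: small for large gradients,
                # so smoothing is inhibited across strong edges.
                g = lambda d: np.exp(-((d / k) ** 2))

                # Inhomogeneity throttles the smoothing speed near features.
                speed = lam * (1.0 - inhom)
                u += speed * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
            return u

    In this reading, the (1 - inhom) factor plays the role of the intrinsic constraint the abstract refers to: where inhomogeneity saturates, diffusion effectively stops, which is consistent with the claim that results are insensitive to the termination time.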
    Original language: English
    Pages (from-to): 1552-1567
    Number of pages: 15
    Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
    Volume: 27
    Issue number: 10
    Publication status: Published - Oct 2005

    Keywords

    • Adaptive smoothing
    • Anisotropic diffusion
    • Extraction of hydrographic objects
    • Feature preservation
    • Inhomogeneity
    • Local scale control
    • Multiple scales
    • Noise removal
    • Spatial gradient
    • The termination problem
