Line detection method with robustness against contrast and width variation applied in gradient vector field

Yukiyasu Yoshinaga, Hidefumi Kobatake

    Research output: Contribution to journal › Article › peer-review

    7 Citations (Scopus)

    Abstract

    This paper proposes a new method of extracting lines, characterized as long and narrow bright regions (curvilinear convex regions) in images. In general, variations in contrast and line width make the extraction of ridgelines and skeletons difficult. This paper therefore proposes the line convergence vector field, a model of the curvilinear convex region based on the intensity gradient vector field; the model is independent of both contrast and scale. We then propose the line convergence degree as an evaluation value of this model, enabling the extraction of curvilinear convex regions, and introduce the Gradient-Angle-Weighted Hough Transform (GAWHT) to compute that evaluation value. Moreover, we derive the theoretical value of the line convergence degree and show its effectiveness in extracting curvilinear convex regions. In experiments, we applied the method to artificial and real images. The results show that lines are successfully extracted with a normalized index of the shape of the curvilinear convex region, demonstrating the method's effectiveness in extracting such regions independently of contrast and scale.
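    The paper's exact formulations of the line convergence degree and the GAWHT are not reproduced on this page. As a conceptual illustration of the underlying idea — scoring a pixel by how strongly the surrounding gradient *orientations* converge toward it, which makes the score independent of contrast — the following NumPy sketch computes a simple point-convergence measure. The function name, neighbourhood radius, and threshold are hypothetical simplifications, not the authors' formulation.

    ```python
    import numpy as np

    def convergence_degree(image, radius=3, eps=1e-12):
        """Mean cosine of the angle between each neighbour's gradient
        vector and the direction from that neighbour toward the centre
        pixel.  Only gradient *orientation* is used, so the result is
        unchanged when the image contrast is rescaled.  (Hypothetical
        simplification of the paper's line convergence degree.)"""
        gy, gx = np.gradient(image.astype(float))
        mag = np.hypot(gx, gy)
        # unit gradient vectors (zero where the gradient vanishes)
        ux = np.where(mag > eps, gx / np.where(mag > eps, mag, 1.0), 0.0)
        uy = np.where(mag > eps, gy / np.where(mag > eps, mag, 1.0), 0.0)
        h, w = image.shape
        out = np.zeros((h, w))
        offsets = [(dy, dx) for dy in range(-radius, radius + 1)
                   for dx in range(-radius, radius + 1)
                   if (dy, dx) != (0, 0)]
        for y in range(radius, h - radius):
            for x in range(radius, w - radius):
                acc, n = 0.0, 0
                for dy, dx in offsets:
                    qy, qx = y + dy, x + dx
                    if mag[qy, qx] <= eps:
                        continue  # no orientation information here
                    d = np.hypot(dx, dy)
                    # unit direction from neighbour q toward centre p
                    tox, toy = -dx / d, -dy / d
                    acc += ux[qy, qx] * tox + uy[qy, qx] * toy
                    n += 1
                out[y, x] = acc / n if n else 0.0
        return out
    ```

    On a bright ridge the gradients on both sides point inward, so the measure peaks along the ridge centre and is near zero in flat or one-sided regions; rescaling the image intensities leaves the output unchanged, mirroring the contrast independence claimed for the model.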

    Original language: English
    Pages (from-to): 49-58
    Number of pages: 10
    Journal: Systems and Computers in Japan
    Volume: 31
    Issue number: 3
    Publication status: Published - Mar 1, 2000

    All Science Journal Classification (ASJC) codes

    • Theoretical Computer Science
    • Information Systems
    • Hardware and Architecture
    • Computational Theory and Mathematics
