Towards Spectral-Texture Approach to Hyperspectral Image Analysis for Plant Classification

Ali AlSuwaidi, Bruce Grieve, Hujun Yin

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


    Abstract

    The use of hyperspectral imaging systems to study plant properties, types, and conditions has increased significantly owing to their numerous economic benefits. Such systems can also enable automatic identification of plant phenotypes and underpin a new generation of precision agriculture techniques, for instance, the selective application of plant nutrients to crops, preventing costly losses to soils and the associated environmental impact of their ingress into watercourses. This paper is concerned with the analysis of hyperspectral images and data for monitoring and classifying plant conditions. A spectral-texture approach based on feature selection and the Markov random field model is proposed to enhance classification and prediction performance compared with conventional approaches. Two independent hyperspectral datasets, captured by two proximal hyperspectral instruments with different acquisition dates and exposure times, were used in the evaluation. Experimental results show promising improvements in the discrimination performance of the proposed approach. The study shows that such an approach can shed light on the attributes that better differentiate plants, their properties, and their conditions.
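
    The sketch below is a minimal, hypothetical illustration of how a spectral-texture classification pipeline of this general kind might be assembled, not the authors' implementation. In particular, the Markov random field texture model described in the paper is replaced here by simple local mean/variance statistics as a stand-in, feature selection is done with mutual information, and all data shapes, parameter values, and variable names are placeholders.

    ```python
    # Hypothetical spectral-texture pipeline sketch (not the paper's method).
    # Texture is approximated by local neighbourhood statistics; the paper
    # uses a Markov random field model instead.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def texture_features(cube, window=3):
        """Per-pixel local mean and variance for each band of a (rows, cols, bands) cube."""
        pad = window // 2
        padded = np.pad(cube, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
        rows, cols, bands = cube.shape
        feats = np.empty((rows, cols, 2 * bands))
        for i in range(rows):
            for j in range(cols):
                patch = padded[i:i + window, j:j + window, :]
                feats[i, j, :bands] = patch.mean(axis=(0, 1))   # local mean per band
                feats[i, j, bands:] = patch.var(axis=(0, 1))    # local variance per band
        return feats

    # Placeholder hyperspectral cube (rows x cols x bands) and per-pixel labels.
    cube = np.random.rand(64, 64, 100)
    labels = np.random.randint(0, 2, (64, 64))

    spectral = cube.reshape(-1, cube.shape[-1])
    texture = texture_features(cube).reshape(-1, 2 * cube.shape[-1])
    X = np.hstack([spectral, texture])   # combined spectral-texture features
    y = labels.ravel()

    clf = make_pipeline(
        StandardScaler(),
        SelectKBest(mutual_info_classif, k=30),  # keep the most informative features
        SVC(kernel="rbf"),
    )
    print("CV accuracy:", cross_val_score(clf, X, y, cv=3).mean())
    ```

    The choice of selector, window size, and classifier here is arbitrary; the point is only to show the combination of per-pixel spectral features, texture features, and feature selection feeding a single classifier.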
    Original language: Undefined
    Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2017
    Editors: Hujun Yin, Yang Gao, Songcan Chen, Yimin Wen, Guoyong Cai, Tianlong Gu, Junping Du, Antonio J. Tallón-Ballesteros, Minling Zhang
    Place of Publication: Cham
    Publisher: Springer Nature
    Pages: 251-260
    Number of pages: 10
    ISBN (Print): 978-3-319-68935-7
    Publication status: Published - 2017

    Publication series

    Name: Lecture Notes in Computer Science
    Volume: 10585
    ISSN (Print): 0302-9743
