Transferability and the effect of colour calibration during multi-image classification of Arctic vegetation change

New publication by Samira Kolyaie, Urs Albert Treier, Gary Richard Watmough, Bjarke Madsen, Peder Klith Bøcher, Achilleas Psomas, Ruedi Bösch, Signe Normand

Abstract:

Mapping changes in vegetation cover is essential for understanding the consequences of climate change on Arctic ecosystems. Classification of ultra-high spatial-resolution (UHR, < 1 cm) imagery can provide estimates of vegetation cover across space and time. The challenge of this approach is to ensure that classifications are comparable across many images taken under different illumination conditions and at different locations. With warming, vegetation at higher elevations is expected to resemble current vegetation at lower elevations. To investigate the value of UHR image classification for monitoring vegetation change, we collected visible and near-infrared images from 108 plots with hand-held cameras along an altitudinal gradient in Greenland and examined the classification accuracy of shrub cover on independent images (i.e. classification transferability). We implemented several models to examine whether colour calibration based on an in-image calibration target improves transferability. The classifier was trained on different numbers of images to find the minimum training subset size. With a training set of ~ 20% of the images, the overall accuracy levelled off at about 81% and 68% on the non-calibrated training and validation images, respectively. Colour calibration improved the accuracy on training images (1–4%) but improved classifier transferability significantly only for training sets < 20%. Linear calibration based only on the target's grey series improved transferability the most. Reasonable transferability of Arctic shrub cover classification can thus be obtained from spectral data alone and about 20% of all images. This is promising for vegetation monitoring through multi-image classification of UHR imagery acquired with hand-held cameras or Unmanned Aerial Systems.
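To illustrate the kind of linear grey-series calibration the abstract refers to, the sketch below fits a per-channel gain and offset from the grey patches of an in-image target and applies it to the whole image. It is a minimal illustration, not the authors' implementation: the function names, the reference reflectance values, and the patch measurements are all hypothetical.

```python
# Minimal sketch of per-channel linear colour calibration from an in-image
# grey target. All names and numbers here are illustrative assumptions,
# not values from the publication.
import numpy as np

def fit_linear_calibration(observed_grey, reference_grey):
    """Fit gain and offset per channel by least squares.

    observed_grey:  (n_patches, n_channels) mean pixel values of the
                    grey-series patches measured in the image.
    reference_grey: (n_patches,) known reflectance of each grey patch.
    Returns (gain, offset), each of shape (n_channels,).
    """
    gains, offsets = [], []
    for c in range(observed_grey.shape[1]):
        # Solve reference = gain * observed + offset for this channel.
        A = np.stack([observed_grey[:, c], np.ones(len(reference_grey))], axis=1)
        gain, offset = np.linalg.lstsq(A, reference_grey, rcond=None)[0]
        gains.append(gain)
        offsets.append(offset)
    return np.array(gains), np.array(offsets)

def apply_calibration(image, gain, offset):
    """Map raw pixel values to calibrated, reflectance-like values."""
    return image.astype(float) * gain + offset

# Example with made-up numbers: four grey patches, three channels.
observed = np.array([[ 30.,  28.,  32.],
                     [ 90.,  85.,  95.],
                     [160., 150., 165.],
                     [220., 210., 230.]])
reference = np.array([0.1, 0.35, 0.6, 0.85])  # assumed patch reflectances
gain, offset = fit_linear_calibration(observed, reference)
image = np.random.randint(0, 256, size=(64, 64, 3))
calibrated = apply_calibration(image, gain, offset)
```

Calibrating every image against the same reference values in this way puts images taken under different illumination conditions on a common radiometric scale before classification.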

https://doi.org/10.1007/s00300-019-02491-7