Vegetation mapping using multispectral UAV images


DJI recently introduced the P4 Multispectral, a high-precision unmanned aerial vehicle (UAV or “drone”) that leverages multispectral camera integration to aid agricultural and environmental monitoring applications. As a result, collecting imagery data for vegetation mapping is now easier and more efficient than ever.

In the DJI P4 Multispectral, images are collected by one RGB camera and a multispectral array of five global-shutter cameras covering the blue, green, red, red-edge, and near-infrared bands, each at a resolution of 1600 x 1300 pixels (Figure 1). Real-time, centimeter-accurate positioning data are recorded for images captured by all six cameras: an onboard synchronization system aligns the flight controller, the RGB/multispectral cameras, and the RTK module, fixing the positioning data to the center of each CMOS sensor and ensuring that every frame carries the most accurate metadata. All cameras also undergo a calibration process in which radial and tangential lens distortions are measured and recorded in each image's metadata to facilitate post-processing.

Importantly, an integrated spectral sunlight sensor on top of the drone measures solar irradiance, which is used to compensate the image data for changes in ambient light and thereby maximize the accuracy and consistency of data collected at different times of day. This irradiance normalization yields more accurate and comparable NDVI results.
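NDVI itself is simply a normalized ratio of the near-infrared and red bands. A minimal numpy sketch (an illustration of the standard formula, not the drone's onboard processing):

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R).

    Inputs are reflectance rasters as arrays; eps avoids division by zero
    over dark pixels. Output values range from -1 to 1, with healthy
    vegetation typically above ~0.3.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance patches: vegetation reflects strongly in NIR,
# so the first pixels score high while the bare-soil-like pixel scores low.
nir = np.array([[0.6, 0.5], [0.4, 0.1]])
red = np.array([[0.1, 0.1], [0.1, 0.1]])
print(ndvi(nir, red))
```

Irradiance compensation matters here because NDVI compares band magnitudes: if the ambient light changes between flights, uncorrected digital numbers shift and the index drifts even when the vegetation has not changed.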

Figure 1: DJI P4 multispectral drone.

Study area and data collection

Babol Noshirvani University of Technology (BNUT), Iran's leading university according to the Times Higher Education World University Rankings, is located in the north of the country. The 11-hectare campus comprises several buildings and a green space mainly planted with orange trees (Figure 2).

The dataset was collected on October 24, 2020. The flight was planned in the DJI GS Pro iPad app at an altitude of 70 meters with 65% forward and side overlap. Image collection was carried out at midday to minimize shadows, and it took about ten minutes to cover the campus with 522 geotagged nadir RGB and multispectral images.
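The pixel footprint on the ground at a given flight altitude follows the standard photogrammetric relation GSD = altitude x pixel size / focal length. A quick sketch with illustrative sensor parameters (assumed values for demonstration, not the P4 Multispectral's exact specifications):

```python
def ground_sampling_distance(altitude_m, focal_length_mm, pixel_size_um):
    """Ground sampling distance in meters per pixel.

    GSD = altitude * pixel_size / focal_length, with all terms
    converted to meters before dividing.
    """
    return altitude_m * (pixel_size_um * 1e-6) / (focal_length_mm * 1e-3)

# Illustrative numbers only: 70 m altitude, 5.74 mm lens, 3 um pixels.
gsd = ground_sampling_distance(70, 5.74, 3.0)
print(f"{gsd * 100:.1f} cm/pixel")
```

Planning tools such as DJI GS Pro perform this calculation internally, which is how the chosen altitude of 70 meters translates into the centimeter-level resolution reported below.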

Figure 2: BNUT campus (orange line) and study area (green line).

Data processing

Photogrammetric processing of the drone images was carried out in Agisoft Metashape. The workflow comprised image alignment to produce a sparse point cloud, followed by dense point cloud, mesh, texture, digital elevation model (DEM) and orthomosaic generation. To build a 3D map of the study area, the multispectral point cloud and the orthomosaic were then exported in LAS (.las) and GeoTIFF (.tiff) formats, respectively. The result was a 3D point cloud with a density of 900 points/m² and an orthomosaic with a ground sampling distance (GSD) of 3 centimeters (Figure 3).

Figure 3: True-color dense point cloud of the study area.

Results

The multispectral orthomosaic derived from the photogrammetric processing of the UAV images was used to calculate the vegetation indices listed in Table 1. Well-known multispectral and visible-band vegetation indices were used, including NDVI, NDRE, NGRDI, VDVI, CIVE, ExG, ExR and VEG. The corresponding vegetation index maps for the study area are shown in Figure 4.
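Table 1 gives the exact definitions used in this study. As an illustration, the standard textbook forms of several of these indices can be computed directly from the band rasters with numpy (the band arrays and value ranges here are assumptions for the sketch):

```python
import numpy as np

def _normdiff(a, b, eps=1e-10):
    """Normalized difference (a - b) / (a + b), guarded against zeros."""
    return (a - b) / (a + b + eps)

def vegetation_indices(r, g, b, nir=None, re=None):
    """Compute common vegetation indices from reflectance bands in [0, 1].

    Visible-band indices need only R, G, B; NDVI and NDRE additionally
    require the near-infrared (nir) and red-edge (re) bands.
    """
    out = {
        "NGRDI": _normdiff(g, r),                          # (G - R) / (G + R)
        "ExG":   2 * g - r - b,                            # excess green
        "ExR":   1.4 * r - g,                              # excess red
        "VDVI":  (2 * g - r - b) / (2 * g + r + b + 1e-10),
    }
    if nir is not None:
        out["NDVI"] = _normdiff(nir, r)                    # (NIR - R) / (NIR + R)
        if re is not None:
            out["NDRE"] = _normdiff(nir, re)               # (NIR - RE) / (NIR + RE)
    return out

# Single vegetated pixel: green and NIR dominate red and blue.
vals = vegetation_indices(np.array([0.2]), np.array([0.4]), np.array([0.1]),
                          nir=np.array([0.6]), re=np.array([0.3]))
for name, v in vals.items():
    print(f"{name}: {v[0]:+.3f}")
```

Applied to the whole orthomosaic, each call returns a raster per index, which is how the maps in Figure 4 are produced.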

Although trees and lawns are highlighted by all of the vegetation indices, vegetation zones are most clearly distinguished by NDVI. Buildings and areas without vegetation are also clearly delineated by all of the indices. The NGRDI and VEG indices produced similar results and outperformed the other visible-band indices. The CIVE, VDVI, ExG and ExR indices are sensitive to shadows, so shadowed areas are incorrectly highlighted as vegetation.

Table 1: Vegetation indices. R: red, G: green, B: blue, NIR: near infrared and RE: red edge.

Conclusion

Multispectral UAV images can be used for many applications, such as urban tree mapping, horticulture and precision agriculture. In addition to opening up new applications, multispectral UAV images allow RGB-derived vegetation indices to be calibrated and validated more precisely. As a result, drone-based RGB images will be an invaluable source of data for green space management in urban and rural areas.

Figure 4: Vegetation indices.

Further Reading

McKinnon, Tom, and Paul Hoff. Comparison of RGB-based vegetation indices with NDVI for agricultural drone imagery. Agribotix.com 21.17 (2017): 1-8.

Yeom, Junho, et al. Comparison of vegetation indices derived from UAV data to differentiate the effects of tillage in agriculture. Remote Sensing 11.13 (2019): 1548.

Stary, K., et al. Comparison of RGB-based vegetation indices from UAV images to estimate hop canopy area. Agricultural Research 18.4 (2020): 2592-2601.

Acknowledgments

The author would like to thank Roodkhiz Water and Environment Company for collecting the UAV images.
