Mapping seaweed using drones

Introduction

Seaweed provides vital ecological services and commercial resources in coastal areas. However, macroalgal habitats face various environmental and anthropogenic pressures. Mapping and monitoring macroalgal habitats are crucial for sustainable management and conservation of these valuable resources.

Ireland harvests an estimated 28,000 tonnes of seaweed annually (FAO 2021), although the true harvested volume is probably higher and difficult to establish with certainty. The last seaweed survey in Ireland was conducted in 1998; it covered nine counties in the west of Ireland and focused on two main resources, kelp and Ascophyllum nodosum (rockweed, egg wrack).

Since then, technology has made a huge leap forward. Drones now enable mapping and biomass assessment of seaweed resources at scale, and multispectral sensors allow for accurate species delineation and change monitoring. This study investigates the use of drones equipped with RGB and multispectral sensors for enhanced classification and assessment of macroalgal habitats compared to traditional RGB imagery alone. By integrating multispectral bands, we demonstrate more accurate delineation of macroalgal species, which is vital for addressing conservation challenges.

Study sites

The study was carried out at three locations in the west and north of Ireland: Inishmore, the Aran Islands, Co. Galway (Site 1), Caherrush Point, Co. Clare (Site 2), and Gortnamullan, Co. Donegal (Site 3). The surveys were conducted at low tide to capture the fully exposed intertidal zone.

UAV survey of the bay in Inishmore.

Donegal UAV survey.

Quilty UAV survey.

Methodology

Aerial surveys were conducted using a fixed-wing eBee Ag drone. The drone is fitted with a real-time kinematic (RTK) positioning unit, which significantly reduces the number of ground control points needed for georectification of orthomosaics. RGB imagery was captured using a 20 Mpix S.O.D.A. camera, and a Parrot Sequoia+ camera was used to collect multispectral imagery. Each of the camera's sensors has a narrow-band filter for collecting data in the green, red, red-edge and near-infrared wavelengths. The multispectral sensor was calibrated using a reflectance panel before each flight. The camera is also paired with a downwelling light sensor, which allows for automatic radiometric calibration during the flight.

eBee Ag fixed-wing drone

The dual camera allows for simultaneous collection of RGB and multispectral images, reducing the survey time and ensuring consistent lighting conditions.
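As a rough illustration, single-point radiometric calibration with a reflectance panel amounts to scaling raw digital numbers by a gain derived from the panel's known reflectance. This is a minimal sketch with hypothetical values, not the Sequoia+'s actual internal workflow:

```python
import numpy as np

def panel_calibrate(band_dn, panel_mean_dn, panel_reflectance):
    """Convert raw digital numbers (DN) to reflectance using a single
    reference panel of known reflectance (one gain, zero offset)."""
    gain = panel_reflectance / panel_mean_dn
    return band_dn * gain

# Hypothetical NIR band: the calibration panel (known reflectance 0.5)
# reads a mean digital number of 20000 in the panel image.
nir_dn = np.array([[10000.0, 30000.0], [20000.0, 5000.0]])
nir_reflectance = panel_calibrate(nir_dn, panel_mean_dn=20000.0,
                                  panel_reflectance=0.5)
```

In practice, the downwelling light sensor refines this further by compensating for changes in ambient irradiance between the panel shot and each survey image.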

The collection of drone photographs was stitched into RGB and multispectral orthomosaics. Additionally, composite seven-band RGB-MS images were created.
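Conceptually, building the seven-band composite is a band-axis stack of co-registered RGB and multispectral rasters. A minimal numpy sketch, with hypothetical array shapes and assuming both orthomosaics have already been resampled to a common grid:

```python
import numpy as np

# Hypothetical co-registered orthomosaics as (bands, rows, cols) arrays:
# RGB contributes 3 bands; the multispectral camera contributes
# green, red, red-edge and NIR.
rows, cols = 100, 100
rgb = np.random.rand(3, rows, cols).astype(np.float32)
ms = np.random.rand(4, rows, cols).astype(np.float32)

# Stack along the band axis to build the seven-band RGB-MS composite.
rgb_ms = np.concatenate([rgb, ms], axis=0)
```

A real workflow would do the resampling and stacking in GIS software or with a geospatial library so that georeferencing is preserved.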

Survey site at Inishmore, in RGB (left) and false-colour composite (right)

Species classification was then performed on a pixel-by-pixel basis on the three image types. Three supervised classifiers were used to classify the orthomosaics: maximum likelihood classifier (MLC), support vector machine (SVM) and random forest (RF). For classifier training, a minimum of 50 training polygons was created for each class.
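The pixel-by-pixel classification step can be sketched with scikit-learn, which provides RF and SVM implementations (MLC, a per-class multivariate Gaussian model, is typically run in GIS software and is omitted here). The training arrays below are hypothetical stand-ins for pixels sampled from the training polygons:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Hypothetical training set: each row is one pixel drawn from the
# training polygons, each column one of the seven RGB-MS band values.
rng = np.random.default_rng(0)
X_train = rng.random((500, 7))
y_train = rng.integers(0, 4, 500)   # e.g. four habitat classes

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
svm = SVC(kernel="rbf").fit(X_train, y_train)

# Classify every pixel of a (bands, rows, cols) orthomosaic.
image = rng.random((7, 50, 50))
pixels = image.reshape(7, -1).T      # (n_pixels, n_bands)
rf_map = rf.predict(pixels).reshape(50, 50)
svm_map = svm.predict(pixels).reshape(50, 50)
```

Reshaping the raster to a pixel table, predicting, and reshaping back is the standard pattern for applying per-pixel classifiers to imagery.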

The accuracy of the habitat classification maps was assessed using confusion matrices, overall accuracy (OA), user’s accuracy (UA) and producer’s accuracy (PA). To that end, validation points were generated using a stratified random sampling strategy, whereby each class receives a number of points proportional to its relative area. Ground survey data and aerial images were used to validate the classification results.
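OA, UA and PA all derive from the confusion matrix. A minimal sketch, assuming rows hold reference (true) classes and columns hold map classes; the two-class matrix is hypothetical and for illustration only:

```python
import numpy as np

def accuracy_metrics(cm):
    """cm[i, j] = number of validation points whose reference class is i
    and whose mapped class is j.
    Returns overall, user's (per map class) and producer's (per
    reference class) accuracies."""
    cm = np.asarray(cm, dtype=float)
    oa = np.trace(cm) / cm.sum()
    ua = np.diag(cm) / cm.sum(axis=0)   # correct / total mapped as class j
    pa = np.diag(cm) / cm.sum(axis=1)   # correct / total reference class i
    return oa, ua, pa

cm = [[50, 5],
      [10, 35]]
oa, ua, pa = accuracy_metrics(cm)
```

UA measures how reliable the map is for a user standing on a mapped pixel (commission error), while PA measures how completely each real class was captured (omission error).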

Nowadays, there are multiple options to facilitate field data collection, eliminate paper notes and streamline data transfer. One such option is the QField app. It allows users to collect field information on a mobile phone and integrate it with a desktop GIS: collected data are stored as a shapefile, compatible with QGIS or any other GIS software. Geotagged photographs are attached to the shapefile attribute table and can be imported into the desktop app to verify drone surveys and image classification results.

Ground survey points collected at Site 3 using the QField app.

Results

RGB images can be used for mapping broad classes, such as red, green, and brown algae, and even certain individual species. However, the missing spectral information causes confusion between vegetation and non-vegetation, which is visible in the classification images of the site in Donegal:

Classification images of Site 3, a)-c) RGB, d)-f) MS, g)-i) RGB-MS

On the RGB images (a-c), substratum pixels were mistakenly classified as Himanthalia elongata and Fucus spp. because these species are similar in colour to the substratum at this site. Adding spectral bands allowed us to separate vegetation pixels from non-vegetation more accurately, and also to distinguish between brown seaweed species, achieving an overall accuracy of about 80% for RGB-MS images:

| Image type | RF         | MLC        | SVM        |
|------------|------------|------------|------------|
| RGB        | 66.5 ± 2.8 | 59.2 ± 2.8 | 67.9 ± 2.5 |
| MS         | 70.6 ± 2.4 | 76.0 ± 2.4 | 70.1 ± 3.2 |
| RGB-MS     | 77.3 ± 2.5 | 82.0 ± 2.3 | 81.4 ± 2.4 |

Overall accuracy (%), Site 3.

However, multispectral images did not perform well in mapping red and green algae, overestimating their presence at the study sites (see images d-f below):

Classification images of Site 1, a)-c) RGB, d)-f) MS, g)-i) RGB-MS

In certain areas, RGB images were better at mapping red and green algae than the MS datasets:

Classification images of Site 2, a)-c) RGB, d)-f) MS, g)-i) RGB-MS

Misclassification of red and green algae resulted in overall accuracies for the MS classification maps similar to, or even lower than, those of the RGB images:

| Image type | RF         | MLC        | SVM        |
|------------|------------|------------|------------|
| RGB        | 65.2 ± 2.6 | 57.5 ± 2.6 | 62.1 ± 2.5 |
| MS         | 63.7 ± 2.4 | 49.9 ± 2.4 | 66.4 ± 2.5 |
| RGB-MS     | 70.7 ± 2.3 | 72.5 ± 2.5 | 76.7 ± 2.4 |

Overall accuracy (%), Site 2.

The low accuracy of mapping red and green algae using MS images can be attributed to the fact that the multispectral sensor used in the surveys collects imagery in only four bands: 550 ± 40, 660 ± 40, 735 ± 10, and 790 ± 40 nm. These bands are shown below, superimposed on the reflectance spectra of several seaweed species.

Original spectra: Gema Casal, Tom Rossiter, Thomas Furey, Dagmar B. Stengel, Building a spectral library for coastal mapping in Ireland using Earth Observation, 2019

While the spectral information the sensor collects is critical for distinguishing between seaweed species, its limited number of bands fails to capture some of the critical regions of the spectra, reducing the mapping accuracy for certain species. Sensors with more spectral bands may be a solution to this problem (for example, the 10-band multispectral camera used by Román et al.).

Error matrices were converted to area-proportion matrices using the methodology proposed by Olofsson et al. (1 and 2), whereby an area estimator corrected for omission and commission bias is introduced. Class areas were then calculated to estimate the spatial extent of different seaweed species at each site.
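The stratified estimator can be sketched as follows: each map class i is a stratum with weight W_i equal to its share of the mapped area, the cell proportions are estimated as p_ij = W_i * n_ij / n_i (n_i being the stratum's sample size), and the bias-corrected area of reference class j is the total area times the column sum of p. All numbers below are hypothetical:

```python
import numpy as np

def stratified_area_estimates(cm, strata_areas):
    """Bias-corrected class-area estimates from a stratified sample,
    in the spirit of Olofsson et al.
    cm[i, j]        = sample counts with map class i (the stratum)
                      and reference class j.
    strata_areas[i] = mapped area of class i."""
    cm = np.asarray(cm, dtype=float)
    w = strata_areas / strata_areas.sum()                   # stratum weights W_i
    p = (w[:, None] * cm) / cm.sum(axis=1, keepdims=True)   # estimated p_ij
    return strata_areas.sum() * p.sum(axis=0)               # area per reference class

areas = np.array([80.0, 20.0])        # mapped areas in hectares (hypothetical)
cm = np.array([[45, 5],
               [10, 40]])             # map classes in rows, reference in columns
est = stratified_area_estimates(cm, areas)
```

The estimator shifts area from over-mapped classes to under-mapped ones: here the 80 ha mapped as class 1 shrinks once its commission errors are accounted for.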

Site 3 was the least complex site in terms of species (which were also juvenile at the time of the survey and well spaced out), and the classifiers yielded similar results across datasets, demonstrating good agreement.

Class area coverage estimates in hectares, Site 3.

In more complex sites, such as the one near Quilty, where seaweed species were mature and layered on top of one another, image classification posed a greater challenge, leading to discrepancies among the classifiers. For instance, the Maximum Likelihood Classifier (MLC) overestimated the presence of Fucus spp. in RGB-MS classification, and accurately mapping red algae proved to be difficult.

Class area coverage estimates in hectares, Site 2.

The SVM algorithm exhibited a more balanced performance at complex sites, as evidenced by the overall accuracy table above and the confusion matrices below. The addition of spectral data notably increased the mapping accuracy for species such as Fucus spp. However, it is important to note the particularly low user's accuracies (main diagonal elements in the row-normalized matrices) for green algae; in this context, RGB images outperformed multispectral images for this class. The SVM classifier identified red and green seaweed with accuracy comparable to that of RGB images and showed a superior capability in detecting the various types of brown algae, making it the more suitable option for complex site classification.

Row-normalized (user’s accuracies) and column-normalized (producer’s accuracies) confusion matrices, Site 2. Rows represent map classes predicted by the classifiers, whereas columns represent reference (true) classes. Diagonal elements represent correctly predicted classes; the confusion is expressed by the off-diagonal elements.

To summarise, RGB images can effectively map broad classes such as red, green, and brown algae, and can even outperform MS images in mapping red and green algae. MS images, however, are better at separating different brown seaweed species. RGB-MS images combine the advantages of both data types, yielding the highest overall accuracy. At less complex sites, the classifiers produced comparable results, while the SVM algorithm demonstrated more balanced and accurate performance at taxonomically complex sites. These findings underscore the importance of selecting data types and classification algorithms appropriate to the characteristics of the study site. They also advocate for broader adoption of UAVs and multispectral sensors in ecological research, while emphasizing the need to consider environmental factors, site characteristics, and species-specific traits that may influence algorithm effectiveness in intertidal habitat monitoring.
