Automated classification of underwater multispectral imagery for coral reef monitoring

A. C. R. Gleason, Pamela R. Reid, Kenneth Voss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

25 Citations (Scopus)

Abstract

Survey protocols for monitoring the composition and status of coral reef benthic communities vary in the level of detail acquired, but fundamentally follow one of two approaches: (1) a diver identifies organisms in the field, or (2) an analyst identifies organisms from underwater imagery (photos or video). Both methods are highly labor-intensive and require a trained biologist or geologist. A method for automated classification of reef benthos would improve coral reef monitoring by reducing the cost of data analysis. Spectral classification of standard (three-band) color underwater imagery does not work well for distinguishing major bottom types. Recent publications of hyperspectral reflectance of corals, algae, and sediment, on the other hand, suggest that careful choice of narrow (∼10 nm) spectral bands might improve classification accuracy relative to the three wide bands available on commercial cameras. We built an underwater multispectral camera to test whether narrow spectral bands were actually superior to standard RGB cameras for automated classification of underwater images. A filter wheel was used to acquire imagery in six 10 nm spectral bands, which were chosen from suggestions in the literature. Results indicate that the algorithms suggested in the literature require very careful compensation for variable illumination and water column attenuation for even marginal success in classifying underwater imagery. On the other hand, a new algorithm, based on the normalized difference ratio of images at 568 nm and 546 nm, can reliably segment photosynthetic organisms (corals and algae) from non-photosynthetic background. Moreover, when this new algorithm is combined with very simple texture segmentation, the general cover classes of coral and algae can be discriminated from the image background with accuracies on the order of 80%. These results suggest that a combination of high spectral resolution and texture-based image segmentation may be an optimal methodology for automated classification of underwater coral reef imagery.
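The abstract does not spell out implementation details, but the core of the proposed approach (a normalized difference ratio of the 568 nm and 546 nm bands, followed by a simple texture step) can be sketched in a few lines. The Python snippet below is a minimal illustration, assuming co-registered, illumination-corrected band images supplied as floating-point arrays; the function name, the threshold values, and the local-standard-deviation texture measure are placeholders chosen for illustration, not the parameters or texture features reported in the paper.

import numpy as np
from scipy import ndimage

def segment_reef_image(img_568, img_546, ndr_threshold=0.05,
                       texture_window=9, texture_threshold=0.1):
    """Sketch of NDR-based segmentation of reef imagery.

    img_568, img_546 : co-registered 2-D arrays for the 568 nm and
    546 nm bands (assumed already corrected for illumination and
    water-column attenuation). All thresholds are illustrative.
    """
    b568 = np.asarray(img_568, dtype=float)
    b546 = np.asarray(img_546, dtype=float)

    # Normalized difference ratio of the two narrow bands.
    ndr = np.zeros_like(b568)
    np.divide(b568 - b546, b568 + b546, out=ndr, where=(b568 + b546) > 0)

    # High NDR -> photosynthetic benthos (coral or algae);
    # everything else is treated as non-photosynthetic background.
    photosynthetic = ndr > ndr_threshold

    # A "very simple" texture measure: local standard deviation of one band.
    mean = ndimage.uniform_filter(b568, size=texture_window)
    mean_sq = ndimage.uniform_filter(b568 ** 2, size=texture_window)
    local_std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

    # Split the photosynthetic mask by texture into two cover classes.
    coral = photosynthetic & (local_std > texture_threshold)
    algae = photosynthetic & ~coral
    return ndr, coral, algae

The thresholds would have to be tuned on real imagery, and the paper describes its texture segmentation only as "very simple", so the local standard deviation above stands in as one plausible choice rather than the authors' actual feature.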

Original language: English (US)
Title of host publication: Oceans Conference Record (IEEE)
DOIs: 10.1109/OCEANS.2007.4449394
State: Published - 2007
Event: Oceans 2007 MTS/IEEE Conference - Vancouver, BC, Canada
Duration: Sep 29, 2007 - Oct 4, 2007

Other

Other: Oceans 2007 MTS/IEEE Conference
Country: Canada
City: Vancouver, BC
Period: 9/29/07 - 10/4/07

Fingerprint

coral reef, imagery, monitoring, coral, alga, segmentation, benthos, texture, spectral resolution, reflectance, reef, labor, water column, filter, methodology, cost, sediment, organism, spectral band, method

ASJC Scopus subject areas

  • Oceanography

Cite this

Automated classification of underwater multispectral imagery for coral reef monitoring. / Gleason, A. C. R.; Reid, Pamela R.; Voss, Kenneth.

Oceans Conference Record (IEEE). 2007. 4449394.

Gleason, ACR, Reid, PR & Voss, K 2007, Automated classification of underwater multispectral imagery for coral reef monitoring. in Oceans Conference Record (IEEE), 4449394, Oceans 2007 MTS/IEEE Conference, Vancouver, BC, Canada, 9/29/07. https://doi.org/10.1109/OCEANS.2007.4449394
@inproceedings{6bfed8a51a6f4f8aa77aa58e7256e83e,
title = "Automated classification of underwater multispectral imagery for coral reef monitoring",
author = "Gleason, {A. C R} and Reid, {Pamela R} and Kenneth Voss",
year = "2007",
doi = "10.1109/OCEANS.2007.4449394",
language = "English (US)",
isbn = "0933957351",
booktitle = "Oceans Conference Record (IEEE)",

}

TY - GEN

T1 - Automated classification of underwater multispectral imagery for coral reef monitoring

AU - Gleason, A. C R

AU - Reid, Pamela R

AU - Voss, Kenneth

PY - 2007

Y1 - 2007

UR - http://www.scopus.com/inward/record.url?scp=50449097647&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=50449097647&partnerID=8YFLogxK

U2 - 10.1109/OCEANS.2007.4449394

DO - 10.1109/OCEANS.2007.4449394

M3 - Conference contribution

SN - 0933957351

SN - 9780933957350

BT - Oceans Conference Record (IEEE)

ER -