Cost-effective Framework for Rapid Underwater Mapping with Digital Camera and Color Correction Method

Anjin Chang, Jinha Jung, Dugan Um, Junho Yeom, Frederick Hanselmann

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Geo-referenced mapping in an aquatic environment is challenging because it is hard to measure the locations of objects and images underwater. In this paper, we propose a cost-effective framework for generating 2D and 3D underwater maps from digital imagery using a Structure from Motion (SfM) algorithm, and share the results. The proposed method consists of data acquisition, image processing, and color correction. A total of 292 and 437 images were acquired at two study sites located in Spring Lake in San Marcos, Texas, U.S.A. Agisoft Photoscan Pro software was used to generate 3D point cloud data and orthomosaic images after feature matching and image alignment of the geo-tagged imagery. High-resolution mosaic images (< 0.2 cm per pixel) were generated from the 2D underwater images. After color correction, the attenuated red channel was recovered and the blue color cast was removed. The 3D underwater map was generated directly from the 3D dense point clouds, including geo-coordinates and RGB color information. As a result, Very High Resolution (VHR) 2D and 3D maps were generated, and the topographic surface of underwater structures was captured in great detail. Although the RMSE was about 1 m, the proposed method provided a more detailed surface of underwater features.
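The abstract describes recovering the attenuated red channel and removing the blue cast of underwater imagery. The paper's exact correction method is not given here; as a minimal sketch of the general idea, a gray-world white balance rescales each color channel so its mean matches the overall gray level, which boosts the weakened red channel and suppresses the dominant blue-green channels. The function name and synthetic test image below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gray_world_correction(img):
    """Gray-world white balance on an (H, W, 3) uint8 RGB image.

    Underwater, red light attenuates fastest, leaving a blue-green
    cast. Scaling each channel so its mean equals the overall mean
    compensates for this: the red gain is > 1, the blue gain < 1.
    This is a generic baseline, not the paper's specific method.
    """
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # per-channel mean
    gray = channel_means.mean()                      # target gray level
    gains = gray / channel_means                     # per-channel scale
    corrected = np.clip(img * gains, 0, 255)
    return corrected.astype(np.uint8)

# Example: a synthetic image with a strong blue cast (weak red, strong blue).
rng = np.random.default_rng(0)
cast = (rng.random((32, 32, 3)) * np.array([60.0, 120.0, 200.0])).astype(np.uint8)
balanced = gray_world_correction(cast)
```

After correction, the per-channel means of `balanced` are approximately equal, i.e. the blue cast is neutralized.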

Original language: English (US)
Pages (from-to): 1776-1785
Number of pages: 10
Journal: KSCE Journal of Civil Engineering
Volume: 23
Issue number: 4
DOIs
State: Published - Apr 1 2019

Keywords

  • 3D mapping
  • Structure from Motion (SfM)
  • bathymetry
  • underwater
  • underwater color correction

ASJC Scopus subject areas

  • Civil and Structural Engineering
