Recovery of 3D depth map from image shading for underwater applications

Shaomin Zhang, Shahriar Negahdaripour

Research output: Contribution to journal › Conference article › peer-review



We study the problem of exploiting image shading for the recovery of an object's 3D shape underwater. This requires the use of image models that account for shading effects due to the optical properties of the medium, as well as the shape and reflectance properties of the surface. A simplified image model that incorporates the attenuation of the incident light due to beam spreading, as well as medium absorption and forward-scattering of the incident and reflected light, has previously been employed to recover the orientation of a planar surface from image shading. We investigate the application of this model to the recovery of the 3D shape of curved surfaces. To do this, we have implemented two methods based on generalizations of the shape-from-shading method of Tsai and Shah. The first is an iterative method that recursively adjusts the underwater image, based on the current estimate of the target's 3D shape, to correct for the shading effects of illumination attenuation. Convergence is reached when the recovered shape, after the image shading has been corrected for the medium effects, is consistent with the object's reflectance map. The second method estimates the object's 3D shape directly from the underwater image shading model. Convergence properties and the accuracy of the solutions from the two methods are demonstrated using synthetic data. Results from experiments with images of an underwater scene are presented to show performance with real data.
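The structure of the first (iterative) method described above can be sketched as alternating between attenuation correction and a shape-from-shading solve. The sketch below is purely illustrative and not taken from the paper: it assumes a simple exponential attenuation model `I_corr = I * exp(c * d)` in place of the paper's full beam-spreading/absorption/forward-scattering model, a Lambertian reflectance map with the light source along the viewing direction, and a Tsai-Shah-style per-pixel Newton update; all function names and parameters are hypothetical.

```python
import numpy as np

def correct_attenuation(image, depth, c=0.1):
    # Illustrative medium correction: undo exponential attenuation along
    # the estimated path length (stand-in for the paper's full model).
    return image * np.exp(c * depth)

def shape_from_shading(image, n_iters=50):
    # Tsai-Shah-style solver sketch: light source along the viewing
    # direction, so the reflectance map is R = 1 / sqrt(1 + p^2 + q^2).
    Z = np.zeros_like(image)
    for _ in range(n_iters):
        p = Z - np.roll(Z, 1, axis=1)          # discrete x-gradient of depth
        q = Z - np.roll(Z, 1, axis=0)          # discrete y-gradient of depth
        R = 1.0 / np.sqrt(1.0 + p**2 + q**2)   # Lambertian reflectance map
        f = image - R                          # brightness error at each pixel
        df = (p + q) / (1.0 + p**2 + q**2)**1.5  # d f / d Z (per pixel)
        # Damped Newton step; the small constant guards against df = 0.
        Z = Z - f * df / (df**2 + 1e-4)
    return Z

def recover_depth_underwater(image, medium_c=0.1, outer_iters=5):
    # Outer loop of the iterative scheme: correct the image for the
    # medium effects using the current depth estimate, then re-solve
    # shape from shading, until the two are mutually consistent.
    depth = np.zeros_like(image)
    for _ in range(outer_iters):
        corrected = np.clip(correct_attenuation(image, depth, medium_c), 0.0, 1.0)
        depth = shape_from_shading(corrected)
    return depth
```

In this toy form the inner solver and the attenuation model are deliberately minimal; the paper's contribution lies in using the full underwater image model inside this loop (and, in the second method, folding that model directly into the shape-from-shading estimation).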

Original language: English (US)
Pages (from-to): 618-622
Number of pages: 5
Journal: Oceans Conference Record (IEEE)
State: Published - Dec 1 1997
Event: Proceedings of the 1997 Oceans Conference, Part 1 (of 2) - Halifax, NS, Canada
Duration: Oct 6 1997 - Oct 9 1997

ASJC Scopus subject areas

  • Oceanography


