Optical calibration, bathymetry, water column correction and bottom typing of shallow marine areas, using passive remote sensing imagery


See also: the review of papers by other investigators, and 4SM errors

1 - NO NEED for field data, nor for atmospheric correction
2 - this is demonstrated on this website, using a variety of hyper/multispectral data
Requirements are
1 - homogeneous water body and atmosphere
2 - some coverage of optically deep water
3 - some coverage of dry land
Problems are
1 - the precision of estimated depths is found wanting, because the noise-equivalent change in radiance of accessible data is too high for shallow water column correction work
2 - radiance data should be preprocessed by the provider at level 1 in order to improve the S/N ratio
3 - exponential decay: the deeper/darker the bottom, the poorer the performance
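These two problems are linked: the bottom contrast decays as exp(-K*Z), so once it drops below the sensor's noise-equivalent change in radiance, depth can no longer be resolved. A minimal sketch of this limit, with purely illustrative values for the radiances, K and the noise level (none of these are calibrated figures):

```python
import math

# Illustrative values (assumptions, not calibrated data); radiances in arbitrary units
LsB, Lsw = 120.0, 30.0   # bright-sand radiance at Z=0, deep-water radiance
K = 0.30                 # two-way attenuation coefficient, per metre
NEdL = 1.0               # noise-equivalent change in radiance of the sensor

# Bottom contrast at depth Z is (LsB - Lsw) * exp(-K * Z);
# it falls below NEdL at Zmax = ln((LsB - Lsw) / NEdL) / K
Zmax = math.log((LsB - Lsw) / NEdL) / K
print(f"Contrast drops below the noise level at about {Zmax:.1f} m")
```

With these assumed numbers the contrast drops below the noise at roughly 15 m; a darker bottom (smaller LsB-Lsw) hits that limit much sooner.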
I keep digging until suitable data become available.

Your comments are invited. See you on the 4SM blog.

4SM details

  • The water body is homogeneous: each waveband i is assumed to have a specific operational two-way attenuation coefficient Ki for radiance
  • The water body and the atmosphere are homogeneous: each waveband i is assumed to exhibit a deep water radiance value Lswi over a deep water area in the image
  • The backscatter of the deep water column is noted Lw (the water volume reflectance)
    • Atmospheric path radiance is noted La
    • We assume that we may write Lsw=La+Lw, and that glint at the sea surface is negligible
  • Then 4SM introduces some more assumptions, as follows.

On the beach

  • A clean and fine-grained coral sand pixel on the beach is bright in both bands i and j in the image: LsBi and LsBj
  • A black body pixel on the beach is black in both bands in the image: Lai and Laj
  • From the brightest pixel to the darkest pixel on the beach, we may consider the linear radiometric model of the Soil Line (SL)
  • All intermediate pixels on the beach ideally plot along the linear SL, in a ~diagonal position in a bidimensional histogram
  • Of course, this is a sheer reduction of the spectral diversity of natural substrates
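The Soil Line can be sketched as the straight segment joining the black body pixel to the bright sand pixel in a two-band histogram. A minimal illustration with assumed radiance values (Lai, Laj, LsBi, LsBj are invented for the example, not calibrated figures):

```python
# Illustrative Soil Line: beach pixels plot between the dark pixel (La) and
# the bright sand pixel (LsB) in bands i and j (values are assumptions)
Lai, Laj = 8.0, 5.0        # black body (path radiance) pixel
LsBi, LsBj = 120.0, 140.0  # bright coral-sand pixel

# Linear Soil Line in the (Lj, Li) histogram: Li = slope * Lj + intercept
slope = (LsBi - Lai) / (LsBj - Laj)
intercept = Lai - slope * Laj

# Any intermediate beach pixel at fraction t along the segment
t = 0.5
Li = Lai + t * (LsBi - Lai)
Lj = Laj + t * (LsBj - Laj)
assert abs(Li - (slope * Lj + intercept)) < 1e-9  # it plots on the Soil Line
```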

Shallow bottoms

  • Let the tide come: Ls = Lsw + (LsB-Lsw)/exp(K*Z)
    • Ls decreases from Ls=LsB (Z=0) all the way to Ls=Lsw (Z=infinite) as Z increases
    • at all visible and near-infrared wavelengths
  • This model accounts for the backscatter of the shallow water column (i.e. the color of the water), because Lsw=La+Lw
  • Let Ki<Kj, and examine the scatter plot Lsi = f(Lsj)
  • ==> the brightest pixels plot along an exponential-shaped bulge: the Brightest Pixels Line (BPL): this is the "exponential decay"
  • ==> the BPL may be thought of as the outer physical limit of the scatter plot
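The decay above follows directly from the model Ls = Lsw + (LsB-Lsw)/exp(K*Z). A minimal sketch, with assumed values for LsB, Lsw and K (illustrative only):

```python
import math

# Simplified RTE of the text: Ls = Lsw + (LsB - Lsw) / exp(K * Z),
# written here with the equivalent exp(-K * Z). Values are illustrative assumptions.
def Ls(Z, LsB=120.0, Lsw=30.0, K=0.30):
    return Lsw + (LsB - Lsw) * math.exp(-K * Z)

shallow = Ls(0.0)   # equals LsB at null depth
deep = Ls(50.0)     # approaches Lsw as Z grows large
```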

A RATIO METHOD: a consistent system of ratios
  • for a 3-band image with bands i, j and k, one gets a consistent system of 3 ratios
    • Ki/Kj
    • Ki/Kk
    • ==> Kj/Kk = (Ki/Kk) / (Ki/Kj)
  • for a 4-band image with bands i, j, k and l, one gets a consistent system of 6 ratios
  • and so on
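The consistency of the system is easy to verify numerically: the third ratio follows from the first two, since (Ki/Kk)/(Ki/Kj) = Kj/Kk. A sketch with illustrative K values (assumptions, not measured coefficients):

```python
# Consistency of the ratio system; K values are illustrative assumptions
Ki, Kj, Kk = 0.08, 0.20, 0.45
r_ij = Ki / Kj   # first observed ratio
r_ik = Ki / Kk   # second observed ratio

# The third ratio is implied by the first two:
r_jk = r_ik / r_ij   # Kj/Kk = (Ki/Kk) / (Ki/Kj)
assert abs(r_jk - Kj / Kk) < 1e-9
```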
A seed value
  • A seed value may be introduced for Ki at any particular wavelength
    • in order to specify spectral K, from Ki to Kn, for all available wavebands i to n
  • This seed value does not need to be realistic: it might as well be taken as Ki=any_value,
    • as all other values are then specified through the series of ratios
  • This choice of seed value only affects the depth computed in meters,
    • while the "water column correction" actually depends on the ratios among spectral K values to compute the spectral bottom signature.




  • Some sea truth is still needed to fine-tune this calibration of Z:
    • a CoefZ must be estimated from seatruth depth data through linear regression
    • a tide correction may be applied
Zfinal = TideHeight + CoefZ * Zcomputed
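The CoefZ estimation can be sketched as a least-squares fit of seatruth depths against computed depths (fitted through the origin here for simplicity; all depth values below are invented for illustration):

```python
# Fitting CoefZ against seatruth by least squares through the origin
# (depth values are illustrative assumptions, not real soundings)
z_computed = [1.0, 2.5, 4.0, 6.0, 8.5]    # 4SM depths, arbitrary units
z_seatruth = [1.2, 3.1, 4.9, 7.3, 10.4]   # tide-corrected sounding depths, metres

coef_z = sum(c * s for c, s in zip(z_computed, z_seatruth)) / \
         sum(c * c for c in z_computed)

tide_height = 0.4   # metres, assumed
z_final = [tide_height + coef_z * c for c in z_computed]  # Zfinal = TideHeight + CoefZ * Zcomputed
```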

4SM: where's the catch?

Three approaches are compared:
  • Semi-analytical methods: EOMAP, SAMBUCA, ALUT, DigitalGlobe's, HOPE, ...
  • 4SM
  • Other empirical methods, like Polcyn's, Lyzenga's and Stumpf's
The assumption required
  • Semi-analytical methods: inverting the model is an optimization process, which needs some form of assumption as regards the water-column-corrected spectral reflectance signature.
  • 4SM: inverting the model is an optimization process, which needs some form of assumption as regards the water-column-corrected spectral radiance signature.
  • Empirical methods: coefficients A, B and C in Polcyn's magic formula, or coefficients m0 and m1 in Stumpf's magic formula, are a form of assumption as regards the water-column-corrected spectral reflectance/radiance signature. Just set Z=0 and rewrite to get the equation of a straight line.
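The straight-line behaviour can be made concrete with Lyzenga's classic linearization: taking X = ln(Ls-Lsw) turns the exponential model into straight lines, whose slope between two bands is the ratio Ki/Kj, independent of depth over a fixed bottom. A sketch with assumed values (all radiances and K values are invented for illustration):

```python
import math

# Lyzenga-style linearization of the simplified RTE; values are assumptions
Ki, Kj = 0.12, 0.30
LsBi, LsBj = 120.0, 140.0
Lswi, Lswj = 12.0, 8.0

def X(LsB, Lsw, K, Z):
    # X = ln(Ls - Lsw) = ln(LsB - Lsw) - K * Z : linear in Z
    return math.log(LsB - Lsw) - K * Z

# Over a fixed bottom, the slope of Xi versus Xj is Ki/Kj, whatever the depths
Z1, Z2 = 2.0, 9.0
slope = (X(LsBi, Lswi, Ki, Z1) - X(LsBi, Lswi, Ki, Z2)) / \
        (X(LsBj, Lswj, Kj, Z1) - X(LsBj, Lswj, Kj, Z2))
assert abs(slope - Ki / Kj) < 1e-12
```

This depth-invariant slope is the Ki/Kj ratio that 4SM reads off the image, "Lyzenga's way".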
The inversion
  • Semi-analytical methods iterate the inversion of the semi-analytical RTE, by increasing Z in a LUT, until an occurrence is found that yields the closest spectral match with the observed signature at the current pixel.
  • 4SM iterates the inversion of the simplified RTE, by increasing Z, until the water-column-corrected spectral reflectance for the current pixel is deemed to match satisfactorily some form of the spectral Soil Line where Z is null.
  • Empirical methods: no iteration is needed to compute Z.
The inputs
  • Semi-analytical methods need:
    • spectral values of 2K, the two-way diffuse attenuation coefficient, at the specified wavelengths of the bandset
    • a database of spectral signatures for all endmember bottom types that may possibly exist at the site
    • the a, b, c, ... coefficients of the mixture of endmembers to be unmixed at the current pixel
  • 4SM needs:
    • the spectral ratios Ki/Kj observed in the image, Lyzenga's way, for all pairs of bands i, j, k, ...
    • a seed value derived from Jerlov's data, so that spectral values of 2K, the diffuse attenuation coefficient, are estimated at all wavelengths of the bandset
    • a spectral Soil Line model derived from the observed spectral signature of bare land in the image at null depth
  • Empirical methods need:
    • a dataset of depth points over the whole shallow depth range, representing all major shallow bottom types that exist at the site
    • the least we can say is that such a dataset is difficult and costly to collect, and tricky to reduce
    • the actual dataset is quite often limited to a few depth soundings featured on some outdated nautical or bathymetric chart
How the assumption is resolved
  • Semi-analytical methods: because these inputs are unknown, and because their database is a discrete collection of pure endmembers at null depth, they choose the particular quantitative mixture of all possible endmembers, with all possible 2K values, which yields the spectral signature that best fits the current pixel. This they call "spectral matching" of the observed signature against zillions of LUT occurrences.
  • 4SM: because these inputs are all derived from the image itself and from Jerlov's data, through 4SM's own calibration process, the inversion then is a simple matter of increasing Z as explained above.
  • Empirical methods: if sandy bottoms prevail in the calibration dataset, then depths estimated over vegetated/dark bottoms shall most probably be underestimated, possibly very badly (a common complaint!). Conversely, if dark bottoms prevail, then depths estimated over bright bottoms shall most probably be overestimated, possibly very badly.

Speed
  • Semi-analytical methods involve multidimensional matrices and entail horrific computing times. Their process, though, also accounts for, and maps, spatial variations of water optical quality on the fly: this would appear to be a distinct advantage, but it comes at a very high computational cost!
  • 4SM is very fast, and results in very attractive performances.
  • Empirical methods are very fast.

Limitations
  • 4SM: because using Lyzenga's trick in 4SM pertains to the brightest bottoms in the clearest waters of the scene, 4SM cannot properly account for areas where waters are less clear: such areas shall show badly in the true-color composite display of water-column-corrected bottom signatures. This provides a means for the practitioner to either devise specific conditions for the processing of the areas affected, or flag them as artifacts: see this artifact on Ikonos at Dubai.
  • Empirical methods: like in 4SM, water optical properties are assumed to be constant. These methods do not "unmix" the influence of variable depth from that of variable bottom signature, as computed depth is their only output. Nobody worries about bad or fancy results (garbage in ==> garbage out), until these methods are ultimately shelved and people start investigating semi-analytical methods.