---
title: "DroneWQ: A Python package for processing MicaSense multispectral drone imagery for aquatic remote sensing"
tags:
  - Python
  - UAS
  - drone
  - remote sensing
  - water quality
authors:
  - name: Anna E. Windle
    orcid: 0000-0002-4852-5848
    affiliation: "1, 2"
    equal-contrib: true
    corresponding: true
  - name: Patrick C. Gray
    orcid: 0000-0002-8997-5255
    affiliation: "3, 4"
    equal-contrib: true
  - name: Alejandro Román
    orcid: 0000-0002-8868-9302
    affiliation: 5
  - name: Sergio Heredia
    orcid: 0009-0003-9495-9625
    affiliation: 5
  - name: Gabriel Navarro
    orcid: 0000-0002-8919-0060
    affiliation: 5
  - name: Greg M. Silsbe
    orcid: 0000-0003-2673-1162
    affiliation: 6
affiliations:
  - name: NASA Goddard Space Flight Center, Greenbelt, MD, United States
    index: 1
  - name: Science Systems and Applications, Inc., Lanham, MD, United States
    index: 2
  - name: School of Marine Sciences, University of Maine, Orono, ME, United States
    index: 3
  - name: Department of Marine Geosciences, Charney School of Marine Sciences, University of Haifa, Haifa, Israel
    index: 4
  - name: Department of Ecology and Coastal Management, Institute of Marine Sciences of Andalusia (ICMAN-CSIC), Spanish National Research Council (CSIC), 11519 Puerto Real, Spain
    index: 5
  - name: Horn Point Laboratory, University of Maryland Center for Environmental Science, Cambridge, MD, United States
    index: 6
date: 13 November 2024
bibliography: paper.bib
---

# Summary

Small aerial drones, or unoccupied aerial systems (UAS), conveniently achieve scales of observation between satellite resolutions and in situ sampling, effectively diminishing the "blind spot" between these established measurement techniques [@gray_larsen_johnston_2022]. UAS equipped with off-the-shelf multispectral sensors originally designed for terrestrial applications are increasingly being used to derive water quality properties. Multispectral UAS imagery requires post-processing to radiometrically calibrate raw pixel values to useful radiometric units such as reflectance. Aquatic applications require additional steps to remove surface reflected light and sun glint, and offer different approaches to estimate water quality parameters. Georeferencing and mapping UAS imagery over water also comes with challenges, since typical structure-from-motion photogrammetry techniques fail due to a lack of feature matching. DroneWQ can 1) convert raw multispectral imagery to total radiance (Lt) in units of W m^-2 nm^-1 sr^-1, 2) remove surface reflected light (Lsr) to calculate water-leaving radiance (Lw), 3) measure downwelling irradiance (Ed) from the calibrated reflectance panel, the downwelling light sensor (DLS), or a combination of the two, 4) calculate remote sensing reflectance (Rrs) by dividing Lw by Ed, 5) mask pixels containing specular sun glint or instances of vegetation, shadowing, etc., 6) use Rrs as input to various bio-optical algorithms to derive chlorophyll a and total suspended sediment concentrations, and 7) georeference imagery using image metadata and sensor specifications to orient and map it to a known coordinate system.
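The core radiometric chain in this workflow can be sketched numerically. The snippet below is an illustrative walk-through on synthetic per-band values, not DroneWQ's actual API; the radiance, irradiance, and masking-threshold numbers are made up for demonstration.

```python
import numpy as np

# Synthetic per-band values (blue, green, red, NIR); not real measurements.
lt = np.array([0.0120, 0.0150, 0.0200, 0.0080])  # total radiance Lt (W m^-2 nm^-1 sr^-1)
lsr = 0.004                                      # estimated surface-reflected radiance Lsr
ed = np.array([1.10, 1.25, 1.30, 0.95])          # downwelling irradiance Ed (W m^-2 nm^-1)

lw = lt - lsr             # step 2: water-leaving radiance Lw = Lt - Lsr
rrs = lw / ed             # step 4: remote sensing reflectance Rrs = Lw / Ed (sr^-1)
glint_mask = rrs > 0.02   # step 5: flag implausibly bright (glint-contaminated) pixels
```

Steps 6 and 7 (bio-optical algorithms and georeferencing) then operate on the masked Rrs arrays.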

# Statement of need

DroneWQ is a Python package for processing multispectral UAS imagery to obtain remote sensing reflectance (Rrs), the fundamental input to ocean color algorithms used to estimate and map water quality parameters. The processing steps, calibrations, and corrections necessary to obtain research-quality Rrs data from UAS can be prohibitively difficult for those who do not specialize in optics and remote sensing, yet these data can reveal entirely new insight into aquatic ecosystems. DroneWQ was designed as a simple pipeline for those who wish to use UAS multispectral remote sensing to analyze ocean color and water quality. Its straightforward functionality will enable effective water quality monitoring at fine spatial resolutions and exciting scientific exploration of UAS remote sensing by students, scientists, and water quality managers.

# Background/Theory

UAS can measure remote sensing reflectance (Rrs) defined as:

$$R_{rs}(\theta, \phi, \lambda) = \frac{L_W(\theta, \phi, \lambda)}{E_d(\lambda)} \qquad \text{(Eq. 1)}$$

where $L_W$ (W m$^{-2}$ nm$^{-1}$ sr$^{-1}$) is water-leaving radiance, $E_d$ (W m$^{-2}$ nm$^{-1}$) is downwelling irradiance, $\theta$ represents the sensor viewing angle between the sun and the vertical (zenith), $\phi$ represents the angular direction relative to the sun (azimuth), and $\lambda$ represents wavelength.

UAS do not measure Rrs directly, as the at-sensor total radiance ($L_T$, W m$^{-2}$ nm$^{-1}$ sr$^{-1}$) constitutes the sum of $L_W$ and incident radiance reflected off the sea surface into the detector's field of view, referred to as surface reflected radiance ($L_{SR}$). $L_W$ is the radiance that emanates from the water, with a spectral shape and magnitude governed by optically active water constituents, while $L_{SR}$ is independent of water constituents and instead governed by the water surface reflecting downwelling light; a familiar example is sun glint. Here we define UAS total reflectance ($R_{UAS}$) as:

$$R_{UAS}(\theta, \phi, \lambda) = \frac{L_T(\theta, \phi, \lambda)}{E_d(\lambda)} \qquad \text{(Eq. 2)}$$

where

$$L_T(\theta, \phi, \lambda) = L_W(\theta, \phi, \lambda) + L_{SR}(\theta, \phi, \lambda) \qquad \text{(Eq. 3)}$$

Due to the differing orientation of wave facets reflecting radiance from different parts of the sky, $L_{SR}$ can vary widely within a single UAS image. `DroneWQ` provides multiple options from the literature for removing $L_{SR}$.
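One common option from the literature is the dark-pixel assumption: over optically deep, clear water, water-leaving radiance in the NIR is taken to be approximately zero, so the NIR signal is attributed entirely to surface reflected radiance and subtracted from every band. A minimal numpy sketch, on synthetic values rather than DroneWQ's API, assuming a spectrally flat Lsr:

```python
import numpy as np

# lt: total radiance, shape (bands, pixels); band order blue, green, red, NIR.
lt = np.array([
    [0.0150, 0.0180],
    [0.0170, 0.0200],
    [0.0120, 0.0140],
    [0.0060, 0.0065],
])
ed = np.array([1.10, 1.25, 1.30, 0.95])[:, None]  # Ed per band (W m^-2 nm^-1)

# Dark-pixel assumption: Lw(NIR) ~ 0, so Lt(NIR) ~ Lsr, assumed flat across bands.
lsr = lt[3]       # per-pixel estimate of surface-reflected radiance
lw = lt - lsr     # Eq. 3 rearranged: Lw = Lt - Lsr
rrs = lw / ed     # Eq. 1: Rrs = Lw / Ed
```

By construction, the corrected NIR reflectance is zero, which is the assumption's built-in check.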

Figure 1. Example of an individual UAS image (green band) at different processing steps and methods: (A) $R_{UAS}$, (B) $R_{UAS}$ with initial sun glint masking, and (C–F) remote sensing reflectance (Rrs) using various methods to remove surface reflected light: (C) ρ look-up table (LUT) from HydroLight simulations, (D) dark pixel assumption with NIR = 0, (E) dark pixel assumption with NIR > 0, (F) deglinting methods following [@hedley_harborne_mumby_2005]. Figure taken from [@windle_silsbe_2021].\label{fig:removal_Lsr_fig}
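The deglinting approach of [@hedley_harborne_mumby_2005] instead regresses each visible band against the NIR band over a glint-affected sample, then subtracts the NIR excess scaled by that slope. A sketch on synthetic pixels (the reflectance magnitudes and the 0.9 glint coupling are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
glint = rng.uniform(0.0, 0.02, n)                        # per-pixel glint contribution
r_nir = 0.001 + glint                                    # NIR over dark water: mostly glint
r_green = 0.012 + 0.9 * glint + rng.normal(0, 1e-4, n)   # green band sharing the glint signal

# Hedley et al. (2005): slope b of band vs. NIR over a glinted sample,
# then per-pixel correction R' = R - b * (R_NIR - min(R_NIR)).
b = np.polyfit(r_nir, r_green, 1)[0]
r_green_deglint = r_green - b * (r_nir - r_nir.min())
```

After correction the green band's glint-driven variance collapses, leaving the (here constant) water signal plus sensor noise.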

A secondary challenge in aquatic UAS remote sensing is georeferencing and mosaicking imagery. Current photogrammetry techniques (e.g., Structure from Motion (SfM)) cannot stitch UAS images captured over large bodies of water due to a lack of key points in images of homogeneous water surfaces. DroneWQ instead uses sensor pose information to project and mosaic imagery.
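Without SfM tie points, each image footprint can be projected directly from the sensor pose and geometry. The simplified nadir-looking pinhole sketch below shows the idea; the focal length, sensor dimensions, and pixel counts are illustrative stand-ins, not exact MicaSense specifications.

```python
# Pinhole-camera footprint for a nadir image over flat water.
# All numbers are illustrative, not exact sensor specifications.
altitude_m = 100.0                   # flying height above the water surface
focal_mm = 5.4                       # lens focal length
sensor_w_mm, sensor_h_mm = 4.8, 3.6  # physical sensor dimensions
img_w_px, img_h_px = 1280, 960       # image dimensions in pixels

# Similar triangles: ground extent = altitude * sensor extent / focal length.
ground_w = altitude_m * sensor_w_mm / focal_mm  # footprint width (m)
ground_h = altitude_m * sensor_h_mm / focal_mm  # footprint height (m)
gsd = ground_w / img_w_px                       # ground sample distance (m/pixel)
```

Combining this footprint with the GPS position and yaw recorded in the image metadata places each pixel in a known coordinate system, which is what enables mosaicking without feature matching.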

Figure 2. Final orthomosaic of UAS images collected over Western Lake Erie processed to chlorophyll a concentration.\label{fig:chl_mosaic}

# Publications utilizing DroneWQ

Román, A., Heredia, S., Windle, A. E., Tovar-Sánchez, A., & Navarro, G. (2024). Enhancing Georeferencing and Mosaicking Techniques over Water Surfaces with High-Resolution Unmanned Aerial Vehicle (UAV) Imagery. Remote Sensing, 16(2), 290.

Gray, P. C., Windle, A. E., Dale, J., Savelyev, I. B., Johnson, Z. I., Silsbe, G. M., ... & Johnston, D. W. (2022). Robust ocean color from drones: Viewing geometry, sky reflection removal, uncertainty analysis, and a survey of the Gulf Stream front. Limnology and Oceanography: Methods, 20(10), 656-673.

Windle, A. E., & Silsbe, G. M. (2021). Evaluation of unoccupied aircraft system (UAS) remote sensing reflectance retrievals for water quality monitoring in coastal waters. Frontiers in Environmental Science, 9, 674247.

# Acknowledgements

We acknowledge and appreciate helpful support from the MicaSense team. We thank Julian Dale for assisting with UAS flights.

# Contributions

Contributions are welcome, and they are greatly appreciated! Every little bit helps, and credit will always be given.

Report bugs, request features, or submit feedback as a GitHub Issue. Make fixes, add content, or suggest improvements via GitHub Pull Requests.

# References