Preliminary Result of Drone UAV Derived Multispectral Bathymetry in Coral Reef Ecosystem: A Case Study of Pemuteran Beach

Masita Dwi Mandini Manessa (1), Dadang Handoko (2), Fajar Dwi Pamungkas (3), Riza Putera Syamsuddin (4), Dwi Sutarko (5), Agus Sukma Yogiswara (6), Mutia Kamalia Mukhtar (7), Supriatna Supriatna (8)
(1) Geography Department, University of Indonesia, Depok, 10430, Indonesia
(2) School High Tech Navy, Jakarta, 14430, Indonesia
(3) Geography Department, University of Indonesia, Depok, 10430, Indonesia
(4) Geography Department, University of Indonesia, Depok, 10430, Indonesia
(5) School High Tech Navy, Jakarta, 14430, Indonesia
(6) Center for Remote Sensing and Ocean Science, Udayana University, Denpasar, 80361, Indonesia
(7) Geography Department, University of Indonesia, Depok, 10430, Indonesia
(8) Geography Department, University of Indonesia, Depok, 10430, Indonesia
How to cite (IJASEIT):
Manessa, Masita Dwi Mandini, et al. “Preliminary Result of Drone UAV Derived Multispectral Bathymetry in Coral Reef Ecosystem: A Case Study of Pemuteran Beach”. International Journal on Advanced Science, Engineering and Information Technology, vol. 12, no. 4, July 2022, pp. 1512-6, doi:10.18517/ijaseit.12.4.16107.
UAV-derived multispectral bathymetry is an alternative way to create shallow-water bathymetry maps without a massive field survey. Multispectral UAV technology is suited to detailed-scale mapping because it offers higher spatial resolution at a relatively affordable cost. The UAV used in this study records the coastal area with a four-band multispectral sensor covering the blue, green, red, and near-infrared bands. The UAV images are processed into point cloud information using a Structure from Motion (SfM)-based algorithm at a spatial resolution of 0.075 m. The point cloud information is then used to predict water depth with the random forest algorithm. The research was conducted at Pemuteran Beach, Bali, Indonesia. We compared the performance of spectral-only, point-cloud-only, and combined point cloud-spectral information for predicting water depth. The combined point cloud-spectral approach shows a significant accuracy improvement over the spectral-only approach, reaching ~1.5, ~2.5 m, and ~0.3 for R2, RMSE, and MAPE, respectively. Thus, the SfM UAV technique can improve the common spectral-based SDB method.
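
Although the paper does not include code, the workflow described in the abstract (sample the four spectral bands and the SfM point-cloud elevation at field-surveyed depth points, train a random forest regressor on each feature set, and compare R2, RMSE, and MAPE) can be sketched as below. This is a minimal illustration under assumptions, not the authors' implementation: the input file, column names, 70/30 split, and use of scikit-learn's RandomForestRegressor are all hypothetical.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_percentage_error

# Hypothetical input: one row per field-surveyed depth point, with the four
# band reflectances sampled from the UAV orthomosaic and the SfM point-cloud
# elevation at that location. File name and column names are assumptions.
df = pd.read_csv("pemuteran_samples.csv")

feature_sets = {
    "spectral only": ["blue", "green", "red", "nir"],
    "point cloud only": ["sfm_z"],
    "point cloud + spectral": ["blue", "green", "red", "nir", "sfm_z"],
}

for name, cols in feature_sets.items():
    # Hold out 30% of the depth samples for validation (split ratio assumed).
    X_train, X_test, y_train, y_test = train_test_split(
        df[cols], df["depth"], test_size=0.3, random_state=42
    )
    rf = RandomForestRegressor(n_estimators=500, random_state=42)
    rf.fit(X_train, y_train)
    pred = rf.predict(X_test)

    # Accuracy metrics reported in the abstract: R2, RMSE (m), and MAPE.
    r2 = r2_score(y_test, pred)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    mape = mean_absolute_percentage_error(y_test, pred)
    print(f"{name:>24s}  R2={r2:.2f}  RMSE={rmse:.2f} m  MAPE={mape:.2f}")

Using the same random seed and split for all three feature sets keeps the accuracy differences attributable to the predictors rather than to the sampling of validation points.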
