Predicting seafloor visual classes from multimodal remote sensed priors using location-guided self-supervised learning

Liang, Cailei; Cappelletto, Jose; Bodenmann, Adrian; Turnock, Stephen; Thornton, Blair; Huvenne, Veerle A. I. (ORCID: https://orcid.org/0000-0001-7135-6360); Wardell, Catherine. 2025. Predicting seafloor visual classes from multimodal remote sensed priors using location-guided self-supervised learning. In: 2024 IEEE/OES Autonomous Underwater Vehicles Symposium (AUV), Boston, MA, USA, 18-20 September 2024. IEEE, 1-6.

Abstract
Remote sensed mapping data and seafloor in-situ imagery are often gathered to infer benthic habitat distributions. However, leveraging multimodal data is challenging because of inherent inconsistencies between measurement modes (e.g., resolution, positional offsets, shape discrepancies). We investigate the impact of using location metadata in multimodal, self-supervised feature learning on habitat classification. Experiments were carried out on a multimodal dataset gathered using an Autonomous Underwater Vehicle (AUV) at the Darwin Mounds Marine Protected Area (MPA). Introducing location metadata improved the F1 classification performance of a Bayesian classifier by an average of 27.7% over all conditions tested in this work, with a larger improvement of 32.9% achieved when multiple remote sensing data modes are combined for the analysis.
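
As a concrete illustration of the general idea (not the authors' implementation), the sketch below shows one way location metadata can guide self-supervised feature learning: georeferenced remote-sensing patches that lie within a chosen distance of each other are treated as positive pairs in a contrastive loss, so spatially close observations are pushed toward similar embeddings. The encoder architecture, the distance threshold `dist_thresh_m`, and the toy data shapes are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of location-guided self-supervised contrastive learning.
# Hypothetical names and parameters throughout; a sketch under stated
# assumptions, not the method as published.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultimodalEncoder(nn.Module):
    """Projects concatenated remote-sensing features (e.g. bathymetry,
    backscatter statistics) from one geolocated patch into an embedding."""
    def __init__(self, in_dim: int, emb_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings

def location_guided_nce(emb, xy, dist_thresh_m=10.0, temperature=0.1):
    """Contrastive (InfoNCE-style) loss in which positives are pairs of
    patches whose locations (metres) lie within dist_thresh_m."""
    n = len(xy)
    eye = torch.eye(n, dtype=torch.bool)
    sim = emb @ emb.T / temperature              # pairwise similarities
    dists = torch.cdist(xy, xy)                  # pairwise spatial distances
    pos = (dists < dist_thresh_m) & ~eye         # location-defined positives
    logits = sim.masked_fill(eye, -1e9)          # exclude self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    has_pos = pos.any(dim=1)                     # skip anchors with no neighbour
    loss = -(log_prob[has_pos] * pos[has_pos]).sum(dim=1) / pos[has_pos].sum(dim=1)
    return loss.mean()

# Toy usage: 128 patches, 16 multimodal features each, random locations.
feats = torch.randn(128, 16)
xy = torch.rand(128, 2) * 100.0                  # locations in metres
enc = MultimodalEncoder(in_dim=16)
opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
for _ in range(10):
    opt.zero_grad()
    loss = location_guided_nce(enc(feats), xy)
    loss.backward()
    opt.step()
```

In a pipeline of the kind described in the abstract, the learned embeddings would then serve as input features to a Bayesian classifier (for example, a Gaussian naive Bayes model) trained on the labelled seafloor imagery.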
Documents
Full text not available from this repository.
Information
Programmes:
NOC Programmes > Ocean BioGeosciences
