Experiments with Visual Odometry for Hydrobatic Autonomous Underwater Vehicles

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Abstract: Hydrobatic Autonomous Underwater Vehicles (AUVs) are underactuated robots that can perform agile maneuvers in challenging underwater environments while remaining efficient in speed and range. Localizing and navigating these AUVs, particularly for manipulation tasks, is difficult because common sensors such as GPS are unreliable underwater. Visual Odometry (VO) is a viable technique for addressing this challenge: it estimates the position and orientation of a robot by tracking how the images from one or more onboard cameras change as the cameras move. VO is a promising solution for underwater localization because it provides egomotion information from the visual cues available to the robot. This research explores the applicability of VO algorithms to hydrobatic AUVs using a simulated underwater dataset obtained in Stonefish, an advanced open-source simulation tool developed specifically for marine robotics. Because very little research is available on learning-based VO in underwater environments, this work focuses on the feasibility of two state-of-the-art feature-based VO frameworks, ORB-SLAM2 and VISO2. The assessment is performed on a baseline underwater dataset captured by the cameras of a hydrobatic AUV in a simulated algae farm in Stonefish, one of the target applications of hydrobatic AUVs. A novel software architecture is also proposed for hydrobatic AUVs that integrates VO with other components as a node stack to ensure robust localization. The study further suggests enhancements, including camera calibration and timestamp synchronization, as future steps to improve VO accuracy and functionality. ORB-SLAM2 performs well in the baseline scenario but exhibits slight drift when turbidity is introduced in the simulated underwater environment. VISO2 is recommended for such high-turbidity scenarios, but because it is highly sensitive to accurate camera calibration and synchronized timestamps, it fails to estimate camera motion accurately given the hardware synchronization issues present in the dataset. Despite these limitations, the results show strong potential for both ORB-SLAM2 and VISO2 as feature-based VO methods for future deployment on hydrobatic AUVs: ORB-SLAM2 is preferred for overall localization and mapping in low-turbidity environments where drift is less of a concern, while VISO2 is preferred for high-turbidity environments provided camera calibration and synchronization are highly accurate.
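
As a brief illustration of the feature-based VO principle described in the abstract, the sketch below estimates the relative camera motion between two consecutive frames using ORB features and essential-matrix decomposition in OpenCV (Python). This is not the thesis's ORB-SLAM2 or VISO2 pipeline, only a minimal single-step example; the camera intrinsics K and the image file names are hypothetical placeholders.

    # Minimal sketch of one feature-based monocular VO step (illustrative only).
    # Assumptions: known pinhole intrinsics K, two consecutive grayscale frames.
    import numpy as np
    import cv2

    def estimate_relative_pose(prev_gray, curr_gray, K):
        """Estimate relative rotation R and unit-scale translation t between
        two consecutive grayscale frames using ORB feature matching."""
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)

        # Brute-force Hamming matching suits binary ORB descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # RANSAC essential-matrix estimation rejects outlier correspondences,
        # e.g. spurious matches caused by turbidity or low texture.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                       prob=0.999, threshold=1.0)
        # Monocular VO recovers translation only up to an unknown scale.
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t

    if __name__ == "__main__":
        # Hypothetical intrinsics and frame paths for demonstration only.
        K = np.array([[700.0,   0.0, 320.0],
                      [  0.0, 700.0, 240.0],
                      [  0.0,   0.0,   1.0]])
        prev_gray = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
        curr_gray = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
        R, t = estimate_relative_pose(prev_gray, curr_gray, K)
        print("Relative rotation:\n", R, "\nUnit-scale translation:\n", t)

Chaining such frame-to-frame estimates yields an odometry trajectory; this also makes clear why the abstract stresses accurate camera calibration (K) and synchronized timestamps, since errors in either directly corrupt the recovered motion.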
