Probabilistic Multi-Modal Data Fusion and Precision Coordination for Autonomous Mobile Systems Navigation: A Predictive and Collaborative Approach to Visual-Inertial Odometry in Distributed Sensor Networks using Edge Nodes

University essay from KTH, School of Electrical Engineering and Computer Science (EECS)

Abstract: This research proposes a novel approach for improving autonomous mobile system navigation in dynamic and potentially occluded environments. It introduces a tracking framework that combines data from stationary sensing units and on-board sensors, addressing challenges of computational efficiency, reliability, and scalability. The work integrates spatially distributed LiDAR and RGB-D camera sensors, with the optional inclusion of on-board IMU-based dead reckoning, to form a robust and efficient coordination framework for autonomous systems. Two key developments are achieved. First, a point cloud object detection technique, "Generalized L-Shape Fitting", is advanced, improving bounding box fitting over point cloud data. Second, a new estimation framework, the Distributed Edge Node Switching Filter (DENS-F), is established. The DENS-F optimizes resource utilization and coordination while minimizing reliance on on-board computation. It also incorporates a short-term predictive capability, enabled by the Adaptive-Constant Acceleration motion model, which utilizes behaviour-based control inputs. The findings indicate that the DENS-F substantially improves accuracy and computational efficiency compared to the Kalman Consensus Filter (KCF), particularly when additional inertial data is provided by the vehicle. The type of sensor deployed and the consistency of the vehicle's path also significantly influence the system's performance. The research opens new perspectives for enhancing autonomous vehicle tracking, highlighting opportunities for future work on prediction models, sensor selection, and precision coordination.
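The abstract names two technical contributions without detailing them. As a point of reference only, the first sketch below implements the standard search-based L-shape fitting commonly used to fit oriented bounding boxes to 2-D LiDAR clusters (search candidate headings, score the projections, keep the best rectangle); the thesis's "Generalized L-Shape Fitting" presumably refines this baseline, and the function names here (fit_l_shape, closeness_criterion) are illustrative, not taken from the thesis.

```python
import numpy as np

def closeness_criterion(c1, c2, eps=1e-3):
    """Score how tightly projected points hug the two candidate rectangle edges."""
    d1 = np.minimum(c1.max() - c1, c1 - c1.min())  # distance to nearer edge along axis 1
    d2 = np.minimum(c2.max() - c2, c2 - c2.min())  # distance to nearer edge along axis 2
    d = np.maximum(np.minimum(d1, d2), eps)
    return np.sum(1.0 / d)

def fit_l_shape(points, angle_step=np.deg2rad(1.0)):
    """Search headings in [0, 90 deg) and return the best-fitting oriented box.

    points: (N, 2) array of x, y coordinates of one LiDAR cluster.
    Returns (theta, corners) with corners as a (4, 2) array.
    """
    best_theta, best_score = 0.0, -np.inf
    for theta in np.arange(0.0, np.pi / 2.0, angle_step):
        e1 = np.array([np.cos(theta), np.sin(theta)])
        e2 = np.array([-np.sin(theta), np.cos(theta)])
        c1, c2 = points @ e1, points @ e2          # projections onto the rotated axes
        score = closeness_criterion(c1, c2)
        if score > best_score:
            best_score, best_theta = score, theta

    # Recover the rectangle corners at the best heading.
    e1 = np.array([np.cos(best_theta), np.sin(best_theta)])
    e2 = np.array([-np.sin(best_theta), np.cos(best_theta)])
    c1, c2 = points @ e1, points @ e2
    corners = np.array([
        a * e1 + b * e2
        for a in (c1.min(), c1.max())
        for b in (c2.min(), c2.max())
    ])
    return best_theta, corners
```

Likewise, the Adaptive-Constant Acceleration motion model and its behaviour-based control inputs are only named in the abstract. The second sketch shows the plain constant-acceleration Kalman prediction step on which such a model is typically built (per-axis state of position, velocity, acceleration); the adaptation and control-input terms specific to the DENS-F are not reproduced here, and all names and noise values are assumptions for illustration.

```python
import numpy as np

def ca_predict(x, P, dt, accel_noise_std=0.5):
    """One constant-acceleration Kalman prediction step for a single axis.

    x: state vector [position, velocity, acceleration]
    P: 3x3 state covariance
    dt: prediction horizon in seconds
    """
    F = np.array([
        [1.0, dt, 0.5 * dt**2],   # p' = p + v*dt + 0.5*a*dt^2
        [0.0, 1.0, dt],           # v' = v + a*dt
        [0.0, 0.0, 1.0],          # a' = a (constant acceleration)
    ])
    # Discrete white-noise acceleration process noise.
    g = np.array([[0.5 * dt**2], [dt], [1.0]])
    Q = (accel_noise_std**2) * (g @ g.T)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

# Example: predict 0.1 s ahead from position 2 m, velocity 1 m/s, acceleration 0.2 m/s^2.
x0 = np.array([2.0, 1.0, 0.2])
P0 = np.eye(3) * 0.1
x1, P1 = ca_predict(x0, P0, dt=0.1)
```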
