Deep Active Learning for 3D Object Detection for Autonomous Driving

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Author: Xiao Wei; [2019]


Abstract: 3D object detection is vital for autonomous driving. However, training a 3D detector often requires a huge amount of labeled data, which is extremely expensive and tedious to obtain. To reduce the annotation effort while maintaining detection performance, we adopt an active learning framework to train a 3D object detector with the least amount of labeled data. In contrast to conventional passive learning, where a machine learning model is trained on a pre-determined training dataset, active learning allows the model to actively select the most informative samples for labeling and add them to the training set; only a fraction of the data therefore needs to be labeled. To the best of our knowledge, this thesis is the first to study active learning for 3D detection. We take progressive steps towards this goal in three stages with increasingly complex models and learning tasks. First, we start with active learning for image classification, which can be viewed as a sub-problem of object detection. Second, we investigate and build a multi-task active learning framework with a deep refinement network for multi-modal 3D object detection. Lastly, we further analyze multi-task active learning with a more complicated two-stage 3D LiDAR vehicle detector. In our experiments, we study the fundamental aspects of an active learning framework, with an emphasis on evaluating several popular data selection strategies based on prediction uncertainty. Without bells and whistles, we propose an active learning framework for 3D object detection, using 3D LiDAR point clouds and accurate 2D image proposals, that saves up to 60% of labeled data on a public dataset. Finally, we discuss underlying challenges of this topic from both academic and industrial perspectives.
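The abstract refers to data selection strategies based on prediction uncertainty. As a minimal illustration of the idea (not the thesis's actual implementation), one common strategy scores each unlabeled sample by the entropy of the model's predicted class distribution and queries the highest-scoring samples for labeling. The function names and the toy softmax outputs below are hypothetical:

```python
import numpy as np

def predictive_entropy(probs):
    """Entropy of each row of softmax outputs; probs has shape (n_samples, n_classes)."""
    eps = 1e-12  # avoid log(0)
    return -np.sum(probs * np.log(probs + eps), axis=1)

def select_most_uncertain(probs, budget):
    """Return indices of the `budget` samples the model is most uncertain about."""
    scores = predictive_entropy(probs)
    return np.argsort(-scores)[:budget]  # highest entropy first

# Toy unlabeled pool: softmax outputs of three samples over two classes.
pool = np.array([
    [0.50, 0.50],   # maximally uncertain -> most informative
    [0.90, 0.10],
    [0.99, 0.01],   # confident -> least informative
])
picked = select_most_uncertain(pool, budget=1)
```

In a full active learning loop, the selected samples would be sent to an annotator, added to the labeled set, and the detector retrained before the next selection round.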
