Fast Vision-based 3D Indoor Localization and Tracking for Commodity Hardware
In real-time augmented reality applications, it is important to find and track the 3D position and orientation of a moving camera with high accuracy. However, computing a camera's 3D trajectory at high frequency on commodity hardware is challenging because the computation involves a heavyweight 2D-3D matching process. In this paper, we propose a fast vision-based 3D indoor localization and tracking system that is suitable for commodity hardware. We reduce the per-frame 3D tracking computation overhead by a factor of seven by splitting the camera viewpoint into multiple smaller regions and processing only one region per frame. When the absolute 3D location or a sufficient number of keypoints is needed prior to 3D tracking, we rely on the cloud for fast 3D positioning. Through prototype experiments, we demonstrate that our system can perform 3D tracking with sub-meter accuracy at 20-30 fps in a large-scale indoor hallway environment.
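The region-splitting idea described above can be sketched as follows. This is an illustrative example, not the paper's implementation: the region grid dimensions, frame size, and the round-robin scheduling policy are all assumptions made for clarity. The key point is that each frame triggers keypoint matching in only one sub-region, amortizing the expensive 2D-3D matching cost across several frames.

```python
def make_regions(width, height, cols=3, rows=2):
    """Partition a frame into a cols x rows grid of rectangular regions.

    Each region is (x, y, w, h). The grid shape here (3x2) is an
    illustrative assumption, not the paper's configuration.
    """
    rw, rh = width // cols, height // rows
    return [(c * rw, r * rh, rw, rh)
            for r in range(rows) for c in range(cols)]


class RegionScheduler:
    """Cycle through regions round-robin so that each incoming frame
    runs keypoint matching in only one region, cutting per-frame
    matching cost roughly by a factor of len(regions)."""

    def __init__(self, regions):
        self.regions = regions
        self.idx = 0

    def next_region(self):
        region = self.regions[self.idx]
        self.idx = (self.idx + 1) % len(self.regions)
        return region


# Example: a 640x480 frame split into six regions; frames 0..5 each
# process a different region, then the cycle repeats.
regions = make_regions(640, 480)
scheduler = RegionScheduler(regions)
for frame_id in range(8):
    x, y, w, h = scheduler.next_region()
    # In a real system, keypoint extraction and 2D-3D matching would
    # run only on the (x, y, w, h) crop of this frame.
```

In practice the scheduler could also prioritize regions with stale or sparse matches instead of strict round-robin; the sketch only shows the cost-amortization structure.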