Development of an Autonomous Mobile Robot with Self-Localization and Target Search in a Real Environment

Abstract
This paper describes an autonomous mobile robot developed to verify self-localization and target-search methods proposed for Tsukuba Challenge 2014. The Tsukuba Challenge course includes promenades and parks used by ordinary pedestrians and bicyclists, so the robot must move toward its goal while avoiding the moving objects around it. Common self-localization methods rely on two-dimensional (2D) laser range finders (LRFs), but a 2D LRF does not always capture enough data for localization, for example when the scanned plane contains few landmarks. To solve this problem, we used a three-dimensional (3D) LRF for self-localization. The 3D LRF captures more data than the 2D type, resulting in more robust localization. Robots that provide practical services in real life must, among other functions, recognize a target and serve it autonomously. To this end, this paper also describes a method for searching for a target by clustering the point cloud from the 3D LRF and applying image processing to color images captured by cameras. In Tsukuba Challenge 2014, the robot implementing the proposed methods completed the course and found the targets, verifying the effectiveness of our proposals.
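The abstract mentions combining point-cloud clustering from the 3D LRF with color-based image processing for target search. The following is a minimal, hypothetical sketch (not the authors' implementation) of that idea: Euclidean clustering of 3D points followed by a simple hue test on the camera pixels associated with a cluster. The radius, cluster-size, and color thresholds are illustrative assumptions.

```python
# Hypothetical sketch of the cluster-then-color-check idea; thresholds,
# parameter names, and the colour window are assumptions, not the paper's values.
import numpy as np
from scipy.spatial import cKDTree


def euclidean_clusters(points, radius=0.3, min_size=20):
    """Group 3D LRF points whose neighbours lie within `radius` metres (assumed)."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        frontier, cluster = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            # Flood-fill over neighbouring points to grow the cluster.
            for n in tree.query_ball_point(points[idx], radius):
                if n in unvisited:
                    unvisited.remove(n)
                    frontier.append(n)
                    cluster.append(n)
        if len(cluster) >= min_size:
            clusters.append(np.array(cluster))
    return clusters


def looks_like_target(pixels_hsv, hue_range=(100, 130), min_ratio=0.3):
    """Assumed colour test: enough of a cluster's camera pixels fall in a hue window."""
    hues = pixels_hsv[:, 0]
    ratio = np.mean((hues >= hue_range[0]) & (hues <= hue_range[1]))
    return ratio >= min_ratio
```

In practice the cluster's 3D points would be projected into the camera image to obtain `pixels_hsv`; that projection step depends on the sensor calibration and is omitted here.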