An Evaluation of the Fusion of Relative Positioning Sensors in the Accuracy of Land Mobile Robot's Localization Systems

Authors

  • Henrique Santos, Senai-CIMATEC, Av. Orlando Gomes, 1845, Piatã, Salvador - BA, 41650-010, Brazil.
  • Artur Kronbauer, Salvador University (Unifacs), Av. Tancredo Neves, 2131, Salvador - BA, 41820-021, Brazil; Bahia State University (UNEB), Rua Silveira Martins, 2555, Salvador - BA, 41150-000, Brazil.
  • Jorge Campos, Salvador University (Unifacs), Av. Tancredo Neves, 2131, Salvador - BA, 41820-021, Brazil; Bahia State University (UNEB), Rua Silveira Martins, 2555, Salvador - BA, 41150-000, Brazil.

Keywords:

Robotic navigation systems, Relative positioning sensors, Sensor fusion, Odometry, Robot ego-motion

Abstract

Precise localization is a fundamental challenge and one of the most important tasks for a robot's navigation system. For autonomous navigation, the robot must be aware of its pose and of the map of the environment so that it can plan the path it must follow to perform a task. Most terrestrial robots keep track of their location with an odometry system based on the movement of the wheels. Wheel odometry is highly sensitive to the type of pavement, which leads to an inaccuracy that grows over time. One way to improve the robot's positioning accuracy is to fuse data from sensors capable of measuring the robot's displacement and speed. On low-cost robotic platforms, the sensors most commonly used to improve knowledge of the robot's position are cameras and inertial sensors. This work analyzes the accuracy of positioning systems based on wheel odometry, inertial sensors, and visual odometry. To analyze the precision of the robot's movement, the real trajectories of a two-wheeled robot, obtained with different combinations of the mentioned sensors, are compared with the expected trajectory (ground truth). The results of the experiment give a good indication of the cost-benefit of using these types of sensors for the odometry of robotic platforms.
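As a rough illustration of the two building blocks mentioned in the abstract, the sketch below shows a dead-reckoning pose update for a differential-drive (two-wheeled) robot and a root-mean-square position error against a ground-truth trajectory. This is a minimal Python sketch assuming a standard differential-drive kinematic model; the function names, wheel-base value, and sample encoder increments are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def wheel_odometry_step(pose, d_left, d_right, wheel_base):
    """Propagate a pose (x, y, theta) of a differential-drive robot, given the
    distances travelled by the left and right wheels during one time step."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0        # forward displacement of the robot centre
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Midpoint approximation: apply half the rotation before translating.
    x += d_center * np.cos(theta + d_theta / 2.0)
    y += d_center * np.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + np.pi) % (2.0 * np.pi) - np.pi  # wrap to [-pi, pi)
    return (x, y, theta)

def trajectory_rmse(estimated_xy, ground_truth_xy):
    """Root-mean-square position error between two trajectories sampled at the
    same instants, each given as an (N, 2) array of x, y positions."""
    est = np.asarray(estimated_xy, dtype=float)
    gt = np.asarray(ground_truth_xy, dtype=float)
    return float(np.sqrt(np.mean(np.sum((est - gt) ** 2, axis=1))))

# Illustrative usage: integrate three encoder increments (metres) and
# compare the estimated path with a hypothetical ground-truth path.
pose = (0.0, 0.0, 0.0)
estimated = [pose[:2]]
for d_l, d_r in [(0.10, 0.10), (0.10, 0.12), (0.11, 0.09)]:
    pose = wheel_odometry_step(pose, d_l, d_r, wheel_base=0.20)
    estimated.append(pose[:2])

ground_truth = [(0.0, 0.0), (0.10, 0.0), (0.21, 0.01), (0.31, 0.02)]
print("RMSE [m]:", trajectory_rmse(estimated, ground_truth))
```

The same error metric can be applied to each sensor combination (wheel odometry only, wheel odometry plus IMU, visual odometry, etc.) to compare their drift against the ground-truth trajectory.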


Published

2022-06-04

How to Cite

Santos, H., Kronbauer, A., & Campos, J. (2022). An Evaluation of the Fusion of Relative Positioning Sensors in the Accuracy of Land Mobile Robot's Localization Systems. American Scientific Research Journal for Engineering, Technology, and Sciences, 88(1), 120–132. Retrieved from https://www.asrjetsjournal.org/index.php/American_Scientific_Journal/article/view/7622

Issue

Vol. 88 No. 1 (2022)

Section

Articles