Vision-based navigation or optical navigation uses computer vision algorithms and optical sensors, including laser-based range finders and photometric cameras using CCD arrays, to extract the visual features required for localization in the surrounding environment. There is a range of techniques for navigation and localization using vision information; the main components of each technique are a representation of the environment, a sensing model, and a localization algorithm.
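As a concrete illustration of the feature-extraction step, here is a minimal sketch using OpenCV's ORB detector: it extracts keypoints from the current camera frame and matches them against features from a reference view of a known location. The image file names and the match-distance threshold are illustrative assumptions, not part of any particular navigation system.

    import cv2

    # Extract ORB features from the current camera frame and from a
    # reference image of a known location (file names are hypothetical).
    orb = cv2.ORB_create(nfeatures=500)
    frame = cv2.imread("current_frame.png", cv2.IMREAD_GRAYSCALE)
    reference = cv2.imread("known_location.png", cv2.IMREAD_GRAYSCALE)

    kp_frame, des_frame = orb.detectAndCompute(frame, None)
    kp_ref, des_ref = orb.detectAndCompute(reference, None)

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_frame, des_ref)

    # Many close matches suggest the robot is near the known location;
    # the distance threshold of 40 is an arbitrary illustrative choice.
    good = [m for m in matches if m.distance < 40]
    print(len(good), "strong feature matches with the reference view")

A full system would feed such matches into a localization algorithm, for example a visual SLAM pipeline as surveyed in reference 2.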
To give an overview of vision-based navigation, we classify its techniques under indoor navigation and outdoor navigation.
The easiest way of making a robot go to a goal location is simply to guide it there. This guidance can be done in different ways: burying an inductive loop or magnets in the floor, painting lines on the floor, or placing beacons, markers, bar codes, etc. in the environment. Such automated guided vehicles (AGVs) are used in industrial scenarios for transportation tasks. Indoor navigation of robots is also possible with IMU-based indoor positioning devices.[3][4]
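As a sketch of the line-following guidance mentioned above, the proportional controller below steers toward a painted line based on where the line appears in a reflectance-sensor array; the sensor readings and gain are hypothetical stand-ins for a real AGV's hardware interface.

    # Proportional line-following sketch for an AGV. `readings` holds one
    # value per reflectance sensor, higher where the painted line is seen.

    def line_offset(readings):
        """Weighted centroid of the line position relative to the array
        center, in sensor units (negative = line is to the left)."""
        center = (len(readings) - 1) / 2.0
        total = sum(readings)
        if total == 0:
            return 0.0  # line lost; a real AGV would stop or search
        centroid = sum(i * r for i, r in enumerate(readings)) / total
        return centroid - center

    def steering_command(readings, gain=0.5):
        """Positive command = turn right, toward a line on the right."""
        return gain * line_offset(readings)

    # Line slightly right of center -> small right turn (about 0.47).
    print(steering_command([0.0, 0.1, 0.3, 0.9, 0.4]))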
There is a very wide variety of indoor navigation systems. A basic reference for indoor and outdoor navigation systems is "Vision for mobile robot navigation: a survey" by Guilherme N. DeSouza and Avinash C. Kak.
Also see "Vision based positioning" and AVM Navigator.
Typical open-source autonomous flight controllers have the ability to fly in full automatic mode and perform operations such as taking off, flying to one or more waypoints, orbiting around a designated point, returning to the launch position, and landing.
The onboard flight controller relies on GPS for navigation and stabilized flight, and often employs additional satellite-based augmentation systems (SBAS) and an altitude (barometric pressure) sensor.[5]
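To make GPS waypoint flight concrete, the following sketch computes the remaining distance and the initial compass bearing from the current GPS fix to the next waypoint using the standard haversine and bearing formulas; the coordinates are made-up example values rather than output from any particular flight controller.

    import math

    def haversine_distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two GPS fixes."""
        R = 6371000.0  # mean Earth radius in meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def initial_bearing_deg(lat1, lon1, lat2, lon2):
        """Initial compass bearing (0 = north, 90 = east) to the waypoint."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        y = math.sin(dl) * math.cos(p2)
        x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    here = (52.0907, 5.1214)      # hypothetical current fix
    waypoint = (52.0920, 5.1300)  # hypothetical next waypoint
    print(haversine_distance_m(*here, *waypoint))  # meters to go
    print(initial_bearing_deg(*here, *waypoint))   # heading to fly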
Some navigation systems for airborne robots are based on inertial sensors.[6]
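Inertial navigation typically means dead reckoning: integrating gyroscope and accelerometer readings over time to track pose. The planar sketch below shows the idea; the fixed sample stream stands in for a real IMU, and the drift it accumulates is the well-known limitation of the approach.

    import math

    # Planar dead-reckoning sketch: integrate yaw rate and forward
    # acceleration to track heading, speed, and position. The sample
    # list is fabricated data standing in for a real IMU stream.
    dt = 0.01  # 100 Hz sample period, in seconds
    samples = [(0.1, 0.5)] * 200  # (yaw rate rad/s, forward accel m/s^2)

    x = y = heading = speed = 0.0
    for yaw_rate, accel in samples:
        heading += yaw_rate * dt  # integrate gyro into heading
        speed += accel * dt       # integrate accelerometer into speed
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt

    # Small sensor biases integrate into unbounded position error, which
    # is why IMUs are usually fused with GPS, odometry, or vision.
    print(f"pose after 2 s: x={x:.2f} m, y={y:.2f} m, heading={heading:.2f} rad")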
Autonomous underwater vehicles can be guided by underwater acoustic positioning systems.[7] Navigation systems using sonar have also been developed.[8]
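Acoustic positioning commonly reduces to trilateration: acoustic travel times give ranges to beacons at known positions, and the vehicle's position is solved from those ranges. Below is a minimal least-squares sketch; the beacon layout and the measured ranges are made-up numbers.

    import numpy as np

    # Trilateration sketch: solve for a 2D position from ranges to
    # beacons at known positions (all values here are fabricated).
    beacons = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
    ranges = np.array([70.7, 70.7, 70.7])  # from acoustic travel times

    # Subtracting the first beacon's sphere equation from the others
    # linearizes the problem into A @ p = b for the unknown position p.
    p0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(p0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(position)  # approximately [50, 50] for these numbers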
Robots can also determine their positions using radio navigation.[9]
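One simple building block of radio positioning is estimating range from received signal strength with the log-distance path-loss model; the reference power and path-loss exponent below are illustrative values that a real deployment would calibrate, and the resulting ranges can feed the same trilateration step sketched above.

    # Log-distance path-loss model: rssi = rssi_at_1m - 10 * n * log10(d).
    # rssi_at_1m and the exponent n are illustrative, uncalibrated values.

    def rssi_to_distance_m(rssi_dbm, rssi_at_1m=-40.0, n=2.0):
        return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * n))

    # A -70 dBm reading under these assumptions implies roughly 31.6 m.
    print(rssi_to_distance_m(-70.0))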
1. Stachniss, Cyrill. Robotic Mapping and Exploration. Vol. 55. Springer, 2009. https://books.google.nl/books?dq=Robotic+Mapping+and+Exploration&ots=mY66zOcyW2&sig=jtVaI6hBrohUnRf49v714zBn180#v=onepage&q=Robotic%20Mapping%20and%20Exploration
2. Fuentes-Pacheco, Jorge; Ruiz-Ascencio, José; Rendón-Mancha, Juan Manuel. "Visual simultaneous localization and mapping: a survey." Artificial Intelligence Review 43.1 (2015): 55–81. https://www.researchgate.net/profile/Jose_Ascencio/publication/234081012_Visual_Simultaneous_Localization_and_Mapping_A_Survey/links/55383e610cf247b8587d3d58/Visual-Simultaneous-Localization-and-Mapping-A-Survey.pdf
3. Chen, C.; Chai, W.; Nasir, A. K.; Roth, H. (April 2012). "Low cost IMU based indoor mobile robot navigation with the assist of odometry and Wi-Fi using dynamic constraints." Proceedings of the 2012 IEEE/ION Position, Location and Navigation Symposium. pp. 1274–1279. doi:10.1109/PLANS.2012.6236984. ISBN 978-1-4673-0387-3. S2CID 19472012.
4. GT Silicon (2017-01-07). An awesome robot with cool navigation and real-time monitoring. Archived from the original on 2021-12-12; retrieved 2018-04-04. https://www.youtube.com/watch?v=c1bYfUYlVSo
5. "Flying | AutoQuad". http://autoquad.org/wiki/wiki/configuring-autoquad-flightcontroller/flying/
6. Siciliano, Bruno; Khatib, Oussama (20 May 2008). Springer Handbook of Robotics. Springer Science & Business Media. pp. 1020–. ISBN 978-3-540-23957-4.
7. Seto, Mae L. (9 December 2012). Marine Robot Autonomy. Springer Science & Business Media. pp. 35–. ISBN 978-1-4614-5659-9.
8. Leonard, John J.; Durrant-Whyte, Hugh F. (6 December 2012). Directed Sonar Sensing for Mobile Robot Navigation. Springer Science & Business Media. ISBN 978-1-4615-3652-9.
9. Sergiyenko, Oleg (2019). Machine Vision and Navigation. Springer Nature. pp. 172–. ISBN 978-3-030-22587-2.