Visual Navigation Datasets for Event-based Vision: 2014-2021
Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics 2021
Andrejs Zujevs, Agris Ņikitenko

Visual navigation is becoming the primary means by which unmanned vehicles such as mobile robots and drones navigate in their operational environments. A novel type of visual sensor, known as the dynamic vision sensor or event-based camera, has significant advantages over conventional digital colour or grey-scale cameras. It is an asynchronous sensor with high temporal resolution and high dynamic range, which makes it particularly promising for the visual navigation of mobile robots and drones. Due to the novelty of this sensor, publicly available datasets are scarce. In this paper, a total of nine datasets aimed at event-based visual navigation are reviewed and their most important properties and features are highlighted. The main considerations for choosing an appropriate dataset for visual navigation tasks are also discussed.


Keywords
Datasets, Event-based Vision, Neuromorphic Vision, Visual Navigation, Concise Review.

Zujevs, A., Ņikitenko, A. Visual Navigation Datasets for Event-based Vision: 2014-2021. In: Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics, United States, Online, 6-8 July 2021. Online: Institute for Systems and Technologies of Information, Control and Communication (INSTICC), 2021, pp. 507-513.

Publication language
English (en)