Visual Navigation Datasets for Event-based Vision: 2014-2021
Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics 2021
Andrejs Zujevs, Agris Ņikitenko

Visual navigation is becoming the primary means by which unmanned vehicles such as mobile robots and drones navigate in their operational environments. A novel type of visual sensor, the dynamic vision sensor or event-based camera, has significant advantages over conventional digital colour or grey-scale cameras: it operates asynchronously, with high temporal resolution and high dynamic range. It is therefore particularly promising for the visual navigation of mobile robots and drones. Because this sensor is still novel, publicly available datasets are scarce. In this paper, a total of nine datasets aimed at event-based visual navigation are reviewed and their most important properties and features are pointed out. The major aspects of choosing an appropriate dataset for visual navigation tasks are also discussed.
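To make the asynchronous nature of the sensor concrete, the sketch below shows how an event stream from such a dataset is typically represented: each event is a single (timestamp, x, y, polarity) tuple rather than a full image frame. The plain-text "timestamp x y polarity" layout and the file name events.txt are assumptions for illustration (several event-camera datasets use a layout of this kind), not a format prescribed by the paper.

    # Minimal sketch of reading an event stream, assuming a plain-text
    # "timestamp x y polarity" layout; the file name events.txt is hypothetical.
    from dataclasses import dataclass
    from typing import Iterator

    @dataclass
    class Event:
        t: float       # timestamp in seconds (microsecond resolution is typical)
        x: int         # pixel column
        y: int         # pixel row
        polarity: int  # 1 for a brightness increase, 0 (or -1) for a decrease

    def read_events(path: str) -> Iterator[Event]:
        """Yield events one by one; each line encodes a single asynchronous event."""
        with open(path) as f:
            for line in f:
                t, x, y, p = line.split()
                yield Event(float(t), int(x), int(y), int(p))

    # Usage example: collect the events from the first 10 ms of a recording.
    # early = [e for e in read_events("events.txt") if e.t < 0.01]

Because events are emitted only where and when brightness changes, the number of events per unit time varies with scene dynamics, which is one reason dataset properties such as event rate matter when choosing a dataset for visual navigation.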


Keywords
Datasets, Event-based Vision, Neuromorphic Vision, Visual Navigation, Concise Review.

Zujevs, A., Ņikitenko, A. Visual Navigation Datasets for Event-based Vision: 2014-2021. In: Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics, United States of America, Online, 6-8 July, 2021. Online: Institute for Systems and Technologies of Information, Control and Communication (INSTICC), 2021, pp.507-513.

Publication language
English (en)