The Dynamic Vision Sensor is considered a next-generation vision sensor. Since event-based vision is in an early stage of development, only a small number of datasets have been created during the last decade. Dataset creation is motivated by the need for real data from one or more sensors. The temporal accuracy of such datasets is crucial, since events have a high temporal resolution on the order of microseconds and, during algorithm evaluation, this visual data is usually fused with data from other sensor types. The main aim of our research is to achieve the most accurate time synchronization possible between an event camera, a LIDAR, and ambient environment sensors during a data acquisition session. All of these sensors, as well as a stereo camera and a monocular camera, were installed on a mobile robotic platform. In this work, a time synchronization architecture and algorithm are proposed, together with an example implementation on a PIC32 microcontroller. The overall approach scales to other sensors wherever accurate time synchronization between many nodes is required. Evaluation results for the proposed solution are reported and discussed in the paper. © 2020 IEEE.
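The abstract does not detail the synchronization algorithm itself. As a minimal sketch of the general pattern such a microcontroller-based design could follow, the C program below assumes a master clock that broadcasts a sync pulse at a known period; each node latches its local free-running microsecond counter on the pulse edge and uses successive captures to estimate clock offset and drift, so that raw sensor timestamps can be translated into the shared timebase. All names (`sync_state_t`, `on_sync_pulse`, `to_master_time`, `SYNC_PERIOD_US`), the one-pulse-per-second scheme, and the drift model are illustrative assumptions, not the paper's confirmed method or any PIC32 peripheral API.

```c
/*
 * Hypothetical sketch of pulse-based clock synchronization.
 * Assumption: a master node emits a sync pulse every SYNC_PERIOD_US
 * microseconds; an input-capture interrupt latches the local counter.
 */
#include <stdint.h>
#include <stdio.h>

#define SYNC_PERIOD_US 1000000ULL /* assumed: one sync pulse per second */

typedef struct {
    int64_t  offset_us;   /* master_time - local_time at last pulse */
    double   drift;       /* local ticks elapsed per master tick */
    uint64_t last_local;  /* local counter value at last pulse */
    uint64_t last_master; /* master time at last pulse */
} sync_state_t;

/* Called when a sync pulse latches the local counter (e.g., from an ISR). */
static void on_sync_pulse(sync_state_t *s, uint64_t local_now)
{
    uint64_t master_now = s->last_master + SYNC_PERIOD_US;
    if (s->last_local != 0) {
        /* Rate error: local interval vs. known master interval. */
        s->drift = (double)(local_now - s->last_local) / SYNC_PERIOD_US;
    }
    s->offset_us   = (int64_t)master_now - (int64_t)local_now;
    s->last_local  = local_now;
    s->last_master = master_now;
}

/* Translate a raw local timestamp (e.g., of an event or LIDAR packet)
 * into the shared master timebase, compensating for drift. */
static uint64_t to_master_time(const sync_state_t *s, uint64_t local_ts)
{
    double elapsed = (double)(local_ts - s->last_local) / s->drift;
    return s->last_master + (uint64_t)elapsed;
}

int main(void)
{
    sync_state_t s = { .drift = 1.0 };

    /* Simulated captures: the local oscillator runs 50 ppm fast. */
    on_sync_pulse(&s, 1000000); /* first pulse  */
    on_sync_pulse(&s, 2000050); /* second pulse */

    /* A sensor timestamp halfway to the next pulse maps back cleanly. */
    printf("event at master time %llu us\n",
           (unsigned long long)to_master_time(&s, 2500075));
    return 0;
}
```

In this sketch, drift correction between pulses is what keeps microsecond-scale event timestamps consistent with the other sensors; whether the paper's algorithm uses this particular offset-and-rate model is not stated in the abstract.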