A new generation of computer vision, namely event-based or neuromorphic vision, provides a new paradigm for capturing and processing visual data. Because event-based vision relies on a relatively novel type of visual sensor, only a few datasets aimed at visual navigation tasks are publicly available. In this paper, we present and describe the first event-based vision dataset intended to cover visual navigation tasks for mobile robots operating in different types of agricultural environments. The dataset may open new opportunities for evaluating existing event-based visual navigation methods and for developing new ones suited to agricultural scenes that contain abundant vegetation, animals, and patterned objects. The dataset was created using our custom-designed Sensor Bundle mounted on a mobile robot platform, which was manually controlled during the data acquisition sessions. The Sensor Bundle consists of a dynamic vision sensor, a LIDAR, an RGB-D camera, and environmental sensors. In total, 21 data sequences covering 12 different scenarios recorded during the autumn season are publicly available. Each data sequence is accompanied by a video demonstrating its content and by a detailed description, including known issues. The dataset is designed primarily for Visual Odometry tasks; however, it also includes loop closures that allow event-based visual SLAM methods to be applied.