
Event-Centric Multi-Sensor Dataset

Event cameras have recently gained popularity as they hold strong potential to complement regular cameras in situations of high dynamics or challenging illumination. An important problem that stands to benefit from the addition of an event camera is Simultaneous Localization And Mapping (SLAM). However, in order to ensure progress on event-inclusive multi-sensor SLAM, novel benchmark sequences are needed. Our contribution is the first complete set of benchmark datasets captured with a multi-sensor setup containing an event-based stereo camera, a regular stereo camera, multiple depth sensors, and an inertial measurement unit. The setup is fully hardware-synchronized and underwent accurate extrinsic calibration. All sequences come with ground truth data captured by highly accurate external reference devices, such as a motion capture system. Individual sequences include both small- and large-scale environments and cover the specific challenges targeted by dynamic vision sensors.


Simultaneous Localization And Mapping (SLAM) is regarded as an essential problem to be solved by intelligent mobile agents such as autonomous robots, XR devices, and smart vehicles. LiDARs and depth cameras provide direct depth readings and are therefore often considered very helpful in reducing the complexity and increasing the accuracy and density of a SLAM solution. However, a compact form factor, low energy consumption, and the ability to sense appearance information have long made regular cameras an indispensable addition to any SLAM sensor suite. The present paper targets the addition of yet another exteroceptive visual sensor: a Dynamic Vision Sensor (DVS), also called an event camera.
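As a brief, illustrative aside: unlike a regular camera, a DVS does not output frames at a fixed rate, but an asynchronous stream of per-pixel events of the form (t, x, y, p), each signalling a brightness change of polarity p at pixel (x, y) and time t. The minimal NumPy sketch below assumes a hypothetical plain-text event layout (one "t x y p" entry per line, which is not necessarily the format used by this dataset; please consult the documentation linked below) and shows how such a stream might be loaded and accumulated into a frame-like representation.

import numpy as np

def load_events(path):
    # Hypothetical layout: one event per line, columns = timestamp [s], x, y, polarity.
    raw = np.loadtxt(path)
    events = np.zeros(len(raw), dtype=[("t", "f8"), ("x", "i4"), ("y", "i4"), ("p", "i4")])
    events["t"], events["x"], events["y"], events["p"] = raw.T
    return events

def accumulate_frame(events, t0, t1, height, width):
    # Sum event polarities within the time window [t0, t1) into an image-like grid.
    frame = np.zeros((height, width), dtype=np.int32)
    sel = events[(events["t"] >= t0) & (events["t"] < t1)]
    np.add.at(frame, (sel["y"], sel["x"]), sel["p"])
    return frame

Accumulating polarities over a short window is only one of several common event representations (others include time surfaces and voxel grids); it is used here purely to make the asynchronous nature of the data concrete.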

The dataset, along with the documentation and the toolbox, can be found at https://star-datasets.github.io/

L. Gao*, Y. Liang*, J. Yang*, S. Wu, C. Wang, J. Chen, L. Kneip, VECtor: A Versatile Event-Centric Benchmark for Multi-Sensor SLAM, IEEE Robotics and Automation Letters (RA-L), 2022. [project webpage] [paper]
