[robotics-worldwide] [software] Release of the Event-Camera Dataset

Davide Scaramuzza scaramuzza.davide at gmail.com
Thu Oct 27 10:22:39 PDT 2016


Dear colleagues,

We are very happy to announce the release of the first public collection 
of datasets recorded with an event camera (DAVIS) for pose estimation, 
visual odometry, and SLAM applications! The data also include intensity 
images, inertial measurements, ground truth from a motion-capture 
system, and synthetic data, as well as an event camera simulator that 
allows you to create your own sequences! All the data are released both 
as standard text files and as binary rosbag files.

Dataset: http://rpg.ifi.uzh.ch/davis_data.html
Paper: https://arxiv.org/pdf/1610.08336v1
Video of some of the sequences: https://youtu.be/bVVBTQ7l36I
More on our research on event vision: http://rpg.ifi.uzh.ch/research_dvs.html
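
In case it is useful, here is a minimal sketch (in Python) of how the 
plain-text event files could be parsed; the file name "events.txt" and 
the per-line layout (timestamp, x, y, polarity) are assumptions, so 
please check them against the format description on the dataset page:

    # Minimal sketch: parse a plain-text event file into NumPy arrays.
    # Assumed line format (see dataset page): "timestamp x y polarity"
    import numpy as np

    def load_events(path="events.txt"):     # file name is an assumption
        data = np.loadtxt(path)             # one event per line
        t = data[:, 0]                      # timestamps in seconds
        x = data[:, 1].astype(int)          # pixel column
        y = data[:, 2].astype(int)          # pixel row
        p = data[:, 3].astype(int)          # polarity (1 = ON, 0 = OFF)
        return t, x, y, p

    t, x, y, p = load_events()
    print("Loaded %d events spanning %.3f s" % (len(t), t[-1] - t[0]))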

We provide data:
* from a large variety of scenarios, ranging from indoor to outdoor 
scenes, including high dynamic range
* featuring a variety of motions, from slow to fast and from 1-DOF to 6-DOF
* with several sequences recorded using a motorized linear slider, 
leading to very smooth motions!
* including synthetic data and an event camera simulator that allows 
you to create your own sequences!
* including intensity images and inertial measurements at high frequencies.
* with precise ground truth from a motion-capture system.
* with accurate intrinsic and extrinsic calibration.
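
For the rosbag version of the data, a similar sketch using the standard 
rosbag Python API is given below; the topic names (/dvs/events, 
/dvs/image_raw, /dvs/imu) and the event-array message layout are 
assumptions that should be verified against the dataset page:

    # Sketch: iterate over one sequence rosbag (topic names are assumptions).
    import rosbag

    bag = rosbag.Bag("example_sequence.bag")   # placeholder file name
    for topic, msg, t in bag.read_messages(
            topics=["/dvs/events", "/dvs/image_raw", "/dvs/imu"]):
        if topic == "/dvs/events":
            for e in msg.events:               # assumed event-array message
                pass  # process e.x, e.y, e.ts.to_sec(), e.polarity
        elif topic == "/dvs/image_raw":
            pass  # grayscale intensity frame (sensor_msgs/Image)
        elif topic == "/dvs/imu":
            pass  # inertial measurement (sensor_msgs/Imu)
    bag.close()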

--------------------------------------------------------------
About event cameras and the DAVIS sensor
--------------------------------------------------------------

Event cameras are revolutionary vision sensors that overcome the 
limitations of standard cameras in scenes characterized by high-dynamic 
range and high-speed motion: https://youtu.be/iZZ77F-hwzs
However, as these cameras are still expensive and not yet widespread, we 
hope that this dataset will accelerate research on event-based algorithms!
Our dataset was recorded with a DAVIS240C sensor, which incorporates a 
conventional global-shutter camera and an event-based sensor in the same 
pixel array. This sensor has great potential for high-speed and 
high-dynamic range robotics and computer vision applications because it 
combines the benefits of conventional cameras with those of event-based 
sensors: low latency, high temporal resolution (~1 microsecond), and 
very high dynamic range (120 dB). However, new algorithms are required 
to exploit the sensor characteristics and cope with its unconventional 
output, which consists of a stream of asynchronous brightness changes 
(called "events") and synchronous grayscale frames.

We gratefully acknowledge our sponsors: the DARPA FLA Program, the Google 
Faculty Research Award, the Qualcomm Innovation Fellowship, the SNSF-ERC 
Starting Grant, NCCR Robotics, the Swiss National Science Foundation, 
and the UZH Forschungskredit.

All feedback is very welcome!

Elias Mueggler, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, Davide 
Scaramuzza


