[robotics-worldwide] [software] Code release: ESIM: Simulator for Event-based Cameras

Davide Scaramuzza scaramuzza.davide at gmail.com
Mon Nov 5 03:53:03 PST 2018

Dear colleagues,

We are excited to announce the open-source release of our event-camera 
simulator, called ESIM: http://rpg.ifi.uzh.ch/esim.html

Event cameras are revolutionary sensors that work radically differently 
from standard cameras. Instead of capturing intensity images at a fixed 
rate, event cameras measure changes of intensity asynchronously, in the 
form of a stream of events, which encode per-pixel brightness changes. 
In the last few years, their outstanding properties (asynchronous 
sensing, no motion blur, high dynamic range) have enabled exciting 
vision applications with very low latency and high robustness. However, 
these sensors are still scarce and expensive to obtain, which slows the 
progress of the research community.
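The sensing principle above is commonly modeled as follows: each pixel remembers the log intensity at its last event and fires a new event (with positive or negative polarity) whenever the current log intensity drifts a full contrast threshold away from that reference. A minimal sketch of this standard model, assuming two rendered log-intensity frames with linearly interpolated event timestamps (function and parameter names are illustrative, not ESIM's actual API):

```python
import numpy as np

def generate_events(log_prev, log_curr, t_prev, t_curr, ref, C=0.15):
    """Emit events wherever a pixel's log intensity crosses the contrast
    threshold C relative to its per-pixel reference level `ref`.
    Returns a list of (x, y, t, polarity) tuples."""
    events = []
    H, W = log_curr.shape
    for y in range(H):
        for x in range(W):
            delta = log_curr[y, x] - ref[y, x]
            polarity = 1 if delta > 0 else -1
            # Fire one event per threshold crossing within this interval.
            while abs(delta) >= C:
                ref[y, x] += polarity * C  # advance reference by one threshold
                denom = log_curr[y, x] - log_prev[y, x]
                # Linearly interpolate the crossing time between render times.
                frac = 1.0 if denom == 0 else (ref[y, x] - log_prev[y, x]) / denom
                events.append((x, y, t_prev + frac * (t_curr - t_prev), polarity))
                delta = log_curr[y, x] - ref[y, x]
    return events
```

For example, a pixel whose log intensity rises from 0.0 to 0.5 between two renders, with C = 0.2, crosses the threshold twice and yields two positive events at interpolated times.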

To address this issue, we present ESIM, an efficient and accurate event 
camera simulator implemented in C++ and available open source. ESIM can 
simulate arbitrary camera motion in 3D scenes, providing events, 
standard images, and inertial measurements, together with full 
ground-truth information including camera pose, velocity, depth, and 
optical flow maps. ESIM is the first and only event-camera simulator 
that accurately mimics the asynchronous output of an event camera, 
thanks to an adaptive sampling strategy.
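The idea behind adaptive sampling can be sketched as follows: under brightness constancy, a pixel's log intensity changes at rate dL/dt ≈ -∇L · v, so the fastest-changing pixel bounds how large the next rendering timestep may be before an event could be missed. A minimal sketch under that assumption (the details of ESIM's actual scheme may differ; names and values here are illustrative):

```python
import numpy as np

def next_sample_time(t, log_img, flow_x, flow_y, C=0.15, dt_max=1e-2):
    """Choose the next rendering timestamp so that no pixel's predicted
    log-intensity change exceeds the contrast threshold C."""
    gy, gx = np.gradient(log_img)             # spatial log-intensity gradients
    rate = np.abs(gx * flow_x + gy * flow_y)  # predicted |dL/dt| per pixel
    max_rate = rate.max()
    # Cap the step at dt_max; otherwise step just up to one threshold.
    dt = dt_max if max_rate == 0 else min(dt_max, C / max_rate)
    return t + dt
```

In a static or texture-free region the step falls back to dt_max, while fast apparent motion over strong gradients shrinks the step, which is what lets the simulator mimic the asynchronous, motion-dependent output of a real sensor.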

In addition to simulating event cameras, ESIM can also simulate standard 
cameras (including accurate motion blur) and an inertial measurement 
unit (IMU).
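Motion blur for the simulated standard camera can be approximated by averaging frames rendered at several times spread across the exposure window; this is a standard technique, and how ESIM integrates samples internally may differ. A sketch, where `render(t)` stands for any user-supplied function returning an intensity image at time t:

```python
import numpy as np

def blurred_frame(render, t_start, exposure, n_samples=10):
    """Approximate motion blur by averaging n_samples renders taken at
    evenly spaced times within the exposure window."""
    ts = np.linspace(t_start, t_start + exposure, n_samples)
    return np.mean([render(t) for t in ts], axis=0)
```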
Our open-source code comes with multiple rendering engines, including a 
fast, custom OpenGL-based renderer that runs in real time, and a 
photorealistic renderer based on the UnrealCV project.

ESIM was presented at the Conference on Robot Learning 2018 last week. 
Further details about ESIM can be found in the paper.

Project page, code, paper, video: http://rpg.ifi.uzh.ch/esim.html

Henri Rebecq, Daniel Gehrig, Davide Scaramuzza


Prof. Dr. Davide Scaramuzza
Director of the Robotics and Perception Group: http://rpg.ifi.uzh.ch/people_scaramuzza.html
Inst. of Informatics, University of Zurich,
Inst. of Neuroinformatics, University of Zurich and ETH Zurich
Andreasstrasse 15, AND 2.10, Zurich, Switzerland
Office: +41 44 635 24 09
YouTube Channel: https://www.youtube.com/ailabRPG/videos