
arXiv:1610.08336
The Event-Camera Dataset: Event-based Data for Pose Estimation, Visual Odometry, and SLAM

26 October 2016
Elias Mueggler
Henri Rebecq
Guillermo Gallego
T. Delbruck
Davide Scaramuzza
Abstract

New vision sensors, such as the Dynamic and Active-pixel Vision sensor (DAVIS), incorporate a conventional global-shutter camera and an event-based sensor in the same pixel array. These sensors have great potential for high-speed robotics and computer vision because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and very high dynamic range. However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called "events") and synchronous grayscale frames. For this purpose, we present and release a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which we hope will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications. In addition to global-shutter intensity images and asynchronous events, we also provide inertial measurements and ground truth from a motion-capture system. All the data are released both as standard text files and binary files (i.e., rosbag). This paper provides an overview of the available data.
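Since the abstract notes that the event data are released as standard text files alongside rosbag binaries, a small parsing sketch may be helpful. The snippet below assumes a common plain-text layout of one event per line, `timestamp x y polarity`; this layout is an assumption for illustration — consult the dataset's own documentation for the authoritative file format.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float       # timestamp in seconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # 1 = brightness increase, 0 = decrease

def load_events(path):
    """Parse an event stream from a whitespace-separated text file.

    Assumes one event per line: "timestamp x y polarity" (a common
    plain-text layout for DAVIS event dumps; the actual column order
    should be verified against the dataset's README).
    """
    events = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) != 4:
                continue  # skip headers or malformed lines
            t, x, y, p = fields
            events.append(Event(float(t), int(x), int(y), int(p)))
    return events
```

Because events are asynchronous and timestamped individually, downstream algorithms typically iterate over this stream in time order rather than frame by frame, which is what makes the low-latency processing described above possible.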
