eStonefish-scenes: A synthetically generated dataset for underwater event-based optical flow prediction tasks

Abstract

The combined use of event-based vision and Spiking Neural Networks (SNNs) is expected to significantly impact robotics, particularly in tasks like visual odometry and obstacle avoidance. While existing real-world event-based datasets for optical flow prediction, typically captured with Unmanned Aerial Vehicles (UAVs), offer valuable insights, they are limited in diversity and scalability, and are challenging to collect. Moreover, there is a notable lack of labelled datasets for underwater applications, which hinders the integration of event-based vision with Autonomous Underwater Vehicles (AUVs). To address this, synthetic datasets could provide a scalable solution while bridging the gap between simulation and reality. In this work, we introduce eStonefish-scenes, a synthetic event-based optical flow dataset based on the Stonefish simulator. Alongside the dataset, we present a data generation pipeline that enables the creation of customizable underwater environments. This pipeline allows for simulating dynamic scenarios, such as biologically inspired schools of fish exhibiting realistic motion patterns, including obstacle avoidance and reactive navigation around corals. Additionally, we introduce a scene generator that builds realistic reef seabeds by randomly distributing coral across the terrain. To streamline data accessibility, we present eWiz, a comprehensive library designed for processing event-based data, offering tools for data loading, augmentation, visualization, encoding, and training data generation, along with loss functions and performance metrics.
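To illustrate the kind of encoding step such a pipeline involves, the sketch below converts a stream of events (pixel coordinates, timestamps, polarities) into a time-binned voxel grid, a common input representation for event-based optical flow networks. The function name and signature are illustrative assumptions, not eWiz's actual API.

```python
import numpy as np

def events_to_voxel_grid(xs, ys, ts, ps, num_bins, height, width):
    """Accumulate events into a (num_bins, H, W) voxel grid,
    spreading each event over its two nearest time bins via
    bilinear interpolation along the time axis.

    Hypothetical helper for illustration only; not eWiz's API.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t0, t1 = ts[0], ts[-1]
    # Normalize timestamps into the bin index range [0, num_bins - 1].
    t_norm = (ts - t0) / max(t1 - t0, 1e-9) * (num_bins - 1)
    # Map polarity to a signed contribution (+1 / -1).
    pol = np.where(ps > 0, 1.0, -1.0)
    left = np.floor(t_norm).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t_norm - left
    w_left = 1.0 - w_right
    # np.add.at handles repeated indices correctly (unbuffered add).
    np.add.at(voxel, (left, ys, xs), pol * w_left)
    np.add.at(voxel, (right, ys, xs), pol * w_right)
    return voxel
```

An event falling exactly on a bin boundary contributes entirely to that bin; events in between are split proportionally, so no temporal information is discarded by hard binning.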

@article{mansour2025_2505.13309,
  title={eStonefish-scenes: A synthetically generated dataset for underwater event-based optical flow prediction tasks},
  author={Jad Mansour and Sebastian Realpe and Hayat Rajani and Michele Grimaldi and Rafael Garcia and Nuno Gracias},
  journal={arXiv preprint arXiv:2505.13309},
  year={2025}
}