

June 12th, 2025, CVPR (Full day workshop), Nashville (TN), USA.
Held in conjunction with the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2025.

Welcome to the 5th International Workshop on Event-Based Vision!

The workshop starts at 8:00 am Eastern Time.

Location: 4th level, Grand Ballroom C2

Invited Speakers

[Photo: workshop speakers and organizers]

Objectives

Event-based cameras are bio-inspired, asynchronous sensors that offer key advantages: microsecond temporal resolution, low latency, high dynamic range, and low power consumption. Because of these advantages, event-based cameras open frontiers that are unthinkable with traditional (frame-based) cameras, which have been the main sensing technology for the past 60 years. These revolutionary sensors enable the design of a new class of efficient algorithms to track a baseball in the moonlight, build a flying robot with the agility of a bee, and perform structure from motion in challenging lighting conditions and at remarkable speeds. In the last decade, research on these sensors has attracted the attention of industry and academia, fostering exciting advances in the field.

The workshop covers the sensing hardware, as well as the processing, data, and learning methods needed to take advantage of these novel cameras. It also considers novel vision sensors, such as pixel processor arrays, which perform massively parallel processing near the image plane. Because early vision computations are carried out on-sensor (mimicking the retina), the resulting systems have high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.
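As a purely illustrative sketch of the data these sensors produce: each event is a tuple (x, y, t, polarity), and a common first step is to accumulate events into a frame-like histogram before applying frame-based algorithms. All names and values below are synthetic and not tied to any particular camera SDK:

```python
import numpy as np

# Hypothetical event stream: each event is (x, y, t, polarity),
# with polarity +1 (brightness increase) or -1 (brightness decrease).
rng = np.random.default_rng(0)
H, W, N = 4, 4, 100
events = np.stack([
    rng.integers(0, W, N).astype(float),  # x coordinate
    rng.integers(0, H, N).astype(float),  # y coordinate
    np.sort(rng.uniform(0, 1, N)),        # timestamps (microsecond-scale in real sensors)
    rng.choice([-1.0, 1.0], N),           # polarity
], axis=1)

# Accumulate polarities into a 2D histogram ("event frame"),
# a simple frame-like representation of the asynchronous stream.
frame = np.zeros((H, W))
for x, y, t, p in events:
    frame[int(y), int(x)] += p

print(frame.shape)  # (4, 4)
```

Real pipelines use richer representations (voxel grids, time surfaces, graphs, spikes), but this accumulation step illustrates the basic event format.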

Topics Covered

A longer list of related topics is available in the table of contents of the List of Event-based Vision Resources.

Call for Contributions

Research papers

Research papers and demos are solicited in, but not limited to, the topics listed above.

List of accepted papers and live demos

  1. Dynamic EventNeRF: Reconstructing General Dynamic Scenes from Multi-view RGB and Event Streams
  2. Event Quality Score (EQS): Assessing the Realism of Simulated Event Camera Streams via Distances in Latent Space
  3. E-VLC: A Real-World Dataset for Event-based Visible Light Communication And Localization
  4. EV-Flying: an Event-based Dataset for In-The-Wild Recognition of Flying Objects
  5. BiasBench: A reproducible benchmark for tuning the biases of event cameras
  6. MTevent: A Multi-Task Event Camera Dataset for 6D Pose Estimation and Moving Object Detection
  7. Looking into the Shadow: Recording a Total Solar Eclipse with High-resolution Event Cameras
  8. Probabilistic Online Event Downsampling
  9. Every Event Counts: Balancing Data Efficiency and Accuracy in Event Camera Subsampling
  10. Learning from Noise: Enhancing DNNs for Event-Based Vision through Controlled Noise Injection
  11. Towards Low-Latency Event-based Obstacle Avoidance on a FPGA-Drone
  12. Real-Time Pedestrian Detection at the Edge on a Fully Asynchronous Neuromorphic System
  13. Reading Events Hierarchically to Reduce Latency
  14. DELTA: Dense Depth from Events and LiDAR using Transformer’s Attention
  15. Quadrocular, Neuromorphic Stereo Triangulation and Asynchronous Data Fusion for 3D Object Tracking
  16. Nanoparticle Diameter Measurements With Event Camera Tracking
  17. Egocentric Event-Based Vision for Ping Pong Ball Trajectory Prediction
  18. Event-Driven Dynamic Attention for Multi-Object Tracking on Neuromorphic Hardware
  19. Event-based Tracking and Imaging of Randomly Moving Objects in Dense Dynamical Scattering Media
  20. Perturbed State Space Feature Encoders for Optical Flow with Event Cameras
  21. Spatio-Temporal State Space Model For Efficient Event-Based Optical Flow
  22. Best Linear Unbiased Estimation for 2D and 3D Flow with Event-based Cameras
  23. Iterative Event-based Motion Segmentation by Variational Contrast Maximization
  24. EV-LayerSegNet: Self-supervised Motion Segmentation using Event Cameras
  25. Seeing like a Cephalopod: Colour Vision with a Monochrome Event Camera
  26. Event-based Continuous Color Video Decompression from Single Frames
  27. Reading in the Dark with Foveated Event Vision
  28. Human-Robot Navigation using Event-based Cameras and Reinforcement Learning
  29. E-BARF: Bundle Adjusting Neural Radiance Fields from a Moving Event Camera
  30. Event-based eye tracking. Event-based Vision Workshop 2025
  31. BRAT: Bidirectional Relative Positional Attention Transformer for Event-based Eye tracking
  32. Exploring Temporal Dynamics in Event-based Eye Tracker
  33. Dual-Path Enhancements in Event-Based Eye Tracking: Augmented Robustness and Adaptive Temporal Modeling
  34. Live Demonstration: AR Application Using Active Markers With An Event Camera
  35. Live Demonstration: Point-Feature Tracking for Pixel Processor Arrays
  36. Live Demonstration: NeuroTouch - A Neuromorphic Vision-based Tactile Sensor for Real-Time Gesture Recognition
  37. Live Demonstration: Real-time event-data processing with Graph Convolutional Neural Networks and SoC FPGA

Courtesy papers (in the poster session)

We also solicit contributions of papers relevant to the workshop that have been accepted at the CVPR main conference or at other peer-reviewed conferences or journals. These contributions will be checked for suitability (soft review) and will not be published in the workshop proceedings. Papers should be submitted in single-blind format (e.g., the accepted version is fine) and should mention if and where the paper has been accepted or published. These contributions give visibility to your work and help build a community around the topics of the workshop.

List of Courtesy papers

  1. NTIRE 2025 Challenge on Event-Based Image Deblurring: Methods and Results, CVPRW 2025.
  2. Graph Neural Network Combining Event Stream and Periodic Aggregation for Low-Latency Event-based Vision, CVPR 2025.
  3. Repurposing Pre-trained Video Diffusion Models for Event-based Video Interpolation, CVPR 2025.
  4. ETAP: Event-based Tracking of Any Point, CVPR 2025.
  5. EBS-EKF: Accurate and High Frequency Event-based Star Tracking, CVPR 2025.
  6. Event fields: Capturing light fields at high speed, resolution, and dynamic range, CVPR 2025.
  7. On-Device Self-Supervised Learning of Low-Latency Monocular Depth from Only Events, CVPR 2025.
  8. Bridge Frame and Event: Common Spatiotemporal Fusion for High-Dynamic Scene Optical Flow, CVPR 2025.
  9. Ev-3DOD: Pushing the Temporal Boundaries of 3D Object Detection with Event Cameras, CVPR 2025.
  10. TimeTracker: Event-based Continuous Point Tracking for Video Frame Interpolation with Non-linear Motion, CVPR 2025.
  11. EventFly: Event Camera Perception from Ground to the Sky, CVPR 2025.
  12. DiET-GS: Diffusion Prior and Event Stream-Assisted Motion Deblurring 3D Gaussian Splatting, CVPR 2025.
  13. Event Ellipsometer: Event-based Mueller-Matrix Video Imaging, CVPR 2025.
  14. IncEventGS: Pose-Free Gaussian Splatting from a Single Event Camera, CVPR 2025.
  15. PS-EIP: Robust Photometric Stereo Based on Event Interval Profile, CVPR 2025.
  16. EMBA: Event-based Mosaicing Bundle Adjustment, ECCV 2024.
  17. Neuromorphic Facial Analysis with Cross-Modal Supervision, ECCVW 2024.
  18. Distance Estimation in Outdoor Driving Environments Using Phase-only Correlation Method with Event Cameras, IEEE IV 2025.
  19. Evaluation of Mobile Environment for Vehicular Visible Light Communication Using Multiple LEDs and Event Cameras, IEEE IV 2025.
  20. Neural Inertial Odometry from Lie Events, RSS 2025.

There is still time to submit!

Competitions / Challenges

1. Eye-tracking

Overview: This challenge focuses on advancing event-based eye tracking, a key technology for driving innovations in interaction technology, extended reality (XR), and cognitive studies. While current state-of-the-art devices such as Apple's Vision Pro or Meta's Aria glasses use frame-based eye tracking with frame rates from 10 to 100 Hz and latency around 11 ms, there is a pressing need for smoother, faster, and more efficient methods to enhance user experience. By leveraging the event-based eye-tracking dataset (3ET+), this challenge offers participants the opportunity to contribute to cutting-edge solutions that push beyond current limitations. The top-ranked team will receive a Meta Quest 3 as a prize (sponsored by DVsense).

Eye-tracking Challenge website

Timeline:

Contact:


2. Space-time Instance Segmentation (SIS) Challenge

MouseSIS Visualization

Overview:

Challenge Page (Codabench)

Timeline:

Contact: Friedhelm Hamann (f.hamann [at] tu-berlin [dot] de)


3. Event-Based Image Deblurring Challenge

Deblur with events

Overview:

This challenge focuses on leveraging the high-temporal-resolution events from event cameras to improve image deblurring. We hope that this challenge will serve as a starting point for promoting event-based image enhancement on a broader stage and contribute to the thriving development of the event-based vision community.
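For intuition on why high-temporal-resolution events help deblurring: a blurred frame is the temporal average of latent sharp frames over the exposure, while events record the log-brightness changes between those frames, so the sharp frame can be recovered from the pair. The toy single-pixel sketch below follows the spirit of event-based double-integral models; all values are synthetic and the contrast threshold `c` is an assumption:

```python
import numpy as np

# Toy single-pixel model of event-assisted deblurring.
# The latent log-intensity changes during the exposure; events fire
# once per change of magnitude c (the contrast threshold).
c = 0.2
latent_log = np.array([0.0, 0.2, 0.4, 0.6, 0.8])  # log intensity at 5 instants
latent = np.exp(latent_log)                        # latent sharp intensities
blurred = latent.mean()                            # camera averages over the exposure

# Events fired between consecutive instants: one +1 event per +c step here.
events = np.diff(latent_log) / c                   # [1, 1, 1, 1]

# Each latent frame relative to the first follows from the running event sum;
# the sharp first frame then falls out of the blurred measurement.
rel = np.exp(np.concatenate([[0.0], np.cumsum(events) * c]))  # latent / latent[0]
sharp0 = blurred / rel.mean()
print(np.isclose(sharp0, latent[0]))  # True
```

Real methods must additionally handle noisy thresholds, spatial structure, and per-pixel event streams, which is what the challenge entries address.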

Challenge Page (CodaLab)

Timeline:

Contact: Lei Sun (leo_sun [at] zju [dot] edu [dot] cn)


4. Event-Based SLAM Challenge

Overview:

The goal of this challenge is to leverage the high temporal and spatial resolution of HD event cameras for SLAM and pose estimation applications.

Please refer to the challenge website for more information about participation.

Tracks:

Timeline:

Contact:


Location

Schedule

The tentative schedule is as follows:

| Time (local) | Session |
|---|---|
| 8:00 | Welcome. Session 1: Event cameras: Algorithms and applications I (Invited speakers) |
| 10:10 | Coffee break. Set up posters. |
| 10:30 | Session 2: Poster session: contributed papers, competitions, demos, and courtesy presentations (as posters) |
| 12:30 | Lunch break |
| 13:30 | Session 3: Event cameras: Algorithms and applications II (Invited speakers) |
| 15:30 | Coffee break |
| 16:00 | Session 4: Hardware architectures and sensors (Invited speakers) |
| 17:45 | Award ceremony and final panel discussion |
| 18:00 | End |

Organizers

Important Dates

FAQs

See also this link

Ack

The Microsoft CMT service was used for managing the peer-reviewing process for this conference. This service was provided for free by Microsoft and they bore all expenses, including costs for Azure cloud services as well as for software development and support.