

June 12th, 2025, CVPR (Full day workshop), Nashville (TN), USA. Held in conjunction with the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2025.

Welcome to the 5th International Workshop on Event-Based Vision!

CVPRW 2025 edition. Photo by G. Gallego.

Many thanks to all who contributed and made this workshop possible!

Photo Album of the Workshop

Invited Speakers

Workshop speakers and organizers

Schedule

Time Speaker and Title
08:00 SESSION 1
08:00 Welcome and Organization
08:05 Davide Scaramuzza (University of Zurich, Switzerland). Event Cameras: a New Way of Sensing. Slides
08:35 Daniel Cremers (TU Munich, Germany)
09:05 Priyadarshini Panda (Yale University, USA). A Neuromorphic Approach To Event-based Vision: From Encoding To Adaptive Architectures. Slides
09:40 Introduction of 4 Challenges: Eye-tracking, Deblurring, Segmentation and SLAM.
Event-Based Eye-Tracking Challenge. Slides
Event-Based Instance Segmentation Challenge. Slides
Event-Based Image Deblurring Challenge. Slides
Event-Based SLAM Challenge. Slides
10:00 Coffee break. Set up posters
10:30 SESSION 2
10:30 Poster session: contributed papers, demos, challenges and courtesy papers (as posters). Boards 280–359.
12:30 Lunch break
13:30 SESSION 3
13:30 Niklas Funk (TU Darmstadt, Germany). Event-based Optical Tactile Sensing For Robotic Manipulation. Slides
13:55 Christopher Metzler (University of Maryland, USA). Computational Imaging with Event Cameras. Slides
14:20 Vladislav Golyanik (Max Planck Institute, Germany). Event-based Non-rigid 3D Reconstruction and Novel-View Synthesis. Slides
14:45 Ziwei Wang (ANU, Australia). Seeing through Space and Time: Asynchronous Event Blob Tracking, Detection and Optical Communication. Slides
15:10 Coffee break
16:00 SESSION 4
16:00 Hussain Sajwani (Khalifa University, UAE). Neuromorphic Vision in Aerospace Automation. Slides
16:20 Shintaro Shiba (Woven by Toyota, Japan). Real-world Event Camera Applications Towards Visible-Light Communication. Slides
16:40 Scott McCloskey (Kitware, USA). Centroiding Point-Objects with Event Cameras. Slides
17:00 Kynan Eng (SynSense Group, Switzerland).
17:30 Davide Migliore (Tempo Sense, USA). Harnessing the Power of AI for Smart Sensing. Slides
17:50 Award Ceremony

Objectives

Event-based cameras are bio-inspired, asynchronous sensors that offer key advantages: microsecond temporal resolution, low latency, high dynamic range, and low power consumption. Because of these advantages, event-based cameras open frontiers that are unthinkable with traditional (frame-based) cameras, which have been the main sensing technology for the past 60 years. These revolutionary sensors enable the design of a new class of efficient algorithms to track a baseball in the moonlight, build a flying robot with the agility of a bee, and perform structure from motion in challenging lighting conditions and at remarkable speeds. In the last decade, research on these sensors has attracted the attention of industry and academia, fostering exciting advances in the field. The workshop covers the sensing hardware, as well as the processing, data, and learning methods needed to take advantage of these novel cameras. The workshop also considers novel vision sensors, such as pixel processor arrays, which perform massively parallel processing near the image plane. Because early vision computations are carried out on-sensor (mimicking the retina), the resulting systems have high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.
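Instead of frames, an event camera emits a sparse, asynchronous stream of tuples (t, x, y, polarity), one per pixel-level brightness change. As a minimal illustration (not tied to any particular camera SDK or dataset format), the sketch below accumulates such a stream into a signed polarity-count image, one of the simplest grid representations used by downstream algorithms:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (t, x, y, polarity) events into a signed count image.

    events: iterable of (t, x, y, p) with polarity p in {-1, +1}.
    Returns an (height, width) int array holding the net polarity
    count per pixel -- a simple, common event representation.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        frame[y, x] += p  # timestamps are ignored in this basic variant
    return frame

# Hypothetical toy stream: timestamps in microseconds.
events = [(10, 1, 0, +1), (12, 1, 0, +1), (15, 2, 1, -1)]
frame = events_to_frame(events, height=2, width=3)
print(frame)  # pixel (0, 1) -> +2, pixel (1, 2) -> -1
```

Richer representations (time surfaces, voxel grids, learned embeddings) additionally exploit the timestamps that this basic variant discards.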

Topics Covered

A longer list of related topics is available in the table of contents of the List of Event-based Vision Resources.

Proceedings at The Computer Vision Foundation (CVF)

List of accepted papers and live demos

  1. Dynamic EventNeRF: Reconstructing General Dynamic Scenes from Multi-view RGB and Event Streams
  2. Event Quality Score (EQS): Assessing the Realism of Simulated Event Camera Streams via Distances in Latent Space
  3. E-VLC: A Real-World Dataset for Event-based Visible Light Communication And Localization
  4. EV-Flying: an Event-based Dataset for In-The-Wild Recognition of Flying Objects
  5. BiasBench: A reproducible benchmark for tuning the biases of event cameras
  6. MTevent: A Multi-Task Event Camera Dataset for 6D Pose Estimation and Moving Object Detection
  7. Looking into the Shadow: Recording a Total Solar Eclipse with High-resolution Event Cameras
  8. Probabilistic Online Event Downsampling
  9. Making Every Event Count: Balancing Data Efficiency and Accuracy in Event Camera Subsampling
  10. Learning from Noise: Enhancing DNNs for Event-Based Vision through Controlled Noise Injection
  11. Towards Low-Latency Event-based Obstacle Avoidance on a FPGA-Drone
  12. Real-Time Pedestrian Detection at the Edge on a Fully Asynchronous Neuromorphic System
  13. DELTA: Dense Depth from Events and LiDAR using Transformer’s Attention
  14. Quadrocular, Neuromorphic Stereo Triangulation and Asynchronous Data Fusion for 3D Object Tracking
  15. Nanoparticle Diameter Measurements With Event Camera Tracking
  16. Egocentric Event-Based Vision for Ping Pong Ball Trajectory Prediction
  17. Event-Driven Dynamic Attention for Multi-Object Tracking on Neuromorphic Hardware
  18. Event-based Tracking and Imaging of Randomly Moving Objects in Dense Dynamical Scattering Media
  19. Perturbed State Space Feature Encoders for Optical Flow with Event Cameras
  20. Spatio-Temporal State Space Model For Efficient Event-Based Optical Flow
  21. Best Linear Unbiased Estimation for 2D and 3D Flow with Event-based Cameras
  22. Iterative Event-based Motion Segmentation by Variational Contrast Maximization
  23. EV-LayerSegNet: Self-supervised Motion Segmentation using Event Cameras
  24. Seeing like a Cephalopod: Colour Vision with a Monochrome Event Camera
  25. Event-based Continuous Color Video Decompression from Single Frames
  26. Reading in the Dark with Foveated Event Vision
  27. Human-Robot Navigation using Event-based Cameras and Reinforcement Learning
  28. E-BARF: Bundle Adjusting Neural Radiance Fields from a Moving Event Camera
Event-based eye tracking. Event-based Vision Workshop 2025
  30. BRAT: Bidirectional Relative Positional Attention Transformer for Event-based Eye tracking
  31. Exploring Temporal Dynamics in Event-based Eye Tracker
  32. Dual-Path Enhancements in Event-Based Eye Tracking: Augmented Robustness and Adaptive Temporal Modeling
  33. Live Demonstration: Augmented Reality Applications Using Active Markers With An Event Camera
  34. Live Demonstration: Point-Feature Tracking for Pixel Processor Arrays
  35. Live Demonstration: NeuroTouch - A Neuromorphic Vision-based Tactile Sensor for Real-Time Gesture Recognition
  36. Live Demonstration: Real-time event-data processing with Graph Convolutional Neural Networks and SoC FPGA
  37. Reading Events Hierarchically to Reduce Latency

List of Courtesy papers

  1. NTIRE 2025 Challenge on Event-Based Image Deblurring: Methods and Results, CVPRW 2025.
  2. Graph Neural Network Combining Event Stream and Periodic Aggregation for Low-Latency Event-based Vision, CVPR 2025.
  3. Repurposing Pre-trained Video Diffusion Models for Event-based Video Interpolation, CVPR 2025.
  4. ETAP: Event-based Tracking of Any Point, CVPR 2025.
  5. EBS-EKF: Accurate and High Frequency Event-based Star Tracking, CVPR 2025.
  6. Event fields: Capturing light fields at high speed, resolution, and dynamic range, CVPR 2025.
  7. On-Device Self-Supervised Learning of Low-Latency Monocular Depth from Only Events, CVPR 2025.
  8. Bridge Frame and Event: Common Spatiotemporal Fusion for High-Dynamic Scene Optical Flow, CVPR 2025.
  9. Ev-3DOD: Pushing the Temporal Boundaries of 3D Object Detection with Event Cameras, CVPR 2025.
  10. TimeTracker: Event-based Continuous Point Tracking for Video Frame Interpolation with Non-linear Motion, CVPR 2025.
  11. EventFly: Event Camera Perception from Ground to the Sky, CVPR 2025.
  12. DiET-GS: Diffusion Prior and Event Stream-Assisted Motion Deblurring 3D Gaussian Splatting, CVPR 2025.
  13. Event Ellipsometer: Event-based Mueller-Matrix Video Imaging, CVPR 2025.
  14. IncEventGS: Pose-Free Gaussian Splatting from a Single Event Camera, CVPR 2025.
  15. PS-EIP: Robust Photometric Stereo Based on Event Interval Profile, CVPR 2025.
  16. Efficient Event-Based Object Detection: A Hybrid Neural Network with Spatial and Temporal Attention, CVPR 2025.
  17. EMBA: Event-based Mosaicing Bundle Adjustment, ECCV 2024.
  18. Neuromorphic Facial Analysis with Cross-Modal Supervision, ECCVW 2024.
  19. Distance Estimation in Outdoor Driving Environments Using Phase-only Correlation Method with Event Cameras, IEEE IV 2025.
  20. Evaluation of Mobile Environment for Vehicular Visible Light Communication Using Multiple LEDs and Event Cameras, IEEE IV 2025.
  21. EvMAPPER: High Altitude Orthomapping with Event Cameras, ICRA 2025.
  22. Neural Inertial Odometry from Lie Events, RSS 2025.

Competitions / Challenges

1. Eye-tracking

Overview: This challenge focuses on advancing event-based eye tracking, a key technology for driving innovations in interaction technology, extended reality (XR), and cognitive studies. While current state-of-the-art devices such as Apple's Vision Pro or Meta's Aria glasses use frame-based eye tracking with frame rates from 10 to 100 Hz and latency around 11 ms, there is a pressing need for smoother, faster, and more efficient methods to enhance user experience. By leveraging the event-based eye tracking dataset (3ET+ dataset), this challenge offers participants the opportunity to contribute cutting-edge solutions that push beyond current limitations. The top-ranked team will receive a Meta Quest 3 as a prize (sponsored by DVsense).
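To give a feel for why events suit this task: because events arrive with microsecond timestamps, an estimate can be refreshed from only the most recent activity rather than at a fixed frame rate. The toy sketch below (purely illustrative; actual 3ET+ baselines use learned models) estimates a point of interest as the centroid of events within a short time window:

```python
import numpy as np

def event_centroid(events, window_us, now_us):
    """Toy sketch: centroid of recent events as a crude point-of-interest
    estimate (e.g. a moving pupil edge generating events).

    events: sequence of (t, x, y, p); only events with timestamps within
    `window_us` of `now_us` contribute. Returns (x, y) or None if the
    window is empty. Real event-based eye trackers are far more robust.
    """
    ev = np.asarray(events, dtype=float)
    recent = ev[ev[:, 0] >= now_us - window_us]
    if len(recent) == 0:
        return None
    return recent[:, 1].mean(), recent[:, 2].mean()

# Hypothetical stream: the tracked point moves from (~11, 10) to (30, 30).
events = [(100, 10, 10, +1), (110, 12, 10, -1), (500, 30, 30, +1)]
print(event_centroid(events, window_us=50, now_us=510))  # -> (30.0, 30.0)
```

Shrinking the window trades noise robustness for latency, which is exactly the axis on which event-based trackers aim to beat 10-100 Hz frame-based pipelines.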

Eye-tracking Challenge website

Timeline:

Contact:


2. Space-time Instance Segmentation (SIS) Challenge

MouseSIS Visualization

Overview:

Challenge Page (Codabench)

Timeline:

Contact: Friedhelm Hamann (f.hamann [at] tu-berlin [dot] de)


3. Event-Based Image Deblurring Challenge

Deblur with events

Overview:

This challenge focuses on leveraging the high-temporal-resolution events from event cameras to improve image deblurring. We hope that this challenge will serve as a starting point for promoting event-based image enhancement on a broader stage and contribute to the thriving development of the event-based vision community.
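One common model in the event-based deblurring literature (the event-based double integral, EDI) expresses a blurry frame as the average of sharp latent frames over the exposure, with each latent frame related to a reference frame through the exponential of the accumulated event counts. A minimal sketch of that relation, assuming idealized noise-free events and a known contrast threshold c (not any specific challenge baseline):

```python
import numpy as np

def edi_deblur(blurry, event_counts, c):
    """Sketch of the event-based double-integral (EDI) relation.

    blurry:       (H, W) blurry frame = mean of latent frames over exposure.
    event_counts: (N, H, W) cumulative signed event counts E(t_k) at N
                  sample times in the exposure; E at the reference time
                  is the all-zero slice.
    c:            contrast threshold (assumed known and uniform).

    Model: latent(t) = latent(ref) * exp(c * E(t)), and
           blurry    = mean_t latent(t),
    so     latent(ref) = blurry / mean_t exp(c * E(t)).
    """
    denom = np.mean(np.exp(c * event_counts), axis=0)
    return blurry / denom

# Toy 1-pixel example: counts go 0 -> 1, c = ln 2, so exp terms are
# {1, 2}, their mean is 1.5, and a blurry value of 3.0 recovers 2.0.
blurry = np.full((1, 1), 3.0)
counts = np.array([[[0.0]], [[1.0]]])
print(edi_deblur(blurry, counts, np.log(2)))  # -> [[2.]]
```

In practice c is unknown and events are noisy, so competitive methods learn the mapping rather than inverting this model directly.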

Challenge Page (CodaLab)

Timeline:

Contact: Lei Sun (leo_sun [at] zju [dot] edu [dot] cn)


4. Event-Based SLAM Challenge

Overview:

The goal of this challenge is to leverage the high temporal and spatial resolution of HD event cameras for SLAM and pose estimation applications.

Please refer to the challenge website for more information about participation.

Tracks:

Timeline:

Contact:

Organizers

Important Dates

FAQs

Upcoming Workshops

See also this link

Supported by

Acknowledgments

The Microsoft CMT service was used for managing the peer-review process for this workshop. This service was provided free of charge by Microsoft, which bore all expenses, including costs for Azure cloud services as well as for software development and support.