June 12th, 2025, CVPR (Full day workshop), Nashville (TN), USA. Held in conjunction with the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2025.
Welcome to the 5th International Workshop on Event-Based Vision!
Many thanks to all who contributed and made this workshop possible!
Photo Album of the Workshop
Invited Speakers
Schedule
Time | Speaker and Title |
08:00 | SESSION 1 |
08:00 | Welcome and Organization |
08:05 | Davide Scaramuzza (University of Zurich, Switzerland). Event Cameras: a New Way of Sensing. Slides |
08:35 | Daniel Cremers (TU Munich, Germany) |
09:05 | Priyadarshini Panda (Yale University, USA). A Neuromorphic Approach To Event-based Vision: From Encoding To Adaptive Architectures. Slides |
09:40 | Introduction of 4 Challenges: Eye-tracking, Deblurring, Segmentation and SLAM. Event-Based Eye-Tracking Challenge (Slides); Event-Based Instance Segmentation Challenge (Slides); Event-Based Image Deblurring Challenge (Slides); Event-Based SLAM Challenge (Slides) |
10:00 | Coffee break. Set up posters |
10:30 | SESSION 2 |
10:30 | Poster session: contributed papers, demos, challenges and courtesy papers (as posters). Boards 280 -- 359. |
12:30 | Lunch break |
13:30 | SESSION 3 |
13:30 | Niklas Funk (TU Darmstadt, Germany). Event-based Optical Tactile Sensing For Robotic Manipulation. Slides |
13:55 | Christopher Metzler (University of Maryland, USA). Computational Imaging with Event Cameras. Slides |
14:20 | Vladislav Golyanik (Max Planck Institute, Germany). Event-based Non-rigid 3D Reconstruction and Novel-View Synthesis. Slides |
14:45 | Ziwei Wang (ANU, Australia). Seeing through Space and Time: Asynchronous Event Blob Tracking, Detection and Optical Communication. Slides |
15:10 | Coffee break |
16:00 | SESSION 4 |
16:00 | Hussain Sajwani (Khalifa University, UAE). Neuromorphic Vision in Aerospace Automation. Slides |
16:20 | Shintaro Shiba (Woven by Toyota, Japan). Real-world Event Camera Applications Towards Visible-Light Communication. Slides |
16:40 | Scott McCloskey (Kitware, USA). Centroiding Point-Objects with Event Cameras. Slides |
17:00 | Kynan Eng (SynSense Group, Switzerland). |
17:30 | Davide Migliore (Tempo Sense, USA). Harnessing the Power of AI for Smart Sensing. Slides |
17:50 | Award Ceremony |
Objectives
Topics Covered
- Event-based / neuromorphic vision.
- Algorithms: motion estimation, visual(-inertial) odometry, SLAM, 3D reconstruction, image intensity reconstruction, optical flow estimation, recognition, segmentation, feature/object detection, visual tracking, calibration, action understanding, sensor fusion (video synthesis, events and RGB, events and LiDAR, etc.), model-based, embedded, or learning-based approaches.
- Event-based representation, signal processing, and control.
- Event-based active vision, event-based sensorimotor integration.
- Event camera datasets and/or simulators.
- Applications in: computational photography, robotics (navigation, manipulation, drones, obstacle avoidance, human-robot interaction,…), automotive, IoT, AR/VR (e.g., smart eyewear), space science, automated inspection, surveillance, crowd counting, physics, biology.
- Novel hardware (cameras, neuromorphic processors, etc.) and/or software platforms, such as fully event-based systems (end-to-end).
- New trends and challenges in event-based and/or biologically-inspired vision (SNNs, Reservoir Computing, etc.).
- Efficient computing architectures for event-based processing (e.g., HD Computing, State Space Models).
- Near-focal plane processing, such as pixel processor arrays (PPAs).
A longer list of related topics is available in the table of contents of the List of Event-based Vision Resources.
Proceedings at The Computer Vision Foundation (CVF)
List of accepted papers and live demos
- Dynamic EventNeRF: Reconstructing General Dynamic Scenes from Multi-view RGB and Event Streams
- Event Quality Score (EQS): Assessing the Realism of Simulated Event Camera Streams via Distances in Latent Space
- E-VLC: A Real-World Dataset for Event-based Visible Light Communication And Localization
- EV-Flying: an Event-based Dataset for In-The-Wild Recognition of Flying Objects
- BiasBench: A reproducible benchmark for tuning the biases of event cameras
- MTevent: A Multi-Task Event Camera Dataset for 6D Pose Estimation and Moving Object Detection
- Looking into the Shadow: Recording a Total Solar Eclipse with High-resolution Event Cameras
- Probabilistic Online Event Downsampling
- Making Every Event Count: Balancing Data Efficiency and Accuracy in Event Camera Subsampling
- Learning from Noise: Enhancing DNNs for Event-Based Vision through Controlled Noise Injection
- Towards Low-Latency Event-based Obstacle Avoidance on a FPGA-Drone
- Real-Time Pedestrian Detection at the Edge on a Fully Asynchronous Neuromorphic System
- DELTA: Dense Depth from Events and LiDAR using Transformer’s Attention
- Quadrocular, Neuromorphic Stereo Triangulation and Asynchronous Data Fusion for 3D Object Tracking
- Nanoparticle Diameter Measurements With Event Camera Tracking
- Egocentric Event-Based Vision for Ping Pong Ball Trajectory Prediction
- Event-Driven Dynamic Attention for Multi-Object Tracking on Neuromorphic Hardware
- Event-based Tracking and Imaging of Randomly Moving Objects in Dense Dynamical Scattering Media
- Perturbed State Space Feature Encoders for Optical Flow with Event Cameras
- Spatio-Temporal State Space Model For Efficient Event-Based Optical Flow
- Best Linear Unbiased Estimation for 2D and 3D Flow with Event-based Cameras
- Iterative Event-based Motion Segmentation by Variational Contrast Maximization
- EV-LayerSegNet: Self-supervised Motion Segmentation using Event Cameras
- Seeing like a Cephalopod: Colour Vision with a Monochrome Event Camera
- Event-based Continuous Color Video Decompression from Single Frames
- Reading in the Dark with Foveated Event Vision
- Human-Robot Navigation using Event-based Cameras and Reinforcement Learning
- E-BARF: Bundle Adjusting Neural Radiance Fields from a Moving Event Camera
- Event-based eye tracking. Event-based Vision Workshop 2025
- BRAT: Bidirectional Relative Positional Attention Transformer for Event-based Eye tracking
- Exploring Temporal Dynamics in Event-based Eye Tracker
- Dual-Path Enhancements in Event-Based Eye Tracking: Augmented Robustness and Adaptive Temporal Modeling
- Live Demonstration: Augmented Reality Applications Using Active Markers With An Event Camera
- Live Demonstration: Point-Feature Tracking for Pixel Processor Arrays
- Live Demonstration: NeuroTouch - A Neuromorphic Vision-based Tactile Sensor for Real-Time Gesture Recognition
- Live Demonstration: Real-time event-data processing with Graph Convolutional Neural Networks and SoC FPGA
- Reading Events Hierarchically to Reduce Latency
List of Courtesy papers
- NTIRE 2025 Challenge on Event-Based Image Deblurring: Methods and Results, CVPRW 2025.
- Graph Neural Network Combining Event Stream and Periodic Aggregation for Low-Latency Event-based Vision, CVPR 2025.
- Repurposing Pre-trained Video Diffusion Models for Event-based Video Interpolation, CVPR 2025.
- ETAP: Event-based Tracking of Any Point, CVPR 2025.
- EBS-EKF: Accurate and High Frequency Event-based Star Tracking, CVPR 2025.
- Event fields: Capturing light fields at high speed, resolution, and dynamic range, CVPR 2025.
- On-Device Self-Supervised Learning of Low-Latency Monocular Depth from Only Events, CVPR 2025.
- Bridge Frame and Event: Common Spatiotemporal Fusion for High-Dynamic Scene Optical Flow, CVPR 2025.
- Ev-3DOD: Pushing the Temporal Boundaries of 3D Object Detection with Event Cameras, CVPR 2025.
- TimeTracker: Event-based Continuous Point Tracking for Video Frame Interpolation with Non-linear Motion, CVPR 2025.
- EventFly: Event Camera Perception from Ground to the Sky, CVPR 2025.
- DiET-GS: Diffusion Prior and Event Stream-Assisted Motion Deblurring 3D Gaussian Splatting, CVPR 2025.
- Event Ellipsometer: Event-based Mueller-Matrix Video Imaging, CVPR 2025.
- IncEventGS: Pose-Free Gaussian Splatting from a Single Event Camera, CVPR 2025.
- PS-EIP: Robust Photometric Stereo Based on Event Interval Profile, CVPR 2025.
- Efficient Event-Based Object Detection: A Hybrid Neural Network with Spatial and Temporal Attention, CVPR 2025.
- EMBA: Event-based Mosaicing Bundle Adjustment, ECCV 2024.
- Neuromorphic Facial Analysis with Cross-Modal Supervision, ECCVW 2024.
- Distance Estimation in Outdoor Driving Environments Using Phase-only Correlation Method with Event Cameras, IEEE IV 2025.
- Evaluation of Mobile Environment for Vehicular Visible Light Communication Using Multiple LEDs and Event Cameras, IEEE IV 2025.
- EvMAPPER: High Altitude Orthomapping with Event Cameras, ICRA 2025.
- Neural Inertial Odometry from Lie Events, RSS 2025.
Competitions / Challenges
1. Eye-tracking
Timeline:
- Challenge Start: February 15, 2025
- Challenge End: March 15, 2025
- Top-ranking teams will be invited to submit a factsheet, code, and a workshop paper after the competition ends; submission deadline: March 25, 2025
- Top-ranking teams will be invited to co-author the challenge report; deadline: April 5, 2025
- Paper review deadline: April 5, 2025
Contact:
- Prof. Qinyu Chen (q.chen [at] liacs [dot] leidenuniv [dot] nl)
- Prof. Chang Gao (chang.gao [at] tudelft [dot] nl)
2. Space-time Instance Segmentation (SIS) Challenge
Overview:
- Task: Predict mask-accurate tracks of all mouse instances from input events (and optional frames).
- Data: This challenge is based on the MouseSIS dataset.
- Two Tracks: (1) Frame + Events Track, and (2) Events-only Track.
Timeline:
- February 7, 2025: Challenge opens for submissions
- May 23, 2025: Challenge closes, final submission deadline
- May 26, 2025: Winners announced. Top teams are invited to submit factsheets and code, collaborate on the challenge report, and present a poster at the CVPR workshop.
- June 6, 2025: Deadline for top teams to submit factsheets, code, and the challenge report
- June 12, 2025: Results presentation (posters) at the CVPR 2025 Workshop on Event-based Vision
Contact: Friedhelm Hamann (f.hamann [at] tu-berlin [dot] de)
3. Event-Based Image Deblurring Challenge
Overview:
This challenge focuses on leveraging the high-temporal-resolution events from event cameras to improve image deblurring. We hope that this challenge will serve as a starting point for promoting event-based image enhancement on a broader stage and contribute to the thriving development of the event-based vision community.
- Task: To design a network / solution that fuses events and images to produce high-quality deblurred results with the best performance (i.e., PSNR).
- Data: This challenge is based on the HighREV dataset.
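Since PSNR is the ranking metric, here is a minimal sketch of how PSNR is typically computed between a restored image and its ground truth (assuming 8-bit images with peak value 255; the challenge's official evaluation script may differ in details such as color handling or border cropping):

```python
import numpy as np

def psnr(pred: np.ndarray, target: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between a restored image and ground truth."""
    mse = np.mean((pred.astype(np.float64) - target.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a ground-truth image and a slightly perturbed "deblurred" estimate.
rng = np.random.default_rng(0)
gt = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
est = np.clip(gt.astype(np.int16) + rng.integers(-5, 6, size=gt.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(est, gt):.2f} dB")
```

Higher PSNR means the restored image is closer to the sharp ground truth; small per-pixel errors already correspond to fairly high values on the dB scale.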
Timeline:
- February 10, 2025: Challenge opens for submissions
- March 15, 2025: Final test data release
- March 21, 2025: Challenge ends: submission deadline to upload results on the final test data
- March 22, 2025: Fact sheets and code/executable submission deadline
- March 24, 2025: Preliminary test results release to the participants
- March 29, 2025: Paper submission deadline for entries from the challenge, as per the NTIRE-W page
- June 11-12, 2025: Results presentation at CVPR 2025 Workshop NTIRE (June 11) and/or Workshop on Event-based Vision (June 12), as a poster.
Contact: Lei Sun (leo_sun [at] zju [dot] edu [dot] cn)
4. Event-Based SLAM Challenge
Overview:
The goal of this challenge is to leverage the high temporal and spatial resolution of HD event cameras for SLAM and pose estimation applications.
Please refer to the challenge website for more information about participation.
Tracks:
- Event (+ IMU): if you obtain your pose using a single or a pair of event cameras, with or without IMU.
- Event + Mono (+ IMU): if you obtain your pose using a single or a pair of event cameras fused with monocular global shutter cameras, with or without IMU.
Timeline:
- March 1, 2025: Challenge opens for submissions
- June 9, 2025 (was June 2): Challenge ends
- June 9, 2025 (was June 8): The top submissions should send their code for manual evaluation, report, and posters
- June 10, 2025 (was June 4): Winners announced
- June 12, 2025: Posters presented at the CVPR Workshop on Event-based vision.
- After the workshop: The top submissions are invited to collaborate on a report for the challenge.
Contact:
- Fernando Cladera (fclad [at] seas.upenn.edu)
- Dr. Kenneth Chaney (chaneyk [at] seas.upenn.edu)
Organizers
- Guillermo Gallego, TU Berlin, Germany.
- Kostas Daniilidis, University of Pennsylvania, USA.
- Cornelia Fermüller, University of Maryland, USA.
- Daniele Perrone, Prophesee, France.
- Davide Migliore, Tempo Sense, USA.
Important Dates
- Paper submission deadline: March 19 (was March 12), 2025 (23:59h PST). Submission website (CMT)
- Demo abstract submission: March 19 (was March 12), 2025 (23:59h PST)
- Reviewers fill in review reports: March 28
- Organizers notify authors: March 31
- Authors of accepted papers submit meta-data via Google form: March 31
- Authors receive instructions for camera-ready preparation: by April 6
- Authors submit camera-ready paper: April 14, 2025 (as per CVPR website, firm deadline set by IEEE) using IEEE Computer Society's Conference Publishing Services (CPS)
- Early-bird registration: April 30. Register for the Workshop / Conference.
- Workshop day: June 12th, 2025. Full day workshop, start at 8 am.
FAQs
- What is an event camera? Watch this video explanation.
- What are possible applications of event cameras? Check the TPAMI 2022 review paper.
- Where can I buy an event camera? From Inivation, Prophesee, Lucid, etc.
- Are there datasets and simulators that I can play with? Yes, Dataset. Simulator. More.
- Is there any online course about event-based vision? Yes, check this course at TU Berlin.
- What is the SCAMP sensor? Read this page explanation.
- What are possible applications of the SCAMP sensor? Some applications can be found here.
- Where can I buy a SCAMP sensor? Contact Prof. Piotr Dudek.
- Where can I find more information? Check out this List of Event-based Vision Resources.
Related Workshops
- CVPR 2023 Fourth International Workshop on Event-based Vision. Videos
- CVPR 2021 Third International Workshop on Event-based Vision. Videos
- CVPR 2019 Second International Workshop on Event-based Vision. Videos
- ICRA 2017 First International Workshop on Event-based Vision. Videos
Upcoming Workshops
- ICCV 2025: 2nd Workshop on Neuromorphic Vision (NeVi 2025)
- IROS 2025: NeuRobots 2025. Workshop on Neuromorphic Perception for Real World Robotics
See also this link
Supported by
Acknowledgments
The Microsoft CMT service was used for managing the peer-reviewing process for this conference. This service was provided for free by Microsoft and they bore all expenses, including costs for Azure cloud services as well as for software development and support.