
June 11th or 12th (TBD), 2025, CVPR, Nashville (TN), USA.
Held in conjunction with the IEEE/CVF Conference on Computer Vision and Pattern Recognition 2025.

Welcome to the 5th International Workshop on Event-Based Vision!

Important Dates

CVPRW 2023 edition photo by S. Shiba

Objectives

Event-based cameras are bio-inspired, asynchronous sensors that offer key advantages: microsecond temporal resolution, low latency, high dynamic range, and low power consumption. These advantages open frontiers that are unthinkable with traditional (frame-based) cameras, which have been the dominant sensing technology for the past 60 years. These revolutionary sensors enable the design of a new class of efficient algorithms to track a baseball in the moonlight, build a flying robot with the agility of a bee, and perform structure from motion in challenging lighting conditions and at remarkable speeds. In the last decade, research on these sensors has attracted growing attention from industry and academia, fostering exciting advances in the field.

The workshop covers the sensing hardware as well as the processing, data, and learning methods needed to take advantage of these novel cameras. It also considers related vision sensors, such as pixel processor arrays, which perform massively parallel processing near the image plane. Because early vision computations are carried out on-sensor (mimicking the retina), the resulting systems achieve high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.
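To make the asynchronous data model concrete: each event is a tuple (x, y, t, polarity) reporting a per-pixel brightness change. A common way to feed such a stream to frame-based algorithms is to accumulate events into a spatio-temporal voxel grid. The sketch below uses synthetic random events; the field names, grid resolution, and binning scheme are illustrative assumptions, not tied to any specific camera SDK.

```python
import numpy as np

# Synthetic event stream (assumed layout): x, y pixel coordinates,
# sorted timestamps, and +/-1 polarity (brightness up/down).
rng = np.random.default_rng(0)
H, W, N = 4, 4, 100
xs = rng.integers(0, W, N)
ys = rng.integers(0, H, N)
ts = np.sort(rng.uniform(0.0, 1.0, N))
ps = rng.choice([-1, 1], N)

def events_to_voxel_grid(xs, ys, ts, ps, H, W, bins=5):
    """Accumulate event polarities into a (bins, H, W) grid over time."""
    grid = np.zeros((bins, H, W))
    # Normalize timestamps to [0, bins) and assign each event to a time bin.
    tn = (ts - ts[0]) / max(ts[-1] - ts[0], 1e-9) * (bins - 1e-9)
    bi = tn.astype(int)
    # Unbuffered accumulation: repeated (bin, y, x) indices all contribute.
    np.add.at(grid, (bi, ys, xs), ps)
    return grid

grid = events_to_voxel_grid(xs, ys, ts, ps, H, W)
print(grid.shape)  # (5, 4, 4)
```

Note the use of `np.add.at` rather than `grid[bi, ys, xs] += ps`: the latter silently drops events that land on the same cell, since fancy-index assignment is buffered.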

Topics Covered

A longer list of related topics is available in the table of contents of the List of Event-based Vision Resources.

Call for Contributions

Research papers

Research papers and demos are solicited in, but not limited to, the topics listed above.

Courtesy papers (in the poster session)

We also solicit contributions of papers relevant to the workshop that have been accepted at the CVPR main conference or at other peer-reviewed conferences or journals. These contributions will be checked for suitability (soft review) and will not be published in the workshop proceedings. Papers should be submitted in single-blind format (e.g., the accepted version is fine) and should mention if and where the paper has been accepted/published. These contributions provide visibility for your work and help build a community around the topics of the workshop.

Competitions / Challenges

1. Eye-tracking

Overview: This challenge focuses on advancing event-based eye tracking, a key technology for driving innovations in interaction technology, extended reality (XR), and cognitive studies. While current state-of-the-art devices like Apple's Vision Pro or Meta's Aria glasses use frame-based eye tracking with frame rates from 10 to 100 Hz and latency around 11 ms, there is a pressing need for smoother, faster, and more efficient methods to enhance user experience. By leveraging the event-based eye tracking dataset (3ET+ dataset), this challenge offers participants the opportunity to contribute to cutting-edge solutions that push beyond current limitations. The top-ranked team will receive a Meta Quest 3 as a prize (sponsored by DVsense).

Eye-tracking Challenge website

Timeline:

Contact:


2. Space-time Instance Segmentation (SIS) Challenge

MouseSIS Visualization

Overview:

Challenge Page (Codabench)

Timeline:

Contact: Friedhelm Hamann (f.hamann [at] tu-berlin [dot] de)


3. Event-Based Image Deblurring Challenge

Deblur with events

Overview:

This challenge focuses on leveraging the high-temporal-resolution events from event cameras to improve image deblurring. We hope that this challenge will serve as a starting point for promoting event-based image enhancement on a broader stage and contribute to the thriving development of the event-based vision community.
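One way to see why high-temporal-resolution events help deblurring is the event-based double integral (EDI) formulation: a blurry frame B is the temporal average of latent sharp frames L(t) over the exposure, and events record log-intensity changes, so L(t) = L(t0)·exp(c·E(t)) and therefore L(t0) = B / mean(exp(c·E(t))). This is one formulation from the literature, not necessarily the method expected in the challenge; the contrast threshold c and the toy two-pixel data below are assumptions for illustration.

```python
import numpy as np

c = 0.2                      # contrast threshold (assumed)
T = 8                        # discrete time steps within the exposure
L0 = np.array([1.0, 2.0])    # true sharp intensities at exposure start (toy)

# Synthetic per-pixel event counts at each time step (positive = brighter):
# pixel 0 brightens mid-exposure, pixel 1 stays constant.
events = np.zeros((T, 2))
events[3:, 0] = 1

E = np.cumsum(events, axis=0)        # integrated events E(t) up to each step
latent = L0 * np.exp(c * E)          # latent sharp frames L(t)
B = latent.mean(axis=0)              # simulated blurry frame (exposure average)

# EDI reconstruction: divide out the average exponential event integral.
L_hat = B / np.exp(c * E).mean(axis=0)
print(np.allclose(L_hat, L0))  # True
```

In this noise-free toy setting the reconstruction is exact; with a real sensor, threshold mismatch and event noise make the estimate approximate, which is where learned methods come in.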

Challenge Page (CodaLab)

Timeline:

Contact: Lei Sun (leo_sun [at] zju [dot] edu [dot] cn)


4. Event-Based SLAM Challenge

Overview:

The goal of this challenge is to leverage the high temporal and spatial resolution of HD event cameras for SLAM and pose estimation applications.

Please refer to the challenge website for more information about participation.

Tracks:

Timeline:

Contact:


Speakers

Location

Schedule

The tentative schedule is as follows:

Time (local) Session
8:00 Welcome. Session 1: Event cameras: Algorithms and applications I (Invited speakers)
10:10 Coffee break. Set up posters.
10:30 Session 2: Poster session: contributed papers, competitions, demos and courtesy presentations (as posters).
12:30 Lunch break
13:30 Session 3: Event cameras: Algorithms and applications II (Invited speakers)
15:30 Coffee break
16:00 Session 4: Hardware architectures and sensors (Invited speakers)
17:45 Award Ceremony and Final Panel Discussion.
18:00 End

Organizers

FAQs

See also this link

Acknowledgments

The Microsoft CMT service was used for managing the peer-reviewing process for this conference. This service was provided for free by Microsoft and they bore all expenses, including costs for Azure cloud services as well as for software development and support.