

June 19, 2023 (Monday), the 2nd day of CVPR, in Vancouver, Canada. Starts at 10 am Eastern Time (4 pm Central European Time).
Held in conjunction with the IEEE Conference on Computer Vision and Pattern Recognition 2023, as part of the track: CV for non-traditional modalities.

Welcome to the 4th International Workshop on Event-Based Vision!

Important Dates


This workshop is dedicated to event-based cameras, smart cameras, and algorithms processing data from these sensors. Event-based cameras are bio-inspired sensors with the key advantages of microsecond temporal resolution, low latency, very high dynamic range, and low power consumption. Because of these advantages, event-based cameras open frontiers that are unthinkable with standard frame-based cameras (which have been the main sensing technology for the past 60 years). These revolutionary sensors enable the design of a new class of algorithms to track a baseball in the moonlight, build a flying robot with the agility of a bee, and perform structure from motion in challenging lighting conditions and at remarkable speeds.

These sensors became commercially available in 2008 and are slowly being adopted in computer vision and robotics. In recent years they have received attention from large companies: the event-sensor company Prophesee collaborated with Intel and Bosch on a high spatial resolution sensor, Samsung announced mass production of a sensor to be used on hand-held devices, and event cameras have been used in various applications on neuromorphic chips such as IBM's TrueNorth and Intel's Loihi.

The workshop also considers novel vision sensors, such as pixel processor arrays (PPAs), which perform massively parallel processing near the image plane. Because early vision computations are carried out on-sensor, the resulting systems have high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance. This workshop will cover the sensing hardware, as well as the processing and learning methods needed to take advantage of the above-mentioned novel cameras.
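To make the data format concrete: an event camera outputs an asynchronous stream of events, each typically a tuple (x, y, t, p) of pixel coordinates, a microsecond timestamp, and a polarity (+1 for a brightness increase, -1 for a decrease). The sketch below (a hypothetical illustration, not from any workshop paper) accumulates such a stream into a signed 2D histogram, one of the simplest ways frame-based algorithms consume event data.

```python
import numpy as np

def events_to_frame(events, width, height):
    """Accumulate an event stream into a signed 2D histogram ("event frame").

    Each event is (x, y, t, p); timestamps are ignored here, since this
    simple representation only counts polarity per pixel.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, p in events:
        frame[y, x] += p  # +1 for ON events, -1 for OFF events
    return frame

# Example: two ON events at pixel (1, 0) and one OFF event at (2, 1).
events = [(1, 0, 10, +1), (1, 0, 25, +1), (2, 1, 40, -1)]
frame = events_to_frame(events, width=4, height=2)
# frame[0, 1] == 2 and frame[1, 2] == -1
```

Richer representations used in the literature (voxel grids, time surfaces) additionally exploit the timestamps, which this minimal histogram discards.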

Topics Covered


The tentative schedule is the following:

| Time (local) | Session |
|---|---|
| 8:00 | Session 1: Event-based cameras and neuromorphic computing (invited speakers) |
| 10:10 | Coffee break |
| 10:30 | Session 2: Poster session (contributed papers and courtesy presentations, as posters); live demonstrations |
| 12:30 | Lunch break |
| 13:30 | Session 3: Applications, algorithms and architectures (invited speakers) |
| 15:30 | Coffee break |
| 16:00 | Session 4: Industrial session (invited speakers) |
| 17:45 | Award ceremony and final panel discussion |
| 18:00 | End |

Accepted Papers

  1. M3ED: Multi-Robot, Multi-Sensor, Multi-Environment Event Dataset, Dataset, Code
  2. PDAVIS: Bio-inspired Polarization Event Camera, and Suppl mat
  3. Asynchronous Events-based Panoptic Segmentation using Graph Mixer Neural Network, and Suppl mat, Code and data
  4. Shining light on the DVS pixel: A tutorial and discussion about biasing and optimization, Tool
  5. Event-IMU fusion strategies for faster-than-IMU estimation throughput, and Suppl mat
  6. Fast Trajectory End-Point Prediction with Event Cameras for Reactive Robot Control, and Suppl mat, Code
  7. Exploring Joint Embedding Architectures and Data Augmentations for Self-Supervised Representation Learning in Event-Based Vision, and Suppl mat, Code
  8. Event-based Blur Kernel Estimation For Blind Motion Deblurring
  9. Neuromorphic Event-based Facial Expression Recognition, Dataset
  10. Low-latency monocular depth estimation using event timing on neuromorphic hardware
  11. Frugal event data: how small is too small? A human performance assessment with shrinking data
  12. Flow cytometry with event-based vision and spiking neuromorphic hardware, Code
  13. How Many Events Make an Object? Improving Single-frame Object Detection on the 1 Mpx Dataset, and Suppl mat, Code
  14. EVREAL: Towards a Comprehensive Benchmark and Analysis Suite for Event-based Video Reconstruction, and Suppl mat, Code
  15. X-maps: Direct Depth Lookup for Event-based Structured Light Systems, Code
  16. PEDRo: an Event-based Dataset for Person Detection in Robotics, Dataset
  17. Density Invariant Contrast Maximization for Neuromorphic Earth Observations, and Suppl mat, Code
  18. Entropy Coding-based Lossless Compression of Asynchronous Event Sequences, and Suppl mat
  19. MoveEnet: Online High-Frequency Human Pose Estimation with an Event Camera, and Suppl mat, Code
  20. Predictive Coding Light: learning compact visual codes by combining excitatory and inhibitory spike timing-dependent plasticity
  21. Neuromorphic Optical Flow and Real-time Implementation with Event Cameras, and Suppl mat
  22. HUGNet: Hemi-Spherical Update Graph Neural Network applied to low-latency event-based optical flow, and Suppl mat
  23. End-to-end Neuromorphic Lip-reading
  24. Sparse-E2VID: A Sparse Convolutional Model for Event-Based Video Reconstruction Trained with Real Event Noise, and Suppl mat, Video
  25. Within-Camera Multilayer Perceptron DVS Denoising, and Suppl mat
  26. Interpolation-Based Event Visual Data Filtering Algorithms, Code

Live Demonstrations

  1. PINK: Polarity-based Anti-flicker for Event Cameras, Video
  2. Event-based Visual Microphone, and Suppl mat
  3. E2P–Events to Polarization Reconstruction from PDAVIS Events, and Suppl mat, Code
  4. SCAMP-7
  5. ANN vs SNN vs Hybrid Architectures for Event-based Real-time Gesture Recognition and Optical Flow Estimation
  6. Tangentially Elongated Gaussian Belief Propagation for Event-based Incremental Optical Flow Estimation, Code
  7. Real-time Event-based Speed Detection using Spiking Neural Networks
  8. Integrating Event-based Hand Tracking Into TouchFree Interactions, and Suppl mat

Courtesy presentations (in the poster session)

We also invite courtesy presentations of papers relevant to the workshop that have been accepted at the CVPR main conference or at other peer-reviewed conferences or journals. These presentations provide visibility to your work and help build a community around the topics of the workshop. These contributions will be checked for relevance to the workshop, but will not undergo a complete review and will not be published in the workshop proceedings. Please contact the organizers to make arrangements to showcase your work at the workshop.




Supported by