Event-Based Video Reconstruction

6 papers with code • 1 benchmark • 1 dataset

Event-Based Video Reconstruction aims to generate a sequence of intensity frames from an asynchronous stream of events (per-pixel brightness-change signals produced by an event camera).
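
Most learning-based reconstruction pipelines first convert the asynchronous event stream into a dense tensor, commonly a spatio-temporal voxel grid that bins signed event polarities with linear interpolation along the time axis. The sketch below is a minimal illustration of that conversion; the function name and argument layout are illustrative and not taken from any repository listed here.

```python
import numpy as np

def events_to_voxel_grid(xs, ys, ts, ps, num_bins, height, width):
    """Accumulate events (x, y, timestamp, polarity) into a
    (num_bins, height, width) voxel grid, spreading each event over the
    two nearest temporal bins with linear weights."""
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(ts) == 0:
        return voxel
    ts = np.asarray(ts, dtype=np.float64)
    # normalize timestamps to the range [0, num_bins - 1]
    t_norm = (num_bins - 1) * (ts - ts[0]) / max(ts[-1] - ts[0], 1e-9)
    t0 = np.floor(t_norm).astype(int)
    frac = t_norm - t0
    pol = np.where(np.asarray(ps) > 0, 1.0, -1.0)
    np.add.at(voxel, (t0, ys, xs), pol * (1.0 - frac))
    np.add.at(voxel, (np.clip(t0 + 1, 0, num_bins - 1), ys, xs), pol * frac)
    return voxel
```

A recurrent network such as E2VID then maps a sequence of these grids to intensity frames.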

Most implemented papers

Reducing the Sim-to-Real Gap for Event Cameras

TimoStoff/event_cnn_minimal ECCV 2020

We present strategies for improving training data for event-based CNNs that result in a 20-40% boost in the performance of existing state-of-the-art (SOTA) video reconstruction networks retrained with our method, and up to 15% for optic flow networks.
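
The sim-to-real gap arises because these networks are typically trained on synthetic events. In the idealized event-camera model, a pixel fires whenever its log intensity changes by more than a contrast threshold since its last event. The toy simulator below only sketches that model by thresholding log-intensity differences between consecutive video frames; real simulators, and the strategies studied in this paper, additionally handle inter-frame interpolation, noise, and threshold variability.

```python
import numpy as np

def frames_to_events(frames, timestamps, contrast_threshold=0.2, eps=1e-3):
    """Toy event simulation: emit events wherever the log intensity of the
    new frame differs from a per-pixel reference by more than the
    contrast threshold. Returns a list of (t, x, y, polarity) tuples."""
    ref = np.log(frames[0].astype(np.float64) + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        diff = np.log(frame.astype(np.float64) + eps) - ref
        n = np.floor(np.abs(diff) / contrast_threshold).astype(int)
        for y, x in zip(*np.nonzero(n)):
            pol = 1 if diff[y, x] > 0 else -1
            events.extend([(t, x, y, pol)] * n[y, x])
            ref[y, x] += pol * n[y, x] * contrast_threshold
    return events
```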

An Asynchronous Kalman Filter for Hybrid Event Cameras

ziweiWWANG/AKF ICCV 2021

Conventional image sensors, by contrast, measure the absolute intensity of slowly changing scenes effectively but do poorly on high-dynamic-range or quickly changing scenes.
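
This complementarity (events track fast relative changes, frames anchor absolute intensity) is the intuition behind Kalman-style fusion. As a purely illustrative per-pixel toy example, and not the asynchronous filter proposed in this paper, one can treat integrated event increments as the process model and occasional frame values as measurements:

```python
def fuse_pixel(event_increments, frame_measurements, q=0.01, r=0.1):
    """Toy 1-D Kalman filter for a single pixel's log intensity.
    Events act as the process model (each step adds a signed brightness
    increment and process noise q); frames are absolute but infrequent
    measurements with variance r. Missing frames are passed as None."""
    x, p = 0.0, 1.0   # state estimate and its variance
    trace = []
    for dL, z in zip(event_increments, frame_measurements):
        x, p = x + dL, p + q          # predict from event-driven change
        if z is not None:             # frame available at this step
            k = p / (p + r)           # Kalman gain
            x, p = x + k * (z - x), (1 - k) * p
        trace.append(x)
    return trace

# usage: two event-only steps between each frame measurement
log_intensity = fuse_pixel(
    event_increments=[0.2, 0.2, 0.0, -0.2, 0.0],
    frame_measurements=[None, None, 0.5, None, 0.1],
)
```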

Event-Based Video Reconstruction Using Transformer

warranweng/et-net ICCV 2021

Event cameras, which output events by detecting spatio-temporal brightness changes, bring a novel paradigm to image sensors with high dynamic range and low latency.

Event-based Video Reconstruction via Potential-assisted Spiking Neural Network

LinZhu111/EVSNN CVPR 2022

We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN), which utilizes Leaky-Integrate-and-Fire (LIF) neurons and Membrane Potential (MP) neurons.
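
The LIF neuron that EVSNN builds on can be stated in a few lines: the membrane potential leaks toward the input, fires a spike when it crosses a threshold, and is then reset (the MP neuron instead exposes its membrane potential as a continuous readout). The sketch below shows only the generic neuron update, not the EVSNN architecture; the time constant, threshold, and reset scheme are illustrative defaults.

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """One discrete update of a Leaky Integrate-and-Fire neuron layer.
    v: membrane potential from the previous step, x: input current.
    Returns (spikes, updated membrane potential)."""
    v = v + (x - v) / tau                      # leaky integration
    spikes = (v >= v_threshold).astype(v.dtype)
    v = np.where(spikes > 0, v_reset, v)       # hard reset after firing
    return spikes, v

# a constant input above the firing threshold drives periodic spiking
v = np.zeros(1)
for t in range(6):
    s, v = lif_step(v, np.array([1.5]))
    print(t, int(s[0]), float(v[0]))
```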

EVREAL: Towards a Comprehensive Benchmark and Analysis Suite for Event-based Video Reconstruction

ercanburak/EVREAL 30 Apr 2023

Event cameras are a new type of vision sensor that incorporates asynchronous and independent pixels, offering advantages over traditional frame-based cameras such as high dynamic range and minimal motion blur.
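
Benchmarking reconstruction quality largely comes down to comparing reconstructed frames against ground-truth frames with full-reference metrics (plus no-reference metrics when no ground truth exists). The loop below is a minimal sketch of that kind of evaluation, assuming scikit-image for the metric implementations; the metric selection is illustrative, not EVREAL's exact configuration.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate(recons, gts):
    """Average PSNR/SSIM between reconstructed and ground-truth frames,
    both given as lists of grayscale float images in [0, 1]."""
    psnr = np.mean([
        peak_signal_noise_ratio(gt, rec, data_range=1.0)
        for rec, gt in zip(recons, gts)
    ])
    ssim = np.mean([
        structural_similarity(gt, rec, data_range=1.0)
        for rec, gt in zip(recons, gts)
    ])
    return {"psnr": float(psnr), "ssim": float(ssim)}
```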

HyperE2VID: Improving Event-Based Video Reconstruction via Hypernetworks

ercanburak/HyperE2VID 10 May 2023

Event-based cameras are becoming increasingly popular for their ability to capture high-speed motion with low latency and high dynamic range.
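
A hypernetwork, in general, is a small network that generates the weights of another network conditioned on some context, letting the main network's filters adapt per input. The PyTorch sketch below illustrates only that generic idea with a made-up HyperConv2d module; it is not the HyperE2VID architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperConv2d(nn.Module):
    """A convolution whose kernel is produced by a small hypernetwork
    from a context vector, instead of being a fixed learned parameter."""
    def __init__(self, ctx_dim, in_ch, out_ch, k=3):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        self.weight_gen = nn.Sequential(
            nn.Linear(ctx_dim, 64), nn.ReLU(),
            nn.Linear(64, out_ch * in_ch * k * k),
        )

    def forward(self, x, ctx):
        # x: (1, in_ch, H, W); ctx: (ctx_dim,) context describing the input
        w = self.weight_gen(ctx).view(self.out_ch, self.in_ch, self.k, self.k)
        return F.conv2d(x, w, padding=self.k // 2)

# usage: the effective filters change with the context vector
layer = HyperConv2d(ctx_dim=8, in_ch=5, out_ch=16)
out = layer(torch.randn(1, 5, 64, 64), torch.randn(8))
print(out.shape)  # torch.Size([1, 16, 64, 64])
```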