Controllable Unsupervised Event-Based Video Generation

Abstract

The advent of event cameras, which asynchronously sense brightness changes and thereby capture the edges of moving objects, has opened new directions in video generation. So far, however, the integration of event data into controllable video generation has remained largely unexplored. Addressing this gap, we introduce a framework that leverages the edge information carried by events and combines it with textual descriptions to synthesize videos without extensive training. The framework marks a pioneering venture into event-based video generation using diffusion models. Comprehensive evaluations demonstrate that our framework outperforms existing methods. Code is available at: https://github.com/IndigoPurple/CUBE.
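To make the edge-information claim concrete, here is a minimal sketch of how a raw event stream could be accumulated into an edge-like conditioning image. This is an illustrative assumption, not the paper's actual preprocessing: the event tuple layout `(x, y, t, polarity)` and the function name `events_to_edge_map` are hypothetical.

```python
import numpy as np

def events_to_edge_map(events, height, width):
    """Accumulate event polarities into a 2D edge-like map.

    `events` is a sequence of (x, y, t, polarity) tuples. Because an
    event camera only fires where brightness changes, the accumulated
    map highlights pixels swept by moving contours, i.e. object edges.
    (Hypothetical preprocessing; the paper's pipeline may differ.)
    """
    edge_map = np.zeros((height, width), dtype=np.float32)
    for x, y, _t, p in events:
        # Positive events add, negative events subtract.
        edge_map[int(y), int(x)] += 1.0 if p > 0 else -1.0
    # Use the magnitude, normalized to [0, 1], as a conditioning image.
    mag = np.abs(edge_map)
    if mag.max() > 0:
        mag /= mag.max()
    return mag

# A few synthetic events tracing a vertical edge at x = 2.
events = [(2, 0, 0.00, 1), (2, 1, 0.05, 1), (2, 2, 0.10, -1)]
edge = events_to_edge_map(events, height=4, width=4)
```

Such a normalized map could then serve as the spatial control signal alongside the text prompt in a conditioned diffusion model.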
