
GRAPH NEURAL NETWORK COMBINING EVENT STREAM AND PERIODIC AGGREGATION FOR LOW-LATENCY EVENT-BASED VISION

UNIVERSITY GRENOBLE ALPES, CEA-LIST, PROPHESEE


Manon Dampfhoffer, Thomas Mesquida, Damien Joubert, Thomas Dalgaty, Pascal Vivet, Christoph Posch

ABSTRACT

Event-based cameras asynchronously detect changes in light intensity with high temporal resolution, making them a promising alternative to RGB cameras for low-latency and low-power optical flow estimation. However, state-of-the-art convolutional neural network methods build frames from the event stream, losing the opportunity to exploit events for both sparse computation and low-latency prediction. Asynchronous event-graph methods, on the other hand, could leverage both, but at the cost of avoiding any form of temporal accumulation, which limits prediction accuracy. In this paper, we propose to break this accuracy-latency trade-off with a novel architecture combining an asynchronous, accumulation-free event branch and a periodic aggregation branch. The periodic branch performs feature aggregation on the event graphs of past data to extract global context information, which improves accuracy without introducing any latency. The solution could predict optical flow per event with a latency of tens of microseconds on asynchronous hardware, a gain of three orders of magnitude with respect to state-of-the-art frame-based methods, with 48x fewer operations per second. We show that the solution can detect rapid motion changes faster than a periodic output. This work proposes, for the first time, an effective solution for ultra-low-latency and low-power optical flow prediction from event cameras.
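The abstract only outlines the dual-branch idea, so the following is a minimal PyTorch sketch of that concept, not the authors' implementation: a per-event asynchronous path that runs once per incoming event with no temporal accumulation, fused with a global-context vector that a periodic branch refreshes off the critical path. All class, method, and dimension names (DualBranchEventGNN, refresh_context, the mean-pooled context, the event feature layout) are hypothetical illustrations; the paper's actual graph construction and aggregation operators may differ.

```python
import torch
import torch.nn as nn


class DualBranchEventGNN(nn.Module):
    """Hypothetical sketch of a dual-branch event network:
    an accumulation-free per-event branch fused with a global
    context vector refreshed periodically from past events."""

    def __init__(self, event_dim=4, hidden_dim=32, context_dim=32, out_dim=2):
        super().__init__()
        # Event branch: embeds one event together with its local graph neighbours.
        self.event_mlp = nn.Sequential(
            nn.Linear(event_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Periodic branch: aggregates a buffer of past events into global context
        # (simplified here to an MLP followed by mean pooling).
        self.context_mlp = nn.Sequential(
            nn.Linear(event_dim, context_dim), nn.ReLU(),
        )
        # Head fuses per-event features with the most recent global context
        # to predict a 2D optical flow vector per event.
        self.head = nn.Linear(hidden_dim + context_dim, out_dim)
        self.register_buffer("context", torch.zeros(context_dim))

    @torch.no_grad()
    def refresh_context(self, past_events):
        # Periodic aggregation over past data; runs at a fixed rate,
        # so it adds no latency to the per-event prediction path.
        self.context = self.context_mlp(past_events).mean(dim=0)

    def forward(self, event, neighbours):
        # Asynchronous path: executed once per incoming event, using only
        # the event, its local neighbours, and the cached global context.
        nodes = torch.cat([event[None], neighbours], dim=0)
        feat = self.event_mlp(nodes).max(dim=0).values
        return self.head(torch.cat([feat, self.context], dim=0))


# Usage sketch with random (x, y, t, polarity) features:
model = DualBranchEventGNN()
events = torch.randn(256, 4)            # buffer of past events
model.refresh_context(events)           # periodic branch, off the critical path
flow = model(events[-1], events[-8:])   # low-latency per-event flow prediction
```

The key design point this sketch tries to convey is that the expensive aggregation sits entirely in refresh_context, while the forward pass touches only a small local neighbourhood plus a cached vector, which is what would make per-event prediction cheap on asynchronous hardware.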

Source: openaccess.thecvf.com
