Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Retina: Low-Power Eye Tracking with Event Camera and Spiking Hardware

Pietro Bonazzi, Sizhen Bian, Giovanni Lippolis, Yawei Li, Sadique Sheik, Michele Magno

2023-12-01 · Pupil Tracking · Pupil Detection
Paper · PDF · Code (official)

Abstract

This paper introduces a neuromorphic methodology for eye tracking, harnessing pure event data captured by a Dynamic Vision Sensor (DVS) camera. The framework integrates a directly trained Spiking Neural Network (SNN) regression model and leverages a state-of-the-art low-power edge neuromorphic processor, Speck, collectively aiming to advance the precision and efficiency of eye-tracking systems. First, we introduce a representative event-based eye-tracking dataset, "Ini-30", which was collected with two glass-mounted DVS cameras from thirty volunteers. Then, an SNN model based on Integrate-and-Fire (IAF) neurons, named "Retina", is described, featuring only 64k parameters (6.63x fewer than the latest) and achieving a pupil tracking error of only 3.24 pixels on a 64x64 DVS input. The continuous regression output is obtained by convolving a non-spiking temporal 1D filter across the output spiking layer. Finally, we evaluate Retina on the neuromorphic processor, showing an end-to-end power consumption between 2.89 and 4.8 mW and a latency of 5.57 to 8.01 ms, depending on the time window. We also benchmark our model against the latest event-based eye-tracking method, "3ET", which was built upon event frames. Results show that Retina achieves superior precision, with 1.24 px less pupil centroid error, and reduced computational complexity, with 35 times fewer MAC operations. We hope this work will open avenues for further investigation of closed-loop neuromorphic solutions and true event-based training pursuing edge performance.
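The abstract describes converting discrete output spikes into a continuous regression signal by sliding a non-spiking temporal 1D filter across the output spiking layer. The following is a minimal sketch of that idea in NumPy; the spike train, kernel shape, and exponential-decay weights are illustrative assumptions, not the paper's learned filter.

```python
import numpy as np

# Hypothetical binary spike train from the output layer,
# shape (T, C) = (time steps, output channels). Values are 0/1 spikes.
rng = np.random.default_rng(0)
spikes = rng.integers(0, 2, size=(20, 2)).astype(float)

# A non-spiking temporal 1D filter. An exponentially decaying kernel is
# assumed here purely for illustration; the paper's filter may differ.
kernel = np.exp(-np.arange(5) / 2.0)
kernel /= kernel.sum()  # normalize so the output stays in [0, 1]

# Slide the filter along the time axis of each channel ("valid" mode),
# turning discrete spikes into a smooth, continuous regression signal.
continuous = np.stack(
    [np.convolve(spikes[:, c], kernel, mode="valid") for c in range(spikes.shape[1])],
    axis=1,
)
print(continuous.shape)  # (16, 2): T - len(kernel) + 1 steps, 2 channels
```

Because the kernel is normalized, each output sample is a weighted average of recent spikes, which is what lets a spiking layer drive a continuous coordinate estimate such as a pupil centroid.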

Results

Task             | Dataset | Metric             | Value | Model
Object Tracking  | INI-30  | Euclidean Distance | 3.24  | Retina
Object Tracking  | INI-30  | Euclidean Distance | 4.48  | 3ET
Object Detection | INI-30  | Euclidean Distance | 0.5   | CNN
Object Detection | INI-30  | Euclidean Distance | 1.75  | TinyissimoV8

Related Papers

Continuous Pupillography: A Case for Visual Health Ecosystem (2024-10-16)
EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing (2024-09-27)
MODEL&CO: Exoplanet detection in angular differential imaging by learning across multiple observations (2024-09-23)
A Framework for Pupil Tracking with Event Cameras (2024-07-23)
Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality (2024-03-28)
3ET: Efficient Event-based Eye Tracking using a Change-Based ConvLSTM Network (2023-08-22)
Eye-tracked Virtual Reality: A Comprehensive Survey on Methods and Privacy Challenges (2023-05-23)
An Embedded and Real-Time Pupil Detection Pipeline (2023-02-27)