Jörg Conradt

Principal Investigator


School of Electrical Engineering and Computer Science (EECS), Division of Computational Science and Technology (CST)

KTH Royal Institute of Technology, Sweden

Lindstedtsvägen 5
114 28 Stockholm, Sweden



Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz


Journal article


Anastasios Nikolas Angelopoulos, Julien N. P. Martel, Amit Kohli, Jörg Conradt, Gordon Wetzstein
IEEE Transactions on Visualization and Computer Graphics, 2020

Cite
APA
Angelopoulos, A. N., Martel, J. N. P., Kohli, A., Conradt, J., & Wetzstein, G. (2020). Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz. IEEE Transactions on Visualization and Computer Graphics.


Chicago/Turabian
Angelopoulos, Anastasios Nikolas, Julien N. P. Martel, Amit Kohli, Jörg Conradt, and Gordon Wetzstein. “Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz.” IEEE Transactions on Visualization and Computer Graphics (2020).


MLA
Angelopoulos, Anastasios Nikolas, et al. “Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz.” IEEE Transactions on Visualization and Computer Graphics, 2020.


BibTeX

@article{angelopoulos2020event,
  title = {Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz},
  year = {2020},
  journal = {IEEE Transactions on Visualization and Computer Graphics},
  author = {Angelopoulos, Anastasios Nikolas and Martel, Julien N. P. and Kohli, Amit and Conradt, J{\"o}rg and Wetzstein, Gordon}
}

Abstract

The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, which in practice constrain data acquisition speed to about 300 Hz. This obstructs the use of mobile eye trackers for, e.g., low-latency predictive rendering, or for studying quick and subtle eye motions such as microsaccades with head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated under the same conditions. Our system builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model with every event or every few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45°–1.75° for fields of view from 45° to 98°. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
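The abstract's two algorithmic stages can be made concrete with a short sketch. The following is a minimal illustration in Python/NumPy, not the authors' released code: it assumes recent events lie near the pupil contour, keeps them in a ring buffer, and refits a conic (ellipse) model by linear least squares as events arrive. The buffer size, the refit-on-every-event policy, and fixing the x² coefficient to 1 are illustrative choices.

import numpy as np

class OnlinePupilFitter:
    """Event-driven pupil fit: recent events are assumed to lie on the
    pupil contour, and a conic x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0
    is refit by linear least squares as events arrive."""

    def __init__(self, buffer_size=200):
        self.buf = np.zeros((buffer_size, 2))   # ring buffer of (x, y) events
        self.count = 0                          # events stored so far
        self.head = 0                           # next write position
        self.params = None                      # (B, C, D, E, F)

    def add_event(self, x, y):
        self.buf[self.head] = (x, y)
        self.head = (self.head + 1) % len(self.buf)
        self.count = min(self.count + 1, len(self.buf))
        if self.count >= 5:                     # a conic needs >= 5 points
            self._refit()

    def _refit(self):
        x, y = self.buf[:self.count, 0], self.buf[:self.count, 1]
        # Fix the x^2 coefficient to 1, move it to the right-hand side,
        # and solve for the remaining conic coefficients.
        A = np.column_stack([x * y, y ** 2, x, y, np.ones_like(x)])
        b = -(x ** 2)
        self.params, *_ = np.linalg.lstsq(A, b, rcond=None)

    def center(self):
        """Ellipse center: the point where both partial derivatives vanish."""
        B, C, D, E, _ = self.params
        M = np.array([[2.0, B], [B, 2.0 * C]])
        return np.linalg.solve(M, np.array([-D, -E]))

The point-of-gaze stage can be sketched the same way: a least-squares polynomial map, trained on calibration pairs, from the fitted pupil model to gaze angles. Regressing from the pupil center alone and using degree-2 features are simplifying assumptions here; the paper regresses from the full parametric pupil model.

def poly_features(centers, degree=2):
    """Monomials x^i * y^j with i + j <= degree for each (x, y) row."""
    x, y = centers[:, 0], centers[:, 1]
    cols = [x ** i * y ** j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def fit_gaze_regressor(pupil_centers, gaze_angles, degree=2):
    """Least-squares fit of gaze = Phi(pupil_center) @ W on calibration data."""
    Phi = poly_features(pupil_centers, degree)
    W, *_ = np.linalg.lstsq(Phi, gaze_angles, rcond=None)
    return W

def predict_gaze(W, pupil_center, degree=2):
    """Map one fitted pupil center to a (yaw, pitch) gaze estimate."""
    return (poly_features(np.atleast_2d(pupil_center), degree) @ W)[0]

In use, a calibration routine would collect (pupil center, target angle) pairs while the user fixates known points, fit W once, and then evaluate predict_gaze after every pupil update. Because each incoming event triggers only a small least-squares refit and a polynomial evaluation, the update rate is bounded by the event stream rather than by a frame clock, which is what makes kilohertz-range tracking plausible.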

