Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera
Drakopoulos, Panagiotis; Koulieris, George-Alex; Mania, Katerina
Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) are currently based on uncomfortable head tracking controlling a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing time, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present an innovative mobile VR eye tracking methodology utilizing only the eye images from the front-facing (selfie) camera through the headset’s lens, without any modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once, increasing iris tracking speed by reducing the iris search space on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. We display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric calculates the probability of successful iris detection. Calibration and linear gaze mapping between the estimated iris centroid and physical pixels on the screen result in low-latency, real-time iris tracking. A formal study confirmed that our system’s accuracy is similar to that of eye trackers in commercial VR headsets in the central part of the headset’s field-of-view. In a VR game, gaze-driven completion time was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
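The abstract describes a calibration step followed by a linear gaze mapping from the estimated iris centroid to physical screen pixels. The snippet below is a minimal sketch of that kind of calibration-based linear mapping, assuming an affine model fitted by least squares; the function names, the nine-point calibration grid, and the synthetic samples are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a calibration-based linear (affine) gaze mapping,
# fitted with least squares. Names and the affine form are assumptions
# for illustration, not the paper's exact formulation.
import numpy as np


def fit_gaze_mapping(iris_centroids, screen_points):
    """Fit screen = [cx, cy, 1] @ A from calibration samples.

    iris_centroids: (N, 2) iris centres recorded during calibration.
    screen_points:  (N, 2) known on-screen target positions (pixels).
    """
    iris = np.asarray(iris_centroids, dtype=float)
    screen = np.asarray(screen_points, dtype=float)
    # Homogeneous iris coordinates allow a constant offset term.
    X = np.hstack([iris, np.ones((iris.shape[0], 1))])    # (N, 3)
    # Least-squares solve for a 3x2 affine matrix.
    A, *_ = np.linalg.lstsq(X, screen, rcond=None)         # (3, 2)
    return A


def map_gaze(A, iris_centroid):
    """Map a single iris centroid to screen pixel coordinates."""
    cx, cy = iris_centroid
    return np.array([cx, cy, 1.0]) @ A


# Example: hypothetical nine-point calibration grid on a 1920x1080 screen.
targets = [(x, y) for y in (100, 540, 980) for x in (160, 960, 1760)]
# Fake iris centroids, generated as a linear function of the targets.
centroids = [(0.05 * x + 3, 0.05 * y - 2) for x, y in targets]

A = fit_gaze_mapping(centroids, targets)
print(map_gaze(A, centroids[4]))  # ~ centre target (960, 540)
```

In this sketch the mapping is refit only during calibration; at runtime each new iris centroid is converted to a screen coordinate with a single matrix multiply, which keeps the per-frame cost negligible.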
Citation
Drakopoulos, P., Koulieris, G.-A., & Mania, K. (2021). Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera. ACM Transactions on Applied Perception, 18(3), 1-20. https://doi.org/10.1145/3456875
| Journal Article Type | Article |
| --- | --- |
| Online Publication Date | May 20, 2021 |
| Publication Date | 2021-07 |
| Deposit Date | Sep 20, 2021 |
| Publicly Available Date | Nov 2, 2021 |
| Journal | ACM Transactions on Applied Perception |
| Print ISSN | 1544-3558 |
| Electronic ISSN | 1544-3965 |
| Publisher | Association for Computing Machinery (ACM) |
| Peer Reviewed | Peer Reviewed |
| Volume | 18 |
| Issue | 3 |
| Article Number | 11 |
| Pages | 1-20 |
| DOI | https://doi.org/10.1145/3456875 |
| Public URL | https://durham-repository.worktribe.com/output/1234106 |
Files
Accepted Journal Article (PDF, 6.2 MB)
Copyright Statement
© Owner/Author | ACM 2021. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Transactions on Applied Perception, https://doi.org/10.1145/3456875