Eye tracking remains a vital component in the evolution of virtual and augmented reality headsets, and it also plays a broad role in entertainment, scientific research, medical and behavioral science, automotive safety, and industrial engineering. Tracking human eye movements with high accuracy, however, is a complex challenge.
Researchers at the University of Arizona’s Wyant College of Optical Sciences have introduced a groundbreaking approach that holds the potential to transform eye-tracking methodologies. Their findings, detailed in Nature Communications, indicate that combining a sophisticated 3D imaging method known as deflectometry with cutting-edge computational techniques can greatly enhance the capabilities of current eye-tracking technologies.
Florian Willomitzer, an associate professor of optical sciences and principal investigator of the study, noted, “Traditional eye-tracking techniques are limited to capturing the directional movement of the eyeball from a handful of surface points, typically no more than a dozen. Our deflectometry-based approach enables us to gather data from over 40,000 surface points, potentially reaching into the millions, all extracted from a single instantaneous camera image.”
Jiazhang Wang, a postdoctoral researcher in Willomitzer’s lab and the lead author of the study, added, “The increased number of data points provides significantly more information, allowing for a substantial enhancement in the accuracy of gaze direction estimation. This improvement is essential for advancing applications in virtual reality, as our method can increase the number of data points collected by more than 3,000 times compared to traditional methods.”
Deflectometry is a 3D imaging approach that measures reflective surfaces with exceptional precision. It is commonly used for inspecting large telescope mirrors and high-performance optics to identify minor flaws or deviations from their intended configuration.
The research team aims to expand the use of deflectometry beyond its standard applications in industrial inspections. At the U of A Computational 3D Imaging and Measurement Lab, Willomitzer’s group combines deflectometry with advanced computational techniques typically found in computer vision. This innovative research area, referred to as “computational deflectometry,” encompasses applications ranging from the analysis of art to 3D imaging methods for assessing skin lesions and enhanced eye tracking.
Willomitzer remarked, “This unique combination of precise measurement and advanced computation enables machines to ‘see the unseen,’ providing them with ‘superhuman vision’ that surpasses human perceptual limits.”
The research involved experiments with both human participants and a realistic artificial eye model. The results showed that the method was able to track gaze direction with an accuracy ranging from 0.46 to 0.97 degrees in human subjects, and an impressive error margin of just 0.1 degrees in the artificial model.
Unlike conventional methods that rely on a few infrared point light sources, this new technique employs a screen displaying known structured light patterns as the source of illumination. Each pixel on the screen acts as a separate point light source, enhancing data collection.
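To make this idea concrete, here is a minimal sketch (in Python with NumPy) of the kind of structured illumination described above: phase-shifted sinusoidal fringe patterns in which every screen pixel is tagged by its phase, so a camera viewing the patterns reflected off the eye can later decode which part of the screen each camera pixel is seeing. The pattern parameters and the N-step phase-shifting decoder are illustrative assumptions, not details taken from the published method.

```python
import numpy as np

def fringe_patterns(width, height, periods=16, steps=4):
    """Phase-shifted sinusoidal fringes for display on the illumination screen.

    Every screen pixel is tagged by the phase of the sinusoid at its position,
    so each pixel effectively acts as an individually identifiable point source.
    """
    x = np.arange(width) / width                      # normalized horizontal coordinate
    base_phase = 2 * np.pi * periods * x              # phase ramp across the screen
    patterns = []
    for k in range(steps):
        shift = 2 * np.pi * k / steps
        row = 0.5 + 0.5 * np.cos(base_phase + shift)  # intensities in [0, 1]
        patterns.append(np.tile(row, (height, 1)))    # same fringe in every row
    return np.stack(patterns)                         # shape: (steps, height, width)

def decode_phase(images):
    """Standard N-step phase-shifting decode.

    Given the N camera images of the reflected fringes, recover the wrapped
    phase at every camera pixel, i.e. which screen position it is observing.
    """
    steps = images.shape[0]
    shifts = 2 * np.pi * np.arange(steps) / steps
    num = np.tensordot(np.sin(shifts), images, axes=1)
    den = np.tensordot(np.cos(shifts), images, axes=1)
    return np.arctan2(-num, den)                      # wrapped phase in (-pi, pi]

# Example: decoding the ideal patterns themselves recovers the phase ramp.
patterns = fringe_patterns(width=1920, height=1080)
wrapped = decode_phase(patterns)                      # (1080, 1920) array of phases
```

In practice a second fringe set rotated by 90 degrees, plus phase unwrapping, would be needed to pin down both screen coordinates; the sketch keeps only the horizontal direction for brevity.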
By examining how these patterns deform when reflecting off the surface of the eye, researchers can acquire dense and precise 3D surface data from both the cornea (the transparent front part of the eye) and the surrounding sclera. Wang elaborated, “Our computational reconstruction utilizes this surface data along with established geometrical constraints regarding the eye’s optical axis to accurately determine gaze direction.”
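A simplified sketch of the geometric core follows. For each camera pixel, the decoded screen correspondence and the known camera position give a surface normal through the law of reflection; fitting a sphere to the dense corneal point cloud and taking the direction from an assumed eyeball rotation center through the fitted corneal center then yields a gaze estimate. The sphere-fit shortcut and the fixed eyeball center are illustrative assumptions standing in for the paper's surface optimization and geometrical constraints, not the authors' actual algorithm.

```python
import numpy as np

def reflection_normal(camera_pos, surface_pt, screen_pt):
    """Law of reflection: the surface normal bisects the directions from the
    surface point back to the camera and out to the screen pixel it mirrors."""
    to_camera = camera_pos - surface_pt
    to_screen = screen_pt - surface_pt
    n = to_camera / np.linalg.norm(to_camera) + to_screen / np.linalg.norm(to_screen)
    return n / np.linalg.norm(n)

def fit_sphere(points):
    """Linear least-squares sphere fit using |p|^2 = 2 c.p + (r^2 - |c|^2)."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    return center, np.sqrt(d + center @ center)       # center, radius

def gaze_direction(corneal_points, eyeball_center):
    """Optical-axis estimate: the line from the eyeball rotation center
    through the center of the sphere fitted to the corneal surface points."""
    corneal_center, _ = fit_sphere(corneal_points)
    axis = corneal_center - eyeball_center
    return axis / np.linalg.norm(axis)

# Synthetic check: corneal points on a sphere whose center is offset along +z
# from the eyeball center, so the recovered gaze should be roughly [0, 0, 1].
rng = np.random.default_rng(0)
cap = rng.normal(size=(2000, 3))
cap /= np.linalg.norm(cap, axis=1, keepdims=True)
corneal_pts = np.array([0.0, 0.0, 5.6]) + 7.8 * cap    # ~5.6 mm offset, ~7.8 mm radius
print(gaze_direction(corneal_pts, eyeball_center=np.zeros(3)))
```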
Previous studies have indicated the technology’s potential to integrate smoothly with virtual and augmented reality systems, possibly by incorporating a fixed embedded pattern within the headset frame or utilizing the visual content displayed within the headset. This integration could significantly simplify system design. Future iterations may also employ infrared light instead of visible light, enabling operation without distracting users with visible patterns.
Wang emphasized, “We employ stereo-deflectometry alongside innovative surface optimization algorithms to extract the maximum directional information from the cornea and sclera without introducing ambiguities. This method does not rely heavily on assumptions about the shape or surface characteristics of the eye, which can differ among individuals.”
A promising outcome of this new technology is its capacity to produce detailed and accurate surface reconstructions of the eye, potentially paving the way for real-time diagnosis and treatment of specific ocular disorders in the future.
Aiming for the Next Technological Leap
To the authors’ knowledge, this research marks the first application of deflectometry to eye tracking. Wang noted, “It is promising that our initial implementation has already shown accuracy levels that are on par with or surpass those of commercial eye-tracking systems in tests involving actual human subjects.”
With a patent application filed and plans for commercialization through Tech Launch Arizona, this research signals a new era for robust and precise eye-tracking technology. The researchers are optimistic that through further enhancements and algorithmic improvements, they can extend the boundaries of what is achievable in eye tracking for practical use cases. Future objectives include integrating additional 3D reconstruction techniques and leveraging artificial intelligence to refine the method.
“Our aspiration is to achieve accuracy levels close to the 0.1-degree precision demonstrated with the model eye experiments,” Willomitzer stated. “We envision this new methodology unlocking a new realm of eye tracking possibilities, including applications in neuroscience and psychology research.”
Co-authors of the study include Oliver Cossairt, an adjunct associate professor of electrical and computer engineering at Northwestern University, where Willomitzer and Wang initially developed the project, along with Tianfu Wang and Bingjie Xu, both former students at Northwestern.
Source: www.sciencedaily.com