UF researchers making virtual reality accessible to all eyes

A VR point-and-shoot game with the gaze pointer visualized as a circle and a pinch gesture used to destroy the target.

A recent study led by a University of Florida engineering professor is challenging long-held assumptions in virtual reality design, particularly the belief that all users see the world the same way. 

The research, led by UF engineering professor Eakta Jain, Ph.D., shows how traditional eye-tracking methods, which assume coordinated eye movements, overlook individuals with visual impairments, and it offers practical solutions for more inclusive virtual reality design.

“Every person with an eye impairment is unique. You can’t just say here is a typical normal eye or here is a typical abnormal eye. There is a range,” said Jain. 

In a project funded in part by the National Science Foundation, Jain’s research group collected qualitative and quantitative data from 11 participants with self-reported strabismus and amblyopia. Strabismus (commonly called crossed eyes) involves misalignment of the eyes, while amblyopia (known as lazy eye) is a vision disorder in which the two eyes don’t work together.

Participants took part in a session divided into three phases: pre-gameplay, an immersive, interactive virtual reality game, and post-gameplay. They shared their experiences through interviews, cybersickness questionnaires, in-game performance metrics, and post-session self-reports covering challenges and coping mechanisms.

Published in April in ACM Transactions on Applied Perception, the findings highlight how current assumptions in eye tracking hinder these users’ participation and enjoyment in virtual reality. VR platforms assume a ‘normal’ eye, or that both eyes are aligned; that assumption does not hold for roughly 2-4% of the population, between 5 and 15 million people in the United States.

“Eye tracking is used for authentication, interaction and contextual artificial intelligence. It is becoming the basic underlying technology in the next generation of computing interfaces,” said Jain. “Not just gaming, but also social interaction, education, vocational training such as construction work and therapeutic purposes. So, it’s important to think about what types of assumptions are being made that may unintentionally exclude users.” 

The team traced three key limitations in existing virtual reality design to these inherent assumptions about the eyes:

  1. Calibration and validation. During calibration and validation, the user is shown a set of visual targets and asked to look at them. For participants with typical vision, gaze data points cluster tightly around the targets, whereas data from the non-dominant eye of users with strabismus or amblyopia is scattered widely. A user with this type of impairment may not even be able to proceed past the initial setup phase (see the first sketch after this list).
  2. Identifying areas of interest (AOI). Gaze data is used to identify areas of interest in a scene based on which object the participant is looking at and for how long. This AOI analysis can be less effective for users with eye variations such as amblyopia (see the second sketch below).
  3. Gaze-based interaction. Some VR experiences rely heavily on eye gaze for interaction. For people with strabismus or amblyopia, this can cause frustration, fatigue, or inaccurate inputs, limiting their enjoyment and participation.
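To make the calibration issue concrete, here is a minimal sketch, in Python, of a validator that scores each eye separately rather than assuming both eyes converge on the target. The function names, the 1.5-degree threshold, and the sample data are illustrative assumptions, not code from the study.

```python
import numpy as np

def gaze_dispersion(samples, target):
    # RMS angular distance (degrees) of gaze samples from a calibration target.
    offsets = np.linalg.norm(np.asarray(samples) - np.asarray(target), axis=1)
    return float(np.sqrt(np.mean(offsets ** 2)))

def validate_calibration(left_samples, right_samples, target, threshold_deg=1.5):
    # Validate each eye on its own so an application can fall back to the
    # better eye instead of rejecting the user at the setup phase.
    left = gaze_dispersion(left_samples, target)
    right = gaze_dispersion(right_samples, target)
    return {
        "left_ok": left <= threshold_deg,
        "right_ok": right <= threshold_deg,
        "preferred_eye": "left" if left <= right else "right",
    }

# Hypothetical data: a tightly clustered left eye and a scattered right eye.
rng = np.random.default_rng(0)
target = [0.0, 0.0]                          # target position, degrees of visual angle
left = rng.normal(0.0, 0.5, size=(60, 2))    # tight cluster around the target
right = rng.normal(3.0, 2.5, size=(60, 2))   # widely scattered (e.g., strabismus)
print(validate_calibration(left, right, target))
```

A combined-eye check would fail this user at setup; the per-eye check lets the session continue using the left eye alone.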
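The AOI limitation follows from how dwell time is computed: each gaze sample is attributed to whatever object the gaze ray hits, and the per-object totals are ranked. A second short sketch, again in Python with illustrative names and an assumed 90 Hz sample rate, shows why a wandering tracked eye corrupts the tally:

```python
from collections import defaultdict

def dwell_times(samples, sample_period_s=1 / 90):
    # Accumulate per-object dwell time from (timestamp, object_id) gaze
    # samples, e.g., one sample per rendered frame at 90 Hz.
    totals = defaultdict(float)
    for _t, obj in samples:
        if obj is not None:            # None means the gaze ray hit nothing
            totals[obj] += sample_period_s
    return dict(totals)

# If the tracked eye drifts, samples land on the wrong object or on nothing,
# so the dwell-based ranking of areas of interest becomes unreliable.
samples = [(0.000, "door"), (0.011, "door"), (0.022, None), (0.033, "lamp")]
print(dwell_times(samples))
```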

“The biggest challenge is creating space to ask these fundamental science questions. We need interdisciplinary collaboration – including experts in visual impairment – and stronger industry partnerships to share our research and advocate for the needs of these populations,” said Jain. 

This call for collaboration is echoed by Michael Proulx, Ph.D., a professor at the University of Bath and a research scientist at Reality Labs Research.

“As a professor in the UK who has researched the psychology of visual impairments, I am passionate about inclusive design. One main attraction for me to start working with Reality Labs Research at Meta was the efforts to have eye tracking that works ‘all the time, for every person, in any environment,’” said Proulx. “In fact, this challenge was noted in efforts to support academic-industry partnerships through open-sourcing data, models, and papers. This is also why I’ve been proud to collaborate with researchers at the University of Florida who are working at the cutting edge of eye tracking research that works well for everyone, because this is a challenge that can only be solved with the whole field working together.” 

To support virtual reality experiences for more people, the researchers outline practical design guidelines for developers (sketched in code below):

  1. Provide alternatives to relying on eye gaze as a cursor; for example, allow the individual to use their head as a cursor if they prefer.
  2. Allow users to select which eye is used for gaze-based interaction or gaze-based contextual cues.
  3. Incorporate tests to identify eye dominance as part of calibration.
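Here is a minimal sketch of how those three guidelines might come together in a pointer-selection routine. It is written in Python for illustration; the type and function names are assumptions, and no specific VR SDK is implied.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class PointerSettings:
    mode: str = "gaze"          # guideline 1: "gaze" or "head" cursor
    tracked_eye: str = "both"   # guideline 2: "left", "right", or "both"

def dominant_eye(left_dispersion: float, right_dispersion: float) -> str:
    # Guideline 3: a simple dominance test run during calibration, picking
    # the eye whose validation samples cluster more tightly on the targets.
    return "left" if left_dispersion <= right_dispersion else "right"

def pointer_direction(s: PointerSettings, head: Vec3, left: Vec3, right: Vec3) -> Vec3:
    # Resolve the cursor ray from the user's chosen input source.
    if s.mode == "head":
        return head                      # head cursor as a gaze alternative
    if s.tracked_eye == "left":
        return left
    if s.tracked_eye == "right":
        return right
    # "both": average the two gaze rays, the common default that the study
    # shows can fail when the eyes are misaligned.
    return tuple((l + r) / 2 for l, r in zip(left, right))
```

A settings screen would expose `mode` and `tracked_eye` directly, and the calibration step could call `dominant_eye` to suggest a sensible default.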

Ultimately, the researchers hope to achieve a paradigm shift through the concept of “opto-diversity,” which acknowledges all eye shapes, colors, behaviors, and varying eye functions.

“The hope is that these users can get more consideration from designers as they develop applications. Eye tracking and virtual reality are becoming more commonplace and accessible. Designing for their needs will hopefully become more of a priority,” said lead student author Shaina Murphy, a Ph.D. candidate in the Human Centered Computing program.

Professor Jain added that, to make long-term changes, the research community and industry must work together.

“The more we connect with the industry, the more we can start influencing how they approach technology development. By designing for opto-diversity rather than a default eye, we hope to create a better experience in immersive VR whether that’s in gaming, education, job training, therapy or beyond.”