Facial expressions could help widen VR and AR accessibility options
Published: 7 April 2025
A new study on how computers can be accurately controlled using only facial expressions could help make augmented reality (AR) and virtual reality (VR) technologies more accessible to people living with disabilities.
Researchers from the University of Glasgow and the University of St. Gallen in Switzerland discovered that seven simple facial movements could be reliably recognised by Meta’s Quest Pro headset, enabling users to control a VR game and navigate web pages in an AR environment.
The results could allow widely-available VR headsets to offer people living with disabilities new ways to interact with computers, as well as broaden the options for hands-free control available to every user.
https://youtu.be/VaXVVNQYVC4
During the study, which will be presented as a paper at the CHI 2025 conference later this month, the researchers asked 20 non-disabled volunteers to try moving their faces to match the 53 expressions already recognised by the Quest Pro’s software.
These expressions, called Facial Action Units or FAUs, are usually picked up by the headset’s onboard cameras to translate real-world facial expressions into virtual ones in online environments.
The volunteers were asked to perform each expression three times over, holding the facial pose for three seconds at a time. They rated each expression for comfort, ease of use, and how well they felt they performed it.
The testing showed that seven FAUs, involving different areas of the face, provided the optimum balance between being reliably recognised by the headset and being comfortable for users to repeat regularly. They were: opening the mouth; squinting the left and right eyes; puffing the left and right cheeks; and pulling the edges of the mouth to either side.
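To make the vocabulary concrete, the seven expressions could be encoded in software along the following lines. This is a minimal sketch only: the identifier names and labels are assumptions for illustration, not the labels used in the study or by the Quest Pro's software.

```python
from enum import Enum

# Hypothetical encoding of the seven-expression input vocabulary described above.
# The names are illustrative; the study's own labels may differ.
class InputFAU(Enum):
    MOUTH_OPEN = "jaw_drop"
    SQUINT_LEFT = "eyes_squint_left"
    SQUINT_RIGHT = "eyes_squint_right"
    CHEEK_PUFF_LEFT = "cheek_puff_left"
    CHEEK_PUFF_RIGHT = "cheek_puff_right"
    MOUTH_PULL_LEFT = "mouth_corner_pull_left"
    MOUTH_PULL_RIGHT = "mouth_corner_pull_right"
```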
Dr Graham Wilson, of the University of Glasgow’s School of Computing Science, is one of the authors of the paper. He said: “Some have suggested that VR is an inherently ableist technology because it often requires users to perform dextrous hand motions with controllers or make full-body movements which not everyone can perform comfortably. That means that, at the moment, there are barriers preventing widespread access to VR and AR experiences.
“With this study, we were keen to explore whether the functions of a commercially-available VR headset could be adapted to help users accurately control software using only their face. The results are encouraging, and could help point to new ways of using technology not only for people living with disabilities but more widely too.”
To test the effectiveness of the seven facial movements they had identified, the researchers built their own neural network model, which read the expressions captured by the Quest Pro with 97% accuracy.
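The paper's model details are not reproduced here, but the general idea can be sketched as follows, assuming the headset exposes its 53 per-frame FAU activation weights as a numeric vector and that frames are classified into the seven target expressions plus a neutral class. The architecture below is an illustrative stand-in, not the authors' model.

```python
import torch
import torch.nn as nn

# Assumptions (not from the paper): 53 FAU weights per frame, each in [0, 1],
# classified into the 7 target expressions plus a "neutral" class.
N_FAUS = 53
N_CLASSES = 8

class FAUClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FAUS, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, N_CLASSES),
        )

    def forward(self, fau_weights: torch.Tensor) -> torch.Tensor:
        # fau_weights: (batch, 53) -> logits over the 8 classes
        return self.net(fau_weights)

model = FAUClassifier()
frame = torch.rand(1, N_FAUS)           # one frame of FAU activations
predicted = model(frame).argmax(dim=1)  # index of the most likely expression
```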
Next, they asked 10 non-disabled volunteers to use the headset to perform two tasks representing typical future use cases for the system. First, the volunteers used facial expressions to turn, select options and shoot weapons in a VR game environment. Then, they used their faces to scroll, select and navigate web pages in an AR environment. In both cases, they also completed the tasks with traditional handheld VR controllers so the two experiences could be compared.
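In practice, this amounts to binding each recognised expression to a command in the current context. The sketch below shows one possible set of bindings for the two tasks; the specific pairings are assumptions for illustration and may not match those used in the study.

```python
from typing import Optional

# Illustrative bindings only; the paper's own mappings may differ.
GAME_BINDINGS = {
    "mouth_open": "shoot",
    "squint_left": "turn_left",
    "squint_right": "turn_right",
    "cheek_puff_left": "select",
}

BROWSER_BINDINGS = {
    "cheek_puff_left": "scroll_up",
    "cheek_puff_right": "scroll_down",
    "mouth_pull_left": "back",
    "mouth_open": "select_link",
}

def dispatch(expression: str, context: str) -> Optional[str]:
    """Return the command bound to a recognised expression, if any."""
    bindings = GAME_BINDINGS if context == "game" else BROWSER_BINDINGS
    return bindings.get(expression)

print(dispatch("mouth_open", "game"))  # -> "shoot"
```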
The participants reported that, while the controllers offered more precise input for gaming, facial expressions also worked well, providing a helpful level of control without requiring excessive effort. They found the web browsing experience simple and intuitive, and expressed a high willingness to use facial input in the future.
Dr Mark McGill, a co-author of the paper from the University of Glasgow’s School of Computing Science, said: “This is a relatively small study, based on data captured with the help of non-disabled people. However, it shows clearly that these seven specific facial movements are likely the most easily-recognised by off-the-shelf hardware, and translate well to two of the typical ways we might expect to use more traditional VR and AR input methods.
“That gives us a base to build on in future research. We plan to work with people with disabilities like motor impairments or muscular disorders in further studies to provide developers and XR platforms with new suggestions of how they can expand their palette of accessibility options, and help them break free of the constraints of hand and controller-based input.”
The potential applications of the research go further than expanding accessibility. In the paper, the researchers also highlight how enabling headsets to use facial expression control could help users more comfortably perform a range of everyday tasks. They suggest that users could control AR glasses while walking with their hands full, while cooking, or when performing tasks that require clean hands. It could also give users more discreet control of their devices in public spaces.
The team have also made their dataset freely available online to encourage other researchers to explore their own potential uses of the FAUs identified as the most useful for facial control of software.
The team’s paper, titled ‘InterFACE: Establishing a Facial Action Unit Input Vocabulary for Hands-Free Extended Reality Interactions, from VR Gaming to AR Web Browsing’, will be presented at the CHI Conference in Yokohama, Japan, on Monday 28th April. The research was supported by funding from UK Research & Innovation (UKRI) and the European Research Council (ERC).