Official newspaper of The University of Texas at Austin

The Daily Texan
October 4, 2022
LISTEN IN

Microsoft mobile app allows people with ALS to speak

Courtesy of Xiaoyi Zhang

A new application called GazeSpeak will allow people to “speak” with their eyes. 

The application aims to help those affected by motor neuron diseases, such as amyotrophic lateral sclerosis, or ALS, communicate. GazeSpeak was created by Enable, a Microsoft Research team that develops accessible technology for people living with a variety of diseases. The app will be introduced in May.

Researchers Xiaoyi Zhang, Harish Kulkarni and Meredith Ringel Morris, three Microsoft employees who worked directly on the development of GazeSpeak, wrote a report on the application. 


“GazeSpeak has no additional cost other than a smartphone, which most people in the U.S. already own,” the report stated. “(The application is an) eye gesture communication system that runs on a smartphone … designed to be low-cost, robust, portable and easy-to-learn.”

The application is a low-cost alternative to dedicated eye-tracking machines, which can cost up to $10,000. 

GazeSpeak works by interpreting different eye movements. For example, a wink would indicate the end of a word, looking up would signal a letter from A to F, looking to the right would signal a letter from G to L, and so on. The application then uses artificial intelligence and a program similar to predictive text messaging to “speak” what the user is trying to say.
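To make the mechanics concrete, here is a minimal sketch in Python of the gesture-to-text idea described above. The four-way letter grouping, the tiny vocabulary and the function names are illustrative assumptions, not Microsoft's actual implementation.

```python
# Illustrative sketch of eye-gesture decoding (not Microsoft's actual code).
# Each gaze direction selects a group of letters; a wink ends the word.

# Assumed grouping: up = A-F, right = G-L, down = M-R, left = S-Z.
GROUPS = {
    "up": set("ABCDEF"),
    "right": set("GHIJKL"),
    "down": set("MNOPQR"),
    "left": set("STUVWXYZ"),
}

# Tiny stand-in vocabulary; the real app uses predictive models instead.
VOCABULARY = ["HEIR", "HELLO", "HELM", "HELP", "WATER", "YES"]

def matches(word, gestures):
    """True if each letter of `word` lies in the group chosen by each gesture."""
    return len(word) == len(gestures) and all(
        letter in GROUPS[g] for letter, g in zip(word, gestures)
    )

def decode(gestures):
    """Return every vocabulary word consistent with the gesture sequence."""
    return [w for w in VOCABULARY if matches(w, gestures)]

# Spelling "HELP": H -> right, E -> up, L -> right, P -> down, then a wink.
print(decode(["right", "up", "right", "down"]))  # ['HEIR', 'HELM', 'HELP']
```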

J. K. Aggarwal, a retired UT computer engineering professor, said the mobile, easy-to-use application wouldn't be possible without advanced artificial intelligence.

“The subject winks and chooses from a variety of options — there are a lot of different words that could be conveyed from the same gestures, so the AI must be skilled in cutting out the noise,” Aggarwal said. “It’s a simple case of adept artificial intelligence.”
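Aggarwal's point about "cutting out the noise" can be sketched the same way: because each gesture names a whole letter group, one gesture sequence can match several words, and the decoder must rank the candidates. The frequency counts below are invented for illustration; a real system would rely on a statistical language model.

```python
# Continuing the sketch above: rank ambiguous candidates by frequency,
# the way predictive text breaks ties (counts are made up).
FREQUENCY = {"HELP": 120, "HELM": 8, "HEIR": 5}

def rank(candidates):
    """Order candidate words from most to least frequent."""
    return sorted(candidates, key=lambda w: FREQUENCY.get(w, 0), reverse=True)

# HEIR, HELM and HELP all map to right, up, right, down.
print(rank(decode(["right", "up", "right", "down"])))  # ['HELP', 'HELM', 'HEIR']
```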

For people facing impairments in muscle function and control, eye movement can sometimes be the only available means of communication. Aggarwal said the idea behind GazeSpeak is similar to that of other gesture-based languages, such as sign language.

“Just as you convey different words through hand gestures in American Sign Language, you use non-oral gestures to signal what you want to say with this application,” Aggarwal said.

Engineering junior Nikhil Bhargava said GazeSpeak is an important application.

“This app has the potential to give a voice to those who, quite literally, cannot have a voice,” Bhargava said. “There are a lot of intellectuals out there who have a lot of important things to say but simply cannot due to neuronal disabilities.”

Rico Malvar, distinguished engineer at Microsoft and chief scientist for Microsoft Research, said Microsoft hasn’t officially announced GazeSpeak, but the application will debut in Denver at the Conference on Human Factors in Computing Systems in May.

Malvar said Microsoft’s mission statement, to empower all people to achieve more, helped inspire the app’s development.

“This (mission statement) means not only making sure that our products are accessible, but also investing in new interfaces and applications that can help people do more in spite of their disabilities,” Malvar said.

As for why he enjoys working at Enable, Malvar said the research performed there has the power to improve the quality of many lives and bring independence to those who have disabilities.

“Being able to achieve that kind of impact is the inspiration that motivates the work in our Microsoft Research Enable team,” he said.
