The Daily Texan
Official newspaper of The University of Texas at Austin
October 4, 2022
LISTEN IN

Female bias in voice technologies reflects societal preferences, experts say

Sarah Bloodworth

When you ask your technological assistant to complete a task, you'll get a response from a polite, reliable female voice. Siri, Alexa and Cortana all respond with a female voice by default, and researchers find this could reflect a bias toward women serving others.

Research led by Karl MacDorman, a professor at Indiana University who specializes in human-computer interaction, found that both men and women prefer a female voice when talking to a computer. There are psychological reasons why we might feel more comfortable with a woman’s voice, said sociology professor Gloria González-López.

González-López, who specializes in sexuality, gender and social inequality in Mexico, said it’s not surprising to see female voices set as a default for our voice assistants, because women are taught to be servants in society, especially toward men.

She said this process is called "gendered servitude," where, for example, it's not unusual to ask a girl to clean up after her brother. These gender inequalities become normalized and can later impact the workplace, where women are often placed in administrative roles rather than in charge, she added.

“Technology and the internet no doubt have become the mirror reflecting the very same expressions of social inequality that people encounter in their actual lives,” González-López said. “It’s not a surprise to have female voices as a default in these devices, and so when someone needs help, a woman is the one expected to come to offer that service.”

Computer science senior Yair Nieto said having a default female voice assistant is degrading to women.

“When a person thinks of assistants, the person who is giving the commands to the voice assistant feels like they are higher than the assistant,” Nieto said. “The voices should be balanced out.”

Apple’s Siri and the Google Assistant now offer the option to switch the default female voice to a male voice. But Alexa and Cortana don’t have these options. Computer science senior Vincent Lee said he doesn’t think tech companies would create default female voices to intentionally slight women, but sees where gender bias could play a role.

He added that his Google Assistant software removed gender bias from its voice settings by labeling the voices with colors instead of assigning a name.

“We don’t formally designate voices as being male or female,” said Google spokeswoman Ashley Thompson. “But you can think of ‘voice one’ as traditionally female sounding, and ‘voice two’ as traditionally male sounding.”

She added that when people set up their Google Home device, they are randomly assigned a voice, giving a 50-50 chance of getting a female or male voice.

González-López said tech companies should be mindful when creating their products to reduce the implications of gender inequality.

“We have been socialized to see women helping others as ‘normal,’ and that creates some sort of emotional comfort,” González-López said. “(But) a woman’s voice in our devices reminds us again and again — every time we push that button — of one more expression of gender inequality in our society.”
