Artificial intelligence (AI)-powered voice assistants are showing up more and more in our day-to-day lives. Customer service robots now work in hotels, restaurants, bars, security, child care, and even some grocery stores. Although AI can often make our lives easier, it also introduces numerous problems and can perpetuate stereotypes and reinforce prejudices.
Studies have shown that bias in facial recognition and other AI systems disproportionately affects women and people of color. But what fewer people are aware of are the gender stereotypes that everyday voice assistants present.
When iPhone users gave Siri certain sexually explicit commands, she used to respond, “I’d blush if I could.” A 2019 UNESCO publication of that name was one of the first to delve into the harmful gender stereotypes that conversational voice assistants (CVAs) like Siri, Alexa, Cortana, and Google Assistant reinforce. Apple has since updated Siri’s reply to a flatter “I don’t know how to respond to that.” But the biases and stereotypes don’t go away by changing one line of code.
Most CVAs, when sexually harassed, flirt, deflect, or even thank the user. In February 2017, Leah Fessler tested leading assistants’ reactions to sexual harassment. They responded with “Well, thanks for the feedback,” “Well, I never!” or “There’s no need for that…” Since then, some of these programs have shifted to less flirtatious responses, but all remain far from the much-needed “That sounds like sexual harassment. That’s not okay.”
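To see why swapping out one reply is only a surface fix, consider a minimal sketch of how canned responses to abuse are often implemented. This is a hypothetical illustration in Python, not the actual code behind Siri or any other assistant; the table, function, and strings below are assumptions made for demonstration.

```python
# Hypothetical sketch of a canned-response table; not any vendor's real code.
ABUSE_REPLIES = {
    "old_deflection": "I'd blush if I could.",
    "new_deflection": "I don't know how to respond to that.",
    "flirt": "Well, I never!",
    "thank": "Well, thanks for the feedback.",
}

def respond_to_abuse(utterance: str) -> str:
    """Return a canned reply to an abusive utterance.

    Changing one string (old_deflection -> new_deflection) alters the wording,
    but the design stays the same: the assistant still deflects rather than
    naming the behavior, e.g. "That sounds like sexual harassment. That's not okay."
    """
    return ABUSE_REPLIES["new_deflection"]
```

The deeper problem is the decision to deflect at all, and that decision lives in the product’s design, not in any single line.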
Siri’s responses to commands like these are not the only place we see gender-based stereotypes enforced by artificial intelligence. The first and most obvious shortcoming of most CVAs is their default gender. Although Google’s voice bot, Google Assistant, claims that it “eats gender roles for breakfast,” 92.4% of all voice assistants default to a female-sounding voice. The tasks these bots carry out are often thought of as traditional women’s work, and society teaches children to treat voice bots as “unquestioning helpers who exist only to serve owners unconditionally.” When those voice bots are female, these expectations reinforce outdated stereotypes and perpetuate harmful gender norms (Statt, 2019).
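For illustration, here is a hedged sketch of how such a default can get baked into a product. The names and values are hypothetical, not any vendor’s actual configuration, but they show how a single hard-coded setting becomes the out-of-the-box experience for nearly every user, and how small a change it would be to ask for an explicit choice instead.

```python
# Hypothetical configuration sketch; not any real assistant's settings API.
DEFAULT_VOICE = "female_en_US"  # one hard-coded default shapes nearly every install

def create_assistant(voice=None):
    """Current pattern: a gendered default applies unless the user overrides it."""
    return {"voice": voice or DEFAULT_VOICE}

def create_assistant_explicit(voice):
    """Alternative design: no gendered default; setup requires an explicit choice."""
    if voice is None:
        raise ValueError("A voice must be chosen during setup.")
    return {"voice": voice}
```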
On the occasions that chatbots or voice assistants default to a male name, picture, or voice, you can usually find them on law firm, accounting, or other business websites, delivering analytical advice and information. Chatbots and voice assistants that default to a female persona are almost always performing secretarial tasks: customer service, information retrieval, and other helpful chores (New York Times).
According to Josie Young, “Assigning gender to a voice bot is poor design. It’s boring and lazy. Companies claim they are incorporating ‘progressive ideas.’ The ideas that women are tools who serve others and tolerate abuse.” The more that we associate assistants with women, the more society will see women as servants who must always be sympathetic, helpful, and eager to please.
However, gender most often isn’t assigned to bots with malicious intent. Instead, it’s the result of strikingly homogeneous development teams made up almost entirely of white, cisgender men (Chin & Robinson, 2020). Women make up only 12 percent of AI researchers and just 6 percent of software developers, as they’re not welcomed into the workforce and are often pushed out of the industry.
“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place. The whole structure of the subject area of computer science has been designed to be male-centric, right down to the very semantics we use.”
Not only does assigning a human gender to conversational voice assistants enforce dangerous stereotypes, it also limits a bolder future for voice bots, one in which they could exceed human capabilities. It constrains what society can see as possible for CVAs and artificial intelligence.
The solution to these problems is simple: incorporate women into software development teams and give them equitable opportunities, so that modern-day voice assistants can respond appropriately to sexual harassment and stop reinforcing gender bias and toxic gender stereotypes.