Women in AI: Reinforcing Sexism and Stereotypes with Tech

There is evidence of a persistent gender bias in AI. That could be one of the reasons why we see gendered stereotypes manifested in the female identities associated with AI assistants.

Digital transformation has changed the way we work and play.

But technological progress does not always translate into social progress with respect to gender parity. Some aspects of tech have reinforced, rather than countered, sexual stereotypes. (Also read: What Do Women in Tech Want?)

Here are some key areas reinforcing sexism and gendered stereotypes within artificial intelligence (AI):

Bro Culture at Work

While tech and AI continue to advance, the same cannot be said for women’s positions in these very male-dominated fields. Women still hold just over a quarter (26%) of data and AI positions, according to the World Economic Forum.

That lack of representation means that the majority of women working in tech—72%, according to TrustRadius—still have to contend with bro culture. That can translate into a very toxic, and even dangerous, environment for women.

In the case of the video game company Activision Blizzard, workplace bro culture led to unequal pay, sexual harassment and even assault, without any real consequences for the perpetrators. The Wall Street Journal reported that the CEO, who knew about the misconduct, intervened to ensure that employees found guilty in internal investigations were not fired, despite recommendations that they be.


Bro Culture at Play

The sexism pervading the companies that produce games also creates a hostile environment for female players.

In 2021, Reach3 Insights surveyed 900 women and found 59% of them opted for gender-neutral or even masculine names when playing to avert sexual harassment.

Over three-quarters of the women surveyed (77%) reported having to deal with some form of gender-based unpleasantness. Judgment of their skills was reported by 70% and gatekeeping by 65%. Half reported patronizing comments, and 44% said they “received unsolicited relationship asks while gaming.”

Some women have an even worse experience in virtual reality. Jordan Belamire described hers in an essay, “My First Virtual Reality Groping”: an avatar named BigBro442 persisted in groping her avatar despite her requests and orders to stop. Belamire noted:

“As VR becomes increasingly real, how do we decide what crosses the line from an annoyance to an actual assault? Eventually we’re going to need rules to tame the wild, wild west of VR multi-player.”

New Platforms, Same Old Problem

Another question is: when is “eventually” going to arrive?

Belamire wrote “My First Virtual Reality Groping” in 2016, yet over five years later a similar incident was reported in The Verge. A beta tester for Meta’s “Horizon Worlds” said her avatar was groped on the platform and described how upsetting she found the incident.

“Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense,” she wrote. “Not only was I groped last night, but there were other people there who supported this behavior which made me feel isolated in the Plaza.”

Meta’s platform does offer a blocking feature, which can give some more control to those entering a space where anyone can approach your avatar. But that kind of solution still doesn’t measure up to what Belamire suggested: a code of conduct that players would have to adhere to.

That sexual harassment remains a serious problem—one that carries over from the real world to the virtual one—reflects the fact that society is still mired in certain gendered assumptions. And these assumptions also express themselves in subtler forms.

What Siri Says About Us

Gender equality was supposed to have advanced since the middle of the last century, but the gendered assumptions that remain in place in everyday tech remind us we still have a long way to go.

That was the focus of a recent UNESCO study entitled “I’d Blush If I Could.”

The study’s title is a sentence Apple’s female-gendered voice-assistant, Siri, was originally programmed to say in response to users calling her a sexist name. Apple updated Siri’s programming in early 2019 to offer a more machine-appropriate “I don’t know how to respond to that” when someone makes such a statement to the AI agent.

But still, one has to wonder why it took the company that long. Siri was released in 2011, and it shouldn’t have taken nearly eight years to acknowledge and address a problem of sexist assumptions.

As the report points out, “Siri’s ‘female’ obsequiousness—and the servility expressed by so many other digital assistants projected as young women—provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.”

What’s in a Name?

Ironically enough, Amazon, whose name refers to a fierce warrior race of women, upheld sexist assumptions about women when it launched its AI agent. Alexa’s name is derived from Alexandria, a city whose claim to fame in the ancient world was its library, according to Daniel Rausch, the head of Amazon’s “Smart Home” division.

Rausch told Business Insider that the idea behind referencing Alexandria with Alexa’s name was to capture the idea of the ancient library’s original collection of volumes, which housed “all the collective knowledge of the world at that time.” As that ancient city was named for Alexander the Great, Amazon could just as well have called its agent “Alex,” a name used by men and women.

But the company decided on the distinctly feminine version of the name, just as Apple opted for the feminine “Siri” and Microsoft created Cortana. Likely, the companies all did the same kind of market research Amazon said it did. (Also read: How Will AI Change the Market Research Scenario?)

Why AI Uses Women’s Voices and Avatars

In the Business Insider interview, Rausch said Amazon “found that a woman’s voice is more ‘sympathetic’ and better received.” The article went on to say this preference for female voices predates AI assistants.

Indeed, even the computer on board the Enterprise spoke in a female voice. The voice was in fact that of Majel Barrett-Roddenberry, wife of the creator of the “Star Trek” series and most recognized by fans for her recurring role as the perfectly coiffed blond nurse, Christine Chapel, who had to dutifully take orders from Dr. McCoy.

True, there are AI agents linked to male identities, as PC Mag’s Chandra Steele observed in a 2018 Medium post. But they are typically assigned more serious tasks than those relegated to the virtual assistant on your desktop or phone. Accordingly, IBM’s Watson, which is associated with things like medical research, was given the “masculine-sounding voice” that people associate with confidence and leadership. (Also read: Top 20 AI Use Cases: Artificial Intelligence in Healthcare.)

In contrast, the female voices are associated with cordiality and complaisance. “Though they lack bodies,” Steele explained, “they embody what we think of when we picture a personal assistant: a competent, efficient, and reliable woman.”

Sometimes virtual assistants are even granted a feminine virtual body—at least one that appears on-screen. That is the case with IPsoft’s cognitive agent Amelia, who is depicted as a blonde woman who could be in her twenties. She embodies the dependable female who supports the one in charge: in the background, but also conventionally attractive.

Tackling the Root of the Problem

“There’s nothing artificial about AI,” declared Fei-Fei Li, an expert in the field. “It’s inspired by people, it’s created by people, and—most importantly—it impacts people.” Just as “garbage in, garbage out” applies to all data, the same holds for what she terms “bias in, bias out” for AI systems.

The upside of that, however, is that it is possible to reshape the path that has been set. However, we must make a conscious effort to balance the perspectives fed into AI. Failure to do so, Li said, would “reinforce biases we’ve spent generations trying to overcome.” (Also read: Minding the Gender Gap: 10 Facts about Women in Tech.)

What we need to do going forward is to consciously combat bro culture, whether it gives expression to overtly harmful effects, as in the case of sexual harassment and assault, or whether it manifests more subtly, like in the sexual stereotyping of AI-powered entities.

To achieve that, it is important that women’s voices are heard—not as compliant helpers, but as equals to their male counterparts at work and at play.


Ariella Brown

Ariella Brown has written about technology and marketing, covering everything from analytics to virtual reality, since 2010. Before that, she earned a PhD in English, taught college-level writing, and launched and published a magazine in both print and digital formats. Now she is a full-time writer, editor, and marketing consultant. Links to her blogs, favorite quotes, and photos can be found at Write Way Pro.