Ever wonder why Alexa is not Alex, a nickname that works for either gender? It's ironic that a company whose name derives from a fierce race of warrior women simply fell into the standard practice of casting the helper who takes orders from the user as female.

What's in a Name?

In fact, the AI agent’s name is derived from Alexandria, a city whose claim to fame in the ancient world was its library, according to Daniel Rausch, the head of Amazon’s “Smart Home” division. He told Business Insider that the name was meant to capture the idea of the original collection of volumes that housed “all the collective knowledge of the world at that time.”

But they could have just as easily considered the fact that the city was named for Alexander the Great, and gone with the name Alex, a nickname adopted by men and women. Instead, they went with the distinctly feminine Alexa.

Amazon is not alone in this. Apple chose Siri as the name for its voice assistant, which speaks in a feminine voice. A female identity was also selected for Microsoft’s Cortana. It does not appear to be mere coincidence that the three best-known AI identities are female.

Likely the companies all did the same kind of market research that Amazon said it did. (For the not-so-serious side of Siri, check out Pardon Moi? Top Siri Fail Messages.)

If Computers Could Talk, How Should They Sound?

In the Business Insider interview, Rausch said that they “found that a woman’s voice is more ‘sympathetic’ and better received.” The article went on to say this preference for female voices predates AI assistants.

That could well be why the computer on board the Enterprise spoke in a female voice. The voice was in fact that of Majel Barrett-Roddenberry, wife of the creator of the “Star Trek” series, best recognized by fans for her recurring role as the perfectly coiffed blonde nurse, Christine Chapel, who dutifully took orders from Dr. McCoy.

In the video below, she is effectively giving voice to a library, reading out the information on file for the person the captain asks about.

True, there are AI agents linked to male identities, as PC Mag’s Chandra Steele observed in a Medium blog this year. But they are typically assigned more serious tasks than those relegated to the virtual assistant on your desktop or phone. Accordingly, IBM’s Watson, which is associated with things like medical research, was given the “masculine-sounding voice” that people associate with confidence and leadership.

In contrast, the female voices are associated with cordiality and complaisance. “Though they lack bodies,” Steele explained, “they embody what we think of when we picture a personal assistant: a competent, efficient, and reliable woman.”

Sometimes they are even granted a feminine virtual body, or at least one that appears on the screen. That is the case with IPsoft’s cognitive agent Amelia. As you can see in the video below, she is shown as a fairly young blonde (the color Barrett-Roddenberry dyed her hair for her on-screen role as the nurse).

Amelia embodies the dependable female who supports the one in charge, in the background but also just a bit decorative. She puts one in mind of the ideal assistant envisioned in the 1950s. Like Perry Mason’s Della Street, the quintessentially loyal secretary, she is always there to meet the needs of her male boss and refuses no request or task.

What Siri’s Voice and Expressions Say About Us

We’re supposed to have made significant progress with respect to gender equality since the middle of the last century, but the AI assistants remind us that we still have a long way to go. That was the focus of a recent UNESCO study entitled “I’d Blush If I Could.”

The study’s title is a sentence that Apple’s female-gendered voice assistant, Siri, was originally programmed to say in response to users calling her a sexist name. The good news is that Apple updated Siri’s programming in early 2019 so that she now offers a more machine-appropriate “I don’t know how to respond to that” when someone makes such a statement to the AI agent.

However, as the report points out, Apple did let it stand for a very long time, considering that Siri was released back in 2011. That is the crux of the problem.

Tech May Have Advanced, but Women...Not so Much

In the report’s words, “Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.”

The report attributes the sexism embedded in the AI’s feminine identity and programmed responses to the systemic gender gap in technology in general and AI in particular: “Today, women and girls are 25 per cent less likely than men to know how to leverage digital technology for basic purposes, 4 times less likely to know how to programme computers and 13 times less likely to file for a technology patent.”

Very Few Women Are Working in AI

That fits with the findings of the AI Now Institute’s Discriminating Systems: Gender, Race, and Power. Women account for only 18% of authors at leading AI conferences and make up less than 20% of AI professors. The representation is even worse in industry: at Facebook, women hold only 15% of research staff positions, and at Google the figure drops to 10%.

As bad as those numbers are, they are better than what the UNESCO report says about applicants to such positions: “Recruiters for technology companies in Silicon Valley estimate that the applicant pool for technical jobs in artificial intelligence (AI) and data science is often less than 1 per cent female.”

There are serious consequences for that imbalance, according to the UNESCO report: “As men continue to dominate this space, the disparity only serves to perpetuate and exacerbate gender inequalities, as unrecognized bias is replicated and built into algorithms and artificial intelligence.”

It explains that “male-dominated teams” are the ones that establish the criteria, setting a norm that “can be hard to fix when gender biases are pointed out.” The issue of AI bias was addressed in AI’s Got Some Explaining to Do. The AI is not creating the bias, but reflecting the prejudices built into its programming.

“There’s nothing artificial about AI,” declared Fei-Fei Li, an expert in the field. “It’s inspired by people, it’s created by people, and — most importantly — it impacts people.” Just as “garbage in, garbage out” applies to all data, the same holds for what she terms “bias in, bias out” for AI systems.
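To make “bias in, bias out” concrete, here is a minimal, purely illustrative sketch in Python (the dataset, role labels, and function names are invented for this example and do not come from any real system): a naive model that merely counts patterns in skewed historical records will confidently reproduce that skew in its predictions.

```python
# Toy illustration only: a "model" that learns associations from a skewed
# dataset will reproduce that skew. The records below are invented.
from collections import Counter

# Hypothetical historical hiring records (role, gender) with a built-in imbalance.
historical_hires = [
    ("assistant", "female"), ("assistant", "female"), ("assistant", "female"),
    ("assistant", "male"),
    ("engineer", "male"), ("engineer", "male"), ("engineer", "male"),
    ("engineer", "female"),
]

def learn_associations(records):
    """Count how often each role co-occurs with each gender label."""
    counts = {}
    for role, gender in records:
        counts.setdefault(role, Counter())[gender] += 1
    return counts

def predict_gender(model, role):
    """'Predict' the gender most often seen for a role -- bias in, bias out."""
    return model[role].most_common(1)[0][0]

model = learn_associations(historical_hires)
print(predict_gender(model, "assistant"))  # -> 'female'
print(predict_gender(model, "engineer"))   # -> 'male'
```

The point is not the toy code itself but the pattern: nothing in the model is malicious, yet its output faithfully mirrors whatever imbalance went in.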

The Way Forward

So how can we overcome the biases that have been built into our machine learning systems and that further shape our expectations? The answer, according to the UNESCO report, is education: “Education is where expectations are forged and competencies cultivated.”

Li agrees that it is possible to reshape the path that has been set. “With proper guidance AI will make life better,” she asserts. “But without it, the technology stands to widen the wealth divide even further, make tech even more exclusive, and reinforce biases we’ve spent generations trying to overcome.”

It’s time to boldly go where man has not gone before.