AI in the Workplace: What it Means to the Gender Wage Gap in 2019

KEY TAKEAWAYS

AI's importance to business operations is growing, and while it's meant to operate objectively, it has been shown to reinforce gender biases. But with awareness and education, it can be used to bridge gaps rather than reinforce them.

As we saw in Minding the Gender Gap, women still lag far behind men in the tech field, both in terms of representation (which hovers around 25% in the United States) and in terms of pay, where the gap between men and women is close to 12%.

While figures for pay disparity in tech don't break out specialists in artificial intelligence (AI), female representation in that field is even lower.

According to the report Discriminating Systems: Gender, Race, and Power, women make up only 18% of the authors represented at AI conferences and less than 20% of AI professors. They fare even worse in corporations, where they hold only 15% of research staff positions at Facebook and a mere 10% at Google.

As AI grows increasingly central to business operations, the question to explore is: what impact can AI have on gender gaps and the workforce in general? (Read Could genetics explain the gender gap between men and women in tech?)

I reached out to a few experts in the field to get their take on the role AI can play in reinforcing or transcending gender bias. Generally, they are optimistic about the future.

Women and AI

Anish Joshi, VP of Technology at Fusemachines (a leading provider of AI services, solutions and education), believes AI can actually “remove bias from the hiring process that has historically favored men.” (Also read: 5 Ways to Support Women in Your Tech Company.)


That corresponds with what Amy Chen, COO of Cortex Labs, observed about AI serving as a counter to the emotional or subjective perspectives that still influence decisions: “We can in the future be more based on objective facts with less stereotyping and bias,” she declared.

The way that will work, Joshi explained, is as follows:

  • AI algorithms are able to incorporate data that matters when hiring (qualifications, education, experience, etc.) and ignore data that doesn’t matter (sex, ethnicity, age, etc.); a minimal sketch of this idea follows the list.
  • This technology can also measure relevant trends and use predictive analysis to make better merit-based hires.
  • Employees may value different things in the workplace, and AI-augmented predictive analysis can identify these differences.
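
Purely as an illustrative sketch (not a description of any vendor's actual product), the snippet below shows the basic idea behind the first point: train a screening model only on job-relevant features and deliberately leave protected attributes out. The data file and all column names are hypothetical.

```python
# Illustrative sketch: a screening model restricted to job-relevant features.
# The data file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

applicants = pd.read_csv("applicants.csv")  # hypothetical historical hiring data

# Only merit-related columns; protected attributes such as sex, ethnicity,
# and age are deliberately left out of the feature set.
RELEVANT = ["years_experience", "education_level", "certifications", "skills_score"]

X = applicants[RELEVANT]
y = applicants["advanced"]  # whether the candidate advanced in past hiring rounds

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new candidate on the same relevant features.
candidate = pd.DataFrame([{"years_experience": 6, "education_level": 3,
                           "certifications": 2, "skills_score": 0.82}])
print(model.predict_proba(candidate)[:, 1])  # estimated probability of advancing
```

One caveat worth noting: dropping protected columns is only a first step, because other features can still act as proxies for them, a point the experts return to below.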

This is not strictly hypothetical; it is already being put into practice with software from companies like Gapsquare, Pipeline, Plum, and Pymetric that drives decisions based on data, Joshi observed. However, he does concede that bias programmed into AI can exacerbate the disparity between the sexes in hiring.

He explained: "If algorithms are trained on biased data, they will produce biased outcomes. This can be especially harmful to women in HR and hiring. There have been many notable instances of this, including one at Amazon, whose machine-learning-powered hiring technology penalized resumes that suggested the applicant was female, e.g., ones that included the word ‘women’s.’”

"Reuters reports this is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.”

Still, Joshi maintains the “hope that AI will help increase the number of women hired.” He referred to a Unilever report that its employee diversity improved by 16% thanks to the application of AI.

Michal Neufeld, CPO of Ubimo, put it this way: “In a nutshell, any algorithm is as good as the input it gets and the models that it uses.” The real danger of AI infused with bias from its programming is that it can carry the appearance and authority of “objective science.”

However, awareness of that danger is growing, and it is inspiring solutions.

“Partly because of unfortunate findings such as the COMPAS case, partly because there is a practical need to explain and stand behind results provided by these systems, and hopefully also because we are trying to do better,” Neufeld said.

Addressing Bias Baked Into Artificial Intelligence

Neufeld explained that much of the problem stems from the difficulty inherent in comprehending what exactly is going on in the AI models. That is what has become known as the “black box” problem (see AI's Got Some Explaining to Do).

She explained that one way to address that "is developing explainers such as LIME, aimed at ‘reverse engineering’ the input, output, and model to indicate which features from the input were eventually used in order to compute the model’s results.”

That would make it possible to "identify predictors that are biased, or assuming causality vs. correlation.”
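
As a rough illustration of that approach, the sketch below uses the open-source lime Python package (one implementation of the LIME technique, not necessarily the tooling Neufeld's team uses) on an invented screening model. The feature names, data, and model are all hypothetical; a large weight on a feature like gender_encoded would flag a biased predictor.

```python
# Hypothetical example of explaining a single prediction with LIME.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)
feature_names = ["years_experience", "education_level", "skills_score", "gender_encoded"]

# Invented training data in which the label leaks "gender_encoded" on purpose,
# so the explainer has a bias to surface.
X_train = rng.random((200, 4))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] > 0.8).astype(int)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

explainer = LimeTabularExplainer(
    X_train,
    feature_names=feature_names,
    class_names=["reject", "advance"],
    discretize_continuous=True,
)

# Which input features drove this one prediction?
explanation = explainer.explain_instance(X_train[0], model.predict_proba, num_features=4)
print(explanation.as_list())  # a heavy weight on "gender_encoded" marks a biased predictor
```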

She also believes it’s important to get to the core cause of bias, not merely the way it is manifested in AI results. The way to do that is by “putting the emphasis on education and cultural bias legitimacy.” Neufeld is optimistic that it will be possible to remove bias from people, and that bias-free programming will naturally follow.

That optimistic prediction is shared by the other two experts as well.

AI’s Impact on the Workforce

All respondents concede that there will be some job displacement as certain tasks are automated through AI. Joshi further acknowledged that this can have a detrimental impact on women.

“Roles that have been traditionally held by women (administrative, customer service, etc.) are being automated, putting some women at risk of being replaced if they aren’t up-skilled and/or trained.”

However, they maintain that the loss in one area will be offset by opportunities in another area. Neufeld put it this way:

"Much like in the Industrial Revolution, when people’s work was replaced by machines, people were still needed to build those machines. In today’s case, cognitive work that is done by machines is still lacking capabilities that are either transcendent to the tasks (like designing the model that will complete the task) or such we humans can’t ‘easily’ teach, such as creativity."

"Despite the obvious loss of jobs due to AI replacements, I believe many new opportunities will be created in the landscape of “guarding” the machines — whether in training them, monitoring their ethical and social outcome, or explaining their output and bridging the gap between "them" and humans. People will be in charge of explaining and translating technical datasets, implementation, and results to the business side (Technical to Business proxies)."

Joshi added that the workforce of the future will not come down to people versus machines so much as people working with machines.

“People will have to become accustomed to working alongside intelligent machines, not simply being replaced by them.” He cited the example of current innovations in diagnosing cancer: “A human assessment is still necessary on top of using AI.” (Read Cancer Vaccines and Artificial Intelligence: Winning the War Against Cancer?)

The AI Talent Pool Now and Down the Road

Many companies complain about a talent shortage in tech, and Joshi, Chen, and Neufeld agree that is the case now. But they all consider it a temporary setback. Joshi believes that programs like those offered by the Fuse AI Center will make AI education more accessible, and that “will widen the AI talent pool and essentially prepare engineers for the fast-growing global AI job market.”

In terms of the education needed for these jobs, Neufeld considers the situation of AI talent comparable to what we’ve seen before and so predicts a swing in the other direction:

"I believe there is a shortage in data science and AI developers today, same as the shortage of web developers at the end of the previous century. And similar to that, we are seeing the natural economic effect of supply and demand that raises the wages and lucrativeness of such positions."

"If I had to predict, we will see a pendulum phenomena of market saturation before balance is achieved."

Education to Develop the AI Talent Pool

They all anticipate that more people will be taught the technical skills they need to move into AI, and that curricula will shift to prepare people for the modern workforce. However, Chen believes that should begin as early as the high school level.

Her argument is that the curriculum has always been adapted to the needs of the times. In the earlier part of the past century, when technology centered around engines, the subjects taught at the high school level included chemistry, physics, and mathematics. Then technology advanced “to center around computers and cell phones and now AI and blockchain.”

Consequently, Chen argues, education should now include “coding, computer science, and computer architecture” among the “mandatory courses.”

Neufeld, on the other hand, thinks that disciplines like philosophy, psychology, and anthropology may become increasingly important in training the people who program AI. She explained it this way:

"The interesting part of the changes in the talent pool are in the jobs we don’t know of, or those requiring qualifications we don’t know how to teach. How do you train someone to create a machine that shows empathy or recognizes sarcasm?"

"In that sense study subjects like philosophy, psychology, and anthropology may become more apt to train the future generation of AI operators. This may result in a bigger gap in the near future, as it requires building these capabilities and adapting education programs from the ground up."


Ariella Brown
Contributor

Ariella Brown has written about technology and marketing, covering everything from analytics to virtual reality since 2010. Before that she earned a PhD in English, taught college-level writing, and launched and published a magazine in both print and digital format. Now she is a full-time writer, editor, and marketing consultant. Links to her blogs, favorite quotes, and photos can be found at Write Way Pro. Her portfolio is at https://ariellabrown.contently.com