We never thought the day would come when we’d be accusing technology of perpetuating gender bias, but here we are…


In March 2024, a UNESCO study revealed alarming tendencies in LLMs (large language models) to produce gender bias, among other equally distressing concerns including racial stereotyping and homophobia.

Artificial intelligence (AI) has been transforming our world at an unprecedented pace, but a concerning pattern has emerged alongside it: these systems often demonstrate significant gender bias. As automated decisions become the norm across industries such as finance, criminal justice, employment, and healthcare, this bias threatens the hard-won progress the world has made towards gender equality.

What Is AI Gender Bias, and Why Does It Matter?

Imagine training a model to make hiring decisions by feeding it examples from the past. If a majority of those examples carry conscious or unconscious bias – say, they show most men as scientists or professionals and most women as nurses or in vocational roles – the AI may infer that one gender is better suited to specific roles, and so make biased decisions when filtering job applications.

This is gender bias in AI: the model treats people differently based on their gender because that’s what it learned from the (biased) data it was trained on. The gender gap due to AI is real, and it’s been hurting women.
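The hiring example above can be sketched in a few lines. This is a deliberately naive toy model with made-up numbers, not any real recruitment system: it simply learns which gender most often held each role in (hypothetical) historical data, and that alone is enough to score women lower for “scientist”.

```python
from collections import Counter

# Hypothetical historical hiring records as (gender, role) pairs, reflecting
# past bias: most "scientist" examples are men, most "nurse" examples women.
history = ([("M", "scientist")] * 80 + [("F", "scientist")] * 20 +
           [("F", "nurse")] * 85 + [("M", "nurse")] * 15)

# A naive "model": just count who held each role in the past.
counts = Counter(history)

def predicted_fit(gender, role):
    """Fraction of past holders of `role` who shared this gender."""
    total = sum(c for (g, r), c in counts.items() if r == role)
    return counts[(gender, role)] / total

print(predicted_fit("F", "scientist"))  # 0.2 -- low purely because of biased history
print(predicted_fit("M", "scientist"))  # 0.8
```

Nothing in the data says women are worse scientists; the skew in the examples alone produces the skewed score, which is exactly how a more sophisticated model picks up the same pattern.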


The Origins: Where It All Started

It’s clear as day that AI systems don’t develop biases spontaneously; they learn them from the data they’re trained on. At its core, AI is a set of data-driven technologies that enables computers to perform complex tasks far faster than humans. Since the enormous datasets these systems analyse often reflect societal prejudices and historical inequalities, the outcomes are inevitably biased.

What’s troubling is what happens with the increasing use of synthetic, or artificially generated, data. It might help overcome data shortages, but it also magnifies existing prejudices. Moreover, synthetic data usually lacks transparency, making it difficult to pinpoint and address embedded biases and creating a dangerous feedback loop.

Biased AI systems make prejudiced decisions, those decisions generate even more biased data, and that data in turn trains still more biased systems. As AI becomes increasingly embedded in decision-making across sectors, this cycle can systematically exclude women from opportunities and reinforce gender stereotypes.
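The feedback loop can be made concrete with a toy simulation. The starting share and the selection rule below are invented for illustration; the point is only the dynamic: a selector that favours whichever group dominated past data drives the minority share towards zero with each retraining round.

```python
# Hypothetical starting point: women are 30% of the historical training data.
women_share = 0.30

for round_num in range(1, 6):
    # A "winner-take-more" selector: picks each group a bit more often than
    # its share, amplifying whichever group already dominates. Each round's
    # output becomes the next round's training data.
    women_share = women_share**2 / (women_share**2 + (1 - women_share)**2)
    print(f"round {round_num}: women's share of hires = {women_share:.4f}")
```

Run it and the share collapses round after round: the bias is not static but compounds, which is why an initially modest imbalance can end in near-total exclusion.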

Real-World Impact

When a gender bias study analysed 100 images produced by the text-to-image generator Midjourney, the results were striking. When asked for specialised job titles and roles, for instance, it showed both older and younger people, but the older people were always men. This implicitly reinforced several biases, including the assumption that specialised work is the domain of older men, while women belong to less specialised work.

Midjourney may be a fun tool, but the repercussions of gender bias in AI are far-reaching. In healthcare, for instance, AI bias manifests as discrepancies in diagnosis and treatment. AI diagnostic tools are often less accurate for women, especially women of colour, because they have largely been trained on male patient data. This has serious consequences for women’s health outcomes, especially in maternal health, where existing disparities are already pronounced.

The financial sector’s increasing reliance on AI for credit decisions has created more barriers for women seeking financial independence. When AI lending algorithms deny women credit or offer them less favourable terms, they severely limit women’s ability to secure capital for entrepreneurship, homeownership, and education, all of which are crucial for financial security.

The most glaring biases, however, are perhaps in employment. AI recruitment tools frequently exhibit what experts now call the “mom penalty”: these systems interpret employment gaps as negative indicators rather than recognising the valuable skills women develop during career breaks for childcare, effectively penalising them.

In fact, Amazon built an AI recruitment tool that favoured resumes from male candidates, and eventually discontinued it. Worse, AI hiring algorithms often favour candidates with traits historically associated with male leadership, further limiting advancement opportunities for women, especially in traditionally male-dominated fields such as executive management, finance, and technology.

That’s not all. LLMs like BERT and GPT often associate jobs like “scientist” with men and “nurse” with women. And have we ever wondered why voice assistants default to female voices? It reinforces the stereotype that women are better suited for service roles.
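How do researchers detect such job–gender associations in the first place? Techniques like embedding-association tests on real models are involved, but the core idea can be shown with a much-simplified stand-in: count how often each job word co-occurs with gendered pronouns in the training text. The six-sentence corpus below is invented for illustration.

```python
# Toy corpus (hypothetical); a real LLM is trained on billions of sentences
# with the same kind of skew.
corpus = [
    "he is a brilliant scientist", "the scientist said he would publish",
    "she works as a nurse", "the nurse said she was on shift",
    "he joined the lab as a scientist", "she is a caring nurse",
]

MALE, FEMALE = {"he", "his", "him"}, {"she", "her", "hers"}

def association(job):
    """Return (male, female) co-occurrence counts for `job` in the corpus."""
    male = sum(1 for s in corpus if job in s and MALE & set(s.split()))
    female = sum(1 for s in corpus if job in s and FEMALE & set(s.split()))
    return male, female

print(association("scientist"))  # (3, 0): only ever near male pronouns
print(association("nurse"))      # (0, 3): only ever near female pronouns
```

A model trained on text like this has no way to represent “scientist” without also absorbing its statistical tie to male words, which is why the association surfaces later in the model’s outputs.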

Can AI Be Better?

Yes, AI inherits gender bias from its data. However, it has also shown significant potential in identifying and addressing gender inequalities across industries.

For instance, AI is helping overcome long-standing gender biases in credit scoring, with Zest AI using machine learning (ML) to make fairer credit assessments. AI has also helped reveal gender disparities in enrolment rates on platforms such as edX and Coursera, uncover biases in textbooks, track gender representation in leadership roles, and find ways to address inequalities.

AI has even helped uncover gender pay gaps in the workforce, with tools like Glassdoor now showing gender-based differences in salaries. Why that pay gap exists at all is another question entirely, and closing it is the fight we must keep fighting to end gender inequity.


Malavika Madgula is a writer and coffee lover from Mumbai, India, with a post-graduate degree in finance and an interest in the world. She can usually be found reading dystopian fiction cover to cover. Currently, she works as a travel content writer and hopes to write her own dystopian novel one day.

© Copyright Sify Technologies Ltd, 1998-2022. All rights reserved