Unconscious Bias: Can AI Help?

Emma Brine
09.07.2024


For years, ‘unconscious bias’ was understood to be a distinctly ‘human’ problem. Now, with the increasing uptake of AI, and suggestions that it could solve our bias problem, understanding where human influence in AI ends is more important than ever.

Can AI help?

Unconscious bias is a key term in conversations concerning Equality, Diversity and Inclusion (ED&I). It refers to unintentional forms of discrimination or stereotyping based on a person’s characteristics, such as race, sexuality, gender, religion, or age, and it can often lead to unintentional favouring or disfavouring of certain groups or outcomes.  

As unconscious bias stems from negative stereotypes we may have absorbed without realising, many have suggested that AI could play a part in removing bias. After all, it’s not human, doesn’t have opinions, and should be entirely data-driven.

Corporations are beginning to use AI in this way more and more, particularly in recruitment, marketing and PR. That’s because AI can provide key insights into audiences and candidates, as well as analyse our own work for bias we may not necessarily see.

For example, multiple corporations have used AI in their recruitment process, specifically to write job descriptions. The technology can help identify potentially problematic terms and phrases and provide inclusive alternatives. This includes removing gendered language from descriptions typically tailored towards one gender, challenging those stereotypes and opening up opportunities to people outside the usual target group.
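For the technically curious, here is a minimal sketch of the wordlist approach such tools often take. The coded terms and suggested alternatives below are purely illustrative, not taken from any specific product:

    # Minimal sketch of a wordlist-based bias checker.
    # The coded terms and alternatives here are illustrative only.
    GENDER_CODED = {
        "ninja": "expert",
        "rockstar": "high performer",
        "aggressive": "proactive",
        "dominant": "leading",
        "chairman": "chairperson",
    }

    def flag_biased_terms(job_description: str) -> list[tuple[str, str]]:
        """Return (term, suggested alternative) pairs found in the text."""
        words = job_description.lower().split()
        return [(w, GENDER_CODED[w]) for w in words if w in GENDER_CODED]

    for term, alternative in flag_biased_terms("Seeking an aggressive rockstar developer"):
        print(f"Consider replacing '{term}' with '{alternative}'")

Real products are far more sophisticated, but many build on this same idea of matching text against curated word lists.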

Is it really that simple?

Unfortunately, when it comes to AI, nothing is. 

As with anything else human-made, AI can easily pick up our own unconscious biases. It’s therefore incredibly important to question whether the data used to train it was entirely neutral.

I recently had a conversation with AI, where I tested its capabilities from an ED&I perspective. I asked a popular AI tool to create a variety of football kits based on different themes. Whilst the themes were all dealt with very respectfully, one issue cropped up in every single kit it designed for me: every one was for a male body.

Have a look:

[Image: AI-generated football kit designs, each modelled on a male body]

Interestingly, the designs make it clear that the data the AI has gathered about football reflects the unconscious bias many of us hold around football being a ‘men’s sport’. After all, the cold, hard data likely shows that men typically engage with football more than other genders: men’s sport is more widely broadcast, has a longer history, and returns more search results online.

With this information, the AI has created kits suited to the biggest target market, because that is how it understands kits are typically designed. In doing so, it perpetuates the biased idea that football kits should first and foremost be for men.

What can we do?

Knowing that the data underlying AI could hinder the ED&I process complicates the role AI can play in helping to detect or remove unconscious bias.

For example, if you ask AI to recruit candidates for a job in STEM, it would analyse information online about “good” candidates, learn that men are typically more likely to be hired in this field, and so search for male candidates, not understanding that women have been systematically alienated from this career area. The toy sketch below shows how this kind of feedback loop arises.
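Here is a deliberately simplified illustration of that mechanism. The data and scoring logic are fabricated for the example, not drawn from any real recruitment system:

    # Toy illustration: a model that naively learns from biased historical
    # hiring data reproduces that bias in its recommendations.
    historical_hires = [
        {"gender": "male", "hired": True},
        {"gender": "male", "hired": True},
        {"gender": "male", "hired": True},
        {"gender": "female", "hired": False},  # reflects past exclusion, not ability
    ]

    def hire_rate(gender: str) -> float:
        relevant = [h for h in historical_hires if h["gender"] == gender]
        return sum(h["hired"] for h in relevant) / len(relevant)

    def score_candidate(candidate: dict) -> float:
        # Past hire rate is treated as a proxy for candidate quality;
        # this is exactly the flawed inference described above.
        return hire_rate(candidate["gender"])

    print(score_candidate({"name": "Alex", "gender": "male"}))   # 1.0
    print(score_candidate({"name": "Sam", "gender": "female"}))  # 0.0

Nothing in this code is malicious; the bias comes entirely from the historical data it was given.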

All that being said, it doesn’t mean we should remove AI from the equation altogether. It can still help! In many ways, AI is simply an extension of ourselves: like a human that doesn’t need a lunch break or eight hours of sleep, but one that does require constant education and training in order to evolve into a better version of itself.

And, as shown in the conversation pictured above, AI is able to provide information and content that can benefit the ED&I agenda – we just need to instruct it accordingly.

What does ‘good’ ED&I prompting look like?

Once you understand AI’s particular pitfalls, you can navigate around them. In ED&I spaces, that often means remaining mindful of the need to push back on responses, whether to widen the representation within them or to challenge a stereotype.

It comes down to the difference between ‘equality’ and ‘equity’. On the top line, AI may help with equality – treating everyone the same, as though the playing field were already level – but it hinders equity, as it isn’t built to lift marginalised communities up.

When prompting for ED&I purposes, consider these three tips (a combined example follows the list):

  • Be specific
    “Create a unisex football kit based on traditional metal music graphics; include kits for all ages, genders, and abilities.”
  • Indicate tone and feeling
    “The results should be playful.”
  • Suggest a source
    “Take inspiration from the latest England football kits.”
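Putting all three together, a single prompt might look something like this (an illustrative example rather than a guaranteed recipe):

“Create a unisex football kit based on traditional metal music graphics, with designs for all ages, genders, and abilities. The results should be playful. Take inspiration from the latest England football kits.”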

The more considered and aware we are when prompting, the more useful AI will be.