I’m regularly asked about bias in artificial intelligence, usually in a tone that suggests AI should be dismissed because of it. The line of questioning generally stems from stories of racial, gender, and age bias.
For example, when AI is used in hiring, a company uses all its historical data to train a model on who should be hired and why. The algorithm might then pick a white, middle-aged man to fill a vacancy based on the fact that other white, middle-aged men were previously hired to the same position, and subsequently promoted.
However, what people often fail to understand is that the reason he was hired, and promoted, was not because he was a white, middle-aged man, but because he was good at the job. It’s easy to want to blame the AI for making a biased recommendation, when in fact, it was the right call.
That’s not to say AI bias isn’t real. It is a phenomenon that occurs when an algorithm produces results that are systematically prejudiced due to erroneous assumptions in the machine learning process. This can stem from limitations in the data or flaws in the model.
However, more often than not, what’s really wrong is that ingrained biases in society have led to unequal outcomes in the workplace, and that isn’t something you can fix with an algorithm.
Ridding machines of inbuilt biases
There are two main ways that bias shows up in training data: either the data you collect is unrepresentative of reality, or it reflects existing prejudices. The first case might occur, for example, if a deep-learning algorithm is fed more photos of light-skinned faces than dark-skinned faces. The resulting face recognition system would inevitably be worse at recognising darker-skinned faces.
The second case is precisely what happened when Amazon discovered that its internal recruiting tool was dismissing female candidates. Because it was trained on historical hiring decisions, which favoured men over women, it learnt to do the same.
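The first kind of imbalance, unlike ingrained social prejudice, can at least be checked mechanically before training begins. Here is a minimal sketch of such an audit; the dataset, the `skin_tone` attribute, and the `audit_representation` helper are all invented for illustration, not taken from any real face-recognition pipeline:

```python
from collections import Counter

def audit_representation(samples, group_key):
    """Return the share of each demographic group in a dataset.

    `samples` is a list of dicts; `group_key` names a hypothetical
    demographic attribute recorded per sample (e.g. "skin_tone").
    """
    counts = Counter(sample[group_key] for sample in samples)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

# Invented, illustrative data: a face dataset skewed toward one group.
dataset = (
    [{"skin_tone": "light"}] * 80 +
    [{"skin_tone": "dark"}] * 20
)

shares = audit_representation(dataset, "skin_tone")
# shares == {"light": 0.8, "dark": 0.2} -- an imbalance worth flagging
# before a model is trained on the data, not after it is deployed.
```

A check like this only catches the first failure mode. The second one, prejudiced historical labels, survives any amount of demographic balancing, because the problem is in what the labels reward, not in who appears in the data.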
Algorithms can have built-in biases because they are created by individuals who have conscious or unconscious preferences that may go undiscovered until the algorithms are used, and potentially amplified, publicly. This often happens because the researchers never assessed their own cognitive biases before encoding their judgements into the system.
Cognitive bias is a limitation in objective thinking caused by the tendency for your brain to perceive information through a filter of personal experience and preferences. Types of such bias that can be inadvertently applied to algorithms are stereotyping, bandwagon effects, confirmation bias, priming and selective perception.
Of course, we want to make sure that the way AI processes data doesn’t end up harming groups of people, but I want to focus on a far more prevalent issue: management bias. Admit it, you are biased, and as hard as you try to keep it from affecting your decision making, it still does.
The fact is, all human beings are biased. Hiring and promoting based purely on meritocracy might be the ideal scenario, but we know that to be unrealistic. One-third of bosses decide if they like a candidate in the first 90 seconds, citing eye contact and how a person looks and dresses as justification for their suitability for a role. Are the way people look and the clothes they wear criteria for success, or is judging people in those terms a demonstration of a bias?
The question I have for you is, “Who’s more biased, the AI or the human leader?” And I hope you indict the human over the machine.
Every day, you fall into cognitive traps such as availability bias (the tendency to substitute available data for representative data); familiarity bias (the tendency to overvalue things we already know); and confirmation bias (the tendency to think new information proves our existing beliefs).
If you asked me to pick between AI bias and human bias, I’d pick the machine every time. Why? Because once AI bias is identified, it becomes predictable and more easily corrected.
Human bias is complex, hard to understand and unpredictable.
Tommy Weir is the CEO of enaible: AI-powered Leadership and author of “Leadership Dubai Style”. Contact him at firstname.lastname@example.org