Bad moral code? When algorithms have bigoted world views


TECH | Sunday, 07 Jul 2019

A visitor to the Cebit expo in Hamburg pictured in front of a light display. — dpa

Technology is supposed to help humans be more productive, and algorithms are taking all kinds of tasks out of our hands. But when algorithms go wrong, it can be a real horror story.

Like when an algorithm meant to help with Amazon's hiring process recommended only male applicants. Or the times when Google's image recognition software kept mixing up black people with gorillas and telling Asian people to open their eyes.

So what's up with that? Can algorithms be prejudiced?

Lorena Jaume-Palasi, founder of the Ethical Tech Society in Berlin, says it's more complicated than that. "People are always the reason for discrimination," she says.

"Instead of trying to regulate the reasons discrimination exists, we are focusing on the technology, which just mirrors discriminatory practices," she says.

Algorithms are instructions on how to solve a particular problem. They tell the machine: This is how to do this thing. Artificial intelligence (AI) is based on algorithms.

AI mimics intelligent behaviour: the machine is instructed to make informed decisions. To do that successfully, it needs large amounts of data, which it uses to recognise patterns and to base its decisions on those patterns.
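A deliberately simplified sketch, using made-up numbers and a hypothetical "years of experience" rule, illustrates the difference between an algorithm a programmer writes by hand and a rule the machine derives from past examples:

```python
# Illustrative sketch only: an "algorithm" is an explicit set of steps,
# while a learning system derives its own rule from example data.

# Explicit algorithm: the programmer writes the rule by hand.
def shortlist(applicant):
    # Hypothetical rule: invite anyone with at least 3 years of experience.
    return applicant["years_experience"] >= 3

# Pattern learning: the rule is inferred from past examples instead.
past_examples = [
    ({"years_experience": 5}, True),
    ({"years_experience": 1}, False),
    ({"years_experience": 4}, True),
    ({"years_experience": 2}, False),
]

# The "learned" rule here is just the midpoint between the average experience
# of accepted and rejected applicants - a toy stand-in for machine learning.
accepted = [a["years_experience"] for a, label in past_examples if label]
rejected = [a["years_experience"] for a, label in past_examples if not label]
threshold = (sum(accepted) / len(accepted) + sum(rejected) / len(rejected)) / 2

def learned_shortlist(applicant):
    return applicant["years_experience"] >= threshold

print(shortlist({"years_experience": 3}))          # True
print(learned_shortlist({"years_experience": 3}))  # True (threshold is 3.0)
```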

This is one explanation for why algorithms can turn out so nasty: Often, they are making decisions based on old data.

"In the past, companies did have employment practices that favoured white men," says Susanne Dehmel from Bitkom. If you train an algorithm using this historic data, it will choose candidates that fit that bill.

When it comes to racist photo recognition software, the algorithm itself was likely not at fault either; rather, the choice of images used to train the machine may have been problematic in the first place.

Now, there is a positive side to all this: The machines are holding a mirror up to human society, and showing us a pretty ugly picture. Clearly, discrimination is a big problem.

One solution is for tech companies to take more of an active role in what algorithms spit out, and correct behaviours when needed.

This has already been done. For example, when US professor Safiya Umoja Noble published her book Algorithms Of Oppression, in which she criticised the fact that Google's search results for the term "black girls" were extremely racist and sexist, the tech giant decided to make some changes.

We need to ask how we can ensure that AI technologies make better and fairer decisions in the future. Dehmel says there needn't be any government regulation.

"It is a competency problem. When you understand how the technology works, then you can counter discrimination carefully," she says.

Past examples have already shown that it isn't enough to simply remove information about gender and race: the algorithms can still pick up discriminatory patterns through other details that act as proxies, and end up producing the same results. Instead, Dehmel suggests developers create diverse data sets, and conduct careful trials before training the machines.
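A similar synthetic sketch suggests why dropping the gender column alone may not help: a hypothetical proxy feature that happens to correlate with gender (for example, a gendered keyword on a CV) can let the model rebuild the same pattern.

```python
# Synthetic example: the gender column is removed, but a correlated proxy
# feature still lets the model discriminate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

gender = rng.integers(0, 2, n)  # 0 = male, 1 = female
skill = rng.normal(0, 1, n)
# Proxy feature strongly correlated with gender (a stand-in for, say, a
# gendered hobby or keyword appearing mostly on one group's CVs).
proxy = (gender == 1).astype(float) * 0.9 + rng.normal(0, 0.1, n)

# Historical labels still reflect the old, biased practice.
hired = (skill + 1.5 * (gender == 0) + rng.normal(0, 0.5, n)) > 1.0

# Train WITHOUT the gender column - only skill and the proxy remain.
model = LogisticRegression().fit(np.column_stack([skill, proxy]), hired)

# Identical skill, but a proxy value typical for women lowers the score.
print(model.predict_proba([[1.0, 0.0]])[0, 1])  # proxy looks like a male CV
print(model.predict_proba([[1.0, 0.9]])[0, 1])  # proxy looks like a female CV
```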

Jaume-Palasi believes continuous checks on algorithmically based systems are necessary, and AI should be created by more than just a developer and a data scientist.

"You need sociologists, anthropologists, ethnologists, political scientists. People who are better at contextualising the results that are being used across various sectors," she says.

"We need to move away from the notion that AI is a mathematical or technological issue. These are socio-technological systems, and the job profiles we need in this field need to be more diverse." – dpa
