Opinion: Use of algorithms can perpetuate bias

For most of us, the word “algorithm” is fairly new to our vocabulary. But badly designed decision-making algorithms have a growing impact on our lives and can do a great deal of damage.

Simply put, an algorithm is a set of instructions used by computer systems to perform a task or make a decision. On social media platforms, for example, algorithms decide what ads appear based on what content a user looks at, likes or shares.

As we discovered in a new Greenlining Institute report on algorithmic bias, these algorithms may be used to decide everything from whether someone gets a job interview or mortgage, to how heavily one’s neighbourhood is policed.

“Poorly designed algorithms,” we wrote, “threaten to amplify systemic racism by reproducing patterns of discrimination and bias that are found in the data algorithms use to learn and make decisions.”

Algorithms can be put to good use, such as helping manage responses to the COVID-19 pandemic, but things can also go seriously wrong. Sometimes, algorithms replicate the conscious or unconscious biases of the humans who designed them, disadvantaging whole groups of people, often without them even knowing it’s happening.

Like humans, algorithms “learn” – in the latter case through what’s called training data, which teaches the algorithm to look for patterns in bits of information. That’s where things can start to go wrong.

Consider a bank whose historical lending data shows that it routinely gave higher interest rates to people in a ZIP code with a majority of Black residents. An algorithm trained on that biased data could learn to overcharge residents in that area.
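The lending example above can be sketched in a few lines of code. This is a deliberately oversimplified, hypothetical illustration (the ZIP codes and rates are made up, and real credit models are far more complex): a "model" that learns interest rates from historical records will reproduce whatever bias those records contain.

```python
# Hypothetical historical records: (zip_code, interest_rate).
# Suppose ZIP 94601 is a majority-Black area that was routinely overcharged.
history = [
    ("94601", 7.9), ("94601", 8.1), ("94601", 8.0),
    ("94602", 5.1), ("94602", 4.9), ("94602", 5.0),
]

def train(records):
    """'Learn' a rate per ZIP by averaging the historical rates."""
    by_zip = {}
    for zip_code, rate in records:
        by_zip.setdefault(zip_code, []).append(rate)
    return {z: sum(rates) / len(rates) for z, rates in by_zip.items()}

def predict(model, zip_code):
    """Quote the rate the model learned for this ZIP."""
    return model[zip_code]

model = train(history)
print(predict(model, "94601"))  # 8.0 - the historical overcharge, now automated
print(predict(model, "94602"))  # 5.0
```

No one told this model to discriminate; it simply optimized against a record of past discrimination, which is exactly the failure mode the report describes.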

In 2014, Amazon tried to develop a recruiting algorithm to rate the resumes of job candidates and predict who would do well. But, even though gender was not intended as a factor in the algorithm, it still favoured men and penalised resumes that included the names of all-women’s colleges. This likely happened because Amazon had a poor record of hiring and promoting women, causing the training data used for the algorithm to repeat the pattern.
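The Amazon case shows how a model can penalise a group even when the protected attribute is removed: another feature acts as a proxy for it. A minimal hypothetical sketch (the resumes and outcomes here are invented, not Amazon's actual data or method) makes the mechanism visible:

```python
# Past (biased) hiring outcomes: 1 = hired, 0 = not hired.
# Gender never appears, but "women's college" correlates with it.
past_resumes = [
    ({"women's college", "debate team"}, 0),
    ({"women's college"}, 0),
    ({"chess club"}, 1),
    ({"chess club", "debate team"}, 1),
]

def keyword_scores(data):
    """Score each keyword by the average outcome of resumes containing it."""
    seen = {}
    for keywords, hired in data:
        for kw in keywords:
            seen.setdefault(kw, []).append(hired)
    return {kw: sum(v) / len(v) for kw, v in seen.items()}

scores = keyword_scores(past_resumes)
print(scores["women's college"])  # 0.0 - penalised purely via the proxy
print(scores["chess club"])       # 1.0
print(scores["debate team"])      # 0.5 - neutral keyword, neutral score
```

Removing the proxy keyword would not fix the underlying problem, because other correlated features can take its place; that is why Amazon's engineers ultimately scrapped the tool rather than patch it.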

Happily, Amazon’s researchers caught the problem and, when they found they couldn’t fix it, scrapped the algorithm. But how many such situations have gone unnoticed and uncorrected? No one knows.

Worse, our laws have not caught up with this new, insidious form of discrimination. While both US federal and state governments have anti-discrimination laws, they’re ineffective in this situation, since most were written before the Internet was even invented. And proving algorithmic bias is difficult since the people being discriminated against may not know why or how the decision that harmed them was made.

Our anti-discrimination laws must be updated to properly regulate algorithmic bias and discrimination, with provisions to promote transparency. California’s legislature is leading the way by considering legislation that would bring more transparency and accountability to algorithms used in government programmes.

Government at all levels should pay much more attention to this new, insidious form of discrimination. – Progressive Media Project/Tribune News Service

(Vinhcent Le is technology equity legal counsel at The Greenlining Institute. This column was produced for the Progressive Media Project, which is run by The Progressive magazine, and distributed by Tribune News Service.)
