Google AI tool to no longer label people in photos as ‘man’ or ‘woman’


  • AI
  • Monday, 24 Feb 2020

The Google AI tool used to label content in images will no longer attach gender tags to people. — AFP Relaxnews

On Feb 20, Business Insider reported that Google's Cloud Vision API, an AI-powered tool developers use to identify components in an image such as faces, objects, brand logos, and landmarks, will no longer attach gender labels like "man" or "woman" to people pictured in an image. Google announced the change last week in an email to its Cloud Vision API customers.

According to the email, as reported by Business Insider, Google said the practice was being discontinued because "you can't deduce someone's gender by their appearance alone" and attempting to do so would run counter to ethical use of AI. Instead, individuals will simply be tagged as a "person".

Speaking with Business Insider, AI bias expert Frederike Kaltheuner described the change as "very positive", stating that "Classifying people as male or female assumes that gender is binary. Anyone who doesn't fit it will automatically be misclassified and misgendered. So this is about more than just bias – a person's gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people."

Google noted in the email that it intends to continue evolving its AI to ensure that people are not discriminated against based on gender, nor on factors such as race, ethnicity, income, or religious belief. – AFP Relaxnews

