Google’s AI tool is no longer going to label people in photos as man or woman
Because you “can’t deduce someone’s gender by their appearance alone”

According to a Business Insider report, Google's Cloud Vision API will no longer label photos of people as 'man' or 'woman'. Cloud Vision API is an AI-powered tool that helps developers identify objects and other components in an image.
Google sent an email to Cloud Vision API customers stating that the tool will no longer attach gender labels to pictures. In the email, Google said it decided to discontinue gender labels because "you can't deduce someone's gender by their appearance alone" and because using such labels could reinforce unethical uses of AI. Instead, an individual in a photo will now simply be tagged as 'person'.
AI bias expert Frederike Kaltheuner, speaking to Business Insider, called this change "very positive" stating that "classifying people as male or female assumes that gender is binary. Anyone who doesn't fit it will automatically be misclassified and misgendered. So this is about more than just bias -- a person's gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people."
Google also noted in the email that it plans to continue evolving its AI to ensure that people are not discriminated against on the basis of gender, race, ethnicity, income, or religious belief.
