Instagram Plans to Protect Users From Unsolicited Nude Photos
Instagram is developing a feature that will allow users to block unsolicited nude photos in their direct messages.
“Nudity protection” will be an optional privacy setting, similar to the Hidden Words feature launched last year, which filters out messages containing abusive language and emojis. Using machine learning, Instagram will detect and block nude images before they are delivered, and the company says it won't view or store any of the images.
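To make the described flow concrete, here is a minimal, hypothetical sketch of a client-side filter that scores an incoming image and hides it behind an opt-in setting before it is shown. The classifier stub, threshold, and function names are illustrative assumptions, not Instagram's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical cutoff; a real system would tune this against a trained model.
NUDITY_THRESHOLD = 0.8


def nudity_score(image_bytes: bytes) -> float:
    """Stub for an on-device ML classifier returning P(image contains nudity)."""
    return 0.0  # placeholder so the sketch runs; a real model would inspect the pixels


@dataclass
class IncomingImage:
    sender: str
    data: bytes


def present_image(msg: IncomingImage, nudity_protection_enabled: bool) -> dict:
    """Decide on the recipient's device how to display an image DM.

    The check happens locally, consistent with the report that the company
    would not view or store the images.
    """
    if nudity_protection_enabled and nudity_score(msg.data) >= NUDITY_THRESHOLD:
        # Hide the image and let the recipient choose whether to reveal it.
        return {"sender": msg.sender, "status": "hidden", "action": "tap_to_reveal"}
    return {"sender": msg.sender, "status": "shown"}


if __name__ == "__main__":
    msg = IncomingImage(sender="example_user", data=b"...")
    print(present_image(msg, nudity_protection_enabled=True))
```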
The feature, first reported by The Verge, addresses persistent complaints of abuse on the social media app, which is owned by Meta Platforms Inc. Some 41% of Americans say they have experienced online harassment, and 79% believe social media companies are doing a fair or poor job of addressing such problems, according to a report last year from the Pew Research Center, which surveyed more than 10,000 adults in September 2020. One-third of women under 35 reported being sexually harassed online, compared with 11% of men in the same age range, according to the report.
Nudity protection on Instagram is still in the early stages of development. “We're working closely with experts to ensure these new features preserve people's privacy, while giving them control over the messages they receive,” a Meta spokesperson said.