People who send unsolicited nudes in D.M.’s are the scum of the earth. Their behavior is one of the many things that make using social media platforms like Instagram disturbing, especially for women. Instagram is finally working on a solution that might put an end to it.
Until now, the only way to deal with unsolicited nudes was to report the sender to Instagram. We all know how well that usually worked. Now the company is building a way to filter out unsolicited nude photos sent over direct messages.
The discovery was made by Alessandro Paluzzi, an app researcher, earlier this week. According to a tweet that he posted, Instagram was working on technology that would cover up photos that may contain nudity but noted that the company would not be able to access the images itself.
Instagram has also confirmed the feature to several publications. The Meta-owned company has said the feature is in the early stages of development and is not being tested yet.
“We’re developing a set of optional user controls to help people protect themselves from unwanted D.M.’s, like photos containing nudity,” Meta spokesperson Liz Fernandez told a publication. “This technology doesn’t allow Meta to see anyone’s private messages, nor are they shared with us or anyone else. We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” she added.
Screenshots of the feature posted by Paluzzi suggest that Instagram will process all images for this feature on the device, so nothing is sent to its servers. Plus, you can choose to see the photo if you think it’s from a trusted person. When Instagram rolls the feature out widely, it will be an optional setting for users who want to weed out messages with nude photos.
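To make the reported design concrete, here is a minimal sketch of what an on-device flow like this could look like. Everything in it is an assumption: Instagram has published no implementation details, so the classifier, the threshold, the blur, and all function names below are hypothetical stand-ins, not the company's actual code.

```python
def classify_nudity(image):
    """Stand-in for an on-device ML model returning a 0..1 score.
    Here mean brightness of a grayscale image is used as a placeholder
    signal; a real app would run a trained classifier on the device."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / (255 * len(pixels))

def box_blur(image):
    """Naive 3x3 box blur over a 2D grayscale image (list of lists).
    A production app would blur on the GPU instead."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out

def receive_dm_photo(image, threshold=0.7, reveal=False):
    """Score and (optionally) blur entirely on the device, so the
    original bytes never reach a server. `reveal=True` models the user
    tapping through to see a photo from a trusted sender.
    Returns (image_to_show, was_blurred)."""
    if classify_nudity(image) >= threshold and not reveal:
        return box_blur(image), True
    return image, False
```

The key design point the sketch mirrors is that both the scoring and the blurring happen before anything is displayed or uploaded, which is what lets Meta claim it cannot see the private messages.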
Last year, Instagram launched D.M. controls with keyword-based filters for abusive words, phrases, and emojis. Earlier this year, the company introduced a “Sensitive Content” filter that keeps certain kinds of content, including nudity and graphic violence, out of users’ feeds.
Social media platforms have long struggled with the problem of unsolicited nude photos. While some apps, like Bumble, have tried tools such as AI-powered blurring, Twitter has struggled to catch child sexual abuse material (CSAM) and non-consensual nudity at scale.
Because platforms have taken so few concrete steps, lawmakers have begun to tackle the issue themselves. The U.K.’s upcoming Online Safety Bill aims to make cyberflashing a crime. Last month, California passed a law that allows recipients of unsolicited graphic material to sue the senders. Texas passed a cyberflashing law in 2019, classifying the act as a misdemeanor punishable by a fine of up to $500.