According to reports, Instagram has started testing a new ‘Nudity protection’ feature to shield users from unwanted nude content.
App researcher Alessandro Paluzzi has uncovered another Instagram feature test for us this week. This time, it’s a new ‘Nudity protection’ feature that helps shield the app’s users from unwanted nude content sent to them via DM.
The ‘Nudity protection’ feature is a filter that detects and blocks potentially nude content in users’ Instagram Direct messages by activating iOS’s on-device nudity detection and scanning images in messages directly on your device.
When it detects objectionable content, it blurs the image to shield users from an unwelcome photo of someone’s privates (or worse). The filter can be turned off at any time in the app’s settings, and you can choose whether or not to view (and unblur) a blocked image.
Apart from the obvious protection it offers, the feature is also “privacy-centric,” at least as far as Instagram is concerned. The platform doesn’t analyze any images itself – everything takes place on your device. Of course, that potentially means Apple is analyzing your images instead.
Apple has said that it doesn’t upload images and that the analysis takes place locally (on your device) using machine learning. That’s comforting.
Either way, the new feature is a good thing if it saves you from seeing something you’d rather not have seen. Right?