New Tools to Help Protect Against Sextortion and Intimate Image Abuse

We're testing new features to help protect young people from sextortion and intimate image abuse, and to make it more difficult for potential scammers and criminals to find and interact with teens. We're also testing new ways to help people spot potential sextortion scams, encourage them to report, and empower them to say no to anything that makes them feel uncomfortable. We've started sharing more signals about sextortion accounts with other tech companies through Lantern, helping to disrupt this criminal activity across the internet.

While people overwhelmingly use DMs to share what they love with their friends, family, or favorite creators, sextortion scammers may also use private messages to share or ask for intimate images. To help address this, we'll soon start testing our new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity and encourages people to think twice before sending nude images. This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return. Nudity protection will be turned on by default for teens under 18 globally, and we'll show a notification to adults encouraging them to turn it on.

When nudity protection is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they've changed their mind. Anyone who tries to forward a nude image they've received will see a message encouraging them to reconsider. When someone receives an image containing nudity, it will be automatically blurred under a warning screen, so the recipient isn't confronted with a nude image and can choose whether or not to view it. We'll also show them a message encouraging them not to feel pressured to respond, with options to block the sender and report the chat.

Nudity protection uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity. Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won't have access to these images unless someone chooses to report them to us.
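To make the on-device flow concrete, here is a minimal, purely illustrative sketch in Python. The actual classifier is a proprietary on-device ML model that is not public, so `classify_nudity`, `NUDITY_THRESHOLD`, and the `IncomingImage` fields below are all hypothetical stand-ins; the point is only to show how classification can happen locally, so the image never leaves the device even in an end-to-end encrypted chat.

```python
from dataclasses import dataclass

# Hypothetical confidence threshold above which an image is treated as nudity.
NUDITY_THRESHOLD = 0.8


@dataclass
class IncomingImage:
    """Illustrative received-image state (not a real Instagram data structure)."""
    data: bytes
    blurred: bool = False        # hidden behind a warning screen?
    warning_shown: bool = False  # safety message with view / block / report options?


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML classifier.

    A real implementation would run a local model over the decoded image and
    return a confidence score. Because inference happens on the device, the
    image is never uploaded for analysis, which is what lets the feature work
    inside end-to-end encrypted chats. Here we just return a fixed score.
    """
    return 0.95  # placeholder score for illustration only


def receive_image(image_bytes: bytes) -> IncomingImage:
    """Apply the nudity-protection flow to an incoming DM image."""
    img = IncomingImage(data=image_bytes)
    if classify_nudity(img.data) >= NUDITY_THRESHOLD:
        img.blurred = True        # auto-blur under a warning screen
        img.warning_shown = True  # recipient can still choose to view it
    return img


received = receive_image(b"\x89PNG...fake image bytes")
print(received.blurred, received.warning_shown)
```

With the stub classifier above, every image is flagged and blurred; in practice the decision would come from the local model's score, and an unflagged image would simply be delivered normally.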