
Meta’s new tool allows teens to remove ‘nude’ photos, videos from internet


Meta has introduced a new tool for teenagers that will allow them to remove ‘nude’, ‘partially nude’, or ‘sexually explicit’ images from Instagram and Facebook that were uploaded in the past.


Meta’s new tool is called “Take It Down” and is operated by the National Center for Missing and Exploited Children (NCMEC).

Take It Down is a free service that lets teens or their parents anonymously submit photos or videos they fear might be uploaded to the internet or that have already been distributed online.

Such photos can be selected in a web-based tool that converts the images into digital fingerprints known as hashes, which are sent to NCMEC and shared with participating platforms. The social media sites then use hash-matching technology to find and block any attempts to upload the original images.
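To make that flow concrete, here is a minimal sketch of the hash-and-share idea, written in Python. The function names (compute_fingerprint, submit_to_hash_list, should_block_upload) are made up for illustration, and SHA-256 is only a stand-in: the article does not say which hashing algorithm the service actually uses.

```python
# Illustrative sketch only. Take It Down is not open source; the hash
# algorithm and function names below are stand-ins for the described flow.
import hashlib

def compute_fingerprint(image_bytes: bytes) -> str:
    """Turn raw image bytes into a fixed-length digital fingerprint (hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes submitted through the web tool and shared with participating platforms.
shared_hash_list: set[str] = set()

def submit_to_hash_list(image_bytes: bytes) -> str:
    """Simulates a report: only the hash is stored and shared, never the image."""
    fingerprint = compute_fingerprint(image_bytes)
    shared_hash_list.add(fingerprint)
    return fingerprint

def should_block_upload(uploaded_bytes: bytes) -> bool:
    """A platform checks each new upload against the shared hash list."""
    return compute_fingerprint(uploaded_bytes) in shared_hash_list
```

Note that an exact cryptographic hash such as SHA-256 only matches byte-identical files, which is why the altered-image caveat discussed further down matters.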

Apart from Facebook and Instagram, the other participating platforms are Yubo, OnlyFans and MindGeek-owned Pornhub.

Meta’s new tool has been designed to combat the rising problem of ‘sextortion,’ where children are coerced or deceived into sharing intimate images with another person online, then threatened or blackmailed with the prospect of having those images published on the internet.

Some offenders want to extract even more explicit images from the child, while others are after money.

Take It Down hashes the images in the browser, so they never leave the child’s or parent’s device. If the extortionist tries to upload the original images, the platform’s hash-matching technology will detect a match and send the newly uploaded image to a content moderator for review.
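The review step can be sketched in the same spirit: a hypothetical upload handler that, on a hash match, holds the content for a human moderator rather than deleting it automatically. The names (reported_hashes, handle_upload, UploadDecision) are illustrative; the platforms’ real internals are not public, and in the actual service the hashing of the reported image happens in the user’s browser.

```python
# Hypothetical sketch of the upload-time check described above.
import hashlib
from dataclasses import dataclass

# Hashes received from the Take It Down service (placeholder, normally populated
# by ingesting NCMEC's shared hash list).
reported_hashes: set[str] = set()

@dataclass
class UploadDecision:
    allow: bool
    needs_review: bool
    reason: str

def handle_upload(uploaded_bytes: bytes) -> UploadDecision:
    """Hash the upload and, on a match, hold it for a human moderator."""
    fingerprint = hashlib.sha256(uploaded_bytes).hexdigest()
    if fingerprint in reported_hashes:
        # A match does not auto-delete; the upload is queued for review.
        return UploadDecision(allow=False, needs_review=True,
                              reason="hash matches a Take It Down report")
    return UploadDecision(allow=True, needs_review=False, reason="no match")
```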

Meta said it will ingest new hashes multiple times a day, so it can be ready to block images very quickly.

Now, the caveats. If the image is posted on a non-participating site, or is sent over an encrypted platform such as WhatsApp, it will not be taken down.

In addition, if someone alters the original image, for instance, cropping it, adding an emoji, or turning it into a meme, it becomes a new image and thus needs a new hash. Images that are visually similar, such as the same photo with and without an Instagram filter, will have similar hashes, differing in just one character.
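The behavior described here, where visually similar images produce hashes that differ only slightly, is characteristic of perceptual hashing rather than a cryptographic digest. Below is a minimal average-hash (“aHash”) sketch, assuming the Pillow imaging library and hypothetical file names; the article does not identify which perceptual-hash algorithm the participating platforms use.

```python
# Minimal average-hash sketch to illustrate "similar images give similar
# hashes". This is a generic perceptual-hash technique, not the platforms'
# actual algorithm, which the article does not specify.
from PIL import Image  # assumes Pillow is installed

def average_hash(path: str, size: int = 8) -> int:
    """Shrink, grayscale, and threshold the image into a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count differing bits: small distances mean visually similar images."""
    return bin(hash_a ^ hash_b).count("1")

# Example (hypothetical file names): an original photo and a filtered copy
# should be a small Hamming distance apart, while a crop or meme edit can
# push the distance much higher.
# distance = hamming_distance(average_hash("original.jpg"),
#                             average_hash("filtered.jpg"))
```

Small Hamming distances indicate near-duplicates, such as the filtered and unfiltered versions of the same photo, while heavier edits like crops or meme overlays can change the image enough that a fresh report, and therefore a new hash, is needed.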

Meta, back when it was still Facebook, attempted a similar tool for adults in 2017. It didn’t go over well, because the site asked people to send their (encrypted) nudes to Facebook, not the most trusted company even in 2017. The company briefly tested the service in Australia but didn’t expand it to other countries.

In 2021, it helped launch a tool for adults called StopNCII, short for nonconsensual intimate images, also known as “revenge porn.” That site is run by a UK nonprofit, the Revenge Porn Helpline, but anyone around the globe can use it.
