Microsoft Launches Tool to Remove Deepfake Porn from Bing Search Results

Nettribe Media

Microsoft has introduced a significant new tool aimed at helping victims of deepfake pornography remove non-consensual intimate imagery (NCII) from its Bing search engine. This initiative, launched in partnership with StopNCII, a project operated by the Revenge Porn Helpline, marks an important step in combating the growing issue of AI-generated explicit content, which has been increasingly used for harassment, extortion, and other malicious purposes.

The tool allows users to create a digital "hash" or fingerprint of their explicit images, which StopNCII then adds to a database. This database is utilized by Microsoft to identify and remove matching images from Bing search results. The process is designed to protect user privacy, as the images themselves never leave the user’s device.
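As a simplified illustration of that on-device step, the sketch below computes a fingerprint of an image file locally, so only the hash would ever be shared. This uses a plain SHA-256 digest for clarity; StopNCII's production system uses perceptual hashing, which also matches resized or re-encoded copies of an image, and the function name here is hypothetical.

```python
import hashlib

def fingerprint_image(path: str) -> str:
    """Compute a digital fingerprint of an image file on the user's device.

    Simplified stand-in: SHA-256 over the raw bytes. A real NCII-matching
    system would use a perceptual hash so near-duplicate copies still match.
    """
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    # Only this hex string leaves the device; the image itself never does.
    return sha.hexdigest()
```

The key privacy property is that the hash is a one-way function: the database operator can check whether a newly seen image matches a submitted fingerprint, but cannot reconstruct the image from the hash.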

This move aligns Microsoft with other major tech platforms, such as Facebook, TikTok, and Instagram, that also use StopNCII’s hashing technology to prevent the spread of NCII. However, the challenge remains particularly complex with AI-generated images, which can evade traditional detection methods. In these cases, victims are encouraged to manually report the content to Microsoft for removal.

By the end of August 2024, Microsoft reported that it had removed nearly 269,000 images through this initiative. Despite these efforts, the lack of comprehensive federal legislation in the U.S. addressing AI-generated non-consensual imagery underscores the need for broader solutions to fully tackle this issue.

This tool is part of Microsoft's broader efforts to enhance online safety, particularly for women and girls, who are disproportionately affected by intimate image abuse.

