Google Search Updates Content Removal and Ranking System to Combat Explicit Deepfakes
Google Search updated its removal processes and ranking systems on Wednesday to combat non-consensual fake explicit imagery, also known as explicit deepfakes. The tech giant’s new strategy involves quickly removing explicit deepfakes and demoting websites that host such content, so that it does not appear high up in Search results. The company said it has also simplified the process of requesting the removal of explicit deepfakes. These changes are aimed at discouraging bad actors from using artificial intelligence (AI) to generate harmful content.
Google Search to Remove Deepfakes
In a blog post, Google announced that it has introduced new changes to address the rise of explicit deepfakes. Many cybercriminals are leveraging AI image and video generation tools to create fake explicit content about individuals and release it online. Celebrities, social media influencers, and other recognisable personalities are particularly targeted with deepfakes.
In response to the nature of the threat, Google has updated its content removal process. Now, when someone successfully requests the removal of explicit deepfakes featuring them from Search, Google’s systems will take additional steps. The company said Search will also filter out explicit results in similar searches about that individual and remove any duplicates of the image it can find. Notably, the removal applies to Google Search results, meaning the content will no longer show up on the search results page.
“These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future,” Google said.
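Google has not explained how its systems find duplicates of a reported image. One common approach to spotting near-identical copies is perceptual hashing, sketched below purely as an illustration of the general technique; the hash size, threshold, and file names are invented and this is not Google’s actual method.

```python
# Hypothetical sketch: detecting near-duplicate images with a simple
# average hash. Illustrative only; Google has not disclosed how its
# deduplication works.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and encode each pixel
    as 1 if it is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def is_duplicate(hash_a: int, hash_b: int, threshold: int = 5) -> bool:
    """Treat two images as duplicates if their hashes differ in at most
    `threshold` bits (Hamming distance)."""
    return bin(hash_a ^ hash_b).count("1") <= threshold

# Example usage (paths are placeholders):
# reported = average_hash("reported_deepfake.jpg")
# candidate = average_hash("candidate_copy.jpg")
# print(is_duplicate(reported, candidate))
```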
The tech giant has also updated its ranking systems. When a query explicitly seeks deepfakes of a specific person, Google Search will aim to show high-quality, non-explicit content instead. The post highlighted that this technique can reduce exposure to fake explicit content by as much as 70 percent. Instead of non-consensual fake images and videos, these users will now see content such as articles discussing how deepfakes are impacting society.
Further, if a website has a high volume of removals for explicit deepfakes, Google will take it as a signal that it is not a high-quality site and demote it in Search results. Looking ahead, the tech giant is working on differentiating consensual explicit content, such as a scene from a film, from explicit deepfakes.
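Google does not publish the details of this demotion signal. The snippet below is only a hypothetical sketch of the underlying idea, scaling down a site’s ranking score once the share of its pages hit by successful removals crosses a threshold; every name, threshold, and penalty value here is invented for illustration.

```python
# Hypothetical sketch of a removal-volume demotion signal.
# All field names, thresholds, and the penalty factor are invented;
# Google has not disclosed how its demotion actually works.
from dataclasses import dataclass

@dataclass
class SiteStats:
    domain: str
    indexed_pages: int
    deepfake_removals: int  # successful removal requests against this site

def demotion_factor(stats: SiteStats,
                    removal_share_threshold: float = 0.01,
                    penalty: float = 0.2) -> float:
    """Return a multiplier for the site's ranking score: if the share of
    indexed pages affected by explicit-deepfake removals exceeds the
    threshold, scale the score down by `penalty`."""
    if stats.indexed_pages == 0:
        return 1.0
    share = stats.deepfake_removals / stats.indexed_pages
    return penalty if share > removal_share_threshold else 1.0

# Example: a site with many successful removals gets its score reduced.
site = SiteStats(domain="example-host.test", indexed_pages=10_000,
                 deepfake_removals=450)
base_score = 0.87
print(base_score * demotion_factor(site))  # 0.174 instead of 0.87
```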