Thursday, May 23, 2024

Google and Bing Accused of Putting Non-Consensual Deepfake Porn in Top Search Results

The boom in artificial intelligence (AI) arrived with the launch of ChatGPT, and since then the technology has evolved into a tool that can help us in our daily lives. Unfortunately, some users deploy AI maliciously to harm others, as in the case of the young women from Almendralejo (Badajoz, Spain), minors who were victims of the circulation of fabricated images in which they appeared naked.

As if that were not enough, months later history repeated itself at a school in New Jersey (United States), where a group of students also used artificial intelligence to create fake nude photos of classmates.

But which applications make this crime possible? ClothOff and DeepNude are among the apps that use AI to "undress" both girls and boys from a simple full-body photograph, and as the technology evolves, deep learning algorithms are now capable of generating pornographic videos as well.

Regarding this last type of content, according to the 2023 State of Deepfakes report by Home Security Heroes, it takes 25 minutes, a single image of someone's face, and zero euros to create a 60-second pornographic deepfake video. The report adds that "deepfake porn accounts for 98% of all deepfake videos online." Such content, moreover, can be reached in a few clicks on both the Google and Bing search engines.

Top Google and Bing searches show deepfake porn

NBC News found that pornographic deepfake images of several famous people were the first photographs Google and Bing displayed when the names of "many women" were searched. Additionally, "a review of the results found non-consensual deepfake images and links to deepfake videos in the top Google results for 34 of those searches and the top Bing results for 35 of them," the outlet reports.

Meanwhile, searching Google for "fake nudes" surfaced "links to multiple applications and programs to create and watch deepfake porn in the first six results." On Bing, the same search "returned dozens of results from non-consensual deepfake tools and websites," along with "fake photographs of former teenage Disney Channel actresses" in which they appeared naked.

Google’s response

Faced with this problem, Google told 20Bits: "We understand how distressing this content can be for the people affected by it, and we are actively working to bring more protections to Search. Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content they are not looking for."


What can I do if I am a victim of a pornographic deepfake?

In Spain, those affected can turn to the priority channel of the Spanish Data Protection Agency (AEPD) to report these situations, whether they involve sexual content or the sharing of intimate images. In less than 24 hours, the agency will attempt to stop the material from being shared or further disseminated, cutting the chain of transmission.

After taking this step, victims should file a complaint identifying the creator so that the crime can be prosecuted.
