Google works to reduce non-consensual deepfake porn in search

— Reuters

Google is adjusting its search engine to reduce how prominently sexually explicit fake content appears in results, responding to the explosion of non-consensual deepfake imagery that people have created using generative artificial intelligence tools.

When that AI-generated content features a real person’s face or body without their permission, that person can request its removal from search results. Now, when Google decides a takedown is warranted, it will filter all explicit results on similar searches and remove duplicate images, the company said July 31 in a blog post.

