Over 250 UK Celebs Fall Prey to Deepfake Porn Scandal

A new investigation has revealed that more than 250 UK celebrities have been made the unwilling subjects of deepfake pornography.

Broadcaster Cathy Newman described her disgust and shock after watching a video that used AI to superimpose her face onto explicit content. The Channel 4 News investigation, broadcast on Thursday night, analyzed the five most-visited deepfake sites and found that of the almost 4,000 celebrities listed, 255 are from the UK, and nearly all of them are women.

After seeing the doctored video of herself, Newman said: “It’s like someone’s invaded my privacy. It’s this creepy feeling ’cause there’s someone out there who’s done this, and I can’t see who, but they can see this, this made-up version of me. It’s permanently in my mind now. The thought of so many women being exploited like this, it’s a gross breach of their personal space. It’s frightening how easy it is for people to create and access these twisted versions of reality just by clicking around.”

Channel 4 News contacted more than 40 well-known figures during its investigation, but none chose to speak out publicly.

The report also highlighted that a staggering 70% of traffic to these deepfake sites came through search engines, including Google.

Experts in the industry have been sounding the alarm about the dangers of AI-generated deepfakes, especially with the risk of spreading false information in a year filled with key elections in countries like the UK and the US.

For instance, earlier this year, deepfake images of Taylor Swift appeared on X, the platform formerly known as Twitter. The site, which is under Elon Musk’s ownership, blocked related searches after Swift’s fans demanded action.

While the Online Safety Act criminalizes the distribution or threat of distribution of manipulated intimate images without consent, it stops short of criminalizing the creation of such content.

Channel 4’s investigation also noted that the victims of pornographic deepfakes are often ordinary women, not just public figures.

Newman spoke with Sophie Parrish, whose petition led to a change in the law after she became a victim herself. The offender was apprehended but faced no further legal consequences. Parrish described the feeling of contamination and shame on discovering manipulated images of herself used in explicit acts.

Tory MP Caroline Nokes, chair of the Women and Equalities Committee, told Channel 4 News how serious the issue is, emphasizing that women are overwhelmingly the targets and need protection from deepfake images that can ruin lives.

In response to Channel 4 News, Google said it recognizes the distress caused by such content and is committed to strengthening its safeguards, adding that individuals can already request the removal of pages containing this content from Search results.

Ryan Daniels, a spokesperson for Meta, told Channel 4 News that the company has strict rules against child exploitation and the creation of non-consensual AI-generated nude images.
