DERANGED Putin could use AI-generated deep fake pornography to disrupt and tear down Western democracies, an intelligence analyst has warned.

Nina Jankowicz, former executive director of the US Homeland Security Department’s disinformation task force, told The Sun she is “really worried” about the risk of AI in Russia’s hands.

Professor Jankowicz, who has studied in Russia and worked with the foreign ministry of Ukraine, analyses Russian operations aimed at weakening democracies around the globe.

She told The Sun that Russia has already used deepfake porn to attack its enemies – and only a single real image is needed to generate the perverse AI nudes.

And the AI expert thinks the “sick” fake porn is definitely part of Russia’s “sexualised playbook” when it comes to weakening Western countries.

She even said some of the Russian cybercriminals producing the perverse pictures are acting on Putin’s direct orders.

Jankowicz explained that while AI can be a “transformative technology”, in the wrong hands it can be incredibly dangerous.

The rapid rise of such technology has created “a whole new level of threat”, and is a “great way to upset the balance of power”.

“The vast majority of deep fakes online are non-consensual deep fake pornography of women,” she told The Sun.

And Russia has used this terrifying method to “attack women who are part of the democracies” in Ukraine and Georgia already.


“If I was Russia I would be looking at using deep fake pornography to undermine democracies… to upset the balance of power”.

“Targeting women is a great way to do it”, she warned.


Jankowicz told The Sun that 96% of deepfake images online are pornography.

“We’re in an age where a single photo of a woman can be used to create a realistic deepfake.

“Russia deploys these in very strategic ways. They tear at the fabric, the vulnerabilities of society, the misogynistic tendencies.”


“The sexualised playbook has been part of their tools and tactics for a very long time,” she explained.

“If you wanna upset a candidate for office or a high-ranking military official, this is a good way to damage her credibility.

“It’s really pervasive.”

The result is that other women become more reluctant to take on public-facing or high-ranking positions in government, the armed forces and beyond – disturbing the balance of a modern, democratic society.

Jankowicz, who has also written two books about disinformation and women online, has even been targeted herself.

In How to be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back, she opens up about her own experience with deep fake porn.

Nina told The Sun: “The deepfakes of me were created in the weeks after I resigned from government in 2022.

“I actually didn’t discover them for a while afterwards – it was a Google alert that clued me into them.

“Of course it was part of a broader interference into public life and that’s why I’m continuing to speak up about it.

“It’s sick,” she said, “I don’t want to let the bad guys win”.

Are Russian cyber groups acting on Kremlin orders?

Nina told The Sun that the Russian cyber groups producing these perverse images and clips could be doing so on direct orders from Putin.

“We’ve seen the Russian security services involved in this kind of thing before, on Russian opposition figures specifically.

“The Kremlin either directly employs those groups through the security services or has crime pacts with them.

“The criminals will use it as a favour for keeping them out of jail.”

“The idea is to undermine a woman’s credibility… to knock her down a few pegs,” Jankowicz said. “They view it as punishment for the women they’re depicting.”

Russia’s security services have historically spent vast amounts of time and money spying on the regime’s enemies – and now they can artificially generate the material they want to use, she warned.

And if they were to directly toy with Western governments this way, they’d target high-profile individuals.

“If they were to interfere in this way it would have to be a pretty high-value target.

“I do think it’s possible that they would be looking at government officials… a cabinet secretary or a high-ranking appointee.”

As we get further into a year of elections, in the US, UK and dozens of other countries, Nina said she is “absolutely concerned” about the prospect of AI-generated porn.

“If you look at the way this has been used against women in politics in democratic countries before, it’s been a huge problem already for about eight years.


“It’s been allowed to persist without very much intervention from the tech companies or government.

“When I look at the vulnerabilities that we have in Western society, certainly misogyny is one and we know that [Russia] has used kind of misogynistic rhetoric and misogynistic disinformation before.”

Deepfake porn in Ukraine and Georgia

Nina gave the example of Ukrainian MP Svitlana Zalishchuk, who was targeted with deepfake porn seven years ago.

In 2017, a fake tweet appearing to be written by Zalishchuk spread online, in which she seemingly promised to run down the street naked in Kyiv if Ukraine lost a key battle.


Alongside the post were false images of her completely naked.

Jankowicz told The Sun that the seemingly real post went so viral “a reporter then asked her about it at a UN meeting for women”.

“It was really disturbing that a journalist from a European country would take this so seriously as to bring it up at the United Nations,” Nina said.

“Similarly there have been sex tape scandals in Georgia,” a country which Nina said is home to criminal groups in “cahoots with the Kremlin”.

“Russia was able to either use sex tapes that were attributed to certain women but weren’t actually of them, or actually spy on them because either they were involved in extramarital affairs or in their own bedrooms.”

Nina told The Sun that Russia had a pattern of doing this to opposition figures, and the idea behind it is to “shame them out of public life”.

She added: “To sexualise them, particularly in a country like Georgia which is very traditional, it can be really damaging to them.”

“There have been several women who this has happened to who have left public life entirely.”

The Rise of Deepfakes

DEEPFAKE porn is nothing new – but in recent years it has attracted more attention as increasingly famous targets fall victim to it.

The term was first coined in 2017, when the faces of high-profile figures would be photoshopped or edited onto pornographic content.

In its early form, a combination of machine-learning algorithms would be used with AI software to make them.

But as AI gets increasingly sophisticated, highly realistic visuals can be easily made from scratch – using the most unassuming photographs.

Just a few weeks ago, megastar Taylor Swift was subjected to explicit deepfake images spread online.

Arguably the most famous person ever to fall victim to the dangerous technology, she saw one of the images viewed around 45million times.

Other famous women who have been targeted over the years include Gal Gadot, Emma Watson, Natalie Portman and Scarlett Johansson.

Professor Jankowicz is aware of four major websites that specialise in the images – but plenty more sub-threads, copies and sites exist on the web and regulation has proved difficult.

Governments, scientists and intelligence analysts like Jankowicz are looking increasingly at ways to combat the twisted tech.

Examples in the UK

Cara Hunter, a Northern Irish politician, saw a fake porn video using her likeness published online while she was running for election in April 2022.

Shared tens of thousands of times online, it led to her being sexually harassed even while walking down the street.

One of Britain’s first deepfake political incidents – and the first involving such a high-profile figure – was aimed at Labour leader Keir Starmer.

In October 2023, an audio clip of Starmer circulated online, apparently capturing him swearing at his staff.

Another fake soundbite popped up of London mayor Sadiq Khan, implying that he was opposed to Remembrance Day memorials and instead favoured a pro-Palestinian march.

While neither were pornographic, they were convincing to people online and spread rapidly.

The chairman of the UK’s Electoral Commission also recently warned that female MPs could be directly targeted by deepfake porn – particularly ahead of the General Election this year.

John Pullinger told the Financial Times just weeks ago that AI could “block out the real campaign”.

And he warned that any false AI-generated pornography would be “much more targeted towards female candidates”.

2024 is a pivotal year for AI

A study from the University of Oxford just weeks ago concluded that AI is set to “sweep through the information space this year at a time of intense political and economic volatility around the world”.

This, they say, is particularly dangerous during a year with elections in over 40 democracies around the world and wars on different continents.


The study warned of “bad actors” who could use technology like AI to influence the results of elections in 2024.

Big tech is working hard and fast to counter the threat of such realistic disinformation online – but it is hard to judge at this point whether it will be able to do so effectively and consistently.