A Pro-Russia Disinformation Campaign Is Using Free AI Tools to Fuel a ‘Content Explosion’


A pro-Russia disinformation campaign is leveraging consumer artificial intelligence tools to fuel a “content explosion” focused on exacerbating existing tensions around global elections, Ukraine, and immigration, among other controversial issues, according to new research published last week.

The campaign, known by many names including Operation Overload and Matryoshka (other researchers have also tied it to Storm-1679), has been operating since 2023 and has been aligned with the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. The campaign disseminates false narratives by impersonating media outlets with the apparent aim of sowing division in democratic countries. While the campaign targets audiences around the world, including in the US, its main target has been Ukraine. Hundreds of AI-manipulated videos from the campaign have tried to fuel pro-Russian narratives.

The report outlines how, between September 2024 and May 2025, the amount of content produced by those running the campaign increased dramatically and is now receiving millions of views around the world.

In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including pictures, videos, QR codes, and fake websites. Over the last eight months, however, Operation Overload churned out a total of 587 unique pieces of content, with the majority of them being created with the help of AI tools, researchers said.

The researchers said the spike in content was driven by consumer-grade AI tools that are available for free online. This easy access helped fuel the campaign’s tactic of “content amalgamation,” where those running the operation were able to produce multiple pieces of content pushing the same story thanks to AI tools.

“This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” researchers from Reset Tech, a London-based nonprofit that tracks disinformation campaigns, and Check First, a Finnish software company, wrote in the report. “The campaign has substantially amped up the production of new content in the past eight months, signalling a shift toward faster, more scalable content creation methods.”

Researchers were also stunned by the variety of tools and types of content the campaign was pursuing. “What came as a surprise to me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, tells WIRED. “It’s like they have diversified their palette to catch as many like different angles of those stories. They’re layering up different types of content, one after another.”

Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, but was instead using AI-powered voice and image generators that are accessible to everyone.

While it was difficult to identify all the tools the campaign operatives were using, the researchers were able to narrow it down to one tool in particular: Flux AI.

Flux AI is a text-to-image generator developed by Black Forest Labs, a German company founded by former employees of Stability AI. Using the SightEngine image analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by the Overload campaign—some of which claimed to show Muslim migrants rioting and setting fires in Berlin and Paris—were created using image generation from Flux AI.


