Saturday, March 14, 2026

Pro-Russia disinformation campaign uses free AI tools to fuel "content explosion"


A pro-Russia disinformation campaign is using consumer AI tools to fuel a "content explosion" focused on exacerbating existing tensions around global elections, Ukraine, and immigration, among other controversial issues, according to new research published last week.

The campaign, known by many names, including Operation Overload and Matryoshka (other researchers have also tied it to Storm-1679), has been operating since 2023 and has been linked to the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. The campaign spreads false narratives by impersonating media outlets, with the apparent aim of sowing division in democratic countries. While the campaign targets audiences around the world, including in the US, its main target has been Ukraine. Hundreds of AI-manipulated videos from the campaign have tried to push pro-Russian narratives.

The report documents how, between September 2024 and May 2025, the amount of content the campaign produced increased dramatically and is receiving millions of views around the world.

In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including pictures, videos, QR codes, and fake websites. Over the past eight months, however, Operation Overload churned out a total of 587 unique pieces of content, the majority of them created with the help of AI tools, the researchers say.

The researchers found that the spike in content was powered by consumer-grade AI tools that are available for free online. This easy access helped fuel the campaign's tactic of "content amalgamation," in which those running the operation were able to produce multiple pieces of content pushing the same story with the help of AI tools.

"This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics," researchers from Reset Tech, a London-based nonprofit that tracks disinformation campaigns, and Check First, a Finnish software company, wrote in the report. "The campaign has substantially amped up the production of new content in the past eight months, signaling a shift toward faster, more scalable methods of content creation."

The researchers were also struck by the variety of tools and types of content the campaign deployed. "What came as a surprise to me was the diversity of the content, the different types of content that they started using," Aleksandra Atanasova tells Wired. "It's as if they have diversified their palette to catch as many different angles of those stories. They are layering different types of content, one after another."

Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, but instead relied on AI-powered voice and image generators that are accessible to everyone.

While it was difficult to identify all the tools the campaign's operatives were using, the researchers were able to narrow down one tool in particular: Flux AI.

Flux AI is a text-to-image generator developed by Black Forest Labs, a German company founded by former employees of Stability AI. Using the SightEngine image analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by the Overload campaign, some of which claimed to show Muslim migrants rioting and setting fires in Berlin and Paris, were created using Flux AI's image generation.
