Thanks to AI, the Wagner mutiny will only briefly hamper Russian disinformation
For more than two decades, Vladimir Putin has maintained a firm grip on power in Russia, projecting an image of invincibility. But on June 24, a mutiny by the Wagner mercenary force brought Russia perilously close to the collapse of Putin's authority.
Remarkably, the threat to Putin's rule came from within Russia itself; the U.S. did not need to intervene to bring it about. And with the recent downfall of Wagner leader Yevgeny Prigozhin, Russia's disinformation operations will probably diminish for the foreseeable future.
Prigozhin, a former chef turned confidant to Putin and the founder of the Wagner Group, has been at the heart of Russia’s covert disinformation efforts. His Internet Research Agency (IRA), a troll farm known for its interference in the 2016 U.S. election, and his wider Patriot Media Group, have significantly shaped the digital landscape, amplifying the Kremlin’s messages across the globe.
However, recent events have thrown a wrench into Russia’s disinformation machinery. Prigozhin’s media empire is collapsing, following a raid on its St. Petersburg headquarters and the blocking of several Patriot group websites by Russia’s media watchdog, Roskomnadzor. The result: Prigozhin has announced the closure of his media holding, marking an unexpected potential downturn in Russia’s disinformation operations.
The magnitude of this sudden change is hard to overstate. For years, Prigozhin’s IRA has operated out of St. Petersburg, serving as a fundamental pillar of Russia’s online influence operations. The agency has employed hundreds of skilled “trolls” to advance Kremlin interests both domestically and internationally. Through their targeted and deceitful strategies, they have effectively reached millions of individuals, skillfully manipulating public opinion, undermining democratic processes and exacerbating societal divisions.
By 2015, the IRA had expanded its workforce to an estimated 400 staff members, working grueling 12-hour shifts. Among them, 80 trolls were dedicated solely to disrupting the U.S. political system. Their activities spanned numerous social media platforms, including VKontakte, often regarded as Russia’s equivalent of Facebook. According to a U.S. Senate Intelligence Committee report, the IRA’s managers meticulously monitored the workplace using CCTV cameras and had a relentless obsession with pageviews, posts, clicks and overall web traffic.
The IRA's tactics have targeted the U.S. in particular. Ahead of the 2016 elections, the agency worked tirelessly to disrupt the American political process, weaponizing every major social media platform to spread propaganda. Such tactics are far from innocent internet trolling; they are concerted efforts to destabilize nations and their political structures, a modern form of warfare increasingly adopted by state actors.
Yet with the upcoming 2024 U.S. elections, the closure of Prigozhin’s media empire could signify a critical shift. If Russia’s current internal chaos continues, there’s a possibility that the West will see a reduction in the influence of Russian troll farms, thereby curbing the scale of disinformation campaigns.
The impact of these troll farms has been remarkable, as evidenced by their content reaching 140 million U.S. users per month. Seventy-five percent of these users had never engaged with any of the pages producing that content; rather, they encountered it through Facebook's content-recommendation system, which pushed it into their news feeds.
Although certain troll farms in Russia may be shutting down, the emergence of advanced language models such as ChatGPT presents a potential avenue for the remaining actors to amplify their operations.
Researchers from Stanford and Georgetown used a predecessor of ChatGPT to generate fictional narratives and found that they could influence the views of American readers nearly as effectively as real Russian and Iranian propaganda. With minor human editing, the model-generated articles had an even more pronounced impact on reader opinion than the foreign propaganda that seeded the model, according to the findings.
The upcoming 2024 presidential race is expected to see significant foreign and domestic meddling, according to Chris Krebs, former director of the Cybersecurity and Infrastructure Security Agency. Krebs noted that, beyond Russia, China and Iran might also attempt to influence and disrupt the presidential race.
He emphasized that foreign actors have even more motivation to interfere compared to 2020, both in terms of shaping public opinion (influence) and attacking election infrastructure (interference). He also noted that increased tensions between Washington and Beijing could prompt China to re-engage in influence operations, whereas Iran might make another attempt, given its active involvement in the previous election.
Therefore, the West must understand that Russia is not the sole purveyor of disinformation. Other actors will likely attempt to fill the void left by the dissolution of Prigozhin’s empire. Our responses must be flexible and adaptable to tackle the ever-evolving threat of disinformation.
Although the turbulence within Russia could limit its disinformation activities temporarily, we should consider it a reprieve rather than a victory. This situation presents an opportunity to prepare and strengthen our systems against future disinformation attacks. To neglect this opportunity would be to our detriment. After all, the digital battlefield is as crucial as the physical one in this age of information.
David Kirichenko is a freelance journalist and an editor at Euromaidan Press.
Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.