Prigozhin interests and Russian information operations

One of Threat Analysis Group’s (TAG) missions is to understand and disrupt coordinated information operations (IO) threat actors. Our research enables Google teams to make enforcement decisions backed by rigorous analysis. TAG’s investigations do not focus on making judgments about the content on Google platforms, but rather on examining technical signals, heuristics, and behavioral patterns to assess whether activity is coordinated inauthentic behavior.
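
To make that distinction concrete, the sketch below shows what purely behavioral, content-agnostic coordination scoring can look like. It is an illustrative toy, not TAG’s tooling: every field name, signal, and threshold is a hypothetical assumption.

```python
from dataclasses import dataclass
from datetime import datetime
from itertools import combinations


@dataclass
class Account:
    """Hypothetical per-account behavioral record (no post content)."""
    account_id: str
    created_at: datetime
    post_times: list[datetime]   # timestamps of uploads/posts
    infrastructure: set[str]     # assumed fingerprints, e.g. registration network


def coordination_score(a: Account, b: Account) -> float:
    """Toy pairwise score built only from behavioral/technical signals."""
    def bucket(t: datetime) -> datetime:
        # Round a timestamp down to its 10-minute window.
        return t.replace(minute=t.minute // 10 * 10, second=0, microsecond=0)

    score = 0.0
    # Signal 1: accounts created within 24 hours of each other.
    if abs((a.created_at - b.created_at).total_seconds()) < 86_400:
        score += 1.0
    # Signal 2: overlapping infrastructure fingerprints.
    if a.infrastructure & b.infrastructure:
        score += 2.0
    # Signal 3: posting bursts that land in the same 10-minute windows.
    windows_a = {bucket(t) for t in a.post_times}
    windows_b = {bucket(t) for t in b.post_times}
    if windows_a and windows_b:
        score += len(windows_a & windows_b) / min(len(windows_a), len(windows_b))
    return score


def pairs_for_review(accounts: list[Account], threshold: float = 2.5) -> list[tuple[str, str]]:
    """Account pairs whose combined behavioral signals exceed a review threshold."""
    return [(a.account_id, b.account_id)
            for a, b in combinations(accounts, 2)
            if coordination_score(a, b) >= threshold]
```

Note that nothing in the sketch reads what the accounts actually say; the idea is that coordination is inferred from how and when accounts operate, with the content itself left out of the assessment.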

In this post, TAG is highlighting four case studies involving Russian IO tied to the Internet Research Agency (IRA) and its financier, Russian oligarch Yevgeny Prigozhin. In several cases, those campaigns served the dual purpose of promoting Russia’s agenda and Prigozhin’s business interests.

These examples underline broader trends we’re seeing: Russian IO groups are increasingly obscuring their role in influence operations, relying on stronger operational security and cutouts (intermediaries to mask their work) to dissociate themselves from user-facing activity. They launder their messages via local media brands, NGOs and PR firms that were in fact created by Russian shell companies. And in some cases, IRA-affiliated actors have responded to platforms’ enforcement efforts by moving to more permissive online spaces and platforms.

IO amplifying Prigozhin’s pro-Russian films

Prigozhin has financed several movies through a partial ownership stake in the film company Aurum LLC. The company’s movies show Russia — especially the Russian military and mercenaries — in a positive light. The films have high production values and fictionalize Russia’s actions abroad in the style of Hollywood action movies. Storylines in the films include depictions of Russian soldiers in the Central African Republic, soldiers defending ethnic Russians in Ukraine, and even a satire about the IRA and its role in the 2016 US elections. In 2021, Aurum released “Солнцепёк” (“Sunlight” or “Blazing Sun” in English), which is set in eastern Ukraine and claims to be based on true events from 2014, depicting Russian mercenaries connected to the paramilitary Wagner Group protecting Russians in Ukraine against Ukrainian forces.

Shortly after Russia’s invasion of Ukraine, TAG identified several IRA-affiliated news sites, including newinform[.]com and slovodel[.]com, hosting ads that drove traffic to videos of the film. While the film was an older release from 2021, the timing of this campaign was notable because the subject matter mirrored newly topical real-world events in Ukraine in a way that portrayed Russia positively. Google terminated nine new IRA-linked accounts using Ads to advertise the film and 44 new IRA-linked YouTube channels hosting clips, the full-length film, and related comments. Some accounts claimed to be officially affiliated with the film, while others presented themselves as fan accounts.

Advertisement for the movie “Sunlight,” featuring the film’s poster, on an IRA-affiliated news site

IRA-linked IO campaigns in Africa

In recent years, Russian IO actors tied to Prigozhin and the IRA have run influence campaigns promoting the interests of Russia and Prigozhin’s Wagner Group in Africa. Researchers at Stanford, Graphika, and our colleagues at Meta have documented this trend going back to 2019. These campaigns involved creating NGOs, media brands, and news agencies across Africa, including a Ghanaian NGO, Sudan Daily, Peace Data, and SADC News. These entities presented themselves as independent non-profit organizations and recruited local journalists and subject-matter experts to publish content on topics like pro-Russia narratives, African pride and empowerment, and stories suggesting that Western imperialism is destroying Africa. Some authors likely did not realize they were working for a Russia-backed IO and genuinely believed in the content they wrote.

TAG’s investigations align with these earlier findings. Google terminated accounts and channels associated with the IRA’s fake media brands and NGOs throughout 2019 and 2020. This included IRA-linked accounts using Gmail to create profiles on non-Google social platforms, creating YouTube channels affiliated with the so-called news brands, and publishing content to Blogger.

In March 2021, Google shut down activity by several IRA-linked actors who published content promoting Wagner’s operations in Africa along with pro-Russia narratives. These articles appeared on Blogger and a number of non-Google blogging platforms such as Balalaika, Hashtap, Technowar and Voskhodinfo. The blogs amplified false narratives that the United Nations is funding terrorists in the Central African Republic and that Syrians need Wagner protection. The blogs were not backed by a social media presence.

Example of a blog posted by an IRA-affiliated account, showing soldiers in action

Example of a blog posted by an IRA-affiliated account, with a heading showing a person holding a rocket launcher

In September 2022, Google terminated three IRA-linked YouTube channels that were sharing content in French supportive of Russian policy objectives in Libya, including promotion of a film in the Shugaley trilogy, another Aurum LLC production.

IRA influence operations concerning Ukraine

Russia’s agenda in Ukraine has also been a consistent, but not dominant, focal point for IRA-linked influence campaigns. In February 2022, Google terminated five YouTube channels and 21 Blogger blogs posting coordinated narratives on Blogger, YouTube, and the Ukrainian blogging platform Hashtap. In addition to domestically focused content about Russia, several of the narratives focused on maligning Ukraine, including allegations that Ukrainians were deceiving Europe and stories of how Kyiv authorities failed to properly handle the COVID-19 pandemic. This activity spanned multiple blogging platforms, and TAG observed the same IRA-linked accounts posting similar commentary across various news sites.

IRA-created blog on Blogger criticizing EU support for Ukraine, with a muted, off-color flag at the top

IRA IO targeting domestic Russian audiences

Google regularly disrupts activity by IRA-linked accounts targeting Russian domestic audiences. These are often clusters of related accounts that create YouTube channels, upload videos, and comment and upvote each other’s videos. The activity occurs during Russian work hours, with narratives focused on Russian domestic issues and typically targeting political dissidents. In October 2022, Google terminated a cluster of nearly 700 IRA-linked accounts that were posting YouTube Shorts. The Shorts were crafted for a Russian domestic audience, praising Russian soldiers in Ukraine, and had negligible views or subscribers.
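
As an illustration of how such behavioral patterns can be quantified — not a description of TAG’s actual detection systems — the sketch below scores a cluster of accounts on two of the signals mentioned above: how concentrated its uploads are in Moscow business hours, and how much of its engagement targets other cluster members. The timezone, hours, and data shapes are assumptions.

```python
from datetime import datetime, timedelta, timezone

# Assumption: Moscow time (UTC+3) as a proxy for "Russian work hours".
MSK = timezone(timedelta(hours=3))


def work_hours_fraction(upload_times: list[datetime],
                        start_hour: int = 9, end_hour: int = 18) -> float:
    """Fraction of a cluster's uploads falling on weekdays, 09:00-18:00 MSK.

    Timestamps are assumed to be timezone-aware (e.g. UTC).
    """
    if not upload_times:
        return 0.0
    in_hours = 0
    for t in upload_times:
        local = t.astimezone(MSK)
        if local.weekday() < 5 and start_hour <= local.hour < end_hour:
            in_hours += 1
    return in_hours / len(upload_times)


def mutual_engagement_density(engagements: list[tuple[str, str]],
                              members: set[str]) -> float:
    """Share of the cluster's comments/upvotes that target other cluster members.

    `engagements` is a list of (actor_channel, target_channel) pairs --
    a hypothetical data shape, not an actual API response.
    """
    if not engagements:
        return 0.0
    internal = sum(1 for actor, target in engagements
                   if actor in members and target in members)
    return internal / len(engagements)
```

A cluster whose uploads sit almost entirely inside one set of business hours and whose engagement is mostly self-directed looks very different from organic fan activity, which tends to be spread across timezones and aimed outward.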

Other campaigns have focused on blogs. In July 2021, Google terminated 28 Blogger blogs created by IRA-linked accounts. Narratives in the blogs focused on Russian domestic affairs, including stories dismissing protests in support of anti-corruption activist Alexei Navalny, denigrating local opposition politicians, criticizing the mayor of St. Petersburg, and praising the heroics of the Wagner Group. IRA actors also mirrored the same content on the Ukrainian blogging platform Hashtap. In some cases, multiple Blogger profiles published very similar or near-identical content.
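
Near-identical posts across profiles can themselves be surfaced with simple text-similarity measures. The sketch below uses word-shingle Jaccard similarity; the 0.8 threshold and the idea of keying posts by profile ID are illustrative assumptions, not TAG methodology.

```python
import re
from itertools import combinations


def shingles(text: str, k: int = 5) -> set[tuple[str, ...]]:
    """k-word shingles of a post after light normalization."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + k]) for i in range(max(0, len(words) - k + 1))}


def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two shingle sets (1.0 = identical word sequences)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)


def near_duplicate_pairs(posts: dict[str, str],
                         threshold: float = 0.8) -> list[tuple[str, str, float]]:
    """Pairs of posts (keyed by a hypothetical profile ID) that are near-identical."""
    shingled = {pid: shingles(text) for pid, text in posts.items()}
    return [(p1, p2, sim)
            for p1, p2 in combinations(shingled, 2)
            if (sim := jaccard(shingled[p1], shingled[p2])) >= threshold]
```

Shingling is order-sensitive, so verbatim mirrors score near 1.0 while lightly reworded copies score lower; at scale one would typically swap in MinHash or embedding-based comparison, but the underlying signal is the same.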

The evolution of the Russian IO landscape

These case studies underscore several developments TAG observes in Russian IO activity. The accounts created lack well-developed and backstopped personas, and they are increasingly disrupted before they can gain traction. Russian IO actors also increasingly obscure their role, using stronger operational security and a range of intermediaries to conduct the actual user-facing activity. These proxies include third-party PR firms, marketing agents, and unwitting local journalists and creators. Well-selected proxies lend the content borrowed legitimacy, an advantage over direct personas that have little reach.

In our investigations of IRA-backed IO, we have also noted several cases where the narratives pushed by the IRA serve a dual purpose: not only do they amplify messages supporting Russia, they also promote the business interests of oligarch Yevgeny Prigozhin. Prigozhin has organized his empire around projects that directly and indirectly support the Russian state, and as the main financier of the IRA, he has cleverly leveraged his IO apparatus to amplify narratives that benefit not only Russia but his own business interests as well.

TAG Bulletin: Q3 2022

This bulletin includes coordinated influence operation campaigns terminated on our platforms in Q3 2022. It was last updated on October 26, 2022.

July

  • We terminated 7 YouTube channels as part of our investigation into coordinated influence operations linked to Russia. The campaign was linked to a Russian consulting firm and was sharing content in Russian that was supportive of Russia and critical of Ukraine and the U.S.
  • We terminated 7 YouTube channels and 3 AdSense accounts as part of our investigation into coordinated influence operations linked to China. The campaign was sharing content in English and Chinese that was supportive of the Chinese semiconductor and tech industries and critical of the U.S. semiconductor industry and U.S. sanctions on Chinese tech companies.
  • We terminated 2,150 YouTube channels as part of our ongoing investigation into coordinated influence operations linked to China. These channels mostly uploaded spammy content in Chinese about music, entertainment, and lifestyle. A very small subset uploaded content in Chinese and English about China and U.S. foreign affairs. These findings are consistent with our previous reports.

August

  • We terminated 10 YouTube channels and blocked 120 domains from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to China. The campaign was linked to a Chinese PR firm named Shanghai Haixun Technology Co., Ltd. and was sharing content in English, Chinese, Russian, Ukrainian, Thai, Hindi, French, Arabic, Italian, Vietnamese, and Korean that was critical of international news coverage of Xinjiang, the United States and its relationship with Taiwan, and high-profile critics of the Chinese government. We received leads from Mandiant that supported us in this investigation.
  • We terminated 12 YouTube channels, 4 Ads accounts, and 2 Blogger blogs and blocked 3 domains from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to the United States. The campaign was sharing content in English, Arabic, Persian, and Russian that was promoting U.S. foreign affairs. We received leads from Twitter that supported us in this investigation.
  • We terminated 15 YouTube channels as part of our investigation into coordinated influence operations linked to Sudan. The campaign was sharing content in Arabic that was supportive of the Sudanese Rapid Support Forces and their leader Hemetti. We received leads from Twitter that supported us in this investigation.
  • We terminated 3 YouTube channels as part of our investigation into coordinated influence operations linked to Russia. The campaign was linked to the media outlet News Front and was sharing content in English and German that was supportive of Russia and critical of the United States. We received leads from Twitter that supported us in this investigation.
  • We terminated 1 AdSense account and blocked 1 domain from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to Turkey. The campaign was sharing content in Turkish that was supportive of Turkey’s AK Party. We received leads from Twitter that supported us in this investigation.
  • We terminated 12 YouTube channels as part of our investigation into coordinated influence operations linked to Russia. The campaign was linked to a Russian consulting firm and was sharing content in Russian that was supportive of Russia and the Russian military and critical of NATO, Ukraine, and the West. We received leads from Twitter that supported us in this investigation.
  • We terminated 15 YouTube channels, 2 AdSense accounts, and 1 Blogger blog as part of our investigation into coordinated influence operations linked to Vietnam. The campaign was sharing content in Chinese, Japanese, Korean, and German that was supportive of Russia and critical of Ukraine and China. We believe this operation was financially motivated.
  • We terminated 1 YouTube channel and 1 Ads account and blocked 1 domain from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to Russia. The campaign was sharing content in Russian that was critical of the United States, the EU, Ukraine, and NATO.
  • We terminated 1,104 YouTube channels as part of our ongoing investigation into coordinated influence operations linked to China. These channels mostly uploaded spammy content in Chinese about music, entertainment, and lifestyle. A very small subset uploaded content in Chinese and English about China and U.S. foreign affairs. These findings are consistent with our previous reports.

September

  • We terminated 1 AdSense account and blocked 4 domains from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to North Macedonia. The campaign was sharing sensational content in English that was about a variety of topics including U.S. and European current events. We believe this operation was financially motivated.
  • We terminated 5 YouTube channels as part of our investigation into coordinated influence operations linked to Myanmar. The campaign was sharing content in Burmese that was critical of the People’s Defense Force of Myanmar.
  • We terminated 3 YouTube channels as part of our investigation into coordinated influence operations linked to Russia. The campaign was linked to the Internet Research Agency (IRA) and was sharing content in French that was supportive of Russian policy objectives in Libya. We received leads from the FBI that supported us in this investigation.
  • We blocked 1 domain from eligibility to appear on Google News surfaces and Discover as part of our investigation into coordinated influence operations linked to Iran. The campaign was sharing content in Arabic that was critical of the UAE, Saudi Arabia, and Bahrain.
  • We terminated 6,957 YouTube channels and 144 Blogger blogs as part of our ongoing investigation into coordinated influence operations linked to China. These channels and blogs mostly uploaded spammy content in Chinese about music, entertainment, and lifestyle. A very small subset uploaded content in Chinese and English about China and U.S. foreign affairs. These findings are consistent with our previous reports.