Coordinated network distributing CSAM content uncovered on platform X

The Geopost, August 23, 2025

Researchers from Alliance4Europe (A4E), a non-profit organization whose mission is to protect and promote democracy and fundamental values in Europe, have uncovered a coordinated inauthentic behaviour (CIB) network on platform X in the course of their work on illegal Russian influence operations. The network was spreading child sexual abuse material (CSAM): it hijacked hashtags, posted explicit videos, and redirected users to other platforms, earning it the name “Operation X-ploitation”.

Alliance4Europe’s lead researcher, Saman Nazari, reported the case to the Belgian police and ChildFocus on July 18.

What is a CIB network?

These are online accounts that exhibit a coordinated and inauthentic pattern of behavior, often intended to spread content or manipulate audiences. Because bots are used extensively to comment on and rate videos, the actual number of views is difficult to estimate, but some videos have received 20,000-30,000 views or more and hundreds of inauthentic comments.

The investigation lasted from July 18 to July 22, 2025. However, evidence suggests that the operation has been ongoing since at least May 17. By July 28, Platform X began to respond—it temporarily suspended accounts more quickly, introduced age restrictions, and cleaned up hashtags, but did not shut down the network. New accounts continue to appear with less visibility.

“Once the age of my research account was verified, the CSAM content became available again, indicating that the underlying issues—the ease of account creation and inadequate moderation—had not been addressed,” Nazari said.

CIB network analysis

More than 150 accounts sharing CSAM video content were discovered, and new accounts were created during the investigation. The network appears to be profit-driven, either by selling CSAM or by redirecting users to potential scams.

The accounts were found to operate automatically, with almost identical behavior patterns, while directing users to different content. Some accounts were hacked, others were created for this operation, and some were repurposed spam accounts. The account names point to a diverse geographic spread: some have Vietnamese names, while others have changed their names to English ones.

Tactics include flooding specific hashtags, spreading content through automated comments, and using hashtags as aggregators for CSAM content.
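To illustrate how such hashtag flooding could be surfaced in collected post data, here is a minimal Python sketch, assuming a simple list of posts with timestamps and hashtags (all field names and thresholds are hypothetical): it flags hashtags whose posts arrive, on median, faster than a chosen interval.

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

# Hypothetical sample of collected posts; field names are illustrative.
posts = [
    {"time": "2025-07-18T10:00:05", "hashtags": ["tag_a"]},
    {"time": "2025-07-18T10:03:12", "hashtags": ["tag_a", "tag_b"]},
    {"time": "2025-07-18T10:07:40", "hashtags": ["tag_a"]},
]

def flooded_hashtags(posts, max_median_gap_s=600):
    """Flag hashtags whose posts arrive, on median, faster than once
    per max_median_gap_s seconds (cf. the 1-to-10-minute cadence the
    report describes under one hashtag)."""
    times = defaultdict(list)
    for post in posts:
        t = datetime.fromisoformat(post["time"])
        for tag in post["hashtags"]:
            times[tag].append(t)
    flagged = {}
    for tag, stamps in times.items():
        stamps.sort()
        gaps = [(b - a).total_seconds() for a, b in zip(stamps, stamps[1:])]
        if gaps and median(gaps) <= max_median_gap_s:
            flagged[tag] = median(gaps)
    return flagged

print(flooded_hashtags(posts))  # e.g. {'tag_a': 227.5}
```

A frequency view of this kind would not prove coordination on its own, but a minutes-apart cadence sustained for days stands out immediately.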

Links to Telegram, Discord, and dedicated websites indicate that the accounts are part of a larger network serving multiple clients, or that multiple networks are using similar techniques.

This connection suggests that accounts may be purchased, or reused, for this purpose.

Network tactics

The network’s content consists of the following elements: hashtags, text, links, video content, and comments.

The accounts use 19 specific hashtags that are often associated with terms such as “mom,” “teens,” incest, and pornography.

The hashtags used can easily direct both children and adults to inappropriate content. Platform X’s recommendation system suggests related hashtags, which further expands the content’s reach.

Wide reach and systemic risk

The network uses spam tactics (“Like, RT, and follow!”) and links to CSAM stores, Telegram and Discord groups, fake websites, and other dangerous sources. One portal offers membership in exchange for payment in cryptocurrency. Deleting individual accounts does not stop the network from expanding—new posts continue to appear within minutes.

Systemic problem and regulatory framework

This operation exploits the same vulnerabilities that were previously exploited in Russian influence operations (such as “Doppelganger” and “Operation Overload”): ease of account creation, weak moderation, and poor risk management. This situation may constitute a violation of Articles 34 and 35 of the Digital Services Act (DSA).

According to the researchers’ recommendations, what is needed is stricter account verification (phone verification, blacklisting of email addresses, IP restrictions), better monitoring of connections and behavioral patterns, and systematic resolution of the underlying vulnerabilities rather than the mere removal of individual accounts. Even though accounts are being removed, new ones keep appearing under the same hashtags. More than 150 such accounts were identified during the investigation, and although most of them are temporarily suspended within a few hours, the flood of CSAM content does not stop. Under one of the hashtags, for example, CSAM content was posted every 1 to 10 minutes, continuously, for three days.

Text

There are several different versions of the text used by these accounts, built from components that are combined in the posts. One of the most frequently repeated phrases is “You know what you have to do – Like, RT, and follow!”, also appearing in the variant “BABY You know what you have to do – Like, RT, and follow!”, combined with various hashtags. Another version uses a random string of text followed by the word “ideas” and a few emojis.

The third component includes the phrase “I saw this,” followed by an emoji.

The fourth most frequently used phrase is “Check this out,” followed by emojis.

Almost all posts contain random strings of numbers and/or letters added at the end.
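Because the posts combine a handful of fixed phrases with trailing random strings, they are straightforward to fingerprint. A minimal sketch, assuming raw post text as input; the phrases come from the report, while the random-suffix pattern (a trailing alphanumeric token containing at least one digit) is an assumption:

```python
import re

# Fixed text components reported in the investigation.
KNOWN_PHRASES = [
    "you know what you have to do – like, rt, and follow!",
    "i saw this",
    "check this out",
]

# Assumed heuristic: a trailing alphanumeric token with at least one
# digit stands in for the "random strings of numbers and/or letters"
# the report observed at the end of posts.
RANDOM_SUFFIX = re.compile(r"\b(?=[a-z0-9]*\d)[a-z0-9]{5,}\s*$", re.IGNORECASE)

def looks_templated(text: str) -> bool:
    """True when a post matches a known text component and ends in a
    random-looking string."""
    lowered = text.lower()
    has_phrase = any(phrase in lowered for phrase in KNOWN_PHRASES)
    has_suffix = RANDOM_SUFFIX.search(text.strip()) is not None
    return has_phrase and has_suffix

print(looks_templated(
    "BABY You know what you have to do – Like, RT, and follow! x7k2q9"
))  # True
```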

Links

All posts attempt to redirect users to external websites or communication platforms. The accounts use videos to attract the audience’s attention and then redirect them to external links.

These links lead to:

  • websites selling child sexual abuse material (CSAM) folders,
  • Telegram accounts and groups, as well as Discord groups that provide access to CSAM content,
  • (possibly) fraudulent tools for “hacking” Snapchat accounts,
  • dating websites,
  • pages for downloading tools that appear to be private/secure chat apps but actually serve to access communities that share CSAM.

Some of these links are reminiscent of the Russian influence operation “Doppelganger,” which is known for concealing the final destination of its links behind redirects, while others are not. Although the accounts on platform X that shared these links have been removed, X does not block the links themselves, allowing the attackers to continue spreading content through other short-lived, disposable accounts.
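Redirect-based obfuscation of this kind can be examined without fetching page content. A minimal sketch with the `requests` library, using HEAD requests so that only the redirect chain, not the destination page, is retrieved (the URL is a placeholder; handling live links from such an operation belongs with law enforcement):

```python
import requests

def redirect_chain(url: str, timeout: int = 10) -> list[str]:
    """Follow a link's redirects with HEAD requests only, returning
    every hop; this inspects routing infrastructure rather than
    downloading any destination content."""
    resp = requests.head(url, allow_redirects=True, timeout=timeout)
    return [hop.url for hop in resp.history] + [resp.url]

# Placeholder URL for illustration only.
print(redirect_chain("https://example.com/short"))
```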

One of the heavily promoted profiles offers access to CSAM content packages for sale. The website is hosted behind Cloudflare, making it impossible to trace the hosting provider or owner at this stage. Further investigation is needed to determine whether the external sites are connected to one another and whether they are “visible” to someone who knows what to look for; membership in a private chat, for example, may be a prerequisite for receiving content from the Telegram groups.

Video content

Accounts often share the same videos within a certain time period; in other words, there appear to be waves of different videos appearing at the same time. The videos show children, ranging from toddlers to teenagers, being sexually abused, raped, or otherwise exposed. Some videos carry a watermark referencing the link being promoted. Some have comments below them that also display lists of videos.

Comments

Within seconds of one account posting the original post, numerous other accounts begin flooding it with comments. The comments either contain links to websites or Telegram channels selling child sexual abuse material, or text urging users to check out “teen content” on their profiles using a special font.

Although the comments are automatically flagged as spam, the posts still appear as “top” content under the hashtags they target, apparently due to this flood of comments.
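One way to make this amplification measurable in collected data is to count how many comments land within a short window of the original post. A minimal sketch, assuming ISO-formatted timestamps are available for the post and its comments (the 60-second window is an arbitrary choice):

```python
from datetime import datetime

def comment_burst(post_time: str, comment_times: list[str],
                  window_s: int = 60) -> int:
    """Count comments arriving within window_s seconds of the original
    post; the report describes floods that begin within seconds."""
    t0 = datetime.fromisoformat(post_time)
    return sum(
        0 <= (datetime.fromisoformat(c) - t0).total_seconds() <= window_s
        for c in comment_times
    )

# Illustrative, hypothetical timestamps.
print(comment_burst(
    "2025-07-20T12:00:00",
    ["2025-07-20T12:00:04", "2025-07-20T12:00:09", "2025-07-20T12:30:00"],
))  # -> 2
```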

Accounts

While there appear to be empty accounts used solely for this purpose, there are also spam accounts and hacked accounts that have clearly been exploited.

Spam

Some accounts appear to be generic spam accounts, posting content such as cryptocurrency advertisements before switching to CSAM content. This suggests that these spam accounts may be offered as a paid service through which people can post unmoderated content. If confirmed, this would represent a significant and uncontrolled systemic risk.

Some other spam accounts post random items with descriptions that are likely generated automatically.

The spam accounts observed were typically created recently, between March and June 2025, and tend to have few followers (observed range: 0 to 32) and to follow few accounts (observed range: 0 to 35).

The analysis examined spam accounts with Western-sounding names such as Linda Jones, Maria Green, and Elizabeth Brown. They have very few followers and follow very few accounts (between 0 and 5).

The “Western” accounts contain links that lead to a page inviting people to join the Discord CSAM server. Discord links from “Western” accounts lead to websites containing malicious code.

However, other accounts with names of unclear origin appear to lead to real Discord servers. One of them uses a very explicit name.

Another group of spam accounts has Vietnamese names. These have slightly more followers (from 1 to 30) and follow more accounts (from 13 to 35). The Vietnamese accounts link to various destinations: CPteen.pages.dev, a real and active website with CSAM content, and various CSAM-related Telegram channels such as “Dirtybox,” “Flamefolder,” “XStore,” and “Snapchat hack.”
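The narrow metadata ranges reported above suggest a simple triage heuristic. A sketch that encodes the observed ranges as thresholds (field names are hypothetical, and real detection would need far more signals than creation date and follower counts):

```python
from datetime import date

def likely_spam_account(created: date, followers: int,
                        following: int) -> bool:
    """Triage heuristic built from the ranges observed in the report:
    accounts created roughly March-June 2025 with at most 32
    followers and at most 35 followed accounts."""
    created_in_window = date(2025, 3, 1) <= created <= date(2025, 6, 30)
    return created_in_window and followers <= 32 and following <= 35

print(likely_spam_account(date(2025, 4, 12), followers=3, following=20))  # True
```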

Hacked accounts

It appears that several accounts had been inactive for a long time and showed genuine behavior before being used to distribute CSAM content, sometimes after years of inactivity.

Pornographic posts are usually shared one after another in quick succession. It is concerning that some hacked accounts retain profile pictures of real people.

Accounts that follow other accounts

Many accounts, especially those with profile pictures of Asian girls, follow other, similar spam accounts. The account “Dang” is a good example of this very widespread pattern.

Both the account that posts CSAM and the accounts that follow it post comments that look like quotes followed by random names. It is worth noting that the accounts that follow the pornographic account do not post CSAM content.

The investigation was conducted between July 18 and 22, 2025, but researchers found evidence that the operation had been running since at least May 17. Although an exact count is not possible, the number of posts over this period is estimated to be in the millions, and the operation generally ran without major interruptions.

Since July 28, X has been removing individual pieces of content more quickly. Although this has reduced the intensity of the operation, the activity has not stopped. On July 29, it was also found that X’s age verification system had begun restricting user access to content, preventing commenting accounts (part of the CIB network) from promoting content. This made hashtag hijacking somewhat more difficult, but did not stop it entirely.

At first, it seemed that X was responding to the CSAM operation (the shortened content removal time and regular deletion of hashtags indicated this).

However, once the researcher’s account age was verified, the CSAM content became available again and the operation resumed in full force. Some of the measures taken by X are clearly related to the new age restriction policy.

In fact, it appears that X’s approach is based on implementing an age verification policy to protect sensitive content from accounts under the age of 18. This represents a departure from the platform’s initial response to the operation, when X temporarily suspended the accounts, allegedly for violating the platform’s rules.

Implementing an age verification policy means that X automatically and proactively verifies the age of its users.

This age verification measure is clearly a response to new compliance requirements under the Irish Online Safety Code and the UK Online Safety Act, both of which include age-assurance provisions to prevent minors from accessing harmful content, including pornographic and violent content. The Irish media regulator responded to X on July 24 regarding this issue. As such, these measures are not specific to CSAM, nor do they systematically address the underlying technical issues: the absence of URL blocking, hashtag abuse, and the ease of creating fake accounts. In an update after August 6, researchers confirmed that age-verified accounts could still access the content. The operation continues to function much as it did during the original investigation and reaches tens of thousands of users./The Geopost/
