
The intensity and danger of hybrid threats and disinformation have increased in recent years, with Russia and China as the main actors behind these threats.
This was said at the conference “Balkan Disinfo 2025” organized by The Geopost, which is taking place in Pristina and is attended by experts in this field from various countries around the world.
Finnish diplomat Tapio Pysalo said that the purpose of these hybrid threats and disinformation from Russia and China is to stir up uncertainty, fear and division within the EU and NATO.
The Finnish diplomat said that these two countries are working together to spread disinformation, including using artificial intelligence.
“The intensity and dangerousness of hybrid threats have increased in recent years, with Russia and the People’s Republic of China as the main threat actors. Their aim is above all to undermine our partnerships by sowing divisions within the EU and NATO, in particular by hindering NATO enlargement and EU enlargement in the Western Balkans, by undermining democratic institutions, including the credibility of elections, by undermining public confidence and by polarizing our societies, thereby affecting their stability. The aim is to sow uncertainty and fear, undermine public confidence and weaken our support for Ukraine. I believe all of this also applies to general trends in the field of disinformation. In the area of disinformation, we see that both China and Russia have stepped up their activities in Europe and the US. Russia has invested massively in disinformation,” he said.
Katarina Klingova from GLOBSEC in Slovakia said that the largest amount of disinformation is spread by political leaders themselves.
In her opinion, false narratives are also spread through sponsored content on social media.
“Our political leaders are spreading disinformation. It’s not something you can follow only on websites or on Facebook and Twitter channels; it is now being spread and presented by our political leadership. Turn on the TV or watch a political debate before and after an election and you’ll see polarized accounts presented by different stakeholders. And then there are campaigns trying to defame various government organizations and journalists. And then there is disinformation targeting our allies in the EU and NATO institutions. Then there are domestic actors spreading narratives that downplay the war in Ukraine or disparage any help from NATO and the EU. Our surveys show that, depending on the narrative, 66% of Slovaks in 2023 thought that the US would try to withdraw from the war in Ukraine, and that Russia is today seen as a threat in most Central European countries. In Slovakia, however, this perception dropped from 62% in 2022 to 49% in 2024,” she emphasized.
Alen Musaefendic, head of the Western Balkans Task Force in the strategic communications division of the European External Action Service in Brussels, explained that the task force has been working against disinformation since 2018.
He emphasized that the heads of state and government of the Western Balkan states had committed to combating FIMI (foreign information manipulation and interference), adding that the fight against disinformation had become part of the EU accession process.
“This is unfortunate, because all Balkan political leaders have repeatedly committed to taking tougher action against FIMI and disinformation. For example, the final declaration of the EU–Western Balkans Summit in Tirana in 2022 states in point 25: ‘We commit ourselves to fight FIMI’, and this applies to the leaders of both the EU and the Western Balkans. Since 2023, countering FIMI has also been part of the EU accession process. One innovation is that the candidate countries, i.e. the majority of Western Balkan countries, are obliged to do more to limit the space for the spread of FIMI as part of that process. We are proactively positioning ourselves as supporters of the EU so that threat actors cannot spread false narratives,” he emphasized.
Ben Graham, a consultant specializing in new challenges to election integrity, emphasized that in his home country of the UK, there has been a strengthening of pro-Russian narratives from networks in the People’s Republic of China.
However, he added that the numerous elections held over the past year had also yielded some positive lessons for a more effective fight against disinformation.
“We see a stronger strategic focus on our opponents, we see an increase or strengthening of pro-Russian narratives from PRC networks, and it is important that we work together to counter them. Secondly, we believe that we have not yet caught up. It seems like we’re stuck in a besieged castle and we tend to put up our shields, which may help us at this moment, but it doesn’t help us win on the ground. In terms of combating information manipulation, we also need to look at the psychological aspects of why people believe this disinformation and how we can convince them. I think we need to work on being more eloquent in our opposition,” he emphasized.
Benjamin Schultz, a US researcher and expert on digital intelligence, in particular manipulation campaigns and foreign information influence, explained that researchers are coming under attack as a result of executive orders issued by the new US administration.
According to him, the US has become a hyper-polarized society.
“In the next two weeks, a lot will change in the US and around the world due to various executive orders, and a problematic trend is making the rounds in the US, namely the attack on researchers and their websites. The US is currently trying to adjust its course towards Europe, but this is proving increasingly problematic in terms of international cooperation and in the face of hybrid threats from Iran, Russia, China and other countries. I’m trying not to sound like a politician, but disinformation has become a political dirty word: scientific grants have been withdrawn that have nothing to do with party politics but were fundamental work with societal impact, in public health, AI, anything where the words disinformation, women or bias are even mentioned. Grants are now being withdrawn on that basis alone. We’re now in a situation where collaboration at the institutional level, and in all aspects of scientific research, is under attack and you’re being blackballed,” Schultz said. /The Geopost/