Information operations and warfare

Cybersecurity Map

Current situation 

The idea of using information as a weapon is not new. As far back as antiquity, military leaders were using disinformation and propaganda to deceive and weaken their enemies. However, the development of communications technology and the spread of the Internet have drastically changed the nature of information operations, and other tools are now available. This factsheet focuses on state actors in the field, categorises their roles and tasks, and highlights priority action areas and recommendations for Switzerland’s government.  

The topic of information warfare has a lengthy history, both internationally and in Switzerland. As early as the 1990s, various organisations and researchers were drawing attention to the potential for digital technology to be harnessed to generate and disseminate manipulated information. However, the problem only became acute with the emergence of social media and the broad availability of affordable machine learning and deep learning technologies. 

From the late 1990s until around 2010, the umbrella term “information operations” covered all relevant issues, from cyber warfare through to propaganda, disinformation and psychological warfare. In current parlance, the term increasingly refers to efforts to influence public opinion by semantic means, with technology as one of the driving forces behind such operations. The term “influence operations” is also used. 

The use of artificial intelligence algorithms, automation and large volumes of data on the Internet and in social media is changing the scope, range and precision of computer-assisted propaganda campaigns to manipulate public opinion. Nowadays, it is possible to create content automatically or semi-automatically, convert it into text, speech and images and make it accessible to a broad mass of people and focused target audiences within a very short period of time and at negligible cost. 

Social media are by nature particularly susceptible to attack. Filter bubbles and echo chambers can be created and strengthened; memes, photos and videos can be used to spread information with no possibility of verifying the source; communities can be attacked by identifying at-risk personal profiles or influential network hubs.  
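The reference to influential network hubs can be made concrete with a toy example. The sketch below uses hypothetical data and in-degree — the number of followers an account has — as a simple proxy for influence; real analyses use richer centrality measures and much larger graphs.

```python
from collections import Counter

# Hypothetical follower graph: (follower, followee) pairs.
edges = [
    ("alice", "hub"), ("bob", "hub"), ("carol", "hub"),
    ("dave", "hub"), ("alice", "bob"), ("carol", "dave"),
]

# In-degree (follower count) is a crude proxy for influence:
# accounts that many others follow can spread content widely.
in_degree = Counter(followee for _, followee in edges)
top_hub, followers = in_degree.most_common(1)[0]
print(top_hub, followers)  # hub 4
```

An attacker who identifies such hubs can target them to amplify a message; a defender can monitor them for signs of compromise or manipulation.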

Governments and political parties manipulate social media. According to a study by the Oxford Internet Institute, there was evidence of organised social media manipulation campaigns by cyber troops or political parties in 81 countries in 2020, compared with 70 countries in 2019 and 48 in 2018. Authoritarian states use social media manipulation as a tool to control their own populations. Democratic states are targeted by influence operations conducted by a handful of players, including (according to the available evidence) China, India, Iran, Pakistan, Russia, Saudi Arabia and Venezuela. Current estimates indicate that China may have between 300,000 and two million cyber troops. 

Moreover, the cyber troops often collaborate with private firms, civil society organisations, Internet subcultures, youth groups, hacker groups, fringe movements, social media influencers and volunteers who support their ideological cause. In addition to the external attacks mentioned above, “inside attacks” are sometimes the bigger problem in polarised societies where there is a major split in public opinions and convictions, often into two opposing camps.  

Influence operations can be carried out using a wide range of techniques, including disinformation, social hacking, fraudulent identities, bots and trolling. The one thing these techniques have in common is that they set out to influence political processes (e.g. elections) or even set revolts or revolutions in motion. The destabilising potential of this type of information warfare operation is based on a profound understanding of human decision-making processes and the dynamics of mass phenomena. Thanks to new technologies and algorithms, particularly artificial intelligence (AI) and large language models (LLMs) such as GPT, it is now even easier to simulate these dynamics. 

Challenges

Operations that attempt to exert influence at state level can change public opinion within a country in a way that benefits the attacker. This makes democracies, in which the political decision-making process is deeply rooted in public opinion, particularly susceptible to this type of attack. 

Influence operations appeal to attackers for a range of reasons: 

1. They cost relatively little to conduct. 

2. They are difficult to attribute and the risk of escalation is limited. 

3. They provide a way of instrumentalising users on a large scale. 

4. They can be used in isolation or in combination with other forms of warfare (traditional or economic). 

By contrast, it is difficult to determine the effectiveness and impact of influence operations. There is also a risk of the attacker losing control of them. Nevertheless, it can be assumed that public opinion was influenced in the course of several recent events, particularly the elections in the USA, UK and France. What has not been resolved is whether these changes were ultimately decisive. 

A further unresolved question is whether Switzerland is or has already been the target of sophisticated influence operations. There are indications that this has been the case with certain political issues (e.g. Billag fee referendum, 5G controversy, Ukraine war). 

Action areas for government, business and civil society: Current gaps

The Swiss government’s stance is that active use of manipulated information to achieve political goals is not an appropriate tool for a democratic state. This makes it all the more important to implement the following as a response to the threat posed by information warfare (see also the EU’s Action Plan):  

  1. Develop state capabilities to recognise and attribute influence operations. These capabilities should involve all levels of the state (federal, cantonal and communal) and be coordinated at federal level. 

  2. Review the political and legal frameworks so that it is possible to respond to influence operations; issue clear guidelines for potential defence options or counter-attacks. 

  3. Work with the national and international media, if possible in conjunction with partners, to debunk disinformation and raise public awareness of the issue. 

  4. Enter into agreements with the key social platforms that can be leveraged in Switzerland to combat social media influence operations. 

  5. Create an alarm system for ongoing attacks (as with, or as part of, Alertswiss). 

  6. Coordinate politically with the European Union to combat social media influence operations, for example in association with social media platforms. 

  7. Incorporate social media skills and awareness of digital risks into training programmes. 

  8. Support fact-checking initiatives aimed specifically at the Swiss context. 

  9. Define thresholds for activating crisis mode, or at least drive forward discussions on the subject. 
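The first action area above — recognising influence operations — rests on detection signals. One widely used signal is coordinated posting: many distinct accounts pushing near-identical messages. The sketch below illustrates the idea with hypothetical posts and an assumed threshold; operational systems combine many such signals with timing, content-similarity and network features.

```python
from collections import defaultdict

# Hypothetical posts: (account, text) pairs from a monitored platform.
posts = [
    ("acct1", "Vote NO on the referendum!"),
    ("acct2", "Vote NO on the referendum!"),
    ("acct3", "Vote NO on the referendum!"),
    ("acct4", "Nice weather in Bern today."),
]

# Group accounts by the exact message they posted.
accounts_by_text = defaultdict(set)
for account, text in posts:
    accounts_by_text[text].add(account)

# Flag messages pushed by 3+ distinct accounts (assumed threshold).
COORDINATION_THRESHOLD = 3
suspicious = {text for text, accts in accounts_by_text.items()
              if len(accts) >= COORDINATION_THRESHOLD}
print(suspicious)  # {'Vote NO on the referendum!'}
```

Exact-match grouping is deliberately naive; campaigns typically vary wording slightly, which is why real detectors use fuzzy text similarity rather than string equality.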

Appendix

Explanation of a few common forms of information warfare: 

  • Disinformation: Spreading false or incomplete information with the intent to deceive. 

  • Social hacking: Exploiting socio-cognitive attributes of the human psyche, particularly tribalism and the propensity to conform.  

  • Fraudulent identities: Use of legitimate identities by illegitimate operators. 

  • Bots: Automated software that manipulates online platforms. 

  • Trolls: Users or bots who deliberately attack, insult or deride other users. 

References 

Countering Disinformation Effectively: An Evidence-Based Policy Guide, Carnegie Endowment for International Peace, January 2024: https://carnegieendowment.org/research/2024/01/countering-disinformation-effectively-an-evidence-based-policy-guide?lang=en 

Cyber Influence Operations: An Overview and Comparative Analysis, Center for Security Studies (CSS), ETH Zürich, Zürich, October 2019: https://css.ethz.ch/en/services/digital-library/publications/publication.html/c4ec0cea-62d0-4d1d-aed2-5f6103d89f93 

Authors and subject responsibility

Karl Aberer, EPFL | Umberto Annino, Microsoft | Myriam Dunn Cavelty, ETHZ

Review Board

Endre Bangerter, BFH | Alain Beuchat, Banque Lombard Odier & Cie SA | Matthias Bossardt, KPMG | Daniel Caduff, AWS | Adolf Doerig, Doerig & Partner | Stefan Frei, ETH Zurich | Roger Halbheer, Microsoft | Katja Dörlemann, Switch | Pascal Lamia, BACS | Martin Leuthold, Switch | Hannes Lubich, Board of Directors and Consultant | Luka Malis, SIX Digital Exchange | Adrian Perrig, ETH Zurich | Raphael Reischuk, Zühlke Engineering AG | Ruedi Rytz, BACS | Riccardo Sibilia, DDPS | Bernhard Tellenbach, armasuisse | Daniel Walther, Swatch Group Services | Andreas Wespi, IBM Research

More articles from the Cybersecurity Map

  • (Adversarial) artificial intelligence 

  • Dependencies and sovereignty 

  • Data protection 

  • Cloud computing 

  • Digitalisation / e-government 

  • Internet of Things (IoT) and Operational Technology (OT) 

  • Quantum computing