Information operations and warfare

Information warfare is no longer a phenomenon confined to authoritarian regimes; it increasingly threatens democracies such as Switzerland. State and non-state actors use disinformation and social-media manipulation in a targeted way to influence public opinion and destabilise societies. Modern technologies such as artificial intelligence (AI) and machine learning make it possible to disseminate content in an automated and targeted manner. In Switzerland, signs of such influence operations have already been observed in political debates (e.g. the 5G discussion and the Ukraine conflict).

The challenges 

  • Technological possibilities and scalability: Modern technologies such as AI, automated bots and trolling campaigns can spread information at scale. Manipulated content circulates quickly on social media, amplified by filter bubbles and echo chambers. 

  • Difficulty of detection and attribution: Influence operations are inexpensive, hard to attribute and carry a limited risk of escalation for the attacker. A lack of data and analysis capacity makes timely identification difficult. 

  • Social polarisation: Polarised societies are particularly susceptible to disinformation and manipulation from within. Attackers exploit a deep understanding of human decision-making processes, which makes these operations effective. 

  • Attacks on democratic decision-making processes: Elections, referendums and controversial political debates are preferred targets of influence operations. 

Recommendations for politics, business and society 

  • Strengthen state capacities for detection and response: Build up capabilities for recognising, analysing and attributing influence operations at all levels of government (federal, cantonal, municipal). Establish an early-warning system for ongoing disinformation campaigns (e.g. comparable to Alertswiss). 

  • Create a legal and political basis: Review the legal framework to enable rapid and transparent responses. Establish guidelines for cooperation with social platforms and the media in the fight against disinformation. 

  • Cooperate with the private sector and media: Work with social-media operators to identify and remove manipulated content. Support fact-checking initiatives and raise public awareness in a targeted way. 

  • Increase social resilience: Promote media literacy and awareness of digital risks in educational institutions. Integrate training programmes for identifying and countering disinformation across all age groups. 

  • Cooperate internationally: Coordinate politically with the European Union to jointly combat social-media influence operations. 

  • Involve the population: Ensure transparent, fact-based communication by government agencies in the event of attacks, and guarantee access to independent and reliable information. 

Authors and subject responsibility

Karl Aberer, EPFL | Umberto Annino, Microsoft | Myriam Dunn Cavelty, ETHZ

Review Board

Endre Bangerter, BFH | Alain Beuchat, Banque Lombard Odier & Cie SA | Matthias Bossardt, KPMG | Daniel Caduff, AWS | Adolf Doerig, Doerig & Partner | Stefan Frei, ETH Zurich | Roger Halbheer, Microsoft | Katja Dörlemann, Switch | Pascal Lamia, BACS | Martin Leuthold, Switch | Hannes Lubich, Board of Directors and Consultant | Luka Malis, SIX Digital Exchange | Adrian Perrig, ETH Zurich | Raphael Reischuk, Zühlke Engineering AG | Ruedi Rytz, BACS | Riccardo Sibilia, DDPS | Bernhard Tellenbach, armasuisse | Daniel Walther, Swatch Group Services | Andreas Wespi, IBM Research