Sound the Alarm! Updating Beliefs and Degradative Cyber Operations
Bracing for an Absent Threat
Cyber operations have grown in both frequency and complexity. Often observed alongside ongoing disputes, advances in cyber capabilities grant aggressors the opportunity to threaten an adversary’s critical infrastructure. While this trend supports the notion of an “offensive advantage” in cyberspace and enables alarmist rhetoric, the existing evidence suggests otherwise.
For the most part, highly damaging cyber operations are limited to a handful of state actors with the necessary technological, organizational, and economic resources. Moreover, those that meet these prerequisites appear restrained in the exercise of their cyber capabilities – if they use them at all – during periods of heightened conflict. Yet despite these contradictory observations, states continue to pour noteworthy amounts of treasure into the development of offensive cyber capabilities. If we are to treat states as conglomerations of rational individuals, what accounts for this seemingly irrational behavior?
While it is easy to assign responsibility to sensationalized media reporting or self-interested political elites, the tendency to ignore probabilities (i.e. how common these incidents actually are) is rooted in individual cognitive processes. Consequently, this study identifies pre-existing beliefs and information order as factors that influence the extent to which individuals overestimate the probability of state-associated cyber operations.
Through a series of vignette experiments, the study finds that individuals can use available information to evaluate the probability that a cyber operation originated from a known state actor. The extent to which this evaluation integrates all available information, however, is modulated by these two factors.
Beliefs and Information Order
It is crucial to note that cyber operations do not occur within a strategic vacuum. These incidents are often regionally bound and involve states that are locked in dispute with one another. As such, the study finds that individuals are more likely to assign blame based on the reputation of (i.e. pre-existing beliefs about) the suspected aggressor. That is to say, if Country A manages to trace the incident to Country B and the latter has a history of belligerent behavior, the former is more likely to overestimate the latter’s culpability in spite of contradictory information.
The consequences of framing events through the lens of pre-existing beliefs are further aggravated by prioritizing initial information as a basis for judgements. Within this study, a “primacy” effect is observed in which the first available information is exploited at the expense of later revelations. Participants are informed of both the global probability of state-associated cyber operations and the pronounced accuracy of individual attribution. In cases where the latter is presented first, the reported probability that the suspected country is responsible is noticeably higher than the global probability would warrant.
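To see why ignoring the global probability (the base rate) inflates judgements, consider a simple Bayes' rule sketch. The numbers below are purely illustrative assumptions, not figures from the study: a 10% base rate of state involvement and a 90% accurate attribution process.

```python
def posterior_state_attribution(base_rate, accuracy):
    """Bayes' rule: probability the suspected state is actually responsible,
    given a positive attribution. `accuracy` is treated as both the
    true-positive rate and the true-negative rate (hypothetical model)."""
    true_pos = accuracy * base_rate                  # correctly blamed
    false_pos = (1 - accuracy) * (1 - base_rate)     # wrongly blamed
    return true_pos / (true_pos + false_pos)

# Hypothetical numbers: 10% base rate, 90% attribution accuracy.
print(posterior_state_attribution(base_rate=0.10, accuracy=0.90))  # 0.5
```

Even with a 90%-accurate attribution, the posterior probability is only 50% under these assumptions. A participant who anchors on the accuracy figure alone, as in the primacy effect described above, would report something close to 90%.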
Beyond this experimental environment, the influence of both pre-existing beliefs and information order raises the prospect of conflict spirals. With pre-existing beliefs strengthening an adversarial mindset, political elites who overvalue initial, and possibly inaccurate, information may form sub-optimal judgements leading to inappropriate policy responses that result in increased tension.
The Need to Mitigate Bias
While the findings are by no means novel in the broader context of interstate relations, they provide crucial contributions to both theory and policy. Theoretically, the study improves our understanding of elite decision-making vis-à-vis cyberspace. The existing literature features systemic or technologically driven explanations of state behavior. While these deliver useful insights, the importance of micro-level attributes remains understudied despite their demonstrated explanatory power with respect to state behavior.
With regard to policy, the observed sub-optimal judgements only strengthen the need to institute tools and processes to better gauge evidence in the aftermath of malicious cyber behavior. Training analysts and making available tools and processes that enable a holistic assessment are crucial. Moreover, the findings also surface the need for better information sharing across responsible organizations in order to provide a diversified set of beliefs as a means of controlling for bias.
Miguel Alberto Gomez is a senior researcher at the Center for Security Studies at the ETH in Zurich. His research explores the role of heuristics and cognitive bias in the formation of cybersecurity policy and interstate interactions in cyberspace.
Gomez’s EJIS article is available free of charge until the end of June 2019.