
When the future meets the past: Can safety and cyber security coexist in modern critical infrastructures?

Abstract

Big data technologies are entering the world of ageing computer systems running critical infrastructures. These innovations promise to afford rapid Internet connectivity, remote operations or predictive maintenance. As legacy critical infrastructures were traditionally disconnected from the Internet, the prospect of their modernisation necessitates an inquiry into cyber security and how it intersects with traditional engineering requirements like safety, reliability or resilience. Looking at how the adoption of big data technologies in critical infrastructures shapes understandings of risk management, we focus on a specific case study from cyber security governance: the EU Network and Information Systems Security Directive. We argue that the implementation of the Network and Information Systems Security Directive is the first step in the integration of safety and security through novel risk management practices. It is, therefore, a move towards legitimising the modernisation of critical infrastructures. But we also show that security risk management practices cannot be directly transplanted from the safety realm, as cyber security is grounded in anticipation of future adversarial behaviours rather than the history of equipment failure rates. Our analysis offers several postulates for the emerging research agenda on big data in complex engineering systems. Building on the conceptualisations of safety and security grounded in the materialist literature across Science and Technology Studies and Organisational Sociology, we call for a better understanding of the ‘making of’ technologies, standardisation processes and engineering knowledge in a quest to build safe and secure critical infrastructures.

Keywords: Safety, cyber security, risk, critical infrastructure, materiality, expertise
Big Data & Society

The University of Bristol, Bristol, UK
Bristol Cyber Security Research Group, Bristol, UK

Corresponding author:
Ola Michalec, The University of Bristol, Bristol, UK.
Email: ola.michalec@bristol.ac.uk; aleks.michalec@gmail.com

Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage).

Introduction

What happens when new tools enter the old world? For decades, inaccessible legacy computing systems have been running critical infrastructures, like power plants, train stations or wastewater facilities. These so-called operational technologies (OTs) traditionally consisted of isolated computers controlling sensors and actuators, and often use simple binary logic (e.g. a machine turning on/off depending on a sensed ambient temperature). The proponents of infrastructure modernisation argue that legacy systems are due an upgrade – after all, they remained the same for decades (cf. Schiølin, 2020). Connecting critical infrastructures to the Internet and the world of big data would equip practitioners with the possibility of remote operations, predictive maintenance or real-time monitoring of industrial processes (Brass et al., 2018; Urquhart and McAuley, 2018). Although this paradigm shift offers interesting prospects, it also brings a novel concern, namely cyber security (Thomas et al., 2020). Moreover, the requirement for cyber security cannot be divorced from safety, as the consequences of cyber security attacks in critical infrastructure systems move into the material realm (Tanczer et al., 2018). Cyber security attacks on OTs can lead to explosions, collisions and blackouts. This necessitates novel risk management practices which are simultaneously attuned to security and safety.

Although the professional practice of cyber security risk management is novel in critical infrastructures, risk management in other domains has a long-standing tradition of evolving through controversies. What is construed as ‘risky’, ‘secure’ or ‘safe’ is a matter of debates, testing regimes and evolving standards – in other words, it is socially constructed (Marres and Stark, 2020; Stilgoe, 2021). The same could be said about how practitioners grapple with new computing technologies to arrive at judgements about a novel criterion of cyber security while preserving their traditional goal of safety. Although safety and cyber security concerns have different origins, the current direction in policy and practice is to integrate these two requirements through the harmonisation of regulatory frameworks, product standards and professional training (Kriaa et al., 2015).

The Science and Technology Studies (STS) and Organisational Sociology research on safety and risk in complex systems helps us understand to what extent security and safety risk management practices could be integrated. Looking at how the adoption of big data technologies in critical infrastructures shapes understandings of risk management, we focus on a specific case study from UK cyber security governance. We argue that the implementation of the Network and Information Systems Directive (NIS) (European Commission, 2016) commenced the process of integration between safety and security concerns across critical infrastructure providers. It has, therefore, legitimised the modernisation of OTs. By addressing the traditional engineering requirement of safety, the proponents of big data technologies managed to position the changes as acceptable to critical infrastructure practitioners. However, we also show that the risk management practices in big data-enabled critical infrastructures cannot be directly transplanted from the safety realm, as cyber security is grounded in anticipation of future adversarial behaviours rather than the history of equipment failures. Therefore, while the integration of safety and security is important for the delivery of reliable critical infrastructure services, it cannot be taken for granted. Precisely, this clash in temporalities between legacy infrastructures and big data technologies creates a gap in conceptualisations and methodologies for risk management.

The remainder of this article will proceed as follows. First, we provide an overview of security, safety, and OTs. Here, we synthesise the literature on theoretical ‘turns’ across security and safety, highlighting the material, cultural and political differences between information technologies and OTs. Next, we introduce the conceptual lens of safety, risk and security as social constructions (Pinch and Bijker, 1984; Barnes, 1993). In the section ‘Case study: NIS implementation in the United Kingdom’, we contextualise the article by familiarising the reader with the outline of the regulatory landscape. Following that, the section ‘Research design’ reports on the methods used and reflects on the opportunities and challenges of close collaborations between researchers and practitioners. We present our argument in four parts. First, by establishing that safety-security integration was key for engineers accepting the modernisation agenda. Second, by outlining collective risk management practices that enabled diverse practitioners to collaborate. Third, by highlighting how practitioners borrowed elements from safety culture and incorporated them into security. Fourth, by cautioning that epistemic and material differences between the old world of legacy technologies and novel big data tools pose limits to the future of critical infrastructure modernisation.

Background

Safety versus security

Safety and security might seem synonymous; however, there are many technical, political and cultural differences which distinguish these two requirements. Broadly, in infrastructure research, safety is concerned with prevention, protection and recovery from unintentional accidents, while (cyber)security is interested in dealing with malicious and deliberate incidents (Pietre-Cambacedes and Chaudet, 2010). However, even this high-level distinction has been subject to multiple theoretical developments. Researchers identify four main paradigms in cyber security: (1) fixing and breaking technical objects; (2) erroneous use of computers; (3) malicious political actions by the means of digital tools; (4) social construction of expertise around what is deemed worth protecting (Adams and Sasse, 1999; Dunn-Cavelty, 2018; Klimburg-Witjes and Wentland, 2021; Renaud et al., 2018). In parallel, a number of ‘turns’ have been recognised in safety research: from safety being the priority goal, trumping efficiency of processes; through accident prevention via designing-in safety into complex systems; to, finally, placing responsibility on the end-user or the operator (Elish, 2019; Norton, 2015). It is worth noting that these paradigms often co-exist over the same timescales, although they tend to reside in different professions and disciplines, without challenging each other’s assumptions. Therefore, the premise of ‘new tools entering the old world’ – managing novel risks from big data in legacy critical infrastructures – provides a unique opportunity to re-consider the established ways of thinking about both safety and cyber security.

Security and safety are distinguished by their unique temporalities. A key feature specific to the cyber security field is that it rests on novelty. New threat actors, vulnerabilities and theoretical attacks come to light regularly, often at a pace faster than the creation of regulations (Matthew and Cheshire, 2016). Cyber security risks in critical infrastructures often originate from high-profile malicious activities of organised criminals or state actors, lending itself to the use of political rhetoric, high levels of secrecy and large budget spending as means to protect critical infrastructures from any presumed existential threats (Dunn-Cavelty, 2013). Consequently, security by automation, prediction and testing becomes especially challenging in such environments due to difficulties in access to data or trusted informants. Meanwhile, in legacy OT systems, safety risk has traditionally been understood probabilistically as a ‘failure rate’: a frequency with which an engineering component fails when tested, expressed in failures per unit of time. The failure rate figure is deeply rooted in the physical properties of the system and a wealth of historical data (Ani et al., 2016).
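The probabilistic notion of safety risk described above can be illustrated with a minimal sketch (the component and all figures below are hypothetical, for illustration only): a failure rate is estimated from historical operating records as failures per unit of time, and its reciprocal gives the mean time between failures (MTBF).

```python
# Minimal sketch of the traditional OT safety calculation described above.
# All figures are hypothetical, for illustration only.

def failure_rate(failures: int, operating_hours: float) -> float:
    """Estimate a failure rate (failures per hour) from historical test data."""
    return failures / operating_hours

def mtbf(rate: float) -> float:
    """Mean time between failures: the reciprocal of the failure rate."""
    return 1.0 / rate

# e.g. a valve actuator that failed 4 times over 20,000 hours of operation
rate = failure_rate(4, 20_000)          # 0.0002 failures per hour
print(f"MTBF: {mtbf(rate):.0f} hours")  # MTBF: 5000 hours
```

Nothing comparable exists for security risk, where adversarial behaviour offers no stable historical base rate; that is precisely the contrast the section draws.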
The dynamic character of cyber security, contrasting with the static (or, at best, slowly moving) nature of safety, implies there are limits to the integration of traditional OT safety paradigms into the context of modern, interconnected critical infrastructures (Slayton and Clark-Ginsberg, 2018).

Operational technologies versus information technologies

Throughout the article, we distinguish between information technologies (computers commonly found in homes and offices) and OTs (computers operating engineering machinery) to understand how these technologies were historically constructed as separate and how they are now poised as integrating with each other. In this claim we follow calls from Kinsley (2014) and Aradau (2010) to pay attention to materiality in computers and infrastructures. Objects of cyber security – sensors, buildings, code – are not passive technologies waiting to be filled with discourses. Nor are they characterised by ‘essential’ features separating them from humans (Fouad, 2021). Instead, materiality is critical for noticing how objects become ‘agents’ of social change through practices which are both discursive and material (Aradau, 2010).

Historically, the consequences of security incidents in IT were materially different from those in OT because these systems were traditionally built for different purposes. While IT professionals are typically concerned with damage to data, hence lost revenue, customer trust or reputation, OT practitioners are mainly concerned with human safety, equipment damage and the continuous supply of ‘essential services’. Traditionally, OT systems were designed with physical resilience and safety in mind; cyber security was not a typical requirement due to the practice of ‘air-gapping’, that is, isolating OT computers from unsecured networks like the public Internet (Byres, 2013). IT systems, in contrast, are commonly interconnected, which necessitates security and privacy by design and regulation (Michalec et al., 2020). As both IT and OT systems are gaining Internet connectivity and real-time analytics functionalities, they are ‘blending’ into a single entity. And so are the previously separate concerns for cyber security and safety (Michalec et al., 2021). In short, contemporary ‘big data’ practices of OT and IT professionals are reconfiguring what critical infrastructures are made of.

The differences between OT and IT were historically not only material but also cultural, such as varying degrees of professionalisation (i.e. typical career routes and education required), or the juxtaposition of the safety culture of OT engineers and the innovation culture of IT workers (Guldenmund, 2000; Reece and Stahl, 2015; Thekkilakattil and Dodig-Crnkovic, 2015). Infrastructure providers running on OT systems are also organised very differently compared to IT companies – critical infrastructures are often hierarchical and governed through public-private partnerships, while IT companies range from start-ups to monopolies and are most often private sector entities (Dunn-Cavelty and Suter, 2009; Murray et al., 2017). These distinctions are important as they inform who gets to conceptualise risk, how they do it and why.

Theoretical framework

How do we know if machines are safe?

In order to frame this research paper, we used the conceptual lens of the social construction of safety, risk and security. This paradigm explains how technological expertise emerges, stabilises, gets contested or widely accepted (Barnes, 1993; Pinch and Bijker, 1984). Such research examines how different actors arrive at their assessments, rather than examining whether their assessments are true. Examining technological expertise involves an inquiry into situated practices, the materials of day-to-day work and debates surrounding technoscientific developments (Collins, 2007; Pinch and Swedberg, 2008; Suchman, 2007). In that vein, we first build our argument by reviewing the literature on the social construction of safety and risk before moving to the analysis of our data, which focuses on the construction of security.

The long history of safety research in complex systems like aviation (Downer, 2010), nuclear engineering (Wynne, Waterton and Grove-White, 2007; Polleri, 2020; Perrow, 1984) or autonomous vehicles (Haugland, 2020; Stilgoe, 2021) thoroughly documents and analyses the evolution of testing regimes and assurance schemes to minimise risks, recover from incidents and anticipate a range of possible scenarios. How do experts establish whether a complex system is ‘safe enough’?

Focusing on controversies, STS scholars have been tracing how debates evolve to establish complex, emerging or high-stakes technologies as ‘safe’ or ‘risky’. For example, industrial manufacturers framed safety as a matter of feeding more data to proprietary machine learning algorithms in the case of autonomous vehicles (Stilgoe, 2018) or of raising awareness among machine operators working with robots (Elish, 2019). Meanwhile, Norton (2015) showed how road safety was deprioritised over decades as the automotive industry grew in the United States. These examples illustrate that research on safety cannot be limited to laboratory experiments and test beds, as categories like ‘safety’ and ‘risk’ are inherently riddled with political and organisational contingencies.

Other researchers contributed to the social construction of safety through understanding accidents. Downer (2010) brings attention to what he calls ‘epistemic accidents’ – man-made calamities resulting from fundamental limitations of engineering tests and models which, by design, are never perfect representations of ‘real world’ conditions.
Such devices and datasets), motivations of potential attackers, events happen if ‘scientific or technological assumptions the available budget, just to name a few (Cherdantseva prove to be erroneous, even though there were reasonable et al., 2016). Moreover, risk decision makers need to and logical reasons to hold the assumptions before the account for issues which are not specific to cyber security, event’ (Downer 2010: 752). Epistemic accidents offer valu- for example, public responsibility for delivery of reliable able lessons for organisations and practitioners by revealing essential services, business models, insurance, risk appetite, inherent shortcomings of the current engineering and design reputation (Henrie, 2015; Nurse et al., 2017; Pieters and paradigms. By researching how experts work towards safety Coles-Kemp, 2011). While recent research offers reviews in complex engineering systems, we can better see the poten- of risk assessment frameworks (Kriaa et al., 2015; tial for epistemic failures and build-in practices to learn from Cherdantseva et al., 2016), it leaves a gap for understanding them. to what extent these frameworks are applied in the real- Analysis of epistemic accidents matters as it shifts atten- world context. tion from an individual’s error to interactions between opera- Previous risk studies embedded in the critical infrastruc- tors, organisational cultures and politics and machines. In ture context highlighted that risk assessments are collabora- doing so, STS scholars go beyond seeing safety accidents tive processes, rather than a matter of following a formal conventionally, as failure of individuals and their erroneous methodology (Frey et al., 2019; Shreeve et al., 2020). In use of complex systems (Pinch and Bijker, 1984; Stilgoe, doing so, they challenge the trope of ‘security expertise’ 2018). Here, the notions of error and safety are intimately being solely a technical and individual matter. 
This shows connected to (the limits to) ‘knowability’ of complex security expertise as inherently emergent, contextual and systems (Downer, 2010; Spinardi, 2019). They feed into subjective. For example, security practitioners might use a safety testing and modelling, and therefore, everyday deci- variety of reasoning strategies, such as ‘risk first’ (following sions about risk (Marres and Stark, 2020). governmental risk assessment framework) or ‘opportunity Finally, while the expertise from the safety world cannot first’ (identification of investment opportunities before con- be directly transposed to the security context, there is an sidering risks). Moreover, in practice, people exercise both overlap between these fields. Politically, both safety and kinds of reasoning, with vulnerabilities (i.e. weaknesses in security of infrastructures are prioritised by governments computer systems) are thought about most commonly, as they fundamentally relate to the ‘normal’ functioning and assets (i.e. equipment, documents, employees) least of the society (Agrafiotis et al., 2018; Shove and often, leading to an over-reliance on vulnerabilities-centred Trentmann, 2018). However, as certain security incidents threat assessments (Shreeve et al. 2020). Moreover, risk and safety accidents cannot be prevented in complex thinking is ‘front-loaded,’ as practitioners tend to think systems (Perrow, 1984), critical infrastructure operators about risk in the beginning of the decision-making often emphasise resilience and risk management. In prac- process, rather than systematically throughout the lifecycle tice, this means that the same teams could be made respon- of OT systems (Shreeve et al., 2020). Meanwhile, as threats sible for both safety and security. 
Even though cyber in cybersecurity evolve over time, risk management ought security incidents and safety accidents require separate to be iterative and regularly updated (Ani et al., 2016; root cause analyses, they might manifest as the same conse- Frey et al., 2019). quences (Agrafiotis et al., 2018; Kriaa et al., 2015), thus Risk management in the context of security often draws share commonalities in terms of risk management practices. from a practice called threat modelling to anticipate likely attackers, incident pathways, possible consequences of Risk management: Between calculation and anticipation. attacks and best ways to respond to them. The techniques Understanding risk management in critical infrastructures under the umbrella of threat modelling vary; from qualita- is a multifaceted issue of both qualitative and quantitative tive expert workshops (Wuyts et al., 2020), through math- nature (Shreeve et al. 2020). Despite the rise of rule-based ematical models based on probabilities (Markov chains, and probabilistic risk methodologies, for example, attack game theory) to graphical representations (in the forms of trees, attribute-based algorithms (Tatam et al., 2021), secur- tables, data flow diagrams and attack trees), with some ity risk is ‘incalculable’ since there are limits of what could threat modelling techniques promising full automation be inferred from scientific data (Amoore, 2014: 424). Risk and quantification of risks (Tatam et al., 2021). methodologies are ‘already political’ as they involve com- In practice, when it comes to classifying potential binatorial possibilities whose arrangement has effects on impacts, and evaluating attackers’ motivations, threat mod- risk scores, and associated countermeasures (Amoore, elling relies on qualitative expert judgement, usually a small 2014: 423). In the case of OT cyber security, this means group of domain specialists. 
However, as cyber security that we cannot simply assume that the risk rises ‘spills out’ beyond simply protecting computers, there is a Michalec et al. 5 call for broadening the scope of threat modelling. Critical (known as the Cyber Assessment Framework; NCSC, social scientists argued for anticipating risks of emerging 2019). The Cyber Assessment Framework is the key oper- technologies by including non-experts (Slupska et al., ational document pertaining to the question of cyber secur- 2021), understanding security in tandem with privacy and ity risk management of critical infrastructures in the United surveillance (Kazansky, 2021; Wuyts et al., 2020), and Kingdom. Fourteen principles of the Cyber Assessment approaching non-human actors (code, hardware, algo- Framework are set out as so-called ‘Indicators of Good rithms) as active co-creators of geopolitics (Dwyer, 2021; Practice’ (NCSC, 2019), or recommended outcomes of Fouad, 2021). The strength of such a ‘critical threat model- security improvements rather than specification how to ling’ approach would then lie in the capacity to imagine and improve cyber security. For the purpose of self- anticipate a wide range of outcomes and curate a space for assessments, each of the 14 outcomes is self-assessed in explicitly normative discussions about living with digital diverse teams comprising of both OT and IT practitioners technologies. Including actors outside of cyber security pro- according to a three-grade scale as either ‘fully achieved’, fession allows the multiplicity of futures to become visible, ‘partially achieved’ or ‘not achieved’. Following the com- as there is no single objective and optimal choice between pletion of self-assessments, operators and regulators draw security, privacy, risk appetite, resources available, reputa- agreements on the improvement plans, and conduct external tion, innovativeness, and many other factors. audits (Shukla et al., 2019; Wallis and Johnson, 2020). 
Since the successful implementation of cyber security reg- ulations requires collaboration across the IT and OT Case study: NIS implementation in the teams, it makes the cross-cutting issues of safety and secur- United Kingdom ity visible (Michalec et al., 2021) To address how critical infrastructure practitioners concep- tualise and practice security risk management, we use the Research design case study of the NIS, as implemented in the United Kingdom (DCMS, 2018). The NIS implementation prac- We conducted a qualitative study of experts managing big tices reveal how practitioners from diverse sectors grapple data risks to critical infrastructures. Between November with the modernisation of legacy OT systems. Their NIS 2019 and January 2020, we interviewed 30 practitioners compliance practices are balancing acts to build intercon- and observed two industry events focused on the implemen- nected and secure infrastructures, without compromising tation of the NIS Regulations. Our interviewees ranged on the traditional engineering goals like safety, or reliability from the critical infrastructure operators, regulators, consul- of essential services like water, energy or transport. tants, lawyers, to OT equipment manufacturers. We aimed NIS originated as a high-level supranational directive rati- to cover a range of sectors (e.g. energy, water, transport) fied by the European Parliament in 2016. Since then, it has and roles (e.g. technical, managerial, consultancy, regula- been transposed to the EU Member States and the United tory). We conducted semi-structured interviews focusing Kingdom as NIS Regulations (DCMS, 2018). 
This move on historical perspectives on the development of OTs, par- meant that while high-level objectives and international ticipants’ outlooks on the future of modernisation, interpre- cooperation mechanisms were set by the EU, the scope of tations of the Regulations and the issues around what is regulated as well as implementation mechanisms communicating security risk across professional boundar- are decided by each state and sector individually. In the ies. Questions were tailored to each participant in order to United Kingdom, the implementation of NIS follows the account for differences in sectors and professions. principles of ‘appropriateness and proportionality’ (Michels Interviews took place either at the participant’s organisa- and Walden, 2018), which necessitates careful deliberation tion, our institution or via online calls, with the lead over designation of the operators falling under the purview author conducting all interviews. All conversations were of regulations, thresholds of incident reporting and recorded with the interviewees’ consent. No reimbursement maximum penalties. NIS is known as ‘principles-based regu- was given for participation. Our analysis is complemented lation,’ meaning that critical infrastructure operators work by an in-depth reading of the Cyber Assessment towards meeting the governmental objectives without speci- Framework (NCSC, 2019), a UK-specific document outlin- fication how to achieve such goals (Michels and Walden, ing what the outcomes of ‘good’ risk management is secur- 2018). The government’s reasoning behind this move is to ity look like. avoid ‘box ticking’ style of compliance and contextualize Our approach responds to the calls by de Goede (2020) risk management. In the eyes of the UK’sNational Cyber for increased engagements between empirical research on Security Centre, ‘this encourages innovation and expands expert practices and critique. 
By treating the implementa- the breadth of technologies we can assure’ (NCSC, 2021). tion of cyber security regulations as ‘situated practices’, Risk assessment is embedded in NIS implementation we bring our attention to the notion of expertise construc- from the beginning. The implementation procedures in tion and de-centre policy discourses or legal analysis. By the United Kingdom begin with a self-assessment stage following practitioners and practices, we were able to 6 Big Data & Society gain trust of our informants, appreciate the diversity of their critical infrastructures. This agrees with the overarching justi- expertise, and their disagreements and material artefacts fication behind the UK National Cyber Strategy which claims they work with. As a result, long after data collection that the ongoing and rapid expansion of digital connectivity is period finished, the first author of the paper is still collabor- a main driver behind cyber security regulations (Cabinet ating with practitioners; publishing government guidance or Office, 2022: 29). Effectively, NIS is the first step to legitimise giving regular industry talks. The downside of research the modernisation of critical infrastructures: approaches relying on in-depth engagement with practi- tioners is the possibility of losing ‘critical distance’ and ‘there are some instances where the best answer would be getting ‘co-opted’ by practitioners’ agenda (de Goede, to innovate legacy. NIS has not ever come up with a recom- 2020). This is especially challenging when working with mendation that there should be greater digitisation, but practitioners whose goals are both normative and open to what it did say is: “There are certain expectations, particu- interpretation, like security and safety. 
In our case, we navi- larly around configuration and software management gated that tension by highlighting the plurality and contin- where it was very hard to deal with a legacy.” So, some gency of expertise, rather than promoting a single vision. In people found themselves caught in a business case terms for further research avenues, this research agenda between a technology refresh which, frankly, was overdue would benefit from an in-depth investigation of single quan- anyway, or retaining legacy systems for reliability titative threat modelling methodology (e.g. Markov chains); reasons with negative cyber security implications.’ following the datasets, construction of algorithms, model- (Interview with energy sector working group lead) ler’s assumptions and how results of risk assessments are However, while bringing the diversity of expertise translated (or not) into organisational decisions. allowed to advance and integrate risk management prac- tices, there are fundamental differences between safety Towards modernisation of critical and security. Therefore, the future of big data in critical infrastructures is still uncertain. In what follows, we will infrastructures examine practices which enabled that integration as well Our research shows that the introduction of security regula- as highlight epistemic and materials differences between tions into the world of legacy safety-critical systems these two requirements. prompted harmonisation of these two requirements. In turn, this move legitimised the modernisation of legacy OT envir- onments. However, due to fundamental differences between Hiveminds and other collaborations managing safety and security risks, the modernisation of crit- Diverse expertise ical infrastructures cannot be taken for granted. At first, engineers exhibited resistance to the modernisa- What makes risk management across security and safety tion agenda: ‘oursectorisadoptingIndustrialInternetof successful? 
First, it is contingent on the access to diverse Things at a frightening rate, and we’ll have little idea as to expertise within an organisation, and how effectively what it looks like and how to secure it’ (interview with recommendations are communicated to those in charge of water sector operator 1). The dominating mood was cynicism decision making, who are usually senior managers about big data technologies being introduced to increase man- without the expertise in security: ‘So the security engineers ufacturers’ profits: ‘people see opportunities to deliver a new might be quite grumpy because the manager just does not shiny box, a new system, a new bit of software, a new service. understand their problems. But the engineers also do not So, that is really, really driving and almost pushing along understand there is a bigger picture going on here, e.g., innovation in the market’ (interview with OT security con- that a power station needs to provide an ongoing supply sultant 1). However, a pivotal moment occurred when of electricity’ (Interview with engineering consultant 2). safety and security professionals started working together Second, cyber security risk assessment requires diverse with the regulators to identify how their requirements map inputs – apart from traditional technical experts, human onto the Cyber Assessment Framework and create a factors practitioners are needed to anticipate how workers common benchmark for the whole sector: ‘it is like an could be employing workarounds against security mea- exam board where you get together and make sure all the sures, so that they could improve the usability of security markers are assessing against the same criteria. We often practices. 
In the words of our interviewee, a water regulator, went back to the regulator pointing out where NIS did not ‘You can create more risks by going overboard with too make sense in our context of OT technologies’ (Interview stringent and annoying security measures where people with water operator 2). try and find work arounds. Water plant operator working Precisely, that bringing together of diverse experts enabled with time-critical systems cannot afford 30 seconds delay safety-security integration. Regulators, by listening to the con- if they typed their password incorrectly’ (Interview with cerns from safety engineers, adjusted the Cyber Assessment water regulator no 3). As such, NIS does not only regulate Framework guidance, to facilitate digital connectivity in technologies, but also how people use them. Michalec et al. 7 In the eyes of participants, these hiveminds, usually Trust in collaborations expressed as semi-formal working groups, are better suited Security risk assessment is also contingent on trust. In particu- for sharing expertise than regulations. In other words, risk lar, it is trust between IT workers and OT workers filling the management practices preferred by the participants are rela- Cyber Assessment Framework: ‘when Idomyriskassessment tional and collaborative, rather than top-down and individua- of systems we rely on, I’ve got to assume that the guy doing the lised. As our interviewee put it, ‘working groups could be IT bit has got his IT correctly’ (interview with OT water oper- tasked with a creation of sector-specific process standards ator 1). Furthermore, successful risk management happens if which would be a collective endeavour rather than an indi- security experts manage to establish a trusting relationship vidual activity of ‘box ticking’ (interview with water operator with the board members and gain ‘buy-in from senior manage- 1). 
An example from an energy sector working group shows ment to invest in cyber’ (Interview with energy regulator 2). that collaborating on risk assessment was easier before regu- Ultimately, senior managers are the budget holders and lations came into force: ought to see how security improvements translate into organ- isational goals, be it by providing reliable energy supply, or ‘In 2013, we did a UK-wide risk assessment. We anon- ensuring workplace safety on a train station. While partici- ymised responses from individual companies, we aggre- pants acknowledged that connecting security practitioners to gated it and so we could come up with two things. First, board level executives has traditionally been a challenge, where collective gaps and difficulties, so that we could they are now gaining techniques for better engagement: ‘I request help from the government departments. Second, would stop talking about the threats, the executives know we found out there was a difference between the best prac- about the threats. Instead, say how we are looking after the tice in some and those who were struggling and there we business and its core critical functions’ (interview with a introduced knowledge sharing opportunities. We then vendor of security products). Ultimately, cyber security risk allowed the good ones to present their approaches and management is seen in the context of broader risk manage- the others could learn so we got a best practice learning ment, where practitioners across diverse teams are encouraged environment’ to reflect: ‘How much risk can we tolerate as an organisation? How much do we value our reputation? What is our attitude (Interview with energy sector working group lead). towards legislation and regulation? It is all interconnected’ The implementation of the Cyber Assessment (Interview with an IT security consultant). 
Indeed, in this Framework reveals three crucial aspects pertaining to the case, security is a matter of care (Kocksch et al., 2018) social construction of risk management: professional prac- where security budgets are considered as a matter of long-term tices as objects of regulations, cyber security mapped to maintenance of whole organisations rather than cutting-edge broader organisational goals, and practitioners collaborat- technological ‘solutions’. ing to create ‘risk thinking hiveminds’ that capture risk management practices across their sectors. Just like safety regulations in critical infrastructures (Downer, 2010), NIS Building a ‘risk thinking’ hivemind regulates trust in professional practices, rather than tech- One of the pressing questions for the critical infrastructure nologies. Cyber security has been placed in the broader practitioners is how NIS could avoid being a tick-box exer- organisational context of safety, usability or reliability. cise. The UK Government designed the Cyber Assessment Finally, faced with the novelty of cyber security regulations Framework as an outcomes-based document to ‘discourage in the legacy environments, practitioners collaborated to compliance thinking’ (NCSC, 2019). However, by provid- manage the overlapping risks of safety and security. ing a set of ‘good outcomes’ rather than policies on how to Lacking prescriptive guidance, they created a ‘risk-thinking achieve them, the Cyber Assessment Framework received hivemind’ to collectively work towards their goals. criticisms for ‘leaving everything up for negotiation’ (inter- view with energy regulator 2). On the one hand, outcomes- Towards harmonisation of safety and based regulations are suitable for dynamic contexts, like security cyber security, where new risks emerge regularly and there are multiple ways to ‘do the right thing’. 
On the other hand, outcomes-based regulations rely on a baseline level of expertise whereby practitioners can exercise expert judgement on risk: 'we want people to use the Cyber Assessment Framework as a sanity check rather than a procedure to follow to the letter to protect their own reputation' (interview with energy regulator 1). And so, practitioners called for raising the level of expertise across the whole sector, creating what we call a 'risk thinking hivemind'.

In the eyes of participants, these hiveminds, usually expressed as semi-formal working groups, are better suited for sharing expertise than regulations. In other words, the risk management practices preferred by the participants are relational and collaborative, rather than top-down and individualised. As our interviewee put it, 'working groups could be tasked with a creation of sector-specific process standards which would be a collective endeavour rather than an individual activity of "box ticking"' (interview with water operator 1). An example from an energy sector working group shows that collaborating on risk assessment was easier before the regulations came into force:

'In 2013, we did a UK-wide risk assessment. We anonymised responses from individual companies, we aggregated it and so we could come up with two things. First, collective gaps and difficulties, so that we could request help from the government departments. Second, we found out there was a difference between the best practice in some and those who were struggling, and there we introduced knowledge sharing opportunities. We then allowed the good ones to present their approaches and the others could learn, so we got a best practice learning environment' (Interview with energy sector working group lead).

The implementation of the Cyber Assessment Framework reveals three crucial aspects pertaining to the social construction of risk management: professional practices as objects of regulation, cyber security mapped to broader organisational goals, and practitioners collaborating to create 'risk thinking hiveminds' that capture risk management practices across their sectors. Just like safety regulations in critical infrastructures (Downer, 2010), NIS regulates trust in professional practices, rather than technologies. Cyber security has been placed in the broader organisational context of safety, usability or reliability. Finally, faced with the novelty of cyber security regulations in legacy environments, practitioners collaborated to manage the overlapping risks of safety and security. Lacking prescriptive guidance, they created a 'risk thinking hivemind' to collectively work towards their goals.

Towards harmonisation of safety and security

Let us now turn to how cyber security practitioners integrated practices from safety engineering in their work to blend the 'digital' and 'engineering' worlds.

Threats and incidents reporting

In the event of a cyber security incident, operators have to report it to the regulator and evidence that they took 'appropriate and proportionate' measures to mitigate risks in order to avoid a penalty (NCSC, 2019). However, there is no obligation to report ongoing threats, that is, prospective malicious activities and actors that are yet to hit a computer network. These caveats resulted in ongoing debates on defining reporting thresholds for incidents, and even on distinguishing between a threat and an incident (DCMS, 2021). The dilemma lies in the fluid nature of these terms. On the one hand, encouraging the reporting of ongoing threats improves collective intelligence, the aforementioned 'risk thinking hivemind'. On the other, if a threat reported by one organisation turns into an incident in another, both organisations may receive fines. This lateral way in which malware propagates is a well-known phenomenon in interconnected complex environments (Dwyer, 2018), but historically it was not a concern in disconnected critical infrastructures. As a result, these contingencies of threat reporting pose a risk that operators will minimise their reporting altogether. Evidence from critical infrastructure security regulations in the United States shows that fear of fines created a counterproductive environment for information sharing (Clark-Ginsberg and Slayton, 2019).

In order to encourage operators to report on developing threats, water regulators broadened the reporting scope so that all security incidents and safety accidents, however minor, had to be reported under the same umbrella. This also led to discussions among practitioners about reporting 'near misses', threats which did not have a significant impact on their network, showing that the thresholds of harmful events are subject to ongoing debate. This move signals that both incidents and accidents are bound to happen, and that reporting ongoing threats (even if not yet materialised as security incidents or safety accidents) will not be stigmatised.

However, this practice is not uniform across all critical infrastructure sectors. Right now, energy regulators do not have the same level of insight. In order to allow further integration of security and safety, regulators advocated for improved capabilities to observe the dynamic nature of threat actors and typical attacks: 'it would be of a real interest to us, but currently this is a voluntary procedure' (Energy regulator 1). Although 2020 saw numerous attempted security breaches, none of them were reported to NIS regulators as they did not lead to loss of supply or power outages; such lenient reporting criteria also raised suspicions in the national news, which questioned whether the NIS reporting criteria in the energy sector are fit for purpose (Martin, 2021).

Maintenance contracts

Deciding on the ownership of cyber security risks proved very challenging: 'when you start looking at the scope of NIS, which is one of the first things you do, you ask yourself, what do you really depend on? Very complicated, and no one person owns it, and you look at all the interdependencies as you get further away from the core' (interview with engineering consultant). In particular, it is the international nature of internet services (e.g. cloud providers) which highlights the difficulty of drawing a clear boundary around cyber security risks (and, indeed, around the scope of NIS itself!): 'a whole chunk of security is now outsourced to the Cloud provider overseas, so critical infrastructure operators lose control over it' (IT security vendor).

Yet again, well-established practices from safety engineering could come to the rescue, with maintenance contracts between third-party suppliers and operators recommended as a way to uphold good standards of security over time: 'long term improvement is a matter of maintenance contracts. So that is important also, is that if you are buying an expensive piece of equipment you want to have it supported for a long time, otherwise you do not have a business case to use that supplier' (interview with a rail engineer).

While borrowing professional practices from the safety culture might help engineers understand cyber security, the complexities around global supply chains and the scope of NIS remain. In an example from one of the critical infrastructure sectors (Wallis and Johnson, 2020), the NIS regulators cannot oversee the security measures of data centres located outside of the United Kingdom. However, critical infrastructure operators are still legally obliged to arrive at bilateral contracts with data centre providers to meet the requirements of NIS. The requirement for security remains, but less so the clarity about who validates the process (Wallis and Johnson, 2020).

To conclude, by borrowing established practices and terms from the safety culture context, NIS practitioners were able to make cyber security more familiar to critical infrastructure engineers. Encouraging broader incident reporting and establishing maintenance contracts opened new discussions, highlighting the complexity of cyber security in interconnected, big data environments like the cloud.

Dissonant harmonies: The limits to integration of safety and security

Despite the opportunities for safety-security harmonisation expressed through professional practices, this section argues that there are fundamental epistemic and material differences between legacy OT environments and big data practices.

Prescriptive thinking

First, let us return to the collective risk assessments we identified earlier in the analysis. The creation of 'risk thinking hiveminds' which consolidate security knowledge across the sector could be complicated by the tendency to work in a prescriptive manner common in safety engineering. One of the water regulators appeals:

'Getting companies used to doing risk assessment rather than compliance is key. Our safety framework was completely prescriptive: a list of measures that you must have on water plants, e.g., you must have this kind of lock fitted in this kind of way by one of these companies.
Which companies love, because they can cost it up and go along to Ofwat and say, "We need exactly this much money to do this much work over this number of years."' (Interview with water regulator 1)

Risk thinking at the intersection of safety and security would necessitate encompassing novel big data practices, that is, back-ups for real-time environments, asset inventories for equipment operating automated processes, and anomaly detection on segregated computer networks (NCSC, 2021). These practices are not familiar to safety engineers, who typically work with legacy systems where computer networks were not traditionally monitored, backed up or segregated. Moreover, no single risk management framework covers all recommended risk management practices, with 'various countries having their own standards. So, it is horses for courses and some of the best solutions I have seen are basically taking a blend of several of the standards' (interview with OT security consultant). But to what extent are engineers willing to let go of prescriptive thinking and, instead, start blending various frameworks or anticipating futures? A shift to a 'risk thinking' culture would necessitate a major change in the 'epistemic culture' of safety engineering, to borrow a term from Knorr-Cetina (1999). Epistemic culture refers to an established way of accessing, validating and advancing knowledge in a given expert community (Knorr-Cetina, 1999). Nonetheless, changing a culture established over many decades is a mammoth task beyond the scope of a single regulatory initiative.

Secrecy restricts learning

Our second point relates to the secrecy challenges with accessing the data required to differentiate between security and safety. Is an anomaly in the system due to an error or a hacker? Did the blackout result from a storm or a cyber security attack?

Earlier in our analysis, we examined the integrated reporting of security and safety events as a means to harmonise these two requirements. However, integrated reporting of security incidents and safety accidents yields limited lessons for operators if they cannot learn what caused a harmful event. Currently, the lack of separate root cause analysis limits further integration of the safety and security paradigms in engineering:

'There is a lot of reporting that does not tend to think it is cyber security but actually that could feed into the cyber risk picture that we need to bring into the mix. We should be asking: was it a safety incident, security, or something else? What did we learn from it? It all needs to be put into the pot. It is like you are telling people, "Something bad has happened". They need to know: "Well, actually, what can I do about it?" And I think there needs to be more done about turning those incidents into lessons for best practice' (interview with engineering consultant).

A parallel gap resides in the practice of threat modelling in critical infrastructures. The lack of historical data on security attacks in OT environments poses challenges to the modelling of future threats: 'when you're looking for a record of past incidents to take to your senior management and all you can show them is a brief declassified document with barely any information, they can say, "Well, is that all you have got? If there are hardly any incidents, maybe we should not be spending more money and effort?"' (Interview with engineering consultant 3).

Why is it so difficult to obtain data? Access to information on threat actors and past incidents is highly limited due to the sensitivity of the topic. Complex procedures around data classification, information exchanges and even day-to-day interactions give rise to secrecy as a dominant practice in social interactions. For example, some of our participants were unable to have their cameras on during the interviews because their work incorporated both offensive and defensive security (i.e. they were simultaneously hackers and defenders). Such restrictive norms around communication raise the possibility of 'epistemic accidents' (or, rather, 'incidents', if we are concerned with the intentional and malicious nature of security attacks), events highlighting the limits of established practices across engineering and computing (Downer, 2010). A telling example would be a cyber security incident hitting an underprepared organisation that incorrectly extrapolated the rarity of cyber security attacks from scant declassified data. In such a case, a cyber security attack would be a consequence of poor communication and mistrust across critical infrastructure organisations.

Logics of risk assessment

The final point of contention relates to the very logics of risk assessment across safety and security. While the probability of safety failures is well grounded in historical records and component testing (Michalec et al., 2021), security incidents in the OT space are a function of anticipating malicious behaviours while relying on sparse historical data, which does not lend itself easily to the logics of probabilistic prediction. Considering active adversarial actions from highly skilled actors like organised criminals or state-sponsored hackers brings a contentious dimension to the practice of risk management. In practice, it means that engineers will have to conduct an explicitly political and normative analysis, and they are not necessarily ready to acknowledge this: 'if state sponsored hackers bring a power station down, then we have to react. But that is difficult, because then you are definitely into politics. We are a non-political, non-government organisation, we only do what we can' (Interview with Incident Response Director).
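The asymmetry between these two logics can be made concrete with a back-of-the-envelope calculation. The sketch below (in Python, with invented illustrative numbers rather than data from our study) contrasts a failure probability estimated from decades of maintenance records with the bound one can put on attack likelihood from a near-empty incident record; even the classic 'rule of three' upper bound is wide, and it still assumes a stationary process, an assumption an adaptive adversary violates by definition.

```python
import math

def p95_upper_bound(observation_years: float, events: int) -> float:
    """Rough 95% upper bound on an annual event rate.

    With zero recorded events this is the classic 'rule of three'
    (3/n); otherwise a normal approximation around the observed count.
    Either way it presumes events arrive at a stable, history-like
    rate -- plausible for worn pump seals, not for adversaries.
    """
    if events == 0:
        return 3.0 / observation_years
    return (events + 1.96 * math.sqrt(events)) / observation_years

# Safety logic: 10,000 pump-years of maintenance records, 25 failures.
safety_point = 25 / 10_000                  # 0.0025 per pump-year
safety_bound = p95_upper_bound(10_000, 25)  # ~0.0035, a tight bound

# Security logic: 40 operator-years of declassified reports, 0 incidents.
security_bound = p95_upper_bound(40, 0)     # 0.075, roughly 20x looser

print(f"safety:   point {safety_point:.4f}, 95% bound {safety_bound:.4f}")
print(f"security: 95% bound {security_bound:.4f} (and non-stationary anyway)")
```

The point is not the arithmetic but its failure mode: the sparse-data bound answers 'how often has this happened?', whereas the question security practitioners face is 'what will a motivated adversary do next?', which no failure-rate statistic addresses.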
In order to escape being drawn into politics, industry actors propose machine learning as a data-driven, objective means of risk assessment (Dragos, 2019). However, apart from any risk analysis being far from objective due to the aforementioned 'incalculability of risk' (Amoore, 2014), the practice of automated anomaly detection using machine learning in particular is seen as contentious due to the low diversity of the modelling data used to train machine learning algorithms: 'There's not enough randomness in the datasets themselves to say the type of algorithms they use were going to have perfect detection rates' (interview with energy regulator 3). Consequently, practitioners are 'not afraid to lose their jobs' as 'although the networks are evolving and there is more information, we will always have a human operator checking the anomalies' (interview with rail engineer).

Ultimately, the NIS Regulations exposed a tension between two vastly different logics of risk assessment across security and safety: future-grounded and explicitly normative anticipation versus past-based probabilistic prediction.

Concluding thoughts

Can new tools be useful, or work at all, in a world that has not been designed and built to accommodate them? Can critical infrastructures, with their paramount concern for safety, adjust to the new reality brought about by instant connectivity and big data? Can safety and security coexist? The modernisation of legacy systems with big data technologies brings about the need to reconsider traditional paradigms in both engineering and computing in order to successfully integrate them. Tracing the attempts to harmonise diverse computing and engineering requirements, we drew on the case study of the NIS Regulations, which bring attention to the management of risks to critical infrastructures. While previous research on critical infrastructure risk management accounts for the variety and sophistication of risk assessment methods (Kriaa et al., 2015; Cherdantseva et al., 2016) as well as the topical coverage of various frameworks and standards (Topping et al., 2021), we brought attention to the social construction of risk, safety and security; in other words, to what happens when traditional safety practices meet novel big data practices.

We argue that the introduction of security regulations into the world of legacy safety-critical systems prompted a harmonisation of these two requirements. The integration of safety and security was afforded by collective risk management practices: (1) conducting risk assessment in diverse teams; (2) mapping cyber security onto organisational goals with senior stakeholders; and (3) practitioners collaborating to create 'risk thinking hiveminds' capturing good practices across their sectors. Next, we also showed that the implementation of NIS created opportunities to borrow established terms and practices from safety engineering and incorporate them into security procedures. In doing so, NIS serves as a vehicle that enables incorporating cyber security into existing engineering professions, organisational structures, and maintenance contracts with third-party suppliers. On the other hand, however, there are major epistemic and material differences between the safety and security domains, such as prescriptive attitudes to risk in safety engineering standards, or secrecy restricting cyber security information sharing. Overall, despite the attempts to integrate safety with security, the paradox is that big data computing and legacy engineering environments belong to different and incompatible worlds. The dissonance is expressed in the following three forms: (1) epistemic culture: risk versus prescription; (2) secrecy restricting collective learning; and (3) different logics of risk assessment. The logics of anticipation and connectivity favoured in big data environments do not fit easily into the prescriptive and siloed world of OT engineering, meaning that the modernisation of critical infrastructures will continue to pose challenges and cannot be taken for granted.

The implementation of NIS is the first step in the integration of safety and cyber security; therefore, it is the move towards legitimising the modernisation of critical infrastructures with big data. But we also show that cyber security risk management practices cannot be directly transplanted from the safety realm, as cyber security is grounded in the anticipation of future adversarial behaviours rather than the history of equipment failure rates. While the harmonisation of safety and security standards and organisational practices is important for the delivery of reliable critical infrastructure services, this process cannot be taken for granted and, consequently, we call for a better understanding of the making of technologies, standardisation processes and engineering knowledge in a quest to build safe and secure modern critical infrastructures. Despite epistemic accidents and incubation over long periods of unreliability and controversy, we learn a lot from the histories of safety and engineering paradigms.

Acknowledgements

We thank our participants for sharing their experiences and inviting us to their professional events. We would also like to thank Dr John Downer, Lars Gjesvik, as well as two anonymous reviewers, for their insightful comments on early versions of the paper. Finally, we would like to extend our gratitude to the European Cyber Security Seminar community, where we had a chance to present a seminar based on this research project.

Declaration of conflicting interests

The authors declared no potential conflicts of interest with respect
to the research, authorship, and/or publication of this article.

Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Cyber Security Centre and by the project "Power – Understanding disruptive powers of IoT in the energy sector", funded by the PETRAS National Centre of Excellence (part of UKRI, grant number EP/S035362/1).

ORCID iDs

Ola Michalec https://orcid.org/0000-0003-3807-0197
Sveta Milyaeva https://orcid.org/0000-0002-0156-5359
Awais Rashid https://orcid.org/0000-0002-0109-1341

Notes

1. Workshop with water suppliers, November 2019, Leeds.
2. Workshop for critical infrastructure operators, October 2019, London.
3. Ofwat is the economic regulator for the water sector in England and Wales, setting maximum investment budgets and water pricing.

References

Adams A and Sasse MA (1999) Users are not the enemy. Communications of the ACM 42(12): 40–46.
Agrafiotis I, Nurse JC, Goldsmith M, et al. (2018) A taxonomy of cyber-harms: Defining the impacts of cyber-attacks and understanding how they propagate. Journal of Cybersecurity 4(1): 1–15. doi:10.1093/cybsec/tyy006.
Amoore L (2014) Security and the incalculable. Security Dialogue 45(5): 423–439. doi:10.1177/0967010614539719.
Ani UPD, He H (Mary) and Tiwari A (2016) Review of cybersecurity issues in industrial critical infrastructure: Manufacturing in perspective. Journal of Cyber Security Technology 1(1): 32–74. doi:10.1080/23742917.2016.1252211.
Aradau C (2010) Security that matters: Critical infrastructure and objects of protection. Security Dialogue 41(5): 491–514.
Barnes TJ (1993) Whatever happened to the philosophy of science? Environment and Planning A 25(3): 301–304.
Brass I, Tanczer LM, Carr M, et al. (2018) Standardising a moving target: The development and evolution of IoT security standards. IET Conference Publications 2018(CP740). doi:10.1049/CP.2018.0024.
Byres E (2013) The air gap: SCADA's enduring security myth: Attempting to use isolation as a security strategy for critical systems is unrealistic in an increasingly connected world. Communications of the ACM 56(8): 29–31.
Cabinet Office (2022) National cyber strategy. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1053023/national-cyber-strategy-amend.pdf (accessed 13 June 2022).
Cherdantseva Y, Burnap P, Blyth A, et al. (2016) A review of cyber security risk assessment methods for SCADA systems. Computers & Security 56: 1–27.
Clark-Ginsberg A and Slayton R (2019) Regulating risks within complex sociotechnical systems: Evidence from critical infrastructure cybersecurity standards. Science and Public Policy 46(3): 339–346.
Collins H (2007) The uses of sociology of science for scientists and educators. Science & Education 16: 217–230.
de Goede M (2020) Engagement all the way down. Critical Studies on Security 8(2): 101–115.
Department for Digital, Culture, Media and Sport – DCMS (2018) The NIS Regulations. Available at: https://www.gov.uk/government/collections/nis-directive-and-nis-regulations-2018
Department for Digital, Culture, Media and Sport – DCMS (2021) Government response to the call for views on amending the security of network and information systems regulations. Policy paper. Available at: https://www.gov.uk/government/publications/government-response-on-amending-the-nis-regulations/government-response-to-the-call-for-views-on-amending-the-security-of-network-and-information-systems-regulations
Downer J (2010) Trust and technology: The social foundations of aviation regulation. The British Journal of Sociology 61(1): 83–106.
Dragos (2019) Key Considerations for Selecting an Industrial Cybersecurity Solution for Asset Identification, Threat Detection, and Response. Report. Available at: https://www.dragos.com/wp-content/uploads/Key-Considerations-Industrial-Cybersecurity-Solution.pdf
Dunn Cavelty M (2013) From cyber-bombs to political fallout: Threat representations with an impact in the cyber-security discourse. International Studies Review 15(1): 105–122.
Dunn-Cavelty M (2018) Cybersecurity research meets science and technology studies. Politics and Governance 6(2): 22–30.
Dunn-Cavelty M and Suter M (2009) Public–private partnerships are no silver bullet: An expanded governance model for critical infrastructure protection. International Journal of Critical Infrastructure Protection 2(4): 179–187.
Dwyer AC (2018) The NHS cyber-attack: A look at the complex environmental conditions of WannaCry. RAD Magazine 44.
Dwyer AC (2021) Cybersecurity's grammars: A more-than-human geopolitics of computation. Area 00: 1–8. doi:10.1111/area.12728.
Elish MC (2019) Moral crumple zones: Cautionary tales in human-robot interaction. Engaging Science, Technology, and Society 6: 1–29 (pre-print).
European Commission (2016) NIS Directive. Available at: https://digital-strategy.ec.europa.eu/en/policies/nis-directive
Fouad NS (2021) The non-anthropocentric informational agents: Codes, software, and the logic of emergence in cybersecurity. Review of International Studies: 1–20.
Frey S, Rashid A, Anthonysamy P, et al. (2019) The good, the bad and the ugly: A study of security decisions in a cyber-physical systems game. IEEE Transactions on Software Engineering 45(5): 521–536. doi:10.1109/TSE.2017.2782813.
Guldenmund FW (2000) The nature of safety culture: A review of theory and research. Safety Science 34(1–3): 215–257.
Haugland BT (2020) Changing oil: Self-driving vehicles and the Norwegian state. Humanities and Social Sciences Communications 7(1): 1–10. doi:10.1057/s41599-020-00667-9.
Henrie M (2015) Cyber security risk management in the SCADA critical infrastructure environment. Engineering Management Journal 25(2): 38–45. doi:10.1080/10429247.2013.11431973.
Kazansky B (2021) 'It depends on your threat model': The anticipatory dimensions of resistance to data-driven surveillance. Big Data & Society 8(1): 1–12. doi:10.1177/2053951720985557.
Kinsley S (2014) The matter of 'virtual' geographies. Progress in Human Geography 38(3): 364–384.
Klimburg-Witjes N and Wentland A (2021) Hacking humans? Social engineering and the construction of the "deficient user" in cybersecurity discourses. Science, Technology, & Human Values 46(6): 1316–1339.
Knorr-Cetina K (1999) Epistemic Cultures: How the Sciences Make Knowledge. Cambridge, MA: Harvard University Press.
Kocksch L, Korn M, Poller A, et al. (2018) Caring for IT security: Accountabilities, moralities, and oscillations in IT security practices. Proceedings of the ACM on Human-Computer Interaction 2(CSCW): 1–20. doi:10.1145/3274361.
Kriaa S, Pietre-Cambacedes L, Bouissou M, et al. (2015) A survey of approaches combining safety and security for industrial control systems. Reliability Engineering & System Safety 139: 156–178.
Marres N and Stark D (2020) Put to the test: For a new sociology of testing. The British Journal of Sociology 71(3): 423–443.
Martin A (2021) UK cyber security law forcing energy companies to report hacks has led to no reports, despite numerous hacks. Sky News. Available at: https://news.sky.com/story/uk-cyber-security-law-forcing-energy-companies-to-report-hacks-has-led-to-no-reports-despite-numerous-hacks-12254296
Matthew A and Cheshire C (2016) Trust and community in the practice of network security. Preprint. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2756244
Michalec OA, Van Der Linden D, Milyaeva S, et al. (2020) Industry responses to the European directive on security of network and information systems (NIS): Understanding policy implementation practices across critical infrastructures. In: Sixteenth Symposium on Usable Privacy and Security (SOUPS 2020), Virtual, pp. 301–317. USENIX.
Michalec O, Milyaeva S and Rashid A (2021) Reconfiguring governance: How cyber security regulations are reconfiguring water governance. Regulation & Governance: 1–18.
Michels JD and Walden I (2018) How safe is safe enough? Improving cybersecurity in Europe's critical infrastructure under the NIS Directive. Queen Mary School of Law Legal Studies Research Paper No. 291/2018. Available at SSRN: https://ssrn.com/abstract=3297470
Murray G, Johnstone MN and Valli C (2017) The convergence of IT and OT in critical infrastructure. In: Proceedings of the 15th Australian Information Security Management Conference, Perth, 5–6 December 2017, pp. 149–155. Edith Cowan University. doi:10.4225/75/5a84f7b595b4e.
National Cyber Security Centre (2019) Cyber Assessment Framework guidance. Available at: https://www.ncsc.gov.uk/collection/caf
National Cyber Security Centre (2021) Technology assurance. Guidance. Available at: https://www.ncsc.gov.uk/collection/technology-assurance/future-technology-assurance/whitepaper-developing-a-new-approach-to-assurance
Norton P (2015) Four paradigms: Traffic safety in the twentieth-century United States. Technology and Culture 56(2), Special Issue: (Auto)Mobility, Accidents, and Danger: 319–334. Available at: https://www.jstor.org/stable/24468867 (accessed 16 December 2021).
Nurse JRC, Creese S and de Roure D (2017) Security risk assessment in internet of things systems. IT Professional 19(5): 20–26. doi:10.1109/MITP.2017.3680959.
Perrow C (1984) Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.
Pieters W and Coles-Kemp L (2011) Reducing normative conflicts in information security. In: Proceedings of the 2011 New Security Paradigms Workshop (NSPW '11). doi:10.1145/2073276.
Piètre-Cambacédès L and Chaudet C (2010) The SEMA referential framework: Avoiding ambiguities in the terms "security" and "safety". International Journal of Critical Infrastructure Protection 3(2): 55–66.
Pinch TJ and Bijker WE (1984) The social construction of facts and artifacts: Or how the sociology of science and the sociology of technology might benefit each other. In: The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, Anniversary Edition, pp. 11–44. Available at: https://mitpress.mit.edu/books/social-construction-technological-systems-anniversary-edition (accessed 16 December 2021).
Pinch T and Swedberg R (2008) Living in a Material World: Economic Sociology Meets Science and Technology Studies. Cambridge, MA: The MIT Press.
Polleri M (2020) Post-political uncertainties: Governing nuclear controversies in post-Fukushima Japan. Social Studies of Science 50(4): 567–588.
Reece RP and Stahl BC (2015) The professionalisation of information security: Perspectives of UK practitioners. Computers & Security 48: 182–195.
Renaud K, Flowerday S, Warkentin M, et al. (2018) Is the responsibilization of the cyber security risk reasonable and judicious? Computers & Security 78: 198–211.
Schiølin K (2020) Revolutionary dreams: Future essentialism and the sociotechnical imaginary of the fourth industrial revolution in Denmark. Social Studies of Science 50(4): 542–566.
Shove E and Trentmann F (2018) Infrastructures in Practice: The Dynamics of Demand in Networked Societies. New York: Routledge.
Shreeve B, Hallett J, Edwards M, et al. (2020) 'So if Mr Blue Head here clicks the link…' Risk thinking in cyber security decision making. ACM Transactions on Privacy and Security (TOPS) 24(1): 1–29. doi:10.1145/3419101.
Shukla M, Johnson SD and Jones P (2019) Does the NIS implementation strategy effectively address cyber security risks in the UK? In: 2019 International Conference on Cyber Security and Protection of Digital Services (Cyber Security 2019). Institute of Electrical and Electronics Engineers Inc. doi:10.1109/CyberSecPODS.2019.8884963.
Slayton R and Clark-Ginsberg A (2018) Beyond regulatory capture: Coproducing expertise for critical infrastructure protection. Regulation & Governance 12(1): 115–130.
Slupska J, Dawson Duckworth SD, Ma L, et al. (2021) Participatory threat modelling: Exploring paths to reconfigure cybersecurity. In: Conference on Human Factors in Computing Systems – Proceedings. doi:10.1145/3411763.3451731.
Spinardi G (2019) Performance-based design, expertise asymmetry, and professionalism: Fire safety regulation in the neo-liberal era. Regulation & Governance 13(4): 520–539.
Stilgoe J (2018) Machine learning, social learning and the governance of self-driving cars. Social Studies of Science 48(1): 25–56.
Stilgoe J (2021) How can we know a self-driving car is safe? Ethics and Information Technology: 1–13.
Suchman L (2007) Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge: Cambridge University Press.
Tanczer LM, Steenmans I, Elsden M, et al. (2018) Emerging risks in the IoT ecosystem: Who's afraid of the big bad smart fridge?
Topping C, Dwyer A, Michalec O, et al. (2021) Beware suppliers bearing gifts!: Analysing coverage of supply chain cyber security in critical national infrastructure sectorial and cross-sectorial frameworks. Computers & Security 108: 102324.
Urquhart L and McAuley D (2018) Avoiding the internet of insecure industrial things. Computer Law and Security Review 34(3): 450–466.
In Wallis T and Johnson C (2020) Implementing the NIS directive, Living in the Internet of Things: Cybersecurity of the IoT-2018 driving cybersecurity improvements for essential services. in (pp. 1− 19). London: IET 2020 International Conference on Cyber Situational Awareness, Tatam M, Shanmugam B, Azam S, et al. (2021) A review of threat Data Analytics and Assessment (CyberSA), 1–10. doi:10.1109/ modelling approaches for APT-style attacks. Heliyon 7(1): e05969. CyberSA49311.2020.9139641. Thekkilakattil A and Dodig-Crnkovic G (2015) Ethics aspects of Wuyts K, Sion L and Joosen W (2020) LINDDUN GO: A lightweight embedded and cyber-physical systems. Proceedings - approach to privacy threat modeling. Proceedings - 5th IEEE International Computer Software and Applications Conference, European Symposium on Security and Privacy Workshops, 2: 39–44. doi:10.1109/COMPSAC.2015.41. Euro S and PW 2020, 302–309. doi:10.1109/EUROSPW51379. Thomas RJ, Gardiner J, Chothia T, et al. (2020) Catch me if you can: 2020.00047. an in-depth study of CVE discovery time and inconsistencies for Wynne B, Waterton C and Grove-White R (2007) Public percep- managing risks in critical infrastructures. CPSIOTSEC 2020 - tions and the nuclear industry in west Cumbria. Available at: Proceedings of the 2020 Joint Workshop on CPS and IoT http://inis.iaea.org/Search/search.aspx?orig_q=RN:34004547 Security and Privacy, 49–60. doi:10.1145/3411498.3419970. (Accessed: December 16, 2021). http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Big Data & Society SAGE

When the future meets the past: Can safety and cyber security coexist in modern critical infrastructures?

Big Data & Society, OnlineFirst, 28 June 2022

Publisher: SAGE
Copyright: © The Author(s) 2022
ISSN: 2053-9517
eISSN: 2053-9517
DOI: 10.1177/20539517221108369

Abstract

Big data technologies are entering the world of ageing computer systems running critical infrastructures. These innovations promise to afford rapid Internet connectivity, remote operations or predictive maintenance. As legacy critical infrastructures were traditionally disconnected from the Internet, the prospect of their modernisation necessitates an inquiry into cyber security and how it intersects with traditional engineering requirements like safety, reliability or resilience. Looking at how the adoption of big data technologies in critical infrastructures shapes understandings of risk management, we focus on a specific case study from the cyber security governance: the EU Network and Information Systems Security Directive. We argue that the implementation of the Network and Information Systems Security Directive is the first step in the integration of safety and security through novel risk management practices. Therefore, it is the move towards legitimising the modernisation of critical infrastructures. But we also show that security risk management practices cannot be directly transplanted from the safety realm, as cyber security is grounded in anticipation of the future adversarial behaviours rather than the history of equipment failure rates. Our analysis offers several postulates for the emerging research agenda on big data in complex engineering systems. Building on the conceptualisations of safety and security grounded in the materialist literature across Science and Technology Studies and Organisational Sociology, we call for a better understanding of the 'making of' technologies, standardisation processes and engineering knowledge in a quest to build safe and secure critical infrastructures.

Keywords: safety, cyber security, risk, critical infrastructure, materiality, expertise
The University of Bristol, Bristol, UK
Bristol Cyber Security Research Group, Bristol, UK

Corresponding author: Ola Michalec, The University of Bristol, Bristol, UK. Email: ola.michalec@bristol.ac.uk; aleks.michalec@gmail.com

Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/).

Introduction

What happens when new tools enter the old world? For decades, inaccessible legacy computing systems have been running critical infrastructures, like power plants, train stations or wastewater facilities. These so-called operational technologies (OTs) traditionally consisted of isolated computers controlling sensors and actuators, often using simple binary logic (e.g. a machine turning on/off depending on a sensed ambient temperature). The proponents of infrastructure modernisation argue that legacy systems are due an upgrade – after all, they have remained the same for decades (cf. Schiølin, 2020). Connecting critical infrastructures to the Internet and the world of big data would equip practitioners with the possibility of remote operations, predictive maintenance or real-time monitoring of industrial processes (Brass et al., 2018; Urquhart and McAuley, 2018). Although this paradigm shift offers interesting prospects, it also brings a novel concern, namely cyber security (Thomas et al., 2020). Moreover, the requirement for cyber security cannot be divorced from safety, as the consequences of cyber security attacks in critical infrastructure systems move into the material realm (Tanczer et al., 2018). Cyber security attacks on OTs can lead to explosions, collisions and blackouts. This necessitates novel risk management practices which are simultaneously attuned to security and safety.

Although the professional practice of cyber security risk management is novel in critical infrastructures, risk management in other domains has a long-standing tradition of evolving through controversies. What is construed as 'risky', 'secure' or 'safe' is a matter of debates, testing regimes and evolving standards – in other words, it is socially constructed (Marres and Stark, 2020; Stilgoe, 2021). The same could be said about how practitioners grapple with new computing technologies to arrive at judgements about a novel criterion of cyber security while preserving their traditional goal of safety. Although safety and cyber security concerns have different origins, the current direction in policy and practice is to integrate these two requirements through the harmonisation of regulatory frameworks, product standards and professional training (Kriaa et al., 2015).

The Science and Technology Studies (STS) and Organisational Sociology research on safety and risk in complex systems helps us understand to what extent security and safety risk management practices could be integrated. Looking at how the adoption of big data technologies in critical infrastructures shapes understandings of risk management, we focus on a specific case study from the UK cyber security governance. We argue that the implementation of the Network and Information Systems Directive (NIS) (European Commission, 2016) commenced the process of the integration between safety and security concerns across critical infrastructure providers. It has, therefore, legitimised the modernisation of OTs. By addressing the traditional engineering requirement of safety, the proponents of big data technologies managed to position the changes as acceptable to critical infrastructure practitioners. However, we also show that the risk management practices in big data-enabled critical infrastructures cannot be directly transplanted from the safety realm, as cyber security is grounded in anticipation of future adversarial behaviours rather than the history of equipment failures. Therefore, while the integration of safety and security is important for the delivery of reliable critical infrastructure services, it cannot be taken for granted. Precisely this clash in temporalities between legacy infrastructures and big data technologies creates a gap in conceptualisations and methodologies for risk management.

The remainder of this article will proceed as follows. First, we provide an overview of security, safety and OTs. Here, we synthesise the literature on theoretical 'turns' across security and safety, highlighting the material, cultural and political differences between information technologies and OTs. Next, we introduce the conceptual lens of safety, risk and security as social constructions (Pinch and Bijker, 1984; Barnes, 1993). In the section 'Case study: NIS implementation in the United Kingdom', we contextualise the article by familiarising the reader with the outline of the regulatory landscape. Following that, the section 'Research design' reports on the methods used and reflects on the opportunities and challenges of close collaborations between researchers and practitioners.

We present our argument in four parts. First, by establishing that safety-security integration was key for engineers accepting the modernisation agenda. Second, by outlining collective risk management practices that enabled diverse practitioners to collaborate. Third, by highlighting how practitioners borrowed elements from safety culture and incorporated them into security. Fourth, by cautioning that epistemic and material differences between the old world of legacy technologies and novel big data tools pose limits to the future of critical infrastructures modernisation.

Background

Safety versus security

Safety and security might seem synonymous; however, there are many technical, political and cultural differences which distinguish these two requirements. Broadly, in infrastructure research, safety is concerned with prevention, protection and recovery from unintentional accidents, while (cyber)security is interested in dealing with malicious and deliberate incidents (Piètre-Cambacédès and Chaudet, 2010). However, even this high-level distinction has been subject to multiple theoretical developments. Researchers identify four main paradigms in cyber security: (1) fixing and breaking technical objects; (2) erroneous use of computers; (3) malicious political actions by the means of digital tools; and (4) social construction of expertise around what is deemed worth protecting (Adams and Sasse, 1999; Dunn-Cavelty, 2018; Klimburg-Witjes and Wentland, 2021; Renaud et al., 2018). In parallel, a number of 'turns' have been recognised in safety research: from safety being the priority goal, trumping efficiency of processes; through accident prevention via designing-in safety into complex systems; to, finally, placing responsibility on the end-user or the operator (Elish, 2019; Norton, 2015). It is worth noting that these paradigms often co-exist over the same timescales, although they tend to reside in different professions and disciplines, without challenging each other's assumptions. Therefore, the premise of 'new tools entering the old world' – managing novel risks from big data in legacy critical infrastructures – provides a unique opportunity to re-consider the established ways of thinking about both safety and cyber security.

Security and safety are distinguished by their unique temporalities. A key feature specific to the cyber security field is that it rests on novelty. New threat actors, vulnerabilities and theoretical attacks come to light regularly, often at a pace faster than the creation of regulations (Matthew and Cheshire, 2016). Cyber security risks in critical infrastructures often originate from high-profile malicious activities of organised criminals or state actors, lending itself to the use of political rhetoric, high levels of secrecy and large budget spending as means to protect critical infrastructures from any presumed existential threats (Dunn-Cavelty, 2013). Consequently, security by automation, prediction and testing becomes especially challenging in such environments due to difficulties in access to data or trusted informants. Meanwhile, in legacy OT systems, safety risk has been traditionally understood probabilistically as a 'failure rate': the frequency with which an engineering component fails when tested, expressed in failures per unit of time. The failure rate figure is deeply rooted in the physical properties of the system and a wealth of historical data (Ani et al., 2016). The dynamic character of cyber security, contrasting with the static (or, at best, slowly moving) nature of safety, implies there are limits to the integration of traditional OT safety paradigms into the context of modern, interconnected critical infrastructures (Slayton and Clark-Ginsberg, 2018).

Operational technologies versus information technologies

Throughout the article, we distinguish between information technologies (computers commonly found in homes and offices) and OTs (computers operating engineering machinery) to understand how these technologies were historically constructed as separate and how they are now poised as integrating with each other. In this claim, we follow calls from Kinsley (2014) and Aradau (2010) to pay attention to materiality in computers and infrastructures. Objects of cyber security – sensors, buildings, code – are not passive technologies waiting to be filled with discourses. They are not characterised by 'essential' features separating them from humans, either (Fouad, 2021). Instead, materiality is critical for noticing how objects become 'agents' of social change through practices which are both discursive and material (Aradau, 2010).

Historically, the consequences of security incidents in IT were materially different from OT because these systems were traditionally built for different purposes. While IT professionals are typically concerned with damage to data, hence lost revenue, customer trust or reputation, OT practitioners are mainly concerned with human safety, equipment damage and the continuous supply of 'essential services'. Traditionally, OT systems were designed with physical resilience and safety in mind; cyber security was not a typical requirement due to the practice of 'air-gapping', that is, isolating OT computers from unsecured networks like the public Internet (Byres, 2013). IT systems, in contrast, are commonly interconnected, which necessitates security and privacy by design and regulation (Michalec et al., 2020). As both IT and OT systems are gaining Internet connectivity and real-time analytics functionalities, they are 'blending' into a single entity. And so are the previously separate concerns for cyber security and safety (Michalec et al., 2021). In short, contemporary 'big data' practices of OT and IT professionals are reconfiguring what critical infrastructures are made of.

The differences between OT and IT were historically not only material but also cultural, such as varying degrees of professionalisation (i.e. typical career routes and education required), or the juxtaposition of the safety culture of OT engineers and the innovation culture of IT workers (Guldenmund, 2000; Reece and Stahl, 2015; Thekkilakattil and Dodig-Crnkovic, 2015). Infrastructure providers running on OT systems are also organised very differently compared to IT companies – critical infrastructures are often hierarchical and governed through public-private partnerships, while IT companies range from start-ups to monopolies and are most often private sector entities (Dunn-Cavelty and Suter, 2009; Murray et al., 2017). These distinctions are important as they inform who gets to conceptualise risk, how they do it and why.

Theoretical framework

How do we know if machines are safe?

In order to frame this research paper, we used the conceptual lens of the social construction of safety, risk and security. This paradigm explains how technological expertise emerges, stabilises, gets contested or widely accepted (Barnes, 1993; Pinch and Bijker, 1984). Such research examines how different actors arrive at their assessments, rather than examining whether their assessments are true. Examining technological expertise involves an inquiry into situated practices, materials of day-to-day work and debates surrounding technoscientific developments (Collins, 2007; Pinch and Swedberg, 2008; Suchman, 2007). In that vein, we first build our argument by reviewing the literature on the social construction of safety and risk before moving to the analysis of our data, which focuses on the construction of security.

The long history of safety research in complex systems like aviation (Downer, 2010), nuclear engineering (Wynne, Waterton and Grove-White, 2007; Polleri, 2020; Perrow, 1984) or autonomous vehicles (Haugland, 2020; Stilgoe, 2021) thoroughly documents and analyses the evolution of testing regimes and assurance schemes to minimise risks, recover from incidents and anticipate a range of possible scenarios. How do experts establish whether a complex system is 'safe enough'?

Focusing on controversies, STS scholars have been tracing how debates evolve to establish complex, emerging or high-stakes technologies as 'safe' or 'risky'. For example, industrial manufacturers framed safety as a matter of feeding more data to proprietary machine learning algorithms in the case of autonomous vehicles (Stilgoe, 2018), or of raising the awareness of machine operators working with robots (Elish, 2019). Meanwhile, Norton (2015) showed how road safety was deprioritised over decades as the automotive industry grew in the United States. These examples illustrate that research on safety and risk cannot be limited to laboratory experiments and test beds, as categories like 'safety' and 'risk' are inherently riddled with political and organisational contingencies.

Other researchers contributed to the social construction of safety through understanding accidents. Downer (2010) brings attention to what he calls 'epistemic accidents' – man-made calamities resulting from fundamental limitations of engineering tests and models which, by design, are never perfect representations of 'real world' conditions. Such events happen if 'scientific or technological assumptions prove to be erroneous, even though there were reasonable and logical reasons to hold the assumptions before the event' (Downer, 2010: 752). Epistemic accidents offer valuable lessons for organisations and practitioners by revealing inherent shortcomings of the current engineering and design paradigms. By researching how experts work towards safety in complex engineering systems, we can better see the potential for epistemic failures and build in practices to learn from them.

Analysis of epistemic accidents matters as it shifts attention from an individual's error to interactions between operators, organisational cultures and politics, and machines. In doing so, STS scholars go beyond seeing safety accidents conventionally as failures of individuals and their erroneous use of complex systems (Pinch and Bijker, 1984; Stilgoe, 2018). Here, the notions of error and safety are intimately connected to (the limits of) the 'knowability' of complex systems (Downer, 2010; Spinardi, 2019). They feed into safety testing and modelling and, therefore, everyday decisions about risk (Marres and Stark, 2020).

Finally, while expertise from the safety world cannot be directly transposed to the security context, there is an overlap between these fields. Politically, both the safety and security of infrastructures are prioritised by governments as they fundamentally relate to the 'normal' functioning of society (Agrafiotis et al., 2018; Shove and Trentmann, 2018). However, as certain security incidents and safety accidents cannot be prevented in complex systems (Perrow, 1984), critical infrastructure operators often emphasise resilience and risk management. In practice, this means that the same teams could be made responsible for both safety and security. Even though cyber security incidents and safety accidents require separate root cause analyses, they might manifest in the same consequences (Agrafiotis et al., 2018; Kriaa et al., 2015), and thus share commonalities in terms of risk management practices.

Risk management: between calculation and anticipation

Understanding risk management in critical infrastructures is a multifaceted issue of both a qualitative and quantitative nature (Shreeve et al., 2020). Despite the rise of rule-based and probabilistic risk methodologies, for example, attack trees and attribute-based algorithms (Tatam et al., 2021), security risk is 'incalculable' since there are limits to what could be inferred from scientific data (Amoore, 2014: 424). Risk methodologies are 'already political' as they involve combinatorial possibilities whose arrangement has effects on risk scores and associated countermeasures (Amoore, 2014: 423). In the case of OT cyber security, this means that we cannot simply assume that the risk rises proportionately to the number of Internet-connected devices. In practice, assigning risk scores in a given organisation depends on asset criticality (i.e. how important devices and datasets are), the motivations of potential attackers and the available budget, just to name a few (Cherdantseva et al., 2016). Moreover, risk decision makers need to account for issues which are not specific to cyber security, for example, public responsibility for the delivery of reliable essential services, business models, insurance, risk appetite and reputation (Henrie, 2015; Nurse et al., 2017; Pieters and Coles-Kemp, 2011). While recent research offers reviews of risk assessment frameworks (Kriaa et al., 2015; Cherdantseva et al., 2016), it leaves a gap in understanding to what extent these frameworks are applied in real-world contexts.

Previous risk studies embedded in the critical infrastructure context highlighted that risk assessments are collaborative processes, rather than a matter of following a formal methodology (Frey et al., 2019; Shreeve et al., 2020). In doing so, they challenge the trope of 'security expertise' being solely a technical and individual matter. This shows security expertise as inherently emergent, contextual and subjective. For example, security practitioners might use a variety of reasoning strategies, such as 'risk first' (following a governmental risk assessment framework) or 'opportunity first' (identification of investment opportunities before considering risks). Moreover, in practice, people exercise both kinds of reasoning, with vulnerabilities (i.e. weaknesses in computer systems) thought about most commonly, and assets (i.e. equipment, documents, employees) least often, leading to an over-reliance on vulnerabilities-centred threat assessments (Shreeve et al., 2020). Moreover, risk thinking is 'front-loaded', as practitioners tend to think about risk at the beginning of the decision-making process, rather than systematically throughout the lifecycle of OT systems (Shreeve et al., 2020). Meanwhile, as threats in cybersecurity evolve over time, risk management ought to be iterative and regularly updated (Ani et al., 2016; Frey et al., 2019).

Risk management in the context of security often draws from a practice called threat modelling to anticipate likely attackers, incident pathways, possible consequences of attacks and the best ways to respond to them. The techniques under the umbrella of threat modelling vary: from qualitative expert workshops (Wuyts et al., 2020), through mathematical models based on probabilities (Markov chains, game theory), to graphical representations (in the forms of tables, data flow diagrams and attack trees), with some threat modelling techniques promising full automation and quantification of risks (Tatam et al., 2021).

In practice, when it comes to classifying potential impacts and evaluating attackers' motivations, threat modelling relies on qualitative expert judgement, usually from a small group of domain specialists. However, as cyber security 'spills out' beyond simply protecting computers, there is a call for broadening the scope of threat modelling. Critical social scientists have argued for anticipating the risks of emerging technologies by including non-experts (Slupska et al., 2021), understanding security in tandem with privacy and surveillance (Kazansky, 2021; Wuyts et al., 2020), and approaching non-human actors (code, hardware, algorithms) as active co-creators of geopolitics (Dwyer, 2021; Fouad, 2021). The strength of such a 'critical threat modelling' approach would then lie in the capacity to imagine and anticipate a wide range of outcomes and curate a space for explicitly normative discussions about living with digital technologies. Including actors outside of the cyber security profession allows the multiplicity of futures to become visible, as there is no single objective and optimal choice between security, privacy, risk appetite, resources available, reputation, innovativeness and many other factors.

Case study: NIS implementation in the United Kingdom

To address how critical infrastructure practitioners conceptualise and practice security risk management, we use the case study of the NIS, as implemented in the United Kingdom (DCMS, 2018). The NIS implementation practices reveal how practitioners from diverse sectors grapple with the modernisation of legacy OT systems. Their NIS compliance practices are balancing acts to build interconnected and secure infrastructures without compromising on traditional engineering goals like safety or the reliability of essential services like water, energy or transport.

NIS originated as a high-level supranational directive ratified by the European Parliament in 2016. Since then, it has been transposed to the EU Member States and the United Kingdom as the NIS Regulations (DCMS, 2018). This move meant that while high-level objectives and international cooperation mechanisms were set by the EU, the scope of what is regulated as well as implementation mechanisms are decided by each state and sector individually. In the United Kingdom, the implementation of NIS follows the principles of 'appropriateness and proportionality' (Michels and Walden, 2018), which necessitates careful deliberation over the designation of the operators falling under the purview of the regulations, thresholds for incident reporting and maximum penalties. NIS is known as 'principles-based regulation', meaning that critical infrastructure operators work towards meeting governmental objectives without a specification of how to achieve such goals (Michels and Walden, 2018). The government's reasoning behind this move is to avoid a 'box ticking' style of compliance and to contextualise risk management. In the eyes of the UK's National Cyber Security Centre, 'this encourages innovation and expands the breadth of technologies we can assure' (NCSC, 2021).

Risk assessment is embedded in NIS implementation from the beginning. The implementation procedures in the United Kingdom begin with a self-assessment stage (known as the Cyber Assessment Framework; NCSC, 2019). The Cyber Assessment Framework is the key operational document pertaining to the question of cyber security risk management of critical infrastructures in the United Kingdom. Fourteen principles of the Cyber Assessment Framework are set out as so-called 'Indicators of Good Practice' (NCSC, 2019), or recommended outcomes of security improvements, rather than specifications of how to improve cyber security. For the purpose of self-assessment, each of the 14 outcomes is self-assessed in diverse teams comprising both OT and IT practitioners according to a three-grade scale as either 'fully achieved', 'partially achieved' or 'not achieved'. Following the completion of self-assessments, operators and regulators draw up agreements on improvement plans and conduct external audits (Shukla et al., 2019; Wallis and Johnson, 2020). Since the successful implementation of cyber security regulations requires collaboration across IT and OT teams, it makes the cross-cutting issues of safety and security visible (Michalec et al., 2021).

Research design

We conducted a qualitative study of experts managing big data risks to critical infrastructures. Between November 2019 and January 2020, we interviewed 30 practitioners and observed two industry events focused on the implementation of the NIS Regulations. Our interviewees ranged from critical infrastructure operators, regulators, consultants and lawyers to OT equipment manufacturers. We aimed to cover a range of sectors (e.g. energy, water, transport) and roles (e.g. technical, managerial, consultancy, regulatory). We conducted semi-structured interviews focusing on historical perspectives on the development of OTs, participants' outlooks on the future of modernisation, interpretations of the Regulations and the issues around communicating security risk across professional boundaries. Questions were tailored to each participant in order to account for differences in sectors and professions. Interviews took place either at the participant's organisation, at our institution or via online calls, with the lead author conducting all interviews. All conversations were recorded with the interviewees' consent. No reimbursement was given for participation. Our analysis is complemented by an in-depth reading of the Cyber Assessment Framework (NCSC, 2019), a UK-specific document outlining what the outcomes of 'good' security risk management look like.

Our approach responds to the calls by de Goede (2020) for increased engagement between empirical research on expert practices and critique. By treating the implementation of cyber security regulations as 'situated practices', we bring our attention to the notion of expertise construction and de-centre policy discourses or legal analysis. By following practitioners and practices, we were able to gain the trust of our informants and appreciate the diversity of their expertise, their disagreements and the material artefacts they work with. As a result, long after the data collection period finished, the first author of the paper is still collaborating with practitioners, publishing government guidance and giving regular industry talks. The downside of research approaches relying on in-depth engagement with practitioners is the possibility of losing 'critical distance' and getting 'co-opted' by practitioners' agendas (de Goede, 2020). This is especially challenging when working with practitioners whose goals are both normative and open to interpretation, like security and safety. In our case, we navigated that tension by highlighting the plurality and contingency of expertise, rather than promoting a single vision. In terms of further research avenues, this research agenda would benefit from an in-depth investigation of a single quantitative threat modelling methodology (e.g. Markov chains): following the datasets, the construction of algorithms, modellers' assumptions and how the results of risk assessments are translated (or not) into organisational decisions.

Towards modernisation of critical infrastructures

Our research shows that the introduction of security regulations into the world of legacy safety-critical systems prompted the harmonisation of these two requirements. In turn, this move legitimised the modernisation of legacy OT environments. However, due to fundamental differences between managing safety and security risks, the modernisation of critical infrastructures cannot be taken for granted. What makes risk management across security and safety successful?

… critical infrastructures. This agrees with the overarching justification behind the UK National Cyber Strategy, which claims that the ongoing and rapid expansion of digital connectivity is a main driver behind cyber security regulations (Cabinet Office, 2022: 29). Effectively, NIS is the first step to legitimise the modernisation of critical infrastructures:

'there are some instances where the best answer would be to innovate legacy. NIS has not ever come up with a recommendation that there should be greater digitisation, but what it did say is: "There are certain expectations, particularly around configuration and software management where it was very hard to deal with a legacy." So, some people found themselves caught in a business case between a technology refresh which, frankly, was overdue anyway, or retaining legacy systems for reliability reasons with negative cyber security implications.' (Interview with energy sector working group lead)

However, while bringing in the diversity of expertise allowed practitioners to advance and integrate risk management practices, there are fundamental differences between safety and security. Therefore, the future of big data in critical infrastructures is still uncertain. In what follows, we will examine the practices which enabled that integration, as well as highlight the epistemic and material differences between these two requirements.

Hiveminds and other collaborations

Diverse expertise

At first, engineers exhibited resistance to the modernisation agenda: 'our sector is adopting Industrial Internet of
First, it is contingent on the access to diverse Things at a frightening rate, and we’ll have little idea as to expertise within an organisation, and how effectively what it looks like and how to secure it’ (interview with recommendations are communicated to those in charge of water sector operator 1). The dominating mood was cynicism decision making, who are usually senior managers about big data technologies being introduced to increase man- without the expertise in security: ‘So the security engineers ufacturers’ profits: ‘people see opportunities to deliver a new might be quite grumpy because the manager just does not shiny box, a new system, a new bit of software, a new service. understand their problems. But the engineers also do not So, that is really, really driving and almost pushing along understand there is a bigger picture going on here, e.g., innovation in the market’ (interview with OT security con- that a power station needs to provide an ongoing supply sultant 1). However, a pivotal moment occurred when of electricity’ (Interview with engineering consultant 2). safety and security professionals started working together Second, cyber security risk assessment requires diverse with the regulators to identify how their requirements map inputs – apart from traditional technical experts, human onto the Cyber Assessment Framework and create a factors practitioners are needed to anticipate how workers common benchmark for the whole sector: ‘it is like an could be employing workarounds against security mea- exam board where you get together and make sure all the sures, so that they could improve the usability of security markers are assessing against the same criteria. We often practices. 
In the words of our interviewee, a water regulator, went back to the regulator pointing out where NIS did not ‘You can create more risks by going overboard with too make sense in our context of OT technologies’ (Interview stringent and annoying security measures where people with water operator 2). try and find work arounds. Water plant operator working Precisely, that bringing together of diverse experts enabled with time-critical systems cannot afford 30 seconds delay safety-security integration. Regulators, by listening to the con- if they typed their password incorrectly’ (Interview with cerns from safety engineers, adjusted the Cyber Assessment water regulator no 3). As such, NIS does not only regulate Framework guidance, to facilitate digital connectivity in technologies, but also how people use them. Michalec et al. 7 In the eyes of participants, these hiveminds, usually Trust in collaborations expressed as semi-formal working groups, are better suited Security risk assessment is also contingent on trust. In particu- for sharing expertise than regulations. In other words, risk lar, it is trust between IT workers and OT workers filling the management practices preferred by the participants are rela- Cyber Assessment Framework: ‘when Idomyriskassessment tional and collaborative, rather than top-down and individua- of systems we rely on, I’ve got to assume that the guy doing the lised. As our interviewee put it, ‘working groups could be IT bit has got his IT correctly’ (interview with OT water oper- tasked with a creation of sector-specific process standards ator 1). Furthermore, successful risk management happens if which would be a collective endeavour rather than an indi- security experts manage to establish a trusting relationship vidual activity of ‘box ticking’ (interview with water operator with the board members and gain ‘buy-in from senior manage- 1). 
An example from an energy sector working group shows ment to invest in cyber’ (Interview with energy regulator 2). that collaborating on risk assessment was easier before regu- Ultimately, senior managers are the budget holders and lations came into force: ought to see how security improvements translate into organ- isational goals, be it by providing reliable energy supply, or ‘In 2013, we did a UK-wide risk assessment. We anon- ensuring workplace safety on a train station. While partici- ymised responses from individual companies, we aggre- pants acknowledged that connecting security practitioners to gated it and so we could come up with two things. First, board level executives has traditionally been a challenge, where collective gaps and difficulties, so that we could they are now gaining techniques for better engagement: ‘I request help from the government departments. Second, would stop talking about the threats, the executives know we found out there was a difference between the best prac- about the threats. Instead, say how we are looking after the tice in some and those who were struggling and there we business and its core critical functions’ (interview with a introduced knowledge sharing opportunities. We then vendor of security products). Ultimately, cyber security risk allowed the good ones to present their approaches and management is seen in the context of broader risk manage- the others could learn so we got a best practice learning ment, where practitioners across diverse teams are encouraged environment’ to reflect: ‘How much risk can we tolerate as an organisation? How much do we value our reputation? What is our attitude (Interview with energy sector working group lead). towards legislation and regulation? It is all interconnected’ The implementation of the Cyber Assessment (Interview with an IT security consultant). 
Indeed, in this Framework reveals three crucial aspects pertaining to the case, security is a matter of care (Kocksch et al., 2018) social construction of risk management: professional prac- where security budgets are considered as a matter of long-term tices as objects of regulations, cyber security mapped to maintenance of whole organisations rather than cutting-edge broader organisational goals, and practitioners collaborat- technological ‘solutions’. ing to create ‘risk thinking hiveminds’ that capture risk management practices across their sectors. Just like safety regulations in critical infrastructures (Downer, 2010), NIS Building a ‘risk thinking’ hivemind regulates trust in professional practices, rather than tech- One of the pressing questions for the critical infrastructure nologies. Cyber security has been placed in the broader practitioners is how NIS could avoid being a tick-box exer- organisational context of safety, usability or reliability. cise. The UK Government designed the Cyber Assessment Finally, faced with the novelty of cyber security regulations Framework as an outcomes-based document to ‘discourage in the legacy environments, practitioners collaborated to compliance thinking’ (NCSC, 2019). However, by provid- manage the overlapping risks of safety and security. ing a set of ‘good outcomes’ rather than policies on how to Lacking prescriptive guidance, they created a ‘risk-thinking achieve them, the Cyber Assessment Framework received hivemind’ to collectively work towards their goals. criticisms for ‘leaving everything up for negotiation’ (inter- view with energy regulator 2). On the one hand, outcomes- Towards harmonisation of safety and based regulations are suitable for dynamic contexts, like security cyber security, where new risks emerge regularly and there are multiple ways to ‘do the right thing’. 
On the Let us now turn to how cyber security integrated practices other hand, outcomes-based regulations rely on a baseline from safety engineering in their work to blend the level of expertise where practitioners can exercise expert ‘digital’ and ‘engineering worlds’. judgement on risk: ‘we want people use the Cyber Assessment Framework as a sanity check rather than a pro- Threats and incidents reporting cedure to follow to the letter to protect their own reputa- tion’ (interview with energy regulator 1). And so, In the event of a cyber security incident, operators will have practitioners called for raising the level of expertise across to report it to the regulator and evidence that they took the whole sector, what we call a ‘risk thinking hivemind’. ‘appropriate and proportionate’ measures to mitigate risks 8 Big Data & Society in order to avoid a penalty (NCSC, 2019). However, there is one person owns it, and you look at all the independences no obligation to report ongoing threats, that is, prospective and as you get further away from the core’ (interview malicious activities and actors that are yet to hit a computer with engineering consultant). In particular, it is the inter- network. The above caveats resulted in the ongoing debates national nature of internet services (e.g. cloud providers), on defining reporting thresholds for incidents and even dis- which highlights the difficulty with drawing a clear bound- tinguishing between a threat and an incident (DCMS, ary around cyber security risks (and, indeed, the scope of 2021). The dilemma lies in the fluid nature of the above NIS itself!): ‘a whole chunk of security is now outsourced terms. On the one hand, encouraging reporting of the to the Cloud provider overseas, so critical infrastructure ongoing threats improves the collective intelligence, the operators lose control over it’ (IT security vendor). aforementioned ‘risk thinking hivemind’. 
On the other, if Yet again, well-established practices from safety engin- a threat reported by one organisation turns into an incident eering could come to rescue, with maintenance contract in another, both organisations may be receiving fines. This between third-party suppliers and operators recommended lateral way malware propagates is a well-known phenom- as ways to uphold good standards of security over time: enon in interconnected complex environments (Dwyer, ‘long term improvement is a matter of maintenance con- 2018) but historically it was not a concern in disconnected tracts. So that is important also, is that if you are buying critical infrastructures. As a result, these contingencies of an expensive piece of equipment you want to have it sup- threat reporting pose a risk that operators will minimise ported for a long time, otherwise you do not have a business their reporting all together. The evidence from the critical case to use that supplier’ (interview with a rail engineer). infrastructure security regulations in the United States While borrowing professional practices from the safety shows that fear of fines created a counterproductive envir- culture might help engineers with understanding of cyber onment for information sharing (Clark-Ginsberg and security, the complexities around global supply chains Slayton, 2019). and the scope of NIS remain. In an example from one of In order to encourage operators to report on the develop- the critical infrastructure sectors (Wallis and Johnson, ing threats, water regulators broadened the reporting scope 2020), for data centres located outside of the United so that all security incidents and safety accidents, however Kingdom, the NIS regulators cannot oversee their security minor, had to be reported under the same umbrella . This measures. 
However, critical infrastructure operators are also led to discussions among practitioners to report ‘near still legally obliged to arrive at bilateral contracts with misses,’ threats which did not have a significant impact data centre providers to meet the requirements of NIS. on their network , showing that thresholds of harmful The requirement for security remains but less so the events are a subject to ongoing debates. This move clarity about who validates the process (Wallis and signals that both incidents and accidents are bound to Johnson, 2020). happen and reporting of the ongoing threats (even if not To conclude, by borrowing established practices and yet materialised as security incidents or safety accidents) terms from the safety culture context, NIS practitioners will not be stigmatised. were able to make cyber security more familiar to critical However, this practice is not uniform across all critical infrastructure engineers. Encouraging broader incident infrastructure sectors. Right now, energy regulators do not reporting and establishing maintenance contracts opened have the same level of insight. In order to allow further inte- new discussions highlighting the complexity of cyber gration of security and safety, regulators advocated for security in interconnected, big data environments like improved capabilities to observe the dynamic nature of cloud. threat actors and typical attacks: ‘it would be of a real inter- est to us, but currently this is a voluntary procedure’ Dissonant harmonies: The limits to (Energy regulator 1). 
Although 2020 saw numerous integration of safety and security attempts of security breaches attempts, none of them were Despite the opportunities of safety-security harmonisation reported to NIS regulators as they did not lead to the loss as expressed through professional practices, this section of supply or power outages; such lenient reporting criteria argues that there are fundamental epistemic and material also raise suspicions in the national news, which questions differences between legacy OT environments and big data whether NIS’ reporting criteria in the energy sector is fit for practices. purpose (Martin, 2021). Maintenance contracts Prescriptive thinking Deciding on the ownership of cyber security risks proved First, let us return to collective risk assessments we identi- very challenging: ‘when you start looking at the scope of fied earlier in the analysis. The creation of ‘risk thinking NIS, which is one of the first things you do, you ask yourself, hiveminds’ which consolidate security knowledge across what do you really depend on? Very complicated, and no the sector could be complicated by the tendency to work Michalec et al. 9 be asking: was it safety incident, security, or something in a prescriptive manger common in safety engineering. else? What did we learn from it? It all needs to be put One of the water regulators appeals: into the pot. It is like you are telling people, “Something bad has happened”. They need to know: “Well, actually, ‘Getting companies used to doing risk assessment rather what can I do about it?.” And I think there needs to be than compliance is key. Our safety framework was com- more done about turning those incidents into lessons for pletely prescriptive: a list of measures that you must have best practice’ (interview with engineering consultant). on water plants –, e.g., you must have this kind of lock fitted in this kind of way by one of these companies. 
A parallel gap resides in the practice of threat modelling Which companies love because they can cost it up and go in critical infrastructures. The lack of historical data on along to Ofwat and say, ‘We need exactly this much security attacks in OT environments poses challenges to money to do this much work over this number of years.’ the modelling of future threats: ‘when you’re looking for (Interview with water regulator 1) a record of past incidents to take to your senior manage- ment and all you can show them is a brief declassified docu- Risk thinking at the intersection of safety and security ment with barely any information, they can say, “Well is would necessitate encompassing novel big data practices, that all you have got? If there are hardly any incidents, that is, back-ups for real-time environments, asset inventor- maybe we should not be spending more money and ies for equipment operating automated processes, anomaly effort?” (Interview with engineering consultant 3). detection on segregated computer networks (NCSC, 2021). Why is it so difficult to obtain data? The access to infor- These practices are not familiar to safety engineers who typ- mation on threat actors and past incidents is highly limited ically work with legacy systems where computer networks due to the sensitivity of this topic. Complex procedures were not traditionally monitored, backed-up or segregated. around data classification, information exchanges and Moreover, no single risk management framework covers all even day-to-day interactions give rise to secrecy as a dom- recommended risk management practices with ‘various inating practice in social interactions. For example, some of countries having their own standards. 
So, it is horses for our participants were unable to have their cameras on courses and some of the best solutions I have seen are during the interviews due to their work incorporating both basically taking a blend of several of the standards’ (inter- offensive and defensive security (i.e. they were simultan- view with OT security consultant). But, to what extent are eously hackers and defenders). Such restrictive norms engineers willing to let go of prescriptive thinking and, around communication raise a possibility of ‘epistemic instead, start blending various frameworks or anticipating accidents’ (or, rather ‘incidents’, if we are concerned with futures? A shift to ‘risk thinking’ culture would necessitate intentional and malicious nature of security attacks), a major change in the ‘epistemic culture’ in safety engineer- events highlighting the limits to established practices ing, to borrow a term from Knorr-Cetina (1999). Epistemic across engineering and computing (Downer, 2010). A culture refers to an established way of accessing, validating telling example would be a cyber security incident hitting and advancing knowledge in a given expert community an underprepared organisation that incorrectly extrapolated (Knorr-Cetina, 1999). Nonetheless, changing culture estab- the rarity of cyber security attacks based on scant declassi- lished over many decades is a mammoth task beyond the fied data. In such case, a cyber security attack would be a scope of a single regulatory initiative. consequence of poor communication and mistrust across critical infrastructure organisations. Secrecy restricts learning. Our second point relates to the secrecy challenges with accessing data required to differen- tiate between security and safety. Is an anomaly in the Logics of risk assessment system due to an error or a hacker? Did the blackout result from a storm or a cyber security attack? 
The final point of contention relates to the very logics of Earlier in our analysis, we examined integrated reporting risk assessment across safety and security. While the prob- of security and safety events to harmonise these two ability of safety failures is well grounded in historical requirements. However, integrated reporting of security records and components testing (Michalec et al., 2021), incidents and safety accidents yields limited lessons for security incidents in the OT space are a function of antici- the operators if they cannot learn what caused a harmful pating malicious behaviours and relying on sparse historical event. Currently, the lack of separate root cause analysis data, which does not lend itself easily to the logics of prob- limits further integration of safety and security paradigms abilistic prediction. Considering active adversarial actions in engineering: from highly skilled actors like organised criminals or state- sponsored hackers brings a contentious dimension to the ‘There is a lot of reporting that does not tend to think it is practice of risk management. In practice, it means that engi- cyber security but actually that could feed into the cyber neers will have to conduct an explicitly political and norma- risk picture that we need to bring into the mix, we should tive analysis and they are not necessarily ready to 10 Big Data & Society acknowledge this: “if state sponsored hackers bring a We argue that the introduction of security regulations power station down, then we have to react. But that is dif- into the world of legacy safety-critical systems prompted ficult, because then you are definitely into politics. We are a harmonisation of these two requirements. Integration of non-political, non-government organisation, we only do safety and security was afforded thanks to collective risk what we can” (Interview with Incident Response Director). 
management practices: (1) conducting risk assessment in In order to escape being drawn into politics, industry diverse team; (2) mapping cyber security onto organisa- actors propose machine learning as a data-driven, objective tional goals with senior stakeholders and (3) practitioners means of risk assessment (Dragos, 2019). However, other collaborating to create ‘risk thinking hiveminds’ capturing than any risk analysis being far from objective due to afore- good practices across their sectors. Next, we also show mentioned ‘incalculability of risk’ (Amoore, 2014), the prac- that the implementation of NIS created opportunities to tice of automated anomaly detection using machine learning borrow established terms and practices from safety engin- in particular is seen as contentious due to the low diversity of eering and incorporate them into security procedures. In modelling data used to train machine learning algorithms: doing so, NIS serves as a vehicle that enables incorporating ‘There’s not enough randomness in the datasets themselves cyber security in the existing engineering professions, to say the type of algorithms they use were going to have organisational structures, and maintenance contracts with perfect detection rates’ (interview with energy regulator 3). third party suppliers. On the other hand, however, there Consequently, practitioners are ‘not afraid to lose their are major epistemic and material differences between jobs’ as ‘although the networks are evolving and there is safety and security domains, such as prescriptive attitudes more information, we will always have a human operator to risk in safety engineering standards, or secrecy restricting checking the anomalies’ (interview with rail engineer). cyber security information sharing. 
Ultimately, the NIS Overall, despite the attempts to integrate safety with Regulations exposed a tension between two vastly different security, the paradox is that big data computing and logics of risk assessment across security and safety: future- legacy engineering environment belong to different and grounded and explicitly normative anticipation versus past- incompatible worlds. The dissonance is expressed in the based probabilistic prediction. following three forms: (1) epistemic culture: risk versus The implementation of NIS is the first step in the integra- prescription; (2) secrecy restricting collective learning; (3) tion of safety and cyber security; therefore, it is the move different logics of risk assessment. The logics of anticipa- towards legitimising the modernisation of critical infra- tion and connectivity favoured in the big data environments structures with big data. But we also show that the cyber do not fit easily into the prescriptive and siloed world of OT security risk management practices cannot be directly trans- engineering, leading to the situation in which the modern- planted from the safety realm, as cyber security is grounded isation of critical infrastructures will continue to pose chal- in anticipation of the future adversarial behaviours rather lenge and cannot be taken for granted. than the history of equipment failure rates. While the har- monisation of safety and security standards and organisa- tional practices is important for the delivery of reliable Concluding thoughts critical infrastructure services, this process cannot be Can new tools be useful, or work at all, for the world that has taken for granted and, consequently, we call for a better not been designed and built to accommodate them? 
Can crit- understanding of the making of technologies, standardisa- ical infrastructures, with their paramount concern about tion processes and engineering knowledge in a quest to safety, adjust to the new reality brought about by instant con- build safe and secure modern critical infrastructures. nectivity and big data? Can safety and security coexist? Despite epistemic accidents and incubation over a long Modernisation of legacy systems with big data technologies period of un-reliability and controversy, we learn a lot brings about the need to reconsider traditional paradigms in from the histories of safety and engineering paradigms. both engineering and computing in order to successfully inte- grate them. Tracing the attempts to harmonise diverse com- Acknowledgements puting and engineering requirements, we draw from the We thank our participants for sharing their experiences and invit- case study of NIS Regulations. NIS Regulations bring atten- ing us to their professional events. We would also like to thank Dr tion to the management of risks to critical infrastructures. John Downer, Lars Gjesvik as well as two anonymous reviewers While previous research on critical infrastructure risk man- for their insightful comments on early versions on the paper. Finally, we would like to extend our gratitude to the European agement accounts for the variety and sophistication of risk Cyber Security Seminar community, where we had a chance to assessment methods (Kriaa et al., 2015; Cherdantseva present a seminar based on this research project. et al., 2016) as well as the topical coverage of various frame- works and standards (Topping et al., 2021), we brought Declaration of conflicting interests attention to the social construction of risk, safety and secur- ity. In other words, what happens when traditional safety The authors declared no potential conflicts of interest with respect practices meet novel big data practices. 
Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Cyber Security Centre and by 'Power – Understanding disruptive powers of IoT in the energy sector', funded by the PETRAS National Centre of Excellence (part of UKRI, grant number EP/S035362/1).

ORCID iDs

Ola Michalec https://orcid.org/0000-0003-3807-0197
Sveta Milyaeva https://orcid.org/0000-0002-0156-5359
Awais Rashid https://orcid.org/0000-0002-0109-1341

Notes

1. Workshop with water suppliers, November 2019, Leeds.
2. Workshop for critical infrastructure operators, October 2019, London.
3. Ofwat is the economic regulator for the water sector in England and Wales, setting maximum investment budgets and water pricing.

References

Adams A and Sasse MA (1999) Users are not the enemy. Communications of the ACM 42(12): 40–46.
Agrafiotis I, Nurse JC, Goldsmith M, et al. (2018) A taxonomy of cyber-harms: Defining the impacts of cyber-attacks and understanding how they propagate. Journal of Cybersecurity 4(1): 1–15. doi:10.1093/CYBSEC/TYY006.
Amoore L (2014) Security and the incalculable. Security Dialogue 45(5): 423–439. doi:10.1177/0967010614539719.
Ani UPD, He H (Mary) and Tiwari A (2016) Review of cyber-security issues in industrial critical infrastructure: Manufacturing in perspective. Journal of Cyber Security Technology 1(1): 32–74. doi:10.1080/23742917.2016.1252211.
Aradau C (2010) Security that matters: Critical infrastructure and objects of protection. Security Dialogue 41(5): 491–514.
Barnes TJ (1993) Whatever happened to the philosophy of science? Environment and Planning A 25(3): 301–304.
Brass I, Tanczer M, Carr M, Blackstock J, et al. (2018) Standardising a moving target: The development and evolution of IoT security standards. IET Conference Publications 2018(CP740). doi:10.1049/CP.2018.0024.
Byres E (2013) The air gap: SCADA's enduring security myth: Attempting to use isolation as a security strategy for critical systems is unrealistic in an increasingly connected world. Communications of the ACM 56(8): 29–31.
Cabinet Office (2022) National cyber strategy. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1053023/national-cyber-strategy-amend.pdf (accessed 13 June 2022).
Cherdantseva Y, Burnap P, Blyth A, et al. (2016) A review of cyber security risk assessment methods for SCADA systems. Computers & Security 56: 1–27.
Clark-Ginsberg A and Slayton R (2019) Regulating risks within complex sociotechnical systems: Evidence from critical infrastructure cybersecurity standards. Science and Public Policy 46(3): 339–346.
Collins H (2007) The uses of sociology of science for scientists and educators. Science & Education 16: 217–230.
de Goede M (2020) Engagement all the way down. Critical Studies on Security 8(2): 101–115.
Department for Digital, Culture, Media and Sport – DCMS (2018) The NIS Regulations. Available at: https://www.gov.uk/government/collections/nis-directive-and-nis-regulations-2018
Department for Digital, Culture, Media and Sport – DCMS (2021) Government response to the call for views on amending the security of network and information systems regulations. Policy paper. Available at: https://www.gov.uk/government/publications/government-response-on-amending-the-nis-regulations/government-response-to-the-call-for-views-on-amending-the-security-of-network-and-information-systems-regulations
Downer J (2010) Trust and technology: The social foundations of aviation regulation. The British Journal of Sociology 61(1): 83–106.
Dragos (2019) Key considerations for selecting an industrial cybersecurity solution for asset identification, threat detection, and response. Report. Available at: https://www.dragos.com/wp-content/uploads/Key-Considerations-Industrial-Cybersecurity-Solution.pdf
Dunn Cavelty M (2013) From cyber-bombs to political fallout: Threat representations with an impact in the cyber-security discourse. International Studies Review 15(1): 105–122.
Dunn-Cavelty M (2018) Cybersecurity research meets science and technology studies. Politics and Governance 6(2): 22–30.
Dunn-Cavelty M and Suter M (2009) Public–private partnerships are no silver bullet: An expanded governance model for critical infrastructure protection. International Journal of Critical Infrastructure Protection 2(4): 179–187.
Dwyer AC (2018) The NHS cyber-attack: A look at the complex environmental conditions of WannaCry. RAD Magazine 44.
Dwyer AC (2021) Cybersecurity's grammars: A more-than-human geopolitics of computation. Area 00: 1–8. doi:10.1111/area.12728.
Elish MC (2019) Moral crumple zones: Cautionary tales in human-robot interaction. Engaging Science, Technology, and Society 6: 1–29.
European Commission (2016) NIS Directive. Available at: https://digital-strategy.ec.europa.eu/en/policies/nis-directive
Fouad NS (2021) The non-anthropocentric informational agents: Codes, software, and the logic of emergence in cybersecurity. Review of International Studies: 1–20.
Frey S, Rashid A, Anthonysamy P, et al. (2019) The good, the bad and the ugly: A study of security decisions in a cyber-physical systems game. IEEE Transactions on Software Engineering 45(5): 521–536. doi:10.1109/TSE.2017.2782813.
Guldenmund FW (2000) The nature of safety culture: A review of theory and research. Safety Science 34(1–3): 215–257.
Haugland BT (2020) Changing oil: Self-driving vehicles and the Norwegian state. Humanities and Social Sciences Communications 7(1): 1–10. doi:10.1057/s41599-020-00667-9.
Henrie M (2015) Cyber security risk management in the SCADA critical infrastructure environment. 25(2): 38–45. doi:10.1080/10429247.2013.11431973.
Kazansky B (2021) 'It depends on your threat model': The anticipatory dimensions of resistance to data-driven surveillance. Big Data & Society 8(1): 1–12. doi:10.1177/2053951720985557.
Kinsley S (2014) The matter of 'virtual' geographies. Progress in Human Geography 38(3): 364–384.
Klimburg-Witjes N and Wentland A (2021) Hacking humans? Social engineering and the construction of the "deficient user" in cybersecurity discourses. Science, Technology, & Human Values 46(6): 1316–1339.
Knorr-Cetina K (1999) Epistemic Cultures: How the Sciences Make Knowledge. Cambridge, MA: Harvard University Press.
Kocksch L, Korn M, Poller A, et al. (2018) Caring for IT security: Accountabilities, moralities, and oscillations in IT security practices. Proceedings of the ACM on Human-Computer Interaction 2(CSCW): 1–20. doi:10.1145/3274361.
Kriaa S, Pietre-Cambacedes L, Bouissou M, et al. (2015) A survey of approaches combining safety and security for industrial control systems. Reliability Engineering & System Safety 139: 156–178.
Nurse JRC, Creese S and de Roure D (2017) Security risk assessment in internet of things systems. IT Professional 19(5): 20–26. doi:10.1109/MITP.2017.3680959.
Perrow C (1984) Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.
Pieters W and Coles-Kemp L (2011) Reducing normative conflicts in information security. In: Proceedings of the 2011 New Security Paradigms Workshop (NSPW '11). doi:10.1145/2073276.
Piètre-Cambacédès L and Chaudet C (2010) The SEMA referential framework: Avoiding ambiguities in the terms "security" and "safety". International Journal of Critical Infrastructure Protection 3(2): 55–66.
Pinch TJ and Bijker WE (1984) The social construction of facts and artifacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science 14(3): 399–441.
Construction of Technological Systems: New Directions in Marres N and Stark D (2020) Put to the test: for a new sociology of the Sociology and History of Technology: Anniversary testing. The British Journal of Sociology 71(3): 423–443. Edition, 11–44. Available at: https://mitpress.mit.edu/books/ Martin A (2021) UK Cyber security law forcing energy companies social-construction-technological-systems-anniversary-edition to report hacks has led to no reports, despite numerous hacks. (Accessed: December 16, 2021). Sky News. Available at: https://news.sky.com/story/uk-cyber- Pinch T and Swedberg R (2008). Living in a Material World: security-law-forcing-energy-companies-to-report-hacks-has-led- Economic Sociology Meets Science and Technology Studies to-no-reports-despite-numerous-hacks-12254296 (Vol. 1). Cambridge, MA: The MIT Press. Matthew A and Cheshire C (2016) Trust and community in the Polleri M (2020) Post-political uncertainties: governing nuclear practice of network security. Preprint. Available at: https:// controversies in post-Fukushima Japan. Social Studies of papers.ssrn.com/sol3/papers.cfm?abstract_id=2756244 Science 50(4): 567–588. Michalec OA, Van Der Linden D, Milyaeva S, et al. (2020). Industry Reece RP and Stahl BC (2015) The professionalisation of informa- responses to the European directive on security of network and tion security: perspectives of UK practitioners. Computers & information systems (NIS): understanding policy implementation Security 48: 182–195. practices across critical infrastructures. In Sixteenth Symposium Renaud K, Flowerday S, Warkentin M., et al. (2018) Is the respon- on Usable Privacy and Security (SOUPS 2020) (pp. 301− sibilization of the cyber security risk reasonable and judicious? 3317). USENIX, Virtual. Computers & Security 78: 198–211. 
Michalec O, Milyaeva S and Rashid A (2021) Reconfiguring gov- Schiølin K (2020) Revolutionary dreams: future essentialism and ernance: how cyber security regulations are reconfiguring the sociotechnical imaginary of the fourth industrial revolution water governance. Regulation & Governance 1–18. in Denmark. Social Studies of Science 50(4): 542–566. Michels JD and Walden I (2018) How safe is safe enough? Shove E and Trentmann F (2018) Infrastructures in Practice: The Improving cybersecurity in Europe’s critical infrastructure Dynamics of Demand in Networked Societies. New York: under the NIS directive. Queen Mary School of Law Legal Routledge. Studies Research Paper No. 291/2018, Available at SSRN: Shreeve B, Hallett J, Edwards M, et al. (2020) ‘So if Mr Blue Head https://ssrn.com/abstract=3297470 here clicks the link…’ risk thinking in cyber security decision Murray G, Johnstone MN and Valli C (2017) The convergence of making. ACM Transactions on Privacy and Security (TOPS) IT and OT in critical infrastructure. The Proceedings of 15th 24(1): 1–29. doi:10.1145/3419101. Australian Information Security Management Conference. Shukla M, Johnson SD and Jones P (2019) Does the NIS imple- 5–6 December, 2017, Perth: Edith Cowan University, 149– mentation strategy effectively address cyber security risks in 155. doi:10.4225/75/5a84f7b595b4e. the UK? in 2019 International Conference on Cyber Security National Cyber Security Centre (2019) Cyber assessment frame- and Protection of Digital Services, Cyber Security 2019. work guidance. Available at: https://www.ncsc.gov.uk/ Institute of Electrical and Electronics Engineers Inc. doi:10. collection/caf 1109/CyberSecPODS.2019.8884963. National Cyber Security Centre (2021) Technology assurance. Slayton R and Clark-Ginsberg A (2018) Beyond regulatory Guidance. 
Available at: https://www.ncsc.gov.uk/collection/ capture: coproducing expertise for critical infrastructure pro- technology-assurance/future-technology-assurance/whitepaper- tection. Regulation & Governance, 12(1): 115–130. developing-a-new-approach-to-assurance Slupska J, Dawson Duckworth SD, Ma L, et al. (2021) Participatory Norton P (2015) Four paradigms: traffic safety in the twentieth- threat modelling: exploring paths to reconfigure cybersecurity. century United States. Technology and Culture.56(2), Conference on Human Factors in Computing Systems - SPECIAL ISSUE: (Auto)Mobility, Accidents, and Danger, 319 Proceedings [Preprint]. doi:10.1145/3411763.3451731. − 3334. Available at: https://www.jstor.org/stable/24468867? Spinardi G (2019) Performance-based design, expertise asym- seq=1#metadata_info_tab_contents (Accessed: December 16, metry, and professionalism: fire safety regulation in the neo- 2021). liberal era. Regulation & Governance 13(4): 520–539. Michalec et al. 13 Stilgoe J (2018) Machine learning, social learning and the gov- Topping C, Dwyer A, Michalec O, et al. (2021). Beware suppli- ernance of self-driving cars. Social Studies of Science ers bearing gifts!: analysing coverage of supply chain cyber 48(1): 25–56. security in critical national infrastructure sectorial and Stilgoe J (2021) How can we know a self-driving car is safe? cross-sectorial frameworks. Computers & Security,108, Ethics and Information Technology 2021: 1–13. 102324. Suchman L (2007) Human-machine Reconfigurations: Plans and Urquhart L and McAuley D (2018) Avoiding the internet of inse- Situated Actions. Cambridge, UK: Cambridge university press. cure industrial things. Computer Law and Security Review Tanczer LM, Steenmans I, Elsden M, et al. (2018 Emerging risks in 34(3): 450–466. the IoT ecosystem: who’s afraid of the big bad smart fridge?. 
In Wallis T and Johnson C (2020) Implementing the NIS directive, Living in the Internet of Things: Cybersecurity of the IoT-2018 driving cybersecurity improvements for essential services. in (pp. 1− 19). London: IET 2020 International Conference on Cyber Situational Awareness, Tatam M, Shanmugam B, Azam S, et al. (2021) A review of threat Data Analytics and Assessment (CyberSA), 1–10. doi:10.1109/ modelling approaches for APT-style attacks. Heliyon 7(1): e05969. CyberSA49311.2020.9139641. Thekkilakattil A and Dodig-Crnkovic G (2015) Ethics aspects of Wuyts K, Sion L and Joosen W (2020) LINDDUN GO: A lightweight embedded and cyber-physical systems. Proceedings - approach to privacy threat modeling. Proceedings - 5th IEEE International Computer Software and Applications Conference, European Symposium on Security and Privacy Workshops, 2: 39–44. doi:10.1109/COMPSAC.2015.41. Euro S and PW 2020, 302–309. doi:10.1109/EUROSPW51379. Thomas RJ, Gardiner J, Chothia T, et al. (2020) Catch me if you can: 2020.00047. an in-depth study of CVE discovery time and inconsistencies for Wynne B, Waterton C and Grove-White R (2007) Public percep- managing risks in critical infrastructures. CPSIOTSEC 2020 - tions and the nuclear industry in west Cumbria. Available at: Proceedings of the 2020 Joint Workshop on CPS and IoT http://inis.iaea.org/Search/search.aspx?orig_q=RN:34004547 Security and Privacy, 49–60. doi:10.1145/3411498.3419970. (Accessed: December 16, 2021).

Journal: Big Data & Society (SAGE)
Published: Jun 28, 2022
