Google DeepMind and healthcare in an age of algorithms

Julia Powles & Hal Hodson
Health Technol. (2017) 7:351-367. DOI 10.1007/s12553-017-0179-1
ORIGINAL PAPER
Received: 14 November 2016 / Accepted: 26 January 2017 / Published online: 16 March 2017
© The Author(s) 2017. This article is published with open access at Springerlink.com
This article is part of the Topical Collection on Privacy and Security of Medical Information.
This article was completed while Hal Hodson was a Technology Reporter, New Scientist.
Corresponding author: Julia Powles, jep50@cam.ac.uk
Affiliations: Faculty of Law and Computer Laboratory, University of Cambridge, Cambridge, UK; The Economist Newspaper, London, UK

Abstract: Data-driven tools and techniques, particularly machine learning methods that underpin artificial intelligence, offer promise in improving healthcare systems and services. One of the companies aspiring to pioneer these advances is DeepMind Technologies Limited, a wholly-owned subsidiary of the Google conglomerate, Alphabet Inc. In 2016, DeepMind announced its first major health project: a collaboration with the Royal Free London NHS Foundation Trust, to assist in the management of acute kidney injury. Initially received with great enthusiasm, the collaboration has suffered from a lack of clarity and openness, with issues of privacy and power emerging as potent challenges as the project has unfolded. Taking the DeepMind-Royal Free case study as its pivot, this article draws a number of lessons on the transfer of population-derived datasets to large private prospectors, identifying critical questions for policy-makers, industry and individuals as healthcare moves into an algorithmic age.

Keywords: Artificial intelligence · Clinical care · Consent · Data protection · Machine learning · Power · Privacy · Regulation

1 Introduction

A key trend in contemporary healthcare is the emergence of an ambitious new cadre of corporate entrants: digital technology companies. Google, Microsoft, IBM, Apple and others are all preparing, in their own ways, bids on the future of health and on various aspects of the global healthcare industry.

This article focuses on the Google conglomerate, Alphabet Inc., referred to as Google for convenience (since October 2015, DeepMind has been owned by Google's parent company, Alphabet Inc.). We examine the first healthcare deals of its British-based artificial intelligence subsidiary, DeepMind Technologies Limited, in the period between July 2015 and October 2016. In particular, the article assesses the first year of a deal between Google DeepMind and the Royal Free London NHS Foundation Trust, which involved the transfer of identifiable patient records across the entire Trust, without explicit consent, for the purpose of developing a clinical alert app for kidney injury. We identify inadequacies in the architecture of this deal, in its public communication, and in the processes of public sector oversight. We conclude that, from the perspective of patient autonomy, public value, and long-term competitive innovation, existing institutional and regulatory responses are insufficiently robust and agile to properly respond to the challenges presented by data politics and the rise of algorithmic tools in healthcare. (Given the time period considered, the EU's General Data Protection Regulation 2016/679, which enters into force in May 2018, is beyond the scope of this article.)

The article proceeds in three main sections. The next two sections document comprehensively how the DeepMind deals proceeded, drawing attention to the disclosures and omissions in how data handling was communicated, justified and, ultimately, scrutinized in public. Section 2 discusses the chronology, formal contractual basis, and stated clinical motivation underlying the Royal Free deal, highlighting the delayed revelation of the nature and scale of patient data involved. Section 3 explores DeepMind's broader ambitions in working with the NHS and the lack of ex ante discussions and authorizations with relevant regulators. It also elaborates on the problematic basis on which data was shared by Royal Free, namely, the assertion that DeepMind maintains a direct care relationship with every patient in the Trust. Section 4 then lays out the lessons that can be drawn from the case study as a whole, assesses at a high level the data protection and medical information governance issues, and then turns to transparency, data value, and market power.

2 A startup and a revelation

In July 2015, clinicians from British public hospitals within the Royal Free London NHS Foundation Trust approached Google DeepMind Technologies Limited, an artificial intelligence company with no experience in providing healthcare services, about developing software using patient data from the Trust [1]. Four months later, on 18 November 2015 [2], sensitive medical data on millions [3] of Royal Free's patients started flowing into third-party servers contracted by Google to process data on behalf of DeepMind [4].

Royal Free is one of the largest healthcare providers in Britain's publicly funded National Health Service (NHS). The NHS offers healthcare that is free at the point of service, paid for through taxes and national insurance contributions. Beloved in the UK, the NHS is a key part of the national identity.

DeepMind publicly announced its work with Royal Free on 24 February 2016 [5]. No mention was made of the volume or kind of data included in the transfer—millions of identifiable personal medical records. DeepMind said it was building a smartphone app, called 'Streams', to help clinicians manage acute kidney injury (AKI). AKI has outcomes ranging from minor kidney dysfunction through to dialysis, transplant, and even death, and is linked to 40,000 deaths a year in the UK [6, 7]. The app, DeepMind claimed, would not apply any of the machine learning or artificial intelligence techniques (effectively, statistical models built using powerful computing resources over large corpora of granular, personalized data [8]) for which it is renowned, and would act as a mere interface to patient medical data controlled by Royal Free [9]. Why DeepMind, an artificial intelligence company wholly owned by data mining and advertising giant Google, was a good choice to build an app that functions primarily as a data-integrating user interface, has never been adequately explained by either DeepMind or Royal Free.

2.1 Contractual foundations vs public relations

Throughout the whole first phase of the deal, through to October 2016, DeepMind's publicly announced purposes for holding sensitive data on Royal Free's patients, i.e. the management and direct care of AKI, were narrower than the purposes that contractually constrained its use of the data. These constraints were described in an eight-page information sharing agreement (ISA) between Google UK Limited and Royal Free, signed on 29 September 2015 [4]. The Google-Royal Free ISA stated that, in addition to developing tools for 'Patient Safety Alerts for AKI' (presumably via the application now badged as Streams), Google, through DeepMind, could also build "real time clinical analytics, detection, diagnosis and decision support to support treatment and avert clinical deterioration across a range of diagnoses and organ systems" [10]. Further, it stated that the data provided by Royal Free was envisaged for use in the creation of a service termed 'Patient Rescue', "a proof of concept technology platform that enables analytics as a service for NHS Hospital Trusts".

This was the entirety of the language in the ISA specifying the purposes for data sharing between Royal Free and Google over a two-year period ending 29 September 2017. (The ISA was superseded, prematurely, by a new set of agreements signed on 10 November 2016. Those agreements are beyond the scope of the present article and will be considered in future work.) At least contractually, the original ISA seemed to permit DeepMind to build systems to target any illness or part of the body. Further, the ISA contained no language constraining the use of artificial intelligence (AI) technologies on the data, meaning that DeepMind's assurance that "for the moment there's no AI or machine learning" was, and remains, rather less convincing than "but we don't rule it out for the future" [9]. In mid-2016, the app's online FAQs reiterated the same sentiment, adding that if artificial intelligence techniques are applied to the data in the future, this would be announced on the company's website, and indicating that the company will seek regulatory approval under research authorization processes [11].

Another subject unaddressed in the ISA was the Google question: i.e. how data shared under the scheme would be cabined from other identifiable data stored by Google, given that Google was the signing party to the contract and that the company's business model depends on monetizing personal data. DeepMind has made regular public assurances that Royal Free data "will never be linked or associated with Google accounts, products or services" [9, 12]. Problematically, these assurances appear to have been given little to no legal foundation in Google and DeepMind's dealings with Royal Free, even if there is no reason to disbelieve the sincerity of their intent [13]. (There is no prohibition on linkage in the ISA. DeepMind's own internal privacy impact assessment, discussed in section 3.3 below, states that no new linkages of data will be made, but this document has no legal force, and given its other shortcomings—i.e. that it does not deal with the bulk of the data transfer, nor the bulk of the individuals affected—we do not consider it adequate.) The reality is that the exact nature and extent of Google's interests in NHS patient data remain ambiguous.

2.2 Data, direct care and consent

It is important to note that, though the ISA provided Google with a broad set of purposes contractually, it did not displace various other legal, regulatory and ethical restrictions. A pertinent restriction is that medical information governance in the UK is geared around obtaining explicit consent from each patient whose identifiable data is passed to a third-party, when that third-party is not in a direct care relationship with the patient in question. Direct care is defined as "an activity concerned with the prevention, investigation and treatment of illness and the alleviation of suffering of an identified individual" [14].

The data that DeepMind processed under the Royal Free project was transferred to it without obtaining explicit consent from—or even giving any notice to—any of the patients in the dataset. For patients who had the necessary precursor renal blood test and were then progressed to being monitored by clinicians for AKI, the appropriate direct care relationship would exist to justify this data processing, through the vehicle of implied consent (see the discussion of consent in sections 4.2-4.3 below). However, the dataset transferred to DeepMind extended much more broadly than this. In fact, it included every patient admission, discharge and transfer within constituent hospitals of Royal Free over a more than five-year period (dating back to 2010). For all the people in the dataset who are never monitored for AKI, or who have visited the hospital in the past, ended their episode of care and not returned, consent (explicit or implied) and notice were lacking. This is an issue to which we will return, given the centrality of these requirements to patient privacy and meaningful agency.

On 29 April 2016, the extent of data held by DeepMind was revealed following a New Scientist investigation [15]. Google, which acted as the media filter for its subsidiary until at least October 2016, issued a swift public relations response. In all of its communications, Google insisted that it would not be using the full scope of the ISA it had signed [15], emphasizing that DeepMind was only developing an app for monitoring kidney disease [16]. This was despite the clear statements in the ISA quoted above, i.e. that information was also being shared for the development of real time analysis and alert systems, potentially as part of a broadly-defined 'analytics as a service' platform. On 4 May 2016, Royal Free issued a statement in line with Google's position [17].

The data package described in the ISA and destined for DeepMind is patient identifiable, and includes the results of every blood test done at Royal Free in the five years prior to transfer [18]. It also includes demographic details and all electronic patient records of admissions and discharges from critical care and accident and emergency. It includes diagnoses for conditions and procedures that have a contributory significance to AKI, such as diabetes, kidney stones, appendectomies or renal transplants, but also those that do not, such as setting broken bones.

2.3 A 'national algorithm' for AKI

Both DeepMind and Royal Free claim that Streams relies solely on a 'national algorithm' for AKI published by the NHS [19]; a process designed to assist in the rapid diagnosis of AKI from the starting point of a renal blood test for creatinine levels [20]. The implication is that all that Streams does is host this algorithm, and pump the Royal Free data (as stored, structured, formatted and delivered by DeepMind) through it to generate alerts [21, 11]. These alerts are transmitted to a clinician's mobile device, along with historical data on the patient in question to analyze trends (in seeming contradiction to the ISA, which stated that historical information was shared only "to aid service evaluation and audit on the AKI product"). Adding any new functions to the app, or fulfilling any of the broader contractual purposes described in the ISA, would comprise research. DeepMind did not have the requisite approvals for research from the Health Research Authority (HRA) and, in the case of identifiable data in particular, the Confidentiality Advisory Group (CAG) [22, 23]. (The Streams FAQ states that "all development is done using synthetic (mock) data. Clinical data is used for testing purposes".) Because DeepMind's processes and servers—and those of the third-party datacenter holding the data—have not been independently scrutinized and explained, what the company has been, and is actually, doing with the data is not public.

The national AKI algorithm was launched in a patient safety alert put out by NHS England on 9 June 2014, recommending "the wide scale introduction and uptake of an automated computer software algorithm to detect AKI" [24]. The algorithm was standardized by a working group of nephrologists and biochemists, with inputs from providers of specialized laboratory software systems, and leads to results being generated in Royal Free's laboratory information management system [25]. DeepMind's asserted role has been to design a clinical app to get the alerts generated by this algorithm delivered to clinicians 'on the fly'. The algorithm does not, however, extend to patients who have never been tested for serum creatinine, nor does it mention historical contextual data [26, 27]. It is only an assistant to good clinical care [28, 29], and its sensitivity and effectiveness remains a vibrant, contested field of research. As DeepMind has acknowledged, "the national algorithm can miss cases of AKI, can misclassify their severity, and can label some as having AKI when they don't" [30]. The failure to explain and address these issues and, in particular, the disconnect between the Trust-wide dataset that has been transferred under the broad terms of the ISA and the narrower set of patients who will ever be monitored and treated for AKI, throws considerable doubt on the DeepMind-Royal Free position that all of the data being transferred is necessary and proportionate to the safe and effective care of each individual patient [31, 32].
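To make the preceding discussion concrete, the sketch below shows the general shape of a ratio-based AKI detection rule of the kind the NHS England patient safety alert describes: a new serum creatinine result is compared against a baseline drawn from the patient's own history, and a suspected AKI stage is flagged when the ratio crosses set thresholds. It is an illustrative paraphrase under simplifying assumptions; the baseline windows, thresholds and staging here are approximations rather than the published specification [24], and nothing in it reflects DeepMind's or Royal Free's actual implementation. The sketch also makes the article's point visible: the rule can only operate on patients who already have a creatinine result and a baseline, a far narrower group than the Trust-wide dataset that was transferred.

```python
from datetime import datetime, timedelta
from statistics import median
from typing import List, Optional, Tuple

def aki_alert(creatinine_history: List[Tuple[datetime, float]],
              current_value: float,
              now: datetime) -> Optional[int]:
    """Return a suspected AKI stage (1-3) or None for a new serum creatinine
    result, in micromol/L. Simplified, illustrative thresholds only; not the
    published NHS England specification [24] and not a clinical tool."""
    last_week = [v for t, v in creatinine_history if now - t <= timedelta(days=7)]
    last_year = [v for t, v in creatinine_history
                 if timedelta(days=8) <= now - t <= timedelta(days=365)]

    # Baseline: lowest result from the previous 7 days if one exists,
    # otherwise the median of results from 8-365 days ago.
    if last_week:
        baseline = min(last_week)
    elif last_year:
        baseline = median(last_year)
    else:
        # No prior creatinine result: the rule cannot run for this patient,
        # which is the group the article notes the algorithm never touches.
        return None

    ratio = current_value / baseline
    if ratio >= 3.0 or current_value > 354.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1

    # A rapid absolute rise (assumed > 26 micromol/L within 48 h) also flags
    # possible stage 1 AKI in this simplified version.
    last_48h = [v for t, v in creatinine_history if now - t <= timedelta(hours=48)]
    if last_48h and current_value - min(last_48h) > 26.0:
        return 1
    return None
```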
3 Grand designs and governance gaps

Between late April 2016, when the scale of the data transfer from Royal Free to DeepMind and the relative lack of constraints on its use became publicly known, and until at least October 2016, DeepMind and Royal Free maintained the narrative that the entire purpose of transferring millions of patient records was to assist with AKI diagnosis and alerts, under a relationship of direct patient care [33]. This position, however, fails to justify both the initial breadth of the data transfer and the continued data retention.

3.1 Questioning direct care

Royal Free states that AKI affects "more than one in six in-patients" [17]. If, as DeepMind claims, it only uses patient data in the service of monitoring and treating AKI, then it follows that as many as five sixths of patients (though this quantity is very unclear on the current state of the evidence) are not in a direct care relationship with the company. (This seems an upper estimate on the clinical reality: see [6].) The distinction between being monitored or treated for AKI and not being monitored matters, because under British medical information governance guidelines [34], a direct care relationship between an identified patient and an identified clinical professional or member of a clinical care team obviates the need for explicit consent. Without such a direct care relationship, however, and without another basis such as consent, a formal research authorization from the HRA CAG, or otherwise satisfying necessity requirements and introducing appropriate safeguards [35], it is unlawful to continue to process patient data under the UK Data Protection Act 1998 (DPA).

As already noted, DeepMind has held data on millions of Royal Free patients and former patients since November 2015, with neither consent nor research approval. The company, with the support of Royal Free, has elected to position itself as having a direct care relationship, by virtue of its AKI alert app, with each and every one of those patients. Drawing boundaries around the patients who are in a direct care relationship is not likely to be as clean as saying that it extends only to those who contract AKI, since the purpose of the app also includes monitoring. However, since the large, Trust-wide group whose data has been transferred includes individuals who have never had a blood test, never been tested or treated for kidney injury, or indeed patients who have since left the constituent hospitals or even passed away, the position that Royal Free and DeepMind assert—that the company is preventing, investigating or treating kidney disease in every patient—seems difficult to sustain on any reasonable interpretation of direct patient care.

3.2 Grand ambitions and systemic change

Despite the public narrative's exclusive focus on AKI, it is clear that DeepMind and Royal Free have always had designs on much grander targets. A memorandum of understanding (MOU) between DeepMind and Royal Free was signed on 28 January 2016, though it was only discussed for the first time in June 2016, after being uncovered by a freedom of information request [36]. The document, which is not legally binding, talks about plans for DeepMind to develop new systems for Royal Free as part of a "broad ranging, mutually beneficial partnership… to work on genuinely innovative and transformational projects" [37]. Quarterly meetings are envisaged for the setting of priorities on product development, including real time prediction of risks of deterioration, death or readmission, bed demand and task management, junior doctor deployment/private messaging, and the reading of cardiotocography traces in labor [38]. Although the MOU states that all such projects will be governed by their own individual agreements, the initial Royal Free ISA already covers DeepMind for the development of a wide range of medical tools.

These are vast ambitions, considerably out of step with DeepMind and Royal Free's narrow public relations orientation towards their collaboration being entirely founded on direct care for AKI. The MOU also makes apparent the esteem in which DeepMind is held by its public service partners, indicating in the principles under which the parties intend to cooperate that one of the major reasons for Royal Free's desired collaboration is "Reputational gain from a strategic alliance with an unrivalled partner of the highest profile and expertise, focused on a highly impactful mission", plus a "place at the vanguard of developments in … one of the most promising technologies in healthcare". DeepMind, by contrast, is in it for rather more sombre reasons: a clinical and operational test-bed, a strategic steer in product development and, most of all, for data for machine learning research.

Nascent indications of DeepMind's plans for datasets that not only span a large healthcare trust such as Royal Free, but the entire NHS, have not yet received critical discussion [39], but can be seen in presentations given throughout 2016 by DeepMind cofounder and health lead, Mustafa Suleyman. These presentations have elaborated a vision for a "truly digital NHS", comprising "massively improved patient care, actionable analytics, advanced research both at the hospital-wide level and the population-wide level, and an open innovation ecosystem" [40]. Suleyman characterizes this fourth element, underpinned technically by "digitizing most of the data that's exchanged across the [NHS] system, open standards, and true interoperability", as "the key pivotal thing" "that will enable us to bring a wide variety of providers into the system and for clinicians up and down the country to be able to commission much smaller, more nimble, startup-like organizations to provide some of the long tail of specialist and niche applications that nurses and doctors are asking for" [40]. At the core of Suleyman's described vision is the "secure and controlled release of data" from what he terms "a single, back-end canonical record" that indexes, but also gives a degree of control to, all patients [41]—a telling sign of where a trust-wide dataset, retrofitted in a way that allows it to be leveraged by Google/DeepMind products and those of other technology companies, might ultimately be directed.

These statements are considerably broader than DeepMind and Royal Free's public relations focus on the Streams AKI app, with very extensive implications deserving of full and rigorous consideration. As Suleyman describes it, the "very specific" targeting of AKI under Streams precedes a "real opportunity for us to go much much further and extend this to a broader patient-centric collaboration platform" [41]. Part of how this would be achieved technically, he indicated, was by making patient health data repurposable through an application programming interface termed FHIR (Fast Healthcare Interoperability Resources; pronounced 'fire'); an open, extensible standard for exchanging electronic health records. The FHIR API, Suleyman indicated in July 2016, allows "aggregating the data in the back-end despite the fact that it is often spread across a hundred plus databases of different schemas and in different standards and in many hospitals". He continued, "this is actually very tractable… it's not a research problem, and we've actually had some success in starting to think about how we might do that, with the Royal Free" [41].
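Because FHIR carries much of the weight in the 'back-end aggregation' Suleyman describes, a minimal sketch of what such access typically looks like may help. The example below queries a FHIR-conformant server for one patient's serum creatinine observations using standard REST search parameters. The base URL, patient identifier and the use of LOINC code 2160-0 for serum creatinine are assumptions made for illustration; the sketch says nothing about DeepMind's or Royal Free's actual systems, which have not been made public.

```python
import requests

# Hypothetical values, for illustration only; not a real NHS endpoint.
FHIR_BASE = "https://fhir.example-hospital.nhs.uk"
PATIENT_ID = "example-patient-id"

def fetch_creatinine_observations(base_url: str, patient_id: str) -> list:
    """Fetch serum creatinine results (assumed LOINC code 2160-0) for one
    patient from a FHIR-conformant server, following Bundle paging links."""
    results = []
    url = f"{base_url}/Observation"
    params = {"patient": patient_id,
              "code": "http://loinc.org|2160-0",
              "_sort": "date"}
    while url:
        bundle = requests.get(url, params=params, timeout=30).json()
        for entry in bundle.get("entry", []):
            obs = entry["resource"]
            quantity = obs.get("valueQuantity", {})
            results.append((obs.get("effectiveDateTime"),
                            quantity.get("value"),
                            quantity.get("unit")))
        # Later pages, if any, are linked from the Bundle itself.
        url = next((link["url"] for link in bundle.get("link", [])
                    if link.get("relation") == "next"), None)
        params = None  # the 'next' link already encodes the query
    return results

if __name__ == "__main__":
    for taken_at, value, unit in fetch_creatinine_observations(FHIR_BASE, PATIENT_ID):
        print(taken_at, value, unit)
```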
By September 2016, Suleyman was pitching DeepMind at the heart of a new vision for the NHS––and casting the Google-Royal Free collaboration in the terms that Google and DeepMind had vigorously denied and critics had feared (i.e. something much broader than an app for kidney injury, giving Google and DeepMind undue and anti-competitive leverage over the NHS [15]), highlighting sharply DeepMind's unsatisfactory and quite possibly unlawful processing and repurposing of Trust-wide Royal Free data. Speaking at Nesta's annual FutureFest, Suleyman stated: "Earlier this year, in February, we launched our first business that's facing outwards, looking at how we can deploy our technologies to radically transform the NHS, digitize, and then help better organize and run the National Health Service [42]." DeepMind's website pertaining to Streams was also updated, to state "We're building an infrastructure that will help drive innovation across the NHS, ultimately allowing clinicians, patients and other developers to more easily create, integrate and use a broad range of new services" [43].

3.3 Riding high above regulatory streets

When Royal Free transferred millions of patient records to DeepMind in November 2015, it was done without consulting relevant public bodies. The UK has an Information Commissioner's Office (ICO), responsible for enforcing the Data Protection Act. The Health Research Authority (HRA) provides a governance framework for health research, and provides a path for the release of confidential health information in the absence of explicit consent, through the Confidentiality Advisory Group (CAG). The Medicines and Healthcare products Regulatory Agency (MHRA) regulates medical devices.

None of these bodies were approached about the November 2015 data transfer [44]: not for informal advice from the ICO; not to go through an official and required device registration process with the MHRA before starting live tests of Streams at Royal Free in December 2015 [44]; and not to go through the HRA's CAG, which could have been a vehicle for legitimizing many aspects of the project [45]. (DeepMind has subsequently been in discussion with all of these parties in reference to its Royal Free collaboration and, for several months from July 2016, stopped using Streams until the MHRA-required self-registration process was completed [46].)

Instead, the parties went through just one third-party check before transferring the data: the 'information governance toolkit' [47], a self-assessment form required by NHS Digital (formerly HSCIC) [48], designed to validate the security of the technical infrastructure DeepMind would be using [49]. The same tool has been used for self-assessment by some 1500 external parties. The tool assists organizations to check that their computer systems are capable of handling NHS data, but it does not consider any of the properties of data transfers such as those discussed in this paper. NHS Digital conducted a routine desktop review of DeepMind's toolkit submission in December 2015 (after data had been transferred) and approved that the third-party datacenter contracted by Google had adequate security [50]. Beyond this surface check, NHS Digital made no other enquiries. It subsequently confirmed the security of the external datacenter with an on-site check, but it was beyond the scope of NHS Digital's role to assess the flow of data between Royal Free and Google or to examine any other parts of Google or any aspect of the data sharing agreements [50].
While the DeepMind-Royal Free project does have a self-assessed Privacy Impact Assessment (PIA) [51], as recommended by the ICO [52], the assessment commenced on 8 October 2015 [53], only after the ISA was signed, i.e. once the rules were already set. The PIA also failed to give any consideration to the historical data trove that was transferred under the ISA, as well as omitting to discuss privacy impacts on patients who never have the requisite blood test or otherwise proceed through the AKI algorithm that Streams uses, but whose data is in DeepMind's servers, and which is formatted, structured, and prepared for repurposing anyway. That is to say, it neglected to deal with the primary privacy issues, as well as to justify the failure to address basic data processing principles such as data minimization. At the time of publication, the ICO was investigating the data transfer (primarily on whether data protection law requirements have been satisfied) [54], as was the National Data Guardian (primarily on the adequacy of the 'direct care' justification for processing) [55]. The only remaining health regulator in the picture is the Care Quality Commission (CQC), which gave a statement in October 2016 indicating the CQC would consider reported data breaches to the ICO as part of its own inspections, but otherwise declined to comment on the data transfer, indicating that it was broadly supportive of experimentation with big data-based care solutions "if they will lead to people getting higher quality care without undermining patient confidentiality" [56].

One year after data started to flow from Royal Free to DeepMind, the basic architecture of the deal had not visibly changed. On the other hand, subsequent deals between DeepMind and other London medical institutions, this time for research rather than direct patient care, were announced in a way that avoided many of the same questions. In these arrangements, data was anonymized before being transferred to DeepMind, and research approval (which raises separate issues, as discussed further below) was sought and gained before any research work commenced. Crucially, DeepMind and its partners were clear about the purposes and amount of data that would be transferred in those deals.

4 Assessing the damage

The most striking feature of the DeepMind-Royal Free arrangement is the conviction with which the parties have pursued a narrative that it is not actually about artificial intelligence at all, and that it is all about direct care for kidney injury—but that they still need to process data on all the Trust's patients over a multi-year period. This is hardly a recipe for great trust and confidence, particularly given that the arrangement involves largely unencumbered data flows, both to one company, DeepMind, whose raison d'être is artificial intelligence, and to its parent, Google, the world's largest advertising company, which has long coveted the health market [57]. Combined with the unavoidable fact that a sizeable number of patients never need care for kidney injury, the absence of any public consideration of patient privacy and agency, and the lack of safeguards to prioritize public goods and interests over private ones, there are reasons to see the deal as more damaging than beneficial.

Large digital technology companies certainly have the potential to improve our healthcare systems. However, given the sensitivity of building public trust in emerging technology domains, in order for innovation to deliver over the long-term, it must advance in a way that meets and exceeds existing regulatory frameworks and societal expectations of fair treatment and value. Not doing so will only hinder the adoption and growth of beneficial technology.

In this section, we identify a number of salutary lessons from the case study, assessing their implications for DeepMind, in particular, and for current and future directions in healthcare, more generally. These lessons draw on the themes of consent, transparency, privatization and power. Our ambition is to span from the details presented in the previous two sections towards the broader dynamics at play, both in the present deal and in the longer-term ambitions of AI-driven medical tools. The DeepMind-Royal Free deal is fast being converted from an optimistic mistake into a long-term partnership. What are the implications, both for this deal and for others that loom?

The significance of this case study is not only that there are retrospective and grave concerns about the justifiability of DeepMind's continued holding of data on millions of citizens. The case study also offers a prism on the future. It offers one angle into how public institutions and the public at large are presently equipped to grapple with the promised rise of data-driven tools in domains such as medicine and public health. And it tests our assumptions and responses to Google/Alphabet and other speculative private prospectors of this algorithmic age—'New Oil', 'New Rail', 'New Pharma', we could say––as they transition from web-based markets, into land- and body-based markets.
4.1 The conversation we need

It was only after an independent journalistic investigation revealed the necessary information—seven months after DeepMind and Royal Free first entered into a data sharing agreement, five months after the data had been transferred into DeepMind's control and during which all product development and testing had commenced, and two months after the project had been publicly announced––that any public conversation occurred about the nature, extent and limits of the DeepMind-Royal Free data transfer. Despite the shortcomings in the deal's structure, if DeepMind and Royal Free had endeavored to inform past and present patients of plans for their data, initially and as they evolved, either through email or by letter, much of the subsequent fallout would have been mitigated. A clear lesson of this whole arrangement is that attempts to deliver public healthcare services should not be launched without disclosing the details, documentation, and approvals—the legal bedrock—of the partnerships that underlie them. This lesson applies no less to companies offering algorithmic tools on big datasets than it does to pharmaceutical and biotech companies.

The failure on both sides to engage in any conversation with patients and citizens is inexcusable, particularly in the British context, in the wake of the 2013 Caldicott review into information governance practices [34], the very public and profoundly damaging 2013-15 failure of the government's care.data data sharing scheme [58], the 2014 recommendations of the National Data Guardian in the wake of the care.data debacle [59], and the 2015 Nuffield Council report on bioethics [60]. The clear take-away from these reports and recommendations––and indeed the entire regulatory apparatus around healthcare––is that patients should be able to understand when and why their health data is used, with realistic options for effective choice [61]. Patients should not be hearing about these things only when they become front-page scandals [62].

The DeepMind-Royal Free data deal may be just one transaction, but it holds many teachings. To sum up:

1) We do not know––and have no power to find out––what Google and DeepMind are really doing with NHS patient data, nor the extent of Royal Free's meaningful control over what Google and DeepMind are doing;
2) Any assurances about use of the dataset come from public relations statements, rather than independent oversight or legally binding documents;
3) The amount of data transferred is far in excess of the requirements of those publicly stated needs, but not in excess of the information sharing agreement and broader memorandum of understanding governing the deal, both of which were kept private for many months;
4) The data transfer was done without consulting relevant regulatory bodies, with only one superficial assessment of server security, combined with a post-hoc and inadequate privacy impact assessment;
5) None of the millions of identified individuals in the dataset were informed of the impending transfer to DeepMind, nor asked for their consent;
6) The transfer relies on an argument that DeepMind is in a "direct care" relationship with each patient that has been admitted to Royal Free constituent hospitals, even though DeepMind is developing an app that will only conceivably be used in the treatment of one sixth of those individuals; and
7) More than 12 months into the deal being made, no regulator had issued any comment or pushback.

If account is not taken of these lessons, it could result in harms beyond the breach of patients' rights to confidentiality and privacy––though these elements in themselves should be enough to demand a regulatory response. Some of the potential risks posed by unregulated, black box algorithmic systems include misclassification, mistreatment, and the entrenchment and exacerbation of existing inequalities. It does not take an active imagination to foresee the damage that computational errors could wreak in software applied to healthcare systems. Clearly, the same skills and resources must be devoted to the examination and validation of data-driven tools as to their creation.

Without scrutiny (and perhaps even encouraged competition), Google and DeepMind could quickly obtain a monopolistic position over health analytics in the UK and internationally. Indeed, the companies are already in key positions in policy discussions on standards and digital reform. If a comprehensive, forward-thinking and creative regulatory response is not envisaged now, health services could find themselves washed onwards in a tide of efficiency and convenience, controlled more by Google than by publicly-minded health practitioners. Aggregating and centralizing control of health data and its analysis will generate levers that exist beyond democratic control, with no guarantees except for corporate branding and trust as to where they might end up.

It is important to reflect on these scenarios not as a prediction of what will come to pass, but as a vision of the potential danger if policymakers and regulators do not engage with digital entrants such as DeepMind and incumbents such as Google. There may be other, worse outcomes. To demand that innovation be done in a principled manner is not to stand in its way––it is to save it.

4.2 Data protection

DeepMind and Royal Free have coalesced on the justification of direct care and implied consent to seek to justify the sharing of the Royal Free dataset [48, 17, 32]. Although this is the only available basis for them to justify the data transferred in November 2015, it sets a dangerous precedent for the future. To understand why, we need to step through the UK data protection and medical information governance frameworks.

Under the UK Data Protection Act, DeepMind needs to comply with a set of data protection principles, including having a legitimate basis at all times for processing information that can identify an individual [63]. Health information is classed as 'sensitive personal data' under this law [64], and is subject to additional safeguards [65]. For DeepMind, legitimate processing of health information comes down to one of two alternatives. Either it requires explicit consent from the individual concerned, which DeepMind does not have, or DeepMind must show that its processing is "necessary for medical purposes" (defined as "the purposes of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of healthcare services") and "undertaken by (a) a medical professional; or (b) a person who in the circumstances owes a duty of confidentiality which is equivalent to that which would arise if that person were a health professional" [66].

Simply using health data for the purpose of speculative and abstract "medical purposes" does not satisfy data protection law. This is where the medical information governance architecture—the so-called Caldicott principles and guidelines [34]—comes into play. Before turning to these rules, it is important to address an outstanding core issue in the data protection aspects of the Royal Free-DeepMind project.
Data protection law relies on a key distinction between 'data controllers' and 'data processors' [67]. A data controller is defined as "a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed", while a data processor is "any person (other than an employee of the data controller) who processes the data on behalf of the data controller" [68]. It is crucial to define controller and processor status in any information sharing arrangement because legal obligations and liabilities flow from it [69], with significant real-world consequences [70]. In essence, data controllers bear primary responsibility for complying with the principles of data protection, for accommodating data subject rights and other requirements, and are liable for damages in case of non-compliance.

The ISA between Royal Free and DeepMind states at a number of points that Royal Free is the data controller, while DeepMind is merely a data processor. While this is clearly the intention of the parties, the legal question is one of substance, not form. The substantial issue turns on applying the provisions of the DPA, particularly paragraphs 11-12 of Schedule 1, Part II. These provisions require, respectively, under paragraph 11 that a data processor provide sufficient guarantees and compliance in respect of the technical and organizational security measures governing the processing to be carried out and, under paragraph 12, that the processing be carried out under a contract, made or evidenced in writing, under which the processor "is to act only on instructions from the data controller".

It seems clear that Royal Free have contracted with DeepMind to analyze complex data and come up with solutions by applying DeepMind's own expertise in analysis to an extent that Royal Free cannot begin to do. Apart from the parties' consensus on the overall purpose of processing––to assist in monitoring AKI using the nationally-mandated AKI algorithm––DeepMind seems to have considerable discretion, in addition to Royal Free, to determine the purposes and manner in which any personal data is processed. The company is storing, structuring and formatting the Trust-wide dataset, testing it, preparing to deliver data and visualizations to clinicians' devices and, most recently, discussing technical infrastructure that could enable it to be repurposed. These factors all point very strongly to DeepMind assuming the role of a joint data controller. Certainly, Royal Free, in its responses to investigations and freedom of information requests, has never provided any specific awareness or understanding of the means of DeepMind's processing.

Further, even if DeepMind were to avoid the substantive factual conclusion that it is determining the purposes and manner of data processing, the document that is said to constrain DeepMind's processing—the ISA—has a number of shortcomings that undermine its status as a 'contract' satisfying the mandatory requirements for data controller-processor relationships in Schedule 1, Part II, paragraph 12 of the DPA. The contract plausibly extends to a wide range of health tools for any health condition, without overriding controls from Royal Free. There is an absence of evidence in writing that DeepMind will act only on instructions from Royal Free, that data will not be linked with other datasets held by Google or DeepMind, or that the data will not be repurposed for other uses. It is irrelevant whether or not the parties would actually do any of these things. Assurances from the parties are not what matters here––what matters is what is stated in the document that purports to be the governing contract. Finally, the status of the document as a contract is diminished by its absence of any discussion of consideration passing between the entities.

DeepMind cannot be converted to being a pure data processor by having both parties sign an agreement declaring that this is its status, no matter how much the parties might wish it [71]. The situation is analogous to the example given by the ICO of a sharing agreement between a car rental service and a tracking company that helps ensure that cars are returned [70]. The agreement allows the tracking company to hold customer location data for a set period. However, the ICO states that because the tracking company applies its own secret knowledge in deciding the data to collect and how to analyze it, the fact that the rental company determines the overall purpose of tracking (i.e. car recovery) is not sufficient to make the tracking company a processor. Addressing and resolving the status of DeepMind is crucial, and is presumably a core dimension of the ICO's ongoing investigations of the deal. In our assessment, it is clearly arguable that DeepMind is a joint data controller along with Royal Free. It is unfortunate that the ICO had not yet made a clear determination to resolve the question of legal status over 12 months after the deal commenced, leaving individual rights and organizational responsibility hanging in the balance.
4.3 Caldicott guidelines

The Caldicott rules help reduce the friction on data sharing of identifiable health information for direct patient care, while ensuring that other uses of such information—indirect care, such as research on identifiable individuals or risk prediction and stratification––are accorded sufficiently strong regard to legal obligations of privacy and confidentiality. In relation to Streams, the argument made by Google and Royal Free—and their only arguable basis for continuing to process the totality of data made available under the ISA—is that DeepMind is in a direct patient care relationship with all Royal Free patients. The assertion seems to be that, since any Royal Free patient may deteriorate with AKI in the future, the hospitals are justified in sharing the superset of everyone's medical information with Google now, just in case a patient needs DeepMind's services in the future. To this the odd claim is added that "with any clinical data processing platform it is quite normal to have data lying in storage" [72], without acknowledging necessary legal and ethical limits to such a claim. This line of reasoning is unsustainable if 'direct care' is to have any useful differentiating meaning from 'indirect care'. By the same argument, DeepMind would be justified in holding all the data on every patient in the NHS, on the basis that one of those patients, one day, might require direct clinical treatment through Streams [73].

DeepMind's situation has no clear direct analogy in the Caldicott guidelines. Usually when speaking of the implied consent inherent in direct care relationships, the guidelines describe scenarios where registered clinical professionals acting as part of a clinical team, all with a legitimate relationship with the patient, pass relevant patient data between themselves, e.g., a surgeon passing a patient to a post-operation nurse [34, 74]. Implied consent in these scenarios is easily justified. It builds on the core relationship between a patient and a clinical professional, within which tools—including software tools, record management systems, alert and analytics systems, etc.—can be introduced in the service of patient care. There are also safeguards, such as that complaints can be made to the General Medical Council.

For individuals who are escalated to clinical intervention based on the results of applying the AKI algorithm after a preliminary blood test, clearly this direct care scenario applies. However, for the remainder of patients whose data has been transferred to DeepMind, no plausible necessity for DeepMind's processing of their data arises. It is, instead, a classic situation of health services management, preventative medicine, or medical research that applies to the overall provision of services to a population as a whole, or a group of patients with a particular condition. This is the very definition of indirect care [34]. Lawful processing of identifiable data for indirect care, if there is no consent, can only proceed under what is termed the 'statutory gateway'––i.e. under section 251 of the NHS Act 2006 (UK) and The Health Service (Control of Patient Information) Regulations 2002. In effect, s.251 allows third-parties to bypass the impracticality of gaining consent from large numbers of patients to process their data, by asking the Secretary of State for Health on the patients' behalf through the HRA CAG approval process. It is notable that the process that Royal Free and DeepMind assert is necessary here—of storing, structuring and formatting trust- or hospital-wide datasets in order to then effectively deliver clinical care to a subset of patients—does not naturally fall into any of the envisaged s.251 circumstances in which confidential patient information may be processed [75].

A final element in addition to the hard legal arguments about consent and the ICO, and direct care and the Data Guardian, is notice. Notice is not a mandatory requirement under the DPA if data is lawfully repurposed, but it is necessary if data is being processed for a new purpose [76]. This would be the case, at the very least, for patients who are in the transferred dataset, but who are never tested and treated for kidney injury. Though at the broadest level Royal Free is engaged in repurposing data acquired for medical purposes, in this case, to argue that this data is legitimately being repurposed en masse in the DeepMind deal undermines wholly the protections afforded to such sensitive data in both the DPA and the Caldicott rules.

As a partial acknowledgment of this, in May 2016 Royal Free highlighted that an opt-out exists to data sharing with Google/DeepMind [17]. However, the opt-out was only made clear after public attention had been called to the deal. Such an after-the-fact concession strikes as poor compensation, and is inconsistent with the practice of other hospitals in endeavors of similar reach. Take for example the 2015 Connecting Care project, comprising Bristol, North Somerset and South Gloucestershire hospitals [77]. This project involved a more sound basis for population-wide data sharing based on implied consent, because it concerned various third-party providers being linked to provide an electronic patient record system. A mass mailing of information on the parties involved, and reasons for data processing, to all individuals in the community was undertaken as a key exercise to inform and allow individuals to opt out, and was followed up with ongoing efforts to inform patients. Though this project was more involved than the Royal Free-DeepMind deal, it also had a more legitimate reason for extending across the entire population of constituent hospitals. Royal Free has not justified why a similar process did not take place with its arrangements with Google.

Given Streams is characterized as a clinical app, there are more elegant––and less legally and ethically dubious––solutions available than simply running a mirror copy of the Royal Free's repository of patient data on third-party servers controlled by DeepMind, for every single hospital patient, entirely independently of AKI susceptibility and diagnosis. One solution is for DeepMind to pull in historical data only on patients who have had the gateway blood test that is prerequisite for AKI diagnosis. If Royal Free's systems cannot currently handle real time data requests in this manner, they ought to. It seems in the essence of an ethical and legal streaming service that just as a patient's relevant blood tests from Royal Free 'stream' to DeepMind's servers, so should historical data on the identified at-risk patients.
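To illustrate the narrower architecture suggested above, the sketch below shows an event-driven flow under the same assumptions as the earlier examples: nothing is copied in bulk, a patient's history is requested only when a new creatinine result, the gateway test, arrives, and an alert is raised only if the staged rule fires. It reuses the hypothetical aki_alert and fetch_creatinine_observations helpers sketched earlier and illustrates the authors' suggested alternative, not any deployed system.

```python
from datetime import datetime

def on_new_creatinine_result(patient_id: str, value_umol_l: float,
                             taken_at: datetime) -> None:
    """Handle one incoming laboratory result: fetch that patient's history on
    demand and raise an alert only if the staged rule fires. Reuses the
    hypothetical helpers sketched earlier; no bulk, Trust-wide copy is held."""
    history = []
    for ts, val, _unit in fetch_creatinine_observations(FHIR_BASE, patient_id):
        if ts is None or val is None:
            continue
        # Timezone handling is simplified for the purposes of this sketch.
        history.append((datetime.fromisoformat(ts.replace("Z", "+00:00"))
                        .replace(tzinfo=None), val))

    stage = aki_alert(history, value_umol_l, taken_at)
    if stage is not None:
        send_alert_to_clinician(patient_id, stage)

def send_alert_to_clinician(patient_id: str, stage: int) -> None:
    # Delivery to a clinician's device is left abstract in this sketch.
    print(f"Suspected AKI (stage {stage}) for patient {patient_id}")
```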
Below, we unpack the implications of these points with a focus on transparency, data value, and market power. There has been an inexcusable institutional delay in the NHS, ICO and Data Guardian's response to the issues discussed so far. The remainder of this section exposes how ill-equipped our institutions are to deal with the challenges ahead.

4.4 Transparency and the one-way mirror

At the heart of this deal is a core transparency paradox. Google knows a lot about all of us. For millions of patients in the Royal Free's North London catchment, it now has the potential to know even more. Yet, when the tables are turned, we know very little about Google. Once our data makes its way onto Google-controlled servers, our ability to track that data—to understand how and why decisions are made about us—is at an end. Committed investigative reporting has led to documentation describing the DeepMind-Royal Free data transfer being made public, but we still have no real knowledge of what happens once the data reaches DeepMind, nor many tools to find out.

The public's situation is analogous to being interrogated through a one-way mirror: Google can see us, but we cannot see it. There are many parallels to another contemporary policy issue involving Google: its application of a 2014 European court ruling requiring the company to delist information that is retrieved on name searches from its search engine when that information is not of public interest and is shown to have lost relevance, accuracy or timeliness [80]. In that case too, the one-way mirror has conceded only cracks of knowledge. The tools of discovery, to inform the public about privately-run services with deep impacts on their lives, are vastly unequal to the power that Google wields.

4.5 Corporate responsibility

Even without portholes through which to examine the operations of powerful technology companies in detail, there is still a lot more that can be done, both from corporations themselves, and from the institutions that are mandated to oversee them. The deal-making between DeepMind and public institutions continues to be secretive. This is inappropriate for a system that typically requires open tender and disclosure. The purpose and terms of these deals should be made transparent, before committing populations of millions to them. They should clearly lay out the public benefit of their works, as well as the private benefits—what is in it for Google, for DeepMind? What initiatives have been made towards ensuring ongoing and equitable benefit-sharing? How are procurement rules and restrictions satisfied? While total transparency of processes is not possible, transparency of purpose and means must be—legitimizing, in detail, the company's reasons and limits in holding sensitive data. To its credit, DeepMind's announcements of deals subsequent to Royal Free have moved in this direction, although peer reviewers still question issues of consent [81] and the lack of detail around the algorithmic processes to be applied [82].

DeepMind has also taken steps towards self-regulation. When DeepMind announced Streams in February 2016, it also announced the creation of a board of what it termed 'independent reviewers'—nine prominent public figures in the fields of technology and medicine in the UK—to scrutinize the company's work with the NHS [83, 84]. The board met for the first time in June 2016. The board is ostensibly reviewing DeepMind's activities in the public interest, but as at the end of January 2017, it had not made any minutes or account of its discussions public, nor had any reviewers expressed any concerns about DeepMind's arrangements publicly. Annual statements are envisaged. Oversight of artificial intelligence as it is applied to healthcare is obviously desirable. But a self-appointed oversight board, arguably paid in the currency of reputational gain by association with successful technology companies, is far from adequate in itself. Being hand-chosen by DeepMind, the members of the board are unlikely to have positions fundamentally at odds with the company. It would also be a considerable about-face to denounce the whole arrangement with a partner such as Royal Free. At best, the board will supplement institutional external oversight mechanisms and provide insights not readily gained by outsiders: for example, access to internal data; independent assessments of internal arrangements for data handling, privacy and security; and empirical insights into the attitudes of employees and the protection of the public interest. At worst, however, such a board risks creating a vacuum around truly independent and rightly skeptical critique and understanding.

The question of how to make technology giants such as Google more publicly accountable is one of the most pressing political challenges we face today. The rapid diversification of these businesses from web-based services into all sorts of aspects of everyday life—energy, transport, healthcare—has found us unprepared. But it only emphasizes the need to act decisively.

Machine learning tools offer great promise in helping to navigate complex information spaces, environments and workflows that are beyond the reach of any one clinician or team. However, it is essential that the supply chains of data and humans leading to any machine learning tools are comprehensible and queryable. This is a check on the impulse of technology startups that want to 'move fast and break things'. While there is little doubt that individuals at DeepMind do care about improving the situation at Royal Free and across the NHS generally, the young company is clearly interested in moving fast—as are Royal Free's clinicians. 'The faster we move, the more lives we can save', goes the logic. This may be true, but it injects several elements of dangerous risk, and potentially hazardous breakages, into developing these new tools: first, that the tools will provide misleading and harmful advice in edge cases; and second, that public trust and confidence in artificial intelligence erodes, making it harder to carry out projects in sensitive areas in the future, despite their promised benefits. Aligning the development and operation of artificial intelligence products with human-scale accountability and explanation will be a challenge. But the alternative is to abdicate ourselves to systems that, when they break, will not explain themselves to us.

It is worth noting that in digesting our medical records and histories, machine learning systems have the potential to uncover new hypotheses and trends about us, as a population, that are difficult to adapt to and deal with. It may turn out, for instance, that certain kinds of people are particularly susceptible to requiring an expensive medical intervention over the course of their lives. Regulations should require that the burdens of new discoveries not fall solely on the shoulders of those individuals who happen to need the intervention. There is a risk that, if we do not understand how companies like DeepMind draw knowledge from our data, we will not be prepared for the implications of the knowledge when it arrives. It is essential that society is prepared for these new-found patterns, and able to protect those people who find themselves newly categorized and potentially disadvantaged. This newfound understanding of our condition will leave us all better off, but only if we share the burdens that the discoveries will place on individuals.

4.6 Privatization and value for data

Even if DeepMind had been more open about its Royal Free data deal, as it was in subsequent research deals, questions still remain about the value that flows to the British public from these deals. DeepMind has made public two other partnerships with the NHS, both—unlike with Royal Free—for research rather than patient care, with actual involvement of AI, and with appropriate research approvals. One, with Moorfields Eye Hospital in London [85], involves the AI company receiving one million anonymized eye scans, which it will run through its machine learning algorithms in search of new patterns of degeneration that might allow disease to be caught earlier [86]. Like the Royal Free collaboration, it commenced in July 2015 [87], when a Moorfields ophthalmologist approached DeepMind with a research question: can deep learning be used to diagnose age-related macular degeneration or diabetic retinopathy? Approval to work on anonymized data was granted by Moorfields in October 2015, and the first part of an approval to work on pseudonymized data came in June 2016, at the same time as a research protocol was published in an open access journal [88]. Ethical approval was granted, but it is worth noting that it was confined to looking at the risk of adverse patient events, not at broader questions such as the future for jobs, for competition, human deskilling, etc. [82]. (As one reviewer remarked: 'Overall a novel concept and worth exploring as it will be able to replace human workforce if successful'.) The Moorfields project was announced publicly in July 2016. While other hospitals and startups can pursue similar projects, Moorfields sees more patients a year than any other eye hospital in the US or Europe.

The second partnership, with UCL Hospitals NHS Foundation Trust, sees DeepMind receiving 700 anonymized radiography scans [89]. The AI company is attempting to improve how treatment is planned for head and neck cancer, by speeding up scan segmentation—the process of deciding where and how to direct radiation in order to maximize impact on cancerous cells and minimize harm to healthy tissue. At the moment an expert radiologist needs to label images pixel by pixel, with a 28-day wait time, for a four-hour process [40]. DeepMind received approval to work on anonymized data in April 2016, with its research protocol published in August 2016 [90].
The assumption is that DeepMind's technical capability will let it discover new things about analyzing medical imagery, and that those new modes of analysis will be shared back with the community. However, documents binding DeepMind's agreement with Moorfields and UCL, and the terms of data sharing, were not public as at October 2016. We do know that DeepMind will keep all algorithms that are developed during the studies. In other words, the knowledge DeepMind extracts from these public resources will belong exclusively to DeepMind. Even if it publishes the scientific results of its studies, it is unlikely to freely publish the algorithms it has trained. In effect, the chance to train and exploit algorithms on real-world health data is DeepMind's consideration for these deals. The consideration for its partners is that those algorithms—and the promise that they advance the field of diagnostics—exist in the world. Given this, the opacity of consideration passing between the parties in this, as with the contract with Royal Free, is problematic. There are no details on the clinical service and cost of any service to be provided by DeepMind in exchange for the data access, only vague statements made in public fora about the possibility of a future levy being imposed, in alignment with improvements in clinical outcomes.

4.7 Open competition and public interest

Offering DeepMind a lead advantage in developing new algorithmic tools on otherwise privately-held, but publicly-generated, datasets limits the adoption of any scientific advances the company may make to two channels: via DeepMind, on DeepMind's terms; or by recreating, at expense and with unclear routes to access, DeepMind's training on the same datasets.

Concepts of the value of data have not yet permeated popular culture. Google and other technology companies know very well what value they can unlock from a particular dataset and from access to millions or billions of computers that stream data on how their human owners walk, talk, travel and think. But the public, and by extension the public sector, do not yet contemplate the value of this commodity that only they are capable of creating. Without people, there is no data. Without data, there is no artificial intelligence. It is a great stroke of luck that business has found a way to monetize a commodity that we all produce just by living our lives. Ensuring we get value from the commodity is not a case of throwing barriers in front of all manner of data processing. Instead, it should focus on aligning public and private interests around the public's data, ensuring that both sides benefit from any deal [91].

The value embodied in these NHS datasets does not belong exclusively to the clinicians and specialists who have made deals with DeepMind. It also belongs to the public who generated it in the course of treatment. There is a pressing need for the NHS to consult broadly on the value-for-data aspects of these transfers, to ensure that the British public gets the best possible value out of any future deal. This value might take the form of an NHS stake in any products that DeepMind, a for-profit company, develops and sells using NHS data. It could be as simple as a binding agreement to share any future products with the entire NHS at a discount, or for free. It is inappropriate to leave these matters for future discussion, risking lock-in. There may even be scenarios where third-party processors can use NHS data to build products that are not related to health, but are useful in other markets. The public has a revulsion against 'selling' NHS data, but this impulse sells the public short on its own assets. The Royal Free-Google deal suggests that data will flow in any event, under the banner of innovation, without any value-for-money discussions. We recommend that, in addition to formalizing inputs on these aspects of value, the NHS might also consider the intrinsic impacts of automation on the clinical service [92]: how will clinicians interface with these new tools? How will the NHS deal with inevitable deskilling and shifts in the workforce, in response to automation? How will it ensure that the daily art of medicine is as protected and valued as the science?

A properly resourced and truly independent entity or entities should be tackling these challenges. Perhaps the Council of Data Science Ethics and the standing Commission on Artificial Intelligence, recommended—and, in the first case, accepted by the government [93]—under two reports of the UK House of Commons Science and Technology Committee [94, 95], will be able to undertake this task, but their independence and rigor must be proven. They must also take into account the fact that DeepMind continues to rapidly expand its staff, including with senior appointments from the ranks of government and the NHS itself [96, 97].

4.8 Market power

The new phenomenon of using machine learning to extract value from health data is likely the precursor of a general movement to monetize public datasets. Centralized government services are obvious targets for machine learning. They are directed towards fundamental human needs of care, housing, education and health, and often hold long baseline datasets about human behavior in areas where services could be improved. The complexity and scale of this information is what has led to the suggestion that these are areas where the sheer force of computation and algorithmic learning on large volumes of data offers great utility and promise.

When private companies access these resources with the intention of building on top of them, first-mover advantage exists as it does whenever private companies exploit public resources—land, fossil fuel stores, connection points to people's homes. In the new realm of machine learning, it is important to ensure that DeepMind's algorithms do not put it in an entrenched market position.

Of course, DeepMind is not the only innovator making overtures to the NHS, and machine learning is not the only innovation. In the case of kidney injury, outcomes would be as well influenced by employing more nurses to ensure that patients are hydrated as by deploying new algorithms. Some healthy caution about the first mover is advised. If our public services have not laid the groundwork for an open, flourishing future innovation ecosystem, then the temptation for players like DeepMind to sit on their entrenched networks will be too strong.

It is important to note that, while giving DeepMind access to NHS data does not in principle preclude the same access being given to other companies in future, the willingness to recreate work, and the ability to catch up, will diminish over time. Already, anecdotally, startups are reluctant to move in places where DeepMind has started deploying its immense resources. The danger of unconstrained, unreflective allocation of datasets to powerful parties is that the incentives for competition will distort. Like physical networks of electricity cables or gas pipes, it is perfectly possible for another company to redo what has been done by another. However, there are powerful inefficiencies and network effects that count against such possibilities. If we are to see the true promise of artificial intelligence, a much more positive solution would be to heavily constrain the dataset and to introduce a competitive, open process for simultaneous technology development by a range of private, public, and private-public providers.

A way of conceptualizing our way out of a single provider solution by a powerful first mover is to think about datasets as public resources, with attendant public ownership interests. Ownership in this context is often a loaded notion, but it does not need to reduce to something that is atomized and commoditized for control at the individual level. Learning from commons movements [98], trusted institutions and communities appear to be the best vehicles to advocate for individual rights, rather than placing the burden of ownership on individuals. The key then is to return value at the communal level [99]. Indeed, data held by NHS trusts ought to be perfectly positioned for this treatment.

Hospitals are a community dedicated to the care of their patients. The first step for DeepMind and Royal Free should have been to engage the community in explaining the solutions they will pursue, and achieving buy-in with communal control and reward. The second step would have been to expand this with other alternatives in a flourishing innovative ecosystem. This did not happen, and it does not look like it will happen. In this regard, it is important to note that offering functionality for patients to see and audit their own data as it moves through systems [100, 101], as DeepMind has intimated that it will do in the future, is a positive development, but it is also one that resigns itself to perpetuating ultimate control, and a power asymmetry, in the hands of those who control the system—in this case, DeepMind.
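To give a concrete sense of what 'seeing and auditing your own data as it moves through systems' could mean in practice, the sketch below shows a minimal tamper-evident access log, in which each access record commits to the one before it, so retroactive edits become detectable. It is a generic, hypothetical construction of ours, not the mechanism DeepMind has said it will build; and, as noted above, whoever operates such a log still determines where power sits.

```python
# Illustrative, hypothetical sketch of a tamper-evident access log a patient
# could inspect: each entry is hash-chained to the previous one.

import hashlib, json, time

class AccessLog:
    def __init__(self):
        self.entries = []  # each entry: {"record": ..., "prev": ..., "digest": ...}

    def append(self, patient_id: str, accessor: str, purpose: str) -> dict:
        record = {"patient_id": patient_id, "accessor": accessor,
                  "purpose": purpose, "timestamp": time.time()}
        prev = self.entries[-1]["digest"] if self.entries else "genesis"
        digest = hashlib.sha256(
            (prev + json.dumps(record, sort_keys=True)).encode()).hexdigest()
        entry = {"record": record, "prev": prev, "digest": digest}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any retroactive edit breaks every later digest."""
        prev = "genesis"
        for e in self.entries:
            expected = hashlib.sha256(
                (prev + json.dumps(e["record"], sort_keys=True)).encode()).hexdigest()
            if e["prev"] != prev or e["digest"] != expected:
                return False
            prev = e["digest"]
        return True

log = AccessLog()
log.append("patient-123", "streams-app", "AKI alert review")
print(log.verify())  # True; altering any stored field makes this False
```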
None of the approaches of DeepMind, of Google, or of the industry-supported Partnership on Artificial Intelligence that they announced in 2016, do anything to mitigate this control. They trumpet their own good intents, in benefiting the many in open, responsible, socially engaged ways that avoid undesirable outcomes [102]. But ultimately, these are tweaks within the frame of a certain deterministic approach to technology. They look for corporate initiative, not for robust solutions that stand outside our present paradigm and ask how best we can truly assure that we advance technologically, and that we do so in a way that ensures deep and broad public interests are met, not just superficially immediate, efficient, commercial solutions.

5 Conclusion

The 2015–16 deal between a subsidiary of the world's largest advertising company and a major hospital trust in Britain's centralized public health service should serve as a cautionary tale and a call to attention. Through the vehicle of a promise both grand and diffuse—of a streaming app that will deliver critical alerts and actionable analytics on kidney disease now, and the health of all citizens in the future—Google DeepMind has entered the healthcare market. It has done so without any health-specific domain expertise, but with a potent combination of prestige, patronage and the promise of progress.

Networks of information now rule our professional and personal lives. These are principally owned and controlled by a handful of US companies: Google, Facebook, Microsoft, Amazon, Apple, IBM. New players cannot compete with these successful networks, whose influence deepens and becomes more entrenched as they ingest more data, more resources. If these born-digital companies are afforded the opportunity to extend these networks into other domains of life, they will limit competition there too. This is what is at stake with Google DeepMind being given unfettered, unexamined access to population-wide health datasets. It will build, own and control networks of knowledge about disease.

Fortunately, health data comes with very strong protections that are designed to protect individuals and the public interest. These protections must be respected before acceding to any promises of innovation and efficiency emanating from data processing companies. Public health services such as the British NHS are deeply complex systems. It is imperative for such institutions to constantly explore ways to advance technologically in their public health mission. Artificial intelligence and machine learning may well offer great promise. But the special relationship that has surged ahead between Royal Free and Google DeepMind does not carry a positive message. Digital pioneers who claim to be committed to the public interest must do better than to pursue secretive deals and specious claims in something as important as the health of populations. For public institutions and oversight mechanisms to fail in their wake would be an irrevocable mistake.
Acknowledgements Hodson acknowledges the support of New Scientist, where many of the investigations and facts discussed in this article were first revealed. Both authors warmly thank the participants at the Salzburg Global Forum–Johann Wolfgang von Goethe Foundation workshop, 'Remaking the state: The impact of the digital revolution now and to come', the University of Cambridge's Technology and Democracy project and Computer Security Group, and numerous colleagues for the many conversations that fueled this endeavor. Thanks are also due to three anonymous reviewers and the editor of this special issue for their helpful comments and guidance.

Compliance with ethical standards

Conflict of interest The authors declare they have no conflict of interest.

Funding There is no funding source.

Ethical approval This article does not contain any studies with human participants or animals performed by any of the authors.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Footnote: This commitment was reaffirmed by both parties immediately prior to publication: personal communication, O'Connell M, Royal Free press office to Powles, 5 Nov 2016; personal communication, Rickman O, DeepMind to Powles, 4 Nov 2016.

References

1. DeepMind. Acute kidney injury. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 6 Oct 2016.
2. Royal Free response to Hodson freedom of information request 1548, 30 Aug 2016.
3. The exact number is unknown, but Royal Free admits an average 1.6 million patients per year: NHS. Overview. In: Royal Free London NHS Hospital Trust. 2016. http://www.nhs.uk/Services/Trusts/Overview/DefaultView.aspx?id=815. Accessed 20 Sep 2016.
4. DeepMind. Information sharing agreement. 2016. https://storage.googleapis.com/deepmind-data/assets/health/Royal Free - DSA - redacted.pdf (granting DeepMind data on all patients over a five year period). The agreement was signed by Subir Mondal, a deputy director and head of information governance at Royal Free, and Mustafa Suleyman, one of DeepMind's three cofounders (presumably with authority to contract on behalf of Google).
5. DeepMind. We are very excited to announce the launch of DeepMind Health. 2016. https://deepmind.com/blog/we-are-very-excited-announce-launch-deepmind-health/. Accessed 20 Sep 2016.
6. Kerr M, Bedford M, Matthews B, O'Donoghue D. The economic impact of acute kidney injury in England. Nephrol Dial Transplant. 2014;29(7):1362–8. doi:10.1093/ndt/gfu016.
7. Bedford M, Stevens PE, Wheeler TWK, Farmer CKT. What is the real impact of acute kidney injury? BMC Nephrology. 2014;15:95. doi:10.1186/1471-2369-15-95.
8. Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349(6245):255–60. doi:10.1126/science.aaa8415.
9. Boseley S, Lewis P. Smart care: how Google DeepMind is working with NHS hospitals. Guardian. 24 Feb 2016. https://gu.com/p/4h2k2.
10. The terms 'analytics' and 'decision support' echo knowledge-based and expert systems, the areas where narrow artificial intelligence methods achieved early success: Keen PGW. Decision support systems: the next decade. Decis Support Syst. 1987;3(3):253–265. doi:10.1016/0167-9236(87)90180-1.
11. DeepMind. Streams FAQ. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 6 Oct 2016.
12. Suleyman M. DeepMind Health: our commitment to the NHS. Medium. 5 Jul 2016. https://medium.com/@mustafasul/deepmind-health-our-commitment-to-the-nhs-ac627c098818#.66w4mgi4j.
13. The technology industry is notorious for its pivots. Further, external factors could intervene. See, e.g., scenario models in UC Berkeley's Center for Long-Term Cybersecurity. Cybersecurity Futures 2020. 2016. https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf.
14. Department of Health, The Caldicott Committee. Report on the review of patient-identifiable information. 1997. http://webarchive.nationalarchives.gov.uk/20130107105354/http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4068403.
15. Hodson H. Revealed: Google AI has access to huge haul of NHS patient data. New Scientist. 29 Apr 2016. https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/.
16. Hawkes N. NHS data sharing deal with Google prompts concern. BMJ. 2016;353. doi:10.1136/bmj.i2573.
17. Royal Free. Google DeepMind: Q&A. 2016. https://www.royalfree.nhs.uk/news-media/news/google-deepmind-qa/. Accessed 20 Sep 2016.
18. Details on the specifics of the data package are the subject of ongoing investigation, including via Hodson freedom of information request 1812 to Royal Free, 13 Dec 2016.
19. NHS. Algorithm for detecting acute kidney injury (AKI) based on serum creatinine changes with time. 2014. https://www.england.nhs.uk/wp-content/uploads/2014/06/psa-aki-alg.pdf.
20. Sawhney S, Fluck N, Marks A, Prescott G, Simpson W, Tomlinson L, Black C. Acute kidney injury—how does automated detection perform? Nephrol Dial Transplant. 2015;30(11):1853–61. doi:10.1093/ndt/gfv094.
21. Montgomery H, quoted in Google's NHS data deal 'business as usual' says Prof. BBC. 5 May 2016. http://www.bbc.co.uk/news/technology-36212085.
22. DeepMind took one step towards general ethics approval (a necessary precursor to research approvals, which must be separately and specifically obtained for each site where research is undertaken) on 10 Nov 2015: HRA. Using machine learning to improve prediction of AKI & deterioration. In: Research summaries. http://www.hra.nhs.uk/news/research-summaries/using-machine-learning-to-improve-prediction-of-aki-deterioration/. Accessed 6 Oct 2016.
23. After 12 months, still no research approval was granted; though applications for use of anonymized data in 'potentially enhanced detection of AKI' remained on foot (details withheld on the basis that they would disclose commercially sensitive information about the research protocol). In: Royal Free response to Hodson freedom of information request 1716, 9 Nov 2016.
24. NHS. Patient safety alert on standardising the early identification of Acute Kidney Injury. 2014. https://www.england.nhs.uk/2014/06/psa-aki/. Accessed 20 Sep 2016.
25. NHS. Patient safety alert: directive standardising the early identification of acute kidney injury. 2015. https://www.england.nhs.uk/wp-content/uploads/2014/06/psa-aki-alg-faqs.pdf. Accessed 20 Sep 2016.
26. This is not to say that it would not be useful under a research project. See Connell A, Laing C. Acute kidney injury. Clin Med. 2015;15(6):581–4. doi:10.7861/clinmedicine.15-6-581 (co-authored by one of the architects of the DeepMind-Royal Free deal, promoting 'development of algorithm-based predictive, diagnostic, and risk-stratification instruments').
27. Kellum JA, Kane-Gill SL, Handler SM. Can decision support systems work for acute kidney injury? Nephrol Dial Transplant. 2015;30(11):1786–1789. doi:10.1093/ndt/gfv285.
28. Meijers B, De Moor B, Van Den Bosch B. The acute kidney injury e-alert and clinical care bundles: the road to success is always under construction. Nephrol Dial Transplant. 2016;0:1–3. doi:10.1093/ndt/gfw213.
29. Roberts G, Phillips D, McCarthy R, et al. Acute kidney injury risk assessment at the hospital front door: what is the best measure of risk? Clin Kidney J. 2015;8(6):673–80. doi:10.1093/ckj/sfv080.
30. HRA. Using machine learning to improve prediction of AKI & deterioration. In: Research summaries. http://www.hra.nhs.uk/news/research-summaries/using-machine-learning-to-improve-prediction-of-aki-deterioration/. Accessed 6 Oct 2016.
31. Personal communication, O'Brien D, Royal Free press office to Hodson, 14 Jul 2016.
32. Letter from Sloman D, Royal Free Chief Executive, to Ryan J MP, 22 Jul 2016.
33. BBC. Why Google DeepMind wants your medical records. 19 Jul 2016. http://www.bbc.co.uk/news/technology-36783521.
34. Caldicott F. Information: to share or not to share? The information governance review. 2013. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf.
35. Data Protection Act 1998 (UK), Schedule 3, par 8.
36. Hodson H. Did Google's NHS patient data deal need ethical approval? New Scientist. 13 May 2016, updated 8 Jun 2016. https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/.
37. DeepMind. Memorandum of understanding. 2016. https://storage.googleapis.com/deepmind-data/assets/health/Memorandum%20of%20understanding%20REDACTED%20FINAL.pdf.
38. Lomas N. NHS memo details Google/DeepMind's five year plan to bring AI to healthcare. TechCrunch. 8 Jun 2016. http://tcrn.ch/25MV8Py.
39. Wakefield J. Google DeepMind: should patients trust the company with their data? BBC. 23 Sep 2016. http://www.bbc.co.uk/news/technology-37439221.
40. Suleyman M. Delivering the benefits of a digital NHS. NHS Expo 2016, Manchester. 7 Sep 2016. https://youtu.be/L2oWqbpXZiI.
41. Suleyman M. New ways for technology to enhance patient care. King's Fund Digital Health and Care Congress 2016, London. 5 Jul 2016. https://youtu.be/0E121gukglE.
42. Suleyman M. Artificial intelligence and the most intractable problems. Nesta FutureFest 2016, London. 17 Sep 2016. https://youtu.be/KF1KhuoX2w4.
43. DeepMind. Streaming the right data at the right time. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 1 Nov 2016.
44. Lomas N. UK healthcare products regulator in talks with Google/DeepMind over its Streams app. TechCrunch. 18 May 2016. http://tcrn.ch/1XziiGT.
45. No HRA approvals exist. In particular, there is no research approval directed at the category of patients who are never treated for kidney injury. No research approval presently exists for DeepMind to do anything with Royal Free data beyond mere application of the AKI algorithm; though a research application for 'potentially enhanced detection of AKI' is pending: Royal Free response to Hodson freedom of information request 1716, 9 Nov 2016. HRA approval was sought for assessing the effectiveness of the post-alert enhanced care component of Streams on 21 Mar 2016, and on 29 Mar 2016 this was advised to be 'service evaluation' rather than research: Royal Free response to Hodson freedom of information request 1717, 9 Nov 2016.
46. Lomas N. DeepMind's first NHS health app faces more regulatory bumps. TechCrunch. 20 Jul 2016. http://tcrn.ch/2a85jum.
47. NHS. Information governance toolkit. https://www.igt.hscic.gov.uk/. Accessed 20 Sep 2016.
48. DeepMind. Information governance. 2016. https://deepmind.com/applied/deepmind-health/information-governance/. Accessed 6 Oct 2016.
49. Hern A. DeepMind has best privacy infrastructure for handling NHS data, says co-founder. Guardian. 6 May 2016. https://gu.com/p/4jv7m.
50. Letter from HSCIC to Med Confidential. 6 Jul 2016.
51. DeepMind. Waking project privacy impact assessment. 2016. ('Waking' was an early product name for Streams.) https://storage.googleapis.com/deepmind-data/assets/health/Privacy%20Impact%20Assessment%20for%20Waking%20Project%2027%20Jan%202016%20V0%201%20redacted.pdf.
52. ICO. Conducting privacy impact assessments: code of practice. 2014. https://ico.org.uk/media/for-organisations/documents/1595/pia-code-of-practice.pdf.
53. Personal communication, O'Brien D, Royal Free press office to Hodson, 15 Jun 2016.
54. Donnelly C. ICO probes Google DeepMind patient data-sharing deal with NHS hospital trust. Computer Weekly. 12 May 2016. http://www.computerweekly.com/news/450296175/ICO-probes-Google-DeepMind-patient-data-sharing-deal-with-NHS-Hospital-Trust. Confirmed as ongoing by ICO in Sep 2016.
55. Lomas N. DeepMind NHS health data-sharing deal faces further scrutiny. TechCrunch. 23 Aug 2016. http://tcrn.ch/2bKqz7p.
56. Personal communication, CQC press office to Hodson, 14 Oct 2016.
57. Boiten E. Google's Larry Page wants to save 100,000 lives but big data isn't a cure all. The Conversation. 27 Jun 2014. http://theconversation.com/googles-larry-page-wants-to-save-100-000-lives-but-big-data-isnt-a-cure-all-28529.
58. Carter P, Laurie GT, Dixon-Woods M. The social licence for research: why care.data ran into trouble. J Med Ethics. 2015;41(5):404–9. doi:10.1136/medethics-2014-102374.
59. National Data Guardian. The independent information governance oversight panel's report to the care.data programme board on the care.data pathfinder stage. 2014. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/389219/IIGOP_care.data.pdf.
60. Nuffield Council on Bioethics. The collection, linking and use of data in biomedical research and health care: ethical issues. 2015. http://nuffieldbioethics.org/wp-content/uploads/Biological_and_health_data_web.pdf.
61. This was reiterated again in a report post-dating the DeepMind-Royal Free transfer. National Data Guardian for Health and Care. Review of data security, consent and opt-outs. 2016. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/535024/data-security-review.PDF.
62. Lawrence ND. Google's NHS deal does not bode well for the future of data-sharing. Guardian. 5 May 2016. https://gu.com/p/4tpd5.
63. Data Protection Act 1998 (UK), s.1(1), 4.
64. Data Protection Act 1998 (UK), s.2(e).
65. Data Protection Act 1998 (UK), Schedule 3.
66. Data Protection Act 1998 (UK), Schedule 3, par 8.
67. For a comprehensive and critical analysis of these concepts, see Van Alsenoy B. Regulating data protection: the allocation of responsibility and risk among actors involved in personal data processing. 2016. KU Leuven doctoral thesis. https://lirias.kuleuven.be/bitstream/123456789/545027/1/PhD_thesis_Van_Alsenoy_Brendan_archived.pdf.
68. Data Protection Act 1998 (UK), s.1(1).
69. Article 29 Working Party. Opinion 1/2010 on the concepts of controller and processor. 2010. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp169_en.pdf.
70. ICO. Data controllers and data processors: what the difference is and what the governance implications are. https://ico.org.uk/media/1546/data-controllers-and-data-processors-dp-guidance.pdf.
71. Wales IG. Royal Free NHS Trust and Google UK. 5 May 2016. http://igwales.com/?p=107.
72. Shead S. Google's DeepMind tried to justify why it has access to millions of NHS patient records. Business Insider. 27 May 2016. http://uk.businessinsider.com/googles-deepmind-tried-to-justify-why-it-has-access-to-millions-of-nhs-patient-records-2016-5.
73. Boiten E. Google is now involved with healthcare data – is that a good thing? The Conversation. 5 May 2016. https://theconversation.com/google-is-now-involved-with-healthcare-data-is-that-a-good-thing-58901: 'This is seeing clinical care through a mass surveillance lens – we need all the data on everyone, just in case they require treatment'.
74. See also, Health and Social Care Act 2012 (UK), s.251B.
75. The Health Service (Control of Patient Information) Regulations 2002 (UK), Schedule 1.
76. Data Protection Act 1998 (UK), Schedule 1, Part II, par 5–6.
77. NHS. Bristol, North Somerset and South Gloucestershire Connecting Care data sharing agreement. 2015. https://www.bristolccg.nhs.uk/media/medialibrary/2016/01/FOI_1516_264_connecting-care-data-sharing-agreement-v3-sept-15.pdf.
78. Pasquale F. The black box society: the secret algorithms that control money and information. Cambridge: Harvard University Press; 2015.
79. The Wellcome Trust. The one-way mirror: public attitudes to commercial access to health data. 2016. https://wellcome.ac.uk/sites/default/files/public-attitudes-to-commercial-access-to-health-data-wellcome-mar16.pdf.
80. Powles J. The case that won't be forgotten. Loy U Chi LJ. 2015;47:583–615. http://luc.edu/media/lucedu/law/students/publications/llj/pdfs/vol47/issue2/Powles.pdf.
81. Zweifel S. Referee report for: Automated analysis of retinal imaging using machine learning techniques for computer vision [version 1; referees: 2 approved]. F1000Research. 2016;5:1573. doi:10.5256/f1000research.9679.r14781.
82. Yang Y. Referee report for: Automated analysis of retinal imaging using machine learning techniques for computer vision [version 1; referees: 2 approved]. F1000Research. 2016;5:1573. doi:10.5256/f1000research.9679.r15056.
83. Feng Y. Referee report for: Applying machine learning to automated segmentation of head and neck tumour volumes and organs at risk on radiotherapy planning CT and MRI scans [version 1; referees: 1 approved with reservations]. F1000Research. 2016;5:2104. doi:10.5256/f1000research.10262.r17312.
84. DeepMind. Our independent reviewers. 2016. https://deepmind.com/applied/deepmind-health/independent-reviewers/. Accessed 6 Oct 2016.
85. Baraniuk C. Google's DeepMind to peek at NHS eye scans for disease analysis. BBC. 5 Jul 2016. http://www.bbc.co.uk/news/technology-36713308.
86. Hodson H. Google's new NHS deal is start of machine learning marketplace. New Scientist. 6 Jul 2016. https://www.newscientist.com/article/2096328-googles-new-nhs-deal-is-start-of-machine-learning-marketplace/.
87. Hillen M. On a quest to find the holy grail of imaging. The Ophthalmologist. 2016. https://theophthalmologist.com/issues/0716/on-a-quest-to-find-the-holy-grail-of-imaging/.
88. De Fauw J, Keane P, Tomasev N, et al. Automated analysis of retinal imaging using machine learning techniques for computer vision [version 1; referees: 2 approved]. F1000Research. 2016;5:1573. doi:10.12688/f1000research.8996.1.
89. Meyer D. Google's DeepMind partners with British doctors on oral cancer. Fortune. 31 Aug 2016. http://fortune.com/2016/08/31/google-deepmind-cancer/.
90. Chu C, De Fauw J, Tomasev N, et al. Applying machine learning to automated segmentation of head and neck tumour volumes and organs at risk on radiotherapy planning CT and MRI scans [version 1; referees: 1 approved with reservations]. F1000Research. 2016;5:2104. doi:10.12688/f1000research.9525.1.
91. Taylor L. The ethics of big data as a public good: which public? Whose good? SSRN. 2016. doi:10.2139/ssrn.2820580.
92. Charette RN. Automated to death. IEEE Spectrum. 15 Dec 2009. http://spectrum.ieee.org/computing/software/automated-to-death.
93. House of Commons Science and Technology Committee. The big data dilemma: Government response. 2016. HC 992. http://www.publications.parliament.uk/pa/cm201516/cmselect/cmsctech/992/992.pdf.
94. House of Commons Science and Technology Committee. The big data dilemma. 2016. HC 468. http://www.publications.parliament.uk/pa/cm201516/cmselect/cmsctech/468/468.pdf.
95. House of Commons Science and Technology Committee. Robotics and artificial intelligence. 2016. HC 145. http://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/145/145.pdf.
96. Shead S. Google DeepMind has doubled the size of its healthcare team. Business Insider. 11 Oct 2016. http://uk.businessinsider.com/google-deepmind-has-doubled-the-size-of-its-healthcare-team-2016-10.
97. Stevens L. Google DeepMind recruits government health tech managers. Digital Health Intelligence. 13 Oct 2016. http://www.digitalhealth.net/news/48167/google-deepmind-recruits-government-health-tech-managers.
98. Frischmann BM, Madison MJ, Strandburg KJ. Governing knowledge commons. Oxford: Oxford University Press; 2014.
99. Lawrence ND. Data trusts could allay our privacy fears. Guardian. 3 Jun 2016. https://gu.com/p/4k5gk.
100. Honeyman M. What if people controlled their own health data? The King's Fund blog. 10 Aug 2016. https://www.kingsfund.org.uk/reports/thenhsif/what-if-people-controlled-their-own-health-data/.
101. Persson J. Care.data, the King and I: an eternal illusion of control and consent? 16 Aug 2016. http://jenpersson.com/king-i-privacy-caredata-consultation/.
102. Schmidt E, Cohen J. Technology in 2016. Time. 21 Dec 2015. http://time.com/4154126/technology-essay-eric-schmidt-jared-cohen/.

Faculty of Law and Computer Laboratory, University of Cambridge, Given the time period considered, the EU’s General Data Protection Cambridge, UK Regulation 2016/679, which enters into force in May 2018, is beyond the The Economist Newspaper, London, UK scope of this article. 352 Health Technol. (2017) 7:351–367 basis, and stated clinical motivation underlying the Royal been adequately explained by either DeepMind or Free deal, highlighting the delayed revelation of the na- Royal Free. ture and scale of patient data involved. Section 3 explores DeepMind’s broader ambitions in working with the NHS 2.1 Contractual foundations vs public relations and the lack of ex ante discussions and authorizations with relevant regulators. It also elaborates on the problem- Throughout the whole first phase of the deal, through to atic basis on which data was shared by Royal Free, name- October 2016, DeepMind’s publicly announced purposes ly, the assertion that DeepMind maintains a direct care for holding sensitive data on Royal Free’s patients, i.e. relationship with every patient in the Trust. Section 4 then the management and direct care of AKI, were narrower lays out the lessons that can be drawn from the case study than the purposes that contractually constrained its use as a whole, assesses at a high level the data protection and of the data. These constraints were described in an eight medical information governance issues, and then turns to page information sharing agreement (ISA) between transparency, data value, and market power. Google UK Limited and Royal Free, signed on 29 September 2015 [4]. The Google-Royal Free ISA stated that, in addition to developing tools for ‘Patient Safety 2 A startup and a revelation Alerts for AKI’ (presumably via the application now badged as Streams), Google, through DeepMind, could In July 2015, clinicians from British public hospitals also build Breal time clinical analytics, detection, diag- within the Royal Free London NHS Foundation Trust nosis and decision support to support treatment and approached Google DeepMind Technologies Limited, avert clinical deterioration across a range of diagnoses an artificial intelligence company with no experience and organ systems^ [10]. Further, it stated that the data in providing healthcare services, about developing soft- provided by Royal Free was envisaged for use in the ware using patient data from the Trust [1]. Four months creation of a service termed ‘Patient Rescue’, Baproof later, on 18 November 2015, [2] sensitive medical data of concept technology platform that enables analytics as on millions [3] of Royal Free’s patients started flowing a service for NHS Hospital Trusts^. into third-party servers contracted by Google to process This was the entirety of the language in the ISA specifying data on behalf of DeepMind [4]. the purposes for data sharing between Royal Free and Google Royal Free is one of the largest healthcare providers in over a two-year period ending 29 September 2017. (The ISA Britain’s publicly funded National Health Service (NHS). was superseded, prematurely, by a new set of agreements TheNHS offers healthcarethatisfreeatthepointof signed on 10 November 2016. Those agreements are beyond service, paid for through taxes and national insurance the scope of the present article and will be considered in future contributions. Beloved in the UK, the NHS is a key work.) At least contractually, the original ISA seemed to per- part of the national identity. 
mit DeepMind to build systems to target any illness or part of DeepMind publicly announced its work with Royal the body. Further, the ISA contained no language constraining Free on 24 February 2016 [5]. No mention was made the use of artificial intelligence (AI) technologies on the data, of the volume or kind of data included in the transfer— meaning that DeepMind’s assurance that Bfor the moment millions of identifiable personal medical records. there’s no AI or machine learning^ was, and remains, rather DeepMind said it was building a smartphone app, called less convincing than Bbut we don’t rule it out for the future^ ‘Streams’, to help clinicians manage acute kidney injury [9]. In mid-2016, the app’s online FAQs reiterated the same (AKI). AKI has outcomes ranging from minor kidney sentiment, adding that if artificial intelligence techniques dysfunction through to dialysis, transplant, and even are applied to the data in the future, this would be death, and is linked to 40,000 deaths a year in the announced on the company’s website, and indicating UK [6, 7]. The app, DeepMind claimed, would not ap- that the company will seek regulatory approval under ply any of the machine learning or artificial intelligence research authorization processes [11]. techniques (effectively, statistical models built using Another subject unaddressed in the ISA was the Google powerful computing resources over large corpora of question: i.e. how data shared under the scheme would be granular, personalized data [8]) for which it is re- cabined from other identifiable data stored by Google, given nowned, and would act as a mere interface to patient that Google was the signing party to the contract and that the medical data controlled by Royal Free [9]. Why company’s business model depends on monetizing personal DeepMind, an artificial intelligence company wholly data. DeepMind has made regular public assurances that owned by data mining and advertising giant Google, Royal Free data Bwill never be linked or associated with was a good choice to build an app that functions pri- Google accounts, products or services^ [9, 12]. marily as a data-integrating user interface, has never Problematically, these assurances appear to have been given Health Technol. (2017) 7:351–367 353 little to no legal foundation in Google and DeepMind’sdeal- platform. On 4 May 2016, Royal Free issued a statement in ings with Royal Free, even if there is no reason to disbelieve line with Google’s position [17]. the sincerity of their intent [13]. The reality is that the exact The data package described in the ISA and destined for nature and extent of Google’s interests in NHS patient data DeepMind is patient identifiable, and includes the results of every remain ambiguous. blood test done at Royal Free in the five years prior to transfer [18]. It also includes demographic details and all electronic pa- tient records of admissions and discharges from critical care and 2.2 Data, direct care and consent accident and emergency. It includes diagnoses for conditions and procedures that have a contributory significance to AKI, such as It is important to note that, though the ISA provided Google diabetes, kidney stones, appendectomies or renal transplants, but with a broad set of purposes contractually, it did not displace also those that do not, such as setting broken bones. various other legal, regulatory and ethical restrictions. 
A perti- nent restriction is that medical information governance in the 2.3 A ‘national algorithm’ for AKI UK is geared around obtaining explicit consent from each pa- tient whose identifiable data is passed to a third-party, when that Both DeepMind and Royal Free claim that Streams relies third-party is not in a direct care relationship with the patient in solely on a ‘national algorithm’ for AKI published by the question. Direct care is defined as Ban activity concerned with NHS [19]; a process designed to assist in the rapid diagnosis the prevention, investigation and treatment of illness and the of AKI from the starting point of a renal blood test for creat- alleviation of suffering of an identified individual^ [14]. inine levels [20]. The implication is that all that Streams does The data that DeepMind processed under the Royal Free is host this algorithm, and pump the Royal Free data (as project was transferred to it without obtaining explicit consent stored, structured, formatted and delivered by DeepMind) from—or even giving any notice to—any of the patients in the through it to generate alerts [21, 11]. These alerts are trans- dataset. For patients who had the necessary precursor renal mitted to a clinician’s mobile device, along with historical data blood test and were then progressed to being monitored by on the patient in question to analyze trends (in seeming con- clinicians for AKI, the appropriate direct care relationship tradiction to the ISA, which stated that historical information would exist to justify this data processing, through the vehicle was shared only Bto aid service evaluation and audit on the of implied consent. However, the dataset transferred to AKI product^). Adding any new functions to the app, or ful- DeepMind extended much more broadly than this. In fact, it filling any of the broader contractual purposes described in the included every patient admission, discharge and transfer within ISA, would comprise research. DeepMind did not have the constituent hospitals of Royal Free over a more than five-year requisite approvals for research from the Health Research period (dating back to 2010). For all the people in the dataset Authority (HRA) and, in the case of identifiable data in par- who are never monitored for AKI, or who have visited the ticular, the Confidentiality Advisory Group (CAG) [22, 23]. hospital in the past, ended their episode of care and not Because DeepMind’s processes and servers—and those of the returned, consent (explicit or implied) and notice were lacking. third-party datacenter holding the data—have not been inde- This is an issue to which we will return, given the centrality of pendently scrutinized and explained, what the company has these requirements to patient privacy and meaningful agency. been, and is actually, doing with the data is not public. On 29 April 2016, the extent of data held by DeepMind was The national AKI algorithm was launched in a patient safety revealed following a New Scientist investigation [15]. Google, alert put out by NHS England on 9 June 2014, recommending which acted as the media filter for its subsidiary until at least Bthe wide scale introduction and uptake of an automated com- October 2016, issued a swift public relations response. In all of puter software algorithm to detect AKI^ [24]. 
The algorithm its communications, Google insisted that it would not be using was standardized by a working group of nephrologists and the full scope of the ISA it had signed [15], emphasizing that biochemists, with inputs from providers of specialized labora- DeepMind was only developing an app for monitoring kidney tory software systems, and leads to results being generated in disease [16]. This was despite the clear statements in the ISA Royal Free’s laboratory information management system [25]. quoted above, i.e. that information was also being shared for DeepMind’s asserted role has been to design a clinical app to the development of real time analysis and alert systems, poten- get the alerts generated by this algorithm delivered to clinicians tially as part of a broadly-defined ‘analytics as a service’ ‘on the fly’. The algorithm does not, however, extend to pa- tients who have never been tested for serum creatinine, nor does it mention historical contextual data [26, 27]. It is only There is no prohibition on linkage in the ISA. DeepMind’s own internal privacy impact assessment (see section 3.3 below) states that no new linkages an assistant to good clinical care [28, 29], and its sensitivity and of data will be made, but this document has no legal force, and given its other shortcomings––i.e. that it does not deal with the bulk of the data transfer, nor the bulk of the individuals affected––we do not consider it adequate. The Streams FAQ states that “all development is done using synthetic See discussion of consent in section 4.2-4.3 below. (mock) data. Clinical data is used for testing purposes”. 354 Health Technol. (2017) 7:351–367 effectiveness remains a vibrant, contested field of research. As around the patients who are in a direct care relationship is not DeepMind has acknowledged, Bthe national algorithm can likely to be as clean as saying that it extends only to those who miss cases of AKI, can misclassify their severity, and can label contract AKI, since the purpose of the app also includes mon- some as having AKI when they don’t^ [30]. The failure to itoring. However, since the large, Trust-wide group whose data explain and address these issues and, in particular, the discon- has been transferred includes individuals who have never had a nect between the Trust-wide dataset that has been transferred blood test, never been tested or treated for kidney injury, or under the broad terms of the ISA and the narrower set of indeed patients who have since left the constituent hospitals patients who will ever be monitored and treated for AKI, or even passed away, the position that Royal Free and throws considerable doubt on the DeepMind-Royal Free posi- DeepMind assert—that the company is preventing, investigat- tion that all of the data being transferred is necessary and pro- ing or treating kidney disease in every patient—seems difficult portionate to the safe and effective care of each individual to sustain on any reasonable interpretation of direct patient care. patient [31, 32]. 3.2 Grand ambitions and systemic change Despite the public narrative’s exclusive focus on AKI, it is 3 Grand designs and governance gaps clear that DeepMind and Royal Free have always had designs on much grander targets. 
3 Grand designs and governance gaps

Between late April 2016, when the scale of the data transfer from Royal Free to DeepMind and the relative lack of constraints on its use became publicly known, and until at least October 2016, DeepMind and Royal Free maintained the narrative that the entire purpose of transferring millions of patient records was to assist with AKI diagnosis and alerts, under a relationship of direct patient care [33]. This position, however, fails to justify both the initial breadth of the data transfer and the continued data retention.

3.1 Questioning direct care

Royal Free states that AKI affects "more than one in six inpatients" [17]. (This seems an upper estimate on the clinical reality: see [6].) If, as DeepMind claims, it only uses patient data in the service of monitoring and treating AKI, then it follows that as many as five sixths of patients (though this quantity is very unclear on the current state of the evidence) are not in a direct care relationship with the company. The distinction between being monitored or treated for AKI and not being monitored matters, because under British medical information governance guidelines [34], a direct care relationship between an identified patient and an identified clinical professional or member of a clinical care team obviates the need for explicit consent (see the discussion of consent in sections 4.2–4.3 below). Without such a direct care relationship, however, and without another basis such as consent, a formal research authorization from the HRA CAG, or otherwise satisfying necessity requirements and introducing appropriate safeguards [35], it is unlawful to continue to process patient data under the UK Data Protection Act 1998 (DPA).

As already noted, DeepMind has held data on millions of Royal Free patients and former patients since November 2015, with neither consent, nor research approval. The company, with the support of Royal Free, has elected to position itself as having a direct care relationship, by virtue of its AKI alert app, with each and every one of those patients. Drawing boundaries around the patients who are in a direct care relationship is not likely to be as clean as saying that it extends only to those who contract AKI, since the purpose of the app also includes monitoring. However, since the large, Trust-wide group whose data has been transferred includes individuals who have never had a blood test, never been tested or treated for kidney injury, or indeed patients who have since left the constituent hospitals or even passed away, the position that Royal Free and DeepMind assert—that the company is preventing, investigating or treating kidney disease in every patient—seems difficult to sustain on any reasonable interpretation of direct patient care.

3.2 Grand ambitions and systemic change

Despite the public narrative's exclusive focus on AKI, it is clear that DeepMind and Royal Free have always had designs on much grander targets. A memorandum of understanding (MOU) between DeepMind and Royal Free was signed on 28 January 2016, though was only discussed for the first time in June 2016, after being uncovered by a freedom of information request [36]. The document, which is not legally binding, talks about plans for DeepMind to develop new systems for Royal Free as part of a "broad ranging, mutually beneficial partnership… to work on genuinely innovative and transformational projects" [37]. Quarterly meetings are envisaged for the setting of priorities on product development, including real time prediction of risks of deterioration, death or readmission, bed, demand and task management, junior doctor deployment/private messaging, and the reading of cardiotocography traces in labor [38]. Although the MOU states that all such projects will be governed by their own individual agreements, the initial Royal Free ISA already covers DeepMind for the development of a wide range of medical tools.

These are vast ambitions, considerably out of step with DeepMind and Royal Free's narrow public relations orientation towards their collaboration being entirely founded on direct care for AKI. The MOU also makes apparent the esteem in which DeepMind is held by its public service partners, indicating in the principles under which the parties intend to cooperate that one of the major reasons for Royal Free's desired collaboration is "Reputational gain from a strategic alliance with an unrivalled partner of the highest profile and expertise, focused on a highly impactful mission", plus a "place at the vanguard of developments in … one of the most promising technologies in healthcare". DeepMind, by contrast, is in it for rather more sombre reasons: a clinical and operational test-bed, a strategic steer in product development and, most of all, for data for machine learning research.

Nascent indications of DeepMind's plans for datasets that not only span a large healthcare trust such as Royal Free, but the entire NHS, have not yet received critical discussion [39], but can be seen in presentations given throughout 2016 by DeepMind cofounder and health lead, Mustafa Suleyman. These presentations have elaborated a vision for a "truly digital NHS", comprising "massively improved patient care, actionable analytics, advanced research both at the hospital-wide level and the population-wide level, and an open innovation ecosystem" [40]. Suleyman characterizes this fourth element, underpinned technically by "digitizing most of the data that's exchanged across the [NHS] system, open standards, and true interoperability", as "the key pivotal thing" "that will enable us to bring a wide variety of providers into the system and for clinicians up and down the country to be able to commission much smaller, more nimble, startup-like organizations to provide some of the long tail of specialist and niche applications that nurses and doctors are asking for" [40]. At the core of Suleyman's described vision is the "secure and controlled release of data" from what he terms "a single, back-end canonical record" that indexes, but also gives a degree of control to, all patients [41]—a telling sign of where a trust-wide dataset, retrofitted in a way that allows it to be leveraged by Google/DeepMind products and those of other technology companies, might ultimately be directed.

These statements are considerably broader than DeepMind and Royal Free's public relations focus on the Streams AKI app, with very extensive implications deserving of full and rigorous consideration. As Suleyman describes it, the "very specific" targeting of AKI under Streams precedes a "real opportunity for us to go much much further and extend this to a broader patient-centric collaboration platform" [41]. Part of how this would be achieved technically, he indicated, was by making patient health data repurposable through an application programming interface termed FHIR (Fast Healthcare Interoperability Resources; pronounced 'fire'); an open, extensible standard for exchanging electronic health records. The FHIR API, Suleyman indicated in July 2016, allows "aggregating the data in the back-end despite the fact that it is often spread across a hundred plus databases of different schemas and in different standards and in many hospitals". He continued, "this is actually very tractable… it's not a research problem, and we've actually had some success in starting to think about how we might do that, with the Royal Free" [41].
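FHIR itself is a published HL7 standard with a REST-style interface over typed resources such as Patient and Observation. Purely as an illustrative sketch, and not as a description of DeepMind's or Royal Free's actual integration (whose technical details are not public), the following shows what retrieving a patient's serum creatinine results from a generic FHIR server can look like. The endpoint URL is hypothetical; LOINC 2160-0 is the standard code for serum creatinine.

```python
import requests

# Minimal sketch of a query against a generic FHIR server. The base URL is
# hypothetical and nothing here reflects the actual DeepMind-Royal Free
# integration, whose details are not public.
FHIR_BASE = "https://fhir.example-hospital.nhs.uk"  # hypothetical endpoint

def fetch_creatinine_observations(patient_id: str) -> list:
    """Return serum creatinine Observation resources for one patient."""
    response = requests.get(
        f"{FHIR_BASE}/Observation",
        params={
            "patient": patient_id,
            "code": "http://loinc.org|2160-0",  # serum creatinine
            "_sort": "-date",
        },
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]
```

The relevance for the argument above is that a standard interface of this kind makes a hospital-wide record straightforwardly repurposable for many applications beyond a single alerting app, which is precisely the breadth that the public narrative around Streams did not acknowledge.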
By September 2016, Suleyman was pitching DeepMind at the heart of a new vision for the NHS––and casting the Google-Royal Free collaboration in the terms that Google and DeepMind had vigorously denied and critics had feared (i.e. something much broader than an app for kidney injury, giving Google and DeepMind undue and anti-competitive leverage over the NHS [15]), highlighting sharply DeepMind's unsatisfactory and quite possibly unlawful processing and repurposing of Trust-wide Royal Free data. Speaking at Nesta's annual FutureFest, Suleyman stated: "Earlier this year, in February, we launched our first business that's facing outwards, looking at how we can deploy our technologies to radically transform the NHS, digitize, and then help better organize and run the National Health Service [42]." DeepMind's website pertaining to Streams was also updated, to state "We're building an infrastructure that will help drive innovation across the NHS, ultimately allowing clinicians, patients and other developers to more easily create, integrate and use a broad range of new services" [43].

3.3 Riding high above regulatory streets

When Royal Free transferred millions of patient records to DeepMind in November 2015, it was done without consulting relevant public bodies. The UK has an Information Commissioner's Office (ICO), responsible for enforcing the Data Protection Act. The Health Research Authority (HRA) provides a governance framework for health research, and provides a path for the release of confidential health information in the absence of explicit consent, through the Confidentiality Advisory Group (CAG). The Medicines and Healthcare products Regulatory Agency (MHRA) regulates medical devices. None of these bodies were approached about the November 2015 data transfer [44]: not for informal advice from the ICO; not to go through an official and required device registration process with the MHRA before starting live tests of Streams at Royal Free in December 2015 [44]; and not to go through the HRA's CAG, which could have been a vehicle for legitimizing many aspects of the project [45]. (DeepMind has subsequently been in discussion with all of these parties in reference to its Royal Free collaboration and, for several months from July 2016, stopped using Streams until the MHRA-required self-registration process was completed [46].)

Instead, the parties went through just one third-party check before transferring the data: the 'information governance toolkit' [47], a self-assessment form required by NHS Digital (formerly HSCIC) [48], designed to validate the security of the technical infrastructure DeepMind would be using [49]. The same tool has been used for self-assessment by some 1500 external parties. The tool assists organizations to check that their computer systems are capable of handling NHS data, but it does not consider any of the properties of data transfers such as those discussed in this paper. NHS Digital conducted a routine desktop review of DeepMind's toolkit submission in December 2015 (after data had been transferred) and approved that the third-party datacenter contracted by Google had adequate security [50]. Beyond this surface check, NHS Digital made no other enquiries. It subsequently confirmed the security of the external datacenter with an on-site check, but it was beyond the scope of NHS Digital's role to assess the flow of data between Royal Free and Google or to examine any other parts of Google or any aspect of the data sharing agreements [50].

While the DeepMind-Royal Free project does have a self-assessed Privacy Impact Assessment (PIA) [51], as recommended by the ICO [52], the assessment commenced on 8 October 2015 [53], only after the ISA was signed, i.e. once the rules were already set. The PIA also failed to give any consideration to the historical data trove that was transferred under the ISA, as well as omitting to discuss privacy impacts on patients who never have the requisite blood test or otherwise proceed through the AKI algorithm that Streams uses, but whose data is in DeepMind's servers, and which is formatted, structured, and prepared for repurposing anyway. That is to say, it neglected to deal with the primary privacy issues, as well as to justify the failure to address basic data processing principles such as data minimization. At the time of publication, the ICO was investigating the data transfer (primarily on whether data protection law requirements have been satisfied) [54], as was the National Data Guardian (primarily on the adequacy of the 'direct care' justification for processing) [55]. The only remaining health regulator in the picture is the Care Quality Commission (CQC), which gave a statement in October 2016 indicating the CQC would consider reported data breaches to the ICO as part of its own inspections, but otherwise declined to comment on the data transfer, indicating that it was broadly supportive of experimentation with big data-based care solutions "if they will lead to people getting higher quality care without undermining patient confidentiality" [56].

One year after data started to flow from Royal Free to DeepMind, the basic architecture of the deal had not visibly changed.
On the other hand, subsequent deals between DeepMind and other London medical institutions, this time for research rather than direct patient care, were announced in a way that avoided many of the same questions. In these arrangements, data was anonymized before being transferred to DeepMind, and research approval (which raises separate issues, as discussed further below) was sought and gained before any research work commenced. Crucially, DeepMind and its partners were clear about the purposes and amount of data that would be transferred in those deals.

4 Assessing the damage

The most striking feature of the DeepMind-Royal Free arrangement is the conviction with which the parties have pursued a narrative that it is not actually about artificial intelligence at all, and that it is all about direct care for kidney injury—but that they still need to process data on all the Trust's patients over a multi-year period. This is hardly a recipe for great trust and confidence, particularly given that the arrangement involves largely unencumbered data flows, both with one company, DeepMind, whose raison d'être is artificial intelligence; and its parent, Google, the world's largest advertising company, that has long coveted the health market [57]. Combined with the unavoidable fact that a sizeable number of patients never need care for kidney injury, the absence of any public consideration of patient privacy and agency, and the lack of safeguards to prioritize public goods and interests over private ones, there are reasons to see the deal as more damaging than beneficial.

Large digital technology companies certainly have the potential to improve our healthcare systems. However, given the sensitivity of building public trust in emerging technology domains, in order for innovation to deliver over the long-term, it must advance in a way that meets and exceeds existing regulatory frameworks and societal expectations of fair treatment and value. Not doing so will only hinder the adoption and growth of beneficial technology.

In this section, we identify a number of salutary lessons from the case study, assessing their implications for DeepMind, in particular, and for current and future directions in healthcare, more generally. These lessons draw on the themes of consent, transparency, privatization and power. Our ambition is to span from the details presented in the previous two sections towards the broader dynamics at play, both in the present deal and in the longer-term ambitions of AI-driven medical tools. The DeepMind-Royal Free deal is fast being converted from an optimistic mistake into a long-term partnership. What are the implications, both for this deal and for others that loom?

The significance of this case study is not only that there are retrospective and grave concerns about the justifiability of DeepMind's continued holding of data on millions of citizens. The case study also offers a prism on the future. It offers one angle into how public institutions and the public at large are presently equipped to grapple with the promised rise of data-driven tools in domains such as medicine and public health. And it tests our assumptions and responses to Google/Alphabet and other speculative private prospectors of this algorithmic age—'New Oil', 'New Rail', 'New Pharma', we could say––as they transition from web-based markets, into land- and body-based markets.

4.1 The conversation we need

It was only after an independent journalistic investigation revealed the necessary information—seven months after DeepMind and Royal Free first entered into a data sharing agreement, five months after the data had been transferred into DeepMind's control and during which all product development and testing had commenced, and two months after the project had been publicly announced––that any public conversation occurred about the nature, extent and limits of the DeepMind-Royal Free data transfer. Despite the shortcomings in the deal's structure, if DeepMind and Royal Free had endeavored to inform past and present patients of plans for their data, initially and as they evolved, either through email or by letter, much of the subsequent fallout would have been mitigated. A clear lesson of this whole arrangement is that attempts to deliver public healthcare services should not be launched without disclosing the details, documentation, and approvals—the legal bedrock—of the partnerships that underlie them. This lesson applies no less to companies offering algorithmic tools on big datasets than it does to pharmaceutical and biotech companies.

The failure on both sides to engage in any conversation with patients and citizens is inexcusable, particularly in the British context, in the wake of the 2013 Caldicott review into information governance practices [34], the very public and profoundly damaging 2013–15 failure of the government's care.data data sharing scheme [58], the 2014 recommendations of the National Data Guardian in the wake of the care.data debacle [59], and the 2015 Nuffield Council report on bioethics [60]. The clear take-away from these reports and recommendations––and indeed the entire regulatory apparatus around healthcare––is that patients should be able to understand when and why their health data is used, with realistic options for effective choice [61]. Patients should not be hearing about these things only when they become front-page scandals [62].

The DeepMind-Royal Free data deal may be just one transaction, but it holds many teachings. To sum up:

1) We do not know––and have no power to find out––what Google and DeepMind are really doing with NHS patient data, nor the extent of Royal Free's meaningful control over what Google and DeepMind are doing;
2) Any assurances about use of the dataset come from public relations statements, rather than independent oversight or legally binding documents;
3) The amount of data transferred is far in excess of the requirements of those publicly stated needs, but not in excess of the information sharing agreement and broader memorandum of understanding governing the deal, both of which were kept private for many months;
4) The data transfer was done without consulting relevant regulatory bodies, with only one superficial assessment of server security, combined with a post-hoc and inadequate privacy impact assessment;
5) None of the millions of identified individuals in the dataset were either informed of the impending transfer to DeepMind, nor asked for their consent;
6) The transfer relies on an argument that DeepMind is in a "direct care" relationship with each patient that has been admitted to Royal Free constituent hospitals, even though DeepMind is developing an app that will only conceivably be used in the treatment of one sixth of those individuals; and
7) More than 12 months into the deal being made, no regulator had issued any comment or pushback.

If account is not taken of these lessons, it could result in harms beyond the breach of patients' rights to confidentiality and privacy––though these elements in themselves should be enough to demand a regulatory response. Some of the potential risks posed by unregulated, black box algorithmic systems include misclassification, mistreatment, and the entrenchment and exacerbation of existing inequalities. It does not take an active imagination to foresee the damage that computational errors could wreak in software applied to healthcare systems. Clearly, the same skills and resources must be devoted to the examination and validation of data-driven tools as to their creation.

Without scrutiny (and perhaps even encouraged competition) Google and DeepMind could quickly obtain a monopolistic position over health analytics in the UK and internationally. Indeed, the companies are already in key positions in policy discussions on standards and digital reform. If a comprehensive, forward-thinking and creative regulatory response is not envisaged now, health services could find themselves washed onwards in a tide of efficiency and convenience, controlled more by Google than by publicly-minded health practitioners. Aggregating and centralizing control of health data and its analysis will generate levers that exist beyond democratic control, with no guarantees except for corporate branding and trust as to where they might end up.

It is important to reflect on these scenarios not as a prediction of what will come to pass, but as a vision of the potential danger if policymakers and regulators do not engage with digital entrants such as DeepMind and incumbents such as Google. There may be other, worse outcomes. To demand that innovation be done in a principled manner is not to stand in its way––it is to save it.

4.2 Data protection

DeepMind and Royal Free have coalesced on the justification of direct care and implied consent to seek to justify the sharing of the Royal Free dataset [48, 17, 32]. Although this is the only available basis for them to justify the data transferred in November 2015, it sets a dangerous precedent for the future. To understand why, we need to step through the UK data protection and medical information governance frameworks.

Under the UK Data Protection Act, DeepMind needs to comply with a set of data protection principles, including having a legitimate basis at all times for processing information that can identify an individual [63]. Health information is classed as 'sensitive personal data' under this law [64], and is subject to additional safeguards [65]. For DeepMind, legitimate processing of health information comes down to one of two alternatives. Either it requires explicit consent from the individual concerned, which DeepMind does not have, or DeepMind must show that its processing is "necessary for medical purposes" (defined as "the purposes of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of healthcare services") and "undertaken by (a) a medical professional; or (b) a person who in the circumstances owes a duty of confidentiality which is equivalent to that which would arise if that person were a health professional" [66].

Simply using health data for the purpose of speculative and abstract "medical purposes" does not satisfy data protection law. This is where the medical information governance architecture—the so-called Caldicott principles and guidelines [34]—come into play. Before turning to these rules, it is important to address an outstanding core issue in the data protection aspects of the Royal Free-DeepMind project.
Data protection law relies on a key distinction between 'data controllers' and 'data processors' [67]. A data controller is defined as "a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed", while a data processor is "any person (other than an employee of the data controller) who processes the data on behalf of the data controller" [68]. It is crucial to define controller and processor status in any information sharing arrangement because legal obligations and liabilities flow from it [69], with significant real-world consequences [70]. In essence, data controllers bear primary responsibility for complying with the principles of data protection, for accommodating data subject rights and other requirements, and are liable for damages in case of non-compliance.

The ISA between Royal Free and DeepMind states at a number of points that Royal Free is the data controller, while DeepMind is merely a data processor. While this is clearly the intention of the parties, the legal question is one of substance, not form. The substantial issue turns on applying the provisions of the DPA, particularly paragraphs 11–12 of Schedule 1, Part II. These provisions require, respectively, under paragraph 11 that a data processor provide sufficient guarantees and compliance in respect of the technical and organizational security measures governing the processing to be carried out and, under paragraph 12, that the processing be carried out under a contract, made or evidenced in writing, under which the processor "is to act only on instructions from the data controller".

It seems clear that Royal Free have contracted with DeepMind to analyze complex data and come up with solutions by applying DeepMind's own expertise in analysis to an extent that Royal Free cannot begin to do. Apart from the parties' consensus on the overall purpose of processing––to assist in monitoring AKI using the nationally-mandated AKI algorithm––DeepMind seems to have considerable discretion, in addition to Royal Free, to determine the purposes and manner in which any personal data is processed. The company is storing, structuring and formatting the Trust-wide dataset, testing it, preparing to deliver data and visualizations to clinicians' devices and, most recently, discussing technical infrastructure that could enable it to be repurposed. These factors all point very strongly to DeepMind assuming the role of a joint data controller. Certainly, Royal Free, in its responses to investigations and freedom of information requests, has never provided any specific awareness or understanding of the means of DeepMind's processing.

Further, even if DeepMind were to avoid the substantive factual conclusion that it is determining the purposes and manner of data processing, the document that is said to constrain DeepMind's processing—the ISA—has a number of shortcomings that undermine its status as a 'contract' satisfying the mandatory requirements for data controller-processor relationships in Schedule 1, Part II, paragraph 12 of the DPA. The contract plausibly extends to a wide range of health tools for any health condition, without overriding controls from Royal Free. There is an absence of evidence in writing that DeepMind will act only on instructions from Royal Free, that data will not be linked with other datasets held by Google or DeepMind, or that the data will not be repurposed for other uses. It is irrelevant whether or not the parties would actually do any of these things. Assurances from the parties are not what matters here––what matters is what is stated in the document that purports to be the governing contract. Finally, the status of the document as a contract is diminished by its absence of any discussion of consideration passing between the entities.

DeepMind cannot be converted to being a pure data processor by having both parties sign an agreement declaring that this is its status, no matter how much the parties might wish it [71]. The situation is analogous to the example given by the ICO of a sharing agreement between a car rental service and a tracking company that helps ensure that cars are returned [70]. The agreement allows the tracking company to hold customer location data for a set period. However, the ICO states that because the tracking company applies its own secret knowledge in deciding the data to collect and how to analyze it, the fact that the rental company determines the overall purpose of tracking (i.e. car recovery) is not sufficient to make the tracking company a processor. Addressing and resolving the status of DeepMind is crucial, and is presumably a core dimension of the ICO's ongoing investigations of the deal. In our assessment, it is clearly arguable that DeepMind is a joint data controller along with Royal Free. It is unfortunate that the ICO had not yet made a clear determination to resolve the question of legal status in over 12 months after the deal commenced, leaving individual rights and organizational responsibility hanging in the balance.

4.3 Caldicott guidelines

The Caldicott rules help reduce the friction on data sharing of identifiable health information for direct patient care, while ensuring that other uses of such information—indirect care, such as research on identifiable individuals or risk prediction and stratification––are accorded sufficiently strong regard to legal obligations of privacy and confidentiality. In relation to Streams, the argument made by Google and Royal Free—and their only arguable basis for continuing to process the totality of data made available under the ISA—is that DeepMind is in a direct patient care relationship with all Royal Free patients. The assertion seems to be that, since any Royal Free patient may deteriorate with AKI in the future, the hospitals are justified in sharing the superset of everyone's medical information with Google now, just in case a patient needs DeepMind's services in the future. To this the odd claim is added that "with any clinical data processing platform it is quite normal to have data lying in storage" [72], without acknowledging necessary legal and ethical limits to such a claim. This line of reasoning is unsustainable if 'direct care' is to have any useful differentiating meaning from 'indirect care'. By the same argument, DeepMind would be justified in holding all the data on every patient in the NHS, on the basis that one of those patients, one day, might require direct clinical treatment through Streams [73].

DeepMind's situation has no clear direct analogy in the Caldicott guidelines. Usually when speaking of the implied consent inherent in direct care relationships, the guidelines describe scenarios where registered clinical professionals acting as part of a clinical team, all with a legitimate relationship with the patient, pass relevant patient data between themselves, e.g., a surgeon passing a patient to a post-operation nurse [34, 74]. Implied consent in these scenarios is easily justified. It builds on the core relationship between a patient and a clinical professional, within which tools—including software tools, record management systems, alert and analytics systems, etc.—can be introduced in the service of patient care. There are also safeguards, such as that complaints can be made to the General Medical Council.

For individuals who are escalated to clinical intervention based on the results of applying the AKI algorithm after a preliminary blood test, clearly this direct care scenario applies. However, for the remainder of patients whose data has been transferred to DeepMind, no plausible necessity for DeepMind's processing of their data arises. It is, instead, a classic situation of health services management, preventative medicine, or medical research that applies to the overall provision of services to a population as a whole, or a group of patients with a particular condition. This is the very definition of indirect care [34]. Lawful processing of identifiable data for indirect care, if there is no consent, can only proceed under what is termed the 'statutory gateway'––i.e. under section 251 of the NHS Act 2006 (UK) and The Health Service (Control of Patient Information) Regulations 2002. In effect, s.251 allows third-parties to bypass the impracticality of gaining consent from large numbers of patients to process their data, by asking the Secretary of State for Health on the patients' behalf through the HRA CAG approval process. It is notable that the process that Royal Free and DeepMind assert is necessary here—of storing, structuring and formatting trust- or hospital-wide datasets in order to then effectively deliver clinical care to a subset of patients—does not naturally fall into any of the envisaged s.251 circumstances in which confidential patient information may be processed [75].

A final element in addition to the hard legal arguments about consent and the ICO, and direct care and the Data Guardian, is notice. Notice is not a mandatory requirement under the DPA if data is lawfully repurposed, but it is necessary if data is being processed for a new purpose [76].
This would be the case, at the very least, for patients who are in the transferred dataset, but who are never tested and treated for kidney injury. Though at the broadest level Royal Free is engaged in repurposing data acquired for medical purposes, in this case, to argue that this data is legitimately being repurposed en masse in the DeepMind deal undermines wholly the protections afforded to such sensitive data in both the DPA and the Caldicott rules.

As a partial acknowledgment of this, in May 2016 Royal Free highlighted that an opt-out exists to data sharing with Google/DeepMind [17]. However, the opt-out was only made clear after public attention had been called to the deal. Such an after-the-fact concession strikes as poor compensation, and is inconsistent with the practice of other hospitals in endeavors of similar reach. Take for example the 2015 Connecting Care project, comprising Bristol, North Somerset and South Gloucestershire hospitals [77]. This project involved a more sound basis for population-wide data sharing based on implied consent, because it concerned various third-party providers being linked to provide an electronic patient record system. A mass mailing of information on the parties involved, and reasons for data processing, to all individuals in the community was undertaken as a key exercise to inform and allow individuals to opt out, and was followed up with ongoing efforts to inform patients. Though this project was more involved than the Royal Free-DeepMind deal, it also had a more legitimate reason for extending across the entire population of constituent hospitals. Royal Free has not justified why a similar process did not take place with its arrangements with Google.

Given Streams is characterized as a clinical app, there are more elegant––and less legally and ethically dubious––solutions available than simply running a mirror copy of the Royal Free's repository of patient data on third-party servers controlled by DeepMind, for every single hospital patient, entirely independently of AKI susceptibility and diagnosis. One solution is for DeepMind to pull in historical data only on patients who have had the gateway blood test that is prerequisite for AKI diagnosis. If Royal Free's systems cannot currently handle real time data requests in this manner, they ought to. It seems in the essence of an ethical and legal streaming service that just as a patient's relevant blood tests from Royal Free 'stream' to DeepMind's servers, so should historical data on the identified at-risk patients.
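To make this data-minimizing alternative concrete, the following is a minimal sketch under assumed record structures and invented function names; it does not describe any actual Royal Free or DeepMind system. Only patients who have had the gateway creatinine test ever have their historical records requested, rather than a standing mirror of the Trust-wide dataset being held by a third party.

```python
from typing import Iterable, Dict, List, Set

# Schematic illustration of the alternative sketched above. All field names
# and functions are invented for illustration only.

def patients_with_gateway_test(lab_feed: Iterable[dict]) -> Set[str]:
    """Collect identifiers of patients who have had a serum creatinine test."""
    return {
        result["patient_id"]
        for result in lab_feed
        if result.get("test_code") == "CREATININE"
    }

def request_history(patient_ids: Set[str]) -> Dict[str, List[dict]]:
    """Fetch historical records only for the at-risk subset, on demand."""
    # Placeholder for a per-patient query against the hospital's own systems.
    return {pid: [] for pid in patient_ids}

at_risk = patients_with_gateway_test([
    {"patient_id": "p1", "test_code": "CREATININE"},
    {"patient_id": "p2", "test_code": "FULL_BLOOD_COUNT"},
])
history = request_history(at_risk)  # only p1's history is ever requested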
Below, we unpack the implications of these points with a focus on transparency, data value, and market power. There has been an inexcusable institutional delay in the NHS, ICO and Data Guardian's response to the issues discussed so far. The remainder of this section exposes how ill-equipped our institutions are to deal with the challenges ahead.

4.4 Transparency and the one-way mirror

At the heart of this deal is a core transparency paradox. Google knows a lot about all of us. For millions of patients in the Royal Free's North London catchment, it now has the potential to know even more. Yet, when the tables are turned, we know very little about Google. Once our data makes its way onto Google-controlled servers, our ability to track that data––to understand how and why decisions are made about us––is at an end. Committed investigative reporting has led to documentation describing the DeepMind-Royal Free data transfer being made public, but we still have no real knowledge of what happens once the data reaches DeepMind, nor many tools to find out.

The public's situation is analogous to being interrogated through a one-way mirror: Google can see us, but we cannot see it [78, 79]. The company benefits from relying on commercial secrets and the absence of public law obligations and remedies against it. This leaves it with few incentives for accountability. Only when it collides with institutions that have obligations to account—i.e. when it makes data sharing arrangements with Royal Free, or it applies for approval to NHS Digital––do rules such as the UK Freedom of Information Act 2000 permit some cracks in the glass.

This particular case study, and the way that it has unfolded, demonstrates the clear absence of strong tools to require companies to account in the same way as public institutions—even if they aspire to deliver, and in some cases even overtake, public services. There are many parallels to another contemporary policy issue involving Google: its application of a 2014 European court ruling requiring the company to delist information that is retrieved on name searches from its search engine when that information is not of public interest and is shown to have lost relevance, accuracy or timeliness [80]. In that case too, the one-way mirror has conceded only cracks of knowledge. The tools of discovery, to inform the public about privately-run services with deep impacts on their lives, are vastly unequal to the power that Google wields.

4.5 Corporate responsibility

Even without portholes through which to examine the operations of powerful technology companies in detail, there is still a lot more that can be done, both from corporations themselves, and from the institutions that are mandated to oversee them. The deal-making between DeepMind and public institutions continues to be secretive. This is inappropriate for a system that typically requires open tender and disclosure. The purpose and terms of these deals should be made transparent, before committing populations of millions to them. They should clearly lay out the public benefit of their works, as well as the private benefits—what is in it for Google, for DeepMind? What initiatives have been made towards ensuring ongoing and equitable benefit-sharing? How are procurement rules and restrictions satisfied? While total transparency of processes is not possible, transparency of purpose and means must be—legitimizing, in detail, the company's reasons and limits in holding sensitive data. To its credit, DeepMind's announcements of deals subsequent to Royal Free have moved in this direction; although peer reviewers still question issues of consent [81], and the lack of details around the algorithmic processes to be applied [82].

DeepMind has taken steps towards self-regulation. When DeepMind announced Streams in February 2016, it also announced the creation of a board of what it termed 'independent reviewers'––nine prominent public figures in the fields of technology and medicine in the UK—to scrutinize the company's work with the NHS [83, 84]. The board met for the first time in June 2016. The board is ostensibly reviewing DeepMind's activities in the public interest, but as at the end of January 2017, it had not made any minutes or account of its discussions public, nor had any reviewers expressed any concerns about DeepMind's arrangements publicly. Annual statements are envisaged. Oversight of artificial intelligence as it is applied to healthcare is obviously desirable. But a self-appointed oversight board, arguably paid in the currency of reputational gain by association with successful technology companies, is far from adequate in itself. Being hand-chosen by DeepMind, the members of the board are unlikely to have positions fundamentally at odds with the company. It would also be a considerable about-face to denounce the whole arrangement with a partner such as Royal Free. At best, the board will supplement institutional external oversight mechanisms and provide insights not readily gained by outsiders: for example, access to internal data; independent assessments of internal arrangements for data handling, privacy and security; empirical insights into the attitudes of employees and the protection of the public interest. At worst, however, such a board risks creating a vacuum around truly independent and rightly skeptical critique and understanding.

The question of how to make technology giants such as Google more publicly accountable is one of the most pressing political challenges we face today. The rapid diversification of these businesses from web-based services into all sorts of aspects of everyday life—energy, transport, healthcare—has found us unprepared. But it only emphasizes the need to act decisively.

Machine learning tools offer great promise in helping to navigate complex information spaces, environments and work flows that are beyond the reach of any one clinician or team. However, it is essential that the supply chain of data and humans leading to any machine learning tools are comprehensible and queryable. This is a check on the impulse of technology startups that want to 'move fast and break things'. While there is little doubt that individuals at DeepMind do care about improving the situation at Royal Free and across the NHS generally, the young company is clearly interested in moving fast—as are Royal Free's clinicians. 'The faster we move, the more lives we can save', goes the logic. This may be true, but it injects several elements of dangerous risk, and potentially hazardous breakages, in developing these new tools: first, that the tools will provide misleading and harmful advice in edge cases; and second, that public trust and confidence in artificial intelligence erodes, making it harder to carry out projects in the future in sensitive areas, despite their promised benefits. Aligning the development and operation of artificial intelligence products with human-scale accountability and explanation will be a challenge. But the alternative is to abdicate ourselves to systems that, when they break, will not explain themselves to us.

It is worth noting that in digesting our medical records and histories, machine learning systems have the potential to uncover new hypotheses and trends about us, as a population, that are difficult to adapt to and deal with. It may turn out, for instance, that certain kinds of people are particularly susceptible to requiring an expensive medical intervention over the course of their lives. Regulations should require that the burdens of new discoveries not fall solely on the shoulders of those individuals who happen to need the intervention. There is a risk that, if we do not understand how companies like DeepMind draw knowledge from our data, we will not be prepared for the implications of the knowledge when it arrives. It is essential that society is prepared for these newfound patterns, and able to protect those people who find themselves newly categorized and potentially disadvantaged. This newfound understanding of our condition will leave us all better off, but only if we share the burdens that the discoveries will place on individuals.

4.6 Privatization and value for data

Even if DeepMind had been more open about its Royal Free data deal, as it was in subsequent research deals, questions still remain about the value that flows to the British public from these deals. DeepMind has made public two other partnerships with the NHS, both—unlike with Royal Free—for research rather than patient care, with actual involvement of AI, and with appropriate research approvals. One, with Moorfields Eye Hospital in London [85], involves the AI company receiving one million anonymized eye scans which it will run through its machine learning algorithms in search of new patterns of degeneration that might allow disease to be caught earlier [86]. Like the Royal Free collaboration, it commenced in July 2015 [87], when a Moorfields ophthalmologist approached DeepMind with a research question: can deep learning be used to diagnose age related macular degeneration or diabetic retinopathy? Approval to work on anonymized data was granted by Moorfields in October 2015 and the first part of an approval to work on pseudonymized data came in June 2016, at the same time as a research protocol was also published in an open access journal [88]. Ethical approval was granted, but it is worth noting that it was confined to looking at the risk of adverse patient events, not at broader questions such as the future for jobs, for competition, human deskilling, etc [82]. (As one reviewer remarked: "Overall a novel concept and worth exploring as it will be able to replace human workforce if successful".) The Moorfields project was announced publicly in July 2016. While other hospitals and startups can pursue similar projects, Moorfields sees more patients a year than any other eye hospital in the US or Europe.

The second partnership, with UCL Hospitals NHS Foundation Trust, sees DeepMind receiving 700 anonymized radiography scans [89]. The AI company is attempting to improve how treatment is planned for head and neck cancer, by speeding up scan segmentation––the process of deciding where and how to direct radiation in order to maximize impact to cancerous cells and minimize harm to healthy tissue. At the moment an expert radiologist needs to label images pixel-by-pixel, with a 28 day wait-time, for a four hour process [40]. DeepMind received approval to work on anonymized data in April 2016, with its research protocol published August 2016 [90].

The assumption is that DeepMind's technical capability will let it discover new things about analyzing medical imagery, and that those new modes of analysis will be shared back to the community. However, documents binding DeepMind's agreement with Moorfields and UCL, and the terms of data sharing, were not public as at October 2016. We do know that DeepMind will keep all algorithms that are developed during the studies. In other words, the knowledge DeepMind extracts from these public resources will belong exclusively to DeepMind. Even if it publishes the scientific results of its studies, it is unlikely it will freely publish the algorithms it has trained. In effect, the chance to train and exploit algorithms on real-world health data is DeepMind's consideration for these deals. The consideration for its partners is that those algorithms––and the promise that they advance the field of diagnostics––exist in the world. Given this, the opacity of consideration passing between the parties in this, as with the contract with Royal Free, is problematic. There are no details on the nature and cost of any service to be provided by DeepMind in exchange for the data access, only vague statements that have been made in public fora about the possibility of a future levy being imposed, in alignment with improvements in clinical outcomes.
4.7 Open competition and public interest

Offering DeepMind a lead advantage in developing new algorithmic tools on otherwise privately-held, but publicly-generated datasets limits the adoption of any scientific advances the company may make to two channels: via DeepMind on DeepMind's terms; or to recreating, at expense and with unclear routes to access, DeepMind's training on the same datasets.

Concepts of the value of data have not yet permeated popular culture. Google and other technology companies know very well what value they can unlock from a particular dataset and from access to millions or billions of computers that stream data on how their human owners walk, talk, travel and think. But the public, and by extension the public sector, do not yet contemplate the value of this commodity that only they are capable of creating. Without people, there is no data. Without data, there is no artificial intelligence. It is a great stroke of luck that business has found a way to monetize a commodity that we all produce just by living our lives. Ensuring we get value from the commodity is not a case of throwing barriers in front of all manner of data processing. Instead, it should focus on aligning public and private interests around the public's data, ensuring that both sides benefit from any deal [91].

The value embodied in these NHS datasets does not belong exclusively to the clinicians and specialists who have made deals with DeepMind. It also belongs to the public who generated it in the course of treatment. There is a pressing need for the NHS to consult broadly on the value-for-data aspects of these transfers, to ensure that the British public gets the best possible value out of any future deal. This value might take the form of an NHS stake in any products that DeepMind, a for-profit company, develop and sell using NHS data. It could be as simple as a binding agreement to share any future products with the entire NHS at a discount, or for free. It is inappropriate to leave these matters for future discussion, risking lock-in. There may even be scenarios where third-party processors can use NHS data to build products that are not related to health, but are useful in other markets. The public has a revulsion against 'selling' NHS data, but this impulse sells the public short on its own assets. The Royal Free-Google deal suggests that data will flow in any event, under the banner of innovation, without any value-for-money discussions. We recommend that, in addition to formalizing inputs on these aspects of value, the NHS might also consider the intrinsic impacts of automation on the clinical service [92]—how will clinicians interface with these new tools? How will the NHS deal with inevitable deskilling and shifts in the workforce, in response to automation? How will they ensure that the daily art of medicine is as protected and valued as the science?

A properly resourced and truly independent entity or entities should be tackling these challenges. Perhaps the Council of Data Science Ethics and standing Commission on Artificial Intelligence, recommended—and, in the first case, accepted by the government [93]—under two reports of the UK House of Commons Science and Technology Committee [94, 95], will be able to undertake this task, but their independence and rigor must be proven. They must also take into account the fact that DeepMind continues to rapidly expand its staff, including with senior appointments from the ranks of government and the NHS itself [96, 97].

4.8 Market power

The new phenomenon of using machine learning to extract value from health data is likely the precursor of a general movement to monetize public datasets. Centralized government services are obvious targets for machine learning. They are directed towards fundamental human needs of care, housing, education and health, and often hold long baseline datasets about human behavior in areas where services could be improved. The complexity and scale of this information is what has led to the suggestion that these are areas where the sheer force of computation and algorithmic learning on large volumes of data offers great utility and promise.

When private companies access these resources with the intention of building on top of them, first-mover advantage exists as it does whenever private companies exploit public resources—land, fossil fuel stores, connection points to people's homes. In the new realm of machine learning, it is important to ensure that DeepMind's algorithms do not put it in an entrenched market position.

Of course, DeepMind is not the only innovator making overtures to the NHS, and machine learning is not the only innovation. In the case of kidney injury, outcomes would be as well influenced by employing more nurses to ensure that patients are hydrated, as deploying new algorithms. Some healthy caution about the first-mover is advised. If our public services have not laid the groundwork for an open, flourishing future innovation ecosystem, then the temptation for players like DeepMind to sit on their entrenched networks will be too strong.

It is important to note that, while giving DeepMind access to NHS data does not in principle preclude the same access being given to other companies in future, the willingness to recreate work, and ability to catch up, will diminish over time. Already, anecdotally, startups are reluctant to move in places where DeepMind has started deploying its immense resources. The danger of unconstrained, unreflective allocation of datasets to powerful parties is that the incentives for competition will distort. Like physical networks of electricity cables or gas pipes, it is perfectly possible for another company to redo what has been done by another. However, there are powerful inefficiencies and network effects that count against such possibilities. If we are to see the true promise of artificial intelligence, a much more positive solution would be to heavily constrain the dataset and to introduce a competitive, open process for simultaneous technology development by a range of private, public, and private-public providers.

A way of conceptualizing our way out of a single provider solution by a powerful first-mover is to think about datasets as public resources, with attendant public ownership interests. Ownership in this context is often a loaded notion, but it does not need to reduce to something that is atomized and commoditized for control at the individual level. Learning from commons movements [98], trusted institutions and communities appear to be the best vehicles to advocate for individual rights, rather than placing the burden of ownership on individuals. The key then is to return value at the communal level [99]. Indeed, data held by NHS trusts ought to be perfectly positioned for this treatment.

Hospitals are a community dedicated to the care of their patients. The first step for DeepMind and Royal Free should have been to engage the community in explaining the solutions they will pursue, and achieving buy-in with communal control and reward. The second step would have been to expand this with other alternatives in a flourishing innovative ecosystem. This did not happen, and it does not look like it will happen. In this regard, it is important to note that offering functionality for patients to see and audit their own data as it moves through systems [100, 101], as DeepMind has intimated that it will do in the future, is a positive development, but it is also one that resigns itself to perpetuating ultimate control, and a power asymmetry, in the hands of those who control the system—in this case, DeepMind. None of the approaches of DeepMind, of Google, or of the industry-supported Partnership on Artificial Intelligence that they announced in 2016, do anything to mitigate this control. They trumpet their own good intents, in benefiting the many in open, responsible, socially engaged ways that avoid undesirable outcomes [102]. But ultimately, these are tweaks within the frame of a certain deterministic approach to technology. They look for corporate initiative, not for robust solutions that stand outside our present paradigm and ask how best we can truly assure that we advance technologically, and that we do so in a way that ensures deep and broad public interests are met, not just superficially immediate, efficient, commercial solutions.

5 Conclusion

The 2015–16 deal between a subsidiary of the world's largest advertising company and a major hospital trust in Britain's centralized public health service should serve as a cautionary tale and a call to attention. Through the vehicle of a promise both grand and diffuse––of a streaming app that will deliver critical alerts and actionable analytics on kidney disease now, and the health of all citizens in the future––Google DeepMind has entered the healthcare market. It has done so without any health-specific domain expertise, but with a potent combination of prestige, patronage and the promise of progress.

Networks of information now rule our professional and personal lives. These are principally owned and controlled by a handful of US companies: Google, Facebook, Microsoft, Amazon, Apple, IBM. New players cannot compete with these successful networks, whose influence deepens and becomes more entrenched as they ingest more data, more resources. If these born-digital companies are afforded the opportunity to extend these networks into other domains of life, they will limit competition there too. This is what is at stake with Google DeepMind being given unfettered, unexamined access to population-wide health datasets. It will build, own and control networks of knowledge about disease.

Fortunately, health data comes with very strong protections that are designed to protect individuals and the public interest. These protections must be respected before acceding to any promises of innovation and efficiency emanating from data processing companies. Public health services such as the British NHS are deeply complex systems. It is imperative for such institutions to constantly explore ways to advance technologically in their public health mission. Artificial intelligence and machine learning may well offer great promise. But the special relationship that has surged ahead between Royal Free and Google DeepMind does not carry a positive message. Digital pioneers who claim to be committed to the public interest must do better than to pursue secretive deals and specious claims in something as important as the health of populations. For public institutions and oversight mechanisms to fail in their wake would be an irrevocable mistake.
Acknowledgements Hodson acknowledges the support of New Scientist, where many of the investigations and facts discussed in this article were first revealed. Both authors warmly thank the participants at the Salzburg Global Forum–Johann Wolfgang von Goethe Foundation workshop, 'Remaking the state: The impact of the digital revolution now and to come', the University of Cambridge's Technology and Democracy project and Computer Security Group, and numerous colleagues for the many conversations that fueled this endeavor. Thanks are also due to three anonymous reviewers and the editor of this special issue for their helpful comments and guidance.

Compliance with ethical standards

Conflict of interest The authors declare they have no conflict of interest.

Funding There is no funding source.

Ethical approval This article does not contain any studies with human participants or animals performed by any of the authors.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

References

1. DeepMind. Acute kidney injury. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 6 Oct 2016.
2. Royal Free response to Hodson freedom of information request 1548, 30 Aug 2016.
3. The exact number is unknown, but Royal Free admits an average 1.6 million patients per year: NHS. Overview. In: Royal Free London NHS Hospital Trust. 2016. http://www.nhs.uk/Services/Trusts/Overview/DefaultView.aspx?id=815. Accessed 20 Sep 2016.
4. DeepMind. Information sharing agreement. 2016. https://storage.googleapis.com/deepmind-data/assets/health/Royal Free - DSA - redacted.pdf (granting DeepMind data on all patients over a five year period). The agreement was signed by Subir Mondal, a deputy director and head of information governance at Royal Free, and Mustafa Suleyman, one of DeepMind's three cofounders (presumably with authority to contract on behalf of Google).
5. DeepMind. We are very excited to announce the launch of DeepMind Health. 2016. https://deepmind.com/blog/we-are-very-excited-announce-launch-deepmind-health/. Accessed 20 Sep 2016.
6. Kerr M, Bedford M, Matthews B, O'Donoghue D. The economic impact of acute kidney injury in England. Nephrol Dial Transplant. 2014;29(7):1362–8. doi:10.1093/ndt/gfu016.
7. Bedford M, Stevens PE, Wheeler TWK, Farmer CKT. What is the real impact of acute kidney injury? BMC Nephrology. 2014;15:95. doi:10.1186/1471-2369-15-95.
8. Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349(6245):255–60. doi:10.1126/science.aaa8415.
9. Boseley S, Lewis P. Smart care: how Google DeepMind is working with NHS hospitals. Guardian. 24 Feb 2016. https://gu.com/p/4h2k2.
10. The terms 'analytics' and 'decision support' echo knowledge-based and expert systems, the areas where narrow artificial intelligence methods achieved early success: Keen PGW. Decision support systems: the next decade. Decis Support Syst. 1987;3(3):253–65. doi:10.1016/0167-9236(87)90180-1.
11. DeepMind. Streams FAQ. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 6 Oct 2016.
12. Suleyman M. DeepMind Health: our commitment to the NHS. Medium. 5 Jul 2016. https://medium.com/@mustafasul/deepmind-health-our-commitment-to-the-nhs-ac627c098818#.66w4mgi4j.
13. The technology industry is notorious for its pivots. Further, external factors could intervene. See, e.g., scenario models in UC Berkeley's Center for Long-Term Cybersecurity. Cybersecurity Futures 2020. 2016. https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf.
14. Department of Health, The Caldicott Committee. Report on the review of patient-identifiable information. 1997. http://webarchive.nationalarchives.gov.uk/20130107105354/http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4068403.
15. Hodson H. Revealed: Google AI has access to huge haul of NHS patient data. New Scientist. 29 Apr 2016. https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/.
16. Hawkes N. NHS data sharing deal with Google prompts concern. BMJ. 2016;353. doi:10.1136/bmj.i2573. This commitment was reaffirmed by both parties immediately prior to publication: personal communication, O'Connell M, Royal Free press office to Powles, 5 Nov 2016; personal communication, Rickman O, DeepMind to Powles, 4 Nov 2016.
17. Royal Free. Google DeepMind: Q&A. 2016. https://www.royalfree.nhs.uk/news-media/news/google-deepmind-qa/. Accessed 20 Sep 2016.
18. Details on the specifics of the data package are the subject of ongoing investigation, including via Hodson freedom of information request 1812 to Royal Free, 13 Dec 2016.
19. NHS. Algorithm for detecting acute kidney injury (AKI) based on serum creatinine changes with time. 2014. https://www.england.nhs.uk/wp-content/uploads/2014/06/psa-aki-alg.pdf.
20. Sawhney S, Fluck N, Marks A, Prescott G, Simpson W, Tomlinson L, Black C. Acute kidney injury––how does automated detection perform? Nephrol Dial Transplant. 2015;30(11):1853–61. doi:10.1093/ndt/gfv094.
21. Montgomery H, quoted in Google's NHS data deal 'business as usual' says Prof. BBC. 5 May 2016. http://www.bbc.co.uk/news/technology-36212085.
22. DeepMind took one step towards general ethics approval (a necessary precursor to research approvals, which must be separately and specifically obtained for each site where research is undertaken) on 10 Nov 2015: HRA. Using machine learning to improve prediction of AKI & deterioration. In: Research summaries. http://www.hra.nhs.uk/news/research-summaries/using-machine-learning-to-improve-prediction-of-aki-deterioration/. Accessed 6 Oct 2016.
23. After 12 months, still no research approval was granted; though applications for use of anonymized data in "potentially enhanced detection of AKI" remained on foot (details withheld on the basis that they would disclose commercially sensitive information about the research protocol). In: Royal Free response to Hodson freedom of information request 1716, 9 Nov 2016.
24. NHS. Patient safety alert on standardising the early identification of Acute Kidney Injury. 2014. https://www.england.nhs.uk/2014/06/psa-aki/. Accessed 20 Sep 2016.
25. NHS. Patient safety alert: directive standardising the early identification of acute kidney injury. 2015. https://www.england.nhs.uk/wp-content/uploads/2014/06/psa-aki-alg-faqs.pdf. Accessed 20 Sep 2016.
26. This is not to say that it would not be useful under a research project. See Connell A, Laing C. Acute kidney injury. Clin Med. 2015;15(6):581–4. doi:10.7861/clinmedicine.15-6-581 (co-authored by one of the architects of the DeepMind-Royal Free deal, promoting "development of algorithm-based predictive, diagnostic, and risk-stratification instruments").
27. Kellum JA, Kane-Gill SL, Handler SM. Can decision support systems work for acute kidney injury? Nephrol Dial Transplant. 2015;30(11):1786–9. doi:10.1093/ndt/gfv285.
28. Meijers B, De Moor B, Van Den Bosch B. The acute kidney injury e-alert and clinical care bundles: the road to success is always under construction. Nephrol Dial Transplant. 2016;0:1–3. doi:10.1093/ndt/gfw213.
29. Roberts G, Phillips D, McCarthy R, et al. Acute kidney injury risk assessment at the hospital front door: what is the best measure of risk? Clin Kidney J. 2015;8(6):673–80. doi:10.1093/ckj/sfv080.
30. HRA. Using machine learning to improve prediction of AKI & deterioration. In: Research summaries. http://www.hra.nhs.uk/news/research-summaries/using-machine-learning-to-improve-prediction-of-aki-deterioration/. Accessed 6 Oct 2016.
31. Personal communication, O'Brien D, Royal Free press office to Hodson, 14 Jul 2016.
32. Letter from Sloman D, Royal Free Chief Executive to Ryan J MP, 22 Jul 2016.
33. BBC. Why Google DeepMind wants your medical records. 19
34. Caldicott F. Information: to share or not to share? The information governance review. 2013. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf.
35. Data Protection Act 1998 (UK), Schedule 3, par 8.
36. Hodson H. Did Google's NHS patient data deal need ethical approval? New Scientist. 13 May 2016, updated 8 Jun 2016. https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/.
37. DeepMind. Memorandum of understanding. 2016. https://storage.googleapis.com/deepmind-data/assets/health/Memorandum%20of%20understanding%20REDACTED%20FINAL.pdf.
38. Lomas N. NHS memo details Google/DeepMind's five year plan to bring AI to healthcare. TechCrunch. 8 Jun 2016. http://tcrn.ch/25MV8Py.
39. Wakefield J. Google DeepMind: should patients trust the company with their data? BBC. 23 Sep 2016. http://www.bbc.co.uk/news/technology-37439221.
40. Suleyman M. Delivering the benefits of a digital NHS. NHS Expo 2016, Manchester. 7 Sep 2016. https://youtu.be/L2oWqbpXZiI.
41. Suleyman M. New ways for technology to enhance patient care. King's Fund Digital Health and Care Congress 2016, London. 5 Jul 2016. https://youtu.be/0E121gukglE.
42. Suleyman M. Artificial intelligence and the most intractable problems. Nesta FutureFest 2016, London. 17 Sep 2016. https://youtu.be/KF1KhuoX2w4.
43. DeepMind. Streaming the right data at the right time. In: Streams. 2016. https://deepmind.com/applied/deepmind-health/streams/. Accessed 1 Nov 2016.
44. Lomas N. UK healthcare products regulator in talks with Google/DeepMind over its Streams app. TechCrunch. 18 May 2016. http://tcrn.ch/1XziiGT.
45. No HRA approvals exist. In particular, there is no research approval directed at the category of patients who are never treated for kidney injury. No research approval presently exists for DeepMind to do anything with Royal Free data beyond mere application of the AKI algorithm; though a research application for "potentially enhanced detection of AKI" is pending: Royal Free response to Hodson freedom of information request 1716, 9 Nov 2016. HRA approval was sought for assessing the effectiveness of the post-alert enhanced care component of Streams on 21 Mar 2016, and on 29 Mar 2016 this was advised to be "service evaluation" rather than research: Royal Free response to Hodson freedom of information request 1717, 9 Nov 2016.
46. Lomas N. DeepMind's first NHS health app faces more regulatory bumps. TechCrunch. 20 Jul 2016. http://tcrn.ch/2a85jum.
47. NHS. Information governance toolkit. https://www.igt.hscic.gov.uk/. Accessed 20 Sep 2016.
48. DeepMind. Information governance. 2016. https://deepmind.com/applied/deepmind-health/information-governance/. Accessed 6 Oct 2016.
49. Hern A. DeepMind has best privacy infrastructure for handling NHS data, says co-founder. Guardian. 6 May 2016. https://gu.com/p/4jv7m.
50. Letter from HSCIC to Med Confidential. 6 Jul 2016.
51. DeepMind. Waking project privacy impact assessment. 2016. ('Waking' was an early product name for Streams.) https://storage.googleapis.com/deepmind-data/assets/health/Privacy%20Impact%20Assessment%20for%20Waking%20Project%2027%20Jan%202016%20V0%201%20redacted.pdf.
52. ICO. Conducting privacy impact assessments: code of practice. 2014. https://ico.org.uk/media/for-organisations/documents/1595/pia-code-of-practice.pdf.
53.
Personal communication, O’Brien D, Royal Free press office to Hodson, 15 Jun 2016. Jul 2016. http://www.bbc.co.uk/news/technology-36783521. 366 Health Technol. (2017) 7:351–367 54. Donnelly C. ICO probes Google DeepMind patient data-sharing 76. Data Protection Act 1998 (UK), Schedule 1, Part II, par 5–6. deal with NHS hospital trust. Computer Weekly. 12 May 2016. 77. NHS. Bristol, North Somerset and South Gloucestershire http://www.computerweekly.com/news/450296175/ICO-probes- Connecting care data sharing agreement. 2015. https://www. Google-DeepMind-patient-data-sharing-deal-with-NHS- bristolccg.nhs.uk/media/medialibrary/2016/01/FOI_1516_264_ Hospital-Trust. Confirmed as ongoing by ICO in Sep 2016. connecting-care-data-sharing-agreement-v3-sept-15.pdf. 55. Lomas N. DeepMind NHS health data-sharing deal faces further 78. Pasquale F. The black box society: the secret algorithms that control scrutiny. TechCrunch. 23 Aug 2016. http://tcrn.ch/2bKqz7p. money and information. 2015. Cambridge: Harvard University 56. Personal communication, CQC press office to Hodson, 14 Press. Oct 2016. 79. The Wellcome Trust. The one-way mirror: public attitudes to com- 57. Boiten E. Google’s Larry page wants to save 100,000 lives but big mercial access to health data. 2016. https://wellcome.ac. data isn’t a cure all. The Conversation. 27 Jun 2014. uk/sites/default/files/public-attitudes-to-commercial-access-to- http://theconversation.com/googles-larry-page-wants-to-save- health-data-wellcome-mar16.pdf. 100-000-lives-but-big-data-isnt-a-cure-all-28529. 80. Powles J. The case that won’t be forgotten. Loy. U. Chi. L.J. 58. Carter P, Laurie GT, Dixon-Woods M. The social licence for re- 2015;47:583 –615. http://luc.edu/media/lucedu/ search: Why care.data ran into trouble. J Med Ethics. 2015;41(5): law/students/publications/llj/pdfs/vol47/issue2/Powles.pdf. 404–9. doi:10.1136/medethics-2014-102374. 81. Zweifel S. Referee report for: Automated analysis of retinal imag- 59. National Data Guardian. The independent information governance ing using machine learning techniques for computer vision [ver- oversight panel’s report to the care.data programme board on the sion 1; referees: 2 approved]. F1000Research. 2016;5:1573. care.data pathfinder stage. 2014. https://www.gov. doi:10.5256/f1000research.9679.r14781. uk/government/uploads/system/uploads/attachment_ 82. Yang Y. Referee report for: Automated analysis of retinal imaging data/file/389219/IIGOP_care.data.pdf. using machine learning techniques for computer vision [version 1; 60. Nuffield Council on Bioethics. The collection, linking and use of referees: 2 approved]. F1000Research. 2016;5:1573. doi:10.5256 data in biomedical research and health care: ethical issues. 2015. /f1000research.9679.r15056 http://nuffieldbioethics.org/wp-content/uploads/Biological_and_ 83. Feng Y. Referee report for: Applying machine learning to auto- health_data_web.pdf. mated segmentation of head and neck tumour volumes and organs 61. This was reiterated again in a report post-dating the DeepMind- at risk on radiotherapy planning CT and MRI scans [version 1; Royal Free transfer. National Data Guardian for Health and Care. referees: 1 approved with reservations]. F1000Research. 2016;5: Review of data security, consent and opt-outs. 2016. https://www. 2104. doi: 10.5256/f1000research.10262.r17312. gov.uk/government/uploads/system/uploads/attachment_ 84. DeepMind. Our independent reviewers. 2016. https://deepmind. data/file/535024/data-security-review.PDF. 
com/applied/deepmind-health/independent-reviewers/. Accessed 62. Lawrence ND. Google’s NHS deal does not bode well for the 6 Oct 2016. future of data-sharing. Guardian. 5 May 2016. https://gu.com/p/4 85. Baraniuk C. Google’s DeepMind to peek at NHS eye scans for tpd5. disease analysis. BBC. 5 Jul 2016. http://www.bbc.co. 63. Data Protection Act 1998 (UK), s.1(1), 4. uk/news/technology-36713308. 64. Data Protection Act 1998 (UK), s.2(e). 86. Hodson H. Google’s new NHS deal is start of machine learning 65. Data Protection Act 1998 (UK), Schedule 3. marketplace. New Scientist. 6 Jul 2016. https://www.newscientist. 66. Data Protection Act 1998 (UK), Schedule 3, par 8. com/article/2096328-googles-new-nhs-deal-is-start-of-machine- 67. For a comprehensive and critical analysis of these concepts, see learning-marketplace/. Van Alsenoy B. Regulating data protection: the allocation of re- 87. Hillen M. On a quest to find the holy grail of imaging. The sponsibility and risk among actors involved in personal data pro- Opthalmologist. 2016. https://theophthalmologist. cessing. 2016. KU Leuven doctoral thesis. https://lirias.kuleuven. com/issues/0716/on-a-quest-to-find-the-holy-grail-of-imaging/. be/bitstream/123456789/545027/1/PhD_thesis_Van_Alsenoy_ 88. De Fauw J, Keane P, Tomasev N et al. Automated analysis of Brendan_archived.pdf. retinal imaging using machine learning techniques for computer 68. Data Protection Act 1998 (UK), s.1(1). vision [version 1; referees: 2 approved]. F1000Research 2016;5: 69. Article 29 Working Party. Opinion 1/2010 on the concepts of 1573. doi:10.12688/f1000research.8996.1. controller and processor. 2010. http://ec.europa. 89. Meyer D. Google’s DeepMind partners with British doctors on eu/justice/policies/privacy/docs/wpdocs/2010/wp169_en.pdf. oral cancer. 31 Aug 2016. Fortune. http://fortune.com/2016/08 70. ICO. Data controllers and data processors: what the difference is /31/google-deepmind-cancer/. and what the governance implications are. https://ico.org. 90. Chu C, De Fauw J, Tomasev N et al. Applying machine learning uk/media/1546/data-controllers-and-data-processors-dp-guidance. to automated segmentation of head and neck tumour volumes and pdf. organs at risk on radiotherapy planning CT and MRI scans [ver- 71. Wales IG. Royal Free NHS Trust and Google UK. 5 May 2016. sion 1; referees: 1 approved with reservations]. F1000Research. http://igwales.com/?p=107. 2016;5:2104. doi:10.12688/f1000research.9525.1. 72. Shead S. Google’s DeepMind tried to justify why it has access to 91. Taylor L. The ethics of big data as a public good: Which public? millions of NHS patient records. Business Insider. 27 May 2016. Whose good? SSRN. 2016. doi:10.2139/ssrn.2820580. http://uk.businessinsider.com/googles-deepmind-tried-to-justify- 92. Charette RN. Automated to death. IEEE Spectrum. 15 Dec 2009. why-it-has-access-to-millions-of-nhs-patient-records-2016-5. http://spectrum.ieee.org/computing/software/automated-to-death. 73. Boiten E. Google is now involved with healthcare data – is that a 93. House of Commons Science and Technology Committee. The big go od thing? The C onversation. 5 M ay 2016. data dilemma: Government response. 2016. HC 992. http://www. https://theconversation.com/google-is-now-involved-with- publications.parliament.uk/pa/cm201516/cmselect/cmsctech/992 healthcare-data-is-that-a-good-thing-58901: “This is seeing /992.pdf. clinical care through a mass surveillance lens – we need all the 94. House of Commons Science and Technology Committee. 
The big data on everyone, just in case they require treatment”. data dilemma. 2016. HC 468. ht tp://www.publications.parliament. 74. See also, Health and Social Care Act 2012 (UK), s.251B. uk/pa/cm201516/cmselect/cmsctech/468/468.pdf. 75. The Health Service (Control of Patient Information) Regulations 95. House of Commons Science and Technology Committee. 2002 (UK), Schedule 1. Robotics and artificial intelligence. 2016. HC 145. http://www. Health Technol. (2017) 7:351–367 367 publications.parliament.uk/pa/cm201617/cmselect/cmsctech/145 99. Lawrence ND. Data trusts could allay our privacy fears. Guardian. 3 Jun 2016. https://gu.com/p/4k5gk. /145.pdf. 96. Shead S. Google DeepMind has doubled the size of its healthcare 100. Honeyman M. What if people controlled their own health data? The team. Business Insider. 11 Oct 2016. http://uk.businessinsider. King’s Fund blog. 10 Aug 2016. https://www.kingsfund.org. com/google-deepmind-has-doubled-the-size-of-its-healthcare- uk/reports/thenhsif/what-if-people-controlled-their-own-health-data/. team-2016-10. 101. Persson J. Care.data, the King and I: An eternal illusion of control 97. Stevens L. Google DeepMind recruits government health tech and consent? 16 Aug 2016. http://jenpersson.com/king-i-privacy- managers. Digital health intelligence. 13 Oct 2016. http://www. caredata-consultation/. digitalhealth.net/news/48167/google-deepmind-recruits- 102. Schmidt E, Cohen J. Technology in 2016. Time. 21 Dec 2015. government-health-tech-managers. http://time.com/4154126/technology-essay-eric-schmidt-jared- 98. Frischmann BM, Madison MJ, Strandburg KJ. Governing cohen/. knowledge commons. Oxford: Oxford University Press;
