Google's DeepMind Didn't Break The Law Over NHS Data, Report Finds
DeepMind Health, the Google-owned company that applies AI techniques to patient data, complied with the law in a recent project with the Royal Free hospital. The Royal Free itself did not, however, and DeepMind was criticised for not doing more to remind the hospital of its obligations.
A day after the Information Commissioner's Office found that the Royal Free hospital was in breach of the Data Protection Act in how it shared data with Google's DeepMind company, an independent review has found that DeepMind itself did not break the law.
DeepMind Health (DMH), the part of the London-based company dedicated to using AI techniques to improve the use of patient data, used data from the Royal Free NHS Trust hospitals in an app called Streams. Streams is designed to help clinicians spot when a patient is in acute kidney failure, a condition that causes many thousands of preventable deaths a year. It looks for patterns in test results and alerts doctors or nurses when it judges that a patient is at risk.
On Monday an Information Commissioner's Office report found "a number of shortcomings" in how the Royal Free provided the personal data of 1.6 million patients to support the app. Most notably, the Royal Free was not sufficiently open with the patients whose data was shared, meaning that they "would not have reasonably expected their information to have been used in this way".
The original data-sharing agreement received a great deal of criticism after it was revealed in New Scientist last year, leading DMH to set up an independent review chaired by the former MP Dr Julian Huppert. Huppert told a media briefing that the review's purpose was "not to defend or attack DeepMind Health, but to review and scrutinise it", adding that this sort of review, with few limits on what the panel could look at and no nondisclosure agreement, was as far as he knew "unprecedented".
The review panel criticised DMH for being insufficiently transparent in its original data-sharing agreement with the Royal Free, and suggested it publish all future contracts with public sector bodies openly. It also said DMH could be more "proactive" in future about reminding the groups it worked with of their responsibilities, and that as a private company – especially one linked to Google – it would be held to a higher standard than others.
However, the review found that DMH did not breach the Data Protection Act, because it was only acting as a "processor" of data rather than a "controller", as the Royal Free was. The data controller – the party that collects the data – has more responsibilities than the party it passes that data on to.
The review also found eleven security vulnerabilities at DMH that could lead to data loss – ten rated "low" risk and one "medium" – but said that the company was in general very secure.
Regina Lally, codirector of the data protection company DataBasix, told BuzzFeed News DMH could have done more to ensure that the data it was getting from the Royal Free was being properly sourced. "There is a responsibility for data processors," she said. "Given the sensitive, personal nature of the data, they could have asked more questions about what needed to happen." Changes to the law coming into force next year – the EU's General Data Protection Regulation – will place direct obligations on data processors. Lally said that under the new rules, DMH might have been liable.
Two experts praised DMH for setting up the panel. Hetan Shah, director of the Royal Statistical Society, told the Science Media Centre: "I commend DeepMind Health for setting up an independent review panel which is unbound by nondisclosure agreements, has unfettered access, and its own budget." Nicola Perrin, head of Understanding Patient Data, told the SMC the review was "an indication of DeepMind's commitment to be open to scrutiny".