London's Royal Free Hospital failed to comply with UK privacy laws when it handed over 1.6 million patient records to Google's London-based subsidiary DeepMind for a trial, Britain's data watchdog concluded after a year-long investigation into the matter.
The data shared with DeepMind was part of a deal involving the development of an early-warning healthcare app called Streams, aimed at improving early detection and diagnosis of acute kidney injury.
However, according to the Information Commissioner's Office (ICO), patients were not informed by the NHS hospital about the data transfer or about their data being used in the trial.
"The ICO has ruled the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google DeepMind," the watchdog said in a statement.
"There's no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights," said Information Commissioner Elizabeth Denham in a statement.
"Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.
"We've asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people's data is being used."
Patient data still being used in Streams trial
Despite the ICO's finding, the watchdog refrained from barring DeepMind from continuing to use patient records in the Streams trial. The Royal Free Hospital, however, has been asked to commission a third-party audit of the trial and complete a privacy impact assessment, the Guardian reported.
"We are pleased that the information commissioner ... has allowed us to continue using the app which is helping us to get the fastest treatment to our most vulnerable patients – potentially saving lives," Royal Free said in a statement. The hospital also said that it has taken steps to keep patients informed about how their data is being used.
"We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety," the hospital added.
Although the ICO didn't directly criticise DeepMind, since the watchdog considered Royal Free to be the "data controller", the London-based AI firm acknowledged responsibility for its role in the matter.
"We welcome the ICO's thoughtful resolution of this case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams," DeepMind said in a blog post.
"Although today's findings are about the Royal Free, we need to reflect on our own actions too. In our determination to achieve a quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health.
"We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better," the firm said.
DeepMind also said that it has since introduced several new measures, including increased transparency and the creation of an independent review board, to ensure that it too complies with Britain's privacy laws.
Although the ICO ruled that the deal between DeepMind and Royal Free was unlawful, the watchdog refrained from fining either organisation, the Telegraph reported.