Information Commissioner's Office
Royal Free NHS Foundation Trust update, July 2019
In July 2017, following reports concerning the use of Google DeepMind’s Streams application at the Royal Free NHS Foundation Trust, the ICO announced that the processing of personal data within Streams was not compliant with the Data Protection Act 1998 – the relevant data protection law at the time. Having identified several shortcomings with the data processing, the Trust signed an undertaking committing it to bringing the processing into line with data protection laws, including the new General Data Protection Regulation (GDPR) and the Data Protection Act 2018.
The actions the Trust was asked to complete included establishing a proper legal basis for future processing, making sure future developments also comply with the common law duty of confidence, and completing a privacy impact assessment. We also agreed that the Trust would commission an independent audit into the processing of patient data that had occurred during the implementation of Streams.
The ICO can now report that the Trust has completed the actions required, both in response to the requirements set out in the undertaking, and to meet the concerns addressed in the audit.
On matters relating to the data protection framework which we regulate, the Trust was able to demonstrate that the GDPR principles of proportionality and necessity had been considered and that the processing of large volumes of data was required during phases of clinical testing to ensure patient safety.
It has also taken steps to complete a data protection impact assessment (DPIA) as required by the new legal regime, and to improve the privacy information provided to its patients. We are therefore satisfied that the Trust is complying with its data protection requirements, and we have no further outstanding concerns regarding the current processing of personal data within Streams.
During the audit, separate concerns were raised around the legal view on how the common law duty of confidentiality, often referred to as a 'duty of confidence', could be satisfied during the clinical testing of Streams. The ICO found that the approach proposed in the audit, which focused on the clinician's conscience rather than on the patient's expectations, was inconsistent with current accepted thinking. Whilst common law matters fall outside our regulatory purview, we are very aware that clinicians and developers are seeking regulatory clarity on the interplay between the duty of confidence and the data protection framework.
Greater clarity is needed, and we are committed to working with other bodies, including the National Data Guardian and the Health Research Authority, to improve guidance and support for the sector so that healthcare organisations such as NHS Trusts can implement data-driven technology solutions safely and legally.
Finally, ahead of the transfer of Streams from DeepMind to the new Google Health unit, the ICO has made it clear to controllers using the Streams service that they will need to have the appropriate legal documentation in place to ensure their processing is in line with the requirements of the GDPR. Organisations must assure themselves, and document, that they have taken appropriate steps to mitigate data protection risks beyond contractual obligations and the obligations on Google Health under data protection law, such as audits, reports and other appropriate measures.