Information Commissioner's Office
Four lessons NHS Trusts can learn from the Royal Free case
Today the ICO announced that the Royal Free NHS Foundation Trust did not comply with the Data Protection Act when it turned over the sensitive medical data of around 1.6 million patients to Google DeepMind, a private sector firm, as part of a clinical safety initiative.
As a result of our investigation, the Trust has been asked to sign an undertaking committing it to changes to ensure it is acting in accordance with the law, and we’ll be working with them to make sure that happens.
But what about the rest of the sector? As organisations increasingly look to unlock the huge potential that creative uses of data can have for patient care, what are the lessons to be learned from this case?
It’s not a choice between privacy or innovation
It’s welcome that the trial appears to have been a success: the Trust has reported positive outcomes. Some may conclude that data protection rights are a small price to pay for this.
But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I’ve every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work. This will also be true for the wider NHS as deployments of innovative technologies are considered.
Don’t dive in too quickly
Privacy impact assessments play an increasingly prominent role in data protection, as evolving law and best practice around the world demonstrate, and they’re a crucial part of digital innovation. Our investigation found that the Trust did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data. This is not how things should work.
The vital message to take away is that you should carry out your privacy impact assessment as soon as practicable, as part of your planning for a new innovation or trial. This will allow you to factor in your findings at an early stage, helping you to meet legal obligations and public expectations.
New cloud processing technologies mean you can, not that you always should
Changes in technology mean that vast data sets can be made more readily available and processed faster using more powerful data processing technologies. That’s a positive thing, but just because evolving technologies allow you to do more doesn’t mean these tools should always be fully utilised, particularly during a trial initiative.
In this case, we haven’t been persuaded that it was necessary and proportionate to disclose 1.6 million patient records to test the application. NHS organisations, perhaps more than any other sector, need to remember that we are talking about the medical information of real patients. This means you should consider whether the expected benefits are likely to be outweighed by the data protection implications for your patients. Apply the proportionality principle as a guiding factor in deciding whether you should move forward.
Know the law, and follow it
No-one suggests that red tape should get in the way of progress. But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason. Just as you wouldn’t ignore the provisions of the Health and Social Care Act, or any other law, don’t ignore the Data Protection Act: you need a legal basis for processing personal data. Whether you contact the ICO or obtain expert data protection advice as early as possible in the process, get this right from the start and you’ll be well-placed to make sure people’s information rights aren’t the price of improved health.