Here we go again... NHS Data: Round 3


We have written a few articles on NHS Digital’s move to collect all health data from GP practices – a move now dubbed GPDPR, the General Practice Data for Planning and Research programme, which sounds awfully close to GDPR, the EU General Data Protection Regulation intended to protect citizens’ privacy rights. But that’s a discussion for a different day.

As we concluded in our last article on NHS data: health data is important, it drives innovation, and it holds the potential to unlock new treatments and cures and to solve 21st century challenges, such as those laid out in the UN Sustainable Development Goals. We are not opposing the collection and use of data per se. What we are strong believers in, however, is that data security, trust and privacy are essential building blocks in a functioning 21st century democracy – and we would argue society doesn’t have a strong privacy foundation at the moment.

The risk of abuse, manipulation and targeting of or bias against certain groups of people (intentional or unintentional) when you control a large pool of data is no joke – especially when that pool includes the personal and sensitive data of over 55 million people, and especially when that data is health data. One estimate puts the value of patient data to the NHS at £10 billion per year – or ten thousand million pounds, which I think highlights the absurdity of ‘billions’ rather better.

So before we get to the disappointing news that a data-hungry US spy-tech firm is likely to get access to all NHS data very shortly, we wanted to cover a few basic but not very well understood ideas surrounding big data.

 

______________________ 

 

The idea that large corporations want to collect huge amounts of data in the name of profit, safety or innovation isn’t new, but the more information these companies collect, the more detailed the profiles, tracking and insight into our lives become. Most people have caught on to the fact that, for many companies, the main reason for collecting massive amounts of data is to show us adverts so we buy stuff – a boring but ultimately harmless aim, even if it can be invasive and annoying. But in the case of national security, mass data collection is a cornerstone of digital mass surveillance – keeping track of what everyone is saying and doing in order to prevent and detect bad people doing bad stuff. This raises several ethical issues: how far does the right to individual privacy go when weighed against the need to protect society from threats?

I have been unable to find a single source on the mass collection of NHS data (or any data, to be honest) addressing what would happen if the reason governments or organisations want to collect that data changes over time. For instance, the UK charity Privacy International and the civil liberties group Liberty have, between them, been campaigning for privacy and justice for over 100 years. Their most recent story, from summer 2022, demonstrated how MI5 breached surveillance laws for more than a decade. Liberty and Privacy International told the Investigatory Powers Tribunal that MI5 was in breach of key legal safeguards and had unlawfully held and used individuals’ private data gathered by secret surveillance. This included breaches of safeguards around how long MI5 retained data, who had access to it, and how MI5 protected legally privileged material such as private correspondence between lawyers and clients.

MI5 later admitted it stored data without securing the legal right to do so and withheld the information from the Home Office and other oversight bodies. When made aware, the Home Office and successive Home Secretaries overlooked this and failed to investigate MI5. Surveillance warrants are supposed to be approved by the Home Secretary, and can only be approved if the Home Secretary is satisfied that legal safeguards around the handling of data are being met. Liberty and Privacy International argued that successive Home Secretaries repeatedly ignored the signs of MI5’s unlawful handling of data and continued to sign off on surveillance warrants regardless.

We said earlier that society doesn’t have a strong privacy foundation, but the signs point towards an increasing awareness of privacy in the population. A recent survey of 1,000 UK citizens found that 70% wanted the illegally collected data deleted, and 66% said their trust in the government and in social media / email providers had decreased. 30% said they were equally concerned about hackers and the government.

The MI5 case is an example of protocol deviation – or, perhaps better described, a protocol violation. It can happen both consciously and unconsciously. A conscious protocol deviation takes place via legislative change: the rules themselves are deliberately rewritten. What one might call an “unconscious” protocol deviation means that, over time, those who use the system (e.g. MI5 or, hypothetically, NHS Digital) establish a practice that is not consistent with the intention of the original legislation – noticing patterns, finding shortcuts to make their work easier, or bending the rules once, catching a bad guy as a result, repeating the pattern, and slowly chipping away at their core values. In this sense, one can argue that a conscious protocol deviation is, democratically speaking, less objectionable than an unconscious one, since a conscious deviation means the governing legislature has at least considered and implemented the change.

There is, however, a third category of deviation that covers deliberate abuse of data and power – and one can equally argue that the MI5 case was a deliberate abuse of power. While protocol deviation must as a rule be avoided, deliberate abuse must be counteracted and mitigated through good control arrangements and external, independent governing bodies that represent the owners of the data (i.e., us).

 

______________________ 

 

Consider a hypothetical situation where the UK votes in an extreme political leader who decides that such a powerful system for bulk collection of health data through NHS Digital should – indeed must – also be used for purposes other than healthcare. It could be used to fight crime, find child abusers, or combat extremism and terrorism. What if they want to identify ‘benefit cheats’, or tax avoiders, or drug addicts and alcoholics, or introduce a sugar tax and collect data on obese people, or gamblers, or smokers, or teenagers in gang fights? What if you or your partner had an abortion, and the police want to have a chat because abortion is illegal? That is already the reality in parts of the United States – a country where private industry routinely lobbies politicians to promote its own interests, a trend that is increasing in the UK. UK politics is arguably still relatively clean, currently ranked 11th on the Corruption Perceptions Index, but trends can change quickly.

The intention behind the mass gathering of NHS data from GPs is rooted in a desire to modernise and improve healthcare, create new ways of identifying trends in data, save lives, and so on. But holding the personal health data of over 55 million people in one giant pool is risky, and it requires not only an exceptional level of technical security but also trust in the government, NHS Digital and every company that requests access to the data. To put this into context, if you have a minute, click the link and scroll through the list of companies that currently have data sharing agreements with NHS Digital for your data.

Hacking, cyber-attacks and social engineering are the tactics cyber criminals will employ to get at the data. Internal abuse by employees is another risk – but these events usually happen in isolation, and adequate safety measures can be implemented against them. The most dangerous risk is protocol deviation: one day, someone or some organisation powerful, devious, corrupt, pious, biased, polarised or brainwashed enough comes along and demands that the aims and objectives of this large data pool change, so that it is instead used to incriminate, bias or target a particular group based on whatever the powers that be deem unethical, immoral or illegal at the time. You currently have the option to opt out – but most people haven’t done so (there were 3,032,917 national data opt-outs as at 1 July 2021, an increase of 1,275,153 compared to 1 June 2021). Subtract those three million or so from a population of over 55 million and roughly 52 million people remain in the scheme – but do we really think that means 52 million people have given their informed consent?

Children, the elderly, people with English as a second language, and variation in educational levels and in understanding of something as complex as what opting in or out actually means for your health data – not to mention unnecessary terminology from NHS Digital such as ‘pseudonymisation’ (replacing direct identifiers with artificial ones so that data is temporarily de-identified, but can still be linked back to individuals by whoever holds the key) – are all factors that should make us sceptical about the level of genuine consent among the 52 million data sets being collected.
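To make ‘pseudonymisation’ slightly more concrete: one common approach is keyed hashing, where a direct identifier is replaced by a stable pseudonym. The sketch below is purely illustrative – the field names, the sample NHS number and the key handling are our assumptions, not a description of how NHS Digital actually does it.

```python
import hmac
import hashlib

# The secret key is held by the data controller and stored separately from
# the pseudonymised dataset. Whoever holds it can re-link pseudonyms to
# real people, which is why pseudonymised data is still personal data.
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"

def pseudonymise(nhs_number: str) -> str:
    """Map a direct identifier to a stable pseudonym.

    The same NHS number always yields the same pseudonym, so records can
    still be linked across datasets without naming the patient.
    """
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

record = {"nhs_number": "943 476 5919", "diagnosis": "type 2 diabetes"}
shared = {"patient_id": pseudonymise(record["nhs_number"]),
          "diagnosis": record["diagnosis"]}
print(shared)  # the raw NHS number never leaves the controller
```

The crucial caveat is in the word ‘temporarily’: because the key exists, the de-identification is reversible, and the data can be linked back to you.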

We thought we would try a thought experiment: what would it take for us to feel comfortable sharing our most intimate data with NHS Digital and other, hitherto unidentified, third-party private and public companies (for research and profit purposes)? We had a quick brainstorm and came up with the following criteria:

  1. To access the data, a third party must submit an application, and each case is evaluated individually by a panel (like an ethics board).
  2. At this meeting there is, among others, a solicitor or barrister who represents us, ‘the people’, considers the privacy burden of the intended use, and argues our case to make sure our best interests are protected.
  3. The system has a built-in ‘kill switch’ that renders the data unreadable / unusable in the event of misuse or hacking. Companies cannot download or copy the data; they access it through a secure virtual desktop, so that access can be granted and revoked centrally. This would help mitigate protocol deviation by third parties (a toy sketch of what central grant-and-revoke could look like follows this list).
  4. The system is evaluated every two years, and there are regular unannounced audits by an independent organisation of privacy specialists. During these visits the company must demonstrate that the data is being used as intended. The findings will be published on an open-access site as green, orange or red compliance.
  5. If the data is misused and evidence of protocol deviation is detected, it is noted online. The company is liable to be fined, according to law, and will be banned from accessing NHS data for a minimum of one year. It will then need to re-apply for access.
  6. We have a greater say, at a granular level, in what data we share. You may have personal reasons to be happy sharing your data with any provider researching cancer treatments while (again, personally) not wanting American pharmaceutical companies to have access to it. You should be able to make these views count. It is your data. GP health data is rarely a positive list of experiences – healthy and happy people don’t spend a lot of time in GP surgeries. It is your history, your pain, your grief and suffering, and it should be your decision.
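Criteria 3 and 6 are the most technical, so here is the toy sketch we promised: a central broker that combines a kill switch, per-company revocation and per-purpose patient consent. Every name, purpose and patient in it is a hypothetical illustration of what such a design could look like, not a description of any existing NHS system.

```python
from dataclasses import dataclass, field

@dataclass
class AccessBroker:
    """Central gatekeeper: data never leaves the secure environment and
    every query passes through here, so access can be cut off in one place."""
    grants: dict = field(default_factory=dict)   # company -> approved purposes (criterion 1)
    consent: dict = field(default_factory=dict)  # patient pseudonym -> consented purposes (criterion 6)
    killed: bool = False                         # criterion 3: flip once, everything stops

    def grant(self, company: str, purposes: set) -> None:
        self.grants[company] = purposes

    def revoke(self, company: str) -> None:
        self.grants.pop(company, None)  # e.g. after a failed audit (criteria 4 and 5)

    def may_query(self, company: str, patient: str, purpose: str) -> bool:
        if self.killed:
            return False
        if purpose not in self.grants.get(company, set()):
            return False
        return purpose in self.consent.get(patient, set())

broker = AccessBroker()
broker.grant("CancerLab Ltd", {"cancer research"})
broker.consent["patient-7f3a"] = {"cancer research"}  # opted in to this purpose only

print(broker.may_query("CancerLab Ltd", "patient-7f3a", "cancer research"))  # True
print(broker.may_query("CancerLab Ltd", "patient-7f3a", "marketing"))        # False
broker.killed = True  # the kill switch
print(broker.may_query("CancerLab Ltd", "patient-7f3a", "cancer research"))  # False
```

The design point is that because companies only ever see a remote desktop, every decision is enforced in one place: revoke a grant or flip the switch, and access ends immediately, with no copies of the data left behind.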

We could write up another 100 of these, and you could probably write down what it would take for you to be comfortable sharing your data with all those other private providers too.

It’s not like we love diabetes and look forward to Alzheimer’s – we want to feel safe AND help people; we want to be able to trust the government and NHS Digital AND avoid the mass digital surveillance that would throw us into an Orwellian dystopia. The point is that it is possible to mitigate some of the risks associated with mass data harvesting and digital mass surveillance, and to retain a semblance of democracy in all this.



______________________ 

 

Right then, let’s get on to the news:

An American company called Palantir – quite literally named after the Lord of the Rings seeing-stones used by Sauron and Saruman to subjugate Middle-earth, and labelled a spy tech firm by openDemocracy – has been involved in handling UK patient data since 2020, when it started providing software that processes data for a variety of purposes, including take-up of Covid-19 vaccines and managing the post-pandemic bounce back in elective care.

Palantir was founded in 2003 (with support from the US Central Intelligence Agency) and is now a significant powerhouse in data processing. Co-founded by Peter Thiel, a German-American billionaire entrepreneur and high-profile Trump supporter, Palantir is now on track to deliver a new ‘Federated Data Platform’ – aggregating data from multiple sources and formats into a single NHS platform.

Palantir gained media attention last year when openDemocracy took the NHS to court over a long-term contract with the company for the analysis of vast amounts of public health data. Its work ranges from counter-terrorism, helping tax revenue services, and modernising the US Army and the UK Ministry of Defence, to (drum roll) potentially enabling human rights violations against migrants and asylum seekers by facilitating ‘surveillance, mass raids, detentions, as well as de facto family separations and deportations’ for US Immigration and Customs Enforcement (ICE). This drew the attention of Amnesty International, which published a report on the urgent need for Palantir to respect human rights.

It is worth pointing out that we don’t believe the people working for NHS Digital are in any way evil or malicious – quite the opposite. They want this data to do what they believe is good and just. Take the example where GP data was used to identify patients clinically deemed vulnerable during Covid, which in turn created the shielded patient list of over 3.8 million patients, likely saving thousands of lives. The data was collected on a weekly basis for over two years and was shared with the Cabinet Office, local authorities, CCGs, hospitals, 111 services and mental health providers.
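For the curious, the shielded list is essentially the output of rule-based cohort selection over GP records. The sketch below uses entirely made-up conditions and thresholds – the real shielding criteria were clinically defined and far more involved.

```python
# Toy rule-based cohort selection over GP records. The conditions and
# age threshold here are invented for illustration; the real shielded
# patient list used clinically defined criteria and coded diagnoses.
HIGH_RISK_CONDITIONS = {"COPD", "immunosuppressed", "organ transplant"}

patients = [
    {"id": "p1", "age": 78, "conditions": {"COPD"}},
    {"id": "p2", "age": 34, "conditions": set()},
    {"id": "p3", "age": 52, "conditions": {"immunosuppressed"}},
]

shielded = [p["id"] for p in patients
            if p["age"] >= 70 or p["conditions"] & HIGH_RISK_CONDITIONS]
print(shielded)  # ['p1', 'p3']
```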

Patients deemed clinically vulnerable were asked to stay home for months at a time, and the data was shared with several services to ensure they received the support they needed. The ethics of this whole process was never really discussed at the time – it was, after all, a global emergency, and decisions had to be made quickly. One cynical way of looking at it, however, is that the government accessed data it holds about our health, created a list of people it determined shouldn’t go outside, told them all to stay home, and then told a lot of other people that these patients shouldn’t go outside and should be offered various services – some they had to pay for, some they didn’t.

It isn’t difficult to picture a scenario in which this could turn dark. That said, we believe NHS Digital is trying hard to improve its image and listen to public concerns. Earlier this year, shortly after it tried and failed to take all GP data with minimal public consultation, it created the GP Data Patient and Public Engagement and Communications Advisory Panel – the PPECAP (not a great acronym?) – a combination of NHS Digital staff, health workers and lay people. The panel determined that the following criteria must be in place before mass harvesting of patient data can take place:

  • the ability to delete data if patients choose to opt out of sharing their GP data with NHS Digital, even if this is after their data has been uploaded
  • the backlog of opt-outs has been reduced
  • a Trusted Research Environment has been developed and implemented in NHS Digital
  • patients, carers, and the public have been made more aware of the scheme through a campaign of engagement and communication

They are working hard to make the public more comfortable with data sharing: engaging with diverse groups, holding ongoing communication and engagement exercises, and publishing regular blogs and media campaigns. They are taking big steps towards a more trustworthy programme; whether they have achieved that is up to you to decide.

You have a right to consent to sharing your data, in which case you don’t need to do anything. If you are in England and you want to opt out, you can do so here.