European health data reuse plan needs “privacy surgery,” say watchdogs

Two key EU data protection watchdogs have called for revisions to the May proposal by European Union lawmakers to establish a legal framework for sharing electronic health records and other data across borders, between care institutions, and with researchers and developers of innovative health products. The watchdogs want the proposal amended to ensure that citizens’ health data is stored locally, inside the European Economic Area (EEA), to prevent unlawful access.

Given the persistent legal ambiguity surrounding the transfer of personal data to other countries after key privacy judgements by the EU’s highest court since 2015, that seems to be sound advice.

“We call on the European Parliament and Council to include in the Proposal a requirement to store the electronic health data in the EEA because of the large quantity of electronic health data that would be processed, the highly sensitive nature of those data, the risk of unlawful access, and the need for effective supervision by independent data protection authorities,” write the two supervisors in a summary of their joint opinion on the Commission’s European Health Data Space (EHDS) proposal.

The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), two EU bodies that provide guidance on the interpretation and application of data protection legislation, published their 32-page joint opinion on the EHDS on Tuesday.

The opinion includes a number of additional recommendations for improving the draft regulation and clarifying its interaction with existing data protection rules, noting that the Commission’s first pass falls short in several areas.

At both the national and European levels, there is already substantial regulation of health data (where processing this type of sensitive data with user consent requires an explicit ask, per purpose). Therefore, the EHDS is driven by the need to simplify the sharing of this “special category” data, with legislators praising the potential for the continent if fragmentation can be eliminated and citizens’ health data more easily pooled, processed and reused for purposes like disease research and drug discovery or innovative health tech (like AI diagnosis).

Startups based in Europe that specialise in health care technology, such as Kry, have also expressed their support for the EU’s proposal.

Individual rights to privacy and data access might be jeopardised if new laws encouraging data sharing and reuse aren’t drafted with great care.

The EHDS may create legal inconsistencies, generate confusion for data subjects, and even undermine existing regulations like the GDPR and the ePrivacy Directive, according to the EDPB and EDPS opinion. For example, it’s not clear how individual rights, like the GDPR’s right to rectification of personal data, would be affected by the framework, the two bodies warned (since the EHDS envisages not one data controller but multiple sources and recipients of personal data, at a national, EU and potentially even international level).

The watchdogs have also raised concerns about an attempt to limit individuals’ access to information about so-called “secondary” uses of their health data, which the draft law envisions controlling via data permits.

“The EDPB and the EDPS stress that the right to information and the right to object are inexorably intertwined,” they write. Because the Proposal restricts individuals’ rights over their personal data, the EDPB and EDPS believe it will fall short of its stated goals in Article 1(2)(a). As a result, they warn in their opinion, “the proposed approach appears to undermine natural persons’ rights to privacy and data protection, especially when considering the very broad definition of secondary use and the minimum categories of electronic data for secondary use introduced by the Proposal.”

“Since the Proposal deals in great detail with the access, use and exchange of particular categories of personal data, such as health data, broad references to the GDPR and the EUDPR [EU data protection law] may not be sufficient,” they write. The EDPB and EDPS are concerned that essential data protection rules may be misinterpreted, resulting in a reduction of the level of protection currently afforded to data subjects under the existing EU data protection framework (the GDPR, the EUDPR and the ePrivacy Directive). For this reason, they conclude, more specific safeguards are required.

“The EU Health Data Space will require the processing of vast volumes of data that are of a very sensitive nature,” EDPB chair Andrea Jelinek said in an accompanying statement. It is therefore critical, she argued, that the proposal does not jeopardise the rights of people living in the European Economic Area. Because the rights set out in the Proposal do not align with the GDPR, individuals may be left in the dark about which legal rights apply to them. Specifically, she said, clarification is needed on how the rights under the Proposal interact with EU citizens’ rights under the GDPR.

The dangers of ‘Quantified self’ information

The two data supervisors also recommend that the proposed European Health Data Space be revised to reduce the scope of information — to only include bona fide health data — advocating the removal of a reference that would also draw data from wellness and other consumer health/fitness apps (such as behavioral/lifestyle data) where it has been uploaded to an individual’s electronic health record (EHR).

According to the duo, the inclusion of this kind of data would put people’s privacy at risk, since it could be used to draw sensitive inferences about their health and lifestyle.

A second problem is that consumer-grade technology does not generate data of the same quality as professional health services and medical equipment. Combining it with bona fide health data could consequently lead to further problematic, and perhaps discriminatory, inferences being drawn.

Medical devices, fitness apps and wearables have all seen a spike in popularity since the COVID-19 pandemic, the EDPB and EDPS note. Although this technology creates a large quantity of data, it can be highly intrusive, depending on the types of personal data it collects. Beyond recording human activities and choices, it enables a degree of surveillance of the body, mind and emotions that may not even be possible for the individuals concerned themselves. “People’s behaviours may thus be predicted and manipulated, even at a collective level,” they write.

The proposal to make electronic health data generated by medical devices, wellness applications or other digital health programmes available for secondary use should, the supervisors argue, be assessed in light of the rapid technological developments in wearable and mobile technology and the growing popularity of ‘quantified self’ apps and devices, which let people record all kinds of details about their personality, mentality, body, behavioural patterns and whereabouts. People who are worried about their health may find it difficult to understand that their data is being processed in this way. And there are real privacy hazards when such data is used for additional purposes, combined with other information or transmitted to other parties.

“These types of data processing may create specific risks, including the risk of unequal or unfair treatment based on data about a person’s assumed or actual health status derived, for example through profiling, of very intimate details concerning his/her private life, irrespective of whether these conclusions concerning his/her health status are accurate or not. In fact, those risks may also be linked to the reliability and accuracy of data generated by medical devices, wellness applications or other digital health applications. Against this background, the EDPB and the EDPS acknowledge that Article 33(3) attempts at delimiting which data generated by medical devices, wellness applications or other digital health applications shall be made available for secondary uses. However, the EDPB and the EDPS underline that it is still unclear either what kind of data fall under this category or who would assess its validity and quality once inserted by individuals in their own EHR pursuant to Articles 3(6).”

As long as EU legislators insist on keeping such data in scope, the pair recommend that the proposal be amended to ensure that individuals remain free to decide “if or which of their personal data generated by wellness application and other digital applications… shall be shared with other recipients and further processed for secondary uses”.

There must also be “appropriate measures” put in place to guarantee that data subjects’ preferences are honoured in any subsequent processing, they urge, cautioning against any attempt to imitate adtech-style ‘consent’ systems, which have been shown to violate the GDPR even as they keep people’s data flowing.

Wellness apps and other digital health applications do not provide the same quality of health data as medical equipment. And because these programmes collect so much data, they can be intrusive and may expose sensitive information, such as religious orientation. Data from wellness apps and other digital health applications “should thus be banned from being made accessible for secondary use,” according to Wojciech Wiewiórowski, the European Data Protection Supervisor.

“The success” of the EHDS, the two data supervisors write in a joint statement, depends on “a robust legal basis for processing in line with EU data protection law, the establishment of a strong data governance mechanism and effective safeguards for the rights and interests of natural persons that are fully compliant with the GDPR”, which they emphasise must be accompanied by “sufficient assurances of a lawful, responsible, ethical management that is anchored in the GDPR”.

The EHDS should “act as an example of openness, effective accountability and right balance between the individual data subjects’ interests and society’s as a whole,” they conclude.

The Commission was contacted for comment but had not responded at the time of writing.

Strategy for collecting and analysing data

If there is agreement that the framework needs to be strengthened to safeguard fundamental rights, there is a well-established route for doing so in the EU’s lawmaking process, which normally involves the European Parliament and the Council as co-legislators.

The EHDS is part of a broader EU policy to increase data reuse for economic and social benefit that was laid out by the Commission in 2020.

A number of important pieces of legislation have since been put forward by the EU executive, including the Data Governance Act (proposed at the end of 2020, agreed by the co-legislators this year and in force since June 23) and the Data Act (proposed in February 2022).

The EU’s internal market commissioner has suggested the data plan will help Europe shift away from reliance on US-dominated Big Tech, by putting in place policies and infrastructure to make the region “the world’s most data-empowered continent,” as he has stated.

Critics who blame the absence of homegrown IT giants on the EU’s appetite for expansive regulation are unlikely to be won over by a “medicine” containing yet more rules as a formula for successful innovation.

Time will tell, but the Commission is pressing ahead in the meantime.

More “data spaces” like this are planned, with the goal of increasing industrial data exchange and reuse while also encouraging individuals to volunteer their personal information for “altruistic” purposes (like health research or fighting climate change).

The idea is to establish rules and requirements that ensure “secure and privacy-preserving access and interoperability” via “dedicated trusted infrastructure and processes” for specific sectoral or topic-based data-sharing hubs, like the EHDS, and thus “grease the pipes” of data sharing for research and innovation. Legislators are placing a high value on expanding access to data as a means of spurring the growth of Europe’s AI ecosystem.

For example, the European Commission has suggested creating a data space for the EU’s manufacturing sector, another for transportation, and a third to help its efforts to reduce emissions and decarbonize the economy.

The speed with which the Data Governance Act was adopted demonstrates that the data reuse approach has strong support among EU legislators, although the intersection of economic imperatives and individual rights may provoke greater debate over the standards for some sectoral or thematic data spaces, such as health.

The health sector is one of the most sensitive when it comes to encouraging data sharing, so the Commission’s choice of health as the first common data space may seem unusual. However, the recent COVID-19 pandemic has focused legislators’ attention on the need for more efficient ways of sharing health data in the event of another crisis.

For those hoping to access these data spaces, the administrative burden will be greater, since each will have its own set of rules, but the advantages are expected to outweigh the costs.

As the EDPB and EDPS point out, the EU’s expanding body of digital legislation increases the potential for inconsistencies to creep into its legal framework that could affect individual rights in areas such as data protection.

It might also raise new legal questions for businesses if regulatory obligations don’t match up.

Security and privacy researcher Lukasz Olejnik sees this as a growing risk, with new rules on technology having emerged continually in recent years and more coming over the next five. “The confluence and fusion may generate further risks: of compliance, even lower safeguards,” he told TechCrunch.

Digitalisation of health data has the potential to cut costs, improve healthcare, and possibly save lives. But there are drawbacks as well: the risk of misappropriation, leakage and theft. “These centralised ‘spaces’ might attract cyberattacks,” he says.