Our lawyer Ersi Michailidou drafted an opinion article on BALANCING PRIVACY AND TRANSPARENCY in the use of the Clinical Trials Information System (CTIS).
Robust protection of personal data is not only a key element of the quality and integrity of clinical trial procedures; it is also a core ethical principle for the conduct of clinical research.
Undoubtedly, a patient's decision to enrol in a clinical study builds on trust that any personal information collected and processed within the research context will be safeguarded and kept confidential.
On the other hand, in the post-pandemic era the demand for transparency of information related to clinical trials appears more critical than ever. Access to this information, including trial results, is important to allow prompt recruitment of patients, avoid duplication of efforts, enable independent scientific evaluation and, ultimately, foster innovation in clinical research.
The Clinical Trials Regulation (CTR), effective as of January 2022, introduces an unprecedented level of transparency through the obligation to submit the data and documents relating to clinical trials via the EU Portal and EU Database, which, together with the Annual Safety Reports repository, constitute the Clinical Trials Information System (‘CTIS’). However, given the extensive obligations for the protection of personal data under the GDPR regime and GCP rules, transparency and personal data confidentiality must be fairly balanced when submitting information to the CTIS.
On 3 May 2022, the EMA issued an ‘Interim guidance document on how to approach the protection of personal data and commercially confidential information while using the Clinical Trials Information System (CTIS)’, which is available for public consultation. In this document, the EMA provides guidance on how to manage personal data in structured data fields and in documents submitted to CTIS, by virtue of the provisions of article 81(7) CTR, which sets out that no personal data of trial participants shall be publicly accessible, combined with article 81(4) CTR, which states that the CTIS shall be publicly accessible, except where justified to protect personal data.
In particular, the EMA guidance document sets out a number of considerations that must be taken into account when preparing documents for submission.
In any case, the principle of data minimisation under the GDPR requires that pseudonymised personal data be included in the ‘not for publication’ version of documents only if this is absolutely necessary and proportionate for the scientific evaluation of clinical trial information and, ultimately, of the safety and effectiveness of the investigational medicinal product. This is a critical balancing test to be performed by Sponsors, so as to serve their primary ethical and legal obligation to protect trial participants’ rights and, at the same time, to best serve transparency and promote clinical research overall.
[1] Anonymisation techniques and their effectiveness should be checked against three criteria (Article 29 Working Party Opinion 05/2014 on Anonymisation Techniques): (i) is it still possible to single out an individual; (ii) is it still possible to link records relating to an individual; and (iii) can information be inferred concerning an individual?
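The first of these criteria, singling out, can be checked mechanically on a tabular dataset: if any combination of quasi-identifiers occurs only once, that record can still be singled out even after direct identifiers are removed. A minimal sketch in Python (the field names and data are hypothetical, chosen purely for illustration):

```python
from collections import Counter

def singling_out_risk(records, quasi_identifiers):
    """Return the records whose quasi-identifier combination is unique,
    i.e. records that can still be singled out (k-anonymity with k=1)."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    return [r for r, key in zip(records, keys) if counts[key] == 1]

# Hypothetical pseudonymised trial data: direct identifiers removed,
# but quasi-identifiers (age band, postcode prefix) remain.
data = [
    {"id": "P001", "age_band": "60-69", "postcode": "SW1"},
    {"id": "P002", "age_band": "60-69", "postcode": "SW1"},
    {"id": "P003", "age_band": "30-39", "postcode": "EC2"},  # unique combination
]
at_risk = singling_out_risk(data, ["age_band", "postcode"])
```

In this toy example only the third record has a unique combination of age band and postcode prefix, so it fails the singling-out test; a real assessment would of course cover far more quasi-identifiers and the linkability and inference criteria as well.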
To read more: Public consultation of the EMA on the transparency rules for the operation of the Clinical Trials Regulation (CTR) and its Clinical Trials Information System (CTIS):
https://ec.europa.eu/eusurvey/runner/TransparencyRulesPublicConsultationCTIS
On 30 May 2023, the ICO, the UK's Data Protection Authority ('DPA'), announced that, in agreement with the Government, a new version of the UK Data Protection Bill had been produced.
Information Commissioner John Edwards indicated the bill "has moved to a position where I can fully support it."
The Data Protection and Digital Information (No 2) bill (the DPDI bill) was introduced to Parliament on 8 March. It is an important milestone in the evolution of the UK’s data protection regime.
The DPDI clarifies how organisations can use personal data for research, the importance of which was demonstrated powerfully during the pandemic. The bill also clarifies what constitutes scientific research.
The bill also gives organisations more confidence to rely on the legitimate interests lawful basis and to further process data.
The bill also clarifies the rules around international transfers. While the approach and standards remain consistent, the clarifications are intended to help organisations feel more confident about taking a risk-based approach when using existing mechanisms.
The DPDI introduces a more flexible and proportionate approach to demonstrating accountability. Organisations will be able to demonstrate accountability in ways that work for them, rather than requiring a one-size-fits-all approach.
Comment from PharMarketing: this flexibility was already the approach taken by the ICO and all other DPAs in Europe.
Prescriptive requirements are focused on organisations carrying out high-risk processing, and the ICO asks the Government to clarify which processing activities qualify as 'high risk'.
Regarding the rights of data subjects:
The right to a human review of AI processing has been maintained in the DPDI.
The Government has not taken forward the initial proposal to introduce a cost ceiling for subject access requests. The right of access to personal data remains free of charge in most circumstances.
Reducing burdens on organisations, and promoting growth and innovation:
In addition to the simplification of the processing of personal data for research (and its reuse), and the fact that the legitimate interests lawful basis will be easier to rely on for a number of specified purposes, the automated decision-making provisions will be simpler to apply.
Increasing the power of the ICO:
The DPDI will increase the fines for breaches of Privacy and Electronic Communications Regulations (PECR), which will help the ICO to tackle predatory marketing calls.
The ICO will have new explicit power to refuse complaints that have not exhausted an organisation’s complaints procedure or are vexatious or excessive.
Read the press release of the ICO here:
Read ICO's response to DPDI Bill here:
19 June 2023 - ICO urges organisations to harness the power of data safely by using privacy enhancing technologies (PETs).
Such PETs include:
· Differential privacy
· Synthetic data
· Homomorphic encryption (HE)
· Zero-knowledge proofs
· Trusted execution environments
· Secure multiparty computation (SMPC)
· Private set intersection (PSI)
· Federated learning
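As an illustration of the first technique on this list, differential privacy typically adds calibrated random noise to aggregate query results so that the presence or absence of any single individual cannot be inferred. A minimal sketch of the Laplace mechanism for a counting query (all names and data are illustrative, not taken from any specific PET library):

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count: a counting query has sensitivity 1,
    so adding Laplace noise with scale 1/epsilon gives epsilon-DP."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical aggregate over participant ages: how many are 65 or over?
ages = [34, 67, 45, 71, 29, 66]
noisy = dp_count(ages, lambda a: a >= 65, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the design choice is the usual privacy/utility trade-off, which is why such techniques suit the large-volume, special-category data sharing the ICO describes.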
John Edwards, UK Information Commissioner, said: “If your organisation shares large volumes of data, particularly special category data, we recommend that over the next five years you start considering using PETs."
Clearly, all organisations in the life science and healthcare industries fall under this recommendation from the ICO.
Read full article of the ICO here:
https://ico.org.uk/about-the-i...
Since December 2022, the old Standard Contractual Clauses (SCCs) from the European Commission can no longer be used to make a transfer of personal data outside of the EU compliant with the GDPR.
Did you review all the contracts between your organisation and clients / vendors to make sure none of them contain the old SCCs?
For any question, contact us at contact ( at ) pharmarketing.net
Here are some insider news about what's happening in Europe regarding medical research and data privacy:
Regarding drugs:
Accelerating Clinical Trials in the EU (ACT EU): no deliverable has been produced so far; the Kick Off meeting took place on 22 and 23 June 2023. See the agenda here: ACT EU multi-stakeholder platform kick-off workshop | European Medicines Agency (europa.eu).
The improvements to the CTIS portal will be managed outside of the ACT EU initiative, even though, of course, the attractiveness of clinical trials in the EU depends on the ease of use of the CTIS.
A CTIS info day will be organised by the EMA and the DIA in July: watch out for more details on our LinkedIn page!
One action to accelerate submissions is the EU CT Cure programme (submissions in 20 days): Home | CT Cure | EU4Health Programme. The first dossier, for France, was submitted at the end of May 2023.
A new draft of ICH E6(R3) has been published for public comment; see here: ICH E6 (R3) Guideline on good clinical practice (GCP)_Step 2b (europa.eu). Deadline for comments: 26 September 2023.
About the transition period to the CTR:
Transitioning all studies submitted under the EU Directive to the CTR and to the CTIS: the deadline is 31st January 2024. An official from the EMA told one of our colleagues that at this time a postponement of the transition period of CTR is not envisaged. If we look back at what happened for the postponement of the MDR, in our opinion it would need a lot of pressure from industry groups to make a postponement happen.
Regarding medical devices:
· For products registered in the EU to be sold in Switzerland, a legal representative in Switzerland is now mandatory.
· An EU guide on clinical investigations for medical devices has been published, adding the notion of estimand; aspects related to ISO 14155:2020 and to the investigation plan will change in the future.
· An additional postponement of the transition period of the EU MDR is not envisaged.
On 8 June, U.K. Prime Minister Rishi Sunak and U.S. President Joe Biden announced the Atlantic Declaration, with a future Data Bridge to let personal data flow between the UK and the US.
The Atlantic Declaration states: "we have committed in principle to establish a U.S.-UK Data Bridge to facilitate data flows between our countries while ensuring strong and effective privacy protections. We are working to finalize our respective assessments swiftly to implement this framework. We also intend to coordinate to further promote trust in the digital economy, including through support for the Global Cross-Border Privacy Rules (CBPR) Forum and the OECD’s Declaration on Government Access to Personal Data Held by Private Sector Entities, and to build shared understandings on data security risks."
Read the Atlantic Declaration here: https://www.whitehouse.gov/briefing-room/statements-releases/2023/06/08/the-atlantic-declaration-a-framework-for-a-twenty-first-century-u-s-uk-economic-partnership/
12 June: The Council for International Organizations of Medical Sciences (CIOMS) has produced a consensus report on Real-World Data (RWD) and Real-World Evidence (RWE) in Regulatory Decision-Making.
You can send comments before 14 July 2023 to hills@cioms.ch, cc info@cioms.ch
14 June 2023 - President Bola Tinubu signed the Nigeria Data Protection Bill, 2023 into law, ITEdge Nigeria reports.
The Bill creates the Nigeria Data Protection Commission ('NDPC') headed by a national commissioner tasked with regulating how entities process personal information.
One of the mandates of the NDPC is facilitating "the development of personal data protection technologies, in accordance with recognised international good practices."
Read the full story here.
This month the European Parliament voted to ban biometric surveillance, emotion recognition and predictive policing in the world’s first piece of comprehensive legislation on artificial intelligence.
MEPs in Strasbourg approved the AI Act position, 499 votes in favour, 28 against and 93 abstentions, meaning that the draft law can now move forward to negotiation with national representatives in the Council.
MEPs want to curb the risks of AI and promote its ethical use, but their final wording has not pleased all stakeholders. CCIA said that the Parliament’s amendments deviated from the European Commission’s original risk-based approach meaning that “the strict requirements meant for high-risk cases” have been extended to other “useful AI applications that pose very limited risks.”
MEPs also voted to strictly limit “categorisation based on sensitive characteristics, predictive policing and emotion recognition systems”.
They also propose tailor-made rules for generative AI, like ChatGPT, including labelling AI-generated content and making publicly available summaries of copyrighted data used for training. Given all the hype around ChatGPT in recent months, expect all parties in the trilogue to push as hard as possible to get the law done as soon as possible.
“We now need the Parliament to stick to its guns and hold its ground so its position gets taken up by member states,” said Ursula Pachl, Deputy Director General of the European Consumer Organisation (BEUC).
Remember there are European elections next May and MEPs in particular would like a done-deal before then.
“The AI Act will set the tone worldwide in the development and governance of artificial intelligence, ensuring that this technology, set to radically transform our societies through the massive benefits it can offer, evolves and is used in accordance with the European values of democracy, fundamental rights, and the rule of law,” said co-rapporteur Dragos Tudorache MEP.
The UK’s privacy authority, the ICO, has warned that there is “real danger” of discrimination with new neurotechnologies that monitor the brain.
It is expected that the technology to monitor neurodata – the information coming directly from the brain and nervous system – will become widespread over the next decade. While this could offer huge health benefits – [personal advertising klaxon] see my podcast on the Human Brain Project – the ICO is seriously concerned that the collection and use of such data pose major risks of bias leading to discrimination, with neurodivergent people particularly at risk.
Stephen Almond, Executive Director of Regulatory Risk at the Information Commissioner’s Office said: “Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour. The consequences could be dire if these technologies are developed or deployed inappropriately.”
According to the ICO, “the use of neurotech in the workplace could also lead to unfair treatment. An example of this could be that if specific neuropatterns or information come to be seen as undesirable due to ingrained bias, those with those patterns may then be overlooked for promotions or employment opportunities.”
The ICO wrote on its website:
Neurotechnologies have continued to proliferate in the health and research sector over the past decade and may soon become part of our daily life. Our workplaces, home entertainment and wellbeing services may use neurotechnology to provide more personalised services in the years to come.
As the UK’s data protection regulator, the Information Commissioner’s Office (ICO) aims to increase public trust in how organisations process personal information through responsible practice. We want to empower people to safely share their information and use innovative products and services that will drive our economy and our society. In our ICO25 strategy, we committed to set out our views on emerging technologies to reduce burdens on businesses, support innovation and prevent harms.
This report specifically considers gathering, analysing and using information that is directly produced by the brain and nervous system, referred to as neurodata. This ranges from monitoring concentration levels at work, to more distant concepts such as smart prosthetics that can mimic brain patterns for greater responsivity. This report is a short introductory guide for those who wish to know more about neurotechnologies from a regulatory perspective. It does not consider the implications of neurodata inferred from broader biometric information, such as eye movements, gait or heartrate tracking. This formed part of our earlier work around biometric technologies.
We examine the impact of neurotechnologies and neurodata and analyse their impact on privacy. We explore plausible scenarios and use cases for emerging neurotechnologies, and through these, raise the following issues:
· a significant risk of discrimination emerging in non-medical sectors such as the workplace, as complex systems and potentially inaccurate information become embedded in neurotechnology products and services. There may also be an increasing risk that unfair decisions could be made even when accurate information is used, discriminating in ways that have not previously been defined;
· the need for people to clearly understand the technology and terminology. This enables organisations to meet their requirements for transparency, and enables people to understand their individual rights. Without this, people will be unable to provide clear consent for processing when appropriate and organisations may struggle to address the challenges of automated processing of neurodata; and
· a need for regulatory co-operation and clarity in an area that is scientifically, ethically and legally complex.
We will address these areas of concern through:
· ongoing engagement with key stakeholders across industry, regulation, academia and civil society. This will include inviting organisations to work with our Regulatory Sandbox to engineer data protection into these technologies;
· engagement with the public to better understand their knowledge and concerns about neurotechnologies and privacy; and
· producing neurotechnology specific guidance in the longer term. This will address the need for regulatory clarity and set clear expectations about the responsible and compliant use of neurodata.
The ICO will address some other issues elsewhere, as it builds on its Artificial Intelligence (AI) framework and forthcoming guidance on workplace surveillance. This will include potential neurodiscrimination arising through inaccurate information or inappropriate processing and decision-making.
On 14 June 2023, the Autoridade Nacional de Proteção de Dados (ANPD), the Data Protection Authority of Brazil, published a new simplified model of Record of Processing Activities ('ROPA') for small and mid-size businesses ('SMBs').
This "simplified model" requires information on eight fields that are "considered essential," including categories of data subjects, details on how the data is shared, security measures and data retention.
Read more details on the simplified ROPA model here.
Also, the ANPD issued Guidelines for the Processing of Personal Data in Academic Research
These guidelines clarify the legal bases that can be used for the processing of personal data for academic research.
The guidelines also clarify the conditions under which researchers can be given access to personal data and how it can be shared between different organisations.
The guidance also explicitly outlines "the need for the processing agent to follow ethical standards and the principle of good faith."
It gives examples such as the shared use of data between Health Departments and research bodies, the processing of personal data carried out by educational institutions, and the use of personal data by research centres created by the Public Ministry in the states of the federation, among others.
Read the press release from the ANPD on Academic Research here.