The Swiss-US
Data Privacy Framework ('DPF')
can now be used
On 15 September 2024, the U.S. Department of Commerce announced that Switzerland now recognizes the adequacy of the Swiss-US Data Privacy Framework ('DPF').
What does it mean?
It means that any US organisation that registers with the Swiss-US DPF on the website of the US Department of Commerce can now receive personal data from individuals based in Switzerland without additional transfer safeguards.
In other words, you no longer need to put in place a Data Transfer Agreement (with the EU Standard Contractual Clauses + CH addendum) with each organisation in Switzerland from which you receive personal data.
This is especially helpful for US organisations offering services to people in Switzerland: CROs, Central Labs, boutique #datamanagement or #biostat firms, software companies, etc.
By registering with the Swiss-US DPF, you have a great marketing argument to demonstrate to your contacts in Switzerland that your organisation complies with European privacy laws and that you will protect personal data of patients and of healthcare professionals.
Updated guidance regarding the Swiss-U.S. DPF is now available on the DPF Program website here.
In addition, remember that you now need to appoint a Swiss Data Protection Representative ('DPR').
For more information, contact us at PharMarketing GDPR Life Sciences Data Protection, Data Privacy: Caroline at c.x.josse@pharmarketing.net or Karine at k.i.renault@pharmarketing.net
EU CTIS:
Clarification of the privacy rules
for personal data entered by sponsors
On 9 July 2024, the EMA released an updated Q&A on the protection of Commercially Confidential Information and Personal Data while using CTIS.
On 19 July 2024, the EMA released the Revised CTIS transparency rules and historical trials: quick guide for users; this new version includes the change of sponsor.
The EMA indicated in its CTIS Newsletter dated 26 July 2024 that the revised CTIS transparency rules have applied since 18 June 2024.
For support in the implementation of the revised rules, sponsors can consult the updated quick guide for users, guidance, annex I and Q&A document on the protection of personal data and CCI in CTIS.
Whistleblower
outside Europe
and Data Privacy
At PharMarketing we are helping a mid-size pharma company implement a whistleblower alert system, covering the business process, IT, and data privacy compliance sides.
This pharma company is headquartered in the EU and has offices in several countries across the globe.
Recently they received an alert from a manager outside Europe, based in a non-adequate country (in the sense of the GDPR) where privacy laws are much stricter than in the EU.
This manager said that some people in their team wanted to raise an alert. The compliance team at HQ wanted to ask the manager to provide the names of these colleagues.
BUT this would not comply with the local privacy laws of this non-adequate country, nor with the rules governing alert systems.
So how can this be managed in a compliant way?
The compliant way is for the manager in this non-adequate country to ask their colleagues to enter the alert in the system themselves.
For more information on the interplay between whistleblower systems and privacy, contact us at contact ( at ) pharmarketing.net
Penalties for Non-Compliance with Health Data Privacy
European Data Protection Authorities published several decisions related to the processing of health data in the past months.
Such decisions shed light on the key measures to implement to stay compliant with privacy (and healthcare) laws and avoid a critical finding.
Many thanks to GDPRhub, noyb and the IAPP for all this valuable information!
Croatia:
Fact:
The Data Protection Authority ('DPA') of Croatia, the AZOP, fined a hospital €190,000 after a data breach led to the deletion of the X-ray images stored on its server. The controller failed to notify the DPA and did not have a backup of these images. Read more on GDPRhub.
Takeaway:
It can always happen that somebody deletes data by mistake. This is not a big deal if the organisation backs up its databases regularly. In this case the hospital had decided not to back up the X-ray images because of the cost, but such an explanation is not acceptable. If the hospital did not have sufficient budget, it should have notified the Croatian DPA at the time, and the DPA would have helped it identify other technical security measures. The consequences are now serious: healthcare professionals no longer have the X-rays taken in the past and might not be able to provide appropriate care, for example because they cannot track the evolution of a disease over time.
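As a minimal illustration of the kind of safeguard expected here, the sketch below copies a folder of image files to a backup location and verifies each copy with a checksum. The paths and function names are hypothetical, not the hospital's actual setup; a real deployment would also schedule this regularly and keep the backup on separate infrastructure.

```python
import hashlib
import shutil
from pathlib import Path


def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, used to verify copies."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def backup_images(source: Path, destination: Path) -> int:
    """Copy every file from source to destination and verify each copy.

    Returns the number of files backed up. Raises IOError if a copy's
    checksum does not match the original (i.e. the backup is corrupt).
    """
    destination.mkdir(parents=True, exist_ok=True)
    copied = 0
    for item in source.iterdir():
        if item.is_file():
            target = destination / item.name
            shutil.copy2(item, target)  # copy2 also preserves timestamps
            if sha256(item) != sha256(target):
                raise IOError(f"Checksum mismatch for {item.name}")
            copied += 1
    return copied
```

The checksum step matters: a backup that was never verified can fail silently, which is exactly the situation a regulator will not accept.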
France:
Fact:
On 5 September 2024, the French Data Protection Authority, the CNIL, fined software company CEGEDIM SANTÉ €800,000 for creating a health data warehouse with pseudonymised health data. The CNIL claimed that patients could be reidentified and that the patient data should have been fully anonymised instead of pseudonymised. In addition, the CNIL said that Cegedim Santé should have requested a specific authorisation from the CNIL, as per article 66 of the French Data Privacy Law.
Remarks from PharMarketing: 1) the CNIL had not published its guideline for healthcare data warehouses at the time it started to audit Cegedim Santé; 2) in its press release, the CNIL did not say that it managed to reidentify patients from the health data warehouse, only that someone might be able to reidentify a patient.
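The distinction the CNIL relies on can be illustrated with a short sketch. The field names and helper functions below are hypothetical, chosen purely to show the principle: pseudonymisation replaces the direct identifier with a stable token that still allows linkage and re-identification (so it remains personal data), while anonymisation irreversibly removes that link.

```python
import hashlib


def pseudonymise(record: dict, secret: str) -> dict:
    """Replace the direct identifier with a stable token.

    The same patient always gets the same token, so whoever holds the
    secret (or a lookup table) can re-identify them: under the GDPR
    this is still personal data.
    """
    token = hashlib.sha256((secret + record["name"]).encode()).hexdigest()[:12]
    return {**{k: v for k, v in record.items() if k != "name"}, "pid": token}


def anonymise(record: dict) -> dict:
    """Drop the identifier entirely and coarsen quasi-identifiers.

    No key remains that links the output back to a person. A real
    anonymisation exercise would also assess re-identification risk
    across the whole dataset (e.g. k-anonymity), not just one record.
    """
    out = {k: v for k, v in record.items() if k != "name"}
    out["age_band"] = f"{(out.pop('age') // 10) * 10}s"  # e.g. 47 -> "40s"
    return out
```

The CNIL's point is precisely that the first approach keeps a path back to the individual, which is why it treated the warehouse as containing personal data.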
Takeaway:
France is, to our knowledge, the only country in Europe that makes it mandatory for organisations to demonstrate compliance with the guidelines its DPA publishes (Ireland has a narrow exception, but only for a very specific situation in clinical trials): there are guidelines for medical research (interventional and observational), for vigilance activities, for healthcare data warehouses and more.
We advise all organisations planning to collect, process or store health data from people based in France to first check whether the CNIL has published a guideline for that activity, then to check whether it is mandatory to demonstrate compliance (some guidelines don't require it), and if so, to demonstrate compliance by self-declaration on the CNIL website. PharMarketing can help you do this in a few hours.
Read the article 66 of the French Data Privacy Law here: https://www.legifrance.gouv.fr...
Read the press release from the CNIL about Cegedim Santé : https://www.cnil.fr/en/health-...
Italy:
Fact:
The local Data Protection Authority ('DPA'), the Garante, issued a reprimand to the Ministry of Infrastructure and Transport after it unlawfully transferred data about an alleged mental illness of an employee to another Ministry in order to inquire whether he had a gun licence. Read more on GDPRhub.
Takeaway:
The Ministry of Infrastructure and Transport did not need to send health data to the other Ministry to find out whether the employee had a gun licence: the employee's name was sufficient. In other words, the former did not comply with the principle of Data Minimisation. In addition, it was a breach of sensitive personal data (health data) concerning a vulnerable person (an employee). So the takeaway is that it is important to train all employees on data privacy at regular intervals, especially regarding transfers of personal data. A good practice is to train employees upon recruitment and then provide an annual refresher.
Norway:
Fact:
Following a patient's request to update a medical record, Norway's DPA, the Datatilsynet, said that it does not have the power to carry out a substantive review of which information is correct or relevant to include in a patient record.
Takeaway:
This case looks a bit strange to us: even if it is true that the Datatilsynet is not expected to have medical expertise, its role is to ask the data controller to update the medical record following the request of a data subject.
United Kingdom:
Fact:
A ransomware attack on the U.K. National Health Service exposed the sensitive health data of more than 900,000 patients, The Record reports. The personally identifiable information breached allegedly included "pathology and histology forms that are used to share patient details between medical departments and institutions."
Full story
Australia:
Fact:
Australia's DPA, the OAIC, revealed in its report for the first half of 2024 that health data breaches represented 19% of all personal data breaches notified in H1 2024. The MediSecure data breach notified in the period affected approximately 12.9 million Australians – the largest number of Australians affected by a breach since the Notifiable Data Breaches scheme came into effect. Read more: https://www.oaic.gov.au/news/m...
USA:
Fact:
Reuters reports that genetic testing company 23andMe agreed to pay USD 30 million to settle claims of insufficient protection of customers' personal information. The tentative settlement includes three years of security monitoring for 6.9 million customers and must be approved by a U.S. judge.
Full story
Takeaway:
Make sure your organisation has Technical and Organisational Measures ('TOMs') in place that provide sufficient protection of personal data, especially health data. To demonstrate that the risk to people's private lives is negligible, you must draft a Data Protection Impact Assessment ('DPIA'). If you are not sure of your risk evaluation, we advise you to ask the opinion of a local Data Protection Authority: it is always better to ask before you experience a data breach, or before a patient complains to the authority.
Fact:
The U.S. District Court for the Northern District of California ruled that a class-action lawsuit can proceed against online therapy provider Headway over alleged personal data sharing, Courthouse News Service reports. The lawsuit claimed Headway shared sensitive patient information, including mental health information and appointments, with Google Analytics after embedding the technology into its website.
Full story
Takeaway: make sure your organisation doesn't share personal data for purposes other than those for which the data were collected and/or processed. Also, make sure your organisation informs people in a transparent manner, BEFORE you collect or process their personal data, of the purposes of the processing, where their personal data will be stored, and whether their data will be shared with other organisations. Several European Data Protection Authorities have found the use of Google Analytics non-compliant with data privacy laws, so be very careful when using this tool or similar ones.
New FDA Guidance on Decentralized Clinical Trials
On 17 September 2024, the FDA updated its guidance on Decentralized Clinical Trials ('DCTs').
It is interesting to note that what the FDA calls DCTs are trials where some visits don't take place at the site but at the patient's home, or any other place outside the site. This is what the industry calls Home Trial Visits ('HTVs').
Put differently, clinical trials where ALL visits take place at the SITE where the investigator is based, but where the patient is given a wearable or enters a few data points in a smartphone app, DO NOT fall within the definition of a DCT per the FDA's guidance.
By contrast, the EMA also calls DCTs trials where all visits take place at the site but where, for example, the patient enters information on a portal or in a smartphone app.
Read the new guidance here: https://www.fda.gov/regulatory...
Read the Federal Register notice accompanying the final guidance: https://public-inspection.fede...
For any questions on DCTs and how to stay compliant with Good Clinical Practices and Data Privacy rules, contact our expert on DCTs Bertrand: b.p.lebourgeois ( at ) pharmarketing.net
Are ChatBots
Compliant with
Privacy Laws?
Chatbots use Artificial Intelligence to try to guide a website visitor. They look like harmless and useful tools, but they collect more information about you than you think.
And this must comply with privacy laws.
Chatbots also capture information on what you are doing on the website. The more you use a particular chatbot, the more the chatbot will learn about you. That is the nature of AI.
With a chatbot, a business can make inferences about a consumer for profiling purposes. It could also process other publicly available information, aggregate it with what you say to the chatbot, and enrich the profile it has about you.
It could also, in theory, make well-informed assumptions about you. These assumptions may be based on your age, gender, profession, interests and billions of additional data points the AI has processed about other, potentially similar users. This information is valuable, as it is exactly the type of deterministic data advertisers rely on for targeting purposes.
In conclusion: if your organisation has already implemented chatbots, or if you are planning to, ask the advice of a privacy expert to make sure that what you do is compliant with privacy laws.