The EU’s regulatory rules for medical devices are due to change on 26 May 2020, when the new Medical Device Regulation (“MDR”)[1] takes effect.  The regime for in vitro diagnostic devices will change two years later, on 26 May 2022, when the In Vitro Diagnostic Medical Devices Regulation (“IVDR”)[2] starts to apply.

In advance of these changes, the EU Medical Device Coordination Group (“MDCG”) has recently published guidance on the Qualification and Classification of Software in the MDR and IVDR (the “Guidance”).

The aim of the Guidance is to assist manufacturers with interpreting the new Regulations to assess whether their software meets the definition of a medical device or an in vitro diagnostic device (i.e., “qualification”); and if so, what regulatory class the software would fall under (i.e., “classification”).

The MDCG is a coordination group established under Article 103 of the MDR, comprising up to two medical device experts from each EU Member State.  Its key functions include contributing to the development of guidance to ensure effective and harmonized implementation of the EU’s new medical device rules.  The Guidance is not legally binding, nor does it necessarily reflect the official position of the European Commission.  However, given the MDCG’s important role in the regulatory landscape, the Guidance is likely to be highly persuasive.


Continue Reading EU Medical Device Coordination Group Publishes Guidance on the Qualification and Classification of Software under Upcoming Medical Device Regulations

On 19 September 2019, the European Parliamentary Research Service (“EPRS”)—the European Parliament’s in-house research service—released a briefing paper that summarizes the current status of the EU’s approach to developing a regulatory framework for ethical AI.  Although not a policymaking body, the EPRS can provide useful insights into the direction of EU policy on an issue.  The paper summarizes recent calls in the EU to adopt legally binding instruments to regulate AI, in particular to set common rules on AI transparency, set common requirements for fundamental rights impact assessments, and provide an adequate legal framework for facial recognition technology.

The briefing paper follows publication of the European Commission’s high-level expert group’s Ethics Guidelines for Trustworthy Artificial Intelligence (the “Guidelines”), and the announcement by incoming Commission President Ursula von der Leyen that she will put forward legislative proposals for a “coordinated European approach to the human and ethical implications of AI” within her first 100 days in office.


Continue Reading European Parliamentary Research Service issues a briefing paper on implementing EU’s ethical guidelines on AI

On 13 August 2019, the European Commission opened a call for expression of interest to relaunch the eHealth Stakeholder Group with a view to supporting the “digital transformation of healthcare in the EU”. The eHealth Stakeholder Group was first launched in 2012 and in its first iteration (between 2012 and 2015), contributed to the development

France’s medicines regulator, the Agence Nationale de Sécurité du Médicament et des Produits de Santé (ANSM), has released draft guidelines, currently subject to a public consultation, setting out recommendations for manufacturers designed to help prevent cybersecurity attacks to medical devices. Notably, the draft guidelines are the first instance of recommendations released by

On 8 April 2019, the EU High-Level Expert Group on Artificial Intelligence (the “AI HLEG”) published its “Ethics Guidelines for Trustworthy AI” (the “guidance”).  This follows a stakeholder consultation on its draft guidelines published in December 2018 (the “draft guidance”) (see our previous blog post for more information on the draft guidance).  The guidance retains many of the core elements of the draft guidance, but provides a more streamlined conceptual framework and elaborates further on some of the more nuanced aspects, such as the interaction with existing legislation and the reconciliation of tensions between competing ethical requirements.

According to the European Commission’s Communication accompanying the guidance, the Commission will launch a piloting phase starting in June 2019 to collect more detailed feedback from stakeholders on how the guidance can be implemented, with a focus in particular on the assessment list set out in Chapter III.  The Commission plans to evaluate the workability and feasibility of the guidance by the end of 2019, and the AI HLEG will review and update the guidance in early 2020 based on the evaluation of feedback received during the piloting phase.


Continue Reading EU High-Level Working Group Publishes Ethics Guidelines for Trustworthy AI

On March 28, 2019, the Council of Europe issued a new Recommendation on the protection of health-related data.  The Recommendation calls on all Council of Europe member states to take steps to ensure that the principles for processing health-related data (in both the public and private sector) set out in the Appendix of the Recommendation

On 15 February 2019, the European Medicines Agency (EMA) and Heads of Medicines Agencies (HMA) published their Joint Big Data Taskforce’s summary report (available here) setting out recommendations for understanding the acceptability of evidence derived from ‘big data’ in support of the evaluation and supervision of medicines by regulators.

The Taskforce has sought to clarify the meaning of ‘big data’ within the medicines regulatory context, defining it within the report as: “extremely large datasets which may be complex, multi-dimensional, unstructured and heterogeneous, which are accumulating rapidly and which may be analysed computationally to reveal patterns, trends, and associations. In general big data sets require advanced or specialised methods to provide an answer within reliable constraints”.

The Taskforce was split into seven sub-groups, each focusing on different categories of datasets:

  1. Clinical trials and imaging;
  2. Observational (or ‘Real World’) data;
  3. Spontaneous adverse drug reports (ADR);
  4. Social media and mobile health;
  5. Genomics;
  6. Bioanalytical ‘omics (with a focus on proteomics); and
  7. Data analytics (this work is ongoing and cuts across the above six sub-groups; a further report is expected in Q1 2019).

The sub-groups were each asked, amongst other things, to characterise their respective datasets; consider the specific areas where big data usability and applicability may add value; assess the existing competencies and expertise present across the European regulatory network regarding the analysis and interpretation of big data; and provide a list of recommendations and a ‘Big Data Roadmap’.


Continue Reading EMA-HMA joint taskforce publish report outlining recommendations for using ‘big data’ for medicines regulation

As with anything personalized, be it advertising, medicines or training schedules, personalized nutrition — using information on individual characteristics to develop targeted nutritional advice, products, or services — risks falling within the scope of the GDPR.  Kristof Van Quathem discusses the topic in Vitafoods’ Insights magazine of January 2019, available here.

Wearable watches that help consumers obtain a better understanding of their eating patterns; wearable clothes that send signals to treating physicians; smart watches: they are but a few examples of the increasingly available and increasingly sophisticated “wearables” on the EU market. These technologies are an integral part of many people’s lives, and in some cases allow healthcare professionals to follow up on the condition or habits of their patients, often in real-time. How do manufacturers determine which wearables qualify as medical devices? How do they assess whether their devices need a CE-mark? Must they differentiate between the actual “wearable” and the hardware or software that accompanies it? In this short contribution, we briefly analyze some of these questions. The article first examines what “wearables” are, and when they qualify as a medical device under current and future EU rules. It then addresses the relevance of the applicability of EU medical devices rules to these products. The application of these rules is often complex and highly fact-specific.

Continue Reading Are Wearables Medical Devices Requiring a CE-Mark in the EU?