FDA Outlines Proposed Framework for Regulating Artificial Intelligence Software

On April 2, 2019, FDA released a discussion paper entitled “Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD)” (the “AI Framework”). The AI Framework is the Agency’s first policy document describing a potential regulatory approach for medical devices that use artificial intelligence (“AI”) and machine learning (“ML”). The AI Framework does not establish new requirements or an official policy, but rather was released by FDA to seek early input prior to the development of a draft guidance. FDA acknowledges that the approach “may require additional statutory authority to implement fully.”

In an accompanying press release, former FDA Commissioner Scott Gottlieb outlined the need for a “more tailored” regulatory paradigm for algorithms that learn and adapt in the real world. FDA’s medical device regulation scheme was not designed for dynamic machine learning algorithms, as the Agency traditionally encounters products that are static at the time of FDA review. The AI Framework is FDA’s attempt to develop “an appropriate framework that allows the software to evolve in ways to improve its performance while ensuring that changes meet [FDA’s] gold standard for safety and effectiveness throughout the product’s lifecycle.”

EU High-Level Working Group Publishes Ethics Guidelines for Trustworthy AI

On 8 April 2019, the EU High-Level Expert Group on Artificial Intelligence (the “AI HLEG”) published its “Ethics Guidelines for Trustworthy AI” (the “guidance”).  This follows a stakeholder consultation on its draft guidelines published in December 2018 (the “draft guidance”) (see our previous blog post for more information on the draft guidance).  The guidance retains many of the core elements of the draft guidance, but provides a more streamlined conceptual framework and elaborates further on some of the more nuanced aspects, such as the interaction with existing legislation and the reconciliation of tensions between competing ethical requirements.

According to the European Commission’s Communication accompanying the guidance, the Commission will launch a piloting phase starting in June 2019 to collect more detailed feedback from stakeholders on how the guidance can be implemented, with a focus in particular on the assessment list set out in Chapter III.  The Commission plans to evaluate the workability and feasibility of the guidance by the end of 2019, and the AI HLEG will review and update the guidance in early 2020 based on the evaluation of feedback received during the piloting phase.

Council of Europe issues recommendation on health-related data

On March 28, 2019, the Council of Europe* issued a new Recommendation on the protection of health-related data.  The Recommendation calls on all Council of Europe member states to take steps to ensure that the principles for processing health-related data (in both the public and private sector) set out in the Appendix of the Recommendation are reflected in their law and practice.

This Recommendation is likely to be of interest to both public sector and private sector organizations that are seeking to use health-related data in innovative ways, including developing digital health solutions that involve genetic data, scientific research, data sharing or mobile health applications.

The Recommendation builds on Convention 108, an international treaty first opened for signature in 1981 and the first legally binding international instrument on the protection of individuals’ privacy.  Convention 108 has recently been updated to align it with the GDPR (see the consolidated text of the modernized Convention 108+), but it contains less granular obligations than the GDPR.  The Recommendation complements the modernized Convention 108+ by introducing specific definitions (such as “health-related data” and “genetic data”) and specific principles for processing health data.

Most of the principles on processing health data set out in the Recommendation reiterate the position under the EU General Data Protection Regulation (“GDPR”) and relevant guidance issued by European data protection authorities and the European Data Protection Board (the “EDPB”, the successor to the “Article 29 Working Party”).  The Recommendation does, however, provide some specific guidance on processing health-related data that is more detailed than, and in some respects goes beyond, the requirements of the GDPR, as described below:

  • Genetic data. The Recommendation provides that genetic data should only be collected subject to appropriate safeguards and where collection is either prescribed by law or based on consent (except where such consent is excluded by law).  Genetic data used for preventative health care, diagnosis or treatment of patients, or for scientific research, should only be used for those purposes or to enable the individuals concerned by the results of genetic tests to take an informed decision on these matters.  The use of genetic data in the employment context, for insurance purposes, and in judicial procedures or investigations is specifically called out as an area where member states should consider adopting laws that provide appropriate safeguards.
  • Sharing health-related data for secondary purposes.  In relation to sharing health-related data for purposes other than providing and administering health care, the Recommendation states that only recipients who are authorized by law should have access to health-related data, with no mention of patients’ consent as a way of legitimizing such access. This position is potentially more restrictive than the current approach under the GDPR, where third parties not involved in providing health care to patients (such as research or academic institutions or commercial companies) may receive health-related data as long as they do so in compliance with the GDPR.  It remains to be seen whether national laws implementing this Recommendation will treat third parties that lawfully receive health-related data in compliance with the GDPR (for example, with patients’ consent) as meeting this “authorization” requirement.  The Recommendation also states that recipients of health-related data must be subject to the rules of confidentiality incumbent upon a healthcare professional (or equivalent) unless other safeguards are provided by law.
  • Scientific research.  The Recommendation takes a contextual approach to scientific research, providing that the need to process health-related data for scientific research should be weighed against the risks to the data subject (and to their biological family if genetic data is involved). Unlike the GDPR, the Recommendation does not automatically treat scientific research as compatible with the original purposes for which the data was collected.  As a general principle, health-related data should only be processed for research purposes where the data subject has consented, unless the law provides that health-related data can be processed without consent.  Individuals should also be provided with transparent and comprehensible information about the research project.  The Recommendation adds that the conditions in which health-related data are processed for scientific research must be assessed, where necessary, by a competent independent body, such as an ethics committee, and such research projects should be subject to safeguards set out in law.  Fundamentally, the three-part requirement of consent (or a legal basis), notice, and safeguards for using health-related data for research is the same as under the GDPR.  However, in some respects the Recommendation appears to call for a strengthened regime for scientific research using health-related data that goes further than the GDPR.
  • Digital health. Several principles in the Recommendation are clearly relevant for digital health applications, particularly those involving artificial intelligence, machine learning and mobile devices.  The Recommendation provides that systems storing health-related data should be “auditable”, meaning that it should be possible to trace any access to, modification of, and actions carried out on the information system, so that the author can be identified (a minimal illustrative sketch of such an audit trail appears after this list).  The Recommendation also encourages the adoption of “reference frameworks”, which are coordinated sets of rules and state-of-the-art processes adapted to practice and applicable to health information systems, covering areas of interoperability and security, and which should apply to information systems hosting or processing health-related data.  The Recommendation also specifically mentions professionals who are not directly involved in providing individual patient health care but may have access to health-related data to ensure the “smooth operation of information systems” (which could include, for example, cloud service providers).  Such professionals must have full regard for professional secrecy and comply with security requirements laid down by law to guarantee the confidentiality and security of the data.  In relation to mobile devices, the Recommendation makes it clear that information collected on mobile devices can constitute health-related data and therefore should have the same legal protections as other health-related data.
  • Individuals’ rights. The Recommendation provides that individuals should have the right to be informed of and exercise control over their health-related data and genetic data, in line with the GDPR.  However, there are three areas of deviation: (1) individuals should have the right not to be informed of medical diagnoses or the results of genetic tests, as they may have their own reasons for not wishing to know, subject to limited exceptions where they must be informed by law; (2) when individuals withdraw from a scientific research project, they should be informed that their health-related data processed in the context of that research will be destroyed or anonymized in a manner which does not compromise the scientific validity of the research – which appears to be more nuanced than recent guidance from the EDPB; and (3) individuals should have the right to be informed of the reasoning that underlies data processing involving health-related data where the results of such processing are applied to them, particularly if profiling is involved.  This third right is similar to the one in the GDPR (Article 15(1)(h)) but applies more broadly, covering processing beyond the solely automated decision-making with significant effects described in Article 22 of the GDPR.
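The Recommendation itself does not prescribe any particular technical implementation of the auditability principle mentioned above. Purely by way of illustration, the sketch below (in Python, with hypothetical names and file paths) shows one minimal way a system might record who accessed or modified a health record, what action they took, and when, so that the author of each action can later be identified.

```python
import datetime
import json


class HealthRecordAuditLog:
    """Minimal append-only audit trail: every access to or modification of a
    health record is logged with the acting user, the action taken and a
    timestamp, so the author of each action can later be identified."""

    def __init__(self, path):
        self.path = path

    def log(self, user_id, record_id, action, details=None):
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user_id": user_id,      # who performed the action
            "record_id": record_id,  # which health record was touched
            "action": action,        # e.g. "read", "update", "export"
            "details": details or {},
        }
        # Append-only: entries are never rewritten, only added.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")


# Example: record that a clinician viewed a patient's record.
audit = HealthRecordAuditLog("audit.log")
audit.log(user_id="clinician-42", record_id="patient-1001", action="read")
```

In practice such a log would also need integrity protections (for example, restricted write access and tamper-evident storage), but the core idea is simply that every action on the system is traceable to its author.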

To the extent that the GDPR does not already impose the same obligations as the principles set out in the Recommendation, the Recommendation is not binding on private sector or public sector organizations.  The member states of the Council of Europe or the European Union, however, are expected to use the Recommendation as guidance when adopting national laws that deal with health data.  These principles also provide some insight into how European data protection authorities are likely to interpret the provisions in the GDPR that apply to health-related data and genetic data, and the direction of future guidance and legislation on the topic.

* The Council of Europe is an international organization, distinct from the European Union, founded in 1949 to promote democracy and protect human rights and the rule of law in Europe.  The Council of Europe consists of 47 member states, which include all 28 EU Member States.  Recommendations issued by the Council of Europe are not binding unless and until the EU or the national governments of member states implement legislation, but the EU often builds on Council of Europe standards when drawing up its own legislation.

ICO opens beta phase of privacy “regulatory sandbox”

On 29 March 2019, the ICO opened the beta phase of the “regulatory sandbox” scheme (the “Sandbox”), which is a new service designed to support organizations that are developing innovative and beneficial projects that use personal data.  The application process for participating in the Sandbox is now open, and applications must be submitted to the ICO by noon on Friday 24 May 2019. The ICO has published on its website a Guide to the Sandbox, which explains the scheme in detail.

The purpose of the Sandbox is to support organizations that are developing innovative products and services using personal data, and to develop a shared understanding of what compliance looks like in particular innovative areas.  Organizations participating in the Sandbox are likely to benefit from the opportunity to liaise directly with the regulator on innovative projects that raise complex data protection issues. The Sandbox will also be an opportunity for market leaders in innovative technologies to influence the ICO’s approach to use cases with challenging aspects of data protection compliance, or where there is uncertainty about what compliance looks like.

The beta phase of the Sandbox is planned to run from July 2019 to September 2020.  Around 10 organizations from private, public and third sectors will be selected to participate. In the beta phase, the ICO is focusing on data processing that falls within the remit of UK data protection law.

In particular, the ICO is seeking applications for products or services that address the following data protection challenges relevant to innovation:

  • use of personal data in emerging or developing technology such as biometrics, internet of things (IoT), facial recognition, wearable tech, cloud-based products;
  • complex data sharing at any and all levels;
  • building good user experience and public trust by ensuring transparency, clarity and explainability of data use;
  • perceived limitations, or lack of understanding of the General Data Protection Regulation and Data Protection Act 2018 provisions on automated decision making, big data, machine learning or AI;
  • utilising existing data (often at scale and in linking data) for new purposes or for longer retention periods;
  • building ‘data protection by design and default’ into product development, taking account of cost issues and difficulties of doing this until testing has been undertaken; or
  • ensuring the security of data and identifying data breaches in complex and innovative environments.

Participating organizations will be asked to sign terms and conditions with the ICO, and will also receive a statement of ‘comfort from enforcement’. In this statement, the ICO will confirm that it will not take immediate enforcement action for any inadvertent breach of data protection law resulting from product or service development during the Sandbox.

The ICO will work with participating organizations to design a bespoke plan, and provide informal advice or ‘steers’ on the project.  Participating organizations can also request ‘statements of regulatory comfort’ from the ICO when they exit the Sandbox, in which the ICO will state that on the basis of the information provided whilst in the Sandbox, the ICO did not encounter any indication that the product or service would infringe data protection law.

The ICO conducted a consultation on the Sandbox in September 2018 (see our previous blog post here), and the analysis of the results of the consultation was published in November 2018.  Information about how to apply to the Sandbox can be found here.

UK’s NICE releases newly updated Digital Health Technologies (DHT) Evidence Standards Framework

Following on from the Evidence Standards Framework for DHTs published in December 2018 (the Original Standards, as reported in our previous blog post, here), the UK’s National Institute for Health and Care Excellence (NICE) recently published a newly updated version of the standards (the Updated Standards, available here).

The Updated Standards were produced following feedback received by NICE on the Original Standards.  While the spirit of the latest standards is largely the same, the Updated Standards seek to build out the aims and context of the document, including through a new accompanying “User Guide” (which supersedes the previous “FAQs”).  The User Guide covers the background of the standards, their development and use, and future priorities identified by stakeholders, along with a glossary of key terms, such as the Updated Standards’ interpretation of ‘Artificial Intelligence’ and ‘Real-world Data’. With respect to the economic impact section of the Updated Standards, a new ‘Basic’ level has been added for ‘low impact’ DHTs undergoing local service evaluation, where a budget impact analysis rather than an economic analysis would be appropriate.

NICE has also provided supporting case studies to help demonstrate how DHTs are functionally classified (here), and assessed for effectiveness and economic impact (here), respectively, under the Updated Standards.

Patient Access to Electronic Health Data at the Forefront of Two HHS Proposed Rules

On March 4, 2019, the Department of Health and Human Services (HHS) published two proposed rules to improve patient access to personal health data. The two rules, issued by the HHS Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC), are intended to increase interoperability of electronic health information. These long-anticipated proposals follow legislative action undertaken in the 21st Century Cures Act. HHS indicated that, by increasing interoperability, it intends to empower patients with ownership of their medical histories and increase efficiency and quality of care in the health care industry.

CMS’s proposed rule on interoperability and patient access to health data would require Medicaid, the Children’s Health Insurance Program (CHIP), Medicare Advantage (MA), and Affordable Care Act federally-facilitated exchange (FFE) health plans to ensure patient access to electronic health information (EHI) by 2020. Key provisions of the proposed rule include:

  • Requiring plans to implement application programming interfaces (APIs), which allow the transfer of electronic information between different computer systems (a brief illustrative sketch of such a patient-facing API call follows this list). Last year, CMS established an API for Medicare fee-for-service plans through the MyHealthEData initiative. The proposed rule extends this initiative to other federal government-funded health plans. CMS indicated that, through the use of an API, it intends for patients to maintain access to their EHI throughout their “healthcare journeys,” even if they switch health plans.
  • Requiring Medicaid, CHIP, and MA health plans to make their entire provider directory available through API technology to facilitate patient access to in-network providers and providers’ ability to coordinate care with other providers. (FFE plans are already required to make their provider directories available and are excepted from this provision.)
  • Requiring MA organizations, Medicaid managed care plans, CHIP managed care entities, and issuers in the FFEs to participate in trust networks that allow the free and secure exchange of information over the internet, despite the use of different health IT networks.
  • Making publicly available a list of clinicians and hospitals that engage in information blocking practices that may prevent the disclosure and use of EHI and therefore undermine the aims of interoperability. By making the information publicly available, CMS hopes to incentivize providers to refrain from information blocking.
  • Requiring that states increase the frequency with which they share data on dually eligible Medicaid and Medicare beneficiaries from monthly to daily.
  • Requiring Medicare-participating hospitals to provide other providers and facilities with “electronic notifications when patients are admitted, discharged or transferred,” in order to improve patient care during transitions between settings and providers.
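As a purely illustrative sketch of what patient-facing API access could look like in practice, the Python snippet below retrieves a patient resource from a hypothetical FHIR-style endpoint. The base URL, access token, and patient ID are invented for illustration; FHIR is used here simply because it is the HL7 standard commonly associated with this kind of health data exchange, not as a statement of what the final rules will require of any particular plan.

```python
import requests

# Hypothetical FHIR-style endpoint and access token; a real deployment would use
# the plan's or provider's published API base URL and an OAuth 2.0 token that
# the patient has authorized.
BASE_URL = "https://api.example-healthplan.com/fhir"
ACCESS_TOKEN = "patient-authorized-oauth-token"


def get_patient_record(patient_id):
    """Fetch a Patient resource, as JSON, from a FHIR-style API."""
    response = requests.get(
        f"{BASE_URL}/Patient/{patient_id}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    patient = get_patient_record("example-patient-id")
    print(patient.get("name"))
```

The point of standardized APIs is that a single patient-facing app could make essentially the same call against any compliant plan or provider system, so patients keep access to their data even when they switch plans.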

ONC’s proposed rule focuses on the more technical aspects of increasing interoperability. Key provisions of the rule include:

  • Providing standardized criteria for APIs to help health IT developers build apps patients can use to easily access their data. To reduce financial barriers to API adoption for government health plans, the rule also limits the fees API suppliers can charge and establishes pro-competitive conditions.
  • Establishing the following seven “reasonable and necessary” exceptions to the 21st Century Cures Act’s prohibition of information blocking:
    • Preventing patient harm
    • Promoting the privacy of EHI
    • Promoting the security of EHI
    • Recovering costs reasonably incurred in making EHI accessible
    • Responding to infeasible requests that impose a substantial burden
    • Licensing of interoperability elements on reasonable and non-discriminatory terms
    • Maintaining and improving health IT performance.
  • Establishing Conditions of Certification and Maintenance of Certification for health IT developers. These conditions prohibit information blocking, require assurances that developers will not engage in information blocking, prohibit developers from restricting communications about health IT, require compliance with API technical requirements, require real world testing, and require attestation to compliance with the Conditions and Maintenance of Certification requirements.

These proposed rules are part of a long-term plan to ensure safe and efficient exchange of EHI. Comments on the proposed rules are due May 3, 2019.

EMA-HMA joint taskforce publish report outlining recommendations for using ‘big data’ for medicines regulation

On 15 February 2019, the European Medicines Agency (EMA) and Heads of Medicines Agencies (HMA) published their Joint Big Data Taskforce’s summary report (available here) setting out recommendations for understanding the acceptability of evidence derived from ‘big data’ in support of the evaluation and supervision of medicines by regulators.

The Taskforce has sought to clarify the meaning of ‘big data’ within the medicines regulatory context, defining it within the report as: “extremely large datasets which may be complex, multi-dimensional, unstructured and heterogeneous, which are accumulating rapidly and which may be analysed computationally to reveal patterns, trends, and associations. In general big data sets require advanced or specialised methods to provide an answer within reliable constraints”.

The Taskforce was split into seven sub-groups, each focusing on different categories of datasets:

  1. Clinical trials and imaging;
  2. Observational (or ‘Real World’) data;
  3. Spontaneous adverse drug reports (ADR);
  4. Social media and mobile health;
  5. Genomics;
  6. Bioanalytical ‘omics (with a focus on proteomics); and
  7. Data analytics (this work is ongoing and cuts across the above six sub-groups; a further report is expected in Q1 2019).

The sub-groups were each asked, amongst other things, to characterise their respective datasets; consider the specific areas where big data usability and applicability may add value; assess the existing competencies and expertise present across the European regulatory network regarding the analysis and interpretation of big data; and provide a list of recommendations and a ‘Big Data Roadmap’.

Reconciling Personalized Nutrition with the GDPR

As with anything personalized, be it advertising, medicines or training schedules, personalized nutrition (using information on individual characteristics to develop targeted nutritional advice, products, or services) risks being affected by the feared GDPR.  Kristof Van Quathem discusses the topic in Vitafoods’ Insights magazine of January 2019, available here.

NMPA Releases Draft Good Manufacturing Practice Appendix on Standalone Software

On January 3, 2019, the National Medical Products Administration (“NMPA”) published a draft standalone software appendix of medical device good manufacturing practice (“Draft Standalone Software GMP” or “Draft Appendix”) for public comment (available here).  Comments are due on January 30, 2019.

China revised its medical device GMP in 2014; the revised GMP applies to all classes of devices, regardless of whether they are imported or made in China.  Subsequently, NMPA added various appendices (fulu) to articulate special requirements for certain types of devices, including sterile, implantable, and in vitro diagnostic devices.  The Draft Appendix sets out proposed special requirements for software that falls under the definition of a medical device.

In China, the definition of a medical device covers software that either itself constitutes a device (i.e., standalone software) or is an accessory/component of a device (i.e., component software).  The Draft Standalone Software GMP expressly applies to standalone software, and it states that it applies “by reference” (mutatis mutandis) to component software.  If finalized, the Draft Standalone Software GMP would take effect on an as-yet-undetermined date in 2020.

The Draft Appendix is a relatively simple document with four main sections:

  • scope and general principles of the Draft Appendix;
  • special requirements for various aspects of the manufacturing and post-market processes (see below);
  • definitions of key terms; and
  • miscellaneous provisions.

Key features of the Draft Standalone Software GMP include the following:

Are Wearables Medical Devices Requiring a CE-Mark in the EU?

Wearable watches that help consumers obtain a better understanding of their eating patterns; wearable clothes that send signals to treating physicians; smart watches: these are but a few examples of the increasingly available and increasingly sophisticated “wearables” on the EU market. These technologies are an integral part of many people’s lives, and in some cases allow healthcare professionals to follow up on the condition or habits of their patients, often in real time. How do manufacturers determine which wearables qualify as medical devices? How do they assess whether their devices need a CE-mark? Must they differentiate between the actual “wearable” and the hardware or software that accompanies it? In this short contribution, we briefly analyze some of these questions. The article first examines what “wearables” are, and when they qualify as medical devices under current and future EU rules. It then addresses the relevance of the applicability of EU medical devices rules to these products. The application of these rules is often complex and highly fact-specific.
