On 20 November 2018, the UK government published its response (the “Response”) to the June 2018 consultation (the “Consultation”) regarding the proposed new Centre for Data Ethics and Innovation (“DEI”). First announced in the UK Chancellor’s Autumn 2017 Budget, the DEI will identify the measures needed to strengthen the way data and AI are used and regulated, advising on how to address potential gaps in regulation and outlining best practices in the area. The DEI is described as being the first of its kind globally, and represents an opportunity for the UK to take the lead in the debate on how data is regulated.
Continue Reading IoT Update: The UK Government’s Response to Centre for Data Ethics and Innovation Consultation
Significant FDA Digital Health Policy Development for Prescription Drug Sponsors
As previewed by Commissioner Gottlieb several months ago (see our earlier post here), FDA published a notice in the Federal Register on November 20, 2018, to propose a new framework for “prescription drug-use-related software.” The Agency defines this digital health category widely as software disseminated by a prescription drug sponsor for use with the sponsor’s prescription drug(s). Last spring, the Commissioner stated that FDA would be seeking input “on how to support the development of digital health tools that are included as part of approved drugs.” The goal in establishing the framework, Gottlieb stated, would be “to develop an efficient pathway for the review and approval of digital health tools as part of drug review, so that these tools reach their full potential to help us treat illness and disease, and encourage synergies between software and therapeutics that meet FDA’s gold standard for safety and effectiveness.”
This policy development is significant, not only because it is one of CDER’s first policy statements on digital health associated with pharmaceuticals (see a few of our earlier posts about pharma-related digital health here and here), but also because it implicates a broad range of information that could be made available by prescription drug sponsors through software used with their products. We encourage prescription drug sponsors with any interest in providing digital health solutions, including through collaborations, to review the Federal Register notice and consider submitting comments to FDA.
Here are a few key takeaways from FDA’s notice:
- Under the proposed framework, software with the same drug-related functionalities will be subject to different regulatory approaches by FDA, depending on the developer of the software. FDA will apply the proposed framework to prescription drug-use-related software developed by or on behalf of pharmaceutical manufacturers, and a different approach to drug-related software developed “independently” by third-party software developers and other entities that are not prescription drug sponsors.
- It is unclear from the notice how the proposed framework, including the evidentiary standards described in the Federal Register notice, will align with other FDA initiatives such as the use of real-world evidence for drug development and the pre-certification program (see our earlier post here).
- An important question for prescription drug sponsors in particular is whether the proposed framework will encourage continued digital health innovation, including through collaborations, or whether FDA’s proposal will create challenges that may discourage advances in this area.
AI Update: Medical Software and Preemption
In light of the rapidly expanding field of medical software technology, and its recognition that traditional approval mechanisms for hardware-based medical devices may not be well suited to regulating such technology, FDA is piloting a new, streamlined regulatory approach for digital health technologies. The initiative, currently a “working model” and…
ICO consults on privacy “regulatory sandbox”
Designing data-driven products and services in compliance with privacy requirements can be a challenging process. Technological innovation enables novel uses of personal data, and companies designing new data-driven products must navigate new, untested, and sometimes unclear requirements of privacy laws, including the General Data Protection Regulation (GDPR). These challenges are often particularly acute for companies providing products and services leveraging artificial intelligence technologies, or operating with sensitive personal data, such as digital health products and services.
Recognising some of the above challenges, the Information Commissioner’s Office (ICO) has commenced a consultation on establishing a “regulatory sandbox”. The first stage is a survey to gather market views on how such a regulatory sandbox may work (Survey). Interested organisations have until 12 October to reply.
The key feature of the regulatory sandbox is to allow companies to test ideas, services and business models without risk of enforcement and in a manner that facilitates greater engagement between industry and the ICO as new products and services are being developed.
The regulatory sandbox model has been deployed in other areas, particularly in the financial services sector (see here), including by the Financial Conduct Authority in the UK (see here).
Potential benefits of the regulatory sandbox include reducing regulatory uncertainty, enabling more products to be brought to market, and reducing the time of doing so, while ensuring appropriate protections are in place (see the FCA’s report on its regulatory sandbox here for the impact it has had on the financial services sector, including lessons learned).
The ICO indicated earlier this year that it intends to launch the regulatory sandbox in 2019 and will focus on AI applications (see here).
Further details on the scope of the Survey are summarised below.
UK Government publishes “Initial code of conduct for data-driven health and care technology” for consultation
On 5 September, in response to the opportunities presented by data-driven innovations, apps, clinician decision support tools, electronic health care records and advances in technology such as artificial intelligence, the UK Government published a draft “Initial code of conduct for data-driven health and care technology” (Code) for consultation. The Code is designed to be supplementary to the Data Ethics Framework, published by the Department for Digital, Culture, Media and Sport on 30 August, which guides appropriate data use in the public sector. The Code demonstrates the UK Government’s willingness to support data sharing to take advantage of new technologies to improve outcomes for patients and accelerate medical breakthroughs, while balancing key privacy principles enshrined in the GDPR and emerging issues such as the validation and monitoring of algorithm-based technologies. For parties considering data-driven digital health projects, the Code provides a framework to help conceptualise a commercial strategy before engaging with legal teams.
The Code contains:
- a set of ten principles for safe and effective digital innovations; and
- five commitments from Government to ensure the health and care system is ready and able to adopt new technologies at scale,
each of which is listed further below.
While the full text of the Code will be of interest to all those operating in the digital health space, the following points are of particular note:
- the UK Government recognises the “immense promise” that data sharing has for improving the NHS and social care system as well as for developing new treatments and medical breakthroughs;
- the UK Government is committed to the safe use of data to improve outcomes of patients;
- the Code intends to provide the basis for the health and care system and suppliers of digital technology to enter into commercial terms in which the benefits of the partnerships between technology companies and health and care providers are shared fairly (see further below); and
- given that artificial intelligence requires large datasets to function, two key challenges arise: (i) these datasets must be defined and structured in accordance with interoperable standards, and (ii) from an ethical and legal standpoint, people must be able to trust that data is used appropriately, safely and securely, as the benefits of data sharing rely upon public confidence in the appropriate and effective use of data.
The Code sets out a number of factors to consider before engaging with legal teams to help define a commercial strategy for a data-driven digital health project. These factors include: the scope of the project, term, value, compliance obligations and responsibilities, IP, liability and risk allocation, transparency, management of potential bias in algorithms, the ability of the NHS to add value, and the respective roles of the parties (which will require thinking beyond traditional research collaboration models).
How value is created and realised is a key aspect of any data-driven digital health project, and the Code identifies a number of potential options: simple royalties, reduced payments for commercial products, equity shares in businesses, and improved datasets. There is, however, no single or simple answer. Members of Covington’s digital health group have advised on numerous data-driven collaborations in the healthcare sector. Covington recently advised UK healthcare technology company Sensyne Health plc on pioneering strategic research and data processing agreements with three NHS Trust partners. Financial returns generated by Sensyne Health are shared with its NHS Trust partners via equity ownership in Sensyne Health and a share of royalties (further details are available here).
The UK Government also intends to conduct a formal review of the regulatory framework and to assess the commercial models used in technology partnerships in order to address issues such as bias, transparency, liability and accountability.
The UK Government is currently consulting on the Code (a questionnaire on the Code is available here) and intends to publish a final version of the Code in December.
Inside FDA’s Latest Digital Health Developments: Gottlieb Sees “Vast Potential” Ahead
On April 26, Commissioner Gottlieb addressed the agency’s progress on FDA’s Digital Health Innovation Action Plan and announced several additional steps the agency is taking to advance the potential benefits of digital health. Here is a recap of the key updates:
(1) Launch of New FDA Program to Apply Digital …
Covington Artificial Intelligence Update: House of Lords Select Committee publishes report on the future of AI in the UK
Reflecting evidence from 280 witnesses from the government, academia and industry, and nine months of investigation, the UK House of Lords Select Committee on Artificial Intelligence published its report “AI in the UK: ready, willing and able?” on April 16, 2018 (the Report). The Report considers the future of AI in the UK, from perceived opportunities to risks and challenges. In addition to scoping the legal and regulatory landscape, the Report considers the role of AI in a social and economic context, and proposes a set of ethical guidelines. This blog post sets out those ethical guidelines and summarises some of the key features of the Report.
FDA Outlines Updated Approach to Regulating Digital Health Technologies
On December 8, FDA addressed the agency’s evolving approach to digital health by issuing two new draft guidance documents: “Clinical and Patient Decision Support Software” (the “CDS Draft Guidance”) and “Changes to Existing Medical Software Policies Resulting From Section 3060 of the 21st Century Cures Act” (the “Software Policies Draft Guidance”). These draft guidances announce the agency’s initial interpretation of the health software provisions enacted as part of last year’s 21st Century Cures Act (the “Cures Act”).
Given the rapid pace of digital health innovation across the life sciences, technology and health care sectors, FDA guidance on these topics is critical. Here are a few key takeaways from the draft guidances:
- FDA’s initial interpretation of the Cures Act provision related to clinical decision support (CDS) software may lead to a fairly narrow carve-out—in other words, many cutting-edge CDS software functions could remain subject to FDA regulation.
- FDA’s draft guidances do not directly address dynamic digital health solutions, such as those that incorporate machine learning, artificial intelligence (AI), or blockchain.
- FDA has proposed an enforcement discretion approach for decision support software aimed at patients that generally parallels the regulatory approach for CDS software aimed at clinicians, even though patient decision software was not addressed directly in the Cures Act.
- Consistent with the Cures Act, FDA’s draft guidances reflect that many of the software functions that were previously subject to FDA enforcement discretion (i.e., not actively regulated as devices) no longer meet the definition of “device.”
- Significant for pharmaceutical companies, CDER joined one of the draft guidances, and that draft guidance makes clear that other FDA requirements may apply to digital health products disseminated by or on behalf of a drug sponsor beyond those outlined in the draft guidance.
FDA’s regulatory approach has a significant impact on the investment in and development of digital health solutions across the digital health ecosystem. Stakeholders should consider submitting comments to the agency to help shape the direction of FDA’s final guidances on these topics.
Digital Health Checkup (Part Three): Key Questions About AI, Data Privacy, and Cybersecurity

In the third installment of our series, Covington’s global cross-practice Digital Health team considers some additional key questions about Artificial Intelligence (AI), data privacy, and cybersecurity that companies across the life sciences and technology sectors should be asking to address the regulatory and commercial pieces of the complex digital health…