01.27.2021


Updates

On January 12, 2021, the U.S. Food and Drug Administration (FDA) released the Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan (Action Plan). The Action Plan expresses that the “FDA’s vision is that, with appropriately tailored total product lifecycle-based regulatory oversight, AI/ML-based Software as a Medical Device (SaMD) will deliver safe and effective software functionality that improves the quality of care that patients receive.”

The FDA currently regulates and has approved “locked” SaMD utilizing AI/ML, but it has struggled to determine the appropriate regulatory approach for “adaptive” AI/ML SaMD that learns and evolves from real-world inputs over time. The Action Plan is the FDA’s step forward in addressing this challenge.

Led by the FDA Center for Devices and Radiological Health’s (CDRH) Digital Health Center of Excellence, the Action Plan expands on concepts outlined in the 2019 Discussion Paper, “Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD),” which describes a potential approach to premarket review for AI/ML-driven software modifications. The Action Plan highlights, in part, the FDA’s desire to involve stakeholders in these proposals, and the agency has indicated that it welcomes industry feedback on the plan.

The FDA’s roadmap consists of the following five major components:

1. Tailored Regulatory Framework for AI/ML-based SaMD. The FDA intends to issue draft guidance in 2021 for public comment on the Predetermined Change Control Plan, including a content proposal for SaMD Pre-Specifications (SPS) and the Algorithm Change Protocol (ACP), what types of modifications are appropriate, and the specifics of the focused review process. The Predetermined Change Control Plan is an option allowing a manufacturer to submit a plan for modifications to an AI/ML SaMD during the initial premarket review, which the FDA would then review to ensure the modifications are reasonably safe and effective.

2. Good Machine Learning Practice (GMLP) Development. The FDA will encourage further harmonization of Good Machine Learning Practice (GMLP) development by increasing its participation in various communities dedicated to its creation, including the International Medical Device Regulators Forum (IMDRF) Artificial Intelligence Medical Devices (AIMDs) Working Group and the International Organization for Standardization/International Electrotechnical Commission Joint Technical Committee 1/Subcommittee 42 on Artificial Intelligence (ISO/IEC JTC 1/SC 42). The FDA will pursue these GMLP efforts alongside its Medical Device Cybersecurity Program.

3. Patient-Centered Approach Incorporating Transparency. The FDA will host a public workshop on manufacturer disclosures about how AI/ML SaMDs interact with people, focusing on transparency to users and patients. During the workshop, the agency will share findings from its Patient Engagement Advisory Committee meeting held in October 2020 to help address how device labeling supports user trust and transparency in AI/ML devices. Feedback from the public workshop will then be used to recommend the types of information a manufacturer should include in its labeling.

4. Regulatory Science Methods Related to Algorithm Bias and Robustness. The FDA will expand its research to identify and eliminate algorithmic bias in AI/ML SaMDs. Through cooperation with research organizations around the country, the FDA is expanding regulatory science research efforts to develop methods for improving ML algorithms, including ways to eliminate bias and increase generalizability so that an algorithm is well-suited for a racially and ethnically diverse patient population. Because AI/ML technologies are often trained on historical datasets, they are particularly vulnerable to bias, and given their anticipated role in healthcare, it is vital to ensure that AI/ML technology can function appropriately for a diverse population.

5. Real-World Performance (RWP). The FDA will support a voluntary pilot program with stakeholders to provide additional clarity on what a real-world evidence generation program should look like. The pilot will help develop a framework for gathering and validating RWP parameters, metrics, and evaluations for AI/ML-based SaMD in the real world, such as responding to safety or usability concerns or incorporating user feedback.

Feedback regarding the Action Plan may be submitted through the public docket (FDA-2019-N-1815) at www.regulations.gov.

© 2021 Perkins Coie LLP
