FDA drafts AI-enabled medical device life cycle plan guidance

The agency says it is proposing a science-based approach to requirements for medical devices powered by artificial intelligence and machine learning that would help deploy new devices faster.
By Andrea Fox


The Food and Drug Administration announced the availability of draft guidance that provides recommendations on life cycle controls in submissions to market machine learning-enabled device software functions.

WHY IT MATTERS

In the document, "Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence/Machine Learning-Enabled Device Software Functions," the FDA proposes to ensure that AI/ML-enabled devices "can be safely, effectively and rapidly modified, updated, and improved in response to new data," Brendan O'Leary, deputy director of the Digital Health Center of Excellence in the FDA's Center for Devices and Radiological Health, said in a March 30 announcement.

Predetermined Change Control Plans reviewed and agreed to by FDA would include:

  • Detailed description of the specific, planned device modifications.
  • Explanation of the methodology that would be used to develop, validate and implement those modifications.
  • Assessment of the benefits and risks of the planned modifications. 

FDA says the PCCP must also describe how information about modifications will be clearly communicated to users.

The agency explains that control plans are not just intended for the AI/ML-enabled software as a medical device – "but for all AI/ML-enabled device software functions."

According to O'Leary, the PCCP facilitates the rapid and regular improvement of AI/ML-enabled device performance across diverse populations.

"The approach FDA is proposing in this draft guidance would ensure that important performance considerations, including with respect to race, ethnicity, disease severity, gender, age and geographical considerations, are addressed in the ongoing development, validation, implementation and monitoring of AI/ML-enabled devices," he said.

Bradley Merrill Thompson, an attorney who is chairman of the board and chief data scientist of EBG Advisors, reviewed the guidance and told Healthcare IT News by email that he found it very useful.

"It will encourage innovation and the timely delivery of new medical technologies," he said.

However, developers will need to prepare for the burden. 

Beyond the "intricate plans" to prepare, when the changes are actually made, "the document requires the companies to develop an enormous amount of documentation going forward," he said. 

"Essentially, they will have to periodically write what amount to 'submissions,' but they just don't have to file them with FDA. All that documentation needs to be in their files should FDA come to inspect."

The FDA is accepting comments on the draft guidance through July 3.

THE LARGER TREND

Since artificial intelligence and any of its potential biases can impact clinical decisions, many believe building greater trust in machine learning models will be essential for their practical use in healthcare.

The different types of machine learning – supervised, unsupervised and reinforcement learning – each have their own strengths and weaknesses, according to Ittai Dayan, CEO and cofounder of Rhino Health, an analytics and AI platform vendor.

ML can be used for healthcare processes and to develop predictive models that can help healthcare providers anticipate patient outcomes and tailor treatments.

Health IT leaders can develop robust quality management systems for monitoring and documenting an algorithm's purpose, data quality, development process and performance, said AI expert Henk van Houten, chief technology officer at global IT vendor Royal Philips.

"As regulators have also recognized, continuous monitoring after market introduction will be necessary to ensure fair and bias-free performance," he told Healthcare IT News in a discussion about how bias can affect AI in healthcare.

"Ideally this would include finding a way to validate the representativeness of new learning data – in a way that respects ethical, legal and regulatory boundaries around the use of sensitive personal data." 

ON THE RECORD

"The approach the FDA is proposing in this draft guidance would put safe and effective advancements in the hands of healthcare providers and users faster, increasing the pace of medical device innovation in the United States and enabling more personalized medicine," said O'Leary.

"It opens up a whole range of possibilities for anticipating changes and avoiding multiple submissions for technology that does indeed evolve," Thompson said.

"Technically, FDA will permit automatic evolution in the anticipated direction, but I get the sense that the bar will be higher for anything that doesn't involve manual decision-making by the developer."

Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a HIMSS Media publication.
