Reliable Next-Gen Data May Avert FDA Review, Shuren Says

Life Sciences Law & Industry Report connects the dots among the many disciplines that make up the burgeoning life sciences industry, with biweekly updates on current regulatory, legislative,...

By Jeannie Baumann

Oct. 14 — Formal FDA review may not be needed for interpreting results of advanced genetic screening tests if the associated databases contain reliable information, the agency's device chief said.

Jeffrey E. Shuren, director of the Center for Devices and Radiological Health, made the remarks while explaining the Food and Drug Administration's plan to regulate next-generation sequencing tests. These diagnostic tests, which can screen an entire genome to detect variations linked to diseases, are considered critical to the White House's precision medicine initiative.

“The linchpin for precision medicine—getting the right treatment to the right patient at the right time—is the diagnostic,” Shuren said during an Oct. 13 congressional briefing on the topic.

If these tests aren't accurate enough, or if the measurements aren't meaningful for treatment decisions, “then the patient is either going to get the wrong treatment, or they’re not going to get the treatment when they should.”

Draft Guidances

In July, the agency released two draft guidances that together propose a framework for how the FDA wants to regulate next-generation sequencing tests. One of the documents addresses clinical validity, or determining how meaningful the test measurements are in clinical practice.

The FDA couldn't apply the same approach to next-generation sequencing tests as it does to traditional diagnostics, Shuren said, because conventional diagnostics pair one test with one disease. But a next-generation test that screens the entire human genome could be examining 3 billion base pairs containing some 3 million genetic variants.

“Let’s say you’re 99 percent accurate in what you’re measuring. That’s great. That means you’re only wrong probably about 30,000 times, if you were looking at the whole genome,” Shuren said. “Now let’s say you want to go ahead and look at the accuracy of all the differences that you’re measuring. Well the idea that you’d have to demonstrate it for every little thing that you looked at would make it infeasible to do. We can’t do that.”
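Shuren's figure can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below assumes, as his numbers imply, that the 1 percent error rate applies to the roughly 3 million variant calls rather than to all 3 billion base pairs; the variable names are illustrative, not drawn from any FDA methodology:

```python
# Back-of-the-envelope check of the "wrong about 30,000 times" figure.
# Assumption: the 1 percent error rate applies per variant call, not
# per base pair.
variants = 3_000_000        # genetic variants in a whole-genome screen
accuracy = 0.99             # hypothetical per-variant accuracy

wrong_calls = variants * (1 - accuracy)
print(f"Expected erroneous variant calls: {wrong_calls:,.0f}")
```

At 99 percent accuracy across 3 million variants, the expected error count comes out to about 30,000, matching Shuren's remark; applied to all 3 billion base pairs, the same error rate would imply 30 million errors, which is why the per-variant reading is the one consistent with his quote.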

Facing Challenges

Another challenge with genetic variants, Shuren said, is that many of them are uncommon.

“So it’s very hard, sometimes impossible, for a single organization or institution to have sufficient data to really know what that variant means,” he said, adding that the agency would like to pool that information and build confidence in using those data.

The guidance on clinical validity also includes recommendations on how to assure the quality of those data when they're provided to a database. The FDA proposed a consensus-building approach to both sharing and curating data so that the data are reliable.

“What we’ve said is we’d like to work with the clinical community on developing the standards for interpretation so that we can rely on the interpretations by the owners of the database or by the clinical community,” Shuren said, “and eventually maybe even get to a point where we don’t need formal FDA review.”

“What you have is our engagement with the clinical community on standards, crowdsourcing evidence and having common rules of the road on interpretation, so that the regulatory paradigm syncs lock-step with the science as it develops,” Shuren said.

FDA Oversight of Tests

Many of these next-generation tests are developed in a single laboratory, and the FDA doesn't regulate laboratory-developed tests; the Centers for Medicare & Medicaid Services has jurisdiction under the Clinical Laboratory Improvement Amendments program.

The FDA proposed regulating LDTs in draft guidance issued more than two years ago. The move has been controversial: laboratory groups staunchly oppose the proposal, while the diagnostics industry has embraced it. Lamar Alexander (R-Tenn.), chairman of the Senate health committee and a leader of the Senate's companion effort to 21st Century Cures (H.R. 6), also criticized the FDA's plan as too costly in a recent hearing (10 LSLR 19, 9/30/16).

John Iafrate, an associate in pathology at Massachusetts General Hospital, said during the Oct. 13 briefing his institution would have to hire two to three full-time regulatory compliance professionals if the FDA regulates LDTs.

“Why would I need to perform a very expensive, large clinical trial to show my test predicts” the correct mutation accurately? he asked.

“Imposing a regulatory framework over all LDTs is going to be extremely expensive,” he said. “In the end, there will be a definite increased cost to the hospitals, to the payers and patients.”

Misinformation

Shuren responded: “There’s a lot of misinformation about what FDA requires for assessing tests.”

He agreed that there's no need to run expensive clinical trials when a test's use is well understood and many laboratories are developing the same test. “In fact, for most tests they’re actually not even doing clinical studies. A lot of times it’s information that’s out there in the scientific literature we can use, or there are tests that have been run on specimens that have been collected. It’s not this bar that a lot of people have painted out there.”

Shuren told Bloomberg BNA he doesn't know when the FDA may take final action on the 2014 proposal to oversee LDTs.

The pathology lab at Massachusetts General runs more than 1 million tests per year, Iafrate said. Most of these tests have received FDA approval, but he said thousands of LDTs haven't. A study conducted by Friends of Cancer Research and the Deerfield Institute, which hosted the briefing, found LDTs and FDA-regulated tests are often used in the same setting. The briefing was titled “The Future of Precision Medicine & Patient Care: Current Landscape in Genomic Testing.”

Consistent Standards Helpful

Jeff Allen, president and chief executive officer of Friends of Cancer Research, said that because these tests are subject to different requirements, there could be variability in their performance. And that variability may not be captured “because we’re not holding them all to the same playing field.”

“It seems that holding these tests to the same standards before they go out into practice would be an assurance that people would certainly feel better about,” he said. Patients, clinicians and pathologists would know “that no matter the tests or where it was conducted, they could rely upon the results.”

Iafrate said the non-FDA-approved tests tend to be higher-volume but are important, as more than half of them are related to genomics. Running an FDA-approved test is “relatively straightforward” because pathologists follow the package insert.

For LDTs, “I actually have to use my training to perform a full validation on that test to use high quality controls,” Iafrate said, adding that every time the hospital undergoes an inspection, inspectors look at all the LDTs as well. “Just because they’re LDTs it doesn’t mean there’s anything bad about that, or that they’re necessarily dangerous. It can be, but it’s not necessarily the case.”

To contact the reporter on this story: Jeannie Baumann in Washington at jbaumann@bna.com

To contact the editor responsible for this story: Randy Kubetin at rkubetin@bna.com

For More Information

More information on the briefing is at http://src.bna.com/joQ.

The 2014 guidance document on LDTs is at http://src.bna.com/joz.

The next-gen sequencing draft guidances are at http://src.bna.com/jox.

Copyright © 2016 The Bureau of National Affairs, Inc. All Rights Reserved.