Making In Vitro Chemical Data Useful for Decisions

By Pat Rizzuto

Feb. 26 — Scientists at federal agencies, chemical manufacturers and other institutions are working to combine information from in vitro chemical bioactivity tests and computer models with exposure estimates to provide useful information for regulatory agencies and others making decisions about chemicals.

Environmental researchers and risk assessors met at a workshop the Environmental Protection Agency hosted Feb. 17–18 to discuss ways regulatory agencies could use emerging types of information to decide which chemicals need further review; opportunities the information provides, such as dramatically reducing reliance on test animals; and barriers to the broader acceptance of these types of emerging toxicity prediction methods.

Barriers include both establishing scientific validity and “cultural” challenges, such as overcoming regulators' fear that the tests may overlook disease or that their environmental health decisions may face legal challenges for relying on promising but new science.

New Methods Being Used

Barriers or not, workshop participants said the new methods are here to stay.

“The horse is out of the barn,” John Wambaugh, a scientist with the Environmental Protection Agency, said of the advanced toxicity testing methods.

“In vitro,” or non-animal-based, data as well as “in silico” computer models already are being used, said Wambaugh, who works at EPA's National Center for Computational Toxicology.

A central challenge is to key in on refinements that should be made to advanced toxicity tests and communicate caveats about ways the tests should not be used, he said.

Warren Casey, director of a National Toxicology Program division that co-hosted the workshop, said the meeting will yield recommendations on best practices for conducting in vitro tests and for using the test data and other information to estimate internal concentrations of chemicals that could cause biological activity.

The need for such best practices prompted EPA and the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) to organize the February workshop, In Vitro to In Vivo Extrapolation for High Throughput Prioritization and Decision Making, held in Research Triangle Park, N.C.

Changing Cultures

The issues workshop participants discussed are part of a revolution in the types of data being generated for toxicology and chemical assessments, and in toxicity prediction models' growing dependence on information from health researchers about the biological changes that can lead to disease.

Studies of test animals have served as the backbone of chemical safety evaluation and health risk policies for decades.

Yet, in its 2007 report, “Toxicity Testing in the 21st Century: A Vision and a Strategy,” the National Academies of Sciences, Engineering, and Medicine laid out a strategy for using human cells, tissues and organs to make toxicity tests better predictors of how chemicals might affect people.

The new advanced tests can be threatening both scientifically and to the culture of regulatory agencies. Many regulators are comfortable with chemical tests in living systems because those tests can reveal a spectrum of key health effects that regulators fear narrower, non-animal tests might miss.

Nevertheless, the speed, ease and low cost of cellular tests, combined with what many scientists say is better, more relevant data, have prompted the BASF Corp., the Dow Chemical Co., the Procter & Gamble Co. and other chemical manufacturers to invest millions in in vitro research.

Converting Doses

Participants at the workshop, including Dow and Procter & Gamble Co. scientists, described ways to convert doses of chemicals that cause a biological change in an advanced cell-based test to an estimated concentration of a chemical in a person that would cause a biological effect. These biological effects, such as a signal that indicates the cell is stressed, may be temporary adjustments to changed conditions or early indicators of a disease state.

Health officials, risk assessors and other interested parties could then compare the estimated internal, biologically active dose with the amount of a chemical to which a population is known or predicted to be exposed in the environment.

Richard Becker, a senior toxicologist at the American Chemistry Council, told Bloomberg BNA after the workshop that in vitro to in vivo extrapolation, or IVIVE, puts cellular test data in context.

IVIVE calculations take the results from automated toxicity screens and combine them with “toxicokinetic” information on how the body absorbs, distributes, breaks down and eliminates a chemical.

Such data allow scientists to estimate internal concentrations of a chemical that cause or do not cause biological activity.

If the margin is large—meaning people are exposed to far less of a chemical than would be needed to trigger biological activity—that chemical may not be a priority for further review by an agency, he said.

However, with a narrow margin, closer scrutiny could be warranted.
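
A minimal sketch of that priority-setting arithmetic, assuming a hypothetical one-compartment, steady-state toxicokinetic model: the AC50, the steady-state concentration per unit dose and the exposure estimate below are invented for illustration, not measured values.

```python
# Illustrative IVIVE margin calculation under a hypothetical steady-state
# toxicokinetic model. All numbers are invented for illustration.

def oral_equivalent_dose(ac50_uM, css_uM_per_mg_kg_day):
    """Reverse dosimetry: convert an in vitro bioactive concentration
    (AC50, in micromolar) into the daily oral dose (mg/kg/day) predicted
    to produce that blood concentration at steady state."""
    return ac50_uM / css_uM_per_mg_kg_day

def margin(oed_mg_kg_day, exposure_mg_kg_day):
    """Ratio of the bioactivity-equivalent dose to estimated exposure.
    A large ratio suggests the chemical is a lower priority for review."""
    return oed_mg_kg_day / exposure_mg_kg_day

# Hypothetical chemical: assay AC50 of 5.0 uM, a model predicting 1.5 uM
# in blood per 1 mg/kg/day ingested, and estimated population exposure
# of 0.001 mg/kg/day.
oed = oral_equivalent_dose(5.0, 1.5)          # ~3.33 mg/kg/day
print(f"Oral equivalent dose: {oed:.2f} mg/kg/day")
print(f"Margin: {margin(oed, 0.001):,.0f}")   # ~3,333 -> wide margin
```

With a margin in the thousands, as in this invented case, the chemical would likely fall low on a review list; a margin near 1 would flag it for closer scrutiny.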

The point, said Becker, is that the IVIVE concentration coupled with exposure measurements provides information decision makers can use. “It's very useful for priority-setting,” Becker said.

IVIVE information could help the EPA prioritize which of the tens of thousands of chemicals in commerce should be scrutinized first for potential adverse health or ecological risks.

Best Practices Needed

Using IVIVE information to prioritize chemicals requires a thorough understanding of the protocols laboratories are using, the assumptions analysts are making and other details that can affect the interpretation of in vitro test data and computer models, Casey told Bloomberg BNA Feb. 16.

To prepare in vitro tests, laboratory equipment or individual technicians place specific concentrations of chemicals into cells held in plastic trays with tiny wells. One lab may assume 100 percent of the chemical is available to interact with the cells, while another lab may assume a lesser amount is “biologically available” because some sticks to proteins or to the plastic well, Casey said.

The different assumptions would lead to different conclusions about the chemical concentrations that caused, or failed to cause, disease-related activity.

“It's not that those approaches are right or wrong, but we need to understand their implications,” he said.
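
As a hypothetical illustration of those implications, the snippet below applies two assumed bioavailable fractions to the same nominal well concentration; the fractions and concentrations are invented, not lab-measured.

```python
# Hypothetical illustration: the same nominal well concentration implies
# different bioactive concentrations under different assumptions about
# how much chemical binds to proteins or the plastic well.

def effective_concentration(nominal_uM, bioavailable_fraction):
    """Concentration assumed free to interact with the cells."""
    return nominal_uM * bioavailable_fraction

nominal = 10.0  # micromolar added to the well
for lab, fraction in [("Lab A (assumes 100% available)", 1.0),
                      ("Lab B (assumes 40% available)", 0.4)]:
    print(f"{lab}: {effective_concentration(nominal, fraction):.1f} uM")
# Lab A would attribute any observed activity to 10.0 uM; Lab B to 4.0 uM,
# a 2.5-fold difference in the concentration reported to cause activity.
```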

It's imperative to understand the implications, because high throughput tests, computer modeling and IVIVE calculations increasingly are going to be used, he said. “This field is going to start exploding.”

NICEATM, therefore, wants to bring together key scientists involved with the development and use of in vitro tests and computer models to discuss best practices, he said.

The National Institute of Environmental Health Sciences and NICEATM plan to use knowledge gleaned from the workshop immediately, he said.

They'll also prepare a workshop report, which should be available in about six months.

Issues, Questions Raised at Workshop

Workshop participants split into breakout groups to discuss best practices for their work; the information researchers should share to help other scientists understand how results were generated; and strategies to validate in vitro assays and IVIVE calculation methods. That groundwork would allow researchers and decision-makers to understand both the applications and the limitations of the assays and methods.

Other participants joined in a breakout group that discussed the use of IVIVE information. Questions they explored included:

  • Who are the stakeholders, and what are their needs?
  • What are the training needs for industry and for regulators?
  • What regulatory or other programs could use IVIVE information?
  • What barriers exist to the use of the information?

Fear of Lawsuits

During one breakout session, a participant said EPA regional officials were resistant to using in vitro chemical data at Superfund sites out of concern they couldn't scientifically defend the data under a key Supreme Court precedent.

The standard for admitting expert testimony set in the Supreme Court decision, Daubert v. Merrell Dow Pharmaceuticals, Inc., is broadly applied.

Trial judges use the standard to decide whether an expert's scientific testimony is based on reliable and relevant methods.

Factors a judge is to consider include whether the theory or technique in question has been tested, whether it has been peer reviewed, whether it is governed by standards and whether it is widely accepted within the relevant scientific community.

The question of whether a method has “widespread acceptance within the scientific community” is the easiest point to attack and is relevant to in vitro methods, the participant said.

EPA Program Builds Confidence

Diverse parties need to have confidence in toxicity tests, or “assays,” and computer models, Becker said.

The EPA's Endocrine Disruptor Screening Program developed a peer-reviewed scientific information package that gave the agency and the regulated community confidence in three advanced high-throughput toxicity tests, Becker said.

He referred to the agency's announcement in September 2015 that it would allow companies to submit data on a chemical's ability to mimic or interfere with estrogen, the female reproductive hormone (80 Fed. Reg. 35,350).

Jim Jones, EPA's assistant administrator for chemical safety and pollution prevention, has said the agency will allow the use of additional high-throughput tests for other endocrine effects by 2017.

“It's not rocket science, just good science,” Becker said.

    Sharing Information

    During the workshop, participants said building such confidence in other uses of in vitro data and IVIVE information will require agencies establishing “safe harbors,” or situations where companies can share their information without being penalized.

    That idea deserves further exploration, Becker said.

    The biological activity a chemical may cause does not equate to that chemical having caused an adverse effect, he said.

    Some type of safe harbor strategy would allow regulators and the regulated community to discuss such issues without sanctions, he said.

    The workshop group that focused on risk assessment and other uses of IVIVE information discussed programs at the EPA, the Food and Drug Administration and possibly other agencies that might benefit from information generated by in vitro tests, computer models and IVIVE calculations.

    Programs at the Food and Drug Administration and EPA's air toxics, Superfund and new chemicals division—which often lacks toxicity data about new compounds—will likely be affected by the new tests.

Near-Term Application

Stakeholders include not only regulatory agencies and the regulated community, but also communities that could be given results from in vitro tests about a chemical of concern in their neighborhood.

Becker said there must be scientific support for any program's use of information coming from in vitro tests, computer models and IVIVE calculations.

At present, the most scientifically supportable use of these technologies is for “read across” purposes, meaning applying toxicity or other information about one chemical to a similar chemical that lacks data. Such “read across” techniques will be the subject of a March 1 workshop hosted by the FDA in College Park, Md.
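
A bare-bones sketch of the read-across idea, with loudly hypothetical numbers: the similarity scores, threshold and toxicity values below are invented, and real read-across rests on expert-justified structural and metabolic similarity rather than a single numeric cutoff.

```python
# Hypothetical read-across sketch: fill a data gap for a target chemical
# with the measured value from its most similar, data-rich analogue.
# All similarity scores and toxicity values are invented.

analogues = {
    # name: (similarity to target, 0-1 scale; measured NOAEL, mg/kg/day)
    "analogue_1": (0.92, 50.0),
    "analogue_2": (0.67, 120.0),
}

def read_across(candidates, min_similarity=0.85):
    """Return the toxicity value of the most similar analogue clearing
    the similarity threshold, or None if no analogue qualifies."""
    qualifying = [(sim, value) for sim, value in candidates.values()
                  if sim >= min_similarity]
    return max(qualifying)[1] if qualifying else None

print(f"Read-across NOAEL for the data-poor target: "
      f"{read_across(analogues)} mg/kg/day")  # 50.0, from analogue_1
```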

To contact the reporter on this story: Pat Rizzuto in Washington at prizzuto@bna.com

To contact the editor responsible for this story: Larry Pearl at lpearl@bna.com

For More Information

Information about the IVIVE workshop, including presentations from four webinars that led up to the meeting, is available at http://src.bna.com/cUW. Information on the “Read Across” workshop hosted by the FDA March 1 is available at http://src.bna.com/cU5.