Welcome back. I'm going to talk now about the application of quantitative in vitro to in vivo extrapolation to in vitro testing, and these are good examples of how it can be done. The first study I'm going to talk about was a joint effort between the Hamner Institutes and the EPA's National Center for Computational Toxicology, through their ToxCast program. They were collecting information on the potential toxicity of hundreds of chemicals using hundreds of assays, and were coming up with bioactive concentrations for these chemicals. In other words, what concentration of the chemical in the media caused a hit on a particular test? We suggested to them that what might be more useful would be if they could predict an oral equivalent dose. In other words, what human exposure, equivalent to that concentration, would result in a hit on that assay? They liked that idea, and so we put together this analysis on the chemicals, which required doing analytical chemistry on all those hundreds of chemicals. It kept our analytical chemists quite busy for several years, but it was possible to do. We obtained metabolism data on all these chemicals, which involves putting them into an assay with hepatocytes, following their decrease in concentration over time, and from that estimating the rate of metabolism. We also measured plasma protein binding to get a fraction unbound, which could be used to estimate renal clearance, as I mentioned, by using the filtration rate. We then conducted the in vitro to in vivo extrapolation, using the hepatocellular clearance to predict in vivo clearance by the liver. And then we used something I called reverse dosimetry years ago, when we were first estimating human exposures from blood levels measured as biomarkers. We were doing the same thing now, except we were using the concentrations in the in vitro assays instead of the blood levels from the biomonitoring studies.
And so we can then convert the in vitro assay concentration to an equivalent exposure; in this case, we were looking for an equivalent oral exposure. In order to check whether we actually had any hope of predicting the in vivo clearance using this very simple method, we looked for chemicals that had been studied in humans. Out of the hundreds of chemicals they had looked at, we were only able to find about a dozen that had been tested in humans. For each of those, we used our simple method, shown in the second column: estimating the steady state concentration you would achieve if you were exposed to that chemical at a constant daily rate, in milligrams per kilogram per day. We then ran two different approaches for doing the in vitro to in vivo extrapolation, shown in the two columns on the right. In one case, we assumed that only the free fraction of the chemical in the blood could be metabolized. This is what pharma generally assumes when they're looking at drugs. In the other case, the right column, we assumed that all of the chemical in the blood was available to be metabolized, whether it was bound or not. That assumption is actually more generally used with environmental chemicals, because their properties are different and they don't bind as avidly as drugs often do. It turned out that the comparison was pretty good for the non-restrictive approach, in other words, assuming all of the chemical is available. The red numbers are those that don't match up with the in vivo estimate, and you can see the restrictive approach doesn't work as well. There are seven chemicals that were badly misestimated using the restrictive assumption, but only four using the non-restrictive. And of those, we actually know the reason why a particular chemical wouldn't match up.
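As a sketch, the steady-state calculation and the restrictive versus non-restrictive assumption can be written like this. The parameter defaults (90 L/h liver blood flow, 6.7 L/h glomerular filtration rate, 70 kg body weight) are illustrative adult values I am assuming, not the study's exact numbers.

```python
def css_steady_state(dose_mg_kg_day, fu, cl_int_lph, restrictive=False,
                     q_h=90.0, gfr=6.7, body_kg=70.0):
    """Steady-state plasma concentration (mg/L) for a constant oral dose,
    using the simple model described in the text: renal clearance is
    glomerular filtration of the free fraction, and hepatic clearance
    comes from the well-stirred model."""
    # Restrictive: only the free fraction is metabolized; non-restrictive:
    # all chemical in blood is available to the liver.
    fu_eff = fu if restrictive else 1.0
    cl_hepatic = q_h * fu_eff * cl_int_lph / (q_h + fu_eff * cl_int_lph)  # L/h
    cl_renal = gfr * fu                                                   # L/h
    dose_mg_h = dose_mg_kg_day * body_kg / 24.0
    return dose_mg_h / (cl_hepatic + cl_renal)
```

For a highly bound chemical, the restrictive assumption yields a much smaller clearance and therefore a much higher predicted steady-state concentration, which is exactly where the two columns in the table diverge.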
Perfluorooctanoic acid is very avidly resorbed in the kidney because the body mistakes it for a fatty acid that it wants to retain. And oxytetracycline dihydrate is an antibiotic that is very, very poorly absorbed. So those are areas where we have to bring in more than just clearance in order to estimate in vivo conditions. But overall, for a high throughput screening process, the EPA was satisfied that this was worth doing. And so this shows the study that was performed. On the left you have what we did at the Hamner: we had toxicokinetic parameters to obtain, the plasma protein binding and the metabolic stability, or metabolism rate, and we used those to perform the in vitro to in vivo extrapolation. Meanwhile, on the right, the EPA performed 398 in vitro ToxCast assays on hundreds of chemicals to get AC50 values, that is, 50% effective concentrations, and from those estimated a bioactivity concentration. We then predicted the oral equivalent dose for that concentration. And the EPA did some analysis to estimate what would be an upper level of expected human exposure, so that we could try to determine a rough margin of safety. The point at which the EPA felt that incorporating metabolism and doing IVIVE was important was when they saw results like these. This is an example for a variety of different chemicals using a variety of tests, but all of them came out with the same value for the bioactive concentration, 1.481 micromolar. However, when we did the inversion to the equivalent oral dose in the human, these same chemicals varied in potency by more than an order of magnitude. It goes from about 0.0266 to 0.758, a factor of about 30 difference in the actual potency of these chemicals if they were in the human diet.
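The reverse-dosimetry inversion itself is just a linear rescaling; a minimal sketch, in which the molecular weight and Css values are made-up inputs for illustration rather than numbers from the study:

```python
def oral_equivalent_dose(bioactive_um, mol_weight_g_mol, css_mg_l_per_unit_dose):
    """Reverse dosimetry: the constant oral dose (mg/kg/day) whose predicted
    steady-state plasma concentration equals the in vitro bioactive
    concentration. Assumes Css scales linearly with dose, so the Css
    predicted for a 1 mg/kg/day exposure is enough to do the inversion."""
    bioactive_mg_l = bioactive_um * mol_weight_g_mol / 1000.0  # uM -> mg/L
    return bioactive_mg_l / css_mg_l_per_unit_dose

# The same in vitro hit maps to very different oral doses when the
# predicted Css per unit dose differs between chemicals:
oed_fast_clearance = oral_equivalent_dose(1.481, 200.0, 0.05)  # rapidly cleared
oed_slow_clearance = oral_equivalent_dose(1.481, 200.0, 1.50)  # slowly cleared
```

This is why identical bioactive concentrations can correspond to potencies spread over more than an order of magnitude once toxicokinetics are taken into account.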
So, at that point, the EPA realized that they weren't going to be able to just use in vitro concentrations to prioritize testing. And so now they are actually developing their own ability to do this kind of simple IVIVE, or interpretation, of which chemicals are of greatest concern for further testing. This is a figure from the first paper we published together with the EPA on this study, and it makes a little clearer the difference that considering metabolism makes. What's shown here are box and whisker plots showing the range of concentrations at which the hundreds of tests were positive; the whiskers show the range. There are only a few tests that are positive down in the lower part of this diagram. Across the bottom are the chemicals that were actually tested; this is a subset of all the chemicals tested, so that they could fit on the plot. The green dots are estimates of the maximum exposure that the EPA estimated people might have. If you look at the top left diagram, what you see is about four orders of magnitude difference in the daily oral equivalent dose at which a chemical is found to be active. The chemicals on the right are the least potent, and the ones on the left are the most potent, meaning they have effects at the lowest daily dose. If you plot those same chemicals in the same order against their bioactivity concentration in vitro, you see there's really not much differentiation; it's fairly flat. As it turns out, this is a matter of chemistry. Most chemicals tend to interact with cells at about the same concentrations, unless they have a very specific binding to a particular protein. Actually, one of the things that was learned in ToxCast is that it's important to separate chemicals that have non-specific toxicity from those that have a specific toxicity, such as binding to a receptor.
Another study we were involved in was with the Institute for Risk Assessment Sciences at Utrecht University, and also with RIVM, the Dutch National Institute for Public Health and the Environment. In that case, we took chemicals that had been well studied and pretended that we didn't know whether, or how, they were toxic. Then we did an in vitro based risk assessment to see how well we did. We're still in the process of analyzing these data, but I can show you some of the things we did find. In part one, we used QSAR to predict target tissue, and then compared with what was known for the chemicals in terms of their target tissue toxicity. We also used QSAR to predict metabolites, and looked both at whether those metabolites would come up positive in a QSAR prediction of toxicity and at whether the predicted metabolites were the ones that have been found for these chemicals in vivo. That part has been done. The second part, which is still in progress, is to actually do in vitro toxicity assays on these chemicals, or find assays already performed in the literature, and then conduct the in vitro to in vivo extrapolation as I've described so far, and compare with the toxicity values used in regulatory decision making, such as reference concentrations and reference doses. So that's the part that's still in progress. What we found about the prediction of metabolites was that the OECD Toolbox, which is quite widely used and includes a metabolism prediction program, was able to correctly predict the primary metabolite responsible for toxicity in 9 of the 12 chemicals investigated in this study whose toxicity is due to a metabolite. So that's actually reasonably good. There were a few misses, as you can see, three cases, but overall it shows real promise.
However, in some cases the Toolbox predicted a large number of other metabolites that have not been detected in vivo. The problem, if you were going to take the output from the Toolbox and try to investigate all those metabolites, is that you would be wasting a lot of effort. So it's important that we improve the process of predicting metabolites in order to focus on those that really are produced in vivo. The prediction of nontoxic or low yield metabolites makes the process of investigating metabolite toxicity more difficult and time consuming, and that's not something we can afford. On the other side of part one of our work, we were looking at metabolism predictions using in vitro data, drawing on studies that had been published on the in vitro metabolism of these chemicals. You can see the chemicals listed here: trichloroethylene (TCE), acetaminophen, bisphenol A, nicotine, coumarin, parathion, and warfarin. They were chosen because there was a lot of data, because they're toxic, and because they represent different sets of properties. I'll explain the plot. On the y-axis, we have the concentration at steady state predicted from the in vivo data that was actually available on these chemicals. On the x-axis, along the bottom, we have the concentration at steady state for the same exposure predicted from in vitro data, using the in vitro to in vivo extrapolation to estimate the in vivo steady state. The dotted lines are a factor of three from one-to-one agreement. So, basically, there's a reasonably good ability to predict the in vivo clearance using in vitro data. And for the outliers, again, we know why they are outliers. In one case, parathion is metabolized in the gut prior to getting to the liver, and we were not modelling intestinal metabolism with the method we were using.
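The factor-of-three agreement criterion from the plot is easy to state in code; a small sketch of how such a comparison might be scored:

```python
def fold_error(predicted, observed):
    """Symmetric fold difference between a prediction and an observation,
    always >= 1 regardless of which value is larger."""
    return max(predicted / observed, observed / predicted)

def within_factor_of_three(predicted, observed):
    """True if the point falls between the dotted 3x lines on the plot."""
    return fold_error(predicted, observed) <= 3.0
```

Using a symmetric fold error avoids treating a three-fold overprediction differently from a three-fold underprediction, which matters when agreement is judged on a log-log plot.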
It is possible to do that, and if one did, it would give a better prediction. In the other case, warfarin binds very avidly to proteins in the plasma, and we were not considering restrictive clearance in this study. So that again shows that it is an important thing to consider. That concludes the section on approaches to in vitro to in vivo extrapolation. In the next section, we'll get into challenges for conducting quantitative in vitro to in vivo extrapolation, and the research that's really needed in order to improve what we can do.