Welcome back. I'm now going to get into the question of what quantitative in vitro to in vivo extrapolation is and, in particular, what needs to be done to make it truly quantitative rather than conveying information that's not completely accurate.

So this is a description of the process by which we try to do risk assessment using in vitro data. It was developed at a meeting of the Transatlantic Think Tank, which is a Center for Alternatives to Animal Testing activity that tries to move things forward by getting scientists together to discuss them. This approach, like most approaches now being developed, starts with exposure assessment. The first thing to do, if you're concerned about whether a chemical can have an effect, is to determine just how much exposure people could have. Is this something that will only be used in the workplace as a chemical intermediate? Or is this something that could potentially be released? Is it something that everybody's going to have in their cereal? That determines what levels people might be exposed to and how many people might be exposed, and it drives the overall formulation of a risk assessment focused on the actual concerns that there might be.

The risk assessment process then begins with a literature review to find out as much as possible about the chemical. Sometimes it's possible to evaluate risk right at that point: if it's something with very low exposure potential, so that for it to be of concern it would have to be the most potent chemical ever discovered, then some decisions can be made about whether it's okay to have that chemical in the environment. In many cases, though, it's necessary to derive new information about the chemical. Sometimes it's possible to do read-across, where you look at a similar chemical and say, well, this chemical looks a lot like that one, so it's likely to have similar toxicity. To support that, there are programs that look at the structural, physical-chemical properties and reactivity of a chemical: quantitative structure activity relationship modeling and quantitative structure property relationship modeling, looking at how the structure informs what the nature of the toxicity might be and the kinds of studies that will be needed.

At the next point in this stepwise approach, we have to determine biokinetic behavior. What that means, primarily, is metabolism studies. This is one of the more nontraditional recommendations of the group that put together this proposed risk assessment approach: to actually do metabolism studies on a chemical before doing a battery of tests to look for its toxicity. That is, of course, the opposite of the way things happen in the high-throughput testing being developed now in places like the EPA and NIH. There, they test the chemical first, and then, if it gets flagged during the testing, they look at its metabolism in order to determine what an equivalent dose in the human would be. But the problem is that many chemicals are only toxic when metabolized. So if you're doing tests without knowing that a particular chemical is metabolized in a way that could produce a more toxic metabolite, that's something you should know before you actually do the battery of tests. And so this is really more of a medium-throughput approach.
The high-throughput approaches have their own separate issues: just because of the time available to perform the tests, incorporating metabolism is very difficult. There is work going on at both the EPA and NIH looking at whether it's possible to incorporate metabolism into the high-throughput systems.

After designing the tests that are appropriate and the conditions of the testing, including whether some sort of metabolism should be considered, we conduct the studies, and we do them in concentration-response. That is, we test a series of concentrations of the chemical so that you can determine the lowest effective concentration, or the EC10, or something similar that would then be used as a point of departure in the risk assessment. In some cases it's necessary to do more sophisticated modeling: if the biomarker used in a particular in vitro assay doesn't really correspond one-for-one with an outcome of the pathway that is disrupted, then some sort of computer modeling is needed to determine the point of departure. But once that point of departure is determined, the next step is in vitro to in vivo extrapolation to obtain either a daily ingested dose or a concentration in the air that's breathed that corresponds to that concentration in vitro. At that point, one can also look at questions about whether there's a portion of the population that might be more sensitive, because they have a slow form of a particular metabolizing enzyme that's important for that chemical, or, in the case of children, because a particular metabolizing enzyme hasn't come up to adult levels yet and that could affect their susceptibility.

This diagram shows the elements of quantitative in vitro to in vivo extrapolation. In the middle is the biokinetic model. The biokinetic model helps us understand what's going on in the in vitro assay, and also how to translate it to an in vivo human toxicity estimate. At the top, the model needs inputs, as I mentioned, parameters, and some of those parameters can be estimated by quantitative structure property relationships. Partition coefficients in particular, the tendency of a chemical to collect in one tissue versus another, can often be estimated reasonably well using quantitative structure property relationships. On the other hand, metabolism has to be measured; we don't have the ability to estimate it with in silico methods. So those are the two key elements fed into the model: the in vitro metabolism studies, the in vitro kinetics, and then estimates for the other parameters from quantitative structure property relationships. In the meantime, QSAR, the relationship of a chemical's structure to its potential toxicity, helps us identify the potential target tissues, in order to design the tests that are most appropriate for that chemical. And then the biokinetic model can be used, since we have a measure of metabolism, to estimate the lifetime of the chemical in the in vitro system if there is metabolic capability. For example, if you do an in vitro study with hepatocytes, the chemical will be metabolized to some extent by those hepatocytes at the same time that you're looking to see whether it's toxic. And often, as I said, the toxicity may be due to the metabolism; for example, acetaminophen is toxic because it's metabolized to a reactive chemical. But then the model can also be used to do the quantitative in vitro to in vivo extrapolation to get the human equivalent dose.
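To make the point-of-departure step concrete, here is a minimal sketch, with entirely hypothetical concentration-response data, of fitting a Hill model and reading off an EC10; the model form and all parameter values are assumptions for illustration, not taken from any particular assay.

```python
# Minimal sketch: fit a Hill concentration-response curve to hypothetical
# in vitro data and derive an EC10 as a candidate point of departure.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ec50, n):
    """Fraction of maximal response at a given concentration (Hill model)."""
    return top * conc**n / (ec50**n + conc**n)

# Hypothetical concentrations (uM) and observed fractional responses
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
resp = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.80, 0.90])

(top, ec50, n), _ = curve_fit(hill, conc, resp, p0=[1.0, 10.0, 1.0])

# Concentration producing 10% of the fitted maximal response
ec10 = ec50 * (0.10 / 0.90) ** (1.0 / n)
print(f"EC50 = {ec50:.1f} uM, EC10 = {ec10:.2f} uM (point of departure)")
```

The EC10, or its free-concentration equivalent, would then be the value carried forward into the in vitro to in vivo extrapolation.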
And so, over the next few slides, I'll go through some of the elements that are part of this process. For the quantitative structure activity relationship or structure property relationship data, there are multiple methods available. Correlations are available for physical-chemical properties such as the octanol-water partition coefficient, the solubility in water, the volatility, and the acidity or basicity; all of these can be estimated with QSAR to some extent. There are also fragment- or rule-based systems that can predict metabolism, and there is three-dimensional docking software, available for some proteins, that predicts whether a particular structure would bind to a receptor or to a metabolizing enzyme. The limitation is that QSAR is a good example of something that's only as good as the data put into it, and there are many things we'd like to be able to do QSAR for where there just have not been enough data collected to support it. In particular, a lot of the work has been done with drugs. So if your chemical is drug-like, similar to drugs, then that's fine; you probably have a good basis set for predicting. But unfortunately, environmental chemicals are just not drug-like: they tend to be volatile, lipophilic, metabolized rapidly, all things that would make a drug not useful.

This diagram shows what I'm talking about with the question of chemical properties. The three axes represent lipophilicity, for things that are highly lipophilic like dioxins and siloxanes; solubility, for things that are water soluble like esters and ionic chemicals; and volatility, for the volatile organic chemicals like benzene and toluene, and alcohols like methanol. But alcohols are also water soluble, so every chemical is a little different in terms of its properties. If a chemical is useful as a drug, it will be to the right on this diagram because it has to be water soluble; it will not be volatile unless it's something that's going to be used in an inhaler, and it will not be highly lipophilic. Drug companies don't like highly lipophilic compounds because they tend to stay in tissues for a long period of time, which can make dosing a problem.

So, I mentioned in the last section that there are pharmacokinetic factors that affect in vivo toxicity but are not appropriately reflected in in vitro toxicity tests. One of those is bioavailability, whether a compound is well absorbed, and that matters whether it's dermal exposure, oral exposure, or even inhalation exposure. Some chemicals are absorbed very strongly and very rapidly; for others it takes longer for them to get in, or they may not get in very much at all. Then there are the transport processes that move chemicals around, to the liver for example, and protein binding in the blood, which means the chemical is not available for filtration in the glomerulus. For highly lipophilic chemicals, there's also the potential for them to be incorporated into chylomicrons in the gut tissue; they then enter the body through the lymph instead of through the blood to the liver, and they can actually bypass metabolism that way. It's much more complicated in vivo than it is in vitro, and all of that has to be taken into consideration when trying to do the extrapolation.
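As one small illustration of the QSPR estimates mentioned at the start of this passage, here is a minimal sketch of a fragment-based estimate of the octanol-water partition coefficient, using the open-source RDKit library as one possible tool (assuming it is installed); toluene is used purely as an example chemical.

```python
# Minimal sketch: fragment-based (Wildman-Crippen) estimate of log Kow
# for toluene, using RDKit.
from rdkit import Chem
from rdkit.Chem import Crippen

mol = Chem.MolFromSmiles("Cc1ccccc1")   # toluene
log_kow = Crippen.MolLogP(mol)          # fragment-contribution logP estimate
print(f"Estimated log Kow for toluene: {log_kow:.2f}")
```

Estimates like this are only as reliable as the chemicals used to train the underlying fragment contributions, which is exactly the drug-like versus environmental-chemical limitation just described.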
There are a lot of methods available for studying metabolism in vitro, and they have really improved tremendously over the last 20 years. Hepatocytes, as primary cells, can now be kept going for weeks and still maintain their function; it used to be hard to keep them going for a few hours. And there are cell lines: HepaRG and HepG2 are both derived from human liver tumors and are used to estimate metabolism, although, being tumor-derived, they don't function in exactly the same way as primary cells. So it is always advantageous to do the studies with primary cells themselves if possible, and there are many vendors now from which you can obtain primary cells. There are other cases where it's not possible to use primary cells because the cells aren't viable long enough to be isolated and separated; then tissue slices can be used, and there is a good deal of work with gastrointestinal tract tissues using tissue slices to look at intestinal metabolism. Sometimes we can use subcellular fractions: microsomes, which include many of the metabolizing enzymes, or in other cases cytosol, which has some other enzymes, can be used if that's all that really matters for a particular chemical and we know that ahead of time, but it's usually safer to use hepatocytes. Something that has become quite popular now is the use of recombinant enzymes, where you have cells that express a single enzyme, so you can measure which enzymes are metabolizing a particular chemical and then use the abundance of that enzyme in the liver to determine the rate of metabolism. That is something that came out of work fostered primarily by the pharmaceutical industry, but it's now widely used for environmental chemicals as well. There's also a lot of work these days on three-dimensional cultures, to try to improve how well the in vitro assay reproduces what goes on in vivo. Organotypic cultures are those that incorporate multiple cell types and arrangements that are more in vivo-like, so that the cells will behave more as they do in vivo.

The second area where metabolism is important is the potential for generating a reactive or toxic metabolite. There is work going on to try to predict those kinds of things: there's a software package called Meteor, and the OECD has a toolbox that can do that. There is also work on using high-resolution liquid chromatography linked to mass spectrometry, with computer assistance, to try to identify chemicals quickly during metabolism studies. Metabolism is important for understanding toxicity because, in some cases, toxicity might not be observed in test cells simply because their metabolic competence is low. If they were actually exposed to the metabolite, you might see the toxicity, but you have to know that the metabolite is something to worry about in order to make some of it and then expose the cells. A circulating metabolite could also be toxic in tissues other than the liver, so just looking at hepatocytes, where you do get metabolism, isn't enough. There are cases like chloral hydrate: it's called knockout drops, but it's actually its metabolite, trichloroethanol, that has that effect in the brain. So if you looked at brain cells exposed to chloral hydrate, you wouldn't see the same effect you would in vivo, because the brain itself doesn't carry out that metabolism.
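To make the recombinant-enzyme idea above concrete, here is a minimal sketch of summing per-enzyme intrinsic clearances and scaling them up to the whole liver; the per-enzyme clearances and abundances are hypothetical, and the microsomal protein and liver weight scaling factors are assumed typical values rather than measured ones.

```python
# Minimal sketch: scale hypothetical recombinant-enzyme metabolism data
# to a whole-liver intrinsic clearance.

# Hypothetical per-enzyme intrinsic clearances (uL/min per pmol of enzyme)
clint_per_pmol = {"CYP3A4": 0.08, "CYP2C9": 0.02}

# Hypothetical hepatic abundances (pmol enzyme per mg microsomal protein)
abundance = {"CYP3A4": 100.0, "CYP2C9": 70.0}

MG_MICROSOMAL_PROTEIN_PER_G_LIVER = 40.0   # assumed scaling factor
LIVER_WEIGHT_G = 1800.0                    # assumed adult human liver weight

# Sum each enzyme's contribution per mg microsomal protein, then scale up
clint_per_mg = sum(clint_per_pmol[e] * abundance[e] for e in clint_per_pmol)
clint_liver_ul_per_min = (clint_per_mg
                          * MG_MICROSOMAL_PROTEIN_PER_G_LIVER
                          * LIVER_WEIGHT_G)
print(f"Whole-liver intrinsic clearance ~ {clint_liver_ul_per_min / 1000:.0f} mL/min")
```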
Similarly, the kidney receives a lot of things that are produced in the liver and transported to the kidney for excretion, and sometimes the kidney inadvertently creates something reactive from the conjugates that were produced in the liver.

So a key limitation in current in vitro testing is the failure to adequately consider metabolism. Particularly in high-throughput studies, just because of the time constraints, the development of assays has focused almost exclusively on detecting parent chemical toxicity; looking at the potential for metabolism is put off until there's more time to work it out. But the problem is that in vivo toxicity often results from the production of metabolites, particularly when you're talking about longer-term exposures. Many of the chemicals that we hear about are known to be toxic through their metabolism to a metabolite, and we don't want to miss these kinds of things during our in vitro-based toxicity testing. It may be that we cannot rely only on high-throughput testing; there may be a need for a second, medium-throughput level of testing where you actually take the time to look for metabolites. Examples of chemicals where it is a metabolite that is responsible for the toxicity are things like isopropanol, coumarin, phthalates, hair dyes, and coal tar; all of these require metabolic activation.

There are other in vivo processes that we have to be able to consider. One is urinary clearance. If the chemical is water soluble, then it's pretty easy to estimate its glomerular filtration using the glomerular filtration rate and the fraction unbound in the plasma, since it is the free chemical that is filtered. That does ignore active transport: some chemicals are actively secreted or reabsorbed in the kidney, and so one of the things that has to be looked into now is how to predict whether a chemical is likely to be subject to active transport in the kidney. Ventilatory clearance is something people often don't think about, because most of the chemicals we worry about are nonvolatile, but there are also many volatile chemicals that we need to test. For those, even if you are exposed to them in drinking water, they'll be both exhaled and metabolized, and we can use the alveolar ventilation rate to model that. Of course, that's also important for inhalation exposure, and the blood:air partition coefficient, the ratio between the blood concentration and the air concentration at steady state, is what we use to model it. And then we have to be able to identify chemicals that are poorly absorbed, or else we'll overestimate how much of the chemical is getting into the blood. For that, there is a widely used in vitro assay based on what are known as Caco-2 cells; again, it's a tumor-derived cell line, in this case intestinal cells.

I'm going to turn now to another difficulty in doing in vitro testing, and I guess the villain here is analytical chemistry. Analytical chemistry is expensive and time-consuming, so there's a tendency for investigators to use what's known as a nominal dose or nominal concentration, where you take the amount of chemical in grams that you're adding to the medium and divide by the volume of the medium to get a concentration in grams per milliliter. The problem is that you don't actually know that all of that chemical is freely dispersed in the medium.
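Before getting further into that analytical chemistry problem, here is a minimal numerical sketch of the two simple clearance estimates just described, renal filtration of the free chemical and exhalation of a volatile chemical. The fraction unbound and blood:air partition coefficient are hypothetical, and the physiological rates are assumed typical adult values.

```python
# Minimal sketch: first-pass estimates of renal and ventilatory clearance.

GFR_L_PER_H = 6.6          # assumed adult glomerular filtration rate (~110 mL/min)
Q_ALV_L_PER_H = 300.0      # assumed alveolar ventilation rate

fu_plasma = 0.2            # hypothetical fraction unbound in plasma
p_blood_air = 10.0         # hypothetical blood:air partition coefficient

# Renal clearance by glomerular filtration: only the free chemical is filtered
# (this ignores active secretion and reabsorption, as noted above)
cl_renal = fu_plasma * GFR_L_PER_H

# Ventilatory clearance of a volatile chemical: exhalation is limited by how
# strongly the chemical partitions into blood relative to air
cl_vent = Q_ALV_L_PER_H / p_blood_air

print(f"Renal clearance ~ {cl_renal:.1f} L/h; ventilatory clearance ~ {cl_vent:.1f} L/h")
```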
What really drives the effects of the chemical is its free concentration, that is, the chemical that is not bound, not adsorbed onto other materials, but actually free in the medium and able to get into the cell. So the preferred approach would be to measure the free concentration in the medium, and then you can also measure the fraction unbound in the plasma, which is just the free fraction of the chemical in plasma, and then you're comparing apples to apples. This is important, but as I said, it's time-consuming and expensive: you have to develop an analytical method or use very expensive equipment such as high-resolution mass spectrometry in order to measure it, so people often don't do it. The problem is that the only thing worse than having no data is having bad data, and by that I mean data that wasn't collected correctly, so that you think you can trust it but you actually can't. I was actually at EPA once when a group was trying to get an in vitro assay accepted, and one of the reviewers said, "You do realize that the concentrations you say you use in this assay are 10 times the chemical's solubility." The people didn't know that, and the reason they didn't know is that they hadn't tried to measure it. What had happened with that chemical is that it was lipophilic and was binding to the plastic of the plate. Those are the kinds of things you have to watch out for or else you'll get bad data.

This is an illustration of what I've been talking about. When you put a chemical into a medium, what happens depends on how much you add and on the composition of the medium. There may be a good deal of albumin added, or there may not. Every assay is different, and in most cases the composition depends on what makes the cells happy; you need to put them in an environment where they'll maintain function. So there's only so much one can do to vary the composition of a medium, and what can happen, and what does happen, is that the chemical binds to the plastic and to the tubing. A lot of tests are done with flow, so the chemical also has to pass through tubing, and the most popular tubing materials, silicone and PharMed tubing for example, absorb chemicals like sponges. Very often you have to put a lot of a chemical in just to get any of it to the cells. It can bind to the albumin or other components of the medium. It can bind to the surface of the cell and not actually get in. It can be degraded or metabolized. It can also evaporate; even a slow evaporation rate can make a big difference if you're doing an assay over 6 or 24 hours. These are all things that need to be considered. Partitioning due to lipophilicity, and now I'm talking about in vivo, is also important: for a lipophilic chemical you will get more of it into the liver, and you may also get incorporation into lipoproteins, as I mentioned. All of these processes need to be modeled.

This shows that it is doable. This is an example of someone who actually looked at the question of estradiol versus genistein, doing studies to determine how potent genistein is compared to estradiol and what one needs to consider. What they discovered was that it's really important to monitor the competitive binding of the target chemical, estradiol, and the chemical being evaluated, in this case genistein. Estradiol is the positive control. And this shows what is in the model: the medium and the cells.
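Before going into the details of that model, here is a deliberately simplified sketch of the general nominal-versus-free-concentration point, assuming the various binding "sinks" in the well can be lumped into fixed bound-to-free ratios; real distribution models are more detailed, and actually measuring the free concentration is always preferable.

```python
# Minimal sketch: how far the free concentration in a test well can fall
# below the nominal concentration once binding losses are accounted for.
# All ratios below are hypothetical.

nominal_conc_uM = 10.0         # amount of chemical added / medium volume

# Hypothetical amounts bound in each compartment, expressed relative to the
# freely dissolved amount at equilibrium
bound_to_serum_protein = 4.0   # e.g. albumin added to the medium
bound_to_plastic = 1.5         # well walls and tubing
in_or_on_cells = 0.5           # partitioned into cell membranes and lipid

# Free fraction: freely dissolved chemical divided by everything added
free_fraction = 1.0 / (1.0 + bound_to_serum_protein + bound_to_plastic + in_or_on_cells)
free_conc_uM = nominal_conc_uM * free_fraction

print(f"Free fraction = {free_fraction:.2f}; "
      f"free concentration = {free_conc_uM:.1f} uM of {nominal_conc_uM:.0f} uM nominal")
```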
The model includes competitive binding to two different proteins, both in the medium and in the cell. And this shows the effect of that binding: if you add serum to the medium, as opposed to running it serum free, then the apparent potency, which is the 50 percent point on these curves, can change, in this case by about a factor of 40, from 0.23 nanomolar to 8.42 nanomolar. When the assay is serum free there is not the competitive binding that occurs when serum protein is present, and so the apparent potency is reduced when serum is added. This kind of thing needs to be considered in vivo as well. In this case, the diagram shows plasma and tissue, and again it's the same process being modeled, but the concentrations are different in vivo than in the in vitro assay, so one needs to correct for that. And this shows the important thing to remember: it is dose-dependent. The solid line shows the dose response for free estradiol as you increase the serum concentration of estradiol: the free fraction starts low and then goes up once you've used up all the binding sites. For genistein, the curve is much shallower because it doesn't bind as avidly. That means the relative potency comparing genistein and estradiol depends on the concentrations at which you do the comparison. The naturally occurring in vivo serum concentrations of estradiol are shown by the downward arrow, and at that point you can see that, percentage-wise, the free concentration of genistein is much higher than that of estradiol.

That concludes this section on in vitro to in vivo extrapolation. In the next session, I'm going to talk about how we actually apply quantitative in vitro to in vivo extrapolation to in vitro testing, and I'll give you an example of where it's been done well.