Testimony on Federal Workplace Drug Testing by Edward J. Cone, Ph.D.
National Institute on Drug Abuse
National Institutes of Health
U.S. Department of Health and Human Services

Before the House Committee on Commerce, Subcommittee on Oversight and Investigations
July 23, 1998


Drug testing provides objective information about an individual's recent drug use habits.

NIDA has taken a lead role in developing the scientific basis of drug testing.

The scientific "validity" of a test system is determined by numerous chemical (e.g., sensitivity, specificity) and pharmacological factors (e.g., dose, frequency of use). It is vital to research these factors for each existing and new test system to insure that test systems are fair, accurate and reproducible.

In addition to conventional urine testing, other biological materials (e.g., saliva, sweat, hair) may be useful for drug testing. Research on these new technologies is underway at NIDA and other institutions.

Research has shown that each biological fluid and tissue provides unique and sometimes different information regarding an individual's drug exposure history.

Saliva testing provides a reasonable alternative to blood testing and offers similar information regarding possible influence of drugs on the person at the time of testing (e.g., drugged driver testing). Collection is relatively non-invasive, but drug detection times are short.

Sweat testing with a "patch" device offers a non-invasive way to monitor drug use over a weekly interval. Care must be taken not to contaminate the patch when applying or removing it.

Hair testing offers a long term "historical" look at an individual's drug exposure over a period of months. Environmental contamination and hair color bias for certain drugs are of concern. Advantages include long-term detection of drug use and the ability to obtain a similar specimen days after collection of an original specimen.

Mr. Chairman and Members of the Subcommittee, I am Dr. Edward J. Cone, Acting Chief of the Clinical Pharmacology Branch, Intramural Research Program, National Institute on Drug Abuse (NIDA), one of the research Institutes of the National Institutes of Health. I am very pleased to be here today with my distinguished colleagues to testify before the Subcommittee.

The National Institute on Drug Abuse supports over 85 percent of the world's research on the health aspects of drug abuse and addiction. For more than two decades, NIDA has been exploring the biomedical and behavioral foundations of drug abuse. NIDA's scientific research program addresses the most fundamental and essential questions about drug abuse, ranging from its causes and consequences to its prevention and treatment. The scientific knowledge that is generated through NIDA research is a critical element to improving the overall health of the Nation. There has never been a greater need to increase our knowledge about drug abuse. I am very pleased to be able to testify today on current research findings concerning the comparative usefulness of various biological samples for drug detection (See Table 1).

General Issues About Drug Testing

Drug testing technology, such as urinalysis, provides an objective means of determining recent drug use. The scientific basis of drug testing has been advancing rapidly over the last two decades, resulting in the commercial development of reliable and inexpensive urine-based tests. At the same time, research has been progressing on the evaluation of other biological fluids and tissues for drug detection. NIDA has taken a lead role in researching the utility of various biological samples as substrates for drug testing. NIDA staff have published over 20 scientific papers on the possible use of hair, sweat, saliva, sebum, skin, meconium and other specimens for drug testing. As a result of advances in the science of drug testing, there is growing commercial interest in utilizing saliva, sweat, hair and meconium for the detection of drug use.

The usefulness of a drug test resides in its ability to accurately detect the presence of the parent drug and/or its metabolite(s) in a biological fluid or tissue following human drug use. Many factors can influence the "validity" of these test systems. The accuracy of drug testing reflects both chemical factors that influence the test outcome, such as sensitivity (the least amount of detectable drug) and specificity (how selective the test is for the drug), and pharmacological considerations such as drug dose, time of drug use and route of drug administration. Individual differences in rate of absorption, metabolism and excretion are additional pharmacological variables that can influence the test outcome.

In the development of drug testing systems, it is paramount that the results are accurate, reliable, and free from the possibility of bias toward different populations. Therefore, the test system must demonstrate that it performs its intended purpose, that it accurately identifies drug users, and that it does not falsely incriminate innocent people. Numerous scientific principles and guidelines have evolved that new as well as existing drug testing technologies should follow. In order for a new biological material, such as hair, to be useful for drug testing, the tests using that material should have the following attributes:

  1. the biological material to be tested should be easily obtainable and as non-invasive as possible;
  2. the drug and/or its metabolite should be present in the biological material and be identified accurately and precisely with existing, available technology;
  3. the amount of the drug and/or metabolite appearing in the biological material following the use of a single psychoactive dose of a drug should be sufficient for routine detection (that is, there should be a low rate of false negative results);
  4. the relationship between concentration of drug and/or metabolite and the drug dose taken should be clearly established;
  5. the time course of appearance and disappearance of the drug in the material should be clearly established and matched to the purpose or intent of testing (e.g. detection of recent or long-term drug use);
  6. the risk of false positive results from environmental contamination should be extremely small;
  7. the drug test methodology should be completely unbiased toward all populations and ethnic groups--this is not to say that individual differences in rates of metabolism and excretion should not exist, but simply that test methodology as a whole should not produce a higher overall rate of positive (or negative) results for any particular ethnic or minority group compared to the general population.

At the present time, there is a plethora of commercial assays and published methodologies that may be employed for drug testing. For the most part, these methods can be grouped into two categories, screening assays (tests) and confirmation assays. These assays can be adapted for measurement of drugs in body fluids, but they must first be properly validated. Generally, screening assays, such as immunoassays, are commercially available tests that are inexpensive and simple to perform. In contrast, confirmation assays, such as gas chromatography/mass spectrometry (GC/MS), are more expensive and more labor intensive, but their sensitivity and specificity are generally higher than those of the screening assays.

Immunoassay screening tests may cross-react with a variety of similar chemical substances, reducing their specificity. For example, most commercial immunoassays for opiates give positive test results for specimens containing either morphine, codeine or heroin metabolites. In this case, a more specific methodology, i.e. a confirmation assay, is needed to identify the particular drug or metabolite present. Often, the less expensive screening tests are employed initially to eliminate specimens containing no drug or drug below the cutoff concentration. The more expensive, labor intensive tests are subsequently employed for absolute drug identification and accurate quantification.
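
As an illustration of this two-tier approach, the sketch below (in Python) models a hypothetical opiate screen followed by GC/MS confirmation. The cutoffs, cross-reactivity factors and specimen values are invented for illustration and are not drawn from this testimony; the point is only the decision logic: the inexpensive, cross-reactive screen eliminates specimens below the cutoff, and only presumptive positives receive the more specific and expensive confirmation test.

    # Illustrative sketch of a two-tier drug-testing workflow (screen, then confirm).
    # All cutoffs, cross-reactivity factors and specimen values are hypothetical.

    SCREEN_CUTOFF_NG_ML = 300    # immunoassay cutoff (hypothetical)
    CONFIRM_CUTOFF_NG_ML = 300   # GC/MS per-analyte cutoff (hypothetical)

    # A cross-reactive immunoassay responds to several related opiates,
    # reporting a single combined "opiate" signal rather than a specific drug.
    CROSS_REACTIVITY = {"morphine": 1.0, "codeine": 0.9, "6-acetylmorphine": 0.8}

    def immunoassay_screen(specimen):
        """Return True if the combined cross-reactive response reaches the cutoff."""
        response = sum(conc * CROSS_REACTIVITY.get(analyte, 0.0)
                       for analyte, conc in specimen.items())
        return response >= SCREEN_CUTOFF_NG_ML

    def gcms_confirm(specimen):
        """Identify which specific analytes exceed the confirmation cutoff."""
        return [a for a, conc in specimen.items() if conc >= CONFIRM_CUTOFF_NG_ML]

    def report(specimen):
        if not immunoassay_screen(specimen):
            return "negative (screened out; no confirmation performed)"
        confirmed = gcms_confirm(specimen)
        return f"confirmed positive for {confirmed}" if confirmed else "screen not confirmed"

    # Hypothetical specimens (concentrations in ng/mL):
    print(report({"codeine": 500}))                   # confirmed positive for ['codeine']
    print(report({"morphine": 200, "codeine": 200}))  # screens positive, but nothing confirmed
    print(report({"morphine": 50}))                   # negative (screened out)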

Urine Testing

When a drug is taken intravenously or smoked, the absorption of the drug into the body is nearly instantaneous and its excretion in urine begins almost immediately. Drug absorption into the body is slower when the drug is taken orally, and its excretion in urine may be delayed for several hours. Normally, urine specimens voided within six hours after taking a drug contain the highest concentration of the drug and its metabolites. In general, drug excretion in urine occurs at an exponential rate, with most of the drug being eliminated within 48 hours after administration. Detection times for drugs of abuse vary according to dose, frequency of taking the drug, cutoff concentration and numerous other factors. Despite this variability, average detection times for most drugs in urine range from one to five days when the user is taking low doses or is an occasional user. When the individual is a heavy, chronic drug user, the detection times for some drugs, e.g. marijuana, may be up to a month.
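
The relationship among dose, cutoff concentration and detection time can be illustrated with a simple first-order (exponential) elimination model. The half-life, starting concentration and cutoff in the Python sketch below are invented for illustration, not values from this testimony; it only shows why detection windows lengthen when more drug is present initially (as with heavy or repeated use) and shorten when the cutoff is raised.

    # Illustrative first-order (exponential) elimination model for a urinary drug marker.
    # The half-life, starting concentration and cutoff are hypothetical.
    import math

    def detection_window_hours(initial_conc_ng_ml, cutoff_ng_ml, half_life_hours):
        """Hours until an exponentially declining concentration falls below the cutoff."""
        if initial_conc_ng_ml <= cutoff_ng_ml:
            return 0.0
        k = math.log(2) / half_life_hours              # first-order elimination rate constant
        return math.log(initial_conc_ng_ml / cutoff_ng_ml) / k

    # Hypothetical occasional user: peak urinary concentration 2,000 ng/mL,
    # assay cutoff 300 ng/mL, apparent half-life 12 hours.
    print(round(detection_window_hours(2000, 300, 12)))    # about 33 hours

    # A tenfold higher starting concentration stretches the window by several half-lives.
    print(round(detection_window_hours(20000, 300, 12)))   # about 73 hours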

Saliva

Considerable research has been focused on the utility of saliva as a biological material for drug testing. Saliva offers a number of advantages and some disadvantages in comparison to urine testing. The major advantages include its easy accessibility for collection, less objectionable nature (compared to urine), presence of the parent drug in higher abundance than metabolites, and a high correlation between the concentration of drug found in the blood and that which is detected in saliva. Further, studies have found that the amount of cocaine detected in saliva correlates strongly with the physiological and euphoric effects of cocaine, including the heart rate and self-reported feelings of cocaine "rush."

Despite the numerous advantages of saliva, there are some disadvantages. For example, there is a risk of contamination of saliva from drug use by the oral, smoked, and intranasal routes. This contamination can skew the correlation between saliva and blood drug concentrations, thereby distorting useful pharmacokinetic relationships. Even with this obvious limitation, saliva measurements can be used as evidence of recent drug use, even in some situations in which oral contamination is likely to be involved, e.g. marijuana smoking.

Another disadvantage of the use of saliva for drug testing is the short time course of drug detectability. This effectively prevents saliva from being used to detect historical drug use. At the same time, however, this feature makes saliva useful for detection of very recent drug use. Most drugs disappear from saliva and blood within 12-24 hours after administration. However, there is often a temporal relationship between the disappearance of drugs from saliva and the duration of their pharmacologic effects. Consequently, saliva testing could be useful in detecting recent drug use by automobile drivers and accident victims, and in testing employees before they engage in safety-sensitive activities.

Sweat

Research on sweat testing for drugs has been limited because of the difficulty in collecting sweat samples. Recently, a sweat collection device has been developed that appears to offer promise for drug monitoring. This device resembles a Band-Aid which is applied to the skin and can be worn for a period of several days to several weeks. The "sweat patch" consists of an adhesive layer on a thin transparent film of surgical dressing to which a rectangular absorbent cellulose pad is attached. Sweat is absorbed and concentrated on the cellulose pad. The transparent film allows oxygen, carbon dioxide and water vapor to escape, but prevents loss of drug constituents excreted in an individual's sweat. Over a period of several days, sweat saturates the pad and the excreted drug slowly concentrates. The patch can then be removed, the absorbent pad detached from the device and analyzed for drug content.
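
A minimal sketch (in Python, with invented daily excretion figures) of why the patch yields a cumulative measure: unlike a single urine or saliva specimen collected at one moment, the pad integrates whatever drug is excreted in sweat across the entire wear period, so intermittent use during the week still contributes to the total recovered at analysis.

    # Illustrative sketch: a sweat patch integrates drug excreted over the wear period.
    # Daily amounts deposited on the pad (in nanograms) are hypothetical.
    daily_excretion_ng = [0, 0, 450, 120, 0, 0, 60]   # e.g., drug use on days 3, 4 and 7

    # The pad accumulates drug; the laboratory analyzes the total at removal.
    total_on_pad_ng = sum(daily_excretion_ng)
    print(f"total drug recovered from patch: {total_on_pad_ng} ng")   # 630 ng

    # A single spot specimen collected on a drug-free day (e.g., day 6) could miss
    # the same pattern of use entirely.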

The advantages of the sweat patch for drug monitoring include the high subject acceptability of wearing the patch, low incidence of allergic reactions to the patch adhesive, and the ability to monitor drug intake for a full week with a single patch. In addition, the patch appears to be relatively tamper-proof in that the patch adhesive is specially formulated so that the patch can only be applied once and cannot be removed and successfully reapplied to the skin surface.

Disadvantages of the sweat patch include high variability among individuals, the possibility of environmental contamination of the patch before application or after removal, and the risk of accidental removal during a monitoring period. During patch application, extreme care must be taken to cleanse the skin surface prior to placement of the patch and to avoid contamination of the cellulose pad during handling. Similar care must be taken when removing the patch and handling it for analysis. Since this is a relatively new technology, research is ongoing into the efficacy and utility of sweat as a biological material for drug testing.

Hair

The use of hair as a biological material for drug testing has received much attention over the last decade. While the technology of hair testing has progressed rapidly, several highly controversial aspects of hair testing have yet to be worked out. First and probably foremost, it is unclear how drugs enter hair. The most likely routes involve:

  • diffusion from blood into the hair follicle and hair cells with subsequent binding to hair cell components,
  • excretion in sweat which bathes hair follicles and hair strands,
  • excretion of oily secretions into the hair follicle and onto the skin surface,
  • entry from the environment.

The possibility of drug entry from sweat and/or the environment is particularly troubling, since it allows the possibility of false positives if an individual's hair absorbs drugs from the environment or from another person's drug-laden sweat.

Another controversial issue in hair analysis is the interpretation of dose and time relationships. Although it has been generally assumed that segmental analysis of hair provides a record of drug usage, studies with labeled cocaine have not supported this interpretation. At best, only limited dose and time relationships were found. Other controversial issues that remain unresolved are the possibility of bias based on hair color and texture, appropriate means of differentiating drug users' hair from environmentally contaminated hair, appropriate applications of hair testing and the feasibility of hair testing for marijuana usage.

Despite the controversial nature of some aspects of hair testing, this technique is being used on an increasingly broad scale in a variety of circumstances. One of the most promising applications of hair testing appears to be its use in epidemiological studies of drug use. Hair testing could be a valuable confirmation of the self-reported drug use data that much of the current research in this area relies upon. In addition, the time record of drug use available from hair is considerably longer than that of any other biological material currently employed for drug testing. Self-reported drug use over a period of several months can be compared to test results from a hair strand representative of the same time period (about 4 cm in this case) as a means of validating the self-report data. This type of comparison is expected to be more effective than urine testing, since urine provides a historical record of only two to four days under most circumstances.
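
The 4-cm figure follows from the growth rate of scalp hair, very roughly 1 cm per month; the exact rate varies among individuals, and the value used in the short Python sketch below is an assumption for illustration, not a figure given in this testimony.

    # Illustrative only: converting a self-report period into a hair segment length.
    # The growth rate is an assumed approximate value; actual rates vary by individual.
    GROWTH_CM_PER_MONTH = 1.0   # scalp hair grows very roughly 1 cm per month

    def segment_length_cm(months_of_history):
        """Length of hair, measured from the scalp, spanning the reporting period."""
        return months_of_history * GROWTH_CM_PER_MONTH

    print(segment_length_cm(4))   # about 4 cm corresponds to roughly four months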

Other advantages include ease of obtaining, storing and shipping specimens, the ability to obtain a second sample for reanalysis, low potential for evasion or manipulation of test results, and low risk of disease transmission in the handling of samples.

A potential disadvantage of hair analysis would be its inability to detect recent drug usage because of hair's slow growth rate. However, this has not been thoroughly investigated. Recent evidence suggests that drug excretion in sweat is an important route of drug entry into the hair. This allows for the possibility that drug could appear in the hair within hours of drug administration. Another consideration regarding the use of hair analysis is the limited number of laboratories offering commercial hair testing services. However, as demand for hair testing services grows, commercial development will expand to meet that demand. As more attention is focused on this new area of drug testing and further research is conducted, many of the early controversies may eventually be resolved.

Conclusion

Mr. Chairman, this concludes my formal testimony. I would be pleased to answer any questions you may have. Thank you.

Table 1. Comparison of the usefulness of urine, saliva, sweat and hair as biological matrices for drug detection.

Urine
  Drug detection time: 2-4 days
  Major advantages: Mature technology; on-site methods available; established cutoffs
  Major disadvantages: Only detects recent use
  Primary use: Detection of recent drug use

Saliva
  Drug detection time: 12-24 hours
  Major advantages: Easily obtainable; samples the "free" drug fraction; parent drug present
  Major disadvantages: Short detection time; oral drug contamination; collection methods influence pH and saliva/plasma (S/P) ratios; only detects recent use; new technology
  Primary use: Linking a positive drug test to behavior and performance impairment; on-site testing of drugged drivers

Sweat
  Drug detection time: 1 week
  Major advantages: Cumulative measure of drug use
  Major disadvantages: High potential for environmental contamination; new technology
  Primary use: Detection of recent (1-week) drug use

Hair
  Drug detection time: Months
  Major advantages: Long-term measure of drug use; a similar sample can be recollected
  Major disadvantages: High potential for environmental contamination; new technology; possible color/ethnic bias
  Primary use: Detection of drug use in the recent past (1-6 months)
