Department of Justice

FY 2000 Summary Performance Plan

Prepared by the Justice Management Division
March 1999

 

PART II: Measurement Issues

Developing an organizational capability to measure and report program performance information is necessary to comply fully with the Results Act. DOJ and its components have worked to identify a range of performance indicators that keep the primary focus on mission outcomes, while minimizing reliance on output measures. In addition, components are working to address the Attorney General's concerns over the appearance of "bounty hunting." As discussed in the sections that follow, we expect that there will continue to be refinements and changes as our components deal with the complexities of data management and program measurement in a law enforcement environment.

 

New Steps to Strengthen Data Capacity and Integrity

Whether requested by agency managers, Congress, or the general public, there is an increasing call for more detailed and subject-specific performance information. Such new requirements for specialized data require time and funding commitments to resolve measurement and definition issues, determine data collection and reporting mechanisms, and undertake the required research or data collection.

Identification of Data Sources.

DOJ's plan for FY 2000 either begins or reinforces steps that address important performance measurement and data integrity concerns of the Results Act. For example, we have begun to identify specific data sources for the indicators included in this plan. For the most part, these sources are major statistical reports or internal case processing and management systems. Examples of these are listed in Appendix A.

We have also put in place an across-the-board DOJ requirement that each of our components' FY 2000 budget requests specifically identify the data source(s) for each of the indicators included in their more detailed plans. Although data are available for the vast majority of indicators, in a few areas new data collection systems must be developed or existing ones modified. We have also asked our components to discuss steps they will take to ensure the accuracy of data reported under their systems. Because we rely so heavily on our component-level systems to address data integrity, we believe these new requirements are key improvements.

Integrating Program and Financial Data

We have taken steps to include a stronger programmatic focus in our financial management systems and related reports. Our goal, as stated in our Five-Year Financial Management Plan and in this Summary Performance Plan, is to provide complete and useful financial information that fully supports financial and performance reporting, so that program and financial managers can achieve their objectives. We are moving toward this goal by installing new accounting systems in the DEA, INS, OJP and USMS and by completing the migration of the Bureau of Prisons to the Department's system. We are also migrating JMD's debt collection accounting and disbursing system to the Department's accounting system. We are committed to the full deployment, by FY 2000, of JMD's automated debt collection and litigation support system, which will support the financial litigation efforts of the U.S. Attorneys and the Department's litigating divisions.

In addition, DOJ's Justice Management Division established a Managerial Cost Accounting Standards Working Group to develop a program framework in support of our audited financial statements. Specifically, the Working Group developed standards for classifying DOJ expenses according to the core function structure of the DOJ strategic plan. This action will further support DOJ's measurement of performance as required by the Results Act.

Development of a Performance Measurement Process and System

In FY 1999 we are embarking on a new initiative to develop a systematic performance measurement process and system. With contractor assistance, we expect to make substantial progress in refining and clarifying measures; identifying data sources; assessing data quality, consistency and reliability; and collecting, verifying, analyzing and displaying actual performance data. The undertaking is envisioned as a two-phased project. The first phase will focus on the data requirements for reporting results of DOJ drug control programs. The second phase will expand the project to other program areas and support preparation of the Results Act performance report. This phase also calls for the design and testing of a decision-support system that uses technology such as data warehousing to meet multiple and changing needs for performance information.

Focus on Evaluation

Each DOJ component organization uses an assortment of investigative, litigation, technical assistance, training or other strategies to conduct its mission activities, whether these activities involve investigating crime or helping prepare inmates to reintegrate into society. A performance measurement system, by itself, will not be able to establish definitive causal relationships between the "inputs" of specialized strategies and the outcomes for which they are intended. A strong evaluative capability will be needed to help make those types of assessments.

DOJ and the Office of Justice Programs (OJP) have long been committed to the value of research and evaluation. The 1994 Crime Control Act added a new impetus to these efforts at the same time that it put new focus on several specific programs, i.e., community-oriented policing services; grants to counter violence against women; sentencing and corrections programs; and Drug Courts. Consistent with the 1994 statute, each program office that administers these programs has allocated up to five percent of its funds to support evaluative studies of the new programs by OJP's National Institute of Justice (NIJ). In addition to its national-level evaluations, the NIJ encourages partnerships between researchers and the police, corrections officials, and other justice practitioners to study topics important to the local jurisdiction.

Other DOJ components have also begun to put more emphasis on evaluation. For example, the BOP recently undertook an evaluation of its residential drug abuse treatment program, designed to monitor inmates up to three years following release from BOP custody. This study is being conducted with funding and assistance from the National Institute on Drug Abuse. An interim report, based on inmates who have been released into the community for six months, suggests that BOP's institutional treatment programs are effective in reducing recidivism and substance abuse. Although it may take years more to validate these results, the preliminary findings underscore the importance of BOP's tracking the inmates' level of participation in such treatment programs.

Our FY 2000 performance plan anticipates far more emphasis on program evaluation in the future. We expect to have a formal evaluation agenda in place during the current year.

Development of New Measures

DOJ and its components are beginning to reassess certain performance indicators to address emerging crime threats and the evolving enforcement responses. For example, a new FBI focus on preventing specifically targeted crime now requires more attention to measures that are anticipatory, not reactive. Using the example of hate crimes, the FBI explains that this approach might involve tracking the "proportion of agencies and communities that adopt and use training and (other) models developed . . ." to counter this problem. In other words, more emphasis is being given to the implementation of process or procedural changes that will result in the prevention of such crime. Similarly, the FBI states that the truest measure of anti-terrorism efforts will be the "ability to respond to terrorist acts before they occur."

Determining the success of certain specialized federal enforcement efforts will likely involve the design of new sets of measurement tools, covering not only enforcement strategies, but also long-term economic impact, as shown by industry and market indices. For example, in the white collar crime (WCC) area, DOJ's Antitrust Division is developing "proxy" measures that give an indication of the magnitude of its enforcement efforts, as well as the "economic reach" of the Division's efforts. The FBI's WCC program is also exploring use of new measures, moving beyond tracking of traditional data. For example, the FBI's traditional WCC measures include the simple number of informations and indictments obtained and an estimated gross dollar value of recoveries and restitutions. By contrast, its new measures tend to be far more specific and geared to the strategy being used, e.g., percent of financial institution fraud cases investigated by local law enforcement in areas covered by FBI task forces and economic loss to financial institutions related to durable medical equipment. Actual data for some of these newer measures are already being tracked.

Other DOJ components are working to revise or develop additional measures that will support a stronger capacity to understand the nature of the crime threat and determine the type and level of resources with which to respond. For example, the USMS' FY 2000 performance plan includes a new performance indicator ("inappropriate communications received" by the judiciary) in order to better differentiate among the various types of threats regularly directed against judges and other court personnel. In addition, the USMS is working to develop a measure that will track the average number of days to close Class 1 warrants, for both domestic and international fugitives.

Inspector General Plans

To further assist with implementation of the Results Act, various Offices of the Inspector General in Executive agencies are undertaking plans to examine the data integrity concerns of performance information. DOJ's OIG has indicated that it will do significantly more in the area of verification and validation of supporting data sources and information systems used for the performance measures outlined in agency performance reports and strategic plans. The OIG's role will be to assess the performance goals to ensure that they are appropriate, objective, quantifiable, and measurable. The OIG plans to report on the key outputs, service levels, and outcomes.

In a related effort, being undertaken as part of the implementation of the Government Management and Reform Act, DOJ's OIG has begun an internal control assessment of DOJ's case management systems. The objective of this assessment is to determine whether controls exist to ensure that information provided by DOJ to client agencies is accurate and complete.

 

Measurement Issues of Special Relevance to DOJ Activities

This section includes brief descriptions of certain data and measurement issues and related actions that emerged during DOJ's early experience with implementing the Results Act. Although the list is not all-inclusive, we have selected several of the more significant of these issues and generally grouped them according to core function. We believe they illustrate some of the unique measurement issues and concerns that we confront in federal law enforcement efforts.

Law Enforcement and Criminal Prosecution Measurement Issues

There are many different and sensitive factors that play a role in attempts to measure performance in the investigation and prosecution of criminal offenses. Foremost among these is the previously noted Attorney General directive that "bounty hunting" never be associated with federal enforcement activities. Consistent with this policy, all DOJ components with criminal investigative responsibilities currently report only prior-year "actual" data for certain key indicators, e.g., indictments, convictions and seizures. We believe this policy properly insulates our agent and attorney personnel from inappropriate pressure to project and achieve a targeted level of enforcement activity. It also protects the justice system from being perceived as encouraging a "quota-driven" approach to measurement. At the same time, this policy emphasizes that we maintain a record of past performance, thereby establishing overall accountability for specific results.

A second important factor is that many federal law enforcement activities will require more than one year to achieve results. Consequently, indicators are difficult to quantify on an annualized basis. Subjects of federal investigations usually have wide-ranging criminal influence, and measuring success against them should involve analyses that are equally wide-ranging. An example of this is the Federal Bureau of Investigation's (FBI) effort against organized crime/La Cosa Nostra (LCN) syndicates. The outcome measure selected is "Percentage reduction in LCN membership." Additional outcome measures are still under development, illustrating the long-term and difficult nature of designing true outcome measures. For less sophisticated organized crime groups, e.g., gangs, the FBI believes that impact may be more readily measurable, focusing on the actual disruption, dismantlement, and changes in amount of criminal activity by these groups. However, a multi-year effort is still anticipated.

In addition, in certain specialized enforcement areas, e.g., drug abuse, trying to evaluate the success of law enforcement through indicators such as "overall drug usage percentages" can be misleading. DOJ believes that measures such as drug use are generally dictated by the "demand side" whereas the majority of our enforcement efforts, especially those of the Drug Enforcement Administration (DEA), are primarily focused on the supply side. On the whole, demand reduction activities are not part of the core missions of DOJ's criminal enforcement organizations and comprise only a relatively small proportion of their budgets. Also, several other aspects of drug law enforcement are particularly difficult to quantify and performance data can be easily misinterpreted. For example, DEA states that the "impact of investigative intelligence is difficult to quantify . . ." and that, much like arrest and conviction data, they "must be accompanied by a qualitative assessment." DOJ's Criminal Division notes, in support of the Interagency Crime and Drug Enforcement program, that "it is extremely difficult to evaluate the significance of a particular law enforcement action based merely upon statistical figures."

Other significant issues that influence how we measure progress in criminal investigations and other litigation include the following:

Despite these difficulties, DOJ enforcement components are working to develop reasonable measures and identify baseline data that are linked as clearly and directly as possible to the intended outcome. For example, over the past three years the FBI has supported an annual survey by the Computer Security Institute to establish better baseline data on the nature and scope of technology-based crime.

State and Local Assistance-related Measurement Issues

There are also unique factors that influence how we measure performance in providing law enforcement assistance to state and local governments, often in close partnership with other public or private organizations. We have learned in particular that the provision of technical assistance and a strong commitment to routine monitoring and program evaluation are elements vital to measuring program success.

DOJ and the Office of Justice Programs (OJP) share the belief that research and evaluation strengthen the potential for identifying and collecting data that are both reliable and valid. Evaluations may take two years to publish results and even then may not necessarily generate findings that are uniformly applicable to other projects. Nevertheless, results from long-term "impact evaluations" can have significant consequences because they may guide policy or program funding decisions. For example, preliminary results from Abt Associates' evaluation of the Weed & Seed program provide important implications for future expansion of the program.

DOJ's FY 2000 Plan reinforces the approach that a strong central office review and monitoring capability will further improve data integrity/verification efforts in grant-making programs. For example, in both the Community Oriented Policing Services (COPS) and the Weed and Seed programs, the quality of validation data from sites was improved by a central requirement that appropriate information be included in the 1998 grant applications. These data, in turn, will be verified by site visits over the year. OJP also notes that one focus of its "Statewide Community Initiative" will be on strengthening efforts to validate performance measures through a planned review of progress reports, telephone contacts, and on-site monitoring.

Other significant measurement issues include the following:

The COPS program provides another example of how we rely on state and local partnerships to gauge our successes in providing useful assistance. In addition to its own management system, semi-annual surveys, the triennial COPS Count and other internal databases, COPS will look to some of its partners such as the Federal Law Enforcement Training Center, the Community Policing Consortium, and the Regional Community Policing Institutes to provide data related to COPS' training activities.

Finally, tracking post-treatment success in some specialized areas of assistance can be problematic. For example, DOJ components have supported a variety of programs intended to reduce or eliminate future abuse among drug offenders. However, some states or programs may have experienced funding or other difficulties in establishing mechanisms to track drug use and recidivism among program participants, particularly during a follow-up period in the community.

Immigration-related Measurement Issues

The accuracy and reliability of immigration-related statistics has been the focus of much attention over the years. DOJ's Immigration and Naturalization Service (INS) has made substantial improvements in this area, including the implementation of previous General Accounting Office (GAO) recommendations. In fact, the INS has established "data integrity and integration" as an agency-wide priority. This action reflects INS' determination to review its data collection, processing and reporting activities in order to "increase efficiency, consistency, accuracy, and timeliness of data availability." INS has committed itself to address the areas of technology deployment and support, system utilization and effectiveness, financial systems and records modernization and integrity, and information effectiveness.

In addition, DOJ and INS have engaged independent contract assistance as part of a sustained effort to implement an entirely new Naturalization process. An important part of this project is to identify and begin tracking performance measures that will systematically test features of the new process and flag opportunities for improvement. Measures are being designed and put in place for each of the Naturalization program's key subprocesses, including the operation of INS call centers, the fingerprinting process, the role of its Service Centers, and the adjudication ceremony itself. Because this is a relatively new effort, significant measurement difficulties are still being encountered, e.g., the need to redefine certain metrics and to smooth out reporting and other coordination issues among agencies. A related part of this larger effort is a contractor-supported project to reduce the Naturalization program's existing backlog. Monitoring performance will be a key part of this undertaking as well.

INS' FY 2000 performance plan calls for it to initiate evaluations of new and pilot programs, including those mandated by the Illegal Immigration Reform and Immigrant Responsibility Act, and to expand its evaluation of several employment verification pilot programs mandated by Congress. A proposed survey of recently-naturalized citizens is expected to provide in-depth information to evaluate the enhancements now being implemented and to guide further improvements. In addition, the INS has used existing resources to join with the National Institutes of Health and the National Science Foundation in cosponsoring the pilot of a survey of new immigrants.

Finally, INS has taken several steps to address prior GAO criticisms. For example, GAO reported in July 1998 that INS' Statistics Branch had been "working to improve its capacity and coordination by hiring qualified professional staff and by coordinating its statistical activities with other agencies that produce data on the foreign-born." (2) The GAO report also states that experts on immigration statistics "credit the Branch for making improvements in its capacity to produce statistical information." INS has noted that it believes that it could make further improvements, including undertaking ongoing quality assurance programs and publishing statistical standards to act as overall guidance.

Noted below are some other key issues affecting performance measurement in DOJ's immigration-related programs.

Incarceration and Detention-related Measurement Issues

Accurately estimating immediate and long-term needs for confinement bedspace is an urgent concern of several DOJ components, i.e., the Bureau of Prisons (BOP), the U.S. Marshals Service (USMS), and the INS. Developing such projections is an inexact science because there are so many factors outside the control of any single DOJ component that influence future prison and detention populations. The pace and success of investigations and prosecutions to counter constantly changing patterns of drug trafficking and illegal immigration along the Southwest Border are just two of the factors involved.

BOP's goal of "adding capacity to keep pace with inmate population growth and reduce overcrowding" -- and achieving a specific level of reduced overcrowding within a specific timeframe -- is directly tied to population projections and an assumed level of funding. An example of the difficulty of accurately forecasting prison population occurred during FY 1998 when BOP revised its projections upward twice due to substantial increases in the number of drug convictions, the number of immigration cases from the Southwest Border, and a higher than expected influx of D.C. inmates. These increases directly affected BOP's goal to reduce its overall overcrowding rate to 15 percent and contributed to an end of year overcrowding rate of 26 percent. This illustrates how unanticipated trends or events, combined with lower funding levels than initially planned, can impact an agency's ability to achieve its stated goals.
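The overcrowding figures above are rates, not headcounts. As a minimal sketch, an overcrowding rate of this kind is commonly computed as the percentage by which the inmate population exceeds rated bed capacity; the function below is a hypothetical illustration, and BOP's official methodology may differ.

```python
def overcrowding_rate(population, rated_capacity):
    """Percent by which the inmate population exceeds rated capacity.

    Hypothetical illustration only; BOP's official calculation may differ.
    """
    return (population - rated_capacity) * 100 / rated_capacity

# Under this definition, 126,000 inmates against 100,000 rated beds
# yields a 26 percent overcrowding rate.
```

On this reading, the FY 1998 outcome means the system ended the year holding roughly 26 percent more inmates than its rated capacity, against a goal of 15 percent.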

The BOP uses two essential types of data to predict its future inmate population. The first is based upon the actual sentences of inmates. Because the federal government uses determinate sentencing, these data are quite accurate and stable. The second type of data used in the BOP projection model is the number of defendants in cases commenced. Recent trends in these data help BOP anticipate the number of its future admissions. However, abrupt changes in the actual number of defendants can produce relatively large errors in the projections. Since these data depend on the local resources and policy decisions of law enforcement and the U.S. Attorneys, trends in the number of defendants in cases commenced are difficult to predict.
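The two-component structure described above can be sketched in miniature: releases derived from determinate sentences drive the stable side of the projection, while admissions inferred from defendants in cases commenced drive the volatile side. The function names, the fixed admission-rate parameter, and the simple additive form below are assumptions for illustration, not BOP's actual projection model.

```python
def project_population(current_pop, scheduled_releases, defendants_commenced,
                       admission_rate=0.7):
    """Project year-end inmate population for each future year.

    Hypothetical sketch, not BOP's model. Assumptions:
      scheduled_releases     -- expected releases per year, derived from actual
                                determinate sentences (the stable component).
      defendants_commenced   -- projected defendants in cases commenced per
                                year, extrapolated from recent trends (the
                                volatile component).
      admission_rate         -- assumed share of commenced defendants who
                                ultimately enter custody.
    """
    pop = current_pop
    projections = []
    for releases, defendants in zip(scheduled_releases, defendants_commenced):
        admissions = defendants * admission_rate  # inferred from commencements
        pop = pop + admissions - releases
        projections.append(round(pop))
    return projections

# Starting from 120,000 inmates, with 30,000/31,000 scheduled releases and
# 50,000/52,000 projected defendants over two years:
# project_population(120_000, [30_000, 31_000], [50_000, 52_000])
```

A sketch like this makes the text's caveat concrete: an abrupt change in `defendants_commenced` propagates directly into every subsequent year's projection, which is why the second component dominates the forecast error.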

Other significant measurement concerns or improvements underway in this functional area, not necessarily limited just to the overcrowding issue, include the following:

Data Sources and Systems

As noted previously, DOJ has instructed its components to identify within their individual performance plans the specific sources of data for each indicator. For the vast majority of these indicators, data are already collected and reported through existing statistical series or internal data systems. Appended to this plan is a summary of the principal sources of FY 2000 performance data.

Some indicators require modifications or enhancements to data collection systems. For example, the OCDETF program plans to work with the investigative agencies and the U.S. Attorneys to collect data on completed investigations regarding the extent to which the goal of dismantling or disrupting the criminal organization has been achieved. The U.S. Attorneys plan to begin to collect data on victim impact statements in Federal criminal proceedings. DEA will be expanding the use of post-deployment reviews to measure the effectiveness of its MET program.

There will also be continuing refinements made as research continues in response to changing crime patterns and problems. As one example, the problem of police use of excessive force received increased public attention in the past few years as a result of a number of highly publicized cases. The Crime Control Act of 1994 (Title XXI) strengthened the federal role in controlling such conduct and instructed the Attorney General to "acquire data about the use of excessive force by law enforcement officers" and publish an annual summary of that data. One early finding, jointly determined by the FBI and OJP's National Institute of Justice, was that no single data collection mechanism could provide a complete picture of this problem, although several methods (use of court records and data on citizen complaints to the police) were possible. Subsequently, the Bureau of Justice Statistics (BJS) awarded the International Association of Chiefs of Police (IACP) a grant, co-funded by NIJ, for the National Police Use of Force Database Project to collect incidence data nationwide. In the first year, the IACP developed software that enables police agencies to record a wide range of information such as type and level of force used, characteristics of the officer and the suspect, and the outcome of any complaint filed. The next step is collecting the data from police agencies in the seven states where the project is being piloted.
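The kind of incident record the IACP software captures can be illustrated with a simple structured type. The field names, force-level scale, and outcome values below are hypothetical stand-ins; the project's actual schema is not described in this plan.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UseOfForceRecord:
    """Hypothetical illustration of a use-of-force incident record.

    Field names and value sets are assumptions for illustration only,
    not the IACP project's actual schema.
    """
    force_type: str                  # e.g., "physical", "chemical", "firearm"
    force_level: int                 # assumed force-continuum scale, 1 (low) to 5 (high)
    officer_years_of_service: int    # officer characteristic
    suspect_age: int                 # suspect characteristic
    complaint_filed: bool            # whether a citizen complaint was filed
    complaint_outcome: Optional[str] = None  # e.g., "sustained"; None if no complaint

# A single incident entered by a participating agency:
record = UseOfForceRecord(
    force_type="physical",
    force_level=2,
    officer_years_of_service=7,
    suspect_age=34,
    complaint_filed=True,
    complaint_outcome="unfounded",
)
```

Structuring each incident this way is what makes nationwide aggregation possible: once agencies record comparable fields, BJS and NIJ can tabulate force types, officer and suspect characteristics, and complaint outcomes across jurisdictions.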

DOJ also recognizes that performance information must have some uniformity in definition and be accessible to line managers if it is to be meaningful. For many DOJ components, data integrity is heavily dependent on the effectiveness of their operational case tracking systems. For example, DOJ's Criminal Division currently relies on its sections to report workload data using an established format to ensure uniformity across the Division. To ensure more complete and accurate data, this process will be automated with the rollout of the Division's case tracking system. The Civil Division relies heavily on its own automated tracking system (CASES) to meet management and planning needs. Because it has made data accuracy an important goal of the system, the Division has in place numerous safeguards, including a contractor staff review of case listings, the generation of exception reports, and follow-up interviews with attorneys regarding case status. Despite these efforts some data limitations do exist, e.g., incomplete data on case terminations and attorney time. To its credit, the Civil Division has made adherence to the reporting requirements of CASES a performance element in all attorney work plans.
