MEMORANDUM

Date: August 30, 2004

To: The Commissioner

From: Acting Inspector General

Subject: Disability Determination Services' Claims Processing Performance (A-07-03-13054)

The attached final report presents the results of our review. Our objective was to identify factors that may have resulted in differing levels of performance at selected Disability Determination Services.

Please provide within 60 days a corrective action plan that addresses each recommendation. If you wish to discuss the final report, please call me or have your staff contact Steven L. Schaeffer, Assistant Inspector General for Audit, at (410) 965-9700.

Patrick P. O'Carroll, Jr.

OFFICE OF
THE INSPECTOR GENERAL

SOCIAL SECURITY ADMINISTRATION

DISABILITY DETERMINATION
SERVICES'
CLAIMS PROCESSING
PERFORMANCE

August 2004

A-07-03-13054

EVALUATION REPORT

Mission

We improve SSA programs and operations and protect them against fraud, waste, and abuse by conducting independent and objective audits, evaluations, and investigations. We provide timely, useful, and reliable information and advice to Administration officials, the Congress, and the public.

Authority

The Inspector General Act created independent audit and investigative units, called the Office of Inspector General (OIG). The mission of the OIG, as spelled out in the Act, is to:

Conduct and supervise independent and objective audits and investigations relating to agency programs and operations.
Promote economy, effectiveness, and efficiency within the agency.
Prevent and detect fraud, waste, and abuse in agency programs and operations.
Review and make recommendations regarding existing and proposed legislation and regulations relating to agency programs and operations.
Keep the agency head and the Congress fully and currently informed of problems in agency programs and operations.

To ensure objectivity, the IG Act empowers the IG with:

Independence to determine what reviews to perform.
Access to all information necessary for the reviews.
Authority to publish findings and recommendations based on the reviews.

Vision

By conducting independent and objective audits, investigations, and evaluations, we are agents of positive change striving for continuous improvement in the Social Security Administration's programs, operations, and management and in our own office.

Executive Summary

OBJECTIVE

Our objective was to identify factors that may have resulted in differing levels of performance at selected Disability Determination Services (DDS).

BACKGROUND

Disability determinations under the Social Security Administration's (SSA) Disability Insurance and Supplemental Security Income programs are made by DDSs in each State or other responsible jurisdiction according to Federal regulations. In carrying out this function, DDSs are responsible for determining claimants' disabilities and ensuring that adequate evidence is available to support their determinations.

To accomplish our objective, we stratified 41 DDSs into five strata based on initial case clearances for Fiscal Years (FY) 2000 through 2002. We ranked the DDSs within each stratum according to performance on four indicators: production, timeliness, accuracy, and cost. We selected the higher- and lower-performing DDS in each stratum, as shown in the following table. We then collected and analyzed information from the 10 selected DDSs to identify factors that may have resulted in differing levels of performance between the higher- and lower-performing DDSs.

Workload Strata  Higher-Performing DDS  Lower-Performing DDS
Very Small Wyoming Vermont
Small Maine New Mexico
Medium Minnesota Oregon
Large Mississippi New Jersey
Very Large North Carolina Georgia

RESULTS OF REVIEW

The lower-performing DDSs in our review had higher rates of disability examiner attrition, fewer examiners in relation to total staff, and purchased consultative examinations (CE) on more claims than their higher-performing DDS counterparts. We believe these factors may have contributed to increased processing times and decreased productivity at lower-performing DDSs. We also identified factors that may have negatively affected the claims processing performance of both higher- and lower-performing DDSs. These factors included State restrictions on hiring staff and high rates of claimants who missed scheduled CE appointments.

RECOMMENDATIONS

We recommend that SSA continue to work with State governments to resolve the factors that result in high examiner attrition and difficulties in hiring staff. We also make recommendations related to an optimal DDS staff mix, uncooperative medical evidence providers, and missed CE appointments. See page 16 for our formal recommendations.

AGENCY COMMENTS

In comments to our draft report, SSA stated that it generally agreed with our findings and conclusions. SSA stated that it could not develop an optimal DDS staff mix at this time because staffing mix requirements for the electronic folder and the Commissioner's new disability process are unknown. However, SSA stated that it will evaluate the staffing requirements as it transitions into the new processes. SSA also provided technical and other comments which we addressed as appropriate. SSA's comments are included as Appendix C.

OIG RESPONSE

We recommended that SSA develop an optimal DDS staff mix model as the Commissioner's new disability process is being implemented and the related staffing requirements are determined. We are encouraged that the Agency agreed to evaluate staffing requirements as it transitions into the new disability process, and we continue to recommend that the optimal staff mix model be developed as staffing requirements are determined.

Table of Contents
Page
INTRODUCTION 1
RESULTS OF REVIEW 2
Examiner Attrition 2
High Attrition Significantly Impacts DDS Performance 6
Staff Mix 9
CE Purchases 10
Factors Affecting Higher- and Lower-Performing DDSs 12
State Restrictions on DDS Staffing 12
Missed CE Appointments 14
CONCLUSIONS AND RECOMMENDATIONS 16

APPENDICES
Appendix A - Scope and Methodology
Appendix B - DDSs Ranked Nationally Based on Fiscal Years 2000, 2001, and 2002 Performance Data
Appendix C - Agency Comments
Appendix D - OIG Contacts and Staff Acknowledgments

Acronyms
AeDib Accelerated Electronic Disability
CE Consultative Examination
C.F.R. Code of Federal Regulations
DDS Disability Determination Services
DOT Dictionary of Occupational Titles
FD-14 State Agency Operations Report
FY Fiscal Year
HIPAA Health Insurance Portability and Accountability Act of 1996
ODD Office of Disability Determination
OIG Office of the Inspector General
POMS Program Operations Manual System
PPWY Production-Per-Workyear
SSA Social Security Administration

Introduction

OBJECTIVE

Our objective was to identify factors that may have resulted in differing levels of performance at selected Disability Determination Services (DDS).

BACKGROUND

Disability determinations under the Social Security Administration's (SSA) Disability Insurance and Supplemental Security Income programs are made by DDSs in each State or other responsible jurisdiction according to Federal regulations. In carrying out this function, DDSs are responsible for determining claimants' disabilities and ensuring that adequate evidence is available to support their determinations.

To accomplish our objective, we stratified 41 DDSs into five strata based on the number of initial claims closed by the DDS in Fiscal Years (FY) 2000 through 2002. We ranked the DDSs within each stratum according to performance on four indicators: production, timeliness, accuracy, and cost. We selected the higher- and lower-performing DDS in each stratum, as shown in Table 1. We then collected and analyzed information from the 10 selected DDSs to identify factors that may have resulted in differing levels of performance between the higher- and lower-performing DDSs.

Table 1
DDSs Selected for Review

Workload Strata  Higher-Performing DDS  Lower-Performing DDS
Very Small Wyoming Vermont
Small Maine New Mexico
Medium Minnesota Oregon
Large Mississippi New Jersey
Very Large North Carolina Georgia

Results of Review

The lower-performing DDSs in our review had higher rates of disability examiner attrition, fewer examiners in relation to total staff, and purchased consultative examinations (CE) on more claims than their higher-performing DDS counterparts. We believe these factors may have contributed to increased processing times and decreased productivity at lower-performing DDSs. We also identified factors that may have negatively affected the claims processing performance of both higher- and lower-performing DDSs. These factors included State restrictions on hiring staff and high rates of claimants who missed scheduled CE appointments.

EXAMINER ATTRITION

During FYs 2000 through 2002, at least 458 examiners left the employment of the 10 DDSs in our review (see Table 2). Over 50 percent of these examiners left DDS employment because of:

Other Employment (100 of the 458 examiners or 22 percent).
Retirement (82 of the 458 examiners or 18 percent).
Low Salaries (54 of the 458 examiners or 12 percent).

Approximately 50 percent (228 of the 458) of the examiners left the employment of the 10 DDSs in our review for reasons related to job quality, such as other employment, low salary, job stress, and low morale. DDSs have some ability to influence job quality, whereas they cannot control examiners leaving due to factors such as retirement or family obligations.

Table 2: Examiners Who Left DDS Employment, FYs 2000 through 2002
Higher/Lower Performing  Workload Strata  Other Employment  Retirement  Low Salaries  Job Stress  Relocated to Another City  Low Morale  Return to School  Left Involuntarily  Family Obligations  Medical  Other Reasons  TOTAL
Very Small DDSs
Higher Wyoming 2 2
Lower Vermont 1 1 5 1 2 2 1 13
Small DDSs
Higher Maine 5 1 6
Lower New Mexico 6 1 2 1 1 1 1 13
Medium DDSs
Higher Minnesota 1 1 2 2 3 3 1 - death 13
Lower Oregon 5 4 8 7 2 10 4 7 47
Large DDSs
Higher Mississippi 3 19 16 3 6 1 48
Lower New Jersey 19 36 - not specified 55
Very Large DDSs
Higher North Carolina 87 6 21 7 15 21 23 2 7 189
Lower Georgia 11 36 4 7 4 4 3 3 72
TOTAL 100 82 54 42 34 32 26 24 17 10 37 458

Both the higher- and lower-performing DDSs in our review experienced examiner attrition during FYs 2000 through 2002. However, lower-performing DDSs were most impacted by examiner attrition (see Table 3). Specifically:

In FY 2000, four of the five lower-performing DDSs (New Mexico, Oregon, New Jersey, and Georgia) had examiner attrition rates greater than their comparative higher-performing DDSs. The four lower-performing DDSs lost 12 to 28 percent of their examiner staff.

In FY 2001, three of the five lower-performing DDSs (Vermont, New Mexico, and Oregon) had attrition rates greater than their comparative higher-performing DDSs. All three DDSs lost over 20 percent of their examiner staff, and the Vermont DDS lost most of its examiner staff (97 percent).

In FY 2002, four of the five lower-performing DDSs (Vermont, New Mexico, Oregon, and New Jersey) had attrition rates greater than their comparative higher-performing DDSs. Each DDS lost 18 to 24 percent of its examiner staff.

Table 3: Examiner Attrition Rates
Higher/Lower Performing  Workload Strata  FY 2000 Attrition Rate  FY 2001 Attrition Rate  FY 2002 Attrition Rate
Very Small DDSs
Higher Wyoming 0% 0% 0%
Lower Vermont 0% 97% 24%
Small DDSs
Higher Maine 0% 9% 14%
Lower New Mexico 15% 24% 18%
Medium DDSs
Higher Minnesota 6% 6% 7%
Lower Oregon 12% 22% 21%
Large DDSs
Higher Mississippi 14% 19% 15%
Lower New Jersey 18% 12% 22%
Very Large DDSs
Higher North Carolina 22% 29% 18%
Lower Georgia 28% 15% 17%
National Average 13% 13% 14%
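The report does not state the denominator used for the attrition rates in Table 3. A minimal sketch of the calculation, assuming the rate is the number of examiners who left during a fiscal year divided by the DDS's examiner staff for that year, with hypothetical figures:

def attrition_rate(examiners_left: int, examiner_staff: int) -> float:
    # Assumed definition: departures during the fiscal year divided by
    # the examiner staff for that year, expressed as a percentage.
    # The report does not state the exact denominator.
    return examiners_left * 100 / examiner_staff

# Hypothetical example: 8 departures from a staff of 40 is a 20 percent
# rate, comparable to the rates shown for several DDSs in Table 3.
assert attrition_rate(8, 40) == 20.0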

Attrition for the right reasons may be positive for an organization because new examiners can provide innovative ideas and differing skills. However, high attrition is a concern when it involves new staff. Of the 458 examiners who left DDS employment during FYs 2000 through 2002, at least 195 (43 percent) were new examiners (see Table 4). At 8 of the 10 DDSs in our review, over 30 percent of the examiners who left DDS employment were new examiners. Three lower-performing DDSs (New Mexico, Oregon, and New Jersey) had a greater percentage of new examiners leave the employment of the DDS than their comparative higher-performing DDSs. DDS management stated that the primary reasons new examiners left DDS employment were performance problems, low salaries, and other job opportunities.

Table 4: New Examiners Who Left DDS Employment, FYs 2000 through 2002
Higher/Lower Performing  Workload Strata  Number of New Examiners Who Left  New Examiners Who Left as a Percent of All Examiners Who Left
Very Small DDSs
Higher Wyoming 2 100%
Lower Vermont 4 31%
Small DDSs
Higher Maine 0 0%
Lower New Mexico 1 8%
Medium DDSs
Higher Minnesota 4 31%
Lower Oregon* 11 34%
Large DDSs
Higher Mississippi 22 46%
Lower New Jersey 30 54%
Very Large DDSs
Higher North Carolina 98 52%
Lower Georgia** 23 32%
Total 195 43%
* Information based on two years only. FY 2000 data were not available by experience level.
** Information based on two years only. FY 2000 data were not available.

High Attrition Significantly Impacts DDS Performance

High examiner attrition results in a lower percentage of full-time experienced examiners on the DDS staff (see Chart 1). At 4 lower-performing DDSs (Vermont, Oregon, New Jersey, and Georgia), the experienced examiner staff was 22 to 30 percent smaller than at their comparative higher-performing DDS counterparts. Generally, experienced examiners are more likely to manage heavier caseloads, make more disability determinations per week, process more complex or specialized cases, and rotate the time-consuming duties of mentoring new examiners among other experienced examiners. The lower proportion of experienced examiners at the four lower-performing DDSs could have contributed to these DDSs experiencing lower productivity and higher claims processing times than their higher-performing DDS counterparts.

When new examiners leave a DDS' employment, the DDS loses the staff resources devoted to training them. Specifically, a DDS devotes up to 2 years to training, mentoring, and supervising new examiners. During the training period, all case actions performed by new examiners are closely monitored. New examiners are usually able to process cases independently by the end of their second year of employment. When new examiners leave DDS employment and are replaced by other new examiners, the training and mentoring process starts over. Therefore, staff resources are once again diverted to training new examiners rather than processing disability claims.

High attrition can also result in delaying the assignment of disability cases to examiners, referred to as staging cases. We found that 4 lower-performing DDSs (New Mexico, Oregon, New Jersey, and Georgia) staged initial cases from 10 to 40 days longer on average than their higher-performing DDS counterparts (see Chart 2). These DDSs reported that initial cases were staged because there were not enough examiners to process all case receipts. For DDSs, staged cases represent increased processing times. For claimants, staging means waiting longer for a disability decision.

High rates of examiner attrition can significantly impact a DDS's productivity and processing time, as illustrated by the extreme experience of the Vermont DDS. The DDS lost most of its experienced examiner staff in FYs 2001 and 2002. According to DDS management, examiners who could not adjust to the increasing DDS workload became overwhelmed, and morale and production plummeted. Approximately 57 percent of the experienced examiners (8 of 14) left the DDS's employment in FY 2001. In addition, one experienced examiner transferred from an examiner position to a less stressful position. Of the remaining five experienced examiners, one left the following year and another was promoted to supervisor, leaving the DDS with only three experienced examiners. With few experienced examiners, the DDS was forced to divert a portion of its workload to the Massachusetts DDS in FYs 2001 and 2002. The DDS's productivity decreased from 239 case clearances per workyear in FY 2000 to 173 case clearances per workyear in FY 2002. During the same time period, processing time for Title II cases increased from 61 to 85 days and from 60 to 84 days for Title XVI cases.

Examiner attrition can also increase DDS administrative (personnel) costs because it can result in examiners working overtime to process the workload. For example, the lower-performing Georgia DDS experienced high rates of examiner attrition from FY 2000 through FY 2002. In FY 2002, the Georgia DDS worked about 54,000 hours of overtime because the DDS had an insufficient number of experienced examiners to process disability cases. Overtime pay is at a higher hourly rate than regular pay, which results in increased administrative costs.

When receipts increase and examiner attrition is high, workforce planning becomes essential to DDSs as they strive to increase productivity and decrease processing times. In December 2003, the Commissioner of Social Security issued a workforce plan to explain how SSA will manage its human capital to achieve the Agency's mission and goals. One of SSA's goals in the workforce plan is to deliver high-quality, citizen-centered service, and an objective of that goal is to make the right decision in the disability process as early as possible. Although this objective applies directly to the work of DDSs, the plan did not specifically address the DDS workforce or specify how SSA will assist the DDSs in workforce planning.

Of the 10 DDSs in our review, 5 DDSs (Vermont, New Mexico, Oregon, Georgia, and North Carolina) conducted long-term workforce planning that addressed examiner attrition. Four DDSs (Wyoming, Maine, Minnesota, and Mississippi) did not conduct long-term workforce planning. All nine of these DDSs reported that long-term workforce planning is difficult because:

SSA does not make long-term DDS workload and staffing projections.
Budget allocations are delayed each FY.
DDS resource levels are uncertain each FY.
SSA does not always grant DDS hiring authority when requested.
The Commissioner's new approach for the disability program may change the workforce needs of the DDS.
States sometimes restrict DDS staffing.

The aforementioned issues make long-term workforce planning a challenge for SSA and DDSs. Nevertheless, SSA and the DDSs need to have a process in place to ensure sufficient qualified staff to adequately process disability determinations.

STAFF MIX

Lower-performing DDSs had a different full-time staff mix than their higher-performing DDS counterparts. As shown in Chart 3, the lower-performing DDSs had fewer examiners in relation to total staff than their higher-performing DDS counterparts.

In FY 2002, the five higher-performing DDSs had a higher percentage of examiner staff than their lower-performing DDS counterparts. We believe this may have contributed to the higher-performing DDSs being able to process an average of 70 more clearances per workyear and process claims an average of 32 days faster than the five lower-performing DDSs.

Federal regulations allow States to provide the organizational structure and qualified personnel needed to make disability determinations. Furthermore, States are required to adhere to applicable State approved personnel standards in hiring staff. Accordingly, SSA has limited its involvement in the DDS' ongoing management of the disability program. However, the Federal/State relationship does not preclude SSA from developing an optimal DDS staff mix model for DDSs to follow. In fact, amendments to the Social Security Act in 1980 allow SSA to issue regulations specifying performance standards and administrative requirements and procedures to be followed in making disability determinations.

SSA should develop an optimal DDS staff mix model and encourage DDSs to follow the model to achieve the Commissioner's goal of processing disability claims accurately and as early as possible in the process. According to staff in the Office of Disability, the Commissioner's new approach to improve the disability determination process, including the transition to accelerated electronic disability (AeDib), makes it difficult to develop an optimal staff mix because SSA does not yet know what an optimal staff mix will be under the new claims processing environment. However, SSA could initiate development of an optimal staff mix model as the Commissioner's new approach is being implemented and related staffing requirements are determined.

CE PURCHASES

In FY 2002, four of the five lower-performing DDSs (New Mexico, Oregon, New Jersey, and Georgia) purchased CEs on a higher percentage of disability cases than their higher-performing DDS counterparts (see Table 5). Three of the five lower-performing DDSs (New Mexico, New Jersey, and Georgia) also purchased CEs on a higher percentage of disability cases than their higher-performing DDS counterparts regardless of the type of case (Title II, Title XVI, or concurrent). CEs are purchased when the claimant's medical evidence is unavailable or insufficient to support a medically determinable impairment. CE purchases increase claims processing time because the DDS must wait for the CE to be scheduled and performed, and for the results to be received from the medical provider.

Table 5: CE Purchase Rates and Average CE Waiting Time, FY 2002
Higher/Lower Performing  Workload Strata  CE Purchase Rate (%): Title II Cases  Title XVI Cases  Concurrent Cases  Total Cases  Average Estimated CE Waiting Time (days)
Very Small DDSs
Higher Wyoming 38% 45% 41% 41% 33
Lower Vermont 37% 40% 33% 37% 24
Small DDSs
Higher Maine 28% 31% 38% 32% 31
Lower New Mexico 29% 43% 39% 38% 72
Medium DDSs
Higher Minnesota 28% 41% 38% 34% 36
Lower Oregon 31% 38% 36% 35% 22
Large DDSs
Higher Mississippi 32% 41% 41% 39% 36
Lower New Jersey 36% 43% 47% 41% 36
Very Large DDSs
Higher North Carolina 31% 34% 39% 34% 33
Lower Georgia 39% 45% 48% 44% 56

During FY 2002, an average of 42 days elapsed at the lower-performing DDSs between the time a CE appointment was scheduled and the CE report was received (CE waiting time). Overall, this accounted for about 40 percent of the average claims processing time for the five lower-performing DDSs. Of most concern, CE waiting time accounted for 58 percent of the overall claims processing time at the New Mexico and Georgia DDSs.

The lower-performing Oregon and Georgia DDSs reported that uncooperative medical evidence providers were a reason for increased CE purchases. A medical evidence provider may be uncooperative in providing medical evidence if the provider is waiting for payment for previous services from the claimant, requires a special medical release form before submitting evidence, or is unsatisfied with the DDS's payment for the medical evidence. The Georgia DDS estimated that 25 to 30 percent of CEs purchased in FY 2002 were due to uncooperative medical evidence providers.

FACTORS AFFECTING HIGHER- AND LOWER-PERFORMING DDSs

We also identified factors that may have adversely affected the performance of both higher- and lower-performing DDSs. These factors included State restrictions on hiring staff and high rates of claimants who missed scheduled CE appointments.

STATE RESTRICTIONS ON DDS STAFFING

The 10 DDSs in our review stated that they experienced State restrictions in hiring staff, including hiring freezes, lengthy hiring procedures, and noncompetitive salaries. Although DDSs are funded by SSA, DDSs must follow State personnel policies and procedures, including State approval for hiring new staff. Two higher-performing DDSs (Maine and Minnesota) and two lower-performing DDSs (Vermont and Oregon) experienced State-imposed hiring freezes in FYs 2000 through 2002, which delayed the hiring of examiners needed to process disability claims. As previously reported, the attrition rates in Vermont and Oregon were among the highest of the 10 DDSs in our review (see Table 3); therefore, it was critical for these DDSs to hire examiners timely in order to process SSA's disability claims.

Some DDSs also reported that lengthy State procedures prolonged the hiring process because of the time required to schedule and administer civil service tests, perform background checks, review DDS selections, and authorize DDS hires. The Wyoming, New Jersey, and Georgia DDSs reported waits of 3 to 5 months from the time an opening is announced to actually bringing a new employee onboard.

One DDS reported that its Department of Personnel establishes the candidate pool by means of a civil service test. At times, the test has yielded no candidates in the pool because either no one took the test or no one who took it was qualified. The test then had to be rescheduled and the process restarted, resulting in a delay in hiring.

Another DDS reported that its parent agency has taken 6 weeks to approve the DDS' selection. Until the selection is approved by the parent agency, the DDS cannot offer the job to the candidate. Due to the delay, candidates have chosen not to wait for the approval, and the DDS has lost potentially qualified employees.

Eight DDSs reported that examiner salaries need to be upgraded to attract and retain examiners. Two higher-performing DDSs, Mississippi and North Carolina, indicated that State salary restrictions significantly limit the applicant pool by not attracting applicants with the necessary skills to be examiners. To provide SSA with information on the salaries of a similar occupation in the State, we compared the average salaries of experienced DDS examiners to average salaries of claims examiners for the insurance industry.

As shown in Table 6, the salary for an insurance examiner was higher than the salary of an experienced DDS examiner at nine DDSs. Only the examiner salary of the New Jersey DDS exceeded that of claims examiners in its State. We could not reach any definitive conclusions regarding the salary at these DDSs in relation to their respective performance and attrition rates. For example, 4 of the 10 DDSs (Maine, New Mexico, Mississippi, and Georgia) in our review had examiner salaries that were at least 10 percent lower than the salaries of insurance examiners in their States (see Table 6). However, two of these DDSs (Maine and Mississippi) were higher-performing DDSs, which diminishes any correlation between low salaries and low performance. Also, these DDSs did not have the highest attrition rates of the 10 DDSs in our review. In fact, 4 of the 10 DDSs (Vermont, Oregon, New Jersey, and North Carolina) in our review had attrition rates that were equal to or higher than those of the 4 DDSs with the greatest difference between DDS and insurance claims examiners' salaries. In addition, based on our national performance ranking of the 41 DDSs in our population (see Appendix B), the New Jersey DDS had the lowest performance of all DDSs, but it had the highest average examiner salary of the 10 DDSs in our review.

Table 6: Comparison of an Insurance Examiner's Salary to the Average Salary of an Experienced DDS Examiner in 2002
Higher/Lower Performing  Workload Strata  Insurance Examiner Average Salary  Experienced DDS Examiner Average Salary  Difference (%)
Very Small DDSs
Higher Wyoming $39,710 $36,762 8%
Lower Vermont $45,460 $43,222 5%
Small DDSs
Higher Maine $45,770 $38,160 20%
Lower New Mexico $40,700 $35,984 13%
Medium DDSs
Higher Minnesota $47,190 $46,505 1%
Lower Oregon $45,740 $43,000 6%
Large DDSs
Higher Mississippi $34,580 $30,000 15%
Lower New Jersey $45,850 $56,403 (19%)
Very Large DDSs
Higher North Carolina $42,200 $39,787 6%
Lower Georgia $48,380 $41,872 15%
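The Difference (%) column in Table 6 is consistent with measuring the salary gap relative to the experienced DDS examiner salary; the report does not state the denominator, so the following sketch rests on that inference. Negative values, shown in parentheses in the table, mean the DDS salary was higher.

def salary_gap_percent(insurance_salary: float, dds_salary: float) -> float:
    # Gap relative to the experienced DDS examiner salary (inferred from
    # Table 6's figures, not stated in the report). A negative result
    # means the DDS salary exceeds the insurance examiner salary.
    return (insurance_salary - dds_salary) * 100 / dds_salary

# Wyoming: (39,710 - 36,762) / 36,762 rounds to the 8 percent in Table 6.
assert round(salary_gap_percent(39_710, 36_762)) == 8
# New Jersey: a negative gap, shown as (19%) in Table 6.
assert round(salary_gap_percent(45_850, 56_403)) == -19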

MISSED CE APPOINTMENTS

All 10 DDSs in our review indicated ongoing problems with claimants not attending their scheduled CE appointments, which the DDSs reported significantly increased case processing times:

Minnesota DDS reported that about 16 percent of claimants, the majority of whom were mentally impaired or children, missed CE appointments, which added an estimated 7 weeks of processing time per case.

Mississippi DDS reported that about 25 percent of claimants, the majority of whom were mentally impaired, were uncooperative in attending the CE appointment, which added an estimated 4 to 6 weeks of processing time per case.

Georgia DDS estimated that about 30 percent of its CE appointments were rescheduled, most involving mentally impaired claimants or children, which added an estimated 2 weeks of processing time per case.

Neither SSA nor DDSs can make a claimant attend the scheduled CE, so they must be innovative in creating processes that improve CE attendance. For example, the North Carolina DDS reported a small, successful pilot initiated at a Cherokee reservation to address missed CE appointments. Twice a year, the local SSA field office, the DDS, and the reservation hospital work together on the reservation to take claims and perform CEs. The DDS indicated that the pilot facilitates rapid processing of claims in that area by reducing missed CE appointments.

In FY 2002, the Georgia DDS created an innovative pilot to help address its high rates of missed CE appointments. The pilot avoided rescheduling CEs for mental impairment cases, increased medical evidence usage, and decreased CE purchases. Under the pilot, after medical evidence was received from a mental health provider, the DDS sent the provider a "Mental Impairment Questionnaire" to obtain additional information, thereby eliminating the need, in some cases, to order a CE. During the pilot, the DDS sent 257 questionnaires; 87 were returned, and the DDS avoided ordering 72 CEs. Due to the success of the pilot, the Georgia DDS now allows all examiners to use this process.

Conclusions and Recommendations

We identified factors that may have influenced the level of performance of the DDSs included in our review. Generally, the lower-performing DDSs in our review had higher rates of disability examiner attrition, fewer examiners in relation to total staff, and purchased CEs on more claims than their higher-performing DDS counterparts. In addition, State restrictions on hiring staff and high rates of claimants who missed scheduled CE appointments may have negatively affected the claims processing performance of both higher- and lower-performing DDSs. We acknowledge that other factors most likely also influenced the performance of the DDSs in our review. Accordingly, we do not suggest that the factors identified in our report are all-inclusive.

In September 2003, the Commissioner of Social Security presented a new approach to improve the disability determination process with a goal to make the right decision as early as possible in the process. The approach includes:

An electronic disability claims folder that will link all components involved in processing disability claims and eliminate mailing, locating, and organizing paper folders.

A Quick-Decision step where a Regional Expert Review Unit will screen and approve claims to allow claimants who are obviously disabled to receive a decision as soon as possible.

Centralized medical expertise within the Regional Expert Review Units that are available to provide support to disability decision makers at all levels, including the DDSs.

Elimination of the reconsideration step.

Continued full documentation and explanation of disability determinations.

An in-line quality review process managed by DDSs to increase opportunities for identifying problem areas and implementing corrective actions and related training.

We commend the Commissioner for developing a new approach to improve SSA's disability determination process. The quick-decision step and the elimination of the reconsideration step may remove some of the workload from DDSs and free up DDS resources for more demanding tasks, such as adjudicating difficult claims, fully documenting determinations, and performing in-line quality reviews. However, these very challenging tasks will require the knowledge and skills of experienced disability examiners. High examiner attrition may impact some DDSs' ability to perform these tasks efficiently and effectively. In addition, an appropriate mix of examiners to total DDS staff to handle the complexity and volume of the workload is needed to reach the Commissioner's goal of making the right decision as early as possible in the disability determination process.

The Commissioner's new approach will not resolve the difficulties DDSs face in obtaining medical evidence from uncooperative medical evidence providers or the high rates of claimants who do not attend CE appointments. These claims processing challenges may impact some DDSs' ability to improve performance. Regardless of the success of the electronic disability folder or the Commissioner's other initiatives, DDSs cannot make timely decisions without timely medical evidence from treating sources and CE providers. Accordingly, resolution of these claims processing challenges will assist DDSs in achieving the Commissioner's goal of making the right decision as early as possible in the disability determination process.

The DDSs are responsible for providing an organizational structure and qualified personnel to process disability claims and to obtain evidence needed to make disability determinations. However, SSA is required to work with DDSs to provide and maintain an effective system for processing disability claims, including providing leadership and oversight. Therefore, to improve DDS claims processing performance, we recommend that SSA:

1. Continue to work with State governments to resolve the factors that result in high DDS examiner attrition and difficulties in hiring staff.

2. Initiate development of an optimal staff mix model as the Commissioner's new disability determination approach is being implemented and related staffing requirements are determined.

3. In concert with DDSs, establish outreach efforts with providers who are historically unwilling to submit medical evidence in a timely manner to educate them on the importance of timely medical evidence to disability decisions that affect the quality of life of disabled citizens.

4. Assist DDSs to establish innovative processes that will lower the high rates of claimants who do not attend CE appointments.

AGENCY COMMENTS

In comments to our draft report, SSA stated that it generally agreed with our findings and conclusions. With regard to recommendation number two, SSA stated that it could not develop an optimal DDS staff mix model at this time because staffing mix requirements for the electronic folder and the Commissioner's new disability process are unknown. However, SSA stated that it will evaluate the staffing requirements as it transitions into the new processes.

SSA also expressed reservations about our review methodology and provided technical comments that we addressed as appropriate. SSA's comments are included at Appendix C.

OIG RESPONSE

We recommended that SSA develop an optimal DDS staff mix model as the Commissioner's new disability process is being implemented and the related staffing requirements are determined. We are encouraged that the Agency agreed to evaluate staffing requirements as it transitions into the new disability process, and we continue to recommend that the optimal staff mix model be developed as staffing requirements are determined.

With regard to SSA's comments on our review methodology, we provided SSA opportunities to be involved in the development of our review methodology and to provide comments on the final methodology. Specifically, on January 8, 2003, we met with the Office of Disability Determination (ODD) to discuss the methodology for our review. At that time, ODD stated that our proposed methodology was fair. On January 14, 2003, we provided ODD a detailed, written description of our review's methodology and asked for comments. ODD did not provide any comments on or objections to our methodology. Furthermore, SSA did not provide any comments on or objections to our methodology at the March 4, 2003 entrance conference or the June 23, 2004 exit conference.

We believe our review methodology fairly reflects the DDSs' ability to manage workloads and produce timely and accurate disability decisions. Specifically, our methodology ranked DDS performance based upon four performance indicators that SSA utilizes to make DDS management decisions (Production-Per-Workyear [PPWY], processing time, accuracy, and cost-per-case). Our methodology weighted the performance indicators equally because SSA stated that it considered all of the indicators critical to successful DDS claims processing performance, and SSA was unable to provide us with a measurement system that provided a better measurement of DDS performance. Three of the indicators (PPWY, processing time, and accuracy) were compared among DDSs with similar workload size. This methodology resulted in a fair comparison among DDSs because it addressed the potentially significant variations caused by workload size and provided a measure of each DDS' success in managing its own workload. For the fourth indicator, cost-per-case, our methodology focused on how well each DDS was able to control its own costs. We believe this methodology was fair since we did not compare the cost-per-case among DDSs due to variances in cost of living across the country.

Appendices

Appendix A
Scope and Methodology

Our population of Disability Determination Services (DDSs) for this review included 41 of the 52 DDSs. Based on the combined initial clearances for Fiscal Years (FY) 2000 through 2002, we divided the 41 DDSs into five strata, as shown in the following table (also see the table on page A-4).

Workload Strata  Combined Clearances FYs 2000-2002  Number of DDSs in Stratum
Very Small Less than 21,000 8
Small 21,000 to 50,000 7
Medium 50,001 to 100,000 8
Large 100,001 to 200,000 12
Very Large Greater than 200,000 6
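The stratum boundaries in the table above amount to a simple assignment rule. A minimal sketch (illustrative only; the thresholds are taken from the table):

def workload_stratum(combined_clearances: int) -> str:
    # Assigns a DDS to a workload stratum based on its combined initial
    # clearances for FYs 2000 through 2002, per the table above.
    if combined_clearances < 21_000:
        return "Very Small"
    if combined_clearances <= 50_000:
        return "Small"
    if combined_clearances <= 100_000:
        return "Medium"
    if combined_clearances <= 200_000:
        return "Large"
    return "Very Large"

# Example: Maine's 30,780 combined clearances place it in the Small
# stratum (see the stratification table on page A-4).
assert workload_stratum(30_780) == "Small"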

To select the 10 DDSs for our review, we used the performance indicators of adjusted Production-Per-Workyear (PPWY), Title II and Title XVI processing times, performance accuracy, and cost-per-case. We then developed an overall performance indicator using the four individual performance indicators for FYs 2000 through 2002. The overall performance indicator enabled us to select DDSs based upon all-around performance within each workload stratum. The 10 DDSs selected for our review did not consistently rank as the best performer or the worst performer on all performance indicators.

The individual performance indicators we used to select the 10 DDSs for our review are described below.

Production: Adjusted PPWY

As the performance measure for production, we used adjusted PPWY. PPWY is calculated by dividing the number of disability case clearances by the number of workyears. When contractual hours are factored into the denominator, this indicator becomes adjusted PPWY. The higher the adjusted PPWY, the more productive the DDS.

Adjusted PPWY data from FYs 2000 through 2002 were combined to arrive at a 3-year combined PPWY. The percentage difference between the DDS with the highest PPWY and each of the other DDSs within the stratum was calculated (see the table on page A-5).
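A minimal sketch of the adjusted PPWY calculation and the within-stratum percentage difference follows. The report does not specify how contractual hours convert to workyears, so the 2,080-hour conversion below is an assumption, and the figures are hypothetical:

def adjusted_ppwy(clearances: int, staff_workyears: float,
                  contractual_hours: float,
                  hours_per_workyear: float = 2080) -> float:
    # Clearances divided by workyears, with contractual hours converted
    # to workyears and factored into the denominator (assumed conversion).
    return clearances / (staff_workyears + contractual_hours / hours_per_workyear)

def percent_below_best(best: float, value: float) -> float:
    # Percentage shortfall from the highest adjusted PPWY in the stratum.
    return (best - value) * 100 / best

# Hypothetical stratum: the best DDS clears 950 cases per workyear and
# another clears 830, a shortfall of about 12.6 percent.
assert round(percent_below_best(950, 830), 1) == 12.6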

Timeliness: Title II and Title XVI Processing Time

As the performance measure for timeliness, we used Title II and Title XVI processing times, weighted by number of clearances, for FYs 2000 through 2002. The lower the overall processing time, the more timely the DDS processed disability claims.

The average processing time for each year was combined to arrive at a 3-year combined processing time. The percentage difference between the DDS with the lowest processing time and each of the other DDSs within the stratum was calculated (see the table on page A-5).
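A sketch of the clearance-weighted processing time described above, with hypothetical figures:

def weighted_processing_time(days, clearances):
    # Title II and Title XVI processing times averaged together,
    # weighted by the number of clearances in each category.
    total = sum(clearances)
    return sum(d * c for d, c in zip(days, clearances)) / total

# Hypothetical example: 100 days on 6,000 Title II clearances and 90 days
# on 4,000 Title XVI clearances yield an overall 96-day processing time.
assert weighted_processing_time([100, 90], [6000, 4000]) == 96.0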

Accuracy: Performance Accuracy Rate

Performance accuracy is the percentage of cases that do not have to be returned to DDSs for further development or correction of decisions based on evidence in the case file and as such, represents the accuracy of DDS disability decisions. This measure constitutes the performance accuracy standard set forth under 20 Code of Federal Regulations (C.F.R.) §§ 404.1643 and 416.1043. The higher the accuracy rate, the higher the number of correct decisions the DDS issued.

The performance accuracy rates for FYs 2000 through 2002 were combined to arrive at a 3-year combined accuracy. The percentage difference between the DDS with the highest accuracy and each of the other DDSs within the stratum was calculated (see the table on page A-5).

Cost-Per-Case

As a performance measure of cost, we used cost-per-case. The lower the percent change in cost-per-case, the more success the DDS had containing the cost of processing disability cases. We calculated the percent change in cost-per-case from FY 2000 to FY 2002, and used this as the measure of cost.

The cost-per-case in FY 2000 was subtracted from the cost-per-case in FY 2002. The difference was then divided by the cost-per-case in FY 2000 to arrive at the percentage change in cost-per-case (see the table on page A-5).
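The calculation above in a minimal sketch, with hypothetical dollar figures:

def percent_change_cost_per_case(cost_fy2000: float, cost_fy2002: float) -> float:
    # FY 2002 cost-per-case minus FY 2000 cost-per-case, divided by the
    # FY 2000 cost-per-case. A negative result means the DDS lowered its
    # cost-per-case over the period.
    return (cost_fy2002 - cost_fy2000) * 100 / cost_fy2000

# Hypothetical example: a rise from $500 to $530 per case is a 6 percent
# increase.
assert percent_change_cost_per_case(500, 530) == 6.0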

Combined Factor

Each DDS' percentage differences within each stratum for adjusted PPWY, Title II and Title XVI processing time, and performance accuracy rate, together with the percentage change for cost-per-case, were added together to arrive at a combined factor. Within each workload stratum, the DDSs were ranked on the basis of the combined factor: the lowest-scoring DDS on the combined factor was the highest-performing DDS, and the highest-scoring DDS on the combined factor was the lowest-performing DDS. The results of our ranking methodology are on page A-5.
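A sketch of the combined-factor ranking, using hypothetical component values; per the rule above, a lower combined factor indicates better all-around performance:

def combined_factor(ppwy_diff: float, time_diff: float,
                    accuracy_diff: float, cost_change: float) -> float:
    # The four components are weighted equally and simply summed.
    return ppwy_diff + time_diff + accuracy_diff + cost_change

def rank_stratum(factors: dict) -> list:
    # Ranks the DDSs in a stratum from best (lowest combined factor)
    # to worst (highest combined factor).
    return sorted(factors, key=factors.get)

# Hypothetical stratum of three DDSs:
factors = {
    "DDS A": combined_factor(12.97, 0.00, 2.66, -1.10),   # about 14.5
    "DDS B": combined_factor(32.69, 21.72, 0.00, 47.43),  # about 101.8
    "DDS C": combined_factor(20.00, 15.00, 3.00, 5.00),   # 43.0
}
assert rank_stratum(factors) == ["DDS A", "DDS C", "DDS B"]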

From each stratum, the highest-performing DDS and the lowest-performing DDS were selected for our review, as shown in the table below.

DDS Workload Strata  Highest-Performing DDS  Lowest-Performing DDS
Very Small Wyoming Vermont
Small Maine New Mexico
Medium Minnesota Oregon
Large Mississippi New Jersey
Very Large North Carolina Georgia

We also:

reviewed 20 C.F.R. §§ 404.1601 et seq. and 416.1001 et seq.; SSA's Program Operations Manual System DI 11005, DI 22505, DI 22510, DI 24501, DI 24505, DI 25205, DI 33510, DI 39501, DI 39503, DI 39506, DI 39518, DI 39545, DI 39557, DI 39563, and PM 00203; and Social Security Rulings 96-1p through 96-9p;

reviewed SSA published statistics on DDS disability claims processing and performance;

met with and obtained agreement from the Office of Disability Determinations on the methodology used to select the DDSs for our review; and

obtained performance-related information from the 10 DDSs selected for our review.

We did not verify the accuracy or reliability of the Social Security Administration or DDS data presented in this report. We performed fieldwork in Baltimore, Maryland, and Kansas City, Missouri, from March 2003 to March 2004. We conducted our evaluation in accordance with the Quality Standards for Inspections issued by the President's Council on Integrity and Efficiency.

DDS Workload Stratification

Total Initial Clearances

DDS FY 2000 FY 2001 FY 2002 TOTAL STRATA
1 Wyoming 2,674 2,761 3,576 9,011 Very Small
2 North Dakota 3,088 3,283 3,607 9,978 Very Small
3 Vermont 3,620 3,196 3,411 10,227 Very Small
4 South Dakota 4,281 4,714 4,781 13,776 Very Small
5 District of Columbia 5,446 4,799 5,099 15,344 Very Small
6 Delaware 4,734 5,082 6,076 15,892 Very Small
7 Montana 5,835 6,344 6,963 19,142 Very Small
8 Hawaii 6,816 7,107 6,931 20,854 Very Small
9 Rhode Island 8,129 8,101 8,482 24,712 Small
10 Idaho 8,373 8,373 9,483 26,229 Small
11 Utah 8,226 8,916 10,366 27,508 Small
12 Maine 9,575 10,155 11,050 30,780 Small
13 Nebraska 10,089 11,215 11,847 33,151 Small
14 Nevada 11,770 12,365 15,304 39,439 Small
15 New Mexico 13,429 12,734 15,394 41,557 Small
16 Iowa 15,468 17,393 19,203 52,064 Medium
17 Kansas 17,172 18,346 19,748 55,266 Medium
18 Connecticut 18,943 20,655 20,612 60,210 Medium
19 West Virginia 23,291 24,932 23,854 72,077 Medium
20 Oregon 22,530 23,481 26,687 72,698 Medium
21 Minnesota 21,810 24,153 26,915 72,878 Medium
22 Oklahoma 25,856 29,502 31,143 86,501 Medium
23 Arkansas 26,531 31,781 35,414 93,726 Medium
24 Maryland 34,105 33,178 33,663 100,946 Large
25 Arizona 32,857 33,680 35,894 102,431 Large
26 Wisconsin 30,433 32,310 40,076 102,819 Large
27 South Carolina 36,773 40,227 41,879 118,879 Large
28 Washington 38,677 38,856 43,190 120,723 Large
29 New Jersey 37,916 43,839 44,419 126,174 Large
30 Mississippi 40,471 44,068 46,156 130,695 Large
31 Massachusetts 44,975 44,217 46,773 135,965 Large
32 Indiana 42,571 41,039 52,497 136,107 Large
33 Virginia 46,502 49,192 43,481 139,175 Large
34 Kentucky 48,054 51,245 53,814 153,113 Large
35 Tennessee 53,401 58,586 62,754 174,741 Large
36 Georgia 62,796 74,746 75,726 213,268 Very Large
37 North Carolina 64,100 76,955 84,978 226,033 Very Large
38 Illinois 82,477 84,849 93,607 260,933 Very Large
39 Ohio 80,781 93,139 100,824 274,744 Very Large
40 Florida 131,884 135,664 139,219 406,767 Very Large
41 Texas 127,921 131,730 167,609 427,260 Very Large

DDSs Ranked Within Strata Based on FYs 2000, 2001, and 2002 Performance Data

Higher/Lower  DDS  Strata  Combined Accuracy  % Difference within Stratum  Combined PPWY  % Difference within Stratum  Combined Processing Time  % Difference within Stratum  % Change in Cost-Per-Case  Combined Factor (%)
High Wyoming Very Small 285.4 2.66 829.6 12.97 180.65 0.00 -1.10 14.54
Delaware Very Small 284.8 2.86 953.3 0.00 200.71 11.10 9.54 23.50
Montana Very Small 283.9 3.17 801.3 15.95 215.13 19.09 2.12 40.33
North Dakota Very Small 281.6 3.96 667.5 29.98 215.51 19.29 6.73 59.96
South Dakota Very Small 287.0 2.11 790.5 17.08 281.42 55.78 1.56 76.53
Hawaii Very Small 285.8 2.52 756.3 20.67 310.69 71.98 -3.05 92.13
Washington, D.C. Very Small 292.0 0.41 753.8 20.93 307.81 70.39 4.58 96.31
Low Vermont Very Small 293.2 0.00 641.6 32.69 219.89 21.72 47.43 101.84
High Maine Small 283.6 2.11 844.9 10.72 217.81 0.00 3.72 16.55
Nebraska Small 284.6 1.76 742.5 21.54 228.68 4.99 -6.94 21.35
Idaho Small 281.6 2.80 878.9 7.13 232.08 6.55 12.29 28.77
Nevada Small 286.9 0.97 932.5 1.46 292.25 34.17 -5.18 31.42
Rhode Island Small 281.7 2.76 946.3 0.00 289.32 32.83 8.69 44.28
Utah Small 283.3 2.21 756.4 20.07 277.01 27.18 6.17 55.63
Low New Mexico Small 289.7 0.00 765.4 19.12 332.21 52.52 12.77 84.41
High Minnesota Medium 288.1 0.79 872.4 3.95 209.53 0.00 -5.94 -1.20
Arkansas Medium 290.4 0.00 830.3 8.58 236.98 13.10 0.94 22.62
Kansas Medium 278.7 4.03 825.7 9.09 216.73 3.44 9.61 26.16
Connecticut Medium 282.1 2.86 908.2 0.00 258.96 23.59 5.04 31.50
Oklahoma Medium 287.7 0.93 902.2 0.67 274.73 31.12 6.51 39.22
West Virginia Medium 281.0 3.24 817.6 9.97 259.90 24.04 12.41 49.66
Iowa Medium 283.3 2.44 771.3 15.08 285.53 36.27 3.44 57.23
Low Oregon Medium 282.7 2.65 724.6 20.22 292.06 39.39 4.15 66.41
High Mississippi Large 278.5 2.89 1022.7 0.00 201.14 1.71 2.84 7.44
Massachusetts Large 282.8 1.39 966.6 5.48 220.81 11.65 12.14 30.67
Tennessee Large 280.2 2.30 876.4 14.30 207.20 4.77 12.12 33.49
Kentucky Large 280.1 2.34 817.2 20.09 219.17 10.82 4.30 37.56
Arizona Large 285.8 0.35 813.6 20.44 233.78 18.21 1.24 40.24
Indiana Large 283.4 1.19 854.1 16.48 271.06 37.06 -9.01 45.72
Virginia Large 282.1 1.64 891.7 12.81 197.76 0.00 33.09 47.53
Washington Large 276.5 3.59 789.6 22.79 240.50 21.61 11.35 59.34
South Carolina Large 282.4 1.53 857.8 16.12 287.37 45.31 3.04 66.00
Wisconsin Large 286.8 0.00 821.5 19.67 296.05 49.70 4.17 73.54
Maryland Large 279.9 2.41 816.8 20.13 241.23 21.98 31.22 75.74
Low New Jersey Large 272.6 4.95 754.8 26.19 363.24 83.67 -3.11 111.71
High North Carolina Very Large 280.5 1.37 833.8 4.12 252.60 6.57 -6.56 5.50
Illinois Very Large 279.4 1.76 803.0 7.67 237.03 0.00 2.47 11.90
Florida Very Large 282.9 0.53 821.3 5.57 281.97 18.96 6.65 31.70
Ohio Very Large 284.1 0.11 869.7 0.00 306.07 29.12 2.28 31.51
Texas Very Large 284.4 0.00 860.6 1.05 280.27 18.24 12.38 31.67
Low Georgia Very Large 281.9 0.88 754.5 13.24 292.01 23.20 1.42 38.73

Appendix B
Disability Determination Services (DDSs) Ranked Nationally Based on
Fiscal Years 2000, 2001, and 2002 Performance Data
DDSs are listed from highest to lowest overall performance.
Rank  DDS (high or low performing)  Combined Accuracy  % Difference Nationally  Combined PPWY  % Difference Nationally  Combined Processing Time  % Difference Nationally  % Change in Cost-Per-Case  Combined Factor (%)  Strata
1 Mississippi (high) 278.5 5.01 1022.7 0.00 201.14 11.34 2.84 19.20 Large
2 Wyoming (high) 285.4 2.66 829.6 18.88 180.65 0.00 -1.10 20.44 Very Small
3 Minnesota (high) 288.1 1.74 872.4 14.70 209.53 15.99 -5.94 26.48 Medium
4 Delaware 284.8 2.86 953.3 6.79 200.71 11.10 9.54 30.29 Very Small
5 Massachusetts 282.8 3.55 966.6 5.49 220.81 22.23 12.14 43.40 Large
6 Maine (high) 283.6 3.27 844.9 17.39 217.81 20.57 3.72 44.96 Small
7 Tennessee 280.2 4.43 876.4 14.30 207.20 14.70 12.12 45.55 Large
8 Montana 283.9 3.17 801.3 21.65 215.13 19.09 2.12 46.03 Very Small
9 Nebraska 284.6 2.93 742.5 27.40 228.68 26.59 -6.94 49.98 Small
10 Kentucky 280.1 4.47 817.2 20.10 219.17 21.32 4.30 50.19 Large
11 Arkansas 290.4 0.95 830.3 18.81 236.98 31.18 0.94 51.89 Medium
12 Arizona 285.8 2.52 813.6 20.44 233.78 29.41 1.24 53.62 Large
13 Kansas 278.7 4.95 825.7 19.26 216.73 19.97 9.61 53.79 Medium
14 North Carolina (high) 280.5 4.33 833.8 18.47 252.60 39.83 -6.56 56.07 Very Large
15 Idaho 281.6 3.96 878.9 14.06 232.08 28.47 12.29 58.78 Small
16 Virginia 282.1 3.79 891.7 12.81 197.76 9.47 33.09 59.16 Large
17 Illinois 279.4 4.71 803.0 21.48 237.03 31.21 2.47 59.87 Very Large
18 Indiana 283.4 3.34 854.1 16.49 271.06 50.05 -9.01 60.87 Large
19 Connecticut 282.1 3.79 908.2 11.19 258.96 43.35 5.04 63.37 Medium
20 North Dakota 281.6 3.96 667.5 34.73 215.51 19.30 6.73 64.71 Very Small
21 Nevada 286.9 2.15 932.5 8.82 292.25 61.78 -5.18 67.56 Small
22 Oklahoma 287.7 1.88 902.2 11.78 274.73 52.08 6.51 72.25 Medium
23 Washington 276.5 5.70 789.6 22.80 240.50 33.13 11.35 72.97 Large
24 Rhode Island 281.7 3.92 946.3 7.47 289.32 60.15 8.69 80.23 Small
25 West Virginia 281.0 4.16 817.6 20.05 259.90 43.87 12.41 80.49 Medium
26 South Carolina 282.4 3.68 857.8 16.12 287.37 59.08 3.04 81.92 Large
27 South Dakota 287.0 2.11 790.5 22.70 281.42 55.78 1.56 82.16 Very Small
28 Florida 282.9 3.51 821.3 19.70 281.97 56.09 6.65 85.95 Very Large
29 Texas 284.4 3.00 860.6 15.86 280.27 55.15 12.38 86.38 Very Large
30 Utah 283.3 3.38 756.4 26.04 277.01 53.34 6.17 88.93 Small
31 Maryland 279.9 4.54 816.8 20.13 241.23 33.54 31.22 89.42 Large
32 Iowa 283.3 3.38 771.3 24.58 285.53 58.06 3.44 89.45 Medium
33 Ohio 284.1 3.10 869.7 14.96 306.07 69.42 2.28 89.77 Very Large
34 Wisconsin 286.8 2.18 821.5 19.67 296.05 63.88 4.17 89.91 Large
35 Georgia (low) 281.9 3.85 754.5 26.22 292.01 61.65 1.42 93.14 Very Large
36 Hawaii 285.8 2.52 756.3 26.05 310.69 71.99 -3.05 97.51 Very Small
37 Oregon (low) 282.7 3.58 724.6 29.15 292.06 61.67 4.15 98.55 Medium
38 Washington, D.C. 292.0 0.41 753.8 26.29 307.81 70.39 4.58 101.67 Very Small
39 Vermont (low) 293.2 0.00 641.6 37.26 219.89 21.72 47.43 106.41 Very Small
40 New Mexico (low) 289.7 1.19 765.4 25.16 332.21 83.90 12.77 123.02 Small
41 New Jersey (low) 272.6 7.03 754.8 26.20 363.24 101.08 -3.11 131.19 Large

Appendix C
Agency Comments

MEMORANDUM

Date: August 6, 2004

To: Patrick P. O'Carroll, Jr.
Acting Inspector General

From: Larry W. Dye
Chief of Staff

Subject: Office of the Inspector General (OIG) Draft Report "Disability Determination Services' Claims Processing Performance" (A-07-03-13054)--INFORMATION

We appreciate OIG's efforts in conducting this review. Our comments on the draft report content and recommendations are attached.

Please let me know if we can be of further assistance. Staff inquiries may be directed to Candace Skurnik, Director, Audit Management and Liaison Staff, at extension 54636.

SSA Response

COMMENTS ON THE OFFICE OF THE INSPECTOR GENERAL (OIG) DRAFT EVALUATION REPORT "DISABILITY DETERMINATION SERVICES' CLAIMS PROCESSING PERFORMANCE" (A-07-03-13054)

Thank you for the opportunity to review and comment on the draft report. We are in general agreement with the findings and conclusions presented in the report. Numerous other studies have found that DDS examiner attrition, DDS staffing restrictions, and salary inequalities often affect DDS production and, indirectly, the accuracy of DDS determinations.

However, we have some reservations about the grouping of DDS performance data described in Appendix A (Scope and Methodology). We agree that the 3-year totals of yearly clearances (fiscal years 2000-2002) are a meaningful measure of productivity. However, we do not endorse a similar addition process to produce totals of weighted DDS accuracy rates. We prefer a weighted combined 3-year accuracy rate for each DDS, which is a more statistically reliable measure than the 3-year totals displayed in the report.

From the combined accuracy totals, percentage differences within each DDS stratum were calculated. Similar percentage differences were then calculated for the other two measures of performance (3-year total Productivity Per Work Year, and 3-year combined processing time), and these differences were added together to yield a single "combined factor" for each DDS. This method is very similar to the total-scorecard method of ranking DDSs that we recently analyzed.

In our analysis of the total-scorecard method, we found that the method is unfair and inequitable under any possible scheme for weighting each measure and "counting" it in the overall ranking. Even if equal weighting is given to each measure, the measures themselves are so disparate that, when they are combined mathematically into a single index, it becomes impossible to draw valid conclusions about their combined effect.

We also found that the drawbacks of the total-scorecard method cannot be overcome by simple mathematical manipulation of the performance measures. For example, as OIG points out, it is impossible to standardize cost-per-case across DDSs because of differing economic conditions in different states. However, there is no general agreement about methods for deriving a single cost index, and percentage changes in costs are not meaningful unless they are referred to a standardized and indexed baseline. A similar consideration applies, with perhaps less force, to efforts to standardize processing times across DDSs.

Therefore, although we generally agree with the conclusions of this report - and, as mentioned above, the conclusions are consistent with the results of numerous other studies - we have serious reservations about the method by which OIG reached these conclusions.

Our responses to the specific recommendations are provided below.

Recommendation 1

Continue to work with State governments to resolve the factors that result in high DDS examiner attrition and difficulties in hiring staff.

Response

We agree. The Federal/State relationship in administering SSA's disability programs is complex and presents unique challenges. Under the Social Security Act and our regulations, the States are responsible for providing qualified personnel to ensure that disability determinations are made accurately and promptly. In addition, our regulations indicate the DDSs will adhere to applicable State approved personnel standards in the selection, tenure, and compensation of any individual employed in the disability program. SSA works with States within the context of this Federal/State relationship to address staffing issues that affect the performance of the DDSs, but the DDSs are State agencies with State employees governed by State personnel rules. Rather than imposing Federal mandates regarding DDS personnel issues, SSA works collaboratively with the States to reach mutually agreeable solutions to the issues that affect program administration.

Over the years, the Office of Disability Determinations (ODD) has worked with States' leaders to overcome barriers to State mandated hiring freezes. We have been successful in many States. SSA will continue to work with State government personnel to address staffing issues that affect DDS performance.

Recommendation 2

Initiate development of an optimal staff mix model as the Commissioner's new disability determination approach is being implemented and related staffing requirements are determined.

Response

We disagree at this time. The only basis for such a model would be the past DDS disability process. The Agency is currently in transition--implementing the electronic folder (eDib) and anticipating the implementation of the Commissioner's new disability process. The staffing mix requirements for the differing processes are unknown at this time. We will need time to evaluate the staffing requirements as we transition into the new processes.

Recommendation 3

In concert with DDSs, establish outreach efforts with providers who are historically unwilling to submit medical evidence in a timely manner to educate them on the importance of timely medical evidence to disability decisions that affect the quality of life of disabled citizens.

Response

We agree. SSA will continue to encourage outreach efforts at the State, regional and national levels. Currently, each DDS has a staff specialist (Medical Relations Officer (MRO) or Professional Relations Officer (PRO)) who conducts ongoing outreach efforts with the medical community. These MRO/PROs work closely with local medical providers to obtain needed evidence in a timely fashion. The Office of Disability Programs (ODP) also participates in ongoing outreach efforts through national and regional conferences of PROs.

Recommendation 4

Assist DDSs to establish innovative processes that will lower the high rates of claimants who do not attend consultative examination (CE) appointments.

Response

We agree. SSA will continue to encourage the DDSs to establish innovative processes for lowering the rate of claimants who do not attend CE appointments. However, it does not appear that the report considered the impact of the workload mix on CE costs and claimant "no shows." Cases involving mental impairment tend to have higher rates of claimant "no shows" to the CE appointment. For cases involving CE appointments, the DDSs schedule the appointments and send follow-up reminders to the claimants. Cases involving mental impairments also tend to have less medical evidence of record (MER) in file. In these cases, the DDSs usually have a harder time acquiring MER from claimants and need to schedule CE appointments in order to make a determination.

Appendix D
OIG Contacts and Staff Acknowledgments
OIG Contacts
Mark Bailey, Director, Central Audit Division (816) 936-5591
Shannon Agee, Audit Manager, Central Audit Division (816) 936-5590

Acknowledgments
In addition to those named above:
Carol Cockrell, Program Analyst
Ken Bennett, Lead IT Specialist
Cheryl Robinson, Writer-Editor

For additional copies of this report, please visit our web site at www.ssa.gov/oig or contact the Office of the Inspector General's Public Affairs Specialist at (410) 966-1375. Refer to Common Identification Number A-07-03-13054.

Overview of the Office of the Inspector General

The Office of the Inspector General (OIG) comprises our Office of Investigations (OI), Office of Audit (OA), Office of the Chief Counsel to the Inspector General (OCCIG), and Office of Executive Operations (OEO). To ensure compliance with policies and procedures, internal controls, and professional standards, we also have a comprehensive Professional Responsibility and Quality Assurance program.

Office of Audit

OA conducts and/or supervises financial and performance audits of the Social Security Administration's (SSA) programs and operations and makes recommendations to ensure program objectives are achieved effectively and efficiently. Financial audits assess whether SSA's financial statements fairly present SSA's financial position, results of operations, and cash flow. Performance audits review the economy, efficiency, and effectiveness of SSA's programs and operations. OA also conducts short-term management and program evaluations and projects on issues of concern to SSA, Congress, and the general public.

Office of Investigations

OI conducts and coordinates investigative activity related to fraud, waste, abuse, and mismanagement in SSA programs and operations. This includes wrongdoing by applicants, beneficiaries, contractors, third parties, or SSA employees performing their official duties. This office serves as OIG liaison to the Department of Justice on all matters relating to the investigations of SSA programs and personnel. OI also conducts joint investigations with other Federal, State, and local law enforcement agencies.

Office of the Chief Counsel to the Inspector General

OCCIG provides independent legal advice and counsel to the IG on various matters, including statutes, regulations, legislation, and policy directives. OCCIG also advises the IG on investigative procedures and techniques, as well as on legal implications and conclusions to be drawn from audit and investigative material. Finally, OCCIG administers the Civil Monetary Penalty program.

Office of Executive Operations

OEO supports OIG by providing information resource management and systems security. OEO also coordinates OIG's budget, procurement, telecommunications, facilities, and human resources. In addition, OEO is the focal point for OIG's strategic planning function and the development and implementation of performance measures required by the Government Performance and Results Act of 1993.