The Commissioner

Inspector General

Performance Measure Review: Reliability of the Data Used to Measure Employer Satisfaction (A-02-01-11012)

Following consultations with congressional committees, the Office of the Inspector General agreed to review the Social Security Administration’s (SSA) performance indicators over a continuous 3-year cycle. We recently completed our first 3-year cycle. In conducting this work, we engaged an outside contractor, PricewaterhouseCoopers LLP (PwC), to assist in our efforts.

For this report, PwC reviewed two of the Agency’s performance indicators related to employers’ satisfaction with the services SSA provides to them. The objective of the review was to assess the reliability of the data used to measure employer satisfaction with SSA services.

Please comment within 60 days from the date of this memorandum on corrective action taken or planned on each recommendation. If you wish to discuss the final report, please call me or have your staff contact Steven L. Schaeffer, Assistant Inspector General for Audit, at (410) 965-9700.

James G. Huse, Jr.

OFFICE OF

THE INSPECTOR GENERAL

SOCIAL SECURITY ADMINISTRATION

PERFORMANCE MEASURE REVIEW:

RELIABILITY OF THE DATA USED TO

MEASURE EMPLOYER SATISFACTION

April 2002

A-02-01-11012

EVALUATION REPORT


Evaluation of Selected Performance Measures of the Social Security Administration:

Reliability of the Data Used to Measure Employer Satisfaction

Office of the Inspector General
Social Security Administration

INTRODUCTION

This report is one of five stand-alone reports corresponding to Social Security Administration (SSA) performance measures (PM). It covers the following two measures of employer satisfaction:

Percent of employers rating SSA’s overall service as "excellent," "very good," or "good" (Fiscal Year (FY) 2000 Goal: 93 percent)

Percent of employers rating SSA’s overall service as "excellent" (FY 2000 Goal: 13 percent)

This report reflects our understanding and evaluation of the process related to PMs #3 and #4. To achieve its strategic goal "To Deliver Customer-Responsive World-Class Service," SSA has developed several strategic objectives. One of these objectives is, by 2002, to have 9 out of 10 customers rate SSA’s service as "good," "very good," or "excellent," with most rating it "excellent." SSA’s FY 2001 Annual Performance Plan (APP) contains two performance indicators developed to meet this objective.

We performed our testing from September 21, 2000 through February 15, 2001. Our engagement was limited to testing at SSA’s headquarters in Woodlawn, Maryland. The procedures that we performed were in accordance with the American Institute of Certified Public Accountants’ Statement on Standards for Consulting Services, and are consistent with appropriate standards for performance audit engagements in Government Auditing Standards (Yellow Book, 1994 version). However, we were not engaged to and did not conduct an audit, the objective of which would be the expression of an opinion on the reliability or accuracy of the reported results of the performance measures evaluated. Accordingly, we do not express such an opinion.

BACKGROUND

These indicators were created to measure the overall satisfaction of employers who report wages to SSA. The goal for FY 2000 was to have 93 percent of employers surveyed rate SSA’s overall service as "excellent," "very good," or "good," with 13 percent rating it "excellent." Below is an overview of the employer satisfaction survey and its data collection mechanism.

Survey Frequency

This study was first conducted in 1996 by the Office of Quality Assurance and Performance Assessment (OQA) to assess the satisfaction of employers with SSA-provided services and information. The results were used to develop employer-related customer service standards. Specifically, this study aimed to seek opinions on the following issues:

In 2000, the study was repeated using a substantially modified questionnaire. The 2000 administration collected information on employers’ use, awareness, and satisfaction with SSA wage reporting services. It also sought employers’ opinions on SSA’s plans for improving wage reporting services via the Internet.

Sample Design

Both administrations of this survey used the SSA Employer Identification Master File (EIMF) as the sampling frame (the 1993 file for the 1996 study and the 1998 file for the 2000 study). Each time, a stratified sample of employers was selected, with three strata defined by the number of employees for whom wage reports were filed. Employers with an unknown number of employees were grouped into a fourth sampling stratum. The following table summarizes the sampling design for the 2000 survey.

Table 1. Sampling Design and Universe Counts for the 2000 Administration

Strata      Number of Employees    Population    Sample
Stratum 1   1 – 10                  4,200,000     1,000
Stratum 2   11 – 250                1,700,000     1,000
Stratum 3   > 250                     100,000     1,000
Stratum 4   Unknown                   500,000       500
Total                               6,500,000     3,500
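Under a stratified design such as the one in Table 1, each sampled employer carries a base weight equal to its stratum’s population divided by its sample size. The following is a minimal Python sketch (for illustration only; it is not part of SSA’s or OQA’s actual processing) using the Table 1 figures:

```python
# Base sampling weights implied by the Table 1 design:
# each respondent in stratum h represents N_h / n_h employers.
strata = {
    "1-10":    {"population": 4_200_000, "sample": 1_000},
    "11-250":  {"population": 1_700_000, "sample": 1_000},
    ">250":    {"population":   100_000, "sample": 1_000},
    "unknown": {"population":   500_000, "sample":   500},
}

weights = {h: s["population"] / s["sample"] for h, s in strata.items()}
for h, w in weights.items():
    print(f"Stratum {h}: base weight = {w:,.0f}")
```

Note that the unequal weights (4,200 for the smallest employers versus 100 for the largest) are why the raw responses must be weighted before any population-level estimate is produced.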

Questionnaire Design

The 1996 questionnaire included 23 questions, 10 of which were open-ended. That questionnaire consisted of the following five sections.

To improve the 1996 questionnaire, the Office of Communications (OComm) conducted 16 focus groups in California, Florida, Illinois, and Maryland in May and June 1999. Prior to these focus groups, which included six major segments of employers and their representatives, OComm conducted a number of in-depth interviews with representatives of employer and business groups. We applaud these questionnaire improvement initiatives, as they resulted in a more concise survey instrument and reduced respondent burden.

The resulting 2000 questionnaire, which was shorter than the one used for the 1996 survey, consisted of the following three sections:

One notable finding of this survey was that a substantial number of respondents were unfamiliar with SSA services because their wage reports were prepared by a payroll service or by someone outside the company.

Administration/Data Collection

The first survey was administered beginning in May 1996; the second was conducted between November 1999 and March 2000. Each administration followed a procedure similar to the one outlined below:

The following table summarizes the resulting response rates by stratum for the 2000 survey. As shown, the overall response rate was only 40 percent, substantially lower than the 52 percent achieved during the 1996 survey.

Table 2. Sampling Design and Response Rate for the 2000 Survey

Strata (Employees)    Sample    Excluded    Responders    Response Rate
1 – 10                 1,000          68           421              45%
11 – 250               1,000          50           422              44%
> 250                  1,000          30           374              39%
Unknown                  500          51           108              24%
Total                  3,500         199         1,325              40%
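The rates in Table 2 can be reproduced by dividing responders by the eligible sample (sample minus excluded cases). A short Python sketch, assuming excluded cases are removed from the denominator, which matches the published figures:

```python
# Response rates from Table 2: excluded cases (e.g., undeliverable
# questionnaires) are removed from the denominator before dividing.
rows = [
    ("1 - 10",   1_000, 68, 421),
    ("11 - 250", 1_000, 50, 422),
    ("> 250",    1_000, 30, 374),
    ("Unknown",    500, 51, 108),
]

total_sample = total_excluded = total_resp = 0
for stratum, sample, excluded, responders in rows:
    rate = responders / (sample - excluded)
    print(f"{stratum}: {rate:.0%}")
    total_sample += sample
    total_excluded += excluded
    total_resp += responders

print(f"Overall: {total_resp / (total_sample - total_excluded):.0%}")  # 40%
```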


Several potential reasons have been offered for the lower response rate. First, most employers do not seem to view themselves as customers or stakeholders, and instead consider wage reporting a burden; employers may therefore have been less motivated to participate in this survey than SSA expected. Second, a large percentage of employers use a third party to process their wage reporting, making it difficult for them to respond to questions with which they are not familiar. Finally, although sampled employers were asked to forward the questionnaire to the individuals or organizations they deemed most knowledgeable, only a very small number of such third parties responded to the survey. To compensate for this under-representation, OQA asked four large payroll services to complete the survey. Although the information gathered from these payroll services was not combined with the results of the employer survey, it provided OQA with additional insight into SSA’s wage reporting services.

Analysis and Report Generation

Prior to data analysis, survey data were keyed and weighted to compensate for the stratified sampling design. Various estimates were then generated from the weighted data. An example of such estimates is provided in the following table.

Table 3. Overall Rating of Service Provided by SSA (2000 Survey)

Category                         2000    Target
Excellent                        3.4%       13%
Very Good                       15.6%
Good                            28.8%
Fair                             9.3%
Poor                             0.7%
Very Poor                        0.4%
No Opinion                      41.8%
Total                            100%
Excellent, Very Good, or Good     48%       93%


Based on the results of these surveys, SSA is considering lowering the targets to more attainable levels. It is argued that, because some employers view wage reporting as a burden, they may give SSA lower ratings, deflating the overall estimate of the percent satisfied. Another reason offered for the low ratings is that many employers are unaware of the services SSA provides to employers.

When reporting the survey results, SSA calculated the percent of respondents providing a rating of Good, Very Good, or Excellent after eliminating all respondents who gave a "No Opinion" response. This approach increases the reported result from 48 percent to 82 percent.
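The arithmetic behind the 48 percent versus 82 percent figures can be verified directly from Table 3:

```python
# Overall rating distribution from Table 3 (2000 survey, percentages).
ratings = {
    "Excellent": 3.4, "Very Good": 15.6, "Good": 28.8,
    "Fair": 9.3, "Poor": 0.7, "Very Poor": 0.4, "No Opinion": 41.8,
}

# Percent satisfied over all respondents.
satisfied = ratings["Excellent"] + ratings["Very Good"] + ratings["Good"]
print(f"All respondents:        {satisfied:.0f}%")  # 48%

# Percent satisfied after dropping the "No Opinion" group.
with_opinion = 100 - ratings["No Opinion"]
print(f"Excluding 'No Opinion': {satisfied / with_opinion:.0%}")  # 82%
```

The jump from 48 to 82 percent comes entirely from shrinking the denominator, which is the upward bias discussed under "Survey results may be upwardly biased."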

RESULTS OF EVALUATION

During the period September 21, 2000 to February 15, 2001, we evaluated the processes and controls that support the FY 2000 SSA performance measurement process. In addition, we assessed the accuracy of the underlying performance measure data. Based on the information provided by SSA management as of February 15, 2001, we determined that the reported FY 2000 results of the two performance measures tested (itemized below) were reasonably stated under the methodology SSA used.

Performance Measure                                                                          Reported Result
Percent of employers rating SSA’s overall service as "excellent," "very good," or "good."    82 percent
Percent of employers rating SSA’s overall service as "excellent."                            6 percent

However, we did note the following five opportunities for improvement in SSA’s methodology:

Survey methodology may have led to a high non-response rate;
Survey results cannot be extrapolated to the target population;
Survey results may be upwardly biased;
Survey design needs improvement; and
Errors were identified in FY 2000 Survey Draft Report.

These items were noted as a result of our testing of the SSA survey instrument, as well as the focus group documentation, sampling parameters, and discussions with SSA and its contractor staff.

Survey methodology may have led to a high non-response rate.

    Overall, 40 percent of employers surveyed responded. Although it is not uncommon for mail surveys to experience response rates at this level or lower, it is generally accepted, for example by the Office of Management and Budget (OMB), that response rates of at least 70 percent are needed to provide credible results. The low response rate was partly due to the large number of employers whose wage reports are prepared by third parties, and who therefore had difficulty evaluating SSA on this aspect of its service. In addition, a significant number of employers have minimal or no direct contact with SSA, or little knowledge of the services SSA provides. For this research to serve as an employer satisfaction survey, it is critical that the sampling frame include only "eligible" employers, that is, those who have experience with SSA-provided services.

    The OMB Resource Manual for Customer Survey states, "The ability to make general statements about your agency’s customers based on the survey results is linked to how complete and accurate your frame is."

    The data collection methodology should be improved to achieve a response rate better than 40 percent. SSA should consider using a Computer Assisted Telephone Interviewing (CATI) method. With CATI, it would be possible to screen out employers who have had no direct contact with SSA. This would eliminate the need to forward the survey to third parties, which in turn would eliminate the confounding effects of responses from inappropriate respondents and, based on experience with such surveys, should increase the response rate. The Paperwork Reduction Act of 1995 Implementing Guide, Chapter 6, F2 states, "Agencies shall use electronic collection techniques where such techniques reduce burden on the public, increase efficiency of government programs, reduce cost to the government and the public, and/or provide better service to the public." While mail surveys are sometimes less expensive to administer, the CATI method could increase the quality of the resulting data, and therefore the efficiency and effectiveness of the survey.

    If a CATI approach is not feasible, a second suggestion is to include a simple "screening" section at the top of the questionnaire asking whether the respondent has ever used or attempted to use SSA services. If the answer is "NO," the questionnaire would instruct the respondent to put the form in the return envelope and return it without further effort. This should tend to increase the response rate (by adding those who are out of scope), allow a count of those using versus not using SSA services, and avoid the confusion of a handoff between the employer and third-party service providers. SSA could then send a separate, more controlled survey to a sample of third-party service providers.

    No matter which mode of data collection is employed, it is important to conduct a non-response analysis after the survey has been administered. This would make it possible to design and implement a non-response adjustment step in the weighting process, which can reduce some of the potential bias due to differential non-response.
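As an illustration of one common form of non-response adjustment (a sketch under simple assumptions, not SSA’s actual weighting procedure; the function name and the choice of sampling strata as weighting classes are hypothetical), respondents’ base weights within a weighting class are inflated by the inverse of the class response rate, so that respondents also represent the non-respondents in their class:

```python
# Illustrative non-response adjustment within a weighting class:
# divide the base weight by the class response rate so that each
# respondent also stands in for the class's non-respondents.
def adjusted_weight(base_weight: float, eligible: int, responders: int) -> float:
    response_rate = responders / eligible
    return base_weight / response_rate

# Stratum 1 figures from Tables 1 and 2: base weight 4,200,
# 932 eligible cases (1,000 sampled minus 68 excluded), 421 responders.
w = adjusted_weight(4_200_000 / 1_000, eligible=932, responders=421)
print(f"{w:,.0f}")  # each responder now represents about 9,298 employers
```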

Survey results cannot be extrapolated to the target population.

    To minimize non-response, the current survey instructed employers to forward the questionnaire to the third party they use for wage reporting. This snowballing feature makes it impossible to extrapolate the results to the target population. The approach also mixes the awareness results of employers and third-party providers; because these two groups are likely to have different levels of awareness, it is best to produce separate awareness estimates for each.

    The OMB Resource Manual for Customer Survey states, "The ability to make general statements about your agency’s customers based on the survey results is linked to how complete and accurate your frame is."

Survey results may be upwardly biased.

    The above issues have affected the analysis and reporting of the survey data as well. For instance, when calculating the percent of employers satisfied with SSA services, the large number of respondents who expressed no opinion are excluded from the calculation. Because of this exclusion, the estimate of the percent of employers satisfied with SSA service changes from 48 percent (of the entire set of respondents) to 82 percent (excluding those who said they had "no opinion"). It can be argued that such an estimate is upwardly biased. Instead, the report should present the percentages for each of the answer categories, including "no opinion," as shown in Table 3. The "no opinion" response, as currently implemented, does not clearly distinguish those who have no opinion because they did not use the services from those who used the service but simply have no opinion.

The survey design needs improvement.

    Section 1 of the 2000 survey (page A2) includes a battery of awareness questions with simple Yes/No answer categories. This is not an efficient use of valuable questionnaire space, because awareness questions with simple yes/no answers have limited informational value. Instead of allocating a significant part of the survey to such questions, it would be advisable to restructure them so that both awareness and the degree to which respondents find a given service useful can be captured. An example would be:

    On a scale of 1 to 5, where 1 means very useless and 5 means very useful, please rate the usefulness of each of the following services provided by the SSA. If you are unaware of the given service, please mark the "Don’t know" box.

    Changing the wording of these questions and their answer categories would allow SSA to obtain awareness measures while also determining how useful respondents find the services. In addition, the 2000 questionnaire did not number its questions, making it difficult for respondents to delineate individual questions, since some questions wrap to the next line. Numbering the questions would make it easier to identify where questions begin and end.

Errors were identified in FY 2000 Survey Draft Report.

During our evaluation of the 2000 draft report, we identified a few minor mistakes. While these errors will not have a significant impact on the reported results, it is important that such errors be corrected in the final report. Examples of the errors are as follows:

CONCLUSIONS AND RECOMMENDATIONS

Our evaluation found that the primary goal of this survey, to report FY 2000 results of the two performance measures, has been fulfilled. However, we noted several methodological and administrative issues with the 2000 survey. We recommend that SSA consider the following corrective actions:

  1. SSA should enhance the sampling frame to include a larger percentage of respondents and only those that are "eligible" employers (i.e., employers who use SSA’s services and have direct contact with SSA). SSA should consider a more effective data collection methodology, such as a CATI method. It is likely that SSA could obtain as many respondents as it did for the most recent survey (approximately 1,300) at a cost comparable to the mail-out protocol most recently employed. If SSA continues the mail protocol, it should follow up by telephone with employers who do not respond after the second follow-up.

  2. In addition, SSA should conduct a non-response analysis after the survey has been administered and develop non-response adjustment factors for the survey data.

  3. If a CATI approach is not feasible, SSA should consider including a simple "screening" section at the top of the questionnaire that asks whether respondents ever used or attempted to use SSA services.
  4. SSA should change the structure of the yes/no questions so that both awareness and degree to which respondents find the given service useful can be captured.

  5. In addition, SSA should number the questions to make it easier to identify where questions begin and end.

  6. When calculating the percent of respondents providing a rating of Good, Very Good, or Excellent, the denominator should include all respondents, regardless of their ratings. This will provide an unbiased estimate of the percent satisfied.

  7. In future surveys, SSA should ask respondents whether they use SSA services, and exclude non-users from the computation as non-customers.

APPROPRIATENESS OF THE PERFORMANCE MEASURES

    As part of this engagement, we evaluated the appropriateness of each of the performance measures with respect to the Government Performance and Results Act of 1993 (GPRA) compliance and SSA’s APP. To do so, we determined whether the specific indicators and goals corresponded to the strategic goals identified in SSA’s APP, determined whether each of these indicators accurately measure performance, and determined their compliance with GPRA requirements.

    Performance Measures #3 and #4 align logically with the SSA Strategic Plan but still need improvement

    The relationships between PMs #3 and #4 and the applicable SSA Strategic Goal are depicted in the following figure:

    [Flowchart depicting the relationships between PMs #3 and #4 and the applicable SSA Strategic Goal]

    The SSA mission is supported by five strategic goals, including Goal #2, "To deliver customer-responsive world-class service." Goal #2, in turn, is supported by several strategic objectives, including the relevant objective dealing with customer perceptions, "By 2002, to have 9 out of 10 customers rate SSA’s service as ‘good,’ ‘very good,’ or ‘excellent,’ with most rating it ‘excellent’." Performance Measures #3 and #4 characterize employer satisfaction with SSA services. Both PM #3 and PM #4 logically align with SSA’s strategic planning process.

    Based on the taxonomy of performance measures included in Appendix F, both PM #3 and PM #4 are measures of accomplishment because they report on a result (employer satisfaction) achieved with SSA resources. They are further categorized as outcome measures because they indicate the accomplishments or results (employer satisfaction) that occur because of the SSA services provided. As shown in Appendix F, outcome measures include public perceptions of outcomes, such as employer satisfaction.

    Within the framework of GPRA, Performance Measures #3 and #4 fit the intent of an outcome measure because they provide "…a description of the intended result, effect, or consequence that will occur from carrying out a program or activity."3 The intent of these two performance measurements is to gauge employer satisfaction (i.e., the effect) of the activity of providing services. They can both be useful to management and external stakeholders, as encouraged by OMB’s Circular A-11, Preparation and Submission of Budget Estimates. Given the duplication of the measures, SSA should eliminate one of them, unless it finds utility in keeping both. Nevertheless, these measures could be improved, as discussed below.

    In its current form, the metric may not provide reliable results. The survey methodology may not be appropriate for the underlying objectives. The results cannot be extrapolated to the target population and may be upwardly biased. Furthermore, our evaluation of the draft report identified errors in the reported data. In addition, the survey design was confusing for employers, and the corresponding results are skewed; for example, almost 42 percent of respondents stated they had no opinion on the services provided by SSA.

    Ideally, a performance metric should help the agency to take action and affect the performance of the indicator being measured. However, due to the presence of numerous third parties, the measurement becomes clouded and is, therefore, less credible and less useful.

    The metric readily facilitates comparisons between geographic regions and business segments. With some enhancements, the measurement system could also be used to benchmark against other agencies with employer relationships, such as the IRS, the Health Care Financing Administration, the Pension Benefit Guaranty Corporation, and the Office of Personnel Management.

    Recommendations

  8. SSA should enhance its measurement system to ensure greater accuracy.
  9. SSA should also take steps to make the metric actionable so that it provides greater utility for the agency.
  10. In addition, SSA should consider using the measurement to benchmark against other agencies that interface with employers.
  11. Finally, SSA should eliminate one of the two measures, unless it finds utility in keeping both.

Agency Comments

The Agency agreed with, or has a plan of action to address, 8 of the 11 recommendations contained in this report. In fact, it has already taken steps to implement some of the recommendations. For example, it eliminated yes/no questions when it modified its survey instrument. We recommended removing this type of question because its answers provide limited informational value.

SSA disagreed with the recommendation that all respondents should be included in the denominator, regardless of their rating, when calculating the percent of respondents providing a rating of Good, Very Good or Excellent. The Agency stated that basing ratings on substantive responses is an acceptable practice, as long as accompanying discussion indicates that the percentages reflect the opinions of those who provided a rating.

The Agency also disagreed with the recommendation to consider using the employer satisfaction measurement to benchmark against other agencies that interface with employers. The Agency stated that modifications of the performance measure, as well as changes made to the survey instrument, addressed the need for benchmarking at this time. It also stated that its day-to-day interactions with States and the IRS provided an awareness of "best practices" used by those entities. SSA provides due consideration to incorporating those practices into its own service delivery model.

Finally, we recommended that the Agency consider eliminating one of the two measures. SSA, however, intends to keep both measures at this time. SSA stated that the first measure, Percent of employers rating SSA’s overall service as "excellent," "very good" or "good", identifies the universe of satisfied employers/customers. The second measure, Percent of employers rating SSA’s overall service as "excellent", serves to capture how well the Agency is achieving best in business service. The full text of SSA’s comments can be found in Appendix C.

OIG Response

We appreciate SSA’s comments on this report. The implementation of the recommendations contained within this report will help to ensure better measurement of the level of service provided to employers who have used SSA’s services. We are glad to see that the Agency has already taken steps to implement some of our recommendations through modifications made to its survey instrument.

We continue to believe SSA should reexamine the methods used to calculate employers’ level of satisfaction with its services. Under the current methods, the estimate of the percent of employers satisfied with SSA service changes from 48 percent (of the entire set of respondents) to 82 percent (excluding those who did not have an opinion of SSA’s services). SSA’s reporting of the employer survey results should present the percentages for each of the answer categories, including those that do not capture an opinion of the service provided. SSA should also work to differentiate between respondents who received service but did not express an opinion and those who did not receive service from SSA and therefore could not express an opinion.

Regarding benchmarking, we note the usefulness of SSA’s interactions with State agencies and the IRS, and we suggest that SSA continue to look for opportunities to benchmark its operations against other agencies that interface with employers.

Finally, while we respect the Agency’s prerogative to create its performance measures, we believe that the use of both measures is repetitive. SSA can provide decisionmakers with a clear understanding of the satisfaction of employers who have had contact with SSA through the use of one performance measure.

Appendices

APPENDIX A – Scope and Methodology

APPENDIX B – Acronyms

APPENDIX C – Agency Comments

APPENDIX D – Performance Measure Summary Sheets

APPENDIX E – Performance Measure Process Maps

APPENDIX F – Performance Measure Taxonomy

Scope and Methodology

The Social Security Administration’s (SSA) Office of the Inspector General contracted PricewaterhouseCoopers LLP (PwC) to evaluate 11 SSA performance indicators identified in its Fiscal Year (FY) 2001 Annual Performance Plan (APP). We performed our testing from September 21, 2000 through February 15, 2001. Since FY 2001 performance results were not yet available as of the date of our evaluation, we performed tests of the performance data and related internal controls surrounding the maintenance and reporting of the results for FY 2000. Specifically, we performed the following:

  1. Obtained an understanding and documented the FY 2000 Employer Survey Methodology;
  2. Tested the reasonableness of the survey data;
  3. Determined whether performance measures were meaningful and in compliance with the Government Performance and Results Act of 1993 (GPRA); and
  4. Identified findings relative to the above procedures and provided recommendations for improvement.

Our engagement was limited to testing at SSA’s headquarters in Woodlawn, Maryland. The procedures that we performed were in accordance with the American Institute of Certified Public Accountants’ Statement on Standards for Consulting Services, and are consistent with appropriate standards for performance audit engagements in Government Auditing Standards (Yellow Book, 1994 version). However, we were not engaged to and did not conduct an audit, the objective of which would be the expression of an opinion on the reliability or accuracy of the reported results of the performance measures evaluated. Accordingly, we do not express such an opinion.

Obtained an understanding and documented the FY 2000 Employer Survey Methodology.

We obtained an understanding of the underlying process and operating procedures surrounding the generation of the performance measures through interviews and meetings with the appropriate SSA personnel. Our work involved a comprehensive evaluation of the survey methodology used for this performance measure. The specific steps included evaluating the sampling and questionnaire designs, data collection procedures, and data analysis and reporting. We evaluated the following documents:

Tested the reasonableness of the survey data.

To ensure the reasonableness of the number reported in the FY 2000 GPRA section of the SSA Performance and Accountability Report, we evaluated the Employer Satisfaction Survey data provided by SSA’s Office of Quality Assurance and Performance Assessment. We verified the accuracy of the data by calculating the frequencies and counts using SAS software. This included calculating the percent of employers rating SSA wage reporting service as "excellent," "very good," or "good."

Determined whether performance measures were meaningful and in compliance with GPRA

As part of this engagement, we evaluated the appropriateness of each of the performance measures with respect to GPRA compliance and SSA’s APP. To do so, we determined whether the specific indicators and goals corresponded to the strategic goals identified in SSA’s APP, determined whether each of these indicators accurately measure performance, and determined their compliance with GPRA requirements.

ACRONYMS

APP Annual Performance Plan
CATI Computer Assisted Telephone Interviewing
EIMF Employer Identification Master File
FY Fiscal Year
GPRA Government Performance and Results Act
IRS Internal Revenue Service
OAS Office of Automation Support
OComm Office of Communications
OMB Office of Management and Budget
OQA Office of Quality Assurance and Performance Assessment
PM Performance Measure
PwC PricewaterhouseCoopers LLP
SSA Social Security Administration

AGENCY COMMENTS

COMMENTS ON THE OFFICE OF THE INSPECTOR GENERAL (OIG) DRAFT REPORT, "PERFORMANCE MEASURE REVIEW: RELIABILITY OF THE DATA USED TO MEASURE EMPLOYER SATISFACTION" (A-02-01-11012)

Recommendation 1

Enhance the sampling frame to include a larger percentage of respondents and only those which are "eligible" employers, i.e., those employers who use the Social Security Administration’s (SSA) services and have direct contact with SSA.

Comment

We agree. We have discontinued the generalized sampling frame used in the survey OIG reviewed. We have already implemented a survey in FY 2002 targeted to employers who have received direct service from SSA; i.e., employers who called the Employer Reporting Services Center (ERSC) for assistance. This sampling frame includes specific contact information that will greatly enhance our ability to achieve an acceptable response rate.

Recommendation 2

Conduct a non-response analysis after the survey has been administered and develop non-response adjustment factors for the survey data.

Comment

We will conduct the suggested analysis if the response rate for the survey falls below acceptable levels. We anticipate that our revised methodology, which includes use of Computer Assisted Telephone Interviewing (CATI) on a more targeted sample, will result in a much higher response rate.
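
A common form of the non-response adjustment contemplated in Recommendation 2 weights each respondent by the inverse of the response rate within its stratum. The sketch below illustrates this under hypothetical strata and counts; the figures are not SSA data.

```python
def nonresponse_weights(sampled, responded):
    """Inverse-response-rate adjustment factors per stratum.

    sampled and responded map stratum name -> counts; strata with no
    respondents are omitted. All figures used below are hypothetical.
    """
    return {s: sampled[s] / responded[s] for s in sampled if responded.get(s, 0) > 0}

# Hypothetical strata (e.g., employer size classes), not SSA figures.
sampled = {"small": 400, "medium": 300, "large": 100}
responded = {"small": 100, "medium": 150, "large": 80}
weights = nonresponse_weights(sampled, responded)
# Each small-employer response then counts 4x (400/100) in weighted estimates.
```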

Recommendation 3

If a Computer Assisted Telephone Interview approach is not feasible, consider including a simple "screening" section at the top of the questionnaire that asks whether respondents ever used or attempted to use SSA services.

Comment

We have adopted a CATI approach for the current survey, which also uses a completely redesigned questionnaire developed to assess satisfaction with service provided by the ERSC.

Recommendation 4

Change the structure of the yes/no questions so that both awareness and degree to which respondents find the given service useful can be captured.

Comment

Our current survey does not include yes/no questions.

Recommendation 5

Number the questions to make it easier to identify where questions begin and end.

Comment

We agree. The CATI software being used for the current survey eliminates this problem.

Recommendation 6

When calculating the percent of respondents providing a rating of Good, Very Good, or Excellent, the denominator should include all respondents, regardless of their ratings.

Comment

We disagree. Basing ratings on substantive responses is acceptable practice, as long as accompanying discussion indicates that the percentages reflect the opinions of those who provided a rating, as was done in the report prepared by the Office of Quality Assurance and Performance Assessment. It should also be noted that the rating questions did not include a choice of "no opinion" as indicated in OIG's report; the choice was "not applicable/service not used," which served as a screening device that fit into the format of the questionnaire.
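
The numerical effect of the two denominator conventions at issue in Recommendation 6 can be illustrated as follows; the counts are hypothetical, not actual survey results.

```python
def favorable_rate(favorable, substantive, all_respondents, include_all):
    """Favorable percentage under either denominator convention.

    include_all=False divides by substantive responses only (OQA's practice);
    include_all=True divides by all respondents (OIG's recommendation).
    """
    denom = all_respondents if include_all else substantive
    return 100.0 * favorable / denom

# Hypothetical counts: 60 favorable ratings, 80 substantive responses,
# 100 total respondents (20 answered "not applicable/service not used").
oqa_convention = favorable_rate(60, 80, 100, include_all=False)  # 75.0
oig_convention = favorable_rate(60, 80, 100, include_all=True)   # 60.0
```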

Recommendation 7

In future surveys, ask respondents whether they use SSA services or not, and exclude them from the computation as non-customers.

Comment

Our FY 2002 survey specifically targets employers who have received direct service from SSA (i.e., employers who call the ERSC for assistance).

Recommendation 8

Enhance the measurement system to ensure greater accuracy.

Comment

We agree. The revised methodology for the employer survey should produce this result.

Recommendation 9

Take steps to make the metric actionable so that it will provide greater utility for the Agency.

Comment

We agree. We have modified the employer performance indicator to "Percent of employers rating SSA's overall service during interactions with SSA as excellent, very good or good." This performance measure will be based solely on results of a survey of employers who have called the ERSC. Therefore, the performance measure will indicate satisfaction specifically with service from the ERSC. The survey results will identify specific areas where improvements can be made in the Center's service in order to increase satisfaction as captured by the performance measure. In the future, we intend to include additional types of employer interactions in the survey. At that time, we will determine if and how the employer performance measures should be modified.

Recommendation 10

Consider using the measurement to benchmark against other agencies that interface with employers.

Comment

We believe the modification to the indicator described above and changes to the survey instrument address the need for benchmarking at this time. In addition, our day-to-day interactions with the States and the Internal Revenue Service provide us with an awareness of "best practices" used by those entities. Due consideration is given to incorporating those practices into our own service delivery.

Recommendation 11

Eliminate one of the two measures, unless SSA finds a utility in keeping both measures.

Comment

At this time, we intend to keep both measures. We developed the two performance measures to capture how well the Agency is achieving both parts of the customer satisfaction strategic objective--"...9 out of 10 customers rate SSA's service as good, very good or excellent" and "with most rating it excellent." This approach is consistent with the two measures assessing the satisfaction of our core business customers. In both instances, the first measure identifies the universe of satisfied employers/customers, and the second captures how well the Agency is achieving best-in-business service.

Performance Measure Summary Sheets

Name of Measure: 3) Percent of employers rating SSA’s overall service as "excellent," "very good," or "good."

Measure Type: Percentage

Strategic Goal: To deliver customer-responsive world-class service.

Strategic Objective: By 2002, to have 9 out of 10 customers rate SSA’s service as "good," "very good," or "excellent," with most rating it "excellent."

Purpose: To assess the percent of employers rating SSA’s overall service as excellent, very good, or good, SSA’s Office of Quality Assurance and Performance Assessment (OQA) surveyed a sample of employers obtained from SSA’s Employer Identification Master File (EIMF) for CY 1998. The EIMF is updated based on data received from the IRS Master Business File.

Survey Frequency: Yearly

Target Goal: 93%

How Computed: Number of employers surveyed by SSA’s OQA who rate overall service as "good," "very good," or "excellent" divided by the total number of respondents to that question.

Data Source: Employer Satisfaction Survey

Designated Staff Members: Jean Venable, Mark Ruley, Lois Smith, Norman Goldstein, Marvin Brody, Allan Kaufman, Peg Blatter, Sail Reiner, Bartona Harrison

Divisions: OSM, OSFE, OSFE, OQA, Office of Automation Support (OAS)

Testing and Results

We performed an evaluation of the survey methodology used for this performance measure. The specific steps included evaluating the sampling and questionnaire designs, data collection procedures, and data analysis and reporting. In this process, we evaluated the following documents and data.

  • 1996 Final Report, issued in April 1998
  • 2000 Draft Report, issued in October 2000
  • 2000 survey data
  • 2000 Focus Group reports, undated

Refer to "Results of Evaluation" for a description of the findings.

 

Name of Measure: 4) Percent of employers rating SSA’s overall service as "excellent."

Measure Type: Percentage

Strategic Goal: To deliver customer-responsive world-class service.

Strategic Objective: By 2002, to have 9 out of 10 customers rate SSA’s service as "good," "very good," or "excellent," with most rating it "excellent."

Purpose: To assess the percent of employers rating SSA’s overall service as excellent, SSA’s Office of Quality Assurance and Performance Assessment (OQA) surveyed a sample of employers obtained from SSA’s Employer Identification Master File (EIMF) for CY 1998. The EIMF is updated based on data received from the IRS Master Business File.

Survey Frequency: Yearly

Target Goal: 13%

How Computed: Number of employers surveyed by SSA’s OQA who rate overall service as "excellent" divided by the total number of respondents to that question.

Data Source: Employer Satisfaction Survey

Designated Staff Members: Jean Venable, Mark Ruley, Lois Smith, Norman Goldstein, Marvin Brody, Allan Kaufman, Peg Blatter, Sail Reiner, Bartona Harrison

Divisions: OSM, OSFE, OQA, OAS

Testing and Results

We performed an evaluation of the survey methodology used for this performance measure. The specific steps included evaluating the sampling and questionnaire designs, data collection procedures, and data analysis and reporting. In this process, we evaluated the following documents and data.

  • 1996 Final Report, issued in April 1998
  • 2000 Draft Report, issued in October 2000
  • 2000 survey data
  • 2000 Focus Group reports, undated

Refer to "Results of Evaluation" for a description of the findings.

Performance Measure Process Maps

Performance Measure Taxonomy