OFFICE OF
THE INSPECTOR GENERAL

SOCIAL SECURITY ADMINISTRATION

PERFORMANCE INDICATOR AUDIT:
HEARINGS AND APPEALS PROCESS

January 2006 A-15-05-15113

AUDIT REPORT

Mission

We improve SSA programs and operations and protect them against fraud, waste, and abuse by conducting independent and objective audits, evaluations, and investigations. We provide timely, useful, and reliable information and advice to Administration officials, the Congress, and the public.

Authority

The Inspector General Act created independent audit and investigative units, called the Office of Inspector General (OIG). The mission of the OIG, as spelled out in the Act, is to:

Conduct and supervise independent and objective audits and investigations relating to agency programs and operations.
Promote economy, effectiveness, and efficiency within the agency.
Prevent and detect fraud, waste, and abuse in agency programs and operations.
Review and make recommendations regarding existing and proposed legislation and regulations relating to agency programs and operations.
Keep the agency head and the Congress fully and currently informed of problems in agency programs and operations.

To ensure objectivity, the IG Act empowers the IG with:

Independence to determine what reviews to perform.
Access to all information necessary for the reviews.
Authority to publish findings and recommendations based on the reviews.

Vision

By conducting independent and objective audits, investigations, and evaluations, we are agents of positive change striving for continuous improvement in the Social Security Administration's programs, operations, and management and in our own office.

MEMORANDUM

Date: January 24, 2006

To: The Commissioner

From: Inspector General

Subject: Performance Indicator Audit: Hearings and Appeals Process (A-15-05-15113)

We contracted with PricewaterhouseCoopers, LLP (PwC) to evaluate 16 of the Social Security Administration’s performance indicators established to comply with the Government Performance and Results Act. The attached final report presents the results of four of the performance indicators PwC reviewed. For the performance indicators included in this audit, PwC’s objectives were to:
• Assess the effectiveness of internal controls and test critical controls over the data generation, calculation, and reporting processes for the specific performance indicator.
• Assess the overall reliability of the performance indicator’s computer-processed data. Data are reliable when they are complete, accurate, consistent, and not subject to inappropriate alteration.
• Test the accuracy of results presented and disclosed in the Fiscal Year 2004 Performance and Accountability Report.
• Assess if the performance indicator provides a meaningful measurement of the program it measures and the achievement of its stated objective.

This report contains the results of the audit for the following indicators:
• Number of appellate actions processed.
• Number of SSA hearings cases processed per workyear.
• Number of SSA hearings pending.
• Hearings decision accuracy rate.

Please provide within 60 days a corrective action plan that addresses each recommendation. If you wish to discuss the final report, please call me or have your staff contact Steven L. Schaeffer, Assistant Inspector General for Audit, at (410) 965-9700.

Patrick P. O’Carroll, Jr.

Attachment

MEMORANDUM

Date: January 17, 2006

To: Inspector General

From: PricewaterhouseCoopers LLP

Subject: Performance Indicator Audit: Hearings and Appeals Process (A-15-05-15113)

OBJECTIVE

The Government Performance and Results Act (GPRA) of 1993 requires the Social Security Administration (SSA) to develop performance indicators that assess the relevant service levels and outcomes of each program activity. GPRA also calls for a description of the means employed to verify and validate the measured values used to report on program performance.

Our audit was conducted in accordance with generally accepted government auditing standards for performance audits. For the performance indicators included in this audit, our objectives were to:

1. Assess the effectiveness of internal controls and test critical controls over the data generation, calculation, and reporting processes for the specific performance indicator.

2. Assess the overall reliability of the performance indicator’s computer-processed data. Data are reliable when they are complete, accurate, consistent, and not subject to inappropriate alteration.

3. Test the accuracy of results presented and disclosed in the Fiscal Year (FY) 2004 Performance and Accountability Report (PAR).

4. Assess if the performance indicator provides a meaningful measurement of the program it measures and the achievement of its stated objective.

BACKGROUND

We audited the following performance indicators as stated in the SSA FY 2004 PAR:

Performance Indicator                                        FY 2004 Goal    FY 2004 Reported Results
Number of Appellate Actions Processed                             996,500                   1,019,007
Number of SSA Hearings Cases Processed per Workyear (PPWY)            105                       100.2
Number of SSA Hearings Pending                                    586,000                     635,601
Hearings Decision Accuracy Rate                                       90%                        90%*

*The performance data shown for FY 2004 are estimated; actual data were not available until December 2005. (Social Security Administration Performance and Accountability Report, Fiscal Year 2004, p. 92.)

SSA administers the Old-Age and Survivors Insurance (OASI), Disability Insurance (DI) and Supplemental Security Income (SSI) programs. The OASI program, also referred to as Retirement and Survivors Insurance (RSI), is authorized by Title II of the Social Security Act and provides benefits for eligible workers and for eligible members of their families and survivors. The DI program, also authorized by Title II of the Social Security Act, provides income for eligible workers who have qualifying disabilities and for eligible members of their families before those workers reach retirement age. The SSI Program, authorized by Title XVI of the Social Security Act, was designed as a needs-based program to provide or supplement the income of aged, blind, and/or disabled individuals with limited income and resources.

To determine eligibility for both Title II and Title XVI programs, applicants must first file a claim with SSA. This is typically accomplished through an appointment or walk-in visit to one of SSA’s approximately 1,300 field offices (FO). Interviews with the applicants are conducted by FO personnel via the telephone or in person to determine the applicants’ nonmedical eligibility. If the applicants are filing for benefits based on disability, basic medical information concerning the disability, medical treatments, and identification of treating sources is obtained.

After the applicants submit a claim, they will receive an initial determination of benefits. If a claimant disagrees with the initial determination, he/she can appeal within 60 days. The SSA appeals program provides four levels of appeal for a claimant. The four levels of appeal are:

• Reconsideration;
• Hearing;
• Appeals Council (AC) Review; and,
• Lawsuit in Federal District Court.

Reconsideration
The first level of appeal is a reconsideration, in which a complete review of the claim is conducted by an SSA employee who did not take part in the initial decision. All of the evidence initially submitted by the claimant, and any new evidence, is re-evaluated during the reconsideration process. Upon receiving the reconsideration decision, the claimant may request a hearing if he/she disagrees with the decision.

Hearing
The second level of appeal is a hearing, which is conducted by an Administrative Law Judge (ALJ) who is independent of both the initial determination and the reconsideration decision. The ALJ reviews all information related to the claim and makes the hearing decision. If the claimant disagrees with an ALJ’s hearing decision, the claimant may request an AC review.

AC Review
The AC evaluates all requests for review, but can deny a request if it believes the hearing decision was correct. If the AC grants the request for review, it will either complete the review or return it to an ALJ for further review. If the claimant disagrees with the review decision or if the AC decides not to review the case, the claimant may file a lawsuit in a Federal District Court.

Lawsuit in Federal District Court
The Federal District Court may remand the court case to SSA’s Commissioner for further consideration or dismiss the case. If remanded to the Commissioner, the AC, acting on behalf of the Commissioner, can make a decision or remand the case to an ALJ to make a decision.

(For additional details of the appeals process, refer to the flowcharts in Appendix C.)

RESULTS OF REVIEW

For one or more of the indicators included in this report, we identified:

• Insufficient documentation supporting the process and controls to create, monitor, and report the results of the performance indicators;
• SSA employees with excessive system access rights to the datasets used to calculate the results of the performance indicators;
• Control weaknesses within the applications that impact the results of the performance indicators;
• Issues with the accuracy and presentation of information reported in the PAR; and
• Inconsistent retention of detailed data used to calculate the performance indicator results.

Number of Appellate Actions Processed

Indicator Background

“Number of appellate actions processed” is the summary count of the following appellate actions:

• Reconsiderations;
• AC reviews; and,
• Court cases and court remands from Federal District Courts.

Although part of the appeals process, hearings are not included in the count of appellate actions, but are included in the performance indicator, “Number of SSA Hearings Processed.”

Reconsiderations
Reconsiderations accounted for approximately 87.5 percent of appellate actions processed. When a request for reconsideration is filed, the request is entered into the appropriate application, which varies depending on the type of claim that is being appealed (i.e. RSI, DI, or SSI).

RSI reconsideration requests are generally entered into the Processing Center Action Control System (PCACS), but can also be processed through the Recovery of Overpayments, Accounting and Reporting (ROAR) application for reconsideration requests related to overpayments. The PCACS reconsideration data are uploaded into the Process Center Management Information System (PCMI), which generates a report of processed reconsiderations. The ROAR application generates the OC210 report, which includes a count of reconsiderations that have been input into ROAR.

Most DI and SSI disability reconsideration requests are processed by Disability Determination Services (DDS) in each State. The data are entered through the Levy application and are automatically transferred to the Disability Operational Datastore (DIODS). The count of processed DI reconsiderations is reported on the State Agency Operations Report (SAOR), which is generated from DIODS. DI and SSI disability reconsiderations can also be processed at the Federal DDS (FDDS) through the FDDS Case Tracking System (FDDS CTS), and through PCACS and ROAR as described previously.

SSI nondisability reconsideration requests are entered through the Modernized SSI Claims System (MSSICS). The data are stored in the Title XVI Operational Datastore (TXVI ODS), which then provides the data to the Integrated Work Measurement System (IWMS). The count of processed SSI reconsiderations is reported on the Field Counts spreadsheet, which is an output from IWMS.

AC Reviews
AC reviews accounted for approximately 9.5 percent of appellate actions processed. AC reviews are tracked in the Appeals Council Automated Processing System (ACAPS). Each month, the count of processed AC reviews is reported in the Monthly Office of Appellate Operations Disposition report generated from ACAPS.

Court Cases and Court Remands
Court cases and court remands accounted for approximately 3 percent of appellate actions processed. Both are tracked in the Litigation Overview Tracking System (LOTS). Each month, the count of processed court cases and court remands is reported in the Workload Production report generated from LOTS.

All appellate actions processed are totaled manually on a worksheet from the supporting documentation provided by each component.

Performance Indicator Calculation

Number of Appellate Actions Processed = Reconsiderations + AC reviews + new court cases + court remands
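
To make the arithmetic concrete, the sketch below sums the component counts in one step. It is purely illustrative: the function name and the component figures are ours (chosen to roughly match the component percentages cited above), not SSA’s actual FY 2004 source data.

    # Minimal sketch of the indicator arithmetic. The component counts
    # below are hypothetical, not SSA's actual FY 2004 figures.
    def appellate_actions_processed(reconsiderations, ac_reviews,
                                    new_court_cases, court_remands):
        return reconsiderations + ac_reviews + new_court_cases + court_remands

    # Roughly 87.5% reconsiderations, 9.5% AC reviews, 3% court workload:
    print(appellate_actions_processed(891_000, 97_000, 20_000, 11_000))
    # -> 1019000 (illustrative total)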

Findings

Internal Controls and Data Reliability

We found that the policies and procedures related to the formal process to capture, store, and calculate the results of the performance indicator were not adequate. The documentation did not accurately describe the process in place during FY 2004, and not all components of the indicator calculation were included. Office of Management and Budget (OMB) Circular A-123, Management Accountability and Control, requires, “…documentation for transactions, management controls, and other significant events must be clear and readily available for examination.”

In addition, formally documented change control policies and procedures for the ACAPS and LOTS applications had not been developed, resulting in a greater risk of unauthorized changes being made to those applications. Guidance from the National Institute of Standards and Technology (NIST) states, "An effective agency configuration management and control policy and associated procedures are essential to ensure adequate consideration of the potential security impacts due to specific changes to an information system or its surrounding environment."

ACAPS and LOTS Application Controls Issues
We found the following application issues for the ACAPS and LOTS applications:
1. User ID and password settings were inadequate. Passwords were only required to be three characters in length, were allowed to be the same as the user ID, and user IDs and passwords were stored in an unencrypted file within the applications. Additionally, there was no user ID lockout after invalid sign-on attempts, which could have allowed unauthorized users to repeatedly attempt to log into the applications.
2. Security incident reports and error logs were not generated by the applications or monitored by management. As a result, security violations and data errors/irregularities may have occurred without management detection or investigation.

Specifically for ACAPS we found the following application issues:
1. Control logs to show the complete and accurate transfer of data between the Case Processing and Management System (CPMS) and ACAPS applications were not created. This increased the risk that case data were not transferred completely and appropriately.
2. We found that duplicate cases could be created in ACAPS if all identifying fields were not present when inputting the case. This could create duplicate counts of AC reviews.

Specifically for LOTS we found the following application issues:
1. Three individuals had access rights within LOTS that were not consistent with their job functions. This level of access would have allowed them to inappropriately access all of the cases in LOTS.
2. Social Security numbers (SSN) could be entered with fewer than nine digits.
3. LOTS does not have an audit trail that tracks user activities, date, and time of transactions entered into the application.
4. Cases entered into the LOTS application with an incorrect “case type” (such as Court Case or Court Remand) must be detected within the same month the case was created in LOTS; otherwise, the case will be counted as a disposition, even though a disposition for the case has not been finalized.

In addition, we identified 24 security and compliance issues in our review of the Windows 2000 Operating System on which the ACAPS application resides. Five of the conditions were contrary to the requirements of the SSA Windows 2000 Risk Model and the other 19 conditions were contrary to existing Government guidelines from NIST and the Defense Information Systems Agency (DISA) Windows 2000 Security Checklist, version 3.1.11.

Data Reliability
We were unable to confirm that data submitted as appellate actions processed were, in fact, processed for three data sources included in the calculation of this performance indicator (FDDS CTS, ACAPS, and the OC210 report). For FDDS CTS and ACAPS, PwC was unable to trace the counts in the application reports to the corresponding spreadsheets used in the reporting of the performance indicator. Additionally, the OC210 report contained the number of reconsiderations input to the ROAR application, not the processed count as stated in the indicator title.

We tested the datasets used to calculate the indicator and found that six PCACS, five IWMS, and seven ROAR programmers had the “All” access designation to these datasets. This level of access allows users to create, delete and modify any of the data contained within the datasets we reviewed. This level of access prevents SSA from ensuring the integrity of this production data. OMB Circular A-130 requires agencies to implement the practice of least privilege whereby user access is restricted to the minimum necessary to perform his or her job; and enforce a separation of duties so that steps in a critical function are divided among different individuals. It also emphasizes the importance of management controls – such as individual accountability requirements, separation of duties enforced by access controls, and limitations on the processing privileges of individuals – to prevent and detect inappropriate or unauthorized activities.
The detailed data used to calculate this indicator were not archived and maintained for all of the data sources. SSA management stated that recreating the data for this audit was not considered to be cost effective; therefore, we were unable to recalculate the results of this performance indicator as reported in the PAR.

As a result of these issues, PwC was unable to validate the accuracy of the reported indicator results and does not consider the data used to calculate this indicator to be reliable.

Accuracy of PAR Presentation and Disclosure

Documentation in the FY 2004 PAR related to this indicator was inaccurate, as indicated in the following observations:
1. Several data sources used in the calculation of the indicator were not listed in the PAR, including: LOTS, ROAR, PCACS, PCMI, DIODS, FDDS CTS and IWMS.
2. The Cost Analysis System (CAS) was listed as a data source in the PAR, but it was not used in the indicator calculation.
3. The title “Number of Appellate Actions Processed” was misleading as SSA was not including the count of processed hearings as part of the appellate actions processed.

Performance Indicator Meaningfulness

Although part of the appellate process, the count of processed hearings was not included in the “Number of Appellate Actions Processed.” Had hearings been included, the indicator would have reflected the total number of appellate actions processed.

Number of SSA Hearings Cases Processed per Workyear (PPWY)

Indicator Background

Employees in the Division of Cost Analysis (DCA) and the Division of Budget and Financial Management at the Office of Hearings and Appeals (OHA) are responsible for calculating the number of SSA hearings cases processed per workyear.

When a claimant requests a hearing, the case is processed through the Case Processing and Management System (CPMS) and the Hearing Office Tracking System (HOTS). During FY 2004, CPMS replaced HOTS as the tracking system for the SSA hearings workload (HOTS will continue to track Medicare cases, which were not included in this indicator). All Hearing Offices (HO) were converted to CPMS as of August 2004.

The numerator in the calculation of this performance indicator was the number of SSA hearings cases processed during FY 2004, as reported on the Monthly Activity Report (MAR) generated from CPMS. This number is the sum of all hearings cases with a disposition date recorded in CPMS, and is entered into the Electronic CAS spreadsheet.

The denominator of this indicator was the number of direct workyears expended by OHA employees. The denominator is calculated using the following formula:

Direct Workyears = [(Regular + Overtime) – (Leave + Holidays + Training + Travel)] / Work Hours in a Year

SSA defines work hours in a year as 2,080 hours (40 hours per week x 52 weeks per year). However, for the overtime figure used in the denominator, the divisor was a calculated figure based on data from the Payroll Analysis and Recap Report (PARR), which is generated from the Payroll Operational Datastore (Payroll ODS). SSA uses a separate divisor for overtime because overtime contributes to total time differently than regular hours: unlike regular hours, overtime is entirely work time, with no associated leave hours. The calculation converts overtime to its regular-time equivalent to properly compute direct workyears.

The following inputs for the Direct Workyears calculation are entered into the PPWY Calculation spreadsheet:

• Regular and Overtime: Time worked by OHA employees is recorded in the Mainframe Time and Attendance System (MTAS) and automatically transferred to the Time and Attendance Management Information System (TAMIS) at the end of each pay period. Regular and overtime hours are reported on the Time and Attendance Report generated by TAMIS.
• Leave: Leave hours taken by OHA employees are recorded in MTAS and automatically transferred to the Payroll ODS. Leave hours used in the calculation are reported on the PARR.
• Holidays: The number of official SSA paid holidays is used for this input.
• Training: Time spent in training is tracked at the HO and Regional Office (RO) levels through training forms, sign-in sheets, and employee reporting. The HO sends the training information to the RO, which then forwards it to OHA.
• Travel: The amount of time spent traveling by ALJs is estimated using the following formula:

Travel Time = [1.1 x (# workdays in a month) x (# of ALJs – 1) + 1.1 x (# workdays in a month) x (# of ALJs) x 0.1] / 2,080

The formula is based on the assumption that 10 percent of a judge’s time is spent traveling. The Biweekly Staffing Report gives the number of ALJs, which is then reduced by one because the Chief ALJ, who manages hearing operations, does not travel.
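
As a check on the formula’s mechanics, the sketch below evaluates it with assumed inputs; the workday and ALJ counts are ours, not figures from the Biweekly Staffing Report.

    # Worked example of the travel-time formula above, using assumed
    # inputs (21 workdays in the month, 10 ALJs on the staffing report).
    workdays = 21
    num_aljs = 10
    travel_workyears = (1.1 * workdays * (num_aljs - 1)
                        + 1.1 * workdays * num_aljs * 0.1) / 2080
    print(round(travel_workyears, 3))  # -> 0.111 workyears for the month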

After all components of the indicator are entered, formulas in the PPWY Calculation spreadsheet calculate the direct workyears. The direct workyears are entered into an Electronic CAS spreadsheet, and the final number of SSA hearings cases processed per workyear is calculated. The assigned DCA employee inputs the Electronic CAS spreadsheet into CAS, which produces the Pre-Input Cost Analysis (PICA) report. The indicator figures are taken from the PICA and the result for the number of SSA hearings cases processed per workyear is reported on the OHA PPWY spreadsheet.

Performance Indicator Calculation

SSA Hearings Cases Processed per Workyear (PPWY) = Number of SSA hearings cases processed / Direct Workyears
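
A minimal sketch of the two-step calculation follows. It simplifies the denominator by applying the single 2,080-hour divisor to all components, whereas SSA applies a separate calculated divisor to overtime (as described above); all figures are illustrative.

    # Simplified PPWY sketch; every input is in hours, and the separate
    # overtime divisor SSA uses is collapsed into the 2,080-hour divisor.
    WORK_HOURS_PER_YEAR = 2080

    def direct_workyears(regular, overtime, leave, holidays, training, travel):
        return ((regular + overtime)
                - (leave + holidays + training + travel)) / WORK_HOURS_PER_YEAR

    def hearings_ppwy(cases_processed, workyears):
        return cases_processed / workyears

    # Illustrative figures only:
    wy = direct_workyears(regular=11_000_000, overtime=400_000,
                          leave=800_000, holidays=200_000,
                          training=60_000, travel=5_000)
    print(round(hearings_ppwy(497_000, wy), 1))  # -> 100.0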

Findings

Internal Controls and Data Reliability

Source documentation did not reconcile to the PPWY Calculation spreadsheet used in the indicator calculation in the following instances:
1. There were seven instances where OHA did not receive the training reports from ROs. This resulted in the amount of time spent on training being inaccurate.
2. There were two instances where the total number of ALJs entered did not match the number noted on the Biweekly Staffing Report. This resulted in the estimated amount of time judges spent traveling (as used in the final calculation of the indicator result) being inaccurate.

We tested the datasets used to calculate the indicator and found that one CPMS, one TAMIS, and one CAS programmer had the “All” access designation to these datasets. This level of access allows users to create, delete and modify any of the data contained within the datasets we reviewed. This level of access prevents SSA from ensuring the integrity of this production data. OMB Circular A-130 requires agencies to implement the practice of least privilege whereby user access is restricted to the minimum necessary to perform his or her job; and enforce a separation of duties so that steps in a critical function are divided among different individuals. It also emphasizes the importance of management controls – such as individual accountability requirements, separation of duties enforced by access controls, and limitations on the processing privileges of individuals – to prevent and detect inappropriate or unauthorized activities.

The CPMS data used to calculate the numerator for this indicator were not archived and maintained. We were unable to recalculate the numerator used in the calculation of this indicator.

As a result of these issues, PwC was unable to validate the accuracy of the reported indicator results and does not consider the data used to calculate this indicator to be reliable.

Accuracy of PAR Presentation and Disclosure

The information reported in the PAR related to this indicator was inaccurate, as indicated in the following observations:
1. We were unable to determine if the travel formula accurately reflected the amount of time judges actually spent traveling. The formula was created in 1972 and updated in 2003 to remove support staff from the calculation, since they no longer travel. However, there was no review to determine whether this estimation accurately reflected the actual travel time spent by judges.
2. The regular and overtime hours used in the denominator included time spent working on both SSA and Medicare cases, although the numerator only included SSA hearings.
3. The data sources indicated in the PAR were not accurate. Specifically, one data source listed in the PAR, the Travel Report, did not exist. Additionally, two data sources were not disclosed in the PAR, including a formula used to estimate the total ALJ travel time and the Payroll ODS.

Performance Indicator Meaningfulness

The purpose of the indicator (to measure the number of SSA hearings processed per workyear) appears meaningful, as it is a measure of productivity of OHA employees. However, as a result of the findings related to the formula used in the calculation, we do not consider the performance indicator to be meaningful. Specifically, the following issues noted above limit the meaningfulness:
1. The indicator is not consistently measuring the data because the denominator includes time spent on Medicare and SSA hearings, while the numerator includes only SSA hearings.
2. We were unable to determine that the travel formula accurately reflected the amount of time ALJs spent traveling.

Number of SSA Hearings Pending

Indicator Background

The performance indicator measures the number of SSA hearings cases that have not been decided by an ALJ. When a claimant requests a hearing, the case is entered into CPMS or HOTS. When a hearing decision is made, the disposition date is entered into CPMS and the status is set to “closed.” The total number of pending SSA hearings is the sum of all cases in a status other than “closed” or “temporary transfer.” Cases in “temporary transfer” are excluded because they would otherwise be counted twice (in both the originating office and the transfer office). The number of pending SSA hearings is reported as of September 24, 2004, on the Caseload Analysis Report (CAR).

Performance Indicator Calculation

Number of SSA Hearings Pending = Sum of all cases in a status other than “closed” or “temporary transfer”
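
The status filter is straightforward; the sketch below shows the counting rule with illustrative records (the status strings stand in for CPMS values and are not actual field codes).

    # Count pending hearings: cases in any status other than "closed"
    # or "temporary transfer" (the latter would be double-counted).
    EXCLUDED_STATUSES = {"closed", "temporary transfer"}

    def hearings_pending(cases):
        return sum(1 for case in cases
                   if case["status"] not in EXCLUDED_STATUSES)

    cases = [{"status": "closed"}, {"status": "assigned"},
             {"status": "temporary transfer"}, {"status": "scheduled"}]
    print(hearings_pending(cases))  # -> 2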

Findings

Internal Controls and Data Reliability

We tested the CPMS datasets used to calculate the indicator and found that one programmer had the “All” access designation to these datasets. This level of access allows users to create, delete and modify any of the data contained within the datasets we reviewed. This level of access prevents SSA from ensuring the integrity of this production data. OMB Circular A-130 requires agencies to implement the practice of least privilege whereby user access is restricted to the minimum necessary to perform his or her job; and enforce a separation of duties so that steps in a critical function are divided among different individuals. It also emphasizes the importance of management controls – such as individual accountability requirements, separation of duties enforced by access controls, and limitations on the processing privileges of individuals – to prevent and detect inappropriate or unauthorized activities.

The CPMS data used to calculate this indicator were not archived and maintained. We were unable to recalculate the “Number of SSA Hearings Pending” as reported in the PAR.

As a result of these issues, we could not conclude that the data used to calculate this indicator were reliable.

Meaningfulness and Accuracy of PAR Presentation and Disclosure

We did not identify any significant issues related to the accuracy of PAR presentation and disclosure, or meaningfulness of this indicator.

Hearings Decision Accuracy Rate

Indicator Background

To determine the “Hearings Decision Accuracy Rate,” the Office of Quality Assurance and Performance Assessment (OQA) made a selection from a sample file of 1,400 closed cases generated by the OHA Case Control System (OHACCS) to be used in its review of the hearings process. The sample file, which contains all closed cases, was downloaded to the Disability Hearings Quality Review System (DHQRS) database. The sample was stratified by type of decision (allow/deny) and geographic location (by region). From the sample file of 1,400 cases, OQA selected approximately 250 denial cases and 300 allowance cases. The denial cases were selected from those for which an appeal was not currently pending, as only cases considered to be administratively final were reviewed. The allowance cases included in the review were those for which OQA was able to obtain both the claim folder and hearing tape. The claim folders and hearing tapes are maintained at different locations, and OQA was not always able to obtain both in the month in which the case was selected for review.

Upon receipt by OQA, claim folders and hearing tapes were assigned and sent to the ALJs completing the reviews, referred to as Reviewing Judges (RJ). When assigning cases, OQA did not assign RJs to review cases where the hearing was held in their own region. The RJs assessed the cases and completed Data Collection Forms (DCF) to be sent back to OQA. The indicator was based on the RJs’ analysis of whether the original decision was supported by substantial evidence. When the completed DCFs were received by OQA, they were entered into DHQRS.
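
The selection and assignment constraints described above can be summarized in a short sketch. The field names are hypothetical stand-ins for DHQRS data elements, not the actual schema.

    # Sketch of the OQA selection rules: administratively final denials,
    # allowances with both claim folder and hearing tape on hand, and
    # no RJ reviewing a case heard in the RJ's own region.
    def select_cases(closed_cases):
        denials = [c for c in closed_cases
                   if c["decision"] == "deny" and not c["appeal_pending"]]
        allowances = [c for c in closed_cases
                      if c["decision"] == "allow"
                      and c["have_folder"] and c["have_tape"]]
        return denials, allowances

    def eligible_reviewers(case, reviewing_judges):
        return [rj for rj in reviewing_judges
                if rj["region"] != case["region"]]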

The data in DHQRS were analyzed and the results published every 2 years in the report, “Findings of the Disability Hearings Quality Review Process: ALJ Peer Review Report.” To date, there have been five reports, with the most recent published in December 2003. Since actual data for FY 2004 were not available, SSA stated that the number reported in the FY 2004 PAR was an estimate based on actual results from the December 2003 ALJ Peer Review Report.

Performance Indicator Calculation

Hearings Decision Accuracy Rate = Number of disability hearing decisions (both favorable and unfavorable) supported by “substantial evidence” / Total disability hearings reviewed
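
The rate itself is a simple proportion of reviewed decisions; below is a sketch with illustrative counts.

    # Accuracy rate as defined above: decisions supported by substantial
    # evidence over total decisions reviewed (counts are illustrative).
    def decision_accuracy_rate(supported, total_reviewed):
        return supported / total_reviewed

    print(f"{decision_accuracy_rate(495, 550):.0%}")  # -> 90%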

Findings

Internal Controls and Data Reliability

The hearing data contained in CPMS were not transferred to OHACCS on a consistent basis. As a result, the OHACCS file used to make the monthly case selections may not have contained the most current case data. SSA management stated that this issue would be addressed in the June 2005 release of CPMS, after the end of our fieldwork.

We tested the datasets used to calculate the indicator and found that one DHQRS and four OHACCS programmers had the “All” access designation to these datasets. This level of access would allow users to create, delete and modify any of the data contained within the datasets we reviewed. This level of access prevents SSA from ensuring the integrity of this production data. OMB Circular A-130 requires agencies to implement the practice of least privilege whereby user access is restricted to the minimum necessary to perform his or her job; and enforce a separation of duties so that steps in a critical function are divided among different individuals. It also emphasizes the importance of management controls – such as individual accountability requirements, separation of duties enforced by access controls, and limitations on the processing privileges of individuals – to prevent and detect inappropriate or unauthorized activities.

The application used to store RJs’ responses to the hearings review, DHQRS, did not maintain an audit trail tracking user actions. As a result, management could not review user actions to identify suspicious activities or inaccurate data entry patterns. OMB Circular A-123, Management Accountability and Control, states that, “Managers should exercise appropriate oversight to ensure individuals do not exceed or abuse their assigned authorities.”

As a result of these issues, we could not consider the data used to calculate this indicator to be reliable.

Accuracy of PAR Presentation and Disclosure

The title of the performance indicator, “Hearings Decision Accuracy Rate,” did not accurately reflect what was being measured. The title suggested that the indicator reported on the accuracy of hearing decisions; however, hearing decisions supported by substantial evidence were not necessarily accurate. Accuracy implies precision or correctness in decision making. The substantial evidence criterion essentially measures the adequacy of hearing decisions (i.e. was the evidence and documentation sufficient?). There is another question in the DCF that asks the RJ to describe what decision they would have made, given the same evidence available for the initial hearing. This question seems to better address the accuracy of hearing decisions.

Data sources that were used in the calculation of the indicator were not listed in the PAR. This included DHQRS and OHACCS.

OQA calculates the result of this performance indicator every 2 years. During the years that an actual result is not available, an estimate is reported in the PAR. The process used to calculate the estimated result has not been formally documented. SSA has stated that the figure reported as the estimate was the target for FY 2004, which was the result from the most recent actual data available (FY 2001 and FY 2002 results). We believe that a more accurate depiction of the situation would have been to report this indicator result as “not available.”

Performance Indicator Meaningfulness

As a result of the matters discussed previously in the section “Accuracy of PAR Presentation and Disclosure,” we do not consider the reported performance indicator to be a meaningful measure of the achievement of its stated objective. However, we believe that the indicator can represent a meaningful outcome-based measure of the adequacy of support for decisions.

General Findings

For all of the performance indicators included in this report, we identified other issues related to the general controls at the OHA facility in Falls Church, Virginia, and to the CPMS and HOTS applications.

During general controls testing, we found that visitors to the OHA facility were not required to sign in upon entry. It should be noted that OHA is located in a multi-tenant, privately owned building and therefore does not have complete control over the physical security of the building. In addition, there were no guards at the entrance of the OHA facility. Management stated that security guards are in place throughout the facility; however, during the course of our fieldwork, PwC did not note the presence of any guards.

We also noted that tape back-up procedures for the OHA local systems environment were not adequately documented. The current back-up procedures did not include the schedule and frequency of back-ups, what is included in the back-up, procedures if the back-up fails, the back-up log process, and off-site procedures. Finally, OHA’s contingency plan was in draft status at the time of our audit. OHA management was in the process of updating and finalizing the plan to accommodate current conditions.

The CPMS and HOTS applications are used to track and calculate performance data for the following indicators: “Number of SSA Hearings Cases Processed per Workyear (PPWY),” “Number of SSA Hearings Pending,” and “Hearings Decision Accuracy Rate.” We identified the following issues with these applications:

1. CPMS:
• It was not necessary to enter a date of death in CPMS to close a case based on death. This created the potential for open cases to be improperly classified as processed due to death in CPMS. As a result, the number of processed hearings could have been overstated in the PAR.
• CPMS users had the ability to create duplicate cases. Duplicate cases could have resulted in the number of pending cases being overstated in the PAR.
• We identified five security and compliance issues in our review of the UNIX server on which the CPMS application resides. Two of the conditions were contrary to the requirements of the SSA UNIX Risk Model and the other three conditions were contrary to existing Government guidelines from NIST and the DISA Windows 2000 Security Checklist, version 3.1.11.

2. HOTS:
• Weaknesses were identified related to the HOTS password and security setting requirements. The password weaknesses could allow unauthorized access to HOTS. A detailed list of the password security weaknesses was provided to SSA management. Additionally, there was no user ID lockout after invalid sign-on attempts, which could have allowed unauthorized users to repeatedly attempt to log into HOTS.
• SSA student interns had the same access rights as supervisors within HOTS. This level of access allows users to read, write, and modify all of the data maintained in the HOTS application.
• HOTS lacked certain basic edit controls; as a result, users could create duplicate claims, back-date certain key dates, and reopen closed claims.
• HOTS did not have an audit trail.

CONCLUSION AND RECOMMENDATIONS

We recommend SSA:

1. Ensure SSA personnel do not have the ability to directly modify, create, or delete data outside the applications used to calculate the results of these indicators.

2. Ensure all visitors are required to sign in upon entry to restrict visitor access to the OHA building.

3. Enhance existing tape back-up procedures to include the entire back-up and recovery process in detail.

4. Ensure that the OHA contingency plan is complete and approved by management.

Specific to the performance indicators, “Number of Appellate Actions Processed,” “Number of SSA Hearings Cases Processed per Workyear (PPWY),” and “SSA Hearings Pending” we recommend SSA:

5. Ensure that the UNIX environment used in the calculation of these indicators is configured in compliance with the SSA UNIX Risk Model.

6. Retain the detailed data used to calculate the performance indicators results that are reported in the PAR.

Specific to the performance indicators, “Number of SSA Hearings Cases Processed per Workyear (PPWY),” “SSA Hearings Pending,” and “Hearings Decision Accuracy Rate” we recommend SSA:

7. Require a date of death to be entered into CPMS for cases closed based on death, and restrict the ability to create duplicate cases in CPMS.

Specific to the performance indicator, “Number of Appellate Actions Processed” we recommend SSA:

8. Improve PAR disclosure and meaningfulness by:
• revising the performance indicator title and description to ensure the data sources are accurately reflected; and
• including all elements of the appeals process in the calculation of the indicator or disclose the basis for excluding hearings counts from the indicator. (BASED UPON AGENCY COMMENTS, THIS RECOMMENDATION IS BEING WITHDRAWN.)

9. Improve internal controls and data reliability by:
• maintaining documentation that describes how the performance indicator goals were established, creating policies and procedures used to prepare and report the results of the performance indicators, and maintaining a complete audit trail of the transactions and data used to calculate the indicator results;
• ensuring that indicator data can be reconciled to the corresponding spreadsheets used in the reporting of this indicator; and
• correcting the indicator calculation to capture processed counts used for the ROAR portion of the indicator rather than the number of reconsiderations inputs.

10. Address ACAPS and LOTS application deficiencies (See Appendix D).

Specific to the performance indicator, “Number of SSA Hearings Cases Processed per Workyear” we recommend SSA:

11. Improve disclosure in the PAR by disclosing that a travel formula and Brio reports are used.

12. Improve internal controls by:
• updating the travel formula used to calculate ALJ travel time to reflect actual time spent on travel;
• ensuring the hours used in the calculation reflect only time spent working on SSA hearings – not Medicare hearings; and
• requiring all regions to provide training reports on a monthly basis to ensure all time spent on training is included in the indicator number.

Specific to the performance indicator, “Hearings Decision Accuracy Rate” we recommend SSA:

13. Improve PAR disclosure by:
• updating the data sources noted in the PAR to reflect all sources used in the indicator calculation, including DHQRS and OHACCS;
• revising the performance indicator title to clarify that it measures whether or not there is substantial evidence for each case reviewed, not accuracy of hearing decisions; and
• reporting the indicator result as “not available” when actual results are not available.

14. Improve internal controls and data reliability by:
• ensuring timely and consistent receipt of CPMS cases into OHACCS; and
• maintaining an audit trail within the DHQRS application that captures the user ID or terminal, date and time of the transactions being processed. Policies and procedures should be created to review the audit trail for inappropriate access to data or processing of transactions.

AGENCY COMMENTS

SSA agreed with 10 of our recommendations and disagreed with 4. For recommendation 6, SSA disagreed and stated that system capacity and limited resources would prevent full implementation of this recommendation. For recommendation 8, SSA stated that it could not implement the recommendation because the performance measure has been eliminated. For recommendation 9, SSA stated that policies and procedures have been developed and were provided to PwC. For recommendation 11, SSA stated that the travel formula and Brio software do not need to be reflected in the PAR. The full text of SSA’s comments can be found in Appendix E.

PWC RESPONSE

In response to comments regarding recommendation 6, one of the objectives of the GPRA audit is to ensure the accuracy of results reported in the PAR for each of the indicators under audit. We are willing to discuss any alternate methods the Agency is considering to ensure that the indicator results are auditable. However, SSA is responsible for meeting the requirements of OMB Circular A-123, Management Accountability and Control, which states, "…documentation for transactions, management controls, and other significant events must be clear and readily available for examination." In addition, although PwC was able to recalculate the results using summary data from DIODS, we could not consider the data to be reliable as the Government Accountability Office defines reliability in Assessing the Reliability of Computer-Processed Data (October 2002) as:

• Data are reliable when they are (1) complete (they contain all of the data elements and records needed for the engagement) and (2) accurate (they reflect the data entered at the source or, if available, in the source documents).

Regarding recommendation 8, we have agreed to withdraw the recommendation since SSA has decided to eliminate the indicator from the Final FY 2006 Annual Performance Plan. However, if SSA decides to use this or a similar indicator in the future, issues relating to PAR disclosure and meaningfulness should be addressed.

Regarding recommendation 9, we continue to believe that internal controls and data reliability in general should be improved, even though this specific indicator will not be reported in subsequent PARs. As stated in our report, although SSA management provided documented policies and procedures to the auditors, the documentation was not sufficient as it did not accurately describe the process in place during FY 2004 to record and report the results of the indicator. In addition, all components of the indicator calculation were not included in the policies and procedures.

We agree with SSA’s response to recommendation 11. In fact, the Agency’s response supports the intent of our recommendation, which is for the PAR to include a reference to all key data sources instead of referencing a travel report which does not exist. The Agency’s response indicates that the travel formula and Brio software are the data sources for this indicator. As such, we continue to believe that the actual data sources should be referenced, instead of the non-existent travel report.

Appendices

APPENDIX A – Acronyms

APPENDIX B – Scope and Methodology

APPENDIX C – Process Flowcharts

APPENDIX D – ACAPS and LOTS Application Deficiencies

APPENDIX E – Agency Comments

Appendix A
Acronyms

AC Appeals Council
ACAPS Appeals Council Automated Processing System
ALJ Administrative Law Judge

CAR Caseload Analysis Report
CAS Cost Analysis System
COOP Continuity of Operations Plan
CPMS Case Processing and Management System
CPMS MI Case Processing and Management System Management Information
DSA Disability State Agencies
DBFM Division of Budget and Financial Management
DCA Division of Cost Analysis
DCF Data Collection Form

DDHQ Disability Hearings Quality Review
DDS Disability Determination Services
DHQRS Disability Hearings Quality Review System
DI Disability Insurance

DIODS Disability Operational Datastore
DISA Defense Information Systems Agency
FDDS Federal Disability Determination Services
FDDS CTS Federal Disability Determination Services Case Tracking System
FY Fiscal Year
GPRA Government Performance and Results Act
HO Hearing Office
HOTS Hearing Office Tracking System
IWMS Integrated Work Measurement System
LOTS Litigation Overview Tracking System
MAR Monthly Activity Report
MCS Modernized Claims System
MSSICS Modernized Supplemental Security Income Claims Systems
MTAS Mainframe Time and Attendance System
NDDSS National Disability Determination Services System
NIST National Institute of Standards and Technology
OAO Office of Appellate Operations
OASI Old-Age and Survivors Insurance
ODS Operational Datastore
ODSSIS Office of Disability and Supplemental Security Income Systems
OHA Office of Hearings and Appeals

OHACCS Office of Hearings and Appeals Case Control System
OMB Office of Management and Budget
OQA Office of Quality Assurance and Performance Assessment
OSM Office of Strategic Management
PAR Performance and Accountability Report
PARR Payroll Analysis and Recap Report
Payroll ODS Payroll Operational Datastore
PCACS Processing Center Action Control System
PCMI Process Center Management Information System
PICA Pre-Input Cost Analysis
PPWY Processed per Workyear
RJ Reviewing Judge
RO Regional Office
ROAR Recovery of Overpayments, Accounting and Reporting
RSI Retirement and Survivors Insurance
SA System Access
SAOR State Agency Operations Report
SSA Social Security Administration
SSAMIS SSA Management Information System
SSI Supplemental Security Income
SSN Social Security Number

TAMIS Time and Attendance Management Information System
TXVI ODS Title XVI Operational Datastore
U.S.C. United States Code

Appendix B
Scope and Methodology
We updated our understanding of the Social Security Administration’s (SSA) Government Performance and Results Act (GPRA) processes. This was completed through research and inquiry of SSA management. We also requested that SSA provide various documents regarding the specific programs being measured, as well as the specific measurements used to assess the effectiveness and efficiency of the related programs.

Through inquiry, observation, and other substantive testing, including testing of source documentation, we performed the following:

• Reviewed prior SSA, Government Accountability Office, and other reports related to SSA GPRA performance and related information systems.
• Met with the appropriate SSA personnel to confirm our understanding of each individual performance indicator.
• Flowcharted the processes (see Appendix C).
• Tested key controls related to manual or basic computerized processes (e.g., spreadsheets, databases, etc.).
• Conducted and evaluated tests of the automated and manual controls within and surrounding each of the critical applications to determine whether the tested controls were adequate to provide and maintain reliable data to be used when measuring the specific indicator.
• Identified attributes, rules, and assumptions for each defined data element or source document.
• Recalculated the metric or algorithm of key performance indicators to ensure mathematical accuracy.
• For those indicators with results that SSA determined using computerized data, we assessed the completeness and accuracy of that data to determine the data's reliability as it pertains to the objectives of the audit.
• Performed a follow-up general computer control review as it relates to the Office of Hearings and Appeals.

As part of this audit, we documented our understanding, as conveyed to us by Agency personnel, of the alignment of the Agency’s mission, goals, objectives, processes, and related performance indicators. We analyzed how these processes interacted with related processes within SSA and the existing measurement systems. This understanding was used to determine whether the performance indicators in use appeared to be valid and appropriate given SSA’s mission, goals, objectives, and processes.

We conducted our audit in accordance with generally accepted government auditing standards for performance audits. In addition to the steps above, we specifically performed the following to test the indicators included in this report:

NUMBER OF APPELLATE ACTIONS PROCESSED

• Audited the design and effectiveness of SSA internal controls and the accuracy and completeness of the data related to the following areas:
 Observed the input of a request for review in the Appeals Council Automated Processing System (ACAPS).
 Observed the input of a new court case and a court remand in the Litigation Overview Tracking System (LOTS).
 Performed an application controls audit of ACAPS, LOTS, Recovery of Overpayments, Accounting and Reporting, and the Disability Operational Datastore (DIODS).
 Performed a limited application controls audit of the Integrated Work Measurement System (IWMS), and the Processing Center Action Control System.
• Used a programming specialist to determine the adequacy of the programming logic used by SSA to calculate the reconsiderations processed for DIODS, IWMS, ACAPS, and LOTS.
• Recalculated the summary DIODS data for Fiscal Year (FY) 2004 and compared it to the reconsiderations in the State Agency Operations Report reported during the year.
• Traced data from supporting reports to the indicator calculation total for all data sources.

NUMBER OF SSA HEARINGS CASES PROCESSED PER WORKYEAR (PPWY)

• Audited the design and effectiveness of SSA internal controls and the accuracy and completeness of the data related to the following areas:
 Observed the input of the Hearing Request Date, Request Received Date and the Input of Hearing Disposition in the Case Processing and Management System (CPMS).
 Performed a follow-up general computer control review as it relates to the Office of Hearings and Appeals (OHA).
 Performed a follow-up application review of the Hearing Office Tracking System (HOTS).
 Performed an application controls audit of CPMS.
• Used a programming specialist to determine the adequacy of the programming logic used by SSA to calculate the hearings processed from CPMS and the Case Processing and Management System Management Information (CPMS MI) and the time from the Time and Attendance Management Information System.
• Reviewed each component of the workyear calculation for completeness and accuracy.
• Traced data from supporting reports to the spreadsheets used to calculate the performance indicator.

NUMBER OF SSA HEARINGS PENDING

• Audited the design and effectiveness of SSA internal controls and the accuracy and completeness of the data related to the following areas:
 Observed the input of the Hearing Request Date and the Request Received Date in CPMS.
 Performed a follow-up general computer control review as it relates to the OHA.
 Performed a follow-up application review of HOTS.
 Performed an application controls audit of CPMS.
• Used a programming specialist to determine the adequacy of the programming logic used by SSA to calculate the hearings pending from CPMS and CPMS MI.
• Selected forty-five cases from the Modernized Claims System and the Modernized Supplemental Security Income Claims System and validated that each was in pending status in CPMS.
• Combined all regional and national Monthly Analysis Reports and Caseload Analysis Reports to verify the total of hearings pending.

HEARINGS DECISION ACCURACY RATE

• Audited the design and effectiveness of SSA internal controls and the accuracy and completeness of the data related to the following areas:
 Observed the input of the Hearing Request Date, Request Received Date and the Input of Hearing Disposition in CPMS.
 Performed a follow-up general computer control review as it relates to the OHA.
 Performed a follow-up application review of HOTS.
 Performed an application controls audit of CPMS.
 Performed a limited application controls audit of the Disability Hearings Quality Review System (DHQRS).
• Reviewed the process to create the sample file of cases from Office of Hearings and Appeals Case Control System, append the file to DHQRS, and select the cases to be reviewed.
• Selected forty-five cases from DHQRS and reviewed the Data Collection Forms to test the accuracy of input.
• Used a programming specialist to determine the adequacy of the programming logic used by SSA to calculate the accuracy rate.

Appendix C
Flowchart of Number of Appellate Actions Processed


Number of Appellate Actions Processed
• Initial decision received by claimant.
• Request for reconsideration filed by claimant.
• Retirement and Survivors Insurance (RSI) reconsiderations.
o PCACS.
o ROAR.
• Disability Insurance reconsiderations.
o PCACS.
o ROAR.
o Will reconsideration go to State Disability Determination Services (DDS)?
 No – Federal Disability Determination Services (FDDS) Case Tracking System (Levy).
 Yes – Levy (State DDS).
o Levy (State DDS).
o National Disability Determination Services System (NDDSS).
o DIODS (State Agency Operations Report (SAOR)).
• Supplemental Security Income (SSI) reconsiderations.
o MSSICS.
o Title XVI Operational Datastore.
o IWMS.
• Hearing decision received by claimant.
• Request for Appeals Council (AC) review filed by claimant.
• AC reviews.
o ACAPS.
• AC decision received by claimant.
• Lawsuit filed in Federal District Court.
o New court cases.
• Federal District Court remands case back to AC.
o Court remands.
• LOTS.
• OHA Workload spreadsheet.
• CAS Input spreadsheet.
• Performance owner review of the performance indicator results.
• Number of Appellate Actions Processed reported in SSA Tracking Report/PAR.

• (A/B) RSI/DI reconsiderations.
• PCACS.
o Obtain data for RSI reconsiderations:
 Office of International Operations:
• RECON8 NONCLM
• RECON8 NONMED
• RECON8 OTHER
• TOT RECON
 Program Service Center:
• RECON NONMED
• RECON MISC
o Obtain data for DI reconsiderations:
 Office of International Operations:
• RECON8 MEDICAL
 Office of Disability Operations:
• RECON MEDICAL
• RECON MISC
• RECON NONCLM
• RECON NONMED
o Input acquired data into CAS Input spreadsheet.
• ROAR OC210 report.
o RSI & DI reconsiderations entered into RSIDI reconsiderations spreadsheet.
o RSIDI reconsiderations spreadsheet.
o Open CAS Input spreadsheet to automatically update counts for RSI & DI reconsiderations.
• FDDS spreadsheet.
o Calculate cumulative processed counts for FDDS DI reconsiderations.
o DI reconsiderations plus DI/SSI reconsiderations.
o Input cumulative processed counts into CAS Input spreadsheet.
• DIODS.
o SAOR Report.
o Complete the following calculation to obtain the Disability State Agencies (DSA) number for the CAS Input spreadsheet: (Title II Workloads Recon Clearances + Concurrent Workloads Clearances) – Transitional Federal Medicare Recon – Regular Federal Medicare Recon (a worked sketch follows this flowchart outline).
o Input acquired data into CAS Input spreadsheet.
• (C) SSI reconsiderations.
o SSA Management Information System (SSAMIS).
o IWMS portion of SSAMIS pulls reconsideration total.
o Field Counts spreadsheet.
o Obtain data for SSI reconsiderations cumulative totals processed.
o Input acquired data into CAS Input spreadsheet.
• CAS Input spreadsheet.
• The cumulative processed count is compared to the prior month.
• Any variation of more than 10 percent is verified with the data source contact for accuracy and an explanation of the variance.
• Review and approval of CAS Input spreadsheet by another analyst.
• Processed counts of appellate actions are sent to the Office of Strategic Management (OSM) by the 25th of each month.
• (D) AC reviews.
o ACAPS.
o ACAPS Report.
o ACAPS report sent for Associate Commissioner review.
o Systems uses the ACAPS report as the basis for the Kiwi report; the Kiwi report is prepared with Division of Budget and Financial Management (DBFM) and Office of Appellate Operations (OAO) input.
• (E) New court cases.
o LOTS.
o LOTS Report.
• (F) Court remands.
o LOTS.
o LOTS Report.
• Kiwi report.
• OAO takes disposition figures from Kiwi (request for review dispositions) and LOTS (new court cases and court remand dispositions) reports.
• Figures sent through OAO Executives for approval.
• Figures released to OHA Executive Secretariat for executive approval and release to pertinent components.
• OHA Workload spreadsheet.
• To Division of Cost Analysis (DCA).
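The CAS Input steps above involve two pieces of arithmetic: the DSA count computed from the SAOR report, and the 10-percent month-over-month variance check applied to cumulative processed counts. The following Python sketch restates both steps; it is illustrative only, not SSA code, and all parameter names and figures are hypothetical.

# Minimal sketch, not SSA code: the DSA calculation and the 10-percent
# month-over-month variance check described in the flowchart above.
# All parameter names and figures are hypothetical.

def dsa_count(title2_recon_clearances, concurrent_clearances,
              transitional_medicare_recon, regular_medicare_recon):
    """(Title II Workloads Recon Clearances + Concurrent Workloads
    Clearances) - Transitional Federal Medicare Recon - Regular
    Federal Medicare Recon."""
    return ((title2_recon_clearances + concurrent_clearances)
            - transitional_medicare_recon - regular_medicare_recon)

def needs_verification(current_cum, prior_cum, threshold=0.10):
    """Flag a month-over-month variation of more than 10 percent for
    verification with the data source contact."""
    if prior_cum == 0:
        return current_cum != 0
    return abs(current_cum - prior_cum) / prior_cum > threshold

# Hypothetical example: 12,400 Title II recon clearances, 3,100
# concurrent clearances, 150 transitional and 320 regular Federal
# Medicare recons.
print(dsa_count(12400, 3100, 150, 320))    # 15030
print(needs_verification(15030, 13200))    # True (about a 14 percent increase)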

Flowchart of Number of SSA Hearings Cases Processed per Workyear, Number of SSA Hearings Pending, Hearings Decision Accuracy Rate

Number of SSA Hearings Cases Processed per Workyear (PPWY), Number of SSA Hearings Pending, Hearings Decision Accuracy Rate
• Initial decision received by claimant.
• Request reconsideration?
o No – End.
o Yes – Reconsideration decision received by claimant.
• Request hearing?
o No – End.
o Yes – Input into HOTS/CPMS when request received at Hearing Office (HO).
• SSA hearings pending number includes all with no disposition date in HOTS/CPMS (excludes cases in temporary transfer status).
• Will Administrative Law Judge (ALJ) conduct hearing?
o No
 Dismissed.
 Pay on Record (Expedite without hearing).
o Written decision sent to claimant.
o Yes – Hearing is held and case is explained.
• ALJ makes determination.
• ALJ enters decision into CPMS for SSA cases and HOTS for Medicare cases.
• Clerk enters disposition date and mail date into HOTS/CPMS.
• Decision letter and a copy of the ALJ’s decision are sent to claimant.
• A.
• HO database files sent to Regional Office and combined in HOTS/CPMS.
• Regional database files sent to OHA and combined in HOTS/CPMS.
• Monthly Activity Report (MAR) produced by HOTS/CPMS.
• Combine MAR for all locations to generate CAR.
• MAR posted to the Intranet for ROs to review.
• B.
• Performance Indicator 3 Calculation: Number of SSA Hearings Pending, Total number of hearings cases with no disposition date in HOTS/CPMS (excludes cases in temporary transfer status).
• Performance owner review of the performance indicator results.
• Reporting of Performance Indicator 3.
• B.
• Control workyears calculated by DCA and emailed to OHA/DBFM analyst.
• Number of SSA Hearings Processed taken from the MAR.
• Travel calculated using travel formula.
• Training totaled from HO/RO Training Reports.
• Spreadsheet formulas calculate the time spent on SSA hearings, in workyears:
o [(Regular Time + Overtime) – (Leave + Travel + Training)] ÷ 2,080*
*SSA defines a workyear as 2,080 hours (40 hours × 52 weeks).
• Monthly Electronic CAS file emailed to DCA.
• Create OHA Workload file and agree to Electronic CAS.
• Run macro to create PRN file for upload to CAS.
• Run macro to update CAS Input sheet with data from OHA Workload file.
• Upload PRN file to CAS.
• Pre-Input Cost Analysis (PICA) report updated with current month input.
• Check PICA against OHA Workload file to ensure proper upload of information.
• Manually update OHA PPWY Rec_Mo_Cum file with figures from PICA.
• Check against PICA for Cum. Receipts, Processed and Workyears.
• Spreadsheet formulas calculate Performance Indicator 2: Number of SSA Hearings Cases Processed per Workyear (PPWY).
o PPWY = SSA Hearings Processed ÷ SSA Hearings Direct Time (in workyears); a worked sketch follows this flowchart outline.

• Send to Director of DCA for review.
• The Director of DCA sends the figures to the Office of Strategic Management (OSM) by the 15th of each month.
• Reporting of Performance Indicator 2 in PAR.
• A.
• Office of Disability and Supplemental Security Income Systems (ODSSIS) creates an extract file from Office of Hearings and Appeals Case Control System (OHACCS) containing the latest transaction for each case in OHACCS.
• Division of Disability Hearings Quality (DDHQ) inputs month, year and number of cases required for sample on SSA mainframe and notifies ODSSIS.
• ODSSIS creates the monthly sample file.
• The sample file is downloaded onto the Office of Quality Assurance and Performance Assessment server and appended to the Disability Hearings Quality Review System (DHQRS) database.
• Folders for the cases selected as the sample are requested through various contacts.
• When folders are received, they are screened for exclusions and completeness.
• Cases are assigned to Reviewing Judges (RJ) and data are entered into the DHQRS case control system.
• Folders are mailed to the RJs with Data Collection Forms (DCF) attached.
• RJs review the cases and complete the DCF. Two questions on the DCF relate directly to the supported-by-substantial-evidence performance indicator.
• Folders are returned to the DDHQ with the DCF. The form is reviewed for completeness and consistency.
• DCF information is entered into DHQRS database.
• After all information is entered into the DHQRS database, DCFs are placed in individual residual file folders and filed by Social Security number.
• Data are analyzed by DDHQ staff to compare to data reported in previous reporting periods.
• Results of analysis are reported in biennial reports.
• Biennial Disability Hearings Quality Review Process Peer Review Reports.
• Performance Indicator 4 Determined: Hearings Decision Accuracy Rate, Percent of hearings supported by substantial evidence.
• Performance owner review of the performance indicator results.
• Reporting of Performance Indicator 4.
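The flowchart outline above contains the arithmetic behind three of the indicators: the pending count (cases with no disposition date, excluding temporary transfers), the workyear and PPWY formulas, and the accuracy rate. The Python sketch below restates that arithmetic; it is illustrative only, not SSA code, and the record layout and figures are hypothetical.

# Minimal sketch, not SSA code, of the indicator arithmetic outlined
# above. The HearingCase layout and the example figures are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HearingCase:
    disposition_date: Optional[str]   # None while the case has no disposition
    temporary_transfer: bool = False

def hearings_pending(cases: List[HearingCase]) -> int:
    """Indicator 3: cases with no disposition date in HOTS/CPMS,
    excluding cases in temporary transfer status."""
    return sum(1 for c in cases
               if c.disposition_date is None and not c.temporary_transfer)

def ssa_hearings_workyears(regular, overtime, leave, travel, training):
    """Direct time spent on SSA hearings, in workyears of 2,080 hours."""
    return ((regular + overtime) - (leave + travel + training)) / 2080

def ppwy(processed, workyears):
    """Indicator 2: SSA hearings cases processed per workyear."""
    return processed / workyears

def accuracy_rate(supported, reviewed):
    """Indicator 4: percent of reviewed hearings supported by
    substantial evidence."""
    return 100.0 * supported / reviewed

# Hypothetical example for one hearing office:
cases = [HearingCase(None), HearingCase("2004-09-30"),
         HearingCase(None, temporary_transfer=True)]
print(hearings_pending(cases))                        # 1
wy = ssa_hearings_workyears(2000, 80, 120, 40, 20)    # about 0.913 workyears
print(round(ppwy(100, wy), 1))                        # about 109.5
print(round(accuracy_rate(42, 45), 1))                # 93.3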

Appendix D

ACAPS and LOTS Application Deficiencies
Specific to Recommendation 10, SSA should address ACAPS and LOTS application deficiencies by taking the following actions:

• Document change control procedures for both ACAPS and LOTS;
• Strengthen password parameters in ACAPS and LOTS to require encryption of the passwords, lockout of user accounts after a set number of failed attempts, the use of alphanumeric passwords and passwords with a minimum of eight characters (a sketch of these parameters follows this list);
• Strengthen the ACAPS and LOTS applications to include security incident reports for tracking inappropriate access attempts to ACAPS and LOTS;
• Generate error logs for ACAPS and LOTS activities to ensure timely identification and follow-up of data entry errors;
• Maintain an audit trail that captures the user ID or terminal, as well as date and time of the transaction being processed through LOTS. Policies and procedures should be created to review the audit trail for inappropriate access to data or processing of transactions;
• Ensure that the Windows 2000 environment that supports the ACAPS application is configured to be in compliance with the SSA Windows 2000 Risk Model;
• Create a control log for the CPMS and ACAPS interface to ensure that all case data are completely and accurately transferred to ACAPS;
• Require ACAPS users to enter all identifying fields to prevent duplicate cases;
• Require SSNs entered to be nine digits to ensure case information in LOTS is complete and accurate; and,
• Monitor case type errors in LOTS on a consistent basis to ensure case information in LOTS is complete and accurate.
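As a reference point for the password parameters listed above (minimum length of eight, alphanumeric content, lockout after a set number of failed attempts, and storage of encrypted rather than plaintext passwords), the following Python sketch shows one way such controls fit together. It is illustrative only, not the ACAPS or LOTS design; the three-attempt lockout threshold and all names are assumptions.

# Illustrative sketch only; not the ACAPS/LOTS implementation. The
# lockout threshold of three failed attempts is an assumption (the
# recommendation says only "a set number").
import hashlib, hmac, os

MIN_LENGTH = 8
MAX_FAILED_ATTEMPTS = 3

def password_meets_policy(pw: str) -> bool:
    # Alphanumeric password with a minimum of eight characters.
    return (len(pw) >= MIN_LENGTH
            and any(ch.isalpha() for ch in pw)
            and any(ch.isdigit() for ch in pw))

def hash_password(pw: str, salt: bytes) -> bytes:
    # Store only a salted hash, never the plaintext password.
    return hashlib.pbkdf2_hmac("sha256", pw.encode(), salt, 100_000)

class Account:
    def __init__(self, pw: str):
        if not password_meets_policy(pw):
            raise ValueError("password does not meet policy")
        self.salt = os.urandom(16)
        self.digest = hash_password(pw, self.salt)
        self.failed = 0
        self.locked = False

    def log_in(self, attempt: str) -> bool:
        if self.locked:
            return False   # would also feed a security incident report
        if hmac.compare_digest(hash_password(attempt, self.salt), self.digest):
            self.failed = 0
            return True
        self.failed += 1
        if self.failed >= MAX_FAILED_ATTEMPTS:
            self.locked = True   # lockout after a set number of failed attempts
        return False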

Appendix E

Agency Comments

SOCIAL SECURITY

MEMORANDUM

Date: January 13, 2006 Refer To: S1J-3

To: Patrick P. O'Carroll, Jr.
Inspector General

From: Larry W. Dye /s/
Chief of Staff

Subject: Office of the Inspector General (OIG) Draft Report "Performance Indicator Audit: Hearings and Appeals Process" (A-15-05-15113) -- INFORMATION

We appreciate OIG’s efforts in conducting this review. Our comments on the draft report content and recommendations are attached.

Let me know if we can be of further assistance. Staff inquiries may be directed to Candace Skurnik, Director, Audit Management and Liaison Staff on extension 54636.

Attachment:
SSA Response

COMMENTS ON THE OFFICE OF THE INSPECTOR GENERAL (OIG) DRAFT REPORT "PERFORMANCE INDICATOR AUDIT: HEARINGS AND APPEALS PROCESS" (A-15-05-15113)

Thank you for the opportunity to review the above subject audit report.

Our responses to the specific recommendations are as follows:

Recommendation 1

Ensure SSA personnel do not have the ability to directly modify, create, or delete the data outside the application used to calculate the results of these indicators.

Comment:

We agree. SSA has begun implementing the "access on demand project" that should be fully implemented in 2009. We suggest that PricewaterhouseCoopers (PwC) acknowledge this in its report.

Recommendation 2

Ensure all visitors are required to sign in upon entry to restrict visitor access to the Office of Hearings and Appeals (OHA) building.

Comment:

We agree. The OHA Headquarters building security could be improved. OHA is working in conjunction with the Department of Justice to provide security enhancements at the OHA facility in Falls Church, Virginia, to bring the building into compliance with Level IV federal standards.

Recommendation 3

Enhance existing tape back-up procedures to include the entire back-up and recovery process in detail.

Comment:

We agree. SSA provided documentation of its tape back-up procedures to PwC in May 2005.

Recommendation 4

Ensure that the OHA contingency plan is complete and approved by management.

Comment:

We agree. OHA will update their Occupant Emergency Plan and Security Action Plan yearly as required by the Administrative Instructions Manual System guide.

Specific to the performance indicators, “Number of Appellate Actions Processed,” “Number of SSA Hearings Cases Processed per Workyear (PPWY),” and “SSA Hearings Pending.”

Recommendation 5

Ensure that the UNIX environment used in the calculation of these indicators is configured in compliance with the SSA UNIX Risk Model.

Comment:

We agree. The SSA UNIX Risk Model is updated every six months. The Multi-Platform Security Branch forwards the risk model to members (administrators) of the UNIX Functional Workgroup to implement. In addition, we use a tool called Policy Compliance Manager to scan and monitor items in the risk model for non-compliance on a regular basis. Contact is made with the administrator and manager of the UNIX environment to bring the server into compliance.

Recommendation 6

Retain the detailed data used to calculate the performance indicators results that are reported in the Performance and Accountability Report (PAR).

Comment:

We disagree. System capacity and the diversion of already limited resources to support such an activity compel us to disagree with this recommendation. Satisfying this recommendation would require SSA to preserve and maintain, among other things, data transactions, source code, multiple versions of software and the operating system in use during the potential audit review period. Staff would then need to be available to reconstruct all this to support an audit. The magnitude of such an effort would seriously impede work to implement new information technology supported processes that support SSA programs and its clients.

PwC has suggested that for some indicators maintaining raw summary data would meet the needs of its audits, and that it would provide the server capacity to store such data. For some systems/datastores (Case Processing and Management System (CPMS)/Disability Operations Datastore (DIODS)), we are sizing data volume for PwC so that they can determine the appropriate server size. However, it is unclear whether this alone will address PwC's needs, and for which indicators it would do so. Even though this would likely be a limited diversion of SSA resources, it would be helpful if we had some assurance that this activity will in fact support the auditor’s needs.

SSA has explained that it is cost-prohibitive to maintain the detail-level data required to recalculate performance results for a full year and PwC should acknowledge this in their report. Further, the Office of Management and Budget's Circular A-11, section 230f states "Performance data need not be perfect to be reliable, particularly if the cost and effort to secure the best performance data will exceed the value of any data so obtained".

Therefore, since PwC was able to recalculate the results using summary data from DIODS, we suggest PwC revise the statement in its Findings that it could not consider the data reliable. Also, PwC should acknowledge SSA's proposal to take snapshots of the detail-level pending data at pre-determined times and use those snapshots to verify the accuracy of the summary data.

Specific to the performance indicators, “Number of SSA Hearings Cases Processed per Workyear (PPWY),” “SSA Hearings Pending,” and “Hearings Decision Accuracy Rate.”

Recommendation 7

Require a date of death to be entered into CPMS for cases closed based on death, restricting the ability to create duplicate cases in CPMS.

Comment:

We agree. We will ensure that this change is added to the current list of requested CPMS enhancements.

Specific to the performance indicator, “Number of Appellate Actions Processed.”

Recommendation 8

Improve PAR disclosure and meaningfulness by: a) revising the performance indicator title and description to ensure the data sources are accurately reflected; and b) including all elements of the appeals process in the calculation of the indicator or disclose the basis for excluding hearings counts from the indicator.

Comment:

We disagree. We cannot implement this recommendation since this performance measure has been eliminated. This performance measure was very problematic because it was made up of multiple workloads handled by the field, Disability Determination Services, and the Appeals Council. After executive review, it was decided that the measure did not provide meaningful management information.

We suggest that the report indicate that the performance measure, Number of Appellate Actions Processed, was dropped as a Government Performance and Results Act (GPRA) measure in the FY 2006 Annual Performance Plan.

Recommendation 9

Improve internal controls and data reliability by: a) maintaining documentation that describes how the performance indicator goals were established, creating policies and procedures used to prepare and report the results of the performance indicators, and maintaining a complete audit trail of the transactions and data used to calculate the indicator results; b) ensuring that indicator data can be reconciled to the corresponding spreadsheets used in the reporting of this indicator; and c) correcting the indicator calculation to capture processed counts used for the Recovery of Overpayments, Accounting and Reporting (ROAR) portion of the indicator rather than the number of reconsiderations inputs.

Comment:

We disagree. Policies and procedures have been developed and were provided to the auditors. This should be acknowledged in their draft report. In addition, we suggest that the report indicate that the performance measure, Number of Appellate Actions Processed, was dropped as a GPRA measure in the FY 2006 Annual Performance Plan.

Recommendation 10

SSA should address the Appeals Council Automated Processing System (ACAPS) and the Litigation Overview Tracking System (LOTS) application deficiencies by taking the following actions: a) document change control procedures for both ACAPS and LOTS;
b) strengthen password parameters in ACAPS and LOTS to require encryption of the passwords, lockout of user accounts after a set number of failed attempts, the use of alphanumeric passwords and passwords with a minimum of eight characters;
c) strengthen the ACAPS and LOTS applications to include security incident reports for tracking inappropriate access attempts to ACAPS and LOTS; d) generate error logs for ACAPS and LOTS activities to ensure timely identification and follow-up of data entry errors; e) maintain an audit trail that captures the user ID or terminal, as well as date and time of the transaction being processed through LOTS. Policies and procedures should be created to review the audit trail for inappropriate access to data or processing of transactions; f) ensure that the Windows 2000 environment that supports the ACAPS application is configured to be in compliance with the SSA Windows 2000 Risk Model; g) create a control log for the CPMS and ACAPS interface to ensure that all case data are completely and accurately transferred to ACAPS; h) require ACAPS users to enter all identifying fields to prevent duplicate cases; i) require SSNs entered to be nine digits to ensure case information in LOTS is complete and accurate; and, j) monitor case type errors in LOTS on a consistent basis to ensure case information in LOTS is complete and accurate.

Comment:

We agree. ACAPS and LOTS are old stand-alone systems that will eventually be replaced. In order to make any changes to those systems, SSA would have to redirect systems resources that are being used for critical CPMS enhancements. This would have a negative impact on the SSA hearings workload, so we are considering only operationally essential changes to ACAPS and LOTS at this time. Nonetheless, any future changes will follow the principles of change control as dictated by the Agency's policies/procedures in change control management. The Intrusion Protection Team monitors 24/7 with the Harris-Stat tool for violations of the Windows 2000 risk model. Contact is made with the administrator and manager of the Windows environment to bring the server into compliance. At the next suitable opportunity, we will ensure that this system includes appropriate edits for SSN entries.

Specific to the performance indicator, “Number of SSA Hearings Cases Processed per Workyear.”

Recommendation 11

Improve disclosure in the PAR by disclosing that a travel formula and Brio reports are used.

Comment:

We disagree. The Travel Report mentioned on page 11 of the subject draft report in the section entitled "Accuracy of PAR Presentation and Disclosure" is not a report or data source. The travel formula that computes how much time Administrative Law Judges spend in travel is appropriately reflected on page 9 of the draft report. This formula is a computation, not a report, and should not be reflected in the PAR as a data source. The Operational Data Store is a tool that provides payroll users with on-line access to their Payroll Analysis Recap Report (PARR). BRIO is not a report, but software used to run the PARR. The PARR is listed as a data source.

Recommendation 12

Improve internal controls by: a) updating the travel formula used to calculate ALJ travel time to reflect actual time spent on travel; b) ensuring the hours used in the calculation reflect only time spent working on SSA hearings - not Medicare hearings; and
c) requiring all regions to provide training reports on a monthly basis to ensure all time spent on training is included in the indicator number.

Comment:

We agree. The travel formula will be revised as part of the Social Security Unified Measurement System/Social Security Administration Managerial Cost Accountability System pilot once it is expanded to OHA. All training hours from all of the regions are being captured. Effective with additional computer enhancements to existing systems (expected no later than January 2006), the reporting of training hours from all regions will be automated.

However, relative to item (b), Medicare time is not included in the PPWY calculation. OHA uses Standard Time Values (STV) to assign workyears to their various workloads. For each workload, the STV is multiplied by the processed workload count to determine earned hours. Earned hours are compared to direct hours worked and any difference is prorated over the workloads. When calculating the PPWY, the Agency only uses SSA hearings processed counts and SSA hearings time. Accordingly, that part of the recommendation is already in effect.
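The STV method described above reduces to a proration: earned hours are the STV multiplied by the processed count for each workload, and direct hours worked are then distributed across workloads in proportion to earned hours. The Python sketch below shows one reading of that computation; it is illustrative only, not SSA code, and all figures are hypothetical.

# Minimal sketch, not SSA code, of one reading of the STV proration
# described above. All figures are hypothetical.

def prorate_direct_hours(workloads, direct_hours):
    """workloads maps workload name -> (STV in hours, processed count).
    Earned hours = STV * processed count; the difference between direct
    and earned hours is prorated, so each workload receives direct hours
    in proportion to its earned hours."""
    earned = {name: stv * count for name, (stv, count) in workloads.items()}
    total_earned = sum(earned.values())
    return {name: hours * direct_hours / total_earned
            for name, hours in earned.items()}

# Hypothetical example: SSA and Medicare hearings share 10,000 direct hours.
shares = prorate_direct_hours(
    {"SSA hearings": (4.0, 2000), "Medicare hearings": (5.0, 300)},
    direct_hours=10000)
print(shares)   # SSA hearings receive 8000/9500 of the 10,000 direct hours

Under this reading, only the SSA hearings share would enter the PPWY denominator, consistent with the Agency's statement that Medicare time is excluded from the calculation.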

Specific to the performance indicator, “Hearings Decision Accuracy Rate.”

Recommendation 13

Improve PAR disclosure by: a) updating the data sources noted in the PAR to reflect all sources used in the indicator calculation, including the Disability Hearings Quality Review System (DHQRS) and the Office of Hearings and Appeals Case Control System (OHACCS); b) revising the performance indicator title to clarify that it measures whether or not there is substantial evidence for each case reviewed, not accuracy of hearing decisions; and c) reporting accuracy rates available when actual results are not available.

Comment:

We agree. SSA will update and reflect all data sources used in the indicator calculation. While we believe that the data definition provides a straightforward disclosure that decision accuracy is based on substantial evidence (the standard used by most Federal courts to evaluate the accuracy of decisions), we will nonetheless revise the title to clarify that we measure whether there is substantial evidence to support the hearing decision. Finally, we will provide a yearly report on the performance indicator.

Recommendation 14

Improve internal controls and data reliability by: a) ensuring timely and consistent receipt of CPMS cases into OHACCS; and b) maintaining an audit trail within the DHQRS application that captures the user identification or terminal, date and time of the transactions being processed. Policies and procedures should be created to review the audit trail for inappropriate access to data or processing of transactions.

Comment:

We agree. The problem with the interface between CPMS and OHACCS has been corrected. We are developing an Electronic Quality Assurance (eQA) system for the Office of Quality Assurance and Performance Assessment that will maintain an audit trail of user actions.
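For context on the audit trail called for in Recommendation 14 (user identification or terminal, plus date and time of each transaction, with review for inappropriate access), the following Python sketch shows the shape such a log might take. It is illustrative only, not the eQA design; the record fields and the after-hours review rule are assumptions.

# Illustrative sketch only; not the eQA design. Record fields and the
# review rule are assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class AuditRecord:
    user_id: str        # user identification (or terminal) per Recommendation 14
    terminal: str
    action: str         # e.g., "UPDATE_DCF", "READ_CASE"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

audit_log: List[AuditRecord] = []

def record(user_id: str, terminal: str, action: str) -> None:
    audit_log.append(AuditRecord(user_id, terminal, action))

def flag_after_hours(log: List[AuditRecord]) -> List[AuditRecord]:
    """One possible review policy: flag transactions logged outside
    06:00-20:00 UTC for follow-up."""
    return [r for r in log if not (6 <= r.timestamp.hour < 20)]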

Overview of the Office of the Inspector General
The Office of the Inspector General (OIG) comprises our Office of Investigations (OI), Office of Audit (OA), Office of the Chief Counsel to the Inspector General (OCCIG), and Office of Resource Management (ORM). To ensure compliance with policies and procedures, internal controls, and professional standards, we also have a comprehensive Professional Responsibility and Quality Assurance program.

Office of Audit

OA conducts and/or supervises financial and performance audits of the Social Security Administration’s (SSA) programs and operations and makes recommendations to ensure program objectives are achieved effectively and efficiently. Financial audits assess whether SSA’s financial statements fairly present SSA’s financial position, results of operations, and cash flow. Performance audits review the economy, efficiency, and effectiveness of SSA’s programs and operations. OA also conducts short-term management and program evaluations and projects on issues of concern to SSA, Congress, and the general public.

Office of Investigations

OI conducts and coordinates investigative activity related to fraud, waste, abuse, and mismanagement in SSA programs and operations. This includes wrongdoing by applicants, beneficiaries, contractors, third parties, or SSA employees performing their official duties. This office serves as OIG liaison to the Department of Justice on all matters relating to the investigations of SSA programs and personnel. OI also conducts joint investigations with other Federal, State, and local law enforcement agencies.

Office of the Chief Counsel to the Inspector General

OCCIG provides independent legal advice and counsel to the IG on various matters, including statutes, regulations, legislation, and policy directives. OCCIG also advises the IG on investigative procedures and techniques, as well as on legal implications and conclusions to be drawn from audit and investigative material. Finally, OCCIG administers the Civil Monetary Penalty program.

Office of Resource Management

ORM supports OIG by providing information resource management and systems security. ORM also coordinates OIG’s budget, procurement, telecommunications, facilities, and human resources. In addition, ORM is the focal point for OIG’s strategic planning function and the development and implementation of performance measures required by the Government Performance and Results Act of 1993.