MEMORANDUM

Date: April 17, 2008

To: The Commissioner

From: Inspector General

Subject: On-site Security Control and Audit Reviews at Program Service Centers (A-03-07-17064)

The attached final report presents the results of our audit. Our objectives were to assess (1) the Social Security Administration's (SSA) procedures for selecting Program Service Center components for On-site Security Control and Audit Reviews (OSCAR), (2) SSA's system for ensuring appropriate correction of deficiencies identified through OSCARs, and (3) additional steps SSA can take to enhance the OSCAR guide.

Please provide within 60 days a corrective action plan that addresses each recommendation. If you wish to discuss the final report, please call me or have your staff contact Steven L. Schaeffer, Assistant Inspector General for Audit, at (410) 965-9700.

Patrick P. O'Carroll, Jr.

OFFICE OF
THE INSPECTOR GENERAL

SOCIAL SECURITY ADMINISTRATION

ON-SITE SECURITY CONTROL
AND AUDIT REVIEWS
AT PROGRAM SERVICE CENTERS

April 2008

A-03-07-17064

AUDIT REPORT

Mission

By conducting independent and objective audits, evaluations and investigations, we inspire public confidence in the integrity and security of SSA's programs and operations and protect them against fraud, waste and abuse. We provide timely, useful and reliable information and advice to Administration officials, Congress and the public.

Authority

The Inspector General Act created independent audit and investigative units, called the Office of Inspector General (OIG). The mission of the OIG, as spelled out in the Act, is to:

Conduct and supervise independent and objective audits and investigations relating to agency programs and operations.
Promote economy, effectiveness, and efficiency within the agency.
Prevent and detect fraud, waste, and abuse in agency programs and operations.
Review and make recommendations regarding existing and proposed legislation and regulations relating to agency programs and operations.
Keep the agency head and the Congress fully and currently informed of problems in agency programs and operations.

To ensure objectivity, the IG Act empowers the IG with:

Independence to determine what reviews to perform.
Access to all information necessary for the reviews.
Authority to publish findings and recommendations based on the reviews.

Vision

We strive for continual improvement in SSA's programs, operations and management by proactively seeking new ways to prevent and deter fraud, waste and abuse. We commit to integrity and excellence by supporting an environment that provides a valuable public service while encouraging employee development and retention and fostering diversity and innovation.

Executive Summary
OBJECTIVE

Our objectives were to assess (1) the Social Security Administration's (SSA) procedures for selecting Program Service Center (PSC) components for On-site Security Control and Audit Reviews (OSCAR), (2) SSA's system for ensuring appropriate correction of deficiencies identified through OSCARs, and (3) additional steps SSA can take to enhance the OSCAR guide.

BACKGROUND

SSA must comply with the Federal requirements associated with management controls and provide assurances that its financial, programmatic, and administrative processes are functioning as intended. SSA designed the Management Control Review (MCR) program to satisfy the Federal requirements. The MCR program is implemented within the PSCs using the Program Service Center OSCAR Guide, which standardizes Agency-wide review techniques and reporting criteria for various management control areas.

SSA has eight PSCs: six located in the regions and two at the Agency's Headquarters in Baltimore, Maryland. The Centers for Security and Integrity (CSI) are responsible for conducting OSCARs at the PSCs, while the Division of Financial Integrity (DFI) is responsible for ensuring the MCR program complies with Federal requirements. Under the PSC OSCAR guide, the components within the PSCs must be reviewed at least once within a 5-year period. The reviews cover a number of programmatic and administrative functions, including (1) security of automated systems, (2) physical and protective security, (3) time and attendance, (4) enumeration, and (5) third-party draft account.

RESULTS OF REVIEW

We found that two of the eight PSCs were not on track to meet the PSC OSCAR requirement that each PSC component be reviewed at least once every 5 years. The two PSCs located in SSA Headquarters had not been reviewed under the PSC OSCAR process at the time of our review, even though they conducted the same type of work as the non-Headquarters PSCs. Currently, SSA is developing an OSCAR guide for these two PSCs and plans to conduct reviews in FY 2008. Moreover, we found that SSA did not have a consistent policy to determine which PSC components should be included in the OSCAR process. As a result, some PSC OSCARs were more comprehensive than others.

Generally, we found the issuance of the OSCAR reports and implementation of the recommendations to be timely, but monitoring and following up on actions related to OSCAR reports could be improved. Finally, current PSC OSCAR guidance did not include sufficient steps to ensure that sensitive information contained in SSA's automated systems was properly protected.

CONCLUSION AND RECOMMENDATIONS

While SSA is making progress to ensure that all PSCs meet the 5-year OSCAR requirement, the Agency needs to ensure that the CSI components have a consistent method for identifying components subject to review and then maintain a management tracking system to assess their overall progress. Finally, SSA needs to ensure the PSC OSCAR guide addresses known areas of risk, such as the need to safeguard laptop computers and/or the personally identifiable information contained therein.

To improve the OSCAR process and increase its effectiveness, we recommend SSA:

Develop a consistent national policy on which PSC components are included in the OSCAR process and ensure any changes from this policy are approved by DFI management.

Review all PSC components at least once during a 5-year cycle.

Establish a minimum number or percent of PSC component reviews that must be conducted annually within each region, similar to the 10-percent rule used by other SSA offices conducting OSCARs.

Ensure the Office of Disability Operations and the Office of International Operations PSCs are reviewed timely under the PSC OSCAR process.

Require that the CSI offices obtain and maintain validation reports in a timely manner.

Develop the Automated OSCAR for the PSCs so that CSIs can (a) automatically track and monitor the OSCAR reports, corrective actions, and validation reports and (b) accurately report to DFI the number of reviews conducted.

Update the OSCAR guide, as needed, to include the protection of sensitive data, especially to safeguard laptop computers and/or the personally identifiable information contained within the laptop computers taken outside of the PSCs.

Table of Contents

INTRODUCTION
RESULTS OF REVIEW
OSCAR Coverage and Selection
Required Coverage
Selection of Components for Review
Corrections of Deficiencies
Timeliness of Issuing and Responding to OSCAR Reports
Follow-up Reports
Monitoring System
Protection of Sensitive Data
CONCLUSIONS AND RECOMMENDATIONS
APPENDICES
APPENDIX A - Acronyms
APPENDIX B - Scope and Methodology
APPENDIX C - Functions and Description of the Program Service Centers
APPENDIX D - Comparison of Program Service Center Components Reviewed Under On-Site Security Control and Audit Reviews
APPENDIX E - Recommendations Not Implemented
APPENDIX F - Agency Comments
APPENDIX G - OIG Contacts and Staff Acknowledgments

Introduction
OBJECTIVE

Our objectives were to assess (1) the Social Security Administration's (SSA) procedures for selecting Program Service Center (PSC) components for On-site Security Control and Audit Reviews (OSCAR), (2) SSA's system for ensuring appropriate correction of deficiencies identified through OSCARs, and (3) additional steps SSA can take to enhance the OSCAR guide.

BACKGROUND

SSA must comply with the Federal requirements associated with management controls and provide assurances that its financial, programmatic, and administrative processes are functioning as intended. These requirements include the Federal Managers' Financial Integrity Act (FMFIA). SSA designed the Management Control Review (MCR) program to satisfy the Federal requirements. The Division of Financial Integrity (DFI) develops and executes the MCR program in the PSCs to comply with the FMFIA.

SSA has eight PSCs, six of which are located within the regions and two at the Agency Headquarters in Baltimore, Maryland (see Table 1). The MCR program is implemented in the PSCs using the Program Service Center On-site Security Control and Audit Review (OSCAR) Guide, which standardizes Agency-wide review techniques and reporting criteria for various management control areas, including (1) security of automated systems, (2) physical and protective security, (3) time and attendance, (4) enumeration, and (5) third-party draft account. While the Centers for Security and Integrity (CSI) are responsible for conducting the reviews at the PSCs, SSA may also hire a contractor to perform the PSC reviews.

Table 1: PSCs at SSA
PSC Location
Northeastern Program Service Center (NEPSC) Jamaica, New York
Mid-Atlantic Program Service Center (MATPSC) Philadelphia, Pennsylvania
Southeastern Program Service Center (SEPSC) Birmingham, Alabama
Great Lakes Program Service Center (GLPSC) Chicago, Illinois
Western Program Service Center (WNPSC) Richmond, California
Mid-America Program Service Center (MAMPSC) Kansas City, Missouri
Office of Disability Operations (ODO) Baltimore, Maryland
Office of International Operations (OIO) Baltimore, Maryland

Under the OSCAR guide, all components within the PSCs must be reviewed at least once during a 5-year period. For example, a PSC may have 24 different components associated with areas such as disability operations, claims taking, and personnel and training, and each of these components must be reviewed within the 5-year cycle. To evaluate the management control areas as part of the OSCAR process, CSI staff conducts interviews, observes operations, and verifies information. Once the on-site activities are completed, the CSI staff meets with component management to discuss the findings and recommendations. Table 2 below provides the OSCAR reporting and corrective actions timeline for CSI staff and the components reviewed.

Table 2: Timing of OSCAR Reporting and Corrective Actions
Action | Component Responsible for Action | Component Monitoring Action | Calendar Days for Action to be Completed
Final OSCAR report to component manager and DFI | CSI | Component Manager | 45 days
Response with corrective actions planned and/or taken | Component Manager | ARC-PCO | 45 days
Validation that corrective actions were taken by the component | ARC-PCO (a) | CSI | 90 days
Note a: Assistant Regional Commissioner-Processing Center Operations.

In addition to the OSCARs, the PSC components are subject to other reviews that assess management controls. For example, PSC managers conduct their own internal reviews using the PSC Component Manager's Self-Review Guide, which is patterned after the OSCAR guide. The self-review is designed to familiarize managers with the security responsibilities that are part of their jobs and to serve as a tool for assessing the security posture of each manager's module. The CSI staffs also occasionally conduct targeted reviews within the PSC components, in areas such as time and attendance, to monitor compliance with policies and procedures.

Results of Review
We found that two of the eight PSCs were not on track to meet the OSCAR requirement that each PSC component be reviewed at least once every 5 years. SSA is now developing an OSCAR guide for the two Headquarters-based PSCs, which have never undergone a review. In addition, we found that the CSIs did not have a clear, consistent policy to determine which PSC components should be included in the OSCAR process. As a result, the CSIs defined the PSC components subject to an OSCAR differently. Our review also found that, generally, the CSIs issued timely PSC OSCAR reports and the audited components had taken appropriate actions on the recommendations.

However, we found that monitoring and following up on actions related to the OSCAR reports could be improved. Lastly, current PSC OSCAR procedures did not include sufficient steps to ensure that personally identifiable information (PII) contained in SSA's automated systems was protected. Such procedures need to be updated to provide for adequate review of handling PII contained in SSA's automated systems.

OSCAR COVERAGE AND SELECTION

Our review found that two of the eight PSCs were not on track to meet the OSCAR requirement that each PSC component be reviewed at least once every 5 years. In addition, we found the Agency did not have a consistent policy to determine which PSC components should be included in the OSCAR process.

REQUIRED COVERAGE

In our review of the eight PSCs subject to an OSCAR during Fiscal Years (FY) 2004 to 2008, we found that six of the eight PSCs were scheduling reviews in such a way that they would be able to review each component at least once every 5 years. The two processing centers recently identified as PSCs in Baltimore, Maryland, were not in compliance with the requirement.

Non-Headquarters PSCs

SSA management noted that, before FY 2004, there was no minimum requirement for the frequency of reviews or the number of components to be reviewed at the PSCs. To determine whether the non-Headquarters PSCs were on track to meet the 5-year requirement since the FY 2004 policy was put into place, we obtained information on the PSC OSCARs issued and planned for FYs 2004 through 2008.

We found that the non-Headquarters PSCs had reviewed or planned to review all of their components by FY 2008, as shown in Table 3.

Table 3: Planned and Issued OSCARs Per PSC
FYs 2004 through 2008
(Related to 159 PSC Components in the 6 Non-Headquarters PSCs)
PSC | FY 2004 | FY 2005 | FY 2006 | FY 2007 (a) | FY 2008 (a) | Total Reviews | Total Components (b) | Percent Reviewed (c)
NEPSC | 3 | 3 | 0 | 8 | 12 | 26 | 26 | 100%
MATPSC | 4 | 5 | 4 | 5 | 6 | 24 | 18 (d) | 133%
SEPSC | 9 | 6 | 5 | 5 | 5 | 30 | 29 | 103%
GLPSC | 6 | 2 | 7 | 8 | 7 | 30 | 29 | 103%
WNPSC | 0 | 8 | 4 | 6 | 7 | 25 | 25 | 100%
MAMPSC | 6 | 6 | 8 | 6 | 6 | 32 | 32 | 100%
Total | 28 | 30 | 28 | 38 | 43 | 167 | 159 | 105%
Note a: Already issued or planned.
Note b: Total components represent the components that were reviewed or scheduled to be reviewed under the OSCAR process. However, these do not represent all components that were subject to an OSCAR review (see page 6).
Note c: Some components were reviewed or scheduled to be reviewed more than once during the 5-year period.
Note d: The MATPSC currently reviews 16 components because 2 modules disbanded in April 2006.

The Northeastern PSC did not review components in FY 2006. CSI staff at the Northeastern PSC stated they postponed the FY 2006 OSCARs because of a regional mandate that required all component managers to perform a self-review OSCAR in FY 2006. The CSI staff acknowledged that these self-reviews do not replace the PSC OSCAR 5-year requirement. However, at the time, they considered it counterproductive for every PSC component to conduct both self-reviews and CSI OSCARs at the same time. The CSI staff stated they were closely involved in supporting the manager self-reviews.

The OSCAR guides for other SSA components require a minimum number of OSCARs each year so that reviews are performed consistently throughout the 5-year period. For instance, the Office of Disability Adjudication and Review OSCAR guide requires that offices annually review 20 percent of the field and Headquarters offices/components under their jurisdiction, or review 10 percent each year under the targeted review process, and complete all offices/components within 5 years. SSA could develop similar guidance for the PSCs that establishes a minimum number of PSC components to be reviewed annually, in conjunction with the requirement to review all PSC components at least once during a 5-year cycle. In the case of the Northeastern PSC, such a policy would have led to five OSCARs per year. Such a policy would also ensure that PSCs undergo periodic reviews and that deficiencies are detected earlier rather than later.
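As a rough illustration (our arithmetic, not Agency policy), a minimum annual requirement under a 5-year cycle follows directly from a PSC's component count:

\[
\text{reviews per year} \ge \frac{\text{number of components}}{5\ \text{years}}, \qquad \frac{26\ \text{NEPSC components}}{5\ \text{years}} = 5.2,
\]

that is, about five OSCARs per year, with at least 1 year requiring a sixth review to cover all 26 components within the cycle.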

Headquarters PSCs

The two Headquarters PSCs, ODO and OIO, had not been reviewed under the PSC OSCAR process at the time of our review, even though the two PSCs conducted the same type of work as the non-Headquarters PSCs. CSI staff in the Office of Central Operations (OCO) stated that the PSCs were not reviewed using the PSC OSCAR guide because most of the questions and steps were not applicable to the PSCs. In our review of the OSCAR guide, we found that, while some of the chapters may not have been applicable to all the Headquarters PSCs, other chapters were relevant to their operations such as (1) time and attendance, (2) security of automated systems, and (3) physical and protective security.

CSI staff noted that self-reviews and other types of reviews had been performed at the ODO and OIO locations over the years. For example, CSI staff conducted reviews that covered items that are also contained in the OSCAR (that is, time and attendance, Single Payment System [SPS], and programmatic reviews). CSI staff has also conducted reviews of the enumeration process at OIO. Moreover, both PSCs were also subject to operations reviews that included security and mailroom reviews.

CSI staff informed us that they were developing an OSCAR guide for the ODO and OIO PSCs and planned to implement it in FY 2008. We believe this change in operations will formalize the review process at the two PSCs, increase monitoring of reviews and recommendations, and ensure the two PSCs are subject to the same requirements as non-Headquarters PSCs.

SELECTION OF COMPONENTS FOR REVIEW

The CSIs did not have a clear, consistent policy to determine which PSC components should be included in the OSCAR process. While the PSC OSCAR requires that all PSC components be reviewed at least once during a 5-year cycle, we found that the CSIs had defined the PSC components subject to an OSCAR review differently.

Although the six non-Headquarters PSCs had approximately 185 components nationally, the CSIs only reviewed or were scheduled to review 159 of these components under the OSCAR process (see Table 4). Hence, 26 (14 percent) components within 5 PSCs did not receive or were not scheduled to receive an OSCAR review. For each PSC, the CSI determined which components should be reviewed using the PSC OSCAR guide. Several CSI staff stated they did not conduct PSC OSCARs on certain PSC components such as the Operations Support Branch and the Computer Operations Section because many of the PSC OSCAR chapters were not applicable.

Table 4: PSC Components Not Reviewed Under OSCAR
FYs 2002 through 2006
(Related to 185 PSC Components in the 6 Non-Headquarters PSCs)
PSC | Total Number of Components | Components Reviewed Using the PSC OSCAR | Components Not Reviewed Using the OSCAR | Percentage of Components Not Reviewed Using the OSCAR
MATPSC | 31 | 18 | 13 | 42%
MAMPSC | 41 | 32 | 9 | 22%
NEPSC | 28 | 26 | 2 | 7%
WNPSC | 26 | 25 | 1 | 4%
GLPSC | 30 | 29 | 1 | 3%
SEPSC | 29 | 29 | 0 | 0%
Total | 185 | 159 | 26 | 14%
As an example, the CSI staff at the Mid-Atlantic PSC reviewed 18 components using the PSC OSCAR guide, including 16 modules, the Intermediate Claims Taking Unit, and the Inquiry and Expediting Unit. However, they excluded 13 (42 percent) of the components in the PSC from the OSCAR review process, such as the mailroom, the four Process Areas, the Disability Processing Branch, the Operations Support Branch, the Operations Analysis Section, the Computer Operations Section, and the Debt Management Section.

CSI staff at the Mid-Atlantic PSC stated they conducted compensating reviews for PSC components that were not reviewed under the OSCAR process, such as annual mailroom audits, third-party draft and acquisition audits, and remittance and accounting unit annual audits. In our discussion with CSI staff at the other regions, we were told that they also conducted various internal control reviews of components not covered by an OSCAR.

We found the SEPSC was the only location where all of the components had been reviewed under the OSCAR guide. The CSI at the SEPSC reviewed all 29 of the PSC components as well as 5 Management and Operations Support (MOS) components that are housed within the PSC. CSI staff stated that the purpose of including all of these components within the OSCAR was to maintain a consistent security posture within the physical location of the SEPSC. Furthermore, they explained that they used the PSC OSCAR chapters applicable to the component being reviewed. For example, the OSCAR of the Labor Management and Employee Relations component was conducted using (1) time and attendance, (2) security of automated systems, and (3) physical security chapters of the OSCAR guide. The final OSCAR report contained relevant findings and recommendations related to time and attendance and the security of automated systems. CSI staff also discussed various aspects of physical security with management as part of their review.

CORRECTION OF DEFICIENCIES

Generally, we found that the CSIs issued timely PSC OSCAR reports and the audited components had taken appropriate actions on the recommendations. However, monitoring and follow-up actions related to the OSCAR process needed improvement.

TIMELINESS OF ISSUING AND RESPONDING TO OSCAR REPORTS

The OSCAR guide requires the issuance of an OSCAR report within 45 calendar days of the completion of the OSCAR. We found that the OSCAR reports were issued on time or close to it, as shown in Table 5; on average, reports were issued within 13 to 51 calendar days. Moreover, the PSCs are required to provide the CSIs a report of corrective actions planned and/or taken within 45 days of receiving the OSCAR report. Table 5 shows the PSCs issued the corrective action reports within, or close to, the 45-day period; these reports were issued within an average of 38 to 57 days.

Table 5: Timeliness of OSCAR Reports
(Issued in FYs 2005 and 2006)

PSC | PSC OSCAR Reports Issued in FYs 2005 and 2006 | Average Number of Days to Issue Report | Average Component Response Time (Days)
NEPSC | 3 | 32 | 57
MATPSC | 9 | 46 | 41
SEPSC | 11 | 13 | 38
GLPSC | 9 | 27 | 40 (a)
WNPSC | 12 | 51 | 39
MAMPSC | 14 | 26 | 39
Note a: The 40-day average component response time for GLPSC was calculated based on six reports instead of the nine reports that were issued. The CSI at GLPSC did not have evidence that it had received corrective action responses for three reports.

FOLLOW-UP REPORTS

We were unable to determine the timeliness of the PSC OSCAR validation reports for FYs 2005 to 2006 for the six PSCs because of incomplete CSI documentation. Under the OSCAR process, the ARC-PCOs are responsible for validating that corrective actions have been implemented by sending a validation report to the CSIs within 90 days of receiving the component's response. We found that only the CSIs within the Western and Mid-America PSCs maintained validation documentation for both FYs 2005 and 2006. In general, the CSIs relied on the ARC-PCO validation process to track and verify that the recommended corrections were implemented. However, while the ARC-PCOs tracked the validation reports, it is the CSIs' responsibility to track the validation reports sent to them and to ensure their receipt and timeliness.

We visited the six non-Headquarters PSCs to determine whether the recommendations had been implemented timely and whether appropriate actions were taken. At each PSC, we reviewed the last OSCAR report issued in FY 2006. We found that the 6 PSCs had implemented 90 (92 percent) of the 98 recommendations by the time we visited, which was at least 9 months after the reports were issued (see Table 6). The eight recommendations that were not implemented related to a number of areas, including (1) enumeration, (2) SPS, (3) management controls, and (4) time and attendance.

Table 6: OSCAR Recommendations Not Implemented
(FY 2006 OSCAR Reports for Components at Six PSCs)
PSC | CSI OSCAR Report Date | OIG Review Date | Total Recommendations | Implemented | Not Implemented | Percent Implemented
SEPSC | 09/01/2006 | 07/19/2007 | 24 | 24 | 0 | 100%
MAMPSC | 09/28/2006 | 07/01/2007 | 25 | 25 | 0 | 100%
MATPSC | 08/16/2006 | 06/26/2007 | 22 | 19 | 3 | 86%
WNPSC | 09/29/2006 | 07/26/2007 | 7 | 6 | 1 | 86%
GLPSC | 03/01/2006 | 07/19/2007 | 13 | 11 | 2 | 85%
NEPSC | 10/20/2006 | 07/31/2007 | 7 | 5 | 2 | 71%
Total | | | 98 | 90 | 8 | 92%

MONITORING SYSTEMS

We found that four of the six CSIs had no central management tracking system for following up on OSCAR findings, corrections, or receipt of validation reports. While the PSC OSCAR guide does not require that the CSIs maintain a tracking system, we believe the absence of one increases the risk that OSCAR recommendations remain unresolved (as noted earlier) and leaves managers without the information necessary to track the results of the OSCAR process.

We found that CSIs in the MATPSC and SEPSC had tracking systems in place to monitor the entire OSCAR process. The CSI office at the SEPSC had the most comprehensive system for tracking the OSCAR process. Within the tracking system, reviews were tracked by FY from the date the notification memorandum was sent to the component manager through the verification of the corrective actions. The tracking system included

a component checklist that tracked the component OSCARs by FY;
the CSI's 5-year plan of PSC OSCARs; and
a checklist that tracked the manager self-reviews by FY.

As for the four remaining PSCs, the CSI management did not track their reports through the entire OSCAR process. Instead, they relied on the individual CSI staff to follow up on the response to the reports through the OSCAR process using email reminders and/or the ARC-PCO validation process.

Furthermore, we found that the lack of adequate management information was also evident at the national level. For instance, the national DFI tracking reports incorrectly documented the status of PSC OSCAR reports at five of the six PSCs during FYs 2002 through 2006. As shown in Table 7, the DFI tracking reports did not document 14 OSCARs conducted in 4 PSCs during 4 of the 5 FYs reviewed. We also found that the DFI tracking reports documented four reviews in the Great Lakes and Western PSCs that were not conducted by the regions. Neither DFI nor the CSIs could provide evidence to explain why the OSCAR reports were incorrectly documented on the DFI tracking reports.

Table 7: OSCAR Reviews Documented
Incorrectly on the DFI Tracking Reports
PSC | FY 2002 | FY 2003 | FY 2005 | FY 2006 | Total
Reviewed But Not Documented in the DFI Tracking Report
SEPSC | 2 | 1 | - | - | 3
MATPSC | 1 | 2 | 1 | 1 | 5
GLPSC | - | - | - | 1 | 1
MAMPSC | 5 | - | - | - | 5
Total | 8 | 3 | 1 | 2 | 14
Documented in the DFI Tracking Report But Not Reviewed
GLPSC | - | 1 | - | - | 1
WNPSC | - | - | 3 | - | 3
Total | - | 1 | 3 | - | 4
Grand Total | 8 | 4 | 4 | 2 | 18

According to SSA staff, the Agency's goal is to develop an Automated OSCAR for the PSCs in the near future that will automatically monitor and track the OSCAR process. Currently, SSA tracks and monitors the field office OSCARs using the Automated OSCAR, which allows the CSIs to enter findings electronically and generate the required reports, corrective action plans, and validations, thereby eliminating manual reporting requirements. Moreover, DFI has access to the Automated OSCAR for field offices, which eliminates the need for the CSIs to manually report to DFI the number of field office OSCARs conducted during the FY. We encourage SSA to expedite the development of the PSC Automated OSCAR, as we believe a centralized tracking system will help improve monitoring and follow-up actions for the PSC OSCARs as well as produce accurate management information reports.
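SSA has not described the design of the planned PSC Automated OSCAR, so the following is only a minimal sketch of the kind of record such a system might keep for each review; the class and field names are hypothetical assumptions, and only the 45-, 45-, and 90-day milestones are taken from Table 2 of this report.

from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class OscarReview:
    # Hypothetical record type; names are illustrative, not SSA's system.
    psc: str                                    # e.g., "MATPSC"
    component: str                              # PSC component reviewed
    report_issued: date                         # final OSCAR report sent to the component manager and DFI
    response_received: Optional[date] = None    # component's corrective action response
    validation_received: Optional[date] = None  # ARC-PCO validation report

    @property
    def response_due(self) -> date:
        # The component's response is due 45 calendar days after the report (Table 2).
        return self.report_issued + timedelta(days=45)

    @property
    def validation_due(self) -> Optional[date]:
        # Validation is due 90 calendar days after the component's response (Table 2).
        if self.response_received is None:
            return None
        return self.response_received + timedelta(days=90)

    def overdue_items(self, today: date) -> list[str]:
        """List the follow-up items that are missing past their deadlines."""
        items = []
        if self.response_received is None and today > self.response_due:
            items.append("corrective action response")
        if (self.validation_due is not None
                and self.validation_received is None
                and today > self.validation_due):
            items.append("validation report")
        return items

For example, for the MATPSC report issued August 16, 2006 (Table 6), OscarReview("MATPSC", "Module 1", date(2006, 8, 16)).response_due falls on September 30, 2006 (the component name here is invented for illustration); a tracker built on such records could automatically surface items like the three GLPSC corrective action responses for which no receipt was documented.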

PROTECTION OF SENSITIVE DATA

Current PSC OSCAR procedures do not include sufficient steps to ensure that PII contained in SSA's automated systems is protected, and they need to be updated to provide for adequate review of how PII is handled. Chapter 5 of the PSC OSCAR guide, Security of Automated Systems, includes procedures for reviewing SSA's automated systems and associated data at the PSCs. However, the guide should also consider current work environments that allow some PSC staff to work from home using an SSA-provided laptop. For example, the PSC OSCAR does not review the procedures in place for safeguarding laptop computers and/or the PII contained within laptop computers taken outside the PSCs.

Conclusions and Recommendations

While SSA is making progress to ensure that all PSCs meet the 5-year OSCAR review requirement, the Agency needs to ensure that the CSI components have a consistent method for identifying components subject to review and then maintain a management tracking system to assess their overall progress. Finally, SSA needs to ensure the PSC OSCAR guide addresses known areas of risk, such as the need to safeguard laptop computers and/or the PII contained therein.

RECOMMENDATIONS

To improve the OSCAR process and increase its effectiveness, we recommend SSA:

1. Develop a consistent national policy on which PSC components are included in the OSCAR process and ensure any changes from this policy are approved by DFI management.

2. Review all PSC components at least once during a 5-year cycle.

3. Establish a minimum number or percent of PSC component reviews that must be conducted annually within each region, similar to the 10-percent rule used by other SSA offices conducting OSCARs.

4. Ensure the ODO and OIO PSCs are reviewed timely under the PSC OSCAR process.

5. Require that the CSI offices obtain and maintain validation reports in a timely manner.

6. Develop the Automated OSCAR for the PSCs so that CSIs can (a) automatically track and monitor the OSCAR reports, corrective actions, and validation reports and (b) accurately report to DFI the number of reviews conducted.

7. Update the OSCAR guide, as needed, to include the protection of sensitive data, especially to safeguard laptop computers and/or the PII contained within the laptop computers taken outside of the PSCs.

AGENCY COMMENTS

SSA agreed with our recommendations. The Agency's comments are included in Appendix F.

Appendices

Appendix A
Acronyms

ARC-PCO Assistant Regional Commissioner for Processing Center Operations
CPS Critical Payment System
CSI Center for Security and Integrity
DFI Division of Financial Integrity
FPPS Federal Personnel Payroll System
FMFIA Federal Managers' Financial Integrity Act
FY Fiscal Year
GLPSC Great Lakes Program Service Center
MCR Management Control Review
MAMPSC Mid-America Program Service Center
MATPSC Mid-Atlantic Program Service Center
MOS Management Operations Support
NEPSC Northeastern Program Service Center
OCO Office of Central Operations
ODO Office of Disability Operations
OIO Office of International Operations
OSCAR On-site Security Control and Audit Review
PAR Performance and Accountability Report
PII Personally Identifiable Information
PSC Program Service Center
SPS Single Payment System
SSA Social Security Administration
SEPSC Southeastern Program Service Center
WNPSC Western Program Service Center

Appendix B
Scope and Methodology

To accomplish our objectives, we:

Reviewed the Social Security Administration's (SSA) policies and procedures pertaining to the Program Service Centers (PSC), including the criteria pertaining to On-site Security Control and Audit Reviews (OSCAR) at PSCs. The PSC OSCAR guide for Fiscal Year (FY) 2006 consists of 10 chapters, as shown below:

o Third Party Draft Account;
o Acquisitions;
o Debt Management System;
o Time and Attendance;
o Security of Automated Systems;
o Physical and Protective Security;
o Enumeration;
o Single Payment System and One Check Only Payments;
o Integrity Review Areas; and
o Management Controls.

Reviewed prior Office of the Inspector General audit reports.

Met with SSA staff to gain a better understanding of the OSCAR process as well as other compensating controls.

Gained an understanding of PSC components through interviews with PSC staff as well as a review of PSC organizational charts and telephone directories.

Obtained a listing of all PSC OSCARs performed at the PSCs during FYs 2002 to 2006 and reviews scheduled during FYs 2007 and 2008. For the FYs 2005 to 2006 audit period, we:

o Collected and analyzed data related to the timeliness of issuing OSCAR reports related to the PSC OSCARs performed.

o Selected the last OSCAR report issued in FY 2006 for each of the six non-Headquarters PSCs. If a PSC did not issue a report in FY 2006, we selected the first report issued in FY 2007. We used these reports during our visits to the six PSCs to determine whether the OSCAR follow-up process was correctly followed and whether recommendations were implemented as required.

We found the data used for this audit to be sufficiently reliable to meet our objectives. The entity audited was the Office of the Deputy Commissioner of Operations. We conducted our fieldwork from December 2006 through September 2007 in Philadelphia, Pennsylvania; New York, New York; Richmond, California; Kansas City, Missouri; Chicago, Illinois; Birmingham, Alabama; and Baltimore, Maryland. We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Appendix C
Functions and Description of the Program Service Centers

The Program Service Centers (PSC) are large and complex multi-mission stations established as extensions of the national Headquarters. There are eight such centers, collectively referred to as "processing centers": six PSCs located in the regions and two centers located in the national Headquarters of the Social Security Administration (SSA) in Baltimore, Maryland. The Office of Disability Operations (ODO) in Baltimore performs generally the same type of work as a PSC and serves all persons under 59 years of age claiming disability insurance or black lung benefits. The Office of International Operations (OIO) PSC serves all accounts in which one or more beneficiaries reside in a foreign country. In addition, this PSC provides technical supervision to Foreign Service posts for the taking and development of claims and the investigation of subsequent actions affecting benefit payments.

The primary missions of the PSCs are to:
1. Provide uniform, accurate, and prompt processing of Social Security claims and post-adjudicative changes after beneficiaries have been entitled.
2. Perform formal and informal reconsideration of determinations.
3. Make determinations of overpayments and collect amounts due.
4. Maintain document records, update computer records, and certify payment and collection transactions to the Department of the Treasury.

All 8 PSCs in our review consist of a total of 257 components. The 6 non-Headquarters PSCs consist of 185 components located in 6 regions nationwide. The 2 PSCs located in the national Headquarters, ODO and the OIO, consist of 58 components and 14 components, respectively.

Appendix D
Comparison of Program Service Center Components Reviewed Under On-Site Security Control and Audit Reviews

In our review, we found that each Center for Security and Integrity (CSI) office had its own definition of the Program Service Center (PSC) components subject to review under the On-site Security Control and Audit Review (OSCAR) process. In Table D-1, we show a comparison of the components reviewed using the OSCAR guide in the Mid-Atlantic PSC and the Southeastern PSC. We found that the Mid-Atlantic PSC CSI staff reviewed 18 of its 31 components, whereas CSI staff at the Southeastern PSC reviewed all 29 components in the PSC.

Table D-1: PSC Components
Reviewed Using OSCAR
Types of Program Service Center Component MATPSC SEPSC
Modules Yes Yes
Immediate Claims Taking Unit Yes Yes
Inquiry and Expediting Yes Yes
Process Division Office No Yes
Processing Center Operations No Yes
Disability Process Branch No Yes
Operations Analysis Section No Yes
Mail & Direct Input No Yes
Computer Operations Section & Unit No Yes
Debt Management Section, Contact Unit, Remittance & Accounting & Debt Specialist Unit No Yes
Operations Support Branch No Yes

Appendix E
Recommendations Not Implemented

We conducted reviews in the six non-headquarters Program Service Centers (PSC) to determine whether the recommendations from the On-site Security Control and Audit Reviews (OSCAR) had been implemented timely and whether appropriate actions were taken. We selected the last OSCAR reports issued in Fiscal Year (FY) 2006 for all of the six PSCs in our review except the Northeastern PSC, which did not issue a report in FY 2006. In this case, we selected the first report issued in FY 2007. We found that, of 98 recommendations made by the Centers for Security and Integrity, 8 recommendations were not implemented. Table E-1 gives a summary of the recommendations.

Table E-1: Recommendations Not Implemented
PSC | Chapter | Recommendation
Northeastern PSC | Enumeration | Authorizers must ensure Numident changes meet requirements.
Northeastern PSC | Single Payment System (SPS) | Remind authorizers to follow Agency procedures for determining eligibility for death underpayments and dividing these payments according to relationship to the deceased beneficiary. Remind authorizers to code the Social Security numbers of death underpayment payees per instructions.
Mid-Atlantic PSC | Time and Attendance | Action should be taken to ensure that the timekeeper completes all items on the pre-approval register.
Mid-Atlantic PSC | Integrity | Integrity reviews should have the correct remarks documented on the certification screen per instructions.
Mid-Atlantic PSC | Enumeration | Refresher training should be provided to the PSC Spikers on handling calls involving the enumeration process.
Great Lakes PSC | Management Controls | Management should ensure that the Critical Payment System (CPS) records on the monthly reports are properly adjusted.
Great Lakes PSC | Management Controls | Management should ensure that SPS cases are processed timely to avoid possible tampering with payment addresses or duplicate payments.
Western PSC | Time and Attendance | The timekeeper should reconcile the Mainframe Time and Attendance System to the information posted to the Federal Personnel Payroll System (FPPS) record.

Appendix F
Agency Comments

MEMORANDUM

Date: April 1, 2008

To: Patrick P. O'Carroll, Jr.
Inspector General

From: David Foster
Chief of Staff

Subject: Office of the Inspector General (OIG) Draft Report, "On-site Security Control and Audit Reviews at Program Service Centers" (A-03-07-17064)--INFORMATION

We appreciate OIG's efforts in conducting this review. Our comments regarding the draft report and response to the recommendations are attached.

Please let me know if we can be of further assistance. Staff inquiries may be directed to Ms. Candace Skurnik, Director, Audit Management and Liaison Staff, at (410) 965-4636.

COMMENTS ON THE OFFICE OF THE INSPECTOR GENERAL'S DRAFT REPORT, "ON-SITE SECURITY CONTROL AND AUDIT REVIEWS AT PROGRAM SERVICE CENTERS" (A-03-07-17064)

Thank you for the opportunity to review and provide comments on this draft report.

Recommendation 1

Develop a consistent national policy on which Program Service Center (PSC) components are included in the On-Site Security Control and Audit Review (OSCAR) process and ensure any changes from this policy are approved by Division of Financial Integrity (DFI) management.

Comment

We agree. The Division of Systems Security and Program Integrity is working closely with the DFI to develop a consistent national policy, ensuring that any deviations from the policy are approved by DFI management. We plan to have the policy decisions completed by June 30, 2008.

Recommendation 2

Review all PSC components at least once during a 5-year cycle.

Comment

We agree. We are considering the 5-year cycle as an option as we develop our National policy (see our response to recommendation 1). We plan to have the policy decisions completed by June 30, 2008.

Recommendation 3

Establish a minimum number or percent of PSC component reviews that must be conducted annually within each region, similar to the 10 percent rule used by other SSA offices conducting OSCARs.

Comment

We agree. We are considering the 10 percent rule as an option as we develop our National policy (see our response to recommendation 1). We plan to have the policy decisions completed by June 30, 2008.

Recommendation 4

Ensure the Office of Disability Operations (ODO) and the Office of International Operations (OIO) PSCs are reviewed timely under the PSC OSCAR process.

Comment

We agree. In December 2007, we established the OSCAR project plan for the ODO and OIO PSCs. Both are now subject to timely reviews under the PSC OSCAR process. The 5-year plan started in January 2008 and will be completed in August 2012. The plan includes a review of each Office of Central Operations component within that timeframe.

Recommendation 5

Require that the Center for Security and Integrity (CSI) offices obtain and maintain validation reports in a timely manner.

Comment

We agree. Our ability to monitor for compliance will be enhanced by the automation of the PSC OSCAR (see our response to recommendation 6). A reminder will be issued in April 2008 to all of our CSIs informing them that they are to ensure that they obtain and maintain validation reports in a timely manner.

Recommendation 6

Develop the Automated OSCAR for the PSCs so that CSIs can: a) automatically track and monitor the OSCAR reports, corrective actions, and validation reports; and b) accurately report to DFI the number of reviews conducted.

Comment

We agree. We are in the process of enhancing our current field office website of Automated OSCARs to include the PSC OSCAR process. Our target date for completion of the enhancements is early fiscal year 2009. Once the enhancements are complete, we will be able to automatically track and monitor the OSCAR reports, corrective actions, and validation reports. We will also be able to provide accurate and timely data to DFI regarding the number of reviews conducted.

Recommendation 7

Update the OSCAR guide, as needed, to include the protection of sensitive data, especially to safeguard laptop computers and/or the Personally Identifiable Information (PII) contained within the laptop computers taken outside of the PSCs.

Comment

We agree. Currently, we update the guide on a monthly basis to ensure it is in alignment with the current security policies and procedures. While the current version of the PSC OSCAR guide does contain questions related to the protection of sensitive data, including properly securing laptops in the office when not in use, it does not contain a question regarding the protection of PII contained within laptop computers taken outside of the PSCs. We will add this type of question to the PSC OSCAR guide by May 2008.

Appendix G
OIG Contacts and Staff Acknowledgments
OIG Contacts
Cylinda McCloud-Keal, Director, Philadelphia Audit Division, (215) 597-0572
Acknowledgments
In addition to those named above:
Mary Dougherty, Auditor-in-Charge
Richard Devers, Information Technology Specialist
Elizabeth Juarez, Senior Auditor
Timothy Meinholz, Senior Auditor
Denise Molloy, Senior Analyst
Karis Crane, Auditor
Hollie Reeves, Auditor
Nichole Purnell, Program Analyst

For additional copies of this report, please visit our web site at www.socialsecurity.gov/oig or contact the Office of the Inspector General's Public Affairs Specialist at (410) 965-3218. Refer to Common Identification Number A-03-07-17064.

Overview of the Office of the Inspector General
The Office of the Inspector General (OIG) comprises our Office of Investigations (OI), Office of Audit (OA), Office of the Chief Counsel to the Inspector General (OCCIG), and Office of Resource Management (ORM). To ensure compliance with policies and procedures, internal controls, and professional standards, we also have a comprehensive Professional Responsibility and Quality Assurance program.

Office of Audit
OA conducts and/or supervises financial and performance audits of the Social Security Administration's (SSA) programs and operations and makes recommendations to ensure program objectives are achieved effectively and efficiently. Financial audits assess whether SSA's financial statements fairly present SSA's financial position, results of operations, and cash flow. Performance audits review the economy, efficiency, and effectiveness of SSA's programs and operations. OA also conducts short-term management and program evaluations and projects on issues of concern to SSA, Congress, and the general public.

Office of Investigations
OI conducts and coordinates investigative activity related to fraud, waste, abuse, and mismanagement in SSA programs and operations. This includes wrongdoing by applicants, beneficiaries, contractors, third parties, or SSA employees performing their official duties. This office serves as OIG liaison to the Department of Justice on all matters relating to the investigations of SSA programs and personnel. OI also conducts joint investigations with other Federal, State, and local law enforcement agencies.

Office of the Chief Counsel to the Inspector General
OCCIG provides independent legal advice and counsel to the IG on various matters, including statutes, regulations, legislation, and policy directives. OCCIG also advises the IG on investigative procedures and techniques, as well as on legal implications and conclusions to be drawn from audit and investigative material. Finally, OCCIG administers the Civil Monetary Penalty program.

Office of Resource Management
ORM supports OIG by providing information resource management and systems security. ORM also coordinates OIG's budget, procurement, telecommunications, facilities, and human resources. In addition, ORM is the focal point for OIG's strategic planning function and the development and implementation of performance measures required by the Government Performance and Results Act of 1993.