

FDIC’s Controls Related to the Offsite Review List

February 2009
Report No. AUD-09-004

FDIC OIG, Office of Audits
Federal Deposit Insurance Corporation

Why We Did The Audit

The FDIC insures about 8,500 financial institutions with assets totaling over $13 trillion and domestic deposits totaling over $6.9 trillion. The federal banking agencies (the agencies), which include the FDIC, have developed a number of tools for monitoring the health of individual institutions and the industry as a whole. The FDIC has developed eight offsite systems to monitor insured institutions between examinations. Three of these systems are used to produce the Offsite Review List (ORL), which identifies insured institutions with potential problems (as described later). Within the FDIC, the Division of Supervision and Consumer Protection (DSC) is responsible for performing offsite monitoring of FDIC-insured depository institutions.

The audit objective was to assess DSC’s internal controls for performing offsite monitoring of insured financial institutions. The audit focused on the controls related to offsite reviews of institutions on the ORL. As part of our audit, we also reviewed DSC’s implementation of a recommendation by the Government Accountability Office (GAO) for strengthening the FDIC’s risk assessment activities through periodic evaluations and monitoring of the Corporation’s offsite monitoring systems.

Background

One of the FDIC’s primary offsite monitoring tools is the Statistical CAMELS Offsite Rating (SCOR) model, a statistical model that uses financial ratios and historical examination results to assign an offsite CAMELS rating. (CAMELS is an acronym for the components Capital, Asset quality, Management, Earnings, Liquidity, and Sensitivity to market risk. Institutions receive composite and component ratings of 1 to 5, with 1 representing the least regulatory concern and 5 the greatest concern.) SCOR is designed to identify 1- and 2-composite rated institutions that have experienced substantial financial deterioration since the last onsite examination.

SCOR and two other risk measurement systems, SCOR-Lag and the Growth Monitoring System, are used to produce the ORL each quarter. The ORL consists of 1- and 2-rated institutions that have been identified as having potential problems or as being at risk of a downgrade to a 3 rating or worse at the next examination.

DSC’s Case Manager Procedures Manual requires an analysis of all institutions on the ORL so an appropriate supervisory strategy can be developed, as warranted.

Audit Results

DSC has established internal controls for performing offsite monitoring of insured financial institutions. Specifically, each institution on the ORL must have an offsite review completed and approved within 3½ months after the end of each quarter. We sampled 60 of the 577 institutions on the December 31, 2007 ORL and found that DSC had completed offsite reviews for each sampled institution, developed supervisory strategies, and documented the reviews in accordance with DSC policies and procedures.

Additionally, DSC has initiated a process for periodically evaluating its offsite monitoring systems in response to a February 2007 GAO recommendation to evaluate and monitor these systems. DSC plans to evaluate, on a rotational basis, its offsite monitoring systems. However, at the time of our audit, no details regarding a schedule or procedures for conducting evaluations were available, and no system evaluations had been performed.

Although the FDIC has developed an extensive offsite monitoring program, opportunities exist for improvement. Specifically, we found that the ORL was not capturing a significant percentage of institutions that DSC, through its risk management examinations, downgraded to a 3 rating or worse, including many of the institutions that ultimately failed.

DSC pointed out that although the ORL may not have captured a significant percentage of institutions that were downgraded, the same institutions may have been receiving additional DSC supervisory attention through other monitoring tools. In addition to the ORL, the FDIC uses several other offsite monitoring tools to monitor risks within the industry and to identify potential emerging risks that may require additional supervisory follow-up, including: a model that projects an institution’s CAMELS rating subject to a real estate crisis similar to the one in New England in the early 1990s; a model that identifies institutions that have experienced consistent rapid growth (over several quarters) and/or a funding structure that is highly dependent on non-core sources; a report that monitors institutions exhibiting high-risk lending activity; a report that monitors the condition of institutions that have been in operation less than 8 years; a quarterly monitoring program for institutions with total assets over $10 billion; a model that identifies institutions with characteristics that have been associated with fraud; formal regional risk committees; and various regional monitoring programs and watch lists.

Thorough and timely evaluations of the three DSC models-based offsite monitoring systems that create the ORL are needed to determine if the assumptions and methodologies used reasonably support determinations for including institutions on the ORL. Further, the offsite monitoring systems used to create the ORL are largely based on historical indicators, pertaining to institution asset quality, earnings, and capital, that may not fully consider current and emerging risks. As a result, the ORL may not be capturing a complete picture of the risks facing 1- and 2-composite rated institutions or identifying those institutions at risk of significant ratings downgrades.

Using actual failure and downgrade information to test all offsite monitoring systems and incorporating the results into evaluations of those systems could lead to a more focused ORL and a more effective and efficient offsite monitoring program. Scheduling all offsite monitoring systems for regular evaluations and establishing procedures to conduct the evaluations would help to assure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the FDIC’s Deposit Insurance Fund are being mitigated.

Recommendations and Management Response

We recommended that DSC: (1) validate the assumptions and methodology used in SCOR; (2) ensure that the regular evaluations of all offsite monitoring systems used to create the ORL are performed as scheduled; and (3) establish procedures to evaluate all models-based offsite monitoring systems and, as part of these procedures, consider recent failure and downgrade information to test the efficacy of the logic and assumptions used in the offsite monitoring systems. In its response to the audit, DSC stated that it concurred with the recommendations and completed the recommended actions. Additionally, DSC provided comments regarding the accuracy of the ORL as a predictive tool and stated that DSC had completed the first of the GAO-recommended evaluations.





Contents

BACKGROUND
RESULTS OF AUDIT
CONTROLS FOR PERFORMING OFFSITE REVIEWS
EFFECTIVENESS OF THE ORL
    Guidance Related to Ensuring the Effectiveness of the ORL
    Accuracy of the ORL in Identifying Problem Institutions
    Failed Institutions That Were Not Identified on ORLs
    Downgraded Institutions That Were Not Identified on ORLs
    Division of Insurance and Research Analysis of SCOR’s Performance
    Implementation of a GAO Recommendation Related to Evaluating Offsite Monitoring Systems
    Conclusion
    Recommendations for Enhancing the Effectiveness of the ORL
CORPORATION COMMENTS AND OIG EVALUATION
APPENDICES
    1. OBJECTIVE, SCOPE, AND METHODOLOGY
    2. THE FDIC’S OFFSITE MONITORING TOOLS
    3. OIG ANALYSIS OF WHETHER FAILED INSTITUTIONS WERE ON THE ORL PRIOR TO FAILURE
    4. CORPORATION COMMENTS
    5. MANAGEMENT RESPONSE TO THE RECOMMENDATION
    6. ACRONYMS USED IN THE REPORT
TABLES
    1. Offsite Review Dates
    2. Sample of Institutions from the December 31, 2007 ORL
    3. Number of Insured and Supervised Institutions, by Regulator
FIGURE
    Number of Institutions on the ORLs Since 2006









FDIC, Federal Deposit Insurance Corporation, Office of Inspector General, Office of Audits, 3501 Fairfax Drive, Arlington, VA 22226-3500
DATE: February 19, 2009
 
MEMORANDUM TO: Sandra L. Thompson, Director
Division of Supervision and Consumer Protection

FROM: Russell A. Rau [Electronically produced version; original signed by Russell A. Rau]
Assistant Inspector General for Audits

SUBJECT: FDIC’s Controls Related to the Offsite Review List
(Report No. AUD-09-004)
 

This report presents the results of our audit of the FDIC’s controls related to one of its offsite monitoring tools—the Offsite Review List (ORL).1 The audit objective was to assess the FDIC’s internal controls for performing offsite monitoring of insured financial institutions. We focused the audit on the Division of Supervision and Consumer Protection’s (DSC) controls pertaining to offsite reviews of institutions on the FDIC’s ORL, which identifies insured institutions with 1 and 2 composite ratings and potential problems that pose the risk of a downgrade at the next examination.2 As part of our audit, we also reviewed DSC’s implementation of a recommendation by the Government Accountability Office (GAO)3 pertaining to strengthening the FDIC’s risk assessment activities through periodic evaluations and monitoring, including offsite monitoring.

We conducted this performance audit in accordance with generally accepted government auditing standards. Appendix 1 of this report discusses our audit objective, scope, and methodology in detail.














1 The ORL is described in more detail in the Background section of this report. It is important to note that DSC uses several other offsite monitoring tools to monitor risks within the industry and to identify potential emerging issues that may require additional supervisory follow-up.
2 Under the Uniform Financial Institutions Rating System (UFIRS), each financial institution is assigned a composite rating by a federal or state banking agency based on an evaluation and rating of six essential components of an institution’s financial condition and operations. These component factors address the adequacy of Capital, the quality of Assets, the capability of Management, the quality and level of Earnings, the adequacy of Liquidity, and the Sensitivity to market risk (otherwise known as CAMELS).
3 Report Number GAO-07-255, Federal Deposit Insurance Corporation: Human Capital and Risk Assessment Programs Appear Sound, but Evaluations of Their Effectiveness Should Be Improved, dated February 15, 2007.








BACKGROUND

Section 10(d) of the Federal Deposit Insurance Act requires an onsite examination of each insured financial institution at least once during each 12-month period.4 Annual examination intervals may be extended to 18 months if the insured institution has assets totaling less than $500 million and is well managed and well capitalized. As stated in the FDIC Banking Review, 2003, Volume 15, No. 3, onsite examinations provide the most complete and reliable information about an institution’s financial health, and the federal banking agencies regard CAMELS ratings as the single best indicator of an institution’s condition. However, subsequent to a completed onsite examination, an institution’s financial condition may change, so the CAMELS ratings may no longer accurately reflect the institution’s current condition. Therefore, the FDIC has developed various offsite tools, including the ORL, to monitor insured institutions between examinations.

As identified in the Offsite Review Program section of the Case Manager Procedures Manual (Procedures Manual), DSC developed eight risk measures for monitoring the condition of individual institutions. These eight risk measures use offsite data to assist in monitoring the risk of about 8,500 insured institutions (including non-FDIC supervised institutions). The eight measures use information reported in financial institutions’ quarterly Consolidated Reports of Condition and Income (Call Report) and Thrift Financial Reports (TFR). Information from the measures may also aid examiners in planning for an onsite examination. The eight measures are described below.

  • The Statistical CAMELS Offsite Rating (SCOR) model uses statistical techniques to measure the likelihood that an institution will receive a ratings downgrade at the next examination. The output of the model is derived from historical examination results as well as from Call Report and TFR data.
  • SCOR-Lag, a derivation of SCOR, attempts to more accurately assess the financial condition of rapidly growing banks.
  • The Growth Monitoring System (GMS) identifies institutions experiencing rapid growth and/or with a funding structure highly dependent on non-core funding sources.
  • The Real Estate Stress Test (REST) projects an institution’s CAMELS rating subject to a real estate crisis similar to that in New England in the early 1990s.
  • Consistent Grower is a cumulative growth score measure using up to 20 quarters of GMS scores.
  • The Quarterly Lending Alert (QLA) monitors institutions exhibiting high-risk lending activity such as subprime lending.



4 12 United States Code 1820(d).












  • Young Institutions identifies institutions that are less than 8 years old.
  • Multiflag combines the multiple risk measures discussed above.

Three of these eight measures are used to produce the quarterly ORL: the SCOR, SCOR-Lag, and GMS. The ORL consists of 1- and 2-composite rated institutions that are (1) identified by SCOR or SCOR-Lag with a 35 percent or higher probability of being downgraded to a 3 rating or worse at the next examination or (2) flagged by the GMS as being in the 98th or higher growth percentile.
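To illustrate the selection rule described above, the following sketch expresses the ORL inclusion criteria in Python. It is an illustration only; the record layout and field names are assumptions made for this example and do not reflect the FDIC’s actual SCOR, SCOR-Lag, or GMS implementations.

from dataclasses import dataclass

@dataclass
class Institution:
    cert_number: int
    composite_rating: int           # current CAMELS composite rating (1-5)
    scor_downgrade_prob: float      # SCOR estimate of the probability of a downgrade to 3 or worse
    scor_lag_downgrade_prob: float  # SCOR-Lag estimate of the same probability
    gms_growth_percentile: float    # GMS growth percentile (0-100)

def on_offsite_review_list(inst: Institution) -> bool:
    """Apply the ORL criteria described in this report to a single institution."""
    if inst.composite_rating not in (1, 2):
        return False  # the ORL covers only 1- and 2-rated institutions
    flagged_by_scor = max(inst.scor_downgrade_prob, inst.scor_lag_downgrade_prob) >= 0.35
    flagged_by_gms = inst.gms_growth_percentile >= 98.0
    return flagged_by_scor or flagged_by_gms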

DSC stated that it uses several other offsite monitoring tools to monitor risks within the industry and to identify potential emerging issues that may require additional supervisory follow-up. These tools (details are in Appendix 2) include, but are not limited to, the following.

  • Large Insured Depository Institution (LIDI) Program.
  • Internal Control Assessment Rating System (ICARuS) and Risk Analysis Center (RAC) Dashboard.
  • Regional Watch Lists.
  • Regional Offsite Monitoring and Supervisory Strategies.
  • Regional Risk Committees.
  • Quarterly Supervisory Risk Profile.

According to DSC, it created and staffed two new sections in 2008 to strengthen the examination program and enhance the risk assessment process, including offsite review. These sections are: (1) the Risk Analysis Section, which analyzes offsite information available through various monitoring systems, together with specific information gathered during examinations, to proactively identify risks and trends; and (2) the Emerging Issues Section, which was created to enhance the Corporation’s ability to develop proactive forward-looking bank supervision policy and conduct offsite monitoring of various institutional risks.

Information in DSC’s Virtual Supervisory Information on the Net (ViSION) System shows that the number of institutions on the ORL has been increasing significantly since 2006, as shown in the figure, which follows.














Number of Institutions on the ORLs Since 2006
Source: Office of Inspector General (OIG) analysis of ViSION system information.

Examiner Guidance

Section 13 of the Procedures Manual discusses the Offsite Review Program, including (1) definitions of the eight risk measures, (2) generation of the ORL based on updated quarterly Call Report data, (3) deadlines for conducting an offsite review, (4) documentation of the reviewer’s findings and supervisory strategy in the Offsite Module of ViSION, and (5) ViSION comments on reviews that identified medium or high levels of risk. The Procedures Manual does not provide specific step-by-step instructions for completing the offsite review; rather, it provides a general overview of the Offsite Review Program.

According to the Procedures Manual, “[E]ach institution on the ORL must have an Offsite Review and will appear in the Active Tasks of the appropriate Case Manager or Field Supervisor in ViSION.” Case Managers or Field Supervisors perform the offsite reviews to determine whether supervisory attention is warranted before the next regularly scheduled examination or whether a rating change should be initiated if the review indicates that the institution poses a greater risk to the insurance fund than its composite rating suggests. The manual also states that offsite reviews must be completed and approved within 3½ months after each Call Report date. Table 1, which follows, shows the schedule for completing and approving offsite reviews.











Table 1: Offsite Review Dates
Call Report Date | Call Report Finalized | Offsite Reviews Approved
March 31 | May 31 | July 15
June 30 | August 31 | October 15
September 30 | November 30 | January 15
December 31 | February 28 | April 15
Source: The Case Manager Procedures Manual.
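As an illustration of the schedule in Table 1, the following sketch maps each quarterly Call Report date to its offsite review approval deadline, 3½ months after the report date. The helper simply encodes the pairs shown in Table 1; it is not part of any FDIC system.

from datetime import date

# Quarter-end (month, day) -> approval deadline (month, day), per Table 1.
OFFSITE_REVIEW_DEADLINES = {
    (3, 31): (7, 15),    # March 31 Call Report -> reviews approved by July 15
    (6, 30): (10, 15),   # June 30              -> October 15
    (9, 30): (1, 15),    # September 30         -> January 15 of the following year
    (12, 31): (4, 15),   # December 31          -> April 15 of the following year
}

def review_approval_deadline(call_report_date: date) -> date:
    """Return the offsite review approval deadline for a quarterly Call Report date."""
    month, day = OFFSITE_REVIEW_DEADLINES[(call_report_date.month, call_report_date.day)]
    year = call_report_date.year + (1 if month < call_report_date.month else 0)
    return date(year, month, day)

# Example: reviews driven by the December 31, 2007 Call Report were due by April 15, 2008.
assert review_approval_deadline(date(2007, 12, 31)) == date(2008, 4, 15)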

Prior Related Audit Attention

In February 2007, the GAO issued a report entitled, Federal Deposit Insurance Corporation: Human Capital and Risk Assessment Programs Appear Sound, but Evaluations of Their Effectiveness Should Be Improved (Report No. GAO-07-255). In its report, the GAO noted that the FDIC has an extensive risk assessment system and contingency plans for bank failures but had not comprehensively or routinely evaluated the system or plans. Although the GAO noted that the FDIC had conducted a one-time analysis of the performance of SCOR, the GAO also noted that the FDIC was not regularly evaluating its offsite monitoring systems for reliability and underscored the need for the Corporation to perform more regular reviews. The GAO recommended that

. . . to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.

In response to the GAO report, the Corporation stated:

We agree that it would be beneficial to review our risk assessment activities to ensure they are comprehensive, appropriate to our mission, and fully evaluated. As noted in the GAO draft report, a review of FDIC offsite monitoring systems has been completed, and work continues to implement needed changes.

Beginning in January 2007, an interdivisional committee will perform an in-depth review of current risk assessment activities and evaluation procedures. By September 30, 2007, the committee will make recommendations to FDIC executive management as to how we might strengthen the risk assessment framework. At that time, management will establish a reasonable timeline to implement any required changes.

RESULTS OF AUDIT

DSC has established an internal control process for performing offsite monitoring of insured financial institutions identified on the ORL. The internal control process includes: (1) scheduling and performing offsite reviews for each institution on the ORL; (2) documenting the analyses performed as part of each review, including a supervisory strategy; and (3) requiring a supervisory approval of the reviews performed. We sampled 60 of the 577 institutions on the December 31, 2007 ORL and found that DSC had completed offsite reviews for each sampled institution and documented the reviews in accordance with DSC policies and procedures, including specifying a supervisory strategy. Further, there was evidence of supervisory review for each of the offsite reviews in our sample (Controls for Performing Offsite Reviews).

Although DSC has developed an extensive offsite review program using a variety of sources – including the LIDI program, the QLA, and Regional Watch Lists – to monitor financial institution condition, the ORL was not capturing a significant percentage of institutions that DSC, through its risk management examinations, downgraded to a 3 rating or worse, as illustrated below.

  • Of the 20 institutions that failed from January 2001 through July 2008 and were rated 1 or 2 in an examination during that period, 13 (65 percent) did not appear on an ORL in the 4 quarters prior to the quarter in which each institution was downgraded to a 3, 4, or 5 rating (see Appendix 3).
  • Of the 223 institutions that were downgraded by 2 or more ratings from 2002 through 2007, 151 institutions (68 percent) did not appear on an ORL in the 4 quarters prior to the downgrades.
  • From 1998 through 2007, SCOR did not flag 2,011 (88 percent) of 2,281 institutions that were eventually downgraded to a 3 rating or worse (referred to by DSC as a Type I Error – the percentage of downgraded institutions that SCOR did not identify as problem institutions).
  • From 1998 through 2007, SCOR flagged 832 institutions of which 562 (68 percent) were not downgraded (referred to by DSC as a Type II Error – the percentage of institutions that were identified by SCOR but were not downgraded in a subsequent examination).

The assumptions and methodologies in SCOR have not been updated since 2003. Further, the offsite monitoring systems used to create the ORL are largely based on historical indicators pertaining to institution asset quality, earnings, and capital that may not fully consider current and emerging risks. As a result, the ORL may not be capturing a complete picture of the risks facing 1- and 2-rated institutions or identifying those institutions at risk of significant ratings downgrades.

Additionally, DSC has initiated a process for periodically evaluating the three models-based systems that determine the ORL in response to a February 2007 GAO recommendation to evaluate and monitor these systems. DSC plans to evaluate, on a rotational basis, all of its offsite monitoring systems. However, at the time we completed our audit fieldwork, no details regarding a schedule or procedures for conducting evaluations were available, and no system evaluations had been performed.











Validation of the assumptions and methodology used in SCOR is needed on a priority basis to determine if the performance of the system could be enhanced. In addition, thorough evaluations of the three DSC offsite monitoring systems that create the ORL are needed on a regular basis to determine if the assumptions and methodologies used reasonably support determinations for including institutions on the ORL. Using actual failure and downgrade information to test offsite monitoring systems and incorporating the results into evaluations of those systems could lead to a more focused ORL and a more effective and efficient offsite monitoring program. Scheduling all offsite monitoring systems for regular evaluations and establishing procedures to conduct the evaluations would help to assure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the FDIC’s Deposit Insurance Fund are being mitigated (Effectiveness of the ORL).

CONTROLS FOR PERFORMING OFFSITE REVIEWS

DSC has established an internal control process for performing offsite reviews of insured institutions appearing on the ORL. Such controls include making an assessment of risk, identifying a supervisory strategy, documenting analyses performed, and requiring supervisory approval of the reviews. We sampled 60 of the 577 institutions on the December 31, 2007 ORL. For all 60 institutions, we found that DSC Case Managers or Field Supervisors had complied with guidance in the Procedures Manual. Specifically, a reviewer completed an offsite review for each institution, identified a risk level and trend, identified a supervisory strategy, and documented the review in ViSION. We also found that each review had been approved by an Assistant Regional Director (ARD), as required by the policy. Further, for those institutions not regulated by the FDIC, we found that Case Managers or Field Supervisors contacted the appropriate regulators to discuss supervisory strategies for each of the sampled institutions whose overall risk level was expected to change sufficiently over the next 12-month period. The results of our sample are shown in Table 2.

Table 2: Sample of Institutions from the December 31, 2007 ORL
Federal Regulator | Sampled Institutions | Risks Identified | Supervisory Strategy Noted | Contacts with the Regulator
FDIC | 41 | Yes | Yes | 3
Office of the Comptroller of the Currency (OCC) | 8 | Yes | Yes | 5
Office of Thrift Supervision (OTS) | 8 | Yes | Yes | 5
Board of Governors of the Federal Reserve System (FRB) | 3 | Yes | Yes | 2
Source: OIG analysis and information in the ViSION system.










Because we noted no matters warranting additional management action, we made no recommendation in this area related to the audit objective.

EFFECTIVENESS OF THE ORL

The ORL has not been as effective in capturing problem institutions as it should be. Although DSC has developed an extensive offsite monitoring program that utilizes a multitude of other systems and ad hoc reports to address emerging risks, DSC has not regularly validated the underlying assumptions and methodologies for the models-based component of the system or conducted regular evaluations of the models used to create the ORL. This led the GAO to recommend that the FDIC strengthen the oversight of its risk assessment activities and systematically evaluate these activities. The assumptions and methodologies in SCOR have not been updated since 2003. Further, the offsite monitoring systems used to create the ORL are largely based on historical financial information, provided by the financial institution, that may not be accurate and may not fully consider current and emerging risks. As a result, the FDIC’s offsite monitoring systems may not be capturing a complete picture of the current and emerging risks facing 1- and 2-rated institutions or identifying those institutions at risk of significant ratings downgrades.

Guidance Related to Ensuring the Effectiveness of the ORL

FDIC Circular 4010.3, FDIC Enterprise Risk Management Program, adopted internal control standards prescribed in GAO publication, Standards for Internal Control in the Federal Government. The GAO standards apply to all operations (programmatic, financial, and compliance) and are intended to ensure the effectiveness and efficiency of operation, reliability of financial reporting, and compliance with applicable laws and regulations. Circular 4010.3 requires management to develop and implement controls to ensure that management directives are carried out and to provide reasonable assurance that controls are sufficient to minimize exposure to waste, fraud, and mismanagement. The circular also requires management to perform monitoring activities to assess the quality of performance over time and the effectiveness of controls. Key control activities described in the circular, as they relate to offsite monitoring, include routine management and supervisory actions; transaction comparisons and reconciliations; other actions taken in the course of normal operations; as well as separate and discrete control evaluations, including internal self-assessments and external reviews.

Accuracy of the ORL in Identifying Problem Institutions

To assess the effectiveness of the ORL in identifying institutions at risk of being downgraded, we analyzed ORL and CAMELS ratings information available on failed institutions from January 2001 through July 31, 2008 and on institutions that had been downgraded by two or more ratings from 2002 through 2007. The results of our analysis are discussed below.

Failed Institutions That Were Not Identified on ORLs. Of the 20 institutions that failed from January 2001 through July 2008 and were rated 1 or 2 in an examination during that period, 13 institutions (65 percent) did not appear on an ORL in the 4 quarters prior to the quarter they were downgraded to a 3, 4, or 5 rating (see details in Appendix 3).5 In its response to these statistics, DSC stated:

. . . 7 of the 13 institutions that failed and did not appear on the ORL had fraud as a contributing factor for failure. The statistical models used to generate the ORL rely on Call and Thrift Financial Report data and cannot detect deteriorating financial conditions when banks or thrifts misstate their financial information. In addition, another institution was extensively monitored through the LIDI program and also did not appear on the ORL for the period from 9/30/00 to 12/31/07. After excluding these eight institutions, the ORL identified 7 of the 12 failures (58 percent) in the 4 quarters prior to the quarter the institutions were downgraded.

Downgraded Institutions That Were Not Identified on ORLs. We obtained a list from FDIC officials of 223 institutions rated 1 or 2 that had been downgraded by 2 or more ratings from 2002 through 2007. When we compared these institutions to the ORLs created during that time period, we found that 151 institutions (68 percent) did not appear on an ORL in any of the 4 quarters prior to the multiple downgrade despite, in many cases, significant financial deterioration. Our results indicate that actual failure and downgrade information would be useful in testing the efficacy of the assumptions and methodologies used in the offsite monitoring systems.
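The comparison described above can be sketched as follows. The data structures are illustrative assumptions for this example; the OIG performed the actual analysis using ORL and examination ratings information from the ViSION system.

def prior_quarters(quarter, count=4):
    """Return the `count` (year, quarter) periods immediately preceding `quarter`."""
    year, q = quarter
    periods = []
    for _ in range(count):
        q -= 1
        if q == 0:
            q, year = 4, year - 1
        periods.append((year, q))
    return periods

def not_on_orl_before_downgrade(downgrades, orl_by_quarter):
    """
    downgrades: dict mapping an institution's certificate number to the
                (year, quarter) in which it was downgraded.
    orl_by_quarter: dict mapping (year, quarter) to the set of certificate
                numbers on that quarter's ORL.
    Returns the certificate numbers that never appeared on an ORL in the
    four quarters before the downgrade quarter.
    """
    missed = set()
    for cert, downgrade_quarter in downgrades.items():
        on_orl = any(cert in orl_by_quarter.get(period, set())
                     for period in prior_quarters(downgrade_quarter))
        if not on_orl:
            missed.add(cert)
    return missed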

Division of Insurance and Research Analysis of SCOR’s Performance

DSC provided us with a memorandum, dated February 4, 2008, summarizing a Division of Insurance and Research (DIR) analysis of SCOR’s performance from 1985 to the first quarter of 2007. The memorandum states that SCOR’s performance was evaluated in terms of whether the model was able to correctly identify banks that were subsequently downgraded. DIR’s analysis showed that from 1998 through 2007, SCOR did not flag 2,011 (88 percent) of 2,281 institutions that were eventually downgraded to a 3 rating or worse and that SCOR flagged 832 institutions, of which 562 (68 percent) were not downgraded in a subsequent examination. According to the memorandum, during periods of economic expansion and growth:




5 Since 2001, the FDIC has resolved 32 institution failures with estimated costs to the Deposit Insurance Fund (DIF) that could exceed $10 billion. We reviewed ViSION information on the failed institutions for the 4 quarters before they were downgraded to a 3 rating or worse to determine whether the institutions had appeared on an ORL. We were not able to assess 12 of the 32 institutions because those institutions were 3 rated or worse prior to 2000, and ORL information was not available in ViSION prior to 2000.











. . . institutions are more likely to be downgraded for non-financial reasons (such as incompetent management, fraud, etc.). In such periods, a financial conditions model such as SCOR, which identifies institutions based purely on the financial ratios, is more likely to miss the downgrades.

DIR’s analysis points to potential issues related to the accuracy of the SCOR model. DSC stated that it recognized the limitations of offsite monitoring models and does not rely solely on these models to evaluate the potential risks posed by financial institutions.

In 2005, an FDIC interdivisional team completed a project to review and evaluate the FDIC’s offsite monitoring systems and tools for their effectiveness and efficiency and to identify opportunities to improve the offsite monitoring process. The Offsite Monitoring Project’s Summary Report and Recommendations stated that, as of June 2005, 283 (60 percent) of 472 institutions with a composite rating of 3 or worse were not identified on a single ORL between June 2001 and March 2005. This interdivisional effort recommended that the selection criteria for the ORL incorporate trend analysis, other statistical measures, and other factors (such as output from other models combined with contributing or mitigating risk factors) to increase the overall predictive capability of the list. The project also recommended that a core set of profiles be established for institutions that migrate to a composite 3 rating and that these profiles be statistically compared to the output from other models. Additionally, the project report included recommendations to improve the performance of SCOR, SCOR-Lag, GMS, and other FDIC offsite monitoring tools. However, we found no evidence that the recommendations had been implemented.

In 2003, the FDIC published an article about SCOR, noting that its performance had declined, particularly in good economic periods. The article stated:

The low level of accuracy might be expected inasmuch as SCOR relies completely on financial ratios. Any such model will probably be more accurate when the reasons for downgrades are financial, and less accurate when the reasons have to do with some aspect of bank operations that does not affect the bank’s financial ratios. For example, examiners may downgrade a bank because they discover that it has significantly weakened its underwriting standards or has weak internal controls—but as long as the more risky loans have not become past due, problems might not have made their way to the financial statements. Consequently, one might reasonably expect that SCOR would be less accurate over the last decade. The reliance on financial data has several other effects on SCOR’s performance. For one thing, it means that SCOR is completely dependent on the accurate reporting of financial information.

DSC officials we interviewed reiterated that SCOR relies on the financial information reported by institutions in Call Reports and TFRs. To the extent this information is inaccurate, the ORL may be impacted. According to DSC officials, DSC has begun looking at this issue by assessing whether institutions that were downgraded and not detected by SCOR eventually filed amended Call Reports. That review was still ongoing at the time of issuance of this report.











DSC also indicated that in 2008, it implemented ICARuS as an offsite risk monitoring tool to identify institutions, between examinations, that may warrant increased supervisory attention because of an increased susceptibility to fraud. ICARuS was developed as a solution for identifying certain management-related characteristics, financial indices, and trends that were common among the failures and near failures not identified by SCOR.

Implementation of a GAO Recommendation Related to Evaluating Offsite Monitoring Systems

GAO’s February 2007 report noted that SCOR is informative but does not always produce accurate results. GAO further noted that such a finding and the FDIC’s limited evaluation of its other offsite monitoring systems underscore the need for more regular reviews. GAO recommended that, “to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.”

In response to the GAO report, the Corporation stated that an interdivisional committee would perform an in-depth review of its risk assessment activities and evaluation procedures. As part of that review, the FDIC would seek to improve its offsite monitoring systems. The plan for this effort included the development of an Offsite Models and Systems Validation Program. As part of the validation program, the FDIC planned to conduct two types of reviews—a logical and conceptual soundness review of the offsite systems, performed over a 4-year period, and a backtesting (outcome analysis) review to be performed annually as follows:

  • 2008—SCOR and SCOR-Lag
  • 2009—GMS and Consistent Grower
  • 2010—Real Estate Stress Test
  • 2011—MultiFlag, Young Institutions, and Quarterly Lending Alert

The backtesting reviews would include (1) analyses of trends in the number of institutions flagged, (2) analyses of trends in the underlying model or system ratios, and (3) a comparison of the model or system predictions to actual results.
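The outcome-analysis (backtesting) step can be sketched as a comparison of the set of institutions a model flagged with the set of institutions actually downgraded, expressed as the Type I and Type II error rates defined in this report. The inputs below are illustrative assumptions; the DIR figures cited in this report are shown only to check the arithmetic.

def backtest_error_rates(flagged, downgraded):
    """
    Type I error rate:  share of downgraded institutions the model did not flag.
    Type II error rate: share of flagged institutions that were not downgraded.
    Both arguments are sets of institution certificate numbers.
    """
    type_1 = len(downgraded - flagged) / len(downgraded) if downgraded else 0.0
    type_2 = len(flagged - downgraded) / len(flagged) if flagged else 0.0
    return type_1, type_2

# Arithmetic check against the DIR figures cited in this report: 2,011 of 2,281
# downgraded institutions were not flagged (about 88 percent, Type I), and 562 of
# 832 flagged institutions were not downgraded (about 68 percent, Type II).
print(round(2011 / 2281, 2), round(562 / 832, 2))  # 0.88 0.68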

In addition, DSC completed a study of the impact of fraud on financial institution failures and near failures over an 8-year period. The study identified certain management-related characteristics, financial indices, and trends that were common among the failures and near failures. One of the study’s primary recommendations was that DSC should develop an offsite monitoring system that identifies the presence of these characteristics and measures an institution’s susceptibility to fraud. As a result, in May 2008, DSC implemented ICARuS, which identifies institutions that may warrant increased supervisory attention between examinations because of an increased susceptibility to fraud. According to DSC officials, ICARuS results, used in conjunction with SCOR data, improve the overall offsite monitoring results.

Conclusion

Validation of the assumptions and methodology used in SCOR is needed on a priority basis to determine if the performance of the system could be enhanced. In addition, thorough evaluations of all DSC offsite monitoring systems that create the ORL are needed on a regular basis to determine whether the assumptions and methodologies used reasonably support determinations for including institutions on the ORL. Using actual failure and downgrade information to test offsite monitoring systems and incorporating the results into evaluations of those systems could lead to a more focused ORL and a more effective and efficient offsite monitoring program. Scheduling all models-based offsite monitoring systems for regular evaluations and establishing procedures to conduct the evaluations would help to assure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the DIF are being mitigated.

Recommendations for Enhancing the Effectiveness of the ORL

We recommend that the Director, DSC:
  1. Validate the assumptions and methodology used in SCOR.
  2. Ensure that the regular evaluations of all offsite monitoring systems used to create the ORL are performed as scheduled.
  3. Establish procedures to evaluate all models-based offsite monitoring systems. As part of these procedures, assess recent failure and downgrade information to test the efficacy of the logic and assumptions used in the offsite monitoring systems.

CORPORATION COMMENTS AND OIG EVALUATION

On February 10, 2009, the Director, DSC, provided a written response to the draft of this report. Management’s response is presented in its entirety in Appendix 4. Management concurred with our recommendations and stated it had completed the recommended actions.

Regarding validation, DSC stated that DSC and DIR will validate the assumptions and methodology used in SCOR by reviewing the offsite models that comprise SCOR on an annual rotational basis. With respect to evaluations of all offsite monitoring systems, DSC stated that DSC and DIR approved a regular validation schedule and has completed the first of the scheduled annual reviews. Further, DSC stated that it has established procedures to evaluate all models-based offsite monitoring systems. The procedures include: analyzing trends, performing statistical analyses of model logic and assumptions, and providing a summary of related research findings pertaining to the financial performance and CAMELS rating trends. The first of these evaluations was completed in December 2008, according to DSC.

A summary of management’s response to each recommendation is in Appendix 5. DSC’s planned actions are responsive to our recommendations and the recommendations are resolved. Management indicated that it had completed the recommended actions; we will close the recommendations after reviewing those actions.



































APPENDIX 1

OBJECTIVE, SCOPE, AND METHODOLOGY

Objective

The audit objective was to assess the FDIC’s internal controls for performing offsite monitoring of insured financial institutions. We focused the audit on the DSC controls pertaining to offsite reviews of institutions on the FDIC’s ORL, which identifies insured 1- and 2-rated institutions with potential problems. As part of our audit, we also reviewed DSC’s implementation of a recommendation by GAO, pertaining to strengthening the FDIC’s risk assessment activities through periodic evaluations and monitoring, including offsite monitoring. We conducted this performance audit from April 2008 through July 2008 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.

Scope and Methodology

The FDIC insures deposits at approximately 8,500 institutions and supervises about 5,200 institutions as shown in Table 3 below.

Table 3: Number of Insured and Supervised Institutions, by Regulator
Supervisor | Number of Institutions | Total Assets* | Domestic Deposits* | Estimated Insured Deposits*
FDIC | 5,197 | $2,180,697 | $1,582,951 | $1,091,571
OCC | 1,632 | 7,782,387 | 3,590,744 | 1,995,866
FRB | 878 | 1,519,012 | 845,494 | 502,812
OTS | 826 | 1,556,670 | 892,592 | 696,835
Totals | 8,533 | $13,038,765 | $6,911,780 | $4,287,084
Source: The FDIC Quarterly Banking Profile, dated December 31, 2007.
* Dollars in millions.

During this audit, we selected a judgmental sample6 of 60 of 577 institutions from the December 31, 2007 ORL to determine whether DSC had completed offsite reviews for each of the sampled institutions and developed supervisory strategies in accordance with DSC policies and procedures. We also evaluated the FDIC’s implementation of GAO’s recommendation to strengthen its risk assessment activities, as it relates to offsite monitoring.

For the sampled institutions, we:

    • Reviewed the ORL data.



6 The results of a non-statistical sample cannot be projected to the intended population by standard statistical methods.







    • Ensured that offsite reviews were completed in accordance with the Procedures Manual instructions for offsite reviews.
Additionally, we:
    • Reviewed memoranda and other documents related to the planning and performance of the offsite monitoring program.
    • Discussed other agencies’ offsite monitoring programs with DSC.
    • Compared institutions that either failed or were downgraded by two or more ratings to those institutions listed on the ORLs during the same periods.
Internal Control

The audit focused on the controls related to offsite monitoring and reviews of institutions on the ORL. These controls included policies and procedures contained in the Procedures Manual, which describes steps the FDIC should take in performing offsite reviews.

Reliance on Computer-processed Information

For purposes of the audit, we used computer-processed information provided in the ViSION system to support our significant findings, conclusions, and recommendations. To assess the reliability of this information, we tested the process for a sample of 60 institutions. The testing of computer-processed information was limited to our comparison of specific data elements, such as SCOR ratings, risk flags, asset size, level of risk, risk trend, and relevant follow-up codes7 for the 60 sampled financial institutions on the ORL. Additionally, we used the ViSION system to review examiner and supervisory comments from the other regulators and determine the timeliness of reviews by the Case Managers and Field Supervisors and of ARD approvals of the supervisory strategies.

Performance Measurement

We reviewed FDIC performance plans and strategic plans to determine whether the Corporation has established quantifiable performance measures related to its efforts to identify risk in institutions through the ORL. The Corporation has not established performance measures related to the ORL.




7 The follow-up codes are None, Continued Monitoring, or Onsite Activity.





Compliance With Laws and Regulations

We determined that there were no applicable laws and regulations directly related to offsite monitoring. In addition, we assessed the risk of fraud and abuse related to the audit objective in the course of evaluating audit evidence.

Prior Coverage

In September 2002, we issued Audit Report No. 02-033 entitled, Statistical CAMELS Offsite Rating Review Program for FDIC-Supervised Banks. The audit objectives were to determine the effectiveness of SCOR as an early warning system and assess actions taken by DSC. The audit concluded that the effectiveness of the SCOR review program in detecting potential deterioration in the financial condition of insured depository institutions was limited because (1) a time lag of up to 4½ months existed between the date of the Call Report and the subsequent offsite review; (2) SCOR depends on the accuracy and integrity of Call Report information to serve as an early warning between examinations; (3) SCOR does not assess institution management quality and internal control or capture risks from non-financial factors such as market conditions, fraud, or insider abuse; and (4) DSC Case Managers rarely initiate follow-up action to address probable downgrades identified by SCOR other than deferring to a past, present, or future examination.

In February 2007, GAO issued a report entitled, Federal Deposit Insurance Corporation: Human Capital and Risk Assessment Programs Appear Sound, but Evaluations of Their Effectiveness Should Be Improved, Report No. GAO-07-255. GAO noted that SCOR is informative but does not always produce accurate results. GAO further noted that such a finding and the FDIC’s limited evaluation of its other offsite monitoring systems underscore the need for more regular reviews. GAO recommended that, to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.































APPENDIX 2

THE FDIC’S OFFSITE MONITORING TOOLS



According to DSC, in addition to the ORL, the FDIC has established several other offsite monitoring tools that are used to monitor risks within the industry and to identify potential emerging issues that may require additional supervisory follow-up. The FDIC provided brief descriptions of these tools, as follows:
  • Large Insured Depository Institution Program – Quarterly offsite review program for institutions rated 3, 4, or 5 and with more than $3 billion in assets and all institutions with more than $10 billion in assets, including non-FDIC regulated institutions.
  • Internal Control Assessment Rating System and RAC Dashboard – In early 2006, an interdivisional executive sponsor team using resources from DSC and DIR was created to oversee efforts to enhance the FDIC’s offsite monitoring programs. The team developed ICARuS, an offsite monitoring system designed to identify institutions that may be more susceptible to fraud, and the RAC Dashboard. The RAC Dashboard is an information management tool to organize and synthesize key data about external risks to the banking industry. The components of the Dashboard reflect six major sources of risk: credit, economic, financial, large bank, market, and supervisory. The Dashboard consists of a set of risk indices that reflect underlying conditions in each of the risk areas. The index calculations rely on the quantitative as well as the subjective analysis of subject matter experts. Combining all relevant information into this simple risk index measure allows for consistent and meaningful interpretation both within and across business units. The Dashboard is an interdivisional project of the RAC; it is not intended to supplant the expertise of individual FDIC business units, but rather is meant to engage different divisions in a joint interpretation of risk data for the FDIC’s National Risk Committee (NRC)8 and other organizations within the Corporation.
  • Regional Watch Lists – Include institutions that are tracked and reported on by the regions in the Automated Regional Information System. The regional offices forward the watch lists to the Washington Office in the form of monthly status reports and include institutions listed on the Regional Bank Secrecy Act watch lists and Regional QLAs, which include institutions with significant exposures to subprime and nontraditional mortgage products.



8 The NRC identifies and evaluates the most significant external business risks facing the FDIC and the banking industry. Members of the committee include the Chief Operating Officer; Chief Financial Officer; Deputy to the Chairman; Special Advisor to the Chairman; Directors for DSC, DIR, and the Division of Resolutions and Receiverships; and the General Counsel.





  • Regional Offsite Monitoring and Supervisory Strategies – Various strategies are continually developed by the regional offices for areas of concern germane to each area of the country, including areas experiencing economic downturns such as Puerto Rico; states affected by deterioration in the auto industry; areas with heightened commercial real estate concentrations; etc.
  • Regional Risk Committees – Meet regularly to discuss current and emerging risks and to rank them in order of priority. These risks are monitored on an interdivisional basis, and material concerns are conveyed to the FDIC’s NRC.
  • Quarterly Supervisory Risk Profile – Highlights DSC’s monitoring of the current conditions of banks, including emerging risks (and how they are prioritized), and the division’s supervisory response.


































APPENDIX 3

OIG ANALYSIS OF WHETHER FAILED INSTITUTIONS WERE ON
THE ORL PRIOR TO FAILURE



Institution | Date of Failure | Latest Date Institution Was Rated 1 or 2 | On the ORL in the 4 Quarters Prior to Downgrade | No. of Quarters on the ORL | Estimated Loss to the DIF as of July 31, 2008
First Heritage Bank, N.A. (a) | 7/25/2008 | 6/26/2008 | No | None | $819,843,000
First National Bank of Nevada | 7/25/2008 | 10/15/2007 | No | None | $41,773,000
IndyMac Bank, F.S.B. (b) | 7/11/2008 | 1/25/2008 | No | None | $8.9 Billion
First Integrity Bank, N.A. | 5/30/2008 | 7/1/2003 | Yes | 2 | $2,346,000
ANB Financial, N.A. | 5/9/2008 | 2/23/2007 | Yes | 4 | $214,000,000
Hume Bank | 3/7/2008 | 7/16/2007 | No | None | $2,618,000
Douglass National Bank | 1/25/2008 | 11/22/2005 | No | None | $6,000,000
Miami Valley Bank | 10/4/2007 | 4/2/2007 | No | None | $18,700,000
NetBank | 9/28/2007 | 1/9/2006 | Yes | 2 | $150,000,000
Metropolitan Savings Bank | 2/2/2007 | 1/22/2007 | No | None | $8,905,989
Bank of Ephraim | 6/25/2004 | 4/7/2003 | Yes | 2 | $2,998,017
Guaranty National Bank of Tallahassee | 3/12/2004 | 7/23/2001 | No | None | $0
Dollar Savings Bank | 2/14/2004 | 2/14/2004 | No | None | $0
Pulaski Savings Bank | 11/14/2003 | 11/14/2003 | No | None | $679,452
The First National Bank of Blanchardville | 5/9/2003 | 5/9/2003 | Yes | 1 | $12,776,628
The Farmers Bank & Trust of Cheneyville | 12/17/2002 | 10/15/2002 | No | None | $12,204,810
AmTrade International Bank of Georgia | 9/30/2002 | 5/13/2002 | Yes | 1 | $1,325,766
Universal F.S.B. | 6/27/2002 | 6/27/2002 | No | None | $274,452
NextBank, N.A. | 2/7/2002 | 8/20/2001 | Yes | 4 | $114,700,478
Oakwood Deposit Bank | 2/1/2002 | 2/1/2002 | No | None | $63,802,661
Source: The ViSION system and FDIC press releases.
(a) N.A. – National Association.
(b) F.S.B. – Federal Savings Bank.










APPENDIX 4

CORPORATION COMMENTS


February 10, 2009
TO: Russell A. Rau
Assistant Inspector General for Audits

FROM: Sandra L. Thompson, Director [Electronically produced version; original signed by Sandra L. Thompson]
Division of Supervision and Consumer Protection

SUBJECT: Response to the audit report entitled:
FDIC's Controls Related to the Offsite Review List (2008-021)
 

The Division of Supervision and Consumer Protection (DSC) has received and considered the Draft Report entitled FDIC’s Controls Related to the Offsite Review List prepared by the FDIC’s Office of Inspector General (OIG). The audit’s objective was to assess internal controls for performing offsite monitoring of insured financial institutions with a focus on controls related to reviews of institutions on the Offsite Review List (ORL). We appreciate your finding that DSC had completed offsite reviews for each institution included in the audit sample of institutions on the December 31, 2007 Offsite Review List (ORL) and had developed supervisory strategies and documented the reviews in accordance with the Division’s policies and procedures.

The Draft Report also contains three recommendations to help ensure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the FDIC’s deposit insurance fund are being mitigated. We concur with these recommendations and have completed the recommended actions, as detailed in Appendix I. Although we agree with the recommendations, DSC believes it is important to emphasize three points regarding OIG’s comments throughout the report regarding the accuracy of the ORL as a predictive model.

First, the ORL is only one of a number of offsite monitoring tools or systems DSC uses to identify potential deterioration in insured institutions between onsite examinations. The extent to which these other tools and systems were used and accurately identified potential downgrades would not be captured by reviewing the performance of the ORL in isolation. For example, the largest institution included in Appendix 3 of the OIG’s Report (institutions that failed from January 2001 through July 2008 and not on the ORL in the four quarters before the downgrade) was subject to offsite supervisory review by the Quarterly Lending Alert and Large Insured Depository Institution Risk Monitoring Program for the entire OIG review period.




1 The ORL identifies institutions rated “1” or “2” that have been identified by (1) the Statistical CAMELS Offsite Rating (SCOR) System or SCOR-Lag as having at least a 35 percent probability of being downgraded to a “3” rating or worse at the next examination or (2) flagged by the Growth Monitoring System as being in at least the 98th growth percentile. Every institution on the ORL must have an offsite review completed and approved within three and one-half months following the end of each quarter.



 











Second, given the ORL draws from historical financial information, its accuracy in identifying institutions that are ultimately downgraded is likely to be lower when the reasons for downgrades are not fully reflected in reported financial information. During periods of economic expansion, such as the period reviewed in the OIG audit from 1998 to 2007, institutions were more likely to have been downgraded for non-financial reasons, such as identified risk management weaknesses and fraud. The accuracy of these same models improves during periods of economic weakness, when deterioration is more likely to be reflected in reported financial statements, as depicted in Appendix II.

Third, the accuracy of the ORL, or any statistically based model, tends to decline when the variable being evaluated (in this case, ratings downgrades) is estimated using a relatively small number of downgrade observations. During the period reviewed by the OIG, approximately 29,000 examinations occurred. Of these examinations, less than 7 percent were downgraded to a CAMELS “3” or worse. With so few observations (and even fewer observations when downgrades attributed to non-financial factors are excluded), the statistical relationships between the financial ratio inputs to the model and instances of ratings downgrades become weaker (i.e., the explanatory power of the model declines).

In closing, thank you for the opportunity to review and comment on the Draft Report.


























APPENDIX I
  1. The OIG recommended that DSC validate the assumptions and methodology used in Statistical CAMELS Offsite Rating (SCOR).

    DSC and DIR will validate the assumptions and methodology used in SCOR by reviewing the offsite models that comprise SCOR on an annual rotational basis. DSC and DIR have developed the validation process for the SCOR model assumptions and methodology by doing the following:

    1. Evaluating the logic and conceptual underpinnings of the models;
    2. Performing ongoing monitoring, including benchmarking the offsite models against the performance of other models where possible; and
    3. Conducting outcome analysis, or backtesting, which involves comparing predicted model outcomes with actual results. For this latter validation element, the accuracy of statistical models or expert systems is and will be analyzed in terms of Type I and Type II errors.1

    In addition to these elements, consideration is and will be given to financial and non-financial factors not captured by the models that could help explain any systematic underperformance indicated by Type I and Type II errors. DSC and DIR completed the SCOR validation in December 2008 and consider this action closed.
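    To make the outcome-analysis (backtesting) element concrete, the following is a minimal sketch, assuming a simple pairing of offsite-model flags with subsequent examination results; the data layout and variable names are illustrative and are not drawn from DSC or DIR systems. It counts Type I and Type II errors as defined in footnote 1.

        # Illustrative backtesting sketch (hypothetical data; not DSC/DIR code).
        # Each record pairs an offsite-model flag for one quarter with the result
        # of the institution's next examination.
        records = [
            # (flagged_by_model, downgraded_to_3_or_worse_at_next_exam)
            (True, True),
            (True, False),
            (False, True),
            (False, False),
        ]

        missed_downgrades = sum(1 for f, d in records if not f and d)  # Type I errors
        false_flags       = sum(1 for f, d in records if f and not d)  # Type II errors
        total_downgrades  = sum(1 for _, d in records if d)
        total_flags       = sum(1 for f, _ in records if f)

        # Error rates: Type I = share of actual downgrades the model missed;
        # Type II = share of flags not followed by a downgrade.
        type_1_error_rate = missed_downgrades / total_downgrades if total_downgrades else 0.0
        type_2_error_rate = false_flags / total_flags if total_flags else 0.0

        print(f"Type I error rate:  {type_1_error_rate:.0%}")
        print(f"Type II error rate: {type_2_error_rate:.0%}")

    In this toy example each rate is 50 percent; in practice the rates would be computed from historical flags and examination outcomes for each Call Report date.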

  2. The OIG recommended that DSC ensure that regular evaluations of all offsite monitoring systems used to create the ORL are performed as scheduled.

    A regular validation schedule was approved by DSC and DIR management in response to a recommendation by the U.S. Government Accountability Office (GAO) (Audit 07-255, Human Capital and Risk Assessment Programs Appear Sound) pertaining to strengthening the FDIC’s risk assessment activities, including offsite monitoring. DSC and DIR have completed the first of the scheduled annual reviews and consider this action closed.

  3. The OIG recommended that DSC establish procedures to evaluate all model-based monitoring systems. As part of these procedures, the OIG recommended that DSC assess recent failure and downgrade information to test the efficacy of the logic and assumptions used in the offsite monitoring systems.

    DSC has established procedures in response to GAO Audit 07-255 to evaluate all model-based offsite monitoring systems. Evaluation work pertaining to the models is similar to work performed at model inception and includes these procedures:


    • DSC analyzing ratings trends (including failures and downgrades) and relating those trends to industry conditions;
    • DIR performing statistical analyses of model logic and assumptions and providing evidence of how the models’ accuracy rates vary over the business cycle; and
    • DIR providing a summary of related research findings pertaining to the financial performance and CAMELS rating trends of certain “2”-rated institutions with identified weaknesses.

    As a result of the implementation of the procedures discussed above, DSC considers this action closed.


1 A Type I error occurs when an institution is downgraded from a CAMELS rating of “1” or “2” to a “3” or worse rating without having been identified (flagged) by the model or system. A Type II error occurs when an institution is identified (flagged) by the model or system but is not subsequently downgraded to a CAMELS rating of “3” or worse.

APPENDIX II

The ORL identifies institutions rated “1” or “2” that have been (1) identified by the Statistical CAMELS Offsite Rating (SCOR) System or SCOR-Lag as having at least a 35 percent probability of being downgraded to a “3” rating or worse at the next examination or (2) flagged by the Growth Monitoring System as being in at least the 98th growth percentile.
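As a rough sketch of this flagging rule (the function and its inputs are illustrative assumptions, not the FDIC’s systems):

    # Illustrative sketch of the ORL flagging rule described above.
    # Input names and data sources are hypothetical.
    def on_offsite_review_list(composite_rating,
                               scor_downgrade_prob,
                               scor_lag_downgrade_prob,
                               gms_growth_percentile):
        """Return True if a 1- or 2-rated institution meets the ORL criteria."""
        if composite_rating not in (1, 2):
            return False  # the ORL covers only 1- and 2-rated institutions
        downgrade_flag = max(scor_downgrade_prob, scor_lag_downgrade_prob) >= 0.35
        growth_flag = gms_growth_percentile >= 98
        return downgrade_flag or growth_flag

    # Example: a 2-rated institution with a 40 percent SCOR downgrade probability is flagged.
    print(on_offsite_review_list(2, 0.40, 0.20, 75))  # True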

Figure 1 shows the yearly average Type I and Type II accuracy achieved under the SCOR model for all insured banks. These accuracy measures evaluate the predictive precision of forecasting models: the Type I accuracy measure indicates the percentage of all downgraded banks that the SCOR model flagged prior to downgrade, and the Type II accuracy measure indicates the percentage of banks flagged by the model that were subsequently downgraded. SCOR’s accuracy is substantially higher in the late 1980s and early 1990s and lower from the mid-1990s through 2006. SCOR’s performance depends on the business cycle; as evidenced in recent periods, its predictive accuracy is significantly higher during periods of financial deterioration. As stated previously, SCOR’s Type II accuracy as of 2008 has risen to its highest level since the model’s inception in 1986. The improvement in SCOR accuracy occurs around the second quarter of 2007, one quarter after the beginning of the sharp decline in U.S. housing prices and the same period in which two large Bear Stearns hedge funds collapsed.
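Expressed as formulas (a restatement of the definitions above, not notation used in the report):

$$\text{Type I accuracy} = \frac{\#\{\text{downgraded banks flagged by SCOR before the downgrade}\}}{\#\{\text{downgraded banks}\}}, \qquad \text{Type II accuracy} = \frac{\#\{\text{flagged banks subsequently downgraded}\}}{\#\{\text{flagged banks}\}}.$$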

Figure 1: SCOR Forecasting Accuracy by Call Report Date (6-month horizon)





APPENDIX 5

MANAGEMENT RESPONSE TO RECOMMENDATIONS

This table presents management’s responses to the recommendations in our report and the status of each recommendation as of the date of report issuance.

Rec. No. | Corrective Action: Taken or Planned (a) | Monetary Benefits | Resolved (b): Yes or No | Open or Closed (c)
1 | DSC and DIR will validate the assumptions and methodology used in SCOR by reviewing the offsite models that comprise SCOR on an annual rotational basis. DSC stated that the SCOR validation was completed in December 2008. | $0 | Yes | Open
2 | DSC and DIR have (1) approved a regular validation schedule and (2) completed the first of the scheduled annual reviews. | $0 | Yes | Open
3 | DSC has established the following procedures to evaluate all model-based offsite monitoring systems: analyzing ratings trends, performing statistical analyses of model logic and assumptions, and providing a summary of related research findings pertaining to the financial performance and CAMELS rating trends. | $0 | Yes | Open

(a) In its written response to a draft of this report, DSC management stated that it concurred with the recommendations and had completed the recommended actions. We will close the recommendations after reviewing those actions.
(b) Resolved – (1) Management concurs with the recommendation, and the planned, ongoing, and completed corrective action is consistent with the recommendation.
(2) Management does not concur with the recommendation, but planned alternative action is acceptable to the OIG.
(3) Management agrees to the OIG monetary benefits, or a different amount, or no ($0) amount. Monetary benefits are considered resolved as long as management provides an amount.
(c) Once the OIG determines that the agreed-upon corrective actions have been completed and are responsive to the recommendations, the recommendations can be closed.








APPENDIX 6

ACRONYMS USED IN THE REPORT


ARD Assistant Regional Director
CAMELS Capital, Asset Quality, Management, Earnings, Liquidity, and Sensitivity to Market Risk
DIF Deposit Insurance Fund
DIR Division of Insurance and Research
DSC Division of Supervision and Consumer Protection
FDIC Federal Deposit Insurance Corporation
FRB Board of Governors of the Federal Reserve System
GAO Government Accountability Office
GMS Growth Monitoring System
ICARuS Internal Control Assessment Rating System
LIDI Large Insured Depository Institution
NRC National Risk Committee
OCC Office of the Comptroller of the Currency
OIG Office of Inspector General
ORL Offsite Review List
OTS Office of Thrift Supervision
QLA Quarterly Lending Alert
RAC Risk Analysis Center
REST Real Estate Stress Test
SCOR Statistical CAMELS Offsite Rating
TFR Thrift Financial Report
UFIRS Uniform Financial Institutions Rating System
ViSION Virtual Supervisory Information on the Net



