FDIC’s Controls Related to the Offsite Review List
This report presents the results of our audit of the FDIC’s controls related to one of its offsite monitoring tools—the Offsite Review List (ORL).1 The audit objective was to assess the FDIC’s internal controls for performing offsite monitoring of insured financial institutions. We focused the audit on the Division of Supervision and Consumer Protection’s (DSC) controls pertaining to offsite reviews of institutions on the FDIC’s ORL, which identifies insured institutions with 1 and 2 composite ratings and potential problems that pose the risk the institution will be downgraded at the next examination.2 As part of our audit, we also reviewed DSC’s implementation of a recommendation by the Government Accountability Office (GAO),3 pertaining to strengthening the FDIC’s risk assessment activities through periodic evaluations and monitoring, including offsite monitoring.
We conducted this performance audit in accordance with generally accepted government auditing standards. Appendix 1 of this report discusses our audit objective, scope, and methodology in detail.
Section 10(d) of the Federal Deposit Insurance Act requires an onsite examination of each insured financial institution at least once during each 12-month period.4 The examination interval may be extended to 18 months if the insured institution has assets totaling less than $500 million and is well managed and well capitalized. As stated in the FDIC Banking Review, 2003, Volume 15, No. 3, onsite examinations provide the most complete and reliable information about an institution’s financial health, and the federal banking agencies regard CAMELS ratings as the single best indicator of an institution’s condition. However, an institution’s financial condition may change after an onsite examination is completed, so the CAMELS ratings may no longer accurately reflect the institution’s current condition. Therefore, the FDIC has developed various offsite tools, including the ORL, to monitor insured institutions between examinations.
As identified in the Offsite Review Program section of the Case Manager Procedures Manual (Procedures Manual), DSC developed eight risk measures for monitoring the condition of individual institutions. These eight risk measures use offsite data to assist in monitoring the risk of about 8,500 insured institutions (including non-FDIC supervised institutions). The eight measures use information reported in financial institutions’ quarterly Consolidated Reports of Condition and Income (Call Report) and Thrift Financial Reports (TFR). Information from the measures may also aid examiners in planning for an onsite examination. The eight measures are described below.
Three of these eight measures are used to produce the quarterly ORL: the SCOR, SCOR-Lag, and GMS. The ORL consists of 1- and 2-composite rated institutions that are (1) identified by SCOR or SCOR-Lag with a 35 percent or higher probability of being downgraded to a 3 rating or worse at the next examination or (2) flagged by the GMS as being in the 98th or higher growth percentile.
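The selection criteria above amount to a simple filter over quarterly model output. The sketch below is illustrative only (the field names and the `build_orl` function are hypothetical, not the FDIC's actual ViSION logic), but it captures the two stated tests: a SCOR or SCOR-Lag downgrade probability of at least 35 percent, or a GMS growth percentile of 98 or higher, applied to 1- and 2-rated institutions.

```python
# Illustrative sketch of the ORL selection criteria described above.
# The field names and build_orl() are hypothetical; they are not the
# FDIC's actual ViSION data model.

def build_orl(institutions):
    """Return cert numbers of institutions meeting the stated ORL tests."""
    orl = []
    for inst in institutions:
        # Only 1- and 2-composite rated institutions are candidates.
        if inst["composite_rating"] not in (1, 2):
            continue
        # Test 1: SCOR or SCOR-Lag downgrade probability of 35% or more.
        scor_flag = max(inst["scor_downgrade_prob"],
                        inst["scor_lag_downgrade_prob"]) >= 0.35
        # Test 2: GMS growth in the 98th percentile or higher.
        gms_flag = inst["gms_growth_percentile"] >= 98
        if scor_flag or gms_flag:
            orl.append(inst["cert_number"])
    return orl
```

An institution that fails both tests, or that is rated 3 or worse, would not appear on the quarterly list under these criteria.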
DSC stated that it uses several other offsite monitoring tools to monitor risks within the industry and to identify potential emerging issues that may require additional supervisory follow-up. These tools (details are in Appendix 2) include, but are not limited to, the following.
According to DSC, it created and staffed two new sections in 2008 to strengthen the examination program and enhance the risk assessment process, including offsite review. These sections are: (1) the Risk Analysis Section, which analyzes offsite information available through various monitoring systems, together with specific information gathered during examinations, to proactively identify risks and trends; and (2) the Emerging Issues Section, which was created to enhance the Corporation’s ability to develop proactive forward-looking bank supervision policy and conduct offsite monitoring of various institutional risks.
Information in DSC’s Virtual Supervisory Information on the Net (ViSION) System shows that the number of institutions on the ORL has been increasing significantly since 2006, as shown in the figure, which follows.
Number of Institutions on the ORLs Since 2006
Source: Office of Inspector General (OIG) analysis of ViSION system information.
Section 13 of the Procedures Manual discusses the Offsite Review Program, including: (1) definitions of the eight risk measures, (2) generation of the ORL from updated quarterly Call Report data, (3) deadlines for conducting offsite reviews, (4) documentation of the reviewer’s findings and supervisory strategy in the Offsite Module of ViSION, and (5) ViSION comments required for reviews that identify medium or high levels of risk. The Procedures Manual does not provide specific step-by-step instructions for completing the offsite review; rather, it provides a general overview of the Offsite Review Program.
According to the Procedures Manual, “[E]ach institution on the ORL must have an Offsite Review and will appear in the Active Tasks of the appropriate Case Manager or Field Supervisor in ViSION.” Case Managers or Field Supervisors perform the offsite reviews to determine whether supervisory attention is warranted before the next regularly scheduled examination or whether a rating change should be initiated because the review indicates that the institution poses a greater risk to the insurance fund than its composite rating suggests. The manual also states that offsite reviews must be completed and approved within 3½ months after each Call Report date. Table 1, which follows, shows the schedule for completing and approving offsite reviews.
Table 1: Offsite Review Dates
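Because Table 1’s dates follow mechanically from the 3½-month rule stated above, the schedule can be approximated in code. This sketch assumes that “3½ months” means three calendar months plus 15 days; the authoritative dates are those in the Procedures Manual.

```python
# Approximate the offsite review due dates from the 3-1/2-month rule.
# Assumption (ours, not the FDIC's): 3-1/2 months = 3 calendar months
# plus 15 days after the quarterly Call Report date.
from datetime import date, timedelta

CALL_REPORT_DATES = [date(2008, 3, 31), date(2008, 6, 30),
                     date(2008, 9, 30), date(2008, 12, 31)]

def add_months(d, months):
    """Shift a date forward by whole months, clamping to month end."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    last = [31, 29 if leap else 28, 31, 30, 31, 30,
            31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, last))

def review_due_date(call_report_date):
    """Due date for completing and approving the offsite review."""
    return add_months(call_report_date, 3) + timedelta(days=15)

for crd in CALL_REPORT_DATES:
    print(crd.isoformat(), "->", review_due_date(crd).isoformat())
```

Under this assumption, for example, reviews tied to the March 31 Call Report would come due in mid-July.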
Prior Related Audit Attention
In February 2007, the GAO issued a report entitled, Federal Deposit Insurance Corporation: Human Capital and Risk Assessment Programs Appear Sound, but Evaluations of Their Effectiveness Should Be Improved (Report No. GAO-07-255). In its report, the GAO noted that the FDIC had an extensive risk assessment system and contingency plans for bank failures but had not comprehensively or routinely evaluated the system or plans. Although the GAO noted that the FDIC had conducted a one-time analysis of the performance of SCOR, the GAO also noted that the FDIC was not regularly evaluating its offsite monitoring systems for reliability and underscored the need for the Corporation to perform more regular reviews. The GAO recommended that
. . . to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.
In response to the GAO report, the Corporation stated:
We agree that it would be beneficial to review our risk assessment activities to ensure they are comprehensive, appropriate to our mission, and fully evaluated. As noted in the GAO draft report, a review of FDIC offsite monitoring systems has been completed, and work continues to implement needed changes.
Beginning in January 2007, an interdivisional committee will perform an in-depth review of current risk assessment activities and evaluation procedures. By September 30, 2007, the committee will make recommendations to FDIC executive management as to how we might strengthen the risk assessment framework. At that time, management will establish a reasonable timeline to implement any required changes.
RESULTS OF AUDIT
DSC has established an internal control process for performing offsite monitoring of insured financial institutions identified on the ORL. The internal control process includes: (1) scheduling and performing offsite reviews for each institution on the ORL; (2) documenting the analyses performed as part of each review, including a supervisory strategy; and (3) requiring supervisory approval of the reviews performed. We sampled 60 of the 577 institutions on the December 31, 2007 ORL and found that DSC had completed offsite reviews for each sampled institution and documented the reviews in accordance with DSC policies and procedures, including specifying a supervisory strategy. Further, there was evidence of supervisory review for each of the offsite reviews in our sample (Controls for Performing Offsite Reviews).
Although DSC has developed an extensive offsite review program using a variety of sources – including the LIDI program, the QLA, and Regional Watch Lists – to monitor the condition of financial institutions, the ORL was not capturing a significant percentage of institutions that DSC, through its risk management examinations, downgraded to a 3 rating or worse, as illustrated below.
The assumptions and methodologies in SCOR have not been updated since 2003. Further, the offsite monitoring systems used to create the ORL are largely based on historical indicators pertaining to institution asset quality, earnings, and capital that may not fully consider current and emerging risks. As a result, the ORL may not be capturing a complete picture of the risks facing 1- and 2-rated institutions or identifying those institutions at risk of significant ratings downgrades.
Additionally, in response to a February 2007 GAO recommendation to evaluate and monitor such systems, DSC has initiated a process for periodically evaluating the three models-based systems that determine the ORL. DSC plans to evaluate, on a rotational basis, all of its offsite monitoring systems. However, at the time we completed our audit fieldwork, no details regarding a schedule or procedures for conducting evaluations were available, and no system evaluations had been performed.
Validation of the assumptions and methodology used in SCOR is needed on a priority basis to determine if the performance of the system could be enhanced. In addition, thorough evaluations of the three DSC offsite monitoring systems that create the ORL are needed on a regular basis to determine if the assumptions and methodologies used reasonably support determinations for including institutions on the ORL. Using actual failure and downgrade information to test offsite monitoring systems and incorporating the results into evaluations of those systems could lead to a more focused ORL and a more effective and efficient offsite monitoring program. Scheduling all offsite monitoring systems for regular evaluations and establishing procedures to conduct the evaluations would help to assure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the FDIC’s Deposit Insurance Fund are being mitigated (Effectiveness of the ORL).
CONTROLS FOR PERFORMING OFFSITE REVIEWS
DSC has established an internal control process for performing offsite reviews of insured institutions appearing on the ORL. Such controls include making an assessment of risk, identifying a supervisory strategy, documenting analyses performed, and requiring supervisory approval of the reviews. We sampled 60 of the 577 institutions on the December 31, 2007 ORL. For all 60 institutions, we found that DSC Case Managers or Field Supervisors had complied with guidance in the Procedures Manual. Specifically, a reviewer completed an offsite review for each institution, identified a risk level and trend, identified a supervisory strategy, and documented the review in ViSION. We also found that each review had been approved by an Assistant Regional Director (ARD), as required by policy. Further, for those institutions not regulated by the FDIC, we found that Case Managers or Field Supervisors contacted the appropriate regulators to discuss supervisory strategies for each of the sampled institutions whose overall risk level was expected to change significantly over the next 12-month period. The results of our sample are shown in Table 2.

Table 2: Sample of Institutions from the December 31, 2007 ORL
Because we noted no matters warranting additional management action, we made no recommendation in this area related to the audit objective.
EFFECTIVENESS OF THE ORL
The ORL has not been as effective in capturing problem institutions as it should be. Although DSC has developed an extensive offsite monitoring program that utilizes a multitude of other systems and ad hoc reports to address emerging risks, DSC has not regularly validated the underlying assumptions and methodologies for the models-based component of the system or conducted regular evaluations of the models used to create the ORL. This led the GAO to recommend that the FDIC strengthen the oversight of its risk assessment activities and systematically evaluate these activities. The assumptions and methodologies in SCOR have not been updated since 2003. Further, the offsite monitoring systems used to create the ORL are largely based on historical financial information, provided by the financial institution, that may not be accurate and may not fully consider current and emerging risks. As a result, the FDIC’s offsite monitoring systems may not be capturing a complete picture of the current and emerging risks facing 1- and 2-rated institutions or identifying those institutions at risk of significant ratings downgrades.
Guidance Related to Ensuring the Effectiveness of the ORL
FDIC Circular 4010.3, FDIC Enterprise Risk Management Program, adopted the internal control standards prescribed in the GAO publication Standards for Internal Control in the Federal Government. The GAO standards apply to all operations (programmatic, financial, and compliance) and are intended to ensure the effectiveness and efficiency of operations, the reliability of financial reporting, and compliance with applicable laws and regulations. Circular 4010.3 requires management to develop and implement controls to ensure that management directives are carried out and to provide reasonable assurance that controls are sufficient to minimize exposure to waste, fraud, and mismanagement. The circular also requires management to perform monitoring activities to assess the quality of performance over time and the effectiveness of controls. Key control activities described in the circular, as they relate to offsite monitoring, include routine management and supervisory actions; transaction comparisons and reconciliations; other actions taken in the course of normal operations; and separate and discrete control evaluations, including internal self-assessments and external reviews.
Accuracy of the ORL in Identifying Problem Institutions
To assess the effectiveness of the ORL in identifying institutions at risk of being downgraded, we analyzed ORL and CAMELS ratings information available on failed institutions from January 2001 through July 31, 2008 and on institutions that had been downgraded by two or more ratings from 2002 through 2007. The results of our analysis are discussed below.
Failed Institutions That Were Not Identified on ORLs. Of the 20 institutions that failed from January 2001 through July 2008 and were rated 1 or 2 in an examination during that period, 13 institutions (65 percent) did not appear on an ORL in the 4 quarters prior to the quarter they were downgraded to a 3, 4, or 5 rating (see details in Appendix 3).5 In its response to these statistics, DSC stated:
. . . 7 of the 13 institutions that failed and did not appear on the ORL had fraud as a contributing factor for failure. The statistical models used to generate the ORL rely on Call and Thrift Financial Report data and cannot detect deteriorating financial conditions when banks or thrifts misstate their financial information. In addition, another institution was extensively monitored through the LIDI program and also did appear on the ORL for the period from 9/30/00 to 12/31/07. After excluding these eight institutions, the ORL identified 7 of the 12 failures (58 percent) in the 4 quarters prior to the quarter the institutions were downgraded.
Downgraded Institutions That Were Not Identified on ORLs. We obtained a list from FDIC officials of 223 institutions rated 1 or 2 that had been downgraded by 2 or more ratings from 2002 through 2007. When we compared these institutions to the ORLs created during that time period, we found that 151 institutions (68 percent) did not appear on an ORL in any of the 4 quarters prior to the multiple downgrade despite, in many cases, significant financial deterioration. Our results indicate that actual failure and downgrade information would be useful in testing the efficacy of the assumptions and methodologies used in the offsite monitoring systems.
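A comparison of this kind reduces to checking each downgrade against the four quarterly ORLs that preceded it. The sketch below is a hypothetical reconstruction of that matching step; the data structures and function names are illustrative, not the OIG's actual workpapers.

```python
# Hypothetical reconstruction of the lookback matching described above:
# for each downgrade, check whether the institution appeared on any of
# the four quarterly ORLs preceding the downgrade quarter.

def prior_quarters(quarter, n=4):
    """Yield the n quarters preceding (year, q), most recent first."""
    year, q = quarter
    for _ in range(n):
        q -= 1
        if q == 0:
            year, q = year - 1, 4
        yield (year, q)

def missed_downgrades(downgrades, orl_by_quarter):
    """downgrades maps cert number -> (year, quarter) of the downgrade;
    orl_by_quarter maps (year, quarter) -> set of certs on that ORL.
    Returns certs that appeared on none of the 4 prior ORLs."""
    missed = []
    for cert, when in downgrades.items():
        on_any_orl = any(cert in orl_by_quarter.get(q, set())
                         for q in prior_quarters(when))
        if not on_any_orl:
            missed.append(cert)
    return missed
```

Dividing the number of missed certs by the total number of downgrades yields the kind of miss percentage cited above.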
Division of Insurance and Research Analysis of SCOR’s Performance
DSC provided us with a memorandum, dated February 4, 2008, summarizing a Division of Insurance and Research (DIR) analysis of SCOR’s performance from 1985 through the first quarter of 2007. The memorandum states that SCOR’s performance was evaluated in terms of whether the model was able to correctly identify banks that were subsequently downgraded. DIR’s analysis showed that from 1998 through 2007, SCOR did not flag 2,011 (88 percent) of the 2,281 institutions that were eventually downgraded to a 3 rating or worse, and that of the 832 institutions SCOR flagged, 562 (68 percent) were not downgraded at a subsequent examination. According to the memorandum, during periods of economic expansion and growth:
. . . institutions are more likely to be downgraded for non-financial reasons (such as incompetent management, fraud, etc.). In such periods, a financial conditions model such as SCOR, which identifies institutions based purely on the financial ratios, is more likely to miss the downgrades.
DIR’s analysis points to potential issues related to the accuracy of the SCOR model. DSC stated that it recognized the limitations of offsite monitoring models and does not rely solely on these models to evaluate the potential risks posed by financial institutions.
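DIR's figures can be restated as two error rates for a binary early-warning flag: the share of eventual downgrades that SCOR missed, and the share of SCOR flags that did not result in a downgrade. The sketch below simply recomputes the percentages from the counts quoted above.

```python
# Recompute the DIR percentages quoted above as two error rates for a
# binary early-warning flag. Counts are taken directly from the report.

def pct(part, whole):
    """Percentage, rounded to the nearest whole number."""
    return round(100 * part / whole)

downgraded_total = 2281        # downgraded to 3 or worse, 1998-2007
downgraded_missed = 2011       # of those, never flagged by SCOR
flagged_total = 832            # institutions SCOR flagged
flagged_not_downgraded = 562   # flagged but not later downgraded

miss_rate = pct(downgraded_missed, downgraded_total)           # 88
false_alarm_rate = pct(flagged_not_downgraded, flagged_total)  # 68
print(f"miss rate {miss_rate}%, false-alarm rate {false_alarm_rate}%")
```

High values on both rates at once, as here, are what motivates the backtesting reviews discussed later in this report.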
In 2005, an FDIC interdivisional team completed a project to review and evaluate the FDIC’s offsite monitoring systems and tools for their effectiveness and efficiency and to identify opportunities to improve the offsite monitoring process. The Offsite Monitoring Project’s Summary Report and Recommendations stated that, as of June 2005, 283 (60 percent) of 472 institutions with a composite rating of 3 or worse were not identified on a single ORL between June 2001 and March 2005. This interdivisional effort recommended that the selection criteria for the ORL incorporate trend analysis, other statistical measures, and other factors (such as output from other models combined with contributing or mitigating risk factors) to increase the overall predictive capability of the list. The project also recommended that a core set of profiles be established for institutions that migrate to a composite 3 rating and that these profiles be statistically compared to the output from other models. Additionally, the project report included recommendations to improve the performance of SCOR, SCOR-Lag, GMS, and other FDIC offsite monitoring tools. However, we found no evidence that the recommendations had been implemented.
In 2003, the FDIC published an article about SCOR, noting that its performance had declined, particularly in good economic periods. The article noted:
The low level of accuracy might be expected inasmuch as SCOR relies completely on financial ratios. Any such model will probably be more accurate when the reasons for downgrades are financial, and less accurate when the reasons have to do with some aspect of bank operations that does not affect the bank’s financial ratios. For example, examiners may downgrade a bank because they discover that it has significantly weakened its underwriting standards or has weak internal controls—but as long as the more risky loans have not become past due, problems might not have made their way to the financial statements. Consequently, one might reasonably expect that SCOR would be less accurate over the last decade. The reliance on financial data has several other effects on SCOR’s performance. For one thing, it means that SCOR is completely dependent on the accurate reporting of financial information.
DSC officials we interviewed reiterated that SCOR relies on the financial information reported by institutions in Call Reports and TFRs. To the extent this information is inaccurate, the ORL may be impacted. According to DSC officials, DSC has begun looking at this issue by assessing whether institutions that were downgraded and not detected by SCOR eventually filed amended Call Reports. That review was still ongoing at the time of issuance of this report.
DSC also indicated that in 2008, it implemented ICARuS as an offsite risk monitoring tool to identify institutions, between examinations, that may warrant increased supervisory attention because of an increased susceptibility to fraud. ICARuS was developed as a solution for identifying certain management-related characteristics, financial indices, and trends that were common among the failures and near failures not identified by SCOR.
Implementation of a GAO Recommendation Related to Evaluating Offsite Monitoring Systems
GAO’s February 2007 report noted that SCOR is informative but does not always produce accurate results. GAO further noted that such a finding and the FDIC’s limited evaluation of its other offsite monitoring systems underscore the need for more regular reviews. GAO recommended that, “to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.”
In response to the GAO report, the Corporation stated that an interdivisional committee would perform an in-depth review of its risk assessment activities and evaluation procedures. As part of that review, the FDIC would seek to improve its offsite monitoring systems. The plan for this effort included the development of an Offsite Models and Systems Validation Program. As part of the validation program, the FDIC planned to conduct two types of reviews—a logical and conceptual soundness review of the offsite systems, performed over a 4-year period, and a backtesting (outcome analysis) review to be performed annually as follows:
The backtesting reviews would include analyses of (1) trends in the number of institutions flagged, (2) trends in the underlying model or system ratios, and (3) comparisons of the model or system predictions to actual results.
In addition, DSC completed a study of the impact of fraud on financial institution failures and near failures over an 8-year period. The study identified certain management-related characteristics, financial indices, and trends that were common among the failures and near failures. One of the study’s primary recommendations was that DSC should develop an offsite monitoring system that identifies the presence of these characteristics and measures an institution’s susceptibility to fraud. As a result, in May 2008, DSC implemented ICARuS, which identifies institutions that may warrant increased supervisory attention between examinations because of an increased susceptibility to fraud. According to DSC officials, ICARuS results, used in conjunction with SCOR data, improve the overall offsite monitoring results.
Validation of the assumptions and methodology used in SCOR is needed on a priority basis to determine if the performance of the system could be enhanced. In addition, thorough evaluations of all DSC offsite monitoring systems that create the ORL are needed on a regular basis to determine whether the assumptions and methodologies used reasonably support determinations for including institutions on the ORL. Using actual failure and downgrade information to test offsite monitoring systems and incorporating the results into evaluations of those systems could lead to a more focused ORL and a more effective and efficient offsite monitoring program. Scheduling all models-based offsite monitoring systems for regular evaluations and establishing procedures to conduct the evaluations would help to assure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the DIF are being mitigated.
Recommendations for Enhancing the Effectiveness of the ORL

We recommend that the Director, DSC:
CORPORATION COMMENTS AND OIG EVALUATION
On February 10, 2009, the Director, DSC, provided a written response to the draft of this report. Management’s response is presented in its entirety in Appendix 4. Management concurred with our recommendations and stated it had completed the recommended actions.
Regarding validation, DSC stated that it and DIR will validate the assumptions and methodology used in SCOR by reviewing the offsite models that comprise SCOR on an annual rotational basis. With respect to evaluations of all offsite monitoring systems, DSC stated that it and DIR approved a regular validation schedule and have completed the first of the scheduled annual reviews. Further, DSC stated that it has established procedures to evaluate all models-based offsite monitoring systems. The procedures include analyzing trends, performing statistical analyses of model logic and assumptions, and providing a summary of related research findings pertaining to financial performance and CAMELS rating trends. The first of these evaluations was completed in December 2008, according to DSC.
A summary of management’s response to each recommendation is in Appendix 5. DSC’s planned actions are responsive to our recommendations and the recommendations are resolved. Management indicated that it had completed the recommended actions; we will close the recommendations after reviewing those actions.
OBJECTIVE, SCOPE, AND METHODOLOGY
During this audit, we selected a judgmental sample6 of 60 of 577 institutions from the December 31, 2007 ORL to determine whether DSC had completed offsite reviews for each of the sampled institutions and developed supervisory strategies in accordance with DSC policies and procedures. We also evaluated the FDIC’s implementation of GAO’s recommendation to strengthen its risk assessment activities, as it relates to offsite monitoring.
For the sampled institutions, we:
The audit focused on the controls related to offsite monitoring and reviews of institutions on the ORL. These controls included policies and procedures contained in the Procedures Manual, which describes steps the FDIC should take in performing offsite reviews.
Reliance on Computer-processed Information
For purposes of the audit, we used computer-processed information provided in the ViSION system to support our significant findings, conclusions, and recommendations. To assess the reliability of this information, we tested the process for a sample of 60 institutions. The testing of computer-processed information was limited to our comparison of specific data elements, such as SCOR ratings, risk flags, asset size, level of risk, risk trend, and relevant follow-up codes7 for the 60 sampled financial institutions on the ORL. Additionally, we used the ViSION system to review examiner and supervisory comments from the other regulators and determine the timeliness of reviews by the Case Managers and Field Supervisors and of ARD approvals of the supervisory strategies.
We reviewed FDIC performance plans and strategic plans to determine whether the Corporation has established quantifiable performance measures related to its efforts to identify risk in institutions through the ORL. The Corporation has not established performance measures related to the ORL.
Compliance With Laws and Regulations
We determined that there were no applicable laws and regulations directly related to offsite monitoring. In addition, we assessed the risk of fraud and abuse related to the audit objective in the course of evaluating audit evidence.
In September 2002, we issued Audit Report No. 02-033 entitled, Statistical CAMELS Offsite Rating Review Program for FDIC-Supervised Banks. The audit objectives were to determine the effectiveness of SCOR as an early warning system and assess actions taken by DSC. The audit concluded that the effectiveness of the SCOR review program in detecting potential deterioration in the financial condition of insured depository institutions was limited because (1) a time lag of up to 4½ months existed between the date of the Call Report and the subsequent offsite review; (2) SCOR depends on the accuracy and integrity of Call Report information to serve as an early warning between examinations; (3) SCOR does not assess institution management quality and internal control or capture risks from non-financial factors such as market conditions, fraud, or insider abuse; and (4) DSC Case Managers rarely initiate follow-up action to address probable downgrades identified by SCOR other than deferring to a past, present, or future examination.
In February 2007, GAO issued a report entitled, Federal Deposit Insurance Corporation: Human Capital and Risk Assessment Programs Appear Sound, but Evaluations of Their Effectiveness Should Be Improved, Report No. GAO-07-255. GAO noted that SCOR is informative but does not always produce accurate results. GAO further noted that such a finding and the FDIC’s limited evaluation of its other offsite monitoring systems underscore the need for more regular reviews. GAO recommended that, to strengthen the oversight of its risk assessment activities, the FDIC should develop policies and procedures clearly defining how it will systematically evaluate and monitor its risk assessment activities and ensure that required evaluations are conducted in a comprehensive and routine fashion.
THE FDIC’S OFFSITE MONITORING TOOLS
OIG ANALYSIS OF WHETHER FAILED INSTITUTIONS WERE ON THE ORL
|Institution||Date of Failure||Latest Date Institution Was Rated 1 or 2||Institution on the ORL 4 Quarters Prior to Being Rated 1 or 2||No. of Quarters on the ORL||Estimated Loss to the DIF as of July 31, 2008|
|First Heritage Bank, N.A.a||7/25/2008||6/26/2008||No||None||$819,843,000|
|First National Bank of Nevada||7/25/2008||10/15/2007||No||None||$41,773,000|
|IndyMac Bank, F.S.B.b||7/11/2008||1/25/2008||No||None||$8.9 Billion|
|First Integrity Bank, N.A.||5/30/2008||7/1/2003||Yes||2||$2,346,000|
|ANB Financial, N.A.||5/9/2008||2/23/2007||Yes||4||$214,000,000|
|Douglass National Bank||1/25/2008||11/22/2005||No||None||$6,000,000|
|Miami Valley Bank||10/4/2007||4/2/2007||No||None||$18,700,000|
|Metropolitan Savings Bank||2/2/2007||1/22/2007||No||None||$8,905,989|
|Bank of Ephraim||6/25/2004||4/7/2003||Yes||2||$2,998,017|
|Guaranty National Bank of Tallahassee||3/12/2004||7/23/2001||No||None||$0|
|Dollar Savings Bank||2/14/2004||2/14/2004||No||None||$0|
|Pulaski Savings Bank||11/14/2003||11/14/2003||No||None||$679,452|
|The First National Bank of Blanchardville||5/9/2003||5/9/2003||Yes||1||$12,776,628|
|The Farmers Bank & Trust of Cheneyville||12/17/2002||10/15/2002||No||None||$12,204,810|
|AmTrade International Bank of Georgia||9/30/2002||5/13/2002||Yes||1||$1,325,766|
|Oakwood Deposit Bank||2/1/2002||2/1/2002||No||None||$63,802,661|
February 10, 2009
The Division of Supervision and Consumer Protection (DSC) has received and considered the Draft Report entitled FDIC’s Controls Related to the Offsite Review List prepared by the FDIC’s Office of Inspector General (OIG). The audit’s objective was to assess internal controls for performing offsite monitoring of insured financial institutions with a focus on controls related to reviews of institutions on the Offsite Review List (ORL). We appreciate your finding that DSC had completed offsite reviews for each institution included in the audit sample of institutions on the December 31, 2007 Offsite Review List (ORL) and had developed supervisory strategies and documented the reviews in accordance with the Division’s policies and procedures.
The Draft Report also contains three recommendations to help ensure that management’s objectives regarding offsite monitoring are being achieved and financial risks to the FDIC’s deposit insurance fund are being mitigated. We concur with these recommendations and have completed the recommended actions, as detailed in Appendix I. Although we agree with the recommendations, DSC believes it is important to emphasize three points regarding OIG’s comments throughout the report regarding the accuracy of the ORL as a predictive model.
First, the ORL is only one of a number of offsite monitoring tools or systems DSC uses to identify potential deterioration in insured institutions between onsite examinations. The extent to which these other tools and systems were used and accurately identified potential downgrades would not be captured by reviewing the performance of the ORL in isolation. For example, the largest institution included in Appendix 3 of the OIG’s Report (institutions that failed from January 2001 through July 2008 and not on the ORL in the four quarters before the downgrade) was subject to offsite supervisory review by the Quarterly Lending Alert and Large Insured Depository Institution Risk Monitoring Program for the entire OIG review period.
MANAGEMENT RESPONSE TO RECOMMENDATIONS
|Rec. No.||Corrective Action: Taken or Planneda||Monetary Benefits||Resolved:b Yes or No||Open or Closedc|
|1||DSC and DIR will validate the assumptions and methodology used in SCOR by reviewing the offsite models that comprise SCOR on an annual rotational basis. DSC stated that the SCOR validation was completed in December 2008.||$0||Yes||Open|
|2||DSC and DIR have (1) approved a regular validation schedule and (2) completed the first of the scheduled annual reviews.||$0||Yes||Open|
|3||DSC has established the following procedures to evaluate all models-based offsite monitoring systems: analyzing trends, performing statistical analyses of model logic and assumptions, and providing a summary of related research findings pertaining to the financial performance and CAMELS rating trends.||$0||Yes||Open|
|b Resolved –||(1) Management concurs with the recommendation, and the planned, ongoing, and completed corrective action is consistent with the recommendation.|
|(2) Management does not concur with the recommendation, but planned alternative action is acceptable to the OIG.|
|(3) Management agrees to the OIG monetary benefits, or a different amount, or no ($0) amount. Monetary benefits are considered resolved as long as management provides an amount.|
ACRONYMS USED IN THE REPORT
|ARD||Assistant Regional Director|
|CAMELS||Capital, Asset Quality, Management, Earnings, Liquidity, and Sensitivity to Market Risk|
|DIF||Deposit Insurance Fund|
|DIR||Division of Insurance and Research|
|DSC||Division of Supervision and Consumer Protection|
|FDIC||Federal Deposit Insurance Corporation|
|FRB||Board of Governors of the Federal Reserve System|
|GAO||Government Accountability Office|
|GMS||Growth Monitoring System|
|ICARuS||Internal Control Assessment Rating System|
|LIDI||Large Insured Depository Institution|
|NRC||National Risk Committee|
|OCC||Office of the Comptroller of the Currency|
|OIG||Office of Inspector General|
|ORL||Offsite Review List|
|OTS||Office of Thrift Supervision|
|QLA||Quarterly Lending Alert|
|RAC||Risk Analysis Center|
|REST||Real Estate Stress Test|
|SCOR||Statistical CAMELS Offsite Rating|
|TFR||Thrift Financial Report|
|UFIRS||Uniform Financial Institutions Rating System|
|ViSION||Virtual Supervisory Information on the Net|