DEFENSE SECURITY COOPERATION AGENCY
12/20/2000
MEMORANDUM FOR:
Deputy Under Secretary of the Army (International Affairs), Department of the Army

SUBJECT: FMS Review Policy Guidance (DSCA 00-19)

Over the past few years, DSCA received substantial comments from the USG FMS community and FMS customer countries regarding the FMS review process. (Approximately 400 reviews are held on at least an annual basis.) To provide excellent support to our FMS customers, we use reviews to convey accurate, timely, and thorough status on FMS programs. These reviews represent a significant investment of FMS resources, in terms of both time and funding. While some aspects of the current process received favorable endorsement, the majority of feedback focused on an FMS review process in need of improvement. Specifically, it was felt that policy was needed to establish whether a given review adds value, define the proper scope of the different FMS review types, apply consistency in determining which USG components should attend reviews, identify how FMS reviews should be funded, and assign standard preparation and follow-on requirements.

To respond to these issues, an Interagency Process Team (IPT) was formed in February 2000. The IPT's primary objective was to improve the FMS review process. Representatives from DSCA (Comptroller, MEAN, ERASA, DSADC), USASAC (Alexandria and New Cumberland), Navy (IPO and NAVICP), USAF (AFSAC and SAF-IA), and DFAS met on several occasions to explore this issue in considerable detail. In addition, DSCA briefed the Foreign Procurement Group, the International Customers Users Group, and numerous FMS customer countries during the past several months to solicit their input and to ensure that their desires were given utmost consideration. While there is a valid need to apply FMS review policy as consistently as possible, this guidance gives due weight to accommodating the uniqueness and flexibility necessary for the optimal execution of individual FMS country programs, as discussed in those feedback sessions.
This memo and Attachments 1 through 7 provide the comprehensive policy guidance derived from the IPT. A brief synopsis of this policy will be incorporated into the forthcoming SAMM (DoD 5105.38-M) rewrite. Corresponding updates to MILDEP-level policy publications may be necessary. Additionally, this memo will be posted on the DSCA Web Site (www.dsca.osd.mil). Your assistance is requested in ensuring the widest possible dissemination of this policy. The FMS review policy guidance is found at Attachment 1. That guidance provides the general parameters within which FMS reviews are to be conducted. Main policy tenets follow:
As a means of monitoring this policy, DSCA seeks the establishment of FMS review advisors for DSCA, the MILDEPs/Implementing Agencies, and DFAS Denver. These advisors should have either already served on the FMS review IPT or be otherwise familiar with the review process and policies. I ask that you notify my primary contacts, Mr. David Rude and Ms. Vanessa Glascoe, by 15 January 2001 as to who will help promote this policy guidance. In closing, I want to thank the following individuals outside DSCA for their outstanding contributions to this important endeavor:
Please convey my personal appreciation for their dedication and professionalism, without which the IPT's objectives would not have been accomplished. This group, which is an essential component of the Business Processes IPT, will resume on an ad-hoc basis to ensure DSAMS requirements accurately capture FMS review policy; standardize FMS reporting formats and/or identify minimum data requirements to the extent possible; address policies regarding facilities hosting FMS reviews; clarify proper usage of representational funds, conference fees, gifts, and socials; refine FMS review delivery reporting transactions; and (if needed) fine-tune this policy as a result of implementation feedback. The Business Processes IPT's charter will reflect these efforts. Should your staff have any questions, the DSCA point of contact is Mr. David Rude, Financial Policy Team Chief/IPT Chair, (703) 604-6569, e-mail: david.rude@osd.pentagon.mil.

Tome H. Walters, Jr.

ATTACHMENT:
Attachment 1

While this policy guidance addresses the universe of FMS reviews, certain types of FMS meetings/visits are excluded from this policy. Training PMRs, IMET reviews, technical reviews, site surveys, releasability meetings, and INL-funded meetings are not covered by this policy. In addition, DSCA recognizes that the nature, scheduling, and conduct of Policy-level reviews chaired at the Assistant Secretary or higher level are not subject to this policy. However, Policy-level reviews represent one review category and, as such, are referred to in this document.

Review Types

Five broad types of reviews apply to FMS: Policy-level; Country-level; Service-level; Program-level; and Internal. The first four types (Policy- through Program-level) constitute External reviews, i.e., those involving the FMS customer. Within the Internal review category are three subdivisions: External Review Planning Meetings; Internal Reconciliation Reviews; and Internal Process Reviews. Attachment 2 describes the characteristics and scope applicable to each review type. Please note that the "Associated Reviews" section within Attachment 2 attempts to correlate the review types with the various names/acronyms currently in use to represent each category. Every effort should be made to begin transitioning from those names/acronyms to simply identifying the review type. While some degree of flexibility should be retained to accommodate longstanding country/program-unique review acronyms, it is expected that all prospective reviews that commence for the first time after 1 January 2001 will adhere to the labeling format provided below. In doing so, and with increased familiarity over time with the corresponding characteristics and scope, any misunderstanding as to the purpose/intent/objective of any given review should be significantly reduced.
Example 1: All Program-level reviews should be labeled (Country Name) (Weapon System/Program) (Program Review) -- to illustrate: Bandaria F-16 Program Review.

Example 2: All Service-level reviews should be labeled (Country Name) (Service) (Review) -- to illustrate: Bandaria Army Review. Note: "Service" can denote either the IA or the In-Country Service (ICS), depending on the scope of that particular review. The foregoing illustration applies to ICS-driven reviews. If IA-driven reviews apply, the review name format would be: U.S. Navy Review for Bandaria.

The following sections of this policy correspond to the sequence of IPT Charter Elements found at Attachment 3.

Review Value

It is important that, when considering whether to conduct any given FMS review, a determination is made that the individual review adds value. That value assessment should consider not only USG resources and other constraints, but also the desires of the FMS customer. At times, the political visibility/sensitivity that an FMS review will receive is reason enough to conduct it; this is particularly true for Policy-level reviews. In addition, drastic changes evident in a region, country, or program may necessitate the conduct of previously unscheduled reviews and deviation from usual reporting formats (one such example is the reviews stemming from the 1997-1998 Asia Financial Crisis). For all other circumstances, however, additional determinants must be taken into account in the context of value added. Those criteria include:

Identifying Objectives and Deliverables. When considering whether to have an FMS review, it is imperative that the objectives (why are we conducting this FMS review?) and deliverables (what outcomes do we want to achieve?) are clearly identified. If either objectives or deliverables are absent from that analysis, the review should not be held at that time.
Moreover, the objectives and deliverables should be articulated to all FMS review components (USG and customer) during the planning phase; this will help minimize confusion and reinforce the proper scope of issues to be discussed.

Customer Requirements. A customer's internal policy or even legislation may require periodic information on the status of country accounts, issues, cases, and programs. Care must be taken to ensure that customer expectations or precedence complement the review value process; on the other hand, having held a review every quarter for the past three years is not in and of itself sufficient justification. (An exception would be Program-level reviews that follow an established milestone plan.) In addition, while technologies such as VTC should be explored whenever feasible, recognize that personal, face-to-face dialogue is vital in some cultures to actually getting the work accomplished.

USG Requirements. We may have many of the same needs shown in the "Customer Requirements" section above. In addition, FMS reviews are a wonderful opportunity for apprising the customer of updated policies, laws, and current events/issues. Reviews can also promote our proactiveness and advocacy, as well as timely resolution of issues and closure of actions. They show our commitment and desire to be effective and efficient stewards of the customer's FMS resources. Actions such as those announced in DEPSECDEF's 13 Dec 99 memo (Attachment 4) can be satisfied through FMS reviews.

Activity/Dollar Value/Size. This refers to the degree to which the country, service, or program being reviewed is active, the dollar amounts associated therewith, and/or the number of cases being reviewed. It is important to note that none of these factors is a sufficient standalone indicator for determining the value of a given review. For example, while Country XXX may have only 15 cases, those cases may total several billion dollars in value and could be a lynchpin in our bilateral relations.
Under that scenario, using the number of open cases alone would be misleading. Instead, each of these factors must be viewed in conjunction with the others.

Long-Term Investment. The FMS review forum may be viewed as a valuable opportunity to promote USG interests and strengthen our sovereign relations with other countries. This is an intangible yet potentially important value determinant.

Customer Sophistication/Reliance on USG. This can be an important factor, especially when an FMS review involves a customer unfamiliar with the FMS "language", policies, and procedures. Usually, these customers require closer USG involvement and more intensive management. These reviews would also be prime venues for educating customers on the FMS process. Conversely, highly sophisticated customers can benefit from reviews as they help maintain open communications, but such customers may also be comfortable using technologies as a substitute for reviews per se.

Customer Preference. The preferences and desires of the customer regarding the conduct of reviews should be accommodated to the extent possible. However, when those preferences are not practical and/or logical, the USG review component lead is responsible for offering sound and reasonable alternatives. The key is to find mutually agreeable solutions that make sense.

Uniqueness. A number of reviews have evolved over time to accommodate unique requirements on the part of the customer, the applicable weapon system, etc. Unique arrangements already in existence should continue to be honored provided they continue to add value. However, review components are invited to introduce common data element usage, standardized definitions, and reporting formats to the extent agreeable to the FMS customer.

Number of Reviews

As noted earlier, approximately 400 FMS reviews are held at least once per year.
DSCA received considerable feedback reflecting that the review components' organizational structures generally require the same cadre of country/case/program managers to attend numerous reviews within a given year. Understandably, this strains resources and adversely affects the time allotted for managers to resolve FMS review actions and perform their day-to-day functions. In addition, many FMS customers with an active FMS review roster have expressed a desire to reduce the quantity of reviews for these same reasons. Also, it became quite clear during the IPT's research that areas of duplication and overlap exist between different reviews for the same country/service/program. Therefore, efforts are to begin immediately to identify reasonable ways to consolidate (or, in some instances, eliminate altogether) reviews. Examples of consolidation already instituted follow:

Example 1: Merge the Financial Management Review (FMR) and Case Reconciliation Review (CRR) for the same country into a single FMR.

Example 2: Consolidate separate Program-level reviews that are mature in nature into a single joint Program-level review.

These consolidation efforts, however, cannot be taken unilaterally: review consolidation/reduction proposals must be offered to and accepted by the FMS customer. USG flexibility in entertaining customer counter-proposals is expected. While the precedence of having a given review should be given merit, remember that precedence does not mandate permanence. For consolidation approach recommendations, or if problems with the proposals arise, please consult the respective FMS review advisor (see section below). The keys to success in efforts to reduce/consolidate are that the value of such a reduction exceeds the status quo, and that the customer perceives that fewer reviews improve the process. This latter point may require some education on our part.
In addition, a primary objective of merging reviews should be to minimize (if not eliminate altogether) areas of redundancy and duplication. Resource constraint issues arise when the exact same type of information must be presented (albeit in slightly different formats) during several different FMS reviews. Similarly, identical issues can be raised at more than one review and/or review type. In those instances, the party raising the issue should be apprised of the most suitable review for discussing that topic. One corrective measure is to ensure correlation between the level of the issue being proposed for discussion and the review type itself (refer to Attachment 2). We must also remain reasonably flexible to address all customer concerns at a review. If issues are known in advance that are clearly outside the scope/purview of that review, the customer should be notified of alternative venues for those discussions.

Optimal Frequency of and Timing for Conducting Reviews

The usual frequency of and timing for reviews depend in large part on the review type being considered. For all external reviews deemed necessary by both the USG and the customer, the frequency and timing must be agreed by mutual consent with the FMS customer. The following reflects normal guidelines:
The LOA note for Program-level review frequency follows: "Program Review Schedule. The initial review schedule has been projected as follows: (specify known review events here). Future changes and/or additions to this projected schedule will be based on further program definition and will be provided through official correspondence to the FMS customer for concurrence."

In scheduling reviews, consideration should be given to customer and USG holidays, customer weekends (which are oftentimes different from ours), and changes within SAO personnel and customer leadership.

Appropriate Levels of Representation

For protocol purposes, whenever possible the rank of the lead USG review official should be equivalent to that of the customer co-chair (counterpart). All USG representatives attending FMS reviews must be knowledgeable and empowered to make on-the-spot decisions, while recognizing that some issues may require the final approval of senior managers who may not be present at the review itself (which may require an action item). Those who attend FMS reviews must be able to adequately represent their components and, consequently, speak effectively and decisively. This topic must also consider the type and scope of the review being held. While more senior officials may co-chair reviews of a highly visible and macro-level nature, detailed reviews such as PMRs may require the attendance of managers who are responsible for the day-to-day operation of the program/weapon system.

FMS Review Attendees

This factor addresses two aspects: (1) which components should attend each type of review, and (2) the responsibilities of the attendees.

Component Attendance. Although exceptions are allowed if agenda topics dictate (and if those issues are not under the purview of the usual attendees), components are normally required for the review types as shown below:
Attendee Responsibilities. All USG DOD officials attending FMS reviews must meet the following criteria:
FMS Review Funding

During the IPT's research, it was found that funding practices are applied inconsistently across the FMS review types. It was also discovered that the funding source depends not only on what type of review is being considered, but also on which components attend and even which levels of component managers attend. Attachment 5 provides the FMS funding matrix. If the USG requests reviews exceeding the normal timeframe shown in the preceding table, the source of funding normally would not change. However, if the FMS customer requests reviews exceeding the norm, those additional reviews could be FMS case-funded -- in that situation, the USG and FMS customer should assign a mutually agreeable FMS case against which the review costs should be charged. DSCA will coordinate with OUSD (Comptroller) to ensure any rewrite of Table 718-1 of the DoD FMR, Volume 15, Chapter 7 reflects Attachment 5. We realize that extraordinary exceptions may be required to accommodate a given individual's circumstances for a specific FMS review; in those instances, the applicable FMS review advisor must be consulted for a policy exception determination.

FMS Review Reporting Format Standardization

The establishment of "boilerplate" reporting formats for each FMS review type is an important tool for eliminating inconsistencies and/or redundancies. In addition, using standard formats helps familiarize the FMS customer with our usage of data element terms, and avoids the confusion that oftentimes results from presenting different formats in the same review. While standardized formats are preferred, flexibility should be retained to allow for supplemental changes and other deviations from the normal reporting structure. The standard format for use in DSCA Country-level FMRs is provided at Attachment 6 to illustrate this point. As essential as the format itself is the consistency associated with defining each reporting data element.
It is a source of confusion and frustration to those receiving reports in an FMS review when various reporting components use the same term (e.g., "obligations") in different ways. The development of a lexicon would assist all components responsible for preparing similar reports, and as such DSCA highly encourages that lexicons be distributed at all reviews.

General Preparation and Follow-On Requirements

The FMS review is both the culmination of the extensive preparation and planning preceding it and the foundation for important follow-on requirements. The following guidelines apply to all reviews, regardless of level or hosting organization:

Preparation. The first step in planning for a review is to identify the objectives and deliverables -- refer to the foregoing discussion under "Review Value". Subsequent preparation requirements are to involve the following:
Follow-on. It is expected that action items will be tasked, and other information will be required, as a result of an FMS review. The following applies:
Communication Channels

The degree to which the planning for, conduct of, and follow-up to reviews succeed is highly dependent on open and efficient lines of communication. For external reviews, the SAOs in particular are key players as they are the official liaison between the FMS customer and the USG review components. The lead USG review component (i.e., the review co-chair) is responsible for ensuring these clear communication channels exist. With ever-expanding technology, communication takes both "formal" and "informal" forms. For the purpose of communicating on FMS reviews, formal communication encompasses frontchannel cables, letters/memoranda, and meetings with the customer; informal communication includes e-mail. Formal communication must be made on the following aspects of FMS reviews:
Informal communication can address the following:
Surveys

The survey instrument is an excellent means for assessing customer satisfaction with the review just held, as well as a forum for "lessons learned" to improve future reviews. Surveys are required for all Country- through Program-level FMS reviews that commence after 31 March 2001. They are to be distributed prior to the review's closing session. Preferably, they will be returned before surveyed attendees depart; if that is not possible, a target date should be assigned by which respondents furnish the completed survey. The boilerplate survey to be used is found at Attachment 7. Modifications to the boilerplate survey may at times be warranted, to include adding survey elements addressing satisfaction with the FMS customer in a given review. DSCA encourages a central repository for survey results, possibly with the applicable FMS review advisor.

FMS Review Advisors

To address policy guidance implementation queries and help ensure consistent interpretation thereof, FMS review advisors should be established for the MILDEPs/Implementing Agencies and DFAS Denver. Mr. David Rude and Ms. Vanessa Glascoe will serve as the DSCA FMS review contacts. The selected advisors should be familiar with, and serve as overall focal points for, the following:
Attachment 2
Note: Refer to the glossary below for the associated review acronyms.
Associated Reviews Glossary
*** Denotes reviews unique to a specific country
Attachment 3
MEMORANDUM FOR:
Secretaries of the Military Departments

SUBJECT: Foreign Military Sales (FMS) Financial Management

Recent audit reports have identified a number of FMS management problems that manifest themselves in inaccurate or delayed financial management transactions. At my direction, a review of FMS processes impacting financial management was conducted. This effort was led by the Office of the Under Secretary of Defense (Comptroller) [OUSD(C)] and the Defense Security Cooperation Agency (DSCA), with participation by the Office of the Under Secretary of Defense (Acquisition, Technology and Logistics) [OUSD(AT&L)], the Military Departments, the Defense Logistics Agency, and the Defense Finance and Accounting Service (DFAS). The review produced a number of recommendations with the potential to improve financial management in the near term. I have approved those recommendations and am directing their implementation through the actions contained in the attachment.

The attached actions are intended to reduce workload, eliminate erroneous payments, lower operating costs, permit FMS cases to be closed sooner, accelerate reimbursements to the Department and the U.S. Treasury, and ensure better customer satisfaction. Within 90 days from the date of this memorandum, the USD(AT&L), Heads of the DoD Components, and Directors of DFAS and DSCA are directed to report their progress on the attached actions to the USD(C). Your cooperation in implementing these rules is appreciated.

John J. Hamre

ATTACHMENT:
Attachment The Under Secretary of Defense (Acquisition, Technology and Logistics) (USD(AT&L)) is directed to:
The Under Secretary of Defense (Comptroller) (USD(C)) is directed to:
The Director, Defense Finance and Accounting Service (DFAS), is directed to:
The Director, Defense Security Cooperation Agency (DSCA), is directed to:
The heads of the DoD Components are directed to:
Attachment 5
Attachment 6
U.S. -- (Country) 2000 Financial Management Review Case Financial Status Reporting Format
Attachment 7

FMS Review Title:

Please take a few moments to fill out this important survey so that we can assess and, where needed, improve this FMS review process. Circle the rating that you feel best applies. For numeric ratings, a "1" is a low or very poor assessment, while a "5" is a high or very favorable one. In addition, you are encouraged to provide any written remarks or elaborate on your opinions at the bottom of this form. Providing your name and the component you represent is strictly voluntary. All responses will be treated on a non-attribution basis. Thank you!
Other Comments: