
Appendix A – Product Support BCA Checklist and Phases


A.1 Product Support BCA Checklist

This appendix provides a guide for those responsible for preparing or reviewing a Product Support BCA. The checklist and process steps are an initial guide, designed to enhance consistency in Product Support BCA products; they are not all-inclusive and should be tailored to the specific program and alternatives being assessed.

1. Executive Summary:

a) Does the executive summary adequately state the problem, study objective, and significant criteria, assumptions and constraints?

b) Are the feasible alternatives clearly identified and differences explained?

c) Is the recommended alternative adequately supported by referencing details of the analysis?

2. Introduction, Outcomes, and Requirements:

a) Is the outcome clear and specific?

b) Is the outcome realistic?

c) Are any feasible alternative solutions excluded due to a bias in the objective statement?

d) Is the objective, as stated, unbiased as to the means of meeting the objective?

e) Are the expected outputs/accomplishments defined in quantifiable, measurable terms?

f) Are criteria specified for selection of a preferred course of action?

g) Is the objective statement phrased so that the type and variety of potential alternatives are not unnecessarily limited?

h) Is the statement of the objective/problem well documented?

i) Have performance measures and outcomes been identified which are appropriate for monitoring the business performance under the proposed new business plan?

3. Assumptions and Methods:

a) Are all assumptions recognized and identified?

b) Are the assumptions realistic, justified, and adequately supported?

c) Are assumptions used only when actual facts are unavailable?

d) Are assumptions unnecessarily restrictive, thereby preventing consideration of feasible alternatives?

e) Do assumptions include economic life and future changes in operations requirements?

f) Are key facts, ground rules, laws, DoD or Service policies, and other constraints stated?

g) Are all assumptions pertinent to the analysis identified and rationale provided?

h) Is a project time frame established?

i) Are space, construction, furniture, and lab equipment needs included?

j) Are necessary geographical constraints included?

k) Are assumptions too restrictive or too broad?

l) Are facts presented as assumptions? Can the facts be verified? Are uncertainties treated as facts?

m) Are all assumptions/constraints well documented?

n) Are methods, factors, evaluation criteria, and their approval process by the governance board clearly documented?

4. Alternatives:

a) Are all feasible alternatives considered?

b) If any alternatives were rejected before a full analysis, was the rationale adequately documented?

c) Are the alternatives significantly different as opposed to superficial restructuring of a single course of action?

d) Was the status quo used as the baseline for alternative evaluation?

e) Were other government agencies' capabilities to provide a product or service considered, where applicable?

f) Were contracting alternatives considered (including public private competition under OMB Circular A-76 or termination and consolidation of existing contracts)?

g) If appropriate, is lease versus buy evaluated as an alternative?

h) Are options applicable to each alternative presented?

i) If the project increases productive capacity, has a contracting alternative been examined?

j) Are the alternatives well defined?

k) Do alternatives overlap one another? Why?

5. Benefits and Non-Financial Analysis:

a) Have all project results, outputs, benefits, or yields been included?

b) Do the benefits relate to the project objective?

c) Are the benefits identified in measurable terms where possible?

d) Are benefits measuring techniques properly defined and supported?

e) Is benefit priority or ranking criteria clearly stated and used in the evaluation? Is any weighting scale consistently and reasonably applied?

f) Are negative results or outputs identified and adequately evaluated?

g) Is the list of benefits free of double counting?

h) Are secondary benefits (not related to the objective) identified?

i) Are all cost savings represented as a negative cost rather than as a benefit?

j) Are the benefits suitably tabulated, graphed, etc.?

k) Are the assumptions identified and rationale explained? Are they too restrictive or too broad?

l) Are estimating techniques defined? Are they appropriate?

m) Are information/estimation sources clearly identified?

n) Are data collection methods valid and adequate?

o) Are benefits estimating techniques valid?

p) If savings have been claimed, will a budget actually be reduced? Have the identified savings been fully coordinated with the impacted activity?

q) Have all advantages and disadvantages of the alternatives been identified?

r) Is expert opinion used? Were these experts properly qualified?
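Item 5(e) above asks whether a benefit weighting scale is applied consistently. As a minimal sketch of what consistent weighted scoring looks like (the criteria, weights, alternatives, and scores below are hypothetical illustrations, not part of the checklist):

```python
# Weighted-score comparison of non-financial benefits. The same
# weights are applied to every alternative, per checklist item 5(e).
weights = {"availability": 0.5, "responsiveness": 0.3, "obsolescence": 0.2}
scores = {
    "Alt 1": {"availability": 7, "responsiveness": 5, "obsolescence": 6},
    "Alt 2": {"availability": 6, "responsiveness": 8, "obsolescence": 7},
}

# Total weighted score per alternative.
weighted = {alt: sum(weights[c] * s[c] for c in weights)
            for alt, s in scores.items()}

for alt, total in weighted.items():
    print(f"{alt}: {total:.2f}")
```

Documenting the weights and raw scores this explicitly also supports item 5(n): a reviewer can verify both the data and the arithmetic.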

6. Cost and Financial Analysis:

a) Are cost and savings schedules realistic?

b) Have all incremental costs to the taxpayer, including common costs, been provided for each alternative?

c) Have cost estimates been provided for the status quo? Are they reasonable? Can they be verified?

d) Are all government direct and indirect costs included for each alternative?

e) Are investment costs consistent with CAPE guidance, the IPS Elements, etc.?

f) Are personnel costs all inclusive; that is, specific skill levels, fringe benefits, overtime and shift differentials, etc.? Are personnel costs broken out by rank/grade, number of employees in each category, etc.?

g) Are future equipment replacement costs included as investments as opposed to operations costs?

h) Are available asset values considered and are such values adequately documented?

i) Are cost collection and aggregation methods correct?

j) Are estimating relationships and procedures identified and properly supported?

k) Are program or project costs expressed in constant dollars?

l) Where inflation or cost escalation is used, have the factors been identified and validated?

m) Are cash flows discounted at the proper discount rate using OMB Circular A-94 guidance?

n) Are the sources of estimates identified? Are these sources accurate and appropriate?

o) Are cost factors current and supportable?

p) Is appropriate backup documentation, e.g., cost data sheets and variable explanation sheets, provided to support cost estimates?

q) Are cost estimates consistent with assumptions and constraints?

r) Has the life cycle cost estimate been provided for all feasible alternatives?
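Items 6(k) through 6(m) concern constant-dollar estimates and discounting. A minimal sketch of discounting a constant-dollar cost stream to present value follows; the 7% real discount rate and the two cost streams are illustrative assumptions only, and the rate actually used must come from OMB Circular A-94 and its current-year updates:

```python
def present_value(costs, rate):
    """Discount a list of constant-dollar annual costs (year 0 first)."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs))

# Hypothetical 5-year cost streams, constant dollars ($K).
status_quo = [100.0, 100.0, 100.0, 100.0, 100.0]  # steady O&S costs
alternative = [250.0, 60.0, 60.0, 60.0, 60.0]      # up-front investment, lower O&S

pv_sq = present_value(status_quo, 0.07)    # 7% real rate (illustrative)
pv_alt = present_value(alternative, 0.07)

print(f"Status quo PV:  {pv_sq:,.1f}")
print(f"Alternative PV: {pv_alt:,.1f}")
```

Note that discounting can reverse a ranking based on undiscounted totals, which is why item 6(m) asks that the proper rate be used.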

7. Risk:

a) Assuming that a risk analysis has been performed, how were the probability estimates derived?

b) Has an uncertainty analysis been performed? What technique was used (for example, a fortiori or contingency analysis)?

c) Were ranges of values used for unknown quantities?

d) Were point values varied to illustrate impact?

e) Have all relevant "what if" questions been answered?
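Items 7(c) and 7(d) ask whether ranges, rather than point values, were used for unknown quantities. One common way to do that is a simple Monte Carlo draw over the range; the triangular distribution, unit cost bounds, and quantity below are illustrative assumptions:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

totals = []
for _ in range(10_000):
    # Unknown unit cost modeled as a range (low, high, most likely)
    # instead of the single point estimate of 100.0.
    unit_cost = random.triangular(80.0, 140.0, 100.0)
    totals.append(unit_cost * 500)  # hypothetical buy of 500 units
totals.sort()

print(f"point estimate : {100.0 * 500:,.0f}")
print(f"10th percentile: {totals[1000]:,.0f}")
print(f"90th percentile: {totals[9000]:,.0f}")
```

Reporting percentiles instead of a single number is one way to answer item 7(e)'s "what if" questions directly.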

8. Sensitivity Analysis:

a) Were the effects of possible changes to the objective requirements evaluated?

b) Has a sensitivity analysis been performed to show the impact of changes in dominant cost elements? Examples are length of economic life; volume, mix or pattern of workload; requirements; organizational structure; equipment, hardware, or software configuration; or, impact on the length of time for project completion. If no sensitivity analysis has been performed, why not?

c) What do the sensitivity analysis results imply about the relative ranking of alternatives?

d) Would the recommendation stay the same if a given characteristic varied within a feasible range?
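Item 8(d) asks whether the recommendation would survive a feasible variation in a key characteristic. A minimal sketch: vary one dominant cost driver (workload) over a range and watch whether the preferred alternative flips. The two linear cost models and the workload range are hypothetical:

```python
def alt_a_cost(workload):
    # Hypothetical contractor support: low fixed cost, high variable cost.
    return 50.0 + 4.0 * workload

def alt_b_cost(workload):
    # Hypothetical organic support: high fixed cost, low variable cost.
    return 200.0 + 1.5 * workload

# Sweep workload over a feasible range and record the cheaper option.
for workload in range(20, 101, 20):
    preferred = "A" if alt_a_cost(workload) < alt_b_cost(workload) else "B"
    print(f"workload={workload:3d}  A={alt_a_cost(workload):6.1f}  "
          f"B={alt_b_cost(workload):6.1f}  preferred={preferred}")
```

In this sketch the ranking crosses over mid-range, which is exactly the situation item 8(c) asks the analyst to surface and explain.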

9. Conclusion and Recommendation:

a) Do the comparison and selection criteria agree with those in the project or mission objective statement?

b) Does analysis data clearly support the recommendation?

c) Were alternative selection criteria applied consistently?

d) Were cost and benefit data suitably displayed to accurately depict relationships?

e) Were the alternatives compared to a common baseline (minimum requirements level)?

f) Were alternative comparison techniques suitable for the program or project being evaluated; that is, present value, payback period, uniform annual cost, etc.?

g) Was a specific course of action recommended?

h) Does the analysis seem free of bias in favor of a particular alternative (for example, no benefits indicated for one or more of the alternatives, biased assumptions, etc.)?

i) Are the recommendations logically derived from the material?

j) Are the recommendations feasible in the real world of political or policy considerations?

k) Are the recommendations based on significant differences between the alternatives?

l) Do benefits exceed relevant costs for the preferred alternative?

m) Have all significant differences between the recommended alternative and others been emphasized?

n) Does the communication plan show a reasonable plan for spreading the word about the proposed business process to all affected parties?

o) Is there a project plan that spells out in sufficient detail the actions different offices or organizations must take to implement the new way of doing business?

p) Does the plan include reasonable steps that are sequenced in proper order to get from the “as-is” to the “to-be” state of business?

q) Do steps in the action plan acknowledge any barriers to implementation and allow time and a reasonable plan of action to overcome implementation barriers?

10. Documentation:

a) Are the costs thoroughly documented in appendixes so an independent reviewer may replicate them?

b) Is it possible to trace costs to their basic inputs, units of measure, sources derived from, and as of date for any special rates or factors?

c) If costs, assumptions, or other input to the estimate is based upon expert opinion, does the supporting documentation include the individual's office symbol, email address, and phone number?

d) Will the Product Support BCA "stand on its own"?

e) Will an independent reviewer be able to reach the same conclusion?

11. Coordination:

a) Has coordination of all participating offices and organizations been obtained?

12. Sustainability:

a) Is the project economically viable?

b) Is the project energy and resource efficient?

c) What is the program’s potential environmental impact?

d) What is the program’s plan and mitigation strategies for potential environmental impacts?

e) Is the project safe for workers and end users?

f) What is the impact to the local community?

g) Does the project consider the 6Rs of closed loop material flow (Recover, Recycle, Redesign, Reduce, Remanufacture, and Reuse)?

h) Does the project consider the 7 Elements of Sustainable Manufacturing (Cost, Resource Consumption, Environment, Health, Safety, Waste Management, and Local Community)?

