SSE Town Hall Q&A 2005
  • Systems Engineering Policy, Guidance, Education, and Training
    • Q: Why does OSD feel it is important to revitalize systems engineering? Why now?
      • A: Effective technical discipline on programs is essential to accurately assess cost, schedule, and technology risks from the proper perspective. There have been numerous reviews and studies by government, industry (both Defense and commercial), and academia that clearly indicate sound application of systems engineering (SE) principles is fundamental to program success. We have mined best practices from these reviews and studies and promulgated them in the form of SE policies, the guidance provided in Chapter 4 of the Defense Acquisition Guidebook, the new continuous learning courses, and the soon-to-be-revised Systems Planning, Research, Development, and Engineering (SPRDE) career field courses.
    • Q: Can we get consensus among the Services and OSD to accelerate publishing of key standards (MIL-STD-499, MIL-STD-1521, etc.)?
      • A: Beyond the industry and international standards that already exist, the need for additional standards remains under discussion and consideration. DoD policy and guidance promulgated since 2003 is relatively new and is being assimilated by the acquisition community. The effectiveness of this policy and guidance needs to be assessed before additional steps are proposed and taken. Currently, the Defense Acquisition Guidebook, Chapter 4, serves as the guide for systems engineering application to systems acquisition.
    • Q: Should military standards (MIL-STDs) be implemented?
      • A: The issue of military standards (MIL-STDs) and the role they may play in systems engineering revitalization is under discussion. There already are commercial and international standards for systems engineering. Given that any new standard would need to be tailorable, it is not clear what value a military standard would have over and above the guidance in the Defense Acquisition Guidebook.
    • Q: Why haven't you emphasized the impact of systems engineering (SE) process and metrics in the Request for Proposal (RFP)? (Get SE priced and scheduled appropriately.)
      • A: It is expected that as the new systems engineering (SE) policies and guidance take hold, there will be attendant improvement in the pricing and scheduling of programs. For example, the policy for event-based reviews is expected to help drive more schedule realism into early program planning and help yield better costing of not only systems engineering, but of the overall program. To better focus on how the systems engineering policies and guidance can be applied to activities in source selection, OSD is developing a contracting guide for SE. We expect to complete this guide in the Fall of 2005.
    • Q: Besides paying directly for systems engineering (SE) in the scope of the contract, what will incentivize industry to practice sound SE throughout a program?
      • A: During execution of the contract at hand, it is important for industry to practice sound systems engineering (SE) to ensure the success of the program. Past performance is an important element in the source selection process, and a contractor should be concerned with making the program a success to avoid a bad past-performance record. To aid in formulating future contract strategies, OSD is working on a contracting guide for SE. We plan to address incentivization in this guide and expect to complete the guide in the Fall of 2005.
    • Q: What can the government do to incentivize flow down of systems engineering (SE) rigor to suppliers and subcontractors?
      • A: Incentives should first be established between the government and prime for SE. The prime will then flow down appropriate incentives to their subs and suppliers.

        Suggestion: Need to allow program management-track people to take Systems Planning, Research, Development, and Engineering (SPRDE) courses!

        A: Nothing prevents this except for limited funding issues. The Systems Planning, Research, Development, and Engineering (SPRDE)/Systems Engineering (SE) Career Path Functional Advisor (FA) will coordinate with the Program Management Career Field FA, the Defense Acquisition University (DAU), and the Service/DoD Agency Defense Acquisition Career Managers (DACMs) on finding ways of encouraging all appropriate workforce members to take SE courses. Currently the SPRDE/SE career path curriculum is being restructured and revised to reflect the new SE policy and guidance. A new SYS 101 course is in development and will be required for SPRDE/SE certification.
    • Q: What education is available to train architects? How do you know when you have a well-done architecture?
      • A: The general topic of architectures is covered in the Systems Planning, Research, Development, and Engineering (SPRDE)/Systems Engineering SYS courses. With respect to knowing when you have a well-done architecture, a well-planned and well-executed technical approach will clearly design in, and demonstrate, the required features the customer expects.
  • Systems Engineering Plan Guidance
    • Q: What is the strategic objective of the Systems Engineering Plan (SEP)?
      • A: Our acquisition programs are becoming increasingly complex, as is the underlying systems engineering. There can be wide variation in how systems engineering is applied to a program, with attendant variation in program success (the ability to deliver required capability on schedule and within budget). It has been noted that problem programs typically have weak application of systems engineering. The purpose of the Systems Engineering Plan (SEP) is to enable a collective understanding (government and industry, working level and leadership) of how, and by whom, technical management will be applied to the program. The SEP is a mechanism by which the application of systems engineering best practices can be planned, viewed, reviewed, and executed.
    • Q: Systems Engineering Plan (SEP) document versus actual use practices! What is more important?
      • A: The primary purpose of a program's Systems Engineering Plan (SEP) is to capture and share the systems engineering (SE) "actual use practices" to be applied on a program; there should be no difference.
    • Q: What's more important: a well-polished plan or employing the practices of systems engineering (SE) in a robust manner?
      • A: The more important activity is planning for systems engineering (SE) that includes the best practices espoused in the Defense Acquisition Guidebook and the Systems Engineering Plan (SEP) Preparation Guide, executing to that planned approach, and updating the planning necessary to accomplish future activities.
    • Q: What does my Systems Engineering Plan (SEP) need to contain to get approved? Are there specific info/processes that it needs to address?
      • A: The better question is what SE planning has been done, and is it properly documented in the SEP? Alternatively, what does the Systems Engineering Plan (SEP) need to contain for the program to be successful? The guidance in the Defense Acquisition Guidebook, Chapter 4, the SEP Preparation Guide, and the 25 focus areas for technical planning can be accessed at the OSD Systems Engineering website: http://www.acq.osd.mil/se/.
    • Q: Where can we download a copy of the 25 focus areas to address technical planning in a Systems Engineering Plan (SEP)?
      • A: The 25 focus areas for technical planning can be downloaded from the OSD Systems Engineering website: http://www.acq.osd.mil/se/.
    • Q: Capability Maturity Model Integrated (CMMI) addresses 20+ processes. Which of these should be addressed in Systems Engineering Plans (SEPs)? The issue is bounding process definition.
      • A: The Systems Engineering Plan (SEP) should address the systems engineering (SE) approach to implementing the Capability Maturity Model Integrated (CMMI) processes. The SEP should detail the technical planning, including the "who," for the conduct of SE in support of any and all processes on a program. However, DoD does not require or support obtaining a CMMI level rating of any kind.
    • Q: Systems Engineering Plan (SEP) Process-Why not use only the 25 questions as the SEP guidance and scorecard and let the Service/Agency/program office set the SEP 'format?' OSD, if not concerned about the document, should only set the grading standards and let the Services/Agencies bring you the quality product.
      • A: Given that the purpose of the Systems Engineering Plan (SEP) is to capture and share (across all program participants and stakeholders) the plan for how systems engineering (SE) will be accomplished on a program, there are many good sources for guidance. Systems engineering process standards, Capability Maturity Model Integrated (CMMI), company best practice standards, the Defense Acquisition Guidebook, etc., are all pertinent guidance. The emphasis should be on good SE application in the SEP rather than conforming to any given set of questions or "scorecard." The 25 questions used by OSD to organize and formalize their review process are only one set of perspectives and should not be the sole focus of program technical planning. Program teams should consider the totality of guidance available and apply and tailor to the specific technical and programmatic challenges of their individual programs.
    • Q: There appears to be interest in system-of-systems (SoS) engineering, but the Systems Engineering Plan (SEP) guidance doesn't seem to address SoS or Enterprise-level processes. Discuss what should be done.
      • A: The Systems Engineering Plan (SEP) guidance was developed to apply to any system at any level of integration complexity. The notion of a "system" directly applies to a system-of-systems (SoS) from a systems engineering standpoint as discussed in the Defense Acquisition Guidebook (Section 4.2.6).
    • Q: Discuss the level at which a Systems Engineering Plan (SEP) is required: Program Executive Officer (PEO), Program Manager (PM), or Program?
      • A: The Systems Engineering Plan (SEP) should be prepared, reviewed, and approved across the stakeholders responsible for the program oversight and execution. As the technical plan for the program, part of its purpose is to illustrate to program leadership (at multiple levels) how the program team will execute the technical aspects of the program and how technical risks will be identified, assessed, and mitigated from a technical standpoint.
    • Q: For business systems, change management, process re-engineering, and data migration are key risks. Should these risks be addressed in the Systems Engineering Plan (SEP)?
      • A: Change management, process re-engineering, and data migration are key processes for business systems. As such, they need to be examined as potential risk areas and documented in the Systems Engineering Plan (SEP), as appropriate. These may not, however, be the only risks to the program that exist. The SEP should detail the technical planning for the program and how these and all program risks will be identified, assessed, and mitigated from a technical standpoint. The event-driven technical reviews are key to determining program health.
    • Q: What are best practices for integrating contractor and government Systems Engineering Plans (SEPs)?
      • A: Systems engineering, when properly done, involves the entire program team: government program office, prime contractor, and sub-tier suppliers. An integrated program Systems Engineering Plan (SEP) should capture the technical plan for how these organizations will coordinate the program's overall technical effort across organizational and contractual boundaries.
    • Q: Are you suggesting that the contractor Systems Engineering Plan (SEP) be a formal contract item and used as a baseline under change control?
      • A: The level of formality is up to the program team to decide. If making the Systems Engineering Plan (SEP) a "formal contract item" is necessary to have the SEP completed (over and above the necessity to have a technical plan for systems engineering on a program), then that is what the program should do. As with any useful planning document that is being used, it is expected that a program's SEP will change over time. Change controls will ensure all program stakeholders are working from the same plan.
    • Q: Is there value in a Systems Engineering Plan (SEP) for small ACAT III products that follow the systems engineering (SE) process and use integrated product teams (IPTs)?
      • A: Absolutely. Program size is not a determinant for requiring systems engineering; systems engineering should be applied to every program regardless of size. Given that we have a larger number of smaller programs (ACAT III and IV), taken collectively, these programs represent a significant part of the Department's overall portfolio and risk. The same technical underpinnings should be in our smaller programs, as well as our larger programs. For programs having less complexity and risk, it would be expected that their Systems Engineering Plans (SEPs) would be similarly smaller in scope.
    • Q: In a Systems Engineering Plan (SEP) for a program with many different delivery orders of different sizes, levels of complexity, etc., is it possible to clearly and concisely explain entry/exit criteria for reviews?
      • A: It should be. How else are we to track technical maturity within the delivery orders? Assuming each delivery order delivers a product, the technical maturity of that product needs to be assessed during its development. Technical reviews, with their content, criteria, and products, would form the basis for the technical planning in the Systems Engineering Plan (SEP). For many smaller orders, an overarching SEP may serve the purpose of structuring how systems engineering would be applied. For large delivery orders, a separate SEP for that delivery order may be appropriate. This should be determined by the Milestone Decision Authority.
    • Q: How to best document plan vice execution? How to keep plan baselined and also living?
      • A: The value of a plan is most often in the planning itself. But there is also value in having a plan against which to execute, and in being able to share this plan with the overall team (which can change over time). During execution, it may become evident that the plan is no longer being followed, for positive or negative reasons. It would be prudent to readdress the plan and update it, both to take fuller advantage of the positive changes in execution (and share them more broadly) and to reinforce the plan against the negative changes. More importantly, the systems engineering approach needs to evolve as the program evolves through its phases. Over time, future uncertainties become clearer and need to find their way into the detailed planning accordingly.
    • Q: Most of the systems engineering (SE) discussion, including Systems Engineering Plan (SEP) requirements, are focused on Milestones A, B, and C. What are we doing to encourage continual use of SE during the sustainment phase of a program?
      • A: Guidance for systems engineering (SE) during sustainment can be found in the Defense Acquisition Guidebook, Chapters 4 and 5. Additionally, Systems Engineering Plans (SEPs) submitted in support of the Milestone C should address the approach for sustaining engineering during the production and operations and support phases of the program.
    • Q: Given that the Systems Engineering Plan (SEP) is intended to cover the entire life cycle of the weapon system, how can we ensure the SEP is followed in the later life cycle phases, i.e., post-deployment?
      • A: It is expected that the Services and Agencies will, as part of their ongoing program oversight, ensure Systems Engineering Plans (SEPs) are updated as necessary during the Operations and Support Phase.
    • Q: If the program has passed Milestone B, how much history should be included in the Systems Engineering Plan (SEP)?
      • A: The Systems Engineering Plan (SEP) should only contain that background information necessary to frame the context for the planning discussion. For programs with SEPs since program initiation, this aspect should be easily accomplished.
  • SE Definition, Process, and Application
    • Q: What is the definition of systems engineering (SE)? I heard a lot of different definitions from leadership.
      • A: There are a number of recognized definitions for systems engineering (SE), including those in U.S. and international standards. Rather than endorse one of these definitions in lieu of the others, DoD has adapted the following formal definition from EIA/IS 632, Processes for Engineering a System (see Defense Acquisition Guidebook Section 4.1.1): "Systems engineering is an interdisciplinary approach encompassing the entire technical effort to evolve and verify an integrated and total life cycle balanced set of system, people, and process solutions that satisfy customer needs. Systems engineering is the integrating mechanism across the technical efforts related to the development, manufacturing, verification, deployment, operations, support, disposal of, and user training for systems and their life cycle processes. Systems engineering develops technical information to support the program management decision-making process. For example, systems engineers manage and control the definition and management of the system configuration and the translation of the system definition into work breakdown structures."
    • Q: Is classical systems engineering (SE) the same as DoD SE?
      • A: "Classical" systems engineering (SE) is in the eye of the beholder. There are a variety of U.S. and international standards (see Defense Acquisition Guidebook Section 4.6.1), which describe the best methodology for application of SE. All are similar, but none are identical. The application of SE within DoD is different than outside DoD. It's a matter of the relationship between the acquirer and supplier. Generally, the DoD is in the role of acquirer, soliciting technical performance outcomes from its suppliers and Defense industries, and judging that performance during contract oversight and independent technical reviews. Suppliers, on the other hand, generally are responsible for meeting the SE expectations of the acquirers (as well as implementing internal procedures for SE). This is grass roots, hands-on SE and should extend throughout the supplier chain. The Systems Engineering Plan (SEP) should be the common product of both acquirer and supplier and as such is the mutually approved and prosecuted plan that guides SE for a program.
    • Q: There is a tension between systems engineering (SE) as an art, learned from experience and recognized by its successful program performance, and SE as a disciplined process, defined by models and measured by metrics. What is the right balance?
      • A: This "tension" is more perceived than real. Systems engineering is a disciplined process, whether learned from experience or defined by models and measured by metrics. There is no "right" balance.
    • Q: Would you elaborate on the distinction between systems engineering (SE) and systems integration-to the extent that there is a distinction?
      • A: Systems integration is one of a variety of technical processes encompassed by systems engineering (SE) (see also Defense Acquisition Guidebook Section 4.2.4.5). There is a measure of integration, with different emphasis, during each life cycle phase of acquisition. This distinction is addressed in the detailed phase-by-phase discussion in Defense Acquisition Guidebook Section 4.3.
    • Q: Different organizations use different terms for systems engineering (SE). Some organizations use the term 'mission assurance.' Can we develop a list of tasks to be performed as part of SE?
      • A: The Defense Acquisition Guidebook, Chapter 4, provides systems engineering (SE) tasks by life cycle phase in Section 4.3. Additionally, Section 4.2.2.2 provides a review of standardized terminology for SE and explains each in more detail.
    • Q: Is there any value in defining an engineering concept of operations (CONOPS) that can be applied across Services/Agencies?
      • A: The structure that the systems engineering (SE) process supplies to technical program planning and execution could be considered an acquisition program's concept of operations (CONOPS): SE provides the framework and processes, by life cycle phase, recommended for all Services/Agencies and described in detail in the Defense Acquisition Guidebook, Chapter 4.
    • Q: Despite the fact that DoDI 5000.2 states, 'incremental' approaches are the first choice in acquisition, there is very little discussion about how to use this strategy to address requirements/resources to get the user 'bite size' servings of real capability. Engineering for system increments seems like an important topic, but it gets little attention. Why?
      • A: The Defense Acquisition Guidebook discussion of engineering for system increments is diffuse; however, Chapter 4 makes clear that the same phase-specific systems engineering (SE) processes apply to each increment. Other chapters also provide guidance on incremental acquisition. We suggest using the search function of the Defense Acquisition Guidebook on the DAU web site to find all the specific references.
    • Q: Please describe the systems engineering (SE) processes that we should be employing.
      • A: Chapter 4 of the Defense Acquisition Guidebook (Section 4.2.2.2) describes two categories of systems engineering (SE) processes: Technical Management Processes and Technical Processes. The former group includes such activities as Risk Management and Configuration Management; the latter group includes, for example, Integration, Verification, and Validation. Each of the SE processes listed in the Defense Acquisition Guidebook is subsequently described and cross-referenced to more detailed documentation.
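
        For illustration only, the following minimal sketch (in Python; the grouping reflects only the process names mentioned in this Q&A, not the Guidebook's full lists, which are longer) shows the two-category taxonomy as a simple data structure:

        SE_PROCESSES = {
            "Technical Management Processes": [
                "Risk Management",
                "Configuration Management",  # Guidebook Section 4.2.3.6
                "Data Management",           # Guidebook Section 4.2.3.7
            ],
            "Technical Processes": [
                "Integration",
                "Verification",
                "Validation",
                "Transition",                # Guidebook Section 4.2.4.8
            ],
        }

        for category, names in SE_PROCESSES.items():
            print(category + ": " + ", ".join(names))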

        Comment: T&E is an integrated part of systems engineering (SE). OSD, of all organizations, should not address T&E apart from SE.

        A: Test and evaluation (T&E) is certainly an integral part of systems engineering. That is why OSD has used the systems engineering "v" model in Chapter 4 of the Defense Acquisition Guidebook. This model and framework fully integrate T&E activities across the entire life cycle of the program (Section 4.3). Additionally, the systems engineering "v" links T&E activities throughout the SE process: on the left side of the "v" (definition, decomposition, allocation, and design), verification and validation plans should evolve as an integral part of the technical baselines; on the right side (integration, verification, and validation), T&E functions as the feedback loop. More detailed information can be found in Chapters 4 and 9 of the Defense Acquisition Guidebook.
    • Q: What is the role of technology transition in the systems engineering (SE) process?
      • A: Defense Acquisition Guidebook Section 4.2.4.8 defines the systems engineering (SE) process of transition as ". . . the process applied to move the system element to the next level in the physical architecture or, for the end-item system, to the user. This process may include installation at the operator or user site." Technology maturation, validation and verification, and integration into a system at whatever level (e.g., configuration item, subsystem, etc.) are planned and documented in the Systems Engineering Plan (SEP), Risk Management Plan, and Test and Evaluation Master Plan (TEMP).
    • Q: Can you define the role/scope of Configuration Management (CM)/Data Management (DM) in the systems engineering (SE) process?
      • A: The role and scope of Configuration Management (CM) and Data Management (DM) are discussed in Defense Acquisition Guidebook Section 4.2.3.6 and Section 4.2.3.7, respectively. Briefly, a CM process guides the system products, processes, and related documentation, and facilitates the development of open systems. Configuration Management efforts result in a complete audit trail of decisions and design modifications. CM establishes and maintains consistency of a product's performance and its functional and physical attributes with its requirements, design, and operational information throughout its life. Data Management also plays an important role in the systems engineering (SE) process. In the program office, DM consists of the disciplined processes and systems used to plan for, acquire, access, manage, protect, and use data of a technical nature to support the total life cycle of the system.
    • Q: How should hardware and software architecture be documented?
      • A: Configuration item architecture requirements and design should be an integral part of the overall technical baseline description along with the other aspects of functional, allocated, and product requirements and design information.
    • Q: What are the issues associated with executing the systems engineering (SE) process in programs having a high degree of concurrency? What are the methods to resolve?
      • A: Tracking design and interface changes and validating them and their integration while maintaining configuration control across the total program is a challenge when faced with a high degree of concurrency. Systems engineering (SE) processes including configuration management, continual tracking of critical path, event-driven technical reviews, and a detailed, event-driven schedule in the integrated master plan/schedule are the SE process tools most germane to managing concurrency. There are, however, no methods to resolve concurrency if it is the result of program constraints that preclude a more serially designed schedule.
    • Q: How do you apply systems engineering (SE) to a system-of-systems (SoS) program? Many systems were developed as stovepipes by different program management organizations.
      • A: The same systems engineering (SE) processes apply to a system-of-systems (SoS) program as to a single system. However, there may be additional complexities in SoS SE because the components of an SoS are themselves independently useful systems. The Defense Acquisition Guidebook Section 4.2.6 offers guidance on SoS SE and includes a set of diagnostic questions to guide program manager (PM) thinking on SoS.
    • Q: What is the benefit of doing a Milestone A review?
      • A: Technical maturation and trade-off decisions are made within a disciplined systems engineering (SE) process in Concept Refinement where enabling technologies are aligned with needed capabilities. Risk mitigation plans (technology maturation plans using Technology Readiness Levels as measures of technology maturity) are a key output of the Concept Refinement phase. These plans together with other products, such as a preliminary system specification, systems engineering planning, support and maintenance concepts, Test and Evaluation Strategy (TES), an Analysis of Alternatives, and Initial Capabilities Document (ICD) are then reviewed as an integrated package at Milestone A. This milestone has the benefit of highlighting the linkage of needed capabilities to required technologies that will be further matured in the follow-on Technology Development Phase.
    • Q: Would you expect to use systems engineering (SE) on an international program?
      • A: Absolutely. Use of systems engineering (SE) processes to manage technical program planning and execution is recognized by international standard, ISO 15288. This standard has been adopted (and is in the process of being tailored) by NATO for member country use.
  • Technical Reviews
    • Q: Where can I get the entrance criteria for the seven reviews during the System Development and Demonstration (SDD) phase of a program?
      • A: The Defense Acquisition Guidebook, Chapter 4, provides sample entrance criteria for all of the technical reviews. Additional information can be obtained on the Defense Acquisition University website, which has Technical Review Risk Assessment Checklists for all technical reviews that can also be used as a guide.
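
        As a purely hypothetical sketch (the criteria shown are illustrative placeholders, not the sample criteria from the Guidebook or the DAU checklists), a program team might track entrance criteria along these lines:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Criterion:
            description: str
            satisfied: bool = False

        @dataclass
        class TechnicalReview:
            name: str
            criteria: List[Criterion] = field(default_factory=list)

            def ready(self) -> bool:
                # Event-driven: the review convenes when all entrance
                # criteria are met, not when a calendar date arrives.
                return all(c.satisfied for c in self.criteria)

        pdr = TechnicalReview("Preliminary Design Review", [
            Criterion("Allocated baseline documented", satisfied=True),
            Criterion("Key risks assessed with mitigation plans"),
        ])
        print(pdr.ready())  # False until every criterion is satisfied
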
    • Q: Is there an effort to drive to one technical review process for all DoD activities? If not, why? We all deal with the same contractors.
      • A: DoD components vary in their organizational implementation of program and technical support and oversight. How an individual component would implement technical reviews and technical authority will vary according to organization, staffing, and other factors. Chapter 4 of the Defense Acquisition Guidebook, together with its references, offers guidance on technical reviews. Another good source of information is the Technical Review Continuous Learning Course offered by the Defense Acquisition University.
    • Q: Please elaborate with respect to expectations for technical reviews and 'independent' subject matter expert (SME) participation and well-established entry/exit criteria for reviews, e.g., for component-based developments acquired under multiple, simultaneous delivery orders, is there expectation of 'independent' participation for every System Requirements Review (SRR), Preliminary Design Review (PDR), etc. for every component (or 'smart' selection)?
      • A: The role of independent subject matter experts (SMEs) in technical reviews should be to critically assess the program as part of the program team during a peer review. Their value can be most evident if they have subject matter expertise in key areas of risk and are intimately familiar with the technical review process. The program manager and systems engineer gain the two-fold benefit of an independent view as well as specific advice at key points (preliminary design reviews (PDRs), critical design reviews (CDRs), test readiness reviews (TRRs), production readiness reviews (PRRs), etc.) in the program. Critical to technical reviews is the active participation of all appropriate stakeholders. These include the operational users, maintainers, testers, cost analysts, certification representatives, etc. More guidance on this topic is available in Chapter 4 of the Defense Acquisition Guidebook and in the Technical Review Continuous Learning Course available from the Defense Acquisition University.
    • Q: Give some specific examples of 'peers' and 'SMEs' who are available to assist program managers (PMs) in determining the adequacy of technical reviews.
      • A: Peers and subject matter experts can come from multiple sources. Ideally, programs should coordinate with the respective technical authority in identifying appropriate peers and subject matter experts that would best support a given review. This kind of support and component best practices on conduct of technical reviews are two examples of the value added by technical authority to programs.
    • Q: Independent Technical Reviews-How should the reviewers gain adequate insight into the program to provide meaningful feedback?
      • A: Technical reviews benefit from an integrated assessment: integrated in the sense that both program team personnel and independent (peer) reviewers jointly review all aspects of the program (cost, schedule, and performance) from a technical perspective. Peer reviewers bring a unique perspective to the program and can best contribute to the review if they are given access to technical baseline materials (the elements of the technical baseline under review) prior to the actual review. They should be "teamed" with their counterpart(s) on the program team and be briefed on the program prior to the review.
  • Systems Engineering versus Architecting
    • Q: What is the relationship between system architecting and systems engineering (SE)? Is the system architect the system engineer?
      • A: Architecting is a subset of the systems engineering (SE) process. The early phases of the SE process involve translation of capability requirements and the system concept into a system architecture, inclusive of the system's functional and physical partitioning and its internal and external interfaces. This process is iterative at any level and recursive across system abstraction levels. The chief engineer of a system is usually also considered the chief architect of the system being developed.
    • Q: What are the similarities and differences between system-of-systems (SoS) architecting and SoS engineering? Are these significant? Should we care?
      • A: The fundamental notion of systems engineering and that of architecting, as a component activity within the overall SE process, remains the same for systems and for system-of-systems.
    • Q: What is the best way to tie architecture development back into the systems engineering (SE) process?
      • A: Internal and external interface definition and tracking, and the notion of developing a logical and physical structure, is probably the best natural way to link system architecture into SE processes, and vice versa.
    • Q: System Architectures: What is the future of system architecture in DoD? Are integrated architectures across DoD achievable?
      • A: Battle Force interoperability is mandatory for combat synergy among the Services. Integrated architectures and system commonality are achievable. The Department is developing the DoD Architectural Framework, Version 2.0. This work updates the method to define the operational, system, and technical views of systems and systems-of-systems. This methodology is being adopted by industry. Over time, SE tools will continue to be developed to support automation of this effort.
    • Q: Is systems engineering (SE) also the system-of-systems (SoS) architect?
      • A: Clearly, systems engineering should guide the architecting of a system-of-systems (SoS). The systems engineer at the program level, however, is generally not the SoS systems engineer, and whether the SoS systems engineer also serves as the SoS architect is a matter of choice.
  • SE Organization, Roles, Responsibilities including Technical Authority, and Capabilities
    • Q: Is there a recommended organizational structure that makes systems engineering (SE) work better and function more efficiently?
      • A: There is no single best way to organize for systems engineering efficiency and effectiveness. Every program is different, with widely varying scope and risk. However, the following approaches may serve to help in the application of systems engineering across a complex program (a schematic sketch follows the list). A program should have:
        a) A lead or chief systems engineer responsible to the program manager for technical issues;
        b) Assigned systems engineers for each major integrated product team (IPT) responsible to the lead SE for implementation of sound SE practices and to the IPT lead for the IPT's technical product; and
        c) A Systems Engineering and Integration Team (SEIT) led by the lead or chief systems engineer and staffed by subordinate systems engineers. The SEIT is responsible for integration of SE both horizontally and vertically in the program organization. It also assists the program manager (PM) in oversight of the program's technical baseline products across the work breakdown structure (WBS).
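
        As a schematic sketch only (the names and team labels are hypothetical, and no particular structure is mandated), the relationships above might be modeled as:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class IptSystemsEngineer:
            name: str
            ipt: str  # the integrated product team this engineer supports

        @dataclass
        class SEIT:
            # Led by the lead or chief systems engineer; integrates SE
            # horizontally (across IPTs) and vertically (across the WBS).
            chief: str
            members: List[IptSystemsEngineer] = field(default_factory=list)

        seit = SEIT("Chief Systems Engineer", [
            IptSystemsEngineer("Lead SE, Airframe IPT", "Airframe"),
            IptSystemsEngineer("Lead SE, Avionics IPT", "Avionics"),
        ])
        print(seit.chief, "leads", len(seit.members), "IPT systems engineers")
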
    • Q: The size, scope, and complexity of DoD acquisition programs vary. What are your views on determining the size of the government systems engineering (SE) organizations within a program office? Are there some tried and true measures for right-sizing a program's SE staff?
      • A: There are no "tried and true" ways of determining the optimum size of either the government "SE organization" or the government program office. These factors are heavily driven by the scope and risk of the program, the acquisition strategy, and the executing command's technical and business practices for systems acquisition. In general, as uncertainty increases, the role of government management and oversight increases. With this expanded role, the government's capability to perform the necessary oversight functions is directly dependent on a technical staff that is able to address areas of significant technical risk across the entire program. One metric for determining the size of a program team is to assess the ability of the team to effectively address all of the technical work in a way consistent with the practices described in Chapter 4 of the Defense Acquisition Guidebook. If the program team is not staffed to execute these practices, the program will likely encounter cost, schedule, and performance shortfalls beyond those planned for and anticipated.
    • Q: What are the government systems engineering (SE) responsibilities during system development? Is it just management?
      • A: Government systems engineering (SE) should involve more than just management. Government systems engineers must ensure there is sufficient technical insight on programs to continually assess cost, schedule, and performance requirements and that risks are within mutually acceptable bounds. This is achieved through a rigorous technical review process where the technical baselines are developed, matured, and managed across the system life cycle.
    • Q: What is the proper relationship between government and contractor engineers?
      • A: There is no set answer to this question as the relationship can vary depending on the contractual arrangements on the program. In general, good systems engineering requires the participation of both government and contractor personnel (engineers, logisticians, program managers, integrated product team (IPT) leads, testers, user representatives, etc.). The key is to lay out in the Systems Engineering Plan (SEP) what the planned roles and responsibilities are for how the government will technically oversee, how the contractor will technically manage, and how the program will be technically executed as an integral part of an applied systems engineering process.
    • Q: Systems engineering (SE) policy requires each PEO to have an assigned lead or chief systems engineer, who is required to assess subordinate lead engineers. Does this mean the PEO chief systems engineer is now the performance rater or in the chain of command of the PM's lead analysts?
      • A: USD (AT&L) policy states, "The PEO lead or chief systems engineer shall also assess the performance of lead or chief systems engineers assigned to individual programs in conjunction with the PEO and program manager." The intent of this policy is to ensure the PEO lead or chief systems engineer has direct input to the performance assessment of lead or chief systems engineers on subordinate programs and be part of the performance rating process. Chain of command and organizational relationships will vary from command to command and across the DoD components.
    • Q: Program managers (PMs) disagree that the Program Executive Officer (PEO) chief engineer should assess the PM's technical chief or lead systems engineer. Why is this required?
      • A: The lead or chief systems engineer assigned to a program should be primarily responsible for sound application of engineering and systems engineering on that program. Their performance should be assessed accordingly. The PEO lead or chief systems engineer is in the best position to know what engineering and systems engineering is required on their assigned programs.
    • Q: Is OSD driving toward matrix management of engineers with the functional lead being the Program Executive Officer (PEO) chief system engineer?
      • A: We are trying to drive technical discipline back into programs. To do this, there needs to be a systems engineering process focus within the executing organizations, and hence the policy mandate for a Program Executive Officer (PEO) lead or chief systems engineer. How this is specifically implemented is up to the Services and Agencies.
    • Q: Recognizing limited resources is a universal issue, where is the Working Integrated Product Team (WIPT) going to get appropriate subject matter experts (SMEs) to 'help?' How can you work to ensure programs get the needed people to execute successfully?
      • A: One purpose behind systems engineering (SE) revitalization, the SE guidance in the Defense Acquisition Guidebook, and the review of Systems Engineering Plans (SEPs), is to get better collective insight as to the relationship between program staffing and program risks. Discussion of the specifics of applied SE on a program can lead to a better understanding of the implications of program staffing and resourcing shortfalls and how these shortfalls impact the program. Properly conducted technical reviews should identify staffing and resource shortfalls as part of the risk assessment process. The earlier in the acquisition cycle these shortages are identified and corrected, the greater the likelihood of program success.
    • Q: What organization should define what systems make up a family or system of systems and with what process?
      • A: The DoD is implementing a top-down, capabilities-based approach in its strategy, planning, requirements, acquisition, and budgeting processes. Definition of family- and system-of-systems (FoS/SoS) is done in partnership between the Joint Staff, which defines military capability needs, and OUSD(AT&L), which provides the materiel solutions to implement needed capabilities. In the complexity of the current battle space, capabilities often require cooperative engagement by multiple systems implemented by different Services working together in a joint context. Both the new DoD requirements process under JCIDS and the acquisition process, with Capability Area Reviews and Roadmaps, recognize this need to support broader capabilities with FoS/SoS.
    • Q: Are there lessons learned available to define the role and relationship of technical authority (government) to the program system engineering?
      • A: We are not aware of any lessons learned addressing the specific aspect of technical authority as applied to major systems acquisitions. Technical authority is receiving significant attention by the Navy, but is not a well-known concept across the acquisition community. We are working via the Systems Engineering Forum to define the prudent next steps on this topic.
    • Q: What are your views with regard to the relationship of technical and program authorities?
      • A: The respective roles of technical and program management authority vary from organization to organization. Moreover, the role of technical authority versus program management authority will vary greatly according to the technical issue at hand (e.g., recommendations to suspend operations due to safety concerns versus program budget decisions). From a systems engineering best practice perspective, programs need to provide a venue for technical assessment of a program's technical maturity and cost/schedule/performance risks against objective measures, unimpeded by overarching program management priorities and serving as the program's technical conscience. This, in turn, is a valuable input to the program management process, where the realities of budget, need date, and capabilities can be further assessed against those objective measures. Close collaboration across the technical and program management authorities is critical to achieve a balanced solution based on known, honest data.
    • Q: The systems engineering (SE) policy states that the chief systems engineer (CSE) will chair program reviews. This implies approval/disapproval authority. What is the relationship between the CSE and the program manager (PM) with Title X authority for the program?
      • A: Chairing does not imply approval/disapproval authority. Chairing implies a responsibility to correctly apply, through technical leadership, the technical review process to the program at hand. As described in Chapter 4 of the Defense Acquisition Guidebook, technical reviews are essentially an integrated assessment of a program's technical maturity against objective measures and are intended to be a product to the program manager. The program manager, as a part of the review and as a member of the technical review board, approves any action items stemming from the review. Reviews themselves are not "approved" or "disapproved," but rather completed when the approved action items are closed out.
    • Q: Does the SEP describe how technical authority will be implemented on the program to address the full spectrum of program requirements? What does this mean? What information is being requested?
      • A: For certain requirements (e.g., airworthiness certification or SUBSAFE certification) there is a cognizant technical authority within the component having certain delegated responsibilities. The nature of these responsibilities and the system requirement to which they pertain vary significantly across programs and components. The relationship and engagement mechanism of technical authority to programs is foundational to the accomplishment of sound systems engineering.
    • Q: Need a clear, single definition of systems engineering (SE) versus program management and work breakdown structure (WBS) elements. Which are SE versus other areas?
      • A: Systems engineering focuses on the application of sound engineering principles and activities to optimize system performance and minimize ownership costs. The program manager (PM) has the broad, fundamental responsibility of balancing the many programmatic factors that influence cost, schedule, and performance, which include not only the technical aspects of the program, but also areas such as budgeting, contracting, and support. The PM should rely on systems engineering to deal with performance and its impact on cost. The lead or chief systems engineer serves as the technical conscience of the PM. The work breakdown structure (WBS) summarizes data for management, including the PM and systems engineer, and provides projected, actual, and status information on the elements for which they are responsible. The WBS provides the planned basis to keep the program's status constantly visible so that the PM and systems engineer, in cooperation with the contractor, can identify and implement changes necessary to assure desired performance.
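
        As a toy illustration (the element names and dollar values are hypothetical; real WBS element numbering typically follows MIL-HDBK-881), a WBS roll-up that keeps cost status visible to the PM and systems engineer might look like:

        WBS = {
            "1.1 Air Vehicle":                {"planned": 120.0, "actual": 131.5},
            "1.2 SE/Program Management":      {"planned": 30.0,  "actual": 28.0},
            "1.3 System Test and Evaluation": {"planned": 25.0,  "actual": 27.5},
        }

        planned = sum(e["planned"] for e in WBS.values())
        actual = sum(e["actual"] for e in WBS.values())
        print(f"Roll-up: planned ${planned:.1f}M, actual ${actual:.1f}M")
        for name, e in WBS.items():
            status = "over plan" if e["actual"] > e["planned"] else "within plan"
            print(f"  {name}: {status}")
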
    • Q: I understand that Capability Maturity Model Integrated (CMMI)-level X in and of itself means nothing. But an organization that is truly CMMI Level 3 or 4 demonstrates mature institutional systems engineering (SE) processes. We ask contractors to achieve this. Why isn't there a push for Program Executive Officers (PEOs) to adopt CMMI and become 'certified?'
      • A: The most frequent issue is that different organizations within a company may have different Capability Maturity Model Integrated (CMMI) levels. Further, under the current system, organizations may have different certifications for different activities. As a minimum, when a government organization asks about a company's CMMI level, they should ask for the lead assessor's disclosure statement that documents the particulars of that assessment, including the specific part of the company to which the rating applies. There is a new tool under consideration called Capability Maturity Model Integrated-Acquisition Module (CMMI-AM) that is more appropriate for a PM or PEO than CMMI. It is self-administered to allow a PM or PEO to examine their capability and take actions they deem appropriate. We have recently conducted a pilot project with CMMI-AM and may conduct a few more pilots.
  • Industry Roles, Responsibilities, and Capabilities
    • Q: In industry, what is the role of analysts (cost, schedule, performance) during stages of the life cycle as compared to role of managers and engineers?
      • A: The analysts are engaged throughout the program and they provide direct support to the program managers and program team on a continuous basis. "Performance analysis" isn't necessarily accomplished by a separate group of people; our systems engineers and test engineers are continuously analyzing performance. The role of the cost analyst is to collect the costs and facilitate the earned value management process, but this is always performed in conjunction with the lead functional manager and often the systems engineer(s). The earned-value-management process includes schedule aspects.
    • Q: You stated that when the government program manager (PM) requires Lockheed Martin to change a tool, this causes problems within your company's systems engineering (SE) process. How would I know that your SE process is the 'best of breed' and will be more than adequate than Northrop Grumman's, Boeing's, etc.? Every company has their own tools, processes, and talent, etc. They have spent much funds to put the infrastructure in place.
      • A: I might have used "Lockheed Martin" as an example, but the statement is universal. The determination of specific processes, and the tools used to support them, is subjective to a great degree. The major defense companies have been developing and applying engineering processes for decades. They gain their knowledge by use, generating experience, and by interfacing with other defense contractors, including other "primes" as well as a myriad of subcontractors. Generally, there is a high degree of consistency among these defense contractors, meaning that there is reasonable consensus on what constitutes a "good" or "best" practice. Boeing, for example, has a "Best Practices" Vice President, and they ensure their aggregate best practices, learned from decades of experience, are followed by their major development subcontractors. Typically, however, such "Best Practices" are a description of WHAT is to be done, and not necessarily HOW to do it. The WHAT is a requirement; the HOW is the specific implementing process. It is changing the HOW (the specific process) that causes trouble, since the HOW is institutionalized within each contractor. Of critical importance is ensuring the "best practices" are being applied to the program at hand. This requires technical oversight by both the acquirer and supplier.
    • Q: How stretched is industry to provide senior systems engineers to lead a program? Does the lead systems engineer need to have detailed knowledge/understanding of components' desired functionality?
      • A: There is definitely a shortage of experienced lead systems engineers within the defense community. All defense contractors are working hard to fill this shortfall. Approximately 3-4 years ago, before the dot-com bust, we had the same problem with obtaining skilled software engineers. Industry can generally meet the systems engineering (SE) requirements on programs by judiciously sharing their talent across multiple programs. And yes, the best SE practitioners most assuredly have comprehensive knowledge and understanding of the components' desired functionality. The best SE talent has multiple-domain, multi-discipline experience.
  • SE Tools
    • Q: SE Tools: What is available? What is preferred? What is needed?
      • A: Without any specifics, we suggest you refer to the Defense Acquisition Guidebook, Section 4.5, and the Systems Engineering Community of Practice (SE CoP) for possible insights.
    • Q: How do we achieve better utilization of distributive collaborative environments in government and industry for systems engineering (SE) implementation and execution?
      • A: How a program implements the systems engineering approach should be based on the nature and complexity of the program in question. If a distributive collaborative environment facilitates the management of the program, then by all means use it.
  • Design Considerations
    • Q: How do we focus logistics into the systems engineering (SE) process?
      • A: The Defense Acquisition Guidebook, Chapters 4 and 5, Systems Engineering and Life Cycle Logistics, respectively, were prepared in parallel following the same general structure to highlight the close relationship of logistics to SE as a design characteristic. This relationship is portrayed graphically on the recently published (see the DAU web site) "Integrated Defense Acquisition, Technology, and Logistics Life Cycle Management Framework" also known as the "wall chart."
    • Q: Why is OSD emphasizing Reliability and Maintainability at program milestone reviews and in the technical reviews?
      • A: System reliability and maintainability are very strong drivers of cost over the total life cycle. Too often, we are finding that reliability and maintainability are traded for other performance factors or are used as a way of limiting upfront development costs, without full understanding of the longer-term effects. The purpose of highlighting these areas in reviews, both programmatic and technical, is to ensure all stakeholders have fully analyzed design choice implications from a total life cycle cost perspective. Reliability also is a key metric for technical maturity.
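
        A back-of-the-envelope sketch (all numbers hypothetical) of why trading reliability away to limit upfront cost can raise total life cycle cost: expected repair demand over the operating life scales inversely with mean time between failures (MTBF).

        def expected_repair_cost(operating_hours, mtbf_hours, cost_per_repair):
            # Expected number of corrective-maintenance actions = hours / MTBF.
            return (operating_hours / mtbf_hours) * cost_per_repair

        LIFE_HOURS = 20_000        # assumed operating hours over the system life
        COST_PER_REPAIR = 50_000   # assumed average cost of one repair action

        for mtbf in (200, 400):    # two hypothetical reliability levels, in hours
            cost = expected_repair_cost(LIFE_HOURS, mtbf, COST_PER_REPAIR)
            print(f"MTBF {mtbf} h -> expected repair cost ${cost:,.0f}")

        Doubling the assumed MTBF from 200 to 400 hours halves the expected corrective-maintenance cost from $5,000,000 to $2,500,000 in this toy example, which is the kind of longer-term effect the review emphasis is meant to surface.
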
    • Q: How well is Human Systems Integration addressed in DoD/defense systems as part of the systems engineering (SE) process?
      • A: Human Systems Integration (HSI) is one of many design considerations (see Defense Acquisition Guidebook, Section 4.4) that must be addressed during system design. It figures into the balanced design solution developed during trade studies, and to the extent that HSI is specified by statute, it may mandate certain system designs. This is discussed in more detail in Defense Acquisition Guidebook, Section 4.4.10, and Chapter 6, Human Systems Integration.
    • Q: Is total ownership cost (TOC) a key metric in Human Systems Integration (HSI)? Why is HSI still not given the emphasis needed? How well is HSI understood?
      • A: DoD Directive 5000.1 states in E1.27, "Systems Engineering. Acquisition programs shall be managed through the application of a systems engineering approach that optimizes total system performance and minimizes total ownership costs." Human Systems Integration (HSI) is one of many competing design considerations that must be addressed in translating user capability needs into quantifiable, measurable system requirements. TOC is a key metric for ALL competing system considerations. In some instances, however, there are statutes that preclude trading operator safety for cost, and the program manager (PM) should be mindful of these during system trade studies. We have no empirical evidence that HSI is not emphasized in system acquisition. HSI is applied by an amalgamation of specialized disciplines. The key to appropriate HSI considerations in system design is the active participation of these experts in program Integrated Product Teams (IPTs) to help other IPT participants understand and balance the application of HSI in system design and integration.
  • Oversight and Program Support Reviews
    • Q: How can OSD streamline the oversight bodies (Program Support Review (PSR), Integrating IPT (IIPT), Over-arching IPT (OIPT), NII OIPT): combined reviews versus individual assessments?
      • A: Program Support Reviews (PSRs), Integrating IPTs (IIPTs), Over-arching IPTs (OIPTs), and Defense Acquisition Boards (DABs) all have the same goal: to foster program success and reduce program risk. They are conducted with varying levels of participation, with each striving to reduce risk and issues before going forward to the next level. They are neither separate nor duplicative. They are a continuum of activity focused on a common goal of delivering capability to the warfighter. The PSRs are conducted to support the IIPT/OIPT/DAB/Information Technology Acquisition Board (ITAB) meetings. The findings, risks, and recommendations are adjudicated with the program manager (PM) and lead systems engineer prior to briefing the OUSD(AT&L)/Defense Systems leadership. A summary of the adjudicated strengths, risks, and recommendations is then briefed to the IIPT and OIPT. The focus remains to assist the PM in achieving success.
    • Q: Can you clarify the role of the Networks and Information Integration (NII) Over-arching Integrated Product Team (OIPT) versus the role of NII representatives who may not speak with one voice at program (AT&L) OIPTs?
      • A: The NII Over-arching Integrated Product Team (OIPT) performs the same functions for the Information Technology Acquisition Board (ITAB) as the Defense Systems OIPT performs for the Defense Acquisition Board (DAB). Each OIPT facilitates communication and vets issues before the ITAB or DAB meets. In contrast, NII representatives are members of the IIPT process supporting both OIPTs. Their working level views get consolidated and presented as a position at the Defense Systems and NII OIPTs. Moreover, NII members have different functional responsibilities in the IIPT/OIPT process, e.g., one representative might address Clinger-Cohen compliance, while another addresses spectrum management, and yet a third representative might look for Global Information Grid (GIG) compliance or interoperability issues.
    • Q: The experience of one ACAT I program is that agreements and guidance in Working Integrated Product Teams (WIPTs) are later overruled/abrogated at the Integrating IPT (IIPT) level, followed by later upsets and reversals at the Over-arching IPT (OIPT). The Integrated Product Team (IPT) process is more an ordeal to be endured than effective support and oversight. Can this be fixed?
      • A: We regret that this may have happened. The intent of the IPT process is to avoid such occurrences.
    • Q: How can we make the OSD-led program support assessments less intrusive and more useful to PMs as they plan and carry out technical management? Why are the program support reviews conducted at Milestone Reviews instead of earlier when the processes are being implemented?
      • A: OUSD(AT&L) Defense Systems, Systems Engineering, Assessments and Support (OUSD(AT&L) DS/SE/AS) conducts program support reviews (PSRs) for two purposes. One purpose is to assist program managers (PMs) by offering an independent, cross-functional review by an interdisciplinary team that has a wide range of experience on both similar and diverse programs. These reviews are most useful when conducted well in advance of any major decision points. Ideally, a review is conducted when the program management office (PMO) is formulating plans or early in execution so the PM can consider the recommendations developed by the review team and implement them at his/her discretion. The least intrusive reviews are those scheduled to coincide with similar activities that the PMO will execute regardless of the requirement for a PSR. For example, if we schedule a PSR to immediately precede or follow a System Requirements Review (SRR), almost all of the people and documents required for the PSR will have already been assembled for the SRR. The key is early and open coordination and communication by all parties. The second purpose is to provide OSD decision makers additional insight into program execution and risks to support critical acquisition decisions as part of the Defense Acquisition Board (DAB) oversight process. However, even PSRs conducted shortly before a DAB are focused more on providing actionable recommendations to the PMO than on oversight.
    • Q: Why do you feel the program support review (PSR) isn't another level of oversight on top of the Working Integrated Product Team (WIPT)/Integrating IPT (IIPT)/Over-arching IPT (OIPT) process? I thought that OSD promulgated policy and the Services/Agencies executed.
      • A: The program support review (PSR) is not another layer of oversight; it is a tool we use as part of the oversight process. The same OUSD(AT&L) DS/SE/AS representative who participates in the Working Integrated Product Team (WIPT)/Integrating IPT (IIPT) and assists in preparing the principals for the Over-arching IPT (OIPT) and Defense Acquisition Board (DAB) leads the PSR. OSD, by law, makes decisions (through the DAB process) on major programs. OSD decision makers require insight into program execution, risks, and the likely outcomes of various alternatives to make critical acquisition decisions. This independent review process has been shown to benefit programs as well.
    • Q: What is the role of OSD versus the Services/Agencies regarding the conduct of independent program support reviews or assessments? What additional systems engineering (SE) oversight do the Services/Agencies need to do to reduce OSD oversight?
      • A: As part of the SE reorganization, OSD initiated the program support review (PSR) process to improve program executability. In some cases, Services can participate if they make a request. The intent is for the Services and Agencies to adopt the PSR process and assume this role once the cognizant acquisition organization provides adequate oversight in these areas.
    • Q: Discuss the AT&L/SE program support review process in more detail: Who does the briefing? Who invites/attends, etc.? What is the agenda?
      • A: Program support reviews (PSRs) are guided by two key planning documents, the Defense Acquisition Program Support (DAPS) methodology and the PSR process. Combined, these products provide the Program Support Team Lead (PSTL) and the PSR team members with a repeatable, tailorable approach to conducting the review. The structure and depth of the review is based on the nature of the program, where it is in the acquisition process, and issues the program may be facing. The process begins with coordination between the PSTL and the program management office (PMO) to shape the issues and the scope of the review. The PSTL prepares a PSR plan and coordinates it with the program manager (PM). This plan will specify the briefings and agenda and recommend types of participants from the PMO and its supporting organizations. OUSD(AT&L), DS/SE will post a sanitized program review plan on the systems engineering (SE) web site.
    • Q: It was stated that DoD risk management is not very mature; however, your OSD program support review (PSR) findings indicate some programs do risk management very well. What are these programs doing right? How do the rest of us learn from them?
      • A: DoD systems acquisition programs that have a mature risk management program will:
        a) Have an Acquisition Strategy that is based on a technical understanding of the problem at hand, including identification of all major program risks.
        b) Use the technical baseline to continually assess technical risk.
        c) Have a defined process for identifying, analyzing, developing mitigation plans, implementing those plans, and monitoring risk mitigation effectiveness.
        d) Monitor risk using system development/program metrics, Technical Performance Measures (TPMs), Critical Technical Parameters, Measures-of-Effectiveness (MOEs), Measures-of-Suitability (MOSs), and Measures-of-Performance (MOPs). Update frequency, tracking depth, response time to generate recovery plans, and planned profile revisions should be a standard part of measurement activities.
        e) Integrate their systems acquisition program's systems engineering approach with the program's risk management effort. Technical reviews should provide technical risk assessment input to the risk management process, and it should be evident how the lead or chief systems engineer uses each technical review to assess risk.
        f) Have a linkage between the systems engineering (SE) technical risk assessment and mitigation efforts and the overall risk management process.
        A joint Government and Contractor collaborative relationship and a feedback mechanism between the SE and risk management processes should exist. During our program support reviews (PSRs), we've seen that some programs have noteworthy risk management programs that identify and track their risk mitigation efforts. On the other hand, we've seen some programs present textbook risk management programs during PSRs or in their Systems Engineering Plans (SEPs), only to find that they do not have a formal risk management program in place at all to identify, resource, and mitigate risks. We suggest you refer to the Risk Management Guide and Risk Management Community of Practice for further insight. A minimal sketch of a risk register embodying this cycle appears below.
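        The sketch below is illustrative only and reflects nothing beyond the identify-analyze-mitigate-monitor cycle described above; the class names, fields, and the simple likelihood-times-consequence exposure score are hypothetical conventions, not prescribed by DoD policy or the Risk Management Guide.

```python
# Illustrative sketch only: a minimal risk register reflecting the
# identify -> analyze -> mitigate -> monitor cycle described above.
# All names and the exposure formula are hypothetical, not DoD guidance.
from dataclasses import dataclass, field

@dataclass
class Risk:
    title: str
    likelihood: int        # analysis input: 1 (remote) .. 5 (near certain)
    consequence: int       # analysis input: 1 (minimal) .. 5 (severe)
    mitigation_plan: str = ""   # resourced mitigation plan, if one exists
    retired: bool = False       # set True once mitigation is verified

    @property
    def exposure(self) -> int:
        # A common simple scoring: exposure = likelihood x consequence.
        return self.likelihood * self.consequence

@dataclass
class RiskRegister:
    risks: list = field(default_factory=list)

    def identify(self, risk: Risk) -> None:
        self.risks.append(risk)

    def unmitigated(self) -> list:
        # Flag open risks that still lack a resourced mitigation plan,
        # the "textbook plan, no real program" failure noted above.
        return [r for r in self.risks
                if not r.retired and not r.mitigation_plan]

    def top_risks(self, n: int = 5) -> list:
        # Monitoring: rank open risks by exposure for each technical review.
        open_risks = [r for r in self.risks if not r.retired]
        return sorted(open_risks, key=lambda r: r.exposure, reverse=True)[:n]

register = RiskRegister()
register.identify(Risk("Immature guidance algorithm", 4, 5,
                       mitigation_plan="Prototype and lab-test before CDR"))
register.identify(Risk("Supplier staffing shortfall", 3, 3))
for risk in register.top_risks():
    print(f"{risk.title}: exposure {risk.exposure}")
print("Risks lacking mitigation plans:",
      [r.title for r in register.unmitigated()])
```
        The point of the sketch is only that each step of the cycle has an explicit, checkable artifact; on a real program the register would be tied to the program's metrics and its event-driven technical reviews.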
    • Q: Where are you posting lessons learned from Systems Engineering Plan (SEP) assessments? Where do you post the program support review (PSR) and SEP review schedule?
      • A: The processes, both the development of SEPs and the review of SEPs, have not yet produced enough data to develop statistically significant patterns of lessons learned. However, general lessons learned from Systems Engineering Plan (SEP) reviews will be posted at the following link: http://www.acq.osd.mil/se/as/publications.htm by July 30, 2005. You should review and analyze Chapter 4 of the Defense Acquisition Guidebook to ensure the recommended guidance is fully addressed in the program's SEP. At this point we can only share anecdotal examples of our observations at various presentations, ranging from individual meetings with program management offices (PMOs) to the DAU-hosted PEO SYSCOM Conference. We also provide customized training to program managers (PMs) and their systems engineers to assist in developing SEPs and provide those lessons learned that may be useful to that PMO for that SEP. We also use the observations of the review teams to help guide enhancements to the Defense Acquisition Guidebook, the SEP Preparation Guide, and DAU-developed education and training programs. Program support review (PSR) schedules are coordinated with PSTLs and PMOs. We review SEPs as soon as feasible after receiving them. We strive to complete the review of SEPs for pre-DAB approval within 30 days.
    • Q: How are different teams from OSD who perform assessments of Systems Engineering Plans (SEPs), Capability Maturity Model Integrated-Acquisition Management (CMMI-AM), and Defense Acquisition Program Support (DAPS) coordinated to provide common perspectives?
      • A: Our organization is structured around the concept of "presenting one face to the customer." The Program Support Team Lead (PSTL) whose portfolio includes your program is responsible for all DS/SE activities related to that program, including Systems Engineering Plan (SEP) approval; Test and Evaluation Strategy (TES) and Test and Evaluation Master Plan (TEMP) approval; program support reviews (PSRs); DAES reporting for DT&E; IIPT/OIPT/DAB support; and coordination of all OSD staffed/approved program documents, e.g., the Acquisition Strategy Report (ASR), Acquisition Decision Memoranda (ADMs), and Over-arching IPT (OIPT) reports. As part of program support, we use a separate team of experts to help program managers (PMs) and Program Executive Officers (PEOs) conduct Capability Maturity Model Integrated-Acquisition Management (CMMI-AM) self-assessments as necessary.
  • Systems Engineering Plan Review Process
    • Q: Can you give us clear criteria for OSD approval of Systems Engineering Plans (SEPs)? Individual reviewers appear to offer opinions that we interpret as pass/fail criteria for SEP approval, but they may not be. If there are no clear criteria, this should be made clear to OSD reviewers.
      • A: The only hard criteria for a Systems Engineering Plan (SEP) are contained in the USD(AT&L) Memoranda "Policy for Systems Engineering in DoD," dated February 20, 2004, and "Policy Addendum for System Engineering," dated October 22, 2004. The first memo requires that the SEP "describe the program's overall technical approach, including processes, resources, metrics and applicable performance criteria. It shall also detail the timing, conduct, and success criteria of technical reviews." The addendum adds a requirement to have a lead systems engineer at the Program Executive Officer (PEO) level or equivalent and amplifies the policy on technical reviews (i.e., event-driven based on entrance criteria and participation by independent subject matter experts). OUSD(AT&L), DS/SE published the SEP Preparation Guide, which is guidance intended to help program managers (PMs) prepare a SEP that is consistent with the above policy. Subsequently, we prepared the SEP Focus Areas for Technical Planning framework for two purposes: 1) to further assist PMs in preparing their SEPs and 2) to guide OSD reviewers in determining if the SEP answers the most critical questions necessary to comply with the policy. The SEP Preparation Guide and the framework may be accessed at: http://www.acq.osd.mil/se/publications.htm.
    • Q: What are the key checklist review items that OSD staff use to assess a Systems Engineering Plan (SEP)? Is this online?
      • A: OSD uses the Defense Acquisition Guidebook; the Systems Engineering Plan (SEP) Preparation Guide, version 0.95; and the SEP Focus Areas for Technical Planning framework supporting Milestone B. We will soon publish two additional frameworks for Milestones A and C. These products can all be found at http://www.acq.osd.mil/se/publications.htm
    • Q: We heard much support for flexibility in the organization of Systems Engineering Plans (SEPs), but in an actual review, reviewers indicated that the 'proposed' outline was not followed.
      • A: Our interest is in systems engineering planning. The Systems Engineering Plan (SEP) is to document that planning. We do not care what format or outline is used as long as we can find the information. Our original plan was not to publish any guidance or outline for preparing a plan, so systems engineers would focus on their planning and not try to force fit a one-size-fits-all document. After many requests from the field, we published a preparation guide, which includes a sample outline. The outline and, more importantly, the SEP Preparation Guide were written to give guidance on what should be considered when writing a SEP, not on what format to use.
    • Q: What is the 'planned' process for getting a Systems Engineering Plan (SEP) reviewed and approved?
      • A: The planned process for getting a Systems Engineering Plan (SEP) reviewed and approved is a 30-day process, which includes: process control, SEP review by staff technical experts, adjudication of review comments by a SEP adjudication board, and routing of the SEP package to the appropriate signatories.
    • Q: What time is required to get Systems Engineering Plans (SEPs) reviewed by OSD and returned with comments? We need commitments to make our milestone schedules.
      • A: We strive to complete staff approvals for formal Systems Engineering Plan (SEP) submissions within 30 days from receipt of the document. Early involvement and participation in Systems Engineering (SE) Working Integrated Product Teams (WIPTs), as well as reviews of draft SEPs, will reduce turn-around time and rework.
    • Q: The number of Systems Engineering Plans (SEPs) to be reviewed by OSD looks ominous. How long should programs allocate in their schedule for OSD to review the SEP and return comments? We should plan for how many OSD reviews of the SEP?
      • A: We strive to complete staff approvals for formal Systems Engineering Plan (SEP) submissions within 30 days from receipt of the document. Early involvement and participation in Systems Engineering (SE) Working Integrated Product Teams (WIPTs), as well as reviews of draft SEPs, will reduce turn-around time and rework.
    • Q: How can OSD reviews of Systems Engineering Plans (SEPs) be expedited (e.g., obtain at least initial feedback within 2 weeks)?
      • A: Our Program Support Team Leads (PSTLs) are assigned between 12 and 32 programs each and typically conduct several Program Support Reviews (PSRs) and Systems Engineering Plan (SEP), DAES, and Test and Evaluation Master Plan (TEMP) reviews concurrently. Hence, it is very difficult to promise comments on a document within 2 weeks. Also, to ensure consistency in SEP comments, a panel drawn from several groups reviews, discusses, and prepares the comments provided on draft SEPs. This process typically takes 30 days.
    • Q: Is there web access to the Systems Engineering Plan (SEP) review schedules and lessons learned?
      • A: We do not publish the review schedule. We have Program Support Team Leads assigned to each major program who serve as the single face to the program. A review of a program SEP occurs only when a program chooses to send it to OSD for review (generally when getting close to a decision point in the life cycle). For lessons learned, SEP Frequently Asked Questions (FAQs) will be posted to the DS/SE website to answer common questions.
    • Q: What is the process for getting feedback and lessons learned on Systems Engineering Plan (SEP) reviews?
      • A: As part of its support role, OUSD(AT&L) Defense Systems, Systems Engineering, Assessments and Support (OUSD(AT&L) DS/SE/AS) provides feedback on out-of-cycle draft Systems Engineering Plans (SEPs) directly to the requesting organization. If a program manager (PM) sends us a draft SEP for review, we respond directly to that PM. If a Component Acquisition Executive (CAE) provides a SEP to us for approval, we respond to that CAE. For lessons learned, SEP Frequently Asked Questions (FAQs) will be posted to the DS/SE website to answer common questions.
    • Q: What are the key shortcomings of submitted Systems Engineering Plans (SEPs)?
      • A: The most frequent shortcomings in submitted Systems Engineering Plans (SEPs) at this time are:
        1. Incomplete discussion of program requirements; missing categories such as statutory requirements, regulatory requirements, or certifications;
        2. Minimal discussion of program Integrated Product Teams:
        - Need to identify the technical authority, lead systems engineer, and key stakeholders
        - Addresses only part of the systems engineering (SE) organization, such as the prime contractor, with no mention of government, subcontractors, or suppliers;
        3. Incomplete technical baseline:
        - How does the program go from the Capabilities Development Document to product (traceability)?
        - Linkage to Earned Value Management; without it, technical maturity cannot be measured via the baselines;
        4. Incomplete discussion of technical reviews:
        - How many, for what (should tie to baselines and systems/subsystems/configuration items), and by whom (should tie to staffing)?
        - Lacking specific entry criteria
        - Missing peer reviews;
        5. Weak integration with other program management planning:
        - Linkage with the Integrated Master Plan (IMP), Integrated Master Schedule (IMS), logistics, testing, and risk management
        - Schedule adequacy: success-oriented vice event-driven; schedule realism
        - Contracting for SE.
        A toy self-check against these topics is sketched below.
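        The toy self-check below is purely illustrative: it scans a draft SEP's text for the frequently missed topics above using naive substring matching. The topic strings are hypothetical shorthand for the shortcomings listed; OSD reviewers work from the SEP Preparation Guide and the Focus Areas framework, not an automated scan.

```python
# Toy sketch only: scan a draft SEP's text for frequently missed topics.
# The topic list paraphrases the shortcomings above; the naive substring
# matching is purely for illustration.
REQUIRED_TOPICS = [
    "statutory",               # 1. requirements categories
    "regulatory",
    "certification",
    "technical authority",     # 2. IPTs and the SE organization
    "lead systems engineer",
    "subcontractor",
    "technical baseline",      # 3. baselines and traceability
    "traceability",
    "earned value",
    "entry criteria",          # 4. technical reviews
    "peer review",
    "integrated master plan",  # 5. program management linkage
    "risk management",
]

def sep_self_check(sep_text: str) -> list:
    """Return the topics not mentioned anywhere in the draft SEP text."""
    lowered = sep_text.lower()
    return [topic for topic in REQUIRED_TOPICS if topic not in lowered]

draft = ("Our SEP defines the technical baseline, entry criteria for each "
         "technical review, and the program's risk management approach.")
print("Topics still to address:", sep_self_check(draft))
```
        A real review, of course, judges substance and internal consistency, not the mere presence of keywords.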
  • Systems Engineering Metrics
    • Q: How do you measure effectiveness of the systems engineering (SE) process?
      • A: Generally, systems engineering (SE) effectiveness is revealed by the technical maturity of the program's technical baselines at the respective event-driven technical reviews (e.g., the allocated baseline for a given configuration item (CI) at its Preliminary Design Review (PDR)). However, metrics for SE are a shared concern between DoD and the Defense industry as a whole. The National Defense Industrial Association SE Division has chartered a joint government/industry committee, Systems Engineering Effectiveness, to examine this issue further and report back to the Division.
    • Q: Why is the major systems engineering (SE) metric not based on number of changes to a technical baseline?
      • A: Changes to the technical baseline reflect technical discovery during the course of a program. They may reflect both positive and negative results and are not indicative, per se, that program systems engineering (SE) is either good or bad, but rather that the technical baseline required adjustment for good and sufficient reasons. A small number of changes to the technical baseline is not necessarily a positive sign either, as is the case when a baseline is not being properly updated to reflect design trades during development.
    • Q: Under the heading of best practices, is there a set of metrics that can be made available to attendees? You cited a lack of good SE metrics as a problem. What are your ideas about sound measures of merit?
      • A: There is no finite set of systems engineering (SE) metrics. Generally, SE effectiveness is revealed by the technical maturity of the program's technical baselines at the respective event-driven technical reviews (e.g., the allocated baseline for a given configuration item (CI) at its Preliminary Design Review (PDR)). However, metrics for SE are a shared concern between DoD and the Defense industry as a whole. The National Defense Industrial Association SE Division has chartered a joint government/industry committee, Systems Engineering Effectiveness, to examine the issue further and report to the Division. One commonly used measure of merit, a Technical Performance Measure tracked against its planned profile, is sketched below.
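        The sketch below is minimal and hypothetical; the review events, weight values, and 5% watch margin are illustrative assumptions, not values drawn from DoD policy or guidance.

```python
# Illustrative sketch only: tracking one Technical Performance Measure
# (TPM), here vehicle weight, against its planned profile at each
# event-driven review. All events, values, and margins are hypothetical.
planned_profile = {  # review event -> planned maximum weight (kg)
    "SRR": 1500.0, "PDR": 1400.0, "CDR": 1350.0, "TRR": 1320.0,
}
WATCH_MARGIN = 0.05  # flag when measured weight is within 5% of the ceiling

def assess_tpm(event: str, measured_kg: float) -> str:
    """Compare a measured value against the planned profile for a review."""
    ceiling = planned_profile[event]
    if measured_kg > ceiling:
        return f"{event}: BREACH ({measured_kg} kg exceeds the {ceiling} kg plan)"
    if measured_kg > ceiling * (1 - WATCH_MARGIN):
        return f"{event}: WATCH (within {WATCH_MARGIN:.0%} of the plan)"
    return f"{event}: on track ({measured_kg} kg vs the {ceiling} kg plan)"

print(assess_tpm("PDR", 1385.0))  # within 5% of the plan -> WATCH
print(assess_tpm("CDR", 1360.0))  # exceeds the plan -> BREACH
```
        Read this way, a TPM trending toward its ceiling between reviews is an early, quantitative warning that the technical baseline is losing margin.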
    • Q: What work is being done to better define/standardize metrics?
      • A: Metrics for SE are a shared concern between DoD and the Defense industry as a whole. The National Defense Industrial Association SE Division has chartered a joint government/industry committee, Systems Engineering Effectiveness, to examine the issue further and report back to the Division.
  • Systems Engineering Outreach
    • Q: Even if a program manager (PM) believes in the need for systems engineering (SE) and wants to do the right technical things, the customer may not be sympathetic. What efforts are being pursued to sensitize the operational community to SE?
      • A: Defense Systems is continuing its SE outreach efforts to include senior DS staff members assigned as liaison to the Joint Staff and engagement with multiple forums. Additionally, a representative of the J8 participates in the monthly SE Forum, a flag-level group that meets to consider a wide variety of SE issues and share best practices and initiatives.
    • Q: You mentioned that the warfighter 'wants it now' and is not interested in testing and supportability, etc. What is our response to the warfighter when we do what they ask and the system fails or is not reliable/supportable in the field?
      • A: It is the role of systems engineering to articulate to the warfighter the importance of testing and supportability to the effectiveness and suitability of our fielded products. Generally, when systems engineering is successful and the tradeoffs are completely communicated to all stakeholders, the right decisions are made during development to field supportable products in a timely manner. It is when systems engineering fails, and trades are not fully understood by the warfighter as well as the developer, that we end up with unreliable and unsupportable systems. This communication of trades is a key component of the technical reviews; warfighter participation in these reviews is critical.
    • Q: How do we foster a spirit of collaboration between the program manager (PM) and the systems engineer? How do we not build an 'us vs. them' mentality?
      • A: OSD continues to reach out through fora like the Systems Engineering Town Hall to clearly communicate the value and necessity for a sound technical foundation in our acquisition programs.
    • Q: How do you intend to modify the perception at the OSD, Component Acquisition Executive (CAE), and congressional levels so that event-driven schedule changes aren't seen as a sign that the program is floundering?
      • A: Systems engineering revitalization and outreach are an ongoing effort. We are continuing to work with OSD, DoD Components, and the broader acquisition community to heighten awareness of systems engineering, its fundamental elements, and how these elements apply to major systems acquisition and sustainment. Current systems engineering policy calls for event-based reviews. It is expected that this will help drive programs to more realistic planning with well-understood critical paths and diminish the need for schedule changes during execution.
    • Q: How do you, as a systems engineer, show value to the PM when beginning a program?
      • A: You show value to the PM by pointing out that a sound systems engineering approach provides an execution plan and is an integral part of any successful acquisition. It is important that everyone recognize that executing a disciplined, effective systems engineering program will help, for example, avoid repeating engineering efforts. There have been cases where we rushed into a key program event, such as a test or technical decision, prior to adequately completing the technical work that would minimize and control program risk. This results in additional costs to the program and adversely impacts the schedule because we typically have to go back, complete the technical work, and repeat the event. Doing it right the first time, when appropriate from a technical standpoint, is the most efficient approach from a cost and schedule standpoint.
    • Q: What is your specific advice for how to incorporate new direction/governance into programs to minimize cost/schedule/performance after contract award? The users don't always see the value added in these new requirements.
      • A: Executing a sound systems engineering effort should not adversely impact existing schedules and budgets. From a contractual standpoint, it is expected that most, if not all, industry best practices include a sound engineering and technical effort. If the contractor you are dealing with does not follow sound engineering and technical practices, then there are likely serious issues on the program that go beyond implementing the systems engineering policy. Like other facets of a program, systems engineering is an integral and critical element of any acquisition program. It is important that we clearly articulate to the sponsor the benefits of a sound systems engineering approach and point out the resultant program risks, not only technical but also cost and schedule, of not performing key SE efforts.
    • Q: What can OSD do to better align the view of policy held by its varied organizations? It is difficult to meet the expectations of the different subgroups when they hold separate views of the policy and guidance.
      • A: All of the systems engineering policies and guides go through extensive review by multiple offices in OSD, as well as the Services and Agencies, via the Systems Engineering Forum. OSD Systems Engineering is continuing its outreach efforts to ensure the systems engineering policies put in place are in harmony with policies from other organizations that participate in system acquisitions.
  • Barriers to SE Implementation
    • Q: What do you think is the most significant barrier to implementing systems engineering (SE) effectively?
      • A: There is no single most significant barrier to implementing systems engineering (SE) effectively. Unfortunately, there are many issues that can adversely affect effective SE on programs. Some of these include: lack of strong technical authority at the Service or Agency level, lack of SE staffing in the program office, insufficient or absent SE planning, lack of systems experience among those tasked with systems engineering execution and leadership, weak suppliers (in any of the areas above), lack of effective leveraging of available training in systems engineering, and related areas.
  • Resources
    • Q: How do you intend to reconcile budgets established via a 5-year Program Objective Memorandum (POM) cycle with the (potentially) dynamic nature of event-driven schedules? This is particularly true of high-visibility, ACAT I programs.
      • A: Event-driven schedules do not equate to increases in program costs or schedule delays. A problem we've seen with schedule-driven programs is a tendency to repeat engineering efforts, such as technical reviews, because the program went forward before it was technically ready the first time. This results in additional costs to the program and typically adversely impacts the schedule. Doing it right the first time, when appropriate from a technical standpoint, is the most efficient approach from a cost and schedule standpoint.
    • Q: Do you have any opinion of the resource levels of systems engineering (SE)? Do we need 5%, 15%, 20%, or more? Does it change as a result of life cycle?
      • A: The resource level of systems engineering (SE) should be based on the engineering effort associated with the particular program. The level will be different from program to program depending on the technical challenges facing the program and its acquisition strategy.
    • Q: How do you bridge the expertise gap? (Most 20-25 year olds have no clear military standards experience; 40-45 year olds do; but 50-55 year olds grew up pre-acquisition streamlining and are moving toward retirement.)
      • A: The expertise gap is something that both the government and industry are working to overcome. Efforts such as university systems engineering (SE) programs and strengthening the Defense Acquisition University (DAU) courses to include appropriate coverage of SE should help. Mentor programs are another technique that can be used to help overcome the expertise gap between the various age groups.
    • Q: With the impending high retirement rate, are there DoD-level initiatives to attract and retain the next generation of organic/civilian engineers (e.g., scholarship programs, engineer bonuses)? Or is contractor support engineering a better value?
      • A: DoD historically has had, and continues to have, strong tuition assistance programs for its employees. Engineers can take advantage of the various tuition assistance programs to gain additional knowledge in systems engineering. Like other areas, the balance between organic and contractor support is something that needs to be assessed by the individual program offices depending on their particular needs and hiring and financial constraints.
    • Q: The shortage of trained, experienced systems engineers (and program managers) in the civilian and military workforce is a common theme. Are there any Program Executive Officers (PEOs) with effective initiatives for getting support contractors the training they need?
      • A: We're not aware of any particular Program Executive Officer (PEO) program that we can highlight regarding training of support contractors. Support contractor personnel have the option of attending courses offered by academia including DAU in the area of systems engineering and should be encouraged to do so.
    • Q: Resource instability significantly and adversely impacts systems engineering (SE) efforts. It is very difficult to justify enough system integration labs (SILs) (e.g., funding covers a prime, but not enough for tiered suppliers); we see cuts to risk mitigation efforts (e.g., test-fix-test time) and moves to force 'spiral development' without permitting sound, early development of later spiral technology, just to name a few. We're also impacted by regular 'adjustments' to FY problems, forcing cuts to well-laid-out plans. How can we justify the needed efforts and gain oversight support to stabilize the associated resources with priority?
      • A: It is important that we clearly articulate the benefits of a sound systems engineering (SE) approach and point out the resultant program risks, not only technical but also cost and schedule, of not performing key SE efforts. Identifying the benefits of SE efforts was highlighted at the recent Systems Engineering Town Hall meeting. Systems engineering is certainly one of the key focus areas during program reviews at the OSD level. It is expected that this upper-management focus will help place priority on the systems engineering efforts at the program office level.
    • Q: Please address the apparent dichotomies between event-driven schedules and the Planning, Programming, Budgeting and Execution (PPBE) Process and between meeting warfighter requirements and Cost As an Independent Variable (CAIV).
      • A: Event-driven schedules do not equate to increases in program costs or schedule delays. A problem we've seen with schedule-driven programs is a tendency to repeat engineering efforts, such as technical reviews, because the program went forward before it was technically ready the first time. This results in additional costs to the program and typically adversely impacts the schedule. Doing it right the first time, when ready from a technical standpoint, is the most effective and efficient approach to meeting warfighter needs.
    • Q: How do we effectively implement the systems engineering (SE) policy within the budgetary constraints imposed on program managers (PMs)? Example: Most PMs are restricted to staffing levels (and other associated overhead) of 5% or less.
      • A: Systems engineering (SE) efforts are not "overhead." A sound SE approach is an integral part of a successful acquisition. With regard to budget limitations, executing an effective SE program will help avoid repeating engineering efforts, such as technical reviews, that went forward before the program was technically ready the first time. Such repetition adds cost to the program and typically adversely impacts the schedule. Doing it right the first time, when appropriate from a technical standpoint, is the most effective and efficient approach from a cost and schedule standpoint.
    • Q: What mechanism is applied to have the sponsor provide funding/resources to revitalize systems engineering (SE) at the program manager (PM) level?
      • A: Like other facets of a program, systems engineering (SE) is an integral and critical element of any acquisition program and therefore should be funded to the appropriate level from the program's budget. It is important that we clearly articulate to the sponsor the benefits of a sound SE approach and point out the resultant program risks, not only technical but also cost and schedule, of not performing key SE efforts.
    • Q: If acquisition reform was at least partly responsible for the lack of effective systems engineering (SE), what has changed to allow us to re-gain and re-allocate resources for SE revitalization? (It ain't free.)
      • A: What has changed is the realization that without a sound systems engineering (SE) program, our acquisition efforts tend to run over cost and behind schedule. In the end, we spend more time and money achieving our acquisition goals than originally planned. Implementing a sound SE effort will help us achieve those goals on time and within budget.
    • Q: After years of acquisition streamlining, is it possible to achieve 'attention to detail' in systems engineering (SE) without significant impacts to existing development schedules and budgets?
      • A: Executing a sound systems engineering (SE) effort should not adversely impact existing schedules and budgets. In the end, implementing SE will benefit the program from a cost and schedule standpoint. For example, a problem we've seen with schedule-driven programs is a tendency to repeat engineering efforts, such as technical reviews, because the program went forward before it was technically ready the first time. This results in additional costs to the program and typically adversely impacts the schedule. Doing it right the first time, in accordance with a plan and when appropriate from a technical standpoint, is the most effective and efficient approach from a cost and schedule standpoint.
    • Q: How do you stay event-based when there are system dependencies that are counting on the 'product?'
      • A: Event-based does not equate to schedule delays or increases in program costs. A problem we've seen with schedule-driven programs is a tendency to repeat engineering efforts, such as technical reviews, because the program went forward before it was technically ready the first time. This results in additional costs to the program and typically adversely impacts the schedule. Doing it right the first time, when appropriate from a technical standpoint, is the most effective and efficient approach from a cost and schedule standpoint.
    • Q: What is the relationship of the Program Objective Memorandum (POM)/Budgeting process to the systems engineering (SE) policy?
      • A: When building a new program budget, all required efforts, including systems engineering (SE), need to be taken into account. For existing program budgets, we do not expect that implementing a sound SE approach will result in a budget increase. Additionally, the understanding of the technical challenge at hand derived from a sound technical plan will provide the technical basis for a more reasonable program cost estimate.
    • Q: Many programs are under development and systems engineering (SE) was not considered. Are there any sources of funds that can assist program managers (PMs) to begin the process of SE?
      • A: There is no separate "pot of money" at the DoD level to assist program managers (PMs) in executing the systems engineering (SE) process. Like other facets of a program, SE is an integral and critical element of any acquisition program and therefore should be funded to the appropriate level out of the program's budget.
  • Best Practices
    • Q: Everyone feels overwhelmed: inefficient workload planning, etc. How do we not burn folks out to where there is no time/energy left to research best practices?
      • A: We are working with the Defense Acquisition University (DAU) and other academic and industry partners on an automated, web-accessible Best Practices Clearing House. A pilot was demonstrated recently at the Systems and Software Technology Conference to gather user feedback. Development work on the tool is continuing. In addition, one of the merits of the Program Support Reviews is the sharing of best practices from other programs in the DoD.