Requirements Development

M&S Requirements Development


Modeling & Simulation (M&S) requirements specify the set of capabilities that a simulation needs in order to adequately serve all of its intended uses. Users rely on information from the three domains (Problem, User, and Simulation) to develop a concise and consistent set of requirements for the simulation at hand. They use this set of requirements to decide whether to apply an existing simulation, modify an existing simulation, construct a federation, or build a completely new simulation to serve their purposes. The set consists of two complementary types of requirements: [1]

  • Representation requirements: describe the properties and behaviors of the things that a model or simulation must represent to adequately serve the user’s purposes. These include the represented entities, their properties, and the dependencies that, when executed, produce their behavior. Representation requirements define the needed simulation capabilities in terms of simulation fidelity. These requirements should define the resolution, accuracy, and confidence in that accuracy needed for every object property of importance for directly meeting the user’s needs.
  • Implementation requirements: encompass all of those needs that are not representational in nature, such as the specifics of the interfaces and execution environments (e.g., model execution speed).
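The two requirement types above can be sketched as simple data structures. This is purely illustrative; the class names, fields, and values below are assumptions for the sketch, not part of any M&S standard:

```python
from dataclasses import dataclass

@dataclass
class RepresentationRequirement:
    # What the simulation must represent, stated in fidelity terms.
    entity: str       # represented object, e.g. "radar" (hypothetical)
    prop: str         # property of interest, e.g. "detection range"
    resolution: str   # smallest change the simulation must distinguish
    accuracy: str     # allowable error relative to the referent
    confidence: str   # required confidence in that accuracy

@dataclass
class ImplementationRequirement:
    # Non-representational needs: interfaces, environment, performance.
    description: str  # e.g. "executes at least 10x real time"

requirements = [
    RepresentationRequirement("radar", "detection range",
                              "1 km", "+/- 5%", "90%"),
    ImplementationRequirement("model executes at least 10x real time"),
]

# Count how many requirements are representational in nature.
rep_count = sum(isinstance(r, RepresentationRequirement) for r in requirements)
print(rep_count)  # 1
```

Capturing resolution, accuracy, and confidence as explicit fields makes it easy to check that every representation requirement actually specifies all three fidelity attributes.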

Requirements in New Simulation Development [1]
For each new model, simulation, or federation, a set of requirements is developed that governs how it is built (i.e., what the model, simulation, or simulation federation needs to be able to do; its capabilities). These requirements are continuously reexamined and refined as more information is acquired and trade-off analyses are performed during development and the simulation’s subsequent reuse.

Requirements in Legacy Simulation Reuse [1]
For reuse of a legacy simulation, the User assesses the viability of the simulation through an analysis of its existing capabilities and documentation. The capabilities of the existing simulation are then compared with the requirements needed to address the current problem. The results of this comparison should detail:

  • Which capabilities of the existing simulation will be retained as-is
  • Which capabilities need to be modified
  • Which capabilities need to be added to the simulation to make it fit the new intended purpose
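The comparison above can be framed as a set difference between existing and required capabilities. The capability names here are hypothetical examples; a real assessment compares the documented capabilities of the legacy simulation against the requirements of the new problem:

```python
# Hypothetical capability names for illustration only.
existing = {"3-DOF flight dynamics", "flat-earth model", "ISA atmosphere"}
required = {"6-DOF flight dynamics", "flat-earth model", "ISA atmosphere"}

retained = existing & required           # capabilities kept as-is
to_modify_or_drop = existing - required  # candidates for modification
to_add = required - existing             # new capabilities to build

print(sorted(retained))           # ['ISA atmosphere', 'flat-earth model']
print(sorted(to_modify_or_drop))  # ['3-DOF flight dynamics']
print(sorted(to_add))             # ['6-DOF flight dynamics']
```

In practice the "modify" category requires engineering judgment (a 3-DOF model might be upgraded to 6-DOF rather than discarded), so the set difference only flags candidates for that analysis.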

For both new and legacy simulations, the User maintains the responsibility for articulating a set of M&S requirements that addresses the intended purpose. The User may enlist the aid of the Accreditation Agent, the V&V Agent, the M&S PM or M&S Proponent, SMEs from various applicable disciplines, other past and prospective users, and even the Developer to produce a comprehensive, correct, and consistent set of M&S requirements; but, in the end, it is the User who must be satisfied that the set is accurate and complete enough to specify the simulation capabilities needed for the intended use.

Steps in the M&S Requirements Development process (see the Requirements Development section for more details):

  1. Requirements Identification
  2. Requirements Articulation
  3. Requirements Analysis
  4. Requirements Configuration Control

Central to refining requirements is managing “requirements creep” (i.e., the expansion of requirements beyond those originally specified to capture the simulation’s intended uses) and eliminating unrealistic requirements (i.e., requirements that cannot be satisfied with the available technology or resources).
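One minimal way to make creep visible is to trace each requirement to an approved intended use and flag any that do not map. This sketch uses invented requirement IDs and use names; a real program would trace against its documented intended-use statement:

```python
# Intended uses approved at the outset (hypothetical examples).
approved_uses = {"pilot training", "mission rehearsal"}

requirements = [
    {"id": "R1", "text": "model terrain at 10 m resolution",
     "intended_use": "mission rehearsal"},
    {"id": "R2", "text": "support 10,000 concurrent entities",
     "intended_use": "campaign analysis"},  # not in the approved set
]

# Any requirement tied to an unapproved use is a creep candidate.
creep = [r["id"] for r in requirements
         if r["intended_use"] not in approved_uses]
print(creep)  # ['R2']
```

Flagged requirements are not automatically wrong; they trigger a decision to either formally expand the intended uses (with the associated cost and schedule impact) or remove the requirement.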

See the Verification, Validation & Accreditation Recommended Practice Guide for more detailed information.

AcqLinks and References:

Proposal Development

Software Source Selection Considerations


Software source selection considerations for Computer Systems & Software (CS&S) tasks include the following:

The objective should be to select developers with sound approaches; applicable domain experience in the development of software-intensive systems; proven program management, systems engineering, and software engineering processes; and successful and relevant past performance. Therefore, the source selection strategy for CS&S is to solicit and evaluate proposal information with a focus on the following areas: [1]

  • Soundness of the proposed CS&S approach, including CS&S architecture, and software development and integration approach.
  • Offeror capability, as defined by internal process standards that form the foundation for and can be tailored to provide program-specific processes, evidence that the proposed processes are part of the company culture, and capable, adequate staffing and other resources.
  • Offeror commitment to the application of program-specific processes, as defined in the Software Development Plan (SDP), Integrated Master Plan (IMP), Integrated Master Schedule (IMS), and other contractual documents.
  • Realistic program effort and schedule baselines that are compatible with the estimated software development/integration effort and schedule, and that accommodate the consistent application of the proposed processes and tools.
  • Successful past performance in software development, including application of the proposed processes and satisfaction of program cost, schedule, and performance baselines.
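Evaluations against focus areas like those above are often combined into a weighted score. The weights and scores below are invented for illustration; an actual source selection follows the evaluation criteria and rating scheme stated in the solicitation:

```python
# Invented weights over the five focus areas (must sum to 1.0).
weights = {
    "approach": 0.30,
    "capability": 0.25,
    "commitment": 0.15,
    "baselines": 0.15,
    "past_performance": 0.15,
}

# Hypothetical evaluator scores for one offeror, on a 0-10 scale.
offeror_scores = {"approach": 8, "capability": 7, "commitment": 9,
                  "baselines": 6, "past_performance": 8}

# Weighted sum across all focus areas.
total = sum(weights[k] * offeror_scores[k] for k in weights)
print(round(total, 2))  # 7.6
```

A numeric roll-up like this is only one possible scheme; many selections instead use adjectival ratings (e.g., "Outstanding"/"Acceptable") with documented strengths and weaknesses.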

AcqLinks and References:

Modeling & Simulation

M&S Contracting

Close coordination is necessary between the Program Management Office’s (PMO) Modeling and Simulation (M&S) lead and the Program Contracting Officer (PCO). Contracting strategies, solicitation, and contract provisions must be consistent with the decided division of responsibilities with particular attention paid to the use of Government Furnished Equipment (GFE) / Government Furnished Information (GFI). [1] Request for Proposal (RFP) language and contract provisions should address: [1]

  • M&S requirements
  • Data rights
  • Contractor’s own M&S planning and documentation
  • Examination of reuse opportunities
  • Expectations regarding the sources of M&S tools and data
  • Ownership and maintenance of government-funded M&S resources
  • VV&A requirements
  • Government user support
  • Access control
  • Metrics and documentation requirements

Indicators of contractor M&S expertise should be considered in defining source selection criteria. Contractor attributes that have a direct relationship to successful M&S use may include: [1]

  • A documented systems-engineering process showing its organizations, activities, the specific M&S tools used by each, and the information flows among them;
  • An existing information-sharing infrastructure (i.e., an integrated data environment) providing enterprise team members, on a nearly continuous, from-the-desktop basis, the capability to discover, access, understand, and download a comprehensive set of authoritative, accurate, and coherent product development information. The data items provided by this system should be accompanied by metadata providing their pedigree and sufficient applicability and context information to guide their valid use;
  • Successful experience using a wide variety of M&S, both for design (prescriptive modeling environments such as systems engineering tools, CAD, and software design tools) and assessment (descriptive M&S), from the engineering to mission levels;
  • Successful participation in federations or other types of distributed simulations using an open standard architecture, e.g., the High Level Architecture (HLA);
  • A record of reuse of M&S tools and information produced by other organizations (government, industry, and COTS);
  • A documented VV&A process, with records indicating a history of compliance; and
  • A staff with documented M&S expertise.

AcqLinks and References:

Earned Value Management

Cost Assessment and Program Evaluation (CAPE)


Note: Cost Analysis Improvement Group (CAIG) is now known as Cost Assessment and Program Evaluation (CAPE).

The Cost Assessment and Program Evaluation (CAPE) office provides independent analytic advice to the Secretary of Defense on all aspects of the Defense program, including alternative weapon systems and force structures, the development and evaluation of defense program alternatives, and the cost-effectiveness of defense systems. The office also conducts analyses and offers advice in a number of related areas, such as military medical care, school systems for military dependents, information technology, and defense economics. Consistent with its advisory role, the office has no decision authority or line responsibility and has no vested interest in any sector of the defense budget. [1]

Website: Cost Assessment and Program Evaluation (CAPE)

CAPE is also responsible for the management of the programming system, including development of planning guidance (in conjunction with other organizations within the Office of the Secretary of Defense), production of the Joint Programming Guidance (JPG), and direction of the annual program review. The ultimate product of the program review is the Future Years Defense Program (FYDP) – the authoritative statement of what the Department plans, year by year, by way of force structure (how many ships, brigades and divisions, aircraft squadrons and wings, etc., we will operate), procurement (how many ships, tanks, aircraft, missiles, etc., we will buy), manpower (how many people, military and civilian, we plan to employ in each of the services and defense agencies), other supporting programs (such as R&D and military construction), and what it will all cost. [1]

Director of Cost Assessment and Program Evaluation (DCAPE)
The Director of CAPE is a principal staff assistant and advisor to the Secretary and Deputy Secretary of Defense in the Office of the Secretary of Defense (OSD). The Director, as chartered under United States Department of Defense Directive 5141.01, provides independent analytic advice to the Secretary of Defense on all aspects of the DoD program, including alternative weapon systems and force structures, the development and evaluation of defense program alternatives, and the cost-effectiveness of defense systems.

DCAPE’s responsibilities include: [1]

  • Analyze and evaluate plans, programs, and budgets in relation to U.S. defense objectives, projected threats, allied contributions, estimated costs, and resource constraints.
  • Review, analyze, and evaluate programs, including classified programs, for executing approved policies.
  • Provide leadership in developing and promoting improved analytical tools and methods for analyzing national security planning and the allocation of resources.
  • Ensure that the costs of DoD programs, including classified programs, are presented accurately and completely.

AcqLinks and References: