Blog Archives

Production, Quality & Manufacturing

Continuous Improvement Process

Shewhart Cycle

 

The Continuous Improvement Process (CIP) is an ongoing effort to improve products, services, or processes. It is a six-step, systematic approach to plan, sequence, and implement improvement efforts using data, and it elaborates on the Shewhart Cycle (Plan, Do, Study, Act). The CIP provides a common language and methodology for understanding the improvement process. The CIP always links back to each organization’s own goals and priorities. 

 

Continuous improvement is the act of continually looking to improve upon a process, product, or service through small incremental steps.

 

Why Use a Continuous Improvement Process:

Implementing a Continuous Improvement Process should now be standard practice in any organization. Studies have shown that the main benefits of a CIP are:

  • Increased productivity
  • Better teamwork and morale
  • Greater agility
  • Less waste
  • More efficiency
  • Increased profit

The six (6) steps of the Continuous Improvement Process are listed below (a toy sketch of the improvement loop follows):

  1. Identify Improvement Opportunity: Select the appropriate process for improvement.
    • Evaluate the process
    • Select a challenge/problem
  2. Analyze: Identify and verify the root cause(s).
  3. Take Action: Plan and implement actions that correct the root cause(s).
  4. Study Results: Confirm that the actions taken achieved the target.
  5. Standardize Solution: Ensure the improved level of performance is maintained.
  6. Plan for the Future:
    • Plan what is to be done with any remaining problems
    • Evaluate the team’s effectiveness
    • Set a target for further improvement

CIP Development Cycle [1]
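
To make the loop concrete, here is a deliberately toy Python sketch of the improvement cycle described above: measure, take one small corrective action, study the result, and repeat until the target is met. The metric, starting value, target, and assumed 20% gain per cycle are all invented for illustration.

```python
# Deliberately toy sketch of the continuous improvement loop: measure,
# take one small corrective action, study the result, and repeat.
# The metric, target, and 20% gain per cycle are invented for illustration.

defect_rate = 0.10   # measured baseline (hypothetical)
target = 0.02        # improvement target (hypothetical)

cycle = 0
while defect_rate > target:
    cycle += 1
    # Plan/Analyze: identify a root cause (stubbed out in this toy).
    # Do/Take Action: apply one small fix; assume it trims defects by 20%.
    defect_rate *= 0.8
    # Study Results: re-measure and record the new level.
    print(f"Cycle {cycle}: defect rate = {defect_rate:.3f}")

# Standardize Solution: lock in the improved process once the target is met.
print(f"Target reached after {cycle} cycles.")
```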

 

The Best Time to Start a Continuous Improvement Process

There is no bad time to start using a continuous improvement process, but the sooner the better. Below is a list of the times I believe a CIP should be implemented:

  • Beginning of a new project
  • Development of processes and procedures
  • Developing a new or improved product, or service
  • Planning data collection and analysis 
  • Implementing any change to a process
  • Whenever a failure occurs

A number of standard quality tools can be used to support the Continuous Improvement Process.

AcqLinks and References:

Updated: 4/12/2021

Production, Quality & Manufacturing

Quality Management Plan (QMP)

 

A Quality Management Plan (QMP) helps guide the Program Manager (PM) and project personnel in executing quality management and quality assurance activities for a project or program. The purpose of the QMP is to describe how quality will be managed throughout the lifecycle of the project. Quality management planning determines the quality policies and procedures relevant to the project, for both project deliverables and project processes; defines who is responsible for what; and documents compliance. A QMP is developed by the contractor. Quality is the degree to which the project fulfills requirements.

 

Template: Sample Quality Management Plan (QMP)

 

The QMP identifies these key components:

  • Project Deliverables & Project Processes:  The key project deliverables and processes subject to quality review.
  • Deliverable Quality Standards:  The quality standards are the “measures” used to determine a successful outcome for a deliverable. These standards may vary depending on the type of project.
  • Customer Satisfaction:  The customer satisfaction criteria describe when each deliverable is complete and acceptable as defined by the customer. Deliverables are evaluated against these criteria.
  • Quality Control Activities: The quality control activities that monitor and verify that the project deliverables meet defined quality standards.
  • Process Quality Standards:  The quality standards are the “measures” used to determine if project work processes are being followed.
  • Stakeholder Expectations: Stakeholder expectations describe when a project process is effective as defined by the project stakeholders. An example is the review and approval of all high-impact changes to the project.
  • Quality Assurance Activities: The quality assurance activities that monitor and verify that the processes used to manage and create the deliverables are followed and are effective.

 

Quality Management Plan Methodology:

Step 1: Plan the Development of the QMP

  • Identify the customer’s Quality Objectives. Help customers express quality expectations in objective, quantitative terms.
  • Identify professional standards including legal, environmental, economic, code, life safety, and health.
  • Balance needs and expectations of customers and stakeholders with cost, schedule, and professional standards. Evaluate the costs and benefits of selected quality objectives and the processes to be used to achieve objectives.
  • Develop an effective plan and processes, including quality assurance and quality control procedures, to achieve objectives. Consider risk/hazard factors and complexity of the project and adapt processes to provide the requisite level of quality. Document in the risk management plan any project variations from the local QMP requirements.
  • Develop performance measure thresholds to ensure agreement on the definition of success relative to Quality Objectives.
  • Ensure customer endorsement of all quality objectives included in the Quality Management Plan.

Step 2: Execute the QMP

  • Do the work according to the approved Program Management Plan (PMP) and standard operating procedures.
  • Project execution is a dynamic process. The Project Delivery Team (PDT) must communicate, meet on a regular basis, and adapt to changing conditions. The Quality Management Plan and PMP may require modification to ensure that project objectives are met.
  • Document results in Lessons Learned.

Step 3: Perform Quality Checks

  • Perform independent technical review, management oversight, and verification to ensure that quality objectives are met consistent with District Quality Management Plans.
  • Check performance against the PMP and the Customer Quality Objectives performance measure thresholds to verify that performance will accomplish the Quality Objectives and to verify the sufficiency of the plan.
  • Share findings with all project stakeholders to facilitate continuous improvement.

Step 4: Take Corrective Action if Necessary.

  • If performance measure thresholds are exceeded, take specific corrective actions to fix the systemic cause of any non-conformance, deficiency, or other unwanted effect (a minimal threshold-check sketch follows this list).
  • Document quality improvements that could include appropriate revisions to the quality management plan, alteration of quality assurance and control procedures, and adjustments to resource allocations.
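
As a minimal sketch of the threshold check in Step 4, the Python snippet below compares measured quality metrics against agreed limits; the metric names and values are invented, not drawn from any QMP template.

```python
# Step 4 sketch: compare measured quality performance against agreed
# thresholds and flag metrics needing corrective action.
# Metric names and values are hypothetical.

thresholds = {"defect_rate": 0.02, "rework_hours": 40.0}   # maximum acceptable
measured = {"defect_rate": 0.035, "rework_hours": 22.5}    # latest observations

for metric, limit in thresholds.items():
    value = measured[metric]
    status = "corrective action required" if value > limit else "within threshold"
    print(f"{metric}: measured {value} vs. limit {limit} -> {status}")
```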

 

AcqLinks and References:

Updated: 4/10/2021

Requirements Development

Requirement Types

 

There are a number of different types of requirements that systems engineers will have to develop on an acquisition program throughout its life cycle. These requirements range from very high-level, concept-focused statements to very specific requirements for a single part. The main types of requirements are:

  • Functional Requirements
  • Performance Requirements
  • System Technical Requirements
  • Specifications

Functional Requirements
A functional requirement is simply a task (sometimes called action or activity) that must be accomplished to provide an operational capability (or satisfy an operational requirement). Some functional requirements that are associated with operations and support can be discerned from the needed operational capability (see Operational Requirements). Others often result only from diligent systems engineering. Experience in systems engineering has identified eight generic functions that most systems must complete over their life cycle: development, manufacturing, verification, deployment, training, operations, support, and disposal. These are known as the eight primary system functions. Each must usually be considered to identify all the functional requirements for a system.

 

Performance Requirements
A performance requirement is a statement of the extent to which a function must be executed, generally measured in terms such as quantity, accuracy, coverage, timeliness, or readiness. The performance requirements for the operational function and sometimes a few others often correlate well with the statement of the needed operational capability as developed by the Joint Capabilities Integration and Development System (JCIDS) Process. The statement of other performance requirements usually requires thorough systems engineering.

 

System Technical Requirements
System technical requirements result in both allocated and derived requirements (a small sketch follows the list below):

  • Allocated Requirements: flow directly from the system requirements down to the elements of the system.
  • Derived Requirements: dependent on the design solution (and so are sometimes called design requirements). They include internal interface constraints between the elements of the system.
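
To make the distinction concrete, here is a small Python sketch that models a system requirement with one allocated and one derived child requirement; the requirement text and the simple tree structure are invented for illustration.

```python
# Sketch: representing allocated vs. derived requirements as a simple tree.
# All requirement text below is illustrative only.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    text: str
    kind: str                      # "system", "allocated", or "derived"
    children: list["Requirement"] = field(default_factory=list)

system = Requirement("Detect targets at 50 km", "system")
# Allocated: flows directly from the system requirement to an element.
system.children.append(Requirement("Antenna gain >= 30 dBi", "allocated"))
# Derived: depends on the chosen design solution (e.g., an internal interface).
system.children.append(Requirement("Receiver-processor interface latency < 5 ms", "derived"))

def dump(req: Requirement, indent: int = 0) -> None:
    print("  " * indent + f"[{req.kind}] {req.text}")
    for child in req.children:
        dump(child, indent + 1)

dump(system)
```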

Specifications
A specification is a detailed, exact statement of particulars, especially a statement prescribing materials, dimensions, and quality of work for something to be built, installed, or manufactured. The overall purpose of a specification is to provide a basis for obtaining a product or service that will satisfy a particular need at an economical cost and to invite maximum reasonable competition. By definition, a specification sets limits and thereby eliminates, or potentially eliminates, items that are outside the boundaries drawn. A good specification should do four (4) things:

  1. Identify minimum requirements
  2. List reproducible test methods to be used in testing for compliance with specifications
  3. Allow for a competitive bid
  4. Provide for an equitable award at the lowest possible cost.

The document that defines the proper format and content for all defense specifications is MIL-STD-961E, “Defense and Program-Unique Specifications Format and Content.”

 

AcqLinks and References:

Updated: 4/10/2021

Program Management

Responsibility Assignment Matrix (RAM)

 

A Responsibility Assignment Matrix (RAM) describes the participation of various organizations, people, and roles in completing tasks or deliverables for a project. It is used by the Program Manager (PM) to clarify roles and responsibilities in cross-functional teams, projects, and processes. A Request for Proposal (RFP) might request a RAM from a contractor.

 

Template: Responsibility Assignment Matrix (RAM) Template (Excel)

 

A RAM is also called a Responsible, Accountable, Consulted, and Informed (RACI) matrix. The PMBOK Guide 4th Edition defines RACI as a RAM that is used to illustrate the connections between work packages or activities and project team members. On larger projects, RAMs can be developed at various levels.

  • Responsible (R): Those who do the work to achieve the task. There is typically one role with a participation type of Responsible, although others can be delegated to assist in the work required.
  • Accountable (A): The one ultimately accountable for the correct and thorough completion of the deliverable or task, and the one to whom Responsible is accountable. In other words, an Accountable must sign off (Approve) on work that Responsible provides. There must be only one Accountable specified for each task or deliverable.
  • Consulted (C): Those whose opinions are sought; and with whom there is two-way communication.
  • Informed (I): Those who are kept up-to-date on progress, often only on completion of the task or deliverable; and with whom there is just one-way communication.

 

A RAM can define what a project team is responsible for within each component of the Work Breakdown Structure (WBS). It could also be used within a working group to designate roles, responsibilities, and levels of authority for specific activities. The matrix format shows all activities associated with one person and all people associated with one activity. This ensures that there is only one person accountable for any one task to avoid confusion.
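
To make the structure concrete, here is a minimal Python sketch of how a RACI-style RAM might be represented and checked. The single-Accountable rule follows the definition above; the tasks, names, and helper function are invented for illustration.

```python
# Minimal RACI-style RAM sketch: each task maps people to a role code.
# Role codes follow the definitions above: R, A, C, I.
# Tasks and names are invented placeholders.

ram = {
    "Draft requirements": {"Alice": "R", "Bob": "A", "Carol": "C", "Dan": "I"},
    "Review test plan":   {"Alice": "A", "Bob": "R", "Carol": "I"},
}

def validate_ram(ram: dict) -> list[str]:
    """Flag tasks that violate the 'exactly one Accountable' rule."""
    problems = []
    for task, assignments in ram.items():
        accountable = [p for p, role in assignments.items() if role == "A"]
        if len(accountable) != 1:
            problems.append(f"{task}: {len(accountable)} Accountable roles (need exactly 1)")
    return problems

for issue in validate_ram(ram) or ["RAM is consistent."]:
    print(issue)
```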

 

A RAM is displayed as a chart that illustrates the interaction between the work packages that need to be done and the project team members. Typically, the list of objectives is in the left-hand column with the project team member names across the top. Each work package is assigned to the appropriate project team member. The chart aids communication among the project team members. Below is a common RAM format. 

 

 

Example of a common RAM format

 

 

AcqLinks and References:

Updated: 4/10/2021

Program Management

Memorandum of Agreement (MOA)

 

A Memorandum of Agreement (MOA) is a written document describing a cooperative relationship between two parties wishing to work together on a project or to meet an agreed-upon objective. An MOA serves as a legal document and describes the terms and details of the partnership agreement. An MOA is more formal than a verbal agreement but less formal than a contract. Organizations can use an MOA to establish and outline collaborative agreements, including service partnerships or agreements to provide technical assistance and training. An MOA may be used regardless of whether or not money is to be exchanged as part of the agreement.

 

The typical format of an MOA includes:

  • Authority
  • Purpose of the Agreement
    • Name of parties involved
    • Brief description of the scope of work
    • Financial obligations of each party, if applicable
    • Dates agreement is in effect
    • Key contacts for each party involved
  • Detailed Description of Roles and Responsibilities
  • Payment Schedule if Applicable
  • Duration of the Agreement
  • Modification or Termination
  • Signatures of Parties’ Principals

 

Memorandum of Understanding (MOU)
An MOU defines a “general area of understanding” within both parties’ authorities, and no transfer of funds for services is anticipated. MOUs often state common goals and nothing more. Thus, MOUs do not contemplate funds transfers and should usually include language similar to: “This is not a funds-obligating document; by signing this agreement the parties are not bound to take any action or fund any initiative.” An MOU may be used to outline the operation of a program so that it functions a certain way. For example, two agencies that have similar goals may agree to work together to solve a problem or support each other’s activities by using an MOU. The MOU is nothing more than a formalized handshake.

 

Template: US Army Memorandum of Agreement

 

Steps to writing a Memorandum of Agreement (MOA) or MOU

  • Step 1: Determine the Appropriate Agreement Type
    • Memorandum of Agreement: A legal document that describes a partnership and agreed-upon objectives
    • Memorandum of Understanding: Agreement of common goals between two or more parties 
  • Step 2: Determine the parties involved in developing the agreement
    • Who needs to sign the MOA or MOU and who needs to be a part of its development? 
  • Step 3: Create a Draft Agreement
    • Have one person be the focal point for drafting the agreement
  • Step 4: Submit Draft Agreement for Coordination Review
    • Send the draft MOA or MOU out for coordination with a sign-off sheet. This will ensure all parties have seen and reviewed the draft MOA or MOU
  • Step 5: Finalize Agreement
    • Write the finalized agreement
  • Step 6: Sign Agreement

 

Tips for Writing a Memorandum of Agreement (MOA) or Memorandum of Understanding (MOU)

  • Only use one Memorandum of Agreement form when writing the terms of an agreement.
  • Keep it simple. Make sure that the wording is clear and concise. Whenever possible, use the wording of the parties when drafting the mediation agreement.
  • Agreements should strive for balance; a “sandwich” model can be useful. Start with “both parties agree,” then state what each party individually agrees to, then close with “both parties agree.” Balance does not mean that each party has the same number of bullet points, but that what is expected of each party in the future feels balanced to them.
  • Agreements should be written in positive language. For example, state what someone will do, not what they will not do.
  • Agreements should be specific. As much as possible address: who, what, when and how questions.
  • Careful reality checks should be done with the parties to ensure that the terms of the agreement are realistic and within their scope of authority.
  • Carefully review each item in the terms of agreement with both parties to ensure that each item is correct and appropriately captures each party’s intent. You should read each item out loud and ask each party if the wording is accurate. Each party should be able to understand their responsibilities in the terms of the agreement.
  • Keep in mind that the Memorandum of Agreement is a Settlement Agreement; therefore, appropriate personnel will need to clearly understand the terms of the agreement in order to effectuate the contents of the agreement.
  • Be absolutely sure that all parties sign the agreement.
  • All parties should receive a written copy of their agreement before they leave the session.

 

AcqTips:

  • MOUs tend to be used for simple common-cause agreements, which are not legally binding. MOAs, on the other hand, set out common legal terms establishing a “conditional agreement” where a transfer of funds for services is anticipated.

AcqLinks and References:

Updated: 4/10/2021

Technology Development

Technology Readiness Assessment (TRA)

 

A Technology Readiness Assessment (TRA) (Title 10 U.S.C. § 2366b) is a formal, metrics-based process and accompanying report that assesses the maturity of critical hardware and software technologies called Critical Technology Elements (CTE) to be used in systems. It is conducted by an Independent Review Team (IRT) of subject matter experts (SMEs). All DoD acquisition programs must have a formal TRA at Milestone B and at Milestone C. A preliminary assessment is due for the Development RFP Release Decision Point. [2]

 

Guide: Technology Readiness Assessment (TRA) Guidance – May 2011

Guide: GAO Technology Readiness Assessment Guide – Aug 2016

 

The TRA is a statutory requirement for Major Defense Acquisition Programs (MDAP) per DoD Instruction 5000.02, Enclosure 4, and a regulatory information requirement for all other acquisition programs. Title 10 United States Code (U.S.C.) Section 2366b requires, in part, that the Milestone Decision Authority (MDA) certify that the technology in an MDAP has been demonstrated in a relevant environment and has reached a Technology Readiness Level (TRL) of 6 before Milestone B approval. [2]

 

The TRA may be conducted concurrently with other technical reviews such as the Alternative Systems Review (ASR), System Requirements Review (SRR), or Preliminary Design Review (PDR). The Director of Defense Research and Engineering (DDR&E) is required to conduct an independent TRA of MDAPs prior to Milestone B. [1]

 

The TRA should be used as a tool for assessing program risk and the adequacy of technology maturation planning. The TRA scores the current readiness level of selected system elements using defined Technology Readiness Levels (TRL). Completion of the TRA should provide the following: [1]

  1. A comprehensive review, using an established program Work Breakdown Structure (WBS) as an outline, of the entire platform or system. This review, using a conceptual or established design, identifies the program CTEs.
  2. Objective scoring of the level of technological maturity for each CTE by subject matter experts (a toy scoring sketch follows this list).
  3. Maturation plans for achieving an acceptable maturity roadmap for CTEs before critical milestone decision dates.
  4. A final report documenting the findings of the assessment panel.
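
As a toy illustration of the scoring in item 2 above, the Python sketch below rolls up individual SME scores for each CTE. Taking the minimum score is an illustrative, conservative convention rather than a prescribed TRA rule, and the CTE names and scores are invented.

```python
# Sketch: roll up SME scores for each Critical Technology Element (CTE).
# Taking the minimum score per CTE is an illustrative, conservative
# convention; the TRA Deskbook governs actual scoring practice.

sme_scores = {  # CTE -> individual SME TRL assessments (hypothetical)
    "Composite airframe": [6, 7, 6],
    "Autonomy software":  [4, 5, 5],
}

for cte, scores in sme_scores.items():
    rolled_up = min(scores)
    needs_plan = rolled_up < 6  # maturation plan needed before Milestone B
    flag = "  -> maturation plan required" if needs_plan else ""
    print(f"{cte}: TRL {rolled_up}{flag}")
```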

 

TRAs Inform Technology Development and Identify Potential Concerns

While a TRA uses TRLs as key metrics for the evaluation of each technology, an assessment is more than just a single number at a single point in time. It is a compilation of lower-level assessments that could span several years, based on the program schedule and the complexity of the development. Evaluations can help gauge the progress of technology development, inform program plans, and identify potential concerns for decision-makers throughout acquisitions. Conducting TRAs periodically and during the earlier phases of development can identify potential concerns before risks are carried into the later and more expensive stages of system development.

 

TRAs can also facilitate communication between technology developers, program managers, and acquisition officials throughout development and at key decision points by providing a common language for discussing technology readiness and related technical risks. Finally, TRA results can inform other assessments and planning activities, such as cost and schedule estimates, risk assessments, and technology maturation plans. [4]

 

Program Managers (PM) have found that the TRA assessment process is useful in managing technology maturity. The TRA process highlights critical technologies and other potential technology risk areas that require the PM’s attention. The TRA can help identify immature and important components and track the maturity development of those components. Some programs use TRAs as an important part of their risk assessment.

 

STATUTORY: A preliminary assessment is due for the Development RFP Release Decision Point. The Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) will conduct an independent review and assessment of the TRA conducted by the Program Manager, and of other factors, to determine whether the technology in the program has been demonstrated in a relevant environment. The assessment will inform the 2366b Certification Memorandum at Milestone B (in accordance with 10 U.S.C. 2366b, Reference (g)). The TRA at Milestone C is a regulatory requirement when Milestone C is Program Initiation. [3]

 

Detailed information about the TRA and its process and procedures can be found in the Technology Readiness Assessment Deskbook. The content of the Deskbook is listed below:

 

TRA Deskbook Content
1.     Introduction
1.1     Technology Readiness Assessment Definition
1.2     TRA Authority
1.3     TRA Importance
1.3.1     Milestone B TRA
1.3.2     Milestone C TRA
1.4     Purpose and Organization of This Document
2.     Initiating and Conducting TRAs
2.1     Key Players and the TRA Timeline
2.2     Roles and Responsibilities
3.     Evolution of Knowledge on Technology Maturity
3.1     Early Evaluations of Technology Maturity
3.2     Summary List of Acronyms

Appendixes
A.     Submitting a Technology Readiness Assessment
B.     Guidance and Best Practices for Identifying Critical Technology Elements (CTEs)
C.     Guidance and Best Practices for Assessing Technology Maturity
D.     Amplifying Technology Readiness Assessment Guidance for Ships
E.     Biomedical Technology Readiness Levels
F.     Technology Maturity Policy
G.     The Technology Readiness Assessment Process
H.     Easy-Reference Displays of the Hardware/Software TRLs and Additional TRL Definitions

 

AcqLinks and References:

Updated: 12/4/2018

Technology Development

Technology Readiness Level (TRL)

 

Technology Readiness Levels (TRL) are a method of estimating the technology maturity of the Critical Technology Elements (CTE) of a program during the acquisition process. They are determined during a Technology Readiness Assessment (TRA) that examines program concepts, technology requirements, and demonstrated technology capabilities. TRLs are based on a scale of 1 to 9, with 9 being the most mature technology. The use of TRLs enables consistent, uniform discussions of technical maturity across different types of technologies. Decision authorities will consider the recommended TRLs when assessing program risk. [1,2]

 

The DoD TRLs and their DoD DAG descriptions are defined below:

  • TRL 1 (Basic principles observed and reported): Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology’s basic properties.
  • TRL 2 (Technology concept and/or application formulated): Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative, and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies.
  • TRL 3 (Analytical and experimental critical function and/or characteristic proof of concept): Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.
  • TRL 4 (Component and/or breadboard validation in laboratory environment): Basic technological components are integrated to establish that they will work together. This is relatively “low fidelity” compared to the eventual system. Examples include the integration of “ad hoc” hardware in the laboratory.
  • TRL 5 (Component and/or breadboard validation in relevant environment): The fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so the technology can be tested in a simulated environment.
  • TRL 6 (System/subsystem model or prototype demonstration in a relevant environment): A representative model or prototype system, well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology’s demonstrated readiness.
  • TRL 7 (System prototype demonstration in an operational environment): Prototype near, or at, the planned operational system. Represents a major step up from TRL 6, requiring the demonstration of an actual system prototype in an operational environment such as an aircraft, vehicle, or space.
  • TRL 8 (Actual system completed and qualified through test and demonstration): Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine if it meets design specifications.
  • TRL 9 (Actual system proven through successful mission operations): The actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. Examples include using the system under operational mission conditions.

 

The primary systems engineering objective is to gain sufficient technical knowledge to develop the program’s System Requirements Document (SRD) and to verify that the technology required by the system solution(s) is sufficiently mature, at TRL 6 or above, before proceeding into end-item design at Milestone B. [1]
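
As a simple illustration of that check, the Python sketch below encodes condensed forms of the TRL definitions from the table above and flags CTEs assessed below the TRL 6 threshold; the CTE names and assessed levels are hypothetical.

```python
# Short TRL definitions, condensed from the DoD table above.
TRL = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component/breadboard validation in laboratory environment",
    5: "Component/breadboard validation in relevant environment",
    6: "System/subsystem model or prototype demonstrated in relevant environment",
    7: "System prototype demonstrated in operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

MILESTONE_B_THRESHOLD = 6  # per 10 U.S.C. 2366b: demonstrated in a relevant environment

def below_threshold(cte_levels: dict[str, int]) -> dict[str, str]:
    """Return CTEs assessed below TRL 6, with their current definition."""
    return {cte: f"TRL {lvl}: {TRL[lvl]}"
            for cte, lvl in cte_levels.items() if lvl < MILESTONE_B_THRESHOLD}

# Hypothetical assessed levels, for illustration only.
print(below_threshold({"Seeker": 5, "Data link": 6, "Battery": 4}))
```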

 

The Technology Development Strategy (TDS) will describe how a program plans to mature its CTEs before proceeding into Milestone B. After Milestone B, a technology maturation plan/strategy should be part of the Engineering and Manufacturing Development (EMD) Phase Acquisition Strategy for those CTEs that require additional concurrency and technological development to achieve a higher TRL. [1]

 

Technology Readiness Assessment

While a TRA uses TRLs as key metrics for the evaluation of each technology, an assessment is more than just a single number at a single point in time. It is a compilation of lower-level assessments that could span several years, based on the program schedule and the complexity of the development. Evaluations can help gauge the progress of technology development, inform program plans, and identify potential concerns for decision-makers throughout acquisitions. Conducting TRAs periodically and during the earlier phases of development can identify potential concerns before risks are carried into the later and more expensive stages of system development.

TRAs can also facilitate communication between technology developers, program managers, and acquisition officials throughout development and at key decision points by providing a common language for discussing technology readiness and related technical risks. Finally, TRA results can inform other assessments and planning activities, such as cost and schedule estimates, risk assessments, and technology maturation plans. [3]

 

The Technology Readiness Assessment Deskbook, Appendix C, is the best source of TRL information. It covers:
1.  Overview of TRL
2.  Assessing Hardware CTEs
3.  Assessing Software CTEs

 

AcqTips:  

  • There are different definitions of Technology Readiness Levels, so make sure you follow the one that’s specific to your program or R&D project.
  • The TRL definitions were adopted from NASA.

AcqLinks and References:

Updated: 6/22/2018

Schedule Development

Critical Path & Critical Path Method

 

The Critical Path is the longest path of scheduled activities that must be completed to execute a project. It is important for Program Managers (PM) to know because any problem on the critical path can delay the entire project. Earned Value Management (EVM) analysis focuses on the critical path and near-critical paths to identify cost and schedule risks. Unlike the critical path, other schedule paths may contain slack (float), so a delay on them does not necessarily delay the entire project. A project might have multiple critical paths.

 

Critical Path

 

The Critical Path is determined by analyzing a project’s schedule or network logic diagram using the Critical Path Method (CPM). The CPM provides a graphical view of the project, predicts the time required to complete the project, and shows which activities are critical to maintaining the schedule.

 

The seven (7) steps in the CPM are: [1]

  1. List all activities required to complete the project (see Work Breakdown Structure (WBS))
  2. Determine the sequence of activities
  3. Draw a network diagram
  4. Determine the time that each activity will take to complete
  5. Determine the dependencies between the activities
  6. Determine the critical path
  7. Update the network diagram as the project progresses

 

The CPM calculates the longest path of planned activities to the end of the project, along with the earliest and latest times each activity can start and finish without making the project longer. This process determines which activities are “critical” (i.e., on the longest path) and which have “total float” (i.e., can be delayed without making the project longer). [1]
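
To illustrate that calculation, below is a minimal Python sketch of the forward and backward passes over a small activity network. The activities, durations, and dependencies are invented, and a real scheduling tool would also handle calendars, lags, and other relationship types.

```python
# Critical Path Method sketch: forward pass (earliest times), backward pass
# (latest times), then total float = latest start - earliest start.
# Activities: name -> (duration, list of predecessors). Hypothetical data.

activities = {
    "A": (3, []),          # e.g., requirements
    "B": (5, ["A"]),       # design
    "C": (2, ["A"]),       # procurement
    "D": (4, ["B", "C"]),  # integration and test
}

# Forward pass: earliest start/finish (dict order here is a valid topological order).
es, ef = {}, {}
for name in activities:
    dur, preds = activities[name]
    es[name] = max((ef[p] for p in preds), default=0)
    ef[name] = es[name] + dur

project_end = max(ef.values())

# Backward pass: latest finish/start.
lf, ls = {}, {}
for name in reversed(list(activities)):
    dur, _ = activities[name]
    successors = [s for s, (_, ps) in activities.items() if name in ps]
    lf[name] = min((ls[s] for s in successors), default=project_end)
    ls[name] = lf[name] - dur

for name in activities:
    total_float = ls[name] - es[name]
    marker = "  <- critical" if total_float == 0 else ""
    print(f"{name}: ES={es[name]} EF={ef[name]} LS={ls[name]} LF={lf[name]} float={total_float}{marker}")
```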

 

History

The CPM scheduling technique was introduced at approximately the same time as PERT Analysis. It was developed by J. E. Kelly of Remington-Rand and M. R. Walker of DuPont to aid in scheduling maintenance shutdowns in chemical processing plants. Over the years, CPM has enjoyed more use than any other network scheduling technique. It is based on the concept of critical path and was designed to focus on the time and resources, particularly cost, necessary to complete a project’s activities.

 

Although CPM and PERT are conceptually similar, some significant differences exist, mostly due to the types of projects best suited for each technique. PERT is the better choice when there is much uncertainty and when control over time outweighs control over cost. PERT handles uncertainty in the time required to complete an activity by developing three estimates and computing an expected time using the beta distribution. CPM is better suited for well-defined projects and activities with little uncertainty, where accurate time and resource estimates can be made and the percentage of completion of an activity can be determined. [1]

 

AcqLinks and References:

Updated: 4/10/2021

Schedule Development

PERT Analysis

 

Program Evaluation and Review Technique (PERT) is a method used to examine the tasks in a schedule; it is a variation of the Critical Path Method (CPM). It analyzes the time required to complete each task and its associated dependencies to determine the minimum time needed to complete a project. It estimates the shortest possible time each activity will take, the most likely length of time, and the longest time the activity might take if it runs longer than expected. The US Navy developed the method in 1957 on the Polaris nuclear submarine project.

 

To conduct a PERT analysis, three time estimates are obtained (optimistic, pessimistic, and most likely) for every activity along the Critical Path. Those estimates are then used in the formula below to calculate the expected time for each activity:

 

Formula: Expected Time (TE) = (O + 4M + P) / 6

  • Optimistic Time (O): the minimum possible time required to accomplish a task, assuming everything proceeds better than is normally expected.
  • Pessimistic Time (P): the maximum possible time required to accomplish a task, assuming everything goes wrong (excluding major catastrophes).
  • Most likely Time (M): the best estimate of the time required to accomplish a task, assuming everything proceeds as normal.


Example of the three-time estimates
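
Below is a quick worked sketch of the formula in Python; the task names and time estimates are invented for illustration, and the standard deviation (P - O) / 6 shown alongside is the usual PERT companion statistic.

```python
# PERT expected time per activity: TE = (O + 4M + P) / 6.
# The standard deviation, (P - O) / 6, is commonly computed alongside.
# All estimates below are hypothetical, in days.

estimates = {
    "Design review": (2.0, 4.0, 8.0),    # (O, M, P)
    "Prototype build": (5.0, 7.0, 12.0),
}

for task, (o, m, p) in estimates.items():
    expected = (o + 4 * m + p) / 6
    sigma = (p - o) / 6
    print(f"{task}: TE = {expected:.2f} days (sigma = {sigma:.2f})")
```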

 


Example of a Critical Path Nodal Diagram

 

History

In 1958, the U.S. Navy introduced network scheduling techniques by developing PERT as a management control system for the development of the Polaris missile program. PERT’s focus was to give managers the means to plan and control processes and activities so the project could be completed within the specified time period. The Polaris program involved 250 prime contractors, more than 9,000 subcontractors, and hundreds of thousands of tasks. [1]

PERT was introduced as an event-oriented, probabilistic technique to increase Program Managers’ control over projects where time was the critical factor and time estimates were difficult to make with confidence. The events used in this technique represent the start and finish of activities. PERT uses three time estimates for each activity: optimistic, pessimistic, and most likely. An expected time for each activity is calculated from these estimates based on a beta probability distribution. [1]

 

AcqLinks and References:

Updated: 4/10/2021