
Blog Archives

Test & Evaluation

Test Data Management

 

The Deputy for Test and Evaluation (T&E) should have approval authority for all contractor-created test plans, procedures, and reports. They must have access to all contractor testing and test results and are responsible for disseminating that data. They also establish report formats and timelines for contractor submittal and government approval. [1]

 

Guide: DAU Test and Evaluation Management Guide – Jan 2005 – Chapter 4.3.2

 

The data requirements for the entire test program are outlined in the Contract Data Requirements List (CDRL). The Deputy for T&E provides input to this section of the Request for Proposal (RFP) early in the program and ensures that the office and all associated test organizations requiring the information receive the test documentation on time. Usually, the contractor sends the data packages directly to the Deputy for T&E, who, in turn, maintains a distribution list trimmed to the minimum number of copies for agencies that need the information to perform their mission and oversight responsibilities. The Deputy for T&E should use an integrated test program and request contractor test plans and procedures well in advance of actual test performance, so that the office has time to approve the procedures or implement modifications.
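As a rough illustration of that lead-time planning, the sketch below flags CDRL test documents that would arrive too late for government review before the test date. The item names, dates, and 30-day review window are invented for illustration; actual review periods are set by each program.

```python
from datetime import date, timedelta

# Hypothetical review window: calendar days the office of the Deputy for
# T&E needs to approve a contractor test plan or request modifications.
GOVT_REVIEW_DAYS = 30

def latest_submittal_date(test_date: date, review_days: int = GOVT_REVIEW_DAYS) -> date:
    """Latest date a test plan can arrive and still allow a full review."""
    return test_date - timedelta(days=review_days)

# Illustrative CDRL items: (document, planned submittal, scheduled test date)
cdrl_items = [
    ("Structural test plan", date(2025, 3, 1), date(2025, 4, 15)),
    ("Avionics test procedure", date(2025, 4, 10), date(2025, 4, 20)),
]

for doc, submittal, test in cdrl_items:
    deadline = latest_submittal_date(test)
    status = "OK" if submittal <= deadline else "LATE - renegotiate CDRL date"
    print(f"{doc}: submit by {deadline} ({status})")
```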

 

The Deputy for T&E must receive the test results and reports on time to enable the Program Manager (PM) and others to make program decisions. The data received should be tailored to provide the minimum information needed. All must be aware that data requirements in excess of the minimum needed may lead to an unnecessary increase in the overall program cost. For data that are needed quickly and informally, the Deputy for T&E can request Quick-Look Reports that give test results immediately after test performance.

 

The contract must specify the data the contractor will supply to the Operational Test Agency (OTA). Unlike Developmental Test and Evaluation (DT&E), the contractor does not prepare the Operational Test and Evaluation (OT&E) plans, procedures, or reports; these documents are the responsibility of the OTA. The Deputy for T&E should include the OTA on the distribution list for all test documents of concern during the DT&E phase of testing so the OTA stays informed of test item progress and previous testing. An OTA representative should attend the CDRL Review Board and provide the PM with a list of the types of documents the OTA will need. The Deputy for T&E should coordinate the test sections of this data list with the OTA and raise any concerns at that meeting. All contractor test reports should be made available to the OTA. In return, the Deputy for T&E must stay informed of all OTA activities, understand the OTA's test plans and procedures, and receive the OTA's test reports.

 

AcqLinks and References:

Updated: 7/18/2017

Test & Evaluation

Test and Evaluation Reports

 

The Program Manager (PM) is required to provide Test and Evaluation (T&E) reports to OSD for 1) Developmental Test and Evaluation (DT&E), 2) Operational Test and Evaluation (OT&E), and 3) Live-Fire Test and Evaluation (LFT&E). These reports go to the Director, Operational Test and Evaluation (DOT&E) and the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)) (or as delegated/designated). The reports: [1,2]

  • Should be submitted 45 days before the decision point (see the date sketch after this list)
  • PMs will report the results of completed developmental testing to the Milestone Decision Authority (MDA) at Milestones B and C
  • The report will identify strengths and weaknesses in meeting the warfighter's documented needs, based on developmental evaluations
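As a simple illustration of the 45-day rule, the sketch below computes the latest submission date for a given decision point. The milestone date is hypothetical.

```python
from datetime import date, timedelta

def report_due_date(decision_point: date, lead_days: int = 45) -> date:
    """Latest date a T&E report can be submitted, lead_days before the decision point."""
    return decision_point - timedelta(days=lead_days)

# Hypothetical Milestone C decision review date
milestone_c = date(2025, 9, 30)
print(f"Submit the T&E report no later than {report_due_date(milestone_c)}")
```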

1) Developmental Test and Evaluation (DT&E) Report
For each program on the OSD T&E Oversight list, the DT&E Responsible Test Organization (RTO) should prepare a written report at the completion of the DT&E described in the Test and Evaluation Master Plan (TEMP). The DT&E report will: [1]

  • Provide a historical record of the final test and evaluation results for the system
  • Include the RTO's assessment of a system's military utility, capabilities, and limitations
  • Document the test techniques, procedures, and data analysis concepts
  • Provide data for operating/employment and maintenance manuals for the system
  • Be submitted to the Defense Technical Information Center (DTIC) for inclusion in their repository

2) Live-Fire Test and Evaluation (LFT&E) Report
The Director, Operational Test and Evaluation (DOT&E) monitors and reviews the LFT&E of each covered system. At the conclusion of LFT&E, the Director prepares an independent assessment report that: [1]

  • Describes the results of the survivability or lethality LFT&E, and
  • Assesses whether the LFT&E was adequate to inform decision-makers about potential user casualties and system vulnerability or lethality when the system is employed in combat, and to ensure that knowledge of user casualties and system vulnerabilities or lethality is based on realistic testing, consideration of the validated statement of desired operational capabilities, the expected threat, and susceptibility to attack.

3) Beyond Low-Rate Initial Production (LRIP) Report
To meet the statutory requirements of 10 USC 2399, the Director, Operational Test and Evaluation (DOT&E) analyzes the results of the Initial Operational Test and Evaluation (IOT&E) conducted for each Major Defense Acquisition Program (MDAP). At the conclusion of IOT&E, the Director prepares a report stating the opinion of the Director as to: [1]

  • Whether the T&E performed were adequate;
  • Whether the results of such T&E confirm that the items or components actually tested are effective and suitable for combat; and
  • Additional information on the operational capabilities of the items or components that the Director considers appropriate based on the testing conducted.

AcqLinks and References:

Updated: 6/5/2018

Technology Development

Technology Transition Initiative

10 U.S. Code § 2359a – Repealed.


 

The Technology Transition Initiative (TTI) is a DoD program that helps move technology from a Science and Technology (S&T) program into a DoD acquisition program. Congress established it in 2002 (10 U.S.C. § 2359a) to bridge the gap between the demonstration and the procurement of S&T-funded technology. It often takes 2-3 years to obtain procurement funding to buy a product, and during that time many technology projects become obsolete or are canceled for lack of funding. TTI helps prevent this. [1]

Key provisions of the code include:

  • TTI is intended to accelerate the introduction of new technologies into operational capabilities for the armed forces.
  • TTI supports the demonstration of new technologies in relevant environments.
  • The science and technology and acquisition executives of each military department and each appropriate Defense Agency and the commanders of the unified and specified combatant commands nominate projects to be funded.
  • The TTI Program Manager identifies promising projects that meet DoD technology goals and requirements in consultation with the Technology Transition Council.
  • The TTI Program Manager and the appropriate acquisition executive can share the transition cost.  Service/Agency contribution can be up to 50% of the total project cost.  A project cannot be funded for more than four years.

To be considered for TTI funding, a project must meet the following criteria (a screening sketch follows the list):

  • Technology developed with S&T funding,
  • Product has buyer with funds available to purchase it in later years,
  • Preferably Joint or Multi-Service project (2 or more Services/Agencies),
  • Cost sharing between TTI and Service/Agency is encouraged to leverage funding, and
  • TTI Project duration of less than four years.
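As a hypothetical screening sketch of the hard criteria above (field names are illustrative, not an official checklist format):

```python
def tti_screening_issues(project: dict) -> list[str]:
    """Return the criteria a candidate project fails (empty list = passes)."""
    issues = []
    if not project.get("st_funded"):
        issues.append("Technology was not developed with S&T funding")
    if not project.get("buyer_with_funds"):
        issues.append("No buyer with funds available in later years")
    if project.get("duration_years", 0) >= 4:
        issues.append("Project duration must be less than four years")
    if project.get("service_cost_share", 0.0) > 0.50:
        issues.append("Service/Agency share cannot exceed 50% of total cost")
    return issues

candidate = {
    "st_funded": True,
    "buyer_with_funds": True,
    "duration_years": 3,
    "service_cost_share": 0.40,  # 40% Service/Agency, 60% TTI
}
print(tti_screening_issues(candidate) or "Passes the screening criteria")
```

Joint or multi-Service participation and cost sharing are preferences rather than hard gates, so they are left out of the sketch.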

For more information or to submit a technology for review, visit the DoD Technology Transition Initiative website.

AcqLinks and References:

Updated: 6/22/2018

Technology Development

Technology Broad Agency Announcement

 

A Broad Agency Announcement (BAA) is a notice from the government that requests scientific or research proposals from private firms concerning certain areas of interest to the government. The proposals submitted by the private firms may lead to contracts.

 

Per FAR 35.016(a), the Broad Agency Announcement procedure is used for the acquisition of basic and applied research and that part of development not related to the development of a specific system or hardware procurement. Agencies may use BAAs to fulfill their requirements for scientific study and experimentation directed toward advancing the state of the art or increasing knowledge or understanding, rather than focusing on a specific system or hardware solution. The BAA technique shall be used only when meaningful proposals with varying technical/scientific approaches can be reasonably anticipated.

 

AcqLinks and References:

Updated: 6/25/2018

Intelligence & Security

Technology Assessment & Control Plan

 

The Technology Assessment/Control Plan (TA/CP) is prepared by the Program Manager (PM) when there will be foreign involvement in a program. It is prepared after the identification of Critical Program Information (CPI) and the completion of the Security Classification Guide (SCG). The TA/CP is used to: [1]

  • Assess the feasibility of U.S. participation in joint programs from a foreign disclosure and technical security perspective.
  • Prepare guidance for negotiating the transfer of classified information and critical technologies involved in international agreements.
  • Identify security arrangements for international programs.
  • Provide a basis for the Delegation of Disclosure Authority Letter (DDL) that contains specific guidance on proposed disclosures.
  • Support the acquisition decision review process.
  • Support decisions on foreign sales, co-production, licensed production, or commercial sales of the system, or on international cooperative agreements involving U.S. technology or processes.
  • Support decisions on the extent and timing of foreign involvement in the program, foreign sales, and access to program information by foreign interests.

 

The Technology Assessment/Control Plan (TA/CP) is composed of four sections: [1]

  1. Program Concept: This section requires a concise description of the purpose of the acquisition program.
  2. Nature and Scope of Effort and the Objectives: This section briefly explains the operational and technical objectives of the program (e.g., co-production, cooperative research and development) and discusses any foreign participation or involvement.
  3. Technology Assessment: This section analyzes the technology involved in the program, its value, and the consequences of its compromise. It should provide conclusions regarding the need for protective security measures and the advantages and disadvantages of any foreign participation in the program, in whole or in part, and should describe foreign sales.
  4. Control Plan: This section describes the actions to be taken to protect U.S. interests when foreign involvement or sales are anticipated. Those actions should be specific and address the particular risks, if any, discussed in the technology assessment. Actions might include withholding certain information, stringent phasing of releases, or development of special security requirements.

 

The Delegation of Disclosure Authority Letter (DDL) contains specific guidance on proposed disclosures and supports recommendations for foreign involvement, disclosure of the program to foreign interests, requests for authority to conclude an international agreement, or decisions to authorize foreign sales. The TA/CP provides the basis for the DDL.

 

AcqLinks and References:

Updated: 11/29/2018

Requirements Development

Technical Requirements Document (TRD)

 

The Technical Requirements Document (TRD) is no longer in use; it was replaced by the System Requirements Document (SRD).

 

A Technical Requirements Document (TRD) was a government-prepared requirements document that addressed technical-level requirements for a system. It accompanied a Request for Proposal (RFP) and provided a better technical breakdown than the Operational Requirements Document (ORD), which usually lacked the technical detail required to identify system performance outcomes in a system specification. The TRD has since been replaced by the System Requirements Document (SRD).

 

In 2003, the TRD and ORD were phased out with the introduction of the Joint Capabilities Integration and Development System (JCIDS) process.  The ORD was replaced by the Initial Capabilities Document (ICD), the Capability Development Document (CDD), and the Capability Production Document (CPD).

 

Updated: 7/21/2018

Systems Engineering

Technical Performance Measurement (TPM)

 

Technical Performance Measurement (TPM) is a technique for predicting the future value of a key technical performance parameter of the higher-level end product under development, based on current assessments of products lower in the system structure. [3]

 

At the start of a program, TPMs define the planned progress of selected technical parameters. The plan is defined in terms of expected performance at specific points in the program as defined in the Work Breakdown Structure (WBS) and Integrated Master Schedule (IMS), the methods of measurement at those points, and the variation limits for corrective action. [2]


Figure: Example Technical Performance Measurement

 

TPMs provide an assessment of key capability values in comparison with those expected over time. TPM is an evolutionary program management and systems engineering tool that augments the two Earned Value Management (EVM) parameters, cost and schedule performance, with a third: the status of technical achievement. By combining cost, schedule, and technical progress into one comprehensive management tool, program managers can assess the progress of their entire program. TPMs are typically established on programs complex enough that the status of technical performance is not readily apparent. TPMs are also valuable for risk tracking: levels below forecast can indicate the need for an alternate approach.

 

With a TPM program, it is possible to continuously compare the actual achievement of technical parameters against their anticipated values. TPM is also used to identify and flag deficiencies that might jeopardize meeting a critical system-level requirement. Measured values that fall outside an established tolerance band alert management to take corrective action. By tracking the system's TPMs, the Program Manager and systems engineer gain visibility into whether the delivered system will actually meet its performance specifications (requirements). Beyond that, tracking TPMs ties together the basic systems engineering activities of systems analysis and control, functional analysis and allocation, and verification and validation.
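A minimal sketch of that tolerance-band check appears below. The parameter (vehicle weight), checkpoints, planned values, and tolerance are invented for illustration.

```python
# Planned TPM profile at successive program checkpoints; values invented.
planned_weight_kg = {"PDR": 120.0, "CDR": 115.0, "TRR": 110.0}
tolerance_kg = 5.0  # allowed deviation before corrective action

measured_weight_kg = {"PDR": 122.0, "CDR": 123.5}  # current assessments

for checkpoint, measured in measured_weight_kg.items():
    planned = planned_weight_kg[checkpoint]
    deviation = measured - planned
    if abs(deviation) > tolerance_kg:
        print(f"{checkpoint}: {measured} kg is {deviation:+.1f} kg off plan"
              f" -> outside tolerance, corrective action needed")
    else:
        print(f"{checkpoint}: {measured} kg is within +/-{tolerance_kg} kg of plan")
```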

 

TPMs are typically derived directly from Measures of Performance (MOP) to characterize physical or functional attributes relating to the execution of the mission or function. TPMs may also be derived from Measures of Effectiveness (MOE) to become system cost and effectiveness metrics. Some guidance for selecting TPMs:

  • The parameter significantly qualifies the entire system
  • The parameter is directly derived from analyses, demonstrations, or tests
  • A direct measure of value can be derived from results of analyses or tests
  • Predicted values have a basis (analyses, historical data)
  • Each parameter can periodically be measured and profiled to compare with predicted values and tolerances over the project life cycle.

 

The most important step in TPM planning is the development of the Technical Parameter Hierarchy, which requires establishing the Technical Performance Baseline. The technical performance baseline identifies all measurable key technical elements and establishes their relative relationships and importance. The hierarchy can represent the program, contract, subcontract, or another subset of technical requirements, and it must comprehensively represent the technical risk factors associated with the project. Typically, the highest level of the hierarchy represents system-level or operational requirements, with subsystem-level requirements underneath as lower-level parameters; a small tree sketch follows. This form of TPM methodology not only serves internal tracking by the systems engineer but also adds visibility to program status reporting.
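One way to picture such a hierarchy is a simple tree from system-level requirements down to contributing subsystem parameters. The parameter names below are invented for illustration.

```python
# Illustrative technical parameter hierarchy: system-level TPMs at the top,
# contributing subsystem parameters underneath. All names are invented.
tpm_hierarchy = {
    "System range (km)": {
        "Engine specific fuel consumption": {},
        "Airframe empty weight (kg)": {
            "Wing structure weight (kg)": {},
            "Fuselage structure weight (kg)": {},
        },
    },
    "System availability (%)": {
        "Mean time between failures (hr)": {},
        "Mean time to repair (hr)": {},
    },
}

def print_tree(node: dict, indent: int = 0) -> None:
    """Print the hierarchy with indentation showing parameter levels."""
    for name, children in node.items():
        print("  " * indent + name)
        print_tree(children, indent + 1)

print_tree(tpm_hierarchy)
```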

 

AcqLinks and References:

Updated: 6/01/2018

Program Management

Team Development Stages

 

In 1965, Bruce Tuckman proposed that teams go through a team-building process comprising four stages of growth: Forming, Storming, Norming, and Performing; a fifth stage, Adjourning, was added later. Teams do not typically go through the team-building process of their own accord; they must be led through it by their Program Manager (PM). Below is a detailed explanation of each stage.

 

Forming Stage
In this stage, team members are introduced. The program manager needs to state why each member was chosen and what they will accomplish for the team. This is a good time to lay out the mission, vision, and goals of the team. This is a stage of transition from individual to team-member status, and of testing the leader's guidance both formally and informally. During the Forming stage of team building, a program manager should:

  • Promote excitement, anticipation, and optimism
  • Develop an appropriate leadership style
  • Set the Mission, Vision, and Goals
  • Develop a team decision process

 

Forming activities include abstract discussions of the concepts and issues; some members will be impatient with these discussions.  There is often difficulty in identifying some of the relevant problems because there is so much going on that members get distracted.  The team often accomplishes little concerning its goals in this stage.  A program manager needs to know this is perfectly normal.

 

Storming Stage
This is the stage in which a group becomes a team. During this stage, most members have their own ideas about how the process should look, and personal agendas are often rampant. Storming is probably the most difficult stage for the team and the program manager to control. Members begin to realize that the tasks ahead of them are different and more difficult than they previously imagined. They grow impatient about the lack of progress and argue about what actions the team should take. They try to rely solely on their personal and professional experience and resist collaborating with other team members. Typical characteristics of a team in the Storming stage that a program manager should know include:

  • Resisting the tasks
  • Resisting quality improvement approaches suggested by other members
  • Sharp fluctuations in attitude about the team’s chance of success
  • Arguing among members, even when they agree on the real issues
  • Defensiveness, competition, and choosing sides
  • Questioning the wisdom of those who selected the project and appointed the members of the team

 

The Storming stage is where a program manager's leadership qualities will be tested. It is their job to make sure the team does not break apart and stays on track. A program manager needs to determine the best leadership style to prevent this from happening; the style will depend on the team members' makeup and response. These leadership styles include:

  • Autocratic Leadership – Best applied to situations where there is little time for group decision-making or where the leader is the most knowledgeable member of the group.
  • Democratic Leadership – Best applied for teams that are very skilled, motivated and working effectively.
  • Free-Rein Leadership – Effective in situations where group members are highly skilled, motivated and capable of working on their own.

 

Norming Stage
In this stage, the team reaches a consensus on the “To-Be” process. Enthusiasm is high, and the team is often tempted to go beyond the original scope of the process, which a program manager must prevent. During this stage, members reconcile competing loyalties and responsibilities. They accept the team, its ground rules, controls, roles, and the individuality of fellow members. Emotional conflict is reduced as previously competitive relationships become more cooperative. During the Norming stage of team building, a program manager should:

  • Prevent the team from going beyond the original scope of the process
  • Reinforce the standards
  • Eliminate barriers to effective decision making

 

Performing Stage
In this stage, the team has largely settled its relationships and expectations. Members can begin performing by diagnosing problems, solving them, and implementing changes. By now, team members have discovered and accepted each other's strengths and weaknesses and have learned what their roles are. During the Performing stage, a program manager should:

  • Reinforce the control process through audits
  • Look for areas of optimization
  • Manage the decision-making process

 

Updated: 7/16/2017

Aerospace Industry

University Affiliated Research Center (UARC)

 

University-Affiliated Research Center (UARC) Laboratories
A University Affiliated Research Center (UARC) is a strategic United States Department of Defense (DoD) research center associated with a university. UARCs were formally established in May 1996 to ensure that essential engineering and technology capabilities of particular importance to the DoD are maintained. These not-for-profit organizations maintain essential research, development, and engineering “core” capabilities; maintain long-term strategic relationships with their DoD sponsors; and operate in the public interest, free from real or perceived conflicts of interest. Collaboration with the educational and research resources available at their universities enhances each UARC's ability to meet the needs of its sponsors.
Navy
  • Johns Hopkins University – Applied Physics Laboratory (APL): The U.S. Navy is APL's primary long-term sponsor. The Laboratory also performs work for the Missile Defense Agency, the Department of Homeland Security, intelligence agencies, the Defense Advanced Research Projects Agency (DARPA), and others, and supports NASA through space science, spacecraft design and fabrication, and mission operations. APL has made significant contributions in air defense, strike and power projection, submarine security, antisubmarine warfare, strategic systems evaluation, command and control, distributed information and display systems, sensors, information processing, and space systems. (Wikipedia)
  • Pennsylvania State University – Applied Research Laboratory (ARL): ARL serves as a university center of excellence in defense science and technologies, with a focus on naval missions and related areas. It maintains a long-term strategic relationship with the Navy and supports the other services. ARL provides science and technology for national security, economic competitiveness, and quality of life through education, scientific discovery, technology demonstration, and transition to application.
  • University of Hawaii at Manoa – Applied Research Laboratory (ARL): ARL serves as a research center of excellence for critical Navy and national defense science, technology, and engineering, with a focus on naval missions and related areas. It conducts research for the Navy, the Department of Defense, and other government agencies. Research areas include oceanography and environmental research, astronomical research, advanced electro-optical systems, laser, lidar, and remote sensing detection systems, and engineering programs supporting sensors, communications, and information technology.
  • University of Texas at Austin – Applied Research Laboratories (ARL): ARL's research programs consist entirely of sponsored projects, with the bulk of the sponsorship from the Navy and the Department of Defense. ARL has research programs in applications of acoustics, electromagnetics, and information technology.
  • University of Washington – Applied Physics Laboratory (APL): APL conducts acoustic and oceanographic studies of how deep-ocean variability affects Navy systems. Its scientists and engineers pursue leadership roles in acoustics and remote sensing, ocean physics and engineering, medical and industrial ultrasound, polar science and logistics, environmental and information systems, and electronic and photonic systems.
Army
  • University of California at Santa Barbara – Institute for Collaborative Biotechnologies (ICB): ICB research teams develop technological innovations in bio-inspired materials and energy, biomolecular sensors, bio-inspired network science, and biotechnological tools.
  • University of Southern California – Institute for Creative Technologies (ICT): ICT collaborates with the entertainment industry and is a leader in producing virtual humans, computer training simulations, and immersive experiences for decision-making, cultural awareness, leadership, and health.
  • Georgia Institute of Technology – Georgia Tech Research Institute (GTRI): GTRI's research spans a variety of disciplines, including national defense, homeland security, public health, education, mobile and wireless technologies, and economic development.
  • Massachusetts Institute of Technology – Institute for Soldier Nanotechnologies (ISN): Fundamental science and engineering research is the centerpiece of the ISN mission. Designing the soldier system of the future is a formidable task that requires a range of experts, from chemists to mechanical engineers. ISN research is divided into three broad capability areas that cross disciplinary boundaries: protection; injury intervention and remediation; and human performance improvement.
NASA
  • University of California at Santa Cruz – Ames Research Center: Ames was founded to engage in wind-tunnel research on the aerodynamics of propeller-driven aircraft; however, its role has grown beyond research and technology in aeronautics to encompass spaceflight and information technology. Ames plays a role in many NASA missions in support of America's space and aeronautics programs. It provides leadership in astrobiology; small satellites; robotic lunar exploration; technologies for the Constellation Program; the search for habitable planets; supercomputing; intelligent/adaptive systems; advanced thermal protection; and airborne astronomy. (Wikipedia)
MDA
  • Utah State University – Space Dynamics Laboratory (SDL): SDL solves technical challenges faced by the military, science community, and industry by serving MDA and the DoD in electro-optical sensor systems research and development, pioneering efficient and effective calibration and characterization techniques, innovating CubeSat buses and small-scale components, and developing real-time reconnaissance data visualization hardware and software for operational military applications.
OSD
  • Stevens Institute of Technology – Systems Engineering Research Center (SERC): SERC provides systems engineering knowledge and research to the Department of Defense, NASA, and other government agencies. It offers researchers a community of broad experience, deep knowledge, and diverse interests, and it comprises a significant part of systems engineering research and educational programs in the United States.
  • University of Maryland, College Park – Applied Research Laboratory for Intelligence and Security
STRATCOM
  • University of Nebraska – National Strategic Research Institute

Website: Defense Innovation Marketplace (most up-to-date list of UARCs)

Updated: 10/06/2020

Business & Marketing

Worker Adjustment and Retraining Notification Act (WARN)

 

The Worker Adjustment and Retraining Notification (WARN) Act (29 USC Chapter 23) protects workers, their families, and communities by requiring employers to provide notice 60 days in advance of covered plant closings and covered mass layoffs. This notice must be provided to affected workers or their representatives (e.g., a labor union), to the State dislocated worker unit, and to the appropriate unit of local government. [1]

 

Employer Coverage
In general, employers are covered by WARN if they have 100 or more employees, not counting employees who have worked less than 6 months in the last 12 months and not counting employees who work an average of less than 20 hours a week. Private, for-profit employers and private, nonprofit employers are covered, as are public and quasi-public entities which operate in a commercial context and are separately organized from the regular government. Regular Federal, State, and local government entities which provide public services are not covered.

 

Employee Coverage
Employees entitled to notice under WARN include hourly and salaried workers, as well as managerial and supervisory employees. Business partners are not entitled to notice.

 

What Triggers Notice

  • Plant Closing: A covered employer must give notice if an employment site (or one or more facilities or operating units within an employment site) will be shut down, and the shutdown will result in an employment loss (as defined later) for 50 or more employees during any 30-day period. This does not count employees who have worked less than 6 months in the last 12 months or employees who work an average of less than 20 hours a week for that employer. These latter groups, however, are entitled to notice (discussed later).
  • Mass Layoff: A covered employer must give notice if there is to be a mass layoff which does not result from a plant closing, but which will result in an employment loss at the employment site during any 30-day period for 500 or more employees, or for 50-499 employees if they make up at least 33% of the employer’s active workforce. Again, this does not count employees who have worked less than 6 months in the last 12 months or employees who work an average of less than 20 hours a week for that employer. These latter groups, however, are entitled to notice (discussed later).

 

An employer also must give notice if the number of employment losses during a 30-day period does not meet the threshold for a plant closing or mass layoff, but the losses for two or more groups of workers, each below the minimum needed to trigger notice, together reach the threshold level of either a plant closing or mass layoff during any 90-day period. Job losses within any 90-day period count together toward WARN threshold levels unless the employer demonstrates that the losses during that period result from separate and distinct actions and causes.
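As an illustrative sketch of the 30-day triggers described above (a simplification, not legal guidance; it assumes the employer is covered and leaves out the 90-day aggregation and the statute's detailed definitions):

```python
def counted_employees(total: int, short_tenure: int, part_time: int) -> int:
    """Employees counted toward WARN thresholds: excludes workers with under
    6 months' tenure in the last 12 months and those averaging < 20 hrs/week."""
    return total - short_tenure - part_time

def warn_notice_required(active_workforce: int, losses_30_day: int,
                         site_shutdown: bool) -> bool:
    """Sketch of the plant-closing and mass-layoff triggers for one 30-day period."""
    if site_shutdown and losses_30_day >= 50:   # plant closing
        return True
    if losses_30_day >= 500:                    # mass layoff, any workforce size
        return True
    if losses_30_day >= 50 and losses_30_day >= 0.33 * active_workforce:
        return True                             # mass layoff, 33% rule
    return False

# Hypothetical employer: 400 counted employees, 140 losses, no full shutdown
workforce = counted_employees(total=450, short_tenure=30, part_time=20)
print(warn_notice_required(workforce, losses_30_day=140, site_shutdown=False))
# -> True (140 losses is at least 50 and exceeds 33% of 400)
```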

 

AcqLinks and References:

Updated: 6/18/2018