Distribution Restriction:
Operational Test and
Evaluation Manual
Commander, Operational Test And Evaluation
Force
OPTEVFORINST 3980.2J
20 Oct 21
Version 1.0
20 October 2021
RECORD OF REVISIONS
Change Number   Summary of Changes                                            Updated
                This is the initial Operational Test and Evaluation Manual    26 Aug 20
1               This is the updated Operational Test and Evaluation Manual    20 Oct 21
Operational Test & Evaluation Manual
TABLE OF CONTENTS
SECTION 1 - INTRODUCTION . 1-1
1.1 THE MISSION .................................. 1-1
1.2 THE OT&E PROCESS...................... 1-3
1.3 KEY DIRECTIVES ........................... 1-7
SECTION 2 - ORGANIZATIONAL
RELATIONSHIPS ... 2-1
2.1 INTRODUCTION ............................. 2-1
2.2 EXTERNAL ALIGNMENT .............. 2-1
2.3 INTERNAL ALIGNMENT ............... 2-3
2.4 PROCESS OWNERS ........................ 2-9
2.5 WARFARE DIVISION ROLES AND
RESPONSIBILITIES
PRIMARY DUTIES ...... 2-11
2.6 THE TEAM CENTRIC
CONSTRUCT ................ 2-14
SECTION 3 - GENERAL
ADMINISTRATIVE
PROCESSES ............ 3-1
3.1 INTRODUCTION ............................. 3-1
3.2 GENERAL ......................................... 3-1
3.3 COLLABORATION.......................... 3-1
3.4 CROSS-DOMAIN POST-TEST
BRIEFS ............................ 3-2
3.5 TRAINING FOR NEW TESTERS ... 3-3
3.6 POLICY AND REFERENCES ......... 3-4
3.7 SUT/SoS REQUIREMENTS............. 3-5
3.8 GENERAL WRITING STYLE ......... 3-5
3.9 EDITORS .......................................... 3-6
3.10 BRIEFINGS TO THE COMMANDER
OR DEPUTY ................... 3-6
3.11 COMOPTEVFOR'S POSITION ...... 3-7
SECTION 4 - TEST DESIGN
BASICS ..................... 4-1
4.1 MBTD .......................................... 4-1
SECTION 5 - TEST AND
EVALUATION
MASTER PLAN
(TEMP) ..................... 5-1
5.1 INTRODUCTION ............................. 5-1
5.2 ADMINISTRATIVE POLICIES ....... 5-1
5.3 ORGANIZATION/CONTENT ......... 5-2
5.4 DEVELOPMENT .............................. 5-3
5.5 REVIEW AND APPROVAL ............ 5-3
5.6 UPDATE .......................................... 5-5
5.7 TEST AND EVALUATION
COORDINATING GROUP
(TECG) (DoN ONLY) ..... 5-6
SECTION 6 - TEST PLANNING 6-1
6.1 GENERAL ......................................... 6-1
6.2 INTRODUCTION ............................. 6-1
6.3 BRIEFING TEST PLANS ................. 6-2
6.4 LIMITATIONS TO TEST ................. 6-3
SECTION 7 - TEST DATA
CONTROL ................ 7-1
7.1 GENERAL ......................................... 7-1
7.2 SHARING AND RELEASE OF OT
DATA AND RESULTS .. 7-1
SECTION 8 - EVALUATION
REPORTS ................. 8-1
8.1 INTRODUCTION ............................. 8-1
8.2 TYPES OF OPERATIONAL
EVALUATION AND
OTHER REPORTS .......... 8-1
SECTION 9 - RESOURCES ........ 9-1
9.1 INTRODUCTION ............................. 9-1
9.2 ELECTRONIC RESOURCES .......... 9-1
9.3 T&E PROGRAM SYSTEM (TEPS) . 9-2
9.4 SHARED DRIVES ............................ 9-2
9.5 PHYSICAL RESOURCES ................ 9-3
9.6 TEMPORARY ASSIGNED DUTY
(TAD) TRAVEL .............. 9-3
9.7 FLEET SERVICES ............................ 9-4
9.8 MULTISERVICE REQUESTS ......... 9-7
9.9 RELATED COMMUNICATIONS ... 9-8
9.10 TEST TARGETS ............................. 9-8
SECTION 10 - MODELING AND
SIMULATION ........ 10-1
SECTION 11 - CYBER
SURVIVABILITY
TEST AND
EVALUATION ....... 11-1
11.1 INTRODUCTION ......................... 11-1
11.2 TEST PLANNING......................... 11-2
11.3 MODELING AND SIMULATION
(M&S) ............................ 11-2
11.4 TEST EXECUTION AND POST
TEST PROCESS ............ 11-2
SECTION 12 - CONTRACT
SUPPORT ............... 12-1
12.1 INTRODUCTION ......................... 12-1
12.2 ROLES AND RESPONSIBILITIES .... 12-2
12.3 GENERAL CONTRACT TASK
ORDER INITIATION
PROCEDURES.............. 12-3
12.4 SERVICE REQUIREMENTS
REVIEW BOARD
(SRRB) ........................... 12-4
12.5 TECHNICAL EVALUATION
BOARD (TEB) .............. 12-4
12.6 TASK ORDER AWARD .............. 12-5
12.7 TASK ORDER MODIFICATIONS ..... 12-5
12.8 INVOICE CONCURRENCE ........ 12-6
12.9 ASSESSING CONTRACTOR
PERFORMANCE .......... 12-8
12.10 TASK ORDER CHECKLIST...... 12-9
12.11 TEMPLATE E-MAIL (WHEN
DISTRIBUTING TASK
ORDERS AND
MODIFICATIONS) TO PM
BUDGET OFFICE: ..... 12-10
SECTION 13 - FINANCE ......... 13-1
13.1 INTRODUCTION ......................... 13-1
13.2 FISCAL GUIDANCE AND
PROCUREMENT
INTEGRITY .................. 13-1
13.3 FUNDING SOURCES AND
REGULATIONS............ 13-1
13.4 AMPLIFYING GUIDANCE ON USE
OF PROGRAM FUNDS 13-4
13.5 SPECIFIC GUIDANCE REGARDING
PROGRAM FUNDS ...... 13-5
13.6 INAPPROPRIATE USES OF
PROGRAM FUNDING . 13-6
13.7 PROGRAM FUNDING
DOCUMENTS ............... 13-7
13.8 ADDITIONAL FISCAL
GUIDANCE/SUPPORT
AVAILABLE ................. 13-7
APPENDIX A - ACRONYMS AND
ABBREVIATIONS .. A-1
APPENDIX B - THE CONTINUUM
OF TESTING .......... B-1
APPENDIX C - ELECTRONIC
MANAGEMENT
SYSTEMS ................ C-1
APPENDIX D - SQUADRON
COORDINATION ... D-1
APPENDIX E - TEST AND
EVALUATION
STAKEHOLDERS .. E-1
APPENDIX F - GLOSSARY ....... F-1
TABLES
Table 3-1. Warfare Division Brief Schedule
.................................................................. 3-2
Table 3-2. Nomination and Review Timeline
.................................................................. 3-3
Table 3-3. Document Routing after
Signature .................................................. 3-7
Table 3-4. Signature Authority ............... 3-8
Table 5-3. TEMP Comment Letter Timelines
.................................................................. 5-4
Table 8-1. Report Format Guidance ....... 8-1
Table 12-1. Contract TO Package
Generation Responsibilities ................... 12-3
Table B-1. Comparison of DT&E and OT&E
................................................................. B-3
Table B-2. Description and Decision
Authority for ACAT I-IV and AAPs ...... B-4
Table B-3. DT Assist-Combined DT/OT-IT
Comparison ........................................... B-14
Table C-1. Critical TEPS Fields ............ C-2
FIGURES
Figure 2-1. COMOPTEVFOR External
Relationships ............................................ 2-2
Figure 2-2. COMOPTEVFOR Internal
Relationships ............................................ 2-7
Figure 2-3. Team-Centric Construct ....... 2-14
SECTION 1 - INTRODUCTION
1.1 THE MISSION
OPTEVFOR's mission is to ensure naval forces can fight and win by evaluating warfare capabilities in realistic combat environments with Fleet warfighters. Our test reports rapidly inform Navy, Marine Corps, and Coast Guard warfighters and support acquisition decisions.
1.1.1 Vision
Our vision is to be recognized as the voice of operational truth with the Fleet. We will lead the Operational Test (OT) community with highly skilled testers and staff who adapt to change, and provide credible, prompt, warfighting-focused test results to Navy, Marine Corps, and Coast Guard forces, and acquisition leadership.
1.1.2 Strategic Objectives
We place a high priority on:
• Informing the Fleet, by making warfighting information readily available, and by reporting test results to the Fleet.
• Supporting Acquisition, by providing timely, accurate, and impactful information, and establishing a team culture among stakeholders.
• Improving OPTEVFOR, by improving organizational effectiveness, and aligning our purpose and priorities.
1.1.3 The Admiral’s Priorities For 2020
• Improve communications and coordination across warfare domains
• Implement the Operational Test Agencies' (OTA) six core test principles (see below) and Capabilities-Based Test and Evaluation (CBTE)
• Develop enterprise-wide efficient and effective cyber tests
• Modernize how we educate and train our workforce
1.1.4 The Six OTA Core Principles of Adaptive, Relevant Testing (ART):
In support of the National Defense Strategy (NDS), the OTA Commanders determined that the OTAs must shift their focus from solely supporting the acquisition and warfighter communities to supporting "delivering combat capability at the speed of relevance." To that end, the OTAs named six core test principles designed to ensure the future relevance, credibility, and timeliness of OT. Each Service OTA, along with the Joint Interoperability Test Command (JITC), has agreed to these principles as a framework to achieve the NDS vision:
• Early OT Involvement – OTA engaged from the very inception of the program, truly part of the team, more than just a seat at the table
• Tailor to the Situation – Each OT focused on the warfighter, tailored processes to ensure rapid fielding of capability
• Continuous and Cumulative Feedback – sharing test results with the program almost as soon as we know them; OT is independent, but also a partner
• Streamline Processes and Products – remove bureaucratic constraints that slow the process
• Integrated and Combined Collection/Test – "One Team" approach to all testing, utilize all test events to meet Contractor Testing (CT), Developmental Testing (DT), and OT objectives
• Adaptive – willing to adjust test designs and processes to apply lessons learned and address the real needs of the warfighter, testing at the "speed of relevance."
The details of these six principles are provided by an OTA Joint Memorandum of May 2019,
available at Y:\OT&E Reference Library\Memorandums of Agreement (MOA). These core
principles were subsequently endorsed by the Director, Operational Test and Evaluation (DOT&E)
and emphasized in the 2019 DOT&E Annual Report.
1.1.5 Purpose
The purpose of this manual is to familiarize the reader with the role of Operational Test and
Evaluation (OT&E) in the Navy acquisition process and to prescribe policies for the planning,
conduct, and reporting of OT&E on various new and improved systems. This manual provides
policy and high-level guidance on the methods and processes used by Operational Test and
Evaluation Force (OPTEVFOR) in the conduct of OT&E. Specifics associated with the actual
conduct of OT events and the generation of test plans and writing of test reports are provided in
handbooks associated with the specific topic; these handbooks provide the reader the details of
“how to” actually conduct the task or event. Throughout all processes and in the application of all
guidance, OPTEVFOR testers are required to use critical thinking and maintain a questioning
frame of mind.
1.1.6 Background
By direction of the CNO, Commander, Operational Test and Evaluation Force (COMOPTEVFOR)
is chartered to conduct OT&E of systems in Acquisition Category (ACAT) I, II, III, and IVT (Test)
procurement programs. OT&E is conducted in as realistic an operational environment as possible, with Fleet personnel operating and maintaining the System Under Test (SUT). Wherever possible, simulated hostile threat action is employed to stress the system. The operational experience and judgment of the naval personnel conducting OT&E are of utmost importance to the validity of OT&E results, conclusions, and recommendations. To that end, meticulous planning, preparation, conduct, and reporting of OT&E are mandatory. It is also important to note that although
COMOPTEVFOR works very closely with the acquisition process, the command is the Navy’s
independent OTA, works for the CNO, and must represent the equities of the warfighter to the
acquisition community.
1.1.7 The Role of COMOPTEVFOR
It is important to put the role of OT&E in context to best understand the responsibilities of
COMOPTEVFOR. In addition to its statutory missions, COMOPTEVFOR has further responsibilities assigned by the CNO to assist the Service Acquisition Executive by
providing early assessments of the operational effectiveness and operational suitability of major
acquisition programs being developed by the Department of the Navy (DoN). These early
assessments are intended to help senior leaders identify risks and benefits of systems under
development so that the best acquisition decisions can be made.
During program development, OPTEVFOR will typically provide one or more
operational assessments (OA) to help inform the Service Acquisition Executive and the Resource
Sponsor on the progress being made with particular focus on the risks that are likely to be observed
at Initial Operational Test and Evaluation (IOT&E).
During IOT&E, OPTEVFOR exercises its statutory responsibility to evaluate the operational
effectiveness, operational suitability, and cyber survivability of the SUT. In addition, the
Commander evaluates the operational effectiveness and the operational suitability of the SUT’s
performance as part of the overall System of Systems (SoS). As will be discussed later, it is not
uncommon to find a SUT that performs exactly as desired within a larger SoS even though the SoS does not accomplish the intended mission.
Depending on the structure of the program, there will likely be additional phases of test designed
to support the Verification of the Correction of Deficiencies (VCD) found in IOT&E or to assess
delivery of additional capability. Depending on the success of the IOT&E and/or the scope of
future changes, these additional test periods will vary significantly in size and scope.
In addition to acquisition program OT&E, COMOPTEVFOR supports the CNO and Fleet
Commanders by participating in Warfare Capability Baseline (WCB) assessments. These
assessments examine specific kill or effects chains identified by Fleet Commanders and report on
the Navy’s capability across all kill/effects chain platforms, networks, weapons, and
sensors. Often led by a Warfighting Development Center (WDC), the WCB assessment is
intended to draw on OT and Fleet data to display an objective view of the level of integration and
interoperability associated with the SoS capability for each chain. WCB assessments are
inextricably linked to the kill/effects chain used to inform SUT evaluations during OT&E because
each system must work within a SoS to provide warfighting capability. As each OPTEVFOR
warfare division collaborates with the WDCs to conduct fleet relevant OT&E, they should ensure
OT&E knowledge, insight, and data are used as part of WCB assessments, while simultaneously drawing upon these assessments to inform the operational context for mission-based test designs,
plans, and analyses. To enable this relevant fleet and OPTEVFOR insight, each OT&E risk or
deficiency should articulate the mission task impacted by that deficiency.
1.2 THE OT&E PROCESS
1.2.1 Mission Based Test Design (MBTD)
Once a program is assigned to a warfare division, the first step is to employ a process known as
MBTD to develop an evaluation strategy. Section 4 and the Integrated Evaluation Framework (IEF) Checklist provide a detailed discussion of the MBTD process. In basic terms, MBTD begins with the Navy Required Operational Capability/Projected Operational Environment (ROC/POE) mission areas and then examines the specific mission contributions ascribed to the system. To accomplish this, the standard mission threads (first-level subtasks) are decomposed (as needed) into second- and third-level subtasks. Conditions, measures, and Data Requirements (DR) are identified and traced to subtasks. MBTD also incorporates Design of Experiments (DOE) to create defendable, minimum-adequate test designs for key SUT concerns. The product of this effort is a document known as the IEF. The IEF provides the foundation for the Operational Test Agency (OTA) input to the Test and Evaluation Master Plan (TEMP). It also enables the OT community to become a full-fledged partner in Integrated Testing (IT) with members of the CT and DT communities. Beyond its evident support of the acquisition process, the mission-task breakdown developed in the MBTD process serves as the foundation for the creation of effects chains used in other analyses.
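The decomposition and DOE concepts above lend themselves to a simple mental model. The sketch below is purely illustrative and is not OPTEVFOR tooling, an IEF format, or an approved design method; the mission thread, subtasks, measures, data requirements, factors, and levels are all invented for the example. It only shows how subtasks might carry traced measures and DRs, and how a full-factorial set of test conditions (which a real DOE effort would typically reduce to a minimum-adequate design) could be enumerated:

    # Illustrative sketch only; names and values are hypothetical, not official data.
    from dataclasses import dataclass, field
    from itertools import product
    from typing import List

    @dataclass
    class Subtask:
        name: str                                                 # second- or third-level subtask
        measures: List[str] = field(default_factory=list)         # measures traced to this subtask
        data_requirements: List[str] = field(default_factory=list)  # DRs traced to this subtask

    @dataclass
    class MissionThread:
        name: str                                                 # first-level subtask (mission thread)
        subtasks: List[Subtask] = field(default_factory=list)

    # Hypothetical decomposition of one ROC/POE mission area
    thread = MissionThread(
        name="Detect surface contact",
        subtasks=[
            Subtask("Search assigned area",
                    measures=["Probability of detection"],
                    data_requirements=["DR-001 sensor track logs"]),
            Subtask("Classify contact",
                    measures=["Mean time to classify"],
                    data_requirements=["DR-002 operator event log"]),
        ],
    )

    # DOE-style factor/level enumeration: a full-factorial matrix of candidate conditions.
    factors = {
        "sea_state": ["low", "high"],
        "target_type": ["small", "large"],
        "emcon": ["unrestricted", "restricted"],
    }
    test_matrix = [dict(zip(factors, levels)) for levels in product(*factors.values())]

    for subtask in thread.subtasks:
        print(subtask.name, "->", subtask.measures, subtask.data_requirements)
    print(len(test_matrix), "candidate test conditions; first:", test_matrix[0])

In practice, this structure is captured in the IEF and its supporting databases rather than in code; the sketch is offered only as a way to visualize the traceability that the MBTD process establishes.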
1.2.2 Test and Evaluation Master Plan (TEMP)
The TEMP is the controlling directive for managing the test and evaluation of an acquisition
program. It is directive in nature, and defines and integrates test objectives, Critical Operational
Issues (COI), test responsibilities, resource requirements, and test schedules. While the Program
Manager (PM) is responsible for the development and submission of the TEMP,
COMOPTEVFOR is responsible for the development of those portions dealing with OT.
COMOPTEVFOR is a signatory on all TEMPs developed in the DoN, as well as those for
joint/multiservice programs that have Navy equities.
OPTEVFOR’s input to the TEMP process is based on the IEF. In short, the TEMP is a formal
commitment between stakeholders on the IT strategy for a program to include resources, planning,
and methodology.
The OT process should be seen as a continuum that supports all phases of program development.
Using the IT construct, operational testers may participate in CT and government DT, in addition
to stand-alone OT. The intent is to use every opportunity to gather relevant data in the most
efficient and economical manner. All test communities (CT, DT, and OT) have unique roles and
responsibilities; however, there is generally a significant intersection of the data sets necessary to
inform their respective evaluations. OPTEVFOR’s commitment is to use all qualified data,
regardless of source, to make the best, informed evaluation.
1.2.2.1 Master Test Strategy (MTS)
Programs that employ adaptive acquisition authorities in lieu of, or prior to, traditional acquisition
program development may be required to develop an MTS as a streamlined document to capture the test approach, resources, and schedule when a TEMP is not used by that program. Department of Defense (DoD) guidance is found in DoDI 5000.80, Operation of the Middle Tier of
Acquisition. DoN guidance is in draft, and may be signed in FY21. Refer further questions via
the division chain of command to the COMOPTEVFOR Technical Director (TD).
1.2.3 OT PLANS
Formal, stand-alone OT phases are generally called out in support of a program’s acquisition
milestones. These test periods are conducted per an approved OT plan. For programs that fall
under the oversight of the Director, Operational Test and Evaluation (DOT&E), the law (10 USC
2399) requires that the adequacy of the test plan (including the projected level of funding) be
approved in writing by the Director prior to commencing OT. For all other programs, the
Commander is the approval authority.
The OT plan is built from the IEF. Depending on the stage of program development, the test plan
may only involve a subset of the capability described in the IEF. The OT plan expands upon the
IEF with an additional level of detail on the execution of the specific events and the details
associated with specific test configurations, range instrumentation, and Fleet participants.
1.2.4 OT PERIODS
There are five general types of dedicated OT periods, which may be executed as required within a
typical major acquisition program. Each test period that is outlined within the program test strategy
as documented by the TEMP shall result in a test report to officially document the operational
evaluations from the test period. The nature of those reports, their content, and the decisions they
inform are synopsized below.
1.2.4.1 Early Operational Assessment (EOA) and Operational Assessment (OA)
The first formal assessment is usually an EOA. This assessment occurs before the start of the
Engineering and Manufacturing Development phase of the acquisition program. Most programs
will have only a single EOA. Generally, this is limited to a review of the design documentation,
preliminary manning and training plans, and, potentially, a demonstration of technology. The goal
of the EOA is to identify system enhancements, as well as risks towards the successful completion
of IOT&E. Each risk identified is categorized and documented with a “Blue” or “Gold” sheet.
Blue sheets refer to the SUT risks, while Gold sheets address risks outside the SUT that impact
mission accomplishment. These risk sheets are tracked through the life of the system until they
are verified as corrected.
The second formal assessment period is generally an OA. This assessment occurs post-milestone
B, during the Engineering and Manufacturing Development phase. The scope of the OA is most
often determined by the maturity of the development program. As with EOAs, OAs identify
system enhancements, as well as risks towards the successful completion of the IOT&E. Each
identified risk is categorized and documented with a Blue or Gold sheet. Large complex programs
will often have multiple OAs during the Engineering and Manufacturing Development phase.
Major Defense Acquisition Programs typically require the results of an OA to support milestone
decisions and other program reviews.
1.2.4.2 Initial Operational Test and Evaluation (IOT&E)
The third type of OT period is the IOT&E. This is the statutorily required, independent evaluation
of the operational effectiveness and operational suitability of the SUT. This test is conducted on
production-representative test articles during the Production and Deployment phase of an
acquisition program. Specific deficiencies identified during test are documented as individual
Blue or Gold sheets. Based on the results of IOT&E, COMOPTEVFOR makes a determination
of the operational effectiveness, operational suitability, and cyber survivability of the SUT, as well
as the operational effectiveness, operational suitability, and cyber survivability of the SUT within
the overall context of the SoS in which it functions. The Commander makes a recommendation to
the CNO on the Fleet introduction (or full introduction in the case of joint/multiservice programs).
The results of IOT&E are a prerequisite for the Full-Rate Production (FRP) Decision Review
(FRPDR).
1.2.4.3 Verification of Correction of Deficiencies (VCD)
The fourth type of OT period is the VCD. Typically, this is not a preplanned phase of testing, but
is inserted into the test program after a formal phase of OT to verify that certain deficiencies have
been corrected. This provides the Milestone Decision Authority (MDA) with the independent
assurance that the deficiencies cited as corrected by the PM from a previous phase of OT have actually
been corrected. When deficiencies are verified as corrected, the corresponding Blue or Gold sheet
is closed. If the deficiency is not fully corrected, the results are reviewed to determine if the
correction or mitigation to date has changed the risk to successful IOT&E, which may warrant a
change in the deficiency categorization.
1.2.4.4 Follow-On Operational Test and Evaluation (FOT&E)
The final category of OT period is FOT&E. Because it nominally encompasses all OT conducted
after IOT&E, it can take many different forms. In its original construct, FOT&E included
completion of deferred or incomplete testing from IOT&E, as well as validation of the operational
effectiveness and suitability of the actual production systems. In practice, FOT&E is often used
to support the development of incremental improvements to systems that are in production. These
improvements can range from minor hardware changes to periodic software system updates to
major engineering changes that require extensive development. Given the variations in scope,
FOT&E may be structured to resemble a subset of IOT&E, confirming production performance,
or it may take the form of an OA, identifying risks to successful implementation of a major
engineering change. Based on the focus of the test, Blue and Gold sheets may be closed as fixes
are incorporated into the production articles or new Blue and Gold sheets may be created to
document risks associated with the new development.
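As a purely notional illustration of the Blue/Gold sheet lifecycle described in paragraphs 1.2.4.1 through 1.2.4.4 (and not the command's actual tracking system, data format, or field names, all of which are assumed here for illustration), a risk or deficiency record can be thought of as follows:

    # Notional sketch only; field names, statuses, and the example record are invented.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class SheetType(Enum):
        BLUE = "SUT risk or deficiency"            # within the System Under Test
        GOLD = "Risk or deficiency outside SUT"    # impacts mission accomplishment

    @dataclass
    class RiskSheet:
        identifier: str
        sheet_type: SheetType
        description: str
        phase_opened: str                  # e.g., "EOA", "OA", "IOT&E", "FOT&E"
        phase_closed: Optional[str] = None

        def close(self, phase: str) -> None:
            # A sheet is closed only when the deficiency is verified as corrected,
            # e.g., during a VCD or FOT&E period.
            self.phase_closed = phase

        @property
        def is_open(self) -> bool:
            return self.phase_closed is None

    # Hypothetical example: a Blue sheet opened at OA and verified corrected at VCD.
    sheet = RiskSheet("B-001", SheetType.BLUE, "Track latency exceeds operator tolerance", "OA")
    sheet.close("VCD")
    print(sheet.identifier, sheet.sheet_type.name, "open" if sheet.is_open else "closed")

The point of the sketch is simply that each sheet is opened in a specific test period, categorized as Blue (SUT) or Gold (outside the SUT), and remains open until a later test period verifies the correction.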
1.2.5 OPTEVFOR Tactics Guide (OTG)
There are four Navy and Marine Corps Squadrons that conduct OT&E under the direction of the
Commander. OTGs are created, primarily by test squadrons, to communicate tactical guidance to
the Fleet in conjunction with a given test period. They are developed on an as-needed basis, and
they will utilize a format locally established within the test squadron. OTGs are generally not
produced by the OPTEVFOR warfare divisions. Instead, tactical lessons learned are provided to
the respective Warfighting Development Centers (WDC) for inclusion in their tactical guidance.
1.2.6 Quick Reaction Assessment (QRA)
QRAs are abbreviated OT&E events that provide assessments for specific warfighting solutions
that address an urgent operational need or an accelerated acquisition program. A QRA provides
an objective characterization of system operational capabilities, limitations, and considerations for
deploying the system, using the criteria supplied by the end user in the rapid acquisition
documentation. There is no assessment of operational effectiveness, operational suitability, or
cyber survivability. Sections 4 through 8 below delineate further details of the functional aspects of QRAs
and other forms of tailored testing.
1.2.7 Middle Tier Acquisition (MTA)
DoDI 5000.80, Operation of the Middle Tier of Acquisition, provides policy guidance for programs
using MTA authorities. Specific SECNAV guidance is still in draft, and may be signed in FY21.
SECNAVINST 5000.2F conveys applicability of Quick Reaction Assessment (QRA) within the
test approaches for these programs. For oversight programs, DOT&E Memo dated 24 October
2019, subject “Operational and Live-Fire Test and Evaluation Planning Guidelines Middle Tier of
Acquisition Programs," provides relevant amplifying guidance. This memo is available at Y:\OT&E Reference Library\DOT&E Guidance. The OPTEVFOR Tailored IEF (TIEF) and
Level of Test Determination (LTD) processes have been adapted to account for the current
guidance regarding MTA programs.
1.3 KEY DIRECTIVES
The Department of Defense (DoD) and Navy acquisition and test and evaluation processes are
governed by statutes and directives as follows:
• 10 U.S. Code 139 - Director of Operational Test and Evaluation (DOT&E)
• 10 U.S. Code 2366 - Major systems and munitions programs: survivability testing and lethality testing required before full-scale production
• 10 U.S. Code 2399 - Operational test and evaluation of defense acquisition programs
• DoD Instruction 5000.02, Operation of the Adaptive Acquisition Framework - Establishes policy and prescribes procedures for managing acquisition programs, pursuant to the relevant sections of Title 10, United States Code
• Defense Acquisition Guidebook (DAG) - Provides guidance on the process and procedures for managing risks through planning and executing an effective and affordable test and evaluation (T&E) program
• DoD Cybersecurity T&E Guidebook v2.0 - Provides data-driven, mission-impact-based analysis and assessment methods for cybersecurity T&E and supports assessment of cybersecurity, survivability, and resilience within a mission context
• SECNAV Instruction 5000.2F - Prescribes DoN-specific acquisition policies and procedures that supplement DoDI 5000.02
• SECNAV Instruction 3960.1 (Draft) - Will establish DoN-specific Test & Evaluation (T&E) policy and execution guidance
SECTION 2 - ORGANIZATIONAL RELATIONSHIPS
2.1 INTRODUCTION
COMOPTEVFOR is an Echelon 2 Commander under the CNO reporting directly to the Vice Chief
of Naval Operations. The missions, functions, and tasks of OPTEVFOR are delineated in
OPNAVINST 5450.332A. OPTEVFOR serves as the service OTA for the Navy, as well as Marine
Corps Aviation. In addition to the headquarters element, OPTEVFOR includes a Fleet-scheduling
detachment in San Diego, and a detachment supporting the Joint Strike Fighter (JSF) Joint Operational Test Team (JOTT) at Edwards Air Force Base (AFB), CA. There are four Navy and
Marine Corps Squadrons that conduct OT&E under the direction of the Commander. Air Test and
Evaluation Squadron ONE (VX-1), located at Patuxent River, MD, is under the administrative
control of Commander, Naval Air Forces, Atlantic. Air Test and Evaluation Squadron NINE (VX-9), located at China Lake, CA, is under the administrative control of Commander, Naval Air
Forces, Pacific. Marine Operational Test and Evaluation Squadron ONE (VMX-1), located at
Yuma, AZ, is administratively aligned under the Deputy Commandant for Aviation. Marine
Helicopter Squadron ONE (HMX-1), located at Quantico, VA, was historically assigned
responsibility for United States Marine Corps (USMC) rotary wing OT. Due to the growth of its
principal responsibilities for Presidential transport, most OT&E responsibilities have been
realigned to other organizations; however, HMX-1 retains responsibility for OT of aircraft
assigned for Presidential transport.
2.2 EXTERNAL ALIGNMENT
Figure 2-1. COMOPTEVFOR External Relationships
It is important to note that while OPTEVFOR provides reports to the Navy's Acquisition Executive,
the Assistant Secretary of the Navy (Research, Development, and Acquisition) (ASN(RDA)), the
Commander is aligned under the CNO. The dotted line from the Office of the Chief of Naval
Operations (OPNAV) N94 reflects that OPTEVFOR’s mission funding is provided through the
Director of Innovation, Technology Requirements, and Test and Evaluation. The Test and
Evaluation (T&E) Executive also provides policy guidance on T&E within the DoN.
The DOT&E has statutory responsibility for the oversight of all OT&E carried out in the
Department of Defense (DoD). The DOT&E statutory responsibilities include the approval of the
adequacy of all OT plans that support programs designated for DOT&E oversight. By regulation,
the DOT&E is the approval authority for TEMPs for programs designated for DOT&E oversight.
While the DOT&E has no responsibility for the execution of T&E, the Director is required to
provide a variety of reports on the results of testing to the Congress. Based upon this, he or she
may designate observers for Service testing and has access to all data collected during OT.
There are three basic reports produced by the DOT&E. For Major Defense Acquisition Programs,
the Director must submit a report to the Congress on the results of OT prior to the approval to
proceed beyond Low-Rate Initial Production (LRIP). These are typically referred to as “BLRIP”
reports. In cases where the Secretary of Defense determines that it is necessary to field a system
before the completion of an IOT&E, the Director is required to submit a report to the Congress
based on the available test results with an assessment of the risk being incurred by the early
fielding. These are often referred to as “Section 231” reports. Finally, the DOT&E produces an
annual report to the Congress with an overview of the testing accomplished on each of the
programs under DOT&E oversight (including live-fire testing activities). This report also includes
recommendations for the Services and Defense Agencies. While there are other reports called out
in various National Defense Authorization Acts, these three are the ones that impact most
OPTEVFOR personnel. See appendix D for additional information on the role and staffing of the
DOT&E.
2.3 INTERNAL ALIGNMENT
To promote command-wide teamwork that produces consistent, repeatable, credible results, OPTEVFOR operates under a Competency- and Warfare-Aligned organizational structure. The
subsequent paragraphs describe the key roles within the organization and discuss the
interrelationships among them.
2.3.1 TOP LEADERSHIP
Top leadership below the Commander includes the Deputy (00D), the Chief of Staff (CoS) (01),
the Commander's Action Group (CAG), and the Technical Director (00TD). Their broad areas of
responsibility are as follows:
2.3.2 Deputy (00D)
The Deputy reports directly to the Commander. He or she, with the CoS, ensures the mission of
the command is carried out in conformance with the policies, plans, and intentions of the
Commander. The Deputy acts for and in the name of the Commander when the Commander is
temporarily absent. He or she actively participates in final reviews and presentations of test results
documents arriving for the Commander’s approval, and represents the Commander in the
executive oversight and command approval of Navy OT&E policy. The Deputy recommends
potential improvements in test and evaluation methodology, endorses OT&E policy, and
represents COMOPTEVFOR at high-level meetings involving the DoD and the DoN. He or she
oversees command resource planning in conjunction with the resource sponsor, and develops and
revises the command’s business and strategic plans.
2.3.3 CoS (01)
The CoS is the executor for and principal assistant and advisor to the Commander and the Deputy.
He or she ensures the administration, training, and operations of the command are carried out per
the Commander’s intentions. The CoS is responsible for daily command operations and the use
of command resources. He or she coordinates with the Deputy Commander for the final approval
for government civilian hires, and serves as the command point of contact with the CNO and other
offices pertaining to the command missions, functions, and tasks. He or she provides routine supervision for the CAG to ensure its attentiveness to the strategic priorities of the Commander and Deputy Commander. The CoS exercises the full scope of leadership responsibilities in the development and maintenance of military and civilian manpower, while presiding over command internal integrated project teams to facilitate diversity of participation and results relevant to the Commander's intentions.
2.3.3.1 Commander's Action Group (CAG)
The CAG is a small staff element that is accountable to the Commander for strategy management.
The CAG's purpose is to facilitate the OPTEVFOR strategy development, deployment, communications, and monitoring based on the Commander's priorities. The CAG is supervised by the CoS and meets routinely with the Commander, Deputy Commander, and TD to ensure prioritization of effort and associated progress reporting. The CAG is led by a GS-15 or O-6 and supported by additional action officers as appropriate. The CAG plans and executes strategy-related off-site meetings and the OTA Roundtable when hosted by OPTEVFOR. The CAG is also responsible for tracking and assisting in the coordination of external stakeholder engagements by the Commander and Deputy Commander, while also coordinating in-house execution of guest lecture presentations of
strategic benefit to the command.
2.3.4 Technical Director (00TD)
The Technical Director (TD) serves as the principal advisor to the Commander and staff on
technical aspects of T&E products. He or she works closely with the Commander, the Deputy Commander, and the Chief of Staff for strategic engagement and leadership facilitation across the breadth of OPTEVFOR internal and external stakeholders. By design, the role of the TD is broad and flexible to ensure adaptability in meeting the Commander's needs. The Commander has assigned the TD a core responsibility to review all T&E products for technical acuity, credibility, and relevance. Product review responsibilities include, but are not limited to: T&E Strategy (TES), TEMP, Master Test Strategy (MTS), IEF, Test Plan, Test Report, DT Assist letters, Assessment of Operational Capability (AOC) letters, Concept of Test and Operational Test Readiness Review (OTRR) briefs, CBTE documents, and Modeling and Simulation (M&S) accreditation documents. The TD signs the Data Analysis Summary (DAS) that underpins test reports and provides technical support to all divisions in the development of test products. He or she also serves as the principal liaison with the DOT&E Science Advisor, Service OTA counterparts, and the DoN T&E governance structure. He or she represents the Commander in the coordination of DoD and DoN policy development, accelerated and adaptive testing concepts, and Service- and DoD-level technical initiatives related to the OT&E mission. Supporting all warfare and competency divisions, the TD chairs the Technical Leadership Team (TLT) as described in paragraph 2.4.5 and serves as the Commander's champion for continuous process improvement.
2.3.5 COMPETENCY- AND WARFARE-ALIGNED ORGANIZATION
OPTEVFOR is a competency- and warfare-aligned organization. This is significantly different
from the traditional Fleet organizations with which most military personnel are familiar.
OPTEVFOR has Warfare Division Directors who are responsible for executing operational testing
of all assigned systems, and for delivering test documents ready for the Commander’s signature.
OT&E competency division directors provide the disciplined policy and process support and technical expertise to complement warfare divisions, ensuring products meet the technical requirements and the Commander's standards. This collaborative structure aligns OT&E and warfighter expertise to achieve sufficient critical thinking and analytical rigor in all aspects of OT&E, leading to clear and accurate test reports that are relevant to fleet needs.
There are six warfare divisions and a JSF Detachment at Edwards AFB that are supported by
competency divisions. The warfare divisions include Undersea Warfare (40), Air Warfare (50),
Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance
(C4ISR) (60), Surface Warfare (70), Expeditionary Warfare and Littoral Combat Ship (LCS) (80),
and Advanced Programs (90). Each warfare division typically has a Navy Captain as the division
Director with a senior civil servant as the Deputy, or a senior civil servant as the division Director with a Navy Commander as the Deputy. The JSF Detachment manages Navy requirements in T&E
of the F-35 and is a member of the JOTT.
The warfare divisions represent the traditional core of the OPTEVFOR organization. This is where
the active duty fleet operational experts who are assigned to test programs reside. It is their
perspective that allows OPTEVFOR to bridge the technical and tactical views, commensurate
with the OT&E mission.
There are four competency divisions: Policy and Operations (01A), Test Design (01B), Test
Planning and Evaluation (01C), and Cybersecurity Testing (01D). Other support divisions include
the Staff Commanding Officer and Administration (10), Chief Information Officer (CIO) (20),
Contracts (01K), and the Comptroller (30). The matrix organizational concept that relates the
warfare, competency, and support divisions is depicted in figure 2-2.
The members of the competency divisions work within the test team to ensure that the
Commander’s policies are adhered to and that best practices are applied. Should there be a
disagreement that cannot be resolved between warfare and competency division directors, the issue
should be raised to the Deputy and, if necessary, the Commander, for resolution. For process and
technical matters, division directors shall obtain the TD's input and/or recommendations prior to requesting an audience with the Deputy Commander. The warfare division directors and the
competency division directors have the right and the duty to raise an issue for Flag-level
adjudication if they believe the proposed outcome is not in the best interests of the Service.
2.4 PROCESS OWNERS
The broad areas of responsibility for the process owners are as follows:
2.4.1 01A Policy and Operations
01A Policy and Operations is responsible for representing the Commander to external
organizations in the development of T&E policy. It is responsible for ensuring compliance with
governing directives, specifically Secretary of the Navy (SECNAV) Instructions and DoD
Directives. As the Operations Director, 01A tracks the status of ongoing testing and reporting and manages the response to external requests for document coordination and review. The
editorial staff and training staff fall under the Policy and Operations Director.
2.4.2 01B Test Design
01B Test Design is responsible for the implementation of MBTD across all products associated
with OT at OPTEVFOR. It oversees the development of all IEFs and subsequent revisions and
updates. It is responsible for managing the authoritative database of mission threads, subtasks,
conditions, measures, and DRs. The management of the Core Team Facilitators (CTF), who co-chair the test design teams, and the statistical staff falls under the Test Design Director. As the
senior expert in MBTD, the Test Design Director is responsible for the development of the
associated training curriculum. The Test Design Director is also responsible for the policies on
the use of M&S in OT, the Verification, Validation, and Accreditation (VV&A) process for OT,
response variable (RV) analysis, test targets, the Test Resources Requirements document, and
all related training curricula.
2.4.3 01C Test Planning And Evaluation
01C Test Planning and Evaluation is responsible for the analytical rigor applied to all test planning
documents and reports generated at COMOPTEVFOR. It oversees the development process for
all test plans and reports. 01C Division is composed of the Director, a Deputy Director, and
Assistant Directors assigned as process owners. The management and professional development
of all Lead Test Engineers (LTEs), Center for Naval Analysis (CNA) representatives, and division
analysts, whether assigned directly to 01C staff or the warfare divisions/squadrons (01C forward),
falls under the Test Planning and Evaluation Director. As the Subject Matter Expert (SME) in test
planning, execution, and report writing, the Test Planning and Evaluation Director is responsible
for the development of the related training curricula.
2.4.4 01D CYBERSECURITY TESTING
01D Cybersecurity OT&E Division supports all aspects of the cybersecurity operational test and
evaluation across all OPTEVFOR warfare divisions including VXs, VMX-1, and HMX-1. 01D
Division is composed of two mission areas: Cyber Survivability OT&E and Cybersecurity
Assessment Program (CAP). Cyber Survivability OT&E supports acquisition decisions and
informs the Fleet of the mission risks and associated impact. CAP integrates with fleet and
Combatant Command exercises to plan, conduct, evaluate, and report assigned elements of the DOT&E CAP. 01D leadership is composed of the Director, Deputy Director, Test Operations
Director, and a CAP Director.
2.4.4.1 Cyber Survivability OT&E
Cyber survivability testing determines a system’s capability to survive and operate after exposure
to cyber threats intent on disrupting a system’s operational mission. There are two 01D supporting
roles within the cyber survivability OT&E mission: Test Strategy and Policy (TSP) and
Operations. 01D TSP establishes OPTEVFOR cyber test planning and reporting processes and
templates. TSP also oversees cyber survivability test strategy development and planning through
the test plan checkpoint processes. TSP is the warfare divisions' liaison to DOT&E for vetting
overarching cyber test concepts and strategy as well as supporting adjudication of DOT&E
comments on cyber survivability test plans. Further, TSP provides training for Cyber Test
Engineers and Operational Test Directors on cyber survivability test planning. The Operations
team manages and executes the cyber survivability testing with the OPTEVFOR Red Team (CRT), which has authorization from the Navy Authorizing Official and Navy Intelligence Designated
Authorizing Official to conduct test operations on Navy operational environments up to TS/SCI.
01D will augment OPTEVFOR’s organic cyber capabilities with external resources tailored to the
system’s requirements as needed. The warfare division is responsible for providing a Cyber Test
Engineer (CTE) to work with the TSP representative and the CRT to develop a test plan using the
established OPTEVFOR cyber test planning process. Once the test plan is signed and approved
by the Commander and DOT&E (if a program is under DOT&E oversight), the CRT executes the test
and coordinates the post-test process with the warfare divisions to develop the final report
products.
2.4.4.2 Cybersecurity Assessment Program (CAP)
CAP is a DOT&E-managed, congressionally funded program mandated in the National Defense
Authorization Act of 2002. Each service OTA has a CAP team. CAP monitors and reports on
DoD efforts to improve cybersecurity, cyber functionality, and interoperability. While the CAP
does not conduct OT, it employs MBTD and Cyber Survivability principles to develop, design,
and execute assessments. The CAP mission has four primary objectives:
• Conduct operationally relevant cybersecurity assessments of fielded systems, networks, and processes during Combatant Command and Service Tier I exercises featuring representative cyber threats, to evaluate how realistic cyber conditions affect the subject commands' ability to execute their assigned missions.
• Provide timely feedback to Combatant Command, Service, and Department of Defense leadership on identified problems, associated mission effects, and successful defensive strategies.
• Share relevant information with, and support, those organizations authorized and able to provide remediation and mitigation assistance, and verify that remediation and mitigation activities are effective.
• Report overarching cybersecurity observations and trends for inclusion in the DOT&E Annual Report to Congress.
The CAP Director holds ACOS administrative authorities.
2.4.5 TECHNICAL LEADERSHIP TEAM (TLT)/ CHANGE CONTROL BOARD (CCB)
The TLT is chaired by the TD and composed of the directors of 01A, 01B, 01C, and 01D, with
other command leadership participation included as necessary. Its purpose is to review the
technical implications of OT&E processes, lessons learned, best practices, and internal and
external trends in order to maintain alignment of command policies and processes to the needs of
the Navy and the OPTEVFOR mission. This continuous review and coordination forum helps to
ensure that command guidance and policies are well considered and do not conflict. Serving as
the OT&E guidance and policy Change Control Board (CCB), the TLT will periodically meet and
approve all proposed changes to OPTEVFOR OT&E-related policy and subordinate guidance
handbooks. Handbooks approved by the CCB chair will be signed by the cognizant competency
or support division director. Changes that require an update to the OT&E Manual will be endorsed
by the CCB chair before seeking the Commander’s final approval. The 01A competency division
will promulgate and maintain the manual and handbooks upon signature, and work with the
corresponding content owner to incorporate the associated changes into command training
programs.
2.5 WARFARE DIVISION ROLES AND RESPONSIBILITIES PRIMARY
DUTIES
The warfare divisions are composed of predominantly active-duty military personnel (officer and
enlisted), government civilian, and contract support personnel working together with a product
focus to execute the OT&E mission for assigned programs. The competency divisions, by contrast, are generally composed of civilian and contractor support personnel who provide process and technical expertise in support of the warfare division mandate. The aggregate of personnel across warfare and competency divisions works collaboratively to create a team-centric organizational construct to ensure each assigned System Under Test has the collective expertise to support OT&E
mission accomplishment. The various positions and associated responsibilities of the warfare
divisions are defined with the intent to accomplish this team-based execution of the OT&E
mission.
2.5.1 Division Director or Assistant Chief of Staff (ACOS)
The Division ACOS is responsible for all operational testing performed within the division and is
the primary interface with O-6 PMs and DOT&E Deputy Directors and Action Officers (AO). The
ACOS is to ensure that all Division products are ready for Flag-level review. The ACOS
represents COMOPTEVFOR at high-visibility test events and at Operational Test Readiness
Reviews (OTRR)/mission control panels, Working Integrated Product Team (WIPT) executive
level meetings, and DOT&E Concept of Test (COT) briefs. He or she provides leadership for the
engagement with fleet commands to ensure relevancy of OPTEVFOR testing and results to fleet
interests.
2.5.2 Division Deputy Director or Deputy Assistant Chief of Staff (DACOS)
The Division DACOS is responsible to the ACOS to ensure that all products are ready for Flag-level review. The DACOS provides the long-term continuity for the Division and is the key
interface with 01A, 01B, 01C, and 01D competency division leadership. The DACOS is
responsible for the timely scheduling and execution of internal test product reviews, and for monitoring the timely scheduling and execution of external test functions, such as OTRR and briefs
to DOT&E. The DACOS also manages the division's assigned program portfolio, allocation of
personnel to each program, contract support vehicles, and financial resources. In cases when Navy
Working Capital Funded government agencies provide OT&E support, the DACOS is responsible
to staff and maintain documented agreements applicable to the nature of that support.
2.5.3 Section Head (SH)
The section head (SH), typically a mid-grade officer or civilian, is primarily responsible for
portfolio management of assigned programs of record. The SH provides leadership to the assigned
team to ensure external engagement and communications are aligned with internal policy and
processes. The SH also manages assigned manpower to efficiently provide program-focused
OT&E support to associated Program Managers, while guiding the team through contracting,
financial, and fleet resource allocation requirements to ensure mission success within the section.
The SH is responsible to the ACOS for ensuring all assigned military personnel meet military requirements, to include administrative support and identification of required OT&E training. The SH
is a facilitator bridging tactically realistic OT&E with OPTEVFOR processes and, therefore, acts
as the liaison with 01A, 01B, 01C, and 01D action officers. The SH is also accountable to the
DACOS for the timeliness, accuracy, and format of all test products assigned to them. The SH
ensures the timely scheduling and execution of internal test product reviews and the timely
scheduling and execution of external test functions, such as OTRR and briefs to DOT&E.
2.5.4 Operational Test Director (OTD)
Qualification as an OTD ensures the individual is capable of providing military leadership, fleet
experience, and tactical acumen to OT&E, specifically regarding the direction of operational test
execution. The OTD is assigned to one or more programs. The OTD is responsible for overseeing tactically realistic, detailed test planning and thorough test execution, to include detailed data collection, and for ensuring that the observed results are accurately documented in the Test Report. In addition
to ensuring that the requisite phase of test execution is conducted properly, the OTD leads the test
team in ensuring associated documentation is “Flag-signature ready” and in compliance with
current policies and procedures. The OTD is accountable for following Section Head guidance to
provide clear and consistent communications with internal action officers (01A, 01B, 01C, 01D),
program office(s), and other external organizations (DOT&E, OPNAV, etc.), and attending OT&E
meetings as required. The OTD will also act as a mentor to Test Program Managers (TPM). To
document decisions made and issues identified, and to serve as a running record of project history, OTDs shall maintain a separate OTD Journal for each assigned project as a written, chronological record of the project. Each OTD Journal will be a pass-down item for subsequent OTDs assigned to each
project.
2.5.5 Test Program Manager (TPM)
The TPM is assigned to one or more programs as a tactical expert and manager of the OT&E
program administration. The TPM is a critical member of each warfare division test section who
performs a role akin to an OTD while working to meet the training and certification requirements
associated with the OTD position. The TPM is responsible for ensuring that the requisite phase of
test is planned properly and that associated documentation is “Flag-signature ready” and in
compliance with current policies and procedures. The TPM is responsible to follow guidance from
the Section Head regarding the proper management of all program funds in support of the assigned
programs. He or she is also accountable for following Section Head guidance regarding clear and
consistent communications with internal action officers (01A, 01B, 01C, 01D), program office(s),
and other external organizations (DOT&E, OPNAV, etc.), and attending OT&E meetings as
required. TPMs may be assigned a variety of support staff, including military or government
civilian OTD, contracted support, or an additional assistant TPM, as needed. To facilitate and
support the wide variety of TPM responsibilities, TPMs shall maintain a separate TPM Journal for
each assigned project as a written, chronological record of the project. Each TPM Journal will be
a pass-down item for subsequent TPMs or OTDs assigned to each project.
2.5.6 Lead Test Engineer (LTE)
LTEs are generally assigned to sections within the warfare divisions as an extension of the
competency divisions. LTEs may also be assigned duties as Deputy Section Heads to provide
long-term continuity within the warfare division. LTEs are administratively supervised, mentored,
and trained by the Test Planning and Evaluation Division (01C). Once assigned to a warfare
division, LTEs are operationally managed by and responsible to the warfare division Deputy
Director for the execution of their responsibilities. LTEs support test teams throughout OT,
providing process expertise and technical writing acumen, and ensuring development of quality test
products, including MBTD and the preparation and development of TEMPs, M&S products, test
plans, COT briefs, pre-test briefs, post-test iterative process Plan of Action and Milestones
(POA&M), data analysis summaries, Blue/Gold sheets, and test reports. LTEs also assist the SH
to maintain oversight of all testing to ensure the test is executed and data are collected per the test
plan. Additionally, LTEs may be assigned other administrative or collateral responsibilities to
support execution of the COMOPTEVFOR mission, as required.
2.5.7 STAT Analyst
The Scientific Test and Analysis Techniques (STAT) Analyst in Code 01B provides detailed analytical
support to the TPMs/OTDs/Operational Test Coordinators (OTC) in their preparation of TEMPs,
test plans, and final reports. The Analyst provides detailed analytical support to the
TPMs/OTDs/OTCs in review of management-level program documentation, especially Initial
Capabilities Documents (ICD), Capability Development Documents (CDD), and Capabilities Production Documents (CPD). The
Analyst generally applies statistical analysis techniques in support of OT&E. The Analyst assists
TPMs/OTDs/OTCs in establishing COIs and measures of effectiveness/performance. The Analyst
ensures the appropriateness of test scenarios and adequacy of requested resources to resolve COIs.
2.5.8 Cyber Test Engineer (CTE)
Cyber Test Engineers (CTE) are cyber subject matter experts within the warfare divisions
primarily assigned to lead the production of a Cyber Survivability test plan in accordance with
COMOPTEVFOR policies and guidance. CTEs are responsible for coordinating and leading all
aspects of the test plan Checkpoints to ensure all stakeholders are informed and the test plan is
generated in a timely manner. CTEs are responsible for conducting test plan site visits and working
with the OPTEVFOR Red Team to establish test objectives and data requirements. CTEs are also
expected to be part of the OPTEVFOR post-test process to provide continuity of the program’s
cyber OT effort within the warfare division. CTEs may also be assigned to support IEF and TEMP
inputs for cyber OT as well as other supporting documents and briefs. CTEs are responsible for
early engagement with 01D and for ensuring 01D is part of the review process for all cyber deliverables, including IEF and TEMP inputs. Generally, CTEs are the central hub for cyber OT within the
warfare division. However, each warfare division's management of the CTEs may vary depending on
factors such as funding and contract limitations.
2.5.9 Operational Test Coordinator (OTC)
OTC positions are used in the Air Warfare Division and, to a lesser extent, in other warfare divisions.
The OTC coordinates the efforts between the OTD, who often is located in a VX/VMX/HMX
squadron, and the division Section Head, DACOS, and ACOS. The OTC is responsible for all
communications with DOT&E, coordination of Fleet Resources, arranging funding from the
program office, staffing squadron documents for headquarters review, and scheduling briefings by
squadron OTDs to OPTEVFOR leadership.
2.6 THE TEAM CENTRIC CONSTRUCT
Figure 2-3. Team-Centric Construct
UNCLASSIFIED
Collaboration and teamwork are the key to adaptive relevant testing (ART). The various roles
described above are synergistically brought together in the team-centric construct. Figure 2-3
depicts a notional concept used within the warfare divisions to optimize effective use of workforce
capacity, talent, and expertise. Because each warfare division is structured to accommodate its
specific acquisition, fleet, and test resource environment, the specific assignments within the team
framework may be tailored to the needs of the division.
Each portfolio team is assigned multiple test programs under the cognizance of a SH. The goal is
to ensure OT&E for all assigned test programs is successfully planned, executed, analyzed, and
reported to inform decision-makers and stakeholders. Individual program activity levels tend to
ebb and flow over time within the acquisition cycle (annotated using green for low activity level,
yellow for medium activity level, and red for high activity level). Managing the collection of test
programs as a portfolio provides each SH with greater flexibility to prioritize and manage
workload to ensure that programs’ OT&E needs are met at the pace required to keep all programs
“on-track” or ahead of schedule. In some cases, the SH may personally manage inactive or low
activity level programs to promote concentration of manpower on higher demand or higher priority
programs. Since the SH provides the financial and contracting management and oversight for each
program assigned within the portfolio, the members of the team maintain a focus on production.
This further enables the SH to coordinate with subject matter experts within the operations support
divisions such as finance and contracts to facilitate continuity. Each SH is allotted a cadre of Test
Program Managers (TPM) for program assignment and tasking. When necessary, and with
program resource availability, the TPM may work with the SH to acquire contract support. Not
all programs will require contract support, as illustrated by the low activity level program with no
contractor assigned.
When the program approaches test execution, an OTD will be assigned to oversee the detailed test
planning, test execution, analysis and reporting. The SH, TPM, assigned OTD, and contract
support are all directly supported by the LTE. The LTE has significant T&E expertise and in-
depth knowledge of OPTEVFOR test design, test planning, and post-test iterative processes. The
LTE supports the SH in establishing and maintaining a planned schedule for development of all
required OT&E products. Additionally, the LTE supports the TPM in ensuring all Integrated
Evaluation Frameworks, Test Plans, and Test Reports meet command accuracy and quality
standards while ensuring the applicable process steps are satisfied. In some cases, a Cyber Test
Engineer (CTE) will be assigned by the warfare division to assist with test planning for cyber
survivability. The SH is encouraged to consult with 01B, 01C, and 01D competency experts when
encountering specific production or process challenges that are beyond the capabilities of the core
team to address. Additionally, if further support is required to accelerate production, support
execution of test, or respond expeditiously to emergent requests from the Program Manager, the
TPM should notify the LTE and SH and work with the division leadership to ensure resources are
adjusted as necessary within the warfare and competency divisions.
SECTION 3 - GENERAL ADMINISTRATIVE PROCESSES
3.1 INTRODUCTION
This chapter provides general guidance that pertains to the development of all briefings and
correspondence associated with OT&E. The principal output of OPTEVFOR is information for
decision makers within the Navy, the Marine Corps, and the DoD, as well as, ultimately,
Congressional decision makers. Given this audience, it is essential that all communications on
behalf of the command reflect the highest standards of professionalism. The impact of the
command’s work is directly tied to the credibility of its products.
3.2 GENERAL
As members of the headquarters staff and supporting squadrons, individuals must understand that
their actions and demeanor will reflect directly on the entire Force. All communications, whether
formal or informal, should be conducted in a professional manner. No conversation or e-mail can
be assumed private or “off-the-record.” OPTEVFOR personnel will deal with a broad variety of
stakeholders with differing views on many issues. Whether or not there is agreement, individuals
should be treated with appropriate respect. Each stakeholder is trying to do what is perceived as
best from their respective vantage point. There is no room for denigrating remarks or personal attacks on
the character or intelligence of any stakeholder, regardless of the circumstances. TPMs, OTDs, and
OTCs are likely to find themselves briefing Flag and General Officers and members of the Senior
Executive Service (SES), as well as, from time to time, Presidential Appointees. These briefings
should be conducted with decorum and respect. Briefers must avoid hyperbole, sarcasm, and
flippant remarks. By the same token, the briefer must ensure that the salient points of the brief are
clearly presented on behalf of the Commander. The briefer should not try to “game” an audience
by over- or understating an issue. The briefer should clearly state the facts, present a well-reasoned
analysis that ties the results clearly to the mission, and draw conclusions.
3.3 COLLABORATION
OPTEVFOR personnel must collaborate early and often with internal and external stakeholders.
The best results are generally attained when all perspectives are considered. If a TPM or OTD is
having difficulty bringing key stakeholders together, it is essential that the matter be brought to
the attention of the warfare division leadership. Failure to engage early often leads to unnecessary
rework or a less-than-optimal product. Key stakeholders in the test design phase include the
program manager’s staff, the resource sponsor’s representative, the developmental test
community, and WDC. For programs under Office of the Secretary of Defense (OSD)
oversight, representatives from DOT&E and the Deputy Assistant Secretary of Defense for
Developmental Test and Evaluation, as well as the supporting analysts from various Federally
Funded Research and Development Centers (such as the Institute for Defense Analyses) should be
included as well. Internal collaboration will involve the various process owners and support
divisions as outlined later in this manual. Regardless of whether the collaboration is internal or
external, healthy collaboration involves individuals as critical-thinkers participating in
professional constructive conflict, not “groupthink”. Using a team-focused approach, we
challenge our assumptions and thoroughly consider the stakeholder implications of the actions
being taken. This promotes a means to resolve issues at the lowest level empowered to do so
while remaining attentive to the facts in an objective manner.
3.4 CROSS-DOMAIN POST-TEST BRIEFS
It is critical for warfare divisions to glean lessons obtained through all phases of the test design,
planning, execution, and reporting cycles and apply them to improve the efficiency and
effectiveness of operational testing. There is added benefit when this is done at the cross-divisional
level, especially when opportunities for collaboration and synergy are identified and implemented.
One method to enable this is for warfare divisions to identify suitable candidates for briefing and
provide these lessons learned in a cross-divisional setting.
3.4.1 Planning and Coordination
In order to enable consistent and predictable planning, table 3-1 provides an enduring schedule,
identifying months when various warfare divisions are responsible for providing briefings. The
Command Action Group (CAG) is responsible for scheduling and managing the execution of this
briefing effort.
Table 3-1. Warfare Division Brief Schedule
UNCLASSIFIED
Warfare Division | Briefing Month
40 | July / January
50 | August / February
60 | September / March
70 | October / April
80 | November / May
90 | December / June
3.4.2 Identifying Suitable Candidates
Not all test programs are good candidates for briefing. If a test executes uneventfully (i.e., test
design/planning/execution/reporting processes were nominal, no resource issues were
experienced, there were no opportunities to collaborate between warfare divisions, and post-test
reporting required little to no additional stakeholder engagement), there might be little value in
providing a briefing to the entire command. If, however, a test team dealt with tough problems,
experienced something new/unique, or learned things of great potential benefit to other warfare
divisions, it might be a good candidate. In general, good brief candidate programs possess one or
more of the following characteristics:
Cross-divisional tests (i.e., ones in which multiple SUTs were tested together)
Tests in which Fleet resources were shared between warfare divisions (i.e., collaborations)
Uncommon types of tests (e.g., EOAs, Middle Tier Acquisition, Agile/SWAP, etc.)
Unique test concepts
Highly tailored test designs or plans
Post-test briefings were required
Critical thinking was required to solve difficult problems in test design, planning, execution, or
post-test reporting
Challenges during test required adaptability of test design, resources, etc.
Unique stakeholder engagement
These briefs should not be limited to tests that have completed the reporting process. It is more
important to share lessons in a timely manner. If critical lessons are gained during the strategy
development, test design, or planning process, consideration should be given to sharing them rather than
waiting until the post-test process has been completed, which can be more than six
months or a year later. These may include lessons learned during TEMP or Master Test Strategy
(MTS) development, CBTE coordination, test design, test planning, etc.
3.4.3 Monthly Battle Rhythm
A repeatable process must be used to ensure suitable candidate programs are identified and
nominated. The following timeline will be used:
Table 3-2. Nomination and Review Timeline
UNCLASSIFIED
Days Prior to Scheduled Brief | Required Action
30 | Warfare Division A-Code identify brief and inform senior staff at a Council of Captains Meeting
14 | OTD review scope of brief with Warfare Division A/B-Codes
7 | OTD submit draft slides to Warfare Division A/B-Codes and TD for final review
0 | Conduct briefing
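The timeline above lends itself to simple schedule math. The following notional sketch (Python, not an OPTEVFOR tool) computes the calendar date for each required action from a hypothetical scheduled brief date; the date and phrasing of the actions are illustrative only.

```python
# Illustrative only: compute required-action dates for the table 3-2 timeline
# from a hypothetical scheduled brief date.
from datetime import date, timedelta

brief_date = date(2022, 3, 15)  # hypothetical scheduled brief date

# Required actions keyed by days prior to the scheduled brief.
milestones = {
    30: "Warfare Division A-Code identifies brief at a Council of Captains Meeting",
    14: "OTD reviews scope of brief with Warfare Division A/B-Codes",
    7: "OTD submits draft slides to Warfare Division A/B-Codes and TD for final review",
    0: "Conduct briefing",
}

for days_prior in sorted(milestones, reverse=True):
    action_date = brief_date - timedelta(days=days_prior)
    print(f"{action_date} ({days_prior} days prior): {milestones[days_prior]}")
```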
OPTEVFOR must operate as a learning organization, actively sharing information across warfare
divisional boundaries and continuously improving our ability to conduct operational testing and
inform decision makers. This requires deliberate effort. Routine, recurring cross-domain post-test
briefs are a necessary part of this process.
3.5 TRAINING FOR NEW TESTERS
New TPMs typically arrive at OPTEVFOR with a wealth of Fleet and leadership experience, which
is crucial to successful performance. However, they rarely have a background in T&E. Therefore,
training is required. COMOPTEVFOR instruction 3500.1 specifies the training required for any
new tester at OPTEVFOR.
COMOPTEVFOR does not expect the TPM to know everything; that is not possible, especially
during a 2- to 3-year assignment. The Competency Divisions, 01A, 01B, 01C and 01D, were
established to provide technical support and assist the TPMs and OTDs in developing test products.
They are a key part of the “standing army” that assists the TPM/OTD in accomplishing the job.
OPTEVFOR recognizes the need for a comprehensive training period for new testers. There is
simply too much to learn. A 3-4 day course is not enough. A Training Integrated Product Team
is working on a new model for training of new OPTEVFOR testers. The solution will include
weeks, vice days, of dedicated OT training before the tester reports to a warfare division.
Implementation is expected in 2022. Meanwhile, training will continue to start with the 4-day
OTD Course. Other advanced courses are optional unless or until actually needed to perform the
associated tasks, at which time they become mandatory. These include the IEF Course, Test
Planning Course, Suitability Course, Post-Test Iterative Process (PTIP) Course, and the Blue and
Gold Sheet Writing Course. Do not attempt to create an IEF without having attended the IEF
Course. Do not attempt to write a Test Plan without having attended the Test Planning Course,
and do not attempt to prepare a test report without having attended the PTIP Course.
All testers are also required to take some number of Defense Acquisition University (DAU)
courses. At a minimum, all will take the online level-1 T&E courses. Officers and civilians in
certain billets must proceed to level-2 courses, and some billets require level-3. For those in
acquisition positions, DAU certification is more than an opportunity; it is a requirement and a
condition of employment.
Training dates for COMOPTEVFOR courses are found at the OPTEVFOR public web site.
Seats may be requested at the public site by clicking the link provided.
3.6 POLICY AND REFERENCES
Policy at OPTEVFOR is found in COMOPTEVFOR instructions, including this manual,
handbooks, checklists, and various document templates. Temporary or unproven guidance may
be captured in a best practice document by a competency division. Warfare divisions and
OPTEVFOR personnel are encouraged to capture lessons learned and other recommendations
relative to all policy and guidance in order to validate or refine those policies. Once the content
of a best practice has been validated for inclusion within a policy directive, the corresponding
handbook owner will coordinate policy updates via the CCB as described below.
Testers and others need access to a wide variety of references in the course of their work. Within
COMOPTEVFOR Headquarters, the information resources division (Code 20) maintains local
area networks (LAN) for both UNCLASS and SIPR domains. These shared drive resources
warehouse official records and working files for all OPTEVFOR divisions and employees. OT&E
guidance and policy references can be found on the Y-drive of the LAN within the OT&E
Reference Library. Templates and checklists are found in the OT&E Production Library within
the folder for the specific type of document or product involved. When a competency division
document (template or checklist) is updated, the respective competency division (01A/B/C/D) will
hold a TPM/OTD call to review the changes and provide training on new policy/procedures
included in those documents.
Policy or Process Changes. Policy and process changes will be allowed time for review and
comment before being approved and implemented. In general, a draft policy/process change
will be reviewed by Competency Divisions 01A, 01B, 01C, and 01D. Their comments will be
incorporated, then warfare divisions and VXs will be provided the draft change, and allowed
adequate time to review and comment. The final draft will be provided and/or briefed to the
CCB and then to the Deputy and COMOPTEVFOR if required. Once approved, the change
may be promulgated by e-mail, pending incorporation into this manual or a handbook.
Handbook Changes. Policy and processes are primarily documented in this instruction and in
handbooks. This instruction intentionally does not specify process and details; those are the
subject of the handbooks. Handbooks can be changed quickly, as the need is determined by
the respective process/handbook owners. Handbook owners will make necessary changes,
provide warfare divisions an opportunity to review and comment, then submit them to the
CCB for approval. Once approved, the revised handbook is signed by the owner and
promulgated and maintained by 01A. Revisions to this manual should become less frequent,
as process changes can be more rapidly implemented via the handbooks.
3.7 SUT/SOS REQUIREMENTS
The unique responsibilities and substantial influence of COMOPTEVFOR will sometimes lead
PMs, developers, and even contractors to solicit the opinions of individual TPMs/OTDs as to
system enhancements that are desired or required. Requirements may be found in formal
requirements documents, such as the CDD or the CPD, or they may be derived from DoD,
SECNAV, or OPNAV Instructions, or published Tactics, Techniques, and Procedures (TTP). The
subject of requirements is problematic. Everyone wants full capability in all areas. Unfortunately,
that is neither practical nor affordable. The CNO must make a difficult set of choices, reflected in
the approved requirements documents such as the CDD and CPD. These documents reflect the
CNO’s unique perspective across all programs and his/her statutory responsibility to provide the
best possible manned, trained, and equipped forces to the Combatant Commanders. It is not the
role of COMOPTEVFOR or any associated personnel to make recommendations as to how to
correct deficiencies or enhance system performance. The Commander may recommend a
timeframe for correction of deficiencies, or whether to continue program development, or to
introduce a system to the Fleet, but OPTEVFOR personnel will not make recommendations for
how to correct deficiencies. There are two major concerns with any requirements
recommendations: first, if given in front of a contractor, they could be misinterpreted as tacit
contractual direction; second, even if shared exclusively with the government program office, any
recommendation may be considered to taint the objectivity of future evaluations.
3.8 GENERAL WRITING STYLE
As noted above, the principal audience for OPTEVFOR is senior civilian and military leaders with
broad responsibilities. In addition to being factual and unemotional, ensure that the product is
readable; that is to say, grammatically correct and free of spelling errors. Some specifics:
In general, avoid the use of acronyms except where they are in broad general use (e.g.,
NATO for the North Atlantic Treaty Organization) or where they are commonly accepted on
a particular platform, such as AESA (Active Electronically Scanned Array) for the
AN/APG-79 series radars on the F/A-18 E/F and EA-18G. Surprisingly, many acronyms are
used for different terms at different classification levels across the Services and defense
agencies. For example, the acquisition community uses “DA” to refer to the Developing
Agency. Joint Publication 1-02 defines it as “data adapter aerospace drift; data administrator;
Department of the Army; Development Assistance; direct action; Directorate for Mission
Services (DIA); double agent.”
Use the active voice and simple declarative sentences where possible. Strive for brevity.
The goal is to maximize communication in the minimum amount of time. Use data tables
and figures to provide large volumes of data in a cogent manner.
Remember, words have specific meanings. Precise is not the same as accurate. As any
weapons tester will affirm, a weapon may be very precise but woefully inaccurate. Likewise
“electrical” is not a synonym for “electronic.” As a writer, one must choose one’s words
carefully.
All OPTEVFOR reports are built around the Blue and Gold sheet construct. As discussed in
the Test Reporting Handbook, the Blue and Gold sheets employ a formalized structure that
presents complex information in a logical, usable format. Blue sheets describe issues or
deficiencies with the SUT, while the Gold sheets describe issues or deficiencies that are
outside the purview of the program of record undergoing test (the SUT), but are nevertheless
essential to the accomplishment of the required warfighting effect.
3.9 EDITORS
OPTEVFOR employs an editing staff (01AE) to review and correct documents as they move
through the document router. They improve documents by checking for template compliance,
format, grammar, spelling, tables, acronym definition and other editorial issues. While the editing
staff is part of the default review chain, they are there to help testers at any stage of document
preparation. Editors will be included in all T&E document routes, regardless of who will sign.
3.10 BRIEFINGS TO THE COMMANDER OR DEPUTY
Read aheads and visitor biographies are due to the Front Office 48 hours prior to the meeting.
Attendees are invited by the Front Office; do not forward invitations to additional invitees; notify
the Front Office if additional attendees are necessary.
3.10.1 Preparing External Briefs (Navy Gate Reviews, OSD Overarching Integrated
Product Team (OIPT) Briefs, etc.)
The cognizant division director must provide the following information to the Commander upon
learning of a decision meeting involving a CNO project for which OPTEVFOR conducted OT&E.
NOTE
Specific guidance for COT briefings to the DOT&E is provided in the Test
Planning Handbook.
Type of decision forum
Date, time, and place
Purpose of the decision forum (Milestone (MS) and production level)
Schedule of preliminary briefs
Whether a formal presentation is required
Recommended OPTEVFOR briefer and other attendees
Whether attendance by the Commander or Deputy is recommended.
3.11 COMOPTEVFOR'S POSITION
The warfare division must ensure that the Commander's position is accurately conveyed at the
proper time (i.e., during the brief and any discussions that may follow). If the TPM/OTD is unsure
about the Commander’s position, raise the question through the chain-of-command. The division
is expected to propose an OPTEVFOR position with supporting rationale, provided it can be
supported.
3.11.1 Document Routing, Distribution, and Archiving
T&E documents are routed at COMOPTEVFOR Headquarters within the Electronic Document
Router in Knowledge Management System (KMS). Standard routes have been created for the
typical documents. All T&E documents are expected to be routed using the Electronic Document
Router, and not by other means. The signer of a document is generally determined by table 3-4
below, though there may be circumstances that require a different signatory. Regardless of
signatory, T&E documents shall be reviewed by the TD prior to signature unless otherwise
coordinated between the Division Director and the TD. Once the document has been signed, there
are still actions required to ensure proper distribution and archiving.
After signature, the signing office will ensure the document is serialized. Then, the signed
document (paper) and the latest version of the Microsoft Word document will be transformed into
the final Portable Document Format (PDF) document. This is done by Flag Admin for classified
documents and by the editors for all UNCLAS documents. The result is a searchable pdf document
with a table of contents, with all embedded files attached. Only Flag Admin and the editors have
the Adobe Pro software and the ability to embed files. Standard Adobe users can only produce a
“picture” of the document, which is not sufficient. Flag Admin or the editors will provide the
finished pdf to the appropriate division for distribution to the addressees, by e-mail. They will
also provide the pdf to Code 01A for archiving at Y:\00\Signed Test Documents. Responsibilities
for post-signature document handling are shown in table 3-3 below.
For test reports, the Commander sends an e-mail to Navy and acquisition leadership before the
report is distributed. The e-mail is drafted by the warfare division and coordinated by Flag Admin.
It is important for the warfare division to refrain from distributing the report until after the
Commander’s e-mail has been sent.
Table 3-3. Document Routing after Signature
UNCLASSIFIED
Warfare Division:
- Ensure the final pdf document is being created by Flag Admin or editors. For classified documents, Flag Admin will serialize. For unclassified documents, the division provides the serial number obtained from Admin.
- If necessary, provide the signed document and latest Word version to Flag Admin (SIPR) or editors (NIPR) for pdf creation.
- Receive the final pdf, verify correctness and searchability of the document, and distribute to recipients as an attachment to e-mail.
Flag Admin:
- Create the final pdf with embedded files for all SIPR documents, regardless of who signed. Serialize if signed in the Front Office.
- Send the final pdf to 01A for archiving on the y-drive, and to the division for distribution.
- For NIPR documents signed by 00, 00D, or TD, provide the signed document (doc) and related files to the editors for pdf creation.
Editors:
- Create the final pdf with embedded files for all NIPR documents, regardless of who signed.
- Send the final pdf to 01A for archiving on the y-drive, and to the division for distribution.
01A:
- Check the final pdf for correctness, rename per the naming convention, and post to Y:\00\Signed Test Documents on SIPR or NIPR.
3.11.2 T&E Document Signature Authority
Table 3-4 identifies OPTEVFOR signature authority for the various OT&E documents. The
smooth documents for the VXs, and rough and smooth for VMX-1, are to be provided to
OPTEVFOR Codes 50 or 01A, as appropriate, via e-mail.
Table 3-4. Signature Authority
UNCLASSIFIED
T&E Document | Response Time | Brief Required | Signature Authority (00 / DIV Director)
TEMP and T&E Strategy | 15 working days (Note 1) | No (Note 1) | X
Oversight test plans (Note 2) (Includes IOT&E, FOT&E, OA, EOA, and Multiservice Operational Test and Evaluation (MOT&E) oversight test plans) | 60 days prior to test | COT Brief only | X
All evaluation report letters (Includes MOT&E Test Reports). Note that TD signs Data Analysis Summary. Div. Director signs Deficiency cover letter. | 60-90 days after test (Note 3) | No. Covered by ESERB | X
Interim Reports | As required | Yes | X
VCD messages/reports in which COI resolutions are being changed from the previous phase of test | 35 days after test | No. Covered by ESERB | X
Quick Reaction Assessment (QRA) messages/reports | 60 days after test | No. Covered by ESERB | X
All OT&E support letters (Warfare Division responsible for drafting) | 30 days prior to test | No | X
Deficiency report messages | As directed | Yes | X
M&S Accreditation Plan | ASAP after need identified in E-IPR, NLT 1 year prior to test | Yes | X
All M&S Accreditation Letters | NLT 90 days prior to test | Yes (for programs on oversight list) | X
IEF/IEF Revision | (Note 4) | No. Covered by E-IPR | X
Tailored IEF where 1) Mid-Tier Rapid Fielding will be decided, or 2) RCRM requires flag-level resolution, or 3) Warfare Division Director believes flag signature is most appropriate | | No. Covered by E-IPR | X
Tailored IEF that does not meet any conditions requiring 00 signature (see above) | | No | X
IA/Interoperability Assessment Reports | NLT 90 days post-test | Yes | X
Integrated Assessment Plan (IAP) | 60 days after program initiation | Yes | X
Operational Utility Assessment (OUA), Military Utility Assessment (MUA), and Limited Military Utility Assessment (LMUA) reports | 60 days after demonstration unless specified otherwise | Yes | X
VCD messages/reports in which COI resolutions are NOT being changed from the previous phase of test | 35 days after test | No | X
Level of Test Determination (LTD) Report | | Yes (for programs on oversight list) | X (Note 13)
Administrative Updates to Previously Approved TEMPs | As required | No | X
Capabilities Documents, Initial Capabilities Document (ICD)/CDD/CPD Clarification Letter | As required | (Note 5) | X
TEMP comment letters (for O-6 level reviews) | 30 days from receipt | Yes (Note 6) | X
O-6 level reviews of MOT&E Test Plans and Test Reports | 14 days from receipt | Yes | X
Non-oversight test plans (Note 2) (includes IOT&E, FOT&E, OA, EOA, and MOT&E non-oversight test plans) | 30 days prior to test | (Note 7) | X
Oversight and non-oversight QRA and VCD test plans, and IT data collection plans (Note 8) | 30 days prior to test | Yes | X
Risk/Deficiency forwarding letter | Prior to the SERB | No | X
Joint Capabilities Technology Demonstration (JCTD) Demonstration Execution Document (DED) | 30 days prior to demonstration | (Note 7) | X
Anomaly report messages | | (Note 9) | X
TEMP input letters | 90 days after program initiation | No (for programs on oversight list) | X (Note 10)
Standard/Combined DT/OT Memorandums of Agreement (MOA) | 30 days prior to test (at test plan signing) | No | X
IEF Change Letter | (Note 4) | | X
Support documentation (Integrated Logistic Support Plan (ILSP), Navy Training Plan (NTP), etc.) | 15 days from receipt | No (Note 8) | X
M&S Operational Requirement Input Letter | During IEF development, as soon as need is identified | No | X
Letters of Instruction (LOI) | 30 days prior to test | No | X (Note 11)
Adjunct tester forms | 30 days prior to test | No | X
DT assist MOA (if used) | 30 days prior to test | No | Division Director/VX CO
IT MOAs and Charters | As required | No | Division Director/VX CO
AOC letters and DT Assist Letter of Observation (LOO) | 30 days after test/demonstration | As required | X
OT commencement messages or e-mails | | No | X
OT completion messages or e-mails | End of test as determined by division director | No | X
ACAT IVM & Abbreviated Acquisition Program (AAP) concurrence letters | | | X
OPTEVFOR Tactics Guides (OTG) | 120 days after evaluation report | As required | VX CO (Note 12)
Notes:
1. Assumes a formal O-6 TEMP review has been completed and that all critical OPTEVFOR comments were
satisfactorily resolved. If not, a brief to the Commander is required.
2. Commander signs all ACAT I, DOT&E oversight, and controversial test plans. Additionally, the Commander may
sign all standard test plans, when desired, 30 days prior to testing.
3. Ninety days for ACAT I/IA and MOT&E; 60 days for all others.
4. For new programs, coordinate IEF completion to support initial TEMP development (MS-B). For existing
programs, IEF must be approved in time to support next phase of test or MS. IEFs for programs on oversight list
are forwarded to the DOT&E to support TEMP approval.
5. Briefs are on a case-by-case basis. The Commander may elect to sign comment letters with contentious issues.
6. Division Director shall brief Commander or Deputy on all TEMPS with critical OPTEVFOR comments.
7. Division director signs (provides a copy to Commander/Deputy for review; briefs on a case-by-case basis) standard
ACAT II, III, and IVT test plans. Staff through 01A/C prior to division director signature.
8. QRA test plans for programs on oversight list are forwarded to the DOT&E. For the case of DOT&E oversight, the
Commander will sign the QRA test plan.
9. Brief the Commander (or Deputy in his absence) prior to release.
10. Sign “By Direction.”
11. LOIs prepared at VX/VMX/HMX may be released by the squadron Commanding Officer (CO).
12. VX COs authorized to sign “By direction.” The Commander will sign controversial and special interest OTGs and all Naval
Warfare Publications (NWP). Briefing requirements will be determined on a case-by-case basis.
13. If the LTD RCRM required Flag/SES-level intervention, the LTD will be signed by the Commander.
SECTION 4 - TEST DESIGN BASICS
4.1 MBTD
The OPTEVFOR Mission-Based Test Design (MBTD) process affects the full trade-space (cost, performance, and schedule) of
OT. Early and continuous involvement by program stakeholders is key to efficient and accurate
MBTD, resulting in strategies to integrate DT and OT, minimizing the number and duration of
required test periods while maximizing the information provided to the Fleet and the acquisition
community.
Figure 4-1. MBTD Process Flow Chart
UNCLASSIFIED
MBTD is an iterative, systems engineering approach to designing OT that is mission-focused and
executable by the operational tester. It draws from, and informs, many acquisition processes, to
include those shown in the gray blocks of Figure 4-1. MBTD is divided into three phases, which are
further subdivided into a total of twelve steps. The steps are sequential, but can be worked in
parallel. The IEF is the written product developed using the MBTD process. This product provides
the OT measures, test vignettes, resources and other material required for TEMP inputs and OT
Plan development. Though the IEF is an OPTEVFOR-approved document, all participating
stakeholder inputs are formally resolved prior to COMOPTEVFOR approval. Disagreements
between stakeholders that continue beyond the most important review meetings are adjudicated
through the Running Comment Resolution Matrix (RCRM) process described at the end of this
section.
Six reviews are completed as the IEF is being produced to ensure stakeholder alignment, and to
provide course corrections to test teams when required. These reviews are decisional-working
meetings, not final brief-outs of an approved test design. O6-level representatives from the
OPTEVFOR Warfare Division, Program Office, and DOT&E (if on oversight) attend with
decisional authority. Reviews are stakeholder-collaborative events where all inputs and concerns
are freely discussed and addressed. For all MBTD reviews, draft read-aheads are provided.
Because of the iterative nature of the process, new information learned in later steps may require
an adjustment of content from earlier steps; any changes to previously approved products will be
discussed. The duration of the meeting is highly dependent on program size and whether the
products are controversial.
4.1.1 Responsibilities:
Members of the “core team” who work together to execute MBTD, and their associated
responsibilities are:
TPM or OTD (50 Div OTC): Develop the IEF. The warfare division owns the product,
providing final decision on IEF content. TPMs or OTDs (and support contractors) provide
SME knowledge of the SUT/SoS and the mission, ensuring MBTD covers all necessary
tasks, capabilities, and conditions so that test will be scoped to gather adequate/correct data.
TPMs and OTDs must also verify that resultant STAT test designs, including DOE run
matrices, are executable. Additionally, the TPM/OTD will track MBTD progress in the T&E
Program System (TEPS).
Squadron TPM or OTD (if applicable): Support headquarters MBTD efforts.
CTF: Guide the TPM/OTD through the MBTD process. 01B owns the process, providing
feedback on IEF content to the product owners. The CTF ensures MBTD products follow
proper standards and result in logical design.
LTE: Assist the TPM/OTD throughout the process.
STAT/DOE Analysts: Join the core team when required to support STAT (including DOE)
construction. Assist in balancing mathematical rigor and a scientific approach to testing,
with a focus on providing timely and relevant information to the warfighter.
A/DACOS: Chair (decisional authority) all MBTD reviews and ensure IEF content is ready
for signature. The warfare division ACOS will sign TIEFs, as appropriate, per table 3-4.
The warfare division ACOS also invites his/her O-6 counterparts (Resource Sponsor, PM,
Fleet SME, WDC) to participate in the MBTD process as members of the core team.
01B: The Test Design Director or his/her deputy co-chairs all MBTD reviews and ensures
IEFs are developed per the MBTD process.
01C: The Test Planning/Reporting Division participates as appropriate to ensure the MBTD
is developed consistent with test planning, execution, and reporting policy.
01D: The Cyber Division participates as appropriate to ensure the MBTD contains
necessary cyber content, and the materials necessary for cyber test planning, execution, and
reporting are being obtained in parallel.
Stakeholders: External stakeholders are important members of the core team and are critical
to successful development of the IEF. This chapter provides program stakeholders insight on
when and how to collaborate with OPTEVFOR on MBTD so that their concerns are properly
addressed/incorporated.
4.1.2 Stakeholders:
The TPM/OTD must ensure the following stakeholders are aware of the MBTD effort, and are
invited to participate:
OTA: OPTEVFOR leads the MBTD effort, executes much of the resulting test, analyzes all
the data, and reports to the stakeholders. Joint OTAs will be invited to participate. When
OPTEVFOR is not the lead-OTA, MBTD is still used to develop Navy test design inputs.
Program Office / Developmental Tester / M&S Developers: The program office engineers
provide details on the systems, the envisioned program/test progression, the intended T&E
resourcing, and more. The DT agency may adjust their events to provide IT, and will gather
data needed by the OTA. The M&S developers will aid in determining feasibility of M&S to
support OT data requirements.
Resource Sponsor: The Resource Officer (RO) ensures requirements are
understood/properly tested, T&E funds are justified/properly allocated, and the Fleet will be
properly informed by test.
DoN T&E: OPNAV N942 ensures the latest T&E policies are applied, facilitates target and
range resources, and coordinates with resource sponsors to resolve program T&E funding
shortfalls as necessary.
Testing Oversight: Oversight organizations influence test designs they will need to approve.
DOT&E reviews and concurs with the approved IEF via memo.
WDCs / Tactics Developers / Office of Naval Intelligence (ONI): WDCs provide the latest
tactics to be used while ONI provides the latest insight on threat capabilities, tactics, and
operating areas. Tactics development may also leverage planned events, including M&S.
Fleet Users, Trainers, and Planners: The Fleet depends on accurate understanding of the
system capabilities in order to use the system, train for use, and plan for use.
Others: Anyone who can contribute to the MBTD process is invited to participate.
4.1.3 IEF development resources:
The following resources are available to support the MBTD process:
01B CTFs and STAT Analysts: Each SUT will be assigned a primary CTF, but all CTFs are
available for TPM/OTD support. 01B statisticians aid TPMs, OTDs and CTFs in correct use
of STAT.
01D Cyber Support: Cyber Survivability (CS) and cybersecurity are dealt with in multiple
steps of the MBTD process. 01D assists per their processes. See Chapter 11 and the Cyber
Survivability Handbook.
IEF Checklist: The IEF Checklist is one of the handbooks. It provides detailed, step-by-step
descriptions of what the TPM/OTD needs to accomplish to build an IEF.
OT Analysis Handbook: An overview of STAT used at COMOPTEVFOR and a quick
reference guide for warfare division personnel responsible for STAT-based test design,
planning, and reporting. This handbook is not intended to be a statistical theory textbook, but
rather a guide for warfare divisions on which STAT are most commonly used and what tools
are available to assist TPMs, OTDs, LTEs, and contract support in calculating the results.
M&S Handbook (under development): Most programs will have some elements of Modeling
and Simulation (M&S) as part of their test design. Multiple MBTD steps include
consideration of M&S support to the test and its inclusion in the IEF. This handbook offers
guidance on the process and methods to identify M&S requirements and to ensure that M&S
supporting OT&E is credible.
Suitability Handbook: Direction on suitability testing within this handbook is vital to
MBTD. This handbook also contains standard suitability measures and DRs to be considered
for use.
Templates: The document and brief templates include guidance and samples to aid in
product development.
Mission Based Test and Evaluation System (MBTES): MBTES is the database tool that
generates many of the tables used in an IEF (and test plans). It is maintained by 01B.
4.1.4 Timeline:
MBTD has the following timeline considerations and requirements:
Initiation: MBTD starts early. Even before the WIPT begins TEMP construction, MBTD
influences OPTEVFOR’s review of the Capabilities Document (CD), and more. Stakeholder
involvement supports the scope and sizing of not only OT, but DT, IT, M&S requirements,
and if early enough, contractor testing as well. Problem identification, test resource
reductions via shared DT/OT test data, system characterization in DT, and reductions in
dedicated OT are all possible.
Duration: The IEF creation timeline depends on the program size/type, concurrent tasking of the
test team, and the availability of input information. There will be many requests by the
test team for information. Active stakeholder participation can greatly reduce the timeline.
Completion: The IEF is signed for approval by the signature authority listed in table 3-4.
TEMP OT input can, if required, be prepared after the E-IPR is complete (IPR-1 for TIEFs).
Read-aheads: For the IPR-1, DWG, and IPR-2, read-aheads are required to be sent at least
two weeks prior to the meeting. This enables stakeholders to review, provide feedback, and
come to the meeting ready to represent a final position on the content being discussed and
agreed to, thus supporting RCRM use. Under the accelerator team paradigm, as little as 48hr
is allowed.
Scheduling: Meetings should be scheduled with enough lead-time to allow desired attendees
to sufficiently de-conflict their schedule. This is usually done when sending read-aheads.
Meeting minutes: Within two days of the meeting, action items, meeting minutes, and an
RCRM (IPR-1, DWG, and IPR-2) are distributed by the TPM/OTD for stakeholder
comment.
4.1.5 Phases:
The MBTD process phases are:
Mission Analysis: This phase identifies the SUT and SoS boundaries, and focuses on
identifying mission areas supported by the SUT, creating a hierarchical decomposition of
operator tasks to accomplish those missions, and associating conditions (physical, military,
civil, etc.) that affect task performance.
Requirements Analysis: This phase references required system capabilities and their
established criteria to define measures of successful mission performance. SUT and SoS
measures are linked to the subtasks to show how OT will evaluate the SUT as operators
perform those tasks. Data Requirements (DR) for each measure and condition are also
identified.
Test Design: The test design phase leverages work to date to scope, schedule, and resource
data collection for OT. The methods used to collect these data may range from a rigorous
statistical DOE to simple demonstrations for problem identification. Vignettes logically
organize data collection, setting the stage for test event construction. Test methods,
including use of M&S are considered. Resources are detailed and limitations are recognized.
4.1.6 Steps:
The MBTD process has the following steps:
Step 1, Define SUT/SoS: The hardware and software configurations of the fielded SUT are
defined by the program office. The SoS, mission Concept of Operations (CONOPS)/Concept
of Employment (CONEMP), employment TTPs, material support/sustainment concept, and
cybersecurity implications are defined through stakeholder collaboration. These definitions
form the basis for the rest of MBTD, and are entirely dependent on accurate stakeholder
input. When beyond IOT&E, the “in-scope SUT” concept often applies. The focus is on new
capabilities, enhancements, and regression confirmation. Testing is not expanded to cover the
out-of-scope SUT, though this remains open for evaluation if relevant issues are discovered.
Step 2, Identify COIs: COIs are written by OPTEVFOR. For effectiveness COIs, use the
standard mission threads based on the current ROC/POE (varying by exception only). For
suitability, the standard four COIs (reliability, maintainability, logistic supportability, and
availability) are considered, and can be expanded upon identification of need (e.g., training,
personnel support). Cyber Survivability (CS), the third evaluation area, is generally assessed
for all programs requiring OT.
Step 3, Identify Subtasks: OPTEVFOR has defined first-level subtasks for each ROC/POE
mission thread. Not all mission thread first-level subtasks may be applicable to mission
accomplishment using the SUT. Missions are, as needed, further decomposed into lower-
level subtasks that the user will perform per the CONOPS and TTPs. Suitability tasks may
apply, depending on sustainment concept. Those tasks that are critical to mission
accomplishment will later be classified as critical. A good subtask hierarchy is vital to
correct identification of conditions/measures/DRs, construction of vignettes, reporting of
results, and more. CS tasks are not currently written into the IEF.
Step 4, Establish Conditions / Link Subtasks: It is not possible to test the SUT in every
operating or environmental condition. However, the full breadth of the operating
environment must be understood in order to select the testing conditions that are most critical
to system success. Physical, military, and civil conditions are pulled from the Universal Joint
Task List (UJTL)/Universal Naval Task List (UNTL) instructions, but in addition, custom
conditions specific to the SUT mission tasks are almost always defined. Each condition is
assigned levels (descriptors). Conditions are then linked to the lowest level of the subtask
hierarchy based on whether task performance is impacted. CS conditions are not currently
written into the IEF.
Step 5, Identify Specified Measures / Link to Subtasks: A measure is a specific metric used
to assess success of one or more subtasks. There are three measure types: specified, derived,
and other. As the name indicates, specified measures are clearly defined in a resource
sponsor-approved CD. The criterion assigned to a measure defines an acceptable level of
system/task performance. Specified measure criteria are based on CD thresholds. When a
threshold does not exist, the test team may establish a criterion to be used to evaluate the
measure (specified, or otherwise), and will request resource sponsor concurrence. If the RO
does not agree, the measure will remain, but with criterion of “No Threshold”. Measures of
Effectiveness (MOE) trace to effectiveness subtasks; Measures of Suitability (MOS) trace to
suitability subtasks/COIs; SoS Measures can trace to effectiveness or suitability. At least one
measure assigned to each critical task will be classified as a critical measure for those tasks.
Measures may be assigned to more than one task if appropriate. Quantitative measures are
preferred; good SUT documentation/requirements enable such measures. Thus, program
office and resource sponsor input is key to development of a comprehensive, testable set of
measures. CS measures are not currently written into the IEF.
Step 6, Develop Derived Measures / Link to Subtasks: Derived measures are sourced from
some other agreed-to and authoritative document (e.g., system specification, CONOPS,
tactical employment document). These link to subtasks.
Step 7, Create Other Measures / Link to Subtasks: Other, or “OTA Created,” measures are
developed by the test team when specified and derived measures are insufficient to evaluate
effectiveness and/or suitability. These link to subtasks.
Step 8, Derive Data Requirements / Link to Measures & Conditions: DRs must include the
data element (what you are collecting), the unit of measure (e.g., yards, seconds, Likert scale,
qualitative), required data accuracy/frequency, and source of the data (where it comes from).
Each measure or condition may have multiple DRs. DRs are categorized by similar sources.
The DRs expected to come from DT require program office concurrence that the data will be
available and collected during DT. OT will resource and collect the data if the PM does not
agree to collect the data in DT. CS DRs are not currently written into the IEF.
Step 9, Statistical Design / DOE: OPTEVFOR analysts, in consultation with the test team,
will develop a quantitative test design using STAT. Per the Defense Acquisition Guidebook,
STAT refers to the scientific and statistical methods (and associated processes) that are used
to develop efficient, rigorous test strategies in order to yield defensible test results. DOE is
recommended for use by the test community in DoDI 5000.02, and is the preferred STAT
method when the test objective is SUT performance characterization across varying
condition/factor effects. DOE refers to the process of planning an experiment so that
appropriate data will be collected and analyzed by inferential statistical methods, resulting in
valid and objective conclusions, which can be used by the Fleet for expected operational
performance. Designs are only created for critical measures. The designed test must be
executable and defensible as the minimum adequate to support OT conclusions. The test is
only sized for effectiveness data, but the validity of the scope of resulting suitability data is
also examined. Resources/funding, agreed to in the TEMP, for IT/OT are driven by these
designs. Stakeholder participation in this step is critical. This step, along with the existing
DRs, prompts the team to identify needed M&S (e.g., targets, threat surrogates, computerized
simulations). M&S is summarized at the E-IPR sufficient to understand objectives and
resourcing. A notional sketch illustrating the run-matrix idea behind steps 8 through 10 is provided after step 12.
Step 10, Build Vignettes: Vignettes are groupings of subtasks that logically organize data
collection for efficient test execution and resourcing. Often, they equate directly to test
events. Vignettes are built for effectiveness, suitability, and CS. Each design created in step
9 must be associated with a vignette, which then has a matrix of runs to be completed under
specific conditions. Ideally the run order is randomized. M&S details are added, to include
enough to summarize intended uses and understand the path to accreditation. Cost savings
are realized through vignettes taking advantage of IT, fleet exercises, or other test
mechanisms/efficiencies. CS vignettes are added to the IEF.
Step 11, Devise Test Methods: Specific test method write-ups for vignette execution can be
added, but are not required for the IEF because Detailed Methods of Test (DMOT) are
written in subsequent OT test plans (chapter 6). This level of detail is not necessary to
provide a TEMP input. See Section 4.3 for specific guidance regarding Capabilities Based
T&E (CBTE) programs.
Step 12, Determine Resource Requirements: Resources are identified per vignette, and then
combined in test events, periods, or phases (as appropriate for the TEMP input). With
resources agreed to, the limitations to test are identified. Limitations represent data called for
by the outputs of steps 1-9 that cannot be collected via the testing identified in steps 10-12.
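The following notional sketch (Python, not an OPTEVFOR tool) illustrates the bookkeeping behind steps 8 through 10: data requirements recorded for a measure, hypothetical conditions crossed into a full-factorial run matrix, and a randomized run order for a vignette. The measure name, factors, levels, and replicate count are invented for the example; actual designs are developed by 01B analysts using formal STAT/DOE methods.

```python
# Illustrative sketch only (not an OPTEVFOR tool): record data requirements for
# a measure, cross hypothetical condition levels into a full-factorial run
# matrix, and randomize the run order within a vignette.
import itertools
import random
from dataclasses import dataclass

@dataclass
class DataRequirement:
    element: str    # what is being collected
    unit: str       # e.g., yards, seconds, Likert scale, qualitative
    accuracy: str   # required accuracy/frequency
    source: str     # where the data comes from (e.g., DT, OT instrumentation)

detection_range_drs = [
    DataRequirement("Detection time", "seconds", "within 1 s, every run", "OT"),
    DataRequirement("Target truth position", "yards", "within 10 yd at 1 Hz", "DT"),
]

# Hypothetical conditions (factors) and levels linked to the critical subtasks.
factors = {
    "Sea state": ["low", "high"],
    "Target type": ["small", "large"],
    "Emission posture": ["active", "passive"],
}

# Full-factorial run matrix, replicated, then randomized ("ideally the run
# order is randomized").
replicates = 2
runs = [dict(zip(factors, combo))
        for combo in itertools.product(*factors.values())] * replicates
random.seed(1)       # repeatable shuffle for the example
random.shuffle(runs)

print(f"{len(detection_range_drs)} DR(s) defined; {len(runs)} runs in the vignette matrix")
for i, run in enumerate(runs[:3], start=1):
    print(f"Run {i}: {run}")
```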
4.1.7 Reviews:
The following reviews are completed between the MBTD steps:
TP-1: This is a short (0.5-2hr), informal review, with phone-in (or on-site) participation by
external agencies as they desire. Review section 1 of the IEF. TP-1 is often combined with
TP-2.
TP-2: This also is a short (0.5-2hr), informal review, with phone-in (or on-site) participation
by external agencies as they desire. Review the subtask hierarchy, the conditions directory,
and the tracing of conditions to subtasks (traceability matrix).
IPR-1: This is the first formal review, more in-depth than the previous reviews and longer
(2-6hr, or more). Stakeholder participation, to include the program office and resource
sponsor, is critical at this phase. Review the measures matrix, the tracing of measures to
subtasks (traceability matrix), the DRs for measures and conditions, and any prior products
(as required).
DWG: This is an exhaustive technical review of the test design, including assumptions and
limitations, to ensure the test is properly scoped for live and/or M&S resourcing. It is
scheduled for a minimum of 4hr. Stakeholder participation is perhaps more important than at
IPR-1. Review IEF section 2, the design run matrices (if developed), and the Platform
Mission Tasks (PMT) View Shell (PV-0).
E-IPR: This flag-level review is often short (1-3hr). Stakeholder participation is not
required, but program office attendance is encouraged. Review the PowerPoint brief
summarizing all products to date and obtain Commander’s concurrence with the work so far,
and permission to continue.
IPR-2: This final review is formal, but shorter (2-4hr) than IPR-1 or DWG. Stakeholder
participation is not required, but program office attendance is encouraged. Review sections 3
and 4 of the IEF, to include vignette descriptions, run matrices, and resources, along with
initial schedule, anticipated M&S, and limitation descriptions.
Based on the iterative nature of MBTD, any material covered in a prior review can be
discussed at a subsequent meeting.
4.1.8 The IEF:
The IEF is organized as follows:
Section 1: The introduction section details document purpose, SUT/SoS descriptions,
employment concept (mission, sustainment, and cyber), effectiveness COIs, suitability COIs,
and CS applicability.
Section 2: The test design section is often a direct input to the TEMP as the DOE appendix.
It details the critical tasks and measures, experimental design and post-test analysis strategy
by COI.
Section 3: The test execution section is essentially the OT input to TEMP section 3.
Vignettes are described with enough detail to understand resourcing. Leveraging the
notional test schedule, vignette runs are matched up to testing phases/periods. M&S and
limitations are here too.
Section 4: The consolidated resources section provides OT inputs to TEMP resource tables,
as well as resources by vignette.
Appendices A/B: The MBTD products reviewed at TP-2, IPR-1, and IPR-2 are contained
here. The PV-0 is also included.
4.1.9 Updating IEFs:
As a program evolves, new capabilities may be added, measures may be developed or changed for
existing capabilities, lessons from testing may change the DOE for future test, and more. The
changes must be documented/approved. The options for updating a signed IEF are:
IEF Revision: A revised IEF leverages much of an existing IEF, but incorporates significant
MBTD changes (addition or removal of capabilities, addition or removal of resources, and/or
changes to test execution). The full MBTD process is executed to create revised IEFs. A full
IEF document is routed for the Commander’s approval signature. TIEFs are never revised.
IEF Change Letter: An IEF change letter reflects small changes to the existing IEF content
(tasks, measures, DRs, etc.) and has no impact on required resources. Complete only those
MBTD steps and reviews applicable to the change. The updated IEF sections are attached to
the change letter, using change format instructions in the Navy Correspondence Manual. The
letter and the changed IEF sections are routed for Warfare Division ACOS approval
signature. Signature authority for the IEF Change Letter can be elevated if the update
includes high-visibility or controversial material. Copy 01A, 01B, and 01C on the signed
letter. This ensures the support divisions are aware of the change approval and enables 01A
to post the update to Y:\00\Signed Test Documents.
Other: An IEF update is required if MBTD products will support test before any other
OPTEVFOR document is signed. For example, an IEF revision should be signed by the
Commander to update resources preceding a TEMP input. But often, MBTD changes can be
approved via another signed document, such as a test plan. For these cases, no IEF revision
or change letter is required.
4.1.10 Tailored IEF (TIEF):
The principal difference between a TIEF and a full IEF is that the TIEF may not execute all 12
steps of the normal MBTD process. This TIEF uses a template similar to a full IEF, but may not
contain the same level of detail and may be abbreviated, as required. 01B and the Warfare Division
Director will agree on how far in the MBTD process the TIEF should go. At a minimum, the first
8 MBTD steps will be fully completed, including conduct of IPR-1. TIEFs requiring the
Commander’s signature will complete E-IPR. A TIEF can be used to support:
Programs in early development, to produce test design detail sufficient to support the
Milestone A TEMP.
Preparation for a QRA.
Planning for, and input to programs where OPTEVFOR is not the lead OTA.
Planning for, and input to nontraditional assessments of Navy programs (e.g., JCTDs) or
MTA efforts.
4.1.11 Running Comment Resolution Matrix (RCRM):
The RCRM paradigm has been endorsed by DOT&E for use by all OTAs. A copy of the Joint
OTAs RCRM MOA of July 2018 can be found in the MOA folder of the Reference Library. The
RCRM serves to track/elevate (as needed) major disagreements between stakeholders. Comments
that cannot be resolved at IPR-1, DWG, or IPR-2 are entered into the RCRM within 2 weeks
following the meeting. The final RCRM is sent out to all stakeholders (even if blank), and a 90-
day clock starts to resolve any documented comments. OPTEVFOR, the program office, and
DOT&E each have a column in which to provide the specific language they want included in the
IEF to resolve the comment. If stakeholder agreement cannot be achieved at the O6 level,
unresolved comments are elevated to the Flag/SES level NLT 90 days from the RCRM issue date.
NOTE
An RCRM differs from a normal CRM in that it only lists unresolved comments with
outside stakeholders as opposed to all internal and external comments, both resolved
and unresolved. All programs use RCRMs, not just those on oversight.
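For illustration, the sketch below shows how the RCRM timeline described above might be tracked: each unresolved comment carries the language each stakeholder wants included in the IEF, and elevation to the Flag/SES level is due 90 days from the RCRM issue date. The structure and field names are hypothetical; the RCRM itself remains the matrix exchanged among stakeholders.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RCRMComment:
    """One unresolved stakeholder comment (hypothetical structure, for illustration only)."""
    originator: str                  # e.g., "DOT&E", "Program Office", "OPTEVFOR"
    comment: str
    optevfor_language: str = ""      # language each stakeholder wants included in the IEF
    program_office_language: str = ""
    dote_language: str = ""
    resolved: bool = False

def flag_ses_elevation_due(rcrm_issue_date: date, today: date, comments: list) -> bool:
    """Unresolved comments must be elevated to Flag/SES NLT 90 days from the RCRM issue date."""
    unresolved = any(not c.resolved for c in comments)
    return unresolved and today >= rcrm_issue_date + timedelta(days=90)
```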
4.2 PLATFORM MISSION TASKS (PMT) VIEW
The PMT View takes multiple forms and has multiple uses throughout test. It is required in all
OPTEVFOR test efforts. Generation begins during MBTD as the system’s missions are
decomposed into subtasks, and as performance measures are selected and then linked to that
subtask hierarchy. When the full PMT View concept is employed, it becomes a graphic depiction
of the system’s current, assessed/evaluated, mission-based capability. The primary use is clear
and efficient communication of the overall development status of the SUT capabilities being
introduced.
Test Design: Within the IEF, the PMT view does not incorporate test results or capability
evaluation. However, it still depicts the mission-based capability intended for delivery, along
4-11
with a graphical structure of how that capability will be evaluated. PMT View approval
authority at this stage is consistent with the IEF/TIEF signature. The same is true if the PMT
View is developed as part of a test plan. The IEF checklist details production of the PMT
View Shell (PV-0).
Test Results: The PMT View is populated using results of testing throughout the test
continuum as those results are received. It provides a common, data-driven, and shareable
perspective that is a useful real-time reference at multiple levels. Population of results is
completed as a part of the PTIP. The PMT View standardized format is relatively easy to
update and share with stakeholders. The initial PMT View format is a Microsoft Excel-based
tool. The Test Reporting Handbook details how the PMT View is populated.
4.2.1 PMT View Variants:
The PMT View is not a DODAF-defined view, such as an Operational View (OV), Capability
View (CV), or System View (SV). However, for ease of understanding, DODAF-like
nomenclature is used for the PMT View variants. The initial set of standard PMT View variants
is listed below; these variants allow for consistent baseline use across all programs. The PV-0 is a direct
product of the MBTD process. Each subsequent PMT View variant starts with the applicable
portion of the PV-0 structure and incorporates test results. A notional PV-0 generation sketch follows the list.
PV-0: The PMT View Shell displays the uncolored formats for the various PV-1s. It is a set
of Excel spreadsheets organized within a workbook and embedded in the IEF appendix A.
When introducing PMT Views into established test programs, the PV-0 will be created,
updated, or reviewed as an initial action.
PV-1: The Performance PMT Views take several forms, displaying the colored status/results
for tasks and/or measures on each COI, providing nuanced reporting over multiple tabs. As
the PV-1 is built, it will be reviewed by the Program Manager and the OTA’s senior
representative prior to any Gate Reviews. The stakeholders should ensure approval at their
appropriate levels prior to gate review.
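The notional sketch below illustrates one way a PV-0 shell could be generated as an Excel workbook, with one sheet per COI and uncolored task/measure rows. The COI and task names are placeholders, and the openpyxl approach is an assumption for illustration; the authoritative PV-0 structure and format come from the IEF checklist and the approved template.

```python
from openpyxl import Workbook

# Placeholder structure: COIs mapped to (task, measure) pairs from the subtask hierarchy
coi_structure = {
    "COI-1 (example)": [("Task 1.1", "M-1"), ("Task 1.2", "M-2")],
    "COI-2 (example)": [("Task 2.1", "M-3")],
}

wb = Workbook()
wb.remove(wb.active)                              # drop the default blank sheet
for coi, rows in coi_structure.items():
    ws = wb.create_sheet(title=coi[:31])          # Excel limits sheet names to 31 characters
    ws.append(["Task", "Measure", "Status"])      # status column left uncolored in the PV-0 shell
    for task, measure in rows:
        ws.append([task, measure, ""])
wb.save("PV-0_shell_example.xlsx")
```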
4.2.2 PV-0 Revision:
The original PV-0 spreadsheets approved with the IEF should only be revised (e.g., to capture
changes in task structure or measures/conditions associated with those tasks) when the MBTD is
revised or changed, which may occur during test plan development. The Warfare Division
Director will approve these revisions.
4.2.3 PMT View Handling:
The PV-0 may be handled on NIPR and/or SIPR. All other PMT Views contain test results, and
will normally be handled only on SIPR.
4.3 LEVEL-OF-TEST DETERMINATION (LTD)
Not every program requires formal OT. Not every acquisition decision within a program requires
OT input. LTD is the process by which the type of OT involvement is determined to be: no OT,
4-12
observation of DT by OT personnel, or formal OT. It parallels MBTD steps 1-4, replacing the
Risk Assessment Level of Test (RALOT) process. LTD only determines the level of OT required.
If the LTD determines that formal OT is required, the MBTD process must be used to scope the amount of
data collection necessary. A LTD briefing template (Y:\OT&E Production\IEF) is used to scope
the information to be discussed at the LTD decision meeting.
4.3.1 Applicability:
LTD can, as required, be used on any SUT. The following are the most common use cases:
Programs beyond IOT&E, when the need for FOT&E is in question.
Concurrence with OT not being required, in support of ACAT IVM or AAP designations.
Capabilities-Based Test and Evaluation (CBTE) test strategy development.
4.3.2 Considerations:
Considerations for LTD are similar to those of MBTD. The following amplifying information is
provided:
SUT: As with the in-scope SUT paradigm for FOT&E frameworks, the starting point for
LTD is understanding changes to the system, and how those changes are expected to affect
mission performance. SUT capabilities (new, enhanced, regression) are by far the most
important LTD consideration. For example, OT is required for regression testing of a
capability that previously required OT.
o Consider the mission impact of, and operator involvement in, each capability. Is usability
a concern for effectiveness?
o Address HW/SW configuration changes. Is reliability impacted? Has accessibility
changed for maintenance? Does the onboard-repair parts list add or eliminate items?
Suitability alone can justify the need for OT.
o Cybersecurity must be taken into consideration.
o Training on the SUT can affect performance. Training observation can be an OT event.
Training changes may justify OT mission events. How much operator involvement is
needed for task accomplishment using the SUT?
SoS: Changes to the SoS can impact SUT task accomplishment. Identify such changes for
consideration in LTD. However, SoS impacts alone cannot change the level of test, just as
MBTD does not scope OT based on the SoS.
Employment Concept: Has the mission CONOPS, sustainment concept, and/or cybersecurity
concept changed? OT must verify the Fleet can employ the system per tactics and as trained.
Changes in this area are examined. Additionally, mission CONOPS breakdown via tasks in
MBTD considers the tactical environment (e.g. updated threat). OT may be needed for these
concerns.
4-13
Requirements: Testing for an upgrade is influenced by the goals of the upgrade. This is
similar to capabilities, but brings threshold levels of performance into consideration. How
much does the acquisition community understand about the intended results of the
improvements? Will OT be needed to define mission success if the requirements are
unclear?
POR Acquisition Stage and/or Fleet Introduction Status: How much does the program know
about their SUT, and their SUT’s contribution to Navy missions? How much does the Fleet
already know about the SUT? OT may be needed to fill large knowledge gaps.
Prior Testing: What data is available already, and thus, what data is still needed? DT/OT of
previous SUT iterations may provide data. CT/DT already completed for the current SUT
version makes data available. If existing system performance is well-understood, less test
may be needed to understand upgrades. The data must be applicable. How good was the
training at previous test? What environments/threats were tested? How well is performance
variation (vs conditions) understood?
Planned Testing: Formal OT may not be needed. COMOPTEVFOR can issue an AOC
(chapter 8) to inform fleet introduction based on observation of DT, but only if that DT is
operationally representative. What is the SUT configuration and operator training status at
DT, and will that status yield a fleet-representative test? To justify a higher test level,
consider what data pedigree is needed above and beyond what is already planned.
4.3.3 Procedure:
In place of a formal procedure, a LTD is accomplished by applying critical thinking and using
select steps within the IEF checklist as a guide while building the LTD brief. Leveraging MBTD,
a process familiar to OPTEVFOR, makes the process repeatable, consistent, and efficient.
Use the LTD template items applicable to your system to identify the level and type of
information, including cybersecurity, required to support a LTD decisional meeting.
Though usually completed as part of the MBTD requirements analysis phase, identification
of critical tasks and measures within the subtask hierarchy may also be completed, if the
warfare division believes the additional information is required to conduct the LTD.
Ideally, read-aheads are sent out 2 weeks prior to the LTD approval meeting, based on use of
the RCRM paradigm. However, 48 hours is acceptable.
4.3.4 Results:
One of three test levels can be recommended, or there may be insufficient information to
recommend a level:
No OT. This recommendation applies if:
o No risk to critical mission capabilities,
o No required regression,
4-14
o Little to no new/enhanced capabilities (e.g. obsolescence-driven changes only).
Observation of DT. This recommendation applies if:
o Minor risk to critical mission capabilities,
o Regression testing requires DT data collection,
o Upgrades are not operator-intense; minimal new/changed training is needed,
o OT providing insight to a decision (via LOO or AOC).
Formal OT (IOT&E, FOT&E, Ops Demo). Recommended when:
o Moderate (or more) risk to critical mission capabilities,
o Regression testing requires OT data collection,
o Significant new capabilities/enhancements,
o OT providing report/evaluation consistent with test.
Or, as mentioned above, there may not be enough information available to determine a level of
test.
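As a summary of the outcomes above, the following notional decision aid maps the most common considerations to a recommended level of test. The inputs and thresholds are hypothetical simplifications of the considerations in paragraph 4.3.2; the actual recommendation is an analytical judgment made by the team and decided by the ACOS.

```python
def recommend_ot_level(mission_risk: str,
                       regression_needs_ot_data: bool,
                       regression_needs_dt_data: bool,
                       significant_new_capability: bool,
                       sufficient_information: bool = True) -> str:
    """Notional LTD recommendation aid (illustrative only).
    mission_risk: 'none', 'minor', or 'moderate_or_more' risk to critical mission capabilities.
    """
    if not sufficient_information:
        return "Insufficient information to recommend a level of test"
    if (mission_risk == "moderate_or_more" or regression_needs_ot_data
            or significant_new_capability):
        return "Formal OT (IOT&E, FOT&E, Ops Demo)"
    if mission_risk == "minor" or regression_needs_dt_data:
        return "Observation of DT"
    return "No OT"
```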
4.3.5 Approval:
Conduct the LTD approval meeting.
The meeting is chaired by the Warfare Division ACOS and supported by the 01B Director.
Stakeholder attendance is highly encouraged. As at the DWG, leadership is presented
with sufficient background information as well as a recommendation from the participants.
The ACOS has decisional authority and will either agree with the team’s recommendation or
direct a different type of test.
The only required product prior to the meeting is the LTD Brief. MBTD products such as the
IEF section 1, conditions directory, and/or subtask hierarchy are not required, but can be
provided if needed to aid in the final decision.
The RCRM paradigm is used to resolve disagreements that cannot be resolved at the
meeting.
4.3.6 Documentation:
A COMOPTEVFOR letter to the PM confirms the recommended OT level. The LTD Brief,
RCRM (if comments exist) and approved meeting minutes are attached as enclosures. The LTD
letter with enclosures is routed via the electronic document router and signed by the Warfare
Division ACOS, unless resolution of the RCRM required Flag/SES-level intervention. In these
cases, the Commander will sign. The letter is distributed per the IEF distribution list. For oversight
programs, DOT&E is a “Via” addressee. A DOT&E approval memo is expected for oversight
programs.
4-15
4.4 CAPABILITIES BASED TEST AND EVALUATION (CBTE)
CBTE is a test strategy selected by the program, guided by the Systems Command (SYSCOM),
and fully supported by COMOPTEVFOR, that shifts the focus of Developmental Testing (DT)
from requirement and specification verification (although those activities are still conducted) to
assessing the capability of the system in the larger context of the SoS. COMOPTEVFOR policy
is to fully support any program that adopts a CBTE approach. CBTE is more than “talking to
COMOPTEVFOR early” or “conducting some IT”; it requires continuous collaboration and
communication in an environment of trust among all T&E stakeholders at all levels. Programs
using CBTE understand that OPTEVFOR will be more active in their testing continuum, moving
towards a pattern of observation of, or participation in, most, if not all, testing. Because
OPTEVFOR already approaches OT from a mission-based capability evaluation perspective, the
TPM/OTD and supporting team members must be proactive and cooperative in helping the
program evaluate system capability during DT. CBTE does not replace the statutory requirement
for OT, but it should shift system characterization from OT into DT and enable greater mission-
based, operationally representative “free play” IOT&E and FOT&E, which could be aligned with
Strike Group workup periods or large force exercises. The TPM/OTD is not alone in having to
navigate a CBTE test program; OPTEVFOR has strategically identified “CBTE Champions” in
the warfare divisions, and trained personnel in the Test Design and Test Planning competency
divisions, to support CBTE programs. When a program initially indicates that they are considering
CBTE, the TPM/OTD should notify their division leadership and their 01B CTF to obtain
immediate assistance. Following that, training on CBTE is required.
4.4.1 Training:
When a program is considering or has decided to pursue CBTE, two important training courses
are available to the program office and OPTEVFOR test teams. Both teams, led by the program
T&E manager and the TPM/OTD respectively, should attend each course at the same time as a
foundation-building step for future collaboration and communication, and both courses should be
completed at the earliest opportunity. The first course is the Introduction to CBTE (CBTE-100)
course sponsored by the SYSCOM. CBTE-100 provides the “what” and the “why” for CBTE.
The second course, sponsored by COMOPTEVFOR, is the IEF Course. In this course, the test
teams will learn how to complete MBTD and develop the IEF for their programs. Additionally,
as the test teams attend training, they should be creating the written agreement necessary to enable
successful CBTE.
4.4.2 CBTE Memorandum of Agreement (MOA):
When the program has committed to a CBTE approach with OPTEVFOR, OPTEVFOR’s support
for that approach should be formally documented in a MOA between the Program Manager and
the warfare division ACOS. This MOA, developed by the test teams with the help of SYSCOM
leadership and COMOPTEVFOR CBTE Champions, should address the commitment to
developing an environment of cooperation, communication, and trust; conducting MBTD and
developing an IEF that is useful across the test continuum; leveraging the IEF to create TEMP
inputs; OPTEVFOR participating in early DT and Contractor Testing (if applicable); cooperatively
planning and executing IT events; participating in data analysis activities and sharing test data;
and cooperating in understanding the system’s capabilities in the context of its operational
environment and its SoS. Additionally, the unique aspects of CBTE should be addressed in the
4-16
program TEMP, the Master Test Strategy (MTS), and/or the Integrated Test Team (ITT) Charter
as applicable.
4.4.3 MBTD and IEF:
MBTD enables CBTE. MBTD use is being expanded within the CBTE paradigm to form the basis
for evaluation of capabilities tied to the mission throughout all testing (CT/DT/IT/OT). It provides
a common framework around which tests may be designed, planned and executed to maximize
efficient use of resources in capturing required test data. CBTE/MBTD should start as soon as
possible within program test activities to ensure development of a test design supporting RFP
release and the Milestone (MS) B TEMP. In the case of a program that is being developed using
a Model-Based Systems Engineering (MBSE) approach (instead of a traditional paper document-
based approach), early execution of the MBTD process will support inclusion within the model
and influence requirements development. However, CBTE can be initiated at any point in the test
program where testing can be influenced. CBTE programs are not just observers of the
“COMOPTEVFOR MBTD process.” CBTE requires an IEF that must be cooperatively created
during MBTD, thoroughly understood and agreed to with the program’s stakeholders, and then
actively adopted as a DT, IT, and OT planning tool. The IEF provides the collaborative “script”
to guide the program’s CBTE test continuum. CBTE does not fundamentally change the execution
of MBTD. However, the following additional considerations apply when executing MBTD for a
CBTE program:
Close coordination among all stakeholders to collaboratively accomplish the process.
Scheduling MBTD reviews may be more challenging when ensuring full participation of
CT/DT stakeholders.
The OT team will lead MBTD for test programs where formal OT is required. For test
programs where formal OT is not required (determined via the LTD process), the program
DT team will be responsible for leading MBTD. In this case, the OT team will participate in
MBTD and provide assistance, upon request, particularly in the development of a statistical
test design (DOE).
CBTE uses MBTD to develop common data requirements during IT. IT test plans require an
expanded DMOT during MBTD to ensure successful execution and data collection to support
both DT and OT. Because the TPM/OTD is the tactical expert and OT Subject Matter Expert
(SME), their active involvement in DMOT development is critical to successful test design.
4.4.4 TEMP:
The test teams will use the approved IEF to inform creation of the TEMP. The TEMP will
document efficient data collection over the test continuum to satisfy the identified measures in
support of COI resolution. For programs on the oversight list, the Director, OT&E approves the
test strategy, including IT conduct, with signature of the TEMP.
4.4.5 Test Planning:
CBTE does not replace the requirement for OT, where testing is supported by the Test Planning,
Test Execution, and Test Reporting Handbooks. In a CBTE approach however, there will be many
4-17
earlier IT events that will require cooperatively planned, SYSCOM- or program-approved IT plans
capturing test objectives, procedures, resources, and data collection and analysis requirements.
Unlike OT phases with a single test plan, CBTE IT may be accomplished using multiple test plans
across the program’s test continuum. IT objectives may be incorporated in the main body of the
test plans or be included as an appendix attached to the test plan. IT plan development will follow
the SYSCOM process, guidelines and format, and should clearly identify those portions of the
MBTD applicable to the identified test phase through alignment to the IEF Vignettes. The
TPM/OTD/OTC will actively participate in IT plan development, bringing their operational expertise
to the process and advocating for the operational environment, threat surrogacy, and OT data
collection and analysis requirements. Each IT plan may not include the entire MBTD, but should
clearly articulate what portions of the design, by vignette run, are planned for accomplishment. It
must also provide a SUT/SoS description (or reference to the description in a corresponding
document), the relevant COIs, the measures, any known test limitations, a DMOT, and any data
scoring requirements and procedures unique to the test phase. This plan will include the tasks to
be performed and the associated conditions to be considered for the test, as identified in the IEF.
The program manager, or representative, will approve the IT plan for use by the program office
test team. At the conclusion of IT plan development, the TPM/OTD will conduct a CBTE Test
Plan Review Board (TPRB) with the division ACOS, Technical Director, 01B, 01C, 01D, and the
DOT&E Action Officer (AO). The TPM/OTD will brief the test objectives, the system’s current
hardware and software configuration, the IEF vignettes to be conducted, the DMOT, all aspects of
operational and threat realism, the data to be collected and its relevance to OT, and the data scoring
and analysis plan. The ACOS will sign a memorandum approving the IT plan for OT data
collection. For DOT&E oversight programs where the TEMP does incorporate the CBTE strategy,
DOT&E signature on the TEMP confers approval of the CBTE strategy and no additional approval
document will be required. For programs where the TEMP does not incorporate the CBTE
strategy, a document for DOT&E approval, such as an IEF approval memorandum, will be required.
If deemed appropriate by DOT&E, a Data Collection Plan (DCP) may still be required.
4.4.6 Test Plan Changes:
For changes to test plans after initial review and concurrence, the program TPM/OTD will review
the changes in consultation with the assigned 01B CTF and 01C AO, to determine if the changes
affect the MBTD or the qualification of data for use in OT. If MBTD is not affected, the test plan
changes may be approved by the division ACOS without further review. If changes to the MBTD
are required, the TPM/OTD will submit an IEF update (revision or change letter) for
COMOPTEVFOR leadership approval and DOT&E concurrence (if required).
4.4.7 Data Collection, Scoring, and Analysis:
The TPM, OTD, or designated OT representative should be onsite during the entire IT in order to
ensure that the required data are being collected in accordance with the approved IT plan. The
criteria and methodology for scoring data to support OT have not changed. In the CBTE construct,
incremental data scoring may be required to ensure events are completed prior to moving on to
subsequent test events or test phases. The frequency, conduct, and composition of scoring boards
should be discussed in the IT plan, MOA, or ITT Charter, and the TPM/OTD must ensure that all
stakeholders understand how the scored data aligns with OT objectives. Critical thought is
required when scoring IT data for OT use. Were the data affected by the system configuration,
4-18
the operational environment, the operator, or the threat surrogate? Not all data are affected by
these factors equally. The TPM/OTD must attend any post-test data analysis working groups,
where the test results are presented, in order to gauge the program’s progress in developing its
required capabilities. Following completion of those working groups, program and OPTEVFOR
test team leaders should convene to assess the test program’s status and to develop a “way ahead”
for future test events.
4.4.8 Platform Mission Tasks (PMT) Views:
The PMT View is a representation of the required tasks for a SUT to perform missions, organized
by COI. PMT Views are designed to be used as an execution tool for test teams and as a way to
communicate mission capability at decisional meetings. For more information and guidance on
the development, use and business rules for PMT Views as part of CBTE, consult the CBTE
Implementation Guide.
5-1
SECTION 5 - TEST AND EVALUATION MASTER PLAN
(TEMP)
5.1 INTRODUCTION
The TEMP is the most important T&E document associated with an acquisition program; it is the
controlling T&E management document. By regulation, it must be approved prior to
commencement of OT&E. The TEMP is directive in nature, having been signed as submitted by,
concurred with, or approved by all major T&E stakeholders. It documents the agreed-to solutions
for cost, performance, and schedule within the T&E trade space. The TEMP defines and integrates
test objectives, critical issues, system characteristics, test responsibilities, resource requirements,
and test schedules.
5.2 ADMINISTRATIVE POLICIES
Policies and procedures for the development, staffing, and approval of the TEMP are found in
SECNAVINST 5000.2F and DoDI 5000.02. Additional detailed guidance is contained in the
Defense Acquisition Guidebook (DAG), and DOT&E TEMP Guidebook 3.1. Per DoDI 5000.75,
business systems on the DOT&E Oversight List will document T&E management content in a
TEMP.
5.2.1 Primary TEMP Purposes
Combines the Developing Agency’s (DA) DT&E strategy and COMOPTEVFOR’s OT&E
strategy into one integrated master strategy. Because the PEO/DA and COMOPTEVFOR
have independent authority, within their respective areas, to determine program test periods
and test resources, it is imperative that these independent efforts be integrated.
Formal commitment among all stakeholders for the test approach for the life of the program.
Any differences between the DA and COMOPTEVFOR on the objectives, timeline, or
resources for testing have been satisfactorily resolved.
Direction to conduct the specified T&E program, including the sponsor’s committed support,
and approval of the COIs.
Provides DoN T&E Executive (OPNAV N94) concurrence (ACAT I through III TEMPs,
BCAT TEMPs with OT, and developmental Joint Program TEMPs) on the following:
o The thresholds and objectives as stated in the TEMP Part I are consistent with CNO
approved requirements.
o The scope of testing makes appropriate use of the Research, Development, Test, and
Evaluation (RDT&E) funding, which CNO must provide.
o The planned commitment of Fleet units for testing is consistent with CNO directed
schedules and priorities.
5-2
5.2.2 Other TEMP Purposes
Provide the MDA and program sponsor with a clear understanding of what information will
be available to support various decision forums through the program’s course.
Enable the DA to project T&E costs that must be funded.
Enable Fleet, range, simulator, and target schedulers to plan well in advance for the required
services. Resourcing specifics, particularly requirements for new or modified facilities, and
M&S support should be identified in the TEMP.
Establish stakeholder agreement on SUT, SoS, and the current threat per ONI threat
assessment.
Establish OT entrance procedures/criteria.
5.2.3 Multiservice or Joint TEMPs
For multiservice or Joint programs, a single, integrated TEMP is required. Component-unique
content requirements, particularly evaluation criteria associated with COIs, can be addressed in a
component-prepared annex to the basic TEMP. TEMPs for multiservice programs will be
prepared in close coordination with other participating Services’ OTAs and will be approved
jointly by OPNAV N94 and the representatives of the other participating Service chiefs. When
the Navy is designated as executive lead for development and T&E, TEMP preparation will be per
SECNAVINST 5000.2F. The lead service will provide the baseline threat documentation. If the
Navy is not the lead service, Navy-unique threat issues will be addressed.
5.2.4 Programs Covering a Collection of Systems
For a program consisting of a collection of individual systems, a Capstone TEMP (CTEMP)
integrating the T&E program for the entire system may be prepared. A CTEMP addresses the
T&E of a defense system comprised of a collection of stand-alone component systems that
function collectively to achieve the objectives of the defense system. Individual, system-unique
content requirements are to be addressed in an annex to the basic CTEMP. The requirement for a
CTEMP is dependent on the degree of integration and interoperability necessary to satisfy the total
system’s minimum acceptable operational performance requirements.
5.3 ORGANIZATION/CONTENT
The importance of key TEMP portions to the successful completion of the overall OT&E program
cannot be overstated. The DOT&E TEMP Guidelines 3.1 dated 19 January 2017 describe the
expectations for TEMP content in detail, and should be used as a guide when constructing TEMP
inputs. The DAG provides the recommended four-part TEMP format that is the standard for
OPTEVFOR. Use of the legacy five-part TEMP format should only be by exception. If the
Program Manager insists on using the five-part TEMP format, the responsible TPM/OTD will
inform the Commander/Deputy via the division director or squadron commanding officer as soon
as possible for resolution at the appropriate level.
5-3
5.4 DEVELOPMENT
5.4.1 T&E WIPT
A TEMP is prepared jointly by the DA, the DT agency (if one is associated with the program), and
COMOPTEVFOR, with the involvement of the OPNAV program sponsor and the OPNAV N94
T&E coordinator as needed. All stakeholders participate in TEMP drafting and approval through
the T&E WIPT process. OPTEVFOR contributes to all parts of the TEMP (in working sessions,
through comment letters, etc.) and provides the OT&E portions throughout the document. The
TPM or OTD serves as the OPTEVFOR AO for the development or revision of a TEMP, keeping
the OTC (if assigned), section head, division director, and deputy director informed as required.
5.4.2 MBTD/CBTE Contributions
The MBTD process prompts questions and drives coordination that aid in TEMP development.
Often, the MBTD timeline is dictated by the TEMP timeline. MBTD reviews can serve as, or be
scheduled to coincide with, WIPT meetings. CBTE (section 4.4) requires even greater
coordination, and there are specific TEMP contents required for programs using CBTE.
5.4.3 OT&E Inputs
The TPM/OTD should work with the program office to provide the required inputs to meet the
program office’s TEMP production timeline. The IEF provides the basis for the OPTEVFOR
submission to the TEMP. The TPM or OTD works with the core team to develop the required
schedule inputs for Part II, testing inputs for Part III, resource requirements for Part IV, and DOE
inputs for appendix D (if applicable). IEF approval is not required before providing TEMP inputs,
but E-IPR should be complete.
5.5 REVIEW AND APPROVAL
Formal TEMP review is initiated by transmission of the DA’s proposed draft to OPTEVFOR.
TEMPs are typically reviewed in their entirety twice: once when the DA submits a draft for O-6
level review, and again when the final version is received for the Commander’s signature.
OPTEVFOR staff reviews the entire TEMP since the Commander signs for concurrence on the
integrated master plan for T&E. The TPM/OTD is responsible for ensuring that 01B and 01D
have the opportunity to review and comment during each routing to ensure test design and planning
assumptions remain valid. Reviewers should be especially sensitive to resource and schedule
issues in the final draft TEMP.
5.5.1 AO-Level Review
This review is informal, and may not always occur. After the program office consolidates all
inputs, they may disseminate the TEMP for comment by AOs before formal review by leadership.
The goal is to reduce the time for, and number of, formal comments. This process may or may not
include a Comment Resolution Matrix (CRM).
5.5.2 O-6-Level Review
Remaining contentious issues are clarified/endorsed at the O-6 level. When these issues are fully
adjudicated by the stakeholders, a smooth document can be produced for on-time, final signature.
The OPTEVFOR review and response typically includes a CRM, with comments categorized as
5-4
administrative, substantive, or critical. A substantive comment identifies potentially unnecessary,
incorrect, misleading, confusing, or inconsistent information. A critical comment is one which
would cause COMOPTEVFOR to not sign the final TEMP; these must be briefed and approved
by the Deputy or Commander prior to release.
5.5.2.1 TEMP Comment Letters (Navy)
To transmit the O-6-level review, the TPM/OTD prepares a letter commenting on TEMP contents
for signature by the division director within the timeline of table 5-3. Multiservice TEMP
comment letters must be routed within 14 days, per the MOT&E MOA. The TD shall review the
O-6 TEMP and corresponding comment letter prior to signature.
Table 5-3. TEMP Comment Letter Timelines
UNCLASSIFIED
Days | HQ Action | VX/VMX/HMX Action
Next working day after receipt of TEMP | Draft TEMP is routed to the TPM/OTD/OTC, 01B CTF, 01D, LTE and CTE | VX/VMX/HMX TPM/OTD is provided a copy for review
NLT 5 working days after receipt of TEMP | Draft TEMP with initial CRM and proposed cover letter entered in Electronic Document Router | VX/VMX/HMX TPM/OTD provides copy of draft TEMP and response to COTD/CO for review
NLT 10 working days after receipt of TEMP | Brief for 00/00D scheduled if required | CO’s comments provided to 50 Division Director
NLT 28 working days after receipt of TEMP | Conduct brief to 00/00D if required | CO/COTD participate in 00/00D brief as appropriate
NLT 30 working days after receipt of TEMP | Division Director releases O-6 Comment Letter (with 00/00D concurrence if required) | Not Applicable
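The milestones in table 5-3 are counted in working days from receipt of the TEMP. A minimal sketch of that arithmetic is shown below, assuming working days exclude weekends only (official holidays would shift the dates); it is illustrative and not an official scheduling tool.

```python
import numpy as np

def comment_letter_milestones(temp_receipt: str) -> dict:
    """Compute notional table 5-3 due dates as working days (Mon-Fri) after TEMP receipt."""
    offsets = {
        "Draft TEMP routed to TPM/OTD/OTC, 01B CTF, 01D, LTE, and CTE": 1,
        "Draft TEMP, initial CRM, and proposed cover letter in Electronic Document Router": 5,
        "Brief for 00/00D scheduled (if required)": 10,
        "Brief to 00/00D conducted (if required)": 28,
        "Division Director releases O-6 Comment Letter": 30,
    }
    receipt = np.datetime64(temp_receipt)
    return {action: np.busday_offset(receipt, days, roll="forward")
            for action, days in offsets.items()}

# Example: milestone dates for a TEMP received on Monday, 8 January 2024
print(comment_letter_milestones("2024-01-08"))
```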
5.5.2.2 Comment Letter Brief
Briefings to the Commander or Deputy are required prior to signature of all TEMP comment letters
with OPTEVFOR critical comments.
5.5.3 Final Signature Review
Once all issues have been resolved, the smooth TEMP will be signed and dated by the DA and
forwarded to COMOPTEVFOR for formal concurrence. Generally, there should not be any new
issues raised when the smooth TEMP is routed for the Commander’s signature. The only exception
would be if other changes are made in the document subsequent to the O-6 review.
5.5.3.1 TEMP Forwarding Letters
TEMPs and forwarding letters should be staffed and returned to the DA/PEO/PM as soon as
possible (15 working days maximum) after receipt of the TEMP for signature. The Commander
signs all TEMPs and TEMP forwarding letters. Warfare Division directors are expected to address
the following in the “Discussion” block of the Electronic Document Routing record:
Status of comments previously submitted by COMOPTEVFOR: If any critical or substantive
comments were rejected by the program office, address each one, and the impact of the
rejection. Provide rationale for continuing with TEMP signature, or a recommendation for
other action.
5-5
Resources: Make a positive statement that resources have been reviewed and found to be
adequate.
Schedule: Make a positive statement that schedule has been reviewed and found to be
executable.
DOT&E Position: For oversight programs, identify areas of disagreement. Explain them,
and provide rationale for proceeding to TEMP signature.
5.5.3.2 Approval Brief
Briefings are required when critical comments were not resolved, leading to the recommendation
for the Commander to non-concur.
5.5.4 Signed TEMPs
Once signed by the Commander, the TEMP will be forwarded to OPNAV N94 for final staffing
and approval at the appropriate level. For programs on the Office of the Secretary of Defense
Oversight List, the TEMP must be approved by the USD(R&E) and the DOT&E. For
non-oversight programs, final approval of DoN TEMPs rests with ASN(RDA) as the
Service/Component Acquisition Executive. For Navy TEMPs, OPNAV N94 approves on
behalf of the CNO. ACAT IVT programs are the exception, as the TEMP will be effective once
signed by the Systems Command’s (SYSCOM) Commander or PEO, and COMOPTEVFOR.
5.6 UPDATE
Per DoDI 5000.89, the TEMP must be updated “as needed to support acquisition milestones or
decision points”. Update for the FRPDR “or thereafter… may” be required “to address changes
to planned or additional testing”. Changes separate from these specific events may be necessary
when significant program changes occur, or when the program baseline has been breached. The
DA is responsible for ensuring the TEMP is updated. Within the DoN, TEMP updates fall into
two categories: revision and administrative change.
5.6.1 Revision
A revision addresses changes to evaluation criteria, scope of testing, major resources, and/or
performance requirements. A revision may also be required if unanimous agreement is not reached
to submit an update as an administrative change. A revision is signed by all TEMP signatories
following O-6 and Final Reviews.
5.6.2 Administrative Change
An administrative change reflects fact-of-life changes such as personnel, schedule, test status,
history, etc. These changes are assessed as low risk for adversely impacting the scope of planned
testing, milestones, or the Acquisition Program Baseline (APB). Administrative changes may be
promulgated by the PM based on the concurrence of the T&E WIPT members who represent the
signatories. For OPTEVFOR, this is the Warfare Division Director. Use the TEMP Change Letter
template.
5-6
5.7 TEST AND EVALUATION COORDINATING GROUP (TECG) (DON
ONLY)
5.7.1 Critical Difference Resolution
In those rare cases where there are critical differences among the DoN TEMP stakeholders that
cannot be resolved by informal Flag-to-Flag or Flag-to-SES discussions, it may be necessary to
convene a TECG. This Flag/SES forum has been required very infrequently. TECGs will be
convened by the Director, Test and Evaluation Division (OPNAV N942), via formal
correspondence that outlines the purpose for convening the TECG, identifies the attendees, and
provides an advanced agenda for review prior to the meeting. Additional information on TECGs
is in SECNAVINST 5000.2F.
5.7.2 Other Purposes
In addition to resolving critical TEMP differences, a TECG may also be used to implement urgent
required changes to TEMPs. In this case, either a page change will be issued or the formal report
of the TECG will be attached to the TEMP as an annex until the next required update or review.
Finally, all Navy disputes concerning ACAT IV designations and disputes concerning the need for
OT&E (AAP) that cannot be resolved among the stakeholders may be arbitrated by the TECG
process.
6-1
SECTION 6 - TEST PLANNING
6.1 GENERAL
For more detailed information about the Test Planning process, refer to the Test Planning
Handbook.
6.2 INTRODUCTION
The OT community’s value to the acquisition process lies in the observations and evaluations
provided to the stakeholders in the form of robust, repeatable, and defendable test reports. The
most valuable elements of these reports are clear and concise mission-focused COI results
paragraphs and Blue/Gold sheets. The well-constructed test plan is inextricably linked to the well-
written test report. A properly executed test plan provides the test team with all the data required
to adequately evaluate the SUT within the SoS for any given COI. Additionally, since the MBTD
process and resulting IEF form the foundation for the test design, the IEF document is critical to,
and the source of, the bulk of the content contained within OPTEVFOR test plans. The operational
test plan adds the specifics not contained within the IEF or TEMP. Specifics, such as dates and
location of the test, test assets and ranges, squadron number, aircraft type(s), ship name/hull
number, support asset type and unit name/number, detailed scenarios, etc., all get spelled out in
the test plan. Many times, the resources defined as the minimum adequate test in the IEF are not
physically available or affordable for the test phase and force additional limitations to be included
within a test plan. In other cases, development of the SUT will not have progressed as planned
and elements may not have reached the anticipated level of maturity. With this as a backdrop, the
operational test plan is the document explaining the “who, what, when, where, why, and how” for
the OT. The TPM/OTD and supporting test team should expand upon the detailed work specified
in the IEF and clearly point out any differences.
The OT plan must be coordinated with all stakeholders. Key stakeholders include DOT&E (for
programs on the oversight list), the PM, the Resource Sponsor, Fleet representatives, Warfare
Development Centers and analytical support activities. For multiservice tests where OPTEVFOR
is the lead agency, close coordination with participating OTAs or responsible test organizations is
essential. Proper coordination, and early identification of issues requiring resolution to the
OPTEVFOR chain of command, are critical to successful preparation and approval of the test plan.
For an adequate OT, the OT plan must exercise the SUT within the SoS under conditions that are
as close as possible to the expected operational and combat environment, using operational
scenarios derived from MBTD vignettes in which Forces employ realistic tactics against realistic
simulations of potential adversaries and targets. Additionally, the SUT must be:
Representative (considering the stage of development and phase of test) of the intended
production equipment (note: what is required to be representative for one vignette may not
be adequate for another depending on what tasks are executed and what measures define their
success).
Operated and usually maintained by Fleet personnel. Operation by Fleet personnel is always
required for OT once a mature (production-representative) system is available. System
6-2
operation by contractors or SMEs is not appropriate for OT in any but the earliest phases,
usually EOA or OA when there is only a prototype or brassboard, or while depending on
computer or paper drawings or simulation. The same is not true of maintenance. During
early phases of OT, maintenance by Fleet personnel is usually not possible, making
maintainability data unusable for COI evaluation. On occasion, the Navy’s maintenance plan
states a continuing role for contractor personnel in organizational-level maintenance. When
testing a system with an approved plan of this kind, contractor personnel participation is
permitted exactly as specified in the approved plan, and their performance is subject to
review and analysis just as if they were Fleet Service personnel.
Operated or exercised in an operationally representative environment. OT seeks to provide
data on SUT performance (where performance includes all the elements of operational
effectiveness and operational suitability) in the operational environment and the SUT’s
capability to contribute to the SoS in which it is employed.
Installed (considering the stage of development) as it is expected to be installed in the Fleet.
6.2.1 Quick Reaction Assessments (QRA)
QRAs are used when necessity dictates a rapid deployment of a system in development to provide
critical capability to the Fleet, or when the program sponsor desires a quick assessment by
OPTEVFOR of capabilities, limitations, and considerations for operational employment of the
new system. QRAs are completed in response to a QRA tasking letter promulgated by CNO (N94).
The tasking letter, the program sponsor’s request letter to CNO (N94), and any available MBTD
product should be used to provide the basis for the QRA test plan (coordinate with 01C for the
QRA test plan format). The test plan will be produced using the test planning process described
in the Test Planning Handbook. The QRA will NOT resolve COIs, make effectiveness or
suitability determinations, or make Fleet introduction recommendations. QRAs will only assess those
capabilities or attributes identified in the tasking letter, and should make a risk assessment for early
deployment relative to selected COIs.
By virtue of the rapid deployment need, QRAs are limited in scope. Although an IEF is not
required, a QRA test plan should take advantage of an available IEF, if one already exists. If an
IEF does not exist, and time permits, develop a TIEF to improve test adequacy (see chapter 4).
The QRA test plan should be structured to provide clear insight into the risks associated with a
rapid deployment with limited OT.
6.3 BRIEFING TEST PLANS
6.3.1 General Test Plan Briefing Instructions
The Commander approves all test plans forwarded for DOT&E review.
6.3.2 The Test Plan Brief
The Commander is briefed on all ACAT I and DOT&E oversight test plans (including OAs) as
part of the test plan approval process using the COT brief. Briefings should be scheduled so that
time is available to incorporate the Commander’s guidance prior to briefing DOT&E no later than
180 days prior to expected test operations.
6-3
6.4 LIMITATIONS TO TEST
Limitations to any OT preclude the testers’, customers’, or stakeholders’ understanding of the full
range of capabilities of the SUT within the SoS. As such, any limitation to test implies the CNO
or Fleet Commander is accepting some risk by not knowing the system performance or capability
in the areas, conditions, or threats associated with the limitation. For test plans for mature systems
where a test article exists (IOT&E, FOT&E, late stage OA and QRA), it is very important for the
TPM or OTD to describe limitations not only in terms of what the limitation is, but also in terms
of the impact of the limitation; what is it that will not be known in terms of the COI and what is
the impact to COI assessment or resolution? Additionally, any mitigation for the limitation should
be discussed. For EOA and OA test plans where the scope of testing is restricted due to the early
position of the program within the acquisition life cycle (i.e., there is no representative test article
and the EOA is being performed as a paper study), all limitations should be based on the frame
of reference of the scope of testing. In other words, for an EOA of a ship that has not started
construction, not having a ship to observe, walk on, and test is not a limitation to test, but would
be described in the description of the purpose and scope of testing. Therefore, for EOA and OA
test plans, severe limitations do not apply. Limitations fall into three categories: severe, major,
and minor. The definitions for the three categories of limitations are as follows (a notional classification sketch follows the definitions):
Severe Limitations. Limitation(s) that preclude COI resolution and adversely impact the
ability to form conclusions regarding operational effectiveness and suitability, or cyber
survivability.
Major Limitations. Limitation(s) that may affect COI assessment or resolution but should
not impact the ability to form conclusions regarding operational effectiveness and suitability,
or cyber survivability.
Minor Limitations. Limitation(s) that have minimal impact on COI assessment or resolution
and do not impact the ability to form conclusions regarding operational effectiveness and
suitability, or cyber survivability.
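Restated, for illustration only, as a notional classification rule (the names are hypothetical; the actual categorization is an analytical judgment documented in the test plan):

```python
def categorize_limitation(precludes_coi_resolution: bool,
                          affects_coi_assessment: bool,
                          impacts_es_cs_conclusions: bool) -> str:
    """Notional mapping of a test limitation to severe/major/minor (illustrative only).
    'es_cs' refers to operational effectiveness/suitability or cyber survivability conclusions.
    """
    if precludes_coi_resolution and impacts_es_cs_conclusions:
        return "Severe"
    if affects_coi_assessment and not impacts_es_cs_conclusions:
        return "Major"
    return "Minor"
```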
6-4
THIS PAGE INTENTIONALLY LEFT BLANK
7-1
SECTION 7 - TEST DATA CONTROL
7.1 GENERAL
Test execution is one of the most critical elements of the OPTEVFOR mission. Refer to the Test
Execution Handbook for detailed information.
7.2 SHARING AND RELEASE OF OT DATA AND RESULTS
7.2.1
As acquisition accelerates to meet urgent warfighting demands, OPTEVFOR will accelerate
delivery of informative, relevant, and technically accurate reports to operational and acquisition
leaders.
7.2.2 Test results are controlled at three levels.
Raw Data. Objective raw data are the recorded results of operational testing, either manually
in a data sheet or logbook, or using an automated data stream. These data (e.g. number of
gun rounds fired) are immediately releasable by the TPM/OTD to the developing program(s),
and also to the staff of the DOT&E as required by statute. We share objective data to enable
the program’s engineers to rapidly identify performance trends and any performance
deficiencies, in order to quickly develop required corrections. Subjective data such as
opinions, survey results, and guided interview notes will only be shared prior to the
publication of the test report with the staff of the DOT&E. Evaluative conclusions will not
be shared until publication of the test report, unless specifically directed by the Commander.
Analyzed data. Analyzed data, including measure results and deficiency (Blue and Gold)
sheets, are releasable to the applicable Program Manager (PM) by the warfare division
Assistant Chief of Staff (ACOS) or test squadron Commanding Officer.
Draft Blue/Gold sheets should be shared early and clearly marked with both a “Draft”
watermark and the draft disclaimer from the deficiency sheet template. Sharing does not
give the PM permission to edit or change deficiency sheets.
Warfare Division Directors should brief the Technical Director, Deputy Commander, and
Commander on any risk or deficiency sheet characterized as Major 1 or Severe prior to
signing the Risk or Deficiency Cover Letter.
Sharing finalized deficiency sheets via a letter should be tied to conclusion of formal review
with the Competency Divisions, and should not be delayed. Keep the Commander, the
Deputy Commander, and the Technical Director informed of stakeholder responses to shared
deficiencies.
Evaluated information. The Commander will normally release evaluated information via the
test report following the conclusion of the E-SERB. Evaluated information includes COI
resolutions (including “trending” evaluations in Interim Reports); Operational
7-2
Considerations; Effectiveness / Suitability / Cyber Survivability conclusions; and Fleet
release / production / deployment recommendations.
Information demand may overtake the report production timeline. When this happens, Warfare
Division Directors, with the appropriate Competency Division Director, provide a
recommended approach to accelerate information release, including what information to
release and to whom.
If an unsatisfactory COI resolution or a negative conclusion or recommendation is
anticipated, do not wait for the E-SERB to notify the Technical Director, the Deputy
Commander, and the Commander. Warfare Division Directors and Squadron Commanding
Officers should be prepared to discuss the data that led to the recommendations, near-term
system acquisition and deployment milestones, and with whom the analyzed data has been
shared, and their responses.
7.2.3
If the system has already deployed, Warfare Division Directors and Squadron Commanding
Officers should communicate directly with the unit Commanding Officer to obtain their updated
perspective on the system’s performance prior to the E-SERB.
7.2.4
In some cases delivered capability will not match Warfighter expectations. Prior to releasing the
report to the Vice Chief of Naval Operations, the Commander may direct Warfare Divisions or
Squadron Commanding Officers to brief evaluated information at the O-6/GS-15 level to the
Warfare Development Center, Resource Sponsor, Type Commander, and Fleet Commander in
order to give stakeholders an early opportunity to assess operational impact and deficiency
mitigations.
8-1
SECTION 8 - EVALUATION REPORTS
8.1 INTRODUCTION
The evaluation report provides the CNO with COMOPTEVFOR's conclusions regarding a
system's operational effectiveness, operational suitability and cyber survivability, and
recommendations regarding Fleet introduction, further development, additional OT&E, etc.
System evaluations of operational effectiveness, operational suitability, and cyber survivability are
made on the contribution of the SUT to the SoS warfighting effectiveness. The evaluation report
provides the information (test results, evaluation criteria, etc.) to substantiate COMOPTEVFOR's
conclusions and recommendations.
When conducted, EOA and OA phases of test require assessment reports to support MS-B
and MS-C per DoDI 5000.02.
Evaluation reports are prepared at the end of each OT&E phase and are required by DoDI
5000.02 for the FRP decisions. During times of compressed decision timelines, the PM or
PEO may request an Interim Report. This report will use whatever data has been evaluated
to make an assessment of where the SUT performance stands in relation to effectiveness and
suitability. As the name implies, this report has not considered all available test data, is not
final, and is therefore subject to change when the remaining data are fully evaluated.
Publication of an interim report does not alter the requirement for a Final Test Report.
Publication deadlines are specified in table 3-2.
Assessment reports and evaluation reports are OPTEVFOR’s most important contribution to
the acquisition process. Test reports help form the basis for acquisition decisions by
articulating the effectiveness, suitability, and cyber survivability of new systems and
capabilities. Test reports also provide a historical record of testing. The goal of all
OPTEVFOR reports is to clearly communicate the results of testing to all stakeholders.
These results are communicated by describing what was observed, then using operational
experience and judgment to evaluate the impact of those observations on mission
accomplishment.
8.2 TYPES OF OPERATIONAL EVALUATION AND OTHER REPORTS
There are several types of reports provided as a result of OPTEVFOR involvement in programs.
See table 8-1 for report format guidance.
Table 8-1. Report Format Guidance
UNCLASSIFIED
Report Type | Test Type | Purpose | Format
OER | IOT&E | To report a full, complete phase of testing. Consists of a report letter signed by the Commander, a deficiency letter signed by the Warfare Division Director, and a data analysis summary memorandum signed by the Technical Director. | Full Report (IOT&E-FOT&E Report Templates)
OFER | FOT&E | To report a full, complete phase of testing. Consists of a report letter signed by the Commander, a deficiency letter signed by the Warfare Division Director, and a data analysis summary memorandum signed by the Technical Director. | Full Report (IOT&E-FOT&E Report Templates)
OAR | EOA/OA | Early involvement OT reports used to identify system enhancements and significant areas of risk to the program’s successful completion of IOT&E in the form of Blue and Gold sheets. OARs are assessment reports that support all stakeholders, but do not support specific MS decisions. | Full Report (EOA/OA Report templates)
OMAR | EOA/OA | Early involvement OT reports used to identify system enhancements and significant areas of risk to the program’s successful completion of IOT&E in the form of Blue and Gold sheets. OMARs are assessment reports used to support MS decision meetings. | Full Report (EOA/OA Report templates)
DT Assist LOO | DT Assist | Per the PM’s DT assist request letter. | Letter with enclosed Blue/Gold risk sheets (LOO template)
Letter | AOC | Per the MOA coordinated between the PM and the Warfare Division Director. | Letter with enclosed Blue/Gold deficiency sheets
QRA | QRA | To report findings for operational considerations/system capabilities when it is necessary to achieve a rapid capability in the Fleet. QRAs do not replace formal OT&E. They are used to support a rapid deployment of a capability to the Fleet. | Report (QRA Report template)
VCD | VCD | To report results for validating correction of specific deficiencies (specific COIs only) from previous testing (end-to-end testing may not be required). | Report (VCD Report template)
Interim Report | EOA/OA/IOT&E/FOT&E | Report provided when, due to unforeseen events, evaluation results are required prior to publication of the full OT report. The report provides the status of testing, an assessment of available data, and a recommendation (if appropriate). Use of this report is at the Commander’s discretion. The full formal report is still required. | Report
MUA, LMUA, or OUA | JCTD | Products for JCTDs that provide an assessment of the military utility demonstrated. Not to be used for acquisition programs. | Full JCTD Report
8.2.1 Operational Test Agency Evaluation Report (OER), Operational Test Agency Follow-
On Evaluation Report (OFER), and Software Qualification Test (SQT)
For IOT&E and FOT&E, system evaluations of operational effectiveness, operational suitability,
and cyber survivability are made on the contribution of the SUT to the SoS warfighting
effectiveness. A separate operational effectiveness and suitability evaluation may be provided for
the SoS capability to perform its mission in the operational environment only when there is
sufficient data to conclude the SoS performance differs from the SUT conclusion. A fielding
recommendation is provided in the OER or OFER. SQTs will use the same report format as
IOT&E/FOT&E. See appendix C for additional discussion of SQTs.
8-3
8.2.2 Operational Test Agency Assessment Report (OAR) and Operational Test Agency
Milestone Assessment Report (OMAR)
EOAs and OAs, whether conducted as stand-alone OT, combined DT/OT, or fully integrated
testing, often support program decision points. These reports will be termed OAR or OMAR.
OAR/OMAR requirements should be listed in the TEMP and commonly support Defense
Acquisition Boards or MS decision meetings.
8.2.3 Observing DT
DT assist LOOs are used to communicate with the program manager when accomplishing a DT
assist. This feedback is in the form of observations of system performance using the DT assist
Letter of Observation (LOO) format. The format for DT assist LOOs is a brief letter to the PM
with attached Blue/Gold risk sheets for each performance issue identified.
AOCs are used when observing a DT phase or DT event(s) of an acquisition program to assess the
operational capabilities of a System Under Test (SUT) prior to introducing/releasing it for
Fleet/operational use, when the Program has no future phase of Operational Test (OT) planned. The
decision to conduct an AOC should be at the A-Code and Program Manager (PM) level in
consultation with 01B and 01C. The AOC letter is addressed to the program stakeholders, includes
a description of system capabilities and limitations observed during the DT, and includes attached
Blue/Gold deficiency sheets for each performance issue identified.
8.2.4 Quick Reaction Assessment (QRA)
The QRA report will not resolve COIs, make effectiveness, suitability, or cyber survivability calls,
or provide a limited Fleet introduction, Fleet introduction, or Fleet release recommendation. The
QRA report will answer the questions and address the purpose as outlined in the QRA request
letter. As such, the QRA request letter is routed with the test report as the report is staffed for
signature. Information from a QRA may be used by DOT&E in support of a “Section 231” report
to Congress when a system being developed is fielded prior to the completion of IOT&E.
8.2.5 Verification of the Correction of Deficiencies (VCD)
For a stand-alone VCD phase of test, the VCD report is a letter summarizing the resolution of
each evaluated deficiency, with all the deficiencies included as an enclosure.
For programs not on the DOT&E oversight list, when COI resolution is discussed in the test plan
and if the VCD results enable a change to the resolution of COIs (beyond IOT&E), then those
updated COI resolutions will be listed in the VCD report, thereby reducing the scope or eliminating
the need for later phases of OT for the specific purpose of verifying the deficiency that has been
corrected. For programs on DOT&E oversight, the only permitted change in COI resolution during
a VCD phase of test is from SAT to UNSAT. See appendix B for a detailed discussion of VCD
testing. For more detailed discussion of the Evaluation Report, see the Test Reporting Handbook.
SECTION 9 - RESOURCES
9.1 INTRODUCTION
This chapter focuses on resources available to the operational tester. The chapter includes such
topics as points of contact, services, instructions, responsibilities, and specific resources available
to the tester. This chapter also provides an overview of the resource tools necessary to accomplish
the job of a TPM or OTD.
9.2 ELECTRONIC RESOURCES
9.2.1 OT&E REFERENCE LIBRARY
General T&E references are found on the OPTEVFOR UNCLAS and SIPR share drives at
Y:\OT&E Reference Library. In each domain, this folder contains a wide variety of valuable
resources that are particularly useful for test teams, including:
OT&E Manual (also on KMS)
Handbooks (also on KMS)
The COMOPTEVFOR briefing template
COMOPTEVFOR Acronym and Abbreviation List (CAAL)
COMOPTEVFOR OT&E Document Writing Guide Sheets
Security classification marking instructions
DoD, CJCS, SECNAV, and OPNAV T&E Instructions
DOT&E Guidance
M&S Instructions
Various MOAs.
9.2.2 OT&E PRODUCTION LIBRARY
The Y:\OT&E Production Library on the UNCLAS and SIPR share drives holds references,
templates, and guidance particular to OPTEVFOR products. For example, all templates and other
related references for a test plan are found in the “Test Plan and DCP” folder. There are also
folders for the IEF, M&S Accreditation, TEMP Input, Cyber Survivability, Test Execution, and
Test Reports.
9.2.3 SECURITY CLASSIFICATION GUIDES AND CLASSIFICATION MARKINGS
It is extremely important that OPTEVFOR documents have appropriate security classification
markings. To mark documents properly, the TPM/OTD must have the current Security
Classification Guide (SCG) for the particular program, as found on the Defense Technical
Information Center (DTIC) website. If the SCG for a particular program is not posted, the
TPM/OTD should contact the program office. The TPM/OTD must also review the latest guidance
on how to mark unclassified, Controlled Unclassified Information (CUI), and classified
documents, which is found at Y:\OT&E Reference Library\Security Classification Guides. Further
assistance is available from the Security Manager or from the editors.
9.2.4 THREAT SYSTEMS DATABASE & TETRA
Under a MOA between DOT&E and the Defense Intelligence Agency (DIA), the Test and
Evaluation Threat Resource Activity (TETRA), within the Missile and Space Intelligence Center
provides ongoing intelligence analysis and support for DOT&E threat resources while managing
and overseeing any DOT&E investments for the development of threat resources.
TETRA maintains a Threat Systems Database (TSD) and naval and other warfare ‘handbooks’ at
the SIPR address below. Operational testers and test planners may use this resource as a means of
accessing multiple accreditation documents for the various land and sea ranges throughout the
Major Range Test and Facility Base (MRTFB) infrastructure.
https://tsdb.msic.dia.smil.mil/home/main
DOT&E and TETRA also develop an annual prioritized list of foreign materiel requirements that is submitted to the Joint Foreign Materiel Program Office (JFMPO) and informs whole-of-government materiel collection priorities. Actual foreign materiel and the information gained through its exploitation are critical to developing and fielding weapons that work. COMOPTEVFOR participates in this process by providing annual inputs to DOT&E and TETRA that identify the threat surrogates and/or systems we might prioritize for Navy testing.
COMOPTEVFOR’s Test Design Division (01B) coordinates the Foreign Materiel Acquisition/
Exploitation (FMA/FME) input process. For more information on the FMA/FME process or for
details on the TSD, the Threat Systems Lead is Mr. Heath Richardson, Code 01B3, (757) 457-
6351.
9.3 T&E PROGRAM SYSTEM (TEPS)
TEPS is a module within the COMOPTEVFOR Knowledge Management System (KMS) on the
unclassified LAN. (https://kms.cotf.navy.mil/home_auth/home.home_mis.home_main). TEPS is
a Web-based management tool designed to assist the TPM/OTD/SH/OTC/LTEs in the tracking
and administration of projects, Fleet services scheduling, and activity reports. Access to the TEPS
database is limited to members of OPTEVFOR. Procedures for the use of TEPS may be found in
appendix E, and in the TEPS User’s Guide in KMS.
9.4 SHARED DRIVES
The K: drives on the unclassified and classified LANs are shared drives that support access to and
storage of T&E documents. The drives are organized by division, and each division is organized
by section, with each section organized by office code. While each division may set its own
requirements, at a minimum, the K: drive folders for individual programs should be structured
with the following guidelines.
9.4.1 Program Folder
Program folders should be named with the TEIN and short name (e.g., K:\40\41\0371-03
CBASS). Each program folder should have subfolders for the following, as required:
Each phase of test
Requirements documents
IEF
Funding
TEMP.
9.4.1.1 Phase of Test
Within program folders, each phase of test should have its own folder using the name of the phase
(e.g., K:\50\54\541\0201-08 EA-18G\OT-B1). Each phase of test should have folders for the
following documents (a notional layout is sketched after this list):
Briefs
Messages
Final report
Test plan.
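For illustration only, the following minimal Python sketch stands up that folder skeleton; the TEIN, program short name, and phase name are placeholders taken from the examples above, and divisions may add or rename folders to meet their own requirements.

from pathlib import Path

# Placeholders only; substitute the actual TEIN, program short name, and phase of test.
program_root = Path(r"K:\40\41\0371-03 CBASS")
program_subfolders = ["Requirements documents", "IEF", "Funding", "TEMP"]
phase_subfolders = ["Briefs", "Messages", "Final report", "Test plan"]

# Create the program-level folders.
for name in program_subfolders:
    (program_root / name).mkdir(parents=True, exist_ok=True)

# Create one folder per phase of test, with its standard document subfolders.
phase_root = program_root / "OT-B1"
for name in phase_subfolders:
    (phase_root / name).mkdir(parents=True, exist_ok=True)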
9.4.1.2 Documents
Once a final, signed, official document is available, save the document in PDF or Document (DOC)
format, as applicable, in the appropriate division folder. Remove all draft documents from the
main document folder by either deleting the draft document or moving it to a history folder. This
action may prevent confusion as to which document is the most current. Chapter 3 states that the
editors or Flag Admin will create a final PDF document after signature. These are provided to
01A for posting at Y:\00\Signed Test Documents.
9.5 PHYSICAL RESOURCES
Depending on the program, a TPM/OTD may need to arrange for support (e.g., data
collection/analysis/reduction, ranges, targets, etc.) from a variety of activities. In addition to the
resources available within the divisions and from the program offices, OPTEVFOR’s Fleet
Resources Office (01C7, LCDR Caity Atwood, Code 01C9A, 757-457-6245, for east coast, or Mr.
Scott Higbee, 619-553-4568 for west coast) and Test Resource Requirements (Mr. Heath
Richardson, Code 01B3, (757) 457-6351) can provide assistance in obtaining necessary support.
9.6 TEMPORARY ASSIGNED DUTY (TAD) TRAVEL
All TAD travel, either command or program funded, must be submitted and approved via the Web-
based Defense Travel System (DTS). Establishment of DTS accounts and training are provided
by the OPTEVFOR Admin office during the personnel check-in process. COMOPTEVFOR’s
policy is that all personnel exercise discretion in the stewardship of taxpayer funds and be frugal
in the use of appropriated funds in support of travel by:
Limiting travel to the absolute minimum level necessary to accomplish the mission in terms
of the number of travelers, mode of travel, duration of travel, alternatives to travel, etc.
Using teleconferencing, video-teleconferencing, and DoD sanctioned web-based
collaboration capabilities (such as Defense Collaboration Services (DCS), Webex, and
Microsoft Teams) in lieu of travel whenever possible.
Using government quarters where available; where appropriate, travel arrangements to locations where government quarters exist should be made in a timely manner to allow OPTEVFOR travelers to use government lodging while on travel.
Minimizing resource expenditure for vehicle rentals by ride-sharing arrangements whenever
two or more personnel are traveling to the same place. Note that restrictions apply to ride-
sharing when contractor personnel are involved. See the contracting handbook or consult
with your division contracting Technical Assistant (TA) or the command Contracting
Officers Representative (COR) if you have any questions or concerns.
Navy Defense Acquisition Career Manager (DACM) pays for travel associated with training
of Defense Acquisition Workforce Improvement Act designated personnel. Funding must be
identified and received by the traveler prior to processing orders.
Travel by staff personnel to support programs that have passed Full-Rate Production (FRP)
Decision Review (DR) will sometimes require the use of command Operations and
Maintenance, Navy (O&MN) funds. Due to the more restrictive financial rules that apply to
O&MN funding, travel requirements that utilize these types of funds should be planned as
early as possible in consultation with the division B Code and finance division.
Using government tax exemption forms wherever possible (in states where they are accepted).
9.7 FLEET SERVICES
COMOPTEVFOR is the OPNAV N94 RDT&E Fleet support scheduling agent. This includes all
DT and OT associated with acquisition programs and those projects and initiatives endorsed by
OPNAV N94 requiring Fleet support under this process. The primary method to identify Fleet
support for acquisition projects is in Part IV of the TEMP. There are two types of Fleet Service
Requests (FSR): standard (quarterly) and emergent.
9.7.1 STANDARD FSR
Approximately 9 months prior to the actual execution quarter, OPNAV N94 sends the
"QUARTERLY CALL FOR FLEET RDT&E SUPPORT REQUIREMENTS" message to all
RDT&E agencies soliciting Fleet support requirements (the OPNAV N94 support request will
include a cut-off date, after which service requests will be submitted via an Emergent FSR (EFSR)
message). TPMs or OTDs submit FSRs via the Unclassified Test and Evaluation Support (UTES)
database, which can be accessed from the KMS main page or https://utes.cotf.navy.mil/. The
UTES Operator’s Guide can be found on the COMOPTEVFOR main web page, Y:\OT&E
Reference Library, or from the OPTEVFOR Fleet Resources managers. When preparing an FSR,
the following questions should be considered:
Hours per day? Day or night operations?
Type of aircraft, surface ship, or submarine required?
Sorties per day?
Are services requested: dedicated, concurrent, or Not-to-Interfere Basis (NIB)?
Consecutive? If not, minimum and maximum time between periods?
In connection with other units?
Can this be in connection with transit, Fleet exercise, or other project operations?
Why these specific date(s)?
How rigid are these dates?
Which day(s) (when in connection with other assets)?
Can these tests be done simultaneously?
DT or OT?
Phase?
If a specific unit is requested, then why this particular unit?
Is same unit(s) required each day (period)?
Ship Alteration (SHIPALT)/ Temporary Alteration (TEMPALT) required or preferred?
Test location/instrumented range?
Which units have this equipment?
Any riders? Justify number of riders.
Any previous Separate Correspondence (SEPCOR)? If so, make note of it.
Is this a continuation of previous quarter services?
What type of augmentation?
Can more testing be done each day (period)?
If this asset is not available, is remainder of services required?
What is the minimum time required?
Does your test support: MS C, LRIP, OTRR, Critical Design Review, IOT&E, FRP, and/or
Fleet release?
If your program is delayed, what is the delay impact? What is the cancelation impact?
OPTEVFOR resource managers (east and west coast) will forward all OT requests to OPNAV
N942 for validation and prioritization. Once the validation and prioritization is complete, OPNAV
N942 will forward the endorsed “Fleet RDT&E Support Requirements for that FY Quarter” to the
OPTEVFOR resource managers, who, in turn, will enter them into Web-Enabled Scheduling
System (WEBSKED) prior to the quarterly Commander, Task Force 20/ Commander Third Fleet
scheduling conference.
When the scheduling conference is completed, OPTEVFOR resource managers will contact the
respective TPMs or OTDs by e-mail (SIPRNET preferred) or telephone with the results of the
conference. The following is a list of possible conference results.
Unit Assigned – When a specific unit is assigned, the OPTEVFOR resource managers will
provide the TPM/OTD with the scheduled unit POC. The TPM/OTD should contact, at the
earliest opportunity, either the unit POC or the command/activity that has been assigned, to
ensure that the requirements are known and integrated into the unit's planning at an early
stage, and to have COMOPTEVFOR added to the distribution of unit CASREP and
CASCOR messages.
Direct Liaison Authorized (DIRLAUTH) – OPTEVFOR resource managers will execute
DIRLAUTH to locate platform-level support and provide a unit scheduling agent POC to the
TPM/OTD. The TPM/OTD will coordinate with the unit scheduling agent to determine
supportability, while keeping the OPTEVFOR resource manager informed.
No Fill – Fleet support request is not supportable.
Open – Fleet support requested was not available during the scheduling conference; however,
it may become available sometime after the conference. All OPEN requests will be reviewed
regularly by OPTEVFOR Fleet resource managers for a potential support opportunity.
In all cases, it is advisable that the TPM/OTD contact the PM regarding assigned services for any
PM-required action. TPMs or OTDs should follow up face-to-face or telephone contacts with the
service provider with an e-mail detailing the substance of the discussions and save all e-mail traffic
with the service provider in order to avoid misunderstandings.
OPTEVFOR Fleet resource managers will provide Fleet scheduler contact information for
applicable platforms assigned to the TPM/OTD. TPMs or OTDs should establish contact with the
Fleet scheduler (or platform operations officer) as applicable and as soon as feasible. TPMs/OTDs
should be prepared to provide details about what is expected of the platform/crew during testing.
OTDs should notify OPTEVFOR resource managers if, during the course of coordination with the
platform scheduling agent, the testing is deemed not supportable.
TPMs/OTDs requesting submarine support for RDT&E must comply with the following
procedures:
Submit a copy of the COMOPTEVFOR signed test plan to the ISIC and SUBOPAUTH NLT
30 days prior to the event.
For complicated tests (e.g., operating above 200 feet, in a high-density, contact-management
environment, or shallow water environment), official briefings should be provided by the
TPM/OTD well in advance of the event for the ISIC, SUBOPAUTH,
COMSUBPAC/COMSUBFOR N3, and N32.
A presail brief must be held with the ISIC and platform crew prior to the underway event.
9.7.2 EMERGENT REQUIREMENTS
Emergent requirements occur when a need arises for Fleet support after the deadline
(approximately 9 months prior to the actual execution quarter) for a UTES submission has passed,
or services are required in addition to those which were in the original request. When the need
occurs, the TPM/OTD will coordinate with the OPTEVFOR resource manager to determine the
feasibility of the emergent services requested. If the feasibility check yields a negative response,
a decision will be made as to whether or not the TPM/OTD will draft and transmit the EFSR
message (OPTEVFOR warfare divisions in coordination with VX squadrons will determine
message originator). When the OPNAV N94 endorses the EFSR message, the OPTEVFOR
resource manager will enter the request into WEBSKED for resourcing. Once in WEBSKED, the
responsible OPTEVFOR resource manager will coordinate obtaining support services.
NOTE
Emergent requests or schedule change requests have potentially negative impact on
Fleet operations, maintenance, and training commitments.
TPMs/OTDs should make every effort to acquire Fleet support prior to the established
submission deadline.
The following conditions must be met prior to requesting emergent services:
The emergent service request must state why services were not requested during the
scheduling conference.
A draft or final test plan must be available so that services required can be clearly identified.
9.7.3 ASSET REQUESTS NOT SCHEDULED AT SCHEDULING CONFERENCES
Range and Operating Area (OPAREA) requests are normally coordinated directly between the
facility's scheduling authority and the TPM/OTD. Due to the demand for these facilities, the TPM
or OTD should coordinate with the range-scheduling agents well in advance.
9.7.4 FOURTH, FIFTH, SIXTH, OR SEVENTH FLEET SERVICES
Requests for Fifth, Sixth, or Seventh Fleet Area of Responsibility services should be submitted to
OPNAV N942 via message with information copies to the program sponsor, Fleet commander,
and commands involved. Once endorsed by OPNAV N942, OPTEVFOR Fleet resource managers
will coordinate with applicable Fleet commanders for RDT&E assignments.
9.8 MULTISERVICE REQUESTS
9.8.1 MOT&E SERVICES SUPPORT COORDINATION
Each other-than-Navy Service OTA will establish an internal POC for requests and coordination
when a single Service requires resources from other Services. The single-Service OTA conducting
a test will initiate the request and coordinate the use of required Joint assets, and will be responsible
for the scheduling and managing of those assets. The OTA POCs for test resources are listed
below:
ATEC
DCSOPS (703) 681-2936/6518
DSN: 761-2936/6518
AFOTEC
A-8P- Programming (505) 846-1785
DSN: 246-1785
OPTEVFOR
Test Fleet Resource Scheduling East Coast: (757) 457-6245
DSN: 757-456-6245
West Coast: (619) 553-4568
MCOTEA
S-4 (703) 784-3286
9.9 RELATED COMMUNICATIONS
9.9.1 NOTICE OF INTENT (NOI)
The primary purpose of a NOI is to reserve a submerged OPAREA and establish procedures that
will prevent mutual interference between submerged submarines, and between submarines and
other operations, such as surface ships using variable depth sonar or dropping explosive
ordnance. COMSUBFOR/Commander, Task Force 20.3 is Commander, U.S. Fleet Forces
SUBOPAUTH and is assigned the responsibility of coordinating and approving NOI requests.
CTF-20 Operations Order (OPORD) 2000, annex C provides the procedures for requesting an
NOI. If the test area, participating units, and timeframe are well defined, the NOI requests should
be sent to Commander, Task Force 20.3. If test operations are ill-defined or inherently flexible,
the responsibility for requesting the NOI rests with the primary participating unit.
9.9.2 COMMUNICATION PLANS
Communication plans are an integral component of any OPORD, Letter of Instruction (LOI), or
Pre-Exercise (PRE-EX) Message. An important step in formulation of these exercise directives is
the assignment of frequencies for short-term tactical and training evolutions. Guidance for
submitting frequency requests is contained in annex K of COMUSFLTFORCOM OPORD 2000
series.
9.10 TEST TARGETS
COMOPTEVFOR’s Test Design Division (01B) coordinates test targets for all Navy OT. The
Test Target Coordinator is Mr. Heath Richardson, Code 01B3, (757) 457-6351.
A test target is a surrogate asset used to replicate a particular threat or family of threats.
Available targets include seaborne targets such as the High Speed Maneuverable Target
(HSMST), Fast Attack Craft Target (FACT), and the Mobile Ship Target (MST). Aerial targets,
both subsonic and supersonic, are also available, along with a variety of UAVs. The inventory
includes an assortment of mine shapes, maneuvering undersea targets, and ground targets as well.
Test targets are no longer requested directly by COMOPTEVFOR. Rather, target allocation for
testing occurs through OTD/OTC coordination with the program offices, which in turn forward those
requests directly to the OPNAV N94 Targets office.
The Target Allocation and Requirements Tool (TART Tool) is a database maintained by the
OPNAV N94 Targets office and used to track all target allocations. Within
this tool on the NAVAIR portal, COMOPTEVFOR personnel can see upcoming test and training
events, including those phases of test that are submitted by the respective Program Offices within
NAVAIR and NAVSEA.
Warfare division personnel should request accounts and check where their events are in the
schedule (and whether targets have been allocated). This should be done at least once per quarter.
If there are any discrepancies, OPTEVFOR OTDs and other planners should consult with the
respective program office. If there remain questions about target availability or allocation, please
contact the COMOPTEVFOR Test Target Coordinator.
The TART Tool is located at the link below. Issues with obtaining TART Tool access or any
related training needs may be forwarded to Mr. Richardson.
https://myteam.navair.navy.mil/org/targets/SitePages/Home.aspx
SECTION 10 - MODELING AND SIMULATION
10.1 INTRODUCTION
Modeling and Simulation (M&S) plays an important and growing role in OT&E. M&S may be
used to provide data in support of OPTEVFOR’s assessments of operational effectiveness and
suitability, and cyber survivability. As a matter of law, M&S cannot be the only basis for IOT&E
or FOT&E evaluations. Thus, every OT informing fleet introduction decisions must include some
form of live testing. Before OPTEVFOR can use M&S data, the model or simulation must be
accredited, signifying its acceptability for the Specific Intended Use(s) (SIU) in the associated test.
10.1.1 Guidance
COTFINST 5000.1C provides general guidance, including the requirement to use statistical techniques to compare
live data to M&S data as part of the accreditation process.
Key Roles
The Commander is the Accrediting Authority (AA) for COMOPTEVFOR. The accreditation
decision is based on the Verification and Validation (V&V) Report provided by the M&S
proponent, which is generally the Program Manager.
10.2 THE VV&A PROCESS
The official DoD definitions for the three constituent processes are:
Verification: The process of determining that a model implementation and its associated
data accurately represent the developer’s conceptual description and specifications.
Validation: The process of determining the degree to which a model and its associated data
provide an accurate representation of the real world from the perspective of the intended uses
of the model.
Accreditation: The official certification that a model, simulation, or federation of models
and simulations and its associated data is acceptable for use for a specific purpose.
Figure 10-1 illustrates the VV&A Process, which starts with the IEF. The IEF formalizes the DRs,
test design, and M&S resources necessary to resolve the COIs. It also serves as the first formal
acknowledgement that M&S will be used during OT to supplement live test events, documenting
the SIUs and summarizing the expected path to accreditation. More information about the IEF can
be found in Chapter 4.
Figure 10-1. Verification, Validation, and Accreditation Process
UNCLASSIFIED
Document Requirements
COMOPTEVFOR develops the M&S Requirements Letter after completion of the IEF IPR-2,
summarizing the capabilities that M&S must possess to satisfy the OT SIU(s). This letter, signed
by the warfare division director, describes the SUT attributes and associated performance criteria
which have been identified in the IEF to be addressed with M&S.
10.2.1 Plan Accreditation
OPTEVFOR’s M&S Accreditation Plan documents the scope of the accreditation associated with
V&V efforts, the criteria and methodology to be used, and a configuration management plan. The
accreditation plan must be provided to the M&S proponent to support development of the V&V
plan. The Program Office is responsible for the V&V Plan, which incorporates the
overall methodologies dictated by the Accreditation Plan into an executable process. It also
defines the resources needed to perform the V&V, the V&V schedule, and identifies any issues
associated with performing the V&V. The accreditation plan is collaboratively built with inputs
from the T&E WIPT members. Once all outstanding issues, as documented in a CRM, are resolved,
the accreditation plan is approved by the Deputy Commander, OPTEVFOR.
10.2.2 Support Validation
The V&V Report, also developed by the Program Office, focuses on the results of the V&V
process and summarizes the analysis, assumptions, capabilities, and limitations of M&S. It also
identifies any unresolved issues associated with V&V implementation and documents any lessons
learned during V&V.
An Accreditation Memo is an optional document written by OPTEVFOR that states concurrence
to conduct RFR based on a review of the V&V Report. The Accreditation Memo content is usually
the same as the final Accreditation Letter minus the RFR analysis. The document is typically
requested by the M&S proponent when significant resources are required for RFR.
10.2.3 Complete RFR
The formal OT M&S runs, referred to as RFR, are defined in the IEF and considered to be the
minimum, adequate runs required to resolve the associated COIs. With all M&S data available,
OPTEVFOR may produce a V&V Addendum containing independent analysis, if not already
included in the V&V report.
10.2.4 Report Accreditation
The final step in the M&S VV&A process is for OPTEVFOR to issue the Accreditation Letter.
This document summarizes the findings and includes the final decision to either fully accredit,
accredit with limitations, or not accredit the M&S to support the OT SIU(s). The Accreditation
Letter must be approved prior to including M&S results in any OT report.
PROCESS EXCEPTIONS
The full VV&A process assumes a computerized model that simulates SUT performance. Not all
M&S for OT is this complex. Models that are not used to predict SUT performance but rather
represent a threat or threat environment to stimulate a model will still require an accreditation
letter, but they typically will not require an accreditation plan.
SECTION 11 - CYBER SURVIVABILITY TEST AND
EVALUATION
11.1 INTRODUCTION
This chapter introduces the basics of cyber survivability and its relationship to operational test and
evaluation. For more detailed information on cyber processes and products, see the Cyber
Survivability Test and Evaluation Handbook.
The purpose of the OPTEVFOR OT Cyber Survivability evaluation is to evaluate the system’s
capability to survive and operate after exposure to cyber threats, which attempt to prevent
completing operational mission(s) by destruction, corruption, denial, or exposure of data
transmitted, processed, and stored.
All systems assigned to COMOPTEVFOR for OT evaluation, for each phase of test, shall be
referred to 01D for a determination of whether cyber survivability testing is required. 01D and the warfare
division will work together to determine what level of cyber survivability OT&E must be
conducted in order to meet policy and stakeholder requirements.
The following documents are the most relevant from a TPM/OTD perspective for clarifying the
requirements and guidance for the conduct of cyber survivability test and evaluation.
11.1.1 COMOPTEVFOR Red Team Memo.
01D was directed by the Commander to establish and maintain a National Security Agency (NSA)
certified red team, as well as develop, maintain, and oversee cyber test planning processes and
templates, and coordinate with external organizations with regard to cyber testing. This memo
captures the overall responsibility of 01D as the cyber survivability test execution stakeholder and
will be referred to as the OPTEVFOR Red Team (CRT) memo.
11.1.2 DoD Cybersecurity T&E Guidebook v2.0, April 2018.
This guidebook promotes data-driven, mission-impact-based analysis and assessment methods for
cybersecurity T&E and supports assessment of cybersecurity, survivability, and resilience within
a mission context by encouraging planning for tighter integration with traditional system T&E.
11.1.3 DOT&E Memo, April 2018.
This memo directs Operational Test Agencies (OTA) to conduct a
Cybersecurity Cooperative Vulnerability and Penetration Assessment (CVPA) and an Adversarial
Assessment (AA) for all oversight acquisition programs. The OPTEVFOR cyber survivability
planning process aligns to the requirements of this memo to collect data and analyze a system’s
capability to prevent cyber-attacks, mitigate the effects of a cyber-attack to maintain a mission
capability, and recover lost mission capabilities to support follow-on mission requirements in a
tactically relevant timeframe. This construct is referred to as Prevent, Mitigate, and Recover
(PMR).
11.1.4 Defense Acquisition Guidebook.
The Defense Acquisition Guidebook (DAG) provides guidance on the process and procedures for
managing risks through planning and executing an effective and affordable test and evaluation
(T&E) program that enables the DoD to acquire systems that meet mission requirements.
OPTEVFOR employees can access the DAG at https://www.dau.edu/tools/dag.
11.2 TEST PLANNING
The roles and responsibilities with respect to cyber test plan development are defined in detail in
the Cyber Survivability Test and Evaluation Handbook. In general, 01D is responsible for
providing processes, templates, and guidance for overall test strategy and planning. 01D will liaise
with DOT&E to ensure test adequacy and address DOT&E concerns. Warfare divisions are
responsible for developing the cyber survivability test plan in accordance with the OPTEVFOR
cyber test planning process.
The pre-test planning steps occur as soon as the program is initiated in the division, notionally
12-18 months prior to test. The focus is to gather and evaluate the system documentation in order
to establish the program’s T&E strategy. Sometimes, pre-test planning may start when the
system’s IEF and/or TEMP is being finalized. In this case, 01D highly encourages early
engagement with the TPM/OTD, Cyber Test Engineer (CTE), and PMO personnel to ensure all
stakeholders understand the overall cyber survivability T&E strategy.
11.3 MODELING AND SIMULATION (M&S)
Due to scheduling, safety, Fleet asset availability, and other limiting factors, it is sometimes not
possible for a full scope of test to be conducted in the operational environment. Therefore, other
means to assess a system for cyber survivability may be explored through the use of non-
operational cyber test assets, such as M&S. Cyber M&S accreditation follows the same process
detailed in Chapter 10 above and COTFINST 5000.1C to ensure the program’s M&S requirements
letter and accreditation plan incorporate the necessary level of effort to verify and validate the
simulation.
Additional information can be found in the Cyber Survivability OT&E Handbook.
11.4 TEST EXECUTION AND POST TEST PROCESS
The OPTEVFOR Red Team is the primary test team supporting COMOPTEVFOR cyber OT&E,
with the requisite authorizations and qualifications to operate in an operational environment. Test team
augmentation shall be coordinated through 01D. 01D Test Strategy and Policy (TSP) and CRT
support the warfare divisions throughout the post-test process to develop final report products.
The cyber test execution and post-test process is described in detail in the Cyber Survivability Test
and Evaluation Handbook.
SECTION 12 - CONTRACT SUPPORT
12.1 INTRODUCTION
The workload of conducting OT&E may require augmentation by the contractor workforce.
COMOPTEVFOR has several contract vehicles at its disposal to assist in obtaining the necessary
contractor skill sets. The Contracting Officer Representative (COR) will assist you in choosing
the right vehicle for your contract requirement. Refer to the Contract Support Handbook for
guidance in preparing a contract package.
12.1.1 Key Terms/ Definitions
Contract - a mutually binding, legal relationship which obligates the seller to furnish the
supplies or services and the buyer to pay for them. It includes all types of commitments that
obligate the government to the expenditure of appropriated funds except as otherwise
authorized in writing. The OMNIBUS and GSA are examples of a “contract.”
Deliverable - a product of a contractor or other agency’s effort, partially or wholly fulfilling
the objectives of a contract, per the requirement documents or other tasking.
Dispute - a disagreement between the contractor and government regarding the rights of the
parties under a contract.
Firm-Fixed Price (FFP) Contract - a contract to pay a specified price when the supplies or
services called for by the contract have been delivered and accepted.
Incremental Funding - the obligation of funds to a contract in periodic installments as the
work progresses, rather than in a lump sum.
Modification - any formal revision of the terms of a contract.
Obligation - a monetary liability of the government limited in amount to the legal liability of
the government at the time of recording.
Option - a unilateral right in a contract by which, for a specified time, the government may
elect to purchase additional quantities of the supplies or services performed by the contractor,
thereby extending the period of performance of the contract.
Performance Work Statement (PWS) - a description of the required results in clear, specific, and objective terms, with measurable outcomes.
Quality Assurance Surveillance Plan (QASP) - a guide, which describes the contract
monitoring methods in detail. The QASP is usually written by the same team who develops
the work statement and is used in monitoring a contract.
Statement of Work (SOW) - a requirements document for services. It describes work or
services to be performed and may enumerate the methods to be used. It can apply to the
acquisition of services or development of hardware. The SOW is the contractual vehicle for
expressing exactly to what each party (the contractor and the government) is agreeing. Its
clarity has a direct effect on efficient contract administration since it defines the scope of
work.
Task Order (TO) Contract - a contract for services placed against an established contract
(i.e., OMNIBUS/GSA) or with government services.
Constructive Change - a situation in which the contractor performs work beyond that
required by the contract without a formal change order. It is perceived that the work
originated from a Government informal order or is due to Government fault. A Government
informal order can be defined as words or deeds excluding advice, comments, suggestions, or
opinions.
Contract Support Review Board - A meeting chaired by the supported division to decide what
type of contract support is required and ensure the level of expertise requested and scope of work
are consistent with Command objectives.
Contract Package - the set of documents, prepared by the division Section Head to initiate a
contract or task order. Included are the SOW, Independent Government Cost Estimate
(IGCE), TPM/OTD Form (if <$500k, TPM/OTD Form is an internal COMOPTEVFOR
document), approved TPM/OTD Form, DD254, and signed/accepted funding document.
12.2 ROLES AND RESPONSIBILITIES
12.2.1 Contracting Officer
Fleet Logistics Center (FLC) Norfolk has unlimited authority to approve all TOs exceeding $500k.
Only the Contracting Officer has the authority to change the terms and conditions of a contract or
to enter into a new contract agreement.
12.2.2 Ordering Officer
COMOPTEVFOR has limited written authority (Warrant) to make business decisions, restricted to TOs and actions under $500k for OMNIBUS and $150k for GSA orders. The Ordering Officer conducts all task order administration functions, monitors task order compliance, collects information, and provides recommendations to the Contracting Officer.
12.2.3 Command COR
The Command COR is an authorized representative of the Contracting Officer, designated by the
command and approved by the Contracting Officer. The COR is the liaison between the end user
(customer) and the Contracting Officer and Ordering Officer. The COR does not have the
authority to change terms and conditions of the contract or enter into a new contract agreement.
12.2.4 Ordering Officer’s Contract Specialist
Conducts all contract administration functions and is the liaison between the COR and the Ordering Officer. The contract specialist has no written authority to make business decisions, change the terms or conditions of the contract, or enter into a new contract agreement.
12.2.5 Technical Assistant (TA)
The requiring activity representative who may be assigned to provide technical/administrative
assistance to the Command COR. TAs may be assigned to assist and support the COR but do not
have the authority to provide technical direction or clarification directly to the contractor. Each
warfare division has a designated TA assigned.
12.2.6 Warfare Division Section Head
Identifies the need for contract support, executes the procedures for obtaining contract support as
described in the Contract Support handbook, and obtains the required funding to support TO
award. The Section Head will also draft the contract package with the support and assistance of
the divisional TA and Command COR.
12.3 GENERAL CONTRACT TASK ORDER INITIATION PROCEDURES
12.3.1 The process to initiate contract delivery orders/task orders should begin a minimum of 16
weeks prior to the desired start date of the period of performance. Refer to the Contract Support
Handbook. Execute the Contract Touch Points as described.
See table 12-1 for the contract package generation process. Also see the Contract Support
Handbook.
Table 12-1. Contract TO Package Generation Responsibilities
UNCLASSIFIED
Action: Using the Contract Support Handbook, formulate the SOW, IGCE, and additional required documentation (e.g., DD254 and the approved TPM/OTD Form, provided by Finance upon receipt of funding), using the templates located on the Y: drive for the contract vehicle selected by the Contract Support Review Board. If using the OMNIBUS: Y:\T&E\OTD Contracts\Contract Package Templates 2017. From the KMS home page, the OTD Contract button (right-hand column, near the bottom) opens the Contract Package Templates 2017 folder. Responsible for action: Section Head (SH).
Action: Review the contract package for accuracy and completeness. Responsible for action: Division TA.
Action: Review and recommend approval. Responsible for action: SH.
Action: Review and approve. Responsible for action: Division Deputy Director.
Action: If under $500k, submit to the COMOPTEVFOR Contracting Office for processing. Responsible for action: Contracting Officer (assigns a Contracting Specialist).
Action: If over $500k, submit to Fleet Logistics Center, Norfolk for processing. Responsible for action: COR.
12.3.2 Ensure Funds Availability
Section Head will work with the TA and Division Director to ensure funding is available. Per
regulatory requirements, if incremental funding is used, a minimum of 25 percent of the
Independent Government Cost Estimate (IGCE) (or at least 90 days of coverage for performance
periods lasting less than 1 year) must be provided along with a schedule of when remaining
increments will be provided. Funds must be available at OPTEVFOR no later than 1 week prior
to the submission of the contract package to the COR.
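As a rough illustration of that rule only, the following Python sketch computes a notional minimum initial increment. The straight-line proration used for the under-1-year case is an assumption for illustration, not official financial guidance; the Comptroller Division remains the authority for actual funding decisions.

def minimum_initial_increment(igce, pop_days):
    """Notional minimum initial funding increment for an incrementally funded task order."""
    if pop_days < 365:
        # Performance periods under 1 year: at least 90 days of coverage,
        # approximated here as a straight-line share of the IGCE.
        return igce * min(90 / pop_days, 1.0)
    # Otherwise: at least 25 percent of the IGCE up front.
    return 0.25 * igce

# Example: a $1.2M IGCE over a 1-year period of performance implies at least $300k up front.
print(minimum_initial_increment(1_200_000, 365))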
12.3.2.1 Funding
After funds are accepted by OPTEVFOR, the SH, OTC, or other designated representative
coordinates with the Comptroller Division and submits a COMOPTEVFOR Funding Request
Document to the Comptroller. See Y:\T&E\Financial Guidance. Once signed electronically by
the Comptroller, the TPM/OTD provides a hard copy to Supply. Supply then creates a requisition
in PR Builder. Once approved, Supply provides the requisition to Contracts/01K. If required,
schedule a Service Requirements Review Board (SRRB).
12.4 SERVICE REQUIREMENTS REVIEW BOARD (SRRB)
COMOPTEVFORINST 4208.1 implemented the SRRB process in October 2016. See the
instruction at Y:\T&E\OTD Contracts\SRRB Instruction.
An SRRB is required for any service contract using Headquarters funds or a service that
requires cyber test support. See the Contract Support Handbook for an SOP and checklist.
12.5 TECHNICAL EVALUATION BOARD (TEB)
12.5.1 General
Following receipt of contractor proposals by the Ordering Officer, a TEB will be conducted for
every TO before an award can be made by the Ordering Officer. In unique cases where the
Ordering Officer executes a sole source procurement, a TEB will not be required.
12.5.1.1 The TEB’s purpose is to evaluate each of the competing proposals and each offeror’s
ability to perform the prospective task.
12.5.1.2
A technical evaluation is conducted to determine the degree to which each proposal meets or fails
to meet the solicitation’s minimum performance requirements through assessment of the strengths,
weaknesses, and risks of a proposal. Technical evaluations will be conducted using rating methods
including color or adjectival ratings, numerical weights or technically acceptable/unacceptable as
dictated by the Ordering Officer via the COR.
Other elements such as past performance evaluation, cost/price evaluation and small
business/subcontracting evaluation are performed by the Ordering Officer.
12.5.1.3
Evaluation of Task Order proposals (e.g., OMNIBUS, GSA, Seaport, etc.) may use one of the
“Best Value” processes described below. SH should contact the COR for guidance. Successful
evaluation is dependent upon a well-planned task order solicitation that includes a clear and
detailed SOW. Award of a TO will be based on the factors contained in the solicitation. Source
selection method can be LPTA or Trade-Off.
Lowest Price/Technically Acceptable (LPTA). An LPTA is a source selection method in
which each technical proposal is evaluated on the offeror’s demonstrated understanding of the
SOW and how its approach will successfully accomplish the requirements of the SOW. Each
technical proposal will be rated either “Acceptable” or “Unacceptable.” Award will be made
by the PCO or Ordering Officer to the offeror with the lowest technically acceptable bid.
Trade-Off. A source selection method in which the Government will award the TO to the
responsible offeror whose offer conforms to the SOW and is the most advantageous to the
Government, price and other factors considered. The trade-off method establishes weighting
among the technical, past performance, and price factors. The weighting of factors is defined
in the solicitation.
Trade-Off Source Selection is more demanding because it is used to justify quantitative
ranking and, as such, typically requires more time to conduct than LPTA. In addition, trade-
off requires an in-depth, rational, and thorough technical evaluation of the offered proposals
thereby critically distinguishing the technical differences between proposals.
See the Contracting Handbook for detailed TEB processes and guidelines.
12.6 TASK ORDER AWARD
12.6.1
Services are furnished as ordered by a DD Form 1155 signed by the Contracting or Ordering Officer. The
SH shall retain electronic copies of all contract related documents received from 01K Contracts.
For program support contracts, the SH is urged to send a copy of all contract awards (to include
modifications) to the program office to assist with tracking funds expiration dates and to ensure
timely receipt of additional funds. A template e-mail is provided in section 12.11, which should
be populated with key information to identify the specific amount of additional funds needed by
the Contract Line Item Number (CLIN) # and date required.
TOs issued shall include, but not be limited to the following information:
Date of order
Contract and order number
Appropriation and accounting data
Item number and description of the services to be performed, period of performance,
quantity, and unit price
DD Form 254 (contract Security Classification Spec), if applicable
DD Form 1423 (Contract Data Requirements List), if applicable
Exact place of performance
The inspecting and accepting codes (as applicable)
The firm fixed price (award value)
List of Government Furnished Property and the estimated value thereof, if applicable
Any other pertinent information.
12.7 TASK ORDER MODIFICATIONS
A Modification or Bilateral Modification (supplemental agreement) is a contract modification that
is signed by the contractor and the Ordering Officer. Bilateral modifications are used to make
negotiated, equitable adjustments resulting from a necessary change in the scope of work. If you
feel a modification is necessary, contact your Division TA.
12.8 INVOICE CONCURRENCE
The Command COR is copied on all invoices; timely verification by the SH of travel expenditures
is critical. The SH shall review contractor monthly reports to confirm expenditures and be
proactive in not exceeding the authorized travel budget. For every TO that has been awarded, an
invoice will be submitted to COMOPTEVFOR via Wide Area Workflow (WAWF) and received by our
Supply Department. The invoice will be sent to the respective SH, Division TA and B Code by
the Supply Department for review and concurrence/non-concurrence for payment.
Example:
DEPARTMENT OF THE NAVY
COMMANDER OPERATIONAL TEST AND EVALUATION FORCE
7970 DIVEN STREET
NORFOLK, VIRGINIA 23505-1498
Good Morning LCDR Jones.
Please review the attached invoices for accuracy of Labor and Travel/ODC, and respond with
your concurrence so it may be certified for payment. These are in reference to (Program Name)
Invoice received date: 14 JAN 2020.
A reply is required from the respective TPM/OTD/SH/OTC or division representative within 3
(three) working days. Timely replies are required to meet Prompt Payment Certification
requirements and to ensure Contractors are notified of any invoice problems within three (3) days
of receipt.
Approval recommendations imply that the nature, quantity and type of effort being expended by
the Contractor are per the contract.
Very Respectfully,
LS2 Jane Doe
Acceptor/Purchasing Agent
COMOPTEVFOR Supply Department
12.8.1
The following is a list of responsibilities every SH must keep in mind when conducting a review
of the invoice:
The SH makes a timely response back to the Supply Department. This will help ensure that
no additional costs (interest) are incurred by the government due to late responses.
Ensure you have (at a minimum) the previous monthly report on hand to augment your
invoice review.
The goods have been received or the services have been performed and are per the contract,
purchase order, or agreement.
The prices, subtotals and totals are accurate.
The invoice includes the contract, purchase order, or agreement number and is per the terms
of the contract, purchase order, or agreement.
The invoice is not a duplicate or has not been paid previously.
If you have any questions or concerns with the invoice, immediately contact the COR for
corrective action before any other action is taken.
12.9 ASSESSING CONTRACTOR PERFORMANCE
During task order execution, the SH should ensure the contractor is providing the goods or services
per the stated requirement as identified in the SOW. If, during performance of the task, inadequate
progress is being made, communicate immediately with the Command COR via divisional
Assistant Chief of Staff (ACOS). Be prepared to discuss objective evaluation of the contractor’s
performance and any e-mails documenting communication pertinent to the issue. If necessary, a
DD 2772 Contract Discrepancy Report may be prepared and submitted to the contractor,
documenting the process of constructive performance improvement. The Contracting Officer will
require this documentation should a need to issue a “Notice of Concern” or “Show Cause” be
required. The form may be found at Y:\T&E\OTDContracts\CPAR.
If a positive Contract Performance Assessment Report (CPAR) has been submitted recently, the
Contracting Officer will need significant documentation to support a decision to “Terminate for
Default.”
The Command COR is required to execute a CPAR annually on each Contract Company
supporting each task order at COMOPTEVFOR. The SH has the responsibility to provide input
to the CPARS process. The SH’s input should be submitted to the Command COR using the
format provided on the Y-drive in the following location: Y:\T&E\OTD Contracts\CPAR. This
should be submitted in an e-mail along with the attached WORD Form/Document.
12.9.1 Sub-Par Contractor Performance
NOTE
At no time will anyone other than the Command COR contact the Contractor to make
a report of contractor sub-par performance.
If you are experiencing sub-par performance from a contractor who is supporting your program,
follow these guidelines:
What To Do:
a. Ensure your TA is aware of your situation as he will be able to assist you in compiling all of
the facts surrounding the sub-par performance, to include names of contractor(s) and
government/military personnel involved, and details pertaining to the contractor’s performance.
Be sure to address whether performance complies with the SOW.
b. With the TA, see the Command COR. Be ready to discuss the facts.
12.9.1.1 Do Not:
a. Reprimand, belittle, or conduct a performance evaluation of the contractor.
b. Use contractor(s) for performance of inherently governmental functions.
c. Create or support a work environment that is hostile or unprofessional.
d. Authorize time off, sign time cards, or dictate work hours for contractors.
A complete list of inherently governmental functions is found in the Federal Acquisition Regulation
(FAR), Part 7, Subpart 7.5, Inherently Governmental Functions. The Command COR has a copy
of this document.
12.10 TASK ORDER CHECKLIST
Review the Task Order Checklist in the Contract Support Handbook (CTP 1 through CTP 7) to
verify you have completed all the necessary steps.
12.11 TEMPLATE E-MAIL (WHEN DISTRIBUTING TASK ORDERS AND
MODIFICATIONS) TO PM BUDGET OFFICE:
[SHs: Send an e-mail containing the following information when additional funds are needed to
fully fund the current Period of Performance (PoP) under a TO:]
From: Section Head (Insert name here)
To: PM Budget Office (Insert names of PM Budget Office Personnel here)
Enclosed is a copy of TO #____ (OTD insert the 4 digit task order number found in Block #2 of
DD Form 1155 or Block #4 of SF 1449) which was recently awarded under contract #
_____________ (SH insert the 13 Alpha Numeric Contract # found in Block #1 of DD Form 1155
or Block #2 of SF 1449) providing contractor OT&E support services to the
___________________ [SH insert the full Program Name as well as the short title Here.]
program TEIN #_______ [SH insert TEIN here.]. This TO is currently incrementally funded
in the amount of $____________ [SH insert the total “FUNDED” amount shown in Section B
(usually the 2nd page) of the task order.]. An additional $_________ [SH calculate & insert the
difference between the VALUE of the CLINs to be performed during the current PoP and the total
“FUNDED” amount of those CLINs.] still needs to be provided in order to fully fund the
current period of performance which runs from ________ [SH insert current PoP STARTING
date DD MONTH YY] until _______. [SH insert current PoP ENDING date DD MONTH YY].
Please keep in mind that Defense Federal Acquisition Regulation Supplement (DFARS) 232.703-1(2)
requires “an incrementally funded fixed price contract shall be fully funded as soon as funds
are available.” In order to prevent interruption of your critical program mission support
(and to comply with acquisition regulations), please transmit the required additional funds
as soon as they are available. Please ensure your funding document identifies the funds are
for the following CLINs under the TO and Contract identified above:
CLIN: __*__# CLIN Description _____*____ $___*______ (amount needed to fully Fund)
CLIN: __*__# CLIN Description ____*_____$___*______ (amount needed to fully Fund)
CLIN: __*__# CLIN Description _____*___ $___*______ (amount needed to fully Fund)
[* OTD insert 4 digit CLIN #, Description and the remaining amount needed based on TO].
When transmitting additional funds, please e-mail a courtesy copy of the funding document
to the Contracting Officer (insert name & e-mail), Contracting Officer’s Representative (Tim
Burrows timothy.burrows@cotf.navy.mil), SH, TA and COMOPTEVFOR Funds e-mail
Thank You (SH insert Name and Contact info)
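For illustration only, the Python sketch below shows the arithmetic behind the blanks in the template above; the CLIN numbers and dollar amounts are hypothetical placeholders, not values from any actual task order.

# Hypothetical CLIN values and funded-to-date amounts for the current period of performance.
clins = {
    "0001": {"value": 850_000, "funded": 600_000},
    "0002": {"value": 150_000, "funded": 150_000},
}

# Additional funds needed per CLIN: CLIN value minus the amount funded to date.
shortfall = {clin: max(c["value"] - c["funded"], 0) for clin, c in clins.items()}

print(shortfall)                # per-CLIN amounts needed to fully fund: {'0001': 250000, '0002': 0}
print(sum(shortfall.values()))  # total additional funds required: 250000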
SECTION 13 - FINANCE
13.1 INTRODUCTION
This chapter focuses on financial resources available to the SH or OTC and includes high-level
fiscal guidance. For more detailed guidance, see the Financial Handbook.
13.2 FISCAL GUIDANCE AND PROCUREMENT INTEGRITY
13.2.1
OPTEVFOR personnel involved with managing appropriated funds shall, at all times, act as good
stewards of fiscal resources provided for executing the command’s mission. The policy will be to
establish and maintain a solid and unquestionable reputation for fiscal responsibility, such that
COMOPTEVFOR becomes synonymous with the ideals of fiscal integrity, frugality, and value.
13.2.2
OPTEVFOR leadership and management personnel, particularly those directly involved with
funds management and/or execution, will, in their appropriated funds dealings, always act
conservatively, consistently, and unquestionably in the best interests of the command and the
Navy, and, just as importantly, in the best interest of the American taxpayers. To be effective,
funds administrators and managers should have a fundamental understanding and appreciation for
basic financial principles and an understanding of the regulations and policies that must be
followed. This is an area where it is essential that the SH/OTC ask the experts before acting.
13.2.3
OPTEVFOR staff frequently interface with contractor personnel, internally and externally. All
staff must be familiar with the basic tenets of procurement integrity:
As representatives of the U.S. Government, OPTEVFOR staff must ensure not only full legal
compliance but also that there is not even a perception of impropriety in dealing with individuals
and organizations doing business with the Government. Actions that call into question an
individual’s integrity or propriety in financial or contractual matters can have far-reaching
consequences for the DoN.
13.3 FUNDING SOURCES AND REGULATIONS
13.3.1
The purpose of this section is not to make the SH/OTC a financial expert, but rather to provide a
basic understanding of the key laws and regulations that must be observed and to help facilitate a
clear dialogue between the SH/OTC and the Comptroller staff.
13.3.1.1 Sources of Funds
OPTEVFOR is financially supported by a variety of different funding sources:
13.3.1.1.1 Direct Operating Funds
OPTEVFOR is a "mission funded" activity (i.e., resourced to perform its mission directly through
the annual Congressional appropriations process), where funds are appropriated by Congress
directly to support the core COMOPTEVFOR mission. OPTEVFOR’s annual operating budget, often referred to as HQ Funds, is provided solely from within the Research, Development, Test, and Evaluation - Navy (RDT&E-N) appropriation. RDT&E-N funds are legally available for obligation for 2 years; the appropriation and the funds therein are said to have a 2-year life (e.g., funds appropriated in FY 2021 remain available for new obligations through the end of FY 2022).
13.3.1.1.2 PM Funds (Program/reimbursable funds)
In addition to direct annual operating funds, OPTEVFOR receives and is responsible for the proper
execution of funds from various projects and PMs. These funds are not to provide for core
OPTEVFOR annual operating requirements, but rather for specific T&E requirements unique to
programs, systems, and projects for which the funds are provided. Funding is provided for T&E
support to include range support, analytic support, test weapons, targets, program-specific travel,
Cyber Security testing, etc. These funds support program-specific T&E requirements for which
the command is not supported and/or funded directly. It is inappropriate to use program funds
(reimbursable or direct citation) for acquiring goods and/or services that are considered a core part
of the command’s mission (e.g., general headquarters administration). Use of reimbursable funds
for such purposes is considered an illegal augmentation of an appropriation and a violation of 31
USC, Section 1517. (It is sometimes referred to as the “Anti-Deficiency Act,” discussed later;
basically, it directs activities to not exceed their annual funds operating authority.)
13.3.1.1.3 Uses of Funds and “Color of Money”
The “color of money” is an expression referring to the appropriation from which the money
originates. The color is important in that there are laws and regulations that dictate what different
appropriations can and cannot be used for. There are a number of appropriations supporting the
Navy’s various missions and functions, including, but not limited to:
Operations and Maintenance, Navy (O&MN) (1804)
Aircraft Procurement, Navy (APN) (1506)
Ship Construction, Navy (SCN) (1611)
Weapons Procurement, Navy (WPN) (1507)
Other Procurement, Navy (OPN) (1810)
Procurement of Ammunition, Navy and Marine Corps (PANMC) (1508)
RDT&E-N (1319)
Military Construction, Navy (MCN, often referred to as MILCON) (1205)
The purposes for which each appropriation may be used are defined by statute and regulation. Inappropriate
use of an appropriation (even though the actual expenditure may be appropriate or legal)
constitutes a violation of Title 31 USC, Section 1301 (sometimes referred to as the “color of
money” statute).
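For illustration only (and not a substitute for Comptroller review), the short sketch below shows how the appropriation titles and codes listed above could be held in a simple lookup and used as a first-pass "color of money" check before applying funds to a purpose. The function and the example purpose categories are hypothetical; any real determination of appropriateness rests with the Comptroller under the statutes cited in this section.

# Illustrative sketch only. Appropriation titles and codes are taken from the list
# above; the purpose categories and the check itself are hypothetical examples.
APPROPRIATIONS = {
    "1804": "Operations and Maintenance, Navy (O&MN)",
    "1506": "Aircraft Procurement, Navy (APN)",
    "1611": "Ship Construction, Navy (SCN)",
    "1507": "Weapons Procurement, Navy (WPN)",
    "1810": "Other Procurement, Navy (OPN)",
    "1508": "Procurement of Ammunition, Navy and Marine Corps (PANMC)",
    "1319": "Research, Development, Test, and Evaluation, Navy (RDT&E-N)",
    "1205": "Military Construction, Navy (MCN)",
}

# Hypothetical mapping of broad purposes to the appropriations that would normally
# fund them; a real decision always rests with the Comptroller.
NORMAL_FUNDING_SOURCE = {
    "fleet day-to-day operations": {"1804"},
    "OT&E range and analytic support": {"1319"},
    "aircraft procurement": {"1506"},
}

def color_of_money_ok(purpose: str, appropriation_code: str) -> bool:
    """Return True if the purpose is normally funded from the given appropriation."""
    return appropriation_code in NORMAL_FUNDING_SOURCE.get(purpose, set())

# Example: RDT&E-N (1319) is the normal source for OT&E support costs.
print(color_of_money_ok("OT&E range and analytic support", "1319"))  # True
print(color_of_money_ok("fleet day-to-day operations", "1319"))      # False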
13.3.1.1.4 Commonly Used Appropriations
The following descriptions are provided with reference to the appropriations most commonly used
by OPTEVFOR in the area of reimbursable program funds:
13.3.1.1.4.1 O&MN-1804
Finances the basic day-to-day operations of the Fleet and most principal shore activities. O&MN
supplies funds for annual operating expenses for other activities and Fleet commands, such as
supplies, utilities, civilian manpower, travel, administrative support, fuel, repair parts,
Operating/Operational Target (OPTAR), transportation leasing arrangements, maintenance of
property, etc.
13.3.1.1.4.2 RDT&E-N-1319
Finances the expenses necessary for basic and applied scientific RDT&E, including maintenance,
rehabilitation, lease, and operation of facilities as authorized by law. In the case of OPTEVFOR,
RDT&E-N funds our annual operating expenses.
13.3.1.1.4.3 APN-1506
Finances the procurement of Navy and Marine Corps aircraft and provides for related supporting
programs. Supporting programs include equipment for modification of in-service aircraft, aircraft
spare parts, ground support and training equipment, and industrial facilities and tools.
13.3.1.1.4.4 SCN-1611
Primarily funds ship construction, but also the conversion of existing ships (e.g., the SSN to SSGN
conversion program), including all hull, mechanical, and electrical equipment; electronics; guns;
torpedo and missile launching systems; and communications systems.
13.3.1.1.4.5 WPN-1507
Finances the procurement of missiles, torpedoes, guns, and ancillary weapons-related supporting
equipment for Navy forces and Marine air forces. Supporting equipment includes equipment for
modification of in-service missiles, torpedoes, guns, and gun mounts; targets used in weapons
training exercises and weapons evaluation; hardware for navigation and communications satellites
and other space programs; spare parts; ground support and training equipment; and industrial
facilities and tools required for the production and maintenance of missiles.
13.3.1.1.4.6 OPN-1810
Finances the procurement, production, and modernization of equipment not otherwise provided
for. Such equipment ranges from the latest electronic sensors required to update the naval forces
to trucks, training equipment, and spare parts. This equipment is an integral part of programs to
improve the Fleet and shore establishment by expanding or maintaining existing capabilities or
replacing ineffective units.
13.3.2 Statutory Implications
There are several fundamental laws that serve as the underpinning for much of the “how and why”
funds are administered the way they are. The laws are frequently referred to in the aggregate as
the "Anti-Deficiency Act."
13.3.2.1 Applicable Statutes
The following statutes apply to financial management matters at OPTEVFOR.
Title 31 USC, Section 1301. Commonly referred to as the “color of money” or “purpose”
statute, it states that funds may only be obligated and expended for the purposes authorized
by the Congress in specific appropriations acts or other laws. It is a primary control that the
Congress exercises over the executive branch.
Title 31 USC, Section 1341. States that an officer or employee of the United States may not
authorize an obligation exceeding the amount available in an appropriation or make any
obligation before the appropriation becomes effective in law.
Title 31 USC, Section 1517. States that an officer or employee of the United States may not
authorize an obligation in excess of an apportionment. An apportionment is a subdivision of
a congressional appropriation that carries with it legal responsibilities.
13.3.2.2 Penalties
Penalties for violation of these statutes include suspension from duty without pay and/or
removal from office and/or restitution of funds to the Treasury by the responsible or
accountable individual. If the violation is deemed “knowing and willful,” the penalty can
include fines of up to $5,000 and/or up to 2 years in jail. Violations are reported up the
DoN/DoD/OMB administrative chain to the executive branch. The law mandates that
violations be reported to the President, then to Congress.
13.3.2.3 Misappropriation of Funds
Funding received from any source may not be used for a purpose not specifically provided for in
the law. Reimbursable funding also requires authorization from the issuing authority as to how
the funds are intended to be used. Where doubt exists, an SH/OTC should check with the
OPTEVFOR Comptroller for a determination as to whether a planned use of funds is appropriate.
13.4 AMPLIFYING GUIDANCE ON USE OF PROGRAM FUNDS
13.4.1
OPTEVFOR personnel will not rely exclusively on PM approval for use of program funds – once
a funding document is accepted by OPTEVFOR, sole fiduciary responsibility for the proper use
of the funds resides with COMOPTEVFOR and the Comptroller. This command, not the program
office, becomes thereafter solely responsible and accountable for any misdeeds (perceived or real),
regardless of whatever authorization or enabling support may have been provided by program
offices or other outside activities.
13.4.2 General Financial Guidance
Funding for all CNO project support is the responsibility of the PM (often referred to as the
program office). Each OTC/SH responsible for a CNO project requiring the technical expertise of
Contracted Service (CS) is responsible for working with the Comptroller staff to coordinate the
transfer of funds from the PM to the OPTEVFOR Comptroller. During TEMP revisions or
updates, a review of the Part IV Resource Summary is essential for updating funding requirements
to support any analytical contracts, range time, or Temporary Additional Duty (TAD) travel
needed in the course of the project’s active life. The movement of resources by a PM can often
take weeks or months, so early identification of funding issues within a program by the SH/OTC
is essential.
13.4.2.1
In interpreting federal appropriations law, the Supreme Court has stated that an established
fundamental rule is that “The expenditure of public funds is proper only when specifically
authorized, not that public funds may be expended unless prohibited." This axiom is important
where federal monies are concerned, since it refutes the popular and common misconception that
“if the rules don’t say I can’t, then I can.”
13.4.2.2
In addition to various Supreme Court rulings, decisions of the United States Comptroller General
have repeatedly demonstrated that where taxpayer funds are involved, traditional concepts like
“show me where it says I can’t” and “it’s easier to get forgiveness than permission” are not
applicable. Expenditures of federal funds are appropriate only when the laws/regulations/policies
are supportive. A corollary to this precept is that where federal law or departmental
regulations/policy is silent on an issue, expenditures related to that issue are not authorized.
13.4.2.3
If there is the slightest doubt, consult the OPTEVFOR financial staff for guidance before
expending funds or returning funds to the PM.
13.5 SPECIFIC GUIDANCE REGARDING PROGRAM FUNDS
While exceptions may arise that will be adjudicated by the Comptroller’s office, the following
“rules of the road” apply with respect to use of program funding. In general, the following uses
of funding received from PMs are acceptable (assuming the “color of money” stipulations
discussed further below are met):
13.5.1 Analytic Support Services
Includes contractor support services unique to the program from which the funds are provided.
Such services or support will use program funding when the services or support is not otherwise
available from the staff. Contractual support is funded via direct citation funding by the program
office. An SH/OTC must exercise care in establishing an appropriate professional and personal
relationship with support contractor personnel. The contractual support provided by a contractor
must never result in or give the outward appearance of a “personal services” contract. As stated
in FAR 37.104 (series), a personal services contract is one that, by its terms or as administered,
makes the contractor employees appear to be, in effect, government employees.
13.5.2 Flight Hour Support
Reimbursable funds from the program supported will pay for required flight hours in support of
program T&E.
13.5.3 Range Services
Range services in support of T&E will be funded using reimbursable funds from the program being
tested.
13.5.4 IOT&E Travel
IOT&E travel for programs of record will be funded using reimbursable program funds.
13.5.5 Program-Unique Equipment, Supplies, or Consumables
Equipment purchased with program reimbursable funding must be unique equipment used
exclusively in support of the specific program providing the funds. (The same direction applies to
program-unique supplies and consumables.) The policy at COMOPTEVFOR will be that
procurement of equipment or consumables using reimbursable funds will be the exception to the
rule, and such purchases will receive greater scrutiny during the requisition approval process and
require Comptroller office approval prior to ordering.
13.6 INAPPROPRIATE USES OF PROGRAM FUNDING
In general, the following are inappropriate uses of reimbursable funds. Appropriate alternative
sources of funds are as indicated.
13.6.1 Information Technology Equipment
Unless unique to a specific project, information technology equipment (computers, monitors,
laptops, personal digital assistants, etc.) will not be purchased using program reimbursable funds.
13.6.2 Mobile Phones/Other Personal Communications Equipment
Cell/mobile phones, Blackberries, and other PDAs will not be purchased using program
reimbursable funds.
13.6.3 Office Supplies
Unless unique to a specific program or project, office supplies will not be procured using program
reimbursable funds.
13.6.4 Personal Items
Personal items, other than those addressed herein, normally will not be purchased using
reimbursable program funds. In most instances, the general rule is that purchase of personal items
using federal funds is forbidden. Disallowed personal items include apparel, uniform items,
sunglasses, sunscreen, food items of any description, food preparation items of any description,
entertainment items (other than such items received as part of the command awards system), etc.
13.6.5 Full-Time Civilian Hires
COMOPTEVFOR will not fill permanent civilian positions using reimbursable funding.
Reimbursable program funds may be used to support manpower requirements using contractor or
working capital fund manpower (these personnel may work full time at the headquarters during
their term of service; however, they are not permanent OPTEVFOR employees).
13.6.6 “Color of Money” Concerns
COMOPTEVFOR’s policy will be to ensure that PM funds in support of T&E efforts are used in
a fiscally responsible manner. While there may be exceptions to the rules above relative to use of
reimbursable (PM) funds, it is expected that exceptions/waivers to the guidance herein will be rare.
In questionable circumstances where disagreement exists regarding interpretation and
implementation of this policy regarding appropriate use of reimbursable funds, the OPTEVFOR
Comptroller is charged with making a final determination as to the appropriate course of action,
guided by the precepts herein if guidance is not otherwise specified in higher-level
guidance/documentation. To the extent that a Comptroller decision is questioned, an appeal can
be made to the Commander via the Deputy, but the Comptroller decision will stand, pending
follow-on arbitration.
13.7 PROGRAM FUNDING DOCUMENTS
All program funding documents are to be forwarded to the OPTEVFOR Comptroller division for
processing by the assigned warfare division analyst.
13.8 ADDITIONAL FISCAL GUIDANCE/SUPPORT AVAILABLE
Should questions or issues relative to the use of funds arise for which the SH/OTC is unable to
ascertain the correct approach and that are beyond the scope of the OT&E Manual, the SH/OTC
should contact the Comptroller and/or Deputy Comptroller directly for specific assistance. The
Comptroller/Deputy have access to Fiscal Policy and Fiscal Law offices on the staff of the
SECNAV that can be queried to ensure the command safeguards funds and uses funding in a legal
manner, within the bounds of law and policy. When in doubt, an SH/OTC should contact the
Comptroller’s office for issue resolution; early notification works best since legal/policy issues
may require outside adjudication. See the Financial Handbook for more information.
APPENDIX A - ACRONYMS AND ABBREVIATIONS
6PP
Six-Part Paragraph
AA
Adversarial Assessment or Accelerated Acquisition
AAP
Abbreviated Acquisition Program
ACAT
Acquisition Category
ACOS
Assistant Chief of Staff
ACOTD
Assistant Chief Operational Test Director
ACTD
Advanced Concept Technology Demonstration
AEC
Army Evaluation Command
AFB
Air Force Base
AFOTEC
Air Force Operational Test and Evaluation Command
AFTTP
Air Force Tactics, Techniques, and Procedures
ALSP
Acquisition Logistic Support Plan
AMW
Amphibious Warfare
Ao
Operational Availability
AO
Action Officer or Authorizing Officer
AoA
Analysis of Alternatives
AOC
Assessment of Operational Capability
AOR
Area of Responsibility
AOTD
Assistant Operational Test Director
APB
Acquisition Program Baseline
APN
Aircraft Procurement, Navy
ASD(NII)
Assistant Secretary of Defense for Networks and Information
Integration
ASN(RDA)
Assistant Secretary of the Navy (Research, Development, and
Acquisition)
ASW
Antisubmarine Warfare
ATO
Authority to Operate
AUTEC
Atlantic Undersea Test and Evaluation Center
AW
Air Warfare
AWG
Analysis Working Group
B&G
Blue and Gold Sheet
BIT
Built-in Test
BMD
Ballistic Missile Defense
C3
Command, Control, and Communications
C&A
Certification and Accreditation
CAAL
COMOPTEVFOR Acronyms and Abbreviations List
CAAS
Contractor Assistance and Advisory Service
CAE
Component Acquisition Executive
CASCOR
Casualty Correction Report
CAP
Cybersecurity Assessment Program
CASREP
Casualty Report
CBR
Chemical, Biological, and Radiological
CBTE
Capabilities Based Test and Evaluation
CD
Capabilities Document
CDD
Capability Development Document
CDR
Critical Design Review
CIO
Chief Information Officer
CEWG
COI Evaluation Working Group
CL
Confidentiality Level
CLIN
Contract Line Item Number
CNO
Chief of Naval Operations
CO
Commanding Officer
COI
Critical Operational Issue
COMOPTEVFOR
Commander, Operational Test and Evaluation Force
COMSUBLANT
Commander, Submarine Force Atlantic
COMSUBFOR
Commander, Submarine Force
COMSUBPAC
Commander, Submarine Force Pacific
CON
Construction
CONEMP
Concept of Employment
CONOPS
Concept of Operations
COR
Contracting Officer Representative
COS
Chief of Staff
COT
Concept of Test
COTD
Chief Operational Test Director
CPAR
Contract Performance Assessment Report
CPD
Capabilities Production Document
CRT
COMOPTEVFOR Red Team
CS
Contracted Service / Cybersecurity / Cyber Survivability
CSA
Cyber Survivability Attribute
CT
Contractor Test(ing)
CTE
Cyber Test Engineer
CTEMP
Capstone Test and Evaluation Master Plan
CTF
Core Team Facilitator
CTF
Commander Task Force
CTP
Comparative Test Program/ Critical Technical Parameter
CVPA
Cooperative Vulnerability Penetration Assessment
DA
Developing Agency
DACOS
Deputy Assistant Chief of Staff
DACM
Defense Acquisition Career Manager
DAG
Defense Acquisition Guidebook
DCP
Data Collection Plan
DED
Demonstration Execution Document
DIACAP
DoD Information Assurance Certification and Accreditation Program
(Replaced in 2015 by the Risk Management Framework)
DIRLAUTH
Direct Liaison Authorization
DMOT
Detailed Method of Test
DoN
Department of the Navy
DoD
Department of Defense
DOE
Design of Experiment
DOT&E
Director, Operational Test and Evaluation
DOTMLPF-P
Doctrine, Organization, Training, Materiel, Leadership and Education,
Personnel, Facilities, and Policy
DR
Decision Review or Data Requirement
DRPM
Direct Reporting Program Manager
DRTM
Vignette-to-Data Requirements-to Test Method
DT
Developmental Test(ing)
DT&E
Developmental Test and Evaluation
DTS
Defense Travel System
DWG
Design Working Group
EA
Evolutionary Acquisition
ECP
Engineering Change Proposal
EFSR
Emergent FSR
E-IPR
Executive IPR
eKM
Enterprise Knowledge Management
EMCON
Emission Control
EMD
Engineering and Manufacturing Development
EOA
Early Operational Assessment
E-SERB
Executive System Evaluation Review Board
EU
Extended Use
EW
Electronic Warfare
EXW
Expeditionary Warfare
EXWDC
Expeditionary Warfare Development Center
FAR
Federal Acquisition Regulation
FFP
Firm-Fixed Price
FHN
Family Housing, Navy
FHP
Force Health Protection
FLC
Fleet Logistics Center
FMC
Full Mission Capable
FMR
Financial Management Regulations
FoS
Family of Systems
FOT&E
Follow-on Operational Test and Evaluation
FPIN
Financial Policy and Information Notice
FRP
Full Rate Production
FRPDR
Full Rate Production Decision Review
FSA
Field Support Activity
FSO
Fleet Support Operations
FSR
Fleet Service Request
FXP
Fleet Exercise Publication
FWE
Foreign Weapons Evaluation
FY
Fiscal Year
GAO
Government Accountability Office
GPS
Global Positioning System
GSA
General Services Administration
HITL
Hardware-in-the-Loop
HMX
Marine Helicopter Squadron
HQ
Headquarters
I&I
Integration and Interoperability
IAP
Integrated Assessment Plan
ICD
Initial Capabilities Document
ICTB
Initial Capability Technical Baseline
ID
Identification
IEF
Integrated Evaluation Framework
IGCE
Independent Government Cost Estimate
ILSP
Integrated Logistic Support Plan
INSURV
Board of Inspection and Survey
INT
Intelligence Operations
IO
Information Operations
IOC
Initial Operational Capability
IOT&E
Initial Operational Test and Evaluation
IPT
Integrated Product Team
IPR
In-Process Review
ISIC
Immediate Superior in Command
ISTF
Installed System Test Facility
IT
Integrated Test(ing)
ITT
Integrated Test Team
IW
Irregular Warfare
JCD
Joint Capabilities Document
JCIDS
Joint Capabilities Integration and Development System
JCTD
Joint Capabilities Technology Demonstration
JEON
Joint Emergent Operational Need
JITC
Joint Interoperability Test Command
JT
Joint Test
JT&E
Joint Test and Evaluation
JROC
Joint Requirements Oversight Council
JUONS
Joint Urgent Operational Need Statement
KMS
Knowledge Management System
KPP
Key Performance Parameter
KSA
Key System Attribute
LAN
Local Area Network
LBTS
Land-Based Test Site
LCSP
Life Cycle Support Plan
LFT
Live-Fire Testing
LFT&E
Live-Fire Test and Evaluation
LMUA
Limited Military Utility Assessment
LOG
Logistics
LOI
Letter of Instruction
LOO
Letter of Observation
LPTA
Lowest Price Technically Acceptable
LRIP
Low Rate Initial Production
LTD
Level of Test Determination
LTE
Lead Test Engineer
MAA
Maritime Accelerated Acquisition
MACO
Maritime Accelerated Capability Office
M-DEMO
Maintenance Demonstration
MAC
Mission Assurance Category
MAIS
Major Automated Information System
MBTD
Mission-Based Test Design
MCMA
Mission Capability by Primary Mission Area
MCMTOMF
Mean Corrective Maintenance Time for Operational Mission Failures
MCN
Military Construction, Navy
MCOTEA
Marine Corps Operational Test and Evaluation Activity
MDA
Milestone Decision Authority
MDAP
Major Defense Acquisition Program
MESM
Mission Essential Subsystem Matrix
META
Mission Effect Test and Analysis
METL
Mission Essential Task List
MF
Measurement Facility
MFHBOMF
Mean Flight Hours Between Operational Mission Failures
MIW
Mine Warfare
MNS
Mission Need Statement
MOA
Memorandum of Agreement
MOB
Mobility
MOE
Measure of Effectiveness
MOP
Measure of Performance
MOS
Measure of Suitability
MOS
Missions of State
MOT&E
Multiservice Operational Test and Evaluation
MPN
Military Personnel, Navy
MR
Maintenance Ratio
M&S
Modeling and Simulation
MS
Milestone
MTA
Middle Tier Acquisition
MTB
Mission Technical Baseline
MTBOMF
Mean Time Between Operational Mission Failures
MTP
Management and Transition Plan
MTS
Master Test Strategy
MUA
Military Utility Assessment
NATO
North Atlantic Treaty Organization
NAWDC
Naval Air Warfare Development Center
NCO
Non-Combat Operations
NDA
Nondisclosure Agreement
NIB
Not-to-Interfere Basis
NIPRNET
Non-secure Internet Protocol Router Network
NLT
No Later Than
NMETL
Navy Mission-Essential Task List
NOI
Notice of Intent
NSMWDC
Naval Surface and Mine Warfare Development Center
NSW
Naval Special Warfare
NTP
Navy Training Plan
NTSP
Navy Training Systems Plan
NTTP
Navy Tactics, Techniques, and Procedures
NWCF
Navy Working Capital Funds
NWP
Naval Warfare Publication
NWS
New Weapons System
O&MN
Operations and Maintenance, Navy
O&MNR
Operations and Maintenance, Navy Reserve
OA
Operational Assessment
OAR
Operational Test Agency Assessment Report
OCE
Officer Conducting the Exercise
OE
Operational Effectiveness
OER
Operational Test Agency Evaluation Report
OFER
Operational Test Agency Follow-on Evaluation Report
OIPT
Overarching Integrated Product Team
OM
Operational Manager
OMAR
Operational Test Agency Milestone Assessment Report
OMB
Office of Management and Budget
OMF
Operational Mission Failure
ONI
Office of Naval Intelligence
ONR
Office of Naval Research
OPAREA
Operating Area
OPCON
Operational Consideration
OPCON
Operational Control
OPEVAL
Operational Evaluation
OPN
Other Procurement, Navy
OPNAV
Office of the Chief of Naval Operations
OPORD
Operations Order
OPSEC
Operations Security
OPTEVFOR
Operational Test and Evaluation Force
ORD
Operational Requirements Document
OS
Operational Suitability
OSD
Office of the Secretary of Defense
OT
Operational Test(ing)
OT&E
Operational Test and Evaluation
OTA
Operational Test Agency
OTC
Operational Test Coordinator
OTD
Operational Test Director
OTRR
Operational Test Readiness Review
OUA
Operational Utility Assessment
OTG
OPTEVFOR Tactics Guide
OV
Operational View
PANMC
Procurement of Ammunition, Navy and Marine Corps
PCO
Procurement Contracting Officer
PDF
Portable Document Format
PEO
Program Executive Office/Officer
PIN
Policy and Information Notice
PM
Program Manager
PMR
Prevent, Mitigate, Recover
POA&M
Plan of Action and Milestones
POC
Point of Contact
POE
Projected Operational Environment
PoP
Period of Performance
POR
Program of Record
PRE-Ex
Pre-Exercise
PWS
Performance Work Statement
QASP
Quality Assurance Surveillance Plan
QRA
Quick Reaction Assessment
QRT
Quick Reaction Test
RALOT
Risk Assessment Level of Test (replaced by LTD)
RDT&E
Research Development Test and Evaluation
RDA
Research, Development, and Acquisition
RDD
Rapid Development and Deployment
RFP
Request for Proposal
RFPPR
RFP Program Review
RMF
Risk Management Framework
RML&A
Reliability, Maintainability, Logistic Supportability, and Availability
ROC
Required Operating Capability
RPED
Rapid Prototyping Experimentation and Demonstration
RPN
Reserve Personnel, Navy
RV
Response Variable
S&T
Scientific and Technological
SAT
Satisfactory
SCN
Ship Construction, Navy
SDTS
Self-Defense Test Ship
SECNAV
Secretary of the Navy
SELEX
Selected Exercise
SEP
Systems Engineering Plan
SEPCOR
Separate Correspondence
SERB
System Evaluation Review Board
SES
Senior Executive Service
SH
Section Head
SHIPALT
Ship Alteration
SIL
System Integration Laboratory
SIPRNET
Secret Internet Protocol Router Network
SME
Subject Matter Expert
SOF
Statement of Functionality
SOP
Standard Operating Procedure
SoS
System of Systems
SOW
Statement of Work
SPECWAR
Special Warfare
SRRB
Service Requirements Review Board
SQT
Software Qualification Test(ing)
ST
Synchronized Test for CBTE
STAR
System Threat Assessment Report
STS
Strategic Sealift
STW
Strike Warfare
SUBOPAUTH
Submarine Operating Authority
SUT
System Under Test
SUW
Surface Warfare
SV
System View
SYSCOM
Systems Command
T&E
Test and Evaluation
TA
Threat Assessment / Technical Assistant
TACAIR
Tactical Aircraft
TACMAN
Tactical Manual
TACSIT
Tactical Situation
TAD
Temporary Assigned Duty
TEB
Technical Evaluation Board
TECG
Test and Evaluation Coordinating Group
TECHEVAL
Technical Evaluation
TEIN
Test and Evaluation Identification Number
TEMP
Test and Evaluation Master Plan
TEMPALT
Temporary Alteration
TEPS
Test and Evaluation Program System
TES
Test and Evaluation Strategy
TIEF
Tailored Integrated Evaluation Framework
TPRB
Test Plan Review Board
TO
Task Order
TRR
Test Resource Requirements
TSP
Test Strategy & Policy
TTP
Tactics, Techniques, and Procedures
TTVR
Target Threat Validation Report
TYCOM
Type Commander
UJTL
Universal Joint Task List
UNSAT
Unsatisfactory
UNTL
Universal Navy Task List
UONS
Urgent Operational Need Statement
USAF
United States Air Force
USC
United States Code
USCG
United States Coast Guard
USD(A&S)
Under Secretary of Defense (Acquisition and Sustainment)
USMC
United States Marine Corps
USN
United States Navy
USSOCOM
United States Special Operations Command
UTES
Unclassified Test and Evaluation Support
UUNS
Urgent Universal Need Statement
UUV
Unmanned Underwater Vehicle
UWDC
Undersea Warfare Development Center
VCD
Verification of Correction of Deficiencies
VMX-1
Marine Operational Test and Evaluation Squadron ONE
VV&A
Verification, Validation, and Accreditation
VX-1
Air Test and Evaluation Squadron ONE
VX-9
Air Test and Evaluation Squadron NINE
WAWF
Wide Area Work Flow
WCB
Warfare Capability Baseline
WEBSKED
Web-Based Scheduling System
WIPT
Working Integrated Product Team
WPN
Weapons Procurement, Navy
WSERB
Weapon Systems Explosive Review Board
APPENDIX B - THE CONTINUUM OF TESTING
B-1. INTRODUCTION
Per SECNAVINST 5000.2F, T&E programs will be structured to:
Provide essential information for assessment of acquisition risk and decision-making.
Verify attainment of technical performance specifications and objectives.
Verify that systems are operationally effective and suitable for intended use.
For programs of record, three principal types of T&E are conducted to accomplish these objectives:
Developmental Test and Evaluation (DT&E), OT&E, and IT. SECNAVINST 5000.2F and DoDD
5000.01 discuss each of these in detail. This appendix addresses the role of OPTEVFOR in the
test continuum. The challenge for the OTD is to understand the entire testing continuum and, with
that knowledge, make the best use of available resources to design and execute the minimum
adequate test program.
DoD and DoN directives are currently being updated to address testing for accelerated acquisition
programs and those that utilize non-traditional authorities of the adaptive acquisition framework
(DoDI 5000.02). The scope of these policy efforts includes Middle Tier Acquisition (MTA)
programs. SECNAVINST 5000.2F conveys applicability of Quick Reaction Assessment (QRA)
within the test approaches for these programs. For oversight programs, DOT&E Memo dated 24
October 2019, subject “Operational and Live-Fire Test and Evaluation Planning Guidelines
Middle Tier of Acquisition Programs” provides relevant amplifying guidance. This memo is
available in the Y:\OT&E Reference Library. The OPTEVFOR tailored IEF and LTD processes
have been adapted to account for the current guidance regarding MTA programs.
B-2. T&E DEFINITIONS
B-2.1 DT&E
DT&E is planned and conducted by the DA, usually a SYSCOM or a PEO. In practice, DT is
typically managed by the PM through an assistant PM for T&E. In some cases, the principal
responsibility for the actual performance of T&E is assigned to a warfare center. SECNAVINST
5000.2F mandates the DA conduct adequate DT&E throughout the development cycle to support
risk management, provide data on the progress of system development, and to determine readiness
for OT. DT&E is conducted at contractor or government test and engineering activities.
OPTEVFOR should participate in DT&E when feasible to evaluate OT-relevant DT results and to
provide both an early operational perspective to developers and identification of OT issues to the
PM.
B-2.2 IT
Integrated testing takes a holistic view of both the developmental and operational test objectives
and seeks opportunities where test events can be leveraged to serve both. OPTEVFOR uses the
IEF to provide a comprehensive view of the information that will ultimately be needed to
determine the effectiveness and suitability of the SUT. By providing the IEF as an input to the
Milestone B TEMP, OPTEVFOR ensures that all stakeholders have a clear view of the critical
missions, tasks, attributes and measures that will need to be observed. Early and frequent
involvement by test agencies is required to ensure successful execution of IT. The DA, test
agencies, and user representative (resource sponsor) must share a common interpretation of the
system capability needs so that DT and OT are tailored to optimize resources, test scope, and
schedule. Test data qualified for OT use (OT-qualified data) should have the following
distinguishing characteristics:
Representative forces (friendly and opposing) will be used whenever possible and will employ
realistic tactics and targets.
Typical users (Fleet personnel) are required to operate and maintain the SUT for OT under
conditions simulating combat stress and peacetime conditions.
B-2.3 OT&E
Operational test and evaluation is defined in statute (Title 10, USC). As the Navy's Operational Test
Agency, OPTEVFOR is responsible for determining the operational effectiveness and operational
suitability of the SUT during realistic testing with actual Fleet operators and maintainers. In
addition, the CNO has tasked COMOPTEVFOR to evaluate how the SUT operates within the SoS
to deliver the required warfighting effects. To support the Service Acquisition Executive and
resource sponsor, OPTEVFOR also conducts a series of operational assessments prior to MS-C.
These assessments are focused on identifying the enhancing characteristics of the system under
development as well as the risks to the successful completion of IOT&E. For IOT&E, the test
article will be representative of the intended production equipment and will be installed as
closely as possible to the configuration expected in the Fleet.
Production or production-representative articles will be used for the dedicated phase of
IOT&E that supports the post-Milestone (MS)-C FRPDR.
Sufficient and accurate data must be recorded during the test to document all operationally
significant system or equipment characteristics.
Additionally, OT&E includes the evaluation and analysis of data from an operational
viewpoint to assess or determine the operational effectiveness and operational suitability of a
system.
The two products of OT&E are:
o The Evaluation Report.
o The OPTEVFOR Tactics Guide (OTG). Most tests do not require an OTG. OTGs are
often produced in support of air warfare systems during IOT&E. Submarine and Surface
Force tactics are developed by the respective WDC. Generally, OTGs are not produced
in support of FOT&E unless a major increase in new capability is introduced.
B-3. A COMPARISON OF DT&E AND OT&E
DT&E and OT&E necessarily examine the same performance features of a system; however,
their objectives are different. DT&E and OT&E normally differ in the way tests are conducted,
what is being tested, and the evaluation criteria and test measurements. Table B-1 illustrates this
comparison.
Table B-1. Comparison of DT&E and OT&E
UNCLASSIFIED
How Tests are Conducted
DT&E testing is generally conducted:
In a controlled environment that minimizes the
chance that unknown or unmeasured variables will
affect system performance
By technical personnel skilled at “tweaking” to
maximize performance
Against simulated threats tailored to demonstrate
various aspects of specified system technical
performance.
OT&E testing is generally conducted:
In an operationally realistic environment (e.g., high seas,
temperature extremes, high density electromagnetic
environments) under conditions simulating combat stress and
peacetime conditions
With Fleet operators and maintenance personnel
Against threats which replicate, as closely as possible, the
spectrum of operational characteristics
Using Fleet tactics.
Testing Subject/Topic
DT&E is focused on evaluating the technical parameters
of the weapon or system.
OT&E tests the performance of the SUT in the execution of a set of
critical mission tasks. This generally puts the SUT into a larger
SoS needed to deliver a required warfighting capability.
Evaluation Criteria
DT&E
Technical criteria are measured to verify that the SUT performance meets its specification
requirements.
OT&E is focused on validating the contribution of the SUT to the
CNO-specified warfighting requirements using a relevant fleet
mission context and threat environment.
Measurement and Frequency
DT&E
The tester generally knows what he/she wants to
measure (some particular parameter: launch velocity;
the number of g’s pulled as the missile acquires;
time to climb; etc.).
DT&E tests are structured to hold many things
constant, isolate others, and allow measurement of
one or two parameters of interest.
Special instrumentation is often installed to capture
required data.
OT&E
An objective is to create conditions that replicate combat as
closely as possible.
Using actual Fleet platforms in complex, time-compressed test
events with high costs generally precludes an incremental
experiment and test approach.
While every effort is made to identify the root cause of
deficiencies, OT&E may not have the time or resources
necessary to collect the data needed to isolate the cause of a
failure. It is generally more important for OT&E to ensure that
as many possible failure modes are identified prior to Fleet
release.
General Note: Data collection instrumentation used for DT should be examined to determine applicability and use during
OT&E. Additionally, data acquired during DT should be reviewed for use during OT&E.
B-4. PROGRAM OF RECORD OT&E
B-4.1 GENERAL
In the Navy, COMOPTEVFOR plans and reports OT&E directly to CNO. All ACAT I, II, III,
and IVT programs require OT&E. Table B-2 provides a description of the criteria for ACAT
and AAP designations.
Table B-2. Description and Decision Authority for ACAT I-IV and AAPs
UNCLASSIFIED
ACAT
Criteria for ACAT or AAP Designation
Decision Authority
ACAT I
Statutory MDAP program which:
will not be carried out using the middle-tier
acquisition pathways for Rapid Prototyping or
Rapid Fielding;
RDT&E costs exceed $480 million in Fiscal
Year (FY) 2014 constant dollars or
procurement exceeds $2.79 billion in FY 2014
constant dollars.
Or, a DoD acquisition program that the USD (A&S) or the ASN (RD&A) designates as an MDAP
as a discretionary act.
ACAT ID: USD(A&S)
ACAT IC: SECNAV, or if
delegated, ASN (RD&A) as
the CAE (not further
delegable)
ACAT IA
MAIS: An Automated Information System (AIS) that is:
-Estimated to exceed:
$40 million in FY 2014 constant dollars for all
expenditures in any single fiscal year, for all
increments, sprints, etc., regardless of the
appropriation or fund source, directly related to
the AIS definition, design, development,
deployment, operation, and sustainment; or
$165 million in FY 2014 constant dollars for all
expenditures, for all increments, sprints, etc.,
regardless of the appropriation or fund source,
directly related to the AIS definition, design,
development, and deployment, and incurred
from the beginning of the Materiel Solution
Analysis Phase through deployment at all sites;
or
$520 million in FY 2014 constant dollars for
all expenditures, for all increments, sprints,
etc., regardless of the appropriation or fund
source, directly related to the AIS definition,
design, development, deployment, operations
and maintenance, and incurred from the
beginning of the Materiel Solution Analysis
Phase through sustainment for the estimated useful life of the system.
-Or designated as a MAIS by the USD (A&S) or the ASN (RD&A)
ACAT IAM: USD (A&S) or as delegated
ACAT IAC: ASN (RD&A) or as delegated
ACAT II
Does not meet criteria for ACAT I
Meets the definition of Major System:
Dollar value for all increments of the program
estimated to require:
-RDT&E total expenditures > $185 Million in FY 2014 constant
dollars; or
-Procurement total expenditures > $835 Million in FY 2014
constant dollars
Or, ASN (RD&A) designation as a Major
System
Does not apply to AIS programs. AIS
programs that do not meet the criteria for
ACAT IA shall be designated ACAT III
or lower, as designated by ASN (RD&A).
ACAT III
Does not meet criteria for ACAT I or II.
The program will acquire new or improved capability in response to
a validated capabilities document.
Dollar value for all increments of the program
estimated to require:
RDT&E total expenditures > $26 Million but < $185 Million in FY
2014 constant dollars; or
-Procurement total expenditures > $64 Million but < $835 Million in FY 2014 constant dollars
Individual designated by
the cognizant PEO,
DRPM, or SYSCOM
Commander.
ACAT IVT
Does not meet criteria for ACAT I, II, or III.
The program will acquire continuing capability for a deployed
system in response to a validated capabilities document.
Does require operational test and evaluation.
Dollar value for all increments of the program estimated to require:
RDT&E total expenditures > $26 Million but < $185 Million in FY
2014 constant dollars; or
-Procurement total expenditures > $64 Million but < $835 Million in
FY 2014 constant dollars
Individual designated by
the cognizant PEO,
DRPM, or SYSCOM
Commander.
ACAT IVM
Does not meet criteria for ACAT I, II, or III.
The program will acquire continuing capability for a deployed
system in response to a validated capabilities document.
Does not require operational test and evaluation.
Dollar value for all increments of the program estimated to require:
RDT&E total expenditures > $26 Million but < $185 Million
in FY 2014 constant dollars; or
-Procurement total expenditures > $64 Million but < $835 Million in
FY 2014 constant dollars
Individual designated by
the cognizant PEO,
DRPM, or SYSCOM Commander.
AAP
Does not breach ACAT IV dollar thresholds
Does not require operational test and evaluation.
Dollar value for all increments of the program
estimated to require:
RDT&E total expenditures < $26 Million in FY 2014 constant
dollars; and
-Procurement total expenditures < $64 Million in FY 2014 constant
dollars
Individual designated by
the cognizant PEO,
DRPM, or SYSCOM
Commander.
(This designation
authority may be
delegated)
ASN (RD&A) Assistant Secretary of the Navy (Research, Development, and Acquisition)
CAE Component Acquisition Executive
DoD CIO Department of Defense Chief Information Officer
DRPM Direct Reporting Program Manager
FY Fiscal Year
MDA Milestone Decision Authority
RDT&E Research, Development, Test, and Evaluation
SECNAV Secretary of the Navy
SES Senior Executive Service
USC United States Code
USD(A&S) Under Secretary of Defense (Acquisition and Sustainment)
NOTE
OT&E is not required for ACAT IVM or AAPs per SECNAVINST 5000.2F. Written
concurrence from COMOPTEVFOR is required for designation of a program as an
ACAT IVM. For an AAP, written concurrence from COMOPTEVFOR must be
obtained stating that OT&E is not required.
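As an informal illustration only (not an official designation tool), the sketch below applies the FY 2014 constant-dollar thresholds from Table B-2 to show how RDT&E and procurement estimates map to a candidate ACAT band for a non-AIS program. The function name and the simplified logic are assumptions; discretionary designations, AIS (ACAT IA) rules, and the IVT/IVM distinction are not modeled, and designation authority remains with the officials listed in the table.

def candidate_acat(rdte_musd: float, procurement_musd: float) -> str:
    """Map FY 2014 constant-dollar estimates (in $ millions) to a candidate ACAT band
    for a non-AIS program, using the Table B-2 thresholds. Illustrative sketch only."""
    if rdte_musd > 480 or procurement_musd > 2790:
        return "ACAT I"
    if rdte_musd > 185 or procurement_musd > 835:
        return "ACAT II"
    if rdte_musd > 26 or procurement_musd > 64:
        # The ACAT IVT/IVM split within this band depends on whether OT&E is required.
        return "ACAT III or IV"
    return "AAP candidate"  # below the ACAT IV dollar thresholds

# Example: $200M RDT&E and $500M procurement falls in the ACAT II band.
print(candidate_acat(200, 500))  # ACAT II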
B-5. TYPES OF OT
B-5.1 OA
An OA is a test event conducted before initial production units are available and which
incorporates substantial operational realism. An OA is conducted per a test plan, which must be
approved by DOT&E for programs under OSD OT&E oversight. An OA focuses on assessing the
overall risk to a system's successful completion of IOT&E and will usually address the following:
Significant trends noted in development efforts.
Limitations to test.
Areas of risk.
Capability of the SUT to meet performance goals in operational effectiveness and suitability
at IOT&E.
Capability of the SUT to deliver required warfighting effects in a SoS context.
OAs should be conducted when there is enough system maturity to conduct an operational test
incorporating substantial operational realism and may use technology demonstrators, prototypes,
mockups, or simulations if those articles can be placed in an operational context and risk to IOT&E
can be adequately assessed. An OA does not have to use production-representative articles and
does not substitute for the IOT&E necessary to support FRP decisions. As a general criterion for
proceeding through Milestone C, at least one OA will be conducted and the results documented in
a formal report. An OA may also be used to support other program reviews. OAs are not intended
to support FRPDRs, Fleet release, or introduction recommendations. All OAs are included in the
TEMP. There are two types of OAs:
B-5.1.1
OT-A (Early Operational Assessment (EOA)) is conducted during the Materiel Solution Analysis
and Technology Development phases. Results support decision makers at MS-B in determining
whether to continue development and approve entry into the Engineering and Manufacturing
Development phase of the acquisition process. EOAs may also focus on testability issues (e.g.,
M&S, ranges, environments).
B-5.1.2
OT-B (OA) is OT&E conducted during the Engineering and Manufacturing Development phase.
OT-B may be subdivided into discrete phases (e.g., OT-B1, OT-B2). OAs are conducted per a test
plan employing significant operational realism to identify enhancing characteristics of the system
as well as to discover and categorize risks to a successful IOT&E. Results of OT-B assessments
identify program enhancements and risks, and the final OT-B phase will support the MS-C LRIP
decision by the MDA.
B-5.2 IOT&E
OT-C is OT&E conducted on a production-representative test article(s) during the Production and
Deployment phase of the acquisition cycle, and is a prerequisite for the FRPDR.
COMOPTEVFOR makes a determination on operational effectiveness and operational suitability,
and a recommendation regarding Fleet introduction.
B-5.3 VCD
VCD is not a major phase, but is included as a phase of OT when necessary. A VCD is generally
not a preplanned phase in the TEMP, but can be incorporated into the test program after a formal
phase of OT to verify that certain deficiencies have been corrected. No TEMP update is required,
but a test plan is required. While VCDs normally do not resolve COIs, with proper pre-test
coordination and test planning, COIs may be evaluated during a VCD. For reporting purposes, a
VCD is tied to the previous phase of testing to which it applies (i.e., a VCD for OT-B1 would be
"OT-B1 VCD"). VCDs are done to assist the MDA in ensuring the deficiencies cited as corrected
by the DA from a previous phase of OT have actually been corrected. This type of test will
examine only those deficiencies (and associated COIs) the DA states have been corrected (or
substantially mitigated). The purpose is to show the deficiencies as demonstrated corrected;
demonstrated to be substantially mitigated, i.e., to a degree that recategorization is warranted;
demonstrated not corrected; or as not demonstrated. For non-DOT&E oversight programs, when
COI resolution is discussed in the test plan and if the VCD results enable a change to the resolution
of COIs (beyond IOT&E), then the new resolution is reported. For programs on DOT&E
oversight, the only permitted change in COI resolution during a VCD phase of test is from SAT to
UNSAT. See chapter 8, Evaluation Reports, for report requirements.
B-5.4 FOT&E
FOT&E is all OT&E conducted after the IOT&E. FOT&E is divided into two major phases:
B-5.4.1
OT-D is FOT&E conducted after IOT&E (post-MS-C/FRPDR), using equipment of the same
design as in IOT&E or preferably production systems. It includes completion of any deferred or
incomplete OT&E. OT-D is described in detail in chapter 5, TEMP.
B-5.4.2
OT-E is FOT&E conducted on production systems, unless previously accomplished in OT-D. The
major objective of OT-E is the validation of the operational effectiveness and operational
suitability of production systems. OT-E should be scheduled and conducted whenever production
articles are not available for testing in prior OT&E.
B-5.5 Software Testing
Software will be operationally tested in the system in which the application is installed or
implemented when fielded. The software used for IOT&E of the core block will provide a
performance baseline for testing subsequent increments. For each increment of software for a
software-intensive system, the OTD shall use the DoD guidelines for conducting OT&E for
software-intensive system increments and the ASN (RD&A) Guidebook for Acquisition of Naval
Software Intensive Systems, Version 1.0, for determining elements of risk and the appropriate
level of OT.
B-5.5.1 SQT
When a software revision or increment is to be released as part of an acquisition MS decision, the
OT is considered an IOT&E. When a software revision or increment is to be released not in
conjunction with an MS decision, the decision may be made to use the SQT process. SQT applies
to software modifications of limited scope, as determined by CNO (N94), such as aircraft and
weapon systems’ operational flight programs and other systems in which software provides a
similar function. When a program is approved for SQT, CNO (N94) will assign a Test and
Evaluation Identification Number (TEIN), and an SQT TEMP will be written using the format
from SECNAVINST 5000.2F. For SQT, a Statement of Functionality (SOF) prepared by the PM
and approved by the program sponsor will be used in lieu of a CDD/CPD to develop the SQT
TEMP. SQT reports use the standard OPTEVFOR evaluation report format template (see chapter 8).
B-5.5.1.1 SOF
The PM will forward a SOF to COMOPTEVFOR, via the program sponsor, copy to CNO (N942).
The program sponsor's endorsement will serve as validation of software requirements for that
intended release. The SOF will:
Define new capabilities of the improved software.
Address software corrections to previous deficiencies.
Address any capabilities that were deleted or modified.
Describe the breadth and depth of regression testing conducted.
Address specific operational requirement(s) of the new software.
Describe safety and/or security issues or functions added, modified, or deleted.
B-5.6 Significant Alterations
It is not possible to provide an explicit definition of a significant alteration, which is handled much
like a new system for system acquisition purposes. The decision to classify a modification, ECP,
ordnance alteration, block upgrade, product improvement, etc., as a significant alteration is based
on the scope of the change, the funding level, the importance of the system, the numbers to be
produced, etc. CNO (N94) will consider factors such as these in making the decision. In general,
where an alteration is intended to improve a warfighting capability vice suitability, the alteration
would require some measure of OT&E prior to Fleet introduction. The judgment of
COMOPTEVFOR, the DA, the CNO Resource and Program Sponsor, and (where applicable) the
Naval Board of Inspection and Survey (INSURV) will be major factors considered by N94 in
determining the applicability and scope of testing significant alterations.
B-5.7 QRA
Emerging operational requirements and accelerated acquisition programs necessitate modifying
the established OT process to achieve a rapid capability in the Fleet. In these cases, the program
sponsor may require a risk assessment by COMOPTEVFOR to better understand the capabilities
of the proposed system as well as the risks associated with its fielding. If the sponsor decides a
QRA is needed, the sponsor sends a request to CNO (N94), with a copy to COMOPTEVFOR. If
tasked by CNO (N94), OPTEVFOR will conduct the QRA as rapidly as is feasible. A QRA will
not take the place of a formal OA or IOT&E as described in the TEMP, nor will it be used to
resolve COIs; make effectiveness, suitability, or cyber survivability determinations; or provide fully
developed Fleet introduction/Fleet release recommendations typically produced via an IOT&E
test. (If critical deficiencies are uncovered that clearly outweigh any potential operational benefit,
the Commander reserves the right to make a recommendation against Fleet release.) A QRA is an
operational risk assessment to address the purpose and answer the questions as outlined in the
QRA tasking letter. QRAs require a test plan, normally signed by the Division Director. All QRA
reports are signed by COMOPTEVFOR. See chapter 6 for QRA test planning and chapter 8 for
QRA report format. The following information must be included in the QRA request:
Purpose of the assessment and, specifically, what system attributes the program sponsor
wants assessed,
Length of time available for the assessment,
Funding available for the assessment.
B-6. TYPES OF IT
SECNAVINST 5000.2F requires that planning for DT and OT (IT) be coordinated at the test
design stages so that each test phase uses resources efficiently to yield the data necessary to satisfy
common needs of the PM and the OT&E agency. Where full IT is not possible or feasible, there
are two legacy methods for integrating T&E that should also be considered: combined
DT/OT and DT assist. The goal should be to maximize IT and use the OT-qualified data to support
the required independent OT period. The following paragraphs describe IT and the two legacy
methods.
B-6.1 IT
IT is the collaborative planning and collaborative execution of test phases and events to provide
data in support of independent analysis, evaluation, and reporting by all stakeholders, particularly
the DT (contractor and government) and OT communities. IT blends or combines CT, DT, and
OT to form a cohesive testing continuum. This integration cannot occur unless the participants
(CT director, DT director, and OTD) have determined their entering requirements for adequate
testing of the SUT. IT does not remove or combine any of COMOPTEVFOR’s current or future
requirements for reporting based on a separate (OPTEVFOR) analysis of the shared test
information produced by the IT effort. IT does not eliminate the requirement for an independent
IOT&E phase of OT&E. However, the expectation is that the IOT&E period may be reduced in
scope and time due to the early, integrated involvement of operational testers throughout the entire
continuum of system development. Any reduction in the scope of IOT&E is highly contingent on
the stability of the configuration of the SUT and the amount of qualified data that can be brought
forward. Regardless of any reductions in scope, IT should substantially increase the probability
of successful completion of IOT&E by raising OT concerns earlier in development. IT includes
several key planning paradigms, including:
B-6.1.1
A requirement for the OT team to provide detailed OT input (IEF) to the IT planning process and
provide it early in the program schedule. To this end, OPTEVFOR will develop a Tailored IEF to
support the MS-A TEMP and a complete IEF to support the development of the MS-B TEMP.
B-6.1.2
The sharing of data throughout development and the associated IT periods. This sharing will
support the monitoring of the progress of system capabilities, attributes, KPPs, MOEs, and MOSs
towards the successful resolution of COIs.
B-6.1.3
Blue and Gold sheets may be created, modified, and closed based on results obtained during IT.
Robust testing minimizes surprises when the warfighter receives the product and ensures the
specified capabilities are evaluated in the operational environment. Risk is reduced by bringing
all testing agents together early in the process to ensure capabilities are tied to missions and tasks,
mission-based testing is conducted, system anomalies/deficiencies are identified early in the
process, and all data are shared. Cost is reduced by the sharing of resources, elimination of
duplicative testing, and the early identification and correction of deficiencies. Schedule
compression is achieved by combined versus sequential testing and the sharing of high demand
testing assets. None of these objectives can be achieved without the cooperation of all parties and
a commitment to a team approach between program office, OT, DT, and contractor personnel.
B-6.2 Combined DT/OT
Combined DT/OT, in its strictest sense, is a test phase in which DT and OT testers share test assets
and data, and in which the events meet DT and OT requirements. An example of this would be a
test in which DT and OT testers collect data from the same event or flight. Combined DT/OT is
frequently employed for live fire events that tend to be constrained due to safety considerations
(e.g., air-to-air missile firings and torpedo set-to-hit firings). The following comments apply to
combined DT/OT in a broad sense:
B-6.2.1
While combined testing may be possible in some cases, the differing objectives of DT&E and
OT&E may make it more difficult to combine the two than it may first appear. The explanation is
as follows:
DT&E is properly conducted to test some individual specification or parameter (e.g., the
number of g's pulled by a projectile) with other parameters held constant. The test is
designed to measure technical performance of a system.
The mission of OPTEVFOR is to assess whether, given the achieved technical performance,
the weapon system can be operationally effective and operationally suitable (for both the
SUT and in the SoS) when employed under typical combat and environmental conditions by
Fleet personnel against an enemy who fights back. Thus, OT&E is conducted on a mission-
by-mission basis, varying such factors as sea state; visibility; own-ship speed and maneuvers;
and the method of illumination, range, firing doctrine, target maneuvers, enemy
countermeasures, etc.
B-6.2.1.1
Early planning for combined DT and OT is essential to ensure efficient use of resources.
Participation by OPTEVFOR in the planning and execution of combined tests must ensure that the
tests conducted and data collected are sufficient and credible to meet OT&E requirements. A
separate, independent OT plan will be provided, and separate and independent evaluation of OT
results will be conducted and reported. Depending on the phase of testing, OPTEVFOR will
identify new and changed risks or deficiencies in standard Blue and Gold sheet format. Prior to
combined DT/OT, the OTD should review the DT&E test plan for the technical characteristics and
test objectives, and to understand how the PM intends to test the system. The OT team needs to
understand what will be tested and how it may impact OT. Combined DT/OT typically requires
an MOA between the PM and COMOPTEVFOR that outlines the DT and OT objectives,
capabilities/functions to be demonstrated, test conditions, test operations, etc. The MOA format
is available in the OT&E Reference Library.
B-6.3 DT Assist
B-6.3.1
DT assist is simply DT with active involvement of OT personnel. As DT assists are not a formal
phase of OT, they will not be assigned an OT number, but will be assigned a DT assist number in
TEPS for test and/or document tracking purposes. OT testers help execute the DT plan. There is
no OT plan, and no OT report is required. A DT assist may be conducted for several different
reasons. It may be done to allow OTDs to become more familiar with a system, to supplement DT
personnel, or to allow the performance of non-hazardous developmental testing on aircraft
assigned to OT squadrons. In all cases, a DT assist provides the system's developers with an early
operational perspective. Though OPTEVFOR does not provide a formal report, Pending Risks
will be captured in Blue and Gold sheets. If the program manager requests, the results will be
compiled in a LOO with the Blue and Gold sheets appended. Conducting a Divisional SERB is at
the discretion of the A-Code, but Blue and Gold sheets will be reviewed by 01C prior to LOO
signature. If provided, the LOO will only be addressed to the PM or PEO as requested.
B-6.3.2
Table B-3 is provided to highlight the differences between DT assist and formal combined DT/OT
phases of testing.
B-6.3.2.1
In DT assists, we generally do not identify minor, major, or severe limitations, since the recipient
is generally familiar with the limited scope of the observations.
B-6.3.2.2
DT assist is more than passive observation of DT. OTDs have routinely monitored DT, and that
should continue. A DT assist signifies that the OT test team is actively engaged in the data
collection and is performing its own independent analysis. Ideally, a DT assist should be
characterized on the program-integrated schedule just as combined DT/OT is shown, with
simultaneous DT and OT activity. However, if it is not included on the schedule, a DT assist may
still be pursued and accomplished.
B-6.3.2.3
As is the case for all DT data, if the data meet OT requirements they can be used to supplement
OT data and help resolve COIs in future phases.
B-6.3.2.4
For DT assists, use of an MOA is recommended. This ensures all parties have shared expectations
about the scope of the test, when testing commences and ends, etc. This is also a good place to
specify that OPTEVFOR needs access to DT data and reports. The DT assist MOA template can
be tailored for the specific case.
B-6.3.2.5 DT Assist after MS-C
A DT assist can be employed during any phase of the acquisition process, including post-MS-C.
However, it is most appropriate for “fly and fix” applications where COI resolution and
conclusions regarding effectiveness and suitability are neither needed nor desired. Because most
programs are seeking “effective and suitable” conclusions after MS-C, the DT assist approach is
often not the vehicle of choice. It can be used effectively, though, as a lead-in to formal IOT&E,
FOT&E, or prior to a request for a VCD. If a DT assist, with a LOO, is being used to support a
fielding decision, as in the case of a software intensive system, any Pending Risks identified during
the DT assist should be written based on the risk to deployment or fielding.
B-7. MOT&E
B-7.1 MOT&E
MOT&E is OT conducted jointly by two or more Services for formal DoD acquisition programs.
A lead organization will be designated to coordinate all testing involving more than one military
department or defense agency. This lead organization will prepare a single TEMP, a single test plan, and
a single T&E report on the operational effectiveness and suitability of the system for each
participating organization. The basic framework for the conduct of MOT&E for the four Services
is contained in the MOT&E MOA.
B-7.2 Navy Lead Service
When the Navy is lead Service, OT&E will be conducted per the provisions of SECNAVINST
5000.2F, the MOT&E MOA, and this guide. OPTEVFOR performs essentially the same functions
as in normal OT&E, with the following modifications:
All planning including the MBTD process will be coordinated with other Service OT&E
agencies.
OPTEVFOR will begin the planning process (MBTD) by issuing a call to other
Service OT&E agencies for COIs and their test objectives. These issues and
objectives will then be consolidated into the IEF and coordinated with other Service
OT&E agencies.
Formal coordination action on the TEMP will accommodate other Service OT&E
requirements and inputs.
Participating OT&E agency test directors and/or project officers will meet to assign
responsibilities for accomplishment of the critical issues/test objectives (from the IEF).
Each participating agency will then prepare the portion of the overall test plan for their
assigned critical issues/objectives, in OPTEVFOR test plan format, and will identify its data
needs. OPTEVFOR will then prepare the MOT&E test plan.
The appropriate ONI Capstone TA will be the TA used for overall program and Navy-unique
threat issues. Other Services may supplement the threat requirements of the ONI Capstone
TA through use of their Service-unique TAs.
B-7.3 Other Lead Service
When another Service has the lead, either a fully integrated TEMP or a Navy appendix to the lead
Service TEMP will be prepared that clearly reflects the unique Navy testing aspects of the
program, in addition to addressing multiservice testing. The threat for overall program issues,
based on the ONI Capstone TA, will also be addressed in the integrated TEMP or Navy appendix.
This integrated TEMP or Navy appendix will provide the basis for planning and executing Navy-
unique testing. Navy input to test documentation generated by other OTAs should be based on a
tailored IEF, as discussed in paragraph 4-1.10.
B-7.4 Discrepancy Reporting
The lead OT&E agency is responsible for ensuring a system is established to track discrepancies
and to provide periodic status reports to participating OT&E agencies. Control of promulgation
of such reports should be included in an MOA between the participating OT&E agencies.
B-7.5 Deviations from Lead Service OT&E Procedures
Deviations from lead Service OT&E procedures may be authorized by written agreement between
participating OT&E agencies. Close coordination will be required to ensure the requirements of
Navy OT&E are met.
B-7.6 Test Reporting
For major programs, the lead Service will prepare and coordinate the single (interim or final)
report, reflecting the system's operational effectiveness and operational suitability for each
Service. If a participating Service deems it necessary to produce an independent evaluation report,
it will be appended to the single MOT&E final report.
B-7.7 MOT&E Funding
Each Service OTA is funded differently for the execution of OT. The USAF and USA are directly
funded via Program Objective Memorandum (POM) for OT, while the Navy and Marine Corps
rely on the PM/Joint Program Office (JPO) to fund testing resources. Consequently, the lead OTA
will ensure that the TEMP Part IV clearly identifies each Service’s specific test resources (assets
and funding) and where that funding is coming from (specific PM/JPO, POM, etc.).
B-8. JOINT TEST PROGRAMS
B-8.1 Discussion
COMOPTEVFOR’s mandate is to test and evaluate new and improved warfighting capability in
as realistic an operational environment as possible, which should include some testing in a joint
environment for most programs. However, simply conducting OT of a Navy-only acquisition
program in a joint environment does not make it a joint test program. For the purposes of this
document, joint OT is defined as any test of a system, subsystem, component, or technology
program that involves funding or formal management (including test management) by more than
one DoD component, with the goal of providing a new or improved capability for a validated joint
need. This includes programs where one DoD component may be acting as acquisition agent for
another DoD component.
B-8.2 Types of Joint Testing
There are three basic types of tests for joint programs: MOT&E (described above), Joint Test and
Evaluation (JT&E), and JCTDs. MOT&E is OT&E conducted jointly by two or more Services.
The MOT&E MOA governs the conduct of MOT&E among the four Service OTAs. Some
Services delegate authority to conduct OT&E to supporting agencies or commands. For these
cases, an MOA may be required to codify test activities (e.g., resources, test execution, reporting,
etc.). JT&E and JCTD are joint test concepts that are outside the DoD Directive 5000.01 and are
funded outside the normal service budget process. While JT&Es and JCTDs attempt to address
shortfalls in joint warfighting, JT&E has a TTP focus, and JCTD has a technology/hardware focus.
(See paragraph C-8.2.11 for JT&E overview.) To support input to other OTA-led joint testing, or
JCTD test planning, a tailored IEF documenting OPTEVFOR’s mission and requirements analysis
is required (see paragraph 4-7).
B-8.2.1 JT&E
B-8.2.1.1 Overview
JT&E evaluates concepts, TTPs, architectures, processes, and addresses specific warfighter needs
and issues that occur in joint environments. The JT&E program is funded and directed by DOT&E
per DoD Instruction 5010.41. There are two types of JT&E:
Quick Reaction Test (QRT), normally lasting less than 1 year
Joint Test (JT), up to 3 years in duration.
B-8.2.1.2 Documentation and Test Reporting
Detailed guidance is available in the JT&E Program Handbook. QRTs and JTs may use the
MOT&E MOA to guide the relationship among participating OTAs. Where Navy expertise and
liaison are required, CNO (N94) will task COMOPTEVFOR to provide an OTD to act in a Navy
operational oversight function. When tasked, QRTs and JTs will be assigned a 5000-series local
TEIN for tracking and administration within the TEPS.
B-8.2.2 JCTDs
B-8.2.2.1 Background
B-8.2.2.1.1
A JCTD is an integrating effort to assemble and demonstrate a desired capability based on mature
advanced technologies in a realistic environment to clearly establish military and/or operational
utility. In response to a combatant commander’s request, the USD (A&S) approves all new-start
JCTDs by issuing an approval memorandum. Each JCTD is assigned a sponsor, typically a
combatant command that represents the end user of the system or capability. Once the JCTD
completes the approval process, a Working Integrated Product Team (WIPT) is developed
to plan, coordinate, and execute the assessments of the JCTD. The WIPT is composed of members
who fall under three distinct managers of the program. See
http://acqnotes.com/acqnote/acquisitions/joint-capability-technology-demonstration for the JCTD
guidance and process.
B-8.2.2.1.1.1 Operational Manager (OM)
The OM plans, schedules, and executes the OUA or Limited OUA (LOUA). The OM starts the
process with the assistance of an OTA to develop COIs, which make up the foundation of the IAP.
The IAP is similar to a TEMP and is the overarching test schedule for the program. The OM is
also responsible for drafting the DED for each OUA or LOUA. The DED is similar to a test plan.
B-8.2.2.1.1.2 Technical Manager (TM)
The TM is responsible for all contracts and acquisition instruments for the program, and manages
the budget for the JCTD. The TM is also responsible for delivering the Joint Capability Solution
to the OM for the assessments. The TM is responsible for ensuring that any technologies are
adequately mature and have met all technical and safety certifications before they are used in any
operational demonstrations.
B-8.2.2.1.1.3 Transition Manager (XM)
The XM is responsible for planning and supporting any Extended Use (EU) of the interim
capability. The XM identifies and facilitates funding for the transition of the capabilities and for
any EU period that has been planned. The XM is responsible for all transition documentation for
the capability to enter the JCIDS. All three managers are co-developers of the implementation
directive, management plan, and the transition plan. Through the WIPT, they all work closely
together in each phase of the program to ensure that the program is properly planned, executed,
and remains on schedule.
B-8.2.2.2 COMOPTEVFOR Participation
B-8.2.2.2.1
Many JCTDs will have little or no Navy interest, while a few may be developing an important new
capability for the Fleet. COMOPTEVFOR (based on input from the requesting sponsor combatant
commander) will determine which JCTDs merit our participation. Since JCTDs are not formal
acquisition, COMOPTEVFOR has no official mandate for participation in the process.
Nonetheless, given JCTDs may eventually transition to formal acquisition and the rigors of OT,
early involvement in selected JCTDs can be critical to rapid development and deployment to the
Fleet. With COMOPTEVFOR approval, a JCTD will be assigned an OTD and receive an
appropriate level of attention, which could well exceed that normally expended on a formal
acquisition program.
B-8.2.2.2.2
An OUA (replacing the MUA or LMUA) must be conducted by an independent activity (like
OPTEVFOR). Following the demonstration(s), and depending on the success, a JCTD may
transition to a formal acquisition program at the appropriate MS; may be produced in small
quantities and introduced to the Fleet; or may be shelved. JCTDs are not acquisition programs;
they transition solutions to the combatant commander.
B-8.2.2.3 Documentation
Because a JCTD is not a formal acquisition program, it will not have the traditional DoD and
SECNAV documentation. The following are the JCTD documents:
B-8.2.2.3.1 Implementation Directive
An implementation directive provides guidance and direction for implementing a JCTD. The
primary goal of an implementation directive is to define the JCTD program, its objectives, and key
participating agencies with their associated areas of responsibility and resources.
B-8.2.2.3.2 Management and Transition Plan (MTP)
Each JCTD is required to have an MTP, which is essentially an agreement between the developer
and the sponsor. It should include an overview of the JCTD, a schedule of planned events and
demonstrations, programmatic and organizational details, funding information, COIs, and a
description of the residual operational capability expected on completion of the demonstration(s).
Requirements may be incorporated in the MTP or may not be documented at all. JCTD sponsors
may include a CONOPS, which addresses theater-level interoperability, compatibility, and
integration issues.
B-8.2.2.3.3 IAP
A TEMP-like document, the IAP, includes an OUA approach and an OUA framework. The OUA
approach section includes the schedule, demonstration venues and participants, data requirements,
resources, and constraints. The OUA framework includes COIs, objectives, top-level capabilities
and metrics, MOPs, and MOEs.
B-8.2.2.3.4 CONOPS and TTP Outline
The CONOPS and TTP outline should include required capabilities with metrics, CONOPS, COIs,
the expected threat and operational environments, operational scenarios, and tactical vignettes.
B-8.2.2.3.5 Tailored IEF
The Tailored IEF is an OPTEVFOR document that captures the mission and requirements analysis
performed by an OTD (as described in the Implementation Directive or other requirement
documents) for those JCTDs that are of particular interest to the Navy.
B-8.2.2.3.6 DED
The DED is akin to a standard test plan for a non-oversight program.
B-8.2.2.3.7 Final Reports
Final reports for JCTDs are similar to the EOA/OA formats and are described in chapter 8,
Evaluation Reports.
B-8.2.2.4 Requirements
Since JCTDs are technology demonstrations by nature, most will not have a formal set of
performance requirements. Often, the demonstration is used to quantify system capabilities and
define requirements. If there are no thresholds or objectives, the test team should ascertain what
the JCTD is meant to do, and determine COIs and MOEs/MOPs needed to reflect those
capabilities. Also, ask how the JCTD could be used. Bring ideas before the WIPT and get
agreement, then do the test planning. OPTEVFOR participation in JCTDs should focus on:
Providing a sound OT methodology, complete with COIs, MOEs, and MOPs.
Developing COIs and MOEs/MOPs, including suitability issues.
Assessing and documenting the demonstration results so that transition to formal acquisition
will be as easy as possible.
Making recommendations for system improvement.
Identifying strengths and weaknesses observed.
B-8.2.2.5 TEPS
When tasked, JCTDs will be assigned a 5000-series local TEIN for tracking and administration
within TEPS.
B-9. U. S. SPECIAL OPERATIONS COMMAND (USSOCOM) NAVAL
SPECIAL WARFARE (SPECWAR) RESEARCH, DEVELOPMENT, AND
ACQUISITION (RDA) POLICY
Procedures for USSOCOM (and its component SPECWAR) systems and equipment must be
streamlined to ensure the most rapid possible progress from the concept stage through final
development. In many instances, USSOCOM/SPECWAR systems are needed to meet
preparedness requirements for contingency operations around the world. See the SOCOM
Acquisition website (https://www.socom.mil/acquisition-authority) for more information.
B-10. FOREIGN COMPARATIVE TESTING (FCT) AND DEFENSE
ACQUISITION CHALLENGE (DAC) PROGRAMS
Title 10, U.S. Code, Sections 2350a(g) and 2359b establish two programs: the FCT Program and
the DAC Program, respectively. The FCT Program tests allied or friendly nations’ defense
equipment and technologies to see if they can satisfy DoD needs. DAC allows non-DoD entities
to propose technologies, products, or processes to existing DoD acquisition programs. At the OSD
level, FCT and DAC Programs are managed by the Comparative Testing Office (CTO).
B-10.1
The purpose of the FCT Program is to determine the ability of defense equipment and technologies
from North Atlantic Treaty Organization (NATO) and friendly foreign countries to satisfy U.S.
requirements or correct operational deficiencies. The FCT Program:
Authorizes side-by-side testing of foreign non-developmental or Commercial Off-the-Shelf
(COTS) equipment.
Focuses on mature or late-stage technologies.
The DAC Program provides increased opportunities for the introduction of innovative and cost-
saving technologies into DoD acquisition programs. DAC provides an “on-ramp” to DoD
acquisition systems for small and medium vendors.
CNO, under the policy guidance of the ASN (RD&A), has responsibility within the Navy for
management and program execution of Foreign Weapons Evaluation (FWE) and NATO
Comparative Test Program (CTP).
When procurement of a foreign weapon system is planned, CNO will direct the DA and
COMOPTEVFOR to assess the adequacy of any previously conducted DT&E and OT&E, and
provide recommendations on the need for additional T&E prior to procurement. If additional T&E
is required, CNO (N94) will assign an ACAT and TEIN. T&E will then be conducted using normal
system procurement procedures.
Close liaison between the CTO project personnel and OPTEVFOR is required during test planning
and evaluation periods to ensure data can be used effectively in follow-on OT.
B-10.2.3.1
Additional information on FCT and DAC Programs is available at the CTO Web site listed above
and in SECNAVINST 5000.2F.
B-11. LFT&E
Live Fire Testing (LFT) is conducted to provide a timely and thorough assessment of the
vulnerability and lethality of a system as it progresses through its development and subsequent
production phases. The primary emphasis of LFT is on realistic testing as a source of personnel
casualty, vulnerability, and lethality information, taking into account the susceptibility to attack
and combat performance of the system. LFT will include, when feasible, the firing of threat
munitions (or surrogates) at operational, combat-loaded U.S. weapon systems to test their
vulnerability; and/or the firing of U.S. munitions or missiles against operational, combat-loaded
threat targets (or surrogates) to test the lethality of those munitions or missiles. Guidelines for the
conduct of LFT&E are provided in SECNAVINST 5000.2F.
The basic resourcing document for LFT&E is the TEMP, which is complemented by the LFT&E
Management Plan. The TEMP Part III will contain a separate section that charts the LFT&E
course of action during the acquisition process. The LFT&E section of Part III of the TEMP will
be developed by the DA under the cognizance of DOT&E and will include:
Description of the overall LFT&E strategy for the item
Critical LFT&E issues
Required levels of system vulnerability/lethality
Management of the LFT&E program
LFT&E schedule, funding plans, and requirements
Related prior and future LFT&E efforts
Evaluation plan and shot selection process
Major test limitations for the conduct of LFT&E.
LFT&E resource requirements (including test articles and instrumentation) will be
appropriately identified in the TEMP Part IV T&E Resource Summary. See chapter 5 for
TEMP details.
Within the Navy, LFT&E is primarily a developmental test responsibility since it is directly tied
to the fundamental platform design. COMOPTEVFOR's major interest is system vulnerability
and lethality and the associated impacts on the successful execution of mission tasks. The role of
the OTD in LFT&E will be to:
Review the LFT&E section of the TEMP.
Request a copy of the detailed LFT&E plan for review.
Monitor the LFT to obtain a firsthand impression of the vulnerability or lethality of the SUT.
Obtain a copy of the detailed LFT&E report for review.
APPENDIX C - ELECTRONIC MANAGEMENT SYSTEMS
C-1. INTRODUCTION
This appendix provides an overview of the Test and Evaluation Program System (TEPS) and also
discusses the shared drives and archiving of test documents.
C-2. TEPS
C-2.1
TEPS is a module within the COMOPTEVFOR Knowledge Management System (KMS) on the
unclassified LAN. (https://kms.cotf.navy.mil/home_auth/home.home_mis.home_main). TEPS is
a Web-based management tool designed to assist the OTD/OTC in the tracking and administration
of projects, Fleet services scheduling, and activity reports. Access to the TEPS database is limited
to members of OPTEVFOR. When a TEIN assignment letter is received from OPNAV (N942),
the new TEIN is entered in the TEPS database (see 01A) and the appropriate OPTEVFOR OTD
desk code is assigned. TEPS TEIN assignments are coordinated via the 01A deputy. When
required, a temporary local TEIN (3000-XXX) series may be assigned to programs that have not
yet been assigned a formal TEIN by N942. TEIN (4000-XXX) and (5000-XXX) series are
assigned for training and JCTDs, respectively. The TEPS User Guide is available for review on
the Y: drive as needed.
It is crucially important that TEPS be maintained by the OTD and periodically reviewed by SHs
and division deputy directors. TEPS is central to the management of programs and documents at
OPTEVFOR, and is the primary source of information for the COMOPTEVFOR Annual Report.
Failure to maintain TEPS will result in inefficiencies and delays.
C-2.1.1 TEPS Requirements
C-2.1.1.1 Key Data Fields
Data fields that must be filled in prior to saving a project or phase page are marked with a red
asterisk. Table C-1 lists additional key data fields that are critical to program management and
require OTD/OTC focus.
Table C-1. Critical TEPS Fields
UNCLASSIFIED
Data Field | Field Location | Comment
Short Title | Project Main | Programs may have multiple short titles. Include common abbreviations to assist the search for programs when the TEIN is unknown.
Status | Project Main | 1. “OPEN” - OPTEVFOR is expending resources (funding, OTD time attending meetings, conducting tests, writing reports, etc.). 2. “OPEN NO OT” - No involvement from OPTEVFOR is anticipated (program is fielded with no planned improvements). 3. “REC CNX” - Program has been or is being removed from the Fleet.
Test Status | Phase Main | The KMS Test Plan and Final Report Trackers check this field. Select from the following: COMP = Test is complete; End of Test message sent and all data received. DEFICIENT = Test not started, or stopped during test, due to programmatic issues. FUTURE = Phase of test beyond the next phase. INCOMP = Test event was attempted, but results were incomplete and another attempt for this phase is planned. INTEST = Test is in progress; start of test message has been released. NEXT = Next phase of testing planned. CNX = Phase was cancelled; selecting “CNX” removes the phase from all trackers.
Est. Start Date | Phase Main | Date the OTD thinks the test will start. The KMS Test Plan Trackers are based on this date.
Start Date | Phase Main | The date actual testing began. Should be the same as the start test message.
Estimated Last Test Event Date | Phase Main | The (planned or estimated) date for the last test event for this phase to end.
Last Test Event Date | Phase Main | The actual end date for the last planned test event for this phase of testing, regardless of data collection or data analysis. Last event used to gather data for this phase of test.
Est. End Date | Phase Main | Date the OTD thinks the test will end. The KMS Final Report Trackers use this date when End Date has not been filled in. Usually 30 days after the Last Test Event Date.
End Date | Phase Main | The date testing ended (to include data collection). The KMS Final Report Trackers are based on this date. Should be the same as the end of test message.
Test Result Code | Phase Main | After the final report is signed, select the appropriate option from the pull-down menu. Contact 01A in cases where the option is not clear.
Recommend Code | Phase Main | After the final report is signed, select the appropriate option from the pull-down menu.
Project COIs | Project COI | Ensure all COIs from the TEMP have been entered.
Phase COIs | Phase COI | Ensure all COIs for the phase have been entered. After the final report is signed, edit each COI to update the assessment or resolution, as appropriate. For RED or UNSAT COIs, a remark may be added to clarify the deficiency.
Major Deficiencies | Phase COI Edit for IOT&E and FOT&E Phases | After the final report is signed, select the appropriate number of major deficiencies associated with each COI.
FINAL REPORT | Phase Documentation, Final Report Edit | The KMS Final Report Trackers look for a completion date. Enter “Complete Date” with the date the report was signed. Enter “Doc Provided to Editors/Vault” with the date delivered to Editors (Unclass) or Vault (Classified); this alerts 01A to post the report to the Y-drive. Upon completion of upload to the Y-drive, 01A will enter “yes” in the “01A Uploaded Document” box, which removes the document from the tracker. For phases that do not have a final report, use the “NA” status to remove the phase from the Final Report Trackers.
TP SIG COMOPTEVFOR and TP SIG DOT&E | Phase Documentation, TP SIG COMOPTEVFOR and DOT&E Edit | The KMS Test Plan Trackers look for a completion date. TP SIG DOT&E is only required for DOT&E oversight programs. Enter “Doc Provided to Editors/Vault” with the date delivered to Editors (Unclass) or Vault (Classified); this alerts 01A to upload the Test Plan to the Y-drive. Upon completion of upload, 01A will enter “yes” in the “01A Upload Document” box. For non-oversight test plans, this removes the document from the tracker. For oversight test plans, the “Complete Date” must be entered in the “TP Sig DOT&E” box before the document comes off of the tracker. For phases that do not have test plans, use the “NA” status to remove the phase from the Test Plan Trackers.
FRAMEWORK SIG COMOPTEVFOR | Phase Documentation, FRAMEWORK SIG COMOPTEVFOR Edit | The KMS Framework Trackers look for a completion date. Enter “Complete Date” with the date the IEF was signed. Enter “Doc Provided to Editors/Vault” with the date delivered to Editors (Unclass) or Vault (Classified); this alerts 01A to upload the report to the Y-drive. Upon completion of upload, 01A will enter “yes” in the “01A Uploaded Document” box, which removes the document from the tracker. For phases that do not have an IEF (i.e., DT Assist), use the “NA” status to remove the phase from the Final Report Trackers.
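For illustration only, the Test Status entries above form a small controlled vocabulary. The following sketch, assuming a hypothetical offline script (it is not part of TEPS or KMS, and the class and function names are invented for illustration), shows one way such codes could be represented:

from enum import Enum

class TestStatus(Enum):
    """Hypothetical representation of the Test Status codes listed in table C-1."""
    COMP = "Test complete; end-of-test message sent and all data received"
    DEFICIENT = "Test not started, or stopped during test, due to programmatic issues"
    FUTURE = "Phase of test beyond the next phase"
    INCOMP = "Test event attempted; results incomplete and another attempt planned"
    INTEST = "Test in progress; start-of-test message released"
    NEXT = "Next phase of testing planned"
    CNX = "Phase cancelled; removed from all trackers"

def appears_on_trackers(status: TestStatus) -> bool:
    # Per table C-1, selecting CNX removes the phase from all trackers.
    return status is not TestStatus.CNX

print(appears_on_trackers(TestStatus.INTEST))  # True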
C-2.1.1.2 Shared Drives
C-2.1.1.2.1 K-Drive
The K: drives on the unclassified and classified LANs are shared drives that support access to and
storage of T&E documents. The drives are organized by division, each division is organized by
section, and each section is organized by office code. While each division may set its own
requirements, at a minimum, the K: drive folders for individual programs should be structured
according to the following guidelines.
C-2.1.1.2.2 Program Folder
Program folders should be named with the TEIN and short name (e.g., K:\40\41\0371-03 CBASS).
Each program folder should have subfolders for the following, as required:
Each phase of test
Requirements documents
Framework
Funding
TEMP
Phase of Test / Test Period
Within program folders, each phase of test should have its own folder using the name of the phase
(e.g., K:\50\54\541\0201-08 EA-18G\OT-B1). Each phase of test should have folders for the
following documents:
Briefs
Messages
Final report
Test plan
Documents
Test documents are built and stored in the K: drive document folders until they are finally signed.
Once signed, most of these documents are archived elsewhere. However, as discussed below,
there are certain documents that are not archived elsewhere, and therefore should be retained in
the K: drive document folders.
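As an illustration only of the folder conventions described above (this is a sketch, not an OPTEVFOR tool; the division path, TEIN, short name, and phase below simply reuse the examples given in this section), a program folder skeleton could be generated as follows:

from pathlib import Path

PROGRAM_SUBFOLDERS = ["Requirements documents", "Framework", "Funding", "TEMP"]
PHASE_SUBFOLDERS = ["Briefs", "Messages", "Final report", "Test plan"]

def build_program_skeleton(division_root: Path, program: str, phases: list[str]) -> None:
    """Create the program folder, its standard subfolders, and one folder per phase of test."""
    program_dir = division_root / program
    for name in PROGRAM_SUBFOLDERS:
        (program_dir / name).mkdir(parents=True, exist_ok=True)
    for phase in phases:
        for name in PHASE_SUBFOLDERS:
            (program_dir / phase / name).mkdir(parents=True, exist_ok=True)

# Hypothetical example mirroring the "TEIN short-name" naming convention.
build_program_skeleton(Path(r"K:\40\41"), "0371-03 CBASS", ["OT-B1"])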
C-2.1.1.3 Archiving of Documents
Test documents are archived in the Y:\00\Signed Test Documents folders on NIPR and SIPR. In
addition to Test Plans, Reports, and IEFs, the Y:\00\Signed Test Documents folders are to be used
for archiving other official COMOPTEVFOR documents (i.e., TEMPs, M&S VV&A Plans, M&S
Requirements Letters, M&S Accreditations, other official correspondence) signed at the Division
Director or higher levels.
Additionally, the division’s K: drive folders may be used to archive test documents.
C-2.1.1.4 Document Distribution
The eKM distribution portal was stood down in May 2017. This “pull” distribution has been
replaced with a “push” approach. Divisions will send final signed documents to all stakeholders
as attachments to email.
C-2.1.1.4.1 TEPS Archiving
Any unclassified document can be uploaded to TEPS, which may be used to archive documents
not stored on the Y-drive, to include:
TEMP comment letter
Deficiency report message
Anomaly report message
MOAs
OT commencement message (start test message)
OT completion message (end of test message)
Requirements documents (ICD, CDD, CPD, ORD, etc.)
C-2.1.1.4.2 K: Drive Archiving
The K: drive is the only location for archiving certain classified documents that are not posted
to the Y: drive and cannot be stored in TEPS due to classification. This includes the following
classified documents:
TEMP comment letter
Deficiency report message
Anomaly report message
MOAs
OT commencement message (start test message)
OT completion message (end of test message)
Requirements documents (ICD, CDD, CPD, ORD, etc.)
APPENDIX D - SQUADRON COORDINATION
D-1. Purpose
The goal of this procedure is to ensure effective coordination between COMOPTEVFOR HQ and
OT squadrons, and efficient routing of VX/VMX/HMX squadron documents during test planning,
test execution and final report development.
D-2. Overview
Communication and coordination between COMOPTEVFOR HQ elements (50 Division, 01B,
and 01C) and the VX/VMX/HMX squadrons for all OT-related products is required from cradle
to grave. Regardless of supported and supporting relationships, coordination is critical to ensure
all aspects of OT (to include targets, Fleet schedules, ranges, etc.) are planned and available for
test execution. Coordination must be early and continuous to ensure transparency across the
command and with external stakeholders. While each program is unique, the following is the
general coordination guidance for each product:
o IEF: 50A/B/OTC coordinates with DOT&E/Sponsor/PM.
o TEMP: 50A/B/OTC coordinates with DOT&E/Sponsor/PM.
o Test Plan:
   o Squadron coordinates with Sponsor/PM.
   o 50A/B/OTC coordinates with DOT&E.
o Report:
   o Squadron coordinates with Sponsor/PM.
   o 50A/B/OTC coordinates with DOT&E.
The supported/supporting relationship between the OT squadrons and COMOPTEVFOR HQ is
based on the document in development. For IEF, M&S Accreditation and TEMP development,
OPTEVFOR (50 division) is supported and the OT squadrons are supporting. For Test Plan/DCP
and Final Report development, the OT squadrons are supported and OPTEVFOR (50 and 01C) is
supporting.
D-3. Roles and Responsibilities
D-3.1 OTD
The OTD’s primary responsibility is to ensure all necessary operational and T&E expertise is
engaged, and sufficient statistical and analytical rigor is employed to conduct a thorough test and
to produce a clear and accurate test report. The OTD is the squadron’s Subject Matter Expert
(SME) for his program(s). The OTD is responsible to the squadron Commanding Officer (CO)
for the substance of all test plans and test reports. The OTD is responsible for the proper
management of all program funds, and for all phases of test planning, approval, execution,
analysis, and reporting. The OTD is accountable for communicating with the program offices and
other external agencies, as appropriate. OTDs may be assigned a variety of support staff, including
military or government civilian assistant OTDs or contracted support, as needed.
OTC
The OTC is a position assigned in the Air Warfare Division (50). The OTC coordinates the efforts
between the OTD, who is located in the squadron, and the division Section Head, DACOS, and
ACOS in 50 Division, as well as any other COMOPTEVFOR HQ entities that support the OT
squadrons (01A/B/C/D, Comptroller, etc.). The OTC is the COMOPTEVFOR HQ SME for his
programs and is responsible to the ACOS for the substance of all test documentation and situational
awareness of SUT performance/issues during test planning through reporting. Unique to the
OTD/OTC relationship, the OTC has the following responsibilities:
Manage all interaction with DOT&E in coordination with the OTD.
Coordinate all program funding with the program office to ensure that the OTD has adequate
funding to execute test.
Submit official Fleet resource requirements in support of test execution.
Coordinate and schedule all COMOPTEVFOR HQ briefs and DOT&E briefs that require
squadron support.
Provide consolidated COMOPTEVFOR HQ Comment Resolution Matrix (CRM) to OTD
following HQ review of squadron created documents.
Produce Start Test and End Test messages based on OTD input.
Support the squadron in staffing squadron documents through COMOPTEVFOR HQ.
Maintain awareness of all aspects of SUT performance/program status during test execution.
01C Action Officer (AO)
01C Test Planning and Evaluation is responsible for the analytical rigor applied to all test planning
documents and reports across the Force. It supports the development of all test plans, reports, and
supporting modeling and simulation documents. The 01C AO supporting the squadrons is the
process SME for test planning, execution, and report writing.
50 Division ACOS
The 50 Division ACOS is responsible for being the primary interface with O-6 PMs during IEF
and TEMP development and, during all phases of program development, with DOT&E Deputy
Directors and AOs. The ACOS/DACOS ensure that all Division products are ready for Flag-level
review and/or signature. The ACOS ensures COMOPTEVFOR representation at high-visibility
test events and at all Operational Test Readiness Reviews (OTRR)/mission control panels,
Working Integrated Product Team (WIPT) executive level meetings, and DOT&E Concept of Test
Briefs.
Squadron CO
The squadron CO is responsible for primary interface with O-6 PMs during Test Execution. The
squadron CO/COTD/ACOTD ensures that all squadron products (Test Plans and Final Reports)
are ready for Flag-level review. The squadron CO may represent COMOPTEVFOR at high-
visibility test events and OTRR/mission control panels, WIPT executive level meetings, and
DOT&E concept of test briefs.
D-4. Document Development and Staffing
D-4.1 IEF
50 Division owns; OTC initiates; OTD SME supports; Squadron CO/COTD/ACOTD
involvement; review and concurrence achieved through the E-IPR process; 50 Division
writes/staffs, gets 00 approval.
TEMP Inputs
50 Division owns; OTC initiates; OTD/ACOTD support for resources; Squadron CO visibility
during O-6 review to ensure adequacy of resources; 50 Division writes/staffs, gets 00 approval.
Test Plan
o For oversight EOA/OA/QRA/IOT&E/VCD/FOT&E test plans and IT DCPs: Squadron
owns; OTD initiates; OTC/01C/D supports; Squadron creates/edits; CO-approved draft
Word version sent to OTC for routing (for HMX-1/VMX-1 Word version editors
included in staffing process).
o For non-oversight EOA/OA/QRA/IOT&E/VCD/FOT&E test plans and IT DCPs, which
are released at the O-6 level: letterhead is required; therefore, 50A will review after
squadron CO approval and then release. 50A will inform 00/00D of impending approval of
non-oversight test plans and will provide them for 00/00D review if directed.
o Details of OTC/01C supporting role: During test plan development, when squadron is
ready for OPTEVFOR review (prior to squadron CO approval), OTD will send copy of
document to OTC who will forward to 01B/C/D. OTC and 01B will review document
for adherence to IEF test design. OTC and 01C will review for format, adequacy of data
collection and analysis plan, and adherence to approved test planning and execution
processes. OTC and 01B/C/D will review comments together, and OTC will provide a
consolidated CRM to OTD for further TP development and staffing through the squadron
CO.
o When CO-approved document has been reviewed by COMOPTEVFOR HQ front office,
OTC will incorporate comments into CRM for correction to document. OTD will
adjudicate CRM and return updated document and adjudicated CRM to OTC for staffing
to front office.
Reports
o Includes DT assist LOO, EOA/OA/IOT&E/VCD/FOT&E reports, MUA reports, and
QRA reports. Squadron owns; OTD initiates; OTC/01C/D supports; Squadron
creates/edits; 50A/B Code involvement during AWG, SERB and ESERB; CO-approved
draft Word version sent to OTC for routing (for HMX-1/VMX-1 word version editors
included in staffing process). For DT assist LOO requiring COMOPTEVFOR letterhead:
Squadron will e-mail final product to 50 Division as a Word document to print on
letterhead and acquire approval signature. 50A will inform 00/00D of impending
approval of LOOs and will provide them for 00/00D review if directed.
Details of OTC/01C Supporting Role
o In preparation for AWG, OTD will provide draft data appendix to OTC/01C. 01C
reviews and provides CRM to OTC.
o In preparation for SERB, OTD will provide draft B/G sheets and COI results paragraphs
to OTC/01C. OTC will review documents for content. 01C will review for format and
adherence to approved test reporting policies. OTC and 01C will review comments
together and OTC will provide a consolidated CRM to OTD for
incorporation/adjudication prior to SERB.
When CO-approved Final Report has been reviewed by COMOPTEVFOR HQ front office, OTC
will incorporate comments into CRM for correction to document and send to squadron. OTD will
adjudicate CRM and return updated document and adjudicated CRM to OTC for staffing to front
office.
Modeling and Simulation Documents
Includes M&S Requirements Letter, M&S Accreditation Plan, and M&S Accreditation report. 50
Division owns; OTC initiates; OTD/COTD/ACOTD supports. 50 division writes/staffs, gets 00
approval.
Other Communication
All communication between COMOPTEVFOR HQ and squadron during document development
must include all three stakeholders to maintain situational awareness (OTD, OTC and 01C).
CRM
All COMOPTEVFOR HQ CRMs, with squadron adjudication included, will be routed with final
document.
APPENDIX E - TEST AND EVALUATION
STAKEHOLDERS
E-1. INTRODUCTION
In addition to the relationships discussed in chapter 2, there are a number of other organizations
the operational tester will likely encounter in the design, planning, execution, and reporting of
operational tests. Since most are aligned by warfare area, the stakeholder list is provided by
warfare division.
E-2. UNDERSEA WARFARE
E-2.1 PEO Submarines
Focuses on the design, construction, delivery, and conversion of submarines and advanced
undersea and anti-submarine systems, including Special Operations Forces delivery systems;
submarine rescue systems; torpedoes; towed acoustics sensors; and unique submarine sonar,
control, imaging and electronic warfare systems.
E-2.2 PEO Columbia
Focuses on the design, construction, and delivery of the Columbia-class fleet ballistic missile
submarine.
E-2.3 PEO Ships
Manages acquisition and complete life-cycle support for all U.S. Navy non-nuclear surface ships.
These ships range from frontline combatants to amphibious ships that transport Marines and their
equipment to supply and replenishment cargo ships.
E-2.4 PEO Unmanned and Small Combatants (PEO USC)
Designs, develops, builds, maintains and modernizes the Navy's expanding family of unmanned
maritime systems, mine warfare systems, and Small Surface Combatants by employing the full
arsenal of acquisition authority to develop and deliver innovative solutions and technologies.
E-2.5 PEO for Command, Control, Communications, Computers, and Intelligence (PEO
C4I)
PEO C4I provides integrated communications and information technology systems that enable
information dominance and the command and control of maritime forces. PEO C4I is the ISIC
responsible for C4I-related Program Management Warfare Offices (PMW).
E-2.6 PEO, Integrated Warfare Systems (PEO IWS)
Develops, delivers and sustains operationally dominant combat systems for Ships and Submarines.
Integrated Warfare Systems, Undersea Systems (IWS 5) Program Manager
Oversees the design and development of the Submarine Combat Systems and CRUDES ASW and
CRUDES ASW and Mission Planning segments. (ARCI, BYG-1, SQQ-89, USW DSS)
Program Office (PMS 325)
The Support Ships, Boats and Craft Program Office (PMS 325) within PEO Ships delivers
integrated ship, boat and craft products and services to U.S. and international maritime forces
around the world. (TAGOS(X))
Program Office (PMS 394)
Acquisition and life cycle support for advanced undersea systems, SEAWOLF Class submarines
including USS JIMMY CARTER (SSN 23), Deep Submergence Vehicle (DSV) ALVIN,
Universal Launch and Recovery Module (ULRM), and related Research and Development (R&D)
systems. (Tetra)
E-2.10 Columbia Class Program Office (PMS 397)
Is conducting the design, construction, and delivery of the next-generation Sea-Based Strategic
Deterrence submarine (SSBN). (CLB Class)
E-2.11 Submarine Acoustic Systems Program Office (PMS 401)
Develops and implements the Warfare System Modernization Plan that defines future upgrades,
systems standards, and interface definitions for development of submarine acoustic systems to
include towed arrays and the AN/BQQ-10(V) Sonar System. (BQQ-10, LVA)
E-2.13 Undersea Weapons Program Office (PMS 404)
Oversees the research, development, construction, and modernization of all undersea weapons,
including the Mk 54 lightweight torpedo employed aboard surface ships and aircraft and the Mk 48
ADCAP/CBASS heavyweight torpedoes employed aboard submarines. (Mk 48, Mk 54)
E-2.14 Unmanned Maritime Systems Program Office (PMS 406)
Chartered to develop, acquire, deliver, and support operationally effective, integrated Unmanned
Maritime Systems (UMS) for the war fighter and to direct UMS experimentation and technology
maturation efforts. (ORCA, Snakehead, Razorback)
Undersea Defensive Warfare Systems Program Office (PMS 415)
Conducts research, development, and construction of submarine defensive systems, including
noisemakers and anti-torpedo torpedoes. (CAT, TWS, NIXIE, SubTDS)
Submarine Combat System Program Office (PMS 425)
Develops and acquires the combat and weapons control systems to include the AN/BYG-1(V)
Combat System for both in-service and new construction ships. (BYG-1, SLUAS)
E-2.17 Submarine Sensor Systems Program Office (PMS 435)
Designs, develops, and oversees the construction of EW Systems, periscopes, and the Photonics
Mast. (CSIS, BLQ-10).
Virginia Class Program Office (PMS 450)
Oversees the design, construction, and delivery of the United States’ newest attack
submarine. (VA Class)
Maritime Surveillance Systems Program Office (PMS 485)
Oversees the design and development of the SURTASS ASW Program. (SURTASS, SURTASS-
E, DSS DWP)
Mine Warfare Program Office (PMS 495)
Chartered to develop, deliver, field, and sustain enduring MIW systems including unmanned
systems. (HAMMERHEAD)
Tactical Networks Program Office (PMW 160)
The Tactical Networks Program Office provides affordable, interoperable, and secure net-centric
enterprise capabilities to the Navy, joint, and coalition warfighters. (SubLAN)
Undersea Integration Program Office (PMW 770)
The Undersea Integration Program Office delivers integrated and interoperable C4I capabilities
and support to the Navy by connecting the undersea architecture of manned and unmanned systems
and undersea vehicles. (OE538, LBUCS, CSRR)
E-2.21 Commander Submarine Forces, Commander Submarine Force Atlantic
Commander, Submarine Forces, is the undersea domain lead, and is responsible for the submarine
force's strategic vision. As commander, Submarine Force Atlantic commands all Atlantic-based
U.S. submarines, their crews and supporting shore activities. These responsibilities also include
duties as commander, Task Force (CTF) 114, CTF 88, and CTF 46. As commander, Allied
Submarine Command, provides advice to the North Atlantic Treaty Organization Strategic
Commanders on submarine related issues.
Commander Submarine Force, U.S. Pacific Fleet
Commander Submarine Force, U.S. Pacific Fleet is the principal advisor to the Commander, U.S.
Pacific Fleet for submarine matters. The Pacific Submarine Force (SUBPAC) includes attack,
ballistic missile and auxiliary submarines, submarine tenders, floating submarine docks, deep
submergence vehicles and submarine rescue vehicles throughout the Pacific. The Force’s mission
is to provide the training, logistical plans, manpower and operational plans and support and tactical
development necessary to maintain the ability of the Force to respond to both peacetime and
wartime demands.
Undersea Warfighting Development Center (UWDC)
UWDC integrates undersea CONOPS, TTP, theater level Command and Control of ASW forces,
and prepares submarine crews to conduct assigned advanced missions and all combat
missions. UWDC in concert with NWDC develops, validates, publishes, and revises TTP for
submarine and undersea warfare to include the Integrated Undersea Surveillance Systems.
Naval Surface and Mine Warfighting Development Center (SMWDC)
SMWDC’s mission is to increase the lethality and tactical proficiency of the Surface Force across
all domains. SMWDC provides individuals, ships, and staffs tactical sets and reps to increase
lethality and tactical proficiency and provides operational and subject matter expert support to
ships, squadrons, strike groups, independent deployers, Numbered Fleet Commanders, Naval
Component Commanders, and Combatant Commanders through direct, reach-back, BMD mission
package, or fly-away team support, as needed.
Naval Undersea Warfare Center (NUWC) Keyport Washington and NUWC
Detachment Pacific
Supports system design and integration, test development, execution, and analysis for submarine
and surface ship combat systems, sonar systems, and torpedoes. (BQQ-10, BYG-1, SQQ-89, Mk
48, Mk 54, HAMMERHEAD, CRAW, UUVs)
Naval Undersea Warfare Center (NUWC) Newport Rhode Island
Supports system design and integration, test development, execution, analysis, and modeling and
simulation efforts for submarine and surface ship combat systems, sonar systems, and
torpedoes. (CLB Class, VA Class, BQQ-10, BYG-1, SQQ-89, Mk 48, Mk 54)
E-2.27 Johns Hopkins University Applied Physics Laboratory (JHU/APL)
A Federally Funded Research and Development Center (FFRDC). Supports advanced test and test
data analysis for submarine and surface ship combat systems and sonar systems to include in-lab
playback of test event recordings on tactical systems. (CLB Class, VA Class, SURTASS, BQQ-
10, BYG-1, SQQ-89)
Applied Research Laboratory, Penn State University (ARL/PSU)
ARL/PSU is an integral part of the University. Originally focused on undersea weapons
technology development, ARL now includes a broad research portfolio addressing the needs of
various sponsors. As a Department of Defense (DoD) designated University Affiliated Research
Center (UARC), ARL conducts essential research, development, and systems engineering in
support of our nation's priorities, free from conflict of interest or competition with industry. (CLB
Class, VA Class, BQQ-10, BYG-1, SQQ-89, Mk 48, Mk 54)
Applied Research Laboratory, University of Texas (ARL/UT)
Applied Research Laboratories, the University of Texas at Austin (ARL/UT), is a Department of
Defense University-Affiliated Research Center (UARC). Since 1945, ARL/UT has been engaged
in sponsored research dedicated to improving our national security through applications of
acoustics, electromagnetics, and information sciences. (CLB Class, VA Class, BQQ-10, SQQ-89)
E-3. AIR WARFARE
E-3.1 Air Test and Evaluation Squadron ONE (VX-1)
The primary mission of VX-1 is to conduct tests, evaluations, and investigations of antisubmarine
and anti-surface aircraft weapons systems, airborne early warning aircraft systems, airborne
strategic weapons system, support systems, equipment, and materials in an operational
environment. The squadron also develops, reviews, and disseminates new ASW/SUW tactics and
procedures for Fleet use, serving as the model manager for all Air ASW/SUW tactical publications.
The squadron is administratively assigned to Commander, Naval Air Force, Atlantic.
Air Test and Evaluation Squadron NINE (VX-9)
VX-9 is charged with the testing and evaluation of weapons and their related systems for the F/A-
18 and AV-8B families of aircraft. The squadron is administratively aligned under Commander,
Naval Air Force, U.S. Pacific Fleet.
Marine Aviation Weapons and Tactics Squadron One (MAWTS-1)
Conducts training for aviation units, most notably the Weapons and Tactics Instructor (WTI)
course at Marine Corps Air Station Yuma.
Marine Helicopter Squadron ONE (HMX-1)
Is a United States Marine Corps helicopter squadron responsible for the transportation of the
President of the United States, Vice President, Cabinet members, and other VIPs. In addition to
its VIP transport role, it is also tasked with operational test and evaluation of Presidential transport
helicopters. The squadron is under the administrative control of the Deputy Commandant for
Aviation. Routine operational control is under the White House Military Office. Operational
testing is executed under the direction of COMOPTEVFOR, when required.
Marine Operational Test and Evaluation Squadron One (VMX-1)
VMX-1 is an independent test organization conducting operational test and evaluation of assigned
USMC helicopter, attack helicopter, and tilt rotor aircraft under the direction of COMOPTEVFOR.
The squadron is under the administrative control of the Deputy Commandant for Aviation with the
charter to:
Address future requirements.
Build an operational tactics guide.
Develop tactics, techniques, and procedures.
Sponsor tiltrotor issues and concepts of employment.
Air Test and Evaluation Squadron THREE ONE (VX-31)
Developmental Test and Evaluation squadron based at Naval Air Weapons Station China Lake,
CA, falls under Naval Test Wing Pacific (NTWP). Responsible for testing manned and unmanned
aircraft, air weapons, and air weapon systems.
Commander, Naval Air Forces (CNAF)
Is dual-hatted as Commander, Naval Air Force, Pacific (COMNAVAIRPAC) and is the aviation
TYCOM for all United States Navy naval aviation units. CNAF is responsible for the material
readiness, administration, training, and inspection of units/squadrons under its command, and for
providing operationally ready air squadrons and aircraft carriers to the Fleet. COMNAVAIRPAC
exercises administrative control of VX-9.
Commander, Naval Air Force, Atlantic (COMNAVAIRLANT)
Is the aviation TYCOM for the United States Atlantic Fleet naval aviation units. AIRLANT is
responsible for the material readiness, administration, training, and inspection of units/squadrons
under its command, and for providing operationally ready air squadrons and aircraft carriers to the
Fleet. COMNAVAIRLANT exercises administrative control of VX-1.
Commander, Naval Air Systems Command (NAVAIR)
Provides material support for aircraft and airborne weapon systems for the United States Navy and
Marine Corps. Serves as the ultimate technical authority for all U. S. Naval aircraft.
Naval Aviation Warfighting Development Center (NAWDC)
NAWDC trains Navy Air Forces in advanced TTP across all combat mission areas at the
individual, unit, and integrated levels ensuring alignment of the training continuum; develops,
validates, standardizes, publishes, and revises TTPs; provides operational and subject matter
expertise support to Strike Group Commanders, Numbered Fleet Commanders, and Combatant
Commanders.
Naval Air Warfare Center, Aircraft Division (NAWCAD)
Is an organization within the Naval Air Systems Command (NAVAIR), (aligned under AIR 4.0 –
the NAWCAD Commander serves as Assistant Commander for Research and Engineering)
focused primarily on aircraft development and testing for the DoN. NAWCAD supports major
aspects of aircraft developmental testing including aircraft performance, flying qualities,
electromagnetic compatibility, and carrier suitability. NAWCAD serves as ISIC for Naval Test
Wing Atlantic and the Training Systems Division (Orlando, FL).
VX-20
Developmental Test and Evaluation squadron based at NAS Patuxent River, MD, falls under Naval
Test Wing Atlantic (NTWL). Responsible for testing fixed-wing aircraft and aircraft systems, to
include systems for the P-3, P-8, E-2, C-2, C-130, E-6, T-6, and T-34 aircraft.
HX-21
Developmental Test and Evaluation squadron based at NAS Patuxent River, MD, falls under Naval
Test Wing Atlantic (NTWL). Responsible for testing rotary-wing aircraft and aircraft systems, to
include systems for the H-1, H-3, H-46, H-53, H-57, H-60, MQ-8B, and V-22 series aircraft and
UAVs.
VX-23
Developmental Test and Evaluation squadron based at NAS Patuxent River, MD, falls under
Naval Test Wing Atlantic (NTWL). Responsible for testing fixed-wing tactical aircraft and
aircraft systems, to include systems for the F-18, EA-6B, and T-45 series aircraft.
Naval Air Warfare Center, Weapons Division (NAWCWD)
Is an organization within the Naval Air Systems Command (NAVAIR), (aligned under AIR 5.0 –
the NAWCWD Commander serves as the Assistant Commander for Test and Evaluation) focused
primarily on EW and weapons development and testing for the DoN. NAWCWD also hosts
significant science and technology activity for aviation systems. NAWCWD has two locations in
Southern California: China Lake, hosting the land test range, and Point Mugu, hosting the sea test
range. NAWCWD serves as ISIC for Naval Test Wing Pacific.
Air Force Operational Test and Evaluation Center (AFOTEC)
Is a direct reporting unit of Headquarters, United States Air Force. It is the Air Force operational
test agency responsible for testing new systems being developed for Air Force and multiservice
use. AFOTEC employs a detachment construct for the execution of operational testing.
Detachment 1 (Edwards AFB, CA)
Lead agency for accomplishing Block 2 and 3 Initial Operational Test and Evaluation of the F-35
Lightning II for the U.S. Air Force, U.S. Navy, U.S. Marine Corps, United Kingdom Ministry of
Defense, and the Royal Netherlands Air Force. Leads the Joint Operational Test Team.
Detachment 2 (Eglin AFB, FL)
Evaluates operational system(s) mission capability, effectiveness, and suitability for Air Force and
multiservice users. Primarily focused on weapons and weapon system testing.
Detachment 4 (Peterson AFB, CO)
Operationally tests space, missile, and missile defense capabilities.
Detachment 5 (Edwards AFB, CA)
Operationally tests aircraft systems.
Detachment 6 (Nellis AFB, NV)
Plans and conducts operational test and evaluation of fighter aircraft.
E-4. INFORMATION WARFARE
U.S. Fleet Cyber Command
Directs Navy cyberspace operations globally to achieve military objectives in and through
cyberspace. Organizes and directs Navy cryptologic operations worldwide and supports
information operations and space planning and operations as directed. Executes cyber missions as
directed. Operates, maintains, secures, and defends the Navy’s portion of the Global Information
Grid. Delivers integrated cyber, information operations, cryptologic, and space capabilities.
Assesses Navy cyber readiness; manages the Man, Train and Equip (MT&E) functions associated
with Navy Component Commander (NCC) for U.S. Cyber Command and Service Cryptologic
Commander (SCC) responsibilities.
Commander, TENTH Fleet (C10F)
Numbered Fleet Commander for Fleet Cyber Command; exercises operational control of assigned naval forces to coordinate with other naval, coalition, and Joint Task Forces to execute the full spectrum of cyber, EW, information operations, and signals intelligence capabilities and missions across the cyber, electromagnetic, and space domains.
Naval Information Forces (NAVIFOR)
NAVIFOR is the C5I Type Commander. It is responsible for Fleet Readiness, C5I Modernization and Sustainment, Cybersecurity, Information Technology Efficiencies and Improvement Programs, and training for the C5I workforce. The Naval OPSEC Support Team (NOST), located at NAVIFOR, has been designated the Naval (Navy and USMC) OPSEC Support Element, providing OPSEC support throughout the Fleet.
Naval Information Warfare Systems Command (NAVWAR)
NAVWAR identifies, develops, delivers and sustains information warfare capabilities in support
of naval, joint, coalition and other national missions.
Naval Information Warfighting Development Center (NIWDC)
NIWDC develops, validates, standardizes, publishes, and revises advanced Information Warfare training and tactics, techniques, and procedures (TTP).
Information Warfare Training Group (IWTG)
IWTG provides support for Navy unit IW assessments and certification, and trains the fleet in IW mission TTPs at the unit level.
Navy Cyber Defense Operations Command (NCDOC)
NCDOC executes the Navy's Defensive Cyber Operations. NCDOC is the only Navy organization
tasked with providing penetration assessment services for C4I systems.
Naval Network Warfare Command (NETWARCOM)
NETWARCOM executes tactical-level command and control to direct, operate, maintain, and secure Navy communications and network systems for Department of Defense Information Networks, and leverages Joint Space capabilities for Navy and Joint operations.
Program Executive Office for Command, Control, Communications, Computers,
and Intelligence (PEO C4I)
PEO C4I provides integrated communications and information technology systems that enable
information dominance and the command and control of maritime forces. PEO C4I is the ISIC
responsible for C4I-related Program Management Warfare Offices (PMW).
PMW 120
The Battlespace Awareness and Information Operations Program Office provides net-ready
intelligence, meteorological, oceanographic, and information operations products and services that
allow Sailors to correlate data from organic sensors and national sources, to gauge enemy
intentions, provide indications and warning (I&W), and determine operationally relevant information about the physical
environment.
PMW 130
The Information Assurance and Cybersecurity Program Office provides cybersecurity products
and services to ensure protection of Navy and joint information and telecommunications systems
from hostile exploitation and attack through cryptographic, network, and host-based security
products that provide for strong authentication, data integrity, confidentiality, nonrepudiation, and
availability of network resources and information.
PMW 146
The Navy Communications Satellite Program Office manages the acquisition for all DoD
Narrowband satellite systems. PMW 146 is the lead for the Mobile User Objective System (MUOS)
and also manages the maintenance for the Ultra High Frequency Follow-On (UFO) SATCOM
system.
PMW 150
The Navy Command and Control Program Office provides operational and tactical command and
control capabilities by integrating real-time and near real-time representations of tactical
situations, while including targeting support, chemical-biological warnings, and logistics support
for the Navy, Marine Corps, and joint and coalition warfighters.
PMW 160
The Tactical Networks Program Office provides affordable, interoperable, and secure net-centric
enterprise capabilities to the Navy, joint, and coalition warfighters.
PMW 170
The Communications and GPS Navigation Program Office provides satellite, line-of-sight, and
extended-line-of-sight communication systems for voice and data communications and GPS
capabilities for ship navigation, command and control systems, and weapons systems.
PMW 180
The Navy’s Program Manager for developing, acquiring, fielding, and sustaining integrated,
network-ready products and services, including intelligence, meteorology, oceanography, and
information operations.
PMW 740
The International C4I Integration Program Office delivers and integrates tailored, C4I releasable
systems to foreign partners through Foreign Military Sales, Foreign Military Financing, and other
DoD-funded international programs to enhance interoperability between the United States and its
strategic partners.
PMW 750
The Carrier and Air Integration Program Office delivers integrated and interoperable C4I
capabilities and support to our Navy's aircraft carriers, amphibious ships, command ships, and
aircraft by leading advanced planning for Fleet modernization and new construction ship C4I efforts.
PMW 760
The Ship Integration Program Office delivers integrated C4I capabilities to the Navy's unit and
group-level ships in new construction and as part of the Navy Modernization Plan.
PMW 770
The Undersea Integration Program Office delivers integrated and interoperable C4I capabilities
and support to the Navy by connecting the undersea architecture of manned and unmanned systems
and undersea vehicles.
PMW 790
The Shore and Expeditionary Integration Program Office delivers relevant, integrated, and
interoperable C4I capabilities and support to our Navy's shore and expeditionary forces through
modernization, acquisition, and system integration.
E-4.10 Program Executive Office for Digital and Enterprise Services (PEO Digital)
PEO Digital oversees a portfolio of enterprise-wide information technology programs designed to
provide standard IT capabilities to Sailors at sea, Marines in the field and their support systems.
PEO Digital ensures that these programs maximize value to warfighters by balancing costs with
the capability delivered to the end-user.
PMW 205 – Naval Enterprise Network (NEN)
Manages the Navy Marine Corps Intranet (NMCI), Next Generation Enterprise Network (NGEN), and BLII/ONE-NET. Manages the acquisition lifecycle of enterprise-wide networks while providing secure, seamless, and global computer connectivity for the Department of the Navy.
PMW 260 – Special Networks and Intelligence Mission Applications (SNIMA)
Manages the acquisition lifecycle of the Navy's shore-based Joint Worldwide Intelligence
Communications Systems (JWICS) IT domain.
PMW 270 – Navy Commercial Cloud Services (NCCS)
Develops and executes the Navy's overarching cloud brokerage structure.
PMW 280 – Special Access Programs (SAP)
Plans and manages integration of capabilities for DoN SAP enterprise IT networks.
PMW 290 – Enterprise IT Strategic Sourcing (EITSS)
Implements and manages IT agreements for the DoN, DoD and the Federal Government through
consolidating, centralizing, and streamlining IT acquisition.
E-4.11 Program Executive Office for Manpower, Logistics and Business Solutions (PEO
MLB)
PEO MLB oversees a portfolio of enterprise-wide information technology programs designed to
enable common business processes to Sailors at sea, Marines in the field and their support systems.
PEO MLB ensures these programs maximize value to the warfighter by balancing cost with the
capability delivered to the end user.
E-4.11.1 PMW 220
Navy Enterprise Business Solutions (EBS), PMW 220, is a portfolio program of IT solutions for
aligning the Navy's money, manpower, and materials. The Navy EBS portfolio includes both
Navy Enterprise Resource Planning (ERP) and the E-Business and Electronic Procurement
System (EPS), which modernize and standardize Navy business operations by providing
management visibility across the enterprise and increasing effectiveness and efficiency.
E-4.11.2 PMW 230
Delivers integrated IT logistics systems that provide cohesive and seamless high-performance
readiness capabilities for Marine Corps logistics operations.
E-4.11.3 PMW 240
Single information technology (IT) acquisition agent for N1 business operations, providing full
life cycle management to support the Navy's manpower, personnel, training and education
(MPT&E) initiatives. The Sea Warrior Program (SWP) portfolio includes MyNavy Portal (MNP), mobile apps, Navy
eLearning (NeL), distribution systems, Navy Pay and Personnel System (NP2) and Authoritative
Data Environment (ADE).
E-4.11.4 PMW 250
Develops and implements reliable, efficient and secure business information technology (IT)
solutions. E2S's portfolio includes the Department of the Navy Tasking, Records and Consolidated
Knowledge Enterprise Repository (DoN TRACKER), iNAVY, DoN IT Portfolio Repository/DoN
Application and Database Management System (DITR/DADMS), Naval Justice Information
System (NJIS), Joint Air Logistics Information System (JALIS), NAVY 311, Navy Information
Application Product Suite (NIAPS) and Risk Management Information (RMI).
E-4.11.5 PMS 444
Navy Maritime Maintenance Enterprise Solution (NMMES) Technical Refresh (TR) Program
Management Office (NMMES-TR) was established in 2016 to acquire, deliver, and deploy modern Information Technology (IT) systems to replace the current maritime shore maintenance solution, which has reached end of life.
E-5. SURFACE WARFARE
E-5.1 Board of Inspection and Survey (INSURV)
The Board of Inspection and Survey conducts acceptance trials of ships and service craft to determine the quality of construction and compliance with specifications and Navy requirements, to determine whether builder-responsible equipment operates satisfactorily during the guarantee period following acceptance, and to make recommendations on their acceptance by the Navy. The board conducts material inspections of all naval ships at least once every 3 years, if practical, to determine and report on a ship's fitness for further service and on material conditions that limit its ability to carry out assigned missions. Because testing of platforms and combat systems tends to be expensive, the cost efficiencies realized by conducting combined INSURV/COMOPTEVFOR test events whenever possible can be substantial.
Commander, Carrier Strike Group FOUR (CSG-4)/FIFTEEN (CSG-15)
CSG-4/CSG-15, along with subordinate commands Tactical Training Group Atlantic (TTGL) and
Pacific (TTGP), and Expeditionary Warfare Training Group Atlantic (EWTGL) and Pacific
(EWTGP), prepare every Carrier Strike Group (CSG), Amphibious Ready Group (ARG) and Task
Force Deployer for sustained, forward-deployed, high tempo operations. On behalf of Commander
USFF and Commander Pacific Fleet, CSG-4 and CSG-15, respectively, mentor, train and assess
deploying forces through planning and conducting immersive exercises at the operational and
tactical levels of war. CSG-4 assesses Atlantic Fleet and Forward Deployed Naval Forces - Europe
(FDNF-E) deployers.
CSG-15 assesses Pacific Fleet and FDNF Japan deployers. Additionally, CSG-4 is USFF's lead agent for Integration and Interoperability (I&I) reviews conducted to establish WCB kill chains, which subsequently inform COI selection consistent with the Navy's required operational capabilities and projected operational environments (ROC & POE).
Naval Surface and Mine Warfighting Development Center (NSMWDC)
NSMWDC trains Navy Surface Forces in advanced TTPs across all combat mission areas at the individual, unit, and integrated levels; develops, validates,
standardizes, publishes, and revises TTPs; and provides operational and subject matter expertise
support to Strike Group Commanders, Numbered Fleet Commanders, and Combatant
Commanders.
Military Sealift Command (MSC)
Mans and operates Fleet auxiliary vessels, such as the Joint High Speed Vessel (JHSV) and the Dry Cargo and Ammunition Ship (T-AKE). Responsible for maintenance and operations of all vessels assigned to the MSC, including Maritime Prepositioning Force (MPF) ships.
Missile Defense Agency (MDA)
The MDA is a research, development, and acquisition agency within the Department of Defense.
The Navy's program element of MDA is MDA/AB (PD-452), which coordinates the
developmental efforts of the Navy's afloat and shore Ballistic Missile Defense (BMD) systems.
The Navy's BMD systems are part of the larger Ballistic Missile Defense System (BMDS).
Naval Surface Warfare Center, Corona Division (NSWCCO)
One of two suppliers of Navy Working Capital Funded (NWCF) government civilian OTDs, AOTDs, and analysts, and one of the Surface Warfare Division's main data reduction and analysis agencies. As a third-party data collector, NSWCCO serves warfighters and program managers as an independent performance assessment agent throughout systems' life cycles by gauging the warfighting capability of the Navy's weapons and integrated combat systems, from the unit to the force level, through assessment of those systems' performance, readiness, quality, supportability, and the adequacy of training.
Naval Surface Warfare Center, Dahlgren Division (NSWCDD)
Home to several laboratories conducting R&D, as well as DT&E activities for programs covered
by Surface Warfare Division TEINs.
Naval Surface Warfare Center, Port Hueneme Division (NSWCPHD)
One of two suppliers of NWCF government civilian OTDs, AOTDs, and analysts.
PEO Integrated Warfare Systems (PEO (IWS))
Manages surface ship and submarine combat technologies and systems, and coordinates Navy
Open Architecture across ship platforms.
E-6. EXPEDITIONARY WARFARE DIVISION/LCS
E-6.1 PEO Unmanned and Small Combatants (PEO USC)
PEO Unmanned and Small Combatants designs, develops, builds, maintains and modernizes the
Navy's expanding family of unmanned maritime systems, mine warfare systems and small surface
combatants.
E-6.2 PMS 406 - Unmanned Maritime Systems
Chartered to develop, acquire, deliver, and support operationally effective, integrated Unmanned
Maritime Systems (UMS) for the war fighter and to direct UMS experimentation and technology
maturation efforts to meet the Fleet's capability needs. UMS comprises Unmanned Maritime Vehicles (UMV), which include both Unmanned Undersea Vehicles (UUVs) and Unmanned Surface Vehicles (USVs), and fully integrated sensors and payloads as necessary to accomplish the required missions.
E-6.3 PMS 420 – LCS Mission Modules
Develops and acquires the Surface Warfare (SUW), Anti-submarine Warfare (ASW), and Mine
Countermeasures (MCM) Mission Packages for installation on both Littoral Combat Ship (LCS)
variants, enabling LCS to perform a diverse portfolio of mission sets from a common base
platform.
E-6.4 PMS 495 - Mine Warfare
Delivers mine warfare capabilities to the warfighter. PMS 495 systems provide mining and mine
countermeasure capability from the beach zone out to deep water.
E-6.5 PMS 501 - Littoral Combat Ships
Responsible for pre-OWLD (Obligation Work Limiting Date) LCS Seaframes (12-18 months after
delivery).
E-6.6 PMS 505 - LCS Fleet Introduction & Sustainment
Responsible for post-OWLD LCS Seaframes (in-service).
E-6.7 IWS 8 – LCS Combat Systems Integration
Responsible for LCS combat systems integration.
E-6.8 PMS 340 (SEA-06 NSW)
Provides Program Management for Navy common service programs typically funded through Title 10 appropriations. The Milestone Decision Authority is SEA-06 or PEO Maritime at SOCOM.
E-6.9 PMS 408 (SEA-06 EXM)
Provides Program Management for Navy Expeditionary Warfare, Anti-Terrorism, Explosive
Ordnance Disposal, and CREW systems. The Milestone Decision Authority is SEA-06.
E-6.10 SOCOM PEO Maritime
Milestone Decision Authority for all SOF AT&L-managed programs in the maritime competency, to include small craft, submersibles, and diving systems. Their subordinate Program Managers provide development and program management support for Special Operations Forces-peculiar capabilities funded through MFP 11 appropriations.
E-6.11 SOCOM J-8O
Provides oversight for PEO Maritime programs funded through MFP 11 where the requirements
are managed through SOCOM J-8. Works closely with the OTAs in observing testing as they
receive the OTA test report and grant fielding and deployment release of all SOCOM-developed
technology.
E-6.12 Joint Service EOD Military Technical Acceptance Board
Composed of detachments from all four services, conducts testing and validation of EOD tools, techniques, and publications prior to authorization for EOD use and fielding.
E-6.13 Naval Mine and ASW Warfare Center of Excellence (NMAWC)
The warfighting center of excellence for Mine Warfare (MIW) and Antisubmarine Warfare (ASW), NMAWC focuses efforts across numerous resource sponsors, systems commands, research
laboratories, training organizations, and operational commands to ensure Navy-wide competency
in the MIW and ASW mission areas. NMAWC is the primary command through which issues
related to MIW and ASW are coordinated with tactical development agencies and commands.
E-6.14 Naval Surface Warfare Center Panama City Division
Conducts research, development, test and evaluation, and in-service support of mine warfare systems, mines, naval special warfare systems, diving and life support systems, amphibious/expeditionary maneuver warfare systems, and other missions that occur primarily in coastal (littoral) regions, and executes other responsibilities as assigned by Commander, Naval Surface Warfare Center.
E-6.15 Joint Program Executive Office Chemical-Biological-Radiological Defense (JPEO-
CBRD)
Provides single program executive responsibility for the acquisition, testing and fielding of new or
advanced chemical, biological or radiological (CBR) capabilities to the Joint services. Serves as
the Milestone Decision Authority for programmatic decisions and provides principal funding for
all test and evaluation requirements inclusive of OPTEVFOR support functions. Programs may
be delegated to Joint Program Managers (Protection, Sensors, Radiological/Nuclear, Medical
Countermeasures, Guardian and Information Management/Information Technology) as required
by system function.
E-6.16 Deputy Undersecretary of the Army for Test and Evaluation (DUSA-TE)
Exercises executive oversight responsibility for all test and evaluation activities in the Department of Defense when DOT&E is not involved. Provides policy, guidance, and certification for all chemical, biological, or radiological test infrastructure or laboratory capability. Leads the Test and Evaluation Capabilities and Methodologies Integrated Process Team (TECMIPT), which provides T&E Test Operating Procedures (TTOPs) as guidelines for developmental test administration, procedure, and disposition of data. DUSA-TE involvement extends to all CBR T&E activities regardless of Service affiliation or leadership.
E-6.17 Naval Surface Warfare Center, Indian Head Explosive Ordnance Disposal Technology Division (NSWC IHEODTD)
Conducts research, development, test and evaluation, and in-service support of Navy EOD explosive tools, robotics, CREW systems, and CBR capabilities. Serves as the Developmental Test and Evaluation organization for Navy CBR capabilities. Executes other responsibilities as assigned by Commander, Naval Surface Warfare Center.
E-6.18 Commander, Naval Special Warfare Command (SPECWARCOM)
Type Commander and Operational Commander for Navy Special Warfare (NSW). Responsible for requirements generation, and for concurrence on and clarification of requirements and OTA-derived data requirements. Responsible for all NSW tactics, techniques, and procedures, as well as concepts of operations and mission essential task lists.
E-7. OTHER ORGANIZATIONS
E-7.1 Assistant Secretary of the Navy (Research, Development and Acquisition)
Responsible for the research, development, and acquisition of Navy and Marine Corps platforms
and warfare systems. Relevant to the COMOPTEVFOR mission, supporting the ASN (RDA) are:
Principal Military Deputy - Principal military advisor to ASN (RDA) on all Navy and Marine
Corps acquisition matters.
Principal Civilian Deputy - Principal civilian advisor to ASN (RDA) on all Navy and Marine
Corps acquisition matters.
Deputy Assistant Secretary of the Navy (DASN) for Research, Development, Test and Evaluation (RDT&E) – principal advisor and policy coordinator for ASN (RDA) on all matters pertaining to Navy science, technology, advanced research and development programs; system prototype programs; and management of science and engineering. Via the DoN Chief Systems Engineer (CHSENG), provides engineering leadership and focus within the acquisition community to ensure the DoN delivers integrated and interoperable enterprise capabilities.
DASN Ships - Principal advisor to ASN (RDA) on all matters pertaining to aircraft carriers,
surface ships, and submarines, as well as associated weapon systems.
DASN Air - Principal advisor to ASN(RDA) on all matters pertaining to aircraft, cruise
missiles, air-launched weapons, airborne sensors, avionics, and support equipment.
DASN Information Warfare and Enterprise Services (DASN IWAR) – principal advisor to
ASN (RDA) for all matters related to C4I and space programs, enterprise information
technologies, business systems, enterprise information technology services and related
policies. Leading process, culture and architectural changes in support of a modern digital
operating model, DASN IWAR provides acquisition program guidance, oversight and
advocacy for PEO C4I & Space, PEO Digital and PEO MLB.
DASN Acquisition Policy and Budget (DASN AP&B) – coordinates Acquisition Policy, to include Earned Value Management and Acquisition Reporting; Planning, Programming, Budgeting, and Execution (PPBE); and Congressional Affairs.
E-7.2 Center for Naval Analyses (CNA)
A Federally Funded Research and Development Center (FFRDC) that provides analytical support to the Chief of Naval Operations, Fleet Commanders,
as well as subordinate operational commanders. There is a CNA representative assigned as an
advisor on the staff of COMOPTEVFOR. In addition, a second CNA representative supports the
DOT&E-funded Interoperability and Cybersecurity Program at OPTEVFOR and other CNA
representatives provide direct support to selected warfare divisions and squadrons.
E-7.3 Commander, U.S. Fleet Forces Command (USFF)
U.S. Fleet Forces Command supports both the Chief of Naval Operations and Combatant
Commanders worldwide by providing naval forces ready-for-tasking. The command provides
operational and planning support to Combatant Commanders and integrated warfighter capability
requirements to the CNO.
Additionally, U.S. Fleet Forces Command serves as the CNO's designated Executive Agent
for Antiterrorism/Force Protection (ATFP), Individual Augmentees (IA), and Sea Basing.
In collaboration with U.S. Pacific Fleet, USFF organizes, mans, trains, maintains, and equips
Navy forces, develops and submits budgets, and executes readiness and personnel accounts
to develop both required and sustainable levels of Fleet readiness. Additionally, the
command serves as the unified voice for Fleet training requirements and policies.
OPTEVFOR’s engagement with the USFF staff is generally through the N8 staff.
COMOPTEVFOR is the only outside commander to participate in the USFF Fleet
Introduction Program assessment process.
Together with the Commander, U.S. Pacific Fleet, the Commander, USFF nominates effects
chains for evaluation during the Warfare Capability Baseline Assessments.
E-7.4 Under Secretary of Defense for Research and Engineering (USD(R&E))
Serves as the Principal Staff Assistant and advisor to the Secretary of Defense and Deputy
Secretary of Defense for all research, engineering, and technology development activities and
programs in the DoD. Per Section 133a, Title 10, U.S.C. and DoDD 3134.AB, the USD(R&E):
Establishes policies and strategic guidance and leads defense research; engineering;
developmental prototyping and experimentation; technology development, exploitation,
transition, and transfer; DT&E; and manufacturing technology activities.
Prepares Milestone B (MS B) and Milestone C (MS C) DT&E sufficiency assessments on
those MDAPs where the Defense Acquisition Executive (DAE) is the milestone decision
authority (MDA).
Develops DT&E policy and ensures appropriate test facilities, test ranges, tools, and related
modeling and simulation capabilities are maintained within the DoD.
Serves as an advisor to the Joint Requirements Oversight Council on matters within
USD(R&E) authority and expertise to inform and influence requirements, concepts,
capabilities-based assessments, and concepts of operations.
Approves the DT&E plans within TEMPs. Delegates approval authority, as appropriate.
Develops governing policy and advances practices and workforce competencies for DT&E.
E-7.5 Director, Operational Test and Evaluation, Office of the Secretary of Defense
(OSD/DOT&E)
The Director is a Senate-confirmed Presidential Appointee who serves as the principal staff
assistant and senior advisor to the Secretary of Defense on OT&E in the DoD.
The DOT&E is responsible for issuing DoD OT&E policy and procedures; reviewing and analyzing the results of OT&E conducted for each major DoD acquisition program; providing independent assessments to the Secretary of Defense, the Under Secretary of Defense for Acquisition and Sustainment (USD(A&S)), and Congress; making budgetary and financial recommendations to the Secretary regarding OT&E; and providing oversight to ensure OT&E for major DoD acquisition programs is adequate to confirm the operational effectiveness and suitability of the defense system in combat use.
The staff is led by a Principal Deputy (career SES) and is supported by four Deputy Directors
and a Deputy for Live Fire Test and Evaluation as well as a Science Advisor.
The four Deputy Directors (Deputy Assistant Secretary of Defense equivalents) oversee the
following areas:
Air Warfare
Land and Expeditionary Warfare (including land-based rotary-wing aviation)
Naval Warfare (including Navy sea-based helicopters)
Net-centric and Space Systems/Ballistic Missile Defense (includes Defense Business
Systems)
The DOT&E also manages several other efforts not directly related to the Director's primary responsibilities. These include the Joint Test and Evaluation Program, managed by the Deputy Director for Air Warfare, and the Interoperability and CAP, managed by the Deputy Director for Net-centric and Space Systems (see appendix C for additional information on these programs).
E-7.6 Institute for Defense Analyses (IDA)
An FFRDC that provides analytical support to the Office of the Secretary of Defense. The
Operational Evaluation Division provides analytical support to the Director, Operational Test and
Evaluation.
E-7.7 Office of the Chief of Naval Operations
N94/ONR - Director, Innovation, Technology Requirements, and T&E. Dual-hatted as the Director of the Office of Naval Research and the Navy's T&E Executive. Determines Science and Technology (S&T) and T&E requirements. Establishes and promulgates Navy S&T and T&E requirements, and issues policy, regulations, and procedures governing S&T and T&E. Acts for CNO in resolving T&E requirements issues. Approves Test and Evaluation Strategies, Test and Evaluation Master Plans, and LFT&E Management Plans on behalf of the CNO.
N942 - T&E Division of N94. Coordinates warfare T&E programs, C4I/AIS T&E programs,
and T&E Modeling and Simulation.
N9SP - Special Access Programs Coordinator. Responsible for management of the DoN
Special Access Program (SAP) Central Office.
N2/N6 - Deputy CNO for Information Dominance. Responsible for functional integration of
intelligence, information warfare, information/network management, oceanography, and
geospatial information. Coordinates resource investments to deliver information-centric
capabilities and competitive advantages.
N2/6F - Director, Concepts, Strategies, and Integration. Serves as the Warfare Integration
Directorate (resource sponsor) validating requirements and provisioning program of record
systems across Navy equities in Communications and Networks (F1), ISR Capabilities (F2),
Electronic and Cyber Warfare (F3), and Decision Superiority (F4).
N9 - Deputy CNO for Warfare Systems. Responsible for optimizing Navy investments
through centralized coordination of Navy warfighting and warfighting support analysis and
assessments, Navy capability development and integration, joint and Navy requirements
development, and resources programming.
N9I – Responsible for warfare integration of the systems provided by N9 and N2/N6
resource sponsors.
N95 - Resource sponsor for naval expeditionary warfare missions and programs. Mission
areas include AMW, mine warfare, naval special warfare, expeditionary warfare (Explosive Ordnance Disposal (EOD)), and maritime expeditionary security force/naval coastal warfare.
N96 - Resource sponsor for surface combatants and command ships. Readiness, safety,
survivability, training, and preparation for war for above surface forces.
N97 - Resource sponsor for submarines, deep submergence systems, and undersea
surveillance systems and preparation for war for below surface forces.
N98 - Resource sponsor for aircraft carriers, specific aviation type ships, and naval aircraft,
and preparation for war for naval air forces.
Expeditionary Warfare Development Center (EXWDC)
EXWDC is a warfighting development center under the administrative control of Commander,
Naval Expeditionary Command. EXWDC provides training and subject matter expertise for
antiterrorism/force protection, CON, expeditionary warfare, and irregular warfare.
Commander, U.S. Pacific Fleet (COMPACFLT)
The mission of COMPACFLT is to protect and defend the maritime interests of the United States
in the Indo-Asia Pacific region. By providing combat-ready naval forces and operating forward
in global areas of consequence, COMPACFLT enhances stability, promotes maritime security and
freedom of the seas, defends the nation’s homeland, deters aggression, and when necessary,
conducts decisive combat action against the enemy. COMPACFLT collaborates with Commander
USFF to ensure optimum warfighting capacity and capability.
Joint Interoperability Test Command (JITC)
The JITC Operational Test and Evaluation (OT&E) Division (JT1) conducts operational testing of
Information Technology and National Security Systems acquired by the Defense Information
Systems Agency, other DoD organizations, and non-DoD entities to ensure operational
effectiveness, suitability, and security. JITC conducts the test and collects the data. JITC then
prepares an Operational Test and Evaluation Report (OTER), consistent with the test concept and
plan, and provides a copy to the appropriate offices of the Component and to DOT&E.
APPENDIX F - GLOSSARY
Acquisition Category (ACAT). Categories established to facilitate decentralized decision
making and execution and compliance with statutorily imposed requirements. The categories
determine the level of review, decision authority, and applicable procedures. The categories are ACAT I, ACAT II, ACAT III, and ACAT IV (ACAT IV is USN and USMC only).
Acquisition Program Baseline (APB). The PM initially develops the APB as a concept baseline
for the Milestone A (MS-A) decision point. A development baseline and a production baseline
are prepared for MS-B and -C. These baselines capture the threshold and objective values for the
minimum number of cost, schedule, and performance attributes (called "key performance parameters")
that describe the program over its life cycle.
(CJCSM 3170.01C)
Adjunct Tester. A person, not normally assigned to COMOPTEVFOR, who is appointed by
COMOPTEVFOR to assist in test execution and/or data collection for a particular phase of test.
Each adjunct tester will be required to execute the COMOPTEVFOR Adjunct Tester Form. The
template is found with Test Plan templates.
Advanced Concept Technology Demonstration (ACTD). An ACTD (formerly a Joint Concept
of Technology Demonstration (JCTD)) is a demonstration of the military utility of a significant new
technology and an assessment to clearly establish operational utility and system integrity. (CJCSI
3170.01G)
Advisory and Assistance Services. Technical support provided under contract by
nongovernmental sources, with outputs that take the form of information, advice, opinions,
alternatives, analyses, evaluations, recommendations, and training. (FAR 37.104)
Analytical Support. Support provided by military or civilian analysts, Navy laboratories, or
defense contractors to assist force personnel in data collection, reduction, and analysis in support
of OT&E.
Analysis. A verification method involving the use of recognized analytic techniques (including
computer models) to interpret or explain the behavior/performance of the system element.
Analysis of test data or review and analysis of design data should be used as appropriate to verify
requirements (Defense Acquisition Guidebook). See Verification.
Analysis of Alternatives (AoA). The evaluation of the performance, operational effectiveness,
operational suitability, and estimated costs of alternative systems to meet a mission capability.
The AoA assesses the advantages and disadvantages of alternatives being considered to satisfy
capabilities, including the sensitivity of each alternative to possible changes in key assumptions or
variables. The AoA is one of the key inputs to defining the system capabilities in the capability
development document. (CJCSM 3170.01C)
Assessment of Operational Capability (AOC). A brief report prepared after observing a DT phase or DT event(s) of an acquisition program to assess the operational capabilities of a System Under Test (SUT) prior to introducing/releasing it for Fleet/operational use when the program has no future phase of Operational Test (OT) planned.
Application Software. Consists of the computer program, firmware, and associated data that
implement the operational capabilities required for tactical weapon system employment; e.g.,
target tracking, navigation, avionics programs, and Built-In Test (BIT). A software change
required because of changed system performance requirements or new or redesigned hardware
shall be termed application vice support software.
Attribute. A quantitative or qualitative characteristic of an element or its actions. (CJCSM
3170.01C) For purposes of OT, “element” refers to the system under test.
Availability. A measure of the degree to which an item is in an operable and committable state at
an unknown (random) point in time. (DAU Glossary) In OT&E, Operational Availability (Ao) is the usual measure. (See Operational Availability.)
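For orientation only, Operational Availability is conventionally expressed as the ratio of uptime to total time. The exact definitions of uptime and downtime for a given program come from its requirements and test documentation, so the expression below is an illustrative sketch of the general form rather than a program-specific formula:
% Illustrative general form of Operational Availability (A_o); "Uptime" and
% "Downtime" are placeholders for the program-defined quantities.
\[
A_o = \frac{\text{Uptime}}{\text{Uptime} + \text{Downtime}}
\]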
Board of Inspection and Survey (INSURV) Responsibilities. INSURV is tasked with certain
responsibilities relating to RDT&E and the acquisition process. When tasked by CNO,
PRESINSURV will submit an individual technical assessment of readiness for OT&E to CNO and
COMOPTEVFOR for all ships, craft, or ship installations at the ACAT I and II levels.
Capabilities-Based Test and Evaluation (CBTE). A SYSCOM process that incorporates
COMOPTEVFOR’s MBTD into DT and ST test plans, such that operationally relevant data is
collected throughout the test program.
Capability Development Document (CDD). A document that captures the information necessary
to develop a proposed program(s), normally using an evolutionary acquisition strategy. The CDD
outlines an affordable increment of militarily useful, logistically supportable, and technically
mature capability. The CDD supports a Milestone B decision review. The CDD format is
contained in CJCSM 3170.01C. (DoD 5000.2 and CJCSI 3170.01G)
Capability Production Document (CPD). A document that addresses the production elements
specific to a single increment of an acquisition program. The CPD defines an increment of
militarily useful, logistically supportable, and technically mature capability that is ready for a
production decision. The CPD must be validated and approved prior to a Milestone C decision
review. The CPD format is in the JCIDS Manual, CJCSM 3170.01C. (DoD 5000.02 and CJCSI
3170.01G) OT&E shall determine the operational effectiveness and suitability of a system under
realistic operational conditions, including combat; determine if thresholds in the approved CPD
and COIs have been satisfied; and assess impacts to combat operations.
Capstone Test and Evaluation Master Plan (CTEMP). A TEMP which addresses the testing
and evaluation of a defense system consisting of a collection of individual systems which function
collectively to achieve the objectives of the defense system. Individual system-unique content
requirements are addressed in an annex to the basic CTEMP. (DAU Glossary)
Combined Developmental Testing (DT) and OT. Used to save time and reduce costs; must be
configured to meet operational capabilities/functions and developmental test objectives; must be
covered by an MOA; and must be followed by an appropriate final period of testing which will
emphasize appropriate separate OT before a MS-C decision.
Commercial Off-the-Shelf (COTS) Items. Use of COTS items offers significant opportunities
for reduced development time, faster insertion of new technology, and lower life-cycle costs,
owing to a more robust industrial base.
COMOPTEVFOR. Commander, Operational Test and Evaluation Force. This acronym should
be used to represent the Commander. (Note: The acronym OPTEVFOR should be used in
reference to COMOPTEVFOR’s staff.)
Compatibility. The capability of two or more items or components of equipment or material to
exist or function in the same system or environment without mutual interference. (DAU Glossary)
Compatibility includes physical, functional, electrical and electronic, and environmental issues.
Computer Resources. The totality of computer hardware, firmware, software, personnel,
documentation, supplies, services, and support services applied to a given effort.
Computer Software (or Software). A combination of associated computer instructions and
computer data definitions required to enable the computer hardware to perform computational or
control functions.
Computer Software Documentation. Technical data or information, including computer listings
and printouts, which documents the requirements, design, or details of computer software; explains
the capabilities and limitations of the software; or provides operation instructions for using or
supporting computer software during the software's operational life.
Concurrent Testing. A form of combined DT/OT in which test events are generally broken into
separate DT and OT events. Concurrent testing consists of DT and OT testers on a ship,
conducting separate and distinct test scenarios, some for DT, some for OT.
Condition. Variables of the environment that affect the performance of subtasks in the context of
the assigned mission. They are categorized by conditions of the physical environment (e.g., sea
state, terrain, or weather), military environment (e.g., forces assigned, threat, command
relationships), and civil environment (e.g., political, cultural, and economic factors).
(OPNAVINST 3500.38B)
Contracting Officer Technical Representative (COTR). Personnel nominated by
COMOPTEVFOR and appointed in writing by the contracting officer and designated in the
contract, who provide technical direction/clarification and guidance with respect to the contract
specifications or SOW. The term COR is now used interchangeably with COTR.
Criteria. The element of a standard that defines acceptable levels of performance. (OPNAVINST
3500.38B)
Critical Intelligence Parameters (CIP). CIPs are those key performance thresholds of foreign
threat systems which, if exceeded, could compromise the mission effectiveness of the U.S. system
in development. CIPs, and their accompanying production requirements, will be included in the
System Threat Assessment Report (STAR) unless DIA’s Acquisition Support Division in the
Defense Warning Office (DWO-3), the Threat Steering Group, and the program office agree that
CIPs are not required. If a CIP is breached, the responsible intelligence production center will
notify the program office and DIA/DWO-3 per DIA Instruction 5000.002. DIA/DWO-3 will
notify the appropriate organizations in the Office of the Secretary of Defense. (Defense
Acquisition Guidebook) CIPs are expressed in terms of a potential adversary's quantity, type,
force mix, and system capabilities for actual and projected specific threats.
Critical Operational Issues (COI). A key Operational Effectiveness (OE) and/or Operational
Suitability (OS) issue (not a parameter, objective, or threshold) that must be examined in OT&E
to determine the system's capability to perform its mission. A COI is normally phrased as a
question that must be answered in order to properly evaluate OE or OS. (DAU Glossary)
Critical Safety Item. A part, assembly, installation or production system with one or more critical
safety characteristics that, if missing or not conforming to the design data, quality requirements,
or overhaul and maintenance documentation, would result in an unsafe condition.
Current Threat. The threat which has been fielded or is assessed to be currently available.
Cybersecurity. Prevention of damage to, protection of, and restoration of computers, electronic
communications systems, electronic communications services, wire communication, and
electronic communication, including information contained therein, to ensure its availability,
integrity, authentication, confidentiality, and nonrepudiation.
Cyber Survivability. A system’s capability to survive and operate after exposure to cyber threats
which attempt to prevent the completion of operational mission(s) by destruction, corruption,
denial, or exposure of data transmitted, processed, or stored.
Defense Acquisition Board (DAB). The senior DoD acquisition review board for ACAT 1D and
selected ACAT IAM programs, chaired by the Under Secretary of Defense for Acquisition. The
Vice Chairman of the Joint Chiefs of Staff is the Vice-Chair. Other members of the board are: the
Deputy Under Secretary of Defense for Acquisition; service acquisition executives of the Army,
Navy, and Air Force; the Director of Defense Research and Engineering; the Assistant Secretary
of Defense for Program Analysis and Evaluation; the Comptroller of the Department of Defense;
the Director of Operational Test and Evaluation; the appropriate DAB Chair; and the Defense
Acquisition Board Executive Secretary. Other persons may attend at the invitation of the chair.
(See DoD Directive 5000.49, Defense Acquisition Board.)
Deferrals. The term "Deferrals" applies to a delay in testing requirements directed by the resource
sponsor. A deferral moves a testing requirement from one test period to a later period. Deferred
items cannot be used in the analysis to resolve COIs; however, the OTA may comment on
operational considerations in the appropriate sections of the test report. A deferral does not change
the requirement to test a system capability, function, or mission, only the timeframe in which it is
evaluated. Also see Waivers. (SECNAVINST 5000.2F)
Deficiency. Operational need minus existing and planned capability. The degree of inability to
successfully accomplish one or more mission tasks or functions required to achieve mission or
mission area objectives. Deficiencies might arise from changing mission objectives, opposing
threat systems, changes in the environment, obsolescence, or depreciation in current military
assets. (DAU Glossary)
Demonstration. A verification method involving the performance of operations at the system or
system element level where visual observations are the primary means of verification.
Demonstration is used when quantitative assurance is not required for verification of the
requirements (Defense Acquisition Guidebook). See Verification.
Derived Measure. Any requirement not clearly stated in the system’s capabilities document that is necessary for the effective delivery of the system under test capability as defined in the capabilities document, or that is derived from:
1. Concept of Operation
2. Office of the Secretary of Defense/Joint Chiefs of Staff/Secretary of the Navy/Office of the
Chief of Naval Operations instructions
3. Threat documents
4. System under test specifications
5. System stakeholders’ agreed-upon capability/function to be delivered (Navy Sponsor’s intent
for funded capability). (COMOPTEVFOR derived definition)
Developing Agency (DA). The agency or command responsible for system design and
development, and accomplishment of DT&E to verify attainment of technical performance
specifications and objectives. The DA is usually a SYSCOM/PEO. (DAU Glossary)
Developmental Test and Evaluation (DT&E). Any engineering-type test used to verify status
of technical progress, verify that design risks are minimized, substantiate achievement of contract
technical performance, and certify readiness for initial Operational Testing (OT). Development
tests generally require instrumentation and measurements and are accomplished by engineers,
technicians, or soldier operator-maintainer test personnel in a controlled environment to facilitate
failure analysis. (DAU Glossary)
Direct Liaison Authorized (DIRLAUTH). That authority granted by a commander (any level)
to a subordinate to directly consult or coordinate an action with a command or agency within or
outside of the granting command. DIRLAUTH is more applicable to planning than operations and
always carries with it the requirement of keeping the commander granting DIRLAUTH informed.
DIRLAUTH is a coordination relationship, not an authority through which command may be
exercised.
Director, Operational Test and Evaluation (DOT&E). According to DoD Directive 5000.1,
DOT&E is the principal advisor to the Secretary of Defense on DoD OT&E matters.
Discrepancy Reporting. The lead OT&E agency is responsible for ensuring a system is
established to track discrepancies and to provide periodic status reports to participating OT&E
agencies. Control of promulgation of such reports should be included in an MOA between the
participating OT&E agencies. An example of another agency's reporting is the service reports that
can be issued by any Air Force organization.
Documentation. Documents used to determine suitability, e.g., operator and maintenance
instructions, repair parts lists, support manuals, and manuals related to computer programs and
system software. (DAU Glossary)
DT Assist. Similar to an early phase of combined DT/OT, but with a predominantly DT flavor.
OTDs take an active role in the DT effort. DT Assists are not assigned an OT number and are not
a formal phase of OT. See paragraph C-6.3 for detailed information.
Early Operational Assessment (EOA). An Operational Assessment (OA) conducted early in an
acquisition program (prior to, or in support of, MS-B), often on subsystems and early prototype
equipment, to forecast and assess the risk to successful completion of the IOT&E. EOAs also
assist in determining any system-unique test assets for future developmental and operational tests.
(DAU Glossary) (See Operational Assessment.)
Evaluation Report. One of the two products of OT&E (the other possible product is the
OPTEVFOR Tactics Guide).
Evolutionary Acquisition (EA). The preferred DoD strategy for rapid acquisition of mature
technology for the user. An evolutionary approach delivers capability in increments, recognizing
up front the need for future capability improvements. Each increment is a militarily useful and
supportable operational capability that can be developed, produced, deployed, and sustained.
Block upgrades, pre-planned product improvements, and similar efforts that provide a significant
increase in operational capability and meet an acquisition category threshold as specified by DoDI
5000.02 are managed as separate increments. (DoDI 5000.02)
Examination. A verification method involving visual inspection of equipment and evaluation of drawings and other pertinent design data and processes. Examination should be used to verify conformance with characteristics such as physical, material, part, and product marking and workmanship (Defense Acquisition Guidebook). See Verification.
Exit Criteria. Program-specific accomplishments that must be satisfactorily demonstrated before
a program can progress further in the current acquisition phase or transition to the next acquisition
phase. (DAU Glossary) Exit criteria may include such factors as critical test issues, the attainment
of projected growth curves and baseline parameters, and the results of risk reduction efforts
deemed critical to the decision to proceed further. Exit criteria supplement minimum required
accomplishments and are specific to each acquisition phase.
Failure (Reliability). The malfunction or inoperable state of a previously operable system or part
of a system; reliability failures exclude damage caused by careless or improper operation or
operation outside the environment for which it was designed.
Fleet Operators. In the context of this manual, Fleet operators refers to Sailors, Marines, Soldiers,
and/or Airmen, to include U.S. Coast Guard personnel.
Fleet-Releasable Software. Software for which OT&E results confirm that all significant design
problems have been identified, that solutions to these problems are available, and that the software
actually tested is effective and suitable for its intended use and meets operational requirements.
This term is reserved for use by CNO following successful OT&E.
Fleet Services. These are used to plan and program not only Fleet support, but also financial
support, ranges, targets, simulators, and other required support.
Follow-on Operational Test and Evaluation (FOT&E). The Test and Evaluation (T&E) that
may be necessary after the Full Rate Production Decision Review (FRPDR) to refine the estimates
made during Operational Test and Evaluation (OT&E), to evaluate changes, and to reevaluate the
system to ensure that it continues to meet operational needs and retains its effectiveness in a new
environment or against a new threat. (DAU Glossary)
Foreign Weapons Evaluation (FWE). FWE evaluates foreign weapons systems, equipment, and
technologies that have the potential to satisfy a specific U.S. requirement. FWE applies to any
system, subsystem, or component purchased from a friendly or neutral country which is available
for procurement by the U.S.
Full Mission Capable (FMC). Material condition of any piece of military equipment, aircraft, or
training device indicating that it can perform all of its missions. (JP 1-02)
Full Rate Production and Deployment (FRP&D). Continuation into full-rate production results
from a successful Full-Rate Production (or Full Deployment) Decision Review by the MDA. The
decision to proceed into Full-Rate Production will be documented in an acquisition decision
memorandum (ADM). This effort delivers the fully funded quantity of systems and supporting
materiel and services for the program or increment to the users. During this effort, units will
typically attain Initial Operational Capability (IOC). As technology, software, and threats change,
FOT&E shall be considered to assess current mission performance and inform operational users
during the development of new capability requirements. (DoDI 5000.02)
Full Rate Production Decision (FRPD). The decision to enter into full rate production for the
system.
Full Rate Production Decision Review (FRPDR). A review normally conducted at the
conclusion of Low Rate Initial Production (LRIP) effort that authorizes entry into the Full Rate
Production (FRP) and Deployment effort of the Production and Deployment phase of the Defense
Acquisition Management Framework. (DAU Glossary)
Human Factors. A body of scientific facts about human characteristics. The term covers all
biomedical and psychosocial considerations. It includes, but is not limited to, principles and
applications in the areas of human engineering, personnel selection, training, life support, job
performance aids, and human performance evaluations (DoD 5000.2). OT includes examination
of those elements of system operation and maintenance which influence the efficiency with which
people can use systems to accomplish the operational mission of the system (e.g., arrangement of
controls and displays), the work environment (e.g., room layout, noise level, temperature, lighting,
etc.), the task (e.g., length and complexity of operating procedures), and personnel (e.g.,
capabilities of operators and maintainers).
Human Factors Engineering. The systematic application of relevant information about human
abilities, characteristics, behavior, motivation, and performance to provide for effective human-
machine interfaces and to meet Human System Integration (HSI) requirements. Where practicable
and cost effective, system designs should minimize or eliminate system characteristics that require
excessive cognitive, physical, or sensory skills; entail extensive training or workload-intensive
tasks; result in mission-critical errors; or produce safety or health hazards. (DoDI 5000.02)
Incremental Development. In this process, a desired capability is identified, an end-state
requirement is known, and that requirement is met over time by developing several increments,
each dependent on available mature technology. Incremental development relies heavily on
prototyping, both physical and functional, to get stakeholder feedback and reduce risk. See
Evolutionary Acquisition. (DAU Glossary and Defense Acquisition Guidebook)
Initial Capability Technical Baseline. This is a multi-tier product providing the following:
ICTB 1 Describes a CONEMP designed to provide an effect described in an MTB.
ICTB 2 Defines the SYSCOM contributions to the scenario detailed in ICTB 1.
ICTB 3 Links system/platform-specific requirements to the SoS mission-level capabilities
in ICTB 2. The ICTB 3 integrated architecture describes the technical approaches and
agreements made between individual programs.
Initial Capabilities Document (ICD). Representatives from multiple DoD communities shall
assist in formulating broad, time-phased, operational goals, and describing requisite capabilities in
the ICD. Programs that enter the acquisition process at MS-B shall have an ICD that provides the
context in which the capability was determined and approved, and a CDD that describes specific
program requirements. Projects that undergo a MS-A decision shall have a T&E strategy that
primarily addresses M&S, including identifying and managing the associated risk, and that
evaluates system concepts against mission requirements. Pre-MS-A projects shall rely on the ICD
as the basis for the evaluation strategy.
Initial Operational Capability (IOC). The first attainment of the capability to employ,
effectively, a weapon, item of equipment, or system of approved specific characteristics, that is
manned or operated by an adequately trained, equipped, and supported military unit or force. (JP
1-02)
Initial Operational Test and Evaluation (IOT&E). Dedicated Operational Test and Evaluation
(OT&E) conducted on production, or production representative articles, to determine whether
systems are operationally effective and suitable to support a Full Rate Production (FRP) decision.
(DAU Glossary)
Integrated Evaluation Framework (IEF). The IEF is the primary document for defining adequate OT,
and for integrating the OT requirements with DT and CT requirements to form an IT matrix. It
defines the OT objectives and the requirements for resolution of each COI, as well as the OTD’s
minimum IOT&E requirements.
Integrated Program Summary (IPS). A DoD component document prepared and submitted to
the MDA in support of MS-A, -B, -C, and -D reviews. It concisely highlights the status of a
program and its readiness to proceed into the next phase of the acquisition cycle.
Integrated Testing (IT). IT is the collaborative planning and collaborative execution of test
phases and events to provide shared data in support of independent analysis, evaluation, and
reporting by all stakeholders, particularly the developmental (both contractor and government) and
operational test and evaluation communities. (OSD memo, dated 25 April 2008) IT is not an
event or separate test phase, nor is it a new type of test. IT is a process intended to result in resource
efficiencies (time, money, people, and assets) and an enhanced data set for separate evaluations.
For example, the data from an IT could be used by the contractor for design improvements, by the
developmental evaluators for risk assessments, and the operational evaluators for operational
assessments. However, IT does not replace or eliminate the need for dedicated Initial Operational
Test and Evaluation required by 10 USC 2399, “Operational Test and Evaluation of Defense
Acquisition Programs” and DoD Instruction 5000.02.
Intelligence Production Requirement (IPR). An IPR may be initiated by a user whenever there
is a perceived data gap. It may cover current, midterm, or long range intelligence requirements
which cannot be wholly satisfied by the resources of the requester.
Interoperability. The ability of systems, units, or forces to provide data, information, materiel,
and services to and accept the same from other systems, units, or forces and to use the data,
information, materiel, and services so exchanged to enable them to operate effectively together.
National Security System (NSS) and Information Technology System (ITS) interoperability
includes both the technical exchange of information and the operational effectiveness of that
exchanged information as required for mission accomplishment. (CJCSI 6212.01F).
IT Integration. IT blends or combines contractor, developmental, and operational testing to form
a cohesive testing continuum. This integration cannot occur unless the participants (CT, DT, and
OT) have determined their entering requirements for adequate testing of the system under
evaluation. IT does not remove or combine any of OPTEVFOR’s current or future requirements
for reporting based on a separate (OPTEVFOR) analysis of the shared test information produced
by the IT effort.
Joint Interoperability. Joint Interoperability is an effectiveness measure that examines the use
of systems which must exchange information or services with non-Navy systems and platforms;
that is, Army or Air Force and, in some cases, Marine Corps or Coast Guard systems. For instance, in designing
a test for a submarine antenna, the capability of the antenna to assist the platform in communicating
with Army helicopters, USAF aircraft and satellites, and a Marine unit might need to be examined.
Joint Test and Evaluation (JT&E) Program. An OSD program that is structured to evaluate or
provide information on system performance, technical concepts, system requirements or
improvements, and system interoperability; to improve or develop test methodologies; or for force
structure planning, doctrine or procedures.
Key Performance Parameters (KPP). Those system requirements designated by the resource
sponsor as critical or essential to the development of an effective military capability and that make
a significant contribution to the characteristics of the future joint force as defined in the Capstone
Concept for Joint Operations. KPPs must be testable to enable feedback from test and evaluation
efforts to the requirements process. KPPs are validated by the Joint Requirement Oversight
Council (JROC) for JROC Interest documents, by the JCB for JCB Interest documents, and by the
DoD component for Joint Integration, Joint Information, or Independent documents. CDD and
CPD KPPs are included verbatim in the APB. (CJCSI 3170.01 series)
Key System Attributes (KSA). A system requirement considered crucial in support of achieving
a balanced solution/approach to a KPP or some other key performance attribute deemed necessary
by the sponsor. KSAs provide decision makers with an additional level of capability performance
characteristics below the KPP level and require a sponsor 4-star, Defense agency commander or
Principal Staff Assistant to change. (CJCSI 3170.01 series)
Land-Based Test Sites (LBTS). An LBTS is a facility that duplicates, simulates, or stimulates
the employment of a system's planned operational installation and use for the purpose of
conducting DT. (Navy) (DAU Glossary)
Lead Component/Service. The DoD Component responsible for management of a system
acquisition involving two or more DoD Components in a joint program. (DAU Glossary)
Lethality. The probability that a weapon will destroy or neutralize a target. (DAU Glossary)
Level of Effort (LOE). Effort of a general or supportive nature which does not produce definite
end products or results, i.e., contract for man-hours.
Level of Repair Analysis (LORA). A trade study conducted by a contractor as part of the
system/equipment engineering analysis process. A basis on which to evolve an optimum approach
to repair recommendations concurrent with the design and development process. Also referred to
as Optimum Repair Level Analysis (ORLA) or Level of Repair Analysis (LOR/A). (DAU
Glossary)
Level of Test Determination (LTD). Replaced RALOT in March 2020. The process by which
COMOPTEVFOR determines the level of OT involvement in a program going forward. This may
apply to an ACAT-IV program that may or may not require OT, or a program that is past IOT&E
that may or may not require FOT&E. Other applications of LTD are also likely to arise.
Life Cycle Costs (LCC). The total cost to the government of acquisition and ownership of that
system over its useful life. It includes the cost of development, acquisition, operations, and support
(to include manpower), and where applicable, disposal. For defense systems, LCC is also called
Total Ownership Cost (TOC). (DAU Glossary)
Likert Scale. The most widely used scale in survey research. When responding to a Likert
questionnaire item, respondents specify their level of agreement with a statement. Further detail is
provided in the OT Analysis Handbook.
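For illustration only (the five response labels and the sample answers below are assumptions, not values taken from the OT Analysis Handbook), a single Likert item is typically scored by mapping each response to an integer and summarizing across respondents:

```python
# Minimal sketch: summarizing responses to a single five-point Likert item.
# The scale labels and sample responses are illustrative assumptions only.
from statistics import mean, median

SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

responses = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree", "Disagree"]
scores = [SCALE[r] for r in responses]

print(f"n = {len(scores)}, median = {median(scores)}, mean = {mean(scores):.2f}")
```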
Live-Fire Test and Evaluation (LFT&E). LFT is conducted to provide a timely and thorough
assessment of the vulnerability and lethality of a conventional weapon or conventional weapon
system as it progresses through its development and subsequent production phases. The primary
emphasis of LFT is on realistic testing as a source of personnel casualty, vulnerability, and lethality
information, taking into account the susceptibility to attack and combat performance of the system.
LFT will include, when feasible, the firing of threat munitions (or surrogates) at operational,
combat-loaded U.S. weapon systems to test their vulnerability; and/or the firing of U.S. munitions
or missiles against operational, combat-loaded threat targets (or surrogates) to test the lethality of
those munitions or missiles. (Derived from DAU Glossary)
Live-Fire Test and Evaluation Report
1. Report prepared by the Director, Operational Test and Evaluation (DOT&E) on survivability
and lethality testing. Submitted to the Congress for covered systems prior to the decision to
proceed beyond Low Rate Initial Production (LRIP). Prepared within 45 days of receiving the
Component LFT&E Report.
2. Report prepared by the Component on the results of survivability and lethality testing.
(Defense Acquisition Guidebook)
Logistic Supportability. The degree of ease to which system design characteristics and planned
logistics resources (including the Logistics Support (LS) elements) allow for the meeting of system
availability and wartime usage requirements. (DAU Glossary)
Logistic Support (LS) Elements. A traditional group of items, that taken together constitutes LS.
These include: maintenance planning; Manpower and Personnel (M&P); supply support; support
equipment; Technical Data (TD); training and training support; computer resources support;
facilities; Packaging, Handling, Storage, and Transportation (PHST); and, design interface. (DAU
Glossary)
Low Rate Initial Production (LRIP). The first effort of the Production and Deployment (P&D)
phase. This effort is intended to result in completion of manufacturing development in order to
ensure adequate and efficient manufacturing capability and to produce the minimum quantity
necessary to provide production or production-representative articles for IOT&E; establish an
initial production base for the system; and permit an orderly increase in the production rate for the
system, sufficient to lead to full-rate production upon successful completion of operational (and
live-fire, where applicable) testing. (DoDI 5000.02 and DAG)
Maintainability. The ability of an item to be retained in, or restored to, a specified condition
when maintenance is performed by personnel having specified skill levels, using prescribed
procedures and resources, at each prescribed level of maintenance and repair. (DAU Glossary)
MTFL, MCMTOMF, and Maintenance Ratio (MR) are frequently calculated in maintainability
evaluations.
Major Deficiency. An operational mission failure or software fault that precludes successful
completion of a mission and for which no acceptable workaround is known. If occurring in
sufficient numbers during testing, major deficiencies can lead to an unresolved/split resolution or
an UNSAT resolution of a COI. Conversely, a single major deficiency may not lower the result
below a stated threshold, meaning that the COI is still resolved as SAT.
Material Support Date (MSD). The date when all necessary supply support of the system or
equipment is furnished. Supply support includes allowance quantities stocked in the supply system
or furnished directly to the end-user.
Matrix. The arrangement of specific elements into rows and columns to indicate interdependence
or correlation.
Mean Corrective Maintenance Time for Operational Mission Failures (MCMTOMF).
Normally computed as part of Test S-2, MCMTOMF is the average time required to perform active
corrective maintenance. Corrective maintenance is the time during which one or more personnel
are repairing an operational mission failure and includes: preparation, fault location, part
procurement from local (onboard) sources, fault correction, adjustment and calibration, and
follow-up checkout times. It excludes off-board logistic delay time.
Mean Time to Fault Locate (MTFL). The total fault location time divided by the number of
critical failures. Frequently computed as part of Test S-2, Maintainability.
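Both figures above are simple averages over the failures observed during test. The following sketch is a hypothetical illustration (the event records and hour values are invented, and for simplicity it treats the same failure set as both the critical failures for MTFL and the operational mission failures for MCMTOMF):

```python
# Minimal sketch: computing MCMTOMF and MTFL from hypothetical test records.
# Each record is (fault_location_hours, total_active_corrective_maintenance_hours)
# for one failure; the numbers are illustrative only.
failures = [
    (0.5, 3.0),   # failure 1: 0.5 h to locate the fault, 3.0 h active corrective maintenance
    (1.2, 4.5),   # failure 2
    (0.8, 2.5),   # failure 3
]

num_failures = len(failures)
total_fault_location = sum(f[0] for f in failures)
total_corrective_maint = sum(f[1] for f in failures)

mtfl = total_fault_location / num_failures        # Mean Time to Fault Locate
mcmtomf = total_corrective_maint / num_failures   # Mean Corrective Maintenance Time for OMFs

print(f"MTFL = {mtfl:.2f} h, MCMTOMF = {mcmtomf:.2f} h")
```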
Measure. The element of a standard that provides the basis for describing varying levels of task
performance.
Measure of Effectiveness (MOE). The data used to measure the military effect (mission
accomplishment) that comes from the use of the system in its expected environment. That
environment includes the SUT and all interrelated systems, that is, the planned or expected
environment in terms of weapons, sensors, Command and Control (C2), and platforms, as
appropriate, needed to accomplish an end-to-end mission in combat. (DAU Glossary) In MBTD,
MOEs are measures traced to effectiveness COIs or subtasks of effectiveness COIs.
Measure of Suitability (MOS). Measure of an item’s capability to be supported in its intended
operational environment. MOSs typically relate to readiness or operational availability, and hence
reliability, maintainability, and the item’s support structure. (DAU Glossary) In MBTD, MOSs
are measures traced to suitability COIs or subtasks of suitability COIs.
Milestone A Decision. The decision to establish a new acquisition program and establish a
concept baseline containing initial program cost, schedule, and program objectives. Approves
entry into the Technology Development (TD) phase of acquisition.
Milestone B Decision. The decision to begin the Engineering and Manufacturing Development
(EMD) phase of acquisition.
Milestone C Decision. The decision to begin the Production and Deployment (P&D) phase of
acquisition.
Militarily Useful Capability. A capability that achieves military objectives through operational
effectiveness, suitability, and availability, which is interoperable with related systems and
processes, transportable and sustainable when and where needed, and at costs known to be
affordable over the long term. (CJCSM 3170.01C)
Minor Deficiency. A deficiency that affects system performance, but does not impact the ability
to perform the mission. Usually requires only a minor workaround to continue testing.
Mission. The task, together with the purpose, that clearly indicates the action to be taken and the
reason therefore. (JP 1-02)
Mission Analysis. The mission analysis is a combined effort between OPTEVFOR and the
program representatives (T&E IPT), and should include other participants such as the Fleet Forces
Command (N8) representative, and operational user representatives. Other SMEs may be included
to ensure this evolution is completed correctly. These SMEs might include center of excellence
representatives.
Mission-Based Test Design (MBTD). MBTD is COMOPTEVFOR’s primary test planning
methodology.
Mission Capability by Primary Mission Area (MCMA). The percentage of time the test aircraft
is capable of performing a specified mission.
Mission Critical System. A system whose Operational Effectiveness (OE) and Operational
Suitability (OS) are essential to successful mission completion or to aggregate residual combat capability.
If this system fails, the mission likely will not be completed. Such a system can be an auxiliary or
supporting system, as well as a primary mission system. (DAU Glossary)
Mission Need Statement (MNS). A statement of operational capability required to perform an
assigned mission or to correct a deficiency in existing capability to perform the mission. (Replaced
by the Initial Capabilities Document (ICD))
Mission Reliability. See Reliability.
Mission Technical Baseline (MTB). SYSCOMs develop and maintain these documents in
coordination with Fleet Forces, OPNAV, and COMOPTEVFOR. MTBs consist of a scenario
summary, commander’s intent, tactical situation with associated targets, desired effects,
controlling threat baseline, integrated architecture, and requirements document.
Model. A model is a representation of an actual or conceptual system that involves mathematics,
logical expressions, or computer simulations that can be used to predict how the system might
perform or survive under various conditions or in a range of hostile environments.
Modeling and Simulation (M&S). DoD directives encourage the use of M&S to assist in
projecting operational effectiveness and operational suitability prior to MS-B, but limit its use in
subsequent OT&E to that of supplementing OT&E test data. Because of the increased emphasis
on the use of simulation in early OT&E, the OTD must give careful consideration to requirements
for the use of threat simulation.
Multiservice T&E. T&E conducted by two or more DoD Components for systems to be acquired
by more than one DoD Component, or for a DoD Component's systems that have interfaces with
equipment of another DoD Component. (DAU Glossary)
NATO Comparative Test Program (CTP). NATO CTPs evaluate foreign weapons systems,
equipment, and technologies that have the potential to satisfy a specific U.S. requirement. NATO
CTP applies only to items of NATO origin. (See Foreign Comparative Testing (FCT).) (DAU
Glossary)
Net-Ready Key Performance Parameter (NR-KPP). The NR-KPP assesses information needs,
information timeliness, cybersecurity, and net-ready attributes required for both the technical
exchange of information and the end-to-end operational effectiveness of that exchange. The NR-
KPP consists of measurable and testable characteristics and/or performance metrics required for
the timely, accurate, and complete exchange and use of information to satisfy information needs
for a given capability. The NR-KPP is comprised of the following attributes:
1. IT must be able to support military operations.
2. IT must be able to be entered and managed on the network.
3. IT must effectively exchange information.
(See CJCSM 3170.01C and CJCSI 6212.01F for amplifying information)
Non-developmental Item (NDI).
1. Any previously developed item of supply used exclusively for government purposes by a
Federal Agency, a State or local government, or a foreign government with which the United
States has a mutual defense cooperation agreement.
2. Any item described in paragraph 1 that requires only minor modifications or modifications of
the type customarily available in the commercial marketplace in order to meet the requirements
of the procuring department or agency.
3. Any item of supply being produced that does not meet the requirements of paragraphs 1 or 2
solely because the item is not yet in use. (FAR 2.101) See Commercial Off-the-Shelf (COTS).
Notice of Intent (NOI). An NOI reserves a submerged operating area and establishes procedures
that will minimize mutual interference between submerged submarines, and between submarines
and other operations, such as surface ships, using variable depth sonar or dropping of explosive
ordnance. (COMSECONDFLT OPORD 2000)
Operational Assessment (OA). A risk assessment for successful completion of IOT&E made by
an independent operational test activity, with user support as required, on other than production
systems. An OA is a test event that is conducted before initial production units are available and
which incorporates substantial operational realism. The focus of an OA is on significant trends
noted in development efforts, programmatic voids, areas of risk, adequacy of requirements, and
the capability of the program to support adequate OT. An OA is conducted when there is enough
system maturity to conduct an operational test and may use technology demonstrators, prototypes,
or Engineering Development Models, if those articles can be placed in an operational context and
risk to IOT&E can be adequately assessed. An OA will not substitute for the IOT&E necessary to
support FRPDs. Normally conducted prior to, or in support of, Milestone C.
Operational Availability (Ao). (See Availability for basic definition.) Ao is computed and
reported as follows:
For continuous-use systems, operational availability shall be designated Ao and shall be
determined as the ratio of system "uptime" to system "uptime plus downtime."
For "on-demand" systems, operational availability shall be designated Aod and shall be
determined as the ratio of the "number of times the system was available to perform as
required" to the "total number of times its performance was required." (Note: "Total number
of times its performance was required" shall be the number of times attempted and the
number of times it was operationally demanded, but not attempted because the system was
known to be inoperable.)
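As a worked illustration of the two ratios defined above (all hour counts and event tallies below are assumed values, not program data):

```python
# Minimal sketch of the two operational availability ratios defined above.
# All input values are hypothetical.

# Continuous-use system: Ao = uptime / (uptime + downtime)
uptime_hours = 950.0
downtime_hours = 50.0
ao = uptime_hours / (uptime_hours + downtime_hours)

# On-demand system: Aod = times available when required / total times required,
# where the denominator counts attempts plus demands not attempted because the
# system was known to be inoperable.
times_available = 47
times_attempted = 48
demands_not_attempted_inoperable = 2
aod = times_available / (times_attempted + demands_not_attempted_inoperable)

print(f"Ao  = {ao:.3f}")   # 0.950
print(f"Aod = {aod:.3f}")  # 0.940
```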
Operational Consideration (OPCON). A type of OT deficiency or issue used in OT reports to
document tactical considerations which inform operational commanders of significant aspects (pro
and con) of system employment, or make clear what special measures would be required to make
the system more efficient in battle.
Operational Effectiveness. The overall degree of mission accomplishment of a system when
used by representative personnel in the environment planned, or expected (e.g., natural, electronic,
threat etc.), for operational employment of the system, considering organization, doctrine, tactics,
supportability, survivability, vulnerability, and threat (including countermeasures, initial nuclear
weapons effects, and NBCC threats). (DAU Glossary and CJCSM 3170.01C)
Operational Evaluation (OPEVAL). Term formerly used for IOT&E. OPEVAL can be used as
a generic term to refer to the conglomerate OT&E processes across an acquisition cycle.
Operational Mission Failure (Reliability). A hardware failure or software fault that precludes
successful completion of a mission, and must be specifically defined for each system.
Operational Mission Software Fault (Reliability). A software fault that precludes successful
completion of a mission, and must be specifically defined for each system.
Operational Requirements. User- or user representative-generated validated needs developed to
address mission area deficiencies, evolving threats, emerging technologies, or weapon system cost
improvements. Operational performance requirements from the Capability Development
Document (CDD) and Capability Production Document (CPD) form the foundation for weapon
system technical specifications and contract requirements. (DAU Glossary)
Operational Requirements Document (ORD). With the implementation of the JCIDS process
(2003), the ORD was replaced by the CDD and CPD. Many acquisition programs are
grandfathered and will continue to use an ORD for system requirements for OT&E.
Operational Suitability. The degree to which a system can be placed and sustained satisfactorily
in field use with consideration being given to availability, compatibility, transportability,
interoperability, reliability, wartime usage rates, maintainability, safety, human factors,
habitability, manpower, logistics supportability, natural environmental effects and impacts,
documentation, and training requirements. (CJCSM 3170.01C)
Operational Test and Evaluation (OT&E). The field test, under realistic conditions, of any item
(or key component) of weapons, equipment, or munitions for the purpose of determining the
effectiveness and suitability of the weapons, equipment, or munitions for use in combat by typical
military users; and the evaluation of the results of such tests. (DAU Glossary)
Operational Utility Assessment (OUA) Report. The OUA report describes how a Joint
Capability Technology Demonstration's (JCTD's) products affect the resolution of an Operational
Problem (OP) and fulfill operational Desired Capabilities (DC). It declares the level of operational
utility according to the Concept of Operations (CONOPs) and TTPs and provides post-JCTD
transition, CONOPs and TTP and DOTMLPF-P recommendations. The OUA report and
applicable Initial Capabilities Document (ICD) [if required in lieu of OUA Report] and /or
Capability Development Document (CDD) are needed to meet the requirements of the Joint Staff
JCIDS process. Referred to as a "Military Utility Assessment (MUA)" by the JCIDS Manual. See
Military Utility Assessment (MUA). (DAU Glossary)
Operations Security (OPSEC). OPSEC, as it relates to COMOPTEVFOR testing, may be
defined as the identification and protection of a broad spectrum of classified and open-source
information that collectively reveals current and future U.S. military capabilities, plans, and
operational procedures. In this respect, it encompasses and relates to other security programs such
as signal security, physical security, automated data processing, and operational deception.
OTD Journal. The OTD journal records, for possible later use, data that the OTD had not
considered when developing the data or survey sheets but that may be of significance to the
program. While each OTD must use his or her own judgment when deciding what is significant,
it is better to record too much data rather than too little, and it is better to record data as soon as
an event occurs rather than to wait and risk forgetting.
Operational Test Readiness Review (OTRR). A multi-disciplined product and process
assessment to ensure that the production configuration system can proceed into Initial Operational
Test and Evaluation (IOT&E) with a high probability of success. More than one OTRR may be
conducted prior to IOT&E. (Defense Acquisition Guidebook)
OPTEVFOR. The acronym used in reference to COMOPTEVFOR’s staff.
Program Executive Officer (PEO). A military or civilian official who has responsibility for
directing several Major Defense Acquisition Programs (MDAPs) and for assigned major system
and non-major system acquisition programs. A PEO normally has no other command or staff
responsibilities within the Component, and only reports to and receives guidance and direction
from the DoD CAE. (DAU Glossary)
Program Manager (PM). Designated individual (military or civilian) with responsibility for and
authority to accomplish program objectives for development, production, and sustainment to meet
the user's operational needs. The PM shall be accountable for credible cost, schedule, and
performance reporting to the Milestone Decision Authority (MDA). (DoDD 5000.1)
Projected Threat. A best estimate based on historical trends data, evidence of continuing research
and development, postulated military requirements, technological capabilities, and the best
intelligence available. This threat consists of the weapon systems and characteristics that an
adversary can be expected to develop and deploy during the specified period. See Validated Online
Lifecycle Threat (VOLT) Report
Quick Reaction Assessment (QRA) (USN and USMC only). A QRA is a quick assessment that
examines specific operational considerations and capabilities of a system. Used when operational
necessity dictates deploying a rapid capability in the Fleet. A QRA will not be used to resolve
COIs.
Reliability. The probability that a system will perform its required functions without failure (see
failure) under stated conditions for a stated period of time. In OT&E, reliability is usually reported
in one of two ways:
Mission Reliability (R). For equipment operated only during a relatively short duration
mission (as opposed to equipment operated more or less continuously), the probability of
completing the mission without an operational mission failure.
Mean Time Between Operational Mission Failures (MTBOMF). For more or less
continuously operated equipment or systems. MTBOMF measures reliability as it relates to
the overall mission of the equipment or system being tested and is the total operating time
divided by the number of operational mission failures. MTBOMF is the figure used in the
calculation of overall mission Reliability (R). MTBOMF is sometimes modified to Mean
Flight Hours Between Operational Mission Failures (MFHBOMF).
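As a worked illustration (the operating hours, failure count, and mission duration below are assumed values, and the exponential model used to project mission reliability from MTBOMF is a commonly used simplification rather than a prescribed OT&E method):

```python
# Minimal sketch: MTBOMF and a projected mission reliability.
# Inputs are hypothetical; the exponential time-between-failures assumption in
# the final step is a common simplification, not a mandated OT&E calculation.
import math

total_operating_hours = 1200.0
operational_mission_failures = 8
mission_duration_hours = 10.0

# MTBOMF = total operating time / number of operational mission failures
mtbomf = total_operating_hours / operational_mission_failures   # 150 h

# Projected mission reliability for a mission of the stated duration, assuming
# exponentially distributed times between operational mission failures.
mission_reliability = math.exp(-mission_duration_hours / mtbomf)

print(f"MTBOMF = {mtbomf:.1f} h, R(mission) = {mission_reliability:.3f}")
```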
Resource Sponsor. See Sponsor.
Research, Development, Test, and Evaluation (RDT&E). See NAVSO P-2457 (RDT&E
Management Guide).
Research Laboratories. Laboratories available to provide analytical support to
COMOPTEVFOR in the OT&E of assigned CNO projects.
Requirement (Military Requirement or Operational Requirement). An established need
justifying the timely allocation of resources to achieve a capability to accomplish approved
military objectives, missions, or tasks. (JP 1-02) The need or demand for personnel, equipment,
facilities, other resources, or services, by specified quantities for specific periods of time or at a
specified time. (DAU Glossary)
Risk. A measure of future uncertainties in achieving program performance goals and objectives
within defined cost, schedule, and performance constraints. Risk can be associated with all aspects
of a program (e.g., threat, technology, maturity, supplier capability, design maturation,
performance against plan) as these aspects relate across the Work Breakdown Structure (WBS)
and Integrated Master Schedule (IMS). Risks have three components: 1) A future root cause (yet
to happen), which, if eliminated or corrected, would prevent a potential consequence from
occurring, 2) A probability (or likelihood) assessed at the present time of that future root cause
occurring, and 3) A consequence (or effect) of that future occurrence. (Risk Management Guide
for DoD Acquisition, Sixth Edition)
Risk Mitigation Plan. A document that records the results of Risk Mitigation Planning. It
typically addresses topics such as descriptive title of the risks, date of the plan, points of contact
for controlling identified root causes, options for mitigation, risk status, fallback approach,
recommendations, approval levels, and resource requirements. (Risk Management Guide for DoD
Acquisition, Sixth Edition)
Safety. Freedom from conditions that can cause death, injury, occupational illness, damage/loss
of equipment or property, or damage to the environment. (DAU Glossary) The program’s risk
management activities, and organizational and cultural values dedicated to preventing injuries and
accidental loss of human and materiel resources and to protecting the environment from the
damaging effects of DoD mishaps. (CJCSM 3170.01C)
SECNAVINST 5000.2F. The fundamental Navy instruction on T&E.
Self-Defense Test Ship (SDTS). Realistic OT for softkill and short range hardkill self-defense
weapon systems is often restricted by safety considerations that prohibit threat-representative
target presentations for manned ships. For this reason, the former USS PAUL F FOSTER (DD
964) has been configured as an unmanned ship outfitted with current softkill and hardkill self-
defense weapon systems for use by the DT and OT communities.
Severe Deficiency. A deficiency that prevents the accomplishment of a requirement designated
as critical to achievement of a KPP and results in the inability to accomplish the mission. If a
deficiency is determined to be severe, the affected COI should be resolved UNSAT for IOT&E
and FOT&E.
Simulation. A method for implementing a model. It is the process of conducting experiments
with a model for the purpose of understanding the behavior of the system modeled under selected
conditions or of evaluating various strategies for the operation of the system within the limits
imposed by developmental or operational criteria. Simulation may include the use of analog or
digital devices, laboratory models, or test-bed sites. Simulations are usually programmed for
solution on a computer; however, in the broadest sense, military exercises and war games are also
simulations. (DAU Glossary)
Simulator. A generic term used to describe equipment used to represent weapon systems in DT,
OT, and training, e.g., a threat simulator has one or more characteristics which, when detected by
human senses or manmade sensors, provide the appearance of an actual threat weapon system with
a prescribed degree of fidelity. (DAU Glossary)
Software Qualification Test (SQT). Post-MS-C software testing will be conducted by
COMOPTEVFOR as SQT and is solely intended for a Fleet release recommendation. SQT applies
to software modifications of limited scope, such as aircraft and weapons systems Operational
Flight Programs (OFP) and other systems in which software provides a similar function.
Software Test. Software will be operationally tested in the system in which the application is
installed or implemented when fielded. The software to be used for IOT&E and FOT&E will be
the software intended for Fleet use.
Software Upgrade (U.S. Navy). Navy software upgrades (releases) fall into three categories:
Major -- adds new functions or warfare capabilities, interfaces with a different weapon system,
redesigns the software architecture, or rewrites the software in a different language (requires OT
by OPTEVFOR); Minor -- changes that do not add any significant functions or interfaces as
determined by CNO (OT by OPTEVFOR upon CNO approval); Maintenance -- releases that are
fixes to minor problems (no testing by OPTEVFOR).
Specified Requirement. A system requirement that is clearly documented in the system’s
capabilities document (Operational Requirements Document, Capabilities Development
Document, Capabilities Production Document) and must be either:
1. A KPP, KSA, MOE, MOS, or other performance threshold (not objective), or
2. Any capability stated as a “shall” or “will” statement.
Sponsor. The DoD Component, Principal Staff Assistant or domain owner responsible for all
common documentation, periodic reporting, and funding actions required to support the capabilities
development and acquisition process for a specific capability proposal. (CJCSI 3170.01G) (Also
commonly called resource sponsor.)
Standard. The minimum acceptable proficiency required in the performance of a particular task
under a specified set of conditions. (OPNAVINST 3500.38B) Defined by the ORD/CD or
assigned by OPTEVFOR, standards consist of measures and criteria.
Statement of Work (SOW). That portion of a contract which establishes and defines all non-
specification requirements for contractor's efforts either directly or with the use of specific cited
documents. (DAU Glossary)
Subtask. The further breakdown of a task into the discrete events or actions required to complete
the task. (See OPNAVINST 3500.38B)
Survivability. The capability of a system and its crew to avoid or withstand a man-made hostile
environment without suffering an abortive impairment of its ability to accomplish its designated
mission. (DAU Glossary)
Susceptibility. The degree to which a device, equipment, or weapons system is open to effective
attack due to one or more inherent weaknesses. (Susceptibility is a function of operational tactics,
countermeasures, probability of the enemy fielding a threat, etc.) Susceptibility is considered a
subset of survivability. (DAU Glossary)
Sustainability. The ability to maintain the necessary level and duration of operational activity to
achieve military objectives. Sustainability is a function of providing for and maintaining those
levels of ready forces, materiel, and consumables necessary to support military effort. (CJCSM
3170.01C)
Synergy. Interaction of discrete agents or conditions such that the total effect is greater than the
sum of the individual effects.
System-of-Systems (SoS). A set or arrangement of interdependent systems that are related or
connected to provide a given capability. The loss of any part of the system will significantly
degrade the performance or capabilities of the whole. The development of a SoS solution will
involve trade space between the systems as well as within an individual system performance.
(CJCSM 3170.01C)
Systems Engineering (SE). The overarching process that a program team applies to transition
from a stated capability to an operationally effective and suitable system. SE encompasses the
application of SE processes across the acquisition life cycle (adapted to each and every phase) and
is intended to be the integrating mechanism for balanced solutions addressing capability needs,
design considerations and constraints, as well as limitations imposed by technology, budget, and
schedule. The SE processes are applied early in concept definition, and then continuously
throughout the total life cycle. (Defense Acquisition Guidebook)
System Service Reports. Service reports are issued when a system in RDT&E has a major or
minor failure. They may be issued during any phase of T&E or between scheduled phases of T&E.
System Threat Assessment. Describes the threat to be countered and the projected threat
environment. The threat information should reference DIA or Service Technical Intelligence
Center-approved documents. (DoDI 5000.02)
System Threat Assessment Report (STAR). The STAR was formerly the basic authoritative
threat assessment tailored for and focused on a particular U.S. defense acquisition program. The
STAR has been replaced by the Validated Online Lifecycle Threat (VOLT).
System Under Test (SUT). The SUT is the hardware and/or software being delivered/developed
to meet the requirements set by the resource sponsor and provide the capabilities needed by the
Fleet. Through MBTD, the SUT evaluation will be made against specified, derived, and other
measures. Issues that are identified as specific to the SUT shall be used for COI risk [Early
Operational Assessment (EOA) and Operational Assessments (OA)] or deficiency [Initial
Operational Test and Evaluation (IOT&E) or Follow-on Operational Test and Evaluation
(FOT&E)] determinations, COI resolution (SAT/UNSAT), system effectiveness/suitability
determinations, and fielding recommendations.
Tactical Development and Evaluation (TAC D&E). A program designed to improve tactical
readiness through development of tactical doctrine for the effective employment of current combat
systems or systems approaching IOC.
Tactical Situation (TACSIT). TACSITs provide Red Order of Battle (OOB), Red doctrine and
TTPs, Blue OOB, Blue doctrine and TTPs, environmental details, C2, ROE, and more based on
current OPLANs. They are Fleet documents.
Task. A discrete event or action, not specific to a single unit, weapon system, or individual, that
enables a mission or function to be accomplished by individuals and/or organizations.
(OPNAVINST 3500.38B)
Test. Any program or procedure which is designed to obtain, verify, or provide data for the
evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the
performance, operational capability and suitability of systems, subsystems, components, and
equipment items; and 3) the vulnerability and lethality of systems, subsystems, components, and
equipment items. (DAU Glossary) The test verification method is an activity designed to provide
data on functional features and equipment operation under fully controlled and traceable
conditions. These data are subsequently used to evaluate quantitative characteristics (Defense
Acquisition Guidebook). See Verification.
Test and Evaluation Identification Number (TEIN). When a program becomes a program of
record, the CNO will assign a TEIN. If the program is internal to COMOPTEVFOR, the TEIN
will start with 3000.
Test and Evaluation Master Plan (TEMP). Documents the overall structure and objectives of
the Test and Evaluation (T&E) program. It provides a framework within which to generate
detailed T&E plans and it documents schedule and resource implications associated with the T&E
program. The TEMP identifies the necessary Developmental Test and Evaluation (DT&E),
Operational Test and Evaluation (OT&E), and Live Fire Test and Evaluation (LFT&E) activities.
It relates program schedule, test management strategy and structure, and required resources to:
Critical Operational Issues (COI), Critical Technical Parameters (CTP), objectives and thresholds
documented in the Capability Development Document (CDD), evaluation criteria, and milestone
decision points. For multiservice or joint programs, a single integrated TEMP is required.
Component-unique content requirements, particularly evaluation criteria associated with COIs,
can be addressed in a component-prepared annex to the basic TEMP. (See Capstone TEMP).
(DAU Glossary) See SECNAVINST 5000.2F, DoD Instruction 5000.02, and the Defense
Acquisition Guidebook.
Test and Evaluation Coordinating Group (TECG). A TECG will convene when T&E issues
arise that cannot be resolved between the applicable commands or when extensive T&E
coordination is required. A TECG may also be used to implement urgent required changes to
TEMPs. In this case, either a page change will be issued or the formal report of the TECG will be
attached to the TEMP as an annex until the next required update or revision.
Test Report. Formally documents the results, conclusions, and recommendations as a result of
each phase of DT/OT. (DAU Glossary)
Test Reporting. For major programs, the lead service will prepare and coordinate the single
(interim or final) report reflecting the system's operational effectiveness and operational suitability
for each service. The participating services' independent evaluation reports will be appended to
final reports.
Threat. The sum of the potential strengths, capabilities, and strategic objectives of any adversary
that can limit or negate U.S. mission accomplishment or reduce force, system, or equipment
effectiveness. (DAU Glossary)
Threat Assessment. The provision of intelligence assessments of the threat in the appropriate
context and detail necessary to support plans, programs, or actions. Threat support is normally
provided in the form of threat or capabilities publications, generic threat assessments, and specific
threat statements, all of which emphasize system projections and threat forecasts. Threat support
also includes operational intelligence on foreign naval targets and force employment. (See System
Threat Assessment and Capstone Threat Assessment in the DAU Glossary)
Threat Support. The provision of intelligence assessments of the threat in the appropriate
context and detail necessary to support plans, programs, or actions. Threat support is normally
provided in the form of threat or capabilities publications, generic threat assessments, and specific
threat statements, all of which emphasize system projections and threat forecasts. Threat support
also includes operational intelligence on foreign naval targets and force employment. (See DoDI
5000.02 and DIA Directive 5000.200)
Threat Validation. The evaluation of, and concurrence with, threat documentation. DIA
evaluation of service-produced threats stresses the appropriateness and completeness of the
intelligence positions and the logic of extrapolations from existing intelligence. (See DoDI
5000.02)
Threshold. A minimum acceptable operational value below which the utility of the system
becomes questionable. (CJCSM 3170.01C)
Training. The level of learning required to adequately perform the responsibilities designated to
the function and accomplish the mission assigned to the system. (DAU Glossary)
Under Secretary of Defense (Acquisition and Sustainment) (USD (A&S)). The USD (A&S)
has policy and procedural authority for the defense acquisition system, is the principal acquisition
official of the Department, and is the acquisition advisor to the Secretary of Defense (SECDEF).
In this capacity the USD (A&S) serves as the Defense Acquisition Executive (DAE), the Defense
Senior Procurement Executive, and the National Armaments Director, the last regarding matters
of the North Atlantic Treaty Organization (NATO). For acquisition matters, the USD (A&S) takes
precedence over the Secretaries of the Military Departments after the SECDEF and Deputy
SECDEF. The USD (A&S) authority ranges from directing the Military Departments and Defense
agencies on acquisition matters, to establishing the Defense Federal Acquisition Regulation
Supplement (DFARS), and chairing the Defense Acquisition Board (DAB) for Major Defense
Acquisition Program (MDAP) reviews. (DAU Glossary)
Universal Navy Task List (UNTL). A list of Navy tasks considered essential to the
accomplishment of an assigned or anticipated mission. OPNAV Instruction 3500.38 series applies.
User. An operational command or agency that receives or will receive benefit from the acquired
system. Combatant Commanders (COCOMs) and their Service Component commands are the
users. There may be more than one user for a system. Because the Service Component commands
are required to organize, equip, and train forces for the COCOMs, they are seen as users for
systems. The Chiefs of Services and heads of other DoD Components are validation and approval
authorities and are not viewed as users. (JCIDS Manual) See Validation Authority. (DAU
Glossary). In MBTD, users are Fleet operators that employ the SUT.
Validation. Provides objective evidence that the capability provided by the system complies with
stakeholder performance requirements, achieving its intended use in its intended operational environment.
Validation answers the question: “Is it the right solution to the problem?” Validation consists of
evaluating the operational effectiveness, operational suitability, sustainability, and survivability of
the system or system elements under operationally realistic conditions (Defense Acquisition
Guidebook).
1. The review of documentation by an operational authority other than the user to confirm the
operational capability. Validation is the precursor to approval. (JCIDS Manual)
2. The process by which the contractor (or as otherwise directed by the DoD Component procuring
activity) tests a publication/Technical Manual (TM) for technical accuracy and adequacy.
(DAU Glossary)
3. The process of evaluating a system or software component during, or at the end of, the
development process to determine whether it satisfies specified requirements. (DAU Glossary)
Verification. Provides evidence that the system or system element performs its intended functions
and meets all performance requirements listed in the system performance specification and
functional and allocated baselines. Verification answers the question: “Did you build the system
correctly?” (Defense Acquisition Guidebook). See Analysis, Demonstration, Examination, and
Test.
Verification of Correction of Deficiencies (VCD) (U.S. Navy). VCDs are used to support
acquisition decisions for limited or full rate production. Evaluation of corrections to specific
deficiencies cited in a previous OT&E report will apply to only those COIs that have been
corrected, and the evaluation will not require end-to-end testing of the complete system.
Vignette. A convenient or logical grouping of subtasks to allow testing and data collection.
Vignettes are conducted under the varying conditions determined to have impact on the associated
subtask performance.
Validated Online Lifecycle Threat (VOLT) Report. A regulatory document for Acquisition
Category (ACAT) I-III programs. The VOLT supersedes the STAR and is a system-specific report
supporting capability development and PM assessments of mission needs and capability gaps
against likely threat capabilities at Initial Operational Capability (IOC).
Vulnerability. The characteristics of a system that cause it to suffer a degradation (loss or
reduction of capability to perform the designated mission) as a result of having been subjected to
a certain (defined) level of effects in an unnatural (man-made) hostile environment. Vulnerability
is considered a subset of survivability. (DAU Glossary)
Waivers. The term "Waivers" applies to a deviation from the criteria identified for certification
for operational testing in SECNAVINST 5000.2F. Waivers do not change or delay any testing or
evaluation of a system. Also see Deviations. (SECNAVINST 5000.2F)
Warfighting Development Centers (WDC). In December 2014, COMUSFLTFORCOM and
COMPACFLT stood up WDCs to replace the Warfare Centers of Excellence. WDCs are established
for air, undersea, surface, and expeditionary forces. Navy Warfare Development Command
(NWDC) leads cross domain warfare integration at all levels of Naval warfare.
Workaround. A procedure developed for taking into account shortcomings or other problems in
a program and devising workable solutions to get around the problems. (DAU Glossary)