ICE TECS Modernization Master Plan
Mar. 2, 2017
Exhibit G: ICE TECS Modernization TEMP
For Official Use Only
ICE TECS Modernization Program
Test and Evaluation Master Plan
(TEMP)
April 2, 2014
Version 2.0
Office of the Chief Information Officer (OCIO)
For Official Use Only
ICE OCIO
TECS Modernization Test and Evaluation Master Plan
Revision History

No.  Date     Change Description
1.   8/10/11  Version 1.0
2.   3/7/14   Version 2.0 - Updated document to reflect Program Replan
3.   3/27/14  Version 2.0 - Program Adjudicated per CAE Comments
4.   4/14/14  Version 2.0 - Program Adjudicated per CAE Comments
ICE TECS Modernization_TEMP_Replan 04152014_ICM Acquisition
For Official Use Only
Table of Contents
Executive Summary
1 Introduction
  1.1 Background
  1.2 Key Performance Parameters
  1.3 Critical Technical Parameters
2 Program Summary
  2.1 Initial Operational Capability Date
  2.2 Full Operational Capability Date
  2.3 Management
3 Developmental Test and Evaluation Outline
  3.1 Developmental Test and Evaluation Overview
  3.2 Developmental Test and Evaluation to Date
  3.3 Future Developmental Test and Evaluation
  3.4 Developmental Test and Evaluation Plans and Reports
4 Operational Test and Evaluation Outline
  4.1 Operational Test and Evaluation Overview
  4.2 Evaluation Strategy
  4.3 Integrated Evaluation Framework
  4.4 Modeling and Simulation
  4.5 Reliability Growth
  4.6 Critical Operational Issues
  4.7 TECS Modernization Failure Classification
  4.8 Operational Test and Evaluation to Date
  4.9 Planned Operational Test and Evaluation
  4.10 Constraints and Limitations
  4.11 Operational Test and Evaluation Plans and Reports
5 Test and Evaluation (T&E) Resource Summary
Appendix A: Bibliography
Appendix B: Acronyms
Appendix C: Points of Contact
Figures
Figure 1: TECS Modernization Conceptual Architecture
Figure 2: High-Level IOC Schedule
Figure 3: ICM System Testing View
Figure 4: Core HSI Investigative Case Management Processes

Tables
Table 1: Key Performance Parameters
Table 2: Critical Technical Parameters
Table 3: Standards of Compliance
Table 4: Management Roles & Responsibilities
Table 5: Operational Test and Evaluation Periods
Table 6: Critical Operational Issues
Table 7: Failure Classification
Table 8: Operational Test and Evaluation Limitations
Table 9: Summary of T&E Funding Requirements
Table 10: Summary of Testing Resources
Executive Summary
TECS is a mainframe-based system and the primary investigative tool used by Immigration and Customs Enforcement (ICE). It provides investigation, reporting, entry processing, and Watchlist mission support for ICE and Customs and Border Protection (CBP) and is the system of record for all activities related to both criminal and administrative investigations. ICE has identified deficiencies in the current legacy mainframe system, as well as Operations and Maintenance (O&M) cost impacts that will begin in Fiscal Year (FY) 2016 if the system is not replaced. The ICE TECS Modernization Program will address both of these issues.
The ICE TECS Modernization Program will provide a law enforcement case management solution that
meets ICE Homeland Security Investigations (HSI) mission needs in the areas of:
- Case Management
- Subject Records
- System Interfaces
- Reporting and Analytics
The ICE TECS Modernization Program has defined milestones for Initial Operational Capability (IOC) and Full Operational Capability (FOC). The program will deliver IOC, which will achieve mainframe independence, by September 30, 2015. The program will deliver FOC, which will include additional functionality and integrations, by Q4 FY17.
The deployment strategy at IOC is to conduct an enterprise cutover preceded by extensive testing of the new case management system. The system will be fully tested and integrated with all interfaces to external systems, including performance testing to validate that the system can handle the specified number of end users without degradation in performance. Many representative end users (HSI special agents) will be an integral part of this integration testing.
The Test and Evaluation Master Plan (TEMP) is the primary planning document for Test and Evaluation (T&E) activities for acquisition programs, including all discrete segments, blocks, increments, or spirals. The purpose of the TEMP is to document the overarching T&E approach for the acquisition program throughout the program lifecycle. The TEMP describes the Developmental Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) that must be conducted to determine system technical performance, operational effectiveness and suitability, and limitations. This document addresses the following types of testing:
- Functional Testing - Verifies that the developer delivers products that meet the functional requirements described in the System Requirements Document (SRD) and the content of the Design Document (DD), Interface Control Agreement (ICA), Data Management Plan (DMP), and other artifacts, to determine whether the product performs the business functions as documented.
- Interoperability Testing - End-to-end testing that verifies all TECS Modernization Program system components maintain data integrity and can operate in coordination with other systems in the same environment.
- Performance Testing - Detects performance and capacity limitations by generating system load that emulates the behavior of users conducting business transactions, determines when performance begins to degrade, and identifies bottlenecks across the system application and infrastructure.
- Section 508 Compliance Testing - Tests all ICE systems with user interfaces for Section 508 compliance using the Office of Accessible Systems and Technology (OAST) approved testing package.
- User Acceptance Testing (UAT) - Allows production users to test systems before deployment to ensure that the developed system meets their needs.
- Security Authorization (Certification and Accreditation [C&A]) - Validates implementation of security requirements and controls in the system and identifies potential intrusion or sensitive data exposure vulnerabilities.
- Operational Test and Evaluation - Determines whether the delivered system is operationally effective and suitable.
The TEMP also identifies all Critical Technical Parameters (CTPs) and Critical Operational Issues (COIs) and describes the objectives, responsibilities, resources, and schedules for all completed and planned T&E, including the Modeling and Simulation (M&S) tools used in the T&E process, over the acquisition lifecycle. The TEMP will be updated upon selection of the ICM System Commercial Off-the-Shelf (COTS) solution.
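The performance testing described above, generating load that emulates concurrent business transactions and watching for the point where response times degrade, can be sketched as a small load-driving harness. This is an illustrative sketch only, not part of the program's test suite: the transaction() function is a placeholder for a real request against the system under test, and the user counts are arbitrary.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Placeholder for one business transaction (e.g., a subject-record search).
    A real harness would issue the request against the system under test."""
    time.sleep(0.01)  # simulated server-side work

def run_load(concurrent_users, transactions_per_user=5):
    """Emulate `concurrent_users` users each running several transactions;
    return the observed per-transaction response times in seconds."""
    def user_session(_):
        times = []
        for _ in range(transactions_per_user):
            start = time.perf_counter()
            transaction()
            times.append(time.perf_counter() - start)
        return times

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = pool.map(user_session, range(concurrent_users))
    return [t for session in results for t in session]

# Step up the load and report where response times begin to degrade.
for users in (10, 50, 100):
    samples = sorted(run_load(users))
    p95 = samples[int(0.95 * (len(samples) - 1))]
    print(f"{users:>4} users: median {statistics.median(samples):.3f}s, "
          f"95th pct {p95:.3f}s")
```

In practice a commercial load-generation tool would replace the thread pool, but the measurement principle is the same: collect per-transaction times at each load level and compare the high percentiles across levels.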
1 Introduction
The TEMP uses the developmental and operational performance requirements outlined in the Operational
Requirements Document (ORD) to develop a comprehensive testing strategy.
1.1 Background
TECS was developed in 1987 by the U.S. Customs Service as an umbrella system to support the business activities of investigative case management. TECS is a mainframe-based system built on 1980s processing, application, and information management technology, and is administered by the CBP Office of Information Technology (OIT). CBP and ICE are collaborating to modernize the system. Under this structure, ICE and CBP will build separate systems that provide the capabilities needed to support each component's unique mission.
Due to the requirement to share critical information between ICE and CBP, the ICE TECS Modernization
Program and CBP TECS Modernization Program have been and will continue to closely coordinate
development of their respective systems. This will occur through a governance process that includes
executive oversight and participation at the program and working levels.
The ICE TECS Modernization Program is being driven by the need to address deficiencies in the legacy
mainframe system. Current functionality does not allow interfaces between systems, lacks modern
interfaces for system users, and does not support required security measures.
The program is also driven by the need to migrate the legacy system from the mainframe. If ICE is unable
to achieve mainframe independence by FY15, it will be required to assume responsibility for maintaining
the legacy TECS system and incur significant O&M costs starting in FY16.
The program revised its acquisition and solution strategy in Q1 FY14 to clearly define the capabilities needed for ICE to discontinue use of the legacy TECS mainframe by September 2015. The program assessed its requirements, technology, and processes to reduce the overall program cost, schedule, and technical risks. The effort resulted in the issuance of an Acquisition Decision Memorandum (ADM) requiring the ICE TECS Modernization Program to achieve an Acquisition Decision Event (ADE)-2B decision, which is currently planned for Q3 FY14.
1.2 Key Performance Parameters
The table below identifies the Key Performance Parameters (KPPs) for the ICE TECS Modernization Program. (KPPs were defined in the ICE TECS Modernization Operational Requirements Document.)
Table 1: Key Performance Parameters

KPP 1 - Response Time
Transaction response time refers to the time required for completion of an individual transaction: the time from a workstation request to a workstation response, tested at the end-user device level. Test time begins when the user hits Enter after filling out the appropriate transaction criteria and ends when the intent of the transaction is accomplished, for example when search results appear on the results page.
Threshold: The system shall provide operationally acceptable transaction response time* for individual transactions across the system, not to exceed 5 seconds 95% of the time.
Objective: The system shall provide a transaction response time* for individual transactions across the system, not to exceed 3 seconds 99% of the time.
Comments: *Response time excludes transaction processing time on systems external to the ICM application. (For example, processing within the ICM application must not add more than 5 seconds to the time required for an external database to process a request with regard to the Threshold, or 3 seconds with regard to the Objective.) Response time is calculated only for devices directly connected to an ICE network and does not include remote devices (i.e., connected through Virtual Private Network [VPN], mobile device running over a wireless network, etc.). Response time for search includes responses from all data sources queried.

KPP 2 - Concurrent Users
The system shall be able to handle a high level of users, measured by the number of concurrent users accessing the system at the same time.
Threshold: No less than 6,000 users accessing the system at the same time, with system capability allowing all users to conduct business transactions concurrently within the application.
Objective: No less than 10,000 users accessing the system at the same time, with system capability allowing all users to conduct business transactions concurrently within the application.

KPP 3 - Availability
The ICM system shall achieve the required level of Operational Availability (Ao).
Threshold: Ao > 99.07%
Objective: Ao > 99.97%
Comments: Required level of monthly Operational Availability for the ICM system components.
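Evaluating the Response Time KPP reduces to a percentile check over the collected per-transaction times. A minimal sketch, assuming response times (in seconds) have already been gathered by the test harness; the function names are illustrative, not part of the program's tooling:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ranked = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ranked)))
    return ranked[rank - 1]

def kpp1_result(samples):
    """Evaluate KPP 1: Threshold = 95% of transactions within 5 s;
    Objective = 99% of transactions within 3 s."""
    return {
        "threshold_met": percentile(samples, 95) <= 5.0,
        "objective_met": percentile(samples, 99) <= 3.0,
    }

# Example: 100 transactions, two slow outliers at 6 s.
times = [0.8] * 98 + [6.0] * 2
print(kpp1_result(times))  # → {'threshold_met': True, 'objective_met': False}
```

The outliers sit above the 95th percentile, so the Threshold still passes, but they land on the 99th percentile and fail the Objective; this is exactly the behavior a percentile-based KPP is meant to capture.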
1.3 Critical Technical Parameters
The following section defines the critical technical parameters of the case management system that will be evaluated during DT&E phases.
ICE OCIO
TECS Modernization Test and Evaluation Master Plan
Table 2: Critical Technical Parameters

CTP: Enterprise Architecture (EA) Compliance
Test Event: Preliminary Design Review (PDR) / Critical Design Review (CDR)
Technical Threshold: Fully compliant or obtain waiver
Test Location: NA
Test Schedule: Functional Testing
Decision Supported: Proceed to Development

CTP: Throughput Capability
Test Event: Performance Testing
Technical Threshold: 250,000 transactions per day
Test Location: Government PERF TEST Environment
Test Schedule: Performance Testing
Decision Supported: Proceed to System Security Testing

CTP: Availability
Test Event: Performance Testing
Technical Threshold: Ai > 99.15% (Ai is based on an MTBF of 117 hours and an MTTR of 1 hour)
Test Location: Government PERF TEST Environment
Test Schedule: Performance Testing
Decision Supported: Proceed to System Security Testing

CTP: Response Rates
Test Event: Performance Testing
Technical Threshold: System shall provide operationally acceptable transaction response time for individual transactions across the system, not to exceed 5 seconds 95% of the time for up to 6,000 concurrent users
Test Location: Government PERF TEST Environment
Test Schedule: Performance Testing
Decision Supported: Proceed to System Security Testing

CTP: Security Controls
Test Event: Federal Information Security Management Act (FISMA) Compliance
Technical Threshold: Pass/Fail – Fully Compliant
Test Location: Government TEST Environment
Test Schedule: System Security Testing
Decision Supported: Proceed to System Security Testing

CTP: ICE Infrastructure
Test Event: PDR / CDR
Technical Threshold: System will not require any infrastructure that is not supported within the Department of Homeland Security (DHS) DC1/DC2 environment
Test Location: NA
Test Schedule: Functional Testing
Decision Supported: Proceed to Interoperability Testing
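The Availability CTP and the Ao KPP are related by standard reliability arithmetic, and the throughput CTP implies an average transaction rate. A quick sketch, assuming the usual inherent-availability formula and an illustrative 30-day month:

```python
# Inherent availability from the Table 2 figures: Ai = MTBF / (MTBF + MTTR).
MTBF_HOURS = 117.0   # mean time between failures (Table 2)
MTTR_HOURS = 1.0     # mean time to repair (Table 2)
ai = MTBF_HOURS / (MTBF_HOURS + MTTR_HOURS)   # 117/118, i.e. about 99.15%

# Downtime allowed by the monthly Ao threshold (30-day month is illustrative).
AO_THRESHOLD = 0.9907
month_hours = 30 * 24
downtime_budget_hours = (1 - AO_THRESHOLD) * month_hours   # about 6.7 hours/month

# Average rate implied by the 250,000 transactions-per-day throughput CTP.
avg_tps = 250_000 / 86_400   # roughly 2.9 transactions per second, daily average
```

Peak load will of course exceed the daily average, so performance testing against the 6,000-concurrent-user figure remains necessary.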
Table 2: Critical Technical Parameters (continued)

CTP: Data Integrity
Test Event: Development (Unit) Test
Technical Threshold: Pass/Fail – Verify that a particular set of data (including legacy data) is saved to the database, each value gets saved fully, and the truncation of strings and rounding of numeric values do not occur
Test Location: Government DEV and DEV/INT Environment
Test Schedule: Functional Testing
Decision Supported: Proceed to Interoperability Testing

CTP: Section 508
Test Event: Section 508 Testing
Technical Threshold: Pass/Fail
Test Location: Government TEST Environment
Test Schedule: Section 508 Compliance Testing
Decision Supported: Proceed to Integration Testing

CTP: Interfaces to internal and external systems (data import and export)
Test Event: Interface Integration Testing
Technical Threshold: Pass/Fail – The system is effectively able to interface with systems listed in the ICM System Interfaces list
Test Location: Government TEST Environment
Test Schedule: Interoperability Test
Decision Supported: Proceed to Integration Testing
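The Data Integrity CTP above amounts to a round-trip check at the unit-test level. A minimal sketch, using sqlite3 purely as a stand-in for the ICM database (which this plan does not name) and illustrative field values:

```python
# Round-trip one record and verify no string truncation or numeric
# rounding occurred. sqlite3 and the column names are illustrative
# stand-ins, not the actual ICM schema.
import sqlite3

def round_trip(remarks, amount):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE subject (remarks TEXT, amount REAL)")
    conn.execute("INSERT INTO subject VALUES (?, ?)", (remarks, amount))
    row = conn.execute("SELECT remarks, amount FROM subject").fetchone()
    conn.close()
    return row

text = "X" * 4000        # long remarks field, as migrated legacy data can be
num = 1234567.891        # value that must not be rounded
stored_text, stored_num = round_trip(text, num)

assert stored_text == text   # each value saved fully, no truncation
assert stored_num == num     # no rounding of numeric values
```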
The following table identifies the standards of compliance to which the new case management system will adhere.
Table 3: Standards of Compliance

System Characteristic: Enterprise Architecture (EA) Compliance
Test Event: Enterprise Architecture Decision (EAD) reviews
Test Location: NA
Technical Threshold: Fully compliant unless waiver granted
Description: The system must align with ICE Architecture standards as provided by the approved SELC-TP.

System Characteristic: Security Controls
Test Event: FISMA Compliance
Test Location: Government TEST Environment
Technical Threshold: Fully compliant unless waiver granted
Description: The system must meet all FISMA requirements. Using the National Institute of Standards and Technology (NIST) Security Categorization process (as specified in Federal Information Processing Standard (FIPS) 199), the categorization impact levels for the ICE TECS Modernization program have been determined to be high across all three security objectives – Confidentiality, Integrity, and Availability. As a result, all NIST Special Publication 800-53 mandatory baseline security controls – management, operational, and technical – will be required for the ICE TECS Modernization program. This and other NIST and DHS risk management processes will help ensure that all threats and hazards are appropriately considered for this sensitive environment. The ICM system must also comply with the DHS Information Security Program policies for sensitive systems (DHS Management Directive 4300, Information Technology Security
Program Publication). The Authorizing Official (AO) issues an Authority to Operate (ATO) prior to the Production Readiness Review (PRR); this constitutes Security approval to begin OT&E.

System Characteristic: Human Factors
Test Event: Section 508 Testing
Test Location: Section 508 Compliance Testing
Technical Threshold: Fully compliant (unless deferred)
Description: The system must meet all Section 508 requirements. Complete a Section 508 Product Compliance Determination Form (CDF), which is submitted by the PM and the ICE 508 Coordinator.
2 Program Summary
The ICE TECS Modernization program will be delivered in two phases. Phase 1 will focus on delivering
IOC, which consists of implementing the functionality required to allow ICE to discontinue use of the
legacy TECS mainframe by the end of FY 2015. Phase 2 will consist of subsequent releases after IOC,
and focus on implementing enhancements and additional functionality. The program will achieve FOC
upon the conclusion of Phase 2.
The program’s Requirements Traceability Matrix (RTM) identifies its business and technical needs for
IOC and FOC. These requirements have been developed and vetted by the program's business and technical stakeholders and will be leveraged for the Development solicitation and for testing purposes.
In order to deliver the system capabilities necessary to meet the functional requirements identified for the
ICE TECS Modernization program, the program is structured around the delivery of four inter-related
capabilities: Investigative Case Management capability, Interface capability, Data Warehouse capability
and Data Migration capability.
- Investigative Case Management – Capability to manage cases and subject records and to create and manage investigative reports, including workflow to support review and approval
- Interface – Capability to allow a centralized interfacing hub to control the sending and receiving of information between the ICM capability and external information repositories
- Data Warehouse – Capability that stores historical case information once it is no longer needed for active investigations within the ICM system and provides access to that information to external reporting systems
- Data Migration – Capability that facilitates the transfer of the ICE data currently stored in the legacy TECS system to either the new ICM capability (if required for active investigations at the time of migration) or to the Data Warehouse (for historical reporting purposes)
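The Data Migration routing rule described above reduces to a single decision per legacy record. A minimal sketch, with illustrative record fields that are not the actual legacy TECS schema:

```python
# Route one legacy TECS record: records still needed for active
# investigations go to the new ICM capability; everything else goes
# to the Data Warehouse for historical reporting. Field names are
# illustrative assumptions.

def route_record(record):
    return "ICM" if record["active_investigation"] else "Data Warehouse"

legacy = [
    {"case_id": "A1", "active_investigation": True},
    {"case_id": "B2", "active_investigation": False},
]
destinations = [route_record(r) for r in legacy]
```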
Figure 1 illustrates the relationship between these four capabilities.
Figure 1: TECS Modernization Conceptual Architecture
[Figure: Agents, Analysts, One-Time Subjects, and External Systems interact with the ICE TECS Modernization Program, which comprises Investigative Case Management, Interfaces, the Data Warehouse, and Reporting Tools; Data Migration moves all ICE data from legacy TECS into the new system.]

2.1 Initial Operational Capability Date
The Initial Operational Capability (IOC) for the ICE TECS Modernization system is defined in the ORD.
It occurs once the investigative case management capabilities are deployed and the Program no longer
relies on the legacy TECS system. Mainframe independence will be achieved at IOC, which is planned on or before September 30, 2015, to align with CBP's schedule to transition off legacy TECS.
IOC represents the minimum required functionality to achieve mainframe independence in the following
five categories:
- Case: includes the primary work product for all users, to include management of investigative cases, investigative reports, administrative reports, and statistics
- Subject Records: capture and share information on subjects of interest (for example, people, vehicles, businesses), link subject records to cases and investigative reports, and establish connections between subject records and cases globally
- Interfaces: includes implementation of the required interfaces with which the investigative case management system will share data, as identified in the ICM System Interfaces List (reference ORD Appendix B)
- Universal: includes functionality that appears throughout the system in multiple screens and modules (for example, the ability to print or sort)
- Technical: includes implementation of all other technical requirements needed to support end-user functionality (for example, implementing existing DHS single sign-on capabilities)
These categories collectively constitute the core case management functionality. The requirements classified for FOC comprise both enhancements to that core functionality and new functionality.
The Figure below provides a notional timeline for IOC.
Figure 2: High-Level IOC Schedule
2.2 Full Operational Capability Date
FOC of the TECS Modernization system is achieved when there are no additional development releases
and all planned capabilities have been developed and deployed. FOC includes major enhancements and
other service integration. The scheduled baseline objective for FOC is Q4 FY17.
FOC represents implementation of enhancements and additional functionality to provide more effective
core case management. In addition, there will be integration of functionality required by the Office of
Professional Responsibility (OPR). With regard to core case management, enhancements will include the
following capabilities:
- Case: electronic distribution of investigative reports to administrative email inboxes to replace distribution via office printers, implementation of digital signatures for case documents, creation of a repository for case document templates with version control, system mobility, voice recognition capabilities, and comprehensive analytics
- Subject Records: creation of additional remarks to enhance communication with other law enforcement agencies, address validation for better data quality and consistency across subject records, and immediate access to record owner information by the querying agent
- Interfaces: includes implementation of the remaining interfaces as identified in the ICM System Interfaces List
- Universal: includes implementation of media tagging; capability to save search criteria for streamlined searching; ability to manage (save/delete) notifications; full utilization of audio, video, and other multi-media information; and geospatial mapping
- Technical: includes implementation of all other technical requirements needed to support FOC functionality

2.3 Management
The ICE TECS Modernization Program consists of an integrated team comprising ICE OCIO, HSI, and
contractor support personnel. The government will manage the system integration and provide overall
direction and supervision to the program. The testing activities will be performed by team members from
the OCIO, Development (application, data migration, and interfaces), Test Support, Information Security,
HSI, and Operational Test and Evaluation teams.
Table 4: Management Roles & Responsibilities

Program Manager (PM):
- Oversees all project roles to ensure schedule, cost, and activity requirements (scope) are met for the program
- Ensures the test teams are appropriately staffed to execute the TEMP and testing activities

Business Sponsor:
- Represents the operational needs of the business unit and the system users
- Participates throughout the ICE System Lifecycle Management (SLM) process to ensure that the system meets operational and user requirements
Table 4: Management Roles & Responsibilities (continued)

End Users:
- Provide input in determining requirements of the system
- Provide functional subject matter expertise input in all phases of the lifecycle
- Perform testing activities (user acceptance)

Test Manager:
- Oversees all program testing activities, which includes implementation of the TEMP
- Assesses whether the final software product meets the approved requirements described in the existing SRD
- Ensures that defects and deficiencies are appropriately identified, reported, and adjudicated
- Ensures all testing teams have the required test data
- Provides staffing requirements to the PM to ensure execution of the TEMP and subsequent testing
- Ensures access to external systems and data

Operational Test Agent (OTA):
- Conducts all operational testing from Initial Operational Test and Evaluation (IOT&E) through Follow-On Operational Test and Evaluation (FOT&E). A member of the ICE agent user community is assigned to the OTA lead position after approval from the DHS Director of Operational Test and Evaluation (DOT&E). The OTA is led by an agent, supported by an experienced and professionally qualified contractor. The execution of the testing will include the OTA and practitioner. The OTA writes the OT&E test plan(s) and report(s). The OT&E test plan(s) is sent to DHS OT&E for review and approval.
- Performs operational testing activities to assess the operational effectiveness and operational suitability of a system or service in a realistic operational environment employing fully trained operators
- Analyzes the data collected for the Measure of Effectiveness (MOE) and Measure of Suitability (MOS) in order to resolve the Key Performance Parameters (KPPs) and COIs to determine operational effectiveness and operational suitability

Director of Operational Test and Evaluation (DOT&E):
- Advises the Component Program Managers in developing Operational T&E documentation, planning for Operational T&E, and resolution of T&E issues
- Reviews the Mission Need Statements, Concept of Operations, Integrated Logistic Support Plan, Operational Requirement Document, and associated COIs
- Approves the Test and Evaluation Master Plan
- Approves the OTA
Table 4: Management Roles & Responsibilities (continued)

Director of Operational Test and Evaluation (DOT&E) (continued):
- Participates in T&E working groups
- Participates in Operational Test Readiness Reviews (OTRR) and observes Operational Tests
- Approves the Operational Test Plans, reviews the Operational Test and Evaluation Reports, and writes a Letter of Assessment of the Operational Test and Evaluation (OT&E) Report as appropriate
- Provides the T&E member to the Acquisition Review Boards

Technical Architect:
- Manages the adoption, development, and specification of standards supporting the ICE EA
- Monitors project adherence to ICE Technical Architecture standards

Lead Systems Engineer:
- Coordinates infrastructure requirements between development teams and DHS hosting environments
- Serves as the point of contact between the ICE TECS Modernization program and DHS hosting providers
- Ensures all teams have the approved access to the development and test environments

Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse):
- Responsible for designing, building, and testing (Unit and Integration) activities
- Complete SLM documentation throughout the system lifecycle
- Identify the data set required to conduct testing and communicate it to the Test Manager

Test Support Team:
- Supports the TECS Modernization Test Manager in providing oversight to all the different technical teams delivering the program
- Serves as the coordination point for SLM documentation
- Provides centralized control over the database, infrastructure, and continuous integration tools
- Performs Interoperability, Section 508, UAT, and Performance testing
- Assists in communicating technical information between teams
- Identifies the data set required to conduct testing and communicates it to the Test Manager

Section 508 Coordinator:
- Provides guidance on Section 508 requirements
- Evaluates Section 508 compliance

Information Systems Security Officer (ISSO):
- Assists the project through the Security Authorization process
ICE OCIO
TECS Modernization Test and Evaluation Master Plan
Information Systems Security Officer (ISSO), continued:
- Performs security risk analysis
- Performs testing activities (Security Authorization)
- Creates all security documentation

CISO/AO:
- Provides ATO approval

Contracting Officer and Contracts Specialist:
- Monitors the acquisition schedule as influenced by test outcomes

Test and Evaluation Working Integrated Project Team (T&E WIPT) Members (includes the Test Manager, Lead System Engineer, Technical Architect, ISSO, Section 508 Coordinator, OTA, and representatives from the End User Community, Development Teams, Test Support Team, and DHS S&T):
- Provide input into the planning and execution of the program’s T&E activities that will support a determination of whether or not a system is operationally effective, suitable, and survivable
- De-conflict any testing-related issues
3 Developmental Test and Evaluation Outline
3.1 Developmental Test and Evaluation Overview
The objective of DT&E is to verify that the developer provides the government with a system that
satisfies the documented requirements. DT&E also verifies that the system meets its KPPs and CTPs and
complies with the Standards of Compliance referenced above. During each testing phase, defects will be
identified, fixes tested, new builds created, and regression testing performed. Also, as described in Section 4
(Operational Test and Evaluation Outline), the OTA will conduct preliminary Operational Assessments
throughout the development of the new case management solution.
The TECS Modernization program will utilize the ICE System Lifecycle Management (SLM) process
to oversee the various technical, security, and quality aspects of its technology projects and to
manage the integration of technology into the ICE organization. To facilitate better communication
with DHS Headquarters and other DHS components, the ICE SLM aligns with the DHS Systems
Engineering Lifecycle (SELC). This process supports the following:
- Multiple lifecycle methodologies (approaches for managing a project from planning to deployment) to enable project teams to select and apply a methodology that fits the unique needs of a project
- Tailoring (review and selection of the necessary activities, artifacts, and SLM Reviews), which produces a customized work pattern for a project team, giving the team the flexibility and agility to meet immediate business needs
- Deploying technology solutions to meet immediate and critical business requirements without circumventing the process
- Practicing disciplined project management to ensure that the system is developed on schedule and within budget and that it produces the expected results
- Establishing a comprehensive project management plan to track, measure, and control the progress of each project
- Aligning each project with the target ICE Enterprise Architecture (the future or “to-be” business, data, and IT environment)
- Maintaining project information integrity, availability, and confidentiality
- Conforming to ICE architecture standards
- Verifying and certifying project activities through formal reviews, approval, and acceptance
- Incorporating maintainability to enable adjustment to evolving business needs
In accordance with the ICE SLM process, the following activities will be conducted for each release to ensure
that proper test procedures and processes are followed and that test reports are formally delivered. The test
reports are used in the overall evaluation of the system in an operational environment, which informs the
decision to move into the next lifecycle stage. To the extent possible, automated testing will be required.
1. Functional Testing – Functional testing consists of three components: Unit Testing, Integration
Testing, and Section 508 Testing (where applicable). Unit testing verifies individual software units
as part of coding during system development to ensure that unit functions conform to unit design,
to ensure that every possible decision outcome in the code is made at least once, and to ensure
that all requirements have been met. The ICM system will be a COTS solution, whose out-of-the-box
functionality will not require unit testing. Any configurations or custom development to
the COTS solution will require unit testing. Integration testing verifies software modules created
from the integration of previously tested software units within an individual development
workstream. Integration testing validates proper unit integration, satisfaction of module design
requirements for unit communication and interaction, and handling of and recovery from errors. It
also retests individual software units to validate that they function properly after their integration.
Functional testing will be conducted by each developer (ICM System, Data Migration, Interfaces,
and Data Warehouse) for all application functionality that is developed. Each developer will be
responsible for creating a Development Test Plan (DTP) (including Test Cases and a Test
Coverage Matrix) and producing a Development Test Analysis Report (DTAR) that includes test
results and deficiencies (if any). Section 508 results from the DTAR will be provided to the
Section 508 coordinator before entering Section 508 Compliance Testing.
Functional testing is complete when the program is approved to proceed to System Integration
Testing at the Integration Readiness Review (IRR).
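The unit-versus-integration distinction above can be illustrated with a minimal, self-checking sketch. The case-number validator and in-memory store below are hypothetical stand-ins for illustration only, not actual ICM components:

```python
# Minimal sketch of the unit vs. integration distinction.
# The validator and store are invented examples, not ICM code.

def validate_case_number(case_number: str) -> bool:
    """A single unit: every decision outcome here is a unit-test target."""
    return len(case_number) == 12 and case_number.isalnum()

def ingest_case(case_number: str, store: dict) -> bool:
    """A module composed of units: integration tests exercise this seam."""
    if not validate_case_number(case_number):
        return False
    store[case_number] = {"status": "open"}
    return True

# Unit tests: exercise each decision outcome of the unit in isolation.
assert validate_case_number("ABC123456789") is True
assert validate_case_number("SHORT") is False         # wrong length
assert validate_case_number("ABC-12345678") is False  # non-alphanumeric

# Integration tests: previously tested units functioning together.
store = {}
assert ingest_case("ABC123456789", store)
assert store["ABC123456789"]["status"] == "open"
assert ingest_case("bad", store) is False and len(store) == 1
```

In a real Development Test Plan, each assertion would map to an entry in the Test Coverage Matrix, and results would feed the DTAR.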
2. Interoperability Testing – End-to-end testing that verifies all new case management system
components maintain data integrity and can operate in coordination with other systems in the
same environment. System Interoperability testing will be coordinated by the Test Manager with
support from the Test Support team, Development teams, and representatives from external
system owners that exchange data with the new case management solution. System
Interoperability testing will not be conducted until all Functional Testing (per above) has been
completed.
The Test Manager, with the support of the Test Support team, will develop the Interoperability
Test Plan and provide the System Interoperability Test Analysis Report to ensure the following
areas have been tested:
- Data Migration – Verify migrated data is correctly ingested by the ICM system and can be used to populate the Data Warehouse.
- Data Warehouse – Verify legacy data is correctly ingested from the Data Migration database and new data is correctly ingested from the ICM System. Also verify that data can be retrieved from the Data Warehouse into the ICM system.
- Interfaces – Verify data is correctly transferred between the ICM System and the interface hub, and between the interface hub and the systems listed in the ICM System Interfaces List (referenced in the ORD).
- ICM Solution – Verify data is correctly ingested from the Data Migration database, Data Warehouse, and interface hub, and is correctly transferred to the Data Warehouse and interface hub.
Testing interfaces between the interface hub and the systems listed in the ICM System Interfaces
List will require additional planning and coordination with external stakeholders, including CBP,
DHS, DOS, DEA, ATF, and private industry. Ideally, all interoperability testing between the
interface hub and systems will be conducted against a testing environment that mimics the
behavior of the system in production using data sets that mimic those seen in production to the
greatest extent practical. The program will utilize existing ICAs and testing environments where
possible or establish new ICAs. There is also a possibility that external system owners may
require additional testing to be performed based on their internal procedures.
It is the responsibility of the Test Manager to secure permissions and facilitate the technical steps
necessary to ensure that testers are able to access all test environments in a manner that simulates
production to the greatest practical extent, to ensure that test data sets that mimic production are
available for use, and to ensure that all agreed-upon test requirements imposed by external system
owners are met.
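As one illustration of the data-integrity checks that the Data Migration and Data Warehouse verification above implies, the sketch below compares record counts and per-record checksums between a legacy extract and the migrated target. The record layout and helper names are invented for the example:

```python
import hashlib

# Hedged sketch of one common migration-verification technique:
# compare record counts and order-independent per-record checksums
# between the legacy extract and the target system.

def record_checksum(record: dict) -> str:
    """Canonical checksum of a record, independent of key order."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_migration(legacy: list, migrated: list) -> list:
    """Return a list of discrepancies; an empty list means the sets match."""
    issues = []
    if len(legacy) != len(migrated):
        issues.append(f"count mismatch: {len(legacy)} vs {len(migrated)}")
    legacy_sums = {record_checksum(r) for r in legacy}
    migrated_sums = {record_checksum(r) for r in migrated}
    missing = legacy_sums - migrated_sums
    if missing:
        issues.append(f"{len(missing)} legacy record(s) not found in target")
    return issues

legacy = [{"case": "A1", "status": "open"}, {"case": "B2", "status": "closed"}]
good = [{"status": "open", "case": "A1"}, {"case": "B2", "status": "closed"}]
bad = [{"case": "A1", "status": "OPEN"}]  # count differs and content altered

assert verify_migration(legacy, good) == []
assert len(verify_migration(legacy, bad)) == 2
```

In practice such checks would run against production-scale data sets, as the interoperability testing above requires, rather than in-memory lists.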
3. Performance Testing – Detects performance and capacity limitations by generating system
load that emulates the behavior of users conducting business transactions, determines when
performance begins to degrade, and identifies bottlenecks across the system application and
infrastructure.
All facets of the system will be tested to ensure data is moved accurately and efficiently and
meets any KPPs and other metrics as outlined in the ORD and TEMP for the program. System
Performance testing will be coordinated by the Test Manager with support from the Test Support
and Development Teams. System Performance Testing can begin as soon as the Data Migration
Team completes their portion of the System Integration Testing.
The Test Manager, with the support of the Test Support team, will develop the System
Performance Test Plan and provide the System Performance Test Analysis Report to ensure the
following areas have been tested:
- ICM Solution – Simulated testing using LoadRunner will be conducted to validate that the KPPs for time and volume are met.
- Data Warehouse – On-going load: simulated testing using LoadRunner will be conducted to obtain timings for the export of data from ICM to the Data Warehouse; code/script adjustments will be made as needed based on the results.
- Interfaces – Simulated testing using LoadRunner will be conducted to verify that the KPPs for all functions defined in the Interface Control Document are met.
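The program plans to use LoadRunner for this testing. Purely as a concept sketch of the approach (generate concurrent simulated business transactions and compare measured latency against KPP thresholds), the idea can be shown with a thread pool; the `transaction` function here is a placeholder for a real ICM business transaction, not actual program code:

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

# Concept sketch only: emulate concurrent users, collect per-transaction
# latencies, and report statistics to compare against KPP thresholds.

def transaction(case_id: int) -> float:
    """Placeholder business transaction; returns its observed latency."""
    start = time.perf_counter()
    time.sleep(0.001)  # simulated server-side work
    return time.perf_counter() - start

def run_load(users: int, transactions_per_user: int) -> dict:
    """Emulate `users` concurrent users and report latency statistics."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(transaction,
                                  range(users * transactions_per_user)))
    latencies.sort()
    return {
        "count": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": latencies[int(0.95 * len(latencies)) - 1],
    }

report = run_load(users=10, transactions_per_user=5)
assert report["count"] == 50
assert report["p95_s"] >= 0.001  # compare against the applicable KPP threshold
```

A real performance run would ramp the user count until degradation appears, which is where the bottleneck analysis described above begins.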
4. UAT – Independent testing conducted by end users that execute business processes to verify
and validate that the system meets user requirements. UAT will be coordinated by the Test Manager
and conducted by end users with subject matter expertise and supported by the Test Support and
Development teams. UAT can begin upon completion of System Integration testing and will
occur in parallel with Section 508 testing.
The end users will develop a UAT Test Plan and provide the results in a User Acceptance Test
Analysis Report. UAT is complete once the results are accepted by the Business Sponsor.
5. Section 508 Testing - Determines compliance with Section 508 Assistive Technology
Interoperability requirements (Section 508 of the Rehabilitation Act of 1973). Section 508 testing
will be coordinated by the Test Manager with support from the Section 508 Coordinator, Test
Support team, and Development Teams. Section 508 testing can begin upon completion of
System Integration testing. It will occur in parallel with UAT. Any deficiencies will be logged in
the Section 508 Remediation Plan and assigned to development teams for resolution. Section 508
Testing is complete when the new case management system is approved by the Section 508
Coordinator.
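A small part of Section 508 compliance checking can be automated. The sketch below flags `<img>` elements with missing or empty `alt` text; it is illustrative only, since real Section 508 testing covers assistive-technology interoperability well beyond this single check and is ultimately judged by the Section 508 Coordinator:

```python
from html.parser import HTMLParser

# Illustrative sketch of one automatable accessibility check:
# count <img> elements whose alt attribute is missing or empty.

class AltTextAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "img" and not dict(attrs).get("alt"):
            self.missing_alt += 1

page = ('<html><body>'
        '<img src="seal.png" alt="Agency seal">'
        '<img src="chart.png">'
        '</body></html>')
auditor = AltTextAuditor()
auditor.feed(page)
assert auditor.missing_alt == 1  # one image would go to the Remediation Plan
```

Findings from checks like this would be logged in the Section 508 Remediation Plan and assigned to the development teams, as described above.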
6. System Security Testing (C&A) – Validates implementation of security requirements and
controls in the system and identifies potential intrusion or sensitive data exposure vulnerabilities.
The ISSO prepares the Security Authorization package required to receive ATO with support
from the Test Manager, Test Support and Development Teams. The Test Manager provides
access to the necessary resources (e.g., program artifacts, team members) to complete the
Security Authorization package. The CISO, as the Certifying Official (CO), verifies the results of
the security assessment and makes an authorization recommendation to the AO.
Security Authorization testing begins upon completion of UAT and Section 508 Testing. The
Figure below documents the notional testing timeline:
Figure 3: ICM System Testing View

[Notional timeline, July 2014 through September 2015, last updated March 5, 2014. The figure shows ICM Solution development (initial configuration of the COTS solution, gap analysis, and COTS solution gap software development) with development unit and integration testing, leading to the SLM Gate: IRR; Interoperability, System Performance, User Acceptance, and Section 508 Compliance testing followed by System Security testing, leading to the SLM Gates: PRR and OTRR; a code freeze, production load, and cutover to the ICM system with Operational Assessment and OT&E testing, leading to the SLM Gate: ORR; and the ICM system in production under Operations and Maintenance (O&M).]
3.2 Developmental Test and Evaluation to Date
There has been no DT&E to date.
3.3 Future Developmental Test and Evaluation
Testing of future development will consist of similar testing activities for each release using a baseline set
of previously developed automated test scripts, in addition to new scripts to test new functionality. The
needs of each release, and the projects contained within it, will determine which tests (refer to Section
3.1, Developmental Test and Evaluation Overview) will be removed or added based on the functionality
added to the system.
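The per-release selection described above can be sketched as a tagged baseline of automated scripts from which each release picks the subset matching its functionality. The tags and script names below are hypothetical, for illustration only:

```python
# Hedged sketch: maintain a tagged baseline of automated test scripts
# and select the subset relevant to the functionality in a release.

TEST_BASELINE = {
    "test_case_create": {"tags": {"icm-core"}},
    "test_legacy_lookup": {"tags": {"data-migration"}},
    "test_export_to_warehouse": {"tags": {"data-warehouse"}},
    "test_interface_hub_roundtrip": {"tags": {"interfaces"}},
}

def select_tests(release_functionality: set) -> list:
    """Return baseline scripts whose tags intersect this release's scope."""
    return sorted(name for name, meta in TEST_BASELINE.items()
                  if meta["tags"] & release_functionality)

# A release touching only the ICM core and interfaces:
selected = select_tests({"icm-core", "interfaces"})
assert selected == ["test_case_create", "test_interface_hub_roundtrip"]
```

New scripts for new functionality would be added to the baseline with appropriate tags, so regression coverage grows release over release.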
3.4 Developmental Test and Evaluation Plans and Reports
DT&E Plans and Reports are described in Section 3.1, Developmental Test and Evaluation Overview. A
summary of this information can be found in the Summary of Testing Resources table located in Section
5.
4 Operational Test and Evaluation Outline
4.1 Operational Test and Evaluation Overview
TECS Modernization will require an OT&E with an ADE decision to deploy. TECS Modernization will
undergo OT&E using a Mission Based Test Design (MBTD) approach to determine operational
effectiveness and suitability. Operational vignettes will be created to test TECS Modernization
capabilities to support law enforcement activities and operations, and case adjudication. Each vignette lays
out the sequence of actions for conducting a particular mission.
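A vignette of this kind can be thought of as an ordered list of operator actions, each with an expected outcome that is recorded during the test. The sketch below is a hypothetical illustration of that structure, not an actual TECS Modernization vignette:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Sketch of a Mission Based Test Design vignette as a data structure:
# an ordered sequence of operator actions with expected outcomes,
# marked pass/fail as the operational test proceeds.

@dataclass
class Step:
    action: str
    expected: str
    passed: Optional[bool] = None  # recorded during the operational test

@dataclass
class Vignette:
    mission: str
    steps: List[Step] = field(default_factory=list)

    def outcome(self) -> str:
        if any(s.passed is False for s in self.steps):
            return "deficiency noted"
        if self.steps and all(s.passed for s in self.steps):
            return "mission accomplished"
        return "in progress"

v = Vignette("Open and adjudicate a case", [
    Step("Create case record in ICM", "record saved"),
    Step("Query migrated legacy data", "matching results returned"),
])
assert v.outcome() == "in progress"
v.steps[0].passed = v.steps[1].passed = True
assert v.outcome() == "mission accomplished"
```

Failed steps would be written up as deficiencies feeding the effectiveness and suitability determinations described below.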
OT&E determines whether the TECS Modernization system being delivered is operationally effective and suitable.
- Operational effectiveness is the overall degree of mission accomplishment of a system when used by representative personnel in the planned environment after deployment of the project, or the degree of mission accomplishment exhibited by a system upgrade after its technical improvements are deployed.
- Operational suitability is the degree to which a system can be satisfactorily placed in field use, with consideration given to reliability, availability, maintainability, human factors, manpower supportability at the data centers and in the field, logistics supportability, documentation, and training requirements.
OT&E for TECS Modernization will consist of at least one Operational Assessment (OA) during the
TECS Modernization development phase and, following the IOT&E Operational Test Readiness Review
(OTRR), formal IOT&E and FOT&E. OT&E test results are used to ascertain the operational
effectiveness and suitability of the ICM system. Following the successful FOT&E, Full Operational
Capability (FOC) can be reached for TECS Modernization.
The table below outlines the TECS Modernization OT&E:
Table 5: Operational Test and Evaluation Periods

Operational Assessment(s)
- Milestone: As requested.
- Purpose of test results: To assess progress toward, or the potential to achieve, satisfying the COIs.

Initial Operational Test and Evaluation (IOT&E)
- Milestone: The end of DT&E, PRR, and achieving a successful IOT&E OTRR.
- Purpose of test results: IOT&E results are used to support the initial operational production decision at the Operational Readiness Review (ORR) and to evaluate the operational effectiveness and suitability of IOC capabilities.

Follow-On Operational Test and Evaluation (FOT&E)
- Milestone: Once the system is at full operational deployment and a successful FOT&E OTRR has been achieved.
- Purpose of test results: FOT&E results are used to support the FOC production decision at the ORR, to evaluate the operational effectiveness and suitability of FOC capabilities, and to validate the correction of deficiencies.
The OTA is responsible for all OT&E activities because it must be completely independent of any prior testing (i.e., component testing by developers or any testing done by another Government-assigned team). The Operational Test and Evaluation Plan (OTEP) and OT&E reports will be written by the OTA.
For TECS Modernization, OT&E will be fully integrated with the development team and DT&E. As stated previously, there will be an enterprise cutover for ICE TECS Modernization. The TECS Modernization development framework provides key events at which the OT&E team can observe the system for early operational trends and refine OT&E data collection and analysis methodologies. Using actual investigators² as users, the UATs will provide integrated testing opportunities for an early look at ICE TECS Modernization progress toward satisfying the OT&E COIs. To ensure objectivity, the OT team will provide independent agents to supplement the UAT users.
Operational Assessments. Given the TECS Modernization single-release development strategy, the OTA may decide to conduct an OA for an early production release prior to a full OT&E, immediately following or as part of the UAT, as an OT&E early look at the emerging capabilities. To assist in identifying deficiencies in functionality or operational capabilities prior to the formal system release evaluations, the OTA, in coordination with the Program Office, will observe DT events. At key DT events, sufficient functionality may be available to allow an early look by the OTA, using actual agents/investigators as users.
It is important to note that an OA is not conducted in lieu of the OT&E; rather, it is conducted to identify significant trends or areas of risk to assist TECS Modernization leadership in determining progress toward meeting requirements. This OA will not only provide program management with early indications of the system's potential effectiveness and suitability but will also allow the OT team to familiarize itself with the system and refine data collection methodologies prior to formal OT&E testing.
The OA is limited to the functionality and capabilities deployed in the IOC release, a subset of the functionality to be deployed with TECS Modernization FOC. For the IOC release, the OTA would not expect the COIs to be fully resolved; therefore, IOT&E will resolve COIs only to the extent instantiated in the IOC release. An OA may also be performed to augment FOT&E utilizing the FOC release.
IOT&E. For TECS Modernization IOT&E, the OT&E assessment team will match user requirements, KPPs, and CTPs to scenarios based on business practices and case workflows, as well as using actual case data. During the development phase, the IOT&E team will monitor and observe DT activities and may also conduct an OA during the IOC release. During formal IOT&E, OT&E representatives will be responsible for scenario and performance testing procedures, data collection, and test conduct to ensure the collection of data pertinent to the overall IOT&E data requirements.
IOT&E starts following the completion of IOC DT&E and will begin with the approval to move the operational production deployment to the production pilot site(s). An OTRR ensures the program is ready to enter IOT&E. The OTRR should be led by the CAE or a designee and conducted in a manner that ensures Operational Test is ready to begin. IOT&E testing is conducted in an operational or operationally relevant environment.
FOT&E. FOT&E focuses on significant fixes, resolution of deficiencies, and remaining FOC capabilities unavailable for IOT&E. FOT&E starts following the completion of FOC DT&E and will begin with the approval to move the project to operational production deployment at the OT sites. An OTRR ensures the program is ready to enter FOT&E. The OTRR should be led by the CAE or a designee and conducted in a manner that ensures Operational Test is ready to begin.
² The term "Investigator" represents a TECS system user, which could be an investigative assistant, special agent, or investigative research specialist.
4.2 Evaluation Strategy
The objective of this evaluation is to determine whether the TECS Modernization system, when employed in the ICE law enforcement enterprise, is an operationally effective and suitable tool that enables ICE Agents to conduct daily operations and to accurately and rapidly eliminate vulnerabilities.
The OT&E approach will be to use event-driven scenarios matched to the case management processes and business practices to evaluate ICE TECS against operational requirements, as well as to validate system performance as verified during developmental testing. Testing will employ an end-to-end, mission-oriented approach within an operationally realistic environment, using real-world data and scenarios. During scenario-driven testing, performance will be tested against the Measures of Performance (MOPs) and Measures of Effectiveness (MOEs), and performance data will be recorded. If the system does not meet the threshold requirements, the deficiency will be noted in reporting. All performance data will be used for comparison between the IOC and FOC releases. Metrics and subjective assessments by the users will provide data for the Measures of Suitability (MOSs), as well as mission impact, verification of legacy functionality, and recommendations for enhancing system capabilities.
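The threshold check described above can be sketched in code. This is an illustrative sketch only: the metric names and threshold values below are hypothetical, not drawn from the TEMP, and the actual MOPs/MOEs are defined in the program's requirements documents.

```python
# Hedged sketch: comparing recorded performance data against MOP thresholds
# during scenario-driven testing, and flagging deficiencies for reporting.
from dataclasses import dataclass

@dataclass
class MopResult:
    name: str           # hypothetical metric name
    threshold: float    # maximum acceptable value
    observed: float     # value recorded during the test event

def evaluate(results):
    """Return the deficiencies (observed values exceeding thresholds)."""
    return [r for r in results if r.observed > r.threshold]

# Hypothetical data from one IOC scenario run.
ioc_run = [
    MopResult("subject query response (s)", threshold=5.0, observed=4.2),
    MopResult("case document save (s)", threshold=3.0, observed=3.8),
]
deficiencies = evaluate(ioc_run)
for d in deficiencies:
    print(f"Deficiency noted for reporting: {d.name} "
          f"(observed {d.observed}, threshold {d.threshold})")
```

Retaining the full result list, not just the deficiencies, supports the IOC-versus-FOC comparison the paragraph calls for.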
4.3 Integrated Evaluation Framework
The ICE TECS Modernization Integrated Evaluation Framework (IEF) and supporting assessment
dendritic for ICE TECS will link the system’s key technical and performance metrics to the vignettes.
Those vignettes will be linked to specific mission-based effectiveness COIs, leading to the overall
evaluation of the effectiveness and suitability of the system.
To maximize the benefits and data collection opportunities during system development, ICE TECS Modernization OT&E will use an "integrated testing" approach, as defined in the Defense Acquisition Guidebook (DAG). Integrated testing is designed to produce credible qualitative and quantitative data useful to all evaluators, and to address developmental, sustainment, and operational issues. Integrated testing allows for the collaborative planning of test events, where a single test event, such as a UAT, a separate OTA OA, or User Observations, can provide data to satisfy multiple test objectives without compromising the individual test objectives of each participating test organization. Integrated testing is not just concurrent or combined DT and OT, where DT and OT test objectives are interwoven on the same mission or schedule. Rather, it focuses the entire test program on designing, developing, and producing a comprehensive plan that coordinates all test activities to support evaluation results for decision makers at required decision checkpoints and during program reviews.
As part of the integrated testing, the OTA will observe key system development testing events or conduct
independent OA/User Observations with actual investigators and available capability set-driven scenarios.
Data and findings from these events may be used to supplement the final release evaluation data, provided
certain criteria are met. The early integrated testing efforts are not a formal phase of OT, but rather a
period of DT in which OT testers are actively involved, providing operational perspective and gaining
valuable hands-on familiarity with the system. The objective of the OTA observations is to provide
program management with early insight into operational issues and concerns. If appropriate for the test
event, subjective questionnaires might be administered by the OTA to support any OA/User
Observations.
As with DT testing, any software defects detected during OT&E will be documented in a Test Problem Report (TPR) and assigned a severity level (low, medium, high, critical) by the tester. Software defects identified in TPRs will be reviewed by project management and prioritized in order to continue development of the system.
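The TPR record and prioritization step above can be sketched as follows. The four severity levels are named in the TEMP; every other field and the example TPR entries are illustrative assumptions.

```python
# Minimal sketch of a Test Problem Report (TPR) with the TEMP's four
# severity levels, and a prioritization pass for project-management review.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4

@dataclass
class TestProblemReport:
    tpr_id: str          # hypothetical identifier scheme
    summary: str
    severity: Severity   # assigned by the tester
    test_event: str      # e.g., "UAT", "OA", "IOT&E" (illustrative)

def prioritize(tprs):
    """Order TPRs for review, most severe first."""
    return sorted(tprs, key=lambda t: t.severity.value, reverse=True)

# Hypothetical defect queue from mixed test events.
queue = prioritize([
    TestProblemReport("TPR-014", "Search timeout on linked subjects", Severity.HIGH, "OA"),
    TestProblemReport("TPR-015", "Tooltip typo", Severity.LOW, "UAT"),
    TestProblemReport("TPR-016", "Case cannot be closed", Severity.CRITICAL, "IOT&E"),
])
print([t.tpr_id for t in queue])
```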
4.4 Modeling and Simulation
There is no expectation of using modeling to conduct operational testing; however, some level of simulation may be used. Operational Testing may include the use of a simulation tool for three purposes:
a) Faithfully replicate scenario-driven tasks so they can be automated and repeated for multiple tests
b) Simulate multiple system users without the need to establish a large number of workstations
c) Measure the system response time of actions
Any M&S tool used during OT must receive approval from the OTA.
4.5 Reliability Growth
Reliability growth (KPP 3 – Availability) will be determined by analyzing the data produced by
performing the framework and performance procedures found in the DT&E Test Plans, by collecting
RAM data in the production environment, and by evaluating TECS Modernization conformity to the
Availability KPP. It will be analyzed in accordance with the ILSP. A reliability baseline will be
established during the early DT/OA/OT events, and comparability measurements will be taken throughout
the lifecycle of TECS Modernization to identify and quantify any system reliability changes.
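One way the Availability KPP conformity check might be computed from collected RAM data is sketched below. The 99.0% threshold and the outage figures are assumptions for illustration only; the actual KPP value and measurement rules are defined in the program's requirements and the ILSP.

```python
# Hedged sketch: operational availability (Ao) from production RAM data,
# compared against an assumed Availability KPP threshold.
def operational_availability(uptime_hours: float, downtime_hours: float) -> float:
    """Ao = uptime / (uptime + downtime)."""
    return uptime_hours / (uptime_hours + downtime_hours)

ASSUMED_KPP_THRESHOLD = 0.990   # hypothetical threshold, not from the TEMP

# Hypothetical reporting period: 30 days (720 h) with 3 h of unscheduled downtime.
ao = operational_availability(uptime_hours=717.0, downtime_hours=3.0)
print(f"Ao = {ao:.4f}; meets assumed KPP threshold: {ao >= ASSUMED_KPP_THRESHOLD}")
```

Tracking this figure across the early DT/OA/OT events and later production periods gives the baseline-versus-current comparison the paragraph describes.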
4.6 Critical Operational Issues³
The table below identifies the COIs for the ICE TECS Modernization Program.
Table 6: Critical Operational Issues

1. Case Management — Does the modernized system provide users the capability to perform case management functions (whether criminal or administrative) required from case inception to final disposition, as well as reporting and audit capabilities?

2. Interface with Critical Systems — Does the modernized system allow HSI to exchange data with critical systems both internal and external to the agency?

3. Access Control — Does the modernized system provide access control that will ensure proper handling of sensitive data within the ICM application, in addition to data shared with systems both internal and external to the agency?

4. RAM — Is the modernized system technically viable (supportable) in an operational environment, providing a sufficient degree of reliability, availability, and maintainability?
4.7 TECS Modernization Failure Classification
A failure is an event that results in the loss of, or an operationally unacceptable degradation of, an essential mission capability. To classify failures by severity of operational impact, an analysis of the business practices was performed to match functional capabilities with operational mission components. The figure below displays the Core HSI Investigative Case Management Processes.

³ COIs were defined in the TECS Modernization ORD.
Figure 4: Core HSI Investigative Case Management Processes

[Flowchart (Law Enforcement Sensitive - For Official Use Only). The core ICM process flow spans five phases: Open Case, Conduct Investigation, Enforcement Actions, Support Prosecution, and Close Case. These phases rest on universal system-wide functionality (access control, security, and user provisioning; workflow, document management, and reporting; creating, searching, and query notifications) and an Interface Hub to external systems. Legend: EID/EAGLE = ICE booking system; FALCON = ICE analytical system; AFI = CBP analytical system; LeadTrac = referrals to HSI; ACRIMe = NCIC/NLETS/III, CIS, CLAIMS 3 & 4, and PCQS; SEACATS = seized property management; CBP TECS Portal = border information (crossings and incident logs), bidirectional subject records, and primary lookouts; Virtual University = certificates (NCIC, PA, etc.).]

Notes:
1. The example "snapshot" processes depicted above will occur in varying orders (except the open and close).
2. Though an activity may be a "non-system" function, that activity is supported and documented in the system.
3. A symbol marked with a number indicates that there is a corresponding process flow with more detail.
From this figure and an analysis of the TECS Modernization processes, the OT team developed a prioritization of the operational failure categories. These classifications reflect the operational mission impact and the priority of resolution. The table below presents these failure classifications. As indicated, the higher priority capability areas could also experience lower priority failures.
Table 7: Failure Classification
Priority
Description
Priority 1
A failure that has one or more of the following operational impacts to
critical processes:
Core Case
Management
Processes
Integrated Search
and Analytics
1. Jeopardizes personnel and public safety.
2. Prevents the accomplishment of an operational or mission-essential
capability specified by the TECS Modernization requirements.
3. Prevents the operator’s ability to accomplish an operational or missionessential capability specified by the TECS Modernization requirements.
There shall be no known Priority 1 problems in the product delivered
to the end user.
Priority 2
All Priority 1 Failure
Classifications and:
Leads Management
Operational
Management
Shared
Infrastructure
A failure that has one or more of the following operational impacts:
1. Adversely affects the accomplishment of an operational or missionessential capability specified by the TECS Modernization requirements
for which no alternative solution, such as a work around, is known.
2. Adversely affects the operator’s ability to accomplish an operational or
mission-essential capability specified by the TECS Modernization
requirements for which no alternative solution, such as a work around,
is known.
No known Priority 2 problem(s) will exist in the product delivered to
the end user without the associated alternative solution documentation.
Priority 3
All Priority 1 and 2
Failure Classifications
and:
Visualization
A failure that has one or more of the following operational impacts:
1. Adversely affects the accomplishment of an operational or missionessential capability specified by the TECS Modernization requirements
so as to degrade operational performance, but for which an alternative
solution is known and documented.
2. Adversely affects the operator’s accomplishment of an operational or
mission-essential capability specified by the TECS Modernization
requirements so as to degrade operational performance, but for which
an alternative solution is known and documented.
Priority 3 problem(s) should be fixed prior to the next product update.
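The decision logic in the failure classification above can be expressed as a small routine. This is an illustrative sketch only; the field names and the fall-through value below the defined priorities are assumptions, not part of the TEMP.

```python
from dataclasses import dataclass


@dataclass
class FailureImpact:
    """Observed operational impacts of a single test failure (illustrative fields)."""
    jeopardizes_safety: bool        # personnel or public safety at risk
    prevents_capability: bool       # mission-essential capability blocked outright
    adversely_affects: bool         # capability adversely affected / degraded
    workaround_documented: bool     # an alternative solution is known and documented


def classify_priority(impact: FailureImpact) -> int:
    """Map a failure's impacts to Priority 1-3 per the classification table."""
    if impact.jeopardizes_safety or impact.prevents_capability:
        return 1
    if impact.adversely_affects and not impact.workaround_documented:
        return 2
    if impact.adversely_affects and impact.workaround_documented:
        return 3
    return 4  # below the priorities the table defines (assumed catch-all)


# Example: a degraded capability with a documented workaround
print(classify_priority(FailureImpact(False, False, True, True)))  # prints 3
```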
ICE TECS Modernization_TEMP_Replan 04152014_ICM Acquisition
For Official Use Only
33
4.8 Operational Test and Evaluation to Date
The ICE TECS Modernization program is currently in the Design and Development phase and has not
started any testing. Therefore, there has been no operational T&E performed to date.
4.9 Planned Operational Test and Evaluation
OT&E will validate that the system is operationally effective and suitable, as measured against the KPPs, COIs, and associated MOPs, MOEs, and MOSs, when deployed in an operational environment. The OTA is the organization responsible for IOT&E, operational testing, and FOT&E. The OTA:
- Writes the OT&E Test Plan(s), Operational Assessment Plans, and report(s). The OT&E Test Plan(s) is sent to DHS OT&E for review and approval.
- Performs operational assessments of releases deployed in the production environment to analyze and assess each release for its potential to meet a KPP or COI.
- Performs operational test and evaluation to determine the operational effectiveness and operational suitability of a system or service in a realistic operational environment employing trained final-user operators.
- Analyzes the data collected for the MOPs, MOEs, and MOSs in order to resolve the KPPs and COIs and determine Operational Effectiveness and Operational Suitability.
The operational test system will be the production system for TECS Modernization. Operational testing
will occur in a live, full production configuration at selected operational field agent sites.
4.10 Constraints and Limitations
The TECS Modernization OT&E is constrained by the factors listed in the table below.
Table 8: Operational Test and Evaluation Limitations

Limitation: The testing will be constrained to a small subset of the operational environment that is representative of the entire TECS production environment.
Impact: Test data may not be reflective of all sites.
Mitigation Strategy: The OTA and ICE TECS Modernization Program Office will coordinate sites to provide the variations of facility size, number of agents, available bandwidth, types of tasks performed, etc., in selection of release assessment sites.

Limitation: The program's test strategy plans on using a load simulator to create the stresses required to test system performance, response time, and scalability.
Impact: During DT efforts, not all data sources or operational network loading factors may be present and accounted for during system performance evaluation.
Mitigation Strategy: OT will leverage DT system performance instrumentation as the system is evaluated at the different field sites. The OTA will perform comparisons of the load simulator data and OT&E observed data.
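The second mitigation, comparing load-simulator data against OT&E observed data, could be sketched as a percentile-by-percentile latency comparison. This is a minimal sketch under stated assumptions: latency samples in milliseconds, and a 20% tolerance and the function names are illustrative, not drawn from the TEMP.

```python
def percentile(samples, p):
    """Nearest-rank percentile (p in 0..100) of a non-empty sample list."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]


def compare_latencies(simulated_ms, observed_ms, tolerance=0.20):
    """Flag percentiles where field-observed latency exceeds the
    load-simulator prediction by more than `tolerance` (fractional)."""
    findings = {}
    for p in (50, 90, 95):
        sim = percentile(simulated_ms, p)
        obs = percentile(observed_ms, p)
        findings[p] = {
            "simulated": sim,
            "observed": obs,
            "within_tolerance": obs <= sim * (1 + tolerance),
        }
    return findings
```

A comparison like this would let the OTA report, per percentile, whether the DT load simulator under-predicted latency observed at the field sites.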
4.11 Operational Test and Evaluation Plans and Reports
The OTA will conduct OA and OT&E within T&E protocols and procedures as outlined in Section 4.1.
The details of OA and OT&E will be included in the Operational Assessment Plan and OTEP and
subsequent reports.
For each OA, an OA Plan will be submitted to the DHS DOT&E for approval. The formal OA period starts after the Authorization to Operate (ATO) has been granted, the assessments of developmental performance testing have been completed, and TECS Modernization training has been conducted; it ends at the conclusion of user observation and assessment at the operational pilot site.
The OTA's role is to conduct user observations and document them in a Letter of Observation (LOO) to DHS OT&E and the ICE TECS Modernization Program Office. The LOO will document observed functionality and capabilities, identify potential problem areas that need attention in order to perform at an acceptable level prior to formal OT, and identify any enhancements proposed by operational users. The LOO does not resolve COIs, does not reach conclusions regarding effectiveness or suitability, and does not make a recommendation regarding fielding introduction/release. The LOO will be made available to the PM within ten (10) working days from the completion of a test event.
The OTA will develop an OA Report that documents the results of each OA; it will include a summary of the risk of deploying TECS Modernization IOC, a determination of the potential to meet IOC and/or FOC, and recommendations for OT&E activities for the subsequent final TECS Modernization production release.
For OT&E, the OTEP will be reviewed and approved by DOT&E. The OT&E Report will be delivered to DOT&E, and DOT&E will provide a letter of assessment to the Acquisition Review Board.
5 Test and Evaluation (T&E) Resource Summary
This section summarizes the cost and resources that will be used for conducting DT&E and OT&E.
Table 9: Summary of T&E Funding Requirements
This table will be completed upon the finalization of the LCCE.
Table 10: Summary of Testing Resources

1. Functional Testing
Description: Verifies the developer delivers products that meet the functional requirements as described in the SRD and the content of the DD, ICA, DMP, and other artifacts to determine if the product performs the business functions as documented.
Deliverables: 1. Development Test Plan; 2. Test Coverage Matrix; 3. Development Test Analysis Report
Environment: Government DEV Environment
SLM Gate Review: IRR
Data Set: 50K Scrubbed Records* (Note: Other data sets may be identified in the DTP)
Test Tools: HP Application Lifecycle Management (ALM), JIRA, Fortify, Subversion, Jenkins, Web Inspect, DB Protect
Responsibility: Development Team (ICM System, Data Migration, Interfaces, and Data Warehouse)

2. Interoperability Testing
Description: End-to-end testing that verifies all TECS Modernization Program system components maintain data integrity and can operate in coordination with other systems in the same environment.
Deliverables: 1. System Interoperability Test Plan; 2. System Interoperability Test Analysis Report
Environment: Government DEV/INT Environment
SLM Gate Review: PRR
Data Set: 50K Scrubbed Records (Note: Other data sets may be identified in the System Interoperability Test Plan)
Test Tools: HP ALM, JIRA, Fortify, Subversion, Jenkins, Web Inspect, DB Protect
Responsibility: Coordinated by Test Manager; conducted by Test Support Team; support provided by Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse)

3. Performance Testing
Description: Detects any performance and capacity limitations by generating system load that emulates the behavior of users conducting business transactions, determines when performance begins to degrade, and identifies bottlenecks across the system application and infrastructure.
Deliverables: 1. System Performance Test Plan; 2. System Performance Test Analysis Report
Environment: Government PERF TEST Environment**
SLM Gate Review: PRR
Data Set: TBD
Test Tools: HP ALM, JIRA, Load Runner
Responsibility: Coordinated by Test Manager; conducted by Test Support Team; support provided by Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse)

4. Section 508 Compliance Testing
Description: Tests all ICE systems with user interfaces for Section 508 compliance using the OAST-approved testing package.
Deliverables: 1. Section 508 Remediation Plan (if required)
Environment: Government TEST Environment
SLM Gate Review: PRR
Data Set: NA
Test Tools: Tools provided by the Section 508 Coordinator
Responsibility: Coordinated by Test Manager; conducted by Test Support Team; support provided by the Section 508 Coordinator and Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse)

5. UAT
Description: Allows production users to test systems before deployment to ensure that the developed system meets their needs.
Deliverables: 1. User Acceptance Test Plan; 2. User Acceptance Test Report
Environment: Government UAT Environment
SLM Gate Review: PRR
Data Set: 50K Scrubbed Records (Note: Other data sets may be identified in the User Acceptance Test Plan)
Test Tools: HP ALM, JIRA
Responsibility: Coordinated by Test Manager; conducted by End Users; support provided by the Test Support Team and Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse)

6. System Security Authorization Testing (C&A)
Description: Validates implementation of security requirements and controls in the system and identifies potential intrusion or sensitive data exposure vulnerabilities.
Deliverables: 1. Security Authorization Package; 2. Authority to Operate
Environment: Government TEST Environment
SLM Gate Review: PRR
Data Set: TBD
Test Tools: HP ALM, Nessus, Web Inspect, DB Protect
Responsibility: Coordinated by Test Manager; conducted by ISSO; support provided by the Test Support and Development Teams (ICM System, Data Migration, Interfaces, and Data Warehouse)

7. Operational Test and Evaluation
Description: Determines if the system being delivered fulfills operational effectiveness and suitability.
Deliverables: 1. Operational Assessment Plan; 2. Operational Test Evaluation Plan; 3. OT&E Report
Environment: Government UAT and PRODUCTION Environments
SLM Gate Review: PRR, OTRR, ORR
Data Set: 1. 50K Scrubbed Records; 2. Production (Note: Other data sets may be identified in the Operational Assessment Test Plan.)
Test Tools: HP ALM, JIRA
Responsibility: Conducted by OTA; support from End Users and the OTA vendor

*50K Scrubbed Records – These are production records that were scrubbed per ICE Privacy's instructions.
**PERF – This is a pre-production environment within Infrastructure as a Service (IaaS) where a production equivalent environment will be established for System Performance.
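The scrubbed-records footnote above implies a repeatable de-identification step before production data is reused for testing. The following is a minimal sketch of field-level scrubbing; the field names, salt, and hashing rule are hypothetical illustrations, as the actual fields and rules come from ICE Privacy's instructions.

```python
import hashlib

# Hypothetical PII fields; the real scrub list is defined by ICE Privacy.
PII_FIELDS = {"subject_name", "dob", "passport_number"}


def scrub_record(record: dict, salt: str = "test-data-salt") -> dict:
    """Return a copy of `record` with PII fields replaced by salted one-way
    hashes, so records stay distinct (joins and deduplication still work)
    but the values are no longer readable as personal data."""
    scrubbed = dict(record)
    for field in PII_FIELDS & record.keys():
        digest = hashlib.sha256((salt + str(record[field])).encode()).hexdigest()
        scrubbed[field] = digest[:16]
    return scrubbed
```

Applied over a production extract, a routine like this yields a deterministic test data set: the same input record always scrubs to the same tokens, which keeps interoperability and UAT runs repeatable.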
Appendix A: Bibliography
1. Immigration and Customs Enforcement System Lifecycle Management Handbook
2. TECS Modernization SELC Tailoring Plan
3. TECS Modernization Operational Requirements Document
4. TECS Modernization Life Cycle Cost Estimate
Appendix B: Acronyms
ALM: Application Lifecycle Management
Ai: Inherent Availability
AO: Authorizing Official
Ao: Operational Availability
ATO: Authority to Operate
C&A: Certification and Accreditation
CAE: Component Acquisition Executive
CBP: Customs and Border Protection
CDF: Compliance Determination Form
CDR: Critical Design Review
CISO: Chief Information Security Officer
CO: Certifying Official
COI: Critical Operational Issue
COTS: Commercial Off-the-Shelf
CTP: Critical Technical Parameters
DAG: Defense Acquisition Guidebook
DC: Data Center
DD: Design Document
DHS: Department of Homeland Security
DMP: Data Management Plan
DOT&E: Director of Operational Test and Evaluation
DT&E: Developmental Test and Evaluation
DTAR: Development Test Analysis Report
DTP: Development Test Plan
EA: Enterprise Architecture
EAD: Enterprise Architecture Decision
FIPS: Federal Information Processing Standard
FISMA: Federal Information Security Management Act
FOC: Full Operational Capability
FOT&E: Follow-On Operational Testing & Evaluation
FY: Fiscal Year
HSI: Homeland Security Investigations
IAD: Information Assurance Division
IaaS: Infrastructure as a Service
ICA: Interface Control Agreement
ICE: Immigration and Customs Enforcement
ICM: Investigative Case Management
IEF: Integrated Evaluation Framework
IOC: Initial Operational Capability
IOT&E: Initial Operational Testing & Evaluation
IRR: Integration Readiness Review
KPP: Key Performance Parameter
LOO: Letter of Observation
MBTD: Mission Based Test Design
MOE: Measure of Effectiveness
MOP: Measure of Performance
M&S: Modeling and Simulation
MOS: Measure of Suitability
NIST: National Institute of Standards and Technology
OA: Operational Assessment
OAST: Office of Accessible System and Technology
O&M: Operations and Maintenance
OCIO: Office of the Chief Information Officer
OIT: Office of Information Technology
OPR: Office of Professional Responsibility
ORD: Operational Requirements Document
ORR: Operational Readiness Review
OT&E: Operational Test and Evaluation
OT: Operational Test
OTA: Operational Test Agent
OTEP: Operational Test and Evaluation Plan
OTRR: Operational Test Readiness Review
PDR: Preliminary Design Review
PM: Program Manager
PRR: Production Readiness Review
RTM: Requirements Traceability Matrix
SELC: Systems Engineering Lifecycle
SLM: System Lifecycle Management
SRD: System Requirements Document
T&E: Test and Evaluation
TAR: Test Analysis Report
TEMP: Test and Evaluation Master Plan
TPR: Test Problem Report
UAT: User Acceptance Testing
VPN: Virtual Private Network
WIPT: Working Integrated Project Team
Appendix C: Points of Contact

Name | Organization | Telephone Number | Project Role