
Verification and Validation Guide for Data-Driven Systems Engineering

This guide provides approaches for planning and executing Verification and Validation for Data-Driven Systems Engineering.

 

1. VERIFICATION AND VALIDATION

Verification and Validation (V&V) are independent evaluation processes for determining a system’s conformance to requirements and suitability for use. V&V occurs on the right side of the Systems Engineering (SE) V as shown in Figure 1-1.

Figure 1-1. V&V on the Systems Engineering V

 

1.1 Verification

Verification is system focused, proving that the solution was built according to agreed-upon specification-level requirements. It shows consistency between design decisions and the assumptions underlying requirements. Verification seeks to answer “Are we building the product right?”1

Verification occurs prior to validation. It is frequently associated with the terms “Unit Test” and “Developmental Test and Evaluation”.

Verification methods are the means by which requirements can be verified. They include:

  • Analysis – mathematical modeling and analytical techniques used to predict design suitability or performance based on calculated data or data derived from lower-level testing; generally used when other methods are not cost effective
  • Demonstration – system or lower-level operation used to show that a requirement can be achieved; verifies high-level functionality but lacks the detailed data associated with testing
  • Inspection – visual examination of design features or identifiable markings
  • Modeling and Simulation – certified models and/or simulations used to predict design suitability or performance; can be considered a subcategory of analysis and is generally used when other methods are not cost effective
  • Testing – system or lower-level operation used to obtain detailed data to verify performance, or to provide sufficient information to verify performance through further analysis; verifies detailed functionality

The verification methods selected must be executable within program constraints (e.g., time, budget, resources).

 

1.2 Validation

Validation is operationally focused, proving that solution-independent requirements are satisfied. It addresses stakeholder satisfaction and helps to ensure that the system will ultimately be part of the accepted solution within the target environment. Validation seeks to answer “Are we building the right product?”2

Validation occurs after verification. It is frequently associated with the term “Operational Test and Evaluation”.

Validation methods are the means by which stakeholder satisfaction with the system can be validated. They include:

  • Formal and Informal Reviews – reviews of the system and supporting operational procedures to predict suitability for performing operational concepts
  • Modeling and Simulation – certified models and/or simulations used to predict effectiveness and suitability for performing operational concepts
  • Formal and Informal Demonstrations – system-level operation in a relevant environment designed to show that the system satisfies stakeholder expectations; lacks the detailed data associated with operational tests
  • Functional Analysis – analytical techniques used to predict effectiveness and suitability for performing operational concepts
  • Operational Tests – system-level operation in a relevant environment designed to obtain detailed data necessary to show that the system satisfies stakeholder expectations

The validation methods selected must be executable within program constraints (e.g., time, budget, resources).

 

2. TEST PLANNING

Test planning involves preparation at multiple levels. This section discusses the capabilities Innoslate provides to document, manage, and report on test plans in support of V&V.

 

2.1 Test Preparation

 
2.1.1 Test Plans

Test plans describe the orchestration and intended execution of tests from a program perspective. Test plans should minimally include the following items:

  • Test objectives
  • Test cost, schedule, and risks
  • Resources and test support requirements
  • Items to be tested
  • Testing approach
  • Data collection requirements

Innoslate implements test plans as Innoslate Documents as shown in Figure 2-1. In Documents View, create a new document with the ‘Test Plan Document’ type. Select a pre-loaded template or enter your own format. Populate the test plan with modeled content as described in Sections 2.1.2 and 2.3.3.

 

Figure 2-1. Documents View of Test Plan

 

 
2.1.2 Test Plan Models

Innoslate provides a range of modeling capabilities that can be used to generate diagrams, charts, and tables for incorporation into a test plan. A summary of potential test plan models is shown in Table 2-1.

Table 2-1. Potential Test Plan Models

 

 

2.2 Requirements and Verification

 
2.2.1 Requirement Verification

A requirement is verifiable when you can “express the expected performance and functional utility so that verification is objective and preferably quantitative”3. In Innoslate, requirements are verified through the execution of Test Cases.

 

There are two implementation approaches for making a requirement verifiable, as shown in Figure 2-2:

  • Approach 1: Ensure Original Requirement Verifiable – Implemented in Innoslate with Requirement ‘verified by’ Test Case(s)
  • Approach 2: Create Separate Verification Requirement – Implemented in Innoslate with Requirement ‘verified by’ Verification Requirement and Verification Requirement ‘verified by’ Test Case(s)

Figure 2-2. Innoslate Verification Approaches
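As a conceptual illustration of the two approaches, the sketch below (Python, with invented entity names; not Innoslate’s API or schema) models each as a chain of ‘verified by’ relationships:

    # Illustrative sketch of the 'verified by' chains; the Entity class
    # and all names here are hypothetical, not Innoslate's data model.
    class Entity:
        def __init__(self, name, kind):
            self.name = name
            self.kind = kind          # e.g., "Requirement", "Test Case"
            self.verified_by = []     # targets of the 'verified by' relationship

    # Approach 1: the original requirement is directly verifiable.
    req1 = Entity("R.1 The system shall ...", "Requirement")
    tc1 = Entity("TC.1 ...", "Test Case")
    req1.verified_by.append(tc1)      # Requirement 'verified by' Test Case

    # Approach 2: a separate verification requirement sits in between.
    req2 = Entity("R.2 The system shall ...", "Requirement")
    vr2 = Entity("VR.2 Verify that ...", "Verification Requirement")
    tc2 = Entity("TC.2 ...", "Test Case")
    req2.verified_by.append(vr2)      # Requirement 'verified by' Verification Requirement
    vr2.verified_by.append(tc2)       # Verification Requirement 'verified by' Test Case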

 

Verification methods, as described in Section 1.1, should be determined as part of the requirements development process. To specify a requirement’s verification method, add the corresponding Verification Method Label(s) through the Requirement Metadata sidebar as shown in Figure 2-3. For Approach 1, Verification Method Labels are added to Requirement entities. For Approach 2, Verification Method Labels are added to Verification Requirement entities.

 


Figure 2-3. Verification Method Labels in Sidebar

 


2.2.2 Requirement Verification Reports

Requirement verification reports should be run from the execution level appropriate to the verification approach (i.e., the Requirements document for Approach 1, the Verification Requirements document for Approach 2), as shown in Figure 2-2.

The Verification Cross Reference Matrix (VCRM) shows the verification method that is associated with each requirement. From within Documents View, filter the document by ‘Only Requirements’. Select ‘Reports’ and report type ‘VCRM Output (XLSX)’. Choose the desired VCRM column options, enter the file name, and select ‘Create’. The VCRM report is shown in Figure 2-4.

 


Figure 2-4. VCRM Report
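Conceptually, a VCRM is a requirement-by-method matrix. The sketch below is a hypothetical illustration of that tabulation using pandas, not the report generator itself; the requirement IDs and method assignments are invented:

    # Minimal VCRM sketch: one row per requirement, one column per
    # verification method, with the assigned method marked. All data
    # below is invented for illustration.
    import pandas as pd

    METHODS = ["Analysis", "Demonstration", "Inspection",
               "Modeling and Simulation", "Testing"]

    requirements = [
        {"ID": "R.1", "Text": "The system shall ...", "Method": "Testing"},
        {"ID": "R.2", "Text": "The system shall ...", "Method": "Analysis"},
        {"ID": "R.3", "Text": "The system shall ...", "Method": "Inspection"},
    ]

    rows = []
    for req in requirements:
        row = {"ID": req["ID"], "Requirement": req["Text"]}
        for method in METHODS:
            row[method] = "X" if req["Method"] == method else ""
        rows.append(row)

    vcrm = pd.DataFrame(rows)
    vcrm.to_csv("vcrm.csv", index=False)  # Innoslate exports XLSX; CSV keeps the sketch dependency-light
    print(vcrm)

The same tabulation extends naturally to an RVTM by adding a column listing each requirement’s associated Test Case(s).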

 

The Requirements Verification Traceability Matrix (RVTM) shows the verification method and associated test case(s) for each requirement. From within Documents View, filter the document by ‘Only Requirements’. Select ‘Reports’ and report type ‘RVTM Output (XLSX)’. Choose the desired RVTM column options, enter the file name, and select ‘Create’. The RVTM report is shown in Figure 2-5.

 


Figure 2-5. RVTM Report

 

2.3 Developing Tests

 

2.3.1 Test Management

Test management within Innoslate occurs in Test Center. The Test Center Dashboard is used to create and manage Test Suites. A Test Suite is an Artifact entity representing a test event. Separate Test Suites should be created for different V&V events (e.g., Unit Tests, Component Tests, System Tests, Operational Test and Evaluation).

 

The Test Suite View renders the hierarchical collection of related Test Cases comprising the Test Suite, as shown in Figure 2-6. It supports test management, the visualization of test results, and progress tracking. Test Suite View allows the creation of Test Cycles which persist a snapshot of the cycle’s test results.


Figure 2-6. Test Suite View

 

2.3.2 Test Cases

Test Cases are tests that occur within a test event. Test Case is a subclass of Action with the additional attributes of Expected Results, Actual Results, Status, and Setup. The Test Case ‘Status’ attribute is visualized and aggregated within the Test Suite View, as shown in Figure 2-7.

Figure 2-7. Test Case 'Status'
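To make the structure concrete, the sketch below models Test Case as a subclass of Action with the attributes listed above. Only the attribute names come from this guide; the classes themselves are illustrative, not Innoslate’s schema:

    from dataclasses import dataclass, field

    @dataclass
    class Action:
        name: str
        description: str = ""
        children: list = field(default_factory=list)  # decomposing child Actions

    @dataclass
    class TestCase(Action):
        setup: str = ""
        expected_results: str = ""
        actual_results: str = ""
        status: str = "Not Run"  # e.g., Not Run, Passed, Failed, Blocked

    tc = TestCase(
        name="TC.1 Verify detection range",
        setup="Configure the test bench ...",
        expected_results="Detection at or beyond the specified range",
    )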

Requirements are verified through the execution of Test Cases. Priority should be given to ensuring that a Test Case’s description, setup information, and expected results are created during V&V planning in coordination with requirement developers.

The Test Cases Output (DOCX) report provides a comprehensive overview of the Test Cases contained within a Test Suite. From within Test Suite View, select ‘Reports’ and report type ‘Test Cases Output (DOCX)’. Choose the desired Test Case attribute options, enter the file name, and select ‘Create’. The Test Cases Output (DOCX) report is shown in Figure 2-8.

Figure 2-8. Test Cases Output (DOCX) Report

 

2.3.3 Modeling With Test Cases

Since Test Cases are a subclass of Action, they can be viewed and manipulated in any diagram available to Actions. Test procedures can be created from a Test Suite’s Test Cases. Test procedure modeling can be used to:

  • Establish detailed test procedure schedules
  • Estimate test procedure resource utilization and cost

To establish detailed test procedure schedules, first open any decomposed Test Case as an Action Diagram. Then arrange the decomposing child Test Cases to orchestrate the test process. Finally, open the decomposed Test Case as a Timeline Diagram to view and adjust the corresponding test schedule. Figure 2-9 shows a test procedure schedule model.


Figure 2-9. Test Procedure Schedule Model
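The sketch below shows the idea behind such a schedule: serially ordered child Test Cases, each with a duration, yield start and finish times. The step names and durations are invented; in Innoslate the Timeline Diagram derives this view from the Action Diagram’s ordering:

    # Hypothetical serial test procedure; durations in hours.
    procedure = [
        ("TC.1.1 Set up test bench", 2.0),
        ("TC.1.2 Calibrate instruments", 1.5),
        ("TC.1.3 Execute test runs", 4.0),
        ("TC.1.4 Tear down and archive data", 1.0),
    ]

    start = 0.0
    for name, duration in procedure:
        finish = start + duration
        print(f"{name}: start {start:4.1f} h, finish {finish:4.1f} h")
        start = finish  # the next step begins when this one ends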

 

To estimate test procedure resource utilization and cost, add resource and cost information to the modeled test procedures. Then run a Monte Carlo Simulation from within the test procedure’s Action Diagram.

 

Expected resource utilization and cost can be determined from the values displayed on simulator panels (e.g., Resource (Radar), Cost Bar Chart, Status) and within simulation reports (e.g., Monte Carlo Resource Report, Monte Carlo Cost Report).
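The sketch below illustrates the underlying Monte Carlo idea (not Innoslate’s simulator): sample uncertain step durations, convert them to cost with an assumed labor rate, and summarize the resulting distribution. All bounds and the rate are invented:

    import random
    import statistics

    # (min, most likely, max) duration in hours for each test step
    steps = [(1.5, 2.0, 3.0), (1.0, 1.5, 2.5), (3.0, 4.0, 6.0), (0.5, 1.0, 2.0)]
    RATE = 120.0   # assumed labor cost per hour
    N = 10_000     # number of simulation runs

    totals = []
    for _ in range(N):
        hours = sum(random.triangular(lo, hi, mode) for lo, mode, hi in steps)
        totals.append(hours * RATE)

    print(f"mean cost: ${statistics.mean(totals):,.0f}")
    print(f"90th percentile: ${sorted(totals)[int(0.9 * N)]:,.0f}")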

 

3. TEST EXECUTION

Test execution involves coordinating and running tests. This section discusses the capabilities Innoslate provides to facilitate and execute test plans in support of V&V.

 

3.1 Facilitating Tests

The Innoslate project dashboard is the first page displayed when a user accesses a project. Project dashboard widgets can be customized to facilitate V&V by providing testers with situational awareness and one-click access to test resources. Suggested dashboard uses in support of V&V and their corresponding widgets are shown in Table 3-1.

Table 3-1. Dashboard Uses Supporting V&V

 

For all users to see the customized widgets, a Project Owner should create the dashboard layout. Each widget that is added should have the ‘Save to Project’ switch turned on in its settings. After the customized widgets have been added, the Project Owner should select the ‘Save Dashboard Layout’ button.

 

3.2 Running Tests

Test Cases are executed in accordance with the Test Procedure Schedule. A Test Case’s ‘Actual Results’ and ‘Status’ attributes should be filled in during V&V testing. ‘Status’ values will be rolled up within the Test Suite hierarchy as shown in Figure 2-7.
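The roll-up behaves like a worst-status aggregation over the Test Case hierarchy. The sketch below captures that idea; the precedence order is an assumption for illustration, not Innoslate’s documented rule:

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        status: str = "Not Run"
        children: list = field(default_factory=list)

    PRECEDENCE = ["Failed", "Blocked", "Not Run", "Passed"]  # worst first

    def rolled_up_status(node):
        """Return the node's own status, or the worst status among its children."""
        if not node.children:
            return node.status
        return min((rolled_up_status(c) for c in node.children),
                   key=PRECEDENCE.index)

    suite = Node("System Tests", children=[
        Node("TC.1", status="Passed"),
        Node("TC.2", status="Blocked"),
    ])
    print(rolled_up_status(suite))  # -> Blocked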

Test Cases that are ‘Blocked’ may generate Issue entities. Relate the Test Case with the associated Issue entity using the ‘causes’ relationship.

A Test Suite can be run multiple times. After executing and recording the results for each Test Case within the Test Suite, save the Test Cycle. In the Test Suite View, select ‘More’ and then ‘New Test Cycle.’ Enter a name for the Test Cycle and select ‘Create’. Saving a Test Cycle persists a snapshot of the test results and resets the Test Case statuses to ‘Not Run’. Saved Test Cycles can be selected for viewing on the sidebar’s ‘Test Cycles’ tab.
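Conceptually, saving a Test Cycle is a snapshot-and-reset operation, as in the sketch below (illustrative structures, not Innoslate internals):

    import copy
    import datetime

    def save_test_cycle(test_cases, cycles, name):
        cycles.append({
            "name": name,
            "saved": datetime.datetime.now().isoformat(timespec="seconds"),
            "results": copy.deepcopy(test_cases),  # frozen snapshot of this cycle
        })
        for tc in test_cases:                      # reset for the next cycle
            tc["status"] = "Not Run"
            tc["actual_results"] = ""

    cycles = []
    test_cases = [{"name": "TC.1", "status": "Passed", "actual_results": "..."}]
    save_test_cycle(test_cases, cycles, "Cycle 1")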

 

3.3 Reporting Test Results

The TVM Output (XLSX) report provides an overview of a Test Suite’s Test Cases, including their corresponding statuses and verified requirements. From within Test Suite View, select ‘Reports’ and report type ‘TVM Output (XLSX)’. Choose the desired Test Case attribute options, including ‘Status’; enter the file name and select ‘Create’. The TVM Output (XLSX) report is shown in Figure 3-1.

Figure 3-1. TVM Output (XLSX) Report

 

Upon test completion, a detailed Test Suite status report can be generated by running the Test Cases Output (DOCX) Report, as described in Section 2.3.2.

 

4. CONCLUSION

Innoslate supports Verification and Validation activities, providing a versatile toolset for planning, executing, and reporting on test events.

 

References

1   "Verification and Validation: Overview." AcqNotes. March 15, 2024.

           https://acqnotes.com/acqnote/careerfields/verification-validation.

2   Larson, Wiley J. Applied Space Systems Engineering. Boston, MA: McGraw-Hill, 2009.

 

V&V Using Innoslate's Test Center Webinar

V&V Using Innoslate's Test Center Webinar

Want to sit back, relax, and listen? Watch the webinar recording!

Read More
Plan Verification & Validation Early in the Lifecycle

Plan Verification & Validation Early in the Lifecycle

After watching the movie “Deepwater Horizon” [Deepwater], I observed the catastrophic consequences when critical testing is skipped. The film...

Read More
Meeting the Challenges for Digital Engineering

Meeting the Challenges for Digital Engineering

Introduction Recently, a senior US Air Force leader gave a presentation with the slide in Figure 1 showing the challenge areas for continued...

Read More