Office of Operations
21st Century Operations Using 21st Century Technologies

Real-Time System Management Information Program Data Exchange Format Specification — Implementation Guidance

6. Testing

6.1 Introduction

A systems engineering process is recommended for implementation and testing of a DXFS interface to ensure that: 1) the system satisfies all user needs; 2) requirements are verified, resulting in a system free of defects; and 3) the system interface is built on time and on budget. Significant cost savings can be realized by building a system without having to continually rework it to satisfy new user needs and requirements. The up-front effort in defining user needs and requirements can also lead to significant time savings by reducing rework during later stages of system development.

The DXFS has been developed using a systems engineering process, described below:

  • The first step in the process, described in Section 2 of this report, was to develop a concept of operations that provides the reader with a detailed description of the scope of the RTSMIP, the user needs that the RTSMIP will address, and the operational scenarios that consider the center-to-center interfaces that will be a part of the RTSMIP.
  • The second step, described in Section 3, demonstrated the process of elicitation of requirements that satisfy the user needs in the concept of operations.
  • The third step, described in Section 4, demonstrated the selection of design elements from existing system interface standards that fulfill the requirements.
  • Section 5 of this report dealt with implementation issues to identify additional material (outside of the scope of the DXFS) necessary to develop a complete system interface specification.
  • Lastly, this section describes testing.

The focus of this section is on development of test documentation to test system interface specification compliance. During the test phase, the system interface implementation is tested against the requirements specified for the project. A complete treatment of the topic of software testing is beyond the scope of this guide, and hence, no attempt is made to show a complete test example.

Field experience from system interface testing (such as testing a TMDD implementation) yields two major issues that are not currently being addressed during system interface testing:

  1. Lack of boundary testing. Boundary testing is intended to test that the content of messages is correct and complete. This includes testing that data values are within stated value ranges, that enumeration values are properly selected from the standardized list of choices, and that data values conform to stated field lengths (a brief illustration follows this list).
  2. Proprietary design. Proprietary designs are being implemented that supplant the national standards, leading to non-interoperable systems and non-conformance with the standards.
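
To make boundary testing concrete, the following sketch (in Python; the value range, enumerated list, and field length shown are illustrative assumptions, not values taken from any of the underlying standards) applies the three kinds of checks to individual data values:

  # Minimal sketch of the three kinds of valid value checks used in boundary testing.
  VALUE_RANGE = (0, 255)        # illustrative value range for a numeric data element
  ENUMERATION = {1, 2, 3}       # illustrative standardized list of choices
  MAX_LENGTH = 32               # illustrative field length for a string data element

  def in_range(value, low, high):
      return low <= value <= high

  def in_enumeration(value, allowed):
      return value in allowed

  def within_length(text, max_length):
      return len(text) <= max_length

  # Boundary testing exercises values at and just beyond the stated limits.
  assert in_range(0, *VALUE_RANGE) and in_range(255, *VALUE_RANGE)
  assert not in_range(256, *VALUE_RANGE)          # just outside the range must be rejected
  assert in_enumeration(2, ENUMERATION)
  assert not in_enumeration(9, ENUMERATION)       # not in the standardized list
  assert within_length("EXAMPLE-ID-01", MAX_LENGTH)

A project specification would substitute the value ranges, enumerations, and lengths defined in its tailored DXFS design.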

6.1.1 DXFS Conformance

One of the goals of Section 1201 was to realize national interoperability of systems providing real-time information. To be conformant with the DXFS, a system implementation must be conformant with the underlying standards (TMDD, TCIP, SIRI, and OASIS CAP) upon which the DXFS is based. A system that does not conform to one or more of these standards is not conformant with the DXFS.

6.1.2 Compliance with a DXFS Project Specification

Tailoring the DXFS for a specific project was described in Sections 2 through 4 of this report. The end result is a specification of requirements and design tailored to satisfy specific project needs for an RTSMIP implementation. The NRTM was designed to allow a project to develop a needs-based specification.

To test a DXFS system interface, the interface must be isolated from the hardware and central system software. Testing focuses on compliance with the requirements and the system interface specification, not on the operation(s) the implementation is attempting to support via the interface software implementation. Testing will verify compliance with the specification requirements and ensure that the dialogs and data content of message exchanges are implemented correctly.

Testing compliance with a DXFS project specification can be summarized in three steps:

  1. Write test documentation. (Test documentation is described with examples in this section of the report.)
  2. Conduct tests in accordance with the test documentation and document the test results.
  3. When all pass/fail items are passed, the implementation is compliant with a project-specific DXFS.

6.1.3 Test Phases

A system can be thought of as being composed of many subunits. The system testing described in this section follows the path of system development and is described in four phases as follows:

  • Unit Test. Conducted to verify that a particular subunit of the system is complete and fulfills all the requirements allocated to that subunit.
  • Integration Test. Conducted to verify that the subunits of the system, when integrated, will work together and will fulfill all system-level requirements.
  • System Acceptance Test. Conducted after system installation and commissioning, and verifies the system is ready for operation.
  • Periodic Maintenance Test. This test phase is designed to allow the system to be periodically tested to ensure all system functions are operating properly.

6.2 Test Documentation

Test documentation is a key element of a testing program. Test documentation includes test plans, test designs, test cases, test procedures, and test reports. Test documentation may be developed by the vendor, the agency, a test laboratory, or a consultant, or it may be based on test documentation developed by another agency as part of its qualified products program. Testing is conducted by a combination of vendor, agency, and possibly an independent laboratory to verify that an ITS system complies with the agency’s specification.

Developing agency test documentation can take a significant amount of time and require coordination among many parties. It is recommended that test plan development begin after system interface requirements have been completed and approved. Test design and test case development can begin after the agency specification requirements have been approved and signed off. Test plan execution occurs throughout implementation, and test reports document test plan execution. Test documentation, as outlined, ensures that testing is thoroughly documented. In addition, test designs, test cases, and test procedures should be regularly reviewed based on past experience and results.

6.2.1 Standards that Support Test Documentation

As in previous sections, this section relies on existing standards to define the content and processes for test documentation. IEEE Std 829-1998, IEEE Standard for Software and System Test Documentation (hereafter IEEE Std 829-1998), provides a comprehensive overview of the processes and documentation for testing.

IEEE Std 829-1998 specifies the form and content of individual test documents, but it does not specify a required set of documents. The documents outlined in IEEE Std 829-1998 cover test planning, test specification, and test reporting. IEEE Std 829-1998 provides the following overview.

6.2.1.1 Test Plan

The test plan prescribes the scope, approach, resources, and schedule of the testing activities. It identifies the items to be tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with the plan (IEEE Std 829-1998, IEEE Standard for Software and System Test Documentation, IEEE, 16 September 1998, p. iii).

6.2.1.2 Test Specifications

Test specifications are covered by three document types (IEEE Std 829-1998, IEEE Standard for Software and System Test Documentation, IEEE, 16 September 1998, p. iii):

  1. A test design specification refines the test approach and identifies the features to be covered by the design and its associated tests. It also identifies requirements, test cases, and test procedures necessary to accomplish the testing and specifies the feature pass-fail criteria.
  2. A test case specification documents the actual values used for input along with the anticipated outputs. A test case also identifies constraints on the test procedures resulting from use of that specific test case. Test cases are separated from test designs to allow for use in more than one design and to allow for reuse in other situations.
  3. A test procedure specification identifies all steps required to operate the system and exercise the specified test cases in order to implement the associated test design.

6.2.1.3 Test Reports

Test reporting is covered by four document types (IEEE Std 829-1998, IEEE Standard for Software and System Test Documentation, IEEE, 16 September 1998, p. iii):

  1. A test item transmittal report identifies the test items being transmitted for testing in the event that separate development and test groups are involved or in the event that a formal beginning of test execution is desired.
  2. A test log is used by the test team to record what occurred during test execution.
  3. A test incident report describes any event that occurs during the test execution which requires further investigation.
  4. A test summary report summarizes the testing activities associated with the execution of test plan specifications. The test summary report can summarize key results captured in the test logs and test incident reports.

6.2.2 Example Test Documentation Framework for a DXFS Implementation

The IEEE standards that cover system engineering must be tailored to address the specific needs of a particular system engineering process, project plan, project life cycle development process, and the specific part of the system being tested. Figure 4 provides a diagram showing an example of tailoring IEEE Std 829-1998 to support system interface testing.

Figure 4 is a graphic showing a test documentation framework with layers for the test plan, test design specifications, test case specifications, test procedure specifications, test execution, and test reports. These layers all flow to the Test Plan Execution Summary Report.

Figure 4. Diagram. IEEE 829-1998-based Test Documentation Framework for the RTSMIP DXFS.
(Source: IEEE.)

At the top of Figure 4 are the test plans for a DXFS system interface. The diagram shows a master test plan that includes IEEE Std 829-1998 test plan information for each test phase (unit test, integration test, system acceptance test, and periodic maintenance test). For example, a separate section of the master test plan will be developed for unit testing.

A test design specification will identify the user need(s) being validated, the set of requirements that satisfy those user needs, and the list of associated test cases that will verify implementation of the requirements in the system interface.

Test cases may be reused across test designs as long as the test case input and output specifications are the same. For example, a test case may be created to test that an error report message is properly transmitted across the system interface.

A test case will identify the test case input specification(s) that define the valid values to be contained in a message and that will result in a positive or negative test. For example, a positive test case will be designed to generate a message in which all the contained data concepts pass the valid value rule/criteria (e.g., whether the data concept contains a value from an enumerated list, whether the data concept contains a value that is within the value range specified in the valid value rule, or whether a string is of a specified length). A separate test case, with a different test input specification, may test that the system properly reports an error message (negative test case).

The test case developer may also want to identify a test output specification that describes the valid value rules for each data concept in a message (positive test case). In this case, each data concept is individually verified as to whether the criteria for valid values are fulfilled, and whether the entire message (a collection of data concepts) satisfies all valid value criteria for all data concepts in the message. (See Table 20 and Figure 6 for a positive test case output specification.)

A test case will also identify which test procedure(s) are necessary to verify that the requirements associated with the test case are verified. A test procedure may be used across multiple test cases. While the test case describes the inputs and outputs of the test to be executed, the test procedure identifies the steps to be taken to verify the requirements identified in a test case. The test procedure also contains entries that should be noted at test time: tester(s), date and time of test, and notes/comments.

After the test plan and test specifications have been developed, the tester executes the test plan, keeping a test log and recording anomalies in a test incident report. The test logs and test incident reports are then used to generate a test plan summary report.

The key purposes of the testing and the results documented in the test summary report are: 1) to verify that a contractor has fulfilled all the requirements in the system interface specification, and 2) to validate that all user needs are satisfied.

6.3 Test Plan

A test plan is a document describing the scope, approach, resources, and schedule of intended testing activities (IEEE Std 829-1998, IEEE Standard for Software and System Test Documentation, IEEE, 16 September 1998, p. 2).

Either a test plan should be developed that covers just the system interface, or, if a test plan already exists for the project, a test item that represents the system interface should be created. The system interface test plan identifies the test items (in this case, the system interface), the features to be tested (in this case, the requirements to be tested), the testing tasks, who will perform each task, and any risks requiring contingency planning. IEEE Std 829-1998 provides a template for the contents of a test plan; the following elements should be included:

  1. Test Plan Identifier. This is a unique name and number for each test plan.
  2. Introduction. Should include references to all relevant documents, for example, the system interface needs, requirements, specification, and design.
  3. Test Items. This is a description of the software item to be tested. In our case, this will be the system interface.
  4. Features to be Tested. The features to be tested are the requirements. This section could reference the RTM, which in turn would provide a list of requirements to be tested.
  5. Features Not to be Tested. This section might not apply, but it would include a list of features, for example, requirements that will not be tested and why not.
  6. Approach. This section describes the overall approach to testing: who does it, and what the main activities, techniques, and tools are for each major group of features. It explains how to decide that all the features have been tested. IEEE Std 829-1998 also says that this section (and not the Schedule section) is the place to identify constraints, including deadlines and the availability of people and test items. (Cem Kaner, Jack Falk, Hung Nguyen, Testing Computer Software, John Wiley & Sons, 2nd Edition, April 12, 1999, p. 247-248.)

It is worth considering a phased approach to testing to reduce the cost and risk of testing. Phases might include:

  • Unit Test Phase.
  • Integration Test Phase.
  • System Acceptance Phase.
  • Periodic Maintenance Test Phase.

For example, start with a smaller number of test units (e.g., dialogs) initially, and then add more units. This phased testing approach helps in isolating problems (identifying which piece of the system is at fault). The phased approach also allows for multiple iterations for correcting errors encountered, thus reducing risk and helping find defects (one of the reasons testing is conducted). One way to handle the phased approach is to use one test design specification for each phase. The approach section should also include an overview of logistics, test equipment (projectors, protocol analyzers, vendor equipment, test software, tables, and chairs), and the equipment to be tested. This incremental approach to testing helps to isolate defects in the system and, in turn, allows proper verification and validation.

  7. Item Pass-Fail Criteria. Criteria for determining whether the test item, in this case, the system interface software, has passed or failed a test.
  8. Suspension Criteria and Resumption Requirements. Identifies anything that could cause the test to be stopped and describes the rules for stopping and restarting a test. One benefit of having these rules is shortened testing cycles, because a test does not have to start all over again if it is stopped in the middle. This clause should include how and when regression testing (retesting of previously tested elements) will be performed.
  9. Test Deliverables. This is a list of all the test documentation that will be written for the test.
  10. Testing Tasks. Identifies the planned activities required for testing. This section of the test plan may include the following items:
    1. Task Number. A unique identifier for a testing task.
    2. Task Name. A unique name or title for the test task.
    3. Predecessor Tasks. Identifies test task interdependencies.
    4. Responsibility. Identifies who needs to be present to conduct the test task.
  11. Special Skills. Identifies any special items and/or resources required to conduct the test task.
  12. Environmental Needs. Describes the configuration, necessary hardware, software, testing tools, supplies, lab facilities, centers, and reference documentation. Include a diagram showing the setup for the testing: locations and positioning of equipment, people, tables, chairs, and projection systems (so that everyone participating in the testing can see what is happening). Also include a list of equipment, equipment descriptions, and equipment purposes to accompany the diagram.
  13. Responsibilities. Names the groups and/or individual persons responsible for managing, designing, preparing, executing, witnessing, checking, controlling the environment in the laboratory, obtaining the equipment and supplies, setting up equipment and software, writing reports, and approving (who will sign the approval sheet). (Cem Kaner, Jack Falk, Hung Nguyen, Testing Computer Software, John Wiley & Sons, 2nd Edition, April 12, 1999, p. 248.)
  14. Staffing and Training Needs. For each phase of testing (unit test, integration test, system acceptance test, and periodic maintenance test), this section of the test plan should describe who needs to be trained and for what purpose. It should begin with a description of staff, identifying who shall be available versus who shall participate full-time. It could also identify a meeting with all parties involved to let them know what the testing process will be.
  15. Schedule. This is a list of milestones and a timeline of when all resources and people will be needed. It could include a reference to the approach section for test tasks and add start and stop dates and times.
  16. Risks and Contingencies. Identifies significant risks to the testing plus contingency plans. This section includes risks to schedule, potential impacts on cost, technical risks, and what to do if the situation occurs.
  17. Approvals. Lists the personnel who shall approve the plan and spaces for their signatures.

Developing a test plan can take a significant amount of time and require coordination of many parties. It is recommended that test plan development begin after the system interface requirements have been completed and approved.

6.4 Test Design Specifications

A test design specification is a document that specifies the details of the test approach for a feature or combination of features and identifies the associated tests. According to IEEE Std 829-1998, a test design specification contains the following elements:

  1. Test Design Specification Identifier. A unique identifier for the test design specification.
  2. Features to be Tested. List of requirements to be tested.
  3. Approach Refinements. Expands upon the approach described in the test plan. It is recommended that a statement of purpose be added to the approach.
  4. Test Identification. A list and brief description of the test cases associated with this test design.
  5. Feature Pass-Fail Criteria. Explains how the tester will decide whether a feature under test has passed the test.

The purpose of the test design is to identify which test cases verify which requirements in the system interface. Though the diagram in Figure 4 shows a separate test design document for each test phase, a typical implementation will bundle the test design information with the test plan information, organized by test phase.

An example portion of a test design specification is shown in Table 16.

Table 16. Example Portion of a Test Design Specification.
Requirement ID Requirement Title Test Case ID Test Case Title
3.5.3.3.2.1 Send Link Status Information Upon Request TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.3.2.4 Contents of the Link Status Request TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.1.1 Contents of the Traffic Network Information Request TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.1.1.1 Required Traffic Network Information Request Content TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.3.2.5 Contents of the Link Status Information TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.3.2.5.1 Required Link Status Information Content TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification
3.5.3.3.2.5.2.4 Link Travel Time TC001 [List additional test cases here.] Link Status Request-Response Dialog Verification

Note: ID: TD001: Link Status Request Dialog and Request and Response Message Data Content Verification
User Need: 2.5.2.2 Travel Time Data for Roads
Feature Pass-Fail Criteria: This test design is passed when: 1) the dialog represented in TC001 completes round-trip communication, and 2) the data content of the dialog request and response is verified as correct against the referenced input and output specifications.

6.5 Test Case Specifications

A test case specification is a document that specifies the inputs, predicted results, and set of execution conditions on a test. Test case specification development can begin after the system interface requirements are approved.

A test case specification includes the following elements (a notional example follows the list):

  1. Test Case Specification Identifier. A unique identifier for the test case specification. A title is also strongly recommended so the tester can quickly grasp the nature of the test case.
  2. Test Items. Identifies the requirements being verified by this test case. The mapping of requirements to test cases will be documented in the requirements to test case traceability matrix.
  3. Input Specification. Description of input data values, range of values, names of files, or names of memory-resident areas containing test values.
  4. Output Specification. Description of expected output values and tolerances for each data concept, error messages, and expected response times.
  5. Environmental Needs. List of special requirements, equipment, skills, hardware, software, facilities, and staff. This clause also describes any environmental needs that are different (or require additional resources) from what is described in the test plan or test design specification.
  6. Special Procedural Requirements. List of special procedures for startup, setup, and analysis.
  7. Intercase Dependencies. A list of test cases to perform before this one, and contingency if the dependent case fails. This clause allows an ordering of test case execution to be developed.
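
As a notional illustration only (this structure is not defined by IEEE Std 829-1998 or the DXFS), the elements above could be captured in a simple structured record; the values shown anticipate the link status example developed in the remainder of this section:

  from dataclasses import dataclass, field

  @dataclass
  class TestCaseSpecification:
      identifier: str                    # unique test case ID
      title: str                         # descriptive title
      test_items: list                   # requirement IDs verified by this test case
      input_specification: str           # ID of the test case input specification
      output_specification: str          # ID of the test case output specification
      environmental_needs: str = "No additional needs outside of those specified in the test plan."
      special_procedural_requirements: str = "None"
      intercase_dependencies: list = field(default_factory=list)   # test cases to run first

  tc001 = TestCaseSpecification(
      identifier="TC001",
      title="Link Status Request-Response Dialog Verification (Positive Test Case)",
      test_items=["3.5.3.3.2.1", "3.5.3.3.2.4", "3.5.3.1.1", "3.5.3.1.1.1",
                  "3.5.3.3.2.5", "3.5.3.3.2.5.1", "3.5.3.3.2.5.2.4"],
      input_specification="TCIS001",
      output_specification="TCOS001",
  )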

Several test cases might be needed to determine that a requirement is fully satisfied, but at least one test case shall be defined for each requirement. The requirements to test case traceability matrix (RTCTM) is used to match test cases to requirements. Some methodologies recommend creating at least two test cases for each requirement: one should perform positive testing of the requirement and the other should perform negative testing (e.g., testing for invalid values or conditions). Written test cases should include a description of the functions to be tested and the preparation required to ensure that the test can be conducted.

Test cases should cover positive conditions, boundary conditions, and error handling. For example, when testing boundary conditions, if the specification states that valid values for vehicle speed are 0 to 65 miles per hour, use test values of 0, 65, and 66 miles per hour to verify that each test value is properly transmitted (or rejected) across the system interface.

What characterizes a formal, written test case is that there is a known input and an expected output, which is worked out before the test is executed. The known input should test a precondition, and the expected output should test a post-condition.
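
A minimal sketch of this idea, using the vehicle speed example above (the validity rule and the simple harness below are assumptions for illustration, not part of the DXFS):

  # Each formal test case pairs a known input with an expected output, decided before execution.
  # 0 and 65 lie on the boundary and are expected to be accepted; 66 is just outside the range.
  boundary_cases = [
      (0, True),      # lower boundary: expected to be transmitted
      (65, True),     # upper boundary: expected to be transmitted
      (66, False),    # just above the valid range: expected to be rejected
  ]

  def is_valid_speed(speed_mph):
      # Valid value rule from the example specification: 0 to 65 miles per hour.
      return 0 <= speed_mph <= 65

  for known_input, expected_output in boundary_cases:
      actual_output = is_valid_speed(known_input)
      verdict = "Pass" if actual_output == expected_output else "Fail"
      print(f"speed={known_input} mph, expected={expected_output}, actual={actual_output}: {verdict}")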

6.5.1 Example Requirements to Test Case Traceability Matrix (RTCTM)

The purpose of the RTCTM is to verify that the test cases cover testing of all of the system interface requirements at least once.

The RTCTM (test case matrix) contains a Requirement ID, Requirement Title, Test Case ID, and Test Case Title. An example RTCTM is shown in Table 17; a sketch of an automated coverage check over such a matrix follows the table.

Table 17. Example Requirements to Test Case Traceability Matrix.
Requirement ID Requirement Title Test Case ID Test Case Title
3.5.3.3.2.1 Send Link Status Information Upon Request TC001 Link Status Request-Response Dialog Verification
3.5.3.3.2.4 Contents of the Link Status Request TC001 Link Status Request-Response Dialog Verification
3.5.3.1.1 Contents of the Traffic Network Information Request TC001 Link Status Request-Response Dialog Verification
3.5.3.1.1.1 Required Traffic Network Information Request Content TC001 Link Status Request-Response Dialog Verification
3.5.3.3.2.5 Contents of the Link Status Information TC001 Link Status Request-Response Dialog Verification
3.5.3.3.2.5.1 Required Link Status Information Content TC001 Link Status Request-Response Dialog Verification
3.5.3.3.2.5.2.4 Link Travel Time TC001 Link Status Request-Response Dialog Verification
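
Because the matrix exists to show that every requirement is traced to at least one test case, it also lends itself to a simple automated check. The sketch below (the data structure is an assumption for illustration, not a DXFS artifact) flags any requirement that lacks a test case:

  # Traceability from Table 17: each requirement ID maps to the test case(s) that verify it.
  rtctm = {
      "3.5.3.3.2.1":     ["TC001"],
      "3.5.3.3.2.4":     ["TC001"],
      "3.5.3.1.1":       ["TC001"],
      "3.5.3.1.1.1":     ["TC001"],
      "3.5.3.3.2.5":     ["TC001"],
      "3.5.3.3.2.5.1":   ["TC001"],
      "3.5.3.3.2.5.2.4": ["TC001"],
  }

  def uncovered_requirements(requirement_ids, matrix):
      """Return the requirements that are not traced to at least one test case."""
      return [req for req in requirement_ids if not matrix.get(req)]

  # In practice the requirement list comes from the project RTM; here it is simply the keys above.
  missing = uncovered_requirements(rtctm.keys(), rtctm)
  print("Requirements without a test case:", missing or "none")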

6.5.2 Example Test Case Specification

The test case descriptions presented in this section test the requirements for a specific dialog, the dialog messages, and the message content in a single test case. This combination ensures that all the requirements that together comprise a dialog, its messages, and the message content are tested at least once. System interface compliance testing is then accomplished through careful, systematic testing of all of the dialogs that comprise the system interface. An example test case specification is shown in Table 18.

Table 18. Example Test Case Specification.
Purpose: To verify that the system interface implements (positive test case) the requirements for:
  • Link Status Request-Response Dialog message exchange
  • Contents of the Link Status Request Message
  • Contents of the Link Status Information Message
Description: The test case verifies that the dialog, request message content, and response message content are correct by sending a request message (verified to be correct) across the system interface and verifying that the response message is correct. Input and output specifications are provided to verify that the request and response messages are correct per the requirements for those messages.
Test Items: 3.5.3.3.2.1 – Send Link Status Information Upon Request
3.5.3.3.2.4 – Contents of the Link Status Request
3.5.3.1.1 – Contents of the Traffic Network Information Request
3.5.3.1.1.1 – Required Traffic Network Information Request Content
3.5.3.3.2.5 – Contents of the Link Status Information
3.5.3.3.2.5.1 – Required Link Status Information Content
3.5.3.3.2.5.2.4 – Link Travel Time
Input Specification: TCIS001 – LinkStatusRequest (Positive Test Case)
Output Specification: TCOS001 – LinkStatusInformation (Positive Test Case)
Environmental Needs: No additional needs outside of those specified in the test plan.
Test Procedure(s): TP001: Link Status Request-Response Dialog Verification (Positive Test Case)
Pass/Fail: Pass/Fail is determined upon verification of the following:
  • A LinkStatusInformation message is returned upon sending of a LinkStatusRequest Message. (Pass/Fail)
  • The structure and content of the LinkStatusRequest is verified to be correct. A test input specification is provided. See TCIS001 – LinkStatusRequest (Positive Test Case). (Pass/Fail)
  • The structure and content of the LinkStatusInformation is verified to be correct. A test output specification is provided. See TCOS001 – LinkStatusInformation (Positive Test Case). (Pass/Fail)
Tester/Reviewer: B.C.
Special Procedure Requirements: None
Intercase Dependencies: None

Note: ID: TC001
Title: Link Status Request-Response Dialog Verification (Positive Test Case)

6.5.2.1 Example Input and Output Specifications

The test case above references Input and Output Specifications. These specifications provide a basis for verification that the content and structure of messages used or generated during testing are correct (per the requirements).

Test case data can be developed by creating a table showing the data variables defined in the project-specific XML schema. The constraints on the data values contained in messages are composed of the valid value rules for the data elements contained in a message. Data elements are typically constrained by:

  • Enumerated lists.
  • Value range.
  • Size.

Test case data has been prepared for a positive test case, i.e., a test case expected to pass. A test case data input specification table is shown in Table 19.

Table 19. Example Test Case Input Specification.
Data Concept Name (Variable) Data Concept Type Value Domain Pass-Fail
trafficNetworkInformationRequestMsg Message Empty cell. Pass/Fail
- organization-requesting Data Frame Empty cell. Pass/Fail
- organization-id Data Element IA5String (SIZE(1..32)) Pass/Fail
- organization-name Data Element IA5String (SIZE(1..128)) Pass/Fail
- network-information-type Data Element 1 = “node inventory”
2 = “node status”
3 = “link inventory”
4 = “link status”
5 = “route inventory”
6 = “route status”
7 = “network inventory”
Pass/Fail

Note: ID: TCIS001
Title: LinkStatusRequest (Positive Test Case)

Figure 5 shows a trafficNetworkInformationRequestMsg request message. The XML is an example of test case data that would pass all criteria (all pass-fails) identified in the input specification.

Figure 5 is an example of a software script.

Figure 5. Message. Example trafficNetworkInformationRequestMsg
Request Message Test Case Data File.
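
As a rough sketch of what such a test case data file might contain (the element names mirror the data concepts in Table 19, but the authoritative element names, nesting, and namespaces are defined by the project XML schema), the request message could be prepared and checked for well-formedness as follows:

  import xml.etree.ElementTree as ET

  # Illustrative only: the structure below is an assumption based on Table 19, not the
  # normative project XML schema.
  request_xml = """<trafficNetworkInformationRequestMsg>
    <organization-requesting>
      <organization-id>AGENCY-001</organization-id>
      <organization-name>Example Traffic Management Center</organization-name>
    </organization-requesting>
    <network-information-type>4</network-information-type><!-- 4 = "link status" -->
  </trafficNetworkInformationRequestMsg>"""

  # Quick well-formedness check before the message is used as test case input data.
  root = ET.fromstring(request_xml)
  assert root.tag == "trafficNetworkInformationRequestMsg"
  assert root.findtext("network-information-type") == "4"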

A test case data output specification table is shown in Table 20.

Table 20. Example Test Case Output Specification.
Data Concept Name (Variable) Data Concept Type Value Domain Pass-Fail
linkStatusMsg Message Empty cell. Pass/Fail
- link-status-item Data Frame Empty cell. Pass/Fail
- organization-information Data Frame Empty cell. Pass/Fail
- organization-id Data Element IA5String (SIZE(1..32)) Pass/Fail
- organization-name Data Element IA5String (SIZE(1..128)) Pass/Fail
- link-status-list Data Frame Empty cell. Pass/Fail
- link Data Frame Empty cell. Pass/Fail
- network-id Data Element IA5String (SIZE(1..32)) Pass/Fail
- link-id Data Element IA5String (SIZE(1..32)) Pass/Fail
- link-name Data Element IA5String (SIZE(1..128)) Pass/Fail
- link-status Data Element 1 = “no determination”
2 = “open”
3 = “restricted”
4 = “closed”
Pass/Fail
- travel-time Data Element INTEGER (0..65535), units=seconds Pass/Fail

Note: ID: TCOS001
Title: LinkStatusInformation (Positive Test Case)

Figure 6 shows a linkStatusInformationMsg response message. The XML is an example of test case data that would pass all criteria (all pass-fails) identified in the output specification.

Figure 6 is an example of a software script.

Figure 6. Message. Example linkStatusInformationMsg
Response Message Test Case Data File
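
The value domains in Table 20 can be checked mechanically once a response has been captured. In the sketch below the element names again mirror the data concepts in the table (the real names, nesting, and namespaces come from the project schema), and the saved response file name is a placeholder:

  import xml.etree.ElementTree as ET

  LINK_STATUS_VALUES = {1, 2, 3, 4}   # 1=no determination, 2=open, 3=restricted, 4=closed
  TRAVEL_TIME_RANGE = (0, 65535)      # INTEGER (0..65535), units = seconds

  def check_link(link_element):
      """Apply the Table 20 valid value rules to one link entry of a linkStatusMsg."""
      checks = {}
      checks["link-id"] = 1 <= len(link_element.findtext("link-id", "")) <= 32
      checks["link-status"] = int(link_element.findtext("link-status", "0")) in LINK_STATUS_VALUES
      travel_time = int(link_element.findtext("travel-time", "-1"))
      checks["travel-time"] = TRAVEL_TIME_RANGE[0] <= travel_time <= TRAVEL_TIME_RANGE[1]
      return checks

  # The message passes only if every data concept in every link entry passes its check.
  doc = ET.parse("TCOS001_response.xml").getroot()    # placeholder name for the saved response file
  per_link_results = [check_link(link) for link in doc.iter("link")]
  message_passes = bool(per_link_results) and all(all(r.values()) for r in per_link_results)
  print("linkStatusMsg satisfies the output specification:", message_passes)

A negative test case would supply a response prepared to violate one of these rules and expect the message-level result to be False.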

6.6 Test Procedure Specifications

A test procedure specification is a document that specifies a sequence of actions for the execution of a test. The test procedures test the implementation of the requirements. Test procedure specification development can begin after the test cases and test design are completed and approved.

A test procedure specification includes the following elements:

  • Test Procedure Specification Identifier. This is a unique identifier for a test procedure.
  • Purpose. Describes what the procedure is for.
  • Special Requirements. List of prerequisite procedures, special test skills, and environmental needs.
  • Procedure Steps. Includes a list of the steps. IEEE Std 829-1998 describes the following key words, as applicable, that should be used in describing procedure steps:
    • Log. Special methods or formats for logging results and observations.
    • Setup. Preparation for execution of the procedure.
    • Start. How to begin execution of the procedure.
    • Proceed. Actions necessary during program execution.
    • Measure. How test measurements (e.g., response times) are made.
    • Shut down. How to suspend testing in the face of an unscheduled event.
    • Restart. Where and how to restart any test step after a shut down of the test.
    • Stop. How to bring test execution to an orderly halt.
    • Wrap up. How to restore the test environment to its original state.
    • Contingencies. What to do in the case of an anomalous event (Cem Kaner, Jack Falk, Hung Nguyen, Testing Computer Software, John Wiley & Sons, 2nd Edition, April 12, 1999, p. 250).

6.6.1 Example Test Procedure Specification

Table 21 shows an example of the test procedure for the Link Status Request-Response dialog and includes the purpose, preconditions, and procedure steps.

Table 21. Example Test Procedure Specification.
Purpose: This test procedure verifies that the Link Status Request-Response dialog of an Owner Center system interface is implemented properly. It tests that when a correctly formatted trafficNetworkInformationRequestMsg request message is sent to an owner center, the owner center responds with a linkStatusMsg response message.
Start Date/Time: 10:15 am, June 12, 2013
End Date/Time: 10:30 am, June 12, 2013
Special Requirements: None
Preconditions: 1. Verify that the XML Request Message is valid against Project XML Schema.
2. Verify that the WSDL for the Dialog to be tested is correct.
Procedure Steps: 1. Start HTTP Client
2. Load XML Request Message File
3. Send XML Request Message to Owner Center
4. Receive XML Response Message from Owner Center
5. Log XML Link Status Response Message to a File
6. Verify that the Saved Link Status Response File is SOAP XML (encoding is SOAP)
7. Verify that the Saved Link Status Response File validates against the Project XML Schema. Using an XML schema tool, verify the XML message content against the value domains described in the test case input and output specifications.
Test Identification: TC001 – Link Status Request-Response Dialog Verification (Positive Test Case)
TC002 – [continue with additional test case specifications here]
Feature Pass-Fail: Passed
Tester(s) Initials: B.C.
Notes: 1.

Note: ID: TP001
Title: Link Status Request-Response Dialog Verification (Positive Test Case)
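
Although TP001 can be executed manually, its steps map naturally onto a small script. The sketch below is one possible automation, assuming a SOAP-over-HTTP interface; the endpoint URL, file names, schema path, and the use of the requests and lxml libraries are assumptions for illustration, not part of the DXFS or of the procedure itself:

  import requests                  # HTTP client (step 1)
  from lxml import etree

  ENDPOINT = "http://owner-center.example.org/dxfs"       # hypothetical owner center URL
  SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"   # SOAP 1.1 envelope namespace
  schema = etree.XMLSchema(etree.parse("project-schema.xsd"))   # precondition: project XML schema

  # Steps 2-3: load the XML request message file and send it to the owner center.
  with open("TCIS001_linkStatusRequest.xml", "rb") as f:
      request_body = f.read()
  response = requests.post(ENDPOINT, data=request_body,
                           headers={"Content-Type": "text/xml; charset=utf-8"})

  # Steps 4-5: receive the response and log it to a file for the test record.
  with open("TP001_response_log.xml", "wb") as f:
      f.write(response.content)

  # Step 6: verify the saved response is SOAP XML (the root element is a SOAP Envelope).
  doc = etree.fromstring(response.content)
  assert doc.tag == "{%s}Envelope" % SOAP_NS

  # Step 7: validate the message carried in the SOAP Body against the project XML schema.
  payload = doc.find("{%s}Body" % SOAP_NS)[0]
  assert schema.validate(payload), schema.error_log

In practice each assertion would be recorded as a pass/fail entry in the test log rather than raised as an exception, and the message content would additionally be checked against the value domains in the input and output specifications.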

6.7 Test Reports

This section provides a short summary of the test reports outlined in IEEE Std 829-1998:

  1. Test Log. The purpose of the test log is to document the events and outcomes (pass/fails) encountered during the test. The test log is a chronological record of the execution of the test including persons present and roles, procedure results, and any anomalies encountered.
  2. Test Incident Report. The purpose of the test incident report is to record anomalies encountered during testing (including the sequence of events that led to the anomaly) and to provide information to analysts who may need to identify the causes of system errors. This report documents any event that occurs during testing that requires further investigation. It records expected versus actual results, the procedure and procedure step, and attempts to repeat the test. It should also record any potential impacts on further testing activities.
  3. Test Summary Report. The purpose of the test summary report is to provide documentation on status for a test phase. The ability to move onto a new phase in the project may be predicated on satisfactory completion of a test phase. This report would be provided after the completion of a test phase, for example, all the tests defined in a test design specification.

The reader is asked to review Section 9 (Test log), Section 10 (Test incident report), and Section 11 (Test summary report) of IEEE Std 829-1998 for additional information.

6.8 Summary

This section provided guidance on system interface compliance testing. Key points included:

  1. Take a phased approach to testing. Also, use an incremental approach to test base functionality first and more advanced features next. This will facilitate isolation and correction of defects.
  2. Testing performs verification and validation and helps find defects in software.
  3. Testing isolates system interface defects (for example, incorrect sequences of message exchanges or incorrectly formed messages) versus defects in the system elements (applications, databases, etc.) that use or generate the information that is exchanged across the system interface.
  4. The cost of testing needs to be considered vis-à-vis the potential reduced cost during the operations and maintenance phase. The cost to fix a bug during testing is much less than the cost to fix the same bug in a system that has already been deployed.