Introducing the Automated Testing Vehicle (ATV) Manager


An ATV, or vehicle, is a project environment that contains a number of individual test cases that collectively address a defined regression or performance testing requirement. Collecting the elements needed to accomplish a testing requirement into a vehicle simplifies the management of that test project.

The ATV Manager employs a test-case-centric approach to address the requirements of a test project. Individual test cases are defined to accomplish a single, well-defined testing goal. Each test case contains the information needed to achieve that goal, including data instantiation, environment setup, exercising the application being tested, and the success or failure determination at the conclusion of test case execution. With this approach, a set of test cases addresses a requirement broader in scope than any individual test case could. For example, three individual test cases for the following requirements: 1) test ISPF Browse of a PDS member, 2) test ISPF Browse of a PDSE member, and 3) test ISPF Browse of a sequential data set, taken together address the broader requirement to test the ISPF Browse function.

A project requirement for a complete regression or performance test of an application can be addressed using this paradigm: running all of the test cases within a vehicle constitutes the full regression or performance test.
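
To make this structure concrete, the following is a minimal illustrative sketch in Python rather than the product's own environment. The TestCase and Vehicle classes and their method names are hypothetical; they only model the test case lifecycle and the vehicle-level regression run described above, not the ATV Manager's actual interface.

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class TestCase:
        # Models one test case: everything needed to reach a single,
        # well-defined testing goal, ending in a Pass/Fail determination.
        name: str
        setup: Callable[[], None]          # data instantiation and environment setup
        exercise: Callable[[], object]     # drive the application being tested
        verify: Callable[[object], bool]   # success or failure determination

        def run(self) -> bool:
            self.setup()
            result = self.exercise()
            return self.verify(result)

    @dataclass
    class Vehicle:
        # Models an ATV: a project environment grouping the test cases that
        # collectively address one regression or performance requirement.
        name: str
        test_cases: List[TestCase] = field(default_factory=list)

        def run_all(self) -> dict:
            # Running every test case constitutes the full regression
            # or performance test for the vehicle's requirement.
            return {tc.name: tc.run() for tc in self.test_cases}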

The ATV Manager can distribute test run reports via email to a defined set of recipients. Individual test case failure notices can be sent to the person or persons responsible for maintaining the failing test case. This proactive delivery allows the problem resolution process to begin as soon as possible.
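
As an illustration of this kind of delivery, here is a minimal sketch using Python's standard smtplib. The SMTP host, sender address, and subject line are placeholder assumptions; this shows the general mechanism, not how the ATV Manager itself sends mail.

    import smtplib
    from email.message import EmailMessage

    def distribute_report(report_text: str, recipients: list) -> None:
        # Send a test run report to a defined set of recipients.
        msg = EmailMessage()
        msg["Subject"] = "ATV test run report"             # placeholder subject
        msg["From"] = "atv-manager@example.com"            # placeholder sender
        msg["To"] = ", ".join(recipients)
        msg.set_content(report_text)
        with smtplib.SMTP("mail.example.com") as server:   # placeholder SMTP host
            server.send_message(msg)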

Care should be taken when defining the requirement scope to be addressed at the vehicle level. As the ATV Manager nomenclature suggests, a vehicle should be tasked with testing an application, though it is up to you to select the most appropriate scope. For example, you could consider General Ledger the application, with Accounts Receivable as a function within it, or you could view Accounts Receivable as one application within your General Ledger system. Some considerations in defining the testing scope for your vehicles include:

  1. Reusability of assets. All scripts and test assets, the building blocks for test cases, are shareable at the vehicle level.
  2. Personnel using ATVs. Performance Test secures access to ATV data at the vehicle level.
  3. Report distribution. Test activity reporting and report distribution are performed via test run requests that range in scope from an individual test case to a single vehicle. If separate test or development teams maintain different areas within an application, it may be desirable to use separate vehicles along the same lines.
  4. Your application development process. Entire vehicles can be cloned using the copy function (pictured in the sketch after this list). This is useful for setting up a new vehicle to test a new development version of an application: changes to support the application under development can be applied to the new vehicle while leaving the old vehicle intact to continue testing the older, production version of the application.
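
The effect of the copy function in item 4 can be pictured as a deep copy of the hypothetical Vehicle model from the earlier sketch; clone_vehicle below is illustrative only, not a product function.

    import copy

    def clone_vehicle(source, new_name: str):
        # Duplicate an entire vehicle so the clone can be changed for the
        # development version of the application while the original vehicle
        # continues to test the production version unchanged.
        clone = copy.deepcopy(source)   # source is a Vehicle from the earlier sketch
        clone.name = new_name
        return clone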

The ATV Manager provides a system in which you can compile the elements of your automated tests and organize and manage their use. The ATV Manager then reports on test execution and can email the reports to defined recipients.

The ATV Manager builds, executes, and manages automated test vehicles for regression and performance testing. The ATV Manager performs the following tasks:

  • Incorporate test assets and scripts within the ATV and control access to these elements.
  • Associate test assets, including Performance Test scripts, to create test cases.
  • Execute tests by using the assets and scripts from the test cases.
  • Allow a single Pass/Fail condition to be determined for a test case.
  • Create test case groups to execute regression or performance testing of an element of the target application through a single request.
  • Execute all test cases within an ATV to perform complete regression or performance testing of the target application through a single request.
  • Collect the Pass/Fail condition for all test cases into a single test execution report (see the sketch after this list).
  • Distribute the test execution report to email addresses as specified for the test execution.
  • Allow viewing of individual test case reports.
  • Assist in script repair using the test execution results.
  • Assist in the dubbing of new scripts, when necessary, within the context of the test case.
  • Allow the duplication of an entire ATV to support multiple versions of a target application.
  • Allow for maintenance of test assets to keep them current.
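
Tying several of the tasks above together (a single Pass/Fail per test case, collected into one test execution report), here is a sketch that builds a plain-text summary from the hypothetical Vehicle model used in the earlier sketches. The report format is invented for illustration and does not reflect the ATV Manager's actual reports.

    def build_execution_report(vehicle) -> str:
        # Run every test case in the vehicle and collect each Pass/Fail
        # condition into a single plain-text test execution report.
        results = vehicle.run_all()   # {test case name: True/False}
        lines = [f"Test execution report for vehicle: {vehicle.name}"]
        for name, passed in results.items():
            lines.append(f"  {name}: {'PASS' if passed else 'FAIL'}")
        failed = sum(1 for passed in results.values() if not passed)
        lines.append(f"Totals: {len(results) - failed} passed, {failed} failed")
        return "\n".join(lines)

The resulting text could then be handed to a delivery routine such as the distribute_report sketch shown earlier.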

