Test Scenarios
Creating a new Test Scenario
To manually create a new test scenario, right-click the Scenario folder of the project in which you want to create the scenario and select New > Unit Test Scenario (or, from the File menu, select New > Other > Total Test > Unit Test Scenario).
The Scenario Wizard opens. On the first page, enter the following data (obligatory fields listed in bold):
Field | Description |
---|---|
Name | File name for the test scenario |
Alias Name | More descriptive name |
Description | Detailed description |
Program Type | Check the appropriate box |
If you have not selected a Total Test project before invoking the wizard, a list of Total Test projects displays in the Project list at the bottom of the wizard pages. You must select a project to continue to the next wizard page.
Click Next > to continue to the next page, where you select an interface. You can only select a program interface; non-program interfaces are not included in the selection list. If no interfaces are listed, click < Back. The selected interface is also used as input for the Test case name field. If desired, change the Test case name. Enter the name of the program to be tested in the Program field. Optionally, you can enter values in the Csect, Alias, and Description fields.
Click Finish to create the scenario.
Alternatively, click Next > to proceed to the final page, where you can select a different destination folder for the test scenario.
After completing the wizard, the newly created test scenario opens in the Test Scenario Editor where you can specify input data, check conditions, stubs, and also add more test cases.
For details on how to use the Test Scenario Editor, see Editing Test Cases.
Editing Test Cases
Test case specifications are entered in the Test Scenario Editor. On the left side of the editor view, all test cases contained in the test scenario are listed. Select a test case in this list to display its data in the editor, where it can be entered or updated.
To add a test case, click the Add test case... icon above the Test cases listing. This opens the Add Test Case wizard.
The Test Scenario Editor has four tabs that can be selected in the lower area of the editor view:
- Input Data to enter values or select field references for input attributes.
- Stubs to select one or more stubs to be used in this test case.
- Assertions to specify check conditions in a simple and efficient way.
- Information to change alias name and description and obtain information about the interface of a test case.
Input Data
The Input Data tab of the Test Scenario Editor displays the input data for the test case selected in the test case list. Select a test case to view or edit its input values.
- The Structure column displays a list of the structures and attributes used for input to the test case.
- The Location column describes the position values for the specific field.
- The Value column allows you to enter or change the values of attributes.
- The Assignation from column allows you to select a reference from preceding test cases to assign its value as input to an attribute or structure.
Structure
The Structure column displays a list of the structures used as input in the test case, as specified in the interface definition, including their substructures and attributes. Please refer to General Editor Behavior for a detailed description of all options provided to navigate in this list.
If the attribute is contained in an array, by default, the first array element will be used for the input value. If you need an input value for another array element, right-click on the array, then select Add array element from the context menu. A new array element will be added to the array which you can now enter input values for. Alternatively, you can right-click on an array element and select Duplicate array element from the context menu to create a copy of the array element and add it to the array. In order to remove an array element, right-click on the array element, then select Remove array element from the context menu.
Other right-click options for the Structure elements include:
- Copy value
- Redefine element
- Replace element
- Insert new field after element
- Delete element
- Revert to original interface
Location
The Location column describes the position values for the specific field.
Value
The Value column allows you to enter values for input attributes. Default values derived from the structures and interface are displayed in blue. Values entered in the test case are displayed in black. If you enter values that are not suitable for the attribute, for example, strings that are too long, they are displayed in red.
Values for text attributes can alternatively be entered in hex format, for example as x'61a5'. The hex value must be given in the target platform encoding because for hex values, no character conversion is performed.
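To make the encoding caveat concrete, the following sketch (Python used purely to model the byte-level behavior; the code page name `cp037` is an assumption standing in for whatever encoding your target platform uses) shows that the same hex digits mean different characters under EBCDIC and under an ASCII-compatible encoding:

```python
# Illustrative only: the hex value x'61a5' is taken as raw bytes with no
# character conversion, so the characters it represents depend entirely
# on the target platform's code page.
raw = bytes.fromhex("61a5")

ebcdic_text = raw.decode("cp037")    # interpretation under EBCDIC cp037
latin1_text = raw.decode("latin-1")  # interpretation under Latin-1/ASCII

print(ebcdic_text)
print(latin1_text)
```

Under EBCDIC cp037 these two bytes decode to completely different characters than under Latin-1, which is why the hex value must be entered in the target platform encoding.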
To reset a value entered in a test case back to the default value from the structure or interface, simply clear the input field. To override a text default value taken from the structure with spaces, simply enter a space.
Besides entering values for attributes, you may also enter values for structures. However, as structures cannot actually hold values, the value entered is propagated to the attributes contained in the structure. Values entered for structures are always treated as character data. Note that character data propagated to numeric data types might lead to unexpected values.
Assignation From
The Assignation from column allows you to specify that a value should be assigned from the output of a preceding test case at runtime. To do so, click in the Assignation from column in the row of the attribute for which you want to select a reference. A button with three dots appears. Click this button to open the Select Reference pop-up, where you can select the reference test case from the drop-down list and then select the desired attribute from that test case. Because forward references are not possible, only test cases preceding the current one are displayed in the drop-down list.
Note that the source and destination attributes must have compatible data formats; otherwise, problems may occur during execution. All numeric data types are mutually compatible, and a proper conversion is performed during the assignation at runtime. Alphanumeric and numeric data formats can be compatible, too, if the alphanumeric attribute contains an all-numeric character value. If the source attribute is larger than the destination attribute, truncation occurs and a warning is issued by the Runner.
Besides using assignations for attributes, you may specify an assignation for structures, too. An assignation on the structure level will always be treated as character data. The assigned value will propagate to the attributes contained in the destination structure. If source and destination structures have different lengths, the data will be truncated or padded with spaces accordingly. Keep in mind that character data propagated to numeric data types may lead to unexpected or invalid values.
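The truncation and padding rules for structure-level assignations can be sketched as follows (a conceptual model in Python, not the Runner's actual implementation; the function name is hypothetical):

```python
def assign_structure(source: str, dest_length: int) -> str:
    """Model a character-level structure assignation: truncate when the
    source is longer than the destination, pad with trailing spaces when
    it is shorter."""
    if len(source) > dest_length:
        return source[:dest_length]   # truncation
    return source.ljust(dest_length)  # padding with spaces

print(assign_structure("CUSTOMER", 4))  # longer source: truncated
print(assign_structure("AB", 5))        # shorter source: space-padded
```

The result is always exactly the destination length, which is why character data landing on numeric attributes inside the destination structure can produce invalid values.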
If a reference has been selected for an attribute, the test case and attribute name are displayed in the Assignation from column. If you specify both a value and an assignation at the same time, the assignation will not be used during test execution, and the reference displayed in the Assignation from column will be grayed out as a hint. However, if you remove the value, the assignation is re-activated immediately. This allows you to use assignations in a test but occasionally override them with ease.
In order to change a reference, invoke the Select Reference pop-up again in the same way as when you added the new reference, then select another reference attribute to be used.
In order to remove a reference, invoke the Select Reference pop-up again in the same way as when you added the new reference, then check Remove reference and click OK.
Stubs
The Stubs tab of the Test Scenario Editor is used to assign stubs to a test case.
Refer to the Stubs section for information about how to create stubs.
On the Stubs tab, you can select one or more stubs to be used in the test case. On the left side, all Available stubs are listed. Select a stub in this list and click the arrow in the middle to copy it to the Selected list on the right. You can copy the same stub multiple times.
During execution, the Selected stubs are used to simulate calls to the targets of their interface. If the same target is contained multiple times in the Selected stubs, the stubs will be used in exactly the order provided. This means that the first stub is used for the first call, the second stub is used for the second call, and so on. If there are more calls to the stub target than you have stubs in your test case, the last stub will be re-used indefinitely. This allows you to easily use one stub for multiple calls, if applicable.
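The call-to-stub mapping described above can be modeled in a few lines (an illustrative Python sketch, not product code; the function name is hypothetical):

```python
def stub_for_call(selected_stubs: list, call_index: int) -> str:
    """Return the stub used for the n-th call (0-based) to a stubbed
    target: stubs are consumed in the order selected, and the last stub
    is re-used for any further calls."""
    return selected_stubs[min(call_index, len(selected_stubs) - 1)]

stubs = ["stub_first_call", "stub_second_call"]
for i in range(4):
    # calls 0 and 1 get their own stub; calls 2 and 3 re-use the last one
    print(i, stub_for_call(stubs, i))
```

In particular, selecting a single stub makes it serve every call to its target, which is the common case.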
Stubs apply only to the test case they are selected for. Their target will not be stubbed in other test cases unless you have specified stubs there, too.
Assertions
The Assertions tab allows you to define, in a simple way, the conditions under which a test case is considered successful.
Attribute
The Attribute column displays a list of output attributes that can be used for check conditions. To add an assertion, right-click in the list and select Create new assertion from the context menu, or click the Create new assertion icon in the upper right area of the editor. Then click the Attribute column of the newly created assertion to select the attribute to be used. Alternatively, double-click an attribute to add a check condition for it.
If the attribute is contained in an array, by default, the first array element will be used for the assertion. If you need an assertion for another array element, right-click on the array, either in the pop-up list or in the Create new assertion view, then select Add array element from the context menu. A new array element will be added to the array which you can now select for an assertion.
To remove an assertion, right-click it, then select Delete from the context menu. To only temporarily disable an assertion, change its comparison operator to noassert.
Comparison
The Comparison column allows you to specify the comparison operator used for the assertion. The following operators are available:
Comparison Operators
Operator | Description |
---|---|
noassert | Check condition not active – will not be used |
= | Attribute must be equal to the given value |
<> | Attribute must be not equal to the given value |
> | Attribute must be greater than the given value |
>= | Attribute must be greater than or equal to the given value |
< | Attribute must be less than the given value |
<= | Attribute must be less than or equal to the given value |
same | Attribute must be unchanged (output value equal to input value) |
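The operator semantics in the table above can be sketched as a small evaluator (an illustrative Python model under the assumption that `same` compares the output value against the recorded input value; the function name is hypothetical):

```python
def evaluate_assertion(operator: str, actual, expected=None, input_value=None) -> bool:
    """Model the assertion operators from the table above."""
    if operator == "noassert":
        return True                   # check condition not active
    if operator == "same":
        return actual == input_value  # output unchanged from input
    ops = {
        "=":  lambda a, e: a == e,
        "<>": lambda a, e: a != e,
        ">":  lambda a, e: a > e,
        ">=": lambda a, e: a >= e,
        "<":  lambda a, e: a < e,
        "<=": lambda a, e: a <= e,
    }
    return ops[operator](actual, expected)

print(evaluate_assertion("=", 42, 42))
print(evaluate_assertion("same", 7, input_value=7))
```

Note that `noassert` always passes, which is what makes it suitable for temporarily disabling a check condition.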
Value
The Value column allows you to enter the value to be used in the assertion.
Values for text attributes can alternatively be entered in hex format, for example as x'61a5'. The hex value must be given in the target platform encoding because for hex values, no character conversion is performed.
Label
The Label column allows you to enter a descriptive name for the assertion which is displayed in the Result Report. When creating a check condition, it defaults to "Check for attribute name", but you can change it to a more descriptive text.
Failure Message
The Failure Message column allows you to enter a descriptive name for a check condition failure which is displayed when the check condition fails (does not evaluate as expected). When creating a check condition, it defaults to "Check for attribute name failed", but you can change it to a more descriptive text.
Information
The Information tab allows you to change the Alias Name and Description of the test scenario.
Additionally, details are displayed about the test case selected in the test case list on the left. You can see the name, program, CSECT, alias, description, interface name, and structure for the interface on which the test case is based. The interface and structure(s) can be quickly opened by clicking on the listed interface or structure name.
Adding A New Test Case
If more test cases are required to define the test scenario, additional test cases can be added by right-clicking in the test case list area on the left and selecting Add test case... from the context menu, or by clicking on the Add test case... icon above the Test cases listing. The Add Test Case wizard opens, allowing you to select the interface for the new test case. The selected interface is also used as input for the Test case name field. If so desired, change the Test case name. Add the name of the program to be tested in the Target field.
Alternatively, you can use an existing test case as the basis for the new test case. To do so, right-click the name of the test case you want to copy in the test case list, then select Duplicate test case... from the context menu. Enter the new name and click OK.
Modifying Test Cases
You can change the name of a test case by right-clicking on the test case in the test case list, then selecting Rename from the context menu. Enter the new name, then click OK.
You can remove a test case from the test scenario by right-clicking on the test case in the test case list, then selecting Delete from the context menu. Confirm the deletion by clicking OK. Note that all data entered for the test case will be deleted with the test case.
You can re-arrange the order of the test cases by dragging them up or down with the mouse. Note that re-arranging test cases can lead to orphaned assignation references if a referenced test case is moved below the test case that references it.
Running Test Scenarios
In order to run a test scenario, right-click the test scenario in the Project Explorer view, then select Run Unit Test Scenario... from the context menu. If you want to run the test scenario that you are just editing, simply click on the Run Test Scenario icon in the upper right area of the editor.
The Execution Context (or Test Run) dialog opens, allowing you to set the Host Connection (Target Environment), as well as to override the Used JCL Template, Execution Options, Test Aids, and Code Coverage options for this run. For details on these options, refer to Project Properties. Click OK to start the test run.
The test run will be started and monitored. After the execution has finished, the results will be analyzed and displayed in the Result Report.
While the test is being executed, you can click Run In Background to close the dialog. The test will continue to be executed and monitored, and you can view it in the Progress view. In the meantime, you can continue to use your development environment. Once the test run has completed and the results are ready to be displayed, this will be reported in the Progress view, and you can open the Result Report with a click from there.
Running a Unit Test Scenario from Non-virtualized Test
It is also possible to drag and drop a Unit Test scenario into a Total Test Scenario (Non-virtualized Test) and execute it from there. Follow the instructions in Add Unit Test Scenario.
Extracting Check Conditions
This feature offers a quick and easy way to automatically create check conditions for a test case or a whole test scenario. It allows you to use the output data from a previous run to create simple check conditions from it.
To do so, right-click on the archive file (usually the one in the output folder of your project), then select Test > Extract Check Conditions... from the context menu. Alternatively, if the archive is opened in the Result Editor, simply click on the Extract check conditions from this output button located in the upper right area of the Result Editor.
The wizard opens, asking you to select the destination test scenario for which you want the check conditions created. The archive of a test suite may contain several test scenarios, in which case you can select multiple test scenarios. By default, the original test scenarios used in the archive are selected. If you select a different test scenario, it must contain the same test cases as the original. Click Clear for any test scenario from which you do not want to extract check conditions.
Click Next > to continue to the data selection page. Here you can select the attributes for which check conditions should be created using the values from the output. Each test case of the test scenario has its own tab, allowing you to step through all test cases; above the test case tabs, there is a tab for each test scenario found in the archive. Once you have completed your selection of attributes, click Finish to create the check conditions. The affected test scenarios are automatically opened for you to review.
Note that the attributes you select will overwrite any check conditions for the same attributes in the destination test cases. Check conditions for attributes that you did not select will be preserved in the destination test cases.