Introduction to Mainframe DevOps

In Information Technology, DevOps has become a culture of close collaboration between the different departments of an IT organization. The term initially referred to the collaboration between Development and Operations teams, but it soon came to include other departments, such as security and legal, and ideally even the customer. Teams that adopt DevOps practices have proven to be more successful than teams using other development paradigms. The challenge many large enterprises must overcome, in addition to removing silos between departments, is the separation of teams across platforms. In the past, the mainframe and mainframe development were often neglected when Agile and DevOps strategies were adopted. The consequences of this neglect are discrepancies between the platforms, mainly in culture, frequency of delivery, and tools in use.

Culture is arguably the biggest hurdle to overcome and requires far more effort, planning, and inclusive discussion than merely adopting new tools. Another significant hurdle, however, is that until recently it was almost impossible to integrate mainframe processes into the well-established processes and tool landscapes of the distributed side of the house, which have been in place and successfully supporting teams for many years. Enterprises have realized that it is time to resolve these discrepancies and that the mainframe needs to be as much a part of the game as any other contributor to the application landscape.

The purpose of these pages is to give BMC customers and mainframe shops ideas and guidance on how this hurdle can be overcome and how mainframe processes can be integrated into continuous integration and continuous deployment (CI/CD) pipelines.


Getting started

To get started with building your own mainframe DevOps pipeline, we provide pipeline examples and setup instructions using Jenkins. To implement a pipeline using other tools, see Mainframe CI using alternatives to Jenkins.

The following example code and documentation are available:

Important

  • The published code serves as example code, using BMC example applications and environments. You must adjust this code to your specific needs and requirements.
  • The descriptions and tutorials in this space assume a certain familiarity with Jenkins, Total Test, and other BMC tools. The required level of knowledge is not high; for example, it helps to know how to define a new job in Jenkins.

The following image displays an example of the DevOps pipeline. 

Code examples

Code examples are stored in the GitHub Examples repository.

Pipeline examples

We have published several examples of complete pipelines that demonstrate different process steps and techniques in Jenkins.

  • Mainframe-CI-Example-pipeline (jenkinsfile): a scripted pipeline using parameters. This is a simple approach to a DevOps pipeline that lets you get up and running quickly, but it may not be the best approach when scaling pipelines across your enterprise. The job is intended to be triggered after promoting code within Code Pipeline.

    Important

    This pipeline serves as the model for most of the other pipeline examples: the other pipelines follow the same structure and implement essentially the same process, differing only in nuances, such as the types of tests executed at different stages of the development process, or in demonstrating specific techniques that have proven helpful.

  • Mainframe_CI_Pipeline_from_Shared_Lib (groovy): a pipeline loaded from a Jenkins shared library. Shared libraries are a useful approach to scaling pipelines across an enterprise, since they move the bulk of the pipeline logic into shared components that individual pipelines can reference as steps. This allows organizations to develop pipelines in a more standardized way (see the invocation sketch after this list). The job is also intended to be triggered after promoting code within Code Pipeline.

    Important

    Older versions of this pipeline used helper classes to encapsulate some of the complexity and make the flow of the job easier to follow. Thanks to changes in the CLI and plugins, the complexity of the code required to implement such a pipeline has been reduced so much that these classes are no longer necessary.

  • Combined Pipeline: The following two pipeline definitions are part of a more elaborate process. A Jenkins job "distributes" the work to one of the two scripts, based on which Code Pipeline operation (generate or promote) triggered the webhook and the Jenkins job.
    • Mainframe_Generate_Pipeline.groovy: a pipeline that is triggered by a Code Pipeline generate and executes virtualized tests against the components that have been generated.
    • Mainframe_Integration_Pipeline.groovy: a pipeline that is triggered by a Code Pipeline promote and executes non-virtualized tests against the project/assignment.

      Important

      Also using the shared library technique, these two examples show how shared library scripts can be used for modularization and reuse of existing code to implement such more elaborate processes.

  • Pipeline for Git/Code Pipeline: The integration between Git and Code Pipeline uses a pipeline to "synchronize" changes in branches residing on the Git server to Code Pipeline levels. This pipeline is triggered when changes are pushed from the local workspace to a feature branch on the Git server, or when changes are merged via a pull request from one branch into another. The code determines the Git branch it is being executed for and the corresponding target Code Pipeline level (a branch-to-level mapping sketch follows this list). In addition, it runs tasks similar to the ones defined in the Combined Pipeline, i.e., it executes tests, passes the results to Sonar, etc.
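
To illustrate the shared library approach, the following minimal Jenkinsfile sketch shows how a pipeline defined in a shared library can be invoked. The library name, step name, and parameter names are modeled on the examples in the repository, but they are assumptions here; adjust them to your own shared library configuration and environment.

    // Load the shared library configured in Jenkins (name is an example).
    @Library('Shared_Lib@master') _

    // Call the pipeline step defined in the library's vars folder, passing the
    // parameters the pipeline expects. All values below are example values.
    Mainframe_CI_Pipeline_from_Shared_Lib(
        ISPW_Stream      : 'FTSDEMO',     // Code Pipeline stream
        ISPW_Application : 'DEMO1',       // Code Pipeline application
        ISPW_Release     : 'REL1',        // Code Pipeline release
        ISPW_Src_Level   : 'DEV1',        // level the code was promoted from
        CES_Token        : 'cesTokenId',  // ID of the CES token credential in Jenkins
        Git_Project      : 'MyGitOrg'     // Git organization holding the test assets
    )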
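The Git/Code Pipeline synchronization relies on a mapping from Git branch names to Code Pipeline levels. A minimal Groovy fragment implementing such a mapping might look as follows; the branch names and level names are purely hypothetical examples.

    // Determine the target Code Pipeline level from the Git branch name.
    // Branch and level names are hypothetical; use your own naming scheme.
    def targetLevelFor(String branchName) {
        if (branchName == 'main') {
            return 'MAIN'                       // merges to main map to the MAIN level
        } else if (branchName.startsWith('feature/')) {
            return 'DEV1'                       // feature branches map to a development level
        }
        error "No Code Pipeline level defined for branch ${branchName}"
    }

    def ispwTargetLevel = targetLevelFor(env.BRANCH_NAME)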

Other code examples

In addition to the pipeline examples, we also publish code snippets and short Jenkins jobs that demonstrate individual features and use cases of the BMC plugins and CLIs.

The GitHub repository is organized as follows:

  • vars folder: example pipelines using Shared Library technology.
  • src folder: class definitions for code related to these pipelines.
  • Jenkinsfile sub-folder: example code that is not directly related to the pipelines but defines complete Jenkins jobs. Currently these are:
    • JCL_Pipeline_Example contains a simple example of mainframe jobs being submitted from Jenkins, both with the JCL residing on the mainframe and with the JCL stored/generated in the pipeline code itself.
    • Three examples of downloading sources (COBOL programs and copybooks) from the mainframe, using different stores for the code and different download methods, then pushing the sources to SonarQube using the Sonar scanner and querying the resulting Sonar quality gate:
      • Scan_Sources_from_ISPW_Container_with_Sonar: uses the container downloader for sources stored in Code Pipeline.
      • Scan_Sources_from_ISPW_Repository_with_Sonar: uses the repository downloader for sources stored in Code Pipeline.
      • Scan_Sources_from_PDS_with_Sonar: uses the PDS downloader for sources stored in PDSs (inside or outside a mainframe SCM tool).
    • Push_TTT_results_to_Git (jenkinsfile): shows how to push the results of unit test executions back to GitHub for a developer to consume locally.
  • misc-examples/AzureDevOps sub-folder: resides within the src directory of the repository. It contains PowerShell scripts and examples for using Azure DevOps pipelines as an alternative to Jenkins and the Jenkins plugins. The PowerShell scripts use the Code Pipeline REST API and the Workbench CLI; we describe them in detail in Mainframe CI using alternatives to Jenkins.

Simplifying pipelines

Over the last few releases of Total Test, the Workbench CLI, Code Pipeline, the Code Pipeline CLI, and the corresponding Jenkins plugins, the integration between the different tools has been greatly improved, moving much of the complexity of implementing certain requirements from the script code into functionality provided by the plugins and CLIs. As a result, many of the example pipelines have become far simpler than their older versions, while providing the same flexibility and functionality as before. For customers, this means that the coding effort and complexity of the resulting pipeline scripts have been reduced considerably. The following topics describe what has changed.

Total Test project structure

Total Test allows storing virtualized and non-virtualized test scenarios in any folder of a given Total Test project. For virtualized test scenarios, the corresponding sub-folders still need to follow the required folder structure, but with the new architecture, storing Total Test assets has become highly flexible.


Important

It is now even possible to store mainframe code alongside Total Test scenarios within the same Eclipse project and Git repository when using the Git to Code Pipeline integration.

Figures: New Version folder structure; New Version folder structure with sources.
Each sub-folder is just a sub-folder within a single project. Adding new tests for a component is as simple as specifying a new folder when recording the corresponding tests. Virtualized and non-virtualized tests may reside in the same project and can be grouped into dedicated folders by type instead of by component.

Configuring a Total Test project to be "connected" to Code Pipeline (or vice versa) allows storing mainframe code alongside tests for this code.
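
As a purely hypothetical illustration, a Total Test project using this flexible structure might be organized as follows (all folder names are examples only):

    (Total Test project)
        +- Virtualized_Tests                 # virtualized test scenarios, grouped by type
        |   +- CWXTCOB_Unit_Tests
        +- Non_Virtualized_Tests             # non-virtualized (functional) test scenarios
        |   +- CWXTCOB_Functional_Tests
        +- MF_Source                         # mainframe sources, when using the Git to Code Pipeline integration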

One Total Test CLI

All Total Test scenarios, both virtualized and non-virtualized, are now executed by the TotalTestFTCLI.bat file.

Intelligent test execution

In the past, if you wanted to execute test scenarios only for those components that were affected by a change, you had to:

  • determine the list of affected programs
  • determine the list of test scenarios
  • match test scenarios against the list of programs
  • execute matching scenarios in a loop

In contrast, with the intelligent test execution feature, any Code Pipeline operation automatically creates a .json file containing information about the tasks affected by the corresponding operation. This file can be passed to the Total Test CLI, telling it to execute every test scenario that matches an entry in the .json file. The resulting code consists of exactly one call to the Total Test CLI, as the sketch below shows.
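
A minimal sketch of the resulting Jenkins stage follows. It assumes the Code Pipeline operation has placed a changedPrograms.json file into the workspace; the step and parameter names follow our recollection of the Total Test Jenkins plugin's snippet generator and should be verified against your plugin version.

    // One call executes every scenario in the test folder that matches an entry
    // in the .json file created by the Code Pipeline operation. All connection
    // IDs, URLs, and paths are example values.
    node {
        stage('Execute matching test scenarios') {
            totaltest(
                connectionId         : 'hciConnectionId',        // host connection defined in Jenkins
                credentialsId        : 'mainframeCredentials',   // mainframe credentials ID
                serverUrl            : 'http://ces.example.com:2020',
                environmentId        : 'simulator',              // Total Test environment ID
                folderPath           : 'Tests',                  // folder containing the scenarios
                recursive            : true,
                selectProgramsOption : true,                     // restrict execution to matching scenarios
                jsonFile             : 'changedPrograms.json'    // tasks affected by the operation
            )
        }
    }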

Runner JCL

One of the main concerns when using the traditional Total Test CLI to execute virtualized test scenarios was setting up the correct Runner.jcl to use. Depending on the complexity of the environment (for example, several parallel development paths), the JCL had to be modified at runtime, adding yet more complexity to the resulting pipeline scripts.

With the one Total Test CLI, the runner JCL can be configured and stored centrally, either for all scenarios of a project or even on the CES.

Figures: Local TotalTestConfiguration; "Runner JCL" stored in CES.

With version 20.04.01 of Total Test, a central TotalTestConfiguration project can be used, allowing a single JCL skeleton to serve every Total Test scenario in any project/folder within the same workspace. The use of variables allows changing load library concatenations at runtime.

Instead of storing the JCL skeleton locally in the users' workspaces, the skeleton can be stored in CES as part of a connection definition. This allows sharing of the same skeleton with every user in the environment, without them having to maintain local copies.

Context variables

Total Test allows defining your own variables to be used at runtime of a scenario. This allows for high flexibility: scenarios can be defined for one environment and executed in different target environments without manual intervention or preparation by a user.

Code Pipeline source code downloader - include copybooks

Current versions of the plugin download related copybooks automatically, no matter at which level in the life cycle they reside within the path relative to the container you are downloading sources for. This resolves a problem that many customers were facing and had to write their own code for, which unnecessarily complicated their pipelines:

Early versions of the Code Pipeline source download plugin, especially of the container downloader, would only download those sources that were part of the specified container. This caused problems if copybooks used by components were not part of that same container; for example, component CWXTCOB within an assignment uses copybook EMPFILE, but EMPFILE is not part of the container. In such a case, the container downloader would only get CWXTCOB. Passing the source to SonarQube would then result in "false positive" errors being flagged by Sonar due to missing copybooks.
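
For reference, a container download in a Jenkins pipeline looks roughly like the following sketch. The class and field names follow our recollection of the plugin's snippet generator; treat them as assumptions and regenerate the snippet for your plugin version.

    // Download the sources (and, with current plugin versions, related copybooks)
    // of an assignment at a given level. All IDs and names are example values.
    checkout([$class        : 'IspwContainerConfiguration',
              connectionId  : 'hciConnectionId',      // host connection defined in Jenkins
              credentialsId : 'mainframeCredentials', // mainframe credentials ID
              containerName : 'PLAY000001',           // assignment ID
              containerType : '0',                    // container type for an assignment (assumption)
              serverLevel   : 'DEV1',                 // level to download the sources from
              targetFolder  : 'MF_Source'])           // workspace folder receiving the sources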

Tools used

The example pipelines use a development scenario based on:

  • Code Pipeline as the SCM to store and govern mainframe sources.
  • Git (GitHub) as the SCM to store unit test assets.
  • Total Test as the mainframe unit, functional, and integration testing tool to create and maintain test assets.
  • Code Debug Code Coverage as the tool to gather code coverage data during the execution of the unit tests.
  • SonarQube as the server for code analysis and setting up quality gates.
  • XLRelease as the CD server for release steps following the initial CI process in Jenkins.

Instructions for configuring the various tools can be found in the Configuration section of this site.

The code repository folder structure

Based on the description above, and due to the requirements for the use of Pipeline Shared Libraries in Jenkins, the folder structure of the DevOps-Examples repository is as follows:

(root)
    +- resources                                            # Configuration files used by the pipelines
    +- src                                                  # (Groovy) source files
    |   +- Jenkinsfile                                      # "simple" example job scripts
    |   +- misc-examples                                    # non Jenkins related (non Groovy) code examples
    |       +- AzureDevOps
    |           +- PipelineYAML                             # YAML file(s) describing Azure DevOps pipelines (builds or releases)
    |           +- Powershell                               # Powershell scripts being used by an Azure DevOps example pipeline
    +- vars                                                 # Shared Library Pipeline Examples

Mainframe CI using alternatives to Jenkins

Even though Jenkins is the most prevalent CI server on the market, some shops use alternative solutions. And even though implementing a mainframe CI process with Jenkins is simplified by the availability of the BMC plugins, these plugins, under the hood, use the Workbench command line interface and the Code Pipeline REST API.

Solutions using alternatives to Jenkins must allow:

  • Installation of the Workbench command line interface
  • Execution of command line commands
  • Execution of http REST calls
  • Triggering of remote jobs (e.g., via REST calls)

Given these capabilities, and making use of the CLI and the Code Pipeline REST API, it is possible to implement CI processes similar to the ones we present using Jenkins.

As a proof of concept, we describe and publish code that implements a CI/CD process using Azure DevOps pipelines. The pipeline implements the general process steps, and the CI build pipeline triggers an Azure DevOps release pipeline, which we also describe. Most of the implementation is done using PowerShell scripts; a sketch of a REST call follows below.
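
To illustrate the REST part, the following plain Groovy fragment sketches a query against the Code Pipeline REST API via CES. The endpoint path, SRID, and authorization header are assumptions based on the general pattern of the REST API; consult the Code Pipeline REST API reference for the exact resources and headers.

    // Query the tasks of an assignment via the CES REST API (all values are examples).
    def cesUrl   = 'http://ces.example.com:2020'
    def cesToken = 'xxxx-xxxx-xxxx'                        // CES personal access token
    def conn     = new URL("${cesUrl}/ispw/CES_SRID/assignments/PLAY000001/tasks").openConnection()
    conn.setRequestProperty('Authorization', cesToken)     // CES expects the token here (assumption)
    conn.setRequestProperty('Accept', 'application/json')
    println conn.inputStream.text                          // JSON document describing the tasks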
