
Milestone 2: Prepare for Installation


Before you install Code Pipeline, you must make certain decisions and gather certain information. The tables in this milestone can be printed and used to record the values to be specified later in the installation process.

Important

  • Roles involved: Code Pipeline Installer and z/OS Security Administrator.
  • You will need two instances of Code Pipeline: a test instance and a production instance. Use the test instance to develop your skels and any other customizations. The test instance can also be used in preparation for installing a new Code Pipeline upgrade as well as for training.
  • You should also have a test and a production instance of CES (BMC AMI Common Enterprise Services).
  • The UserID must not exceed 7 characters.

Complete the following tasks to get ready to install Code Pipeline:

Step 1 Establish Code Pipeline data set naming conventions

Various types of data sets are required for the installation of Code Pipeline and should be planned to ensure a smooth installation. The following table describes the various types of data sets required and provides naming examples. The naming convention established in this task will be used in Determine High-Level Qualifiers and Warehouse ID.
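As a rough planning aid, the qualifiers can be sketched and sanity-checked before the installation dialog is run. The HLQ values below are the examples from the following table, not requirements; the only hard rule assumed here is the standard z/OS restriction that each dot-separated qualifier of a data set name is 1 to 8 characters:

```python
# Hypothetical planning sketch -- the HLQs below are the examples from the
# table, not requirements; substitute your site's naming conventions.

def validate_hlq(hlq: str) -> bool:
    """Each dot-separated qualifier of a z/OS data set name is 1-8 characters."""
    return all(1 <= len(q) <= 8 for q in hlq.split("."))

naming_plan = {
    "SMP/E":     "IW.R220000",  # HLQ for Target and Distribution libraries
    "Site":      "ISPW.SITE",   # Site High Level Qualifier
    "Play":      "ISPW.PLAY",   # Play High Level Qualifier
    "Set Log":   "ISPW.SX",     # Set (SX) High Level Qualifier
    "Warehouse": "ISPW.WH",     # Warehouse High Level Qualifier
}

for dstype, hlq in naming_plan.items():
    assert validate_hlq(hlq), f"qualifier longer than 8 characters in {hlq}"
    print(f"{dstype:9} -> {hlq}.*")
```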


Code Pipeline Data Set Types

Data set Type

Description

Example

Corresponding Field in Installation Configuration Dialog (Specify Environment Values)

Your Naming Convention

SMP/E Datasets
(PDS/PDSE)

Twelve data sets created by SMP/E during the BMC Mainframe Products installation.

IW.R220000.*

HLQ for Target and Distribution libraries


Site Customizations
(PDS/PDSE)

The ISPF component of Code Pipeline is customizable. These Site customizations are managed in data sets separate from the Base software. Standard Code Pipeline users need READ access to these data sets; your Code Pipeline Tech Support needs UPDATE access.

ISPW.SITE.*

Site High Level Qualifier


Training Application
Datasets
(PDSE)

A Training Application called PLAY is delivered in the Code Pipeline SAMPLIB and can be installed as part of the IVP process. Standard Code Pipeline users need UPDATE access to these data sets; your Code Pipeline Tech Support needs ALTER access.

ISPW.PLAY.*

Play High Level Qualifier


Set Log Datasets
(Sequential)

Some Code Pipeline background work is performed in a “Set”. A log of the Set’s progress and any errors encountered is stored in a sequential data set unique to that Set. Many of these will accumulate, but they can be automatically migrated and deleted after a period of time. Code Pipeline users need READ access to these data sets. The Code Pipeline SX Started Task needs ALTER access.

ISPW.SX.*

Set (SX) High Level Qualifier


Warehouse Datasets
(PDSE)

Automatically created by Code Pipeline (CT Task) as required. Used to store compressed versions of the Application Components Code Pipeline is managing. Only the Code Pipeline CT Started Task needs ALTER access to these data sets.

ISPW.WH.*

Warehouse
High Level Qualifier


Step 2 Determine High-Level Qualifiers and Warehouse ID

The following table can be used to record the values you will enter during Specify Environment Values.

Important

  • The Installation Requirements listed are also the field names on the Validate Environment Values screen used in Specify Environment Values.
  • The 17.02 Variable column in the following table lists the corresponding 17.02 variables for those performing an upgrade from release 17.02.

 High-Level qualifiers

Installation Requirement

17.02 Variable

Your Value

SMP/E (data sets from the BMC Installer):
HLQ for Target and Distribution libraries

TAPEHLQ


Site, Play, and Set data sets:
Site High Level Qualifier

SITEHLQ


Site, Play, and Set data sets:
Play High Level Qualifier

PLAYHLQ


Site, Play, and Set data sets:
Set (SX) High Level Qualifier

SETLGHLQ


Warehouse:
ID

WHID


Warehouse:
High Level Qualifier

WHDSHLQ


Step 3 Set Up an SAF Class

Code Pipeline makes security checks to SAF that require a generic Class Name. It is recommended that a separate SAF Class be defined specifically for Code Pipeline, because the Code Pipeline CM started task issues RACROUTE REQUEST=LIST against the class at startup to pre-load all the profiles. Keeping the Code Pipeline profiles separate provides the maximum level of optimization under SAF.

Dynamic RACF CDT Class definition

An example definition for RACF follows:

Example

RDEFINE CDT $ISPW OWNER(ISPWADM) CDTINFO(CASE(UPPER) DEFAULTRC(4) +
DEFAULTUACC(ACEE) FIRST(ALPHA NATIONAL NUMERIC) GENLIST(DISALLOWED) +
KEYQUALIFIERS(0) MACPROCESSING(NORMAL) MAXLENX(80) MAXLENGTH(80) +
OPERATIONS(NO) OTHER(ALPHA NATIONAL NUMERIC SPECIAL) POSIT(25) +
PROFILESALLOWED(YES) RACLIST(DISALLOWED) SIGNAL(NO) +
SECLABELSREQUIRED(NO))

This can be refreshed using the RACF command:

SETROPTS REFR RACLIST(CDT)

Specifying the Class during the installation

There is an installation substitution variable in the dialog called SECCLASS. A non-existent class can be specified here if it is to be defined later. Code Pipeline’s CM will still work, except that no internal security will be available.

Important

This can be useful to bring up a CM before the SAF Class is available, or to run a test CM with no security. It is not, however, recommended as a long-term approach.

Running without internal security

Early in the installation process, it will become necessary to turn on some internal security. All of the Reference Data should be protected from users so that it is not accidentally changed. The Reference Data governs the way Code Pipeline works, and it is recommended that only your Code Pipeline Technical Support person updates it.

Step 4 Create UserIDs for Started Tasks

Installation of the Base Code Pipeline product will result in three permanent Started Tasks (CM, CI, and CT) plus one other Started Task (SX) that is started, when necessary, for Code Pipeline to perform work in a Set. Information about their required permissions is listed in the subsequent paragraphs; however, some of the values (such as the names of the Warehouse Datasets) will not be known until later in the installation process.

If your site intends to use the Code Pipeline Deploy feature, a fifth started task (RX) will need to be defined. See the Code-Pipeline-Deploy-Reference for more information about the RX task.

Additionally, if your site intends to use the Custom Exit Processor a sixth started task (FX) needs to be defined. Its permissions are similar to the SX task mentioned above.

If your site intends to use the Code Pipeline External Function Processor Enhancement, a seventh started task (EF) will need to be defined.

CM Authority

The CM Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the CM.
  • EXECUTE privilege on the Db2 Plan as specified in the input parameters. This should be done as part of the Db2 Repository Install.
  • The UserID should be set up with no password (using the NOPASS attribute).
  • For TCP/IP communications, the CM Task will require an OMVS segment.
  • Authority to issue the z/OS START command (to start the SX task).

CI Authority

The CI Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the CI.
  • The UserID should be set up with no password (using the NOPASS attribute).
  • STEPLIB must be z/OS APF-authorized.
  • For TCP/IP communications, the CI Task will require an OMVS segment.

CT Authority

The CT Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the CT.
  • ALTER access to the Warehouse data sets (see Warehouse Datasets in Establish Code Pipeline data set naming conventions).

SX Authority

The SX Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the SX.
  • UPDATE access to all Application data sets managed by Code Pipeline.
  • ALTER access to a specified HLQ for a “Set Log” (as specified for Set Log Datasets in Establish Code Pipeline Dataset Naming Conventions).
  • The UserID should be a maximum of 7 characters because SX runs batch TSO and ISPF.
  • Because SX submits controlled compile jobs under its authority, it requires TSOSUBMIT authority. For the same reason, SX requires JCL authority under the TSOAUTH resource class.
  • Authority to perform operations the SX task will do directly (for example, BINDs and CICS Newcopy).
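The 7-character limit on the SX UserID can be checked ahead of time. The sketch below is illustrative only; the first-character rule (alphabetic or one of the national characters @ # $) is an assumption based on common RACF UserID conventions, not something stated above:

```python
# Illustrative check of the SX UserID rules. The 7-character maximum is stated
# in the documentation; the first-character rule (alphabetic or one of @ # $)
# is an assumed RACF convention, included only for illustration.

def valid_stc_userid(userid: str, max_len: int = 7) -> bool:
    if not (1 <= len(userid) <= max_len):
        return False
    first = userid[0].upper()
    return first.isalpha() or first in "@#$"

print(valid_stc_userid("ISPWSX"))     # True: 6 characters
print(valid_stc_userid("ISPWSXTSK"))  # False: 9 characters exceeds the limit
```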

RX Authority

Important

The RX task is configured differently at every site, so the guidelines here depend heavily on your particular implementation of Code Pipeline’s deploy functionality.

If Deploy will be used, the RX Task needs to be associated with an ID with the following characteristics:

  • TSO UserID (7 characters or less).
  • The UserID should be set up with no password (using the NOPASS attribute).
  • An OMVS segment is needed if Code Pipeline will be used to deploy UNIX files.
  • RACF authority to data sets the RX task will access (such as Lifecycle and Target Deploy Datasets).
  • Authority to perform operations the RX task will do directly (for example, BINDs and CICS Newcopy).

FX Authority

The FX Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the FX.
  • UPDATE access to all Application data sets managed by Code Pipeline.
  • ALTER access to a specified HLQ for an “FX Log” (as specified by the FXLOGPFX variable).
  • The UserID should be set up with no password (using the NOPASS attribute).
  • The FX Task requires JCL authority under the TSOAUTH resource class. It will require the same authority as the SX Processor.

EF Authority

The EF Task needs to be associated with a UserID with the following authority:

  • READ access to data sets specified in the PROC for the EF.

Step 5 Plan System Libraries

During Code Pipeline configuration, you will enter the library data set names to be used in tailoring the Site skeletons. ISPF Dataset Names and Compiler and Related Dataset Names can be used to record the values you will enter in Define/Provide System Libraries.

Task 5.1 Determine ISPF Data set names

The following table can be used to record the values you will enter during Provide ISPF Data set Names.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02. These variables also correspond to the field names on the Code Pipeline Site Compile Libraries screen in Provide ISPF Data set Names.

ISPF Data set names

Dataset Type

17.02 Variable

Your Site-Specific Dataset Name

IBM-supplied EXECs

ISPFEXEC


IBM-supplied CLISTs

ISPFCLST


IBM-supplied Panels

ISPFPANL


IBM-supplied Skeletons

ISPFSKEL


IBM-supplied Tables

ISPFTABL


IBM-supplied Messages

ISPFMSGS


IBM-supplied Load Modules

ISPFLOAD


Task 5.2 Determine compiler and related data set names

The following table can be used to record the values you will enter during Provide Compiler and Related Dataset Names.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02. These variables also correspond to the field names on the Code Pipeline Site Compile Libraries screen in Provide Compiler and Related Dataset Names.

Compiler and related data set names

Dataset Type

17.02 Variable

Your Site-Specific Dataset Name

System Maclib Dataset for Assemblies

MACLIB


System Modgen Dataset for Assemblies

MODGEN


LE Maclib for compiles

SCEEMAC


COBOL for MVS Maclib

COBMMAC


COBOL for MVS Steplib

COBMSTEP


CICS Link Library for Link edits

CICSLINK


PL/I Steplib

PLISTEP


Step 6 Plan Started Task parameters

Code Pipeline relies on four Started Tasks (CM, CI, CT and SX) to perform all its basic functions. Without these Started Tasks, Code Pipeline will not function. Follow the instructions in this Milestone to plan the necessary Started Task parameters you will enter during Milestone-8-Define-Started-Tasks.

Important

  • If you are upgrading from a previous release of Code Pipeline, instead of planning your Started Task parameters in this Task, you can import your existing parameters for the CM, CI, and CT Started Tasks. For more information, see Milestone-8-Define-Started-Tasks.
  • The SX Started Task planned in this Milestone does not have a configuration screen because its definition is derived from the definitions for the other Started Tasks.

Task 6.1 Determine Started Task common parameters

Certain parameters are used by all of the Code Pipeline Started Tasks. The following table can be used to record the values you will enter during Define Common Parameters.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02.

Started Task common parameters

Parameter

Description

17.02 Variable

Your Value

Server Names
SERVERID

The Internal Server ID to be used by the Code Pipeline CM Server.

SERVERID


Server Names
WZCMNAME

Internal communication ID for the CM Started Task. For TCP/IP communications, it is the logical name.

WZCMNAME


Server Names
WZCINAME

Internal communication ID for the CI Started Task. For TCP/IP communications, it is the logical name.

WZCINAME


Server Names
WZCTNAME

Internal communication ID for the CT Started Task. For TCP/IP communications, it is the logical name.

WZCTNAME


Port Numbers
WZCMPORT

The port number on which the CM Started Task should listen for communications from the CI and CT Started Tasks. Limited to 5 digits.

WZCMPORT


Port Numbers
WZCMXPRT

The port number on which the CM Started Task should listen for REST API communications and connections from Workbench for Eclipse clients via HCI. Limited to 5 digits.

WZCMXPRT


Communications
XSYSPROT

Communications protocol to be used by the CM Started Task. (Currently, only TCP/IP is supported.)

XSYSPROT


Communications
WZCMADDR

The IP address of the LPAR on which the CM Started Task runs. The IP address can be either a DNS name or IP address, unless VIPA (Virtual IP Addresses) is used. If VIPA is used, this must be the IP address assigned to the CM Started Task or the DNS name that resolves to the VIPA IP address.

WZCMADDR
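
A minimal sketch of the stated constraints on these common parameters (only TCP/IP for XSYSPROT, port numbers limited to 5 digits) follows. The `check_common_parms` helper and the sample port values are hypothetical; the installation dialog performs its own validation:

```python
# Hypothetical validation of the common Started Task parameters recorded above.
# check_common_parms and the sample values are illustrative only.

def check_common_parms(parms: dict) -> list:
    errors = []
    if parms.get("XSYSPROT") != "TCPIP":      # only TCP/IP is supported
        errors.append("XSYSPROT: only TCP/IP is supported")
    for key in ("WZCMPORT", "WZCMXPRT"):      # ports limited to 5 digits
        port = parms.get(key, 0)
        if not (1 <= port <= 65535):
            errors.append(f"{key}: {port} is not a valid port")
    return errors

sample = {"XSYSPROT": "TCPIP", "WZCMPORT": 47001, "WZCMXPRT": 47002}
print(check_common_parms(sample))  # [] -- all sample values are acceptable
```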


Task 6.2 Determine CM Started Task parameters

The following table can be used to record the values you will enter during Define CM Specific Parameters.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02.

CM Started Task parameters

Parameter

Description

17.02 Variable

Your Value

SECCLASS

The name of the SAF security class to be used by Code Pipeline.

SECCLASS


SECRULE

The Security Rule Specification. The default rule effectively turns security off. See the Code-Pipeline-Technical-Reference-Guide chapter entitled “Security” for details on how to turn security on.

SECRULE


SETPROC

The name of the SX Started Task. (The CM Started Task will start and stop this Task dynamically when Set processing is required.)

SETPROC


FXPROC

The name of the FX started task. CM will issue a z/OS START for this name when Custom Exit processing is required within

Workbench for Eclipse

.

FXPROC


TCPUSERID

The job name of the TCP/IP address space. The default is TCPIP.

TCPUSERID


Number of TCBs
GPROCESS

Used by the CM Started Task to determine the number of long-running requests (Threads) that can be processed against Db2. Maps to the number of long-running TCBs. The default is 2.

GPROCESS


Number of TCBs
SPROCESS

Used by the CM Started Task to determine the number of short-running requests (Threads) that can be processed against Db2. Maps to the number of short-running TCBs. The default is 2.

SPROCESS


Number of TCBs
TPROCESS

Used by the CM Started Task to determine the number of long-running transaction process threads. Maps to the number of long-running TCBs used for transaction processes. The default is 2.

No corresponding 17.02 variable


Authorized Users
for Server Start:
AUTHUSER
(4 fields)

For normal maintenance, the Code Pipeline server will need to be stopped and restarted. To start the server, Code Pipeline administrators must be listed as Authorized Users. As part of the install, the UserID of the installer should be the first entry.

AUTHUSER


Optional
Parameters
WEBAPI

Through BMC AMI Common Enterprise Services (CES), you can use Code Pipeline REST APIs to perform various operations (Promotes, Fallbacks, and Generates) and enable the triggering of various Notifications, such as a post in a chat. For more information, see the CES Online Help.

Specify Y (Yes) if Code Pipeline REST APIs will be used. The default is N (No).

WEBAPI


Optional
Parameters
WEBTASKS

The Code Pipeline REST APIs use subtasks to send the event notifications to BMC AMI Common Enterprise Services (CES). This parameter allows you to specify the maximum number of subtasks that will be allowed to run simultaneously.

Specify the maximum number of subtasks. The default is 10.

No corresponding 17.02 variable


Optional
Parameters
WEBIDLE

The Code Pipeline REST APIs use subtasks to send the event notifications to BMC AMI Common Enterprise Services (CES). This parameter allows you to specify the length of time in seconds that an idle subtask will wait before shutting down.

Specify the maximum idle time. The default is 30.

No corresponding 17.02 variable


Optional
Parameters
CUSTOMDIALOGS

If using Code Pipeline in BMC AMI DevX Workbench for Eclipse, Code Pipeline administrators can create Custom Dialogs for Generates that will appear in Workbench for Eclipse and allow changing of values used for Generates (such as whether a Bind is required, Generate Sequence, etc.). For more information, see the Code-Pipeline-Technical-Reference-Guide chapter entitled “Custom-Dialogs”.

Specify Y (Yes) if Custom Dialogs will be used. The default is N (No).

CUSTOMDIALOGS


Optional
Parameters
CONFIGNAMES

This is used to tell CM whether it should build a list of Runtime Config entries that are valid for this instance of CM. A Runtime Config entry is valid if the SRID parameter in the config entry matches the SERVERID parameter specified in the appropriate CM startup member (see Determine Started Task Common Parameters). Valid values are Y and N. The default is N.

Important

If this parameter is set to Y, when a Workbench for Eclipse client connects to CM via HCI, the Runtime Config specified on the connect request will be verified against this list. If the specified entry does not exist on the list, the connect request will be rejected. If the Workbench for Eclipse client does not specify a Runtime Config and one of the Runtime Config list entries contains the parameter SRVRDEF=YES, CM will use this default config for the Workbench for Eclipse client’s session.

CONFIGNAMES


Task 6.3 Determine CI Started Task parameters

The following table can be used to record the values you will enter during Define CI Specific Parameters.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02.

CI Started Task parameters

Parameter

Description

17.02 Variable

Your Value

Cross-memory ID
XMEMID

The name of the Cross-Memory ID attributed to the CI Started Task. This ID will be used by Code Pipeline Clients to send messages to this task. This value is usually the same as SERVERID in Determine Started Task Common Parameters.

XMEMID


Port Numbers
WZCIPORT

No longer required. Specify 0.

WZCIPORT

0

Port Numbers
WZCIXPRT

No longer required. Specify 0.

WZCIXPRT

0

IP Addresses
WZCIADDR

No longer required. Leave blank.

WZCIADDR


Task 6.4 Determine CT Started Task parameters

The following table can be used to record the values you will enter during Define CT Specific Parameters.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02.

CT Started Task parameters

Parameter

Description

17.02 Variable

Your Value

CT parms:
TEMPPREFIX

The data set name prefix (12 characters or less) used by the CT Started Task during processing when creating and deleting numerous temporary data sets.

TEMPPREFIX


CT parms:
TEMPUNIT

The unit name the address space will use for IEBCOPY of the CT Started Task temporary data sets.

TEMPUNIT


CT parms:
TEMP_PRIMARY_SPACE

The primary allocation space the address space will use in cylinders for CT Started Task temporary data sets. The default is 5.



CT parms:
TEMP_SECONDARY_SPACE

The secondary allocation space the address space will use in cylinders for CT Started Task temporary data sets. The default is 30.

Important

If the values of the TEMP_PRIMARY_SPACE and TEMP_SECONDARY_SPACE parameters are set too low, the started task could encounter an out-of-space SB37 abend.



CT parms:
CWIDLE

The number of seconds a warehouse data set will be left idle before it is closed and deallocated. The default is 60. This is the first of four housekeeping parameters for warehouse data sets handled by the CT Started Task.

Important

If this value is set to zero, warehouse data sets will never be automatically closed and deallocated. If this value is too small, there may be excessive allocates and opens of the warehouse data sets. If the value is too high, warehouse data sets will remain allocated and open and may not be able to be backed up by normal site procedures.

CWIDLE


CT parms:
HKINTERVAL

The number of minutes the CT Started Task will wait before performing warehouse housekeeping. The default is 60.

As part of housekeeping, the CT Started Task obtains from the repository a list of members to be deleted from the warehouse, then deletes them. If this is done too often, excessive resources will be consumed querying the repository. If this value is set to zero (not recommended), automatic housekeeping will not occur.

HKINTERVAL


CT parms:
COMPSTAT

Whether the CT Started Task should issue compression statistics messages (bytes in, bytes out, and compression percentage) for each member that goes into the warehouse. The default is Y (Yes).

COMPSTAT


CT parms:
HSMINT

Whether the HSM interface should be used during housekeeping to recall migrated data sets required by the CT Started Task. If your site uses HSM for migration, specify Y (Yes). If your site uses another migration tool, specify N (No). The default is N (No).

The CT Started Task has a direct interface to HSM to recall data sets it needs to access. For sites not using HSM, CT submits a batch job to recall the data set. The job is built from model JCL contained in the DSRECALL member of the PARMLIB data set used by CT. Copy the sample DSRECALL member in SAMPLIB to the CT PARMLIB and modify it for your site. The DSRECALL provided with Code Pipeline allocates the migrated data set to trigger its recall. A recall utility job can replace this job, depending on your migration software.

Important

If the job submission method of data set recall will be used, the CT Started Task UserID must have the authority to submit batch jobs to the internal reader.

HSMINT


CT parms:
TCPUSERID

The job name of the TCP/IP address space. The default is TCPIP.

TCPUSERID


IP Addresses
WZCTADDR

If using VIPA (Virtual IP Addresses), the Virtual IP Address for the CT Started Task. If not using VIPA, this field should be left blank.

WZCTADDR


CT parms:
EFPROC

The name of the EF started task for this CT. CT will create an address space using this name at startup, if EF is configured.



CT parms:
EFRTCONF

The name of the Runtime Config entry used by EF to talk to CI.



Audit Log:
AUDITGDG

The GDG (Generation Data Group) base data set name for the Audit Logs.

The first of five parameters for Audit Logs created by the CT Started Task. Review these parameters carefully because the default settings may not be appropriate for every site.

If nothing is specified for AUDITGDG, no Audit Logs are created. If a value is specified, CT will create the GDG base.

After the GDG is created, you can alter it outside of Code Pipeline. Code Pipeline cannot alter a GDG.

AUDITGDG


Audit Log:
AUDITMAX

The maximum number of GDG versions. The default is 10.

AUDITMAX


Audit Log:
AUDITUNIT

Allocation unit for GDG data sets. The default is SYSDA.

AUDITUNIT


Audit Log:
AUDITVOL

Allocation volume for GDG data sets. The default is an asterisk (*) specifying to use SMS.

AUDITVOL


Audit Log:
AUDITSIZE

Allocation size for GDG data sets, in TRACKS. The default is 45.

AUDITSIZE
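
The retention behavior implied by AUDITMAX can be modeled simply: once the number of generations reaches the limit, writing a new generation rolls the oldest one off. The sketch below illustrates the rotation effect only; z/OS itself manages the real GDG:

```python
from collections import deque

# Illustrative model of audit-log GDG rotation. With the default AUDITMAX of
# 10, once 10 generations exist, each new generation rolls the oldest one off.
# z/OS manages the real GDG; this only models the retention effect.

AUDITMAX = 10
audit_gdg = deque(maxlen=AUDITMAX)

for generation in range(1, 13):               # create 12 generations
    audit_gdg.append(f"G{generation:04d}V00")

print(len(audit_gdg))   # 10 generations retained
print(audit_gdg[0])     # G0003V00 -- the two oldest generations rolled off
```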


The following table can be used to record additional values not appearing on the Update the Code Pipeline CT Parameters screen.

Important

The 17.02 Variable column in the following table lists the corresponding 17.02 variable for those performing an upgrade from release 17.02.

Additional CT Started Task audit log parameters

Parameter

Description

17.02 Variable

Your Value

Audit Log:
AUDITSTORCLAS

The GDG data set Storage Class. The default is blank.

AUDITSTORCLAS


Audit Log:
AUDITMGMTCLAS

The GDG data set Management Class. The default is blank.

AUDITMGMTCLAS


Audit Log:
AUDITDATACLAS

The GDG data set Data Class. The default is blank.

AUDITDATACLAS


Task 6.5 Setting the priority of Code Pipeline Started Tasks

For proper performance, the started tasks used by Code Pipeline must be given an appropriate priority. The following tasks should be given a relatively high priority:

  • CM is a transaction-processing environment, similar to CICS or IMS, that performs only business logic and Db2 activity.
  • CI is primarily a communications gateway.
  • CT handles deployments, moves components, and manages Code Pipeline warehouses.

Other tasks

  • SX tasks are started as required and can have many running concurrently up to user-configured maximums. These tasks primarily handle all the background life-cycle operations run in Sets, but are also used for some other background functions. As such, they could be considered similar to batch, but should probably be given higher priority than batch.
  • RX tasks are also started as required to handle deployments where any additional processes are required (CICS NEWCOPY, Db2 binds, LLA refresh, etc.), and should therefore be given consideration similar to that given to SX tasks.
  • FX tasks are started as required and handle custom exit processing for Code Pipeline for Eclipse users. There can be many FX tasks running concurrently (although this is unlikely). As such, they could be considered similar to batch, but should probably be given higher priority than batch.
  • EF tasks are started at Code Pipeline CT startup and handle parse operations for files saved by Code Pipeline for Eclipse users. As such, they could be considered similar to batch, but should probably be given higher priority than batch.

 
