Milestone 2: Prepare for Installation
Before you install Code Pipeline, you must make certain decisions and gather certain information. The tables in this milestone can be printed and used to record the values you will specify later in the installation process.
Complete the following steps to prepare for installing Code Pipeline:
Step 1 Establish Code Pipeline data set naming conventions
The installation of Code Pipeline requires several types of data sets, and their names should be planned in advance to ensure a smooth installation. The following table describes each data set type and provides naming examples; an illustrative RACF sketch of the corresponding generic data set profiles follows the table. The naming conventions established in this step are used in Determine High-Level Qualifiers and Warehouse ID.
Code Pipeline data set types
Data Set Type | Description | Example | Corresponding Field in Installation Configuration Dialog (Specify Environment Values) | Your Naming Convention |
---|---|---|---|---|
SMP/E Datasets | Twelve data sets created by the BMC Installer Mainframe Products SMP/E. | IW.R220000.* | HLQ for Target and Distribution libraries | |
Site Customizations | The ISPF component of Code Pipeline is customizable. These site customizations are managed in data sets separate from the Base software. Standard Code Pipeline users need READ access to these data sets; your Code Pipeline Technical Support personnel need UPDATE access. | ISPW.SITE.* | Site High Level Qualifier | |
Training Application | A training application called PLAY is delivered in the Code Pipeline SAMPLIB and can be installed as part of the IVP Process. Standard Code Pipeline users need UPDATE access to these data sets; your Code Pipeline Technical Support personnel need ALTER access. | ISPW.PLAY.* | Play High Level Qualifier | |
Set Log Datasets | Some Code Pipeline background work is performed in a “Set”. A log of the Set’s progress and any errors encountered is stored in a sequential data set unique to that Set. Many of these data sets accumulate over time, but they can be automatically migrated and deleted after a period of time. Code Pipeline users need READ access to these data sets; the Code Pipeline SX Started Task needs ALTER access. | ISPW.SX.* | Set (SX) High Level Qualifier | |
Warehouse Datasets | Automatically created by Code Pipeline (CT Task) as required. Used to store compressed versions of the Application Components Code Pipeline is managing. Only the Code Pipeline CT Started Task needs ALTER access to these data sets. | ISPW.WH.* | Warehouse | |
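As a rough illustration only (RACF is assumed, along with generic profile checking for the DATASET class; the example qualifiers from the table and the placeholder IDs ISPWUSER, ISPWTECH, ISPWSX, and ISPWCT are not prescribed by the installation), the access requirements above might translate into profiles such as:

```
ADDSD  'ISPW.SITE.**' UACC(NONE)
PERMIT 'ISPW.SITE.**' ID(ISPWUSER) ACCESS(READ)
PERMIT 'ISPW.SITE.**' ID(ISPWTECH) ACCESS(UPDATE)

ADDSD  'ISPW.PLAY.**' UACC(NONE)
PERMIT 'ISPW.PLAY.**' ID(ISPWUSER) ACCESS(UPDATE)
PERMIT 'ISPW.PLAY.**' ID(ISPWTECH) ACCESS(ALTER)

ADDSD  'ISPW.SX.**'   UACC(NONE)
PERMIT 'ISPW.SX.**'   ID(ISPWUSER) ACCESS(READ)
PERMIT 'ISPW.SX.**'   ID(ISPWSX)   ACCESS(ALTER)

ADDSD  'ISPW.WH.**'   UACC(NONE)
PERMIT 'ISPW.WH.**'   ID(ISPWCT)   ACCESS(ALTER)

SETROPTS GENERIC(DATASET) REFRESH
```

Substitute your own high-level qualifiers, group or user IDs, and access-list conventions; equivalent rules can be written for ACF2 or Top Secret.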
Step 2 Determine High-Level Qualifiers and Warehouse ID
The following table can be used to record the values you will enter during Specify Environment Values.
High-Level qualifiers
Installation Requirement | 17.02 Variable | Your Value |
---|---|---|
SMP/E (data sets from the BMC Installer): | TAPEHLQ | |
Site data sets: | SITEHLQ | |
Play data sets: | PLAYHLQ | |
Set Log data sets: | SETLGHLQ | |
Warehouse ID: | WHID | |
Warehouse data sets: | WHDSHLQ | |
Step 3 Set Up an SAF Class
Code Pipeline makes security checks through SAF, and these checks require a (generic) class name. It is recommended that a separate SAF class be defined specifically for Code Pipeline, because the Code Pipeline CM started task issues a RACROUTE REQUEST=LIST against the class at startup to pre-load all of its profiles. Keeping the Code Pipeline profiles in their own class provides the maximum level of optimization under SAF.
Dynamic RACF CDT Class definition
An example definition for RACF follows:
Example
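The class name ($ISPW), POSIT value, and CDTINFO settings shown below are illustrative only; choose a class name and a unique POSIT value that conform to your site's standards, and adjust MAXLENGTH and RACLIST to your security team's requirements:

```
RDEFINE CDT $ISPW UACC(NONE)                          +
        CDTINFO(DEFAULTUACC(NONE)                     +
                FIRST(ALPHA)                          +
                OTHER(ALPHA,NATIONAL,NUMERIC,SPECIAL) +
                MAXLENGTH(39)                         +
                GENERIC(ALLOWED)                      +
                POSIT(301)                            +
                RACLIST(ALLOWED))
```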
This can be refreshed using the RACF command:
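(The command below is a standard RACF refresh of the dynamic class descriptor table; no product-specific options are involved.)

```
SETROPTS RACLIST(CDT) REFRESH
```

If the new class also needs to be activated and its profiles RACLISTed (a typical follow-on step, not prescribed here), that is normally done with SETROPTS CLASSACT($ISPW) RACLIST($ISPW).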
Specifying the Class during the installation
There is an installation substitution variable in the dialog called SECCLASS. A non-existent class can be specified here if it is to be defined later. Code Pipeline’s CM will still work, except that no internal security will be available.
Running without internal security
Early in the installation process, it will become necessary to turn on some internal security. All of the Reference Data should be protected so that users cannot accidentally change it. The Reference Data governs the way Code Pipeline works, and it is recommended that only your Code Pipeline Technical Support personnel update it.
Step 4 Create UserIDs for Started Tasks
Installation of the Base Code Pipeline product results in three permanent Started Tasks (CM, CI, and CT) plus one additional Started Task (SX) that is started, when necessary, for Code Pipeline to perform work in a Set. Their required permissions are described in the following paragraphs; however, some of the values (such as the names of the Warehouse Datasets) will not be known until later in the installation process.
If your site intends to use the Code Pipeline Deploy feature, a fifth started task (RX) will need to be defined. See the Code-Pipeline-Deploy-Reference for more information about the RX task.
Additionally, if your site intends to use the Custom Exit Processor, a sixth started task (FX) needs to be defined. Its permissions are similar to those of the SX task described above.
If your site intends to use the Code Pipeline External Function Processor Enhancement, a seventh started task (EF) will need to be defined.
CM Authority
The CM Task needs to be associated with a UserID that has the following authority (an illustrative RACF sketch follows this list):
- READ access to data sets specified in the PROC for the CM.
- EXECUTE privilege on the Db2 Plan as specified in the input parameters. This should be done as part of the Db2 Repository Install.
- The UserID should be set up with no password (using the NOPASS attribute).
- For TCP/IP communications, the CM Task will require an OMVS segment.
- Authority to issue the z/OS START command (to start the SX task).
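A minimal RACF sketch for this UserID, assuming placeholder names (user ISPWCM, group STCGRP, procedure name ISPWCM, and the OMVS UID/GID values) and that your site protects MVS commands with an OPERCMDS profile covering MVS.START.STC.**:

```
ADDGROUP STCGRP OMVS(GID(400))
ADDUSER  ISPWCM DFLTGRP(STCGRP) NOPASSWORD                   +
         NAME('CODE PIPELINE CM STC')                        +
         OMVS(UID(401) HOME('/u/ispwcm') PROGRAM('/bin/sh'))
RDEFINE  STARTED ISPWCM.* STDATA(USER(ISPWCM) GROUP(STCGRP))
SETROPTS RACLIST(STARTED) REFRESH
PERMIT   MVS.START.STC.** CLASS(OPERCMDS) ID(ISPWCM) ACCESS(UPDATE)
SETROPTS RACLIST(OPERCMDS) REFRESH
```

NOPASSWORD creates a protected UserID (the NOPASS attribute referred to above), and the OMVS segment satisfies the TCP/IP requirement. READ access to the PROC data sets and EXECUTE on the Db2 Plan are granted separately through your site's normal data set and Db2 security procedures.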
CI Authority
The CI Task needs to be associated with a UserID with the following authority:
- READ access to data sets specified in the PROC for the CI.
- The UserID should be set up with no password (using the NOPASS attribute).
- The STEPLIB data sets must be z/OS APF-authorized (see the sketch after this list).
- For TCP/IP communications, the CI Task will require an OMVS segment.
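A minimal sketch of APF-authorizing the CI STEPLIB, assuming an SMS-managed load library named ISPW.CI.LOADLIB (a placeholder; use your actual STEPLIB data set names). The operator command makes the change immediately:

```
SETPROG APF,ADD,DSNAME=ISPW.CI.LOADLIB,SMS
```

To make the change permanent across IPLs, add an equivalent statement to the active PROGxx PARMLIB member:

```
APF ADD DSNAME(ISPW.CI.LOADLIB) SMS
```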
CT Authority
The CT Task needs to be associated with a UserID with the following authority:
- READ access to data sets specified in the PROC for the CT.
- ALTER access to the Warehouse Datasets from Establish Code Pipeline Dataset Naming Conventions.
- The UserID should be set up with no password (using the NOPASS attribute).
- Authority to issue the z/OS START command (to start the SX Task).
- For TCP/IP communications, the CT Task will require an OMVS segment.
- Code Pipeline internal authority (REFDATA, TECH) to keep the repository updated with warehouse details. See the “REFDATA” section in the chapter entitled “Security-Objects-and-Methods” in the Code-Pipeline-Technical-Reference-Guide for more information.
SX Authority
The SX Task needs to be associated with a UserID with the following authority:
- READ access to data sets specified in the PROC for the SX.
- UPDATE access to all Application data sets managed by Code Pipeline.
- ALTER access to a specified HLQ for a “Set Log” (as specified for Set Log Datasets in Establish Code Pipeline Dataset Naming Conventions).
- The UserID should be a maximum of 7 characters because SX runs batch TSO and ISPF.
- Because SX submits controlled compile jobs under its authority, it requires TSOSUBMIT authority. For the same reason, SX requires JCL authority under the TSOAUTH resource class (see the sketch after this list).
- Authority to perform operations the SX task will do directly (for example, BINDs and CICS Newcopy).
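A minimal RACF sketch for the TSOAUTH JCL requirement, assuming the placeholder UserID ISPWSX and that the JCL resource is already defined in the TSOAUTH class (as it is at most sites):

```
PERMIT   JCL CLASS(TSOAUTH) ID(ISPWSX) ACCESS(READ)
SETROPTS RACLIST(TSOAUTH) REFRESH
```

UPDATE and ALTER access to the Application and Set Log data sets is granted with ordinary data set profiles, as sketched in Establish Code Pipeline Dataset Naming Conventions.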
RX Authority
If Deploy will be used, the RX Task needs to be associated with an ID with the following characteristics:
- TSO UserID (7 characters or less).
- The UserID should be set up with no password (using the NOPASS attribute).
- An OMVS segment is needed if Code Pipeline will be used to deploy UNIX files.
- RACF authority to data sets the RX task will access (such as Lifecycle and Target Deploy Datasets).
- Authority to perform operations the RX task will do directly (for example, BINDs and CICS Newcopy).
FX Authority
The FX Task needs to be associated with a UserID with the following authority:
- READ access to data sets specified in the PROC for the FX.
- UPDATE access to all Application data sets managed by Code Pipeline.
- ALTER access to a specified HLQ for an “FX Log” (as specified in the FXLOGPFX variable).
- The UserID should be set up with no password (using the NOPASS attribute).
- The FX Task requires JCL authority under the TSOAUTH resource class. It will require the same authority as the SX Processor.
EF Authority
The EF Task needs to be associated with a UserID with the following authority:
- READ access to data sets specified in the PROC for the EF.
Step 5 Plan System Libraries
During Code Pipeline configuration, you will enter the library data set names used to tailor the Site skeletons. The ISPF Data set names and Compiler and related data set names tables can be used to record the values you will enter in Define/Provide System Libraries.
Task 5.1 Determine ISPF Data set names
The following table can be used to record the values you will enter during Provide ISPF Data set Names. Typical IBM default library names are shown after the table for reference.
ISPF Data set names
Dataset Type | 17.02 Variable | Your Site-Specific Dataset Name |
---|---|---|
IBM-supplied EXECs | ISPFEXEC | |
IBM-supplied CLISTs | ISPFCLST | |
IBM-supplied Panels | ISPFPANL | |
IBM-supplied Skeletons | ISPFSKEL | |
IBM-supplied Tables | ISPFTABL | |
IBM-supplied Messages | ISPFMSGS | |
IBM-supplied Load Modules | ISPFLOAD |
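For reference, the IBM-supplied default names for these libraries (assuming the standard ISP high-level qualifier and the English-language data sets; your site may use different qualifiers or concatenations) are typically:

```
ISPFEXEC   ISP.SISPEXEC    EXECs
ISPFCLST   ISP.SISPCLIB    CLISTs
ISPFPANL   ISP.SISPPENU    Panels
ISPFSKEL   ISP.SISPSENU    Skeletons
ISPFTABL   ISP.SISPTENU    Tables
ISPFMSGS   ISP.SISPMENU    Messages
ISPFLOAD   ISP.SISPLOAD    Load modules
```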
Task 5.2 Determine compiler and related data set names
The following table can be used to record the values you will enter during Provide Compiler and Related Dataset Names.
Compiler and related data set names
Dataset Type | 17.02 Variable | Your Site-Specific Dataset Name |
---|---|---|
System Maclib Dataset for Assemblies | MACLIB | |
System Modgen Dataset for Assemblies | MODGEN | |
LE Maclib for compiles | SCEEMAC | |
COBOL for MVS Maclib | COBMMAC | |
COBOL for MVS Steplib | COBMSTEP | |
CICS Link Library for Link edits | CICSLINK | |
PL/I Steplib | PLISTEP |
Step 6 Plan Started Task parameters
Code Pipeline relies on four Started Tasks (CM, CI, CT, and SX) to perform all of its basic functions. Without these Started Tasks, Code Pipeline will not function. Follow the instructions in this Milestone to plan the Started Task parameters you will enter during Milestone-8-Define-Started-Tasks.
Task 6.1 Determine Started Task common parameters
Certain parameters are used by all of the Code Pipeline Started Tasks. The following table can be used to record the values you will enter during Define Common Parameters.
Started Task common parameters
Parameter | Description | 17.02 Variable | Your Value |
---|---|---|---|
Server Names | The Internal Server ID to be used by the Code Pipeline CM Server. | SERVERID | |
Server Names | Internal communication ID for the CM Started Task. For TCP/IP communications, it is the logical name. | WZCMNAME | |
Server Names | Internal communication ID for the CI Started Task. For TCP/IP communications, it is the logical name. | WZCINAME | |
Server Names | Internal communication ID for the CT Started Task. For TCP/IP communications, it is the logical name. | WZCTNAME | |
Port Numbers | The port number on which the CM Started Task should listen for communications from the CI and CT Started Tasks. Limited to 5 digits. | WZCMPORT | |
Port Numbers | The port number on which the CM Started Task should listen for REST API communications and connections from Workbench for Eclipse clients via HCI. Limited to 5 digits. | WZCMXPRT | |
Communications | Communications protocol to be used by the CM Started Task. (Currently, only TCP/IP is supported.) | XSYSPROT | |
Communications | The IP address of the LPAR on which the CM Started Task runs. The IP address can be either a DNS name or IP address, unless VIPA (Virtual IP Addresses) is used. If VIPA is used, this must be the IP address assigned to the CM Started Task or the DNS name that resolves to the VIPA IP address. | WZCMADDR |
Task 6.2 Determine CM Started Task parameters
The following table can be used to record the values you will enter during Define CM Specific Parameters.
CM Started Task parameters
Parameter | Description | 17.02 Variable | Your Value |
---|---|---|---|
SECCLASS | The name of the SAF security class to be used by Code Pipeline. | SECCLASS | |
SECRULE | The Security Rule Specification. The default rule effectively turns security off. See the Code-Pipeline-Technical-Reference-Guide chapter entitled “Security” for details on how to turn security on. | SECRULE | |
SETPROC | The name of the SX Started Task. (The CM Started Task will start and stop this Task dynamically when Set processing is required.) | SETPROC | |
FXPROC | The name of the FX started task. CM will issue a z/OS START for this name when Custom Exit processing is required within Workbench for Eclipse. | FXPROC | |
TCPUSERID | The job name of the TCP/IP address space. The default is TCPIP. | TCPUSERID | |
Number of TCBs | Used by the CM Started Task to determine the number of long-running requests (Threads) that can be processed against Db2. Maps to the number of long-running TCBs. The default is 2. | GPROCESS | |
Number of TCBs | Used by the CM Started Task to determine the number of short-running requests (Threads) that can be processed against Db2. Maps to the number of short-running TCBs. The default is 2. | SPROCESS | |
Number of TCBs | Used by the CM Started Task to determine the number of long-running transaction process threads. Maps to the number of long-running TCBs used for transaction processes. The default is 2. | No corresponding 17.02 variable | |
Authorized Users | For normal maintenance, the Code Pipeline server will need to be stopped and restarted. To start the server, Code Pipeline administrators must be listed as Authorized Users. As part of the install, the UserID of the installer should be the first entry. | AUTHUSER | |
Optional | Through BMC AMI Common Enterprise Services (CES), you can use Code Pipeline REST APIs to perform various operations (Promotes, Fallbacks, and Generates) and enable the triggering of various Notifications, such as a post in a chat. For more information, see the CES Online Help. Specify Y (Yes) if Code Pipeline REST APIs will be used. The default is N (No). | WEBAPI | |
Optional | The Code Pipeline REST APIs use subtasks to send the event notifications to BMC AMI Common Enterprise Services (CES). This parameter allows you to specify the maximum number of subtasks that will be allowed to run simultaneously. Specify the maximum number of subtasks. The default is 10. | No corresponding 17.02 variable | |
Optional | The Code Pipeline REST APIs use subtasks to send the event notifications to BMC AMI Common Enterprise Services (CES). This parameter allows you to specify the length of time, in seconds, that an idle subtask will wait before shutting down. Specify the maximum idle time. The default is 30. | No corresponding 17.02 variable | |
Optional | If using Code Pipeline in BMC AMI DevX Workbench for Eclipse, Code Pipeline administrators can create Custom Dialogs for Generates that appear in Workbench for Eclipse and allow changing of values used for Generates (such as whether a Bind is required, Generate Sequence, etc.). For more information, see the Code-Pipeline-Technical-Reference-Guide chapter entitled “Custom-Dialogs”. Specify Y (Yes) if Custom Dialogs will be used. The default is N (No). | CUSTOMDIALOGS | |
Optional | This is used to tell CM whether it should build a list of Runtime Config entries that are valid for this instance of CM. A Runtime Config entry is valid if the SRID parameter in the config entry matches the SERVERID parameter specified in the appropriate CM startup member (see Determine Started Task Common Parameters). Valid values are Y and N. The default is N. | CONFIGNAMES | |
Task 6.3 Determine CI Started Task parameters
The following table can be used to record the values you will enter during Define CI Specific Parameters.
CI Started Task parameters
Parameter | Description | 17.02 Variable | Your Value |
---|---|---|---|
Cross-memory ID | The name of the Cross-Memory ID attributed to the CI Started Task. This ID will be used by Code Pipeline Clients to send messages to this task. This value is usually the same as SERVERID in Determine Started Task Common Parameters. | XMEMID | |
Port Numbers | No longer required. Specify 0. | WZCIPORT | 0 |
Port Numbers | No longer required. Specify 0. | WZCIXPRT | 0 |
IP Addresses | No longer required. Leave blank. | WZCIADDR |
Task 6.4 Determine CT Started Task parameters
The following table can be used to record the values you will enter during Define CT Specific Parameters.
CT Started Task parameters
Parameter | Description | 17.02 Variable | Your Value |
---|---|---|---|
CT parms: | The data set name prefix (12 characters or less) used by the CT Started Task during processing when creating and deleting numerous temporary data sets. | TEMPPREFIX | |
CT parms: | The unit name the address space will use for IEBCOPY of the CT Started Task temporary data sets. | TEMPUNIT | |
CT parms: | The primary allocation space the address space will use in cylinders for CT Started Task temporary data sets. The default is 5. | ||
CT parms: | The secondary allocation space the address space will use in cylinders for CT Started Task temporary data sets. The default is 30. | ||
CT parms: | The number of seconds a warehouse data set will be left idle before it is closed and deallocated. The default is 60. This is first of four housekeeping parameters for warehouse data sets handled by the CT Started Task. | CWIDLE | |
CT parms: | The number of minutes the CT Started Task will wait before performing warehouse housekeeping. The default is 60. As part of housekeeping, the CT Started Task obtains from the repository a list of members to be deleted from the warehouse, then deletes them. If this is done too often, excessive resources will be consumed querying the repository. If this value is set to zero (not recommended), automatic housekeeping will not occur. | HKINTERVAL | |
CT parms: | Whether the CT Started Task should issue compression statistics messages (bytes in, bytes out, and compression percentage) for each member that goes into the warehouse. The default is Y (Yes). | COMPSTAT | |
CT parms: | Whether the HSM interface should be used during housekeeping to recall migrated data sets required by the CT Started Task. If your site uses HSM for migration, specify Y (Yes). If your site uses another migration tool, specify N (No). The default is N (No). The CT Started Task has a direct interface to HSM to recall data sets it needs to access. For sites not using HSM, CT submits a batch job to recall the data set. The job is built from model JCL contained in the DSRECALL member of the PARMLIB data set used by CT. Copy the sample DSRECALL member in SAMPLIB to the CT PARMLIB and modify it for your site. The DSRECALL provided with Code Pipeline allocates the migrated data set to trigger its recall. A recall utility job can replace this job, depending on your migration software. | HSMINT | |
CT parms: | The job name of the TCP/IP address space. The default is TCPIP. | TCPUSERID | |
IP Addresses | If using VIPA (Virtual IP Addresses), the Virtual IP Address for the CT Started Task. If not using VIPA, this field should be left blank. | WZCTADDR | |
CT parms: | The name of the EF started task for this CT. CT will create an address space using this name at startup, if EF is configured. | ||
CT parms: | The name of the Runtime Config entry used by EF to talk to CI. | ||
Audit Log: | The GDG (Generation Data Group) base data set name for the Audit Logs. The first of five parameters for Audit Logs created by the CT Started Task. Review these parameters carefully because the default settings may not be appropriate for every site. If nothing is specified for AUDITGDG, no Audit Logs are created. If a value is specified, CT will create the GDG base. After the GDG is created, you can alter it outside of Code Pipeline. Code Pipeline cannot alter a GDG. | AUDITGDG | |
Audit Log: | The maximum number of GDG versions. The default is 10. | AUDITMAX | |
Audit Log: | Allocation unit for GDG data sets. The default is SYSDA. | AUDITUNIT | |
Audit Log: | Allocation volume for GDG data sets. The default is an asterisk (*) specifying to use SMS. | AUDITVOL | |
Audit Log: | Allocation size for GDG data sets, in TRACKS. The default is 45. | AUDITSIZE |
The following table can be used to record additional values not appearing on the Update the Code Pipeline CT Parameters screen.
Additional CT Started Task audit log parameters
Parameter | Description | 17.02 Variable | Your Value |
---|---|---|---|
Audit Log: | The GDG data set Storage Class. The default is blank. | AUDITSTORCLAS | |
Audit Log: | The GDG data set Management Class. The default is blank. | AUDITMGMTCLAS | |
Audit Log: | The GDG data set Data Class. The default is blank. | AUDITDATACLAS |
Task 6.5 Setting the priority of Code Pipeline Started Tasks
For proper performance, the started tasks used by Code Pipeline must be assigned appropriate dispatching priorities. The following tasks should be given a relatively high priority:
- CM is a transaction-processing environment, similar to CICS or IMS, that performs only business logic and Db2 activity.
- CI is primarily a communications gateway.
- CT handles deployments, moves components, and manages Code Pipeline warehouses.
Other tasks
- SX tasks are started as required, and many can run concurrently, up to user-configured maximums. These tasks primarily handle all of the background life-cycle operations run in Sets, but they are also used for some other background functions. As such, they could be considered similar to batch, but should probably be given higher priority than batch.
- RX tasks are also started as required to handle deployments where any additional processes are required (CICS NEWCOPY, Db2 binds, LLA refresh, etc.), and should therefore be given consideration similar to that given to SX tasks.
- FX tasks are started as required and handle custom exit processing for Code Pipeline for Eclipse users. There can be many FX tasks running concurrently (although this is unlikely). As such, they could be considered similar to batch, but should probably be given higher priority than batch.
- EF tasks are started at Code Pipeline CT startup and handle parse operations for files saved by Code Pipeline for Eclipse users. As such, they could be considered similar to batch, but should probably be given higher priority than batch.