
After you upgrade Atrium Core

After upgrading BMC Atrium Core, perform the procedures described in the following sections.

After you have completed these procedures, perform the appropriate post-installation procedures and restart your environment features in the order listed in Restarting BMC Atrium Core.

Reviewing upgrade log files

The installer writes status messages to a log file, giving a detailed account of all changes made to your data model during the upgrade. If the upgrade fails and you have to run the installer again, it resumes after the last operation it completed rather than repeating operations that have already been completed.
The log file is named cmdbengdebug.log, and by default is written to ARSystemServerDirectory\ARServer\db.
To specify a different location, edit the ar.cfg (Microsoft Windows) or ar.conf file (UNIX) before running the installer and change the value of the CMDB-Log-File-Location parameter.
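
For example, to write the log to a different directory on a UNIX server, you might set a line like the following in ar.conf (the path shown is illustrative):

CMDB-Log-File-Location: /opt/bmc/cmdblogs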

The following messages are displayed in the log file:

[WARNING] [TID: 004092] : UpdateCacheInfo -- Error Retrieving Federation
Update information.... error code: 303
[WARNING] [TID: 004092] : UpdateCacheInfo -- Error Retrieving Dataset
Update information.... error code: 303
[ INFO ] [TID: 001160] : Finished upgrade to version 7.5.00
[WARNING] [TID: 001160] : WARNING: 8037 Index length longer than 255 bytes --
[WARNING] [TID: 001160] : may not work on all databases
[ INFO ] [TID: 001160] : Finished incremental upgrade.
[WARNING] [TID: 001160] : WARNING: 8037 Index length longer than 255 bytes --
[WARNING] [TID: 001160] : may not work on all databases

These warnings are normal, and you can ignore them.

Tip

Make a copy of the cmdbengdebug.log file to view after a successful upgrade. Even though the upgrade succeeded, you can use the log file to find out which instances you must migrate manually to keep your data in the Common Data Model. It also tells you which classes the upgrade installer was unable to delete due to the existence of subclasses you created.
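
For example, on a UNIX server you might preserve the log with a command like the following (both paths are illustrative; use your actual database directory and backup location):

cp /opt/arsystem/ARServer/db/cmdbengdebug.log /var/backups/cmdbengdebug_upgrade.log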

Validating the upgrade

From the BMC Atrium Core Maintenance tool, run a health check to verify that the upgrade was successful. You can also view the log files to check for any errors or warnings.

Enabling entries in the armonitor and ar configuration files

If you are using a UNIX computer, you must enable the following entries that you commented out before the upgrade:

  • The arcmdbd.sh and arrecond.sh entries in the armonitor.conf file
  • The libcmdbconsolefilterapi21.sl and cmdbsvr21.sl entries in the ar.conf file

If you are using a Windows computer, you must enable the arcmdbd.exe and arrecond.exe entries in the armonitor.cfg file.

To enable the arcmdbd and arrecond entries in the armonitor file

  1. Navigate to the following location on the BMC Remedy AR System server:
    • (Windows) AR_install_directory\Conf
    • (UNIX) /etc/arsystem/server_name
  2. Open the armonitor.conf (UNIX) or the armonitor.cfg (Windows) file in a text editor.
  3. In the armonitor.conf file, locate the lines that include arcmdbd.sh and arrecond.sh.
    In the armonitor.cfg file, locate the lines that include arcmdbd.exe and arrecond.exe.
  4. Remove the number sign (#) from the start of each line.
  5. Save the file.
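
For example, an entry changes from commented out to active as follows (the path is illustrative; keep whatever path and arguments your file already contains):

Before:
# /usr/arsystem/norad/cmdb/server/bin/arcmdbd.sh
After:
/usr/arsystem/norad/cmdb/server/bin/arcmdbd.sh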

To enable the libcmdbconsolefilterapi21 and cmdbsvr21 entries in the ar.conf file

  1. Open the ar.conf file.
  2. Locate the lines that include the following:
    • Plugin: /usr/arsystem/norad/cmdb/server/bin/libcmdbconsolefilterapi21.sl
    • Load-Shared-Library: cmdbsvr21.sl
  3. Remove the number sign (#) from the start of each line.
  4. Save the file.
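
For example, the two entries change from commented out to active as follows:

Before:
# Plugin: /usr/arsystem/norad/cmdb/server/bin/libcmdbconsolefilterapi21.sl
# Load-Shared-Library: cmdbsvr21.sl

After:
Plugin: /usr/arsystem/norad/cmdb/server/bin/libcmdbconsolefilterapi21.sl
Load-Shared-Library: cmdbsvr21.sl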

Deleting obsolete directories after upgrade

When you upgrade from version 7.5 or later, you must choose a new installation directory for the upgrade that is not below the BMC_AR_SYSTEM_HOME directory. As a result, the old installation directory is now obsolete.
For example, if /usr/AtriumCore is the new installation directory and /usr/arsystem/cmdb is the old directory that contained the earlier version, delete the old directory to avoid confusion.
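
For example, on UNIX you might archive the old directory and then remove it (adjust both paths to your environment):

tar -czf /tmp/cmdb_old_backup.tar.gz /usr/arsystem/cmdb
rm -rf /usr/arsystem/cmdb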

Using the post-upgrade utility to identify duplicated records

The upgrade of BMC Atrium CMDB can result in duplicate records of the same virtual system. To identify those duplicated records, the upgrade installer launches a post-upgrade utility called vzUtil. For this version, the utility is called vzUtil75.
The post-upgrade utility operates in two phases: Identification and Merge.

Phases used in the post-upgrade utility

Identification

The Identification phase runs automatically, immediately after an upgrade. The utility scans your data and, by default, identifies duplicate virtual system records in BMC_ComputerSystem and BMC_VirtualSystemEnabler. If duplicate records are found, a message after the upgrade tells you to run the vzUtil utility again to manage the duplicates; you launch the utility manually and perform those steps in the Merge phase.

Merge

The Merge phase occurs after all upgrade processes are complete. Perform the Merge steps if the vzUtil utility reported that duplicate virtual system records were detected. The utility removes only pairs of duplicates found between the original and migrated instances; it is not intended to remove duplicates from within the original records of the ComputerSystem and VirtualSystemEnabler classes, nor from within the migrated instances.

When to use the post-upgrade utility

You can use the post-upgrade utility any time after the initial upgrading of your data is complete to find duplicate records of virtual systems.

If you run the utility manually after its automatic run during the upgrade, you can modify the parameters that it uses for identification by editing the vzUtil.properties file.

You can specify a maximum of seven rules for each class. The rules are similar to reconciliation identification rules. The utility matches the fields specified in each rule, in the order the rules are listed, and identifies as duplicates any records that match one of the rules. Specify the rules, along with key attributes, to uniquely identify the duplicate records, as sketched below.
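
The exact property syntax is documented in the vzUtil.properties file itself; the following sketch is hypothetical and only illustrates the idea of ordered, per-class rules that each list the attributes to match:

# Hypothetical syntax -- consult the comments in your vzUtil.properties for the real format
BMC_ComputerSystem.Rule1=TokenId
BMC_ComputerSystem.Rule2=Name,Domain
BMC_VirtualSystemEnabler.Rule1=Name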

Removing duplicate virtual system records with the post-upgrade utility

This procedure explains how to merge records with the vzUtil utility.

To run the vzUtil utility again for any reconciliation stage, you must first clean up each activity. Do not re-run the second Merge stage, because it is the final stage of vzUtil.

Warning

Changes that result from the following procedure affect the production dataset.

Before you begin

If you need to delete files to run the vzUtil utility again, make a copy of the production dataset.

To perform records merge with vzUtil

  1. Launch the vzUtil utility (vzUtil75.cmd on Windows or vzUtil75.sh on UNIX).
  2. Manually run the appropriate reconciliation jobs to help regenerate reconciliation IDs and merge the CIs in production datasets.
  3. Launch the vzUtil utility again.
    The utility populates the CMDB:VzDuplicates form with new reconciliation IDs and instance IDs.
  4. (Optional) View the log files.
    Logs for vzUtil are generated by default in the Logs folder of the BMC Atrium Core installation directory (one level below the BMC Atrium CMDB installed path): installationDirectory/../Logs/vzutil.log.
    Two logging types are available for vzUtil:
    • Console type logging
    • File type logging (default)

      You can change the default logging type in the log4j_vzutil.xml file, as sketched below.
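
If log4j_vzutil.xml follows the standard log4j 1.x XML layout, switching from the default file-type logging to console-type logging might be a matter of changing the appender reference on the root logger. This is a minimal sketch; the appender names are assumptions, so use the names actually defined in your file:

<!-- Hypothetical appender names; match them to those defined in log4j_vzutil.xml -->
<root>
  <priority value="info"/>
  <appender-ref ref="CONSOLE"/> <!-- was ref="FILE" for the default file-type logging -->
</root>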

To clean up activities to reuse the vzUtil utility

  1. For the Identify stage, manually delete entries from the CMDB:VzIdentify form.
    If the entries are not removed, then the first merge stage detects those entries and deletes the CIs in the production dataset.
  2. For the first Merge stage, complete the following steps.
    1. Create a backup of the production dataset.
    2. Reset the reconciliation IDs of the matching CIs in the import datasets and also their weak objects and relationships.
    3. In the production dataset, delete the CI that is identified as a duplicate in the CMDB:VzIdentify form.
    4. Delete the entry from CMDB:VzIdentify form.
    5. Restore the production dataset from the backup taken before running this stage.
    6. Run the reconciliation jobs that generate the reconciliation IDs for the import datasets, with the help of the reconciliation IDs in the production dataset.
    7. Remove all the entries from the CMDB:VzDuplicates form.

Running the datamig utility

If you enabled the Impact Normalization Feature option from the Normalization Engine console before you upgraded to BMC Atrium Core 8.0.00, relationship names for all CIs were normalized following the old best-practice relationship rules and impact relationships were generated following the old best-practice impact normalization rules.
The datamig utility cleans up the CIs in the staging datasets for which relationship names are normalized following the old best-practice relationship rules and impact relationships are generated following the old best-practice impact normalization rules.

Note

After upgrading to BMC Atrium Core 8.0.00, if you enable the Impact Normalization Feature option from the Normalization Engine console, relationship names for all CIs are normalized following the 131 new best-practice relationship rules and impact relationships are generated following the 21 new best-practice impact normalization rules.

Changing the datamig file to match your environment

Before you run the datamig utility, you must change the datamig file to match your environment.

  1. From the bin folder under cmdb\sdk in the BMC Atrium Core home directory, open the datamig.bat file (Windows) or the datamig.sh file (UNIX) in a text editor.
  2. Set the ATRIUM_CORE, AR_SERVER_NAME, AR_SERVER_PORT, and JAVA_HOME parameters to the BMC Atrium Core installation directory path, the AR System server name, the AR System server port, and the Java home path, respectively (see the example after these steps).
  3. Save your changes.
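
For example, on UNIX the edited section of datamig.sh might look like the following (every value is illustrative; substitute your own paths, server name, and port):

ATRIUM_CORE=/opt/bmc/AtriumCore
AR_SERVER_NAME=arserver01.example.com
AR_SERVER_PORT=0
JAVA_HOME=/usr/java/jdk1.6.0_23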

Running the datamig utility

When you run the datamig utility, the Normalization Engine creates a job called BMC_ResetAiData_<DataSetName>_stage0. This job runs until all the CIs are cleaned up. You can ignore this job, or delete it after the cleanup is complete.

If you upgrade to BMC Atrium Core 8.0.00 in an AR System server group environment, run the datamig utility on the primary server only. When you are prompted to run the datamig utility on the secondary and subsequent members of the server group, enter N.

To run the datamig utility on Windows

  1. For each dataset for which you had enabled the Impact Normalization Feature option, do the following:
    • Run the following command: datamig.bat -dataset <DatasetName>
      For example, if the name of your staging dataset is BMC.ADDM, run the following command: datamig.bat -dataset BMC.ADDM.
    • When prompted, enter valid values for AR Server username and password.
    • Wait for the job to finish.
  2. Check the results, as explained in Taking action on the results of the datamig utility.

To run the datamig utility on UNIX

  1. Make the datamig.sh script executable by running the following command:
    chmod 555 datamig.sh
  2. For each dataset for which you had enabled the Impact Normalization Feature option, do the following:
    • Run the following command: datamig.sh -dataset <DatasetName>
      For example, if the name of your staging dataset is BMC.ADDM, run the following command: datamig.sh -dataset BMC.ADDM.
    • When prompted, enter valid values for AR Server username and password.
    • Wait for the job to finish.
  3. Check the results, as explained in Taking action on the results of the datamig utility.

Taking action on the results of the datamig utility

After you run the datamig utility, three files are created in the following location:
  • (Windows) C:\Windows\temp
  • (UNIX) /tmp
These files contain the results of the datamig utility, as explained below.

fixedImpacts_MM_DD_YYYY_HH_MM.csv

Description: Contains the impact relationships that were reset.

Possible action: No action is needed on the results in this file.

discrepancy_MM_DD_YYYY_HH_MM.csv

Description: Contains rules that either do not match the qualification or that have modified impact attribute values.

Possible actions:
  • Leave the impact relationships as is.
  • Reset the impact relationships for selected CIs. See Resetting the impact relationships.

NoRules_MM_DD_YYYY_HH_MM.csv

Description: Contains the impact relationships for which no rules are defined.

Possible actions:
  • Create new rules to define the impact relationships, which allows the Normalization Engine to set impact relationships on similar CIs.
  • Create no new rules.

Note

MM_DD_YYYY_HH_MM represents the date and the time in hours and minutes when the files are created. For example, after you run the datamig utility, the fixedImpacts_MM_DD_YYYY_HH_MM.csv file that is created might look like this: fixedImpacts_07_25_2011_03_43.csv.

Resetting the impact relationships

You can reset the impact relationships that have not been reset by the datamig utility. These impact relationships are contained in the discrepancy_MM_DD_YYYY_HH_MM.csv file.

  1. From the discrepancy_MM_DD_YYYY_HH_MM.csv file, find the impact relationships that you want to reset, and change the corresponding Overwrite column entry to Y.
  2. Save the file in CSV format.
  3. Run the following command:
    • (Windows) datamig.bat -resetI <filename>
    • (UNIX) datamig.sh -resetI <filename>
      For example, if you are using a UNIX computer and you want to reset the impact relationships listed in the discrepancy_MM_DD_YYYY_HH_MM.csv file, run the following command:
      datamig.sh -resetI discrepancy_MM_DD_YYYY_HH_MM.csv

Cleaning up the old best-practice impact normalization rules

After you run the datamig utility for all staging datasets, clean up the old best-practice impact normalization rules.

Note

If you run the datamig utility for all the staging datasets at one time, you can expect heavy CPU consumption. A safer approach is to run the datamig utility for one staging dataset at a time. In either case, you must run this utility for each staging dataset before you delete the old best-practice impact normalization rules.

  • To clean up the old best-practice impact normalization rules, run the following command:
    • (Windows) datamig.bat -clean
    • (UNIX) datamig.sh -clean

Writing data to the production dataset

After you clean up data in the staging datasets, you must write the cleaned-up data to the production dataset (such as BMC.ASSET). In BMC Atrium Core, only the Reconciliation Engine can write to the production dataset. To write the data, create a standard Identification and Merge job or run an existing Identification and Merge job. For more information, see Creating a standard reconciliation job.

Where to go from here

Configuring after installation
