Migrating Oracle data



If you are using TrueSight Capacity Optimization with Oracle as the database, use the instructions in this section to migrate your data from TrueSight Capacity Optimization to BMC Helix Capacity Optimization. BMC provides an out-of-the-box migration toolkit for this purpose. The toolkit includes the Customer Migration Tool (CMT), which consists of the CMT Engine, the CMT Shell, and the out-of-the-box files and scripts required for migration.

Before you start, review the key considerations. For details, see Migrating-from-TrueSight-Capacity-Optimization.

Key CMT concepts:
  • Discovery: Command used to discover the database schema. The output of this command is referred to as a version. Discovered versions are saved to files so that they can be loaded later as required. Discovery is performed for both the TrueSight and Helix databases, and the resulting versions are used for mapping.
  • Mapping: Defines the mapping between the source (TrueSight) and target (Helix) databases by using the discovered versions. A mapping is saved to a file so that it can be loaded later as required. It is the main input for the export process.
  • Timestamp list file: Available as an out-of-the-box file. It contains information about the timestamp columns in various tables of the Capacity Optimization schema. This file is loaded into memory, and the output ID is used as a reference while creating a mapping.

Prerequisites

Before you migrate data, ensure that the following requirements are met:

  • You have administrator privileges for accessing TrueSight Capacity Optimization.
  • TrueSight Capacity Optimization is upgraded to version 11.3.01 or later. To migrate data from a version earlier than 11.3.01, you must first upgrade to 11.3.01 and then migrate data to BMC Helix Capacity Optimization. 
  • TrueSight Capacity Optimization is using Oracle 11.2 or later.

To migrate Oracle data

The Oracle database migration activity involves the following major steps:

  1. Activating BMC Helix Capacity Optimization and downloading the Oracle migration toolkit
  2. Preparing your environment and files for migration
  3. Exporting and importing partial or full data
  4. (Optional) Exporting and importing delta data
  5. (Optional) Exporting and importing historical data

1. Activate BMC Helix Capacity Optimization and download the Oracle migration toolkit

  1. Register for BMC Helix Capacity Optimization and activate the product. For details, see Getting-started.
  2. Review the welcome email that includes login credentials and URLs to use the product.
  3. Download the migration toolkit from ftp://<customerprefix>-opt-ftp.onbmc.com/FromOnDemand/optimize-onprem.
  4. Create a directory and copy the downloaded file to the directory.
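The download, copy, and extraction steps can be sketched as follows. This is a minimal demo under /tmp; all paths and the archive file name are examples, not the actual toolkit file name:

```shell
# Demo under /tmp: a placeholder archive stands in for the downloaded toolkit file
mkdir -p /tmp/cmt-demo/source/toolkit/cmt-engine/bin
touch /tmp/cmt-demo/source/toolkit/cmt-engine/bin/cmt-service.sh
tar -czf /tmp/cmt-demo/migration-toolkit.tar.gz -C /tmp/cmt-demo/source toolkit

# Create the working directory and extract the toolkit archive into it
mkdir -p /tmp/cmt-demo/work
tar -xzf /tmp/cmt-demo/migration-toolkit.tar.gz -C /tmp/cmt-demo/work

# The extracted engine scripts are now available
ls /tmp/cmt-demo/work/toolkit/cmt-engine/bin
```

In the real procedure, you would copy the archive you downloaded from the FTP location into your working directory and extract it there.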

2. Prepare your TrueSight Capacity Optimization environment and files for migration

  1. Copy the migration toolkit to a server on which you want to run the CMT shell to export data.
    Note: Based on the database and repository size, ensure that sufficient space is available at the export dump location.
  2. Log in to the server on which you copied the toolkit file and extract the contents of the file.
  3. Connect to the Oracle Database Server as the BCO_OWN user.
  4. Obtain the schema version array file (SCHEMA_VERSION_ARRAY).
    1. Navigate to the directory where you extracted the toolkit files and open the <toolkit_folder>/ho-data-migration/database/Oracle folder.
    2. Run the following SQL*Plus command to set LONG so that the output is not truncated.

      SQL> set long 5000000

    3. Run the following script to obtain the schema version array:

      SQL> @/path/schema_version.sql
      Sample output
      ddl_1000;ddl_1002;ddl_1003;ddl_1004;ddl_1006;ddl_1007;ddl_1008;ddl_1009;ddl_1011

      Keep this schema version array file; you will share it with the BMC team.

  5. Start the CMT Engine.
    1. Navigate to the <toolkit_folder>/cmt-engine/bin folder.
    2. Run the following command: 

      $ ./cmt-service.sh start
  6. Run the CMT shell.
    1. Navigate to the <toolkit_folder>/cmt-shell folder. 
    2. Run the following command: 

      $ ./cmt-shell.sh

      The CMT shell prompt is displayed.

    3. At the CMT shell prompt, run the following command to get help on different CMT options: 

      cmt-shell:>help


    4. Run the following command to check the default location of the cmt-engine and the default location for storing the export dump:

      cmt-shell:>cfg-show-config
  7. (Optional) The default location for storing the export dump is /opt/export_data. To change it, run the following command:

    cmt-shell:>cfg-update-file-location <path to store export dump>
  8. Run the respective commands to perform the following tasks for schema tables discovery:
    1. Create an Oracle database connection:

      cmt-shell:>dbsystem-create --name <name to identify the DB connection>

      Provide the Oracle database details at each prompt.

    2. Obtain the database ID of the database connection that you created in the previous step: 

      cmt-shell:>dbsystem-get-all
    3. Verify the connectivity to the database: 

      cmt-shell:>activity-connectivity-test  <dbsystem_ID>
    4. Discover the schema tables from the database: 

      cmt-shell:>job-create-discovery --dbsystem-id <dbsystem_ID>
    5. Check the discovered version list and save the required version to a file. Keep this discovered version file; you will share it with the BMC team.

      cmt-shell:>version-get-all   
      cmt-shell:>version-save --file-name <filename> --new-version-name <new_version_name> --version-id <version_ID>
  9. (Only if you have buffer tables) Create a list of the buffer tables that need to be created in the BMC Helix Capacity Optimization database. Run the respective commands to perform the following tasks:
    1. Obtain a list of tables from the version discovered in the previous step:  

      cmt-shell:>version-get-tables --id <version_ID>

      The command output displays the table IDs for buffer tables. Note down these table IDs.

    2. Create a new table list: 

      cmt-shell:>table-list-create --table-list-name <name of the table list>
    3. Check the ID of the newly created table list:

      cmt-shell:>table-list-get-all
    4. Add the table IDs that you had noted down for buffer tables to the newly created table list: 

      cmt-shell:>table-list-add-table --table-id <table-id> --table-list-id <table-list-id>
    5. Verify that all required tables are added to the table list: 

      cmt-shell:>table-list-get-tables --table-list-id <table-list-id>
    6. Save the updated table list to a file. Keep this table list file; you will share it with the BMC team.

      cmt-shell:>table-list-save --file-name <name for the table list file> --new-table-list-name <new list name> --table-list-id <table-list-id>
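Step 1 of this procedure notes that sufficient space must be available at the export dump location. A minimal pre-flight check, assuming the default /opt/export_data location mentioned later in this guide (the command falls back to the root filesystem if that path does not exist on the machine you run it on):

```shell
# Report free space at the export dump location; fall back to / if the
# default path does not exist on this machine
df -h /opt/export_data 2>/dev/null || df -h /
```

Compare the reported free space against the size of your TrueSight database and repository before starting the export.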

3. Share the mapping files with the BMC team

  1. Transfer the following files to the BMC team:

    • Schema version array
    • Discovered version file
    • (Optional) Buffer table list file

    The BMC team will create a mapping between TrueSight and BMC Helix databases.

  2. After you receive the mapping files from the BMC team, download all files to the cmt-shell folder. Also, ensure that the out-of-the-box timestamp list file is copied to the same folder.

4. Export full or partial data from TrueSight Capacity Optimization

You can export either full or partial data by using the From timestamp that is provided as an input parameter. 

  1. Stop all TrueSight Capacity Optimization services and ETL data collection. For details on stopping the services, see System-level administration overview.
  2. Log in to the server on which the CMT shell is running.
  3. At the CMT shell prompt, run the respective commands to perform the following tasks:
    1. Load the data mapping file (corresponding to the full or partial data export) that you received from the BMC team: 

      cmt-shell:> mapping-load --name <mapping name> --file-name <mapping file name>
    2. Obtain the ID for the loaded mapping file. Note down this ID because you need to provide it while creating a database export.

      cmt-shell:> mapping-get-all

    3. Load the out-of-the-box timestamp list: 

      cmt-shell:>timestamp-list-load --file-name <timestamp list file>
    4. Create the database export:

      cmt-shell:> job-create-export --name <export job name> --db-system-id <dbsystem-ID> --mapping-id <ID for the loaded mapping file>

      The CMT shell prompts you to enter the timestamp from which you want to export the data, in the supported date format. For example, to export data from 15 January 2020, enter the date as '15-JAN-2020:00:00:00'.
      To take a full export, enter a timestamp starting from when data first became available. You can also choose to export only recent data now and recover the historical data later. However, BMC recommends that you export the full data.

      Note: After the database export completes, note down the input timestamp (From Time) and the output MAX timestamp (To Time) displayed in the export job status, because you need to provide them while creating a delta or historical export. To view the export job status, run the following commands:

      cmt-shell:> job-get-all
      cmt-shell:> job-get --id <export job id>
      cmt-shell:> job-get-export-tasks --id <export job id>
      cmt-shell:> job-get-export-tasks-by-status --id <export job id> --status <FAILED/COMPLETE/IN_PROGRESS>

      After the export job is complete, you can also save the job task logs to a CSV file by running the following command:

      cmt-shell:> job-save-tasks --file-name testExport.csv --id <export job id>
    5. Create a tar image of the dump folder generated in the export_data folder. 

  4. Create a dump for repository content and encryption keys.

    1. Copy the migration toolkit to the host computer on which the Application Server is running and log in to it.
    2. At the shell prompt, change to the directory where you extracted the toolkit files and navigate to the repository folder.
    3. Run the following command: 

      ./repository_and_keys_backup.sh -c <app server installation directory> -r <CO repository directory> -l <path to store dump>

      The following dump files are generated at the specified path:


      • repository_backup.tar.gz
      • certificates_backup.tar.gz
  5. Log in to the server on which the CMT shell is running. At the CMT shell prompt, perform a sequence discovery for the Oracle Database server.
    1. Obtain a list of sequences and their last values:

      cmt-shell:>sequences-discover -DBSystemId <dbsystem ID>
    2. Obtain the ID of the sequence discovery: 

      cmt-shell:>sequences-discovery-get-all
  6. Save the sequence discovery to a file. Keep this sequence discovery file; you will share it with the BMC team.

    cmt-shell:>sequences-save --file-name <file name for sequence> -SequenceDiscoveryId <sequence discovery id>
  7. Restart all TrueSight Capacity Optimization services and ETL data collection.
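The export prompt in this procedure expects timestamps such as '15-JAN-2020:00:00:00', that is, day-month-year followed by the time, with an uppercase English month abbreviation. A quick way to print the current time in that shape, assuming GNU date and the C locale:

```shell
# Print the current time in the DD-MON-YYYY:HH:MM:SS form used by the
# CMT export prompt, with the month abbreviation in uppercase
LC_ALL=C date '+%d-%b-%Y:%H:%M:%S' | tr '[:lower:]' '[:upper:]'
```

Remember to quote the value when you paste it at the [ts] prompt, as shown in the examples in this guide.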

5. Share the database export with the BMC team

Transfer the following export files to the BMC team:

  • Compressed dump folder of the exported data
  • Repository backup file
  • Certificates backup file
  • Sequence discovery file

6. Import the database in BMC Helix Capacity Optimization

The BMC team will perform the database import.

After you get the confirmation from the BMC team about successful data import, validate the imported data, and proceed to the next step.

(Optional) 7. Export delta data from TrueSight Capacity Optimization

Delta data is any data change that occurs after the data export from TrueSight Capacity Optimization and its import into BMC Helix Capacity Optimization. Delta data recovery is required only when there is a considerable time delay between the data export and import, or when historical data extraction from the data sources through ETLs is not possible. You need to take the export starting from the MAX timestamp noted during the full or partial export up to the current timestamp. 

  1. Stop all TrueSight Capacity Optimization services and ETL data collection. For details on stopping the services, see System-level administration overview.
  2. Log in to the Oracle Database Server.
  3. At the cmt shell prompt, run the respective commands to perform the following tasks:
    1. Load the data mapping file (corresponding to the delta data export) that you received from the BMC team: 

      cmt-shell:>mapping-load --file-name <delta mapping file name>
    2. Obtain the ID for the loaded mapping file. Note down this ID as you need to provide it while creating a delta data export. 

      cmt-shell:>mapping-get-all
    3. Load the out-of-the-box timestamp list:

      cmt-shell:>timestamp-list-load --file-name <timestamp list file>
    4. Create the delta database export: 

      cmt-shell:>job-create-export --name <export job name> --db-system-id <dbsystem-ID> --mapping-id <ID for the loaded mapping file>
      Please Enter Value for parameter : [ts] : <Provide the MAX timestamp noted during the full or partial export>
      Example
      cmt-shell:>job-create-export  --name qa20delta-25sept1 --db-system-id 1 --mapping-id 102
      Please Enter Value for parameter : [ts] : '22-SEP-2020:12:00:00'
      Job 103 Successfully submitted
    5. Create a tar image of the delta dump folder generated in the export_data folder. 
  4. Restart all TrueSight Capacity Optimization services and ETL data collection.
  5. Share the compressed delta dump folder with the BMC team. 
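The "create a tar image" steps in this and the previous procedure can be sketched as follows. This is a demo under /tmp; the folder names are examples standing in for the dump folder that the export job generates under export_data:

```shell
# Demo under /tmp: stand-in for the dump folder generated by the export job
mkdir -p /tmp/export_data/delta_dump
echo "rows" > /tmp/export_data/delta_dump/table1.dat

# Package the dump folder into a single compressed tar image for transfer
tar -czf /tmp/export_data/delta_dump.tar.gz -C /tmp/export_data delta_dump

# List the archive contents to verify the image before sharing it
tar -tzf /tmp/export_data/delta_dump.tar.gz
```

Using -C keeps the archive paths relative to the dump folder, so the BMC team can extract it anywhere.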

8. Configure users and roles in BMC Helix Capacity Optimization

Some default user groups and roles are available out of the box when you subscribe to BMC Helix Capacity Optimization. You can start using the product with these users. Additionally, you can configure your identity provider to set up more users and user groups, and assign specific permissions to them. BMC Helix Capacity Optimization leverages Helix Single Sign-On to provide single sign-on authentication for users. For details, see Managing users and access control.

9. Download and install Remote ETL Engines in BMC Helix Capacity Optimization

Download and install the Remote ETL Engine and related components that are needed to collect data from your environment. For details, see Installing-remote-components-to-collect-on-premises-data.

10. Validate data in BMC Helix Capacity Optimization

After you get the confirmation from the BMC team about successful data import, validate the imported data.

(Optional) 11. Export historical data from TrueSight Capacity Optimization

Historical data recovery is required only when partial data is exported from TrueSight Capacity Optimization. You need to take the export starting from the availability of data until the time the partial data was exported (the From timestamp). For example, if you have data from 1 January 2019 and you took a partial export on 15 January 2020, you can recover the data between these dates.

  1. Log in to the Oracle Database Server.
  2. At the cmt shell prompt, run the respective commands to perform the following tasks:
    1. Load the data mapping file (corresponding to the historical data export) that you received from the BMC team: 

      cmt-shell:>mapping-load --file-name <historical mapping file name>
    2. Obtain the ID for the loaded mapping file. Note down this ID as you need to provide it while creating a historical data export. 

      cmt-shell:>mapping-get-all
    3. Load the out-of-the-box timestamp list:

      cmt-shell:>timestamp-list-load --file-name <timestamp list file>
    4. Create the historical database export: 

      cmt-shell:> job-create-export --name <export job name> --db-system-id <dbsystem-ID> --mapping-id <ID for the loaded mapping file>
      Please Enter Value for parameter : [ts] : <Provide the FROM timestamp noted during the partial export>
      Example
      cmt-shell:>job-create-export  --name qa20delta-25sept1 --db-system-id 1 --mapping-id 102
      Please Enter Value for parameter : [ts] : '15-JAN-2020:12:00:00'
      Job 103 Successfully submitted
    5. Create a tar image of the historical dump folder generated in the export_data folder. 
  3. Share the compressed historical dump folder with the BMC team. 

The BMC team will import the historical data.

After you get the confirmation from the BMC team about successful data import, validate the imported data.


 
