
This topic describes how to integrate BMC Capacity Optimization with BMC PATROL and import custom Knowledge Modules (KMs). This is an “open” extractor tool. It can extract user-specified PATROL parameter values from a PATROL agent log file. Use this tool to load data collected by a PATROL Knowledge Module.

This topic contains the following sections:

  • Logging setup
  • Configuring DUMP_HIST formats
  • ETL configuration
  • Integration steps
  • Supported datasets
  • Related topics

Logging setup

The integration requires you to set up a regular dump of performance and custom KM metric values on the agent side, that is, on every monitored machine.

The dump is based on the DUMP_HIST command provided by BMC PATROL, which uses the following syntax:

dump_hist -host $HOST -p $PORT -nostat -format "format picture" -s $SINCE_DATE -e $TO_DATE

SINCE_DATE and TO_DATE are specified in the format mmddhhmmyyyy.

Running this command dumps to the console the performance metric values currently collected by the PATROL Agent.
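
For example, the following invocation dumps all values collected between December 1, 2007 00:00 and December 6, 2007 23:59, using the Format 1 picture described later in this topic (the host name and port are illustrative; verify the port that your PATROL Agent listens on):

dump_hist -host server01 -p 3181 -nostat -format "|%H|%A|%I|%P|%y-%m-%d %h:%M:%s|%v" -s 120100002007 -e 120623592007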

The DUMP_HIST command needs to be run at regular intervals to collect data. To do so, use the cron daemon or another scheduler (such as Task Scheduler on Windows) to run the DUMP_HIST command periodically.

The results of regular executions of the DUMP_HIST command must be appended to a text log file.
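
For example, on a UNIX-like agent host you could schedule a small wrapper script with cron. The following is a minimal sketch only; the installation path, agent port, log file, and state file locations are assumptions that you must adapt to your environment.

#!/bin/sh
# Dump the PATROL history collected since the previous run and append it to the
# text log file that the BMC Capacity Optimization ETL parses.
PATROL_HOME=/opt/bmc/Patrol3            # assumed PATROL installation directory
PORT=3181                               # assumed PATROL Agent port
LOG=/var/log/patrol/dump_hist.log       # text log file read by the ETL
STATE=/var/log/patrol/last_dump         # end timestamp of the previous run

TO_DATE=$(date +%m%d%H%M%Y)             # mmddhhmmyyyy, as required by dump_hist
# On the very first run (no state file yet), seed the state file by hand.
SINCE_DATE=$(cat "$STATE" 2>/dev/null || echo "$TO_DATE")

"$PATROL_HOME/bin/dump_hist" -host "$(hostname)" -p "$PORT" -nostat \
    -format "|%H|%A|%I|%P|%y-%m-%d %h:%M:%s|%v" \
    -s "$SINCE_DATE" -e "$TO_DATE" >> "$LOG"

echo "$TO_DATE" > "$STATE"

A matching crontab entry that runs the script every hour might look like this:

0 * * * * /opt/bmc/scripts/patrol_dump.sh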

Note

For more information about the DUMP_HIST command, see the BMC PATROL documentation. For more information about scheduling tasks, see your OS documentation.

Configuring DUMP_HIST formats

The ETL supports two formats, described below: Format 1 (default, recommended) and Format 2 (terse). You can configure the corresponding property in Advanced mode.

Format 1 (default, recommended)

This is the preferred way to integrate BMC PATROL with BMC Capacity Optimization. If you can provide a format picture to the DUMP_HIST command, BMC recommends that you use this format.

The format picture is:

"|%H|%A|%I|%P|%y-%m-%d %h:%M:%s|%v"

The output then looks like the following:

|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:00:49|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:02:24|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:04:00|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:05:35|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:07:10|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:08:45|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:10:20|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:11:55|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:13:30|0
|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:15:05|0
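
Reading the format picture against this output, the pipe-separated fields appear to map as follows. This mapping is inferred from the example above; see the BMC PATROL documentation for the authoritative description of the format picture tokens.

%H                   hostname               server01
%A                   application class      AS_EVENTSPRING
%I                   application instance   EVENTSPRING
%P                   parameter              RefreshParamSettings
%y-%m-%d %h:%M:%s    timestamp              2007-12-06 00:00:49
%v                   parameter value        0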

Format 2 (terse)

This is the default output of the DUMP_HIST command. If you cannot provide a format picture to the command, the agent dumps data in the following format:

server01/CPU.CPU/CPUCpuUtil
	Sat Nov 10 09:35:30 2007 4
	Sat Nov 10 09:36:21 2007 4
	Sat Nov 10 09:37:11 2007 2
	Sat Nov 10 09:38:01 2007 2
	Sat Nov 10 09:38:51 2007 1
	Sat Nov 10 09:39:41 2007 2

To be able to process this format, you must add the following configuration property to the ETL:

patrol.format.terse = true

ETL configuration

To configure the ETL, specify the number of metrics to recognize and the mapping between PATROL metrics and BMC Capacity Optimization metrics. Each metric definition must follow this syntax:

APPLICATION_CLASS;PARAMETER;BCO_METRIC;SCALE_FACTOR

For example, for the following row:

|server01|AS_EVENTSPRING|EVENTSPRING|RefreshParamSettings|2007-12-06 00:13:30|0

A sample configuration, which maps the parameter to the BMC Capacity Optimization metric TOTAL_EVENTS (business driver, WKLDAT dataset), is:

AS_EVENTSPRING;RefreshParamSettings;TOTAL_EVENTS;1

Or, for the following row:

server01/CPU.CPU0/CPUCpuUtil

A sample configuration, which maps the parameter to the BMC Capacity Optimization metric BYCPU_CPU_UTIL (system metric, SYSDAT dataset), is:

CPU;CPUCpuUtil;BYCPU_CPU_UTIL;0.01
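
If the same log file contains both of the parameters shown above, the metric configuration simply lists one definition per metric. The following sketch combines the two sample mappings; note that the SCALE_FACTOR multiplies the raw PATROL value before loading, so 0.01 presumably converts a 0-100 percentage into the 0-1 utilization expected by BMC Capacity Optimization.

AS_EVENTSPRING;RefreshParamSettings;TOTAL_EVENTS;1
CPU;CPUCpuUtil;BYCPU_CPU_UTIL;0.01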

The ETL can aggregate data at a specific resolution (by default, at the hour level). If you want to aggregate data at a different resolution, configure the Aggregation resolution property (listed in the Advanced properties table below).

Integration steps

To integrate BMC Capacity Optimization with the PATROL log parser, perform the following steps:

  1. Navigate to Administration > ETL & SYSTEM TASKS > ETL tasks.
  2. In the ETL tasks page, click Add > Add ETL under the Last run tab.
  3. In the Add ETL page, set values for properties listed in the tabs below:
    1. Quick configuration: Use the procedure in this tab to quickly configure your ETL by setting only the essential properties required to schedule or run an ETL.
    2. Basic properties: In most cases, you need to set only these properties for the ETL.
    3. Advanced properties: Click Advanced in the Add ETL page to view them. These properties are for advanced users and scenarios only, and usually do not require any modification.

    The following table lists and describes values for properties that you need to select to perform a Quick configuration of this ETL.

    Property Value
    Run configuration
    ETL module
    (BMC recommends that you make this selection first)
    Select BMC - PATROL Agent log parser for custom KMs.
    ETL task name Default name is already filled out for you.
    Run configuration name Default name is already filled out for you.
    Environment Select Production.
    Description (Optional) Enter a brief description.
    Log level Select 1 - Light.
    Execute in simulation mode Select Yes. The ETL will not store actual data into the data warehouse. This option is useful while testing a new ETL task.
    Datasets
    1. Click Edit.
    2. Select one (click) or more (shift+click) datasets that you want to include from Available datasets and click >> to move them to Selected datasets.
    3. Click Apply.
    Patrol log format Select Standard format.
    Lookup sharing
    Sharing status Select SHARED.
    Object relationships
    After import

    Select leave all new entities in 'Newly Discovered'.

    Patrol metric configuration
    (Number of metrics) Select the number of metrics to recognize.
    Metric 1
    Note: This property is activated only after you select the number of metrics.

    Specify the input for the selected metrics.

    File location
    File location Select how you want to retrieve the CSV file:
    • Local directory: Specify a path on your local machine where the CSV file resides.
    • Windows share: Specify the Windows share path where the CSV file resides.
    • FTP: Specify the FTP path where the CSV file resides.
    • SCP: Specify the SCP path where the CSV file resides.
    • SFTP: Specify the SFTP path where the CSV file resides.
    Directory Path of the directory that contains the CSV file.
    Directory UNC Full Path (Windows share) The full UNC (Universal Naming Convention) address. For example: //hostname/sharedfolder
    Files to copy (with wildcards) Before parsing, the SFTP and SCP commands need to make a local temporary copy of the files; this setting specifies which files in the remote directory should be imported.
    File list pattern A regular expression that defines which data files should be read. The default value is (?<!done)$, which tells the ETL to read every file whose name does not end with the string "done"; for example, a file named my_file_source.done is skipped.
    Recurse into subdirs? Select Yes.
    After parse operation Select Append suffix to parsed file.
    Parsed files suffix Type .done.
    Remote host (Applies to FTP, SFTP, SCP) Enter the name or address of the remote host to connect to.
    Username (Applies to Windows share, FTP, SFTP, SCP) Enter the username to connect to the file location server.
    Password required (Applies to Windows share, FTP, SFTP, SCP) Select No.
    ETL task properties
    Task group Select a task group to classify this ETL into.
    Running on scheduler Select the scheduler you want to run the ETL on.
    Maximum execution time before warning The number of hours, minutes, or days that the ETL can run before a warning is generated.
    Frequency Select Predefined.
    Start timestamp: hour\minute (Applies to Predefined frequency) The HH:MM start timestamp to add to the ETL execution running on a Predefined frequency.
    Custom start timestamp Select a YYYY-MM-DD HH:MM timestamp to add to the ETL execution running on a Custom frequency.
    Basic properties

    The following table lists and describes values for the Basic properties of this ETL.

    Property Value
    Run configuration
    ETL module
    (BMC recommends that you make this selection first)
    Select BMC - PATROL Agent log parser for custom KMs.
    ETL task name Default name is already filled out for you.
    Run configuration name Default name is already filled out for you.
    Environment Select Production.
    Description (Optional) Enter a brief description.
    Log level Select how detailed you want the log to be:
    • 1 - Light: Add bare minimum activity logs to the log file.
    • 5 - Medium: Add medium-detailed activity logs to the log file.
    • 10 - Verbose: Add detailed activity logs to the log file.
    Execute in simulation mode Select Yes.
    When set to Yes, the ETL will not store actual data into the data warehouse. This option is useful while testing a new ETL task.
    Module description A link that points you to technical documentation for this ETL.
    Datasets
    1. Click Edit.
    2. Select one (click) or more (shift+click) datasets that you want to include from Available datasets and click >> to move them to Selected datasets.
    3. Click Apply.
    Patrol log format Select any one:
    • Standard format
    • Terse format
    Lookup sharing
    Sharing status Select any one:
    • PRIVATE: Select this option if this is the only ETL that extracts data from the given set of resources and the lookup table is not shared with the specified ETL task.
    • SHARED: Select this option if more than one ETL extracts data from the given set of resources and the lookup table is shared with the specified ETL task.
    Object relationships
    After import

    Specify the domain where you want to add the entities created by the ETL. You can select an existing domain or create a new one.

    Select any one of the following options:

    • leave all new entities in 'Newly Discovered'.
    • move all new entities in a new Domain.
      • New domain: Create a new domain. Specify the following properties under New domain:
        • Parent: Select a parent domain for your new domain from the domain selector control.
        • Name: Specify a name for your new domain.
    • move all new entities in an existing Domain 
      • Domain: Select an existing domain from the domain selector control.
    Patrol metric configuration
    (Number of metrics) Select the number of metrics to recognize.
    Metric 1 Note: This property is activated only after you select the number of metrics.

    Specify the input for the selected metrics.

    File location
    File location Select any one of the following methods to retrieve the CSV file:
    • Local directory: Specify a path on your local machine where the CSV file resides.
    • Windows share: Specify the Windows share path where the CSV file resides.
    • FTP: Specify the FTP path where the CSV file resides.
    • SCP: Specify the SCP path where the CSV file resides.
    • SFTP: Specify the SFTP path where the CSV file resides.
    Directory Path of the directory that contains the CSV file.
    Directory UNC Full Path (Windows share) The full UNC (Universal Naming Convention) address. For example: //hostname/sharedfolder
    Files to copy (with wildcards) Before parsing, the SFTP and SCP commands need to make a local temporary copy of the files; this setting specifies which files in the remote directory should be imported.
    File list pattern A regular expression that defines which data files should be read. The default value is (?<!done)$, which tells the ETL to read every file whose name does not end with the string "done"; for example, a file named my_file_source.done is skipped.
    Recurse into subdirs? Select Yes or No. When set to Yes, BMC Capacity Optimization also inspects the subdirectories of the target directories.
    After parse operation Choose what to do after the CSV file has been imported. The available options are:
    • Do nothing: Do nothing after import.
    • Append suffix to parsed file: Append a suffix you add here to the imported CSV file. For example, _done or _imported.
    • Archive parsed file in directory: Archive the parsed file in the specified directory.
      • Archive directory (local): Default archive directory path is filled out for you. For example, %BASE/../repository/imprepository
      • Compress archived files: Select Yes or No.
    • Archive bad files in directory: Archive erroneous files in the specified directory.
      • Archive directory (local): Default archive directory path is filled out for you. For example, %BASE/../repository/imprepository
      • Compress archived files: Select Yes or No.
    Parsed files suffix The suffix that is appended to parsed files; the default is .done.
    Remote host (Applies to FTP, SFTP, SCP) Enter the name or address of the remote host to connect to.
    Username (Applies to Windows share, FTP, SFTP, SCP) Enter the username to connect to the file location server.
    Password required (Applies to Windows share, FTP, SFTP, SCP) Select Yes or No.
    Password (Applies to Windows share, FTP, SFTP, SCP) Enter a password to connect to the file location server. Applicable if you selected Yes for Password required.
    ETL task properties
    Task group Select a task group to classify this ETL into.
    Running on scheduler Select the scheduler you want to run the ETL on.
    Maximum execution time before warning The number of hours, minutes, or days that the ETL can run before a warning is generated.
    Frequency Select the frequency of ETL execution. Available options are:
    • Predefined: Select a Predefined frequency from Each Day, Each Week or Each Month.
    • Custom: Enter a Custom frequency (time interval) as the number of minutes, hours, days or weeks to run the ETL in.
    Start timestamp: hour\minute (Applies to Predefined frequency) The HH:MM start timestamp to add to the ETL execution running on a Predefined frequency.
    Custom start timestamp Select a YYYY-MM-DD HH:MM timestamp to add to the ETL execution running on a Custom frequency.

     

     

    Advanced properties

    The following table lists and describes values for the Advanced properties of this ETL. To view these properties, click Advanced in the Add ETL page.

    Property Value
    Run configuration
    Aggregation resolution Default value is filled out for you.
    Use enterprise grouping Select any one of the following options:
    • Yes: Select any one of the following options:
      • Populate enterprise grouping: Populate the selected grouping object. For more information, see Grouping objects.
      • Use subobject for enterprise grouping: Use subObject to populate the selected grouping object.
    • No: Do not populate any enterprise grouping.
    Metric filter
    Metric list <for selected dataset>

    Note: This property is activated only after you select Datasets under the Run configuration tab.
    1. Click Edit.
    2. Select one (click) or more (shift+click) metrics that you want to include from Available items and click >> to move them to Selected items.
    3. Click Apply.
    Subdirectories to exclude (separated by ';' ) (Local directory) Names of subdirectories to exclude from parsing.
    Input file external validator (Local directory, Windows share, FTP) Select any one of the following options:
    • No external validation: Do not use external validation of the CSV file structure.
    • Use external validation script: Use the following script to validate the CSV file:
      • Script to execute: Specify the validation script to use to validate the input file.
    Additional properties
    List of properties
    1. Click Add.
    2. Add an additional property in the etl.additional.prop.n box.
    3. Click Apply.
      Repeat this task to add more properties.
    Loader configuration
    Empty dataset behavior Choose one of the following actions if the loader encounters an empty dataset:
    • Abort: Abort the loader.
    • Ignore: Ignore the empty dataset and continue parsing.
    ETL log file name Name of the file that contains the ETL execution log; the default value is: %BASE/log/%AYEAR%AMONTH%ADAY%AHOUR%MINUTE%TASKID
    Maximum number of rows for CSV output A number which limits the size of the output files.
    CSV loader output file name Name of the file generated by the CSV loader; the default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID.
    BCO loader output file name Name of the file generated by the BMC Capacity Optimization loader; the default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID.
    Detail mode Select the level of detail:
    • Standard: Data will be stored on the database in different tables at the following time granularities: Detail (configurable, by default: 5 minutes), Hourly, Daily, Monthly.
    • Raw also: Data will be stored on the database in different tables at the following time granularities: Raw (as available from the original data source), Detail (configurable, by default: 5 minutes), Hourly, Daily, Monthly.
    • Raw only: Data will be stored on the database in a table only at Raw granularity (as available from the original data source).

    For more information on granularities, see Accessing data using public views and Sizing and scalability considerations.

    Reduce priority
    • Normal:
    • High:
    Remove domain suffix from datasource name (Only for systems) If set to True, the domain name is removed from the data source name. For example, server.domain.com will be saved as server.
    Leave domain suffix to system name (Only for systems) If set to True, the domain name is maintained in the system name. For example: server.domain.com will be saved as such.
    Update grouping object definition If set to True, the ETL will be allowed to update the grouping object definition for a metric loaded by an ETL.
    Skip entity creation (Only for ETL tasks sharing lookup with other tasks) If set to True, this ETL does not create an entity, and discards data from its data source for entities not found in BMC Capacity Optimization. It uses one of the other ETLs that share lookup to create the new entity.
    Scheduling options
    Hour mask Specify a value to execute the task only during particular hours within the day. For example, 0 – 23 or 1,3,5 – 12.
    Day of week mask Select the days so that the task can be executed only during the selected days of the week. To avoid setting this filter, do not select any option for this field.
    Day of month mask Specify a value to execute the task only during particular days within a month. For example, 5, 9, 18, 27 – 31.
    Apply mask validation By default this property is set to True. Set it to False if you want to disable the preceding Scheduling options that you specified. Setting it to False is useful if you want to temporarily turn off the mask validation without removing any values.
    Execute after time Specify a value in the hours:minutes format (for example, 05:00 or 16:00) to wait before the task must be executed. This means that once the task is scheduled, the task execution starts only after the specified time passes.
    Enqueueable Select one of the following options:
    • False (Default): If a new execution command arrives while the task is already running, it is ignored.
    • True: If a new execution command arrives while the task is already running, it is placed in a queue and executed as soon as the current execution ends.

Supported datasets

This ETL supports the vertical datasets SYSDAT and WKLDAT. Metrics are selected while configuring the ETL's properties. Only one dataset at a time (per run) is supported.

Related topics

Using datasets

Developing custom ETLs

Dataset reference

Horizontal and Vertical datasets

Datasets and metrics