This documentation supports releases of BMC Helix Continuous Optimization up to December 31, 2021. To view the latest version, select the version from the Product version menu.

Generic - General Text Log Parser

This topic describes how to use the General Text Log Parser ETL. Some important properties of this parser are:

  • Input: It accepts more flexible input formats than the CSV file parsers, which require data in a specific CSV format.
  • Regular expressions: You must specify the regular expressions that the parser uses to read and interpret the file format.
  • Additional features: It can count rows that match a regular expression.
  • Supported dataset: WKLDAT (vertical format).
  • Aggregation (see the example after this list)
    • To aggregate parsing results using one of the available statistics, use the following property:

      extract.parser.wkldregex.[num].aggrtype = [sum | count | max | min | average]
    • Aggregation field: Use the following property:

      extract.parser.wkldregex.[num].value = %GROUPn

      where [num] is the definition number.
  • lastcounter parameter: The lastcounter parameter cannot be used with the General Text Log Parser. The parser parses all files that match the configured pattern, whereas extractors extract data from a database and use the lastcounter parameter to record the last sample extracted.
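
For example, suppose each line of the input log records one completed transaction together with its response time. The following is a minimal sketch with a purely hypothetical log format and regular expression, shown only to illustrate how the parser works; it is not a shipped default:

  Sample log line:

    2021-11-03 10:15:27 host01 transaction=checkout status=OK elapsed_ms=412

  Illustrative business driver pattern:

    transaction=checkout .*elapsed_ms=(\d+)

Every line that matches the pattern is counted, and the first capture group can be referenced as %GROUP1 in the aggregation property shown above (for example, to sum or average the captured response times).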

Integration steps

To integrate BMC Helix Continuous Optimization with the general text log parser:

  1. Navigate to Administration > ETL & SYSTEM TASKS > ETL tasks.
  2. In the ETL tasks page, click Add > Add ETL under the Last run tab.
  3. In the Add ETL page, set values for the following properties under each expandable tab.

    Note

    Basic properties are displayed by default in the Add ETL page. These are the most common properties that you can set for an ETL, and it is acceptable to leave the default selections for each as is.

    Basic properties

    Property Description
    Run configuration
    ETL module Select Generic - General text log parser.
    ETL task name Default name is already filled out for you.
    Run configuration name Default name is already filled out for you.
    Deploy status Select Production.
    Description (Optional) Enter a brief description.
    Log level Select how detailed you want the log to be:
    • 1 - Light: Add bare minimum activity logs to the log file.
    • 5 - Medium: Add medium-detailed activity logs to the log file.
    • 10 - Verbose: Add detailed activity logs to the log file.
    Execute in simulation mode Select Yes if you want to validate the connectivity between the ETL engine and the target, and to ensure that the ETL does not have any other configuration issues. This option is useful while testing a new ETL task.
    Module selection

    Ensure that the Based on datasource option is selected.

    Note

    If you select Based on Open ETL template, BMC Helix Continuous Optimization is integrated with a Generic extractor based on the selected Open ETL template. For more information, see Generic ETL based on a template.

    Module description A link that points you to technical documentation for this ETL.
    Datasets
    1. Click Edit.
    2. Select one (click) or more (shift+click) datasets that you want to include from Available datasets and click >> to move them to Selected datasets.
    3. Click Apply.

      Additional information

      Because of its nature, the General Text Log Parser uses the WKLDAT generic workload dataset.

    Entity catalog
    Sharing status Select any one:
    • Shared entity catalog: Select this option if data for the same entities comes from multiple sources (for example, the BPA ETL).
      • Sharing with Entity Catalog: Select an entity catalog from the drop-down list.
    • Private entity catalog: Select this option if data for the same entity comes from a single source.
    Object relationships
    After import

    Specify the domain to which you want to add the entities created by the ETL.

    Select one of the following options:

      • New domain: This option is selected by default. Select a parent domain, and specify a name for your new domain.
      • Existing domain: Select an existing domain from the Domain list.

    By default, a new domain with the same name as the ETL is created for each ETL. 

    Log line counter
    Find TS in the whole line Select any one:
    • Yes: Search the entire log line for the timestamp (TS).
    • No: Do not search the entire line for the timestamp.
    Number of business driver definitions

    Select the number of business driver definitions.

    Business driver 1 name Specify the name of business driver 1.
    Business driver 1 pattern Specify the regular expression pattern for business driver 1.
    Business driver 1 resource (object) name Specify the resource (object) name for business driver 1. Default is TOTAL_EVENTS.
    Business driver 1 subresource (subobject) name Specify the subresource (subobject) name for business driver 1. Default is GLOBAL.
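
      Additional information

      As an illustration only (the values here are hypothetical, not product defaults), the business driver fields above could be completed as follows to count checkout transactions in a log format such as the sample shown at the beginning of this topic:

      Business driver 1 name: Checkout transactions
      Business driver 1 pattern: transaction=checkout .*elapsed_ms=(\d+)
      Business driver 1 resource (object) name: TOTAL_EVENTS
      Business driver 1 subresource (subobject) name: GLOBAL
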
    File location
    File location

    Select any one of the following methods to retrieve the CSV file:

    • Local directory: Specify a path on your local machine where the CSV file resides.
    • Windows share: Specify the Windows share path where the CSV file resides.
      Note: The Windows share option uses sudo mount to mount the file system on the BMC Helix Continuous Optimization server. Only sudo is supported as the su-type command alternative; sesu is not supported. Also, the cpit user ID must be able to use sudo to gain root privileges because mount requires root access.

    • FTP: Specify the FTP path where the CSV file resides.
    • SCP: Specify the SCP path where the CSV file resides.
    • SFTP: Specify the SFTP path where the CSV file resides.
    Directory Path of the directory that contains the CSV file.
    Directory UNC Full Path (Windows share) The full UNC (Universal Naming Convention) address. For example: //hostname/sharedfolder
    Files to copy (with wildcards) Before parsing, the SFTP and SCP commands need to make a local temporary copy of the files; this setting specifies which files in the remote directory should be imported.
    File list pattern A regular expression that defines which data files should be read. The default value is (?<!done)$, which tells the ETL to read every file whose name does not end with the string "done". For example, a file named my_file_source.done is skipped.
    Recurse into subdirs?

    Select Yes or No. When set to Yes, BMC Helix Continuous Optimization also inspects the subdirectories of the target directory.

    After parse operation Choose what to do after the CSV file has been imported. The available options are:
    • Do nothing: Do nothing after import.
    • Append suffix to parsed file: Append a suffix that you specify here to the imported CSV file, for example, _done or _imported.
    • Archive parsed file in directory: Archive the parsed file in the specified directory.
      • Archive directory (local): Default archive directory path is filled out for you. For example, %BASE/../repository/imprepository
      • Compress archived files: Select Yes or No.
    • Archive bad files in directory: Archive erroneous files in the specified directory.
      • Archive directory (local): Default archive directory path is filled out for you. For example, %BASE/../repository/imprepository
      • Compress archived files: Select Yes or No.
    Parsed files suffix The suffix that is appended to parsed files; the default is .done.
    Remote host (Applies to FTP, SFTP, SCP) Enter the name or address of the remote host to connect to.
    Username (Applies to Windows share, FTP, SFTP, SCP) Enter the username to connect to the file location server.
    Password required (Applies to Windows share, FTP, SFTP, SCP) Select Yes or No.
    Password (Applies to Windows share, FTP, SFTP, SCP) Enter a password to connect to the file location server. Applicable if you selected Yes for Password required.
    ETL task properties
    Task group Select a task group to classify this ETL into.
    Running on scheduler Select a scheduler for running the ETL. For cloud ETLs, use the scheduler that is preconfigured in Helix. For on-premises ETLs, use the scheduler that runs on the Remote ETL Engine.
    Maximum execution time before warning The number of hours, minutes, or days to execute the ETL before generating warnings, if any.
    Frequency Select the frequency of ETL execution. Available options are:
    • Predefined: Select a Predefined frequency from Each Day, Each Week or Each Month.
    • Custom: Enter a Custom frequency (time interval) as the number of minutes, hours, days or weeks to run the ETL in.
    Start timestamp: hour\minute (Applies to Predefined frequency) The HH:MM start timestamp to add to the ETL execution running on a Predefined frequency.
    Custom start timestamp Select a YYYY-MM-DD HH:MM timestamp to add to the ETL execution running on a Custom frequency.

    Note

    To view or configure Advanced properties, click Advanced. You do not need to set or modify these properties unless you want to change the way the ETL works. These properties are for advanced users and scenarios only.

    Advanced properties

    Property Description
    Run configuration
    Default locale Enter the default locale information.
    Aggregation resolution Default value is filled out for you.
    Collection level 
    Metric profile selection

    Select any one:

    • Use Global metric profile: Select this option to use the out-of-the-box global profile that is available on the Adding and managing metric profiles page. By default, all ETL modules use this profile.
    • Select a custom metric profile: Any metric profiles you add in the Add metric profile page (Administration > DATAWAREHOUSE > Metric profiles).

    For more information, see Adding and managing metric profiles.

    Levels up to

    The metric level defines the number of metrics imported into BMC Helix Continuous Optimization. Increasing the level adds load to the ingestion, while decreasing it reduces the number of imported metrics.

    Choose the metric level to apply on selected metrics:

    • [1] Essential
    • [2] Basic
    • [3] Standard
    • [4] Extended
    Metric

    Metric list <for selected dataset>

    Note: This property is activated only after you select Datasets under the Run configuration tab.

    1. Click Edit.
    2. Select one (click) or more (shift+click) metrics that you want to include from Available items and click >> to move them to Selected items.
    3. Click Apply.
    File location
    Subdirectories to exclude (separated by ';' ) (Local directory) Names of subdirectories to exclude from parsing.
    Input file external validator (Local directory, Windows share, FTP) Select any one of the following options:
    • No external validation: Do not use external validation of the CSV file structure.
    • Use external validation script: Use the following script to validate the CSV file:
      • Script to execute: Specify the validation script to use to validate the input file.
    Additional properties
    List of properties
    1. Click Add.
    2. Add an additional property in the etl.additional.prop.n box.
    3. Click Apply.
      Repeat this task to add more properties.
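
      Additional information

      The aggregation parameters described in the Aggregation section at the end of this topic are plain name=value settings. Assuming that this ETL accepts them as additional properties (verify this against your own ETL configuration), an illustrative pair for definition number 1 might look like the following:

      extract.parser.wkldregex.1.aggrtype = sum
      extract.parser.wkldregex.1.value = %GROUP1
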
    Loader configuration
    Empty dataset behavior Choose one of the following actions if the loader encounters an empty dataset:
    • Abort: Abort the loader.
    • Ignore: Ignore the empty dataset and continue parsing.
    ETL log file name Name of the file that contains the ETL execution log; the default value is: %BASE/log/%AYEAR%AMONTH%ADAY%AHOUR%MINUTE%TASKID
    Maximum number of rows for CSV output A number which limits the size of the output files.
    CSV loader output file name Name of the file generated by the CSV loader; the default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID.
    BCO loader output file name Name of the file generated by the Capacity Optimization loader; the default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID.
    Detail mode Select the level of detail:
    • Standard: Data is stored in the database in different tables at the following time granularities: Detail (configurable, 5 minutes by default), Hourly, Daily, Monthly.
    • Raw also: Data is stored in the database in different tables at the following time granularities: Raw (as available from the original data source), Detail (configurable, 5 minutes by default), Hourly, Daily, Monthly.
    • Raw only: Data is stored in the database in a table at Raw granularity only (as available from the original data source).
    Reduce priority Select one of the following options:
    • Normal
    • High
    Remove domain suffix from datasource name (Only for systems) If set to True, the domain name is removed from the data source name. For example, server.domain.com will be saved as server.
    Leave domain suffix to system name (Only for systems) If set to True, the domain name is retained in the system name. For example, server.domain.com is saved as is.
    Update grouping object definition If set to True, the ETL is allowed to update the grouping object definition for a metric loaded by this ETL.
    Skip entity creation

    (Only for ETL tasks sharing lookup with other tasks) If set to True, this ETL does not create an entity, and discards data from its data source for entities not found in BMC Helix Continuous Optimization. It uses one of the other ETLs that share lookup to create the new entity.

    Scheduling options
    Hour mask Specify a value to execute the task only during particular hours within the day. For example, 0 – 23 or 1,3,5 – 12.
    Day of week mask Select the days so that the task can be executed only during the selected days of the week. To avoid setting this filter, do not select any option for this field.
    Day of month mask Specify a value to execute the task only during particular days within a month. For example, 5, 9, 18, 27 – 31.
    Apply mask validation By default this property is set to True. Set it to False if you want to disable the preceding Scheduling options that you specified. Setting it to False is useful if you want to temporarily turn off the mask validation without removing any values.
    Execute after time Specify a value in the hours:minutes format (for example, 05:00 or 16:00) to wait before the task must be executed. This means that once the task is scheduled, the task execution starts only after the specified time passes.
    Enqueueable Select one of the following options:
    • False (Default): If a new execution is triggered while the task is already running, the new execution is ignored.
    • True: If a new execution is triggered while the task is already running, it is queued and executed as soon as the current execution ends.
  4. Click Save.
    You return to the Last run tab under the ETL tasks page.
  5. Validate the results in simulation mode: In the ETL tasks table under ETL tasks > Last run, locate your ETL (ETL task name), and click the run icon to run the ETL.
    After you run the ETL, the Last exit column in the ETL tasks table will display one of the following values:
    • OK: The ETL executed without any error in simulation mode.
    • WARNING: The ETL execution returned some warnings in simulation mode. Check the ETL log.
    • ERROR: The ETL execution returned errors and was unsuccessful. Edit the active Run configuration and try again.
  6. Switch the ETL to production mode: To do this, perform the following task:
    1. In the ETL tasks table under ETL tasks > Last run, click the ETL under the Name column.
    2. In the Run configurations table in the ETL details page, click the edit icon to edit the active run configuration.
    3. In the Edit run configuration page, navigate to the Run configuration expandable tab and set Execute in simulation mode to No.
    4. Click Save.
  7. Locate the ETL in the ETL tasks table and click the run icon to run it, or schedule an ETL run.
    After you run the ETL, or schedule the ETL for a run, it extracts the data from the source and transfers it to the BMC Helix Continuous Optimization database.

Aggregation

Add the following two configuration parameters:

  • extract.parser.wkldregex.[num].aggrtype = [sum | count | max | min | average] : Lets you aggregate parsing results using one of the available statistics.
  • extract.parser.wkldregex.[num].value = %GROUPn : The aggregation field.

where [num] is the definition number.
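
For example, the following pair of parameters (definition number 1 and capture group 1 are arbitrary, illustrative choices) averages the value captured by the first group of the matching pattern:

  extract.parser.wkldregex.1.aggrtype = average
  extract.parser.wkldregex.1.value = %GROUP1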

lastcounter parameter

The lastcounter parameter cannot be used with the General Text Log Parser. The parser parses all files that match the configured pattern, whereas extractors extract data from a database and use the lastcounter parameter to record the last sample extracted.

Related topics

Using ETL datasets

Generic - CSV file parser

Developing custom ETLs

Dataset reference for ETL tasks

Horizontal and Vertical datasets

Viewing datasets and metrics by dataset and ETL module
