Moviri - Dynatrace Extractor
"Moviri Integrator for BMC Helix Continuous Optimization – Dynatrace" is an additional component of the BMC Helix Continuous Optimization product. It allows extracting data from Dynatrace: relevant capacity metrics are loaded into BMC Helix Continuous Optimization, which provides advanced analytics over the extracted data. The integration supports the extraction of both performance and configuration data.
The documentation is targeted at BMC Helix Continuous Optimization administrators, in charge of configuring and monitoring the integration between BMC Helix Continuous Optimization and Dynatrace.
This version of the connector is compatible with BMC Helix Continuous Optimization 19.11 and onward.
If you used the Moviri Integrator for BMC Helix Continuous Optimization – Dynatrace before, note that the hierarchy has changed in version 20.02.01; some entities are no longer part of the hierarchy and are left under "All systems and business drivers → Unassigned".
This version of the Moviri Integrator for BMC Helix Continuous Optimization – Dynatrace uses the Metrics v2 endpoint for data extraction, which differs from the previous version. Check the "Verify Dynatrace Data" section for detailed metrics.
Collecting data by using the Dynatrace ETL
To collect data by using the Dynatrace ETL, do the following tasks:
I. Complete the preconfiguration tasks.
II. Configure the Dynatrace ETL.
III. Run the Dynatrace ETL.
Step | Details |
---|---|
Check that the Dynatrace version is supported. | |
Generate an authorization token to access the Dynatrace API. | |
Verify that the token is working. | |
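Before configuring the ETL, you can check that the generated token actually works against the Metrics v2 endpoint that this connector uses. The following is a minimal sketch, assuming Python with the requests library; the environment URL and token values are placeholders that you must replace.

```python
# Minimal token check against the Dynatrace Metrics v2 API (illustrative only).
# Replace the URL prefix and token with your own values.
import requests

DYNATRACE_URL = "https://{id}.live.dynatrace.com"   # your environment URL prefix
API_TOKEN = "dt0c01.XXXX"                           # token generated in the prerequisites

response = requests.get(
    f"{DYNATRACE_URL}/api/v2/metrics",
    headers={"Authorization": f"Api-Token {API_TOKEN}"},
    params={"pageSize": 1},   # one entry is enough to prove the token works
    timeout=30,
)

# 200 means the token is accepted; 401/403 usually point to a missing or
# under-scoped token.
print(response.status_code)
```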
A. Configuring the basic properties
Some of the basic properties display default values. You can modify these values if required.
To configure the basic properties:
- In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
- On the ETL tasks page, click Add > Add ETL. The Add ETL page displays the configuration properties. You must configure properties in the following tabs: Run configuration, Entity catalog, and Dynatrace - Connection Parameters.
- On the Run Configuration tab, select Moviri - Dynatrace Extractor from the ETL Module list. The name of the ETL is displayed in the ETL task name field. You can edit this field to customize the name.
- Click the Entity catalog tab, and select one of the following options:
  - Shared Entity Catalog: From the Sharing with Entity Catalog list, select the entity catalog name that is shared between ETLs.
  - Private Entity Catalog: Select if this is the only ETL that extracts data from the Dynatrace resources.
Click the Dynatrace - Connection Parameters tab, and configure the following properties:
Property | Description |
---|---|
Dynatrace URL prefix | Dynatrace URL. The URL must be in the form "https://{id}.live.dynatrace.com". |
Dynatrace API Token | Dynatrace API token (generated in the prerequisites). |
Semi-colon separated list of HTTP Headers (<header>:"<value>") | Extra HTTP header key-value pairs used for accessing the Dynatrace API (semicolon-separated list for multiple HTTP headers). |
Use HTTP Proxy to connect to Dynatrace | Select Yes if an HTTP proxy is required to connect to the Dynatrace API. |
Use HTTPS | Whether or not to use HTTPS to connect to the HTTP(S) proxy. |
HTTP Proxy Address | HTTP proxy FQDN or IP address. |
HTTP Proxy Port | HTTP(S) proxy server port. |
HTTP Proxy Username | HTTP(S) proxy server username. |
HTTP Proxy Password | HTTP(S) proxy server password. |
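To make the connection properties above concrete, here is a minimal sketch, assuming Python with the requests library, of how the URL prefix, API token, extra HTTP headers, and proxy settings could be combined into a single request. The header and proxy values are hypothetical examples; this is not the connector's actual code.

```python
# Illustrative mapping of the connection properties to an HTTP call.
# All names and values below are placeholders.
import requests

url_prefix = "https://{id}.live.dynatrace.com"      # Dynatrace URL prefix
api_token = "dt0c01.XXXX"                           # Dynatrace API Token
extra_headers = 'X-Custom-Header:"demo"'            # semicolon-separated header list

headers = {"Authorization": f"Api-Token {api_token}"}
for pair in filter(None, extra_headers.split(";")):
    name, _, value = pair.partition(":")
    headers[name.strip()] = value.strip().strip('"')

# Optional HTTP(S) proxy settings, mirroring the proxy properties above.
proxies = {"https": "http://proxy-user:proxy-password@proxy.example.com:8080"}

response = requests.get(f"{url_prefix}/api/v2/metrics",
                        headers=headers, proxies=proxies, timeout=60)
print(response.status_code)
```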
Click the Dynatrace - Extraction tab, and configure the following properties:
Property | Description |
---|---|
Default Lastcounter (YYYY-MM-DD HH24:MI:SS) | Initial timestamp from which to extract data. |
Max Extraction Period (hours) | Specify the maximum extraction period in hours (default is 24 hours / 1 day). |
Entities batch size | Specify how many entities are extracted at the same time from Dynatrace. |
Timeout seconds | Specify the timeout in seconds for the Dynatrace API connection. |
Waiting period in seconds when hit API limits | Specify the waiting period applied when the Dynatrace API requests-per-minute limit is hit. |
Data Resolution | Select the resolution at which data is imported into TSCO. |
Extraction time chunk size (hour), 0 for no limitation | Specify how many hours of data can be extracted at once (default is 0: no limitation, all data from the last counter is extracted). |
ETL waits when API calls reaches this number | Specify the safeguard threshold for the 'X-RateLimit-Remaining' header. The ETL waits for the next available time slot (or up to the configured timeout) when the 'X-RateLimit-Remaining' header reaches this value. More information at https://www.dynatrace.com/support/help/dynatrace-api/basics/access-limit/ |
Page Size | Specify the page size of each Dynatrace response; the page size represents how many data entries are returned per Dynatrace API call. |
Lag hour to now as end timestamp | If the ending timestamp of the extraction period is close to the current timestamp, specify the lag in hours from the current timestamp used as the end timestamp for the Dynatrace Metrics API. |
Enable Support for Business Service View | Select if the Dynatrace ETL should be configured to support the Business Service View. |
Extract Dynatrace Tags? | Select if the Dynatrace ETL should import Dynatrace tags. |
Enable Default Tag for Dynatrace tags without a key/value pair? | Select if the Dynatrace ETL can import Dynatrace tags without a key. Not all Dynatrace tags are available in the key:value format (imported as TAGTYPE:TAG); this property enables importing Dynatrace tags configured with just a value. |
Default tagtype | TAGTYPE to use for Dynatrace tags without a key. |
Extract Application Tags? | Select if the Dynatrace ETL should import Dynatrace tags defined at the Application level. |
Whitelist Application Tags (Leave empty to extract ALL tags; semicolon separated) | Whitelist of Dynatrace tags to be imported at the Application level (semicolon-separated). |
Extract Service Tags? | Select if the Dynatrace ETL should import Dynatrace tags defined at the Service level. |
Whitelist Service Tags (Leave empty to extract ALL tags; semicolon separated) | Whitelist of Dynatrace tags to be imported at the Service level (semicolon-separated). |
Extract Host Tags? | Select if the Dynatrace ETL should import Dynatrace tags defined at the Host level. |
Whitelist Host Tags (Leave empty to extract ALL tags; semicolon separated) | Whitelist of Dynatrace tags to be imported at the Host level (semicolon-separated). |
Tag Type for Business Service View | Specify which Tag Type should be used to import Service Pools, to be used in the Business Service View (default AppTier). |
Alternative name for 'Default Business Service' domain | Alternative name for default business service view |
Host Lookup Information | This property allows you to define which value is used as the strong lookup field. |
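The time-related extraction properties (Default Lastcounter, Max Extraction Period, Extraction time chunk size, and the lag hours) jointly determine the time windows requested from Dynatrace. The sketch below illustrates one plausible way these properties could interact; it is an assumption for clarity, not the connector's implementation.

```python
# Assumed sketch of how the extraction windows could be derived from the
# time-related properties above; defaults mirror the documented ones where given.
from datetime import datetime, timedelta

def extraction_windows(last_counter: datetime,
                       max_period_hours: int = 24,   # Max Extraction Period
                       chunk_hours: int = 0,         # chunk size, 0 = no limitation
                       lag_hours: int = 1):          # assumed example lag
    """Yield (start, end) windows to request from the Dynatrace Metrics v2 API."""
    end_limit = datetime.utcnow() - timedelta(hours=lag_hours)   # lag to "now"
    end_limit = min(end_limit, last_counter + timedelta(hours=max_period_hours))
    if chunk_hours <= 0:          # no chunking: one window from the last counter
        yield last_counter, end_limit
        return
    start = last_counter
    while start < end_limit:
        end = min(start + timedelta(hours=chunk_hours), end_limit)
        yield start, end
        start = end

# Example: resume from the last counter, in 6-hour chunks.
for start, end in extraction_windows(datetime(2023, 1, 1), chunk_hours=6):
    print(start, "->", end)
```

With the defaults (chunk size 0), this would produce a single window from the last counter up to the lag-adjusted current time, capped by the maximum extraction period.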
Click the Dynatrace - Filter tab, and configure the following properties:
Property | Description |
---|---|
Collect infrastructure metrics | If selected, the connector collects metrics at the host level (VM, Generic). |
Import indirect services for each application | If selected, the indirect services are imported. |
Skip Process Group aggregation | If selected, the process data is not aggregated to Services. |
Extract service key request metrics | If selected, the key request metrics (BYSET_* business driver metrics) are imported. |
Extract all hosts | If selected, all hosts (even those not assigned to any application/service) are imported. Hosts not assigned to any application are available in the "Infrastructure" domain. |
Extract all services | If selected, all services (even those not assigned to any application) are imported. Services not assigned to any application are available in the "Services" domain. |
Extract BYFS_ metrics for hosts? | If selected, BYFS_ metrics are imported, which results in a very slow execution since the data is not aggregated. Default is "No". |
| | If selected, the disk's UUID (format DISK-123456789) is used as the sub-object name. To use the disk name (path) instead, manually add the property "extract.dynatrace.disk.name" and set it to "true". Extracting disk names requires the API v2 Read Entities scope; refer to the "Generate the token" section above. Using the disk name (path) also results in a slower execution. |
Change entitytype based on Cloud type? | Change the entity type based on the cloud type, limited to aws, gcp, azure, and oracle. OpenStack or other VM platform types are set to "generic". |
Custom metrics JSON file location | The location of the custom metrics configuration JSON file. If you do not want to use custom metrics, leave this field empty. |
Application Whitelist (; separated) | Semicolon-separated list of Java regular expressions used to identify the applications to be imported (multiple regular expressions are combined in an "OR" clause). |
Application Blacklist (; separated) | Semicolon-separated list of Java regular expressions used to identify the applications not to be imported (multiple regular expressions are combined in an "OR" clause). |
Application whitelist (; separated), support Application ID | Semicolon-separated list of Application IDs used to identify the applications to be imported. This property does not accept regular expressions; an exact match of the Application ID is performed. |
Application blacklist (; separated), support Application ID | Semicolon-separated list of Application IDs used to identify the applications not to be imported. This property does not accept regular expressions; an exact match of the Application ID is performed. |
Host whitelist (; separated) | Semicolon-separated list of Java regular expressions used to identify the hosts to be imported (multiple regular expressions are combined in an "OR" clause). |
Host blacklist (; separated) | Semicolon-separated list of Java regular expressions used to identify the hosts not to be imported (multiple regular expressions are combined in an "OR" clause). |
Host whitelist (; separated), support Host ID | Semicolon-separated list of Host IDs used to identify the hosts to be imported. This property does not accept regular expressions; an exact match of the Host ID is performed. |
Host blacklist (; separated), support Host ID | Semicolon-separated list of Host IDs used to identify the hosts not to be imported. This property does not accept regular expressions; an exact match of the Host ID is performed. |
Tags for filtering Apps (; separated) | Semicolon-separated list of tags used to filter the applications. The format can be [Context]tag:value, tag:value, or tag. All the tags must apply to an in-scope application (multiple tags are combined in an "AND" clause). |
Tags for filtering Services (; separated) | Semicolon-separated list of tags used to filter the services. The format can be [Context]tag:value, tag:value, or tag. All the tags must apply to an in-scope service (multiple tags are combined in an "AND" clause). |
Tags for filtering Hosts (; separated) | Semicolon-separated list of tags used to filter the hosts. The format can be [Context]tag:value, tag:value, or tag. All the tags must apply to an in-scope host (multiple tags are combined in an "AND" clause). |
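To clarify the filtering semantics above (regular-expression lists combined in OR, tag lists combined in AND), here is a small illustrative sketch in Python. The host name, tags, and filter values are hypothetical, and the evaluation logic is an assumption rather than the connector's code.

```python
# Assumed sketch of the whitelist/blacklist and tag filter semantics described
# above: OR across semicolon-separated regexes, AND across semicolon-separated tags.
import re

def matches_any(patterns: str, name: str) -> bool:
    """Semicolon-separated regular expressions, combined in OR."""
    return any(re.search(p, name) for p in patterns.split(";") if p)

def has_all_tags(required: str, entity_tags: set[str]) -> bool:
    """Semicolon-separated tags ([Context]tag:value, tag:value, or tag), combined in AND."""
    return all(tag in entity_tags for tag in required.split(";") if tag)

# Example: keep production hosts except database ones, tagged env:prod.
host = {"name": "prod-web-01", "tags": {"env:prod", "tier:frontend"}}
whitelist, blacklist, tag_filter = "prod-.*", ".*-db-.*", "env:prod"

in_scope = (matches_any(whitelist, host["name"])
            and not matches_any(blacklist, host["name"])
            and has_all_tags(tag_filter, host["tags"]))
print(in_scope)   # True for this example host
```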
The following image shows sample configuration values for the basic properties.
(Optional) Override the default values of properties in the following tabs:
- Click Save.
The ETL tasks page shows the details of the newly configured Dynatrace ETL.
(Optional) B. Configuring the advanced properties
You can configure the advanced properties to change the way the ETL works or to collect additional metrics.
To configure the advanced properties:
- On the Add ETL page, click Advanced.
Configure the following properties:
- Click Save.
The ETL tasks page shows the details of the newly configured Dynatrace ETL.
After you configure the ETL, you can run it to collect data. You can run the ETL in the following modes:
A. Simulation mode: Only validates connection to the data source, does not collect data. Use this mode when you want to run the ETL for the first time or after you make any changes to the ETL configuration.
B. Production mode: Collects data from the data source.
A. Running the ETL in the simulation mode
To run the ETL in the simulation mode:
- In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
- On the ETL tasks page, click the ETL.
- In the Run configurations table, click Edit to modify the ETL configuration settings.
- On the Run configuration tab, ensure that the Execute in simulation mode option is set to Yes, and click Save.
- Click Run active configuration. A confirmation message about the ETL run job submission is displayed.
- On the ETL tasks page, check the ETL run status in the Last exit column.
- If the ETL run status is OK, the ETL ran without any error and you are ready to run the ETL in the production mode.
- If the ETL run status is Warning, Error, or Failed:
- On the ETL tasks page, click in the last column of the ETL name row.
- Check the log and reconfigure the ETL if required.
- Run the ETL again.
- Repeat these steps until the ETL run status changes to OK.
B. Running the ETL in the production mode
You can run the ETL manually when required or schedule it to run at a specified time.
Running the ETL manually
- On the ETL tasks page, click the ETL. The ETL details are displayed.
- In the Run configurations table, click Edit to modify the ETL configuration settings. The Edit run configuration page is displayed.
- On the Run configuration tab, select No for the Execute in simulation mode option, and click Save.
- To run the ETL immediately, click Run active configuration. A confirmation message about the ETL run job submission is displayed.
When the ETL is run, it collects data from the source and transfers it to the database.
Scheduling the ETL run
By default, the ETL is scheduled to run daily. You can customize this schedule by changing the frequency and period of running the ETL.
To configure the ETL run schedule:
- On the ETL tasks page, click the ETL, and click Edit.
- On the Edit task page, do the following, and click Save:
- Specify a unique name and description for the ETL task.
- In the Maximum execution time before warning field, specify the duration for which the ETL must run before generating warnings or alerts, if any.
- Select a predefined or custom frequency for starting the ETL run. The default selection is Predefined.
- Select the task group and the scheduler to which you want to assign the ETL task.
- Click Schedule. A message confirming the scheduling job submission is displayed.
When the ETL runs as scheduled, it collects data from the source and transfers it to the database.
Verify that the ETL ran successfully and check whether the Dynatrace data is refreshed in the Workspace.
To verify whether the ETL ran successfully:
- In the console, click Administration > ETL and System Tasks > ETL tasks.
- In the Last exec time column corresponding to the ETL name, verify that the current date and time are displayed.
- In the console, click Workspace.
- Expand (Domain name) > Systems > Dynatrace > Instances.
- In the left pane, verify that the hierarchy displays the new and updated Dynatrace instances.
- Click a Dynatrace entity, and click the Metrics tab in the right pane.
- Check if the Last Activity column in the Configuration metrics and Performance metrics tables displays the current date.
Dynatrace Workspace Entity | Details |
---|---|
Entities | |
Hierarchy | |
Configuration and Performance metrics mapping | |
Tag Mapping (Optional) | Tags are optional. If the Business Service View is used, each machine has a tag that uses the tag type configured in the ETL; the tag value is the name of the direct parent tier. |
Custom Metrics
The Dynatrace extractor now supports custom metrics on Applications, Services, and Hosts. Custom metrics are enabled by configuring a JSON file and pointing the "Custom metrics JSON file location" property (Dynatrace - Filter tab) to it.
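The schema of the custom metrics JSON file is specific to the connector and is not reproduced here. As background, the sketch below only shows how a custom metric could be queried from the Dynatrace Metrics v2 endpoint that this connector version uses; the metric selector and entity selector are placeholder examples.

```python
# Illustrative query of a custom metric through the Dynatrace Metrics v2 API;
# the selectors below are placeholders, not values from the connector's JSON file.
import requests

DYNATRACE_URL = "https://{id}.live.dynatrace.com"
API_TOKEN = "dt0c01.XXXX"

response = requests.get(
    f"{DYNATRACE_URL}/api/v2/metrics/query",
    headers={"Authorization": f"Api-Token {API_TOKEN}"},
    params={
        "metricSelector": "calc:service.mycustommetric",  # example custom metric key
        "entitySelector": 'type("SERVICE")',
        "resolution": "5m",
        "from": "now-1h",
    },
    timeout=60,
)
print(response.json())
```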