Moviri - AppDynamics Extractor

"Moviri Integrator for BMC Helix Continuous Optimization – AppDynamics" is an additional component of the BMC Helix Continuous Optimization product. It extracts data from AppDynamics and loads the relevant capacity metrics into BMC Helix Continuous Optimization, which provides advanced analytics over the extracted data.

The integration supports the extraction of both performance and configuration data.

The documentation is targeted at BMC Helix Continuous Optimization administrators, in charge of configuring and monitoring the integration between BMC Helix Continuous Optimization and AppDynamics.


Moviri Integrator for BMC Helix Continuous Optimization - AppDynamics is compatible with BMC Helix Continuous Optimization 19.11 and onward.


Collecting data by using the AppDynamics ETL

To collect data by using the AppDynamics ETL, do the following tasks:

I. Complete the preconfiguration tasks.

II. Configure the ETL.

III. Run the ETL.

IV. Verify the data collection.

Step I. Complete the preconfiguration tasks

1. Check that the AppDynamics version is supported.
   The integration supports AppDynamics SaaS.

2. Generate user credentials for AppDynamics API access.
   The Controller API can be accessed either with an existing user (basic authentication) or with an API Client (OAuth token). Use one of the following options.

Option 1. Use an existing user (basic authentication):

  1. Log in to the AppDynamics console.
  2. Go to "Administration".
  3. Check the user's username and password.
  4. Click "Role" to grant this user access to the Controller API with at least read-only privileges.

Option 2. Create an API Client (OAuth token-based authentication):

  1. Log in to the AppDynamics console.
  2. Go to "Administration".
  3. Click the "API Clients" tab.
  4. Click "Create" to create a new API Client.
  5. Give the client a name and a description.
    1. The API Client username will be "<Client Name>@<Account Name>", similar to basic authentication.
  6. Set the "Token Expiration" to whatever value you want; the ETL manages its own token internally and refreshes it when needed.
  7. Take note of the client secret and generate a new one as needed.
3. Verify that the user credentials have API access.

For basic authentication (Option 1), execute the following command from a Linux console:

curl --user <username>@<customer>:<password> <https/http>://<host>:<port>/controller/rest/applications


Most on-premises Controllers are single-tenant Controllers that use customer1 as the primary default account name. The account name should be left as default. For example:

<your_username>@customer1:<your_password>

Most SaaS Controllers are multi-tenant Controllers and allow you to replace customer1 with your own, instance-specific account name.

An output similar to the following should be obtained:


<applications>
  <application>
    <id>203</id>
    <name>2741FACSYS.PP</name>
    <accountGuid>b846f45a-8193-4f78-8822-090d963c9e30</accountGuid>
  </application>
  <application>
    <id>80</id>
    <name>1619-TPVI.TI-B</name>
    <accountGuid>b846f45a-8193-4f78-8822-090d963c9e30</accountGuid>
  </application>
</applications>
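
The same endpoint can also return JSON instead of XML by appending the output=JSON query parameter (supported by the AppDynamics Controller REST API), which can be easier to parse in scripts:

JSON Output cURL
curl --user <username>@<customer>:<password> "<https/http>://<host>:<port>/controller/rest/applications?output=JSON"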

For an API Client (Option 2), request an OAuth token by executing the following command from a Linux console:

Token Request cURL
curl -v --location --request POST 'https://<Account Name>.saas.appdynamics.com/controller/api/oauth/access_token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'client_id=<API Client Name>@<Account Name>' \
--data-urlencode 'client_secret=<API Client Secret>'


Most on-premises Controllers are single-tenant Controllers that use customer1 as the primary default account name. In that case, leave the account name as the default, for example:

<API Client Name>@customer1

Most SaaS Controllers are multi-tenant Controllers and allow you to replace customer1 with your own, instance-specific account name.

An output similar to the following should be obtained:


OAuth Response
{
    "access_token": "eyJraWQiOiJmZGI4ZjY1Yi1lMTYyLTQ3YWQtYjgyMS04MTA0MmUxZDE1YzkiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJ~",
    "expires_in": 60
}

The access_token value in the response is the API bearer token to use for testing whether the client has access to the AppDynamics API.

Use the following cURL request to test the newly generated API Token.

API cURL Test
curl --location --request GET 'https://<account name>.saas.appdynamics.com/controller/rest/applications' --header 'Authorization: Bearer <API Token from previous step>'

You should get a response like this indicating a successful connection to the API:

cURL Test Response
<applications>
    <application>
        <id>416</id>
        <name>AD-Capital</name>
        <accountGuid>fdb8f65b-e162-47ad-b821-81042e1d15c9</accountGuid>
    </application>
    <application>
        <id>459</id>
        <name>AD-Financial</name>
        <accountGuid>fdb8f65b-e162-47ad-b821-81042e1d15c9</accountGuid>
    </application>
</applications>
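
Optionally, the token request and the API test can be combined in a single shell snippet. The following is a minimal sketch (not part of the product) that assumes a Linux console with the jq utility installed; replace the placeholders with your own values:

Combined Token Request and API Test
# Request an OAuth token and extract the access_token field with jq
TOKEN=$(curl -s --request POST 'https://<Account Name>.saas.appdynamics.com/controller/api/oauth/access_token' \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --data-urlencode 'grant_type=client_credentials' \
  --data-urlencode 'client_id=<API Client Name>@<Account Name>' \
  --data-urlencode 'client_secret=<API Client Secret>' | jq -r '.access_token')

# Use the token to list the applications visible to the API Client
curl --request GET 'https://<Account Name>.saas.appdynamics.com/controller/rest/applications' \
  --header "Authorization: Bearer $TOKEN"

If the second call returns the <applications> XML shown above, the API Client is configured correctly.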

Step II. Configure the ETL

A. Configuring the basic properties

Some of the basic properties display default values. You can modify these values if required.

To configure the basic properties:

  1. In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
  2. On the ETL tasks page, click Add > Add ETL. The Add ETL page displays the configuration properties. You must configure properties in the following tabs: Run configuration, Entity catalog, and AppDynamics - Connection Parameters.

  3. On the Run Configuration tab, select Moviri - AppDynamics Extractor from the ETL Module list. The name of the ETL is displayed in the ETL task name field. You can edit this field to customize the name.

  4. Click the Entity catalog tab, and select one of the following options:
    • Shared Entity Catalog:

      • From the Sharing with Entity Catalog list, select the entity catalog name that is shared between ETLs.
    • Private Entity Catalog: Select if this is the only ETL that extracts data from the AppDynamics resources.
  5. Click the AppDynamics - Connection Parameters tab, and configure the following properties:

    • AppDynamics URL (http/https://hostname:port): The URL of the AppDynamics Controller.
    • AppDynamics Username: The AppDynamics username (generated in the prerequisites).
    • AppDynamics Password: The AppDynamics password (generated in the prerequisites).
    • AppDynamics Proxy URL (http/https://hostname:port): The HTTP(S) proxy URL, if a proxy is required to reach the Controller (see the optional connectivity check after this list).
    • HTTP Proxy Username: The HTTP(S) proxy server username.
    • HTTP Proxy Password: The HTTP(S) proxy server password.
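
If the Controller is reachable only through an HTTP(S) proxy, you can optionally verify connectivity from the machine that runs the ETL engine before configuring the task. The following command is a minimal sketch that combines curl's standard proxy options with the same placeholders used in the prerequisites:

Proxy Connectivity Check cURL
curl --proxy <http/https>://<proxy_host>:<proxy_port> --proxy-user <proxy_username>:<proxy_password> \
  --user <username>@<customer>:<password> \
  <https/http>://<host>:<port>/controller/rest/applications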

6. Click the AppDynamics Extraction tab, and configure the following properties:

  • Default Lastcounter (YYYY-MM-DD HH24): The initial timestamp from which to extract data (for example, "2019-02-05 00").
  • Maximum Days to Extract: The maximum number of days of data to extract.
  • Extract Lag (hours): The extraction lag, in hours.
  • Data Resolution: 1 hour or 10 minutes. By default, if the last counter is within the past 2 days, AppDynamics returns data at 10-minute resolution; if the last counter is more than 2 days old, AppDynamics uses 1-hour resolution.
  • Application Whitelist: Import only the applications that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported (see the examples after this list).
  • Application Blacklist: Ignore the applications that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported.
  • Tier Whitelist: Import only the tiers that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported.
  • Tier Blacklist: Ignore the tiers that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported.
  • Host Whitelist: Import only the hosts that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported.
  • Host Blacklist: Ignore the hosts that match the specified list. Separate entries with a pipe ("|"); regular expression syntax is supported.

  • Import Server Data: If "Yes" is selected, the ETL imports performance metrics at the host level. If "No" is selected, the ETL does not import performance metrics at the host level; it still aggregates performance and configuration metrics at the Tier level and adds servers/hosts to the hierarchy.
  • Import Server Workload Data: A Yes/No option that controls whether the ETL imports Business Drivers at the host level.
  • Create 'Default Business Service' domain?: Select whether the AppDynamics ETL should be configured to support the Business Service View.
  • Alternative name for 'Default Business Service' domain: The name to use for the business service view instead of the default "Default Business Service".
  • Tag Type for Service Pools?: The Tag Type to use to import Service Pools for the Business Service View (default: AppTier).
The following image shows sample configuration values for the basic properties.
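
The application, tier, and host filters accept pipe-separated regular expressions. The values below are purely illustrative (the application and tier names are hypothetical) and only show the syntax:

Filter Examples
Application Whitelist:  ECOM-PROD.*|Payment.*
Tier Blacklist:         .*-test|.*-qa

With these values, only applications whose names match ECOM-PROD.* or Payment.* are imported, and any tier whose name ends in -test or -qa is ignored.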

7. (Optional) Override the default values of the properties:

  • Module description: A short description of the ETL module.
  • Execute in simulation mode: By default, the ETL execution in simulation mode is selected to validate connectivity with the data source, and to ensure that the ETL does not have any configuration issues. In the simulation mode, the ETL does not load data into the database. This option is useful when you want to test a new ETL task. To run the ETL in the production mode, select No. BMC recommends that you run the ETL in the simulation mode after ETL configuration and then run it in the production mode.
  • Associate new entities to: Specify the domain to which you want to add the entities created by the ETL. Select one of the following options:
    • Existing domain: This option is selected by default. Select an existing domain from the Domain list.
    • New domain: Select a parent domain, and specify a name for your new domain.
    By default, a new domain with the same ETL name is created for each ETL.
  • Task group: Select a task group to classify the ETL.
  • Running on scheduler: Select one of the following schedulers for running the ETL:
    • Primary Scheduler: Runs on the Application Server.
    • Generic Scheduler: Runs on a separate computer.
    • Remote: Runs on remote computers.
  • Maximum execution time before warning: Indicates the number of hours, minutes, or days for which the ETL must run before generating warnings or alerts, if any.
  • Frequency: Select one of the following frequencies to run the ETL:
    • Predefined: This is the default selection. Select a daily, weekly, or monthly frequency, and then select a time to start the ETL run accordingly.
    • Custom: Specify a custom frequency, select an appropriate unit of time, and then specify a day and a time to start the ETL run.

(Optional) B. Configuring the advanced properties

You can configure the advanced properties to change the way the ETL works or to collect additional metrics.

To configure the advanced properties:

  1. On the Add ETL page, click Advanced.
  2. Configure the following properties:

    • Run configuration name: Specify the name that you want to assign to this ETL task configuration. The default configuration name is displayed. You can use this name to differentiate between the run configuration settings of ETL tasks.
    • Deploy status: Select the deploy status for the ETL task. For example, you can initially select Test and change it to Production after verifying that the ETL run results are as expected.
    • Log level: Specify the level of details that you want to include in the ETL log file. Select one of the following options:
      • 1 - Light: Select to add the bare minimum activity logs to the log file.
      • 5 - Medium: Select to add the medium-detailed activity logs to the log file.
      • 10 - Verbose: Select to add detailed activity logs to the log file.
      Use log level 5 as a general practice. You can select log level 10 for debugging and troubleshooting purposes.
    • Datasets: Specify the datasets that you want to add to the ETL run configuration. The ETL collects data of metrics that are associated with these datasets.
      1. Click Edit.
      2. Select one (click) or more (Shift+click) datasets from the Available datasets list and click >> to move them to the Selected datasets list.
      3. Click Apply.
      The ETL collects data of metrics associated with the datasets that are available in the Selected datasets list.
    • Metric profile selection: Select the metric profile that the ETL must use. The ETL collects data for the group of metrics that is defined by the selected metric profile.
      • Use Global metric profile: This is selected by default. All the out-of-the-box ETLs use this profile.
      • Select a custom metric profile: Select the custom profile that you want to use from the Custom metric profile list. This list displays all the custom profiles that you have created.
      For more information about metric profiles, see Adding and managing metric profiles.
    • Levels up to: Specify the metric level that defines the number of metrics that can be imported into the database. The load on the database increases or decreases depending on the selected metric level.
    • Empty dataset behavior: Specify the action for the loader if it encounters an empty dataset:
      • Warn: Generate a warning about loading an empty dataset.
      • Ignore: Ignore the empty dataset and continue parsing.
    • ETL log file name: The name of the file that contains the ETL run log. The default value is: %BASE/log/%AYEAR%AMONTH%ADAY%AHOUR%MINUTE%TASKID
    • Maximum number of rows for CSV output: A numeric value to limit the size of the output files.
    • CSV loader output file name: The name of the file that is generated by the CSV loader. The default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID
    • Capacity Optimization loader output file name: The name of the file that is generated by the BMC Helix Continuous Optimization loader. The default value is: %BASE/output/%DSNAME%AYEAR%AMONTH%ADAY%AHOUR%ZPROG%DSID%TASKID
    • Remove domain suffix from datasource name (Only for systems): Select True to remove the domain from the data source name. For example, server.domain.com will be saved as server. The default selection is False.
    • Leave domain suffix to system name (Only for systems): Select True to keep the domain in the system name. For example, server.domain.com will be saved as is. The default selection is False.
    • Skip entity creation (Only for ETL tasks sharing lookup with other tasks): Select True if you do not want this ETL to create an entity and to discard data from its data source for entities not found in Capacity Optimization. One of the other ETLs that share the lookup creates the new entity instead. The default selection is False.
    • Hour mask: Specify a value to run the task only during particular hours within a day. For example, 0 – 23 or 1, 3, 5 – 12.
    • Day of week mask: Select the days so that the task can be run only on the selected days of the week. To avoid setting this filter, do not select any option for this field.
    • Day of month mask: Specify a value to run the task only on the selected days of a month. For example, 5, 9, 18, 27 – 31.
    • Apply mask validation: Select False to temporarily turn off the mask validation without removing any values. The default selection is True.
    • Execute after time: Specify a value in the hours:minutes format (for example, 05:00 or 16:00) to wait before the task is run. The task run begins only after the specified time has elapsed.
    • Enqueueable: Specify whether you want to ignore the next run command or run it after the current task. Select one of the following options:
      • False: Ignores the next run command when a particular task is already running. This is the default selection.
      • True: Starts the next run command immediately after the current running task is completed.
    3. Click Save.

    The ETL tasks page shows the details of the newly configured AppDynamics ETL:

Step III. Run the ETL

After you configure the ETL, you can run it to collect data. You can run the ETL in the following modes:

A. Simulation mode: Only validates connection to the data source, does not collect data. Use this mode when you want to run the ETL for the first time or after you make any changes to the ETL configuration.

B. Production mode: Collects data from the data source.

A. Running the ETL in the simulation mode

To run the ETL in the simulation mode:

  1. In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
  2. On the ETL tasks page, click the ETL. The ETL details are displayed.



  3. In the Run configurations table, click Edit to modify the ETL configuration settings.
  4. On the Run configuration tab, ensure that the Execute in simulation mode option is set to Yes, and click Save.
  5. Click Run active configuration. A confirmation message about the ETL run job submission is displayed.
  6. On the ETL tasks page, check the ETL run status in the Last exit column.
    OK: Indicates that the ETL ran without any error. You are ready to run the ETL in the production mode.
  7. If the ETL run status is Warning, Error, or Failed:
    1. On the ETL tasks page, click the log icon in the last column of the ETL name row.
    2. Check the log and reconfigure the ETL if required.
    3. Run the ETL again.
    4. Repeat these steps until the ETL run status changes to OK.

B. Running the ETL in the production mode

You can run the ETL manually when required or schedule it to run at a specified time.

Running the ETL manually

  1. On the ETL tasks page, click the ETL. The ETL details are displayed.
  2. In the Run configurations table, click Edit to modify the ETL configuration settings. The Edit run configuration page is displayed.
  3. On the Run configuration tab, select No for the Execute in simulation mode option, and click Save.
  4. To run the ETL immediately, click Run active configuration. A confirmation message about the ETL run job submission is displayed.
    When the ETL is run, it collects data from the source and transfers it to the database.

Scheduling the ETL run

By default, the ETL is scheduled to run daily. You can customize this schedule by changing the frequency and period of running the ETL.

To configure the ETL run schedule:

  1. On the ETL tasks page, click the ETL, and click Edit Task. The ETL details are displayed.

  2. On the Edit task page, do the following, and click Save:

    • Specify a unique name and description for the ETL task.
    • In the Maximum execution time before warning field, specify the duration for which the ETL must run before generating warnings or alerts, if any.
    • Select a predefined or custom frequency for starting the ETL run. The default selection is Predefined.
    • Select the task group and the scheduler to which you want to assign the ETL task.
  3. Click Schedule. A message confirming the scheduling job submission is displayed.
    When the ETL runs as scheduled, it collects data from the source and transfers it to the database.

Step IV. Verify data collection

Verify that the ETL ran successfully and check whether the AppDynamics data is refreshed in the Workspace.

To verify whether the ETL ran successfully:

  1. In the console, click Administration > ETL & System Tasks > ETL tasks.
  2. In the Last exec time column corresponding to the ETL name, verify that the current date and time are displayed.
To verify that the AppDynamics data is refreshed:

  1. In the console, click Workspace.
  2. Expand (Domain name) > Systems > AppDynamics > Instances.
  3. In the left pane, verify that the hierarchy displays the new and updated AppDynamics instances.
  4. Click an AppDynamics entity, and click the Metrics tab in the right pane.
  5. Check if the Last Activity column in the Configuration metrics and Performance metrics tables displays the current date.


AppDynamics Workspace Entity Details

Entities

The following list shows each TSCO entity and the AppDynamics entity it is derived from:

  • Application: Application
  • Virtual Application: Tier
  • Application Server workload: Business Transaction
  • Machine: Machine


Hierarchy

The connector replicates the relationships and logical dependencies among these entities. In particular, all available applications are imported with their tiers and machines. Hosts and services that are not part of a specific application are imported into a separate domain tree.

With the release of 22.2.01.22.005, the ETL also creates additional Business Drivers underneath the Tiers that represent the host workload metrics.



Configuration and Performance metrics mapping

The following list maps each AppDynamics metric (left) to the corresponding TSCO metric (right), grouped by TSCO entity.

Business Driver (AppDynamics Business Transaction)

  • Calls per Minute -> TOTAL_EVENTS
  • Average Response Time (ms) -> EVENT_RESPONSE_TIME
  • Errors per Minute -> TOTAL_ERRORS
  • Calls per Minute -> BYSET_EVENTS
  • Average Response Time (ms) -> BYSET_RESPONSE_TIME
  • Errors per Minute -> BYSET_ERRORS

Performance Metrics

Tier / Machine:

  • Hardware Resources|CPU|%Busy -> CPU_UTIL
  • Hardware Resources|CPU|%Idle -> CPU_UTIL_IDLE
  • Hardware Resources|CPU|%Stolen -> CPU_UTIL_OVERHEAD
  • Hardware Resources|CPU|IOWait -> CPU_UTIL_WAIO
  • Hardware Resources|CPU|User -> CPU_UTIL_USER
  • Hardware Resources|CPU|System -> CPU_UTIL_SYSTEM
  • Hardware Resources|Memory|Free (MB) -> MEM_FREE
  • Hardware Resources|Memory|Used % -> MEM_UTIL
  • Hardware Resources|Memory|Used (MB) -> MEM_USED
  • Hardware Resources|Memory|Cache Size (MB) -> MEM_CACHE_BYTES
  • Hardware Resources|Memory|Swap Total (MB) -> SWAP_SPACE_TOT
  • Hardware Resources|Memory|Swap Used (MB) -> SWAP_SPACE_USED
  • Hardware Resources|Memory|Swap Used % -> SWAP_SPACE_UTIL
  • Hardware Resources|Memory|Total (MB) -> TOTAL_REAL_MEM
  • Hardware Resources|Disks|Space Available -> BYDISK_FREE
  • Hardware Resources|Disks|Space Used -> BYDISK_USED_SPACE
  • Hardware Resources|Disks|Avg Service Time (ms) -> BYDISK_SVC_TIME
  • Hardware Resources|Disks|KB written/sec -> BYDISK_WRITE_RATE
  • Hardware Resources|Disks|KB read/sec -> BYDISK_READ_RATE
  • Hardware Resources|Disks|Avg Queue Time (ms) -> BYDISK_QUEUE_TIME
  • Hardware Resources|Network|Incoming packets/sec -> BYIF_IN_PKT_RATE
  • Hardware Resources|Network|Outgoing packets/sec -> BYIF_OUT_PKT_RATE
  • Hardware Resources|Network|Incoming KB/sec -> BYIF_IN_BYTE_RATE
  • Hardware Resources|Network|Outgoing KB/sec -> BYIF_OUT_BYTE_RATE
  • Hardware Resources|Network|Incoming packets -> BYIF_IN_PKT
  • Hardware Resources|Network|Outgoing packets -> BYIF_OUT_PKT

Tier:

  • JVM|Memory:Heap|Committed (MB) -> HEAPMEM_COMMITTED
  • JVM|Memory:Heap|Current Usage (MB) -> HEAPMEM_USED
  • JVM|Memory:Heap|Max Available (MB) -> HEAPMEM_MAX
  • JVM|Memory:Heap|Used % -> HEAPMEM_UTIL
  • JVM|Memory:Non-Heap|Committed (MB) -> NONHEAPMEM_COMMITED
  • JVM|Memory:Non-Heap|Current Usage (MB) -> NONHEAPMEM_USED
  • JVM|Threads|Current No. of Threads -> THREAD_COUNT
  • Performance|Buffer Cache Hit Ratio -> DB_BUFFER_HIT_RATIO
  • Performance|Library Cache Hit Ratio -> DB_LIBRARY_CACHE_HIT_RATIO
  • Performance|Physical Read Total IO Requests Per Sec -> DB_PHYSICAL_READS_RATE
  • Performance|Physical Write Total IO Requests Per Sec -> DB_PHYSICAL_WRITES_RATE
  • Server Statistic|physical reads -> DB_PHYSICAL_READS
  • Server Statistic|physical writes -> DB_PHYSICAL_WRITES
  • Server Statistic|sorts (disk) -> DB_SORTS_DISK
  • Server Statistic|sorts (memory) -> DB_SORTS_MEMORY
  • Server Statistic|sorts (rows) -> DB_SORTS_ROWS
  • Server Statistic|user commits -> DB_USER_COMMITS
  • Server Statistic|Buffer cache hit ratio -> DB_BUFFER_CACHE_HIT_RATIO
  • Server Statistic|Full Scans/sec -> DB_FULL_SCANS
  • Server Statistic|SQL Cache Memory (KB) -> DB_MEM_SQL_CACHE
  • Server Statistic|Total Server Memory (KB) -> DB_MEM_TOTAL_SERVER
  • Server Statistic|Transactions/sec -> DB_TRAN_RATE
  • Server Statistic|COMMIT_SQL_STMTS -> DB_USER_COMMITS
  • Server Statistic|DIRECT_WRITE_TIME -> DB_DIRECT_PATH_WRITE
  • Server Statistic|DIRECT_WRITE_REQS -> DB_DIRECT_PATH_WRITE_COUNT
  • Server Statistic|DIRECT_READ_TIME -> DB_DIRECT_PATH_READ
  • Server Statistic|DIRECT_READ_REQS -> DB_DIRECT_PATH_READ_COUNT
  • Server Statistic|DEADLOCKS -> DB_DEADLOCKS
  • Server Statistic|ROWS_UPDATED -> DB_ROWS_UPDATED
  • Server Statistic|ROWS_READ -> DB_ROWS_READ
  • Server Statistic|TOTAL_SORTS -> DB_SORTS_ROWS
  • Server Statistic|mempages_alloced -> DB_PAGE_ALLOCATED
  • Server Statistic|total_bytes_received -> DB_BYTES_RECEIVED
  • Server Statistic|total_bytes_sent -> DB_BYTES_SENT
  • Server Statistic|BytesReceived -> NET_IN_BYTE_RATE
  • Server Statistic|BytesSent -> NET_OUT_BYTE_RATE
  • Server Statistic|Transactions -> DB_TRANSACTIONS
  • Server Statistic|LogicalReads -> DB_LOGICAL_READS
  • Server Statistic|blks_hit -> DB_BLOCK_GETS_CACHE
  • Server Statistic|blks_read -> DB_BLOCK_GETS
  • Server Statistic|size_mb -> DB_FILE_SYSTEM_SIZE
  • Server Statistic|Mem_mapped -> DB_MEM_ALLOCATED
  • Server Statistic|Mem_virtual -> DB_MEM_VIRTUAL
  • Server Statistic|Network_bytesIn -> NET_IN_BYTE_RATE
  • Server Statistic|Network_bytesOut -> NET_OUT_BYTE_RATE
  • Server Statistic|OpCounters_delete -> DB_DELETE_COUNT
  • Server Statistic|OpCounters_insert -> DB_INSERT_COUNT
  • Server Statistic|OpCounters_update -> DB_UPDATE_COUNT
  • Server Statistic|Oplog_Max_Size -> DB_LOG_MAX_SIZE
  • Server Statistic|Oplog_Size -> DB_LOG_ALLOCATED_SIZE
  • Server Statistic|Bytes_received -> NET_IN_BYTE_RATE
  • Server Statistic|Bytes_sent -> NET_OUT_BYTE_RATE
  • Server Statistic|Com_delete -> DB_DELETE_COUNT
  • Server Statistic|Com_insert -> DB_INSERT_COUNT
  • Server Statistic|Com_update -> DB_UPDATE_COUNT
  • Server Statistic|Threads_running -> THREAD_COUNT
  • JMX|Web Container Runtime|Busy Threads -> BYVM_WCPOOL_LIVE_CLIENTS
  • JMX|Web Container Runtime|Maximum Threads -> BYVM_WCPOOL_SIZE
  • size -> SYS_MULTIPLICITY

Aggregated Metrics

Tier / Machine:

  • BYDISK_SIZE
  • BYDISK_TRANSFER_RATE
  • BYIF_BIT_RATE
  • BYIF_IN_BIT_RATE

Tag Mapping (Optional)

If the Business Service View is used, each machine is tagged using the tag type configured in the ETL. The tag value is the name of the machine's direct parent tier.

  • Machine (TSCO) / Machine (AppDynamics): Tag Type "AppDynamics App Tier"
