Moviri - Pivotal Cloud Foundry Extractor


"Moviri Integrator for BMC Helix Continuous Optimization – PivotalCF" is an additional component of BMC Helix Continuous Optimization product. It allows extracting data from the Pivotal Cloud Foundry a unified, multi-cloud platform to manage enterprise applications at scale. Relevant capacity metrics are imported into BMC Helix Continuous Optimization, which provides advanced analytics over the extracted data in the form of an interactive dashboard, the Pivotal Cloud Foundry View.

The integration supports the extraction of performance and configuration data across the different components of the Pivotal Cloud Foundry Application Service (PAS) and can be configured via parameters that allow entity filtering and many other settings. The connector can also replicate relationships and logical dependencies among entities such as applications, organizations, and virtual machines. 

For the virtual machines under the Pivotal Cloud Foundry Application Service and Isolation Segment, the Pivotal CF connector extracts the component virtual machines and categorizes them into two groups: Diego Nodes and PCF Services. Diego Nodes are the component virtual machines that contribute to the Pivotal Cloud Foundry Diego component; all other component VMs are counted as PCF Services.
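
The split can be pictured with a minimal sketch (this is not the connector's code; the "diego" job-name prefixes below are an assumption used only for illustration):

  # Classify a component VM by its BOSH job name, assuming Diego-related
  # jobs (for example, diego_cell) start with a "diego" prefix.
  DIEGO_JOB_PREFIXES = ("diego_", "diego-")

  def categorize_vm(job_name):
      """Return the group a component VM would be counted under."""
      if job_name.lower().startswith(DIEGO_JOB_PREFIXES):
          return "Diego Node"
      return "PCF Services"

  print(categorize_vm("diego_cell/0"))  # Diego Node
  print(categorize_vm("router/0"))      # PCF Services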

For the applications under the Pivotal Cloud Foundry domain, the Pivotal CF connector extracts both the system default applications and the customized applications, and arranges them based on the hierarchy of organization, space, and application, aligned with the Pivotal Cloud Foundry application manager.

This documentation is targeted at BMC Helix Continuous Optimization administrators who are in charge of configuring and monitoring the integration between BMC Helix Continuous Optimization and Pivotal Cloud Foundry.

Step I. Complete the preconfiguration tasks

Step II. Configure the ETL

Step III. Run the ETL

Step IV. Verify the data collection


Application whitelist: show only applications from the listed spaces. Use space GUIDs, separating each space GUID with ";". (Required: No)
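
For example (a sketch only; the GUIDs below are placeholders), a value that whitelists two spaces looks like the line below. A space GUID can typically be retrieved with the Cloud Foundry CLI command "cf space <space-name> --guid":

  11111111-2222-3333-4444-555555555555;66666666-7777-8888-9999-000000000000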

Step I. Complete the Pre-Configuration Tasks

Step II. Configure the ETL

  1. In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
  2. On the ETL tasks page, click Add > Add ETL. The Add ETL page displays the configuration properties. You must configure properties in the following tabs: Run configuration, Entity catalog, Connection, PCF Connection, and PCF Extraction Filters.
  3. On the Run Configuration tab, select Moviri - PivotalCF Extractor from the ETL Module list. The name of the ETL is displayed in the ETL task name field. You can edit this field to customize the name.
  4. Click the Entity catalog tab, and select one of the following options:

    • Shared Entity Catalog:
      • From the Sharing with Entity Catalog list, select the entity catalog name that is shared between ETLs.
    • Private Entity Catalog: Select if this is the only ETL that extracts data from the Pivotal CF resources.

  5. Click the Connection tab, and configure the following properties:


    Hidden properties: the Pivotal Cloud Foundry ETL provides extra hidden properties that can be changed manually from the edit configuration page (see the example below).

    • To change the default polling period (1 minute) for Applications, Organizations, and Spaces data, set the hidden property "extract.pcf.app.serve.interval" to the desired number of seconds. For example, to change the default polling interval from 1 minute to 5 minutes, set the value to 300.
    • To change the default polling period (1 minute) for the Bosh deployment and its node data, set the hidden property "extract.pcf.bosh.serve.interval" to the desired number of seconds. For example, to change the default polling interval from 1 minute to 5 minutes, set the value to 300.
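
    For illustration only (the exact name/value layout on the edit configuration page may differ), setting both hidden properties to a 5-minute (300-second) polling interval looks like this:

      extract.pcf.app.serve.interval = 300
      extract.pcf.bosh.serve.interval = 300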

    All the other generic properties are documented here.

The ETL tasks page shows the details of the newly configured Pivotal Cloud Foundry ETL.


Step III. Run the ETL

After you configure the ETL, you can run it to collect data. You can run the ETL in the following modes:

A. Simulation mode: Validates the connection to the data source but does not collect data. Use this mode when you run the ETL for the first time or after you make any changes to the ETL configuration.

B. Production mode: Collects data from the data source.

Running the ETL in simulation mode

To run the ETL in the simulation mode:

  1. In the console, navigate to Administration > ETL & System Tasks, and select ETL tasks.
  2. On the ETL tasks page, click the ETL. The ETL details are displayed.
  3. In the Run configurations table, click Edit to modify the ETL configuration settings.
  4. On the Run configuration tab, ensure that the Execute in simulation mode option is set to Yes, and click Save.
  5. Click Run active configuration. A confirmation message about the ETL run job submission is displayed.
  6. On the ETL tasks page, check the ETL run status in the Last exit column.
    OK: Indicates that the ETL ran without any error. You are ready to run the ETL in the production mode.
  7. If the ETL run status is Warning, Error, or Failed:
    1. On the ETL tasks page, click the icon in the last column of the ETL name row.
    2. Check the log and reconfigure the ETL if required.
    3. Run the ETL again.
    4. Repeat these steps until the ETL run status changes to OK.

Running the ETL in the production mode

You can run the ETL manually when required or schedule it to run at a specified time. As of version 20.02.00.005, the integration delays the hierarchy import by six hours, which helps reduce the load on the loader step.

Running the ETL manually

  1. On the ETL tasks page, click the ETL. The ETL details are displayed.
  2. In the Run configurations table, click Edit to modify the ETL configuration settings. The Edit run configuration page is displayed.
  3. On the Run configuration tab, select No for the Execute in simulation mode option, and click Save.
  4. To run the ETL immediately, click Run active configuration. A confirmation message about the ETL run job submission is displayed.
    When the ETL is run, it collects data from the source and transfers it to the database.


Step IV. Verify Data Collection

Verify that the ETL ran successfully and check whether the Pivotal CF data is refreshed in the Workspace.

To verify whether the ETL ran successfully:

  1. In the console, click Administration > ETL & System Tasks > ETL tasks.
  2. In the Last exec time column corresponding to the ETL name, verify that the current date and time are displayed.

To verify that the Pivotal CF data is refreshed:

  1. In the console, click Workspace.
  2. Expand (Domain name) > Systems > PivotalCF > Instances.
  3. In the left pane, verify that the hierarchy displays the new and updated Pivotal CF instances.
  4. Click a Pivotal CF entity, and click the Metrics tab in the right pane.
  5. Check if the Last Activity column in the Configuration metrics and Performance metrics tables displays the current date.