Creating a custom connector and policy


To access the latest information about this topic and all Cloud Security releases, check out the Release notes and notices.

Although BMC Helix Cloud Security (Cloud Security) provides some out-of-the-box (OOTB) connectors, such as the Amazon Web Services (AWS) connector for AWS account details and App Vulcanizer for application vulnerability data, OOTB connectors cannot capture and evaluate every type of data.

Cloud Security includes a Base connector, an extendable OOTB connector that allows you to publish different types of custom data (for example, logs, security data, infrastructure, cloud services, PaaS services, Internet of Things (IoT) data, and so on) within the same connector. The Base connector can be run in CLI mode or extended with very little coding.

The policies are written as code in YAML, so they can be easily authored as well as stored and compared using common version control tools. The policy authoring sandbox makes it easy to modify one of the standards-based compliance policies (for example, CIS for Docker Containers) or create a new policy from scratch. If a new technology is introduced, users can extend the base connector to retrieve and send the new data. Any data that can be captured or converted to JSON format can be evaluated with the policy engine.
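Because any JSON-convertible data can be evaluated, the evaluation flow itself is simple: parse the resource, apply a rule expression, report a verdict. The following Python sketch illustrates that flow only; it is not the Cloud Security policy engine, and the resource fields and rule shown are hypothetical:

```python
import json

# Hypothetical resource entry, as it might appear in a JSON feed file.
resource = json.loads("""
{
  "name": "web-server-01",
  "type": "vm",
  "sshPasswordAuth": false
}
""")

def evaluate_rule(resource, attribute, expected):
    """Toy compliance check: the resource is compliant when the named
    attribute equals the expected value. Real rule expressions are far
    richer; this only shows the JSON-in, verdict-out shape."""
    return "Compliant" if resource.get(attribute) == expected else "Non-compliant"

print(evaluate_rule(resource, "sshPasswordAuth", False))  # prints: Compliant
```

In Cloud Security itself, this comparison is expressed declaratively as a ruleExpression in the policy YAML rather than as imperative code.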

Finally, open REST APIs can be used to add low-friction compliance evaluation to any process. You call the API, specify the policy to evaluate, and get a fast, synchronous response to ensure compliance rules have been met before continuing.
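As an illustration of that call pattern, the sketch below builds such a request in Python. The endpoint path, payload field names, and bearer-token auth scheme are all hypothetical placeholders, not the documented Cloud Security API; consult the product's REST API reference for the real contract:

```python
import json
import urllib.request

def build_evaluation_request(base_url, policy_name, resources, api_token):
    """Builds a synchronous policy-evaluation request.
    NOTE: the /api/v1/policies/evaluate path, the 'policy'/'resources'
    payload fields, and the Bearer auth header are hypothetical."""
    body = json.dumps({"policy": policy_name, "resources": resources}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/v1/policies/evaluate",  # hypothetical path
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",  # hypothetical auth scheme
        },
        method="POST",
    )

# A pipeline step would send this request with urllib.request.urlopen and
# block on the synchronous response before continuing.
req = build_evaluation_request(
    "https://cloudsecurity.example.com", "my-custom-policy",
    [{"name": "web-server-01", "type": "vm"}], "TOKEN",
)
```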

By customizing the Base connector and creating your own policies, you can collect data constantly or on a user-defined schedule. The custom connector can publish data in the following modes:

  • On demand: Publishes bulk resource files to Cloud Security for policy evaluation on demand.
  • Schedule: Publishes bulk resource files to Cloud Security for policy evaluation on a schedule.
  • Event-driven: Publishes bulk resource files to Cloud Security asynchronously, as soon as the resource files are copied to a particular watched directory.
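Under the hood, the event-driven mode amounts to watching a directory and publishing each feed file as it appears. The Python sketch below illustrates that pattern with a simple poll loop; publish_feed is a hypothetical stand-in for the upload step, and the real connector's mechanism may differ:

```python
import os
import time

def watch_directory(directory, publish_feed, polls=3, interval=0.1):
    """Illustrative poll loop: publish each .json feed file the first
    time it appears in the watched directory (event-driven mode)."""
    seen = set()
    published = []
    for _ in range(polls):
        for name in sorted(os.listdir(directory)):
            if name.endswith(".json") and name not in seen:
                seen.add(name)
                publish_feed(os.path.join(directory, name))  # hypothetical upload
                published.append(name)
        time.sleep(interval)
    return published
```

A real connector would run such a loop indefinitely (or use OS file-system notifications) instead of a fixed number of polls.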

The following tasks describe how to create a custom connector and policy:

Configuring the Base connector

  1. Log on to Cloud Security with your registered credentials.
  2. Choose Configure icon > Connectors and click Add Connector.
  3. Under Connector Type > On Premise Connectors (Installable), click Base Connector and then click Continue.
  4. In the Name your connector field, specify a name for the connector.
    This name must be unique.
    If the name that you enter is not already displayed on the Manage Connectors page, a green check mark and an available label appear next to the field.
  5. Click Continue.
  6. If the download does not start automatically, click Download Connector setup, and unzip the downloaded file using any standard compression tool.
    The zip file has the name that you specified for the connector in Step 4.
    1. (Windows) Double-click run.bat to run the connector in your target environment.

    2. (Linux) Execute the command chmod +x run.sh to grant execute permissions to the run.sh file. Then run the connector using the run.sh command.

      Note

      The time to complete data collection depends upon the number of targets that you have in your environment. Leave the command window open and switch over to the UI.
  7. Click Continue.

Creating the custom policy

To create your new policy, you need to have a JSON resource feed file and a YAML policy file.

Tip

Prior to creating a custom policy, review the reference topics, such as Authoring policies.

To create the custom policy, complete the following steps:

  1. On the Cloud Security Dashboard, click Manage > Policies.
  2. Click Authoring Sandbox. The Authoring Sandbox is displayed.
  3. In the Resource JSON section, do one of the following:
    • Click Create New, and then create a new-resource.json file in the text area. 
    • Click Select File and select a JSON file that contains resource information.
  4. In the Policy YAML section, do one of the following:
    • Click Create New, and then create a new-policy.yaml file in the text area.
    • Click Select File and select a YAML file that contains policy information.
  5. When creating the policy, note the following key expressions and operators to make sure they reflect the goal of the policy:
    • resourceSpec - Instructs the policy how to parse the list of resources (expression), how to parse each resource's display name (nameExpression), and the type of resources (typeExpression).
    • exportedVariables - Instructs the policy which variables to show for a specific resource.
    • severity - Defines whether a rule is displayed as Low (1-4), Medium (5-6), High (7-8), or Critical (9+).
    • ruleExpression - Defines how the policy determines whether a resource is compliant or non-compliant. For more information about the expressions and operators, see Authoring policies.

  6. Once you have the YAML file in place, you can validate the content using the Validate Rules and Validate for Compliance options above the Policy YAML section. To view the results of the validation process, click the Results option in the bottom left section of the screen.
  7. Click Status to ensure that both files are valid.
  8. Click Save Policy to File in the respective sections to save each of the new files and make them available to the library.
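Putting the keys from step 5 together, a minimal policy YAML might look like the following sketch. The key names (resourceSpec, exportedVariables, severity, ruleExpression) come from this topic; every value, the rule name, and the expression syntax are hypothetical, so see Authoring policies for the exact grammar:

```yaml
# Hypothetical policy sketch -- key names from this topic, values illustrative.
name: sample-vm-ssh-policy
rules:
  - name: disallow-password-ssh
    resourceSpec:
      expression: "$.resources"       # how to parse the list of resources
      nameExpression: "$.name"        # display name of each resource
      typeExpression: "$.type"        # type of each resource
    exportedVariables:
      - sshPasswordAuth
    severity: 8                       # 7-8 is displayed as High
    ruleExpression: "sshPasswordAuth == false"
```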

Adding the policy to the library 

To add your new custom policy to the list of available policies, do the following:

  1. Navigate to the Manage Policies screen.
  2. Choose Import Policy > Import from disk.

  3. Specify the location where you saved your policy YAML file.
  4. On the Add Custom Policy dialog box, specify a name for the policy.
  5. Click Save.

Note: The newly added policy might not appear on the screen immediately. Refresh the screen to view it.

Collecting custom data

To execute your custom connector and policy, complete the following steps:

  1. Click the newly added policy, and then navigate to the Execution Schedule tab.
  2. (Optional) If you have downloaded more than one instance of the Base connector, select a connector from the Connector drop-down list, and then click Assign Connector to assign the newly created policy to that specific instance.
  3. Navigate to the directory where you downloaded the Base connector.
  4. Unzip the Base Connector.zip file using any standard compression tool.
  5. Run the connector:
    • (Windows) Double-click run.bat to run the connector in your target environment.
    • (Linux) Execute the command chmod +x run.sh to grant execute permissions to the run.sh file. Then run the connector using the run.sh command.

The following list shows the parameter syntax (short), the alternative syntax (long), and a description for each parameter that you can use with the run command:

  -d <arg>  |  --directoryToWatch <arg>
      Specify the directory path in which the feed files that are to be evaluated reside.
  -f <arg>  |  --feed <arg>
      Specify the path of the file containing the data that must be published. You can also specify a directory path if you want to publish the data from multiple files.
  -h        |  --help
      Display help.
  -s <arg>  |  --schedule <arg>
      Specify the schedule for the data collection process in minutes. Use this option to collect data from a specified directory.
  -t <arg>  |  --tags <arg>
      Specify the tags that must be used to match the resource feeds against policies.
  -p <arg>  |  --policy <arg>
      Provide the name of the policy to evaluate against the feed.
  -y <arg>  |  --yaml <arg>
      Provide the path of the policy YAML file to evaluate against the feed.

The Base connector can operate in the following modes; each entry shows the command syntax and a corresponding example:

  • To collect data from a particular file or location:
    Syntax: run -f <resource feed file directory>
    Example: run -f C:/John/MyDirectory/sample-feed.json
  • To collect data at a particular interval from a specific directory:
    Syntax: run -d <directory to watch for feed files> -s <schedule in minutes>
    Example: run -s 150 -d C:/John/MyDirectory
  • To collect data as and when it is generated at a particular location:
    Syntax: run -d <directory to watch for feed files>
    Example: run -d C:/John/MyDirectory
  • To collect data using a selection hint in the feed:
    Syntax: run -f <resource feed file directory> -t <selectionHint>
    Example: run -d C:/John/MyDirectory -t DepChecker
  • To read the configuration from a file:
    Syntax: run
  • To collect data using an existing policy name:
    Syntax: run -f <resource feed file> -p <policyName>
    Example: run -f C:/Some/directory/sample-feed.json -p sample-policy-name
  • To collect data using a policy YAML file:
    Syntax: run -f <resource feed file> -y <policy yml file>
    Example: run -f C:/Some/directory/sample-feed.json -y C:/Some/directory/sample-policy.yml
  • To view help:
    Syntax: run -h

The time to complete data collection depends upon the number of resources that you specify in the JSON feed file. Leave the command window open and switch to the Cloud Security UI to view the evaluation results.

Performing next steps

To learn more about authoring policies, see Authoring policies.
