Creating a custom pipeline configuration file to integrate with a third-party product
After you create a sandbox environment, run the sample pipeline configuration, and become familiar with the integration process, you can start creating custom pipeline configuration files to integrate a third-party product with BMC Helix Developer Tools. The following section uses SolarWinds as an example, and explains how to create a custom configuration file to push events from SolarWinds to BMC Helix Platform.
Before you begin
- Download and install the BMC Helix Developer Tools connector.
- Create a sandbox environment by using the Custom Integration template.
To create an event pipeline configuration file
- Log on to the host computer on which you have installed the BMC Helix Developer Tools connector.
- Identify the integration details.
For more information, see the default sandbox directory structure and configuration files. Run the following command to verify the current status of the connector:
tail -f /opt/bmc/connectors/<connector-name>/logs/fluent.log
The output shows that the connector is up and running and is logging the heartbeat periodically.
- Go to the /opt/bmc/connectors/<connector name>/custom directory.
An example connector name is Custom_integration_demo.
- Create a text file named solarwinds_event_mapping.json.
Add the mapping details as shown in the following example:
The solarwinds_event_mapping.json file maps SolarWinds event data to the BMC Helix Platform-compatible format by using the following mapping structure:
{
"EventMappingDetails": [
{
"inputkey": "Event",
"outputkey": "class",
"type": "constant"
},
{
"inputkey": "LocalEventTime",
"outputkey": "creation_time",
"type": "time",
"inputformat": "%Y-%m-%dT%H:%M:%S",
"outputformat": "EPOCH"
},
{
"inputkey": "EventID",
"outputkey": "_identifier",
"type": "assignment"
},
{
"inputkey": "DNS",
"outputkey": "source_hostname",
"type": "assignment"
},
{
"inputkey": "Message",
"outputkey": "msg",
"type": "assignment"
},
{
"inputkey": "status",
"outputkey": "status",
"type": "lookup",
"lookupMap": [{
"0": "OPEN",
"1": "ACK"
}]
}]
}
- Save the file.
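The mapping above can be exercised outside the connector to check that each rule type behaves as expected. The following Python sketch is illustrative only: it reimplements the four rule types (constant, assignment, time, and lookup) under stated assumptions (for example, that EPOCH means epoch seconds in UTC and that a constant rule emits its inputkey as a literal value); the actual bmc_ade_transformer plug-in may differ in these details.

```python
from datetime import datetime, timezone

# Hypothetical stand-in for the bmc_ade_transformer plug-in: applies the
# rules from solarwinds_event_mapping.json to one flattened SolarWinds event.
MAPPING = [
    {"inputkey": "Event", "outputkey": "class", "type": "constant"},
    {"inputkey": "LocalEventTime", "outputkey": "creation_time", "type": "time",
     "inputformat": "%Y-%m-%dT%H:%M:%S", "outputformat": "EPOCH"},
    {"inputkey": "EventID", "outputkey": "_identifier", "type": "assignment"},
    {"inputkey": "DNS", "outputkey": "source_hostname", "type": "assignment"},
    {"inputkey": "Message", "outputkey": "msg", "type": "assignment"},
    {"inputkey": "status", "outputkey": "status", "type": "lookup",
     "lookupMap": [{"0": "OPEN", "1": "ACK"}]},
]

def transform(event: dict) -> dict:
    out = {}
    for rule in MAPPING:
        kind = rule["type"]
        if kind == "constant":
            # Assumption: the inputkey itself is the fixed output value.
            out[rule["outputkey"]] = rule["inputkey"]
        elif kind == "assignment":
            # Copy the input field to the output slot unchanged.
            out[rule["outputkey"]] = event.get(rule["inputkey"])
        elif kind == "time":
            # Parse with inputformat; assumption: EPOCH = epoch seconds, UTC.
            ts = datetime.strptime(event[rule["inputkey"]], rule["inputformat"])
            out[rule["outputkey"]] = int(ts.replace(tzinfo=timezone.utc).timestamp())
        elif kind == "lookup":
            # Translate the raw value through the lookup table.
            table = rule["lookupMap"][0]
            out[rule["outputkey"]] = table.get(str(event.get(rule["inputkey"])))
    return out

sample = {
    "EventID": 42, "LocalEventTime": "2024-05-01T10:30:00",
    "DNS": "node1.example.com", "Message": "Node rebooted", "status": "1",
}
print(transform(sample))
```

Running the script prints the transformed event with the BMC Helix Platform slot names (class, creation_time, _identifier, source_hostname, msg, status), which is a quick way to validate mapping changes before restarting the pipeline.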
- Go to the /opt/bmc/connectors/<connector name>/data/<integration ID>/pipeline directory.
- Locate the generic_event_pipeline.conf file.
- Create a copy of this file and rename it to represent the custom third-party integration that you plan to create.
For example, rename it to solarwinds_event_pipeline.conf.
- Using a text editor, open the solarwinds_event_pipeline.conf file.
Uncomment the <source>, <filter>, and <match> directives, and provide the details as shown in the following example:
<source>
@type bmc_ade_http_pull
# Enable the debug level logging for the plug-in
@log_level debug
tag refreshEvents
url https://host.test.com:17778/SolarWinds/InformationService/v3/Json/Query?query=SELECT top 5 e.EventID, tolocal(e.EventTime) as LocalEventTime, e.NetObjectID, n.DNS, n.IP, e.EventType, et.Name as EventTypeName, et.Icon, e.Message, tostring(e.Acknowledged) as status, e.InstanceType, e.DisplayName, e.Description FROM Orion.Events e , Orion.EventTypes et, Orion.Nodes n where e.EventType = et.EventType and e.NetObjectID = n.NodeID and e.EventTime > ADDSECOND(-(((CURRENT_TIME/1000)-(LAST_FETCH_TIME/1000))*1),GETUTCDATE())
# Path to remember the state (config_file_path) must be an absolute path that is accessible from within Docker
config_file_path /fluentd/etc/custom/state_config.json
interval 20s
verify_ssl false
format json
user user1
password test123
</source>
# Flatten the JSON structure
<filter refreshEvents>
@type json_transform
transform_script flatten
</filter>
<filter refreshEvents>
@type record_modifier
whitelist_keys message.results
</filter>
# Write the flattened data to the fluentd.log file
<filter refreshEvents>
@type stdout
</filter>
# Transform the flattened data to the BMC Helix Platform compatible format
<filter refreshEvents>
@type bmc_ade_transformer
# Enable the debug level logging for the plug-in
@log_level debug
# Mapping file path (mapping_file) must be an absolute path that is accessible from within Docker
mapping_file /fluentd/etc/custom/solarwinds_event_mapping.json
mapping_json_key EventMappingDetails
input_result message.results
</filter>
# Write the transformed data to the fluentd.log file
<match refreshEvents>
@type stdout
</match>
# Send the transformed data to BMC Helix Platform
<match refreshEvents>
@type bmc_ade_http
# Enable the debug level logging for the plug-in
@log_level debug
endpoint_url <BMC Helix Portal URL>/events-service/api/v1.0/events
ssl_no_verify true # default: false
use_ssl true
http_method post # default: post
serializer json # default: form
rate_limit_msec 0 # default: 0 = no rate limiting
raise_on_error true # default: true
recoverable_status_codes 503, 400, 200 # default: 503
format json
</match>
For information about the parameters used in the plug-in, see bmc-ade-http-plug-in.
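A detail worth noting in the <source> URL: CURRENT_TIME and LAST_FETCH_TIME are tokens that the pull plug-in resolves on each poll, with the last fetch time persisted in the state file at config_file_path. The exact mechanics are internal to bmc_ade_http_pull; the following Python sketch only illustrates the idea of token substitution, using hypothetical epoch-millisecond values:

```python
# Illustrative only: how the placeholder tokens in the pull URL could be
# resolved before each poll. The real bmc_ade_http_pull plug-in handles this
# internally; the function name and sample values here are hypothetical.
def resolve_query(query: str, last_fetch_ms: int, current_ms: int) -> str:
    return (query
            .replace("LAST_FETCH_TIME", str(last_fetch_ms))
            .replace("CURRENT_TIME", str(current_ms)))

# The window clause from the example query: the difference of the two tokens
# (divided by 1000) is the number of seconds since the previous fetch.
query = "e.EventTime > ADDSECOND(-(((CURRENT_TIME/1000)-(LAST_FETCH_TIME/1000))*1),GETUTCDATE())"
resolved = resolve_query(query, last_fetch_ms=1714559000000, current_ms=1714559400000)
print(resolved)
```

With these sample values the window is 400 seconds, so each poll asks SolarWinds only for events newer than the previous fetch.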
- Save the file.
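Before wiring the pipeline into the sandbox integration, it can help to understand what the final <match> step does: serialize each transformed event as JSON and POST it to the events service. The following sketch mimics that behavior so you can inspect the request locally; the host name, payload shape (a JSON array of events), and the decision not to send are assumptions to adapt, and real submission also requires authentication.

```python
import json
import urllib.request

# Hypothetical portal host; replace with your BMC Helix Portal URL.
# The path matches the endpoint_url used in the <match> directive.
ENDPOINT = "https://helix-portal.example.com/events-service/api/v1.0/events"

def build_request(events: list) -> urllib.request.Request:
    """Build a POST request carrying transformed events as a JSON body."""
    body = json.dumps(events).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request([{"class": "Event", "msg": "Node rebooted",
                      "source_hostname": "node1.example.com"}])
# urllib.request.urlopen(req)  # sending requires valid credentials and TLS trust
```

Inspecting req.data before enabling the pipeline is a cheap way to confirm that the mapped slot names match what the events service expects.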
- Click the action menu of the sandbox integration that you have created, and select Edit.
- Select Events from the Entity Name list, and enter solarwinds_event_pipeline.conf in the Pipeline File text box.
- Click Update.
- Confirm a successful pipeline configuration:
- On the Configured Integrations tab, locate the configured integration tile that you created by using the Custom Integration template; for example, Custom_Integration_test.
You should see the count of ingested events on the configured tile.
- Log on to the BMC Helix Operations Management console, and select Monitoring > Events to view the ingested events.
Where to go from here
After you create a custom pipeline configuration file to ingest data from the product that you plan to integrate, parameterize the file so that end users can provide their own details.