Integrating with Apache Kafka
You can view the collected data in various BMC Helix applications and derive the following benefits:
| BMC Helix application | Type of data collected or viewed | Benefits |
|---|---|---|
| BMC Helix Operations Management | Events | Use a centralized event view to monitor and manage events, perform event operations, and filter events. Identify actionable events from a large volume of event data by processing events. For more information, see Monitoring events and reducing event noise. |
| BMC Helix Operations Management | Metrics | Use alarm and variate policies to detect anomalies and eliminate false positives for more accurate results while monitoring the health of your system. For more information, see Detecting anomalies by using static and dynamic thresholds. |
| BMC Helix Dashboards | Metrics | Create dashboards to get a consolidated view of data collected from third-party products across your environment. Improve the efficiency of your system by monitoring the key performance metrics, and respond to issues quickly to minimize downtime. For more information, see Creating custom dashboards. |
As a tenant administrator, perform the following steps to configure a connection with Apache Kafka, verify the connection, and view the collected data in various BMC Helix applications.
Supported versions
The Apache Kafka connector supports version 3.3.1 of Apache Kafka for data collection.
Planning for the connection
Review the following prerequisites to help you plan and configure a connection with Apache Kafka.
Apache Kafka prerequisites
BMC Helix Intelligent Integrations receives information from Apache Kafka in JSON format and applies JSLT mapping to transform the incoming information into events and metrics (in JSON format) that can be understood by BMC Helix Operations Management.
Prepare the event JSON, JSLT, and event class JSON
Before you start configuring a connection with Apache Kafka to collect events, prepare the event JSON, corresponding JSLT, and the event class JSON if you plan to use an event class other than the existing classes in BMC Helix Operations Management.
A standard event JSLT must contain the following parameters. If you have any additional parameters in the event JSON for which no mapping is available in the standard JSLT and for which you want to send information to BMC Helix Operations Management, include them in the extras section in the JSLT.
"severity":"",
"msg": "",
"status":"",
"_ci_id":"",
"source_hostname":"",
"source_identifier": "",
"details":"",
"source_attributes": {"external_id":""},
"creation_time": "",
"source_unique_event_id":"",
"ii_version": (.ii_version),
"class":" "
}
If you need to include mapping for any additional parameters for which no mapping is available in the standard JSLT, include them in the extras section. For example, EventID and EventType are additional parameters in the following sample event JSON. In the JSLT, these parameters are included in the extras section.
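As an illustration only, the mapping that such a JSLT performs can be sketched in plain Python. The incoming field names used here (Severity, Message, Status, Host, Timestamp, EventID, EventType) are hypothetical examples, not a fixed Apache Kafka schema, and the actual transformation in BMC Helix Intelligent Integrations is expressed in JSLT, not Python:

```python
# Sketch of the event mapping that a standard JSLT performs.
# Incoming field names are hypothetical; adapt them to your event JSON.

def map_event(incoming: dict) -> dict:
    """Map an incoming Kafka event JSON to the standard event fields.

    Parameters without a standard mapping (here EventID and EventType)
    are collected under the extras section.
    """
    standard_keys = {"Severity", "Message", "Status", "Host", "Timestamp"}
    return {
        "severity": incoming.get("Severity", ""),
        "msg": incoming.get("Message", ""),
        "status": incoming.get("Status", ""),
        "source_hostname": incoming.get("Host", ""),
        "creation_time": incoming.get("Timestamp", ""),
        "class": "KafkaEvent",
        # Any remaining parameters go into the extras section.
        "extras": {k: v for k, v in incoming.items() if k not in standard_keys},
    }

sample = {
    "Severity": "CRITICAL",
    "Message": "Disk usage above 90%",
    "Status": "OPEN",
    "Host": "hostA",
    "Timestamp": "2024-01-01T00:00:00Z",
    "EventID": "1001",          # no standard mapping -> extras
    "EventType": "DiskAlert",   # no standard mapping -> extras
}
print(map_event(sample)["extras"])
```

The same principle applies in the JSLT itself: map every parameter you can to a standard field, and route the remainder into extras so the information still reaches BMC Helix Operations Management.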
Prepare the metrics JSON and JSLT
Before you start configuring a connection with Apache Kafka to collect metrics, prepare the metrics JSON and the corresponding JSLT mapping.
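As a sketch only, a metrics JSON arriving on a Kafka topic might look like the following; the field names here are hypothetical placeholders, not a required schema, and your JSLT must map whatever fields your producers actually emit:

```json
{
  "host": "hostA",
  "metricName": "cpu_utilization",
  "value": 72.5,
  "unit": "percent",
  "timestamp": "2024-01-01T00:00:00Z"
}
```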
BMC Helix Intelligent Integrations prerequisites
- Depending on the location of the third-party product (SaaS, on-premises), choose one or more BMC Helix Intelligent Integrations deployment modes and review the corresponding port requirements. For information about various deployment modes and port requirements, see Deployment-scenarios.
- Based on the deployment modes, use the BMC Helix Intelligent Integrations SaaS deployment or the BMC Helix Intelligent Integrations on-premises gateway or both. For more information about the gateway, see Deploying-the-BMC-Helix-Intelligent-Integrations-on-premises-gateway.
- The on-premises gateway must be able to reach the third-party product on the required port (default is 7459).
In the preceding list, third-party product refers to Apache Kafka.
Configuring the connection with Apache Kafka
- Access BMC Helix Intelligent Integrations:
- BMC Helix Intelligent Integrations SaaS – Log on to BMC Helix Portal, and click Launch on BMC Helix Intelligent Integrations.
- BMC Helix Intelligent Integrations on-premises gateway – Use the following URL to access BMC Helix Intelligent Integrations:
https://<hostName>:<portNumber>/swpui
- On the CONNECTORS tab, click the add button in the SOURCES panel.
- Click the Apache Kafka tile.
- Specify a unique instance name.
- Specify the following details for the source connection:
- Specify the Apache Kafka host name and port number (the default port number is 7459).
- (Optional) Specify the Schema Registry URL of the Apache Kafka host with the HTTP or HTTPS port number, in the following format:
<protocol>://<hostName>:<portNumber>
For example, https://hostA:9001.
You can use this URL to extract the JSON data from an Avro-encoded JSON.
- Specify one of the following security protocols:
- PLAINTEXT (default)
- SASL_PLAINTEXT
- SSL
- SASL_SSL
- (Optional) Specify a comma-separated list of additional Apache Kafka brokers in <hostName>:<portNumber> format.
For example, hostA:7400,hostB:7500.
- Specify the additional parameters required for various authentication protocols:
- Secure Sockets Layer (SSL) key password
The password of the private key in the keystore file or the PEM key specified in ssl.keystore.key. Required if the SSL keystore location is configured.
- SSL truststore file location
- SSL truststore password
The password for the truststore location. If a password is not set, you can still use the truststore file, but the integrity check is disabled.
- SSL keystore file location
You can use this file for two-way client authentication.
- SSL keystore password
- Simple Authentication and Security Layer (SASL) Kerberos service name
The Kerberos principal name under which Kafka runs.
- SASL JAAS configuration
The JAAS login context parameters for SASL connections, in the format used by JAAS configuration files. For brokers, the configuration must be prefixed with the listener prefix and the SASL mechanism name in lowercase. Usually, the security protocol, SASL mechanism, and SASL JAAS configuration are specified together.
- SASL mechanism
Multiple SASL mechanisms can be enabled simultaneously on a broker, but each client chooses only one mechanism. Apache Kafka supports many SASL mechanisms, for example, GSSAPI (Kerberos authentication), OAUTHBEARER, SCRAM, PLAIN, Delegation Tokens, and LDAP.
- All properties
Use this field to specify any additional authentication parameters that are not available on the Source connection page. Specify the parameters in the parameter=value format. For example, to use the client DNS lookup parameter, specify client.dns.lookup=use_all_dns_ips in this field. To specify multiple parameters, use a comma-separated list.
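As a sketch of how these parameters fit together for a SASL_SSL connection (the host names, file paths, and credentials below are placeholders, not values from this product), the equivalent Kafka client properties would look like this, with extra settings such as client.dns.lookup supplied through the All properties field:

```properties
# Placeholder values; substitute your own hosts, paths, and credentials.
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="kafkaUser" password="kafkaPassword";
ssl.truststore.location=/etc/kafka/client.truststore.jks
ssl.truststore.password=truststorePassword
# Additional parameter supplied through the All properties field:
client.dns.lookup=use_all_dns_ips
```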
- Specify the maximum number of concurrent REST API requests to execute during a data collection schedule (the default value is 5).
- Specify the connection timeout, in seconds, after which no further attempt is made to establish a connection (the default value is 30).
- Click VALIDATE AND CREATE.
The specified connection details are validated, and the corresponding source connection is created in the Source Connection list. Select the source connection that you created from the list if it is not already selected.
- Ensure that the options for the data types for which you want to collect data are selected.
- Configure the collectors for the selected data types by clicking the respective data type in the Collectors section and specifying the parameters for the selected data type, as explained in the following table.
The ✅️ symbol indicates that the field applies to the data type.
- Click CREATE COLLECTORS to create the required collector streams for the selected data types.
- Configure the distributors for the selected data types by clicking the respective data type in the Distributors section and specifying the parameters for the selected data type, as explained in the following table:
- Click CREATE DISTRIBUTORS to create the required distributor streams for the selected data types.
- Click one of the following buttons:
- SAVE STREAM: Click this button if you want to edit the integration details before creating the instance. After you save the stream, the connector that you created is listed in the SOURCES panel. Move the slider to the right to start the data stream.
- SAVE AND START STREAM: Click this button if you want to save the integration details and start receiving data immediately.
For more information about the data streams, see Starting-or-stopping-data-streams.
Verifying the connection
From BMC Helix Intelligent Integrations, on the SOURCES panel, confirm that the data streams for the connection you created are running. Data streaming is indicated by moving colored arrows.
- A moving dark blue arrow indicates that the event stream is running. Event data is pushed according to the configured Collection Schedule interval.
- A moving red arrow indicates that the metric stream is running. Metric data is pushed according to the configured Collection Schedule interval.
Viewing data in BMC Helix applications
View data collected from Apache Kafka in multiple BMC Helix applications.
To view events in BMC Helix Operations Management
- In BMC Helix Operations Management, select Monitoring > Events.
- Filter the events by KafkaEvent class.
Incoming events from Apache Kafka are processed in BMC Helix Operations Management through a set of rules to determine whether an incoming event contains the results of the same test on the same node; the event is then processed accordingly. For more information, see Event-deduplication-suppression-and-closure-for-reducing-event-noise.
For information about events, see Monitoring and managing events.
To view metrics in BMC Helix Operations Management
- In BMC Helix Operations Management, select Monitoring > Devices.
- Click the link for the required device.
- On the Monitors tab, click the required monitor.
The Performance Overview tab shows the metrics graph.
For information about metrics, see Viewing collected data.