Prerequisites
This section describes the requirements that must be in place before you begin the installation.
The Apache Kafka KM and the BMC producer tool require a local installation of Apache Kafka.
Ensure that Apache Kafka is deployed locally after you install the Apache Kafka KM and deploy the BMC producer. BMC recommends that the local Apache Kafka version you deploy be the same as the version of the Kafka cluster you want to monitor.
Points to remember
Setting up the environment
Create a topic covering all relevant brokers.
Identify where the different Producers must run. The Producers need to run automatically in the background (for example, as a service).
Configure the Producers to send messages in a size and format that properly simulate the typical activity of the Kafka system.
To simulate the way the Kafka platform works, you must decide where to deploy the BMC producer on your destination computers.
The BMC producer is a CLI tool included in the Apache Kafka KM installation. You must deploy and configure it.
You must provide the following configuration details:
Kafka Brokers – The brokers inside the cluster to connect to
Message size – The size of each message, in bytes
Message block – The number of message blocks to send
Topic – The predefined topic to use
Schedule – The interval, in minutes, at which the tool must run
The BMC Producer tool acts as a Kafka Producer. The tool connects to your Kafka cluster and starts sending messages to the predefined topic.
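As an illustration of the kind of traffic a Kafka Producer generates, the standard console producer that ships with every Apache Kafka installation can send messages to a topic in the same way. The broker address and topic name below are placeholder values, not part of the KM configuration:

```shell
# Illustration only: send three test messages to an example topic
# "km-monitor-topic" via a broker listening on localhost:9092.
# kafka-console-producer.sh ships with the Apache Kafka distribution.
printf 'msg-1\nmsg-2\nmsg-3\n' | \
  bin/kafka-console-producer.sh \
    --bootstrap-server localhost:9092 \
    --topic km-monitor-topic
```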
Static topic monitoring is based on a predefined topic, which can be created in one of two ways:
By the user – The user creates the topic with the Kafka API tools and decides the number of partitions and the number of replicas when creating it.
By the Apache Kafka KM – When you provide the topic information during the configuration process, the KM checks whether this topic is already configured on your Kafka cluster. If not, the KM creates the topic with the number of partitions and replicas based on the number of currently running Kafka brokers in the cluster.
For example, if your Kafka cluster has five Kafka brokers but only three are running, the KM creates the topic with three partitions and three replicas.
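For the user-created option, the kafka-topics tool shipped with Apache Kafka can create the topic. The topic name, partition count, and replication factor below are example values:

```shell
# Create a predefined topic with 3 partitions and a replication factor of 3
# (example values; kafka-topics.sh is part of the Apache Kafka distribution).
bin/kafka-topics.sh --create \
  --bootstrap-server localhost:9092 \
  --topic km-monitor-topic \
  --partitions 3 \
  --replication-factor 3

# Verify the partition and replica layout of the new topic.
bin/kafka-topics.sh --describe \
  --bootstrap-server localhost:9092 \
  --topic km-monitor-topic
```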
Producer prerequisites
Ensure that you set the JAVA_HOME and KM_KAFKA_HOME environment variables correctly.
For a Kerberos environment, ensure that the JAAS and krb5 configuration files are referenced in JVM_ARGS.
Set DEBUG=1 in the Producer arguments to see detailed log messages.
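The prerequisites above might be set as follows in a shell profile. The paths are placeholders for your environment; the JAAS and krb5 entries use the standard JVM system properties for Kerberos configuration:

```shell
# Example values only; adjust the paths for your installation.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
export KM_KAFKA_HOME=/opt/kafka

# Kerberos environments: point the JVM at the JAAS and krb5 configuration
# files (standard JVM system properties).
export JVM_ARGS="-Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf \
-Djava.security.krb5.conf=/etc/krb5.conf"

# Enable detailed log messages from the Producer.
export DEBUG=1
```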