Increase maximum Kafka message size


If you see an org.apache.kafka.common.errors.RecordTooLargeException in the log file, the record being produced exceeds the currently configured Kafka size limits, and you need to raise several Kafka parameters on both the client and broker side.

Step-by-step guide

  1. Update the Kafka producer and consumer limits in swp.conf by adding the following lines to the end of the file (outside any existing braces), where LIMIT is a value in bytes larger than the size reported in the log message (the first sketch after this guide shows the equivalent standalone client properties):

    akka.kafka.producer {
      # Properties defined by org.apache.kafka.clients.producer.ProducerConfig
      # can be defined in this configuration section.
      kafka-clients {
        max.request.size = LIMIT
        buffer.memory    = LIMIT
      }
    }
    akka.kafka.consumer {
      # Properties defined by org.apache.kafka.clients.consumer.ConsumerConfig
      # can be defined in this configuration section.
      kafka-clients {
        max.partition.fetch.bytes = LIMIT
      }
    }
  2. Update the Kafka broker configuration by adding the following setting to ./conf/kafka/server.properties (a verification sketch follows this guide):

    message.max.bytes=LIMIT
  3. Restart the streamweaver service.
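
For reference, keys inside the kafka-clients blocks above are passed straight through to the standard Kafka client configuration (ProducerConfig and ConsumerConfig). The Java sketch below shows the same three properties applied to plain Kafka clients; it is illustrative only, and the 10 MiB limit, bootstrap address, group id, and String (de)serializers are assumptions rather than values taken from this product.

    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class LargeMessageClients {
        // Assumed limit of 10 MiB; use a value larger than the size in the log message.
        static final int LIMIT = 10 * 1024 * 1024;

        public static void main(String[] args) {
            // Producer side: mirrors the akka.kafka.producer.kafka-clients block in swp.conf.
            Properties producerProps = new Properties();
            producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            producerProps.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, LIMIT);     // max.request.size
            producerProps.put(ProducerConfig.BUFFER_MEMORY_CONFIG, (long) LIMIT); // buffer.memory

            // Consumer side: mirrors the akka.kafka.consumer.kafka-clients block.
            Properties consumerProps = new Properties();
            consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group"); // assumed group id
            consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, LIMIT); // max.partition.fetch.bytes

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
                 KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                // Clients built from these properties accept and fetch records up to roughly LIMIT bytes.
            }
        }
    }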
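
After the restart, the broker's effective message.max.bytes can be read back with Kafka's AdminClient as a quick sanity check. This is a minimal sketch; the bootstrap address and broker id "0" are assumptions and should be replaced with the values for your cluster.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.Config;
    import org.apache.kafka.common.config.ConfigResource;

    public class CheckBrokerMessageMax {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address

            try (AdminClient admin = AdminClient.create(props)) {
                // Broker id "0" is an assumption; use the id of the broker you edited.
                ConfigResource broker = new ConfigResource(ConfigResource.Type.BROKER, "0");
                Config config = admin.describeConfigs(Collections.singleton(broker))
                                     .all().get().get(broker);
                System.out.println("message.max.bytes = " + config.get("message.max.bytes").value());
            }
        }
    }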

