Kafka Emitter

To use this Apache Druid extension, include kafka-emitter in the extensions load list.

Introduction

This extension emits Druid metrics to Apache Kafka in JSON format.
Kafka has a rich ecosystem and a readily available consumer API, so if you already run Kafka, this extension makes it easy to integrate tools or UIs to monitor the status of your Druid cluster.
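Each record the emitter writes is a single JSON event. As a sketch of the consumer side, the snippet below deserializes a hypothetical sample service-metric record; the exact field set varies by event type and Druid version, and the field values here are illustrative assumptions:

```python
import json

# Hypothetical sample of a service-metric event as emitted to the metric topic.
# Real events carry additional fields depending on the metric and Druid version.
record = (
    '{"feed":"metrics","timestamp":"2024-01-01T00:00:00.000Z",'
    '"service":"druid/broker","host":"broker1:8082",'
    '"metric":"query/time","value":120}'
)

# A Kafka consumer would apply the same deserialization to each record value.
event = json.loads(record)
print(event["metric"], event["value"])  # query/time 120
```

A real consumer would read such records from the configured topics and feed them into its monitoring backend.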

Configuration

All the configuration parameters for the Kafka emitter are under druid.emitter.kafka.

| Property | Description | Required | Default |
|---|---|---|---|
| `druid.emitter.kafka.bootstrap.servers` | Comma-separated Kafka brokers (`[hostname:port],[hostname:port]...`). | yes | none |
| `druid.emitter.kafka.event.types` | Comma-separated event types. Supported types are `alerts`, `metrics`, `requests`, and `segment_metadata`. | no | `["metrics", "alerts"]` |
| `druid.emitter.kafka.metric.topic` | Kafka topic name the emitter sends service metrics to. If `event.types` contains `metrics`, this field cannot be empty. | no | none |
| `druid.emitter.kafka.alert.topic` | Kafka topic name the emitter sends alerts to. If `event.types` contains `alerts`, this field cannot be empty. | no | none |
| `druid.emitter.kafka.request.topic` | Kafka topic name the emitter sends request logs to. If `event.types` contains `requests`, this field cannot be empty. | no | none |
| `druid.emitter.kafka.segmentMetadata.topic` | Kafka topic name the emitter sends segment metadata to. If `event.types` contains `segment_metadata`, this field cannot be empty. | no | none |
| `druid.emitter.kafka.producer.config` | JSON configuration to set additional properties on the Kafka producer. | no | none |
| `druid.emitter.kafka.clusterName` | Optional name for your Druid cluster. It can help you group events in your monitoring environment. | no | none |
| `druid.emitter.kafka.extra.dimensions` | Optional JSON map of extra string dimensions to attach to emitted events. These can help you group events in your monitoring environment. | no | none |
| `druid.emitter.kafka.producer.hiddenProperties` | JSON configuration for sensitive Kafka producer properties such as username and password. This property accepts a `DynamicConfigProvider` implementation. | no | none |

Example

druid.emitter.kafka.bootstrap.servers=hostname1:9092,hostname2:9092
druid.emitter.kafka.event.types=["metrics", "alerts", "requests", "segment_metadata"]
druid.emitter.kafka.metric.topic=druid-metric
druid.emitter.kafka.alert.topic=druid-alert
druid.emitter.kafka.request.topic=druid-request-logs
druid.emitter.kafka.segmentMetadata.topic=druid-segment-metadata
druid.emitter.kafka.producer.config={"max.block.ms":10000}
druid.emitter.kafka.extra.dimensions={"region":"us-east-1","environment":"preProd"}
druid.emitter.kafka.producer.hiddenProperties={"config":{"sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\\"KV...NI\\" password=\\"gA3...n6a/\\";"}}
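With the `extra.dimensions` setting above, the configured key/value pairs accompany each emitted event. The sketch below illustrates that behavior with a simple merge in Python; the base event fields and the assumption that the extra dimensions appear as top-level keys in the emitted JSON are illustrative, not guaranteed by this document:

```python
import json

# Values from the druid.emitter.kafka.extra.dimensions example above.
extra_dimensions = {"region": "us-east-1", "environment": "preProd"}

# Hypothetical base metric event before the extra dimensions are attached.
base_event = {
    "feed": "metrics",
    "service": "druid/broker",
    "metric": "query/time",
    "value": 120,
}

# Assumption: the emitter merges the extra dimensions into each event's JSON.
emitted = {**base_event, **extra_dimensions}
print(json.dumps(emitted))
```

Consumers can then group or filter events by `region`, `environment`, or any other dimension you configure.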