Kafka Source Node

Quick Reference

Kafka Broker A comma-separated list of Kafka broker addresses. Format: host:port. ex: broker1:9092,broker2:9092

Topic The name of the Kafka topic to consume from. ex: my-topic

Consumer Group ID A unique identifier for the consumer group. ex: fleak-consumer-group

Encoding Type The format of the data in the Kafka topic.

Additional Properties Extra Kafka configuration options as key–value pairs. ex: key: security.protocol, value: SASL_SSL

Overview

The Kafka Source node allows your workflow to ingest real-time streaming data from an Apache Kafka cluster. It acts as a starting point for your data pipeline, continuously polling a specified topic for new events.

Configuration

| Field | Description | Required | Placeholder |
| --- | --- | --- | --- |
| Kafka Broker | A comma-separated list of host/port pairs for establishing the initial connection to the Kafka cluster (bootstrap servers). Format: host:port. | Yes | broker1:9092,broker2:9092 |
| Topic | The name of the Kafka topic to consume data from. | Yes | user_signups |
| Consumer Group ID | A unique string that identifies the consumer group. This ID allows Kafka to track which messages have already been processed (offsets). Changing this ID will reset the processing position to the earliest available message. | Yes | fleak-production-consumer |
| Encoding Type | Defines how the raw bytes from Kafka should be deserialized into Fleak events. Options: CSV, JSON Object, JSON Array, JSON Lines, String Line, Text, XML. Default: JSON Array. | Yes | JSON Array |
| Additional Properties | Key-value pairs for advanced Kafka consumer configuration (see below). | No | None |
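As a mental model, the required fields above correspond directly to standard Kafka consumer properties. The sketch below illustrates that mapping; the helper name is hypothetical and not part of the node's actual implementation:

```python
def source_config(brokers, group_id, additional=None):
    """Sketch: map the node's required fields onto standard Kafka
    consumer property names, then layer on any Additional Properties."""
    config = {
        "bootstrap.servers": brokers,  # Kafka Broker field
        "group.id": group_id,          # Consumer Group ID field
    }
    config.update(additional or {})    # Additional Properties override
    return config
```

For example, `source_config("broker1:9092,broker2:9092", "fleak-production-consumer")` yields the minimal property set a Kafka consumer needs to join the cluster and its consumer group.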

Advanced Configuration Properties

The Additional Properties section allows you to pass standard Kafka client configuration properties directly to the underlying consumer. This is commonly used for configuring security protocols, authentication credentials, or tuning client performance.

Note: Any standard Apache Kafka consumer configuration property can be entered here.

Click + Add to insert specific key-value pairs.

Common Use Cases

  • Security & Authentication: If your cluster requires authentication (e.g., SASL/SSL), you can provide the necessary JAAS config, security protocols, and mechanism settings here.
  • Connection Tuning: You can override default timeouts or connection settings (e.g., request.timeout.ms, session.timeout.ms) if your network environment requires it.

SASL/SSL Example

To connect to a cluster secured with SASL/SSL, add the following key-value pairs:

| Key | Value |
| --- | --- |
| security.protocol | SASL_SSL |
| sasl.mechanism | PLAIN |
| sasl.jaas.config | org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="pass"; |
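The sasl.jaas.config value is a single string, and the embedded double quotes and trailing semicolon are required. If you assemble it programmatically, a sketch like this avoids quoting mistakes (the helper name is illustrative):

```python
def plain_jaas_config(username, password):
    # Build the JAAS line for the PLAIN login module; note the
    # double-quoted credentials and the mandatory trailing semicolon.
    return (
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="{username}" password="{password}";'
    )
```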

Pre-set Defaults

The Kafka Source node ships with the following defaults. You can override any of these via Additional Properties.

| Property | Default Value |
| --- | --- |
| auto.offset.reset | earliest |
| max.poll.records | 5000 |
| session.timeout.ms | 10000 |
| fetch.min.bytes | 1048576 (1 MB) |
| fetch.max.wait.ms | 1000 |
| max.partition.fetch.bytes | 10485760 (10 MB) |
| enable.auto.commit | false |
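Override precedence can be pictured as a simple merge: the shipped defaults are applied first, and anything you set under Additional Properties wins. A sketch under that assumption (the names are illustrative, not the node's internals):

```python
NODE_DEFAULTS = {
    "auto.offset.reset": "earliest",
    "max.poll.records": 5000,
    "session.timeout.ms": 10000,
    "fetch.min.bytes": 1048576,             # 1 MB
    "fetch.max.wait.ms": 1000,
    "max.partition.fetch.bytes": 10485760,  # 10 MB
    "enable.auto.commit": False,
}

def effective_config(additional_properties):
    # Additional Properties take precedence over the shipped defaults.
    merged = dict(NODE_DEFAULTS)
    merged.update(additional_properties)
    return merged
```

For instance, setting session.timeout.ms to 30000 under Additional Properties replaces the default of 10000 while leaving every other default untouched.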