
Apache Kafka

By default, the Conduit Platform supports Apache Kafka as both a source and a destination.

The Apache Kafka destination connector connects to a Kafka cluster and produces records to a topic.

Required Configurations

| Name | Description | Required |
| --- | --- | --- |
| `servers` | A list of Kafka bootstrap servers (i.e. brokers), which will be used to discover all the brokers in a cluster. | Yes |

Looking for something else? See advanced configurations.
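As an illustration, a minimal pipeline configuration file wiring a source into the Kafka destination might look like the sketch below. The pipeline and connector IDs, the file source, and all setting values are hypothetical; only `servers` is required by the Kafka destination.

```yaml
# Sketch of a Conduit pipeline configuration file (all IDs and values illustrative).
version: 2.2
pipelines:
  - id: file-to-kafka           # hypothetical pipeline ID
    status: running
    connectors:
      - id: source
        type: source
        plugin: builtin:file    # hypothetical source, used only to complete the example
        settings:
          path: ./input.txt
      - id: destination
        type: destination
        plugin: builtin:kafka
        settings:
          servers: "localhost:9092,localhost:9093"  # bootstrap servers (brokers)
          topic: "orders"
```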

Output format

The output format can be adjusted using configuration options provided by the connector SDK:

  • sdk.record.format: used to choose the format
  • sdk.record.format.options: used to configure the specifics of the chosen format

See this Conduit article for more information on configuring the output format.
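As a sketch, the two options could be set in the destination's settings like this. The format name and template shown are assumptions based on the SDK's templated output format; consult the linked article for the authoritative values.

```yaml
# Sketch: destination settings choosing a templated output format (values illustrative).
settings:
  servers: "localhost:9092"
  sdk.record.format: "template"
  sdk.record.format.options: '{{ printf "%s" .Payload.After }}'
```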


Batching can also be configured using connector SDK provided options:

  • sdk.batch.size: maximum number of records in a batch before it gets written to the destination (defaults to 0, no batching)
  • sdk.batch.delay: maximum delay before an incomplete batch is written to the destination (defaults to 0, no limit)
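For example, a settings fragment (values illustrative) that collects up to 100 records per batch but flushes an incomplete batch after at most one second:

```yaml
# Sketch: batching configuration for the Kafka destination.
settings:
  servers: "localhost:9092"
  sdk.batch.size: 100    # flush once 100 records are buffered
  sdk.batch.delay: "1s"  # or after 1 second, whichever comes first
```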

Advanced Configurations

There is no global connector configuration; each connector instance is configured separately.

| Name | Description | Required | Default / Example |
| --- | --- | --- | --- |
| `topic` | The Kafka topic. It can contain a Go template that will be executed for each record to determine the topic. By default, the topic is the value of the `opencdc.collection` metadata field. | No | `orders` or `{{ index .Metadata "opencdc.collection" }}` |
| `clientID` | A Kafka client ID. | No | `conduit-connector-kafka` |
| `acks` | The number of acknowledgments from partition replicas required before receiving a response to a produce request. `none` = fire and forget, `one` = wait for the leader to acknowledge the writes, `all` = wait for the full ISR to acknowledge the writes. | No | `all` |
| `deliveryTimeout` | Message delivery timeout. | No | |
| `batchBytes` | Limits the maximum size of a request in bytes before it is sent to a partition. This mirrors Kafka's `max.message.bytes`. | No | `1000012` |
| `compression` | Compression applied to messages. Options: `none`, `gzip`, `snappy`, `lz4`, `zstd`. | No | `snappy` |
| `clientCert` | A certificate for the Kafka client, in PEM format. If provided, the private key needs to be provided too. | No | |
| `clientKey` | A private key for the Kafka client, in PEM format. If provided, the certificate needs to be provided too. | No | |
| `caCert` | The Kafka broker's certificate, in PEM format. | No | |
| `insecureSkipVerify` | Controls whether the client verifies the server's certificate chain and host name. If `true`, any certificate presented by the server and any host name in that certificate is accepted. | No | `false` |
| `saslMechanism` | The SASL mechanism to use. Options: `PLAIN`, `SCRAM-SHA-256`, `SCRAM-SHA-512`. If empty, authentication won't be performed. | No | |
| `saslUsername` | SASL username. If provided, a password needs to be provided too. | No | |
| `saslPassword` | SASL password. If provided, a username needs to be provided too. | No | |
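Putting several advanced options together, a destination settings fragment for a TLS-protected cluster with SASL authentication and per-record topic routing might look like this sketch. The broker address, certificate contents, and credentials are placeholders.

```yaml
# Sketch: TLS + SASL + templated topic routing (all values are placeholders).
settings:
  servers: "broker-1.example.com:9093"
  topic: '{{ index .Metadata "opencdc.collection" }}'  # route each record by its collection
  caCert: |
    -----BEGIN CERTIFICATE-----
    ...placeholder...
    -----END CERTIFICATE-----
  saslMechanism: "SCRAM-SHA-512"
  saslUsername: "conduit"
  saslPassword: "changeme"
```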