Apache Kafka

Apache Kafka is an open-source distributed event streaming platform, used by thousands of companies, that lets you read, write, store, and process events across many machines. It can be used as an upstream or downstream resource in your Turbine streaming apps by specifying the topic to read from or write to.

Meroxa supports two variations of Kafka as a resource:

  • Apache Kafka
  • Confluent Cloud

Setup

Credentials

To add an Apache Kafka resource, you will need the following information to build the connection URL:

  • Bootstrap server host and port, available in the server.properties file
  • Username and password, available in the KafkaServer section of the JAAS configuration file
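
For reference, the broker-side settings these values come from might look like the following (the hostnames, credentials, and listener settings below are placeholders; your broker's configuration may differ):

# server.properties (excerpt)
listeners=SASL_SSL://0.0.0.0:9092
advertised.listeners=SASL_SSL://broker.example.com:9092
sasl.enabled.mechanisms=PLAIN

# kafka_server_jaas.conf (excerpt)
KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="admin-secret"
  user_meroxa="meroxa-secret";
};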

Resource Configuration

Use the meroxa resource create command to configure your Apache Kafka resource.

The following example shows how to use this command to create an Apache Kafka resource named apachekafka with the minimum required configuration.

$ meroxa resource create apachekafka \
--type kafka \
--url "kafka+sasl+ssl://$KAFKA_USER:$KAFKA_PASS@$BOOTSTRAP_SERVER?sasl_mechanism=plain"

In the example above, replace the following variables with valid credentials from your Apache Kafka environment:

  • $KAFKA_USER - Apache Kafka Username
  • $KAFKA_PASS - Apache Kafka Password
  • $BOOTSTRAP_SERVER - Host and Port of the Kafka broker
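
With those variables substituted, the complete command would look something like this (the hostname, port, and credentials here are placeholders):

$ meroxa resource create apachekafka \
--type kafka \
--url "kafka+sasl+ssl://myuser:mypassword@broker.example.com:9092?sasl_mechanism=plain"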

Configuration Requirements

The following configuration option is supported and optional:

Configuration       Description
readFromBeginning   Whether to read a topic from the beginning (i.e. existing messages) or only new messages.
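
For example, if your workflow supplies resource or connector configuration as JSON, enabling this option amounts to a single key/value pair like the one below (a sketch only; the exact mechanism for passing this configuration is not covered on this page):

{
  "readFromBeginning": true
}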

Data Record Format

Kafka itself does not enforce a data format, so pick one and use it consistently when Kafka acts as an upstream source for a downstream resource. Currently, Meroxa only supports JSON.
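
For instance, a JSON event written to a Kafka topic with the standard console producer might look like this (broker address, topic name, client properties file, and record contents are placeholders):

$ kafka-console-producer.sh \
  --bootstrap-server broker.example.com:9092 \
  --topic orders \
  --producer.config client-sasl.properties
> {"order_id": 42, "status": "created", "amount": 19.99}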