Confluent Cloud is a fully managed, cloud-native service for connecting and processing all of your data, everywhere it’s needed. You can use it as an upstream or downstream resource in your Turbine data apps by specifying the topic to read from or write to.
Meroxa supports two variations of Kafka as a resource:
- Apache Kafka
- Confluent Cloud
To add a Confluent Cloud resource, you will need the following credentials for the connection URL:
- API key
- API secret
- Bootstrap Server
Use the `meroxa resource create` command to configure your Confluent Cloud resource.
The following example shows how to create a Confluent Cloud resource named `confluentcloud` with the minimum required configuration:

```shell
meroxa resource create confluentcloud \
  --type confluentcloud \
  --url "kafka+sasl+ssl://$API_KEY:$API_SECRET@$BOOTSTRAP_SERVER?sasl_mechanism=plain"
```
In the example above, replace the following variables with valid credentials from your Confluent Cloud console:
- `$API_KEY` - Cluster API key
- `$API_SECRET` - Cluster API secret
- `$BOOTSTRAP_SERVER` - Host and port of the cluster
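As a sketch, the credentials above can be exported as shell variables and composed into the connection URL before running the create command. The credential values below are hypothetical placeholders, not real keys:

```shell
# Hypothetical placeholder credentials -- substitute the real values
# from your Confluent Cloud console.
API_KEY="ABCXYZ123"
API_SECRET="s3cr3t"
BOOTSTRAP_SERVER="pkc-12345.us-east-1.aws.confluent.cloud:9092"

# Compose the connection URL in the form the create command expects.
URL="kafka+sasl+ssl://${API_KEY}:${API_SECRET}@${BOOTSTRAP_SERVER}?sasl_mechanism=plain"
echo "$URL"
```

Passing the composed `$URL` to `--url` keeps the secret out of your shell history if the variables are sourced from a file excluded from version control.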
The following configuration is supported and optional:

- Whether to read the topic from the beginning (i.e. existing messages) or only new messages.
Data Record Format
With Kafka, you can pick a data format of your choice, but it must be used consistently when Kafka serves as the upstream source for a downstream resource. Meroxa currently supports only JSON.
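As a minimal sketch, a record written to or read from the topic would be a JSON document; the field names below are hypothetical, not a required schema:

```shell
# A hypothetical JSON-encoded record. Since Meroxa currently supports
# only JSON for Kafka data, producers and consumers should agree on it.
RECORD='{"id": 42, "email": "user@example.com"}'
echo "$RECORD"
```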
Confluent Cloud resources must reference an existing topic when used as an upstream or downstream resource in your Turbine data apps. You can create topics via the Confluent Cloud console or the Confluent CLI following the steps here.
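For example, a topic can be created with the Confluent CLI; the topic name and partition count below are hypothetical, and the command assumes you are already logged in with an environment and cluster selected:

```shell
# Hypothetical topic name and partition count -- adjust to your needs.
# Assumes prior `confluent login` and a selected environment/cluster.
confluent kafka topic create my_turbine_topic --partitions 6
```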