Confluent Cloud
Confluent Cloud is a fully managed, cloud-native service for connecting and processing all of your data, everywhere it's needed. It can be used as an upstream or downstream resource in your Turbine streaming apps by specifying the topic to read from or write to.
Meroxa supports two variations of Kafka as a resource:
- Apache Kafka
- Confluent Cloud
Setup
Credentials
To add a Confluent Cloud resource, you will need the following credentials for the connection URL:
- API key
- API secret
- Bootstrap Server
Follow the guide here to set up your API keys. Refer to your Cluster settings to retrieve the Bootstrap Server information.
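Once you have the three values, you can stage them as environment variables and assemble the connection URL that the resource expects. The credential values below are placeholders, not real credentials:

```shell
# Placeholder credentials -- substitute the real values from your Confluent Cloud console.
export API_KEY="ABCDEFGHIJKLMNOP"                                        # hypothetical Cluster API key
export API_SECRET="secret-value"                                         # hypothetical Cluster API secret
export BOOTSTRAP_SERVER="pkc-12345.us-east-1.aws.confluent.cloud:9092"   # hypothetical host:port

# Assemble the connection URL passed to `meroxa resource create --url`.
URL="kafka+sasl+ssl://$API_KEY:$API_SECRET@$BOOTSTRAP_SERVER?sasl_mechanism=plain"
echo "$URL"
```

Keeping the credentials in environment variables avoids hardcoding secrets into your shell history or scripts.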
Resource Configuration
Use the meroxa resource create command to configure your Confluent Cloud resource.
The following example shows how to create a Confluent Cloud resource named confluentcloud with the minimum required configuration.
$ meroxa resource create confluentcloud \
--type confluentcloud \
--url "kafka+sasl+ssl://$API_KEY:$API_SECRET@$BOOTSTRAP_SERVER?sasl_mechanism=plain"
In the example above, replace the following variables with valid credentials from your Confluent Cloud Console:
- $API_KEY - Cluster API Key
- $API_SECRET - Cluster API Secret
- $BOOTSTRAP_SERVER - Host and port of the cluster
Configuration Requirements
The following optional configuration is supported:
Configuration | Description |
---|---|
readFromBeginning | Whether to read the topic from the beginning (i.e. existing messages) or only new messages. |
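As a sketch, optional settings like readFromBeginning can typically be passed alongside the create command as JSON metadata. The --metadata flag and the exact key placement are assumptions here; verify against meroxa resource create --help before relying on them:

```shell
# Assumed: optional connector settings are supplied as a JSON metadata document.
META='{"readFromBeginning": true}'

# Hypothetical invocation (commented out; requires a logged-in meroxa CLI):
# meroxa resource create confluentcloud \
#   --type confluentcloud \
#   --url "kafka+sasl+ssl://$API_KEY:$API_SECRET@$BOOTSTRAP_SERVER?sasl_mechanism=plain" \
#   --metadata "$META"

echo "$META"
```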
Data Record Format
With Kafka, you can pick a data format of your choice, but it's important to use that format consistently when Kafka serves as an upstream source for a downstream resource. Currently, Meroxa supports only JSON.
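A minimal sketch of producing a JSON record value to a topic: the payload below is hypothetical, and the kcat invocation (commented out, since it needs live credentials and an installed kcat) shows one common way to write test messages to Confluent Cloud:

```shell
# Hypothetical JSON record value -- JSON is the only format Meroxa currently supports.
RECORD='{"id": 42, "email": "user@example.com"}'

# Validate that the payload is well-formed JSON before producing it.
echo "$RECORD" | python3 -m json.tool > /dev/null && echo "valid JSON"

# Optional: produce it to an existing topic with kcat (requires kcat and live credentials):
# echo "$RECORD" | kcat -b "$BOOTSTRAP_SERVER" -t my-topic -P \
#   -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
#   -X sasl.username="$API_KEY" -X sasl.password="$API_SECRET"
```

Validating records before producing them helps catch malformed payloads early, since downstream consumers will expect parseable JSON.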
Additional
A Confluent Cloud resource must reference an existing topic before it can serve as an upstream or downstream resource in your Turbine streaming apps. You can create topics via your Confluent Cloud console or the Confluent CLI by following the steps here.
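With the Confluent CLI, topic creation looks roughly like the sketch below. The topic name and partition count are placeholders, and the command itself is commented out because it requires a logged-in CLI with an active cluster:

```shell
# Hypothetical topic name for the resource to read from or write to.
TOPIC="my-topic"

# Requires `confluent login` and a selected Kafka cluster:
# confluent kafka topic create "$TOPIC" --partitions 6

echo "$TOPIC"
```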