Learn how to use the Ably Kafka Connector with Confluent Cloud

Kafka is an incredibly powerful tool for distributed event streaming and stream processing, and it excels as a traditional backend data broker. Thanks to its impressive throughput, scalability, and reliability, Kafka is often used to engineer large-scale event-driven systems.

Confluent is one of the largest companies making it easy to manage Kafka, providing a broad set of tooling as well as cloud-hosted deployments which it manages for you.

What is the Ably Kafka Connector?

Whilst Kafka is an excellent choice for internal event streaming and stream processing, it’s not designed to distribute data to client devices.

As an example use case, imagine we run a tracked delivery service and are feeding the locations of all our drivers into Kafka, with Kafka doing the heavy lifting of storing and processing the raw GPS data.

Once the data has been stored and processed, we need a way to make the result available to users of our service, so they can see where our drivers are.

To allow all of these users to access the data, we need another service to handle this distribution. Ably is the perfect candidate: not only can it provide low-latency, highly reliable data distribution to millions of users, it also provides many of the additional bells and whistles you’d expect to keep your service secure and functional, such as Token Authentication.

The Ably Kafka Connector – a sink connector built on top of Kafka Connect – makes it simple to map your Kafka topics to Ably channels, and handles all the complexity involved in getting the data out of Kafka and into Ably.

In this tutorial, we’ll set up a Kafka cluster in Confluent Cloud, along with an instance of the Ably Kafka Connector to connect it to Ably.

Step 1 – Create your Ably app and API key

To follow this tutorial, you will need an Ably account. Sign up for a free account if you don’t already have one.

Access to the Ably global messaging platform requires an API key for authentication. API keys exist within the context of an Ably application and each application can have multiple API keys so that you can assign different capabilities and manage access to channels and queues.
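As a hypothetical illustration, an API key scoped to just our delivery channels might be created with a capability like the following (the delivery:* channel namespace is an example name; you configure capabilities in the dashboard when creating a key):

{
  "delivery:*": ["publish", "subscribe", "history"]
}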

You can either create a new application for this tutorial, or use an existing one.

To create a new application and generate an API key:

  1. Log in to your Ably account dashboard.
  2. Click the “Create New App” button.
  3. Give it a name and click “Create app”.
  4. Copy your private API key and store it somewhere. You will need it for this tutorial.

To use an existing application and API key:

  1. Select an application from “Your apps” in the dashboard.
  2. In the API keys tab, choose an API key to use for this tutorial. The default “Root” API key has full access to capabilities and channels.
  3. Copy the Root API key and store it somewhere. You will need it for this tutorial.

    Copy API Key screenshot
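Optionally, you can sanity-check the key before continuing by publishing a test message over Ably’s REST API. The channel name test-channel below is an arbitrary placeholder:

curl -X POST https://rest.ably.io/channels/test-channel/messages \
 -u "<YOUR_ABLY_API_KEY>" \
 -H "Content-Type: application/json" \
 --data '{"name":"greeting","data":"hello from curl"}'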

Setting up Kafka in Confluent Cloud

You’ll need to create an account with Confluent and sign in to your dashboard. A new account comes with free credits, which will more than cover everything we do in this tutorial.

After signing up, you’ll be prompted to create a new cluster. At the time of writing, your cluster must be hosted on AWS for you to have access to hosted Connectors, so make sure to select AWS as the provider.


Creating a Confluent Cluster

With that done, you have your Kafka cluster created and ready to go. We’ll still need to create a topic to use as our pipe for communication, though. Within your cluster, go to the ‘Topics’ option in the sidebar, and then select ‘Create topic’.


Starting to create a topic

Leave the default options as-is (topic_0 as the name, and 6 partitions), and don’t enable any additional settings such as schemas for now. The topic should now be created, and you can test it out by going to the Messages tab of the topic, and selecting Produce a new message to this topic.


Send a message to the topic

With that done, you should have a cluster hosted on AWS, managed by Confluent, with a topic which can receive and share messages.
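If you prefer the command line to the web UI, the Confluent CLI can do the same topic setup. The following is a sketch which assumes you have installed the confluent CLI, logged in, selected your environment and cluster, and configured a cluster API key (exact flags can vary between CLI versions, so check confluent kafka topic --help):

# Create the topic with the same defaults as the UI flow
confluent kafka topic create topic_0 --partitions 6

# Produce test messages interactively: type a message and press Enter
confluent kafka topic produce topic_0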

Uploading the Ably Kafka Connector

We now need to set up the Ably Connector to work with our cluster. To do this, we first need to go to the Connectors tab in the sidebar of Confluent, and select Add plugin in the top-right of the interface.


Starting to create a hosted connector

Note that at the time of writing you must be on an AWS-based cluster for the ‘Add plugin’ button to appear.

From this page, we need to add a few details for the Ably Connector. You’ll need a .zip file of the Ably Kafka Connector to upload, which you can download from the Confluent website.

Set the Plugin connector name to anything you’d like to use to identify the plugin, for example Ably Kafka Connector.

For the Connector class, set it to com.ably.kafka.connect.ChannelSinkConnector.

Set the Connector Type to be a Sink.

Finally, upload the Ably Connector’s .zip file you obtained.


Providing details to create a connector

Once submitted, there will be a short delay whilst the Connector is uploaded and prepared. After that, you should see the Ably Kafka Connector (or whatever you chose to call it) in the list of Connectors in Confluent.
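For reference, recent versions of the Confluent CLI can also upload custom connector plugins, so the steps above can be scripted. This is a sketch assuming your CLI version supports the connect custom-plugin commands, with ably-kafka-connector.zip as a placeholder for the file you downloaded (check confluent connect custom-plugin create --help for the exact flags):

confluent connect custom-plugin create "Ably Kafka Connector" \
 --plugin-file ably-kafka-connector.zip \
 --connector-class com.ably.kafka.connect.ChannelSinkConnector \
 --connector-type sink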


The Ably Kafka Connector has appeared

Running the Ably Kafka Connector on Confluent Cloud

Now that we have the Ably Kafka Connector available to our cluster, we can set up an instance of it to run and make use of the topic we created. If you select the Ably Kafka Connector from the list of Connectors, you’ll start the setup process.

On the first page, choose Global access, and press the Generate API Key and download button. This will automatically create the credentials you’ll need for interacting with the cluster. Make a note of the key and secret displayed to you, and then select Continue.


Starting to set up the Ably Connector Credentials

For the next page, we need to provide the key/value pairs which dictate our specific configuration of the connector. You can make use of the following JSON to get a default template running:

{
  "connector.class": "com.ably.kafka.connect.ChannelSinkConnector",
  "tasks.max": "3",
  "group.id": "ably-connect-cluster",
  "topics": "topic_0",
  "client.id": "Ably-Kafka-Connector",
  "channel": "#{topic}",
  "message.name": "#{topic}_message",
  "client.key": "<YOUR_ABLY_API_KEY>",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter.schemas.enable": "false"
}

You will need to replace <YOUR_ABLY_API_KEY> with the Ably API key you obtained in step 1 of this tutorial.

There are a few important things to note from this config:

  • We’ve set the topics to be topic_0. That means this Connector will be listening to the topic we created, topic_0, and handling messages from only that topic. We could add more topics as a comma-separated list, such as topic_0,topic_1 (see the snippet after this list).
  • The channel key is set to #{topic}. This means that we will be publishing messages from a topic into a channel of a matching name. In our case, messages which go into topic_0 will go to the Ably Channel topic_0. This can be adjusted to be based on message details, and can also contain hard-coded text. For example, we could set channel to ably_#{topic}, which would mean messages to the topic topic_0 would go to the Ably Channel ably_topic_0.
  • The message.name makes use of the topic name in the same way, but in this case it is attached as the name of each message.
  • The key.converter and value.converter are both set to StringConverter. This means record keys and values will be sent to Ably as their string representations.
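As an illustration of the first two points, here are the keys that would change if we wanted to consume two topics and prefix the resulting channel names; all other keys stay as in the template above:

{
  "topics": "topic_0,topic_1",
  "channel": "ably_#{topic}"
}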

With the JSON populated, continue to the next and final page we need to touch, where we add networking endpoints. Add the endpoint rest.ably.io:443:TCP. Go through the remaining pages via the Continue button, until the Ably Connector starts provisioning itself. This can take a few minutes, but once it’s done the Connector should be up and running, ready to listen for messages on the topic topic_0.

Publishing messages from the Kafka Topic to Ably

At this stage we have a Kafka Cluster, with a Topic, and an Ably Connector ready and waiting to send messages from our Topic to Ably Channels. Now’s a good chance to test it out.

Return to the Topics tab in your Confluent Cluster, go to topic_0, and select Produce a new message to this topic. You can send the template message as-is.

Locally in a terminal, we can check the History of the channel we expect the message to go to, topic_0, to verify the message was sent correctly:

curl https://rest.ably.io/channels/topic_0/messages \
 -u "MY_API_KEY"

If this has worked, you should see a response containing the message sent:

[
	{
		"id": "A12bc_dEFG:0:0",
		"timestamp": 2692417991422,
		"extras": {
			"kafka": {
				"key": "18"
			}
		},
		"data": "{\"ordertime\":1497014222380,\"orderid\":18,\"itemid\":\"Item_184\",\"address\":{\"city\":\"Mountain View\",\"state\":\"CA\",\"zipcode\":94041}}",
		"name": "topic_0_message"
	}
]
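To watch messages arrive in real time, as an end user’s device would, you can also open a live subscription to the channel. One quick, dependency-free way is Ably’s SSE endpoint (a production client would normally use an Ably SDK with Token Authentication rather than a raw API key):

curl -s -u "MY_API_KEY" \
 "https://realtime.ably.io/sse?channels=topic_0&v=1.2"

Leave this running in one terminal, produce another message from the Confluent UI, and you should see it printed as an event.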

Conclusion

With the above running, we now have the ability to send messages from Kafka in Confluent into an Ably Channel. From here, we can distribute these messages to as many users as desired. This can be incredibly powerful for many use cases, such as vehicle tracking, ticket booking systems, chat apps, and generally any scenario where you’re interested in distributing data to end users. Find out more about these use cases and best practices in this article.