Event Sourcing using Serialized and Kafka Connect

This guide shows you how to use Serialized with Kafka Connect to stream your events to downstream services via Apache Kafka.

Apache Kafka is a popular message broker with strong streaming and transformation capabilities. Kafka also ships with Kafka Connect, a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems.

Serialized Kafka Connector

A typical event-sourcing use case begins with safely writing an event to Serialized, but what happens afterwards depends on how you want to represent and consume your sequence of events. Often there are multiple services in your landscape interested in the written event. For one service you might choose to poll the event feed and maintain a service-local read model, while for another you might want full stream processing capabilities at hand.
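
If you go the polling route, reading the feed yourself is a plain HTTP call against the Serialized Feeds API. The sketch below is only an illustration with a placeholder feed name; check the Serialized API documentation for the exact endpoint and parameters.

# fetch the latest entries from a feed (placeholder feed name)
curl -s \
  -H "Serialized-Access-Key: <your-access-key>" \
  -H "Serialized-Secret-Access-Key: <your-secret-access-key>" \
  https://api.serialized.io/feeds/<feed-name>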

To give you that stream processing option, we at Serialized have developed a connector plugin for Kafka Connect!

Once installed, it automatically polls your events from Serialized and publishes them to your Kafka topic.

Enjoy a great combination of consistent storage with performant distribution and flexible transformation!

1) Download and install Kafka

Download Kafka from the Confluent site, via Scoop (scoop install kafka), or through a similar package manager.
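
On Linux, if you grabbed a tarball, you typically just extract it and step into the directory (the file name below is only an example; use whichever version you downloaded):

# extract the Kafka distribution and enter it; the remaining steps are run from this directory
tar -xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0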

2) Start Zookeeper Server

Linux

./bin/zookeeper-server-start.sh ./config/zookeeper.properties

Windows

bin\windows\zookeeper-server-start.bat config\zookeeper.properties

3) Start Kafka Server

Linux

./bin/kafka-server-start.sh ./config/server.properties

Windows

bin\windows\kafka-server-start.bat config\server.properties

4) Create destination topic

Linux

./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic serialized-events

Windows

bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic serialized-events

5) Ensure plugin directory is set

Make sure your plugin directory (plugin.path) is set in config/connect-standalone.properties (or config/connect-distributed.properties).
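
For reference, the relevant line looks something like this (the directory below is only an example; point it at wherever you keep your connector plugins):

# comma-separated list of directories Kafka Connect scans for connector plugins
plugin.path=/opt/kafka/plugins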

6) Create the connector config file

Create the config file, e.g. kafka/config/serialized-source.properties, and enter your Serialized API keys:

name=SerializedSourceConnector
connector.class=io.serialized.kafka.connect.SerializedSourceConnector
topic=serialized-events
serialized.access.key=<your-access-key>
serialized.secret.access.key=<your-secret-access-key>

Optional parameters

feed.name (defaults to _all)
batch.size (defaults to 100)
poll.delay.ms (defaults to 2000)
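
Putting it all together, a config that also spells out the optional parameters (shown here with their default values) would look like this:

name=SerializedSourceConnector
connector.class=io.serialized.kafka.connect.SerializedSourceConnector
topic=serialized-events
serialized.access.key=<your-access-key>
serialized.secret.access.key=<your-secret-access-key>
feed.name=_all
batch.size=100
poll.delay.ms=2000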

7) Build

git clone git@github.com:serialized-io/serialized-source-kafka-connector.git
cd serialized-source-kafka-connector
mvn clean package

8) Deployment

Copy target/kafka-source-connector-jar-with-dependencies.jar to your kafka/plugins directory (the directory you pointed plugin.path to in step 5).
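
On Linux, assuming you cloned and built the connector in a sibling directory of your Kafka installation, that copy step could look like this:

# create the plugin directory (if needed) and copy the fat jar into it
mkdir -p plugins
cp ../serialized-source-kafka-connector/target/kafka-source-connector-jar-with-dependencies.jar plugins/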

9) Run!

Linux

./bin/connect-standalone.sh ./config/connect-standalone.properties ./config/serialized-source.properties

Windows

bin\windows\connect-standalone.bat config\connect-standalone.properties config\serialized-source.properties

10) Post an event

Now go ahead and post an event to your Serialized project and watch it end up on your Kafka topic. Use the bin/kafka-console-consumer tool to verify that everything works as expected!
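
A quick way to try it out is with curl. The request below is only a sketch; the aggregate type, aggregate id, and event payload are made up, so check the Serialized API documentation for the exact request format for your project:

# post a made-up OrderPlaced event to the Serialized aggregates API
curl -i https://api.serialized.io/aggregates/order/events \
  -H "Content-Type: application/json" \
  -H "Serialized-Access-Key: <your-access-key>" \
  -H "Serialized-Secret-Access-Key: <your-secret-access-key>" \
  -d '{"aggregateId":"723ecfce-14e9-4889-98d5-a3d0ad54912f","events":[{"eventType":"OrderPlaced","data":{"customerId":"some-customer-id"}}]}'

Then watch the event arrive on the topic with the console consumer:

Linux

./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic serialized-events --from-beginning

Windows

bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic serialized-events --from-beginning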