Event Sourcing using Serialized and Kafka Connect
This guide shows how to use Serialized together with Kafka Connect to stream your event-sourced events to downstream services via Apache Kafka.
Apache Kafka is a popular message broker with great streaming and transformation capabilities. It also ships with Kafka Connect, a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems.
Serialized Kafka Connector
Writing an event safely to Serialized is how a typical event-sourcing use case begins, but what happens afterwards depends on how you want to process the sequence of events. Often there are multiple services in your landscape interested in the written event. For one service you might choose to poll the event feed and maintain a service-local read model, but sometimes you want full stream processing capabilities at hand.
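To make the feed-polling option concrete, here is a minimal Python sketch of turning a page of feed entries into Kafka-style key/value records. The feed-page shape used below (entries, aggregateId, events) is an assumption for illustration only, not the exact Serialized API response:

```python
import json

def feed_entries_to_records(feed_page):
    """Convert a page of feed entries into (key, value) pairs suitable
    for producing to a Kafka topic. The entry shape is an assumption
    for illustration: each entry carries an aggregateId and a list of
    events."""
    records = []
    for entry in feed_page.get("entries", []):
        for event in entry.get("events", []):
            # Keying by aggregateId sends all events for one aggregate
            # to the same Kafka partition, preserving their relative order.
            records.append((entry["aggregateId"], json.dumps(event)))
    return records

# Hypothetical feed page, for illustration only.
page = {
    "entries": [
        {
            "sequenceNumber": 1,
            "aggregateId": "order-123",
            "events": [{"eventType": "OrderPlaced", "data": {"amount": 42}}],
        }
    ],
    "hasMore": False,
}

for key, value in feed_entries_to_records(page):
    print(key, value)
```

Keying by aggregate identifier is a common design choice here, since Kafka only guarantees ordering within a partition.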
After installation you will have your events automatically polled from Serialized and published to your Kafka topic.
Enjoy a great combination of consistent storage with performant distribution and flexible transformation!
1) Download and install Kafka
Download the latest Kafka release from kafka.apache.org/downloads and extract the archive.
2) Start Zookeeper Server
./bin/zookeeper-server-start.sh config/zookeeper.properties
bin\windows\zookeeper-server-start.bat config\zookeeper.properties
3) Start Kafka Server
./bin/kafka-server-start.sh config/server.properties
bin\windows\kafka-server-start.bat config\server.properties
4) Create destination topic
./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic serialized-events
bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic serialized-events
5) Ensure plugin directory is set
Make sure your plugin directory (plugin.path) is set in config/connect-standalone.properties.
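For example, in config/connect-standalone.properties (the directory below is just an example; point it at wherever you keep your connector jars):

```properties
# Kafka Connect scans this directory for connector plugin jars.
plugin.path=/opt/kafka/plugins
```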
6) Create the connector config file
Create the config file, e.g. kafka/config/serialized-source.properties, and enter your Serialized API keys.
name=SerializedSourceConnector
connector.class=io.serialized.kafka.connect.SerializedSourceConnector
topic=serialized-events
serialized.access.key=<your-access-key>
serialized.secret.access.key=<your-secret-access-key>
Optional settings:
feed.name (defaults to _all)
batch.size (defaults to 100)
poll.delay.ms (defaults to 2000)
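For reference, a complete serialized-source.properties with the optional settings written out might look like this (the values shown are the defaults; the API keys are placeholders):

```properties
name=SerializedSourceConnector
connector.class=io.serialized.kafka.connect.SerializedSourceConnector
topic=serialized-events
serialized.access.key=<your-access-key>
serialized.secret.access.key=<your-secret-access-key>
feed.name=_all
batch.size=100
poll.delay.ms=2000
```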
7) Build and install the connector
git clone git@github.com:serialized-io/serialized-source-kafka-connector.git
cd serialized-source-kafka-connector
mvn clean package
Copy target/kafka-source-connector-jar-with-dependencies.jar to your plugin directory.
8) Start the connector
./bin/connect-standalone.sh ./config/connect-standalone.properties ./config/serialized-source.properties
bin\windows\connect-standalone.bat config\connect-standalone.properties config\serialized-source.properties
Now go ahead and post an event to your Serialized project and watch it end up on your Kafka topic. Use the tool bin/kafka-console-consumer.sh to verify everything works as expected:
./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic serialized-events --from-beginning