Written by Muhammad Aqib Arif
Senior Software Engineer - Middleware
In this blog, we will see how to integrate Apache Kafka with MuleSoft. First, we'll create a Mule flow that publishes streaming data to Apache Kafka, and then we'll create a Mule flow that consumes Kafka streaming data in MuleSoft. Before we get into the details, let's learn a little about Apache Kafka.
Apache Kafka is an event streaming platform capable of handling trillions of events a day. Often thought of primarily as a messaging queue, Kafka is based on the concept of a distributed commit log, and it has rapidly evolved from a messaging queue into a complete event streaming platform. Apache Kafka is used to build real-time streaming apps and data pipelines. It is scalable, fault-tolerant, and incredibly fast.
MuleSoft, together with the Apache Kafka connector, enables you to connect to the Apache Kafka messaging system and achieve seamless integration between your Mule app and a Kafka cluster using the Mule runtime engine (Mule).
In this section of the blog, we'll see how to publish and consume Kafka streams using MuleSoft. Let's create two flows in the Mule project. The first flow will publish Kafka streams to the Kafka cluster, and the second flow will consume Kafka streams from the Kafka cluster.
- For the Topic field, the value of the property ${config.topics} is test
- For the Key field, the now() function is used to supply the current datetime
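The two flows described above can be sketched in Mule XML configuration roughly as follows. This is a minimal illustration, not the exact project config: the global-element names (HTTP_Listener_config, Kafka_Producer_config, Kafka_Consumer_config) and the HTTP trigger are assumptions, and the connector element names follow the Mule 4 Apache Kafka connector conventions.

```xml
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:kafka="http://www.mulesoft.org/schema/mule/kafka">

    <!-- Flow 1: publish incoming data to Kafka.
         The HTTP listener is an assumed trigger; any event source works. -->
    <flow name="kafka-publish-flow">
        <http:listener config-ref="HTTP_Listener_config" path="/publish"/>
        <!-- Topic is resolved from the ${config.topics} property ("test"),
             and the message key is the current datetime via DataWeave now() -->
        <kafka:publish config-ref="Kafka_Producer_config"
                       topic="${config.topics}"
                       key="#[now()]">
            <kafka:message>#[payload]</kafka:message>
        </kafka:publish>
    </flow>

    <!-- Flow 2: consume messages from the Kafka cluster and log them.
         The topic subscription is set in the consumer configuration. -->
    <flow name="kafka-consume-flow">
        <kafka:message-listener config-ref="Kafka_Consumer_config"/>
        <logger level="INFO" message="#['Consumed message: ' ++ payload]"/>
    </flow>
</mule>
```

Using the now() function as the message key gives each record a unique, time-ordered key; if you need related records to land on the same partition, a business key (e.g. an order ID) would be a better choice.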
If you are new to this, you can try the above steps to get started. If you are already planning to use the Kafka connector, reach out to us to see what out-of-the-box connectors we have to offer. You can email us at [email protected] or visit www.royalcyber.com