Parallel Processing of Kafka events with order in MuleSoft

As we know, Kafka is a streaming platform, and one of its classic use cases is feeding the same event to multiple consuming applications. This is achieved by giving each application its own consumer group ID.

Consider a scenario: a Kafka topic has four partitions, and a running application is actively consuming messages from it. For best performance, Kafka internally handles how messages are spread across the different partitions, so ordering across partitions is not guaranteed. Now say you are expecting more load in a specific window and need to enable horizontal autoscaling for your API, while the API also has to maintain event order to handle CRUD operations correctly. 🤯 How can this be achieved? 🤔🤔

You can't travel on multiple boats at the same time 😁 but Kafka can handle processing of multiple events at the same time 😅. This article explains how to process CRUD operation events in parallel with Kafka as the streaming platform. 👀



Exciting, isn't it? 😁 So letssss start!!

Kafka guarantees the ordering of events only when they are published to a single partition, and the Kafka Publish connector provided by MuleSoft has a configuration for exactly this. You can set the key parameter while publishing: Kafka treats this key like a primary key, routes all events carrying the same key to the same partition, and maintains FIFO order within that partition for downstream processing.
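The key-to-partition mapping can be sketched in plain Java. Note this is a simplified stand-in: the real Kafka default partitioner uses a murmur2 hash of the serialized key, and the topic and key names below are illustrative, but the principle is the same: same key, same partition, so order is preserved.

```java
import java.util.List;

public class KeyPartitioner {
    // Simplified stand-in for Kafka's default partitioner. The actual client
    // computes murmur2(keyBytes) % numPartitions; any stable hash shows the idea.
    public static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode() % numPartitions);
    }

    public static void main(String[] args) {
        int partitions = 4;
        // Every CRUD event published with the same key lands on the same
        // partition, so events for one record keep their FIFO order.
        for (String event : List.of("CREATE", "UPDATE", "DELETE")) {
            System.out.println("customer-42/" + event
                    + " -> partition " + partitionFor("customer-42", partitions));
        }
    }
}
```

Events with different keys may land on different partitions, which is what allows them to be processed in parallel without breaking per-key order.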

How messages are consumed from the available partitions depends on the number of consumers. If there are four partitions and a single consumer, that consumer reads from all four partitions. To scale up processing, additional consumers can be added to the same consumer group: with four partitions and two consumers, the load is divided equally. Note that you should not add more consumers than there are partitions, because each partition is assigned to exactly one consumer. For example, with four partitions we can have at most four active consumers; if a fifth consumer is added, it will simply sit idle.

Stay tuned for a complete, practical working example. See you then 😀

Please drop a comment with your valuable feedback!!

#kafkaWorld #haveFun #keepLearning
