A 100% delivery (and processing) guarantee, from producer to consumer, in Kafka

I am building a message queue system in an application. It is necessary that EVERY single message the producer sends to the queue is delivered to, and processed by, the consumer. Absolutely no data loss is allowed.

I am thinking of using Kafka or RabbitMQ.

RabbitMQ only deletes a message from the queue once it has been delivered to, and acknowledged by, the consumer.

But I can't find a 100% delivery guarantee for Kafka. Messages are stored for a configured retention period, or until the storage limit is reached. If that period expires (or the limit is exceeded) while undelivered messages are still in the topic, those messages could be lost.

So my question is: how, if at all, does Kafka ensure a 100% delivery guarantee to the consumer?



Solution 1:[1]

Kafka can provide strong delivery guarantees within the retention period of the topic (Kafka uses topics, not queues). You can also disable retention entirely.
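For example, retention can be disabled at the topic level with the standard `retention.ms` and `retention.bytes` settings (a value of `-1` means unlimited; the topic name below is a placeholder):

```properties
# Topic-level configuration: keep records forever
# (broker will never delete them by time or size)
retention.ms=-1
retention.bytes=-1
```

These can be set when creating the topic, or altered later with the `kafka-configs` tool. Note that consumers must still read the data before any *producer-side* failure loses it; retention only governs how long the broker keeps what it has accepted.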

With some extra settings you can get exactly-once semantics (EOS); otherwise the default behavior is at-least-once.
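As a sketch, the producer-side settings usually involved are below. `acks=all` and idempotence give at-least-once with no duplicates from retries; adding a `transactional.id` (the value here is a placeholder) enables the transactional API used for exactly-once processing:

```properties
# Producer configuration for strong delivery guarantees
acks=all                  # wait for all in-sync replicas to acknowledge
enable.idempotence=true   # broker de-duplicates producer retries
retries=2147483647        # retry sends indefinitely on transient errors

# Additionally, for exactly-once (transactional) processing:
transactional.id=my-app-producer-1   # placeholder id, must be unique per producer
```

On the consumer side, the matching step is to disable auto-commit (`enable.auto.commit=false`) and commit offsets only after a record has actually been processed, so a crash mid-processing causes a re-read rather than a loss.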

You can track consumer-group offsets to determine how much of the topic has been processed.
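The quantity to watch is consumer lag: the log-end offset of a partition minus the group's committed offset, which tools such as `kafka-consumer-groups.sh` report as LAG. A minimal sketch of that arithmetic (the function names are illustrative, not a Kafka API):

```python
def consumer_lag(log_end_offset: int, committed_offset: int) -> int:
    """Records not yet processed in one partition (Kafka reports this as LAG).

    The committed offset is the *next* offset the group will read, so a lag
    of 0 means every produced record has been processed and committed.
    """
    return max(log_end_offset - committed_offset, 0)


def total_lag(end_offsets: dict[int, int], committed: dict[int, int]) -> int:
    """Sum lag across partitions; a partition with no commit counts from 0."""
    return sum(
        consumer_lag(end, committed.get(partition, 0))
        for partition, end in end_offsets.items()
    )


# Partition 0 is fully caught up; partition 1 still has 5 records pending.
print(total_lag({0: 100, 1: 25}, {0: 100, 1: 20}))  # → 5
```

A total lag of zero across all partitions, with auto-commit disabled and commits made only after processing, is how you confirm that everything produced has actually been handled.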

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 OneCricketeer