Category "kafka-consumer-api"

How to get Kafka offset data for a specified timestamp

I've tried to get the offset from a Kafka topic based on a timestamp, but when I ran it, it threw a null pointer error: Map<TopicPartition, Long> timest…
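
The usual API for this is KafkaConsumer.offsetsForTimes, which returns null for any partition that has no message at or after the requested timestamp; dereferencing that null is a common source of the NPE. A minimal sketch with the plain Java client (topic name, partition, and timestamp are made up for illustration):

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
    import org.apache.kafka.common.TopicPartition;

    public class OffsetByTimestamp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "timestamp-lookup");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                TopicPartition tp = new TopicPartition("my-topic", 0);
                Map<TopicPartition, Long> timestampsToSearch = new HashMap<>();
                timestampsToSearch.put(tp, 1_700_000_000_000L); // epoch millis to search from

                Map<TopicPartition, OffsetAndTimestamp> offsets = consumer.offsetsForTimes(timestampsToSearch);
                OffsetAndTimestamp oat = offsets.get(tp);
                if (oat == null) {
                    // No message at or after the timestamp: guard against this
                    // before calling oat.offset(), or you get a NullPointerException.
                    System.out.println("No offset found for " + tp);
                    return;
                }
                consumer.assign(Collections.singletonList(tp));
                consumer.seek(tp, oat.offset()); // next poll() starts at this offset
            }
        }
    }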

This error handler cannot process 'SerializationException's directly;

The image below shows my topic. Every once in a while, a value other than empty or valid JSON is returned. If it is null, it throws an error as below and enters the…
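
In Spring Kafka, the standard way around such a poison-pill record is to wrap the real deserializer in ErrorHandlingDeserializer, so the failure is surfaced to the container's error handler instead of being thrown before it can run. A minimal sketch, assuming Spring Kafka 2.2+ and a hypothetical target class com.example.MyEvent:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
    import org.springframework.kafka.support.serializer.JsonDeserializer;

    public class PoisonPillConfig {
        public static Map<String, Object> consumerProps() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // The wrapper catches the SerializationException from the delegate and
            // hands the failed record to the error handler instead of looping forever.
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
            props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
            props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.MyEvent"); // hypothetical type
            return props;
        }
    }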

kafka failed authentication due to: SSL handshake failed

I have to add encryption and authentication with SSL in Kafka. This is what I have done: generate a certificate for each Kafka broker: keytool -keystore server…
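
For reference, the client then needs a truststore containing the CA that signed the broker certificates; a handshake failure often means that CA is missing, or the certificate's CN/SAN does not match the broker hostname. A sketch of the client-side properties, with made-up paths and passwords:

    import java.util.Properties;

    public class SslClientProps {
        public static Properties sslProps() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9093");
            props.put("security.protocol", "SSL");
            // Truststore must contain the CA that signed the broker certificates.
            props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
            props.put("ssl.truststore.password", "changeit");
            // Keystore is only needed when the broker sets ssl.client.auth=required.
            props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");
            props.put("ssl.keystore.password", "changeit");
            props.put("ssl.key.password", "changeit");
            return props;
        }
    }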

What is the difference between Kafka earliest and latest offset values?

Producer sends messages 1, 2, 3, 4; consumer receives messages 1, 2, 3, 4; consumer crashes/disconnects; producer sends messages 5, 6, 7; consumer comes back up…
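
The key point is that auto.offset.reset only applies when the group has no committed offset for a partition (or the committed offset has expired): earliest starts from the oldest available message, latest from new messages only. In the scenario above, a consumer that had committed offsets for 1-4 resumes at 5 regardless of the setting. A sketch:

    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;

    public class OffsetResetProps {
        public static Properties props() {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group"); // hypothetical group
            // Used only when this group has no committed offset for a partition:
            //   "earliest" -> start from the oldest message still retained
            //   "latest"   -> start from messages produced after the consumer joins
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            return props;
        }
    }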

How does Kafka consumer auto-commit work?

I am reading this one: "Automatic Commit: The easiest way to commit offsets is to allow the consumer to do it for you. If you configure enable.auto.commit=true…"
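
Mechanically, with enable.auto.commit=true the consumer commits the largest offsets returned by the last poll() once auto.commit.interval.ms has elapsed, and the commit is piggybacked on a subsequent poll() (or on close()). A sketch showing the two settings involved:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class AutoCommitLoop {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "auto-commit-demo");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("enable.auto.commit", "true");
            props.put("auto.commit.interval.ms", "5000"); // default is 5 seconds

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic")); // hypothetical topic
                while (true) {
                    // Each poll() may also commit the offsets of the previous poll,
                    // once the interval has elapsed. If the process dies between
                    // commits, records since the last commit are redelivered.
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> r : records) {
                        System.out.println(r.offset() + ": " + r.value());
                    }
                }
            }
        }
    }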

Kafka custom deserializer converting to Java object

I'm using Spring Kafka integration and I have my own generic value serializer/deserializer, as shown below. Serializer: public class KafkaSerializer<T>…
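
For reference, one common shape for such a generic deserializer is a Jackson-based Deserializer<T> that learns its target class from a config key. A sketch, assuming kafka-clients 2.x (where configure/close have default implementations); the config key value.deserializer.target is made up for illustration:

    import java.util.Map;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.common.serialization.Deserializer;

    public class KafkaDeserializer<T> implements Deserializer<T> {
        private final ObjectMapper mapper = new ObjectMapper();
        private Class<T> targetType;

        @SuppressWarnings("unchecked")
        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // Hypothetical config key carrying the fully qualified target class name.
            try {
                targetType = (Class<T>) Class.forName((String) configs.get("value.deserializer.target"));
            } catch (ClassNotFoundException e) {
                throw new IllegalStateException(e);
            }
        }

        @Override
        public T deserialize(String topic, byte[] data) {
            if (data == null) {
                return null; // tombstone records carry a null payload
            }
            try {
                return mapper.readValue(data, targetType);
            } catch (java.io.IOException e) {
                throw new org.apache.kafka.common.errors.SerializationException(e);
            }
        }
    }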

Flink checkpoint not replaying the Kafka events that were in process during the savepoint/checkpoint

I want to test end-to-end exactly-once processing in Flink. My job is: Kafka-source -> mapper1 -> mapper-2 -> Kafka-sink. I had put a Thread.sleep(100…
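
For context, Flink's exactly-once behaviour hinges on checkpointing in EXACTLY_ONCE mode: on restore, the Kafka source rewinds to the offsets stored in the last completed checkpoint, so records that were in flight at checkpoint time are reprocessed, and with a transactional Kafka sink their output only becomes visible to read_committed consumers after the checkpoint commits. A minimal sketch of the checkpoint setup (interval chosen arbitrarily):

    import org.apache.flink.streaming.api.CheckpointingMode;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class CheckpointSetup {
        public static void main(String[] args) {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            // Checkpoint every 10 s; source offsets and operator state are snapshotted
            // together, so a restore replays everything after the last checkpoint.
            env.enableCheckpointing(10_000, CheckpointingMode.EXACTLY_ONCE);
            // The Kafka sink must additionally be configured for exactly-once
            // delivery (a transactional, two-phase-commit producer), and downstream
            // consumers must read with isolation.level=read_committed.
        }
    }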

Make a Kafka consumer group inactive

My biggest question is whether Kafka has a concept of a consumer group being inactive/active. I am not sure how to achieve this in my local Kafka…
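
Kafka itself has no "inactive" flag on a group (a group with no live members is simply reported as empty), but a running consumer can stop fetching without leaving the group via pause/resume. A sketch:

    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class PauseResume {
        // Stop fetching from all currently assigned partitions. poll() keeps the
        // member alive in the group but returns no records while paused.
        static void makeInactive(KafkaConsumer<String, String> consumer) {
            consumer.pause(consumer.assignment());
        }

        // Resume fetching from everything that was paused.
        static void makeActive(KafkaConsumer<String, String> consumer) {
            consumer.resume(consumer.paused());
        }
    }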

Is there a way to manually store Kafka offset so a consumer never misses messages?

I am using the PHP Laravel framework to consume Kafka messages with the help of the mateusjunges/laravel-kafka package. Is it possible to save the offset by cons…
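
Package specifics aside, the underlying mechanism (in librdkafka-based clients and the Java client alike) is to disable auto-commit and commit each record's offset explicitly after processing; committing offset + 1 marks the next record to read, so a crash before the commit means redelivery rather than a missed message. Illustrated here with the Java client:

    import java.util.Collections;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class ManualCommit {
        // Call after a record has been fully processed. With enable.auto.commit=false
        // the group's position only advances when we say so.
        static void commitProcessed(KafkaConsumer<String, String> consumer,
                                    ConsumerRecord<String, String> record) {
            consumer.commitSync(Collections.singletonMap(
                new TopicPartition(record.topic(), record.partition()),
                new OffsetAndMetadata(record.offset() + 1))); // +1 = next offset to read
        }
    }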

Kafka Consumer to read from multiple topics

I am very new to Kafka. I am creating two topics and publishing on these two topics from two producers. I have one consumer which consumes the messages from both…
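
A single consumer can subscribe to any number of topics in one call, and each record carries the topic it came from. A sketch with two hypothetical topic names:

    import java.time.Duration;
    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class TwoTopicConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "two-topic-demo");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Arrays.asList("topic-a", "topic-b")); // one subscription, both topics
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> r : records) {
                        // r.topic() identifies which topic each message came from
                        System.out.printf("%s[%d]@%d: %s%n", r.topic(), r.partition(), r.offset(), r.value());
                    }
                }
            }
        }
    }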

How to write a KafkaAvro Serde for GenericData.Record

I'm using Kafka 0.10.2 and Avro for the serialization of my messages, both for the key and for the value data. Now I would like to use Kafka Streams, but I'm stu…
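
For reference, Confluent's kafka-streams-avro-serde artifact ships a ready-made GenericAvroSerde for GenericRecord (GenericData.Record implements that interface); in older stacks people wired up the same thing by hand from KafkaAvroSerializer/KafkaAvroDeserializer. A sketch, assuming a Schema Registry at a made-up URL:

    import java.util.Collections;
    import java.util.Map;
    import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.common.serialization.Serde;

    public class AvroSerdeSetup {
        public static Serde<GenericRecord> valueSerde() {
            Map<String, String> config =
                Collections.singletonMap("schema.registry.url", "http://localhost:8081");
            Serde<GenericRecord> serde = new GenericAvroSerde();
            serde.configure(config, false); // false = configure as a value serde, not a key serde
            return serde;
        }
    }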

Kafka consumer is not receiving messages on Docker

I'm a beginner with Kafka as well as Docker. I have been doing a course and working with a Kafka producer and consumer, but for some reason it is not working. When I…
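
A very common cause in Docker setups is the broker advertising a hostname the client cannot reach: the consumer connects to the bootstrap address fine, then silently fails on the advertised one. A hypothetical docker-compose excerpt (environment variable names as used by confluentinc/cp-kafka-style images) with one listener for containers and one for the host:

    environment:
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,EXTERNAL://0.0.0.0:9092
      # Containers reach the broker as kafka:29092; the host machine uses localhost:9092.
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:29092,EXTERNAL://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL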