We have a topic with 100 partitions and a load of millions of records per hour. We run into a problem whenever we deploy a new version of the stream-processor
We have a microservices architecture, with Kafka used as the communication mechanism between the services. Some of the services have their own databases. Say
I have a .NET Core application that needs to connect to Kafka using the SSL protocol. I have a p12 file which I need to use for authentication with the broker, but I
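A minimal sketch of the client settings this typically needs, assuming the .NET client is confluent-kafka-dotnet (which wraps librdkafka) and using placeholder paths and password; librdkafka accepts a PKCS#12 file directly through ssl.keystore.location:

    security.protocol=SSL
    ssl.keystore.location=/etc/kafka/client.p12
    ssl.keystore.password=changeit
    ssl.ca.location=/etc/kafka/ca.pem

In the .NET ProducerConfig/ConsumerConfig objects these surface as the SslKeystoreLocation, SslKeystorePassword, and SslCaLocation properties.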
I have been encountering a weird issue with Kafka and the Confluent sink connector which I am using in my setup. I have a system wherein I have two Kafka Connect s
I'm using Locust for load testing. I want to register a Kafka consumer in a separate thread to measure message-processing time. Here is what I have now: def
I am trying to get data from Kafka into Flink. I use FlinkKafkaConsumer, but IntelliJ shows me that it is deprecated, and the SSH console in Google Cloud also shows me this e
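For reference, FlinkKafkaConsumer was deprecated in favor of the KafkaSource builder API (Flink 1.14+); a minimal sketch of the replacement, assuming placeholder broker, topic, and group names:

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class KafkaToFlink {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // KafkaSource replaces the deprecated FlinkKafkaConsumer
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("localhost:9092")   // placeholder broker address
                    .setTopics("input-topic")                // placeholder topic name
                    .setGroupId("flink-consumer-group")      // placeholder group id
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            DataStream<String> stream =
                    env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");
            stream.print();
            env.execute("kafka-to-flink");
        }
    }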
I am using the PHP Laravel framework to consume Kafka messages with the help of the mateusjunges/laravel-kafka package. Is it possible to save the offset by cons
I have a Kafka consumer which subscribes to a topic. The implementation is working fine, but when trying to write unit tests for it, there's a problem bec
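One way to test consumer logic without a real broker is the MockConsumer that ships with kafka-clients; a minimal sketch, assuming the topic is my.topic and String keys and values:

    import java.time.Duration;
    import java.util.Collections;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.MockConsumer;
    import org.apache.kafka.clients.consumer.OffsetResetStrategy;
    import org.apache.kafka.common.TopicPartition;

    public class ConsumerTestSketch {
        public static void main(String[] args) {
            MockConsumer<String, String> consumer = new MockConsumer<>(OffsetResetStrategy.EARLIEST);
            TopicPartition tp = new TopicPartition("my.topic", 0);

            // assign() plus explicit beginning offsets, so no rebalance is needed
            consumer.assign(Collections.singletonList(tp));
            consumer.updateBeginningOffsets(Collections.singletonMap(tp, 0L));

            // hand the mock a record as if the broker had delivered it
            consumer.addRecord(new ConsumerRecord<>("my.topic", 0, 0L, "key", "value"));

            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            records.forEach(r -> System.out.println(r.value()));
        }
    }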
I'm using the Kafka AdminClient API to create the topic. The topic is getting created, however it is created with 1 partition by default. The API
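The AdminClient falls back to the broker's num.partitions only when the NewTopic does not specify a count; a minimal sketch, assuming a local broker, that asks for 10 partitions and replication factor 3 explicitly:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTopicSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // 10 partitions, replication factor 3 -- passing these explicitly
                // is what prevents the 1-partition default
                NewTopic topic = new NewTopic("my-topic", 10, (short) 3);
                admin.createTopics(Collections.singletonList(topic)).all().get();
            }
        }
    }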
The headless service does not bind to new IPs; a pod restart is required. When both the ZooKeeper nodes and the brokers are restarted at the same time, the brokers are not able to connect wi
I get an error when running kafka-mongodb-source-connect. I was trying to run connect-standalone with connect-avro-standalone.properties and MongoSourceConnector
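For comparison, a minimal MongoSourceConnector properties sketch for connect-standalone, using placeholder connection and namespace values:

    name=mongo-source
    connector.class=com.mongodb.kafka.connect.MongoSourceConnector
    connection.uri=mongodb://localhost:27017
    database=mydb
    collection=mycollection
    topic.prefix=mongo

It is launched together with the worker config, e.g. bin/connect-standalone.sh config/connect-avro-standalone.properties mongo-source.properties; with the Avro worker config, the schema registry it points at must also be reachable.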
With spring-kafka, there are two types of Kafka listeners. Record listeners: @KafkaListener(groupId = "group1", topics = {"my.topic"}) public void listenSingl
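A minimal sketch of the two flavors side by side, with placeholder group and topic names; a batch listener receives the whole poll() result as a list instead of one record per call:

    import java.util.List;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    public class ListenerSketch {

        // record listener: invoked once per record
        @KafkaListener(groupId = "group1", topics = "my.topic")
        public void listenSingle(ConsumerRecord<String, String> record) {
            System.out.println("one record: " + record.value());
        }

        // batch listener: invoked once per poll with every record it returned
        // (batch = "true" needs spring-kafka 2.8+; older versions toggle
        // setBatchListener(true) on the container factory instead)
        @KafkaListener(groupId = "group2", topics = "my.topic", batch = "true")
        public void listenBatch(List<ConsumerRecord<String, String>> records) {
            System.out.println("batch of " + records.size());
        }
    }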
There might be a situation where, even though the message is received from Kafka, due to some reason (the database is down, the webhook is offline, or ...) still the
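In spring-kafka, the usual answer for this case is an error handler that retries with backoff and then dead-letters the record; a minimal sketch, assuming spring-kafka 2.8+ and an existing KafkaTemplate bean:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
    import org.springframework.kafka.listener.DefaultErrorHandler;
    import org.springframework.util.backoff.FixedBackOff;

    @Configuration
    public class ErrorHandlingSketch {

        // retry each failed record 3 times, 2s apart, then publish it to
        // <topic>.DLT so the consumer can move on without losing it
        @Bean
        public DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
            DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
            return new DefaultErrorHandler(recoverer, new FixedBackOff(2000L, 3L));
        }
    }

The handler is attached to the listener container factory with setCommonErrorHandler(...), so a downed database leads to retries and a dead-letter record instead of a silent loss.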
In spring-kafka, how do I mark the classes in a package as trusted for a custom header field? The message is being sent like this: @Autowired private KafkaTemp
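If the "not trusted" error comes from header deserialization rather than the payload, the list to extend is the one on DefaultKafkaHeaderMapper; a minimal sketch, with com.example.headers standing in for the real package:

    import org.springframework.kafka.support.DefaultKafkaHeaderMapper;
    import org.springframework.kafka.support.converter.MessagingMessageConverter;

    public class HeaderMapperSketch {

        public static MessagingMessageConverter trustedHeaderConverter() {
            DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
            // without this, header values of unknown classes stay raw bytes
            mapper.addTrustedPackages("com.example.headers");

            MessagingMessageConverter converter = new MessagingMessageConverter();
            converter.setHeaderMapper(mapper);
            return converter;
        }
    }

The converter is then set on the listener container factory; the exact setter (setMessageConverter vs. setRecordMessageConverter) depends on the spring-kafka version.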
I'm trying to implement a Spring Boot-based Kafka consumer that has very strong message delivery guarantees, even in the case of an error. Messages from a pa
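A minimal sketch of the manual-acknowledgment shape this usually takes, with placeholder group and topic names: the offset is committed only after processing succeeds, so a crash replays the record rather than dropping it:

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.support.Acknowledgment;
    import org.springframework.stereotype.Component;

    @Component
    public class GuaranteedConsumerSketch {

        // requires the container factory to run with AckMode.MANUAL_IMMEDIATE
        @KafkaListener(groupId = "orders", topics = "orders.topic")
        public void listen(ConsumerRecord<String, String> record, Acknowledgment ack) {
            process(record);   // hypothetical business logic; may throw
            ack.acknowledge(); // commit the offset only after success
        }

        private void process(ConsumerRecord<String, String> record) {
            // placeholder for the real handler
        }
    }

If process() throws, the offset is never committed and the configured error handler decides between retrying and dead-lettering.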
I'm working on a process that collects data from IBM MQ and forwards it to a Kafka topic. To make sure I am not losing any messages, I need to commit my JMS message onl
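A minimal sketch of the usual ordering, assuming a transacted javax.jms.Session and placeholder topic names: block on the Kafka send, and only commit (or roll back) the JMS session afterwards:

    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class MqToKafkaSketch {

        // session must have been created as transacted: createSession(true, ...)
        static void forwardOne(MessageConsumer mqConsumer, Session session,
                               KafkaProducer<String, String> producer) throws Exception {
            Message msg = mqConsumer.receive();
            String body = ((TextMessage) msg).getText();
            try {
                // .get() blocks until the Kafka broker acknowledges the write
                producer.send(new ProducerRecord<>("mq.mirror", body)).get();
                session.commit();   // MQ removes the message only after Kafka has it
            } catch (Exception e) {
                session.rollback(); // MQ redelivers; worst case is a duplicate, not a loss
                throw e;
            }
        }
    }

This gives at-least-once delivery; duplicates after a crash between send and commit have to be tolerated or deduplicated downstream.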
I have deployed a Kafka cluster on a GCP instance. I run the connector through config/connect-distributed.properties and start collecting data through the REST API usi
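In distributed mode, connectors are submitted through the Connect REST interface rather than a properties file; a minimal sketch, assuming the worker listens on the default port 8083 and using the stock FileStreamSource connector as a placeholder:

    curl -X POST http://localhost:8083/connectors \
         -H "Content-Type: application/json" \
         -d '{
               "name": "my-source",
               "config": {
                 "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                 "file": "/tmp/input.txt",
                 "topic": "connect-test",
                 "tasks.max": "1"
               }
             }'

A follow-up GET http://localhost:8083/connectors/my-source/status reports whether the connector and its tasks are RUNNING.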
I have the following Spring configuration for Kafka: import org.apache.kafka.clients.producer.ProducerConfig; import org.springframework.boot.context.properties
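A minimal sketch of how such a configuration class usually continues, assuming String keys and values and a placeholder bootstrap address:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.DefaultKafkaProducerFactory;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.core.ProducerFactory;

    @Configuration
    public class KafkaProducerConfig {

        @Bean
        public ProducerFactory<String, String> producerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            return new DefaultKafkaProducerFactory<>(props);
        }

        @Bean
        public KafkaTemplate<String, String> kafkaTemplate() {
            return new KafkaTemplate<>(producerFactory());
        }
    }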
I use an AWS MSK cluster with broker logging turned on to CloudWatch. Logging works and I can see the broker logs. We have some topics with cleanup.policy=compact an
I am trying to interpret an Avro record stored by Debezium in Kafka, using Python: { "name": "id", "type": {