Category "apache-kafka"

Materialized table from a topic in ksqlDB

I am trying to create a materialized table from a topic. I am creating the topic and producing data into it as follows: kafka-topics.sh --create --bootstrap-server …
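
For context, a minimal ksqlDB sketch of the usual pattern (stream, column, and topic names here are hypothetical): register the topic as a stream, then materialize a table by aggregating it.

    -- Register the existing topic as a stream (names are placeholders)
    CREATE STREAM orders_stream (id VARCHAR KEY, amount DOUBLE)
      WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

    -- Materialize a table by aggregating the stream; this table is
    -- backed by a state store and supports pull queries
    CREATE TABLE orders_by_id AS
      SELECT id, COUNT(*) AS order_count, SUM(amount) AS total
      FROM orders_stream
      GROUP BY id
      EMIT CHANGES;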

How to write a bash command that opens up a Kafka producer immediately after creating its topic?

What's the best way to run the following sequence of commands: kafka-console-producer --topic discounts --broker-list localhost:9092 --property parse.key=true --…
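
A minimal sketch, mirroring the CLI names from the excerpt: chaining with && only starts the producer once topic creation succeeds (partition/replication values are placeholders).

    # Create the topic first, then start a keyed console producer on it
    kafka-topics --create --topic discounts \
      --bootstrap-server localhost:9092 \
      --partitions 1 --replication-factor 1 &&
    kafka-console-producer --topic discounts \
      --broker-list localhost:9092 \
      --property parse.key=true \
      --property key.separator=: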

Kafka 1.0.2 Exiting because log truncation is not allowed for partition X

First of all, I have already tried setting unclean.leader.election.enable to true and I am still having the same problem. The brokers are still exiting with this exception…
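
For reference, the exact config key is unclean.leader.election.enable, and on 1.0.x it can be set at two levels; a sketch with a placeholder topic name:

    # server.properties (broker-wide default; needs a broker restart)
    unclean.leader.election.enable=true

    # per-topic override via kafka-configs.sh (1.0.x still talks to ZooKeeper)
    kafka-configs.sh --zookeeper localhost:2181 --alter \
      --entity-type topics --entity-name my-topic \
      --add-config unclean.leader.election.enable=true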

Does Kafka config "retention.bytes" apply to compact topic?

Does "retention.bytes" apply to "compact" topic? The reason why I came here is that in lenses in my current project, I saw the partition bytes is 2GB which is w

org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] no native library is found for os.name=Mac and os.arch=aarch64

I'm building a CDC pipeline that reads the MySQL binlog through Maxwell and puts the events into Kafka; the compression type is snappy in the Maxwell config. But at the consumer end…
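
One plausible fix, assuming the consumer is a JVM application: force a snappy-java version that bundles the Mac aarch64 (Apple Silicon) native library. The exact version below is an assumption; any release from roughly 1.1.8.2 onward should carry it.

    <!-- pom.xml: override the transitive snappy-java that lacks the aarch64 binary -->
    <dependency>
      <groupId>org.xerial.snappy</groupId>
      <artifactId>snappy-java</artifactId>
      <version>1.1.8.4</version> <!-- assumed version; must include the aarch64 native lib -->
    </dependency>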

Add a new Kafka Node to an Existing Kafka Cluster without downtime

I have a 3-node Kafka cluster with a single ZooKeeper node. My question is: how can I add a new Kafka node to this cluster without downtime?
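
The usual sequence, sketched with placeholder ids and hosts: start the new broker against the same ZooKeeper ensemble, then rebalance existing partitions onto it (older Kafka releases use --zookeeper where newer ones take --bootstrap-server).

    # server.properties on the new node: unique id, same ensemble
    broker.id=4
    zookeeper.connect=zk-host:2181

    # generate a reassignment plan that includes the new broker...
    kafka-reassign-partitions.sh --bootstrap-server kafka1:9092 \
      --topics-to-move-json-file topics.json \
      --broker-list "1,2,3,4" --generate

    # ...then apply the generated plan; the cluster stays online throughout
    kafka-reassign-partitions.sh --bootstrap-server kafka1:9092 \
      --reassignment-json-file reassignment.json --execute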

Spring Cloud Stream with Kafka Streams binder: how to set `trusted.packages` for a stream processor (which is different from a consumer and producer)

I have a simple stream processor (not a consumer/producer) that looks like this (Kotlin): @Bean fun processFoo(): Function<KStream<FooName, FooAddress>, KS…
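
A sketch of the property path that usually works here: keys under the binder's configuration map are handed straight to Kafka Streams, where the JSON Serde picks them up (the package name is a placeholder).

    # application.yml
    spring:
      cloud:
        stream:
          kafka:
            streams:
              binder:
                configuration:
                  spring.json.trusted.packages: "com.example.model"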

Accessing schema.name from a Debezium sink connector to Postgres

We have a Debezium source connector working perfectly fine, and one of the properties set is, for example: "transforms.SetSchemaMetadata.schema.name": "myschem…
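
For reference, a sketch of the source-side SMT block the excerpt refers to; SetSchemaMetadata is a stock Kafka Connect transform, and the schema name value here is just continued from the excerpt.

    {
      "transforms": "SetSchemaMetadata",
      "transforms.SetSchemaMetadata.type": "org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
      "transforms.SetSchemaMetadata.schema.name": "myschema"
    }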

Can one Kafka Producer Class have multiple @EventListener methods?

I'm using Spring Kafka and wrote a producer class: @Component @RequiredArgsConstructor class Producer { private static final String TOPIC = "channels"; pri…
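
As far as Spring's event model goes, yes: one component may declare several @EventListener methods, and each is dispatched by its parameter type. A sketch (the two event classes are hypothetical):

    import lombok.RequiredArgsConstructor;
    import org.springframework.context.event.EventListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;

    @Component
    @RequiredArgsConstructor
    class Producer {
        private static final String TOPIC = "channels";
        private final KafkaTemplate<String, String> kafkaTemplate;

        // ChannelCreatedEvent / ChannelDeletedEvent are hypothetical app events
        @EventListener  // invoked whenever a ChannelCreatedEvent is published
        public void onCreated(ChannelCreatedEvent event) {
            kafkaTemplate.send(TOPIC, "created", event.toString());
        }

        @EventListener  // a second listener in the same class is fine
        public void onDeleted(ChannelDeletedEvent event) {
            kafkaTemplate.send(TOPIC, "deleted", event.toString());
        }
    }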

No broker/node available in test with Kafka in Testcontainers

I am trying to create a bare-bones skeleton integration test for Kafka with Testcontainers: just publish a message to the topic and check that it arrives (entire set…
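
A sketch of the shape that usually avoids the "no broker available" symptom: take the bootstrap servers from the running container instead of a hard-coded port (image tag and topic name are assumptions).

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.testcontainers.containers.KafkaContainer;
    import org.testcontainers.utility.DockerImageName;

    class KafkaSmokeTest {
        public static void main(String[] args) {
            try (KafkaContainer kafka =
                     new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.4.0"))) {
                kafka.start();

                // Producer: the mapped address comes from the container, not a fixed port
                Properties p = new Properties();
                p.put("bootstrap.servers", kafka.getBootstrapServers());
                p.put("key.serializer", StringSerializer.class.getName());
                p.put("value.serializer", StringSerializer.class.getName());
                try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
                    producer.send(new ProducerRecord<>("test-topic", "k", "v"));
                    producer.flush();
                }

                // Consumer: read the topic from the beginning and count arrivals
                Properties c = new Properties();
                c.put("bootstrap.servers", kafka.getBootstrapServers());
                c.put("group.id", "test");
                c.put("auto.offset.reset", "earliest");
                c.put("key.deserializer", StringDeserializer.class.getName());
                c.put("value.deserializer", StringDeserializer.class.getName());
                try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
                    consumer.subscribe(List.of("test-topic"));
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
                    System.out.println("received " + records.count() + " record(s)");
                }
            }
        }
    }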

Best way to join two (or more) Kafka topics in KSQL, emitting changes from all topics?

We have a "microservices" platform and we are using debezium for change data capture from databases on these platforms which is working nicely. Now, we'd like t

Kafka broker fails inside Docker container without meaningful logs

When I launch a Docker container with a Kafka broker, it sometimes fails, but I can't tell from the logs what exactly happens; the logs are always the same: # docker-compose up b…
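
A hedged sketch of the listener settings that, in my experience, most often cause silent startup exits, plus a log-level override to make the container say more (image tag, service, and hostnames are assumptions).

    services:
      kafka:
        image: confluentinc/cp-kafka:7.4.0
        environment:
          KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
          KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
          # Must be resolvable/reachable by clients; a mismatch here is a
          # classic cause of the broker dying shortly after start
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
          KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
          KAFKA_LOG4J_ROOT_LOGLEVEL: DEBUG   # surfaces the real failure reason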

Kafka S3 sink connector: keys and headers not written to S3 storage

I have enabled "store.kafka.keys": "true", "store.kafka.headers": "true", "keys.format.class": "io.confluent.connect.s3.format.json.JsonFormat", "headers.for…
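
For comparison, a sketch of a full connector config around those four properties (topic, bucket, and region are placeholders); note that key/header storage is only honoured by reasonably recent 10.x releases of the connector.

    {
      "connector.class": "io.confluent.connect.s3.S3SinkConnector",
      "topics": "my-topic",
      "s3.bucket.name": "my-bucket",
      "s3.region": "us-east-1",
      "storage.class": "io.confluent.connect.s3.storage.S3Storage",
      "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
      "flush.size": "1000",
      "store.kafka.keys": "true",
      "store.kafka.headers": "true",
      "keys.format.class": "io.confluent.connect.s3.format.json.JsonFormat",
      "headers.format.class": "io.confluent.connect.s3.format.json.JsonFormat"
    }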

Unable to set topic with camelCase name via Env variable

I'm deploying Zeebe using Helm. With the extraInitContainers directive I manage to include kafka-exporter 3.1.1 and it loads correctly. In the YAML file I set a…
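
A sketch only, since the chart keys and variable name below are assumptions: Kubernetes env values are plain strings and preserve case, so quoting the value in the Helm YAML normally keeps the camelCase topic name intact all the way to the container.

    extraInitContainers:
      - name: kafka-exporter
        env:
          - name: KAFKA_TOPIC              # placeholder, not the chart's real key
            value: "myCamelCaseTopic"      # quoting preserves the camelCase value as-is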

Avro Definition for Custom Aggregator

I have code where I am aggregating data from a Kafka stream via: StreamsBuilder streamsBuilder = new StreamsBuilder(); streamsBuilder.table(AppConfigs.t…
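
For illustration, a hedged sketch of an Avro value schema for an aggregate; every name and field here is hypothetical, shaped only by the aggregation context (Avro schemas are plain JSON, so the caveat lives in the doc field).

    {
      "type": "record",
      "name": "OrderAggregate",
      "namespace": "com.example.avro",
      "doc": "Hypothetical aggregate value, for illustration only",
      "fields": [
        {"name": "orderId", "type": "string"},
        {"name": "count",   "type": "long",   "default": 0},
        {"name": "total",   "type": "double", "default": 0.0}
      ]
    }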

Kafka Connect S3 sink: multiple partitions

I have multiple questions about the Kafka Connect S3 sink connector. 1. I was wondering if it's possible, using the Kafka Connect S3 sink, to save records with mu…
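
On the multi-field partitioning part, a sketch (topic, bucket, and field names are placeholders): FieldPartitioner accepts a comma-separated list, yielding object paths like country=US/city=NYC/.

    {
      "connector.class": "io.confluent.connect.s3.S3SinkConnector",
      "topics": "events",
      "s3.bucket.name": "my-bucket",
      "s3.region": "us-east-1",
      "storage.class": "io.confluent.connect.s3.storage.S3Storage",
      "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
      "flush.size": "1000",
      "partitioner.class": "io.confluent.connect.storage.partitioner.FieldPartitioner",
      "partition.field.name": "country,city"
    }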

Kafka Connect consumer group members with no assignments available

I have a Kafka Connect task which fetches data from a topic with 3 partitions and sends the data to a Cassandra sink, so I have Kafka Connect in distributed mode with…
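
Worth noting: with 3 partitions, at most 3 members of the sink's consumer group receive assignments; any extra workers or tasks sit idle by design. A quick check (the group name is a guess; sink connectors default to connect-<connector-name>):

    kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
      --describe --group connect-cassandra-sink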

How to run the Kafka S3 sink connector in Confluent 6.2.0

I have installed Confluent 6.2.0 on my 3 Kafka nodes, also installed confluentinc-kafka-connect-s3-10.0.1 on all 3 nodes, and modified quickstart-s3.properties…
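
Since quickstart-s3.properties targets standalone mode, a distributed Connect cluster instead takes the same settings as JSON over its REST API; a sketch with placeholder names and values.

    # Submit the connector to a distributed Connect worker (port 8083 by default)
    curl -X POST -H "Content-Type: application/json" \
      http://localhost:8083/connectors \
      -d '{
        "name": "s3-sink",
        "config": {
          "connector.class": "io.confluent.connect.s3.S3SinkConnector",
          "topics": "s3_topic",
          "s3.bucket.name": "my-bucket",
          "s3.region": "us-east-1",
          "storage.class": "io.confluent.connect.s3.storage.S3Storage",
          "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
          "flush.size": "3"
        }
      }'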

Processing data from a Kafka stream using PySpark

What the console of the Kafka consumer looks like: ["2017-12-31 16:06:01", 12472391, 1] ["2017-12-31 16:06:01", 12472097, 1] ["2017-12-31 16:05:59", 12471979, …
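
A sketch in Python, matching this question's own stack, assuming Spark Structured Streaming's Kafka source (run with --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark-version>; topic, server, and column names are assumptions).

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import ArrayType, StringType

    spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

    # Subscribe to the topic; each record's value is raw bytes
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "localhost:9092")
           .option("subscribe", "events")
           .load())

    # Each value is a JSON array like ["2017-12-31 16:06:01", 12472391, 1];
    # parse it as an array of strings, then cast the fields individually
    parsed = raw.select(
        from_json(col("value").cast("string"), ArrayType(StringType())).alias("fields")
    ).select(
        col("fields")[0].alias("ts"),
        col("fields")[1].cast("long").alias("id"),
        col("fields")[2].cast("int").alias("flag"),
    )

    # Print the parsed rows to the console as they arrive
    query = (parsed.writeStream
             .format("console")
             .outputMode("append")
             .start())
    query.awaitTermination()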