I have a topic that will eventually have lots of different schemas on it. For now it just has the one. I've created a connect job via REST like this: { "name"
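A minimal sketch of what such a REST call can look like, with hypothetical connector name, class, and topic (the original JSON above is truncated); the subject-name-strategy line is only relevant once the topic really does carry several record schemas:

    curl -X POST http://localhost:8083/connectors \
      -H "Content-Type: application/json" \
      -d '{
        "name": "example-sink",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
          "topics": "example-topic",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.schema.registry.url": "http://localhost:8081",
          "value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy"
        }
      }'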
I am using the Debezium Oracle connector in Kafka Connect. While starting the connector I get the error below: java.lang.RuntimeException: Failed to resolve Oracle da
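The connector queries the database version when it starts, so this error usually points at connectivity or credentials rather than config syntax. For reference, a bare-bones Debezium Oracle source sketch with placeholder host, credentials, and names; the last three property names are the Debezium 2.x forms (1.x used database.server.name and database.history.kafka.*):

    name=oracle-source
    connector.class=io.debezium.connector.oracle.OracleConnector
    database.hostname=oracle.example.com
    database.port=1521
    database.user=c##dbzuser
    database.password=********
    database.dbname=ORCLCDB
    database.pdb.name=ORCLPDB1
    topic.prefix=oracle-server-1
    schema.history.internal.kafka.bootstrap.servers=localhost:9092
    schema.history.internal.kafka.topic=schema-changes.oracle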
My pipeline is: Kerberized Kafka --> Logstash (hosted on a different server) --> Splunk. Can I replace the Logstash component with Kafka Connect? Could
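In principle Kafka Connect can take Logstash's place here with a Splunk HEC sink. A rough sketch assuming Splunk Connect for Kafka, with placeholder URI, token, index, and topic (verify the exact property names against that connector's documentation); the Kerberos/GSSAPI settings for the source cluster belong in the Connect worker configuration, not in the connector config:

    name=splunk-sink
    connector.class=com.splunk.kafka.connect.SplunkSinkConnector
    topics=app-logs
    splunk.hec.uri=https://splunk.example.com:8088
    splunk.hec.token=00000000-0000-0000-0000-000000000000
    splunk.indexes=main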
I am using Debezium as a CDC tool to stream data from MySQL. After installing the Debezium MySQL connector on a Confluent OSS cluster, I am trying to capture MySQL bi
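For reference, a bare-bones Debezium MySQL source configuration that tails the binlog, with placeholder host and credentials; these property names follow Debezium 1.x (older releases used database.whitelist instead of database.include.list):

    name=mysql-source
    connector.class=io.debezium.connector.mysql.MySqlConnector
    database.hostname=mysql.example.com
    database.port=3306
    database.user=debezium
    database.password=********
    database.server.id=184054
    database.server.name=dbserver1
    database.include.list=inventory
    database.history.kafka.bootstrap.servers=localhost:9092
    database.history.kafka.topic=schema-changes.inventory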
I have been encountering a weird issue with Kafka and the Confluent Sink Connector which I am using in my setup. I have a system wherein I have two Kafka Connect s
I get an error when running kafka-mongodb-source-connect. I was trying to run connect-standalone with connect-avro-standalone.properties and MongoSourceConnector
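Standalone mode expects the worker properties file first and one or more connector properties files after it. A minimal sketch assuming the official MongoDB source connector and placeholder database/collection names (paths vary by distribution):

    # mongo-source.properties
    name=mongo-source
    connector.class=com.mongodb.kafka.connect.MongoSourceConnector
    connection.uri=mongodb://localhost:27017
    database=test
    collection=events

    # worker config first, then the connector config
    bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties mongo-source.properties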
I am trying to interpret an Avro record stored by Debezium in Kafka, using Python: { "name": "id", "type": {
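A rough Python sketch, assuming the records were written with the Confluent Avro converter and a Schema Registry at localhost:8081; the topic name and group id are hypothetical:

    from confluent_kafka import DeserializingConsumer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroDeserializer

    schema_registry = SchemaRegistryClient({"url": "http://localhost:8081"})
    avro_deserializer = AvroDeserializer(schema_registry)  # writer schema is fetched from the registry

    consumer = DeserializingConsumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "debezium-reader",
        "auto.offset.reset": "earliest",
        "value.deserializer": avro_deserializer,
    })
    consumer.subscribe(["dbserver1.inventory.customers"])  # hypothetical Debezium topic

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = msg.value()       # dict built from the Avro schema
        print(event["after"])     # raw Debezium change events carry before/after structs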
I am trying to implement a CDC pipeline with the Debezium MySQL connector and Kafka, but the source connector is not able to publish events for insert and update operations in t
When I create a Kafka Connect connector with the Debezium connector, it results in four database connections. Three of them remain idle, while one works as the
I installed Confluent OSS 4.0 (Kafka) on a fresh CentOS 7 Linux box, but Kafka Connect failed to start. Steps to reproduce: - Install Oracle JDK 8 - Copy confluen
I have set up Debezium and Azure Event Hubs as a CDC engine from PostgreSQL, exactly like in this tutorial: https://dev.to/azure/tutorial-set-up-a-change-data-captur
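For context, the Debezium PostgreSQL connector itself is configured as usual; what points everything at Event Hubs is the Kafka endpoint in the Connect worker configuration (the producer./consumer.-prefixed copies of the SASL settings are usually needed there as well). Placeholders throughout:

    # Connect worker: Event Hubs Kafka endpoint
    bootstrap.servers=mynamespace.servicebus.windows.net:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://...";

    # Debezium PostgreSQL connector
    name=postgres-source
    connector.class=io.debezium.connector.postgresql.PostgresConnector
    plugin.name=pgoutput
    database.hostname=pg.example.com
    database.port=5432
    database.user=postgres
    database.password=********
    database.dbname=appdb
    database.server.name=pg-server-1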
In MongoDB, the ObjectId is base64. I'm streaming these docs to Kafka using Debezium. How can I get the ObjectId to be written as a UUID in Kafka? Mongo example doc:
I want to stream data from Kafka to MongoDB using a Kafka connector. I found this one: https://github.com/hpgrahsl/kafka-connect-mongodb, but there is no step t
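A minimal sink configuration for that project might look roughly like this; the connector class and mongodb.* property names are assumptions based on that repository's documentation, so verify them against its README (the project later evolved into the official MongoDB Kafka connector):

    name=mongodb-sink
    connector.class=at.grahsl.kafka.connect.mongodb.MongoDbSinkConnector
    topics=example-topic
    mongodb.connection.uri=mongodb://localhost:27017/exampledb
    mongodb.collection=example_collection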