Read from a single Kafka topic and upsert into multiple Oracle tables
I'm using Kafka to send data from multiple MongoDB collections into a single topic with the MongoDB source connector, and to upsert that data into different Oracle tables with the JDBC sink connector.
In the MongoDB source connector, we append the respective collection name to every record so the sink side can process each record based on its collection name.
Is that possible using the JDBC sink connector? Alternatively, can we do this with a Node.js or Spring Boot consumer application that splits the topic's messages and writes them to the different tables?
Example:
- Collection A, Collection B, Collection C - MongoDB source connector
- Table A, Table B, Table C - JDBC sink connector
- Collection A's data has to map to Table A, and likewise for the remaining collections.
Solution 1
By default, the JDBC sink connector writes each record to a table named after the topic it came from. To route one topic's data into several tables, you'd need to rename the topic at runtime, per record.
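One common way to do that per-record rename, assuming the source connector has put the collection name into a top-level value field (called `collection` here as a placeholder), is Confluent's `ExtractTopic` single message transform, which replaces each record's topic with the value of that field; the sink then derives the table name from the rewritten topic. A minimal sketch of such a sink config (topic names, field name, and connection URL are illustrative assumptions, not from the question):

```json
{
  "name": "oracle-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "mongo.all-collections",
    "connection.url": "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "transforms": "routeByCollection",
    "transforms.routeByCollection.type": "io.confluent.connect.transforms.ExtractTopic$Value",
    "transforms.routeByCollection.field": "collection"
  }
}
```

Note that `ExtractTopic` requires the field to be present at the top level of the record value; if the collection name lives elsewhere, it would have to be extracted or flattened first.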
> can we do this via Node.js / Spring Boot as a consumer application to split the topic message and write it into different tables?
Sure. You can use Kafka Streams (or Spring Cloud Stream) to branch the data into separate topics before the sink connector reads them.
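Whether the branching is done with Kafka Streams' `KStream#split().branch(...)` or inside a plain Spring Boot `@KafkaListener`, the core step is the same: map the collection name carried on each record to a target table (or topic). A minimal plain-Java sketch of that routing step, with hypothetical collection and table names:

```java
import java.util.Map;
import java.util.Optional;

// Sketch of consumer-side routing: a Spring Boot listener (or a Node.js
// consumer) would call targetTable() for each record, then issue an
// Oracle MERGE (upsert) against the resolved table.
public class CollectionRouter {

    // Assumed mapping from the collection name the source connector
    // appended to each record to the Oracle table it belongs in.
    private static final Map<String, String> TABLE_BY_COLLECTION = Map.of(
            "collectionA", "TABLE_A",
            "collectionB", "TABLE_B",
            "collectionC", "TABLE_C");

    // Resolve the target table; empty if the collection is unknown, so
    // the caller can route the record to a dead-letter topic instead.
    public static Optional<String> targetTable(String collectionName) {
        return Optional.ofNullable(TABLE_BY_COLLECTION.get(collectionName));
    }

    public static void main(String[] args) {
        System.out.println(targetTable("collectionA").orElse("DLQ"));
        System.out.println(targetTable("unknown").orElse("DLQ"));
    }
}
```

With Kafka Streams, each branch would instead forward to a per-table topic (e.g. one topic per Oracle table), and a JDBC sink connector per topic would then write to its table using the default topic-to-table naming.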
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | OneCricketeer |
