Centralized code to write to Kafka error topic
I have 50 independent Spring Boot applications that each read messages from a specific IBM MQ queue and write the data to a specific Kafka topic: 50 queues and 50 topics in total. If a message is unprocessable, i.e., it has the wrong format or structure, it needs to be written to an error topic that is shared by all 50 applications. How do I make the code that writes to the error topic common across all 50 apps? Design it as another Spring Boot app that the other apps invoke, put the common code in a jar, or something else?
Solution 1:[1]
If you want to be completely in control of the error logic, both.
Deploy an error catching service that does whatever you want. Create a library that will hook into any external system, and have it forward that information to your new service.
Today that data could be written to Kafka from your service, but if that changes in the future, only the service needs updating, rather than forcing every app to update its error-handling dependency.
Keep in mind, your service then becomes a bottleneck / single point of failure if it works on a request/reply model.
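A minimal sketch of what that shared library could look like. The class and field names (`ErrorReporter`, `ErrorSender`, the JSON envelope fields) are illustrative assumptions, not from the original answer; the point is that all 50 apps depend on a stable interface while the transport behind it (a Kafka producer today, an HTTP call to a central error service tomorrow) can be swapped without touching them:

```java
import java.time.Instant;

// Hypothetical shared-library sketch: a uniform error envelope plus a
// pluggable sender, so the apps never know which transport is in use.
public class ErrorReporter {

    // Transport abstraction: implemented with a Kafka producer today,
    // possibly an HTTP client to a central error service later.
    public interface ErrorSender {
        void send(String payload);
    }

    private final ErrorSender sender;
    private final String appName;

    public ErrorReporter(String appName, ErrorSender sender) {
        this.appName = appName;
        this.sender = sender;
    }

    // Build a uniform JSON envelope (hand-rolled here to stay
    // dependency-free; a real library would use Jackson).
    public String report(String sourceQueue, String reason, String rawMessage) {
        String payload = String.format(
            "{\"app\":\"%s\",\"queue\":\"%s\",\"reason\":\"%s\","
                + "\"timestamp\":\"%s\",\"raw\":\"%s\"}",
            appName, sourceQueue, reason, Instant.now(),
            rawMessage.replace("\"", "\\\""));
        sender.send(payload);
        return payload;
    }
}
```

Each app would construct one `ErrorReporter` at startup and call `report(...)` from its MQ listener's catch block; only the library's `ErrorSender` implementation knows where the payload actually goes.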
Alternatively, don't write data to Kafka (or your service) from the erroring services. Instead, dump data to files and use tools like Fluentd to pick those up and write logs onto Kafka or Splunk/Elasticsearch.
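A rough sketch of that file-dump alternative (the class name and JSON fields are my own illustration): each app appends one JSON line per failed message to a local file, and a collector such as Fluentd tails the file and forwards the records to Kafka, Splunk, or Elasticsearch. The erroring service never blocks on a remote system:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.time.Instant;

// Hypothetical sketch: write error records as JSON lines to a local file
// for an external log shipper (e.g. Fluentd) to pick up asynchronously.
public class ErrorFileDump {

    private final Path logFile;

    public ErrorFileDump(Path logFile) {
        this.logFile = logFile;
    }

    public void dump(String app, String queue, String rawMessage)
            throws IOException {
        // One self-describing JSON object per line ("JSON Lines"), the
        // format most log collectors parse out of the box.
        String line = String.format(
            "{\"app\":\"%s\",\"queue\":\"%s\",\"timestamp\":\"%s\",\"raw\":\"%s\"}%n",
            app, queue, Instant.now(), rawMessage.replace("\"", "\\\""));
        // CREATE + APPEND keeps the write local and non-blocking with
        // respect to Kafka or any central service.
        Files.writeString(logFile, line,
            StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}
```

In practice you would more likely route this through the app's existing logging framework (e.g. a dedicated Logback appender with a JSON encoder) rather than hand-writing files, but the shape of the data is the same.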
Solution 2:[2]
I have faced this kind of problem in past years, and the approach below worked for me.
Kafka is a great way to prevent data loss. My recommendation would be to use the Logstash plugins.
If the messages in the topics contain a specific word or marker (e.g. "error"), you can act on it on the Logstash side.
In the `filter` section, something like:

```
filter {
  if "error" in [source] {
    translate {
      # map error codes/words to something meaningful
    }
    # or a ruby { } filter for custom logic, etc.
  }
}
```
I hope this sheds some light.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | OneCricketeer |
| Solution 2 | esckins |
