Incremental Load Process - Snowflake

All,

Thanks in advance for the assistance and guidance.

So... We have been using Snowflake for around a year now and have a minor bug that we now need to tackle. It's not so much a bug as an improvement to our process.

We largely work with JSON (in high volumes). The JSON lands from Kafka in a staging table, and we have a stream on that table; views sit on top of the stream and break the JSON out into relational form. These views are then inserted into tables. This is all done using a single procedure that can perform merge, insert-only, or even truncate-then-insert operations. It all works fine unless there is an issue with the JSON (missing data affecting the view, for example); when this happens, the procedure dumps the stream into an error table. This results in anywhere between 1 and 10,000 JSON documents being dumped into our error table, even though the error might originate from just one of them. What I would like is a secondary process that kicks off when the bulk load into the table has failed: the same logic would then be iterated over each JSON document individually, so that only the documents that actually have errors are written to our error table.
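The bulk-then-incremental fallback described above can be sketched in plain Python (Snowflake stored procedures can also be written in Python via Snowpark). All names here (`transform`, `load`, the record shapes) are illustrative assumptions, not the actual procedure; the point is the control flow: try the fast bulk path first, and only on failure re-run the same logic per document so that only the genuinely bad JSON is quarantined.

```python
# Sketch of the bulk-then-incremental fallback pattern.
# transform() stands in for the view logic that breaks a JSON document
# out into relational columns; it raises if required keys are missing.

def transform(record):
    # Hypothetical shape: a document with an "id" and a nested payload.
    return {"id": record["id"], "value": record["payload"]["value"]}

def load(records):
    """Try the bulk path first; on any failure, fall back to
    per-record processing and route only the bad documents
    to the error table."""
    target, error_table = [], []
    try:
        # Bulk path: transform everything in one pass
        # (the analogue of a single INSERT ... SELECT from the stream).
        target.extend(transform(r) for r in records)
    except (KeyError, TypeError):
        # Incremental path: apply the same logic one document at a time.
        target.clear()
        for r in records:
            try:
                target.append(transform(r))
            except (KeyError, TypeError):
                error_table.append(r)  # only the JSON that actually errors
    return target, error_table

good = {"id": 1, "payload": {"value": 10}}
bad = {"id": 2, "payload": {}}  # missing "value" breaks the transform
loaded, errors = load([good, bad, good])
```

In Snowflake terms, the incremental path would be a second stored procedure (or a branch in the existing one) that reads the same stream contents row by row after the bulk merge/insert fails, rather than dumping the whole stream at once.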

In SQL Server Integration Services (SSIS) I would be able to have an insert source object that performed a bulk load, then another object after it that performed a row-by-row load, with the errors from that written out to a table. I am after something like this.

Any suggestions or pointers would be much appreciated.

Regards



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
