Is there a way to automate truncating a staging table before loading new data through Snowpipe from an S3 bucket?
We are moving data from the staging table to the fact table and want to truncate old data from the staging table before new data is loaded from the S3 bucket through Snowpipe. Is there a way to automate the truncate statement before Snowpipe runs or loads new data into the staging table?
Solution 1:[1]
Have you considered continually adding the data to your staging table, putting an append-only STREAM over that table, and then using tasks to load downstream tables from the STREAM? The task could run every minute with a WHEN clause that checks whether data is in the STREAM or not. This would load the data and push it downstream whenever the data happens to land from your ERP, as in the sketch below.
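A minimal sketch of that first piece, assuming a staging table STAGING_TABLE, a fact table FACT_TABLE, a warehouse LOAD_WH, and columns ID, AMOUNT, LOADED_AT (all placeholder names, not from the original answer):

```sql
-- Append-only stream: it captures inserted rows only, so later DELETEs
-- against the staging table do not show up as stream records.
CREATE OR REPLACE STREAM staging_stream
  ON TABLE staging_table
  APPEND_ONLY = TRUE;

-- Task that wakes up every minute, but only does work when the stream
-- actually contains new rows loaded by Snowpipe.
CREATE OR REPLACE TASK load_fact_from_staging
  WAREHOUSE = load_wh
  SCHEDULE  = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('STAGING_STREAM')
AS
  INSERT INTO fact_table (id, amount, loaded_at)   -- placeholder columns
  SELECT id, amount, loaded_at
  FROM staging_stream;                             -- consuming the stream advances its offset

-- Tasks are created suspended; resume it to start the schedule.
ALTER TASK load_fact_from_staging RESUME;
```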
Then, you can have a daily task that runs at any time during the day, checks the STREAM to make sure there is NO DATA in it, and, if that's true, DELETEs everything in the underlying table (see the sketch below). This step is only needed to save storage, and because the STREAM is append-only, the DELETE statement does not create records in your STREAM.
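A sketch of that cleanup task, reusing the placeholder names above. Using NOT with SYSTEM$STREAM_HAS_DATA in the WHEN clause is my assumption about how to express "the stream is empty"; verify it against the CREATE TASK documentation, or move the emptiness check into a stored procedure called by the task instead.

```sql
-- Daily cleanup: remove already-consumed rows from the staging table,
-- but only when the stream is empty (everything has been loaded downstream).
-- Because the stream is APPEND_ONLY, this DELETE adds no stream records.
CREATE OR REPLACE TASK purge_staging_table
  WAREHOUSE = load_wh
  SCHEDULE  = 'USING CRON 0 3 * * * UTC'           -- once a day at 03:00 UTC
  WHEN NOT SYSTEM$STREAM_HAS_DATA('STAGING_STREAM') -- assumption: NOT is accepted here
AS
  DELETE FROM staging_table;

ALTER TASK purge_staging_table RESUME;
```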
Using this method will remove the need to truncate before Snowpipe loads the data.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Mike Walton |
