Azure Data Factory v2 - How to get Copy Data tool output

I'm trying to create an archiving pipeline which essentially does the following:

  1. Call stored procedure (GET) from a SQL Azure DB that returns a resultset
  2. Archive the result from #1 onto storage account (e.g. json files)
  3. Extract ID column from #1 into an array of int
  4. Use the result from #3 as a parameter to call a stored procedure (DELETE) in the same SQL Azure DB

So far, I've tried the Copy Data activity/tool, which satisfies steps 1 & 2. However, I'm not sure how to get the outputs from that step, and I can't find any documentation from Microsoft.

Is it not the correct usage? Do I have to manually do it instead?

Also, I'd like to do some validation in between steps (e.g. if there's no result, don't proceed).

I've managed to try the bare/general Stored Procedure activity, but I also can't find where to retrieve its output for use in the next step. I'm pretty new to Data Factory and don't really work with data engineering/pipelines, so please bear with me.



Solution 1:[1]

  1. Using a Copy data activity, you can copy the stored procedure's data to storage. Connect the source to the SQL database and use the stored procedure as the query option, then connect the sink to a folder in the storage account.

  2. Once the data is copied to the storage account, use a Lookup activity to read the data from the file generated in step 1.

  3. Extract the output of the Lookup activity into an array variable using a Set variable activity.

  4. Use an If Condition activity to validate the result, and run the remaining activities only if the condition is true.
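
Step 1 can be sketched as a Copy activity whose source runs the stored procedure. This is a trimmed, illustrative fragment, not a complete pipeline: the activity name, the stored procedure name `[dbo].[usp_GetRowsToArchive]`, and the omitted dataset references are all assumptions.

```json
{
  "name": "CopyToArchive",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderStoredProcedureName": "[dbo].[usp_GetRowsToArchive]"
    },
    "sink": {
      "type": "JsonSink"
    }
  }
}
```

The source and sink datasets (pointing at the Azure SQL database and the storage-account folder) are configured separately and referenced from the activity's `inputs`/`outputs`.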
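
Steps 3 and 4 can be sketched with pipeline expressions. A minimal sketch, assuming the Lookup activity is named `Lookup1` (with "First row only" unchecked) and the array variable is named `ids`; both names are assumptions:

```json
{
  "activities": [
    {
      "name": "SetIds",
      "type": "SetVariable",
      "dependsOn": [ { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "variableName": "ids",
        "value": {
          "value": "@activity('Lookup1').output.value",
          "type": "Expression"
        }
      }
    },
    {
      "name": "HasRows",
      "type": "IfCondition",
      "dependsOn": [ { "activity": "SetIds", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "expression": {
          "value": "@greater(length(variables('ids')), 0)",
          "type": "Expression"
        }
      }
    }
  ]
}
```

Note that `output.value` is the full array of rows, not just the ID column; projecting out a single column typically needs a ForEach over the array with an Append variable activity (e.g. appending `@item().ID`, where `ID` is the assumed column name).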

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: NiharikaMoola-MT