Power Automate Flow - Process JSON Output in 50 Parallel Branches
I have JSON with the following schema:
"invoices-report": {
"locations": {
"location": {
"@code": "TX",
"@name": "Texas",
"invoices": {
"invoice": [
{
"@InvoiceNumber": "111111",
"@AccountName": "Name",
"@InvoiceDate": "2021-11-01",
"items": {
"item": [
{
"@QTY": "1.00",
"@Rate": "5",
"@Description": "Services"
},
{
"@QTY": "2.00",
"@Rate": "5",
"@Description": "Services1"
},
…
{
"@QTY": "2.00",
"@Rate": "5",
"@Description": "ServicesN"
},
}
]
}
},
{
"@InvoiceNumber": "222222",
"@AccountName": "Name2",
"@InvoiceDate": "2021-11-01",
"items": { …..}
Basically I have an array of invoices, each with a subarray of invoice details. I would like to split it into 50 parallel branches to speed up processing. My flow looks like this:
Parse JSON -> Initialize Variable ->
`length(array(body('Parse_JSON')?['invoices-report']?['locations']?['location']?['invoices']?['invoice']))`
For example, the variable returns 56,900 invoices. I would like to create 50 branches based on the array position: 56,900 / 50 = 1,138, so invoices at positions 0 - 1,138 would be processed in the first branch, and so on. Each branch would also contain an "Apply to each" loop with a degree of parallelism of 50. Could you please explain how to divide the array into 50 branches? Thank you in advance!
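Purely as an illustration of the arithmetic in the question (this is not part of the flow itself), a Python sketch of the index ranges the 50 branches would cover might look like this:

```python
# Hypothetical sketch: split 56,900 invoice indices across 50 branches.
total_invoices = 56_900
branches = 50

# Ceiling division, so the split also works when the count is not
# an exact multiple of the branch count.
branch_size = -(-total_invoices // branches)  # 1138 here

# (start, end) index range handled by each branch, end exclusive.
branch_ranges = [
    (b * branch_size, min((b + 1) * branch_size, total_invoices))
    for b in range(branches)
]

print(branch_size)       # 1138
print(branch_ranges[0])  # (0, 1138)
print(branch_ranges[-1]) # (55762, 56900)
```

The ceiling division is just a defensive choice for the sketch; with 56,900 items and 50 branches the division is exact.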
Solution 1:[1]
To do what you want, you need to use the SKIP and TAKE functions over the array to split it up so you can then call your child flow.
Here is an example of the approach you need to take.
FYI, the packet size can be anything; it doesn't need to divide the array length evenly.
Your Do Until loop should keep processing until the slicer position reaches the end of the array.
This is the expression on the right-hand side of the Do Until condition: `length(variables('Array'))`

This is the Skip/Take expression: `take(skip(variables('Array'), variables('Slicer Position')), variables('Packet Size'))`
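Outside of Power Automate, the Do Until plus `skip`/`take` pattern described here can be sketched in Python. The names mirror the flow's `Array`, `Slicer Position`, and `Packet Size` variables, and `chunk()` is a stand-in for `take(skip(...))`; none of this is literal flow syntax:

```python
def chunk(array, position, packet_size):
    # skip(array, position) followed by take(..., packet_size)
    return array[position:position + packet_size]

invoices = list(range(10))  # stand-in for the parsed invoice array
packet_size = 3             # need not divide the array length evenly

position = 0                # the "Slicer Position" variable
packets = []                # each packet would go to one child flow call
while position < len(invoices):  # Do Until: position >= length(Array)
    packets.append(chunk(invoices, position, packet_size))
    position += packet_size      # advance the slicer

print(packets)  # [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9]]
```

Note the last packet is simply shorter, which is why the packet size does not need to divide the array length exactly.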
https://docs.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#skip https://docs.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#take
So when it runs, you can pass the Sub Array variable to your child flow, and it will process that section of the data in parallel.
You should be able to apply that concept to your own data set.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Skin |



