How can I create an Azure Data Factory batch pool with a static IP or small range of IPs?

I have a requirement to run a query against our datamart, create a file, and SFTP the file to a remote server on a regular schedule using Data Factory. I have developed a Python script to run the query, format the data, and create the file, and I have set up a data factory, a batch account, a storage account, a batch pool, and a pipeline. I had the SA who manages the remote server whitelist the external IP of my batch pool VM, which I found by examining the VM in Batch Explorer. I can trigger the task from the pipeline I set up in Data Factory Studio; the job runs and the file is transmitted successfully.

The problem is that, as I understand it, the IP of my batch pool VM can change any time the VM is restarted. I have downloaded the Azure public IP ranges for my region from Microsoft, but the pool of potential IPs is over 5.2 million, and I can't ask the SA to whitelist that many addresses.

How can I either configure my existing batch pool to use a static IP or small range of IPs, or create a new batch pool with one? I'm fairly new to Azure, so a step-by-step guide would be really helpful.
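For the Batch pool side specifically, one approach (separate from the accepted answer below) is to join the pool to a VNet subnet and provision your own static Standard-SKU public IP addresses, then reference them in the pool's network configuration so Batch uses those instead of ephemeral ones. Here is a minimal sketch that builds such a pool definition as JSON, suitable for `az batch pool create --json-file pool.json`. All resource IDs, the pool ID, VM size, and image are placeholders you would substitute with your own values:

```python
import json

# Placeholder resource IDs -- substitute your subscription, resource group,
# VNet/subnet, and pre-created *static* Standard-SKU public IPs (created,
# e.g., with: az network public-ip create --sku Standard --allocation-method Static ...)
SUBNET_ID = ("/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
             "Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>")
PUBLIC_IP_IDS = [
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Network/publicIPAddresses/batch-pool-ip-1",
]

# Pool definition following the Batch REST API schema. "UserManaged" tells
# Batch to assign the public IPs you provide rather than ephemeral ones.
pool = {
    "id": "sftp-pool",
    "vmSize": "STANDARD_D2S_V3",
    "targetDedicatedNodes": 1,
    "virtualMachineConfiguration": {
        "imageReference": {
            "publisher": "canonical",
            "offer": "0001-com-ubuntu-server-jammy",
            "sku": "22_04-lts",
            "version": "latest",
        },
        "nodeAgentSKUId": "batch.node.ubuntu 22.04",
    },
    "networkConfiguration": {
        "subnetId": SUBNET_ID,
        "publicIPAddressConfiguration": {
            "provision": "UserManaged",
            "ipAddressIds": PUBLIC_IP_IDS,
        },
    },
}

with open("pool.json", "w") as f:
    json.dump(pool, f, indent=2)
# Then: az batch pool create --json-file pool.json
```

Because the public IP resources are allocated statically and owned by you, they survive VM restarts and pool resizes, so the SA only has to whitelist those few addresses.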



Solution 1:[1]

Since January 2020, ADF has supported static IP address ranges for the Azure integration runtime.

With the introduction of static IP address ranges, you can whitelist the IP ranges for your particular Azure integration runtime region, so you don't have to allow all Azure IP addresses in your cloud data stores. This way, you can restrict which IP addresses are permitted to access them.
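These published ranges appear in Microsoft's downloadable "Azure IP Ranges and Service Tags" JSON file under service tags named `DataFactory.<Region>`. As a sketch of how you could extract just the ranges to whitelist, the snippet below filters a tiny inline excerpt that mimics the structure of that file (the real file contains many more tags and prefixes, and the prefixes shown here are illustrative, not authoritative):

```python
import json

# Tiny excerpt mimicking the structure of the downloadable
# "Azure IP Ranges and Service Tags" file (illustrative values only).
service_tags = json.loads("""
{
  "values": [
    {"name": "DataFactory.EastUS",
     "properties": {"addressPrefixes": ["20.42.2.0/23", "20.42.4.0/26"]}},
    {"name": "Storage.EastUS",
     "properties": {"addressPrefixes": ["20.38.98.0/24"]}}
  ]
}
""")

def adf_prefixes(tags: dict, region: str) -> list:
    """Return the address prefixes listed under DataFactory.<region>."""
    wanted = f"DataFactory.{region}"
    return [prefix
            for tag in tags["values"] if tag["name"] == wanted
            for prefix in tag["properties"]["addressPrefixes"]]

print(adf_prefixes(service_tags, "EastUS"))
```

The resulting handful of CIDR ranges per region is what you hand to whoever manages the firewall, instead of the millions of general Azure addresses.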

Refer to the official Microsoft documentation to learn how to implement this.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Solution 1: UtkarshPal-MT (Stack Overflow)