ADF - built-in copy task to create folders based on variables
I am using Azure Data Factory's built-in copy task (the Built-in Copy tool), set on a daily schedule, to copy data into a container in Azure Data Lake Storage Gen2. For the destination, I'm trying to use date variables to create a folder structure for the data. In the resultant pipeline the expression looks like this:
Dir1/Dir2/@{formatDateTime(pipeline().parameters.windowStart,'yyyy')}/@{formatDateTime(pipeline().parameters.windowStart,'MM')}/@{formatDateTime(pipeline().parameters.windowStart,'dd')}
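For clarity, here is a Python sketch of what that expression should evaluate to once `windowStart` holds a real timestamp (the error below indicates it is currently null). The function name `build_sink_path` is my own illustrative choice, not anything ADF defines; the `%Y/%m/%d` specifiers mirror ADF's `yyyy`/`MM`/`dd` format strings:

```python
from datetime import datetime, timezone

def build_sink_path(window_start: datetime) -> str:
    # Mirrors the ADF expression:
    # Dir1/Dir2/@{formatDateTime(windowStart,'yyyy')}/@{...,'MM'}/@{...,'dd'}
    return (
        "Dir1/Dir2/"
        f"{window_start:%Y}/"
        f"{window_start:%m}/"
        f"{window_start:%d}"
    )

# Example: a window starting 4 May 2023 should land in Dir1/Dir2/2023/05/04
print(build_sink_path(datetime(2023, 5, 4, tzinfo=timezone.utc)))
```

If `window_start` were `None` here, the f-string would fail in the same spirit as the ADF error: the formatting step needs an actual value, so the problem is the parameter not being populated rather than the format strings themselves.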
Unfortunately this is throwing an error:
Operation on target ForEach_h33 failed: Activity failed because an inner activity failed; Inner activity name: Copy_h33, Error: The function 'formatDateTime' expects its first parameter to be of type string. The provided value is of type 'Null'.
Everything I've created was just generated by the tool, the folder path I used when following the tool was as suggested:
Dir1/Dir2/{year}/{month}/{day} (I was then able to set the format of each variable - e.g., yyyy, MM, dd - which suggests the tool understood what I was doing).
The only other thing I can think of is that the folder structure in the container only contains Dir1/Dir2/ - I am expecting the subdirectories to be created as the copy task runs.
I'll also add, everything runs fine if I just use the directory Dir1/Dir2/ - so the issue is with my variables.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow

