Azure Data Factory V2: Optional parameters

I'm using ADFv2 and I would like to use optional parameters defined at the dataset level. I haven't found whether that can be achieved.

If I define a parameter on the dataset (after defining a parameter at the pipeline level and connecting the two so the pipeline passes its value down to the dataset parameter), then run the pipeline manually without specifying a value, I get the error:

No value provided for Parameter 'parameter'


Solution 1:[1]

Just for anyone looking to do something like this: as of today, we can't have optional parameters for a pipeline. This is by design, according to what the ADF Product Team told me through an Azure support ticket.

Solution 2:[2]

I ran into this issue recently and the answers provided here didn't quite address my needs. The way we use our ADF pipelines relies on the parameter being set when the pipeline is run manually, and being empty when it is run by a trigger, so that it can be dynamically set to the date/time the trigger was activated.

In our pipeline we have checks like the following.

@if(empty(string(pipeline().parameters.FolderDate)),
formatDateTime(utcnow(),'yyyyMMdd'),
pipeline().parameters.FolderDate)

It turns out that if you provide the parameter value as a lowercase null, the pipeline works as expected.

ADF Parameter Setting
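For the trigger-driven case described above, a roughly equivalent check can be written with coalesce. This is only a sketch, assuming a schedule trigger where trigger().startTime holds the activation time; note that coalesce only falls through when the parameter is passed as null, not as an empty string, so the parameter must be supplied as null as described above.

@coalesce(pipeline().parameters.FolderDate,
formatDateTime(trigger().startTime,'yyyyMMdd'))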

Solution 3:[3]

Try @coalesce(null) as the expression for the parameter.
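As a sketch of where that expression lives, the dataset reference on an activity might look like the following in the pipeline JSON; the dataset name CsvDataset and parameter name fileName are placeholders, not something from the original answer:

"inputs": [
    {
        "referenceName": "CsvDataset",
        "type": "DatasetReference",
        "parameters": {
            "fileName": {
                "value": "@coalesce(null)",
                "type": "Expression"
            }
        }
    }
]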

Solution 4:[4]

We have a dataset for blob csv files. The dataset has the filename parameterised, as well as the container and directory.


When this dataset is a sink, we might want to specify the file to write to, or leave it blank and let Azure provide data_xxxx.csv filenames for us.

When this dataset is a source, we might want to specify which file to open, or leave it blank to use the wildcard option (all csv in the directory).

In that limited sense, we want the filename parameter to be optional.

But you can't leave the dataset parameter field blank in the UI or the pipeline won't compile (validate).


In those cases where we want the filename to be blank, we use the trick suggested above to inject an empty string: we use @coalesce(pipeline().parameters.filename,'') and leave the filename parameter empty when triggering the pipeline.
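For reference, a minimal sketch of how the parameterised blob location might be wired in the dataset JSON, assuming a delimited-text dataset on Azure Blob Storage; the parameter names filename, directory and container mirror the description above and are only illustrative:

"typeProperties": {
    "location": {
        "type": "AzureBlobStorageLocation",
        "container": {
            "value": "@dataset().container",
            "type": "Expression"
        },
        "folderPath": {
            "value": "@dataset().directory",
            "type": "Expression"
        },
        "fileName": {
            "value": "@dataset().filename",
            "type": "Expression"
        }
    }
}

Passing '' for filename then behaves like leaving the file name blank, which is what lets the sink auto-generate names and the source fall back to the wildcard.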


Solution 5:[5]

@fgbaezp, this is a late answer, but you can assign an empty string to a parameter by using dynamic content, for example @toLower('').

Solution 6:[6]

What would you expect if you don't specify a value for the parameter? If the parameter has a sensible default, you can specify a default value when defining it:

"parameters": {
    "parameter1": {
        "type": "String",
        "defaultValue": "defaultValue1"
    }
}

Solution 7:[7]

Instead of @coalesce you can use the ? operator, e.g. @dataset()?.fileName, which returns null when the value is empty. For "defaultValue" you can fill in @null, [], or better @''. When Data Factory and Synapse Analytics pass parameters across to each other, some input is needed for the pipeline to validate.

Null Values Sample

If Null return null
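A minimal sketch of the ? pattern the answer describes, applied to a hypothetical fileName dataset parameter; whether an unset parameter actually arrives as null here depends on how the pipeline passes it:

"fileName": {
    "value": "@dataset()?.fileName",
    "type": "Expression"
}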

Solution 8:[8]

The workaround for this is using coalesce in dynamic content and leaving the parameter blank.

For example: @coalesce(pipeline().parameters.parametername,'') in the dynamic content.

https://docs.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions#coalesce


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 fgbaezp
Solution 2 Foxhound013
Solution 3 Andy Booth
Solution 4 Jason Welch
Solution 5 jmz
Solution 6 Fang Liu
Solution 7
Solution 8 Sudhin