Spring Cloud Data Flow in OpenShift - how to get execution id

I am currently running Spring Cloud Data Flow in an OpenShift environment and am able to trigger OpenShift pods through it. I use an Oracle database for both Data Flow and the application that Data Flow triggers.

Whenever a pod is triggered, Data Flow generates an execution id that is visible in the pod's JSON as a pod argument. How can I get it as an input parameter for my Spring Batch job?
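Since the execution id arrives as a command-line argument on the launched pod (by Spring Cloud Task convention the argument is `--spring.cloud.task.executionid=<id>`; the exact name should be verified against the pod's JSON), one way to retrieve it inside the job is to parse the process arguments directly. A minimal stand-alone sketch, with no Spring dependencies:

```java
// Hypothetical sketch: extract the task execution id that SCDF passes
// to the launched pod as a command-line argument. The argument name
// "--spring.cloud.task.executionid" is an assumption based on Spring
// Cloud Task's convention; check your pod spec for the actual name.
public class ExecutionIdParser {

    static String parseExecutionId(String[] args) {
        String prefix = "--spring.cloud.task.executionid=";
        for (String arg : args) {
            if (arg.startsWith(prefix)) {
                return arg.substring(prefix.length());
            }
        }
        // Not found: the caller can fall back to letting the
        // framework assign an id.
        return null;
    }

    public static void main(String[] args) {
        String id = parseExecutionId(args);
        System.out.println("execution id: " + id);
    }
}
```

The extracted value could then be added to the batch job's `JobParameters` so that the job run is recorded against the same id SCDF shows in its UI.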

Since I am running Spring Batch jobs on Spring Cloud Data Flow, perhaps a better approach is to somehow set the job execution id myself and pass it in as an input parameter?

Because the execution id cannot be fetched from the pod arguments, Spring Batch generates a new execution id and records the status against that id. As a result, in the Data Flow UI the id triggered by SCDF shows a status of NA, while a new execution id is generated and updated with success or failure.

How can I get the execution id from the pods, or how can I stop SCDF from generating an execution id and let my Spring Batch job generate one instead?



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0 (per Stack Overflow's attribution requirements).