Cancel GitLab pipeline from job OR how to flush output buffer
Disclaimer: this is a cross-post from https://forum.gitlab.com/t/cancel-pipeline-from-job-or-how-to-flush-output-buffer/67008
Not for the first time I am looking for ways to cancel a pipeline from within a job.
True, I can have a job with a script that returns a non-zero exit code, like so:
my-job:
  script:
    - |
      echo something
      if whatever; then
        echo else
        exit 42
      fi
This will fail the job and thus also the pipeline. It’ll be marked as failed rather than cancelled.
So, I tried to be clever and cancel the pipeline through the API like so:
my-job:
  script:
    - |
      echo something
      if whatever; then
        echo else
        curl --request POST --header "PRIVATE-TOKEN: $MY_TOKEN" "$CI_API_V4_URL/projects/$CI_PROJECT_ID/pipelines/$CI_PIPELINE_ID/cancel"
      fi
This works just fine, but… the pipeline gets cancelled so quickly that not even the output buffer is flushed. Hence, I lose all the echo ... output.
So, is there a better way to cancel the pipeline on dynamic conditions? If not, how could I ensure the stdout buffer is flushed before I cancel it?
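One possible workaround (my own assumption, not something confirmed in the thread) is to give the runner a moment to upload the job trace before the cancel request takes effect, by sleeping briefly between the last echo and the API call:

```yaml
my-job:
  script:
    - |
      echo something
      if whatever; then
        echo else
        # Give the runner time to flush/upload the job log before the
        # pipeline is cancelled. The 5-second delay is a guess; whether
        # it is long enough depends on the runner's log-upload interval.
        sleep 5
        curl --request POST --header "PRIVATE-TOKEN: $MY_TOKEN" \
          "$CI_API_V4_URL/projects/$CI_PROJECT_ID/pipelines/$CI_PIPELINE_ID/cancel"
      fi
```

This is a race against the runner's trace upload, so it may still drop output; the two-job approach below avoids the race entirely.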
Solution 1:[1]
I guess that when a pipeline is cancelled, the job-log handling is stopped immediately as well, so nothing written after that moment is captured.
However, you can separate the logic to two different jobs and use artifacts to pass the decision:
my job:
  artifacts:
    paths: [.cancel]
  script: |
    echo something
    if whatever; then
      echo need to cancel
      echo abc > .cancel
    fi

cancel job:
  script: |
    if [ -f .cancel ]; then
      curl ....
    fi
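Spelled out with explicit stages so the artifact actually reaches the second job (the stage names and the filled-in curl command are my own illustration, not part of the answer), the pattern could look like:

```yaml
stages:
  - check
  - cancel

my job:
  stage: check
  artifacts:
    paths: [.cancel]
  script: |
    echo something
    if whatever; then
      echo need to cancel
      # The marker file becomes an artifact that the next stage can read.
      echo abc > .cancel
    fi

cancel job:
  stage: cancel
  script: |
    # Runs after "my job" has finished and its log has been uploaded,
    # so cancelling here no longer truncates that job's output.
    if [ -f .cancel ]; then
      curl --request POST --header "PRIVATE-TOKEN: $MY_TOKEN" \
        "$CI_API_V4_URL/projects/$CI_PROJECT_ID/pipelines/$CI_PIPELINE_ID/cancel"
    fi
```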
That answers the question. But I wonder: wouldn't rules be a better choice for your use case? Is there an advantage to cancelling rather than controlling the execution with rules?
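For comparison, a rules-based sketch (the variable name SKIP_REST is hypothetical): if the condition can be expressed as a CI/CD variable known when the pipeline is created, later jobs simply never run instead of being cancelled mid-run:

```yaml
later-job:
  rules:
    # Skip this job entirely when SKIP_REST is set, instead of cancelling
    # the whole pipeline partway through (variable name is illustrative).
    - if: '$SKIP_REST == "true"'
      when: never
    - when: on_success
  script:
    - echo running
```

The limitation is that rules are evaluated at pipeline creation, so they cannot react to a condition computed inside an earlier job's script the way the artifact-based cancel job can.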
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Meir Pechthalt |
