Argo workflows big stdout output to input file
I'm trying to use the stdout from one step as a file input to another step. Since the output is pretty big, I'm getting the error "argument list too long".
...
spec:
  templates:
    - name: main
      steps:
        - - name: big-output
            ...
        - - name: print
            template: head-query
            arguments:
              parameters:
                - name: query-result
                  raw:
                    data: "{{steps.big-output.outputs.result}}"
    - name: head-query
      inputs:
        parameters:
          - name: query-result
            path: /input/query.txt
            raw:
              data: "{{inputs.parameters.query-result}}"
      container:
        image: alpine
        command: [head]
        args:
          - /input/query.txt
What is the proper way to put the stdout in a file? Is there some way to avoid modifying the step with the big output?
Solution 1:[1]
Your general approach should work, as long as the output doesn't exceed the 256KB limit for output parameters. However, the Workflow as written is invalid, because raw is meant to be used with artifacts rather than parameters.
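For context, a parameter only carries a string value, while an artifact can be materialized inside the container at a path (optionally from a raw inline source), which is what gets the content into a file. A rough sketch of the difference (field names per the Argo Workflows spec):

    # artifacts accept path/raw, so the content is written to a file
    inputs:
      artifacts:
        - name: query-result
          path: /input/query.txt

    # parameters only accept a string value; path/raw are unknown fields here
    inputs:
      parameters:
        - name: query-result
          value: "some string"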
If you were to run argo lint, you would get an error like this:
✖ in "big-parameter-" (Workflow): json: unknown field "path"
✖ 1 linting errors found!
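(To reproduce the lint error locally, save the manifest to a file and run the linter against it; the filename below is just a placeholder.)

    argo lint big-parameter.yaml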
Modifying the Workflow manifest to use artifacts instead of parameters should allow it to work.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: big-parameter-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: big-output
            template: big-output
        - - name: print
            template: head-query
            arguments:
              artifacts:
                - name: query-result
                  raw:
                    data: "{{steps.big-output.outputs.result}}"
    - name: big-output
      script:
        image: alpine
        command:
          - sh
        source: |
          echo "pretend this is really big"
    - name: head-query
      inputs:
        artifacts:
          - name: query-result
            path: /input/query.txt
      container:
        image: alpine
        command: [head]
        args:
          - /input/query.txt
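As a quick way to try this out (assuming the manifest above is saved as big-parameter.yaml, a placeholder name, and a reasonably recent Argo CLI), submit it and read the logs of the print step; the raw artifact is written to /input/query.txt before the head container starts, so the step prints the first lines of the upstream output:

    argo submit --watch big-parameter.yaml
    argo logs @latest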
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | crenshaw-dev |