Triggering a GKE CronJob workload from Cloud Composer
I have a Google Cloud Composer environment running. From this Composer environment I want to orchestrate the execution of a GKE CronJob workload and a Google Dataflow job. I got the Dataflow part working, but now I'm having an issue with the GKE workload.
While researching, I found GKEStartPodOperator, which I used to start the pod in GKE. The pod gets created, but it doesn't run.
The code I used
from airflow.providers.google.cloud.operators.kubernetes_engine import (
    GKEStartPodOperator,
)

kubernetes_min_pod = GKEStartPodOperator(
    # The ID specified for the task.
    task_id="task-id",
    # Name of the task you want to run, used to generate the Pod ID.
    name="name-cronjob",
    project_id=PROJECT_ID,
    location=CLUSTER_REGION,
    cluster_name=CLUSTER_NAME,
    # Entrypoint of the container; if not specified, the Docker image's
    # entrypoint is used. The cmds parameter is templated.
    cmds=["python", "/app/src/main.py"],
    arguments=["/app/config.json"],
    # The namespace to run within Kubernetes; the default namespace is
    # `default`.
    namespace="production",
    # Docker image to use. Defaults to hub.docker.com, but any fully
    # qualified URL will point to a custom repository. Supports private
    # gcr.io images if the Composer environment is under the same
    # project ID as the gcr.io images and the service account that Composer
    # uses has permission to access Google Container Registry
    # (the default service account has permission).
    image="eu.gcr.io/project_id/app_image:latest",
)
My code is based on the Google Cloud tutorial.
When I run the DAG, it completes successfully, but the pod is created without the app in the container ever being started. The pod status is green, yet there are no logs (the app is not executed).
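For completeness, the task above sits inside the DAG roughly like this. This is only a sketch: the DAG id, schedule, and start date below are placeholders rather than my real values, the Dataflow task is omitted, and PROJECT_ID, CLUSTER_REGION, and CLUSTER_NAME are constants defined elsewhere in the file.

from datetime import datetime

from airflow import models
from airflow.providers.google.cloud.operators.kubernetes_engine import (
    GKEStartPodOperator,
)

with models.DAG(
    dag_id="gke-cronjob-trigger",     # placeholder DAG id
    schedule_interval=None,           # triggered manually while debugging
    start_date=datetime(2023, 1, 1),  # placeholder start date
    catchup=False,
) as dag:
    kubernetes_min_pod = GKEStartPodOperator(
        task_id="task-id",
        name="name-cronjob",
        project_id=PROJECT_ID,
        location=CLUSTER_REGION,
        cluster_name=CLUSTER_NAME,
        cmds=["python", "/app/src/main.py"],
        arguments=["/app/config.json"],
        namespace="production",
        image="eu.gcr.io/project_id/app_image:latest",
    )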
The workload that I want to trigger is a CronJob hosted in a GKE cluster in my project.
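To be clear about what I mean by "trigger": I effectively want the equivalent of `kubectl create job --from=cronjob/<name>`, i.e. a one-off Job created from the CronJob's jobTemplate. As a rough illustration only (not code I'm running), using the Kubernetes Python client, where the CronJob name is a placeholder and I'm assuming batch/v1 CronJobs (Kubernetes 1.21+) with cluster credentials already configured:

from kubernetes import client, config

# Placeholder names; "production" is the namespace where my CronJob lives.
CRONJOB_NAME = "my-cronjob"
NAMESPACE = "production"

config.load_kube_config()  # or config.load_incluster_config() inside a cluster
batch = client.BatchV1Api()

# Read the existing CronJob and create a one-off Job from its jobTemplate,
# which is what `kubectl create job --from=cronjob/<name>` does.
cronjob = batch.read_namespaced_cron_job(name=CRONJOB_NAME, namespace=NAMESPACE)
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(
        name=f"{CRONJOB_NAME}-manual-run",
        namespace=NAMESPACE,
    ),
    spec=cronjob.spec.job_template.spec,
)
batch.create_namespaced_job(namespace=NAMESPACE, body=job)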
Has anyone worked with this before?