GitLab Cloud Run deploys successfully but job fails

I'm having an issue with my CI/CD pipeline: it deploys successfully to GCP Cloud Run, but on the GitLab dashboard the job status is failed.

I tried replacing the images with some other Docker images, but the job fails as well.

 # File: .gitlab-ci.yml
image: google/cloud-sdk:alpine
deploy_int:
  stage: deploy
  environment: integration
  only:
  - integration    # This pipeline stage will run on this branch alone
  script:
    - echo $GCP_SERVICE_KEY > gcloud-service-key.json # Google Cloud service accounts
    - gcloud auth activate-service-account --key-file gcloud-service-key.json
    - gcloud config set project $GCP_PROJECT_ID
    - gcloud builds submit . --config=cloudbuild_int.yaml




# File: cloudbuild_int.yaml
steps:
    # build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'build','--build-arg','APP_ENV=int' , '-t', 'gcr.io/$PROJECT_ID/tpdropd-int-front', '.' ]
    # push the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'push', 'gcr.io/$PROJECT_ID/tpdropd-int-front']
    # deploy to Cloud Run
  - name: "gcr.io/cloud-builders/gcloud"
    args: ['run', 'deploy', 'tpd-front', '--image', 'gcr.io/$PROJECT_ID/tpdropd-int-front', '--region', 'us-central1', '--platform', 'managed', '--allow-unauthenticated']

GitLab build output:

ERROR: (gcloud.builds.submit) 
The build is running, and logs are being written to the default logs bucket.
This tool can only stream logs if you are Viewer/Owner of the project and, if applicable, allowed by your VPC-SC security policy.
The default logs bucket is always outside any VPC-SC security perimeter.
If you want your logs saved inside your VPC-SC perimeter, use your own bucket.
See https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs.
Cleaning up project directory and file based variables
00:01
ERROR: Job failed: exit code 1


Solution 1:[1]

I fixed it by adding:

options:
  logging: CLOUD_LOGGING_ONLY

in cloudbuild.yaml
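
Applied to the cloudbuild_int.yaml from the question, that change would look roughly like this (a sketch: the build, push, and deploy steps are unchanged; only the options block is new):

# File: cloudbuild_int.yaml (with Cloud Logging only)
steps:
    # build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'build', '--build-arg', 'APP_ENV=int', '-t', 'gcr.io/$PROJECT_ID/tpdropd-int-front', '.' ]
    # push the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'push', 'gcr.io/$PROJECT_ID/tpdropd-int-front' ]
    # deploy to Cloud Run
  - name: 'gcr.io/cloud-builders/gcloud'
    args: [ 'run', 'deploy', 'tpd-front', '--image', 'gcr.io/$PROJECT_ID/tpdropd-int-front', '--region', 'us-central1', '--platform', 'managed', '--allow-unauthenticated' ]

# Write build logs to Cloud Logging only, so gcloud builds submit
# no longer needs access to the default logs bucket
options:
  logging: CLOUD_LOGGING_ONLY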

Solution 2:[2]

As a workaround, you can fix it by giving the Viewer role to the service account that runs the build, but this feels like granting too much permission for such a task.
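
A minimal sketch of what that role grant could look like, assuming a dedicated service account is used for the GitLab job (the account name below is a placeholder, not taken from the question):

# Hypothetical example: grant the Viewer role to the service account
# the GitLab job authenticates as (replace the placeholder email with yours)
gcloud projects add-iam-policy-binding $GCP_PROJECT_ID \
    --member="serviceAccount:gitlab-deployer@$GCP_PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/viewer"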

Solution 3:[3]

Kevin's answer worked like magic for me. Since I am not able to comment, I am writing this as a new answer.

Initially I was facing the same issue: even though the gcloud builds submit command passed, my GitLab CI job was failing.

Below is the cloudbuild.yaml file where I added the logging option as Kevin suggested.

steps:
  - name: gcr.io/cloud-builders/gcloud
    entrypoint: 'bash'
    args: ['run_query.sh', '${_SCRIPT_NAME}']

options:
  logging: CLOUD_LOGGING_ONLY

Check this document for details: https://cloud.google.com/build/docs/build-config-file-schema#options

Solution 4:[4]

The options solution mentioned by @Kevin worked for me. Just add the parameter, as shown below, to the cloudbuild.yaml file.

steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/myproject/myimage', '.']
options:
  logging: CLOUD_LOGGING_ONLY

Solution 5:[5]

This worked for me: use the --suppress-logs flag.

gcloud builds submit --suppress-logs --tag=<my-tag>
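
In the question's .gitlab-ci.yml, that roughly means appending the flag to the existing submit command (a sketch of the script section only; the rest of the job stays the same):

# File: .gitlab-ci.yml (deploy step with --suppress-logs)
  script:
    - echo $GCP_SERVICE_KEY > gcloud-service-key.json
    - gcloud auth activate-service-account --key-file gcloud-service-key.json
    - gcloud config set project $GCP_PROJECT_ID
    # --suppress-logs stops gcloud from streaming build logs to the terminal,
    # so the job no longer fails on the logs-bucket permission check
    - gcloud builds submit . --config=cloudbuild_int.yaml --suppress-logs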

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Kevin
Solution 2: Dharman
Solution 3: Samiran Mondal
Solution 4: Francisco Nogales
Solution 5: David Valdivieso