Failed to obtain the location of the Google Cloud Storage bucket
I am trying to transfer data from Amazon S3 to Google Cloud Storage using a Java client, but I get this error:
Failed to obtain the location of the Google Cloud Storage (GCS) bucket ___ due to insufficient permissions. Please verify that the necessary permissions have been granted.
I am using a service account with the Project Owner role, which should grant full access to all project resources.
Solution 1:[1]
Thanks to @thnee's comment I was able to piece together a Terraform script that grants the required permissions to the hidden Storage Transfer service account:
```hcl
data "google_project" "project" {}

locals {
  // the project number is also available from the Project Info section on the Dashboard
  transfer_service_id = "project-${data.google_project.project.number}@storage-transfer-service.iam.gserviceaccount.com"
}

resource "google_storage_bucket" "backups" {
  location      = "us-west1"
  name          = "backups"
  storage_class = "REGIONAL"
}

data "google_iam_policy" "transfer_job" {
  binding {
    role = "roles/storage.legacyBucketReader"
    members = [
      "serviceAccount:${local.transfer_service_id}",
    ]
  }

  binding {
    role = "roles/storage.objectAdmin"
    members = [
      "serviceAccount:${local.transfer_service_id}",
    ]
  }

  binding {
    role = "roles/storage.admin"
    members = [
      "user:<GCP console user>",
      "serviceAccount:<terraform user doing updates>",
    ]
  }
}

resource "google_storage_bucket_iam_policy" "policy" {
  bucket      = "${google_storage_bucket.backups.name}"
  policy_data = "${data.google_iam_policy.transfer_job.policy_data}"
}
```
Note that this removes the default ACLs of OWNER and READER present on the bucket, which would prevent you from accessing the bucket in the console. We therefore add roles/storage.admin back for the console users and for the Terraform service account applying the change.
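After applying the configuration, you can confirm the bindings took effect. This is a sketch assuming the bucket name `backups` from the script above and an authenticated gcloud/gsutil setup:

```shell
# Apply the Terraform configuration above
terraform init
terraform apply

# Inspect the resulting bucket IAM policy; the transfer service account
# should appear under legacyBucketReader and objectAdmin bindings
gsutil iam get gs://backups
```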
Solution 2:[2]
The service account's email address typically has the format project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com. To confirm the exact address for your project, use the googleServiceAccounts.get API call. If you cannot find it there, constructing the address from the convention above worked for me.
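For reference, the googleServiceAccounts.get call can be made directly with curl; `my-project` below is a placeholder for your own project ID:

```shell
PROJECT_ID=my-project   # placeholder: replace with your project ID

# Returns a JSON body whose accountEmail field is the exact
# Storage Transfer service account address for this project
curl -s \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://storagetransfer.googleapis.com/v1/googleServiceAccounts/${PROJECT_ID}"
```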
Then assign it these roles:

- roles/storage.objectViewer
- roles/storage.legacyBucketReader
- roles/storage.legacyBucketWriter
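One way to grant all three roles on the destination bucket, sketched with gsutil and placeholder values for the project number and bucket name:

```shell
PROJECT_NUMBER=123456789012   # placeholder: your project number
BUCKET=my-bucket              # placeholder: your destination bucket
SA="project-${PROJECT_NUMBER}@storage-transfer-service.iam.gserviceaccount.com"

# Bind each required role to the transfer service account on the bucket
for ROLE in roles/storage.objectViewer \
            roles/storage.legacyBucketReader \
            roles/storage.legacyBucketWriter; do
  gsutil iam ch "serviceAccount:${SA}:${ROLE}" "gs://${BUCKET}"
done
```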
Solution 3:[3]
I was logged into my work account in the gcloud CLI. Re-authenticating with gcloud auth login and selecting the correct account solved my seemingly random issues.
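To check which account the CLI is currently using and switch if needed (the project ID below is a placeholder):

```shell
gcloud auth list                      # shows accounts; the active one is starred
gcloud auth login                     # re-authenticate with the correct account
gcloud config set project my-project  # placeholder: your project ID
```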
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Richard Nienaber |
| Solution 2 | Ravindranath Akila |
| Solution 3 | Kyle Pennell |
